Methodology for Prototyping Increased Levels of Automation for Spacecraft Rendezvous Functions
NASA Technical Reports Server (NTRS)
Hart, Jeremy J.; Valasek, John
2007-01-01
The Crew Exploration Vehicle necessitates higher levels of automation than previous NASA vehicles, due to program requirements for automation, including Automated Rendezvous and Docking. Studies of spacecraft development often point to the locus of decision-making authority between humans and computers (i.e. automation) as a prime driver for cost, safety, and mission success. A critical component of Crew Exploration Vehicle development is therefore the determination of the correct level of automation. To identify the appropriate levels of automation and autonomy to design into a human space flight vehicle, NASA has created the Function-specific Level of Autonomy and Automation Tool. This paper develops a methodology for prototyping increased levels of automation for spacecraft rendezvous functions. The methodology is used to evaluate, via prototyping, the accuracy of the levels of automation specified by the Function-specific Level of Autonomy and Automation Tool. Spacecraft rendezvous planning tasks are selected and then prototyped in MATLAB using fuzzy logic techniques and existing Space Shuttle rendezvous trajectory algorithms.
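As an illustration of the fuzzy logic techniques the abstract mentions (this is not code from the paper; the membership sets and the range-to-target input are hypothetical), a triangular fuzzification step of the kind used in rule-based planners might look like:

```python
# Illustrative sketch: triangular fuzzy membership functions applied to a
# hypothetical "range-to-target" input for a rendezvous planning rule base.

def tri(x, a, b, c):
    """Triangular membership function rising over [a, b], falling over [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_range(range_km):
    """Fuzzify range into overlapping sets a planner's rules could fire on."""
    return {
        "close":  tri(range_km, -1.0, 0.0, 5.0),
        "medium": tri(range_km, 2.0, 10.0, 30.0),
        "far":    tri(range_km, 20.0, 100.0, 200.0),
    }

print(classify_range(4.0))
```

A range of 4 km belongs partially to both "close" and "medium", which is the property that lets a fuzzy rule base blend between planning behaviours rather than switch abruptly.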
Banks, Victoria A; Stanton, Neville A
2015-01-01
Automated assistance in driving emergencies aims to improve the safety of our roads by avoiding or mitigating the effects of accidents. However, the behavioural implications of such systems remain unknown. This paper introduces the driver decision-making in emergencies (DDMiE) framework to investigate how the level and type of automation may affect driver decision-making and subsequent responses to critical braking events, using network analysis to interrogate retrospective verbalisations. Four DDMiE models were constructed to represent different levels of automation within the driving task and their effects on driver decision-making. Findings suggest that whilst automation does not alter the decision-making pathway (e.g. the processes between hazard detection and response remain similar), it does appear to significantly weaken the links between information-processing nodes. This reflects an unintended yet emergent property within the task network that could mean we may not be improving safety in the way we expect. The paper contrasts models of driver decision-making in emergencies at varying levels of automation using the Southampton University Driving Simulator; network analysis of retrospective verbalisations indicates that increasing the level of automation in driving emergencies weakens the links between information-processing nodes essential for effective decision-making.
NASA Technical Reports Server (NTRS)
Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.
1984-01-01
A generic computer simulation for manipulator systems (ROBSIM) was implemented and the specific technologies necessary to increase the role of automation in various missions were developed. The specific items developed are: (1) capability for definition of a manipulator system consisting of multiple arms, load objects, and an environment; (2) capability for kinematic analysis, requirements analysis, and response simulation of manipulator motion; (3) postprocessing options such as graphic replay of simulated motion and manipulator parameter plotting; (4) investigation and simulation of various control methods including manual force/torque and active compliance control; (5) evaluation and implementation of three obstacle avoidance methods; (6) video simulation and edge detection; and (7) software simulation validation.
Improving Grid Resilience through Informed Decision-making (IGRID)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burnham, Laurie; Stamber, Kevin L.; Jeffers, Robert Fredric
The transformation of the distribution grid from a centralized to decentralized architecture, with bi-directional power and data flows, is made possible by a surge in network intelligence and grid automation. While these changes are largely beneficial, the interface between grid operator and automated technologies is not well understood, nor are the benefits and risks of automation. Quantifying and understanding the latter is an important facet of grid resilience that needs to be fully investigated. The work described in this document represents the first empirical study aimed at identifying and mitigating the vulnerabilities posed by automation for a grid that for the foreseeable future will remain a human-in-the-loop critical infrastructure. Our scenario-based methodology enabled us to conduct a series of experimental studies to identify causal relationships between grid-operator performance and automated technologies and to collect measurements of human performance as a function of automation. Our findings, though preliminary, suggest there are predictive patterns in the interplay between human operators and automation, patterns that can inform the rollout of distribution automation and the hiring and training of operators, and contribute in multiple and significant ways to the field of grid resilience.
Automated Decision-Making and Big Data: Concerns for People With Mental Illness.
Monteith, Scott; Glenn, Tasha
2016-12-01
Automated decision-making by computer algorithms based on data from our behaviors is fundamental to the digital economy. Automated decisions impact everyone, occurring routinely in education, employment, health care, credit, and government services. Technologies that generate tracking data, including smartphones, credit cards, websites, social media, and sensors, offer unprecedented benefits. However, people are vulnerable to errors and biases in the underlying data and algorithms, especially those with mental illness. Algorithms based on big data from seemingly unrelated sources may create obstacles to community integration. Voluntary online self-disclosure and constant tracking blur traditional concepts of public versus private data, medical versus non-medical data, and human versus automated decision-making. In contrast to sharing sensitive information with a physician in a confidential relationship, there may be numerous readers of information revealed online; data may be sold repeatedly; used in proprietary algorithms; and are effectively permanent. Technological changes challenge traditional norms affecting privacy and decision-making, and continued discussions on new approaches to provide privacy protections are needed.
NASA Technical Reports Server (NTRS)
Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.
1984-01-01
A generic computer simulation for manipulator systems (ROBSIM) was implemented and the specific technologies necessary to increase the role of automation in various missions were developed. The specific items developed were: (1) Capability for definition of a manipulator system consisting of multiple arms, load objects, and an environment; (2) Capability for kinematic analysis, requirements analysis, and response simulation of manipulator motion; (3) Postprocessing options such as graphic replay of simulated motion and manipulator parameter plotting; (4) Investigation and simulation of various control methods including manual force/torque and active compliance control; (5) Evaluation and implementation of three obstacle avoidance methods; (6) Video simulation and edge detection; and (7) Software simulation validation. This appendix is the user's guide and includes examples of program runs and outputs as well as instructions for program use.
NASA Technical Reports Server (NTRS)
Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.; Wolfe, W. J.; Nguyen, T.
1986-01-01
The purpose of the Robotic Simulation (ROBSIM) program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotic systems. ROBSIM is programmed in FORTRAN 77 and implemented on a VAX 11/750 computer using the VMS operating system. The programmer's guide describes the ROBSIM implementation and program logic flow, and the functions and structures of the different subroutines. With the manual and the in-code documentation, an experienced programmer can incorporate additional routines and modify existing ones to add desired capabilities.
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
A variety of artificial intelligence techniques which could be used with regard to NASA space applications and robotics were evaluated. The techniques studied were decision tree manipulators, problem solvers, rule based systems, logic programming languages, representation language languages, and expert systems. The overall structure of a robotic simulation tool was defined and a framework for that tool developed. Nonlinear and linearized dynamics equations were formulated for n link manipulator configurations. A framework for the robotic simulation was established which uses validated manipulator component models connected according to a user defined configuration.
NASA Technical Reports Server (NTRS)
Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelly, J. H.; Depkovich, T. M.
1984-01-01
The purpose of the Robotics Simulation (ROBSIM) program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotic systems. ROBSIM is programmed in FORTRAN 77 and implemented on a VAX 11/750 computer using the VMS operating system. This programmer's guide describes the ROBSIM implementation and program logic flow, and the functions and structures of the different subroutines. With this manual and the in-code documentation, an experienced programmer can incorporate additional routines and modify existing ones to add desired capabilities.
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
The derivation of the equations is presented, the rate control algorithm described, and simulation methodologies summarized. A set of dynamics equations that can be used recursively to calculate forces and torques acting at the joints of an n link manipulator given the manipulator joint rates are derived. The equations are valid for any n link manipulator system with any kind of joints connected in any sequence. The equations of motion for the class of manipulators consisting of n rigid links interconnected by rotary joints are derived. A technique is outlined for reducing the system of equations to eliminate constraint torques. The linearized dynamics equations for an n link manipulator system are derived. The general n link linearized equations are then applied to a two link configuration. The coordinated rate control algorithm used to compute individual joint rates when given end effector rates is described. A short discussion of simulation methodologies is presented.
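The recursive structure described above can be illustrated with a deliberately reduced case. The sketch below is not taken from the report (which derives the full dynamic equations); it computes only the static gravity term for a planar arm with rotary joints, using a tip-to-base recursion of the kind the abstract describes, with uniform links assumed so each centre of mass sits at mid-link:

```python
import math

def gravity_torques(thetas, lengths, masses, g=9.81):
    """Static gravity torques at each joint of a planar n-link arm,
    accumulated recursively from the tip link back to the base."""
    n = len(thetas)
    # Forward pass: world x-coordinates of each joint and each link's
    # centre of mass (thetas are relative joint angles, 0 = horizontal).
    joint_x, com_x = [], []
    x, ang = 0.0, 0.0
    for i in range(n):
        joint_x.append(x)
        ang += thetas[i]
        com_x.append(x + 0.5 * lengths[i] * math.cos(ang))
        x += lengths[i] * math.cos(ang)
    # Backward (tip-to-base) recursion: each joint carries the weight
    # moment of every link outboard of it.
    torques = [0.0] * n
    outboard_weight = 0.0
    moment = 0.0
    for i in range(n - 1, -1, -1):
        if i < n - 1:
            # re-reference the accumulated moment from joint i+1 to joint i
            moment += outboard_weight * (joint_x[i + 1] - joint_x[i])
        moment += masses[i] * g * (com_x[i] - joint_x[i])
        outboard_weight += masses[i] * g
        torques[i] = moment
    return torques

# Two horizontal 1 m, 1 kg links: the base joint carries both link weights.
print(gravity_torques([0.0, 0.0], [1.0, 1.0], [1.0, 1.0]))
```

The point of the recursion is the same as in the full dynamics: quantities accumulated at an outboard joint are reused at the next inboard joint, so the cost stays linear in the number of links.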
Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.
Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin
2017-08-16
The objective consensus methodology has recently been applied to consensus finding in several studies of medical decision-making among clinical experts or in guidelines. The main advantages of this method are an automated analysis and comparison of the treatment algorithms of the participating centers, which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.
ERIC Educational Resources Information Center
Vernon, Elizabeth
It is generally accepted in the library world that an automated catalog means more accessible data for patrons, greater productivity for librarians, and an improvement in the sharing of bibliographic data among libraries. While the desirability of automation is not a controversial issue, some aspects of automating remain problematic. This article…
Effects of automation of information-processing functions on teamwork.
Wright, Melanie C; Kaber, David B
2005-01-01
We investigated the effects of automation as applied to different stages of information processing on team performance in a complex decision-making task. Forty teams of 2 individuals performed a simulated Theater Defense Task. Four automation conditions were simulated with computer assistance applied to realistic combinations of information acquisition, information analysis, and decision selection functions across two levels of task difficulty. Multiple measures of team effectiveness and team coordination were used. Results indicated different forms of automation have different effects on teamwork. Compared with a baseline condition, an increase in automation of information acquisition led to an increase in the ratio of information transferred to information requested; an increase in automation of information analysis resulted in higher team coordination ratings; and automation of decision selection led to better team effectiveness under low levels of task difficulty but at the cost of higher workload. The results support the use of early and intermediate forms of automation related to acquisition and analysis of information in the design of team tasks. Decision-making automation may provide benefits in more limited contexts. Applications of this research include the design and evaluation of automation in team environments.
Ruohonen, Toni; Ennejmy, Mohammed
2013-01-01
Making reliable and justified operational and strategic decisions is a challenging task in the health care domain. So far, decisions have been made based on the experience of managers and staff, or they have been evaluated with traditional methods using inadequate data. As a result of this kind of decision-making process, attempts to improve operations have usually failed or led to only local improvements. Health care organizations have a lot of operational data, in addition to clinical data, which is the key element for making reliable and justified decisions. However, it is increasingly difficult to access this data and make use of it. In this paper we discuss how to exploit operational data most efficiently in the decision-making process. We share our vision for the future and propose a conceptual framework for automating the decision-making process.
NASA Astrophysics Data System (ADS)
Zhang, Zhong
In this work, motivated by the need to coordinate transmission maintenance scheduling among a multiplicity of self-interested entities in restructured power industry, a distributed decision support framework based on multiagent negotiation systems (MANS) is developed. An innovative risk-based transmission maintenance optimization procedure is introduced. Several models for linking condition monitoring information to the equipment's instantaneous failure probability are presented, which enable quantitative evaluation of the effectiveness of maintenance activities in terms of system cumulative risk reduction. Methodologies of statistical processing, equipment deterioration evaluation and time-dependent failure probability calculation are also described. A novel framework capable of facilitating distributed decision-making through multiagent negotiation is developed. A multiagent negotiation model is developed and illustrated that accounts for uncertainty and enables social rationality. Some issues of multiagent negotiation convergence and scalability are discussed. The relationships between agent-based negotiation and auction systems are also identified. A four-step MAS design methodology for constructing multiagent systems for power system applications is presented. A generic multiagent negotiation system, capable of inter-agent communication and distributed decision support through inter-agent negotiations, is implemented. A multiagent system framework for facilitating the automated integration of condition monitoring information and maintenance scheduling for power transformers is developed. Simulations of multiagent negotiation-based maintenance scheduling among several independent utilities are provided. It is shown to be a viable alternative solution paradigm to the traditional centralized optimization approach in today's deregulated environment. 
This multiagent system framework not only facilitates the decision-making among competing power system entities, but also provides a tool to use in studying competitive industry relative to monopolistic industry.
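The link from condition monitoring information to an instantaneous failure probability described above can be sketched with a simple assumed model. The exponential hazard form and the `base_rate` and `scale` parameters below are illustrative assumptions for exposition, not the models developed in the work:

```python
import math

def failure_probability(condition_index, horizon_years,
                        base_rate=0.01, scale=3.0):
    """Hypothetical model: the instantaneous failure rate grows
    exponentially with a 0..1 condition-deterioration index, and the
    probability of failure over the horizon follows from the
    exponential survival function."""
    rate = base_rate * math.exp(scale * condition_index)  # failures/year
    return 1.0 - math.exp(-rate * horizon_years)

def risk_reduction(condition_before, condition_after, horizon_years=1.0):
    """Risk reduction credited to a maintenance activity that improves
    the condition index (e.g. 0.8 -> 0.2 after transformer servicing),
    the kind of quantity a maintenance scheduler could negotiate over."""
    return (failure_probability(condition_before, horizon_years)
            - failure_probability(condition_after, horizon_years))

print(risk_reduction(0.8, 0.2))
```

Expressing maintenance value as a cumulative risk reduction, as the dissertation does, gives agents a common currency for comparing and negotiating candidate schedules.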
Artificial intelligence in cardiology.
Bonderman, Diana
2017-12-01
Decision-making is complex in modern medicine and should ideally be based on available data, structured knowledge and proper interpretation in the context of an individual patient. Automated algorithms, also termed artificial intelligence that are able to extract meaningful patterns from data collections and build decisions upon identified patterns may be useful assistants in clinical decision-making processes. In this article, artificial intelligence-based studies in clinical cardiology are reviewed. The text also touches on the ethical issues and speculates on the future roles of automated algorithms versus clinicians in cardiology and medicine in general.
Systems Operation Studies for Automated Guideway Transit Systems : Summary Report
DOT National Transportation Integrated Search
1980-02-01
In order to examine specific Automated Guideway Transit (AGT) developments and concepts and to build a better knowledge base for future decision-making, UMTA has undertaken a new program of studies and technology investigations called the Urban Mass ...
Reasoning and Knowledge Acquisition Framework for 5G Network Analytics
2017-01-01
Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim to understand the network status and to predict potential situations that might disrupt the network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy on the inference of anomalous traffic volumes based on a simple configuration.
Reasoning and Knowledge Acquisition Framework for 5G Network Analytics.
Sotelo Monge, Marco Antonio; Maestre Vidal, Jorge; García Villalba, Luis Javier
2017-10-21
Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim to understand the network status and to predict potential situations that might disrupt the network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, where the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy on the inference of anomalous traffic volumes based on a simple configuration.
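A minimal sketch of the kind of traffic-volume anomaly flagging the framework describes might look as follows. The moving-average rule, window, and threshold are illustrative assumptions, not the authors' algorithms:

```python
import statistics

def detect_volume_anomalies(volumes, window=5, threshold=2.0):
    """Flag samples whose traffic volume deviates from the trailing
    moving average by more than `threshold` trailing standard
    deviations; a deliberately simple stand-in for the framework's
    prediction and rule-based reasoning components."""
    anomalies = []
    for i in range(window, len(volumes)):
        recent = volumes[i - window:i]
        mean = statistics.mean(recent)
        spread = statistics.stdev(recent)
        if spread > 0 and abs(volumes[i] - mean) > threshold * spread:
            anomalies.append(i)
    return anomalies

traffic = [100, 102, 98, 101, 99, 100, 350, 101, 97]
print(detect_volume_anomalies(traffic))  # flags the spike at index 6
```

In the paper's terms this corresponds to the "comprehension" layer of the Endsley model: raw measurements become labelled situations that a downstream decision-making process can act on.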
DOT National Transportation Integrated Search
1982-06-01
In order to examine specific Automated Guideway Transit (AGT) developments and concepts, and to build a better knowledge base for future decision-making, the Urban Mass Transportation Administration (UMTA) undertook a new program of studies and techn...
Content Classification: Leveraging New Tools and Librarians' Expertise.
ERIC Educational Resources Information Center
Starr, Jennie
1999-01-01
Presents factors for librarians to consider when making decisions about information retrieval. Discusses indexing theory; thesauri aids; controlled vocabulary or thesauri to increase access; humans versus machines; automated tools; product evaluations and evaluation criteria; automated classification tools; content server products; and document…
ERIC Educational Resources Information Center
Lauckner, Heidi; Paterson, Margo; Krupa, Terry
2012-01-01
Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…
Knowledge Representation Artifacts for Use in Sensemaking Support Systems
2015-03-12
and manual processing must be replaced by automated processing wherever it makes sense and is possible. Clearly, given the data and cognitive...knowledge-centric view of situation analysis and decision-making as previously discussed, has led to the development of several automated processing components...for use in sensemaking support systems [6-11]. In turn, automated processing has required the development of appropriate knowledge
Fontan Surgical Planning: Previous Accomplishments, Current Challenges, and Future Directions.
Trusty, Phillip M; Slesnick, Timothy C; Wei, Zhenglun Alan; Rossignac, Jarek; Kanter, Kirk R; Fogel, Mark A; Yoganathan, Ajit P
2018-04-01
The ultimate goal of Fontan surgical planning is to provide additional insights into the clinical decision-making process. In its current state, surgical planning offers an accurate hemodynamic assessment of the pre-operative condition, provides anatomical constraints for potential surgical options, and produces decent post-operative predictions if boundary conditions are similar enough between the pre-operative and post-operative states. Moving forward, validation with post-operative data is a necessary step in order to assess the accuracy of surgical planning and determine which methodological improvements are needed. Future efforts to automate the surgical planning process will reduce the individual expertise needed and encourage use in the clinic by clinicians. As post-operative physiologic predictions improve, Fontan surgical planning will become a more effective tool to accurately model patient-specific hemodynamics.
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.
More steps towards process automation for optical fabrication
NASA Astrophysics Data System (ADS)
Walker, David; Yu, Guoyu; Beaucamp, Anthony; Bibby, Matt; Li, Hongyu; McCluskey, Lee; Petrovic, Sanja; Reynolds, Christina
2017-06-01
In the context of Industrie 4.0, we have previously described the roles of robots in optical processing, and their complementarity with classical CNC machines, providing both processing and automation functions. After having demonstrated robotic moving of parts between a CNC polisher and metrology station, and auto-fringe-acquisition, we have moved on to automate the wash-down operation. This is part of a wider strategy we describe in this paper, leading towards automating the decision-making operations required before and throughout an optical manufacturing cycle.
Moore, Bethany; Bone, Eric A
2017-01-01
The concept of triage in healthcare has been around for centuries and continues to be applied today so that scarce resources are allocated according to need. A business impact analysis (BIA) is a form of triage in that it identifies which processes are most critical, which to address first and how to allocate limited resources. On its own, however, the BIA provides only a roadmap of the impacts and interdependencies of an event. When disaster strikes, organisational decision-makers often face difficult decisions with regard to allocating limited resources between multiple 'mission-critical' functions. Applying the concept of triage to business continuity provides those decision-makers navigating a rapidly evolving and unpredictable event with a path that protects the fundamental priorities of the organisation. A business triage methodology aids decision-makers in times of crisis by providing a simplified framework for decision-making based on objective, evidence-based criteria, which is universally accepted and understood. When disaster strikes, the survival of the organisation depends on critical decision-making and quick actions to stabilise the incident. This paper argues that organisations need to supplement BIA processes with a decision-making triage methodology that can be quickly applied during the chaos of an actual event.
Durif-Bruckert, C; Roux, P; Morelle, M; Mignotte, H; Faure, C; Moumjid-Ferdjaoui, N
2015-07-01
The aim of this study on shared decision-making in the doctor-patient encounter about surgical treatment for early-stage breast cancer, conducted in a regional cancer centre in France, was to further the understanding of patient perceptions on shared decision-making. The study used methodological triangulation to collect data (both quantitative and qualitative) about patient preferences in the context of a clinical consultation in which surgeons followed a shared decision-making protocol. Data were analysed from a multi-disciplinary research perspective (social psychology and health economics). The triangulated data collection methods were questionnaires (n = 132), longitudinal interviews (n = 47) and observations of consultations (n = 26). Methodological triangulation revealed levels of divergence and complementarity between qualitative and quantitative results that suggest new perspectives on the three inter-related notions of decision-making, participation and information. Patients' responses revealed important differences between shared decision-making and participation per se. The authors note that subjecting patients to a normative behavioural model of shared decision-making in an era when paradigms of medical authority are shifting may undermine the patient's quest for what he or she believes is a more important right: a guarantee of the best care available.
Panjikar, Santosh; Parthasarathy, Venkataraman; Lamzin, Victor S; Weiss, Manfred S; Tucker, Paul A
2005-04-01
The EMBL-Hamburg Automated Crystal Structure Determination Platform is a system that combines a number of existing macromolecular crystallographic computer programs and several decision-makers into a software pipeline for automated and efficient crystal structure determination. The pipeline can be invoked as soon as X-ray data from derivatized protein crystals have been collected and processed. It is controlled by a web-based graphical user interface for data and parameter input, and for monitoring the progress of structure determination. A large number of possible structure-solution paths are encoded in the system and the optimal path is selected by the decision-makers as the structure solution evolves. The processes have been optimized for speed so that the pipeline can be used effectively for validating the X-ray experiment at a synchrotron beamline.
Sharing intelligence: Decision-making interactions between users and software in MAESTRO
NASA Technical Reports Server (NTRS)
Geoffroy, Amy L.; Gohring, John R.; Britt, Daniel L.
1991-01-01
By combining the best of automated and human decision-making in scheduling, many advantages can accrue. The joint performance of the user and system is potentially much better than either alone. Features of the MAESTRO scheduling system serve to illustrate concepts of user/software cooperation. MAESTRO may be operated at a user-determinable and dynamic level of autonomy. Because the system allows so much flexibility in the allocation of decision-making responsibilities, and provides users with a wealth of information and other support for their own decision-making, better overall schedules may result.
NASA Astrophysics Data System (ADS)
Danilova, Olga; Semenova, Zinaida
2018-04-01
The objective of this study is a detailed analysis of physical protection systems development for information resources. The optimization theory and decision-making mathematical apparatus is used to formulate correctly and create an algorithm of selection procedure for security systems optimal configuration considering the location of the secured object’s access point and zones. The result of this study is a software implementation scheme of decision-making system for optimal placement of the physical access control system’s elements.
Unlocking the full potential of Earth observation during the 2015 Texas flood disaster
NASA Astrophysics Data System (ADS)
Schumann, G. J.-P.; Frye, S.; Wells, G.; Adler, R.; Brakenridge, R.; Bolten, J.; Murray, J.; Slayback, D.; Policelli, F.; Kirschbaum, D.; Wu, H.; Cappelaere, P.; Howard, T.; Flamig, Z.; Clark, R.; Stough, T.; Chini, M.; Matgen, P.; Green, D.; Jones, B.
2016-05-01
Intense rainfall during late April and early May 2015 in Texas and Oklahoma led to widespread and sustained flooding in several river basins. Texas state agencies relevant to emergency response were activated when severe weather ensued for 6 weeks from 8 May until 19 June, following Tropical Storm Bill. An international team of scientists and flood response experts assembled and collaborated with decision-making authorities for user-driven high-resolution satellite acquisitions over the most critical areas, while experimental automated flood mapping techniques provided daily ongoing monitoring. This allowed mapping of flood inundation from an unprecedented number of spaceborne and airborne images. In fact, a total of 27,174 images, not counting the SAR images used, were ingested into the USGS Hazards Data Distribution System (HDDS) Explorer. Based on the Texas flood use case, we describe the success of this effort as well as the limitations in fulfilling the needs of the decision-makers, and reflect upon these. In order to unlock the full potential of Earth observation data in flood disaster response, we suggest in a call for action (i) stronger collaboration from the onset between agencies, product developers, and decision-makers; (ii) quantification of uncertainties when combining data from different sources in order to augment information content; (iii) a default role for the end-user in satellite acquisition planning; and (iv) proactive assimilation of methodologies and tools into the mandated agencies.
NASA Technical Reports Server (NTRS)
Haley, D. C.; Almand, B. J.; Thomas, M. M.; Krauze, L. D.; Gremban, K. D.; Sanborn, J. C.; Kelley, J. H.; Depkovich, T. M.; Wolfe, W. J.; Nguyen, T.
1986-01-01
The purpose of the Robotics Simulation Program is to provide a broad range of computer capabilities to assist in the design, verification, simulation, and study of robotics systems. ROBSIM is a program written in FORTRAN 77 for use on a VAX 11/750 computer under the VMS operating system. This user's guide describes the capabilities of the ROBSIM programs, including the system definition function, the analysis tools function and the postprocessor function. The options a user may encounter with each of these executables are explained in detail, and the different program prompts presented to the user are included. Some useful suggestions concerning the appropriate answers to be given by the user are provided. An example interactive run is enclosed for each of the main program services, and some of the capabilities are illustrated.
Development of a QFD-based expert system for CNC turning centre selection
NASA Astrophysics Data System (ADS)
Prasad, Kanika; Chakraborty, Shankar
2015-12-01
Computer numerical control (CNC) machine tools are automated devices capable of generating complicated and intricate product shapes in a shorter time. Selection of the best CNC machine tool is a critical, complex and time-consuming task due to the availability of a wide range of alternatives and the conflicting nature of several evaluation criteria. Although past researchers have attempted to select appropriate machining centres using different knowledge-based systems, mathematical models and multi-criteria decision-making methods, none of those approaches gave due importance to the voice of the customer. This limitation can be overcome using the quality function deployment (QFD) technique, a systematic approach for integrating customers' needs and designing the product to meet those needs first time and every time. In this paper, the adopted QFD-based methodology helps in selecting CNC turning centres for a manufacturing organization, giving due importance to the voice of the customer. An expert system based on the QFD technique is developed in Visual BASIC 6.0 to automate the CNC turning centre selection procedure for different production plans. Three illustrative examples demonstrate the real-time applicability of the developed expert system.
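The core QFD calculation the abstract relies on can be sketched briefly: customer-need importance weights are propagated through a relationship matrix to produce technical-criterion priorities, which then score the candidate machines. All weights, criteria, and machine names below are invented for illustration, not taken from the paper.

```python
def qfd_priorities(weights, relationships):
    """Technical-criterion priority = sum over needs of weight * strength."""
    n_tech = len(relationships[0])
    return [sum(w * row[j] for w, row in zip(weights, relationships))
            for j in range(n_tech)]

def rank_machines(priorities, machine_scores):
    """Weighted score per candidate machine; higher is better."""
    totals = {m: sum(p * s for p, s in zip(priorities, scores))
              for m, scores in machine_scores.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Customer needs: accuracy, speed, low cost (importance 1-5, hypothetical)
weights = [5, 3, 4]
# Relationship strengths (0/1/9 scale) to technical criteria:
# spindle precision, rapid traverse, price
relationships = [[9, 1, 0],
                 [1, 9, 0],
                 [0, 1, 9]]
priorities = qfd_priorities(weights, relationships)
machines = {"VTC-100": [0.9, 0.6, 0.5], "VTC-200": [0.7, 0.9, 0.8]}
print(priorities)
print(rank_machines(priorities, machines))
```

The same two-step structure (needs → technical priorities → alternative ranking) is what an expert-system front end would automate for different production plans.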
DOT National Transportation Integrated Search
1974-08-01
Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...
Automation bias: a systematic review of frequency, effect mediators, and mitigators.
Goddard, Kate; Roudsari, Abdul; Wyatt, Jeremy C
2012-01-01
Automation bias (AB)--the tendency to over-rely on automation--has been studied in various academic fields. Clinical decision support systems (CDSS) aim to benefit the clinical decision-making process. Although most research shows overall improved performance with use, there is often a failure to recognize the new errors that CDSS can introduce. With a focus on healthcare, a systematic review of the literature from a variety of research fields has been carried out, assessing the frequency and severity of AB, the effect mediators, and interventions potentially mitigating this effect. This is discussed alongside automation-induced complacency, or insufficient monitoring of automation output. A mix of subject specific and freetext terms around the themes of automation, human-automation interaction, and task performance and error were used to search article databases. Of 13 821 retrieved papers, 74 met the inclusion criteria. User factors such as cognitive style, decision support systems (DSS), and task specific experience mediated AB, as did attitudinal driving factors such as trust and confidence. Environmental mediators included workload, task complexity, and time constraint, which pressurized cognitive resources. Mitigators of AB included implementation factors such as training and emphasizing user accountability, and DSS design factors such as the position of advice on the screen, updated confidence levels attached to DSS output, and the provision of information versus recommendation. By uncovering the mechanisms by which AB operates, this review aims to help optimize the clinical decision-making process for CDSS developers and healthcare practitioners.
ERIC Educational Resources Information Center
Rupp, André A.
2018-01-01
This article discusses critical methodological design decisions for collecting, interpreting, and synthesizing empirical evidence during the design, deployment, and operational quality-control phases for automated scoring systems. The discussion is inspired by work on operational large-scale systems for automated essay scoring but many of the…
Automated Test-Form Generation
ERIC Educational Resources Information Center
van der Linden, Wim J.; Diao, Qi
2011-01-01
In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…
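The 0/1 selection problem at the heart of ATA can be sketched as follows. A production system would hand this model to a mixed-integer programming solver; here exhaustive search over a tiny invented item bank shows the formulation: maximize total item information subject to a fixed test length and a minimum number of items per content area.

```python
from itertools import combinations

bank = [  # (item_id, content_area, item_information) -- invented values
    ("i1", "algebra", 0.9), ("i2", "algebra", 0.4),
    ("i3", "geometry", 0.8), ("i4", "geometry", 0.7),
    ("i5", "algebra", 0.6),
]

def assemble(bank, length, min_per_area):
    """Brute-force stand-in for the MIP: pick `length` items maximizing
    information while meeting per-area content constraints."""
    best, best_info = None, -1.0
    for subset in combinations(bank, length):
        counts = {}
        for _, area, _info in subset:
            counts[area] = counts.get(area, 0) + 1
        if any(counts.get(a, 0) < n for a, n in min_per_area.items()):
            continue  # violates a content specification
        info = sum(i for _, _, i in subset)
        if info > best_info:
            best, best_info = subset, info
    return [item_id for item_id, _, _ in best], best_info

form, info = assemble(bank, length=3,
                      min_per_area={"algebra": 1, "geometry": 1})
print(form, round(info, 2))
```

The MIP formulation replaces the subset loop with binary decision variables and linear constraints, which is what makes realistic bank sizes tractable.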
NASA Astrophysics Data System (ADS)
Lowe, Robert; Ziemke, Tom
2010-09-01
The somatic marker hypothesis (SMH) posits that the role of emotions and mental states in decision-making manifests through bodily responses to stimuli of import to the organism's welfare. The Iowa Gambling Task (IGT), proposed by Bechara and Damasio in the mid-1990s, has provided the major source of empirical validation to the role of somatic markers in the service of flexible and cost-effective decision-making in humans. In recent years the IGT has been the subject of much criticism concerning: (1) whether measures of somatic markers reveal that they are important for decision-making as opposed to behaviour preparation; (2) the underlying neural substrate posited as critical to decision-making of the type relevant to the task; and (3) aspects of the methodological approach used, particularly on the canonical version of the task. In this paper, a cognitive robotics methodology is proposed to explore a dynamical systems approach as it applies to the neural computation of reward-based learning and issues concerning embodiment. This approach is particularly relevant in light of a strongly emerging alternative hypothesis to the SMH, the reversal learning hypothesis, which links, behaviourally and neurocomputationally, a number of more or less complex reward-based decision-making tasks, including the 'A-not-B' task - already subject to dynamical systems investigations with a focus on neural activation dynamics. It is also suggested that the cognitive robotics methodology may be used to extend systematically the IGT benchmark to more naturalised, but nevertheless controlled, settings that might better explore the extent to which the SMH, and somatic states per se, impact on complex decision-making.
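A minimal reward-learning sketch in the spirit of IGT models: an epsilon-greedy agent tracks the running value of four decks with a delta rule and learns to avoid the high-variance, net-losing decks. The payoff schedule is a simplified stand-in, not Bechara and Damasio's original values.

```python
import random

random.seed(0)
# Mean net payoff per draw: A, B are "bad" decks; C, D are "good" decks.
payoff = {"A": -25, "B": -25, "C": +25, "D": +25}

def run_agent(trials=500, alpha=0.1, epsilon=0.1):
    value = {d: 0.0 for d in payoff}   # learned deck values
    picks = {d: 0 for d in payoff}     # choice counts
    for _ in range(trials):
        if random.random() < epsilon:
            deck = random.choice(list(payoff))  # explore
        else:
            deck = max(value, key=value.get)    # exploit best estimate
        reward = payoff[deck] + random.gauss(0, 10)   # noisy outcome
        value[deck] += alpha * (reward - value[deck]) # delta rule
        picks[deck] += 1
    return value, picks

value, picks = run_agent()
print(picks)
```

Dynamical-systems and neurocomputational models of the task elaborate on this skeleton (e.g., with activation dynamics or somatic bias terms), but the reward-prediction-error update above is the common core.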
Integrating Test-Form Formatting into Automated Test Assembly
ERIC Educational Resources Information Center
Diao, Qi; van der Linden, Wim J.
2013-01-01
Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…
Automatically updating predictive modeling workflows support decision-making in drug design.
Muegge, Ingo; Bentzien, Jörg; Mukherjee, Prasenjit; Hughes, Robert O
2016-09-01
Using predictive models for early decision-making in drug discovery has become standard practice. We suggest that model building needs to be automated with minimum input and low technical maintenance requirements. Models perform best when tailored to answering specific compound optimization related questions. If qualitative answers are required, 2-bin classification models are preferred. Integrating predictive modeling results with structural information stimulates better decision making. For in silico models supporting rapid structure-activity relationship cycles the performance deteriorates within weeks. Frequent automated updates of predictive models ensure best predictions. Consensus between multiple modeling approaches increases the prediction confidence. Combining qualified and nonqualified data optimally uses all available information. Dose predictions provide a holistic alternative to multiple individual property predictions for reaching complex decisions.
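Two of the abstract's points, 2-bin classification and consensus between multiple modeling approaches, can be sketched together. The model scores and threshold below are invented; agreement across models is treated as a simple confidence measure.

```python
def bin2(score, threshold=0.5):
    """2-bin classification of a model score (threshold is hypothetical)."""
    return "active" if score >= threshold else "inactive"

def consensus(predictions):
    """Majority label across models, with confidence = fraction agreeing."""
    labels = [bin2(p) for p in predictions]
    top = max(set(labels), key=labels.count)
    return top, labels.count(top) / len(labels)

# Three hypothetical model scores for one compound
label, confidence = consensus([0.8, 0.65, 0.4])
print(label, confidence)
```

In an automated-update setting, the individual models feeding `consensus` would be retrained on a schedule, which is the abstract's remedy for the performance decay it describes.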
Advanced Technologies and Methodology for Automated Ultrasonic Testing Systems Quantification
DOT National Transportation Integrated Search
2011-04-29
For automated ultrasonic testing (AUT) detection and sizing accuracy, this program developed a methodology for quantification of AUT systems, advancing and quantifying AUT systems' image-capture capabilities, quantifying the performance of multiple AUT...
Automation Bias: Decision Making and Performance in High-Tech Cockpits
NASA Technical Reports Server (NTRS)
Mosier, Kathleen L.; Skitka, Linda J.; Heers, Susan; Burdick, Mark; Rosekind, Mark R. (Technical Monitor)
1997-01-01
Automated aids and decision support tools are rapidly becoming indispensable tools in high-technology cockpits, and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate "automation bias," a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation "events," or opportunities for automation-related omission and commission errors. Pilots who perceived themselves as "accountable" for their performance and strategies of interaction with the automation were more likely to double-check automated functioning against other cues, and less likely to commit errors. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.
NASA Astrophysics Data System (ADS)
Chan-Amaya, Alejandro; Anaya-Pérez, María Elena; Benítez-Baltazar, Víctor Hugo
2017-08-01
Companies are constantly looking for productivity improvements to increase their competitiveness, and automation technologies have proven to be an effective tool for achieving this. Some companies, however, are not familiar with the process of acquiring automation technologies; they therefore abstain from investing and miss the opportunity to take advantage of them. The present document proposes a methodology to determine the level of automation appropriate for a production process, avoiding unnecessary automation and improving production while taking ergonomic factors into consideration.
Expert systems for automated maintenance of a Mars oxygen production system
NASA Astrophysics Data System (ADS)
Huang, Jen-Kuang; Ho, Ming-Tsang; Ash, Robert L.
1992-08-01
Application of expert system concepts to a breadboard Mars oxygen processor unit has been studied and tested. The research was directed toward developing the methodology required to enable autonomous operation and control of these simple chemical processors at Mars. Failure detection and isolation was the key area of concern, and schemes using forward chaining, backward chaining, knowledge-based expert systems, and rule-based expert systems were examined. Tests and simulations were conducted that investigated self-health checkout, emergency shutdown, and fault detection, in addition to normal control activities. A dynamic system model was developed using the Bond-Graph technique. The dynamic model agreed well with tests involving sudden reductions in throughput. However, nonlinear effects were observed during tests that incorporated step function increases in flow variables. Computer simulations and experiments have demonstrated the feasibility of expert systems utilizing rule-based diagnosis and decision-making algorithms.
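The forward-chaining style of rule-based fault diagnosis mentioned in the abstract can be sketched in a few lines. The sensor facts and rules for a hypothetical oxygen processor are invented for illustration; a real system would derive them from the plant model.

```python
RULES = [  # (conditions, conclusion) -- hypothetical diagnostic rules
    ({"flow_low", "pressure_normal"}, "valve_fault"),
    ({"flow_low", "pressure_low"}, "leak"),
    ({"leak"}, "shutdown"),
]

def forward_chain(facts, rules):
    """Fire rules repeatedly until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

derived = forward_chain({"flow_low", "pressure_low"}, RULES)
print(sorted(derived))
```

Backward chaining inverts this: it starts from a hypothesis such as "leak" and recursively checks whether its conditions are supported by the observed facts.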
Trade-off decisions in distribution utility management
NASA Astrophysics Data System (ADS)
Slavickas, Rimas Anthony
As a result of the "unbundling" of traditional monopolistic electricity generation and transmission enterprises into a free-market economy, power distribution utilities are faced with very difficult decisions pertaining to electricity supply options and quality of service to the customers. The management of distribution utilities has become increasingly complex, versatile, and dynamic to the extent that conventional, non-automated management tools are almost useless and obsolete. This thesis presents a novel and unified approach to managing electricity supply options and quality of service to customers. The technique formulates the problem in terms of variables, parameters, and constraints. An advanced Mixed Integer Programming (MIP) optimization formulation is developed together with novel, logical, decision-making algorithms. These tools enable the utility management to optimize various cost components and assess their time-trend impacts, taking into account the intangible issues such as customer perception, customer expectation, social pressures, and public response to service deterioration. The above concepts are further generalized and a Logical Proportion Analysis (LPA) methodology and associated software have been developed. Solutions using numbers are replaced with solutions using words (character strings) which more closely emulate the human decision-making process and advance the art of decision-making in the power utility environment. Using practical distribution utility operation data and customer surveys, the developments outlined in this thesis are successfully applied to several important utility management problems. These involve the evaluation of alternative electricity supply options, the impact of rate structures on utility business, and the decision of whether to continue to purchase from a main grid or generate locally (partially or totally) by building Non-Utility Generation (NUG).
End-User Applications of Real-Time Earthquake Information in Europe
NASA Astrophysics Data System (ADS)
Cua, G. B.; Gasparini, P.; Giardini, D.; Zschau, J.; Filangieri, A. R.; Reakt Wp7 Team
2011-12-01
The primary objective of European FP7 project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction) is to improve the efficiency of real-time earthquake risk mitigation methods and their capability of protecting structures, infrastructures, and populations. REAKT aims to address the issues of real-time earthquake hazard and response from end-to-end, with efforts directed along the full spectrum of methodology development in earthquake forecasting, earthquake early warning, and real-time vulnerability systems, through optimal decision-making, and engagement and cooperation of scientists and end users for the establishment of best practices for use of real-time information. Twelve strategic test cases/end users throughout Europe have been selected. This diverse group of applications/end users includes civil protection authorities, railway systems, hospitals, schools, industrial complexes, nuclear plants, lifeline systems, national seismic networks, and critical structures. The scale of target applications covers a wide range, from two school complexes in Naples, to individual critical structures, such as the Rion Antirion bridge in Patras, and the Fatih Sultan Mehmet bridge in Istanbul, to large complexes, such as the SINES industrial complex in Portugal and the Thessaloniki port area, to distributed lifeline and transportation networks and nuclear plants. Some end-users are interested in in-depth feasibility studies for use of real-time information and development of rapid response plans, while others intend to install real-time instrumentation and develop customized automated control systems. From the onset, REAKT scientists and end-users will work together on concept development and initial implementation efforts using the data products and decision-making methodologies developed with the goal of improving end-user risk mitigation. The aim of this scientific/end-user partnership is to ensure that scientific efforts are applicable to operational, real-world problems.
Conference on Automated Decision-Making and Problem Solving, the Third Day: Issues Discussed
NASA Technical Reports Server (NTRS)
Hawkins, W. W.; Pennington, J. E.; Baker, L. K.
1980-01-01
A conference held at Langley Research Center in May of 1980 brought together university experts from the fields of Control Theory, Operations Research, and Artificial Intelligence to explore current research in automation, both from the perspective of their own particular disciplines and from that of interdisciplinary considerations. Informal discussions from the final day of the three-day conference are summarized.
Effects of Automation Types on Air Traffic Controller Situation Awareness and Performance
NASA Technical Reports Server (NTRS)
Sethumadhavan, A.
2009-01-01
The Joint Planning and Development Office has proposed the introduction of automated systems to help air traffic controllers handle the increasing volume of air traffic in the next two decades (JPDO, 2007). Because fully automated systems leave operators out of the decision-making loop (e.g., Billings, 1991), it is important to determine the right level and type of automation that will keep air traffic controllers in the loop. This study examined the differences in the situation awareness (SA) and collision detection performance of individuals when they worked with information acquisition, information analysis, decision and action selection and action implementation automation to control air traffic (Parasuraman, Sheridan, & Wickens, 2000). When the automation was unreliable, the time taken to detect an upcoming collision was significantly longer for all the automation types compared with the information acquisition automation. This poor performance following automation failure was mediated by SA, with lower SA yielding poor performance. Thus, the costs associated with automation failure are greater when automation is applied to higher order stages of information processing. Results have practical implications for automation design and development of SA training programs.
Humans: still vital after all these years of automation.
Parasuraman, Raja; Wickens, Christopher D
2008-06-01
The authors discuss empirical studies of human-automation interaction and their implications for automation design. Automation is prevalent in safety-critical systems and increasingly in everyday life. Many studies of human performance in automated systems have been conducted over the past 30 years. Developments in three areas are examined: levels and stages of automation, reliance on and compliance with automation, and adaptive automation. Automation applied to information analysis or decision-making functions leads to differential system performance benefits and costs that must be considered in choosing appropriate levels and stages of automation. Human user dependence on automated alerts and advisories reflects two components of operator trust, reliance and compliance, which are in turn determined by the threshold designers use to balance automation misses and false alarms. Finally, adaptive automation can provide additional benefits in balancing workload and maintaining the user's situation awareness, although more research is required to identify when adaptation should be user controlled or system driven. The past three decades of empirical research on humans and automation has provided a strong science base that can be used to guide the design of automated systems. This research can be applied to most current and future automated systems.
Metzger, Ulla; Parasuraman, Raja
2005-01-01
Future air traffic management concepts envisage shared decision-making responsibilities between controllers and pilots, necessitating that controllers be supported by automated decision aids. Even as automation tools are being introduced, however, their impact on the air traffic controller is not well understood. The present experiments examined the effects of an aircraft-to-aircraft conflict decision aid on performance and mental workload of experienced, full-performance level controllers in a simulated Free Flight environment. Performance was examined with both reliable (Experiment 1) and inaccurate automation (Experiment 2). The aid improved controller performance and reduced mental workload when it functioned reliably. However, detection of a particular conflict was better under manual conditions than under automated conditions when the automation was imperfect. Potential or actual applications of the results include the design of automation and procedures for future air traffic control systems.
IFC BIM-Based Methodology for Semi-Automated Building Energy Performance Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bazjanac, Vladimir
2008-07-01
Building energy performance (BEP) simulation is still rarely used in building design, commissioning and operations. The process is too costly and too labor intensive, and it takes too long to deliver results. Its quantitative results are not reproducible due to arbitrary decisions and assumptions made in simulation model definition, and can be trusted only under special circumstances. A methodology to semi-automate BEP simulation preparation and execution makes this process much more effective. It incorporates principles of information science and aims to eliminate inappropriate human intervention that results in subjective and arbitrary decisions. This is achieved by automating every part of the BEP modeling and simulation process that can be automated, by relying on data from original sources, and by making any necessary data transformation rule-based and automated. This paper describes the new methodology and its relationship to IFC-based BIM and software interoperability. It identifies five steps that are critical to its implementation, and shows what part of the methodology can be applied today. The paper concludes with a discussion of application to simulation with EnergyPlus, and describes data transformation rules embedded in the new Geometry Simplification Tool (GST).
An evaluation of NASA's program in human factors research: Aircrew-vehicle system interaction
NASA Technical Reports Server (NTRS)
1982-01-01
Research in human factors in the aircraft cockpit and a proposed program augmentation were reviewed. The dramatic growth of microprocessor technology makes it entirely feasible to automate increasingly more functions in the aircraft cockpit; the promise of improved vehicle performance, efficiency, and safety through automation makes highly automated flight inevitable. However, an organized data base and a validated methodology for predicting the effects of automation on human performance, and thus on safety, are lacking; without them, increased automation may introduce new risks. Efforts should be concentrated on developing methods and techniques for analyzing man-machine interactions, including human workload and the prediction of performance.
Effects of imperfect automation on decision making in a simulated command and control task.
Rovira, Ericka; McGarry, Kathleen; Parasuraman, Raja
2007-02-01
Effects of four types of automation support and two levels of automation reliability were examined. The objective was to examine the differential impact of information and decision automation and to investigate the costs of automation unreliability. Research has shown that imperfect automation can lead to differential effects of stages and levels of automation on human performance. Eighteen participants performed a "sensor to shooter" targeting simulation of command and control. Dependent variables included accuracy and response time of target engagement decisions, secondary task performance, and subjective ratings of mental workload, trust, and self-confidence. Compared with manual performance, reliable automation significantly reduced decision times. Unreliable automation led to greater cost in decision-making accuracy under the higher automation reliability condition for three different forms of decision automation relative to information automation. At low automation reliability, however, there was a cost in performance for both information and decision automation. The results are consistent with a model of human-automation interaction that requires evaluation of the different stages of information processing to which automation support can be applied. If fully reliable decision automation cannot be guaranteed, designers should provide users with information automation support or other tools that allow for inspection and analysis of raw data.
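The stage-dependent cost of unreliable automation that this and the surrounding abstracts report can be illustrated with a toy Monte Carlo. All probabilities and response times below are invented; the sketch only encodes the qualitative pattern (operators accept decision-automation advice more readily, so bad advice hurts more at that stage).

```python
import random

random.seed(1)

def trial(aid, reliability):
    """One simulated engagement decision: returns (correct, seconds)."""
    if aid == "manual":
        return random.random() < 0.80, 10.0
    aid_correct = random.random() < reliability
    if aid == "information":
        # Operator still decides; bad advice degrades accuracy only mildly.
        p = 0.90 if aid_correct else 0.70
        return random.random() < p, 7.0
    # Decision automation: operator tends to accept the recommendation.
    p = 0.95 if aid_correct else 0.40
    return random.random() < p, 5.0

def accuracy(aid, reliability, n=20000):
    return sum(trial(aid, reliability)[0] for _ in range(n)) / n

print(accuracy("decision", 0.9), accuracy("decision", 0.6),
      accuracy("information", 0.6))
```

Under these invented parameters, decision automation outperforms everything when reliable but falls below information automation when reliability drops, matching the abstract's main finding.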
Function allocation for humans and automation in the context of team dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jeffrey C. Joe; John O'Hara; Jacques Hugo
Within Human Factors Engineering, a decision-making process called function allocation (FA) is used during the design life cycle of complex systems to distribute the system functions, often identified through a functional requirements analysis, to all human and automated machine agents (or teammates) involved in controlling the system. Most FA methods make allocation decisions primarily by comparing the capabilities of humans and automation, but then also by considering secondary factors such as cost, regulations, and the health and safety of workers. The primary analysis of the strengths and weaknesses of humans and machines, however, is almost always considered in terms of individual human or machine capabilities. Yet FA is fundamentally about teamwork, in that the goal of the FA decision-making process is to determine the optimal allocation of functions among agents. Given this framing of FA, and the increasing use and sophistication of automation, there are two related social psychological issues that current FA methods need to address more thoroughly. First, many principles for effective human teamwork are not considered as central decision points or in the iterative hypothesis-and-testing phase of most FA methods, even though it is clear that social factors have numerous positive and negative effects on individual and team capabilities. Second, social psychological factors affect team performance and can be difficult to translate to automated agents, and most FA methods currently do not account for this effect. The implications of these issues are discussed.
de Bruin, Jeroen S; Adlassnig, Klaus-Peter; Leitich, Harald; Rappelsberger, Andrea
2018-01-01
Evidence-based clinical guidelines have a major positive effect on the physician's decision-making process. Computer-executable clinical guidelines allow for automated guideline marshalling during a clinical diagnostic process, thus improving the decision-making process. Implementation of a digital clinical guideline for the prevention of mother-to-child transmission of hepatitis B as a computerized workflow, thereby separating business logic from medical knowledge and decision-making. We used the Business Process Model and Notation language system Activiti for business logic and workflow modeling. Medical decision-making was performed by an Arden-Syntax-based medical rule engine, which is part of the ARDENSUITE software. We succeeded in creating an electronic clinical workflow for the prevention of mother-to-child transmission of hepatitis B, where institution-specific medical decision-making processes could be adapted without modifying the workflow business logic. Separation of business logic and medical decision-making results in more easily reusable electronic clinical workflows.
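The separation the abstract describes, generic workflow business logic delegating each decision step to a swappable medical rule, can be sketched as follows. The hepatitis-B rule below is a made-up stand-in for the Arden-Syntax logic, not the real guideline, and the step names are invented.

```python
def hbv_prophylaxis_rule(patient):
    """Institution-specific medical decision logic; replaceable without
    touching the workflow (hypothetical simplification of the guideline)."""
    if patient["hbsag_positive"]:
        return "give_immunoglobulin_and_vaccine"
    return "routine_vaccination"

def run_workflow(patient, decision_rule):
    """Business logic: a fixed sequence of steps, with the medical
    decision delegated to the injected rule function."""
    steps = ["register_mother", "order_hbsag_test"]
    steps.append(decision_rule(patient))   # medical decision-making
    steps.append("document_outcome")
    return steps

trace = run_workflow({"hbsag_positive": True}, hbv_prophylaxis_rule)
print(trace)
```

In the paper's architecture the workflow half corresponds to the Activiti/BPMN model and the rule half to the Arden-Syntax engine; swapping `hbv_prophylaxis_rule` for another institution's rule leaves `run_workflow` untouched, which is the reusability claim.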
Monitoring costs in the ICU: a search for a pertinent methodology.
Reis Miranda, D; Jegers, M
2012-10-01
Attempts to determine costs in the intensive care unit (ICU) have so far been unsuccessful, as they failed to detect differences in costs between patients. The methodology and/or the instruments used might be at the origin of this failure. Based on the results of the European ICU studies and on descriptions of the activities of care in the ICU, we gathered and analysed the relevant literature concerning the monitoring of costs in the ICU. The aim was to formulate a methodology, from an economic perspective, within which future research may be framed. A bottom-up microcosting methodology will enable costs to be distinguished between patients. The resulting information will at the same time support the decision-making of top management and be ready to include in the financial system of the hospital. Nursing staff explains about 30% of the total costs; this relation remains constant irrespective of the annual nurse/patient ratio. In contrast with other scoring instruments, the nursing activities score (NAS) covers all nursing activities. (1) NAS is to be chosen for quantifying nursing activities; (2) an instrument for measuring the physician's activities is not yet available; (3) because the nursing activities have a large impact on total costs, standardisation of the processes of care (following the system approach) will help to manage costs, while also making the issue of quality of care reproducible; (4) the quantification of nursing activities may be the required (proxy) input for automated bottom-up monitoring of costs in the ICU.
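Bottom-up microcosting with a NAS-like proxy for nursing time can be sketched as a per-patient accumulation over activity records. The cost rate per NAS point and all quantities below are invented for illustration.

```python
NURSE_COST_PER_NAS_POINT = 3.0   # hypothetical cost per NAS point

def patient_cost(records):
    """records: list of (category, quantity, unit_cost) for one patient-day;
    nursing is entered as ('nursing_nas', score, None) and costed via the
    NAS proxy rate instead of a direct unit cost."""
    total = 0.0
    for category, quantity, unit_cost in records:
        if category == "nursing_nas":
            total += quantity * NURSE_COST_PER_NAS_POINT
        else:
            total += quantity * unit_cost
    return total

day = [("nursing_nas", 60, None),      # NAS score for the day
       ("drugs", 4, 25.0),             # 4 units at 25.0 each
       ("ventilator_hours", 12, 8.0)]  # 12 hours at 8.0 each
print(patient_cost(day))
```

Because each cost line is attached to an individual patient-day, this bottom-up structure is what lets the methodology detect the between-patient cost differences that top-down averaging misses.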
2015-05-01
multiple automated cognitive tests, data management and reporting capabilities, and an executive menu. The DANA battery was given on a Trimble NOMAD ...handheld computing device using a stylus for consistency. The NOMAD runs a custom version of the Android Operating System and has a color 3.5 inch... digit pairs are shown below a key, and the participant indicates whether each matches the one in the key. PRO This test targets decision-making capabilities
Formal verification of human-automation interaction
NASA Technical Reports Server (NTRS)
Degani, Asaf; Heymann, Michael
2002-01-01
This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The methodology is expected to augment and enhance the design of human-automation interfaces through formal verification.
Automation bias: decision making and performance in high-tech cockpits.
Mosier, K L; Skitka, L J; Heers, S; Burdick, M
1997-01-01
Automated aids and decision support tools are rapidly becoming indispensable in high-technology cockpits and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate automation bias, a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation events or opportunities for automation-related omission and commission errors. Although experimentally manipulated accountability demands did not significantly impact performance, post hoc analyses revealed that pilots who reported an internalized perception of "accountability" for their performance and their strategies of interaction with the automation were significantly more likely to double-check automated functioning against other cues, and less likely to commit errors, than those who did not share this perception. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.
NASA Astrophysics Data System (ADS)
Gupta, Mahima; Mohanty, B. K.
2017-04-01
In this paper, we develop a methodology to derive the level of compensation numerically in multiple criteria decision-making (MCDM) problems under a fuzzy environment. The degree of compensation depends on the tranquility and anxiety experienced by the decision-maker while making the decision: higher tranquility leads to greater realisation of compensation, whereas an increased level of anxiety reduces the amount of compensation in the decision process. This work determines the level of tranquility (or anxiety) using the concept of fuzzy sets and its various level sets. The concepts of indexing of fuzzy numbers, risk barriers and the tranquility level of the decision-maker are used to derive the decision-maker's risk-prone or risk-averse attitude in each criterion. The aggregation of the risk levels in each criterion gives the amount of compensation in the entire MCDM problem. Inclusion of the compensation leads us to model the MCDM problem as a binary integer programming (BIP) problem, whose solution gives the compensatory decision to the MCDM problem. The proposed methodology is illustrated through a numerical example.
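A toy numerical reading of the compensation idea above, assuming a simple blend between a non-compensatory aggregate (the minimum criterion score) and a fully compensatory one (the mean), with the blend weight standing in for the tranquility-derived degree of compensation. The exhaustive 0/1 search is only a stand-in for the paper's binary integer programme, and all names and numbers are illustrative:

```python
from itertools import product

def compensated_score(scores, gamma):
    """Blend a non-compensatory aggregate (min) with a fully compensatory
    one (mean); gamma in [0, 1] plays the role of the degree of
    compensation derived from the decision-maker's tranquility."""
    return (1 - gamma) * min(scores) + gamma * sum(scores) / len(scores)

def best_alternative(score_matrix, gamma):
    """Brute-force 0/1 selection of exactly one alternative, a toy
    stand-in for solving the paper's BIP formulation."""
    best, best_val = None, float("-inf")
    for x in product([0, 1], repeat=len(score_matrix)):
        if sum(x) != 1:          # constraint: pick exactly one alternative
            continue
        i = x.index(1)
        val = compensated_score(score_matrix[i], gamma)
        if val > best_val:
            best, best_val = i, val
    return best

# Alternative 0 is strong on one criterion and weak on the other;
# alternative 1 is balanced. The chosen alternative flips with gamma.
scores = [[1.0, 0.1], [0.5, 0.5]]
```

With no compensation (gamma = 0) the balanced alternative wins; with full compensation (gamma = 1) the uneven but higher-mean alternative wins, which is the behaviour the compensation level is meant to control.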
Automation and robotics for the Space Station - An ATAC perspective
NASA Technical Reports Server (NTRS)
Nunamaker, Robert R.
1989-01-01
The study of automation and robotics for the Space Station by the Advanced Technology Advisory Committee is surveyed. The formation of the committee and the methodology for the Space Station automation study are discussed. The committee's recommendations for automation and robotics research and development are listed.
Dynamic Decision-Making in Multi-Task Environments: Theory and Experimental Results.
1981-03-15
The role of the human operator has changed from that of a controller to that of a monitor of multiple tasks, or a supervisor of several semi-automated subsystems. The operator's primary responsibility in this new role is to extract information from the environment and to integrate it for action selection.
Software for rapid prototyping in the pharmaceutical and biotechnology industries.
Kappler, Michael A
2008-05-01
The automation of drug discovery methods continues to develop, especially techniques that process information, represent workflow and facilitate decision-making. The magnitude of data and the plethora of questions in pharmaceutical and biotechnology research give rise to the need for rapid prototyping software. This review describes the advantages and disadvantages of three solutions: Competitive Workflow, Taverna and Pipeline Pilot. Each of these systems processes large amounts of data, integrates diverse systems and assists novice programmers and human experts in critical decision-making steps.
Hemorrhage Detection and Segmentation in Traumatic Pelvic Injuries
Davuluri, Pavani; Wu, Jie; Tang, Yang; Cockrell, Charles H.; Ward, Kevin R.; Najarian, Kayvan; Hargraves, Rosalyn H.
2012-01-01
Automated hemorrhage detection and segmentation in traumatic pelvic injuries is vital for fast and accurate treatment decision making. Hemorrhage is the main cause of death in patients within the first 24 hours after the injury. It is very time consuming for physicians to analyze all Computed Tomography (CT) images manually. As time is crucial in emergency medicine, analyzing medical images manually delays the decision-making process. Automated hemorrhage detection and segmentation can significantly help physicians to analyze these images and make fast and accurate decisions. Hemorrhage segmentation is a crucial step in the accurate diagnosis and treatment decision-making process. This paper presents a novel rule-based hemorrhage segmentation technique that utilizes pelvic anatomical information to segment hemorrhage accurately. An evaluation measure is used to quantify the accuracy of hemorrhage segmentation. The results show that the proposed method segments hemorrhage well and is promising. PMID:22919433
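The core rule-based idea, combining an intensity rule with an anatomical constraint, can be sketched as follows. The thresholds, the tiny image, and the function name are invented for illustration and do not reproduce the paper's actual rules:

```python
# Toy rule-based segmentation sketch: a pixel is labelled hemorrhage only
# if its intensity falls in a plausible blood range AND it lies inside an
# anatomical region of interest. This mirrors the key idea of constraining
# intensity rules with pelvic anatomical information (values are invented).

def segment(image, roi_mask, lo=60, hi=90):
    """Apply the intensity rule jointly with the anatomical mask."""
    return [[1 if (lo <= v <= hi and inside) else 0
             for v, inside in zip(row, mask_row)]
            for row, mask_row in zip(image, roi_mask)]

image = [[70, 200],
         [65,  70]]
roi   = [[True, True],
         [False, True]]
# 200 fails the intensity rule; 65 fails the anatomical rule.
labels = segment(image, roi)
```

The anatomical mask is what suppresses intensity look-alikes outside the pelvic region, which a purely intensity-based rule would mislabel.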
Just-in-time automated counseling for physical activity promotion.
Bickmore, Timothy; Gruber, Amanda; Intille, Stephen
2008-11-06
Preliminary results from a field study into the efficacy of automated health behavior counseling delivered at the moment of user decision-making compared to the same counseling delivered at the end of the day are reported. The study uses an animated PDA-based advisor with an integrated accelerometer that can engage users in dialogues about their physical activity throughout the day. Preliminary results indicate health counseling is more effective when delivered just-in-time than when delivered retrospectively.
Flores, Walter
2010-01-01
Governance refers to decision-making processes in which power relationships and the particular interests of actors and institutions converge. Situations of consensus and conflict are inherent to such processes. Furthermore, decision-making happens within a framework of ethical principles, motivations and incentives which may be explicit or implicit. Health systems in most Latin American and Caribbean countries take equity, solidarity, social participation and the right to health as their guiding principles; such principles must therefore rule governance processes. However, this is not always the case, and herein lies the importance of investigating governance in health systems. Advancing the investigation of governance has conceptual and methodological implications. At the conceptual level, clarifying and integrating normative and analytical approaches is relevant, as both are necessary for an approach seeking to investigate and understand the complexity of social phenomena. At the methodological level, there is a need to expand the range of variables, sources of information and indicators for studying decision-making aimed at greater equity, health citizenship and public policy efficiency.
A Case Study of Reverse Engineering Integrated in an Automated Design Process
NASA Astrophysics Data System (ADS)
Pescaru, R.; Kyratsis, P.; Oancea, G.
2016-11-01
This paper presents a design methodology which automates the generation of curves extracted from point clouds obtained by digitizing physical objects. The methodology is demonstrated on a consumer product, specifically a footwear product with a complex shape containing many curves. The final result is the automated generation of wrapping curves, surfaces and solids according to the characteristics of the customer's foot and to preferences for the chosen model, which leads to the development of customized products.
Problem solving using soft systems methodology.
Land, L
This article outlines a method of problem solving which considers holistic solutions to complex problems. Soft systems methodology allows people involved in the problem situation to have control over the decision-making process.
International Students Decision-Making Process
ERIC Educational Resources Information Center
Cubillo, Jose Maria; Sanchez, Joaquin; Cervino, Julio
2006-01-01
Purpose--The purpose of this paper is to propose a theoretical model that integrates the different groups of factors which influence the decision-making process of international students, analysing different dimensions of this process and explaining those factors which determine students' choice. Design/methodology/approach--A hypothetical model…
William H. Cooke; Dennis M. Jacobs
2002-01-01
FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....
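The NDVI screening step can be sketched as follows, assuming a plot is "problematic" when its NDVI disagrees with the value expected for its recorded land use/land cover class. The class expectations, tolerance, and data are illustrative, not the Station's actual rules:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def flag_problem_plots(plots, expected, tol=0.25):
    """Flag plots whose observed NDVI is far from the NDVI expected for
    their recorded land use/land cover class (threshold is illustrative).
    Each plot is (plot_id, nir_reflectance, red_reflectance, lulc_class)."""
    return [pid for pid, nir, red, cls in plots
            if abs(ndvi(nir, red) - expected[cls]) > tol]

# Plot "B" is recorded as forest but has near-zero NDVI, so it is flagged.
expected = {"forest": 0.7, "urban": 0.1}
plots = [("A", 0.8, 0.10, "forest"),
         ("B", 0.3, 0.25, "forest")]
flagged = flag_problem_plots(plots, expected)
```

The real methodology would derive per-class NDVI expectations from imagery statistics rather than fixed constants, but the screening logic is the same.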
Automated power management and control
NASA Technical Reports Server (NTRS)
Dolce, James L.
1991-01-01
A comprehensive automation design is being developed for Space Station Freedom's electric power system. A joint effort between NASA's Office of Aeronautics and Exploration Technology and NASA's Office of Space Station Freedom, it strives to increase station productivity by applying expert systems and conventional algorithms to automate power system operation. The initial station operation will use ground-based dispatchers to perform the necessary command and control tasks. These tasks constitute planning and decision-making activities that strive to eliminate unplanned outages. We perceive an opportunity to help these dispatchers make fast and consistent on-line decisions by automating three key tasks: failure detection and diagnosis, resource scheduling, and security analysis. Expert systems will be used for the diagnostics and for the security analysis; conventional algorithms will be used for the resource scheduling.
Solid Waste Management Planning--A Methodology
ERIC Educational Resources Information Center
Theisen, Hilary M.; And Others
1975-01-01
This article presents a twofold solid waste management plan consisting of a basic design methodology and a decision-making methodology. The former provides a framework for the developing plan while the latter builds flexibility into the design so that there is a model for use during the planning process. (MA)
Library Automation in Sub Saharan Africa: Case Study of the University of Botswana
ERIC Educational Resources Information Center
Mutula, Stephen Mudogo
2012-01-01
Purpose: This article aims to present experiences and the lessons learned from the University of Botswana (UB) library automation project. The implications of the project for similar libraries planning automation in sub Saharan Africa and beyond are adduced. Design/methodology/approach: The article is a case study of library automation at the…
Automation of Acquisition Records and Routine in the University Library, Newcastle upon Tyne
ERIC Educational Resources Information Center
Line, Maurice B.
2006-01-01
Purpose: Reports on the trial of an automated order routine for the University Library in Newcastle which began in April 1966. Design/methodology/approach: Presents the author's experiences of the manual order processing system, and the impetus for trialling an automated system. The stages of the automated system are described in detail. Findings:…
Emerging Educational Institutional Decision-Making Matrix
ERIC Educational Resources Information Center
Ashford-Rowe, Kevin H.; Holt, Marnie
2011-01-01
The "emerging educational institutional decision-making matrix" is developed to allow educational institutions to adopt a rigorous and consistent methodology of determining which of the myriad of emerging educational technologies will be the most compelling for the institution, particularly ensuring that it is the educational or pedagogical but…
Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S
2015-03-02
A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor, followed by measurement of the luminescence of the bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided precise control of the reaction conditions, which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri, and the results were compared with those provided by a conventional assay kit (Biotox(®)). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed a higher impact on V. fischeri, evidenced by lower EC50 values. The proposed methodology was validated through statistical analysis, which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as an alternative to microplate-based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.
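As a sketch of how an EC50 might be read off luminescence-inhibition data such as these, the helper below interpolates log-linearly between the two tested concentrations bracketing 50% inhibition. This is a common shortcut, not the dose-response analysis used in the paper, and the data are invented:

```python
import math

def ec50(concentrations, inhibition):
    """Estimate the EC50 (concentration giving 50% luminescence inhibition)
    by log-linear interpolation between the two tested concentrations that
    bracket 50%. Inputs must be sorted by increasing concentration."""
    points = list(zip(concentrations, inhibition))
    for (c1, i1), (c2, i2) in zip(points, points[1:]):
        if i1 <= 50 <= i2:
            frac = (50 - i1) / (i2 - i1)
            return 10 ** (math.log10(c1)
                          + frac * (math.log10(c2) - math.log10(c1)))
    raise ValueError("50% inhibition not bracketed by the tested range")

# Invented dose-response data: 20%, 50%, 80% inhibition at 1, 10, 100 units.
estimate = ec50([1.0, 10.0, 100.0], [20.0, 50.0, 80.0])
```

A lower EC50 means a smaller concentration suffices to halve luminescence, i.e. a more toxic compound, which is how the abstract's EC50 comparison should be read.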
Integrated Risk-Informed Decision-Making for an ALMR PRISM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Muhlheim, Michael David; Belles, Randy; Denning, Richard S.
Decision-making is the process of identifying decision alternatives, assessing those alternatives based on predefined metrics, selecting an alternative (i.e., making a decision), and then implementing that alternative. The generation of decisions requires a structured, coherent decision-making process. The overall objective of this work is for the generalized framework to be adopted into an autonomous decision-making framework and tailored to the specific requirements of various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks rationally, consistently, and reliably. Risk-informed decision-making requires a probabilistic assessment of the likelihood of success given the status of the plant/systems and component health, and a deterministic assessment of the relation between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The probabilistic portion of the decision-making engine of the supervisory control system is based on the control actions associated with an ALMR PRISM. Newly incorporated into the probabilistic models are the prognostic/diagnostic models developed by Pacific Northwest National Laboratory, which allow decisions to incorporate the health of components into the decision-making process. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic portion of the decision-making engine uses thermal-hydraulic modeling and components for an advanced liquid-metal reactor Power Reactor Inherently Safe Module.
The deterministic multi-attribute decision-making framework uses various sensor data (e.g., reactor outlet temperature, steam generator drum level) and calculates the plant's position within the challenge state, its trajectory, and its margin within the controllable domain, using utility functions to evaluate current and projected plant state space for different control decisions. The metrics that are evaluated are based on reactor trip set points. The integration of deterministic calculations using multi-physics analyses with probabilistic safety calculations allows for the examination and quantification of margin recovery strategies; the thermal-hydraulics analyses thereby validate the control options identified from the probabilistic assessment. Future work includes evaluating other possible metrics and computational efficiencies, and developing a user interface to mimic display panels at a modern nuclear power plant.
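The margin-to-trip utility idea can be sketched as follows. The linear utility shape, parameter values, and option names are invented for illustration and are not from the PRISM models:

```python
def margin_utility(value, trip_setpoint, nominal):
    """Utility of one operating parameter: 1.0 at the nominal value,
    falling linearly to 0.0 at the reactor-trip set point (a deliberately
    simple utility shape; assumes trip_setpoint > nominal)."""
    span = trip_setpoint - nominal
    return max(0.0, min(1.0, (trip_setpoint - value) / span))

def rank_options(options, trip_setpoint, nominal):
    """Order candidate control actions by the projected margin they leave
    to the trip set point; `options` maps action name -> projected value
    of the monitored parameter under that action."""
    return sorted(options,
                  key=lambda name: margin_utility(options[name],
                                                 trip_setpoint, nominal),
                  reverse=True)

# Invented example: projected reactor outlet temperatures under three
# candidate actions, trip set point 560, nominal 510.
ranking = rank_options({"hold": 545.0, "reduce_power": 520.0, "none": 558.0},
                       trip_setpoint=560.0, nominal=510.0)
```

In a full framework a separate utility would be computed per parameter and aggregated across attributes; this sketch shows only the single-parameter margin scoring that the trip-set-point metrics imply.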
Defining the drivers for accepting decision making automation in air traffic management.
Bekier, Marek; Molesworth, Brett R C; Williamson, Ann
2011-04-01
Air Traffic Management (ATM) operators are under increasing pressure to improve the efficiency of their operation to cater for forecasted increases in air traffic movements. One solution involves increasing the utilisation of automation within the ATM system. The success of this approach is contingent on Air Traffic Control Operators' (ATCOs) willingness to accept increased levels of automation. The main aim of the present research was to examine the drivers underpinning ATCOs' willingness to accept increased utilisation of automation within their role. Two fictitious scenarios involving the application of two new automated decision-making tools were created. The results of an online survey revealed traditional predictors of automation acceptance such as age, trust and job satisfaction explain between 4 and 7% of the variance. Furthermore, these predictors varied depending on the purpose in which the automation was to be employed. These results are discussed from an applied and theoretical perspective. STATEMENT OF RELEVANCE: Efficiency improvements in ATM are required to cater for forecasted increases in air traffic movements. One solution is to increase the utilisation of automation within Air Traffic Control. The present research examines the drivers underpinning air traffic controllers' willingness to accept increased levels of automation in their role.
William H. Cooke; Dennis M. Jacobs
2005-01-01
FIA annual inventories require rapid updating of pixel-based Phase 1 estimates. Scientists at the Southern Research Station are developing an automated methodology that uses a Normalized Difference Vegetation Index (NDVI) for identifying and eliminating problem FIA plots from the analysis. Problem plots are those that have questionable land use/land cover information....
[Shared decision-making in mental health care: a role model from youth mental health care].
Westermann, G M A; Maurer, J M G
2015-01-01
In the communication and interaction between doctor and patient in Western health care there has been a paradigm shift from the paternalistic approach to shared decision-making. To summarise the background, recent developments and the current level of shared decision-making in (youth) mental health care, we conducted a critical review of the literature relating to methodology development, research and the use of counselling and decision-making in mental health care. The majority of patients, professionals and other stakeholders consider shared decision-making to be desirable and important for improving the quality and efficiency of care. Until recently most research and studies concentrated on helping patients to develop decision-making skills and on showing patients how and where to access information. At the moment more attention is being given to the development of skills and circumstances that will increase patients' interaction with care professionals and patients' emotional involvement in shared decision-making. In mental health care for children and adolescents, more often than in adult mental health care, it has been customary to give more attention to these aspects of shared decision-making, particularly during counselling sessions that mark the transition from diagnosis to treatment. This emphasis has long been apparent in textbooks, daily practice, methodology development and research in youth mental health care. Currently, a number of similar developments are taking place in adult mental health care. Although most health professionals support the policy of shared decision-making, its implementation in mental health care is still at an early stage, and in practice a number of obstacles still have to be surmounted.
However, the experience gained with counselling and decision-making in (youth) mental health care may serve as an example to other sections of mental health care and play an important role in the further development of shared decision-making.
Automation - Changes in cognitive demands and mental workload
NASA Technical Reports Server (NTRS)
Tsang, Pamela S.; Johnson, Walter W.
1987-01-01
The effect of partial automation on mental workload in man/machine tasks is investigated experimentally. Subjective workload measures were obtained from six subjects after performance of a task battery comprising two manual tasks (flight-path control, FC, and target acquisition, TA) and one decision-making task (engine failure, EF); the FC task was performed both in a fully manual mode (altitude and lateral control) and in a semiautomated mode (automatic lateral control). The performance results and subjective evaluations are presented in graphs and characterized in detail. The automation is shown to improve objective performance and to lower subjective workload significantly in the combined FC/TA task, but not in the FC task alone or in the FC/EF task.
ERIC Educational Resources Information Center
van den Bosch, Roxette M.; Espin, Christine A.; Chung, Siuman; Saab, Nadira
2017-01-01
Teachers have difficulty using data from Curriculum-based Measurement (CBM) progress graphs of students with learning difficulties for instructional decision-making. As a first step in unraveling those difficulties, we studied teachers' comprehension of CBM graphs. Using think-aloud methodology, we examined 23 teachers' ability to…
Cognitive Task Analysis of Business Jet Pilots' Weather Flying Behaviors: Preliminary Results
NASA Technical Reports Server (NTRS)
Latorella, Kara; Pliske, Rebecca; Hutton, Robert; Chrenka, Jason
2001-01-01
This report presents preliminary findings from a cognitive task analysis (CTA) of business aviation piloting. Results describe challenging weather-related aviation decisions and the information and cues used to support these decisions. Further, these results demonstrate the role of expertise in business aviation decision-making in weather flying, and how weather information is acquired and assessed for reliability. The challenging weather scenarios and novice errors identified in the results provide the basis for experimental scenarios and dependent measures to be used in future flight simulation evaluations of candidate aviation weather information systems. Finally, we analyzed these preliminary results to recommend design and training interventions to improve business aviation decision-making with weather information. The primary objective of this report is to present these preliminary findings and to document the extended CTA methodology used to elicit and represent expert business aviator decision-making with weather information. These preliminary findings will be augmented with results from additional subjects using this methodology. A summary of the complete results, absent the detailed treatment of methodology provided in this report, will be documented in a separate publication.
Unlocking the Full Potential of Earth Observation During the 2015 Texas Flood Disaster
NASA Technical Reports Server (NTRS)
Schumann, G. J-P.; Frye, S.; Wells, G.; Adler, R.; Brakenridge, R.; Bolten, J.; Murray, J.; Slayback, D.; Policelli, F.; Kirschbaum, D.;
2016-01-01
Intense rainfall during late April and early May 2015 in Texas and Oklahoma led to widespread and sustained flooding in several river basins. Texas state agencies responsible for emergency response were activated as severe weather ensued for six weeks, from 8 May until 19 June, following Tropical Storm Bill. An international team of scientists and flood response experts assembled and collaborated with decision-making authorities on user-driven high-resolution satellite acquisitions over the most critical areas, while experimental automated flood mapping techniques provided ongoing daily monitoring. This allowed mapping of flood inundation from an unprecedented number of spaceborne and airborne images: a total of 27,174 images were ingested into the USGS Hazards Data Distribution System (HDDS) Explorer, excluding the SAR images used. Based on the Texas flood use case, we describe the successes of this effort as well as its limitations in fulfilling the needs of decision-makers, and reflect upon these. To unlock the full potential of Earth observation data in flood disaster response, we suggest, in a call for action: (i) stronger collaboration from the onset between agencies, product developers, and decision-makers; (ii) quantification of uncertainties when combining data from different sources in order to augment information content; (iii) a default role for the end-user in satellite acquisition planning; and (iv) proactive assimilation of methodologies and tools into the mandated agencies.
An automated methodology development. [software design for combat simulation]
NASA Technical Reports Server (NTRS)
Hawley, L. R.
1985-01-01
The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.
Barber, Larissa K; Smit, Brandon W
2014-01-01
This study replicated ego-depletion predictions from the self-control literature in a computer simulation task that requires ongoing decision-making in relation to constantly changing environmental information: the Network Fire Chief (NFC). Ego-depletion led to decreased self-regulatory effort, but not performance, on the NFC task. These effects were also buffered by task enjoyment, so that individuals who enjoyed the dynamic decision-making task did not experience ego-depletion effects. These findings confirm that past ego-depletion effects on decision-making are not limited to static or isolated decision-making tasks and extend to the dynamic decision-making processes more common in naturalistic settings. Furthermore, the NFC simulation provides a methodological mechanism for independently measuring effort and performance when studying ego-depletion.
What Every Librarian Should Know About Proposed Changes in Cataloging Rules: A Brief Overview
ERIC Educational Resources Information Center
Edgar, Neal L.
1975-01-01
A discussion of Anglo-American Cataloging Rules includes history, revisions, recent changes, decision-makers and disagreements, and the question of how standardization and the needs of automation will affect the usefulness of the catalog for the library's users. (LS)
Foundation Degree Students and Their Educational Decision-Making
ERIC Educational Resources Information Center
Greenbank, Paul
2009-01-01
Purpose: The purpose of this paper is to examine the decision-making process of students who decided to study for a foundation degree. Design/methodology/approach: The research involved interviewing 30 students who were on, or had recently completed, a business-related foundation degree. Findings: This study found that students were not adopting a…
An analytical procedure to assist decision-making in a government research organization
H. Dean Claxton; Giuseppe Rensi
1972-01-01
An analytical procedure to help management decision-making in planning government research is described. The objectives, activities, and restrictions of a government research organization are modeled in a consistent analytical framework. Theory and methodology is drawn from economics and mathe-matical programing. The major analytical aspects distinguishing research...
Sustainability and Ethics as Decision-Making Paradigms in Engineering Curricula
ERIC Educational Resources Information Center
El-Zein, Abbas; Airey, David; Bowden, Peter; Clarkeburn, Henriikka
2008-01-01
Purpose: The aim of this paper is to explore the rationale for teaching sustainability and engineering ethics within a decision-making paradigm, and critically appraise ways of achieving related learning outcomes. Design/methodology/approach: The paper presents the experience of the School of Civil Engineering at the University of Sydney in…
Got risk? risk-centric perspective for spacecraft technology decision-making
NASA Technical Reports Server (NTRS)
Feather, Martin S.; Cornford, Steven L.; Moran, Kelly
2004-01-01
A risk-based decision-making methodology conceived and developed at JPL and NASA has been used to aid in decision making for spacecraft technology assessment, adoption, development and operation. It takes a risk-centric perspective, through which risks are used as a reasoning step to interpose between mission objectives and risk mitigation measures.
Automated Discovery and Modeling of Sequential Patterns Preceding Events of Interest
NASA Technical Reports Server (NTRS)
Rohloff, Kurt
2010-01-01
The integration of emerging data manipulation technologies has enabled a paradigm shift in practitioners' abilities to understand and anticipate events of interest in complex systems. Example events of interest include outbreaks of socio-political violence in nation-states. Rather than relying on human-centric modeling efforts that are limited by the availability of subject-matter experts (SMEs), automated data processing technologies have enabled the development of innovative automated complex system modeling and predictive analysis technologies. We introduce one such emerging modeling technology: the sequential pattern methodology. We have applied the sequential pattern methodology to automatically identify patterns of observed behavior that precede outbreaks of socio-political violence such as riots, rebellions, and coups in nation-states. The sequential pattern methodology is a groundbreaking approach to automated complex system model discovery because it generates easily interpretable patterns, based on direct observations of sampled factor data, that are tolerant of observation noise and missing data and support a deeper understanding of societal behaviors. The discovered patterns are simple to interpret and mimic human identification of observed trends in temporal data. Discovered patterns also provide an automated forecasting ability: we discuss an example of using discovered patterns coupled with a rich data environment to forecast various types of socio-political violence in nation-states.
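The core matching step behind such pattern-based forecasting can be reduced to an ordered-subsequence test: a discovered pattern "fires" when its steps appear in order, with arbitrary gaps, in an observed event stream. The sketch below illustrates only that idea; the event names are invented and the paper's actual discovery algorithm and datasets are not reproduced here.

```python
# Minimal subsequence test: does an observed event stream contain a
# discovered sequential pattern (steps in order, gaps allowed)?
# Event names are illustrative, not drawn from the paper's data.

def contains_pattern(events, pattern):
    """Return True if `pattern` occurs as an ordered subsequence of `events`."""
    it = iter(events)
    # `step in it` advances the iterator past the first match, so later
    # steps can only match later positions in the stream.
    return all(step in it for step in pattern)

stream = ["price_spike", "protest", "curfew", "protest", "riot"]
pattern = ["price_spike", "protest", "riot"]
found = contains_pattern(stream, pattern)
```

Because the iterator is consumed left to right, the same function correctly rejects patterns whose steps occur out of order in the stream.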
Improving automation standards via semantic modelling: Application to ISA88.
Dombayci, Canan; Farreres, Javier; Rodríguez, Horacio; Espuña, Antonio; Graells, Moisès
2017-03-01
Standardization is essential for automation. Extensibility, scalability, and reusability are important features of automation software that rely on the efficient modelling of the addressed systems. The work presented here comes from the ongoing development of a methodology for semi-automatic ontology construction from technical documents. The main aim of this work is to systematically check the consistency of technical documents and support its improvement. The formalization of conceptual models and the subsequent writing of technical standards are analyzed together, and guidelines are proposed for application to future technical standards. Three paradigms for the development of domain ontologies from technical documents are discussed, starting from the current state of the art, continuing with the intermediate method presented and used in this paper, and ending with the paradigm suggested for the future. The ISA88 Standard is taken as a representative case study. Linguistic techniques from the semi-automatic ontology construction methodology are applied to the ISA88 Standard, and different modelling and standardization aspects worth sharing with the automation community are addressed. This study discusses different paradigms for developing and sharing conceptual models for the subsequent development of automation software, along with presenting the systematic consistency checking method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Zein-Sabatto, Saleh; Mikhail, Maged; Bodruzzaman, Mohammad; DeSimio, Martin; Derriso, Mark; Behbahani, Alireza
2012-06-01
It has been widely accepted that data fusion and information fusion methods can improve the accuracy and robustness of decision-making in structural health monitoring systems. It is arguably true, nonetheless, that decision-level fusion is equally beneficial when applied to integrated health monitoring systems. Several decisions at low levels of abstraction may be produced by different decision-makers; however, decision-level fusion is required at the final stage of the process to provide an accurate assessment of the health of the monitored system as a whole. An example of such integrated systems with complex decision-making scenarios is the integrated health monitoring of aircraft. Thorough understanding of the characteristics of decision-fusion methodologies is a crucial step for successful implementation of such decision-fusion systems. In this paper, we first present the major information fusion methodologies reported in the literature, i.e., probabilistic, evidential, and artificial-intelligence-based methods. The theoretical basis and characteristics of these methodologies are explained and their performances analyzed. Second, candidate methods from the above fusion methodologies, i.e., Bayesian, Dempster-Shafer, and fuzzy logic algorithms, are selected and their applications extended to decision fusion. Finally, fusion algorithms are developed based on the selected fusion methods, and their performance is tested on decisions generated from synthetic data and from experimental data. A modeling methodology, the cloud model, for generating synthetic decisions is also presented and used. Using the cloud model, both types of uncertainty involved in real decision-making, randomness and fuzziness, are modeled. Synthetic decisions are generated with an unbiased process and varying interaction complexities among decisions to provide a fair performance comparison of the selected decision-fusion algorithms.
For verification purposes, implementation results of the developed fusion algorithms on structural health monitoring data collected from experimental tests are reported in this paper.
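Of the candidate methods named in the abstract, Dempster-Shafer combination is the most self-contained to illustrate. The sketch below shows Dempster's rule for fusing two decision-level mass functions over a two-hypothesis frame; the frame labels and mass values are invented for illustration and are not the paper's data.

```python
# Sketch of Dempster's rule of combination for two decision-level
# basic mass assignments over the frame {"healthy", "damaged"}.
# Hypotheses are frozensets so that set intersection expresses
# agreement between the two sources.

def combine(m1, m2):
    """Fuse two mass functions (dict: frozenset -> mass) with Dempster's rule."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to contradictory pairs
    if conflict >= 1.0:
        raise ValueError("total conflict; sources cannot be combined")
    # Renormalise by the non-conflicting mass.
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

H, D = frozenset({"healthy"}), frozenset({"damaged"})
theta = H | D  # the whole frame, i.e. ignorance
m1 = {H: 0.6, D: 0.1, theta: 0.3}
m2 = {H: 0.5, D: 0.2, theta: 0.3}
fused = combine(m1, m2)
```

Two weakly agreeing sources reinforce each other: the fused belief in "healthy" exceeds either source's individual mass, which is the behavior decision-level fusion relies on.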
Morales, F; Molina, H; Cruz, N; Valladares, P; Muñoz, J; Ortega, I; Torres, O; Leon, M
1995-01-01
The CLECOS_P system was conceived for registering and automating the processing of clinical evaluations performed on patients with Parkinson's disease who undergo functional neurosurgery and/or neural transplant. CLECOS_P represents the first computerized system able to offer, with high precision and considerable time savings, an integral analysis of the evolutive behavior of the universe of integrated variables in the core assessment program for intracerebral transplantations (CAPIT). CAPIT is used internationally for the evaluation and follow-up of patients with this pathology who have undergone neural transplant. We used the so-called MEDSAC methodology for the preparation of this system. The methodology used for the design of an intelligent system aimed at medical decision-making was based on quantitative analysis of the clinical evolution. At present, 20 patients are controlled by this system: 11 bilaterally transplanted and 9 unilaterally (registered from 3 months before operation up to 1, 2, 3, 6, 9, 12, 18, and 24 months after operation). The application of CLECOS_P to these patients permitted the evaluation of 400 clinical variables, yielding a better evolutive characterization of the patients and thus achieving more favorable results with personalized therapeutic methods aimed at raising their quality of life. CLECOS_P is used in a multi-user environment on a local area network running Novell Netware version 3.11.
Development of an Automated Emergency Response System (AERS) for Rail Transit Systems
DOT National Transportation Integrated Search
1984-10-01
As a result of a fire in 1979 at the Bay Area Rapid Transit District (BART), a microprocessor-based information retrieval system was developed to aid in the emergency decision-making process. This system was proposed, designed and programmed by a sup...
Performance modeling of automated manufacturing systems
NASA Astrophysics Data System (ADS)
Viswanadham, N.; Narahari, Y.
A unified and systematic treatment of modeling methodologies and analysis techniques for the performance evaluation of automated manufacturing systems is presented. The book is the first treatment of the mathematical modeling of manufacturing systems. Automated manufacturing systems are surveyed and three principal analytical modeling paradigms are discussed: Markov chains, queues and queueing networks, and Petri nets.
A multicriteria-based methodology for site prioritisation in sediment management.
Alvarez-Guerra, Manuel; Viguri, Javier R; Voulvoulis, Nikolaos
2009-08-01
Decision-making for sediment management is a complex task that incorporates the selection of areas for remediation and the assessment of options for any mitigation required. The application of Multicriteria Analysis (MCA) to rank different areas according to their need for sediment management provides a great opportunity for prioritisation, a first step in an integrated methodology that ultimately aims to assess and select suitable alternatives for managing the identified priority sites. This paper develops a methodology that starts with the delimitation of management units within areas of study, followed by the application of MCA methods that allow ranking of these management units according to their need for remediation. The proposed process considers not only scientific evidence on sediment quality, but also other relevant aspects such as social and economic criteria associated with such decisions. The methodology is illustrated with its application to the case study area of the Bay of Santander, in northern Spain, highlighting some of the implications of utilising different MCA methods in the process. It also uses site-specific data to assess the subjectivity in the decision-making process, mainly reflected through the assignment of the criteria weights and uncertainties in the criteria scores. Analysis of the sensitivity of the results to these factors is used as a way to assess the stability and robustness of the ranking as a first step of the sediment management decision-making process.
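The simplest MCA method the abstract alludes to is a weighted sum over normalised criterion scores. The sketch below ranks hypothetical management units; the units, criteria, and weights are invented for illustration (the Bay of Santander data and the specific MCA methods compared in the paper are not reproduced), and real outranking methods differ in detail.

```python
# Minimal weighted-sum multicriteria ranking sketch.
# Higher criterion scores indicate worse condition, so the highest
# aggregate score means the highest priority for remediation.

def rank_units(scores, weights):
    """Rank units by weighted sum of normalised criterion scores.

    scores:  {unit: {criterion: value in [0, 1]}}
    weights: {criterion: weight}, summing to 1
    """
    totals = {
        unit: sum(weights[c] * v for c, v in crit.items())
        for unit, crit in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

scores = {
    "unit_A": {"contamination": 0.9, "social": 0.4, "economic": 0.5},
    "unit_B": {"contamination": 0.3, "social": 0.8, "economic": 0.2},
    "unit_C": {"contamination": 0.6, "social": 0.6, "economic": 0.9},
}
weights = {"contamination": 0.5, "social": 0.3, "economic": 0.2}
priority = rank_units(scores, weights)
```

The sensitivity analysis the abstract mentions amounts to perturbing `weights` (and the entries of `scores`) and checking how stable `priority` remains.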
Domestic Abuse and Child Contact: Positioning Children in the Decision-Making Process
ERIC Educational Resources Information Center
Holt, Stephanie
2011-01-01
Drawing on a three-year Irish research study, this paper focuses on the decision-making process in child contact, specifically the assessment and management of risk of continuing abuse to young people previously exposed to domestic abuse. A mixed methodological approach involved the completion of survey questionnaires by 219 mothers and the…
The Impact of Video Gaming on Decision-Making and Teamworking Skills
ERIC Educational Resources Information Center
Campus-Wide Information Systems, 2005
2005-01-01
Purpose: To discuss the considerable impact of video gaming on young players' decision-making and teamworking skills, and the belief that video games provide an invaluable "training camp" for business. Design/methodology/approach: An interview with John Beck, the author of the book Got Game: How a New Generation of Gamers Is Reshaping Business…
ERIC Educational Resources Information Center
Mahony, L.; Lunn, J.; Petriwskyj, A.; Walsh, K.
2015-01-01
In this study, the pedagogical decision-making processes of 21 Australian early childhood teachers working with children experiencing parental separation and divorce were examined. Transcripts from interviews and a focus group with teachers were analysed using grounded theory methodology. The findings showed that as teachers interacted with young…
ERIC Educational Resources Information Center
Condon, John T.; Corkindale, Carolyn J.; Russell, Alan; Quinlivan, Julie A.
2006-01-01
This research examined adolescent males' decision-making when confronted with a hypothetical unplanned pregnancy in a sexual partner. An innovative methodology, involving a computerized simulation game was utilized with 386 Australian males (mean age of 15 years). Data were gathered from responses made during the simulation, and questionnaires…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ulsh, M.; Wheeler, D.; Protopappas, P.
The U.S. Department of Energy (DOE) is interested in supporting manufacturing research and development (R&D) for fuel cell systems in the 10-1,000 kilowatt (kW) power range relevant to stationary and distributed combined heat and power applications, with the intent to reduce manufacturing costs and increase production throughput. To assist in future decision-making, DOE requested that the National Renewable Energy Laboratory (NREL) provide a baseline understanding of the current levels of adoption of automation in manufacturing processes and flow, as well as of continuous processes. NREL identified and visited or interviewed key manufacturers, universities, and laboratories relevant to the study using a standard questionnaire. The questionnaire covered the current level of vertical integration, the importance of quality control developments for automation, the current level of automation and source of automation design, critical balance of plant issues, potential for continuous cell manufacturing, key manufacturing steps or processes that would benefit from DOE support for manufacturing R&D, the potential for cell or stack design changes to support automation, and the relationship between production volume and decisions on automation.
A hierarchical-multiobjective framework for risk management
NASA Technical Reports Server (NTRS)
Haimes, Yacov Y.; Li, Duan
1991-01-01
A broad hierarchical-multiobjective framework is established and utilized to methodologically address the management of risk. United into the framework are the hierarchical character of decision-making, the multiple decision-makers at separate levels within the hierarchy, the multiobjective character of large-scale systems, the quantitative/empirical aspects, and the qualitative/normative/judgmental aspects. The methodological components essentially consist of hierarchical-multiobjective coordination, risk of extreme events, and impact analysis. Examples of applications of the framework are presented. It is concluded that complex and interrelated forces require an analysis of trade-offs between engineering analysis and societal preferences, as in the hierarchical-multiobjective framework, to successfully address inherent risk.
Bridging the Gap between Human Judgment and Automated Reasoning in Predictive Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Riensche, Roderick M.; Unwin, Stephen D.
2010-06-07
Events occur daily that impact the health, security and sustainable growth of our society. If we are to address the challenges that emerge from these events, anticipatory reasoning has to become an everyday activity. Strong advances have been made in using integrated modeling for analysis and decision making. However, a wider impact of predictive analytics is currently hindered by the lack of systematic methods for integrating predictive inferences from computer models with human judgment. In this paper, we present a predictive analytics approach that supports anticipatory analysis and decision-making through a concerted reasoning effort that interleaves human judgment and automated inferences. We describe a systematic methodology for integrating modeling algorithms within a serious gaming environment in which role-playing by human agents provides updates to model nodes and the ensuing model outcomes in turn influence the behavior of the human players. The approach ensures a strong functional partnership between human players and computer models while maintaining a high degree of independence and greatly facilitating the connection between model and game structures.
An Ensemble-Based Forecasting Framework to Optimize Reservoir Releases
NASA Astrophysics Data System (ADS)
Ramaswamy, V.; Saleh, F.
2017-12-01
The increasing frequency of extreme precipitation events is stressing the need to manage water resources on shorter timescales. Short-term management of water resources becomes proactive when inflow forecasts are available and this information can be effectively used in the control strategy. This work investigates the utility of short-term hydrological ensemble forecasts for operational decision-making during extreme weather events. An advanced automated hydrologic prediction framework integrating a regional-scale hydrologic model, GIS datasets and the meteorological ensemble predictions from the European Centre for Medium-Range Weather Forecasts (ECMWF) was coupled to an implicit multi-objective dynamic programming model to optimize releases from a water supply reservoir. The proposed methodology was evaluated by retrospectively forecasting the inflows to the Oradell reservoir in the Hackensack River basin in New Jersey during an extreme hydrologic event, Hurricane Irene. Additionally, the flexibility of the forecasting framework was investigated by forecasting the inflows from a moderate rainfall event to provide important perspectives on using the framework to assist reservoir operations during moderate events. The proposed forecasting framework seeks to provide a flexible, assistive tool to alleviate the complexity of operational decision-making.
González-Vidal, Juan José; Pérez-Pueyo, Rosanna; Soneira, María José; Ruiz-Moreno, Sergio
2015-03-01
A new method has been developed to automatically identify Raman spectra, whether they correspond to single- or multicomponent spectra. The method requires no user input or judgment. There are thus no parameters to be tweaked. Furthermore, it provides a reliability factor on the resulting identification, with the aim of becoming a useful support tool for the analyst in the decision-making process. The method relies on the multivariate techniques of principal component analysis (PCA) and independent component analysis (ICA), and on some metrics. It has been developed for the application of automated spectral analysis, where the analyzed spectrum is provided by a spectrometer that has no previous knowledge of the analyzed sample, meaning that the number of components in the sample is unknown. We describe the details of this method and demonstrate its efficiency by identifying both simulated spectra and real spectra. The method has been applied to artistic pigment identification. The reliable and consistent results that were obtained make the methodology a helpful tool suitable for the identification of pigments in artwork or in paint in general.
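At its core, automated spectral identification scores a measured spectrum against a reference library and reports both a best match and a reliability figure. The sketch below uses plain cosine similarity on synthetic Gaussian "spectra"; the paper's actual PCA/ICA pipeline is considerably more elaborate, and the pigment names and spectra here are invented.

```python
import numpy as np

# Toy identification by cosine similarity against a reference library,
# in the spirit of automated spectral matching. Spectra are synthetic
# Gaussians, not real Raman data.

def gaussian(x, mu, sigma=5.0):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

x = np.linspace(0, 200, 400)
library = {
    "pigment_A": gaussian(x, 60),
    "pigment_B": gaussian(x, 140),
}

def identify(spectrum, library):
    """Return the best-matching reference name and a [0, 1] reliability score."""
    def cosine(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))
    scores = {name: cosine(spectrum, ref) for name, ref in library.items()}
    best = max(scores, key=scores.get)
    return best, scores[best]

# A noisy measurement of pigment_A (fixed seed for reproducibility).
measured = gaussian(x, 60) + 0.05 * np.random.default_rng(0).normal(size=x.size)
name, reliability = identify(measured, library)
```

The reliability score plays the role the abstract describes: a low value flags that no library entry explains the measurement well, so the analyst should inspect the result rather than accept it blindly.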
An Analysis on a Negotiation Model Based on Multiagent Systems with Symbiotic Learning and Evolution
NASA Astrophysics Data System (ADS)
Hossain, Md. Tofazzal
This study presents an evolutionary analysis of a negotiation model based on Masbiole (Multiagent Systems with Symbiotic Learning and Evolution), which has been proposed as a new methodology for Multiagent Systems (MAS) based on symbiosis in the ecosystem. In Masbiole, agents evolve in consideration of not only their own benefits and losses, but also the benefits and losses of opponent agents. To aid effective application of Masbiole, we develop a competitive negotiation model in which rigorous and advanced intelligent decision-making mechanisms are required for agents to achieve solutions. A Negotiation Protocol is devised with the aim of developing a set of rules for agents' behavior during evolution. Simulations use a newly developed evolutionary computing technique, called Genetic Network Programming (GNP), whose directed graph-type gene structure can develop and design the required intelligent mechanisms for agents. In a typical scenario, competitive negotiation solutions are reached by concessions that are usually predetermined in conventional MAS. In this model, however, not only is concession determined automatically by symbiotic evolution (making the system intelligent, automated, and efficient), but the solution also achieves Pareto optimality automatically.
Probabilistic Flood Maps to support decision-making: Mapping the Value of Information
NASA Astrophysics Data System (ADS)
Alfonso, L.; Mukolwe, M. M.; Di Baldassarre, G.
2016-02-01
Floods are one of the most frequent and disruptive natural hazards that affect humankind. Annually, significant flood damage is documented worldwide. Flood mapping is a common pre-impact flood hazard mitigation measure, for which advanced methods and tools (such as flood inundation models) are used to estimate potential flood extent maps that are used in spatial planning. However, these tools are affected, largely to an unknown degree, by both epistemic and aleatory uncertainty. Over the past few years, advances in uncertainty analysis with respect to flood inundation modeling show that it is appropriate to adopt Probabilistic Flood Maps (PFM) to account for uncertainty. However, the following question arises: how can probabilistic flood hazard information be incorporated into spatial planning? A consistent framework to incorporate PFMs into decision-making is therefore required. In this paper, a novel methodology based on Decision-Making under Uncertainty theories, in particular Value of Information (VOI), is proposed. Specifically, the methodology entails the use of a PFM to generate a VOI map, which highlights floodplain locations where additional information is valuable with respect to available floodplain management actions and their potential consequences. The methodology is illustrated with a simplified example and also applied to a real case study in the South of France, where a VOI map is analyzed on the basis of historical land use change decisions over a period of 26 years. Results show that uncertain flood hazard information encapsulated in PFMs can aid decision-making in floodplain planning.
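For a single floodplain cell, VOI can be illustrated as the gap between the expected loss of the best action chosen under the prior flood probability and the expected loss achievable if the uncertainty were resolved before deciding. The probabilities, actions, and loss figures below are invented for illustration; the paper's actual loss model and PFM-derived probabilities are not reproduced.

```python
# Toy value-of-perfect-information calculation for one floodplain cell.
# Two hypothetical actions: develop the land or restrict its use.
# Losses are signed (negative = net benefit); all numbers are invented.

def expected_loss(action, p_flood, losses):
    l = losses[action]
    return p_flood * l["flood"] + (1 - p_flood) * l["dry"]

losses = {
    "develop":  {"flood": 100.0, "dry": -20.0},  # damage if flooded, profit if dry
    "restrict": {"flood": 0.0,   "dry": 5.0},    # forgone use either way
}
p = 0.3  # prior flood probability, e.g. read from a probabilistic flood map

# Without further information: commit to the action minimising expected loss.
prior_loss = min(expected_loss(a, p, losses) for a in losses)

# With perfect information the action is chosen after the outcome is known,
# so the best action is picked separately for each outcome.
posterior_loss = p * min(l["flood"] for l in losses.values()) \
               + (1 - p) * min(l["dry"] for l in losses.values())

voi = prior_loss - posterior_loss  # value of resolving the uncertainty here
```

Mapping `voi` over every cell yields exactly the kind of VOI map the abstract describes: cells with high values are where gathering additional flood information changes the preferred action most.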
Electronic Design Automation: Integrating the Design and Manufacturing Functions
NASA Technical Reports Server (NTRS)
Bachnak, Rafic; Salkowski, Charles
1997-01-01
As the complexity of electronic systems grows, the traditional design practice, a sequential process, is replaced by concurrent design methodologies. A major advantage of concurrent design is that the feedback from software and manufacturing engineers can be easily incorporated into the design. The implementation of concurrent engineering methodologies is greatly facilitated by employing the latest Electronic Design Automation (EDA) tools. These tools offer integrated simulation of the electrical, mechanical, and manufacturing functions and support virtual prototyping, rapid prototyping, and hardware-software co-design. This report presents recommendations for enhancing the electronic design and manufacturing capabilities and procedures at JSC based on a concurrent design methodology that employs EDA tools.
Development of a First-of-a-Kind Deterministic Decision-Making Tool for Supervisory Control System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Sacit M; Kisner, Roger A; Muhlheim, Michael David
2015-07-01
Decision-making is the process of identifying and choosing alternatives, where each alternative offers a different approach or path to move from a given state or condition to a desired one. The generation of consistent decisions requires that a structured, coherent process be defined, immediately leading to a decision-making framework. The overall objective of the generalized framework is for it to be adopted into an autonomous decision-making framework and tailored to specific requirements for various applications. In this context, automation is the use of computing resources to make decisions and implement a structured decision-making process with limited or no human intervention. The overriding goal of automation is to replace or supplement human decision makers with reconfigurable decision-making modules that can perform a given set of tasks reliably. Risk-informed decision-making requires a probabilistic assessment of the likelihood of success given the status of plant/system and component health, and a deterministic assessment of the relationship between plant operating parameters and reactor protection parameters to prevent unnecessary trips and challenges to plant safety systems. The implementation of the probabilistic portion of the decision-making engine of the proposed supervisory control system was detailed in previous milestone reports. Once the control options are identified and ranked based on the likelihood of success, the supervisory control system transmits the options to the deterministic portion of the platform. The deterministic multi-attribute decision-making framework uses variable sensor data (e.g., outlet temperature) and calculates where the plant is within the challenge state, its trajectory, and its margin within the controllable domain, using utility functions to evaluate the current and projected plant state space for different control decisions. Metrics to be evaluated include stability, cost, time to complete (action), power level, etc.
The integration of deterministic calculations using multi-physics analyses (i.e., neutronics, thermal, and thermal-hydraulics) and probabilistic safety calculations allows for the examination and quantification of margin recovery strategies. It also validates the control options identified from the probabilistic assessment; the thermal-hydraulics analyses serve this validation role. Future work includes evaluating other possible metrics and computational efficiencies.
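A multi-attribute evaluation of the kind the abstract describes can be sketched as a weighted additive utility over candidate control options. The attributes (stability, margin, cost, time) follow the metrics listed above, but the option names, weights, and utility values below are invented; the report's actual utility functions and plant data are not reproduced.

```python
# Hedged sketch of weighted additive multi-attribute utility scoring
# over candidate control options. Each attribute is assumed already
# scaled to [0, 1] (1 = best); all numbers are illustrative.

def utility(option, weights):
    """Weighted sum of attribute utilities for one control option."""
    return sum(weights[k] * option[k] for k in weights)

weights = {"stability": 0.4, "margin": 0.3, "cost": 0.2, "time": 0.1}
options = {
    "reduce_power": {"stability": 0.9, "margin": 0.8, "cost": 0.4, "time": 0.6},
    "trip_reactor": {"stability": 1.0, "margin": 1.0, "cost": 0.0, "time": 0.2},
    "hold_steady":  {"stability": 0.5, "margin": 0.3, "cost": 0.9, "time": 1.0},
}
best = max(options, key=lambda o: utility(options[o], weights))
```

Note how the weighting encodes the framework's stated goal: tripping the reactor maximises stability and margin but scores poorly on cost, so a less drastic option that preserves margin can rank higher, avoiding unnecessary trips.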
A Validity-Based Approach to Quality Control and Assurance of Automated Scoring
ERIC Educational Resources Information Center
Bejar, Isaac I.
2011-01-01
Automated scoring of constructed responses is already operational in several testing programmes. However, as the methodology matures and the demand for the utilisation of constructed responses increases, the volume of automated scoring is likely to increase at a fast pace. Quality assurance and control of the scoring process will likely be more…
Extending the Instructional Systems Development Methodology.
ERIC Educational Resources Information Center
O'Neill, Colin E.
1993-01-01
Describes ways that components of Information Engineering (IE) methodology can be used by training system developers to extend Instructional Systems Development (ISD) methodology. Aspects of IE that are useful in ISD are described, including requirements determination, group facilitation, integrated automated tool support, and prototyping.…
Automated control of hierarchical systems using value-driven methods
NASA Technical Reports Server (NTRS)
Pugh, George E.; Burke, Thomas E.
1990-01-01
An introduction is given to the value-driven methodology, which has been successfully applied to solve a variety of difficult decision, control, and optimization problems. Many real-world decision processes (e.g., those encountered in scheduling, allocation, and command and control) involve a hierarchy of complex planning considerations. For such problems it is virtually impossible to define a fixed set of rules that will operate satisfactorily over the full range of probable contingencies. Decision Science Applications' value-driven methodology offers a systematic way of automating the intuitive, common-sense approach used by human planners. The inherent responsiveness of value-driven systems to user-controlled priorities makes them particularly suitable for semi-automated applications in which the user must remain in command of the system's operation. Three examples of the practical application of the approach in the automation of hierarchical decision processes are discussed: the TAC Brawler air-to-air combat simulation is a four-level computerized hierarchy; the autonomous underwater vehicle mission planning system is a three-level control system; and the Space Station Freedom electrical power control and scheduling system is designed as a two-level hierarchy. The methodology is compared with rule-based systems and with other more widely known optimization techniques.
Human/Automation Trade Methodology for the Moon, Mars and Beyond
NASA Technical Reports Server (NTRS)
Korsmeyer, David J.
2009-01-01
It is possible to create a consistent trade methodology that can characterize operations model alternatives for crewed exploration missions. For example, a trade-space organized around the objective of maximizing Crew Exploration Vehicle (CEV) independence would take as input a classification of the category of analysis to be conducted or decision to be made, together with a commitment to a specific point in a mission profile at which that analysis or decision occurs. For example, does the decision have to do with crew activity planning, or life support? Is the mission phase trans-Earth injection, cruise, or lunar descent? Different kinds of decision analysis of the trade-space between human and automated decisions will occur at different points in a mission's profile. The objectives at a given point in time during a mission will call for different kinds of response with respect to where and how computers and automation are expected to help provide an accurate, safe, and timely response. In this paper, a consistent methodology for assessing the trades between human and automated decisions on board is presented and various examples are discussed.
j5 DNA assembly design automation.
Hillson, Nathan J
2014-01-01
Modern standardized methodologies, described in detail in the previous chapters of this book, have enabled the software-automated design of optimized DNA construction protocols. This chapter describes how to design (combinatorial) scar-less DNA assembly protocols using the web-based software j5. j5 assists biomedical and biotechnological researchers in constructing DNA by automating the design of optimized protocols for flanking homology sequence as well as type IIS endonuclease-mediated DNA assembly methodologies. Unlike any other software tool available today, j5 designs scar-less combinatorial DNA assembly protocols, performs a cost-benefit analysis to identify which portions of an assembly process would be less expensive to outsource to a DNA synthesis service provider, and designs hierarchical DNA assembly strategies to mitigate anticipated poor assembly junction sequence performance. Software tools integrated with j5 add significant value to the j5 design process through graphical user-interface enhancement and downstream liquid-handling robotic laboratory automation.
ERIC Educational Resources Information Center
Watkins, Arthur Noel
The purpose of this study was to identify and describe the decision-making processes in senior high schools that were implementing programs of individualized schooling. Field methodology, including interviews, observations, and analysis of documents, was used to gather data in six senior high schools of varying size located throughout the country,…
ERIC Educational Resources Information Center
Trimmer, Karen
2016-01-01
This paper investigates reasoned risk-taking in decision-making by school principals using a methodology that combines sequential use of psychometric and traditional measurement techniques. Risk-taking is defined as when decisions are made that are not compliant with the regulatory framework, the primary governance mechanism for public schools in…
Strategic Decision-Making Learning from Label Distributions: An Approach for Facial Age Estimation.
Zhao, Wei; Wang, Han
2016-06-28
Nowadays, label distribution learning is among the state-of-the-art methodologies in facial age estimation. It takes the age of each facial image instance as a label distribution over a series of age labels rather than the single chronological age label that is commonly used. However, this methodology is deficient in its simple decision-making criterion: the final predicted age is selected only as the label with the maximum description degree. In many cases, different age labels may have very similar description degrees. Consequently, blindly deciding the estimated age by virtue of the highest description degree would miss or neglect other valuable age labels that may contribute substantially to the final predicted age. In this paper, we propose a strategic decision-making label distribution learning algorithm (SDM-LDL) with a series of strategies specialized for different types of age label distribution. Experimental results from the most popular aging face database, FG-NET, show the superiority and validity of all the proposed strategic decision-making learning algorithms over the existing label distribution learning and other single-label learning algorithms for facial age estimation. The inner properties of SDM-LDL are further explored with more advantages.
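The deficiency the abstract describes is easy to see numerically: when two adjacent age labels have near-equal description degrees, the argmax rule discards one of them entirely. The sketch below contrasts that rule with one simple alternative, taking the expectation over the label distribution; the distribution is invented, and SDM-LDL's actual strategies are richer than this.

```python
# Contrast the argmax decision rule (critiqued in the paper) with a
# simple expectation over the label distribution. Ages and description
# degrees below are illustrative, not FG-NET data.

ages = list(range(20, 26))
# Description degrees for one face: two near-equal peaks at 22 and 23.
dist = [0.05, 0.10, 0.30, 0.29, 0.16, 0.10]

# Argmax rule: picks 22 and ignores the almost-equally-likely 23.
argmax_age = ages[dist.index(max(dist))]

# Expectation rule: blends the contribution of every age label.
expected_age = sum(a * d for a, d in zip(ages, dist))
```

The expectation lands between the two peaks rather than collapsing onto one of them, which is one way the "other valuable age labels" can contribute to the final prediction.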
NASA Technical Reports Server (NTRS)
1997-01-01
Patterned after the Cassini Resource Exchange (CRE), Sholtz and Associates established the Automated Credit Exchange (ACE), an Internet-based concept that automates the auctioning of "pollution credits" in Southern California. An early challenge of the Jet Propulsion Laboratory's Cassini mission was allocating the spacecraft's resources. To support the decision-making process, the CRE was developed. The system removes the need for the science instrument manager to know the individual instruments' requirements for the spacecraft resources. Instead, by utilizing principles of exchange, the CRE induces the instrument teams to reveal their requirements. In doing so, they arrive at an efficient allocation of spacecraft resources by trading among themselves. A Southern California RECLAIM air pollution credit trading market has been set up using the same bartering methods utilized in the Cassini mission in order to help companies keep pollution and costs down.
Recent advances in automated protein design and its future challenges.
Setiawan, Dani; Brender, Jeffrey; Zhang, Yang
2018-04-25
Protein function is determined by protein structure which is in turn determined by the corresponding protein sequence. If the rules that cause a protein to adopt a particular structure are understood, it should be possible to refine or even redefine the function of a protein by working backwards from the desired structure to the sequence. Automated protein design attempts to calculate the effects of mutations computationally with the goal of more radical or complex transformations than are accessible by experimental techniques. Areas covered: The authors give a brief overview of the recent methodological advances in computer-aided protein design, showing how methodological choices affect final design and how automated protein design can be used to address problems considered beyond traditional protein engineering, including the creation of novel protein scaffolds for drug development. Also, the authors address specifically the future challenges in the development of automated protein design. Expert opinion: Automated protein design holds potential as a protein engineering technique, particularly in cases where screening by combinatorial mutagenesis is problematic. Considering solubility and immunogenicity issues, automated protein design is initially more likely to make an impact as a research tool for exploring basic biology in drug discovery than in the design of protein biologics.
ERIC Educational Resources Information Center
Collins, Michael J.; Vitz, Ed
1988-01-01
Examines two computer interfaced lab experiments: 1) discusses the automation of a Perkin Elmer 337 infrared spectrophotometer noting the mechanical and electronic changes needed; 2) uses the Gouy method and Lotus Measure software to automate magnetic susceptibility determinations. Methodology is described. (MVL)
Envisioning and evaluating future scenarios has emerged as a critical component of both science and social decision-making. The ability to assess, report, map, and forecast the life support functions of ecosystems is absolutely critical to our capacity to make informed decisions...
Cooperative Game Theoretic Models for Decision-Making in Contexts of Library Cooperation.
ERIC Educational Resources Information Center
Hayes, Robert M.
2003-01-01
Presents a brief summary of Cooperative Economic Game Theory, followed by a summary of specific measures identified by Nash, Shapley, and Harsanyi. Reviews contexts in which negotiation and cooperation among libraries are of special economic importance, and for two of these contexts-cooperative acquisitions and cooperative automation-illustrates…
Collaborative Strategic Decision Making in School Districts
ERIC Educational Resources Information Center
Brazer, S. David; Rich, William; Ross, Susan A.
2010-01-01
Purpose: The dual purpose of this paper is to determine how superintendents in US school districts work with stakeholders in the decision-making process and to learn how different choices superintendents make affect decision outcomes. Design/methodology/approach: This multiple case study of three school districts employs qualitative methodology to…
NASA Astrophysics Data System (ADS)
Mozgovoy, Dmitry k.; Hnatushenko, Volodymyr V.; Vasyliev, Volodymyr V.
2018-04-01
Vegetation and water bodies are fundamental elements of urban ecosystems, and water mapping is critical for urban and landscape planning and management. A methodology for the automated recognition of vegetation and water bodies on the territory of megacities in satellite images of sub-meter spatial resolution in the visible and IR bands is proposed. By processing multispectral images from the SuperView-1A satellite, vector layers of recognized plant and water objects were obtained. Analysis of the image processing results showed a sufficiently high accuracy in delineating the boundaries of recognized objects and a good separation of classes. The developed methodology provides a significant increase in the efficiency and reliability of updating maps of large cities while reducing financial costs. Due to its high degree of automation, the proposed methodology can be implemented as a geo-information web service functioning in the interests of a wide range of public services and commercial institutions.
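The kind of per-pixel spectral classification this abstract describes can be sketched with standard vegetation and water indices. The thresholds and reflectance values below are generic rules of thumb, not the authors' tuned method:

```python
# Illustrative sketch (not the paper's algorithm): classifying pixels as
# vegetation or water from multispectral bands using the standard NDVI and
# NDWI indices. Thresholds are common rules of thumb, not calibrated values.
def classify_pixel(green, red, nir):
    ndvi = (nir - red) / (nir + red)      # normalized difference vegetation index
    ndwi = (green - nir) / (green + nir)  # normalized difference water index
    if ndwi > 0.0:
        return "water"
    if ndvi > 0.3:
        return "vegetation"
    return "other"

print(classify_pixel(green=0.08, red=0.06, nir=0.45))  # vegetation
print(classify_pixel(green=0.10, red=0.05, nir=0.03))  # water
```

A production pipeline would apply such a rule (or a trained classifier) over whole image arrays and then vectorize the resulting masks into the object layers the abstract mentions.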
Space Station man-machine automation trade-off analysis
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.; Bard, J.; Feinberg, A.
1985-01-01
The man-machine automation tradeoff methodology presented here is one of four research tasks comprising the Autonomous Spacecraft System Technology (ASST) project. ASST was established to identify and study system-level design problems for autonomous spacecraft. Using the Space Station as an example of a spacecraft system requiring a certain level of autonomous control, a system-level man-machine automation tradeoff methodology is presented that: (1) optimizes man-machine mixes for different ground and on-orbit crew functions subject to cost, safety, weight, power, and reliability constraints; and (2) plots the best incorporation plan for new, emerging technologies by weighing cost, relative availability, reliability, safety, importance to out-year missions, and ease of retrofit. Although the methodology takes a fairly straightforward approach to valuing human productivity, it is still sensitive to the important subtleties associated with designing a well-integrated man-machine system. These subtleties include considerations such as crew preference to retain certain spacecraft control functions, or valuing human integration/decision capabilities over equivalent hardware/software where appropriate.
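The constrained trade-off in item (1) can be illustrated with a toy search: choose which crew functions to automate so as to maximize a productivity gain subject to cost and power budgets. All function names, gains, and budgets below are invented, and the actual ASST methodology is far richer than this brute-force sketch:

```python
# Toy constrained man-machine allocation: brute-force over which of three
# crew functions to automate, maximizing productivity gain within budgets.
# All numbers and names are invented for illustration.
from itertools import product

functions = {  # name: (productivity gain if automated, cost, power)
    "navigation":   (5.0, 3.0, 2.0),
    "housekeeping": (3.0, 1.0, 1.0),
    "docking":      (4.0, 2.5, 2.5),
}
COST_BUDGET, POWER_BUDGET = 4.0, 3.5

best = None
for choice in product([0, 1], repeat=len(functions)):  # 1 = automate
    gain = sum(g * c for (g, _, _), c in zip(functions.values(), choice))
    cost = sum(k * c for (_, k, _), c in zip(functions.values(), choice))
    power = sum(p * c for (_, _, p), c in zip(functions.values(), choice))
    if (cost <= COST_BUDGET and power <= POWER_BUDGET
            and (best is None or gain > best[0])):
        best = (gain, choice)

print(best)  # best feasible mix: automate navigation and housekeeping
```

With these invented numbers the feasible optimum automates navigation and housekeeping but leaves docking to the crew, illustrating how budget constraints, rather than raw benefit, can decide the mix.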
Programming methodology for a general purpose automation controller
NASA Technical Reports Server (NTRS)
Sturzenbecker, M. C.; Korein, J. U.; Taylor, R. H.
1987-01-01
The General Purpose Automation Controller is a multi-processor architecture for automation programming. A methodology has been developed whose aim is to simplify the task of programming distributed real-time systems for users in research or manufacturing. Programs are built by configuring function blocks (low-level computations) into processes using data flow principles. These processes are activated through the verb mechanism. Verbs are divided into two classes: those which support devices, such as robot joint servos, and those which perform actions on devices, such as motion control. This programming methodology was developed in order to achieve the following goals: (1) specifications for real-time programs which are to a high degree independent of hardware considerations such as processor, bus, and interconnect technology; (2) a component approach to software, so that software required to support new devices and technologies can be integrated by reconfiguring existing building blocks; (3) resistance to error and ease of debugging; and (4) a powerful command language interface.
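The configuration model described above can be sketched in miniature: function blocks (low-level computations) are wired into a process by data-flow connections, and each block fires once its inputs are available. The block names and the servo-like example below are invented for illustration and are not the GPAC's actual primitives:

```python
# Minimal data-flow sketch: function blocks wired into a process.
# Blocks and names are invented; the GPAC's real verb/device mechanism
# is considerably richer.
class FunctionBlock:
    def __init__(self, name, fn, inputs):
        self.name, self.fn, self.inputs = name, fn, inputs

def run_process(blocks, initial):
    """Evaluate blocks in topological order, threading values by name."""
    values = dict(initial)
    for b in blocks:
        values[b.name] = b.fn(*(values[i] for i in b.inputs))
    return values

blocks = [
    FunctionBlock("error", lambda sp, pos: sp - pos, ["setpoint", "position"]),
    FunctionBlock("command", lambda e: 0.5 * e, ["error"]),  # P-gain of 0.5
]
out = run_process(blocks, {"setpoint": 10.0, "position": 6.0})
print(out["command"])  # 2.0
```

Reconfiguring the process means rewiring or swapping blocks rather than rewriting code, which mirrors the component-approach goal (2) in the abstract.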
Intelligent Automation Approach for Improving Pilot Situational Awareness
NASA Technical Reports Server (NTRS)
Spirkovska, Lilly
2004-01-01
Automation in the aviation domain has been increasing for the past two decades. Pilot reaction to automation varies from highly favorable to highly critical depending on both the pilot's background and how effectively the automation is implemented. We describe a user-centered approach for automation that considers the pilot's tasks and his needs related to accomplishing those tasks. Further, we augment rather than replace how the pilot currently fulfills his goals, relying on redundant displays that offer the pilot an opportunity to build trust in the automation. Our prototype system automates the interpretation of hydraulic system faults of the UH-60 helicopter. We describe the problem with the current system and our methodology for resolving it.
Automated Blazar Light Curves Using Machine Learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Spencer James
2017-07-27
This presentation describes a problem and methodology pertaining to automated blazar light curves. Namely, studying the optical variability patterns of blazars requires the construction of light curves, and to generate these light curves the data must be filtered before processing to ensure quality.
Pistorio, Salvatore G; Nigudkar, Swati S; Stine, Keith J; Demchenko, Alexei V
2016-10-07
The development of a useful methodology for simple, scalable, and transformative automation of oligosaccharide synthesis that easily interfaces with existing methods is reported. The automated synthesis can now be performed using accessible equipment where the reactants and reagents are delivered by the pump or the autosampler and the reactions can be monitored by the UV detector. The HPLC-based platform for automation is easy to setup and adapt to different systems and targets.
Adolescent Sexual Decision-Making: An Integrative Review.
Hulton, Linda J.
2001-10-03
PURPOSE: The purpose of this integrative review was to summarize the present literature to identify factors associated with adolescent sexual decision-making. Thirty-eight salient research studies were selected as the basis of this review from the databases of Medline, CINAHL, and PsycINFO using the Cooper methodology. CONCLUSIONS: Two categories of decision-making were identified: 1) The research on factors related to the decisions that adolescents make to become sexually active or to abstain from sexual activity; 2) The research on factors related to contraceptive decision-making. The most consistent findings were that the factors of gender differences, cognitive development, perception of benefits, parental influences, social influences, and sexual knowledge were important variables in the decision-making processes of adolescents. IMPLICATIONS: Practice implications for nursing suggest that clinicians should assess adolescent sexual decision-making in greater detail and address the social and psychological context in which sexual experiences occur. Nurses must be aware of the differences between adolescent and adult decision-making processes and incorporate knowledge of growth and development into intervention strategies. Moreover, to the degree that adolescent sexual decision-making proves to be less than rational, interventions designed to improve competent sexual decision-making are needed.
Workload-Based Automated Interface Mode Selection
2012-03-22
Armbruster, David A; Overcash, David R; Reyes, Jaime
2014-01-01
The era of automation arrived with the introduction of the AutoAnalyzer using continuous flow analysis and the Robot Chemist that automated the traditional manual analytical steps. Successive generations of stand-alone analysers increased analytical speed, offered the ability to test high volumes of patient specimens, and provided large assay menus. A dichotomy developed, with a group of analysers devoted to performing routine clinical chemistry tests and another group dedicated to performing immunoassays using a variety of methodologies. Development of integrated systems greatly improved the analytical phase of clinical laboratory testing and further automation was developed for pre-analytical procedures, such as sample identification, sorting, and centrifugation, and post-analytical procedures, such as specimen storage and archiving. All phases of testing were ultimately combined in total laboratory automation (TLA) through which all modules involved are physically linked by some kind of track system, moving samples through the process from beginning to end. A newer and very powerful analytical methodology is liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS). LC-MS/MS has been automated but a future automation challenge will be to incorporate LC-MS/MS into TLA configurations. Another important facet of automation is informatics, including middleware, which interfaces the analyser software to a laboratory information system (LIS) and/or hospital information system (HIS). This software includes control of the overall operation of a TLA configuration and combines analytical results with patient demographic information to provide additional clinically useful information. This review describes automation relevant to clinical chemistry, but it must be recognised that automation applies to other specialties in the laboratory, e.g. haematology, urinalysis, microbiology.
It is a given that automation will continue to evolve in the clinical laboratory, limited only by the imagination and ingenuity of laboratory scientists.
Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving
Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice
2016-01-01
The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture.
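One generic fusion step consistent with the redundancy motivation above is inverse-variance weighting of independent sensor estimates of the same quantity. This is a textbook technique sketched under invented numbers, not the paper's architecture:

```python
# Hedged sketch: inverse-variance weighting for fusing redundant sensor
# estimates of one quantity (e.g. range to a leading vehicle, in metres).
# A generic textbook step, not the paper's fusion architecture.
def fuse(estimates):
    """estimates: list of (value, variance) pairs from independent sensors."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    variance = 1.0 / total  # fused estimate is tighter than either input
    return value, variance

# Radar (accurate) and camera (noisier) both measure the same range.
fused_value, fused_var = fuse([(25.0, 0.25), (26.0, 1.0)])
print(round(fused_value, 2))  # 25.2
```

The fused value sits nearer the more precise sensor, and the fused variance (0.2 here) is smaller than either input variance, which is exactly why redundant sensors improve the environment model.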
ERIC Educational Resources Information Center
Greenbank, Paul; Hepworth, Sue
2008-01-01
Purpose: This paper aims to examine the extent to which economic factors influence the career decision-making process of working class students. Design/methodology/approach: The study involved an initial survey of 165 final-year students from a range of degree programmes. It was followed by in-depth interviews with 30 working class students.…
ERIC Educational Resources Information Center
Clemens, Rachael Annette
2017-01-01
This qualitative and interpretive inquiry explores the information behavior of birthmothers surrounding the processes of decision-making, coping, and living with the act of child relinquishment to adoption. An interpretative phenomenological analysis methodology is used to reveal the phenomenon as experienced by eight birthmothers, women who…
E Pluribus Analysis: Applying a Superforecasting Methodology to the Detection of Homegrown Violence
2018-03-01
actor violence and a set of predefined decision-making protocols. This research included running four simulations using the Monte Carlo technique, together with a "runs test" to determine whether a temporal pattern exists in lone-actor violence.
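The "runs test" this record mentions can be sketched generically: the Wald-Wolfowitz runs test checks whether a binary sequence of events (for example, periods with and without an incident) departs from randomness. The data below are invented:

```python
# Generic Wald-Wolfowitz runs test (large-sample z approximation), not the
# thesis's exact procedure. A "run" is a maximal block of identical symbols.
import math

def runs_test_z(seq):
    n1, n2 = seq.count(1), seq.count(0)
    runs = 1 + sum(1 for a, b in zip(seq, seq[1:]) if a != b)
    mean = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)) / (
        (n1 + n2) ** 2 * (n1 + n2 - 1))
    return (runs - mean) / math.sqrt(var)

# A perfectly alternating sequence has far too many runs to be random.
z = runs_test_z([1, 0, 1, 0, 1, 0, 1, 0, 1, 0])
print(abs(z) > 1.96)  # True: rejected at the 5% level
```

Too few runs (long clusters) would instead suggest temporal clustering of incidents, which is the pattern such an analysis would be looking for.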
ERIC Educational Resources Information Center
Amzat, Ismail Hussein; Idris, Datuk Abdul Rahman
2012-01-01
Purpose: The purpose of this paper is to discuss the effect of management and decision-making styles on the job satisfaction of academic staff in a Malaysian Research University. Design/methodology/approach: The sample consisted of 218 respondents. The instruments used in the study were the Teacher Job Satisfaction Questionnaire and the Decision…
Haby, Michelle M; Chapman, Evelina; Clark, Rachel; Barreto, Jorge; Reveiz, Ludovic; Lavis, John N
2016-08-18
The objective of this work was to inform the design of a rapid response program to support evidence-informed decision-making in health policy and practice for the Americas region. Specifically, we focus on the following: (1) What are the best methodological approaches for rapid reviews of the research evidence? (2) What other strategies are needed to facilitate evidence-informed decision-making in health policy and practice? and (3) How best to operationalize a rapid response program? The evidence used to inform the design of a rapid response program included (i) two rapid reviews of methodological approaches for rapid reviews of the research evidence and strategies to facilitate evidence-informed decision-making, (ii) supplementary literature in relation to the "shortcuts" that could be considered to reduce the time needed to complete rapid reviews, (iii) four case studies, and (iv) supplementary literature to identify additional operational issues for the design of the program. There is no agreed definition of rapid reviews in the literature and no agreed methodology for conducting them. Better reporting of rapid review methods is needed. The literature found in relation to shortcuts will be helpful in choosing shortcuts that maximize timeliness while minimizing the impact on quality. Evidence for other strategies that can be used concurrently to facilitate the uptake of research evidence, including evidence drawn from rapid reviews, is presented. Operational issues that need to be considered in designing a rapid response program include the implications of a "user-pays" model, the importance of recruiting staff with the right mix of skills and qualifications, and ensuring that the impact of the model on research use in decision-making is formally evaluated. When designing a new rapid response program, greater attention needs to be given to specifying the rapid review methods and reporting these in sufficient detail to allow a quality assessment. 
It will also be important to engage in other strategies to facilitate the uptake of the rapid reviews and to evaluate the chosen model in order to make refinements and add to the evidence base for evidence-informed decision-making.
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention.
Al-Hajj, Samar; Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-09-12
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention; Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders' observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods; Results: The GA methodology triggered the emergence of 'common ground' among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders' verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making; Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve 'common ground' among diverse stakeholders about health data and their implications.
Design Methodology for Automated Construction Machines
1987-12-11
Demsetz, Laura A.; Levy, David H.; Schena, Bruce
…are discussed along with the design of a pair of machines which automate framework installation. Preliminary analysis and testing indicate that these…
Beyond Self-Report: Emerging Methods for Capturing Individual Differences in Decision-Making Process
Connors, Brenda L.; Rende, Richard; Colton, Timothy J.
2016-01-01
People vary in the way in which they approach decision-making, which impacts real-world behavior. There has been a surge of interest in moving beyond reliance on self-report measures to capture such individual differences. Particular emphasis has been placed on devising and applying a range of methodologies that include experimental, neuroscience, and observational paradigms. This paper provides a selective review of recent studies that illustrate the methods and yield of these approaches in terms of generating a deeper understanding of decision-making style and the notable differences that can be found across individuals.
A Methodology for Developing Army Acquisition Strategies for an Uncertain Future
2007-01-01
decisions. For example, they employ the Automated Cost Estimating Integrated Tools (ACEIT) to simplify life cycle cost estimates; other tools are…
DOT National Transportation Integrated Search
2012-01-01
This report describes the methodology and results of analyses performed to determine motorist understanding, as well as the operational and safety effectiveness, of automated flagger assistance devices (AFADs) relative to the use of flaggers at lan...
Workload-Matched Adaptive Automation Support of Air Traffic Controller Information Processing Stages
NASA Technical Reports Server (NTRS)
Kaber, David B.; Prinzel, Lawrence J., III; Wright, Melanie C.; Clamann, Michael P.
2002-01-01
Adaptive automation (AA) has been explored as a solution to the problems associated with human-automation interaction in supervisory control environments. However, research has focused on the performance effects of dynamic control allocations of early-stage sensory and information acquisition functions. The present research compares the effects of AA applied across the entire range of information processing stages of human operators, such as air traffic controllers. The results provide evidence that the effectiveness of AA is dependent on the stage of task performance (human-machine system information processing) that is flexibly automated. The results suggest that humans are better able to adapt to AA when it is applied to lower-level sensory and psychomotor functions, such as information acquisition and action implementation, than when it is applied to cognitive (analysis and decision-making) tasks. The results also provide support for the use of AA as compared with completely manual control. These results are discussed in terms of implications for AA design for aviation.
A Mixed Methodological Analysis of the Role of Culture in the Clinical Decision-Making Process
ERIC Educational Resources Information Center
Hays, Danica G.; Prosek, Elizabeth A.; McLeod, Amy L.
2010-01-01
Even though literature indicates that particular cultural groups receive more severe diagnoses at disproportionate rates, there has been minimal research that addresses how culture interfaces specifically with clinical decision making. This mixed methodological study of 41 counselors indicated that cultural characteristics of both counselors and…
Five Steps for Improving Evaluation Reports by Using Different Data Analysis Methods.
ERIC Educational Resources Information Center
Thompson, Bruce
Although methodological integrity is not the sole determinant of the value of a program evaluation, decision-makers do have a right, at a minimum, to be able to expect competent work from evaluators. This paper explores five areas where evaluators might improve methodological practices. First, evaluation reports should reflect the limited…
Policy capturing as a method of quantifying the determinants of landscape preference
Dennis B. Propst
1979-01-01
Policy Capturing, a potential methodology for evaluating landscape preference, was described and tested. This methodology results in a mathematical model that theoretically represents the human decision-making process. Under experimental conditions, judges were asked to express their preferences for scenes of the Blue Ridge Parkway. An equation which "captures,...
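The core of policy capturing is fitting a model that predicts a judge's ratings from scene attributes, so the fitted weights "capture" the policy implicit in the judgments. A one-predictor ordinary-least-squares sketch, with invented data and an invented attribute:

```python
# Policy capturing in miniature: regress a judge's preference ratings on a
# scene attribute to recover the weight implicit in the judgments.
# The ratings and the attribute (percent tree cover) are invented.
def least_squares(xs, ys):
    """Return (intercept, slope) of the ordinary-least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

tree_cover = [10, 30, 50, 70, 90]       # percent, one value per scene
ratings    = [2.0, 3.0, 4.0, 5.0, 6.0]  # judge's preference, 1-7 scale
intercept, slope = least_squares(tree_cover, ratings)
print(intercept, slope)  # 1.5 0.05 -> captured policy: +0.05 per % cover
```

With several attributes per scene, the same idea becomes a multiple regression, giving the mathematical model of the judge's decision-making process that the abstract describes.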
Tough Teens: The Methodological Challenges of Interviewing Teenagers as Research Participants
ERIC Educational Resources Information Center
Bassett, Raewyn; Beagan, Brenda L.; Ristovski-Slijepcevic, Svetlana; Chapman, Gwen E.
2008-01-01
Encouraging a teenager to have a conversation in a semistructured research interview is fraught with difficulties. The authors discuss the methodological challenges encountered when interviewing adolescents of European Canadian, African Canadian, and Punjabi Canadian families who took part in the Family Food Decision-Making Study in two regions of…
Development of a support tool for complex decision-making in the provision of rural maternity care.
Hearns, Glen; Klein, Michael C; Trousdale, William; Ulrich, Catherine; Butcher, David; Miewald, Christiana; Lindstrom, Ronald; Eftekhary, Sahba; Rosinski, Jessica; Gómez-Ramírez, Oralia; Procyk, Andrea
2010-02-01
Decisions in the organization of safe and effective rural maternity care are complex, difficult, value laden and fraught with uncertainty, and must often be based on imperfect information. Decision analysis offers tools for addressing these complexities in order to help decision-makers determine the best use of resources and to appreciate the downstream effects of their decisions. To develop a maternity care decision-making tool for the British Columbia Northern Health Authority (NH) for use in low birth volume settings. Based on interviews with community members, providers, recipients and decision-makers, and employing a formal decision analysis approach, we sought to clarify the influences affecting rural maternity care and develop a process to generate a set of value-focused objectives for use in designing and evaluating rural maternity care alternatives. Four low-volume communities with variable resources (with and without on-site births, with or without caesarean section capability) were chosen. Physicians (20), nurses (18), midwives and maternity support service providers (4), local business leaders, economic development officials and elected officials (12), First Nations (women [pregnant and non-pregnant], chiefs and band members) (40), social workers (3), pregnant women (2) and NH decision-makers/administrators (17). We developed a Decision Support Manual to assist with assessing community needs and values, context for decision-making, capacity of the health authority or healthcare providers, identification of key objectives for decision-making, developing alternatives for care, and a process for making trade-offs and balancing multiple objectives. The manual was deemed an effective tool for the purpose by the client, NH. Beyond assisting the decision-making process itself, the methodology provides a transparent communication tool to assist in making difficult decisions. 
While the manual was specifically intended to deal with rural maternity issues, the NH decision-makers feel the method can be easily adapted to assist decision-making in other contexts in medicine where there are conflicting objectives, values and opinions. Decisions on the location of new facilities or infrastructure, or enhancing or altering services such as surgical or palliative care, would be examples of complex decisions that might benefit from this methodology.
Automation of the electron-beam welding process
NASA Astrophysics Data System (ADS)
Koleva, E.; Dzharov, V.; Kardjiev, M.; Mladenov, G.
2016-03-01
This work considers the automatic control of the vacuum and cooling systems of the equipment for electron-beam welding, evaporation and surface modification located at the IE-BAS. A control and management project was elaborated, based on the development of an engineering support system using existing and additional technical means of automation. The indicators critical to the time needed to reach the working regime and to stop the operation of the installation can be optimized using experimentally obtained transient characteristics. The automation of the available equipment, aimed at improving its efficiency and the repeatability of the obtained results and at stabilizing the process parameters, should be integrated into an Engineering Support System which, besides operator supervision, consists of several subsystems for equipment control, data acquisition, information analysis, system management and decision-making support.
Analysis of Trinity Power Metrics for Automated Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michalenko, Ashley Christine
This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for analysis, tools used, the methodology, work performed during the summer, and future work planned.
NASA Astrophysics Data System (ADS)
Gordon, E.; Lukas, J.
2009-12-01
Through the Western Water Assessment RISA program, we are conducting a research project that will produce science synthesis information to help local, state, and federal decision-makers in Colorado and Wyoming develop adaptation strategies to deal with climate-related threats to forest ecosystem services, in particular bark beetle infestations and stand-replacing wildfires. We begin by using the problem orientation framework, a policy sciences methodology, to understand how decision-makers can most effectively address policy problems that threaten the attainment of socially accepted goals. By applying this framework to the challenges facing decision-makers, we more accurately identify specific areas where scientific research can improve decision-making. WWA researchers will next begin to connect decision-makers with relevant scientific literature and identify specific areas of future scientific research that will be most effective at addressing their needs.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1987-01-01
The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.
Addressing Climate Change in Long-Term Water Planning Using Robust Decisionmaking
NASA Astrophysics Data System (ADS)
Groves, D. G.; Lempert, R.
2008-12-01
Addressing climate change in long-term natural resource planning is difficult because future management conditions are deeply uncertain and the range of possible adaptation options are so extensive. These conditions pose challenges to standard optimization decision-support techniques. This talk will describe a methodology called Robust Decisionmaking (RDM) that can complement more traditional analytic approaches by utilizing screening-level water management models to evaluate large numbers of strategies against a wide range of plausible future scenarios. The presentation will describe a recent application of the methodology to evaluate climate adaptation strategies for the Inland Empire Utilities Agency in Southern California. This project found that RDM can provide a useful way for addressing climate change uncertainty and identify robust adaptation strategies.
1987-03-01
contends his soft systems methodology is such an approach. [Ref. 2: pp. 105-107] Overview of this Methodology is meant for addressing fuzzy, ill...could form the basis of office systems development: Checkland's (1981) soft systems methodology, Pava's (1983) sociotechnical design, and Mumford and
Soft robot design methodology for `push-button' manufacturing
NASA Astrophysics Data System (ADS)
Paik, Jamie
2018-06-01
`Push-button' or fully automated manufacturing would enable the production of robots with zero intervention from human hands. Realizing this utopia requires a fundamental shift from a sequential (design-materials-manufacturing) to a concurrent design methodology.
Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David
2018-04-01
Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/ quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy. 
Published by Elsevier Inc. All rights reserved.
AAC Best Practice Using Automated Language Activity Monitoring.
ERIC Educational Resources Information Center
Hill, Katya; Romich, Barry
This brief paper describes automated language activity monitoring (LAM), an augmentative and alternative communication (AAC) methodology for the collection, editing, and analysis of language data in structured or natural situations with people who have severe communication disorders. The LAM function records each language event (letters, words,…
A methodology for automatic intensity-modulated radiation treatment planning for lung cancer
NASA Astrophysics Data System (ADS)
Zhang, Xiaodong; Li, Xiaoqiang; Quan, Enzhuo M.; Pan, Xiaoning; Li, Yupeng
2011-07-01
In intensity-modulated radiotherapy (IMRT), the quality of the treatment plan, which is highly dependent upon the treatment planner's level of experience, greatly affects the potential benefits of the radiotherapy (RT). Furthermore, the planning process is complicated and requires a great deal of iteration, and is often the most time-consuming aspect of the RT process. In this paper, we describe a methodology to automate the IMRT planning process in lung cancer cases, the goal being to improve the quality and consistency of treatment planning. This methodology (1) automatically sets beam angles based on a beam angle automation algorithm, (2) judiciously designs the planning structures, which were shown to be effective for all the lung cancer cases we studied, and (3) automatically adjusts the objectives of the objective function based on a parameter automation algorithm. We compared treatment plans created in this system (mdaccAutoPlan) based on the overall methodology with plans from a clinical trial of IMRT for lung cancer run at our institution. The 'autoplans' were consistently better, or no worse, than the plans produced by experienced medical dosimetrists in terms of tumor coverage and normal tissue sparing. We conclude that the mdaccAutoPlan system can potentially improve the quality and consistency of treatment planning for lung cancer.
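The parameter-automation algorithm itself is not spelled out in the abstract, so the following is only a hedged sketch of the general idea it describes: an optimizer loop that raises the weight of each objective whose clinical goal is still unmet. All names here (`adjust_weights`, the mock `achieved` callable standing in for a full plan optimization) are hypothetical, not the mdaccAutoPlan code.

```python
def adjust_weights(weights, achieved, goals, step=0.2, max_iters=50):
    """Raise the weight of every objective whose goal is unmet, then re-evaluate.

    `achieved` maps the current weight vector to per-objective dose metrics;
    each metric should fall at or below its entry in `goals` (all minimized).
    """
    for _ in range(max_iters):
        metrics = achieved(weights)  # stand-in for running the plan optimizer
        unmet = [i for i, (m, g) in enumerate(zip(metrics, goals)) if m > g]
        if not unmet:
            break
        for i in unmet:
            weights[i] *= (1 + step)  # push harder on the failing objective
    return weights

# Mock optimizer: a higher weight drives that objective's metric lower.
weights = adjust_weights([1.0, 1.0], lambda w: [10 / w[0], 5 / w[1]], goals=[5, 5])
```

In this toy run only the first objective starts out unmet, so only its weight grows until its mock metric drops to the goal.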
The Reliability of Methodological Ratings for speechBITE Using the PEDro-P Scale
ERIC Educational Resources Information Center
Murray, Elizabeth; Power, Emma; Togher, Leanne; McCabe, Patricia; Munro, Natalie; Smith, Katherine
2013-01-01
Background: speechBITE (http://www.speechbite.com) is an online database established in order to help speech and language therapists gain faster access to relevant research that can be used in clinical decision-making. In addition to containing more than 3000 journal references, the database also provides methodological ratings on the PEDro-P (an…
The Ranking of Higher Education Institutions in Russia: Some Methodological Problems.
ERIC Educational Resources Information Center
Filinov, Nikolay B.; Ruchkina, Svetlana
2002-01-01
The ranking of higher education institutions in Russia is examined from two points of view: as a social phenomenon and as a multi-criteria decision-making problem. The first point of view introduces the idea of interested and involved parties; the second introduces certain principles on which a rational ranking methodology should be based.…
Evaluation of the Factors That Determine Quality in Higher Education: An Empirical Study
ERIC Educational Resources Information Center
Tsinidou, Maria; Gerogiannis, Vassilis; Fitsilis, Panos
2010-01-01
Purpose: The aim of this paper is to identify the quality determinants for education services provided by higher education institutions (HEIs) in Greece and to measure their relative importance from the students' points of view. Design/methodology/approach: A multi-criteria decision-making methodology was used for assessing the relative importance…
ERIC Educational Resources Information Center
Nickerson, Carol A.; McClelland, Gary H.
1988-01-01
A methodology is developed based on axiomatic conjoint measurement to accompany a fertility decision-making model. The usefulness of the model is then demonstrated via an application to a study of contraceptive choice (N=100 male and female family-planning clinic clients). Finally, the validity of the model is evaluated. (TJH)
The Decisions of Elementary School Principals: A Test of Ideal Type Methodology.
ERIC Educational Resources Information Center
Greer, John T.
Interviews with 25 Georgia elementary school principals provided data that could be used to test an application of Max Weber's ideal type methodology to decision-making. Alfred Schuetz's model of the rational act, based on one of Weber's ideal types, was analyzed and translated into describable acts and behaviors. Interview procedures were…
Stream habitat analysis using the instream flow incremental methodology
Bovee, Ken D.; Lamb, Berton L.; Bartholow, John M.; Stalnaker, Clair B.; Taylor, Jonathan; Henriksen, Jim
1998-01-01
This document describes the Instream Flow Incremental Methodology (IFIM) in its entirety. It also serves as a comprehensive introductory textbook on IFIM for training courses, as it contains the most complete and comprehensive description of IFIM in existence today, and as an official published guide to the methodology, countering the misconceptions that have pervaded the professional literature since the mid-1980s by describing IFIM as its developers envision it. The document is aimed at the decisionmakers in the management and allocation of natural resources, to provide them an overview, and at those who design and implement studies to inform the decisionmakers. It provides enough background on model concepts, data requirements, calibration techniques, and quality assurance to help the technical user design and implement a cost-effective application of IFIM that will provide policy-relevant information. Chapters cover the basic organization of IFIM and the procedural sequence of applying it, from problem identification through study planning and implementation to problem resolution.
A stochastic conflict resolution model for trading pollutant discharge permits in river systems.
Niksokhan, Mohammad Hossein; Kerachian, Reza; Amin, Pedram
2009-07-01
This paper presents an efficient methodology for developing pollutant discharge permit trading in river systems considering the conflict of interests of involving decision-makers and the stakeholders. In this methodology, a trade-off curve between objectives is developed using a powerful and recently developed multi-objective genetic algorithm technique known as the Nondominated Sorting Genetic Algorithm-II (NSGA-II). The best non-dominated solution on the trade-off curve is defined using the Young conflict resolution theory, which considers the utility functions of decision makers and stakeholders of the system. These utility functions are related to the total treatment cost and a fuzzy risk of violating the water quality standards. The fuzzy risk is evaluated using the Monte Carlo analysis. Finally, an optimization model provides the trading discharge permit policies. The practical utility of the proposed methodology in decision-making is illustrated through a realistic example of the Zarjub River in the northern part of Iran.
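The trade-off curve described above emerges from NSGA-II's non-dominated sorting step. As a hedged illustration (not the authors' code), here is that sorting for two minimized objectives, e.g. total treatment cost and fuzzy risk of violating water quality standards, with invented numbers:

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Partition solutions into successive Pareto fronts (lists of indices)."""
    remaining = set(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(sorted(front))
        remaining -= set(front)
    return fronts

# (cost, risk) pairs for five candidate discharge-permit allocations
solutions = [(10, 0.9), (12, 0.4), (15, 0.1), (11, 0.8), (16, 0.5)]
fronts = non_dominated_sort(solutions)  # -> [[0, 1, 2, 3], [4]]
```

The first front is the trade-off curve; a conflict-resolution rule such as the Young theory mentioned above would then pick one point from it using the stakeholders' utility functions.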
Methodological challenges of validating a clinical decision-making tool in the practice environment.
Brennan, Caitlin W; Daly, Barbara J
2015-04-01
Validating a measurement tool intended for use in the practice environment poses challenges that may not be present when validating a tool intended solely for research purposes. The aim of this article is to describe the methodological challenges of validating a clinical decision-making tool, the Oncology Acuity Tool, which nurses use to make nurse assignment and staffing decisions prospectively each shift. Data were derived from a larger validation study, during which several methodological challenges arose. Revisions to the tool, including conducting iterative feedback cycles with end users, were necessary before the validation study was initiated. The "true" value of patient acuity is unknown, and thus, two approaches to inter-rater reliability assessment were used. Discordant perspectives existed between experts and end users. Balancing psychometric rigor with clinical relevance may be achieved through establishing research-practice partnerships, seeking active and continuous feedback with end users, and weighing traditional statistical rules of thumb with practical considerations. © The Author(s) 2014.
Automated Scoring in Context: Rapid Assessment for Placed Students
ERIC Educational Resources Information Center
Klobucar, Andrew; Elliot, Norbert; Deess, Perry; Rudniy, Oleksandr; Joshi, Kamal
2013-01-01
This study investigated the use of automated essay scoring (AES) to identify at-risk students enrolled in a first-year university writing course. An application of AES, the "Criterion"[R] Online Writing Evaluation Service was evaluated through a methodology focusing on construct modelling, response processes, disaggregation, extrapolation,…
Automating Formative and Summative Feedback for Individualised Assignments
ERIC Educational Resources Information Center
Hamilton, Ian Robert
2009-01-01
Purpose: The purpose of this paper is to report on the rationale behind the use of a unique paper-based individualised accounting assignment, which automated the provision to students of immediate formative and timely summative feedback. Design/methodology/approach: As students worked towards completing their assignment, the package provided…
Model of Emotional Expressions in Movements
ERIC Educational Resources Information Center
Rozaliev, Vladimir L.; Orlova, Yulia A.
2013-01-01
This paper presents a new approach to automated identification of human emotions based on analysis of body movements, a recognition of gestures and poses. Methodology, models and automated system for emotion identification are considered. To characterize the person emotions in the model, body movements are described with linguistic variables and a…
Giesbrecht, Chantelle J.; Thornton, Allen E.; Hall-Patch, Clare; Maan, Evelyn J.; Côté, Hélène C. F.; Money, Deborah M.; Murray, Melanie; Pick, Neora
2014-01-01
Background Through implementation of combination antiretroviral therapy (cART) remarkable gains have been achieved in the management of HIV infection; nonetheless, the neurocognitive consequences of infection remain a pivotal concern in the cART era. Research has often employed norm-referenced neuropsychological scores, derived from healthy populations (excluding many seronegative individuals at high risk for HIV infection), to characterize impairments in predominately male HIV-infected populations. Methods Using matched-group methodology, we assessed 81 HIV-seropositive (HIV+) women with established neuropsychological measures validated for detection of HIV-related impairments, as well as additional detailed tests of executive function and decision-making from the Cambridge Neuropsychological Test Automated Battery (CANTAB). Results On validated tests, the HIV+ women exhibited impairments that were limited to significantly slower information processing speed when compared with 45 HIV-seronegative (HIV−) women with very similar demographic backgrounds and illness comorbidities. Additionally, select executive impairments in shifting attention (i.e., reversal learning) and in decision-making quality were revealed in HIV+ participants. Modifiers of neurocognition in HIV-infected women included detectable HIV plasma viral load, active hepatitis C virus co-infection, and self-reported depression symptoms. In contrast, leukocyte telomere length (LTL), a marker of cellular aging, did not significantly differ between HIV+ and HIV− women, nor was LTL associated with overall neurocognition in the HIV+ group. Conclusions The findings suggest that well-managed HIV infection may entail a more circumscribed neurocognitive deficit pattern than that reported in many norm-referenced studies, and that common comorbidities make a secondary contribution to HIV-related neurocognitive impairments. PMID:24595021
Situating methodology within qualitative research.
Kramer-Kile, Marnie L
2012-01-01
Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.
A Method for Evaluating the Safety Impacts of Air Traffic Automation
NASA Technical Reports Server (NTRS)
Kostiuk, Peter; Shapiro, Gerald; Hanson, Dave; Kolitz, Stephan; Leong, Frank; Rosch, Gene; Bonesteel, Charles
1998-01-01
This report describes a methodology for analyzing the safety and operational impacts of emerging air traffic technologies. The approach integrates traditional reliability models of the system infrastructure with models that analyze the environment within which the system operates, and models of how the system responds to different scenarios. Products of the analysis include safety measures such as predicted incident rates, predicted accident statistics, and false alarm rates; and operational availability data. The report demonstrates the methodology with an analysis of the operation of the Center-TRACON Automation System at Dallas-Fort Worth International Airport.
Computer-aided biochemical programming of synthetic microreactors as diagnostic devices.
Courbet, Alexis; Amar, Patrick; Fages, François; Renard, Eric; Molina, Franck
2018-04-26
Biological systems have evolved efficient sensing and decision-making mechanisms to maximize fitness in changing molecular environments. Synthetic biologists have exploited these capabilities to engineer control on information and energy processing in living cells. While engineered organisms pose important technological and ethical challenges, de novo assembly of non-living biomolecular devices could offer promising avenues toward various real-world applications. However, assembling biochemical parts into functional information processing systems has remained challenging due to extensive multidimensional parameter spaces that must be sampled comprehensively in order to identify robust, specification-compliant molecular implementations. We introduce a systematic methodology based on automated computational design and microfluidics enabling the programming of synthetic cell-like microreactors embedding biochemical logic circuits, or protosensors, to perform accurate biosensing and biocomputing operations in vitro according to temporal logic specifications. We show that proof-of-concept protosensors integrating diagnostic algorithms detect specific patterns of biomarkers in human clinical samples. Protosensors may enable novel approaches to medicine and represent a step toward autonomous micromachines capable of precise interfacing of human physiology or other complex biological environments, ecosystems, or industrial bioprocesses. © 2018 The Authors. Published under the terms of the CC BY 4.0 license.
Automated Algorithm for J-Tpeak and Tpeak-Tend Assessment of Drug-Induced Proarrhythmia Risk
Johannesen, Lars; Vicente, Jose; Hosseini, Meisam; ...
2016-12-30
Prolongation of the heart rate corrected QT (QTc) interval is a sensitive marker of torsade de pointes risk; however it is not specific, as QTc prolonging drugs that block inward currents are often not associated with torsade. Recent work demonstrated that separate analysis of the heart rate corrected J-Tpeakc intervals and the Tpeak-Tend interval can identify QTc prolonging drugs with inward current block, and is being proposed as a part of a new cardiac safety paradigm for new drugs (the "CiPA" initiative). In this work, we describe an automated measurement methodology for assessment of the J-Tpeakc and Tpeak-Tend intervals using the vector magnitude lead. The automated measurement methodology was developed using data from one clinical trial and was evaluated using independent data from a second clinical trial. Comparison between the automated and the prior semi-automated measurements shows that the automated algorithm reproduces the semi-automated measurements with a mean difference of single-deltas <1 ms and no difference in intra-time point variability (p for all > 0.39). In addition, the time-profile of the baseline and placebo-adjusted changes are within 1 ms for 63% of the time-points (86% within 2 ms). Importantly, the automated results lead to the same conclusions about the electrophysiological mechanisms of the studied drugs. We have developed an automated algorithm for assessment of J-Tpeakc and Tpeak-Tend intervals that can be applied in clinical drug trials. Under the CiPA initiative this ECG assessment would determine if there are unexpected ion channel effects in humans compared to preclinical studies. In conclusion, the algorithm is being released as open-source software.
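The released algorithm itself is open source; the snippet below is only a schematic sketch of two ingredients named in the abstract — forming the vector magnitude lead from three orthogonal ECG leads, and taking T-peak as the VM maximum inside a search window. The window bounds and sample values are invented, and real T-wave delineation involves far more than a window maximum.

```python
import math

def vector_magnitude(x, y, z):
    """Combine orthogonal X/Y/Z leads into one VM lead, sample by sample."""
    return [math.sqrt(a*a + b*b + c*c) for a, b, c in zip(x, y, z)]

def t_peak_index(vm, start, end):
    """Index of the largest VM amplitude in the half-open window [start, end)."""
    window = vm[start:end]
    return start + max(range(len(window)), key=window.__getitem__)

vm = vector_magnitude([0, 3, 0], [0, 4, 1], [0, 0, 0])  # -> [0.0, 5.0, 1.0]
peak = t_peak_index(vm, 0, 3)                           # -> 1
```

Working on the single VM lead rather than twelve individual leads is what makes a fully automated, annotator-free measurement tractable.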
Man-Robot Symbiosis: A Framework For Cooperative Intelligence And Control
NASA Astrophysics Data System (ADS)
Parker, Lynne E.; Pin, Francois G.
1988-10-01
The man-robot symbiosis concept has the fundamental objective of bridging the gap between fully human-controlled and fully autonomous systems to achieve true man-robot cooperative control and intelligence. Such a system would allow improved speed, accuracy, and efficiency of task execution, while retaining the man in the loop for innovative reasoning and decision-making. The symbiont would have capabilities for supervised and unsupervised learning, allowing an increase of expertise in a wide task domain. This paper describes a robotic system architecture facilitating the symbiotic integration of teleoperative and automated modes of task execution. The architecture reflects a unique blend of many disciplines of artificial intelligence into a working system, including job or mission planning, dynamic task allocation, man-robot communication, automated monitoring, and machine learning. These disciplines are embodied in five major components of the symbiotic framework: the Job Planner, the Dynamic Task Allocator, the Presenter/Interpreter, the Automated Monitor, and the Learning System.
Machine learning of network metrics in ATLAS Distributed Data Management
NASA Astrophysics Data System (ADS)
Lassnig, Mario; Toler, Wesley; Vamosi, Ralf; Bogado, Joaquin; ATLAS Collaboration
2017-10-01
The increasing volume of physics data poses a critical challenge to the ATLAS experiment. In anticipation of high luminosity physics, automation of everyday data management tasks has become necessary. Previously many of these tasks required human decision-making and operation. Recent advances in hardware and software have made it possible to entrust more complicated duties to automated systems using models trained by machine learning algorithms. In this contribution we show results from one of our ongoing automation efforts that focuses on network metrics. First, we describe our machine learning framework built atop the ATLAS Analytics Platform. This framework can automatically extract and aggregate data, train models with various machine learning algorithms, and eventually score the resulting models and parameters. Second, we use these models to forecast metrics relevant for network-aware job scheduling and data brokering. We show the characteristics of the data and evaluate the forecasting accuracy of our models.
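The ATLAS framework itself is not shown here; as a toy stand-in for the forecasting task it describes, the sketch below fits an ordinary-least-squares line to recent throughput samples and predicts the next interval. The sample values are invented, and production models would use richer features and learners than a univariate trend.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b over paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

throughput = [100, 110, 120, 130]   # MB/s over four past intervals (invented)
a, b = fit_line(range(4), throughput)
forecast = a * 4 + b                # next-interval prediction -> 140.0
```

A scheduler could compare such forecasts across links to decide where to broker the next transfer, then score the model against the value actually observed.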
Ngo, Tuan Anh; Lu, Zhi; Carneiro, Gustavo
2017-01-01
We introduce a new methodology that combines deep learning and level set for the automated segmentation of the left ventricle of the heart from cardiac cine magnetic resonance (MR) data. This combination is relevant for segmentation problems, where the visual object of interest presents large shape and appearance variations, but the annotated training set is small, which is the case for various medical image analysis applications, including the one considered in this paper. In particular, level set methods are based on shape and appearance terms that use small training sets, but present limitations for modelling the visual object variations. Deep learning methods can model such variations using relatively small amounts of annotated training data, but they often need to be regularised to produce good generalisation. Therefore, the combination of these methods brings together the advantages of both approaches, producing a methodology that needs small training sets and produces accurate segmentation results. We test our methodology on the MICCAI 2009 left ventricle segmentation challenge database (containing 15 sequences for training, 15 for validation and 15 for testing), where our approach achieves the most accurate results in the semi-automated problem and state-of-the-art results for the fully automated challenge. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
[Methodological problems in the use of information technologies in physical education].
Martirosov, E G; Zaĭtseva, G A
2000-01-01
The paper considers methodological problems in the use of computer technologies in physical education by applying diagnostic and consulting systems, educational and educational-and-training process automation systems, and control and self-control programmes for athletes and others.
Roadway safety analysis methodology for Utah : final report.
DOT National Transportation Integrated Search
2016-12-01
This research focuses on the creation of a three-part Roadway Safety Analysis methodology that applies and automates the cumulative work of recently-completed roadway safety research. The first part is to prepare the roadway and crash data for analys...
Advanced Airframe Structural Materials: A Primer and Cost Estimating Methodology
1991-01-01
laying machines for larger, mildly contoured parts such as wing and stabilizer skins. For such parts, automated tape laying machines can operate many... heat guns (90-130°F). However, thermoplastics require as much as 650°F for forming. Automated tape laying machines for these materials use warm... cycles to properly seat the plies onto the tool. This time-consuming process can sometimes be eliminated or reduced by the use of automated tape laying procedures
Isoda, Yuta; Sasaki, Norihiko; Kitamura, Kei; Takahashi, Shuji; Manmode, Sujit; Takeda-Okuda, Naoko; Tamura, Jun-Ichi; Nokami, Toshiki; Itoh, Toshiyuki
2017-01-01
The total synthesis of TMG-chitotriomycin using an automated electrochemical synthesizer for the assembly of carbohydrate building blocks is demonstrated. We have successfully prepared a precursor of TMG-chitotriomycin, which is a structurally-pure tetrasaccharide with typical protecting groups, through the methodology of automated electrochemical solution-phase synthesis developed by us. The synthesis of structurally well-defined TMG-chitotriomycin has been accomplished in 10-steps from a disaccharide building block.
Hasanah, C. I.
2003-01-01
Quality of life measures are designed to enable patients’ perspectives on the impact of health and healthcare interventions on their lives to be assessed and taken into account in clinical decision-making and research. This paper discusses some approaches, methodological as well as interpretative issues of health related quality of life research. PMID:23386798
Operations management system advanced automation: Fault detection isolation and recovery prototyping
NASA Technical Reports Server (NTRS)
Hanson, Matt
1990-01-01
The purpose of this project is to address the global fault detection, isolation and recovery (FDIR) requirements for Operations Management System (OMS) automation within the Space Station Freedom program. This shall be accomplished by developing a selected FDIR prototype for the Space Station Freedom distributed processing systems. The prototype shall be based on advanced automation methodologies in addition to traditional software methods to meet the requirements for automation. A secondary objective is to expand the scope of the prototyping to encompass multiple aspects of station-wide fault management (SWFM) as discussed in OMS requirements documentation.
Demonstration of a Safety Analysis on a Complex System
NASA Technical Reports Server (NTRS)
Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey;
1997-01-01
For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.
Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)
NASA Astrophysics Data System (ADS)
Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.
2014-04-01
A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimations are based on numerical model results, which provide an appropriate spatio-temporal analysis framework to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and the conceptual approaches as a comprehensive and practical management tool.
Cost-Effectiveness Analysis of the Automation of a Circulation System.
ERIC Educational Resources Information Center
Mosley, Isobel
A general methodology for cost effectiveness analysis was developed and applied to the Colorado State University library loan desk. The cost effectiveness of the existing semi-automated circulation system was compared with that of a fully manual one, based on the existing manual subsystem. Faculty users' time and computer operating costs were…
Automated Serials Control at the Indian Institutes of Technology: An Overview
ERIC Educational Resources Information Center
Ghosh, Tapas Kumar; Panda, K. C.
2011-01-01
Purpose: The purpose of this paper is to highlight the functional attributes of the automated serials control systems of the libraries in seven Indian Institutes of Technology (IITs) and provide a comparative analysis. Design/methodology/approach: Features of the serials control modules of the library management systems (LMSs) in use in the…
A set of coupled semantic data models, i.e., ontologies, is presented to advance a methodology towards automated inventory modeling of chemical manufacturing in life cycle assessment. The cradle-to-gate life cycle inventory for chemical manufacturing is a detailed collection of ...
A Binary Programming Approach to Automated Test Assembly for Cognitive Diagnosis Models
ERIC Educational Resources Information Center
Finkelman, Matthew D.; Kim, Wonsuk; Roussos, Louis; Verschoor, Angela
2010-01-01
Automated test assembly (ATA) has been an area of prolific psychometric research. Although ATA methodology is well developed for unidimensional models, its application alongside cognitive diagnosis models (CDMs) is a burgeoning topic. Two suggested procedures for combining ATA and CDMs are to maximize the cognitive diagnostic index and to use a…
Technical Processing Librarians in the 1980's: Current Trends and Future Forecasts.
ERIC Educational Resources Information Center
Kennedy, Gail
1980-01-01
This review of recent and anticipated advances in library automation technology and methodology includes a review of the effects of OCLC, MARC formatting, AACR2, and increasing costs, as well as predictions of the impact on library technical processing of networking, expansion of automation, minicomputers, specialized reference services, and…
Automating risk analysis of software design models.
Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P
2014-01-01
The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
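The identification-tree idea described above can be sketched in a few lines. This is a hypothetical illustration, not AutSEC's actual data structures or rule set: it maps data-flow-diagram element types to candidate threat categories (borrowing STRIDE-style names) and reports the threats applicable to each element of a toy design.

```python
# Hypothetical sketch of an "identification tree": map DFD element types to
# candidate threats. Element types and threat lists are illustrative only,
# not AutSEC's actual rules.

IDENTIFICATION_TREE = {
    "data_flow":       ["tampering", "information_disclosure"],
    "data_store":      ["tampering", "information_disclosure", "denial_of_service"],
    "process":         ["spoofing", "elevation_of_privilege", "denial_of_service"],
    "external_entity": ["spoofing"],
}

def identify_threats(dfd_elements):
    """dfd_elements: list of (name, element_type) pairs from a data flow diagram."""
    report = {}
    for name, etype in dfd_elements:
        report[name] = IDENTIFICATION_TREE.get(etype, [])
    return report

if __name__ == "__main__":
    dfd = [("login_request", "data_flow"), ("user_db", "data_store")]
    for element, threats in identify_threats(dfd).items():
        print(element, "->", ", ".join(threats))
```

A companion mitigation tree would attach countermeasures (and costs) to each identified threat in the same table-driven style.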
Delakis, Ioannis; Wise, Robert; Morris, Lauren; Kulama, Eugenia
2015-11-01
The purpose of this work was to evaluate the contrast-detail performance of full field digital mammography (FFDM) systems using ideal (Hotelling) observer Signal-to-Noise Ratio (SNR) methodology and ascertain whether it can be considered an alternative to the conventional, automated analysis of CDMAM phantom images. Five FFDM units currently used in the national breast screening programme were evaluated, which differed with respect to age, detector, Automatic Exposure Control (AEC) and target/filter combination. Contrast-detail performance was analysed using CDMAM and ideal observer SNR methodology. The ideal observer SNR was calculated for input signal originating from gold discs of varying thicknesses and diameters, and then used to estimate the threshold gold thickness for each diameter as per CDMAM analysis. The variability of both methods and the dependence of CDMAM analysis on phantom manufacturing discrepancies were also investigated. Results from both CDMAM and ideal observer methodologies were informative differentiators of FFDM systems' contrast-detail performance, displaying comparable patterns with respect to the FFDM systems' type and age. CDMAM results suggested higher threshold gold thickness values compared with the ideal observer methodology, especially for small-diameter details, which can be attributed to the behaviour of the CDMAM phantom used in this study. In addition, ideal observer methodology results showed lower variability than CDMAM results. The ideal observer SNR methodology can provide a useful metric of the FFDM systems' contrast-detail characteristics and could be considered a surrogate for conventional, automated analysis of CDMAM images. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
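The Hotelling observer SNR underlying the abstract above has a compact linear-algebra form: SNR² = Δs·K⁻¹·Δs, where Δs is the mean signal difference between signal-present and signal-absent images and K is the noise covariance. The sketch below is a two-pixel toy example (all values illustrative, not from the study) that keeps the computation explicit.

```python
# Toy Hotelling-observer SNR: SNR^2 = ds^T K^{-1} ds, for a 2-pixel "image".
# ds = mean signal difference, K = noise covariance. Values are illustrative.

import math

def inverse_2x2(K):
    (a, b), (c, d) = K
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def hotelling_snr(ds, K):
    Kinv = inverse_2x2(K)
    snr2 = sum(ds[i] * sum(Kinv[i][j] * ds[j] for j in range(2))
               for i in range(2))
    return math.sqrt(snr2)

if __name__ == "__main__":
    ds = [2.0, 1.0]                  # mean difference per pixel (arbitrary units)
    K = [[1.0, 0.2], [0.2, 1.0]]     # noise covariance with slight correlation
    print(round(hotelling_snr(ds, K), 3))
```

With uncorrelated unit-variance noise (K = identity) the SNR reduces to the Euclidean norm of Δs, which is a useful sanity check.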
Intersubjective decision-making for computer-aided forging technology design
NASA Astrophysics Data System (ADS)
Kanyukov, S. I.; Konovalov, A. V.; Muizemnek, O. Yu.
2017-12-01
We propose a concept of intersubjective decision-making for problems of open-die forging technology design. The intersubjective decisions are chosen from a set of feasible decisions using the fundamentals of decision-making theory in a fuzzy environment according to the Bellman-Zadeh scheme. We consider the formalization of subjective goals and the choice of membership functions for the decisions depending on subjective goals. We study the arrangement of these functions into an intersubjective membership function. The function is constructed for a resulting decision, which is chosen from a set of feasible decisions. The choice of the final intersubjective decision is discussed. All the issues are exemplified by a specific technological problem. The considered concept of solving technological problems under conditions of fuzzy goals allows one to choose the most efficient decisions from a set of feasible ones. These decisions correspond to the stated goals. The concept allows one to reduce human participation in automated design and can be used to develop algorithms and design programs for forging numerous types of forged parts.
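The Bellman-Zadeh scheme referenced above can be sketched directly: each subjective goal contributes a membership function over the feasible decisions, the intersubjective membership is their pointwise minimum, and the chosen decision maximizes that minimum. The forging example below (candidate ingot temperatures, two competing goals) is purely illustrative and not taken from the paper.

```python
# Minimal Bellman-Zadeh sketch: intersect goal membership functions by
# pointwise min, then pick the decision maximizing the intersection.

def bellman_zadeh(decisions, goal_memberships):
    """decisions: list of candidate decisions.
    goal_memberships: functions, each mapping a decision into [0, 1]."""
    def intersubjective(d):
        return min(mu(d) for mu in goal_memberships)
    return max(decisions, key=intersubjective)

if __name__ == "__main__":
    # Hypothetical forging choice: ingot temperature in degrees C.
    temps = [900, 1000, 1100, 1200]
    low_energy = lambda t: max(0.0, 1 - (t - 900) / 400)   # prefer cooler
    formability = lambda t: max(0.0, (t - 800) / 400)      # prefer hotter
    print(bellman_zadeh(temps, [low_energy, formability]))
```

The min operator encodes that a decision is only as acceptable as its worst-satisfied goal, which is what makes the final choice "intersubjective" across conflicting criteria.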
Conflict Resolution Automation and Pilot Situation Awareness
NASA Technical Reports Server (NTRS)
Dao, Arik-Quang V.; Brandt, Summer L.; Bacon, Paige; Kraut, Josh; Nguyen, Jimmy; Minakata, Katsumi; Raza, Hamzah; Rozovski, David; Johnson, Walter W.
2010-01-01
This study compared pilot situation awareness across three traffic management concepts. The concepts varied in terms of the allocation of traffic avoidance responsibility between the pilot on the flight deck, the air traffic controllers, and a conflict resolution automation system. In Concept 1, the flight deck was equipped with conflict resolution tools that enabled the pilots to fully handle the responsibility of weather avoidance and maintaining separation between ownship and surrounding traffic. In Concept 2, pilots were not responsible for traffic separation, but were provided tools for weather and traffic avoidance. In Concept 3, flight deck tools allowed pilots to deviate for weather, but conflict detection tools were disabled. In this concept, pilots were dependent on ground-based automation for conflict detection and resolution. Situation awareness of the pilots was measured using online probes. Results showed that individual situation awareness was highest in Concept 1, where the pilots were most engaged, and lowest in Concept 3, where automation was heavily used. These findings suggest that for conflict resolution tasks, situation awareness is improved when pilots remain in the decision-making loop.
People adopt optimal policies in simple decision-making, after practice and guidance.
Evans, Nathan J; Brown, Scott D
2017-04-01
Organisms making repeated simple decisions are faced with a tradeoff between urgent and cautious strategies. While animals can adopt a statistically optimal policy for this tradeoff, findings about human decision-makers have been mixed. Some studies have shown that people can optimize this "speed-accuracy tradeoff", while others have identified a systematic bias towards excessive caution. These issues have driven theoretical development and spurred debate about the nature of human decision-making. We investigated a potential resolution to the debate, based on two factors that routinely differ between human and animal studies of decision-making: the effects of practice, and of longer-term feedback. Our study replicated the finding that most people, by default, are overly cautious. When given both practice and detailed feedback, people moved rapidly towards the optimal policy, with many participants reaching optimality with less than 1 h of practice. Our findings have theoretical implications for cognitive and neural models of simple decision-making, as well as methodological implications.
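The "optimal policy" for the speed-accuracy tradeoff discussed above can be made concrete under the standard drift-diffusion model: with drift v, unit noise, and symmetric thresholds ±a, closed forms for error rate and mean decision time yield a reward rate that can be maximized over the threshold. This sketch assumes that standard model and illustrative parameter values, not the study's exact task.

```python
import math

# Speed-accuracy tradeoff sketch under the standard drift-diffusion model
# (drift v, noise 1, thresholds +/-a). Parameter values are illustrative.

def error_rate(a, v):
    return 1.0 / (1.0 + math.exp(2.0 * a * v))

def mean_decision_time(a, v):
    return (a / v) * math.tanh(a * v)

def reward_rate(a, v, non_decision=0.3, inter_trial=1.0):
    """Correct responses per unit time: accuracy over total time per trial."""
    acc = 1.0 - error_rate(a, v)
    return acc / (mean_decision_time(a, v) + non_decision + inter_trial)

def optimal_threshold(v, grid=None):
    grid = grid or [i / 100.0 for i in range(1, 301)]
    return max(grid, key=lambda a: reward_rate(a, v))

if __name__ == "__main__":
    print("optimal threshold ~", optimal_threshold(v=1.0))
```

A threshold above the optimum is exactly the "excessive caution" bias the abstract describes: accuracy gains no longer compensate for the extra decision time.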
Automotive Marketing Methods and Practice
DOT National Transportation Integrated Search
1979-09-01
The report is a comprehensive examination of the current marketing practices, marketing methodologies, and decision-making processes utilized by the domestic automotive industry. The various marketing elements, such as products, consumer behavior, sa...
[Research applications in digital radiology. Big data and co].
Müller, H; Hanbury, A
2016-02-01
Medical imaging produces increasingly complex images (e.g. thinner slices and higher resolution) with more protocols, so that image reading has also become much more complex. More information needs to be processed, and the number of radiologists available for these tasks has usually not increased to the same extent. The objective of this article is to present current research results from projects on the use of image data for clinical decision support. An infrastructure that allows large volumes of data to be accessed is presented. In this way, the best performing tools can be identified without the medical data having to leave secure servers. The text presents the results of the EU-funded VISCERAL and Khresmoi projects, which allow the analysis of previous cases from institutional archives to support decision-making and process automation. The results also represent a secure evaluation environment for medical image analysis. This allows the use of data extracted from past cases to solve information needs occurring when diagnosing new cases. The presented research prototypes allow direct extraction of knowledge from the visual data of the images and its use for decision support or process automation. Real clinical use has not been tested, but several subjective user tests showed the effectiveness and efficiency of the process. The future of radiology will clearly depend on better use of the important knowledge in clinical image archives to automate processes and aid decision-making via big data analysis. This can help concentrate the work of radiologists on the most important parts of diagnostics.
Jeong, Jeong-Won; Shin, Dae C; Do, Synho; Marmarelis, Vasilis Z
2006-08-01
This paper presents a novel segmentation methodology for automated classification and differentiation of soft tissues using multiband data obtained with the newly developed system of high-resolution ultrasonic transmission tomography (HUTT) for imaging biological organs. This methodology extends and combines two existing approaches: the L-level set active contour (AC) segmentation approach and the agglomerative hierarchical kappa-means approach for unsupervised clustering (UC). To prevent the trapping of the current iterative minimization AC algorithm in a local minimum, we introduce a multiresolution approach that applies the level set functions at successively increasing resolutions of the image data. The resulting AC clusters are subsequently rearranged by the UC algorithm that seeks the optimal set of clusters yielding the minimum within-cluster distances in the feature space. The presented results from Monte Carlo simulations and experimental animal-tissue data demonstrate that the proposed methodology outperforms other existing methods without depending on heuristic parameters and provides a reliable means for soft tissue differentiation in HUTT images.
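The unsupervised-clustering stage described above, which rearranges active-contour clusters by agglomerative merging, can be illustrated with a simplified sketch. This is not the HUTT implementation: features are reduced to 1-D values and the merge criterion is plain nearest-centroid distance, merging until a target cluster count is reached.

```python
# Illustrative agglomerative merging of initial clusters (e.g. produced by an
# active-contour stage): repeatedly merge the pair with the closest centroids.
# 1-D features and a target cluster count keep the sketch short.

def centroid(cluster):
    return sum(cluster) / len(cluster)

def agglomerate(clusters, target):
    clusters = [list(c) for c in clusters]
    while len(clusters) > target:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = abs(centroid(clusters[i]) - centroid(clusters[j]))
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i].extend(clusters[j])   # merge j into i
        del clusters[j]
    return clusters

if __name__ == "__main__":
    initial = [[1.0, 1.2], [1.1], [5.0, 5.2], [9.0]]
    print(agglomerate(initial, target=3))
```

In the paper's setting the stopping rule is based on minimizing within-cluster distances in a multiband feature space rather than a fixed target count.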
NASA Technical Reports Server (NTRS)
Bard, J. F.
1986-01-01
The role that automation, robotics, and artificial intelligence will play in Space Station operations is now beginning to take shape. Although there is only limited data on the precise nature of the payoffs that these technologies are likely to afford, there is a general consensus that, at a minimum, the following benefits will be realized: increased responsiveness to innovation, lower operating costs, and reduction of exposure to hazards. Nevertheless, the question arises as to how much automation can be justified within the technical and economic constraints of the program. The purpose of this paper is to present a methodology which can be used to evaluate and rank different approaches to automating the functions and tasks planned for the Space Station. Special attention is given to the impact of advanced automation on human productivity. The methodology employed is based on the Analytic Hierarchy Process. This permits the introduction of individual judgements to resolve the conflict that normally arises when incomparable criteria underlie the selection process. Because of the large number of factors involved in the model, the overall problem is decomposed into four subproblems individually focusing on human productivity, economics, design, and operations, respectively. The results from each are then combined to yield the final rankings. To demonstrate the methodology, an example is developed based on the selection of an on-orbit assembly system. Five alternatives for performing this task are identified, ranging from an astronaut working in space, to a dexterous manipulator with sensory feedback. Computational results are presented along with their implications. A final parametric analysis shows that the outcome is locally insensitive to all but complete reversals in preference.
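The core Analytic Hierarchy Process step referenced above derives priority weights from a pairwise-comparison matrix as its principal eigenvector. The sketch below uses power iteration with sum-normalization; the three criteria and the Saaty-scale comparison values are illustrative, not the paper's actual model.

```python
# AHP priority weights via power iteration on a pairwise-comparison matrix.
# Comparison values below are illustrative (Saaty 1-9 scale).

def ahp_weights(matrix, iterations=50):
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]        # normalize so weights sum to 1
    return w

if __name__ == "__main__":
    # Hypothetical criteria: productivity vs economics vs safety.
    A = [[1.0,   3.0, 0.5],
         [1 / 3, 1.0, 0.25],
         [2.0,   4.0, 1.0]]
    print([round(x, 3) for x in ahp_weights(A)])
```

In a full AHP model each subproblem (productivity, economics, design, operations) would yield its own weight vector, and alternative rankings would be combined hierarchically.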
Developing mobile BIM/2D barcode-based automated facility management system.
Lin, Yu-Cheng; Su, Yu-Chih; Chen, Yen-Pei
2014-01-01
Facility management (FM) has become an important topic in research on the operation and maintenance phase. Managing the work of FM effectively is extremely difficult owing to the variety of environments. One of the difficulties is the performance of two-dimensional (2D) graphics when depicting facilities. Building information modeling (BIM) uses precise geometry and relevant data to support the facilities depicted in three-dimensional (3D) object-oriented computer-aided design (CAD). This paper proposes a new and practical methodology with application to FM that uses an integrated 2D barcode and the BIM approach. Using 2D barcode and BIM technologies, this study proposes a mobile automated BIM-based facility management (BIMFM) system for FM staff in the operation and maintenance phase. The mobile automated BIMFM system is then applied in a selected case study of a commercial building project in Taiwan to verify the proposed methodology and demonstrate its effectiveness in FM practice. The combined results demonstrate that a BIMFM-like system can be an effective mobile automated FM tool. The advantage of the mobile automated BIMFM system lies not only in improving FM work efficiency for the FM staff but also in facilitating FM updates and transfers in the BIM environment.
NASA Astrophysics Data System (ADS)
Gorman, J.; Voshell, M.; Sliva, A.
2016-09-01
The United States is highly dependent on space resources to support military, government, commercial, and research activities. Satellites operate at great distances, observation capacity is limited, and operator actions and observations can be significantly delayed. Safe operations require support systems that provide situational understanding, enhance decision making, and facilitate collaboration between human operators and system automation, both in-the-loop and on-the-loop. Joint cognitive systems engineering (JCSE) provides a rich set of methods for analyzing and informing the design of complex systems that include both human decision-makers and autonomous elements as coordinating teammates. While JCSE-based systems can enhance a system analyst's understanding of both existing and new system processes, JCSE activities typically occur outside of traditional systems engineering (SE) methods, providing sparse guidance about how systems should be implemented. In contrast, the Joint Directors of Laboratories (JDL) information fusion model and extensions, such as the Dual Node Network (DNN) technical architecture, provide the means to divide and conquer such engineering and implementation complexity, but are loosely coupled to specialized organizational contexts and needs. We previously described how Dual Node Decision Wheels (DNDW) extend the DNN to integrate JCSE analysis and design with the practicalities of system engineering and implementation using the DNN. Insights from Rasmussen's JCSE Decision Ladders align system implementation with organizational structures and processes. In the current work, we present a novel approach to assessing system performance based on patterns occurring in operational decisions that are documented by JCSE processes as traces in a decision ladder.
In this way, system assessment is closely tied not just to system design but also to the design of the joint cognitive system that includes human operators, decision-makers, information systems, and automated processes. Such operationally relevant and integrated testing provides a sound foundation for the operator trust in system automation that is required to safely operate satellite systems.
A Semi-Automated Point Cloud Processing Methodology for 3D Cultural Heritage Documentation
NASA Astrophysics Data System (ADS)
Kıvılcım, C. Ö.; Duran, Z.
2016-06-01
The preliminary phase in any architectural heritage project is to obtain metric measurements and documentation of the building and its individual elements. On the other hand, conventional measurement techniques require tremendous resources and lengthy project completion times for architectural surveys and 3D model production. Over the past two decades, the widespread use of laser scanning and digital photogrammetry has significantly altered the heritage documentation process. Furthermore, advances in these technologies have enabled robust data collection and reduced user workload for generating various levels of products, from single buildings to expansive cityscapes. More recently, the use of procedural modelling methods and BIM-relevant applications for historic building documentation purposes has become an active area of research; however, fully automated systems for cultural heritage documentation remain an open problem. In this paper, we present a semi-automated methodology for 3D façade modelling of cultural heritage assets, based on parametric and procedural modelling techniques and using airborne and terrestrial laser scanning data. We demonstrate the contribution of our methodology, implemented in an open source software environment, on the example project of a 16th century early classical era Ottoman structure, Sinan the Architect's Şehzade Mosque in Istanbul, Turkey.
A comprehensive methodology for intelligent systems life-cycle cost modelling
NASA Technical Reports Server (NTRS)
Korsmeyer, David J.; Lum, Henry, Jr.
1993-01-01
As NASA moves into the last part of the twentieth century, the desire to do 'business as usual' has been replaced with the mantra 'faster, cheaper, better'. Recently, new work has been done to show how the implementation of advanced technologies, such as intelligent systems, will impact the cost of a system design or the operational cost of a spacecraft mission. The impact of the degree of autonomous or intelligent systems and human participation on a given program is manifested most significantly during the program operational phases, while the decisions of who performs what tasks, and how much automation is incorporated into the system, are all made during the design and development phases. Employing intelligent systems and automation is not an either/or question, but one of degree. The question is what level of automation and autonomy will provide the optimal trade-off between performance and cost. Conventional costing methodologies, however, are unable to show the significance of technologies like these in terms of traceable cost benefits and reductions in the various phases of the spacecraft's lifecycle. The proposed comprehensive life-cycle methodology can address intelligent system technologies as well as others that impact human-machine operational modes.
A methodology for Manufacturing Execution Systems (MES) implementation
NASA Astrophysics Data System (ADS)
Govindaraju, Rajesri; Putra, Krisna
2016-02-01
A manufacturing execution system (MES) is an information systems (IS) application that bridges the gap between IS at the top level, namely enterprise resource planning (ERP), and IS at the lower levels, namely the automation systems. MES provides a medium for optimizing the manufacturing process as a whole on a real-time basis. By using MES in combination with ERP and other automation systems, a manufacturing company is expected to achieve high competitiveness. In implementing MES, functional integration, that is, making all the components of the manufacturing system work well together, is the most difficult challenge. For this, an industry standard specifies the sub-systems of a manufacturing execution system and defines the boundaries between ERP systems, MES, and other automation systems. The standard is known as ISA-95. Although the advantages of MES have been stated in some studies, little research has been done on how to implement MES effectively. The purpose of this study is to develop a methodology describing how an MES implementation project should be managed, utilising the ISA-95 reference model to support the system development process. A methodology was first developed based on a general IS development methodology, and then revised in light of the specific characteristics of an MES implementation project observed in an Indonesian steel manufacturing company case. The case study highlighted the importance of applying an effective requirement elicitation method during the initial system assessment, managing system interfaces and labor division in the design process, and performing a pilot deployment before putting the whole system into operation.
ERIC Educational Resources Information Center
Damboeck, Johanna
2012-01-01
Purpose: The aim of this article is to provide an analysis of the features that have shaped the state's decision-making process in the United Nations, with regard to the humanitarian intervention in Darfur from 2003 onwards. Design/methodology/approach: The methodological approach to the study is a review of political statement papers grounded in…
Integrating Ecosystem Services Into Health Impact Assessment
Health Impact Assessment (HIA) provides a methodology for incorporating considerations of public health into planning and decision-making processes. HIA promotes interdisciplinary action, stakeholder participation, and timeliness and takes into account equity, sustainability, and...
NASA Astrophysics Data System (ADS)
García-Santos, Glenda; Madruga de Brito, Mariana; Höllermann, Britta; Taft, Linda; Almoradie, Adrian; Evers, Mariele
2018-06-01
Understanding the interactions between water resources and their social dimensions is crucial for effective and sustainable water management. The identification of sensitive control variables and feedback loops of a specific human-hydro-scape can enhance knowledge about the potential factors and/or agents leading to the current situation of water resources and ecosystems, which in turn supports the decision-making process towards desirable futures. Our study presents the utility of a system dynamics modeling approach for water management and decision-making in the case of a forest ecosystem at risk of wildfires. We use the pluralistic water research concept to explore different scenarios and simulate the emergent behaviour of water interception and net precipitation after a wildfire in a forest ecosystem. Through a case study, we illustrate the applicability of this new methodology.
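The post-fire interception scenario described above can be sketched as a minimal stock-and-flow simulation: canopy cover is the stock, logistic regrowth is the flow, and interception scales with cover. All parameter values and the recovery rule below are illustrative assumptions, not values from the study:

```python
def simulate_post_fire(years=30, rainfall=1000.0, burn_fraction=0.7,
                       recovery_rate=0.15, max_interception=0.25):
    """Toy stock-and-flow model: canopy cover (the stock) regrows
    logistically after a fire; canopy interception scales with cover,
    and annual net precipitation (mm) is rainfall minus interception."""
    cover = 1.0 - burn_fraction                 # surviving canopy fraction
    series = []
    for _ in range(years):
        interception = max_interception * cover * rainfall
        series.append(rainfall - interception)  # net precipitation this year
        cover += recovery_rate * cover * (1.0 - cover)  # logistic regrowth
    return series

net = simulate_post_fire()
```

Running the sketch shows the emergent behaviour the abstract refers to: net precipitation is highest immediately after the fire and declines monotonically as the canopy recovers and intercepts more rainfall.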
Universal Verification Methodology Based Register Test Automation Flow.
Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu
2016-05-01
In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task, so an efficient way to perform verification with less effort in a shorter time is needed. In this work, we propose a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard mechanism, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models in a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For register model creation, many commercial tools can generate a register model from a register specification described in IP-XACT, but it is time-consuming to describe a register specification in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template that is translated to an IP-XACT description, from which register models can be generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use register models without detailed knowledge of UVM or SystemVerilog. This automation flow covers generating and connecting test-bench components (e.g., driver, checker, bus adaptor) and writing the test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
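The spreadsheet-to-IP-XACT translation step can be illustrated with a short sketch. The element names below follow the IP-XACT 1685-2009 (SPIRIT) namespace, but the row schema (`name`, `offset`, `size`, `access`) is a simplified assumption, not the authors' actual template:

```python
import xml.etree.ElementTree as ET

SPIRIT = "http://www.spiritconsortium.org/XMLSchema/SPIRIT/1685-2009"

def rows_to_ipxact(block_name, rows):
    """Translate spreadsheet-style register rows into a minimal
    addressBlock element using IP-XACT 1685-2009 element names
    (a simplified subset, not a schema-complete description)."""
    ET.register_namespace("spirit", SPIRIT)
    block = ET.Element(f"{{{SPIRIT}}}addressBlock")
    ET.SubElement(block, f"{{{SPIRIT}}}name").text = block_name
    for row in rows:
        reg = ET.SubElement(block, f"{{{SPIRIT}}}register")
        ET.SubElement(reg, f"{{{SPIRIT}}}name").text = row["name"]
        ET.SubElement(reg, f"{{{SPIRIT}}}addressOffset").text = row["offset"]
        ET.SubElement(reg, f"{{{SPIRIT}}}size").text = str(row["size"])
        ET.SubElement(reg, f"{{{SPIRIT}}}access").text = row["access"]
    return ET.tostring(block, encoding="unicode")

# Hypothetical spreadsheet rows for a small control/status block
rows = [
    {"name": "CTRL",   "offset": "0x00", "size": 32, "access": "read-write"},
    {"name": "STATUS", "offset": "0x04", "size": 32, "access": "read-only"},
]
xml_text = rows_to_ipxact("csr_block", rows)
```

A commercial register-model generator would then consume the resulting XML, which is the hand-off point the abstract describes.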
Pak, Richard; Fink, Nicole; Price, Margaux; Bass, Brock; Sturre, Lindsay
2012-01-01
This study examined the effect of deliberately anthropomorphic automation on younger and older adults' trust, dependence and performance on a diabetes decision-making task. Research with anthropomorphic interface agents has shown mixed effects on judgments of preference but has rarely examined effects on performance. Meanwhile, research in automation has shown that some forms of anthropomorphism (e.g. etiquette) affect trust in and dependence on automation. Participants answered diabetes questions with no aid, a non-anthropomorphic aid or an anthropomorphised aid. Trust and dependence in the aid were measured. A minimally anthropomorphic aid primarily affected younger adults' trust in the aid. Dependence, however, for both age groups was influenced by the anthropomorphic aid. Automation that deliberately embodies person-like characteristics can influence trust and dependence on reasonably reliable automation. However, further research is necessary to better understand the specific aspects of the aid that affect different age groups. Automation that embodies human-like characteristics may be useful in situations where reasonably reliable aids are under-utilised, by enhancing trust and dependence in those aids. Practitioner Summary: The design of decision-support aids on consumer devices (e.g. smartphones) may influence the level of trust that users place in that system and their amount of use. This study is the first step in articulating how the design of aids may influence users' trust and use of such systems.
Effect of descriptive information and experience on automation reliance.
Yuviler-Gavish, Nirit; Gopher, Daniel
2011-06-01
The present research addresses the issue of reliance on decision support systems for the long-term (DSSLT), which help users develop decision-making strategies and long-term planning. It is argued that providing information about a system's future performance in an experiential manner, as compared with a descriptive manner, encourages users to increase their reliance level. Establishing appropriate reliance on DSSLT is contingent on the system developer's ability to provide users with information about the system's future performance. A sequence of three studies contrasts the effect on automation reliance of providing descriptive information versus experience for DSSLT with two different positive expected values of recommendations. Study 1 demonstrated that when automation reliance was determined solely on the basis of description, it was relatively low, but it increased significantly when a decision was made after experience with 50 training simulations. Participants were able to learn to increase their automation reliance levels when they encountered the same type of recommendation again. Study 2 showed that the absence of preliminary descriptive information did not affect the automation reliance levels obtained after experience. Study 3 demonstrated that participants were able to generalize their learning about increasing reliance levels to new recommendations. Using experience rather than description to give users information about future performance in DSSLT can help increase automation reliance levels. Implications for designing DSSLT and decision support systems in general are discussed.
A comparison of automated crater detection methods
NASA Astrophysics Data System (ADS)
Bandeira, L.; Barreira, C.; Pina, P.; Saraiva, J.
2008-09-01
This work presents early results of a comparison between some common methodologies for automated crater detection. The three procedures considered were applied to images of the surface of Mars, thus illustrating some pros and cons of their use. We aim to establish the clear advantages in using this type of methods in the study of planetary surfaces.
Hasan, Haroon; Muhammed, Taaha; Yu, Jennifer; Taguchi, Kelsi; Samargandi, Osama A; Howard, A Fuchsia; Lo, Andrea C; Olson, Robert; Goddard, Karen
2017-10-01
The objective of our study was to evaluate the methodological quality of systematic reviews and meta-analyses in Radiation Oncology. A systematic literature search was conducted for all eligible systematic reviews and meta-analyses in Radiation Oncology from 1966 to 2015. Methodological characteristics were abstracted from all works that satisfied the inclusion criteria and quality was assessed using the critical appraisal tool, AMSTAR. Regression analyses were performed to determine factors associated with a higher score of quality. Following exclusion based on a priori criteria, 410 studies (157 systematic reviews and 253 meta-analyses) satisfied the inclusion criteria. Meta-analyses were found to be of fair to good quality while systematic reviews were found to be of less than fair quality. Factors associated with higher scores of quality in the multivariable analysis were including primary studies consisting of randomized control trials, performing a meta-analysis, and applying a recommended guideline related to establishing a systematic review protocol and/or reporting. Systematic reviews and meta-analyses may introduce a high risk of bias if applied to inform decision-making based on AMSTAR. We recommend that decision-makers in Radiation Oncology scrutinize the methodological quality of systematic reviews and meta-analyses prior to assessing their utility to inform evidence-based medicine and researchers adhere to methodological standards outlined in validated guidelines when embarking on a systematic review. Copyright © 2017 Elsevier Ltd. All rights reserved.
Automated Historical and Real-Time Cyclone Discovery With Multimodal Remote Satellite Measurements
NASA Astrophysics Data System (ADS)
Ho, S.; Talukder, A.; Liu, T.; Tang, W.; Bingham, A.
2008-12-01
Existing cyclone detection and tracking solutions involve extensive manual analysis of modeled data and field campaign data by teams of experts. We have developed a novel automated global cyclone detection and tracking system by assimilating and sharing information from multiple remote satellites. This unprecedented solution of combining multiple remote satellite measurements in an autonomous manner allows us to leverage the strengths of each individual satellite. Use of multiple satellite data sources also results in significantly improved temporal tracking accuracy for cyclones. Our solution involves an automated feature extraction and machine learning technique based on an ensemble classifier and Kalman filter for cyclone detection and tracking from multiple heterogeneous satellite data sources. Our feature-based methodology that focuses on automated cyclone discovery is fundamentally different from, and actually complements, the well-known Dvorak technique for cyclone intensity estimation (which often relies on manual detection of cyclonic regions) from field and remote data. Our solution currently employs the QuikSCAT wind measurement and the merged level 3 TRMM precipitation data for automated cyclone discovery. Assimilation of other types of remote measurements is ongoing and planned for the near future. Experimental results of our automated solution on historical cyclone datasets demonstrate the superior performance of our automated approach compared to previous work. In our initial analysis, the performance of our detection solution compares favorably against the list of cyclones occurring in the North Atlantic Ocean for the 2005 calendar year reported by the National Hurricane Center (NHC). We have also demonstrated the robustness of our cyclone tracking methodology in other regions of the world by using multiple heterogeneous satellite data for detection and tracking of three arbitrary historical cyclones in other regions.
Our cyclone detection and tracking methodology can be applied to (i) historical data, to support Earth scientists in climate modeling and the study of cyclone-climate interactions and to obtain a better understanding of the causes and effects of cyclones (e.g. cyclogenesis), and (ii) automatic cyclone discovery in near real-time using streaming satellite data, to support and improve the planning of global cyclone field campaigns. Additional satellite data from GOES and other orbiting satellites can be easily assimilated and integrated into our automated cyclone detection and tracking module to improve the temporal tracking accuracy of cyclones down to ½ hr and reduce the incidence of false alarms.
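The tracking component of such a pipeline can be sketched as a constant-velocity Kalman filter applied per coordinate of the storm centre between satellite fixes. The noise settings and the scalar-algebra implementation below are illustrative assumptions, not the authors' system:

```python
class CV1DKalman:
    """1-D constant-velocity Kalman filter tracking one coordinate
    (e.g. the latitude of a storm centre) from noisy position fixes."""

    def __init__(self, x0, q=1e-3, r=0.25):
        self.x = [x0, 0.0]                      # state: [position, velocity]
        self.P = [[1.0, 0.0], [0.0, 1.0]]       # state covariance
        self.q, self.r = q, r                   # process / measurement noise

    def step(self, z, dt=1.0):
        # Predict: x <- F x, P <- F P F' + qI, with F = [[1, dt], [0, 1]]
        x = [self.x[0] + dt * self.x[1], self.x[1]]
        P = self.P
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + self.q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + self.q]]
        # Update with position measurement z, observation H = [1, 0]
        s = P[0][0] + self.r                    # innovation variance
        k = [P[0][0] / s, P[1][0] / s]          # Kalman gain
        y = z - x[0]                            # innovation
        self.x = [x[0] + k[0] * y, x[1] + k[1] * y]
        self.P = [[(1 - k[0]) * P[0][0], (1 - k[0]) * P[0][1]],
                  [P[1][0] - k[1] * P[0][0], P[1][1] - k[1] * P[0][1]]]
        return self.x[0]

# Track a toy storm drifting at 0.5 degrees per time step
kf = CV1DKalman(0.0)
est = 0.0
for i in range(1, 31):
    est = kf.step(0.5 * i)
```

In a multi-satellite setting, each new QuikSCAT or TRMM fix would simply be fed to `step` at its own timestamp, which is how heterogeneous sources improve temporal tracking accuracy.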
Spillover Effects of Loss of Control on Risky Decision-Making
Beisswingert, Birgit M.; Zhang, Keshun; Goetz, Thomas; Fischbacher, Urs
2016-01-01
Decision making in risky situations is frequently required in our everyday lives and has been shown to be influenced by various factors, some of which are independent of the risk context. Based on previous findings and theories about the central role of perceptions of control and their impact on subsequent settings, spillover effects of subjective loss of control on risky decision-making are assumed. After developing an innovative experimental paradigm for inducing loss of control, its hypothesized effects on risky decision-making are investigated. Partially supporting the hypotheses, results demonstrated no increased levels of risk perceptions but decreased risk-taking behavior following experiences of loss of control. Thus, this study makes a methodological contribution by proposing a newly developed experimental paradigm facilitating further research on the effects of subjective loss of control, and additionally provides partial evidence for the spillover effects of loss of control experiences on risky decision-making. PMID:26930066
A method for studying decision-making by guideline development groups.
Gardner, Benjamin; Davidson, Rosemary; McAteer, John; Michie, Susan
2009-08-05
Multidisciplinary guideline development groups (GDGs) have considerable influence on UK healthcare policy and practice, but previous research suggests that research evidence is a variable influence on GDG recommendations. The Evidence into Recommendations (EiR) study has been set up to document social-psychological influences on GDG decision-making. In this paper we aim to evaluate the relevance of existing qualitative methodologies to the EiR study, and to develop a method best-suited to capturing influences on GDG decision-making. A research team comprised of three postdoctoral research fellows and a multidisciplinary steering group assessed the utility of extant qualitative methodologies for coding verbatim GDG meeting transcripts and semi-structured interviews with GDG members. A unique configuration of techniques was developed to permit data reduction and analysis. Our method incorporates techniques from thematic analysis, grounded theory analysis, content analysis, and framework analysis. Thematic analysis of individual interviews conducted with group members at the start and end of the GDG process defines discrete problem areas to guide data extraction from GDG meeting transcripts. Data excerpts are coded both inductively and deductively, using concepts taken from theories of decision-making, social influence and group processes. These codes inform a framework analysis to describe and explain incidents within GDG meetings. We illustrate the application of the method by discussing some preliminary findings of a study of a National Institute for Health and Clinical Excellence (NICE) acute physical health GDG. This method is currently being applied to study the meetings of three of NICE GDGs. These cover topics in acute physical health, mental health and public health, and comprise a total of 45 full-day meetings. The method offers potential for application to other health care and decision-making groups.
NASA Astrophysics Data System (ADS)
Vazquez Rascon, Maria de Lourdes
This thesis focuses on the implementation of a participatory and transparent decision-making tool for wind farm projects. The tool is based on an argumentative framework that reflects the value systems of the stakeholders involved in these projects, and it employs two multicriteria methods: multicriteria decision aid (MCDA) and participatory geographical information systems (GIS), making it possible to represent these value systems by criteria and indicators to be evaluated. The stakeholders' value systems allow the inclusion of environmental, economic and socio-cultural aspects of wind energy projects and, thus, a vision of sustainable wind project development. This vision is analyzed using the 16 sustainability principles included in Quebec's Sustainable Development Act. Four specific objectives were set to ensure a logical progression of the work and the development of a successful tool: designing a methodology to couple MCDA and participatory GIS, testing the developed methodology in a case study, performing a robustness analysis to address strategic issues, and analyzing the strengths, weaknesses, opportunities and threats of the developed methodology. Achieving the first objective produced a decision-making tool called Territorial Intelligence Modeling for Energy Development (the TIMED approach). The TIMED approach is visually represented by a figure expressing the idea of a co-constructed decision, with all stakeholders at the focus of the methodology. TIMED is composed of four modules: multicriteria decision analysis, participatory geographic information systems, active involvement of the stakeholders, and scientific knowledge/local knowledge. The integration of these four modules allows for the analysis of different implementation scenarios of wind turbines in order to choose the best one based on a participatory and transparent decision-making process that takes into account stakeholders' concerns.
The second objective enabled the testing of TIMED in an ex-post study of a wind farm in operation since 2006. In this test, 11 people participated, representing four stakeholder categories: the private sector, the public sector, experts and civil society. This test allowed us to analyze the context in which wind projects are currently developed in Quebec. The concerns of some stakeholders regarding situations not considered in the current context were explored through a third objective, which allowed us to run simulations taking into account assumptions at the strategic level. Examples of strategic-level issues are the communication tools used to approach the host community and the type of park ownership. Finally, the fourth objective, a SWOT analysis with the participation of eight experts, allowed us to verify the extent to which the TIMED approach succeeded in constructing four spaces for participatory decision-making: physical, intellectual, emotional and procedural. From this analysis, 116 strengths, 28 weaknesses, 32 constraints and 54 opportunities were identified.
Contributions, applications, limitations and extensions of this research include: providing a participatory decision-making methodology that takes into account socio-cultural, environmental and economic variables; holding reflection sessions on a wind farm in operation; the MCDA knowledge acquired by participants involved in testing the proposed methodology; taking into account the physical, intellectual, emotional and procedural spaces needed to articulate a participatory decision; using the proposed methodology with renewable energy sources other than wind; the need for an interdisciplinary team to apply the methodology; access to quality data; access to information technologies; the right to public participation; the neutrality of experts; the relationships between experts and non-experts; cultural constraints; improvement of the designed indicators; the implementation of a Web platform for participatory decision-making; and the writing of a manual on the use of the developed methodology. Keywords: wind farm, multicriteria decision, geographic information systems, TIMED approach, sustainable wind energy projects development, renewable energy, social participation, robustness concern, SWOT analysis.
Improta, Giovanni; Russo, Mario Alessandro; Triassi, Maria; Converso, Giuseppe; Murino, Teresa; Santillo, Liberatina Carmela
2018-05-01
Health technology assessments (HTAs) are often difficult to conduct because the procedures of the HTA algorithm are complex and not easy to apply. Thus, their use is not always convenient or possible for assessing technical requests that require a multidisciplinary approach. This paper aims to address this issue through a multi-criteria analysis focusing on the analytic hierarchy process (AHP). This methodology allows the decision-maker to analyse and evaluate different alternatives and monitor their impact on different actors during the decision-making process. The multi-criteria analysis is, moreover, implemented through a simulation model to overcome the limitations of the AHP methodology. Simulations help decision-makers to make an appropriate decision and avoid unnecessary and costly attempts. Finally, a decision problem regarding the evaluation of two health technologies, namely two biological prostheses for incisional infected hernias, is analysed to assess the effectiveness of the model. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
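The AHP step at the core of such an analysis can be sketched in a few lines: a pairwise-comparison matrix of criteria is reduced to a priority vector (here via the geometric-mean approximation) and checked with Saaty's consistency ratio. The criteria and judgment values below are hypothetical:

```python
import math

def ahp_priorities(matrix):
    """Approximate AHP priority vector via the geometric-mean method."""
    n = len(matrix)
    geo = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]

def consistency_ratio(matrix, weights):
    """Saaty's consistency ratio; CR < 0.1 is conventionally acceptable."""
    n = len(matrix)
    # lambda_max approximated from the weighted row sums
    lam = sum(
        sum(matrix[i][j] * weights[j] for j in range(n)) / weights[i]
        for i in range(n)
    ) / n
    ci = (lam - n) / (n - 1)
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # random indices
    return ci / ri if ri else 0.0

# Hypothetical pairwise comparisons of three criteria (cost, safety, efficacy)
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 1 / 2.0, 1.0],
]
w = ahp_priorities(pairwise)
```

Each alternative (e.g. each prosthesis) would then be scored against every criterion the same way, and the weighted scores aggregated into a final ranking.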
Luz, Maria; Manzey, Dietrich; Modemann, Susanne; Strauss, Gero
2015-01-01
Image-guided navigation (IGN) systems provide automation support of intra-operative information analysis and decision-making for surgeons. Previous research showed that navigated-control (NC) systems which represent high levels of decision-support and directly intervene in surgeons' workflow provide benefits with respect to patient safety and surgeons' physiological stress but also involve several cost effects (e.g. prolonged surgery duration, reduced secondary-task performance). It was hypothesised that less automated distance-control (DC) systems would provide a better solution in terms of human performance consequences. N = 18 surgeons performed a simulated mastoidectomy with NC, DC and without IGN assistance. Effects on surgical performance, physiological effort, workload and situation awareness (SA) were compared. As expected, DC technology had the same benefits as the NC system but also led to less unwanted side effects on surgery duration, subjective workload and SA. This suggests that IGN systems just providing information analysis support are overall more beneficial than higher automated decision-support. This study investigates human performance consequences of different concepts of IGN support for surgeons. Less automated DC systems turned out to provide advantages for patient safety and surgeons' stress similar to higher automated NC systems with, at the same time, reduced negative consequences on surgery time and subjective workload.
Automated space vehicle control for rendezvous proximity operations
NASA Technical Reports Server (NTRS)
Lea, Robert N.
1988-01-01
Rendezvous during unmanned space exploration missions, such as a Mars Rover/Sample Return mission, will require a completely automatic system from liftoff to docking. A conceptual design of an automated rendezvous, proximity operations, and docking system is being implemented and validated at the Johnson Space Center (JSC). The emphasis is on the development and testing, currently under way at JSC, of a prototype system for control of the rendezvous vehicle during proximity operations. Fuzzy sets are used to model the human capability of common-sense reasoning in decision-making tasks, and such models are integrated with expert systems and engineering control system technology to create a system that performs comparably to a manned system.
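The fuzzy decision-making idea can be illustrated with a toy Mamdani-style rule base computing a braking command during proximity operations. The membership functions, rules, and thresholds below are invented for illustration and are not the JSC prototype's rule base:

```python
def ramp_down(x, a, b):
    """Membership 1.0 below a, 0.0 above b, linear in between."""
    if x <= a:
        return 1.0
    if x >= b:
        return 0.0
    return (b - x) / (b - a)

def ramp_up(x, a, b):
    return 1.0 - ramp_down(x, a, b)

def fuzzy_brake_command(range_m, closing_rate):
    """Braking level in [0, 1] from range (m) and closing rate (m/s).

    Illustrative rules:
      IF range CLOSE and rate FAST -> braking HIGH   (1.0)
      IF range CLOSE and rate SLOW -> braking MEDIUM (0.5)
      IF range FAR                 -> braking LOW    (0.0)
    """
    close = ramp_down(range_m, 50.0, 200.0)
    far = ramp_up(range_m, 50.0, 200.0)
    fast = ramp_up(closing_rate, 0.5, 2.0)
    slow = ramp_down(closing_rate, 0.5, 2.0)

    high = min(close, fast)   # rule strength = min of antecedent memberships
    med = min(close, slow)
    low = far
    denom = high + med + low
    if denom == 0.0:
        return 0.0
    # Weighted-average (singleton) defuzzification
    return (high * 1.0 + med * 0.5 + low * 0.0) / denom
```

The appeal of this formulation, as the abstract notes, is that the rules read like a pilot's own common-sense guidance ("if close and closing fast, brake hard") while still producing a smooth, continuous command.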
A methodology for producing reliable software, volume 1
NASA Technical Reports Server (NTRS)
Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.
1976-01-01
An investigation into the areas having an impact on producing reliable software including automated verification tools, software modeling, testing techniques, structured programming, and management techniques is presented. This final report contains the results of this investigation, analysis of each technique, and the definition of a methodology for producing reliable software.
Visual Display Principles for C3I System Tasks
1993-06-01
early in the design process is now explicitly recognized in military R&D policy, as evidenced by the Navy's HARDMAN and the Army's MANPRINT programs...information): required sampling rate for each battlefield area, target type, and sensor type, etc.? - Change detection aids - Where is the enemy...increasing load and sophistication for operators and decision-makers - Automated measurement and scoring (% hits, miss distances, attrition rates, etc.
Cai, Wenli; Lee, June-Goo; Fikry, Karim; Yoshida, Hiroyuki; Novelline, Robert; de Moya, Marc
2012-07-01
It is commonly believed that the size of a pneumothorax is an important determinant of treatment decisions, in particular regarding whether chest tube drainage (CTD) is required. However, volumetric quantification of pneumothoraces has not routinely been performed in clinics. In this paper, we introduce an automated computer-aided volumetry (CAV) scheme for quantification of the volume of pneumothoraces in chest multi-detector CT (MDCT) images. Moreover, we investigate the impact of accurate pneumothorax volume on the performance of decision-making regarding CTD in the management of traumatic pneumothoraces. For this purpose, an occurrence frequency map was calculated for quantitative analysis of the importance of each clinical parameter in the decision-making regarding CTD, by a computer simulation of decision-making using a genetic algorithm (GA) and a support vector machine (SVM). A total of 14 clinical parameters, including the volume of pneumothorax calculated by our CAV scheme, were collected as the parameters available for decision-making. The results showed that volume was the dominant parameter in decision-making regarding CTD, with an occurrence frequency value of 1.00. The results also indicated that the inclusion of volume provided the best performance, statistically significantly better than the other tests in which volume was excluded from the clinical parameters. This study provides scientific evidence for the application of a CAV scheme for MDCT volumetric quantification of pneumothoraces in the management of clinically stable chest trauma patients with traumatic pneumothorax. Copyright © 2012 Elsevier Ltd. All rights reserved.
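The occurrence-frequency analysis can be sketched with a tiny genetic algorithm over binary feature masks. Here a hand-made linear scorer stands in for the SVM cross-validation fitness used in the study, and the feature names and weights are hypothetical:

```python
import random

def occurrence_frequency(features, score, generations=40, pop=20, seed=1):
    """Evolve binary feature masks with a tiny genetic algorithm and
    report how often each feature appears in the final population."""
    rng = random.Random(seed)
    n = len(features)
    popn = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(generations):
        popn.sort(key=score, reverse=True)
        survivors = popn[: pop // 2]            # truncation selection
        children = []
        for _ in range(pop - len(survivors)):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]           # one-point crossover
            if rng.random() < 0.1:              # occasional bit-flip mutation
                j = rng.randrange(n)
                child[j] ^= 1
            children.append(child)
        popn = survivors + children
    return {f: sum(m[i] for m in popn) / len(popn)
            for i, f in enumerate(features)}

# Hypothetical clinical parameters and a hand-made linear fitness that
# stands in for the SVM cross-validation score used in the study
WEIGHTS = {"volume": 1.0, "age": 0.02, "rib_fracture": 0.2, "o2_sat": 0.03}
FEATURES = list(WEIGHTS)

def toy_score(mask):
    # Reward informative features, charge a small cost per included feature
    return sum(w * m for w, m in zip(WEIGHTS.values(), mask)) - 0.05 * sum(mask)

freq = occurrence_frequency(FEATURES, toy_score)
```

A genuinely informative feature (here `volume`) survives selection in nearly every mask, reproducing the kind of occurrence-frequency value near 1.00 the abstract reports.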
An autonomous satellite architecture integrating deliberative reasoning and behavioural intelligence
NASA Technical Reports Server (NTRS)
Lindley, Craig A.
1993-01-01
This paper describes a method for the design of autonomous spacecraft, based upon behavioral approaches to intelligent robotics. First, a number of previous spacecraft automation projects are reviewed. A methodology for the design of autonomous spacecraft is then presented, drawing upon both the European Space Agency technological center (ESTEC) automation and robotics methodology and the subsumption architecture for autonomous robots. A layered competency model for autonomous orbital spacecraft is proposed. A simple example of low level competencies and their interaction is presented in order to illustrate the methodology. Finally, the general principles adopted for the control hardware design of the AUSTRALIS-1 spacecraft are described. This system will provide an orbital experimental platform for spacecraft autonomy studies, supporting the exploration of different logical control models, different computational metaphors within the behavioral control framework, and different mappings from the logical control model to its physical implementation.
GT-CATS: Tracking Operator Activities in Complex Systems
NASA Technical Reports Server (NTRS)
Callantine, Todd J.; Mitchell, Christine M.; Palmer, Everett A.
1999-01-01
Human operators of complex dynamic systems can experience difficulties supervising advanced control automation. One remedy is to develop intelligent aiding systems that can provide operators with context-sensitive advice and reminders. The research reported herein proposes, implements, and evaluates a methodology for activity tracking, a form of intent inferencing that can supply the knowledge required for an intelligent aid by constructing and maintaining a representation of operator activities in real time. The methodology was implemented in the Georgia Tech Crew Activity Tracking System (GT-CATS), which predicts and interprets the actions performed by Boeing 757/767 pilots navigating using autopilot flight modes. This report first describes research on intent inferencing and complex modes of automation. It then provides a detailed description of the GT-CATS methodology, knowledge structures, and processing scheme. The results of an experimental evaluation using airline pilots are given. The results show that GT-CATS was effective in predicting and interpreting pilot actions in real time.
A Decision Support Methodology for Space Technology Advocacy.
1984-12-01
determine their parameters. Program control is usually exercised by level-of-effort funding. 63xx is the designator for advanced development programs... designing systems or models that successfully aid the decision-maker. One remedy for this deficiency in the techniques is to increase the...methodology for use by the Air Force Space Technology Advocate is designed to provide the following features [11:146-147]: meaningful reduction of available
Automatic Feature Selection and Improved Classification in SICADA Counterfeit Electronics Detection
2017-03-20
The SICADA methodology was developed to detect such counterfeit microelectronics by collecting power side channel data and applying machine learning... to identify counterfeits. This methodology has been extended to include a two-step automated feature selection process and now uses a one-class SVM... classifier. We describe this methodology and show results for empirical data collected from several types of Microchip dsPIC33F microcontrollers
Neuroeconomics: cross-currents in research on decision-making.
Sanfey, Alan G; Loewenstein, George; McClure, Samuel M; Cohen, Jonathan D
2006-03-01
Despite substantial advances, the question of how we make decisions and judgments continues to pose important challenges for scientific research. Historically, different disciplines have approached this problem using different techniques and assumptions, with few unifying efforts made. However, the field of neuroeconomics has recently emerged as an inter-disciplinary effort to bridge this gap. Research in neuroscience and psychology has begun to investigate neural bases of decision predictability and value, central parameters in the economic theory of expected utility. Economics, in turn, is being increasingly influenced by a multiple-systems approach to decision-making, a perspective strongly rooted in psychology and neuroscience. The integration of these disparate theoretical approaches and methodologies offers exciting potential for the construction of more accurate models of decision-making.
Menychtas, Andreas; Tsanakas, Panayiotis; Maglogiannis, Ilias
2016-03-01
The proper acquisition of biosignals data from various biosensor devices and their remote accessibility are still issues that prevent the wide adoption of point-of-care systems in the routine monitoring of chronic patients. This Letter presents an advanced framework for patient monitoring that utilises a cloud computing infrastructure for data management and analysis. The framework also introduces a local mechanism for uniform biosignal collection from wearables and biosignal sensors, together with decision-support modules, in order to enable prompt and essential decisions. A prototype smartphone application and the related cloud modules have been implemented to demonstrate the value of the proposed framework. Initial results regarding the performance of the system and its effectiveness in data management and decision-making have been quite encouraging.
Definition and Demonstration of a Methodology for Validating Aircraft Trajectory Predictors
NASA Technical Reports Server (NTRS)
Vivona, Robert A.; Paglione, Mike M.; Cate, Karen T.; Enea, Gabriele
2010-01-01
This paper presents a new methodology for validating an aircraft trajectory predictor, inspired by the lessons learned from a number of field trials, flight tests and simulation experiments for the development of trajectory-predictor-based automation. The methodology introduces new techniques and a new multi-staged approach to reduce the effort in identifying and resolving validation failures, avoiding the potentially large costs associated with failures during a single-stage, pass/fail approach. As a case study, the validation effort performed by the Federal Aviation Administration for its En Route Automation Modernization (ERAM) system is analyzed to illustrate the real-world applicability of this methodology. During this validation effort, ERAM initially failed to achieve six of its eight requirements associated with trajectory prediction and conflict probe. The ERAM validation issues have since been addressed, but to illustrate how the methodology could have benefited the FAA effort, additional techniques are presented that could have been used to resolve some of these issues. Using data from the ERAM validation effort, it is demonstrated that these new techniques could have identified trajectory prediction error sources that contributed to several of the unmet ERAM requirements.
ERIC Educational Resources Information Center
Pirnay-Dummer, Pablo; Ifenthaler, Dirk
2011-01-01
Our study integrates automated natural language-oriented assessment and analysis methodologies into feasible reading comprehension tasks. With the newly developed T-MITOCAR toolset, prose text can be automatically converted into an association net which has similarities to a concept map. The "text to graph" feature of the software is based on…
Methodology Investigation of AI (Artificial Intelligence) Test Officer Support Tool. Volume 1
1989-03-01
Keywords: Artificial Intelligence; Expert Systems; Automated Aids to Testing. This report covers the application of Artificial Intelligence techniques to the problem of creating automated tools to...
ERIC Educational Resources Information Center
Pelin, Nicolae; Mironov, Vladimir
2008-01-01
In this article the problems of functioning algorithms development for system of the automated analysis of educational process rhythm in a higher educational institution are considered. Using the device of experiment planning for conducting the scientific researches, adapted methodologies, received by authors in the dissertational works at the…
Automated Corrosion Detection Program
2001-10-01
More detailed explanations of the methodology development can be found in Hidden Corrosion Detection Technology Assessment, a paper presented at... Detection Program, a paper presented at the Fourth Joint DoD/FAA/NASA Conference on Aging Aircraft, 2000. AS&M PULSE. The PULSE system, developed... selection can be found in The Evaluation of Hidden Corrosion Detection Technologies on the Automated Corrosion Detection Program, a paper presented
Application of Human-Autonomy Teaming (HAT) Patterns to Reduce Crew Operations (RCO)
NASA Technical Reports Server (NTRS)
Shively, R. Jay; Brandt, Summer L.; Lachter, Joel; Matessa, Mike; Sadler, Garrett; Battiste, Henri
2011-01-01
Unmanned aerial systems, advanced cockpits, and air traffic management are all seeing dramatic increases in automation. However, while automation may take on some tasks previously performed by humans, humans will still be required to remain in the system for the foreseeable future. The collaboration between humans and these increasingly autonomous systems will begin to resemble cooperation between teammates, rather than simple task allocation. It is critical to understand this human-autonomy teaming (HAT) to optimize these systems in the future. One methodology to understand HAT is by identifying recurring patterns of HAT that have similar characteristics and solutions. This paper applies a methodology for identifying HAT patterns to an advanced cockpit project.
Automated software development workstation
NASA Technical Reports Server (NTRS)
1986-01-01
Engineering software development was automated using an expert system (rule-based) approach. The use of this technology offers benefits not available from current software development and maintenance methodologies. A workstation was built with a library, or program database, with methods for browsing the stored designs; a system for graphical specification of designs, including a capability for hierarchical refinement and definition in a graphical design system; and an automated code-generation capability in FORTRAN. The workstation was then used in a demonstration with examples from an attitude control subsystem design for the space station. Documentation and recommendations are presented.
NASA Astrophysics Data System (ADS)
Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.
Today's software for aerospace systems is typically very complex, due to the increasing number of features as well as the high demands for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, achieve full coverage. Nevertheless, despite the obvious advantages, this technique is as yet rarely used in the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive workflow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements of our methodology. SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy-equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.
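The contrast drawn above between testing and model checking (exhaustive examination of the whole state space) can be illustrated with a toy reachability check. The two-mode transition system below is invented purely for illustration; industrial tools such as those behind SCADE operate on far richer models.

```python
# Toy illustration of the idea behind model checking: exhaustively explore
# every reachable state of a finite transition system and check a safety
# property in each one, giving full state coverage (unlike testing, which
# samples individual paths). The "mode/counter" system is invented.
from collections import deque

def transitions(state):
    mode, count = state
    if mode == "idle":
        yield ("active", count)
    elif mode == "active":
        if count < 3:
            yield ("active", count + 1)   # keep working
        yield ("idle", 0)                 # reset

def check_safety(initial, prop):
    """Breadth-first exploration of all reachable states; returns a
    counterexample state if `prop` is violated, else None."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if not prop(state):
            return state
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None

# Safety property: the counter never exceeds 3.
violation = check_safety(("idle", 0), lambda s: s[1] <= 3)
print("property verified over all reachable states" if violation is None
      else f"counterexample: {violation}")
```

Because every reachable state is visited, a `None` result is a proof over the model, not merely an absence of observed failures; a violated property yields a concrete counterexample state.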
Contaminated site cleanups involving complex activities may benefit from a detailed environmental footprint analysis to inform decision-making about application of suitable best management practices for greener cleanups.
Tamaki, Edson Mamoru; Tanaka, Oswaldo Yoshimi; Felisberto, Eronildo; Alves, Cinthia Kalyne de Almeida; Drumond Junior, Marcos; Bezerra, Luciana Caroline de Albuquerque; Calvo, Maria Cristina Marino; Miranda, Alcides Silva de
2012-04-01
This study sought to develop a methodology for the construction of a Panel for the Monitoring and Evaluation of Management of the Unified Health System (SUS). The participative process used, together with the systematization conducted, made it possible to identify an effective strategy for building management tools in partnership with researchers, academic institutions and managers of the SUS. The final systematization of the Panel selected indicators for the management of the SUS in terms of Demand, Inputs, Processes, Outputs and Outcomes, in order to provide a simple, versatile and useful tool for evaluation at any level of management, and more transparent and easier communication with all stakeholders in decision-making. Taking the management of the SUS as the scope of these processes and practices, in all their normative aspects, enabled dialog between systemic theories and those which consider the centrality of the social actor in the decision-making process.
Inter-Philosophies Dialogue: Creating a Paradigm for Global Health Ethics.
Benatar, Solomon; Daibes, Ibrahim; Tomsons, Sandra
While debate remains about the definition and goals of work on global health, there is growing agreement that our moral starting point is the reality of unjust inequalities in the distribution of the conditions necessary for human health and well-being. With the growth of multi-jurisdictional and multicultural global health partnerships, the adequacy of the prevailing bioethical paradigm guiding the conduct of global health research and practice is being increasingly challenged. In response to ethical challenges and conflicts confronted by decision-makers in global health research and practice, we propose an innovative methodology that could be developed to bridge the gap between polarized systems of ideas and values (metaphysical, epistemological, moral, and political). Our inter-philosophies methodology provides the potential to construct a new, shared paradigm for global health ethics, thereby increasing the capacity for solidarity and shared decision-making in global health research and practice.
[Clinical practice guidelines in Peru: evaluation of its quality using the AGREE II instrument].
Canelo-Aybar, Carlos; Balbin, Graciela; Perez-Gomez, Ángela; Florez, Iván D
2016-01-01
To evaluate the methodological quality of clinical practice guidelines (CPGs) put into practice by the Peruvian Ministry of Health (MINSA), 17 CPGs from the ministry, published between 2009 and 2014, were independently evaluated by three methodological experts using the AGREE II instrument. The scores of the AGREE II domains were low and very low across all CPGs: scope and purpose (median, 44%), clarity of presentation (median, 47%), participation of decision-makers (median, 8%), methodological rigor (median, 5%), applicability (median, 5%), and editorial independence (median, 8%). In conclusion, the methodological quality of the CPGs implemented by the MINSA is low, and consequently their use cannot be recommended. The implementation of the methodology for the development of CPGs described in the recently published CPG methodological preparation manual in Peru is a pressing need.
Architecture Views Illustrating the Service Automation Aspect of SOA
NASA Astrophysics Data System (ADS)
Gu, Qing; Cuadrado, Félix; Lago, Patricia; Dueñas, Juan C.
Earlier in this book, Chapter 8 provided a detailed analysis of service engineering, including a review of service engineering techniques and methodologies. This chapter is closely related to Chapter 8, as it shows how such approaches can be used to develop a service, with particular emphasis on the identification of three views (the automation decision view, the degree of service automation view and the service automation related data view) that structure and ease the elicitation and documentation of stakeholders' concerns. This is carried out through two large case studies that capture the industrial needs in illustrating service deployment and configuration automation. This set of views adds to more traditional notations, such as UML, the visual power of drawing their users' attention to the concerns addressed, assisting them in their work. This is especially crucial in service-oriented architecting, where service automation is in high demand.
Evaluating Management Strategies for Automated Test Systems/Equipment (ATS/E): An F-15 Case Study
2005-03-01
ethnography, grounded theory, case study, phenomenological research, and narrative research (also known as biography from... Creswell, 2003:183). Example inquiry strategies identified by Creswell are: narrative, phenomenology, ethnography, case study, and grounded theory... other managed systems. Methodology: The researcher chose a qualitative research methodology and
Clos, Lawrence J; Jofre, M Fransisca; Ellinger, James J; Westler, William M; Markley, John L
2013-06-01
To facilitate the high-throughput acquisition of nuclear magnetic resonance (NMR) experimental data on large sets of samples, we have developed a simple and straightforward automated methodology that capitalizes on recent advances in Bruker BioSpin NMR spectrometer hardware and software. Given the daunting challenge for non-NMR experts to collect quality spectra, our goal was to increase user accessibility, provide customized functionality, and improve the consistency and reliability of resultant data. This methodology, NMRbot, is encoded in a set of scripts written in the Python programming language accessible within the Bruker BioSpin TopSpin™ software. NMRbot improves automated data acquisition and offers novel tools for use in optimizing experimental parameters on the fly. This automated procedure has been successfully implemented for investigations in metabolomics, small-molecule library profiling, and protein-ligand titrations on four Bruker BioSpin NMR spectrometers at the National Magnetic Resonance Facility at Madison. The investigators reported benefits from ease of setup, improved spectral quality, convenient customizations, and overall time savings.
Monakhova, Yulia B; Randel, Gabriele; Diehl, Bernd W K
2016-09-01
Recent classification of Aloe vera whole-leaf extract by the International Agency for Research on Cancer as a possible carcinogen to humans, as well as the continuous adulteration of A. vera's authentic material, have generated renewed interest in controlling A. vera. The existing NMR spectroscopic method for the analysis of A. vera, which is based on a routine developed at Spectral Service, was extended. Apart from aloverose, glucose, malic acid, lactic acid, citric acid, whole-leaf material (WLM), acetic acid, fumaric acid, sodium benzoate, and potassium sorbate, the quantification of Mg(2+), Ca(2+), and fructose is possible with the addition of a Cs-EDTA solution to the sample. The proposed methodology was automated; it includes phasing, baseline correction, deconvolution (based on the Lorentzian function), integration, quantification, and reporting. The NMR method was applied to 41 A. vera preparations in the form of liquid A. vera juice and solid A. vera powder. The advantages of the new NMR methodology over the previous method are discussed. Correlation between the new and standard NMR methodologies was significant for aloverose, glucose, malic acid, lactic acid, citric acid, and WLM (P < 0.0001, R(2) = 0.99). NMR was found to be suitable for the automated simultaneous quantitative determination of 13 parameters in A. vera.
Self-Contained Automated Methodology for Optimal Flow Control
NASA Technical Reports Server (NTRS)
Joslin, Ronald D.; Gunzburger, Max D.; Nicolaides, Roy A.; Erlebacher, Gordon; Hussaini, M. Yousuff
1997-01-01
This paper describes a self-contained, automated methodology for active flow control which couples the time-dependent Navier-Stokes system with an adjoint Navier-Stokes system and optimality conditions from which optimal states, i.e., unsteady flow fields and controls (e.g., actuators), may be determined. The problem of boundary layer instability suppression through wave cancellation is used as the initial validation case to test the methodology. Here, the objective of control is to match the stress vector along a portion of the boundary to a given vector; instability suppression is achieved by choosing the given vector to be that of a steady base flow. Control is effected through the injection or suction of fluid through a single orifice on the boundary. The results demonstrate that instability suppression can be achieved without any a priori knowledge of the disturbance, which is significant because other control techniques have required some knowledge of the flow unsteadiness such as frequencies, instability type, etc. The present methodology has been extended to three dimensions and may potentially be applied to separation control, re-laminarization, and turbulence control applications using one to many sensors and actuators.
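The coupled optimality system described above can be written schematically as follows. This is a generic adjoint-based formulation consistent with the abstract's description (matching a boundary stress vector to a target under a Navier-Stokes constraint), not the paper's exact equations; the operators and symbols are generic placeholders.

```latex
% Generic adjoint-based optimal control formulation (schematic):
\min_{u}\; J(\phi,u) \;=\; \frac{1}{2}\int_{\Gamma_c}
  \bigl|\boldsymbol{\sigma}(\phi) - \boldsymbol{\sigma}_{\mathrm{target}}\bigr|^{2}\,\mathrm{d}s
\quad\text{subject to}\quad N(\phi,u) = 0
\;\;\text{(unsteady Navier--Stokes)},
%
% with first-order optimality conditions: the adjoint equation and the
% gradient condition on the control (boundary suction/injection) u,
\left(\frac{\partial N}{\partial \phi}\right)^{\!*}\!\lambda
  \;=\; \frac{\partial J}{\partial \phi},
\qquad
\frac{\mathrm{d}J}{\mathrm{d}u}
  \;=\; \frac{\partial J}{\partial u}
   - \lambda^{*}\,\frac{\partial N}{\partial u} \;=\; 0 .
```

Here $\phi$ is the flow state, $u$ the control, and $\lambda$ the adjoint state; choosing $\boldsymbol{\sigma}_{\mathrm{target}}$ as the stress of a steady base flow yields the instability-suppression objective described in the abstract.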
Ladner, Yoann; Mas, Silvia; Coussot, Gaelle; Bartley, Killian; Montels, Jérôme; Morel, Jacques; Perrin, Catherine
2017-12-15
The main purpose of the present work is to provide a fully integrated, miniaturized electrophoretic methodology to facilitate the quality control of monoclonal antibodies (mAbs). This methodology, called D-PES (Diffusion-mediated Proteolysis combined with an Electrophoretic Separation), makes it possible to perform mAb tryptic digestion and subsequent electrophoretic separation of the proteolysis products in an automated manner. Tryptic digestion conditions were optimized with regard to the influence of enzyme concentration and incubation time, in order to achieve enzymatic digestion efficiency similar to that obtained with the classical (off-line) methodology. The electrophoretic separation conditions, namely the nature of the background electrolyte (BGE), ionic strength, and pH, were then optimized. Successful and repeatable electrophoretic profiles of three mAb digests (Trastuzumab, Infliximab and Tocilizumab), comparable to the off-line digestion profiles, were obtained, demonstrating the feasibility and robustness of the proposed methodology. In summary, the proposed and optimized in-line approach opens a new, fast and easy way for the quality control of mAbs. Copyright © 2017 Elsevier B.V. All rights reserved.
Dissociated neural processing for decisions in managers and non-managers.
Caspers, Svenja; Heim, Stefan; Lucas, Marc G; Stephan, Egon; Fischer, Lorenz; Amunts, Katrin; Zilles, Karl
2012-01-01
Functional neuroimaging studies of decision-making so far mainly focused on decisions under uncertainty or negotiation with other persons. Dual process theory assumes that, in such situations, decision making relies on either a rapid intuitive, automated or a slower rational processing system. However, it still remains elusive how personality factors or professional requirements might modulate the decision process and the underlying neural mechanisms. Since decision making is a key task of managers, we hypothesized that managers, facing higher pressure for frequent and rapid decisions than non-managers, prefer the heuristic, automated decision strategy in contrast to non-managers. Such different strategies may, in turn, rely on different neural systems. We tested managers and non-managers in a functional magnetic resonance imaging study using a forced-choice paradigm on word-pairs. Managers showed subcortical activation in the head of the caudate nucleus, and reduced hemodynamic response within the cortex. In contrast, non-managers revealed the opposite pattern. With the head of the caudate nucleus being an initiating component for process automation, these results supported the initial hypothesis, hinting at automation during decisions in managers. More generally, the findings reveal how different professional requirements might modulate cognitive decision processing.
Cai, Wenli; Lee, June-Goo; Fikry, Karim; Yoshida, Hiroyuki; Novelline, Robert; de Moya, Marc
2013-01-01
It is commonly believed that the size of a pneumothorax is an important determinant of treatment decisions, in particular regarding whether chest tube drainage (CTD) is required. However, volumetric quantification of pneumothoraces has not routinely been performed in clinics. In this paper, we introduce an automated computer-aided volumetry (CAV) scheme for quantification of the volume of pneumothoraces in chest multi-detector CT (MDCT) images. Moreover, we investigated the impact of accurate pneumothorax volume on improving performance in decision-making regarding CTD in the management of traumatic pneumothoraces. For this purpose, an occurrence frequency map was calculated for quantitative analysis of the importance of each clinical parameter in the decision-making regarding CTD, by computer simulation of decision-making using a genetic algorithm (GA) and a support vector machine (SVM). A total of 14 clinical parameters, including the volume of pneumothorax calculated by our CAV scheme, were collected as parameters available for decision-making. The results showed that volume was the dominant parameter in decision-making regarding CTD, with an occurrence frequency value of 1.00. The results also indicated that the inclusion of volume provided the best performance, statistically significantly better than the tests in which volume was excluded from the clinical parameters. This study provides scientific evidence for the application of the CAV scheme in MDCT volumetric quantification of pneumothoraces in the management of clinically stable chest trauma patients with traumatic pneumothorax. PMID:22560899
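The occurrence-frequency idea described above can be sketched as follows: a genetic algorithm searches binary feature masks, an SVM scores each mask, and the fraction of top-scoring masks containing a feature is its occurrence frequency. All data, GA settings, and the six stand-in "clinical parameters" below are invented; feature 0 plays the role of pneumothorax volume. This is a hedged illustration using scikit-learn, not the authors' simulation.

```python
# Hedged sketch of GA + SVM occurrence-frequency feature analysis.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Synthetic "clinical parameters": feature 0 drives the label (standing in
# for pneumothorax volume); the remaining five are noise.
n, d = 200, 6
X = rng.normal(size=(n, d))
y = (X[:, 0] > 0).astype(int)

def fitness(mask):
    # SVM cross-validated accuracy of the feature subset selected by `mask`.
    if not mask.any():
        return 0.0
    clf = SVC(kernel="rbf", gamma="scale")
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

# Tiny GA: truncation selection plus bit-flip mutation over binary masks.
pop = rng.integers(0, 2, size=(20, d)).astype(bool)
for _ in range(10):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]      # keep the best half
    children = parents.copy()
    children ^= rng.random(children.shape) < 0.1  # mutate 10% of bits
    pop = np.vstack([parents, children])

scores = np.array([fitness(m) for m in pop])
top = pop[np.argsort(scores)[-10:]]
occurrence = top.mean(axis=0)                    # per-feature frequency
print("occurrence frequencies:", np.round(occurrence, 2))
```

A dominant feature, like volume in the study, shows up in essentially every high-fitness mask and so approaches an occurrence frequency of 1.0, while noise features drift toward chance levels.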
NASA Astrophysics Data System (ADS)
Hargrave, C.; Moores, M.; Deegan, T.; Gibbs, A.; Poulsen, M.; Harden, F.; Mengersen, K.
2014-03-01
A decision-making framework for image-guided radiotherapy (IGRT) is being developed using a Bayesian Network (BN) to graphically describe, and probabilistically quantify, the many interacting factors that are involved in this complex clinical process. Outputs of the BN will provide decision-support for radiation therapists to assist them to make correct inferences relating to the likelihood of treatment delivery accuracy for a given image-guided set-up correction. The framework is being developed as a dynamic object-oriented BN, allowing for complex modelling with specific subregions, as well as representation of the sequential decision-making and belief updating associated with IGRT. A prototype graphic structure for the BN was developed by analysing IGRT practices at a local radiotherapy department and incorporating results obtained from a literature review. Clinical stakeholders reviewed the BN to validate its structure. The BN consists of a sub-network for evaluating the accuracy of IGRT practices and technology. The directed acyclic graph (DAG) contains nodes and directional arcs representing the causal relationship between the many interacting factors such as tumour site and its associated critical organs, technology and technique, and inter-user variability. The BN was extended to support on-line and off-line decision-making with respect to treatment plan compliance. Following conceptualisation of the framework, the BN will be quantified. It is anticipated that the finalised decision-making framework will provide a foundation to develop better decision-support strategies and automated correction algorithms for IGRT.
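The belief updating described above can be illustrated with a minimal two-node network inferred by brute-force enumeration. The nodes and probabilities below are invented stand-ins for the many interacting IGRT factors in the paper's DAG; a real network would be far larger and learned or elicited from clinical data.

```python
# Minimal Bayesian-network sketch: SetupError -> PoorImageMatch, with
# invented conditional probability tables and inference by enumeration.

# P(setup_error) and P(poor_match | setup_error); all numbers are made up.
p_setup = {True: 0.2, False: 0.8}
p_poor_match = {True: {True: 0.9, False: 0.1},    # keyed [setup][poor_match]
                False: {True: 0.15, False: 0.85}}

def joint(setup, poor_match):
    # Chain rule over the two-node DAG.
    return p_setup[setup] * p_poor_match[setup][poor_match]

# Belief updating: P(setup error | a poor image match was observed).
num = joint(True, True)
den = sum(joint(s, True) for s in (True, False))
posterior = num / den
print(f"P(setup error | poor match) = {posterior:.3f}")
```

Observing a poor match raises the belief in a set-up error from the 0.2 prior to 0.6 here; the framework's sequential decision support repeats exactly this kind of update as each image-guidance observation arrives.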
[The ethical reflection approach in decision-making processes in health institutes].
Gruat, Renaud
2015-12-01
Except in the specific case of end-of-life care, the law says nothing about the way in which health professionals must carry out ethical reflection regarding the treatment of their patients. A problem-solving methodology called the "ethical reflection approach" performed over several stages can be used. The decision-making process involves the whole team and draws on the ability of each caregiver to put forward a reasoned argument, in the interest of the patient. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
Meyer, Travis S; Muething, Joseph Z; Lima, Gustavo Amoras Souza; Torres, Breno Raemy Rangel; del Rosario, Trystyn Keia; Gomes, José Orlando; Lambert, James H
2012-01-01
Radiological nuclear emergency responders must be able to coordinate evacuation and relief efforts following the release of radioactive material into populated areas. In order to respond quickly and effectively to a nuclear emergency, high-level coordination is needed between a number of large, independent organizations, including police, military, hazmat, and transportation authorities. Given the complexity, scale, time-pressure, and potential negative consequences inherent in radiological emergency responses, tracking and communicating information that will assist decision makers during a crisis is crucial. The emergency response team at the Angra dos Reis nuclear power facility, located outside of Rio de Janeiro, Brazil, presently conducts emergency response simulations once every two years to prepare organizational leaders for real-life emergency situations. However, current exercises are conducted without the aid of electronic or software tools, resulting in possible cognitive overload and delays in decision-making. This paper describes the development of a decision support system employing systems methodologies, including cognitive task analysis and human-machine interface design. The decision support system can aid the coordination team by automating cognitive functions and improving information sharing. A prototype of the design will be evaluated by plant officials in Brazil and incorporated into a future trial run of a response simulation.
Universal in vivo Textural Model for Human Skin based on Optical Coherence Tomograms.
Adabi, Saba; Hosseinzadeh, Matin; Noei, Shahryar; Conforto, Silvia; Daveluy, Steven; Clayton, Anne; Mehregan, Darius; Nasiriavanaki, Mohammadreza
2017-12-20
Currently, diagnosis of skin diseases is based primarily on the visual pattern recognition skills and expertise of the physician observing the lesion. Even though dermatologists are trained to recognize patterns of morphology, it is still a subjective visual assessment. Tools for automated pattern recognition can provide objective information to support clinical decision-making. Noninvasive skin imaging techniques provide complementary information to the clinician. In recent years, optical coherence tomography (OCT) has become a powerful skin imaging technique. According to specific functional needs, skin architecture varies across different parts of the body, as do the textural characteristics in OCT images. There is, therefore, a critical need to systematically analyze OCT images from different body sites, to identify their significant qualitative and quantitative differences. Sixty-three optical and textural features extracted from OCT images of healthy and diseased skin are analyzed and, in conjunction with decision-theoretic approaches, used to create computational models of the diseases. We demonstrate that these models provide objective information to the clinician to assist in the diagnosis of abnormalities of cutaneous microstructure, and hence, aid in the determination of treatment. Specifically, we demonstrate the performance of this methodology on differentiating basal cell carcinoma (BCC) and squamous cell carcinoma (SCC) from healthy tissue.
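One concrete example of the textural features mentioned above is the grey-level co-occurrence matrix (GLCM) and its contrast statistic, sketched here in plain NumPy on toy images. The paper's 63 features and OCT data are not reproduced; this is an assumption-laden illustration of the feature family.

```python
# Hedged sketch of one classic textural feature: GLCM contrast for
# horizontally adjacent pixel pairs, on tiny synthetic images.
import numpy as np

def glcm_contrast(img, levels):
    """Build a GLCM for offset (0, 1) and return its contrast statistic."""
    glcm = np.zeros((levels, levels))
    for i, j in zip(img[:, :-1].ravel(), img[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()                    # normalize to joint probabilities
    ii, jj = np.indices(glcm.shape)
    return float(np.sum(glcm * (ii - jj) ** 2))

flat = np.zeros((8, 8), dtype=int)        # uniform texture: no variation
stripes = np.tile([0, 1], (8, 4))         # alternating columns: high contrast
print(glcm_contrast(flat, 2), glcm_contrast(stripes, 2))
```

A uniform region yields zero contrast while the striped texture maximizes it, which is why statistics of this kind can discriminate the textural signatures of different skin sites and lesions in OCT images.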
Plan Execution Interchange Language (PLEXIL)
NASA Technical Reports Server (NTRS)
Estlin, Tara; Jonsson, Ari; Pasareanu, Corina; Simmons, Reid; Tso, Kam; Verma, Vandi
2006-01-01
Plan execution is a cornerstone of spacecraft operations, irrespective of whether the plans to be executed are generated on board the spacecraft or on the ground. Plan execution frameworks vary greatly, due to both different capabilities of the execution systems, and relations to associated decision-making frameworks. The latter dependency has made the reuse of execution and planning frameworks more difficult, and has all but precluded information sharing between different execution and decision-making systems. As a step in the direction of addressing some of these issues, a general plan execution language, called the Plan Execution Interchange Language (PLEXIL), is being developed. PLEXIL is capable of expressing concepts used by many high-level automated planners and hence provides an interface to multiple planners. PLEXIL includes a domain description that specifies command types, expansions, constraints, etc., as well as feedback to the higher-level decision-making capabilities. This document describes the grammar and semantics of PLEXIL. It includes a graphical depiction of this grammar and illustrative rover scenarios. It also outlines ongoing work on implementing a universal execution system, based on PLEXIL, using state-of-the-art rover functional interfaces and planners as test cases.
Automated Planning and Scheduling for Space Mission Operations
NASA Technical Reports Server (NTRS)
Chien, Steve; Jonsson, Ari; Knight, Russell
2005-01-01
Research Trends: a) Finite-capacity scheduling under more complex constraints and increased problem dimensionality (subcontracting, overtime, lot splitting, inventory, etc.). b) Integrated planning and scheduling. c) Mixed-initiative frameworks. d) Management of uncertainty (proactive and reactive). e) Autonomous agent architectures and distributed production management. f) Integration of machine learning capabilities. g) Wider scope of applications: 1) analysis of supplier/buyer protocols & tradeoffs; 2) integration of strategic & tactical decision-making; and 3) enterprise integration.
1983-12-16
management system (DBMS) is to record and maintain information used by an organization in the organization’s decision-making process. Some advantages of a...independence. Database Management Systems are classified into three major models: relational, network, and hierarchical. Each model uses a software...feeling impedes the overall effectiveness of the Acquisition Management Information System (AMIS), which currently uses S2k. The size of the AMIS
An Artificial Neural Network-Based Decision-Support System for Integrated Network Security
2014-09-01
group that they need to know in order to make team-based decisions in real-time environments, (c) Employ secure cloud computing services to host mobile...out-of-the-loop syndrome and create complexity creep. As a result, full automation efforts can lead to inappropriate decision-making despite a
A unified approach to VLSI layout automation and algorithm mapping on processor arrays
NASA Technical Reports Server (NTRS)
Venkateswaran, N.; Pattabiraman, S.; Srinivasan, Vinoo N.
1993-01-01
Development of software tools for designing supercomputing systems is highly complex and cost-ineffective. To tackle this, a special-purpose PAcube silicon compiler that integrates different design levels, from cell to processor arrays, has been proposed. As part of this work, we present in this paper a novel methodology which unifies the problems of Layout Automation and Algorithm Mapping.
NASA Technical Reports Server (NTRS)
1975-01-01
The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.
Artificial intelligence issues related to automated computing operations
NASA Technical Reports Server (NTRS)
Hornfeck, William A.
1989-01-01
Large data processing installations represent target systems for effective applications of artificial intelligence (AI) constructs. The system organization of a large data processing facility at the NASA Marshall Space Flight Center is presented. The methodology and the issues which are related to AI application to automated operations within a large-scale computing facility are described. Problems to be addressed and initial goals are outlined.
Decision-making in Swiss home-like childbirth: A grounded theory study.
Meyer, Yvonne; Frank, Franziska; Schläppy Muntwyler, Franziska; Fleming, Valerie; Pehlke-Milde, Jessica
2017-12-01
Decision-making in midwifery, including a claim for shared decision-making between midwives and women, is of major significance for the health of mother and child. Midwives have little information about how to share decision-making responsibilities with women, especially when complications arise during birth. This study aimed to increase understanding of decision-making in complex home-like birth settings by exploring midwives' and women's perspectives, and to develop a dynamic model integrating participatory processes for making shared decisions. The study, based on grounded theory methodology, analysed interviews with 20 midwives and 20 women who had experienced complications in home-like births. The central phenomenon that arose from the data was "defining/redefining decision as a joint commitment to healthy childbirth". The sub-indicators that make up this phenomenon were safety, responsibility, and mutual and personal commitments. These sub-indicators were also found to influence the temporal conditions of decision-making and the strategies applied for shared decision-making. Women adopted strategies such as delegating a decision, making the midwife's decision her own, challenging a decision or taking a decision driven by the dynamics of childbirth. Midwives employed strategies such as remaining indecisive, approving a woman's decision, making an informed decision or taking the necessary decision. To respond to recommendations for shared responsibility for care, midwives need to strengthen their shared decision-making skills. The visual model of decision-making in childbirth derived from the data provides a framework for transferring clinical reasoning into practice. Copyright © 2017 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.
Commercial Activities Baseline Study
1991-03-01
object-oriented programming technology) that automated processing. This report documents that methodology, reviews the candidates and criteria for source data, and provides examples of the output reports.
Ensemble modelling and structured decision-making to support Emergency Disease Management.
Webb, Colleen T; Ferrari, Matthew; Lindström, Tom; Carpenter, Tim; Dürr, Salome; Garner, Graeme; Jewell, Chris; Stevenson, Mark; Ward, Michael P; Werkman, Marleen; Backer, Jantien; Tildesley, Michael
2017-03-01
Epidemiological models in animal health are commonly used as decision-support tools to understand the impact of various control actions on infection spread in susceptible populations. Different models contain different assumptions and parameterizations, and policy decisions might be improved by considering outputs from multiple models. However, a transparent decision-support framework to integrate outputs from multiple models is nascent in epidemiology. Ensemble modelling and structured decision-making integrate the outputs of multiple models, compare policy actions and support policy decision-making. We briefly review the epidemiological application of ensemble modelling and structured decision-making and illustrate the potential of these methods using foot and mouth disease (FMD) models. In case study one, we apply structured decision-making to compare five possible control actions across three FMD models and show which control actions and outbreak costs are robustly supported and which are impacted by model uncertainty. In case study two, we develop a methodology for weighting the outputs of different models and show how different weighting schemes may impact the choice of control action. Using these case studies, we broadly illustrate the potential of ensemble modelling and structured decision-making in epidemiology to provide better information for decision-making and outline necessary development of these methods for their further application. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
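The weighting idea in case study two can be made concrete with a toy sketch: combine each model's predicted outbreak cost per control action under a weighting scheme, then rank actions. The model outputs and both weighting schemes below are invented numbers, not the FMD models from the paper; the point is only that the preferred action can change with the weights.

```python
# Toy ensemble: combine per-model predicted costs for each control action
# under a given weighting scheme, then pick the cheapest action.

def weighted_outbreak_cost(predictions, weights):
    """predictions: {model: {action: predicted cost}}; weights sum to 1."""
    actions = next(iter(predictions.values())).keys()
    return {a: sum(weights[m] * predictions[m][a] for m in predictions)
            for a in actions}

predictions = {
    "model_A": {"cull": 120, "vaccinate": 90},
    "model_B": {"cull": 100, "vaccinate": 140},
    "model_C": {"cull": 110, "vaccinate": 95},
}

equal_weights = {m: 1 / 3 for m in predictions}
skill_weights = {"model_A": 0.2, "model_B": 0.6, "model_C": 0.2}  # e.g. past fit

for weights in (equal_weights, skill_weights):
    combined = weighted_outbreak_cost(predictions, weights)
    best = min(combined, key=combined.get)
    print(best, {a: round(c, 1) for a, c in combined.items()})
```

With equal weights the cheapest action is vaccination; with the invented skill weights, which favor model_B, it is culling, illustrating how the weighting scheme can flip the supported policy.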
No Evidence of Association between Toxoplasma gondii Infection and Financial Risk Taking in Females
Šebánková, Blanka; Flegr, Jaroslav; Nave, Gideon
2015-01-01
Background: Past research linked Toxoplasma gondii (TG) infection in humans with neurological and mental disorders (e.g., schizophrenia, Alzheimer’s disease and attention disorders), irregularities of the dopaminergic and testosterone systems, and increased likelihood of being involved in traffic accidents. Methodology/Principal Findings: We test for an association between TG infection and financial decision-making (DM) using a case-control design in a sample of female Czech students (n = 79). We estimate each subject's risk attitude and loss aversion using an experimental economic task involving real monetary incentives. We find no significant evidence that either measure of decision-making is associated with TG infection. Conclusion: We were unable to find evidence of an association between TG infection and financial decision-making in females. PMID:26401912
Thomson, Hilary
2013-08-01
Systematic reviews have the potential to promote knowledge exchange between researchers and decision-makers. Review planning requires engagement with evidence users to ensure preparation of relevant reviews, and well-conducted reviews should provide accessible and reliable synthesis to support decision-making. Yet, systematic reviews are not routinely referred to by decision-makers, and innovative approaches to improve the utility of reviews are needed. Evidence synthesis for healthy public policy is typically complex and methodologically challenging. Although these challenges do not lessen the value of reviews, they can be overwhelming and threaten their utility. Using the interrelated principles of relevance, rigor, and readability, and in light of available resources, this article considers how the utility of evidence synthesis for healthy public policy might be improved.
Review of evaluation on ecological carrying capacity: The progress and trend of methodology
NASA Astrophysics Data System (ADS)
Wang, S. F.; Xu, Y.; Liu, T. J.; Ye, J. M.; Pan, B. L.; Chu, C.; Peng, Z. L.
2018-02-01
The ecological carrying capacity (ECC) has been regarded as an important reference indicating the level of regional sustainable development since the beginning of the twenty-first century. Through a brief review of the main progress in ECC evaluation methodologies over the past five years, this paper systematically discusses the features and differences of these methods and describes the current state and future development trend of ECC methodology. The results show that further exploration of dynamic, comprehensive, and intelligent assessment technologies is needed in order to form a unified and scientific ECC methodology system and to produce a reliable basis for environmental-economic decision-making.
Cost-Utility Analysis: Current Methodological Issues and Future Perspectives
Nuijten, Mark J. C.; Dubois, Dominique J.
2011-01-01
The use of cost–effectiveness as the final criterion in the reimbursement process for listing of new pharmaceuticals can be questioned from a scientific and policy point of view. There is a lack of consensus on the main methodological issues, and consequently we may question the appropriateness of the use of cost–effectiveness data in health care decision-making. Another concern is the appropriateness of the selection and use of an incremental cost–effectiveness threshold (Cost/QALY). In this review, we focus on some key methodological concerns relating to discounting, the utility concept, cost assessment, and modeling methodologies. Finally, we consider the relevance of some other important decision criteria, such as social values and equity. PMID:21713127
Application of Human-Autonomy Teaming (HAT) Patterns to Reduce Crew Operations (RCO)
NASA Technical Reports Server (NTRS)
Shively, R. Jay; Brandt, Summer L.; Lachter, Joel; Matessa, Mike; Sadler, Garrett; Battiste, Henri
2016-01-01
Unmanned aerial systems, robotics, advanced cockpits, and air traffic management are all examples of domains that are seeing dramatic increases in automation. While automation may take on some tasks previously performed by humans, humans will still be required, for the foreseeable future, to remain in the system. Collaboration between humans and these increasingly autonomous systems will begin to resemble cooperation between teammates, rather than simple task allocation. It is critical to understand this human-autonomy teaming (HAT) to optimize these systems in the future. One methodology for understanding HAT is to identify recurring patterns of HAT that have similar characteristics and solutions. This paper applies a methodology for identifying HAT patterns to an advanced cockpit project.
Collaborative Visual Analytics: A Health Analytics Approach to Injury Prevention
Fisher, Brian; Smith, Jennifer; Pike, Ian
2017-01-01
Background: Accurate understanding of complex health data is critical in order to deal with wicked health problems and make timely decisions. Wicked problems refer to ill-structured and dynamic problems that combine multidimensional elements, which often preclude the conventional problem-solving approach. This pilot study introduces visual analytics (VA) methods to multi-stakeholder decision-making sessions about child injury prevention. Methods: Inspired by the Delphi method, we introduced a novel methodology, group analytics (GA). GA was pilot-tested to evaluate the impact of collaborative visual analytics on facilitating problem solving and supporting decision-making. We conducted two GA sessions. Collected data included stakeholders’ observations, audio and video recordings, questionnaires, and follow-up interviews. The GA sessions were analyzed using the Joint Activity Theory protocol analysis methods. Results: The GA methodology triggered the emergence of ‘common ground’ among stakeholders. This common ground evolved throughout the sessions to enhance stakeholders’ verbal and non-verbal communication, as well as coordination of joint activities and ultimately collaboration on problem solving and decision-making. Conclusions: Understanding complex health data is necessary for informed decisions. Equally important, in this case, is the use of the group analytics methodology to achieve ‘common ground’ among diverse stakeholders about health data and their implications. PMID:28895928
Reaching a Consensus: Terminology and Concepts Used in Coordination and Decision-Making Research.
Pyritz, Lennart W; King, Andrew J; Sueur, Cédric; Fichtel, Claudia
2011-12-01
Research on coordination and decision-making in humans and nonhuman primates has increased considerably throughout the last decade. However, terminology has been used inconsistently, hampering the broader integration of results from different studies. In this short article, we provide a glossary containing the central terms of coordination and decision-making research. The glossary is based on previous definitions that have been critically revised and annotated by the participants of the symposium "Where next? Coordination and decision-making in primate groups" at the XXIIIth Congress of the International Primatological Society (IPS) in Kyoto, Japan. We discuss a number of conceptual and methodological issues and highlight consequences for their implementation. In summary, we recommend that future studies on coordination and decision-making in animal groups do not use the terms "combined decision" and "democratic/despotic decision-making." This will avoid ambiguity as well as anthropocentric connotations. Further, we demonstrate the importance of 1) taxon-specific definitions of coordination parameters (initiation, leadership, followership, termination), 2) differentiation between coordination research on individual-level process and group-level outcome, 3) analyses of collective action processes including initiation and termination, and 4) operationalization of successful group movements in the field to collect meaningful and comparable data across different species.
Patel, Vaishali N; Riley, Anne W
2007-10-01
A multiple case study was conducted to examine how staff in child out-of-home care programs used data from an Outcomes Management System (OMS) and other sources to inform decision-making. Data collection consisted of thirty-seven semi-structured interviews with clinicians, managers, and directors from two treatment foster care programs and two residential treatment centers, and individuals involved with developing the OMS; and observations of clinical and quality management meetings. Case study and grounded theory methodology guided analyses. The application of qualitative data analysis software is described. Results show that although staff rarely used data from the OMS, they did rely on other sources of systematically collected information to inform clinical, quality management, and program decisions. Analyses of how staff used these data suggest that improving the utility of OMS will involve encouraging staff to participate in data-based decision-making, and designing and implementing OMS in a manner that reflects how decision-making processes operate.
Is the relationship between pattern recall and decision-making influenced by anticipatory recall?
Gorman, Adam D; Abernethy, Bruce; Farrow, Damian
2013-01-01
The present study compared traditional measures of pattern recall to measures of anticipatory recall and decision-making to examine the underlying mechanisms of expert pattern perception and to address methodological limitations in previous studies where anticipatory recall has generally been overlooked. Recall performance in expert and novice basketball players was measured by examining the spatial error in recalling player positions both for a target image (traditional recall) and at 40-ms increments following the target image (anticipatory recall). Decision-making performance was measured by comparing the participant's response to those identified by a panel of expert coaches. Anticipatory recall was observed in the recall task and was significantly more pronounced for the experts, suggesting that traditional methods of spatial recall analysis may not have provided a completely accurate determination of the full magnitude of the experts' superiority. Accounting for anticipatory recall also increased the relative contribution of recall skill to decision-making accuracy although the gains in explained variance were modest and of debatable functional significance.
How to combine probabilistic and fuzzy uncertainties in fuzzy control
NASA Technical Reports Server (NTRS)
Nguyen, Hung T.; Kreinovich, Vladik YA.; Lea, Robert
1991-01-01
Fuzzy control is a methodology that translates natural-language rules, formulated by expert controllers, into the actual control strategy that can be implemented in an automated controller. In many cases, in addition to the experts' rules, additional statistical information about the system is known. It is explained how to use this additional information in fuzzy control methodology.
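The combination the abstract describes can be illustrated with a toy controller in which each fuzzy rule's firing degree is scaled by a statistical weight before defuzzification. This is a minimal sketch under invented assumptions, not the authors' method: the membership shapes, rules, and probability weights are all made up.

```python
# Toy fuzzy controller whose rule activations fold in additional statistical
# information. Membership functions, rules, and probabilities are invented.

def tri(x, a, b, c):
    # Triangular membership function: 0 outside [a, c], peaking at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def control(error, rules):
    # Each rule: (membership over error, crisp output, probability weight).
    # Output is a weighted average of rule outputs (simple defuzzification),
    # with each firing degree scaled by the rule's statistical weight.
    num = den = 0.0
    for member, out, prob in rules:
        w = member(error) * prob
        num += w * out
        den += w
    return num / den if den else 0.0

rules = [
    (lambda e: tri(e, -2, -1, 0), -1.0, 0.9),  # "error negative -> push up"
    (lambda e: tri(e, -1,  0, 1),  0.0, 1.0),  # "error zero -> hold"
    (lambda e: tri(e,  0,  1, 2), +1.0, 0.9),  # "error positive -> push down"
]

print(control(0.5, rules))
```

Setting all probability weights to 1 recovers an ordinary fuzzy controller; unequal weights shift the output toward the statistically better-supported rules.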
Connors, Brenda L.; Rende, Richard; Colton, Timothy J.
2014-01-01
The unique yield of collecting observational data on human movement has received increasing attention in a number of domains, including the study of decision-making style. As such, interest has grown in the nuances of core methodological issues, including the best ways of assessing inter-rater reliability. In this paper we focus on one key topic – the distinction between establishing reliability for the patterning of behaviors as opposed to the computation of raw counts – and suggest that reliability for each be compared empirically rather than determined a priori. We illustrate by assessing inter-rater reliability for key outcome measures derived from movement pattern analysis (MPA), an observational methodology that records body movements as indicators of decision-making style with demonstrated predictive validity. While reliability ranged from moderate to good for raw counts of behaviors reflecting each of two Overall Factors generated within MPA (Assertion and Perspective), inter-rater reliability for patterning (proportional indicators of each factor) was significantly higher and excellent (ICC = 0.89). Furthermore, patterning, as compared to raw counts, provided better prediction of observable decision-making process assessed in the laboratory. These analyses support the utility of using an empirical approach to inform the consideration of measuring patterning versus discrete behavioral counts of behaviors when determining inter-rater reliability of observable behavior. They also speak to the substantial reliability that may be achieved via application of theoretically grounded observational systems such as MPA that reveal thinking and action motivations via visible movement patterns. PMID:24999336
NASA Astrophysics Data System (ADS)
Alfonso, Leonardo
2013-04-01
The role of decision-makers is to take the outputs from hydrological and hydraulic analyses and, to some extent, use them as inputs to decisions related to the planning, design and operation of water systems. However, the use of these technical analyses is frequently limited, since other non-hydrological issues must also be considered, which may lead to very different solutions than those envisaged by purely technical analysis. One way to account for the nature of human decisions under uncertainty is to explore concepts from decision theory and behavioural economics, such as Value of Information and Prospect Theory, and embed them into the methodologies used in hydrological practice. Three examples are presented to illustrate these multidisciplinary interactions. The first, for monitoring network design, uses Value of Information within a methodology to locate water level stations in a complex network of canals in the Netherlands. The second example, for operation, shows how the Value of Information concept can be used to formulate alternative methods to evaluate flood risk according to the set of options available for decision-making during a flood event. The third example, for planning, uses Prospect Theory concepts to understand how the "losses hurt more than gains feel good" effect can determine the final decision of whether or not to urbanise a flood-prone area. It is demonstrated that decision theory and behavioural economics principles are promising tools for evaluating the complex decision-making process in water-related issues.
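The Value of Information concept mentioned above has a standard quantitative form: the expected value of perfect information (EVPI) is the gap between the best decision made without information and the average of the best per-state decisions. The toy flood protection numbers below are invented for illustration and are not from the examples in the abstract.

```python
# Toy EVPI calculation for a flood protection decision. States, costs, and
# probabilities are invented numbers.

def expected_cost(action, costs, probs):
    return sum(probs[s] * costs[action][s] for s in probs)

def evpi(costs, probs):
    # Without information: commit to the one action minimizing expected cost.
    best_without = min(expected_cost(a, costs, probs) for a in costs)
    # With perfect information: pick the best action per state, then average.
    best_with = sum(probs[s] * min(costs[a][s] for a in costs) for s in probs)
    return best_without - best_with

probs = {"flood": 0.1, "no_flood": 0.9}
costs = {
    "build_levee": {"flood": 20, "no_flood": 20},   # fixed cost either way
    "do_nothing":  {"flood": 150, "no_flood": 0},   # large loss only if flooded
}

print(evpi(costs, probs))
```

Here doing nothing is cheapest in expectation (15 vs 20), but perfect foresight would cut the expected cost to 2, so information about the flood state is worth up to 13 cost units, an upper bound on what a better monitoring network could be worth for this decision.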
DOT National Transportation Integrated Search
2009-10-28
Transportation agencies strive to maintain their systems in good condition and also to provide acceptable levels of service to users. However, funding is often inadequate to meet the needs of system preservation and expansion, and thus performanc...
Decision Making, Models of Mind, and the New Cognitive Science.
ERIC Educational Resources Information Center
Evers, Colin W.
1998-01-01
Explores implications for understanding educational decision making from a cognitive science perspective. Examines three models of mind providing the methodological framework for decision-making studies. The "absent mind" embodies the behaviorist research tradition. The "functionalist mind" underwrites traditional cognitivism…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, Yixiong; Hu, Bingtao; Hao, He
2018-02-14
With the development of communication and control technology, intelligent transportation systems have received increasing attention from both industry and academia. Intelligent transportation systems are supported by the Internet of Things, Cyber-Physical Systems, Artificial Intelligence, Cloud Computing and many other technologies, which supply fundamental information for connected and automated vehicles. Although plenty of studies have provided different formulations for intelligent transportation systems, many of them depend on a Master Control Center. However, a centralized control mode requires a huge amount of data transmission and a high level of hardware configuration, and may cause communication delays and privacy leaks. Some distributed architectures have been proposed to overcome the above problems, but systematized technologies to collect and exchange information, process large amounts of data, model the dynamics of vehicles, and safely control the connected and automated vehicles have not been explored in detail. In this paper, we propose a novel distributed cyber-physical system for connected and automated vehicles in which every vehicle is modeled as a double integrator using edge computing to analyze information collected from its nearest neighbors. The vehicles are supposed to travel along a desired trajectory and to maintain a rigid formation geometry. Related methodologies for the proposed system are illustrated, and experiments are conducted showing that the performance of the connected and automated vehicles matches very well with analytic predictions. Some design guidelines and open questions are provided for future study.
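The double-integrator, nearest-neighbor formulation in the abstract can be sketched with a toy simulation: each follower accelerates based only on the position and velocity of the vehicle directly ahead, the kind of computation edge hardware could do locally. The gains, spacing, time step, and leader behavior below are invented for illustration, not taken from the paper.

```python
# Toy platoon of double-integrator vehicles holding a rigid spacing using
# only nearest-neighbor information. All parameters are invented.

DT, SPACING = 0.05, 5.0   # time step [s], desired gap [m]
KP, KD = 2.0, 3.0         # PD gains on gap error and relative velocity

def step(pos, vel):
    # Leader (index 0) cruises at constant speed; each follower accelerates
    # toward a fixed gap behind its predecessor.
    acc = [0.0] * len(pos)
    for i in range(1, len(pos)):
        gap_err = (pos[i - 1] - pos[i]) - SPACING
        acc[i] = KP * gap_err + KD * (vel[i - 1] - vel[i])
    new_vel = [v + a * DT for v, a in zip(vel, acc)]
    new_pos = [p + v * DT for p, v in zip(pos, new_vel)]
    return new_pos, new_vel

pos = [0.0, -8.0, -20.0]      # start with the wrong gaps
vel = [10.0, 10.0, 10.0]
for _ in range(2000):          # simulate 100 s
    pos, vel = step(pos, vel)

gaps = [pos[i] - pos[i + 1] for i in range(len(pos) - 1)]
print([round(g, 2) for g in gaps])
```

With these gains the gap-error dynamics are overdamped, so both gaps settle to the 5 m target; a full treatment would also analyze string stability, which this sketch ignores.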
2003-10-01
Among the procedures developed to identify cognitive processes are Cognitive Task Analysis (CTA) and Cognitive Work Analysis (CWA)...
Looking ahead in systems engineering
NASA Technical Reports Server (NTRS)
Feigenbaum, Donald S.
1966-01-01
Five areas that are discussed in this paper are: (1) the technological characteristics of systems engineering; (2) the analytical techniques that are giving modern systems work its capability and power; (3) the management, economics, and effectiveness dimensions that now frame the modern systems field; (4) systems engineering's future impact upon automation, computerization and managerial decision-making in industry - and upon aerospace and weapons systems in government and the military; and (5) modern systems engineering's partnership with modern quality control and reliability.
Fluorescence Behavioral Imaging (FBI) Tracks Identity in Heterogeneous Groups of Drosophila
Ramdya, Pavan; Schaffter, Thomas; Floreano, Dario; Benton, Richard
2012-01-01
Distinguishing subpopulations in group behavioral experiments can reveal the impact of differences in genetic, pharmacological and life-histories on social interactions and decision-making. Here we describe Fluorescence Behavioral Imaging (FBI), a toolkit that uses transgenic fluorescence to discriminate subpopulations, imaging hardware that simultaneously records behavior and fluorescence expression, and open-source software for automated, high-accuracy determination of genetic identity. Using FBI, we measure courtship partner choice in genetically mixed groups of Drosophila. PMID:23144871
NASA Astrophysics Data System (ADS)
Nakagawa, M.; Akano, K.; Kobayashi, T.; Sekiguchi, Y.
2017-09-01
Image-based virtual reality (VR) is a virtual space generated with panoramic images projected onto a primitive model. In image-based VR, realistic VR scenes can be generated with lower rendering cost, and network data can be described as relationships among VR scenes. The camera network data are generated manually or by an automated procedure using camera position and rotation data. When panoramic images are acquired in indoor environments, network data should be generated without Global Navigation Satellite Systems (GNSS) positioning data. Thus, we focused on image-based VR generation using a panoramic camera in indoor environments. We propose a methodology to automate network data generation using panoramic images for an image-based VR space. We verified and evaluated our methodology through five experiments in indoor environments, including a corridor, elevator hall, room, and stairs. We confirmed that our methodology can automatically reconstruct network data using panoramic images for image-based VR in indoor environments without GNSS position data.
A self-contained, automated methodology for optimal flow control validated for transition delay
NASA Technical Reports Server (NTRS)
Joslin, Ronald D.; Gunzburger, Max D.; Nicolaides, R. A.; Erlebacher, Gordon; Hussaini, M. Yousuff
1995-01-01
This paper describes a self-contained, automated methodology for flow control along with a validation of the methodology for the problem of boundary layer instability suppression. The objective of control is to match the stress vector along a portion of the boundary to a given vector; instability suppression is achieved by choosing the given vector to be that of a steady base flow, e.g., Blasius boundary layer. Control is effected through the injection or suction of fluid through a single orifice on the boundary. The present approach couples the time-dependent Navier-Stokes system with an adjoint Navier-Stokes system and optimality conditions from which optimal states, i.e., unsteady flow fields, and control, e.g., actuators, may be determined. The results demonstrate that instability suppression can be achieved without any a priori knowledge of the disturbance, which is significant because other control techniques have required some knowledge of the flow unsteadiness such as frequencies, instability type, etc.
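The forward/adjoint/gradient pattern the paper couples to the Navier-Stokes equations can be illustrated on a toy discrete-time analogue. This is emphatically not the paper's formulation: the sketch below applies the same three-sweep structure (simulate the state, solve the adjoint backward, descend along the control gradient) to a scalar linear system whose every symbol and parameter is invented for illustration.

```python
def solve_optimal_control(a=0.9, b=0.5, r=1.0, alpha=0.1, N=20,
                          y0=0.0, iters=3000, lr=0.05):
    """Gradient descent on a quadratic tracking cost for the toy system
    y[k+1] = a*y[k] + b*u[k], cost = 0.5*sum((y[k]-r)^2) + 0.5*alpha*sum(u^2).
    Returns the optimized control sequence and final state trajectory."""
    u = [0.0] * N
    y = [y0] * (N + 1)
    for _ in range(iters):
        # forward sweep: simulate the state under the current control
        y = [y0]
        for k in range(N):
            y.append(a * y[k] + b * u[k])
        # backward adjoint sweep: p[N] = y[N]-r, p[k] = a*p[k+1] + (y[k]-r)
        p = [0.0] * (N + 1)
        p[N] = y[N] - r
        for k in range(N - 1, 0, -1):
            p[k] = a * p[k + 1] + (y[k] - r)
        # steepest-descent update: dJ/du[k] = alpha*u[k] + b*p[k+1]
        for k in range(N):
            u[k] -= lr * (alpha * u[k] + b * p[k + 1])
    return u, y
```

The key property mirrored from the paper is that one forward and one backward sweep yield the exact gradient with respect to every control value, regardless of how many controls there are.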
DOT National Transportation Integrated Search
2013-08-01
The Texas A&M Transportation Institute's (TTI's) often-cited Urban Mobility Report (UMR) provides transportation decision-makers with urban-area congestion statistics and trends. Data and their availability have continued to evolve rapidly ov...
Lithography-based automation in the design of program defect masks
NASA Astrophysics Data System (ADS)
Vakanas, George P.; Munir, Saghir; Tejnil, Edita; Bald, Daniel J.; Nagpal, Rajesh
2004-05-01
In this work, we are reporting on a lithography-based methodology and automation in the design of Program Defect masks (PDMs). Leading edge technology masks have ever-shrinking primary features and more pronounced model-based secondary features such as optical proximity corrections (OPC), sub-resolution assist features (SRAFs) and phase-shifted mask (PSM) structures. In order to define defect disposition specifications for critical layers of a technology node, experience alone in deciding worst-case scenarios for the placement of program defects is necessary but may not be sufficient. MEEF calculations initiated from layout pattern data and their integration in a PDM layout flow provide a natural approach for improvements, relevance and accuracy in the placement of programmed defects. This methodology provides closed-loop feedback between layout and hard defect disposition specifications, thereby minimizing engineering test restarts, improving quality and reducing cost of high-end masks. Apart from SEMI and industry standards, best-known methods (BKMs) in integrated lithographically-based layout methodologies and automation specific to PDMs are scarce. The contribution of this paper lies in the implementation of Design-For-Test (DFT) principles to a synergistic interaction of CAD Layout and Aerial Image Simulator to drive layout improvements, highlight layout-to-fracture interactions and output accurate program defect placement coordinates to be used by tools in the mask shop.
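MEEF (mask error enhancement factor) is conventionally the sensitivity of the printed wafer CD to the mask CD expressed at wafer scale. A minimal sketch of that estimate, assuming paired mask/wafer CD measurements and a 4x reduction system; the function name and the least-squares slope estimator are illustrative, not the paper's flow:

```python
def meef(mask_cds_nm, wafer_cds_nm, magnification=4.0):
    """Mask Error Enhancement Factor from paired CD measurements:
    MEEF = d(wafer CD) / d(mask CD at wafer scale), estimated here as a
    least-squares slope over the measurement pairs."""
    x = [m / magnification for m in mask_cds_nm]  # mask CD at 1x (wafer scale)
    y = wafer_cds_nm
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den
```

A MEEF near 1 means mask errors print one-for-one; values well above 1 flag regions where program defects are most informative to place.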
NASA Astrophysics Data System (ADS)
Gorlach, Igor; Wessel, Oliver
2008-09-01
In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.
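The trade-off the study describes, weighing cost, productivity, quality and flexibility across automation levels, is often formalised as a weighted-criteria score. The sketch below is a hypothetical formalisation, not the Fraunhofer methodology itself; the criterion names, weights, and normalised 0-1 scores are all assumptions.

```python
def rank_automation_levels(alternatives, weights):
    """Weighted-sum utility over normalised criterion scores (0..1,
    higher is better); returns the alternatives sorted best-first."""
    def utility(alt):
        return sum(w * alt[criterion] for criterion, w in weights.items())
    return sorted(alternatives, key=utility, reverse=True)
```

With weights reflecting the study's observations (high cost and low flexibility penalising full automation), a balanced mix of manual and automated stations can outrank a fully automated line.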
Studies of planning behavior of aircraft pilots in normal, abnormal and emergency situations
NASA Technical Reports Server (NTRS)
Johannsen, G.; Rouse, W. B.; Hillmann, K.
1981-01-01
A methodology for the study of planning is presented and the results of applying the methodology within two experimental investigations of planning behavior of aircraft pilots in normal, abnormal, and emergency situations are discussed. Beyond showing that the methodology yields consistent results, these experiments also lead to concepts in terms of a dichotomy between event driven and time driven planning, subtle effects of automation on planning, and the relationship of planning to workload and flight performance.
1988-02-28
enormous investment in software. This is an extremely important objective. We need better methodologies, tools and theories... microscopy (SEM) and optical microscopy. Current activities include the study of SEM images... [13] Hanson, A., et al., "A Methodology for the Development..." ... through a phased knowledge engineering methodology consisting of: prototype knowledge base development... NASA Ames Research Center (ARC) and NASA Johnson Space Center (JSC)
Agility through Automated Negotiation for C2 Services
2014-06-01
using this e-contract negotiation methodology in a C2 context in Brazil. We have modeled the operations of the Rio de Janeiro Command Center that will be in place for the World Cup (2014)... through e-contracts. The scenario chosen to demonstrate this methodology is a security incident in Rio de Janeiro, host city of the next World Cup (2014
The effect of JPEG compression on automated detection of microaneurysms in retinal images
NASA Astrophysics Data System (ADS)
Cree, M. J.; Jelinek, H. F.
2008-02-01
As JPEG compression at source is ubiquitous in retinal imaging, and the block artefacts introduced are known to be of similar size to microaneurysms (an important indicator of diabetic retinopathy) it is prudent to evaluate the effect of JPEG compression on automated detection of retinal pathology. Retinal images were acquired at high quality and then compressed to various lower qualities. An automated microaneurysm detector was run on the retinal images of various qualities of JPEG compression and the ability to predict the presence of diabetic retinopathy based on the detected presence of microaneurysms was evaluated with receiver operating characteristic (ROC) methodology. The negative effect of JPEG compression on automated detection was observed even at levels of compression sometimes used in retinal eye-screening programmes and these may have important clinical implications for deciding on acceptable levels of compression for a fully automated eye-screening programme.
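The ROC evaluation here reduces to ranking detector scores on diseased versus healthy images. A self-contained sketch of the AUC statistic, the probability that a positive case outranks a negative one, which can be recomputed at each JPEG quality to quantify the compression effect; the scores in the test are invented:

```python
def roc_auc(scores_diseased, scores_healthy):
    """Area under the ROC curve via the Mann-Whitney rank statistic:
    the probability that a diseased image's detector score exceeds a
    healthy image's score (ties count one half)."""
    wins = 0.0
    for d in scores_diseased:
        for h in scores_healthy:
            if d > h:
                wins += 1.0
            elif d == h:
                wins += 0.5
    return wins / (len(scores_diseased) * len(scores_healthy))
```

A drop in AUC between the high-quality originals and their recompressed copies is exactly the "negative effect" the abstract reports.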
Study of the impact of automation on productivity in bus-maintenance facilities. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sumanth, D.J.; Weiss, H.J.; Adya, B.
1988-12-01
Whether or not the various types of automation and new technologies introduced in a bus-transit system really have an impact on productivity is the question addressed in the study. The report describes a new procedure of productivity measurement and evaluation for a county-transit system and provides an objective perspective on the impact of automation on productivity in bus maintenance facilities. The research objectives were: to study the impact of automation on total productivity in transit maintenance facilities; to develop and apply a methodology for measuring the total productivity of a Floridian transit maintenance facility (the Bradenton-Manatee County bus maintenance facility, which has been introducing automation since 1983); and to develop a practical step-by-step implementation scheme for the total productivity-based productivity measurement system that any bus manager can use. All three objectives were successfully accomplished.
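Sumanth's total-productivity view divides total tangible output by the sum of all tangible inputs rather than by labor alone. A minimal sketch of that ratio and a period-over-period index; the input categories and figures below are illustrative, not the report's data:

```python
def total_productivity(outputs, inputs):
    """Sumanth-style total productivity: total tangible output divided by
    the sum of ALL tangible inputs (labor, materials, capital, energy, ...),
    both expressed in constant-value monetary terms."""
    return sum(outputs.values()) / sum(inputs.values())

def productivity_index(tp_current, tp_base):
    """Period-over-period total productivity index (base period = 1.0)."""
    return tp_current / tp_base
```

Tracking the index across periods before and after an automation purchase is what lets a manager attribute (or refuse to attribute) a productivity change to that automation.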
Kaplan, Johanna S; Erickson, Kristine; Luckenbaugh, David A; Weiland-Fiedler, Petra; Geraci, Marilla; Sahakian, Barbara J; Charney, Dennis; Drevets, Wayne C; Neumeister, Alexander
2006-10-01
Neuropsychological studies have provided evidence for deficits in psychiatric disorders, such as schizophrenia and mood disorders. However, neuropsychological function in Panic Disorder (PD) or PD with a comorbid diagnosis of Major Depressive Disorder (MDD) has not been comprehensively studied. The present study investigated neuropsychological functioning in patients with PD and PD + MDD by focusing on tasks that assess attention, psychomotor speed, executive function, decision-making, and affective processing. Twenty-two unmedicated patients with PD, eleven of whom had a secondary diagnosis of MDD, were compared to twenty-two healthy controls, matched for gender, age, and intelligence on tasks of attention, memory, psychomotor speed, executive function, decision-making, and affective processing from the Cambridge Neuropsychological Test Automated Battery (CANTAB), Cambridge Gamble Task, and Affective Go/No-go Task. Relative to matched healthy controls, patients with PD + MDD displayed an attentional bias toward negatively-valenced verbal stimuli (Affective Go/No-go Task) and longer decision-making latencies (Cambridge Gamble Task). Furthermore, the PD + MDD group committed more errors on a task of memory and visual discrimination compared to their controls. In contrast, no group differences were found for PD patients relative to matched control subjects. The sample size was limited; however, all patients were drug-free at the time of testing. The PD + MDD patients demonstrated deficits on a task involving visual discrimination and working memory, and an attentional bias towards negatively-valenced stimuli. In addition, patients with comorbid depression provided qualitatively different responses in the areas of affective and decision-making processes.
The problem of resonance in technology usage
NASA Technical Reports Server (NTRS)
Sayani, H. H.; Svoboda, C. P.
1981-01-01
Various information system tools and techniques are analyzed. A case study is presented which draws together the issues raised in three distinct cases. This case study shows a typical progression from the selection of an analysis methodology, to the adoption of an automated tool for specification and documentation, and the difficulty of fitting these into an existing life cycle development methodology.
Systems thinking, complexity and managerial decision-making: an analytical review.
Cramp, D G; Carson, E R
2009-05-01
One feature that characterizes the organization and delivery of health care is its inherent complexity. All too often, with so much information and so many activities involved, it is difficult for decision-makers to determine in an objective fashion an appropriate course of action. It would appear that a holistic rather than a reductionist approach would be advantageous. The aim of this paper is to review how formal systems thinking can aid decision-making in complex situations. Consideration is given as to how the use of a number of systems modelling methodologies can help in gaining an understanding of a complex decision situation. This in turn can enhance the possibility of a decision being made in a more rational, explicit and transparent fashion. The arguments and approaches are illustrated using examples taken from the public health arena.
Human-Automation Interaction Design for Adaptive Cruise Control Systems of Ground Vehicles.
Eom, Hwisoo; Lee, Sang Hun
2015-06-12
A majority of recently developed advanced vehicles have been equipped with various automated driver assistance systems, such as adaptive cruise control (ACC) and lane keeping assistance systems. ACC systems have several operational modes, and drivers can be unaware of the mode in which they are operating. Because mode confusion is a significant human error factor that contributes to traffic accidents, it is necessary to develop user interfaces for ACC systems that can reduce mode confusion. To meet this requirement, this paper presents a new human-automation interaction design methodology in which the compatibility of the machine and interface models is determined using the proposed criteria, and if the models are incompatible, one or both of the models is/are modified to make them compatible. To investigate the effectiveness of our methodology, we designed two new interfaces by separately modifying the machine model and the interface model and then performed driver-in-the-loop experiments. The results showed that modifying the machine model provides a more compact, acceptable, effective, and safe interface than modifying the interface model.
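One way to read "compatibility of the machine and interface models" is that the interface's mode transitions must commute with an abstraction of the machine's states: whatever mode the driver's mental/interface model predicts after an event must agree with the abstraction of the machine's actual next state. The sketch below is an assumed formalisation, not the authors' criteria; the states, events, and homomorphism check are illustrative.

```python
def is_compatible(machine, interface, abstraction):
    """Check that the interface model is a consistent abstraction of the
    machine model. Both models map (state, event) -> next state; abstraction
    maps machine states to interface states. Compatible iff, for every
    machine transition, interface[(abstraction[s], event)] exists and
    equals abstraction[next_state] -- otherwise mode confusion is possible."""
    for (s, event), t in machine.items():
        key = (abstraction[s], event)
        if key not in interface or interface[key] != abstraction[t]:
            return False
    return True
```

A failed check pinpoints the transition on which the display would mislead the driver, which is where the paper's methodology modifies the machine or interface model to restore compatibility.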
Intelligent Case Based Decision Support System for Online Diagnosis of Automated Production System
NASA Astrophysics Data System (ADS)
Ben Rabah, N.; Saddem, R.; Ben Hmida, F.; Carre-Menetrier, V.; Tagina, M.
2017-01-01
Diagnosis of Automated Production System (APS) is a decision-making process designed to detect, locate and identify a particular failure caused by the control law. In the literature, there are three major types of reasoning for industrial diagnosis: the first is model-based, the second is rule-based and the third is case-based. The common and major limitation of the first two approaches is that they lack automated learning ability. This paper presents an interactive and effective Case Based Decision Support System for online Diagnosis (CB-DSSD) of an APS. It offers a synergy between Case Based Reasoning (CBR) and Decision Support Systems (DSS) in order to support and assist the Human Operator of Supervision (HOS) in his/her decision process. The experimental evaluation, performed on an Interactive Training System for PLC (ITS PLC) that allows the control of a Programmable Logic Controller (PLC), simulating sensor and/or actuator failures and validating the control algorithm through a real-time interactive experience, showed the efficiency of our approach.
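The retrieval step of case-based reasoning is commonly a nearest-neighbour search over symptom sets. A hedged sketch using Jaccard similarity and a dict-based case base; the field names, similarity measure, and example faults are assumptions, not taken from the CB-DSSD implementation:

```python
def retrieve(case_base, observed_symptoms, k=1):
    """Return the k most similar past diagnostic cases by Jaccard overlap
    between each case's symptom set and the observed symptoms.

    case_base: list of dicts with 'symptoms' (a set) and 'fault' fields
    (field names are assumptions for illustration)."""
    def jaccard(a, b):
        a, b = set(a), set(b)
        return len(a & b) / len(a | b) if (a | b) else 1.0
    ranked = sorted(case_base,
                    key=lambda c: jaccard(c["symptoms"], observed_symptoms),
                    reverse=True)
    return ranked[:k]
```

Retrieved cases are then adapted and, once validated by the human operator, stored back, which is the "automated learning ability" the abstract contrasts with model- and rule-based reasoning.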
2002-09-01
sub-goal can lead to achieving different goals (e.g., automation of on-line order processing may lead to both reducing the storage cost and reducing... equipment; introduce new technology; find cheaper supplier; sign a contract; introduce cheaper materials; set up and automate on-line order processing; integrate order processing with inventory and shipping; set up company's website; freight consolidation; just-in-time versus pre-planned balance
2018-01-01
collected data. These statistical techniques fall under descriptive statistics, a methodology to condense the data in quantitative... ARL-TR-8270 ● JAN 2018 ● US Army Research Laboratory ● An Automated Energy Detection Algorithm Based on Morphological Filter...
Decision Making In A High-Tech World: Automation Bias and Countermeasures
NASA Technical Reports Server (NTRS)
Mosier, Kathleen L.; Skitka, Linda J.; Burdick, Mark R.; Heers, Susan T.; Rosekind, Mark R. (Technical Monitor)
1996-01-01
Automated decision aids and decision support systems have become essential tools in many high-tech environments. In aviation, for example, flight management system computers not only fly the aircraft, but also calculate fuel-efficient paths, detect and diagnose system malfunctions and abnormalities, and recommend or carry out decisions. Air Traffic Controllers will soon be utilizing decision support tools to help them predict and detect potential conflicts and to generate clearances. Other fields as disparate as nuclear power plants and medical diagnostics are similarly becoming more and more automated. Ideally, the combination of human decision maker and automated decision aid should result in a high-performing team, maximizing the advantages of additional cognitive and observational power in the decision-making process. In reality, however, the presence of these aids often short-circuits the way that even very experienced decision makers have traditionally handled tasks and made decisions, and introduces opportunities for new decision heuristics and biases. Results of recent research investigating the use of automated aids have indicated the presence of automation bias, that is, errors made when decision makers rely on automated cues as a heuristic replacement for vigilant information seeking and processing. Automation commission errors, i.e., errors made when decision makers inappropriately follow an automated directive, or automation omission errors, i.e., errors made when humans fail to take action or notice a problem because an automated aid fails to inform them, can result from this tendency. Evidence of the tendency to make automation-related omission and commission errors has been found in pilot self-reports, in studies using pilots in flight simulations, and in non-flight decision-making contexts with student samples.
Considerable research has found that increasing social accountability can successfully ameliorate a broad array of cognitive biases and resultant errors. To what extent these effects generalize to performance situations is not yet empirically established. The two studies to be presented represent concurrent efforts, with student and professional pilot samples, to determine the effects of accountability pressures on automation bias and on the verification of the accurate functioning of automated aids. Students (Experiment 1) and commercial pilots (Experiment 2) performed simulated flight tasks using automated aids. In both studies, participants who perceived themselves as accountable for their strategies of interaction with the automation were significantly more likely to verify its correctness, and committed significantly fewer automation-related errors than those who did not report this perception.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
This technical note describes the current capabilities and availability of the Automated Dredging and Disposal Alternatives Management System (ADDAMS). The technical note replaces the earlier Technical Note EEDP-06-12, which should be discarded. Planning, design, and management of dredging and dredged material disposal projects often require complex or tedious calculations or involve complex decision-making criteria. In addition, the evaluations often must be done for several disposal alternatives or disposal sites. ADDAMS is a personal computer (PC)-based system developed to assist in making such evaluations in a timely manner. ADDAMS contains a collection of computer programs (applications) designed to assist in managing dredging projects. This technical note describes the system, currently available applications, mechanisms for acquiring and running the system, and provisions for revision and expansion.
Brodney, Marian D; Brosius, Arthur D; Gregory, Tracy; Heck, Steven D; Klug-McLeod, Jacquelyn L; Poss, Christopher S
2009-12-01
Advances in the field of drug discovery have brought an explosion in the quantity of data available to medicinal chemists and other project team members. New strategies and systems are needed to help these scientists to efficiently gather, organize, analyze, annotate, and share data about potential new drug molecules of interest to their project teams. Herein we describe a suite of integrated services and end-user applications that facilitate these activities throughout the medicinal chemistry design cycle. The Automated Data Presentation (ADP) and Virtual Compound Profiler (VCP) processes automate the gathering, organization, and storage of real and virtual molecules, respectively, and associated data. The Project-Focused Activity and Knowledge Tracker (PFAKT) provides a unified data analysis and collaboration environment, enhancing decision-making, improving team communication, and increasing efficiency.
Two-Graph Building Interior Representation for Emergency Response Applications
NASA Astrophysics Data System (ADS)
Boguslawski, P.; Mahdjoubi, L.; Zverovich, V.; Fadli, F.
2016-06-01
Nowadays, in a rapidly developing urban environment with bigger and higher public buildings, disasters causing emergency situations and casualties are unavoidable. Preparedness and quick response are crucial to saving human lives. Available information about an emergency scene, such as the building structure, supports decision-making and the organization of rescue operations. Models supporting decision-making should be available in real, or near-real, time. Thus, good quality models that allow implementation of automated methods are highly desirable. This paper presents details of the recently developed method for automated generation of variable density navigable networks in a 3D indoor environment, including a full 3D topological model, which may be used not only for standard navigation but also for finding safe routes and simulating hazards and phenomena associated with disasters such as fire spread and heat transfer.
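Once a navigable network with hazard attributes exists, safe-route finding is a weighted shortest-path query. A minimal Dijkstra sketch in which each edge's traversal cost is its length scaled by an assumed hazard factor; the weighting scheme and node names are illustrative, not the paper's model.

```python
import heapq

def safest_route(edges, start, goal):
    """Weighted shortest path over a navigable indoor network.

    edges: (node_a, node_b, length_m, hazard_factor); traversal cost is
    length * hazard, so segments affected by fire or smoke are penalised.
    Returns (path, total_cost)."""
    graph = {}
    for u, v, length, hazard in edges:
        w = length * hazard
        graph.setdefault(u, []).append((v, w))
        graph.setdefault(v, []).append((u, w))
    dist, prev, seen = {start: 0.0}, {}, set()
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen.add(u)
        if u == goal:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return list(reversed(path)), dist[goal]
```

Updating the hazard factors as a simulated fire spreads and re-running the query is one way such a model supports near-real-time evacuation planning.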
Schedule Risks Due to Delays in Advanced Technology Development
NASA Technical Reports Server (NTRS)
Reeves, John D. Jr.; Kayat, Kamal A.; Lim, Evan
2008-01-01
This paper discusses a methodology and modeling capability that probabilistically evaluates the likelihood and impacts of delays in advanced technology development prior to the start of design, development, test, and evaluation (DDT&E) of complex space systems. The challenges of understanding and modeling advanced technology development considerations are first outlined, followed by a discussion of the problem in the context of lunar surface architecture analysis. The current and planned methodologies to address the problem are then presented along with sample analyses and results. The methodology discussed herein provides decision-makers a thorough understanding of the schedule impacts resulting from the inclusion of various enabling advanced technology assumptions within system design.
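Probabilistic schedule impact of this kind is often estimated by Monte Carlo sampling of task durations. The sketch below works under stated assumptions that are not taken from the paper's model: triangular three-point duration estimates, strictly sequential technology tasks, and a baseline budget equal to the sum of most-likely durations.

```python
import random

def overrun_risk(tech_tasks, n_trials=10000, seed=1):
    """Monte Carlo schedule-risk sketch for sequential technology tasks.

    Each task is an (optimistic, most_likely, pessimistic) duration triple
    in months; samples are drawn from a triangular distribution. Returns
    (probability the total exceeds the most-likely baseline,
     80th-percentile total duration)."""
    random.seed(seed)
    budget = sum(ml for _, ml, _ in tech_tasks)
    totals = []
    for _ in range(n_trials):
        totals.append(sum(random.triangular(lo, hi, ml)
                          for lo, ml, hi in tech_tasks))
    totals.sort()
    p_overrun = sum(t > budget for t in totals) / n_trials
    return p_overrun, totals[int(0.8 * n_trials)]
```

Because technology-maturation estimates are usually right-skewed (pessimistic tails far beyond the most-likely value), even this toy model shows why a most-likely-sum baseline understates the delay risk a decision-maker actually carries.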
DOT National Transportation Integrated Search
2012-05-01
An accurate measure of crash costs is required to support effective decision-making about transportation investments. In particular, underinvestment will occur if measurement fails to capture the full cost of crashes. Such mis-measurement and underin...
Creating Business Intelligence from Course Management Systems
ERIC Educational Resources Information Center
van Dyk, Liezl; Conradie, Pieter
2007-01-01
Purpose: This article seeks to address the interface between individual learning facilitators that use course management systems (CMS) data to support decision-making and course design and institutional infrastructure providers that are responsible for institutional business intelligence. Design/methodology/approach: The design of a data warehouse…
Liaw, Siaw-Teng; Deveny, Elizabeth; Morrison, Iain; Lewis, Bryn
2006-09-01
Using a factorial vignette survey and modeling methodology, we developed clinical and information models - incorporating evidence base, key concepts, relevant terms, decision-making and workflow needed to practice safely and effectively - to guide the development of an integrated rule-based knowledge module to support prescribing decisions in asthma. We identified workflows, decision-making factors, factor use, and clinician information requirements. The Unified Modeling Language (UML) and public domain software and knowledge engineering tools (e.g. Protégé) were used, with the Australian GP Data Model as the starting point for expressing information needs. A Web Services service-oriented architecture approach was adopted within which to express functional needs, and clinical processes and workflows were expressed in the Business Process Execution Language (BPEL). This formal analysis and modeling methodology to define and capture the process and logic of prescribing best practice in a reference implementation is fundamental to tackling deficiencies in prescribing decision support software.
A Psychobiographical Study of Intuition in a Writer's Life: Paulo Coelho Revisited
Mayer, Claude-Hélène; Maree, David
2017-01-01
Intuition is defined as a form of knowledge which materialises as awareness of thoughts, feelings and physical sensations. It is a key to a deeper understanding and meaningfulness. Intuition, used as a psychological function, supports the transmission and integration of perceptions from unconscious and conscious realms. This study uses a psychobiographical single case study approach to explore intuition across the life span of Paulo Coelho. Methodologically, the study is based on a single case study, using the methodological frame of Dilthey's modern hermeneutics. The author, Paulo Coelho, was chosen as a subject of research, based on the content analysis of first- and third-person perspective documents. Findings show that Paulo Coelho, as one of the most famous and most read contemporary authors in the world, uses his intuitions as a deeper guidance in life, for decision-making and self-development. Intuitive decision-making is described throughout his life and by referring to selected creative works. PMID:28904596
Advanced automation for in-space vehicle processing
NASA Technical Reports Server (NTRS)
Sklar, Michael; Wegerif, D.
1990-01-01
The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower requirements for the required processing tasks far exceed the available manpower. Furthermore, many processing tasks are either hazardous operations or they exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and for automated machine processing.
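Ranking tasks "on the basis of automation potential" via a common set of primitive task descriptions can be sketched as a coverage score: the fraction of a task's primitives that the machine repertoire can perform. The primitive names and the scoring rule below are assumptions for illustration, not the study's actual methodology.

```python
def automation_potential(task_primitives, automatable):
    """Score each processing task by the fraction of its primitive steps
    covered by the automated repertoire; return tasks ranked best-first
    as (task, score) pairs."""
    scores = {task: sum(p in automatable for p in prims) / len(prims)
              for task, prims in task_primitives.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

Tasks scoring 1.0 are candidates for full automation; partial scores flag tasks needing mixed crew/machine allocation or redesign into different primitives.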
NASA Technical Reports Server (NTRS)
Tartt, David M.; Hewett, Marle D.; Duke, Eugene L.; Cooper, James A.; Brumbaugh, Randal W.
1989-01-01
The Automated Flight Test Management System (ATMS) is being developed as part of the NASA Aircraft Automation Program. This program focuses on the application of interdisciplinary state-of-the-art technology in artificial intelligence, control theory, and systems methodology to problems of operating and flight testing high-performance aircraft. The development of a Flight Test Engineer's Workstation (FTEWS) is presented, with a detailed description of the system, technical details, and future planned developments. The goal of the FTEWS is to provide flight test engineers and project officers with an automated computer environment for planning, scheduling, and performing flight test programs. The FTEWS system is an outgrowth of the development of ATMS and is an implementation of a component of ATMS on SUN workstations.
NASA Out-of-Autoclave Process Technology Development
NASA Technical Reports Server (NTRS)
Johnston, Norman, J.; Clinton, R. G., Jr.; McMahon, William M.
2000-01-01
Polymer matrix composites (PMCs) will play a significant role in the construction of large reusable launch vehicles (RLVs), mankind's future major access to low earth orbit and the international space station. PMCs are lightweight and offer attractive economies of scale and automated fabrication methodology. Fabrication of large RLV structures will require non-autoclave methods which have yet to be matured, including (1) thermoplastic forming: heated-head robotic tape placement, sheet extrusion, pultrusion, molding and forming; (2) electron beam curing: bulk and ply-by-ply automated placement; and (3) RTM and VARTM. Research sponsored by NASA in industrial and NASA laboratories on automated placement techniques involving the first two categories will be presented.
Research Design for an Automated Behavioral Intelligence (ABI) Project
1980-05-14
resolution of immediate short-range problems to attainment of ultimate millennial goals. In specifying... to be pursued, and of what perceptions are held by the foreign decision-makers. It should be possible through retrospective analysis to build a rich... retrospective analysis should be able to provide a relatively rich data base as to what kinds of change may occur as the result of given types of
A framework to support human factors of automation in railway intelligent infrastructure.
Dadashi, Nastaran; Wilson, John R; Golightly, David; Sharples, Sarah
2014-01-01
Technological and organisational advances have increased the potential for remote access and proactive monitoring of the infrastructure in various domains and sectors - water and sewage, oil and gas and transport. Intelligent Infrastructure (II) is an architecture that potentially enables the generation of timely and relevant information about the state of any type of infrastructure asset, providing a basis for reliable decision-making. This paper reports an exploratory study to understand the concepts and human factors associated with II in the railway, largely drawing from structured interviews with key industry decision-makers and attachment to pilot projects. Outputs from the study include a data-processing framework defining the key human factors at different levels of the data structure within a railway II system and a system-level representation. The framework and other study findings will form a basis for human factors contributions to systems design elements such as information interfaces and role specifications.
Effects of Gain/Loss Framing in Cyber Defense Decision-Making
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bos, Nathan; Paul, Celeste; Gersh, John
Cyber defense requires decision making under uncertainty. Yet this critical area has not been a strong focus of research in judgment and decision-making. Future defense systems, which will rely on software-defined networks and may employ ‘moving target’ defenses, will increasingly automate lower level detection and analysis, but will still require humans in the loop for higher level judgment. We studied the decision making process and outcomes of 17 experienced network defense professionals who worked through a set of realistic network defense scenarios. We manipulated gain versus loss framing in a cyber defense scenario, and found significant effects in one of two focal problems. Defenders that began with a network already in quarantine (gain framing) used a quarantine system more than those that did not (loss framing). We also found some difference in perceived workload and efficacy. Alternate explanations of these findings and implications for network defense are discussed.
Cunha, Edite; Pinto, Paula C A G; Saraiva, M Lúcia M F S
2015-08-15
An automated methodology is proposed for the evaluation of a set of ionic liquids (ILs) as alternative reaction media for aldolase based synthetic processes. For that, the effect of traditionally used organic solvents and ILs on the activity of aldolase was studied by means of a novel automated methodology. The implemented methodology is based on the concept of sequential injection analysis (SIA) and relies on the aldolase based cleavage of d-fructose-1,6-diphosphate (DFDP), to produce dihydroxyacetone phosphate (DHAP) and d-glyceraldehyde-3-phosphate (G3P). In the presence of FeCl3, 3-methyl-2-benzothiazolinone hydrazone (MBTH) forms a blue cation that can be measured at 670 nm, by combination with G3P. The influence of several parameters such as substrate and enzyme concentration, temperature, delay time and MBTH and FeCl3 concentration were studied and the optimum reaction conditions were subsequently selected. The developed methodology showed good precision, with a relative standard deviation (RSD) that does not exceed 7%, also leading to low reagent consumption as well as effluent production. Resorting to this strategy, the activity of the enzyme was studied in strictly aqueous media and in the presence of dimethylformamide, methanol, bmpyr [Cl], hmim [Cl], bmim [BF4], emim [BF4], emim [Ac], bmim [Cl], emim [TfMs], emim [Ms] and Chol [Ac] up to 50%. The results show that the utilization of ILs as reaction media for aldolase based organic synthesis might present potential advantages over the tested conventional organic solvents. The least toxic IL found in this study was Chol [Ac], which causes a reduction of enzyme activity of only 2.7% when used in a concentration of 50%. Generally, it can be concluded that ILs based on choline or short alkyl imidazolium moieties associated with biocompatible anions are the most promising ILs regarding the future inclusion of these solvents in synthetic protocols catalyzed by aldolase. Copyright © 2015 Elsevier B.V. All rights reserved.
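The two figures of merit this abstract reports (residual enzyme activity relative to aqueous media, and the RSD of replicate measurements) can be sketched as simple calculations; the replicate values and activities below are invented for illustration, not data from the study.

```python
import statistics

def relative_activity(activity_solvent, activity_aqueous):
    """Residual activity as a percentage of the aqueous-media value."""
    return 100.0 * activity_solvent / activity_aqueous

def rsd_percent(replicates):
    """Relative standard deviation: 100 * sample stdev / mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical activities: a 2.7% reduction corresponds to 97.3%
# residual activity, matching the figure quoted for Chol [Ac].
print(relative_activity(97.3, 100.0))

# Hypothetical replicate absorbances; the abstract's precision claim
# is that RSD stays under 7%.
print(rsd_percent([0.101, 0.103, 0.099, 0.102]) < 7.0)
```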
A methodology for the evaluation of the human-bioclimatic performance of open spaces
NASA Astrophysics Data System (ADS)
Charalampopoulos, Ioannis; Tsiros, Ioannis; Chronopoulou-Sereli, Aik.; Matzarakis, Andreas
2017-05-01
The purpose of this paper is to present a simple methodology to improve the evaluation of the human-biometeorological benefits of open spaces. It is based on two groups of new indices that use the well-known PET index as a basis. This simple methodology, along with the accompanying indices, allows a qualitative and quantitative evaluation of the climatic behavior of the selected sites. The proposed methodology was applied in a human-biometeorology research project in the city of Athens, Greece. The results of this study are in line with the results of other related studies, indicating the considerable influence of the sky view factor (SVF), the existence of vegetation and the building materials on human-biometeorological conditions. The proposed methodology may provide new insights into the decision-making process related to the best configuration of urban open spaces.
Factors influencing cancer treatment decision-making by indigenous peoples: a systematic review.
Tranberg, Rona; Alexander, Susan; Hatcher, Deborah; Mackey, Sandra; Shahid, Shaouli; Holden, Lynda; Kwok, Cannas
2016-02-01
We aim to systematically review studies that identify factors influencing cancer treatment decision-making among indigenous peoples. Following the outline suggested by the Preferred Reporting Items for Systematic Review and Meta-analysis, a rigorous systematic review and meta-synthesis were conducted of factors that influence cancer treatment decision-making by indigenous peoples. A total of 733 articles were retrieved from eight databases and a manual search. After screening the titles and abstracts, the full text of 26 articles were critically appraised, resulting in five articles that met inclusion criteria for the review. Because the five articles to be reviewed were qualitative studies, the Critical Appraisal Skills Program toolkit was used to evaluate the methodological quality. A thematic synthesis was employed to identify common themes across the studies. Multiple socio-economic and cultural factors were identified that all had the potential to influence cancer treatment decision-making by indigenous people. These factors were distilled into four themes: spiritual beliefs, cultural influences, communication and existing healthcare systems and structures. Although existing research identified multiple factors influencing decision-making, this review identified that quality studies in this domain are scarce. There is scope for further investigation, both into decision-making factors and into the subsequent design of culturally appropriate programmes and services that meet the needs of indigenous peoples. Copyright © 2015 John Wiley & Sons, Ltd.
Affective processes in human-automation interactions.
Merritt, Stephanie M
2011-08-01
This study contributes to the literature on automation reliance by illuminating the influences of user moods and emotions on reliance on automated systems. Past work has focused predominantly on cognitive and attitudinal variables, such as perceived machine reliability and trust. However, recent work on human decision making suggests that affective variables (i.e., moods and emotions) are also important. Drawing from the affect infusion model, significant effects of affect are hypothesized. Furthermore, a new affectively laden attitude termed liking is introduced. Participants watched video clips selected to induce positive or negative moods, then interacted with a fictitious automated system on an X-ray screening task. At five time points, important variables were assessed including trust, liking, perceived machine accuracy, user self-perceived accuracy, and reliance. These variables, along with propensity to trust machines and state affect, were integrated in a structural equation model. Happiness significantly increased trust and liking for the system throughout the task. Liking was the only variable that significantly predicted reliance early in the task. Trust predicted reliance later in the task, whereas perceived machine accuracy and user self-perceived accuracy had no significant direct effects on reliance at any time. Affective influences on automation reliance are demonstrated, suggesting that this decision-making process may be less rational and more emotional than previously acknowledged. Liking for a new system may be key to appropriate reliance, particularly early in the task. Positive affect can be easily induced and may be a lever for increasing liking.
Costing for the Future: Exploring Cost Estimation With Unmanned Autonomous Systems
2016-04-30
account for how cost estimating for autonomy is different than current methodologies and to suggest ways it can be addressed through the integration and... The Development stage involves refining the system requirements, creating a solution description, and building a system. 3. The Operational Test... parameter describes the extent to which efficient fabrication methodologies and processes are used, and the automation of labor-intensive operations
Shinde, V; Burke, K E; Chakravarty, A; Fleming, M; McDonald, A A; Berger, A; Ecsedy, J; Blakemore, S J; Tirrell, S M; Bowman, D
2014-01-01
Immunohistochemistry-based biomarkers are commonly used to understand target inhibition in key cancer pathways in preclinical models and clinical studies. Automated slide-scanning and advanced high-throughput image analysis software technologies have evolved into a routine methodology for quantitative analysis of immunohistochemistry-based biomarkers. Alongside the traditional pathology H-score based on physical slides, the pathology world is welcoming digital pathology and advanced quantitative image analysis, which have enabled tissue- and cellular-level analysis. An automated workflow was implemented that includes automated staining, slide-scanning, and image analysis methodologies to explore biomarkers involved in 2 cancer targets: Aurora A and NEDD8-activating enzyme (NAE). The 2 workflows highlight the evolution of our immunohistochemistry laboratory and the different needs and requirements of each biological assay. Skin biopsies obtained from MLN8237 (Aurora A inhibitor) phase 1 clinical trials were evaluated for mitotic and apoptotic index, while mitotic index and defects in chromosome alignment and spindles were assessed in tumor biopsies to demonstrate Aurora A inhibition. Additionally, in both preclinical xenograft models and an acute myeloid leukemia phase 1 trial of the NAE inhibitor MLN4924, development of a novel image algorithm enabled measurement of downstream pathway modulation upon NAE inhibition. In the highlighted studies, developing a biomarker strategy based on automated image analysis solutions enabled project teams to confirm target and pathway inhibition and understand downstream outcomes of target inhibition with increased throughput and quantitative accuracy. These case studies demonstrate a strategy that combines a pathologist's expertise with automated image analysis to support oncology drug discovery and development programs.
Risk and Strategic Decision-Making in Developing Evidence-Based Practice Guidelines
ERIC Educational Resources Information Center
Wilczynski, Susan M.
2012-01-01
Evidence-based practice (EBP) represents an important approach to educating and treating individuals diagnosed with disabilities or disorders. Understanding research findings is the cornerstone of EBP. The methodology of systematic reviews, which involves carefully analyzing research findings, can result in a practice guideline that recommends…
Risk Assessment and Decision-Making at the Local Level:?Who does what when, how, and to what extent?
Problem: Exposure to multiple stressors increases the likelihood of an adverse response in human and ecosystem communities. Challenge: A rigorous and scientifically robust assessment methodology is needed to characterize stressors and receptors of interest, calculate risk estimat...
This article develops and explores a methodology for using qualitative influence diagrams in environmental policy and management to support decision-making efforts that minimize risk and increase resiliency. Influence diagrams are representations of the conditional aspects of a p...
Comments on Method in Comparative Higher Education.
ERIC Educational Resources Information Center
Wasser, Henry
The methodologies employed in comparative higher education and comparative education are briefly summarized and analyzed. Weaknesses of the following approaches used by scholars/researchers in the field are identified: (1) locating decision-making structures and relations in broadly differentiated aggregations of systems; (2) case study; (3)…
Automated sizing of large structures by mixed optimization methods
NASA Technical Reports Server (NTRS)
Sobieszczanski, J.; Loendorf, D.
1973-01-01
A procedure for automating the sizing of wing-fuselage airframes was developed and implemented in the form of an operational program. The program combines fully stressed design to determine an overall material distribution with mass-strength and mathematical programming methods to design structural details accounting for realistic design constraints. The practicality and efficiency of the procedure are demonstrated for transport aircraft configurations. The methodology is sufficiently general to be applicable to other large and complex structures.
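The fully stressed design (FSD) step mentioned in this abstract is a classic resizing recursion: each member's area is scaled by the ratio of its current stress to the allowable stress, so every member is driven toward its allowable. The sketch below is illustrative only (it is not the NASA program); the loads, areas, and stress model are hypothetical.

```python
# Minimal sketch of fully stressed design (FSD) resizing, under the
# assumption of a simple stress model; not the operational program
# described in the abstract.

def fully_stressed_design(areas, stress_fn, allowable, iters=20):
    """Iteratively resize member areas until each member's stress
    approaches the allowable value (the FSD optimality criterion)."""
    for _ in range(iters):
        stresses = stress_fn(areas)
        # Classic FSD update: scale each area by its stress ratio.
        areas = [a * (s / allowable) for a, s in zip(areas, stresses)]
    return areas

# Toy statically determinate truss: stress = load / area, so the
# recursion converges immediately to load / allowable.
loads = [1000.0, 2500.0]
stress = lambda areas: [p / a for p, a in zip(loads, areas)]
sized = fully_stressed_design([1.0, 1.0], stress, allowable=500.0)
print(sized)  # each member now carries exactly the allowable stress
```

For statically indeterminate structures the stress distribution changes as areas change, so the recursion takes several iterations and is not guaranteed to reach the true minimum weight, which is why the program pairs FSD with mathematical programming for detail design.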
1988-01-24
vanes. The new facility is currently being called the Engine Blade/Vane Facility (EB/VF). There are three primary goals in automating this proc...earlier, the search led primarily into the areas of CIM Justification, Automation Strategies, Performance Measurement, and Integration issues. Of...of living, has been steadily eroding. One dangerous trend that has developed in keenly competitive world markets, says Rohan [33], has been for U.S
Translational Cognition for Decision Support in Critical Care Environments: A Review
Patel, Vimla L.; Zhang, Jiajie; Yoskowitz, Nicole A.; Green, Robert; Sayan, Osman R.
2008-01-01
The dynamic and distributed work environment in critical care requires a high level of collaboration among clinical team members and a sophisticated task coordination system to deliver safe, timely and effective care. A complex cognitive system underlies the decision-making process in such cooperative workplaces. This methodological review paper addresses the issues of translating cognitive research to clinical practice with a specific focus on decision-making in critical care, and the role of information and communication technology to aid in such decisions. Examples are drawn from studies of critical care in our own research laboratories. Critical care, in this paper, includes both intensive (inpatient) and emergency (outpatient) care. We define translational cognition as the research on basic and applied cognitive issues that contribute to our understanding of how information is stored, retrieved and used for problem-solving and decision-making. The methods and findings are discussed in the context of constraints on decision-making in real world complex environments and implications for supporting the design and evaluation of decision support tools for critical care health providers. PMID:18343731
Comparing and using assessments of the value of information to clinical decision-making.
Urquhart, C J; Hepworth, J B
1996-01-01
This paper discusses the Value project, which assessed the value to clinical decision-making of information supplied by National Health Service (NHS) library and information services. The project not only showed how health libraries in the United Kingdom help clinicians in decision-making but also provided quality assurance guidelines for these libraries to help make their information services more effective. The paper reviews methods and results used in previous studies of the value of health libraries, noting that methodological differences appear to affect the results. The paper also discusses aspects of user involvement, categories of clinical decision-making, the value of information to present and future clinical decisions, and the combination of quantitative and qualitative assessments of value, as applied to the Value project and the studies reviewed. The Value project also demonstrated that the value placed on information depends in part on the career stage of the physician. The paper outlines the structure of the quality assurance tool kit, which is based on the findings and methods used in the Value project. PMID:8913550
19 CFR 111.23 - Retention of records.
Code of Federal Regulations, 2012 CFR
2012-04-01
... bonded warehouse, records relating to the withdrawal must be retained for 5 years from the date of... consolidated location, the methodology of record maintenance, a description of any automated data processing to...
19 CFR 111.23 - Retention of records.
Code of Federal Regulations, 2010 CFR
2010-04-01
... bonded warehouse, records relating to the withdrawal must be retained for 5 years from the date of... consolidated location, the methodology of record maintenance, a description of any automated data processing to...
19 CFR 111.23 - Retention of records.
Code of Federal Regulations, 2011 CFR
2011-04-01
... bonded warehouse, records relating to the withdrawal must be retained for 5 years from the date of... consolidated location, the methodology of record maintenance, a description of any automated data processing to...
Do neonatologists limit parental decision-making authority? A Canadian perspective.
Albersheim, Susan G; Lavoie, Pascal M; Keidar, Yaron D
2010-12-01
According to the principles of family-centered care, fully informed parents and health care professionals are partners in the care of sick neonates. The aim of this study was to assess the attitudes of Canadian neonatologists towards the authority of parents to make life-and-death decisions for their babies. We interviewed 121 (74%) of the 164 practicing neonatologists in Canada (June 2004-March 2005), using scripted open-ended questions and common clinical scenarios. Data analysis employed interpretive description methodology. The main outcome measure was the intention of neonatologists to limit parental life-and-death decision-making authority, when they disagree with parental decisions. Neonatologists' self-rated respect for parental decision-making authority was 8/10. Most neonatologists thought that parents should be either primary decision-makers or part of the decision-making team. Fifty-six percent of neonatologists would limit parental decision-making authority if the parents' decision is not in the baby's "best interest". In response to common neonatal severe illness scenarios, up to 18% of neonatologists said they would limit parental decision-making, even if the chance of intact survival is very poor. For clinical scenarios with equally poor long-term outcomes, neonatologists were more likely to comply with parental wishes early in the life of a baby, particularly with documented brain injury. Canadian neonatologists espouse high regard for parental decision-making authority, but are prepared to limit parental authority if the parents' decision is not thought to be in the baby's best interest. Although neonatologists advise parents that treatment can be started at birth, and stopped later, this was only for early severe brain injury. Copyright © 2010 Elsevier Ltd. All rights reserved.
Hagbaghery, Mohsen Adib; Salsali, Mahvash; Ahmadi, Fazlolah
2004-01-01
Background Nurses' practice takes place in a context of ongoing advances in research and technology. The dynamic and uncertain nature of the health care environment requires nurses to be competent decision-makers in order to respond to clients' needs. Recently, the public and the government have criticized Iranian nurses because of poor quality of patient care. However, nurses' views and experiences on factors that affect their clinical function and clinical decision-making have rarely been studied. Methods Grounded theory methodology was used to analyze the participants' lived experiences and their viewpoints regarding the factors affecting their clinical function and clinical decision-making. Semi-structured interviews and participant observation methods were used to gather the data. Thirty-eight participants were interviewed and twelve sessions of observation were carried out. The constant comparative analysis method was used to analyze the data. Results Five main themes emerged from the data. From the participants' points of view, "feeling competent", "being self-confident", "organizational structure", "nursing education", and "being supported" were considered as important factors in effective clinical decision-making. Conclusion As participants in this research implied, being competent and self-confident are the most important personal factors influencing nurses' clinical decision-making. Also, external factors such as organizational structure, access to supportive resources and nursing education have strengthening or inhibiting effects on the nurses' decisions. Individual nurses, professional associations, schools of nursing, nurse educators, organizations that employ nurses and government all have responsibility for developing and finding strategies that facilitate nurses' effective clinical decision-making. They are responsible for identifying barriers and enhancing factors within the organizational structure that facilitate nurses' clinical decision-making.
PMID:15068484
Kwak, Seung-Jun; Yoo, Seung-Hoon; Shin, Chol-Oh
2002-02-01
Evaluating environmental impacts has become an increasingly vital part of environmental management. In the present study, a methodological procedure based on multiattribute utility theory (MAUT) has been applied to obtain a decision-maker's value index on assessment of the environmental impacts. The paper begins with an overview of MAUT. Next, we elicited strategic objectives and several important attributes, and then structured them into a hierarchy, with the aim of structuring and quantifying the basic values for the assessment. An environmental multiattribute index is constructed as a multiattribute utility function, based on value judgements provided by a decision-maker at the Korean Ministry of Environment (MOE). The implications of the results are useful for many aspects of MOE's environmental policies; identifying the strategic objectives and basic values; facilitating communication about the organization's priorities; and recognizing decision opportunities that face decision-makers of Korea.
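The MAUT procedure this abstract describes ultimately produces an additive multiattribute utility index: a weighted sum of single-attribute utilities, with weights elicited from the decision-maker. The sketch below shows that additive form; the attribute names, weights, and utility values are invented for illustration and are not the MOE's elicited values.

```python
# Hedged sketch of an additive multiattribute utility index in the
# spirit of MAUT; weights and single-attribute utilities are
# hypothetical, not values from the study.

def multiattribute_utility(weights, utilities):
    """Additive MAUT form: U(x) = sum_i k_i * u_i(x_i).
    Weights k_i are nonnegative and sum to 1; each u_i is in [0, 1]."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(k * u for k, u in zip(weights, utilities))

# Two hypothetical policy alternatives scored on three illustrative
# attributes (e.g. air quality, water quality, ecosystem impact).
weights = [0.5, 0.3, 0.2]
alt_a = [0.8, 0.6, 0.4]
alt_b = [0.5, 0.9, 0.8]
print(round(multiattribute_utility(weights, alt_a), 2))  # 0.66
print(round(multiattribute_utility(weights, alt_b), 2))  # 0.68
```

The additive form is only valid when the decision-maker's preferences satisfy mutual utility independence across attributes, which is part of what the elicitation with the MOE decision-maker establishes.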
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heo, Yeonsook; Augenbroe, Godfried; Graziano, Diane
2015-05-01
The increasing interest in retrofitting of existing buildings is motivated by the need to make a major contribution to enhancing building energy efficiency and reducing energy consumption and CO2 emission by the built environment. This paper examines the relevance of calibration in model-based analysis to support decision-making for energy and carbon efficiency retrofits of individual buildings and portfolios of buildings. The authors formulate a set of real retrofit decision-making situations and evaluate the role of calibration by using a case study that compares predictions and decisions from an uncalibrated model with those of a calibrated model. The case study illustrates both the mechanics and outcomes of a practical alternative to the expert- and time-intense application of dynamic energy simulation models for large-scale retrofit decision-making under uncertainty.
NASA Astrophysics Data System (ADS)
Miller, T. N.; Brumbaugh, E. J.; Barker, M.; Ly, V.; Schick, R.; Rogers, L.
2015-12-01
The NASA DEVELOP National Program conducts over eighty Earth science projects every year. Each project applies NASA Earth observations to impact decision-making related to a local or regional community concern. Small, interdisciplinary teams create a methodology to address the specific issue, and then pass on the results to partner organizations, as well as providing them with instruction to continue using remote sensing for future decisions. Many different methods are used by individual teams, and the program as a whole, to communicate results and research accomplishments to decision-makers, stakeholders, alumni, and the general public. These methods vary in scope from formal publications to more informal venues, such as social media. This presentation will highlight the communication techniques used by the DEVELOP program. Audiences, strategies, and outlets will be discussed, including a newsletter, microjournal, video contest, and several others.
Information security system quality assessment through the intelligent tools
NASA Astrophysics Data System (ADS)
Trapeznikov, E. V.
2018-04-01
Developments in technology have shown the necessity of comprehensive information security analysis for automated systems. Analysis of the subject area indicates the relevance of this study. The research objective is to develop a methodology for assessing information security system quality based on intelligent tools. The basis of the methodology is a model that assesses the information security of an information system through a neural network. The paper presents the security assessment model and its algorithm. The practical implementation of the methodology is represented in the form of a software flow diagram. The conclusions note the practical significance of the model being developed.
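A neural-network quality score of the kind this abstract describes can be sketched, at its smallest, as a single sigmoid neuron over normalized security indicators. Everything below (the indicators, the weights, the bias) is invented for illustration; a real model would learn these from assessment data.

```python
import math

# Minimal single-neuron sketch of a neural security-quality score,
# under the assumption of three normalized indicators; not the
# model from the paper.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def security_score(indicators, weights, bias):
    """Weighted security indicators -> quality score in (0, 1)."""
    z = sum(w * x for w, x in zip(weights, indicators)) + bias
    return sigmoid(z)

# Hypothetical indicators in [0, 1]: patch currency, access-control
# coverage, audit coverage. Weights and bias are illustrative.
score = security_score([0.9, 0.7, 0.5], weights=[2.0, 1.5, 1.0], bias=-2.0)
print(round(score, 3))
```

A multi-layer network would compose several such neurons, but the interpretation is the same: the output is a continuous quality assessment that can feed a decision threshold.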
Public attitudes and values in priority setting.
Peacock, Stuart J
2015-01-01
There is growing recognition that critical decisions concerning investments in new health care technologies and services should incorporate society's values along with the scientific evidence. From a normative perspective, public engagement can help realize the democratic ideals of legitimacy, transparency, and accountability. On a more pragmatic level, public engagement can help stakeholders understand the degree of popular support for policy options, and may enhance public trust in decision-making processes. To better understand public attitudes and values relating to priority setting in health care, researchers and decision-makers will have to employ a range of quantitative and qualitative approaches, drawing on different disciplines and methodological traditions.
Cancer Detection Using Neural Computing Methodology
NASA Technical Reports Server (NTRS)
Toomarian, Nikzad; Kohen, Hamid S.; Bearman, Gregory H.; Seligson, David B.
2001-01-01
This paper describes a novel learning methodology used to analyze bio-materials. The premise of this research is to help pathologists quickly identify anomalous cells in a cost-efficient manner. Skilled pathologists must methodically, efficiently and carefully analyze histopathologic materials manually for the presence, amount and degree of malignancy and/or other disease states. The prolonged attention required to accomplish this task induces fatigue that may result in a higher rate of diagnostic errors. In addition, automated image analysis systems to date lack a sufficiently intelligent means of identifying even the most general regions of interest in tissue-based studies, and this shortfall greatly limits their utility. An intelligent data understanding system that could quickly and accurately identify diseased tissues and/or could choose regions of interest would be expected to increase the accuracy of diagnosis and usher in truly automated tissue-based image analysis.
Application of Human-Autonomy Teaming (HAT) Patterns to Reduced Crew Operations (RCO)
NASA Technical Reports Server (NTRS)
Shively, R. Jay; Brandt, Summer L.; Lachter, Joel; Matessa, Mike; Sadler, Garrett; Battiste, Henri
2016-01-01
As part of the Air Force - NASA Bi-Annual Research Council Meeting, slides will be presented on recent Reduced Crew Operations (RCO) work. Unmanned aerial systems, robotics, advanced cockpits, and air traffic management are all examples of domains that are seeing dramatic increases in automation. While automation may take on some tasks previously performed by humans, humans will still be required, for the foreseeable future, to remain in the system. The collaboration with humans and these increasingly autonomous systems will begin to resemble cooperation between teammates, rather than simple task allocation. It is critical to understand this human-autonomy teaming (HAT) to optimize these systems in the future. One methodology to understand HAT is by identifying recurring patterns of HAT that have similar characteristics and solutions. A methodology for identifying HAT patterns to an advanced cockpit project is discussed.
Cultivating Peace through Design Thinking: Problem Solving with PAST Foundation
ERIC Educational Resources Information Center
Deaner, Kat; McCreery-Kellert, Heather
2018-01-01
Design thinking is a methodology that emphasizes reasoning and decision-making as part of the problem-solving process. It is a structured framework for identifying challenges, gathering information, generating potential solutions, refining ideas, and testing solutions. Design thinking offers valuable skills that will serve students well as they…
The role of reporting standards in producing robust literature reviews
NASA Astrophysics Data System (ADS)
Haddaway, Neal Robert; Macura, Biljana
2018-06-01
Literature reviews can help to inform decision-making, yet they may be subject to fatal bias if not conducted rigorously as 'systematic reviews'. Reporting standards help authors to provide sufficient methodological detail to allow verification and replication, clarifying when key steps, such as critical appraisal, have been omitted.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-10
... final recommendation documents, and participating in workgroups on specific topics or methods. A... literature and in the methods of evidence review; 2. Understanding and experience in the application of... based on their expertise in methodological issues such as medical decisionmaking, clinical epidemiology...
A methodology for decisionmaking in project evaluation in land management planning
A. Weintraub
1978-01-01
In order to evaluate alternative plans, wildland management planners must consider many objectives, such as timber production, recreational use, and community stability. The method presented utilizes the type of qualitative and intuitive information widely available to wildland management planners, and structures this information into a format suitable for...
USDA-ARS?s Scientific Manuscript database
According to the U.S. National Environmental Policy Act of 1969 (NEPA), federal action to manipulate habitat for species conservation requires an environmental impact statement (EIS), which should integrate natural and social sciences in planning and decision-making. Nonetheless, most impact assessm...
Children's Spirit: Leadership Standards and Chief School Executives
ERIC Educational Resources Information Center
Boske, Christa
2009-01-01
Purpose: The purpose of this study is to increase awareness of the interactions among school leadership standards, cultural competence, and decision-making practices for chief school executives. Design/methodology/approach: To achieve this objective, 1,087 chief school executives, who were members of the American Association of School…
Saving the Lost Boys: Narratives of Discipline Disproportionality
ERIC Educational Resources Information Center
Gray, Mariama Smith
2016-01-01
In this article, I explore how discriminatory adult practices disproportionately involve Latino boys in the juvenile justice system. I use the critical methodologies of critical ethnography, critical discourse analysis and Critical Race Theory (CRT) to provide a race-centered analysis of decision-making in student discipline. My findings reveal…
Third Sector Involvement in Public Education: The Israeli Case
ERIC Educational Resources Information Center
Berkovich, Izhak; Foldes, Vincent Jonathan
2012-01-01
Purpose: The purpose of this article is to address the involvement of third sector organizations in state public education in Israel, with emphasis on the decision-making processes affecting the geographic distribution of service provision. Design/methodology/approach: A collective case study approach was used to investigate non-governmental…
How Do Raters Judge Spoken Vocabulary?
ERIC Educational Resources Information Center
Li, Hui
2016-01-01
The aim of the study was to investigate how raters come to their decisions when judging spoken vocabulary. Segmental rating was introduced to quantify raters' decision-making process. It is hoped that this simulated study brings fresh insight to future methodological considerations with spoken data. Twenty trainee raters assessed five Chinese…
Post-Secondary Enrolment Forecasting with Traditional and Cross Pressure-Impact Methodologies.
ERIC Educational Resources Information Center
Hoffman, Bernard B.
A model for forecasting postsecondary enrollment, the PDEM-1, is considered, which combines the traditional with a cross-pressure impact decision-making model. The model is considered in relation to its background, assumptions, survey instrument, model conception, applicability to educational environments, and implementation difficulties. The…
Multiple Criteria Decision-Making Techniques in Higher Education
ERIC Educational Resources Information Center
Ho, William; Dey, Prasanta K.; Higson, Helen E.
2006-01-01
Purpose: The purpose of this paper is to review the literature which focuses on four major higher education decision problems. These are: resource allocation; performance measurement; budgeting; and scheduling. Design/methodology/approach: Related articles appearing in the international journals from 1996 to 2005 are gathered and analyzed so that…
ERIC Educational Resources Information Center
DiMeo, Michelle A.; Moore, G. Kurt; Lichtenstein, Carolyn
2012-01-01
Evidence-based treatments (EBTs) are "interventions" that have been proven effective through rigorous research methodologies. Evidence-based practice (EBP), however, refers to a "decision-making process" that integrates the best available research, clinician expertise, and client characteristics. This study examined community mental health service…
Systematic, Cooperative Evaluation.
ERIC Educational Resources Information Center
Nassif, Paula M.
Evaluation procedures based on a systematic evaluation methodology, decision-maker validity, new measurement and design techniques, low cost, and a high level of cooperation on the part of the school staff were used in the assessment of a public school mathematics program for grades 3-8. The mathematics curriculum was organized into Spirals which…
Assessing Financial Education Methods: Principles vs. Rules-of-Thumb Approaches
ERIC Educational Resources Information Center
Skimmyhorn, William L.; Davies, Evan R.; Mun, David; Mitchell, Brian
2016-01-01
Despite thousands of programs and tremendous public and private interest in improving financial decision-making, little is known about how best to teach financial education. Using an experimental approach, the authors estimated the effects of two different education methodologies (principles-based and rules-of-thumb) on the knowledge,…
Reclaiming "Sense" from "Cents" in Accounting Education
ERIC Educational Resources Information Center
Dellaportas, Steven
2015-01-01
This essay adopts an interpretive methodology of relevant literature to explore the limitations of accounting education when it is taught purely as a technical practice. The essay proceeds from the assumption that conventional accounting education is captured by a positivistic neo-classical model of decision-making that draws on economic rationale…
Burden of Disease Study and Priority Setting in Korea: an Ethical Perspective
2016-01-01
When thinking about priority setting in access to healthcare resources, decision-making requires that cost-effectiveness is balanced against medical ethics. The burden of disease has emerged as an important approach to the assessment of health needs for political decision-making. However, the disability adjusted life years approach hides conceptual and methodological issues regarding the claims and value of disabled people. In this article, we discuss ethical issues that are raised as a consequence of the introduction of evidence-based health policy, such as economic evidence, in establishing resource allocation priorities. In terms of ethical values in health priority setting in Korea, there is no reliable rationale for the judgment used in decision-making as well as for setting separate and distinct priorities for different government bodies. An important question, therefore, is which ethical values guiding the practice of decision-making should be reconciled with the economic evidence found in Korean healthcare. The health technology assessment core model from the European network for Health Technology Assessment (EUnetHTA) project is a good example of incorporating ethical values into decision-making. We suggest that a fair distribution of scarce healthcare resources in South Korea can be achieved by considering the ethical aspects of healthcare. PMID:27775247
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haugen, G.R.; Bystroff, R.I.; Downey, R.M.
1975-09-01
In the area of automation and instrumentation, progress in the following studies is reported: computer automation of the Cary model 17I spectrophotometer; a new concept for monitoring the concentration of water in gases; on-line gas analysis for a gas circulation experiment; and a count-rate-discriminator technique for measuring grain-boundary composition. In the area of analytical methodology and measurements, progress is reported in the following studies: separation of molecular species by radiation pressure; study of the vaporization of U(thd)₄ (thd = 2,2,6,6-tetramethylheptane-3,5-dione); study of the vaporization of U(C₈H₈)₂; determination of ethylenic unsaturation in polyimide resins; and semimicrodetermination of hydroxyl and amino groups with pyromellitic dianhydride (PMDA). (JGB)
Analysis of technical university information system
NASA Astrophysics Data System (ADS)
Savelyev, N. A.; Boyarkin, M. A.
2018-05-01
The paper covers the set and interaction of the existing automated control systems at the federal state budgetary educational institution of higher professional education "Industrial University of Tyumen". The structural interaction of the existing systems and their functions is analyzed, which serves as a basis for identifying a number of system-wide and local (module-specific) drawbacks in the automation of the university's activities. The authors suggest a new structure for the automated control system, consisting of three major subsystems: management support; training and methodology support; and distance and supplementary education support. Functionality for each subsystem is defined in accordance with the educational institution's automation requirements. The suggested structure of the ACS will address the challenges facing the university during reorganization and optimization of its institutional management processes.
Aquino, Arturo; Gegundez-Arias, Manuel Emilio; Marin, Diego
2010-11-01
Optic disc (OD) detection is an important step in developing systems for automated diagnosis of various serious ophthalmic pathologies. This paper presents a new template-based methodology for segmenting the OD from digital retinal images. This methodology uses morphological and edge detection techniques followed by the Circular Hough Transform to obtain a circular OD boundary approximation. It requires a pixel located within the OD as initial information. For this purpose, a location methodology based on a voting-type algorithm is also proposed. The algorithms were evaluated on the 1200 images of the publicly available MESSIDOR database. The location procedure succeeded in 99% of cases, taking an average computational time of 1.67 s. with a standard deviation of 0.14 s. On the other hand, the segmentation algorithm rendered an average common area overlapping between automated segmentations and true OD regions of 86%. The average computational time was 5.69 s with a standard deviation of 0.54 s. Moreover, a discussion on advantages and disadvantages of the models more generally used for OD segmentation is also presented in this paper.
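The Circular Hough Transform step described above can be illustrated with a minimal, self-contained sketch (pure NumPy on a synthetic edge map; the function name, parameters, and test image are illustrative, not the authors' implementation): each edge pixel votes for every candidate circle centre at each candidate radius, and the accumulator peak gives the circular boundary approximation.

```python
import numpy as np

def circular_hough(edges, radii):
    """Minimal circular Hough transform: each edge pixel votes for all
    candidate centres at each radius; the accumulator peak gives the
    best (row, col, radius) circle."""
    h, w = edges.shape
    acc = np.zeros((len(radii), h, w), dtype=np.int32)
    ys, xs = np.nonzero(edges)
    thetas = np.linspace(0, 2 * np.pi, 100, endpoint=False)
    for ri, r in enumerate(radii):
        # Candidate centres lie on a circle of radius r around each edge pixel.
        cy = (ys[:, None] - r * np.sin(thetas)).round().astype(int)
        cx = (xs[:, None] - r * np.cos(thetas)).round().astype(int)
        ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
        np.add.at(acc[ri], (cy[ok], cx[ok]), 1)
    ri, y, x = np.unravel_index(acc.argmax(), acc.shape)
    return y, x, radii[ri]

# Synthetic "edge map": a circle of radius 20 centred near (50, 60).
edges = np.zeros((100, 120), dtype=bool)
t = np.linspace(0, 2 * np.pi, 400)
edges[(50 + 20 * np.sin(t)).astype(int), (60 + 20 * np.cos(t)).astype(int)] = True

print(circular_hough(edges, radii=[15, 20, 25]))
```

In a real pipeline the edge map would come from the morphological and edge-detection preprocessing the abstract describes, and the search would be restricted to a window around the located OD pixel.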
Decision making technical support study for the US Army's Chemical Stockpile Disposal Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feldman, D.L.; Dobson, J.E.
1990-08-01
This report examines the adequacy of current command and control systems designed to make timely decisions that would enable sufficient warning and protective response to an accident at the Edgewood area of Aberdeen Proving Ground (APG), Maryland, and at Pine Bluff Arsenal (PBA), Arkansas. Institutional procedures designed to facilitate rapid accident assessment, characterization, warning, notification, and response after the onset of an emergency and computer-assisted decision-making aids designed to provide salient information to on- and off-post emergency responders are examined. The character of emergency decision making at APG and PBA, as well as potential needs for improvements to decision-making practices, procedures, and automated decision-support systems (ADSSs), are described and recommendations are offered to guide equipment acquisition and improve on- and off-post command and control relationships. We recommend that (1) a continued effort be made to integrate on- and off-post command, control, and decision-making procedures to permit rapid decision making; (2) the pathways for alert and notification among on- and off-post officials be improved and that responsibilities and chain of command among off-post agencies be clarified; (3) greater attention be given to organizational and social context factors that affect the adequacy of response and the likelihood that decision-making systems will work as intended; and (4) faster improvements be made to on-post ADSSs being developed at APG and PBA, which hold considerable promise for depicting vast amounts of information. Phased development and procurement of computer-assisted decision-making tools should be undertaken to balance immediate needs against available resources and to ensure flexibility, equity among sites, and compatibility among on- and off-post systems. 112 refs., 6 tabs.
The Structural Consequences of Big Data-Driven Education.
Zeide, Elana
2017-06-01
Educators and commenters who evaluate big data-driven learning environments focus on specific questions: whether automated education platforms improve learning outcomes, invade student privacy, and promote equality. This article puts aside separate unresolved, and perhaps unresolvable, issues regarding the concrete effects of specific technologies. It instead examines how big data-driven tools alter the structure of schools' pedagogical decision-making, and, in doing so, change fundamental aspects of America's education enterprise. Technological mediation and data-driven decision-making have a particularly significant impact in learning environments because the education process primarily consists of dynamic information exchange. In this overview, I highlight three significant structural shifts that accompany school reliance on data-driven instructional platforms that perform core school functions: teaching, assessment, and credentialing. First, virtual learning environments create information technology infrastructures featuring constant data collection, continuous algorithmic assessment, and possibly infinite record retention. This undermines the traditional intellectual privacy and safety of classrooms. Second, these systems displace pedagogical decision-making from educators serving public interests to private, often for-profit, technology providers. They constrain teachers' academic autonomy, obscure student evaluation, and reduce parents' and students' ability to participate in or challenge education decision-making. Third, big data-driven tools define what "counts" as education by mapping the concepts, creating the content, determining the metrics, and setting desired learning outcomes of instruction. These shifts cede important decision-making to private entities without public scrutiny or pedagogical examination. In contrast to the public and heated debates that accompany textbook choices, schools often adopt education technologies ad hoc.
Given education's crucial impact on individual and collective success, educators and policymakers must consider the implications of data-driven education proactively and explicitly.
Automated measurement of office, home and ambulatory blood pressure in atrial fibrillation.
Kollias, Anastasios; Stergiou, George S
2014-01-01
1. Hypertension and atrial fibrillation (AF) often coexist and are strong risk factors for stroke. Current guidelines for blood pressure (BP) measurement in AF recommend repeated measurements using the auscultatory method, whereas the accuracy of the automated devices is regarded as questionable. This review presents the current evidence on the feasibility and accuracy of automated BP measurement in the presence of AF and the potential for automated detection of undiagnosed AF during such measurements. 2. Studies evaluating the use of automated BP monitors in AF are limited and have significant heterogeneity in methodology and protocols. Overall, the oscillometric method is feasible for static (office or home) and ambulatory use and appears to be more accurate for systolic than diastolic BP measurement. 3. Given that systolic hypertension is particularly common and important in the elderly, the automated BP measurement method may be acceptable for self-home and ambulatory monitoring, but not for professional office or clinic measurement. 4. An embedded algorithm for the detection of asymptomatic AF during routine automated BP measurement with high diagnostic accuracy has been developed and appears to be a useful screening tool for elderly hypertensives. © 2013 Wiley Publishing Asia Pty Ltd.
NASA Technical Reports Server (NTRS)
Clement, Warren F.; Gorder, Pater J.; Jewell, Wayne F.; Coppenbarger, Richard
1990-01-01
Developing a single-pilot all-weather NOE capability requires fully automatic NOE navigation and flight control. Innovative guidance and control concepts are being investigated to (1) organize the onboard computer-based storage and real-time updating of NOE terrain profiles and obstacles; (2) define a class of automatic anticipative pursuit guidance algorithms to follow the vertical, lateral, and longitudinal guidance commands; (3) automate a decision-making process for unexpected obstacle avoidance; and (4) provide several rapid response maneuvers. Acquired knowledge from the sensed environment is correlated with the recorded environment which is then used to determine an appropriate evasive maneuver if a nonconformity is observed. This research effort has been evaluated in both fixed-base and moving-base real-time piloted simulations thereby evaluating pilot acceptance of the automated concepts, supervisory override, manual operation, and reengagement of the automatic system.
Information in medical decision making: how consistent is our management?
Lorence, Daniel P; Spink, Amanda; Jameson, Robert
2002-01-01
The use of outcomes data in clinical environments requires a correspondingly greater variety of information used in decision making, the measurement of quality, and clinical performance. As information becomes integral in the decision-making process, trustworthy decision support data are required. Using data from a national census of certified health information managers, variation in automated data quality management practices was examined. Relatively low overall adoption of automated data management exists in health care organizations, with significant geographic and practice setting variation. Nonuniform regional adoption of computerized data management exists, despite national mandates that promote and in some cases require uniform adoption. Overall, a significant number of respondents (42.7%) indicated that they had not adopted policies and procedures to direct the timeliness of data capture, with 57.3% having adopted such practices. The inconsistency of patient data policy suggests that provider organizations do not use uniform information management methods, despite growing federal mandates to do so.
Automation in high-content flow cytometry screening.
Naumann, U; Wand, M P
2009-09-01
High-content flow cytometric screening (FC-HCS) is a 21st Century technology that combines robotic fluid handling, flow cytometric instrumentation, and bioinformatics software, so that relatively large numbers of flow cytometric samples can be processed and analysed in a short period of time. We revisit a recent application of FC-HCS to the problem of cellular signature definition for acute graft-versus-host-disease. Our focus is on automation of the data processing steps using recent advances in statistical methodology. We demonstrate that effective results, on par with those obtained via manual processing, can be achieved using our automatic techniques. Such automation of FC-HCS has the potential to drastically improve diagnosis and biomarker identification.
Day, Molly A; Anthony, Christopher A; Bedard, Nicholas A; Glass, Natalie A; Clark, Charles R; Callaghan, John J; Noiseux, Nicolas O
2018-01-01
Automated mobile phone messaging has not been reported in total joint arthroplasty (TJA). Our purpose was to compare Press Ganey (PG) and Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) scores between TJA patients who did and did not receive perioperative automated mobile phone messages. Patients were prospectively enrolled and received messages from 1 week prior until 2 weeks after TJA. Message content included reminders, activity, and pain control. Patients answered select PG/HCAHPS questions and questions regarding their experience with the automated communication platform. Average PG/HCAHPS scores were compared with those of historical TJA patients in the 3-year window prior (control group), with significance set at P < .05. Thirty-seven consecutive patients were approached and 92% (n = 34) were enrolled. The experimental group was 47% male, with 80% of patients between 51 and 75 years of age. The experimental (n = 30) and control groups (n = 26) were similar. Patients receiving messages were more likely to have a good understanding of health responsibilities (P = .024) and to feel that the care team demonstrated shared decision-making (P = .024). Of patients enrolled, 87% felt messages helped them be more prepared for surgery, 100% felt messages kept them better informed, and 97% would participate again. TJA patients who received perioperative communication via automated mobile phone messaging had improved patient satisfaction scores postoperatively. Patients perceived this form of communication as useful and felt it kept them better informed. Automated mobile phone messaging can be an easily integrated, helpful adjunct for surgeons, healthcare systems, and case managers to communicate more effectively with patients undergoing TJA in this era of value-based care. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Ancel, Ersin; Shih, Ann T.
2014-01-01
This paper highlights the development of a model that is focused on the safety issue of increasing complexity and reliance on automation systems in transport category aircraft. Recent statistics show an increase in mishaps related to manual handling and automation errors due to pilot complacency and over-reliance on automation, loss of situational awareness, automation system failures and/or pilot deficiencies. Consequently, the aircraft can enter a state outside the flight envelope and/or air traffic safety margins which potentially can lead to loss-of-control (LOC), controlled-flight-into-terrain (CFIT), or runway excursion/confusion accidents, etc. The goal of this modeling effort is to provide NASA's Aviation Safety Program (AvSP) with a platform capable of assessing the impacts of AvSP technologies and products towards reducing the relative risk of automation related accidents and incidents. In order to do so, a generic framework, capable of mapping both latent and active causal factors leading to automation errors, is developed. Next, the framework is converted into a Bayesian Belief Network model and populated with data gathered from Subject Matter Experts (SMEs). With the insertion of technologies and products, the model provides individual and collective risk reduction acquired by technologies and methodologies developed within AvSP.
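The core of such a Bayesian Belief Network risk assessment can be sketched by direct enumeration over a toy chain (the structure, node names, and every probability below are hypothetical illustrations, not AvSP or SME data): a complacency factor raises the chance of an automation error, which in turn raises mishap risk, and inserting a mitigating technology amounts to editing a conditional probability table and re-propagating.

```python
def p_mishap(p_compl, p_err_given_compl, p_mishap_given_err, p_mishap_base):
    """Marginal mishap probability for the 3-node chain
    Complacency -> AutomationError -> Mishap, by enumeration.
    p_err_given_compl = (P(err | no complacency), P(err | complacency))."""
    p_err = (p_compl * p_err_given_compl[1]
             + (1 - p_compl) * p_err_given_compl[0])
    return p_err * p_mishap_given_err + (1 - p_err) * p_mishap_base

# Hypothetical baseline vs. a technology that halves automation-error rates.
baseline = p_mishap(0.2, (0.01, 0.10), 0.05, 0.001)
mitigated = p_mishap(0.2, (0.005, 0.05), 0.05, 0.001)
print(mitigated / baseline)  # relative risk after inserting the technology
```

The paper's actual model has many more nodes and SME-elicited tables, but the "insert a technology, recompute relative risk" mechanic is the same.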
NASA Technical Reports Server (NTRS)
Koltai, Kolina; Ho, Nhut; Masequesmay, Gina; Niedober, David; Skoog, Mark; Cacanindin, Artemio; Johnson, Walter; Lyons, Joseph
2014-01-01
This paper discusses a case study that examined the influence of cultural, organizational and automation capability upon human trust in, and reliance on, automation. In particular, this paper focuses on the design and application of an extended case study methodology, and on the foundational lessons revealed by it. Experimental test pilots involved in the research and development of the US Air Force's newly developed Automatic Ground Collision Avoidance System served as the context for this examination. An eclectic, multi-pronged approach was designed to conduct this case study, and proved effective in addressing the challenges associated with the case's politically sensitive and military environment. Key results indicate that the system design was in alignment with pilot culture and organizational mission, indicating the potential for appropriate trust development in operational pilots. Other findings include the low-vulnerability/high-risk nature of the pilot profession, automation transparency and suspicion, system reputation, and the setup of and communications among the organizations involved in the system development.
Agile based "Semi-"Automated Data ingest process : ORNL DAAC example
NASA Astrophysics Data System (ADS)
Santhana Vannan, S. K.; Beaty, T.; Cook, R. B.; Devarakonda, R.; Hook, L.; Wei, Y.; Wright, D.
2015-12-01
The ORNL DAAC archives and publishes data and information relevant to biogeochemical, ecological, and environmental processes. The data archived at the ORNL DAAC must be well formatted, self-descriptive, and documented, as well as referenced in a peer-reviewed publication. The ORNL DAAC ingest team curates diverse data sets from multiple data providers simultaneously. To streamline the ingest process, the data set submission process at the ORNL DAAC has recently been updated to use an agile process, and a semi-automated workflow system has been developed to provide a consistent data provider experience and to create a uniform data product. The goals of the semi-automated agile ingest process are to: 1. Provide the ability to track a data set from acceptance to publication; 2. Automate steps that can be automated to improve efficiencies and reduce redundancy; 3. Update legacy ingest infrastructure; 4. Provide a centralized system to manage the various aspects of ingest. This talk will cover the agile methodology, workflow, and tools developed through this system.
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, R. H.; Badger, W.; Beckman, C. S.; Beshers, G.; Hammerslag, D.; Kimball, J.; Kirslis, P. A.; Render, H.; Richards, P.; Terwilliger, R.
1984-01-01
The project to automate the management of software production systems is described. The SAGA system is a software environment that is designed to support most of the software development activities that occur in a software lifecycle. The system can be configured to support specific software development applications using given programming languages, tools, and methodologies. Meta-tools are provided to ease configuration. Several major components of the SAGA system are completed to prototype form. The construction methods are described.
Automated generation of image products for Mars Exploration Rover Mission tactical operations
NASA Technical Reports Server (NTRS)
Alexander, Doug; Zamani, Payam; Deen, Robert; Andres, Paul; Mortensen, Helen
2005-01-01
This paper will discuss, from design to implementation, the methodologies applied to MIPL's automated pipeline processing as a 'system of systems' integrated with the MER GDS. Overviews of the interconnected product generating systems will also be provided with emphasis on interdependencies, including those for a) geometric rectification of camera lens distortions, b) generation of stereo disparity, c) derivation of 3-dimensional coordinates in XYZ space, d) generation of unified terrain meshes, e) camera-to-target ranging (distance) and f) multi-image mosaicking.
NASA Astrophysics Data System (ADS)
Arakelyan, E. K.; Andryushin, A. V.; Mezin, S. V.; Kosoy, A. A.; Kalinina, Ya V.; Khokhlov, I. S.
2017-11-01
A principle is proposed for the interaction between the technological protection systems of the automated process control system (APCS) and the information security system in the event of incorrect execution of a technological protection algorithm: the correctness of each protection's operation is checked in each specific situation using the functional relationships between the monitored parameters. A methodology for assessing the economic feasibility of developing and implementing an information security system is also presented.
Medicare payment system for hospital inpatients: diagnosis-related groups.
Baker, Judith J
2002-01-01
Diagnosis-Related Groups (DRGs) are categories of patient conditions that demonstrate similar levels of hospital resources required to treat the conditions. Each inpatient discharged from an acute care hospital can be classified into one of the 506 DRGs currently utilized by the Medicare program. The Medicare DRG prospective payment methodology has been in use for almost two decades and is used by hospital managers for planning and decision-making. The viability of DRGs for future prospective payment depends on the ability to keep up with the times through updates of the current methodology.
Turner, Simon; D'Lima, Danielle; Hudson, Emma; Morris, Stephen; Sheringham, Jessica; Swart, Nick; Fulop, Naomi J
2017-12-04
A range of evidence informs decision-making on innovation in health care, including formal research findings, local data and professional opinion. However, cultural and organisational factors often prevent the translation of evidence for innovations into practice. In addition to the characteristics of evidence, it is known that processes at the individual level influence its impact on decision-making. Less is known about the ways in which processes at the professional, organisational and local system level shape evidence use and its role in decisions to adopt innovations. A systematic scoping review was used to review the health literature on innovations within acute and primary care and map processes at the professional, organisational and local system levels which influence how evidence informs decision-making on innovation. Stakeholder feedback on the themes identified was collected via focus groups to test and develop the findings. Following database and manual searches, 31 studies reporting primary qualitative data met the inclusion criteria: 24 were of sufficient methodological quality to be included in the thematic analysis. Evidence use in decision-making on innovation is influenced by multi-level processes (professional, organisational, local system) and interactions across these levels. Preferences for evidence vary by professional group and health service setting. Organisations can shape professional behaviour by requiring particular forms of evidence to inform decision-making. Pan-regional organisations shape innovation decision-making at lower levels. Political processes at all levels shape the selection and use of evidence in decision-making. The synthesis of results from primary qualitative studies found that evidence use in decision-making on innovation is influenced by processes at multiple levels. Interactions between different levels shape evidence use in decision-making (e.g. 
professional groups and organisations can use local systems to validate evidence and legitimise innovations, while local systems can tailor or frame evidence to influence activity at lower levels). Organisational leaders need to consider whether the environment in which decisions are made values diverse evidence and stakeholder perspectives. Further qualitative research on decision-making practices that highlights how and why different types of evidence come to count during decisions, and tracks the political aspects of decisions about innovation, is needed.
Assessment of Automated Measurement and Verification (M&V) Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Granderson, Jessica; Touzani, Samir; Custodio, Claudine
This report documents the application of a general statistical methodology to assess the accuracy of baseline energy models, focusing on its application to Measurement and Verification (M&V) of whole-building energy savings.
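Baseline-model accuracy in this setting is commonly summarized by CV(RMSE) and normalized mean bias error computed against withheld metered data. A minimal sketch (using the simplified n-denominator form rather than a degrees-of-freedom-adjusted one, with made-up numbers):

```python
import numpy as np

def cvrmse_nmbe(measured, predicted):
    """CV(RMSE) and normalized mean bias error (NMBE) of a baseline
    energy model, both as percentages of mean measured use
    (simplified n denominator)."""
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    resid = measured - predicted
    cvrmse = 100.0 * np.sqrt(np.mean(resid ** 2)) / measured.mean()
    nmbe = 100.0 * resid.sum() / (measured.size * measured.mean())
    return cvrmse, nmbe

# Illustrative monthly energy use (measured) vs. baseline-model predictions.
cv, nb = cvrmse_nmbe([100, 110, 90, 105], [98, 112, 91, 104])
print(cv, nb)
```

Whole-building savings are then estimated as the model's predicted baseline over the reporting period minus metered use, so these accuracy metrics bound how trustworthy the savings estimate can be.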
Automated pavement analysis in Missouri using ground penetrating radar
DOT National Transportation Integrated Search
2003-02-01
Current geotechnical procedures for monitoring the condition of roadways are time consuming and can be disruptive to traffic, often requiring extensive invasive procedures (e.g., coring). Ground penetrating radar (GPR) technology offers a methodology...
Comparison of two drug safety signals in a pharmacovigilance data mining framework.
Tubert-Bitter, Pascale; Bégaud, Bernard; Ahmed, Ismaïl
2016-04-01
Since adverse drug reactions are a major public health concern, early detection of drug safety signals has become a top priority for regulatory agencies and the pharmaceutical industry. Quantitative methods for analyzing the spontaneous reporting material recorded in pharmacovigilance databases through data mining have been proposed in recent decades and are increasingly used to flag potential safety problems. While automated data mining is motivated by the usually huge size of pharmacovigilance databases, it does not systematically produce relevant alerts. Moreover, each detected signal requires appropriate assessment that may involve investigation of the whole therapeutic class. The goal of this article is to provide a methodology for comparing two detected signals. It is nested within the automated surveillance framework in that (1) no extra information is required and (2) no simple inference about the actual risks can be extrapolated from spontaneous reporting data. We designed our methodology on the basis of two classical methods used for automated signal detection: the Bayesian Gamma Poisson Shrinker and the frequentist Proportional Reporting Ratio. A simulation study was conducted to assess the performance of both proposed methods, which were then used to compare cardiovascular signals for two HIV treatments from the French pharmacovigilance database. © The Author(s) 2012.
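The frequentist Proportional Reporting Ratio mentioned above compares how often an event is reported with the drug of interest versus with all other drugs. A minimal sketch of the PRR and its usual log-scale confidence interval follows; the 2x2 cell layout is the standard one, but the code is illustrative, not the authors' implementation:

```python
import math

def prr(a, b, c, d):
    """Proportional Reporting Ratio for a 2x2 drug-event table.
    a: reports with drug and event; b: drug, other events;
    c: other drugs, event;         d: other drugs, other events."""
    return (a / (a + b)) / (c / (c + d))

def prr_ci(a, b, c, d, z=1.96):
    """Approximate 95% confidence interval via the usual log-scale
    standard error formula."""
    se = math.sqrt(1 / a - 1 / (a + b) + 1 / c - 1 / (c + d))
    log_prr = math.log(prr(a, b, c, d))
    return math.exp(log_prr - z * se), math.exp(log_prr + z * se)
```

For example, 20 reports of an event among 100 for the drug, against 100 among 9,900 for all other drugs, yields a PRR of 19.8, the kind of disproportionality that signal-detection rules would flag for assessment.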
NASA Technical Reports Server (NTRS)
Smith, Jeffrey H.; Gyanfi, Max; Volkmer, Kent; Zimmerman, Wayne
1988-01-01
The efforts of a recent study aimed at identifying key issues and trade-offs associated with using a Flight Telerobotic Servicer (FTS) to aid in Space Station assembly-phase tasks are described. The use of automation and robotics (A&R) technologies for large space systems would involve a substitution of automation capabilities for human extravehicular or intravehicular activities (EVA, IVA). A methodology is presented that incorporates assessment of candidate assembly-phase tasks, telerobotic performance capabilities, development costs, and the effect of operational constraints (Space Transportation System (STS), attached payload, and proximity operations). Changes in the region of cost-effectiveness are examined under a variety of systems design assumptions. A discussion of issues is presented with focus on three roles the FTS might serve: (1) as a research-oriented testbed to learn more about space usage of telerobotics; (2) as a research-based testbed with an experimental demonstration orientation and limited assembly and servicing applications; or (3) as an operational system to augment EVA, aid the construction of the Space Station, and reduce programmatic (schedule) risk by increasing the flexibility of mission operations.
Evolution of Patient Decision-Making Regarding Medical Treatment of Rheumatoid Arthritis.
Mathews, Alexandra L; Coleska, Adriana; Burns, Patricia B; Chung, Kevin C
2016-03-01
The migration of health care toward a consumer-driven system favors increased patient participation during the treatment decision-making process. Patient involvement in treatment decision discussions has been linked to increased treatment adherence and patient satisfaction. Previous studies have quantified decision-making styles of patients with rheumatoid arthritis (RA); however, none of them have considered the evolution of patient involvement after living with RA for many years. We conducted a qualitative study to determine the decision-making model used by long-term RA patients, and to describe the changes in their involvement over time. Twenty participants were recruited from the ongoing Silicone Arthroplasty in Rheumatoid Arthritis study. Semistructured interviews were conducted and data were analyzed using grounded theory methodology. Nineteen out of 20 participants recalled using the paternalistic decision-making (PDM) model immediately following their diagnosis. Fourteen of the 19 participants who initially used PDM evolved to shared decision-making (SDM). Participants attributed the change in involvement to the development of a trusting relationship with their physician, as well as to becoming educated about the disease. When initially diagnosed with RA, patients may let their physician decide on the best treatment course. However, over time patients may evolve to exercise a more collaborative role. Physicians should understand that even within SDM, each patient can demonstrate a varied amount of autonomy. It is up to the physician to have a discussion with each patient to determine his or her desired level of involvement. © 2016, American College of Rheumatology.
The enactment stage of end-of-life decision-making for children.
Sullivan, Jane Elizabeth; Gillam, Lynn Heather; Monagle, Paul Terence
2018-01-11
Typically, pediatric end-of-life decision-making studies have examined the decision-making process, factors, and doctors' and parents' roles. Less attention has focussed on what happens after an end-of-life decision is made; that is, decision enactment and its outcome. This study explored the views and experiences of bereaved parents in end-of-life decision-making for their child. Findings reported relate to parents' experiences of acting on their decision. It is argued that this is one significant stage of the decision-making process. A qualitative methodology was used. Semi-structured interviews were conducted with bereaved parents, who had discussed end-of-life decisions for their child who had a life-limiting condition and who had died. Data were thematically analysed. Twenty-five bereaved parents participated. Findings indicate that, despite differences in context, including the child's condition and age, end-of-life decision-making did not end when an end-of-life decision was made. Enacting the decision was the next stage in a process. Time intervals between stages and enactment pathways varied, but the enactment was always distinguishable as a separate stage. Decision enactment involved making further decisions - parents needed to discern the appropriate time to implement their decision to withdraw or withhold life-sustaining medical treatment. Unexpected events, including other people's actions, impacted on parents enacting their decision in the way they had planned. Several parents had to re-implement decisions when their child recovered from serious health issues without medical intervention. Significance of results: A novel, critical finding was that parents experienced end-of-life decision-making as a sequence of interconnected stages, the final stage being enactment. The enactment stage involved further decision-making. End-of-life decision-making is better understood as a process rather than a discrete once-off event.
The enactment stage has particular emotional and practical implications for parents. Greater understanding of this stage can improve clinician's support for parents as they care for their child.
Using microwave Doppler radar in automated manufacturing applications
NASA Astrophysics Data System (ADS)
Smith, Gregory C.
Since the beginning of the Industrial Revolution, manufacturers worldwide have used automation to improve productivity, gain market share, and meet growing or changing consumer demand for manufactured products. To stimulate further industrial productivity, manufacturers need more advanced automation technologies: "smart" part handling systems, automated assembly machines, CNC machine tools, and industrial robots that use new sensor technologies, advanced control systems, and intelligent decision-making algorithms to "see," "hear," "feel," and "think" at the levels needed to handle complex manufacturing tasks without human intervention. The investigator's dissertation offers three methods that could help make "smart" CNC machine tools and industrial robots possible: (1) A method for detecting acoustic emission using a microwave Doppler radar detector, (2) A method for detecting tool wear on a CNC lathe using a Doppler radar detector, and (3) An online non-contact method for detecting industrial robot position errors using a microwave Doppler radar motion detector. The dissertation studies indicate that microwave Doppler radar could be quite useful in automated manufacturing applications. In particular, the methods developed may help solve two difficult problems that hinder further progress in automating manufacturing processes: (1) Automating metal-cutting operations on CNC machine tools by providing a reliable non-contact method for detecting tool wear, and (2) Fully automating robotic manufacturing tasks by providing a reliable low-cost non-contact method for detecting on-line position errors. In addition, the studies offer a general non-contact method for detecting acoustic emission that may be useful in many other manufacturing and non-manufacturing areas, as well (e.g., monitoring and nondestructively testing structures, materials, manufacturing processes, and devices). 
By advancing the state of the art in manufacturing automation, the studies may help stimulate future growth in industrial productivity, which also promises to fuel economic growth and promote economic stability. The study also benefits the Department of Industrial Technology at Iowa State University and the field of Industrial Technology by contributing to the ongoing "smart" machine research program within the Department of Industrial Technology and by stimulating research into new sensor technologies within the University and within the field of Industrial Technology.
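All three methods in the dissertation rest on the Doppler effect: a target moving at radial velocity v shifts a monostatic continuous-wave radar's carrier frequency by f_d = 2 v f0 / c. A toy calculation, assuming a common 10.525 GHz ISM-band motion-detector module (the dissertation's actual hardware is not specified here):

```python
C = 3.0e8  # speed of light in m/s (approximate)

def doppler_shift_hz(radial_velocity_mps, carrier_hz):
    """Doppler shift seen by a monostatic CW radar for a target moving
    at the given radial velocity (non-relativistic, v << c)."""
    return 2.0 * radial_velocity_mps * carrier_hz / C
```

At 10.525 GHz, a surface moving at 1 m/s produces a shift of roughly 70 Hz, comfortably within the audio band, which is what makes low-cost detection of tool vibration and robot motion plausible.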
Fully automated analysis of multi-resolution four-channel micro-array genotyping data
NASA Astrophysics Data System (ADS)
Abbaspour, Mohsen; Abugharbieh, Rafeef; Podder, Mohua; Tebbutt, Scott J.
2006-03-01
We present a fully-automated and robust microarray image analysis system for handling multi-resolution images (down to 3-micron resolution, with sizes up to 80 MB per channel). The system is developed to provide rapid and accurate data extraction for our recently developed microarray analysis and quality control tool (SNP Chart). Currently available commercial microarray image analysis applications are inefficient, due to the considerable user interaction typically required. Four-channel DNA microarray technology is a robust and accurate tool for determining the genotypes of multiple genetic markers in individuals. It plays an important role in the ongoing shift from traditional medical treatment toward personalized genetic medicine, i.e. individualized therapy based on the patient's genetic heritage. However, fast, robust, and precise image processing tools are required before microarray-based genetic testing can be used in clinical practice to predict disease susceptibilities and drug effects, since such use requires a turn-around time compatible with clinical decision-making. In this paper we have developed a fully-automated image analysis platform for the rapid investigation of hundreds of genetic variations across multiple genes. Validation tests indicate very high accuracy levels for genotyping results. Our method achieves a significant reduction in analysis time, from several hours to just a few minutes, and is completely automated, requiring no manual interaction or guidance.
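At its simplest, the genotyping step behind such a pipeline reduces to comparing allele-channel intensities for each SNP spot. The sketch below is a toy biallelic caller with invented thresholds, not the SNP Chart algorithm:

```python
def call_genotype(intensity_a, intensity_b,
                  ratio_threshold=3.0, min_signal=200.0):
    """Toy biallelic genotype call from two background-corrected
    allele-channel intensities. Thresholds are invented for illustration."""
    if max(intensity_a, intensity_b) < min_signal:
        return "no-call"  # both channels too dim to trust
    if intensity_a > ratio_threshold * intensity_b:
        return "AA"       # channel A dominates: homozygous A
    if intensity_b > ratio_threshold * intensity_a:
        return "BB"       # channel B dominates: homozygous B
    return "AB"           # comparable signal: heterozygous
```

Real callers replace these fixed thresholds with cluster-based decision boundaries estimated across many samples, which is part of why automated, accurate spot quantification matters.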
Automated Wildfire Detection Through Artificial Neural Networks
NASA Technical Reports Server (NTRS)
Miller, Jerry; Borne, Kirk; Thomas, Brian; Huang, Zhenping; Chi, Yuechen
2005-01-01
Wildfires have a profound impact upon the biosphere and our society in general. They cause loss of life, destruction of personal property and natural resources, and alter the chemistry of the atmosphere. In response to the concern over the consequences of wildland fire, and to support the fire management community, the National Oceanic and Atmospheric Administration (NOAA) National Environmental Satellite, Data and Information Service (NESDIS), located in Camp Springs, Maryland, gradually developed an operational system to routinely monitor wildland fire by satellite observations. The Hazard Mapping System, as it is known today, allows a team of trained fire analysts to examine and integrate, on a daily basis, remote sensing data from Geostationary Operational Environmental Satellite (GOES), Advanced Very High Resolution Radiometer (AVHRR), and Moderate Resolution Imaging Spectroradiometer (MODIS) satellite sensors and generate a 24-hour fire product for the conterminous United States. Although assisted by automated fire detection algorithms, NOAA has not been able to eliminate the human element from its fire detection procedures. As a consequence, the manually intensive effort has prevented NOAA from transitioning to a global fire product, as urged particularly by climate modelers. NASA at Goddard Space Flight Center in Greenbelt, Maryland is helping NOAA more fully automate the Hazard Mapping System by training neural networks to mimic the decision-making process of the fire analyst team as well as the automated algorithms.
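Training a network to mimic analyst decisions can be illustrated with the simplest possible case: a single sigmoid neuron fit by gradient descent on synthetic brightness-temperature features. The two features, the linear "analyst rule" used to label the data, and all numbers are invented for illustration; the operational algorithms use multi-band radiometric tests and the actual effort trains full neural networks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "pixels": [band-3 brightness temperature (K), band-3 minus band-4 (K)].
X = rng.uniform([280.0, 0.0], [360.0, 40.0], size=(500, 2))
# Hypothetical linear analyst rule used to label the training data.
y = ((X[:, 0] - 320.0) + 4.0 * (X[:, 1] - 10.0) > 0.0).astype(float)

# Single sigmoid neuron trained by batch gradient descent on the
# cross-entropy loss: the smallest "network" that can mimic the rule.
Xn = (X - X.mean(axis=0)) / X.std(axis=0)  # normalize features
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(Xn @ w + b)))
    w -= (Xn.T @ (p - y)) / len(y)
    b -= np.mean(p - y)

p = 1.0 / (1.0 + np.exp(-(Xn @ w + b)))
train_accuracy = float(np.mean((p > 0.5) == (y > 0.5)))
```

A real analyst's decisions are unlikely to be linearly separable; multi-layer networks generalize this same training loop to capture such rules.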
Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján
2016-02-04
Simplicity, effectiveness, swiftness, and environmental friendliness: these are the typical requirements for state-of-the-art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of the solvent and sample volumes of classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review comprehensively surveys, in two parts, developments in the automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies and techniques, their operational advantages, and their potentials are further described and discussed. In this first part, an introduction to LPME, its static and dynamic operation modes, and the corresponding automation methodologies is given. The LPME techniques are classified according to the different approaches to protecting the extraction solvent, using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent, such as membranes and porous media, are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.
Progress of artificial pancreas devices towards clinical use: the first outpatient studies.
Russell, Steven J
2015-04-01
This article describes recent progress in the automated control of glycemia in type 1 diabetes with artificial pancreas devices that combine continuous glucose monitoring with automated decision-making and insulin delivery. After a gestation period of closely supervised feasibility studies in research centers, the last 2 years have seen publication of studies testing these devices in outpatient environments, and many more such studies are ongoing. The most basic form of automation, suspension of insulin delivery for actual or predicted hypoglycemia, has been shown to be effective and well tolerated, and a first-generation device has actually reached the market. Artificial pancreas devices that actively dose insulin fall into two categories, those that dose insulin alone and those that also use glucagon to prevent and treat hypoglycemia (bihormonal artificial pancreas). Initial outpatient clinical trials have shown that both strategies can improve glycemic management in comparison with patient-controlled insulin pump therapy, but only the bihormonal strategy has been tested without restrictions on exercise. Artificial pancreas technology has the potential to reduce acute and chronic complications of diabetes and mitigate the burden of diabetes self-management. Successful outpatient studies bring these technologies one step closer to availability for patients.
Statistical modelling of networked human-automation performance using working memory capacity.
Ahmed, Nisar; de Visser, Ewart; Shaw, Tyler; Mohamed-Ameen, Amira; Campbell, Mark; Parasuraman, Raja
2014-01-01
This study examines the challenging problem of modelling the interaction between individual attentional limitations and decision-making performance in networked human-automation system tasks. Analysis of real experimental data from a task involving networked supervision of multiple unmanned aerial vehicles by human participants shows that both task load and network message quality affect performance, but that these effects are modulated by individual differences in working memory (WM) capacity. These insights were used to assess three statistical approaches for modelling and making predictions with real experimental networked supervisory performance data: classical linear regression, non-parametric Gaussian processes and probabilistic Bayesian networks. It is shown that each of these approaches can help designers of networked human-automated systems cope with various uncertainties in order to accommodate future users by linking expected operating conditions and performance from real experimental data to observable cognitive traits like WM capacity. Practitioner Summary: Working memory (WM) capacity helps account for inter-individual variability in operator performance in networked unmanned aerial vehicle supervisory tasks. This is useful for reliable performance prediction near experimental conditions via linear models; robust statistical prediction beyond experimental conditions via Gaussian process models and probabilistic inference about unknown task conditions/WM capacities via Bayesian network models.
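Of the three statistical approaches compared, classical linear regression is the simplest to sketch. The code below fits supervisory performance as a linear function of WM capacity and task load on synthetic data; the generative model and its coefficients are invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic operators: hypothetical generative model in which supervisory
# performance rises with working-memory capacity and falls with task load.
n = 200
wm_capacity = rng.normal(50.0, 10.0, n)          # e.g. WM span scores
task_load = rng.integers(1, 5, n).astype(float)  # e.g. number of UAVs supervised
performance = (20.0 + 0.5 * wm_capacity - 3.0 * task_load
               + rng.normal(0.0, 2.0, n))        # invented coefficients + noise

# Classical linear regression via ordinary least squares.
X = np.column_stack([np.ones(n), wm_capacity, task_load])
coef, *_ = np.linalg.lstsq(X, performance, rcond=None)
intercept, wm_effect, load_effect = coef
```

With enough data the OLS fit recovers the generating coefficients to within sampling error; the Gaussian-process and Bayesian-network approaches in the study extend this idea beyond linear, fully observed conditions.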
Robotic and automatic welding development at the Marshall Space Flight Center
NASA Technical Reports Server (NTRS)
Jones, C. S.; Jackson, M. E.; Flanigan, L. A.
1988-01-01
Welding automation is the key to two major development programs to improve quality and reduce the cost of manufacturing space hardware currently undertaken by the Materials and Processes Laboratory of the NASA Marshall Space Flight Center. Variable polarity plasma arc welding has demonstrated its effectiveness on class 1 aluminum welding in external tank production. More than three miles of welds were completed without an internal defect. Much of this success can be credited to automation developments which stabilize the process. Robotic manipulation technology is under development for automation of welds on the Space Shuttle's main engines utilizing pathfinder systems in development of tooling and sensors for the production applications. The overall approach to welding automation development undertaken is outlined. Advanced sensors and control systems methodologies are described that combine to make aerospace quality welds with a minimum of dependence on operator skill.
Automated computation of autonomous spectral submanifolds for nonlinear modal analysis
NASA Astrophysics Data System (ADS)
Ponsioen, Sten; Pedergnana, Tiemo; Haller, George
2018-04-01
We discuss an automated computational methodology for computing two-dimensional spectral submanifolds (SSMs) in autonomous nonlinear mechanical systems of arbitrary degrees of freedom. In our algorithm, SSMs, the smoothest nonlinear continuations of modal subspaces of the linearized system, are constructed up to arbitrary orders of accuracy, using the parameterization method. An advantage of this approach is that the construction of the SSMs does not break down when the SSM folds over its underlying spectral subspace. A further advantage is an automated a posteriori error estimation feature that enables a systematic increase in the orders of the SSM computation until the required accuracy is reached. We find that the present algorithm provides a major speed-up, relative to numerical continuation methods, in the computation of backbone curves, especially in higher-dimensional problems. We illustrate the accuracy and speed of the automated SSM algorithm on lower- and higher-dimensional mechanical systems.
Improving management decision processes through centralized communication linkages
NASA Technical Reports Server (NTRS)
Simanton, D. F.; Garman, J. R.
1985-01-01
Information flow is a critical element to intelligent and timely decision-making. At NASA's Johnson Space Center the flow of information is being automated through the use of a centralized backbone network. The theoretical basis of this network, its implications to the horizontal and vertical flow of information, and the technical challenges involved in its implementation are the focus of this paper. The importance of the use of common tools among programs and some future concerns related to file transfer, graphics transfer, and merging of voice and data are also discussed.
From Collectives to Collective Decision-Making and Action: Farmer Field Schools in Vietnam
ERIC Educational Resources Information Center
van de Fliert, Elske; Dung, Ngo Tien; Henriksen, Ole; Dalsgaard, Jens Peter Tang
2007-01-01
In 1992, even before a formalized agricultural extension system existed, the Farmer Field School was introduced in Vietnam as a farmer education methodology aiming at enhancing farmers' agroecological knowledge, critical skills and collective action to support sustainable agricultural development. Over the years, the model saw a wide range of…
Implementing Competency-Based Education: Challenges, Strategies, and a Decision-Making Framework
ERIC Educational Resources Information Center
Dragoo, Amie; Barrows, Richard
2016-01-01
The number of competency-based education (CBE) degree programs has increased rapidly over the past five years, yet there is little research on CBE program development. This study utilized conceptual models of higher education change and a qualitative methodology to analyze the strategies and challenges in implementing CBE business degree programs…
Technology as an Instructional Tool: What We Are Learning. Research Bulletin #3.
ERIC Educational Resources Information Center
Minnesota Educational Computing Consortium, St. Paul.
The purpose of this research bulletin is to provide educational decision-makers with empirical data for making informed decisions relative to the integration of technology in schools. Ten expanded abstracts of research studies are included here, each with a background/problem statement, list of study goals, description of methodology, conclusion…
Measuring the Continuum of Literacy Skills among Adults: Educational Testing and the LAMP Experience
ERIC Educational Resources Information Center
Guadalupe, Cesar; Cardoso, Manuel
2011-01-01
The field of educational testing has become increasingly important for providing different stakeholders and decision-makers with information. This paper discusses basic standards for methodological approaches used in measuring literacy skills among adults. The authors address the increasing interest in skills measurement, the discourses on how…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-25
... these decisionmaking processes will be applied by FDA to help design effective communication strategies..., beliefs, and behaviors--and use risk communications; (2) more efficiently and effectively design messages... provided about the design and methodology of the pretests and the studies to effectively comment on the...
Getting State Education Data Right: What We Can Learn from Tennessee
ERIC Educational Resources Information Center
Jones, Joseph; Southern, Kyle
2011-01-01
Federal education policy in recent years has encouraged state and local education agencies to embrace data use and analysis in decision-making, ranging from policy development and implementation to performance evaluation. The capacity of these agencies to make effective and methodologically sound use of collected data for these purposes remains an…
ERIC Educational Resources Information Center
Ramaekers, Stephan; Kremer, Wim; Pilot, Albert; van Beukelen, Peter; van Keulen, Hanno
2010-01-01
Real-life, complex problems often require that decisions are made despite limited information or insufficient time to explore all relevant aspects. Incorporating authentic uncertainties into an assessment, however, poses problems in establishing results and analysing their methodological qualities. This study aims at developing a test on clinical…
The Times Higher Education Ranking Product: Visualising Excellence through Media
ERIC Educational Resources Information Center
Stack, Michelle L.
2013-01-01
This paper will examine the Times Higher Education's (THE) World University Rankings as a corporate media product. A number of empirical studies have critiqued the methodology of the THE, yet individuals, Higher Education Institutions (HEIs) and governments continue to use them for decision-making. This paper analyses the influence of…
ERIC Educational Resources Information Center
Maxwell, James R.; Gilberti, Anthony F.; Mupinga, Davison M.
2006-01-01
This paper will study some of the problems associated with case studies and make recommendations using standard and innovative methodologies effectively. Human resource management (HRM) and resource development cases provide context for analysis and decision-making designs in different industries. In most HRM development and training courses…
ERIC Educational Resources Information Center
Chen, Ching-Huei; Zimitat, Craig
2006-01-01
Purpose: The purpose of this paper is to investigate the motivators for Taiwanese students to study higher education in a western society. The behavioural motivations of Taiwanese students intending to undertake higher education in Australia and the USA were analysed using the theory of planned behaviour (TPB). Design/methodology/approach:…
THE EMERGING FOCUS ON LIFE-CYCLE ASSESSMENT IN THE U. S. ENVIRONMENTAL PROTECTION AGENCY
EPA has been actively engaged in LCA research since 1990 to help advance the methodology and application of life cycle thinking in decision-making. Across the Agency consideration of the life cycle concept is increasing in the development of policies and programs. A major force i...
Decision-Making Training for Occupational Choice and Early Turnover: A Field Experiment
ERIC Educational Resources Information Center
Pazy, Asya; Ganzach, Yoav; Davidov, Yariv
2006-01-01
Purpose: The study seeks to examine how a short intervention, aimed at enhancing occupational choice skills, influences turnover during the early stages of organizational membership. It seeks to explore two theoretical rationales for this effect: social exchange and self-determination. Design/methodology/approach: The study is a "constructive…
Observation and Description: An Alternative Methodology for the Investigation of Human Phenomena.
ERIC Educational Resources Information Center
Carini, Patricia F.
This monograph is one of a continuing series initiated to provide materials for teachers, parents, school administrators, and governmental decision-makers that might encourage reexamination of a range of evaluation issues and perspectives about schools and schooling. This monograph suggests a radical departure from the approaches to educational…
ERIC Educational Resources Information Center
Burton, Rob J. F.
2004-01-01
In rural studies the "behavioural approach", i.e. an actor-oriented, largely questionnaire-based methodology that focuses ''on the motives, values and attitudes that determine the decision-making processes of individual farmers'' (J. Rural Stud 11 (1995) 51, p. 55), has become increasingly important in the investigation of farmer response to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Webster, Mort D.
This report presents the final outcomes and products of the project as performed both at the Massachusetts Institute of Technology and subsequently at Pennsylvania State University. The research project can be divided into three main components: methodology development for decision-making under uncertainty, improving the resolution of the electricity sector to improve integrated assessment, and application of these methods to integrated assessment.
European Qualifications Framework: Weighing Some Pros and Cons out of a French Perspective
ERIC Educational Resources Information Center
Bouder, Annie
2008-01-01
Purpose: The purpose of this paper is to question the appropriateness of a proposal for a new European Qualifications Framework. The framework has three perspectives: historical; analytical; and national. Design/methodology/approach: The approaches are diverse since the first insists on the institutional and decision-making processes at European…
MVP-CA Methodology for the Expert System Advocate's Advisor (ESAA)
DOT National Transportation Integrated Search
1997-11-01
The Multi-Viewpoint Clustering Analysis (MVP-CA) tool is a semi-automated tool to provide a valuable aid for comprehension, verification, validation, maintenance, integration, and evolution of complex knowledge-based software systems. In this report,...
Balancing Technology with Established Methodology in the Accounting Classroom.
ERIC Educational Resources Information Center
Hoyt, William B.
1996-01-01
Discusses the role of technology in secondary accounting courses. Indicates that students must master the principles and concepts in accounting and must experience the manual preparation of documents before automated procedures are integrated. (Author/JOW)
Electronic automation of LRFD design programs.
DOT National Transportation Integrated Search
2010-03-01
The study provided electronic programs to WisDOT for designing pre-stressed girders and piers using the Load and Resistance Factor Design (LRFD) methodology. The software provided is intended to ease the transition to LRFD for WisDOT design engineers...
Mañós, M; Giralt, J; Rueda, A; Cabrera, J; Martinez-Trufero, J; Marruecos, J; Lopez-Pousa, A; Rodrigo, J P; Castelo, B; Martínez-Galán, J; Arias, F; Chaves, M; Herranz, J J; Arrazubi, V; Baste, N; Castro, A; Mesía, R
2017-07-01
Head and neck cancer is one of the most frequent malignancies worldwide. Despite site-specific multimodality therapy, up to half of patients will develop recurrence. Treatment selection by a multidisciplinary tumor board represents the cornerstone of head and neck cancer care, as it is essential for achieving the best results, not only in terms of outcome but also in terms of organ-function preservation and quality of life. Evidence-based international and national clinical practice guidelines for head and neck cancer do not always answer the decision-making questions that specialists must deal with in their daily practice. This is the first Expert Consensus on the Multidisciplinary Approach for Head and Neck Squamous Cell Carcinoma (HNSCC) elaborated by the Spanish Society for Head and Neck Cancer and based on a Delphi methodology. It offers several specific recommendations based on the available evidence and the expertise of our specialists to facilitate decision-making by all the health-care specialists involved. Copyright © 2017. Published by Elsevier Ltd.
Economic benefits of meteorological services
NASA Astrophysics Data System (ADS)
Freebairn, John W.; Zillman, John W.
2002-03-01
There is an increasing need for more rigorous and more broadly based determination of the economic value of meteorological services as an aid to decision-making on the appropriate level of funding to be committed to their provision at the national level. This paper develops an overall framework for assessment of the economic value of meteorological services based on the recognition that most national meteorological infrastructure and services possess the non-rival properties of public goods. Given this overall framework for determination of both total and marginal benefits, four main methodologies appropriate for use in valuation studies - market prices, normative or prescriptive decision-making models, descriptive behavioural response studies and contingent valuation studies - are outlined and their strengths and limitations described. Notwithstanding the methodological limitations and the need for a much more comprehensive set of studies for the various application sectors, it is clear that the actual and potential benefits to individuals, firms, industry sectors and national economies from state-of-the-art meteorological and related services are substantial and that, at this stage, they are inadequately recognised and insufficiently exploited in many countries.
Uranga, Jon; Arrizabalaga, Haritz; Boyra, Guillermo; Hernandez, Maria Carmen; Goñi, Nicolas; Arregui, Igor; Fernandes, Jose A; Yurramendi, Yosu; Santiago, Josu
2017-01-01
This study presents a methodology for the automated analysis of commercial medium-range sonar signals for detecting presence/absence of bluefin tuna (Thunnus thynnus) in the Bay of Biscay. The approach uses image processing techniques to analyze sonar screenshots. For each sonar image we extracted measurable regions and analyzed their characteristics. Scientific data was used to classify each region into a class ("tuna" or "no-tuna") and build a dataset to train and evaluate classification models by using supervised learning. The methodology performed well when validated with commercial sonar screenshots, and has the potential to automatically analyze high volumes of data at a low cost. This represents a first milestone towards the development of acoustic, fishery-independent indices of abundance for bluefin tuna in the Bay of Biscay. Future research lines and additional alternatives to inform stock assessments are also discussed.
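The pipeline described above (extract measurable regions from each sonar image, label them with scientific data, then train a supervised classifier) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the per-region features (area, mean echo intensity, elongation), their distributions, and the nearest-centroid learner are hypothetical stand-ins for whatever features and model the study actually used.

```python
import random

# Hypothetical per-region features: (area, mean echo intensity, elongation).
# In the real methodology these would come from image processing of sonar screenshots.
def make_region(label, rng):
    if label == "tuna":  # assumed: tuna schools look larger, brighter, more elongated
        return [rng.gauss(40, 8), rng.gauss(0.8, 0.1), rng.gauss(3.0, 0.5)]
    return [rng.gauss(15, 8), rng.gauss(0.4, 0.1), rng.gauss(1.5, 0.5)]

def centroid(rows):
    return [sum(col) / len(rows) for col in zip(*rows)]

def classify(x, centroids):
    def dist(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda lab: dist(x, centroids[lab]))

rng = random.Random(7)
labels = ("tuna", "no-tuna")
train = [(make_region(lab, rng), lab) for lab in labels for _ in range(100)]
test = [(make_region(lab, rng), lab) for lab in labels for _ in range(50)]

# "Training" a nearest-centroid classifier on the labelled regions.
cents = {lab: centroid([x for x, l in train if l == lab]) for lab in labels}
accuracy = sum(classify(x, cents) == lab for x, lab in test) / len(test)
```

In practice any supervised learner (random forests, boosted trees, etc.) could replace the nearest-centroid step; the key design point is the region-level labelled dataset built from fishery-independent scientific observations.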
Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.
Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory
2017-01-01
Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or associations between investigated factors. We turn our focus to the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g. partial EPVs; (2) developing optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their constructions and properties with an eye towards practical applications.
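The EPV/ROC connection can be illustrated with a small simulation. For a one-sided Z-test with standardized effect delta, the p-value under the alternative is p = 1 - Phi(Z1) with Z1 ~ N(delta, 1), so the EPV equals P(Z0 > Z1) = Phi(-delta / sqrt(2)), i.e. one minus the area under the corresponding ROC curve. The sketch below checks this identity by Monte Carlo; it is an illustration of the general concept, not the authors' methodology.

```python
import random
import statistics

std_normal = statistics.NormalDist()

def epv_monte_carlo(delta, n=100_000, seed=1):
    """Average one-sided p-value when the alternative N(delta, 1) is true."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z1 = rng.gauss(delta, 1.0)
        total += 1.0 - std_normal.cdf(z1)  # p-value of the observed statistic
    return total / n

delta = 1.5  # illustrative standardized effect size
epv_sim = epv_monte_carlo(delta)
epv_closed = std_normal.cdf(-delta / 2 ** 0.5)  # EPV = P(Z0 > Z1) = 1 - AUC
```

A smaller EPV corresponds to a better-performing test, which is what makes EPV minimization a usable optimality criterion.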
Moher, David; Clifford, Tammy J.
2016-01-01
Background Rapid reviews expedite the knowledge synthesis process with the goal of providing timely information to healthcare decision-makers who want to use evidence-informed policy and practice approaches. A range of opinions and viewpoints on rapid reviews is thought to exist; however, no research to date has formally captured these views. This paper aims to explore evidence producer and knowledge user attitudes and perceptions towards rapid reviews. Methods A Q methodology study was conducted to identify central viewpoints about rapid reviews based on a broad topic discourse. Participants rank-ordered 50 text statements and explained their Q-sort in free-text comments. Individual Q-sorts were analysed using Q-Assessor (statistical method: factor analysis with varimax rotation). Factors, or salient viewpoints on rapid reviews, were identified, interpreted and described. Results Analysis of the 11 individual Q sorts identified three prominent viewpoints: Factor A cautions against the use of study design labels to make judgements. Factor B maintains that rapid reviews should be the exception and not the rule. Factor C focuses on the practical needs of the end-user over the review process. Conclusion Results show that there are opposing viewpoints on rapid reviews, yet some unity exists. The three factors described offer insight into how and why various stakeholders act as they do and what issues may need to be resolved before increased uptake of the evidence from rapid reviews can be realized in healthcare decision-making environments. PMID:27761324
NASA Technical Reports Server (NTRS)
Broderick, Ron
1997-01-01
The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. 
The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.
Methodology of automated ionosphere front velocity estimation for ground-based augmentation of GNSS
NASA Astrophysics Data System (ADS)
Bang, Eugene; Lee, Jiyun
2013-11-01
Ionospheric anomalies occurring during severe ionospheric storms can pose integrity threats to Global Navigation Satellite System (GNSS) Ground-Based Augmentation Systems (GBAS). Ionospheric anomaly threat models for each region of operation need to be developed to analyze the potential impact of these anomalies on GBAS users and develop mitigation strategies. Along with the magnitude of ionospheric gradients, the speed of the ionosphere "fronts" in which these gradients are embedded is an important parameter for simulation-based GBAS integrity analysis. This paper presents a methodology for automated ionosphere front velocity estimation which will be used to analyze a vast amount of ionospheric data, build ionospheric anomaly threat models for different regions, and monitor ionospheric anomalies continuously going forward. This procedure automatically selects stations that show a similar trend of ionospheric delays, computes the orientation of detected fronts using a three-station-based trigonometric method, and estimates speeds for the front using a two-station-based method. It also includes fine-tuning methods to make the estimation robust against faulty measurements and modeling errors. The paper demonstrates the performance of the algorithm by comparing the results of automated speed estimation to those manually computed previously. All speed estimates from the automated algorithm fall within error bars of ± 30% of the manually computed speeds. In addition, this algorithm is used to populate the current threat space with newly generated threat points. A larger number of velocity estimates helps us to better understand the behavior of ionospheric gradients under geomagnetic storm conditions.
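The geometry behind a three-station front estimate can be illustrated with a simplified plane-front fit. Assuming a planar front passing stations at positions x_i with arrival times t_i, the differences satisfy (x_i - x_0) · s = t_i - t_0, where s is the slowness vector (the front's unit normal divided by its speed); two difference equations from three stations determine s. This is a hypothetical simplification for illustration, not the paper's fine-tuned algorithm with its station selection and robustness steps.

```python
import math

def front_velocity(stations, times):
    """Estimate speed and propagation azimuth of a planar ionosphere front
    from three station positions (km) and front arrival times (s)."""
    (x0, y0), (x1, y1), (x2, y2) = stations
    t0, t1, t2 = times
    # Solve the 2x2 linear system (x_i - x_0) . s = t_i - t_0 for the slowness s.
    a11, a12, b1 = x1 - x0, y1 - y0, t1 - t0
    a21, a22, b2 = x2 - x0, y2 - y0, t2 - t0
    det = a11 * a22 - a12 * a21
    sx = (b1 * a22 - b2 * a12) / det
    sy = (a11 * b2 - a21 * b1) / det
    speed = 1.0 / math.hypot(sx, sy)                   # km/s
    azimuth = math.degrees(math.atan2(sx, sy)) % 360   # direction of travel, deg from north
    return speed, azimuth

# Synthetic check: a front moving due east at 0.2 km/s past three stations.
speed, azimuth = front_velocity([(0, 0), (10, 0), (0, 10)], [0.0, 50.0, 0.0])
```

With real dual-frequency GNSS data, the arrival times would themselves be estimated from ionospheric delay time series, which is where the fine-tuning against faulty measurements becomes essential.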
Object-oriented analysis and design: a methodology for modeling the computer-based patient record.
Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L
1998-08-01
The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.
A Cognitive Systems Engineering Approach to Developing HMI Requirements for New Technologies
NASA Technical Reports Server (NTRS)
Fern, Lisa Carolynn
2016-01-01
This document examines the challenges inherent in designing and regulating to support human-automation interaction for new technologies that will be deployed into complex systems. A key question for new technologies is how work will be accomplished by the human and machine agents. This question has traditionally been framed as how functions should be allocated between humans and machines. Such framing misses the coordination and synchronization that is needed for the different human and machine roles in the system to accomplish their goals. Coordination and synchronization demands are driven by the underlying human-automation architecture of the new technology, which are typically not specified explicitly by the designers. The human machine interface (HMI), which is intended to facilitate human-machine interaction and cooperation, however, typically is defined explicitly and therefore serves as a proxy for human-automation cooperation requirements with respect to technical standards for technologies. Unfortunately, mismatches between the HMI and the coordination and synchronization demands of the underlying human-automation architecture can lead to system breakdowns. A methodology is needed that both designers and regulators can utilize to evaluate the expected performance of a new technology given potential human-automation architectures. Three experiments were conducted to inform the minimum HMI requirements for a detect and avoid system for unmanned aircraft systems (UAS). The results of the experiments provided empirical input to specific minimum operational performance standards that UAS manufacturers will have to meet in order to operate UAS in the National Airspace System (NAS). These studies represent a success story for how to objectively and systematically evaluate prototype technologies as part of the process for developing regulatory requirements.
They also provide an opportunity to reflect on the lessons learned from a recent research effort in order to improve the methodology for defining technology requirements for regulators in the future. The biggest shortcoming of the presented research program was the absence of the explicit definition, generation and analysis of potential human-automation architectures. Failure to execute this step in the research process resulted in less efficient evaluation of the candidate prototype technologies in addition to the complete absence of different approaches to human-automation cooperation. For example, all of the prototype technologies that were evaluated in the research program assumed a human-automation architecture that relied on serial processing from the automation to the human. While this type of human-automation architecture is typical across many different technologies and in many different domains, it ignores different architectures where humans and automation work in parallel. Defining potential human-automation architectures a priori also allows regulators to develop scenarios that will stress the performance boundaries of the technology during the evaluation phase. The importance of adding this step of generating and evaluating candidate human-automation architectures prior to formal empirical evaluation is discussed.
Santos, Adriano A; Moura, J Antão B; de Araújo, Joseana Macêdo Fechine Régis
2015-01-01
Mitigating uncertainty and risks faced by specialist physicians in analysis of rare clinical cases is something desired by anyone who needs health services. The number of clinical cases never seen by these experts, with little documentation, may introduce errors in decision-making. Such errors negatively affect the well-being of patients, increase procedure costs, rework and health insurance premiums, and impair the reputation of specialists and medical systems involved. In this context, IT and Clinical Decision Support Systems (CDSS) play a fundamental role, supporting the decision-making process, making it more efficient and effective, reducing the number of avoidable medical errors and enhancing the quality of treatment given to patients. An investigation has been initiated to look into characteristics and solution requirements of this problem, model it, propose a general solution in terms of a conceptual risk-based, automated framework to support rare-case medical diagnostics and validate it by means of case studies. A preliminary validation study of the proposed framework has been carried out by interviews conducted with experts who are practicing professionals, academics, and researchers in health care. This paper summarizes the investigation and its positive results. These results motivate continuation of research towards development of the conceptual framework and of a software tool that implements the proposed model.
Mind the gap: temporal discrimination and dystonia.
Sadnicka, A; Daum, C; Cordivari, C; Bhatia, K P; Rothwell, J C; Manohar, S; Edwards, M J
2017-06-01
One of the most widely studied perceptual measures of sensory dysfunction in dystonia is the temporal discrimination threshold (TDT) (the shortest interval at which subjects can perceive that there are two stimuli rather than one). However, the elevated thresholds described may be due to a number of potential mechanisms, as current paradigms test not only temporal discrimination but also extraneous sensory and decision-making parameters. In this study two paradigms designed to better quantify temporal processing are presented and a decision-making model is used to assess the influence of decision strategy. 22 patients with cervical dystonia and 22 age-matched controls completed two tasks: (i) temporal resolution (a randomized, automated version of existing TDT paradigms) and (ii) interval discrimination (rating the length of two consecutive intervals). In the temporal resolution task, patients had delayed (P = 0.021) and more variable (P = 0.013) response times but equivalent discrimination thresholds. Modelling these effects suggested this was due to an increased perceptual decision boundary in dystonia, with patients requiring greater evidence before committing to decisions (P = 0.020). Patient performance on the interval discrimination task was normal. Our work suggests that previously observed abnormalities in TDT may not be due to a selective sensory deficit of temporal processing, as decision-making itself is abnormal in cervical dystonia. © 2017 EAN.
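The boundary effect reported here can be reproduced with a toy evidence-accumulation simulation (the generic drift-diffusion idea, not the authors' fitted model; all parameter values below are invented): with drift and noise held fixed, raising the decision boundary lengthens first-crossing times, mirroring the patients' delayed responses.

```python
import random

def accumulator_rt(drift, boundary, noise=1.0, dt=0.01, rng=None):
    """One trial of a 1-D evidence accumulator: noisy evidence drifts until
    it crosses +/- boundary; the crossing time is the decision time."""
    rng = rng or random
    evidence, t = 0.0, 0.0
    while abs(evidence) < boundary:
        evidence += drift * dt + noise * (dt ** 0.5) * rng.gauss(0, 1)
        t += dt
    return t

rng = random.Random(0)
# Same drift and noise; only the decision boundary differs (hypothetical values).
control = [accumulator_rt(drift=1.0, boundary=1.0, rng=rng) for _ in range(500)]
patient = [accumulator_rt(drift=1.0, boundary=1.6, rng=rng) for _ in range(500)]
# A higher boundary yields slower mean decision times.
```

This is why a raised boundary can inflate measured thresholds even when the sensory representation of time is intact, which is the paper's central interpretive point.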
Fuzzy approaches to supplier selection problem
NASA Astrophysics Data System (ADS)
Ozkok, Beyza Ahlatcioglu; Kocken, Hale Gonce
2013-09-01
Supplier selection is a multi-criteria decision-making problem that includes both qualitative and quantitative factors. In the selection process many criteria may conflict with each other, so the decision-making process becomes complicated. In this study, we handle the supplier selection problem under uncertainty. In this context, we used the minimum criterion, the arithmetic mean criterion, the regret criterion, the optimistic criterion, the geometric mean and the harmonic mean. Membership functions were created with the help of the characteristics of the criteria used, and we tried to provide consistent supplier selection decisions by using these memberships to evaluate alternative suppliers. A strong aspect of the methodology is that no expert opinion is needed during the analysis.
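The classical decision criteria named above can be sketched on a hypothetical payoff matrix (supplier scores under three scenarios; all numbers are invented for illustration, and the paper's fuzzy membership construction on top of these criteria is not reproduced here):

```python
import statistics

payoff = {  # hypothetical supplier scores under three uncertain scenarios
    "S1": [7, 5, 9],
    "S2": [6, 6, 6],
    "S3": [9, 3, 8],
}

def regret_scores(p):
    """Savage regret: worst shortfall versus the best score in each scenario."""
    best = [max(col) for col in zip(*p.values())]
    return {s: max(b - v for b, v in zip(best, vals)) for s, vals in p.items()}

criteria = {
    "minimum (Wald)":  {s: min(v) for s, v in payoff.items()},        # pessimistic
    "optimistic":      {s: max(v) for s, v in payoff.items()},        # maximax
    "arithmetic mean": {s: statistics.mean(v) for s, v in payoff.items()},
    "geometric mean":  {s: statistics.geometric_mean(v) for s, v in payoff.items()},
    "harmonic mean":   {s: statistics.harmonic_mean(v) for s, v in payoff.items()},
}
choice = {name: max(scores, key=scores.get) for name, scores in criteria.items()}
choice["regret (Savage)"] = min(regret_scores(payoff), key=regret_scores(payoff).get)
```

Different criteria can recommend different suppliers (here the cautious Wald criterion favors the steady S2 while mean-based criteria favor S1), which is exactly the conflict that motivates combining them through fuzzy memberships.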
Automated Information System (AIS) Alarm System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hunteman, W.
1997-05-01
The Automated Information Alarm System is a joint effort between Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratory to demonstrate and implement, on a small-to-medium sized local area network, an automated system that detects and automatically responds to attacks that use readily available tools and methodologies. The Alarm System will sense or detect, assess, and respond to suspicious activities that may be detrimental to information on the network or to continued operation of the network. The responses will allow stopping, isolating, or ejecting the suspicious activities. The number of sensors, the sensitivity of the sensors, the assessment criteria, and the desired responses may be set by the using organization to meet their local security policies.
Current Challenges in Health Economic Modeling of Cancer Therapies: A Research Inquiry
Miller, Jeffrey D.; Foley, Kathleen A.; Russell, Mason W.
2014-01-01
Background The demand for economic models that evaluate cancer treatments is increasing, as healthcare decision makers struggle for ways to manage their budgets while providing the best care possible to patients with cancer. Yet, after nearly 2 decades of cultivating and refining techniques for modeling the cost-effectiveness and budget impact of cancer therapies, serious methodologic and policy challenges have emerged that question the adequacy of economic modeling as a sound decision-making tool in oncology. Objectives We sought to explore some of the contentious issues associated with the development and use of oncology economic models as informative tools in current healthcare decision-making. Our objective was to draw attention to these complex pharmacoeconomic concerns and to promote discussion within the oncology and health economics research communities. Methods Using our combined expertise in health economics research and economic modeling, we structured our inquiry around the following 4 questions: (1) Are economic models adequately addressing questions relevant to oncology decision makers; (2) What are the methodologic limitations of oncology economic models; (3) What guidelines are followed for developing oncology economic models; and (4) Is the evolution of oncology economic modeling keeping pace with treatment innovation? Within the context of each of these questions, we discuss issues related to the technical limitations of oncology modeling, the availability of adequate data for developing models, and the problems with how modeling analyses and results are presented and interpreted. Discussion There is general acceptance that economic models are good, essential tools for decision-making, but the practice of oncology and its rapidly evolving technologies present unique challenges that make assessing and demonstrating value especially complex. 
There is wide latitude for improvement in oncology modeling methodologies and how model results are presented and interpreted. Conclusion Complex technical and data availability issues with oncology economic modeling pose serious concerns that need to be addressed. It is our hope that this article will provide a framework to guide future discourse on this important topic. PMID:24991399
NASA Astrophysics Data System (ADS)
Fern, Lisa Carolynn
This dissertation examines the challenges inherent in designing and regulating to support human-automation interaction for new technologies that will be deployed into complex systems. A key question for new technologies with increasingly capable automation is how work will be accomplished by human and machine agents. This question has traditionally been framed as how functions should be allocated between humans and machines. Such framing misses the coordination and synchronization that is needed for the different human and machine roles in the system to accomplish their goals. Coordination and synchronization demands are driven by the underlying human-automation architecture of the new technology, which are typically not specified explicitly by designers. The human machine interface (HMI), which is intended to facilitate human-machine interaction and cooperation, typically is defined explicitly and therefore serves as a proxy for human-automation cooperation requirements with respect to technical standards for technologies. Unfortunately, mismatches between the HMI and the coordination and synchronization demands of the underlying human-automation architecture can lead to system breakdowns. A methodology is needed that both designers and regulators can utilize to evaluate the predicted performance of a new technology given potential human-automation architectures. Three experiments were conducted to inform the minimum HMI requirements for a detect and avoid (DAA) system for unmanned aircraft systems (UAS). The results of the experiments provided empirical input to specific minimum operational performance standards that UAS manufacturers will have to meet in order to operate UAS in the National Airspace System (NAS). These studies represent a success story for how to objectively and systematically evaluate prototype technologies as part of the process for developing regulatory requirements.
They also provide an opportunity to reflect on the lessons learned in order to improve the methodology for defining technology requirements for regulators in the future. The biggest shortcoming of the presented research program was the absence of the explicit definition, generation and analysis of potential human-automation architectures. Failure to execute this step in the research process resulted in less efficient evaluation of the candidate prototype technologies in addition to a lack of exploration of different approaches to human-automation cooperation. Defining potential human-automation architectures a priori also allows regulators to develop scenarios that will stress the performance boundaries of the technology during the evaluation phase. The importance of adding this step of generating and evaluating candidate human-automation architectures prior to formal empirical evaluation is discussed. This document concludes with a look at both the importance of, and the challenges facing, the inclusion of examining human-automation coordination issues as part of the safety assurance activities of new technologies.
NASA Astrophysics Data System (ADS)
Krokhin, G.; Pestunov, A.
2017-11-01
Operating power stations in variable modes, and the associated changes in their technical state, have made it pressing to create models for decision-making and state recognition based on diagnostics that use fuzzy logic to identify equipment state and manage recovery processes. There is no unified methodological approach for obtaining the relevant information when the raw information about the equipment state is fuzzy and inhomogeneous. The existing methods for extracting knowledge are usually unable to ensure correspondence between the model parameters of the aggregates and the actual state of the object. The switchover of power engineering from preventive repair to repair scheduled according to the actual technical state has increased the responsibility of those who estimate the volume and duration of the work. This may lead to inadequate diagnostics and decision-making models if the corresponding methodological preparations do not take fuzziness into account, because the state information is inherently of this kind. In this paper, we introduce a new model that formalizes the equipment state using not only exact information but fuzzy information as well. This model is more adequate to the actual state than traditional analogs, and may be used to increase the efficiency and service period of power installations.
Silva, Silvio Fernandes da; Souza, Nathan Mendes; Barreto, Jorge Otávio Maia
2014-11-01
The scope of this article was to identify the boundaries of the autonomy of local administration in the context of the federal pact in the Brazilian Unified Health System and the importance and potential for promoting innovation, creativity and evidence-based decision-making by local governments. The methodology used was to ask questions that favored dialogue with the specific literature to identify the influence of centrally-formulated policies in spaces of local autonomy and then to identify strategies to foster innovation, creativity and the systematic use of evidence-based research in health policy implementation. A gradual reduction in municipal decision-making autonomy was detected due to increased financial commitment of the municipalities resulting from responsibilities assumed, albeit with the possibility of reverting this trend in the more recent context. Some determinants and challenges for the dissemination of innovative practices were analyzed and some relevant national and international experiences in this respect were presented. The conclusion drawn is that it is possible to make local decision-making more effective provided that initiatives are consolidated to promote this culture and the formulation and implementation of evidence-based health policies.
Blair-West, Laura F; Hoy, Kate E; Hall, Phillip J; Fitzgerald, Paul B; Fitzgibbon, Bernadette M
2018-01-01
The right temporoparietal junction (rTPJ) is thought to play an important role in social cognition and pro-social decision-making. One way to explore this link is through the use of transcranial direct current stimulation (tDCS), a non-invasive brain stimulation method that is able to modulate cortical activity. The aim of this research was therefore to determine whether anodal tDCS to the rTPJ altered response to a social decision-making task. In this study, 34 healthy volunteers participated in a single-center, double-blinded, sham-controlled crossover design. Subjects received 20 min of active/sham anodal tDCS to the rTPJ before undertaking the Ultimatum Game (UG), a neuroeconomics paradigm in which participants are forced to choose between monetary reward and punishing an opponent's unfairness. Contrary to expectations, we found no significant difference between anodal and sham stimulation with regard to either the total number or reaction time of unfair offer rejections in the UG. This study draws attention to methodological issues in tDCS studies of the rTPJ, and highlights the complexity of social decision-making in the UG.
NASA Astrophysics Data System (ADS)
Wikman-Svahn, Per
2013-04-01
Hydrological sciences are increasingly utilized in decision-making contexts that need to manage deep uncertainty, changing conditions and very long lead times and lifetimes. Traditional optimizing approaches become problematic in such situations. For example, optimizing approaches may underestimate the importance of low probability outcomes, or very uncertain outcomes. Alternative decision-making strategies are therefore increasingly used in hydrological applications, including "bottom-up/top-down", "context-first", "decision-scaling", "assess risk of policy", "robust", "resilient" or "flexible" approaches. These kinds of strategies are typically designed to handle very uncertain and diverse outcomes, and often start from the particular decision-making context, in contrast to more traditional "predict-then-act" or "science first" approaches. Contemporary research in philosophy of science stresses the influence of value judgments and norms in scientific assessments. In particular, this literature points out that implicit anticipated applications often influence choices made in scientific assessments. Furthermore, this literature also emphasizes that choices made within scientific assessments have consequences for decision-making later on. One reason is that it is often difficult for decision-makers to see what choices are made and the implications of these choices. Another reason is that information that could be of use for decision-makers is lost at an early stage. For example, the choice to focus on central estimates and not providing assessments of more unlikely outcomes is a choice that has consequences for what outcomes are taken into account in the decision-making process. This paper develops this argument and then analyzes the implications of these new developments for hydrological science.
One implication of the increasing use of this new breed of planning strategies is that a broader range of uncertainty in scientific assessments becomes desirable, in order to fully benefit from the power of the new decision-making strategies. Another implication is that Bayesian probability assessments become more important. Finally, the advantages and risks involved in changing scientific assessments to anticipate the new decision-making strategies are discussed.
Currie, Danielle J; Smith, Carl; Jagals, Paul
2018-03-27
Policy and decision-making processes are routinely challenged by the complex and dynamic nature of environmental health problems. System dynamics modelling has demonstrated considerable value across a number of different fields in helping decision-makers understand and predict the dynamic behaviour of complex systems in support of the development of effective policy actions. In this scoping review we investigate if, and in what contexts, system dynamics modelling is being used to inform policy or decision-making processes related to environmental health. Four electronic databases and the grey literature were systematically searched to identify studies that intersect the areas of environmental health, system dynamics modelling, and decision-making. Studies identified in the initial screening were further screened for their contextual, methodological and application-related relevancy. Studies deemed 'relevant' or 'highly relevant' according to all three criteria were included in this review. Key themes related to the rationale, impact and limitations of using system dynamics in the context of environmental health decision-making and policy were analysed. We identified a limited number of relevant studies (n = 15), two-thirds of which were conducted between 2011 and 2016. The majority of applications occurred in non-health sectors (n = 9), including transportation, public utilities, water, housing, food, agriculture, and urban and regional planning. Applications were primarily targeted at micro-level (local, community or grassroots) decision-making processes (n = 9), with macro-level (national or international) decision-making addressed to a lesser degree. There was significant heterogeneity in the stated rationales for using system dynamics and in the intended impact of the system dynamics model on decision-making processes. A series of user-related, technical and application-related limitations and challenges were identified. 
None of the reported limitations or challenges appeared unique to the application of system dynamics within the context of environmental health problems, but rather to the use of system dynamics in general. This review reveals that while system dynamics modelling is increasingly being used to inform decision-making related to environmental health, applications are currently limited. Greater application of system dynamics within this context is needed before its benefits and limitations can be fully understood.
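The stock-and-flow structure at the heart of system dynamics modelling can be illustrated with a short sketch. The scenario, variable names, and parameter values below are invented for illustration and are not drawn from any study in the review: a hypothetical pollutant stock decays naturally while driving cumulative population exposure.

```python
# Minimal stock-and-flow sketch of a system dynamics model for an
# environmental health question. All quantities are illustrative.

def simulate(steps=100, dt=0.1):
    pollutant = 50.0       # stock: contaminant load in a water source
    exposed = 0.0          # stock: cumulative population exposure
    emission = 5.0         # inflow: constant emission rate
    decay_rate = 0.12      # outflow: first-order natural decay
    contact_rate = 0.02    # exposure accrued per unit pollutant per unit time
    history = []
    for _ in range(steps):
        # Euler integration of the two coupled stocks
        pollutant += (emission - decay_rate * pollutant) * dt
        exposed += contact_rate * pollutant * dt
        history.append((pollutant, exposed))
    return history

hist = simulate()
print(f"final pollutant load: {hist[-1][0]:.1f}")
```

Even this two-stock model exhibits the feedback behaviour (decay toward the equilibrium load emission/decay_rate) that makes such models useful for exploring policy levers such as emission limits.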
Wu, Xin Yin; Du, Xin Jian; Ho, Robin S T; Lee, Clarence C Y; Yip, Benjamin H K; Wong, Martin C S; Wong, Samuel Y S; Chung, Vincent C H
2017-02-01
Methodological quality of meta-analyses on hypertension treatments can affect treatment decision-making. The authors conducted a cross-sectional study to investigate the methodological quality of meta-analyses on hypertension treatments. One hundred and fifty-eight meta-analyses were identified. Overall, methodological quality was unsatisfactory in the following aspects: comprehensive reporting of financial support (1.9%), provision of included and excluded lists of studies (22.8%), inclusion of grey literature (27.2%), and inclusion of protocols (32.9%). The 126 non-Cochrane meta-analyses had poor performance on almost all the methodological items. Non-Cochrane meta-analyses focused on nonpharmacologic treatments were more likely to consider scientific quality of included studies when making conclusions. The 32 Cochrane meta-analyses generally had good methodological quality except for comprehensive reporting of the sources of support. These results highlight the need for cautious interpretation of these meta-analyses, especially among physicians and policy makers when guidelines are formulated. Future meta-analyses should pay attention to improving these methodological aspects. ©2016 Wiley Periodicals, Inc.
Adaptive Multi-scale PHM for Robotic Assembly Processes
Choo, Benjamin Y.; Beling, Peter A.; LaViers, Amy E.; Marvel, Jeremy A.; Weiss, Brian A.
2017-01-01
Adaptive multiscale prognostics and health management (AM-PHM) is a methodology designed to support PHM in smart manufacturing systems. As a rule, PHM information is not used in high-level decision-making in manufacturing systems. AM-PHM leverages and integrates component-level PHM information with hierarchical relationships across the component, machine, work cell, and production line levels in a manufacturing system. The AM-PHM methodology enables the creation of actionable prognostic and diagnostic intelligence up and down the manufacturing process hierarchy. Decisions are made with the knowledge of the current and projected health state of the system at decision points along the nodes of the hierarchical structure. A description of the AM-PHM methodology with a simulated canonical robotic assembly process is presented. PMID:28664161
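The hierarchical roll-up idea behind AM-PHM can be sketched as follows. The aggregation rule used here (a node's health is the worst health among its parts) and the example hierarchy are assumptions for illustration only, not the aggregation method specified by the AM-PHM authors.

```python
# Illustrative sketch: component-level health indicators aggregated up
# through machine, work cell, and production line levels. The min-of-
# children rule is an assumed stand-in for AM-PHM's actual aggregation.

def rollup(node):
    """Return a node's health as the worst health among its parts."""
    if "health" in node:            # leaf: a component with measured health
        return node["health"]
    return min(rollup(child) for child in node["children"])

line = {
    "children": [                                   # production line
        {"children": [                              # work cell
            {"children": [                          # machine
                {"health": 0.95},                   # component: gripper
                {"health": 0.70},                   # component: joint motor
            ]},
        ]},
    ],
}

print(rollup(line))  # the weakest component dominates the line's health
```

A decision point at any node of the hierarchy can then consult the rolled-up health of the subtree below it, which is the "actionable intelligence up and down the hierarchy" the abstract describes.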
Web-based automation of green building rating index and life cycle cost analysis
NASA Astrophysics Data System (ADS)
Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul
2018-04-01
The sudden decline in financial markets and the economic meltdown have slowed adoption of, and lowered investor interest in, green-certified buildings due to their higher initial costs. It is therefore essential to attract investors to further green building development through automated tools for construction projects. However, there is a historical dearth of automation of green building rating tools, an essential gap that motivates the development of an automated computerized programming tool. This paper presents proposed research that aims to develop an integrated web-based automated computerized programming tool that applies a green building rating assessment tool, green technology, and life cycle cost (LCC) analysis. It also identifies the variables of MyCrest and LCC to be integrated, develops them into a framework, and then transforms that framework into the automated tool. A mixed methodology of qualitative and quantitative surveys is planned to carry the MyCrest-LCC integration to an automated level. In this study, a preliminary literature review builds understanding of the integration of Green Building Rating Tools (GBRT) with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute towards green buildings and future agendas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, Wayne; Borders, Tammie
INL successfully developed a proof of concept for "Software Defined Anything" by emulating the laboratory's business applications that run on virtual machines. The work INL conducted demonstrates to industry how this methodology can be used to improve security, automate and repeat processes, and improve consistency.
Approximate Reasoning: Past, Present, Future
1990-06-27
This note presents a personal view of the state of the art in the representation and manipulation of imprecise and uncertain information by automated ... processing systems. To contrast their objectives and characteristics with the sound deductive procedures of classical logic, methodologies developed
Interfacing An Intelligent Decision-Maker To A Real-Time Control System
NASA Astrophysics Data System (ADS)
Evers, D. C.; Smith, D. M.; Staros, C. J.
1984-06-01
This paper discusses some of the practical aspects of implementing expert systems in a real-time environment. There is a conflict between the needs of a process control system and the computational load imposed by intelligent decision-making software. The computation required to manage a real-time control problem is primarily concerned with routine calculations which must be executed in real time. On most current hardware, non-trivial AI software should not be forced to operate under real-time constraints. In order for the system to work efficiently, the two processes must be separated by a well-defined interface. Although the precise nature of the task separation will vary with the application, the definition of the interface will need to follow certain fundamental principles in order to provide functional separation. This interface was successfully implemented in the expert scheduling software currently running the automated chemical processing facility at Lockheed-Georgia. Potential applications of this concept in the areas of airborne avionics and robotics will be discussed.
Pope, Catherine; Halford, Susan; Turnbull, Joanne; Prichard, Jane
2014-06-01
This article draws on data collected during a 2-year project examining the deployment of a computerised decision support system. This computerised decision support system was designed to be used by non-clinical staff for dealing with calls to emergency (999) and urgent care (out-of-hours) services. One of the promises of computerised decisions support technologies is that they can 'hold' vast amounts of sophisticated clinical knowledge and combine it with decision algorithms to enable standardised decision-making by non-clinical (clerical) staff. This article draws on our ethnographic study of this computerised decision support system in use, and we use our analysis to question the 'automated' vision of decision-making in healthcare call-handling. We show that embodied and experiential (human) expertise remains central and highly salient in this work, and we propose that the deployment of the computerised decision support system creates something new, that this conjunction of computer and human creates a cyborg practice.
Cognitive Performance in Operational Environments
NASA Technical Reports Server (NTRS)
Russo, Michael; McGhee, James; Friedler, Edna; Thomas, Maria
2005-01-01
Optimal cognition during complex and sustained operations is a critical component for success in current and future military operations. "Cognitive Performance, Judgment, and Decision-making" (CPJD) is a newly organized U.S. Army Medical Research and Materiel Command research program focused on sustaining operational effectiveness of Future Force Warriors by developing paradigms through which militarily-relevant, higher-order cognitive performance, judgment, and decision-making can be assessed and sustained in individuals, small teams, and leaders of network-centric fighting units. CPJD evaluates the impact of stressors intrinsic to military operational environments (e.g., sleep deprivation, workload, fatigue, temperature extremes, altitude, environmental/physiological disruption) on military performance, evaluates noninvasive automated methods for monitoring and predicting cognitive performance, and investigates pharmaceutical strategies (e.g., stimulant countermeasures, hypnotics) to mitigate performance decrements. This manuscript describes the CPJD program, discusses the metrics utilized to relate militarily applied research findings to academic research, and discusses how the simulated combat capabilities of a synthetic battle laboratory may facilitate future cognitive performance research.
Fast-mode duplex qPCR for BCR-ABL1 molecular monitoring: innovation, automation, and harmonization.
Gerrard, Gareth; Mudge, Katherine; Foskett, Pierre; Stevens, David; Alikian, Mary; White, Helen E; Cross, Nicholas C P; Apperley, Jane; Foroni, Letizia
2012-07-01
Reverse transcription quantitative polymerase chain reaction (RT-qPCR) is currently the most sensitive tool available for the routine monitoring of disease level in patients undergoing treatment for BCR-ABL1-associated malignancies. Considerable effort has been invested at both the local and international levels to standardise the methodology and reporting criteria used to assess this critical metric. In an effort to accommodate the demands of increasing sample throughput and greater standardisation, we adapted the current best-practice guidelines to encompass automation platforms and improved multiplex RT-qPCR technology.
Recycling isotachophoresis - A novel approach to preparative protein fractionation
NASA Technical Reports Server (NTRS)
Sloan, Jeffrey E.; Thormann, Wolfgang; Bier, Milan; Twitty, Garland E.; Mosher, Richard A.
1986-01-01
The concept of automated recycling isotachophoresis (RITP) as a purification methodology is discussed, in addition to a description of the apparatus. In the present automated RITP, the computer system follows the achievement of steady state using arrays of universal and specific sensors, monitors the position of the front edge of the zone structure, activates the counterflow if the leading boundary passes a specified position along the separation axis, or adjusts the applied current accordingly. The system demonstrates high resolution, in addition to higher processing rates than are possible in zone electrophoresis or isoelectric focusing.
Automated Verification of Specifications with Typestates and Access Permissions
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Catano, Nestor
2011-01-01
We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, currently underway, provides model checking support for the Plural tool, which currently supports only program verification via data flow analysis (DFA).
Methodology for Designing Operational Banking Risks Monitoring System
NASA Astrophysics Data System (ADS)
Kostjunina, T. N.
2018-05-01
The research looks at principles of designing an information system for monitoring operational banking risks. A proposed design methodology enables one to automate processes of collecting data on information security incidents in the banking network, serving as the basis for an integrated approach to the creation of an operational risk management system. The system can operate remotely ensuring tracking and forecasting of various operational events in the bank network. A structure of a content management system is described.
NASA Technical Reports Server (NTRS)
Nauda, A.
1982-01-01
Performance and reliability models of alternate microcomputer architectures as a methodology for optimizing system design were examined. A methodology for selecting an optimum microcomputer architecture for autonomous operation of planetary spacecraft power systems was developed. Various microcomputer system architectures are analyzed to determine their application to spacecraft power systems. It is suggested that no standardization formula or common set of guidelines exists which provides an optimum configuration for a given set of specifications.
Automated flood extent identification using WorldView imagery for the insurance industry
NASA Astrophysics Data System (ADS)
Geller, Christina
2017-10-01
Flooding is the most common and costly natural disaster around the world, causing the loss of human life and billions in economic and insured losses each year. In 2016, pluvial and fluvial floods caused an estimated 5.69 billion USD in losses worldwide with the most severe events occurring in Germany, France, China, and the United States. While catastrophe modeling has begun to help bridge the knowledge gap about the risk of fluvial flooding, understanding the extent of a flood - pluvial and fluvial - in near real-time allows insurance companies around the world to quantify the loss of property that their clients face during a flooding event and proactively respond. To develop this real-time, global analysis of flooded areas and the associated losses, a new methodology utilizing optical multi-spectral imagery from the DigitalGlobe (DGI) WorldView satellite suite is proposed for the extraction of pluvial and fluvial flood extents. This methodology involves identifying flooded areas visible to the sensor, filling in the gaps left by the built environment (i.e. buildings, trees) with a nearest neighbor calculation, and comparing the footprint against an Industry Exposure Database (IE) to calculate a loss estimate. Full automation of the methodology allows production of flood extents and associated losses anywhere around the world as required. The methodology has been tested and proven effective for the 2016 flood in Louisiana, USA.
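The nearest-neighbor gap-filling step described above can be sketched on a toy classification grid. The grid, labels, and brute-force search below are illustrative assumptions; the paper's production pipeline is not described in enough detail to reproduce.

```python
# Toy version of the gap-filling step: pixels occluded by buildings or
# trees (None) take the flood label of the nearest classified pixel.
# 1 = flooded, 0 = dry, None = occluded. Brute force for clarity.

def fill_gaps(grid):
    known = [(r, c) for r, row in enumerate(grid)
             for c, v in enumerate(row) if v is not None]
    out = [row[:] for row in grid]
    for r, row in enumerate(grid):
        for c, v in enumerate(row):
            if v is None:
                # nearest classified pixel by squared Euclidean distance
                nr, nc = min(known,
                             key=lambda p: (p[0] - r) ** 2 + (p[1] - c) ** 2)
                out[r][c] = grid[nr][nc]
    return out

mask = [
    [1, 1, None, 0],
    [1, None, 0, 0],
    [1, 1, 0, 0],
]
print(fill_gaps(mask))
```

The filled footprint would then be intersected with an exposure database to produce a loss estimate; ties between equidistant neighbors are resolved here by scan order, one of several choices a real pipeline would need to make explicitly.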
NASA Technical Reports Server (NTRS)
Oishi, Meeko; Tomlin, Claire; Degani, Asaf
2003-01-01
Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems (not fully automated) which have operational constraints we can pose in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.
Automated Authorship Attribution Using Advanced Signal Classification Techniques
Ebrahimpour, Maryam; Putniņš, Tālis J.; Berryman, Matthew J.; Allison, Andrew; Ng, Brian W.-H.; Abbott, Derek
2013-01-01
In this paper, we develop two automated authorship attribution schemes, one based on Multiple Discriminant Analysis (MDA) and the other based on a Support Vector Machine (SVM). The classification features we exploit are based on word frequencies in the text. We adopt an approach of preprocessing each text by stripping it of all characters except a-z and space. This is in order to increase the portability of the software to different types of texts. We test the methodology on a corpus of undisputed English texts, and use leave-one-out cross validation to demonstrate classification accuracies in excess of 90%. We further test our methods on the Federalist Papers, which have a partly disputed authorship and a fair degree of scholarly consensus. And finally, we apply our methodology to the question of the authorship of the Letter to the Hebrews by comparing it against a number of original Greek texts of known authorship. These tests identify where some of the limitations lie, motivating a number of open questions for future work. An open source implementation of our methodology is freely available for use at https://github.com/matthewberryman/author-detection. PMID:23437047
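The feature pipeline the paper describes (strip all characters except a-z and space, then build word-frequency vectors) can be sketched with a simplified classifier. The paper uses MDA and an SVM; the nearest-neighbor matching and toy corpus below are stand-in assumptions for illustration.

```python
import re
from collections import Counter

# Sketch of the paper's preprocessing and word-frequency features, with
# leave-one-out evaluation. Classifier and corpus are simplified stand-ins.

def features(text):
    cleaned = re.sub(r"[^a-z ]", "", text.lower())  # keep only a-z and space
    words = cleaned.split()
    return {w: n / len(words) for w, n in Counter(words).items()}

def distance(f, g):
    keys = set(f) | set(g)
    return sum((f.get(k, 0.0) - g.get(k, 0.0)) ** 2 for k in keys)

def leave_one_out(corpus):
    """corpus: list of (author, text); returns fraction classified correctly."""
    feats = [(a, features(t)) for a, t in corpus]
    correct = 0
    for i, (author, f) in enumerate(feats):
        rest = [x for j, x in enumerate(feats) if j != i]
        guess = min(rest, key=lambda x: distance(f, x[1]))[0]
        correct += guess == author
    return correct / len(corpus)

corpus = [
    ("A", "the whale and the sea and the ship"),
    ("A", "the sea and the whale"),
    ("B", "a court and a judge and a law"),
    ("B", "a judge and a court"),
]
print(leave_one_out(corpus))
```

Note that function words ("the" vs. "a") carry most of the signal in this toy example, mirroring the long-standing observation in stylometry that common-word frequencies are strong authorship markers.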
Decision-making regarding organ donation in Korean adults: A grounded-theory study.
Yeun, Eun Ja; Kwon, Young Mi; Kim, Jung A
2015-06-01
The aim of this study was to identify the hidden patterns of behavior leading toward the decision to donate organs. Thirteen registrants at the Association for Organ Sharing in Korea were recruited. Data were collected using in-depth interview and the interview transcripts were analyzed using Glaserian grounded-theory methodology. The main problem of participants was "body attachment" and the core category (management process) was determined to be "pursuing life." The theme consisted of four phases, which were: "hesitating," "investigating," "releasing," and "re-discovering." Therefore, to increase organ donations, it is important to find a strategy that will create positive attitudes about organ donation through education and public relations. These results explain and provide a deeper understanding of the main problem that Korean people have about organ donation and their management of decision-making processes. These findings can help care providers to facilitate the decision-making process and respond to public needs while taking into account the sociocultural context within which decisions are made. © 2014 Wiley Publishing Asia Pty Ltd.
Perspectives on benefit-risk decision-making in vaccinology: Conference report.
Greenberg, M; Simondon, F; Saadatian-Elahi, M
2016-01-01
Benefit/risk (B/R) assessment methods are increasingly being used by regulators and companies as an important decision-making tool and their outputs as the basis of communication. B/R appraisal of vaccines, as compared with drugs, is different due to their attributes and their use. For example, vaccines are typically given to healthy people, and, for some vaccines, benefits exist at both the population and individual levels. For vaccines in particular, factors such as the benefit afforded through herd effects, which varies as a function of vaccine coverage and consequently impacts the B/R ratio, should also be taken into consideration and parameterized in B/R assessment models. Currently, there is no single agreed methodology for vaccine B/R assessment that can fully capture all these aspects. The conference "Perspectives on Benefit-Risk Decision-making in Vaccinology," held in Annecy (France), addressed these issues and provided recommendations on how to advance the science and practice of B/R assessment of vaccines and vaccination programs.
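The point that herd effects make the B/R ratio coverage-dependent can be made concrete with a deliberately simple sketch. All parameter values and the linear herd-effect term below are invented for illustration; a real model would derive indirect protection from transmission dynamics.

```python
# Toy B/R calculation showing that, once indirect (herd) protection is a
# function of coverage, the individual-level B/R ratio of vaccination is
# no longer a fixed per-dose quantity. All numbers are illustrative.

def benefit_risk_ratio(coverage,
                       baseline_risk=0.01,    # disease risk with no herd effect
                       efficacy=0.9,          # direct protection in the vaccinated
                       adverse_risk=1e-5):    # per-dose risk of serious adverse event
    # Crude assumption: residual transmission falls linearly with coverage.
    herd_factor = 1.0 - 0.8 * coverage
    risk_if_unvaccinated = baseline_risk * herd_factor
    risk_if_vaccinated = risk_if_unvaccinated * (1.0 - efficacy)
    averted = risk_if_unvaccinated - risk_if_vaccinated   # individual benefit
    return averted / adverse_risk

for c in (0.1, 0.5, 0.9):
    print(f"coverage {c:.0%}: B/R = {benefit_risk_ratio(c):.0f}")
```

The individual-level ratio falls as coverage rises, because herd protection shrinks the risk an additional dose averts; this is exactly the kind of coverage-dependent effect the conference argued must be parameterized in vaccine B/R models.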
Connors, Brenda L; Rende, Richard; Colton, Timothy J
2013-01-01
There has been a surge of interest in examining the utility of methods for capturing individual differences in decision-making style. We illustrate the potential offered by Movement Pattern Analysis (MPA), an observational methodology that has been used in business and by the US Department of Defense to record body movements that provide predictive insight into individual differences in decision-making motivations and actions. Twelve military officers participated in an intensive 2-h interview that permitted detailed and fine-grained observation and coding of signature movements by trained practitioners using MPA. Three months later, these subjects completed four hypothetical decision-making tasks in which the amount of information sought out before coming to a decision, as well as the time spent on the tasks, were under the partial control of the subject. A composite MPA indicator of how a person allocates decision-making actions and motivations to balance both Assertion (exertion of tangible movement effort on the environment to make something occur) and Perspective (through movements that support shaping in the body to perceive and create a suitable viewpoint for action) was highly correlated with the total number of information draws and total response time: individuals high on Assertion reached for less information and had faster response times than those high on Perspective. Discussion focuses on the utility of using movement-based observational measures to capture individual differences in decision-making style and the implications for application in applied settings geared toward investigations of experienced leaders and world statesmen where individuality rules the day.
Lovell, Sarah; Walker, Robert J; Schollum, John B W; Marshall, Mark R; McNoe, Bronwen M; Derrett, Sarah
2017-01-01
Background: Issues related to renal replacement therapy in elderly people with end stage kidney disease (ESKD) are complex. There is inadequate empirical data related to: decision-making by older populations, treatment experiences, implications of dialysis treatment and treatment modality on quality of life, and how these link to expectations of ageing. Study population: Participants for this study were selected from a larger quantitative study of dialysis and predialysis patients aged 65 years or older recruited from three nephrology services across New Zealand. All participants had reached chronic kidney disease (CKD) stage 5 and had undergone dialysis education but had not started dialysis or had recently started dialysis within the past 6 months. Methodology: Serial qualitative interviews were undertaken to explore the decision-making processes and subsequent treatment experiences of patients with ESKD. Analytical approach: A framework method guided the iterative process of analysis. Decision-making codes were generated within NVivo software and then compared with the body of the interviews. Results: Interviews were undertaken with 17 participants. We observed that decision-making was often a fluid process, rather than occurring at a single point in time, and was heavily influenced by perceptions of oneself as becoming old, social circumstances, life events and health status. Limitations: This study focuses on participants' experiences of decision-making about treatment and does not include the perspectives of their nephrologists or other members of the nephrology team. Conclusions: Older patients often delay dialysis as an act of self-efficacy. They often do not commit to a dialysis decision following predialysis education. Delaying decision-making and initiating dialysis were common. This was not seen by participants as a final decision about therapy. 
Predialysis care and education should be different for older patients, who will delay decision-making until the time of facing obvious uraemic symptoms, threatening blood tests or paternalistic guidance from their nephrologist. Trial registration number: Australasian Clinical Trials Registry ACTRN 12611000024943; results. PMID:28360253
The Cognitive Challenges of Flying a Remotely Piloted Aircraft
NASA Technical Reports Server (NTRS)
Hobbs, Alan; Cardoza, Colleen; Null, Cynthia
2016-01-01
A large variety of Remotely Piloted Aircraft (RPA) designs are currently in production or in development. These aircraft range from small electric quadcopters that are flown close to the ground within visual range of the operator, to larger systems capable of extended flight in airspace shared with conventional aircraft. Before RPA can operate routinely and safely in civilian airspace, we need to understand the unique human factors associated with these aircraft. The task of flying an RPA in civilian airspace involves challenges common to the operation of other highly-automated systems, but also introduces new considerations for pilot perception, decision-making, and action execution. RPA pilots participated in focus groups where they were asked to recall critical incidents that either presented a threat to safety, or highlighted a case where the pilot contributed to system resilience or mission success. Ninety incidents were gathered from focus-groups. Human factor issues included the impact of reduced sensory cues, traffic separation in the absence of an out-the-window view, control latencies, vigilance during monotonous and ultra-long endurance flights, control station design considerations, transfer of control between control stations, the management of lost link procedures, and decision-making during emergencies. Some of these concerns have received significant attention in the literature, or are analogous to human factors of manned aircraft. The presentation will focus on issues that are poorly understood, and have not yet been the subject of extensive human factors study. Although many of the reported incidents were related to pilot error, the participants also provided examples of the positive contribution that humans make to the operation of highly-automated systems.
Adversarial reasoning and resource allocation: the LG approach
NASA Astrophysics Data System (ADS)
Stilman, Boris; Yakhnis, Vladimir; Umanskiy, Oleg; Boyd, Ron
2005-05-01
Many existing automated tools purporting to model the intelligent enemy utilize a fixed battle plan for the enemy while using flexible decisions of human players for the friendly side. According to the Naval Studies Board, "It is an open secret and a point of distress ... that too much of the substantive content of such M&S has its origin in anecdote, ..., or a narrow construction tied to stereotypical current practices of 'doctrinally correct behavior.'" Clearly, such runs lack objectivity by being heavily skewed in favor of the friendly forces. Presently, the military branches employ a variety of game-based simulators and synthetic environments, with manual (i.e., user-based) decision-making, for training and other purposes. However, without an ability to automatically generate the best strategies, tactics, and COA, the games serve mostly to display the current situation rather than form a basis for automated decision-making and effective training. We solve the problem of adversarial reasoning as a gaming problem employing Linguistic Geometry (LG), a new type of game theory demonstrating a significant increase in the size of gaming problems solvable in real and near-real time. It appears to be a viable approach for solving such practical problems as mission planning and battle management. Essentially, LG may be structured into two layers: game construction and game solving. Game construction includes construction of a game called an LG hypergame based on a hierarchy of Abstract Board Games (ABG). Game solving includes resource allocation for constructing an advantageous initial game state and strategy generation to reach a desirable final game state in the course of the game.
ERIC Educational Resources Information Center
Wendt, Oliver; Miller, Bridget
2012-01-01
Critical appraisal of the research literature is an essential step in informing and implementing evidence-based practice. Quality appraisal tools that assess the methodological quality of experimental studies provide a means to identify the most rigorous research suitable for evidence-based decision-making. In single-subject experimental research,…
ERIC Educational Resources Information Center
Slaven, Lori A.
2014-01-01
Purpose: The purpose of this study was to determine the degree of importance of Harvey et al.'s (1997) 13 problem-solving strategies for making retrenchment decisions on school district budgets as perceived by California superintendents of medium-sized school districts. Methodology: The subjects in the present study were 86 superintendents of…
Organisational Issues for E-Learning: Critical Success Factors as Identified by HE Practitioners
ERIC Educational Resources Information Center
McPherson, Maggie; Nunes, Miguel Baptista
2006-01-01
Purpose: The purpose of this paper is to report on a research project that identified organisational critical success factors (CSFs) for e-learning implementation in higher education (HE). These CSFs can be used as a theoretical foundation upon which to base decision-making and strategic thinking about e-learning. Design/methodology/approach: The…
Swift and Smart Decision Making: Heuristics that Work
ERIC Educational Resources Information Center
Hoy, Wayne K.; Tarter, C. J.
2010-01-01
Purpose: The aim of this paper is to examine the research literature on decision making and identify and develop a set of heuristics that work for school decision makers. Design/methodology/approach: This analysis is a synthesis of the research on decision-making heuristics that work. Findings: A set of nine rules for swift and smart decision…
Graduate Career-Making and Business Start-Up: A Literature Review
ERIC Educational Resources Information Center
Nabi, Ghulam; Holden, Rick; Walmsley, Andreas
2006-01-01
Purpose: The purpose of this article is to provide a selective review of literature on the career-related decision-making processes in terms of the transition from student to business start-up, and the nature and influence of support and guidance. Design/methodology/approach: Primarily, a critical review of a range of recently published literature…
ERIC Educational Resources Information Center
Leshowitz, Barry; Okun, Morris
2011-01-01
Research in social cognition laboratories and in simulated legal settings demonstrates that people often do not understand the statistical properties of evidence and are unable to detect scientifically flawed studies. In a mock jury study, we examined the effects of an evidence-based transcript on scepticism towards evidence obtained in flawed…
Teaching an Application of Bayes' Rule for Legal Decision-Making: Measuring the Strength of Evidence
ERIC Educational Resources Information Center
Satake, Eiki; Murray, Amy Vashlishan
2014-01-01
Although Bayesian methodology has become a powerful approach for describing uncertainty, it has largely been avoided in undergraduate statistics education. Here we demonstrate that one can present Bayes' Rule in the classroom through a hypothetical, yet realistic, legal scenario designed to spur the interests of students in introductory- and…
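The classroom application of Bayes' Rule described above can be shown with a small, self-contained sketch. The prior, hit rate, and false-positive rate below are purely illustrative and not taken from the article's scenario:

```python
def posterior(prior, p_evidence_given_guilt, p_evidence_given_innocence):
    """Bayes' rule: P(guilt | evidence) from a prior and two likelihoods."""
    numerator = p_evidence_given_guilt * prior
    denominator = numerator + p_evidence_given_innocence * (1.0 - prior)
    return numerator / denominator

# Hypothetical numbers: prior probability of guilt 10%, the evidence
# matches a guilty party 80% of the time and an innocent one 1% of the time.
p = posterior(0.10, 0.80, 0.01)   # roughly 0.90 -- evidence is strong
```

The ratio of the two likelihoods (here 80) is the "strength of evidence" the title refers to: the larger it is, the more the posterior moves away from the prior.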
Policy Implications Analysis: A Methodological Advancement for Policy Research and Evaluation.
ERIC Educational Resources Information Center
Madey, Doren L.; Stenner, A. Jackson
Policy Implications Analysis (PIA) is a tool designed to maximize the likelihood that an evaluation report will have an impact on decision-making. PIA was designed to help people planning and conducting evaluations tailor their information so that it has optimal potential for being used and acted upon. This paper describes the development and…
ERIC Educational Resources Information Center
Bruininks, Robert H.
The project sought to clarify the nature and structure of adaptive functioning and to address methodological issues in its assessment, in order to improve placement, evaluation, and instructional decision-making related to adaptive functioning. Project components included: (1) exploration of the structure of adaptive behavior; (2) comparison of…
Life Design Counseling Group Intervention with Portuguese Adolescents: A Process and Outcome Study
ERIC Educational Resources Information Center
Cardoso, Paulo; Janeiro, Isabel Nunes; Duarte, Maria Eduarda
2018-01-01
This article examines the process and outcome of a life design counseling group intervention with students in Grades 9 and 12. First, we applied a quasi-experimental methodology to analyze the intervention's effectiveness in promoting career certainty, career decision-making, self-efficacy, and career adaptability in a sample of 236 students.…
School Principals as Agents of Reform of the Russian Education System
ERIC Educational Resources Information Center
Kasprzhak, A. G.; Filinov, N. B.; Bayburin, R. F.; Isaeva, N. V.; Bysik, N. V.
2015-01-01
The paper is based on the results of a study of secondary school principal decision-making styles conducted in eight regions of the Russian Federation (one per federal district) in 2014 using the methodological approach developed by Alan J. Rowe. The study aimed to assess the reformist potential of Russian school principals. We believe that this…
Methods of the Development Strategy of Service Companies: Logistical Approach
ERIC Educational Resources Information Center
Toymentseva, Irina A.; Karpova, Natalya P.; Toymentseva, Angelina A.; Chichkina, Vera D.; Efanov, Andrey V.
2016-01-01
The urgency of the analyzed issue is due to lack of attention of heads of service companies to the theory and methodology of strategic management, methods and models of management decision-making in times of economic instability. The purpose of the article is to develop theoretical positions and methodical recommendations on the formation of the…
Key concepts and methods in social vulnerability and adaptive capacity
Daniel J. Murphy; Carina Wyborn; Laurie Yung; Daniel R. Williams
2015-01-01
National forests have been asked to assess how climate change will impact nearby human communities. To assist their thinking on this topic, we examine the concepts of social vulnerability and adaptive capacity with an emphasis on a range of theoretical and methodological approaches. This analysis is designed to help researchers and decision-makers select appropriate...
Cognitive Task Analysis for Instruction in Single-Injection Ultrasound Guided-Regional Anesthesia
ERIC Educational Resources Information Center
Gucev, Gligor V.
2012-01-01
Cognitive task analysis (CTA) is methodology for eliciting knowledge from subject matter experts. CTA has been used to capture the cognitive processes, decision-making, and judgments that underlie expert behaviors. A review of the literature revealed that CTA has not yet been used to capture the knowledge required to perform ultrasound guided…
Using basic, easily attainable GIS data, AGWA provides a simple, direct, and repeatable methodology for hydrologic model setup, execution, and visualization. AGWA sees activity from over 170 countries and has been downloaded over 11,000 times.
Rogiers, Bart; Mallants, Dirk; Batelaan, Okke; Gedeon, Matej; Huysmans, Marijke; Dassargues, Alain
2017-01-01
Cone penetration testing (CPT) is one of the most efficient and versatile methods currently available for geotechnical, lithostratigraphic and hydrogeological site characterization. Currently available methods for soil behaviour type (SBT) classification of CPT data, however, have severe limitations, often restricting their application to a local scale. For parameterization of regional groundwater flow or geotechnical models, and delineation of regional hydro- or lithostratigraphy, regional SBT classification would be very useful. This paper investigates the use of model-based clustering for SBT classification, and the influence of different clustering approaches on the properties and spatial distribution of the obtained soil classes. We additionally propose a methodology for automated lithostratigraphic mapping of regionally occurring sedimentary units using SBT classification. The methodology is applied to a large CPT dataset, covering a groundwater basin of ~60 km2 with predominantly unconsolidated sandy sediments in northern Belgium. Results show that the model-based approach is superior in detecting the true lithological classes when compared to more frequently applied unsupervised classification approaches or literature classification diagrams. We demonstrate that automated mapping of lithostratigraphic units using advanced SBT classification techniques can provide a large gain in efficiency, compared to more time-consuming manual approaches, and yields at least equally accurate results. PMID:28467468
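As a rough illustration of the model-based clustering idea (not the authors' algorithm), a two-component Gaussian mixture fitted by expectation-maximization can separate synthetic cone-resistance values into clay-like and sand-like classes. All data values below are hypothetical:

```python
import math

def em_two_gaussians(x, iters=50):
    """Fit a two-component 1-D Gaussian mixture by EM (pure stdlib sketch)."""
    mu = [min(x), max(x)]          # crude but deterministic initialization
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    resp = []
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for xi in x:
            p = [w[k] * math.exp(-(xi - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k]) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(x)
            mu[k] = sum(r[k] * xi for r, xi in zip(resp, x)) / nk
            var[k] = sum(r[k] * (xi - mu[k]) ** 2
                         for r, xi in zip(resp, x)) / nk + 1e-6
    labels = [0 if r[0] >= r[1] else 1 for r in resp]
    return mu, labels

# Hypothetical cone-resistance values (MPa): soft clay vs dense sand
qc = [0.8, 1.1, 0.9, 1.2, 1.0, 9.5, 10.2, 9.8, 10.5, 10.0]
means, labels = em_two_gaussians(qc)
```

A real SBT workflow would cluster several CPT-derived features jointly (e.g. cone resistance and friction ratio) rather than a single 1-D variable.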
A Hybrid-Cloud Science Data System Enabling Advanced Rapid Imaging & Analysis for Monitoring Hazards
NASA Astrophysics Data System (ADS)
Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Moore, A. W.; Fielding, E. J.; Radulescu, C.; Sacco, G.; Stough, T. M.; Mattmann, C. A.; Cervelli, P. F.; Poland, M. P.; Cruz, J.
2012-12-01
Volcanic eruptions, landslides, and levee failures are some examples of hazards that can be more accurately forecasted with sufficient monitoring of precursory ground deformation, such as the high-resolution measurements from GPS and InSAR. In addition, coherence and reflectivity change maps can be used to detect surface change due to lava flows, mudslides, tornadoes, floods, and other natural and man-made disasters. However, it is difficult for many volcano observatories and other monitoring agencies to process GPS and InSAR products in an automated scenario needed for continual monitoring of events. Additionally, numerous interoperability barriers exist in multi-sensor observation data access, preparation, and fusion to create actionable products. Combining high spatial resolution InSAR products with high temporal resolution GPS products--and automating this data preparation & processing across global-scale areas of interests--present an untapped science and monitoring opportunity. The global coverage offered by satellite-based SAR observations, and the rapidly expanding GPS networks, can provide orders of magnitude more data on these hazardous events if we have a data system that can efficiently and effectively analyze the voluminous raw data, and provide users the tools to access data from their regions of interest. Currently, combined GPS & InSAR time series are primarily generated for specific research applications, and are not implemented to run on large-scale continuous data sets and delivered to decision-making communities. We are developing an advanced service-oriented architecture for hazard monitoring leveraging NASA-funded algorithms and data management to enable both science and decision-making communities to monitor areas of interests via seamless data preparation, processing, and distribution. 
Our objectives are to:
* Enable high-volume, low-latency automatic generation of NASA Solid Earth science data products (InSAR and GPS) to support hazards monitoring.
* Facilitate NASA-USGS collaborations to share NASA InSAR and GPS data products, which are difficult to process at high volume and low latency, for decision support.
* Enable interoperable discovery, access, and sharing of NASA observations and derived actionable products between the observation and decision-making communities.
* Enable improved understanding of these products through visualization, mining, and cross-agency sharing.
Existing InSAR & GPS processing packages and other software are integrated for generating geodetic decision-support monitoring products. We employ semantic and cloud-based data management and processing techniques to handle large data volumes, reduce end-product latency, codify data system information with semantics, and deploy interoperable services that deliver actionable products to decision-making communities.
Hubble, Lee J; Cooper, James S; Sosa-Pintos, Andrea; Kiiveri, Harri; Chow, Edith; Webster, Melissa S; Wieczorek, Lech; Raguse, Burkhard
2015-02-09
Chemiresistor sensor arrays are a promising technology to replace current laboratory-based analysis instrumentation, with the advantage of facile integration into portable, low-cost devices for in-field use. To increase the performance of chemiresistor sensor arrays a high-throughput fabrication and screening methodology was developed to assess different organothiol-functionalized gold nanoparticle chemiresistors. This high-throughput fabrication and testing methodology was implemented to screen a library consisting of 132 different organothiol compounds as capping agents for functionalized gold nanoparticle chemiresistor sensors. The methodology utilized an automated liquid handling workstation for the in situ functionalization of gold nanoparticle films and subsequent automated analyte testing of sensor arrays using a flow-injection analysis system. To test the methodology we focused on the discrimination and quantitation of benzene, toluene, ethylbenzene, p-xylene, and naphthalene (BTEXN) mixtures in water at low microgram per liter concentration levels. The high-throughput methodology identified a sensor array configuration consisting of a subset of organothiol-functionalized chemiresistors which in combination with random forests analysis was able to predict individual analyte concentrations with overall root-mean-square errors ranging between 8-17 μg/L for mixtures of BTEXN in water at the 100 μg/L concentration. The ability to use a simple sensor array system to quantitate BTEXN mixtures in water at the low μg/L concentration range has direct and significant implications to future environmental monitoring and reporting strategies. In addition, these results demonstrate the advantages of high-throughput screening to improve the performance of gold nanoparticle based chemiresistors for both new and existing applications.
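The paper pairs its sensor array with random forests; as a much simpler stand-in, the core quantitation problem (recovering analyte concentrations from cross-sensitive array responses) can be sketched with a linear calibration model. All sensitivities and readings below are hypothetical:

```python
def solve2(S, r):
    """Solve S @ c = r for a 2-sensor / 2-analyte calibration (Cramer's rule)."""
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    c0 = (r[0] * S[1][1] - S[0][1] * r[1]) / det
    c1 = (S[0][0] * r[1] - r[0] * S[1][0]) / det
    return [c0, c1]

# Hypothetical sensitivity matrix: response of sensors 1-2 per ug/L of,
# say, benzene and toluene (each sensor responds to both analytes)
S = [[2.0, 0.5],
     [0.3, 1.5]]
responses = [225.0, 105.0]      # measured array responses
conc = solve2(S, responses)     # recovered concentrations (ug/L)
```

Random forests replace this linear inversion with a nonlinear, nonparametric one, which is what allows quantitation of five-analyte BTEXN mixtures from a larger array.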
Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas
2017-03-01
Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV). Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining 1.31% mean absolute CPA error, with 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time.
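A toy version of the staged clustering idea (not the authors' pipeline) can be run on a synthetic grayscale "image": cluster pixels to split background from tissue, cluster again within tissue to isolate collagen, then report the proportional area. All pixel values are invented:

```python
def kmeans2(vals, iters=20):
    """1-D two-cluster k-means; returns the two cluster centers."""
    c = [min(vals), max(vals)]
    for _ in range(iters):
        lo = [v for v in vals if abs(v - c[0]) <= abs(v - c[1])]
        hi = [v for v in vals if abs(v - c[0]) > abs(v - c[1])]
        c = [sum(lo) / len(lo) if lo else c[0],
             sum(hi) / len(hi) if hi else c[1]]
    return c

# Synthetic grayscale "biopsy": 50 background pixels (240), 40 parenchyma
# pixels (140), 10 collagen-stained pixels (60) -- all values hypothetical
pixels = [240] * 50 + [140] * 40 + [60] * 10

# Stage 1: separate background (bright) from tissue (darker)
c = kmeans2(pixels)
tissue = [v for v in pixels if abs(v - min(c)) < abs(v - max(c))]

# Stage 2: within tissue, separate collagen (darkest) from parenchyma
ct = kmeans2(tissue)
collagen = [v for v in tissue if abs(v - min(ct)) < abs(v - max(ct))]

cpa = 100.0 * len(collagen) / len(tissue)   # collagen proportional area, %
```

The real method works on stained color images and inserts a classification stage between the two clustering stages to discard non-liver tissue before the ratio is computed.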
Roehner, Nicholas; Myers, Chris J
2014-02-21
Recently, we have begun to witness the potential of synthetic biology, noted here in the form of bacteria and yeast that have been genetically engineered to produce biofuels, manufacture drug precursors, and even invade tumor cells. The success of these projects, however, has often failed in translation and application to new projects, a problem exacerbated by a lack of engineering standards that combine descriptions of the structure and function of DNA. To address this need, this paper describes a methodology to connect the systems biology markup language (SBML) to the synthetic biology open language (SBOL), existing standards that describe biochemical models and DNA components, respectively. Our methodology involves first annotating SBML model elements such as species and reactions with SBOL DNA components. A graph is then constructed from the model, with vertices corresponding to elements within the model and edges corresponding to the cause-and-effect relationships between these elements. Lastly, the graph is traversed to assemble the annotating DNA components into a composite DNA component, which is used to annotate the model itself and can be referenced by other composite models and DNA components. In this way, our methodology can be used to build up a hierarchical library of models annotated with DNA components. Such a library is a useful input to any future genetic technology mapping algorithm that would automate the process of composing DNA components to satisfy a behavioral specification. Our methodology for SBML-to-SBOL annotation is implemented in the latest version of our genetic design automation (GDA) software tool, iBioSim.
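The annotate-then-traverse idea can be sketched with plain dictionaries. The element names and SBOL-style IDs below are invented, and this is not the iBioSim or libSBOL API; it only mirrors the described graph traversal that orders annotating DNA components into a composite:

```python
# Hypothetical miniature: SBML-like model elements carry SBOL-like DNA
# component annotations, and cause-and-effect edges link the elements.
model = {
    "annotations": {"pTet": "SBOL:promoter_pTet",
                    "lacI": "SBOL:cds_lacI",
                    "T1":   "SBOL:terminator_T1"},
    "edges": [("pTet", "lacI"), ("lacI", "T1")],  # cause -> effect
}

def assemble_composite(model, start):
    """Walk the cause-and-effect chain, collecting annotating components."""
    succ = dict(model["edges"])   # assumes a simple linear pathway
    order, node = [], start
    while node is not None:
        order.append(model["annotations"][node])
        node = succ.get(node)
    return order

composite = assemble_composite(model, "pTet")
```

In the actual methodology the resulting composite component is itself attached as an annotation on the model, so composite models can reference each other hierarchically.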
A Computational Framework for Automation of Point Defect Calculations
NASA Astrophysics Data System (ADS)
Goyal, Anuj; Gorai, Prashun; Peng, Haowei; Lany, Stephan; Stevanovic, Vladan; National Renewable Energy Laboratory, Golden, Colorado 80401 Collaboration
A complete and rigorously validated open-source Python framework to automate point defect calculations using density functional theory has been developed. The framework provides an effective and efficient method for defect structure generation, and creation of simple yet customizable workflows to analyze defect calculations. The package provides the capability to compute widely accepted correction schemes to overcome finite-size effects, including (1) potential alignment, (2) image-charge correction, and (3) band filling correction to shallow defects. Using Si, ZnO and In2O3 as test examples, we demonstrate the package capabilities and validate the methodology. We believe that a robust automated tool like this will enable the materials-by-design community to assess the impact of point defects on materials performance.
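One of the named correction schemes, the image-charge correction, is simple enough to sketch in its leading-order (Makov-Payne monopole) form. The charge state, Madelung constant, dielectric constant, and cell size below are hypothetical, and the actual package applies more elaborate schemes than this single term:

```python
# Leading-order image-charge correction for a charged defect in a
# periodic supercell: E_corr = q^2 * alpha_M / (2 * eps * L)
COULOMB_EV_ANG = 14.3996      # e^2 / (4 pi eps_0), in eV * Angstrom

def image_charge_correction(q, alpha_madelung, eps, L_ang):
    """First-order monopole correction in eV (q in units of e, L in Angstrom)."""
    return COULOMB_EV_ANG * q ** 2 * alpha_madelung / (2.0 * eps * L_ang)

# Hypothetical case: a 2+ defect, simple-cubic Madelung constant,
# dielectric constant 10, cubic cell of side 10 Angstrom
e_corr = image_charge_correction(2, 2.8373, 10.0, 10.0)   # ~0.82 eV
```

The correction scales as 1/L, which is why it matters most for the small supercells typical of DFT defect calculations.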
Automated analysis of clonal cancer cells by intravital imaging
Coffey, Sarah Earley; Giedt, Randy J; Weissleder, Ralph
2013-01-01
Longitudinal analyses of single cell lineages over prolonged periods have been challenging, particularly in processes characterized by high cell turnover such as inflammation, proliferation, or cancer. RGB marking has emerged as an elegant approach for enabling such investigations. However, methods for automated image analysis continue to be lacking. Here, to address this, we created a number of different multicolored poly- and monoclonal cancer cell lines for in vitro and in vivo use. To classify these cells in large-scale data sets, we subsequently developed and tested an automated algorithm based on hue selection. Our results showed that this method allows accurate analyses at a fraction of the computational time required by more complex color classification methods. Moreover, the methodology should be broadly applicable to both in vitro and in vivo analyses. PMID:24349895
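A minimal hue-gate classifier in the spirit of the described hue-selection method can be written with the standard library alone. The hue boundaries below are illustrative, not the authors':

```python
import colorsys

def hue_label(r, g, b):
    """Assign an RGB-marked cell to a coarse clone class by hue (degrees)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    deg = h * 360.0
    if deg < 60 or deg >= 300:   # hypothetical gate boundaries
        return "red"
    if deg < 180:
        return "green"
    return "blue"

# Mean RGB of three segmented cells (values invented)
cells = [(200, 30, 30), (30, 200, 30), (30, 30, 200)]
clones = [hue_label(*c) for c in cells]
```

Gating on hue alone is what makes the approach cheap: brightness and saturation variation across the image are ignored, so no per-image color model needs to be fit.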
Automating the Generation of the Cassini Tour Atlas Database
NASA Technical Reports Server (NTRS)
Grazier, Kevin R.; Roumeliotis, Chris; Lange, Robert D.
2010-01-01
The Tour Atlas is a large database of geometrical tables, plots, and graphics used by Cassini science planning engineers and scientists primarily for science observation planning. Over time, as the contents of the Tour Atlas grew, the amount of time it took to recreate the Tour Atlas similarly grew--to the point that it took one person a week of effort. When Cassini tour designers estimated that they were going to create approximately 30 candidate Extended Mission trajectories--which needed to be analyzed for science return in a short amount of time--it became a necessity to automate. We report on the automation methodology that reduced the amount of time it took one person to (re)generate a Tour Atlas from a week to, literally, one UNIX command.
Artificial intelligence for multi-mission planetary operations
NASA Technical Reports Server (NTRS)
Atkinson, David J.; Lawson, Denise L.; James, Mark L.
1990-01-01
A brief introduction is given to an automated system called the Spacecraft Health Automated Reasoning Prototype (SHARP). SHARP is designed to demonstrate automated health and status analysis for multi-mission spacecraft and ground data systems operations. The SHARP system combines conventional computer science methodologies with artificial intelligence techniques to produce an effective method for detecting and analyzing potential spacecraft and ground systems problems. The system performs real-time analysis of spacecraft and other related telemetry, and is also capable of examining data in historical context. Telecommunications link analysis of the Voyager II spacecraft is the initial focus for evaluation of the prototype in a real-time operations setting during the Voyager spacecraft encounter with Neptune in August, 1989. The preliminary results of the SHARP project and plans for future application of the technology are discussed.
Methodology of management of dredging operations II. Applications.
Junqua, G; Abriak, N E; Gregoire, P; Dubois, V; Mac Farlane, F; Damidot, D
2006-04-01
This paper presents the new methodology of management of dredging operations. Derived partly from existing methodologies (OECD, PNUE, AIPCN), it aims to be more comprehensive, mixing the qualities and the complementarities of previous methodologies. The application of the methodology has been carried out on the site of the Port of Dunkirk (FRANCE). Thus, a characterization of the sediments of this site has allowed a zoning of the Port to be established into zones of probable homogeneity of sediments. Moreover, sources of pollution have been identified, with an aim of prevention. Ways of adding value to the dredged waste have also been developed to meet regional needs, from the perspective of competitive and territorial intelligence. Their development has required a pooling of resources between professionals, research centres and local communities, according to principles of industrial ecology. Lastly, a MultiCriteria Decision-Making Aid (MCDMA) tool has been used to determine the most relevant scenario (or alternative, or action) for a dredging operation intended by the Port of Dunkirk. These applications have confirmed the relevance of this methodology for the management of dredging operations.
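The weighted-sum scoring at the heart of many multicriteria decision aids can be sketched in a few lines. The scenario names, criteria, weights, and scores below are all hypothetical, and the paper's actual MCDMA tool is likely more sophisticated than a simple weighted sum:

```python
# Generic weighted-sum MCDM sketch for choosing a dredging scenario.
# Scores are on a 0-10 scale where higher is better for every criterion.
weights = {"cost": 0.4, "environment": 0.4, "time": 0.2}
scenarios = {
    "offshore_disposal": {"cost": 8, "environment": 2, "time": 9},
    "beneficial_reuse":  {"cost": 5, "environment": 9, "time": 5},
}

def score(alternative):
    """Weighted sum of criterion scores for one scenario."""
    return sum(weights[c] * v for c, v in alternative.items())

best = max(scenarios, key=lambda name: score(scenarios[name]))
```

With these (invented) numbers the environmental weighting tips the choice toward beneficial reuse, illustrating how the weight vector encodes the decision-maker's priorities.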
Top Level Space Cost Methodology (TLSCM)
1997-12-02
Contents excerpt: Software; ACEIT; Ground Rules and Assumptions; Typical Life Cycle Cost Distribution; Methodologies; Cost/budget Threshold; Analogy. Text fragments: Analogy ... which is based on real-time Air Force and space programs (Ref. 25:2-8, 2-9). ACEIT: Automated Cost Estimating Integrated Tools (ACEIT), Tecolote Research, Inc. There is a way to use the ACEIT cost program to get a print-out of an expanded WBS; therefore, find someone that has ACEIT experience.