2017-06-01
Unmanned Tactical Autonomous Control and Collaboration (UTACC) Human-Machine Integration Measures of Performance and Measures of Effectiveness
Thomas A...
The Unmanned Tactical Autonomous Control and Collaboration (UTACC) program seeks to integrate Marines and autonomous machines to address the challenges encountered in...
Preliminary Framework for Human-Automation Collaboration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna Helene; Le Blanc, Katya Lee; Spielman, Zachary Alexander
The Department of Energy's Advanced Reactor Technologies Program sponsors research, development, and deployment activities through its Next Generation Nuclear Plant, Advanced Reactor Concepts, and Advanced Small Modular Reactor (aSMR) Programs to promote safety, technical, economical, and environmental advancements of innovative Generation IV nuclear energy technologies. The Human Automation Collaboration (HAC) Research Project is located under the aSMR Program, which identifies developing advanced instrumentation and controls and human-machine interfaces as one of four key research areas. It is expected that the new nuclear power plant designs will employ technology significantly more advanced than the analog systems in the existing reactor fleet, as well as utilizing automation to a greater extent. Moving towards more advanced technology and more automation does not necessarily imply more efficient and safer operation of the plant. Instead, a number of concerns about how these technologies will affect human performance and the overall safety of the plant need to be addressed. More specifically, it is important to investigate how the operator and the automation work as a team to ensure effective and safe plant operation, also known as human-automation collaboration (HAC). The focus of the HAC research is to understand how various characteristics of automation (such as its reliability, processes, and modes) affect an operator's use and awareness of plant conditions. In other words, the research team investigates how to best design the collaboration between the operators and the automated systems in a manner that has the greatest positive impact on overall plant performance and reliability. This report addresses the Department of Energy milestone M4AT-15IN2302054, Complete Preliminary Framework for Human-Automation Collaboration, by discussing the two-phase development of a preliminary HAC framework.
The framework developed in the first phase was used as the basis for selecting topics to be investigated in more detail. The results and insights gained from the in-depth studies conducted during the second phase were used to revise the framework. This report describes the basis for the framework developed in phase 1, the changes made to the framework in phase 2, and the basis for the changes. Additional research needs are identified and presented in the last section of the report.
CHISSL: A Human-Machine Collaboration Space for Unsupervised Learning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arendt, Dustin L.; Komurlu, Caner; Blaha, Leslie M.
We developed CHISSL, a human-machine interface that utilizes supervised machine learning in an unsupervised context to help the user group unlabeled instances according to her own mental model. The user primarily interacts via correction (moving a misplaced instance into its correct group) or confirmation (accepting that an instance is placed in its correct group). Concurrent with the user's interactions, CHISSL trains a classification model guided by the user's grouping of the data. It then predicts the group of unlabeled instances and arranges some of these alongside the instances manually organized by the user. We hypothesize that this mode of human and machine collaboration is more effective than Active Learning, wherein the machine decides for itself which instances should be labeled by the user. We found supporting evidence for this hypothesis in a pilot study where we applied CHISSL to organize a collection of handwritten digits.
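The correction/confirmation loop described in this abstract can be sketched roughly as follows (an illustrative stand-in, not the authors' implementation; a nearest-centroid model substitutes for whatever classifier CHISSL actually uses):

```python
# Illustrative sketch of a CHISSL-style loop: user corrections/confirmations
# place a few instances into groups, a simple nearest-centroid model is
# retrained on those placements, and the remaining instances are regrouped.

def centroid(points):
    """Mean of a list of equal-length feature vectors."""
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def nearest_group(x, centroids):
    """Return the group whose centroid is closest to x (squared distance)."""
    def d2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda g: d2(x, centroids[g]))

def regroup(labeled, unlabeled):
    """labeled: {group: [vectors]}; returns a predicted group per unlabeled vector."""
    cents = {g: centroid(vs) for g, vs in labeled.items()}
    return [nearest_group(x, cents) for x in unlabeled]

# The user has confirmed two groups by placing a few instances in each;
# the model then predicts groups for the rest.
labeled = {"low": [[0.1, 0.2], [0.0, 0.1]], "high": [[0.9, 1.0], [1.1, 0.8]]}
predictions = regroup(labeled, [[0.05, 0.15], [1.0, 0.9]])
print(predictions)  # → ['low', 'high']
```

Each new correction simply moves a vector between the lists in `labeled` and triggers another `regroup` pass, which is what lets the model track the user's mental model incrementally.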
NASA Technical Reports Server (NTRS)
Wellens, A. Rodney
1991-01-01
Both NASA and DoD have had a long-standing interest in teamwork, distributed decision making, and automation. While research on these topics has been pursued independently, it is becoming increasingly clear that the integration of social, cognitive, and human factors engineering principles will be necessary to meet the challenges of highly sophisticated scientific and military programs of the future. Images of human/intelligent-machine electronic collaboration were drawn from NASA and Air Force reports as well as from other sources. Here, areas of common concern are highlighted. A description of the author's research program testing a 'psychological distancing' model of electronic media effects and human/expert system collaboration is given.
Nyholm, Sven
2017-07-18
Many ethicists writing about automated systems (e.g. self-driving cars and autonomous weapons systems) attribute agency to these systems. Not only that; they seemingly attribute an autonomous or independent form of agency to these machines. This leads some ethicists to worry about responsibility-gaps and retribution-gaps in cases where automated systems harm or kill human beings. In this paper, I consider what sorts of agency it makes sense to attribute to most current forms of automated systems, in particular automated cars and military robots. I argue that whereas it indeed makes sense to attribute different forms of fairly sophisticated agency to these machines, we ought not to regard them as acting on their own, independently of any human beings. Rather, the right way to understand the agency exercised by these machines is in terms of human-robot collaborations, where the humans involved initiate, supervise, and manage the agency of their robotic collaborators. This means, I argue, that there is much less room for justified worries about responsibility-gaps and retribution-gaps than many ethicists think.
Applications Using High Flux LCS gamma-ray Beams: Nuclear Security and Contributions to Fukushima
NASA Astrophysics Data System (ADS)
Fujiwara, Mamoru
2014-09-01
Nuclear nonproliferation and security are an important issue for the peaceful use of nuclear energy. Many countries now collaborate to prevent serious accidents arising from nuclear terrorism. Detection of hidden long-lived radioisotopes and fissionable nuclides in a non-destructive manner is useful for nuclear safeguards and the management of nuclear wastes as well as nuclear security. After introducing the present situation concerning nuclear nonproliferation and security in Japan, we present the current activities of JAEA to detect hidden nuclear materials by means of nuclear resonance fluorescence with energy-tunable, monochromatic gamma-rays generated by laser Compton scattering (LCS) with an electron beam. The energy recovery linac (ERL) machine is now under development in a KEK-JAEA collaboration to realize a new generation of gamma-ray sources. The detection technologies for nuclear materials are currently being developed using the existing electron beam facilities at Duke University and at NewSUBARU. These developments in Japan will contribute to the nuclear security program in Japan and to the assay of melted nuclear fuels in the Fukushima Daiichi nuclear power plants.
Controlled English to facilitate human/machine analytical processing
NASA Astrophysics Data System (ADS)
Braines, Dave; Mott, David; Laws, Simon; de Mel, Geeth; Pham, Tien
2013-06-01
Controlled English (CE) is a human-readable information representation format that is implemented using a restricted subset of the English language, but which is unambiguous and directly accessible by simple machine processes. We have been researching the capabilities of CE in a number of contexts, and exploring the degree to which a flexible and more human-friendly information representation format could aid the intelligence analyst in a multi-agent collaborative operational environment, especially in cases where the agents are a mixture of other human users and machine processes aimed at assisting the human users. CE itself is built upon a formal logic basis, but allows users to easily specify models for a domain of interest in a human-friendly language. In our research we have been developing an experimental component known as the "CE Store" in which CE information can be quickly and flexibly processed and shared between human and machine agents. The CE Store environment contains a number of specialized machine agents for common processing tasks and also supports execution of logical inference rules that can be defined in the same CE language. This paper outlines the basic architecture of this approach, discusses some of the example machine agents that have been developed, and provides some typical examples of the CE language and the way in which it has been used to support complex analytical tasks on synthetic data sources. We highlight the fusion of human and machine processing supported through the use of the CE language and CE Store environment, and show this environment with examples of highly dynamic extensions to the model(s) and integration between different user-defined models in a collaborative setting.
Framework for Building Collaborative Research Environment
Devarakonda, Ranjeet; Palanisamy, Giriprakash; San Gil, Inigo
2014-10-25
A wide range of expertise and technologies is the key to solving some global problems. Semantic web technology can revolutionize the nature of how scientific knowledge is produced and shared. The semantic web is all about enabling machine-machine readability instead of a routine human-human interaction. Carefully structured, machine-readable data is the key to enabling these interactions. Drupal is an example of one such toolset that can render all the functionalities of Semantic Web technology right out of the box. Drupal's content management system automatically stores the data in a structured format, enabling it to be machine readable. Within this paper, we will discuss how Drupal promotes collaboration in a research setting such as Oak Ridge National Laboratory (ORNL) and the Long Term Ecological Research Center (LTER) and how it is effectively using the Semantic Web in achieving this.
ERIC Educational Resources Information Center
Edwards, Autumn; Edwards, Chad
2017-01-01
Educational encounters of the future (and increasingly, of the present) will involve a complex collaboration of human and machine intelligences and agents, partnering to enhance learning and growth. Increasingly, "students and instructors are not only talking 'through' machines, but also [talking] 'to them', and 'within them'" (Edwards…
An Affordance-Based Framework for Human Computation and Human-Computer Collaboration.
Crouser, R J; Chang, R
2012-12-01
Visual Analytics is "the science of analytical reasoning facilitated by visual interactive interfaces". The goal of this field is to develop tools and methodologies for approaching problems whose size and complexity render them intractable without the close coupling of both human and machine analysis. Researchers have explored this coupling in many venues: VAST, Vis, InfoVis, CHI, KDD, IUI, and more. While there have been myriad promising examples of human-computer collaboration, there exists no common language for comparing systems or describing the benefits afforded by designing for such collaboration. We argue that this area would benefit significantly from consensus about the design attributes that define and distinguish existing techniques. In this work, we have reviewed 1,271 papers from many of the top-ranking conferences in visual analytics, human-computer interaction, and visualization. From these, we have identified 49 papers that are representative of the study of human-computer collaborative problem-solving, and provide a thorough overview of the current state-of-the-art. Our analysis has uncovered key patterns of design hinging on human and machine-intelligence affordances, and also indicates unexplored avenues in the study of this area. The results of this analysis provide a common framework for understanding these seemingly disparate branches of inquiry, which we hope will motivate future work in the field.
Man/Machine Interaction Dynamics And Performance (MMIDAP) capability
NASA Technical Reports Server (NTRS)
Frisch, Harold P.
1991-01-01
The creation of an ability to study interaction dynamics between a machine and its human operator can be approached from a myriad of directions. The Man/Machine Interaction Dynamics and Performance (MMIDAP) project seeks to create an ability to study the consequences of machine design alternatives relative to the performance of both machine and operator. The class of machines to which this study is directed includes those that require the intelligent physical exertions of a human operator. While Goddard's Flight Telerobotic's program was expected to be a major user, basic engineering design and biomedical applications reach far beyond telerobotics. Ongoing efforts are outlined of the GSFC and its University and small business collaborators to integrate both human performance and musculoskeletal data bases with analysis capabilities necessary to enable the study of dynamic actions, reactions, and performance of coupled machine/operator systems.
NASA Astrophysics Data System (ADS)
Hong, Haibo; Yin, Yuehong; Chen, Xing
2016-11-01
Despite the rapid development of computer science and information technology, an efficient human-machine integrated enterprise information system for designing complex mechatronic products has still not been fully accomplished, partly because of inharmonious communication among collaborators. One challenge in human-machine integration is therefore how to establish an appropriate knowledge management (KM) model to support the integration and sharing of heterogeneous product knowledge. To address the diversity of design knowledge, this article proposes an ontology-based model to achieve an unambiguous and normative representation of knowledge. First, an ontology-based human-machine integrated design framework is described, and corresponding ontologies and sub-ontologies are established according to different purposes and scopes. Second, a similarity-calculation-based ontology integration method composed of ontology mapping and ontology merging is introduced. An ontology-searching-based knowledge sharing method is then developed. Finally, a case of human-machine integrated design of a large ultra-precision grinding machine is used to demonstrate the effectiveness of the method.
A voyage to Mars: A challenge to collaboration between man and machines
NASA Technical Reports Server (NTRS)
Statler, Irving C.
1991-01-01
A speech addressing the design of man-machine systems for exploration of space beyond Earth orbit from the human factors perspective is presented. Concerns relative to the design of automated and intelligent systems for the NASA Space Exploration Initiative (SEI) missions are largely based on experiences with integrating humans and comparable systems in aviation. The history, present status, and future prospects of human factors in machine design are discussed in relation to a manned voyage to Mars. Three different cases for design philosophy are presented. The use of simulation is discussed. Recommendations for required research are given.
Leveraging human oversight and intervention in large-scale parallel processing of open-source data
NASA Astrophysics Data System (ADS)
Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.
2015-05-01
The popularity of cloud computing, along with the increased availability of cheap storage, has led to the necessity of elaborating and transforming large volumes of open-source data, all in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like Map-Reduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is carried out down the chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of resources, given their limited availability. To meet these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.
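The first challenge above, propagating a human correction through all downstream reprocessing and nothing else, amounts to a reachability computation over the processing pipeline. A minimal sketch, with hypothetical stage names:

```python
# Hypothetical sketch of the reprocessing problem described above: when a human
# corrects the output of one stage, every stage downstream of it must be marked
# for re-execution, and no others, to keep resource usage minimal.

from collections import deque

def stages_to_reprocess(dag, corrected):
    """dag: {stage: [downstream stages]}; returns the set of stages that must
    be re-run after a human correction at `corrected` (including itself)."""
    dirty, queue = {corrected}, deque([corrected])
    while queue:
        for nxt in dag.get(queue.popleft(), []):
            if nxt not in dirty:          # visit each stage only once
                dirty.add(nxt)
                queue.append(nxt)
    return dirty

# extract -> normalize -> {index, summarize}; correcting "normalize"
# re-runs only its descendants, not "extract".
dag = {"extract": ["normalize"], "normalize": ["index", "summarize"]}
print(sorted(stages_to_reprocess(dag, "normalize")))
# → ['index', 'normalize', 'summarize']
```

Because corrections arrive asynchronously, a real system would run this marking step atomically with respect to in-flight tasks, which is the bookkeeping the paper's approach has to get right.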
Toward Usable Interactive Analytics: Coupling Cognition and Computation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; North, Chris; Chang, Remco
Interactive analytics provide users with a myriad of computational means to aid in extracting meaningful information from large and complex datasets. Much prior work focuses either on advancing the capabilities of machine-centric approaches by the data mining and machine learning communities, or human-driven methods by the visualization and CHI communities. However, these methods do not yet support a true human-machine symbiotic relationship where users and machines work together collaboratively and adapt to each other to advance an interactive analytic process. In this paper we discuss some of the inherent issues, outlining what we believe are the steps toward usable interactive analytics that will ultimately increase the effectiveness for both humans and computers to produce insights.
Collaborative human-machine analysis using a controlled natural language
NASA Astrophysics Data System (ADS)
Mott, David H.; Shemanski, Donald R.; Giammanco, Cheryl; Braines, Dave
2015-05-01
A key aspect of an analyst's task in providing relevant information from data is the reasoning about the implications of that data, in order to build a picture of the real world situation. This requires human cognition, based upon domain knowledge about individuals, events and environmental conditions. For a computer system to collaborate with an analyst, it must be capable of following a similar reasoning process to that of the analyst. We describe ITA Controlled English (CE), a subset of English used to represent an analyst's domain knowledge and reasoning, in a form that is understandable by both analyst and machine. CE can be used to express domain rules, background data, assumptions and inferred conclusions, thus supporting human-machine interaction. A CE reasoning and modeling system can perform inferences from the data and provide the user with conclusions together with their rationale. We present a logical problem called the "Analysis Game", used for training analysts, which presents "analytic pitfalls" inherent in many problems. We explore an iterative approach to its representation in CE, where a person can develop an understanding of the problem solution by incremental construction of relevant concepts and rules. We discuss how such interactions might occur, and propose that such techniques could lead to better collaborative tools to assist the analyst and avoid the "pitfalls".
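The rule-based inference with rationale that this abstract attributes to the CE reasoning system can be illustrated, in spirit only, by a minimal forward-chaining engine over subject-predicate-object facts (this is not CE syntax; the facts and rules are invented):

```python
# Minimal forward-chaining sketch of the kind of rule inference a CE store
# performs (illustrative only; the real ITA CE has its own English-like
# syntax). Facts are (subject, predicate, object) triples; each rule derives a
# new fact from one matched predicate, and we record the rationale for every
# inference so conclusions can be explained to the analyst.

def forward_chain(facts, rules):
    """Apply rules to facts until no new fact is derived.
    rules: list of (premise_predicate, conclusion_fn) pairs."""
    derived, rationale = set(facts), {}
    changed = True
    while changed:
        changed = False
        for prem, conclude in rules:
            for (s, p, o) in list(derived):
                if p == prem:
                    new = conclude(s, o)
                    if new and new not in derived:
                        derived.add(new)
                        rationale[new] = (s, p, o)   # why we believe it
                        changed = True
    return derived, rationale

# Invented rule: if X commands Y, then Y reports-to X.
facts = {("alice", "commands", "unit1")}
rules = [("commands", lambda s, o: (o, "reports-to", s))]
derived, why = forward_chain(facts, rules)
print(("unit1", "reports-to", "alice") in derived)  # → True
```

The `rationale` map is the point of the sketch: presenting a conclusion together with the fact that triggered it mirrors the "conclusions together with their rationale" behavior described above.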
Neo-Symbiosis: The Next Stage in the Evolution of Human Information Interaction.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffith, Douglas; Greitzer, Frank L.
In his 1960 paper "Man-Computer Symbiosis," Licklider predicted that human brains and computing machines would be coupled in a tight partnership that would think as no human brain has ever thought and process data in a way not approached by the information-handling machines we know today. Today we are on the threshold of resurrecting the vision of symbiosis. While Licklider's original vision suggested a co-equal relationship, here we discuss an updated vision, neo-symbiosis, in which the human holds a superordinate position in an intelligent human-computer collaborative environment. This paper was originally published as a journal article and is being published as a chapter in an upcoming book series, Advances in Novel Approaches in Cognitive Informatics and Natural Intelligence.
Human Factors Consideration for the Design of Collaborative Machine Assistants
NASA Astrophysics Data System (ADS)
Park, Sung; Fisk, Arthur D.; Rogers, Wendy A.
Recent improvements in technology have facilitated the use of robots and virtual humans not only in entertainment and engineering but also in the military (Hill et al., 2003), healthcare (Pollack et al., 2002), and education domains (Johnson, Rickel, & Lester, 2000). As active partners of humans, such machine assistants can take the form of a robot or a graphical representation and serve the role of a financial assistant, a health manager, or even a social partner. As a result, interactive technologies are becoming an integral component of people's everyday lives.
NASA Astrophysics Data System (ADS)
Lin, Y.; Zhang, W. J.
2005-02-01
This paper presents an approach to human-machine interface design for control room operators of nuclear power plants. The first step in designing an interface for a particular application is to determine the information content that needs to be displayed. The design methodology for this step is called the interface design framework (hereafter, the framework). Several frameworks have been proposed for applications at varying levels, including process plants. However, none is based on the design and manufacture of the plant system for which the interface is designed. This paper presents an interface design framework which originates from design theory and methodology for general technical systems. Specifically, the framework is based on a set of core concepts of a function-behavior-state model originally proposed by the artificial intelligence research community and widely applied in the design research community. Benefits of this new framework include the provision of a model-based fault diagnosis facility, and the seamless integration of the design (manufacture, maintenance) of plants and the design of human-machine interfaces. The missing linkage between design and operation of a plant was one of the causes of the Three Mile Island nuclear reactor incident. A simulated plant system is presented to explain how to apply this framework in designing an interface. The resulting human-machine interface is discussed; specifically, several fault diagnosis examples are elaborated to demonstrate how this interface could support operators' fault diagnosis in an unanticipated situation.
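The model-based fault diagnosis facility mentioned above can be caricatured as checking the observed plant state against the expected behavior of each function (a toy sketch with invented state variables and thresholds, not the paper's actual function-behavior-state formalism):

```python
# Toy sketch of FBS-style model-based fault diagnosis: each plant function has
# an expected behavior expressed as a predicate over state variables, and a
# function is flagged as faulty when the observed state violates it.
# All function names, variables, and thresholds here are hypothetical.

def diagnose(functions, state):
    """functions: {name: predicate over the state dict}; returns the list of
    functions whose expected behavior the observed state violates."""
    return [name for name, behaves in functions.items() if not behaves(state)]

functions = {
    "maintain coolant flow": lambda s: s["flow_kg_s"] > 50.0,
    "limit core temperature": lambda s: s["core_temp_C"] < 350.0,
}
state = {"flow_kg_s": 12.0, "core_temp_C": 320.0}
print(diagnose(functions, state))  # → ['maintain coolant flow']
```

Tying each displayed alarm back to the violated function, rather than to a raw sensor, is what lets such an interface support diagnosis in unanticipated situations.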
Human-Machine Collaborative Optimization via Apprenticeship Scheduling
2016-09-09
We present Collaborative Optimization via Apprenticeship Scheduling (COVAS), which performs machine learning using human expert demonstration, in conjunction with optimization, to automatically and efficiently produce optimal solutions to challenging real-world scheduling problems. COVAS first learns a policy from human scheduling demonstration via apprenticeship learning, then uses this initial solution to provide a tight bound on the value of the optimal solution, thereby substantially...
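The COVAS idea of using a learned policy's solution to bound exact search can be sketched on a toy single-machine scheduling problem (illustrative only: a greedy heuristic stands in for the learned apprenticeship policy, and the objective, total completion time, is an assumption, not the paper's):

```python
# Hedged sketch of the COVAS idea: a solution from a learned policy supplies
# an incumbent bound, letting exact branch-and-bound prune aggressively.
# Here a greedy heuristic stands in for the learned policy, and the cost of a
# schedule is the sum of job completion times (an illustrative objective).

def total_cost(order, durations):
    """Sum of completion times of jobs scheduled in `order`."""
    t, cost = 0, 0
    for job in order:
        t += durations[job]
        cost += t
    return cost

def branch_and_bound(durations, incumbent_cost):
    """Exact search over job orders, pruning partial schedules that cannot
    beat the incumbent provided by the learned policy."""
    jobs = list(durations)
    best = [incumbent_cost + 1, None]   # +1 admits schedules matching the bound

    def extend(order, t, cost):
        if cost >= best[0]:
            return                       # prune: cannot improve on best so far
        if len(order) == len(jobs):
            best[0], best[1] = cost, order
            return
        for j in jobs:
            if j not in order:
                extend(order + [j], t + durations[j], cost + t + durations[j])

    extend([], 0, 0)
    return best

durations = {"a": 3, "b": 1, "c": 2}
greedy = sorted(durations, key=durations.get)       # stand-in "learned policy"
best_cost, best_order = branch_and_bound(durations, total_cost(greedy, durations))
print(best_cost, best_order)  # → 10 ['b', 'c', 'a']
```

The tighter the learned policy's solution, the smaller the search tree, which is the "tight bound on the value of the optimal solution" the abstract describes.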
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dudenhoeffer, Donald D.; Hallbert, Bruce P.
Instrumentation, Controls, and Human-Machine Interface (ICHMI) technologies are essential to ensuring delivery and effective operation of optimized advanced Generation IV (Gen IV) nuclear energy systems. In 1996, the Watts Bar I nuclear power plant in Tennessee was the last U.S. nuclear power plant to go on line. It was, in fact, built based on pre-1990 technology. Since this last U.S. nuclear power plant was designed, there have been major advances in the field of ICHMI systems. Computer technology employed in other industries has advanced dramatically, and computing systems are now replaced every few years as they become functionally obsolete. Functional obsolescence occurs when newer, more functional technology replaces or supersedes an existing technology, even though the existing technology may well be in working order. Although ICHMI architectures comprise much of the same technology, they have not been updated nearly as often in the nuclear power industry. For example, some newer Personal Digital Assistants (PDAs) or handheld computers may, in fact, have more functionality than the 1996 computer control system at the Watts Bar I plant. This illustrates the need to transition and upgrade current nuclear power plant ICHMI technologies.
Efficient cost-sensitive human-machine collaboration for offline signature verification
NASA Astrophysics Data System (ADS)
Coetzer, Johannes; Swanepoel, Jacques; Sabourin, Robert
2012-01-01
We propose a novel strategy for the optimal combination of human and machine decisions in a cost-sensitive environment. The proposed algorithm should be especially beneficial to financial institutions where off-line signatures, each associated with a specific transaction value, require authentication. When presented with a collection of genuine and fraudulent training signatures, produced by so-called guinea-pig writers, the proficiency of a workforce of human employees and a score-generating machine can be estimated and represented in receiver operating characteristic (ROC) space. Using a set of Boolean fusion functions, the majority-vote decision of the human workforce is combined with each threshold-specific machine-generated decision. The performance of the candidate ensembles is estimated and represented in ROC space, after which only the optimal ensembles and associated decision trees are retained. When presented with a questioned signature linked to an arbitrary writer, the system first uses the ROC-based cost gradient associated with the transaction value to select the ensemble that minimises the expected cost, and then uses the corresponding decision tree to authenticate the signature in question. We show that, when utilising the entire human workforce, the incorporation of a machine streamlines the authentication process and decreases the expected cost for all operating conditions.
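The cost-gradient selection step described here can be sketched as picking the ROC operating point with minimum expected cost, where a missed forgery costs the transaction value (all numbers and rates below are hypothetical, not from the paper):

```python
# Illustrative sketch of cost-sensitive operating-point selection on an ROC
# curve, in the spirit of the strategy above. Here "positive" means forgery:
# fpr is the rate of rejecting genuine signatures, tpr the rate of catching
# forgeries, and a missed forgery costs the full transaction value.

def expected_cost(fpr, tpr, p_fraud, c_fp, c_fn):
    """Expected cost of one decision at an ROC operating point."""
    fnr = 1.0 - tpr
    return c_fp * fpr * (1.0 - p_fraud) + c_fn * fnr * p_fraud

def best_operating_point(roc, p_fraud, c_fp, transaction_value):
    """roc: list of (fpr, tpr) points; returns the point minimising the
    expected cost when a missed forgery costs `transaction_value`."""
    return min(roc, key=lambda pt: expected_cost(pt[0], pt[1], p_fraud,
                                                 c_fp, transaction_value))

roc = [(0.0, 0.0), (0.1, 0.7), (0.3, 0.9), (1.0, 1.0)]
# For a large transaction, the optimum shifts toward catching more forgeries,
# even at the price of rejecting more genuine signatures.
print(best_operating_point(roc, p_fraud=0.02, c_fp=50.0,
                           transaction_value=10_000))  # → (0.3, 0.9)
```

Sliding the transaction value through this calculation is exactly a cost gradient over ROC space: small transactions favor lenient operating points, large ones favor strict ones.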
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-03
International Business Machines Corporation (IBM), ITD Business Unit, Division 7, E-mail and Collaboration Group...
Movement Characteristics Analysis and Dynamic Simulation of Collaborative Measuring Robot
NASA Astrophysics Data System (ADS)
Ma, Guoqing; Liu, Li; Yu, Zhenglin; Cao, Guohua; Zheng, Yanbin
2017-03-01
Human-machine collaboration is becoming increasingly necessary, and so collaborative robot applications are in high demand. We selected a UR10 robot as our research subject for this study. First, we applied the D-H coordinate transformation to establish the robot's link system, and we then used inverse transformation to solve the robot's inverse kinematics for all the joints. We used the Lagrange method to analyze the UR robot's dynamics and the ADAMS multibody dynamics simulation software to run dynamic simulations, verifying the correctness of the derived dynamic models.
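The D-H link transforms mentioned in this abstract chain into the forward kinematics by matrix multiplication; a pure-Python sketch with made-up parameters (not the actual UR10 D-H table):

```python
# Hedged sketch of the classic Denavit-Hartenberg (D-H) link transform used in
# the kinematic analysis above. Parameter values are hypothetical.

import math

def dh_matrix(theta, d, a, alpha):
    """Classic D-H homogeneous transform from link i-1 to link i:
    Rot_z(theta) * Trans_z(d) * Trans_x(a) * Rot_x(alpha)."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def matmul(A, B):
    """4x4 matrix product, used to chain link transforms into forward kinematics."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Chaining two links: the end position is the last column of the product.
T = matmul(dh_matrix(math.pi / 2, 0.1, 0.4, 0.0),
           dh_matrix(0.0, 0.0, 0.3, 0.0))
print([round(T[i][3], 3) for i in range(3)])  # → [0.0, 0.7, 0.1]
```

Inverse kinematics, as in the abstract, runs this machinery backwards: given a target end-effector pose, solve for the joint angles `theta` that reproduce it.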
A Concept for Optimizing Behavioural Effectiveness & Efficiency
NASA Astrophysics Data System (ADS)
Barca, Jan Carlo; Rumantir, Grace; Li, Raymond
Both humans and machines exhibit strengths and weaknesses that can be enhanced by merging the two entities. This research aims to provide a broader understanding of how closer interactions between these two entities can facilitate more optimal goal-directed performance through the use of artificial extensions of the human body. Such extensions may assist us in adapting to and manipulating our environments in a more effective way than any system known today. To demonstrate this concept, we have developed a simulation where a semi-interactive virtual spider can be navigated through an environment consisting of several obstacles and a virtual predator capable of killing the spider. The virtual spider can be navigated through the use of three different control systems that can be used to assist in optimising overall goal-directed performance. The first two control systems use an onscreen button interface and a touch sensor, respectively, to facilitate human navigation of the spider. The third is an autonomous navigation system that uses machine intelligence embedded in the spider. This system enables the spider to navigate and react to changes in its local environment. The results of this study indicate that machines should be allowed to override human control in order to maximise the benefits of collaboration between man and machine. This research further indicates that the development of strong machine intelligence, sensor systems that engage all human senses, extra-sensory input systems, physical remote manipulators, multiple intelligent extensions of the human body, as well as a tighter symbiosis between man and machine, can support an upgrade of the human form.
Collaborative mining and interpretation of large-scale data for biomedical research insights.
Tsiliki, Georgia; Karacapilidis, Nikos; Christodoulou, Spyros; Tzagarakis, Manolis
2014-01-01
Biomedical research becomes increasingly interdisciplinary and collaborative in nature. Researchers need to efficiently and effectively collaborate and make decisions by meaningfully assembling, mining and analyzing available large-scale volumes of complex multi-faceted data residing in different sources. In line with related research directives revealing that, in spite of the recent advances in data mining and computational analysis, humans can easily detect patterns which computer algorithms may have difficulty in finding, this paper reports on the practical use of an innovative web-based collaboration support platform in a biomedical research context. Arguing that dealing with data-intensive and cognitively complex settings is not a technical problem alone, the proposed platform adopts a hybrid approach that builds on the synergy between machine and human intelligence to facilitate the underlying sense-making and decision making processes. User experience shows that the platform enables more informed and quicker decisions, by displaying the aggregated information according to their needs, while also exploiting the associated human intelligence.
Social Intelligence in a Human-Machine Collaboration System
NASA Astrophysics Data System (ADS)
Nakajima, Hiroshi; Morishima, Yasunori; Yamada, Ryota; Brave, Scott; Maldonado, Heidy; Nass, Clifford; Kawaji, Shigeyasu
In today's information society, it is often argued that a new way of human-machine interaction is needed. In this paper, an agent with social response capabilities has been developed to achieve this goal. Two kinds of information are exchanged between two entities: objective and functional information (e.g., facts, requests, states of matters, etc.) and subjective information (e.g., feelings, sense of relationship, etc.). Traditional interactive systems have been designed to handle the former kind of information. In contrast, this study presents social agents that handle the latter type. The current study focuses on the sociality of the agent from the viewpoint of Media Equation theory. This article discusses the definition, importance, and benefits of social intelligence as agent technology and argues that social intelligence has the potential to enhance the user's perception of the system, which in turn can lead to improvements in the system's performance. In order to implement social intelligence in the agent, a mind model has been developed to render affective expressions and the personality of the agent. The mind model has been implemented in a human-machine collaborative learning system. One differentiating feature of the collaborative learning system is that it has an agent that performs as a co-learner with which the user interacts during the learning session. The mind model controls the social behaviors of the agent, thus making it possible for the user to have more social interactions with the agent. The experiment with the system suggested that a greater degree of learning was achieved when the students worked with the co-learner agent, and that the co-learner agent with the mind model that expressed emotions resulted in a more positive attitude toward the system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Riensche, Roderick M.; Haack, Jereme N.
“Gamification”, the application of gameplay to real-world problems, enables the development of human computation systems that support decision-making through the integration of social and machine intelligence. One of gamification’s major benefits includes the creation of a problem solving environment where the influence of cognitive and cultural biases on human judgment can be curtailed through collaborative and competitive reasoning. By reducing biases on human judgment, gamification allows human computation systems to exploit human creativity relatively unhindered by human error. Operationally, gamification uses simulation to harvest human behavioral data that provide valuable insights for the solution of real-world problems.
[The current state of the brain-computer interface problem].
Shurkhay, V A; Aleksandrova, E V; Potapov, A A; Goryainov, S A
2015-01-01
It was only 40 years ago that the first PC appeared. Over this period, rather short in historical terms, we have witnessed revolutionary changes in the lives of individuals and of the entire society. Computer technologies are tightly connected with every field, either directly or indirectly. We can currently claim that computers are many times superior to the human mind in terms of a number of parameters; however, machines lack the key feature: they are incapable of independent thinking, as a human is. Yet the key to the successful development of humankind is collaboration between the brain and the computer rather than competition. Such collaboration, in which a computer broadens, supplements, or replaces some brain functions, is known as the brain-computer interface. Our review focuses on real-life implementations of this collaboration.
Gray, John
2017-01-01
Machine-to-machine (M2M) communication is a key enabling technology for industrial internet of things (IIoT)-empowered industrial networks, where machines communicate with one another for collaborative automation and intelligent optimisation. This new industrial computing paradigm features high-quality connectivity, ubiquitous messaging, and interoperable interactions between machines. However, manufacturing IIoT applications have specificities that distinguish them from many other internet of things (IoT) scenarios in machine communications. By highlighting the key requirements and the major technical gaps of M2M in industrial applications, this article describes a collaboration-oriented M2M (CoM2M) messaging mechanism focusing on flexible connectivity and discovery, ubiquitous messaging, and semantic interoperability, which are well suited to the production line-scale interoperability of manufacturing applications. The designs toward machine collaboration and data interoperability at both the communication and semantic levels are presented. The application scenarios of the presented methods are then illustrated with a proof-of-concept implementation in the PicknPack food packaging line. Finally, the advantages and some potential issues are discussed based on the PicknPack practice. PMID:29165347
Tacchella, Andrea; Romano, Silvia; Ferraldeschi, Michela; Salvetti, Marco; Zaccaria, Andrea; Crisanti, Andrea; Grassi, Francesca
2017-01-01
Background: Multiple sclerosis has an extremely variable natural course. In most patients, disease starts with a relapsing-remitting (RR) phase, which proceeds to a secondary progressive (SP) form. The duration of the RR phase is hard to predict, and to date predictions on the rate of disease progression remain suboptimal. This limits the opportunity to tailor therapy to an individual patient's prognosis, in spite of the availability of several therapeutic options. Approaches to improve clinical decisions, such as collective intelligence of human groups and machine learning algorithms, are widely investigated. Methods: Medical students and a machine learning algorithm predicted the course of disease on the basis of randomly chosen clinical records of patients that attended the Multiple Sclerosis service of Sant'Andrea hospital in Rome. Results: A significant improvement of predictive ability was obtained when predictions were combined with a weight that depends on the consistency of human (or algorithm) forecasts on a given clinical record. Conclusions: In this work we present proof-of-principle that human-machine hybrid predictions yield better prognoses than machine learning algorithms or groups of humans alone. To strengthen this preliminary result, we propose a crowdsourcing initiative to collect prognoses by physicians on an expanded set of patients.
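The consistency-weighted fusion described in the Results can be sketched in a few lines. This is a minimal illustration, not the authors' published method: the function names, the majority-vote human baseline, and the confidence-based machine weight are all assumptions.

```python
# Hypothetical sketch: combine a group of binary human forecasts with a
# machine probability, weighting each source by how self-consistent
# (confident) it is on the given clinical record.

def consistency(votes):
    """Fraction of binary votes agreeing with the majority (0.5..1.0)."""
    ones = sum(votes)
    return max(ones, len(votes) - ones) / len(votes)

def fuse(human_votes, machine_prob):
    """Weighted vote between the human majority and the machine label."""
    h_pred = 1.0 if 2 * sum(human_votes) >= len(human_votes) else 0.0
    h_weight = consistency(human_votes)
    m_pred = 1.0 if machine_prob >= 0.5 else 0.0
    m_weight = abs(machine_prob - 0.5) * 2  # machine confidence in [0, 1]
    score = (h_weight * h_pred + m_weight * m_pred) / (h_weight + m_weight)
    return 1 if score >= 0.5 else 0

# Four of five students predict progression; the machine mildly disagrees.
print(fuse([1, 1, 1, 1, 0], 0.40))  # → 1
```

A confident source can outweigh an inconsistent one, which is the essence of weighting each forecast by its consistency on the record at hand.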
NASA Astrophysics Data System (ADS)
Mattsson, Thomas R.; Townsend, Joshua P.; Shulenburger, Luke; Seagle, Christopher T.; Furnish, Michael D.; Fei, Yingwei
2017-06-01
For the past seven years, the Z Fundamental Science program has fostered collaboration between scientists at the national laboratories and academic research groups to utilize the Z-machine to explore properties of matter in extreme conditions. A recent example of this involves a collaboration between the Carnegie Institution of Washington and Sandia to determine the properties of warm dense MgSiO3 by performing shock experiments using the Z-machine. To reach the higher densities desired, bridgmanite samples are being fabricated at Carnegie using multi-anvil presses. We will describe the preparations under way for these experiments, including pre-shot ab initio calculations of the Hugoniot and the deployment of dual-layer flyer plates that allow for the measurement of sound velocities along the Hugoniot. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Software Should be Written by Writers.
ERIC Educational Resources Information Center
Sheridan, James
1983-01-01
Considering the computer as a collaborator rather than a machine, the author encourages those in the humanities and the arts to take advantage of the great potential that artificial intelligence can offer. Stresses that unless deliberately restricted, the computer is an inherently interdisciplinary medium, capable of interacting with any…
Proceedings of the international meeting on thermal nuclear reactor safety. Vol. 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Separate abstracts are included for each of the papers presented, concerning: current issues in nuclear power plant safety; national programs in nuclear power plant safety; radiological source terms; probabilistic risk assessment methods and techniques; non-LOCA and small-break LOCA transients; safety goals; pressurized thermal shock; applications of reliability and risk methods to probabilistic risk assessment; human factors and the man-machine interface; and databases and special applications.
NASA Technical Reports Server (NTRS)
Schutte, Paul C.; Goodrich, Kenneth H.; Cox, David E.; Jackson, Bruce; Palmer, Michael T.; Pope, Alan T.; Schlecht, Robin W.; Tedjojuwono, Ken K.; Trujillo, Anna C.; Williams, Ralph A.;
2007-01-01
This paper reviews current and emerging operational experiences, technologies, and human-machine interaction theories to develop an integrated flight system concept designed to increase the safety, reliability, and performance of single-pilot operations in an increasingly accommodating but stringent national airspace system. This concept, known as the Naturalistic Flight Deck (NFD), uses a form of human-centered automation known as complementary-automation (or complemation) to structure the relationship between the human operator and the aircraft as independent, collaborative agents having complementary capabilities. The human provides commonsense knowledge, general intelligence, and creative thinking, while the machine contributes specialized intelligence and control, extreme vigilance, resistance to fatigue, and encyclopedic memory. To support the development of the NFD, an initial Concept of Operations has been created, and selected normal and non-normal scenarios are presented in this document.
NASA Astrophysics Data System (ADS)
Preece, Alun; Webberley, Will; Braines, Dave
2015-05-01
Recent advances in natural language question-answering systems and context-aware mobile apps create opportunities for improved sensemaking in a tactical setting. Users equipped with mobile devices act as both sensors (able to acquire information) and effectors (able to act in situ), operating alone or in collectives. The currently-dominant technical approaches follow either a pull model (e.g. Apple's Siri or IBM's Watson which respond to users' natural language queries) or a push model (e.g. Google's Now which sends notifications to a user based on their context). There is growing recognition that users need more flexible styles of conversational interaction, where they are able to freely ask or tell, be asked or told, seek explanations and clarifications. Ideally such conversations should involve a mix of human and machine agents, able to collaborate in collective sensemaking activities with as few barriers as possible. Desirable capabilities include adding new knowledge, collaboratively building models, invoking specific services, and drawing inferences. As a step towards this goal, we collect evidence from a number of recent pilot studies including natural experiments (e.g. situation awareness in the context of organised protests) and synthetic experiments (e.g. human and machine agents collaborating in information seeking and spot reporting). We identify some principles and areas of future research for "conversational sensemaking".
Designing Computer Agents With Facial Personality To Improve Human-Machine Collaboration
2006-05-25
Francis Galton is credited with recognizing the fundamental lexical hypothesis, which states that you can identify "the more conspicuous aspects of the...available to describe more important traits. Galton (1884) also surmised that although there are a thousand subtly unique words used to describe character...inconsistent. A hallmark of intelligence, what potentially separates human beings from earlier life forms, is the ability to think about future consequences
Tokamak foundation in USSR/Russia 1950-1990
NASA Astrophysics Data System (ADS)
Smirnov, V. P.
2010-01-01
In the USSR, nuclear fusion research began in 1950 with the work of I.E. Tamm, A.D. Sakharov and colleagues. They formulated the principles of magnetic confinement of high temperature plasmas that would allow the development of a thermonuclear reactor. Following this, experimental research on plasma initiation and heating in toroidal systems began in 1951 at the Kurchatov Institute. From the very first devices with vessels made of glass, porcelain or metal with insulating inserts, work progressed to the operation of the first tokamak, T-1, in 1958. More machines followed, and the first international collaboration in nuclear fusion, on the T-3 tokamak, established the tokamak as a promising option for magnetic confinement. Experiments continued and specialized machines were developed to test separately improvements to the tokamak concept needed for the production of energy. At the same time, research into plasma physics and tokamak theory was being undertaken which provides the basis for modern theoretical work. Since then, the tokamak concept has been refined by a world-wide effort and today we look forward to the successful operation of ITER.
NASA Astrophysics Data System (ADS)
Pekedis, Mahmut; Mascerañas, David; Turan, Gursoy; Ercan, Emre; Farrar, Charles R.; Yildiz, Hasan
2015-08-01
For the last two decades, developments in damage detection algorithms have greatly increased the potential for autonomous decisions about structural health. However, we are still struggling to build autonomous tools that can match the ability of a human to detect and localize the extent of damage in structures. Therefore, there is growing interest in merging computational and cognitive concepts to improve structural health monitoring (SHM). The main objective of this research is to apply a human-machine cooperative approach to detect damage in a tower structure. The cooperative approach uses haptic tools to create an appropriate collaboration between SHM sensor networks, statistical compression techniques, and humans. Damage is simulated in the structure by releasing some of the bolt loads. Accelerometers are bonded to various locations on the tower members to acquire the dynamic response of the structure. The accelerometer signals are encoded in three different ways to present them as haptic stimuli to the human subjects. The participants are then exposed to each of these stimuli to detect the loosened-bolt damage in the tower. Results obtained from the human-machine cooperation demonstrate that the human subjects were able to recognize the damage with an accuracy of 88 ± 20.21% and a response time of 5.87 ± 2.33 s. It is concluded that the human-machine cooperative SHM developed here may provide a useful framework for interacting with abstract entities such as data from a sensor network.
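One simple way to encode an accelerometer channel as a haptic stimulus is to map its RMS deviation from a healthy-structure baseline onto vibration amplitude. The sketch below is an illustrative assumption; the three encodings actually used in the study are not reproduced here, and the names and scaling are invented.

```python
import math

def rms(signal):
    """Root-mean-square of a sampled accelerometer channel."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

def haptic_amplitude(signal, baseline_rms, max_amp=1.0):
    """Vibration amplitude proportional to how far the response deviates
    from the healthy baseline, clipped to [0, max_amp]."""
    ratio = rms(signal) / baseline_rms
    return max(0.0, min(max_amp, ratio - 1.0))

# A response at twice the baseline RMS saturates the actuator.
print(haptic_amplitude([0.2, -0.2, 0.2, -0.2], baseline_rms=0.1))  # → 1.0
```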
Collaborative Robots and Knowledge Management - A Short Review
NASA Astrophysics Data System (ADS)
Mușat, Flaviu-Constantin; Mihu, Florin-Constantin
2017-12-01
Because customer requirements regarding quality, quantity, and delivery times at the lowest possible cost keep rising, industry has had to develop automated solutions to meet them. Starting from the automated lines developed by Ford and Toyota, we now have automated and self-sustaining production lines, made possible today by collaborative robots. By using knowledge management systems, we can improve the future development of this area of research. This paper shows the benefits of the smart use of robots performing manipulation activities, which improves workplace ergonomics and human-machine interaction by assisting with parallel tasks and lowering the physical effort required of humans.
Sahaï, Aïsha; Pacherie, Elisabeth; Grynszpan, Ouriel; Berberian, Bruno
2017-01-01
Nowadays, interactions with others do not only involve human peers but also automated systems. Many studies suggest that the motor predictive systems engaged during action execution are also involved during joint actions with peers and during the observation of actions generated by other humans. Indeed, the comparator model hypothesis suggests that the comparison between a predicted state and an estimated real state enables motor control and, by a similar functioning, the understanding and anticipation of observed actions. Such a mechanism allows making predictions about an ongoing action and is essential to action regulation, especially during joint actions with peers. Interestingly, the same comparison process has been shown to be involved in the construction of an individual's sense of agency, both for self-generated actions and for observed actions generated by other humans. However, the involvement of such predictive mechanisms during interactions with machines is not consensual, probably due to the high heterogeneity of the automata used in the experiments, from very simplistic devices to full humanoid robots. The discrepancies that are observed during human/machine interactions could arise from the absence of action/observation matching abilities when interacting with traditional low-level automata. Consistently, the difficulty of building a joint sense of agency with this kind of machine could stem from the same problem. In this context, we aim to review the studies investigating predictive mechanisms during social interactions with humans and with automated artificial systems. We will start by presenting human data that show the involvement of predictions in action control and in the sense of agency during social interactions. Thereafter, we will confront this literature with data from the robotics field. Finally, we will address the upcoming issues in the field of robotics related to automated systems aimed at acting as collaborative agents. PMID:29081744
Mobile robotics application in the nuclear industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, S.L.; White, J.R.
1995-03-01
Mobile robots have been developed to perform hazardous operations in place of human workers. Applications include nuclear plant inspection/maintenance, decontamination and decommissioning, police/military explosive ordnance disposal (EOD), hostage/terrorist negotiations, and fire fighting. Nuclear facilities have proven that robotic applications can be cost-effective solutions to reducing personnel exposure and plant downtime. The first applications of mobile robots in the nuclear industry began in the early 1980s, with the first vehicles being one-of-a-kind machines or adaptations of commercial EOD robots. These activities included efforts by numerous commercial companies, the U.S. Nuclear Regulatory Commission, EPRI, and several national laboratories. Some of these efforts were driven by the recovery and cleanup activities at TMI, which demonstrated the potential and need for a remote means of performing surveillance and maintenance tasks in nuclear plants. The use of these machines is now becoming commonplace in nuclear facilities throughout the world. Hardware maturity and the confidence of users have progressed to the point where the application of mobile robots is no longer considered a novelty. These machines are being used in applications where the result is to help achieve more aggressive goals for personnel radiation exposure and plant availability, perform tasks more efficiently, and allow plant operators to retrieve information from areas previously considered inaccessible. Typical examples include surveillance in high radiation areas (during operation and outage activities), radiation surveys, waste handling, and decontamination evolutions. This paper will discuss this evolution, including specific application experiences, examples of currently available technology, and the benefits derived from the use of mobile robotic vehicles in commercial nuclear power facilities.
NASA Astrophysics Data System (ADS)
Davenport, Jack H.
2016-05-01
Intelligence analysts demand rapid information fusion capabilities to develop and maintain accurate situational awareness and understanding of dynamic enemy threats in asymmetric military operations. The ability to extract relationships between people, groups, and locations from a variety of text datasets is critical to proactive decision making. The derived network of entities must be automatically created and presented to analysts to assist in decision making. DECISIVE ANALYTICS Corporation (DAC) provides capabilities to automatically extract entities, relationships between entities, semantic concepts about entities, and network models of entities from text and multi-source datasets. DAC's Natural Language Processing (NLP) Entity Analytics model entities as complex systems of attributes and interrelationships which are extracted from unstructured text via NLP algorithms. The extracted entities are automatically disambiguated via machine learning algorithms, and resolution recommendations are presented to the analyst for validation; the analyst's expertise is leveraged in this hybrid human/computer collaborative model. Military capability is enhanced by these NLP Entity Analytics because analysts can now create/update an entity profile with intelligence automatically extracted from unstructured text, thereby fusing entity knowledge from structured and unstructured data sources. Operational and sustainment costs are reduced since analysts do not have to manually tag and resolve entities.
Machining Test Specimens from Harvested Zion RPV Segments for Through Wall Attenuation Studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosseel, Thomas M; Sokolov, Mikhail A; Nanstad, Randy K
2015-01-01
The decommissioning of the Zion Units 1 and 2 Nuclear Generating Station (NGS) in Zion, Illinois presents a special opportunity for developing a better understanding of materials degradation and other issues associated with extending the lifetime of existing Nuclear Power Plants (NPPs) beyond 60 years of service. In support of extended service and current operations of the US nuclear reactor fleet, the Oak Ridge National Laboratory (ORNL), through the Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) Program, is coordinating and contracting with Zion Solutions, LLC, a subsidiary of Energy Solutions, the selective procurement of materials, structures, and components from the decommissioned reactors. In this paper, we will discuss the acquisition of segments of the Zion Unit 2 Reactor Pressure Vessel (RPV), the cutting of these segments into sections and blocks from the beltline and upper vertical welds and plate material, the current status of machining those blocks into mechanical (Charpy, compact tension, and tensile) test specimens and coupons for chemical and microstructural (TEM, APT, SANS, and nanoindentation) characterization, as well as the current test plans and possible collaborative projects. Access to service-irradiated RPV welds and plate sections will allow through-wall attenuation studies to be performed, which will be used to assess current radiation damage models (Rosseel et al. (2012) and Rosseel et al. (2015)).
Face recognition accuracy of forensic examiners, superrecognizers, and face recognition algorithms.
Phillips, P Jonathon; Yates, Amy N; Hu, Ying; Hahn, Carina A; Noyes, Eilidh; Jackson, Kelsey; Cavazos, Jacqueline G; Jeckeln, Géraldine; Ranjan, Rajeev; Sankaranarayanan, Swami; Chen, Jun-Cheng; Castillo, Carlos D; Chellappa, Rama; White, David; O'Toole, Alice J
2018-06-12
Achieving the upper limits of face identification accuracy in forensic applications can minimize errors that have profound social and personal consequences. Although forensic examiners identify faces in these applications, systematic tests of their accuracy are rare. How can we achieve the most accurate face identification: using people and/or machines working alone or in collaboration? In a comprehensive comparison of face identification by humans and computers, we found that forensic facial examiners, facial reviewers, and superrecognizers were more accurate than fingerprint examiners and students on a challenging face identification test. Individual performance on the test varied widely. On the same test, four deep convolutional neural networks (DCNNs), developed between 2015 and 2017, identified faces within the range of human accuracy. Accuracy of the algorithms increased steadily over time, with the most recent DCNN scoring above the median of the forensic facial examiners. Using crowd-sourcing methods, we fused the judgments of multiple forensic facial examiners by averaging their rating-based identity judgments. Accuracy was substantially better for fused judgments than for individuals working alone. Fusion also served to stabilize performance, boosting the scores of lower-performing individuals and decreasing variability. Single forensic facial examiners fused with the best algorithm were more accurate than the combination of two examiners. Therefore, collaboration among humans and between humans and machines offers tangible benefits to face identification accuracy in important applications. These results offer an evidence-based roadmap for achieving the most accurate face identification possible. Copyright © 2018 the Author(s). Published by PNAS.
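The fusion procedure reported above, averaging examiners' rating-based identity judgments, can be sketched in a few lines. The rating scale and the zero decision threshold below are illustrative assumptions, not the study's exact protocol.

```python
def fuse_ratings(ratings):
    """Average identity ratings from several judges, e.g. on a -3..+3
    scale where positive values mean 'same person'."""
    return sum(ratings) / len(ratings)

def same_person(fused_rating, threshold=0.0):
    """Binary identity decision from the fused rating."""
    return fused_rating > threshold

# Three judges in mild disagreement; fusion pulls the outlier toward
# the group consensus.
fused = fuse_ratings([2.0, -1.0, 3.0])
print(round(fused, 2), same_person(fused))  # → 1.33 True
```

Averaging both improves accuracy and stabilizes performance, mirroring the variance reduction the study reports for fused judgments.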
Health Management Technology as a General Solution Framework
NASA Astrophysics Data System (ADS)
Nakajima, Hiroshi; Hasegawa, Yoshifumi; Tasaki, Hiroshi; Iwami, Taro; Tsuchiya, Naoki
Health maintenance and improvement of humans, artifacts, and nature are pressing requirements given the problems human beings face. In this article, a health management technology centered on cause-effect structure is proposed. An important aspect of the technology is that it evolves through human-machine collaboration in response to changes in the target systems. One reason the technology centers on cause-effect structure is that such structure is intuitively transparent to humans. The notion has spread over wide application areas such as quality control, energy management, and healthcare. Experiments reported in the article demonstrate the effectiveness of the technology.
Balmer, Andrew S; Bulpin, Kate J
2013-01-01
In this article, we evaluate a novel method for post-ELSI (ethical, legal and social implications) collaboration, drawing on ‘human practices' (HP) to develop a form of reflexive ethical equipment that we termed ‘sociotechnical circuits'. We draw on a case study of working collaboratively in the International Genetically Engineered Machine Competition (iGEM) and relate this to the parts-based agenda of synthetic biology. We use qualitative methods to explore the experience of undergraduate students in the Competition, focussing on the 2010 University of Sheffield team. We examine how teams work collaboratively across disciplines to produce novel microorganisms. The Competition involves a HP component and we examine the way in which this has been narrowly defined within the ELSI framework. We argue that this is a much impoverished style of HP when compared with its original articulation as the development of ‘ethical equipment'. Inspired by this more theoretically rich HP framework, we explore the relations established between team members and how these were shaped by the norms, materials and practices of the Competition. We highlight the importance of care in the context of post-ELSI collaborations and report on the implications of our case study for such efforts and for the relation of the social sciences to the life sciences more generally. PMID:24159360
NASA Astrophysics Data System (ADS)
Prudic, K.; Toshack, M.; Hickson, B.; Hutchinson, R.
2017-12-01
Far too many species lack proven means of assessment or effective conservation across the globe. We must gain better insights into the biological, environmental, and behavioral influences on the health and wealth of biodiversity to make a difference for various species and habitats as the environment changes due to human activities. Pollinator biodiversity information necessary for conservation is difficult to collect at a local level, let alone across a continent. In particular, what are pollinators doing in more remote locations across elevational clines, and how is climate change affecting them? Here we showcase a citizen-science project that takes advantage of the human ability to catch and photograph butterflies and their nectar plants, coupled with machine learning to identify species, phenology shifts, and diversity hotspots. We use this combined approach of human-computer collaboration to represent patterns of pollinator and nectar plant occurrences and diversity across broad spatial and temporal scales. We also improve data quality by taking advantage of the synergies between human computation and mechanical computation. We call this a human-machine learning network, whose core is an active-learning feedback loop between humans and computers. We explore how this approach can leverage the contributions of human observers and process their contributed data with artificial intelligence algorithms, leading to computational power that far exceeds the sum of the individual parts and providing important data products and visualizations for pollinator conservation research across a continent.
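The core of such an active-learning feedback loop is uncertainty sampling: the classifier routes its least-confident photographs to human identifiers, and their labels feed the next round of training. The toy sketch below runs under assumed names and made-up data; it is not the project's actual pipeline.

```python
def most_uncertain(probs):
    """Index of the model prediction closest to 0.5 (least confident)."""
    return min(range(len(probs)), key=lambda i: abs(probs[i] - 0.5))

def active_learning_round(unlabeled_probs, ask_human):
    """Route the single least-confident image to a human identifier;
    the returned (index, label) pair would extend the training set."""
    i = most_uncertain(unlabeled_probs)
    return i, ask_human(i)

# Model confidences for four butterfly photos; a volunteer labels the
# one the model is least sure about.
probs = [0.95, 0.52, 0.10, 0.80]
idx, label = active_learning_round(probs, ask_human=lambda i: "Vanessa cardui")
print(idx, label)  # → 1 Vanessa cardui
```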
Smart Homes for All: Collaborating Services in a for-All Architecture for Domotics
NASA Astrophysics Data System (ADS)
Catarci, Tiziana; Cincotti, Febo; de Leoni, Massimiliano; Mecella, Massimo; Santucci, Giuseppe
Nowadays, control equipment such as automobiles, home appliances, and communication, control, and office machines offers its functionality in the form of services. Such service pervasivity is particularly evident in immersive realities, i.e., scenarios in which invisible embedded systems need to continuously interact with human users in order to provide continuously sensed information and to react to service requests from the users themselves. The sm4all project, which will be presented in this paper, is investigating an innovative middleware platform for collaborating smart embedded services in immersive and person-centric environments, through the use of composability and semantic techniques.
Liu, Yu-Ting; Pal, Nikhil R; Marathe, Amar R; Wang, Yu-Kai; Lin, Chin-Teng
2017-01-01
A brain-computer interface (BCI) creates a direct communication pathway between the human brain and an external device or system. In contrast to patient-oriented BCIs, which are intended to restore inoperative or malfunctioning aspects of the nervous system, a growing number of BCI studies focus on designing auxiliary systems that are intended for everyday use. The goal of building these BCIs is to provide capabilities that augment existing intact physical and mental capabilities. However, a key challenge to BCI research is human variability; factors such as fatigue, inattention, and stress vary both across different individuals and for the same individual over time. If these issues are addressed, autonomous systems may provide additional benefits that enhance system performance and prevent problems introduced by individual human variability. This study proposes a human-machine autonomous (HMA) system that simultaneously aggregates human and machine knowledge to recognize targets in a rapid serial visual presentation (RSVP) task. The HMA focuses on integrating an RSVP BCI with computer vision techniques in an image-labeling domain. A fuzzy decision-making fuser (FDMF) is then applied in the HMA system to provide a natural adaptive framework for evidence-based inference by incorporating an integrated summary of the available evidence (i.e., human and machine decisions) and associated uncertainty. Consequently, the HMA system dynamically aggregates decisions involving uncertainties from both human and autonomous agents. The collaborative decisions made by an HMA system can achieve and maintain superior performance more efficiently than either the human or autonomous agents can achieve independently. The experimental results shown in this study suggest that the proposed HMA system with the FDMF can effectively fuse decisions from human brain activities and the computer vision techniques to improve overall performance on the RSVP recognition task. 
This conclusion demonstrates the potential benefits of integrating autonomous systems with BCI systems.
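The evidence-weighting idea behind a decision fuser of this kind can be sketched in a few lines. This is an illustrative scheme only, assuming each agent reports a target probability together with an uncertainty in [0, 1]; the paper's FDMF uses a fuzzy formulation whose details are not reproduced here.

```python
def fuse_decisions(p_human, u_human, p_machine, u_machine):
    """Combine two target-probability estimates, discounting each source
    by its reported uncertainty (0 = fully certain, 1 = no information).
    Mirrors the spirit of uncertainty-aware fusion, not the FDMF itself."""
    w_h = 1.0 - u_human
    w_m = 1.0 - u_machine
    if w_h + w_m == 0:
        return 0.5  # no usable evidence from either agent
    return (w_h * p_human + w_m * p_machine) / (w_h + w_m)
```

When one agent degrades, e.g. a fatigued operator reporting high uncertainty, its vote is automatically down-weighted and the fused decision leans on the other agent, which is the behavior the HMA system exploits.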
PMID: 28676734
Radiochemistry Research and Training, UC Davis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sutcliffe, Julie
2012-08-01
The report contains a summary of the accomplishments made under the R2@UCDavis proposal. In brief, we proposed to develop new and highly innovative radiotracer methods and to enhance training opportunities to ensure the future availability of human resources for the highly specialized fields of radiotracer development chemistry, clinical nuclear medicine research, and allied disciplines. The overall scientific objective of this proposal was to utilize "click" chemistry to facilitate fast and site-specific radiolabeling. Progress was made on all initial goals presented. This funding has to date resulted in publications in high-impact journals such as Acta Biomaterialia, Molecular Imaging and Biology, Nuclear Medicine and Biology, and most recently Environmental Science & Technology, and it is anticipated that, through the collaborations established during the course of this funding, future research will be published in clinically relevant journals such as Science Translational Medicine and the Journal of Nuclear Medicine. Trainees involved in this proposal have gone on to further their careers in academia, industry, and the private sector. The collaborative forums established during the course of this funding will ensure the future availability of human resources for these highly specialized fields.
A Review of Extra-Terrestrial Mining Robot Concepts
NASA Technical Reports Server (NTRS)
Mueller, Robert P.; Van Susante, Paul J.
2011-01-01
Outer space contains a vast amount of resources that offer virtually unlimited wealth to the humans that can access and use them for commercial purposes. One of the key technologies for harvesting these resources is robotic mining of regolith, minerals, ices and metals. The harsh environment and vast distances create challenges that are handled best by robotic machines working in collaboration with human explorers. Humans will benefit from the resources that will be mined by robots. They will visit outposts and mining camps as required for exploration, commerce and scientific research, but a continuous presence is most likely to be provided by robotic mining machines that are remotely controlled by humans. There have been a variety of extra-terrestrial robotic mining concepts proposed over the last 100 years and this paper will attempt to summarize and review concepts in the public domain (government, industry and academia) to serve as an informational resource for future mining robot developers and operators. The challenges associated with these concepts will be discussed and feasibility will be assessed. Future needs associated with commercial efforts will also be investigated.
A Review of Extra-Terrestrial Mining Concepts
NASA Technical Reports Server (NTRS)
Mueller, R. P.; van Susante, P. J.
2012-01-01
Outer space contains a vast amount of resources that offer virtually unlimited wealth to the humans that can access and use them for commercial purposes. One of the key technologies for harvesting these resources is robotic mining of regolith, minerals, ices and metals. The harsh environment and vast distances create challenges that are handled best by robotic machines working in collaboration with human explorers. Humans will benefit from the resources that will be mined by robots. They will visit outposts and mining camps as required for exploration, commerce and scientific research, but a continuous presence is most likely to be provided by robotic mining machines that are remotely controlled by humans. There have been a variety of extra-terrestrial robotic mining concepts proposed over the last 40 years and this paper will attempt to summarize and review concepts in the public domain (government, industry and academia) to serve as an informational resource for future mining robot developers and operators. The challenges associated with these concepts will be discussed and feasibility will be assessed. Future needs associated with commercial efforts will also be investigated.
NASA Astrophysics Data System (ADS)
Kimura, Toshiaki; Kasai, Fumio; Kamio, Yoichi; Kanda, Yuichi
This paper discusses a manufacturing support system that supports not only maintenance services but also consulting services for manufacturing systems consisting of multi-vendor machine tools. To do this, the system enables inter-enterprise collaboration between engineering companies and machine tool vendors. The system is called the "After-Sales Support Inter-enterprise collaboration System using information Technologies" (ASSIST). This paper describes the concept behind ASSIST, the development of a prototype of the system, and the results of its test operation.
A collaborative framework for Distributed Privacy-Preserving Support Vector Machine learning.
Que, Jialan; Jiang, Xiaoqian; Ohno-Machado, Lucila
2012-01-01
A Support Vector Machine (SVM) is a popular tool for decision support. The traditional way to build an SVM model is to estimate parameters based on a centralized repository of data. However, in the field of biomedicine, patient data are sometimes stored in local repositories or institutions where they were collected, and may not be easily shared due to privacy concerns. This creates a substantial barrier for researchers to effectively learn from the distributed data using machine learning tools like SVMs. To overcome this difficulty and promote efficient information exchange without sharing sensitive raw data, we developed a Distributed Privacy Preserving Support Vector Machine (DPP-SVM). The DPP-SVM enables privacy-preserving collaborative learning, in which a trusted server integrates "privacy-insensitive" intermediary results. The globally learned model is guaranteed to be exactly the same as learned from combined data. We also provide a free web-service (http://privacy.ucsd.edu:8080/ppsvm/) for multiple participants to collaborate and complete the SVM-learning task in an efficient and privacy-preserving manner.
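The "share summaries, not records" pattern underlying such distributed learning can be sketched with a toy federated linear SVM, in which each site sends only a local sub-gradient of the hinge loss to the trusted server. The actual DPP-SVM protocol exchanges different intermediary quantities; this sketch illustrates only the privacy pattern, with all names and parameters chosen for the example.

```python
def local_subgradient(w, b, X, y, C=1.0):
    """Sub-gradient of C * sum(hinge loss) over one site's private data.
    Only this summary leaves the site; the raw records (X, y) never do."""
    gw = [0.0] * len(w)
    gb = 0.0
    for xi, yi in zip(X, y):
        margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
        if margin < 1:  # point violates the margin
            for j, xj in enumerate(xi):
                gw[j] -= C * yi * xj
            gb -= C * yi
    return gw, gb

def federated_svm(sites, dim, rounds=200, lr=0.01, lam=0.01):
    """Server loop: aggregate sub-gradients from all sites each round,
    apply the L2 regularizer centrally, and update the shared model."""
    w, b = [0.0] * dim, 0.0
    for _ in range(rounds):
        gw_tot = [lam * wj for wj in w]  # regularizer gradient
        gb_tot = 0.0
        for X, y in sites:
            gw, gb = local_subgradient(w, b, X, y)
            for j in range(dim):
                gw_tot[j] += gw[j]
            gb_tot += gb
        w = [wj - lr * gj for wj, gj in zip(w, gw_tot)]
        b -= lr * gb_tot
    return w, b
```

Because only aggregated gradients cross institutional boundaries, each repository keeps its patient-level data local while all sites still converge on one shared model.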
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harold S. Blackman; Ronald Boring; Julie L. Marble
This panel will discuss what new directions are necessary to maximize the usefulness of human reliability analysis (HRA) techniques across different areas of application. HRA has long been a part of probabilistic risk assessment in the nuclear industry, where it offers a superior standard for risk-based decision-making. The techniques are now being adopted by other industries, including oil and gas, cybersecurity, and aviation. Each participant will present his or her ideas concerning industry needs, followed by a discussion of what research is needed and of the necessity of cross-industry collaboration.
Finding Waldo: Learning about Users from their Interactions.
Brown, Eli T; Ottley, Alvitta; Zhao, Helen; Quan Lin; Souvenir, Richard; Endert, Alex; Chang, Remco
2014-12-01
Visual analytics is inherently a collaboration between human and computer. However, in current visual analytics systems, the computer has limited means of knowing about its users and their analysis processes. While existing research has shown that a user's interactions with a system reflect a large amount of the user's reasoning process, there has been limited advancement in developing automated, real-time techniques that mine interactions to learn about the user. In this paper, we demonstrate that we can accurately predict a user's task performance and infer some user personality traits by using machine learning techniques to analyze interaction data. Specifically, we conduct an experiment in which participants perform a visual search task, and apply well-known machine learning algorithms to three encodings of the users' interaction data. We achieve, depending on algorithm and encoding, between 62% and 83% accuracy at predicting whether each user will be fast or slow at completing the task. Beyond predicting performance, we demonstrate that using the same techniques, we can infer aspects of the user's personality factors, including locus of control, extraversion, and neuroticism. Further analyses show that strong results can be attained with limited observation time: in one case 95% of the final accuracy is gained after a quarter of the average task completion time. Overall, our findings show that interactions can provide information to the computer about its human collaborator, and establish a foundation for realizing mixed-initiative visual analytics systems.
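A minimal version of this idea, classifying a user as fast or slow from interaction-derived features, can be sketched with a nearest-centroid classifier. The feature encoding here (clicks per minute, mean pause length in seconds) and the classifier choice are illustrative assumptions; the study itself applies several well-known algorithms to richer encodings of the interaction logs.

```python
def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(fast_users, slow_users):
    """Fit one centroid per class from interaction-feature vectors."""
    return centroid(fast_users), centroid(slow_users)

def predict(model, features):
    """Classify a user as 'fast' or 'slow' by the nearer class centroid."""
    fast_c, slow_c = model
    d_fast = sum((a - b) ** 2 for a, b in zip(features, fast_c))
    d_slow = sum((a - b) ** 2 for a, b in zip(features, slow_c))
    return "fast" if d_fast <= d_slow else "slow"
```

The point of the sketch is that nothing beyond passively logged interactions is needed: the system learns about its collaborator from behavior alone, in real time.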
Transfer of control system interface solutions from other domains to the thermal power industry.
Bligård, L-O; Andersson, J; Osvalder, A-L
2012-01-01
In a thermal power plant, the operators' role is to control and monitor the process to achieve efficient and safe production. Human-machine interfaces play a central part in achieving this, and they need to be updated and upgraded together with the technical functionality to maintain optimal operation. One way of achieving relevant updates is to study other domains and see how they have solved similar issues in their design solutions. The purpose of this paper is to present how interface design ideas can be transferred from domains with operator control to thermal power plants. In the study, 15 domains were compared using a model for categorisation of human-machine systems. The comparison showed that nuclear power, refinery, and ship engine control were most similar to thermal power control. From the findings, a basic interface structure and three specific display solutions were proposed for thermal power control: process parameter overview, plant overview, and feed water view. The systematic comparison of the properties of a human-machine system allowed interface designers to find suitable objects, structures, and navigation logics in a range of domains that could be transferred to the thermal power domain.
Areas for US-India civilian nuclear cooperation to prevent/mitigate radiological events.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balachandran, Gopalan; Forden, Geoffrey Ethan
2013-01-01
Over the decades, India and the United States have had very little formal collaboration on nuclear issues. This was partly because neither country needed collaboration to make progress in the nuclear field, but it was also due, in part, to the concerns each country had about the other's intentions. Now that the U.S.-India deal on nuclear collaboration has been signed and the Hyde Act passed in the United States, it is possible to recognize that both countries can benefit from such nuclear collaboration, especially if it starts with issues important to both countries that do not touch on strategic systems. Fortunately, there are many noncontroversial areas for collaboration. This study, funded by the U.S. State Department, has identified a number of areas in the prevention of and response to radiological incidents where such collaboration could take place.
Department of Defense In-House RDT&E Activities. Management Analysis Report
1987-10-30
Coleman, C Norman; Hrdina, Chad; Bader, Judith L; Norwood, Ann; Hayhurst, Robert; Forsha, Joseph; Yeskey, Kevin; Knebel, Ann
2009-02-01
The end of the Cold War led to a reduced concern for a major nuclear event. However, the current threats from terrorism make a radiologic (dispersal or use of radioactive material) or nuclear (improvised nuclear device) event a possibility. The specter and enormousness of the catastrophe resulting from a state-sponsored nuclear attack and a sense of nihilism about the effectiveness of a response were such that there had been limited civilian medical response planning. Although the consequences of a radiologic dispersal device are substantial, and the detonation of a modest-sized (10 kiloton) improvised nuclear device is catastrophic, it is both possible and imperative that a medical response be planned. To meet this need, the Office of the Assistant Secretary for Preparedness and Response in the Department of Health and Human Services, in collaboration within government and with nongovernment partners, has developed a scientifically based comprehensive planning framework and Web-based "just-in-time" medical response information called Radiation Event Medical Management (available at http://www.remm.nlm.gov). The response plan includes (1) underpinnings from basic radiation biology, (2) tailored medical responses, (3) delivery of medical countermeasures for postevent mitigation and treatment, (4) referral to expert centers for acute treatment, and (5) long-term follow-up. Although continuing to evolve and increase in scope and capacity, current response planning is sufficiently mature that planners and responders should be aware of the basic premises, tools, and resources available. An effective response will require coordination, communication, and cooperation at an unprecedented level. The logic behind and components of this response are presented to allow for active collaboration among emergency planners and responders and federal, state, local, and tribal governments.
A Collaborative 20 Questions Model for Target Search with Human-Machine Interaction
2013-05-01
The QuEST for multi-sensor big data ISR situation understanding
NASA Astrophysics Data System (ADS)
Rogers, Steven; Culbertson, Jared; Oxley, Mark; Clouse, H. Scott; Abayowa, Bernard; Patrick, James; Blasch, Erik; Trumpfheller, John
2016-05-01
This paper addresses the challenges of providing war fighters with the best possible actionable information from diverse sensing modalities using advances in big data and machine learning. We start by presenting the intelligence, surveillance, and reconnaissance (ISR) big-data challenges associated with the Third Offset Strategy. Current approaches to big data are shown to be limited with respect to reasoning and understanding. We present a discussion of what meaning-making and understanding require, and posit that a new approach, Qualia Exploitation of Sensor Technology (QuEST), will be needed for human-machine collaborative solutions to meet the strategy's requirements. The requirements for developing a QuEST theory of knowledge are discussed, and finally an engineering approach for achieving situation understanding is presented.
Giving Back: Collaborations with Others in Ecological Studies on the Nevada National Security Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott A. Wade; Kathryn S. Knapp; Cathy A. Wills
2013-02-24
Formerly named the Nevada Test Site, the Nevada National Security Site (NNSS) was the historical site for nuclear weapons testing from the 1950s to the early 1990s. The site was renamed in 2010 to reflect the diversity of nuclear, energy, and homeland security activities now conducted there. Biological and ecological programs and research have been conducted on the site for decades to address the impacts of radiation and to take advantage of the relatively undisturbed and isolated lands for gathering basic information on the occurrence and distribution of native plants and animals. Currently, the Office of the Assistant Manager for Environmental Management of the U.S. Department of Energy (DOE), National Nuclear Security Administration Nevada Site Office (NNSA/NSO), oversees the radiological biota monitoring and ecological compliance programs on the NNSS. The top priority of these programs is compliance with federal and state regulations. They focus on performing radiological dose assessments for the public who reside near the NNSS and for populations of plants and animals on the NNSS, and on protecting important species and habitat from direct impacts of mission activities. The NNSS serves as an invaluable outdoor laboratory. The geographic and ecological diversity of the site offers researchers many opportunities to study human influences on ecosystems. NNSA/NSO has pursued collaborations with outside agencies and organizations to conduct programs and studies that enhance radiological biota monitoring and ecosystem preservation when budgets are restrictive, as well as to provide valuable scientific information to the human health and natural resource communities at large. NNSA/NSO is using one current collaborative study to better assess the potential dose to the off-site public from the ingestion of game animals, the most realistic pathway for off-site public exposure at this time from radionuclide contamination on the NNSS.
A second collaborative study is furthering desert tortoise conservation measures on site. It is the goal of NNSA/NSO to continue to develop such collaborations, sharing resources such as personnel, equipment, expertise, and NNSS land access with outside entities to meet mutually beneficial goals cost-effectively.
The secondary supernova machine: Gravitational compression, stored Coulomb energy, and SNII displays
NASA Astrophysics Data System (ADS)
Clayton, Donald D.; Meyer, Bradley S.
2016-04-01
Radioactive power for several delayed optical displays of core-collapse supernovae is commonly described as having been provided by decays of 56Ni nuclei. This review analyses the provenance of that energy more deeply: the form in which that energy is stored; what mechanical work causes its storage; what conservation laws demand that it be stored; and why its release is fortuitously delayed for about 10^6 s into a greatly expanded supernova envelope. We call the unifying picture of those energy transfers the secondary supernova machine owing to its machine-like properties; namely, mechanical work forces storage of large increases of nuclear Coulomb energy, a positive energy component within new nuclei synthesized by the secondary machine. That positive-energy increase occurs despite the fusion decreasing the negative total energy within nuclei. The excess Coulomb energy can later be radiated, accounting for the intense radioactivity in supernovae. Detailed familiarity with this machine is the focus of this review. The stored positive-energy component created by the machine will not be reduced until roughly 10^6 s later by radioactive emissions (EC and β+) owing to the slowness of weak decays. The delayed energy provided by the secondary supernova machine is a few × 10^49 erg, much smaller than the one percent of the 10^53 erg collapse that causes the prompt ejection of matter; however, that relatively small stored energy is vital for activation of the late displays. The conceptual basis of the secondary supernova machine provides a new framework for understanding the energy source for late SNII displays. We demonstrate the nuclear dynamics with nuclear network abundance calculations, with a model of sudden compression and reexpansion of the nuclear gas, and with nuclear energy decompositions of a nuclear-mass law. These tools identify excess Coulomb energy, a positive-energy component of the total negative nuclear energy, as the late activation energy.
If the value of the fundamental charge e were smaller, SNII would not be so profoundly radioactive. Excess Coulomb energy is carried within nuclei radially for roughly 10^9 km before being radiated into greatly expanded supernova remnants. The Coulomb force thus claims heretofore unacknowledged significance for supernova physics.
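The scale of the stored Coulomb energy can be illustrated with the Coulomb term of the semi-empirical (liquid-drop) mass formula, E_C = a_C Z(Z-1)/A^(1/3). The coefficient a_C ≈ 0.711 MeV is a standard textbook value, and the comparison of 56Ni with its decay product 56Fe is a rough back-of-envelope estimate for orientation, not a calculation taken from the review.

```python
def coulomb_term(Z, A, a_c=0.711):
    """Coulomb term of the semi-empirical (liquid-drop) mass formula, in MeV.
    a_c ~ 0.711 MeV is a standard textbook coefficient."""
    return a_c * Z * (Z - 1) / A ** (1.0 / 3.0)

# Excess Coulomb energy of 56Ni (Z=28) relative to its eventual decay
# product 56Fe (Z=26): a rough measure, per nucleus, of the stored
# positive-energy component released through EC/beta+ decays.
excess = coulomb_term(28, 56) - coulomb_term(26, 56)  # roughly 20 MeV
```

Per nucleus the difference is of order tens of MeV, consistent with weak decays of 56Ni → 56Co → 56Fe liberating several MeV per decay step long after the explosion.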
Computational approaches for predicting biomedical research collaborations.
Zhang, Qing; Yu, Hong
2014-01-01
Biomedical research is increasingly collaborative, and successful collaborations often produce high-impact work. Computational approaches can be developed for automatically predicting biomedical research collaborations. Previous work on collaboration prediction mainly explored the topological structure of research collaboration networks, leaving out the rich semantic information in the publications themselves. In this paper, we propose supervised machine learning approaches to predict research collaborations in the biomedical field. We explored both semantic features extracted from author research-interest profiles and author network topological features. We found that the most informative semantic features for author collaborations are related to research interest, including similarity of out-citing citations and similarity of abstracts. Of the four supervised machine learning models (naïve Bayes, naïve Bayes multinomial, SVMs, and logistic regression), the best-performing model is logistic regression, with an area under the ROC curve ranging from 0.766 to 0.980 on different datasets. To our knowledge, we are the first to study in depth how research interest and productivity can be used for collaboration prediction. Our approach is computationally efficient, scalable, and simple to implement. The datasets of this study are available at https://github.com/qingzhanggithub/medline-collaboration-datasets.
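The two feature families described above, semantic similarity from author research-interest text and topological closeness in the coauthorship network, can be sketched as follows. The tokenization, the cosine measure over raw word counts, and the common-neighbor count are illustrative simplifications of the study's features; a classifier such as logistic regression would then be trained on the resulting vectors.

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity of two bag-of-words term vectors: one semantic
    feature of the kind derived from author research-interest text."""
    va, vb = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

def common_neighbors(graph, a, b):
    """Topological feature: number of shared coauthors in the network."""
    return len(set(graph.get(a, ())) & set(graph.get(b, ())))

def feature_vector(graph, abstracts, a, b):
    """Assemble (semantic, topological) features for one author pair."""
    return [cosine_similarity(abstracts[a], abstracts[b]),
            float(common_neighbors(graph, a, b))]
```

Each candidate author pair thus becomes one row of a training matrix, with the label indicating whether the pair later coauthored a paper.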
Physics-based and human-derived information fusion for analysts
NASA Astrophysics Data System (ADS)
Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael
2017-05-01
Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, the big-data opportunities bring a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort is to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.
NASA Technical Reports Server (NTRS)
Ambur, Manjula; Schwartz, Katherine G.; Mavris, Dimitri N.
2016-01-01
The fields of machine learning and big data analytics have made significant advances in recent years, creating an environment where cross-fertilization of methods and collaborations can achieve previously unattainable outcomes. The Comprehensive Digital Transformation (CDT) Machine Learning and Big Data Analytics team held a workshop at NASA Langley in August 2016 to unite leading experts in machine learning with NASA scientists and engineers. The primary goals of the workshop were to assess the state of the art in the field, introduce these leading experts to the aerospace and science subject matter experts, and develop opportunities for collaboration. The workshop was held over a three-day period, with lectures from 15 leading experts followed by substantial interactive discussion. This report provides an overview of the 15 invited lectures and a summary of the key topics that arose during both formal and informal discussion sessions. Four key workshop themes identified after the close of the workshop are also highlighted in the report. In addition, several attendees provided feedback on how they are already using machine learning algorithms to advance their research, new methods they learned about during the workshop, and collaboration opportunities they identified there.
Comprehensive proteomic analysis of the human spliceosome
NASA Astrophysics Data System (ADS)
Zhou, Zhaolan; Licklider, Lawrence J.; Gygi, Steven P.; Reed, Robin
2002-09-01
The precise excision of introns from pre-messenger RNA is performed by the spliceosome, a macromolecular machine containing five small nuclear RNAs and numerous proteins. Much has been learned about the protein components of the spliceosome from analysis of individual purified small nuclear ribonucleoproteins and salt-stable spliceosome `core' particles. However, the complete set of proteins that constitutes intact functional spliceosomes has yet to be identified. Here we use maltose-binding protein affinity chromatography to isolate spliceosomes in highly purified and functional form. Using nanoscale microcapillary liquid chromatography tandem mass spectrometry, we identify ~145 distinct spliceosomal proteins, making the spliceosome the most complex cellular machine so far characterized. Our spliceosomes comprise all previously known splicing factors and 58 newly identified components. The spliceosome contains at least 30 proteins with known or putative roles in gene expression steps other than splicing. This complexity may be required not only for splicing multi-intronic metazoan pre-messenger RNAs, but also for mediating the extensive coupling between splicing and other steps in gene expression.
NASA Technical Reports Server (NTRS)
Burdett, Gerald L. (Editor); Soffen, Gerald A. (Editor)
1987-01-01
Papers are presented on the Space Station, materials processing in space, the status of space remote sensing, the evolution of space infrastructure, and the NASA Teacher Program. Topics discussed include visionary technologies, the effect of intelligent machines on space operations, future information technology, and the role of nuclear power in future space missions. Consideration is given to the role of humans in space exploration; medical problems associated with long-duration space flights; lunar and Martian settlements, and Biosphere II (the closed ecology project).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roger Lew; Ronald L. Boring; Thomas A. Ulrich
Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs, but the set of tools for developing and designing HMIs is still in its infancy. Here we propose that Microsoft Windows Presentation Foundation (WPF) is well suited for many roles in the research and development of HMIs for process control.
CESAR robotics and intelligent systems research for nuclear environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, R.C.
1992-07-01
The Center for Engineering Systems Advanced Research (CESAR) at the Oak Ridge National Laboratory (ORNL) encompasses expertise and facilities to perform basic and applied research in robotics and intelligent systems in order to address a broad spectrum of problems related to nuclear and other environments. For nuclear environments, research focus is derived from applications in advanced nuclear power stations, and in environmental restoration and waste management. Several programs at CESAR emphasize the cross-cutting technology issues, and are executed in appropriate cooperation with projects that address specific problem areas. Although the main thrust of the CESAR long-term research is on developing highly automated systems that can cooperate and function reliably in complex environments, the development of advanced human-machine interfaces represents a significant part of our research. 11 refs.
Study About Ceiling Design for Main Control Room of NPP with HFE
NASA Astrophysics Data System (ADS)
Gu, Pengfei; Ni, Ying; Chen, Weihua; Chen, Bo; Zhang, Jianbo; Liang, Huihui
Since human factors engineering (HFE) was introduced into control room design for nuclear power plants (NPPs), the human-machine interface (HMI) has developed steadily, especially through the use of digital technology. Compared with the analog technology formerly used for HMIs, human-machine interaction has been considerably enhanced. HFE and main control room (MCR) design engineering for an NPP is a multidisciplinary effort, mainly involving electrical and instrumentation control, reactor engineering, machinery, systems engineering, and management disciplines. However, the MCR is not only equipped with the HMIs provided by the equipment; more importantly, it provides the operators with a working environment, including the main control room ceiling. Ceiling design related to HFE influences staff performance and should therefore also consider environmental and aesthetic factors, especially by introducing professional design experience and evaluation methods. Based on implementation experience from the Ling Ao phase II and Hongyanhe projects, this study analyzes the lighting effect, space partitioning, and visual load of the MCR ceiling in an NPP. Combined with the requirements of the relevant standards, the advantages and disadvantages of MCR ceiling designs are discussed, and a ceiling design solution for the MCR is proposed that considers the requirements of light weight, noise reduction, fire prevention, and moisture protection.
Finding Waldo: Learning about Users from their Interactions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Eli T.; Ottley, Alvitta; Zhao, Helen
Visual analytics is inherently a collaboration between human and computer. However, in current visual analytics systems, the computer has limited means of knowing about its users and their analysis processes. While existing research has shown that a user's interactions with a system reflect a large amount of the user's reasoning process, there has been limited advancement in developing automated, real-time techniques that mine interactions to learn about the user. In this paper, we demonstrate that we can accurately predict a user's task performance and infer some user personality traits by using machine learning techniques to analyze interaction data. Specifically, we conduct an experiment in which participants perform a visual search task, and we apply well-known machine learning algorithms to three encodings of the users' interaction data. We achieve, depending on algorithm and encoding, between 62% and 96% accuracy at predicting whether each user will be fast or slow at completing the task. Beyond predicting performance, we demonstrate that using the same techniques we can infer aspects of the user's personality, including locus of control, extraversion, and neuroticism. Further analyses show that strong results can be attained with limited observation time; in some cases, 82% of the final accuracy is gained after a quarter of the average task completion time. Overall, our findings show that interactions can provide information to the computer about its human collaborator, and establish a foundation for realizing mixed-initiative visual analytics systems.
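The paper's approach can be sketched in miniature: encode each user's interaction log as a numeric feature vector and train a classifier to separate fast from slow task completers. The feature names, data, and the nearest-centroid classifier below are illustrative assumptions of this sketch, not the authors' actual encodings or algorithms.

```python
from collections import defaultdict

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

class NearestCentroid:
    """Toy classifier: label a new user by the closest class centroid."""

    def fit(self, X, y):
        groups = defaultdict(list)
        for x, label in zip(X, y):
            groups[label].append(x)
        self.centroids = {label: centroid(vs) for label, vs in groups.items()}
        return self

    def predict(self, x):
        def dist(a, b):
            return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        return min(self.centroids, key=lambda lbl: dist(self.centroids[lbl], x))

# Hypothetical per-user features: [clicks/min, hovers/min, pan events/min]
X = [[40, 120, 9], [35, 110, 8], [12, 300, 2], [15, 280, 3]]
y = ["fast", "fast", "slow", "slow"]

clf = NearestCentroid().fit(X, y)
print(clf.predict([38, 115, 10]))  # → fast
```

The same pipeline extends naturally to richer encodings (sequences, timings) and stronger learners, which is where the paper's 62-96% accuracy range comes from.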
Accelerating Cancer Systems Biology Research through Semantic Web Technology
Wang, Zhihui; Sagotsky, Jonathan; Taylor, Thomas; Shironoshita, Patrick; Deisboeck, Thomas S.
2012-01-01
Cancer systems biology is an interdisciplinary, rapidly expanding research field in which collaborations are a critical means to advance the field. Yet the prevalent database technologies often isolate data rather than making it easily accessible. The Semantic Web has the potential to help facilitate web-based collaborative cancer research by presenting data in a manner that is self-descriptive, human and machine readable, and easily sharable. We have created a semantically linked online Digital Model Repository (DMR) for storing, managing, executing, annotating, and sharing computational cancer models. Within the DMR, distributed, multidisciplinary, and inter-organizational teams can collaborate on projects, without forfeiting intellectual property. This is achieved by the introduction of a new stakeholder to the collaboration workflow, the institutional licensing officer, part of the Technology Transfer Office. Furthermore, the DMR has achieved silver level compatibility with the National Cancer Institute’s caBIG®, so users can not only interact with the DMR through a web browser but also through a semantically annotated and secure web service. We also discuss the technology behind the DMR leveraging the Semantic Web, ontologies, and grid computing to provide secure inter-institutional collaboration on cancer modeling projects, online grid-based execution of shared models, and the collaboration workflow protecting researchers’ intellectual property. PMID:23188758
Interaction with Machine Improvisation
NASA Astrophysics Data System (ADS)
Assayag, Gerard; Bloch, George; Cont, Arshia; Dubnov, Shlomo
We describe two multi-agent architectures for improvisation-oriented musician-machine interaction systems that learn in real time from human performers. The improvisation kernel is based on sequence modeling and statistical learning. We present two frameworks of interaction with this kernel. In the first, the stylistic interaction is guided by a human operator in front of an interactive computer environment. In the second, the stylistic interaction is delegated to machine intelligence, so knowledge propagation and decisions are handled by the computer alone. The first framework involves a hybrid architecture using two popular composition/performance environments, Max and OpenMusic, which are put to work and communicate together, each one handling the process at a different time/memory scale. The second framework shares the same representational schemes with the first but uses an Active Learning architecture based on collaborative, competitive and memory-based learning to handle stylistic interactions. Both systems are capable of processing real-time audio/video as well as MIDI. After discussing the general cognitive background of improvisation practices, the statistical modeling tools and the concurrent agent architecture are presented. Then, an Active Learning scheme is described and considered in terms of using different improvisation regimes for improvisation planning. Finally, we provide more details about the different system implementations and describe several performances with the system.
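The sequence-modeling kernel described above can be caricatured with a simple context-continuation model: record which symbols follow each context in a performer's stream, then improvise by sampling the learned continuations. This is a minimal sketch under stated assumptions, not the authors' implementation; the pitch symbols are invented for illustration.

```python
import random
from collections import defaultdict

def learn(sequence, order=1):
    """Map each length-`order` context to the symbols observed after it."""
    model = defaultdict(list)
    for i in range(len(sequence) - order):
        model[tuple(sequence[i:i + order])].append(sequence[i + order])
    return model

def improvise(model, seed, length, order=1, rng=random.Random(0)):
    """Generate a continuation by sampling learned context-to-symbol moves."""
    out = list(seed)
    for _ in range(length):
        context = tuple(out[-order:])
        choices = model.get(context)
        if not choices:  # unseen context: restart from a random learned one
            context = rng.choice(list(model))
            out.extend(context)
            choices = model[context]
        out.append(rng.choice(choices))
    return out

performance = list("CDECDEGFE")  # invented pitch symbols from a "performer"
model = learn(performance)
print("".join(improvise(model, "C", 8)))
```

Every emitted symbol is a continuation actually heard in the training stream, which is the sense in which such a kernel "improvises in the style of" the performer.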
WTEC panel report on European nuclear instrumentation and controls
NASA Technical Reports Server (NTRS)
White, James D.; Lanning, David D.; Beltracchi, Leo; Best, Fred R.; Easter, James R.; Oakes, Lester C.; Sudduth, A. L.
1991-01-01
Control and instrumentation systems might be called the 'brain' and 'senses' of a nuclear power plant. As such they become the key elements in the integrated operation of these plants. Recent developments in digital equipment have allowed a dramatic change in the design of these instrument and control (I&C) systems. New designs are evolving with cathode ray tube (CRT)-based control rooms, more automation, and better logical information for the human operators. As these new advanced systems are developed, various decisions must be made about the degree of automation and the human-to-machine interface. Different stages of the development of control automation and of advanced digital systems can be found in various countries. The purpose of this technology assessment is to make a comparative evaluation of the control and instrumentation systems that are being used for commercial nuclear power plants in Europe and the United States. This study is limited to pressurized water reactors (PWR's). Part of the evaluation includes comparisons with a previous similar study assessing Japanese technology.
NASA Astrophysics Data System (ADS)
Meyer, B. S.
2018-04-01
The author and collaborators are developing nucleusHUB.org, built with HUBzero technology, to facilitate interaction among astronomers, nuclear astrophysicists, and planetary scientists. The site allows users to collaborate and publish online tools.
A machine learning approach for viral genome classification.
Remita, Mohamed Amine; Halioui, Ahmed; Malick Diouara, Abou Abdallah; Daigle, Bruno; Kiani, Golrokh; Diallo, Abdoulaye Baniré
2017-04-11
Advances in cloning and sequencing technology are yielding a massive number of viral genomes. The classification and annotation of these genomes constitute important assets in the discovery of genomic variability, taxonomic characteristics and disease mechanisms. Existing classification methods are often designed for a specific, well-studied family of viruses. Thus, viral comparative genomic studies could benefit from more generic, fast and accurate tools for classifying and typing newly sequenced strains of diverse virus families. Here, we introduce a virus classification platform, CASTOR, based on machine learning methods. CASTOR is inspired by a well-known technique in molecular biology: restriction fragment length polymorphism (RFLP). It simulates, in silico, the restriction digestion of genomic material by different enzymes into fragments. It uses two metrics to construct feature vectors for machine learning algorithms in the classification step. We benchmark CASTOR for the classification of distinct datasets of human papillomaviruses (HPV), hepatitis B viruses (HBV) and human immunodeficiency viruses type 1 (HIV-1). Results reveal true positive rates of 99%, 99% and 98% for HPV Alpha species, HBV genotyping and HIV-1 M subtyping, respectively. Furthermore, CASTOR shows competitive performance compared to well-known HIV-1-specific classifiers (REGA and COMET) on whole genomes and pol fragments. The performance of CASTOR, together with its genericity and robustness, could permit novel and accurate large-scale virus studies. The CASTOR web platform provides open-access, collaborative and reproducible machine learning classifiers. CASTOR can be accessed at http://castor.bioinfo.uqam.ca.
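CASTOR's core idea, in silico restriction digestion followed by feature-vector construction, can be sketched as follows. The enzyme recognition sites (EcoRI's GAATTC, HindIII's AAGCTT) are real, but the toy genome, the cut-position convention, and the (fragment count, mean fragment length) features are simplifying assumptions of this sketch rather than CASTOR's actual two metrics.

```python
def digest(genome, site):
    """Cut the genome at each occurrence of the recognition site
    (simplified: the cut is placed at the start of the site)."""
    fragments, start = [], 0
    pos = genome.find(site)
    while pos != -1:
        fragments.append(genome[start:pos])
        start = pos
        pos = genome.find(site, pos + 1)
    fragments.append(genome[start:])
    return [f for f in fragments if f]

def feature_vector(genome, enzymes):
    """One (fragment count, mean fragment length) pair per enzyme."""
    vec = []
    for site in enzymes:
        frags = digest(genome, site)
        vec += [len(frags), sum(map(len, frags)) / len(frags)]
    return vec

enzymes = ["GAATTC", "AAGCTT"]          # EcoRI and HindIII recognition sites
genome = "TTGAATTCGGGAAGCTTCCGAATTCAA"  # toy genome for illustration
print(feature_vector(genome, enzymes))  # → [3, 9.0, 2, 13.5]
```

Feature vectors built this way from many genomes can then feed any standard supervised classifier for typing new strains.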
Ching, Joan M; Williams, Barbara L; Idemoto, Lori M; Blackmore, C Craig
2014-08-01
Virginia Mason Medical Center (Seattle) employed the Lean concept of Jidoka (automation with a human touch) to plan for and deploy bar code medication administration (BCMA) to hospitalized patients. Integrating BCMA technology into the nursing workflow with minimal disruption was accomplished using three steps of Jidoka: (1) assigning work to humans and machines on the basis of their differing abilities, (2) adapting machines to the human workflow, and (3) monitoring the human-machine interaction. The effectiveness of BCMA in both reinforcing safe administration practices and reducing medication errors was measured using the Collaborative Alliance for Nursing Outcomes (CALNOC) Medication Administration Accuracy Quality Study methodology. Trained nurses observed a total of 16,149 medication doses for 3,617 patients over a three-year period. Following BCMA implementation, the number of safe-practice violations decreased from 54.8 violations/100 doses (January 2010-September 2011) to 29.0 violations/100 doses (October 2011-December 2012), an absolute risk reduction of 25.8 violations/100 doses (95% confidence interval [CI]: 23.7, 27.9; p < .001). The number of medication errors decreased from 5.9 errors/100 doses at baseline to 3.0 errors/100 doses after BCMA implementation (absolute risk reduction: 2.9 errors/100 doses [95% CI: 2.2, 3.6; p < .001]). The number of unsafe administration practices (estimate: -5.481; standard error: 1.133; p < .001; 95% CI: -7.702, -3.260) also decreased. As more hospitals respond to health information technology meaningful use incentives, thoughtful, methodical, and well-managed approaches to technology deployment are crucial. This work illustrates how Jidoka offers opportunities for a smooth transition to new technology.
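The absolute risk reductions reported above follow directly from the before/after rates; a one-line check (rounded to one decimal, matching the reported figures):

```python
def absolute_risk_reduction(rate_before, rate_after):
    """ARR for rates given per 100 doses, rounded to one decimal place."""
    return round(rate_before - rate_after, 1)

print(absolute_risk_reduction(54.8, 29.0))  # → 25.8 (violations/100 doses)
print(absolute_risk_reduction(5.9, 3.0))    # → 2.9 (errors/100 doses)
```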
Day, Eric Anthony; Boatman, Paul R; Kowollik, Vanessa; Espejo, Jazmine; McEntire, Lauren E; Sherwin, Rachel E
2007-12-01
This study examined the effectiveness of collaborative training for individuals with low pretraining self-efficacy versus individuals with high pretraining self-efficacy regarding the acquisition of a complex skill that involved strong cognitive and psychomotor demands. Despite support for collaborative learning from the educational literature and the similarities between collaborative learning and interventions designed to remediate low self-efficacy, no research has addressed how self-efficacy and collaborative learning interact in contexts concerning complex skills and human-machine interactions. One hundred fifty-five young male adults trained either individually or collaboratively with a more experienced partner on a complex computer task that simulated the demands of a dynamic aviation environment. Participants also completed a task-specific measure of self-efficacy before, during, and after training. Collaborative training enhanced skill acquisition significantly more for individuals with low pretraining self-efficacy than for individuals with high pretraining self-efficacy. However, collaborative training did not bring the skill acquisition levels of those persons with low pretraining self-efficacy to the levels found for persons with high pretraining self-efficacy. Moreover, tests of mediation suggested that collaborative training may have enhanced appropriate skill development strategies without actually raising self-efficacy. Although collaborative training can facilitate the skill acquisition process for trainees with low self-efficacy, future research is needed that examines how the negative effects of low pretraining self-efficacy on complex skill acquisition can be more fully remediated. The differential effects of collaborative training as a function of self-efficacy highlight the importance of person analysis and tailoring training to meet differing trainee needs.
NASA Astrophysics Data System (ADS)
Preece, Alun; Gwilliams, Chris; Parizas, Christos; Pizzocaro, Diego; Bakdash, Jonathan Z.; Braines, Dave
2014-05-01
Recent developments in sensing technologies, mobile devices and context-aware user interfaces have made it possible to represent information fusion and situational awareness for Intelligence, Surveillance and Reconnaissance (ISR) activities as a conversational process among actors at or near the tactical edges of a network. Motivated by use cases in the domain of Company Intelligence Support Team (CoIST) tasks, this paper presents an approach to information collection, fusion and sense-making based on the use of natural language (NL) and controlled natural language (CNL) to support richer forms of human-machine interaction. The approach uses a conversational protocol to facilitate a flow of collaborative messages from NL to CNL and back again in support of interactions such as: turning eyewitness reports from human observers into actionable information (from both soldier and civilian sources); fusing information from humans and physical sensors (with associated quality metadata); and assisting human analysts to make the best use of available sensing assets in an area of interest (governed by management and security policies). CNL is used as a common formal knowledge representation for both machine and human agents to support reasoning, semantic information fusion and generation of rationale for inferences, in ways that remain transparent to human users. Examples are provided of various alternative styles for user feedback, including NL, CNL and graphical feedback. A pilot experiment with human subjects shows that a prototype conversational agent is able to gather usable CNL information from untrained human subjects.
AfterMath: the work of proof in the age of human-machine collaboration.
Dick, Stephanie
2011-09-01
During the 1970s and 1980s, a team of Automated Theorem Proving researchers at the Argonne National Laboratory near Chicago developed the Automated Reasoning Assistant, or AURA, to assist human users in the search for mathematical proofs. The resulting hybrid humans+AURA system developed the capacity to make novel contributions to pure mathematics by very untraditional means. This essay traces how these unconventional contributions were made and made possible through negotiations between the humans and the AURA at Argonne and the transformation in mathematical intuition they produced. At play in these negotiations were experimental practices, nonhumans, and nonmathematical modes of knowing. This story invites an earnest engagement between historians of mathematics and scholars in the history of science and science studies interested in experimental practice, material culture, and the roles of nonhumans in knowledge making.
ERIC Educational Resources Information Center
Dillenbourg, Pierre, Ed.
Intended to illustrate the benefits of collaboration between scientists from psychology and computer science, namely machine learning, this book contains the following chapters, most of which are co-authored by scholars from both sides: (1) "Introduction: What Do You Mean by 'Collaborative Learning'?" (Pierre Dillenbourg); (2)…
Nipah virus matrix protein: expert hacker of cellular machines.
Watkinson, Ruth E; Lee, Benhur
2016-08-01
Nipah virus (NiV, Henipavirus) is a highly lethal emergent zoonotic paramyxovirus responsible for repeated human outbreaks of encephalitis in South East Asia. There are no approved vaccines or treatments, thus improved understanding of NiV biology is imperative. NiV matrix protein recruits a plethora of cellular machinery to scaffold and coordinate virion budding. Intriguingly, matrix also hijacks cellular trafficking and ubiquitination pathways to facilitate transient nuclear localization. While the biological significance of matrix nuclear localization for an otherwise cytoplasmic virus remains enigmatic, the molecular details have begun to be characterized, and are conserved among matrix proteins from divergent paramyxoviruses. Matrix protein appropriation of cellular machinery will be discussed in terms of its early nuclear targeting and later role in virion assembly. © 2016 Federation of European Biochemical Societies.
NASA Astrophysics Data System (ADS)
Fei, Cheng-Wei; Bai, Guang-Chen
2014-12-01
To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of a gas turbine, a distribution collaborative probabilistic design method based on support vector machine regression (DCSRM) is proposed by integrating the distribution collaborative response surface method with a support vector regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is carried out to verify the proposed DCSRM. The analysis results reveal that the optimal static blade-tip clearance of the HPT is obtained for designing the BTRRC and improving the performance and reliability of the aeroengine. The comparison of methods shows that DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.
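The surrogate-model idea behind such methods can be illustrated in miniature: replace an expensive clearance simulation with a cheap fitted response surface, then run the probabilistic (Monte Carlo) analysis on the surrogate. The sketch below substitutes a simple interpolating quadratic for the paper's support vector regression model; the stand-in model, input distribution, and threshold are all invented for illustration.

```python
import random

def expensive_model(x):
    """Stand-in for a full blade-tip clearance simulation (invented)."""
    return 1.5 + 0.8 * x + 0.3 * x * x

def quadratic_surrogate(x0, x1, x2, f):
    """Interpolating quadratic (Lagrange form) through three design points."""
    y0, y1, y2 = f(x0), f(x1), f(x2)
    def s(x):
        return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
                + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
                + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))
    return s

# Fit the cheap surrogate from three "expensive" evaluations
surrogate = quadratic_surrogate(-1.0, 0.0, 1.0, expensive_model)

# Monte Carlo on the surrogate: estimate P(clearance metric > 2.0)
# for an input x ~ N(0, 0.5); each sample costs almost nothing.
rng = random.Random(42)
samples = [surrogate(rng.gauss(0.0, 0.5)) for _ in range(10_000)]
print(sum(y > 2.0 for y in samples) / len(samples))
```

The design payoff is that thousands of Monte Carlo samples hit only the surrogate, while the expensive simulation is evaluated just a handful of times to fit it.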
Proceedings of the NASA Conference on Space Telerobotics, volume 1
NASA Technical Reports Server (NTRS)
Rodriguez, Guillermo (Editor); Seraji, Homayoun (Editor)
1989-01-01
The theme of the Conference was man-machine collaboration in space. Topics addressed include: redundant manipulators; man-machine systems; telerobot architecture; remote sensing and planning; navigation; neural networks; fundamental AI research; and reasoning under uncertainty.
Agile Machining and Inspection Non-Nuclear Report (NNR) Project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lazarus, Lloyd
This report is a high-level summary of the eight major projects funded by the Agile Machining and Inspection Non-Nuclear Readiness (NNR) project (FY06.0422.3.04.R1). The largest project of the group is the Rapid Response project, whose six major subcategories are summarized. This project focused on the operations of the machining departments that will comprise Special Applications Machining (SAM) in the Kansas City Responsive Infrastructure Manufacturing & Sourcing (KCRIMS) project. The project was aimed at upgrading older machine tools, developing new inspection tools, eliminating Classified Removable Electronic Media (CREM) in the handling of classified Numerical Control (NC) programs by installing the CRONOS network, and developing methods to automatically load Coordinate-Measuring Machine (CMM) inspection data into bomb books and product scorecards. Finally, the project personnel leaned operations of some of the machine tool cells, and now have the model to continue this activity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hrubiak, Rostislav; Sinogeikin, Stanislav; Rod, Eric
We have designed and constructed a new system for micro-machining parts and sample assemblies used for diamond anvil cells and general user operations at the High Pressure Collaborative Access Team, sector 16 of the Advanced Photon Source. The new micro-machining system uses a pulsed laser of 400 ps pulse duration, ablating various materials without thermal melting, thus leaving a clean edge. With optics designed for a tight focus, the system can machine holes any size larger than 3 μm in diameter. Unlike a standard electrical discharge machining drill, the new laser system allows micro-machining of non-conductive materials such as: amorphous boron and silicon carbide gaskets, diamond, oxides, and other materials including organic materials such as polyimide films (i.e., Kapton). An important feature of the new system is the use of gas-tight or gas-flow environmental chambers which allow the laser micro-machining to be done in a controlled (e.g., inert gas) atmosphere to prevent oxidation and other chemical reactions in air sensitive materials. The gas-tight workpiece enclosure is also useful for machining materials with known health risks (e.g., beryllium). Specialized control software with a graphical interface enables micro-machining of custom 2D and 3D shapes. The laser-machining system was designed in a Class 1 laser enclosure, i.e., it includes laser safety interlocks and computer controls and allows for routine operation. Though initially designed mainly for machining of the diamond anvil cell gaskets, the laser-machining system has since found many other micro-machining applications, several of which are presented here.
Designing Flight-Deck Procedures
NASA Technical Reports Server (NTRS)
Degani, Asaf; Wiener, L.; Shafto, Mike (Technical Monitor)
1995-01-01
A complex human-machine system consists of more than merely one or more human operators and a collection of hardware components. In order to operate a complex system successfully, the human-machine system must be supported by an organizational infrastructure of operating concepts, rules, guidelines, and documents. The coherency of such operating concepts, in terms of consistency and logic, is vitally important for the efficiency and safety of any complex system. In high-risk endeavors such as aircraft operations, space flight, nuclear power production, manufacturing process control, and military operations, it is essential that such support be flawless, as the price of operational error can be high. When operating rules are not adhered to, or the rules are inadequate for the task at hand, not only will the system's goals be thwarted, but there may also be tragic human and material consequences. To ensure safe and predictable operations, support to the operators, in this case flight crews, often comes in the form of standard operating procedures. These provide the crew with step-by-step guidance for carrying out their operations. Standard procedures do indeed promote uniformity, but they do so at the risk of reducing the role of human operators to a lower level. Management, however, must recognize the danger of over-procedurization, which fails to exploit one of the most valuable assets in the system, the intelligent operator who is "on the scene." The alert system designer and operations manager recognize that there cannot be a procedure for everything, and the time will come in which the operators of a complex system will face a situation for which there is no written procedure. Procedures, whether executed by humans or machines, have their place, but so does human cognition.
Collective Machine Learning: Team Learning and Classification in Multi-Agent Systems
ERIC Educational Resources Information Center
Gifford, Christopher M.
2009-01-01
This dissertation focuses on multiple heterogeneous, intelligent agents (hardware or software) that collaborate to learn a task and are capable of sharing knowledge. The concept of collaborative learning in multi-agent and multi-robot systems is largely understudied, and represents an area where further research is needed to…
Induced Pluripotent Stem Cell Technology in Regenerative Medicine and Biology
NASA Astrophysics Data System (ADS)
Pei, Duanqing; Xu, Jianyong; Zhuang, Qiang; Tse, Hung-Fat; Esteban, Miguel A.
The potential of human embryonic stem cells (ESCs) for regenerative medicine is unquestionable, but practical and ethical considerations have hampered clinical application and research. In an attempt to overcome these issues, the conversion of somatic cells into pluripotent stem cells similar to ESCs, commonly termed nuclear reprogramming, has been a top objective of contemporary biology. More than 40 years ago, King, Briggs, and Gurdon pioneered somatic cell nuclear reprogramming in frogs, and in 1981 Evans successfully isolated mouse ESCs. In 1997 Wilmut and collaborators produced the first cloned mammal using nuclear transfer, and then Thomson obtained human ESCs from in vitro fertilized blastocysts in 1998. Over the last 2 decades we have also seen remarkable findings regarding how ESC behavior is controlled, the importance of which should not be underestimated. This knowledge allowed the laboratory of Shinya Yamanaka to brilliantly overcome conceptual and technical barriers in 2006 and generate induced pluripotent stem cells (iPSCs) from mouse fibroblasts by overexpressing defined combinations of ESC-enriched transcription factors. Here, we discuss some important implications of human iPSCs for biology and medicine and also point to possible future directions.
Molecular-Sized DNA or RNA Sequencing Machine | NCI Technology Transfer Center | TTC
The National Cancer Institute's Gene Regulation and Chromosome Biology Laboratory is seeking statements of capability or interest from parties interested in collaborative research to co-develop a molecular-sized DNA or RNA sequencing machine.
NASA Astrophysics Data System (ADS)
Taranenko, L.; Janouch, F.; Owsiacki, L.
2001-06-01
This paper presents Science and Technology Center in Ukraine (STCU) activities devoted to furthering nuclear and radiation safety, which is a prioritized STCU area. The STCU, an intergovernmental organization with the principal objective of non-proliferation, administers financial support from the USA, Canada, and the EU to Ukrainian projects in various scientific and technological areas; coordinates projects; and promotes the integration of Ukrainian scientists into the international scientific community, including involving western collaborators. The paper focuses on the STCU's largest project to date, "Program Supporting Y2K Readiness at Ukrainian NPPs," initiated in April 1999 and designed to address possible Y2K readiness problems at 14 Ukrainian nuclear reactors. Other presented projects demonstrate a wide diversity of supported directions in the fields of nuclear and radiation safety, including reactor material improvement ("Improved Zirconium-Based Elements for Nuclear Reactors"), information technologies for nuclear industries ("Ukrainian Nuclear Data Bank in Slavutich"), and radiation health science ("Diagnostics and Treatment of Radiation-Induced Injuries of Human Biopolymers").
Lin, Jhih-Rong; Liu, Zhonghao; Hu, Jianjun
2014-10-01
The binding affinity between a nuclear localization signal (NLS) and its import receptor is closely related to the corresponding nuclear import activity. Modulation of the NLS binding affinity to the import receptor by post-translational modification (PTM) is one of the best-understood mechanisms for regulating nuclear import of proteins. However, identification of such regulation mechanisms is challenging due to the difficulty of assessing the impact of a PTM on the corresponding nuclear import activity. In this study we proposed NIpredict, an effective algorithm to predict nuclear import activity from an NLS, in which molecular interaction energy components (MIECs) characterize the NLS-import receptor interaction and support vector regression (SVR) learns the relationship between the characterized interaction and the corresponding nuclear import activity. Our experiments showed that nuclear import activity change due to NLS change could be accurately predicted by the NIpredict algorithm. Based on NIpredict, we developed a systematic framework to identify potential PTM-based nuclear import regulations for human and yeast nuclear proteins. Application of this approach identified potential nuclear import regulation mechanisms by phosphorylation of two nuclear proteins, SF1 and ORC6. © 2014 Wiley Periodicals, Inc.
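The regression step in the abstract above can be illustrated with a toy sketch. Nothing here comes from the NIpredict implementation: the feature vectors and activity values are invented, and plain ridge regression stands in for the support vector regression the authors actually used, so the example needs only the Python standard library.

```python
# Hypothetical sketch: predicting nuclear import activity from
# NLS-receptor interaction features (MIEC-style energy terms).
# Ridge regression is a simple stand-in for the paper's SVR.

def ridge_fit(X, y, lam=0.1):
    """Solve (X'X + lam*I) w = X'y by Gaussian elimination with pivoting."""
    n = len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(len(X))) + (lam if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    b = [sum(X[k][i] * y[k] for k in range(len(X))) for i in range(n)]
    for col in range(n):                       # forward elimination
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n                              # back substitution
    for r in range(n - 1, -1, -1):
        w[r] = (b[r] - sum(A[r][c] * w[c] for c in range(r + 1, n))) / A[r][r]
    return w

def predict(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

# Invented feature vectors (e.g., electrostatic and van der Waals terms)
# paired with made-up import-activity readouts.
X = [[1.0, -2.0], [0.5, -1.0], [2.0, -3.5], [1.5, -2.5]]
y = [0.8, 0.4, 1.4, 1.1]
w = ridge_fit(X, y)
print(round(predict(w, [1.0, -2.0]), 2))  # prints: 0.79
```

A PTM that shifts the interaction features would then shift the predicted activity, which is the comparison the framework exploits.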
Wargaming and interactive color graphics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bly, S.; Buzzell, C.; Smith, G.
1980-08-04
JANUS is a two-sided interactive color graphic simulation in which human commanders can direct their forces, each trying to accomplish their mission. This competitive synthetic battlefield is used to explore the range of human ingenuity under conditions of incomplete information about enemy strength and deployment. Each player can react to new situations by planning new unit movements, using conventional and nuclear weapons, or modifying unit objectives. Conventional direct fire among tanks, infantry fighting vehicles, helicopters, and other units is automated subject to constraints of target acquisition, reload rate, range, suppression, etc. Artillery and missile indirect fire systems deliver conventional munitions, smoke, and nuclear weapons. Players use reconnaissance units, helicopters, or fixed-wing aircraft to search for enemy unit locations. Counter-battery radars acquire enemy artillery. The JANUS simulation at LLL has demonstrated the value of the computer as a sophisticated blackboard. A small dedicated minicomputer is adequate for detailed calculations, and may be preferable to sharing a more powerful machine. Real-time color interactive graphics are essential to allow realistic command decision inputs. Competitive human-versus-human synthetic experiences are intense and well-remembered. 2 figures.
Basic and Applied Materials Science Research Efforts at MSFC Germane to NASA Goals
NASA Technical Reports Server (NTRS)
2003-01-01
Presently, a number of investigations are ongoing that blend basic research with engineering applications in support of NASA goals. These include (1) "Pore Formation and Mobility (PFMI) " An ISS Glovebox Investigation" NASA Selected Project - 400-34-3D; (2) "Interactions Between Rotating Bodies" Center Director's Discretionary Fund (CDDF) Project - 279-62-00-16; (3) "Molybdenum - Rhenium (Mo-Re) Alloys for Nuclear Fuel Containment" TD Collaboration - 800-11-02; (4) "Fabrication of Alumina - Metal Composites for Propulsion Components" ED Collaboration - 090-50-10; (5) "Radiation Shielding for Deep-Space Missions" SD Effort; (6) "Other Research". In brief, "Pore Formation and Mobility" is an experiment to be conducted in the ISS Microgravity Science Glovebox that will systematically investigate the development, movement, and interactions of bubbles (porosity) during the controlled directional solidification of a transparent material. In addition to promoting our general knowledge of porosity physics, this work will serve as a guide to future ISS experiments utilizing metal alloys. "Interactions Between Rotating Bodies" is a CDDF sponsored project that is critically examining, through theory and experiment, claims of "new" physics relating to gravity modification and electric field effects. "Molybdenum - Rhenium Alloys for Nuclear Fuel Containment" is a TD collaboration in support of nuclear propulsion. Mo-Re alloys are being evaluated and developed for nuclear fuel containment. "Fabrication of Alumina - Metal Composites for Propulsion Components" is an ED collaboration with the intent of increasing strength and decreasing weight of metal engine components through the incorporation of nanometer-sized alumina fibers. 
"Radiation Shielding for Deep-Space Missions" is an SD effort aimed at minimizing the health risk from radiation to human space voyagers; work to date has been primarily programmatic but experiments to develop hydrogen-rich materials for shielding are planned. "Other Research" includes: BUNDLE (Bridgman Unidirectional Dendrite in a Liquid Experiment) activities (primarily crucible development), vibrational float-zone processing (with Vanderbilt University), use of ultrasonics in materials processing (with UAH), rotational effects on microstructural development, and application of magnetic fields for mixing.
Celestial data routing network
NASA Astrophysics Data System (ADS)
Bordetsky, Alex
2000-11-01
Imagine that an information-processing human-machine network is threatened in a particular part of the world. Suppose that an anticipated threat of physical attack could disrupt the telecommunications network management infrastructure and access capabilities of small, geographically distributed groups engaged in collaborative operations. Or suppose that a small group of astronauts exploring a planet needs to quickly configure an orbital information network to support their collaborative work and local communications. The critical need in both scenarios would be a set of low-cost means of small-team celestial networking. Such means would allow geographically distributed mobile collaborating groups to maintain collaborative multipoint work, set up an orbital local area network, and provide orbital intranet communications. This would be accomplished by dynamically assembling a network-enabling infrastructure of small satellite-based routers, satellite-based codecs, and satellite-based intelligent management agents. Cooperating single-function pico-satellites, acting as agents and personal switching devices, would together form a self-organizing intelligent orbital network of cooperating mobile management nodes. Cooperative behavior of the pico-satellite-based agents would be achieved by forming a small orbital artificial neural network capable of learning and restructuring the networking resources in response to the anticipated threat.
Open access for ALICE analysis based on virtualization technology
NASA Astrophysics Data System (ADS)
Buncic, P.; Gheata, M.; Schutz, Y.
2015-12-01
Open access is an important lever for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.
Neuroscience-Inspired Artificial Intelligence.
Hassabis, Demis; Kumaran, Dharshan; Summerfield, Christopher; Botvinick, Matthew
2017-07-19
The fields of neuroscience and artificial intelligence (AI) have a long and intertwined history. In more recent times, however, communication and collaboration between the two fields have become less commonplace. In this article, we argue that better understanding biological brains could play a vital role in building intelligent machines. We survey historical interactions between the AI and neuroscience fields and emphasize current advances in AI that have been inspired by the study of neural computation in humans and other animals. We conclude by highlighting shared themes that may be key for advancing future research in both fields. Copyright © 2017. Published by Elsevier Inc.
Eysenbach, G
2001-01-01
This editorial provides a model of how quality initiatives concerned with health information on the World Wide Web may in the future interact with each other. This vision fits into the evolving "Semantic Web" architecture - i.e., the prospect that the World Wide Web may evolve from a mess of unstructured, human-readable information sources into a global knowledge base with an additional layer providing richer and more meaningful relationships between resources. One first prerequisite for forming such a "Semantic Web" or "web of trust" among the players active in quality management of health information is that these initiatives make statements about themselves and about each other in a machine-processable language. I present a concrete model of how this collaboration could look, and provide some recommendations on what the role of the World Health Organization (WHO) and other policy makers in this framework could be. PMID:11772549
Advanced human-machine interface for collaborative building control
Zheng, Xianjun S.; Song, Zhen; Chen, Yanzi; Zhang, Shaopeng; Lu, Yan
2015-08-11
A system for collaborative energy management and control in a building, including an energy management controller, one or more occupant HMIs that supports two-way communication between building occupants and a facility manager, and between building occupants and the energy management controller, and a facility manager HMI that supports two-way communication between the facility manager and the building occupants, and between the facility manager and the energy management controller, in which the occupant HMI allows building occupants to provide temperature preferences to the facility manager and the energy management controller, and the facility manager HMI allows the facility manager to configure an energy policy for the building as a set of rules and to view occupants' aggregated temperature preferences, and the energy management controller determines an optimum temperature range that resolves conflicting occupant temperature preferences and occupant temperature preferences that conflict with the facility manager's energy policy for the building.
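The patent abstract above does not disclose how the controller computes the optimum temperature range. As a purely illustrative sketch, one simple approach intersects the occupants' preference ranges with the facility manager's policy band and falls back to a clamped median midpoint when no overlap exists; the function name and the fallback rule are assumptions, not the patented method.

```python
# Hypothetical conflict-resolution sketch for an energy management
# controller; not the algorithm claimed in the patent.

def resolve_setpoint_range(preferences, policy):
    """preferences: list of (low, high) per occupant; policy: (low, high)."""
    lo = max(policy[0], max(p[0] for p in preferences))
    hi = min(policy[1], min(p[1] for p in preferences))
    if lo <= hi:                      # every constraint overlaps
        return (lo, hi)
    # No common range: fall back to the median preferred midpoint,
    # clamped into the facility manager's policy band.
    mids = sorted((p[0] + p[1]) / 2 for p in preferences)
    m = mids[len(mids) // 2]
    m = min(max(m, policy[0]), policy[1])
    return (m, m)

print(resolve_setpoint_range([(20, 23), (21, 24), (22, 25)], (19, 26)))
# prints: (22, 23)
```

The two-way HMIs described in the abstract would supply the `preferences` and `policy` inputs and display the resolved range back to occupants and the facility manager.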
First-of-A-Kind Control Room Modernization Project Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, Kenneth David
This project plan describes a comprehensive approach to the design of an end-state concept for a modernized control room for Palo Verde. It describes the collaboration arrangement between the DOE LWRS Program Control Room Modernization Project and the APS Palo Verde Nuclear Generating Station. It further describes the role of other collaborators, including the Institute for Energy Technology (IFE) and the Electric Power Research Institute (EPRI). It combines advanced tools, methodologies, and facilities to enable a science-based approach to the validation of applicable engineering and human factors principles for nuclear plant control rooms. It addresses the required project results and documentation to demonstrate compliance with regulatory requirements. It describes the tasks that will be conducted in the project and the deliverable reports that will be developed through them. This project plan will be updated as new tasks are added and as project milestones are completed. It will serve as an ongoing description of the project, both for project participants and for industry stakeholders.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwantes, J. M.; Marsden, O.; Reilly, D.
The Nuclear Forensics International Technical Working Group is a community of nuclear forensic practitioners who respond to incidents involving nuclear and other radioactive material out of regulatory control. The Group is dedicated to advancing nuclear forensic science in part through periodic participation in materials exercises. The Group completed its fourth Collaborative Materials Exercise in 2015, in which laboratories from 15 countries and one multinational organization analyzed three samples of special nuclear material in support of a mock nuclear forensic investigation. This special section of the Journal of Radioanalytical and Nuclear Chemistry is devoted to summarizing highlights from this exercise.
Analytical design of intelligent machines
NASA Technical Reports Server (NTRS)
Saridis, George N.; Valavanis, Kimon P.
1987-01-01
The problem of designing 'intelligent machines' to operate in uncertain environments with minimum supervision or interaction with a human operator is examined. The structure of an 'intelligent machine' is defined as that of a Hierarchically Intelligent Control System, composed of three levels ordered according to the principle of 'increasing precision with decreasing intelligence': the organizational level, performing general information processing tasks in association with a long-term memory; the coordination level, dealing with specific information processing tasks with a short-term memory; and the control level, which executes the various tasks in hardware using feedback control methods. The behavior of such a machine is managed through specially designed controls, and its 'intelligence' is quantified by associating the higher levels with the concept of entropy. Entropy provides a single analytic measure that unifies the treatment of all levels of an 'intelligent machine': design reduces to the mathematical problem of finding the sequence of internal decisions and controls that minimizes the machine's total entropy, for a system structured in order of intelligence and inverse order of precision. A case study on the automatic maintenance of a nuclear plant illustrates the proposed approach.
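The entropy criterion described above can be made concrete with a toy example. The plans and per-level outcome distributions below are invented; only the idea, summing Shannon entropies across the organizational, coordination, and control levels and choosing the decision sequence that minimizes the total, follows the abstract.

```python
# Illustrative only: each candidate plan carries a probability
# distribution over outcomes at the organization, coordination, and
# control levels; the machine picks the plan with the smallest sum
# of Shannon entropies. Plans and numbers are made up.
import math

def entropy(dist):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def total_entropy(levels):
    return sum(entropy(d) for d in levels)

plans = {
    "plan_A": [[0.9, 0.1], [0.8, 0.2], [0.7, 0.3]],  # per-level outcome dists
    "plan_B": [[0.5, 0.5], [0.6, 0.4], [0.9, 0.1]],
}

best = min(plans, key=lambda name: total_entropy(plans[name]))
print(best)  # prints: plan_A
```

Lower entropy at a level corresponds to higher certainty, so the minimization formalizes "increasing precision" as the hierarchy descends.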
Minimizing soil impacts from forest operations
Emily A. Carter
2011-01-01
Several studies were conducted by Forest Service researchers and University and Industrial collaborators that investigated the potential for lessening soil surface disturbances and compaction in forest operations through modifications of machine components or harvest systems. Specific machine modifications included change in tire size, use of dual tire systems,...
76 FR 50268 - Amended Certification Regarding Eligibility To Apply for Worker Adjustment Assistance
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-12
... Certification Regarding Eligibility To Apply for Worker Adjustment Assistance TA-W-73,218 International Business... International Business Machines Corporation (IBM), ITD Business Unit, Division 7, Email and Collaboration Group..., the Department is amending this certification to include workers of International Business Machines...
Supplying the nuclear arsenal: Production reactor technology, management, and policy, 1942--1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carlisle, R.P.; Zenzen, J.M.
1994-01-01
This book focuses on the lineage of America's production reactors: the three at Hanford and their descendants, the reactors behind America's nuclear weapons. The work will take only occasional sideways glances at the collateral lines of descent, the reactor cousins designed for experimental purposes, ship propulsion, and electric power generation. Over the decades from 1942 through 1992, fourteen American production reactors made enough plutonium to fuel a formidable arsenal of more than twenty thousand weapons. In the last years of that period, planners, nuclear engineers, and managers struggled over designs for the next generation of production reactors. The story of fourteen individual machines and of the planning effort to replace them might appear relatively narrow. Yet these machines lay at the heart of the nation's nuclear weapons complex. The story of these machines is the story of arming the winning weapon, supplying the nuclear arms race. This book is intended to capture the history of the first fourteen production reactors, and associated design work, in the face of the end of the Cold War.
PET - radiopharmaceutical facilities at Washington University Medical School - an overview
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dence, C.S.; Welch, M.J.
1994-12-31
The PET program at Washington University has evolved over more than three decades of research and development in the use of positron-emitting isotopes in medicine and biology. In 1962 the installation of the first hospital cyclotron in the USA was accomplished. This first machine was an Allis Chalmers (AC) cyclotron and it was operated until July, 1990. Simultaneously with this cyclotron the authors also ran a Cyclotron Corporation (TCC) CS-15 cyclotron that was purchased in 1977. Both of these cyclotrons were maintained in-house and operated with a relatively small downtime (approximately 3.5%). After the dismantling of the AC machine in 1990, a Japanese Steel Works 16/8 (JSW-16/8) cyclotron was installed in the vault. Whereas the AC cyclotron could only accelerate deuterons (6.2 MeV), the JSW-16/8 machine can accelerate both protons and deuterons, so all of the radiopharmaceuticals can be produced on either of the two presently owned accelerators. At the end of May 1993, the medical school installed the first clinical Tandem Cascade Accelerator (TCA) in a collaboration with Science Research Laboratories (SRL) of Somerville, MA. Preliminary target testing, design and development are presently under way. In 1973, the University installed the first operational PETT device in the country, and at present there is a large basic science and clinical research program involving more than a hundred staff in nuclear medicine, radiation sciences, neurology, neurosurgery, psychiatry, cardiology, pulmonary medicine, oncology, and surgery.
Analysis in Motion Initiative – Human Machine Intelligence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blaha, Leslie
As computers and machines become more pervasive in our everyday lives, we are looking for ways for humans and machines to work more intelligently together. How can we help machines understand their users so the team can do smarter things together? The Analysis in Motion Initiative is advancing the science of human-machine intelligence, creating human-machine teams that work better together to make correct, useful, and timely interpretations of data.
ERIC Educational Resources Information Center
Huang, Yifen
2010-01-01
Mixed-initiative clustering is a task where a user and a machine work collaboratively to analyze a large set of documents. We hypothesize that a user and a machine can both learn better clustering models through enriched communication and interactive learning from each other. The first contribution of this thesis is providing a framework of…
Learning Simple Machines through Cross-Age Collaborations
ERIC Educational Resources Information Center
Lancor, Rachael; Schiebel, Amy
2008-01-01
In this project, introductory college physics students (noneducation majors) were asked to teach simple machines to a class of second graders. This nontraditional activity proved to be a successful way to encourage college students to think critically about physics and how it applied to their everyday lives. The noneducation majors benefited by…
Determination of Machining Parameters of Corn Byproduct Filled Plastics
USDA-ARS?s Scientific Manuscript database
In a collaborative project between the USDA and Northern Illinois University, the use of ethanol corn processing by-products as bio-filler materials in the compression molding of phenolic plastics has been studied. This paper reports on the results of a machinability study in the milling of various ...
Determining Machining Parameters of Corn Byproduct Filled Plastics
USDA-ARS?s Scientific Manuscript database
In a collaborative project between the USDA and Northern Illinois University, the use of corn ethanol processing byproducts (i.e., DDGS) as bio-filler materials in the compression molding of phenolic plastics has been studied. This paper reports on the results of a machinability study in the milling...
NASA Technical Reports Server (NTRS)
Warren, W. H., Jr.
1984-01-01
A detailed description of the machine-readable astronomical catalog as it is currently being distributed from the Astronomical Data Center is given. Stellar motions and positions are listed herein in tabular form.
NASA Technical Reports Server (NTRS)
Corker, Kevin M.; Pisanich, Gregory M.; Lebacqz, Victor (Technical Monitor)
1996-01-01
The Man-Machine Interaction Design and Analysis System (MIDAS) has been under development for the past ten years through a joint US Army and NASA cooperative agreement. MIDAS represents multiple human operators and selected perceptual, cognitive, and physical functions of those operators as they interact with simulated systems. MIDAS has been used as an integrated predictive framework for the investigation of human/machine systems, particularly in situations with high demands on the operators. Specific examples include: nuclear power plant crew simulation, military helicopter flight crew response, and police force emergency dispatch. In recent applications to airborne systems development, MIDAS has demonstrated an ability to predict flight crew decision-making and procedural behavior when interacting with automated flight management systems and Air Traffic Control. In this paper we describe two enhancements to MIDAS. The first involves the addition of working memory in the form of an articulatory buffer for verbal communication protocols and a visuo-spatial buffer for communications via digital datalink. The second enhancement is a representation of multiple operators working as a team. This enhanced model was used to predict the performance of human flight crews and their level of compliance with commercial aviation communication procedures. We show how the data produced by MIDAS compares with flight crew performance data from full mission simulations. Finally, we discuss the use of these features to study communications issues connected with aircraft-based separation assurance.
2010-11-01
metal. Recovery extraction centrifugal contactors: a process that uses solvent to extract uranium for purposes of purification. Agile machining: a...

Recovery extraction centrifugal contactors   5   6   Yes   6   No
Agile machining                              5   5   No    6   No
Chip management                              5   6   Yes   6   No
Special casting                              3   6   Yes   6   No
Source: GAO
Protection against UV and X-ray cataracts using dynamic light scattering
NASA Technical Reports Server (NTRS)
Giblin, Frank J.
2005-01-01
Static and dynamic light scattering (SLS and DLS) analysis was used to investigate the aggregation of lens proteins in a hyperbaric oxygen (HBO)/guinea pig in vivo model for nuclear cataract. Nuclear cataract, an opacity which occurs in the center of the lens, is a major type of human maturity-onset cataract for which the cause is not well-understood. HBO is commonly used in major hospitals for treating complications such as poor wound healing due to impaired blood circulation. It is known that treatment of human patients with HBO for extended periods of time can produce nuclear cataract. Guinea pigs, initially 18 months old, were treated with HBO (2.5 atm of 100% O2 for 2.5 hr) 3x per week for 7 months to increase the level of lens nuclear light scattering. Age-matched animals were used for controls. The eyes of the animals were analyzed in vivo using an integrated static and DLS fiber optic probe in collaboration with the NASA group. DLS in vivo was used to measure the size of lens proteins at 50 different locations across the optical axis of the guinea pig lens.
Comparison of Human and Machine Scoring of Essays: Differences by Gender, Ethnicity, and Country
ERIC Educational Resources Information Center
Bridgeman, Brent; Trapani, Catherine; Attali, Yigal
2012-01-01
Essay scores generated by machine and by human raters are generally comparable; that is, they can produce scores with similar means and standard deviations, and machine scores generally correlate as highly with human scores as scores from one human correlate with scores from another human. Although human and machine essay scores are highly related…
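The comparison described above rests on correlations between sets of scores. A minimal sketch, with made-up essay scores, computes the Pearson correlation of a machine against one human rater and of two human raters against each other; in the abstract's terms, the machine-human correlation is expected to be at least as high as the human-human one.

```python
# Toy illustration of human-machine vs human-human score agreement.
# All scores are invented; only the Pearson formula is standard.
import math

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

human_1 = [3, 4, 2, 5, 4, 3]   # one rater's scores on six essays
human_2 = [3, 5, 2, 4, 4, 2]   # a second rater
machine = [3, 4, 2, 5, 5, 3]   # automated scores

print(round(pearson(human_1, machine), 2),
      round(pearson(human_1, human_2), 2))  # prints: 0.94 0.79
```

Subgroup analyses by gender, ethnicity, or country, as in the study, would simply repeat this computation on each subgroup's essays.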
A Framework to Guide the Assessment of Human-Machine Systems.
Stowers, Kimberly; Oglesby, James; Sonesh, Shirley; Leyva, Kevin; Iwig, Chelsea; Salas, Eduardo
2017-03-01
We have developed a framework for guiding measurement in human-machine systems. The assessment of safety and performance in human-machine systems often relies on direct measurement, such as tracking reaction time and accidents. However, safety and performance emerge from the combination of several variables. The assessment of precursors to safety and performance are thus an important part of predicting and improving outcomes in human-machine systems. As part of an in-depth literature analysis involving peer-reviewed, empirical articles, we located and classified variables important to human-machine systems, giving a snapshot of the state of science on human-machine system safety and performance. Using this information, we created a framework of safety and performance in human-machine systems. This framework details several inputs and processes that collectively influence safety and performance. Inputs are divided according to human, machine, and environmental inputs. Processes are divided into attitudes, behaviors, and cognitive variables. Each class of inputs influences the processes and, subsequently, outcomes that emerge in human-machine systems. This framework offers a useful starting point for understanding the current state of the science and measuring many of the complex variables relating to safety and performance in human-machine systems. This framework can be applied to the design, development, and implementation of automated machines in spaceflight, military, and health care settings. We present a hypothetical example in our write-up of how it can be used to aid in project success.
Birnbaum, Michael L; Ernala, Sindhu Kiranmai; Rizvi, Asra F; De Choudhury, Munmun; Kane, John M
2017-08-14
Linguistic analysis of publicly available Twitter feeds has achieved success in differentiating individuals who self-disclose online as having schizophrenia from healthy controls. To date, limited efforts have included expert input to evaluate the authenticity of diagnostic self-disclosures. This study aims to move from noisy self-reports of schizophrenia on social media to more accurate identification of diagnoses by exploring a human-machine partnered approach, wherein computational linguistic analysis of shared content is combined with clinical appraisals. Twitter timeline data, extracted from 671 users with self-disclosed diagnoses of schizophrenia, was appraised for authenticity by expert clinicians. Data from disclosures deemed true were used to build a classifier aiming to distinguish users with schizophrenia from healthy controls. Results from the classifier were compared to expert appraisals on new, unseen Twitter users. Significant linguistic differences were identified in the schizophrenia group including greater use of interpersonal pronouns (P<.001), decreased emphasis on friendship (P<.001), and greater emphasis on biological processes (P<.001). The resulting classifier distinguished users with disclosures of schizophrenia deemed genuine from control users with a mean accuracy of 88% using linguistic data alone. Compared to clinicians on new, unseen users, the classifier's precision, recall, and accuracy measures were 0.27, 0.77, and 0.59, respectively. These data reinforce the need for ongoing collaborations integrating expertise from multiple fields to strengthen our ability to accurately identify and effectively engage individuals with mental illness online. These collaborations are crucial to overcome some of mental illnesses' biggest challenges by using digital technology. ©Michael L Birnbaum, Sindhu Kiranmai Ernala, Asra F Rizvi, Munmun De Choudhury, John M Kane.
Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 14.08.2017.
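The classifier-versus-clinician comparison above rests on standard confusion-matrix metrics. As a reminder of how such figures are computed (the counts below are invented for illustration, not the study's data):

```python
def precision_recall_accuracy(tp, fp, fn, tn):
    """Standard confusion-matrix metrics for a binary classifier."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, accuracy

# Hypothetical counts (true/false positives and negatives), for illustration only
p, r, a = precision_recall_accuracy(tp=8, fp=2, fn=2, tn=8)
print(p, r, a)  # → 0.8 0.8 0.8
```

A low precision with a higher recall, as reported for the classifier on unseen users, means most genuine cases are caught but many flagged users are false positives.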
Christakis, Panos G; Braga-Mele, Rosa M
2012-02-01
To compare the intraoperative performance and postoperative outcomes of 3 phacoemulsification machines that use different modes. Kensington Eye Institute, Toronto, Ontario, Canada. Comparative case series. This chart and video review comprised consecutive eligible patients who had phacoemulsification by the same surgeon using a Whitestar Signature Ellips-FX (transversal), Infiniti-Ozil-IP (torsional), or Stellaris (longitudinal) machine. The review included 98 patients. Baseline characteristics in the groups were similar; the mean nuclear sclerosis grade was 2.0 ± 0.8. There were no significant intraoperative complications. The torsional machine averaged less phacoemulsification needle time (83 ± 33 seconds) than the transversal (99 ± 40 seconds; P=.21) or longitudinal (110 ± 45 seconds; P=.02) machines; the difference was accentuated in cases with high-grade nuclear sclerosis. The torsional machine had less chatter and better followability than the transversal or longitudinal machines (P<.001). The torsional and longitudinal machines had better anterior chamber stability than the transversal machine (P<.001). Postoperatively, the torsional machine yielded less central corneal edema than the transversal (P<.001) and longitudinal (P=.04) machines, corresponding to a smaller increase in mean corneal thickness (torsional 5%, transversal 10%, longitudinal 12%; P=.04). Also, the torsional machine had better 1-day postoperative visual acuities (P<.001). All 3 phacoemulsification machines were effective with no significant intraoperative complications. The torsional machine outperformed the transversal and longitudinal machines, with a lower mean needle time, less chatter, and improved followability. This corresponded to less corneal edema 1 day postoperatively and better visual acuity. Copyright © 2011 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Ambur, Manjula Y.; Yagle, Jeremy J.; Reith, William; McLarney, Edward
2016-01-01
In 2014, a team of researchers, engineers and information technology specialists at NASA Langley Research Center developed a Big Data Analytics and Machine Intelligence Strategy and Roadmap as part of Langley's Comprehensive Digital Transformation Initiative, with the aim of identifying the goals, objectives, initiatives, and recommendations needed to develop near-, mid-, and long-term capabilities for data analytics and machine intelligence in aerospace domains. Since that time, significant progress has been made in developing pilots and projects in several research, engineering, and scientific domains by following the original strategy of collaboration between mission support organizations, mission organizations, and external partners from universities and industry. This report summarizes the work to date in Data Intensive Scientific Discovery, Deep Content Analytics, and Deep Q&A projects, as well as the progress made in collaboration, outreach, and education. Recommendations for continuing this success into future phases of the initiative are also made.
1990-02-01
human-to-human communication patterns during situation assessment and cooperative problem solving tasks. The research proposed for the second URRP year...Hardware development. In order to create an environment within which to study multi-channeled human-to-human communication, a multi-media observation...that machine-to-human communication can be used to increase cohesion between humans and intelligent machines and to promote human-machine team
How DARHT Works - the World's Most Powerful X-ray Machine
None
2018-06-01
The Dual Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory is an essential scientific tool supporting Stockpile Stewardship at the Laboratory. The world's most powerful x-ray machine, it is used to take high-speed images of mock nuclear devices; these data are used to confirm and modify advanced computer codes in assuring the safety, security, and effectiveness of the U.S. nuclear deterrent.
ARMD Strategic Thrust 6: Assured Autonomy for Aviation Transformation
NASA Technical Reports Server (NTRS)
Ballin, Mark; Holbrook, Jon; Sharma, Shivanjli
2016-01-01
In collaboration with the external community and other government agencies, NASA will develop enabling technologies, standards, and design guidelines to support cost-effective applications of automation and limited autonomy for individual components of aviation systems. NASA will also provide foundational knowledge and methods to support the next epoch. Research will address issues of verification and validation, operational evaluation, national policy, and societal cost-benefit. Two research and development approaches to aviation autonomy will advance in parallel. The Increasing Autonomy (IA) approach will seek to advance knowledge and technology through incremental increases in machine-based support of existing human-centered tasks, leading to long-term reallocation of functions between humans and machines. The Autonomy as a New Technology (ANT) approach seeks advances by developing technology to achieve goals that are not currently possible using human-centered concepts of operation. IA applications are mission-enhancing, and their selection will be based on benefits achievable relative to existing operations. ANT applications are mission-enabling, and their value will be assessed based on societal benefit resulting from a new capability. The expected demand for small autonomous unmanned aircraft systems (UAS) provides an opportunity for development of ANT applications. Supervisory autonomy may be implemented as an expansion of the number of functions or systems that may be controlled by an individual human operator. Convergent technology approaches, such as the use of electronic flight bags and existing network servers, will be leveraged to the maximum extent possible.
Le, Laetitia Minh Maï; Kégl, Balázs; Gramfort, Alexandre; Marini, Camille; Nguyen, David; Cherti, Mehdi; Tfaili, Sana; Tfayli, Ali; Baillet-Guffroy, Arlette; Prognon, Patrice; Chaminade, Pierre; Caudron, Eric
2018-07-01
The use of monoclonal antibodies (mAbs) constitutes one of the most important strategies to treat patients suffering from cancers such as hematological malignancies and solid tumors. These antibodies are prescribed by the physician and prepared by hospital pharmacists. An analytical control enables the quality of the preparations to be ensured. The aim of this study was to explore the development of a rapid analytical method for quality control. The method used four mAbs (Infliximab, Bevacizumab, Rituximab and Ramucirumab) at various concentrations and was based on recording Raman data and coupling them to a traditional chemometric and machine learning approach for data analysis. Compared to a conventional linear approach, prediction errors are reduced with a data-driven approach using statistical machine learning methods, in which preprocessing and predictive models are jointly optimized. An additional original aspect of the work involved submitting the problem to a collaborative data challenge platform called Rapid Analytics and Model Prototyping (RAMP), which made it possible to draw on solutions from about 300 data scientists working collaboratively. Using machine learning, the prediction of the four mAbs samples was considerably improved. The best predictive model showed a combined error of 2.4% versus 14.6% for the linear approach. The concentration and classification errors were 5.8% and 0.7%; only three of the 429 spectra in the test set were misclassified. This large improvement obtained with machine learning techniques was uniform across all molecules but maximal for Bevacizumab, with an 88.3% reduction in combined error (2.1% versus 17.9%). Copyright © 2018 Elsevier B.V. All rights reserved.
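As an illustration of the classification side of such a spectra-based pipeline (this is not the RAMP winners' model; the two-band "spectra" and class names below are invented), a nearest-centroid rule over per-class mean spectra might look like:

```python
import math

def nearest_centroid(train, test_spectrum):
    """Classify a spectrum by Euclidean distance to per-class mean spectra.
    `train` maps class label -> list of spectra (equal-length lists)."""
    best, best_d = None, math.inf
    for label, spectra in train.items():
        n = len(spectra)
        centroid = [sum(band) / n for band in zip(*spectra)]
        d = math.dist(centroid, test_spectrum)
        if d < best_d:
            best, best_d = label, d
    return best

# Toy two-band "spectra" for two hypothetical mAb classes
train = {"mAb_A": [[1.0, 0.2], [1.1, 0.3]],
         "mAb_B": [[0.2, 1.0], [0.3, 1.1]]}
print(nearest_centroid(train, [1.05, 0.25]))  # → mAb_A
```

Real Raman workflows add baseline correction and normalization before any such distance computation, which is part of what the jointly optimized preprocessing in the study addresses.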
32 CFR 701.53 - FOIA fee schedule.
Code of Federal Regulations, 2014 CFR
2014-07-01
... human time) and machine time. (1) Human time. Human time is all the time spent by humans performing the...) Machine time. Machine time involves only direct costs of the central processing unit (CPU), input/output... exist to calculate CPU time, no machine costs can be passed on to the requester. When CPU calculations...
32 CFR 701.53 - FOIA fee schedule.
Code of Federal Regulations, 2012 CFR
2012-07-01
... human time) and machine time. (1) Human time. Human time is all the time spent by humans performing the...) Machine time. Machine time involves only direct costs of the central processing unit (CPU), input/output... exist to calculate CPU time, no machine costs can be passed on to the requester. When CPU calculations...
32 CFR 518.20 - Collection of fees and fee rates.
Code of Federal Regulations, 2014 CFR
2014-07-01
...; individual time (hereafter referred to as human time), and machine time. (i) Human time. Human time is all the time spent by humans performing the necessary tasks to prepare the job for a machine to execute..., programmer, database administrator, or action officer). (ii) Machine time. Machine time involves only direct...
32 CFR 518.20 - Collection of fees and fee rates.
Code of Federal Regulations, 2012 CFR
2012-07-01
...; individual time (hereafter referred to as human time), and machine time. (i) Human time. Human time is all the time spent by humans performing the necessary tasks to prepare the job for a machine to execute..., programmer, database administrator, or action officer). (ii) Machine time. Machine time involves only direct...
32 CFR 518.20 - Collection of fees and fee rates.
Code of Federal Regulations, 2013 CFR
2013-07-01
...; individual time (hereafter referred to as human time), and machine time. (i) Human time. Human time is all the time spent by humans performing the necessary tasks to prepare the job for a machine to execute..., programmer, database administrator, or action officer). (ii) Machine time. Machine time involves only direct...
32 CFR 701.53 - FOIA fee schedule.
Code of Federal Regulations, 2013 CFR
2013-07-01
... human time) and machine time. (1) Human time. Human time is all the time spent by humans performing the...) Machine time. Machine time involves only direct costs of the central processing unit (CPU), input/output... exist to calculate CPU time, no machine costs can be passed on to the requester. When CPU calculations...
Evaluation of Additive Manufacturing for Stainless Steel Components
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peter, William H.; Lou, Xiaoyuan; List, III, Frederick Alyious
This collaboration between Oak Ridge National Laboratory and General Electric Company aimed to evaluate the mechanical properties, microstructure, and porosity of 316L stainless steel additively manufactured on ORNL's Renishaw AM250 machine for nuclear applications. The program also evaluated the stress corrosion cracking and corrosion fatigue crack growth rates of the same material in high-temperature water environments. Results show the properties of this material to be similar to those of 316L stainless steel fabricated additively with equipment from other manufacturers, with slightly higher porosity. The stress corrosion crack growth rate is similar to that of wrought 316L stainless steel in an oxygenated high-temperature water environment and slightly higher in a hydrogenated high-temperature water environment. Optimized heat treatment of this material is expected to improve performance in high-temperature water environments.
Gunatilake, Mangala
2018-06-01
As in human beings, pain is an unpleasant sensation experienced by animals, and animals subjected to experimental procedures are no exception. Our duty as researchers and scientists is to prevent or minimize pain in animals so as to lessen their suffering and distress during experimental procedures. This course, organized in Sri Lanka for the first time in collaboration with the Comparative Biology Centre of Newcastle University, UK, covered the basics of the physiology of pain and pain perception, analgesia, anesthesia, and euthanasia of laboratory animals before the practical part was attempted and before advanced topics, such as comparison of anesthetic combinations, were discussed. During this course, we were able to demonstrate, for the first time in the country, how an anesthesia machine can be used in laboratory animal anesthesia; none of the animal houses in the country was equipped with an anesthesia machine at the time the course was conducted.
Robot Guidance Using Machine Vision Techniques in Industrial Environments: A Comparative Review.
Pérez, Luis; Rodríguez, Íñigo; Rodríguez, Nuria; Usamentiaga, Rubén; García, Daniel F
2016-03-05
In the factory of the future, most of the operations will be done by autonomous robots that need visual feedback to move around the working space avoiding obstacles, to work collaboratively with humans, to identify and locate the working parts, to complete the information provided by other sensors to improve their positioning accuracy, etc. Different vision techniques, such as photogrammetry, stereo vision, structured light, time of flight and laser triangulation, among others, are widely used for inspection and quality control processes in the industry and now for robot guidance. Choosing which type of vision system to use is highly dependent on the parts that need to be located or measured. Thus, in this paper a comparative review of different machine vision techniques for robot guidance is presented. This work analyzes accuracy, range and weight of the sensors, safety, processing time and environmental influences. Researchers and developers can take it as background information for their future works.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yonggang
In the implementation of nuclear safeguards, many different techniques are used to monitor the operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagery, and digital seals to open-source searches and reports of onsite inspections and verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or only loose correlations, it could be beneficial to analyze them together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning-based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential value in nuclear safeguards.
Propagation of nuclear data uncertainties for fusion power measurements
NASA Astrophysics Data System (ADS)
Sjöstrand, Henrik; Conroy, Sean; Helgesson, Petter; Hernandez, Solis Augusto; Koning, Arjan; Pomp, Stephan; Rochman, Dimitri
2017-09-01
Neutron measurements using neutron activation systems are an essential part of the diagnostic system at large fusion machines such as JET and ITER. Nuclear data is used to infer the neutron yield. Consequently, high-quality nuclear data is essential for the proper determination of the neutron yield and fusion power. However, uncertainties due to nuclear data are not fully taken into account in uncertainty analysis for neutron yield calibrations using activation foils. This paper investigates the neutron yield uncertainty due to nuclear data using the so-called Total Monte Carlo Method. The work is performed using a detailed MCNP model of the JET fusion machine; the uncertainties due to the cross-sections and angular distributions in JET structural materials, as well as the activation cross-sections in the activation foils, are analysed. It is found that a significant contribution to the neutron yield uncertainty can come from uncertainties in the nuclear data.
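The Total Monte Carlo idea, stripped to its core, is to re-run the yield inference many times with the nuclear data sampled from its uncertainty distribution and take the spread of the outputs as the propagated uncertainty. A toy sketch, with an invented activation cross-section and a yield that simply scales as 1/σ (the real analysis runs a full MCNP model per sample):

```python
import random
import statistics

def total_monte_carlo(n_samples=1000, xs_mean=0.95, xs_rel_unc=0.03, seed=1):
    """Toy Total Monte Carlo: repeatedly sample a (hypothetical) activation
    cross-section from its uncertainty distribution, re-derive the inferred
    neutron yield, and report the spread as the propagated uncertainty."""
    random.seed(seed)
    measured_activity = 1.0e6  # arbitrary units, held fixed across samples
    yields = []
    for _ in range(n_samples):
        xs = random.gauss(xs_mean, xs_mean * xs_rel_unc)  # sampled library value
        yields.append(measured_activity / xs)             # inferred yield ∝ 1/σ
    mean = statistics.mean(yields)
    rel_unc = statistics.stdev(yields) / mean
    return mean, rel_unc

mean, rel_unc = total_monte_carlo()
print(f"relative yield uncertainty ≈ {rel_unc:.1%}")  # close to the 3% input spread
```

In this linearized toy the input and output relative uncertainties nearly coincide; in the full problem, transport through structural materials makes the mapping less direct, which is exactly what the paper quantifies.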
First Megascience Experiment at Fermilab: Through Hardship to Protons
NASA Astrophysics Data System (ADS)
Pronskikh, Vitaly; Higgins, Valerie
The E-36 experiment on small-angle proton-proton scattering, which officially started in 1970, made use of the Main Ring beams and gave rise to a chain of similar experiments that continued after 1972; it was the first experiment at the newly built NAL. It was also the first US/USSR collaboration in particle physics, as well as the first experiment that can confidently be characterized as megascience. The experimental data were interpreted as an indication of the pomeron, a quasiparticle named after the Soviet theorist I. Pomeranchuk. The idea of the experiment can be traced back to the Rochester conference held in 1970 in Kiev, where two American and Soviet physicists met to develop it and later acquainted NAL director Robert Wilson with it. Wilson enthusiastically set the stage for the experiment at NAL. Involving a gas-jet target built at the Dubna machine shop of the Joint Institute for Nuclear Research and brought to Batavia, Illinois, the experiment established cooperation between the US and the Soviets in the spirit of their contemporary Apollo-Soyuz space program, thus breaking the ice of the Cold War from within high-energy physics. In this talk, based on the Fermilab Archives and interviews, we discuss the financial and administrative obstacles raised by Soviet officials that the Russian collaborators had to overcome, the interinstitutional tensions among the Soviets that accompanied the collaboration, NAL culture, and the roles of scientists in megascience as ambassadors of peace.
Mechanisms for training security inspectors to enhance human performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burkhalter, H.E.; Sessions, J.C.
The Department of Energy (DOE) has established qualification standards for protective force personnel employed at nuclear facilities (10 CFR Part 1046 (Federal Register)). Training mechanisms used at Los Alamos to enhance human performance in meeting DOE standards include, but are not limited to, the following: for cardio-respiratory training, distance running, interval training, sprint training, pacing, indoor aerobics, and circuit training; for muscular strength, free weights, weight machines, light hand weights, grip strength conditioners, and calisthenics; for muscular endurance, high repetitions (15-40) using dumbbells, flex weights, resistive rubber bands, benches, and calisthenics; for flexibility, specific time in each training session devoted to stretching the muscles involved in a particular activity. These training mechanisms, with specific protocols, can enhance human performance.
Proceedings of the NASA Conference on Space Telerobotics, volume 3
NASA Technical Reports Server (NTRS)
Rodriguez, Guillermo (Editor); Seraji, Homayoun (Editor)
1989-01-01
The theme of the Conference was man-machine collaboration in space. The Conference provided a forum for researchers and engineers to exchange ideas on the research and development required for application of telerobotics technology to the space systems planned for the 1990s and beyond. The Conference: (1) provided a view of current NASA telerobotic research and development; (2) stimulated technical exchange on man-machine systems, manipulator control, machine sensing, machine intelligence, concurrent computation, and system architectures; and (3) identified important unsolved problems of current interest which can be dealt with by future research.
Control rod system useable for fuel handling in a gas-cooled nuclear reactor
Spurrier, Francis R.
1976-11-30
A control rod and its associated drive are used to elevate a complete stack of fuel blocks to a position above the core of a gas-cooled nuclear reactor. A fuel-handling machine grasps the control rod and the drive is unlatched from the rod. The stack and rod are transferred out of the reactor, or to a new location in the reactor, by the fuel-handling machine.
Design of virtual SCADA simulation system for pressurized water reactor
NASA Astrophysics Data System (ADS)
Wijaksono, Umar; Abdullah, Ade Gafar; Hakim, Dadang Lukman
2016-02-01
The Virtual SCADA system is a software-based Human-Machine Interface that can visualize the process of a plant. This paper described the results of the virtual SCADA system design that aims to recognize the principle of the Nuclear Power Plant type Pressurized Water Reactor. This simulation uses technical data of the Nuclear Power Plant Unit Olkiluoto 3 in Finland. This device was developed using Wonderware Intouch, which is equipped with manual books for each component, animation links, alarm systems, real time and historical trending, and security system. The results showed that in general this device can demonstrate clearly the principles of energy flow and energy conversion processes in Pressurized Water Reactors. This virtual SCADA simulation system can be used as instructional media to recognize the principle of Pressurized Water Reactor.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Otuka, N., E-mail: n.otsuka@iaea.org; Dupont, E.; Semkova, V.
The International Network of Nuclear Reaction Data Centres (NRDC), coordinated by the IAEA Nuclear Data Section (NDS), successfully collaborates in the maintenance and development of the EXFOR library. As the scope of published data expands (e.g., to higher energies and heavier projectiles) to meet the needs of research and applications, it has become a challenging task to maintain both the completeness and accuracy of the EXFOR library. The evolution of the library is described, highlighting recent developments.
Human-like machines: Transparency and comprehensibility.
Patrzyk, Piotr M; Link, Daniela; Marewski, Julian N
2017-01-01
Artificial intelligence algorithms seek inspiration from human cognitive systems in areas where humans outperform machines. But on what level should algorithms try to approximate human cognition? We argue that human-like machines should be designed to make decisions in transparent and comprehensible ways, which can be achieved by accurately mirroring human cognitive processes.
ERIC Educational Resources Information Center
Bergner, Yoav; Droschler, Stefan; Kortemeyer, Gerd; Rayyan, Saif; Seaton, Daniel; Pritchard, David E.
2012-01-01
We apply collaborative filtering (CF) to dichotomously scored student response data (right, wrong, or no interaction), finding optimal parameters for each student and item based on cross-validated prediction accuracy. The approach is naturally suited to comparing different models, both unidimensional and multidimensional in ability, including a…
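A minimal stand-in for fitting per-student and per-item parameters to dichotomously scored data is a Rasch-style one-parameter model; the paper's collaborative-filtering models are richer, and the tiny response set below is invented:

```python
import math

def fit_1pl(responses, epochs=200, lr=0.1):
    """Fit per-student ability and per-item easiness for dichotomous data
    by gradient ascent on the Bernoulli log-likelihood (a Rasch-style
    sketch). `responses` is a list of (student, item, score) with score
    in {0, 1}; unanswered pairs are simply absent ('no interaction')."""
    theta = {s: 0.0 for s, _, _ in responses}  # student abilities
    beta = {i: 0.0 for _, i, _ in responses}   # item easiness
    for _ in range(epochs):
        for s, i, y in responses:
            p = 1 / (1 + math.exp(-(theta[s] + beta[i])))
            theta[s] += lr * (y - p)  # gradient step toward observed score
            beta[i] += lr * (y - p)
    return theta, beta

data = [("s1", "q1", 1), ("s1", "q2", 1), ("s2", "q1", 1), ("s2", "q2", 0)]
theta, beta = fit_1pl(data)
print(theta["s1"] > theta["s2"])  # → True: s1 answered more items correctly
```

Cross-validated prediction accuracy, as in the abstract, would then be measured by holding out some (student, item) pairs and scoring the fitted probabilities against the held-out responses.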
IEEE 1982. Proceedings of the international conference on cybernetics and society
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1982-01-01
The following topics were dealt with: knowledge-based systems; risk analysis; man-machine interactions; human information processing; metaphor, analogy and problem-solving; manual control modelling; transportation systems; simulation; adaptive and learning systems; biocybernetics; cybernetics; mathematical programming; robotics; decision support systems; analysis, design and validation of models; computer vision; systems science; energy systems; environmental modelling and policy; pattern recognition; nuclear warfare; technological forecasting; artificial intelligence; the Turin shroud; optimisation; workloads. Abstracts of individual papers can be found under the relevant classification codes in this or future issues.
State of the art in nuclear telerobotics: focus on the man/machine connection
NASA Astrophysics Data System (ADS)
Greaves, Amna E.
1995-12-01
The interface between the human controller and remotely operated device is a crux of telerobotic investigation today. This human-to-machine connection is the means by which we communicate our commands to the device, as well as the medium for decision-critical feedback to the operator. The amount of information transferred through the user interface is growing. This can be seen as a direct result of our need to support added complexities, as well as a rapidly expanding domain of applications. A user interface, or UI, is therefore subject to increasing demands to present information in a meaningful manner to the user. Virtual reality, and multi degree-of-freedom input devices lend us the ability to augment the man/machine interface, and handle burgeoning amounts of data in a more intuitive and anthropomorphically correct manner. Along with the aid of 3-D input and output devices, there are several visual tools that can be employed as part of a graphical UI that enhance and accelerate our comprehension of the data being presented. Thus an advanced UI that features these improvements would reduce the amount of fatigue on the teleoperator, increase his level of safety, facilitate learning, augment his control, and potentially reduce task time. This paper investigates the cutting edge concepts and enhancements that lead to the next generation of telerobotic interface systems.
Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science
NASA Astrophysics Data System (ADS)
Baru, C.
2014-12-01
Big data technologies are evolving rapidly, driven by the need to manage ever-increasing amounts of historical data; process relentless streams of human- and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem, and developing the right technologies requires an understanding of the application domain. An intriguing aspect of this phenomenon, though, is that the availability of the data itself enables new applications not previously conceived of. In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration among domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new technologies. The synergy can create vibrant collaborations potentially leading to new science insights as well as the development of new data technologies and systems. The area of interface between geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.
Distributed and collaborative synthetic environments
NASA Technical Reports Server (NTRS)
Bajaj, Chandrajit L.; Bernardini, Fausto
1995-01-01
Fast graphics workstations and increased computing power, together with improved interface technologies, have created new and diverse possibilities for developing and interacting with synthetic environments. A synthetic environment system is generally characterized by input/output devices that constitute the interface between the human senses and the synthetic environment generated by the computer; and a computation system running a real-time simulation of the environment. A basic need of a synthetic environment system is that of giving the user a plausible reproduction of the visual aspect of the objects with which he is interacting. The goal of our Shastra research project is to provide a substrate of geometric data structures and algorithms which allow the distributed construction and modification of the environment, efficient querying of objects attributes, collaborative interaction with the environment, fast computation of collision detection and visibility information for efficient dynamic simulation and real-time scene display. In particular, we address the following issues: (1) A geometric framework for modeling and visualizing synthetic environments and interacting with them. We highlight the functions required for the geometric engine of a synthetic environment system. (2) A distribution and collaboration substrate that supports construction, modification, and interaction with synthetic environments on networked desktop machines.
Media-Augmented Exercise Machines
NASA Astrophysics Data System (ADS)
Krueger, T.
2002-01-01
Cardio-vascular exercise has been used to mitigate the muscle and cardiac atrophy associated with adaptation to micro-gravity environments. Several hours per day may be required. In confined spaces and long-duration missions this kind of exercise is inevitably repetitive and rapidly becomes uninteresting. At the same time, there are pressures to accomplish as much as possible given the cost per hour for humans occupying orbiting or interplanetary spacecraft. Media augmentation provides a means to overlap activities in time by supplementing the exercise with social, recreational, training, or collaborative activities, thereby reducing time pressures. In addition, the machine functions as an interface to a wide range of digital environments, allowing for spatial variety in an otherwise confined environment. We hypothesize that the adoption of media-augmented exercise machines will have a positive effect on psycho-social well-being on long-duration missions. By organizing and supplementing exercise machines, data acquisition hardware, computers, and displays into an interacting system, this proposal increases functionality with limited additional mass. This paper reviews preliminary work on a project to augment exercise equipment in a manner that addresses these issues and at the same time opens possibilities for additional benefits. A testbed augmented exercise machine uses a specially built cycle trainer as both input to a virtual environment and as an output device from it, using spatialized sound, visual displays, vibration transducers, and variable resistance. The resulting interactivity increases a sense of engagement in the exercise and provides a rich experience of the digital environments. Activities in the virtual environment and accompanying physiological and psychological indicators may be correlated to track and evaluate the health of the crew.
NASA Technical Reports Server (NTRS)
Prater, T.; Tilson, W.; Jones, Z.
2015-01-01
The absence of an economy of scale in spaceflight hardware makes additive manufacturing an immensely attractive option for propulsion components. As additive manufacturing techniques are increasingly adopted by government and industry to produce propulsion hardware in human-rated systems, significant development efforts are needed to establish these methods as reliable alternatives to conventional subtractive manufacturing. One of the critical challenges facing powder bed fusion techniques in this application is variability between machines used to perform builds. Even with implementation of robust process controls, it is possible for two machines operating at identical parameters with equivalent base materials to produce specimens with slightly different material properties. The machine variability study presented here evaluates 60 specimens of identical geometry built using the same parameters. 30 samples were produced on machine 1 (M1) and the other 30 samples were built on machine 2 (M2). Each of the 30-sample sets was further subdivided into three subsets (with 10 specimens in each subset) to assess the effect of progressive heat treatment on machine variability. The three categories for post-processing were: stress relief; stress relief followed by hot isostatic press (HIP); and stress relief followed by HIP followed by heat treatment per AMS 5664. Each specimen (a round, smooth tensile) was mechanically tested per ASTM E8. Two formal statistical techniques, hypothesis testing for equivalency of means and one-way analysis of variance (ANOVA), were applied to characterize the impact of machine variability and heat treatment on five material properties: tensile stress, yield stress, modulus of elasticity, fracture elongation, and reduction of area. This work represents the type of development effort that is critical as NASA, academia, and the industrial base work collaboratively to establish a path to certification for additively manufactured parts.
For future flight programs, NASA and its commercial partners will procure parts from vendors who will use a diverse range of machines to produce parts and, as such, it is essential that the AM community develop a sound understanding of the degree to which machine variability impacts material properties.
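The study's two statistical techniques follow directly from their textbook definitions. A minimal sketch using hypothetical yield-stress values (not the study's data; sample sizes shortened for illustration):

```python
import statistics as st

def t_statistic(a, b):
    """Two-sample t statistic (pooled variance), as used in hypothesis
    testing for equivalency of means between the two machines."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * st.variance(a) + (nb - 1) * st.variance(b)) / (na + nb - 2)
    return (st.mean(a) - st.mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

def anova_f(*groups):
    """One-way ANOVA F statistic: between-group mean square over
    within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (st.mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - st.mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical yield-stress values (MPa) per machine; the real study
# used 30 specimens per machine across three heat-treatment subsets.
m1 = [1098.0, 1102.0, 1095.0, 1110.0, 1101.0, 1099.0, 1104.0, 1097.0, 1103.0, 1100.0]
m2 = [1105.0, 1108.0, 1101.0, 1112.0, 1106.0, 1103.0, 1109.0, 1104.0, 1107.0, 1105.0]
print(abs(t_statistic(m1, m2)))         # compare against the t critical value
print(anova_f(m1[:5], m1[5:], m2[:5]))  # three hypothetical heat-treatment subsets
```

The computed statistics would then be compared against critical values at the chosen significance level to decide whether machine and heat-treatment effects are distinguishable from specimen-to-specimen scatter.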
Collaborative Supervised Learning for Sensor Networks
NASA Technical Reports Server (NTRS)
Wagstaff, Kiri L.; Rebbapragada, Umaa; Lane, Terran
2011-01-01
Collaboration methods for distributed machine-learning algorithms involve the specification of communication protocols for the learners, which can query other learners and/or broadcast their findings preemptively. Each learner incorporates information from its neighbors into its own training set, and they are thereby able to bootstrap each other to higher performance. Each learner resides at a different node in the sensor network and makes observations (collects data) independently of the other learners. After being seeded with an initial labeled training set, each learner proceeds to learn in an iterative fashion. New data is collected and classified. The learner can then either broadcast its most confident classifications for use by other learners, or can query neighbors for their classifications of its least confident items. As such, collaborative learning combines elements of both passive (broadcast) and active (query) learning. It also uses ideas from ensemble learning to combine the multiple responses to a given query into a single useful label. This approach has been evaluated against current non-collaborative alternatives, including training a single classifier and deploying it at all nodes with no further learning possible, and permitting learners to learn from their own most confident judgments, absent interaction with their neighbors. On several data sets, it has been consistently found that active collaboration is the best strategy for a distributed learner network. The main advantages include the ability for learning to take place autonomously by collaboration rather than by requiring intervention from an oracle (usually human), and also the ability to learn in a distributed environment, permitting decisions to be made in situ and to yield faster response time.
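The broadcast/query protocol described above can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: the `Learner` class, 1-NN classifier over a scalar feature, ad-hoc confidence score, and threshold are all assumptions.

```python
from collections import Counter

class Learner:
    """One sensor-network node with its own local training set."""
    def __init__(self, labeled):
        self.training = list(labeled)  # (feature, label) pairs

    def classify(self, x):
        """1-NN on a scalar feature; returns (label, confidence),
        where confidence decays with distance to the nearest example."""
        nearest = min(self.training, key=lambda p: abs(p[0] - x))
        return nearest[1], 1.0 / (1.0 + abs(nearest[0] - x))

    def absorb(self, x, label):
        self.training.append((x, label))

def collaborative_round(learners, new_points, conf_threshold=0.5):
    """One round: each node classifies its new observation, then either
    broadcasts a confident label (passive) or queries neighbors (active)."""
    for node, x in zip(learners, new_points):
        label, conf = node.classify(x)
        if conf >= conf_threshold:
            # Broadcast step: share the confident label with all neighbors.
            for other in learners:
                if other is not node:
                    other.absorb(x, label)
        else:
            # Query step: poll neighbors and absorb the majority vote,
            # an ensemble-style combination of the responses.
            votes = Counter(o.classify(x)[0] for o in learners if o is not node)
            node.absorb(x, votes.most_common(1)[0][0])
```

Each round grows the nodes' training sets without an external oracle, which is the bootstrapping effect the abstract describes.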
Wu, Dongrui; Lance, Brent J; Parsons, Thomas D
2013-01-01
Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.
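The mean squared difference user-similarity heuristic is simple to state; a minimal sketch follows. The inverse-distance weighting and normalization are illustrative choices, not the paper's exact scheme.

```python
def ms_difference(user_a, user_b):
    """Mean squared difference between two users' feature vectors
    (lower = more similar)."""
    return sum((a - b) ** 2 for a, b in zip(user_a, user_b)) / len(user_a)

def auxiliary_weights(target_profile, other_profiles):
    """Weight each auxiliary subject's training data inversely to their
    distance from the target user, then normalize to sum to 1.
    other_profiles: dict of subject name -> feature vector."""
    weights = {name: 1.0 / (1.0 + ms_difference(target_profile, prof))
               for name, prof in other_profiles.items()}
    total = sum(weights.values())
    return {name: w / total for name, w in weights.items()}
```

In a transfer-learning setup, these weights would scale the contribution of each auxiliary subject's samples when training the target user's classifier, so that similar subjects dominate.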
Can Machine Scoring Deal with Broad and Open Writing Tests as Well as Human Readers?
ERIC Educational Resources Information Center
McCurry, Doug
2010-01-01
This article considers the claim that machine scoring of writing test responses agrees with human readers as much as humans agree with other humans. These claims about the reliability of machine scoring of writing are usually based on specific and constrained writing tasks, and there is reason for asking whether machine scoring of writing requires…
Integrative Curriculum Development in Nuclear Education and Research Vertical Enhancement Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Egarievwe, Stephen U.; Jow, Julius O.; Edwards, Matthew E.
Using a vertical education enhancement model, a Nuclear Education and Research Vertical Enhancement (NERVE) program was developed. The NERVE program aims to develop nuclear engineering education and research to 1) enhance skilled workforce development in disciplines relevant to nuclear power, national security and medical physics, and 2) increase the number of students and faculty from underrepresented groups (women and minorities) in fields related to the nuclear industry. The program uses multi-track training activities that vertically cut across several education domains: undergraduate degree programs, graduate schools, and post-doctoral training. In this paper, we present the results of an integrative curriculum development in the NERVE program. The curriculum development began with nuclear content infusion into existing science, engineering and technology courses. The second step involved the development of nuclear engineering courses: 1) Introduction to Nuclear Engineering, 2) Nuclear Engineering I, and 3) Nuclear Engineering II. The third step is the establishment of nuclear engineering concentrations in two engineering degree programs: 1) electrical engineering, and 2) mechanical engineering. A major outcome of the NERVE program is a collaborative infrastructure that uses laboratory work, internships at nuclear facilities, on-campus research, and mentoring in collaboration with industry and government partners to provide hands-on training for students. The major activities of the research and education collaborations include: - One-week spring training workshop at Brookhaven National Laboratory: The one-week training and workshop is used to enhance research collaborations, to train faculty and students on user facilities/equipment at Brookhaven National Laboratory, and to prepare for summer research internships. Participants included students, faculty members at Alabama A and M University, and research collaborators at BNL. 
The activities include 1) a tour of and introduction to user facilities/equipment at BNL that are used for research in room-temperature semiconductor nuclear detectors, 2) presentations on advances on this project and on wide band-gap semiconductor nuclear detectors in general, and 3) graduate students' research presentations. - Invited speakers and lectures: This brings collaborating research scientists from BNL to give talks and lectures on topics directly related to the project. Attendance includes faculty members, researchers and students throughout the university. - Faculty-student team summer research at BNL: This DOE and National Science Foundation (NSF) program helps train students and faculty members in research. Faculty members go on to establish research collaborations with scientists at BNL, develop and submit research proposals to funding agencies, transform their research experience at BNL into established and enhanced research capabilities at their home institution, and integrate their research into teaching through class projects and hands-on training for students. The students go on to participate in research work at BNL and at their home institution, co-author research papers for conferences and technical journals, and transform their experiences into senior and capstone projects. - Grant proposal development: Faculty members in the NERVE program collaborate with BNL scientists to develop proposals, which often help to secure the external funding needed to expand and sustain research activities and to support students' wages and scholarships (stipends, tuition and fees). - Faculty development and mentoring: The above collaboration activities support faculty professional development. The experiences, grants, joint publications in technical journals, and supervision of students' research, including thesis and dissertation research projects, contribute greatly to faculty development. 
Senior scientists at BNL and senior faculty members on campus jointly mentor junior faculty members to enhance their professional growth. - Graduate thesis and dissertation research: Brookhaven National Laboratory provides unique opportunities and outstanding research resources for the NERVE program graduate research. Scientists from BNL serve on master's thesis and PhD dissertation committees, where they play active roles in the supervision of the research. (authors)
Evaluation of an Integrated Multi-Task Machine Learning System with Humans in the Loop
2007-01-01
machine learning components, natural language processing, and optimization...was examined with a test explicitly developed to measure the impact of integrated machine learning when used by a human user in a real-world setting...study revealed that integrated machine learning does produce a positive impact on overall performance. This paper also discusses how specific machine learning components contributed to human-system
2017-02-01
DARPA ROBOTICS CHALLENGE (DRC) USING HUMAN-MACHINE TEAMWORK TO PERFORM DISASTER RESPONSE WITH A HUMANOID ROBOT FLORIDA INSTITUTE FOR HUMAN AND...Human and Machine Cognition (IHMC) from 2012-2016 through three phases of the Defense Advanced Research Projects Agency (DARPA) Robotics Challenge
Loss of the integral nuclear envelope protein SUN1 induces alteration of nucleoli
Matsumoto, Ayaka; Sakamoto, Chiyomi; Matsumori, Haruka; Katahira, Jun; Yasuda, Yoko; Yoshidome, Katsuhide; Tsujimoto, Masahiko; Goldberg, Ilya G; Matsuura, Nariaki; Nakao, Mitsuyoshi; Saitoh, Noriko; Hieda, Miki
2016-01-01
A supervised machine learning algorithm, which is qualified for image classification and analyzing similarities, is based on multiple discriminative morphological features that are automatically assembled during the learning processes. The algorithm is suitable for population-based analysis of images of biological materials that are generally complex and heterogeneous. Here we used the algorithm wndchrm to quantify the effects on nucleolar morphology of the loss of the components of nuclear envelope in a human mammary epithelial cell line. The linker of nucleoskeleton and cytoskeleton (LINC) complex, an assembly of nuclear envelope proteins comprising mainly members of the SUN and nesprin families, connects the nuclear lamina and cytoskeletal filaments. The components of the LINC complex are markedly deficient in breast cancer tissues. We found that a reduction in the levels of SUN1, SUN2, and lamin A/C led to significant changes in morphologies that were computationally classified using wndchrm with approximately 100% accuracy. In particular, depletion of SUN1 caused nucleolar hypertrophy and reduced rRNA synthesis. Further, wndchrm revealed a consistent negative correlation between SUN1 expression and the size of nucleoli in human breast cancer tissues. Our unbiased morphological quantitation strategies using wndchrm revealed an unexpected link between the components of the LINC complex and the morphologies of nucleoli that serves as an indicator of the malignant phenotype of breast cancer cells. PMID:26962703
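wndchrm itself computes a large bank of numeric image features and classifies with a weighted nearest-neighbor rule; as a heavily simplified stand-in, a nearest-centroid rule over hypothetical morphology features (e.g. nucleolar area and count) conveys the idea. Class names and feature vectors below are illustrative only.

```python
def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def classify(sample, class_vectors):
    """Assign sample to the class whose feature centroid is nearest
    (squared Euclidean distance).
    class_vectors: dict of class label -> list of feature vectors."""
    centroids = {label: centroid(v) for label, v in class_vectors.items()}
    def dist2(label):
        return sum((s - c) ** 2 for s, c in zip(sample, centroids[label]))
    return min(centroids, key=dist2)
```

Population-level morphological differences of the kind the paper reports show up as well-separated class centroids in such a feature space.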
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saleh, Z; Tang, X; Song, Y
Purpose: To investigate the long-term stability and viability of EPID-based daily output QA, via in-house and vendor-driven protocols, to replace conventional QA tools and improve QA efficiency. Methods: Two Varian TrueBeam machines (TB1 and TB2) equipped with electronic portal imaging devices (EPID) were employed in this study. Both machines were calibrated per TG-51 and used clinically since Oct 2014. Daily output measurements for 6/15 MV beams were obtained using the SunNuclear DailyQA3 device as part of morning QA. In addition, an in-house protocol was implemented for EPID output measurement (10×10 cm fields, 100 MU, 100 cm SID, output defined over an ROI of 2×2 cm around the central axis). Moreover, the Varian Machine Performance Check (MPC) was used on both machines to measure machine output. The EPID- and DailyQA3-based measurements of the relative machine output were compared and cross-correlated with monthly machine output as measured by an A12 Exradin 0.65 cc ion chamber (IC) serving as ground truth. The results were correlated using the Pearson test. Results: The correlations of the DailyQA3, in-house EPID, and Varian MPC output measurements with the IC for 6/15 MV were similar for TB1 (0.83-0.95) and TB2 (0.55-0.67). The machine output for the 6/15 MV beams on both machines showed a similar trend, namely an increase over time as indicated by all measurements, requiring a machine recalibration after 6 months. This drift is due to a known issue with the pressurized monitor chamber, which tends to leak over time. MPC failed occasionally but passed when repeated. Conclusion: The results indicate that the use of EPID for daily output measurements has the potential to become a viable and efficient tool for routine daily LINAC QA, eliminating weather (temperature, pressure) and human setup variability and increasing the efficiency of the QA process.
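The cross-correlation step reduces to the Pearson coefficient between paired daily readings. A minimal sketch with illustrative numbers (not the study's measurements):

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired relative-output readings (EPID vs ion chamber).
epid = [1.000, 1.002, 1.004, 1.007, 1.009]
ic   = [0.999, 1.001, 1.005, 1.006, 1.010]
r = pearson_r(epid, ic)
```

A coefficient near 1, as reported for TB1, indicates the EPID readings track the ion-chamber ground truth closely enough to stand in for it.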
Crowd Sourcing to Improve Urban Stormwater Management
NASA Astrophysics Data System (ADS)
Minsker, B. S.; Band, L. E.; Heidari Haratmeh, B.; Law, N. L.; Leonard, L. N.; Rai, A.
2017-12-01
Over half of the world's population currently lives in urban areas, a number predicted to grow to 60 percent by 2030. Urban areas face unprecedented and growing challenges that threaten society's long-term wellbeing, including poverty; chronic health problems; widespread pollution and resource degradation; and increased natural disasters. These are "wicked" problems involving "systems of systems" that require unprecedented information sharing and collaboration across disciplines and organizational boundaries. Cities are recognizing that the increasing stream of data and information ("Big Data"), informatics, and modeling can support rapid advances on these challenges. Nonetheless, information technology solutions can only be effective in addressing these challenges through deeply human and systems perspectives. A stakeholder-driven approach ("crowd sourcing") is needed to develop urban systems that address multiple needs, such as parks that capture and treat stormwater while improving human and ecosystem health and wellbeing. We have developed informatics- and Cloud-based collaborative methods that enable crowd sourcing of green stormwater infrastructure (GSI: rain gardens, bioswales, trees, etc.) design and management. The methods use machine learning, social media data, and interactive design tools (called IDEAS-GI) to identify locations and features of GSI that perform best on a suite of objectives, including life cycle cost, stormwater volume reduction, and air pollution reduction. Insights will be presented on GI features that best meet stakeholder needs and are therefore most likely to improve human wellbeing and be well maintained.
New insights into the biogenesis of nuclear RNA polymerases?
Cloutier, Philippe; Coulombe, Benoit
2010-04-01
More than 30 years of research on nuclear RNA polymerases (RNAP I, II, and III) has uncovered numerous factors that regulate the activity of these enzymes during the transcription reaction. However, very little is known about the machinery that regulates the fate of RNAPs before or after transcription. In particular, the mechanisms of biogenesis of the 3 nuclear RNAPs, which comprise both common and specific subunits, remain mostly uncharacterized and the proteins involved are yet to be discovered. Using protein affinity purification coupled to mass spectrometry (AP-MS), we recently unraveled a high-density interaction network formed by nuclear RNAP subunits from the soluble fraction of human cell extracts. Validation of the dataset using a machine learning approach trained to minimize the rate of false positives and false negatives yielded a high-confidence dataset and uncovered novel interactors that regulate the RNAP II transcription machinery, including a set of proteins we named the RNAP II-associated proteins (RPAPs). One of the RPAPs, RPAP3, is part of an 11-subunit complex we termed the RPAP3/R2TP/prefoldin-like complex. Here, we review the literature on the subunits of this complex, which points to a role in nuclear RNAP biogenesis.
The Nuclear Disarmament Movement: Politics, Potential, and Strategy
ERIC Educational Resources Information Center
Nebel, Jacob
2012-01-01
Nuclear disarmament is a global ambition and requires collaboration, but who is collaborating, and what are their roles? This paper discusses the role of the American people in the path towards zero. Scholars have discussed at length the historical lessons of the global disarmament movement, and activists have worked to rekindle the movement after…
Cutting the Cord: Discrimination and Command Responsibility in Autonomous Lethal Weapons
2014-02-13
machine responses to identical stimuli, and it was the job of a third party human “witness” to determine which participant was man and which was...machines may be error free, but there are potential benefits to be gained through autonomy if machines can meet or exceed human performance in...lieu of human operators and reap the benefits that autonomy provides. Human and Machine Error It would be foolish to assert that either humans
Using machine learning to emulate human hearing for predictive maintenance of equipment
NASA Astrophysics Data System (ADS)
Verma, Dinesh; Bent, Graham
2017-05-01
At the current time, interfaces between humans and machines use only a limited subset of senses that humans are capable of. The interaction among humans and computers can become much more intuitive and effective if we are able to use more senses, and create other modes of communicating between them. New machine learning technologies can make this type of interaction become a reality. In this paper, we present a framework for a holistic communication between humans and machines that uses all of the senses, and discuss how a subset of this capability can allow machines to talk to humans to indicate their health for various tasks such as predictive maintenance.
Structuring Cooperative Nuclear Risk Reduction Initiatives with China
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandt, Larry; Reinhardt, Jason Christian; Hecker, Siegfried
The Stanford Center for International Security and Cooperation engaged several Chinese nuclear organizations in cooperative research that focused on responses to radiological and nuclear terrorism. The objective was to identify joint research initiatives to reduce the global dangers of such threats and to pursue initial technical collaborations in several high priority areas. Initiatives were identified in three primary research areas: 1) detection and interdiction of smuggled nuclear materials; 2) nuclear forensics; and 3) radiological (“dirty bomb”) threats and countermeasures. Initial work emphasized the application of systems and risk analysis tools, which proved effective in structuring the collaborations. The extensive engagements between national security nuclear experts in China and the U.S. during the research strengthened professional relationships between these important communities.
Co-Located Collaborative Learning Video Game with Single Display Groupware
ERIC Educational Resources Information Center
Infante, Cristian; Weitz, Juan; Reyes, Tomas; Nussbaum, Miguel; Gomez, Florencia; Radovic, Darinka
2010-01-01
Role Game is a co-located CSCL video game played by three students sitting at one machine sharing a single screen, each with their own input device. Inspired by video console games, Role Game enables students to learn by doing, acquiring social abilities and mastering subject matter in a context of co-located collaboration. After describing the…
Fan, Jing; Kuai, Bin; Wu, Guifen; Wu, Xudong; Chi, Binkai; Wang, Lantian; Wang, Ke; Shi, Zhubing; Zhang, Heng; Chen, She; He, Zhisong; Wang, Siyuan; Zhou, Zhaocai; Li, Guohui; Cheng, Hong
2017-10-02
The exosome is a key RNA machine that functions in the degradation of unwanted RNAs. Here, we found that significant fractions of precursors and mature forms of mRNAs and long noncoding RNAs are degraded by the nuclear exosome in normal human cells. Exosome-mediated degradation of these RNAs requires its cofactor hMTR4. Significantly, hMTR4 plays a key role in specifically recruiting the exosome to its targets. Furthermore, we provide several lines of evidence indicating that hMTR4 executes this role by directly competing with the mRNA export adaptor ALYREF for associating with ARS2, a component of the cap-binding complex (CBC), and this competition is critical for determining whether an RNA is degraded or exported to the cytoplasm. Together, our results indicate that the competition between hMTR4 and ALYREF determines exosome recruitment and functions in creating balanced nuclear RNA pools for degradation and export.
Taniguchi, Hidetaka; Sato, Hiroshi; Shirakawa, Tomohiro
2018-05-09
Human learners can generalize a new concept from a small number of samples. In contrast, conventional machine learning methods require large amounts of data to address the same types of problems. Humans have cognitive biases that promote fast learning. Here, we developed a method to reduce the gap between human beings and machines in this type of inference by utilizing cognitive biases. We implemented a human cognitive model into machine learning algorithms and compared their performance with the currently most popular methods, naïve Bayes, support vector machine, neural networks, logistic regression and random forests. We focused on the task of spam classification, which has been studied for a long time in the field of machine learning and often requires a large amount of data to obtain high accuracy. Our models achieved superior performance with small and biased samples in comparison with other representative machine learning methods.
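For context, a plain naive Bayes spam classifier with Laplace smoothing, a representative of the baseline family the paper compares against, can be sketched as follows. The authors' contribution replaces the probability estimates below with a cognitively biased model, which is not reproduced here; the word sets and labels are illustrative.

```python
import math
from collections import Counter

def train(docs):
    """docs: list of (set_of_words, label) pairs.
    Returns per-class word counts and class counts."""
    counts = {'spam': Counter(), 'ham': Counter()}
    totals = Counter(label for _, label in docs)
    for words, label in docs:
        counts[label].update(words)
    return counts, totals

def classify(words, counts, totals, vocab_size):
    """Max a posteriori class under naive Bayes with add-one smoothing."""
    def log_score(label):
        prior = math.log(totals[label] / sum(totals.values()))
        denom = sum(counts[label].values()) + vocab_size
        return prior + sum(math.log((counts[label][w] + 1) / denom)
                           for w in words)
    return max(('spam', 'ham'), key=log_score)
```

With only a handful of training documents, the smoothed estimates are coarse, which is exactly the small-and-biased-sample regime where the paper reports its cognitive-bias models outperforming such baselines.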
Code of Federal Regulations, 2012 CFR
2012-01-01
... uranium or enriching uranium in the isotope 235, zirconium tubes, heavy water or deuterium, nuclear-grade..., irradiated fuel element chopping machines, and hot cells. Nuclear fuel cycle-related research and development...
Patzel-Mattern, Katja
2005-01-01
The 20th century is the century of technical artefacts. Through their existence and use they create an artificial reality within which humans must position themselves. Psychotechnik was an attempt to enable humans to find this position. It gained importance in Germany after World War I and had its heyday between 1919 and 1926. Drawing on the work of the engineer and Psychotechnik proponent Georg Schlesinger, whose particular interest was disabled soldiers, this essay investigates the understanding of the body and of the human being in Psychotechnik as an applied science. It turns out that the greatest achievement of Psychotechnik was to establish a new view of the relation between human being and machine, helping to show that the human-machine interface is a shapable unit. Psychotechnik sees the human body and its physique as the final authority for the design of machines. Its main concern is to optimize the relation between human being and machine rather than to standardize human beings according to the construction of machines. After its rapid rise during the Weimar Republic and its swift decline from the late 1920s onward, Psychotechnik now attracts scientific attention as a historical phenomenon. The current discourse centers on aspects concerning the philosophy of science: the unity of body and soul, the understanding of the human-machine interface as a shapable unit, and the human being as the final authority over this unit.
Scientific bases of human-machine communication by voice.
Schafer, R W
1995-01-01
The scientific bases for human-machine communication by voice are in the fields of psychology, linguistics, acoustics, signal processing, computer science, and integrated circuit technology. The purpose of this paper is to highlight the basic scientific and technological issues in human-machine communication by voice and to point out areas of future research opportunity. The discussion is organized around the following major issues in implementing human-machine voice communication systems: (i) hardware/software implementation of the system, (ii) speech synthesis for voice output, (iii) speech recognition and understanding for voice input, and (iv) usability factors related to how humans interact with machines. PMID:7479802
Forsythe, J Chris [Sandia Park, NM; Xavier, Patrick G [Albuquerque, NM; Abbott, Robert G [Albuquerque, NM; Brannon, Nathan G [Albuquerque, NM; Bernard, Michael L [Tijeras, NM; Speed, Ann E [Albuquerque, NM
2009-04-28
Digital technology utilizing a cognitive model based on human naturalistic decision-making processes, including pattern recognition and episodic memory, can reduce the dependency of human-machine interactions on the abilities of a human user and can enable a machine to more closely emulate human-like responses. Such a cognitive model can enable digital technology to use cognitive capacities fundamental to human-like communication and cooperation to interact with humans.
Implementation of a Web-Based Collaborative Process Planning System
NASA Astrophysics Data System (ADS)
Wang, Huifen; Liu, Tingting; Qiao, Li; Huang, Shuangxi
Under the networked manufacturing environment, all phases of product manufacturing, involving design, process planning, machining and assembling, may be accomplished collaboratively by different enterprises; even different manufacturing stages of the same part may be finished collaboratively by different enterprises. Based on the self-developed networked manufacturing platform eCWS (e-Cooperative Work System), a multi-agent-based system framework for collaborative process planning is proposed. In accordance with the requirements of collaborative process planning, shared resources provided by cooperative enterprises during collaboration are classified into seven classes. A reconfigurable and extendable resource object model is then built. Decision-making strategy is also studied in this paper. Finally, a collaborative process planning system, e-CAPP, is developed and applied. It provides strong support for distributed designers to collaboratively plan and optimize product processes through the network.
Gloved Human-Machine Interface
NASA Technical Reports Server (NTRS)
Adams, Richard (Inventor); Hannaford, Blake (Inventor); Olowin, Aaron (Inventor)
2015-01-01
Certain exemplary embodiments can provide a system, machine, device, manufacture, circuit, composition of matter, and/or user interface adapted for and/or resulting from, and/or a method and/or machine-readable medium comprising machine-implementable instructions for, activities that can comprise and/or relate to: tracking movement of a gloved hand of a human; interpreting a gloved finger movement of the human; and/or in response to interpreting the gloved finger movement, providing feedback to the human.
Knowledge-based load leveling and task allocation in human-machine systems
NASA Technical Reports Server (NTRS)
Chignell, M. H.; Hancock, P. A.
1986-01-01
Conventional human-machine systems use task allocation policies which are based on the premise of a flexible human operator. This individual is most often required to compensate for and augment the capabilities of the machine. The development of artificial intelligence and improved technologies have allowed for a wider range of task allocation strategies. In response to these issues a Knowledge Based Adaptive Mechanism (KBAM) is proposed for assigning tasks to human and machine in real time, using a load leveling policy. This mechanism employs an online workload assessment and compensation system which is responsive to variations in load through an intelligent interface. This interface consists of a loading strategy reasoner which has access to information about the current status of the human-machine system as well as a database of admissible human/machine loading strategies. Difficulties standing in the way of successful implementation of the load leveling strategy are examined.
The US DOE-EM International Program - 13004
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elmetti, Rosa R.; Han, Ana M.; Williams, Alice C.
2013-07-01
The U.S. Department of Energy (DOE) Office of Environmental Management (EM) conducts international collaboration activities in support of U.S. policies and objectives regarding the accelerated risk reduction and remediation of the environmental legacy of the nation's nuclear weapons program and government-sponsored nuclear energy research. The EM International Program, supported out of the EM Office of the Associate Principal Deputy Assistant Secretary, pursues collaborations with foreign government organizations, educational institutions, and private industry to assist in identifying technologies and to promote international collaborations that leverage resources and link international experience and expertise. In fiscal year (FY) 2012, the International Program awarded eight international collaborative projects, for work scope spanning waste processing, groundwater and soil remediation, deactivation and decommissioning (D and D), and nuclear materials disposition initiatives, to seven foreign organizations. Additionally, the International Program's scope and collaboration opportunities were expanded to include technical as well as non-technical areas. This paper presents an overview of the on-going tasks awarded in FY 2012 and an update on upcoming international activities and opportunities for expansion into FY 2013 and beyond. (authors)
The US DOE EM international program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elmetti, Rosa R.; Han, Ana M.; Roach, Jay A.
2013-07-01
The U.S. Department of Energy (DOE) Office of Environmental Management (EM) conducts international collaboration activities in support of U.S. policies and objectives regarding the accelerated risk reduction and remediation of the environmental legacy of the nation's nuclear weapons program and government-sponsored nuclear energy research. The EM International Program, supported out of the EM Office of the Associate Principal Deputy Assistant Secretary, pursues collaborations with foreign government organizations, educational institutions, and private industry to assist in identifying technologies and to promote international collaborations that leverage resources and link international experience and expertise. In fiscal year (FY) 2012, the International Program awarded eight international collaborative projects, for work scope spanning waste processing, groundwater and soil remediation, deactivation and decommissioning (D and D), and nuclear materials disposition initiatives, to seven foreign organizations. Additionally, the International Program's scope and collaboration opportunities were expanded to include technical as well as non-technical areas. This paper presents an overview of the on-going tasks awarded in FY 2012 and an update on upcoming international activities and opportunities for expansion into the remainder of FY 2013 and beyond. (authors)
Design of virtual SCADA simulation system for pressurized water reactor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wijaksono, Umar, E-mail: umar.wijaksono@student.upi.edu; Abdullah, Ade Gafar; Hakim, Dadang Lukman
The virtual SCADA system is a software-based human-machine interface that can visualize the processes of a plant. This paper describes the results of a virtual SCADA system design that aims to demonstrate the operating principle of a pressurized-water-reactor nuclear power plant. The simulation uses technical data from Unit 3 of the Olkiluoto Nuclear Power Plant in Finland. The system was developed using Wonderware InTouch and is equipped with manual books for each component, animation links, alarm systems, real-time and historical trending, and a security system. The results showed that, in general, the system can clearly demonstrate the principles of energy flow and the energy conversion processes in pressurized water reactors. This virtual SCADA simulation can be used as an instructional medium for learning the operating principle of a pressurized water reactor.
Developing disaster management modules: a collaborative approach.
Douglas, Valerie
Disasters, whether natural or human-induced, can strike when least expected. The events of 9/11 in the US, the 7/7 bombings in the UK, and the anthrax incident in the US on 10th October 2001 indicate that there is a need for a nursing workforce able to respond effectively to mass casualty events and incidents involving chemical, biological, radiological, and nuclear substances. Multi-agency collaboration is one of the fundamental principles of disaster preparedness and response. It was therefore necessary to take a similarly multi-agency, collaborative approach to developing modules on the management of mass casualty events and incidents involving hazardous substances. The modules are offered to registered nurses and registered paramedics. They can be taken independently or as part of a BSc in nursing or a health pathway, on a part-time basis. Since the commencement of the modules in September 2004, registered paramedics and registered nurses who work in a wide range of specialties have accessed them.
The High-Strain Rate Loading of Structural Biological Materials
NASA Astrophysics Data System (ADS)
Proud, W. G.; Nguyen, T.-T. N.; Bo, C.; Butler, B. J.; Boddy, R. L.; Williams, A.; Masouros, S.; Brown, K. A.
2015-10-01
The human body can be subjected to violent acceleration as a result of explosions caused by military ordnance or accident. Blast waves cause injury, and blunt trauma can be produced by the violent impact of objects against the human body. The long-term clinical manifestations of blast injury can be significantly different in nature and extent from those of less aggressive insults. Similarly, the damage seen in the lower limbs of those injured in explosion incidents is in general more severe than that seen after falls from height. These phenomena increase the need for knowledge of the short- and long-term effects of transient mechanical loading on the biological structures of the human body. This paper gives an overview of some of the results of a collaborative investigation into blast injury. The requirement for time-resolved data, appropriate mechanical modeling, materials characterization, and biological effects is presented. The use of a range of loading platforms, universal testing machines, drop weights, Hopkinson bars, and bespoke traumatic injury simulators is described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
St. Germain, Shawn W.; Farris, Ronald K.; Whaley, April M.
This research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which is a research and development (R&D) program sponsored by the Department of Energy (DOE) and performed in close collaboration with industry R&D programs that provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. The LWRS Program serves to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. The purpose of this research is to improve management of nuclear power plant (NPP) outages through the development of an advanced outage control center (AOCC) that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. This technical report for industry implementation outlines methods and considerations for the establishment of an AOCC. It provides a process for implementation of a change management plan, evaluation of current outage processes, the selection of technology, and guidance for the implementation of the selected technology. Methods are presented both for adoption of technologies within an existing OCC and for a complete OCC replacement, including human factors considerations for OCC design and setup.
ERIC Educational Resources Information Center
Kirrane, Diane E.
1990-01-01
As scientists seek to develop machines that can "learn," that is, solve problems by imitating the human brain, a gold mine of information on the processes of human learning is being discovered, expert systems are being improved, and human-machine interactions are being enhanced. (SK)
Deng, Li; Wang, Guohua; Yu, Suihuai
2016-01-01
In order to account for the psychological cognitive characteristics affecting operating comfort and to realize automatic layout design, cognitive ergonomics and GA-ACA (genetic algorithm and ant colony algorithm) were introduced into the layout design of human-machine interaction interfaces. First, from the perspective of cognitive psychology and following the information processing process, a cognitive model of the human-machine interaction interface was established. Then, human cognitive characteristics were analyzed, and the layout principles of human-machine interaction interfaces were summarized as the constraints in layout design. Next, the expression forms of the fitness function, pheromone, and heuristic information for the layout optimization of the cabin were studied, and a layout design model of the human-machine interaction interface was established based on GA-ACA. Finally, a layout design system was developed based on this model. For validation, the human-machine interaction interface layout design of a drilling rig control room was taken as an example, and the optimization results showed the feasibility and effectiveness of the proposed method.
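As a toy illustration of the evolutionary half of the approach described above (the paper couples a genetic algorithm with ant colony optimization and cognitive-ergonomics constraints; this sketch uses only swap mutation and elitist selection on an invented fitness), frequently used controls can be driven toward easy-to-reach slots:

```python
# Sketch: evolutionary layout optimization on a made-up fitness function.
# Hypothetical usage frequencies stand in for the paper's cognitive
# constraints; frequent controls should land in low-index (easy) slots.
import random

random.seed(0)
N = 6                       # six controls, six panel slots
freq = [9, 7, 5, 3, 2, 1]   # hypothetical usage frequency per control

def cost(perm):
    # perm[slot] = index of the control placed in that slot; the cost is
    # the frequency-weighted slot distance, so lower is better.
    return sum(slot * freq[ctrl] for slot, ctrl in enumerate(perm))

def mutate(perm):
    # Swap two randomly chosen slots to produce a child layout.
    a, b = random.sample(range(N), 2)
    child = perm[:]
    child[a], child[b] = child[b], child[a]
    return child

# Elitist loop: keep the 10 best layouts, refill the population by mutation.
pop = [random.sample(range(N), N) for _ in range(20)]
for _ in range(200):
    pop.sort(key=cost)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]

best = min(pop, key=cost)
print(best, cost(best))
```

A full GA-ACA implementation would add crossover, pheromone-guided construction, and the layout principles derived from the cognitive model as hard constraints.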
Future of Mechatronics and Human
NASA Astrophysics Data System (ADS)
Harashima, Fumio; Suzuki, Satoshi
This paper describes the circumstances of the mechatronics that sustain our human society and introduces the HAM (Human Adaptive Mechatronics) project as one of the research projects aiming to create new human-machine systems. The key point of HAM is skill; the analysis of skill and the establishment of assist methods to enhance the total performance of human-machine systems are the main research concerns. Because the study of skill is an elucidation of the human itself, analyses of higher human functions are significant. In this paper, after surveying research on human brain functions, an experimental analysis of human characteristics in machine operation is shown as one example of our research activities. We used a hovercraft simulator as a verification system involving the observation, voluntary motion control, and machine operation needed in general machine operation. The process and factors of becoming skilled were investigated by identification of human control characteristics together with measurement of the operator's line of sight. It was confirmed that early switching of sub-controllers and reference signals in the human, and enhancement of space perception, are significant.
Upgrading the fuel-handling machine of the Novovoronezh nuclear power plant unit no. 5
NASA Astrophysics Data System (ADS)
Terekhov, D. V.; Dunaev, V. I.
2014-02-01
Safety parameters were calculated in the process of upgrading the fuel-handling machine (FHM) of the Novovoronezh nuclear power plant (NPP) unit no. 5, based on the results of a quantitative safety analysis of nuclear fuel transfer operations using a dynamic logical-and-probabilistic model of the processing procedure. Specific engineering and design concepts were developed that made it possible to reduce the probability of damaging the fuel assemblies (FAs) during various technological operations by an order of magnitude and to introduce more flexible algorithms into the modernized FHM control system. The results of pilot operation during two refueling campaigns show that the total reactor shutdown time is lowered.
Cleaning of uranium vs machine coolant formulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cristy, S.S.; Byrd, V.R.; Simandl, R.F.
1984-10-01
This study compares methods for cleaning uranium chips and the residues left on chips by alternate machine coolants based on propylene glycol-water mixtures with either borax, ammonium tetraborate, or triethanolamine tetraborate added as a nuclear poison. Residues left on uranium surfaces machined with a perchloroethylene-mineral oil coolant and on surfaces machined with the borax-containing alternate coolant were also compared. Greater chlorine contamination was found on the perchloroethylene-mineral oil machined surfaces, but slightly greater oxidation was found on the surfaces machined with the alternate borax-containing coolant. Overall, the differences were small, and a change to the alternate coolant does not appear to constitute a significant threat to the integrity of machined uranium parts.
Big Data, Global Development, and Complex Social Systems
NASA Astrophysics Data System (ADS)
Eagle, Nathan
2010-03-01
Petabytes of data about human movements, transactions, and communication patterns are continuously being generated by everyday technologies such as mobile phones and credit cards. This unprecedented volume of information facilitates a novel set of research questions applicable to a wide range of development issues. In collaboration with the mobile phone, internet, and credit card industries, my colleagues and I are aggregating and analyzing behavioral data from over 250 million people from North and South America, Europe, Asia, and Africa. I will discuss a selection of projects arising from these collaborations that involve inferring behavioral dynamics on a broad spectrum of scales, from risky behavior in a group of MIT freshmen to population-level behavioral signatures, including cholera outbreaks in Rwanda and wealth in the UK. Access to the movement patterns of the majority of mobile phones in East Africa also facilitates realistic models of disease transmission as well as slum formation. This vast volume of data requires new analytical tools; we are developing a range of large-scale network analysis and machine learning algorithms that we hope will provide deeper insight into human behavior. Ultimately, however, our goal is to determine how we can use these insights to actively improve the lives of the billions of people who generate this data and the societies in which they live.
The politics of psycholinguistics.
Cohen-Cole, Jamie
2015-01-01
This article narrates the history of the interdisciplinary field of psycholinguistics from its modern organization in the 1950s to its application and influence in the field of reading instruction. Beginning as a combination of structural linguistics, behaviorist psychology, and information theory, the field was revolutionized by the collaboration of the psychologist George Miller and the linguist Noam Chomsky. This transformation was, at root, the adoption of the view that humans should be best understood as creative users of language and the rejection of behaviorist or machine models. Under their influence the field came to treat humans as creative, nonmechanical learners and users of language who, like scientists, hypothesize in order to understand and even perceive the world. This vision of language as a nondeterministic process shaped the field of reading instruction by providing the central model to advocates of the whole-language pedagogical method. © 2014 Wiley Periodicals, Inc.
Tambone, V; Pennacchini, M
2010-01-01
The term technoscience (T) indicates the complex interactions between contemporary science and technology, which have become practically inseparable. From an epistemological point of view, T considers only quantitative knowledge, in a reductionist way. Nature has been reduced to a machine that works according to laws learned through experimental science. At present, technical efficiency represents an operational dominion over Nature; it gives power to those who possess it. Scientists, considered as visionaries, are assigned to lead society. They create new cosmos-visions that are technocentric; thus T uses the human being as a subject of experimentation and transforms some essential dimensions of the human being. All this suggests the necessity of an ethical evaluation of the integration of different subjects' actions in what we call integrated action. This configuration involves ethical obligations for the agent: he or she has to act in a way that preserves and allows collaboration while respecting each professional's individual responsibility.
Dippel, Anne
2017-12-01
This article looks at how games and play contribute to the big-data-driven production of knowledge in High-Energy Physics, with a particular focus on the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN), where the author has been conducting anthropological fieldwork since 2014. The ludic (playful) aspect of knowledge production is analyzed here in three different dimensions: the Symbolic, the Ontological, and the Epistemic. The first one points towards CERN as a place where a cosmological game of probability is played with the help of Monte Carlo simulations. The second one can be seen in the agonistic infrastructures of competing experimental collaborations. The third dimension unfolds in ludic platforms, such as online Challenges and citizen science games, which contribute to the development of machine learning algorithms, whose function is necessary in order to process the huge amount of data gathered from experimental events. Following Clifford Geertz, CERN itself is characterized as a site of deep play, a concept that contributes to understanding wider social and cultural orders through the analysis of ludic collective phenomena. The article also engages with Peter Galison's idea of the trading zone, proposing to comprehend it in the age of big data as a Playground. Thus the author hopes to contribute to a wider discussion in the historiographical and social study of science and technology, as well as in cultural anthropology, by recognizing the ludic in science as a central element of understanding collaborative knowledge production.
Human Machine Learning Symbiosis
ERIC Educational Resources Information Center
Walsh, Kenneth R.; Hoque, Md Tamjidul; Williams, Kim H.
2017-01-01
Human Machine Learning Symbiosis is a cooperative system where both the human learner and the machine learner learn from each other to create an effective and efficient learning environment adapted to the needs of the human learner. Such a system can be used in online learning modules so that the modules adapt to each learner's learning state both…
Proceedings of the NASA Conference on Space Telerobotics, volume 2
NASA Technical Reports Server (NTRS)
Rodriguez, Guillermo (Editor); Seraji, Homayoun (Editor)
1989-01-01
These proceedings contain papers presented at the NASA Conference on Space Telerobotics held in Pasadena, January 31 to February 2, 1989. The theme of the Conference was man-machine collaboration in space. The Conference provided a forum for researchers and engineers to exchange ideas on the research and development required for application of telerobotics technology to the space systems planned for the 1990s and beyond. The Conference: (1) provided a view of current NASA telerobotic research and development; (2) stimulated technical exchange on man-machine systems, manipulator control, machine sensing, machine intelligence, concurrent computation, and system architectures; and (3) identified important unsolved problems of current interest which can be dealt with by future research.
Intervention strategies for the management of human error
NASA Technical Reports Server (NTRS)
Wiener, Earl L.
1993-01-01
This report examines the management of human error in the cockpit. The principles probably apply to other applications in the aviation realm (e.g., air traffic control, dispatch, weather) as well as to other high-risk systems outside of aviation (e.g., shipping, high-technology medical procedures, military operations, nuclear power production). Management of human error is distinguished from error prevention. It is a more encompassing term, which includes not only the prevention of error but also means of preventing an error, once made, from adversely affecting system output. Such techniques include: traditional human factors engineering; improvement of feedback and feedforward of information from system to crew; 'error-evident' displays, which make erroneous input more obvious to the crew; trapping of errors within a system; goal-sharing between humans and machines (also called 'intent-driven' systems); paperwork management; and behaviorally based approaches, including procedures, standardization, checklist design, training, cockpit resource management, etc. Fifteen guidelines for the design and implementation of intervention strategies are included.
Machines and Human Beings in the Movies
ERIC Educational Resources Information Center
van der Laan, J. M.
2006-01-01
Over the years, many movies have presented on-screen a struggle between machines and human beings. Typically, the machines have come to rule and threaten the existence of humanity. They must be conquered to ensure the survival of and to secure the freedom of the human race. Although these movies appear to expose the dangers of an autonomous and…
A new machine classification method applied to human peripheral blood leukocytes
NASA Technical Reports Server (NTRS)
Rorvig, Mark E.; Fitzpatrick, Steven J.; Vitthal, Sanjay; Ladoulis, Charles T.
1994-01-01
Human beings judge images by complex mental processes, whereas computing machines extract features. By reducing scaled human judgments and machine extracted features to a common metric space and fitting them by regression, the judgments of human experts rendered on a sample of images may be imposed on an image population to provide automatic classification.
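The abstract's core idea, reducing scaled human judgments and machine-extracted features to a common metric space by regression, can be sketched as follows (the data here are synthetic stand-ins, not the leukocyte images from the study):

```python
# Sketch: impose expert judgments on a feature space via least squares.
# Features and judgments are synthetic; in the study they would come from
# image processing and scaled ratings by human experts, respectively.
import numpy as np

rng = np.random.default_rng(2)
features = rng.normal(size=(50, 4))                      # machine-extracted
true_map = np.array([1.0, -0.5, 0.2, 0.0])               # hidden relation
judgments = features @ true_map + 0.1 * rng.normal(size=50)

# The fitted coefficients map any image's features to a predicted judgment,
# letting the experts' metric be applied to a whole image population.
coef, *_ = np.linalg.lstsq(features, judgments, rcond=None)
scores = features @ coef
print(float(np.corrcoef(scores, judgments)[0, 1]))
```

Once fitted on the judged sample, the same `coef` vector classifies unjudged images automatically, which is the automatic-classification step the abstract describes.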
Support vector machines for nuclear reactor state estimation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zavaljevski, N.; Gross, K. C.
2000-02-14
Validation of nuclear power reactor signals is often performed by comparing signal prototypes with the actual reactor signals. The signal prototypes are often computed from empirical data, so the implementation of an estimation algorithm that can make predictions from limited data is an important issue. A machine learning algorithm called support vector machines (SVMs), recently developed by Vladimir Vapnik and his coworkers, enables a high level of generalization with finite high-dimensional data. The improved generalization in comparison with standard methods like neural networks is due mainly to the following characteristics of the method: the input data space is transformed into a high-dimensional feature space using a kernel function, and the learning problem is formulated as a convex quadratic programming problem with a unique solution. In this paper the authors have applied the SVM method to data-based state estimation in nuclear power reactors. In particular, they implemented and tested kernels developed at Argonne National Laboratory for the Multivariate State Estimation Technique (MSET), a nonlinear, nonparametric estimation technique with a wide range of applications in nuclear reactors. The methodology has been applied to three data sets from experimental and commercial nuclear power reactor applications. The results are promising: the combination of MSET kernels with the SVM method has better noise reduction and generalization properties than the standard MSET algorithm.
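A minimal sketch of the kernel-based estimation idea, assuming scikit-learn's generic RBF kernel in place of the Argonne MSET kernels and synthetic signals in place of real reactor data:

```python
# Sketch: predict a "prototype" signal from two correlated sensor channels
# with a support vector regressor. The data are synthetic; a real pipeline
# would use plant sensor records and the MSET kernels from the paper.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))                              # two sensor channels
y = 0.7 * X[:, 0] - 0.3 * X[:, 1] + 0.05 * rng.normal(size=200)

# The kernel implicitly maps inputs into a high-dimensional feature space;
# the fit solves a convex quadratic program with a unique solution.
model = SVR(kernel="rbf", C=10.0, epsilon=0.01)
model.fit(X[:150], y[:150])

# Validate on held-out samples: the prototype should track the true signal.
residual = np.abs(model.predict(X[150:]) - y[150:])
print(float(residual.mean()))
```

Signal validation then reduces to flagging samples where the measured signal departs from the predicted prototype by more than a tolerance.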
DOE Office of Scientific and Technical Information (OSTI.GOV)
HOPKINS, A.M.
The new approach to negotiations was termed collaborative (win-win) rather than positional (win-lose). Collaborative negotiations were conducted to establish milestones for the decommissioning of the Plutonium Finishing Plant (PFP).
Reference Architecture for MNE 5 Technical System
2007-05-30
of being available in most experiments. Core Services: a core set of applications including directories, web portal and collaboration applications, etc.; classifications; messages (XML, JMS, content level); metadata filtering and control of who can initiate services; web browsing; collaboration and messaging; border protection; audit logging for person and machine; data-level objects, web services, and messages.
Using human brain activity to guide machine learning.
Fong, Ruth C; Scheirer, Walter J; Cox, David D
2018-03-29
Machine learning is a field of computer science that builds algorithms that learn. In many cases, machine learning algorithms are used to recreate a human ability like adding a caption to a photo, driving a car, or playing a game. While the human brain has long served as a source of inspiration for machine learning, little effort has been made to directly use data collected from working brains as a guide for machine learning algorithms. Here we demonstrate a new paradigm of "neurally-weighted" machine learning, which takes fMRI measurements of human brain activity from subjects viewing images, and infuses these data into the training process of an object recognition learning algorithm to make it more consistent with the human brain. After training, these neurally-weighted classifiers are able to classify images without requiring any additional neural data. We show that our neural-weighting approach can lead to large performance gains when used with traditional machine vision features, as well as to significant improvements with already high-performing convolutional neural network features. The effectiveness of this approach points to a path forward for a new class of hybrid machine learning algorithms which take both inspiration and direct constraints from neuronal data.
Seamless Digital Environment – Data Analytics Use Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna
Multiple research efforts in the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program study the need for, and design of, an underlying architecture to support the increased amount and use of data in the nuclear power plant. More specifically, the LWRS research efforts on Digital Architecture for an Automated Plant, Automated Work Packages, Computer-Based Procedures for Field Workers, and Online Monitoring have all identified the need for a digital architecture and, more importantly, the need for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the analysis requested, and present the result to the user with minimal or no effort by the user. During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting, the nuclear utilities identified the need for research focused on data analytics. The effort was to develop and evaluate use cases for data mining and analytics employing information from plant sensors and databases to develop improved business analytics. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability, on a small scale, focusing on either a single piece of equipment or a single system. The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that will display the output of the analysis in a straightforward, easy-to-consume manner. This report describes the use case study initiated by NITSL and conducted in a collaboration between Idaho National Laboratory, Arizona Public Service - Palo Verde Nuclear Generating Station, and NextAxiom Inc.
Cognitive simulation as a tool for cognitive task analysis.
Roth, E M; Woods, D D; Pople, H E
1992-10-01
Cognitive simulations are runnable computer programs that represent models of human cognitive activities. We show how one cognitive simulation built as a model of some of the cognitive processes involved in dynamic fault management can be used in conjunction with small-scale empirical data on human performance to uncover the cognitive demands of a task, to identify where intention errors are likely to occur, and to point to improvements in the person-machine system. The simulation, called Cognitive Environment Simulation or CES, has been exercised on several nuclear power plant accident scenarios. Here we report one case to illustrate how a cognitive simulation tool such as CES can be used to clarify the cognitive demands of a problem-solving situation as part of a cognitive task analysis.
NASA Astrophysics Data System (ADS)
Kergosien, Yannick L.; Racoceanu, Daniel
2017-11-01
This article presents our vision of the next generation of challenges in computational/digital pathology. The key role of the domain ontology, developed in a sustainable manner (i.e., using reference checklists and protocols as living semantic repositories), opens the way to effective and sustainable traceability and relevance feedback concerning the use of existing machine learning algorithms proven to be very performant in the latest digital pathology challenges (i.e., convolutional neural networks). Being able to work in an accessible web-service environment, with strictly controlled issues regarding intellectual property (image and data processing/analysis algorithms) and medical data/image confidentiality, is essential for the future. Among the web services involved in the proposed approach, the living yellow pages in the area of computational pathology seem to be very important in order to reach operational awareness, validation, and feasibility. This represents a very promising way to go to the next generation of tools, able to bring more guidance to the computer scientists and confidence to the pathologists, towards effective and efficient daily use. Moreover, consistent feedback and insights are likely to emerge in the near future from these sophisticated machine learning tools back to the pathologists, strengthening the interaction between the different actors of a sustainable biomedical ecosystem (patients, clinicians, biologists, engineers, scientists, etc.). Besides going digital/computational, with virtual slide technology demanding new workflows, pathology must prepare for another coming revolution: semantic web technologies now enable the knowledge of experts to be stored in databases, shared through the Internet, and accessed by machines. Traceability, disambiguation of reports, quality monitoring, and interoperability between health centers are some of the associated benefits that pathologists have been seeking.
However, major changes are also to be expected in the relation of human diagnosis to machine-based procedures. Improving on a former imaging platform that used a local knowledge base and a reasoning engine to combine image processing modules into higher-level tasks, we propose a framework where different actors of the histopathology imaging world can cooperate using web services, exchanging knowledge as well as imaging services, and where the results of such collaborations on diagnosis-related tasks can be evaluated in international challenges such as those recently organized for mitosis detection, nuclear atypia, or tissue architecture in the context of cancer grading. This framework is likely to offer effective context guidance and traceability to deep learning approaches, with a promising perspective offered by the multi-task learning (MTL) paradigm, distinguished by its applicability to several different learning algorithms, its non-reliance on specialized architectures, and the promising results demonstrated, in particular on the problem of weak supervision, an issue that arises when direct links from pathology terms in reports to the corresponding regions within images are missing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matthews, W.
2000-02-22
Modern High Energy Nuclear and Particle Physics (HENP) experiments at laboratories around the world present a significant challenge to wide area networks. Petabytes (10^15 bytes) or exabytes (10^18 bytes) of data will be generated during the lifetime of the experiment. Much of this data will be distributed via the Internet to the experiment's collaborators at universities and institutes throughout the world for analysis. In order to assess the feasibility of the computing goals of these and future experiments, the HENP networking community is actively monitoring performance across a large part of the Internet used by its collaborators. Since 1995, the pingER project has been collecting data on ping packet loss and round trip times. In January 2000, there are 28 monitoring sites in 15 countries gathering data on over 2,000 end-to-end pairs. HENP labs such as SLAC, Fermilab and CERN are using Advanced Network's Surveyor project and monitoring performance from one-way delay of UDP packets. More recently, several HENP sites have become involved with NLANR's active measurement program (AMP). In addition, SLAC and CERN are part of the RIPE test-traffic project and SLAC is home to a NIMI machine. This large end-to-end performance monitoring infrastructure allows the HENP networking community to chart long-term trends and closely examine short-term glitches across a wide range of networks and connections. The different methodologies provide opportunities to compare results based on different protocols and statistical samples. Understanding agreement and discrepancies between results provides particular insight into the nature of the network. This paper will highlight the practical side of monitoring by reviewing the special needs of High Energy Nuclear and Particle Physics experiments and provide an overview of the experience of measuring performance across a large number of interconnected networks throughout the world with various methodologies.
In particular, results from each project will be compared and disagreements will be analyzed. The goal is to address issues in gathering and analyzing accurate monitoring data, but the outlook for the computing goals of HENP will also be examined.
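The per-interval statistics that pingER-style monitoring aggregates (packet loss and round-trip-time summaries per end-to-end pair) can be sketched in a few lines. This is an illustrative reconstruction, not the actual pingER code; the function name and dictionary keys are invented for this example.

```python
from statistics import median

def summarize_pings(samples):
    """Summarize one monitoring interval in the spirit of pingER:
    `samples` holds round-trip times in milliseconds for each probe,
    with None marking a lost ping packet."""
    lost = sum(1 for s in samples if s is None)
    returned = [s for s in samples if s is not None]
    return {
        "loss_pct": 100.0 * lost / len(samples),
        "min_rtt": min(returned) if returned else None,
        "median_rtt": median(returned) if returned else None,
        "max_rtt": max(returned) if returned else None,
    }

# One simulated interval for a single end-to-end pair: 10 probes, 2 lost.
stats = summarize_pings([120, 118, None, 130, 119, 121, None, 117, 140, 122])
```

Comparing such summaries across methodologies (ICMP ping vs. one-way UDP delay) is exactly where the agreements and discrepancies discussed above become visible.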
Revisit of Machine Learning Supported Biological and Biomedical Studies.
Yu, Xiang-Tian; Wang, Lu; Zeng, Tao
2018-01-01
Generally, machine learning comprises many in silico methods that transform the principles underlying natural phenomena into human-understandable information, aiming to save human labor, assist human judgment, and create human knowledge. It has wide application potential in biological and biomedical studies, especially in the era of big biological data. To survey the application of machine learning alongside biological development, this review provides a wide range of cases to introduce the selection of machine learning methods in the different practical scenarios involved in the whole biological and biomedical study cycle, and further discusses machine learning strategies for analyzing omics data in some cutting-edge biological studies. Finally, notes on new challenges for machine learning arising from small-sample, high-dimensional data are summarized around the key points of sample imbalance, white-box modeling, and causality.
Optimal design method to minimize users' thinking mapping load in human-machine interactions.
Huang, Yanqun; Li, Xu; Zhang, Jie
2015-01-01
The discrepancy between human cognition and machine requirements/behaviors usually results in a serious mental thinking-mapping load, or even disasters, during product operation. It is important to help people avoid human-machine interaction confusion and difficulty in today's society dominated by mental work, by improving the usability of a product and minimizing the user's thinking-mapping and interpreting load in human-machine interactions. An optimal human-machine interface design method is introduced, based on minimizing the mental load of the thinking-mapping process between users' intentions and the affordances of product interface states. By analyzing the users' thinking-mapping problem, an operating action model is constructed. According to human natural instincts and acquired knowledge, an expected ideal design with minimized thinking load is first uniquely determined. Then, creative alternatives, in terms of how humans obtain operational information, are provided as digital interface-state datasets. Finally, using cluster analysis, an optimum solution is picked out from the alternatives by calculating the distances between the two datasets. Considering multiple factors to minimize users' thinking-mapping loads, a solution nearest to the ideal value is found in a human-car interaction design case. The clustering results show the method's effectiveness in finding an optimum solution to the mental-load minimization problem in human-machine interaction design.
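The selection step described above, picking the alternative nearest to the ideal design by a distance calculation, can be sketched as follows. The feature names, scores, and the use of plain Euclidean distance are assumptions for illustration; the paper's actual distance measure and factor set may differ.

```python
import math

def nearest_to_ideal(ideal, alternatives):
    """Pick the design alternative whose feature vector lies closest
    (Euclidean distance) to the ideal minimal-thinking-load design.
    Feature values are assumed to be normalized to [0, 1]."""
    def dist(vec):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(vec, ideal)))
    return min(alternatives, key=lambda name_vec: dist(name_vec[1]))

# Hypothetical scores: (mapping directness, feedback clarity, familiarity)
ideal = (1.0, 1.0, 1.0)
alternatives = [
    ("rotary-knob",   (0.6, 0.8, 0.9)),
    ("touch-menu",    (0.9, 0.7, 0.5)),
    ("voice-command", (0.4, 0.5, 0.6)),
]
best = nearest_to_ideal(ideal, alternatives)
```

With these invented scores the rotary knob comes out closest to the ideal, mirroring the paper's "solution nearest to the ideal value" criterion.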
ABA-Cloud: support for collaborative breath research
Elsayed, Ibrahim; Ludescher, Thomas; King, Julian; Ager, Clemens; Trosin, Michael; Senocak, Uygar; Brezany, Peter; Feilhauer, Thomas; Amann, Anton
2016-01-01
This paper introduces the advanced breath analysis (ABA) platform, an innovative scientific research platform for the entire breath research domain. Within the ABA project, we are investigating novel data management concepts and semantic web technologies to document breath analysis studies for the long run as well as to enable their full automatic reproducibility. We propose several concept taxonomies (a hierarchical order of terms from a glossary of terms), which can be seen as a first step toward the definition of conceptualized terms commonly used by the international community of breath researchers. They build the basis for the development of an ontology (a concept from computer science used for communication between machines and/or humans and representation and reuse of knowledge) dedicated to breath research. PMID:23619467
Collaborative Planning of Robotic Exploration
NASA Technical Reports Server (NTRS)
Norris, Jeffrey; Backes, Paul; Powell, Mark; Vona, Marsette; Steinke, Robert
2004-01-01
The Science Activity Planner (SAP) software system includes an uplink-planning component, which enables collaborative planning of activities to be undertaken by an exploratory robot on a remote planet or on Earth. Included in the uplink-planning component is the SAP-Uplink Browser, which enables users to load multiple spacecraft activity plans into a single window, compare them, and merge them. The uplink-planning component includes a subcomponent that implements the Rover Markup Language Activity Planning format (RML-AP), based on the Extensible Markup Language (XML) format that enables the representation, within a single document, of planned spacecraft and robotic activities together with the scientific reasons for the activities. Each such document is highly parseable and can be validated easily. Another subcomponent of the uplink-planning component is the Activity Dictionary Markup Language (ADML), which eliminates the need for two mission activity dictionaries - one in a human-readable format and one in a machine-readable format. Style sheets that have been developed along with the ADML format enable users to edit one dictionary in a user-friendly environment without compromising
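The idea behind RML-AP, carrying a planned activity and its scientific rationale in one parseable XML document, can be sketched as below. The element and attribute names here are illustrative stand-ins, not the actual RML-AP schema.

```python
import xml.etree.ElementTree as ET

# Build a small plan document in the spirit of RML-AP: each planned
# activity carries its scientific rationale in the same XML tree, so a
# single document is both machine-parseable and human-reviewable.
plan = ET.Element("activityPlan", sol="42")
act = ET.SubElement(plan, "activity", id="IMG-001", instrument="Pancam")
ET.SubElement(act, "target").text = "rock_outcrop_A"
ET.SubElement(act, "rationale").text = "Layering suggests aqueous deposition"

doc = ET.tostring(plan, encoding="unicode")
parsed = ET.fromstring(doc)  # the document round-trips cleanly
```

Because plan and rationale live in one tree, a browser-style tool (like the SAP-Uplink Browser described above) can load, compare, and merge such documents with ordinary XML machinery.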
ERIC Educational Resources Information Center
Chou, Chih-Yueh; Huang, Bau-Hung; Lin, Chi-Jen
2011-01-01
This study proposes a virtual teaching assistant (VTA) to share teacher tutoring tasks in helping students practice program tracing and proposes two mechanisms of complementing machine intelligence and human intelligence to develop the VTA. The first mechanism applies machine intelligence to extend human intelligence (teacher answers) to evaluate…
The Trust Project - Symbiotic Human Machine Teams: Social Cueing for Trust and Reliance
2016-06-30
AFRL-RH-WP-TR-2016-0096 THE TRUST PROJECT - SYMBIOTIC HUMAN-MACHINE TEAMS: SOCIAL CUEING FOR TRUST & RELIANCE Susan Rivers, Monika Lohani, Marissa...30 JUN 2012 – 30 JUN 2016 4. TITLE AND SUBTITLE THE TRUST PROJECT - SYMBIOTIC HUMAN-MACHINE TEAMS: SOCIAL CUEING FOR TRUST & RELIANCE 5a. CONTRACT
Improving Emergency Response and Human-Robotic Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
David I. Gertman; David J. Bruemmer; R. Scott Hartley
2007-08-01
Preparedness for chemical, biological, and radiological/nuclear incidents at nuclear power plants (NPPs) includes the deployment of well trained emergency response teams. While teams are expected to do well, data from other domains suggest that the timeliness and accuracy of incident response can be improved through collaborative human-robotic interaction. Many incident response scenarios call for multiple, complex procedure-based activities performed by personnel wearing cumbersome personal protective equipment (PPE) and operating under high levels of stress and workload. While robotic assistance is postulated to reduce workload and exposure, limitations associated with communications and the robot's ability to act independently have served to limit reliability and reduce our potential to exploit human-robotic interaction and the efficacy of response. Recent work at the Idaho National Laboratory (INL) on expanding robot capability has the potential to improve human-system response during disaster management and recovery. Specifically, increasing the range of higher-level robot behaviors such as autonomous navigation and mapping, evolving new abstractions for sensor and control data, and developing metaphors for operator control have the potential to improve the state of the art in incident response. This paper discusses these issues and reports on experiments underway that use intelligence residing on the robot to enhance emergency response.
The Majorana Demonstrator: A search for neutrinoless double-beta decay of germanium-76
NASA Astrophysics Data System (ADS)
Elliott, S. R.; Abgrall, N.; Aguayo, E.; Avignone, F. T., III; Barabash, A. S.; Bertrand, F. E.; Boswell, M.; Brudanin, V.; Busch, M.; Caldwell, A. S.; Chan, Y.-D.; Christofferson, C. D.; Combs, D. C.; Detwiler, J. A.; Doe, P. J.; Efremenko, Yu.; Egorov, V.; Ejiri, H.; Esterline, J.; Fast, J. E.; Finnerty, P.; Fraenkle, F. M.; Galindo-Uribarri, A.; Giovanetti, G. K.; Goett, J.; Green, M. P.; Gruszko, J.; Guiseppe, V. E.; Gusev, K.; Hallin, A. L.; Hazama, R.; Hegai, A.; Henning, R.; Hoppe, E. W.; Howard, S.; Howe, M. A.; Keeter, K. J.; Kidd, M. F.; Kochetov, O.; Konovalov, S. I.; Kouzes, R. T.; LaFerriere, B. D.; Leon, J.; Leviner, L. E.; Loach, J. C.; MacMullin, S.; Martin, R. D.; Mertens, S.; Mizouni, L.; Nomachi, M.; Orrell, J. L.; O'Shaughnessy, C.; Overman, N. R.; Phillips, D. G., II; Poon, A. W. P.; Pushkin, K.; Radford, D. C.; Rielage, K.; Robertson, R. G. H.; Ronquest, M. C.; Schubert, A. G.; Shanks, B.; Shima, T.; Shirchenko, M.; Snavely, K. J.; Snyder, N.; Soin, A.; Strain, J.; Suriano, A. M.; Timkin, V.; Tornow, W.; Varner, R. L.; Vasilyev, S.; Vetter, K.; Vorren, K.; White, B. R.; Wilkerson, J. F.; Xu, W.; Yakushev, E.; Young, A. R.; Yu, C.-H.; Yumatov, V.
2013-12-01
The Majorana collaboration is searching for neutrinoless double beta decay using 76Ge, which has been shown to have a number of advantages in terms of sensitivities and backgrounds. The observation of neutrinoless double-beta decay would show that lepton number is violated and that neutrinos are Majorana particles and would simultaneously provide information on neutrino mass. Attaining sensitivities for neutrino masses in the inverted hierarchy region, 15-50 meV, will require large, tonne-scale detectors with extremely low backgrounds, at the level of ~1 count/t-y or lower in the region of the signal. The Majorana collaboration, with funding support from DOE Office of Nuclear Physics and NSF Particle Astrophysics, is constructing the Demonstrator, an array consisting of 40 kg of p-type point-contact high-purity germanium (HPGe) detectors, of which ~30 kg will be enriched to 87% in 76Ge. The Demonstrator is being constructed in a clean room laboratory facility at the 4850' level (4300 m.w.e.) of the Sanford Underground Research Facility (SURF) in Lead, SD. It utilizes a compact graded shield approach with the inner portion consisting of ultra-clean Cu that is being electroformed and machined underground. The primary aim of the Demonstrator is to show the feasibility of a future tonne-scale measurement in terms of backgrounds and scalability.
NASA Astrophysics Data System (ADS)
Stewart, Sarah
2017-06-01
Shock-induced vaporization was a common process during the end stages of terrestrial planet formation and transient features in extra-solar systems are attributed to recent giant impacts. At the Sandia Z Machine, my collaborators and I are conducting experiments to study the shock Hugoniot and release to the liquid-vapor phase boundary of major minerals in rocky planets. Current work on forsterite, enstatite and bronzite and previous results on silica, iron and periclase demonstrate that shock-induced vaporization played a larger role during planet formation than previously thought. I will provide an overview of the experimental results and describe how the data have changed our views of planetary impact events in our solar system and beyond. Sandia National Laboratories is a multiprogram laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under Contract No. DE-AC04-94AL85000. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. This work is supported by the Z Fundamental Science Program at Sandia National Laboratories, DOE-NNSA Grant DE- NA0002937, NASA Grant # NNX15AH54G, and UC Multicampus-National Lab Collaborative Research and Training Grant #LFR-17-449059.
Short Distance of Nuclei - Mining the Wealth of Existing Jefferson Lab Data - Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weinstein, Lawrence; Kuhn, Sebastian
Over the last fifteen years of operation, the Jefferson Lab CLAS Collaboration has performed many experiments using nuclear targets. Because the CLAS detector has a very large acceptance and because it used a very open (i.e., nonspecific) trigger, there is a vast amount of data on many different reaction channels yet to be analyzed. The goal of the Jefferson Lab Nuclear Data Mining grant was to (1) collect the data from nuclear target experiments using the CLAS detector, (2) collect the associated cuts and corrections used to analyze that data, (3) provide non-expert users with a software environment for easy analysis of the data, and (4) search for interesting reaction signatures in the data. We formed the Jefferson Lab Nuclear Data Mining collaboration under the auspices of this grant. The collaboration successfully carried out all of our goals. Dr. Gavalian, the data mining scientist, created a remarkably user-friendly web-based interface to enable easy analysis of the nuclear-target data by non-experts. Data from many of the CLAS nuclear target experiments has been made available on servers at Old Dominion University. Many of the associated cuts and corrections have been incorporated into the data mining software. The data mining collaboration was extraordinarily successful in finding interesting reaction signatures in the data. Our paper, "Momentum sharing in imbalanced Fermi systems," was published in Science. Several analyses of CLAS data are continuing and will result in papers after the end of the grant period. We have held several analysis workshops and have given many invited talks at international conferences and workshops related to the data mining initiative. Our initiative to maximize the impact of data collected with CLAS in the 6-GeV era was very successful.
During the hiatus between the end of 6-GeV experiments and the beginning of 12-GeV experiments, our collaboration and the physics community at large benefited tremendously from the Jefferson Lab Nuclear Data Mining effort.
The human role in space: Technology, economics and optimization
NASA Technical Reports Server (NTRS)
Hall, S. B. (Editor)
1985-01-01
Man-machine interactions in space are explored in detail. The role and the degree of direct involvement of humans that will be required in future space missions are investigated. An attempt is made to establish valid criteria for allocating functional activities between humans and machines and to provide insight into the technological requirements, economics, and benefits of the human presence in space. Six basic categories of man-machine interactions are considered: manual, supported, augmented, teleoperated, supervised, and independent. Appendices are included which provide human capability data, project analyses, activity timeline profiles and data sheets for 37 generic activities, support equipment and human capabilities required in these activities, and cumulative costs as a function of activity for seven man-machine modes.
Human Factors Directions for Civil Aviation
NASA Technical Reports Server (NTRS)
Hart, Sandra G.
2002-01-01
Despite considerable progress in understanding human capabilities and limitations, incorporating human factors into aircraft design, operation, and certification, and the emergence of new technologies designed to reduce workload and enhance human performance in the system, most aviation accidents still involve human errors. Such errors occur as a direct or indirect result of untimely, inappropriate, or erroneous actions (or inactions) by apparently well-trained and experienced pilots, controllers, and maintainers. The field of human factors has solved many of the more tractable problems related to simple ergonomics, cockpit layout, symbology, and so on. We have learned much about the relationships between people and machines, but know less about how to form successful partnerships between humans and the information technologies that are beginning to play a central role in aviation. Significant changes envisioned in the structure of the airspace, pilots' and controllers' roles and responsibilities, and air/ground technologies will require a similarly significant investment in human factors during the next few decades to ensure the effective integration of pilots, controllers, dispatchers, and maintainers into the new system. Many of the topics that will be addressed are not new because progress in crucial areas, such as eliminating human error, has been slow. A multidisciplinary approach that capitalizes upon human studies and new classes of information, computational models, intelligent analytical tools, and close collaborations with organizations that build, operate, and regulate aviation technology will ensure that the field of human factors meets the challenge.
Structure design of lower limb exoskeletons for gait training
NASA Astrophysics Data System (ADS)
Li, Jianfeng; Zhang, Ziqiang; Tao, Chunjing; Ji, Run
2015-09-01
Due to the close physical interaction between human and machine in the process of gait training, lower limb exoskeletons should be safe, comfortable and able to smoothly transfer desired driving force/moments to the patients. Correlatively, in kinematics the exoskeletons are required to be compatible with human lower limbs and thereby to avoid the uncontrollable interactional loads at the human-machine interfaces. Such requirement makes the structure design of exoskeletons very difficult because the human-machine closed chains are complicated. In addition, both the axis misalignments and the kinematic character difference between the exoskeleton and human joints should be taken into account. By analyzing the DOF (degree of freedom) of the whole human-machine closed chain, the human-machine kinematic incompatibility of lower limb exoskeletons is studied. An effective method for the structure design of lower limb exoskeletons, which are kinematically compatible with human lower limb, is proposed. Applying this method, the structure synthesis of the lower limb exoskeletons containing only one-DOF revolute and prismatic joints is investigated; the feasible basic structures of exoskeletons are developed and classified into three different categories. With the consideration of quasi-anthropopathic feature, structural simplicity and wearable comfort of lower limb exoskeletons, a joint replacement and structure comparison based approach to select the ideal structures of lower limb exoskeletons is proposed, by which three optimal exoskeleton structures are obtained. This paper indicates that the human-machine closed chain formed by the exoskeleton and human lower limb should be an even-constrained kinematic system in order to avoid the uncontrollable human-machine interactional loads. The presented method for the structure design of lower limb exoskeletons is universal and simple, and hence can be applied to other kinds of wearable exoskeletons.
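The DOF analysis of a human-machine closed chain that the paper relies on is typically done with the Chebychev-Gruebler-Kutzbach criterion. The sketch below shows that criterion for a spatial chain; the example numbers are generic, not the paper's specific exoskeleton structures.

```python
def spatial_mobility(n_links, joint_freedoms):
    """Chebychev-Gruebler-Kutzbach mobility of a spatial kinematic chain:
    M = 6*(n - 1 - j) + sum(f_i), where n counts links (including the
    fixed base), j counts joints, and f_i is each joint's freedom count.
    An even-constrained (exactly constrained) chain has M equal to its
    intended number of independent motions."""
    j = len(joint_freedoms)
    return 6 * (n_links - 1 - j) + sum(joint_freedoms)

# Classic check: a single-loop spatial chain of seven one-DOF revolute
# joints (seven links counting the base) is exactly constrained, M = 1.
m = spatial_mobility(n_links=7, joint_freedoms=[1] * 7)

# By contrast, a spatial four-bar with four revolutes gives M = -2:
# overconstrained, which in an exoskeleton would manifest as the
# uncontrollable interactional loads the paper warns about.
m_over = spatial_mobility(n_links=4, joint_freedoms=[1] * 4)
```

Checking candidate exoskeleton structures against this count is one plausible way to screen for the even-constrained property before detailed design.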
Using Machine Learning to Predict MCNP Bias
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grechanuk, Pavel Aleksandrovi
For many real-world applications in radiation transport where simulations are compared to experimental measurements, as in nuclear criticality safety, the bias (simulated minus experimental keff) in the calculation is an extremely important quantity used for code validation. The objective of this project is to accurately predict the bias of MCNP6 [1] criticality calculations using machine learning (ML) algorithms, with the intention of creating a tool that can complement current nuclear criticality safety methods. In the latest release of MCNP6, the Whisper tool is available for criticality safety analysts and includes a large catalogue of experimental benchmarks, sensitivity profiles, and nuclear data covariance matrices. This data, coming from 1100+ benchmark cases, is used in this study of ML algorithms for criticality safety bias predictions.
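One simple ML approach to this kind of bias prediction is nearest-neighbour regression over benchmark feature vectors. The sketch below is a toy stand-in: the features and bias values are invented, and a real study would draw on Whisper's sensitivity profiles and covariance data rather than two-component vectors.

```python
import math

def knn_bias_estimate(query, benchmarks, k=3):
    """Estimate the keff bias of a new criticality case as the mean bias
    of the k benchmark cases whose feature vectors are Euclidean-nearest
    to the query. `benchmarks` is a list of (features, bias) pairs."""
    ranked = sorted(benchmarks, key=lambda fb: math.dist(query, fb[0]))
    nearest = ranked[:k]
    return sum(bias for _, bias in nearest) / k

# Invented benchmark data: (fast-spectrum fraction, moderation index) -> bias
benchmarks = [
    ((0.9, 0.1), +0.0012),
    ((0.8, 0.2), +0.0010),
    ((0.2, 0.9), -0.0008),
    ((0.1, 0.8), -0.0006),
]
est = knn_bias_estimate((0.85, 0.15), benchmarks, k=2)
```

The query case sits among the fast-spectrum benchmarks, so its estimated bias averages their values; more sophisticated regressors would follow the same features-to-bias pattern.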
ERIC Educational Resources Information Center
Anderson, James D.; Perez-Carballo, Jose
2001-01-01
Discussion of human intellectual indexing versus automatic indexing focuses on automatic indexing. Topics include keyword indexing; negative vocabulary control; counting words; comparative counting and weighting; stemming; words versus phrases; clustering; latent semantic indexing; citation indexes; bibliographic coupling; co-citation; relevance…
NASA Astrophysics Data System (ADS)
Ardi, S.; Ardyansyah, D.
2018-02-01
In the manufacturing of automotive spare parts, increased vehicle sales have resulted in increased customer demand for engine valve production. To meet customer demand, we carried out improvement and overhaul of the NTVS-2894 seat grinder machine on a machining line. The NTVS-2894 seat grinder machine had suffered decreased productivity, a growing number of troubles, and increased downtime. To overcome these problems, the overhaul of the NTVS-2894 seat grinder machine, covering both mechanics and programs, included the design and manufacture of the HMI (Human Machine Interface) GP-4501T program, because prior to the overhaul the NTVS-2894 seat grinder machine did not have a backup HMI program. The goal of the design and manufacture of this program is to improve production achievement and to allow an operator to operate the machine more easily, as well as to ease troubleshooting of the NTVS-2894 seat grinder machine, thereby reducing its downtime. The results after the design: the HMI program was successfully rebuilt, machine productivity increased by 34.8%, the number of troubles fell, and downtime decreased by 40%, from 3,160 minutes to 1,700 minutes. The implication of our design is that it facilitates the operator in operating the machine and makes it easier for the technician to maintain the machine and troubleshoot its problems.
Liu, Xunying; Zhang, Chao; Woodland, Phil; Fonteneau, Elisabeth
2017-01-01
There is widespread interest in the relationship between the neurobiological systems supporting human cognition and emerging computational systems capable of emulating these capacities. Human speech comprehension, poorly understood as a neurobiological process, is an important case in point. Automatic Speech Recognition (ASR) systems with near-human levels of performance are now available, which provide a computationally explicit solution for the recognition of words in continuous speech. This research aims to bridge the gap between speech recognition processes in humans and machines, using novel multivariate techniques to compare incremental ‘machine states’, generated as the ASR analysis progresses over time, to the incremental ‘brain states’, measured using combined electro- and magneto-encephalography (EMEG), generated as the same inputs are heard by human listeners. This direct comparison of dynamic human and machine internal states, as they respond to the same incrementally delivered sensory input, revealed a significant correspondence between neural response patterns in human superior temporal cortex and the structural properties of ASR-derived phonetic models. Spatially coherent patches in human temporal cortex responded selectively to individual phonetic features defined on the basis of machine-extracted regularities in the speech to lexicon mapping process. These results demonstrate the feasibility of relating human and ASR solutions to the problem of speech recognition, and suggest the potential for further studies relating complex neural computations in human speech comprehension to the rapidly evolving ASR systems that address the same problem domain. PMID:28945744
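The core multivariate move in studies like this one, comparing incremental machine states to brain states, is often representational similarity analysis: correlate the pairwise dissimilarity structure of the two systems rather than their raw activations. The sketch below is a simplified stand-in for the study's actual methods, with toy 3x3 dissimilarity matrices.

```python
def upper_triangle(m):
    """Flatten the strict upper triangle of a square matrix."""
    n = len(m)
    return [m[i][j] for i in range(n) for j in range(i + 1, n)]

def pearson(x, y):
    """Pearson correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def rdm_similarity(machine_rdm, brain_rdm):
    """Compare a 'machine state' and a 'brain state' representational
    dissimilarity matrix by correlating their upper triangles."""
    return pearson(upper_triangle(machine_rdm), upper_triangle(brain_rdm))

# Toy dissimilarity matrices over three speech inputs: the brain RDM has
# the same structure as the machine RDM, just on a different scale.
machine = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
brain   = [[0, 2, 4], [2, 0, 2], [4, 2, 0]]
r = rdm_similarity(machine, brain)
```

Because the two matrices share the same relational structure, the correlation is perfect here; in real EMEG data one would test such correlations against noise baselines across cortical patches and time windows.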
Adamson, Matthew
2016-03-01
This study explores the origins and consequences of a unique, secret, French-American collaboration to prospect for uranium in 1950s Morocco. This collaboration permitted mediation between the United States and France. The appearance of France in an American-supported project for raw nuclear materials signalled American willingness to accept a new nuclear global order in which the French assumed a new, higher position as regional nuclear ally as opposed to suspicious rival. This collaboration also permitted France and the United States to agree tacitly to the same geopolitical status for the French Moroccan Protectorate, a status under dispute both in Morocco and outside it. The secret scientific effort reassured the French that, whatever the Americans might say publicly, they stood behind the maintenance of French hegemony in the centuries-old kingdom. But Moroccan independence proved impossible to deny. With its foreseeable arrival, the collaboration went from seductive to dangerous, and the priority of American and French geologists shifted from finding a major uranium lode to making sure that nothing was readily available to whatever post-independence interests might prove most powerful. Ultimately, the Kingdom of Morocco took a page out of the French book, using uranium exploration to assert sovereignty over a different disputed territory, its de facto colony of the Western Sahara.
Hao, Bibo; Sun, Wen; Yu, Yiqin; Li, Jing; Hu, Gang; Xie, Guotong
2016-01-01
Recent advances in cloud computing and machine learning have made it more convenient for researchers to gain insights from massive healthcare data, yet performing analyses on healthcare data in current practice still lacks efficiency. What's more, collaborating among different researchers and sharing analysis results are challenging issues. In this paper, we developed a practice that makes the analytics process collaborative and the analysis results reproducible by exploiting and extending Jupyter Notebook. After applying this practice in our use cases, we can perform analyses and deliver results with less effort and in less time compared to our previous practice.
Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.
2001-01-01
The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.
Man-systems integration and the man-machine interface
NASA Technical Reports Server (NTRS)
Hale, Joseph P.
1990-01-01
Viewgraphs on man-systems integration and the man-machine interface are presented. Man-systems integration applies the systems approach to the integration of the user and the machine to form an effective, symbiotic Man-Machine System (MMS). An MMS is a combination of one or more human beings and one or more physical components that are integrated through the common purpose of achieving some objective. The human operator interacts with the system through the Man-Machine Interface (MMI).
Should You Trust Your Money to a Robot?
Dhar, Vasant
2015-06-01
Financial markets emanate massive amounts of data from which machines can, in principle, learn to invest with minimal initial guidance from humans. I contrast human and machine strengths and weaknesses in making investment decisions. The analysis reveals areas in the investment landscape where machines are already very active and those where machines are likely to make significant inroads in the next few years.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, Kenneth; Oxstrand, Johanna
The Digital Architecture effort is a part of the Department of Energy (DOE) sponsored Light-Water Reactor Sustainability (LWRS) Program conducted at Idaho National Laboratory (INL). The LWRS program is performed in close collaboration with industry research and development (R&D) programs that provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants (NPPs). One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. Therefore, a major objective of the LWRS program is the development of a seamless digital environment for plant operations and support by integrating information from plant systems with plant processes for nuclear workers through an array of interconnected technologies. In order to get the most benefit from the advanced technology suggested by the different research activities in the LWRS program, the nuclear utilities need a digital architecture in place to support the technology. A digital architecture can be defined as a collection of information technology (IT) capabilities needed to support and integrate a wide spectrum of real-time digital capabilities for nuclear power plant performance improvements. It is not hard to imagine that many processes within the plant can be largely improved, from both a system and human performance perspective, by utilizing a plant-wide (or near plant-wide) wireless network. For example, a plant-wide wireless network allows real-time plant status information to be easily accessed in the control room, field workers’ computer-based procedures can be updated based on the real-time plant status, and the status of ongoing procedures can be incorporated into smart schedules in the outage command center to allow for more accurate planning of critical tasks.
The goal of the digital architecture project is to provide a long-term strategy to integrate plant systems, plant processes, and plant workers. This includes technologies to improve nuclear worker efficiency and human performance; to offset a range of plant surveillance and testing activities with new on-line monitoring technologies; to improve command, control, and collaboration in settings such as outage control centers and work execution centers; and, finally, to improve operator performance with new operator aid technologies for the control room. The requirements identified through the activities in the Digital Architecture project will be used to estimate the amount of traffic on the network and hence the minimal bandwidth needed.
Space exploration and colonization - Towards a space faring society
NASA Technical Reports Server (NTRS)
Hammond, Walter E.
1990-01-01
Development trends of space exploration and colonization since 1957 are reviewed, and a five-phase evolutionary program planned for the long-term future is described. The International Geosphere-Biosphere program which is intended to provide the database on enviromental changes of the earth as a global system is considered. Evolution encompasses the anticipated advantages of such NASA observation projects as the Hubble Space Telescope, the Gamma Ray Observatory, the Advanced X-Ray Astrophysics Facility, and the Cosmic Background Explorer. Attention is given to requirements for space colonization, including development of artificial gravity and countermeasures to mitigate zero gravity problems; robotics and systems aimed to minimize human exposure to the space environment; the use of nuclear propulsion; and international collaboration on lunar-Mars projects. It is recommended that nuclear energy sources be developed for both propulsion and as extraterrestrial power plants.
Organic Creativity for Well-Being in the Post-Information Society.
Corazza, Giovanni Emanuele
2017-11-01
The editorial dwells upon the technology-driven evolution from the Industrial to the Post-Information Society, indicating that this transition will bring about drastic transformations in our way of living, starting from the job market and then pervading all aspects at both individual and social levels. Great opportunities will come together with unprecedented challenges to living as we have always known it. In this innovation-filled scenario, it is argued that human creativity becomes the distinctive ability to provide dignity at first and survival in the long term. The term organic creativity is introduced to indicate those conditions, attitudes, and actions that bear the potential to be at the same time productive in socio-economic terms and conducive to human well-being. As a consequence, the role of psychologists in an open cooperation with sociologists, economists, computer scientists, engineers and others, will be as central as ever in establishing healthy collaboration modes between humans and machines, and large investments in related multidisciplinary scientific research are advocated to establish organic creativity as a discipline that should permeate every educational level, as well as our professional and everyday lives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Windsor, Lindsay K.; Kessler, Carol E.
An exceptional number of Middle Eastern and North African nations have recently expressed interest in developing nuclear energy for peaceful purposes. Many of these countries have explored nuclear research in limited ways in the past, but the current focused interest and application of resources towards developing nuclear-generated electricity and nuclear-powered desalination plants is unprecedented. Consequently, questions arise in response to this emerging trend: What instigated this interest? To what end(s) will a nuclear program be applied? Does the country have the adequate technical, political, legislative, nonproliferation, and safety infrastructure required for the capability desired? If so, what are the next steps for a country in preparation for a future nuclear program? And if not, what collaboration efforts are possible with the United States or others? This report provides information on the capabilities and interests of 13 countries in the region in nuclear energy programs in light of safety, nonproliferation and security concerns. It also provides information useful for determining the potential for offering technical collaboration, financial aid, and/or political support.
NASA Technical Reports Server (NTRS)
Kazerooni, H.
1991-01-01
A human's ability to perform physical tasks is limited not only by his intelligence, but by his physical strength. If, in an appropriate environment, a machine's mechanical power is closely integrated with a human arm's mechanical power under the control of the human intellect, the resulting system will be superior to a loosely integrated combination of a human and a fully automated robot. Therefore, we must develop a fundamental solution to the problem of 'extending' human mechanical power. The work presented here defines 'extenders' as a class of robot manipulators worn by humans to increase human mechanical strength, while the wearer's intellect remains the central control system for manipulating the extender. The human, in physical contact with the extender, exchanges power and information signals with the extender. The aim is to determine the fundamental building blocks of an intelligent controller, a controller which allows interaction between humans and a broad class of computer-controlled machines via simultaneous exchange of both power and information signals. The prevalent trend in automation has been to physically separate the human from the machine, so the human must always send information signals via an intermediary device (e.g., joystick, pushbutton, light switch). Extenders, however, are perfect examples of self-powered machines that are built and controlled for the optimal exchange of power and information signals with humans. The human wearing the extender is in physical contact with the machine, so power transfer is unavoidable and information signals from the human help to control the machine. Commands are transferred to the extender via the contact forces and the EMG signals between the wearer and the extender. The extender augments human motor ability without accepting any explicit commands: it accepts the EMG signals and the contact force between the person's arm and the extender, and the extender 'translates' them into a desired position.
In this unique configuration, mechanical power transfer between the human and the extender occurs because the human is pushing against the extender. The extender transfers to the human's hand, in feedback fashion, a scaled-down version of the actual external load which the extender is manipulating. This natural feedback force on the human's hand allows him to 'feel' a modified version of the external forces on the extender. The information signals from the human (e.g., EMG signals) to the computer reflect human cognitive ability, and the power transfer between the human and the machine (e.g., physical interaction) reflects human physical ability. Thus the information transfer to the machine augments cognitive ability, and the power transfer augments motor ability. These two actions are coupled through the human cognitive/motor dynamic behavior. The goal is to derive the control rules for a class of computer-controlled machines that augment human physical and cognitive abilities in certain manipulative tasks.
Electric field prediction for a human body-electric machine system.
Ioannides, Maria G; Papadopoulos, Peter J; Dimitropoulou, Eugenia
2004-01-01
A system consisting of an electric machine and a human body is studied and the resulting electric field is predicted. A 3-phase induction machine operating at full load is modeled considering its geometry, windings, and materials. A human model is also constructed approximating its geometry and the electric properties of tissues. Using the finite element technique the electric field distribution in the human body is determined for a distance of 1 and 5 m from the machine and its effects are studied. Particularly, electric field potential variations are determined at specific points inside the human body and for these points the electric field intensity is computed and compared to the limit values for exposure according to international standards.
ERIC Educational Resources Information Center
Lancaster, F. W.
1989-01-01
Describes various stages involved in the applications of electronic media to the publishing industry. Highlights include computer typesetting, or photocomposition; machine-readable databases; the distribution of publications in electronic form; computer conferencing and electronic mail; collaborative authorship; hypertext; hypermedia publications;…
VIEW OF MICROMACHINING, HIGH PRECISION EQUIPMENT USED TO CUSTOM MAKE ...
VIEW OF MICRO-MACHINING, HIGH PRECISION EQUIPMENT USED TO CUSTOM MAKE SMALL PARTS. LUMPS OF CLAY, SHOWN IN THE PHOTOGRAPH, WERE USED TO STABILIZE PARTS BEING MACHINED. (11/1/87) - Rocky Flats Plant, Stainless Steel & Non-Nuclear Components Manufacturing, Southeast corner of intersection of Cottonwood & Third Avenues, Golden, Jefferson County, CO
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rashdan, Ahmad Al; Oxstrand, Johanna; Agarwal, Vivek
As part of the ongoing efforts at the U.S. Department of Energy’s Light Water Reactor Sustainability Program, Idaho National Laboratory is conducting several pilot projects in collaboration with the nuclear industry to improve the reliability, safety, and economics of the nuclear power industry, especially as the nuclear power plants extend their operating licenses to 80 years. One of these pilot projects is the automated work package (AWP) pilot project. An AWP is an electronic, intelligent, and interactive work package. It uses plant condition, resource status, and user progress to adaptively drive the work process in a manner that increases efficiency while reducing human error. To achieve this mission, the AWP acquires information from various systems of a nuclear power plant and incorporates several advanced instrumentation and control technologies along with modern human factors techniques. With the current rapid technological advancement, it is possible to envision several available or soon-to-be-available capabilities that can play a significant role in improving the work package process. As a pilot project, the AWP project develops a prototype of an expanding set of capabilities and evaluates them in an industrial environment. While some of the proposed capabilities are based on using technological advances in other applications, others are conceptual and thus require significant research and development to be applicable in an AWP. The scope of this paper is to introduce a set of envisioned capabilities, the industry's need for them, and the industry difficulties they resolve.
Collaborative Research: Neutrinos and Nucleosynthesis in Hot and Dense Matter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alford, Mark
2015-05-31
The Topical Collaboration funded one of Prof. Alford's graduate students, Jun (Sophia) Han, by providing 75% of her support. The work reported here was wholly or partly supported by the Topical Collaboration. Additional support, e.g. for postdoc Kai Schwenzer, came from Nuclear Theory grant #DE-FG02-05ER41375.
Future Cyborgs: Human-Machine Interface for Virtual Reality Applications
2007-04-01
Powell, Robert R., Major, USAF. Blue Horizons, April 2007.
A novel stiffness control method for series elastic actuator
NASA Astrophysics Data System (ADS)
Lin, Guangmo; Zhao, Xingang; Han, Jianda
2017-01-01
Compliance plays an important role in human-robot cooperation. However, fixed compliance, or fixed stiffness, cannot meet the growing needs of human-machine collaboration. As a result, the robot actuator is required to adjust its stiffness. This paper presents a stiffness control scheme for a single-DOF series elastic actuator (SEA) with a linear spring mounted in series in the mechanism. In the proposed method, the output angle of the spring is measured and used to calculate the input angle of the spring; thus the equivalent stiffness of the robot actuator presented to the human operator can be rendered in accordance with the desired stiffness. Since the techniques used in this method involve only the position information of the system, there is no need to install an expensive force/torque sensor on the actuator. Further, the force/torque produced by the actuator can be estimated by simply multiplying the deformation angle of the spring by its constant stiffness coefficient. An analysis of the stiffness controller is provided. A simulation that emulates a human operating the SEA while the stiffness controller is running is then carried out, and the results validate the proposed method.
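The position-based stiffness-rendering idea in this abstract can be sketched in a few lines. This is a minimal hypothetical illustration, not the paper's controller: the function names, the linear-spring model tau = k_spring * (theta_in - theta_out), and the virtual-spring equilibrium target are all assumptions made for the sketch.

```python
# Hypothetical sketch of position-based stiffness rendering for a series
# elastic actuator (SEA) with a linear spring of constant k_spring.
def sea_stiffness_command(theta_out, theta_eq, k_desired, k_spring):
    """Return the spring input angle that renders k_desired at the output.

    theta_out : measured output (load-side) angle, rad
    theta_eq  : equilibrium angle of the virtual spring, rad
    k_desired : stiffness to be rendered to the operator, N*m/rad
    k_spring  : physical spring constant, N*m/rad
    """
    # Torque the rendered virtual spring should produce at this deflection.
    tau_desired = k_desired * (theta_eq - theta_out)
    # Physical spring: tau = k_spring * (theta_in - theta_out), so solve
    # for the input angle that produces tau_desired.
    return theta_out + tau_desired / k_spring

def sea_torque_estimate(theta_in, theta_out, k_spring):
    """Torque estimate from spring deformation alone (no torque sensor)."""
    return k_spring * (theta_in - theta_out)
```

Because both functions use only angles, the force/torque estimate comes for free from the same position measurements, which is the sensorless property the abstract highlights.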
Adjusting Beliefs via Transformed Fuzzy Priors
NASA Astrophysics Data System (ADS)
Rattanadamrongaksorn, T.; Sirikanchanarak, D.; Sirisrisakulchai, J.; Sriboonchitta, S.
2018-02-01
Instead of leaving a decision to a purely data-driven system, intervention and collaboration by humans is preferred to fill the gap where the machine cannot perform well. In financial applications, for instance, inference and prediction during structural changes driven by critical factors, such as market conditions, administrative styles, and political policies, have significant influences on investment strategies. When conditions differ from the past, we believe that the decision should be based not only on historical data but also on human estimation. In this study, an updating process based on data fusion between expert opinions and statistical observations is thus proposed. The expert’s linguistic terms can be translated into mathematical expressions by predefined fuzzy numbers and utilized as the initial knowledge for a Bayesian statistical framework via the possibility-to-probability transformation. Artificial samples for five scenarios were tested on a univariate problem to demonstrate the methodology. The results showed shifts and variations in the parameters of the distributions, which, as a consequence, adjusted the degrees of belief accordingly.
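One standard way to turn a possibility distribution (e.g. one elicited from an expert's fuzzy linguistic terms) into a probability distribution usable as a Bayesian prior is the classical Dubois-Prade transformation. The sketch below is illustrative only and is not claimed to be the exact procedure used in this paper.

```python
# Classical Dubois-Prade possibility-to-probability transformation for a
# discrete possibility distribution pi (values in [0, 1], max == 1).
def possibility_to_probability(pi):
    # Sort degrees in descending order, remembering original positions.
    order = sorted(range(len(pi)), key=lambda i: pi[i], reverse=True)
    sorted_pi = [pi[i] for i in order] + [0.0]  # pad pi_{n+1} = 0
    n = len(pi)
    p_sorted = []
    for i in range(n):
        # p_i = sum_{j >= i} (pi_j - pi_{j+1}) / j   (1-based rank j)
        p_sorted.append(sum((sorted_pi[j] - sorted_pi[j + 1]) / (j + 1)
                            for j in range(i, n)))
    # Restore the original ordering of the outcomes.
    p = [0.0] * n
    for rank, i in enumerate(order):
        p[i] = p_sorted[rank]
    return p
```

For example, the possibility distribution [1.0, 0.5] maps to the probabilities [0.75, 0.25], which could then serve as a discrete prior in a Bayesian update.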
Fahey, Frederic H; Bom, Henry Hee-Seong; Chiti, Arturo; Choi, Yun Young; Huang, Gang; Lassmann, Michael; Laurin, Norman; Mut, Fernando; Nuñez-Miller, Rodolfo; O'Keeffe, Darin; Pradhan, Prasanta; Scott, Andrew M; Song, Shaoli; Soni, Nischal; Uchiyama, Mayuki; Vargas, Luis
2015-04-01
The Nuclear Medicine Global Initiative (NMGI) was formed in 2012 and consists of 13 international organizations with direct involvement in nuclear medicine. The underlying objectives of the NMGI were to promote human health by advancing the field of nuclear medicine and molecular imaging, encourage global collaboration in education, and harmonize procedure guidelines and other policies that ultimately lead to improvements in quality and safety in the field throughout the world. For its first project, the NMGI decided to consider the issues involved in the standardization of administered activities in pediatric nuclear medicine. This article presents part 1 of the final report of this initial project of the NMGI. It provides a review of the value of pediatric nuclear medicine, the current understanding of the carcinogenic risk of radiation as it pertains to the administration of radiopharmaceuticals in children, and the application of dosimetric models in children. A listing of pertinent educational and reference resources available in print and online is also provided. The forthcoming part 2 report will discuss current standards for administered activities in children and adolescents that have been developed by various organizations and an evaluation of the current practice of pediatric nuclear medicine specifically with regard to administered activities as determined by an international survey of nuclear medicine clinics and centers. Lastly, the part 2 report will recommend a path forward toward global standardization of the administration of radiopharmaceuticals in children. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
Developing a Nuclear Global Health Workforce Amid the Increasing Threat of a Nuclear Crisis.
Burkle, Frederick M; Dallas, Cham E
2016-02-01
This study argues that any nuclear weapon exchange or major nuclear plant meltdown, in the categories of human systems failure and conflict-based crises, will immediately provoke an unprecedented public health emergency of international concern. Notwithstanding nuclear triage and management plans and technical monitoring standards within the International Atomic Energy Agency and the World Health Organization (WHO), the capacity to rapidly deploy a robust professional workforce with the internal coordination and collaboration capabilities required for large-scale nuclear crises is profoundly lacking. A similar dilemma, evident in the early stages of the Ebola epidemic, was eventually managed by using worldwide infectious disease experts from the Global Outbreak Alert and Response Network and multiple multidisciplinary WHO-supported foreign medical teams. This success has led the WHO to propose the development of a Global Health Workforce. A strategic format is proposed for nuclear preparedness and response that builds and expands on the current model for infectious disease outbreak currently under consideration. This study proposes the inclusion of a nuclear global health workforce under the technical expertise of the International Atomic Energy Agency and WHO's Radiation Emergency Medical Preparedness and Assistance Network leadership and supported by the International Health Regulations Treaty. Rationales are set forth for the development, structure, and function of a nuclear workforce based on health outcomes research that define the unique health, health systems, and public health challenges of a nuclear crisis. Recent research supports that life-saving opportunities are possible, but only if a rapidly deployed and robust multidisciplinary response component exists.
Nuclear powerplants for mobile applications.
NASA Technical Reports Server (NTRS)
Anderson, J. L.
1972-01-01
Mobile nuclear powerplants for applications other than large ships and submarines will require compact, lightweight reactors with especially stringent impact-safety design. This paper examines the technical and economic feasibility of extending the broadening role of civilian nuclear power, in general (land-based nuclear electric generating plants and nuclear ships), to lightweight, safe mobile nuclear powerplants. The paper discusses technical experience, identifies potential sources of technology for advanced concepts, cites the results of economic studies of mobile nuclear powerplants, and surveys future technical capabilities needed by examining the current use and projected needs for vehicles, machines, and habitats that could effectively use mobile nuclear reactor powerplants.
32 CFR 286.29 - Collection of fees and fee rates.
Code of Federal Regulations, 2013 CFR
2013-07-01
... consists of two parts; individual time (hereafter referred to as human time), and machine time. (i) Human... support, operator, programmer, database administrator, or action officer). (ii) Machine time. Machine time... the time of providing the documents to the requester or recipient when the requester specifically...
32 CFR 286.29 - Collection of fees and fee rates.
Code of Federal Regulations, 2012 CFR
2012-07-01
... consists of two parts; individual time (hereafter referred to as human time), and machine time. (i) Human... support, operator, programmer, database administrator, or action officer). (ii) Machine time. Machine time... the time of providing the documents to the requester or recipient when the requester specifically...
32 CFR 286.29 - Collection of fees and fee rates.
Code of Federal Regulations, 2014 CFR
2014-07-01
... consists of two parts; individual time (hereafter referred to as human time), and machine time. (i) Human... support, operator, programmer, database administrator, or action officer). (ii) Machine time. Machine time... the time of providing the documents to the requester or recipient when the requester specifically...
ERIC Educational Resources Information Center
Ch'ien, Evelyn
2011-01-01
This paper describes how a linguistic form, rap, can evolve in tandem with technological advances and manifest human-machine creativity. Rather than assuming that the interplay between machines and technology makes humans robotic or machine-like, the paper explores how the pressure of executing artistic visions using technology can drive…
Light at Night Markup Language (LANML): XML Technology for Light at Night Monitoring Data
NASA Astrophysics Data System (ADS)
Craine, B. L.; Craine, E. R.; Craine, E. M.; Crawford, D. L.
2013-05-01
Light at Night Markup Language (LANML) is a standard, based upon XML, useful in acquiring, validating, transporting, archiving and analyzing multi-dimensional light at night (LAN) datasets of any size. The LANML standard can accommodate a variety of measurement scenarios including single spot measures, static time-series, web-based monitoring networks, mobile measurements, and airborne measurements. LANML is human-readable, machine-readable, and does not require a dedicated parser. In addition, LANML is flexible, ensuring that future extensions of the format will remain backward compatible with analysis software. XML technology is at the heart of communication over the internet and can be equally useful at the desktop level, making this standard particularly attractive for web-based applications, educational outreach and efficient collaboration between research groups.
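Because LANML is XML-based and needs no dedicated parser, a record can be read with any standard XML library. The element and attribute names below are invented for illustration and are not taken from the actual LANML schema.

```python
# Illustrative parse of a hypothetical LANML-style record using only the
# Python standard library. The tag names are assumptions, not the schema.
import xml.etree.ElementTree as ET

record = """<lanml>
  <measurement>
    <site lat="32.23" lon="-110.95"/>
    <sky_brightness units="mag/arcsec2">21.4</sky_brightness>
  </measurement>
</lanml>"""

root = ET.fromstring(record)
m = root.find("measurement")
brightness = float(m.find("sky_brightness").text)  # 21.4 mag/arcsec^2
lat = float(m.find("site").get("lat"))             # site latitude
```

Any XML toolchain (validation against a schema, XSLT for web display, streaming parsers for large archives) applies directly, which is the interoperability argument the abstract makes.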
Chain Experiment competition inspires learning of physics
NASA Astrophysics Data System (ADS)
Dziob, Daniel; Górska, Urszula; Kołodziej, Tomasz
2017-05-01
The Chain Experiment is an annual competition which originated in Slovenia in 2005 and expanded to Poland in 2013. For the purpose of the event, each participating team designs and builds a contraption that transports a small steel ball from one end to the other. At the same time, the constructed machine needs to use a number of interesting phenomena and physics laws. In the competition’s finale, all contraptions are connected to each other to form a long chain transporting steel balls. In brief, they are all evaluated on qualities such as creativity, sophistication of the theoretical background, and the reliability of the constructed machine in working without human help. In this article, we present the contraptions developed by students taking part in the competition in order to demonstrate the sophistication of their theoretical basis, together with the creativity in design and outstanding engineering skills of the participants. Furthermore, we situate the Chain Experiment in the context of other group competitions, demonstrating that, besides activating numerous group-work skills, it also improves the ability to think critically and present one’s knowledge to a broader audience. We discuss it in the context of problem-based learning, gamification and collaborative testing.
Collaborative autonomous sensing with Bayesians in the loop
NASA Astrophysics Data System (ADS)
Ahmed, Nisar
2016-10-01
There is a strong push to develop intelligent unmanned autonomy that complements human reasoning for applications as diverse as wilderness search and rescue, military surveillance, and robotic space exploration. More than just replacing humans for `dull, dirty and dangerous' work, autonomous agents are expected to cope with a whole host of uncertainties while working closely together with humans in new situations. The robotics revolution firmly established the primacy of Bayesian algorithms for tackling challenging perception, learning and decision-making problems. Since the next frontier of autonomy demands the ability to gather information across stretches of time and space that are beyond the reach of a single autonomous agent, the next generation of Bayesian algorithms must capitalize on opportunities to draw upon the sensing and perception abilities of humans-in/on-the-loop. This work summarizes our recent research toward harnessing `human sensors' for information gathering tasks. The basic idea is to allow human end users (i.e. non-experts in robotics, statistics, machine learning, etc.) to directly `talk to' the information fusion engine and perceptual processes aboard any autonomous agent. Our approach is grounded in rigorous Bayesian modeling and fusion of flexible semantic information derived from user-friendly interfaces, such as natural language chat and locative hand-drawn sketches. This naturally enables `plug and play' human sensing with existing probabilistic algorithms for planning and perception, and has been successfully demonstrated with human-robot teams in target localization applications.
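In its simplest discrete form, the `human sensor' fusion idea reduces to a Bayesian belief update in which the human's semantic report is modeled as a soft likelihood over possible target locations. The sketch below is a minimal illustration of that principle, not the authors' system; the grid, the report, and the likelihood values are invented for the example.

```python
# Minimal discrete Bayesian update fusing a human's semantic report into
# a target-location belief over a small grid of cells.
def bayes_update(prior, likelihood):
    """Pointwise multiply prior by likelihood, then renormalize."""
    posterior = [p * l for p, l in zip(prior, likelihood)]
    z = sum(posterior)  # normalizing constant
    return [p / z for p in posterior]

# Uniform prior over 4 cells; the human report "the target is probably
# near cell 1" is modeled as a soft likelihood, not a hard constraint,
# so the update remains robust to human error.
prior = [0.25] * 4
human_likelihood = [0.1, 0.7, 0.1, 0.1]
belief = bayes_update(prior, human_likelihood)
```

Because the human report enters through an ordinary likelihood, it composes directly with likelihoods from physical sensors, which is what makes the `plug and play' integration with existing probabilistic planners possible.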
NASA Astrophysics Data System (ADS)
Wardzinska, Aleksandra; Petit, Stephan; Bray, Rachel; Delamare, Christophe; Garcia Arza, Griselda; Krastev, Tsvetelin; Pater, Krzysztof; Suwalska, Anna; Widegren, David
2015-12-01
Large-scale long-term projects such as the LHC require the ability to store, manage, organize and distribute large amounts of engineering information, covering a wide spectrum of fields. This information is living material, evolving in time and following specific lifecycles. It has to reach the next generations of engineers so they understand how their predecessors designed, crafted, operated and maintained the most complex machines ever built. This is the role of CERN EDMS. The Engineering and Equipment Data Management Service has served the High Energy Physics community for over 15 years. It is CERN's official PLM (Product Lifecycle Management) system, supporting engineering communities in their collaborations inside and outside the laboratory. EDMS is integrated with the CAD (Computer-Aided Design) and CMMS (Computerized Maintenance Management) systems used at CERN, providing tools for engineers who work in different domains and who are not PLM specialists. Over the years, human collaborations and machines grew in size and complexity. So did EDMS: it is currently home to more than 2 million files and documents, and has over 6 thousand active users. In April 2014 we released a new major version of EDMS, featuring a complete makeover of the web interface, improved responsiveness and enhanced functionality. Following the results of user surveys and building upon feedback received from key user groups, we believe we have delivered a system that is more attractive and makes it easier to perform complex tasks. In this paper we describe the main functions and the architecture of EDMS. We discuss the available integration options, which enable further evolution and automation of engineering data management. We also present our plans for the future development of EDMS.
Multiple man-machine interfaces
NASA Technical Reports Server (NTRS)
Stanton, L.; Cook, C. W.
1981-01-01
The multiple man-machine interfaces inherent in military pilot training, their social implications, and the issue of possible negative feedback were explored. Modern technology has produced machines which can see, hear, and touch with greater accuracy and precision than human beings. Consequently, the military pilot is more a systems manager, often doing battle against a target he never sees. It is concluded that unquantifiable human activity requires motivation that is not intrinsic in a machine.
Surface Inspection Machine Infrared (SIMIR). Final CRADA report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Powell, G.L.; Neu, J.T.; Beecroft, M.
This Cooperative Research and Development Agreement was a one-year effort to make the surface inspection machine based on diffuse reflectance infrared spectroscopy (Surface Inspection Machine-Infrared, SIMIR), being developed by Surface Optics Corporation, perform to its highest potential as a practical, portable surface inspection machine. The design function of the SIMIR is to inspect metal surfaces for cleanliness (stains). The system is also capable of evaluating graphite-resin systems for cure and heat damage, and of measuring the effects of moisture exposure on lithium hydride, corrosion on uranium metal, and the constituents of and contamination on wood, paper, and fabrics. Over the period of the CRADA, extensive experience with the use of the SIMIR for surface cleanliness measurements was gained through collaborations with NASA and the Army. The SIMIR was made available to the AMTEX CRADA for Finish on Yarn, where it made a very significant contribution. The SIMIR was also the foundation of a Forest Products CRADA that was developed over the time interval of this CRADA. Surface Optics Corporation and the SIMIR have been introduced to the chemical spectroscopy on-line analysis market, and the company has made staffing additions and arrangements for international marketing of the SIMIR as an on-line surface inspection device. LMES has been introduced to a wide range of aerospace applications and to the research and fabrication skills of Surface Optics Corporation, has gained extensive experience in surface cleanliness from collaborations with NASA and the Army, and has received an extensive introduction to the textile and forest products industries. The SIMIR, marketed as the SOC-400, has filled an important new-technology need in the DOE-DP Enhanced Surveillance Program, with instruments delivered to or on order by LMES, LANL, LLNL, and Pantex, where extensive collaborations are underway to implement and improve this technology.
NASA Astrophysics Data System (ADS)
Kang, Soon Ju; Moon, Jae Chul; Choi, Doo-Hyun; Choi, Sung Su; Woo, Hee Gon
1998-06-01
The inspection of steam-generator (SG) tubes in a nuclear power plant (NPP) is a time-consuming, laborious, and hazardous task because of several hard constraints such as a highly radiated working environment, a tight task schedule, and the need for many experienced human inspectors. This paper presents a new distributed intelligent system architecture for automating traditional inspection methods. The proposed architecture adopts three basic technical strategies in order to reduce the complexity of system implementation. The first is the distribution of tasks into four stages: inspection planning (IP), signal acquisition (SA), signal evaluation (SE), and inspection data management (IDM). Consequently, dedicated subsystems for the automation of each stage can be designed and implemented separately. The second strategy is the inclusion of several useful artificial intelligence techniques for implementing the subsystems of each stage, such as an expert system for IP and SE, and machine vision and remote robot control techniques for SA. The third strategy is the integration of the subsystems using a client/server-based distributed computing architecture and a centralized database management concept. Through the use of the proposed architecture, human errors, which can occur during inspection, can be minimized because human intervention is almost eliminated, while the productivity of the human inspector is correspondingly increased. A prototype of the proposed system has been developed and successfully tested over the last six years in domestic NPPs.
ERIC Educational Resources Information Center
Inglis, David Rittenhouse
1975-01-01
The government promotes and heavily subsidizes research in nuclear power plants. Federal development of wind power is slow in comparison even though much research with large wind-electric machines has already been conducted. Unless wind power programs are accelerated it will not become a major energy alternative to nuclear power. (MR)
Knowledge Acquisition, Knowledge Programming, and Knowledge Refinement.
ERIC Educational Resources Information Center
Hayes-Roth, Frederick; And Others
This report describes the principal findings and recommendations of a 2-year Rand research project on machine-aided knowledge acquisition and discusses the transfer of expertise from humans to machines, as well as the functions of planning, debugging, knowledge refinement, and autonomous machine learning. The relative advantages of humans and…
Learning Machine, Vietnamese Based Human-Computer Interface.
ERIC Educational Resources Information Center
Northwest Regional Educational Lab., Portland, OR.
The sixth session of IT@EDU98 consisted of seven papers on the topic of the learning machine--Vietnamese based human-computer interface, and was chaired by Phan Viet Hoang (Informatics College, Singapore). "Knowledge Based Approach for English Vietnamese Machine Translation" (Hoang Kiem, Dinh Dien) presents the knowledge base approach,…
Human-Robot Interaction: Status and Challenges.
Sheridan, Thomas B
2016-06-01
The current status of human-robot interaction (HRI) is reviewed, and key current research challenges for the human factors community are described. Robots have evolved from continuous human-controlled master-slave servomechanisms for handling nuclear waste to a broad range of robots incorporating artificial intelligence for many applications under human supervisory control. This mini-review describes HRI developments in four application areas and the associated challenges for human factors research. In addition to a plethora of research papers, evidence of success is manifest in live demonstrations of robot capability under various forms of human control. HRI is a rapidly evolving field. Specialized robots under human teleoperation have proven successful in hazardous environments and medical applications, as have specialized telerobots under human supervisory control for space and repetitive industrial tasks. Research in the areas of self-driving cars, intimate collaboration with humans in manipulation tasks, human control of humanoid robots for hazardous environments, and social interaction with robots is at an initial stage. The efficacy of humanoid general-purpose robots has yet to be proven. HRI is now applied in almost all robot tasks, including manufacturing, space, aviation, undersea, surgery, rehabilitation, agriculture, education, package fetch and delivery, policing, and military operations. © 2016, Human Factors and Ergonomics Society.
Marucci-Wellman, Helen R; Corns, Helen L; Lehto, Mark R
2017-01-01
Injury narratives are now available in real time and include useful information for injury surveillance and prevention. However, manual classification of the cause or events leading to injury found in large batches of narratives, such as workers' compensation claims databases, can be prohibitive. In this study we compare the utility of four machine learning algorithms (Naïve Bayes single-word and bi-gram models, Support Vector Machine, and Logistic Regression) for classifying narratives into Bureau of Labor Statistics Occupational Injury and Illness event-leading-to-injury classifications for a large workers' compensation database. These algorithms are known to do well classifying narrative text and are fairly easy to implement with off-the-shelf software packages such as Python. We propose human-machine learning ensemble approaches which maximize the power and accuracy of the algorithms for machine-assigned codes and allow for strategic filtering of rare, emerging or ambiguous narratives for manual review. We compare human-machine approaches based on filtering on the prediction strength of the classifier vs. agreement between algorithms. Regularized Logistic Regression (LR) was the best performing algorithm alone. Using this algorithm and filtering out the bottom 30% of predictions for manual review resulted in high accuracy (overall sensitivity/positive predictive value of 0.89) of the final machine-human coded dataset. The best pairings of algorithms included Naïve Bayes with Support Vector Machine, whereby the triple ensemble requiring agreement among the NB single-word, NB bi-gram, and SVM classifiers had very high performance (0.93 overall sensitivity/positive predictive value, with high sensitivity and positive predictive values across both large and small categories), leaving 41% of the narratives for manual review. Integrating LR into this ensemble mix improved performance only slightly.
For large administrative datasets we propose incorporation of methods based on human-machine pairings such as we have done here, utilizing readily available off-the-shelf machine learning techniques and resulting in only a fraction of narratives that require manual review. Human-machine ensemble methods are likely to improve performance over total manual coding. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
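The agreement-filtering idea in the abstract above can be sketched in a few lines: machine-assign a code only when independent classifiers agree, and route disagreements to a human coder. The toy narratives, the two event categories, and the two bag-of-words classifiers below are invented for illustration; they are not the models or BLS codes used in the study.

```python
from collections import Counter
import math

# Minimal multinomial Naive Bayes over word (or bigram) counts.
class NaiveBayes:
    def tokens(self, doc):
        return doc.lower().split()

    def fit(self, docs, labels):
        self.classes = set(labels)
        self.prior = Counter(labels)
        self.counts = {c: Counter() for c in self.classes}
        for doc, y in zip(docs, labels):
            self.counts[y].update(self.tokens(doc))
        self.vocab = {w for c in self.classes for w in self.counts[c]}
        return self

    def predict(self, doc):
        scores = {}
        for c in self.classes:
            denom = sum(self.counts[c].values()) + len(self.vocab)
            s = math.log(self.prior[c])
            for w in self.tokens(doc):
                s += math.log((self.counts[c][w] + 1) / denom)  # Laplace smoothing
            scores[c] = s
        return max(scores, key=scores.get)

class BigramNB(NaiveBayes):
    def tokens(self, doc):
        words = doc.lower().split()
        return [a + "_" + b for a, b in zip(words, words[1:])] or words

def ensemble_code(narrative, models):
    """Machine-assign a code only when every model agrees; None = manual review."""
    votes = {m.predict(narrative) for m in models}
    return votes.pop() if len(votes) == 1 else None

# Hypothetical training narratives with event codes.
train = [
    ("slipped on wet floor and fell", "fall"),
    ("fell from ladder while painting", "fall"),
    ("hand caught in press machine", "caught_in"),
    ("finger caught in conveyor belt", "caught_in"),
]
docs, labels = zip(*train)
models = [NaiveBayes().fit(docs, labels), BigramNB().fit(docs, labels)]

print(ensemble_code("he fell from the ladder", models))  # models agree -> 'fall'
print(ensemble_code("slipped while walking", models))    # models disagree -> None
```

The same filtering pattern extends to the paper's triple ensemble: require agreement among three classifiers and send the remainder (41% in the study) to manual review.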
Tasking and sharing sensing assets using controlled natural language
NASA Astrophysics Data System (ADS)
Preece, Alun; Pizzocaro, Diego; Braines, David; Mott, David
2012-06-01
We introduce an approach to representing intelligence, surveillance, and reconnaissance (ISR) tasks at a relatively high level in controlled natural language. We demonstrate that this facilitates both human interpretation and machine processing of tasks. More specifically, it allows the automatic assignment of sensing assets to tasks, and the informed sharing of tasks between collaborating users in a coalition environment. To enable automatic matching of sensor types to tasks, we created a machine-processable knowledge representation based on the Military Missions and Means Framework (MMF), and implemented a semantic reasoner to match task types to sensor types. We combined this mechanism with a sensor-task assignment procedure based on a well-known distributed protocol for resource allocation. In this paper, we re-formulate the MMF ontology in Controlled English (CE), a type of controlled natural language designed to be readable by a native English speaker whilst representing information in a structured, unambiguous form to facilitate machine processing. We show how CE can be used to describe both ISR tasks (for example, detection, localization, or identification of particular kinds of object) and sensing assets (for example, acoustic, visual, or seismic sensors, mounted on motes or unmanned vehicles). We show how these representations enable an automatic sensor-task assignment process. Where a group of users are cooperating in a coalition, we show how CE task summaries give users in the field a high-level picture of ISR coverage of an area of interest. This allows them to make efficient use of sensing resources by sharing tasks.
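The matching step described above can be sketched as a capability lookup. The task names, sensor names, and capability sets below are invented; the actual system reasons over an MMF ontology expressed in Controlled English and assigns sensors through a distributed allocation protocol, not a static table.

```python
# Hypothetical task -> acceptable sensing capabilities (any one suffices).
TASK_REQUIREMENTS = {
    "detect-vehicle": {"acoustic", "seismic"},
    "identify-vehicle": {"visual"},
    "localize-person": {"acoustic", "visual"},
}

# Hypothetical deployed assets and their capabilities.
SENSORS = {
    "mote-17": {"acoustic"},
    "uav-2": {"visual"},
    "ground-array-5": {"seismic"},
}

def match(task):
    """Return the sensors whose capabilities intersect the task's requirements."""
    required = TASK_REQUIREMENTS[task]
    return sorted(name for name, caps in SENSORS.items() if caps & required)

print(match("detect-vehicle"))    # -> ['ground-array-5', 'mote-17']
print(match("identify-vehicle"))  # -> ['uav-2']
```

In the coalition setting, each candidate list would then feed the distributed resource-allocation protocol to decide which matched asset actually takes the task.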
Virtual network computing: cross-platform remote display and collaboration software.
Konerding, D E
1999-04-01
VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits them back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, each unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.
Are human beings humean robots?
NASA Astrophysics Data System (ADS)
Génova, Gonzalo; Quintanilla Navarro, Ignacio
2018-01-01
David Hume, the Scottish philosopher, conceives reason as the slave of the passions, which implies that human reason has predetermined objectives it cannot question. An essential element of an algorithm running on a computational machine (or Logical Computing Machine, as Alan Turing calls it) is its having a predetermined purpose: an algorithm cannot question its purpose, because it would cease to be an algorithm. Therefore, if self-determination is essential to human intelligence, then human beings are neither Humean beings nor computational machines. We also examine some objections to the Turing Test as a model for understanding human intelligence.
NASA Astrophysics Data System (ADS)
Johnson, Bradley; May, Gayle L.; Korn, Paula
The present conference discusses the currently envisioned goals of human-machine systems in spacecraft environments, prospects for human exploration of the solar system, and plausible methods for meeting human needs in space. Also discussed are the problems of human-machine interaction in long-duration space flights, remote medical systems for space exploration, the use of virtual reality for planetary exploration, the alliance between U.S. Antarctic and space programs, and the economic and educational impacts of the U.S. space program.
NASA Technical Reports Server (NTRS)
Schwarzenberg, M.; Pippia, P.; Meloni, M. A.; Cossu, G.; Cogoli-Greuter, M.; Cogoli, A.
1998-01-01
The purpose of this paper is to present the results obtained in our laboratory with both instruments, the FFM [free fall machine] and the RPM [random positioning machine], to compare them with the data from earlier experiments with human lymphocytes conducted in the FRC [fast rotating clinostat] and in space. Furthermore, the suitability of the FFM and RPM for research in gravitational cell biology is discussed.
Amplifying human ability through autonomics and machine learning in IMPACT
NASA Astrophysics Data System (ADS)
Dzieciuch, Iryna; Reeder, John; Gutzwiller, Robert; Gustafson, Eric; Coronado, Braulio; Martinez, Luis; Croft, Bryan; Lange, Douglas S.
2017-05-01
Amplifying human ability for controlling complex environments featuring autonomous units can be aided by learned models of human and system performance. In developing a command and control system that allows a small number of people to control a large number of autonomous teams, we employ an autonomics framework to manage the networks that represent mission plans and the networks that are composed of human controllers and their autonomous assistants. Machine learning allows us to build models of human and system performance useful for monitoring plans and managing human attention and task loads. Machine learning also aids in the development of tactics that human supervisors can successfully monitor through the command and control system.
Experimental Studies of Instability Development in Magnetically Driven Systems
Awe, Thomas James
2015-03-01
The author highlights results from a variety of experiments on the Z Machine, for which he served as the lead experimentalist. All experiments on Z take dedicated effort from a large collaboration of scientists, engineers, and technicians.
Learning Activity Package, Physical Science. LAP Numbers 8, 9, 10, and 11.
ERIC Educational Resources Information Center
Williams, G. J.
These four units of the Learning Activity Packages (LAPs) for individualized instruction in physical science cover nuclear reactions, alpha and beta particles, atomic radiation, medical use of nuclear energy, fission, fusion, simple machines, Newton's laws of motion, electricity, currents, electromagnetism, Oersted's experiment, sound, light,…
Verification and Validation of Digitally Upgraded Control Rooms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boring, Ronald; Lau, Nathan
2015-09-01
As nuclear power plants undertake main control room modernization, a challenge is the lack of a clearly defined human factors process to follow. Verification and validation (V&V) as applied in the nuclear power community has tended to involve efforts such as integrated system validation, which comes at the tail end of the design stage. To fill in guidance gaps and create a step-by-step process for control room modernization, we have developed the Guideline for Operational Nuclear Usability and Knowledge Elicitation (GONUKE). This approach builds on best practices in the software industry, which prescribe an iterative user-centered approach featuring multiple cycles of design and evaluation. Nuclear regulatory guidance for control room design emphasizes summative evaluation, which occurs after the design is complete. In the GONUKE approach, evaluation is also performed at the formative stage of design, early in the design cycle, using mockups and prototypes for evaluation. The evaluation may involve expert review (e.g., software heuristic evaluation at the formative stage and design verification against human factors standards like NUREG-0700 at the summative stage). The evaluation may also involve user testing (e.g., usability testing at the formative stage and integrated system validation at the summative stage). An additional, often overlooked component of evaluation is knowledge elicitation, which captures operator insights into the system. In this report we outline these evaluation types across design phases that support the overall modernization process. The objective is to provide industry-suitable guidance for steps to be taken in support of the design and evaluation of a new human-machine interface (HMI) in the control room. We suggest the value of early-stage V&V and highlight how this early-stage V&V can help improve the design process for control room modernization. We argue that there is a need to overcome two shortcomings of V&V in current practice: the propensity for late-stage V&V and the use of increasingly complex psychological assessment measures for V&V.
Human factors - Man-machine symbiosis in space
NASA Technical Reports Server (NTRS)
Brown, Jeri W.
1987-01-01
The relation between man and machine in space is studied. Early spaceflight and the goal of establishing a permanent space presence are described. The need to consider the physiological, psychological, and social integration of humans for each space mission is examined. Human factors must also be considered in the design of spacecraft. The effective utilization of man and machine capabilities, and research in anthropometry and biomechanics aimed at determining the limitations of spacecrews are discussed.
Emerging needs for mobile nuclear powerplants
NASA Technical Reports Server (NTRS)
Anderson, J. L.
1972-01-01
Incentives for broadening the present role of civilian nuclear power to include mobile nuclear power plants that are compact, lightweight, and safe are examined. Specifically discussed is the growing importance of: (1) a new international cargo transportation capability, and (2) the capability for development of resources in previously remote regions of the earth including the oceans and the Arctic. This report surveys present and potential systems (vehicles, remote stations, and machines) that would both provide these capabilities and require enough power to justify using mobile nuclear reactor power plants.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tu, Zhude
The goal of this grant was to provide critical interdisciplinary research training for the next generation of radiochemists and nuclear medicine physicians through a collaboration between basic science and clinical faculty who are actively involved in the development, application, and translation of radiopharmaceuticals. Following the four-year funding support period, the 10 postdocs, graduate students, and clinical physicians who received training have become faculty members or senior radiochemists at academic institutes or in industry. With respect to scientific accomplishments, 26 peer-reviewed articles have been published to date, as well as numerous poster and oral presentations. The goals of all four scientific projects were completed, and several promising radiotracers were identified for transfer into clinical investigation for human use. Some preliminary data generated from this training grant led to several successful NIH grant proposals for the principal investigators.
Shutdown Dose Rate Analysis for the long-pulse D-D Operation Phase in KSTAR
NASA Astrophysics Data System (ADS)
Park, Jin Hun; Han, Jung-Hoon; Kim, D. H.; Joo, K. S.; Hwang, Y. S.
2017-09-01
KSTAR is a medium-size, fully superconducting tokamak. The deuterium-deuterium (D-D) reaction in the KSTAR tokamak generates neutrons with a peak yield of 3.5×10^16 per second through a pulse operation of 100 seconds. The effects of neutron generation in the full D-D high-power KSTAR operation mode on the machine, such as activation, shutdown dose rate, and nuclear heating, are estimated to assure safety during operation, maintenance, and machine upgrades. The nuclear heating of the in-vessel components and the neutron activation of the surrounding materials have been investigated. The dose rates during operation and after shutdown of KSTAR have been calculated with a 3D CAD model of KSTAR using the Monte Carlo code MCNP5 (neutron flux and decay photons), the inventory code FISPACT (activation and decay photons), and the FENDL 2.1 nuclear data library.
Teaching And Training Tools For The Undergraduate: Experience With A Rebuilt AN-400 Accelerator
NASA Astrophysics Data System (ADS)
Roberts, Andrew D.
2011-06-01
There is an increasingly recognized need for people trained in a broad range of applied nuclear science techniques, indicated by reports from the American Physical Society and elsewhere. Anecdotal evidence suggests that opportunities for hands-on training with small particle accelerators have diminished in the US, as development programs established in the 1960's and 1970's have been decommissioned over recent decades. Despite the reduced interest in the use of low energy accelerators in fundamental research, these machines can offer a powerful platform for bringing unique training opportunities to the undergraduate curriculum in nuclear physics, engineering and technology. We report here on the new MSU Applied Nuclear Science Lab, centered around the rebuild of an AN400 electrostatic accelerator. This machine is run entirely by undergraduate students under faculty supervision, allowing a great deal of freedom in its use without restrictions from graduate or external project demands.
Towards a framework of human factors certification of complex human-machine systems
NASA Technical Reports Server (NTRS)
Bukasa, Birgit
1994-01-01
As long as total automation is not realized, the combination of technical and social components in man-machine systems demands contributions not only from engineers but, at least to an equal extent, from behavioral scientists. This has been neglected for far too long. The psychological, social and cultural aspects of technological innovations were almost totally overlooked. Yet, along with expected safety improvements, the institutionalization of human factors is on the way. The introduction of human factors certification of complex man-machine systems will be a milestone in this process.
Five Papers on Human-Machine Interaction.
ERIC Educational Resources Information Center
Norman, Donald A.
Different aspects of human-machine interaction are discussed in the five brief papers that comprise this report. The first paper, "Some Observations on Mental Models," discusses the role of a person's mental model in the interaction with systems. The second paper, "A Psychologist Views Human Processing: Human Errors and Other…
NASA Astrophysics Data System (ADS)
Noor, Ahmed K.
2013-12-01
Some of the recent attempts at improving and transforming engineering education are reviewed. The attempts aim at providing entry-level engineers with the skills needed to address the challenges of future large-scale complex systems and projects. Some of the frontier sectors and future challenges for engineers are outlined. The major characteristics of the coming intelligence-convergence era (the post-information age) are identified. These include the prevalence of smart devices and environments, the widespread application of anticipatory computing and predictive/prescriptive analytics, as well as a symbiotic relationship between humans and machines. Devices and machines will be able to learn from, and with, humans in a natural collaborative way. The recent game changers in learnscapes (learning paradigms, technologies, platforms, spaces, and environments) that can significantly impact engineering education in the coming era are identified. Among these are open educational resources, knowledge-rich classrooms, immersive interactive 3D learning, augmented reality, reverse instruction (the flipped classroom), gamification, robots in the classroom, and adaptive personalized learning. Significant transformative changes in, and mass customization of, learning are envisioned to emerge from the synergistic combination of these game changers and other technologies. The realization of the aforementioned vision requires the development of a new multidisciplinary framework of emergent engineering for relating innovation, complexity and cybernetics within future learning environments. The framework can be used to treat engineering education as a complex adaptive system, with dynamically interacting and communicating components (instructors and individual, small, and large groups of learners). The emergent behavior resulting from the interactions can produce a progressively better, and continuously improving, learning environment.
As a first step towards the realization of the vision, intelligent adaptive cyber-physical ecosystems need to be developed to facilitate collaboration between the various stakeholders of engineering education, and to accelerate the development of a skilled engineering workforce. The major components of the ecosystems include integrated knowledge discovery and exploitation facilities, blended learning and research spaces, novel ultra-intelligent software agents, multimodal and autonomous interfaces, and networked cognitive and tele-presence robots.
NASA Astrophysics Data System (ADS)
Borne, K. D.; Fortson, L.; Gay, P.; Lintott, C.; Raddick, M. J.; Wallin, J.
2009-12-01
The remarkable success of Galaxy Zoo as a citizen science project for galaxy classification within a terascale astronomy data collection has led to the development of a broader collaboration, known as the Zooniverse. Activities will include astronomy, lunar science, solar science, and digital humanities. Some features of our program include development of a unified framework for citizen science projects, development of a common set of user-based research tools, engagement of the machine learning community to apply machine learning algorithms on the rich training data provided by citizen scientists, and extension across multiple research disciplines. The Zooniverse collaboration is just getting started, but already we are implementing a scientifically deep follow-on to Galaxy Zoo. This project, tentatively named Galaxy Merger Zoo, will engage users in running numerical simulations, whose input parameter space is voluminous and therefore demands a clever solution, such as allowing the citizen scientists to select their own sets of parameters, which then trigger new simulations of colliding galaxies. The user interface design has many of the engaging features that retain users, including rapid feedback, visually appealing graphics, and the sense of playing a competitive game for the benefit of science. We will discuss these topics. In addition, we will also describe applications of Citizen Science that are being considered for the petascale science project LSST (Large Synoptic Survey Telescope). LSST will produce a scientific data system that consists of a massive image archive (nearly 100 petabytes) and a similarly massive scientific parameter database (20-40 petabytes). Applications of Citizen Science for such an enormous data collection will enable greater scientific return in at least two ways. 
First, citizen scientists work with real data and perform authentic research tasks of value to the advancement of the science, providing "human computation" capabilities and resources to review, annotate, and explore aspects of the data that are too overwhelming for the science team. Second, citizen scientists' inputs (in the form of rich training data and class labels) can be used to improve the classifiers that the project team uses to classify and prioritize new events detected in the petascale data stream. This talk will review these topics and provide an update on the Zooniverse project.
A collaborative interaction and visualization multi-modal environment for surgical planning.
Foo, Jung Leng; Martinez-Escobar, Marisol; Peloquin, Catherine; Lobe, Thom; Winer, Eliot
2009-01-01
The proliferation of virtual reality visualization and interaction technologies has changed the way medical image data is analyzed and processed. This paper presents a multi-modal environment that combines a virtual reality application with a desktop application for collaborative surgical planning. Both visualization applications can function independently but can also be synced over a network connection for collaborative work. Any change to either application is immediately synced and updated to the other. This is an efficient collaboration tool that allows multiple teams of doctors with only an internet connection to visualize and interact with the same patient data simultaneously. With this multi-modal environment framework, one team working in the VR environment and another team at a remote location working on a desktop machine can collaborate in the examination and discussion of procedures such as diagnosis, surgical planning, teaching and tele-mentoring.
CmapTools: A Software Environment for Knowledge Modeling and Sharing
NASA Technical Reports Server (NTRS)
Canas, Alberto J.
2004-01-01
In an ongoing collaborative effort between a group of NASA Ames scientists and researchers at the Institute for Human and Machine Cognition (IHMC) of the University of West Florida, a new version of CmapTools has been developed that enables scientists to construct knowledge models of their domain of expertise, share them with other scientists, make them available to anybody on the Internet with access to a Web browser, and peer-review other scientists' models. These software tools have been successfully used at NASA to build a large-scale multimedia knowledge model on Mars and a knowledge model on Habitability Assessment. The new version of the software places emphasis on greater usability for experts constructing their own knowledge models, and on support for the creation of large knowledge models with a large number of supporting resources in the form of images, videos, web pages, and other media. Additionally, the software currently allows scientists to cooperate with each other in the construction, sharing, and critiquing of knowledge models. Scientists collaborating from remote locations, for example researchers at the Astrobiology Institute, can concurrently manipulate the knowledge models they are viewing without having to do so at a special videoconferencing facility.
Supporting Regularized Logistic Regression Privately and Efficiently.
Li, Wenfa; Liu, Hongzhe; Yang, Peng; Xie, Wei
2016-01-01
As one of the most popular statistical and machine learning models, logistic regression with regularization has found wide adoption in biomedicine, the social sciences, information technology, and beyond. These domains often involve data on human subjects that are subject to strict privacy regulations. Concerns over data privacy make it increasingly difficult to coordinate and conduct large-scale collaborative studies, which typically rely on cross-institution data sharing and joint analysis. Our work focuses on safeguarding regularized logistic regression, a widely used statistical model that has not yet been investigated from a data security and privacy perspective. We consider a common use scenario of multi-institution collaborative studies, such as research consortia or networks, as widely seen in genetics, epidemiology, the social sciences, etc. To make our privacy-enhancing solution practical, we demonstrate a non-conventional and computationally efficient method leveraging distributed computing and strong cryptography to provide comprehensive protection over individual-level and summary data. Extensive empirical evaluations on several studies validate the privacy guarantee, efficiency, and scalability of our proposal. We also discuss the practical implications of our solution for large-scale studies and applications in various disciplines, including genetic and biomedical studies, smart grid, and network analysis.
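The abstract does not give implementation details of the model being protected; as background, here is a minimal NumPy sketch of the underlying technique, L2-regularized logistic regression fit by gradient descent. All names and the toy data are illustrative, not from the study, and the privacy layer is not reproduced here.

```python
import numpy as np

def fit_l2_logistic(X, y, lam=0.1, lr=0.1, iters=2000):
    """Fit w for P(y=1|x) = sigmoid(X @ w) with L2 penalty (lam/2)*||w||^2."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ w))    # predicted probabilities
        grad = X.T @ (p - y) / n + lam * w  # logistic-loss gradient + L2 term
        w -= lr * grad
    return w

# Toy example: two well-separated clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
w = fit_l2_logistic(X, y)
acc = np.mean((1 / (1 + np.exp(-X @ w)) > 0.5) == y)
```

In the multi-institution setting the paper targets, the gradient sum over `n` records is what would be computed collaboratively rather than on pooled raw data.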
Application of free energy minimization to the design of adaptive multi-agent teams
NASA Astrophysics Data System (ADS)
Levchuk, Georgiy; Pattipati, Krishna; Fouse, Adam; Serfaty, Daniel
2017-05-01
Many novel DoD missions, from disaster relief to cyber reconnaissance, require teams of humans and machines with diverse capabilities. Current solutions do not account for the heterogeneity of agent capabilities, the uncertainty of team knowledge, or the dynamics of and dependencies between tasks and agent roles, resulting in brittle teams. Most importantly, state-of-the-art team design solutions are either centralized, imposing role and relation assignments onto agents, or completely distributed, suitable only for homogeneous organizations such as swarms. Centralized design models cannot provide insight into a team's self-organization, i.e., adapting the team structure over time in a distributed, collaborative manner by team members with diverse expertise and responsibilities. In this paper we present an information-theoretic formalization of team composition and structure adaptation based on minimization of variational free energy. The structure adaptation is obtained in an iterative, distributed, and collaborative manner without the need for centralized control. We show that our model is lightweight and predictive, and produces team structures that theoretically approximate an optimal policy for team adaptation. Our model also provides a unique coupling between structure and action policy, and captures three essential processes: learning, perception, and control.
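The authors' model is not reproduced in the abstract; as a toy illustration of the free-energy principle it invokes, the sketch below assumes a known per-agent role-cost matrix (invented here) and shows that the Boltzmann soft assignment attains lower free energy F = E_q[cost] - T*H(q) than a uniform assignment.

```python
import numpy as np

def adapt_assignment(cost, temp=1.0):
    """Soft agent-to-role assignment q (one row per agent).
    The unconstrained row-wise minimizer of F = E_q[cost] - temp*H(q)
    is the Boltzmann distribution over roles."""
    q = np.exp(-cost / temp)
    return q / q.sum(axis=1, keepdims=True)

def free_energy(q, cost, temp=1.0):
    """Variational free energy: expected cost minus temperature-weighted entropy."""
    entropy = -np.sum(q * np.log(q + 1e-12))
    return np.sum(q * cost) - temp * entropy

# Two agents, two roles: each agent has a lower cost for one role.
cost = np.array([[1.0, 2.0],
                 [2.0, 1.0]])
q = adapt_assignment(cost)  # each row leans toward the cheaper role
```

Because each row is updated independently, such an update can be run by each agent locally, which loosely mirrors the distributed adaptation the paper describes.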
Work Term Assignment Spring 2017
NASA Technical Reports Server (NTRS)
Sico, Mallory
2017-01-01
My tour in the Engineering Robotics directorate exceeded my expectations. I learned lessons about Creo, manufacturing and assembly, collaboration, and troubleshooting. During my first tour, last spring, I used Creo on a smaller project, but had limited experience with it before starting in the Dynamic Systems Test branch this spring. I gained valuable experience learning assembly design, sheet metal design, and designing with intent for manufacturing and assembly. These skills came from working both on the hatch and the floor. I also learned to understand the intent of other designers on models I worked with. While redesigning the floor, I was modifying an existing part and worked to understand what the previous designer had done to make it fit with the new model. Through working with the machine shop and in the mock-up, I learned much more about manufacturing and assembly. I used a Dremel, rivet gun, belt sander, and countersink for the first time. Through taking multiple safety training courses for different machine shops, I learned safety skills specific to each one. This semester also gave me new collaborative opportunities. I collaborated with engineers within my branch as well as with Human Factors and the building 10 machine shop. This experience helped me learn how to design for functionality and assembly, not only for what would be easiest in my designs. In addition to these experiences, I learned many lessons in troubleshooting. I was the first person in my office to use a Windows 10 computer. This caused unexpected issues with NASA services and programs, such as the Digital Data Management Server (DDMS). Because of this, I gained experience finding solutions to lockout and freeze issues as well as Creo-specific settings. These will be useful skills to have in the future and will be implemented in future rotations. This co-op tour has motivated me more to finish my degree and pursue my academic goals. 
I intend to take a machining Career Gateway Elective in the Fall to improve my skills in building as well as designing with manufacturing intent. I am also inspired to take more mechatronics CGE courses before I graduate to learn more about the crossover between mechanical and electrical engineering. This semester, I worked on multiple projects and had the opportunity to learn from engineers of different disciplines. I became proficient in Creo 2.0, a program I had not used significantly before. I finished modifying the hatch for 3D print and made sizeable modifications to the nose floor support design. I also gained hands-on experience that will be useful in my engineering career in the future. I would consider all of these major achievements from this spring semester. Lastly, I learned to ask more questions and to search for the right people to find answers, which I know will be a valuable skill in the future. This summer, I will be completing my last rotation in the Flight Operations Directorate at Ellington Air Field. After this last tour, I will be returning to school for the Fall, Spring, and Summer. I will graduate in August of 2018. I am looking forward to learning more about the different jobs available to engineers. This division works directly with many different types of aircraft and I am excited to learn more about this focus of engineering. The Aircraft Operations Division values teamwork greatly and I intend to improve my interpersonal skills by working with them this summer.
Mistaking minds and machines: How speech affects dehumanization and anthropomorphism.
Schroeder, Juliana; Epley, Nicholas
2016-11-01
Treating a human mind like a machine is an essential component of dehumanization, whereas attributing a humanlike mind to a machine is an essential component of anthropomorphism. Here we tested how a cue closely connected to a person's actual mental experience, a humanlike voice, affects the likelihood of mistaking a person for a machine, or a machine for a person. We predicted that paralinguistic cues in speech are particularly likely to convey the presence of a humanlike mind, such that removing voice from communication (leaving only text) would increase the likelihood of mistaking the text's creator for a machine. Conversely, adding voice to a computer-generated script (resulting in speech) would increase the likelihood of mistaking the text's creator for a human. Four experiments confirmed these hypotheses, demonstrating that people are more likely to infer a human (vs. computer) creator when they hear a voice expressing thoughts than when they read the same thoughts in text. Adding human visual cues to text (i.e., seeing a person perform a script in a subtitled video clip) did not increase the likelihood of inferring a human creator compared with only reading text, suggesting that defining features of personhood may be conveyed more clearly in speech (Experiments 1 and 2). Removing the naturalistic paralinguistic cues that convey humanlike capacity for thinking and feeling, such as varied pace and intonation, eliminates the humanizing effect of speech (Experiment 4). We discuss implications for dehumanizing others through text-based media, and for anthropomorphizing machines through speech-based media. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Vargas, Mara Ambrosina de O; Meyer, Dagmar Estermann
2005-06-01
This study discusses the human being-machine relationship in the process called the "cyborgzation" of the nurse who works in intensive care, based on post-structuralist Cultural Studies and highlighting Haraway's concept of the cyborg. In it, manuals used by nurses in Intensive Care Units have been examined as cultural texts. This cultural analysis tries to decode the various senses of "human" and "machine", with the aim of recognizing processes that turn nurses into cyborgs. The argument is that intensive care nurses fall into a process of "technology embodiment" that turns the body-professional into a hybrid, one that makes it possible to disqualify, at the same time, notions such as the machine and the body "proper", since it is the hybridization between one and the other that counts there. Like cyborgs, intensive care nurses learn to "be with" the machine, and this connection shapes the specificity of their actions. It is suggested that processes of "cyborgzation" such as this are useful for questioning - and for dealing with in different ways - the senses of "human" and "humanity" that support a major part of knowledge/action in health.
Contrasting State-of-the-Art in the Machine Scoring of Short-Form Constructed Responses
ERIC Educational Resources Information Center
Shermis, Mark D.
2015-01-01
This study compared short-form constructed responses evaluated by both human raters and machine scoring algorithms. The context was a public competition in which both public competitors and commercial vendors vied to develop machine scoring algorithms that would match or exceed the performance of operational human raters in a summative high-stakes…
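Agreement between human raters and machine scoring on ordinal score scales is commonly summarized with quadratic weighted kappa. The sketch below is a minimal illustrative implementation of that standard metric, not the competition's official scoring code.

```python
import numpy as np

def quadratic_weighted_kappa(a, b, n_classes):
    """Chance-corrected agreement between two raters on ordinal scores
    0..n_classes-1; 1.0 = perfect agreement, 0.0 = chance level."""
    a, b = np.asarray(a), np.asarray(b)
    O = np.zeros((n_classes, n_classes))           # observed score matrix
    for i, j in zip(a, b):
        O[i, j] += 1
    # Quadratic disagreement weights, largest for distant score pairs.
    w = np.array([[(i - j) ** 2 for j in range(n_classes)]
                  for i in range(n_classes)]) / (n_classes - 1) ** 2
    # Expected matrix under independent marginals.
    E = np.outer(np.bincount(a, minlength=n_classes),
                 np.bincount(b, minlength=n_classes)) / len(a)
    return 1.0 - (w * O).sum() / (w * E).sum()

print(quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 2, 2], 3))  # 1.0
```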
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwantes, Jon M.; Marsden, Oliva; Pellegrini, Kristi L.
2016-09-16
The Nuclear Forensics International Technical Working Group (ITWG) recently completed its fourth Collaborative Materials Exercise (CMX-4) in the 21-year history of the Group. This was also the largest materials exercise to date, with participating laboratories from 16 countries or international organizations. Moreover, exercise samples (including three separate samples of low enriched uranium oxide) were shipped as part of an illicit trafficking scenario, for which each laboratory was asked to conduct nuclear forensic analyses in support of a fictitious criminal investigation. In all, over 30 analytical techniques were applied to characterize exercise materials, ten of which were applied to ITWG exercises for the first time. An objective review of the state of practice and emerging applications of analytical techniques for nuclear forensic analysis, based upon the outcome of this most recent exercise, is provided.
Crandall, Jacob W; Oudah, Mayada; Tennom; Ishowo-Oloko, Fatimah; Abdallah, Sherief; Bonnefon, Jean-François; Cebrian, Manuel; Shariff, Azim; Goodrich, Michael A; Rahwan, Iyad
2018-01-16
Since Alan Turing envisioned artificial intelligence, technical progress has often been measured by the ability to defeat humans in zero-sum encounters (e.g., Chess, Poker, or Go). Less attention has been given to scenarios in which human-machine cooperation is beneficial but non-trivial, such as scenarios in which human and machine preferences are neither fully aligned nor fully in conflict. Cooperation does not require sheer computational power, but instead is facilitated by intuition, cultural norms, emotions, signals, and pre-evolved dispositions. Here, we develop an algorithm that combines a state-of-the-art reinforcement-learning algorithm with mechanisms for signaling. We show that this algorithm can cooperate with people and other algorithms at levels that rival human cooperation in a variety of two-player repeated stochastic games. These results indicate that general human-machine cooperation is achievable using a non-trivial, but ultimately simple, set of algorithmic mechanisms.
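The paper's algorithm (reinforcement learning combined with signaling) is not reproduced in the abstract; as a minimal classical illustration of how simple mechanisms can sustain cooperation in a repeated game, consider tit-for-tat in an iterated prisoner's dilemma (payoff values are the standard textbook ones, not the paper's games):

```python
def play_ipd(strat_a, strat_b, rounds=100):
    """Iterated prisoner's dilemma. Actions: 'C' cooperate, 'D' defect.
    Payoffs (row, col): CC->(3,3), CD->(0,5), DC->(5,0), DD->(1,1)."""
    payoff = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
              ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        # Each strategy sees only the opponent's history.
        a, b = strat_a(hist_b), strat_b(hist_a)
        pa, pb = payoff[(a, b)]
        score_a += pa
        score_b += pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

tit_for_tat = lambda opp_hist: 'C' if not opp_hist else opp_hist[-1]
always_defect = lambda opp_hist: 'D'

# Two tit-for-tat players sustain mutual cooperation for all 100 rounds.
print(play_ipd(tit_for_tat, tit_for_tat))  # (300, 300)
```

Tit-for-tat's opening cooperation acts as an implicit signal of intent; the paper's contribution is, roughly, to make such signaling explicit and to learn the underlying policy, in far richer stochastic games.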
NASA Astrophysics Data System (ADS)
Stone, N.; Lafuente, B.; Bristow, T.; Keller, R.; Downs, R. T.; Blake, D. F.; Fonda, M.; Pires, A.
2016-12-01
Working primarily with astrobiology researchers at NASA Ames, the Open Data Repository (ODR) has been conducting a software pilot to meet the varying needs of this multidisciplinary community. Astrobiology researchers often have small communities or operate individually with unique data sets that don't easily fit into existing database structures. The ODR constructed its Data Publisher software to allow researchers to create databases with common metadata structures and subsequently extend them to meet their individual needs and data requirements. The software accomplishes these tasks through a web-based interface that allows collaborative creation and revision of common metadata templates and individual extensions to these templates for custom data sets. This allows researchers to search disparate datasets based on common metadata established through the metadata tools, but still facilitates distinct analyses and data that may be stored alongside the required common metadata. The software produces web pages that can be made publicly available at the researcher's discretion so that users may search and browse the data in an effort to make interoperability and data discovery a human-friendly task while also providing semantic data for machine-based discovery. Once relevant data has been identified, researchers can utilize the built-in application programming interface (API) that exposes the data for machine-based consumption and integration with existing data analysis tools (e.g. R, MATLAB, Project Jupyter - http://jupyter.org). The current evolution of the project has created the Astrobiology Habitable Environments Database (AHED)[1] which provides an interface to databases connected through a common metadata core. 
In the next project phase, the goal is for small research teams and groups to be self-sufficient in publishing their research data to meet funding mandates and academic requirements as well as fostering increased data discovery and interoperability through human-readable and machine-readable interfaces. This project is supported by the Science-Enabling Research Activity (SERA) and NASA NNX11AP82A, MSL. [1] B. Lafuente et al. (2016) AGU, submitted.
Information, knowledge and the future of machines.
MacFarlane, Alistair G J
2003-08-15
This wide-ranging survey considers the future of machines in terms of information, complexity and the growth of knowledge shared amongst agents. Mechanical and human agents are compared and contrasted, and it is argued that, for the foreseeable future, their roles will be complementary. The future development of machines is examined in terms of unions of human and machine agency evolving as part of economic activity. Limits to, and threats posed by, the continuing evolution of such a society of agency are considered.
Modelling of human-machine interaction in equipment design of manufacturing cells
NASA Astrophysics Data System (ADS)
Cochran, David S.; Arinez, Jorge F.; Collins, Micah T.; Bi, Zhuming
2017-08-01
This paper proposes a systematic approach to model human-machine interactions (HMIs) in supervisory control of machining operations; it characterises the coexistence of machines and humans for an enterprise to balance the goals of automation/productivity and flexibility/agility. In the proposed HMI model, an operator is associated with a set of behavioural roles as a supervisor for multiple, semi-automated manufacturing processes. The model is innovative in the sense that (1) it represents an HMI based on its functions for process control but provides the flexibility for ongoing improvements in the execution of manufacturing processes; (2) it provides a computational tool to define functional requirements for an operator in HMIs. The proposed model can be used to design production systems at different levels of an enterprise architecture, particularly at the machine level in a production system where operators interact with semi-automation to accomplish the goal of 'autonomation' - automation that augments the capabilities of human beings.
A journey from nuclear criticality methods to high energy density radflow experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Urbatsch, Todd James
Los Alamos National Laboratory is a nuclear weapons laboratory supporting our nation's defense. In support of this mission is a high energy-density physics program in which we design and execute experiments to study radiation-hydrodynamics phenomena and improve the predictive capability of our large-scale multi-physics software codes on our big-iron computers. The Radflow project's main experimental effort now is to understand why we haven't been able to predict opacities on Sandia National Laboratory's Z-machine. We are modeling an increasing fraction of the Z-machine's dynamic hohlraum to find multi-physics explanations for the experimental results. Further, we are building an entirely different opacity platform on Lawrence Livermore National Laboratory's National Ignition Facility (NIF), which is set to get results early in 2017. Will the results match our predictions, match the Z-machine, or give us something entirely different? The new platform brings new challenges such as designing hohlraums and spectrometers. The speaker will recount his history, starting with one-dimensional Monte Carlo nuclear criticality methods in graduate school, radiative transfer methods research and software development for his first 16 years at LANL, and, now, radflow technology and experiments. Who knew that the real world was more than just radiation transport? Experiments aren't easy, but they sure are fun.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffiths, Grant; Keegan, E.; Young, E.
2018-01-06
Physical characterization is one of the broadest and most important categories of techniques to apply in a nuclear forensic examination. Physical characterization techniques vary from simple weighing and dimensional measurements to complex sample preparation and scanning electron microscopy-electron backscatter diffraction analysis. This paper reports on the physical characterization conducted by several international laboratories participating in the fourth Collaborative Materials Exercise, organized by the Nuclear Forensics International Technical Working Group. Methods include a range of physical measurements, microscopy-based observations, and profilometry. In conclusion, the value of these results for addressing key investigative questions concerning two uranium dioxide pellets and a uranium dioxide powder is discussed.
Dual linear structured support vector machine tracking method via scale correlation filter
NASA Astrophysics Data System (ADS)
Li, Weisheng; Chen, Yanquan; Xiao, Bin; Feng, Chen
2018-01-01
Adaptive tracking-by-detection methods based on the structured support vector machine (SVM) have performed well on recent visual tracking benchmarks. However, these methods did not adopt an effective strategy for object scale estimation, which limits overall tracking performance. We present a tracking method based on a dual linear structured support vector machine (DLSSVM) with a discriminative scale correlation filter. The collaborative tracker, comprising a DLSSVM model and a scale correlation filter, obtains good results in tracking target position and estimating scale. The fast Fourier transform is applied for detection. Extensive experiments show that our tracking approach outperforms many popular top-ranking trackers. On a benchmark of 100 challenging video sequences, the average precision of the proposed method is 82.8%.
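The DLSSVM tracker itself is not spelled out in the abstract; as background on the correlation-filter half of the method, here is a 1-D MOSSE-style sketch (illustrative only, with made-up signals) showing how a filter trained in the Fourier domain produces a response whose peak tracks the target's displacement.

```python
import numpy as np

def train_filter(f, g, lam=1e-2):
    """MOSSE-style correlation filter (1-D sketch): learn a filter whose
    correlation with training signal f reproduces the desired response g.
    Returns the filter's conjugate in the Fourier domain; lam regularizes."""
    F, G = np.fft.fft(f), np.fft.fft(g)
    return G * np.conj(F) / (F * np.conj(F) + lam)

def respond(a_hat, z):
    """Correlation response of the trained filter on a new signal z."""
    return np.real(np.fft.ifft(np.fft.fft(z) * a_hat))

# Train on a Gaussian bump at position 20; desire a sharp peak there.
n = 64
x = np.arange(n)
f = np.exp(-0.5 * ((x - 20) / 3.0) ** 2)
g = np.exp(-0.5 * ((x - 20) / 2.0) ** 2)
a_hat = train_filter(f, g)

# Shift-equivariance: a shifted target yields a shifted response peak,
# which is what makes displacement (and, per scale channel, size)
# estimation with correlation filters possible.
peak = int(np.argmax(respond(a_hat, np.roll(f, 10))))  # ~30
```

A scale correlation filter applies the same idea across a bank of resampled target patches, picking the scale whose response is strongest.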
Designing Contestability: Interaction Design, Machine Learning, and Mental Health
Hirsch, Tad; Merced, Kritzia; Narayanan, Shrikanth; Imel, Zac E.; Atkins, David C.
2017-01-01
We describe the design of an automated assessment and training tool for psychotherapists to illustrate challenges with creating interactive machine learning (ML) systems, particularly in contexts where human life, livelihood, and wellbeing are at stake. We explore how existing theories of interaction design and machine learning apply to the psychotherapy context, and identify “contestability” as a new principle for designing systems that evaluate human behavior. Finally, we offer several strategies for making ML systems more accountable to human actors. PMID:28890949
NASA Technical Reports Server (NTRS)
Warren, Wayne H., Jr.; Ochsenbein, Francois; Rappaport, Barry N.
1990-01-01
The entire series of Durchmusterung (DM) catalogs (Bonner, Southern, Cordoba, Cape Photographic) has been computerized through a collaborative effort among institutions and individuals in France and the United States of America. Complete verification of the data, both manually and by computer, the inclusion of all supplemental stars (represented by lower case letters), complete representation of all numerical data, and a consistent format for all catalogs, should make this collection of machine-readable data a valuable addition to digitized astronomical archives.
Reconstructing the Antikythera Mechanism
NASA Astrophysics Data System (ADS)
Freeth, Tony
The Antikythera Mechanism is a geared astronomical calculating machine from ancient Greece. The extraordinary nature of this device has become even more apparent in recent years as a result of research under the aegis of the Antikythera Mechanism Research Project (AMRP) - an international collaboration of scientists, historians, museum staff, engineers, and imaging specialists. Though many questions still remain, we may now be close to reconstructing the complete machine. As a technological artifact, it is unique in the ancient world. Its brilliant design conception means that it is a landmark in the history of science and technology.
NASA Astrophysics Data System (ADS)
Civitarese, O.; Suhonen, J.; Zuber, K.
2015-09-01
The extension of the Standard Model of electroweak interactions, to accommodate massive neutrinos and/or right-handed currents, is one of the fundamental questions to answer in the cross-field of particle and nuclear physics. The consequences of such extensions would reflect upon nuclear decays, like the very exotic nuclear double-beta-decay, as well as upon high-energy proton-proton reactions of the type performed at the LHC accelerator. In this talk we shall address this question by looking at the results reported by the ATLAS and CMS collaborations, where the excitation and decay of a heavy-mass boson may be mediated by a heavy-mass neutrino in proton-proton reactions leading to two jets and two leptons, and by extracting limits on the left-right mixing, from the latest measurements of nuclear-double-beta decays reported by the GERDA and EXO collaborations.
Sward, Katherine A; Newth, Christopher JL; Khemani, Robinder G; Cryer, Martin E; Thelen, Julie L; Enriquez, Rene; Shaoyu, Su; Pollack, Murray M; Harrison, Rick E; Meert, Kathleen L; Berg, Robert A; Wessel, David L; Shanley, Thomas P; Dalton, Heidi; Carcillo, Joseph; Jenkins, Tammara L; Dean, J Michael
2015-01-01
Objectives To examine the feasibility of deploying a virtual web service for sharing data within a research network, and to evaluate the impact on data consistency and quality. Material and Methods Virtual machines (VMs) encapsulated an open-source, semantically and syntactically interoperable secure web service infrastructure along with a shadow database. The VMs were deployed to 8 Collaborative Pediatric Critical Care Research Network Clinical Centers. Results Virtual web services could be deployed in hours. The interoperability of the web services reduced format misalignment from 56% to 1% and demonstrated that 99% of the data consistently transferred using the data dictionary and 1% needed human curation. Conclusions Use of virtualized open-source secure web service technology could enable direct electronic abstraction of data from hospital databases for research purposes. PMID:25796596
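The data-dictionary mechanism credited above with reducing format misalignment can be illustrated with a minimal type check before transfer; the field names and dictionary below are invented for illustration and are not the network's actual schema.

```python
# Hypothetical data dictionary: each field's value must parse to the
# declared type, mimicking what an interoperable service layer enforces.
DATA_DICT = {"age_months": int, "weight_kg": float, "sex": str}

def validate(record):
    """Return a list of dictionary violations for one record."""
    errors = []
    for field, typ in DATA_DICT.items():
        if field not in record:
            errors.append(f"missing: {field}")
            continue
        try:
            typ(record[field])  # value must coerce to the declared type
        except (TypeError, ValueError):
            errors.append(f"bad type: {field}")
    return errors

print(validate({"age_months": "7", "weight_kg": "6.4", "sex": "F"}))  # []
```

Enforcing such checks at the service boundary, rather than in each site's abstraction scripts, is what lets most data transfer consistently with only a small residue needing human curation.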
NASA Astrophysics Data System (ADS)
Wyborn, L. A.
2007-12-01
The Information Age in Science is being driven partly by the data deluge, as exponentially growing volumes of data are generated by research. Such large volumes of data cannot be effectively processed by humans, and efficient and timely processing by computers requires development of specific machine-readable formats. Further, as key challenges in the earth and space sciences, such as climate change, hazard prediction, and sustainable resource development, require a cross-disciplinary approach, data from various domains will need to be integrated from globally distributed sources, also via machine-to-machine formats. However, it is becoming increasingly apparent that existing standards can be very domain specific, and most existing data transfer formats require human intervention. Where groups from different communities do try to combine data across domain/discipline boundaries, much time is spent reformatting and reorganizing the data; it is conservatively estimated that this can take 80% of a project's time and resources. Four different types of standards are required for machine-to-machine interaction: systems, syntactic, schematic, and semantic. Standards at the systems level (WMS, WFS, etc.) and at the syntactic level (GML, Observations and Measurements, SensorML) are being developed through international standards bodies such as ISO, OGC, W3C, IEEE, etc. In contrast, standards at the schematic level (e.g., GeoSciML, LandslidesML, WaterML, QuakeML) and at the semantic level (i.e., ontologies and vocabularies) are currently developing rapidly, in a very uncoordinated way and with little governance. As the size of the community that can machine-read each other's data depends on the size of the community that has developed the schematic or semantic standards, it is essential that, to achieve global integration of earth and space science data, the required standards be developed through international collaboration using accepted standard procedures. 
Once developed, the standards also require some form of governance to maintain and then extend them as the science evolves to meet new challenges. One standard that does have governance is GeoSciML, a data transfer standard for geoscience map data. GeoSciML is currently being developed by a consortium of 7 countries under the auspices of the Commission for the Management and Application of Geoscience Information (CGI), a commission of the International Union of Geological Sciences. Perhaps other `ML' or ontology and vocabulary development `teams' need to look to their international domain-specific specialty societies for endorsement and governance. But the issue goes beyond the Earth and Space Sciences, as cross- and intra-disciplinary science increasingly requires machine-to-machine interaction with other science disciplines such as physics, chemistry, and astronomy. For example, for geochemistry, do we develop a GeochemistryML or do we extend the existing Chemical Markup Language? Again, the question is who will coordinate the development of the required schematic and semantic standards that underpin machine-to-machine global integration of science data. Is this a role for ICSU, for CODATA, or for some other body? In order to address this issue, Geoscience Australia and CSIRO established the Solid Earth and Environmental Grid Community website to enable communities to `advertise' standards development and to provide a community TWiki where standards can be developed in a globally `open' environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cristy, S.S.; Bennett, R.K. Jr.; Dillon, J.J.
1986-12-31
The use of perchloroethylene (perc) as an ingredient in coolants for machining enriched uranium at the Oak Ridge Y-12 Plant has been discontinued because of environmental concerns. A new coolant was substituted in December 1985, consisting of an aqueous solution of propylene glycol with borax (sodium tetraborate) added as a nuclear poison and a nitrite added as a corrosion inhibitor. Uranium surfaces machined using the two coolants were compared with respect to residual contamination and corrosion or corrosion potential, and the surface condition obtained with the aqueous propylene glycol-borax coolant was found to be better than that of enriched uranium machined with the perc-mineral oil coolant. The boron residues on the final-finished parts machined with the borax-containing coolant were not sufficient to cause problems in further processing. All evidence indicated that the enriched uranium surfaces machined with the borax-containing coolant will be as satisfactory as those machined with the perc coolant.
NASA Astrophysics Data System (ADS)
Lingadurai, K.; Nagasivamuni, B.; Muthu Kamatchi, M.; Palavesam, J.
2012-06-01
Wire electrical discharge machining (WEDM) is a specialized thermal machining process capable of accurately machining parts of hard materials with complex shapes. Parts having sharp edges that pose difficulties for mainstream machining processes can be easily machined by the WEDM process. In this work, a Design of Experiments (DOE) approach is reported for stainless steel AISI grade 304, which is used in cryogenic vessels, evaporators, hospital surgical equipment, marine equipment, fasteners, nuclear vessels, feed water tubing, valves, refrigeration equipment, etc., machined by WEDM with a brass wire electrode. The DOE method is used to formulate the experimental layout, to analyze the effect of each parameter on the machining characteristics, and to predict the optimal choice for each WEDM parameter, such as voltage, pulse-on time, pulse-off time and wire feed. It is found that these parameters have a significant influence on machining characteristics such as metal removal rate (MRR), kerf width and surface roughness (SR). The analysis of the DOE reveals that, in general, the pulse-on time significantly affects the kerf width and the wire feed rate affects SR, while the input voltage mainly affects the MRR.
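The main-effects analysis at the heart of such a DOE study can be sketched in a few lines; the two-level factor codes and MRR responses below are invented for illustration, not the paper's measurements:

```python
# Toy two-level factorial layout: each run records coded factor levels
# (-1 = low, +1 = high) for voltage and pulse-on time, plus the response.
runs = [
    # (voltage, pulse_on, mrr)
    (-1, -1, 4.0),
    (+1, -1, 7.0),
    (-1, +1, 5.0),
    (+1, +1, 9.0),
]

def main_effect(runs, idx):
    """Average response at the factor's high level minus the average
    at its low level -- the standard DOE main-effect estimate."""
    hi = [r[-1] for r in runs if r[idx] == +1]
    lo = [r[-1] for r in runs if r[idx] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

voltage_effect = main_effect(runs, 0)   # 3.5 with these toy numbers
pulse_on_effect = main_effect(runs, 1)  # 1.5 with these toy numbers
```

Ranking the factors by the magnitude of these effects is what lets the analysis attribute MRR mainly to voltage, kerf width to pulse-on time, and so on.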
The challenges of leadership in the modern world: introduction to the special issue.
Bennis, Warren
2007-01-01
This article surveys contemporary trends in leadership theory as well as its current status and the social context that has shaped the contours of leadership studies. Emphasis is placed on the urgent need for collaboration among social-neuro-cognitive scientists in order to achieve an integrated theory, and the author points to promising leads for accomplishing this. He also asserts that the 4 major threats to world stability are a nuclear/biological catastrophe, a world-wide pandemic, tribalism, and the leadership of human institutions. Without exemplary leadership, solving the problems stemming from the first 3 threats will be impossible. ((c) 2007 APA, all rights reserved)
Concept and design philosophy of a person-accompanying robot
NASA Astrophysics Data System (ADS)
Mizoguchi, Hiroshi; Shigehara, Takaomi; Goto, Yoshiyasu; Hidai, Ken-ichi; Mishima, Taketoshi
1999-01-01
This paper proposes a person-accompanying robot as a novel human-collaborative robot. The person-accompanying robot is a legged mobile robot that can follow a person using its vision. Towards the future aging society, human collaboration and human support are required as novel applications of robots. Such human-collaborative robots share the same space with humans, but conventional robots are isolated from humans and lack the capability to observe them. To collaborate with and support humans properly, a human-collaborative robot must be able to observe and recognize humans; study of the human-observing function of robots is therefore crucial to realizing novel robots such as service and pet robots. The authors are currently implementing a prototype of the proposed accompanying robot. As a basis for the human-observing function of the prototype robot, we have realized face tracking using skin-color extraction and correlation-based tracking. We have also developed a method for the robot to pick up human voice clearly and remotely by using microphone arrays. Results of these preliminary studies suggest the feasibility of the proposed robot.
Can machines think? A report on Turing test experiments at the Royal Society
NASA Astrophysics Data System (ADS)
Warwick, Kevin; Shah, Huma
2016-11-01
In this article we consider transcripts that originated from a practical series of Turing's Imitation Game that was held on 6 and 7 June 2014 at the Royal Society London. In all cases the tests involved a three-participant simultaneous comparison by an interrogator of two hidden entities, one being a human and the other a machine. Each of the transcripts considered here resulted in a human interrogator being fooled such that they could not make the 'right identification', that is, they could not say for certain which was the machine and which was the human. The transcripts presented all involve one machine only, namely 'Eugene Goostman', the result being that the machine became the first to pass the Turing test, as set out by Alan Turing, on unrestricted conversation. This is the first time that results from the Royal Society tests have been disclosed and discussed in a paper.
Application of Robotics in Decommissioning and Decontamination - 12536
DOE Office of Scientific and Technical Information (OSTI.GOV)
Banford, Anthony; Kuo, Jeffrey A.; Bowen, R.A.
Decommissioning and dismantling of nuclear facilities is a significant challenge worldwide, and one which is growing as more plants reach the end of their operational lives. The strategy chosen for individual projects varies from a hands-on approach with significant manual intervention using traditional demolition equipment at one extreme to bespoke, highly engineered robotic solutions at the other. The degree of manual intervention is limited by the hazards and risks involved, which in some plants are unacceptable. Robotic remote engineering is often viewed as more expensive and less reliable than manual approaches, with significant lead times and capital expenditure. However, advances in robotics and automation in other industries offer potential benefits for future decommissioning activities, with a high probability of reducing worker exposure and other safety risks as well as reducing the schedule and costs required to complete these activities. Some nuclear decommissioning tasks and facility environments are so hazardous that they can only be accomplished by exclusive use of robotic and remote intervention. Less hazardous tasks can be accomplished by manual intervention and the use of PPE. However, PPE greatly decreases worker productivity and still exposes the worker to both risk and dose, making remote operation preferable to achieve ALARP. Before remote operations can be widely accepted and deployed, there are economic and technological challenges that must be addressed. These challenges will require long-term investment commitments in order for technology to be: - Specifically developed for nuclear applications; - At a sufficient TRL for practical deployment; - Readily available as COTS. Tremendous opportunities exist to reduce cost and schedule and improve safety in D and D activities through the use of robotic and/or tele-operated systems. - Increasing the level of remote intervention reduces the risk and dose to an operator.
Better environmental information identifies hazards, which can be assessed, managed and mitigated. - Tele-autonomous control in a congested, unstructured environment is more reliable than direct human operation. Advances in human-machine interfaces contribute to reliability and task optimization. Use of standardized dexterous manipulators and COTS, including standardized communication protocols, reduces project time scales. - The technologies identified, if developed to a sufficient TRL, would all contribute to cost reductions. Additionally, optimizing a project's position on a Remote Intervention Scale, a Bespoke Equipment Scale and a Tele-autonomy Scale would provide cost reductions from the start of a project. Of the technologies identified, tele-autonomy is arguably the most significant, because it would provide a fundamental positive change for robotic control in the nuclear industry. The challenge for technology developers is to develop versatile robotic technology that can be economically deployed to a wide range of future D and D projects and industrial sectors. The challenge for facility owners and project managers is to partner with the developers to provide accurate system requirements and an open and receptive environment for testing and deployment. To facilitate this development and deployment effort, the NNL and DOE have initiated discussions to explore a collaborative R and D program that would accelerate development and support the optimum utilization of resources. (authors)
Absorption of language concepts in the machine mind
NASA Astrophysics Data System (ADS)
Kollár, Ján
2016-06-01
In our approach, the machine mind is an applicative dynamic system represented by its algorithmically evolvable internal language. In other words, the mind and the language of the mind are synonyms. Starting from Shaumyan's semiotic theory of languages, we present the representation of language concepts in the machine mind as a result of our experiment, to show the non-redundancy of the language of mind. To provide a useful restriction for further research, we also introduce the hypothesis of semantic saturation in computer-computer communication, which indicates that a set of machines is not self-evolvable. The goal of our research is to increase the abstraction of human-computer and computer-computer communication. If we want humans and machines to communicate as a parent does with a child, using different symbols and media, we must find a language of mind commonly usable by both machines and humans. In our opinion, there exists a kind of calm language of thinking, which we try to propose for machines in this paper. We separate the layers of a machine mind, present the structure of the evolved mind and discuss selected properties. We concentrate on the representation of symbolized concepts in the mind that are languages, not just grammars, since they have meaning.
A human-machine cooperation route planning method based on improved A* algorithm
NASA Astrophysics Data System (ADS)
Zhang, Zhengsheng; Cai, Chao
2011-12-01
To avoid the limitation of common route planning methods that blindly pursue greater machine intelligence and autonomy, this paper presents a human-machine cooperation route planning method. The proposed method includes a new A* path searching strategy based on dynamic heuristic searching and a human-cooperated decision strategy to prune the search area. It can overcome the tendency of the A* algorithm to fall into prolonged local searching. Experiments showed that this method can quickly plan a feasible route that meets macro-level policy thinking.
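A minimal sketch of the cooperation idea: plain A* on a 4-connected grid, with the operator's decision strategy reduced to a set of pruned (blocked) cells. The paper's dynamic heuristic is not reproduced here; this is only the baseline search it builds on:

```python
import heapq

def a_star(grid_w, grid_h, start, goal, blocked):
    """A* on a 4-connected grid; `blocked` holds cells pruned from the
    search area (here standing in for the human operator's decisions)."""
    def h(c):  # Manhattan-distance heuristic (admissible for unit moves)
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    open_heap = [(h(start), 0, start)]
    best = {start: 0}
    parent = {}
    while open_heap:
        f, g, cur = heapq.heappop(open_heap)
        if cur == goal:  # reconstruct the path back to start
            path = [cur]
            while cur in parent:
                cur = parent[cur]
                path.append(cur)
            return path[::-1]
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dx, cur[1] + dy)
            if not (0 <= nxt[0] < grid_w and 0 <= nxt[1] < grid_h):
                continue
            if nxt in blocked:
                continue
            ng = g + 1
            if ng < best.get(nxt, float("inf")):
                best[nxt] = ng
                parent[nxt] = cur
                heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None

# Pruning the middle column except one gap forces the route through the gap.
route = a_star(5, 5, (0, 0), (4, 0), blocked={(2, y) for y in range(5) if y != 4})
```

Because the pruned cells are simply never expanded, the human's macro-level judgment shrinks the search space before the machine's optimal search runs.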
Collaboration between Supported Employment and Human Resource Services: Strategies for Success
ERIC Educational Resources Information Center
Post, Michal; Campbell, Camille; Heinz, Tom; Kotsonas, Lori; Montgomery, Joyce; Storey, Keith
2010-01-01
The article presents the benefits of successful collaboration between supported employment agencies and human resource managers when working together to secure employment for individuals with disabilities. Two case studies are presented: one involving a successful collaboration with county human resource managers in negotiating a change in the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hopper, Calvin Mitchell
In May 1973 the University of New Mexico conducted the first nationwide week-long short course in criticality safety training and education for nuclear criticality safety engineers. Subsequent to that course, the Los Alamos Critical Experiments Facility (LACEF) developed very successful 'hands-on' subcritical and critical training programs for operators, supervisors, and engineering staff. Since the inception of the US Department of Energy (DOE) Nuclear Criticality Technology and Safety Project (NCT&SP) in 1983, the DOE has stimulated contractor facilities and laboratories to collaborate in the furthering of nuclear criticality as a discipline. That effort included the education and training of nuclear criticality safety engineers (NCSEs). In 1985 a textbook was written that established a path toward formalizing education and training for NCSEs. Though the NCT&SP went through a brief hiatus from 1990 to 1992, other DOE-supported programs were evolving to the benefit of NCSE training and education. In 1993 the DOE established a Nuclear Criticality Safety Program (NCSP) and undertook a comprehensive development effort to expand the extant LACEF 'hands-on' course specifically for the education and training of NCSEs. That successful education and training was interrupted in 2006 by the closing of the LACEF and the accompanying movement of materials and critical experiment machines to the Nevada Test Site. Prior to that closing, the Lawrence Livermore National Laboratory (LLNL) was commissioned by the US DOE NCSP to establish an independent hands-on NCSE subcritical education and training course. The course provided an interim transition to the establishment of a reinvigorated and expanded two-week NCSE education and training program in 2011.
The 2011 piloted two-week course was coordinated by the Oak Ridge National Laboratory (ORNL) and jointly conducted by the Los Alamos National Laboratory (LANL) classroom education and facility training, the Sandia National Laboratory (SNL) hands-on criticality experiments training, and the US DOE National Criticality Experiments Research Center (NCERC) hands-on criticality experiments training, which is jointly supported by LLNL and LANL and located at the Nevada National Security Site (NNSS). This paper describes the bases, content, and conduct of the piloted and future US DOE NCSP Criticality Safety Engineer Training and Education Project.
NASA Technical Reports Server (NTRS)
Miller, R. H.; Minsky, M. L.; Smith, D. B. S.
1982-01-01
Applications of automation, robotics, and machine intelligence systems (ARAMIS) to space activities and their related ground support functions are studied, so that informed decisions can be made on which aspects of ARAMIS to develop. The specific tasks that will be required by future space projects are identified, and the relative merits of the candidate ARAMIS options are evaluated. The ARAMIS options defined and researched span the range from fully human to fully machine, including a number of intermediate options (e.g., humans assisted by computers, and various levels of teleoperation). By including this spectrum, the study searches for the optimum mix of humans and machines for space project tasks.
Advances in Machine Learning and Data Mining for Astronomy
NASA Astrophysics Data System (ADS)
Way, Michael J.; Scargle, Jeffrey D.; Ali, Kamal M.; Srivastava, Ashok N.
2012-03-01
Advances in Machine Learning and Data Mining for Astronomy documents numerous successful collaborations among computer scientists, statisticians, and astronomers who illustrate the application of state-of-the-art machine learning and data mining techniques in astronomy. Due to the massive amount and complexity of data in most scientific disciplines, the material discussed in this text transcends traditional boundaries between various areas in the sciences and computer science. The book's introductory part provides context to issues in the astronomical sciences that are also important to health, social, and physical sciences, particularly probabilistic and statistical aspects of classification and cluster analysis. The next part describes a number of astrophysics case studies that leverage a range of machine learning and data mining technologies. In the last part, developers of algorithms and practitioners of machine learning and data mining show how these tools and techniques are used in astronomical applications. With contributions from leading astronomers and computer scientists, this book is a practical guide to many of the most important developments in machine learning, data mining, and statistics. It explores how these advances can solve current and future problems in astronomy and looks at how they could lead to the creation of entirely new algorithms within the data mining community.
EMG and EPP-integrated human-machine interface between the paralyzed and rehabilitation exoskeleton.
Yin, Yue H; Fan, Yuan J; Xu, Li D
2012-07-01
Although a lower extremity exoskeleton shows great promise for rehabilitation of the lower limb, it has not yet been widely applied to the clinical rehabilitation of the paralyzed. This is partly caused by insufficient information interaction between the paralyzed user and existing exoskeletons, which cannot meet the requirements of harmonious control. In this research, a bidirectional human-machine interface including a neurofuzzy controller and an extended physiological proprioception (EPP) feedback system is developed by imitating the biological closed-loop control system of the human body. The neurofuzzy controller is built to decode human motion in advance by fusing fuzzy electromyographic signals reflecting human motion intention with precise proprioception providing joint angular feedback information. It transmits control information from human to exoskeleton, while the EPP feedback system, based on haptic stimuli, transmits motion information of the exoskeleton back to the human. Joint angle and torque information are transmitted in the form of air pressure to the human body. The real-time bidirectional human-machine interface can help a patient with lower limb paralysis to control the exoskeleton with his/her healthy side and simultaneously perceive motion on the paralyzed side by EPP. The interface rebuilds a closed-loop motion control system for paralyzed patients and realizes harmonious control of the human-machine system.
PhenoLines: Phenotype Comparison Visualizations for Disease Subtyping via Topic Models.
Glueck, Michael; Naeini, Mahdi Pakdaman; Doshi-Velez, Finale; Chevalier, Fanny; Khan, Azam; Wigdor, Daniel; Brudno, Michael
2018-01-01
PhenoLines is a visual analysis tool for the interpretation of disease subtypes, derived from the application of topic models to clinical data. Topic models enable one to mine cross-sectional patient comorbidity data (e.g., electronic health records) and construct disease subtypes-each with its own temporally evolving prevalence and co-occurrence of phenotypes-without requiring aligned longitudinal phenotype data for all patients. However, the dimensionality of topic models makes interpretation challenging, and de facto analyses provide little intuition regarding phenotype relevance or phenotype interrelationships. PhenoLines enables one to compare phenotype prevalence within and across disease subtype topics, thus supporting subtype characterization, a task that involves identifying a proposed subtype's dominant phenotypes, ages of effect, and clinical validity. We contribute a data transformation workflow that employs the Human Phenotype Ontology to hierarchically organize phenotypes and aggregate the evolving probabilities produced by topic models. We introduce a novel measure of phenotype relevance that can be used to simplify the resulting topology. The design of PhenoLines was motivated by formative interviews with machine learning and clinical experts. We describe the collaborative design process, distill high-level tasks, and report on initial evaluations with machine learning experts and a medical domain expert. These results suggest that PhenoLines demonstrates promising approaches to support the characterization and optimization of topic models.
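The hierarchical aggregation step described above can be sketched as follows, with a hypothetical two-level ontology fragment standing in for the Human Phenotype Ontology and invented topic probabilities:

```python
# Hypothetical parent links (illustrative terms, not actual HPO entries).
parents = {
    "Arrhythmia": "Cardiac abnormality",
    "Tachycardia": "Arrhythmia",
}

def aggregate(probs, parents):
    """Roll each phenotype's topic probability up into all of its
    ancestors, so parent terms summarize their subtree."""
    totals = dict(probs)
    for term, p in probs.items():
        node = parents.get(term)
        while node is not None:
            totals[node] = totals.get(node, 0.0) + p
            node = parents.get(node)
    return totals

totals = aggregate({"Tachycardia": 0.2, "Arrhythmia": 0.1}, parents)
```

Aggregating upward like this is what lets a visualization collapse a sprawling topic-phenotype matrix into a small number of interpretable ancestor terms.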
Yoda, Kazushige; Umeda, Tokuo; Hasegawa, Tomoyuki
2003-11-01
Organ movements that occur naturally as a result of vital functions such as respiration and heartbeat cause deterioration of image quality in nuclear medicine imaging. Among these movements, respiration has a large effect, but there has been no practical method of correcting for this. In the present study, we examined a method of correction that uses ultrasound images to correct baseline shifts caused by respiration in cardiac nuclear medicine examinations. To evaluate the validity of this method, simulation studies were conducted with an X-ray TV machine instead of a nuclear medicine scanner. The X-ray TV images and ultrasound images were recorded as digital movies and processed with public domain software (Scion Image). Organ movements were detected in the ultrasound images of the subcostal four-chamber view mode using slit regions of interest and were measured on a two-dimensional image coordinate. Then translational shifts were applied to the X-ray TV images to correct these movements by using macro-functions of the software. As a result, respiratory movements of about 20.1 mm were successfully reduced to less than 2.6 mm. We conclude that this correction technique is potentially useful in nuclear medicine cardiology.
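The correction itself amounts to applying the measured translation to each image frame; a toy sketch of the vertical component, using an invented 3x2 frame rather than real image data:

```python
def shift_rows(frame, dy, fill=0):
    """Translate a 2-D frame vertically by dy rows (positive = down),
    mimicking a baseline-shift correction whose magnitude was measured
    from a second modality (here, ultrasound ROI tracking)."""
    h = len(frame)
    out = []
    for y in range(h):
        src = y - dy
        out.append(list(frame[src]) if 0 <= src < h else [fill] * len(frame[0]))
    return out

frame = [[1, 1], [2, 2], [3, 3]]
corrected = shift_rows(frame, 1)  # rows move down by one; top row filled
```

A full implementation would also handle the horizontal component and interpolation for sub-pixel shifts, but the per-frame translate-and-fill step is the core of the method.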
Fission in R-processes Elements (FIRE) - Annual Report: Fiscal Year 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schunck, Nicolas
The goal of the FIRE topical collaboration in nuclear theory is to determine the astrophysical conditions of the rapid neutron capture process (r-process), which is responsible for the formation of heavy elements. This will be achieved by including in r-process simulations the most advanced models of fission (spontaneous, neutron-induced, beta-delayed) that have been developed at LLNL and LANL. The collaboration is composed of LLNL (lead) and LANL for work on nuclear data (ground-state properties, fission, beta-decay), BNL for nuclear data management, and the university of Notre Dame and North Carolina State University for r-process simulations. Under DOE/NNSA agreement, both universitiesmore » receive funds from the DOE Office of Science, while national laboratories receive funds directly from NA221.« less
A Dataset and a Technique for Generalized Nuclear Segmentation for Computational Pathology.
Kumar, Neeraj; Verma, Ruchika; Sharma, Sanuj; Bhargava, Surabhi; Vahadane, Abhishek; Sethi, Amit
2017-07-01
Nuclear segmentation in digital microscopic tissue images can enable extraction of high-quality features for nuclear morphometrics and other analysis in computational pathology. Conventional image processing techniques, such as Otsu thresholding and watershed segmentation, do not work effectively on challenging cases, such as chromatin-sparse and crowded nuclei. In contrast, machine learning-based segmentation can generalize across various nuclear appearances. However, training machine learning algorithms requires data sets of images, in which a vast number of nuclei have been annotated. Publicly accessible and annotated data sets, along with widely agreed upon metrics to compare techniques, have catalyzed tremendous innovation and progress on other image classification problems, particularly in object recognition. Inspired by their success, we introduce a large publicly accessible data set of hematoxylin and eosin (H&E)-stained tissue images with more than 21000 painstakingly annotated nuclear boundaries, whose quality was validated by a medical doctor. Because our data set is taken from multiple hospitals and includes a diversity of nuclear appearances from several patients, disease states, and organs, techniques trained on it are likely to generalize well and work right out-of-the-box on other H&E-stained images. We also propose a new metric to evaluate nuclear segmentation results that penalizes object- and pixel-level errors in a unified manner, unlike previous metrics that penalize only one type of error. We also propose a segmentation technique based on deep learning that lays a special emphasis on identifying the nuclear boundaries, including those between the touching or overlapping nuclei, and works well on a diverse set of test images.
Zooniverse: Combining Human and Machine Classifiers for the Big Survey Era
NASA Astrophysics Data System (ADS)
Fortson, Lucy; Wright, Darryl; Beck, Melanie; Lintott, Chris; Scarlata, Claudia; Dickinson, Hugh; Trouille, Laura; Willi, Marco; Laraia, Michael; Boyer, Amy; Veldhuis, Marten; Zooniverse
2018-01-01
Many analyses of astronomical data sets, ranging from morphological classification of galaxies to identification of supernova candidates, have relied on humans to classify data into distinct categories. Crowdsourced galaxy classifications via the Galaxy Zoo project provided a solution that scaled visual classification for extant surveys by harnessing the combined power of thousands of volunteers. However, the much larger data sets anticipated from upcoming surveys will require a different approach. Automated classifiers using supervised machine learning have improved considerably over the past decade, but their increasing sophistication comes at the expense of needing ever more training data. Crowdsourced classification by human volunteers is a critical technique for obtaining these training data. But several improvements can be made on this zeroth-order solution. Efficiency gains can be achieved by implementing a “cascade filtering” approach whereby the task structure is reduced to a set of binary questions that are more suited to simpler machines while demanding lower cognitive loads for humans. Intelligent subject retirement based on quantitative metrics of volunteer skill and subject label reliability also leads to dramatic improvements in efficiency. We note that human and machine classifiers may retire subjects differently, leading to trade-offs in performance space. Drawing on work with several Zooniverse projects including Galaxy Zoo and Supernova Hunter, we will present recent findings from experiments that combine cohorts of human and machine classifiers. We show that the most efficient system results when appropriate subsets of the data are intelligently assigned to each group according to their particular capabilities. With sufficient online training, simple machines can quickly classify “easy” subjects, leaving more difficult (and discovery-oriented) tasks for volunteers.
We also find humans achieve higher classification purity while samples produced by machines are typically more complete. These findings set the stage for further investigations, with the ultimate goal of efficiently and accurately labeling the wide range of data classes that will arise from the planned large astronomical surveys.
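A toy sketch of skill-weighted subject retirement in this spirit (not Zooniverse's actual algorithm): votes from human and machine classifiers carry weights reflecting estimated skill, and a subject retires once the leading label's weighted share passes a confidence threshold:

```python
def retire(votes, threshold=0.8):
    """votes: list of (label, weight) pairs from human or machine
    classifiers. Returns the winning label once its weighted share of
    all votes reaches the threshold; otherwise None (keep collecting)."""
    totals = {}
    for label, weight in votes:
        totals[label] = totals.get(label, 0.0) + weight
    total = sum(totals.values())
    label, score = max(totals.items(), key=lambda kv: kv[1])
    return label if total and score / total >= threshold else None

# Two skilled "spiral" votes outweigh one weak dissent: 1.6/1.9 > 0.8.
votes = [("spiral", 0.9), ("spiral", 0.7), ("elliptical", 0.3)]
decision = retire(votes)
```

Subjects whose votes never reach the threshold stay in circulation, which is how "easy" subjects retire quickly while ambiguous ones accumulate more (typically human) classifications.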
The Importance of HRA in Human Space Flight: Understanding the Risks
NASA Technical Reports Server (NTRS)
Hamlin, Teri
2010-01-01
Human performance is critical to crew safety during space missions. Humans interact with hardware and software during ground processing, normal flight, and in response to events. Human interactions with hardware and software can cause Loss of Crew and/or Vehicle (LOCV) through improper actions, or may prevent LOCV through recovery and control actions. Humans have the ability to deal with complex situations and system interactions beyond the capability of machines. Human Reliability Analysis (HRA) is a method used to qualitatively and quantitatively assess the occurrence of human failures that affect availability and reliability of complex systems. Modeling human actions with their corresponding failure probabilities in a Probabilistic Risk Assessment (PRA) provides a more complete picture of system risks and risk contributions. A high-quality HRA can provide valuable information on potential areas for improvement, including training, procedures, human interfaces design, and the need for automation. Modeling human error has always been a challenge in part because performance data is not always readily available. For spaceflight, the challenge is amplified not only because of the small number of participants and limited amount of performance data available, but also due to the lack of definition of the unique factors influencing human performance in space. These factors, called performance shaping factors in HRA terminology, are used in HRA techniques to modify basic human error probabilities in order to capture the context of an analyzed task. Many of the human error modeling techniques were developed within the context of nuclear power plants and therefore the methodologies do not address spaceflight factors such as the effects of microgravity and longer duration missions. This presentation will describe the types of human error risks which have shown up as risk drivers in the Shuttle PRA which may be applicable to commercial space flight. 
As with other large PRAs of complex machines, human error in the Shuttle PRA proved to be an important contributor (12 percent) to LOCV. An existing HRA technique was adapted for use in the Shuttle PRA, but additional guidance and improvements are needed to make the HRA task in space-related PRAs easier and more accurate. Therefore, this presentation will also outline plans for expanding current HRA methodology to more explicitly cover spaceflight performance shaping factors.
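The performance-shaping-factor adjustment described above can be sketched numerically; the multiplier values below are hypothetical, in the general spirit of SPAR-H-style HRA methods rather than the Shuttle PRA's actual technique:

```python
def adjusted_hep(nominal, psf_multipliers):
    """Scale a nominal human error probability (HEP) by context-specific
    performance shaping factor multipliers, capping the result at 1.0."""
    hep = nominal
    for m in psf_multipliers:
        hep *= m
    return min(hep, 1.0)

# Hypothetical case: a 1e-3 nominal HEP under high stress (x5) and a
# poor human-machine interface (x2) yields an adjusted HEP of 0.01.
hep = adjusted_hep(1e-3, [5, 2])
```

Extending such a method to spaceflight would mean defining and calibrating new multipliers for factors like microgravity and mission duration, which is exactly the gap the presentation identifies.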
NASA Astrophysics Data System (ADS)
Coyne, Kevin Anthony
The safe operation of complex systems such as nuclear power plants requires close coordination between the human operators and plant systems. In order to maintain an adequate level of safety following an accident or other off-normal event, the operators often are called upon to perform complex tasks during dynamic situations with incomplete information. The safety of such complex systems can be greatly improved if the conditions that could lead operators to make poor decisions and commit erroneous actions during these situations can be predicted and mitigated. The primary goal of this research project was the development and validation of a cognitive model capable of simulating nuclear plant operator decision-making during accident conditions. Dynamic probabilistic risk assessment methods can improve the prediction of human error events by providing rich contextual information and an explicit consideration of feedback arising from man-machine interactions. The Accident Dynamics Simulator paired with the Information, Decision, and Action in a Crew context cognitive model (ADS-IDAC) shows promise for predicting situational contexts that might lead to human error events, particularly knowledge driven errors of commission. ADS-IDAC generates a discrete dynamic event tree (DDET) by applying simple branching rules that reflect variations in crew responses to plant events and system status changes. Branches can be generated to simulate slow or fast procedure execution speed, skipping of procedure steps, reliance on memorized information, activation of mental beliefs, variations in control inputs, and equipment failures. Complex operator mental models of plant behavior that guide crew actions can be represented within the ADS-IDAC mental belief framework and used to identify situational contexts that may lead to human error events. This research increased the capabilities of ADS-IDAC in several key areas. 
The ADS-IDAC computer code was improved to support additional branching events and provide a better representation of the IDAC cognitive model. An operator decision-making engine capable of responding to dynamic changes in situational context was implemented. The IDAC human performance model was fully integrated with a detailed nuclear plant model in order to realistically simulate plant accident scenarios. Finally, the improved ADS-IDAC model was calibrated, validated, and updated using actual nuclear plant crew performance data. This research led to the following general conclusions: (1) A relatively small number of branching rules are capable of efficiently capturing a wide spectrum of crew-to-crew variabilities. (2) Compared to traditional static risk assessment methods, ADS-IDAC can provide a more realistic and integrated assessment of human error events by directly determining the effect of operator behaviors on plant thermal hydraulic parameters. (3) The ADS-IDAC approach provides an efficient framework for capturing actual operator performance data such as timing of operator actions, mental models, and decision-making activities.
Hands-free human-machine interaction with voice
NASA Astrophysics Data System (ADS)
Juang, B. H.
2004-05-01
Voice is a natural communication interface between a human and a machine. The machine, when placed in today's communication networks, may be configured to provide automation to save substantial operating cost, as demonstrated in AT&T's VRCP (Voice Recognition Call Processing), or to facilitate intelligent services, such as virtual personal assistants, to enhance individual productivity. These intelligent services often need to be accessible anytime, anywhere (e.g., in cars when the user is in a hands-busy-eyes-busy situation or during meetings where constantly talking to a microphone is either undesirable or impossible), and thus call for advanced signal processing and automatic speech recognition techniques which support what we call ``hands-free'' human-machine communication. These techniques entail a broad spectrum of technical ideas, ranging from use of directional microphones and acoustic echo cancellation to robust speech recognition. In this talk, we highlight a number of key techniques that were developed for hands-free human-machine communication in the mid-1990s after Bell Labs became a unit of Lucent Technologies. A video clip will be played to demonstrate the accomplishment.
NASA Technical Reports Server (NTRS)
Garin, John; Matteo, Joseph; Jennings, Von Ayre
1988-01-01
The capability for a single operator to simultaneously control complex remote multi-degree-of-freedom robotic arms and associated dextrous end effectors is being developed. An optimal solution within the realm of current technology can be achieved by recognizing that: (1) machines/computer systems are more effective than humans when the task is routine and specified, and (2) humans process complex data sets and deal with the unpredictable better than machines. These observations lead naturally to a philosophy in which the human's role becomes a higher level function associated with planning, teaching, initiating, monitoring, and intervening when the machine gets into trouble, while the machine performs the codifiable tasks with deliberate efficiency. This concept forms the basis for the integration of man and telerobotics, i.e., robotics with the operator in the control loop. The concept of integrating the human in the loop and maximizing the feed-forward and feed-back data flow is referred to as telepresence.
Human Centered Hardware Modeling and Collaboration
NASA Technical Reports Server (NTRS)
Stambolian Damon; Lawrence, Brad; Stelges, Katrine; Henderson, Gena
2013-01-01
In order to collaborate on engineering designs among NASA Centers and customers, and to include hardware and human activities from multiple remote locations, live human-centered modeling and collaboration across several sites has been successfully facilitated by Kennedy Space Center. The focus of this paper includes innovative approaches to engineering design analyses and training, along with research being conducted to apply new technologies for tracking, immersing, and evaluating humans as well as rocket, vehicle, component, or facility hardware, utilizing high resolution cameras, motion tracking, ergonomic analysis, biomedical monitoring, work instruction integration, head-mounted displays, and other innovative human-system integration modeling, simulation, and collaboration applications.
Big Data: Next-Generation Machines for Big Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hack, James J.; Papka, Michael E.
Addressing the scientific grand challenges identified by the US Department of Energy’s (DOE’s) Office of Science’s programs alone demands a total leadership-class computing capability of 150 to 400 Pflops by the end of this decade. The successors to three of the DOE’s most powerful leadership-class machines are set to arrive in 2017 and 2018—the products of the Collaboration Oak Ridge Argonne Livermore (CORAL) initiative, a national laboratory–industry design/build approach to engineering next-generation petascale computers for grand challenge science. These mission-critical machines will enable discoveries in key scientific fields such as energy, biotechnology, nanotechnology, materials science, and high-performance computing, and serve as a milestone on the path to deploying exascale computing capabilities.
Bonner Durchmusterung (Argelander 1859-1862): Documentation for the machine-readable version
NASA Technical Reports Server (NTRS)
Warren, Wayne H., Jr.; Ochsenbein, Francois
1989-01-01
The machine-readable version of the catalog, as it is currently being distributed from the Astronomical Data Center, is described. The entire Bonner Durchmusterung (BD) was computerized through the collaborative efforts of the Centre de Donnees Astronomiques de Strasbourg, l'Observatoire de Nice, and the Astronomical Data Center at the NASA/Goddard Space Flight Center. All corrigenda published in the original BD volumes were incorporated into the machine file, along with changes published following the 1903 edition. In addition, stars indicated to be missing in published lists and verified by various techniques are flagged so that they can be omitted from computer plotted charts if desired. Stars deleted in the various errata lists were similarly flagged, while those with revised data are flagged and listed in a separate table.
Health Informatics via Machine Learning for the Clinical Management of Patients.
Clifton, D A; Niehaus, K E; Charlton, P; Colopy, G W
2015-08-13
To review how health informatics systems based on machine learning methods have impacted the clinical management of patients, by affecting clinical practice. We reviewed literature from 2010-2015 from databases such as PubMed, IEEE Xplore, and INSPEC, in which methods based on machine learning are likely to be reported. We bring together a broad body of literature, aiming to identify those leading examples of health informatics that have advanced the methodology of machine learning. While individual methods may have further examples that might be added, we have chosen some of the most representative, informative exemplars in each case. Our survey highlights that, while much research is taking place in this high-profile field, examples of those that affect the clinical management of patients are seldom found. We show that substantial progress is being made in terms of methodology, often by data scientists working in close collaboration with clinical groups. Health informatics systems based on machine learning are in their infancy, and the translation of such systems into clinical management has yet to be performed at scale.
Uhlig, Johannes; Uhlig, Annemarie; Kunze, Meike; Beissbarth, Tim; Fischer, Uwe; Lotz, Joachim; Wienbeck, Susanne
2018-05-24
The purpose of this study is to evaluate the diagnostic performance of machine learning techniques for malignancy prediction at breast cone-beam CT (CBCT) and to compare them to human readers. Five machine learning techniques, including random forests, back propagation neural networks (BPN), extreme learning machines, support vector machines, and K-nearest neighbors, were used to train diagnostic models on a clinical breast CBCT dataset with internal validation by repeated 10-fold cross-validation. Two independent blinded human readers with profound experience in breast imaging and breast CBCT analyzed the same CBCT dataset. Diagnostic performance was compared using AUC, sensitivity, and specificity. The clinical dataset comprised 35 patients (American College of Radiology density type C and D breasts) with 81 suspicious breast lesions examined with contrast-enhanced breast CBCT. Forty-five lesions were histopathologically proven to be malignant. Among the machine learning techniques, BPNs provided the best diagnostic performance, with AUC of 0.91, sensitivity of 0.85, and specificity of 0.82. The diagnostic performance of the human readers was AUC of 0.84, sensitivity of 0.89, and specificity of 0.72 for reader 1 and AUC of 0.72, sensitivity of 0.71, and specificity of 0.67 for reader 2. AUC was significantly higher for BPN when compared with both reader 1 (p = 0.01) and reader 2 (p < 0.001). Machine learning techniques provide a high and robust diagnostic performance in the prediction of malignancy in breast lesions identified at CBCT. BPNs showed the best diagnostic performance, surpassing human readers in terms of AUC and specificity.
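The internal validation scheme the study describes, repeated 10-fold cross-validation, can be sketched as follows. This is a generic illustration of the splitting procedure only, not the authors' code; the sample count of 81 simply mirrors the lesion count above:

```python
import random

# Minimal sketch of repeated k-fold cross-validation index generation.
# Each repeat reshuffles the samples and partitions them into k disjoint
# folds; each fold serves once as the held-out test set.
def repeated_kfold(n_samples, k=10, repeats=3, seed=0):
    """Yield (train_idx, test_idx) pairs for each fold of each repeat."""
    rng = random.Random(seed)
    for _ in range(repeats):
        idx = list(range(n_samples))
        rng.shuffle(idx)
        folds = [idx[i::k] for i in range(k)]  # k near-equal folds
        for i in range(k):
            held_out = folds[i]
            train = [j for f, fold in enumerate(folds) if f != i for j in fold]
            yield train, held_out

splits = list(repeated_kfold(n_samples=81, k=10, repeats=3))
print(len(splits))  # 3 repeats x 10 folds = 30 train/test splits
```

Each model under comparison would be fit on the train indices and scored on the held-out indices of every split, and the per-split AUCs aggregated.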
Behavior Analysis and the Quest for Machine Intelligence.
ERIC Educational Resources Information Center
Stephens, Kenneth R.; Hutchison, William R.
1993-01-01
Discusses three approaches to building intelligent systems: artificial intelligence, neural networks, and behavior analysis. BANKET, an object-oriented software system, is explained; a commercial application of BANKET is described; and a collaborative effort between the academic and business communities for the use of BANKET is discussed.…
Spitzer observatory operations: increasing efficiency in mission operations
NASA Astrophysics Data System (ADS)
Scott, Charles P.; Kahr, Bolinda E.; Sarrel, Marc A.
2006-06-01
This paper explores the how's and why's of the Spitzer Mission Operations System's (MOS) success, efficiency, and affordability in comparison to other observatory-class missions. MOS exploits today's flight, ground, and operations capabilities, embraces automation, and balances both risk and cost. With operational efficiency as the primary goal, MOS maintains a strong control process by translating lessons learned into efficiency improvements, thereby enabling the MOS processes, teams, and procedures to rapidly evolve from concept (through thorough validation) into in-flight implementation. Operational teaming, planning, and execution are designed to enable re-use. Mission changes, unforeseen events, and continuous improvement have often forced us to learn to fly anew. Collaborative spacecraft operations and remote science and instrument teams have become well integrated, and have worked together to improve and optimize each human, machine, and software-system element. Adaptation to tighter spacecraft margins has facilitated continuous operational improvements via automated and autonomous software coupled with improved human analysis. Based upon what we now know and what we need to improve, adapt, or fix, the projected mission lifetime continues to grow - as does the opportunity for numerous scientific discoveries.
Saito, S; Piccoli, B; Smith, M J; Sotoyama, M; Sweitzer, G; Villanueva, M B; Yoshitake, R
2000-10-01
In the 1980's, the visual display terminal (VDT) was introduced in workplaces of many countries. Soon thereafter, an upsurge in reported cases of related health problems, such as musculoskeletal disorders and eyestrain, was seen. Recently, the flat panel display or notebook personal computer (PC) became the most remarkable feature in modern workplaces with VDTs and even in homes. A proactive approach must be taken to avert foreseeable ergonomic and occupational health problems from the use of this new technology. Because of its distinct physical and optical characteristics, the ergonomic requirements for notebook PCs in terms of machine layout, workstation design, lighting conditions, among others, should be different from the CRT-based computers. The Japan Ergonomics Society (JES) technical committee came up with a set of guidelines for notebook PC use following exploratory discussions that dwelt on its ergonomic aspects. To keep in stride with this development, the Technical Committee on Human-Computer Interaction under the auspices of the International Ergonomics Association worked towards the international issuance of the guidelines. This paper unveils the result of this collaborative effort.
2012-03-05
DISTRIBUTION A: Approved for public release; distribution is unlimited. Program trends: trust in autonomous systems; cross-cultural trust. Trust and trustworthiness are independent (Mayer et al., 1995); trust is relational. Topics covered include humans in cross-cultural interactions, complex human-machine interactions, interpersonal trustworthiness (ability, benevolence, integrity), trust metrics, cross-cultural trust issues, and autonomous systems.
Biosleeve Human-Machine Interface
NASA Technical Reports Server (NTRS)
Assad, Christopher (Inventor)
2016-01-01
Systems and methods for sensing human muscle action and gestures in order to control machines or robotic devices are disclosed. One exemplary system employs a tight fitting sleeve worn on a user arm and including a plurality of electromyography (EMG) sensors and at least one inertial measurement unit (IMU). Power, signal processing, and communications electronics may be built into the sleeve and control data may be transmitted wirelessly to the controlled machine or robotic device.
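As a rough illustration of how per-channel EMG signals might be mapped to gestures in such a system, here is a minimal nearest-centroid sketch. The feature choice (per-channel RMS), the gesture names, and the template values are illustrative assumptions, not details from the patent:

```python
import math

# Hypothetical sketch: classify a gesture from multi-channel EMG samples
# by matching per-channel RMS amplitude against stored gesture templates.
def rms(channel):
    """Root-mean-square amplitude of one EMG channel's samples."""
    return math.sqrt(sum(x * x for x in channel) / len(channel))

def classify(sample, templates):
    """Return the gesture whose template is nearest the sample's features."""
    features = [rms(ch) for ch in sample]
    def dist(template):
        return sum((a - b) ** 2 for a, b in zip(features, template))
    return min(templates, key=lambda g: dist(templates[g]))

# Assumed templates: mean per-channel RMS for each trained gesture.
templates = {"fist": [1.0, 0.2], "open_hand": [0.2, 1.0]}
sample = [[0.9, -1.1, 1.0], [0.1, 0.3, -0.2]]  # two EMG channels, 3 samples
print(classify(sample, templates))  # -> "fist"
```

A real sleeve would add IMU fusion, filtering, and a trained classifier, but the pipeline shape (windowed features, then template matching) is the same.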
Ataş, Mustafa; Demircan, Süleyman; Karatepe Haşhaş, Arzu Seyhan; Gülhan, Ahmet; Zararsız, Gökmen
2014-01-01
AIM: To compare and evaluate the phacoemulsification parameters and postoperative endothelial cell changes of two different phacoemulsification machines, each with different modes, but also to assess the relationship between postoperative endothelial cell loss and the phacoemulsification parameters, as well as the other factors in both groups. METHODS: This prospective observational study was comprised of consecutive eligible cataract patients operated with phacoemulsification technique performed by the same surgeon using either a WHITESTAR Signature Ellips FX (transversal, group 1) or Infiniti OZil IP (torsional, group 2) machine. RESULTS: The study included 86 patients. Baseline characteristics in the groups were similar. The median nuclear sclerosis grade was 3 (2-4) in the first group and 2 (2-4) in the second group (P=0.265). Both groups had similar phacoemulsification needle times (group 1: 60.63±36 s; group 2: 55.98±30 s; P=0.789). The percentage of endothelial cell loss 30d after surgery ranged from 3% to 15% with a median of 7% in group 1, and from 2% to 13% with a median of 6% in group 2; however, there was no statistically significant difference between the groups (P=0.407). Hexagonality (P=0.794) and the coefficient of variation (CV; P=0.142) did not differ significantly between the groups before and 30d after surgery. A significant positive correlation was found between the endothelial cell loss and nuclear sclerosis grade (group 1: P<0.001; group 2: P<0.001) and between the endothelial cell loss and average phacoemulsification power (group 1: P=0.007; group 2: P=0.008). CONCLUSION: Both of these machines were efficient, with similar endothelial cell loss. This endothelial cell loss was related to the increased nuclear sclerosis grade and increased phacoemulsification power. PMID:25349800
Comulang: towards a collaborative e-learning system that supports student group modeling.
Troussas, Christos; Virvou, Maria; Alepis, Efthimios
2013-01-01
This paper describes an e-learning system that is expected to further enhance the educational process in computer-based tutoring systems by incorporating collaboration between students and work in groups. The resulting system is called "Comulang", and a multiple language learning system is used as a test bed for its effectiveness. Collaboration is supported by a user modeling module that is responsible for the initial creation of student clusters, from which, as a next step, working groups of students are created. A machine learning clustering algorithm drives the group formation, so that cooperation between students from different clusters is attained. One of the resulting system's basic aims is to provide efficient student groups whose limitations and capabilities are well balanced.
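The paper does not name the specific clustering algorithm. As one plausible sketch of the cluster-then-group step, k-means over student skill profiles could form the initial clusters; the skill dimensions and scores below are hypothetical:

```python
import random

# Illustrative k-means clustering of student skill profiles -- a stand-in
# for the (unspecified) clustering algorithm used for group formation.
def kmeans(points, k, iters=20, seed=1):
    """Partition points into k clusters by iterative centroid refinement."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # Assign each point to its nearest center (squared distance).
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # Recompute each center as its cluster's mean (keep old if empty).
        centers = [[sum(d) / len(c) for d in zip(*c)] if c else centers[i]
                   for i, c in enumerate(clusters)]
    return clusters

# Two skill dimensions per student (e.g. vocabulary, grammar scores).
students = [(0.9, 0.8), (0.85, 0.9), (0.2, 0.1), (0.15, 0.2)]
clusters = kmeans(students, k=2)
print(sorted(len(c) for c in clusters))  # -> [2, 2]
```

Balanced working groups would then be drawn by taking one student from each cluster, mixing stronger and weaker profiles.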
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwantes, Jon M.
Founded in 1996 upon the initiative of the “Group of 8” governments (G8), the Nuclear Forensics International Technical Working Group (ITWG) is an ad hoc organization of official Nuclear Forensics practitioners (scientists, law enforcement, and regulators) that can be called upon to provide technical assistance to the global community in the event of a seizure of nuclear or radiological materials. The ITWG is supported by and is affiliated with nearly 40 countries and international partner organizations including the International Atomic Energy Agency (IAEA), EURATOM, INTERPOL, EUROPOL, and the United Nations Interregional Crime and Justice Research Institute (UNICRI) (Figure 1). Besides providing a network of nuclear forensics laboratories that are able to assist the global community during a nuclear smuggling event, the ITWG is also committed to the advancement of the science of nuclear forensic analysis, largely through participation in periodic table top and Collaborative Materials Exercises (CMXs). Exercise scenarios use “real world” samples with realistic forensics investigation time constraints and reporting requirements. These exercises are designed to promote best practices in the field and test, evaluate, and improve new technical capabilities, methods and techniques in order to advance the science of nuclear forensics. Past efforts to advance nuclear forensic science have also included scenarios that asked laboratories to adapt conventional forensics methods (e.g. DNA, fingerprints, tool marks, and document comparisons) for collecting and preserving evidence comingled with radioactive materials.
Human image tracking technique applied to remote collaborative environments
NASA Astrophysics Data System (ADS)
Nagashima, Yoshio; Suzuki, Gen
1993-10-01
To support various kinds of collaborations over long distances by using visual telecommunication, it is necessary to transmit visual information related to the participants and topical materials. When people collaborate in the same workspace, they use visual cues such as facial expressions and eye movement. The realization of coexistence in a collaborative workspace requires the support of these visual cues. Therefore, it is important that the facial images be large enough to be useful. During collaborations, especially dynamic collaborative activities such as equipment operation or lectures, the participants often move within the workspace. When the people move frequently or over a wide area, the necessity for automatic human tracking increases. Using the movement area of the human being or the resolution of the extracted area, we have developed a memory tracking method and a camera tracking method for automatic human tracking. Experimental results using a real-time tracking system show that the extracted area fairly moves according to the movement of the human head.
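As a rough sketch of the windowing idea behind automatic tracking, the following keeps an extraction window centered on a smoothed head position. The exponential smoothing scheme and all parameters are illustrative assumptions, not the paper's memory- or camera-tracking methods:

```python
# Illustrative sketch: keep a fixed-width extraction window centered on a
# detected head position, low-pass filtered so the window does not jitter.
def track_window(positions, frame_w=640, win_w=160, alpha=0.3):
    """Return the window's left edge for each frame's detected x-position."""
    left_edges, smoothed = [], positions[0]
    for x in positions:
        smoothed += alpha * (x - smoothed)  # exponential smoothing
        # Clamp so the window stays inside the frame.
        left = min(max(int(smoothed - win_w / 2), 0), frame_w - win_w)
        left_edges.append(left)
    return left_edges

edges = track_window([320, 330, 400, 480, 500])
print(edges[0])  # first window centered on x=320: left edge 320 - 80 = 240
```

The smoothing constant trades responsiveness for stability, the same trade-off a camera-tracking method faces when deciding how fast to pan after the subject moves.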
A journey from nuclear criticality methods to high energy density radflow experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Urbatsch, Todd James
Los Alamos National Laboratory is a nuclear weapons laboratory supporting our nation's defense. In support of this mission is a high energy-density physics program in which we design and execute experiments to study radiation-hydrodynamics phenomena and improve the predictive capability of our large-scale multi-physics software codes on our big-iron computers. The Radflow project’s main experimental effort now is to understand why we haven't been able to predict opacities on Sandia National Laboratory's Z-machine. We are modeling an increasing fraction of the Z-machine's dynamic hohlraum to find multi-physics explanations for the experimental results. Further, we are building an entirely different opacity platform on Lawrence Livermore National Laboratory's National Ignition Facility (NIF), which is set to get results early 2017. Will the results match our predictions, match the Z-machine, or give us something entirely different? The new platform brings new challenges such as designing hohlraums and spectrometers. The speaker will recount his history, starting with one-dimensional Monte Carlo nuclear criticality methods in graduate school, radiative transfer methods research and software development for his first 16 years at LANL, and, now, radflow technology and experiments. Who knew that the real world was more than just radiation transport? Experiments aren't easy and they are as saturated with politics as a presidential election, but they sure are fun.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuzin, V.V.; Pshakin, G.M.; Belov, A.P.
1996-12-31
During 1995, collaborative Russian-US nuclear material protection, control, and accounting (MPC and A) tasks at the Institute of Physics and Power Engineering (IPPE) in Obninsk, Russia focused on improving the protection of nuclear materials at the BFS Fast Critical Facility. BFS has tens of thousands of fuel disks containing highly enriched uranium and weapons-grade plutonium that are used to simulate the core configurations of experimental reactors in two critical assemblies. Completed tasks culminated in demonstrations of newly implemented equipment (Russian and US) and methods that enhanced the MPC and A at BFS through computerized accounting, nondestructive inventory verification measurements, personnel identification and access control, physical inventory taking, physical protection, and video surveillance. The collaborative work with US Department of Energy national laboratories is now being extended. In 1996 additional tasks to improve MPC and A have been implemented at BFS, the Technological Laboratory for Fuel Fabrication (TLFF), the Central Storage Facility (CSF), and for the entire site. The TLFF reclads BFS uranium metal fuel disks (process operations and transfers of fissile material). The CSF contains many different types of nuclear material. MPC and A at these additional facilities will be integrated with that at BFS as a prototype site-wide approach. Additional site-wide tasks encompass communications and tamper-indicating devices. Finally, new storage alternatives are being implemented that will consolidate the more attractive nuclear materials in a better-protected nuclear island. The work this year represents not just the addition of new facilities and the site-wide approach, but the systematization of the MPC and A elements that are being implemented as a first step and the more comprehensive ones planned.
Privacy-preserving restricted boltzmann machine.
Li, Yu; Zhang, Yuan; Ji, Yue
2014-01-01
With the arrival of the big data era, it is predicted that distributed data mining will lead to an information technology revolution. To motivate different institutes to collaborate with each other, the crucial issue is to eliminate their concerns regarding data privacy. In this paper, we propose a privacy-preserving method for training a restricted Boltzmann machine (RBM). With our privacy-preserving method, the RBM can be trained without the parties revealing their private data to each other. We provide a correctness and efficiency analysis of our algorithms. The comparative experiment shows that the accuracy is very close to the original RBM model.
NASA Astrophysics Data System (ADS)
Schmitt, Kara Anne
This research aims to prove that strict adherence to procedures and rigid compliance to process in the US Nuclear Industry may not prevent incidents or increase safety. According to the Institute of Nuclear Power Operations, the nuclear power industry has seen a recent rise in events, and this research claims that a contributing factor to this rise is organizational and cultural, rooted in people's overreliance on procedures and policy. Understanding the proper balance of function allocation, automation and human decision-making is imperative to creating a nuclear power plant that is safe, efficient, and reliable. This research claims that new generations of operators are less engaged and think less independently because they have been instructed to follow procedures to a fault. According to operators, they were once expected to know the plant and its interrelations, but organizationally more importance is now put on following procedure and policy. Literature reviews were performed, experts were questioned, and a model for context analysis was developed. The Context Analysis Method for Identifying Design Solutions (CAMIDS) Model was created, verified and validated through both peer review and application in real world scenarios in active nuclear power plant simulators. These experiments supported the claim that strict adherence and rigid compliance to procedures may not increase safety, by studying the industry's propensity for following incorrect procedures even when doing so directly affects the safety or security of the plant. The findings of this research indicate that the younger generations of operators rely highly on procedures, and the organizational pressures of required compliance to procedures may lead to incidents within the plant because operators feel pressured into following the rules and policy above performing the correct actions in a timely manner. The findings support computer-based procedures, efficient alarm systems, and skill of the craft matrices.
The solution to the problems facing the industry include in-depth, multiple fault failure training which tests the operator's knowledge of the situation. This builds operator collaboration, competence and confidence to know what to do, and when to do it in response to an emergency situation. Strict adherence to procedures and rigid compliance to process may not prevent incidents or increase safety; building operators' fundamental skills of collaboration, competence and confidence will.
None
2018-05-01
A new Idaho National Laboratory supercomputer is helping scientists create more realistic simulations of nuclear fuel. Dubbed "Ice Storm" this 2048-processor machine allows researchers to model and predict the complex physics behind nuclear reactor behavior. And with a new visualization lab, the team can see the results of its simulations on the big screen. For more information about INL research, visit http://www.facebook.com/idahonationallaboratory.
MACHINING TEST SPECIMENS FROM HARVESTED ZION RPV SEGMENTS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nanstad, Randy K; Rosseel, Thomas M; Sokolov, Mikhail A
2015-01-01
The decommissioning of the Zion Nuclear Generating Station (NGS) in Zion, Illinois, presents a special and timely opportunity for developing a better understanding of materials degradation and other issues associated with extending the lifetime of existing nuclear power plants (NPPs) beyond 60 years of service. In support of extended service and current operations of the US nuclear reactor fleet, the Oak Ridge National Laboratory (ORNL), through the Department of Energy (DOE), Light Water Reactor Sustainability (LWRS) Program, is coordinating and contracting with Zion Solutions, LLC, a subsidiary of Energy Solutions, an international nuclear services company, the selective procurement of materials, structures, components, and other items of interest from the decommissioned reactors. In this paper, we will discuss the acquisition of segments of the Zion Unit 2 Reactor Pressure Vessel (RPV), cutting these segments into blocks from the beltline and upper vertical welds and plate material, and machining those blocks into mechanical (Charpy, compact tension, and tensile) test specimens and coupons for microstructural (TEM, SEM, APT, SANS and nano indentation) characterization. Access to service-irradiated RPV welds and plate sections will allow through-wall attenuation studies to be performed, which will be used to assess current radiation damage models [1].
Improvement of human operator vibroprotection system in the utility machine
NASA Astrophysics Data System (ADS)
Korchagin, P. A.; Teterina, I. A.; Rahuba, L. F.
2018-01-01
The article is devoted to an urgent problem of improving the efficiency of road-building utility machines in terms of improving the human operator vibroprotection system by determining acceptable values of the rigidity coefficients and resistance coefficients of the operator’s cab suspension system elements and those of the operator’s seat. Negative effects of vibration result in labour productivity decrease and occupational diseases. Besides, structure vibrations have a damaging impact on the machine units and mechanisms, which leads to reducing the overall service life of the machine. Results of experimental and theoretical research of the operator vibroprotection system in the road-building utility machine are presented. An algorithm for the program to calculate dynamic impacts on the operator, in terms of different structural and performance parameters of the machine and considering combinations of external perturbation influences, was proposed.
THE NEXT GENERATION SAFEGUARDS PROFESSIONAL NETWORK: PROGRESS AND NEXT STEPS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhernosek, Alena V; Lynch, Patrick D; Scholz, Melissa A
2011-01-01
President Obama has repeatedly stated that the United States must ensure that the international safeguards regime, as embodied by the International Atomic Energy Agency (IAEA), has 'the authority, information, people, and technology it needs to do its job.' The U.S. Department of Energy (DOE) National Nuclear Security Administration (NNSA) works to implement the President's vision through the Next Generation Safeguards Initiative (NGSI), a program to revitalize the U.S. DOE national laboratories safeguards technology and human capital base so that the United States can more effectively support the IAEA and ensure that it meets current and emerging challenges to the international safeguards system. In 2009, in response to the human capital development goals of NGSI, young safeguards professionals within the Global Nuclear Security Technology Division at Oak Ridge National Laboratory launched the Next Generation Safeguards Professional Network (NGSPN). The purpose of this initiative is to establish working relationships and to foster collaboration and communication among the next generation of safeguards leaders. The NGSPN is an organization for, and of, young professionals pursuing careers in nuclear safeguards and nonproliferation - as well as mid-career professionals new to the field - whether working within the U.S. DOE national laboratory complex, U.S. government agencies, academia, or industry or at the IAEA. The NGSPN is actively supported by the NNSA, boasts more than 70 members, maintains a website and newsletter, and has held two national meetings as well as an NGSPN session and panel at the July 2010 Institute of Nuclear Material Management Annual Meeting. This paper discusses the network; its significance, goals and objectives; developments and progress to date; and future plans.
2018-01-05
research team recorded fMRI or event-related potentials while subjects were playing two cognitive games. In the first experiment, human subjects played a...theory-of-mind bilateral game with two types of computerized agents: with or without humanlike cues. In the second experiment, human subjects played...a unilateral game in which the human subjects played the role of the Coach (or supervisor) while a computer agent played as the Player
NASA Technical Reports Server (NTRS)
Miller, R. H.; Minsky, M. L.; Smith, D. B. S.
1982-01-01
Potential applications of automation, robotics, and machine intelligence systems (ARAMIS) to space activities, and to their related ground support functions are explored. The specific tasks which will be required by future space projects are identified. ARAMIS options which are candidates for those space project tasks and the relative merits of these options are defined and evaluated. Promising applications of ARAMIS and specific areas for further research are identified. The ARAMIS options defined and researched by the study group span the range from fully human to fully machine, including a number of intermediate options (e.g., humans assisted by computers, and various levels of teleoperation). By including this spectrum, the study searches for the optimum mix of humans and machines for space project tasks.
Meyer, Travis S; Muething, Joseph Z; Lima, Gustavo Amoras Souza; Torres, Breno Raemy Rangel; del Rosario, Trystyn Keia; Gomes, José Orlando; Lambert, James H
2012-01-01
Radiological nuclear emergency responders must be able to coordinate evacuation and relief efforts following the release of radioactive material into populated areas. In order to respond quickly and effectively to a nuclear emergency, high-level coordination is needed between a number of large, independent organizations, including police, military, hazmat, and transportation authorities. Given the complexity, scale, time-pressure, and potential negative consequences inherent in radiological emergency responses, tracking and communicating information that will assist decision makers during a crisis is crucial. The emergency response team at the Angra dos Reis nuclear power facility, located outside of Rio de Janeiro, Brazil, presently conducts emergency response simulations once every two years to prepare organizational leaders for real-life emergency situations. However, current exercises are conducted without the aid of electronic or software tools, resulting in possible cognitive overload and delays in decision-making. This paper describes the development of a decision support system employing systems methodologies, including cognitive task analysis and human-machine interface design. The decision support system can aid the coordination team by automating cognitive functions and improving information sharing. A prototype of the design will be evaluated by plant officials in Brazil and incorporated to a future trial run of a response simulation.
The Ecological Stewardship Institute at Northern Kentucky University and the U.S. Environmental Protection Agency are collaborating to optimize a harmful algal bloom detection algorithm that estimates the presence and count of cyanobacteria in freshwater systems by image analysis...
ERIC Educational Resources Information Center
McLaughlin, Cheryl A.; McLaughlin, Felecia C.; Pringle, Rose M.
2013-01-01
This article presents the experiences of Miss Felecia McLaughlin, a fourth-grade teacher from the island of Jamaica who used the model proposed by Bass et al. (2009) to assess conceptual understanding of four of the six types of simple machines while encouraging collaboration through the creation of learning teams. Students had an opportunity to…
Using a Group Decision Support System for Creativity.
ERIC Educational Resources Information Center
Aiken, Milam; Riggs, Mary
1993-01-01
A computer-based group decision support system (GDSS) to increase collaborative group productivity and creativity is explained. Various roles for the computer are identified, and implementation of GDSS systems at the University of Mississippi and International Business Machines are described. The GDSS is seen as fostering productivity through…
Topic Models for Link Prediction in Document Networks
ERIC Educational Resources Information Center
Kataria, Saurabh
2012-01-01
Recent explosive growth of interconnected document collections such as citation networks, network of web pages, content generated by crowd-sourcing in collaborative environments, etc., has posed several challenging problems for data mining and machine learning community. One central problem in the domain of document networks is that of "link…
Pendergrass, William; Zitnik, Galynn; Urfer, Silvan R.
2011-01-01
Purpose To determine the differences between species in the retention of lens fiber cell nuclei and nuclear fragments in the aging lens cortex and the relationship of nuclear retention to lens opacity. For this purpose, old human, monkey, dog, and rat lenses were compared to those of three strains of mouse. We also investigated possible mechanisms leading to nuclear retention. Methods Fixed specimens of the species referred to above were obtained from immediate on-site sacrifice of mice and rats, or from recently fixed lenses of other species, dogs, monkeys, and humans, obtained from collaborators. The retention of undegraded nuclei and nuclear fragments was graded 1–4 from histologic observation. All species lenses were examined microscopically in fixed sections stained with hematoxylin and eosin (H&E) or 4',6-diamidino-2-phenylindole (DAPI). Slit lamp observations were made only on the mice and rats before sacrifice and lens fixation. Values of 0 to 4 (clear lens to cataract) were given to the degree of opacity. mRNA content in young versus old C57BL/6 mouse lenses was determined by quantitative PCR (qPCR) for DNase II-like acid DNase (DLAD) and other proteins. DLAD protein was determined by immunofluorescence of fixed eye sections. Results In old C57BL/6 and DBA mice and, to a lesser degree, in old CBA mice and old Brown Norway (BN) rats, lenses were seen to contain a greatly expanded pool of unresolved whole nuclei or fragments of nuclei in differentiating lens fiber cells. This generally correlated with increased slit lamp opacities in these mice. Most old dog lenses also had an increase in retained cortical nuclei, as did a few old humans. However, a second rat strain, BNF1, in which opacity was quite high, had no increase in retained nuclei with age, nor did any of the old monkeys, indicating that retained nuclei could not be a cause of opacity in these animals.
The nuclei and nuclear fragments were located at all levels in the outer cortex extending inward from the lens equator and were observable by DAPI staining. These nuclei and nuclear fragments were seen from 12 months onward in all C57BL/6 and DBA/2 mice and to a lesser degree in the CBA, increasing in number and in space occupancy with increasing age. Preliminary results suggest that retention of nuclei in the C57BL/6 mouse is correlated with an age-related loss of DLAD from old lenses. Conclusions A very marked, apparently light-refractive condition caused by retained cortical nuclei and nuclear fragments is present in the lens cortices, increasing with age in the three strains of mice examined and in one of two strains of rats (BN). This condition was also seen in some old dogs and a few old humans. It may be caused by an age-related loss of DLAD, which is essential for nuclear DNA degradation in the lens. However, this condition does not develop in old BNF1 rats or old monkeys and is only seen sporadically in humans. Thus, it cannot be a universal cause for age-related lens opacity or cataract presence, although it develops concurrently with opacity in mice. This phenomenon should be considered when using the old mouse as a model for human age-related cataract. PMID:22065920
New Paradigms for Human-Robotic Collaboration During Human Planetary Exploration
NASA Astrophysics Data System (ADS)
Parrish, J. C.; Beaty, D. W.; Bleacher, J. E.
2017-02-01
Human exploration missions to other planetary bodies offer new paradigms for collaboration (control, interaction) between humans and robots beyond the methods currently used to control robots from Earth and robots in Earth orbit.
Conformal Predictions in Multimedia Pattern Recognition
ERIC Educational Resources Information Center
Nallure Balasubramanian, Vineeth
2010-01-01
The fields of pattern recognition and machine learning are on a fundamental quest to design systems that can learn the way humans do. One important aspect of human intelligence that has so far not been given sufficient attention is the capability of humans to express when they are certain about a decision, or when they are not. Machine learning…
Refueling machine with relative positioning capability
Challberg, R.C.; Jones, C.R.
1998-12-15
A refueling machine is disclosed having relative positioning capability for refueling a nuclear reactor. The refueling machine includes a pair of articulated arms mounted on a refueling bridge. Each arm supports a respective telescoping mast. Each telescoping mast is designed to flex laterally in response to application of a lateral thrust on the end of the mast. A pendant mounted on the end of the mast carries an air-actuated grapple, television cameras, ultrasonic transducers and waterjet thrusters. The ultrasonic transducers are used to detect the gross position of the grapple relative to the bail of a nuclear fuel assembly in the fuel core. The television cameras acquire an image of the bail which is compared to a pre-stored image in computer memory. The pendant can be rotated until the television image and the pre-stored image match within a predetermined tolerance. Similarly, the waterjet thrusters can be used to apply lateral thrust to the end of the flexible mast to place the grapple in a fine position relative to the bail as a function of the discrepancy between the television and pre-stored images. 11 figs.
Bringing UAVs to the fight: recent army autonomy research and a vision for the future
NASA Astrophysics Data System (ADS)
Moorthy, Jay; Higgins, Raymond; Arthur, Keith
2008-04-01
The Unmanned Autonomous Collaborative Operations (UACO) program was initiated in recognition of the high operational burden associated with utilizing unmanned systems by both mounted and dismounted, ground and airborne warfighters. The program was previously introduced at the 62nd Annual Forum of the American Helicopter Society in May of 2006. This paper presents the three technical approaches taken and results obtained in UACO. All three approaches were validated extensively in contractor simulations, two were validated in government simulation, one was flight tested outside the UACO program, and one was flight tested in Part 2 of UACO. Results and recommendations are discussed regarding diverse areas such as user training and human-machine interface, workload distribution, UAV flight safety, data link bandwidth, user interface constructs, adaptive algorithms, air vehicle system integration, and target recognition. Finally, a vision for UAV As A Wingman is presented.
Intelligent platforms for disease assessment: novel approaches in functional echocardiography.
Sengupta, Partho P
2013-11-01
Accelerating trends in the dynamic digital era (from 2004 onward) have resulted in the emergence of novel parametric imaging tools that allow easy and accurate extraction of quantitative information from cardiac images. This review principally attempts to heighten the awareness of newer emerging paradigms that may advance acquisition, visualization and interpretation of the large functional data sets obtained during cardiac ultrasound imaging. Incorporation of innovative cognitive software that allows advanced pattern recognition and disease forecasting will likely transform the human-machine interface and interpretation process to achieve a more efficient and effective work environment. Novel technologies for automation and big data analytics that are already active in other fields need to be rapidly adapted to the health care environment with new academic-industry collaborations to enrich and accelerate the delivery of newer decision making tools for enhancing patient care. Copyright © 2013. Published by Elsevier Inc.
Code of Federal Regulations, 2013 CFR
2013-01-01
...); (3) A fuel fabrication plant; (4) An enrichment plant or isotope separation plant for the separation..., irradiated fuel element chopping machines, and hot cells. Nuclear fuel cycle-related research and development...
Code of Federal Regulations, 2014 CFR
2014-01-01
...); (3) A fuel fabrication plant; (4) An enrichment plant or isotope separation plant for the separation..., irradiated fuel element chopping machines, and hot cells. Nuclear fuel cycle-related research and development...
Dogac, Asuman; Kabak, Yildiray; Namli, Tuncay; Okcan, Alper
2008-11-01
Integrating the Healthcare Enterprise (IHE) specifies integration profiles describing selected real-world use cases to facilitate the interoperability of healthcare information resources. While realizing a complex real-world scenario, IHE profiles are combined by grouping the related IHE actors. Grouping IHE actors implies that the associated business processes (IHE profiles) that the actors are involved in must be combined, that is, the choreography of the resulting collaborative business process must be determined by deciding on the execution sequence of transactions coming from different profiles. There are many IHE profiles and each user or vendor may support a different set of IHE profiles that fits its business needs. However, determining the precedence of all the involved transactions manually for each possible combination of the profiles is a very tedious task. In this paper, we describe how to obtain the overall business process automatically when IHE actors are grouped. For this purpose, we represent the IHE profiles through a standard, machine-processable language, namely, the Organization for the Advancement of Structured Information Standards (OASIS) electronic business eXtensible Markup Language (ebXML) Business Process Specification (ebBP) Language. We define the precedence rules among the transactions of the IHE profiles, again, in a machine-processable way. Then, through a graphical tool, we allow users to select the actors to be grouped and automatically produce the overall business process in a machine-processable format.
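The sequencing step this abstract describes, deciding the execution order of transactions from grouped profiles given machine-processable precedence rules, is essentially a topological sort. A minimal sketch with hypothetical transaction names, using Python's stdlib `graphlib` rather than the OASIS ebBP tooling the paper actually employs:

```python
from graphlib import TopologicalSorter

# Hypothetical transactions from two grouped IHE profiles; each entry maps a
# transaction to the set of transactions that must execute before it
# (i.e., the machine-processable precedence rules).
precedence = {
    "RegisterDocumentSet": {"ProvideDocumentSet"},
    "QueryRegistry": {"RegisterDocumentSet"},
    "RetrieveDocument": {"RegisterDocumentSet", "QueryRegistry"},
}

# static_order() emits one valid execution sequence for the combined
# collaborative business process; ProvideDocumentSet comes first here.
order = list(TopologicalSorter(precedence).static_order())
print(order)
```

A real implementation would parse the precedence rules out of the ebBP documents instead of hard-coding them, but the ordering problem it solves is the same.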
Dictionary of Basic Military Terms
1965-04-01
having nuclear charges. 101 ATOMNAYA SILOVAYA (ENERGETICHESKAYA) KORABEL'NAYA (SUDOVAYA) USTANOVKA (atomic power plant for ship propulsion) - A special...atomic power plant for ship propulsion consists of an atomic "boiler," or reactor, a turbine (steam or gas), and electromechanical machinery. The...type, is mounted on a heavy artillery tractor chassis. A high-speed trench-digging machine can dig trenches to a depth of 1.5 meters. The machine's
Man-equivalent telepresence through four fingered human-like hand system
NASA Technical Reports Server (NTRS)
Jau, Bruno M.
1992-01-01
The author describes a newly developed mechanical hand system. The robot hand is in human-like configuration with a thumb and three fingers, a palm, a wrist, and the forearm in which the hand and wrist actuators are located. Each finger and the wrist has its own active electromechanical compliance system, allowing the joint drive trains to be stiffened or loosened. This mechanism imitates the human muscle dual function of positioner and stiffness controller. This is essential for soft grappling operations. The hand-wrist assembly has 16 finger joints, three wrist joints, and five compliance mechanisms for a total of 24 degrees of freedom. The strength of the hand is roughly half that of the human hand and its size is comparable to a male hand. The hand is controlled through an exoskeleton glove controller that the operator wears. The glove provides the man-machine interface in telemanipulation control mode: it senses the operator's inputs to guide the mechanical hand in hybrid position and force control. The hand system is intended for dexterous manipulations in structured environments. Typical applications will include work in hostile environment such as space operations and nuclear power plants.
Spiers, Adam J; Liarokapis, Minas V; Calli, Berk; Dollar, Aaron M
2016-01-01
Classical robotic approaches to tactile object identification often involve rigid mechanical grippers, dense sensor arrays, and exploratory procedures (EPs). Though EPs are a natural method for humans to acquire object information, evidence also exists for meaningful tactile property inference from brief, non-exploratory motions (a 'haptic glance'). In this work, we implement tactile object identification and feature extraction techniques on data acquired during a single, unplanned grasp with a simple, underactuated robot hand equipped with inexpensive barometric pressure sensors. Our methodology utilizes two cooperating schemes based on an advanced machine learning technique (random forests) and parametric methods that estimate object properties. The available data are limited to actuator positions (one per two-link finger) and force sensor values (eight per finger). The schemes are able to work both independently and collaboratively, depending on the task scenario. When collaborating, the results of each method contribute to the other, improving the overall result in a synergistic fashion. Unlike prior work, the proposed approach does not require object exploration, re-grasping, grasp-release, or force modulation and works for arbitrary object start positions and orientations. Due to these factors, the technique may be integrated into practical robotic grasping scenarios without adding time or manipulation overheads.
Liu, Bing-Chun; Binaykia, Arihant; Chang, Pei-Chann; Tiwari, Manoj Kumar; Tsao, Cheng-Chin
2017-01-01
Today, China is facing a very serious air pollution problem because of its dreadful impact on human health as well as the environment. The urban cities in China are the most affected due to their rapid industrial and economic growth. Therefore, it is of extreme importance to come up with new, better and more reliable forecasting models to accurately predict air quality. This paper selected Beijing, Tianjin and Shijiazhuang, three cities from the Jingjinji Region, for a study proposing a new model of collaborative forecasting using Support Vector Regression (SVR) for urban Air Quality Index (AQI) prediction in China. The present study aims to improve the forecasting results by minimizing the prediction error of present machine learning algorithms, taking multiple-city, multi-dimensional air quality information and weather conditions as input. The results show that there is a decrease in MAPE in the multiple-city, multi-dimensional regression case when there is a strong interaction and correlation of the air quality characteristic attributes with AQI. Also, geographical location is found to play a significant role in Beijing, Tianjin and Shijiazhuang AQI prediction. PMID:28708836
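The multi-city, multi-dimensional input layout the study describes can be illustrated with a toy regression. The sketch below uses entirely synthetic numbers and a plain least-squares fit as a stand-in for the paper's SVR model, along with the MAPE metric used to compare forecasts:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    n = 3
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Synthetic rows: predict next-day Beijing AQI from its own lagged value plus
# readings from Tianjin and Shijiazhuang (all numbers invented).
X = [(120, 110, 150), (90, 100, 130), (150, 140, 180), (60, 70, 90)]
y = [123, 101, 153, 69]  # generated as 0.5*x0 + 0.3*x1 + 0.2*x2

# Least squares via the normal equations (X^T X) w = X^T y.
XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
w = solve3(XtX, Xty)

# Mean Absolute Percentage Error of the fitted predictions.
pred = [sum(wi * xi for wi, xi in zip(w, r)) for r in X]
mape = sum(abs(p - t) / t for p, t in zip(pred, y)) / len(y) * 100
```

In practice one would substitute an actual SVR implementation (e.g., scikit-learn's `SVR`) and real historical AQI and weather features for each city; the point here is only the shape of the multi-city feature matrix and the MAPE evaluation.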
Developing and Validating Practical Eye Metrics for the Sense-Assess-Augment Framework
2015-09-29
Sense-Assess-Augment (SAA) Framework. To better close the loop between the human and machine teammates AFRL's Human Performance Wing and Human...Sense-Assess-Augment (SAA) framework, which is designed to sense a suite of physiological signals from the operator, use these signals to assess the...to use psychophysiological measures to improve human-machine teamwork (such as Biocybernetics or Augmented Cognition) the AFRL-SAA research program
Introducing Nuclear Data Evaluations of Prompt Fission Neutron Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neudecker, Denise
2015-06-17
Nuclear data evaluations provide recommended data sets for nuclear data applications such as reactor physics, stockpile stewardship or nuclear medicine. The evaluated data are often based on information from multiple experimental data sets and nuclear theory using statistical methods. Therefore, they are collaborative efforts of evaluators, theoreticians, experimentalists, benchmark experts, statisticians and application area scientists. In this talk, an introduction is given to the field of nuclear data evaluation using the specific example of a recent evaluation of the outgoing neutron energy spectrum emitted promptly after fission from 239Pu and induced by neutrons from thermal to 30 MeV.
ERIC Educational Resources Information Center
Mahoney, Kristin; Brown, Rich
2013-01-01
We use an experimental course collaboration that occurred in the winter of 2012 as a case study for an approach to interdisciplinary collaboration between Theatre and Humanities courses, and we argue that the theatre methodology of "devising" can serve as a particularly rich locus for collaboration between Theatre students and other…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barty, C J
A renaissance in nuclear physics is occurring around the world because of a new kind of incredibly bright gamma-ray light source that can be created with short pulse lasers and energetic electron beams. These highly Mono-Energetic Gamma-ray (MEGa-ray) sources produce narrow, laser-like beams of incoherent, tunable gamma-rays and are enabling access to and manipulation of the nucleus of the atom with photons, or so-called 'Nuclear Photonics'. Just as in the early days of the laser, when photon manipulation of the valence electron structure of the atom became possible and enabled new applications and science, nuclear photonics with laser-based gamma-ray sources promises both to open up wide areas of practical isotope-related materials applications and to enable new discovery-class nuclear science. In the United States, the development of high brightness and high flux MEGa-ray sources is being actively pursued at the Lawrence Livermore National Laboratory (LLNL) in Livermore, California near San Francisco. The LLNL work aims to create by 2013 a machine that will advance the state of the art with respect to source peak brightness by 6 orders of magnitude. This machine will create beams of 1 to 2.3 MeV photons with color purity matching that of common lasers. In Europe a similar but higher photon energy gamma source has been included as part of the core capability that will be established at the Extreme Light Infrastructure Nuclear Physics (ELI-NP) facility in Magurele, Romania outside of Bucharest. This machine is expected to have an end point gamma energy in the range of 13 MeV. The machine will be co-located with two world-class, 10 Petawatt laser systems thus allowing combined intense-laser and gamma-ray interaction experiments. Such capability will be unique in the world. In this talk, Dr.
Chris Barty from LLNL will review the state of the art with respect to MEGa-ray source design, construction and experiments and will describe both the ongoing projects around the world as well as some of the exciting applications that these machines will enable. The optimized interaction of short-duration, pulsed lasers with relativistic electron beams (inverse laser-Compton scattering) is the key to unrivaled MeV-scale photon source monochromaticity, pulse brightness and flux. In the MeV spectral range, such Mono-Energetic Gamma-ray (MEGa-ray) sources can have many orders of magnitude higher peak brilliance than even the world's largest synchrotrons. They can efficiently perturb and excite the isotope-specific resonant structure of the nucleus in a manner similar to resonant laser excitation of the valence electron structure of the atom.
Paques, Joseph-Jean; Gauthier, François; Perez, Alejandro
2007-01-01
To assess and plan future risk-analysis research projects, 275 documents describing methods and tools for assessing the risks associated with industrial machines or with other sectors such as the military, nuclear, and aeronautics industries were collected. These documents were in the form of published books or papers, standards, technical guides and company procedures collected from throughout industry. From the collected documents, 112 were selected for analysis; 108 methods applied or potentially applicable for assessing the risks associated with industrial machines were analyzed and classified. This paper presents the main quantitative results of the analysis of the methods and tools.
Improving air traffic control: Proving new tools or approving the joint human-machine system?
NASA Technical Reports Server (NTRS)
Gaillard, Irene; Leroux, Marcel
1994-01-01
From the description of a field problem (i.e., designing decision aids for air traffic controllers), this paper points out how a cognitive engineering approach provides the milestones for the evaluation of future joint human-machine systems.
Particle Accelerator Focus Automation
NASA Astrophysics Data System (ADS)
Lopes, José; Rocha, Jorge; Redondo, Luís; Cruz, João
2017-08-01
The Laboratório de Aceleradores e Tecnologias de Radiação (LATR) at the Campus Tecnológico e Nuclear of Instituto Superior Técnico (IST) has a horizontal electrostatic particle accelerator based on the Van de Graaff machine, which is used for research in the area of material characterization. This machine produces alpha (He+) and proton (H+) beams with currents of a few μA at energies up to 2 MeV/q. Beam focusing is obtained using a cylindrical lens of the Einzel type, assembled near the high voltage terminal. This paper describes the developed system that automatically focuses the ion beam, using a personal computer running the LabVIEW software, a multifunction input/output board and signal conditioning circuits. The focusing procedure consists of a scanning method to find the lens bias voltage which maximizes the beam current measured on a beam stopper target, which is used as feedback for the scanning cycle. This system, as part of a wider start-up and shut-down automation system built for this particle accelerator, brings great advantages to the operation of the accelerator by making it faster and easier to operate, requiring less human presence, and adding the possibility of total remote control in safe conditions.
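The focusing procedure described, sweeping the Einzel-lens bias voltage and keeping the value that maximizes the beam current read back from the beam stopper, reduces to a simple scan loop. A minimal sketch with a simulated current readout; the Gaussian response, optimum voltage, and scan range are invented for illustration (the real system reads the current through a DAQ board under LabVIEW):

```python
import math

def beam_current(v_lens):
    # Simulated beam-stopper current (A): peaks at the (unknown) optimal bias.
    # V_OPT and WIDTH are invented values standing in for the real lens response.
    V_OPT, WIDTH = 7300.0, 900.0
    return 2.0e-6 * math.exp(-((v_lens - V_OPT) / WIDTH) ** 2)

def focus_scan(v_min, v_max, step):
    """Coarse scan: return the lens bias voltage that maximizes the measured current."""
    best_v, best_i = v_min, -1.0
    v = v_min
    while v <= v_max:
        i = beam_current(v)   # in hardware: a current measurement on the target
        if i > best_i:
            best_v, best_i = v, i
        v += step
    return best_v

v_focus = focus_scan(0.0, 15000.0, 50.0)
print(f"focused at {v_focus} V")
```

A production version would typically follow the coarse scan with a finer sweep around the maximum, and would guard against noisy single readings by averaging several samples per voltage step.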
Practice of One Health approaches: bridges and barriers in Tanzania.
Kayunze, Kim A; Kiwara, Angwara; Lyamuya, Eligius; Kambarage, Dominic M; Rushton, Jonathan; Coker, Richard; Kock, Richard
2014-04-23
The practice of One Health approaches in human and animal health programmes is influenced by the type and scope of bridges and barriers for partnerships. It was thus essential to evaluate the nature and scope of collaborative arrangements among human, animal, and wildlife health experts in dealing with health challenges which demand inter-sectoral partnership. The nature of collaborative arrangements was assessed, and the respective bridges and barriers over a period of 12 months (July 2011 to June 2012) were identified. The specific objectives were to: (1) determine the proportion of health experts who had collaborated with other experts of disciplines different from theirs, (2) rank the general bridges for and barriers against collaboration according to the views of the health experts, and (3) find the actual bridges for and barriers against collaboration among the health experts interviewed. It was found that 27.0% of animal health officers interviewed had collaborated with medical officers, while 12.4% of the medical officers interviewed had collaborated with animal health experts. Only 6.7% of the wildlife officers had collaborated with animal health experts. The main bridges for collaboration were instruction by upper-level leaders, zoonotic diseases of serious impact, and availability of funding. The main barriers to collaboration were lack of knowledge about animal/human health issues, lack of networks for collaboration, and lack of plans to collaborate. This calls for the need to curb barriers in order to enhance inter-sectoral collaboration for more effective management of risks attributable to infectious diseases of humans and animals.
Collaborative Lab Reports with Google Docs
ERIC Educational Resources Information Center
Wood, Michael
2011-01-01
Science is a collaborative endeavor. The solitary genius working on the next great scientific breakthrough is a myth not seen much today. Instead, most physicists have worked in a group at one point in their careers, whether as a graduate student, faculty member, staff scientist, or industrial researcher. As an experimental nuclear physicist with…
Advancing the Theory of Nuclear Reactions with Rare Isotopes. From the Laboratory to the Cosmos
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nunes, Filomena
2015-06-01
The mission of the Topical Collaboration on the Theory of Reactions for Unstable iSotopes (TORUS) was to develop new methods to advance nuclear reaction theory for unstable isotopes—particularly the (d,p) reaction in which a deuteron, composed of a proton and a neutron, transfers its neutron to an unstable nucleus. After benchmarking the state-of-the-art theories, the TORUS collaboration found that there were no exact methods to study (d,p) reactions involving heavy targets, the difficulty arising from the long-range nature of the well known, yet subtle, Coulomb force. To overcome this challenge, the TORUS collaboration developed a new theory in which the complexity of treating the long-range Coulomb interaction is shifted to the calculation of so-called form factors. An efficient implementation for the computation of these form factors was a major achievement of the TORUS collaboration. All the new machinery developed is an essential ingredient for analysing (d,p) reactions involving heavy nuclei relevant for astrophysics, energy production, and stockpile stewardship.
Integration Telegram Bot on E-Complaint Applications in College
NASA Astrophysics Data System (ADS)
Rosid, M. A.; Rachmadany, A.; Multazam, M. T.; Nandiyanto, A. B. D.; Abdullah, A. G.; Widiaty, I.
2018-01-01
The Internet of Things (IoT) has influenced human life, with internet connectivity extending from human-to-human to human-to-machine or machine-to-machine. Within this research field, technologies and concepts are created that allow humans to communicate with machines for a specific purpose. This research aimed to integrate the Telegram message-sending service with the e-complaint application at a college. With this application, users do not need to visit the URL of the e-complaint application; instead, a complaint can be submitted simply via Telegram, and it will then be forwarded to the e-complaint application. From the test results, the e-complaint integration with the Telegram Bot ran in accordance with the design. The Telegram Bot makes it easy for academicians to submit a complaint, and it offers users the familiar interface people use every day on their smartphones. Thus, with this system, the work unit concerned can immediately make improvements, since all complaints can be delivered rapidly.
ERIC Educational Resources Information Center
Dereli, Esra; Aypay, Ayse
2012-01-01
The purpose of this study was to investigate the relationships among the empathic tendency, collaboration character trait, and human values of high school students, and whether high school students' empathic tendency, collaboration character trait, and human values differ based on personal characteristics (gender, class level, mother and father…
Trials for the cosmological 7Li problem with 7Be beams at CRIB and collaborating studies
NASA Astrophysics Data System (ADS)
Hayakawa, S.
2017-09-01
For many years, the cosmological 7Li problem has been tackled from various angles. The nuclear reaction data have also been improved, but some ambiguities remain. We review our experimental plans to measure the cross sections of three key reactions that act to destroy 7Be during Big-Bang Nucleosynthesis (BBN). These experiments are all based on 7Be beams produced at the Center for Nuclear Study Radioactive Ion Beam separator (CRIB), in collaboration mainly with research groups from INFN-LNS and RCNP. The preliminary result of the previous experiment and the future plan are discussed.
Pullara, Filippo; Guerrero-Santoro, Jennifer; Calero, Monica; Zhang, Qiangmin; Peng, Ye; Spåhr, Henrik; Kornberg, Guy L.; Cusimano, Antonella; Stevenson, Hilary P.; Santamaria-Suarez, Hugo; Reynolds, Shelley L.; Brown, Ian S.; Monga, Satdarshan P.S.; Van Houten, Bennett; Rapić-Otrin, Vesna; Calero, Guillermo; Levine, Arthur S.
2014-01-01
Expression of recombinant proteins in bacterial or eukaryotic systems often results in aggregation rendering them unavailable for biochemical or structural studies. Protein aggregation is a costly problem for biomedical research. It forces research laboratories and the biomedical industry to search for alternative, more soluble, non-human proteins and limits the number of potential “druggable” targets. In this study we present a highly reproducible protocol that introduces the systematic use of an extensive number of detergents to solubilize aggregated proteins expressed in bacterial and eukaryotic systems. We validate the usefulness of this protocol by solubilizing traditionally difficult human protein targets to milligram quantities and confirm their biological activity. We use this method to solubilize monomeric or multimeric components of multi-protein complexes and demonstrate its efficacy to reconstitute large cellular machines. This protocol works equally well on cytosolic, nuclear and membrane proteins and can be easily adapted to a high throughput format. PMID:23137940
Human Cognitive Enhancement Ethical Implications for Airman-Machine Teaming
2017-04-06
Excerpt: "Psychological Constructs versus Neural Mechanisms: Different Perspectives for Advanced Research of Cognitive Processes and Development of Neuroadaptive..." Air War College, Air University report on human cognitive enhancement and its ethical implications for airman-machine teaming, by William M. Curlin, addressing increasingly challenging adversarial threats. It is hypothesized that by the year 2030, human system operators will be "cognitively challenged" to keep pace.
Machine learning for micro-tomography
NASA Astrophysics Data System (ADS)
Parkinson, Dilworth Y.; Pelt, Daniël. M.; Perciano, Talita; Ushizima, Daniela; Krishnan, Harinarayan; Barnard, Harold S.; MacDowell, Alastair A.; Sethian, James
2017-09-01
Machine learning has revolutionized a number of fields, but many micro-tomography users have never used it in their work. The micro-tomography beamline at the Advanced Light Source (ALS), in collaboration with the Center for Applied Mathematics for Energy Research Applications (CAMERA) at Lawrence Berkeley National Laboratory, has now deployed a series of tools that use machine learning to automate data processing for ALS users. These include new reconstruction algorithms, feature extraction tools, and image classification and recommendation systems for scientific images. Some of these tools run in automated pipelines that operate on data as it is collected; others are stand-alone software. They are deployed on computing resources at Berkeley Lab, from workstations to supercomputers, and made accessible to users through either scripting or easy-to-use graphical interfaces. This paper presents a progress report on this work.
Visualization tool for human-machine interface designers
NASA Astrophysics Data System (ADS)
Prevost, Michael P.; Banda, Carolyn P.
1991-06-01
As modern human-machine systems continue to grow in capabilities and complexity, system operators are faced with integrating and managing increased quantities of information. Since many information components are highly related to each other, optimizing the spatial and temporal aspects of presenting information to the operator has become a formidable task for the human-machine interface (HMI) designer. The authors describe a tool in an early stage of development, the Information Source Layout Editor (ISLE). This tool is to be used for information presentation design and analysis; it uses human factors guidelines to assist the HMI designer in the spatial layout of the information required by machine operators to perform their tasks effectively. These human factors guidelines address such areas as the functional and physical relatedness of information sources. By representing these relationships with metaphors such as spring tension, attractors, and repellers, the tool can help designers visualize the complex constraint space and interacting effects of moving displays to various alternate locations. The tool contains techniques for visualizing the relative 'goodness' of a configuration, as well as mechanisms such as optimization vectors to provide guidance toward a more optimal design. Also available is a rule-based design checker to determine compliance with selected human factors guidelines.
Organizational Culture for Safety, Security, and Safeguards in New Nuclear Power Countries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kovacic, Donald N
2015-01-01
This chapter will contain the following sections: • Existing international norms and standards for developing the infrastructure to support new nuclear power programs • The role of organizational culture and how it supports the safe, secure, and peaceful application of nuclear power • Identifying effective and efficient strategies for implementing safety, security, and safeguards in nuclear operations • Challenges identified in the implementation of safety, security, and safeguards • Potential areas for future collaboration between countries in order to support nonproliferation culture
Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling
NASA Astrophysics Data System (ADS)
Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.
2017-12-01
Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
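The five categories of digital objects the abstract says must be tracked can be checked mechanically against a resource's metadata. A small illustrative sketch, assuming invented key names rather than HydroShare's actual metadata schema:

```python
# Illustrative check that a model resource tracks all five categories of
# digital objects needed for reproducibility. Key names are hypothetical,
# not HydroShare's real schema.
REQUIRED = {"raw_data", "processing_scripts", "model_inputs",
            "model_results", "model_code"}

def is_reproducible(resource):
    """True only if every required digital-object category is present."""
    return REQUIRED <= set(resource)

resource = {"raw_data": ["forcing.nc"],
            "processing_scripts": ["clean.py"],
            "model_inputs": ["config.txt"],
            "model_results": ["out.nc"],
            "model_code": ["summa", "requirements.txt"]}
complete = is_reproducible(resource)
```

A real system would also validate software dependencies and computational requirements within the `model_code` entry; this sketch only checks category coverage.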
Traceability of On-Machine Tool Measurement: A Review.
Mutilba, Unai; Gomez-Acedo, Eneko; Kortaberria, Gorka; Olarra, Aitor; Yagüe-Fabra, Jose A
2017-07-11
Nowadays, errors during the manufacturing process of high-value components are not acceptable in driving industries such as energy and transportation. Sectors such as aerospace, automotive, shipbuilding, nuclear power, large science facilities, and wind power need complex and accurate components that demand close measurements and fast feedback into their manufacturing processes. New measuring technologies are already available in machine tools, including integrated touch probes and fast interface capabilities. They provide the possibility of measuring the workpiece in-machine during or after its manufacture, maintaining the original setup of the workpiece and avoiding interruption of the manufacturing process to transport the workpiece to a measuring position. However, the traceability of the measurement process on a machine tool is not yet ensured, and measurement data are still not reliable enough for process control or product validation. The scientific objective is to determine the uncertainty of a machine tool measurement and thereby convert it into a machine-integrated traceable measuring process. For that purpose, an error budget should consider error sources such as the machine tool, the component under measurement, and the interactions between the two. This paper reviews all of those uncertainty sources, focusing mainly on those related to the machine tool, whether in the process of geometric error assessment of the machine or in the technology employed to probe the measurand.
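An error budget of the kind described is conventionally combined in quadrature (root-sum-of-squares of uncorrelated standard uncertainty components, as in the GUM); a minimal sketch with made-up component values:

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-of-squares of uncorrelated standard uncertainty components,
    e.g. machine-tool geometric error, probing error, thermal drift."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical budget: geometry 3 um and probing 4 um combine to 5 um.
u_c = combined_standard_uncertainty([3.0, 4.0])
```

Correlated error sources (e.g. thermally coupled axes) would need covariance terms as well; this sketch assumes independence.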
Man-Machine Communication Research.
1977-02-01
Results included: analysis of communication difficulty for the computer-naive; discovery of major communication structures in human communication that have been left out of man-machine processes; creation of a new overview of how human communication functions in cooperative task-oriented activity; and assistance in ARPA policy formation on CAI equipment development.
ERIC Educational Resources Information Center
Sagan, Carl
1975-01-01
The author of this article believes that human survival depends upon the ability to develop and work with machines of high artificial intelligence. He lists uses of such machines, including terrestrial mining, outer space exploration, and other tasks too dangerous, too expensive, or too boring for human beings. (MA)
Carnahan, Brian; Meyer, Gérard; Kuntz, Lois-Ann
2003-01-01
Multivariate classification models play an increasingly important role in human factors research. In the past, these models have been based primarily on discriminant analysis and logistic regression. Models developed from machine learning research offer the human factors professional a viable alternative to these traditional statistical classification methods. To illustrate this point, two machine learning approaches--genetic programming and decision tree induction--were used to construct classification models designed to predict whether or not a student truck driver would pass his or her commercial driver license (CDL) examination. The models were developed and validated using the curriculum scores and CDL exam performances of 37 student truck drivers who had completed a 320-hr driver training course. Results indicated that the machine learning classification models were superior to discriminant analysis and logistic regression in terms of predictive accuracy. Actual or potential applications of this research include the creation of models that more accurately predict human performance outcomes.
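Decision-tree induction of the kind used in the study can be illustrated with a single-split "stump" over one curriculum score. This toy code is a sketch of the general technique, not the authors' model, and the data are hypothetical:

```python
def best_stump(scores, passed):
    """Find the score threshold that best predicts pass (1) vs. fail (0),
    the single-split building block of decision tree induction."""
    best_t, best_acc = None, 0.0
    for t in sorted(set(scores)):
        preds = [1 if s >= t else 0 for s in scores]
        acc = sum(p == y for p, y in zip(preds, passed)) / len(passed)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

# Toy data: curriculum scores and CDL pass/fail outcomes (hypothetical).
t, acc = best_stump([50, 60, 70, 80, 90, 95], [0, 0, 0, 1, 1, 1])
```

A full tree repeats this split search recursively on each branch; genetic programming instead evolves whole classification expressions, but both share this idea of searching for discriminating conditions.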
de Visser, Ewart J.; Monfort, Samuel S.; Goodyear, Kimberly; Lu, Li; O’Hara, Martin; Lee, Mary R.; Parasuraman, Raja; Krueger, Frank
2017-01-01
Objective We investigated the effects of exogenous oxytocin on trust, compliance, and team decision making with agents varying in anthropomorphism (computer, avatar, human) and reliability (100%, 50%). Background Recent work has explored psychological similarities in how we trust human-like automation compared to how we trust other humans. Exogenous administration of oxytocin, a neuropeptide associated with trust among humans, offers a unique opportunity to probe the anthropomorphism continuum of automation to infer when agents are trusted like another human or merely a machine. Method Eighty-four healthy male participants collaborated with automated agents varying in anthropomorphism that provided recommendations in a pattern recognition task. Results Under placebo, participants exhibited less trust and compliance with automated aids as the anthropomorphism of those aids increased. Under oxytocin, participants interacted with aids on the extremes of the anthropomorphism continuum similarly to the placebo condition, but increased their trust, compliance, and performance with the avatar, an agent at the midpoint of the anthropomorphism continuum. Conclusion This study provided the first evidence that administration of exogenous oxytocin affected trust, compliance, and team decision making with automated agents. These effects provide support for the premise that oxytocin increases affinity for social stimuli in automated aids. Application Designing automation to mimic basic human characteristics is sufficient to elicit behavioral trust outcomes that are driven by neurological processes typically observed in human-human interactions. Designers of automated systems should consider the task, the individual, and the level of anthropomorphism to achieve the desired outcome. PMID:28146673
Integrated human-machine intelligence in space systems
NASA Technical Reports Server (NTRS)
Boy, Guy A.
1992-01-01
The integration of human and machine intelligence in space systems is outlined with respect to the contributions of artificial intelligence. The current state of the art in intelligent assistant systems (IASs) is reviewed, and the requirements of some real-world applications of the technologies are discussed. A concept of integrated human-machine intelligence is examined in the contexts of: (1) interactive systems that tolerate human errors; (2) systems for the relief of workloads; and (3) interactive systems for solving problems in abnormal situations. Key issues in the development of IASs include the compatibility of the systems with astronauts in terms of inputs/outputs, processing, real-time AI, and knowledge-based system validation. Real-world applications are suggested, such as the diagnosis, planning, and control of engineered systems.
Th and U fuel photofission study by NTD for AD-MSR subcritical assembly
NASA Astrophysics Data System (ADS)
Sajo-Bohus, Laszlo; Greaves, Eduardo D.; Davila, Jesus; Barros, Haydn; Pino, Felix; Barrera, Maria T.; Farina, Fulvio
2015-07-01
During the last decade a considerable effort has been devoted to developing energy-generating systems based on advanced nuclear technology within the GEN-IV design concepts. Thorium-based fuel systems, such as accelerator-driven nuclear reactors, are one of the frequently mentioned attractive and affordable options. Several radiotherapy linear accelerators are on the market and, given their reliability, they could be employed as drivers for subcritical liquid-fuel assemblies. Bremsstrahlung photons with energies above 5.5 MeV induce (γ,n) and (e,e'n) reactions in the W target. The resulting gamma radiation and photo- or fission neutrons may be absorbed in target materials such as thorium and uranium isotopes to induce sustained fission, or nuclear transmutation in radioactive waste materials. The relevant photo-driven and photo-fission reaction cross sections are important for the actinides 232Th, 238U, and 237Np in the radiotherapy-machine energy range of 10-20 MV. In this study we employ passive nuclear track detectors (NTDs) to determine fission rates and neutron production rates, with the aim of establishing the feasibility of gamma- and photo-neutron-driven subcritical assemblies. To meet these objectives, a 20 MV radiotherapy machine has been employed with a mixed-fuel target. The results will support further development of a subcritical assembly employing a thorium-containing liquid fuel. It is expected that the acquired technological knowledge will contribute to the Venezuelan nuclear energy program.
Biomimetics in Intelligent Sensor and Actuator Automation Systems
NASA Astrophysics Data System (ADS)
Bruckner, Dietmar; Dietrich, Dietmar; Zucker, Gerhard; Müller, Brit
Intelligent machines are an old dream of mankind. With increasing technological development, the requirements for intelligent devices have also increased. However, up to now, artificial intelligence (AI) lacks solutions to the demands of truly intelligent machines that can integrate themselves into daily human environments without difficulty. Current hardware with a processing power of billions of operations per second (but without any model of human-like intelligence) has not substantially contributed to the intelligence of machines compared with that of the early AI era. There are great results, of course: machines can find the shortest path between far-apart cities on a map, and algorithms let you find information described by only a few key words. But no machine is able to get us a cup of coffee from the kitchen yet.
Combining human and machine processes (CHAMP)
NASA Astrophysics Data System (ADS)
Sudit, Moises; Sudit, David; Hirsch, Michael
2015-05-01
Machine reasoning and intelligence is usually done in a vacuum, without consultation of the ultimate decision-maker. The late consideration of the human cognitive process causes major problems in the use of automated systems to provide reliable and actionable information that users can trust and depend on to make the best course of action (COA). On the other hand, if automated systems are created exclusively from human cognition, then there is a danger of developing systems that don't push the barrier of technology and are designed mainly for the comfort level of selected subject matter experts (SMEs). Our approach to combining human and machine processes (CHAMP) is based on the notion of developing optimal strategies for where, when, how, and which human intelligence should be injected within a machine reasoning and intelligence process. This combination is based on the criteria of improving the quality of the output of the automated process while maintaining the computational efficiency required for a COA to be actuated in a timely fashion. This research addresses the following problem areas: • Providing consistency within a mission: injection of human reasoning and intelligence within the reliability and temporal needs of a mission to attain situational awareness, impact assessment, and COA development. • Supporting the incorporation of data that is uncertain, incomplete, imprecise, and contradictory (UIIC): development of mathematical models to suggest the insertion of a cognitive process within a machine reasoning and intelligent system so as to minimize UIIC concerns. • Developing systems that include humans in the loop whose performance can be analyzed and understood to provide feedback to the sensors.
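One way to picture the "where and when to inject human intelligence" decision the abstract describes is as a confidence/time trade-off. The rule, threshold, and parameter names below are illustrative assumptions, not the CHAMP models themselves:

```python
def should_ask_human(machine_confidence, seconds_remaining,
                     human_response_time, threshold=0.7):
    """Hypothetical injection policy: consult a human only when the
    automated result is uncertain AND there is still time within the
    mission's temporal needs for the human to respond."""
    uncertain = machine_confidence < threshold
    time_ok = seconds_remaining > human_response_time
    return uncertain and time_ok

# Uncertain result, ample time: inject human judgment.
ask = should_ask_human(0.4, seconds_remaining=120, human_response_time=60)
```

The actual CHAMP work optimizes this choice with mathematical models over UIIC data; this sketch only conveys the shape of the decision.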
Cognitive consequences of clumsy automation on high workload, high consequence human performance
NASA Technical Reports Server (NTRS)
Cook, Richard I.; Woods, David D.; Mccolligan, Elizabeth; Howie, Michael B.
1991-01-01
The growth of computational power has fueled attempts to automate more of the human role in complex problem solving domains, especially those where system faults have high consequences and where periods of high workload may saturate the performance capacity of human operators. Examples of these domains include flightdecks, space stations, air traffic control, nuclear power operation, ground satellite control rooms, and surgical operating rooms. Automation efforts may have unanticipated effects on human performance, particularly if they increase the workload at peak workload times or change the practitioners' strategies for coping with workload. Smooth and effective changes in automation require detailed understanding of the cognitive tasks confronting the user; this has been called user-centered automation. The introduction of a new computerized technology in a group of hospital operating rooms used for heart surgery was observed. The study revealed how automation, especially 'clumsy automation', affects practitioner work patterns, and suggests that clumsy automation constrains users in specific and significant ways. Users tailor both the new system and their tasks in order to accommodate the needs of process and production. The study of this tailoring may prove a powerful tool for exposing previously hidden patterns of user data processing, integration, and decision making which may, in turn, be useful in the design of more effective human-machine systems.
ERIC Educational Resources Information Center
Rosen, Yigal
2014-01-01
Students' performance in human-to-human and human-to-agent collaborative problem solving assessment tasks is investigated in this paper. A secondary data analysis of the research reported by Rosen and Tager (2013) was conducted in order to investigate the comparability of the opportunities for conflict situations in human-to-human and…
Human machine interface display design document.
DOT National Transportation Integrated Search
2008-01-01
The purpose of this document is to describe the design for the human machine interface (HMI) display for the Next Generation 9-1-1 (NG9-1-1) System (or system of systems) based on the initial Tier 1 requirements identified for the NG9-1-1 S...
Resource Letter AFHEP-1: Accelerators for the Future of High-Energy Physics
NASA Astrophysics Data System (ADS)
Barletta, William A.
2012-02-01
This Resource Letter provides a guide to literature concerning the development of accelerators for the future of high-energy physics. Research articles, books, and Internet resources are cited for the following topics: motivation for future accelerators, present accelerators for high-energy physics, possible future machines, and laboratory and collaboration websites.
Northern Kentucky University and the U.S. EPA Office of Research and Development in Cincinnati are collaborating to develop a harmful algal bloom detection algorithm that estimates the presence of cyanobacteria in freshwater systems by image analysis. Green and blue-green alg...
Design of Scalable and Effective Earth Science Collaboration Tool
NASA Astrophysics Data System (ADS)
Maskey, M.; Ramachandran, R.; Kuo, K. S.; Lynnes, C.; Niamsuwan, N.; Chidambaram, C.
2014-12-01
Collaborative research is growing rapidly. Many tools, including IDEs, are now beginning to incorporate collaborative features. Software engineering research has shown the effectiveness of collaborative programming and analysis; in particular, it has highlighted drastic reductions in software development time, and therefore cost. Recently, we have witnessed the rise of applications that allow users to share their content; most of these applications scale such collaboration using cloud technologies. Earth science research needs to adopt collaboration technologies to reduce redundancy, cut cost, expand the knowledge base, and scale research experiments. To address these needs, we developed the Earth science Collaboration Workbench (CWB). CWB provides researchers with various collaboration features by augmenting their existing analysis tools, minimizing the learning curve. During the development of the CWB, we came to understand that Earth science collaboration tasks are varied, and we concluded that it is not possible to design a tool that serves all collaboration purposes. We adopted a mix of synchronous and asynchronous sharing methods that can be used to collaborate across both time and location. We have used cloud technology to scale the collaboration. The cloud has been a highly utilized and valuable tool for Earth science researchers: among other uses, it serves for sharing research results, Earth science data, and virtual machine images, allowing CWB to create and maintain research environments and networks that enhance collaboration between researchers. Furthermore, the collaborative versioning tool Git is integrated into CWB for versioning of science artifacts. In this paper, we present our experience in designing and implementing the CWB.
We will also discuss the integration of collaborative code development use cases for data search and discovery using NASA DAAC and simulation of satellite observations using NASA Earth Observing System Simulation Suite (NEOS3).
Adapting human-machine interfaces to user performance.
Danziger, Zachary; Fishbach, Alon; Mussa-Ivaldi, Ferdinando A
2008-01-01
The goal of this study was to create and examine machine learning algorithms that adapt in a controlled and cadenced way to foster a harmonious learning environment between the user of a human-machine interface and the controlled device. In this experiment, subjects' high-dimensional finger motions remotely controlled the joint angles of a simulated planar 2-link arm, which was used to hit targets on a computer screen. Subjects were required to move the cursor at the endpoint of the simulated arm.
1951-03-14
Fragmentary text from a 1951 report on human engineering: it discusses work on improving air navigation and on perfecting the machines and tools that humans operate; the design of man-machine systems that yield optimal results in efficiency; and the observation that when the human is considered only as an instrument, a tool, or a motor in the operation of machines and equipment, the emphasis falls on adjusting the human rather than on orderly system development.
Delivering key signals to the machine: seeking the electric signal that muscles emanate
NASA Astrophysics Data System (ADS)
Bani Hashim, A. Y.; Maslan, M. N.; Izamshah, R.; Mohamad, I. S.
2014-11-01
Because of the limits on electric power generation in the human body, present human-machine interfaces have had limited success: standard electronic circuit designs do not consider the characteristics of the signals obtained from the skin. In general, the outcomes and applications of human-machine interfaces are limited to custom-designed subsystems, such as neuroprostheses. We seek to model the biodynamics beneath the skin with equivalent mathematical definitions, descriptions, and theorems. Within the human skin there are networks of nerves that permit the skin to function as a multi-dimensional transducer. We investigate the structure of the skin; apart from multiple networks of nerves, there are other segments within the skin, such as minute muscles. We identify the segments that are active during electromyography activity: when the nervous system fires signals, the muscle is stimulated. We evaluate the biodynamic phenomena of the muscles associated with the electromyography activity of the nervous system. In effect, we design a relationship between the human somatosensory system and a synthetic sensory system as the union of a complete set in a new domain of the functional system. This classifies electromyogram waveforms linked to the intent of an operator. The system will become the basis for delivering key signals to a machine so that the machine acts under the operator's intent.
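A common first step when linking electromyogram waveforms to operator intent is an amplitude feature such as the windowed root-mean-square of the signal. A minimal sketch; the window contents and the `rms` helper are illustrative, not the authors' method:

```python
import math

def rms(window):
    """Root-mean-square amplitude of one EMG sample window: a standard
    feature reflecting contraction strength."""
    return math.sqrt(sum(v * v for v in window) / len(window))

# Hypothetical sample windows: a stronger contraction gives a larger RMS.
weak = rms([0.1, -0.1, 0.2, -0.2])
strong = rms([1.0, -1.2, 0.9, -1.1])
```

A classifier would then map such features (RMS, zero crossings, spectral measures) to intent classes; that stage is beyond this sketch.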
NASA Technical Reports Server (NTRS)
Shafto, Michael G.; Remington, Roger W.; Trimble, Jay W.
1994-01-01
A case study is presented to illustrate some of the problems of applying cognitive science to complex human-machine systems. Disregard for facts about human cognition often undermines the safety, reliability, and cost-effectiveness of complex systems. Yet single-point methods (for example, better user-interface design), whether rooted in computer science or in experimental psychology, fall far short of addressing systems-level problems in a timely way using realistic resources. A model-based methodology is proposed for organizing and prioritizing the cognitive engineering effort, focusing appropriate expertise on major problems first, then moving to more sophisticated refinements if time and resources permit. This case study is based on a collaborative effort between the Human Factors Division at NASA-Ames and the Spaceborne Imaging Radar SIR-C/X-Band Synthetic Aperture Radar (SIR-C/X-SAR) Project at the Jet Propulsion Laboratory (JPL), California Institute of Technology. The first SIR-C/X-SAR Shuttle mission flew successfully in April 1994. A series of such missions is planned to provide radar data to study Earth's ecosystems, climatic and geological processes, hydrologic cycle, and ocean circulation. In addition to JPL and NASA personnel, the SIR-C/X-SAR operations team included scientists and engineers from the German and Italian space agencies.
The 7SK snRNP associates with the little elongation complex to promote snRNA gene expression.
Egloff, Sylvain; Vitali, Patrice; Tellier, Michael; Raffel, Raoul; Murphy, Shona; Kiss, Tamás
2017-04-03
The 7SK small nuclear RNP (snRNP), composed of the 7SK small nuclear RNA (snRNA), MePCE, and Larp7, regulates the mRNA elongation capacity of RNA polymerase II (RNAPII) through controlling the nuclear activity of positive transcription elongation factor b (P-TEFb). Here, we demonstrate that the human 7SK snRNP also functions as a canonical transcription factor that, in collaboration with the little elongation complex (LEC) comprising ELL, Ice1, Ice2, and ZC3H8, promotes transcription of RNAPII-specific spliceosomal snRNA and small nucleolar RNA (snoRNA) genes. The 7SK snRNA specifically associates with a fraction of RNAPII hyperphosphorylated at Ser5 and Ser7, which is a hallmark of RNAPII engaged in snRNA synthesis. Chromatin immunoprecipitation (ChIP) and chromatin isolation by RNA purification (ChIRP) experiments revealed enrichments for all components of the 7SK snRNP on RNAPII-specific sn/snoRNA genes. Depletion of 7SK snRNA or Larp7 disrupts LEC integrity, inhibits RNAPII recruitment to RNAPII-specific sn/snoRNA genes, and reduces nascent snRNA and snoRNA synthesis. Thus, through controlling both mRNA elongation and sn/snoRNA synthesis, the 7SK snRNP is a key regulator of nuclear RNA production by RNAPII. © 2017 The Authors.
Supervised machine learning and active learning in classification of radiology reports.
Nguyen, Dung H M; Patrick, Jon D
2014-01-01
This paper presents an automated system for classifying the results of imaging examinations (CT, MRI, positron emission tomography) into reportable and non-reportable cancer cases. This system is part of an industrial-strength processing pipeline built to extract content from radiology reports for use in the Victorian Cancer Registry. In addition to traditional supervised learning methods such as conditional random fields and support vector machines, active learning (AL) approaches were investigated to optimize training production and further improve classification performance. The project involved two pilot sites in Victoria, Australia (Lake Imaging (Ballarat) and Peter MacCallum Cancer Centre (Melbourne)) and, in collaboration with the NSW Central Registry, one pilot site at Westmead Hospital (Sydney). The reportability classifier performance achieved 98.25% sensitivity and 96.14% specificity on the cancer registry's held-out test set. Up to 92% of training data needed for supervised machine learning can be saved by AL. AL is a promising method for optimizing the supervised training production used in classification of radiology reports. When an AL strategy is applied during the data selection process, the cost of manual classification can be reduced significantly. The most important practical application of the reportability classifier is that it can dramatically reduce human effort in identifying relevant reports from the large imaging pool for further investigation of cancer. The classifier is built on a large real-world dataset and can achieve high performance in filtering relevant reports to support cancer registries. Published by the BMJ Publishing Group Limited.
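The active learning strategy described, which saves labeling effort by querying only informative reports, is commonly implemented as uncertainty sampling. A minimal sketch of the general technique, not the authors' pipeline; the probability values are invented:

```python
def select_for_labeling(probs, k=1):
    """Uncertainty sampling: return the indices of the k unlabeled reports
    whose predicted probability of 'reportable' is closest to 0.5, i.e.
    where the current classifier is least certain and a human label
    contributes the most to training."""
    ranked = sorted(range(len(probs)), key=lambda i: abs(probs[i] - 0.5))
    return ranked[:k]

# Hypothetical model scores for four unlabeled radiology reports.
queries = select_for_labeling([0.91, 0.52, 0.08, 0.45], k=2)
```

In an AL loop, the selected reports are labeled by a human, the classifier is retrained, and the selection repeats, which is how large fractions of routine labeling can be avoided.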
Bio-Inspired Human-Level Machine Learning
2015-10-25
Extensions to high-level cognitive functions such as the anagram-solving problem. ... We expect that the bio-inspired human-level machine learning combined with ... the numbers of 10^11 neurons and 10^14 synaptic connections in the human brain. In previous work, we experimentally demonstrated the feasibility of cognitive ...
Martínez-Córcoles, Mario; Schöbel, Markus; Gracia, Francisco J; Tomás, Inés; Peiró, José M
2012-07-01
Safety participation is of paramount importance in guaranteeing the safe running of nuclear power plants. The present study examined the effects of empowering leadership on safety participation. Based on a sample of 495 employees from two Spanish nuclear power plants, structural equation modeling showed that empowering leadership has a significant relationship with safety participation, which is mediated by collaborative team learning. In addition, the results revealed that the relationship between empowering leadership and collaborative learning is partially mediated by the promotion of dialogue and open communication. The implications of these findings for safety research and their practical applications are outlined. An empowering leadership style enhances workers' safety performance, particularly safety participation behaviors. Safety participation is recommended to detect possible rule inconsistencies or misunderstood procedures and make workers aware of critical safety information and issues. Crown Copyright © 2012. Published by Elsevier Ltd. All rights reserved.
A Distributed Control System Prototyping Environment to Support Control Room Modernization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lew, Roger Thomas; Boring, Ronald Laurids; Ulrich, Thomas Anthony
Operators of critical processes, such as nuclear power production, must contend with highly complex systems, procedures, and regulations. Developing human-machine interfaces (HMIs) that better support operators is a high priority for ensuring the safe and reliable operation of critical processes. Human factors engineering (HFE) provides a rich and mature set of tools for evaluating the performance of HMIs; however, the set of tools for developing and designing HMIs is still in its infancy. Here we propose a rapid prototyping approach for integrating proposed HMIs into their native environments before a design is finalized. This approach allows researchers and developers to test design ideas and eliminate design flaws prior to fully developing the new system. We illustrate this approach with four prototype designs developed using Microsoft’s Windows Presentation Foundation (WPF). One example is integrated into a microworld environment to test the functionality of the design and identify the optimal level of automation for a new system in a nuclear power plant. The other three examples are integrated into a full-scale, glasstop digital simulator of a nuclear power plant. One example demonstrates the capabilities of next generation control concepts; another aims to expand the current state of the art; lastly, an HMI prototype was developed as a test platform for a new control system currently in development at U.S. nuclear power plants. WPF possesses several characteristics that make it well suited to HMI design. It provides a tremendous amount of flexibility, agility, robustness, and extensibility. Distributed control system (DCS) specific environments tend to focus on the safety and reliability requirements for real-world interfaces and consequently have less emphasis on providing functionality to support novel interaction paradigms. Because of WPF’s large user-base, Microsoft can provide an extremely mature tool.
Within process control applications, WPF is platform independent and can communicate with popular full-scope process control simulator vendor plant models and DCS platforms.
Space Weather in the Machine Learning Era: A Multidisciplinary Approach
NASA Astrophysics Data System (ADS)
Camporeale, E.; Wing, S.; Johnson, J.; Jackman, C. M.; McGranaghan, R.
2018-01-01
The workshop entitled Space Weather: A Multidisciplinary Approach took place at the Lorentz Center, University of Leiden, Netherlands, on 25-29 September 2017. The aim of this workshop was to bring together members of the Space Weather, Mathematics, Statistics, and Computer Science communities to address the use of advanced techniques such as Machine Learning, Information Theory, and Deep Learning, to better understand the Sun-Earth system and to improve space weather forecasting. Although individual efforts have been made toward this goal, the community consensus is that establishing interdisciplinary collaborations is the most promising strategy for fully utilizing the potential of these advanced techniques in solving Space Weather-related problems.
Feasibility Study of Jupiter Icy Moons Orbiter Permanent Magnet Alternator Start Sequence
NASA Technical Reports Server (NTRS)
Kenny, Barbara H.; Tokars, Roger P.
2006-01-01
The Jupiter Icy Moons Orbiter (JIMO) mission was a proposed (and recently cancelled) long-duration science mission to study three moons of Jupiter: Callisto, Ganymede, and Europa. One design of the JIMO spacecraft used a nuclear heat source in conjunction with a Brayton rotating machine to generate electrical power for the electric thrusters and the spacecraft bus. The basic operation of the closed cycle Brayton system was as follows. The working fluid, a helium-xenon gas mixture, first entered a compressor, then went through a recuperator and hot-side heat exchanger, then expanded across a turbine that drove an alternator, then entered the cold-side of the recuperator and heat exchanger and finally returned to the compressor. The spacecraft was to be launched with the Brayton system off-line and the nuclear reactor shut down. Once the system was started, the helium-xenon gas would be circulated into the heat exchangers as the nuclear reactors were activated. Initially, the alternator unit would operate as a motor so as to drive the turbine and compressor to get the cycle started. This report investigated the feasibility of the start-up sequence of a permanent magnet (PM) machine, similar in operation to the alternator unit, without any position or speed feedback sensors ("sensorless") and with a variable load torque. It was found that the permanent magnet machine can start with sensorless control and a load torque of up to 30 percent of the rated value.
NASA Human Health and Performance Center: Open Innovation Successes and Collaborative Projects
NASA Technical Reports Server (NTRS)
Davis, Jeffrey R.; Richard, Elizabeth E.
2014-01-01
In May 2007, what was then the Space Life Sciences Directorate published the 2007 Space Life Sciences Strategy for Human Space Exploration, which resulted in the development and implementation of new business models and significant advances in external collaboration over the next five years. The strategy was updated on the basis of these accomplishments and reissued as the NASA Human Health and Performance Strategy in 2012, and continues to drive new approaches to innovation for the directorate. This short paper describes the open innovation successes and collaborative projects developed over this timeframe, including the efforts of the NASA Human Health and Performance Center (NHHPC), which was established to advance human health and performance innovations for spaceflight and societal benefit via collaboration in new markets.
Human semi-supervised learning.
Gibson, Bryan R; Rogers, Timothy T; Zhu, Xiaojin
2013-01-01
Most empirical work in human categorization has studied learning in either fully supervised or fully unsupervised scenarios. Most real-world learning scenarios, however, are semi-supervised: Learners receive a great deal of unlabeled information from the world, coupled with occasional experiences in which items are directly labeled by a knowledgeable source. A large body of work in machine learning has investigated how learning can exploit both labeled and unlabeled data provided to a learner. Using equivalences between models found in human categorization and machine learning research, we explain how these semi-supervised techniques can be applied to human learning. A series of experiments are described which show that semi-supervised learning models prove useful for explaining human behavior when exposed to both labeled and unlabeled data. We then discuss some machine learning models that do not have familiar human categorization counterparts. Finally, we discuss some challenges yet to be addressed in the use of semi-supervised models for modeling human categorization. Copyright © 2013 Cognitive Science Society, Inc.
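Self-training is one classic semi-supervised scheme of the kind this line of work draws on: a supervised learner pseudo-labels the unlabeled items it is confident about and refits on the enlarged set. The 1-D class-mean base learner, the margin threshold, and the data below are toy assumptions for illustration, not the models from the paper.

```python
# Self-training on labeled plus unlabeled data: a minimal 1-D sketch.

def fit_means(pairs):
    """Class means of (x, label) pairs -- the 'supervised' base learner."""
    by = {}
    for x, y in pairs:
        by.setdefault(y, []).append(x)
    return {y: sum(v) / len(v) for y, v in by.items()}

def self_train(labeled, unlabeled, tau=0.3, rounds=5):
    """Repeatedly pseudo-label confident unlabeled points and refit."""
    labeled, unlabeled = list(labeled), list(unlabeled)
    for _ in range(rounds):
        means = fit_means(labeled)
        confident = []
        for x in unlabeled:
            d = sorted((abs(x - m), y) for y, m in means.items())
            if d[1][0] - d[0][0] >= tau:        # large margin = confident
                confident.append((x, d[0][1]))  # adopt the predicted label
        if not confident:
            break
        taken = {c[0] for c in confident}
        unlabeled = [x for x in unlabeled if x not in taken]
        labeled += confident
    return fit_means(labeled)

# Two labeled anchors plus unlabeled points; 0.5 stays unlabeled (ambiguous).
means = self_train(labeled=[(0.0, 'a'), (1.0, 'b')],
                   unlabeled=[0.1, 0.2, 0.85, 0.9, 0.5])
```

The unlabeled points pull each class mean toward the underlying cluster, which is exactly how unlabeled data sharpens category boundaries in these models; scikit-learn's `SelfTrainingClassifier` implements the same loop for real estimators.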
Synchronization, TIGoRS, and Information Flow in Complex Systems: Dispositional Cellular Automata.
Sulis, William H
2016-04-01
Synchronization has a long history in physics where it refers to the phase matching of two identical oscillators. This notion has been extensively studied in physics as well as in biology, where it has been applied to such widely varying phenomena as the flashing of fireflies and firing of neurons in the brain. Human behavior, however, may be recurrent but it is not oscillatory even though many physiological systems do exhibit oscillatory tendencies. Moreover, much of human behaviour is collaborative and cooperative, where the individual behaviours may be distinct yet contemporaneous (if not simultaneous) and taken collectively express some functionality. In the context of behaviour, the important aspect is the repeated co-occurrence in time of behaviours that facilitate the propagation of information or of functionality, regardless of whether or not these behaviours are similar or identical. An example of this weaker notion of synchronization is transient induced global response synchronization (TIGoRS). Previous work has shown that TIGoRS is a ubiquitous phenomenon among complex systems, enabling them to stably parse environmental transients into salient units to which they stably respond. This leads to the notion of Sulis machines, which emergently generate a primitive linguistic structure through their dynamics. This article reviews the notion of TIGoRS and its expression in several complex systems models including tempered neural networks, driven cellular automata and cocktail party automata. The emergent linguistics of Sulis machines are discussed. A new class of complex systems model, the dispositional cellular automaton is introduced. A new metric for TIGoRS, the excess synchronization, is introduced and applied to the study of TIGoRS in dispositional cellular automata. It is shown that these automata exhibit a nonlinear synchronization response to certain perturbing transients.
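The dispositional cellular automaton and the excess-synchronization metric are defined in the article itself; as background, here is a minimal sketch of the standard elementary cellular-automaton update that driven-CA models of this kind build on. The rule number and initial state are arbitrary illustrative choices.

```python
# One synchronous update of an elementary cellular automaton
# (Wolfram rule numbering, periodic boundary conditions).

def ca_step(cells, rule):
    """Map each 3-cell neighborhood through the rule's lookup bits."""
    n = len(cells)
    out = []
    for i in range(n):
        neighborhood = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        out.append((rule >> neighborhood) & 1)
    return out

def run(cells, rule, steps):
    """Iterate the automaton for a number of steps."""
    for _ in range(steps):
        cells = ca_step(cells, rule)
    return cells

state = [0, 0, 0, 1, 0, 0, 0]
print(run(state, rule=90, steps=1))  # → [0, 0, 1, 0, 1, 0, 0]
```

A driven automaton in the article's sense perturbs such a lattice with an external transient and asks how stably the collective response recurs across repetitions.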
Mickelson, Grace; Suter, Esther; Deutschlander, Siegrid; Bainbridge, Lesley; Harrison, Liz; Grymonpre, Ruby; Hepp, Shelanne
2012-01-01
The current gap in research on inter-professional collaboration and health human resources outcomes is explored by the Western Canadian Interprofessional Health Collaborative (WCIHC). In a recent research planning workshop with the four western provinces, 82 stakeholders from various sectors including health, provincial governments, research and education engaged with WCIHC to consider aligning their respective research agendas relevant to inter-professional collaboration and health human resources. Key research recommendations from a recent knowledge synthesis on inter-professional collaboration and health human resources as well as current provincial health priorities framed the discussions at the workshop. This knowledge exchange has helped to consolidate a shared current understanding of inter-professional education and practice and health workforce planning and management among the participating stakeholders. Ultimately, through a focused research program, a well-aligned approach between sectors to finding health human resources solutions will result in sustainable health systems reform. Copyright © 2013 Longwoods Publishing.
NASA Technical Reports Server (NTRS)
Miller, R. H.; Minsky, M. L.; Smith, D. B. S.
1982-01-01
This study examines potential applications of automation, robotics, and machine intelligence systems (ARAMIS) to space activities, and to their related ground support functions, in the years 1985-2000, so that NASA may make informed decisions on which aspects of ARAMIS to develop. The study first identifies the specific tasks which will be required by future space projects. It then defines ARAMIS options which are candidates for those space project tasks, and evaluates the relative merits of these options. Finally, the study identifies promising applications of ARAMIS, and recommends specific areas for further research. The ARAMIS options defined and researched by the study group span the range from fully human to fully machine, including a number of intermediate options (e.g., humans assisted by computers, and various levels of teleoperation). By including this spectrum, the study searches for the optimum mix of humans and machines for space project tasks.
Formal verification of human-automation interaction
NASA Technical Reports Server (NTRS)
Degani, Asaf; Heymann, Michael
2002-01-01
This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.
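The core question of this methodology, whether the interface exposes enough of the machine's state for the operator to predict its response, can be illustrated as a determinism check on the display-abstracted machine: if two internal states share a display label but react differently to the same action, the operator cannot disambiguate them. The toy autopilot modes and display mapping below are invented for illustration, not the paper's case study.

```python
# Check whether an interface abstraction of a machine remains deterministic.

def interface_adequate(transitions, display):
    """transitions: (state, action) -> next state; display: state -> label.
    Returns a list of ambiguities: same displayed label and action, but
    different displayed outcomes (a potential mode-confusion error)."""
    seen, problems = {}, []
    for (s, a), t in transitions.items():
        key = (display[s], a)
        out = display[t]
        if key in seen and seen[key] != out:
            problems.append((key, seen[key], out))
        seen.setdefault(key, out)
    return problems

# Two internal modes shown identically as 'HOLD'; the same operator action
# then yields different displayed outcomes -> the interface is inadequate.
transitions = {('hold_alt', 'dial'): 'climb', ('hold_speed', 'dial'): 'hold_speed'}
display = {'hold_alt': 'HOLD', 'hold_speed': 'HOLD', 'climb': 'CLIMB'}
print(interface_adequate(transitions, display))  # → [(('HOLD', 'dial'), 'CLIMB', 'HOLD')]
```

Real verification of this kind runs over full machine and interface models, but the flagged condition is the same: states the display merges must behave identically from the operator's point of view.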
Computational nuclear quantum many-body problem: The UNEDF project
NASA Astrophysics Data System (ADS)
Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.
2013-10-01
The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.
NASA Astrophysics Data System (ADS)
Moyle, Steve
Collaborative Data Mining is a setting where the Data Mining effort is distributed to multiple collaborating agents - human or software. The objective of the collaborative Data Mining effort is to produce solutions to the tackled Data Mining problem that are, by some metric, better than those that would have been achieved by individual, non-collaborating agents. The solutions require evaluation, comparison, and approaches for combination. Collaboration requires communication, and implies some form of community. The human form of collaboration is a social task. Organizing communities in an effective manner is non-trivial and often requires well-defined roles and processes. Data Mining, too, benefits from a standard process. This chapter explores the standard Data Mining process CRISP-DM utilized in a collaborative setting.
NASA Astrophysics Data System (ADS)
Prasanna, J.; Rajamanickam, S.; Amith Kumar, O.; Karthick Raj, G.; Sathya Narayanan, P. V. V.
2017-05-01
In this paper, Ti-6Al-4V is used as the workpiece material; it is widely found in fields including the medical, chemical, marine, automotive, aerospace, aviation, and electronics industries, nuclear reactors, and consumer products. Conventional machining of Ti-6Al-4V is very difficult due to its distinctive properties, which makes Electrical Discharge Machining (EDM) the right choice for machining this material. A tungsten-copper composite is employed as the tool material. Gap voltage, peak current, pulse on-time, and duty factor are considered as the machining parameters for analyzing the machining characteristics Material Removal Rate (MRR) and Tool Wear Rate (TWR). The Taguchi method is used to find the significant parameters of EDM. It is found that for MRR the significant parameters rank in the following order: Gap Voltage, Pulse On-Time, Peak Current, and Duty Factor. For TWR, on the other hand, the significant parameters rank in the order Gap Voltage, Duty Factor, Peak Current, and Pulse On-Time.
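Taguchi analysis of this kind ranks factors by the range (delta) of mean signal-to-noise ratio across their levels, using a larger-is-better S/N for MRR and a smaller-is-better S/N for TWR. The runs, levels, and responses below are hypothetical illustrations, not the paper's measured data.

```python
# Taguchi signal-to-noise (S/N) ratios and factor ranking: a minimal sketch.
import math

def sn_larger_is_better(ys):
    """S/N in dB for responses where larger is better (e.g. MRR)."""
    return -10 * math.log10(sum(1 / y**2 for y in ys) / len(ys))

def sn_smaller_is_better(ys):
    """S/N in dB for responses where smaller is better (e.g. TWR)."""
    return -10 * math.log10(sum(y**2 for y in ys) / len(ys))

def factor_delta(levels, sns):
    """Range of mean S/N across one factor's levels; a larger delta means
    the factor ranks as more significant in the Taguchi analysis."""
    means = {}
    for lv, sn in zip(levels, sns):
        means.setdefault(lv, []).append(sn)
    avg = {lv: sum(v) / len(v) for lv, v in means.items()}
    return max(avg.values()) - min(avg.values())

# Four hypothetical runs: the gap-voltage level of each run and its MRR S/N
# computed from two repeated MRR measurements per run.
gap_levels = [1, 1, 2, 2]
sns = [sn_larger_is_better(ys)
       for ys in ([2.0, 2.1], [1.9, 2.0], [3.0, 3.1], [2.9, 3.0])]
print(round(factor_delta(gap_levels, sns), 2))
```

Repeating `factor_delta` for each factor (gap voltage, peak current, pulse on-time, duty factor) and sorting by delta reproduces the kind of significance ordering reported in the abstract.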
Proceedings for the Advance Planning Briefing for Industry
1990-01-24
... Liaison Office; TOD - Technical Objective Documents; TSR - Tactical Source Region; UAV - Unmanned Aerial Vehicle; UGT - Underground Nuclear Test. ... tests in AURORA and underground nuclear tests (UGT) and will help develop tactical source region hardening requirements and lead to approaches for TSR ... X-ray theory, lasers, electronic controllers, computers, robotics, etc. Contracting for scientific studies and one-of-a-kind machines will emphasize ...
Sochat, Vanessa V
2015-01-01
Targeted collaboration is becoming more challenging with the ever-increasing number of publications, conferences, and academic responsibilities that the modern-day researcher must synthesize. Specifically, the field of neuroimaging had roughly 10,000 new papers in PubMed for the year 2013, presenting tens of thousands of international authors, each a potential collaborator working on some sub-domain in the field. To remove the burden of synthesizing an entire corpus of publications, talks, and conference interactions to find and assess collaborations, we combine meta-analytical neuroimaging informatics methods with machine learning and network analysis toward this goal. We present "AuthorSynth," a novel application prototype that includes (1) a collaboration network to identify researchers with similar results reported in the literature; and (2) a 2D plot-"brain lattice"-to visually summarize a single author's contribution to the field, and allow for searching of authors based on behavioral terms. This method capitalizes on intelligent synthesis of the neuroimaging literature, and demonstrates that data-driven approaches can be used to confirm existing collaborations, reveal potential ones, and identify gaps in published knowledge. We believe this tool exemplifies how methods from neuroimaging informatics can better inform researchers about progress and knowledge in the field, and enhance the modern workflow of finding collaborations.
Progress report on nuclear spectroscopic studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bingham, C.R.; Guidry, M.W.; Riedinger, L.L.
1994-02-18
The Nuclear Physics group at the University of Tennessee, Knoxville (UTK) is involved in several aspects of heavy-ion physics including both nuclear structure and reaction mechanisms. While the main emphasis is on experimental problems, the authors have maintained a strong collaboration with several theorists in order to best pursue the physics of their measurements. During the last year they have had several experiments at the ATLAS at Argonne National Laboratory, the GAMMASPHERE at the LBL 88 Cyclotron, and with the NORDBALL at the Niels Bohr Institute Tandem. Also, they continue to be very active in the WA93/98 collaboration studying ultra-relativistic heavy-ion physics utilizing the SPS accelerator at CERN in Geneva, Switzerland and in the PHENIX Collaboration at the RHIC accelerator under construction at Brookhaven National Laboratory. During the last year their experimental work has been in three broad areas: (1) the structure of nuclei at high angular momentum, (2) the structure of nuclei far from stability, and (3) ultra-relativistic heavy-ion physics. The results of studies in these particular areas are described in this document. These studies concentrate on the structure of nuclear matter in extreme conditions of rotational motion, imbalance of neutrons and protons, or very high temperature and density. Another area of research is heavy-ion-induced transfer reactions, which utilize the transfer of nucleons to states with high angular momentum to learn about their structure and to understand the transfer of particles, energy, and angular momentum in collisions between heavy ions.
Medlin, John B.
1976-05-25
A charging machine for loading fuel slugs into the process tubes of a nuclear reactor includes a tubular housing connected to the process tube, a charging trough connected to the other end of the tubular housing, a device for loading the charging trough with a group of fuel slugs, means for equalizing the coolant pressure in the charging trough with the pressure in the process tubes, means for pushing the group of fuel slugs into the process tube and a latch and a seal engaging the last object in the group of fuel slugs to prevent the fuel slugs from being ejected from the process tube when the pusher is removed and to prevent pressure liquid from entering the charging machine.
National Synchrotron Light Source annual report 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hulbert, S.L.; Lazarz, N.M.
1992-04-01
This report discusses the following research conducted at NSLS: atomic and molecular science; energy dispersive diffraction; lithography, microscopy and tomography; nuclear physics; UV photoemission and surface science; x-ray absorption spectroscopy; x-ray scattering and crystallography; x-ray topography; workshop on surface structure; workshop on electronic and chemical phenomena at surfaces; workshop on imaging; UV FEL machine reviews; VUV machine operations; VUV beamline operations; VUV storage ring parameters; x-ray machine operations; x-ray beamline operations; x-ray storage ring parameters; superconducting x-ray lithography source; SXLS storage ring parameters; the accelerator test facility; proposed UV-FEL user facility at the NSLS; global orbit feedback systems; and NSLS computer system.
Chi, Michelene T H; Roy, Marguerite; Hausmann, Robert G M
2008-03-01
The goals of this study are to evaluate a relatively novel learning environment, as well as to seek greater understanding of why human tutoring is so effective. This alternative learning environment consists of pairs of students collaboratively observing a videotape of another student being tutored. Comparing this collaborative-observation environment to four other instructional methods (one-on-one human tutoring, observing tutoring individually, collaborating without observing, and studying alone), the results showed that students learned to solve physics problems just as effectively from observing tutoring collaboratively as the tutees who were being tutored individually. We explain the effectiveness of this learning environment by postulating that such a situation encourages learners to become active and constructive observers through interactions with a peer. In essence, collaboratively observing combines the benefit of tutoring with the benefit of collaborating. The learning outcomes of the tutees and the collaborative observers, along with the tutoring dialogues, were used to further evaluate three hypotheses explaining why human tutoring is an effective learning method. Detailed analyses of the protocols at several grain sizes suggest that tutoring is effective when tutees are independently or jointly constructing knowledge with the tutor, but not when the tutor independently conveys knowledge. 2008 Cognitive Science Society, Inc.
Goodyear, Kimberly; Parasuraman, Raja; Chernyak, Sergey; de Visser, Ewart; Madhavan, Poornima; Deshpande, Gopikrishna; Krueger, Frank
2017-10-01
As society becomes more reliant on machines and automation, understanding how people utilize advice is a necessary endeavor. Our objective was to reveal the underlying neural associations during advice utilization from expert human and machine agents with fMRI and multivariate Granger causality analysis. During an X-ray luggage-screening task, participants accepted or rejected good or bad advice from either the human or machine agent framed as experts with manipulated reliability (high miss rate). We showed that the machine-agent group decreased their advice utilization compared to the human-agent group and these differences in behaviors during advice utilization could be accounted for by high expectations of reliable advice and changes in attention allocation due to miss errors. Brain areas involved with the salience and mentalizing networks, as well as sensory processing involved with attention, were recruited during the task and the advice utilization network consisted of attentional modulation of sensory information with the lingual gyrus as the driver during the decision phase and the fusiform gyrus as the driver during the feedback phase. Our findings expand on the existing literature by showing that misses degrade advice utilization, which is represented in a neural network involving salience detection and self-processing with perceptual integration.
Zhang, Jianhua; Yin, Zhong; Wang, Rubin
2017-01-01
This paper developed a cognitive task-load (CTL) classification algorithm and allocation strategy to sustain the optimal operator CTL levels over time in safety-critical human-machine integrated systems. An adaptive human-machine system is designed based on a non-linear dynamic CTL classifier, which maps a set of electroencephalogram (EEG) and electrocardiogram (ECG) related features to a few CTL classes. The least-squares support vector machine (LSSVM) is used as dynamic pattern classifier. A series of electrophysiological and performance data acquisition experiments were performed on seven volunteer participants under a simulated process control task environment. The participant-specific dynamic LSSVM model is constructed to classify the instantaneous CTL into five classes at each time instant. The initial feature set, comprising 56 EEG and ECG related features, is reduced to a set of 12 salient features (including 11 EEG-related features) by using the locality preserving projection (LPP) technique. An overall correct classification rate of about 80% is achieved for the 5-class CTL classification problem. Then the predicted CTL is used to adaptively allocate the number of process control tasks between operator and computer-based controller. Simulation results showed that the overall performance of the human-machine system can be improved by using the adaptive automation strategy proposed.
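The adaptive automation step described above can be sketched as a rule that shifts tasks between the operator and the computer-based controller according to the predicted CTL class. The five-class labels (0 lowest load, 4 highest) and the one-task-per-instant reallocation rule below are illustrative assumptions, not the paper's exact strategy.

```python
# Adaptive task allocation driven by a predicted cognitive task-load class.

def allocate(ctl_class, total_tasks, operator_tasks):
    """Shift one task toward the computer when load is high, back when low."""
    if ctl_class >= 3 and operator_tasks > 0:
        operator_tasks -= 1          # offload under high CTL
    elif ctl_class <= 1 and operator_tasks < total_tasks:
        operator_tasks += 1          # re-engage the operator under low CTL
    return operator_tasks, total_tasks - operator_tasks

# Simulated stream of per-instant CTL predictions (as the LSSVM would emit).
ops = 3
for ctl in [4, 4, 2, 0, 3]:
    ops, machine = allocate(ctl, total_tasks=6, operator_tasks=ops)
print(ops, machine)  # → 1 5
```

Keeping the operator's share near a target load, rather than fixed, is the mechanism by which the simulated human-machine system's overall performance improves.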
Collaborative Thinking: The Challenge of the Modern University
ERIC Educational Resources Information Center
Corrigan, Kevin
2012-01-01
More collaborative work in the humanities could be instrumental in helping to break down the traditional rigid boundaries between academic divisions and disciplines in modern universities. The value of the traditional model of the solitary humanities scholar or the collaborative science paradigm should not be discounted. However, increasing the…
Artificial Intelligence/Robotics Applications to Navy Aircraft Maintenance.
1984-06-01
... other automatic machinery such as presses, molding machines, and numerically-controlled machine tools, just as people do. ... Robotics Technologies; Relevant AI Technologies: Expert Systems, Automatic Planning, Natural Language, Machine Vision ... building machines that imitate human behavior. Artificial intelligence is concerned with the functions of the brain, whereas robotics includes, in ...
Monitoring osseointegration and developing intelligent systems (Conference Presentation)
NASA Astrophysics Data System (ADS)
Salvino, Liming W.
2017-05-01
Effective monitoring of structural and biological systems is an extremely important research area that enables technology development for future intelligent devices, platforms, and systems. This presentation provides an overview of research efforts funded by the Office of Naval Research (ONR) to establish structural health monitoring (SHM) methodologies in the human domain. Basic science efforts are needed to utilize SHM sensing, data analysis, modeling, and algorithms to obtain the relevant physiological and biological information for human-specific health and performance conditions. This overview of current research efforts is based on the Monitoring Osseointegrated Prosthesis (MOIP) program. MOIP develops implantable and intelligent prosthetics that are directly anchored to the bone of residual limbs. Through real-time monitoring, sensing, and responding to osseointegration of bones and implants as well as interface conditions and environment, our research program aims to obtain individualized actionable information for implant failure identification, load estimation, infection mitigation and treatment, as well as healing assessment. Looking ahead to achieve ultimate goals of SHM, we seek to expand our research areas to cover monitoring human, biological and engineered systems, as well as human-machine interfaces. Examples of such include 1) brainwave monitoring and neurological control, 2) detecting and evaluating brain injuries, 3) monitoring and maximizing human-technological object teaming, and 4) closed-loop setups in which actions can be triggered automatically based on sensors, actuators, and data signatures. Finally, some ongoing and future collaborations across different disciplines for the development of knowledge automation and intelligent systems will be discussed.
NASA Astrophysics Data System (ADS)
Kalayeh, Mahdi M.; Marin, Thibault; Pretorius, P. Hendrik; Wernick, Miles N.; Yang, Yongyi; Brankov, Jovan G.
2011-03-01
In this paper, we present a numerical observer for image quality assessment, aiming to predict human observer accuracy in a cardiac perfusion defect detection task for single-photon emission computed tomography (SPECT). In medical imaging, image quality should be assessed by evaluating the human observer accuracy for a specific diagnostic task. This approach is known as task-based assessment. Such evaluations are important for optimizing and testing imaging devices and algorithms. Unfortunately, human observer studies with expert readers are costly and time-demanding. To address this problem, numerical observers have been developed as a surrogate for human readers to predict human diagnostic performance. The channelized Hotelling observer (CHO) with internal noise model has been found to predict human performance well in some situations, but does not always generalize well to unseen data. We have argued in the past that finding a model to predict human observers could be viewed as a machine learning problem. Following this approach, in this paper we propose a channelized relevance vector machine (CRVM) to predict human diagnostic scores in a detection task. We have previously used channelized support vector machines (CSVM) to predict human scores and have shown that this approach offers better and more robust predictions than the classical CHO method. The comparison of the proposed CRVM with our previously introduced CSVM method suggests that CRVM can achieve similar generalization accuracy, while dramatically reducing model complexity and computation time.
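For context, here is a minimal channelized linear observer of the kind the CSVM and CRVM approaches build on: images are projected onto a few channel templates, and a Hotelling template is fit to the resulting channel features to produce a scalar test statistic. The two box channels and all feature data below are toy assumptions, not the SPECT study's channels or images.

```python
# Channelized linear (Hotelling-style) observer: a 2-channel sketch.

def channelize(image, channels):
    """Project a flat image onto each channel template."""
    return [sum(c * p for c, p in zip(ch, image)) for ch in channels]

def hotelling_template(feats_a, feats_b):
    """2-channel Hotelling template w = S^-1 (mean_a - mean_b),
    with S the pooled covariance of the channel features."""
    n = len(feats_a) + len(feats_b)
    ma = [sum(f[i] for f in feats_a) / len(feats_a) for i in (0, 1)]
    mb = [sum(f[i] for f in feats_b) / len(feats_b) for i in (0, 1)]
    s = [[0.0, 0.0], [0.0, 0.0]]
    for feats, m in ((feats_a, ma), (feats_b, mb)):
        for f in feats:
            d = [f[0] - m[0], f[1] - m[1]]
            for i in (0, 1):
                for j in (0, 1):
                    s[i][j] += d[i] * d[j] / (n - 2)
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    dm = [ma[0] - mb[0], ma[1] - mb[1]]
    return [(s[1][1] * dm[0] - s[0][1] * dm[1]) / det,
            (-s[1][0] * dm[0] + s[0][0] * dm[1]) / det]

def score(w, feats):
    """Scalar test statistic: threshold this to decide defect present/absent."""
    return w[0] * feats[0] + w[1] * feats[1]

channels = [[1, 1, 0, 0], [0, 0, 1, 1]]           # toy box channels
feats_sig = [(2.0, 0.1), (2.2, -0.1), (1.8, 0.0)]  # signal-present features
feats_bkg = [(0.0, 0.1), (0.2, 0.0), (-0.2, -0.1)] # signal-absent features
w = hotelling_template(feats_sig, feats_bkg)
```

The learning-based observers in the paper replace the fixed template `w` with a trained regressor so the statistic tracks human scores rather than the ideal linear discriminant.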
DOE Office of Scientific and Technical Information (OSTI.GOV)
Satogata, Todd
2013-04-22
The integrated control system (ICS) is responsible for the whole ESS machine and facility: accelerator, target, neutron scattering instruments and conventional facilities. This unified approach keeps the costs of development, maintenance and support relatively low. ESS has selected a standardised, field-proven controls framework, the Experimental Physics and Industrial Control System (EPICS), which was originally developed jointly by Argonne and Los Alamos National Laboratories. Complementing this selection are best practices and experience from similar facilities regarding platform standardisation, control system development and device integration and commissioning. The components of ICS include the control system core, the control boxes, the BLED database management system, and the human machine interface. The control system core is a set of systems and tools that make it possible for the control system to provide required data, information and services to engineers, operators, physicists and the facility itself. The core components are the timing system that makes possible clock synchronisation across the facility, the machine protection system (MPS) and the personnel protection system (PPS) that prevent damage to the machine and personnel, and a set of control system services. Control boxes are servers that control a collection of equipment (for example a radio frequency cavity). The integrated control system will include many control boxes that can be assigned to one supplier, such as an internal team, a collaborating institute or a commercial vendor. This approach facilitates a clear division of responsibilities and makes integration much easier. A control box is composed of a standardised hardware platform, components, development tools and services. On the top level, it interfaces with the core control system components (timing, MPS, PPS) and with the human-machine interface.
At the bottom, it interfaces with the equipment and parts of the facility through a set of analog and digital signals, real-time control loops and other communication buses. The ICS central data management system is named BLED (beam line element databases). BLED is a set of databases, tools and services that is used to store, manage and access data. It holds vital control system configuration and physics-related (lattice) information about the accelerator, target and instruments. It facilitates control system configuration by bringing together direct input-output controller (IOC) configuration and real-time data from proton and neutron beam line models. BLED also simplifies development and speeds up the code-test-debug cycle. The set of tools that access BLED will be tailored to the needs of different categories of users, such as ESS staff physicists, engineers, and operators; external partner laboratories; and visiting experimental instrument users. The human-machine interface is vital to providing a high-quality experience to ICS users. It encompasses a wide array of devices and software tools, from control room screens to engineer terminal windows; from beam physics data tools to post-mortem data analysis tools. It serves users with a wide range of skills from widely varied backgrounds. The Controls Group is developing a set of user profiles to accommodate this diverse range of use-cases and users.
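The control-box pattern (equipment exposing named process variables that core systems and the human-machine interface subscribe to) can be illustrated with a toy in-memory sketch. The class, PV names, and interlock threshold below are invented for illustration; this is not EPICS or the actual ESS ICS code:

```python
# Illustrative only: a toy "process variable" registry mimicking the pattern
# of an EPICS-style control box. Real systems use Channel Access / pvAccess
# over the network; here everything is in-process.
class ControlBox:
    def __init__(self, name):
        self.name = name
        self.pvs = {}           # process variable name -> current value
        self.watchers = {}      # pv name -> callbacks (HMI, archiver, MPS...)

    def put(self, pv, value):
        self.pvs[pv] = value
        for cb in self.watchers.get(pv, []):
            cb(pv, value)       # notify subscribers, as a monitor would

    def get(self, pv):
        return self.pvs[pv]

    def monitor(self, pv, callback):
        self.watchers.setdefault(pv, []).append(callback)

rf = ControlBox("RFQ-010")      # hypothetical cavity control box
alarms = []
# A machine-protection-style watcher with an assumed 40 °C threshold.
rf.monitor("CAV:TEMP", lambda pv, v: alarms.append(pv) if v > 40.0 else None)
rf.put("CAV:TEMP", 35.2)
rf.put("CAV:TEMP", 41.7)        # crosses the (assumed) interlock threshold
print(rf.get("CAV:TEMP"), alarms)
```

The publish/subscribe shape is the point: the HMI, the archiver, and the protection systems all attach to the same named variables without the equipment code knowing about any of them.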
[A new machinability test machine and the machinability of composite resins for core built-up].
Iwasaki, N
2001-06-01
A new machinability test machine especially for dental materials was contrived. The purpose of this study was to evaluate the effects of grinding conditions on machinability of core built-up resins using this machine, and to confirm the relationship between machinability and other properties of composite resins. The experimental machinability test machine consisted of a dental air-turbine handpiece, a control weight unit, a driving unit of the stage fixing the test specimen, and so on. The machinability was evaluated as the change in volume after grinding using a diamond point. Five kinds of core built-up resins and human teeth were used in this study. The machinabilities of these composite resins increased with an increasing load during grinding, and decreased with repeated grinding. There was no obvious correlation between the machinability and Vickers' hardness; however, a negative correlation was observed between machinability and scratch width.
Kant, Vivek
2017-03-01
Jens Rasmussen's contribution to the field of human factors and ergonomics has had a lasting impact. Six prominent interrelated themes can be extracted from his research between 1961 and 1986. These themes form the basis of an engineering epistemology which is best manifested by his abstraction hierarchy. Further, Rasmussen reformulated technical reliability using systems language to enable a proper human-machine fit. To understand the concept of human-machine fit, he included the operator as a central component in the system to enhance system safety. This change resulted in the application of a qualitative and categorical approach for human-machine interaction design. Finally, Rasmussen's insistence on a working philosophy of systems design as being a joint responsibility of operators and designers provided the basis for averting errors and ensuring safe and correct system functioning. Copyright © 2016 Elsevier Ltd. All rights reserved.
Human factors model concerning the man-machine interface of mining crewstations
NASA Technical Reports Server (NTRS)
Rider, James P.; Unger, Richard L.
1989-01-01
The U.S. Bureau of Mines is developing a computer model to analyze the human factors aspect of mining machine operator compartments. The model will be used as a research tool and as a design aid. It will have the capability to perform the following: simulated anthropometric or reach assessment, visibility analysis, illumination analysis, structural analysis of the protective canopy, operator fatigue analysis, and computation of an ingress-egress rating. The model will make extensive use of graphics to simplify data input and output. Two-dimensional orthographic projections of the machine and its operator compartment are digitized, and the data rebuilt into a three-dimensional representation of the mining machine. Anthropometric data from either an individual or any size population may be used. The model is intended for use by equipment manufacturers and mining companies during initial design work on new machines. In addition to its use in machine design, the model should prove helpful as an accident investigation tool and for determining the effects of machine modifications made in the field on the critical areas of visibility and control reachability.
Living systematic reviews: 2. Combining human and machine effort.
Thomas, James; Noel-Storr, Anna; Marshall, Iain; Wallace, Byron; McDonald, Steven; Mavergames, Chris; Glasziou, Paul; Shemilt, Ian; Synnot, Anneliese; Turner, Tari; Elliott, Julian
2017-11-01
New approaches to evidence synthesis, which use human effort and machine automation in mutually reinforcing ways, can enhance the feasibility and sustainability of living systematic reviews. Human effort is a scarce and valuable resource, required when automation is impossible or undesirable, and includes contributions from online communities ("crowds") as well as more conventional contributions from review authors and information specialists. Automation can assist with some systematic review tasks, including searching, eligibility assessment, identification and retrieval of full-text reports, extraction of data, and risk of bias assessment. Workflows can be developed in which human effort and machine automation can each enable the other to operate in more effective and efficient ways, offering substantial enhancement to the productivity of systematic reviews. This paper describes and discusses the potential-and limitations-of new ways of undertaking specific tasks in living systematic reviews, identifying areas where these human/machine "technologies" are already in use, and where further research and development is needed. While the context is living systematic reviews, many of these enabling technologies apply equally to standard approaches to systematic reviewing. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Human-machine interface for a VR-based medical imaging environment
NASA Astrophysics Data System (ADS)
Krapichler, Christian; Haubner, Michael; Loesch, Andreas; Lang, Manfred K.; Englmeier, Karl-Hans
1997-05-01
Modern 3D scanning techniques like magnetic resonance imaging (MRI) or computed tomography (CT) produce high-quality images of the human anatomy. Virtual environments open new ways to display and to analyze those tomograms. Compared with today's inspection of 2D image sequences, physicians are empowered to recognize spatial coherencies and examine pathological regions more easily, and diagnosis and therapy planning can be accelerated. For that purpose a powerful human-machine interface is required, which offers a variety of tools and features to enable both exploration and manipulation of the 3D data. Man-machine communication has to be intuitive and efficacious to avoid long accustoming times and to enhance familiarity with and acceptance of the interface. Hence, interaction capabilities in virtual worlds should be comparable to those in the real world to allow utilization of our natural experiences. In this paper the integration of hand gestures and visual focus, two important aspects in modern human-computer interaction, into a medical imaging environment is shown. With the presented human-machine interface, including virtual reality displaying and interaction techniques, radiologists can be supported in their work. Further, virtual environments can even alleviate communication between specialists from different fields, or in educational and training applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Germain, Shawn St.; Farris, Ronald
2014-09-01
The Advanced Outage Control Center (AOCC) is a multi-year pilot project targeted at Nuclear Power Plant (NPP) outage improvement. The purpose of this pilot project is to improve management of NPP outages through the development of an AOCC that is specifically designed to maximize the usefulness of communication and collaboration technologies for outage coordination and problem resolution activities. This report documents the results of a benchmarking effort to evaluate the transferability of technologies demonstrated at Idaho National Laboratory and the primary pilot project partner, Palo Verde Nuclear Generating Station. The initial assumption for this pilot project was that NPPs generally do not take advantage of advanced technology to support outage management activities. Several researchers involved in this pilot project have commercial NPP experience and believed that very little technology has been applied to outage communication and collaboration. To verify that the technology options researched and demonstrated through this pilot project would in fact have broad application for the US commercial nuclear fleet, and to look for additional outage management best practices, LWRS program researchers visited several additional nuclear facilities.
Raising consciousness about the nuclear threat through music
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ungerleider, J.H.
1987-01-01
This dissertation examines the use of music, in particular topical collaborative group song writing, as a tool for raising consciousness about the threat of nuclear war. Consciousness raising is one way to overcome the phenomenon of denial and to increase discussion and social action in response to the nuclear threat. This dissertation measures the impact of a group song writing workshop on developing critical problem-solving in adult groups; it reviews how music is applied in psychological research and clinical work, has been used historically as a tool in social-change movements in America, and is used in the contemporary field of peace education. The perspectives of several theorists who discuss the potential of music to contribute to social change are presented. It is concluded that consciousness about the nuclear threat - in terms of naming and analyzing - can be raised by working with music's potential for developing affective, expressive, and collaborative capabilities in individuals and groups. Potential applications of the group song writing workshop are in schools, with peace organizations, music groups, and in relation to other social issues.
The development of internationally managed information systems and their prospects.
East, H
1978-12-01
This paper reviews a selection of international collaborative efforts in the production of information services and attempts to characterize modes of cooperation. Information systems specifically discussed include: international nuclear information system (INIS); Nuclear Science Abstract (NSA); EURATOM; AGRIS; AGRINDEX; Information Retrieval Limited (IRL); IFIS (International Food Information Service); Chemical Abstracts Service (CAS); MEDLARS; and TITUS. 3 methods of international information transfer are discussed: commercial transactions; negotiated (bilateral) barter arrangements; and contribution to internationally managed systems. Technical, economic, and professional objectives support the rationale for international cooperation. It is argued that economic and political considerations, as much as improved technology or information transfer, will determine the nature of collaboration in the future.
The EXPERT project: part of the Super-FRS Experiment Collaboration
NASA Astrophysics Data System (ADS)
Chudoba, V.; EXPERT project collaboration
Fried, Jacquelyn L
2014-06-01
A collaborative practice model related to human papillomavirus (HPV)-associated oropharyngeal cancer (OPC) highlights the role of the dental hygienist in addressing this condition. The incidence of HPV-associated head and neck cancer is rising. Multiple professionals, including the dental hygienist, can work collaboratively to confront this growing public health concern. A critical review applies the growth and utilization of interprofessional education (IPE) and interprofessional collaboration (IPC) to multi-disciplinary models addressing HPV and oropharyngeal cancers. A model related to HPV-associated OPC addresses an oral-systemic condition that supports the inclusion of a dental hygienist on collaborative teams addressing prevention, detection, treatment, and cure of OPC. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Mann, R. W.
1974-01-01
The design and development of a prosthetic device fitted to an above-elbow amputee is reported. The device derives control information from the human to modulate power to an actuator that drives the substitute limb; in turn, the artificial limb generates sensory information that is fed back to the human nervous system and brain. This synergetic unit carries efferent (motor) control information from the human to the machine, and the machine responds, delivering afferent (sensory) information back to the human.
Techniques and applications for binaural sound manipulation in human-machine interfaces
NASA Technical Reports Server (NTRS)
Begault, Durand R.; Wenzel, Elizabeth M.
1990-01-01
The implementation of binaural sound to speech and auditory sound cues (auditory icons) is addressed from both an applications and technical standpoint. Techniques overviewed include processing by means of filtering with head-related transfer functions. Application to advanced cockpit human interface systems is discussed, although the techniques are extendable to any human-machine interface. Research issues pertaining to three-dimensional sound displays under investigation at the Aerospace Human Factors Division at NASA Ames Research Center are described.
Techniques and applications for binaural sound manipulation in human-machine interfaces
NASA Technical Reports Server (NTRS)
Begault, Durand R.; Wenzel, Elizabeth M.
1992-01-01
The implementation of binaural sound to speech and auditory sound cues (auditory icons) is addressed from both an applications and technical standpoint. Techniques overviewed include processing by means of filtering with head-related transfer functions. Application to advanced cockpit human interface systems is discussed, although the techniques are extendable to any human-machine interface. Research issues pertaining to three-dimensional sound displays under investigation at the Aerospace Human Factors Division at NASA Ames Research Center are described.
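The HRTF filtering mentioned in the abstract amounts to convolving a mono source with left- and right-ear impulse responses. The sketch below uses a crude delay-and-attenuate placeholder (an assumed interaural time and level difference) rather than measured head-related transfer functions:

```python
import numpy as np

fs = 8000
t = np.arange(fs) / fs
mono = np.sin(2 * np.pi * 440 * t)              # 1 s, 440 Hz test tone

# Assumed HRIRs for a source to the listener's right: the left ear receives
# the sound later (interaural time difference) and quieter (level difference).
# Measured HRIRs additionally encode spectral pinna/head filtering.
itd_samples = int(0.0006 * fs)                  # ~0.6 ms ITD
hrir_r = np.zeros(32); hrir_r[0] = 1.0
hrir_l = np.zeros(32); hrir_l[itd_samples] = 0.5

left = np.convolve(mono, hrir_l)
right = np.convolve(mono, hrir_r)
binaural = np.stack([left, right])              # 2-channel spatialized output

# A right-side source should carry more energy in the right channel.
print(np.sum(right**2) > np.sum(left**2))
```

In a real display the HRIR pair would be selected (or interpolated) per source direction and updated as the listener's head moves.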
NASA Astrophysics Data System (ADS)
Altıparmak, Hamit; Al Shahadat, Mohamad; Kiani, Ehsan; Dimililer, Kamil
2018-04-01
Robotic agriculture requires smart and practicable techniques to substitute machine intelligence for human intelligence. Strawberry is one of the important Mediterranean products, and its productivity enhancement requires modern, machine-based methods. Whereas a human identifies disease-infected leaves by eye, the machine should also be capable of vision-based disease identification. The objective of this paper is to practically verify the applicability of a new computer-vision method for discrimination between healthy and disease-infected strawberry leaves, one which requires neither neural networks nor time-consuming training. The proposed method was tested under outdoor lighting conditions using a regular DSLR camera without any particular lens. Since the type and degree of infection are approximated much as a human brain would, a fuzzy decision maker classifies the leaves from the images captured on-site, with the same properties as human vision. Optimizing the fuzzy parameters for a typical strawberry production area at summer mid-day in Cyprus produced 96% accuracy for segmented iron deficiency and 93% accuracy for segmented disease, using a typical human instant-classification approximation as the benchmark, a higher accuracy than a human eye identifier. The fuzzy-based classifier provides an approximate result for deciding whether a leaf is healthy or not.
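A minimal fuzzy decision maker of the kind described might look like the following sketch. The membership functions, the green-channel feature, and the thresholds are invented for illustration; they are not the calibrated parameters from the Cyprus field study:

```python
# Toy fuzzy classifier: triangular membership functions over a leaf's mean
# green-channel value (0..1) decide "healthy" vs "infected".
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify_leaf(mean_green):
    healthy = tri(mean_green, 0.45, 0.75, 1.01)    # greener -> healthier
    infected = tri(mean_green, -0.01, 0.25, 0.55)  # yellowing / necrosis
    return "healthy" if healthy >= infected else "infected"

print(classify_leaf(0.8), classify_leaf(0.2))
```

No training is needed, which matches the paper's selling point: the knowledge sits in the hand-tuned membership functions rather than in learned weights.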
CFD Aided Design and Production of Hydraulic Turbines
NASA Astrophysics Data System (ADS)
Kaplan, Alper; Cetinturk, Huseyin; Demirel, Gizem; Ayli, Ece; Celebioglu, Kutay; Aradag, Selin; ETU Hydro Research Center Team
2014-11-01
Hydraulic turbines are turbomachines that produce electricity from hydraulic energy. Francis-type turbines are the most common in use today. The design of these turbines requires high engineering effort, since each turbine is tailor-made for a different head and discharge; therefore each component of the turbine is designed specifically. During the last decades, Computational Fluid Dynamics (CFD) has become a very useful tool to predict hydraulic machinery performance and save time and money for designers. This paper describes a design methodology to optimize a Francis turbine by integrating theoretical and experimental fundamentals of hydraulic machines and commercial CFD codes. Specific turbines are designed and manufactured with the help of a collaborative CFD/CAD/CAM methodology based on computational fluid dynamics and five-axis machining for hydroelectric power plants. The details are presented in this study. This study is financially supported by the Turkish Ministry of Development.
Nutrition environment measures survey-vending: development, dissemination, and reliability.
Voss, Carol; Klein, Susan; Glanz, Karen; Clawson, Margaret
2012-07-01
Researchers determined a need to develop an instrument to assess the vending machine environment that was comparably reliable and valid to other Nutrition Environment Measures Survey tools and that would provide consistent and comparable data for businesses, schools, and communities. Tool development, reliability testing, and dissemination of the Nutrition Environment Measures Survey-Vending (NEMS-V) involved a collaboration of students, professionals, and community leaders. Interrater reliability testing showed high levels of agreement among trained raters on the products and evaluations of products. NEMS-V can benefit public health partners implementing policy and environmental change initiatives as a part of their community wellness activities. The vending machine project will support a policy calling for state facilities to provide a minimum of 30% of foods and beverages in vending machines as healthy options, based on NEMS-V criteria, which will be used as a model for other businesses.
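Interrater agreement of the kind reported for NEMS-V is commonly quantified with Cohen's kappa, which corrects raw agreement for chance. This standalone implementation uses made-up ratings, not NEMS-V data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters over the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum(ca[k] * cb[k] for k in ca) / n**2   # chance agreement
    return (observed - expected) / (1 - expected)

# Hypothetical ratings of six vending items against a healthy-option criterion.
a = ["healthy", "healthy", "not", "healthy", "not", "healthy"]
b = ["healthy", "healthy", "not", "not",     "not", "healthy"]
kappa = cohens_kappa(a, b)
print(round(kappa, 2))
```

Here five of six items agree (83% raw agreement), but kappa is lower because two raters marking mostly "healthy" would agree often by chance alone.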
Hierarchical analytical and simulation modelling of human-machine systems with interference
NASA Astrophysics Data System (ADS)
Braginsky, M. Ya; Tarakanov, D. V.; Tsapko, S. G.; Tsapko, I. V.; Baglaeva, E. A.
2017-01-01
The article considers the principles of building an analytical and simulation model of the human operator and of industrial control system hardware and software. E-networks, an extension of Petri nets, are used as the mathematical apparatus. This approach allows simulating complex parallel distributed processes in human-machine systems. A structural-hierarchical approach is used to build the mathematical model of the human operator. The upper level of the human operator is represented by a logical-dynamic model of decision making based on E-networks. The lower level reflects the psychophysiological characteristics of the human operator.
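The base Petri-net firing semantics that E-networks extend can be sketched briefly (E-networks add attributed tokens, timing, and resolution procedures on top of this rule). The operator-model places and transitions below are hypothetical, not the authors' model:

```python
# Toy Petri net: a marking maps places to token counts; a transition fires
# only when every input place holds a token, consuming inputs and producing
# outputs.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)     # place -> token count
        self.transitions = {}            # name -> (input places, output places)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= 1 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"{name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical operator fragment: an alarm plus an attentive operator yields a
# decision; acting on the decision returns the operator to monitoring.
net = PetriNet({"alarm": 1, "operator_idle": 1})
net.add_transition("assess", ["alarm", "operator_idle"], ["deciding"])
net.add_transition("act", ["deciding"], ["operator_idle"])
net.fire("assess")
net.fire("act")
print(net.marking)
```

The concurrency the abstract refers to comes for free: independent transitions with disjoint input places can fire in any order, which is what makes the formalism suitable for parallel human-machine processes.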
All printed touchless human-machine interface based on only five functional materials
NASA Astrophysics Data System (ADS)
Scheipl, G.; Zirkl, M.; Sawatdee, A.; Helbig, U.; Krause, M.; Kraker, E.; Andersson Ersman, P.; Nilsson, D.; Platt, D.; Bodö, P.; Bauer, S.; Domann, G.; Mogessie, A.; Hartmann, Paul; Stadlober, B.
2012-02-01
We demonstrate the printing of a complex smart integrated system using only five functional inks: the fluoropolymer P(VDF:TrFE) (poly(vinylidene fluoride-trifluoroethylene)) sensor ink, the conductive polymer PEDOT:PSS (poly(3,4-ethylenedioxythiophene):poly(styrene sulfonic acid)) ink, a conductive carbon paste, a polymeric electrolyte, and SU8 for separation. The result is a touchless human-machine interface, including piezo- and pyroelectric sensor pixels (sensitive to pressure changes and impinging infrared light), transistors for impedance matching and signal conditioning, and an electrochromic display. Applications may not only emerge in human-machine interfaces, but also in transient temperature or pressure sensing used in safety technology, in artificial skins and in disposable sensor labels.
Collaborative Lab Reports with Google Docs
NASA Astrophysics Data System (ADS)
Wood, Michael
2011-03-01
Science is a collaborative endeavor. The solitary genius working on the next great scientific breakthrough is a myth not seen much today. Instead, most physicists have worked in a group at one point in their careers, whether as a graduate student, faculty member, staff scientist, or industrial researcher. As an experimental nuclear physicist with research at the Thomas Jefferson National Accelerator Facility, my collaboration consists of over 200 scientists, both national and international. A typical experiment will have a dozen or so principal investigators. Add in the hundreds of staff scientists, engineers, and technicians, and it is clear that science is truly a collaborative effort. This paper will describe the use of Google Docs for collaborative reports for an introductory physics laboratory.
Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun
2017-02-06
In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human-machine-environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.
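The Gaussian process regression at the core of such a motion synthesis step can be sketched in a few lines: fit an RBF kernel to training samples, then predict at a new value of the independent variable. The kinematic variables, training values, and length scale below are invented for illustration, not the study's data:

```python
import numpy as np

def rbf(a, b, length=25.0):
    """Squared-exponential kernel between two 1-D sample vectors."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

# Hypothetical training data: barbell load in kg (x) -> some normalized
# joint-angle feature of the synthesized squat (y).
x_train = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
y_train = np.array([0.50, 0.55, 0.63, 0.74, 0.90])

noise = 1e-4                                    # jitter for numerical stability
K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
alpha = np.linalg.solve(K, y_train)             # precomputed weights

def predict(x_new):
    """GP posterior mean at new inputs."""
    return rbf(np.atleast_1d(x_new), x_train) @ alpha

y50 = float(predict(50.0)[0])
print(round(y50, 2))   # smooth interpolation between the 40 kg and 60 kg samples
```

A full motion-synthesis pipeline would predict whole joint-angle trajectories (vector outputs) and also use the GP's predictive variance, which this mean-only sketch omits.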
Collaborative filtering on a family of biological targets.
Erhan, Dumitru; L'heureux, Pierre-Jean; Yue, Shi Yi; Bengio, Yoshua
2006-01-01
Building a QSAR model of a new biological target for which few screening data are available is a statistical challenge. However, the new target may be part of a bigger family, for which we have more screening data. Collaborative filtering or, more generally, multi-task learning, is a machine learning approach that improves the generalization performance of an algorithm by using information from related tasks as an inductive bias. We use collaborative filtering techniques for building predictive models that link multiple targets to multiple examples. The more commonalities between the targets, the better the multi-target model that can be built. We show an example of a multi-target neural network that can use family information to produce a predictive model of an undersampled target. We also evaluate JRank, a kernel-based method designed for collaborative filtering. We show the performance of both methods on compound prioritization for an HTS campaign and examine the underlying shared representation between targets. JRank outperformed the neural network in both the single- and multi-target models.
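The multi-task intuition (related targets share structure, so an undersampled target borrows strength from its family) can be illustrated with a shared low-rank factorization fit by alternating least squares. This toy is neither JRank nor the neural network evaluated in the paper, and the activity matrix is synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic activity matrix: 4 related targets x 6 compounds with shared
# rank-1 structure (each target responds to the same compound profile,
# scaled by a target-specific factor).
true_t = np.array([[1.0], [0.9], [1.1], [0.8]])
true_c = np.array([[0.2, 0.9, 0.4, 0.8, 0.1, 0.6]])
R = true_t @ true_c

mask = np.ones(R.shape, dtype=bool)
mask[0, 1] = mask[2, 4] = mask[3, 3] = False   # pretend these assays are missing

# Alternating least squares on the observed entries only.
T = rng.standard_normal((4, 1))
C = rng.standard_normal((1, 6))
for _ in range(200):
    for i in range(4):                          # update target factors
        m = mask[i]
        T[i] = (R[i, m] @ C[0, m]) / (C[0, m] @ C[0, m] + 1e-9)
    for j in range(6):                          # update compound factors
        m = mask[:, j]
        C[0, j] = (R[m, j] @ T[m, 0]) / (T[m, 0] @ T[m, 0] + 1e-9)

err = np.abs(T @ C - R).max()                   # held-out entries recovered too
print(err < 1e-3)
```

The held-out assays are recovered because the other targets pin down the shared compound profile; with a lone target and the same missing entries, nothing would constrain them.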
PROVIDING PLANT DATA ANALYTICS THROUGH A SEAMLESS DIGITAL ENVIRONMENT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bly, Aaron; Oxstrand, Johanna
As technology continues to evolve and become more integrated into workers' daily routines in the nuclear power industry, easy access to data becomes a priority. Not only does the need for data increase, but the amount of data collected increases as well. In most cases the data is collected and stored in various software applications, many of which are legacy systems that do not offer any way to access the data except through the application's user interface. Furthermore, the data gets grouped in "silos" according to work function and not necessarily by subject. Hence, in order to access all the information needed for a particular task or analysis, one may have to access multiple applications to gather all the data needed. The industry and the research community have identified the need for a digital architecture and, more importantly, for a Seamless Digital Environment (SDE). An SDE provides a means to access multiple applications, gather the data points needed, conduct the analysis requested, and present the result to the user with minimal or no effort by the user. In addition, the nuclear utilities have identified the need for research focused on data analytics. The effort should develop and evaluate use cases for data mining and analytics employing information from plant sensors and databases to develop improved business analytics. Idaho National Laboratory is leading such an effort, conducted in close collaboration with vendors, nuclear utilities, the Institute of Nuclear Power Operations, and the Electric Power Research Institute. The goal of the study is to research potential approaches to building an analytics solution for equipment reliability, on a small scale, focusing on either a single piece of equipment or a single system.
The analytics solution will likely consist of a data integration layer, a predictive and machine learning layer, and a user interface layer that displays the output of the analysis in a straightforward, easy-to-consume manner. This paper describes the study and the initial results.
Traceability of On-Machine Tool Measurement: A Review
Gomez-Acedo, Eneko; Kortaberria, Gorka; Olarra, Aitor
2017-01-01
Nowadays, errors during the manufacturing process of high value components are not acceptable in driving industries such as energy and transportation. Sectors such as aerospace, automotive, shipbuilding, nuclear power, large science facilities or wind power need complex and accurate components that demand close measurements and fast feedback into their manufacturing processes. New measuring technologies are already available in machine tools, including integrated touch probes and fast interface capabilities. They provide the possibility to measure the workpiece in-machine during or after its manufacture, maintaining the original setup of the workpiece and avoiding the manufacturing process from being interrupted to transport the workpiece to a measuring position. However, the traceability of the measurement process on a machine tool is not ensured yet and measurement data is still not fully reliable enough for process control or product validation. The scientific objective is to determine the uncertainty on a machine tool measurement and, therefore, convert it into a machine integrated traceable measuring process. For that purpose, an error budget should consider error sources such as the machine tools, components under measurement and the interactions between both of them. This paper reviews all those uncertainty sources, being mainly focused on those related to the machine tool, either on the process of geometric error assessment of the machine or on the technology employed to probe the measurand. PMID:28696358
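The first step of the error budget the review calls for is typically a GUM-style combination of independent standard uncertainty contributors in quadrature, followed by an expanded uncertainty at a chosen coverage factor. The contributor names and values below are illustrative, not figures from the paper:

```python
import math

# Assumed standard uncertainties for an on-machine measurement, in micrometres.
budget_um = {
    "machine geometric errors": 4.0,
    "thermal drift": 2.5,
    "probe/stylus": 1.5,
    "workpiece form and setup": 2.0,
}

# Combined standard uncertainty: root-sum-of-squares of independent terms.
u_c = math.sqrt(sum(u**2 for u in budget_um.values()))
# Expanded uncertainty with coverage factor k=2 (~95% for a normal model).
U = 2.0 * u_c
print(round(u_c, 2), round(U, 2))
```

The quadrature sum assumes the contributors are uncorrelated; on a machine tool that assumption itself needs checking, since thermal and geometric errors often share causes.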
NASA Astrophysics Data System (ADS)
Ikeda, Fujio; Toyama, Shigehiro; Ishiduki, Souta; Seta, Hiroaki
2016-09-01
Maritime accidents of small ships continue to increase in number. One of the major factors is poor manoeuvrability of the Manual Hydraulic Steering Mechanism (MHSM) in common use. The manoeuvrability can be improved by using the Electronic Control Steering Mechanism (ECSM). This paper conducts stability analyses of a pleasure boat controlled by human models in view of path following on a target course, in order to establish design guidelines for the ECSM. First, to analyse the stability region, the research derives the linear approximated model in a planar global coordinate system. Then, several human models are assumed to develop closed-loop human-machine controlled systems. These human models include basic proportional, derivative, integral and time-delay actions. The stability analysis simulations for those human-machine systems are carried out. The results show that the stability region tends to spread as a ship's velocity increases in the case of the basic proportional human model. The derivative action and time-delay action of human models are effective in spreading the stability region in their respective ranges of frontal gazing points.
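The qualitative conclusion (reaction delay plus loop gain limits stability) can be reproduced with a toy discrete-time steering loop: a proportional "helm" acting on stale course error. The dynamics, gains, and delay are invented placeholders, not the paper's linearized boat model:

```python
def simulate(kp, delay_steps, steps=400, dt=0.1):
    """Proportional human model steering heading toward a target course."""
    heading, target = 0.0, 1.0            # radians
    history = [0.0] * delay_steps         # queue of past course errors
    for _ in range(steps):
        history.append(target - heading)  # error perceived now...
        acted_on = history.pop(0)         # ...but acted on delay_steps later
        heading += dt * kp * acted_on     # first-order response to the rudder
    return abs(target - heading)          # residual course error

# Moderate gain converges despite the delay; high gain destabilises the loop.
print(simulate(1.0, 5) < 0.01, simulate(8.0, 5) > 1.0)
```

This mirrors the paper's finding that the derivative and time-delay terms of the human model matter: the same proportional gain that is safe with a fast operator can be unstable once reaction lag is included.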
Redesigning the Human-Machine Interface for Computer-Mediated Visual Technologies.
ERIC Educational Resources Information Center
Acker, Stephen R.
1986-01-01
This study examined an application of a human machine interface which relies on the use of optical bar codes incorporated in a computer-based module to teach radio production. The sequencing procedure used establishes the user rather than the computer as the locus of control for the mediated instruction. (Author/MBR)
Teaching Machines to Think Fuzzy
ERIC Educational Resources Information Center
Technology Teacher, 2004
2004-01-01
Fuzzy logic programs make computers more human: they can think through messy situations and make smart decisions, controlling things the way people do. Fuzzy logic has been used to control subway trains, elevators, washing machines, microwave ovens, and cars. Pretty much all the human has to do is push one…
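The washing-machine example above can be sketched in a few lines. A minimal fuzzy controller, with made-up membership functions and rule outputs, that maps a "dirtiness" reading to a wash time using weighted-average defuzzification:

```python
# Illustrative fuzzy controller (fuzzy sets and rule values are invented).
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def wash_time(dirtiness):
    """Map a dirtiness reading in [0, 10] to a wash time in minutes."""
    # Rule 1: IF dirt is LOW  THEN time is SHORT (20 min)
    # Rule 2: IF dirt is HIGH THEN time is LONG  (60 min)
    low = tri(dirtiness, -10, 0, 10)
    high = tri(dirtiness, 0, 10, 20)
    # Defuzzify: firing-strength-weighted average of the rule outputs.
    return (low * 20 + high * 60) / (low + high)

print(wash_time(2.5))  # mildly dirty load -> 30.0 minutes, closer to SHORT
```

Unlike a crisp threshold ("dirty or not"), both rules fire partially, so the output varies smoothly with the input, which is what lets fuzzy controllers behave the way a person turning a dial would.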
Energy-Efficient Hosting Rich Content from Mobile Platforms with Relative Proximity Sensing.
Park, Ki-Woong; Lee, Younho; Baek, Sung Hoon
2017-08-08
In this paper, we present a tiny networked mobile platform, termed Tiny-Web-Thing (T-Wing), which allows the sharing of data-intensive content among objects in cyber-physical systems. These objects include mobile platforms such as smartphones and Internet of Things (IoT) platforms for Human-to-Human (H2H), Human-to-Machine (H2M), Machine-to-Human (M2H), and Machine-to-Machine (M2M) communications. T-Wing makes it possible to host rich web content directly on the objects themselves, which nearby objects can access instantaneously. Using a new mechanism that allows the Wi-Fi interface of an object to be turned on purely on demand, T-Wing achieves very high energy efficiency. We have implemented T-Wing on an embedded board and present evaluation results from our testbed, comparing our system against alternative approaches that implement this functionality using only the cellular or Wi-Fi interface (but not both). In typical usage, T-Wing consumes up to 15× less energy and is faster by an order of magnitude.
Application of machine learning methods in bioinformatics
NASA Astrophysics Data System (ADS)
Yang, Haoyu; An, Zheng; Zhou, Haotian; Hou, Yawen
2018-05-01
With the development of bioinformatics, high-throughput genomic technologies have enabled biology to enter the era of big data [1]. Bioinformatics is an interdisciplinary field covering the acquisition, management, analysis, interpretation and application of biological information; it derives from the Human Genome Project. The field of machine learning, which aims to develop computer algorithms that improve with experience, holds promise to enable computers to assist humans in the analysis of large, complex data sets [2]. This paper analyses and compares various machine learning algorithms and their applications in bioinformatics.
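One of the classic machine-learning methods surveyed in such comparisons can be sketched from scratch. A minimal k-nearest-neighbour classifier; the "GC content / sequence length" features and their labels are invented toy data, not from the paper:

```python
# k-nearest-neighbour classification: label a query point by majority vote
# among the k closest labelled training points.
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of ((feature, ...), label); returns the majority label."""
    dist = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q))  # squared L2
    neighbours = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Toy features: (GC fraction, sequence length) -> invented region labels.
train = [((0.30, 120), "non-coding"), ((0.35, 150), "non-coding"),
         ((0.60, 900), "coding"), ((0.65, 1100), "coding"),
         ((0.55, 800), "coding")]
print(knn_predict(train, (0.58, 850)))  # -> coding
```

In practice the features would be normalised first (here raw sequence length dominates the distance), which is exactly the kind of preprocessing choice that algorithm comparisons in bioinformatics have to control for.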
A Framework for Modeling Human-Machine Interactions
NASA Technical Reports Server (NTRS)
Shafto, Michael G.; Rosekind, Mark R. (Technical Monitor)
1996-01-01
Modern automated flight-control systems employ a variety of different behaviors, or modes, for managing the flight. While developments in cockpit automation have resulted in workload reduction and economical advantages, they have also given rise to an ill-defined class of human-machine problems, sometimes referred to as 'automation surprises'. Our interest in applying formal methods for describing human-computer interaction stems from our ongoing research on cockpit automation. In this area of aeronautical human factors, there is much concern about how flight crews interact with automated flight-control systems, so that the likelihood of making errors, in particular mode-errors, is minimized and the consequences of such errors are contained. The goal of the ongoing research on formal methods in this context is: (1) to develop a framework for describing human interaction with control systems; (2) to formally categorize such automation surprises; and (3) to develop tests for identification of these categories early in the specification phase of a new human-machine system.
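The idea of formally enumerating mode transitions can be shown with a toy table. This is not the paper's framework; the modes, events and transitions below are invented, and the check is deliberately simple: list every transition the automation takes without a crew command, since uncommanded mode changes are the seed of "automation surprises".

```python
# Toy mode-logic table: (mode, event) -> (next_mode, commanded_by_crew).
TRANSITIONS = {
    ("VNAV", "crew_selects_fl_change"): ("FLCH", True),
    ("FLCH", "altitude_capture"): ("ALT_HOLD", False),   # automatic capture
    ("ALT_HOLD", "crew_selects_vnav"): ("VNAV", True),
    ("VNAV", "envelope_protection"): ("FLCH", False),    # automatic reversion
}

def uncommanded_transitions(transitions):
    """Return transitions the crew did not command -- surprise candidates."""
    return [(mode, event, nxt)
            for (mode, event), (nxt, crew) in transitions.items()
            if not crew]

for mode, event, nxt in uncommanded_transitions(TRANSITIONS):
    print(f"{mode} --{event}--> {nxt}: automatic; needs salient annunciation")
```

A real specification-phase analysis would check the full reachable state space against a model of the crew's expectations, but even this table-level scan makes the automatic transitions explicit rather than implicit in code.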
Stimulating Students' Use of External Representations for a Distance Education Time Machine Design
ERIC Educational Resources Information Center
Baaki, John; Luo, Tian
2017-01-01
As faculty members in an instructional design and technology (IDT) program, we wanted to help our graduate students better understand and experience how designers design in the real world. We aimed to design a reflective and collaborative learning environment where we sparked students to engage in reflection, ideation, and the iterative process of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haxton, Wick
2012-03-07
This project was focused on simulations of core-collapse supernovae on parallel platforms. The intent was to address a number of linked issues: the treatment of hydrodynamics and neutrino diffusion in two and three dimensions; the treatment of the underlying nuclear microphysics that governs neutrino transport and neutrino energy deposition; the understanding of the associated nucleosynthesis, including the r-process and neutrino process; the investigation of the consequences of new neutrino phenomena, such as oscillations; and the characterization of the neutrino signal that might be recorded in terrestrial detectors. This was a collaborative effort with Oak Ridge National Laboratory, State University of New York at Stony Brook, University of Illinois at Urbana-Champaign, University of California at San Diego, University of Tennessee at Knoxville, Florida Atlantic University, North Carolina State University, and Clemson. The collaborations tie together experts in hydrodynamics, nuclear physics, computer science, and neutrino physics. The University of Washington contributions to this effort include the further development of techniques to solve the Bloch-Horowitz equation for effective interactions and operators; collaborative efforts on developing a parallel Lanczos code; investigating the nuclear and neutrino physics governing the r-process and neutrino physics; and exploring the effects of new neutrino physics on the explosion mechanism, nucleosynthesis, and terrestrial supernova neutrino detection.
NASA Astrophysics Data System (ADS)
Schillaci, F.; Pommarel, L.; Romano, F.; Cuttone, G.; Costa, M.; Giove, D.; Maggiore, M.; Russo, A. D.; Scuderi, V.; Malka, V.; Vauzour, B.; Flacco, A.; Cirrone, G. A. P.
2016-07-01
Laser-based accelerators have gained interest in recent years as an alternative to conventional machines [1]. In current ion acceleration schemes, the energy and angular spread of laser-driven beams are the main limiting factors for beam applications, and different solutions for dedicated beam-transport lines have been proposed [2,3]. In this context, a system of Permanent Magnet Quadrupoles (PMQs) has been realized [2] by researchers at INFN-LNS (Laboratori Nazionali del Sud of the Istituto Nazionale di Fisica Nucleare), in collaboration with the SIGMAPHI company in France, to be used as a collection and pre-selection system for laser-driven proton beams. This system is meant to be the prototype of a higher-performance system [3] to be installed at ELI-Beamlines for the collection of ions; the final system is designed for protons and carbon ions up to 60 MeV/u. In order to validate the design and performance of this large-bore, compact, high-gradient magnetic system prototype, an experimental campaign has been carried out in collaboration with the group of the SAPHIR experimental facility at LOA (Laboratoire d'Optique Appliquée) in Paris, using a 200 TW Ti:Sapphire laser system. During this campaign, a detailed study of the quadrupole system optics was performed, comparing the results with the simulation codes used to determine the setup of the PMQ system and to track protons with realistic TNSA-like divergence and spectrum. Experimental and simulation results are in good agreement, demonstrating that good control of the magnet optics can be achieved. The procedure used during the experimental campaign and the most relevant results are reported here.
ERIC Educational Resources Information Center
Miles, Melissa; Rainbird, Sarah
2015-01-01
This article responds to the rising emphasis placed on interdisciplinary collaborative learning and its implications for assessment in higher education. It presents findings from a research project that examined the effectiveness of an interdisciplinary collaborative student symposium as an assessment task in an art school/humanities environment.…
ERIC Educational Resources Information Center
Jäppinen, Aini-Kristiina
2014-01-01
The article aims at explicating the emergence of human interactional sense-making process within educational leadership as a complex system. The kind of leadership is understood as a holistic entity called collaborative leadership. There, sense-making emerges across interdependent domains, called attributes of collaborative leadership. The…
Human Systems Engineering: A Leadership Model for Collaboration and Change.
ERIC Educational Resources Information Center
Clark, Karen L.
Human systems engineering (HSE) was created to introduce a new way of viewing collaboration. HSE emphasizes the role of leaders who welcome risk, commit to achieving positive change, and help others achieve change. The principles of HSE and its successful application to the collaborative process were illustrated through a case study representing a…
DOT National Transportation Integrated Search
1974-08-01
Volume 3 describes the methodology for man-machine task allocation. It contains a description of man and machine performance capabilities and an explanation of the methodology employed to allocate tasks to human or automated resources. It also presen...
Simple Machines. Physical Science in Action[TM]. Schlessinger Science Library. [Videotape].
ERIC Educational Resources Information Center
2000
In today's world, kids are aware that there are machines all around them. What they may not realize is that the function of all machines is to make work easier in some way. Simple Machines uses engaging visuals and colorful graphics to explain the concept of work and how humans use certain basic tools to help get work done. Students will learn…
Liu, Tongzhu; Shen, Aizong; Hu, Xiaojian; Tong, Guixian; Gu, Wei
2017-06-01
We aimed to apply a collaborative business intelligence (BI) system to the hospital supply, processing and distribution (SPD) logistics management model. We searched the Engineering Village database, China National Knowledge Infrastructure (CNKI) and Google for articles (published from 2011 to 2016), books, Web pages, etc., to understand SPD- and BI-related theories and the state of recent research. To apply collaborative BI technology to the hospital SPD logistics management model, we leveraged data mining techniques to discover knowledge from complex data and collaborative techniques to improve business processes. For the application of the BI system, we: (i) proposed a layered structure of a collaborative BI system for intelligent management in hospital logistics; (ii) built a data warehouse for the collaborative BI system; (iii) improved data mining techniques such as support vector machines (SVM) and the swarm-intelligence firefly algorithm to solve key problems in the hospital logistics collaborative BI system; (iv) researched collaborative techniques oriented to data and business process optimization to improve the business processes of hospital logistics management. A proper combination of the SPD model and the BI system will improve the management of logistics in hospitals. Successful implementation requires: (i) innovating and improving the traditional SPD model and making appropriate implementation plans and schedules for the application of the BI system according to the actual situation of each hospital; (ii) the collaborative participation of internal hospital departments, including information, logistics, nursing, medical and financial departments; (iii) timely response from external suppliers.
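Of the two techniques named in point (iii), the firefly algorithm is compact enough to sketch. A minimal version (not the authors' improved variant; population size, parameters and the toy cost function are all illustrative) minimising a 2-D sphere function:

```python
# Firefly algorithm: each firefly moves toward brighter (lower-cost) ones
# with attractiveness decaying in distance, plus a damped random walk.
import math
import random

def firefly_minimise(f, dim=2, n=15, iters=60, beta0=1.0, gamma=0.01, alpha=0.2):
    random.seed(42)  # fixed seed so the toy run is reproducible
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        cost = [f(x) for x in pop]  # lower cost = brighter firefly
        for i in range(n):
            for j in range(n):
                if cost[j] < cost[i]:  # move firefly i toward brighter j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attractiveness
                    pop[i] = [a + beta * (b - a) + alpha * (random.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    cost[i] = f(pop[i])
        alpha *= 0.97  # damp the random walk as the swarm converges
    return min(pop, key=f)

best = firefly_minimise(lambda x: sum(v * v for v in x))
print(best)  # should land near the optimum [0, 0]
```

In the hospital setting the cost function would instead score, say, a replenishment schedule, and SVM predictions would feed the costs; the swarm mechanics stay the same.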
Machinability of some dentin simulating materials.
Möllersten, L
1985-01-01
Machinability in low-speed drilling was investigated for pure aluminium, Frasaco teeth, ivory, plexiglass and human dentin. The investigation was performed in order to find a suitable test material for drilling experiments using paralleling instruments: a material simulating human dentin in terms of cuttability at low drilling speeds was sought. Tests were performed using a specially designed apparatus in which holes were drilled to a depth of 2 mm with a twist drill under a constant feeding force, and the time required was registered. The machinability of the materials was determined by direct comparison of the drilling times. As regards cuttability, aluminium, followed by ivory, was found to resemble human dentin most closely. The homogeneity of the materials was estimated by comparing drilling-time variances; aluminium, Frasaco teeth and plexiglass demonstrated better homogeneity than ivory and human dentin.
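The two comparisons in this study reduce to simple statistics on drilling times: the mean ranks cuttability against dentin, and the sample variance indicates homogeneity. A sketch with invented drilling-time data (seconds), not the paper's measurements:

```python
# Hypothetical drilling times (s) for three of the tested materials.
times = {
    "human dentin": [14.2, 15.1, 13.8, 16.0, 14.9],
    "aluminium":    [14.5, 14.8, 14.6, 14.9, 14.7],
    "ivory":        [13.0, 16.5, 14.2, 17.1, 13.9],
}

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    """Unbiased sample variance -- smaller means a more homogeneous material."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

reference = mean(times["human dentin"])
for material, xs in times.items():
    print(f"{material:13s} mean={mean(xs):5.2f}s "
          f"(delta vs dentin {mean(xs) - reference:+.2f}s) "
          f"variance={variance(xs):.3f}")
```

With these invented numbers aluminium matches dentin's mean closely but is far more homogeneous, mirroring the paper's conclusion that aluminium is the better test material despite ivory's similar cuttability.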
NASA Astrophysics Data System (ADS)
Fern, Lisa Carolynn
This dissertation examines the challenges inherent in designing and regulating to support human-automation interaction for new technologies that will be deployed into complex systems. A key question for new technologies with increasingly capable automation is how work will be accomplished by human and machine agents. This question has traditionally been framed as how functions should be allocated between humans and machines. Such framing misses the coordination and synchronization that is needed for the different human and machine roles in the system to accomplish their goals. Coordination and synchronization demands are driven by the underlying human-automation architecture of the new technology, which is typically not specified explicitly by designers. The human-machine interface (HMI), which is intended to facilitate human-machine interaction and cooperation, typically is defined explicitly and therefore serves as a proxy for human-automation cooperation requirements with respect to technical standards for technologies. Unfortunately, mismatches between the HMI and the coordination and synchronization demands of the underlying human-automation architecture can lead to system breakdowns. A methodology is needed that both designers and regulators can utilize to evaluate the predicted performance of a new technology given potential human-automation architectures. Three experiments were conducted to inform the minimum HMI requirements for a detect and avoid (DAA) system for unmanned aircraft systems (UAS). The results of the experiments provided empirical input to specific minimum operational performance standards that UAS manufacturers will have to meet in order to operate UAS in the National Airspace System (NAS). These studies represent a success story for how to objectively and systematically evaluate prototype technologies as part of the process for developing regulatory requirements.
They also provide an opportunity to reflect on the lessons learned in order to improve the methodology for defining technology requirements for regulators in the future. The biggest shortcoming of the presented research program was the absence of the explicit definition, generation and analysis of potential human-automation architectures. Failure to execute this step in the research process resulted in less efficient evaluation of the candidate prototype technologies, in addition to a lack of exploration of different approaches to human-automation cooperation. Defining potential human-automation architectures a priori also allows regulators to develop scenarios that will stress the performance boundaries of the technology during the evaluation phase. The importance of adding this step of generating and evaluating candidate human-automation architectures prior to formal empirical evaluation is discussed. This document concludes with a look at both the importance of, and the challenges involved in, examining human-automation coordination issues as part of the safety-assurance activities for new technologies.
General method of pattern classification using the two-domain theory
NASA Technical Reports Server (NTRS)
Rorvig, Mark E. (Inventor)
1993-01-01
Human beings judge patterns (such as images) by complex mental processes, some of which may not be known, while computing machines extract features. By representing the human judgements with simple measurements, reducing both these and the machine-extracted features to a common metric space, and fitting them by regression, the judgements of human experts rendered on a sample of patterns may be imposed on a pattern population to provide automatic classification.
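The core of the two-domain idea, fitting machine-extracted features to human judgement scores by regression and then applying the fit to unjudged patterns, can be sketched with invented numbers (the "edge density" feature and expert scores below are hypothetical):

```python
# Ordinary least squares for y ~ w*x + b via the 2x2 normal equations,
# mapping one machine-extracted feature to human expert scores.
def fit_least_squares(features, scores):
    n = len(features)
    sx, sy = sum(features), sum(scores)
    sxx = sum(x * x for x in features)
    sxy = sum(x * y for x, y in zip(features, scores))
    w = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    b = (sy - w * sx) / n                          # intercept
    return w, b

# Machine feature (e.g. edge density) vs. expert quality score on a sample.
feature = [0.1, 0.3, 0.5, 0.7, 0.9]
human_score = [1.0, 2.1, 2.9, 4.2, 4.8]
w, b = fit_least_squares(feature, human_score)
predict = lambda x: w * x + b
print(predict(0.6))  # imposed "human" judgement for a new, unjudged pattern
```

With several features this becomes multiple regression, but the principle is unchanged: the sample of expert-judged patterns calibrates a map from the machine's feature space onto the human judgement scale.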
Han-Markey, T L; Wang, L; Schlotterbeck, S; Jackson, E A; Gurm, R; Leidal, A; Eagle, K
2012-04-01
The school environment has been the focus of many health initiatives over the years as a means to address the childhood obesity crisis. The availability of low-nutrient, high-calorie foods and beverages to students via vending machines further exacerbates the issue. However, a healthy overhaul of vending machines may also affect revenue on which schools have come to depend. This article describes the experience of one school district in changing the school environment, and the resulting impact on food and beverage vending machines. This was an observational study in Ann Arbor public schools: the contents and locations of vending machines were identified in 2003 and surveyed again in 2007, and overall revenues were documented over the same period. Changes were observed in the contents of both food and beverage vending machines, and revenue in the form of commissions to the contracted companies and the school district decreased. Local and national wellness policy changes may therefore have financial ramifications for school districts. In order to facilitate and sustain school environment change, all stakeholders, including teachers, administrators, students and healthcare providers, should collaborate and communicate on policy implementation, recognizing that change can have negative financial consequences as well as positive, healthier outcomes.
Trust metrics in information fusion
NASA Astrophysics Data System (ADS)
Blasch, Erik
2014-05-01
Trust is an important concept for machine intelligence, and its meaning is not consistent across applications. In this paper, we seek to understand trust in terms of a variety of factors: humans, sensors, communications, intelligence-processing algorithms and human-machine displays of information. In modeling the various aspects of trust, we provide an example from machine intelligence built around the attributes that support user acceptance of machine-intelligence results, such as sensor accuracy, communication timeliness, machine processing confidence, and display throughput. The example is the fusion of video and text, whereby an analyst needs trust information about an identified imagery track. We use the proportional conflict redistribution rule as an information fusion technique that handles conflicting data from trusted and mistrusted sources. The discussion of the many forms of trust explored in the paper seeks to provide a systems-level design perspective for quantifying trust in information fusion.
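The proportional conflict redistribution rule can be sketched in its simplest form. A minimal PCR5-style fusion of two sources over two exclusive hypotheses, with illustrative mass values standing in for the paper's video and text sources:

```python
# PCR5: combine two basic belief assignments conjunctively, then give each
# partial conflict back to the two hypotheses that generated it, in
# proportion to the masses involved.
def pcr5(m1, m2):
    """Fuse two basic belief assignments over exclusive hypotheses A and B."""
    fused = {h: m1[h] * m2[h] for h in ("A", "B")}  # conjunctive consensus
    for x, y in (("A", "B"), ("B", "A")):
        conflict = m1[x] * m2[y]  # mass assigned to the empty set A-and-B
        if conflict:
            total = m1[x] + m2[y]
            fused[x] += m1[x] * conflict / total
            fused[y] += m2[y] * conflict / total
    return fused

# A trusted video track favours hypothesis A; a mistrusted text report
# leans toward B (mass values invented for illustration).
video = {"A": 0.8, "B": 0.2}
text = {"A": 0.4, "B": 0.6}
result = pcr5(video, text)
print(result)  # masses still sum to 1; most of the conflict returns to A
```

Unlike Dempster's rule, which normalises the conflict away, PCR5 keeps it and routes it back to the hypotheses that caused it, so a strongly conflicting mistrusted source degrades confidence rather than being silently renormalised.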
Reasons for 2011 Release of the Evaluated Nuclear Data Library (ENDL2011.0)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, D.; Escher, J.; Hoffman, R.
LLNL's Computational Nuclear Physics Group and Nuclear Theory and Modeling Group have collaborated to create the 2011 release of the Evaluated Nuclear Data Library (ENDL2011). ENDL2011 is designed to support LLNL's current and future nuclear data needs. This database is currently the most complete nuclear database for Monte Carlo and deterministic transport of neutrons and charged particles, surpassing ENDL2009.0 [1]. The ENDL2011 release [2] contains 918 transport-ready evaluations in the neutron sub-library alone. ENDL2011 was assembled with strong support from the ASC program, leveraged with support from NNSA science campaigns and the DOE/Office of Science US Nuclear Data Program.
Nuclear valve manufacturer selects stainless forgings
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1976-02-01
Forged type 316 stainless steel components for nuclear valves are described. Automatic plasma arc welding with powder filler alloys is employed for hardfacing. Seat ring forgings are surfaced four-at-a-time with Stellite No. 156 in a sequential manner to minimize heat input to the individual components. After cladding and machining, seat rings are welded into the valve body using a semiautomatic, hot-wire gas tungsten-arc process. Disc faces and guide slots are surfaced with Stellite No. 6. The valve stem is machined from 17-4PH forged bar stock in the H-1100 condition. The heat treatment is specified to minimize pitting under prolonged exposure to wet packing. A 12 rms (0.3 μm) surface finish minimizes tearing of the packing and subsequent leakage. The link and stem pin are SA 564 Grade 660 (in the H-1100 condition) and ASTM A637 Grade 718, respectively. (JRD)