Performance Support Tools: Delivering Value when and where It Is Needed
ERIC Educational Resources Information Center
McManus, Paul; Rossett, Allison
2006-01-01
Some call them Electronic Performance Support Systems (EPSSs). Others prefer Performance Support Tools (PSTs) or decision support tools. One might call EPSSs or PSTs job aids on steroids, technological tools that provide critical information or advice needed to move forward at a particular moment in time. Characteristic advantages of an EPSS or a…
(abstract) Generic Modeling of a Life Support System for Process Technology Comparisons
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.
Generic Modeling of a Life Support System for Process Technology Comparison
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.
Enhancement of the EPA Stormwater BMP Decision-Support Tool (SUSTAIN)
U.S. Environmental Protection Agency (EPA) has been developing and improving a decision-support tool for placement of stormwater best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment and Analysis...
Enhancement of the EPA Stormwater BMP Decision-Support Tool (SUSTAIN) - slides
U.S. Environmental Protection Agency (EPA) has been developing and improving a decision-support tool for placement of stormwater best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment and Analysis...
Coordinating complex decision support activities across distributed applications
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1994-01-01
Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (APIs), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level APIs to implement the desired interactions between distributed applications.
Trinkaus, Hans L; Gaisser, Andrea E
2010-09-01
Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls, and to support the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a music score. Treating the interaction as a "duet", guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) resulted in a basic proof of concept: Demographic data as well as information regarding the situation of the caller could be identified. The study encourages following up on the vision of an integrated SACA tool for supporting calls online and performing statistics on its knowledge database offline. Further research perspectives are to check SACA's potential in comparison with established interaction analysis systems like RIAS. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
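The "duet score" idea above reduces, on the caller's side, to keyword spotting over a word stream. Below is a minimal Python sketch of that half of the pipeline; the topic lists, function name, and sample utterances are invented for illustration, and the actual SACA prototype additionally applies trained speech recognition to the counselor's channel.

```python
import re

# Hypothetical topic keyword lists; the real system's vocabulary is not
# given in the abstract.
TOPIC_KEYWORDS = {
    "diagnosis": {"tumor", "carcinoma", "metastasis"},
    "treatment": {"chemotherapy", "radiation", "surgery"},
    "demographics": {"age", "daughter", "husband"},
}

def spot_keywords(caller_utterances):
    """Fill a dialogue 'score': map each topic to the keywords heard."""
    score = {topic: set() for topic in TOPIC_KEYWORDS}
    for utterance in caller_utterances:
        words = set(re.findall(r"[a-z']+", utterance.lower()))
        for topic, keywords in TOPIC_KEYWORDS.items():
            score[topic] |= words & keywords
    return score

if __name__ == "__main__":
    calls = ["My husband was told the tumor spread",
             "Is chemotherapy an option at his age?"]
    print(spot_keywords(calls))
```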
2007-11-01
Engineering Research Laboratory is currently developing a set of facility "architectural" programming tools, called Facility Composer™ (FC). FC...requirements in the early phases of project development. As the facility program, criteria, and requirements are chosen, these tools populate the IFC...developing a set of facility "architectural" programming tools, called Facility Composer (FC), to support the capture and tracking of facility criteria
SMARTE: SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (BELFAST, IRELAND)
The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based, decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...
Maintenance and operations decision support tool : Clarus regional demonstrations.
DOT National Transportation Integrated Search
2011-01-01
Weather affects almost all maintenance activity decisions. The Federal Highway Administration (FHWA) tested a new decision support system for maintenance in Iowa, Indiana, and Illinois called the Maintenance and Operations Decision Support System (MO...
SMARTE: IMPROVING REVITALIZATION DECISIONS (BERLIN, GERMANY)
The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based, decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...
The Three C's for Urban Science Education
ERIC Educational Resources Information Center
Emdin, Chris
2008-01-01
In this article, the author outlines briefly what he calls the three C's--a set of tools that can be used to improve urban science education. The author then describes ways that these tools can support students who have traditionally been marginalized. These three aligned and closely connected tools provide practical ways to engage students in…
ERIC Educational Resources Information Center
Upitis, Rena; Brook, Julia
2017-01-01
Even though there are demonstrated benefits of using online tools to support student musicians, there is a persistent challenge of providing sufficient and effective professional development for independent music teachers to use such tools successfully. This paper describes several methods for helping teachers use an online tool called iSCORE,…
Tools Automate Spacecraft Testing, Operation
NASA Technical Reports Server (NTRS)
2010-01-01
"NASA began the Small Explorer (SMEX) program to develop spacecraft to advance astrophysics and space physics. As one of the entities supporting software development at Goddard Space Flight Center, the Hammers Company Inc. (tHC Inc.), of Greenbelt, Maryland, developed the Integrated Test and Operations System to support SMEX. Later, the company received additional Small Business Innovation Research (SBIR) funding from Goddard for a tool to facilitate the development of flight software called VirtualSat. NASA uses the tools to support 15 satellites, and the aerospace industry is using them to develop science instruments, spacecraft computer systems, and navigation and control software."
North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg
2007-10-11
Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.
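The abstract does not reproduce the matrix itself, but a prioritization matrix is a standard quality-improvement device: candidate enhancements are scored against weighted criteria and ranked. A minimal sketch follows, with criteria, weights, and algorithm names invented for illustration.

```python
# Invented criteria/weights; a real call-center matrix would derive these
# from stakeholder consensus.
CRITERIA_WEIGHTS = {"patient_safety": 0.5, "call_volume": 0.3, "effort": 0.2}

enhancements = {
    # criterion scores on a 1-5 scale (higher is better; "effort" pre-inverted)
    "chest_pain_algorithm": {"patient_safety": 5, "call_volume": 4, "effort": 2},
    "med_refill_algorithm": {"patient_safety": 2, "call_volume": 5, "effort": 4},
}

def priority(scores):
    """Weighted sum across criteria."""
    return sum(CRITERIA_WEIGHTS[c] * s for c, s in scores.items())

for name, scores in sorted(enhancements.items(),
                           key=lambda kv: priority(kv[1]), reverse=True):
    print(f"{name}: {priority(scores):.2f}")
```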
ART-Ada design project, phase 2
NASA Technical Reports Server (NTRS)
Lee, S. Daniel; Allen, Bradley P.
1990-01-01
Interest in deploying expert systems in Ada has increased. An Ada-based expert system tool called ART-Ada is described, which was built to support research into the language and methodological issues of expert systems in Ada. ART-Ada allows applications of an existing expert system tool called ART-IM (Automated Reasoning Tool for Information Management) to be deployed in various Ada environments. ART-IM, a C-based expert system tool, is used to generate Ada source code which is compiled and linked with an Ada-based inference engine to produce an Ada executable image. ART-Ada is being used to implement several expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.
NASA Astrophysics Data System (ADS)
Kimura, Toshiaki; Kasai, Fumio; Kamio, Yoichi; Kanda, Yuichi
This research paper discusses a manufacturing support system which supports not only maintenance services but also consulting services for manufacturing systems consisting of multi-vendor machine tools. To do this, the system enables inter-enterprise collaboration between engineering companies and machine tool vendors. The system is called "After-Sales Support Inter-enterprise collaboration System using information Technologies" (ASSIST). This paper describes the concept behind the planned ASSIST and the development of a prototype of the system, and discusses test operation results of the system.
Active Reading Documents (ARDs): A Tool to Facilitate Meaningful Learning through Reading
ERIC Educational Resources Information Center
Dubas, Justin M.; Toledo, Santiago A.
2015-01-01
Presented here is a practical tool called the Active Reading Document (ARD) that can give students the necessary incentive to engage with the text/readings. By designing the tool to incrementally develop student understanding of the material through reading using Marzano's Taxonomy as a framework, the ARD offers support through scaffolding as they…
Current Capabilities and Planned Enhancements of SUSTAIN - Paper
Efforts have been under way by the U.S. Environmental Protection Agency (EPA) since 2003 to develop a decision-support tool for placement of best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment ...
STS-57 Pilot Duffy uses TDS soldering tool in SPACEHAB-01 aboard OV-105
1993-07-01
STS057-30-021 (21 June-1 July 1993) --- Astronaut Brian Duffy, pilot, handles a soldering tool onboard the Earth-orbiting Space Shuttle Endeavour. The Soldering Experiment (SE) called for a crew member to solder on a printed circuit board containing 45 connection points, then de-solder 35 points on a similar board. The SE was part of a larger project called the Tools and Diagnostic Systems (TDS), sponsored by the Space and Life Sciences Directorate at Johnson Space Center (JSC). TDS represents a group of equipment selected from the tools and diagnostic hardware to be supported by the International Space Station program. TDS was designed to demonstrate the maintenance of experiment hardware on-orbit and to evaluate the adequacy of its design and the crew interface. Duffy and five other NASA astronauts spent almost ten days aboard the Space Shuttle Endeavour in Earth-orbit supporting the SpaceHab mission, retrieving the European Retrievable Carrier (EURECA) and conducting various experiments.
Pathway Tools version 13.0: integrated software for pathway/genome informatics and systems biology
Paley, Suzanne M.; Krummenacker, Markus; Latendresse, Mario; Dale, Joseph M.; Lee, Thomas J.; Kaipa, Pallavi; Gilham, Fred; Spaulding, Aaron; Popescu, Liviu; Altman, Tomer; Paulsen, Ian; Keseler, Ingrid M.; Caspi, Ron
2010-01-01
Pathway Tools is a production-quality software environment for creating a type of model-organism database called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc integrates the evolving understanding of the genes, proteins, metabolic network and regulatory network of an organism. This article provides an overview of Pathway Tools capabilities. The software performs multiple computational inferences including prediction of metabolic pathways, prediction of metabolic pathway hole fillers and prediction of operons. It enables interactive editing of PGDBs by DB curators. It supports web publishing of PGDBs, and provides a large number of query and visualization tools. The software also supports comparative analyses of PGDBs, and provides several systems biology analyses of PGDBs including reachability analysis of metabolic networks, and interactive tracing of metabolites through a metabolic network. More than 800 PGDBs have been created using Pathway Tools by scientists around the world, many of which are curated DBs for important model organisms. Those PGDBs can be exchanged using a peer-to-peer DB sharing system called the PGDB Registry. PMID:19955237
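Of the systems-biology analyses listed above, reachability analysis of a metabolic network is the easiest to make concrete: starting from a set of seed metabolites, repeatedly fire any reaction whose substrates are all available and add its products, until a fixed point. The sketch below is not Pathway Tools code; the toy network and all names are invented.

```python
# Toy reaction list: (substrate set, product set). Invented for illustration.
reactions = [
    ({"glucose", "atp"}, {"g6p", "adp"}),
    ({"g6p"}, {"f6p"}),
    ({"f6p", "atp"}, {"fbp", "adp"}),
]

def reachable(seeds, reactions):
    """Fixed-point computation of metabolites producible from the seeds."""
    available = set(seeds)
    changed = True
    while changed:
        changed = False
        for substrates, products in reactions:
            if substrates <= available and not products <= available:
                available |= products
                changed = True
    return available

print(sorted(reachable({"glucose", "atp"}, reactions)))
```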
OCSEGen: Open Components and Systems Environment Generator
NASA Technical Reports Server (NTRS)
Tkachuk, Oksana
2014-01-01
To analyze a large system, one often needs to break it into smaller components. To analyze a component or unit under analysis, one needs to model its context of execution, called environment, which represents the components with which the unit interacts. Environment generation is a challenging problem, because the environment needs to be general enough to uncover unit errors, yet precise enough to make the analysis tractable. In this paper, we present a tool for automated environment generation for open components and systems. The tool, called OCSEGen, is implemented on top of the Soot framework. We present the tool's current support and discuss its possible future extensions.
RAFCON: A Graphical Tool for Engineering Complex, Robotic Tasks
2016-10-09
Robotic tasks are becoming increasingly complex, and with them the robotic systems. This requires new tools to manage this complexity and to...execution of robotic tasks, called RAFCON. These tasks are described in hierarchical state machines supporting concurrency. A formal notation of this concept
Developing an Indigenous Proficiency Scale
ERIC Educational Resources Information Center
Kahakalau, Ku
2017-01-01
With an increased interest in the revitalization of Indigenous languages and cultural practices worldwide, there is also an increased need to develop tools to support Indigenous language learners and instructors. The purpose of this article is to present such a tool, called ANA 'OLELO, designed specifically to assess Hawaiian language proficiency.…
Assessing the Use of Instant Messaging in Online Learning Environments
ERIC Educational Resources Information Center
Contreras-Castillo, Juan; Perez-Fragoso, Carmen; Favela, Jesus
2006-01-01
There is a body of evidence supporting the claim that informal interactions in educational environments have positive effects on learning. In order to increase the opportunities of informal interaction in online courses, an instant messaging tool, called CENTERS, was developed and integrated into online learning environments. This tool provides…
ERIC Educational Resources Information Center
Sebastian, James; Allensworth, Elaine; Stevens, David
2014-01-01
Background: In this paper we call for studying school leadership and its relationship to instruction and learning through approaches that highlight the role of configurations of multiple organizational supports. A configuration-focused approach to studying leadership and other essential supports provides a valuable addition to existing tools in…
Structural Embeddings: Mechanization with Method
NASA Technical Reports Server (NTRS)
Munoz, Cesar; Rushby, John
1999-01-01
The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.
Flight Awareness Collaboration Tool Development
NASA Technical Reports Server (NTRS)
Mogford, Richard
2016-01-01
This is a PowerPoint presentation covering airline operations center (AOC) research. It reviews a dispatcher decision support tool called the Flight Awareness Collaboration Tool (FACT). FACT gathers information about winter weather onto one screen and includes predictive abilities. FACT should prove to be useful for airline dispatchers and airport personnel when they manage winter storms and their effect on air traffic. This material is very similar to other previously approved presentations.
NASA Technical Reports Server (NTRS)
Thomas, Stan J.
1993-01-01
KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. In order to bring KATE to the level of performance, functionality, and integratability needed for firing room applications, efforts are underway to implement KATE in the C++ programming language using an X-windows interface. Two programs that were designed and added to the collection of tools comprising the KATE toolbox are described. The first tool, called the schematic viewer, gives the KATE user the capability to view digitized schematic drawings in the KATE environment. The second tool, called the model editor, gives the KATE model builder a tool for creating and editing knowledge base files. Design and implementation issues having to do with these two tools are discussed; this discussion will be useful to anyone maintaining or extending either the schematic viewer or the model editor.
3D Designing for Mathematical Learning
ERIC Educational Resources Information Center
Greenstein, Steven; Leszczynski, Eliza; Fernández, Eileen
2017-01-01
Inspired by the promise of new 3D technologies and the proposition that new tools make innovation possible, this article provides a case study of how a tool called Thirty6 was designed and used in classrooms by mathematics teachers in their own varied and invented ways. Unlike established manipulatives that are designed to support the learning of…
A front-end automation tool supporting design, verification and reuse of SOC.
Yan, Xiao-lang; Yu, Long-li; Wang, Jie-bing
2004-09-01
This paper describes an in-house developed language tool called VPerl used in developing a 250 MHz 32-bit high-performance low power embedded CPU core. The authors showed that use of this tool can compress the Verilog code by more than a factor of 5, increase the efficiency of the front-end design, and reduce the bug rate significantly. This tool can be used to enhance the reusability of an intellectual property model, and facilitate porting designs to different platforms.
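VPerl itself is a Perl-based preprocessor whose syntax is not shown in the abstract, but the general mechanism behind such front-end tools, expanding a compact template into repetitive Verilog, can be sketched in a few lines of Python. All names here are illustrative, not VPerl constructs.

```python
def expand_register_bank(name, width, count):
    """Generate `count` repetitive Verilog always-blocks from one template."""
    lines = []
    for i in range(count):
        lines.append(
            f"always @(posedge clk) if (we[{i}]) "
            f"{name}{i} <= din[{width - 1}:0];"
        )
    return "\n".join(lines)

# One short call stands in for four hand-written blocks, which is the
# source of the code-compression factors such tools report.
print(expand_register_bank("reg_bank", 32, 4))
```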
Karp, Peter D; Paley, Suzanne; Romero, Pedro
2002-01-01
Bioinformatics requires reusable software tools for creating model-organism databases (MODs). The Pathway Tools is a reusable, production-quality software environment for creating a type of MOD called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc (see http://ecocyc.org) integrates our evolving understanding of the genes, proteins, metabolic network, and genetic network of an organism. This paper provides an overview of the four main components of the Pathway Tools: The PathoLogic component supports creation of new PGDBs from the annotated genome of an organism. The Pathway/Genome Navigator provides query, visualization, and Web-publishing services for PGDBs. The Pathway/Genome Editors support interactive updating of PGDBs. The Pathway Tools ontology defines the schema of PGDBs. The Pathway Tools makes use of the Ocelot object database system for data management services for PGDBs. The Pathway Tools has been used to build PGDBs for 13 organisms within SRI and by external users.
Mullinix, C.; Hearn, P.; Zhang, H.; Aguinaldo, J.
2009-01-01
Federal, State, and local water quality managers charged with restoring the Chesapeake Bay ecosystem require tools to maximize the impact of their limited resources. To address this need, the U.S. Geological Survey (USGS) and the Environmental Protection Agency's Chesapeake Bay Program (CBP) are developing a suite of Web-based tools called the Chesapeake Online Assessment Support Toolkit (COAST). The goal of COAST is to help CBP partners identify geographic areas where restoration activities would have the greatest effect, select the appropriate management strategies, and improve coordination and prioritization among partners. As part of the COAST suite of tools focused on environmental restoration, a water quality management visualization component called the Nutrient Yields Mapper (NYM) tool is being developed by USGS. The NYM tool is a web application that uses watershed yield estimates from the USGS SPAtially Referenced Regressions On Watershed attributes (SPARROW) model (Schwarz et al., 2006) [6] to allow water quality managers to identify important sources of nitrogen and phosphorous within the Chesapeake Bay watershed. The NYM tool utilizes new open source technologies that have become popular in geospatial web development, including components such as OpenLayers and GeoServer. This paper presents examples of water quality data analysis based on nutrient type, source, yield, and area of interest using the NYM tool for the Chesapeake Bay watershed. In addition, we describe examples of map-based techniques for identifying high and low nutrient yield areas; web map engines; and data visualization and data management techniques.
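As a rough illustration of the high/low-yield screening described above, the sketch below ranks per-watershed yield estimates and flags the extremes; the data values are invented, and this stands in for the NYM tool's map-based workflow rather than its actual code.

```python
# Invented SPARROW-style yield estimates, kg/km^2/yr, keyed by watershed ID.
yields = {"W01": 1450.0, "W02": 220.0, "W03": 980.0,
          "W04": 3100.0, "W05": 75.0, "W06": 640.0}

def flag_extremes(yields, frac=0.25):
    """Return the lowest- and highest-yield watersheds (bottom/top fraction)."""
    ranked = sorted(yields, key=yields.get)
    k = max(1, int(len(ranked) * frac))
    return {"low": ranked[:k], "high": ranked[-k:]}

print(flag_extremes(yields))  # e.g. {'low': ['W05'], 'high': ['W04']}
```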
ERIC Educational Resources Information Center
Rose, Carolyn; Wang, Yi-Chia; Cui, Yue; Arguello, Jaime; Stegmann, Karsten; Weinberger, Armin; Fischer, Frank
2008-01-01
In this article we describe the emerging area of text classification research focused on the problem of collaborative learning process analysis both from a broad perspective and more specifically in terms of a publicly available tool set called TagHelper tools. Analyzing the variety of pedagogically valuable facets of learners' interactions is a…
Visual management support system
Lee Anderson; Jerry Mosier; Geoffrey Chandler
1979-01-01
The Visual Management Support System (VMSS) is an extension of an existing computer program called VIEWIT, which has been extensively used by the U. S. Forest Service. The capabilities of this program lie in the rapid manipulation of large amounts of data, specifically operating as a tool to overlay or merge one set of data with another. VMSS was conceived to...
ERIC Educational Resources Information Center
Bidarra, José; Rusman, Ellen
2017-01-01
This paper proposes a design framework to support science education through blended learning, based on a participatory and interactive approach supported by ICT-based tools, called "Science Learning Activities Model" (SLAM). The development of this design framework started as a response to complex changes in society and education (e.g.…
Decision support frameworks and tools for conservation
Schwartz, Mark W.; Cook, Carly N.; Pressey, Robert L.; Pullin, Andrew S.; Runge, Michael C.; Salafsky, Nick; Sutherland, William J.; Williamson, Matthew A.
2018-01-01
The practice of conservation occurs within complex socioecological systems fraught with challenges that require transparent, defensible, and often socially engaged project planning and management. Planning and decision support frameworks are designed to help conservation practitioners increase planning rigor, project accountability, stakeholder participation, transparency in decisions, and learning. We describe and contrast five common frameworks within the context of six fundamental questions (why, who, what, where, when, how) at each of three planning stages of adaptive management (project scoping, operational planning, learning). We demonstrate that decision support frameworks provide varied and extensive tools for conservation planning and management. However, using any framework in isolation risks diminishing potential benefits since no one framework covers the full spectrum of potential conservation planning and decision challenges. We describe two case studies that have effectively deployed tools from across conservation frameworks to improve conservation actions and outcomes. Attention to the critical questions for conservation project planning should allow practitioners to operate within any framework and adapt tools to suit their specific management context. We call on conservation researchers and practitioners to regularly use decision support tools as standard practice for framing both practice and research.
Evaluating the Utility of Web-Based Consumer Support Tools Using Rough Sets
NASA Astrophysics Data System (ADS)
Maciag, Timothy; Hepting, Daryl H.; Slezak, Dominik; Hilderman, Robert J.
On the Web, many popular e-commerce sites provide consumers with decision support tools to assist them in their commerce-related decision-making. Many consumers will rank the utility of these tools quite highly. Data obtained from web usage mining analyses, which may provide knowledge about a user's online experiences, could help indicate the utility of these tools. This type of analysis could provide insight into whether the provided tools are adequately assisting consumers in conducting their online shopping activities or if new or additional enhancements need consideration. Although some research in this regard has been described in previous literature, there is still much that can be done. The authors of this paper hypothesize that a measurement of consumer decision accuracy, i.e. a measurement of how well consumers' decisions match their preferences, could help indicate the utility of these tools. This paper describes a procedure developed towards this goal using elements of rough set theory. The authors evaluated the procedure using two support tools, one based on a tool developed by the US-EPA and the other, called cogito, developed by one of the authors. Results from the evaluation did provide interesting insights on the utility of both support tools. Although it was shown that the cogito tool obtained slightly higher decision accuracy, both tools could be improved through additional enhancements. Details of the procedure developed and results obtained from the evaluation are provided. Opportunities for future work are also discussed.
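A minimal sketch of the rough-set intuition behind such a decision-accuracy measure: when the attributes a support tool exposes leave two products indiscernible yet differently judged, those products fall into the boundary region, and accuracy is the ratio of the lower to the upper approximation. The toy data and names below are invented, not taken from the paper.

```python
from collections import defaultdict

# (attribute values visible through the tool, consumer judged acceptable?)
table = [
    (("low_price", "eco_label"), True),
    (("low_price", "eco_label"), False),  # indiscernible pair -> boundary
    (("high_price", "eco_label"), True),
    (("low_price", "no_label"), False),
]

def rough_accuracy(table):
    """|lower approximation| / |upper approximation| of the 'acceptable' set."""
    classes = defaultdict(list)
    for attrs, acceptable in table:
        classes[attrs].append(acceptable)
    lower = sum(len(v) for v in classes.values() if all(v))  # certainly ok
    upper = sum(len(v) for v in classes.values() if any(v))  # possibly ok
    return lower / upper

print(rough_accuracy(table))  # 1/3: only one object is certainly acceptable
```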
Makkar, Steve R.; Haynes, Abby; Williamson, Anna; Redman, Sally
2018-01-01
There are calls for policymakers to make greater use of research when formulating policies. Therefore, it is important that policy organisations have a range of tools and systems to support their staff in using research in their work. The aim of the present study was to measure the extent to which a range of tools and systems to support research use were available within six Australian agencies with a role in health policy, and examine whether this was related to the extent of engagement with, and use of research in policymaking by their staff. The presence of relevant systems and tools was assessed via a structured interview called ORACLe which is conducted with a senior executive from the agency. To measure research use, four policymakers from each agency undertook a structured interview called SAGE, which assesses and scores the extent to which policymakers engaged with (i.e., searched for, appraised, and generated) research, and used research in the development of a specific policy document. The results showed that all agencies had at least a moderate range of tools and systems in place, in particular policy development processes; resources to access and use research (such as journals, databases, libraries, and access to research experts); processes to generate new research; and mechanisms to establish relationships with researchers. Agencies were less likely, however, to provide research training for staff and leaders, or to have evidence-based processes for evaluating existing policies. For the majority of agencies, the availability of tools and systems was related to the extent to which policymakers engaged with, and used research when developing policy documents. However, some agencies did not display this relationship, suggesting that other factors, namely the organisation’s culture towards research use, must also be considered. PMID:29513669
SUSTAIN - A BMP Process and Placement Tool for Urban Watersheds (Poster)
To assist stormwater management professionals in planning for best management practices (BMPs) and low-impact developments (LIDs) implementation, USEPA is developing a decision support system, called the System for Urban Stormwater Treatment and Analysis INtegration (SUSTAIN). ...
Compositional Verification with Abstraction, Learning, and SAT Solving
2015-05-01
arithmetic, and bit-vectors (currently, via bit-blasting). The front-end is based on an existing tool called UFO [8] which converts C programs to the Horn...supports propositional logic, linear arithmetic, and bit-vectors (via bit-blasting). The front-end is based on the tool UFO [8]. It encodes safety of...tool UFO [8]. The encoding in Horn-SMT only uses the theory of Linear Rational Arithmetic. All experiments were carried out on an Intel® Core™2 Quad
Information visualisation based on graph models
NASA Astrophysics Data System (ADS)
Kasyanov, V. N.; Kasyanova, E. V.
2013-05-01
Information visualisation is a key component of support tools for many applications in science and engineering. A graph is an abstract structure that is widely used to model information for its visualisation. In this paper, we consider a practical and general graph formalism called hierarchical graphs and present the Higres and Visual Graph systems aimed at supporting information visualisation on the basis of hierarchical graph models.
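A hierarchical graph in this sense is an ordinary graph whose nodes may own nested subgraphs, so large graphs can be displayed and navigated level by level. Below is a minimal sketch of such a structure with invented class names; it illustrates the formalism, not the Higres or Visual Graph implementation.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class HNode:
    label: str
    subgraph: Optional["HGraph"] = None  # nested graph, if any

@dataclass
class HGraph:
    nodes: Dict[str, HNode] = field(default_factory=dict)
    edges: List[Tuple[str, str]] = field(default_factory=list)

    def depth(self) -> int:
        """Number of hierarchy levels below and including this graph."""
        inner = [n.subgraph.depth() for n in self.nodes.values() if n.subgraph]
        return 1 + max(inner, default=0)

inner = HGraph({"a": HNode("a"), "b": HNode("b")}, [("a", "b")])
outer = HGraph({"m": HNode("module", inner)}, [])
print(outer.depth())  # 2: the outer graph plus one nested level
```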
Probability and Statistics in Sensor Performance Modeling
2010-12-01
language software program is called Environmental Awareness for Sensor and Emitter Employment. Some important numerical issues in the implementation... Statistical analysis for measuring sensor performance... [the rest of this snippet is residue from the report's list of abbreviations: ccdf, complementary cumulative distribution function; cdf, cumulative distribution function; DST, decision-support tool; EASEE, Environmental Awareness for Sensor and Emitter Employment]
Jayapandian, Catherine P; Chen, Chien-Hung; Bozorgi, Alireza; Lhatoo, Samden D; Zhang, Guo-Qiang; Sahoo, Satya S
2013-01-01
Epilepsy is the most common serious neurological disorder, affecting 50-60 million persons worldwide. Electrophysiological data recordings, such as the electroencephalogram (EEG), are the gold standard for diagnosis and pre-surgical evaluation in epilepsy patients. The increasing trend towards multi-center clinical studies requires signal visualization and analysis tools to support real-time interaction with signal data in a collaborative environment, which cannot be supported by traditional desktop-based standalone applications. As part of the Prevention and Risk Identification of SUDEP Mortality (PRISM) project, we have developed a Web-based electrophysiology data visualization and analysis platform called Cloudwave using highly scalable open source cloud computing infrastructure. Cloudwave is integrated with the PRISM patient cohort identification tool called MEDCIS (Multi-modality Epilepsy Data Capture and Integration System). The Epilepsy and Seizure Ontology (EpSO) underpins both Cloudwave and MEDCIS to support query composition and result retrieval. Cloudwave is being used by clinicians and research staff at the University Hospital - Case Medical Center (UH-CMC) Epilepsy Monitoring Unit (EMU) and will be progressively deployed at four EMUs in the United States and the United Kingdom as part of the PRISM project.
On the design of computer-based models for integrated environmental science.
McIntosh, Brian S; Jeffrey, Paul; Lemon, Mark; Winder, Nick
2005-06-01
The current research agenda in environmental science is dominated by calls to integrate science and policy to better understand and manage links between social (human) and natural (nonhuman) processes. Freshwater resource management is one area where such calls can be heard. Designing computer-based models for integrated environmental science poses special challenges to the research community. At present it is not clear whether such tools, or their outputs, receive much practical policy or planning application. It is argued that this is a result of (1) a lack of appreciation within the research modeling community of the characteristics of different decision-making processes, including policy, planning, and participation; (2) a lack of appreciation of the characteristics of different decision-making contexts; (3) the technical difficulties in implementing the necessary support tool functionality; and (4) the socio-technical demands of designing tools to be of practical use. This article presents a critical synthesis of ideas from each of these areas and interprets them in terms of design requirements for computer-based models being developed to provide scientific information support for policy and planning. Illustrative examples are given from the field of freshwater resources management. Although computer-based diagramming and modeling tools can facilitate processes of dialogue, they lack adequate simulation capabilities. Component-based models and modeling frameworks provide such functionality and may be suited to supporting problematic or messy decision contexts. However, significant technical (implementation) and socio-technical (use) challenges need to be addressed before such ambition can be realized.
Using EPA Decision Support Tools in Your Community
ORD and R9 collaborators will be presenting at the “This Way to Sustainability Conference XII” hosted by California State University-Chico, which formed an EPIC called the “Initiative for Resilient Cities” stemming directly from EPA’s efforts. This p...
Using telephony data to facilitate discovery of clinical workflows.
Rucker, Donald W
2017-04-19
Discovery of clinical workflows to target for redesign using methods such as Lean and Six Sigma is difficult. VoIP telephone call pattern analysis may complement direct observation and EMR-based tools in understanding clinical workflows at the enterprise level by allowing visualization of institutional telecommunications activity. The objective was to build an analytic framework mapping repetitive and high-volume telephone call patterns in a large medical center to their associated clinical units, using an enterprise unified communications server log file, and to support visualization of specific call patterns using graphical networks. Consecutive call detail records from the medical center's unified communications server were parsed to cross-correlate telephone call patterns and map associated phone numbers to a cost center dictionary. Hashed data structures were built to allow construction of edge and node files representing high-volume call patterns for display with an open source graph network tool. Summary statistics for an analysis of exactly one week's call detail records at a large academic medical center showed that 912,386 calls were placed with a total duration of 23,186 hours. Approximately half of all calling-called number pairs had an average call duration under 60 seconds, and of these the average call duration was 27 seconds. Cross-correlation of phone calls identified by clinical cost center can be used to generate graphical displays of clinical enterprise communications. Many calls are short. The compact data transfers within short calls may serve as automation or re-design targets. The large absolute amount of time medical center employees were engaged in VoIP telecommunications suggests that analysis of telephone call patterns may offer additional insights into core clinical workflows.
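A hedged sketch of the framework's core steps in Python: aggregate call detail records into cost-center-to-cost-center edge counts and write an edge file for an open-source graph tool such as Gephi. The record layout, the cost-center dictionary, and the file name are assumptions for illustration, not the paper's actual schema.

```python
import csv
from collections import Counter

# Hypothetical mapping of extensions to cost centers; the paper derives
# this dictionary from institutional records.
cost_center = {"5551001": "ER", "5551002": "ICU", "5551003": "Pharmacy"}

def build_edges(cdr_rows):
    """Aggregate (caller, callee, seconds) records into unit-to-unit counts."""
    edges = Counter()
    for caller, callee, seconds in cdr_rows:  # seconds could weight edges
        src = cost_center.get(caller, "Other")
        dst = cost_center.get(callee, "Other")
        if src != dst:
            edges[(src, dst)] += 1
    return edges

rows = [("5551001", "5551003", 27), ("5551002", "5551003", 41),
        ("5551001", "5551003", 18)]
with open("edges.csv", "w", newline="") as f:  # edge file for a graph tool
    writer = csv.writer(f)
    writer.writerow(["Source", "Target", "Weight"])
    for (src, dst), n in build_edges(rows).items():
        writer.writerow([src, dst, n])
```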
DataViewer3D: An Open-Source, Cross-Platform Multi-Modal Neuroimaging Data Visualization Tool
Gouws, André; Woods, Will; Millman, Rebecca; Morland, Antony; Green, Gary
2008-01-01
Integration and display of results from multiple neuroimaging modalities [e.g. magnetic resonance imaging (MRI), magnetoencephalography, EEG] relies on display of a diverse range of data within a common, defined coordinate frame. DataViewer3D (DV3D) is a multi-modal imaging data visualization tool offering a cross-platform, open-source solution to simultaneous data overlay visualization requirements of imaging studies. While DV3D is primarily a visualization tool, the package allows an analysis approach where results from one imaging modality can guide comparative analysis of another modality in a single coordinate space. DV3D is built on Python, a dynamic object-oriented programming language with support for integration of modular toolkits, and development of cross-platform software for neuroimaging. DV3D harnesses the power of the Visualization Toolkit (VTK) for two-dimensional (2D) and 3D rendering, calling VTK's low level C++ functions from Python. Users interact with data via an intuitive interface that uses Python to bind wxWidgets, which in turn calls the user's operating system dialogs and graphical user interface tools. DV3D currently supports NIfTI-1, ANALYZE™ and DICOM formats for MRI data display (including statistical data overlay). Formats for other data types are supported. The modularity of DV3D and ease of use of Python allows rapid integration of additional format support and user development. DV3D has been tested on Mac OSX, RedHat Linux and Microsoft Windows XP. DV3D is offered for free download with an extensive set of tutorial resources and example data. PMID:19352444
Science 2.0: Communicating Science Creatively
ERIC Educational Resources Information Center
Smith, Ben; Mader, Jared
2017-01-01
This column shares web tools that support learning. The authors have been covering the International Society for Technology in Education (ISTE) standards in every issue since September 2016. This article examines the final standard, called Creative Communicator, which requires students to communicate effectively and creatively express themselves…
2014-10-01
designed an Internet-based and mobile application (software) to assist with the following domains pertinent to diabetes self-management: 1...management that provides education, reminders, and support. The new tool is an Internet-based and mobile application (software), now called Tracking...is mobile, provides decision support with actionable options, and is based on user input, will enhance diabetes self-care, improve glycemic control
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
Computer Aided Method for System Safety and Reliability Assessments
2008-09-01
program between 1998 and 2003. This tool was not marketed in the public domain after the CRV program ended. The other tool is called eXpress, and it...support Government-reviewed and approved analysis methodologies which can then be shared with other government agencies and industry partners... [the rest of this snippet is residue from the report's revision-history table for the GO tool]
Formal verification and testing: An integrated approach to validating Ada programs
NASA Technical Reports Server (NTRS)
Cohen, Norman H.
1986-01-01
An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.
Knowledge and information generated using new tools/methods collectively called "Omics" technologies could have a profound effect on qualitative and quantitative characterizations of human health risk assessments.
The suffix "Omics" is a descriptor used for a series of e...
A survey on annotation tools for the biomedical literature.
Neves, Mariana; Leser, Ulf
2014-03-01
New approaches to biomedical text mining crucially depend on the existence of comprehensive annotated corpora. Such corpora, commonly called gold standards, are important for learning patterns or models during the training phase, for evaluating and comparing the performance of algorithms and also for better understanding the information sought for by means of examples. Gold standards depend on human understanding and manual annotation of natural language text. This process is very time-consuming and expensive because it requires high intellectual effort from domain experts. Accordingly, the lack of gold standards is considered one of the main bottlenecks for developing novel text mining methods. This situation led to the development of tools that support humans in annotating texts. Such tools should be intuitive to use, should support a range of different input formats, should include visualization of annotated texts and should generate an easy-to-parse output format. Today, a range of tools which implement some of these functionalities are available. In this article, we present a comprehensive survey of tools for supporting annotation of biomedical texts. Altogether, we considered almost 30 tools, 13 of which were selected for an in-depth comparison. The comparison was performed using predefined criteria and was accompanied by hands-on experiences whenever possible. Our survey shows that current tools can support many of the tasks in biomedical text annotation in a satisfying manner, but also that no tool can be considered as a true comprehensive solution.
SMART: A Propositional Logic-Based Trade Analysis and Risk Assessment Tool for a Complex Mission
NASA Technical Reports Server (NTRS)
Ono, Masahiro; Nicholas, Austin; Alibay, Farah; Parrish, Joseph
2015-01-01
This paper introduces a new trade analysis software called the Space Mission Architecture and Risk Analysis Tool (SMART). This tool supports a high-level system trade study on a complex mission, such as a potential Mars Sample Return (MSR) mission, in an intuitive and quantitative manner. In a complex mission, a common approach to increase the probability of success is to have redundancy and prepare backups. Quantitatively evaluating the utility of adding redundancy to a system is important but not straightforward, particularly when the failures of parallel subsystems are correlated.
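A worked toy example of the correlation point made above: two redundant subsystems that each fail with probability p only deliver the naive p-squared loss rate when their failures are independent. The sketch uses the standard covariance decomposition for identical Bernoulli failures; the numbers are invented, and this is not SMART's actual model.

```python
def p_both_fail(p, rho):
    """P(both redundant units fail) for identical Bernoulli failures.

    Cov(A, B) = rho * p * (1 - p), so P(A and B) = p**2 + rho * p * (1 - p).
    """
    return p * p + rho * p * (1 - p)

for rho in (0.0, 0.5, 1.0):
    print(f"rho={rho}: P(mission loss) = {p_both_fail(0.1, rho):.3f}")
# Independent redundancy cuts loss from 0.1 to 0.01, but with fully
# correlated failures (rho=1) the backup adds nothing: loss stays 0.1.
```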
Near-Infrared Neuroimaging with NinPy
Strangman, Gary E.; Zhang, Quan; Zeffiro, Thomas
2009-01-01
There has been substantial recent growth in the use of non-invasive optical brain imaging in studies of human brain function in health and disease. Near-infrared neuroimaging (NIN) is one of the most promising of these techniques and, although NIN hardware continues to evolve at a rapid pace, software tools supporting optical data acquisition, image processing, statistical modeling, and visualization remain less refined. Python, a modular and computationally efficient development language, can support functional neuroimaging studies of diverse design and implementation. In particular, Python's easily readable syntax and modular architecture allow swift prototyping followed by efficient transition to stable production systems. As an introduction to our ongoing efforts to develop Python software tools for structural and functional neuroimaging, we discuss: (i) the role of non-invasive diffuse optical imaging in measuring brain function, (ii) the key computational requirements to support NIN experiments, (iii) our collection of software tools to support NIN, called NinPy, and (iv) future extensions of these tools that will allow integration of optical with other structural and functional neuroimaging data sources. Source code for the software discussed here will be made available at www.nmr.mgh.harvard.edu/Neural_SystemsGroup/software.html. PMID:19543449
NASA Airline Operations Research Center
NASA Technical Reports Server (NTRS)
Mogford, Richard H.
2016-01-01
This is a PowerPoint presentation covering NASA airline operations center (AOC) research. It includes information on using IBM Watson in the AOC. It also reviews a dispatcher decision support tool called the Flight Awareness Collaboration Tool (FACT). FACT gathers information about winter weather onto one screen and includes predictive abilities. It should prove to be useful for airline dispatchers and airport personnel when they manage winter storms and their effect on air traffic. This material is very similar to other previously approved presentations with the same title.
Automated simulation as part of a design workstation
NASA Technical Reports Server (NTRS)
Cantwell, Elizabeth; Shenk, T.; Robinson, P.; Upadhye, R.
1990-01-01
A development project for a design workstation for advanced life-support systems (called the DAWN Project, for Design Assistant Workstation), incorporating qualitative simulation, required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulation such that simulation capabilities are maximized without duplication. The reason is that to produce design solutions to a system goal, the behavior of the system in both steady and perturbed states must be represented. The Qualitative Simulation Tool (QST), an expert-system-like model-building and simulation interface tool called ScratchPad (SP), and the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components are discussed.
CWA 15793 2011 Planning and Implementation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gross, Alan; Nail, George
This software, built on an open source platform called Electron (runs on Chromium and Node.js), is designed to assist organizations in the implementation of a biorisk management system consistent with the requirements of the international, publicly available guidance document CEN Workshop Agreement 15793:2011 (CWA 15793). The software includes tools for conducting organizational gap analysis against CWA 15793 requirements, planning tools to support the implementation of CWA 15793 requirements, and performance monitoring support. The gap analysis questions are based on the text of CWA 15793, and its associated guidance document, CEN Workshop Agreement 16393:2012. The authors have secured permission from the publisher of CWA 15793, the European Committee for Standardization (CEN), to use language from the document in the software, with the understanding that the software will be made available freely, without charge.
Medication Assisted Treatment for the 21st Century: Community Education Kit.
ERIC Educational Resources Information Center
Substance Abuse and Mental Health Services Administration (DHHS/PHS), Rockville, MD. Center for Substance Abuse Treatment.
The need to support the success of individuals in methadone-assisted recovery, and the recent availability of new pharmacologic treatment options for opioid dependence, calls for an information tool that underscores the evidence-based benefits of medication assisted treatment for opioid dependence. The U.S. Department of Health and Human Services'…
ERIC Educational Resources Information Center
Wilkerson-Jerde, Michelle Hoda
2014-01-01
There are increasing calls to prepare K-12 students to use computational tools and principles when exploring scientific or mathematical phenomena. The purpose of this paper is to explore whether and how constructionist computer-supported collaborative environments can explicitly engage students in this practice. The Categorizer is a…
A New Paradigm for Intelligent Tutoring Systems: Example-Tracing Tutors
ERIC Educational Resources Information Center
Aleven, Vincent; McLaren, Bruce M.; Sewall, Jonathan; Koedinger, Kenneth R.
2009-01-01
The Cognitive Tutor Authoring Tools (CTAT) support creation of a novel type of tutors called example-tracing tutors. Unlike other types of ITSs (e.g., model-tracing tutors, constraint-based tutors), example-tracing tutors evaluate student behavior by flexibly comparing it against generalized examples of problem-solving behavior. Example-tracing…
The US EPA is developing an open and publically available software program called the Human Exposure Model (HEM) to provide near-field exposure information for Life Cycle Impact Assessments (LCIAs). Historically, LCIAs have often omitted impacts from near-field sources of exposur...
The Portable Usability Testing Lab: A Flexible Research Tool.
ERIC Educational Resources Information Center
Hale, Michael E.; And Others
A group of faculty at the University of Georgia obtained funding for a research and development facility called the Learning and Performance Support Laboratory (LPSL). One of the LPSL's primary needs was obtaining a portable usability lab for software testing, so the facility obtained the "Luggage Lab 2000." The lab is transportable to…
Chemical annotation of small and peptide-like molecules at the Protein Data Bank
Young, Jasmine Y.; Feng, Zukang; Dimitropoulos, Dimitris; Sala, Raul; Westbrook, John; Zhuravleva, Marina; Shao, Chenghua; Quesada, Martha; Peisach, Ezra; Berman, Helen M.
2013-01-01
Over the past decade, the number of polymers and their complexes with small molecules in the Protein Data Bank archive (PDB) has continued to increase significantly. To support scientific advancements and ensure the best quality and completeness of the data files over the next 10 years and beyond, the Worldwide PDB partnership that manages the PDB archive is developing a new deposition and annotation system. This system focuses on efficient data capture across all supported experimental methods. The new deposition and annotation system is composed of four major modules that together support all of the processing requirements for a PDB entry. In this article, we describe one such module called the Chemical Component Annotation Tool. This tool uses information from both the Chemical Component Dictionary and Biologically Interesting molecule Reference Dictionary to aid in annotation. Benchmark studies have shown that the Chemical Component Annotation Tool provides significant improvements in processing efficiency and data quality. Database URL: http://wwpdb.org PMID:24291661
Jeagle: a JAVA Runtime Verification Tool
NASA Technical Reports Server (NTRS)
DAmorim, Marcelo; Havelund, Klaus
2005-01-01
We introduce the temporal logic Jeagle and its supporting tool for runtime verification of Java programs. A monitor for a Jeagle formula checks if a finite trace of program events satisfies the formula. Jeagle is a programming-oriented extension of the powerful rule-based Eagle logic that has been shown to be capable of defining and implementing a range of finite trace monitoring logics, including future and past time temporal logic, real-time and metric temporal logics, interval logics, forms of quantified temporal logics, and so on. Monitoring is achieved on a state-by-state basis, avoiding any need to store the input trace. Jeagle extends Eagle with constructs for capturing parameterized program events such as method calls and method returns. Parameters can be the objects that methods are called upon, arguments to methods, and return values. Jeagle allows one to refer to these in formulas. The tool performs automated program instrumentation using AspectJ. We show the transformational semantics of Jeagle.
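Jeagle itself is a Java tool instrumented via AspectJ, but the state-by-state, store-nothing monitoring style it implements can be sketched in a few lines of Python for one parameterized property, "every call(x) is eventually followed by return(x)". The event encoding and trace below are invented for illustration.

```python
def monitor(trace):
    """Check a finite trace event-by-event without storing the trace itself."""
    pending = set()  # monitor state: calls not yet matched by a return
    for kind, arg in trace:
        if kind == "call":
            pending.add(arg)
        elif kind == "return":
            pending.discard(arg)
    return not pending  # satisfied iff no call is left open at trace end

print(monitor([("call", "f"), ("call", "g"),
               ("return", "f"), ("return", "g")]))  # True
print(monitor([("call", "f")]))                     # False: f never returned
```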
compomics-utilities: an open-source Java library for computational proteomics.
Barsnes, Harald; Vaudel, Marc; Colaert, Niklaas; Helsens, Kenny; Sickmann, Albert; Berven, Frode S; Martens, Lennart
2011-03-08
The growing interest in the field of proteomics has increased the demand for software tools and applications that process and analyze the resulting data. Even though the purposes of these tools vary significantly, they usually share a basic set of features, including the handling of protein and peptide sequences, the visualization of (and interaction with) spectra and chromatograms, and the parsing of results from various proteomics search engines. Developers typically spend considerable time and effort implementing these support structures, which detracts from working on the novel aspects of their tool. In order to simplify the development of proteomics tools, we have implemented an open-source support library for computational proteomics, called compomics-utilities. The library contains a broad set of features required for reading, parsing, and analyzing proteomics data. compomics-utilities is already used by a long list of existing software, ensuring library stability and continued support and development. As a user-friendly, well-documented, and open-source library, compomics-utilities greatly simplifies the implementation of the basic features needed in most proteomics tools. Implemented in 100% Java, compomics-utilities is fully portable across platforms and architectures. Our library thus allows developers to focus on the novel aspects of their tools rather than on basic functions, which can contribute substantially to faster development and better tools for proteomics.
The cancer experience map: an approach to including the patient voice in supportive care solutions.
Hall, Leslie Kelly; Kunz, Breanne F; Davis, Elizabeth V; Dawson, Rose I; Powers, Ryan S
2015-05-28
The perspective of the patient, also called the "patient voice", is an essential element in materials created for cancer supportive care. Identifying that voice, however, can be a challenge for researchers and developers. A multidisciplinary team at a health information company tasked with addressing this issue created a representational model they call the "cancer experience map". This map, designed as a tool for content developers, offers a window into the complex perspectives inside the cancer experience. Informed by actual patient quotes, the map shows common overall themes for cancer patients, concerns at key treatment points, strategies for patient engagement, and targeted behavioral goals. In this article, the team members share the process by which they created the map as well as its first use as a resource for cancer support videos. The article also addresses the broader policy implications of including the patient voice in supportive cancer content, particularly with regard to mHealth apps.
SBML-PET: a Systems Biology Markup Language-based parameter estimation tool.
Zi, Zhike; Klipp, Edda
2006-11-01
The estimation of model parameters from experimental data remains a bottleneck for a major breakthrough in systems biology. We present a Systems Biology Markup Language (SBML)-based Parameter Estimation Tool (SBML-PET). The tool is designed to enable parameter estimation for biological models, including signaling pathways, gene regulation networks, and metabolic pathways. SBML-PET supports import and export of models in the SBML format. It can estimate parameters by fitting a variety of experimental data from different experimental conditions. SBML-PET has the unique feature of supporting event definitions in the SBML model. SBML models can also be simulated in SBML-PET. The Stochastic Ranking Evolution Strategy (SRES) is incorporated in SBML-PET for parameter estimation jobs, and a classic ODE solver called ODEPACK is used to solve the Ordinary Differential Equation (ODE) system. SBML-PET is available at http://sysbio.molgen.mpg.de/SBML-PET/; the website also contains detailed documentation.
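For readers unfamiliar with the task SBML-PET automates, the toy sketch below shows ODE parameter estimation in Python. It substitutes scipy's least-squares optimizer for the SRES strategy the tool actually uses, and the one-reaction model, data, and names are invented for illustration.

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import least_squares

# Toy version of the underlying task (not SBML-PET itself): fit the rate
# constant k of a reaction A -> B to noisy measurements of A over time.

def model(y, t, k):
    a, b = y
    return [-k * a, k * a]

t_obs = np.linspace(0, 5, 20)
k_true = 0.8
a_obs = odeint(model, [1.0, 0.0], t_obs, args=(k_true,))[:, 0]
a_obs += np.random.default_rng(0).normal(0, 0.02, a_obs.shape)  # measurement noise

def residuals(theta):
    # Simulate the ODE system with candidate k and compare with the data.
    a_sim = odeint(model, [1.0, 0.0], t_obs, args=(theta[0],))[:, 0]
    return a_sim - a_obs

fit = least_squares(residuals, x0=[0.1], bounds=(0, 10))
print("estimated k:", fit.x[0])  # should recover roughly 0.8
```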
The Design of Modular Web-Based Collaboration
NASA Astrophysics Data System (ADS)
Intapong, Ploypailin; Settapat, Sittapong; Kaewkamnerdpong, Boonserm; Achalakul, Tiranee
Online collaborative systems are popular communication channels, as they allow people from various disciplines to interact and collaborate with ease. These systems provide communication tools and services that can be integrated on the web; consequently, they are convenient to use and easy to install. Nevertheless, most currently available systems are designed according to specific requirements and cannot be straightforwardly integrated into other applications. This paper presents the design of a new collaborative platform, which is component-based and re-configurable, called the Modular Web-based Collaboration (MWC). MWC shares the same concepts as computer-supported collaborative work (CSCW) and computer-supported collaborative learning (CSCL), but it provides configurable tools for online collaboration. Each tool module can be integrated into users' web applications freely and easily. This makes the collaborative system flexible, adaptable, and well suited for online collaboration.
ERIC Educational Resources Information Center
Roy, Debopriyo
2014-01-01
Besides focusing on grammar, writing skills, and web-based language learning, researchers in "CALL" and second language acquisition have also argued for the importance of promoting higher-order thinking skills in ESL (English as Second Language) and EFL (English as Foreign Language) classrooms. There is solid evidence supporting the…
Tool Mediation in Focus on Form Activities: Case Studies in a Grammar-Exploring Environment
ERIC Educational Resources Information Center
Karlstrom, Petter; Cerratto-Pargman, Teresa; Lindstrom, Henrik; Knutsson, Ola
2007-01-01
We present two case studies of two different pedagogical tasks in a Computer Assisted Language Learning environment called Grim. The main design principle in Grim is to support "Focus on Form" in second language pedagogy. Grim contains several language technology-based features for exploring linguistic forms (static, rule-based and statistical),…
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
ERIC Educational Resources Information Center
Kennedy, Michael J.; Thomas, Cathy Newman; Meyer, J. Patrick; Alves, Kat D.; Lloyd, John Wills
2014-01-01
Universal Design for Learning (UDL) is a framework that is commonly used for guiding the construction and delivery of instruction intended to support all students. In this study, we used a related model to guide creation of a multimedia-based instructional tool called content acquisition podcasts (CAPs). CAPs delivered vocabulary instruction…
A Call for Qualitative Methods in Action: Enlisting Positionality as an Equity Tool
ERIC Educational Resources Information Center
Relles, Stefani R.
2016-01-01
This article describes how the qualitative research tradition known as "positionality" can be used as a method to support classroom equity. The text describes three ways teachers can use a spoken approach to positionality in their day-to-day practice. Classroom vignettes illuminate how these spoken methods of positionality can address…
An Unexpected Ally: Using Microsoft's SharePoint to Create a Departmental Intranet
ERIC Educational Resources Information Center
Dahl, David
2010-01-01
In September 2008, the Albert S. Cook Library at Towson University implemented an intranet to support the various functions of the library's Reference Department. This intranet is called the RefPortal. After exploring open source options and other Web 2.0 tools, the department (under the guidance of the library technology coordinator) chose…
Nassi-Schneiderman Diagram in HTML Based on AML
ERIC Educational Resources Information Center
Menyhárt, László
2013-01-01
In an earlier work I defined an extension of XML called Algorithm Markup Language (AML) for easy and understandable coding in an IDE which supports XML editing (e.g. NetBeans). The AML extension contains annotations and native language (English or Hungarian) tag names used when coding our algorithm. This paper presents a drawing tool with which…
Utility in a Fallible Tool: A Multi-Site Case Study of Automated Writing Evaluation
ERIC Educational Resources Information Center
Grimes, Douglas; Warschauer, Mark
2010-01-01
Automated writing evaluation (AWE) software uses artificial intelligence (AI) to score student essays and support revision. We studied how an AWE program called MY Access!® was used in eight middle schools in Southern California over a three-year period. Although many teachers and students considered automated scoring unreliable, and teachers'…
Digital teaching tools and global learning communities.
Williams, Mary; Lockhart, Patti; Martin, Cathie
2015-01-01
In 2009, we started a project to support the teaching and learning of university-level plant sciences, called Teaching Tools in Plant Biology. Articles in this series are published by the plant science journal, The Plant Cell (published by the American Society of Plant Biologists). Five years on, we investigated how the published materials are being used through an analysis of the Google Analytics pageviews distribution and through a user survey. Our results suggest that this project has had a broad, global impact in supporting higher education, and also that the materials are used differently by individuals in terms of their role (instructor, independent learner, student) and geographical location. We also report on our ongoing efforts to develop a global learning community that encourages discussion and resource sharing.
The Future of the Internet in Science
NASA Technical Reports Server (NTRS)
Guice, Jon; Duffy, Robert
2000-01-01
How are scientists going to make use of the Internet several years from now? This is a case study of a leading-edge experiment in building a 'virtual institute': using electronic communication tools to foster collaboration among geographically dispersed scientists. Our experience suggests that scientists will want to use web-based document management systems; that there will be a demand for Internet-enabled meeting support tools; that while Internet videoconferencing will have limited value for scientists, webcams will be in great demand as a tool for transmitting pictures of objects and settings rather than "talking heads"; and that a significant share of scientists who do fieldwork will embrace mobile voice, data, and video communication tools. The setting for these findings is a research consortium called the NASA Astrobiology Institute.
MEA-Tools: an open source toolbox for the analysis of multi-electrode data with MATLAB.
Egert, U; Knott, Th; Schwarz, C; Nawrot, M; Brandt, A; Rotter, S; Diesmann, M
2002-05-30
Recent advances in electrophysiological techniques have created new tools for the acquisition and storage of neuronal activity recorded simultaneously with numerous electrodes. These techniques support the analysis of the function as well as the structure of individual electrogenic cells in the context of the surrounding neuronal or cardiac network. Commercially available tools for the analysis of such data, however, cannot be easily adapted to newly emerging requirements for data analysis and visualization, and cross-compatibility between them is limited. In this report we introduce a free open-source toolbox called microelectrode array tools (MEA-Tools) for the analysis of multi-electrode data, based on the common data analysis environment MATLAB (version 5.3-6.1, The MathWorks, Natick, MA). The toolbox itself is platform independent. The file interface currently supports files recorded with MCRack (Multi Channel Systems, Reutlingen, Germany) under Microsoft Windows 95, 98, NT, and 2000, but can be adapted to other data acquisition systems. Functions are controlled via command-line input and graphical user interfaces, and support common requirements for the analysis of local field potentials, extracellular spike activity, and continuous recordings, in addition to supplementary data acquired by additional instruments, e.g. intracellular amplifiers. Data may be processed as continuous recordings or as time windows triggered to some event.
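As a flavor of the kind of analysis such a toolbox supports (MEA-Tools itself is MATLAB code, and this is not its API), a minimal Python sketch of threshold-based extracellular spike detection on a single channel might look like the following; the sampling rate, threshold rule, and injected test spikes are all invented for illustration.

```python
import numpy as np

# Generic illustration of one common multi-electrode analysis step:
# detect spike times on one channel by negative threshold crossing,
# for later use in event-triggered windowing or averaging.

def detect_spikes(trace, fs, n_sd=5.0, dead_time=0.002):
    """Return spike times (s) where the signal crosses -n_sd * SD downward."""
    threshold = -n_sd * np.std(trace)
    crossings = np.flatnonzero((trace[1:] < threshold) & (trace[:-1] >= threshold))
    times, last = [], -np.inf
    for t_spike in crossings / fs:
        if t_spike - last >= dead_time:  # enforce a refractory dead time
            times.append(t_spike)
            last = t_spike
    return np.array(times)

fs = 25_000.0  # Hz, a typical MEA sampling rate
trace = np.random.default_rng(2).normal(0, 1, int(0.5 * fs))
trace[5_000] = trace[9_000] = -12.0  # two injected artificial spikes
print(detect_spikes(trace, fs))      # ~0.20 s and ~0.36 s
```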
A Two-Layer Least Squares Support Vector Machine Approach to Credit Risk Assessment
NASA Astrophysics Data System (ADS)
Liu, Jingli; Li, Jianping; Xu, Weixuan; Shi, Yong
Least squares support vector machine (LS-SVM) is a revised version of the support vector machine (SVM) and has been proved to be a useful tool for pattern recognition, with excellent generalization performance and low computational cost. In this paper, we propose a new method called the two-layer least squares support vector machine, which combines kernel principal component analysis (KPCA) with the linear programming form of the least squares support vector machine. With this method, sparseness and robustness are obtained when solving large-dimensional, large-scale database problems. A U.S. commercial credit card database is used to test the efficiency of our method, and the results proved satisfactory.
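A rough Python sketch of the two-layer idea follows, under stated substitutions: scikit-learn's KernelPCA plays the first layer, and a ridge (least-squares) classifier stands in for the paper's linear-programming LS-SVM; the data are synthetic, not the credit card database.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import KernelPCA
from sklearn.linear_model import RidgeClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Two-layer sketch: layer 1 maps inputs into a low-dimensional nonlinear
# feature space via kernel PCA; layer 2 fits a least-squares linear
# classifier on those features.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = make_pipeline(
    KernelPCA(n_components=10, kernel="rbf", gamma=0.05),  # layer 1
    RidgeClassifier(alpha=1.0),                            # layer 2
)
clf.fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```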
Applying AI tools to operational space environmental analysis
NASA Technical Reports Server (NTRS)
Krajnak, Mike; Jesse, Lisa; Mucks, John
1995-01-01
The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges in meeting the needs of their growing user community. These centers provide current space environmental information and short-term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general-purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment, and predictive analysis. TAS includes expert system tools that analyze incoming events for indications of particular situations and predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTMs) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTMs using a graphical specification language. The ability to manipulate TTMs in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain-specific event analysis abstractions. The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, and geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions, we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.
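To illustrate the flavor of temporal pattern matching over an event database (in spirit only; TTMs are a richer graphical formalism than this), a pattern might be reduced to an ordered list of event types with a maximum allowed gap between steps, as in this hypothetical Python sketch:

```python
from dataclasses import dataclass

# Illustrative sketch: a "pattern" here is just an ordered list of event
# types plus a maximum gap (hours) between consecutive matched steps.

@dataclass
class Event:
    kind: str
    time: float  # hours since start of observation

def matches(pattern, max_gap, events):
    """Return True if the event stream contains the pattern in order."""
    idx, last_time = 0, None
    for ev in sorted(events, key=lambda e: e.time):
        in_gap = last_time is None or ev.time - last_time <= max_gap
        if ev.kind == pattern[idx] and in_gap:
            idx, last_time = idx + 1, ev.time
            if idx == len(pattern):
                return True
    return False

events = [
    Event("x_ray_flare", 0.0),
    Event("radio_burst", 1.5),
    Event("geomagnetic_storm", 30.0),
]
# Does a flare lead to a storm within 48 hours?
print(matches(["x_ray_flare", "geomagnetic_storm"], max_gap=48.0, events=events))
```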
Sidney, Kristi; Antony, Jimmy; Rodrigues, Rashmi; Arumugam, Karthika; Krishnamurthy, Shubha; D'souza, George; De Costa, Ayesha; Shet, Anita
2012-01-01
There has been exponential growth in the use of mobile phones in India over the last few years, and their potential benefits as a healthcare tool have raised tremendous interest. We used mobile phone reminders to help support adherence to antiretroviral therapy (ART) among HIV patients at an infectious disease clinic in a tertiary hospital in Bangalore. Between March and June 2010, 139 adult HIV patients taking regular ART for at least a month received weekly reminders to support adherence. These reminders consisted of a weekly interactive call and a non-interactive, neutral pictorial short message service (SMS) message. After four weeks of the intervention, participants were interviewed to study perceptions of preference, usefulness, potential stigma, and privacy concerns associated with this intervention. The majority of the participants were urban (89%) and had at least a secondary education (85%). A total of 744 calls were made, 545 (76%) of which were received by the participants. In addition, all participants received the weekly pictorial SMS reminder. A month later, 90% of participants reported the intervention as being helpful as a medication reminder and did not feel their privacy had been intruded upon. Most participants (87%) reported that they preferred calls as reminders; just 11% favoured SMS reminders alone. Only 59% of participants viewed all the SMSs that were delivered, while 15% never viewed any at all. Participants also denied any discomfort or stigma, despite 20% and 13%, respectively, reporting that another person had inadvertently received their reminder call or SMS. Mobile phone interventions are an acceptable way of supporting adherence in this setting. Voice calls rather than SMSs alone seem to be preferred as reminders. Further research to study the influence of this intervention on adherence and health maintenance is warranted.
A decision support tool for synchronizing technology advances with strategic mission objectives
NASA Technical Reports Server (NTRS)
Hornstein, Rhoda S.; Willoughby, John K.
1992-01-01
Successful accomplishment of the objectives of many long-range future missions in areas such as space systems, land-use planning, and natural resource management requires significant technology developments. This paper describes the development of a data-driven decision-support tool called MisTec for helping strategic planners to determine technology development alternatives and to synchronize technology development schedules with the performance schedules of future long-term missions. Special attention is given to the operations concept, design, and functional capabilities of MisTec. MisTec was initially designed for manned Mars missions, but it can be adapted to support other high-technology, long-range strategic planning situations, making it possible for a mission analyst, planner, or manager to describe a mission scenario, determine the technology alternatives for making the mission achievable, and plan the R&D activity necessary to achieve the required technology advances.
Counter Action Procedure Generation in an Emergency Situation of Nuclear Power Plants
NASA Astrophysics Data System (ADS)
Gofuku, A.
2018-02-01
Lessons learned from the Fukushima Daiichi accident revealed various weak points in the design and operation of nuclear power plants at the time, although the plant staff carried out many resilient activities under a difficult work environment. To reinforce the measures that make nuclear power plants more resilient, improvements to hardware and to the education and training of nuclear personnel are being considered. In addition, given the advancement of computer technology and artificial intelligence, developing software tools to support the activities of plant staff is a promising approach. This paper focuses on software tools that support operations by human operators and introduces the concept of an intelligent operator support system called the co-operator. It also describes a counter-operation generation technique that the authors are studying as a core component of the co-operator.
Student Evaluation of CALL Tools during the Design Process
ERIC Educational Resources Information Center
Nesbitt, Dallas
2013-01-01
This article discusses the comparative effectiveness of student input at different times during the design of CALL tools for learning kanji, the Japanese characters of Chinese origin. The CALL software "package" consisted of tools to facilitate the writing, reading and practising of kanji characters in context. A pre-design questionnaire…
van Oosterom, L; Montgomery, J C; Jeffs, A G; Radford, C A
2016-01-11
Soundscapes provide a new tool for the study of fish communities. Bigeyes (Pempheris adspersa) are nocturnal planktivorous reef fish that feed in loose shoals and are soniferous. Their vocalisations have been suggested to be contact calls that maintain group cohesion; however, direct evidence for this is absent, despite the fact that contact calls are well documented for many other vertebrates, including marine mammals. For fish, direct evidence for group cohesion signals is restricted to the use of visual and hydrodynamic cues. In support of adding vocalisation as a contributing cue, our laboratory experiments show that bigeyes significantly increased group cohesion when exposed to recordings of ambient reef sound at higher sound levels while also decreasing vocalisations. These patterns of behaviour are consistent with acoustic masking. When exposed to playback of conspecific vocalisations, the group cohesion and vocalisation rates of bigeyes both significantly increased. These results provide the first direct experimental support for the hypothesis that vocalisations are used as contact calls to maintain group cohesion in fishes, making fish the evolutionarily oldest vertebrate group in which this phenomenon has been observed, and adding a new dimension to the interpretation of nocturnal reef soundscapes.
NASA Astrophysics Data System (ADS)
van Oosterom, L.; Montgomery, J. C.; Jeffs, A. G.; Radford, C. A.
2016-01-01
Soundscapes provide a new tool for the study of fish communities. Bigeyes (Pempheris adspersa) are nocturnal planktivorous reef fish that feed in loose shoals and are soniferous. Their vocalisations have been suggested to be contact calls that maintain group cohesion; however, direct evidence for this is absent, despite the fact that contact calls are well documented for many other vertebrates, including marine mammals. For fish, direct evidence for group cohesion signals is restricted to the use of visual and hydrodynamic cues. In support of adding vocalisation as a contributing cue, our laboratory experiments show that bigeyes significantly increased group cohesion when exposed to recordings of ambient reef sound at higher sound levels while also decreasing vocalisations. These patterns of behaviour are consistent with acoustic masking. When exposed to playback of conspecific vocalisations, the group cohesion and vocalisation rates of bigeyes both significantly increased. These results provide the first direct experimental support for the hypothesis that vocalisations are used as contact calls to maintain group cohesion in fishes, making fish the evolutionarily oldest vertebrate group in which this phenomenon has been observed, and adding a new dimension to the interpretation of nocturnal reef soundscapes.
van Oosterom, L.; Montgomery, J. C.; Jeffs, A. G.; Radford, C. A.
2016-01-01
Soundscapes provide a new tool for the study of fish communities. Bigeyes (Pempheris adspersa) are nocturnal planktivorous reef fish that feed in loose shoals and are soniferous. Their vocalisations have been suggested to be contact calls that maintain group cohesion; however, direct evidence for this is absent, despite the fact that contact calls are well documented for many other vertebrates, including marine mammals. For fish, direct evidence for group cohesion signals is restricted to the use of visual and hydrodynamic cues. In support of adding vocalisation as a contributing cue, our laboratory experiments show that bigeyes significantly increased group cohesion when exposed to recordings of ambient reef sound at higher sound levels while also decreasing vocalisations. These patterns of behaviour are consistent with acoustic masking. When exposed to playback of conspecific vocalisations, the group cohesion and vocalisation rates of bigeyes both significantly increased. These results provide the first direct experimental support for the hypothesis that vocalisations are used as contact calls to maintain group cohesion in fishes, making fish the evolutionarily oldest vertebrate group in which this phenomenon has been observed, and adding a new dimension to the interpretation of nocturnal reef soundscapes. PMID:26750559
Sinclair, Shane; Hagen, Neil A; Chambers, Carole; Manns, Braden; Simon, Anita; Browman, George P
2008-05-01
Drug decision-makers are involved in developing and implementing policy, procedures, and processes to support health resource allocation regarding drug treatment formularies. A variety of approaches to decision-making, including formal decision-making frameworks, have been developed to support transparent and fair priority setting. Recently, a decision tool, 'The 6-STEPPPs Tool', was developed to assist in making decisions about new cancer drugs within the public health care system. We conducted a qualitative study, utilizing focus groups and participant observation, to investigate the internal frameworks that supported and challenged individual participants as they applied this decision tool within a multi-stakeholder decision process. We discovered that health care resource allocation engaged not only the minds of decision-makers but also profoundly called on the often conflicting values of the heart. Objective decision-making frameworks for new drug therapies need to consider the subjective internal frameworks of decision-makers that affect decisions. Understanding the very human internal turmoil experienced by individuals involved in health care resource allocation sheds additional insight into how to account for reasonableness and how to better support difficult decisions through transparent, values-based resource allocation policy, procedures, and processes.
ERIC Educational Resources Information Center
Moreillon, Judi
2015-01-01
As more and more library and LIS education migrates to the online environment, college and university faculty are called upon to expand their pedagogy. Interactivity has been cited as one factor in attracting and retaining students in online courses and programs. LIS educators can reach outside the online learning management system (LMS) to…
Kevin Hyde; Matthew B. Dickinson; Gil Bohrer; David Calkin; Louisa Evers; Julie Gilbertson-Day; Tessa Nicolet; Kevin Ryan; Christina Tague
2013-01-01
Wildland fire management has moved beyond a singular focus on suppression, calling for wildfire management for ecological benefit where no critical human assets are at risk. Processes causing direct effects and indirect, long-term ecosystem changes are complex and multidimensional. Robust risk-assessment tools are required that account for highly variable effects on...
ERIC Educational Resources Information Center
Abrami, Philip C.; Venkatesh, Vivek; Meyer, Elizabeth J.; Wade, C. Anne
2013-01-01
The research presented here is a continuation of a line of inquiry that explores the impacts of an electronic portfolio software called ePEARL, which is a knowledge tool designed to support the key phases of self-regulated learning (SRL)--forethought, performance, and self-reflection--and promote student learning. Participants in this study were…
ERIC Educational Resources Information Center
Swensen, Kaja Vembe; Silseth, Kenneth; Krange, Ingeborg
2014-01-01
In this paper, we will present and discuss data from a research project called MIRACLE, in which high school students learned about energy and energy transformation in a technology-rich learning environment. This learning environment spanned across a classroom, a science center, and an online platform specially designed to support coherence across…
ERIC Educational Resources Information Center
Sager, Morten; Eriksson, Lena
2015-01-01
In this article we describe and diagnose ailments suffered by the so-called "medical insurance decision-making support tool" that was published in 2007 as part of a major reform of the Swedish social insurance. Through document studies and interviews the guideline is analysed and compared with a reference case, a guideline within…
Logistics Process Analysis Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
2008-03-31
LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra-Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine, an application written in the ANL Repast Simphony framework, was used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.
A software communication tool for the tele-ICU.
Pimintel, Denise M; Wei, Shang Heng; Odor, Alberto
2013-01-01
The Tele Intensive Care Unit (tele-ICU) supports a high-volume, high-acuity population of patients. There is a high volume of incoming and outgoing calls through the tele-ICU hubs, especially during the evening and night hours. Tele-ICU clinicians must be able to communicate effectively with team members in order to support the care of complex and critically ill patients while maintaining a standard that improves time to intervention. This study describes a software communication tool that will improve time to intervention compared with the paper-driven communication format presently used in the tele-ICU. The software provides a multi-relational database of message instances from which information can be mined for evaluation and quality improvement for all entities that touch the tele-ICU. The software design incorporates years of critical care and software design experience combined with new skills acquired in an applied Health Informatics program. This software tool will function in the tele-ICU environment and perform as a front-end application that gathers, routes, and displays internal communication messages for intervention by priority and provider.
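The priority-routing behavior described above might be sketched as follows. Everything here is hypothetical: the priority levels, bed labels, and messages are invented, and the real system's multi-relational database is reduced to a single in-memory heap for illustration.

```python
import heapq
import itertools

# Sketch: incoming tele-ICU messages are queued by acuity so the
# highest-priority intervention is always dispatched first; in a real
# system each instance would also be persisted for later QI mining.

PRIORITY = {"code": 0, "urgent": 1, "routine": 2}
_counter = itertools.count()  # tie-breaker preserves arrival order

queue = []

def submit(priority, bed, text):
    heapq.heappush(queue, (PRIORITY[priority], next(_counter), bed, text))

def next_message():
    _, _, bed, text = heapq.heappop(queue)
    return bed, text

submit("routine", "ICU-12", "electrolyte panel due")
submit("code", "ICU-03", "sustained VT on telemetry")
print(next_message())  # -> ('ICU-03', 'sustained VT on telemetry')
```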
Tools, Services & Support of NASA Salinity Mission Data Archival Distribution through PO.DAAC
NASA Astrophysics Data System (ADS)
Tsontos, V. M.; Vazquez, J.
2017-12-01
The Physical Oceanography Distributed Active Archive Center (PO.DAAC) serves as the designated NASA repository and distribution node for all Aquarius/SAC-D and SMAP sea surface salinity (SSS) mission data products, in close collaboration with the projects. In addition to these official mission products, which by December 2017 will include the Aquarius V5.0 end-of-mission data, PO.DAAC archives and distributes high-value, principal-investigator-led satellite SSS products, as well as datasets from NASA's "Salinity Processes in the Upper Ocean Regional Study" (SPURS 1 & 2) field campaigns in the North Atlantic salinity maximum and high-rainfall Eastern Tropical Pacific regions. Here we report on the status of these data holdings at PO.DAAC and the range of data services and access tools provided in support of NASA salinity. These include user support and data discovery services, OPeNDAP and THREDDS web services for subsetting/extraction, and visualization via LAS and SOTO. Emphasis is placed on newer capabilities, including PO.DAAC's consolidated web services (CWS) and an advanced L2 subsetting tool called HiTIDE.
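The access pattern that OPeNDAP services of this kind enable can be sketched with xarray. The URL, variable name, and coordinate values below are placeholders, not an actual PO.DAAC endpoint; the point is that only metadata is fetched at open time, and data transfer happens for the requested subset alone.

```python
import xarray as xr

# Hypothetical OPeNDAP endpoint and variable name, for illustration only.
URL = "https://example.org/opendap/aquarius_L3_sss_monthly.nc"

ds = xr.open_dataset(URL)  # lazy: only metadata is fetched initially

# Subset on the server side in effect: one month over a small box.
subset = ds["sss"].sel(
    time="2015-06",
    lat=slice(-10, 10),
    lon=slice(-150, -120),
)
print(subset.mean().values)  # data transferred only for this subset
```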
The Development of an eHealth Tool Suite for Prostate Cancer Patients and Their Partners
Van Bogaert, Donna; Hawkins, Robert; Pingree, Suzanne; Jarrard, David
2013-01-01
Background: eHealth resources for people facing health crises must balance the expert knowledge and perspective of developers and clinicians against the very different needs and perspectives of prospective users. This formative study explores the information and support needs of posttreatment prostate cancer patients and their partners as a way to improve an existing eHealth information and support system called CHESS (Comprehensive Health Enhancement Support System). Methods: Focus groups with patient survivors and their partners were used to identify information gaps and information-seeking milestones. Results: Both patients and partners expressed a need for assistance in decision making, connecting with experienced patients, and making sexual adjustments. Female partners of patients are more active in searching for cancer information. All partners have information and support needs distinct from those of the patient. Conclusions: Findings were used to develop a series of interactive tools and navigational features for the CHESS prostate cancer computer-mediated system. PMID:22591675
Development of a knowledge management system for complex domains.
Perott, André; Schader, Nils; Bruder, Ralph; Leonhardt, Jörg
2012-01-01
Deutsche Flugsicherung GmbH (DFS), the German Air Navigation Service Provider, follows a systematic approach called HERA for investigating incidents. HERA analysis shows a distinctive pattern of incidents in German air traffic control in which the visual perception of information plays a key role. The causes can be partially traced back to workstation design, where basic ergonomic rules and principles are in some cases not sufficiently followed by designers. In cooperation with the Institute of Ergonomics in Darmstadt, the DFS investigated possible approaches that may support designers in implementing ergonomic systems. None of the currently available tools was found to meet the identified user requirements holistically. It was therefore suggested to develop an enhanced software tool called the Design Process Guide. The name indicates that this tool exceeds the classic functions of currently available knowledge management systems: it offers access based on design elements, covers both process-related and content-related topics, and shows the implications of particular design decisions. Furthermore, it serves as documentation, detailing why a designer made a decision under a particular set of conditions.
Chronic condition self-management support for Aboriginal people: Adapting tools and training.
Battersby, Malcolm; Lawn, Sharon; Kowanko, Inge; Bertossa, Sue; Trowbridge, Coral; Liddicoat, Raylene
2018-04-22
Chronic conditions are major health problems for Australian Aboriginal people. Self-management programs can improve health outcomes. However, few health workers are skilled in self-management support, and existing programs are not always appropriate in Australian Aboriginal contexts. The goal was to increase the capacity of the Australian health workforce to support Australian Aboriginal people in self-managing their chronic conditions by adapting the Flinders Program of chronic condition self-management support for Australian Aboriginal clients, and to develop and deliver training for health professionals to implement the program. Feedback from health professionals highlighted that the Flinders Program assessment and care planning tools needed to be adapted to suit Australian Aboriginal contexts. Through consultation with Australian Aboriginal Elders and other experts, the tools were condensed into an illustrated booklet called 'My Health Story'. Associated training courses and resources focusing on cultural safety and effective engagement were developed. A total of 825 health professionals across Australia were trained, and 61 people qualified as accredited trainers in the program, ensuring sustainability. The capacity and skills of the Australian health workforce to engage with and support Australian Aboriginal people in self-managing their chronic health problems significantly increased as a result of this project. The adapted tools and training were popular and appreciated by the health care organisations, health professionals, and clients involved. The adapted tools have widespread appeal for cultures that do not have Western models of health care and where there are health literacy challenges. My Health Story has already been used internationally.
Pandey, Ram Vinay; Pabinger, Stephan; Kriegner, Albert; Weinhäusel, Andreas
2016-01-01
Traditional Sanger sequencing as well as Next-Generation Sequencing have been used for the identification of disease causing mutations in human molecular research. The majority of currently available tools are developed for research and explorative purposes and often do not provide a complete, efficient, one-stop solution. As the focus of currently developed tools is mainly on NGS data analysis, no integrative solution for the analysis of Sanger data is provided and consequently a one-stop solution to analyze reads from both sequencing platforms is not available. We have therefore developed a new pipeline called MutAid to analyze and interpret raw sequencing data produced by Sanger or several NGS sequencing platforms. It performs format conversion, base calling, quality trimming, filtering, read mapping, variant calling, variant annotation and analysis of Sanger and NGS data under a single platform. It is capable of analyzing reads from multiple patients in a single run to create a list of potential disease causing base substitutions as well as insertions and deletions. MutAid has been developed for expert and non-expert users and supports four sequencing platforms including Sanger, Illumina, 454 and Ion Torrent. Furthermore, for NGS data analysis, five read mappers including BWA, TMAP, Bowtie, Bowtie2 and GSNAP and four variant callers including GATK-HaplotypeCaller, SAMTOOLS, Freebayes and VarScan2 pipelines are supported. MutAid is freely available at https://sourceforge.net/projects/mutaid.
Pandey, Ram Vinay; Pabinger, Stephan; Kriegner, Albert; Weinhäusel, Andreas
2016-01-01
Traditional Sanger sequencing as well as Next-Generation Sequencing have been used for the identification of disease causing mutations in human molecular research. The majority of currently available tools are developed for research and explorative purposes and often do not provide a complete, efficient, one-stop solution. As the focus of currently developed tools is mainly on NGS data analysis, no integrative solution for the analysis of Sanger data is provided and consequently a one-stop solution to analyze reads from both sequencing platforms is not available. We have therefore developed a new pipeline called MutAid to analyze and interpret raw sequencing data produced by Sanger or several NGS sequencing platforms. It performs format conversion, base calling, quality trimming, filtering, read mapping, variant calling, variant annotation and analysis of Sanger and NGS data under a single platform. It is capable of analyzing reads from multiple patients in a single run to create a list of potential disease causing base substitutions as well as insertions and deletions. MutAid has been developed for expert and non-expert users and supports four sequencing platforms including Sanger, Illumina, 454 and Ion Torrent. Furthermore, for NGS data analysis, five read mappers including BWA, TMAP, Bowtie, Bowtie2 and GSNAP and four variant callers including GATK-HaplotypeCaller, SAMTOOLS, Freebayes and VarScan2 pipelines are supported. MutAid is freely available at https://sourceforge.net/projects/mutaid. PMID:26840129
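For orientation, the stage sequence such a pipeline chains together (mapping, sorting, indexing, variant calling) resembles the generic sketch below, which shells out to bwa, samtools, and bcftools. This illustrates the stages, not MutAid's internals; the file names are placeholders, and an indexed reference genome and the three tools on PATH are assumed.

```python
import subprocess

# Generic variant-calling stage driver (illustrative; not MutAid's code).

def run(cmd):
    print("+", cmd)
    subprocess.run(cmd, shell=True, check=True)

def call_variants(ref, reads, out_prefix):
    run(f"bwa mem {ref} {reads} > {out_prefix}.sam")            # read mapping
    run(f"samtools sort -o {out_prefix}.bam {out_prefix}.sam")  # coordinate sort
    run(f"samtools index {out_prefix}.bam")                     # index for random access
    run(f"bcftools mpileup -f {ref} {out_prefix}.bam"
        f" | bcftools call -mv -o {out_prefix}.vcf")            # SNV/indel calling

# Hypothetical inputs: ref.fa must already be bwa-indexed.
call_variants("ref.fa", "patient1.fastq", "patient1")
```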
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model, together with testability analysis output from the TEAMS Designer software, to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool proved to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. It also describes the example diagnostic model and supporting documentation, provided with the ETA Tool software release package, that were used to generate the reports presented in the manual.
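The sensor-loss sensitivity study can be pictured with a toy directed-graph model. This is an analogy, not the ETA Tool's algorithm: here a failure mode counts as detectable if any available sensor is reachable from it, so dropping a sensor from the available set reveals lost coverage. All node names are invented.

```python
import networkx as nx

# Toy qualitative directed-graph model: failure modes propagate to
# observable effects, some of which are instrumented by sensors.
g = nx.DiGraph()
g.add_edges_from([
    ("valve_stuck", "low_flow"), ("low_flow", "flow_sensor"),
    ("pump_degraded", "low_flow"), ("pump_degraded", "vibration"),
    ("vibration", "accel_sensor"),
])
sensors = {"flow_sensor", "accel_sensor"}
faults = {"valve_stuck", "pump_degraded"}

def detectable(graph, available_sensors):
    # A fault is detectable if some available sensor is reachable from it.
    return {f for f in faults if nx.descendants(graph, f) & available_sensors}

print(detectable(g, sensors))                    # both faults covered
print(detectable(g, sensors - {"flow_sensor"}))  # valve_stuck loses coverage
```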
NASA Astrophysics Data System (ADS)
Groves, D.
2014-12-01
After the devastating 2005 hurricane season, Louisiana embarked on an ambitious and daunting effort to develop and implement a comprehensive Coastal Master Plan. The Master Plan sought to achieve two key goals simultaneously: reduce hurricane flood risk and halt the net conversion of its coastal landscape to open ocean. Numerous prior efforts to achieve these goals had been tried without significant success. In 2012, however, the Louisiana Coastal Protection and Restoration Authority (CPRA) produced a 50-year, $50 billion Master Plan. It had broad support from a diverse and often adversarial set of stakeholders, and it was unanimously passed by the Louisiana legislature. In contrast to other efforts, CPRA took an approach to planning that the U.S. National Research Council calls "deliberation with analysis". Specifically, CPRA used data, models, and decision support tools not to define an optimal or best strategy, but instead to support stakeholder dialogue and deliberation over alternative coastal management strategies. RAND researchers, with the support of CPRA and other collaborators, developed the planning tool at the center of this process. The CPRA planning tool synthesized large amounts of information about how the coast might evolve over time with and without different combinations of hundreds of different projects and programs. The tool helped CPRA propose alternative strategies that could achieve the State's goals while also highlighting to stakeholders the key tradeoffs among them. Importantly, this process helped bring diverse communities together to support a single vision and a specific set of projects and programs to meet many of Louisiana's coastal water resources challenges. This presentation will describe the planning approach and decision support tools developed to support the Master Plan's participatory stakeholder process. The presentation will also highlight several key takeaway messages that have broad applicability to other water resources planning efforts. Lastly, it will describe several ongoing efforts in other parts of the U.S. that are employing this same approach.
Radiocarbon dating of twentieth century works of art
NASA Astrophysics Data System (ADS)
Petrucci, F.; Caforio, L.; Fedi, M.; Mandò, P. A.; Peccenini, E.; Pellicori, V.; Rylands, P.; Schwartzbaum, P.; Taccetti, F.
2016-11-01
The atmospheric tests of nuclear weapons caused a sudden increase in the radiocarbon concentration in the atmosphere from 1955, reaching its maximum value in 1963-1965. Once nuclear tests in the atmosphere were halted, the 14C concentration started to decrease. This behavior of the radiocarbon concentration is called the "Bomb Peak", and it has been used successfully as a tool for high-precision radiocarbon dating in forensic science and biology. In the art field, the possibility of dating canvas, wood, and paper, which are widely used as supports for paintings, may be an invaluable tool in modern art studies.
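For context, the conventional radiocarbon age is computed from the measured 14C fraction with the standard relation below (using the Libby mean life of 8033 years); this is the general formula, not a result specific to this paper. Bomb-peak samples have F14C greater than 1, which would yield negative conventional ages, so they are instead dated by matching the measured F14C against the bomb calibration curve.

```latex
t = -8033 \,\ln\!\left(F^{14}\mathrm{C}\right)
```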
Coordinating complex problem-solving among distributed intelligent agents
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1992-01-01
A process-oriented control model is described for distributed problem solving. The model coordinates the transfer and manipulation of information across independent networked applications, both intelligent and conventional. The model was implemented using SOCIAL, a set of object-oriented tools for distributed computing. Complex sequences of distributed tasks are specified in terms of high-level scripts. Scripts are executed by SOCIAL objects called Manager Agents, which realize an intelligent coordination model that routes individual tasks to suitable server applications across the network. These tools are illustrated in a prototype distributed system for decision support of ground operations for NASA's Space Shuttle fleet.
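A minimal sketch of that coordination pattern follows. All names are hypothetical and this is not SOCIAL's API: a manager agent simply walks a high-level script and routes each task to whichever registered server application advertises that capability.

```python
# Hypothetical manager-agent dispatch sketch (not SOCIAL's API).

servers = {
    "check_weather": "weather_service",
    "plan_refuel": "shuttle_ops_planner",
    "schedule_crew": "crew_scheduler",
}

script = ["check_weather", "plan_refuel", "schedule_crew"]

def manager_agent(script, servers):
    results = []
    for task in script:
        server = servers[task]  # routing decision: task -> capable server
        results.append(f"{task} -> dispatched to {server}")
    return results

for line in manager_agent(script, servers):
    print(line)
```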
Numerical Propulsion System Simulation: A Common Tool for Aerospace Propulsion Being Developed
NASA Technical Reports Server (NTRS)
Follen, Gregory J.; Naiman, Cynthia G.
2001-01-01
The NASA Glenn Research Center is developing an advanced multidisciplinary analysis environment for aerospace propulsion systems called the Numerical Propulsion System Simulation (NPSS). This simulation is initially being used to support aeropropulsion in the analysis and design of aircraft engines. NPSS provides increased flexibility for the user, which reduces the total development time and cost. It is currently being extended to support the Aviation Safety Program and Advanced Space Transportation. NPSS focuses on the integration of multiple disciplines such as aerodynamics, structure, and heat transfer with numerical zooming on component codes. Zooming is the coupling of analyses at various levels of detail. NPSS development includes using the Common Object Request Broker Architecture (CORBA) in the NPSS Developer's Kit to facilitate collaborative engineering. The NPSS Developer's Kit will provide the tools to develop custom components and to use the CORBA capability for zooming to higher fidelity codes, coupling to multidiscipline codes, transmitting secure data, and distributing simulations across different platforms. These powerful capabilities will extend NPSS from a zero-dimensional simulation tool to a multifidelity, multidiscipline system-level simulation tool for the full life cycle of an engine.
NASA Astrophysics Data System (ADS)
Bell, C.; Li, Y.; Lopez, E.; Hogue, T. S.
2017-12-01
Decision support tools that quantitatively estimate the cost and performance of infrastructure alternatives are valuable for urban planners. Such a tool is needed to aid in planning stormwater projects that meet diverse goals, such as the regulation of stormwater runoff and its pollutants, minimization of economic costs, and maximization of environmental and social benefits in the communities served by the infrastructure. This work gives a brief overview of an integrated decision support tool, called i-DST, that is currently being developed to serve this need. The presentation focuses on the development of a default database for the i-DST that parameterizes the water quality treatment efficiency of stormwater best management practices (BMPs) by region. Parameterizing the i-DST by region will allow the tool to perform accurate simulations in all parts of the United States. A national dataset of BMP performance is analyzed to determine which of a series of candidate regionalizations explains the most variance in the national dataset. The data used in the regionalization analysis come from the International Stormwater BMP Database and from an ongoing systematic review of peer-reviewed and gray literature. In addition to identifying a regionalization scheme for water quality performance parameters in the i-DST, our review process will also provide example methods and protocols for systematic reviews in the field of Earth Science.
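A hypothetical sketch of the selection criterion: for each candidate regionalization, compute the share of variance in BMP removal efficiency explained by the grouping (eta-squared) and keep the candidate that explains the most. The data frame below is invented for illustration.

```python
import pandas as pd

# Toy BMP performance records with two candidate regionalizations.
df = pd.DataFrame({
    "removal_pct":  [62, 58, 71, 40, 45, 38, 66, 43],
    "by_climate":   ["wet", "wet", "wet", "arid", "arid", "arid", "wet", "arid"],
    "by_ecoregion": ["A", "B", "A", "B", "A", "B", "A", "B"],
})

def eta_squared(df, group_col, value_col="removal_pct"):
    """Fraction of total variance explained by the grouping (SS_between / SS_total)."""
    grand_mean = df[value_col].mean()
    ss_total = ((df[value_col] - grand_mean) ** 2).sum()
    group_means = df.groupby(group_col)[value_col].transform("mean")
    ss_between = ((group_means - grand_mean) ** 2).sum()
    return ss_between / ss_total

for candidate in ["by_climate", "by_ecoregion"]:
    print(candidate, round(eta_squared(df, candidate), 3))
```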
ERIC Educational Resources Information Center
Asfeldt, Morten; Purc-Stephenson, Rebecca; Hvenegaard, Glen
2017-01-01
Journal writing is a common practice in outdoor education (OE) and there is a long-standing claim that OE programs enhance sense of community (SOC). However, there remains a call for additional evidence to support the relationship between participation in outdoor programs and SOC. This study examines students' perceptions of the role of a group…
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, Michael J.
Schema-on-read is an agile approach to data storage and retrieval that defers investments in data organization until production queries need to be run, by working with data directly in native form. Schema-on-read functions have been implemented in a wide range of analytical systems, most notably Hadoop. SchemaOnRead is a CRAN package that uses R's flexible data representations to provide transparent and convenient support for the schema-on-read paradigm in R. The schema-on-read tools within the package include a single function call that recursively reads folders with text, comma separated value, raster image, R data, HDF5, NetCDF, spreadsheet, Weka, Epi Info, Pajek network, R network, HTML, SPSS, Systat, and Stata files. The provided tools can be used as-is or easily adapted to implement customized schema-on-read tool chains in R. This paper's contribution is that it introduces and describes SchemaOnRead, the first R package specifically focused on providing explicit schema-on-read support in R.
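The paradigm translates readily to other languages. A minimal Python analogue (not the R package's interface) dispatches on file type at read time and recurses through folders, returning whatever native structure each reader produces; the folder name in the commented call is hypothetical.

```python
import csv
import json
import pathlib

# Schema-on-read sketch: defer schema decisions until read time by
# dispatching on file extension and recursing through directories.

READERS = {
    ".csv": lambda p: list(csv.DictReader(p.open())),
    ".json": lambda p: json.load(p.open()),
    ".txt": lambda p: p.read_text(),
}

def schema_on_read(path):
    path = pathlib.Path(path)
    if path.is_dir():
        return {child.name: schema_on_read(child) for child in sorted(path.iterdir())}
    reader = READERS.get(path.suffix.lower())
    return reader(path) if reader else path  # unknown types left as paths

# data = schema_on_read("field_campaign_2024/")  # hypothetical folder
```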
Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education
NASA Astrophysics Data System (ADS)
Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki
This paper presents browser-based, multimedia-rich software tools and e-learning curricula to support the design and modeling of power electronics circuits and to explain sometimes rather sophisticated phenomena. Two projects are discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU). It is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with participation of Japanese, European, and Australian institutes, focuses especially on developing e-learning curricula and interactive design and modeling tools, and furthermore on the development of a virtual laboratory. Snapshots from these two projects are presented.
Cloud-Based Tools to Support High-Resolution Modeling (Invited)
NASA Astrophysics Data System (ADS)
Jones, N.; Nelson, J.; Swain, N.; Christensen, S.
2013-12-01
The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of these roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open-source stack provides a simple scripting environment for quickly configuring new custom applications that run high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archiving, querying, and sharing of model results. Prototype applications including land use change, snowmelt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482.
An Investigation of Software Scaffolds Supporting Modeling Practices
NASA Astrophysics Data System (ADS)
Fretz, Eric B.; Wu, Hsin-Kai; Zhang, Baohui; Davis, Elizabeth A.; Krajcik, Joseph S.; Soloway, Elliot
2002-08-01
Modeling of complex systems and phenomena is of value in science learning and is increasingly emphasised as an important component of science teaching and learning. Modeling engages learners in desired pedagogical activities. These activities include practices such as planning, building, testing, analysing, and critiquing. Designing realistic models is a difficult task. Computer environments allow the creation of dynamic and even more complex models. One way of bringing the design of models within reach is through the use of scaffolds. Scaffolds are intentional assistance provided to learners from a variety of sources, allowing them to complete tasks that would otherwise be out of reach. Currently, our understanding of how scaffolds in software tools assist learners is incomplete. In this paper the scaffolds designed into a dynamic modeling software tool called Model-It are assessed in terms of their ability to support learners' use of modeling practices. Four pairs of middle school students were video-taped as they used the modeling software for three hours, spread over a two week time frame. Detailed analysis of coded videotape transcripts provided evidence of the importance of scaffolds in supporting the use of modeling practices. Learners used a variety of modeling practices, the majority of which occurred in conjunction with scaffolds. The use of three tool scaffolds was assessed as directly as possible, and these scaffolds were seen to support a variety of modeling practices. An argument is made for the continued empirical validation of types and instances of tool scaffolds, and further investigation of the important role of teacher and peer scaffolding in the use of scaffolded tools.
ERIC Educational Resources Information Center
DeVane, Benjamin
2017-01-01
In this review article, I argue that games are complementary, not self-supporting, learning tools for democratic education because they can: (a) offer "simplified, but often not simple, outlines" (later called "models") of complex social systems that generate further inquiry; (b) provide "practice spaces" for…
ERIC Educational Resources Information Center
Giles, David; Bills, Andrew
2017-01-01
This case study research found that the relational leadership and organisational culture at a public primary school situated in a high poverty location in South Australia was built upon the strength of the inter-relationships between the teachers, teachers and leadership, and between teachers and students. Supported by what we called "dynamic…
Stress management standards: a warning indicator for employee health.
Kazi, A; Haslam, C O
2013-07-01
Psychological stress is a major cause of lost working days in the UK. The Health & Safety Executive (HSE) has developed management standards (MS) to help organizations assess work-related stress. The aim was to investigate the relationships between the MS indicator tool and employee health, job attitudes, work performance, and environmental outcomes. The first phase involved a survey employing the MS indicator tool, the General Health Questionnaire-12 (GHQ-12), and measures of job attitudes, work performance, and the work environment in a call centre of a large utility company. The second phase comprised six focus groups to investigate what employees believed contributed to their perceived stress. Three hundred and four call centre employees responded, a response rate of 85%. Significant negative correlations were found between GHQ-12 and two MS dimensions: demands (Rho = -0.211, P < 0.001) and relationships (Rho = -0.134, P < 0.05). Other dimensions showed no significant relationship with GHQ-12. Higher levels of stress were associated with reduced job performance, reduced job motivation, and increased intention to quit, but low stress levels were also associated with reduced job satisfaction. Lack of management support, recognition, and development opportunities were identified as sources of stress. The findings support the utility of the MS as a measure of employee attitudes and performance.
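The statistics reported above are plain Spearman rank correlations. A minimal Python sketch on synthetic data (not the study's data) reproduces the form of the result; the effect size induced below is arbitrary.

```python
import numpy as np
from scipy.stats import spearmanr

# Synthetic illustration of the reported statistic: Spearman's Rho between
# an MS dimension score and GHQ-12 distress, where a negative Rho means
# better-rated working conditions accompany lower distress.
rng = np.random.default_rng(1)
demands = rng.normal(3.5, 0.8, 304)                 # MS "demands" scores, n = 304
ghq12 = -0.2 * demands + rng.normal(2.0, 0.9, 304)  # induced weak negative link

rho, p = spearmanr(demands, ghq12)
print(f"Rho = {rho:.3f}, P = {p:.4f}")
```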
NASA Astrophysics Data System (ADS)
Mendiola, M. A.; Aguado, P. L.; Espejo, R.
2012-04-01
The main objective of the CyberAula 2.0 project is to support, record and validate videoconferencing and lecture-recording services by integrating the Universidad Politécnica de Madrid (UPM) Moodle platform with both the GlobalPlaza platform and the Isabel videoconferencing tool. Each class session is broadcast on the Internet and then recorded, enabling geographically distant students to participate live in on-campus classes, asking questions through a chat tool or through the videoconference itself. All the software used in the project is open source. GlobalPlaza (Barra et al., 2011) is the web platform used to schedule, perform, stream, record and publish the videoconferences automatically; it was developed in the context of the GLOBAL project, a research project supported by the European Commission's Seventh Framework Programme. It is integrated with the videoconferencing tool called Isabel (Quemada et al., 2005), a real-time collaboration tool for the Internet that supports advanced collaborative web/videoconferencing with application sharing and TV-like media integration. Both are open-source solutions developed at UPM specifically for educational purposes. Students can review the recorded lectures through Moodle when needed. In this paper we present the project developed at the Escuela Técnica Superior de Ingenieros Agrónomos (ETSIA) around a second-cycle free-elective subject, which is currently in the process of expiring with the introduction of new curricula within the framework of the European Higher Education Area. Students participate in this subject with outstanding interest, acquiring transversal competences as they must prepare and present a report in the last week of the semester. The background to the project was the inclusion of the subject "Plants of agro-alimentary interest". It has a remarkably attractive practical component within the subjects offered by this center and is one of the most demanded free-elective subjects among students, whose active participation is notable both in practical workshops and in the individual presentation of reports. In the workshops students must identify, describe, classify and even taste several species of agro-alimentary interest (fruits of temperate or tropical regions, aromatic plants and spices, edible mushrooms, and cereals and pseudocereals), many of them previously unknown to most. They are also asked to fill in questionnaires in order to consolidate concepts and to evaluate their personal participation in the subject's development.
77 FR 72830 - Request for Comments on Request for Continued Examination (RCE) Practice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-06
... the submission of written comments using a Web-based collaboration tool called IdeaScale®; and... collaboration tool called IdeaScale®. The tool allows users to post comments on a topic, and view and...
The ALMA OT in early science: supporting multiple customers
NASA Astrophysics Data System (ADS)
Bridger, Alan; Williams, Stewart; McLay, Stewart; Yatagai, Hiroshi; Schilling, Marcus; Biggs, Andrew; Tobar, Rodrigo; Warmels, Rein H.
2012-09-01
The ALMA Observatory is currently conducting 'Early Science' observing. The Cycle 0 and Cycle 1 Calls for Proposals are part of this Early Science, and in both the ALMA Observing Tool (OT) plays a crucial role. This paper describes how the ALMA OT tackles the problem of making millimeter/sub-millimeter interferometry accessible to the wider community, while allowing "experts" the power and flexibility they need. We will also describe our approach to the challenges of supporting multiple customers, and explore the lessons learnt from the Early Science experiences. Finally we look ahead to the challenges presented by future observing cycles.
Applications of Support Vector Machine (SVM) Learning in Cancer Genomics
HUANG, SHUJUN; CAI, NIANGUANG; PACHECO, PEDRO PENZUTI; NARANDES, SHAVIRA; WANG, YANG; XU, WAYNE
2017-01-01
Machine learning with maximization (support) of separating margin (vector), called support vector machine (SVM) learning, is a powerful classification tool that has been used for cancer genomic classification or subtyping. Today, as advancements in high-throughput technologies lead to production of large amounts of genomic and epigenomic data, the classification feature of SVMs is expanding its use in cancer genomics, leading to the discovery of new biomarkers, new drug targets, and a better understanding of cancer driver genes. Herein we reviewed the recent progress of SVMs in cancer genomic studies. We intend to comprehend the strength of the SVM learning and its future perspective in cancer genomic applications. PMID:29275361
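For illustration, a minimal sketch of the kind of SVM classification described, using scikit-learn on synthetic expression-like data; the sample counts, features, and labels are invented stand-ins, not anything from the review:

```python
# Sketch: linear-kernel SVM separating two synthetic "subtypes" from
# expression-like features; illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))            # 200 samples x 50 gene-like features
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # two synthetic subtypes

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_tr)
clf = SVC(kernel="linear", C=1.0).fit(scaler.transform(X_tr), y_tr)
print("held-out accuracy:", clf.score(scaler.transform(X_te), y_te))
```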
42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems
NASA Technical Reports Server (NTRS)
Stoneking, Eric
2018-01-01
Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 has been publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.
Prioritization of malaria endemic zones using self-organizing maps in the Manipur state of India.
Murty, Upadhyayula Suryanarayana; Srinivasa Rao, Mutheneni; Misra, Sunil
2008-09-01
Huge amounts of epidemiological and public health data are now available and require analysis and interpretation with appropriate mathematical tools, so that existing methods of controlling mosquitoes and mosquito-borne diseases can be supported more effectively; data-mining tools are used to make sense of this chaos. Using data-mining tools, one can develop predictive models, patterns, association rules, and clusters of diseases, which can help decision-makers in controlling the diseases. This paper focuses on the application of data-mining tools, used here for the first time to prioritize the malaria endemic regions of Manipur state, using Self-Organizing Maps (SOM). The SOM results (two-dimensional images called Kohonen maps) clearly show the visual classification of malaria endemic zones into high, medium and low in the different districts of Manipur, and are discussed in the paper.
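As a rough illustration of the technique, a sketch of clustering region-level indicators with a small SOM; the MiniSom package, the three features, and all numbers are assumptions for the example, not the authors' toolchain or data:

```python
# Sketch: map hypothetical per-region epidemiological features onto a
# 3x1 SOM so each map cell acts as a high/medium/low endemicity cluster.
import numpy as np
from minisom import MiniSom

# Invented features: [annual parasite incidence, fraction Pf cases,
# breeding-site density]
data = np.array([
    [12.1, 0.80, 0.9],
    [ 1.3, 0.20, 0.1],
    [ 6.7, 0.55, 0.4],
    [11.4, 0.75, 0.8],
])
data = (data - data.mean(axis=0)) / data.std(axis=0)  # normalize features

som = MiniSom(3, 1, input_len=3, sigma=0.5, learning_rate=0.5, random_seed=1)
som.train_random(data, num_iteration=500)
for row in data:
    print(som.winner(row))  # winning cell ~ endemicity cluster for that region
```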
Charmaz, Kathy
2015-12-01
This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve the teaching of qualitative methods at all levels, although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.
Miller, Brian W.; Morisette, Jeffrey T.
2014-01-01
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Profile Interface Generator (PIG) is a tool for loosely coupling applications and performance tools. It enables applications to write code that looks like standard C and Fortran function calls, without requiring that applications link to specific implementations of those function calls. Performance tools can register with PIG in order to listen to only the calls that give information they care about. This interface reduces the build and configuration burden on application developers and allows semantic instrumentation to live in production codes without interfering with production runs.
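PIG itself targets C and Fortran call sites, but the loose-coupling pattern is easy to sketch in any language. A Python analogue, with all names invented, in which tools register for only the events they care about and unregistered events cost almost nothing:

```python
# Analogue of PIG-style loose coupling: applications emit named events as
# plain calls; tools opt in to the events they care about. Illustrative
# names only, not PIG's actual API.
listeners = {}

def register(event_name, callback):
    """A performance tool subscribes to one kind of event."""
    listeners.setdefault(event_name, []).append(callback)

def emit(event_name, **payload):
    """Application-side call site; effectively a no-op if nobody registered."""
    for callback in listeners.get(event_name, []):
        callback(**payload)

# A tool that only cares about solver timings:
register("solver_step", lambda seconds, **_: print(f"solver took {seconds}s"))

emit("solver_step", seconds=0.42, iteration=7)   # observed by the tool
emit("io_flush", bytes_written=4096)             # silently ignored
```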
Long Range Plan for Embedded Computer Systems Support. Volume II
1981-10-01
interface (pilot displays and controls plus visual system), and data collection (CMAC data, bus data and simulation data). Non-real time functions include...unless adequate upfront planning is implemented, the command will be controlled by the dynamics rather than controlling them. The upfront planning should...or should they be called manually? What amount and type of data should the various tools pass between each other? Under what conditions and controls
Lathe Attachment Finishes Inner Surface of Tubes
NASA Technical Reports Server (NTRS)
Lancki, A. J.
1982-01-01
Extremely smooth finishes are machined on the inside surfaces of tubes by a new attachment for a lathe. The relatively inexpensive accessory, called a "microhone," holds a honing stone against the workpiece by rigid tangs instead of springs as in conventional honing tools. An inner rod permits adjustment of the microhoning stone, while an outer tube supports the assembly. The outer tube is held between split blocks on the lathe toolpost. Microhoning can be done with either the microhone or the workpiece moving and the other member stationary.
NASA Technical Reports Server (NTRS)
Rogers, James L.; Sobieszczanski-Sobieski, Jaroslaw
1989-01-01
Only recently have engineers begun making use of Artificial Intelligence (AI) tools in the area of conceptual design. To continue filling this void in the design process, a prototype knowledge-based system called STRUTEX has been developed to initially configure a structure to support point loads in two dimensions. This prototype was developed for testing the application of AI tools to conceptual design, as opposed to being a testbed for new methods for improving structural analysis and optimization. The system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user. How the system is constructed to interact with the user is described. Of special interest is the information flow between the knowledge base and the data base under control of the algorithmic main program. Examples of computed and refined structures are presented during the explanation of the system.
NASA Technical Reports Server (NTRS)
Williams, Jacob; Stewart, Shaun M.; Lee, David E.; Davis, Elizabeth C.; Condon, Gerald L.; Senent, Juan
2010-01-01
The National Aeronautics and Space Administration's (NASA) Constellation Program paves the way for a series of lunar missions leading to a sustained human presence on the Moon. The proposed mission design includes an Earth Departure Stage (EDS), a Crew Exploration Vehicle (Orion) and a lunar lander (Altair) which support the transfer to and from the lunar surface. This report addresses the design, development and implementation of a new mission scan tool called the Mission Assessment Post Processor (MAPP) and its use to provide insight into the integrated (i.e., EDS, Orion, and Altair based) mission cost as a function of various mission parameters and constraints. The Constellation architecture calls for semiannual launches to the Moon and will support a number of missions, beginning with 7-day sortie missions and culminating in a lunar outpost at a specified location. The operational lifetime of the Constellation Program can cover a period of decades, over which the Earth-Moon geometry (particularly the lunar inclination) will go through a complete cycle (i.e., the lunar nodal cycle lasting 18.6 years). This geometry variation, along with other parameters such as flight time, landing site location, and mission-related constraints, affects the outbound (Earth to Moon) and inbound (Moon to Earth) translational performance cost. The mission designer must determine the ability of the vehicles to perform lunar missions as a function of this complex set of interdependent parameters. Trade-offs among these parameters provide essential insights for properly assessing the ability of a mission architecture to meet desired goals and objectives. These trades also aid in determining the overall usable propellant required for supporting nominal and off-nominal missions over the entire operational lifetime of the program, and thus they support vehicle sizing.
Actualities and Development of Heavy-Duty CNC Machine Tool Thermal Error Monitoring Technology
NASA Astrophysics Data System (ADS)
Zhou, Zu-De; Gui, Lin; Tan, Yue-Gang; Liu, Ming-Yao; Liu, Yi; Li, Rui-Ya
2017-09-01
Thermal error monitoring technology is the key technological support for solving the thermal error problem of heavy-duty CNC (computer numerical control) machine tools. There are many review articles introducing thermal error research on CNC machine tools, but they mainly focus on thermal issues in small and medium-sized CNC machine tools and seldom introduce thermal error monitoring technologies. This paper gives an overview of research on the thermal error of CNC machine tools and emphasizes the study of thermal error of heavy-duty CNC machine tools in three areas: the causes of thermal error of heavy-duty CNC machine tools, temperature monitoring technology, and thermal deformation monitoring technology. A new optical measurement technology called "fiber Bragg grating (FBG) distributed sensing technology" for heavy-duty CNC machine tools is introduced in detail. This technology forms an intelligent sensing and monitoring system for heavy-duty CNC machine tools. This paper fills a gap in this kind of review article, guiding the development of this industry field and opening up new areas of research on heavy-duty CNC machine tool thermal error.
SchemaOnRead: A Package for Schema-on-Read in R
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, Michael J.
Schema-on-read is an agile approach to data storage and retrieval that defers investments in data organization until production queries need to be run by working with data directly in native form. Schema-on-read functions have been implemented in a wide range of analytical systems, most notably Hadoop. SchemaOnRead is a CRAN package that uses R's flexible data representations to provide transparent and convenient support for the schema-on-read paradigm in R. The schema-on-read tools within the package include a single function call that recursively reads folders with text, comma separated value, raster image, R data, HDF5, NetCDF, spreadsheet, Weka, Epi Info, Pajek network, R network, HTML, SPSS, Systat, and Stata files. The provided tools can be used as-is or easily adapted to implement customized schema-on-read tool chains in R. This paper's contribution is that it introduces and describes SchemaOnRead, the first R package specifically focused on providing explicit schema-on-read support in R.
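SchemaOnRead itself is an R package; as a language-neutral illustration of the idea, here is a small Python analogue in which the reader is inferred from each file's extension at read time (the dispatch table is invented and far smaller than the package's format support):

```python
# Analogue of SchemaOnRead's single recursive entry point: decide how to
# read each file at query time, from its native form.
import csv
import json
from pathlib import Path

READERS = {
    ".csv": lambda p: list(csv.DictReader(p.open())),
    ".json": lambda p: json.load(p.open()),
    ".txt": lambda p: p.read_text(),
}

def schema_on_read(path):
    """Recursively read a folder tree, dispatching on file extension."""
    p = Path(path)
    if p.is_dir():
        return {child.name: schema_on_read(child) for child in p.iterdir()}
    reader = READERS.get(p.suffix.lower())
    return reader(p) if reader else p  # unknown formats come back as paths

# data = schema_on_read("experiment_folder")  # nested dict mirroring the tree
```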
The Virtual Mission Operations Center
NASA Technical Reports Server (NTRS)
Moore, Mike; Fox, Jeffrey
1994-01-01
Spacecraft management is becoming more human-intensive as spacecraft become more complex, and operations costs are growing accordingly. Several automation approaches have been proposed to lower these costs. However, most of these approaches are not flexible enough in the operations processes and levels of automation that they support. This paper presents a concept called the Virtual Mission Operations Center (VMOC) that provides highly flexible support for dynamic spacecraft management processes and automation. In a VMOC, operations personnel can be shared among missions, the operations team can change personnel and their locations, and automation can be added and removed as appropriate. The VMOC employs a form of on-demand supervisory control called management by exception to free operators from having to actively monitor their system. The VMOC extends management by exception, however, so that distributed, dynamic teams can work together. The VMOC uses work-group computing concepts and groupware tools to provide a team infrastructure, and it employs user agents to allow operators to define and control system automation.
Large Terrain Continuous Level of Detail 3D Visualization Tool
NASA Technical Reports Server (NTRS)
Myint, Steven; Jain, Abhinandan
2012-01-01
This software solves the problem of displaying terrains that are usually too large to be displayed on standard workstations in real time. The Large Terrain Continuous Level of Detail 3D Visualization Tool can visualize terrain data sets composed of billions of vertices and display them at greater than 30 frames per second. It utilizes a continuous level-of-detail technique called clipmapping to support this, and offloads much of the work involved in breaking up the terrain into levels of detail onto the GPU (graphics processing unit) for faster processing.
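To make the clipmapping idea concrete, a toy sketch of the level-of-detail selection at its core: terrain farther from the viewer is drawn from coarser levels so the resident working set stays bounded. The function and constants are invented for illustration, not taken from the tool:

```python
# Sketch: pick a coarser terrain level as distance from the viewer grows,
# which is the selection rule clipmapping builds on.
import math

def clip_level(distance, finest_texel_size=1.0, num_levels=10):
    """Return the level of detail for a terrain tile at a given distance."""
    if distance <= finest_texel_size:
        return 0
    level = int(math.log2(distance / finest_texel_size))
    return min(level, num_levels - 1)

for d in (0.5, 4.0, 64.0, 5000.0):
    print(d, "->", clip_level(d))  # farther tiles get coarser levels
```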
Brown, Ameldia R; Coppola, Patricia; Giacona, Marian; Petriches, Anne; Stockwell, Mary Ann
2009-01-01
Health systems seeking responsible stewardship of community benefit dollars supporting Faith Community Nursing networks require demonstration of positive, measurable health outcomes. Faith Community Nurses (FCNs) answer the call for measurable outcomes by documenting cost savings and cost avoidances to families, communities, and health systems associated with their interventions. Using a spreadsheet tool based on Medicare reimbursements and diagnostic-related groupings, 3 networks of FCNs have together shown more than $600,000 (for calendar year 2008) in healthcare dollars saved by avoidance of unnecessary acute care visits and extended care placements. The cost-benefit ratio of support dollars to cost savings and cost avoidance demonstrates that support of FCNs is good stewardship of community benefit dollars.
Applications of Support Vector Machine (SVM) Learning in Cancer Genomics.
Huang, Shujun; Cai, Nianguang; Pacheco, Pedro Penzuti; Narrandes, Shavira; Wang, Yang; Xu, Wayne
2018-01-01
Machine learning with maximization (support) of separating margin (vector), called support vector machine (SVM) learning, is a powerful classification tool that has been used for cancer genomic classification or subtyping. Today, as advancements in high-throughput technologies lead to production of large amounts of genomic and epigenomic data, the classification feature of SVMs is expanding its use in cancer genomics, leading to the discovery of new biomarkers, new drug targets, and a better understanding of cancer driver genes. Herein we reviewed the recent progress of SVMs in cancer genomic studies. We intend to comprehend the strength of the SVM learning and its future perspective in cancer genomic applications. Copyright© 2018, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.
Analysis of the Requirements Generation Process for the Logistics Analysis and Wargame Support Tool
2017-06-01
For instance, the requirements for a pen seem straightforward; however, they may vary depending on the context in which the pen will be used...the interactions between the operational elements, specify which tasks are dependent on others and the order of executing tasks, and estimate how...configuration file to call that spreadsheet. This requirement can be met depending on the situation. If the nodes and arcs are pre-defined and readily
Examination of Various Methods Used in Support of Concurrent Engineering
1990-03-01
1989. F.Y.I. Drawing a2Ther Productivity. Industrial Engineering 21: 80. Ishi82 Ishikawa, Kaoru. 1982. Guide to Quality Control. White Plains, NY: Kraus...observe it in practice have an easier time identifying the different methods or techniques (such as the Ishikawa tools) used than understanding the...simple histogram to show what problems should be attacked first. Cause and Effect Diagrams Sometimes called the fishbone or Ishikawa diagrams, a kind
Piette, John D; Mendoza-Avelares, Milton O; Ganser, Martha; Mohamed, Muhima; Marinec, Nicolle; Krishnan, Sheila
2011-06-01
Although interactive voice response (IVR) calls can be an effective tool for chronic disease management, many regions of the world lack the infrastructure to provide these services. This study evaluated the feasibility and potential impact of an IVR program using a cloud-computing model to improve diabetes management in Honduras. A single-group, pre-post study was conducted between June and August 2010. The telecommunications infrastructure was maintained on a U.S. server, and calls were directed to patients' cell phones using VoIP. Eighty-five diabetes patients in Honduras received weekly IVR disease management calls for 6 weeks, with automated follow-up e-mails to clinicians, and voicemail reports to family caregivers. Patients completed interviews at enrollment and a 6-week follow-up. Other measures included patients' glycemic control (HbA1c) and data from the IVR calling system. A total of 53% of participants completed at least half of their IVR calls and 23% of participants completed 80% or more. Higher baseline blood pressures, greater diabetes burden, greater distance from the clinic, and better medication adherence were related to higher call completion rates. Nearly all participants (98%) reported that because of the program, they improved in aspects of diabetes management such as glycemic control (56%) or foot care (89%). Mean HbA1c's decreased from 10.0% at baseline to 8.9% at follow-up (p<0.01). Most participants (92%) said that if the service were available in their clinic they would use it again. Cloud computing is a feasible strategy for providing IVR services globally. IVR self-care support may improve self-care and glycemic control for patients in underdeveloped countries. Published by Elsevier Inc.
Piette, John D.; Mendoza-Avelares, Milton O.; Ganser, Martha; Mohamed, Muhima; Marinec, Nicolle; Krishnan, Sheila
2013-01-01
Background: Although interactive voice response (IVR) calls can be an effective tool for chronic disease management, many regions of the world lack the infrastructure to provide these services. Objective: This study evaluated the feasibility and potential impact of an IVR program using a cloud-computing model to improve diabetes management in Honduras. Methods: A single-group, pre-post study was conducted between June and August 2010. The telecommunications infrastructure was maintained on a U.S. server, and calls were directed to patients' cell phones using VoIP. Eighty-five diabetes patients in Honduras received weekly IVR disease management calls for six weeks, with automated follow-up emails to clinicians and voicemail reports to family caregivers. Patients completed interviews at enrollment and a six-week follow-up. Other measures included patients' glycemic control (A1c) and data from the IVR calling system. Results: 55% of participants completed the majority of their IVR calls and 33% completed 80% or more. Higher baseline blood pressures, greater diabetes burden, greater distance from the clinic, and better adherence were related to higher call completion rates. Nearly all participants (98%) reported that because of the program, they improved in aspects of diabetes management such as glycemic control (56%) or foot care (89%). Mean A1c's decreased from 10.0% at baseline to 8.9% at follow-up (p<.01). Most participants (92%) said that if the service were available in their clinic they would use it again. Conclusions: Cloud computing is a feasible strategy for providing IVR services globally. IVR self-care support may improve self-care and glycemic control for patients in under-developed countries. PMID:21565655
NASA Technical Reports Server (NTRS)
Darroy, Jean Michel
1993-01-01
Current trends in the spacecraft mission operations area (spacecraft and mission complexity, project duration, required flexibility) are requiring a breakthrough in philosophy, organization, and support tools. A major evolution is the 'informationalization' of space operations, i.e., adding to existing operations support and data processing systems a new generation of tools based on advanced information technologies (object-oriented programming, artificial intelligence, databases, hypertext) that automate, at least partially, operations tasks that used to be performed manually (mission and project planning/scheduling, elaboration and execution of operations procedures, data analysis and failure diagnosis). All the major facets of this 'informationalization' are addressed at MATRA MARCONI SPACE; operational applications have been fielded and generic products are becoming available. These various applications have generated significant feedback from users (at ESA, CNES, ARIANESPACE, MATRA MARCONI SPACE), which now allows us to measure precisely how the deployment of this new generation of tools, which we have called OPSWARE, can 'reengineer' current spacecraft mission operations philosophy, and how it can make space operations faster, better, and cheaper. This paper can be considered an update of the keynote address 'Knowledge-Based Systems for Spacecraft Control' presented during the first 'Ground Data Systems for Spacecraft Control' conference in Darmstadt, June 1990, with a special emphasis on the last two years of user feedback.
Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A
1995-06-01
PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.
Perl Tools for Automating Satellite Ground Systems
NASA Technical Reports Server (NTRS)
McLean, David; Haar, Therese; McDonald, James
2000-01-01
The freeware scripting language Perl offers many opportunities for automating satellite ground systems, for new satellites as well as older, in situ systems. This paper describes a toolkit that has evolved from the experiences gained by using Perl to automate the ground system for the Compton Gamma Ray Observatory (CGRO) and to automate some of the elements of the Earth Observing System Data and Operations System (EDOS) ground system at Goddard Space Flight Center (GSFC). CGRO is an older ground system that was forced to automate because of funding cuts: three 8-hour shifts were cut back to one 8-hour shift, 7 days per week. EDOS supports a new mission called Terra, launched in December 1999, that requires distribution and tracking of mission-critical reports throughout the world. Both of these ground systems use Perl scripts to process data and display it on the Internet, as well as scripts to coordinate many of the other systems that make these ground systems work as a coherent whole. Another task, called the Automated Multimodal Trend Analysis System (AMTAS), is looking at technology for isolation and recovery of spacecraft problems. This effort has led to prototypes that seek to evaluate various tools and technology that meet at least some of the AMTAS goals. The tools, experiences, and lessons learned by implementing these systems are described here.
A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James
2011-11-01
Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In the context of this method, we developed a software tool that provides graphing and data processing capabilities for the two performance data sets. The software tool, called SEE IT (Stanford Energy Efficiency Information Tool), eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
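A rough sketch of the source-independent plotting such a tool automates, using pandas and matplotlib; the file names and the zone_temp column are assumptions for illustration, not SEE IT's actual inputs:

```python
# Sketch: plot measured vs. simulated series for the same building point,
# regardless of which system produced each CSV.
import pandas as pd
import matplotlib.pyplot as plt

measured = pd.read_csv("measured.csv", index_col="timestamp", parse_dates=True)
simulated = pd.read_csv("simulated.csv", index_col="timestamp", parse_dates=True)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(measured["zone_temp"], label="measured")
ax1.plot(simulated["zone_temp"], label="simulated")
ax1.set_title("time series")
ax1.legend()

ax2.scatter(measured["zone_temp"], simulated["zone_temp"], s=5)
ax2.set_xlabel("measured")
ax2.set_ylabel("simulated")
ax2.set_title("scatter")
plt.tight_layout()
plt.show()
```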
SynopSIS: integrating physician sign-out with the electronic medical record.
Sarkar, Urmimala; Carter, Jonathan T; Omachi, Theodore A; Vidyarthi, Arpana R; Cucina, Russell; Bokser, Seth; van Eaton, Erik; Blum, Michael
2007-09-01
Safe delivery of care depends on effective communication among all health care providers, especially during transfers of care. The traditional medical chart does not adequately support such communication. We designed a patient-tracking tool that enhances provider communication and supports clinical decision making. Our objective was to develop a problem-based patient-tracking tool, called Sign-out, Information Retrieval, and Summary (SynopSIS), to support patient tracking, transfers of care (i.e., sign-outs), and daily rounds. The setting is a tertiary-care, university-based teaching hospital. SynopSIS compiles and organizes information from the electronic medical record to support hospital discharge and disposition decisions, daily provider decisions, and overnight or cross-coverage decisions. It reflects the provider's patient-care and daily work-flow needs. We plan to use Web-based surveys, audits of daily use, and interdisciplinary focus groups to evaluate SynopSIS's impact on communication between providers, quality of sign-out, patient continuity of care, and rounding efficiency. We expect SynopSIS to improve care by facilitating communication between care teams, standardizing sign-out, and automating daily review of clinical and laboratory trends. SynopSIS redesigns the clinical chart to better serve provider and patient needs. (c) 2007 Society of Hospital Medicine.
NASA Technical Reports Server (NTRS)
Schwartz, Susan K.
1992-01-01
The Solid Modeling Aerospace Research Tool (SMART) is a computer aided design tool used in aerospace vehicle design. Modeling of structural components using SMART includes the representation of the transverse or cross-wise elements of a vehicle's fuselage, ringframes, and bulkheads. Ringframes are placed along a vehicle's fuselage to provide structural support and maintain the shape of the fuselage. Bulkheads are also used to maintain shape, but are placed at locations where substantial structural support is required. Given a Bezier curve representation of a cross sectional cut through a vehicle's fuselage and/or an internal tank, this project produces a first-guess Bezier patch representation of a ringframe or bulkhead at the cross-sectional position. The grid produced is later used in the structural analysis of the vehicle. The graphical display of the generated patches allows the user to edit patch control points in real time. Constraints considered in the patch generation include maintaining 'square-like' patches and placement of longitudinal, or lengthwise along the fuselage, structural elements called longerons.
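For a flavor of the geometry involved, a minimal sketch of evaluating a tensor-product Bezier patch from a control net by repeated de Casteljau interpolation; the random 4x4 control net stands in for a generated ringframe or bulkhead patch:

```python
# Sketch: evaluate a Bezier patch at (u, v) from a control net.
import numpy as np

def de_casteljau(points, t):
    """Evaluate a 1-D Bezier curve at t by repeated linear interpolation."""
    pts = np.asarray(points, dtype=float)
    while len(pts) > 1:
        pts = (1 - t) * pts[:-1] + t * pts[1:]
    return pts[0]

def bezier_patch(net, u, v):
    """Tensor-product evaluation: reduce each row at u, then the column at v."""
    curve_points = [de_casteljau(row, u) for row in net]
    return de_casteljau(curve_points, v)

net = np.random.default_rng(2).normal(size=(4, 4, 3))  # 4x4 net of 3-D points
print(bezier_patch(net, 0.5, 0.25))
```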
Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)
NASA Astrophysics Data System (ADS)
Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.
2017-12-01
We have developed a system that supports the full life cycle of a data analysis process, from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system, called the Climate Model Diagnostic Analyzer (CMDA), is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle. The Data System manages datasets used by CMDA analysis tools, the Analysis System manages the CMDA analysis tools, which are all web services, the Provenance System manages the metadata of CMDA datasets and the provenance of CMDA analysis history, and the Recommendation System extracts knowledge from CMDA usage history and recommends datasets and analysis tools to users. These four subsystems are not only highly integrated but also easily expandable: new datasets can be added to the Data System and scanned to become visible to the other subsystems, and new analysis tools can be registered to become available in the Analysis System and Provenance System. With CMDA, a user can start a data analysis process by discovering datasets of relevance to their research topic using the Recommendation System. Next, the user can customize the discovered datasets for their scientific use (e.g., anomaly calculation, regridding) with tools in the Analysis System. Next, the user can do their analysis with the tools (e.g., conditional sampling, time averaging, spatial averaging) in the Analysis System. The user can then reanalyze the datasets based on previously stored analysis provenance in the Provenance System, and publish their analysis process and results to the Provenance System to share with other users. Finally, any user can reproduce a published analysis process and its results. By supporting the full life cycle of climate data analysis, CMDA improves the research productivity and collaboration level of its users.
Real-time access of large volume imagery through low-bandwidth links
NASA Astrophysics Data System (ADS)
Phillips, James; Grohs, Karl; Brower, Bernard; Kelly, Lawrence; Carlisle, Lewis; Pellechia, Matthew
2010-04-01
Providing current, time-sensitive imagery and geospatial information to deployed tactical military forces or first responders continues to be a challenge, compounded by rapid increases in sensor collection volumes from both larger arrays and higher temporal capture rates. Focusing on the needs of these military forces and first responders, ITT developed a system called AGILE (Advanced Geospatial Imagery Library Enterprise) Access, an innovative approach to this problem based on standard off-the-shelf techniques. The AGILE Access system is based on commercial software called Image Access Solutions (IAS) and incorporates standard JPEG 2000 processing. The solution is implemented in an accredited, deployable form, incorporating a suite of components, including an image database, a web-based search and discovery tool, and several software tools that act in concert to process, store, and disseminate imagery from airborne systems and commercial satellites. Currently, this solution is operational within the U.S. Government tactical infrastructure and supports disadvantaged imagery users in the field. This paper presents the features and benefits of this system to disadvantaged users as demonstrated in real-world operational environments.
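A rough illustration of why JPEG 2000 suits disadvantaged users: imagery can be served at progressively finer resolution levels as bandwidth allows. The sketch below fakes such levels with a plain Pillow pyramid, since its point is the access pattern rather than the codec:

```python
# Sketch: coarse-to-fine views of an image, so a thin link can send the
# smallest level first and refine later. A stand-in for true JPEG 2000
# resolution levels; the file name is illustrative.
from PIL import Image

def pyramid(path, levels=4):
    img = Image.open(path)
    out = [img]
    for _ in range(levels - 1):
        img = img.resize((max(1, img.width // 2), max(1, img.height // 2)))
        out.append(img)
    return out  # out[-1] is the cheapest preview to transmit

# levels = pyramid("scene.jp2")  # or any raster format Pillow can open
```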
NASA Technical Reports Server (NTRS)
Leveson, Nancy G.; Heimdahl, Mats P. E.; Reese, Jon Damon
1999-01-01
Previously, we defined a blackbox formal system modeling language called RSML (Requirements State Machine Language). The language was developed over several years while specifying the system requirements for a collision avoidance system for commercial passenger aircraft. During the language development, we received continual feedback and evaluation from FAA employees and industry representatives, which helped us to produce a specification language that is easily learned and used by application experts. Since the completion of the RSML project, we have continued our research on specification languages. This research is part of a larger effort to investigate the more general problem of providing tools to assist in developing embedded systems. Our latest experimental toolset is called SpecTRM (Specification Tools and Requirements Methodology), and the formal specification language is SpecTRM-RL (SpecTRM Requirements Language). This paper describes what we have learned from our use of RSML and how those lessons were applied to the design of SpecTRM-RL. We discuss our goals for SpecTRM-RL and the design features that support each of these goals.
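As a loose illustration of the blackbox state-machine style these languages formalize, a toy Python sketch with guarded transitions over monitored inputs; the states, guards, and thresholds are invented, not drawn from RSML or any real collision avoidance logic:

```python
# Sketch: a component's required behavior as a table of guarded transitions,
# in the spirit of a blackbox requirements state machine.
TRANSITIONS = {
    ("clear", lambda inp: inp["range_ft"] < 1000): "traffic_advisory",
    ("traffic_advisory", lambda inp: inp["range_ft"] < 400): "resolution_advisory",
    ("traffic_advisory", lambda inp: inp["range_ft"] >= 1000): "clear",
}

def step(state, inp):
    """Apply the first transition whose source state and guard both match."""
    for (src, guard), dst in TRANSITIONS.items():
        if src == state and guard(inp):
            return dst
    return state

state = "clear"
for reading in ({"range_ft": 5000}, {"range_ft": 800}, {"range_ft": 300}):
    state = step(state, reading)
    print(reading, "->", state)
```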
González Carrascosa, R; Bayo Montó, J L; Meneu Barreira, T; García Segovia, P; Martínez-Monzó, J
2011-01-01
The aim was to introduce and describe a new tool called UPV-FFQ to evaluate dietary intake in the university population. The new tool consists principally of a self-administered online food frequency questionnaire (FFQ). The UPV-FFQ has been developed as a set of web pages using ASP.NET 2.0 technology with a SQL Server 2005 database as support. The paper-and-pencil FFQ called "Dieta, salud y antropometría en la población universitaria" was used as the model for the FFQ. The tool has three parts: (1) a homepage, (2) a general questionnaire and (3) an FFQ. The FFQ has a closed list of 84 food items commonly consumed in the Valencia region. Respondents have to indicate the food items that they consume (2 possible options), the frequency of consumption (9 response options) and the quantity consumed (7 response options). The UPV-FFQ has approximately 250 color photographs representing three portion sizes; the photographs help respondents choose the portion size that best matches their habitual portion. The new tool provides quantitative information on habitual intake for 31 nutritional parameters and qualitative information from the general questionnaire. A pilot study was conducted with a total of 57 respondents, with a mean completion time of 15 minutes. The pilot study concluded that the questionnaire was easy to use, low in cost and time-effective. The format and the sequence of the questions were easily understood.
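To show the kind of computation behind such a questionnaire, a minimal sketch of turning frequency and portion responses into daily nutrient estimates; the food table, frequency factors, and nutrient densities are invented, not the UPV-FFQ database:

```python
# Sketch: daily intake = frequency/day x portion grams x nutrient per gram,
# summed over the foods a respondent reports.
FREQ_PER_DAY = {"never": 0.0, "1/week": 1 / 7, "1/day": 1.0, "3/day": 3.0}
FOODS = {  # nutrient density per gram (illustrative values)
    "paella": {"kcal": 1.56, "protein_g": 0.07},
    "orange": {"kcal": 0.47, "protein_g": 0.009},
}

def daily_intake(responses):
    """responses: food -> (frequency label, portion grams)."""
    totals = {}
    for food, (freq, grams) in responses.items():
        for nutrient, per_g in FOODS[food].items():
            totals[nutrient] = totals.get(nutrient, 0.0) + \
                FREQ_PER_DAY[freq] * grams * per_g
    return totals

print(daily_intake({"paella": ("1/week", 300), "orange": ("1/day", 150)}))
```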
Mirel, Barbara; Görg, Carsten
2014-04-26
A common class of biomedical analysis is to explore expression data from high throughput experiments for the purpose of uncovering functional relationships that can lead to a hypothesis about mechanisms of a disease. We call this analysis expression driven, -omics hypothesizing. In it, scientists use interactive data visualizations and read deeply in the research literature. Little is known, however, about the actual flow of reasoning and behaviors (sense making) that scientists enact in this analysis, end-to-end. Understanding this flow is important because if bioinformatics tools are to be truly useful they must support it. Sense making models of visual analytics in other domains have been developed and used to inform the design of useful and usable tools. We believe they would be helpful in bioinformatics. To characterize the sense making involved in expression-driven, -omics hypothesizing, we conducted an in-depth observational study of one scientist as she engaged in this analysis over six months. From findings, we abstracted a preliminary sense making model. Here we describe its stages and suggest guidelines for developing visualization tools that we derived from this case. A single case cannot be generalized. But we offer our findings, sense making model and case-based tool guidelines as a first step toward increasing interest and further research in the bioinformatics field on scientists' analytical workflows and their implications for tool design.
2014-01-01
A common class of biomedical analysis is to explore expression data from high throughput experiments for the purpose of uncovering functional relationships that can lead to a hypothesis about mechanisms of a disease. We call this analysis expression driven, -omics hypothesizing. In it, scientists use interactive data visualizations and read deeply in the research literature. Little is known, however, about the actual flow of reasoning and behaviors (sense making) that scientists enact in this analysis, end-to-end. Understanding this flow is important because if bioinformatics tools are to be truly useful they must support it. Sense making models of visual analytics in other domains have been developed and used to inform the design of useful and usable tools. We believe they would be helpful in bioinformatics. To characterize the sense making involved in expression-driven, -omics hypothesizing, we conducted an in-depth observational study of one scientist as she engaged in this analysis over six months. From findings, we abstracted a preliminary sense making model. Here we describe its stages and suggest guidelines for developing visualization tools that we derived from this case. A single case cannot be generalized. But we offer our findings, sense making model and case-based tool guidelines as a first step toward increasing interest and further research in the bioinformatics field on scientists’ analytical workflows and their implications for tool design. PMID:24766796
Challenges in Achieving Trajectory-Based Operations
NASA Technical Reports Server (NTRS)
Cate, Karen Tung
2012-01-01
In the past few years much of the global ATM research community has proposed advanced systems based on Trajectory-Based Operations (TBO). The concept of TBO uses four-dimensional aircraft trajectories as the base information for managing safety and capacity. Both the US and European advanced ATM programs call for the sharing of trajectory data across different decision support tools for successful operations. However, the actual integration of TBO systems presents many challenges. Trajectory predictors are built to meet the specific needs of a particular system and are not always compatible with each other. Two case studies are presented which examine the challenges of introducing a new concept into two legacy systems with regard to their trajectory prediction software. The first case describes the issues in integrating a new decision support tool with a legacy operational system whose domains overlap. These tools perform similar functions but are driven by different requirements, and the differences in the resulting trajectories can lead to conflicting advisories. The second case looks at integrating this same new tool with a legacy system originally developed as an integrated system, but which diverged many years ago. Both cases illustrate how the lack of common architecture concepts for the trajectory predictors added cost and complexity to the integration efforts.
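A toy sketch of the interoperability issue described above: two predictors given the same aircraft state can disagree on arrival time, so tools built on them can issue conflicting advisories. The model forms and all numbers are invented for illustration:

```python
# Sketch: two trajectory predictors disagree on time-to-fix for the same state.
def predictor_a(distance_nm, ground_speed_kt):
    """Legacy-style predictor: constant ground speed to the fix."""
    return distance_nm / ground_speed_kt * 3600  # seconds

def predictor_b(distance_nm, ground_speed_kt, descent_penalty_s=45.0):
    """Newer predictor: adds a modeled deceleration segment near the fix."""
    return distance_nm / ground_speed_kt * 3600 + descent_penalty_s

eta_a = predictor_a(80.0, 440.0)
eta_b = predictor_b(80.0, 440.0)
print(f"ETA disagreement: {abs(eta_a - eta_b):.0f} s")  # enough to flip an advisory
```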
New Human-Computer Interface Concepts for Mission Operations
NASA Technical Reports Server (NTRS)
Fox, Jeffrey A.; Hoxie, Mary Sue; Gillen, Dave; Parkinson, Christopher; Breed, Julie; Nickens, Stephanie; Baitinger, Mick
2000-01-01
The current climate of budget cuts has forced the space mission operations community to reconsider how it does business. Gone are the days of building one-of-a-kind control centers with teams of controllers working in shifts 24 hours per day, 7 days per week. Increasingly, automation is used to significantly reduce staffing needs. In some cases, missions are moving towards lights-out operations where the ground system is run semi-autonomously, and on-call operators are brought in only to resolve anomalies. Some operations concepts also call for smaller operations teams to manage an entire family of spacecraft. In the not too distant future, a skeleton crew of full-time general-knowledge operators will oversee the operations of large constellations of small spacecraft, while geographically distributed specialists will be assigned to emergency response teams based on their expertise. As the operations paradigms change, so too must the tools that support the mission operations team's tasks. Tools need to be built not only to automate routine tasks, but also to communicate varying types of information to the part-time, generalist, or on-call operators and specialists more effectively. Thus, the proper design of a system's user-system interface (USI) becomes even more important than before. Also, because users will be accessing these systems from various locations (e.g., control center, home, on the road) via different devices with varying display capabilities (e.g., workstations, home PCs, PDAs, pagers) over connections with various bandwidths (e.g., dial-up 56k, wireless 9.6k), the same software must have different USIs to support the different types of users, their equipment, and their environments. In other words, the software must now adapt to the needs of the users! This paper will focus on the needs and the challenges of designing USIs for mission operations. After providing a general discussion of these challenges, the paper will focus on the current efforts of creating an effective USI for one specific suite of tools, SERS (the Spacecraft Emergency Response System), which has been built to enable lights-out operations. SERS is a Web-based collaborative environment that enables secure distributed fault management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The open source Project Haystack initiative defines metadata and communication standards for data from buildings and intelligent devices. The Project Haystack REST API defines standard formats and operations for exchanging Haystack-tagged data over HTTP. The HaystackRuby gem wraps calls to this REST API to enable Ruby applications to easily integrate data hosted on a Project Haystack compliant server. The HaystackRuby gem was developed at the National Renewable Energy Lab to support applications related to campus energy. We hope that this tool may be useful to others.
2014-09-18
...based tool called KillerBee was released in 2009 that increases the exposure of ZigBee and other IEEE 802.15.4-based Wireless Personal Area Networks (WPANs)
DRIFTER Web App Development Support
NASA Technical Reports Server (NTRS)
Davis, Derrick D.; Armstrong, Curtis D.
2015-01-01
During my 2015 internship at Stennis Space Center (SSC) I supported the development of a web-based tool to enable user interaction with a low-cost environmental monitoring buoy called the DRIFTER. DRIFTERs are designed by SSC's Applied Science and Technology Projects branch and are used to measure parameters such as water temperature and salinity. Data collected by the buoys help verify measurements by NASA satellites, which contributes to NASA's mission to advance understanding of the Earth by developing technologies to improve the quality of life on our home planet. My main objective during this internship was to support the development of the DRIFTER by writing web-based software that allows the public to view and access data collected by the buoys. In addition, this software would enable DRIFTER owners to configure and control the devices.
Progress on automated data analysis algorithms for ultrasonic inspection of composites
NASA Astrophysics Data System (ADS)
Aldrin, John C.; Forsyth, David S.; Welter, John T.
2015-03-01
Progress is presented on the development and demonstration of automated data analysis (ADA) software to address the burden in interpreting ultrasonic inspection data for large composite structures. The automated data analysis algorithm is presented in detail, which follows standard procedures for analyzing signals for time-of-flight indications and backwall amplitude dropout. New algorithms have been implemented to reliably identify indications in time-of-flight images near the front and back walls of composite panels. Adaptive call criteria have also been applied to address sensitivity to variation in backwall signal level, panel thickness variation, and internal signal noise. ADA processing results are presented for a variety of test specimens that include inserted materials and discontinuities produced under poor manufacturing conditions. Software tools have been developed to support both ADA algorithm design and certification, producing a statistical evaluation of indication results and false calls using a matching process with predefined truth tables. Parametric studies were performed to evaluate detection and false call results with respect to varying algorithm settings.
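As an illustration of the backwall-dropout step described above, a small numpy sketch that flags scan positions where amplitude falls a fixed number of dB below a locally adapted reference; the window size and the 6 dB criterion are assumptions, not the program's actual settings:

```python
# Sketch: adaptive backwall amplitude dropout calls along a scan line.
import numpy as np

def dropout_calls(backwall_amp, window=25, drop_db=6.0):
    """Return indices where amplitude falls drop_db below the local median."""
    calls = []
    for i, amp in enumerate(backwall_amp):
        lo, hi = max(0, i - window), min(len(backwall_amp), i + window)
        local = np.median(backwall_amp[lo:hi])
        if 20 * np.log10(amp / local) < -drop_db:
            calls.append(i)
    return calls

scan = np.ones(200)
scan[90:96] = 0.3              # simulated flaw: roughly a 10 dB dropout
print(dropout_calls(scan))     # indices near 90..95
```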
Automatic welding detection by an intelligent tool pipe inspection
NASA Astrophysics Data System (ADS)
Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.
2015-07-01
This work provides a model based on machine learning techniques for weld recognition, based on signals obtained through an in-line inspection tool called a "smart pig" in oil and gas pipelines. The model uses a signal noise reduction phase by means of pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets, and its performance was measured with cross-validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
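A minimal sketch of the kind of pre-processing stage described, smoothing a noisy inspection trace before feature extraction; the Savitzky-Golay filter, its parameters, and the synthetic signal are assumptions, not the authors' exact pipeline:

```python
# Sketch: denoise an inspection signal before feeding a classifier.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(3)
raw = np.sin(np.linspace(0, 6 * np.pi, 500)) + 0.4 * rng.normal(size=500)

smoothed = savgol_filter(raw, window_length=31, polyorder=3)
# downstream: extract features from `smoothed` and feed the ANN/SVM classifier
print(raw.std(), smoothed.std())  # variance drops once noise is suppressed
```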
The PDA as a reference tool: libraries' role in enhancing nursing education.
Scollin, Patrick; Callahan, John; Mehta, Apurva; Garcia, Elizabeth
2006-01-01
"The PDA as a Reference Tool: The Libraries' Role in Enhancing Nursing Education" is a pilot project funded by the University of Massachusetts President's Office Information Technology Council through their Professional Development Grant program in 2004. The project's goal is to offer faculty and students in nursing programs at two University of Massachusetts campuses access to an array of medical reference information, such as handbooks, dictionaries, calculators, and diagnostic tools, on small handheld computers called personal digital assistants. Through exposure to the variety of information resources in this digital format, participants can discover and explore these resources at no personal financial cost. Participants borrow handhelds from the University Library's circulation desks. The libraries provide support in routine resynchronizing of handhelds to update information. This report will discuss how the projects were administered, what we learned about what did and did not work, the problems and solutions, and where we hope to go from here.
Status of the Combustion Devices Injector Technology Program at the NASA MSFC
NASA Technical Reports Server (NTRS)
Jones, Gregg; Protz, Christopher; Trinh, Huu; Tucker, Kevin; Nesman, Tomas; Hulka, James
2005-01-01
To support the NASA Space Exploration Mission, an in-house program called Combustion Devices Injector Technology (CDIT) is being conducted at the NASA Marshall Space Flight Center (MSFC) for fiscal year 2005. CDIT is focused on developing combustor technology and analysis tools to improve the reliability and durability of upper-stage and in-space liquid propellant rocket engines. The three areas of focus are injector/chamber thermal compatibility, ignition, and combustion stability. In the compatibility and ignition areas, small-scale single- and multi-element hardware experiments will be conducted to demonstrate advanced technological concepts as well as to provide experimental data for validation of computational analysis tools. In addition, advanced analysis tools will be developed to eventually include 3-dimensional and multi-element effects and to improve the capability and validity of analyses of heat transfer and ignition in large, multi-element injectors.
The X-windows interactive navigation data editor
NASA Technical Reports Server (NTRS)
Rinker, G. C.
1992-01-01
A new computer program called the X-Windows Interactive Data Editor (XIDE) was developed and demonstrated as a prototype application for editing radio metric data in the orbit-determination process. The program runs on a variety of workstations and employs pull-down menus and graphical displays, which allow users to easily inspect and edit radio metric data in the orbit data files received from the Deep Space Network (DSN). The XIDE program is based on the Open Software Foundation OSF/Motif Graphical User Interface (GUI) and has proven to be an efficient tool for editing radio metric data in the navigation operations environment. It was adopted by the Magellan Navigation Team as their primary data-editing tool. Because the software was designed from the beginning to be portable, the prototype was successfully moved to new workstation environments. It was also integrated into the design of the next-generation software tool for DSN multimission navigation interactive launch support.
Evaluating Variant Calling Tools for Non-Matched Next-Generation Sequencing Data
NASA Astrophysics Data System (ADS)
Sandmann, Sarah; de Graaf, Aniek O.; Karimi, Mohsen; van der Reijden, Bert A.; Hellström-Lindberg, Eva; Jansen, Joop H.; Dugas, Martin
2017-02-01
Valid variant calling results are crucial for the use of next-generation sequencing in clinical routine. However, there are numerous variant calling tools that usually differ in algorithms, filtering strategies, recommendations and thus, also in the output. We evaluated eight open-source tools regarding their ability to call single nucleotide variants and short indels with allelic frequencies as low as 1% in non-matched next-generation sequencing data: GATK HaplotypeCaller, Platypus, VarScan, LoFreq, FreeBayes, SNVer, SAMtools and VarDict. We analysed two real datasets from patients with myelodysplastic syndrome, covering 54 Illumina HiSeq samples and 111 Illumina NextSeq samples. Mutations were validated by re-sequencing on the same platform, by re-sequencing on a different platform, and by expert-based review. In addition, we considered two simulated datasets with varying coverage and error profiles, covering 50 samples each. In all cases an identical target region consisting of 19 genes (42,322 bp) was analysed. Altogether, no tool succeeded in calling all mutations. High sensitivity was always accompanied by low precision. The influence of varying coverage and background noise on variant calling was generally low. Taking everything into account, VarDict performed best. However, our results indicate that there is a need to improve the reproducibility of results in the context of multithreading.
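The per-tool evaluation described above reduces to comparing each caller's output against the validated mutation set. A small illustration of the sensitivity/precision bookkeeping; the variant keys and values are hypothetical, not the study's data:

```python
# Variants keyed by (chromosome, position, ref allele, alt allele).
validated = {("chr1", 115258747, "C", "T"), ("chr2", 25457242, "G", "A")}
called = {("chr1", 115258747, "C", "T"), ("chr7", 148508727, "T", "C")}

tp = len(called & validated)   # validated mutations the tool found
fn = len(validated - called)   # validated mutations it missed
fp = len(called - validated)   # calls not confirmed by re-sequencing/review

sensitivity = tp / (tp + fn)
precision = tp / (tp + fp)
print(f"sensitivity={sensitivity:.2f} precision={precision:.2f}")
```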
A group communication approach for mobile computing mobile channel: An ISIS tool for mobile services
NASA Astrophysics Data System (ADS)
Cho, Kenjiro; Birman, Kenneth P.
1994-05-01
This paper examines group communication as an infrastructure to support mobility of users, and presents a simple scheme to support user mobility by switching a control point between replicated servers. We describe the design and implementation of a set of tools, called Mobile Channel, for use with the ISIS system. Mobile Channel is based on a combination of two replication schemes: the primary-backup approach and the state machine approach. Mobile Channel implements a reliable one-to-many FIFO channel, in which a mobile client sees a single reliable server; the servers, acting as a state machine, see multicast messages from clients. Migrations of mobile clients are handled as an intentional primary switch, and hand-offs or server failures are completely masked from mobile clients. To achieve high performance, servers are replicated at a sliding-window level. Our scheme provides a simple abstraction of migration, eliminates complicated hand-off protocols, provides fault tolerance, and is implemented within the existing group communication mechanism.
DARPA DTN Phase 3 Core Engineering Support
NASA Technical Reports Server (NTRS)
Torgerson, J. Leigh; Borgen, Richard; McKelvey, James; Segui, John; Tsao, Phil
2010-01-01
This report covers the initial DARPA DTN Phase 3 activities as JPL provided Core Engineering Support to the DARPA DTN Program, and then further details the culmination of the Phase 3 Program with a systematic development, integration and test of a disruption-tolerant C2 Situation Awareness (SA) system that may be transitioned to the USMC and deployed in the near future. The system developed and tested was a SPAWAR/JPL-Developed Common Operating Picture Fusion Tool called the Software Interoperability Environment (SIE), running over Disruption Tolerant Networking (DTN) protocols provided by BBN and MITRE, which effectively extends the operational range of SIE from normal fully-connected internet environments to the mobile tactical edges of the battlefield network.
Automated simulation as part of a design workstation
NASA Technical Reports Server (NTRS)
Cantwell, E.; Shenk, T.; Robinson, P.; Upadhye, R.
1990-01-01
A development project for a design workstation for advanced life-support systems incorporating qualitative simulation required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulations, such that simulation capabilities are maximized without duplication. The reason is that, to produce design solutions to a system goal, the behavior of the system in both steady and perturbed states must be represented. The paper reports on the Qualitative Simulation Tool (QST), on an expert-system-like model building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.
Embedded sensor systems for health - providing the tools in future healthcare.
Lindén, Maria; Björkman, Mats
2014-01-01
Wearable, embedded sensor systems for health applications are foreseen to be enablers of future healthcare. They will provide ubiquitous monitoring of multiple parameters without requiring the person to stay at home or in the hospital. By following trend changes in health status, early deterioration can be detected and treatment can start earlier; preventive health care will also be supported. Such future healthcare requires technology development, including miniaturized sensors, smart textiles and wireless communication. The tremendous amount of data generated by these systems calls for both signal processing and decision support to guarantee the quality of data and avoid overflow of information. Safe and secure communications have to protect the integrity of the persons monitored.
The Neighborhood Auditing Tool: a hybrid interface for auditing the UMLS.
Morrey, C Paul; Geller, James; Halper, Michael; Perl, Yehoshua
2009-06-01
The UMLS's integration of more than 100 source vocabularies, not necessarily consistent with one another, causes some inconsistencies. The purpose of auditing the UMLS is to detect such inconsistencies and to suggest how to resolve them while observing the requirement of fully representing the content of each source in the UMLS. A software tool, called the Neighborhood Auditing Tool (NAT), that facilitates UMLS auditing is presented. The NAT supports "neighborhood-based" auditing, where, at any given time, an auditor concentrates on a single-focus concept and one of a variety of neighborhoods of its closely related concepts. Typical diagrammatic displays of concept networks have a number of shortcomings, so the NAT utilizes a hybrid diagram/text interface that features stylized neighborhood views which retain some of the best features of both the diagrammatic layouts and text windows while avoiding the shortcomings. The NAT allows an auditor to display knowledge from both the Metathesaurus (concept) level and the Semantic Network (semantic type) level. Various additional features of the NAT that support the auditing process are described. The usefulness of the NAT is demonstrated through a group of case studies. Its impact is tested with a study involving a select group of auditors.
Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment
NASA Technical Reports Server (NTRS)
Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.
2007-01-01
Space science models are an essential component of an integrated data environment. They are indispensable tools for facilitating effective use of a wide variety of distributed scientific sources and for placing multi-point local measurements into global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them to the required format. The CCMC provides a tailored web-based visualization interface for the model output, as well as the capability to download simulation output in a portable standard format with comprehensive metadata, along with a user-friendly library of model output analysis routines that can be called from any language that supports C calls. The CCMC is developing data interpolation tools that make it possible to present model output in the same format as observations. The CCMC invites community comments and suggestions to better address science needs for the integrated data environment.
Automated support for experience-based software management
NASA Technical Reports Server (NTRS)
Valett, Jon D.
1992-01-01
To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions, the tool utilizes a vast corporate memory that includes a database of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.
Using Google Earth for Submarine Operations at Pavilion Lake
NASA Astrophysics Data System (ADS)
Deans, M. C.; Lees, D. S.; Fong, T.; Lim, D. S.
2009-12-01
During the July 2009 Pavilion Lake field test, we supported submarine "flight" operations using Google Earth. The Intelligent Robotics Group at NASA Ames has experience with ground data systems for NASA missions, earth analog field tests, disaster response, and the Gigapan camera system. Leveraging this expertise and existing software, we put together a set of tools to support sub tracking and mapping, called the "Surface Data System." This system supports flight planning, real time flight operations, and post-flight analysis. For planning, we make overlays of the regional bedrock geology, sonar bathymetry, and sonar backscatter maps that show geology, depth, and structure of the bottom. Placemarks show the mooring locations for start and end points. Flight plans are shown as polylines with icons for waypoints. Flight tracks and imagery from previous field seasons are embedded in the map for planning follow-on activities. These data provide context for flight planning. During flights, sub position is updated every 5 seconds from the nav computer on the chase boat. We periodically update tracking KML files and refresh them with network links. A sub icon shows current location of the sub. A compass rose shows bearings to indicate heading to the next waypoint. A "Science Stenographer" listens on the voice loop and transcribes significant observations in real time. Observations called up to the surface immediately appear on the map as icons with date, time, position, and what was said. After each flight, the science back room immediately has the flight track and georeferenced notes from the pilots. We add additional information in post-processing. The submarines record video continuously, with "event" timestamps marked by the pilot. We cross-correlate the event timestamps with position logs to geolocate events and put a preview image and compressed video clip into the map. Animated flight tracks are also generated, showing timestamped position and providing timelapse playback of the flight. Neogeography tools are increasing in popularity and offer an excellent platform for geoinformatics. The scientists on the team are already familiar with Google Earth, eliminating up-front training on new tools. The flight maps and archived data are available immediately and in a usable format. Google Earth provides lots of measurement tools, annotation tools, and other built-in functions that we can use to create and analyze the map. All of this information is saved to a shared filesystem so that everyone on the team has access to all of the same map data. After the field season, the map data will be used by the team to analyse and correlate information from across the lake and across different flights to support their research, and to plan next year's activities.
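The tracking mechanism described above can be approximated with a small script that periodically rewrites a placemark file, assuming a master KML in Google Earth contains a <NetworkLink> with a refresh interval pointing at that file. A sketch under those assumptions; the file name, sub name, and coordinates are hypothetical:

```python
import time

KML = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Sub {name}</name>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>
</kml>"""

def write_track_file(path, name, lon, lat):
    # Google Earth re-reads this file through a <NetworkLink> refresh,
    # so overwriting it moves the sub icon on the map.
    with open(path, "w") as f:
        f.write(KML.format(name=name, lon=lon, lat=lat))

# Hypothetical polling loop: take the latest nav fix every 5 seconds.
for lon, lat in [(-121.74, 50.86), (-121.75, 50.87)]:
    write_track_file("sub_track.kml", "Pavilion-1", lon, lat)
    time.sleep(5)
```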
Politi, Mary C; Barker, Abigail R; Kaphingst, Kimberly A; McBride, Timothy; Shacham, Enbal; Kebodeaux, Carey S
2016-02-16
The implementation of the ACA has improved access to quality health insurance, a necessary first step to improving health outcomes. However, access must be supplemented by education to help individuals make informed choices for plans that meet their individual financial and health needs. Drawing on a model of information processing and on prior research, we developed a health insurance decision support tool called Show Me My Health Plans. Developed with extensive stakeholder input, the current tool (1) simplifies information through plain language and graphics in an educational component; (2) assesses and reviews knowledge interactively to ensure comprehension of key material; (3) incorporates individual and/or family health status to personalize out-of-pocket cost estimates; (4) assesses preferences for plan features; and (5) helps individuals weigh information appropriate to their interests and needs through a summary page with "good fit" plans generated from a tailored algorithm. The current study will evaluate whether the online decision support tool improves health insurance decisions compared to a usual care condition (the healthcare.gov marketplace website). The trial will include 362 individuals (181 in each group) from rural, suburban, and urban settings within a 90-mile radius around St. Louis. Eligibility criteria include English-speaking individuals 18-64 years old who are eligible for the ACA marketplace plans. They will be computer-randomized to view the intervention or usual care condition. Presenting individuals with options that they can understand, tailored to their needs and preferences, could help improve decision quality. By helping individuals narrow down the complexity of health insurance plan options, decision support tools such as this one could prepare individuals to better navigate enrollment in a plan that meets their individual needs. The randomized trial was registered in clinicaltrials.gov (NCT02522624) on August 6, 2015.
SeqMule: automated pipeline for analysis of human exome/genome sequencing data.
Guo, Yunfei; Ding, Xiaolei; Shen, Yufeng; Lyon, Gholson J; Wang, Kai
2015-09-18
Next-generation sequencing (NGS) technology has greatly helped us identify disease-contributory variants for Mendelian diseases. However, users are often faced with issues such as software incompatibility, complicated configuration, and lack of access to high-performance computing facilities, and discrepancies exist among aligners and variant callers. We developed a computational pipeline, SeqMule, to perform automated variant calling from NGS data on human genomes and exomes. SeqMule integrates computational-cluster-free parallelization capability built on top of the variant callers and facilitates normalization/intersection of variant calls to generate a consensus set with high confidence. SeqMule integrates 5 alignment tools and 5 variant calling algorithms and accepts various combinations, all by a one-line command, therefore allowing highly flexible yet fully automated variant calling. On a modern machine (2 Intel Xeon X5650 CPUs, 48 GB memory), when fast turn-around is needed, SeqMule generates annotated VCF files in a day from a 30X whole-genome sequencing data set; when more accurate calling is needed, SeqMule generates a consensus call set that improves over single callers, as measured by both Mendelian error rate and consistency. SeqMule supports Sun Grid Engine for parallel processing, offers a turn-key solution for deployment on Amazon Web Services, and allows quality checks, Mendelian error checks, consistency evaluation, and HTML-based reports. SeqMule is available at http://seqmule.openbioinformatics.org.
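The consensus idea SeqMule implements (intersecting normalized calls from several callers) can be illustrated generically; this is not SeqMule's code, and the call sets and majority threshold below are hypothetical:

```python
from collections import Counter

# Hypothetical call sets from three callers, keyed by (chrom, pos, ref, alt).
calls = {
    "gatk":      {("chr1", 100, "A", "G"), ("chr1", 200, "C", "T")},
    "samtools":  {("chr1", 100, "A", "G"), ("chr1", 300, "G", "A")},
    "freebayes": {("chr1", 100, "A", "G"), ("chr1", 200, "C", "T")},
}

# Count how many callers report each variant.
counts = Counter(v for callset in calls.values() for v in callset)

# Keep variants reported by a majority of callers (2 of 3 here).
consensus = {v for v, n in counts.items() if n >= 2}
print(sorted(consensus))
```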
The SCEC Community Modeling Environment(SCEC/CME): A Collaboratory for Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Maechling, P. J.; Jordan, T. H.; Minster, J. B.; Moore, R.; Kesselman, C.
2005-12-01
The SCEC Community Modeling Environment (SCEC/CME) Project is an NSF-supported Geosciences/IT partnership that is actively developing an advanced information infrastructure for system-level earthquake science in Southern California. This partnership includes SCEC, USC's Information Sciences Institute (ISI), the San Diego Supercomputer Center (SDSC), the Incorporated Research Institutions for Seismology (IRIS), and the U.S. Geological Survey. The goal of the SCEC/CME is to develop seismological applications and information technology (IT) infrastructure to support the development of Seismic Hazard Analysis (SHA) programs and other geophysical simulations. The SHA application programs developed on the Project include a Probabilistic Seismic Hazard Analysis system called OpenSHA. OpenSHA computational elements that are currently available include a collection of attenuation relationships and several Earthquake Rupture Forecasts (ERFs). Geophysicists in the collaboration have also developed Anelastic Wave Models (AWMs) using both finite-difference and finite-element approaches. Earthquake simulations using these codes have been run for a variety of earthquake sources. Rupture Dynamic Model (RDM) codes have also been developed that simulate friction-based fault slip. The SCEC/CME collaboration has also developed IT software and hardware infrastructure to support the development, execution, and analysis of these SHA programs. To support computationally expensive simulations, we have constructed a grid-based scientific workflow system. Using the SCEC grid, project collaborators can submit computations from the SCEC/CME servers to High Performance Computers at USC and TeraGrid High Performance Computing Centers. Data generated and archived by the SCEC/CME are stored in a digital library system, the Storage Resource Broker (SRB). This system provides a robust and secure way of maintaining the association between the data sets and their metadata. To provide an easy-to-use system for constructing SHA computations, a browser-based workflow assembly web portal has been developed. Users can compose complex SHA calculations, specifying SCEC/CME data sets as inputs and calling SCEC/CME computational programs to process the data and the output. Knowledge-based software tools have been implemented that utilize ontological descriptions of SHA software and data to validate workflows created with this pathway assembly tool. Data visualization software developed by the collaboration supports analysis and validation of data sets. Several programs have been developed to visualize SCEC/CME data, including GMT-based map-making software for PSHA codes, 4D wavefield propagation visualization software based on OpenGL, and 3D Geowall-based visualization of earthquakes, faults, and seismic wave propagation. The SCEC/CME Project also helps to sponsor the SCEC UseIT Intern program. The UseIT Intern Program provides research opportunities in both Geosciences and Information Technology to undergraduate students in a variety of fields. The UseIT group has developed a 3D data visualization tool, called SCEC-VDO, as a part of this undergraduate research program.
NASA Astrophysics Data System (ADS)
Hart, D. M.; Merchant, B. J.; Abbott, R. E.
2012-12-01
The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing everything about a test in a relational database that is required to reproduce the results of that test. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data were collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude, and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.
Sullivan-Bolyai, Susan; Bova, Carol; Leung, Katherine; Trudeau, Allison; Lee, Mary; Gruppuso, Philip
2010-01-01
The purpose of this study was to test the efficacy of a social support intervention with parents of children <13 years old newly diagnosed with type 1 diabetes mellitus (T1DM). For this randomized, controlled clinical trial, 10 parent mentors of children diagnosed with T1DM for ≥1 year and 60 parent participants were recruited from 2 pediatric diabetes centers. Mentors were trained to provide social support (home visits and phone calls) for 12 months to families in the experimental arm (32 mothers). Control group parents (28 mothers) received the phone number of an experienced parent (not trained to give social support) to call as needed. Findings: Mothers in the experimental and control arms differed at baseline only in birth order of the child with T1DM. The 2 groups did not differ significantly at 3, 6, or 12 months in parent concern, confidence, worry, impact on the family, or perceived social support. Mothers in the experimental arm identified the parent mentor as someone they would seek out for advice on issues regarding growth and development, sleep, eating habits, and identification of community agencies. Parent mentors consistently referred mothers to health care providers for advice on medications and treatments but helped them incorporate this advice into day-to-day management. Mothers in the experimental arm valued the mentors' help in adjusting to the diagnosis, but this value was not measured by the study instruments. Focus group research is under way to clarify the concept of parent mentor social support and to develop a social support measurement tool.
NASA Technical Reports Server (NTRS)
Mainger, Steve
2004-01-01
As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Research into methods for overcoming some of these selected hurdles has been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), which responded with the development of the Airspace Concept Evaluation System (ACES). When one examines the ACES environment from a communication, navigation, or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities that support the ACES environment. As part of this support, NASA GRC initiated the development of a communications traffic loading analysis tool called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS). This tool allows forecasting of communications load, with the understanding that there is no single, common source for loading models used to evaluate the existing and planned communications channels, and that consensus and accuracy in the traffic load models are very important inputs to the decisions being made on the acceptability of communication techniques used to fulfill aeronautical requirements. Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion, or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionic systems be of higher fidelity and better consistency than is present in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS, with emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of the CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.
NASA Astrophysics Data System (ADS)
Basista, A.
2013-12-01
There are many tools for managing spatial data. They are called Geographic Information Systems (GIS), and apart from visualizing data in space, they let users perform various spatial analyses. Thanks to them, it is possible to obtain additional information that is essential for real estate market analysis. Much scientific research presents the use of GIS for future mass valuation, because advanced tools are necessary to manage the huge real estate data sets gathered for mass valuation. In practice, appraisers rarely use these tools for single valuations, because few GIS tools are available to support real estate valuation. The paper presents the functionality of a geoinformatic subsystem that is used to support real estate market analysis and real estate valuation. A detailed description is given of the process of entering attributes into the database and calculating attribute values based on the proposed definition of attribute scales. This work also presents the algorithm for selecting similar properties that was implemented within the described subsystem. The main stage of this algorithm is the calculation of a price-creative indicator for each real estate, using its attribute values. The set of properties chosen in this way is visualized on the map. The geoinformatic subsystem is used for undeveloped real estate and residential premises. Geographic Information System software was used to develop this project: the basic functionality of gvSIG software (open source software) was extended, and some extra functions were added to support real estate market analysis.
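A toy illustration of the similar-property selection step: score each parcel with a weighted sum of attribute values as a stand-in for the price-creative indicator, then pick the parcels whose scores are closest to the subject's. The attribute names, scales, and weights are hypothetical, not the subsystem's actual definitions:

```python
# Hypothetical attribute weights on ordinal scales (e.g., 1-5).
weights = {"location": 0.4, "area": 0.3, "utilities": 0.2, "shape": 0.1}

def indicator(parcel):
    """Stand-in for the price-creative indicator: weighted attribute sum."""
    return sum(weights[a] * parcel[a] for a in weights)

subject = {"location": 4, "area": 3, "utilities": 5, "shape": 2}
market = [
    {"id": 1, "location": 4, "area": 2, "utilities": 5, "shape": 3},
    {"id": 2, "location": 1, "area": 5, "utilities": 2, "shape": 1},
    {"id": 3, "location": 5, "area": 3, "utilities": 4, "shape": 2},
]

# Select the two parcels most similar to the subject by indicator distance.
target = indicator(subject)
similar = sorted(market, key=lambda p: abs(indicator(p) - target))[:2]
print([p["id"] for p in similar])
```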
de Leon, Ray D; Dy, Christine J
2017-05-01
Body weight-supported treadmill training (BWSTT) developed from animal studies of spinal cord injury (SCI). Evidence that spinal cats (i.e., cats that have a complete surgical transection of the cord) could regain the ability to step on a moving treadmill indicated a vast potential for spinal circuits to generate walking without the brain. BWSTT represented a means to unlock that potential. As the technique was adapted as a rehabilitation intervention for humans with SCI, shortcomings in the translation to walking in the real world were exposed. Evidence that BWSTT has not been as successful for humans with SCI leads us to revisit key animal studies. In this short review, we describe the task-specific nature of BWSTT and discuss how this specificity may pose limits on the recovery of overground walking. Also discussed are more recent studies that have introduced new strategies and tools that adapt BWSTT ideas to more functionally-relevant tasks. We introduce a new device for weight-supported overground walking in rats called Circular BART (Body weight supported Ambulatory Rat Trainer) and demonstrate that it is relatively easy and inexpensive to produce. Future animal studies will benefit from the development of simple tools that facilitate training and testing of overground walking.
ERIC Educational Resources Information Center
Heys, Chris
2008-01-01
Excel, Microsoft's spreadsheet program, offers several tools which have proven useful in solving some optimization problems that arise in operations research. We will look at two such tools, the Excel modules called Solver and Goal Seek, after first deriving an equation, called the "cash accumulation equation", to be used in conjunction with them.
Carney, Timothy Jay; Morgan, Geoffrey P.; Jones, Josette; McDaniel, Anna M.; Weaver, Michael; Weiner, Bryan; Haggstrom, David A.
2014-01-01
Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health care (CHC) setting. We employed a dual modeling technique, using both statistical and computational modeling to evaluate impact. Our statistical model used the Spearman's Rho test to evaluate the strength of the relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS as measured by the rate of organizational learning. We employed previously collected survey data from the community health centers Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability. PMID: 24953241
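The statistical half of the dual-modeling design is a rank correlation between the proximal and distal measures; a minimal sketch with SciPy, using made-up facility-level numbers:

```python
from scipy.stats import spearmanr

# Hypothetical facility-level data: CDS utilization score (proximal) vs.
# self-reported cancer screening improvement (distal), one pair per center.
cds_utilization = [2, 5, 3, 8, 7, 1, 6, 4]
screening_improvement = [1, 4, 2, 7, 8, 1, 5, 3]

rho, p = spearmanr(cds_utilization, screening_improvement)
print(f"Spearman's rho = {rho:.2f}, p = {p:.3f}")
```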
Radiological Operations Support Specialist (ROSS) Pilot Course Summary and Recommendations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alai, M.; Askin, A.; Buddemeier, B.
In support of the Department of Homeland Security / Science and Technology Directorate's (DHS/S&T) creation of a new position called the Radiological Operations Support Specialist (ROSS), Lawrence Livermore National Laboratory (LLNL), in Sub-tasks 1.1 and 1.2, has assisted in the development of the ROSS skills, knowledge, and abilities (SKAs); identified potentially relevant training; cross-mapped the training to the SKAs; and identified gaps in the training related to the SKAs, as well as their respective level of training knowledge, current versus desired. In the follow-on task, Sub-task 1.3, a 5-day ROSS Pilot Training course was developed to fill the priority gaps identified in Sub-task 1.2. Additionally, in Sub-task 1.5, LLNL has performed a gap analysis of electronic tools, handbooks, and job aides currently available to the ROSS and developed recommendations for additional and next-generation tools to ensure the operational effectiveness of the ROSS position. This document summarizes the feedback received from the instructors and pilot course observers on what worked in the course and what could be improved, as well as an assessment of the Pre- and Post-Test administered to the students.
RCQ-GA: RDF Chain Query Optimization Using Genetic Algorithms
NASA Astrophysics Data System (ADS)
Hogenboom, Alexander; Milea, Viorel; Frasincar, Flavius; Kaymak, Uzay
The application of Semantic Web technologies in an Electronic Commerce environment implies a need for good support tools. Fast query engines are needed for efficient querying of large amounts of data, usually represented using RDF. We focus on optimizing a special class of SPARQL queries, the so-called RDF chain queries. For this purpose, we devise a genetic algorithm called RCQ-GA that determines the order in which joins need to be performed for an efficient evaluation of RDF chain queries. The approach is benchmarked against a two-phase optimization algorithm previously proposed in the literature. The more complex a query is, the more RCQ-GA outperforms the benchmark in solution quality, execution time needed, and consistency of solution quality. When the algorithms are constrained by a time limit, the overall performance of RCQ-GA compared to the benchmark further improves.
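The join-order search RCQ-GA performs can be illustrated with a toy permutation-encoded genetic algorithm; the random pairwise cost matrix, the operators, and all parameters below are an illustrative sketch in the spirit of the approach, not the paper's actual cost model or settings:

```python
import random

random.seed(1)
N = 6  # number of joins in the chain query
# Toy pairwise cost of executing join b right after join a.
COST = [[random.randint(1, 100) for _ in range(N)] for _ in range(N)]

def cost(order):
    # Cost of a join order: sum of costs of consecutive joins
    # (a stand-in for a real cardinality-based cost model).
    return sum(COST[a][b] for a, b in zip(order, order[1:]))

def crossover(p1, p2):
    # Order crossover (OX): keep a slice of p1, fill the rest from p2,
    # so the child remains a valid permutation.
    i, j = sorted(random.sample(range(N), 2))
    middle = p1[i:j]
    rest = [g for g in p2 if g not in middle]
    return rest[:i] + middle + rest[i:]

def mutate(order, rate=0.2):
    # Swap mutation also preserves the permutation property.
    if random.random() < rate:
        i, j = random.sample(range(N), 2)
        order[i], order[j] = order[j], order[i]
    return order

population = [random.sample(range(N), N) for _ in range(30)]
for _ in range(50):
    population.sort(key=cost)
    parents = population[:10]  # truncation selection with elitism
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(20)]
    population = parents + children

best = min(population, key=cost)
print("best join order:", best, "cost:", cost(best))
```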
Towards a flexible middleware for context-aware pervasive and wearable systems.
Muro, Marco; Amoretti, Michele; Zanichelli, Francesco; Conte, Gianni
2012-11-01
Ambient intelligence and wearable computing call for innovative hardware and software technologies, including a highly capable, flexible and efficient middleware allowing for the reuse of existing pervasive applications when developing new ones. In the considered application domain, middleware should also support self-management, interoperability among different platforms, efficient communications, and context awareness. In the on-going "everything is networked" scenario, scalability appears as a very important issue, for which the peer-to-peer (P2P) paradigm emerges as an appealing solution for connecting software components in an overlay network, allowing for efficient and balanced data distribution mechanisms. In this paper, we illustrate how all these concepts can be placed into a theoretical tool, called the networked autonomic machine (NAM), implemented in a NAM-based middleware, and evaluated against practical problems of pervasive computing.
NASA Astrophysics Data System (ADS)
Faqih, A.
2017-03-01
Providing information about future climate scenarios is very important in climate change studies. Climate scenarios can be used as basic information to support adaptation and mitigation studies. In order to deliver future climate scenarios over a specific region, baseline and projection data from the outputs of global climate models (GCMs) are needed. However, due to their coarse resolution, the data have to be downscaled and bias-corrected in order to obtain scenario data with better spatial resolution that match the characteristics of the observed data. Generating these downscaled data is difficult for scientists who do not have a specific background, experience, and skill in dealing with the complex GCM outputs. In this regard, it is necessary to develop a tool that simplifies the downscaling process in order to help scientists, especially in Indonesia, generate future climate scenario data for their climate change-related studies. In this paper, we introduce a tool called "Statistical Bias Correction for Climate Scenarios (SiBiaS)". The tool is specially designed to facilitate the use of CMIP5 GCM data outputs and process their statistical bias corrections relative to reference observation data. It was prepared to support capacity building in climate modeling in Indonesia as part of the Indonesia 3rd National Communication (TNC) project activities.
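One common form of statistical bias correction is empirical quantile mapping, which maps each model value to the observed value at the same quantile; a minimal sketch with synthetic data (SiBiaS's actual method may differ):

```python
import numpy as np

rng = np.random.default_rng(42)
obs = rng.gamma(2.0, 5.0, size=3000)         # observed climate variable
gcm_hist = rng.gamma(2.5, 4.0, size=3000)    # GCM over the same baseline period
gcm_future = rng.gamma(2.5, 4.5, size=3000)  # GCM projection to correct

def quantile_map(x, model_ref, obs_ref):
    """Map each model value to the observed value at the same quantile."""
    quantiles = np.searchsorted(np.sort(model_ref), x) / len(model_ref)
    quantiles = np.clip(quantiles, 0.0, 1.0)
    return np.quantile(obs_ref, quantiles)

corrected = quantile_map(gcm_future, gcm_hist, obs)
print("raw mean %.1f -> corrected mean %.1f (obs mean %.1f)"
      % (gcm_future.mean(), corrected.mean(), obs.mean()))
```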
Greased Lightning (GL-10) Performance Flight Research: Flight Data Report
NASA Technical Reports Server (NTRS)
McSwain, Robert G.; Glaab, Louis J.; Theodore, Colin R.; Rhew, Ray D. (Editor); North, David D. (Editor)
2017-01-01
Modern aircraft design methods have produced acceptable performance predictions for large conventional aircraft. With revolutionary electric propulsion technologies fueled by the growth in the small UAS (Unmanned Aerial Systems) industry, these same prediction models are being applied to new, smaller, experimental design concepts requiring a VTOL (Vertical Take Off and Landing) capability for ODM (On Demand Mobility). A 50% sub-scale GL-10 flight model was built and tested to demonstrate the transition from hover to forward flight utilizing DEP (Distributed Electric Propulsion) [1][2]. In 2016, plans were put in place to conduct performance flight testing on the 50% sub-scale GL-10 flight model to support a NASA project called DELIVER (Design Environment for Novel Vertical Lift Vehicles). DELIVER was investigating the feasibility of including smaller and more experimental aircraft configurations in a NASA design tool called NDARC (NASA Design and Analysis of Rotorcraft) [3]. This report covers the performance flight data collected during flight testing of the GL-10 50% sub-scale flight model conducted at Beaver Dam Airpark, VA. Overall, the flight test data provide great insight into how well our existing conceptual design tools predict the performance of small-scale experimental DEP concepts. Low-fidelity conceptual design tools estimated the (L/D)max of the GL-10 50% sub-scale flight model to be 16; the experimentally measured (L/D)max was 7.2. The gap between predicted and measured aerodynamic performance highlights the complexity of wing and nacelle interactions, which is not currently accounted for in existing low-fidelity tools.
From Systematic Review to Call for Action.
Sawin, Erika Metzler; Sobel, Linda L; Annan, Sandra L; Schminkey, Donna L
2017-06-01
Intimate partner violence (IPV) is a global public health and criminal justice concern with significant impacts; especially high rates are seen among rural Hispanic American (HA) communities, the fastest growing population in the United States. These communities experience additional barriers to care, including extreme poverty, less education, gender norms, and language and immigration issues. A systematic literature review was conducted using Cooper's framework to identify evidence supporting associations between interventions and the prevention, reduction, and elimination of IPV among rural HA women. Searches conducted on databases including CINAHL, PubMed, Medline, Women's Studies International, MedicLatina, and JSTOR used the MeSH terms Hispanic Americans (Latino/a and Hispanic), domestic violence, and intimate partner violence. Selected studies were published between January 1, 2000, and January 1, 2014. Of the 617 articles yielded, only 6 met the inclusion criteria. Of these, none closely examined rurality or provided valid and reliable measures of outcomes, instead reporting program descriptions and suggested interventions. We identify key findings to guide program, screening, and tool development. Our study identifies a gap in knowledge, research, and effective practices and issues a call for action to create evidence-based tools to prevent, reduce, and eliminate IPV in these underserved populations.
Peer-supported review of teaching: an evaluation.
Thampy, Harish; Bourke, Michael; Naran, Prasheena
2015-09-01
Peer-supported review (also called peer observation) of teaching is a commonly implemented method of ascertaining teaching quality that supplements student feedback. A large variety of scheme formats with rather differing purposes are described in the literature. They range from purely formative, developmental formats that facilitate a tutor's reflection on their own teaching, reaffirming strengths and identifying potential areas for development, through to faculty- or institution-driven summative quality assurance-based schemes. Much of the current literature in this field focuses on general higher education and on the development of rating scales, checklists or observation tools to help guide the process. This study reports findings from a qualitative evaluation of a purely formative peer-supported review of teaching scheme that was implemented for general practice clinical tutors at our medical school, and describes tutors' attitudes and perceived benefits and challenges when undergoing observation.
CSDC: a nationwide screening platform for stroke control and prevention in China.
Jinghui Yu; Huajian Mao; Mei Li; Dan Ye; Dongsheng Zhao
2016-08-01
As a leading cause of severe disability and death, stroke places an enormous burden on Chinese society. A nationwide stroke screening platform called CSDC (China Stroke Data Center) has been built to support the national stroke prevention program and stroke clinical research since 2011. This platform is composed of a data integration system and a big data analysis system. The data integration system is used to collect information on risk factors, diagnosis history, treatment, and sociodemographic characteristics, as well as stroke patients' EMRs. The big data analysis system supports decision making in stroke control and prevention, clinical evaluation, and research. In this paper, the design and implementation of CSDC are illustrated, and some application results are presented. This platform is expected to provide rich data and powerful tool support for stroke control and prevention in China.
The mobile phone as a tool in improving cancer care in Nigeria.
Odigie, V I; Yusufu, L M D; Dawotola, D A; Ejagwulu, F; Abur, P; Mai, A; Ukwenya, Y; Garba, E S; Rotibi, B B; Odigie, E C
2012-03-01
The use of the mobile phone as a tool for improving cancer care in a low-resource setting. A total of 1176 oncology patients participated in the study; the majority had breast cancer. 58.4% of the patients had no formal education; 10.7% and 9.5% of patients had college or graduate education, respectively. Two out of every three patients lived more than 200 km from a hospital or clinic. Half of the patients rented a phone to call. At 24 months, 97.6% (1132 patients) had sustained their follow-up appointments, as against 19.2% (42 patients) of those who did not receive the phone intervention. 72.8% (14,102 calls) were to discuss illness/treatment. 14% of the calls were rated as emergencies by the oncologist. 86.2% of patients found the use of the mobile phone convenient/excellent/cheap. 97.6% found the use of the phone worthwhile and preferred the phone to traveling long distances to the hospital/clinic. The patients also felt that they had not been forgotten by their doctors and were being taken care of outside the hospital/clinic. Low-resource countries faced with the burden of cancer care, poor patient follow-up, and poor psychosocial support can cash in on this to overcome the persistent problem of poor communication in their healthcare delivery. The potential is enormous to enhance the use of mobile phones in novel ways: developing helpline numbers that can be called for cancer information from prevention to treatment to palliative care. The ability to reach out by mobile phone to a reliable source for medical information about cancer is something that the international community, having experience with helplines, should undertake with colleagues in Africa, who are experimenting with the mobile phone's potential. Copyright © 2011 John Wiley & Sons, Ltd.
Decision Analysis Tools for Volcano Observatories
NASA Astrophysics Data System (ADS)
Hincks, T. H.; Aspinall, W.; Woo, G.
2005-12-01
Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short- and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple-branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, has been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of synoptically combining different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses of previous volcanic crises are given to illustrate the hazard and risk insights gained from use of these tools.
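The event-tree calculation such a risk tool automates is, at its core, multiplication of conditional probabilities along each branch; a toy illustration with invented events and numbers:

```python
# Conditional probabilities along one branch of a volcanic event tree.
# Both the events and the values are illustrative only.
tree = {
    "unrest": 1.0,
    "magmatic | unrest": 0.6,
    "eruption | magmatic": 0.4,
    "explosive | eruption": 0.3,
}

# The probability of the full scenario is the product along the branch.
p_explosive = (tree["unrest"]
               * tree["magmatic | unrest"]
               * tree["eruption | magmatic"]
               * tree["explosive | eruption"])
print(f"P(explosive eruption) = {p_explosive:.3f}")
```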
On Constructing, Grouping and Using Topical Ontology for Semantic Matching
NASA Astrophysics Data System (ADS)
Tang, Yan; de Baer, Peter; Zhao, Gang; Meersman, Robert
An ontology topic is used to group concepts from different contexts (or even from different domain ontologies). This paper presents a pattern-driven modeling methodology for constructing and grouping topics in an ontology (the PAD-ON methodology), which is used for matching similarities between competences in the human resource management (HRM) domain. The methodology is supported by a tool called PAD-ON. This paper demonstrates our recent achievements in work from the EC Prolix project. The approach is applied to the training processes at British Telecom as the test bed.
Heisler, Michele; Mase, Rebecca; Brown, Brianne; Wilson, Shayla; Reeves, Pamela J.
2017-01-01
Background: Racial and ethnic minority adults with diabetes living in under-resourced communities face multiple barriers to sustaining the self-management behaviors necessary to improve diabetes outcomes. Peer support and decision support tools have each been associated with improved diabetes outcomes. Methods: 289 primarily African American adults with poor glycemic control will be recruited from the Detroit Veterans Administration Hospital and randomized to Technology-Enhanced Coaching (TEC) or Peer Coaching alone. Participants in both arms will be assigned a peer coach trained in autonomy-supportive approaches. Coaches are diabetes patients with prior poor glycemic control who now have good control. All participants meet face-to-face initially with their coach to review diabetes education materials and develop an action plan. Educational materials in the TEC arm are delivered via a web-based educational tool tailored with each participant's personalized health data (iDecide). Over the next six months, coaches call their assigned participants once a week to provide support for weekly action steps. Data are also collected on an observational control group with no contact with study staff. Changes in A1c, blood pressure, other patient-centered outcomes, and mediators and moderators of intervention effects will be assessed. Discussion: Tailored e-Health tools with educational content may enhance the effectiveness of peer coaching programs to better prepare patients to set self-management goals, identify action plans, and discuss treatment options with their health care providers. The study will provide insights for scalable self-management support programs for diabetes and chronic illnesses that require high levels of sustained patient self-management. PMID: 28132876
Team table: a framework and tool for continuous factory planning
NASA Astrophysics Data System (ADS)
Sihn, Wilfried; Bischoff, Juergen; von Briel, Ralf; Josten, Marcus
2000-10-01
Growing market turbulence and shorter product life cycles require a continuous adaptation of factory structures, resulting in a continuous factory planning process. Therefore, a new framework has been developed which focuses on the integration of configuration and data management processes. This enables online evaluation of system performance based on the continuous availability of current data. The use of this framework is especially helpful, and will guarantee high cost and time savings, when used in the early stages of planning, called the concept or rough planning phase. The new framework is supported by a planning round table as a tool for team-based configuration processes, integrating the knowledge of all persons involved in planning processes. A case study conducted at a German company shows the advantages which can be achieved by implementing the new framework and methods.
Health Information Technology as a Universal Donor to Bioethics Education.
Goodman, Kenneth W
2017-04-01
Health information technology, sometimes called biomedical informatics, is the use of computers and networks in the health professions. This technology has become widespread, from electronic health records to decision support tools to patient access through personal health records. These computational and information-based tools have engendered their own ethics literature and now present an opportunity to shape the standard medical and nursing ethics curricula. It is suggested that each of four core components in the professional education of clinicians (privacy, end-of-life care, access to healthcare and valid consent, and clinician-patient communication) offers an opportunity to leverage health information technology for curricular improvement. Using informatics in ethics education freshens ethics pedagogy and increases its utility, and does so without additional demands on overburdened curricula.
Modeling crime events by d-separation method
NASA Astrophysics Data System (ADS)
Aarthee, R.; Ezhilmaran, D.
2017-11-01
Problematic legal cases have recently called for a scientifically founded method of dealing with the qualitative and quantitative roles of evidence in a case [1]. To deal with the quantitative role, we propose a d-separation method for modeling crime events. D-separation is a graphical criterion for identifying independence in a directed acyclic graph. By developing a d-separation method, we aim to lay the foundations for the development of a software support tool that can deal with evidential reasoning in legal cases. Such a tool is meant to be used by a judge or juror, in alliance with various experts who can provide information about the details. This will hopefully improve the communication between judges or jurors and experts. The proposed method can uncover more valid independencies than any other graphical criterion.
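A d-separation test can be implemented with the standard ancestral moral graph construction: restrict the DAG to the ancestors of the query sets, marry co-parents, drop edge directions, delete the conditioning set, and check connectivity. A sketch using NetworkX graph primitives, with a hypothetical crime-evidence network (the nodes and edges are invented for illustration):

```python
import networkx as nx

def d_separated(G, xs, ys, zs):
    """Check whether node sets xs and ys are d-separated given zs in the
    DAG G, via the ancestral moral graph. Assumes xs, ys, zs are disjoint
    subsets of G's nodes."""
    # Restrict to the ancestral graph of all query nodes.
    relevant = set(xs) | set(ys) | set(zs)
    for n in list(relevant):
        relevant |= nx.ancestors(G, n)
    H = G.subgraph(relevant).copy()
    # Moralize: collect edges between every pair of parents sharing a child.
    moral_edges = []
    for child in H.nodes:
        parents = list(H.predecessors(child))
        moral_edges += [(u, v) for i, u in enumerate(parents)
                        for v in parents[i + 1:]]
    M = H.to_undirected()
    M.add_edges_from(moral_edges)
    M.remove_nodes_from(zs)
    # d-separated iff no path connects xs to ys once zs are removed.
    return not any(nx.has_path(M, x, y) for x in xs for y in ys)

# Hypothetical network: motive and opportunity both influence the crime;
# the crime generates a witness report and DNA evidence.
G = nx.DiGraph([("motive", "crime"), ("opportunity", "crime"),
                ("crime", "witness"), ("crime", "dna")])
print(d_separated(G, {"witness"}, {"dna"}, {"crime"}))         # True
print(d_separated(G, {"motive"}, {"opportunity"}, {"crime"}))  # False: collider
```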
The JPSS Ground Project Algorithm Verification, Test and Evaluation System
NASA Astrophysics Data System (ADS)
Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.
2016-12-01
The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDRs) generated by sensors on the S-NPP into calibrated, geo-located Sensor Data Records (SDRs) and generates Mission Unique Products (MUPs). It also facilitates algorithm investigation, integration, checkout and tuning, instrument and product calibration, data quality support and monitoring, and data/product distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCTs) and Look-Up Tables (LUTs) and hosts a number of DQA offline tools that take advantage of the proximity to the near-real-time data flows. It also contains a set of automated and ad-hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGEs) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and science and sensor quality analysis tools. In this presentation we will describe the GRAVITE systems and subsystems, architecture, technical specifications, capabilities and resources, distributed data and products, and the latest advances to support JPSS science algorithm implementation, validation and testing.
Piette, John D; Lun, K C; Moura, Lincoln A; Fraser, Hamish S F; Mechael, Patricia N; Powell, John; Khoja, Shariq R
2012-05-01
E-health encompasses a diverse set of informatics tools that have been designed to improve public health and health care. Little information is available on the impacts of e-health programmes, particularly in low- and middle-income countries. We therefore conducted a scoping review of the published and non-published literature to identify data on the effects of e-health on health outcomes and costs. The emphasis was on the identification of unanswered questions for future research, particularly on topics relevant to low- and middle-income countries. Although e-health tools supporting clinical practice have growing penetration globally, there is more evidence of benefits for tools that support clinical decisions and laboratory information systems than for those that support picture archiving and communication systems. Community information systems for disease surveillance have been implemented successfully in several low- and middle-income countries. Although information on outcomes is generally lacking, a large project in Brazil has documented notable impacts on health-system efficiency. Meta-analyses and rigorous trials have documented the benefits of text messaging for improving outcomes such as patients' self-care. Automated telephone monitoring and self-care support calls have been shown to improve some outcomes of chronic disease management, such as glycaemia and blood pressure control, in low- and middle-income countries. Although large programmes for e-health implementation and research are being conducted in many low- and middle-income countries, more information on the impacts of e-health on outcomes and costs in these settings is still needed.
Mobile Clinical Decision Support System for Acid-base Balance Diagnosis and Treatment Recommendation
Mandzuka, Mensur; Begic, Edin; Boskovic, Dusanka; Begic, Zijo; Masic, Izet
2017-01-01
Introduction: This paper presents a mobile application implementing a decision support system for acid-base disorder diagnosis and treatment recommendation. Material and methods: The application was developed using Android Studio, the official integrated development environment for the Android platform (to maximize availability and minimize investments in specialized hardware). Results: The application identifies the disorder based on the blood gas analysis, evaluates whether the disorder has been compensated, and, based on additional input related to electrolyte imbalance, provides recommendations for treatment. Conclusion: The application is a tool in the hands of the user that provides assistance during the treatment of acid-base disorders. The application will assist the physician in clinical practice and is focused on treatment in intensive care. PMID:28883678
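The abstract does not publish the application's decision rules; as a rough illustration of the first classification step such a system performs, the Python sketch below encodes the textbook primary-disorder logic from blood gas values, with standard reference ranges as assumptions. It is illustrative only, not clinical software.

```python
# Illustrative first-step acid-base classification from blood gas values.
# The paper's actual decision rules are not given in the abstract; this
# encodes only the textbook primary-disorder step, with standard reference
# ranges (pH 7.35-7.45, pCO2 35-45 mmHg, HCO3 22-26 mEq/L) as assumptions.
# Not for clinical use.

def classify_primary_disorder(ph: float, pco2: float, hco3: float) -> str:
    if 7.35 <= ph <= 7.45:
        return "pH within reference range (possible mixed or compensated disorder)"
    if ph < 7.35:  # acidemia: respiratory if pCO2 is high, metabolic if HCO3 is low
        if pco2 > 45:
            return "respiratory acidosis"
        if hco3 < 22:
            return "metabolic acidosis"
        return "acidemia, indeterminate primary disorder"
    # alkalemia: respiratory if pCO2 is low, metabolic if HCO3 is high
    if pco2 < 35:
        return "respiratory alkalosis"
    if hco3 > 26:
        return "metabolic alkalosis"
    return "alkalemia, indeterminate primary disorder"

print(classify_primary_disorder(ph=7.28, pco2=60.0, hco3=25.0))
# -> respiratory acidosis
```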
Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models
NASA Astrophysics Data System (ADS)
Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto
In this paper, we propose a set of diagrams to visualize software process reference models (PRM). The diagrams, called dimods, are a combination of visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.
Mena, Luis J.; Orozco, Eber E.; Felix, Vanessa G.; Ostos, Rodolfo; Melgarejo, Jesus; Maestre, Gladys E.
2012-01-01
Machine learning has become a powerful tool for analysing medical domains, assessing the importance of clinical parameters, and extracting medical knowledge for outcomes research. In this paper, we present a machine learning method for extracting diagnostic and prognostic thresholds, based on a symbolic classification algorithm called REMED. We evaluated the performance of our method by determining new prognostic thresholds for well-known and potential cardiovascular risk factors that are used to support medical decisions in the prognosis of fatal cardiovascular diseases. Our approach predicted 36% of cardiovascular deaths with 80% specificity and 75% general accuracy. The new method provides an innovative approach that might be useful to support decisions about medical diagnoses and prognoses. PMID:22924062
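REMED's internals are not given in the abstract; the sketch below illustrates only the general idea of a specificity-constrained threshold search for a single risk factor, reusing the 80% specificity figure quoted above as the constraint. The data and the decision rule are invented.

```python
# Generic sketch of extracting a prognostic threshold from data: choose the
# cut-off that maximizes sensitivity subject to a minimum specificity, here
# 80% to mirror the figure quoted in the abstract. This is not the REMED
# algorithm itself, just the general idea of a constrained threshold search.

def best_threshold(values, labels, min_specificity=0.80):
    """values: risk-factor measurements; labels: 1 = event (e.g. CV death)."""
    best = None
    for t in sorted(set(values)):
        pred = [v >= t for v in values]  # predict an event when value >= t
        tp = sum(p and y for p, y in zip(pred, labels))
        fn = sum((not p) and y for p, y in zip(pred, labels))
        tn = sum((not p) and (not y) for p, y in zip(pred, labels))
        fp = sum(p and (not y) for p, y in zip(pred, labels))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        if spec >= min_specificity and (best is None or sens > best[1]):
            best = (t, sens, spec)
    return best  # (threshold, sensitivity, specificity) or None

print(best_threshold([120, 135, 150, 160, 180], [0, 0, 1, 0, 1]))
```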
Patton, Evan W.; Seyed, Patrice; Wang, Ping; Fu, Linyun; Dein, F. Joshua; Bristol, R. Sky; McGuinness, Deborah L.
2014-01-01
We aim to inform the development of decision support tools for resource managers who need to examine large complex ecosystems and make recommendations in the face of many tradeoffs and conflicting drivers. We take a semantic technology approach, leveraging background ontologies and the growing body of linked open data. In previous work, we designed and implemented a semantically enabled environmental monitoring framework called SemantEco and used it to build a water quality portal named SemantAqua. Our previous system included foundational ontologies to support environmental regulation violations and relevant human health effects. In this work, we discuss SemantEco’s new architecture that supports modular extensions and makes it easier to support additional domains. Our enhanced framework includes foundational ontologies to support modeling of wildlife observation and wildlife health impacts, thereby enabling deeper and broader support for more holistically examining the effects of environmental pollution on ecosystems. We conclude with a discussion of how, through the application of semantic technologies, modular designs will make it easier for resource managers to bring in new sources of data to support more complex use cases.
Bohler, Anwesha; Eijssen, Lars M T; van Iersel, Martijn P; Leemans, Christ; Willighagen, Egon L; Kutmon, Martina; Jaillard, Magali; Evelo, Chris T
2015-08-23
Biological pathways are descriptive diagrams of biological processes widely used for functional analysis of differentially expressed genes or proteins. Primary data analysis, such as quality control, normalisation, and statistical analysis, is often performed in scripting languages like R, Perl, and Python. Subsequent pathway analysis is usually performed using dedicated external applications. Workflows involving manual use of multiple environments are time-consuming and error-prone. Therefore, tools are needed that enable pathway analysis directly within the same scripting languages used for primary data analyses. Existing tools have limited capability in terms of available pathway content, pathway editing and visualisation options, and export file formats. Consequently, making the full-fledged pathway analysis tool PathVisio available from various scripting languages will benefit researchers. We developed PathVisioRPC, an XMLRPC interface for the pathway analysis software PathVisio. PathVisioRPC enables creating and editing biological pathways, visualising data on pathways, performing pathway statistics, and exporting results in several image formats in multiple programming environments. We demonstrate PathVisioRPC functionalities using examples in Python. Subsequently, we analyse in R a publicly available NCBI GEO gene expression dataset studying tumour-bearing mice treated with cyclophosphamide. The R scripts demonstrate how calls to existing R packages for data processing and calls to PathVisioRPC can directly work together. To further support R users, we have created RPathVisio simplifying the use of PathVisioRPC in this environment. We have also created a pathway module for the microarray data analysis portal ArrayAnalysis.org that calls the PathVisioRPC interface to perform pathway analysis. This module allows users to use PathVisio functionality online without having to download and install the software and exemplifies how the PathVisioRPC interface can be used by data analysis pipelines for functional analysis of processed genomics data. PathVisioRPC enables data visualisation and pathway analysis directly from within various analytical environments used for preliminary analyses. It supports the use of existing pathways from WikiPathways or pathways created using the RPC itself. It also enables automation of tasks performed using PathVisio, making it useful to PathVisio users performing repeated visualisation and analysis tasks. PathVisioRPC is freely available for academic and commercial use at http://projects.bigcat.unimaas.nl/pathvisiorpc.
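Because PathVisioRPC is exposed over XML-RPC, it can in principle be reached from any language with an XML-RPC client; the Python sketch below shows the shape of such a session using only the standard library. The port and the method names (createPathway, addDataNode, exportPathway) are placeholders, not the documented PathVisioRPC API.

```python
# Minimal sketch of talking to an XML-RPC service such as PathVisioRPC from
# Python. The endpoint and the method names below are placeholders; the
# actual PathVisioRPC methods are documented with the tool itself, and a
# running server is required for these calls to succeed.
import xmlrpc.client

server = xmlrpc.client.ServerProxy("http://localhost:7777")  # assumed endpoint

# Hypothetical calls illustrating the create/edit/export workflow described above:
pathway = server.createPathway("Example pathway", "Homo sapiens")
server.addDataNode(pathway, "TP53", "GeneProduct")
server.exportPathway(pathway, "example.png")
```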
The Neighborhood Auditing Tool: A Hybrid Interface for Auditing the UMLS
Morrey, C. Paul; Geller, James; Halper, Michael; Perl, Yehoshua
2009-01-01
The UMLS’s integration of more than 100 source vocabularies, not necessarily consistent with one another, causes some inconsistencies. The purpose of auditing the UMLS is to detect such inconsistencies and to suggest how to resolve them while observing the requirement of fully representing the content of each source in the UMLS. A software tool, called the Neighborhood Auditing Tool (NAT), that facilitates UMLS auditing is presented. The NAT supports “neighborhood-based” auditing, where, at any given time, an auditor concentrates on a single focus concept and one of a variety of neighborhoods of its closely related concepts. Typical diagrammatic displays of concept networks have a number of shortcomings, so the NAT utilizes a hybrid diagram/text interface that features stylized neighborhood views which retain some of the best features of both the diagrammatic layouts and text windows while avoiding the shortcomings. The NAT allows an auditor to display knowledge from both the Metathesaurus (concept) level and the Semantic Network (semantic type) level. Various additional features of the NAT that support the auditing process are described. The usefulness of the NAT is demonstrated through a group of case studies. Its impact is tested with a study involving a select group of auditors. PMID:19475725
A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J
Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, the lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (that contains spatial data, socio-economic and environmental data, and analytic data), a middle layer (that handles data processing, model management, and GIS operations), and an application layer (that provides climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
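As a minimal sketch of the per-cell scoring idea described above, the Python fragment below computes a vulnerability score for each grid cell as a weighted sum of its GIS-derived metrics. The metric names and weights are invented; Urban-CAT's actual models are not specified in the abstract.

```python
# Sketch of grid-cell vulnerability scoring: each cell carries a vector of
# GIS-derived metrics, combined here as a simple weighted sum. The metric
# names and weights are hypothetical illustrations.

WEIGHTS = {
    "impervious_fraction": 0.5,    # more pavement -> more flood-prone
    "elevation_percentile": -0.3,  # higher ground -> less vulnerable
    "population_density": 0.2,     # more people exposed -> more vulnerable
}

def vulnerability(cell_metrics: dict) -> float:
    return sum(WEIGHTS[name] * cell_metrics[name] for name in WEIGHTS)

grid = [
    {"impervious_fraction": 0.8, "elevation_percentile": 0.1, "population_density": 0.9},
    {"impervious_fraction": 0.2, "elevation_percentile": 0.7, "population_density": 0.3},
]
scores = [vulnerability(cell) for cell in grid]
print(scores)  # higher score = more vulnerable under these assumed weights
```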
[Facebook in oncology. Review of the literature].
Veneroni, Laura; Ferrari, Andrea; Massimino, Maura; Clerici, Carlo Alfredo
2015-01-01
Internet and particularly the so-called Web 2.0 are powerful communication tools characterized by high user participation in the creation of content through various sites, such as social networking sites, of which Facebook is the best known and most widely used. The aim of the present paper is to review the literature on the use of Facebook in health care. The international scientific literature of the past 10 years was collected from the major online databases. The search identified 262 articles, of which 57 were considered relevant. The articles are schematically divided into three categories according to topic: use of Facebook for psychosocial support, for communication in the doctor-patient relationship, and for institutional communication. The authors identify the critical aspects and the possibilities of using this tool in communication and in the relationship between patients and health professionals. Despite the presence of critical issues, the use of social media should be considered with interest and is worthy of study and research in the clinical setting. At the same time, health professionals should be aware of the risks associated with the use of social networking, but also be trained in the use of the potential of these virtual tools, which cannot replace real interactions but can support them.
Microcredit -- an emerging tool for fighting poverty.
1997-01-01
A summit focusing on microcredit (small business, microenterprise, loans) as a means of fighting poverty was held February 3-4 in Washington; it was co-chaired by First Lady Hillary Rodham Clinton and by Queen Sofia of Spain. The United States Agency for International Development (USAID) has long supported microenterprise and microfinance. The summit set a goal of reaching 100 million poor families over the next nine years. USAID Administrator Brian Atwood spoke concerning the need to involve the private sector in microfinance; previously loans had been financed outside of the mainstream financial system via nongovernmental organizations and credit unions funded mainly by governments and donors. USAID launched a Microenterprise Initiative in 1994 that has supported 150 programs in 45 countries, and that is expected to reach approximately 4 million families. Atwood said the microenterprise strategies were currently in use in nearly every country USAID supports in Latin America and Asia, and most countries in Africa; future efforts would concentrate on countries in Africa, in eastern Europe and in central Asia. Mrs. Clinton called microenterprise "an invaluable tool in alleviating poverty, promoting self-sufficiency, and stimulating the economy." Treasury Secretary Robert Rubin stated that the policy helped people help themselves by giving them the tools they needed to join the economic mainstream. Microcredit focuses on businesses with five or fewer workers; loans range from less than $100 to $10,000. More than half of the businesses are owned and operated by women.
Revel8or: Model Driven Capacity Planning Tool Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Liming; Liu, Yan; Bui, Ngoc B.
2007-05-31
Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8tor. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.
A Practical Guide to the Technology and Adoption of Software Process Automation
1994-03-01
IDE's integration of Software through Pictures, CodeCenter, and FrameMaker). However, successful use of integrated tools, as reflected in actual... tool for a specific platform. Thus, when a Work Context calls for a word processor, the weaver.tis file can be set up to call FrameMaker for the Sun4.
NASA Astrophysics Data System (ADS)
Kademian, Sylvie M.
Current reform efforts prioritize science instruction that provides opportunities for students to engage in productive talk about scientific phenomena. Given the challenges teachers face enacting instruction that integrates science practices and science content, beginning teachers need support to develop the knowledge and teaching practices required to teach reform-oriented science lessons. Practice-based teacher education shows potential for supporting beginning teachers while they are learning to teach in this way. However, little is known about how beginning elementary teachers draw upon the types of support and tools associated with practice-based teacher education to learn to successfully enact this type of instruction. This dissertation addresses this gap by investigating how a practice-based science methods course using a suite of teacher educator-provided tools can support beginning teachers' planning and enactment of investigation-based science lessons. Using qualitative case study methodologies, this study drew on video records, lesson plans, class assignments, and surveys from one cohort of 22 pre-service teachers (called interns in this study) enrolled in a year-long elementary education master of arts and teaching certification program. Six focal interns were also interviewed at multiple time-points during the methods course. Similarities existed across the types of tools and teaching practices interns used most frequently to plan and enact investigation-based discussions. For the focal interns, use of four synergistic teaching practices throughout the lesson enactments (including consideration of students' initial ideas; use of open-ended questions to elicit, extend, and challenge ideas; connecting across students' ideas and the disciplinary core ideas; and use of a representation to organize and highlight students' ideas) appeared to lead to increased opportunities for students to share their ideas and engage in data analysis, argumentation and explanation construction. Student opportunities to engage in practices that prioritize scientific discourse also occurred when interns were using dialogic voice and the tools designed to foster development of teacher knowledge for facilitating investigation-based science discussions. However, several intern characteristics likely moderated or mediated intern use of tools, dialogic voice, and productive teaching practices to capitalize on student contributions. These characteristics included intern knowledge of the science content and practices and initial beliefs about science teaching. Missed opportunities to use a combination of several teaching practices and tools designed to foster the development of knowledge for science teaching resulted in fewer opportunities for students to engage in data analysis, argumentation based on evidence, and construction of scientific explanations. These findings highlight the potential of teacher educator-provided tools for supporting beginning teachers in learning to facilitate investigation-based discussions that capitalize on student contributions. These findings also help the field conceptualize how beginning teachers use tools and teaching practices to plan and enact investigation-based science lessons, and how intern characteristics relate to tool use and planned and enacted lessons.
By analyzing the investigation-based science lessons holistically, this study begins to unpack the complexities of facilitating investigation-based discussions including the interplay between intern characteristics and tool use, and the ways intern engagement in synergistic teaching practices provide opportunities for students to engage in data analysis, explanation construction, and argumentation. This study also describes methodological implications for this type of whole-lesson analysis and comments on the need for further research investigating beginning teachers' use of tools over time. Finally, I propose the need for iterative design of scaffolds to further support beginning teacher facilitation of investigation-based science lessons.
Timbre Brownfield Prioritization Tool to support effective brownfield regeneration.
Pizzol, Lisa; Zabeo, Alex; Klusáček, Petr; Giubilato, Elisa; Critto, Andrea; Frantál, Bohumil; Martinát, Standa; Kunc, Josef; Osman, Robert; Bartke, Stephan
2016-01-15
In the last decade, the regeneration of derelict or underused sites, fully or partly located in urban areas (so-called "brownfields"), has become more common, since free developable land (so-called "greenfields") has more and more become a scarce and, hence, more expensive resource, especially in densely populated areas. Although the regeneration of brownfield sites can offer development potential, the complexity of these sites requires considerable effort to successfully complete their revitalization projects, and the proper selection of promising sites is a prerequisite to efficiently allocating the limited financial resources. The identification and analysis of success factors for brownfield site regeneration can support investors and decision makers in selecting those sites which are the most advantageous for successful regeneration. The objective of this paper is to present the Timbre Brownfield Prioritization Tool (TBPT), developed as a web-based solution to assist stakeholders responsible for wider territories or clusters of brownfield sites (portfolios) in identifying which brownfield sites should preferably be considered for redevelopment or further investigation. The prioritization approach is based on a set of success factors identified through a systematic stakeholder engagement procedure. Within the TBPT these success factors are integrated by means of a Multi Criteria Decision Analysis (MCDA) methodology, which includes stakeholders' requalification objectives and perspectives related to the brownfield regeneration process and takes into account the three pillars of sustainability (economic, social and environmental dimensions). The tool has been applied to the South Moravia case study (Czech Republic), considering two different requalification objectives identified by local stakeholders, namely the selection of suitable locations for the development of a shopping centre and a solar power plant, respectively. The application of the TBPT to the case study showed that it is flexible and easy to adapt to different local contexts, allowing the assessors to introduce locally relevant parameters identified according to their expertise and considering the availability of local data. Copyright © 2015 Elsevier Ltd. All rights reserved.
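The MCDA aggregation step can be illustrated with a simple weighted-sum model: each site is scored on a set of success factors normalised to 0-1, and stakeholder weights combine them into a priority score. The factor names, weights, and cost/benefit treatment below are assumptions, not the TBPT's actual factor set.

```python
# Illustrative weighted-sum MCDA ranking of brownfield sites. Factor names
# and weights are invented; the TBPT's real success factors came from its
# stakeholder engagement procedure.

weights = {"accessibility": 0.4, "contamination_risk": 0.35, "market_demand": 0.25}

sites = {
    "site_A": {"accessibility": 0.9, "contamination_risk": 0.4, "market_demand": 0.7},
    "site_B": {"accessibility": 0.5, "contamination_risk": 0.9, "market_demand": 0.6},
}

def priority(scores: dict) -> float:
    # contamination_risk is a cost criterion: invert it so higher = better
    return (weights["accessibility"] * scores["accessibility"]
            + weights["contamination_risk"] * (1 - scores["contamination_risk"])
            + weights["market_demand"] * scores["market_demand"])

ranked = sorted(sites, key=lambda s: priority(sites[s]), reverse=True)
print(ranked)  # sites ordered from most to least promising
```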
Real-Time Visualization of Network Behaviors for Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.
Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.
THE HUMAN BEHAVIOR RATING SCALE-BRIEF: A TOOL TO MEASURE 21ST CENTURY SKILLS OF K-12 LEARNERS.
Woods-Groves, Suzanne
2015-06-01
Currently there is a call for brief concise measurements to appraise relevant 21st century college readiness skills in K-12 learners. This study employed K-12 teachers' ratings for over 3,000 students for an existing 91-item rating scale, the Human Behavior Rating Scale, that measured the 21st century skills of persistence, curiosity, externalizing affect, internalizing affect, and cognition. Teachers' ratings for K-12 learners were used to develop a brief, concise, and manageable 30-item tool, the Human Behavior Rating Scale-Brief. Results yielded high internal consistency coefficients and inter-item correlations. The items were not biased with regard to student sex or race, and were supported through confirmatory factor analyses. In addition, when teachers' ratings were compared with students' academic and behavioral performance data, moderate to strong relationships were revealed. This study provided an essential first step in the development of a psychometrically sound, manageable, and brief tool to appraise 21st century skills in K-12 learners.
Combining Domain-driven Design and Mashups for Service Development
NASA Astrophysics Data System (ADS)
Iglesias, Carlos A.; Fernández-Villamor, José Ignacio; Del Pozo, David; Garulli, Luca; García, Boni
This chapter presents the Romulus project approach to Service Development using Java-based web technologies. Romulus aims at improving productivity of service development by providing a tool-supported model to conceive Java-based web applications. This model follows a Domain Driven Design approach, which states that the primary focus of software projects should be the core domain and domain logic. Romulus proposes a tool-supported model, Roma Metaframework, that provides an abstraction layer on top of existing web frameworks and automates the application generation from the domain model. This metaframework follows an object centric approach, and complements Domain Driven Design by identifying the most common cross-cutting concerns (security, service, view, ...) of web applications. The metaframework uses annotations for enriching the domain model with these cross-cutting concerns, so-called aspects. In addition, the chapter presents the usage of mashup technology in the metaframework for service composition, using the web mashup editor MyCocktail. This approach is applied to a scenario of the Mobile Phone Service Portability case study for the development of a new service.
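As a language-shifted sketch of the annotation idea, Python decorators can attach a cross-cutting concern to a domain-model class much as the metaframework's Java annotations do. The names below are invented; this mirrors the concept, not the Roma Metaframework API.

```python
# Conceptual sketch: attaching a cross-cutting "security" aspect to a plain
# domain-model class via a decorator, analogous to annotating the domain
# model in the Java metaframework. All names here are hypothetical.

def secured(role):
    """Mark a domain class with a security aspect requiring `role`."""
    def attach(cls):
        cls.__aspects__ = getattr(cls, "__aspects__", {})
        cls.__aspects__["security"] = {"required_role": role}
        return cls
    return attach

@secured(role="admin")
class PhoneContract:
    """Plain domain object; a framework could read __aspects__ to enforce security."""
    def __init__(self, number):
        self.number = number

print(PhoneContract.__aspects__)  # -> {'security': {'required_role': 'admin'}}
```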
Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong
2015-01-01
The study of complex proteomes places greater demands on quantification methods based on mass spectrometry technology. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification with a better dynamic range.
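A minimal sketch of length-normalised spectral-count quantification (an NSAF-style calculation) with a naive even split of shared peptides conveys the kind of bookkeeping described above. freeQuant's actual shared-peptide and ion-intensity handling is more elaborate; the numbers and the even-split rule are assumptions.

```python
# NSAF-style sketch: spectral counts normalised by protein length, with
# shared-peptide counts split evenly between owners. Illustrative only.

proteins = {"P1": {"length": 300}, "P2": {"length": 450}}
# peptide -> (spectral count, proteins it maps to)
peptides = {
    "pepA": (10, ["P1"]),
    "pepB": (6,  ["P1", "P2"]),  # shared peptide, split evenly below
    "pepC": (12, ["P2"]),
}

counts = {p: 0.0 for p in proteins}
for spc, owners in peptides.values():
    for p in owners:
        counts[p] += spc / len(owners)  # naive even split of shared counts

saf = {p: counts[p] / proteins[p]["length"] for p in proteins}
total = sum(saf.values())
nsaf = {p: saf[p] / total for p in proteins}
print(nsaf)  # relative abundances summing to 1
```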
Kruser, Jacqueline M; Nabozny, Michael J; Steffens, Nicole M; Brasel, Karen J; Campbell, Toby C; Gaines, Martha E; Schwarze, Margaret L
2015-09-01
To evaluate a communication tool called "Best Case/Worst Case" (BC/WC) based on an established conceptual model of shared decision-making. Focus group study. Older adults (four focus groups) and surgeons (two focus groups) using modified questions from the Decision Aid Acceptability Scale and the Decisional Conflict Scale to evaluate and revise the communication tool. Individuals aged 60 and older recruited from senior centers (n = 37) and surgeons from academic and private practices in Wisconsin (n = 17). Qualitative content analysis was used to explore themes and concepts that focus group respondents identified. Seniors and surgeons praised the tool for the unambiguous illustration of multiple treatment options and the clarity gained from presentation of an array of treatment outcomes. Participants noted that the tool provides an opportunity for in-the-moment, preference-based deliberation about options and a platform for further discussion with other clinicians and loved ones. Older adults worried that the format of the tool was not universally accessible for people with different educational backgrounds, and surgeons had concerns that the tool was vulnerable to physicians' subjective biases. The BC/WC tool is a novel decision support intervention that may help facilitate difficult decision-making for older adults and their physicians when considering invasive, acute medical treatments such as surgery. © 2015, Copyright the Authors Journal compilation © 2015, The American Geriatrics Society.
Rodrigues, Ramon Gouveia; das Dores, Rafael Marques; Camilo-Junior, Celso G; Rosa, Thierson Couto
2016-01-01
Cancer is a critical disease that affects millions of people and families around the world. In 2012 about 14.1 million new cases of cancer occurred globally. For many reasons, such as the severity of some cases, the side effects of some treatments, and the death of other patients, cancer patients tend to be affected by serious emotional disorders such as depression. Thus, monitoring the mood of the patients is an important part of their treatment. Many cancer patients are users of online social networks, and many of them take part in cancer virtual communities where they exchange messages commenting on their treatment or giving support to other patients in the community. Most of these communities are publicly accessible and are thus useful sources of information about the mood of patients. Based on that, Sentiment Analysis methods can be useful to automatically detect the positive or negative mood of cancer patients by analyzing their messages in these online communities. The objective of this work is to present a Sentiment Analysis tool, named SentiHealth-Cancer (SHC-pt), that improves the detection of the emotional state of patients in Brazilian online cancer communities by inspecting their posts written in the Portuguese language. SHC-pt is a sentiment analysis tool tailored specifically to detect positive, negative or neutral messages of patients in online communities of cancer patients. We conducted a comparative study of the proposed method with a set of general-purpose sentiment analysis tools adapted to this context. Different collections of posts were obtained from two cancer communities on Facebook. The posts were analyzed by sentiment analysis tools that support the Portuguese language (Semantria and SentiStrength) and by the SHC-pt tool, developed on the basis of the method proposed in this paper, called SentiHealth. Moreover, as a second alternative for analyzing the texts in Portuguese, the collected texts were automatically translated into English and submitted to sentiment analysis tools that do not support the Portuguese language (AlchemyAPI and Textalytics), and also to Semantria and SentiStrength using the English option of these tools. Six experiments were conducted with some variations and different origins of the collected posts. The results were measured using the following metrics: precision, recall, F1-measure and accuracy. The proposed tool SHC-pt reached the best averages for accuracy and F1-measure (the harmonic mean of recall and precision) in the three sentiment classes addressed (positive, negative and neutral) in all experimental settings. Moreover, the worst accuracy value (58%) achieved by SHC-pt in any experiment is 11.53% better than the greatest accuracy (52%) presented by the other tools evaluated. Finally, the worst average F1 (48.46%) reached by SHC-pt in any experiment is 4.14% better than the greatest average F1 (46.53%) achieved by the other tools evaluated. Thus, even when the SHC-pt results in a complex scenario are compared with those of other tools in an easier scenario, SHC-pt performs better. This paper presents two contributions. First, it proposes the SentiHealth method to detect the mood of cancer patients who are also users of communities of patients in online social networks. Second, it presents a tool instantiated from the method, called SentiHealth-Cancer (SHC-pt), dedicated to automatically analyzing posts in communities of cancer patients, based on SentiHealth.
This context-tailored tool outperformed other general-purpose sentiment analysis tools, at least in the cancer context. This suggests that the SentiHealth method could be instantiated as other disease-specific tools in future work, for instance SentiHealth-HIV, SentiHealth-Stroke and SentiHealth-Sclerosis. Copyright © 2015. Published by Elsevier Ireland Ltd.
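A toy lexicon-based classifier shows the three-class (positive/negative/neutral) task in miniature; SentiHealth's actual method is context-tailored and far more sophisticated, and the tiny word lists below are purely illustrative.

```python
# Minimal lexicon-based sketch of three-class post classification.
# The word lists and tie rule are invented; this is not the SentiHealth method.

POSITIVE = {"hope", "better", "grateful", "improving"}
NEGATIVE = {"pain", "worse", "afraid", "tired"}

def classify_post(text: str) -> str:
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(classify_post("Feeling better and grateful today despite the pain"))
# -> positive (two positive hits outweigh one negative hit)
```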
The impact of CmapTools utilization towards students' conceptual change on optics topic
NASA Astrophysics Data System (ADS)
Rofiuddin, Muhammad Rifqi; Feranie, Selly
2017-05-01
Science teachers need to help students identify their prior ideas and modify them based on scientific knowledge. This process is called conceptual change. One of the essential tools to analyze students' conceptual change is the concept map. Concept maps are graphical representations of knowledge that are comprised of concepts and the relationships between them. Concept map construction can take advantage of technology to support the learning process, in line with Educational Ministry Regulation No. 68 of 2013. The Institute for Human and Machine Cognition (IHMC) has developed CmapTools, client-server software for easily constructing and visualizing concept maps. This research aims to investigate secondary students' conceptual change after experiencing a five-stage conceptual teaching model utilizing CmapTools in learning optics. A weak experimental method with a one-group pretest-posttest design was implemented to collect preliminary and post concept maps as qualitative data. The sample was taken purposively from 8th grade students (n = 22) at a private school in Bandung, West Java. Conceptual change was assessed by comparing preliminary and post concept map constructions using a rubric for concept map scoring and structure. Results show a significant conceptual change difference of 50.92%, elaborated by concept map element: propositions and hierarchical levels in the high category, cross links in the medium category, and specific examples in the low category. All of the results are supported by the students' positive response towards CmapTools utilization, indicating improvements in motivation, interest, and behavior towards the physics lesson.
TERMTrial--terminology-based documentation systems for cooperative clinical trials.
Merzweiler, A; Weber, R; Garde, S; Haux, R; Knaup-Gregori, P
2005-04-01
Within cooperative groups of multi-center clinical trials, standardized documentation is a prerequisite for communication and sharing of data. Standardizing documentation systems means standardizing the underlying terminology. The management and consistent application of terminology systems is a difficult and fault-prone task, which should be supported by appropriate software tools. Today, documentation systems for clinical trials are often implemented as so-called Remote-Data-Entry-Systems (RDE-systems). Although there are many commercial systems that support the development of RDE-systems, none offers comprehensive terminological support. Therefore, we developed the software system TERMTrial, which consists of a component for the definition and management of terminology systems for cooperative groups of clinical trials and two components for the terminology-based automatic generation of trial databases and the terminology-based interactive design of electronic case report forms (eCRFs). TERMTrial combines the advantages of remote data entry with comprehensive terminological control.
Technique and cue selection for graphical presentation of generic hyperdimensional data
NASA Astrophysics Data System (ADS)
Howard, Lee M.; Burton, Robert P.
2013-12-01
Several presentation techniques have been created for visualization of data with more than three variables. Packages have been written, each of which implements a subset of these techniques. However, these packages generally fail to provide all the features needed by the user during the visualization process. Further, packages generally limit support for presentation techniques to a few techniques. A new package called Petrichor accommodates all necessary and useful features together in one system. Any presentation technique may be added easily through an extensible plugin system. Features are supported by a user interface that allows easy interaction with data. Annotations allow users to mark up visualizations and share information with others. By providing a hyperdimensional graphics package that easily accommodates presentation techniques and includes a complete set of features, including those that are rarely or never supported elsewhere, the user is provided with a tool that facilitates improved interaction with multivariate data to extract and disseminate information.
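The abstract describes an extensible plugin system without giving its API; a common way to support such extension is a registry-decorator pattern, sketched below with invented names. Petrichor's real plugin interface may differ.

```python
# Registry-decorator sketch of an extensible plugin system: presentation
# techniques register themselves under a name and are looked up on demand.
# All names are hypothetical illustrations of the pattern.

PLUGINS = {}

def presentation_technique(name):
    """Decorator that registers a rendering function under a technique name."""
    def register(func):
        PLUGINS[name] = func
        return func
    return register

@presentation_technique("parallel_coordinates")
def render_parallel_coordinates(data):
    return f"rendering {len(data)} records as parallel coordinates"

@presentation_technique("scatterplot_matrix")
def render_scatterplot_matrix(data):
    return f"rendering {len(data)} records as a scatterplot matrix"

print(PLUGINS["parallel_coordinates"]([(1, 2, 3), (4, 5, 6)]))
```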
Prins, Pjotr; Goto, Naohisa; Yates, Andrew; Gautier, Laurent; Willis, Scooter; Fields, Christopher; Katayama, Toshiaki
2012-01-01
Open-source software (OSS) encourages computer programmers to reuse software components written by others. In evolutionary bioinformatics, OSS comes in a broad range of programming languages, including C/C++, Perl, Python, Ruby, Java, and R. To avoid writing the same functionality multiple times for different languages, it is possible to share components by bridging computer languages and Bio* projects, such as BioPerl, Biopython, BioRuby, BioJava, and R/Bioconductor. In this chapter, we compare the two principal approaches for sharing software between different programming languages: either by remote procedure call (RPC) or by sharing a local call stack. RPC provides a language-independent protocol over a network interface; examples are RSOAP and Rserve. The local call stack provides a between-language mapping not over the network interface, but directly in computer memory; examples are R bindings, RPy, and languages sharing the Java Virtual Machine stack. This functionality provides strategies for sharing of software between Bio* projects, which can be exploited more often. Here, we present cross-language examples for sequence translation, and measure throughput of the different options. We compare calling into R through native R, RSOAP, Rserve, and RPy interfaces, with the performance of native BioPerl, Biopython, BioJava, and BioRuby implementations, and with call stack bindings to BioJava and the European Molecular Biology Open Software Suite. In general, call stack approaches outperform native Bio* implementations and these, in turn, outperform RPC-based approaches. To test and compare strategies, we provide a downloadable BioNode image with all examples, tools, and libraries included. The BioNode image can be run on VirtualBox-supported operating systems, including Windows, OSX, and Linux.
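As a concrete flavour of the two strategies in Python: the first fragment calls into R through the shared local call stack using the rpy2 bindings (the successor of the RPy interface mentioned above; R must be installed), and the second shows the shape of the network route with the standard library's XML-RPC client against a placeholder endpoint.

```python
# 1) Local call stack: R evaluated inside the Python process via rpy2.
import rpy2.robjects as robjects

result = robjects.r('nchar("ATGGCC")')  # call an R function in-memory
print(int(result[0]))                   # -> 6

# 2) Remote procedure call: the same idea over a network interface.
# The URL and method name are placeholders for a matching RPC service.
import xmlrpc.client

server = xmlrpc.client.ServerProxy("http://localhost:8080")
# print(server.nchar("ATGGCC"))  # would run against a live RPC server
```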
Ong, Stephanie W; Jassal, Sarbjit V; Porter, Eveline; Logan, Alexander G; Miller, Judith A
2013-01-01
New healthcare delivery models are needed to enhance the patient experience and improve quality of care for individuals with chronic conditions such as kidney disease. One potential avenue is to implement self-management strategies. There is growing evidence that self-management interventions help optimize various aspects of chronic disease management. With the increasing use of information technology (IT) in health care, chronic disease management programs are incorporating IT solutions to support patient self-management practices. IT solutions have the ability to promote key principles of self-management, namely education, empowerment, and collaboration. Positive clinical outcomes have been demonstrated for a number of chronic conditions when IT solutions were incorporated into self-management programs. There is a paucity of evidence for self-management in chronic kidney disease (CKD) patients. Furthermore, IT strategies have not been tested in this patient population to the same extent as other chronic conditions (e.g., diabetes, hypertension). Therefore, it is currently unknown if IT strategies will promote self-management behaviors and lead to improvements in overall patient care. We designed and developed an IT solution called My KidneyCare Centre to support self-management strategies for patients with CKD. In this review, we discuss the rationale and vision of incorporating an electronic self-management tool to support the care of patients with CKD. © 2013 Wiley Periodicals, Inc.
NASA Technical Reports Server (NTRS)
Robertson, Glen A.
2013-01-01
NASA currently has a program called the Space Synthetic Biology Project. Synthetic Biology, or SynBio, is the design and construction of new biological functions and systems not found in nature. Four NASA field centers, along with experts from industry and academia, have been partnering on the Space Synthetic Biology Project and are working on new breakthroughs in this increasingly useful pursuit, which is part science and part engineering. Led by researchers at NASA's Ames Research Center, the team is studying how this powerful new tool can help NASA now and in the future. The project was created to harness biology in reliable, robust, engineered systems to support the agency's exploration and science missions, to improve life on Earth and to help shape NASA's future. The program is also intended to contribute foundational tools to the synthetic biology research community.
Southern African Office of Astronomy for Development: A New Hub for Astronomy for Development
NASA Astrophysics Data System (ADS)
Mutondo, Moola S.; Simpemba, Prospery
2016-10-01
A new Astronomy for Development hub needs innovative tools and programs. SAROAD is developing exciting tools integrating Raspberry Pi technology to bring cost-effective astronomy content to learning centres. SAROAD would also like to report achievements in realizing the IAU's strategic plan. In order to manage, evaluate and coordinate regional IAU (International Astronomical Union) capacity building programmes, including the recruitment and mobilization of volunteers, SAROAD has built an intranet that is accessible to regional members upon request. Using this resource, regional members can see and participate in regional activities. SAROAD has commenced with projects in the three Task Force areas of Universities and Research, Children and Schools and Public Outreach. Under the three Task Force areas, a total of seven projects have commenced in Zambia (some supported by funds from IAU Annual Call for proposals).
Trade-Off Analysis between Concerns Based on Aspect-Oriented Requirements Engineering
NASA Astrophysics Data System (ADS)
Laurito, Abelyn Methanie R.; Takada, Shingo
The identification of functional and non-functional concerns is an important activity during requirements analysis. However, there may be conflicts between the identified concerns, and they must be discovered and resolved through trade-off analysis. Aspect-Oriented Requirements Engineering (AORE) has trade-off analysis as one of its goals, but most AORE approaches do not actually offer support for trade-off analysis; they focus on describing concerns and generating their composition. This paper proposes an approach for trade-off analysis based on AORE using use cases and the Requirements Conflict Matrix (RCM) to represent compositions. RCM shows the positive or negative effect of non-functional concerns over use cases and other non-functional concerns. Our approach is implemented within a tool called E-UCEd (Extended Use Case Editor). We also show the results of evaluating our tool.
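The RCM can be pictured as a matrix whose rows are non-functional concerns, whose columns are use cases or other concerns, and whose entries record a positive or negative effect; conflicts surface wherever a column collects both signs. The sketch below, with invented concerns, shows that detection step.

```python
# Sketch of a Requirements Conflict Matrix: rows are non-functional
# concerns, columns are use cases or other concerns, entries are "+" or "-".
# The concerns and entries below are invented for illustration.

rcm = {
    "security":    {"Login": "+", "Checkout": "+", "performance": "-"},
    "performance": {"Login": "-", "Checkout": "+"},
    "usability":   {"Login": "+"},
}

def conflicts(matrix):
    """Return columns that receive both a '+' and a '-' from some rows."""
    effects = {}
    for row, cols in matrix.items():
        for col, sign in cols.items():
            effects.setdefault(col, set()).add(sign)
    return [col for col, signs in effects.items() if {"+", "-"} <= signs]

print(conflicts(rcm))  # -> ['Login']: security and performance disagree there
```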
Developing tools for digital radar image data evaluation
NASA Technical Reports Server (NTRS)
Domik, G.; Leberl, F.; Raggam, J.
1986-01-01
The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated through satellite radar are combined with standard image processing techniques to create a user environment to manipulate and analyze airborne and satellite radar images. One aim is to create radar products for the user from the original data to enhance the ease of understanding the contents. The results are called secondary image products and derive from the original digital images. Another aim is to support interactive SAR image analysis. Software methods permit use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps or color combinations of different component products. Efforts are ongoing to integrate individual tools into a combined hardware/software environment for interactive radar image analysis.
Grants4Targets - an innovative approach to translate ideas from basic research into novel drugs.
Lessl, Monika; Schoepe, Stefanie; Sommer, Anette; Schneider, Martin; Asadullah, Khusru
2011-04-01
Collaborations between industry and academia are steadily gaining importance. To combine expertise, Bayer Healthcare has set up a novel open innovation approach called Grants4Targets. Ideas on novel drug targets can easily be submitted to http://www.grants4targets.com. After a review process, grants are provided to perform focused experiments to further validate the proposed targets. In addition to financial support, specific know-how on target validation and drug discovery is provided. Experienced scientists are nominated as project partners and, depending on the project, tools or specific models are provided. Around 280 applications have been received and 41 projects granted. In our experience, this type of bridging fund combined with joint efforts provides a valuable tool to foster drug discovery collaborations. Copyright © 2010 Elsevier Ltd. All rights reserved.
Hiller, Thomas Stephan; Freytag, Antje; Breitbart, Jörg; Teismann, Tobias; Schöne, Elisabeth; Blank, Wolfgang; Schelle, Mercedes; Vollmar, Horst Christian; Margraf, Jürgen; Gensichen, Jochen
2018-04-01
Behavior therapy-oriented methods are recommended for treating anxiety disorders in primary care. The treatment of patients with long-term conditions can be improved by case management and structured clinical monitoring. The present paper describes the rationale, design and application of the 'Jena Anxiety Monitoring List' (JAMoL), a monitoring tool for the treatment of patients with panic disorder, with or without agoraphobia, in primary care. JAMoL's design was based on established clinical measures, the rationale of exposure-based anxiety treatment, and research on family practice-based case management. After piloting, the JAMoL was used in the clinical study 'Jena-PARADISE' (ISRCTN64669297), where non-physician practice staff monitored patients with panic disorder by telephone. Using semi-structured interviews in concomitant studies, study participants were asked about the instrument's functionality. The JAMoL assesses the severity of anxiety symptoms (6 items) as well as the patient's adherence to therapy (4 items) and fosters the case management-related information exchange (3 items). An integrated traffic light scheme facilitates the evaluation of monitoring results. Within the clinical study, non-physician practice staff carried out a total of 1,525 JAMoL-supported monitoring calls on 177 patients from 30 primary care practices (median calls per patient: 10 [interquartile range, 9-10]). Qualitative analyses revealed that most practice teams and patients rated the JAMoL as a practicable and treatment-relevant tool. The JAMoL enables primary care practice teams to continuously monitor anxiety symptoms and treatment adherence in patients with panic disorder with or without agoraphobia. Within the behavior therapy-oriented treatment program 'Jena-PARADISE', the JAMoL constitutes an important case management tool. Copyright © 2018. Published by Elsevier GmbH.
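Purely as an illustration of a traffic-light evaluation over monitoring items, the sketch below maps a mean item score to green/yellow/red. The item scale and cut-offs are invented; the actual JAMoL defines its own items, scales, and thresholds.

```python
# Hypothetical traffic-light evaluation over monitoring items, in the
# spirit of the instrument described above. The 0-3 item scale and the
# yellow/red cut-offs are invented for illustration.

def traffic_light(item_scores, yellow_at=1.0, red_at=2.0):
    """item_scores: per-item ratings, assumed 0 (none) to 3 (severe)."""
    mean = sum(item_scores) / len(item_scores)
    if mean >= red_at:
        return "red"     # e.g. contact the physician promptly
    if mean >= yellow_at:
        return "yellow"  # e.g. discuss at the next scheduled call
    return "green"       # continue monitoring as planned

print(traffic_light([0, 1, 0, 2, 1, 0]))  # six severity items -> green
```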
CALL in the Zone of Proximal Development: Novelty Effects and Teacher Guidance
ERIC Educational Resources Information Center
Karlström, Petter; Lundin, Eva
2013-01-01
Digital tools are not always used in the manner their designers had in mind. Therefore, it is not enough to assume that learning through CALL tools occurs in intended ways, if at all. We have studied the use of an enhanced word processor for writing essays in Swedish as a second language. The word processor contained natural language processing…
Software engineering methodologies and tools
NASA Technical Reports Server (NTRS)
Wilcox, Lawrence M.
1993-01-01
Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer-assisted software engineering (CASE) tools, based on actual installation of and experimentation with some specific tools.
VIPER: a web application for rapid expert review of variant calls.
Wöste, Marius; Dugas, Martin
2018-06-01
With the rapid development of next-generation sequencing, cost and time requirements for genomic sequencing are decreasing, enabling applications in many areas such as cancer research. Many tools have been developed to analyze genomic variation, ranging from single nucleotide variants to whole chromosomal aberrations. As sequencing throughput increases, the number of variants called by such tools also grows. The manual inspection often employed for such calls is thus becoming a time-consuming procedure. We developed the Variant InsPector and Expert Rating tool (VIPER) to speed up this process by integrating the Integrative Genomics Viewer into a web application. Analysts can then quickly iterate through variants, apply filters and make decisions based on the generated images and variant metadata. VIPER was successfully employed in analyses with manual inspection of more than 10 000 calls. VIPER is implemented in Java and Javascript and is freely available at https://github.com/MarWoes/viper. marius.woeste@uni-muenster.de. Supplementary data are available at Bioinformatics online.
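The filter-then-review loop such a tool supports can be sketched in a few lines: variant calls, represented here as plain dictionaries, are narrowed by quality, depth, and allele-frequency rules before visual inspection. The field names and cut-offs are generic illustrations, not VIPER's actual filter set.

```python
# Generic sketch of pre-filtering variant calls before manual review.
# Field names and thresholds are illustrative, not VIPER's filter set.

variants = [
    {"chrom": "chr1", "pos": 12345, "qual": 88.0, "af": 0.48, "depth": 60},
    {"chrom": "chr2", "pos": 67890, "qual": 12.0, "af": 0.05, "depth": 8},
]

def passes(v, min_qual=30.0, min_depth=20, min_af=0.10):
    return v["qual"] >= min_qual and v["depth"] >= min_depth and v["af"] >= min_af

for v in filter(passes, variants):
    print(f'{v["chrom"]}:{v["pos"]} qual={v["qual"]} -> queue for visual review')
```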
Recent Evidence for Emerging Digital Technologies to Support Global HIV Engagement in Care
Jongbloed, Kate; Parmar, Sunjit; van der Kop, Mia; Spittal, Patricia M.; Lester, Richard T.
2017-01-01
Antiretroviral therapy is a powerful tool to reduce morbidity and mortality for the 35 million people living with HIV globally. However, availability of treatment alone is insufficient to meet new UNAIDS 90-90-90 targets calling for rapid scale-up of engagement in HIV care to end the epidemic in 2030. Digital technology interventions (mHealth, eHealth, and telehealth) are emerging as one approach to support lifelong engagement in HIV care. This review synthesizes recent reviews and primary studies published since January 2014 on digital technology interventions for engagement in HIV care after diagnosis. Technologies for health provide emerging and proven solutions to support achievement of the United Nations targets for the generalized HIV-affected population. Much of the existing evidence addresses antiretroviral therapy (ART) adherence; however, studies have begun to investigate programs to support linkage and retention in care as well as interventions to engage key populations facing extensive barriers to care. PMID:26454756
Promoting Reflective Physics Teaching Through the Use of Collaborative Learning Annotation System
NASA Astrophysics Data System (ADS)
Milner-Bolotin, Marina
2018-05-01
Effective physics teaching requires extensive knowledge of physics, relevant pedagogies, and modern educational technologies that can support student learning. Acquiring this knowledge is a challenging task, considering how fast modern technologies and expectations of student learning outcomes and teaching practices are changing. Therefore, 21st-century physics teachers should be supported in developing a different way of thinking about technology-enhanced physics teaching and learning. We call it Deliberate Pedagogical Thinking with Technology, and base it on the original Pedagogical Content Knowledge and Technological Pedagogical Content Knowledge frameworks. However, unlike the two aforementioned frameworks, Deliberate Pedagogical Thinking with Technology emphasizes not only teachers' knowledge, but also their attitudes and dispositions about using digital tools to support student learning. This paper examines how an online system that allows an ongoing discussion of videos uploaded to it by students can support reflection in physics teacher education. Examples of using such a system in physics teacher education and teacher-candidates' feedback on their experiences with it are also discussed.
PRO-Elicere: A Study for Create a New Process of Dependability Analysis of Space Computer Systems
NASA Astrophysics Data System (ADS)
da Silva, Glauco; Netto Lahoz, Carlos Henrique
2013-09-01
This paper presents a new approach to computer system dependability analysis, called PRO-ELICERE, which introduces data mining concepts and intelligent decision-support mechanisms to analyze the potential hazards and failures of a critical computer system. It also presents some techniques and tools that support traditional dependability analysis and briefly discusses the concept of knowledge discovery and intelligent databases for critical computer systems. It then introduces the PRO-ELICERE process, an intelligent approach to automate ELICERE, a process created to extract non-functional requirements for critical computer systems. PRO-ELICERE can be used in the V&V activities of the projects of the Institute of Aeronautics and Space, such as the Brazilian Satellite Launcher (VLS-1).
Environmental Resilience: Exploring Scientific Concepts for ...
This report summarizes two Community Environmental Resilience Index (CERI) workshops held at EPA in May and July of 2014. The workshops explored scientific concepts for building an index of indicators of community environmental resilience to natural or human-caused disasters. The index could be used to support disaster decision-making. Key workshop outcomes include: a working definition of environmental resilience and insight into how it relates to EPA's mission and Strategic Goals, a call for an inventory of EPA resiliency tools, a preliminary list of indicators and a CERI structure, identification of next steps for index development, and the emergence of a network of collaborators. The report can be used to support EPA's work in resilience under PPD-8, PPD-21, and the national response and disaster recovery frameworks. It can also feed into interagency efforts on building community resilience.
The Next Generation of the Montage Image Mosaic Engine
NASA Astrophysics Data System (ADS)
Berriman, G. Bruce; Good, John; Rusholme, Ben; Robitaille, Thomas
2016-01-01
We have released a major upgrade of the Montage image mosaic engine (http://montage.ipac.caltech.edu), as part of a program to develop the next generation of the engine in response to the rapid changes in the data processing landscape in astronomy, which is generating ever larger data sets in ever more complex formats. The new release (version 4) contains modules dedicated to creating and managing mosaics of data stored as multi-dimensional arrays ("data cubes"). The new release inherits the architectural benefits of portability and scalability of the original design. The code is publicly available on GitHub and the Montage web page. The release includes a command line tool that supports visualization of large images, and a beta release of a Python interface to the visualization tool. We will provide examples of how to use these features. We are generating a mosaic of the Galactic Arecibo L-band Feed Array HI (GALFA-HI) Survey maps of neutral hydrogen in and around our Milky Way Galaxy, to assess performance at scale and to develop tools and methodologies that will enable scientists inexpert in cloud processing to exploit cloud platforms for data processing and product generation at scale. Future releases will include support for an R-tree based mechanism for fast discovery of and access to large data sets, and on-demand access to calibrated SDSS DR9 data that exploits it; support for the Hierarchical Equal Area isoLatitude Pixelization (HEALPix) scheme, now standard for projects investigating cosmic background radiation (Gorski et al. 2005); support for the Tessellated Octahedral Adaptive Subdivision Transform (TOAST), the sky partitioning scheme used by the WorldWide Telescope (WWT); and a public applications programming interface (API) in C that can be called from other languages, especially Python.
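For readers unfamiliar with the HEALPix scheme mentioned above: it tessellates the sphere into 12 * nside^2 equal-area pixels. A minimal sketch of that bookkeeping (plain Python, no healpy dependency; the function names are ours):

```python
import math

def healpix_npix(nside: int) -> int:
    """Number of equal-area pixels in a HEALPix tessellation of the sphere."""
    return 12 * nside * nside

def healpix_pixel_area_sr(nside: int) -> float:
    """Solid angle of one pixel in steradians: 4*pi shared evenly."""
    return 4.0 * math.pi / healpix_npix(nside)

for nside in (1, 64, 1024):
    print(nside, healpix_npix(nside), healpix_pixel_area_sr(nside))
```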
Towards computer-assisted surgery in shoulder joint replacement
NASA Astrophysics Data System (ADS)
Valstar, Edward R.; Botha, Charl P.; van der Glas, Marjolein; Rozing, Piet M.; van der Helm, Frans C. T.; Post, Frits H.; Vossepoel, Albert M.
A research programme that aims to improve the state of the art in shoulder joint replacement surgery has been initiated at the Delft University of Technology. Development of improved endoprostheses for the upper extremities (DIPEX), as this effort is called, is a clinically driven multidisciplinary programme consisting of many contributory aspects. A part of this research programme focuses on the pre-operative planning and per-operative guidance issues. The ultimate goal of this part of the DIPEX project is to create a surgical support infrastructure that can be used to predict the optimal surgical protocol and can assist with the selection of the most suitable endoprosthesis for a particular patient. In the pre-operative planning phase, advanced biomechanical models of the endoprosthesis fixation and the musculo-skeletal system of the shoulder will be incorporated, which are adjusted to the individual's morphology. Subsequently, the support infrastructure must assist the surgeon during the operation in executing his surgical plan. In the per-operative phase, the chosen optimal position of the endoprosthesis can be realised using camera-assisted tools or mechanical guidance tools. In this article, the pathway towards the desired surgical support infrastructure is described. Furthermore, we discuss the pre-operative planning phase and the per-operative guidance phase, the initial work performed, and finally, possible approaches for improving prosthesis placement.
Sloane, Elliot B; Rosow, Eric; Adam, Joe; Shine, Dave
2006-01-01
Each individual U.S. Air Force, Army, and Navy Surgeon General has integrated oversight of global medical supplies and resources using the Joint Medical Asset Repository (JMAR). A Business Intelligence system called the JMAR Executive Dashboard Initiative (JEDI) was developed over a three-year period to add real-time interactive data-mining tools and executive dashboards. Medical resources can now be efficiently reallocated to military, veteran, family, or civilian purposes and inventories can be maintained at lean levels with peaks managed by interactive dashboards that reduce workload and errors.
Portable parallel portfolio optimization in the Aurora Financial Management System
NASA Astrophysics Data System (ADS)
Laure, Erwin; Moritsch, Hans
2001-07-01
Financial planning problems are formulated as large scale, stochastic, multiperiod, tree structured optimization problems. An efficient technique for solving this kind of problem is the nested Benders decomposition method. In this paper we present a parallel, portable, asynchronous implementation of this technique. To achieve our portability goals we chose the programming language Java for our implementation and used a high-level Java-based framework, called OpusJava, for expressing the parallelism potential as well as synchronization constraints. Our implementation is embedded within a modular decision support tool for portfolio and asset liability management, the Aurora Financial Management System.
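The "tree structured" formulation above can be pictured as a scenario tree whose expected cost is evaluated by backward recursion over the nodes; nested Benders decomposition exploits exactly this structure by solving stage subproblems and exchanging cuts. A minimal sketch of the tree evaluation only, in Python rather than the paper's Java, with invented costs and probabilities (no Benders cuts are generated here):

```python
from dataclasses import dataclass, field

@dataclass
class ScenarioNode:
    """One node of a multiperiod scenario tree."""
    stage_cost: float                 # cost incurred at this node
    prob: float = 1.0                 # probability of reaching this node from its parent
    children: list = field(default_factory=list)

def expected_cost(node: ScenarioNode) -> float:
    """Backward recursion: node cost plus probability-weighted cost of its subtrees."""
    return node.stage_cost + sum(c.prob * expected_cost(c) for c in node.children)

# Two-period toy example: one root decision, two equally likely second-period outcomes.
root = ScenarioNode(stage_cost=10.0, children=[
    ScenarioNode(stage_cost=4.0, prob=0.5),
    ScenarioNode(stage_cost=8.0, prob=0.5),
])
print(expected_cost(root))  # 10 + 0.5*4 + 0.5*8 = 16.0
```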
[The grounded theory as a methodological alternative for nursing research].
dos Santos, Sérgio Ribeiro; da Nóbrega, Maria Miriam
2002-01-01
This study presents a method of interpretative and systematic research with appliance to the development of studies in nursing called "the grounded theory", whose theoretical support is the symbolic interactionism. The purpose of the paper is to describe the grounded theory as an alternative methodology for the construction of knowledge in nursing. The study highlights four topics: the basic principle, the basic concepts, the trajectory of the method and the process of analysis of the data. We conclude that the systematization of data and its interpretation, based on social actors' experience, constitute strong subsidies to generate theories through this research tool.
Automatic Implementation of Ttethernet-Based Time-Triggered Avionics Applications
NASA Astrophysics Data System (ADS)
Gorcitz, Raul Adrian; Carle, Thomas; Lesens, David; Monchaux, David; Potop-Butucaru, Dumitru; Sorel, Yves
2015-09-01
The design of safety-critical embedded systems such as those used in avionics still involves largely manual phases. In avionics, however, standard interfaces embodied in standards such as ARINC 653 or TTEthernet should allow fully automatic code generation flows that reduce costs while improving the quality of the generated code, much as compilers did when they replaced manual assembly coding. In this paper, we briefly present such a fully automatic implementation tool, called Lopht, for ARINC 653-based time-triggered systems, and then explain how it is currently being extended to support TTEthernet networks.
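Time-triggered systems of the kind Lopht targets dispatch tasks from a precomputed static schedule rather than in response to events. A toy sketch of such a dispatch table (task names, offsets, and the 100 ms major frame are invented; in a real ARINC 653 system the schedule is generated offline and enforced by the operating system):

```python
# A static, cyclic time-triggered schedule: (release offset in ms, task name),
# sorted by offset. The major frame repeats every `major_frame_ms` milliseconds.
major_frame_ms = 100
schedule = [(0, "read_sensors"), (20, "navigation"), (55, "guidance"), (80, "telemetry")]

def dispatch(now_ms: int) -> str:
    """Return the task whose window covers time `now_ms` within the major frame."""
    t = now_ms % major_frame_ms
    current = schedule[0][1]
    for offset, task in schedule:
        if t >= offset:
            current = task
    return current

for t in (5, 30, 60, 95, 130):
    print(t, dispatch(t))  # 130 wraps around to the 30 ms slot: navigation
```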
Guardia, Gabriela D A; Ferreira Pires, Luís; da Silva, Eduardo G; de Farias, Cléver R G
2017-02-01
Gene expression studies often require the combined use of a number of analysis tools. However, manual integration of analysis tools can be cumbersome and error prone. To support a higher level of automation in the integration process, efforts have been made in the biomedical domain towards the development of semantic web services and supporting composition environments. Yet, most environments consider only the execution of simple service behaviours and require users to focus on technical details of the composition process. We propose a novel approach to the semantic composition of gene expression analysis services that addresses the shortcomings of the existing solutions. Our approach includes an architecture designed to support the service composition process for gene expression analysis, and a flexible strategy for the (semi-)automatic composition of semantic web services. Finally, we implement a supporting platform called SemanticSCo to realize the proposed composition approach and demonstrate its functionality by successfully reproducing a microarray study documented in the literature. The SemanticSCo platform provides support for the composition of RESTful web services semantically annotated using SAWSDL. Our platform also supports the definition of constraints/conditions regarding the order in which service operations should be invoked, thus enabling the definition of complex service behaviours. Our proposed solution for semantic web service composition takes into account the requirements of different stakeholders and addresses all phases of the service composition process. It also provides support for the definition of analysis workflows at a high level of abstraction, thus enabling users to focus on biological research issues rather than on the technical details of the composition process. The SemanticSCo source code is available at https://github.com/usplssb/SemanticSCo. Copyright © 2017 Elsevier Inc. All rights reserved.
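The ordering constraints described above ("which service operations should be invoked before which") amount to executing a dependency graph. A generic illustration, not SemanticSCo's API, using Python's standard library topological sorter and an invented gene-expression workflow:

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical workflow: operation -> set of operations that must run first.
deps = {
    "load_raw_data":     set(),
    "normalize":         {"load_raw_data"},
    "differential_expr": {"normalize"},
    "annotate_genes":    {"differential_expr"},
}

# static_order() yields an invocation order that respects every constraint.
print(list(TopologicalSorter(deps).static_order()))
# ['load_raw_data', 'normalize', 'differential_expr', 'annotate_genes']
```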
A Simulation on Organizational Communication Patterns During a Terrorist Attack
2008-06-01
and the Air Support Headquarters. The call is created at the time of attack, and it automatically includes a request for help. Reliability of...communication conditions. 2. Air Support call: This call is produced for just the Headquarters of Air Component, only in case of armed attacks. The request can...estimated speed of armored vehicles in combat areas (West-Point Organization, 2002). When a call for air support is received, an information
ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES
Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials...
Spitzer Space Telescope proposal process
NASA Astrophysics Data System (ADS)
Laine, S.; Silbermann, N. A.; Rebull, L. M.; Storrie-Lombardi, L. J.
2006-06-01
This paper discusses the Spitzer Space Telescope General Observer proposal process. Proposals, consisting of the scientific justification, basic contact information for the observer, and observation requests, are submitted electronically using a client-server Java package called Spot. The Spitzer Science Center (SSC) uses a one-phase proposal submission process, meaning that fully planned observations are submitted for most proposals at the time of submission, not months after acceptance. Ample documentation and tools are available to observers on SSC web pages to support the preparation of proposals, including an email-based Helpdesk. Upon submission, proposals are immediately ingested into a database that can be queried at the SSC for program information, statistics, etc., at any time. Large proposals are checked for technical feasibility, and all proposals are checked against duplicates of already approved observations. Output from these tasks is made available to the Time Allocation Committee (TAC) members. At the review meeting, web-based software is used to record reviewer comments and keep track of the voted scores. After the meeting, another Java-based web tool, Griffin, is used to track the approved programs as they go through technical reviews, duplication checks, and minor modifications before the observations are released for scheduling. In addition to detailing the proposal process, lessons learned from the first two General Observer proposal calls are discussed.
Murphy, Jill; Hatfield, Jennifer; Afsana, Kaosar; Neufeld, Vic
2015-03-01
Global health research partnerships have many benefits, including the development of research capacity and improving the production and use of evidence to improve global health equity. These partnerships also include many challenges, with power and resource differences often leading to inequitable and unethical partnership dynamics. Responding to these challenges and to important gaps in partnership scholarship, the Canadian Coalition for Global Health Research (CCGHR) conducted a three-year, multi-regional consultation to capture the research partnership experiences of stakeholders in South Asia, Latin America, and sub-Saharan Africa. The consultation participants described persistent inequities in the conduct of global health research partnerships and called for a mechanism through which to improve accountability for ethical conduct within partnerships. They also called for a commitment by the global health research community to research partnership ethics. The Partnership Assessment Toolkit (PAT) is a practical tool that enables partners to openly discuss the ethics of their partnership and to put in place structures that create ethical accountability. Clear mechanisms such as the PAT are essential to guide ethical conduct to ensure that global health research partnerships are beneficial to all collaborators, that they reflect the values of the global health endeavor more broadly, and that they ultimately lead to improvements in health outcomes and health equity.
Rezapour, Tara; Hatami, Javad; Farhoudian, Ali; Sofuoglu, Mehmet; Noroozi, Alireza; Daneshmand, Reza; Samiei, Ahmadreza; Ekhtiari, Hamed
2015-01-01
Despite extensive evidence for cognitive deficits associated with drug use and multiple publications supporting the efficacy of cognitive rehabilitation treatment (CRT) services for drug addictions, there are few well-structured tools and organized programs to improve cognitive abilities in substance users. Most published studies on cognitive rehabilitation for drug dependent patients used rehabilitation tools that were originally designed for other conditions, such as schizophrenia or traumatic brain injury, and not specifically for drug dependent patients. These studies also suffer from small sample sizes, lack of follow-up assessments, and/or a lack of comprehensive treatment outcome measures. To address these limitations, we decided to develop and investigate the efficacy of a paper-and-pencil cognitive rehabilitation package called NECOREDA (Neurocognitive Rehabilitation for Disease of Addiction) to improve neurocognitive deficits associated with drug dependence, particularly those caused by stimulants (e.g., amphetamine-type stimulants and cocaine) and opiates. To evaluate the feasibility of the NECOREDA program, we conducted a pilot study with 10 opiate and methamphetamine dependent patients for 3 months in an outpatient setting. NECOREDA was revised based on qualitative comments received from clients and treatment providers. The final version of NECOREDA is composed of brain training exercises called "Brain Gym" and psychoeducational modules called "Brain Treasures", implemented in 16 training sessions interleaved with 16 review and practice sessions. NECOREDA will be evaluated as an add-on intervention to methadone maintenance treatment in a randomized clinical trial among opiate dependent patients starting in August 2015. We discuss methodological features of NECOREDA development and evaluation in this article. PMID:26649167
DOT National Transportation Integrated Search
2003-04-01
Introduction Motorist aid call boxes are used to provide motorist assistance, improve safety, and can serve as an incident detection tool. More recently, Intelligent Transportation Systems (ITS) applications have been added to call box systems to enh...
NASA Astrophysics Data System (ADS)
Smith, B.
2015-12-01
In 2014, eight Department of Energy (DOE) national laboratories, four academic institutions, one company, and the National Center for Atmospheric Research combined forces in a project called Accelerated Climate Modeling for Energy (ACME), with the goal of speeding Earth system model development for climate and energy. Over the planned 10-year span, the project will conduct simulations and modeling on DOE's most powerful high-performance computing systems at the Oak Ridge, Argonne, and Lawrence Berkeley leadership computing facilities. A key component of the ACME project is the development of an interactive test bed for the advanced Earth system model. Its execution infrastructure will accelerate model development and testing cycles. The ACME Workflow Group is leading the efforts to automate labor-intensive tasks, provide intelligent support for complex tasks, and reduce duplication of effort through collaboration support. As part of this new workflow environment, we have created a diagnostic, metric, and intercomparison Python framework, called UVCMetrics, to aid in the testing-to-production execution of the ACME model. The framework exploits similarities among different diagnostics to compactly support diagnosis of new models. It presently focuses on atmosphere and land but is designed to support ocean and sea ice model components as well. This framework is built on top of the existing open-source software framework known as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT). Because of its flexible framework design, scientists and modelers can now generate thousands of possible diagnostic outputs. These diagnostics can compare model runs, compare model vs. observation, or simply verify that a model is physically realistic. Additional diagnostics are easily integrated into the framework, and our users have already added several. Diagnostics can be generated, viewed, and manipulated from the UV-CDAT graphical user interface, from Python command line scripts and programs, and from web browsers. The framework is designed to be scalable to large datasets, yet easy to use and familiar to scientists using previous tools. Integration in the overall ACME user interface facilitates data publication, further analysis, and quick feedback to model developers and scientists making component or coupled model runs.
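A "model vs. observation" diagnostic of the kind the framework generates typically reduces to field statistics such as mean bias and RMSE. A stand-alone sketch with invented surface-temperature samples (pure Python; UV-CDAT itself works on gridded, metadata-rich arrays):

```python
import math

def bias_and_rmse(model, obs):
    """Mean bias and root-mean-square error between paired samples."""
    assert model and len(model) == len(obs)
    diffs = [m - o for m, o in zip(model, obs)]
    bias = sum(diffs) / len(diffs)
    rmse = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return bias, rmse

# Hypothetical temperatures (kelvin) at matching grid points.
model = [288.1, 290.4, 285.0, 287.9]
obs = [287.6, 291.0, 284.2, 288.3]
print(bias_and_rmse(model, obs))  # small warm bias, sub-kelvin RMSE
```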
Designing software for operational decision support through coloured Petri nets
NASA Astrophysics Data System (ADS)
Maggi, F. M.; Westergaard, M.
2017-05-01
Operational support provides, during the execution of a business process, replies to questions such as 'how do I end the execution of the process in the cheapest way?' and 'is my execution compliant with some expected behaviour?' These questions may be asked several times during a single execution and, to answer them, dedicated software components (the so-called operational support providers) need to be invoked. Therefore, an infrastructure is needed to handle multiple providers, maintain data between queries about the same execution and discard information when it is no longer needed. In this paper, we use coloured Petri nets (CPNs) to model and analyse software implementing such an infrastructure. This analysis is needed to clarify the requirements before implementation and to guarantee that the resulting software is correct. To this aim, we present techniques to represent and analyse state spaces with 250 million states on a normal PC. We show how the specified requirements have been implemented as a plug-in of the process mining tool ProM and how the operational support in ProM can be used in combination with an existing operational support provider.
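For readers unfamiliar with the formalism: a Petri net marking assigns tokens to places, and a transition is enabled when every input place holds a token; firing consumes input tokens and produces output tokens. A minimal uncoloured sketch (CPNs additionally attach data values to tokens; the place and transition names are invented):

```python
# Places hold token counts; transitions consume from inputs and produce to outputs.
marking = {"query_received": 1, "provider_ready": 1, "reply_sent": 0}

transitions = {
    "answer_query": {"inputs": ["query_received", "provider_ready"],
                     "outputs": ["reply_sent", "provider_ready"]},
}

def enabled(name: str) -> bool:
    return all(marking[p] > 0 for p in transitions[name]["inputs"])

def fire(name: str) -> None:
    assert enabled(name), f"{name} is not enabled"
    for p in transitions[name]["inputs"]:
        marking[p] -= 1
    for p in transitions[name]["outputs"]:
        marking[p] += 1

fire("answer_query")
print(marking)  # {'query_received': 0, 'provider_ready': 1, 'reply_sent': 1}
```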
Terminal Area Conflict Detection and Resolution Tool
NASA Technical Reports Server (NTRS)
Verma, Savita Arora
2011-01-01
This poster will describe analysis of a conflict detection and resolution tool for the terminal area called T-TSAFE. With altitude clearance information, the tool can reduce false alerts to as low as 2 per hour.
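Pairwise conflict detection of this kind usually starts from a closest-point-of-approach calculation on extrapolated tracks. A self-contained 2-D sketch (positions, velocities, and units are invented; an operational detector such as T-TSAFE also uses trajectory intent and the altitude clearances mentioned above):

```python
def time_of_closest_approach(p1, v1, p2, v2):
    """Time (>= 0) minimizing the distance between two constant-velocity 2-D tracks."""
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]   # relative position
    wx, wy = v2[0] - v1[0], v2[1] - v1[1]   # relative velocity
    w2 = wx * wx + wy * wy
    if w2 == 0.0:
        return 0.0                          # same velocity: separation never changes
    return max(0.0, -(rx * wx + ry * wy) / w2)

def min_separation(p1, v1, p2, v2):
    t = time_of_closest_approach(p1, v1, p2, v2)
    dx = (p2[0] + v2[0] * t) - (p1[0] + v1[0] * t)
    dy = (p2[1] + v2[1] * t) - (p1[1] + v1[1] * t)
    return (dx * dx + dy * dy) ** 0.5, t

# Two converging tracks: positions in NM, velocities in NM/min.
sep, t = min_separation((0, 0), (6, 0), (30, 10), (-6, 0))
print(f"minimum separation {sep:.1f} NM at t = {t:.1f} min")  # 10.0 NM at 2.5 min
```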
Stacey, Dawn; Chambers, Suzanne K; Jacobsen, Mary Jane; Dunn, Jeff
2008-11-01
To evaluate the effect of an intervention on healthcare professionals' perceptions of barriers influencing their provision of decision support for callers facing cancer-related decisions. A pre- and post-test study guided by the Ottawa Model of Research Use. Australian statewide cancer call center that provides public access to information and supportive cancer services. 34 nurses, psychologists, and other allied healthcare professionals at the cancer call center. Participants completed baseline measures and, subsequently, were exposed to an intervention that included a decision support tutorial, coaching protocol, and skill-building workshop. Strategies were implemented to address organizational barriers. Perceived barriers and facilitators influencing provision of decision support, decision support knowledge, quality of decision support provided to standardized callers, and call length. Postintervention participants felt more prepared, confident in providing decision support, and aware of decision support resources. They had a stronger belief that providing decision support was within their role. Participants significantly improved their knowledge and provided higher-quality decision support to standardized callers without changing call length. The implementation intervention overcame several identified barriers that influenced call center professionals when providing decision support. Nurses and other helpline professionals have the potential to provide decision support designed to help callers understand cancer information, clarify their values associated with their options, and reduce decisional conflict. However, they require targeted education and organizational interventions to reduce their perceived barriers to providing decision support.
NASA Astrophysics Data System (ADS)
Duffy, P. B.; Colohan, P.; Driggers, R.; Herring, D.; Laurier, F.; Petes, L.; Ruffo, S.; Tilmes, C.; Venkataraman, B.; Weaver, C. P.
2014-12-01
Effective adaptation to impacts of climate change requires best-available information. To be most useful, this information should be easily found, well-documented, and translated into tools that decision-makers use and trust. To meet these needs, the President's Climate Action Plan includes efforts to develop "actionable climate science". The Climate Data Initiative (CDI) leverages the Federal Government's extensive, open data resources to stimulate innovation and private-sector entrepreneurship in support of actions to prepare for climate change. The Initiative forges commitments and partnerships from the private, NGO, academic, and public sectors to create data-driven tools. Open data from Federal agencies to support this innovation is available on Climate.Data.gov, initially focusing on coastal flooding but soon to expand to topics including food, energy, water, transportation, and health. The Climate Resilience Toolkit (CRT) will facilitate access to data-driven resilience tools, services, and best practices, including those accessible through the CDI. The CRT will also include access to training and tutorials, case studies, engagement forums, and other information sources. The Climate Action Plan also calls for a public-private partnership on extreme weather risk, with the goal of generating improved assessments of risk from different types of extreme weather events, using methods and data that are transparent and accessible. Finally, the U.S. Global Change Research Program and associated agencies work to advance the science necessary to inform decisions and sustain assessments. Collectively, these efforts represent increased emphasis across the Federal Government on the importance of information to support climate resilience.
... is usually done using a tool called a stethoscope. Health care providers routinely listen to a person's ... unborn infants. This can be done with a stethoscope or with sound waves (called Doppler ultrasound). Auscultation ...
ERIC Educational Resources Information Center
Parmaxi, Antigoni; Zaphiris, Panayiotis
2017-01-01
This study explores the research development pertaining to the use of Web 2.0 technologies in the field of Computer-Assisted Language Learning (CALL). Published research manuscripts related to the use of Web 2.0 tools in CALL have been explored, and the following research foci have been determined: (1) Web 2.0 tools that dominate second/foreign…
Mobile phones: the next step towards healthcare delivery in rural India?
DeSouza, Sherwin I; Rashmi, M R; Vasanthi, Agalya P; Joseph, Suchitha Maria; Rodrigues, Rashmi
2014-01-01
Given the ubiquity of mobile phones, their use to support healthcare in the Indian context is inevitable. It is however necessary to assess end-user perceptions regarding mobile health interventions especially in the rural Indian context prior to its use in healthcare. This would contextualize the use of mobile phone communication for health to 70% of the country's population that resides in rural India. To explore the acceptability of delivering healthcare interventions through mobile phones among users in a village in rural Bangalore. This was an exploratory study of 488 mobile phone users, residing in a village, near Bangalore city, Karnataka, South India. A pretested, translated, interviewer-administered questionnaire was used to obtain data on mobile phone usage patterns and acceptability of the mobile phone, as a tool for health-related communication. The data is described using basic statistical measures. The primary use of mobile phones was to make or receive phone calls (100%). Text messaging (SMS) was used by only 70 (14%) of the respondents. Most of the respondents, 484 (99%), were willing to receive health-related information on their mobile phones and did not consider receiving such information, an intrusion into their personal life. While receiving reminders for drug adherence was acceptable to most 479 (98%) of our respondents, 424 (89%) preferred voice calls alone to other forms of communication. Nearly all were willing to use their mobile phones to communicate with health personnel in emergencies and 367 (75%) were willing to consult a doctor via the phone in an acute illness. Factors such as sex, English literacy, employment status, and presence of chronic disease affected preferences regarding mode and content of communication. The mobile phone, as a tool for receiving health information and supporting healthcare through mHealth interventions was acceptable in the rural Indian context.
ToTem: a tool for variant calling pipeline optimization.
Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka
2018-06-26
High-throughput bioinformatics analyses of next generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process, with the possibility of plugging in almost any tool or code. To prevent over-fitting of pipeline parameters, ToTem ensures their reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are interpreted as interactive graphs and tables allowing an optimal pipeline to be selected, based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is a tool for automated pipeline optimization which is freely available as a web application at https://totem.software.
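The precision, recall, and F-measure that ToTem scores against a truth set can be computed directly from sets of called and expected variants. A minimal sketch with invented variants (real benchmarking first normalizes variant representations, e.g. with a comparison tool such as hap.py; that step is omitted here):

```python
def benchmark(called: set, truth: set):
    """Precision, recall, and F1 of a variant call set against a truth set."""
    tp = len(called & truth)
    precision = tp / len(called) if called else 0.0
    recall = tp / len(truth) if truth else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Variants as (chromosome, position, ref, alt) tuples.
truth = {("chr1", 1005, "A", "T"), ("chr1", 2330, "G", "C"), ("chr2", 88, "T", "G")}
called = {("chr1", 1005, "A", "T"), ("chr2", 88, "T", "G"), ("chr2", 91, "C", "A")}
print(benchmark(called, truth))  # one false positive, one false negative: P = R = F1 = 2/3
```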
Halvade-RNA: Parallel variant calling from transcriptomic data using MapReduce.
Decap, Dries; Reumers, Joke; Herzeel, Charlotte; Costanza, Pascal; Fostier, Jan
2017-01-01
Given the current cost-effectiveness of next-generation sequencing, the amount of DNA-seq and RNA-seq data generated is ever increasing. One of the primary objectives of NGS experiments is calling genetic variants. While highly accurate, most variant calling pipelines are not optimized to run efficiently on large data sets. However, as variant calling in genomic data has become common practice, several methods have been proposed to reduce runtime for DNA-seq analysis through the use of parallel computing. Determining the effectively expressed variants from transcriptomics (RNA-seq) data has only recently become possible, and as such does not yet benefit from efficiently parallelized workflows. We introduce Halvade-RNA, a parallel, multi-node RNA-seq variant calling pipeline based on the GATK Best Practices recommendations. Halvade-RNA makes use of the MapReduce programming model to create and manage parallel data streams on which multiple instances of existing tools such as STAR and GATK operate concurrently. Whereas the single-threaded processing of a typical RNA-seq sample requires ∼28h, Halvade-RNA reduces this runtime to ∼2h using a small cluster with two 20-core machines. Even on a single, multi-core workstation, Halvade-RNA can significantly reduce runtime compared to using multi-threading, thus providing for a more cost-effective processing of RNA-seq data. Halvade-RNA is written in Java and uses the Hadoop MapReduce 2.0 API. It supports a wide range of distributions of Hadoop, including Cloudera and Amazon EMR.
Points of attention in designing tools for regional brownfield prioritization.
Limasset, Elsa; Pizzol, Lisa; Merly, Corinne; Gatchett, Annette M; Le Guern, Cécile; Martinát, Stanislav; Klusáček, Petr; Bartke, Stephan
2018-05-01
The regeneration of brownfields has been increasingly recognized as a key instrument in sustainable land management, since freely developable land (so-called "greenfields") has become a scarce and more expensive resource, especially in densely populated areas. However, the complexity of these sites requires considerable effort to successfully complete revitalization projects, and thus the development and application of appropriate tools to support decision makers in selecting promising sites to which the limited financial resources can be efficiently allocated. The design of effective prioritization tools is a complex process, which requires the analysis and consideration of critical points of attention (PoAs), which have been identified considering the state of the art in the literature and lessons learned from previous developments of regional brownfield (BF) prioritization processes, frameworks and tools. Accordingly, we identified 5 PoAs, namely 1) Assessing end user needs and orientation discussions, 2) Availability and quality of the data needed for the BF prioritization tool, 3) Communication and stakeholder engagement, 4) Drivers of regeneration success, and 5) Financing and application costs. To deepen and collate the most recent knowledge on these topics from scientists and practitioners, we organized a focus group discussion within a special session at the AquaConSoil (ACS) conference 2017, where participants were asked to add their experience and thoughts to the discussion in order to identify the most significant and urgent points of attention in BF prioritization tool design. The result of this assessment is a comprehensive table (Table 2), which can support problem owners, investors, service providers, regulators, public and private land managers, decision makers, etc., in identifying the main aspects (sub-topics) to be considered and their relative influences, and in comprehending the general patterns and challenges to be faced when developing BF prioritization tools. Copyright © 2017 Elsevier B.V. All rights reserved.
A GIS-based tool for an integrated assessment of spatial planning trade-offs with aquaculture.
Gimpel, Antje; Stelzenmüller, Vanessa; Töpsch, Sandra; Galparsoro, Ibon; Gubbins, Matthew; Miller, David; Murillas, Arantza; Murray, Alexander G; Pınarbaşı, Kemal; Roca, Guillem; Watret, Robert
2018-06-15
The increasing demand for protein from aquaculture will trigger a global expansion of the sector in coastal and offshore waters. While contributing to food security, potential conflicts with other traditional activities such as fisheries or tourism are inevitable, thus calling for decision-support tools to assess aquaculture planning scenarios in a multi-use context. Here we introduce the AquaSpace tool, one of the first Geographic Information System (GIS)-based planning tools empowering an integrated assessment and mapping of 30 indicators reflecting economic, environmental, inter-sectorial and socio-cultural risks and opportunities for proposed aquaculture systems in a marine environment. A bottom-up process consulting more than 350 stakeholders from 10 countries across southern and northern Europe enabled the direct consideration of stakeholder needs when developing the GIS add-in. The AquaSpace tool is an open source product and builds in the prospective use of open source datasets at a European scale, hence aiming to improve reproducibility and collaboration in aquaculture science and research. Tool outputs comprise detailed reports and graphics allowing key stakeholders such as planners or licensing authorities to evaluate and communicate alternative planning scenarios and to take more informed decisions. With the help of the German North Sea case study, we demonstrate the tool's application at multiple spatial scales, with different aquaculture systems, and under a range of space-related development constraints. The computation of these aquaculture planning scenarios and the assessment of their trade-offs showed that it is entirely possible to identify aquaculture sites that respond to multiple potential challenges, for instance through a low conflict potential, a low risk of disease spread, a comparably high economic profit, and a low impact on touristic attractions. We believe that a transparent visualisation of the risks and opportunities of aquaculture planning scenarios helps an effective Marine Spatial Planning (MSP) process, supports the licensing process and simplifies investments. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jack Dongarra; Shirley Moore; Bart Miller; Jeffrey Hollingsworth
2005-03-15
The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal being to provide application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies: the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.
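The division of labor described above (an instrumentation back end collecting data, a platform-independent front end for querying it) can be caricatured with plain timers. A hedged sketch (wall-clock timing only, not PAPI hardware counters or Dyninst binary rewriting; all names are ours):

```python
import time
from collections import defaultdict
from functools import wraps

_counters = defaultdict(lambda: {"calls": 0, "seconds": 0.0})

def instrument(fn):
    """Back end: wrap a function to record call counts and wall-clock time."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            record = _counters[fn.__name__]
            record["calls"] += 1
            record["seconds"] += time.perf_counter() - start
    return wrapper

def report():
    """Front end: a uniform view of the collected performance data."""
    return dict(_counters)

@instrument
def work(n):
    return sum(i * i for i in range(n))

work(100_000)
work(50_000)
print(report())  # {'work': {'calls': 2, 'seconds': ...}}
```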
Command Center Training Tool (C2T2)
NASA Technical Reports Server (NTRS)
Jones, Phillip; Drucker, Nich; Mathews, Reejo; Stanton, Laura; Merkle, Ed
2012-01-01
This abstract presents the training approach taken to create a management-centered, experiential learning solution for the Virginia Port Authority's Port Command Center. The resulting tool, called the Command Center Training Tool (C2T2), follows a holistic approach integrated across the training management cycle and within a single environment. The approach allows a single training manager to progress from training design through execution and after-action review (AAR). It starts with modeling the training organization, identifying the organizational elements and their individual and collective performance requirements, including organization-specific performance scoring ontologies. Next, the developer specifies conditions, the problems and constructs that compose exercises and drive experiential learning. These conditions are defined by incidents, each denoting a single multimedia datum, and scenarios, which are stories told by incidents. To these layered, modular components, previously developed metadata is attached, including associated performance requirements. The components are then stored in a searchable library. An event developer can create a training event by searching the library based on metadata and then selecting and loading the resulting modular pieces. This loading process brings into the training event all the previously associated task and teamwork material as well as AAR preparation materials. The approach includes tools within an integrated management environment that place these materials at the fingertips of the event facilitator so that, in real time, the facilitator can track training audience performance and modify the training event accordingly. The approach also supports the concentrated knowledge management requirements for rapid preparation of an extensive AAR. This approach supports the integrated training cycle and allows a management-based perspective and advanced tools, through which a complex, thorough training event can be developed.
Elwyn, Glyn; Burstin, Helen; Barry, Michael J; Corry, Maureen P; Durand, Marie Anne; Lessler, Daniel; Saigal, Christopher
2018-04-27
Efforts to implement the use of patient decision aids to stimulate shared decision making are gaining prominence. Patient decision aids have been designed to help patients participate in making specific choices among health care options. Because these tools clearly influence decisions, poor quality, inaccurate or unbalanced presentations or misleading tools are a risk to patients. As payer interest in these tools increases, so does the risk that patients are harmed by the use of tools that are described as patient decision aids yet fail to meet established standards. To address this problem, the National Quality Forum (NQF) in the USA convened a multi-stakeholder expert panel in 2016 to propose national standards for a patient decision aid certification process. In 2017, NQF established an Action Team to foster shared decision making, and to call for a national certification process as one recommendation among others to stimulate improvement. A persistent barrier to the setup of a national patient decision aids certification process is the lack of a sustainable financial model to support the work. Copyright © 2018 The Author(s). Published by Elsevier B.V. All rights reserved.
A new approach to road accident rescue.
Morales, Alejandro; González-Aguilera, Diego; López, Alfonso I; Gutiérrez, Miguel A
2016-01-01
This article develops and validates a new methodology and tool for rescue assistance in traffic accidents, with the aim of improving efficiency and safety in the evacuation of people and reducing the number of victims in road accidents. Different tests supported by professionals and experts were designed under different circumstances, with different categories of damaged vehicles from real accidents and with simulated trapped victims, in order to calibrate and refine the proposed methodology and tool. To validate this new approach, a tool called App_Rescue has been developed. This tool is based on a computer system that allows efficient access to the technical information of the vehicle and the medical information of its usual passengers. The time spent during rescue using the standard protocol and the proposed method was compared. This rescue assistance system makes vital information accessible in post-trauma care services, improving the effectiveness of interventions by the emergency services and reducing rescue time, therefore minimizing the consequences involved and the number of victims. This could often mean saving lives. In the different simulated rescue operations, the rescue time was reduced by an average of 14%.
Architectural evaluation of dynamic and partial reconfigurable systems designed with DREAMS tool
NASA Astrophysics Data System (ADS)
Otero, Andrés.; Gallego, Ángel; de la Torre, Eduardo; Riesgo, Teresa
2013-05-01
The benefits of dynamic and partial reconfigurable systems are increasingly accepted by industry. For this reason, SRAM-based FPGA manufacturers have improved, or in some cases introduced for the first time, the support they offer for the design of this kind of system. However, commercial tools still offer poor flexibility, which leads to limited efficiency. This is witnessed by the overhead introduced by the communication primitives, as well as by the inability to relocate reconfigurable modules, among other limitations. For this reason, the authors have proposed an academic design tool called DREAMS, which targets the design of dynamically reconfigurable systems. In this paper, the main features offered by DREAMS are described and compared with those of existing commercial and academic tools. Moreover, a graphical user interface (GUI) is described in this work for the first time, with the aim of simplifying the design process and hiding the low-level, device-dependent details from the system designer. The overall goal is to increase designer productivity. Using the graphical interface, different reconfigurable architectures are provided as design examples. Among them, both conventional slot-based architectures and mesh-type designs are included.
Distributed data mining on grids: services, tools, and applications.
Cannataro, Mario; Congiusta, Antonio; Pugliese, Andrea; Talia, Domenico; Trunfio, Paolo
2004-12-01
Data mining algorithms are widely used today for the analysis of large corporate and scientific datasets stored in databases and data archives. Industry, science, and commerce fields often need to analyze very large datasets maintained over geographically distributed sites by using the computational power of distributed and parallel systems. The grid can play a significant role in providing an effective computational support for distributed knowledge discovery applications. For the development of data mining applications on grids we designed a system called Knowledge Grid. This paper describes the Knowledge Grid framework and presents the toolset provided by the Knowledge Grid for implementing distributed knowledge discovery. The paper discusses how to design and implement data mining applications by using the Knowledge Grid tools starting from searching grid resources, composing software and data components, and executing the resulting data mining process on a grid. Some performance results are also discussed.
How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations
NASA Astrophysics Data System (ADS)
Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev
With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP’s views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP’s graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.
Bioinformatics and molecular modeling in glycobiology
Schloissnig, Siegfried
2010-01-01
The field of glycobiology is concerned with the study of the structure, properties, and biological functions of the family of biomolecules called carbohydrates. Bioinformatics for glycobiology is a particularly challenging field, because carbohydrates exhibit a high structural diversity and their chains are often branched. Significant improvements in experimental analytical methods over recent years have led to a tremendous increase in the amount of carbohydrate structure data generated. Consequently, the availability of databases and tools to store, retrieve and analyze these data in an efficient way is of fundamental importance to progress in glycobiology. In this review, the various graphical representations and sequence formats of carbohydrates are introduced, and an overview of newly developed databases, the latest developments in sequence alignment and data mining, and tools to support experimental glycan analysis are presented. Finally, the field of structural glycoinformatics and molecular modeling of carbohydrates, glycoproteins, and protein–carbohydrate interactions is reviewed. PMID:20364395
NASA Technical Reports Server (NTRS)
Chatterjee, Sharmista
1993-01-01
Our first goal in this project was to perform a systems analysis of a closed-loop Environmental Control and Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of the existing system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the irreversibility, or lost work, of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
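The second-law screening described here rests on the entropy balance for a steady-flow control volume: the entropy generation rate equals the outgoing flow entropy minus the incoming flow entropy minus the sum of Q/T heat-transfer terms, and it must be non-negative. A minimal sketch (stream values are invented; a real ECLSS model would pull specific entropies from property tables):

```python
def entropy_generation(inlets, outlets, heat_flows):
    """Steady-state entropy generation rate (kW/K) for one component.

    inlets/outlets: lists of (mass_flow kg/s, specific_entropy kJ/kg-K)
    heat_flows:     list of (heat_rate kW, boundary_temperature K), positive into the component
    """
    s_out = sum(m * s for m, s in outlets)
    s_in = sum(m * s for m, s in inlets)
    s_heat = sum(q / t for q, t in heat_flows)
    return s_out - s_in - s_heat

# A hypothetical condensing heat exchanger rejecting 5 kW at a 280 K boundary.
s_gen = entropy_generation(inlets=[(0.02, 7.35)], outlets=[(0.02, 6.80)],
                           heat_flows=[(-5.0, 280.0)])
print(f"{s_gen:.5f} kW/K")  # positive, as the second law requires
```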
Kinjo, Akira R.; Bekker, Gert-Jan; Suzuki, Hirofumi; Tsuchiya, Yuko; Kawabata, Takeshi; Ikegawa, Yasuyo; Nakamura, Haruki
2017-01-01
The Protein Data Bank Japan (PDBj, http://pdbj.org), a member of the worldwide Protein Data Bank (wwPDB), accepts and processes the deposited data of experimentally determined macromolecular structures. While maintaining the archive in collaboration with other wwPDB partners, PDBj also provides a wide range of services and tools for analyzing structures and functions of proteins. We herein outline the updated web user interfaces together with RESTful web services and the backend relational database that support the former. To enhance the interoperability of the PDB data, we have previously developed PDB/RDF, PDB data in the Resource Description Framework (RDF) format, which is now a wwPDB standard called wwPDB/RDF. We have enhanced the connectivity of the wwPDB/RDF data by incorporating various external data resources. Services for searching, comparing and analyzing the ever-increasing large structures determined by hybrid methods are also described. PMID:27789697
National Cycle Program (NCP) Common Analysis Tool for Aeropropulsion
NASA Technical Reports Server (NTRS)
Follen, G.; Naiman, C.; Evans, A.
1999-01-01
Through the NASA/Industry Cooperative Effort (NICE) agreement, NASA Lewis and industry partners are developing a new engine simulation, called the National Cycle Program (NCP), which is the initial framework of NPSS. NCP is the first phase toward achieving the goal of NPSS. This new software supports the aerothermodynamic system simulation process for the full life cycle of an engine. The National Cycle Program (NCP) was written following the Object Oriented Paradigm (C++, CORBA). The software development process used was also based on the Object Oriented paradigm. Software reviews, configuration management, test plans, requirements, and design were all a part of the process used in developing NCP. Due to the many contributors to NCP, the stated software process was mandatory for building a common tool intended for use by so many organizations. The U.S. aircraft and airframe companies recognize NCP as the future industry standard for propulsion system modeling.
Nutrition environment measures survey-vending: development, dissemination, and reliability.
Voss, Carol; Klein, Susan; Glanz, Karen; Clawson, Margaret
2012-07-01
Researchers determined a need to develop an instrument to assess the vending machine environment that was comparably reliable and valid to other Nutrition Environment Measures Survey tools and that would provide consistent and comparable data for businesses, schools, and communities. Tool development, reliability testing, and dissemination of the Nutrition Environment Measures Survey-Vending (NEMS-V) involved a collaboration of students, professionals, and community leaders. Interrater reliability testing showed high levels of agreement among trained raters on the products and evaluations of products. NEMS-V can benefit public health partners implementing policy and environmental change initiatives as a part of their community wellness activities. The vending machine project will support a policy calling for state facilities to provide a minimum of 30% of foods and beverages in vending machines as healthy options, based on NEMS-V criteria, which will be used as a model for other businesses.
Tool use and affordance: Manipulation-based versus reasoning-based approaches.
Osiurak, François; Badets, Arnaud
2016-10-01
Tool use is a defining feature of human species. Therefore, a fundamental issue is to understand the cognitive bases of human tool use. Given that people cannot use tools without manipulating them, proponents of the manipulation-based approach have argued that tool use might be supported by the simulation of past sensorimotor experiences, also sometimes called affordances. However, in the meanwhile, evidence has been accumulated demonstrating the critical role of mechanical knowledge in tool use (i.e., the reasoning-based approach). The major goal of the present article is to examine the validity of the assumptions derived from the manipulation-based versus the reasoning-based approach. To do so, we identified 3 key issues on which the 2 approaches differ, namely, (a) the reference frame issue, (b) the intention issue, and (c) the action domain issue. These different issues will be addressed in light of studies in experimental psychology and neuropsychology that have provided valuable contributions to the topic (i.e., tool-use interaction, orientation effect, object-size effect, utilization behavior and anarchic hand, tool use and perception, apraxia of tool use, transport vs. use actions). To anticipate our conclusions, the reasoning-based approach seems to be promising for understanding the current literature, even if it is not fully satisfactory because of a certain number of findings easier to interpret with regard to the manipulation-based approach. A new avenue for future research might be to develop a framework accommodating both approaches, thereby shedding a new light on the cognitive bases of human tool use and affordances. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Microbe-ID: an open source toolbox for microbial genotyping and species identification.
Tabima, Javier F; Everhart, Sydney E; Larsen, Meredith M; Weisberg, Alexandra J; Kamvar, Zhian N; Tancos, Matthew A; Smart, Christine D; Chang, Jeff H; Grünwald, Niklaus J
2016-01-01
Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user-friendly analytical tools for fast identification are not readily available. To address this need, we created a web-based set of applications called Microbe-ID that allows a toolbox for rapid species identification and strain genotyping to be customized using any genetic markers of choice. Two components of Microbe-ID, named Sequence-ID and Genotype-ID, implement species and genotype identification, respectively. Sequence-ID allows identification of species by using BLAST to query sequences for any locus of interest against a custom reference sequence database. Genotype-ID allows placement of an unknown multilocus marker in either a minimum spanning network or a dendrogram with bootstrap support from a user-created reference database. Microbe-ID can be used for identification of any organism based on nucleotide sequences or any molecular marker type, and several examples are provided. We created a public website for demonstration purposes called Microbe-ID (microbe-id.org) and provided a working implementation for the genus Phytophthora (phytophthora-id.org). In Phytophthora-ID, the Sequence-ID application allows identification based on ITS or cox spacer sequences. Genotype-ID groups individuals into clonal lineages based on simple sequence repeat (SSR) markers for the two invasive plant pathogen species P. infestans and P. ramorum. All code is open source and available on GitHub and CRAN. Instructions for installation and use are provided at https://github.com/grunwaldlab/Microbe-ID.
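The Sequence-ID step (score an unknown locus sequence against a reference database and report the best hit) can be mimicked without BLAST for illustration. A naive sketch using positionwise identity on pre-aligned, equal-length sequences; BLAST's local alignments and E-values are far more robust, and the reference entries below are invented:

```python
def percent_identity(a: str, b: str) -> float:
    """Fraction of matching positions between two equal-length sequences."""
    assert len(a) == len(b), "toy scorer needs pre-aligned, equal-length input"
    return sum(x == y for x, y in zip(a, b)) / len(a)

# Invented reference entries keyed by hypothetical species/locus names.
reference_db = {
    "Phytophthora_infestans_ITS": "ACCTGGTAACGATG",
    "Phytophthora_ramorum_ITS":   "ACCTGCTAACGTTG",
}

def identify(query: str):
    """Return the reference entry with the highest identity to the query."""
    return max(reference_db.items(), key=lambda kv: percent_identity(query, kv[1]))

name, seq = identify("ACCTGCTAACGTTA")
print(name, f"{percent_identity('ACCTGCTAACGTTA', seq):.0%}")  # Phytophthora_ramorum_ITS 93%
```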
Examining A Health Care Price Transparency Tool: Who Uses It, And How They Shop For Care.
Sinaiko, Anna D; Rosenthal, Meredith B
2016-04-01
Calls for transparency in health care prices are increasing, in an effort to encourage and enable patients to make value-based decisions. Yet there is very little evidence of whether and how patients use health care price transparency tools. We evaluated the experiences, in the period 2011-12, of an insured population of nonelderly adults with Aetna's Member Payment Estimator, a web-based tool that provides real-time, personalized, episode-level price estimates. Overall, use of the tool increased during the study period but remained low. Nonetheless, for some procedures the number of people searching for prices of services (called searchers) was high relative to the number of people who received the service (called patients). Among Aetna patients who had an imaging service, childbirth, or one of several outpatient procedures, searchers for price information were significantly more likely to be younger and healthier and to have incurred higher annual deductible spending than patients who did not search for price information. A campaign to deliver price information to consumers may be important to increase patients' engagement with price transparency tools. Project HOPE—The People-to-People Health Foundation, Inc.
Purawat, Shweta; Cowart, Charles; Amaro, Rommie E; Altintas, Ilkay
2016-06-01
The BBDTC (https://biobigdata.ucsd.edu) is a community-oriented platform to encourage high-quality knowledge dissemination with the aim of growing a well-informed biomedical big data community through collaborative efforts on training and education. The BBDTC collaborative is an e-learning platform that supports the biomedical community to access, develop and deploy open training materials. The BBDTC supports big data skill training for biomedical scientists at all levels and from varied backgrounds. The natural hierarchy of courses allows them to be broken into and handled as modules. Modules can be reused in the context of multiple courses and reshuffled, producing a new and different dynamic course called a playlist. Users may create playlists to suit their learning requirements and share them with individual users or the wider public. BBDTC leverages the maturity and design of the HUBzero content-management platform for delivering educational content. To facilitate the migration of existing content, the BBDTC supports importing and exporting course material from the edX platform. Migration tools will be extended in the future to support other platforms. Hands-on training software packages, i.e., toolboxes, are supported through Amazon EC2 and VirtualBox virtualization technologies, and they are available as (i) downloadable lightweight VirtualBox images providing a standardized software tool environment, with software packages and test data, for use on personal machines, and (ii) remotely accessible Amazon EC2 virtual machines for accessing biomedical big data tools and scalable big data experiments. At the moment, the BBDTC site contains three open biomedical big data training courses with lecture content, videos and hands-on training utilizing VM toolboxes, covering diverse topics. The courses have enhanced the hands-on learning environment by providing structured content that users can use at their own pace. A four-course biomedical big data series is planned for development in 2016.
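The course/module/playlist hierarchy described above maps naturally onto a small data model. The Python sketch below illustrates that structure under our own naming; it is an assumption-laden illustration, not the BBDTC's actual implementation (which builds on HUBzero).

    # Sketch: modules are reusable units; a playlist is a user-assembled ordering
    # of modules drawn from one or more courses. All names are illustrative.
    from dataclasses import dataclass, field

    @dataclass
    class Module:
        title: str
        content_url: str

    @dataclass
    class Playlist:
        owner: str
        modules: list = field(default_factory=list)
        shared_with: list = field(default_factory=list)  # user ids, or "public"

        def add(self, module):
            # Reuse: the same Module object may appear in many playlists/courses.
            self.modules.append(module)

    intro = Module("Intro to NGS", "https://biobigdata.ucsd.edu/m/ngs-intro")
    stats = Module("Statistics for Big Data", "https://biobigdata.ucsd.edu/m/stats")
    custom = Playlist(owner="student42")
    custom.add(stats)
    custom.add(intro)  # reshuffled order forms a new, dynamic course
    print([m.title for m in custom.modules])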
NASA Astrophysics Data System (ADS)
Roesch-McNally, G.; Prendeville, H. R.
2017-12-01
A lack of coproduction, the joint production of new technologies or knowledge among technical experts and other groups, is arguably one of the reasons why much scientific information and the resulting decision support systems are not very usable. Increasingly, public agencies and academic institutions are emphasizing the importance of coproduction of scientific knowledge and decision support systems in order to facilitate greater engagement between the scientific community and key stakeholder groups. Coproduction has been embraced as a way for the scientific community to develop actionable scientific information that will assist end users in solving real-world problems. Greater engagement and stakeholder buy-in to the scientific process are especially necessary in the context of the growing politicization of science. Coproduction can be an effective way to build trust, and it can build on and integrate local and traditional knowledge. Employing coproduction strategies may enable the development of more relevant and useful information and decision support tools that address stakeholder challenges at relevant scales. The USDA Northwest Climate Hub has increasingly sought ways to integrate coproduction into the development of both applied research projects and decision support systems. Integrating coproduction within existing institutions, however, is not always simple, given that coproduction is often more focused on process than products, and products are, for better or worse, often the primary focus of applied research and tool development projects. The USDA Northwest Climate Hub sought to integrate coproduction into its FY2017 call-for-proposals process. As a result, we have a set of proposals and fledgling projects that fall along the engagement continuum (see Figure 1). We will share the challenges and opportunities that emerged from this purposeful integration of coproduction into the work that we prioritized for funding. This effort highlights strategies federal agencies might use when considering whether and how to codify coproduction tenets into their collaborations and agenda setting.
Hello, Who is Calling?: Can Words Reveal the Social Nature of Conversations?
Stark, Anthony; Shafran, Izhak; Kaye, Jeffrey
2012-01-01
This study aims to infer the social nature of conversations from their content automatically. To place this work in context, our motivation stems from the need to understand how social disengagement affects cognitive decline or depression among older adults. For this purpose, we collected a comprehensive and naturalistic corpus comprising all the incoming and outgoing telephone calls from 10 subjects over the duration of a year. As a first step, we learned a binary classifier to filter out business-related conversations, achieving an accuracy of about 85%. This classification task provides a convenient tool to probe the nature of telephone conversations. We evaluated the utility of openings and closings in differentiating personal calls, and found that empirical results on a large corpus do not support the hypothesis of Schegloff and Sacks that personal conversations are marked by unique closing structures. For classifying different types of social relationships, such as family vs. other, we investigated features related to language use (entropy), a hand-crafted dictionary (LIWC), and topics learned using unsupervised latent Dirichlet allocation (LDA). Our results show that the posteriors over topics from LDA provide consistently higher accuracy (60-81%) than LIWC or language-use features in distinguishing different types of conversations.
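As a sketch of the best-performing configuration reported (LDA topic posteriors fed to a classifier), the following Python/scikit-learn pipeline shows the general technique. The toy transcripts, label set, and hyperparameters are illustrative stand-ins, not the authors' data, features, or code.

    # Sketch: represent each conversation by its LDA topic posterior, then train
    # a classifier on those posteriors to distinguish conversation types.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    transcripts = [  # toy stand-ins for call transcripts
        "hi mom it's me are you coming to dinner on sunday",
        "hello this is the clinic calling to confirm your appointment",
        "hey it's your brother just checking in on you",
        "we are calling about the balance on your account",
    ]
    labels = ["personal", "business", "personal", "business"]

    pipeline = make_pipeline(
        CountVectorizer(stop_words="english"),
        LatentDirichletAllocation(n_components=5, random_state=0),  # topic posteriors
        LogisticRegression(max_iter=1000),
    )
    pipeline.fit(transcripts, labels)
    print(pipeline.predict(["are you free for lunch with the family"]))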
NASA Technical Reports Server (NTRS)
Rogers, James L.; Feyock, Stefan; Sobieszczanski-Sobieski, Jaroslaw
1988-01-01
The purpose of this research effort is to investigate the benefits that might be derived from applying artificial intelligence tools in the area of conceptual design. The emphasis is therefore on the artificial intelligence aspects of conceptual design rather than on structural and optimization aspects. A prototype knowledge-based system, called STRUTEX, was developed to initially configure a structure to support point loads in two dimensions. This system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user: it integrates a knowledge base interface and inference engine, a database interface, and graphics, while keeping the knowledge base and database files separate. The system writes a file that can be input into a structural synthesis system, which combines structural analysis and optimization.
Jones, Emma L
2011-01-01
This article examines letters sent by members of the general public to the Abortion Law Reform Association (ALRA) in the decade immediately before the 1967 Abortion Act. It shows how a voluntary organisation, in their aim of supporting a specific cause of unclear legality, called forth correspondence from those in need. In detailing the personal predicaments of those facing an unwanted pregnancy, this body of correspondence was readily deployed by ALRA in their efforts to mobilise support for abortion law reform, thus exercising a political function. A close examination of the content of the letters and the epistolary strategies adopted by their writers reveals that as much as they were a lobbying tool for changes in abortion law, these letters were discursively shaped by debates surrounding that very reform.
NanoStringNormCNV: pre-processing of NanoString CNV data.
Sendorek, Dorota H; Lalonde, Emilie; Yao, Cindy Q; Sabelnykova, Veronica Y; Bristow, Robert G; Boutros, Paul C
2018-03-15
The NanoString System is a well-established technology for measuring RNA and DNA abundance. Although it can estimate copy number variation, relatively few tools support analysis of these data. To address this gap, we created NanoStringNormCNV, an R package for pre-processing and copy number variant calling from NanoString data. This package implements algorithms for pre-processing, quality-control, normalization and copy number variation detection. A series of reporting and data visualization methods support exploratory analyses. To demonstrate its utility, we apply it to a new dataset of 96 genes profiled on 41 prostate tumour and 24 matched normal samples. NanoStringNormCNV is implemented in R and is freely available at http://labs.oicr.on.ca/boutros-lab/software/nanostringnormcnv. paul.boutros@oicr.on.ca. Supplementary data are available at Bioinformatics online.
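NanoStringNormCNV itself is an R package; the Python sketch below only illustrates the underlying idea under simplifying assumptions: normalize probe counts to invariant control probes, then call copy number from the tumour/normal ratio of normalized counts. The thresholds and toy counts are our own illustrative choices, not the package's algorithm or API.

    # Sketch: control-probe normalization followed by ratio-based CN calls.
    import numpy as np

    def normalize(counts, control_idx):
        """Scale a sample so its control-probe mean equals 1."""
        counts = np.asarray(counts, dtype=float)
        return counts / counts[control_idx].mean()

    def call_copy_number(tumour, normal, control_idx):
        ratio = normalize(tumour, control_idx) / normalize(normal, control_idx)
        cn = np.full(ratio.shape, 2)                 # start from diploid
        cn[ratio < 0.25] = 0                         # homozygous deletion (assumed cutoff)
        cn[(ratio >= 0.25) & (ratio < 0.75)] = 1     # heterozygous loss (assumed cutoff)
        cn[ratio >= 1.5] = 3                         # gain, 3+ copies (assumed cutoff)
        return cn

    tumour = [210, 48, 95, 400, 100, 102]  # toy probe counts; last two are controls
    normal = [100, 101, 99, 98, 100, 100]
    print(call_copy_number(tumour, normal, control_idx=slice(4, 6)))
    # -> [3 1 2 3 2 2] for the toy data above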
An integrated Diet Monitoring Solution for nutrigenomic research.
Conti, Costanza; Rossi, Elena; Marceglia, Sara; Tauro, Vittorio; Rizzi, Federica; Lazzaroni, Monica; Barlassina, Cristina; Soldati, Laura; Cusi, Daniele
2015-01-01
The emergence of evidence pointing to diet as a key risk factor for chronic diseases, and to gene-diet interactions as key elements in the interplay between an individual's genetic background and his or her lifestyle, paves the way for studies in nutrigenomics. Such studies need an integrated solution to collect, monitor and analyse a large set of data. In the framework of ATHENA, a European Commission FP7 project, we developed an integrated platform, called the Dietary Monitoring Solution, enabling the collection of phenotypic, genetic and lifestyle information, linked to an mHealth application tool. The data collection solution allows information to be kept anonymized and supports a number of features making it particularly suited for multicentre studies. The mHealth application was designed to translate the knowledge generated from research into a personalised prevention programme and to support patient adherence to the programme.
Ketso: A New Tool for Extension Professionals
ERIC Educational Resources Information Center
Bates, James S.
2016-01-01
Extension professionals employ many techniques and tools to obtain feedback, input, information, and data from stakeholders, research participants, and program learners. An information-gathering tool called Ketso is described in this article. This tool and its associated techniques can be used in all phases of program development, implementation,…
A call for benchmarking transposable element annotation methods.
Hoen, Douglas R; Hickey, Glenn; Bourque, Guillaume; Casacuberta, Josep; Cordaux, Richard; Feschotte, Cédric; Fiston-Lavier, Anna-Sophie; Hua-Van, Aurélie; Hubley, Robert; Kapusta, Aurélie; Lerat, Emmanuelle; Maumus, Florian; Pollock, David D; Quesneville, Hadi; Smit, Arian; Wheeler, Travis J; Bureau, Thomas E; Blanchette, Mathieu
2015-01-01
DNA derived from transposable elements (TEs) constitutes large parts of the genomes of complex eukaryotes, with major impacts not only on genomic research but also on how organisms evolve and function. Although a variety of methods and tools have been developed to detect and annotate TEs, there are as yet no standard benchmarks: that is, no standard way to measure or compare their accuracy. This lack of accuracy assessment calls into question conclusions from a wide range of research that depends explicitly or implicitly on TE annotation. In the absence of standard benchmarks, toolmakers are impeded in improving their tools, annotators cannot properly assess which tools might best suit their needs, and downstream researchers cannot judge how accuracy limitations might impact their studies. We therefore propose that the TE research community create and adopt standard TE annotation benchmarks, and we call for other researchers to join the authors in making this long-overdue effort a success.
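As one illustration of what a benchmark comparison could compute, the Python sketch below scores a tool's predicted TE intervals against a curated reference annotation at the nucleotide level. The metric choice and the toy intervals are our own assumptions, not a scheme proposed in the article.

    # Sketch: nucleotide-level precision/recall of predicted annotation intervals.
    def covered_bases(intervals):
        """Set of distinct bases covered by a list of (start, end) intervals."""
        bases = set()
        for start, end in intervals:
            bases.update(range(start, end))
        return bases

    def precision_recall(predicted, reference):
        pred, ref = covered_bases(predicted), covered_bases(reference)
        tp = len(pred & ref)  # bases annotated by both tool and reference
        precision = tp / len(pred) if pred else 0.0
        recall = tp / len(ref) if ref else 0.0
        return precision, recall

    reference = [(100, 500), (900, 1200)]  # curated TE annotations (toy)
    predicted = [(120, 480), (950, 1300)]  # tool output (toy)
    print(precision_recall(predicted, reference))  # approx (0.86, 0.87)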
Rurkhamet, Busagarin; Nanthavanij, Suebsak
2004-12-01
One important factor that leads to the development of musculoskeletal disorders (MSDs) and cumulative trauma disorders (CTDs) among visual display terminal (VDT) users is their work posture. While operating a VDT, a user's body posture is strongly influenced by the task, the VDT workstation settings, and the layout of computer accessories. This paper presents an analytic and rule-based decision support tool called EQ-DeX (an ergonomics and quantitative design expert system) that is developed to provide valid and practical recommendations regarding the adjustment of a VDT workstation and the arrangement of computer accessories. The paper explains the structure and components of EQ-DeX, its input data, rules, and adjustment and arrangement algorithms. From input information such as gender, age, body height, and task, EQ-DeX uses analytic and rule-based algorithms to estimate quantitative settings of a computer table and a chair, as well as locations of computer accessories such as the monitor, document holder, keyboard, and mouse. Input and output screens designed with usability in mind make the interaction between the user and EQ-DeX convenient. Examples are also presented to demonstrate the recommendations generated by EQ-DeX.
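To illustrate the style of analytic, rule-based recommendation described, here is a minimal Python sketch. The anthropometric proportions are rough textbook-style approximations included purely as assumptions; they are not EQ-DeX's actual rules or coefficients.

    # Sketch: compute workstation settings from body height, plus one task rule.
    def workstation_settings(body_height_cm, task):
        # Analytic part (assumed, illustrative proportions): seat near popliteal
        # height; keyboard surface near seated elbow height.
        seat_height = 0.25 * body_height_cm
        table_height = seat_height + 0.13 * body_height_cm
        settings = {
            "seat_height_cm": round(seat_height, 1),
            "table_height_cm": round(table_height, 1),
        }
        # Rule-based part: accessory arrangement depends on the task.
        settings["document_holder"] = (
            "beside monitor, at viewing distance" if task == "data entry"
            else "not required")
        return settings

    print(workstation_settings(170, "data entry"))
    # {'seat_height_cm': 42.5, 'table_height_cm': 64.6, 'document_holder': ...}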
Palomar, Esther; Chen, Xiaohong; Liu, Zhiming; Maharjan, Sabita; Bowen, Jonathan
2016-10-28
Smart city systems embrace major challenges associated with climate change, energy efficiency, mobility and future services by embedding the virtual space into a complex cyber-physical system. Those systems are constantly evolving and scaling up, involving a wide range of integration among users, devices, utilities, public services and also policies. Modelling such complex dynamic systems' architectures has always been essential for the development and application of techniques and tools to support the design and deployment of new component integrations, as well as for the analysis, verification, simulation and testing needed to ensure trustworthiness. This article reports on the definition and implementation of a scalable component-based architecture that supports a cooperative energy demand response (DR) system coordinating energy usage between neighbouring households. The proposed architecture, called refinement of Cyber-Physical Component Systems (rCPCS), which extends the refinement calculus for component and object systems (rCOS) modelling method, is implemented using the Eclipse Extensible Coordination Tools (ECT), i.e., the Reo coordination language. With the rCPCS implementation in Reo, we specify the communication, synchronisation and cooperation among the heterogeneous components of the system, assuring, by design, scalability, interoperability, and correctness of component cooperation.
Using "get with the guidelines" to improve cardiovascular secondary prevention.
LaBresh, Kenneth A; Gliklich, Richard; Liljestrand, James; Peto, Randolph; Ellrodt, A Gray
2003-10-01
"Get With The Guidelines (GWTG)" was developed and piloted by the American Heart Association (AHA), New England Affiliate; MassPRO, Inc.; and other organizations to reduce the gap in the application of secondary prevention guidelines in hospitalized cardiovascular disease patients. Collaborative learning programs and technology solutions were created for the project. The interactive Web-based patient management tool (PMT) was developed using quality measures derived from the AHA/American College of Cardiology secondary prevention guidelines. It provided data entry, embedded reminders and guideline summaries, and online reports of quality measure performance, including comparisons with the aggregate performance of all hospitals. Multidisciplinary teams from 24 hospitals participated in the 2000-2001 pilot. Four collaborative learning sessions and monthly conference calls supported team interaction. Best-practices sharing and the use of an Internet tool enabled hospitals to change systems and collect data on 1,738 patients. The GWTG program, a template of learning sessions with didactic presentations, best-practices sharing, and collaborative multidisciplinary team meetings supported by the Internet-based data collection and reporting system, can be extended to multiple regions without requiring additional development. Following the completion of the pilot, the AHA adopted GWTG as a national program.
Eradication of Yaws: Historical Efforts and Achieving WHO's 2020 Target
Asiedu, Kingsley; Fitzpatrick, Christopher; Jannin, Jean
2014-01-01
Background: Yaws, one of the 17 neglected tropical diseases (NTDs), is targeted for eradication by 2020 in resolution WHA66.12 of the World Health Assembly (2013) and the WHO roadmap on NTDs (2012). The disease frequently affects children who live in poor socioeconomic conditions. Between 1952 and 1964, WHO and the United Nations Children's Fund (UNICEF) led a global eradication campaign using injectable benzathine penicillin. Recent developments using a single dose of oral azithromycin have renewed optimism that eradication can be achieved through a comprehensive large-scale treatment strategy. We review historical efforts to eradicate yaws and argue that this goal is now technically feasible using new tools and with the favorable environment for control of NTDs. We also summarize the work of WHO's Department of Control of Neglected Tropical Diseases in leading the renewed eradication initiative and call on the international community to support efforts to achieve the 2020 eradication goal. The critical factor remains access to azithromycin. Excluding medicines, the financial cost of yaws eradication could be as little as US$ 100 million. Conclusions: The development of new tools has renewed interest in eradication of yaws; with modest support, the WHO eradication target of 2020 can be achieved. PMID:25254372
NASA Astrophysics Data System (ADS)
Frezzo, Dennis C.; Behrens, John T.; Mislevy, Robert J.
2010-04-01
Simulation environments make it possible for science and engineering students to learn to interact with complex systems. Putting these capabilities to effective use for learning, and assessing learning, requires more than a simulation environment alone. It requires a conceptual framework for the knowledge, skills, and ways of thinking that are meant to be developed, in order to design activities that target these capabilities. The challenges of using simulation environments effectively are especially daunting in dispersed social systems. This article describes how these challenges were addressed in the context of the Cisco Networking Academies with a simulation tool for computer networks called Packet Tracer. The focus is on a conceptual support framework for instructors in over 9,000 institutions around the world for using Packet Tracer in instruction and assessment, by learning to create problem-solving scenarios that are at once tuned to the local needs of their students and consistent with the epistemic frame of "thinking like a network engineer." We describe a layered framework of tools and interfaces above the network simulator that supports the use of Packet Tracer in the distributed community of instructors and students.
Developing mechanisms for estimating carbon footprint in farming systems
NASA Astrophysics Data System (ADS)
Anaya-Romero, María; Fernández Luque, José Enrique; Rodríguez Merino, Alejandro; José Moreno Delgado, Juan; Rodado, Concepción Mira; Romero Vicente, Rafael; Perez-Martin, Alfonso; Muñoz-Rojas, Miriam
2015-04-01
Sustainable land management is critical to avoid land degradation, to reclaim degraded land for productive use, and to reap the benefits of crucial ecosystem services while protecting biodiversity. It also helps in mitigating and adapting to climate change. Land and its various uses are in turn severely affected by climate change (flooding, droughts, etc.). Existing tools and technologies for efficient land management need to be adapted and their application expanded, since they can benefit a large number of human livelihoods and ecosystems. Disseminating and scaling up the implementation of sustainable land management approaches will, however, need to be backed by strong political will and financial resources. The challenge is to provide an integral decision support tool that can establish relationships between soil carbon content, climate change, and land use and management, allowing stakeholders to detect, cope with and intervene in land system change in a sustainable way. To achieve this goal, an agro-ecological meta-model called CarboLAND will be calibrated in several plots located in the Andalusia region of southern Spain, under different scenarios of climate and agricultural use and management. The output will be the CLIMALAND e-platform, which will also include protocols to support stakeholders in an integrated ecosystem approach, taking into account biodiversity, hydrological and soil capability, socio-economic aspects, and regional and environmental policies. This tool will be made available in the European context at the regional level, providing user-friendly interfaces and a scientific-technical platform for the assessment of sustainable land use and management.
Gurinović, Mirjana; Milešević, Jelena; Kadvan, Agnes; Nikolić, Marina; Zeković, Milica; Djekić-Ivanković, Marija; Dupouy, Eleonora; Finglas, Paul; Glibetić, Maria
2018-01-01
In order to meet growing public health nutrition challenges in Central and Eastern European Countries (CEEC) and Balkan countries, development of a Research Infrastructure (RI) and availability of an effective nutrition surveillance system are prerequisites. The building block of this RI is an innovative tool called DIET ASSESS & PLAN (DAP), a platform for standardized and harmonized food consumption data collection, comprehensive dietary intake assessment and nutrition planning. Its unique structure enables application of national food composition databases (FCDBs) from the European food composition exchange platform (28 national FCDBs) developed by EuroFIR (http://www.eurofir.org/) and also allows communication with other tools. DAP is used for daily menu and/or long-term diet planning in diverse public sector settings, food design/reformulation, food labelling, nutrient intake assessment and calculation of the dietary diversity indicator Minimum Dietary Diversity-Women (MDD-W). As a tool validated in different national and international projects, DAP represents an important RI in public health nutrition epidemiology in the CEEC region. Copyright © 2016 Elsevier Ltd. All rights reserved.
Combining conceptual graphs and argumentation for aiding in the teleexpertise.
Doumbouya, Mamadou Bilo; Kamsu-Foguem, Bernard; Kenfack, Hugues; Foguem, Clovis
2015-08-01
Current medical information systems are too complex to be meaningfully exploited. Hence there is a need to develop new strategies for maximising the exploitation of medical data to the benefit of medical professionals. It is against this backdrop that we propose a tangible contribution: a tool which combines conceptual graphs and Dung's argumentation system in order to assist medical professionals in their decision-making process. The proposed tool allows medical professionals to easily manipulate and visualise queries and answers for making decisions during the practice of teleexpertise. The knowledge modelling is done using an open application programming interface (API) called CoGui, which offers means for building structured knowledge bases with dedicated graph-based reasoning functionalities over data retrieved from different institutions (hospitals, national security centre, and nursing homes). The tool described in this study supports a formally traceable structure of reasoning, with acceptable arguments, to elucidate some ethical problems that occur very often in the telemedicine domain. Copyright © 2015 Elsevier Ltd. All rights reserved.
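For readers unfamiliar with Dung's framework, the acceptability computation such a tool builds on can be sketched compactly. The Python snippet below computes the grounded extension of a toy argumentation framework; the arguments and attack relation are illustrative, not drawn from the paper.

    # Sketch: grounded extension of a Dung argumentation framework, obtained by
    # iterating the characteristic function F(S) = {a : S defends a} from the
    # empty set to a fixed point.
    def grounded_extension(arguments, attacks):
        """attacks: set of (attacker, target) pairs."""
        def defended(a, s):
            attackers = {x for (x, y) in attacks if y == a}
            # a is defended if every attacker of a is attacked by some member of s
            return all(any((z, x) in attacks for z in s) for x in attackers)
        s = set()
        while True:
            nxt = {a for a in arguments if defended(a, s)}
            if nxt == s:
                return s
            s = nxt

    args = {"A", "B", "C"}
    atts = {("A", "B"), ("B", "C")}  # A attacks B, B attacks C
    print(grounded_extension(args, atts))  # {'A', 'C'}: A unattacked, C defended by A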
Precision Departure Release Capability (PDRC) Final Report
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Richard; Day, Kevin Brian; Kistler, Matthew Stephen; Gaither, Frank; Juro, Greg
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows that may be subject to constraints that create localized demand/capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves a Call for Release (CFR) procedure wherein the Tower must call the Center to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool, based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that improves tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions and departure runway assignments to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept reduces uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs and FLMs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station in Dallas/Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents research results from the PDRC research activity. Companion papers present the Concept of Operations and a Technology Description.
Precision Departure Release Capability (PDRC) Technology Description
NASA Technical Reports Server (NTRS)
Engelland, Shawn A.; Capps, Richard; Day, Kevin; Robinson, Corissia; Null, Jody R.
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Technology Description. Companion papers include the Final Report and a Concept of Operations.
Precision Departure Release Capability (PDRC): NASA to FAA Research Transition
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Davis, Thomas J.
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) and Frontline Managers (FLMs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release time is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that improves tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions and departure runway assignments to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept reduces uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs and FLMs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations.
Precision Departure Release Capability (PDRC) Concept of Operations
NASA Technical Reports Server (NTRS)
Engelland, Shawn; Capps, Richard A.; Day, Kevin Brian
2013-01-01
After takeoff, aircraft must merge into en route (Center) airspace traffic flows which may be subject to constraints that create localized demand-capacity imbalances. When demand exceeds capacity, Traffic Management Coordinators (TMCs) often use tactical departure scheduling to manage the flow of departures into the constrained Center traffic flow. Tactical departure scheduling usually involves use of a Call for Release (CFR) procedure wherein the Tower must call the Center TMC to coordinate a release time prior to allowing the flight to depart. In present-day operations release times are computed by the Center Traffic Management Advisor (TMA) decision support tool based upon manual estimates of aircraft ready time verbally communicated from the Tower to the Center. The TMA-computed release is verbally communicated from the Center back to the Tower where it is relayed to the Local controller as a release window that is typically three minutes wide. The Local controller will manage the departure to meet the coordinated release time window. Manual ready time prediction and verbal release time coordination are labor intensive and prone to inaccuracy. Also, use of release time windows adds uncertainty to the tactical departure process. Analysis of more than one million flights from January 2011 indicates that a significant number of tactically scheduled aircraft missed their en route slot due to ready time prediction uncertainty. Uncertainty in ready time estimates may result in missed opportunities to merge into constrained en route flows and lead to lost throughput. Next Generation Air Transportation System (NextGen) plans call for development of Tower automation systems capable of computing surface trajectory-based ready time estimates. NASA has developed the Precision Departure Release Capability (PDRC) concept that uses this technology to improve tactical departure scheduling by automatically communicating surface trajectory-based ready time predictions to the Center scheduling tool. The PDRC concept also incorporates earlier NASA and FAA research into automation-assisted CFR coordination. The PDRC concept helps reduce uncertainty by automatically communicating coordinated release times with seconds-level precision enabling TMCs to work with target times rather than windows. NASA has developed a PDRC prototype system that integrates the Center's TMA system with a research prototype Tower decision support tool. A two-phase field evaluation was conducted at NASA's North Texas Research Station (NTX) in Dallas-Fort Worth. The field evaluation validated the PDRC concept and demonstrated reduced release time uncertainty while being used for tactical departure scheduling of more than 230 operational flights over 29 weeks of operations. This paper presents the Concept of Operations. Companion papers include the Final Report and a Technology Description.
Soldering Tool for Integrated Circuits
NASA Technical Reports Server (NTRS)
Takahashi, Ted H.
1987-01-01
Many connections soldered simultaneously in confined spaces. Improved soldering tool bonds integrated circuits onto printed-circuit boards. Intended especially for use with so-called "leadless-carrier" integrated circuits.
Evidence Does Not Support Clinical Screening of Literacy
Wolf, Michael S.
2007-01-01
Limited health literacy is a significant risk factor for adverse health outcomes. Despite controversy, many health care professionals have called for routine clinical screening of patients' literacy skills. Although brief literacy screening tools exist that, with further evaluation, could potentially be used to detect limited literacy in clinical settings, no screening program for limited literacy has been shown to be effective. Yet there is a noted potential for harm, in the form of shame and alienation, which might be induced through clinical screening. There is fair evidence to suggest that possible harm outweighs any current benefits; therefore, clinical screening for literacy should not be recommended at this time. PMID:17992564
Climate Model Diagnostic Analyzer Web Service System
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.
2013-12-01
The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, and (2) many CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of the time evolution of the means in any specified geographical region, (3) the calculation of the correlation between two variables, and (4) the calculation of the difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA is planned to be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. The requirements of the educational tool were defined in interaction with the school organizers, and CMDA is customized to meet them. The tool needs to be production quality for 30+ simultaneous users. The summer school will thus serve as a valuable testbed for the tool development, preparing CMDA to serve the Earth-science modeling and model-analysis community at the end of the project. This work was funded by the NASA Earth Science Program called Computational Modeling Algorithms and Cyberinfrastructure (CMAC).
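The abstract describes wrapping an existing science routine in a Python web service; a minimal Flask sketch of that general pattern follows. The analysis function, route name, and payload shape are illustrative assumptions, not CMDA's actual endpoints.

    # Sketch: expose an existing analysis routine as a JSON web service.
    from flask import Flask, jsonify, request
    import numpy as np

    app = Flask(__name__)

    def seasonal_mean(series, months_per_season=3):
        """Stand-in for an existing science routine: mean over season blocks."""
        data = np.asarray(series, dtype=float)
        n = len(data) // months_per_season * months_per_season
        return data[:n].reshape(-1, months_per_season).mean(axis=1).tolist()

    @app.route("/seasonal-mean", methods=["POST"])
    def seasonal_mean_service():
        payload = request.get_json()  # expects {"series": [monthly values...]}
        return jsonify(result=seasonal_mean(payload["series"]))

    if __name__ == "__main__":
        app.run()  # in production, served behind Gunicorn, as the abstract notes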
Durr, W
1998-01-01
Call centers are strategically and tactically important to many industries, including the healthcare industry. Call centers play a key role in acquiring and retaining customers. The ability to deliver high-quality and timely customer service without much expense is the basis for the proliferation and expansion of call centers. Call centers are unique blends of people and technology, where performance comes from combining appropriate technology tools with sound management practices built on key operational data. While the technology is fascinating, the people working in call centers and the skill of the management team ultimately make the difference to their companies.
Ocean Drilling Program: TAMRF Administrative Services: Meeting, Travel, and Port-Call Information
All ODP meeting and port-call activities are complete.
Marketing, Management and Performance: Multilingualism as Commodity in a Tourism Call Centre
ERIC Educational Resources Information Center
Duchene, Alexandre
2009-01-01
This paper focuses on the ways an institution of the new economy--a tourism call centre in Switzerland--markets, manages and performs multilingual services. In particular, it explores the ways multilingualism operates as a strategic and managerial tool within tourism call centres and how the institutional regulation of language practices…
Thinking about Pregnancy After Premature Birth
... a kind of fertility treatment called assisted reproductive technology (also called ART). Fertility treatment is medical treatment ...
Technology- and Phone-Based Weight Loss Intervention
Hartman, Sheri J.; Nelson, Sandahl H.; Cadmus-Bertram, Lisa A.; Patterson, Ruth E.; Parker, Barbara A.; Pierce, John P.
2017-01-01
Introduction: For women with an increased breast cancer risk, reducing excess weight and increasing physical activity are believed to be important approaches for reducing their risk. This study tested a weight loss intervention that combined commercially available technology-based self-monitoring tools with individualized phone calls. Design: Women were randomized to a weight loss intervention arm (n=36) or a usual care arm (n=18). Setting/Participants: Participants were women with a BMI ≥27.5 kg/m2 and elevated breast cancer risk recruited from the mammography clinic at the Moores Cancer Center at the University of California San Diego. Intervention: Intervention participants used the MyFitnessPal website and phone app to monitor diet and a Fitbit to monitor physical activity. Participants received 12 standardized coaching calls with trained counselors over 6 months. Usual care participants received the U.S. Dietary Guidelines for Americans at baseline and two brief calls over the 6 months. Main outcome measures: Weight and accelerometer-measured physical activity were assessed at baseline and 6 months. Data were collected in San Diego, CA, from 2012 to 2014 and analyzed in 2015. Results: Participants (n=54) had a mean age of 59.5 (SD=5.6) years, BMI of 31.9 (SD=3.5), and a mean Gail Model score of 2.5 (SD=1.4). At 6 months, intervention participants had lost significantly more weight (4.4 kg vs 0.8 kg, p=0.004) and a greater percentage of starting weight (5.3% vs 1.0%, p=0.005) than usual care participants. Across arms, greater increases in moderate-to-vigorous physical activity resulted in greater weight loss (p=0.01). Conclusions: Combining technology-based self-monitoring tools with phone counseling supported weight loss over 6 months in women at increased risk for breast cancer. PMID:27593420
Weller, J M; Torrie, J; Boyd, M; Frengley, R; Garden, A; Ng, W L; Frampton, C
2014-06-01
Sharing information with the team is critical in developing a shared mental model in an emergency, and fundamental to effective teamwork. We developed a structured call-out tool, encapsulated in the acronym 'SNAPPI': Stop; Notify; Assessment; Plan; Priorities; Invite ideas. We explored whether a video-based intervention could improve structured call-outs during simulated crises and whether this would improve information sharing and medical management. In a simulation-based, randomized, blinded study, we evaluated the effect of the video intervention teaching SNAPPI on scores for SNAPPI, information sharing, and medical management, using baseline and follow-up crisis simulations. We assessed information sharing using a probe technique in which nurses and technicians received unique, clinically relevant information probes before the simulation. Shared knowledge of probes was measured in a written, post-simulation test. We also scored sharing of diagnostic options with the team and medical management. Anaesthetists' scores for SNAPPI were significantly improved, as was the number of diagnostic options they shared. We found a non-significant trend toward improved information-probe sharing and medical management in the intervention group and, across all simulations, a significant correlation between SNAPPI and information-probe sharing. Of note, only 27% of the clinically relevant information about the patient provided to the nurse and technician in the pre-simulation information probes was subsequently learnt by the anaesthetist. We developed a structured communication tool, SNAPPI, to improve information sharing between anaesthetists and their team, taught it using a video-based intervention, and provide initial evidence to support its value for improving communication in a crisis. © The Author [2014]. Published by Oxford University Press on behalf of the British Journal of Anaesthesia. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Co-creation of a digital tool for the empowerment of parents of children with physical disabilities.
Alsem, M W; van Meeteren, K M; Verhoef, M; Schmitz, M J W M; Jongmans, M J; Meily-Visser, J M A; Ketelaar, M
2017-01-01
Parents of children with physical disabilities do a lot to support their child in daily life, and in doing this they are faced with many challenges. These parents have a wide range of unmet needs, especially for information, on different topics. It is sometimes hard for them to get the right information at the right moment, and to ask the right questions of physicians and other healthcare professionals. In order to develop a digital tool to help parents formulate questions and find information, we thought it crucial to work in a process of co-creation with parents, researchers, IT specialists and healthcare professionals. In close collaboration with them we developed a tool, called the WWW-roadmap (WWW-wijzer in Dutch), that aims to help parents ask questions, find information and take a more leading role in consultations with healthcare professionals. In two groups of parents (one group with and one group without experience of using the tool), we will study the effects of using this tool on consultations with physicians. We expect that using the tool will result in better empowerment, satisfaction and family-centred care. Parents of children with physical disabilities do much to support their child in daily life. In doing so, they are faced with many challenges. These parents have a wide range of unmet needs, especially for information, on various topics. Getting timely and reliable information is very difficult for parents, whereas being informed is a major requirement for the process of empowerment and shared decision-making. This paper describes the development of a digital tool to support parents in this process. During its development, working together with parents was crucial to address relevant topics and design a user-centred intervention. In co-creation with parents, healthcare professionals, IT professionals and researchers, a digital tool was developed, the 'WWW-roadmap' ['WWW-wijzer' in Dutch]. This digital tool aims to enable parents to explore their questions (What do I want to know?), help in their search for information (Where can I find the information I need?), and refer them to appropriate professionals (Who can assist me further?). During the process, we got extensive feedback from a parent panel consisting of parents of children with physical disabilities, enabling us to create the tool 'with' rather than 'for' them. This led to a user-friendly and problem-driven tool. The WWW-roadmap can function as a tool to help parents formulate their questions, search for information and thus prepare for consultations with healthcare professionals, and to facilitate parental empowerment and shared decision-making by parent and professional. Effects of using the WWW-roadmap on consultations with professionals will be studied in the future.
Pisu, Maria; Meneses, Karen; Azuero, Andres; Benz, Rachel; Su, Xiaogang; McNees, Patrick
2016-04-01
Understanding how resources are used provides guidance for disseminating effective interventions. Here, we report data on the implementation resources needed for the Rural Breast Cancer Survivors (RBCS) study, which tested a telephone-delivered psychoeducational education and support intervention for survivors in rural Florida. Intervention resources included interventionists' time on one intake assessment (IA) call, three education calls (ED), one follow-up education call (FUE), six support (SUP) calls, and documentation time per survivor. Interventionists logged the start and end times of each type of call. Average interventionist time in minutes was calculated by call type. Associations between interventionists' time and participants' characteristics, including age, race/ethnicity, time since treatment, cancer treatment, depressive symptoms, education, income, employment, and support, were assessed using linear mixed models with repeated measures. Among 328 survivors, IA calls lasted 66.9 min (SD 21.7); ED calls lasted 50.6 (SD 16.7), 48.1 (SD 15.9), and 39.6 (SD 14.8); FUE calls lasted 24.7 (SD 14.8); and SUP calls 42.8 (SD 29.6) min. Documentation time was 18.4 min for IA, 23-27 for ED, 12.3 for FUE, and 23.0 for SUP. Interventionists spent significantly more time with participants who had depressive symptoms, who already used other support, and who received SUP calls before the ED calls rather than after. There were no significant differences by time since or type of cancer treatment, or by other personal characteristics. Resources vary by survivor characteristics. Careful consideration of mental health status and available support is warranted when planning the implementation and dissemination of effective survivorship interventions on a broad scale.
Decision blocks: A tool for automating decision making in CLIPS
NASA Technical Reports Server (NTRS)
Eick, Christoph F.; Mehta, Nikhil N.
1991-01-01
The human capability of making complex decisions is one of the most fascinating facets of human intelligence, especially when vague, judgemental, default, or uncertain knowledge is involved. Unfortunately, most existing rule-based forward-chaining languages are not well suited to simulating this aspect of human intelligence, because they lack support for the approximate reasoning techniques this task requires and lack specific constructs to facilitate the coding of frequently recurring decision blocks. To provide better support for the design and implementation of rule-based decision support systems, a language called BIRBAL, defined on top of CLIPS, is introduced for the specification of decision blocks. Empirical experiments comparing the length of the CLIPS program with that of the corresponding BIRBAL program for three different applications are surveyed. The results of these experiments suggest that for decision-making-intensive applications, a CLIPS program tends to be about three times longer than the corresponding BIRBAL program.
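To make the notion of a decision block concrete, here is a minimal Python sketch of a rule set with certainty factors, the kind of approximate-reasoning construct the abstract says plain forward-chaining rules make tedious to encode. The rules, facts, and weighting scheme are illustrative; this is neither BIRBAL's nor CLIPS's actual syntax.

    # Sketch: a decision block that fires weighted rules over uncertain evidence
    # and returns the conclusion with the highest resulting certainty.
    def decide(evidence, rules):
        """evidence: {fact: certainty in 0..1}. Each rule maps a tuple of
        required facts to (conclusion, weight); fired certainty is
        min(antecedent certainties) * weight, and competing conclusions
        keep their maximum certainty."""
        conclusions = {}
        for antecedents, (conclusion, weight) in rules:
            if all(f in evidence for f in antecedents):
                certainty = min(evidence[f] for f in antecedents) * weight
                conclusions[conclusion] = max(conclusions.get(conclusion, 0.0), certainty)
        return max(conclusions, key=conclusions.get) if conclusions else None

    rules = [
        (("high_load", "long_duration"), ("reinforce_member", 0.9)),
        (("high_load",), ("review_design", 0.6)),
    ]
    print(decide({"high_load": 0.8, "long_duration": 0.7}, rules))
    # -> 'reinforce_member' (0.63 beats 0.48 for the toy evidence above)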
Relative evolutionary rate inference in HyPhy with LEISR.
Spielman, Stephanie J; Kosakovsky Pond, Sergei L
2018-01-01
We introduce LEISR (Likelihood Estimation of Individual Site Rates, pronounced "laser"), a tool to infer relative evolutionary rates from protein and nucleotide data, implemented in HyPhy. LEISR is based on the popular Rate4Site (Pupko et al., 2002) approach for inferring relative site-wise evolutionary rates, primarily from protein data. We extend the original method for more general use in several key ways: (i) we increase the support for nucleotide data with additional models, (ii) we allow for datasets of arbitrary size, (iii) we support analysis of site-partitioned datasets to correct for the presence of recombination breakpoints, (iv) we produce rate estimates at all sites rather than at just a subset of sites, and (v) we implemented LEISR as MPI-enabled to support rapid, high-throughput analysis. LEISR is available in HyPhy starting with version 2.3.8, and it is accessible as an option in the HyPhy analysis menu ("Relative evolutionary rate inference"), which calls the HyPhy batchfile LEISR.bf.
Efficient Delivery and Visualization of Long Time-Series Datasets Using Das2 Tools
NASA Astrophysics Data System (ADS)
Piker, C.; Granroth, L.; Faden, J.; Kurth, W. S.
2017-12-01
For over 14 years, the University of Iowa Radio and Plasma Wave Group has utilized a network-transparent data streaming and visualization system for most daily data review and collaboration activities. This system, called Das2, was originally designed in support of the Cassini Radio and Plasma Wave Science (RPWS) investigation, but is now relied on for daily review and analysis of Voyager, Polar, Cluster, Mars Express, Juno and other mission results. In light of current efforts to promote automatic data distribution in space physics, it seems prudent to provide an overview of our open-source Das2 programs and interface definitions to the wider community and to recount lessons learned. This submission provides an overview of the interfaces that define the system, describes the relationship between the Das2 effort and Autoplot, and examines the handling of Cassini RPWS wideband waveforms and dynamic spectra as examples of dealing with long time-series datasets. In addition, the advantages and limitations of the current Das2 tool set are discussed, as well as lessons learned that are applicable to other data sharing initiatives. Finally, we outline plans for future developments, including improved catalogs to support 'no-software' data sources and redundant multi-server failover, as well as new adapters for CSV (Comma Separated Values) and JSON (JavaScript Object Notation) output to support Cassini closeout and the HAPI (Heliophysics Application Programming Interface) initiative.
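Since the abstract mentions CSV output for the HAPI initiative, here is a minimal Python sketch of fetching a time slice from a HAPI-style server. The server URL and dataset id are placeholders, and the query parameter names (id, time.min, time.max) follow the published HAPI convention as an assumption here, not a documented Das2 interface.

    # Sketch: request a CSV time slice from a HAPI-compliant data endpoint.
    import csv
    import io
    import urllib.parse
    import urllib.request

    def fetch_hapi_csv(server, dataset, start, stop):
        query = urllib.parse.urlencode(
            {"id": dataset, "time.min": start, "time.max": stop})
        with urllib.request.urlopen(f"{server}/data?{query}") as response:
            text = response.read().decode("utf-8")
        return list(csv.reader(io.StringIO(text)))  # rows of [time, value, ...]

    # Placeholder server and dataset names, for illustration only.
    rows = fetch_hapi_csv("https://example.org/hapi", "spacecraft/wideband",
                          "2017-01-01T00:00:00", "2017-01-02T00:00:00")
    print(rows[:3])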
Staccini, Pascal; Dufour, Jean-Charles; Raps, Hervé; Fieschi, Marius
2005-01-01
Making educational material available on a network cannot be reduced to merely implementing hypermedia and interactive resources on a server. A pedagogical schema has to be defined to guide students in their learning and to provide teachers with guidelines for preparing valuable and upgradeable resources. The components of a learning environment, as well as the interactions between students and other roles such as author, tutor and manager, can be deduced from the cognitive foundations of learning, such as the constructivist approach. Scripting the way a student will navigate among information nodes and interact with tools to build his or her own knowledge is a good way of deriving the features of the graphical interface related to the management of these objects. We defined a typology of pedagogical resources, their data model and their logic of use. We implemented a generic, web-based authoring and publishing platform (called J@LON, for Join And Learn On the Net) within an object-oriented and open-source programming environment (called Zope) embedding a content management system (called Plone). Workflow features have been used to mark the progress of students and to trace the life cycle of resources shared by the teaching staff. The platform integrates advanced online authoring features to create interactive exercises and to support the delivery of live courses. The platform engine has been generalized to the whole curriculum of medical studies in our faculty; it also supports an international master's in risk management in health care and will be extended to all other continuing-education diplomas.
SCI peer health coach influence on self-management with peers: a qualitative analysis.
Skeels, S E; Pernigotti, D; Houlihan, B V; Belliveau, T; Brody, M; Zazula, J; Hasiotis, S; Seetharama, S; Rosenblum, D; Jette, A
2017-11-01
A process evaluation of a clinical trial. To describe the roles fulfilled by peer health coaches (PHCs) with spinal cord injury (SCI) during a randomized controlled trial called 'My Care My Call', a novel telephone-based, peer-led self-management intervention for adults with chronic SCI 1+ years after injury. Connecticut and Greater Boston Area, MA, USA. Directed content analysis was used to qualitatively examine information from 504 tele-coaching calls, conducted with 42 participants with SCI, by two trained SCI PHCs. Self-management was the focus of each 6-month PHC-peer relationship. PHCs documented how and when they used the communication tools (CTs) and information delivery strategies (IDSs) they developed for the intervention. Interaction data were coded and analyzed to determine PHC roles in relation to CT and IDS utilization and application. PHCs performed three principal roles: Role Model, Supporter, and Advisor. Role Model interactions included CTs and IDSs that allowed PHCs to share personal experiences of managing and living with an SCI, including sharing their opinions and advice when appropriate. As Supporters, PHCs used CTs and IDSs to build credible relationships based on dependability and reassuring encouragement. PHCs fulfilled the unique role of Advisor using CTs and IDSs to teach and strategize with peers about SCI self-management. The SCI PHC performs a powerful, flexible role in promoting SCI self-management among peers. Analysis of PHC roles can inform the design of peer-led interventions and highlights the importance of providing peer mentor training.
Exploring TechQuests Through Open Source and Tools That Inspire Digital Natives
NASA Astrophysics Data System (ADS)
Hayden, K.; Ouyang, Y.; Kilb, D.; Taylor, N.; Krey, B.
2008-12-01
"There is little doubt that K-12 students need to understand and appreciate the Earth on which they live. They can achieve this understanding only if their teachers are well prepared". Dan Barstow, Director of Center for Earth and Space Science Education at TERC. The approach of San Diego County's Cyberinfrastructure Training, Education, Advancement, and Mentoring (SD Cyber-TEAM) project is to build understandings of Earth systems for middle school teachers and students through a collaborative that has engaged the scientific community in the use of cyber-based tools and environments for learning. The SD Cyber-TEAM has used Moodle, an open source management system with social networking tools, that engage digital native students and their teachers in collaboration and sharing of ideas and research related to Earth science. Teachers participate in on-line professional dialog through chat, wikis, blogs, forums, journals and other tools and choose the tools that will best fit their classroom. The use of Moodle during the Summer Cyber Academy developed a cyber-collaboratory environment where teaching strategies were discussed, supported and actualized by participants. These experiences supported digital immigrants (teachers) in adapting teaching strategies using technologies that are most attractive and familiar to students (digital natives). A new study by the National School Boards Association and Grunwald Associates LLC indicated that "the online behaviors of U.S. teens and 'tweens shows that 96 percent of students with online access use social networking technologies, such as chatting, text messaging, blogging, and visiting online communities such as Facebook, MySpace, and Webkinz". While SD Cyber-TEAM teachers are implementing TechQuests in classrooms they use these social networking elements to capture student interest and address the needs of digital natives. Through the Moodle environment, teachers have explored a variety of learning objects called TechQuests, to support classroom instruction previously outlined through a textbook. Project classrooms have participated in videoconferences over high-speed networks and through satellite connections with experts in the field investigating scientific data found in the CA State Park of Anza Borrego. Other engaging tools include: An Interactive Epicenter Locator Tool developed through the project in collaboration with the Scripps Institution of Oceanography to engage students in the use of data to determine earthquake epicenters during hands on investigations, and a TechQuest activity where GoogleEarth allows students to explore geographic locations and scientific data.
Garner, Bryan R; Godley, Mark D; Passetti, Lora L; Funk, Rodney R; White, William L
2014-01-01
The present quasi-experiment examined the direct and indirect effects of recovery support telephone calls following adolescent substance use disorder treatment. Six-month outcome data from 202 adolescents who had received recovery support calls from primarily pre-professional (i.e., college-level social service students) volunteers were compared to 6-month outcome data from a matched comparison sample of adolescents (n = 404). Results suggested adolescents in the recovery support sample had significantly greater reductions in their recovery environment risk relative to the comparison sample (β = -.17). Path analysis also suggested that the reduction in recovery environment risk produced by recovery support calls had indirect impacts (via recovery environment risk) on reductions in social risk (β = .22), substance use (β = .23), and substance-related problems (β = .16). Finally, moderation analyses suggested the effects of recovery support calls did not differ by gender, but were significantly greater for adolescents with lower levels of treatment readiness. In addition to providing rare empirical support for the effectiveness of recovery support services, an important contribution of this study is that it provides evidence that recovery support services do not necessarily have to be "peer-based," at least in terms of the recovery support service provider having the experiential credentials of being "in recovery." If replicated, this latter finding may have particularly important implications for helping increase the recovery support workforce. PMID:25574502
Semantic Importance Sampling for Statistical Model Checking
2015-01-16
…SMT calls while maintaining correctness. Finally, we implement SIS in a tool called osmosis and use it to verify a number of stochastic systems… Section 2 surveys related work. Section 3 presents background definitions and concepts. Section 4 presents SIS, and Section 5 presents our tool osmosis…
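The indexing snippet is fragmentary, but the underlying idea of semantic importance sampling is straightforward: use a constraint derived from SMT-style reasoning to restrict Monte Carlo sampling to a region known to contain all satisfying states, then reweight by that region's probability mass. A minimal self-contained sketch with a toy property (this is an illustration of the technique, not the osmosis tool itself):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: state (x, y) uniform on the unit square; rare property x + y > 1.9.
# Suppose an SMT-style analysis proves all satisfying states lie in C* = [0.9, 1]^2.
def holds(x, y):
    return x + y > 1.9

N = 100_000

# Naive Monte Carlo: very few hits, so high relative variance for rare events.
x, y = rng.random(N), rng.random(N)
naive = holds(x, y).mean()

# Semantic importance sampling: draw only from C*, weight by P(C*) = 0.01.
xs = 0.9 + 0.1 * rng.random(N)
ys = 0.9 + 0.1 * rng.random(N)
sis = 0.01 * holds(xs, ys).mean()

print(f"true p = 0.005, naive = {naive:.5f}, SIS = {sis:.5f}")
```

Because every satisfying state lies inside C*, the reweighted estimator is unbiased while concentrating all samples where they matter.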
Xu, Lingyang; Hou, Yali; Bickhart, Derek M; Song, Jiuzhou; Liu, George E
2013-06-25
Copy number variations (CNVs) are gains and losses of genomic sequence between two individuals of a species when compared to a reference genome. The data from single nucleotide polymorphism (SNP) microarrays are now routinely used for genotyping, but they also can be utilized for copy number detection. Substantial progress has been made in array design and CNV calling algorithms and at least 10 comparison studies in humans have been published to assess them. In this review, we first survey the literature on existing microarray platforms and CNV calling algorithms. We then examine a number of CNV calling tools to evaluate their impacts using bovine high-density SNP data. Large incongruities in the results from different CNV calling tools highlight the need for standardizing array data collection, quality assessment and experimental validation. Only after careful experimental design and rigorous data filtering can the impacts of CNVs on both normal phenotypic variability and disease susceptibility be fully revealed.
FFI: A software tool for ecological monitoring
Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman
2009-01-01
A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...
Performance Analysis of and Tool Support for Transactional Memory on BG/Q
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schindewolf, M
2011-12-08
Martin Schindewolf worked during his internship at the Lawrence Livermore National Laboratory (LLNL) under the guidance of Martin Schulz at the Computer Science Group of the Center for Applied Scientific Computing. We studied the performance of the TM subsystem of BG/Q and explored possibilities for TM tool support. To study the performance, we ran CLOMP-TM, a benchmark designed to quantify the overhead of OpenMP and to compare different synchronization primitives. To advance CLOMP-TM, we added Message Passing Interface (MPI) routines for hybrid parallelization, which makes it possible to run multiple MPI tasks, each running OpenMP, on one node. With these enhancements, a beneficial ratio of MPI tasks to OpenMP threads is determined, and the synchronization primitives are ranked as a function of the application characteristics. To demonstrate the usefulness of these results, we investigated a real Monte Carlo simulation called the Monte Carlo Benchmark (MCB). Applying the lessons learned yields the best task-to-thread ratio, and we were able to tune the synchronization by transactifying the MCB. We also developed tools that capture the performance of the TM run-time system and present it to the application developer. The performance of the TM run-time system relies on its built-in statistics. These tools use the Blue Gene Performance Monitoring (BGPM) interface to correlate the statistics from the TM run-time system with performance counter values. This combination provides detailed insight into the run-time behavior of the application and makes it possible to track down the cause of degraded performance. One tool separates the performance counters into three categories: Successful Speculation, Unsuccessful Speculation, and No Speculation. All of the tools are built around IBM's xlc compiler for C and C++ and have been run and tested on a Q32 early-access system.
VCFR: A package to manipulate and visualize variant call format data in R
USDA-ARS?s Scientific Manuscript database
Software to call single nucleotide polymorphisms or related genetic variants has converged on the variant call format (vcf) as the output format of choice. This has created a need for tools to work with vcf files. While an increasing number of software packages exist to read vcf data, many of them only ex...
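vcfR itself is an R package; as a language-neutral illustration of the format such tools must handle, here is a minimal Python sketch that walks the fixed columns of a vcf file (the vcf column layout is standardized; the file name is a placeholder).

```python
def read_vcf_records(path):
    """Yield (CHROM, POS, ID, REF, ALT) tuples from a plain-text vcf file.

    Skips '##' meta lines and the '#CHROM' header row; full-featured tools
    (vcfR, pysam) also handle INFO fields, genotypes, and bgzip/tabix indexing.
    """
    with open(path) as fh:
        for line in fh:
            if line.startswith("#"):
                continue  # meta-information and column-header lines
            chrom, pos, vid, ref, alt = line.rstrip("\n").split("\t")[:5]
            yield chrom, int(pos), vid, ref, alt

# Usage (placeholder file name):
# for rec in read_vcf_records("variants.vcf"):
#     print(rec)
```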
ERIC Educational Resources Information Center
Hamel, Marie-Josee; Caws, Catherine
2010-01-01
This article discusses CALL development from both educational and ergonomic perspectives. It focuses on the learner-task-tool interaction, in particular on the aspects contributing to its overall quality, herein called "usability." Two pilot studies are described that were carried out with intermediate to advanced learners of French in two…
GeneSCF: a real-time based functional enrichment tool with support for multiple organisms.
Subhash, Santhilal; Kanduri, Chandrasekhar
2016-09-13
High-throughput technologies such as ChIP-sequencing, RNA-sequencing, DNA sequencing and quantitative metabolomics generate huge volumes of data. Researchers often rely on functional enrichment tools to interpret the biological significance of the affected genes from these high-throughput studies. However, currently available functional enrichment tools need to be updated frequently to adapt to new entries from the functional database repositories. Hence there is a need for a simplified tool that can perform functional enrichment analysis using updated information directly from source databases such as KEGG, Reactome or Gene Ontology. In this study, we focused on designing a command-line tool called GeneSCF (Gene Set Clustering based on Functional annotations) that can predict the functionally relevant biological information for a set of genes in a real-time updated manner. It is designed to handle information from more than 4000 organisms from freely available prominent functional databases like KEGG, Reactome and Gene Ontology. We successfully employed our tool on two published datasets to predict the biologically relevant functional information. The core features of this tool were tested on Linux machines without the need to install additional dependencies. GeneSCF is more reliable than other enrichment tools because of its ability to use reference functional databases in real time to perform enrichment analysis. It is easy to integrate with other pipelines available for downstream analysis of high-throughput data. More importantly, GeneSCF can run multiple gene lists simultaneously on different organisms, thereby saving time for users. Since the tool is designed to be ready to use, there is no need for any complex compilation and installation procedures.
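The core computation shared by GeneSCF and most functional enrichment tools is an over-representation test of a gene list against each gene set. A minimal sketch of that test using the hypergeometric distribution (toy numbers, not GeneSCF's own code):

```python
from scipy.stats import hypergeom

def enrichment_p(genome_size, set_size, list_size, overlap):
    """P(X >= overlap) when drawing list_size genes at random from a genome
    in which set_size genes belong to the pathway/GO term of interest."""
    return hypergeom.sf(overlap - 1, genome_size, set_size, list_size)

# Toy example: 20,000 genes, a 150-gene pathway, a 400-gene hit list,
# 12 of which fall in the pathway.
p = enrichment_p(genome_size=20_000, set_size=150, list_size=400, overlap=12)
print(f"enrichment p-value = {p:.3e}")
```

In practice such p-values are computed for every gene set in the database and corrected for multiple testing.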
Khan, Nazib Uz Zaman; Rasheed, Sabrina; Sharmin, Tamanna; Ahmed, Tanvir; Mahmood, Shehrin Shaila; Khatun, Fatema; Hanifi, Sma; Hoque, Shahidul; Iqbal, Mohammad; Bhuiya, Abbas
2015-08-05
Bangladesh is facing a serious shortage of trained health professionals. In the pluralistic healthcare system of Bangladesh, formal health care providers constitute only 5 % of the total workforce; the rest are informal health care providers. Information Communication Technologies (ICTs) are increasingly seen as a powerful tool for linking the community with formal healthcare providers. Our study assesses an intervention that linked village doctors (a cadre of informal health care providers practising modern medicine) to formal doctors through call centres, from the perspective of the village doctors who participated in the intervention. The study was conducted in Chakaria, a remote rural area in south-eastern Bangladesh, during April-May 2013. Twelve village doctors were selected purposively from a pool of 55 village doctors who participated in the mobile health (mHealth) intervention. In-depth interviews were conducted to collect data. The data were manually analysed using themes that emerged. The village doctors talked about both business benefits (access to formal doctors, support for decision making, and being entitled to call trained doctors) and personal benefits (both financial and non-financial). Some of the major barriers mentioned were technical problems related to accessing the call centre, the charging of consultation fees, and unfamiliarity with the call centre physicians. Village doctors saw many benefits to having a business relationship with the trained doctors that the mHealth intervention provided. mHealth through call centres has the potential to ensure consultation services to populations through existing informal healthcare providers in settings with a shortage of qualified healthcare providers.
Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.
Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades
2015-01-01
DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error-prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of the migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image-processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands, and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) at one locus of sugarcane. These gel images presented many challenges for automated lane/band segmentation, including lane distortion, band deformity, a high degree of background noise, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and the DNA bands contained within them are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparison with an all-band reference, created by clustering the existing bands into a non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents GELect, an automated genotyping tool for DNA gel electrophoresis images, written in Java and made available through the ImageJ framework. With a novel automated image-processing workflow, the tool can accurately segment lanes from a gel matrix and intelligently extract distorted and even doublet bands that are difficult to identify with existing image-processing tools. Consequently, genotyping from DNA gel electrophoresis can be performed automatically, allowing users to efficiently conduct large-scale DNA fingerprinting via DNA gel electrophoresis. The software is freely available from http://www.biotec.or.th/gi/tools/gelect.
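The first two workflow steps (lane segmentation and band extraction) can be sketched with simple intensity profiles. The code below is a conceptual illustration on a synthetic gel image, not GELect's Java implementation, which must also cope with distortion and doublets:

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic gel: dark background with two bright lanes, each carrying bands.
gel = np.zeros((200, 120))
gel[:, 20:40] += 0.2        # lane 1 background
gel[:, 70:90] += 0.2        # lane 2 background
gel[50:55, 20:40] += 1.0    # band in lane 1
gel[120:125, 20:40] += 1.0  # band in lane 1
gel[80:85, 70:90] += 1.0    # band in lane 2

# 1) Lane segmentation: columns whose mean intensity exceeds a threshold.
col_profile = gel.mean(axis=0)
in_lane = col_profile > 0.1
edges = np.flatnonzero(np.diff(in_lane.astype(int)))
lanes = [(edges[i] + 1, edges[i + 1] + 1) for i in range(0, len(edges), 2)]

# 2) Band extraction: peaks in each lane's row-wise intensity profile.
for k, (lo, hi) in enumerate(lanes, start=1):
    row_profile = gel[:, lo:hi].mean(axis=1)
    peaks, _ = find_peaks(row_profile, height=0.5)
    print(f"lane {k} (cols {lo}-{hi}): bands at rows {list(peaks)}")
```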
Rheumatology telephone advice line - experience of a Portuguese department.
Ferreira, R; Marques, A; Mendes, A; da Silva, J A
2015-01-01
Telephone helplines for patients are a tool for information and advice. They can contribute to patients' satisfaction with care and to the effectiveness and safety of treatments. To achieve this, they need to be adequately adapted to the target populations, so as to incorporate their abilities and expectations. The aims of this study were to: a) evaluate the adherence of patients to a telephone helpline managed by nurses in a Portuguese rheumatology department, b) analyse the profile of users and their major needs, and c) analyse the management of calls by the nurses. The target population of this phone service comprises the patients treated at the Day Care Hospital and Early Arthritis Clinic of our department. Nurses answered phone calls immediately between 8am and 4pm on working days. Outside these hours, messages were recorded on voice mail and answered as soon as possible. Details of the calls were registered on a dedicated sheet, and patients were asked for permission to use the data to improve the service, with respect for their rights of confidentiality, anonymity and freedom of decision. In 18 months, 173 calls were made by 79 patients, with a mean age of 47.9 years (sd = 9.13). Considering the proportions of men and women in the target population, men called more frequently (M = 32.7% vs F = 20.4%, p = .016). The reasons for these calls fell into three categories: instrumental help, such as requests for results of complementary tests or rescheduling of appointments (43.9% of calls); counselling on side effects or worsening of the disease/pain (31.2%); and counselling on therapy management (24.9%). Neither sex nor patient age was significantly related to these reasons for calling. Nurses resolved half (50.3%) of the calls autonomously, and in 79.8% of cases there was no need to refer the patient to other health services. About a quarter of patients adhered to the telephone helpline. Patients called to obtain support in the management of disease and therapy or to report side effects and/or symptom aggravation, in addition to reasonable instrumental requests. This suggests that this service may provide important health gains, in addition to comfort for the patient.
NASA Astrophysics Data System (ADS)
Abdel-Aal, H. A.; Mansori, M. El
2012-12-01
Cutting tools are subject to extreme thermal and mechanical loads during operation. The state of loading is intensified in dry cutting environments, especially when cutting so-called hard-to-cut materials. Although the effect of mechanical loads on tool failure has been extensively studied, detailed studies on the effect of thermal dissipation on the deterioration of the cutting tool are rather scarce. In this paper we study the failure of coated carbide tools due to thermal loading. The study emphasizes the role played by the thermo-physical properties of the tool material in enhancing or preventing mass attrition of the cutting elements within the tool. It is shown that, within a comprehensive view of the nature of conduction in the tool zone, thermal conduction is not solely affected by temperature. Rather, it is a function of the so-called thermodynamic forces: the stress, the strain, the strain rate, the rate of temperature rise, and the temperature gradient. Although this formulation of thermal conduction is non-linear, it is beneficial to employ because it facilitates a fully mechanistic understanding of the thermal activation of tool wear.
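Schematically, the conduction law described above replaces a constant conductivity with one driven by the full set of thermodynamic forces. A hedged rendering of that dependence (the notation is assumed for illustration, not taken from the paper) is:

```latex
\[
\mathbf{q} \;=\; -\,k\!\left(T,\;\sigma,\;\varepsilon,\;\dot{\varepsilon},\;\frac{\partial T}{\partial t},\;\nabla T\right)\nabla T
\]
```

Here \(\mathbf{q}\) is the heat flux, and the conductivity \(k\) depends on temperature \(T\), stress \(\sigma\), strain \(\varepsilon\), strain rate \(\dot{\varepsilon}\), the rate of temperature rise, and the temperature gradient, which is what makes the law non-linear.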
Distributed and parallel Ada and the Ada 9X recommendations
NASA Technical Reports Server (NTRS)
Volz, Richard A.; Goldsack, Stephen J.; Theriault, R.; Waldrop, Raymond S.; Holzbacher-Valero, A. A.
1992-01-01
Recently, the DoD has sponsored work towards a new version of Ada, intended to support the construction of distributed systems. The revised version, often called Ada 9X, will become the new standard sometime in the 1990s. It is intended that Ada 9X should provide language features giving limited support for distributed system construction. The requirements for such features are given. Many of the most advanced computer applications involve embedded systems that are comprised of parallel processors or networks of distributed computers. If Ada is to become the widely adopted language envisioned by many, it is essential that suitable compilers and tools be available to facilitate the creation of distributed and parallel Ada programs for these applications. The major language issues impacting distributed and parallel programming are reviewed, and some principles upon which distributed/parallel language systems should be built are suggested. Based upon these, alternative language concepts for distributed/parallel programming are analyzed.
A New Virtual and Remote Experimental Environment for Teaching and Learning Science
NASA Astrophysics Data System (ADS)
Lustigova, Zdena; Lustig, Frantisek
This paper describes how a scientifically exact, problem-solving-oriented remote and virtual experimental environment for science might help to build a new strategy for science education. The main features are: remote observation and control of real-world phenomena, their processing and evaluation, and verification of hypotheses, combined with the development of critical thinking; sophisticated tools for searching, classifying and storing relevant information; and a collaborative environment supporting argumentative writing and teamwork, public presentation and defense of achieved results, all either in real presence, in telepresence, or in a combination of both. Only then can a real understanding of generalized science laws and their consequences be developed. This science learning and teaching environment (called ROL, Remote and Open Laboratory) has been developed and used by Charles University in Prague since 1996; it has been offered to science students in both formal and informal learning and, since 2003, to science teachers within their professional development studies.
Cori, Liliana; Carducci, Annalaura; Donzelli, Gabriele; La Rocca, Cinzia; Bianchi, Fabrizio
2018-01-01
Eleven projects within the LIFE programme (through which the Directorate-General for Environment of the European Commission funds projects aimed at protecting the environment and nature) addressing environment-and-health-related issues have been involved in a collaborative network called the KTE LIFE EnvHealth Network. The shared issue tackled by these projects is knowledge transfer and exchange (KTE). The objective of the LIFE programme is to support the implementation of environmental legislation in the European Union and to provide new tools and knowledge that will help to better protect both the territory and the communities. Transferring knowledge to decision makers, at the appropriate and effective level, is therefore a central function of the projects. The Network promotes national and international networking intended to involve other projects, provide methodological support, and circulate information and successful practices, with the aim of multiplying the energies of each project involved.
Microbe-ID: an open source toolbox for microbial genotyping and species identification
Tabima, Javier F.; Everhart, Sydney E.; Larsen, Meredith M.; Weisberg, Alexandra J.; Kamvar, Zhian N.; Tancos, Matthew A.; Smart, Christine D.; Chang, Jeff H.
2016-01-01
Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user-friendly analytical tools for fast identification are not readily available. To address this need, we created a web-based set of applications called Microbe-ID that allow for customizing a toolbox for rapid species identification and strain genotyping using any genetic markers of choice. Two components of Microbe-ID, named Sequence-ID and Genotype-ID, implement species and genotype identification, respectively. Sequence-ID allows identification of species by using BLAST to query sequences for any locus of interest against a custom reference sequence database. Genotype-ID allows placement of an unknown multilocus marker in either a minimum spanning network or dendrogram with bootstrap support from a user-created reference database. Microbe-ID can be used for identification of any organism based on nucleotide sequences or any molecular marker type and several examples are provided. We created a public website for demonstration purposes called Microbe-ID (microbe-id.org) and provided a working implementation for the genus Phytophthora (phytophthora-id.org). In Phytophthora-ID, the Sequence-ID application allows identification based on ITS or cox spacer sequences. Genotype-ID groups individuals into clonal lineages based on simple sequence repeat (SSR) markers for the two invasive plant pathogen species P. infestans and P. ramorum. All code is open source and available on github and CRAN. Instructions for installation and use are provided at https://github.com/grunwaldlab/Microbe-ID. PMID:27602267
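Sequence-ID's species identification rests on querying a custom reference database with BLAST. A minimal sketch of that pattern using the standard BLAST+ command-line tools driven from Python (file names are placeholders; this is an illustration of the approach, not Microbe-ID's own code):

```python
import subprocess

# Build a nucleotide BLAST database from a curated reference FASTA
# (e.g., ITS or cox spacer sequences); file names are placeholders.
subprocess.run(
    ["makeblastdb", "-in", "refs.fasta", "-dbtype", "nucl"],
    check=True,
)

# Query an unknown isolate's sequence and report the top tabular hits.
result = subprocess.run(
    ["blastn", "-query", "unknown.fasta", "-db", "refs.fasta",
     "-outfmt", "6", "-max_target_seqs", "5"],
    check=True, capture_output=True, text=True,
)
print(result.stdout)  # qseqid sseqid pident length ... evalue bitscore
```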
Cho, Jin-Young; Lee, Hyoung-Joo; Jeong, Seul-Ki; Paik, Young-Ki
2017-12-01
Mass spectrometry (MS) is a widely used proteome analysis tool in biomedical science. In an MS-based bottom-up proteomic approach to protein identification, sequence database (DB) searching has been routinely used because of its simplicity and convenience. However, searching a sequence DB with multiple variable modification options can increase processing time and false-positive errors in large, complicated MS data sets. Spectral library searching is an alternative solution that avoids the limitations of sequence DB searching and allows the detection of more peptides with high sensitivity. Unfortunately, this technique has lower proteome coverage, limiting the detection of novel and complete peptide sequences in biological samples. To solve these problems, we previously developed the "Combo-Spec Search" method, which searches multiple manually built reference and simulated spectral libraries to analyze whole proteomes in a biological sample. In this study, we have developed a new analytical interface tool called "Epsilon-Q" to enhance the functions of both the Combo-Spec Search method and label-free protein quantification. Epsilon-Q automatically performs multiple spectral library searches, class-specific false-discovery-rate control, and result integration. It has a user-friendly graphical interface and demonstrates good performance in identifying and quantifying proteins by supporting standard MS data formats and spectrum-to-spectrum matching powered by SpectraST. Furthermore, when the Epsilon-Q interface is combined with the Combo-Spec Search method, the resulting Epsilon-Q system shows a synergistic effect, outperforming other sequence DB search engines in identifying and quantifying low-abundance proteins in biological samples. The Epsilon-Q system can be a versatile tool for comparative proteome analysis based on multiple spectral libraries and label-free quantification.
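Spectrum-to-spectrum matching of the kind SpectraST performs is, at its core, a similarity score between an observed spectrum and a library spectrum after binning peaks on m/z. A toy normalized dot-product sketch (a common baseline score, not the actual SpectraST scoring function):

```python
import numpy as np

def spectral_similarity(obs, lib, bin_width=1.0):
    """Normalized dot product between two peak lists given as
    (mz, intensity) pairs, after binning intensities on m/z."""
    def binned(peaks):
        vec = {}
        for mz, inten in peaks:
            key = round(mz / bin_width)
            vec[key] = vec.get(key, 0.0) + inten
        return vec

    a, b = binned(obs), binned(lib)
    keys = sorted(set(a) | set(b))
    va = np.array([a.get(k, 0.0) for k in keys])
    vb = np.array([b.get(k, 0.0) for k in keys])
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

obs = [(175.1, 30.0), (262.1, 100.0), (375.2, 55.0)]
lib = [(175.1, 28.0), (262.2, 95.0), (375.2, 60.0)]
print(f"similarity = {spectral_similarity(obs, lib):.3f}")  # close to 1.0
```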
Dynamic Visualization of Co-expression in Systems Genetics Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
New, Joshua Ryan; Huang, Jian; Chesler, Elissa J
2008-01-01
Biologists hope to address grand scientific challenges by exploring the abundance of data made available through modern microarray technology and other high-throughput techniques. The impact of this data, however, is limited unless researchers can effectively assimilate such complex information and integrate it into their daily research; interactive visualization tools are called for to support the effort. Specifically, typical studies of gene co-expression require novel visualization tools that enable the dynamic formulation and fine-tuning of hypotheses to aid the process of evaluating sensitivity of key parameters. These tools should allow biologists to develop an intuitive understanding of the structure of biological networks and discover genes which reside in critical positions in networks and pathways. By using a graph as a universal data representation of correlation in gene expression data, our novel visualization tool employs several techniques that when used in an integrated manner provide innovative analytical capabilities. Our tool for interacting with gene co-expression data integrates techniques such as: graph layout, qualitative subgraph extraction through a novel 2D user interface, quantitative subgraph extraction using graph-theoretic algorithms or by querying an optimized b-tree, dynamic level-of-detail graph abstraction, and template-based fuzzy classification using neural networks. We demonstrate our system using a real-world workflow from a large-scale, systems genetics study of mammalian gene co-expression.
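The "graph as a universal representation of correlation" idea can be made concrete in a few lines: compute pairwise expression correlations, then keep edges above a threshold. A minimal sketch with synthetic data standing in for microarray measurements (the threshold and data are illustrative assumptions):

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
genes = [f"gene{i}" for i in range(6)]
expr = rng.normal(size=(6, 40))                  # 6 genes x 40 samples
expr[1] = expr[0] + 0.1 * rng.normal(size=40)    # force two genes to co-express

corr = np.corrcoef(expr)                         # gene-by-gene Pearson correlation

# Threshold the correlation matrix into a co-expression graph.
G = nx.Graph()
G.add_nodes_from(genes)
for i in range(len(genes)):
    for j in range(i + 1, len(genes)):
        if abs(corr[i, j]) >= 0.8:
            G.add_edge(genes[i], genes[j], weight=corr[i, j])

# Genes in "critical positions" can then be ranked, e.g. by centrality.
print(nx.degree_centrality(G))
```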
Balatsoukas, Panos; Williams, Richard; Davies, Colin; Ainsworth, John; Buchan, Iain
2015-11-01
Integrated care pathways (ICPs) define a chronological sequence of steps, most commonly diagnostic or treatment, to be followed in providing care for patients. Care pathways help to ensure quality standards are met and to reduce variation in practice. Although research on the computerisation of ICPs progresses, there is still little knowledge of the requirements for designing user-friendly and usable electronic care pathways, or of how users (normally health care professionals) interact with interfaces that support the design, analysis and visualisation of ICPs. The purpose of the study reported in this paper was to address this gap by evaluating the usability of a novel web-based tool called COCPIT (Collaborative Online Care Pathway Investigation Tool). COCPIT supports the design, analysis and visualisation of ICPs at the population level. To address the aim of this study, an evaluation methodology was designed based on heuristic evaluations and a mixed-method usability test. The results showed that modular visualisation and direct manipulation of information related to the design and analysis of ICPs are useful for engaging and stimulating users. However, designers should pay attention to issues related to the visibility of the system status and the match between the system and the real world, especially in relation to the display of statistical information about care pathways and the editing of clinical information within a care pathway. The paper concludes with recommendations for interface design.
Teaching Web Security Using Portable Virtual Labs
ERIC Educational Resources Information Center
Chen, Li-Chiou; Tao, Lixin
2012-01-01
We have developed a tool called Secure WEb dEvelopment Teaching (SWEET) to introduce security concepts and practices for web application development. This tool provides introductory tutorials, teaching modules utilizing virtualized hands-on exercises, and project ideas in web application security. In addition, the tool provides pre-configured…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Femec, D.A.
This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
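The report predates modern ORM tooling, but the code-generation idea is easy to sketch: walk a declarative schema and emit both the SQL DDL and the matching host-language declarations. A hypothetical miniature of that pattern (the type mappings and output format are invented for illustration, not CREATE-SCHEMA's actual output):

```python
# Map a declarative column spec to SQL and FORTRAN-style declarations.
SQL_TYPES = {"int": "INTEGER", "str": "VARCHAR(64)", "float": "REAL"}
F77_TYPES = {"int": "INTEGER", "str": "CHARACTER*64", "float": "REAL"}

def create_table_sql(table, columns):
    """Emit the CREATE TABLE statement for one table."""
    cols = ",\n  ".join(f"{name} {SQL_TYPES[t]}" for name, t in columns)
    return f"CREATE TABLE {table} (\n  {cols}\n);"

def fortran_declarations(columns):
    """Emit matching FORTRAN declaration statements for the host program."""
    return "\n".join(f"      {F77_TYPES[t]} {name.upper()}" for name, t in columns)

schema = [("sample_id", "int"), ("analyst", "str"), ("reading", "float")]
print(create_table_sql("lab_results", schema))
print(fortran_declarations(schema))
```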
Simulink/PARS Integration Support
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vacaliuc, B.; Nakhaee, N.
2013-12-18
The state of the art for signal processor hardware has far out-paced the development tools for placing applications on that hardware. In addition, signal processors are available in a variety of architectures, each uniquely capable of handling specific types of signal processing efficiently. With these processors becoming smaller and demanding less power, it has become possible to group multiple processors, a heterogeneous set of processors, into single systems. Different portions of the desired problem set can be assigned to different processor types as appropriate. As software development tools do not keep pace with these processors, especially when multiple processors of different types are used, a method is needed to enable software code portability among multiple processors and multiple types of processors along with their respective software environments. Sundance DSP, Inc. has developed a software toolkit called “PARS”, whose objective is to provide a framework that uses suites of tools provided by different vendors, along with modeling tools and a real-time operating system, to build an application that spans different processor types. The software language used to express the behavior of the system is a very high level modeling language, “Simulink”, a MathWorks product. ORNL has used this toolkit to effectively implement several deliverables. This CRADA report describes the collaboration between ORNL and Sundance DSP, Inc.
Biological data integration: wrapping data and tools.
Lacroix, Zoé
2002-06-01
Nowadays scientific data are inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced data accessing, analyzing, and visualization tools. Building a digital library for scientific data requires accessing and manipulating data extracted from flat files or databases, documents retrieved from the Web, as well as data generated by software. We present an approach to wrapping web data sources, databases, flat files, or data generated by tools through a database view mechanism. Generally, a wrapper has two tasks: it first sends a query to the source to retrieve data and, second, builds the expected output with respect to the virtual structure. Our wrappers are composed of a retrieval component based on an intermediate object-view mechanism called search views, which maps the source capabilities to attributes, and an eXtensible Markup Language (XML) engine, respectively, to perform these two tasks. The originality of the approach consists of: 1) a generic view mechanism to seamlessly access data sources with limited capabilities and 2) the ability to wrap data sources as well as the useful specific tools they may provide. Our approach has been developed and demonstrated as part of the multidatabase system supporting queries via uniform object protocol model (OPM) interfaces.
The Five S’s: A Communication Tool for Child Psychiatric Access Projects
Harrison, Joyce; Wasserman, Kate; Steinberg, Janna; Platt, Rheanna; Coble, Kelly; Bower, Kelly
2017-01-01
Given the gap in child psychiatric services available to meet existing pediatric behavioral health needs, children and families are increasingly seeking behavioral health services from their primary care clinicians (PCCs). However, many pediatricians report not feeling adequately trained to meet these needs. As a result, child psychiatric access projects (CPAPs) are being developed around the country to support the integration of care for children. Despite the promise and success of these programs, there are barriers, including the challenge of effective communication between PCCs and child psychiatrists. Consultants from the Maryland CPAP, the Behavioral Health Integration in Pediatric Primary Care (BHIPP) project, have developed a framework called the Five S’s. The Five S’s are Safety, Specific Behaviors, Setting, Scary Things, and Screening/Services. It is a tool that can be used to help PCCs and child psychiatrists communicate and collaborate to formulate pediatric behavioral health cases for consultation or referral requests. Each of these components and its importance to the case consultation are described. Two case studies are presented that illustrate how the Five S’s tool can be used in clinical consultation between PCC and child psychiatrist. We also describe the utility of the tool beyond its use in behavioral health consultation. PMID:27919566
Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.
2004-01-01
The interdisciplinary nature and increasing complexity of water- and environmental-resource problems require the use of modeling approaches that can incorporate knowledge from a broad range of scientific disciplines. The large number of distributed hydrological and ecosystem models currently available are composed of a variety of different conceptualizations of the processes they simulate. Assessment of the capabilities of these distributed models requires evaluation of the conceptualizations of the individual processes, and the identification of which conceptualizations are most appropriate for various combinations of criteria, such as problem objectives, data constraints, and spatial and temporal scales of application. With this knowledge, "optimal" models for specific sets of criteria can be created and applied. The U.S. Geological Survey (USGS) Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide these model development and application capabilities. MMS supports the integration of models and tools at a variety of levels of modular design. These include individual process models, tightly coupled models, loosely coupled models, and fully integrated decision support systems. A variety of visualization and statistical tools are also provided. MMS has been coupled with the Bureau of Reclamation (BOR) object-oriented reservoir and river-system modeling framework, RiverWare, under a joint USGS-BOR program called the Watershed and River System Management Program. MMS and RiverWare are linked using a shared relational database. The resulting database-centered decision support system provides tools for evaluating and applying optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. Management issues being addressed include efficiency of water-resources management, environmental concerns such as meeting flow needs for endangered species, and optimizing operations within the constraints of multiple objectives such as power generation, irrigation, and water conservation. This decision support system approach is being developed, tested, and implemented in the Gunnison, Yakima, San Juan, Rio Grande, and Truckee River basins of the western United States.
A decision support for an integrated multi-scale analysis of irrigation: DSIRR.
Bazzani, Guido M
2005-12-01
The paper presents a decision support system designed to conduct an economic-environmental assessment of agricultural activity focusing on irrigation, called 'Decision Support for IRRigated Agriculture' (DSIRR). The program describes the effect at catchment scale of choices taken at micro scale by independent actors, the farmers, by simulating their decision processes. The decision support (DS) system is conceived as a support tool for participatory water policies, as requested by the Water Framework Directive, and it aims at analyzing alternatives in production and technology according to different market, policy and climate conditions. The tool uses data and models, provides a graphical user interface and can incorporate the decision makers' own insights. Heterogeneity in preferences is admitted, since it is assumed that irrigators try to optimize personal multi-attribute utility functions subject to a set of constraints. Consideration of agronomic and engineering aspects allows an accurate description of irrigation. Mathematical programming techniques are applied to find solutions. The program has been applied in the river Po basin (northern Italy) to analyze the impact of a pricing policy in a context of irrigation technology innovation. Water demand functions and the elasticity to water price have been estimated. Results demonstrate how different areas and systems react to the same policy in quite different ways. While in the annual cropping system pricing seems effective at saving the resource, at the cost of impeding the water agencies' cost recovery, the same policy has the opposite effect in the perennial fruit system, which shows an inelastic response to water price. The multidimensional assessment conducted clarifies the trade-offs among conflicting economic-social-environmental objectives, thus generating valuable information to design a more tailored mix of measures.
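The mathematical-programming core of such a DS can be illustrated with a deliberately simplified farm model: choose hectares of two crops to maximize gross margin subject to land and irrigation-water limits. All coefficients below are invented for illustration and bear no relation to DSIRR's calibrated models:

```python
from scipy.optimize import linprog

# Decision variables: hectares of crop A and crop B.
# Gross margins (EUR/ha); linprog minimizes, so negate to maximize.
margins = [-900.0, -1400.0]

# Constraints: total land <= 50 ha; water demand (m3/ha) within a quota.
A_ub = [[1.0, 1.0],        # land (ha)
        [2000.0, 4500.0]]  # irrigation water per hectare (m3/ha)
b_ub = [50.0, 150_000.0]   # 50 ha of land, 150,000 m3 of water

res = linprog(margins, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
ha_a, ha_b = res.x
print(f"crop A: {ha_a:.1f} ha, crop B: {ha_b:.1f} ha, margin = {-res.fun:,.0f} EUR")
```

Re-solving such a model across a grid of water prices is one way a tool in this family can trace out a water demand function and its elasticity.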
Advancing Collaboration through Hydrologic Data and Model Sharing
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.
2015-12-01
HydroShare is an online, collaborative system for open sharing of hydrologic data, analytical tools, and models. It supports the sharing of and collaboration around "resources" which are defined primarily by standardized metadata, content data models for each resource type, and an overarching resource data model based on the Open Archives Initiative's Object Reuse and Exchange (OAI-ORE) standard and a hierarchical file packaging system called "BagIt". HydroShare expands the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated to include geospatial and multidimensional space-time datasets commonly used in hydrology. HydroShare also includes new capability for sharing models, model components, and analytical tools and will take advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. It also supports web services and server/cloud based computation operating on resources for the execution of hydrologic models and analysis and visualization of hydrologic data. HydroShare uses iRODS as a network file system for underlying storage of datasets and models. Collaboration is enabled by casting datasets and models as "social objects". Social functions include both private and public sharing, formation of collaborative groups of users, and value-added annotation of shared datasets and models. The HydroShare web interface and social media functions were developed using the Django web application framework coupled to iRODS. Data visualization and analysis is supported through the Tethys Platform web GIS software stack. Links to external systems are supported by RESTful web service interfaces to HydroShare's content. This presentation will introduce the HydroShare functionality developed to date and describe ongoing development of functionality to support collaboration and integration of data and models.
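HydroShare's RESTful web service interface makes its resources scriptable. A hedged sketch of querying the public resource listing follows; the /hsapi/resource/ path, parameter names, and response fields reflect the published API as best understood here and should be treated as assumptions:

```python
import requests

# Query HydroShare's REST API for public resources matching a keyword.
# Endpoint path, parameters, and JSON field names are assumptions based
# on the published hsapi documentation, not verified here.
resp = requests.get(
    "https://www.hydroshare.org/hsapi/resource/",
    params={"subject": "streamflow"},
    timeout=30,
)
resp.raise_for_status()

for res in resp.json().get("results", []):
    print(res.get("resource_id"), "-", res.get("resource_title"))
```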
Computer-Aided Air-Traffic Control In The Terminal Area
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
1995-01-01
Developmental computer-aided system for automated management and control of arrival traffic at large airports includes three integrated subsystems: one called the Traffic Management Advisor, another called the Descent Advisor, and a third called the Final Approach Spacing Tool. A database that includes current wind measurements and mathematical models of the performance of aircraft types contributes to the effective operation of the system.
ESIF Call for High-Impact Integrated Projects
As a U.S. Department of Energy user facility, the Energy Systems Integration Facility (ESIF) develops the concepts, tools, and technologies needed to measure, analyze, predict, protect, and control the grid of the future, and issues calls for high-impact integrated projects at the facility.
Using WebQuests as Idea Banks for Fostering Autonomy in Online Language Courses
ERIC Educational Resources Information Center
Sadaghian, Shirin; Marandi, S. Susan
2016-01-01
The concept of language learner autonomy has influenced Computer-Assisted Language Learning (CALL) to the extent that Schwienhorst (2012) informs us of a paradigm change in CALL design in the light of learner autonomy. CALL is not considered a tool anymore, but a learner environment available to language learners anywhere in the world. Based on a…
Brodaty, Henry; Low, Lee-Fay; Liu, Zhixin; Fletcher, Jennifer; Roast, Joel; Goodenough, Belinda; Chenoweth, Lynn
2014-12-01
To test the hypothesis that individual- and institutional-level factors influence the effects of a humor therapy intervention on aged care residents. Data were from the humor therapy group of the Sydney Multisite Intervention of LaughterBosses and ElderClowns (SMILE) study, a single-blind cluster randomized controlled trial of humor therapy conducted over 12 weeks; assessments were performed at baseline, week 13, and week 26. One hundred eighty-nine individuals from 17 Sydney residential aged care facilities were randomly allocated to the humor therapy intervention. Professional performers called "ElderClowns" provided 9-12 weekly 2-hour humor therapy sessions, augmented by trained staff, called "LaughterBosses." Outcome measures were the Cornell Scale for Depression in Dementia, the Cohen-Mansfield Agitation Inventory, the Neuropsychiatric Inventory, the withdrawal subscale of the Multidimensional Observation Scale for Elderly Subjects, and the proxy-rated quality of life in dementia population scale. Facility-level measures were support of the management for the intervention, commitment levels of LaughterBosses, Environmental Audit Tool scores, and facility level of care provided (high/low). Resident-level measures were engagement, functional ability, disease severity, and time in care. Multilevel path analyses simultaneously modeled resident engagement at the individual level (repeated measures) and the effects of management support and staff commitment to humor therapy at the cluster level. Models indicated flow-on effects, whereby management support had positive effects on LaughterBoss commitment, and LaughterBoss commitment increased resident engagement. Higher resident engagement was associated with reduced depression, agitation, and neuropsychiatric scores. The effectiveness of psychosocial programs in residential aged care can be enhanced by management support, staff commitment, and active resident engagement.
Distributed data analysis in ATLAS
NASA Astrophysics Data System (ADS)
Nilsson, Paul; Atlas Collaboration
2012-12-01
Data analysis using grid resources is one of the fundamental challenges to be addressed before the start of LHC data taking. The ATLAS detector will produce petabytes of data per year, and roughly one thousand users will need to run physics analyses on these data. Appropriate user interfaces and helper applications have been made available to ensure that the grid resources can be used without requiring expertise in grid technology. These tools enlarge the number of grid users from a few production administrators to potentially all participating physicists. ATLAS makes use of three grid infrastructures for distributed analysis: the EGEE sites, the Open Science Grid, and NorduGrid. These grids are managed by the gLite workload management system, the PanDA workload management system, and ARC middleware; many sites can be accessed via both the gLite WMS and PanDA. Users can choose between two front-end tools to access the distributed resources. Ganga is a tool co-developed with LHCb to provide a common interface to the multitude of execution backends (local, batch, and grid). The PanDA workload management system provides a set of utilities called PanDA Client; with these tools users can easily submit Athena analysis jobs to the PanDA-managed resources. Distributed data are managed by Don Quixote 2 (DQ2), a system developed by ATLAS; DQ2 is used to replicate datasets according to the data distribution policies and maintains a central catalog of file locations. The operation of the grid resources is continually monitored by the GangaRobot functional testing system, and infrequent site stress tests are performed using the HammerCloud system. In addition, the DAST shift team is a group of power users who take shifts to provide distributed analysis user support; this team has effectively relieved the developers of the support burden.
NASA Astrophysics Data System (ADS)
Ines, A. V. M.; Han, E.; Baethgen, W.
2017-12-01
Advances in seasonal climate forecasts (SCFs) during the past decades have brought great potential to improve management of the agricultural climate risks associated with inter-annual climate variability. In spite of the popular use of crop simulation models in addressing climate risk problems, the models cannot readily ingest seasonal climate predictions issued as tercile probabilities of the most likely rainfall categories (i.e., below-, near- and above-normal). When a skillful SCF is linked with crop simulation models, the climate information can be translated into actionable agronomic terms and thus better support strategic and tactical decisions. In other words, crop modeling connected with a given SCF makes it possible to simulate "what-if" scenarios with different crop choices or management practices and thereby better inform decision makers. In this paper, we present a decision support tool called CAMDT (Climate Agriculture Modeling and Decision Tool), which seamlessly integrates probabilistic SCFs with the DSSAT-CSM-Rice model to guide decision makers in adopting appropriate crop and agricultural water management practices for given climatic conditions. CAMDT can disaggregate a probabilistic SCF into daily weather realizations (using either a parametric or non-parametric disaggregation method) and run DSSAT-CSM-Rice with the disaggregated realizations. The convenient graphical user interface allows non-technical users to easily implement several "what-if" scenarios and visualize the results of the scenario runs. In addition, CAMDT translates crop model outputs into economic terms once the user provides the expected crop price and cost. CAMDT is a practical tool for real-world applications, specifically for agricultural climate risk management in the Bicol region, Philippines, and is flexible enough to be adapted to other crops or regions of the world. CAMDT GitHub: https://github.com/Agro-Climate/CAMDT
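The non-parametric disaggregation step can be sketched simply: classify historical seasons into terciles, sample a category according to the forecast probabilities, then draw that category's daily rainfall sequence. The code below uses a synthetic record and is a conceptual illustration, not CAMDT's implementation:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic historical record: 30 seasons x 90 days of daily rainfall (mm).
hist = rng.gamma(shape=0.4, scale=8.0, size=(30, 90))

# Classify seasons into below-/near-/above-normal terciles by seasonal total.
totals = hist.sum(axis=1)
t1, t2 = np.quantile(totals, [1 / 3, 2 / 3])
category = np.digitize(totals, [t1, t2])      # 0 = below, 1 = near, 2 = above

def realization(forecast_probs):
    """Draw one daily weather realization consistent with a tercile SCF."""
    cat = rng.choice(3, p=forecast_probs)               # sample a category
    year = rng.choice(np.flatnonzero(category == cat))  # resample a season
    return hist[year]

# A forecast of 20/30/50 for below/near/above-normal rainfall:
daily = realization([0.2, 0.3, 0.5])
print(f"sampled seasonal total: {daily.sum():.0f} mm over {daily.size} days")
```

Each realization can then be fed to the crop model, and repeating the draw many times propagates the forecast uncertainty into yield distributions.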
Basic to Advanced InSAR Processing: GMTSAR
NASA Astrophysics Data System (ADS)
Sandwell, D. T.; Xu, X.; Baker, S.; Hogrelius, A.; Mellors, R. J.; Tong, X.; Wei, M.; Wessel, P.
2017-12-01
Monitoring crustal deformation using InSAR is becoming a standard technique for the science and application communities. Optimal use of the new data streams from Sentinel-1 and NISAR will require open software tools as well as education on the strengths and limitations of the InSAR methods. Over the past decade we have developed freely available, open-source software for processing InSAR data. The software relies on the Generic Mapping Tools (GMT) for the back-end data analysis and display and is thus called GMTSAR. With startup funding from NSF, we accelerated the development of GMTSAR to include more satellite data sources and provide better integration and distribution with GMT. In addition, with support from UNAVCO we have offered 6 GMTSAR short courses to educate mostly novice InSAR users. Currently, the software is used by hundreds of scientists and engineers around the world to study deformation at more than 4300 different sites. The most challenging aspect of the recent software development was the transition from image alignment using the cross-correlation method to a completely new alignment algorithm that uses only the precise orbital information to geometrically align images to an accuracy of better than 7 cm. This development was needed to process a new data type that is being acquired by the Sentinel-1A/B satellites. This combination of software and open data is transforming radar interferometry from a research tool into a fully operational time series analysis tool. Over the next 5 years we are planning to continue to broaden the user base through: improved software delivery methods; code hardening; better integration with data archives; support for high level products being developed for NISAR; and continued education and outreach.
BS-virus-finder: virus integration calling using bisulfite sequencing data.
Gao, Shengjie; Hu, Xuesong; Xu, Fengping; Gao, Changduo; Xiong, Kai; Zhao, Xiao; Chen, Haixiao; Zhao, Shancen; Wang, Mengyao; Fu, Dongke; Zhao, Xiaohui; Bai, Jie; Mao, Likai; Li, Bo; Wu, Song; Wang, Jian; Li, Shengbin; Yang, Huangming; Bolund, Lars; Pedersen, Christian N S
2018-01-01
DNA methylation plays a key role in the regulation of gene expression and in carcinogenesis. Bisulfite sequencing studies mainly focus on calling single nucleotide polymorphisms, identifying differentially methylated regions, and finding allele-specific DNA methylation. Until now, only a few software tools have focused on virus integration using bisulfite sequencing data. We have developed a new and easy-to-use software tool, named BS-virus-finder (BSVF, RRID:SCR_015727), to detect viral integration breakpoints in whole human genomes. The tool is hosted at https://github.com/BGI-SZ/BSVF. BS-virus-finder demonstrates high sensitivity and specificity. It is useful in epigenetic studies and in revealing the relationship between viral integration and DNA methylation. BS-virus-finder is the first software tool to detect virus integration loci using bisulfite sequencing data.
VoiceThread: A Useful Program Evaluation Tool
ERIC Educational Resources Information Center
Mott, Rebecca
2018-01-01
With today's technology, Extension professionals have a variety of tools available for program evaluation. This article describes an innovative platform called VoiceThread that has been used in many classrooms but also is useful for conducting virtual focus group research. I explain how this tool can be used to collect qualitative participant…
Rapid Response to Decision Making for Complex Issues - How Technologies of Cooperation Can Help
2005-11-01
creating bottom-up taxonomies (called folksonomies) using metadata tools like del.icio.us (in which users create their own tags for bookmarking Web... tools such as RSS, tagging (and the consequent development of folksonomies), wikis, and group visualization tools all help multiply the individual
Wong, Frances Kam Yuet; So, Ching; Chau, June; Law, Antony Kwan Pui; Tam, Stanley Ku Fu; McGhee, Sarah
2015-01-01
Home visits and telephone calls are two frequently used approaches in transitional care, but their differential economic effects are unknown. The aim was to examine the differential economic benefits of home visits with telephone calls versus telephone calls only in transitional discharge support, through a cost-effectiveness analysis conducted alongside a randomised controlled trial (RCT). Patients discharged from medical units were randomly assigned to control (control, N = 210), home visits with calls (home, N = 196) and calls only (call, N = 204). Cost-effectiveness analyses were conducted from the societal perspective, comparing monetary benefits and quality-adjusted life years (QALYs) gained. The home arm was less costly but less effective at 28 days and was dominant (less costly and more effective) at 84 days; the call arm was dominant at both 28 and 84 days. The incremental QALY was -0.0002/0.0008 (28/84 days) for the home arm and 0.0022/0.0104 (28/84 days) for the call arm. When the three groups were compared against the NICE threshold of £20,000, the call arm had a higher probability of being cost-effective at 84 days but not at 28 days (home: 53%, call: 35% at 28 days versus home: 22%, call: 73% at 84 days). The original RCT showed that the bundled intervention involving home visits and calls was more effective than calls only in reducing hospital readmissions. This study adds a cost perspective to inform policymakers that both home visits with calls and calls only are cost-effective for transitional care support, but that calls only have a higher chance of being cost-effective for a sustained period after the intervention. © The Author 2014. Published by Oxford University Press on behalf of the British Geriatrics Society.
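For readers unfamiliar with the dominance and threshold language used above, the sketch below shows the standard classification logic with illustrative numbers; it is not the study's analysis code.

```python
# Sketch of the dominance / threshold logic behind cost-effectiveness
# comparisons (illustrative numbers; not the study's actual analysis code).

def classify(d_cost: float, d_qaly: float, threshold: float = 20000.0) -> str:
    """Classify an intervention against its comparator from incremental cost
    (GBP) and incremental QALYs. For simplicity only dominance and the
    more-costly/more-effective quadrant are handled explicitly."""
    if d_cost <= 0 and d_qaly >= 0:
        return "dominant (no more costly, at least as effective)"
    if d_cost >= 0 and d_qaly <= 0:
        return "dominated (at least as costly, no more effective)"
    if d_cost > 0 and d_qaly > 0:
        icer = d_cost / d_qaly
        verdict = "cost-effective" if icer <= threshold else "not cost-effective"
        return f"ICER = {icer:,.0f} GBP/QALY -> {verdict}"
    return "less costly and less effective: interpret ICER against threshold"

# e.g. a calls-only arm that saves money and gains 0.0104 QALYs at 84 days:
print(classify(d_cost=-150.0, d_qaly=0.0104))  # dominant
```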
Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows
NASA Astrophysics Data System (ADS)
Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.
2014-12-01
The U.S. Department of Energy (DOE) is investing in the development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which include toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is the automated job launching and monitoring capability, which allows a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.
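As a rough illustration of the submit-and-monitor pattern described above, the sketch below drives a SLURM-style batch queue; it is a generic example, not Akuna's implementation, and the batch script name is hypothetical.

```python
# Generic sketch of automated job launching and monitoring on a SLURM-style
# cluster (illustrative only; not Akuna's actual implementation).
import subprocess
import time

def submit(script: str) -> str:
    """Submit a batch script with sbatch and return the job id.
    sbatch prints e.g. 'Submitted batch job 12345'."""
    out = subprocess.run(["sbatch", script], capture_output=True,
                         text=True, check=True)
    return out.stdout.strip().split()[-1]

def wait(job_id: str, poll_s: int = 30) -> None:
    """Poll squeue until the job leaves the queue (finished or failed)."""
    while True:
        out = subprocess.run(["squeue", "-h", "-j", job_id],
                             capture_output=True, text=True)
        if not out.stdout.strip():
            return
        time.sleep(poll_s)

# job = submit("run_case.sh")   # "run_case.sh" is a hypothetical batch script
# wait(job)
```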
ParseCNV integrative copy number variation association software with quality tracking
Glessner, Joseph T.; Li, Jin; Hakonarson, Hakon
2013-01-01
A number of copy number variation (CNV) calling algorithms exist; however, comprehensive software tools for CNV association studies are lacking. We describe ParseCNV, unique software that takes CNV calls and creates probe-based statistics for CNV occurrence in both case-control designs and in family-based studies, addressing both de novo and inheritance events, which are then summarized based on CNV regions (CNVRs). CNVRs are defined in a dynamic manner to allow for complex CNV overlap while maintaining precise association regions. Using this approach, we avoid the failure-to-converge and non-monotonic curve-fitting weaknesses of programs such as CNVtools and CNVassoc; and although Plink is easy to use, it only provides combined CNV-state probe-based statistics, not state-specific CNVRs. Existing CNV association methods do not provide any quality tracking information to filter confident associations, a key issue which is fully addressed by ParseCNV. In addition, uncertainty in the CNV calls underlying CNV associations is evaluated to verify significant results, including CNV overlap profiles, genomic context, the number of probes supporting the CNV and single-probe intensities. When the optimal quality control parameters are followed using ParseCNV, 90% of CNVs validate by polymerase chain reaction, a stage that is often problematic when significant associations are not adequately reviewed. ParseCNV is freely available at http://parsecnv.sourceforge.net. PMID:23293001
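The probe-then-region workflow described above can be outlined with a minimal sketch: a per-probe Fisher exact test on carrier counts, followed by collapsing runs of adjacent significant probes into regions. This is a conceptual outline, not ParseCNV's code; the threshold and inputs are hypothetical.

```python
# Conceptual sketch of probe-based CNV association (not ParseCNV itself):
# test each probe for case/control differences in CNV occurrence, then
# collapse runs of adjacent significant probes into CNV regions (CNVRs).
from scipy.stats import fisher_exact

def probe_pvalues(case_hits, case_n, ctrl_hits, ctrl_n):
    """Fisher's exact test per probe on CNV-carrier counts."""
    return [fisher_exact([[a, case_n - a], [b, ctrl_n - b]])[1]
            for a, b in zip(case_hits, ctrl_hits)]

def collapse_to_cnvrs(pvals, alpha=1e-3):
    """Merge adjacent probes with p < alpha into (start, end) index ranges."""
    regions, start = [], None
    for i, p in enumerate(pvals + [1.0]):          # sentinel closes last run
        if p < alpha and start is None:
            start = i
        elif p >= alpha and start is not None:
            regions.append((start, i - 1))
            start = None
    return regions
```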
SMARTScience Tools: Interacting With Blazar Data In The Web Browser
NASA Astrophysics Data System (ADS)
Hasan, Imran; Isler, Jedidah; Urry, C. Megan; MacPherson, Emily; Buxton, Michelle; Bailyn, Charles D.; Coppi, Paolo S.
2014-08-01
The Yale-SMARTS blazar group has accumulated 6 years of optical-IR photometry of more than 70 blazars, mostly bright enough in gamma-rays to be detected with Fermi. Observations were done with the ANDICAM instrument on the SMARTS 1.3 m telescope at the Cerro Tololo Inter-American Observatory. As a result of this long-term, multiwavelength monitoring, we have produced a calibrated, publicly available data set (see www.astro.yale.edu/smarts/glast/home.php), which we have used to find that (i) optical-IR and gamma-ray light curves are well correlated, supporting inverse-Compton models for gamma-ray production (Bonning et al. 2009, 2012), (ii) at their brightest, blazar jets can contribute significantly to the photoionization of the broad-emission-line region, indicating that gamma-rays are produced within 0.1 pc of the black hole in at least some cases (Isler et al. 2014), and (iii) optical-IR and gamma-ray flares are symmetric, implying the time scales are dominated by light-travel-time effects rather than acceleration or cooling (Chatterjee et al. 2012). The volume of data and the diversity of projects for which it is used call out for an efficient means of visualization. To this end, we have developed a suite of visualization tools called SMARTScience Tools, which allow users to interact dynamically with our dataset. The SMARTScience Tools are publicly available via our webpage and can be used to customize multiwavelength light curves and color-magnitude diagrams quickly and intuitively. Users can choose specific bands to construct plots, and the plots include features such as band-by-band panning, dynamic zooming, and direct mouse interaction with individual data points. Human- and machine-readable tables of the plotted data can be directly printed for the user's convenience and for further independent study. The SMARTScience Tools significantly improve the public's ability to interact with the Yale-SMARTS 6-year database of blazar photometry, and should make multiwavelength studies of blazars even more accessible, efficient, and community driven.
29 CFR 825.126 - Leave because of a qualifying exigency.
Code of Federal Regulations, 2011 CFR
2011-07-01
... spouse, son, daughter, or parent (the “covered military member”) is on active duty or call to active duty... notified of an impending call or order to active duty in support of a contingency operation seven or less... call or order to active duty in support of a contingency operation; (2) Military events and related...
Kinyua, Florence; Kiptoo, Michael; Kikuvi, Gideon; Mutai, Joseph; Meyers, Adrienne F A; Muiruri, Peter; Songok, Elijah
2013-10-21
Clinical trials were conducted to assess the feasibility of using a cell phone text messaging-based system to follow up Human Immunodeficiency Virus (HIV) infected patients on antiretrovirals (ARTs) and to assess for improved adherence to their medication. However, there is a need to evaluate the perceptions of HIV infected patients towards the use of these cell phones in an effort to better aid in the clinical management of their HIV infection. The objective of this study was therefore to determine the perceptions of HIV infected patients on the use of cell phone text messaging as a tool to support adherence to their ART medication. A cross-sectional survey was conducted among patients receiving Highly Active Anti-Retroviral Therapy (HAART) at the Kenyatta National Hospital Comprehensive Care Clinic in Nairobi between May and July 2011. Pre-tested questionnaires were used to collect the socio-demographic and perceptions data. Recruitment of participants was done using random probability sampling, and statistical analysis of the data was performed using the Statistical Package for Social Sciences (SPSS) version 16.0. A total of 500 HIV infected patients (Male: 107, Female: 307) aged 19-72 years were interviewed. The majority of individuals (99%) had access to cell phones, and 99% of the HIV infected patients interviewed supported the idea of cell phone use in the management of their HIV infection. A large proportion (46%) claimed that they needed cell phone access for medical advice and guidance on factors that hinder their adherence to medication, and only 3% of them needed it as a reminder to take their drugs. The majority (72%) preferred calling the healthcare provider with their own phones for convenience and confidentiality, with only 0.4% preferring to be called or texted by the healthcare provider. Most (94%), especially the older patients, had no problem with their confidentiality being infringed in the process of the conversation, as per the bivariate analysis results. Cell phone communications are acceptable and, in fact, calls are preferred over cell phone reminders.
Comprehensive Environmental Assessment: A Meta-Assessment Approach
2012-01-01
With growing calls for changes in the field of risk assessment, improved systematic approaches for addressing environmental issues with greater transparency and stakeholder engagement are needed to ensure sustainable trade-offs. Here we describe the comprehensive environmental assessment (CEA) approach as a holistic way to manage complex information and to structure input from diverse stakeholder perspectives to support environmental decision-making for the near- and long-term. We further note how CEA builds upon and incorporates other available tools and approaches, describe its current application at the U.S. Environmental Protection Agency, and point out how it could be extended in evaluating a major issue such as the sustainability of biofuels. PMID:22889372
Characteristics and contents of dreams.
Schredl, Michael
2010-01-01
Dreams have been studied from different perspectives: psychoanalysis, academic psychology, and neurosciences. After presenting the definition of dreaming and the methodological tools of dream research, the major findings regarding the phenomenology of dreaming and the factors influencing dream content are briefly reviewed. The so-called continuity hypothesis stating that dreams reflect waking-life experiences is supported by studies investigating the dreams of psychiatric patients and patients with sleep disorders, i.e., their daytime symptoms and problems are reflected in their dreams. Dreams also have an effect on subsequent waking life, e.g., on daytime mood and creativity. The question about the functions of dreaming is still unanswered and open to future research. Copyright © 2010 Elsevier Inc. All rights reserved.
Kroshus, Emily; Parsons, John; Hainline, Brian
2017-11-08
Sports officials can play an important role in concussion safety by calling injury timeouts so that athletic trainers can evaluate athletes with possible concussions. Understanding the determinants of whether officials call an injury timeout when they suspect a concussion has important implications for the design of interventions that better support officials in this role. To assess the knowledge of US collegiate football officials about concussion symptoms and to determine the associations between knowledge, perceived injunctive norms, and self-efficacy in calling injury timeouts for suspected concussions in athletes. Cross-sectional study. Electronic survey. Of the 3074 US collegiate football officials contacted, 1324 (43% response rate) participated. Concussion knowledge, injunctive norms (belief about what others would want them to do), and behavioral self-efficacy (confidence in their ability to call injury timeouts for suspected concussions in athletes during challenging game-day conditions). Officials reported calling approximately 1 injury timeout for a suspected concussion every 4 games during the 2015 season. Structural equation modeling indicated that officials with more concussion-symptom knowledge had greater behavioral self-efficacy. Independent of an official's symptom knowledge, injunctive norms that were more supportive of calling an injury timeout were associated with greater self-efficacy. Concussion education for officials is important because when officials are aware of concussion symptoms, they are more confident in calling injury timeouts. Beyond increasing symptom knowledge, fostering sports environments that encourage concussion safety in all stakeholder groups can support officials in calling injury timeouts. Athletic trainers can help create sports environments that support proactive concussion identification by educating stakeholders, including officials, about the importance of concussion safety. When officials believe that other stakeholders support concussion safety, they are more likely to call injury timeouts if they suspect a concussion has occurred.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-25
... UNITED STATES INSTITUTE OF PEACE Call for Proposals for a Micro Support Program on International Conflict Resolution and Peacebuilding For Immediate Release AGENCY: United States Institute of Peace. ACTION: Notice. SUMMARY: Micro Support Program on International Conflict Resolution and Peacebuilding...
New Tool for Benefit-Cost Analysis in Evaluating Transportation Alternatives
DOT National Transportation Integrated Search
1997-01-01
The Intermodal Surface Transportation Efficiency Act (ISTEA) emphasizes assessment of multi-modal alternatives and demand management strategies. In 1995, the Federal Highway Administration (FHWA) developed a corridor sketch planning tool called the S...
Raskob, Wolfgang; Schneider, Thierry; Gering, Florian; Charron, Sylvie; Zhelezniak, Mark; Andronopoulos, Spyros; Heriard-Dubreuil, Gilles; Camps, Johan
2015-04-01
The PREPARE project that started in February 2013 and will end at the beginning of 2016 aims to close gaps that have been identified in nuclear and radiological preparedness in Europe following the first evaluation of the Fukushima disaster. Among others, the project will address the review of existing operational procedures for dealing with long-lasting releases and cross-border problems in radiation monitoring and food safety and further develop missing functionalities in decision support systems (DSS) ranging from improved source-term estimation and dispersion modelling to the inclusion of hydrological pathways for European water bodies. In addition, a so-called Analytical Platform will be developed exploring the scientific and operational means to improve information collection, information exchange and the evaluation of such types of disasters. The tools developed within the project will be partly integrated into the two DSS ARGOS and RODOS. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
The BOEING 777 - concurrent engineering and digital pre-assembly
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abarbanel, B.
The processes created on the 777 for checking designs were called "digital pre-assembly". Using FlyThru(tm), a spin-off of a Boeing advanced computing research project, engineers were able to view up to 1500 models (15000 solids) in 3D, traversing that data at high speed. FlyThru(tm) was rapidly deployed in 1991 to meet the needs of the 777 for large-scale product visualization and verification. The digital pre-assembly process has had fantastic results: the 777 has had far fewer assembly and systems problems compared to previous airplane programs. Today, FlyThru(tm) is installed on hundreds of workstations on almost every airplane program, and is being used on Space Station, F22, AWACS, and other defense projects. Its applications have gone far beyond design review. In many ways, FlyThru is a data warehouse supported by advanced tools for analysis. It is today being integrated with knowledge-based engineering geometry generation tools.
Teamwork tools and activities within the hazard component of the Global Earthquake Model
NASA Astrophysics Data System (ADS)
Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.
2013-05-01
The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called OpenQuake-engine (http://globalquakemodel.org). In this communication we'll provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently ongoing initiatives, like the development of a suite of tools for building PSHA input models. Discussion, comments and criticism from colleagues in the audience will be highly appreciated.
BEASTling: A software tool for linguistic phylogenetics using BEAST 2
Maurits, Luke; Forkel, Robert; Kaiping, Gereon A.; Atkinson, Quentin D.
2017-01-01
We present a new open source software tool called BEASTling, designed to simplify the preparation of Bayesian phylogenetic analyses of linguistic data using the BEAST 2 platform. BEASTling transforms comparatively short and human-readable configuration files into the XML files used by BEAST to specify analyses. By taking advantage of Creative Commons-licensed data from the Glottolog language catalog, BEASTling allows the user to conveniently filter datasets using names for recognised language families, to impose monophyly constraints so that inferred language trees are backward compatible with Glottolog classifications, or to assign geographic location data to languages for phylogeographic analyses. Support for the emerging cross-linguistic linked data format (CLDF) permits easy incorporation of data published in cross-linguistic linked databases into analyses. BEASTling is intended to make the power of Bayesian analysis more accessible to historical linguists without strong programming backgrounds, in the hopes of encouraging communication and collaboration between those developing computational models of language evolution (who are typically not linguists) and relevant domain experts. PMID:28796784
NASA Technical Reports Server (NTRS)
Dubos, Gregory F.; Cornford, Steven
2012-01-01
While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can, however, become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedule as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
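The core idea, value delivered over time under schedule and failure uncertainty, can be captured in a few lines of Monte Carlo simulation. The sketch below is a minimal illustration with made-up distributions and parameters, not the F6 tool itself.

```python
# Minimal Monte Carlo sketch of time-explicit design valuation (illustrative;
# not the F6 tool): value delivered depends on when the spacecraft launches
# and how long it survives, both of which are uncertain.
import random

def simulate_value(dev_time_mean=24.0, delay_sd=6.0, fail_rate=1/60.0,
                   value_per_month=1.0, horizon=120.0, runs=10000):
    total = 0.0
    for _ in range(runs):
        launch = max(0.0, random.gauss(dev_time_mean, delay_sd))  # schedule risk
        life = random.expovariate(fail_rate)                      # on-orbit failure
        # value accrues from launch until failure or end of the study horizon
        total += value_per_month * max(0.0, min(horizon, launch + life) - launch)
    return total / runs

print(f"expected delivered value: {simulate_value():.1f} value-months")
```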
Generic Airspace Concepts and Research
NASA Technical Reports Server (NTRS)
Mogford, Richard H.
2010-01-01
The purpose of this study was to evaluate methods for reducing the training and memorization required to manage air traffic in mid-term, Next Generation Air Transportation System (NextGen) airspace. We contrasted the performance of controllers using a sector information display and NextGen automation tools while working with familiar and unfamiliar sectors. The airspace included five sectors from Oakland and Salt Lake City Centers configured as a "generic center" called "West High Center." The Controller Information Tool was used to present essential information for managing these sectors. The Multi Aircraft Control System air traffic control simulator provided data link and conflict detection and resolution. There were five experienced air traffic controller participants. Each was familiar with one or two of the five sectors, but not the others. The participants rotated through all five sectors during the ten data collection runs. The results addressing workload, traffic management, and safety, as well as controller and observer comments, supported the generic sector concept. The unfamiliar sectors were comparable to the familiar sectors on all relevant measures.
Page, Roderic D M
2011-05-23
The Biodiversity Heritage Library (BHL) is a large digital archive of legacy biological literature, comprising over 31 million pages scanned from books, monographs, and journals. During the digitisation process basic metadata about the scanned items is recorded, but not article-level metadata. Given that the article is the standard unit of citation, this makes it difficult to locate cited literature in BHL. Adding the ability to easily find articles in BHL would greatly enhance the value of the archive. A service was developed to locate articles in BHL based on matching article metadata to BHL metadata using approximate string matching, regular expressions, and string alignment. This article locating service is exposed as a standard OpenURL resolver on the BioStor web site http://biostor.org/openurl/. This resolver can be used on the web, or called by bibliographic tools that support OpenURL. BioStor provides tools for extracting, annotating, and visualising articles from the Biodiversity Heritage Library. BioStor is available from http://biostor.org/.
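As an illustration of how a client might call the resolver, the sketch below builds an OpenURL query against biostor.org/openurl/. The parameter names follow common OpenURL 0.1 conventions; the exact set BioStor accepts, and the sample citation, are assumptions.

```python
# Sketch of building a query for the BioStor OpenURL resolver (parameter
# names follow common OpenURL conventions; the exact parameters BioStor
# accepts, and this citation, are assumptions for illustration).
import urllib.parse

params = {
    "genre": "article",
    "title": "Proceedings of the Zoological Society of London",  # journal title
    "volume": "1907",
    "spage": "262",   # starting page of the cited article
    "date": "1907",
}
url = "http://biostor.org/openurl/?" + urllib.parse.urlencode(params)
print(url)  # paste into a browser, or fetch with urllib.request / requests
```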
NASA Astrophysics Data System (ADS)
Javahery, Homa; Deichman, Alexander; Seffah, Ahmed; Taleb, Mohamed
Patterns are a design tool to capture best practices, tackling problems that occur in different contexts. A user interface (UI) design pattern spans several levels of design abstraction ranging from high-level navigation to low-level idioms detailing a screen layout. One challenge is to combine a set of patterns to create a conceptual design that reflects user experiences. In this chapter, we detail a user-centered design (UCD) framework that exploits the novel idea of using personas and patterns together. Personas are used initially to collect and model user experiences. UI patterns are selected based on persona specifications; these patterns are then used as building blocks for constructing conceptual designs. Through the use of a case study, we illustrate how personas and patterns can act as complementary techniques in narrowing the gap between two major steps in UCD: capturing users and their experiences, and building an early design based on that information. As a result of lessons learned from the study and by refining our framework, we define a more systematic process called UX-P (User Experiences to Pattern), with a supporting tool. The process introduces intermediate analytical steps and supports designers in creating usable designs.
Yu, Yao; Hu, Hao; Bohlender, Ryan J; Hu, Fulan; Chen, Jiun-Sheng; Holt, Carson; Fowler, Jerry; Guthery, Stephen L; Scheet, Paul; Hildebrandt, Michelle A T; Yandell, Mark; Huff, Chad D
2018-04-06
High-throughput sequencing data are increasingly being made available to the research community for secondary analyses, providing new opportunities for large-scale association studies. However, heterogeneity in target capture and sequencing technologies often introduce strong technological stratification biases that overwhelm subtle signals of association in studies of complex traits. Here, we introduce the Cross-Platform Association Toolkit, XPAT, which provides a suite of tools designed to support and conduct large-scale association studies with heterogeneous sequencing datasets. XPAT includes tools to support cross-platform aware variant calling, quality control filtering, gene-based association testing and rare variant effect size estimation. To evaluate the performance of XPAT, we conducted case-control association studies for three diseases, including 783 breast cancer cases, 272 ovarian cancer cases, 205 Crohn disease cases and 3507 shared controls (including 1722 females) using sequencing data from multiple sources. XPAT greatly reduced Type I error inflation in the case-control analyses, while replicating many previously identified disease-gene associations. We also show that association tests conducted with XPAT using cross-platform data have comparable performance to tests using matched platform data. XPAT enables new association studies that combine existing sequencing datasets to identify genetic loci associated with common diseases and other complex traits.
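One standard guard against the technological stratification described above is to carry the platform label into the association model as a covariate. The sketch below does this with a logistic regression on synthetic data; it is a generic illustration of that idea, plainly not XPAT's actual method.

```python
# A standard guard against platform stratification (not XPAT's method):
# include sequencing platform as a covariate when testing a gene burden
# score for case/control association.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1000
platform = rng.integers(0, 2, n)             # 0/1: two capture/sequencing platforms
burden = rng.poisson(1.0 + 0.5 * platform)   # platform inflates apparent burden
status = rng.binomial(1, 0.3, n)             # case/control labels

X = sm.add_constant(np.column_stack([burden, platform]))
fit = sm.Logit(status, X).fit(disp=0)
print(fit.pvalues[1])  # burden effect, adjusted for platform
```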
Decades of CALL Development: A Retrospective of the Work of James Pusack and Sue Otto
ERIC Educational Resources Information Center
Otto, Sue E. K.
2010-01-01
In this article, the author describes a series of projects that James Pusack and the author engaged in together, a number of them to develop CALL authoring tools. With their shared love of technology and dedication to language teaching and learning, they embarked on a long and immensely enjoyable career in CALL during which each project evolved…
Using Firefly Tools to Enhance Archive Web Pages
NASA Astrophysics Data System (ADS)
Roby, W.; Wu, X.; Ly, L.; Goldina, T.
2013-10-01
Astronomy web developers are looking for fast and powerful HTML 5/AJAX tools to enhance their web archives. We are exploring ways to make this easier for the developer. How could you have a full FITS visualizer or a Web 2.0 table that supports paging, sorting, and filtering in your web page in 10 minutes? Can it be done without even installing any software or maintaining a server? Firefly is a powerful, configurable system for building web-based user interfaces to access astronomy science archives. It has been in production for the past three years. Recently, we have made some of the advanced components available through very simple JavaScript calls. This allows a web developer, without any significant knowledge of Firefly, to have FITS visualizers, advanced table display, and spectrum plots on their web pages with minimal learning curve. Because we use cross-site JSONP, installing a server is not necessary. Web sites that use these tools can be created in minutes. Firefly was created in IRSA, the NASA/IPAC Infrared Science Archive (http://irsa.ipac.caltech.edu). We are using Firefly to serve many projects including Spitzer, Planck, WISE, PTF, LSST and others.
Situated Agents and Humans in Social Interaction for Elderly Healthcare: From Coaalas to AVICENA.
Gómez-Sebastià, Ignasi; Moreno, Jonathan; Álvarez-Napagao, Sergio; Garcia-Gasulla, Dario; Barrué, Cristian; Cortés, Ulises
2016-02-01
Assistive Technologies (AT) are an application area where several Artificial Intelligence techniques and tools have been successfully applied to support elderly or impaired people in their daily activities. However, approaches to AT tend to center on the user-tool interaction, neglecting the user's connection with their social environment (such as caretakers, relatives and health professionals) and the possibility of monitoring undesired behaviour while providing both adaptation to a dynamic environment and early response to potentially dangerous situations. In previous work we presented COAALAS, an intelligent social and norm-aware device for elderly people that is able to autonomously organize, reorganize and interact with the different actors involved in elderly care, either human actors or other devices. In this paper we put our work into context by first examining the desirable properties of such a system, analysing the state of the art on the relevant topics, and verifying the validity of our proposal in a larger context that we call AVICENA. AVICENA's aim is to develop a semi-autonomous (collaborative) tool to promote monitored, intensive, extended and personalized therapeutic regime adherence at home, based on adaptation techniques.
NASA Technical Reports Server (NTRS)
Bailin, Sydney; Paterra, Frank; Henderson, Scott; Truszkowski, Walt
1993-01-01
This paper presents a discussion of current work in the area of graphical modeling and model-based reasoning being undertaken by the Automation Technology Section, Code 522.3, at Goddard. The work was initially motivated by the growing realization that the knowledge acquisition process was a major bottleneck in the generation of fault detection, isolation, and repair (FDIR) systems for application in automated Mission Operations. As with most research activities, this work started out with a simple objective: to develop a proof-of-concept system demonstrating that a draft rule base for an FDIR system could be automatically realized by reasoning from a graphical representation of the system to be monitored. This work was called Knowledge From Pictures (KFP) (Truszkowski et al. 1992). As the work has progressed, the KFP tool has become an environment populated by a set of tools that support a more comprehensive approach to model-based reasoning. The paper gives an overview of the graphical modeling objectives of the work, describes the three tools that now populate the KFP environment, briefly discusses related work in the field, and indicates future directions for the KFP environment.
Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools
NASA Astrophysics Data System (ADS)
Boe, Bryce A.
There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
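Static analysis of block-based programs like the one described above often starts by walking the project's parsed representation and tallying block types. The sketch below illustrates that step over a hypothetical JSON-style layout; it is not Hairball's API.

```python
# Illustrative static analysis of a parsed Scratch project (the nested-list
# layout here is hypothetical and this is not Hairball's API): count block
# types, a building block for rubric-style automated assessment.
from collections import Counter

def count_blocks(scripts) -> Counter:
    """Walk nested [opcode, *args] lists and tally opcodes."""
    counts = Counter()
    def walk(node):
        if isinstance(node, list):
            if node and isinstance(node[0], str):
                counts[node[0]] += 1
            for child in node:
                walk(child)
    for script in scripts:
        walk(script)
    return counts

project = [["whenGreenFlag"], ["doRepeat", 10, [["forward:", 10]]]]
print(count_blocks(project))
# Counter({'whenGreenFlag': 1, 'doRepeat': 1, 'forward:': 1})
```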
Accelerating Harmonization in Digital Health.
Moore, Carolyn; Werner, Laurie; BenDor, Amanda Puckett; Bailey, Mike; Khan, Nighat
2017-01-01
Digital tools play an important role in supporting front-line health workers who deliver primary care. This paper explores the current state of efforts to move away from single-purpose applications of digital health towards integrated systems and solutions that align with national strategies. Through examples from health information systems, data and health worker training, it demonstrates how governments and stakeholders are working to integrate digital health services. We emphasize three factors as crucial for this integration: the development and implementation of national digital health strategies; technical interoperability; and collaborative approaches to ensure that digital health has an impact at the primary care level. Consolidation of technologies will enable an integrated, scalable approach to the use of digital health to support health workers. The paper describes a paradigm shift towards integrated and interoperable systems that respond to health workers' needs in training, data and health information; calls for the consolidation and integration of digital health tools and approaches across health areas, functions and levels of the health system; and considers the critical factors that must be in place to support this shift. It aims not only to describe steps taken to move from fractured pilots to effective systems, but also to propose a new perspective focused on consolidation and collaboration guided by national digital health strategies.
Reed, Richard L; Battersby, Malcolm; Osborne, Richard H; Bond, Malcolm J; Howard, Sara L; Roeger, Leigh
2011-11-01
The prevalence of older Australians with multiple chronic diseases is increasing and now accounts for a large proportion of total health care utilisation. Chronic disease self-management support (CDSMS) has become a core service component of many community based health programs because it is considered a useful tool in improving population health outcomes and reducing the financial burden of chronic disease care. However, the evidence base to justify these support programs is limited, particularly for older people with multiple chronic diseases. We describe an ongoing trial examining the effectiveness of a particular CDSMS approach called the Flinders Program. The Flinders Program is a clinician-led generic self-management intervention that provides a set of tools and a structured process that enables health workers and patients to collaboratively assess self-management behaviours, identify problems, set goals, and develop individual care plans covering key self-care, medical, psychosocial and carer issues. A sample of 252 older Australians that have two or more chronic conditions will be randomly assigned to receive either CDSMS or an attention control intervention (health information only) for 6 months. Outcomes will be assessed using self-reported health measures taken at baseline and post-intervention. This project will be the first comprehensive evaluation of CDSMS in this population. Findings are expected to guide consumers, clinicians and policymakers in the use of CDSMS, as well as facilitate prioritisation of public monies towards evidence-based services. Copyright © 2011 Elsevier Inc. All rights reserved.
Challenging Google, Microsoft Unveils a Search Tool for Scholarly Articles
ERIC Educational Resources Information Center
Carlson, Scott
2006-01-01
Microsoft has introduced a new search tool to help people find scholarly articles online. The service, which includes journal articles from prominent academic societies and publishers, puts Microsoft in direct competition with Google Scholar. The new free search tool, which should work on most Web browsers, is called Windows Live Academic Search…
Evaluation of Knowla: An Online Assessment and Learning Tool
ERIC Educational Resources Information Center
Thompson, Meredith Myra; Braude, Eric John
2016-01-01
The assessment of learning in large online courses requires tools that are valid, reliable, easy to administer, and can be automatically scored. We have evaluated an online assessment and learning tool called Knowledge Assembly, or Knowla. Knowla measures a student's knowledge in a particular subject by having the student assemble a set of…
ERIC Educational Resources Information Center
Weldeana, Hailu Nigus; Sbhatu, Desta Berhe
2017-01-01
Background: This article reports contributions of an assessment tool called Portfolio of Evidence (PE) in learning college geometry. Material and methods: Two classes of second-year students from one Ethiopian teacher education college, assigned into Treatment and Comparison classes, participated. The assessment tools used in the Treatment…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, G.
1984-09-01
Two classifications of fishing jobs are discussed: open hole and cased hole. When there is no casing in the area of the fish, it is called open hole fishing; when the fish is inside the casing, it is called cased hole fishing. The article lists various things that can become a fish, including stuck drill pipe, broken drill pipe, drill collars, bits, bit cones, hand tools dropped in the well, sanded-up or mud-stuck tubing, stuck packers, and much more. It is suggested that on a fishing job all parties involved should cooperate with each other, and that fishing tool people obtain all the information concerning the well; that way they can select the right tools and methods to clean out the well as quickly as possible.
Blickem, Christian; Kennedy, Anne; Jariwala, Praksha; Morris, Rebecca; Bowen, Robert; Vassilev, Ivaylo; Brooks, Helen; Blakeman, Tom; Rogers, Anne
2014-06-17
Recent initiatives to target the personal, social and clinical needs of people with long-term health conditions have had limited impact within primary care. Evidence of the importance of social networks in supporting people with long-term conditions points to the need for self-management approaches which align personal circumstances with valued activities. The Patient-Led Assessment for Network Support (PLANS) intervention is a needs-led assessment through which patients prioritise their health and social needs and gain access to local community services and activities. Exploring the work and practices of patients and telephone workers is important for understanding and evaluating the workability and implementation of new interventions. Qualitative methods (interviews, focus group, observations) were used to explore the experience of PLANS from the perspectives of participants and the telephone support workers who delivered it (as part of an RCT), and the reasons why the intervention worked or not. Normalisation Process Theory (NPT) was used as a sensitising tool to evaluate: the relevance of PLANS to patients (coherence); the processes of engagement (cognitive participation); the work done for PLANS to happen (collective action); and the perceived benefits and costs of PLANS (reflexive monitoring). Twenty patients in the intervention arm of a clinical trial were interviewed, their telephone support calls were recorded, and a focus group with 3 telephone support workers was conducted. Analysis of the interviews, support calls and focus group identified three themes in relation to the delivery and experience of PLANS: formulation of 'health' in the context of everyday life; trajectories and tipping points: disrupting everyday routines; and precarious trust in networks. The relevance of these themes is considered using NPT constructs in terms of the work that is entailed in engaging with PLANS, taking action, and who is implicated in this process. PLANS gives scope to align long-term condition management with everyday life priorities and valued aspects of life. This approach can improve engagement with health-relevant practices by situating them within everyday contexts. It has the potential to increase utilisation of local resources, with potential cost-saving benefits for the NHS. ISRCTN45433299.
Myria: Scalable Analytics as a Service
NASA Astrophysics Data System (ADS)
Howe, B.; Halperin, D.; Whitaker, A.
2014-12-01
At the UW eScience Institute, we're working to empower non-experts, especially in the sciences, to write and use data-parallel algorithms. To this end, we are building Myria, a web-based platform for scalable analytics and data-parallel programming. Myria's internal model of computation is the relational algebra extended with iteration, such that every program is inherently data-parallel, just as every query in a database is inherently data-parallel. But unlike databases, iteration is a first-class concept, allowing us to express machine learning tasks, graph traversal tasks, and more. Programs can be expressed in a number of languages and can be executed on a number of execution environments, but we emphasize a particular language called MyriaL that supports both imperative and declarative styles and a particular execution engine called MyriaX that uses an in-memory column-oriented representation and asynchronous iteration. We deliver Myria over the web as a service, providing an editor, performance analysis tools, and catalog browsing features in a single environment. We find that this web-based "delivery vector" is critical in reaching non-experts: they are insulated from the irrelevant technical work associated with installation, configuration, and resource management. The MyriaX backend, one of several execution runtimes we support, is a main-memory, column-oriented, RDBMS-on-the-worker system that supports cyclic data flows as a first-class citizen and has been shown to outperform competitive systems on 100-machine cluster sizes. I will describe the Myria system, give a demo, and present some new results in large-scale oceanographic microbiology.
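Graph reachability is the classic example of "relational algebra plus iteration" as described above. The sketch below shows semi-naive transitive closure in plain Python as a tiny illustration of the computation style; it is not MyriaL syntax.

```python
# Tiny illustration of relational iteration (graph reachability) of the kind
# Myria treats as a first-class construct. Plain Python, not MyriaL.

def transitive_closure(edges: set) -> set:
    """Semi-naive iteration: join only the newly derived tuples each round."""
    reach, delta = set(edges), set(edges)
    while delta:
        delta = {(a, c) for (a, b) in delta
                        for (b2, c) in edges if b == b2} - reach
        reach |= delta
    return reach

print(sorted(transitive_closure({(1, 2), (2, 3), (3, 4)})))
# [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```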
Verbist, Bie M P; Thys, Kim; Reumers, Joke; Wetzels, Yves; Van der Borght, Koen; Talloen, Willem; Aerssens, Jeroen; Clement, Lieven; Thas, Olivier
2015-01-01
In virology, massively parallel sequencing (MPS) opens many opportunities for studying viral quasi-species, e.g. in HIV-1- and HCV-infected patients. This is essential for understanding pathways to resistance, which can substantially improve treatment. Although MPS platforms allow in-depth characterization of sequence variation, their measurements still involve substantial technical noise. For Illumina sequencing, single base substitutions are the main error source and impede powerful assessment of low-frequency mutations. Fortunately, base calls are complemented with quality scores (Qs) that are useful for differentiating errors from the real low-frequency mutations. A variant calling tool, Q-cpileup, is proposed, which exploits the Qs of nucleotides in a filtering strategy to increase specificity. The tool is embedded in an open-source pipeline, VirVarSeq, which allows variant calling starting from fastq files. Using both plasmid mixtures and clinical samples, we show that Q-cpileup is able to reduce the number of false-positive findings. The filtering strategy is adaptive and provides an optimized threshold for individual samples in each sequencing run. Additionally, linkage information is kept between single-nucleotide polymorphisms as variants are called at the codon level. This enables virologists to have an immediate biological interpretation of the reported variants with respect to their antiviral drug responses. A comparison with existing SNP caller tools reveals that calling variants at the codon level with Q-cpileup results in an outstanding sensitivity while maintaining a good specificity for variants with frequencies down to 0.5%. VirVarSeq is available, together with a user's guide and test data, at sourceforge: http://sourceforge.net/projects/virtools/?source=directory. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
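The role quality scores play here can be shown with a minimal sketch: Phred scores bound the number of errors expected by chance, and an observed alternate-allele count must clearly exceed that bound. This is a conceptual illustration, not Q-cpileup's exact adaptive filter, and the factor-of-three margin is an assumption.

```python
# Conceptual use of base-call quality scores to separate sequencing error
# from real low-frequency variation (not Q-cpileup's exact filter).

def phred_to_perr(q: int) -> float:
    """Phred convention: Q = -10 * log10(P_error)."""
    return 10 ** (-q / 10)

def observed_exceeds_noise(alt_count: int, quals: list, factor: float = 3.0) -> bool:
    """Call a variant only if the observed alternate-allele count clearly
    exceeds the error count expected from the bases' quality scores."""
    expected_errors = sum(phred_to_perr(q) for q in quals)
    return alt_count > factor * expected_errors

# 10 alternate calls among 2000 Q30 bases (~2 errors expected) -> likely real
print(observed_exceeds_noise(10, [30] * 2000))  # True
```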
A Modeling Tool for Household Biogas Burner Flame Port Design
NASA Astrophysics Data System (ADS)
Decker, Thomas J.
Anaerobic digestion is a well-known and potentially beneficial process for rural communities in emerging markets, providing the opportunity to generate usable gaseous fuel from agricultural waste. With recent developments in low-cost digestion technology, communities across the world are gaining affordable access to the benefits of anaerobic digestion derived biogas. For example, biogas can displace conventional cooking fuels such as biomass (wood, charcoal, dung) and Liquefied Petroleum Gas (LPG), effectively reducing harmful emissions and fuel cost respectively. To support the ongoing scaling effort of biogas in rural communities, this study has developed and tested a design tool aimed at optimizing flame port geometry for household biogas-fired burners. The tool consists of a multi-component simulation that incorporates three-dimensional CAD designs with simulated chemical kinetics and computational fluid dynamics. An array of circular and rectangular port designs was developed for a widely available biogas stove (called the Lotus) as part of this study. These port designs were created through guidance from previous studies found in the literature. The three highest performing designs identified by the tool were manufactured and tested experimentally to validate tool output and to compare against the original port geometry. The experimental results aligned with the tool's prediction for the three chosen designs. Each design demonstrated improved thermal efficiency relative to the original, with one configuration of circular ports exhibiting superior performance. The results of the study indicated that designing for a targeted range of port hydraulic diameter, velocity and mixture density in the tool is a relevant way to improve the thermal efficiency of a biogas burner. Conversely, the emissions predictions made by the tool were found to be unreliable and incongruent with laboratory experiments.
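The two port quantities named in the last sentence are textbook relations; the short sketch below computes them with illustrative (not measured) values.

```python
# Hydraulic diameter and port exit velocity, the design targets named above.
# Textbook relations with illustrative values, not the study's tool.
import math

def hydraulic_diameter(area_m2: float, perimeter_m: float) -> float:
    """D_h = 4A / P (reduces to the diameter for a circular port)."""
    return 4.0 * area_m2 / perimeter_m

def port_velocity(flow_m3s: float, n_ports: int, area_m2: float) -> float:
    """Mean exit velocity when total flow is shared by identical ports."""
    return flow_m3s / (n_ports * area_m2)

d = 0.003                                   # 3 mm circular port (illustrative)
a = math.pi * d**2 / 4
print(hydraulic_diameter(a, math.pi * d))   # 0.003: D_h equals the diameter
print(port_velocity(2e-5, 30, a))           # m/s for a 20 mL/s mixture flow
```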
NASA Astrophysics Data System (ADS)
Gallo, E. M.; Hogue, T. S.; Bell, C. D.; Spahr, K.; McCray, J. E.
2017-12-01
Receiving streams and waterbodies in urban watersheds are increasingly polluted by stormwater runoff. The implementation of Green Infrastructure (GI), which includes Low Impact Developments (LIDs) and Best Management Practices (BMPs), within a watershed aims to mitigate the effects of urbanization by reducing pollutant loads, runoff volume, and storm peak flow. Stormwater modeling is generally used to assess the impact of GI implemented within a watershed. These modeling tools are useful for determining the optimal suite of GIs to maximize pollutant load reduction and minimize cost. However, stormwater management for most resource managers and communities also includes the implementation of grey and hybrid stormwater infrastructure. An integrated decision support tool, called i-DST, that allows for the optimization and comprehensive life-cycle cost assessment of grey, green, and hybrid stormwater infrastructure is currently being developed. The i-DST tool will evaluate optimal stormwater runoff management by taking into account the diverse economic, environmental, and societal needs associated with watersheds across the United States. Three watersheds from southern California will act as test sites and assist in the development and initial application of the i-DST tool. The Ballona Creek, Dominguez Channel, and Los Angeles River Watersheds are located in highly urbanized Los Angeles County. The water quality of the river channels flowing through each is impaired by heavy metals, including copper, lead, and zinc. However, despite being adjacent to one another within the same county, modeling results using the EPA System for Urban Stormwater Treatment and Analysis INtegration (SUSTAIN) found that the optimal path to compliance in each watershed differs significantly. The differences include varied costs, suites of BMPs, and ancillary benefits. This research analyzes how the economic, physical, and hydrological differences between the three watersheds shape the optimal plan for stormwater management.
Seeking high reliability in primary care: Leadership, tools, and organization.
Weaver, Robert R
2015-01-01
Leaders in health care increasingly recognize that improving health care quality and safety requires developing an organizational culture that fosters high reliability and continuous process improvement. For various reasons, a reliability-seeking culture is lacking in most health care settings. Developing a reliability-seeking culture requires leaders' sustained commitment to reliability principles using key mechanisms to embed those principles widely in the organization. The aim of this study was to examine how key mechanisms used by a primary care practice (PCP) might foster a reliability-seeking, system-oriented organizational culture. A case study approach was used to investigate the PCP's reliability culture. The study examined four cultural artifacts used to embed reliability-seeking principles across the organization: leadership statements, decision support tools, and two organizational processes. To decipher their effects on reliability, the study relied on observations of work patterns and the tools' use, interactions during morning huddles and process improvement meetings, interviews with clinical and office staff, and a "collective mindfulness" questionnaire. The five reliability principles framed the data analysis. Leadership statements articulated principles that oriented the PCP toward a reliability-seeking culture of care. Reliability principles became embedded in everyday discourse and actions through the use of "problem knowledge coupler" decision support tools and daily "huddles." Practitioners and staff were encouraged to report unexpected events or close calls, which often initiated a formal "process change" used to adjust routines and prevent adverse events from recurring. Activities that foster reliable patient care became part of the taken-for-granted routine at the PCP. The analysis illustrates the role leadership, tools, and organizational processes play in developing and embedding a reliability-seeking culture across an organization. Progress toward a reliability-seeking, system-oriented approach to care remains ongoing, and movement in that direction requires deliberate and sustained effort by committed leaders in health care.
... pelvic exam, or special tests. Treatments include special pelvic muscle exercises called Kegel exercises. A mechanical support device called a pessary helps some women. Surgery and medicines are other treatments. NIH: National Institute of Child Health and Human Development
Liu, Bin; Wang, Shanyi; Dong, Qiwen; Li, Shumin; Liu, Xuan
2016-04-20
DNA-binding proteins play a pivotal role in various intra- and extra-cellular activities ranging from DNA replication to gene expression control. With the rapid development of next-generation sequencing techniques, the number of protein sequences is increasing at an unprecedented rate. It is therefore necessary to develop computational methods that identify DNA-binding proteins based only on protein sequence information. In this study, a novel method called iDNA-KACC is presented, which combines the Support Vector Machine (SVM) and the auto-cross covariance transformation. The protein sequences are first converted into a profile-based protein representation, and then converted into a series of fixed-length vectors by the auto-cross covariance transformation with Kmer composition. The sequence order effect can be effectively captured by this scheme. These vectors are then fed into the SVM to discriminate the DNA-binding proteins from the non-DNA-binding ones. iDNA-KACC achieves an overall accuracy of 75.16% and a Matthews correlation coefficient of 0.5 by a rigorous jackknife test. Its performance is further improved by employing an ensemble learning approach, and the improved predictor is called iDNA-KACC-EL. Experimental results on an independent dataset show that iDNA-KACC-EL outperforms all the other state-of-the-art predictors, indicating that it would be a useful computational tool for DNA-binding protein identification.
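As a rough illustration of the kind of pipeline the abstract describes, and not the authors' iDNA-KACC implementation, the sketch below computes auto-cross covariance features from a profile matrix and feeds them to scikit-learn's SVC; the profiles, labels, and lag are made up.

```python
# Sketch of an auto-cross covariance (ACC) feature transform over a profile
# matrix (rows = sequence positions, columns = 20 residue scores), followed
# by an SVM classifier. Illustrative toy data only; not the iDNA-KACC code.
import numpy as np
from sklearn.svm import SVC

def acc_features(profile, max_lag=2):
    """Auto- and cross-covariance between profile columns at lags 1..max_lag."""
    L, d = profile.shape
    mu = profile.mean(axis=0)
    feats = []
    for lag in range(1, max_lag + 1):
        for j in range(d):
            for k in range(d):
                cov = np.mean((profile[:L - lag, j] - mu[j]) *
                              (profile[lag:, k] - mu[k]))
                feats.append(cov)
    return np.array(feats)

rng = np.random.default_rng(0)
# Forty toy "proteins", each a random 50x20 profile, with fake labels.
X = np.array([acc_features(rng.normal(size=(50, 20))) for _ in range(40)])
y = np.array([0, 1] * 20)            # fake binding / non-binding labels
clf = SVC(kernel="rbf").fit(X, y)
print(clf.score(X, y))               # training accuracy on the toy data
```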
Surface infrastructure functions, requirements and subsystems for a manned Mars mission
NASA Technical Reports Server (NTRS)
Fairchild, Kyle
1986-01-01
Planning and development for a permanently manned scientific outpost on Mars requires an in-depth understanding and analysis of the functions the outpost is expected to perform. The optimum configuration that accomplishes these functions then arises during the trade studies process. In a project this complex, it becomes necessary to use a formal methodology to document the design and planning process. The method chosen for this study is called top-down functional decomposition. This method is used to determine the functions that are needed to accomplish the overall mission, then determine what requirements and systems are needed to carry out each of those functions. This method facilitates automation of the trades and options process. In the example, this was done with an off-the-shelf software package called TK!Solver. The basic functions that a permanently manned outpost on Mars must accomplish are: (1) Establish the Life Critical Systems; (2) Support Planetary Sciences and Exploration; and (3) Develop and Maintain Long-term Support Functions, including those systems needed towards self-sufficiency. The top-down functional decomposition methodology, combined with standard spreadsheet software, offers a powerful tool to quickly assess various design trades and analyze options. As the specific subsystems and the relational rule algorithms are further refined, it will be possible to very accurately determine the implications of continually evolving mission requirements.
Field Ground Truthing Data Collector - a Mobile Toolkit for Image Analysis and Processing
NASA Astrophysics Data System (ADS)
Meng, X.
2012-07-01
Field Ground Truthing Data Collector is one of the four key components of the NASA-funded ICCaRS project, being developed in Southeast Michigan. The ICCaRS ground truthing toolkit offers comprehensive functions: 1) Field functions, including determining locations through GPS, gathering and geo-referencing visual data, laying out ground control points for AEROKAT flights, measuring the flight distance and height, and entering observations of land cover (and use) and health conditions of ecosystems and environments in the vicinity of the flight field; 2) Server synchronization functions, such as downloading study-area maps, aerial photos and satellite images, uploading and synchronizing field-collected data with the distributed databases, calling the geospatial web services on the server side to conduct spatial querying, image analysis and processing, and receiving the processed results in the field for near-real-time validation; and 3) Social network communication functions for direct technical assistance and pedagogical support, e.g., having video-conference calls in the field with the supporting educators, scientists, and technologists, participating in Webinars, or engaging in discussions with other learning portals. This customized software package is being built on Apple iPhone/iPad and Google Maps/Earth. The technical infrastructures, data models, coupling methods between distributed geospatial data processing and field data collector tools, remote communication interfaces, coding schema, and functional flow charts will be illustrated and explained at the presentation. A pilot case study will also be demonstrated.
NASA Astrophysics Data System (ADS)
Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.
2010-12-01
A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing the available datasets hosted on these servers are compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and are searchable through a set of predefined web-service-based queries. Together, these servers and the central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series, and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit, which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third-party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third-party collaborators include functions for data search and discovery, extensive graphing, data editing and export, HydroServer exploration, integration with the OpenMI workflow and modeling system, and an interface for data analysis through the R statistical package.
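For readers unfamiliar with WaterML, the minimal sketch below shows how observation values might be read out of a saved WaterML 1.x response using only the Python standard library; the file name is a placeholder for a saved service response, and the namespace shown is WaterML 1.1's.

```python
# Minimal sketch of reading observation values out of a WaterML 1.x document,
# such as a HydroServer's Water Data Service might return. The file name is
# a placeholder; WaterML 1.x encodes observations as <value dateTime="...">
# elements in the namespace below.
import xml.etree.ElementTree as ET

WML = "{http://www.cuahsi.org/waterML/1.1/}"

tree = ET.parse("timeseries_response.xml")  # assumed saved WaterML response
for v in tree.getroot().iter(WML + "value"):
    print(v.get("dateTime"), v.text)        # observation timestamp and value
```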
Piotrowicz, Ryszard; Grabowski, Marcin; Balsam, Paweł; Kołtowski, Łukasz; Kozierkiewicz, Adam; Zajdel, Justyna; Piotrowicz, Ewa; Kowalski, Oskar; Mitkowski, Przemysław; Kaźmierczak, Jarosław; Kalarus, Zbigniew; Opolski, Grzegorz
2015-01-01
For several decades we have observed the development of data transmission technology on an unprecedented scale. Alongside this technology, concepts have appeared for using these solutions in health care systems. Over the last decade telemedicine has been joined by the concept of mHealth, which relies on mobile devices mainly to monitor selected biomedical parameters. On 10 October 2014, during the conference Baltic Electrocardiology Autumn - Telemedicine and Arrhythmia (BEATA), a debate was held with the participation of physicians, politicians, businessmen, and representatives of the Government (Ministry of Health, National Health Fund, Social Insurance Institution) concerning the use of telecardiology services in daily practice. The issues discussed during the meeting included: telemedicine solutions available throughout the world, analysis of their effectiveness based on clinical trials, funding opportunities, their legal status, and the development perspectives of telecardiology in Poland. The result of the meeting was a document called the "Baltic Declaration". The declaration is a call for proven and profitable technologies to be introduced into clinical practice. The declaration also indicates that the various available technological solutions are merely tools, and the utility of such tools stems not only from their modernity, but primarily from matching their functionality to the features of the health interventions that are to be improved.
Mobile Phones: The Next Step towards Healthcare Delivery in Rural India?
DeSouza, Sherwin I.; Rashmi, M. R.; Vasanthi, Agalya P.; Joseph, Suchitha Maria; Rodrigues, Rashmi
2014-01-01
Background Given the ubiquity of mobile phones, their use to support healthcare in the Indian context is inevitable. It is, however, necessary to assess end-user perceptions regarding mobile health interventions, especially in the rural Indian context, prior to their use in healthcare. This would contextualize the use of mobile phone communication for health to the 70% of the country's population that resides in rural India. Objectives To explore the acceptability of delivering healthcare interventions through mobile phones among users in a village in rural Bangalore. Methods This was an exploratory study of 488 mobile phone users, residing in a village near Bangalore city, Karnataka, South India. A pretested, translated, interviewer-administered questionnaire was used to obtain data on mobile phone usage patterns and acceptability of the mobile phone as a tool for health-related communication. The data is described using basic statistical measures. Results The primary use of mobile phones was to make or receive phone calls (100%). Text messaging (SMS) was used by only 70 (14%) of the respondents. Most of the respondents, 484 (99%), were willing to receive health-related information on their mobile phones and did not consider receiving such information an intrusion into their personal life. While receiving reminders for drug adherence was acceptable to most, 479 (98%), of our respondents, 424 (89%) preferred voice calls alone to other forms of communication. Nearly all were willing to use their mobile phones to communicate with health personnel in emergencies, and 367 (75%) were willing to consult a doctor via the phone in an acute illness. Factors such as sex, English literacy, employment status, and presence of chronic disease affected preferences regarding mode and content of communication. Conclusion The mobile phone, as a tool for receiving health information and supporting healthcare through mHealth interventions, was acceptable in the rural Indian context. PMID:25133610
NASA Astrophysics Data System (ADS)
Wright, D. J.; O'Dea, E.; Cushing, J. B.; Cuny, J. E.; Toomey, D. R.; Hackett, K.; Tikekar, R.
2001-12-01
The East Pacific Rise (EPR) from 9-10° N is currently our best-studied section of fast-spreading mid-ocean ridge. During several decades of investigation it has been explored by the full spectrum of ridge investigators, including chemists, biologists, geologists, and geophysicists. These studies, and those that are ongoing, provide a wealth of observational data, results, and data-driven theoretical (often numerical) studies that have not yet been fully utilized either by research scientists or by professional educators. While the situation is improving, a large amount of data, results, and related theoretical models still exists either in an inert, non-interactive form (e.g., journal publications) or as unlinked and currently incompatible computer data or algorithms. Infrastructure is needed not just for ready access to data, but for linkage of disparate data sets (data to data) as well as data to models, in order to quantitatively evaluate hypotheses, refine numerical simulations, and explore new relations between observables. The prototype of a computational environment and toolset, called the Virtual Research Vessel (VRV), is being developed to provide scientists and educators with ready access to data, results, and numerical models. While this effort is focused on the EPR 9N region, the resulting software tools and infrastructure should be helpful in establishing similar systems for other sections of the global mid-ocean ridge. Work in progress includes efforts to develop: (1) a virtual database to incorporate diverse data types with domain-specific metadata into a global schema that allows web query across different marine geology data sets, and an analogous declarative (database-available) description of tools and models; (2) the ability to move data between GIS and the above DBMS, and tools to encourage data submission to archives; (3) tools for finding and viewing archives, and translating between formats; (4) support for "computational steering" (tool composition) and model coupling (e.g., the ability to run a tool composition locally but access input data from the web, APIs to support coupling such as invoking programs that are running remotely, and help in writing data wrappers to publish programs); (5) support of migration paths for prototyped model coupling; and (6) export of marine geological data and data analysis to the undergraduate classroom (VRV-ET, "Educational Tool"). See the main VRV web site at http://oregonstate.edu/dept/vrv and the VRV-ET web site at: http://www.cs.uoregon.edu/research/vrv-et.
Herrmann, Florian E M; Lenski, Markus; Steffen, Julius; Kailuweit, Magdalena; Nikolaus, Marc; Koteeswaran, Rajasekaran; Sailer, Andreas; Hanszke, Anna; Wintergerst, Maximilian; Dittmer, Sissi; Mayr, Doris; Genzel-Boroviczény, Orsolya; Eley, Diann S; Fischer, Martin R
2015-06-02
Pathology is a discipline that provides the basis of the understanding of disease in medicine. The past decades have seen a decline in the emphasis laid on pathology teaching in medical schools, and outdated pathology curricula have worsened the situation. Student opinions and thoughts are central to the questions of whether and how such curricula should be modernized. A survey was conducted among 1018 German medical students regarding their preferences in pathology teaching modalities and their satisfaction with lecture-based courses. A qualitative analysis was performed comparing a recently modernized pathology curriculum with a traditional lecture-based curriculum. The differences in modalities of teaching used were investigated. Student satisfaction with the lecture-based curriculum positively correlated with student grades (Spearman's correlation coefficient 0.24). Additionally, students with lower grades supported changing the curriculum (Spearman's correlation coefficient 0.47). The majority supported virtual microscopy, autopsies, seminars, and podcasts as preferred didactic methods. The data supports the implementation of a pathology curriculum where tutorials, autopsies and supplementary computer-based learning tools play important roles.
ERIC Educational Resources Information Center
Ambrose, Regina Maria; Palpanathan, Shanthini
2017-01-01
Computer-assisted language learning (CALL) has evolved through various stages in both technology as well as the pedagogical use of technology (Warschauer & Healey, 1998). Studies show that the CALL trend has facilitated students in their English language writing with useful tools such as computer based activities and word processing. Students…
ERIC Educational Resources Information Center
Sheehan, Mark D.; Thorpe, Todd; Dunn, Robert
2015-01-01
Much has been gained over the years in various educational fields that have taken advantage of CALL. In many cases, CALL has facilitated learning and provided teachers and students access to materials and tools that would have remained out of reach were it not for technology. Nonetheless, there are still cases where a lack of funding or access to…
Precisely Tracking Childhood Death.
Farag, Tamer H; Koplan, Jeffrey P; Breiman, Robert F; Madhi, Shabir A; Heaton, Penny M; Mundel, Trevor; Ordi, Jaume; Bassat, Quique; Menendez, Clara; Dowell, Scott F
2017-07-01
Little is known about the specific causes of neonatal and under-five childhood death in high-mortality geographic regions due to a lack of primary data and dependence on inaccurate tools, such as verbal autopsy. To meet the ambitious new Sustainable Development Goal 3.2 to eliminate preventable child mortality in every country, better approaches are needed to precisely determine specific causes of death so that prevention and treatment interventions can be strengthened and focused. Minimally invasive tissue sampling (MITS) is a technique that uses needle-based postmortem sampling, followed by advanced histopathology and microbiology, to definitively determine the cause of death. The Bill & Melinda Gates Foundation is supporting a new surveillance system called the Child Health and Mortality Prevention Surveillance network, which will determine cause of death using MITS in combination with other information, and yield cause-specific population-based mortality rates, eventually in up to 12-15 sites in sub-Saharan Africa and south Asia. However, the Gates Foundation funding alone is not enough. We call on governments, other funders, and international stakeholders to expand the use of pathology-based cause of death determination to provide the information needed to end preventable childhood mortality.
Sequence Diversity Diagram for comparative analysis of multiple sequence alignments.
Sakai, Ryo; Aerts, Jan
2014-01-01
The sequence logo is a graphical representation of a set of aligned sequences, commonly used to depict conservation of amino acid or nucleotide sequences. Although it effectively communicates the amount of information present at every position, this visual representation falls short when the domain task is to compare two or more sets of aligned sequences. We present a new visual representation called a Sequence Diversity Diagram and validate our design choices with a case study. Our software was developed using the open-source program called Processing. It loads multiple sequence alignment FASTA files and a configuration file, which can be modified as needed to change the visualization. The redesigned figure improves on the visual comparison of two or more sets, and it additionally encodes information on sequential position conservation. In our case study of the adenylate kinase lid domain, the Sequence Diversity Diagram reveals unexpected patterns and new insights, for example the identification of subgroups within the protein subfamily. Our future work will integrate this visual encoding into interactive visualization tools to support higher level data exploration tasks.
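A minimal sketch of the per-position composition counts that such a diversity visualisation encodes, computed from an aligned FASTA file with only the Python standard library (the file name is a placeholder; this is not the authors' Processing code):

```python
# Count residue composition at each alignment column of an aligned FASTA
# file. The file name is a placeholder; sequences are assumed equal-length.
from collections import Counter

def read_fasta(path):
    seqs, cur = [], []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if line.startswith(">"):
                if cur:
                    seqs.append("".join(cur))
                    cur = []
            elif line:
                cur.append(line)
    if cur:
        seqs.append("".join(cur))
    return seqs

seqs = read_fasta("alignment.fasta")   # assumed aligned input
for i in range(len(seqs[0])):
    # Three most common residues (or gaps) observed at column i.
    print(i + 1, Counter(s[i] for s in seqs).most_common(3))
```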
EPA Registers Innovative Tool to Control Corn Rootworm
Ribonucleic acid interference (RNAi)-based Plant-Incorporated Protectant (PIP) technology is a new and innovative scientific tool utilized by U.S. growers. Learn more about RNAi technology and the 4 new products containing the RNAi-based PIP called SMARTST
Concepts, tools, and strategies for effluent testing: An international survey
Whole effluent testing (also called Direct Toxicity Assessment) remains a critical long-term assessment tool for aquatic environmental protection. Use of animal alternative approaches for wastewater testing is expected to increase as more regulatory authorities routinely require ...
How Can I Deal with My Asthma?
... had trouble with it and why. Use asthma management tools. Even if you're feeling absolutely fine, don't abandon tools like daily long-term control medicines (also called "controller" or "maintenance" medicines) if they're a part of your ...
ERIC Educational Resources Information Center
Bustamante, Rebecca M.
2006-01-01
This module is designed to introduce educational leaders to an organizational assessment tool called a "culture audit." Literature on organizational cultural competence suggests that culture audits are a valuable tool for determining how well school policies, programs, and practices respond to the needs of diverse groups and prepare…
FFI: What it is and what it can do for you
Duncan C. Lutes; MaryBeth Keifer; Nathan C. Benson; John F. Caratti
2009-01-01
A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool (FEAT). FFI provides...
HYPATIA--An Online Tool for ATLAS Event Visualization
ERIC Educational Resources Information Center
Kourkoumelis, C.; Vourakis, S.
2014-01-01
This paper describes an interactive tool for analysis of data from the ATLAS experiment taking place at the world's highest energy particle collider at CERN. The tool, called HYPATIA/applet, enables students of various levels to become acquainted with particle physics and look for discoveries in a similar way to that of real research.
Developing Multimedia Courseware for the Internet's Java versus Shockwave.
ERIC Educational Resources Information Center
Majchrzak, Tina L.
1996-01-01
Describes and compares two methods for developing multimedia courseware for use on the Internet: an authoring tool called Shockwave, and an object-oriented language called Java. Topics include vector graphics, browsers, interaction with network protocols, data security, multithreading, and computer languages versus development environments. (LRW)
Computer aided manufacturing for complex freeform optics
NASA Astrophysics Data System (ADS)
Wolfs, Franciscus; Fess, Ed; Johns, Dustin; LePage, Gabriel; Matthews, Greg
2017-10-01
Recently, the desire to use freeform optics has been increasing. Freeform optics can be used to expand the capabilities of optical systems and reduce the number of optics needed in an assembly. The traits that increase optical performance also present challenges in manufacturing. As tolerances on freeform optics become more stringent, it is necessary to continue to improve methods for how the grinding and polishing processes interact with metrology. To create these complex shapes, OptiPro has developed a computer aided manufacturing package called PROSurf. PROSurf generates the tool paths required for grinding and polishing freeform optics with multiple axes of motion. It also uses metrology feedback for deterministic corrections. PROSurf handles two key aspects of the manufacturing process that most other CAM systems struggle with. The first is the ability to support several input types (equations, CAD models, point clouds) and still create a uniform high-density surface map usable for generating a smooth tool path. The second is improving the accuracy of mapping a metrology file to the part surface. To do this, OptiPro uses 3D error maps instead of traditional 2D maps. The metrology error map drives the tool path adjustment applied during processing. For grinding, the error map adjusts the tool position to compensate for repeatable system error. For polishing, the error map drives the relative dwell times of the tool across the part surface. This paper will present the challenges associated with these issues and the solutions that we have created.
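As a toy illustration of the dwell-time idea described above, and not PROSurf's actual algorithm, the sketch below turns a small surface error map into relative polishing dwell times, dwelling longer where more material must be removed; the error values are placeholders.

```python
# Illustrative sketch of deriving relative polishing dwell times from a
# surface error map: dwell longer where more material must be removed.
# Shows the general idea only; not PROSurf's algorithm or data.
import numpy as np

error_map = np.array([[0.00, 0.02, 0.05],   # surface error in microns
                      [0.01, 0.08, 0.03],   # (placeholder values)
                      [0.00, 0.04, 0.01]])

removal = np.clip(error_map, 0.0, None)     # only high spots need dwell
dwell = removal / removal.sum()             # relative dwell-time fractions
print(dwell.round(3))
```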
Bartke, Stephan; Schwarze, Reimund
2015-04-15
The EU Soil Thematic Strategy calls for the application of sustainability concepts and methods as part of an integrated policy to prevent soil degradation and to increase the re-use of brownfields. Although certain general principles have been proposed for the evaluation of sustainable development, the practical application of sustainability assessment tools (SATs) is contingent on the actual requirements of tool users, e.g. planners or investors, to pick up such instruments in actual decision making. We examine the normative sustainability principles that need to be taken into account in order to make sound land-use decisions between new development on greenfield sites and the regeneration of brownfields - and relate these principles to empirically observed user requirements and the properties of available SATs. In this way we provide an overview of approaches to sustainability assessment. Three stylized approaches, represented in each case by a typical tool selected from the literature, are presented and contrasted with (1) the norm-oriented Bellagio sustainability principles and (2) the requirements of three different stakeholder groups: decision makers, scientists/experts and representatives of the general public. The paper disentangles some of the inevitable trade-offs involved in seeking to implement sustainable land-use planning, i.e. between norm orientation and holism, broad participation and effective communication. It concludes with the controversial assessment that there are no perfect tools and that to be meaningful the user requirements of decision makers must take precedence over those of other interest groups in the design of SATs. Copyright © 2015 Elsevier Ltd. All rights reserved.
NATIONAL URBAN DATABASE AND ACCESS PORTAL TOOL
Current mesoscale weather prediction and microscale dispersion models are limited in their ability to perform accurate assessments in urban areas. A project called the National Urban Database with Access Portal Tool (NUDAPT) is beginning to provide urban data and improve the para...
Variant Review with the Integrative Genomics Viewer.
Robinson, James T; Thorvaldsdóttir, Helga; Wenger, Aaron M; Zehir, Ahmet; Mesirov, Jill P
2017-11-01
Manual review of aligned reads for confirmation and interpretation of variant calls is an important step in many variant calling pipelines for next-generation sequencing (NGS) data. Visual inspection can greatly increase the confidence in calls, reduce the risk of false positives, and help characterize complex events. The Integrative Genomics Viewer (IGV) was one of the first tools to provide NGS data visualization, and it currently provides a rich set of tools for inspection, validation, and interpretation of NGS datasets, as well as other types of genomic data. Here, we present a short overview of IGV's variant review features for both single-nucleotide variants and structural variants, with examples from both cancer and germline datasets. IGV is freely available at https://www.igv.org. Cancer Res; 77(21); e31-34. ©2017 AACR.
Enhancement Approach of Object Constraint Language Generation
NASA Astrophysics Data System (ADS)
Salemi, Samin; Selamat, Ali
2018-01-01
OCL is the most prevalent language for documenting system constraints annotated in UML. Writing OCL specifications is not an easy task due to the complexity of the OCL syntax. Therefore, an approach to help and assist developers in writing OCL specifications is needed. There are two existing approaches: first, creating OCL specifications with a tool called COPACABANA; second, an MDA-based approach that helps developers write OCL specifications with another tool, called NL2OCLviaSBVR, that generates the specifications automatically. This study presents another MDA-based approach, called En2OCL, whose objective is twofold: (1) to improve the precision of the existing works, and (2) to present a benchmark of these approaches. The benchmark shows that the accuracies of COPACABANA, NL2OCLviaSBVR, and En2OCL are 69.23, 84.64, and 88.40, respectively.
Dfam: a database of repetitive DNA based on profile hidden Markov models.
Wheeler, Travis J; Clements, Jody; Eddy, Sean R; Hubley, Robert; Jones, Thomas A; Jurka, Jerzy; Smit, Arian F A; Finn, Robert D
2013-01-01
We present a database of repetitive DNA elements, called Dfam (http://dfam.janelia.org). Many genomes contain a large fraction of repetitive DNA, much of which is made up of remnants of transposable elements (TEs). Accurate annotation of TEs enables research into their biology and can shed light on the evolutionary processes that shape genomes. Identification and masking of TEs can also greatly simplify many downstream genome annotation and sequence analysis tasks. The commonly used TE annotation tools RepeatMasker and Censor depend on sequence homology search tools such as cross_match and BLAST variants, as well as Repbase, a collection of known TE families each represented by a single consensus sequence. Dfam contains entries corresponding to all Repbase TE entries for which instances have been found in the human genome. Each Dfam entry is represented by a profile hidden Markov model, built from alignments generated using RepeatMasker and Repbase. When used in conjunction with the hidden Markov model search tool nhmmer, Dfam produces a 2.9% increase in coverage over consensus sequence search methods on a large human benchmark, while maintaining low false discovery rates, and coverage of the full human genome is 54.5%. The website provides a collection of tools and data views to support improved TE curation and annotation efforts. Dfam is also available for download in flat file format or in the form of MySQL table dumps.
Eyetracking Methodology in SCMC: A Tool for Empowering Learning and Teaching
ERIC Educational Resources Information Center
Stickler, Ursula; Shi, Lijing
2017-01-01
Computer-assisted language learning, or CALL, is an interdisciplinary area of research, positioned between science and social science, computing and education, linguistics and applied linguistics. This paper argues that by appropriating methods originating in some areas of CALL-related research, for example human-computer interaction (HCI) or…
On the Edge: Intelligent CALL in the 1990s.
ERIC Educational Resources Information Center
Underwood, John
1989-01-01
Examines the possibilities of developing computer-assisted language learning (CALL) based on the best of modern technology, arguing that artificial intelligence (AI) strategies will radically improve the kinds of exercises that can be performed. Recommends combining AI technology with other tools for delivering instruction, such as simulation and…
Rethinking Transfer: Learning from CALL Teacher Education as Consequential Transition
ERIC Educational Resources Information Center
Chao, Chin-chi
2015-01-01
Behind CALL teacher education (CTE) there is an unproblematized consensus of transfer, which suggests a positivist and tool-centered view of learning gains that differs from the sociocultural focus of recent teacher education research. Drawing on Beach's (2003) conceptualization of transfer as "consequential transition," this qualitative…
Developing a Web-Based PPGIS as an Environmental Reporting Service
NASA Astrophysics Data System (ADS)
Ranjbar Nooshery, N.; Taleai, M.; Kazemi, R.; Ebadi, K.
2017-09-01
Today, municipalities are searching for new tools to empower locals to change the future of their own areas by increasing their participation in different levels of urban planning. These tools should involve the community in the planning process using participatory approaches instead of long, traditional top-down planning models, and help municipalities obtain proper insight into the major problems of urban neighborhoods from the residents' point of view. To this end, public participation GIS (PPGIS), which enables citizens to record and follow up their feelings and spatial knowledge regarding problems of the city in the form of maps, has been introduced. In this research, a tool entitled CAER (Collecting & Analyzing of Environmental Reports) is developed. In the first step, a software framework based on a Web-GIS tool, called EPGIS (Environmental Participatory GIS), was designed to support public participation in reporting urban environmental problems and to facilitate data flow between citizens and the municipality. A web-based cartography tool was employed for geo-visualization and dissemination of map-based reports. In the second step of CAER, a subsystem was developed based on SOLAP (Spatial On-Line Analytical Processing) as a data mining tool to elicit the local knowledge, facilitating bottom-up urban planning practices and helping urban managers find hidden relations among the recorded reports. This system was implemented in a case study area in Boston, Massachusetts, and its usability was evaluated. CAER should be considered a bottom-up planning tool that collects people's problems and views about their neighborhoods and transmits them to city officials. It also helps urban planners find solutions for better management from the citizens' viewpoint and gives them the chance to develop plans that satisfy the citizens of those neighborhoods.
Aerospace Power Systems Design and Analysis (APSDA) Tool
NASA Technical Reports Server (NTRS)
Truong, Long V.
1998-01-01
The conceptual design of space and/or planetary electrical power systems has required considerable effort. Traditionally, in the early stages of the design cycle (conceptual design), the researchers have had to thoroughly study and analyze tradeoffs between system components, hardware architectures, and operating parameters (such as frequencies) to optimize system mass, efficiency, reliability, and cost. This process could take anywhere from several months to several years (as for the former Space Station Freedom), depending on the scale of the system. Although there are many sophisticated commercial software design tools for personal computers (PC's), none of them can support or provide total system design. To meet this need, researchers at the NASA Lewis Research Center cooperated with Professor George Kusic from the University of Pittsburgh to develop a new tool to help project managers and design engineers choose the best system parameters as quickly as possible in the early design stages (in days instead of months). It is called the Aerospace Power Systems Design and Analysis (APSDA) Tool. By using this tool, users can obtain desirable system design and operating parameters such as system weight, electrical distribution efficiency, bus power, and electrical load schedule. With APSDA, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The tool provides a user interface and operates on any PC running the MS-DOS (Microsoft Corp.) operating system, version 5.0 or later. A color monitor (EGA or VGA) and two-button mouse are required. The APSDA tool was presented at the 30th Intersociety Energy Conversion Engineering Conference (IECEC) and is being beta tested at several NASA centers. Beta test packages are available for evaluation by contacting the author.
Structure and software tools of AIDA.
Duisterhout, J S; Franken, B; Witte, F
1987-01-01
AIDA consists of a set of software tools to allow fast development of easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as its host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language allows the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA operates independently of terminal type and is, to a great extent, multilingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build faster, but (of course) less flexible, code from these table definitions. By separating the AIDA software into a source and a run-time version, one is able to write implementation-specific code which can be selected and loaded by a special source loader that is part of the AIDA software. This feature also makes it easier to maintain the software at different sites and on different installations.
Large-scale mHealth professional support for health workers in rural Maharashtra, India.
Hegde, Shailendra Kumar B; Saride, Sriranga Prasad; Kuruganty, Sudha; Banker, Niraja; Patil, Chetan; Phanse, Vishal
2018-04-01
Expanding mobile telephony in India has prompted interest in the potential of mobile-telephone health (mHealth) in linking health workers in rural areas with specialist medical advice and other professional services. In 2012, a toll-free helpline offering specialist medical advice to community-based health workers throughout Maharashtra was launched. Calls are handled via a 24-hour centre in Pune, staffed by health advisory officers and medical specialists. Health advisory officers handle general queries, which include medical advice via validated algorithms; blood on-call services; grievance issues; and mental health support - the latter calls are transferred to a qualified counsellor. Calls requiring more specialist advice are transferred to the appropriate medical specialist. This paper describes the experience of the first 4 years of this helpline, in terms of the services used, callers, nature of calls, types of queries serviced and lessons learnt. In the first 4 years of the helpline, 669 265 calls were serviced. Of these calls, 453 373 (67.74%) needed medical advice and were handled by health advisory officers. Specialist services were required to address 199 226 (29.77%) calls. Blood-bank-related services accounted for 7919 (1.18%) calls, while 2462 (0.37%) were grievance calls. Counselling for mental health issues accounted for 6285 (0.94%) calls. The large-scale mHealth professional support provided by this helpline in Maharashtra has reached many health workers serving rural communities. Future work is required to explore ways to expand the reach of the helpline further and to measure its effectiveness in improving health outcomes.
PERTS: A Prototyping Environment for Real-Time Systems
NASA Technical Reports Server (NTRS)
Liu, Jane W. S.; Lin, Kwei-Jay; Liu, C. L.
1991-01-01
We discuss an ongoing project to build a Prototyping Environment for Real-Time Systems, called PERTS. PERTS is a unique prototyping environment in that it has (1) tools and performance models for the analysis and evaluation of real-time prototype systems, (2) building blocks for flexible real-time programs and the supporting system software, (3) basic building blocks of distributed and intelligent real-time applications, and (4) an execution environment. PERTS will make the recent and future theoretical advances in real-time system design and engineering readily usable to practitioners. In particular, it will provide an environment for the use and evaluation of new design approaches, for experimentation with alternative system building blocks, and for the analysis and performance profiling of prototype real-time systems.
Privacy Awareness: A Means to Solve the Privacy Paradox?
NASA Astrophysics Data System (ADS)
Pötzsch, Stefanie
People are limited in their resources, i.e. they have limited memory capabilities, cannot pay attention to too many things at the same time, and forget much information after a while; computers do not suffer from these limitations. Thus, revealing personal data in electronic communication environments and being completely unaware of the impact of privacy might cause a lot of privacy issues later. Even if people are privacy aware in general, the so-called privacy paradox shows that they do not behave according to their stated attitudes. This paper discusses explanations for the existing dichotomy between the intentions of people towards disclosure of personal data and their behaviour. We present requirements on tools for privacy-awareness support in order to counteract the privacy paradox.
Auction-based bandwidth allocation in the Internet
NASA Astrophysics Data System (ADS)
Wei, Jiaolong; Zhang, Chi
2002-07-01
It has been widely accepted that auctioning, the pricing approach with minimal information requirements, is a proper tool for managing scarce network resources. Previous works focus on the Vickrey auction, which is incentive compatible in classic auction theory. In the beginning of this paper, the faults of the most representative auction-based mechanisms are discussed. Then a new method called the uniform-price auction (UPA), which has the simplest auction rule, is proposed, and its incentive compatibility in the network environment is proved. Finally, the basic model is extended, by introducing a derivative market, to support applications that require minimum bandwidth guarantees for a given time period, completing a market mechanism for network resource allocation that is predictable, riskless, and simple for end-users.
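One common formulation of a uniform-price rule, sketched below for intuition (the paper's exact rule and incentive-compatibility proof are not reproduced here): serve bids from the highest price down until capacity is exhausted, and charge every winner the price of the marginal accepted bid. The bid list is illustrative.

```python
# Sketch of one common uniform-price auction rule for divisible bandwidth:
# serve bids from highest price down until capacity runs out, and charge
# every winner the price of the last accepted (marginal) bid. Illustrative
# only; other variants charge the highest rejected bid instead.
def uniform_price_auction(bids, capacity):
    """bids: list of (bidder, price_per_unit, quantity)."""
    allocation, clearing_price = {}, 0.0
    for bidder, price, qty in sorted(bids, key=lambda b: -b[1]):
        if capacity <= 0:
            break
        granted = min(qty, capacity)
        allocation[bidder] = granted
        capacity -= granted
        clearing_price = price    # price of the marginal accepted bid
    return allocation, clearing_price

bids = [("a", 5.0, 10), ("b", 3.0, 20), ("c", 4.0, 15)]
print(uniform_price_auction(bids, capacity=30))
# ({'a': 10, 'c': 15, 'b': 5}, 3.0)
```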
Vandenhove, Hildegarde; Turcanu, Catrinel
2016-10-01
The options adopted for recovery of agricultural land after the Chernobyl and Fukushima accidents are compared by examining their technical and socio-economic aspects. The analysis highlights commonalities such as the implementation of tillage and other types of countermeasures and differences in approach, such as preferences for topsoil removal in Fukushima and the application of K fertilizers in Chernobyl. This analysis shows that the recovery approach needs to be context-specific to best suit the physical, social, and political environment. The complex nature of the decision problem calls for a formal process for engaging stakeholders and the development of adequate decision support tools. Integr Environ Assess Manag 2016;12:662-666. © 2016 SETAC.
Physics-based and human-derived information fusion for analysts
NASA Astrophysics Data System (ADS)
Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael
2017-05-01
Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, with the big data opportunities there is a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort would be to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.
NASA Astrophysics Data System (ADS)
Harrison, Robert; Vera, Daniel; Ahmad, Bilal
2016-10-01
The fourth industrial revolution promises to create what has been called the smart factory. The vision is that within such modular structured smart factories, cyber-physical systems monitor physical processes, create a virtual copy of the physical world and make decentralised decisions. This paper provides a view of this initiative from an automation systems perspective. In this context it considers how future automation systems might be effectively configured and supported through their lifecycles and how integration, application modelling, visualisation and reuse of such systems might be best achieved. The paper briefly describes limitations in current engineering methods, and new emerging approaches including the cyber physical systems (CPS) engineering tools being developed by the automation systems group (ASG) at Warwick Manufacturing Group, University of Warwick, UK.
Reshaping of large aeronautical structural parts: A simplified simulation approach
NASA Astrophysics Data System (ADS)
Mena, Ramiro; Aguado, José V.; Guinard, Stéphane; Huerta, Antonio
2018-05-01
Large aeronautical structural parts exhibit significant distortions after machining. This problem is caused by the presence of residual stresses developed during previous manufacturing steps (quenching). Before the parts are put into service, the nominal geometry is restored by means of mechanical methods. This operation is called reshaping, and it depends exclusively on the skills of a well-trained and experienced operator. Moreover, the procedure is time-consuming and is nowadays based only on a trial-and-error approach. There is therefore a need at the industrial level to solve this problem with the support of numerical simulation tools. Using a simplifying hypothesis, it was found that the springback phenomenon behaves linearly, which allows a strategy to be developed for implementing reshaping at an industrial level.
Visualising nursing data using correspondence analysis.
Kokol, Peter; Blažun Vošner, Helena; Železnik, Danica
2016-09-01
Digitally stored, large healthcare datasets enable nurses to use 'big data' techniques and tools in nursing research. Big data is complex and multi-dimensional, so visualisation may be a preferable approach to analysing and understanding it. This article demonstrates the use of visualisation of big data through a technique called correspondence analysis. In the authors' study, relations among data in a nursing dataset were shown visually in graphs using correspondence analysis. The case presented demonstrates that correspondence analysis is easy to use, shows relations between data visually in a form that is simple to interpret, and can reveal hidden associations between data. Correspondence analysis supports the discovery of new knowledge. Implications for practice: knowledge obtained using correspondence analysis can be transferred immediately into practice or used to foster further research.
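For readers who want to try the technique, the sketch below runs a minimal correspondence analysis with NumPy: it takes the SVD of the standardized residuals of a contingency table and derives row and column principal coordinates that can be plotted together. The table values are placeholders, not the authors' nursing data.

```python
# Minimal correspondence analysis: SVD of the standardized residuals of a
# contingency table yields row/column coordinates for a joint plot.
# Table values are illustrative placeholders only.
import numpy as np

N = np.array([[20.0,  5.0,  2.0],    # e.g. categories x outcomes
              [ 3.0, 15.0,  4.0],
              [ 1.0,  6.0, 18.0]])

P = N / N.sum()                      # correspondence matrix
r, c = P.sum(axis=1), P.sum(axis=0)  # row and column masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))  # standardized residuals
U, s, Vt = np.linalg.svd(S, full_matrices=False)

row_coords = (U * s) / np.sqrt(r)[:, None]   # row principal coordinates
col_coords = (Vt.T * s) / np.sqrt(c)[:, None]
print(row_coords[:, :2])                     # first two dimensions to plot
print(col_coords[:, :2])
```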
Gage, Barbara; Stineman, Margaret; Deutsch, Anne; Mallinson, Trudy; Heinemann, Allen; Bernard, Shulamit; Constantine, Roberta
2007-12-01
Better measurement of the case-mix complexity of patients receiving rehabilitation services is critical to understanding variations in the outcomes achieved by patients treated in different postacute care (PAC) settings. The Medicare program recognized this issue and is undertaking a major initiative to develop a new patient-assessment instrument that would standardize case-mix measurement in inpatient rehabilitation facilities, long-term care hospitals, skilled nursing facilities, and home health agencies. The new instrument, called the Continuity Assessment Record and Evaluation Tool, builds on the scientific advances in measurement to develop standard measures of medical acuity, functional status, cognitive impairment, and social support related to resource need, outcomes, and continuity of care for use in all PAC settings.
An application of computer aided requirements analysis to a real time deep space system
NASA Technical Reports Server (NTRS)
Farny, A. M.; Morris, R. V.; Hartsough, C.; Callender, E. D.; Teichroew, D.; Chikofsky, E.
1981-01-01
The entire procedure of incorporating the requirements and goals of a space flight project into integrated, time-ordered sequences of spacecraft commands is called the uplink process. The Uplink Process Control Task (UPCT) was created to examine the uplink process and determine ways to improve it. The Problem Statement Language/Problem Statement Analyzer (PSL/PSA), designed to assist the designer/analyst/engineer in the preparation of specifications of an information system, is used as a supporting tool to aid in the analysis. Attention is given to a definition of the uplink process, the definition of PSL/PSA, the construction of a PSA database, the value of analysis to the study of the uplink process, and the PSL/PSA lessons learned.
NASA's Astrophysics Data Archives
NASA Astrophysics Data System (ADS)
Hasan, H.; Hanisch, R.; Bredekamp, J.
2000-09-01
The NASA Office of Space Science has established a series of archival centers where science data acquired through its space science missions are deposited. The availability of high-quality data to the general public through these open archives enables the maximization of the science return of the flight missions. The Astrophysics Data Centers Coordinating Council, an informal collaboration of archival centers, coordinates data from five archival centers distinguished primarily by the wavelength range of the data deposited there. Data are available in FITS format. An overview of NASA's data centers and services is presented in this paper. A standard front-end interface called 'Astrobrowse' is described. Other catalog browsers and tools include WISARD and AMASE, supported by the National Space Science Data Center, as well as ISAIA, a follow-on to Astrobrowse.
The semantics of Chemical Markup Language (CML) for computational chemistry : CompChem.
Phadungsukanan, Weerapong; Kraft, Markus; Townsend, Joe A; Murray-Rust, Peter
2012-08-07
This paper introduces a subdomain chemistry format for storing computational chemistry data called CompChem. It has been developed based on the design, concepts and methodologies of Chemical Markup Language (CML) by adding computational chemistry semantics on top of the CML Schema. The format allows a wide range of ab initio quantum chemistry calculations of individual molecules to be stored. These calculations include, for example, single point energy calculation, molecular geometry optimization, and vibrational frequency analysis. The paper also describes the supporting infrastructure, such as processing software, dictionaries, validation tools and database repositories. In addition, some of the challenges and difficulties in developing common computational chemistry dictionaries are discussed. The uses of CompChem are illustrated by two practical applications.
Crew resource management in the ICU: the need for culture change.
Haerkens, Marck Htm; Jenkins, Donald H; van der Hoeven, Johannes G
2012-08-22
Intensive care frequently results in unintentional harm to patients, and the statistics do not seem to improve. The ICU environment is especially unforgiving of mistakes due to the multidisciplinary, time-critical nature of care and the vulnerability of the patients. Human factors account for the majority of adverse events, and a sound safety climate is therefore essential. This article reviews the existing literature on an aviation-derived training called Crew Resource Management (CRM) and discusses its application in critical care medicine. CRM focuses on teamwork, threat and error management, and blame-free discussion of human mistakes. Though evidence is still scarce, the authors consider CRM to be a promising tool for culture change in the ICU setting, if supported by leadership and well-designed follow-up.
Mayfield, Teresa J; Olimpo, Jeffrey T; Floyd, Kevin W; Greenbaum, Eli
2018-01-01
Scientists are increasingly called upon to communicate with the public, yet most never receive formal training in this area. Public understanding is particularly critical to maintaining support for undervalued resources such as biological collections, research data repositories, and expensive equipment. We describe activities carried out in an inquiry-driven organismal biology laboratory course designed to engage a diverse student body using biological collections. The goals of this cooperative learning experience were to increase students' ability to locate and comprehend primary research articles, and to communicate the importance of an undervalued scientific resource to nonscientists. Our results indicate that collaboratively created, research-focused informational posters are an effective tool for achieving these goals and may be applied in other disciplines or classroom settings.
A hierarchical distributed control model for coordinating intelligent systems
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1991-01-01
A hierarchical distributed control (HDC) model for coordinating cooperative problem-solving among intelligent systems is described. The model was implemented using SOCIAL, an innovative object-oriented tool for integrating heterogeneous, distributed software systems. SOCIAL embeds applications in 'wrapper' objects called Agents, which supply predefined capabilities for distributed communication, control, data specification, and translation. The HDC model is realized in SOCIAL as a 'Manager' Agent that coordinates interactions among application Agents. The HDC Manager indexes the capabilities of application Agents, routes request messages to suitable server Agents, and stores results in a commonly accessible 'Bulletin-Board'. This centralized control model is illustrated in a fault diagnosis application for launch operations support of the Space Shuttle fleet at NASA, Kennedy Space Center.
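The coordination pattern described above can be sketched compactly. The following Python fragment is a schematic illustration only, not SOCIAL's actual API: a manager indexes agent capabilities, routes each request to a suitable server agent, and posts the result to a shared bulletin board. All class and method names are hypothetical.

```python
# Schematic sketch of the HDC coordination pattern (not SOCIAL's API):
# the manager indexes capabilities, routes requests, and records results
# on a commonly accessible bulletin board.
class ManagerAgent:
    def __init__(self):
        self.capabilities = {}     # capability name -> server agent
        self.bulletin_board = []   # results readable by all agents

    def register(self, agent, capabilities):
        for cap in capabilities:
            self.capabilities[cap] = agent

    def request(self, capability, payload):
        agent = self.capabilities[capability]       # route to a server agent
        result = agent.handle(capability, payload)
        self.bulletin_board.append((capability, result))
        return result

class DiagnosisAgent:
    def handle(self, capability, payload):
        return f"diagnosed fault in {payload}"

manager = ManagerAgent()
manager.register(DiagnosisAgent(), ["fault-diagnosis"])
print(manager.request("fault-diagnosis", "LOX loading sequence"))
```

Routing every request through one manager keeps the control logic centralized, which is the property the fault-diagnosis application exploits.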
Recent Advancements towards Full-System Microfluidics
Miled, Amine
2017-01-01
Microfluidics is quickly becoming a key technology in an expanding range of fields, such as medical sciences, biosensing, bioactuation, chemical synthesis, and more. This is helping its transformation from a promising R&D tool to commercially viable technology. Fuelling this expansion is the intensified focus on automation and enhanced functionality through integration of complex electrical control, mechanical properties, in situ sensing and flow control. Here we highlight recent contributions to the Sensors Special Issue series called “Microfluidics-Based Microsystem Integration Research” under the following categories: (i) Device fabrication to support complex functionality; (ii) New methods for flow control and mixing; (iii) Towards routine analysis and point of care applications; (iv) In situ characterization; and (v) Plug and play microfluidics. PMID:28757587
Abidi, Samina; Vallis, Michael; Piccinini-Vallis, Helena; Imran, Syed Ali; Abidi, Syed Sibte Raza
2018-04-18
Behavioral science is now being integrated into diabetes self-management interventions. However, the challenge that presents itself is how to translate these knowledge resources during care so that primary care practitioners can use them to offer evidence-informed behavior change support and diabetes management recommendations to patients with diabetes. The aim of this study was to develop and evaluate a computerized decision support platform called "Diabetes Web-Centric Information and Support Environment" (DWISE) that assists primary care practitioners in applying standardized behavior change strategies and clinical practice guidelines-based recommendations to an individual patient and empowers the patient with the skills and knowledge required to self-manage their diabetes through planned, personalized, and pervasive behavior change strategies. A health care knowledge management approach is used to implement DWISE so that it features the following functionalities: (1) assessment of primary care practitioners' readiness to administer validated behavior change interventions to patients with diabetes; (2) educational support for primary care practitioners to help them offer behavior change interventions to patients; (3) access to evidence-based material, such as the Canadian Diabetes Association's (CDA) clinical practice guidelines, to primary care practitioners; (4) development of personalized patient self-management programs to help patients with diabetes achieve healthy behaviors to meet CDA targets for managing type 2 diabetes; (5) educational support for patients to help them achieve behavior change; and (6) monitoring of the patients' progress to assess their adherence to the behavior change program and motivating them to ensure compliance with their program. DWISE offers these functionalities through an interactive Web-based interface to primary care practitioners, whereas the patient's self-management program and associated behavior interventions are delivered through a mobile patient diary via mobile phones and tablets. DWISE has been tested for its usability, functionality, usefulness, and acceptance through a series of qualitative studies. For the primary care practitioner tool, most usability problems were associated with the navigation of the tool and the presentation, formatting, understandability, and suitability of the content. For the patient tool, most issues were related to the tool's screen layout, design features, understandability of the content, clarity of the labels used, and navigation across the tool. Facilitators and barriers to DWISE use in a shared decision-making environment have also been identified. This work has provided a unique electronic health solution to translate complex health care knowledge in terms of easy-to-use, evidence-informed, point-of-care decision aids for primary care practitioners. Patients' feedback is now being used to make necessary modifications to DWISE. ©Samina Abidi, Michael Vallis, Helena Piccinini-Vallis, Syed Ali Imran, Syed Sibte Raza Abidi. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 18.04.2018.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poliakoff, David; Legendre, Matt
2017-03-29
GOTCHA is a runtime API for intercepting function calls between shared libraries. It is intended to be used by HPC tools (i.e., performance analysis tools like Open/SpeedShop, HPCToolkit, TAU, etc.). These tools can use GOTCHA to intercept interesting functions, such as MPI functions, and collect performance metrics about those functions. We intend for this to be open-source software that gets adopted by other open-source tools that are used at LLNL.
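GOTCHA itself exposes a C API, so the fragment below is only a language-neutral analogy of the interposition idea: replace a library entry point with a wrapper that records a metric and then calls the original function (the "wrappee"). The helper name and the choice of intercepted function are illustrative, not part of GOTCHA.

```python
# Analogy only: interpose on a function, accumulate a timing metric, and
# delegate to the original. GOTCHA achieves this at the shared-library
# binding level in C, with no source changes to the intercepted library.
import functools
import time

def intercept(module, name, metrics):
    original = getattr(module, name)             # the "wrappee"
    @functools.wraps(original)
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()
        try:
            return original(*args, **kwargs)
        finally:
            metrics[name] = metrics.get(name, 0.0) + time.perf_counter() - t0
    setattr(module, name, wrapper)

metrics = {}
intercept(time, "sleep", metrics)                # wrap an "interesting" call
time.sleep(0.01)
print(metrics)                                   # cumulative time spent inside
```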
Morel, Sophia; Portolese, Olivia; Chertouk, Yasmine; Leahy, Jade; Bertout, Laurence; Laverdière, Caroline; Krajinovic, Maja; Sinnett, Daniel; Levy, Emile; Marcil, Valérie
2018-04-21
Survivors of childhood acute lymphoblastic leukemia (cALL) experience cardiometabolic and bone complications after treatments. This study aimed at developing and validating an interview-administered food frequency questionnaire (FFQ) that will serve to estimate the impact of nutrition on the development of long-term sequelae in French-Canadian cALL survivors. The FFQ was developed to assess habitual diet, Mediterranean diet score, nutrients promoting bone health and antioxidants. It was validated using a 3-day food record (3-DFR) in 80 cALL survivors (50% male) aged between 11.4 and 40.1 years (median of 18.0 years). Reproducibility was evaluated by comparing FFQs from visits 1 and 2 in 29 cALL survivors. When compared to the 3-DFR, the mean values for macro- and micronutrient intake were overestimated by our FFQ, with the exception of lipid-related nutrients. Correlations between nutrient intakes derived from the FFQs and the 3-DFRs were moderate to very good (0.46-0.74). Intraclass correlation coefficients assessing FFQ reproducibility ranged from 0.62 to 0.92, indicating moderate to good reliability. Furthermore, classification into quartiles showed that more than 75% of macro- and micronutrients derived from FFQs 1 and 2 were classified into the same or an adjacent quartile. Overall, our results support the reproducibility and accuracy of the developed FFQ to appropriately classify individuals according to their dietary intake. This validated tool will be valuable for future studies analyzing the impact of nutrition on cardiometabolic and bone complications in French-speaking populations.
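The three agreement checks reported above (validity correlations, intraclass correlation coefficients, and quartile cross-classification) follow a standard pattern. The Python sketch below shows one way to compute them on simulated intake data; the data, the one-way ICC variant, and all names are assumptions for illustration, not taken from the study.

```python
# Sketch of FFQ-vs-food-record agreement checks on simulated energy intakes.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
dfr = rng.normal(2000, 400, 80)                 # 3-day food record (kcal)
ffq = dfr * 1.15 + rng.normal(0, 250, 80)       # FFQ tends to overestimate

def icc_oneway(a, b):
    """One-way random-effects ICC for two measurements per subject."""
    x = np.column_stack([a, b])
    n = x.shape[0]
    ms_between = 2 * ((x.mean(axis=1) - x.mean()) ** 2).sum() / (n - 1)
    ms_within = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / n
    return (ms_between - ms_within) / (ms_between + ms_within)

def same_or_adjacent_quartile(a, b):
    qa = np.searchsorted(np.quantile(a, [0.25, 0.5, 0.75]), a)
    qb = np.searchsorted(np.quantile(b, [0.25, 0.5, 0.75]), b)
    return np.mean(np.abs(qa - qb) <= 1)

r, _ = pearsonr(ffq, dfr)
print(f"r = {r:.2f}, ICC = {icc_oneway(ffq, dfr):.2f}, "
      f"same/adjacent quartile = {same_or_adjacent_quartile(ffq, dfr):.0%}")
```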
Burnside, Elizabeth S.; Lee, Sandra J.; Bennette, Carrie; Near, Aimee M.; Alagoz, Oguzhan; Huang, Hui; van den Broek, Jeroen J.; Kim, Joo Yeon; Ergun, Mehmet A.; van Ravesteyn, Nicolien T.; Stout, Natasha K.; de Koning, Harry J.; Mandelblatt, Jeanne S.
2017-01-01
Background: There are no publicly available tools designed specifically to assist policy makers to make informed decisions about the optimal ages of breast cancer screening initiation for different populations of US women. Objective: To use three established simulation models to develop a web-based tool called Mammo OUTPuT. Methods: The simulation models use the 1970 US birth cohort and common parameters for incidence, digital screening performance, and treatment effects. Outcomes include breast cancers diagnosed, breast cancer deaths averted, breast cancer mortality reduction, false-positive mammograms, benign biopsies, and overdiagnosis. The Mammo OUTPuT tool displays these outcomes for combinations of age at screening initiation (every year from 40 to 49), annual versus biennial interval, lifetime versus 10-year horizon, and breast density, compared to waiting to start biennial screening at age 50 and continuing to 74. The tool was piloted by decision makers (n = 16) who completed surveys. Results: The tool demonstrates that benefits in the 40s increase linearly with earlier initiation age, without a specific threshold age. Likewise, the harms of screening increase monotonically with earlier ages of initiation in the 40s. The tool also shows users how the balance of benefits and harms varies with breast density. Surveys revealed that 100% of users (16/16) liked the appearance of the site; 94% (15/16) found the tool helpful; and 94% (15/16) would recommend the tool to a colleague. Conclusions: This tool synthesizes a representative subset of the most current CISNET (Cancer Intervention and Surveillance Modeling Network) simulation model outcomes to provide policy makers with quantitative data on the benefits and harms of screening women in the 40s. Ultimate decisions will depend on program goals, the population served, and informed judgments about the weight of benefits and harms. PMID:29376135
Demonstration of C-Tools for Community Exposure Assessment
The presentation describes a new community-scale tool called C-PORT to model emissions related to all port-related activities – including, but not limited to ships, trucks, cranes, etc. – and predict concentrations at fine spatial scales in the near-source environment...
How freight moves : estimating milage and routes using an innovative GIS tool
DOT National Transportation Integrated Search
2007-06-01
The Bureau of Transportation Statistics (BTS) has developed an innovative software tool, called GeoMiler, that is helping researchers better estimate freight travel. GeoMiler is being used to compute mileages along likely routes for the nearly 6 mill...
Feasibility of lane closures using probe data : technical brief.
DOT National Transportation Integrated Search
2017-04-01
This study developed an on-line system analysis tool called the Work Zone Interactive : Management Application - Planning (WIMAP-P), an easy-to-use and easy-to-learn tool for : predicting the traffic impact caused by work zone lane closures on freewa...
ERIC Educational Resources Information Center
Wilson, Courtney R.; Trautmann, Nancy M.; MaKinster, James G.; Barker, Barbara J.
2010-01-01
A new online tool called "Science Pipes" allows students to conduct biodiversity investigations. With this free tool, students create and run analyses that would otherwise require access to unwieldy data sets and the ability to write computer code. Using these data, students can conduct guided inquiries or hypothesis-driven research to…
Electronic Assessment and Feedback Tool in Supervision of Nursing Students during Clinical Training
ERIC Educational Resources Information Center
Mettiäinen, Sari
2015-01-01
The aim of this study was to determine nursing teachers' and students' attitudes to and experiences of using an electronic assessment and feedback tool in supervision of clinical training. The tool was called eTaitava, and it was developed in Finland. During the pilot project, the software was used by 12 nursing teachers and 430 nursing students.…
ERIC Educational Resources Information Center
European Training Foundation, Turin (Italy).
This document presents a management tool kit on training needs assessment and program design for countries in transition to a market economy. Chapter 1 describes the tool's development within the framework of the project called Strengthening of Partnership between Management Training Institutions and Companies, Ukraine-Kazakhstan-Kyrgyzstan.…
Advances in Grid Computing for the FabrIc for Frontier Experiments Project at Fermilab
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herner, K.; Alba Hernandez, A. F.; Bhat, S.
The FabrIc for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, to help experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called Distributed Computing Access with Federated Identities (DCAFI) has been put in place that has eliminated our dependence on a Fermilab-specific third-party Certificate Authority service and better accommodates FIFE collaborators without a Fermilab Kerberos account. DCAFI integrates the existing InCommon federated identity infrastructure, CILogon Basic CA, and a MyProxy service using a new general purpose open source tool. We will discuss the general FIFE onboarding strategy, progress in expanding FIFE experiments' presence on the Open Science Grid, new tools for job monitoring, the POMS service, and the DCAFI project.
Advances in Grid Computing for the Fabric for Frontier Experiments Project at Fermilab
NASA Astrophysics Data System (ADS)
Herner, K.; Alba Hernandez, A. F.; Bhat, S.; Box, D.; Boyd, J.; Di Benedetto, V.; Ding, P.; Dykstra, D.; Fattoruso, M.; Garzoglio, G.; Kirby, M.; Kreymer, A.; Levshina, T.; Mazzacane, A.; Mengel, M.; Mhashilkar, P.; Podstavkov, V.; Retzke, K.; Sharma, N.; Teheran, J.
2017-10-01
The Fabric for Frontier Experiments (FIFE) project is a major initiative within the Fermilab Scientific Computing Division charged with leading the computing model for Fermilab experiments. Work within the FIFE project creates close collaboration between experimenters and computing professionals to serve high-energy physics experiments of differing size, scope, and physics area. The FIFE project has worked to develop common tools for job submission, certificate management, software and reference data distribution through CVMFS repositories, robust data transfer, job monitoring, and databases for project tracking. Since the project's inception the experiments under the FIFE umbrella have significantly matured, and present an increasingly complex list of requirements to service providers. To meet these requirements, the FIFE project has been involved in transitioning the Fermilab General Purpose Grid cluster to support a partitionable slot model, expanding the resources available to experiments via the Open Science Grid, assisting with commissioning dedicated high-throughput computing resources for individual experiments, supporting the efforts of the HEP Cloud projects to provision a variety of back end resources, including public clouds and high performance computers, and developing rapid onboarding procedures for new experiments and collaborations. The larger demands also require enhanced job monitoring tools, which the project has developed using such tools as ElasticSearch and Grafana, to help experiments manage their large-scale production workflows. This group in turn requires a structured service to facilitate smooth management of experiment requests, which FIFE provides in the form of the Production Operations Management Service (POMS). POMS is designed to track and manage requests from the FIFE experiments to run particular workflows, and support troubleshooting and triage in case of problems. Recently a new certificate management infrastructure called Distributed Computing Access with Federated Identities (DCAFI) has been put in place that has eliminated our dependence on a Fermilab-specific third-party Certificate Authority service and better accommodates FIFE collaborators without a Fermilab Kerberos account. DCAFI integrates the existing InCommon federated identity infrastructure, CILogon Basic CA, and a MyProxy service using a new general purpose open source tool. We will discuss the general FIFE onboarding strategy, progress in expanding FIFE experiments' presence on the Open Science Grid, new tools for job monitoring, the POMS service, and the DCAFI project.
Bat detective-Deep learning tools for bat acoustic signal detection.
Mac Aodha, Oisin; Gibb, Rory; Barlow, Kate E; Browning, Ella; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R; Newson, Stuart E; Pandourski, Ivan; Parsons, Stuart; Russ, Jon; Szodoray-Paradi, Abigel; Szodoray-Paradi, Farkas; Tilova, Elena; Girolami, Mark; Brostow, Gabriel; Jones, Kate E
2018-03-01
Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio.
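As a concrete illustration of the detection stage, the sketch below defines a small convolutional network in PyTorch that scores fixed-size spectrogram windows for the presence of an echolocation call; sliding such a scorer along a recording localizes calls in time. The architecture, window size, and names are assumptions for illustration, not the authors' published pipeline.

```python
# Minimal call/no-call scorer for spectrogram windows (illustrative only).
import torch
import torch.nn as nn

class CallDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 1),                     # logit: call vs. background
        )

    def forward(self, x):                         # x: (batch, 1, 64, 64)
        return self.head(self.features(x))

model = CallDetector()
windows = torch.randn(8, 1, 64, 64)               # dummy spectrogram windows
probs = torch.sigmoid(model(windows))             # per-window call probability
```

In training, the citizen-science annotations would supply the binary target for each window.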
Bat detective—Deep learning tools for bat acoustic signal detection
Barlow, Kate E.; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R.; Newson, Stuart E.; Pandourski, Ivan; Russ, Jon; Szodoray-Paradi, Abigel; Tilova, Elena; Girolami, Mark; Jones, Kate E.
2018-01-01
Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio. PMID:29518076
A workshop will be conducted to demonstrate and focus on two decision support tools developed at EPA/ORD: 1. Community-scale MARKAL model: an energy-water technology evaluation tool and 2. Municipal Solid Waste Decision Support Tool (MSW DST). The Workshop will be part of Southea...
Kerr, Cicely; Murray, Elizabeth; Burns, Jo; Turner, Indra; Westwood, Mark A; Macadam, Catherine; Nazareth, Irwin; Patterson, David
2008-01-01
Internet interventions can help people to self-manage chronic disease. However, they are only likely to be used if they meet patients' perceived needs. We have developed an Internet intervention in two stages to meet the needs of patients with coronary heart disease (CHD). First, user-generated criteria were applied to an existing US-based intervention called 'CHESS Living with Heart Disease', which provides information, emotional and social support, self-assessment and monitoring tools, and behavioural change support. This identified the development work required. Then we conducted a user evaluation with a panel of five patients with CHD. Overall, users made positive comments about the information content. However, they criticized the presentation, the ease of navigating the content, the clarity of what each service offered, and the difficulty of finding the information they were after. Applying user-generated quality criteria proved useful in developing an intervention to meet the needs of UK patients with CHD.
To welcome or affirm: Black clergy views about homosexuality, inclusivity, and church leadership.
Barnes, Sandra L
2013-01-01
When the subject of the Black Church and homosexuality is broached, research often focuses on homophobia and correlates with HIV/AIDS. Fewer studies examine other problematic issues germane to gay and lesbian involvement in Black congregations. In this analysis, Black clergy dialogue during focus groups about inclusivity and church leadership by gays and lesbians. Informed by Cultural Theory, of equal interest is whether discourses are influenced by Black Church cultural tools, as well as cultural dynamics, from the broader Black community. As anticipated, findings suggest the tendency for clergy to promote welcoming church spaces, but to be reticent about affirming homosexuality as an acceptable lifestyle. Furthermore, although clergy are generally supportive of involvement by closeted gays and lesbians as lay leaders, most do not support their involvement in the clergy, particularly as pastors. However, views vary based on denomination and gender, and are informed by Black Church cultural components such as scripture and the call-and-response tradition.
Development and function of human innate immune cells in a humanized mouse model.
Rongvaux, Anthony; Willinger, Tim; Martinek, Jan; Strowig, Till; Gearty, Sofia V; Teichmann, Lino L; Saito, Yasuyuki; Marches, Florentina; Halene, Stephanie; Palucka, A Karolina; Manz, Markus G; Flavell, Richard A
2014-04-01
Mice repopulated with human hematopoietic cells are a powerful tool for the study of human hematopoiesis and immune function in vivo. However, existing humanized mouse models cannot support development of human innate immune cells, including myeloid cells and natural killer (NK) cells. Here we describe two mouse strains called MITRG and MISTRG, in which human versions of four genes encoding cytokines important for innate immune cell development are knocked into their respective mouse loci. The human cytokines support the development and function of monocytes, macrophages and NK cells derived from human fetal liver or adult CD34(+) progenitor cells injected into the mice. Human macrophages infiltrated a human tumor xenograft in MITRG and MISTRG mice in a manner resembling that observed in tumors obtained from human patients. This humanized mouse model may be used to model the human immune system in scenarios of health and pathology, and may enable evaluation of therapeutic candidates in an in vivo setting relevant to human physiology.
Development and function of human innate immune cells in a humanized mouse model
Rongvaux, Anthony; Willinger, Tim; Martinek, Jan; Strowig, Till; Gearty, Sofia V.; Teichmann, Lino L.; Saito, Yasuyuki; Marches, Florentina; Halene, Stephanie; Palucka, A. Karolina; Manz, Markus G.; Flavell, Richard A.
2014-01-01
Mice repopulated with human hematopoietic cells are a powerful tool for the study of human hematopoiesis and immune function in vivo. However, existing humanized mouse models are unable to support development of human innate immune cells, including myeloid cells and NK cells. Here we describe a mouse strain, called MI(S)TRG, in which human versions of four genes encoding cytokines important for innate immune cell development are knocked in to their respective mouse loci. The human cytokines support the development and function of monocytes/macrophages and natural killer cells derived from human fetal liver or adult CD34+ progenitor cells injected into the mice. Human macrophages infiltrated a human tumor xenograft in MI(S)TRG mice in a manner resembling that observed in tumors obtained from human patients. This humanized mouse model may be used to model the human immune system in scenarios of health and pathology, and may enable evaluation of therapeutic candidates in an in vivo setting relevant to human physiology. PMID:24633240
Advancing satellite operations with intelligent graphical monitoring systems
NASA Technical Reports Server (NTRS)
Hughes, Peter M.; Shirah, Gregory W.; Luczak, Edward C.
1993-01-01
For nearly twenty-five years, spacecraft missions have been operated in essentially the same manner: human operators monitor displays filled with alphanumeric text, watching for limit violations or other indicators that signal a problem. The task is performed predominantly by humans. Only in recent years have graphical user interfaces and expert systems been accepted within the control center environment to help reduce operator workloads. Unfortunately, the development of these systems is often time consuming and costly. At the NASA Goddard Space Flight Center (GSFC), a new domain-specific expert system development tool called the Generic Spacecraft Analyst Assistant (GenSAA) has been developed. Through the use of a highly graphical user interface and point-and-click operation, GenSAA facilitates the rapid, 'programming-free' construction of intelligent graphical monitoring systems to serve as real-time, fault-isolation assistants for spacecraft analysts. Although specifically developed to support real-time satellite monitoring, GenSAA can support the development of intelligent graphical monitoring systems in a variety of space and commercial applications.
Numerical Propulsion System Simulation Architecture
NASA Technical Reports Server (NTRS)
Naiman, Cynthia G.
2004-01-01
The Numerical Propulsion System Simulation (NPSS) is a framework for performing analysis of complex systems. Because the NPSS was developed using the object-oriented paradigm, the resulting architecture is an extensible and flexible framework that is currently being used by a diverse set of participants in government, academia, and the aerospace industry. NPSS is being used by over 15 different institutions to support rockets, hypersonics, power and propulsion, fuel cells, ground based power, and aerospace. Full system-level simulations as well as subsystems may be modeled using NPSS. The NPSS architecture enables the coupling of analyses at various levels of detail, which is called numerical zooming. The middleware used to enable zooming and distributed simulations is the Common Object Request Broker Architecture (CORBA). The NPSS Developer's Kit offers tools for the developer to generate CORBA-based components and wrap codes. The Developer's Kit enables distributed multi-fidelity and multi-discipline simulations, preserves proprietary and legacy codes, and facilitates addition of customized codes. The platforms supported are PC, Linux, HP, Sun, and SGI.
Microfabricated X-ray Optics Technology Development for the Constellation-X Mission
NASA Technical Reports Server (NTRS)
Schattenburg, Mark L.
2005-01-01
MIT has previously developed advanced methods for the application of silicon microstructures (so-called microcombs) in the precision assembly of foil x-ray optics in support of the Constellation-X Spectroscopy X-ray Telescope (SXT) technology development at the NASA Goddard Space Flight Center (GSFC). During the first year of the above Cooperative Agreement, MIT developed a new, mature, potentially high-yield process for the manufacturing of microcombs that can be applied to a range of substrates independent of thickness. MIT also developed techniques to extract microcomb accuracy from an assembly truss metrology test stand and to extend the dynamic range of its Shack-Hartmann foil metrology tool. The placement repeatability of foil optics with microcombs in the assembly truss has been improved by a factor of two, to approximately 0.15 micron. This was achieved by adopting electrical contact determination in place of determining contact through force measurements. Development work on a stress-free thin foil holder was also supported by this agreement and successfully continued under a different grant.
Watershed Central: A New Gateway to Watershed Information
Many communities across the country struggle to find the right approaches, tools and data to use in their watershed plans. EPA recently posted a new Web site called "Watershed Central," a "one-stop" tool to help watershed organizations and others find key resources to protect their ...
A GIS-BASED MODAL MODEL OF AUTOMOBILE EXHAUST EMISSIONS
The report presents progress toward the development of a computer tool called MEASURE, the Mobile Emission Assessment System for Urban and Regional Evaluation. The tool works toward a goal of providing researchers and planners with a way to assess new mobile emission mitigation s...
Visualizing Qualitative Information
ERIC Educational Resources Information Center
Slone, Debra J.
2009-01-01
The abundance of qualitative data in today's society and the need to easily scrutinize, digest, and share this information calls for effective visualization and analysis tools. Yet, no existing qualitative tools have the analytic power, visual effectiveness, and universality of familiar quantitative instruments like bar charts, scatter-plots, and…
Chen, Connie; Haddad, David; Selsky, Joshua; Hoffman, Julia E; Kravitz, Richard L; Estrin, Deborah E; Sim, Ida
2012-08-09
Mobile phones and devices, with their constant presence, data connectivity, and multiple intrinsic sensors, can support around-the-clock chronic disease prevention and management that is integrated with daily life. These mobile health (mHealth) devices can produce tremendous amounts of location-rich, real-time, high-frequency data. Unfortunately, these data are often full of bias, noise, variability, and gaps. Robust tools and techniques have not yet been developed to make mHealth data more meaningful to patients and clinicians. To be most useful, health data should be sharable across multiple mHealth applications and connected to electronic health records. The lack of data sharing and dearth of tools and techniques for making sense of health data are critical bottlenecks limiting the impact of mHealth to improve health outcomes. We describe Open mHealth, a nonprofit organization that is building an open software architecture to address these data sharing and "sense-making" bottlenecks. Our architecture consists of open source software modules with well-defined interfaces using a minimal set of common metadata. An initial set of modules, called InfoVis, has been developed for data analysis and visualization. A second set of modules, our Personal Evidence Architecture, will support scientific inferences from mHealth data. These Personal Evidence Architecture modules will include standardized, validated clinical measures to support novel evaluation methods, such as n-of-1 studies. All of Open mHealth's modules are designed to be reusable across multiple applications, disease conditions, and user populations to maximize impact and flexibility. We are also building an open community of developers and health innovators, modeled after the open approach taken in the initial growth of the Internet, to foster meaningful cross-disciplinary collaboration around new tools and techniques. An open mHealth community and architecture will catalyze increased mHealth efficiency, effectiveness, and innovation.
Rafferty, Anne Marie; Philippou, Julia; Fitzpatrick, Joanne M; Pike, Geoff; Ball, Jane
2017-01-01
Objective: Concerns about care quality have prompted calls to create workplace cultures conducive to high-quality, safe and compassionate care and to provide a supportive environment in which staff can operate effectively. How healthcare organisations assess their culture of care is an important first step in creating such cultures. This article reports on the development and validation of a tool, the Culture of Care Barometer, designed to assess perceptions of a caring culture among healthcare workers preliminary to culture change. Design/setting/participants: An exploratory mixed methods study designed to develop and test the validity of a tool to measure ‘culture of care’ through focus groups and questionnaires. Questionnaire development was facilitated through: a literature review, experts generating items of interest and focus group discussions with healthcare staff across specialities, roles and seniority within three types of public healthcare organisations in the UK. The tool was designed to be multiprofessional and pilot tested with a sample of 467 nurses and healthcare support workers in acute care and then validated with a sample of 1698 staff working across acute, mental health and community services in England. Exploratory factor analysis was used to identify dimensions underlying the Barometer. Results: Psychometric testing resulted in the development of a 30-item questionnaire linked to four domains with retained items loading to four factors: organisational values (α=0.93, valid n=1568, M=3.7), team support (α=0.93, valid n=1557, M=3.2), relationships with colleagues (α=0.84, valid n=1617, M=4.0) and job constraints (α=0.70, valid n=1616, M=3.3). Conclusions: The study developed a valid and reliable instrument with which to gauge the different attributes of care culture perceived by healthcare staff with potential for organisational benchmarking. PMID:28821526
Rafferty, Anne Marie; Philippou, Julia; Fitzpatrick, Joanne M; Pike, Geoff; Ball, Jane
2017-08-18
Concerns about care quality have prompted calls to create workplace cultures conducive to high-quality, safe and compassionate care and to provide a supportive environment in which staff can operate effectively. How healthcare organisations assess their culture of care is an important first step in creating such cultures. This article reports on the development and validation of a tool, the Culture of Care Barometer, designed to assess perceptions of a caring culture among healthcare workers preliminary to culture change. An exploratory mixed methods study designed to develop and test the validity of a tool to measure 'culture of care' through focus groups and questionnaires. Questionnaire development was facilitated through: a literature review, experts generating items of interest and focus group discussions with healthcare staff across specialities, roles and seniority within three types of public healthcare organisations in the UK. The tool was designed to be multiprofessional and pilot tested with a sample of 467 nurses and healthcare support workers in acute care and then validated with a sample of 1698 staff working across acute, mental health and community services in England. Exploratory factor analysis was used to identify dimensions underlying the Barometer. Psychometric testing resulted in the development of a 30-item questionnaire linked to four domains with retained items loading to four factors: organisational values (α=0.93, valid n=1568, M=3.7), team support (α=0.93, valid n=1557, M=3.2), relationships with colleagues (α=0.84, valid n=1617, M=4.0) and job constraints (α=0.70, valid n=1616, M=3.3). The study developed a valid and reliable instrument with which to gauge the different attributes of care culture perceived by healthcare staff with potential for organisational benchmarking. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
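The internal-consistency figures quoted in both versions of this abstract (α between 0.70 and 0.93) are Cronbach's alpha values, which are straightforward to compute. The sketch below uses simulated Likert-style responses; the item data and sample size are invented for illustration and are not the Barometer's actual items.

```python
# Cronbach's alpha for a block of items assumed to load on one domain.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, n_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(2)
trait = rng.normal(0, 1, 500)                           # shared latent factor
items = trait[:, None] + rng.normal(0, 0.8, (500, 8))   # 8 correlated items
print(f"alpha = {cronbach_alpha(items):.2f}")
```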
Chen, Connie; Haddad, David; Selsky, Joshua; Hoffman, Julia E; Kravitz, Richard L; Estrin, Deborah E
2012-01-01
Mobile phones and devices, with their constant presence, data connectivity, and multiple intrinsic sensors, can support around-the-clock chronic disease prevention and management that is integrated with daily life. These mobile health (mHealth) devices can produce tremendous amounts of location-rich, real-time, high-frequency data. Unfortunately, these data are often full of bias, noise, variability, and gaps. Robust tools and techniques have not yet been developed to make mHealth data more meaningful to patients and clinicians. To be most useful, health data should be sharable across multiple mHealth applications and connected to electronic health records. The lack of data sharing and dearth of tools and techniques for making sense of health data are critical bottlenecks limiting the impact of mHealth to improve health outcomes. We describe Open mHealth, a nonprofit organization that is building an open software architecture to address these data sharing and “sense-making” bottlenecks. Our architecture consists of open source software modules with well-defined interfaces using a minimal set of common metadata. An initial set of modules, called InfoVis, has been developed for data analysis and visualization. A second set of modules, our Personal Evidence Architecture, will support scientific inferences from mHealth data. These Personal Evidence Architecture modules will include standardized, validated clinical measures to support novel evaluation methods, such as n-of-1 studies. All of Open mHealth’s modules are designed to be reusable across multiple applications, disease conditions, and user populations to maximize impact and flexibility. We are also building an open community of developers and health innovators, modeled after the open approach taken in the initial growth of the Internet, to foster meaningful cross-disciplinary collaboration around new tools and techniques. An open mHealth community and architecture will catalyze increased mHealth efficiency, effectiveness, and innovation. PMID:22875563
Detection of Cutting Tool Wear using Statistical Analysis and Regression Model
NASA Astrophysics Data System (ADS)
Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin
2010-10-01
This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistics-based method called the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz) was used to develop a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a CNC turning machine, a Colchester Master Tornado T4, in dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with the flank wear land (VB) was determined. A regression model was developed from this relationship, and the results show that the I-kaz 3D coefficient decreases as tool wear increases. The resulting model can then be used for real-time tool wear monitoring.
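The abstract does not reproduce the exact I-kaz 3D formula, so the sketch below only illustrates the general shape of such an analysis: compute a kurtosis-weighted coefficient from three band-filtered force signals, then regress it against flank wear VB. The band limits, the coefficient definition, and the simulated signals are all assumptions for illustration.

```python
# Illustrative kurtosis-based wear indicator and regression against VB.
import numpy as np
from scipy.signal import butter, sosfilt
from scipy.stats import kurtosis, linregress

def band_coefficient(signal, fs, bands=((1, 500), (500, 2000), (2000, 8000))):
    parts = []
    for lo, hi in bands:
        sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
        x = sosfilt(sos, signal)
        parts.append(kurtosis(x, fisher=False) * np.var(x))  # weighted energy
    return np.sqrt(np.sum(np.square(parts)))

fs = 20000
rng = np.random.default_rng(1)
vb = np.linspace(0.05, 0.30, 10)                  # flank wear land (mm)
# Simulated force signals whose amplitude falls as wear grows, so the
# coefficient decreases with wear, matching the trend reported above.
coeffs = [band_coefficient(rng.normal(0, 1 + 0.5 / w, fs), fs) for w in vb]
fit = linregress(vb, coeffs)
print(f"slope = {fit.slope:.1f}, r = {fit.rvalue:.2f}")
```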
Young, Anna M.; Cordier, Breanne; Mundry, Roger; Wright, Timothy F.
2014-01-01
In many social species, group members share acoustically similar calls. Functional hypotheses have been proposed for call sharing, but previous studies have been limited by an inability to distinguish among these hypotheses. We examined the function of vocal sharing in female budgerigars with a two-part experimental design that allowed us to distinguish between two functional hypotheses. The social association hypothesis proposes that shared calls help animals mediate affiliative and aggressive interactions, while the password hypothesis proposes that shared calls allow animals to distinguish group identity and exclude nonmembers. We also tested the labeling hypothesis, a mechanistic explanation which proposes that shared calls are used to address specific individuals within the sender–receiver relationship. We tested the social association hypothesis by creating four-member flocks of unfamiliar female budgerigars (Melopsittacus undulatus) and then monitoring the birds' calls, social behaviors, and stress levels via fecal glucocorticoid metabolites. We tested the password hypothesis by moving immigrants into established social groups. To test the labeling hypothesis, we conducted additional recording sessions in which individuals were paired with different group members. The social association hypothesis was supported by the development of multiple shared call types in each cage and a correlation between the number of shared call types and the number of aggressive interactions between pairs of birds. We also found support for calls serving as a labeling mechanism using discriminant function analysis with a permutation procedure. Our results did not support the password hypothesis, as there was no difference in stress or directed behaviors between immigrant and control birds. PMID:24860236
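The "discriminant function analysis with a permutation procedure" mentioned above pairs a classifier with a label-shuffling baseline: if cross-validated accuracy on the true caller labels clearly exceeds accuracy on permuted labels, the calls carry caller identity. The Python sketch below shows the pattern; the feature space and group sizes are invented for illustration.

```python
# LDA caller classification with a permutation baseline (simulated data).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_birds, calls_per_bird = 4, 30
centers = rng.normal(0, 2, (n_birds, 5))          # per-bird call signature
X = np.vstack([c + rng.normal(0, 1, (calls_per_bird, 5)) for c in centers])
y = np.repeat(np.arange(n_birds), calls_per_bird)

observed = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
permuted = np.mean([
    cross_val_score(LinearDiscriminantAnalysis(), X, rng.permutation(y), cv=5).mean()
    for _ in range(20)
])
print(f"observed accuracy {observed:.2f} vs. permuted baseline {permuted:.2f}")
```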
Mixed Initiative Visual Analytics Using Task-Driven Recommendations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kristin A.; Cramer, Nicholas O.; Israel, David
2015-12-07
Visual data analysis is composed of a collection of cognitive actions and tasks to decompose, internalize, and recombine data to produce knowledge and insight. Visual analytic tools provide interactive visual interfaces to data to support tasks involved in discovery and sensemaking, including forming hypotheses, asking questions, and evaluating and organizing evidence. Myriad analytic models can be incorporated into visual analytic systems, at the cost of increasing complexity in the analytic discourse between user and system. Techniques exist to increase the usability of interacting with such analytic models, such as inferring data models from user interactions to steer the underlying models of the system via semantic interaction, shielding users from having to do so explicitly. Such approaches are often also referred to as mixed-initiative systems. Researchers studying the sensemaking process have called for development of tools that facilitate analytic sensemaking through a combination of human and automated activities. However, design guidelines do not exist for mixed-initiative visual analytic systems to support iterative sensemaking. In this paper, we present a candidate set of design guidelines and introduce the Active Data Environment (ADE) prototype, a spatial workspace supporting the analytic process via task recommendations invoked by inferences on user interactions within the workspace. ADE recommends data and relationships based on a task model, enabling users to co-reason with the system about their data in a single, spatial workspace. This paper provides an illustrative use case, a technical description of ADE, and a discussion of the strengths and limitations of the approach.
Y0: An innovative tool for spatial data analysis
NASA Astrophysics Data System (ADS)
Wilson, Jeremy C.
1993-08-01
This paper describes an advanced analysis and visualization tool, called Y0 (pronounced "Why not?!"), that has been developed to directly support the scientific process for earth and space science research. Y0 aids the scientific research process by enabling the user to formulate algorithms and models within an integrated environment, and then interactively explore the solution space with the aid of appropriate visualizations. Y0 has been designed to provide strong support for both quantitative analysis and rich visualization. The user's algorithm or model is defined in terms of algebraic formulas in cells on worksheets, in a similar fashion to spreadsheet programs. Y0 is specifically designed to provide the data types and rich function set necessary for effective analysis and manipulation of remote sensing data. This includes various types of arrays, geometric objects, and objects for representing geographic coordinate system mappings. Visualization of results is tailored to the needs of remote sensing, with straightforward methods of composing, comparing, and animating imagery and graphical information, with reference to geographical coordinate systems. Y0 is based on advanced object-oriented technology. It is implemented in C++ for use in Unix environments, with a user interface based on the X window system. Y0 has been delivered under contract to Unidata, a group which provides data and software support to atmospheric researchers in universities affiliated with UCAR. This paper will explore the key concepts in Y0, describe its utility for remote sensing analysis and visualization, and give a specific example of its application to the problem of measuring glacier flow rates from Landsat imagery.
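The formula-cell idea can be miniaturized to show its flavor. The Python toy below is not Y0's actual engine (which is C++): cells hold formulas that reference other cells and are evaluated on demand, here deriving an NDVI-like band ratio from two stand-in image arrays. All names and values are invented.

```python
# Toy worksheet: named cells with formulas over other cells (illustrative).
import numpy as np

cells = {
    "band4": lambda c: np.array([[0.2, 0.4], [0.6, 0.8]]),  # stand-in imagery
    "band5": lambda c: np.array([[0.1, 0.3], [0.5, 0.9]]),
    "ndvi":  lambda c: (c["band5"] - c["band4"]) / (c["band5"] + c["band4"]),
}

class Worksheet:
    def __getitem__(self, name):
        return cells[name](self)      # evaluate a cell, resolving references

ws = Worksheet()
print(ws["ndvi"])                     # derived cell computed from its inputs
```

Because each access re-evaluates the cell, editing a formula and re-reading a derived cell immediately reflects the change, which is the interactive exploration style the paper describes.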
Weinlich, Michael; Kurz, Peter; Blau, Melissa B; Walcher, Felix; Piatek, Stefan
2018-01-01
When patients are disorientated or experience language barriers, it can be impossible to activate the emergency response system; in these cases, the delay in receiving appropriate help can extend to several hours. A worldwide emergency call support system (ECSS), using the geolocation capabilities of modern smartphones (GPS, WLAN and LBS) and modeled on the E911 and eCall systems, was established. The system was tested for its ability to quickly forward emergency calls made abroad to emergency medical services (EMS). To verify that geolocation data from smartphones are accurate enough for emergency use, the accuracy of GPS (global positioning system), Wi-Fi (wireless LAN network) and LBS (location based system) was tested in eleven different countries and compared to the actual location. The main objective was assessed by simulating emergencies in different countries and measuring the time delay in receiving help in unsuccessful emergency call cases when ECSS was used. GPS is the gold standard for locating patients, with an average accuracy of 2.0 ± 3.3 m. Wi-Fi can be used within buildings, with an accuracy of 7.0 ± 24.1 m. Using ECSS, an emergency call led to successful activation of EMS in 22.8 ± 10.8 min (median 21 min). The use of a simple app with one button to touch never caused any delay. The worldwide emergency call support system (ECSS) significantly improves the emergency response for disorientated patients or in the presence of language barriers. Without ECSS, help can be delayed by two or more hours, so the system may have relevant lifesaving effects. This is the first time that Wi-Fi geolocation has proved to be a useful complement to GPS in emergencies, especially within or close to buildings.
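Positional-accuracy comparisons like those above reduce to the distance between a device-reported fix and the surveyed true position. A minimal Python sketch using the haversine great-circle formula follows; the coordinates are illustrative, not from the study.

```python
# Great-circle error between a reported fix and the true position.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Distance in metres between two WGS84 points."""
    r = 6371000.0                                 # mean Earth radius (m)
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

true_fix = (52.5200, 13.4050)                     # surveyed position
gps_fix = (52.52001, 13.40503)                    # device-reported GPS fix
print(f"GPS error = {haversine_m(*true_fix, *gps_fix):.1f} m")
```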
Ewing, Gail; Austin, Lynn; Grande, Gunn
2016-04-01
The importance of supporting family carers is well recognised in healthcare policy. The Carer Support Needs Assessment Tool is an evidence-based, comprehensive measure of carer support needs to facilitate carer support in palliative home care. To examine practitioner perspectives of the role of the Carer Support Needs Assessment Tool intervention in palliative home care to identify its impact and mechanisms of action. Qualitative - practitioner accounts of implementation (interviews, focus groups, reflective audio diaries) plus researcher field notes. A total of 29 staff members from two hospice home-care services - contrasting geographical locations, different service sizes and staff composition. A thematic analysis was conducted. Existing approaches to identification of carer needs were informal and unstructured. Practitioners expressed some concerns, pre-implementation, about negative impacts of the Carer Support Needs Assessment Tool on carers and expectations raised about support available. In contrast, post-implementation, the Carer Support Needs Assessment Tool provided positive impacts when used as part of a carer-led assessment and support process: it made support needs visible, legitimised support for carers and opened up different conversations with carers. The mechanisms of action that enabled the Carer Support Needs Assessment Tool to make a difference were creating space for the separate needs of carers, providing an opportunity for carers to express support needs and responding to carers' self-defined priorities. The Carer Support Needs Assessment Tool delivered benefits through a change in practice to an identifiable, separate assessment process for carers, facilitated by practitioners but carer-led. Used routinely with all carers, the Carer Support Needs Assessment Tool has the potential to normalise carer assessment and support, facilitate delivery of carer-identified support and enable effective targeting of resources. © The Author(s) 2015.
ERIC Educational Resources Information Center
Wood, Marianne
2007-01-01
This article presents a lesson called Memory Palaces. A memory palace is a memory tool used to remember information, usually as visual images, in a sequence that is logical to the person remembering it. In his book, "In the Palaces of Memory", George Johnson calls them "...structure(s) for arranging knowledge. Lots of connections to language arts,…
ERIC Educational Resources Information Center
Marshall, Harriet
2011-01-01
This paper exposes the tensions between different agendas and calls for what is loosely called "global citizenship education" by developing a set of sociological conceptual tools useful for engaging with associated educational forms and ideals. It presents the instrumentalist and normative agendas at play within global citizenship education…
Gibbens, J C; Frost, A J; Houston, C W; Lester, H; Gauntlett, F A
2016-11-26
An evidence-based decision support tool, 'D2R2', has been developed by Defra. It contains a wide range of standardised information about exotic and endemic diseases held in 'disease profiles'. Each profile includes 40 criteria used for scoring, enabling D2R2 to provide relative priority rankings for every disease profiled. D2R2 also provides a range of reports for each disease and the functionality to explore the impact of changes in any criterion or weighting on a disease's ranking. These outputs aid the prioritisation and management of animal diseases by government. D2R2 was developed with wide stakeholder engagement and its design was guided by clear specifications. It uses the weighted scores of a limited number of criteria to generate impact and risk scores for each disease, and relies on evidence drawn from published material wherever possible and kept up to date. It allows efficient use of expertise, as maintained disease profiles reduce the need for on-call, reactive expert input for policy development, and enables rapid simultaneous access to the same information by multiple parties, for example during exotic disease outbreaks. The experience in developing D2R2 has been shared internationally to assist others with their development of disease prioritisation and categorisation systems. British Veterinary Association.
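The weighted-criterion scoring described above follows a generic pattern. The sketch below uses invented criteria, weights, and profiles, since D2R2's actual 40 criteria and weightings are not given here; re-running the ranking after changing a weight reproduces the "what if" exploration the tool offers.

```python
# Generic weighted scoring and ranking of disease profiles (illustrative).
weights = {"spread_rate": 3.0, "economic_impact": 2.5, "zoonotic_risk": 2.0}

profiles = {
    "disease_A": {"spread_rate": 4, "economic_impact": 5, "zoonotic_risk": 1},
    "disease_B": {"spread_rate": 2, "economic_impact": 3, "zoonotic_risk": 5},
}

def score(profile):
    return sum(weights[c] * profile[c] for c in weights)

ranking = sorted(profiles, key=lambda d: score(profiles[d]), reverse=True)
print([(d, score(profiles[d])) for d in ranking])
```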
NASA Astrophysics Data System (ADS)
Ravikumar, Ashwin; Larjavaara, Markku; Larson, Anne; Kanninen, Markku
2017-01-01
Revenues derived from carbon have been seen as an important tool for supporting forest conservation over the past decade. At the same time, there is high uncertainty about how much revenue can reasonably be expected from land use emissions reductions initiatives. Despite this uncertainty, REDD+ projects and conservation initiatives that aim to take advantage of available or, more commonly, future funding from carbon markets have proliferated. This study used participatory multi-stakeholder workshops to develop divergent future scenarios of land use in eight landscapes in four countries around the world: Peru, Indonesia, Tanzania, and Mexico. The results of these future scenario building exercises were analyzed using a new tool, CarboScen, for calculating the landscape carbon storage implications of different future land use scenarios. The findings suggest that potential revenues from carbon storage or emissions reductions are significant in some landscapes (most notably the peat forests of Indonesia), and much less significant in others (such as the low-carbon forests of Zanzibar and the interior of Tanzania). The findings call into question the practicality of many conservation programs that hinge on expectations of future revenue from carbon finance. The future scenarios-based approach is useful to policy-makers and conservation program developers in distinguishing between landscapes where carbon finance can substantially support conservation, and landscapes where other strategies for conservation and land use should be prioritized.
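The carbon arithmetic behind such scenario comparisons is simple bookkeeping: landscape carbon storage is the sum over land-use classes of area times per-hectare carbon density, and the difference between scenarios bounds the creditable emissions reductions. The Python sketch below uses invented areas and densities; it is not CarboScen's actual implementation.

```python
# Landscape carbon under two land-use scenarios (illustrative numbers).
carbon_density = {"peat_forest": 900, "dry_forest": 120, "cropland": 15}  # tC/ha

def landscape_carbon(areas_ha):
    return sum(areas_ha[lu] * carbon_density[lu] for lu in areas_ha)

baseline = {"peat_forest": 10000, "dry_forest": 5000, "cropland": 2000}
converted = {"peat_forest": 7000, "dry_forest": 5000, "cropland": 5000}

difference = landscape_carbon(baseline) - landscape_carbon(converted)
print(f"scenario difference: {difference:,.0f} tC")
```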
Srinivasan, Sudha M.; Bhat, Anjana N.
2013-01-01
The rising incidence of Autism Spectrum Disorders (ASDs) has led to a surge in the number of children needing autism interventions. This paper is a call to clinicians to diversify autism interventions and to promote the use of embodied music-based approaches to facilitate multisystem development. Approximately 12% of all autism interventions and 45% of all alternative treatment strategies in schools involve music-based activities. Musical training impacts various forms of development including communication, social-emotional, and motor development in children with ASDs and other developmental disorders as well as typically developing children. In this review, we will highlight the multisystem impairments of ASDs, explain why music and movement therapies are a powerful clinical tool, as well as describe mechanisms and offer evidence in support of music therapies for children with ASDs. We will support our claims by reviewing results from brain imaging studies reporting on music therapy effects in children with autism. We will also discuss the critical elements and the different types of music therapy approaches commonly used in pediatric neurological populations including autism. We provide strong arguments for the use of music and movement interventions as a multisystem treatment tool for children with ASDs. Finally, we also make recommendations for assessment and treatment of children with ASDs, and provide directions for future research. PMID:23576962
X-ray techniques for innovation in industry
Lawniczak-Jablonska, Krystyna; Cutler, Jeffrey
2014-01-01
The smart specialization agenda of the European Horizon 2020 programme, together with increasing cooperation between research and development staff in companies and researchers at universities and research institutions, has created a new paradigm in which many calls for proposals require participation and funding from both public and private entities. This has created a unique opportunity for large-scale facilities, such as synchrotron research laboratories, to participate in and support applied research programs. Scientific staff at synchrotron facilities have developed many advanced tools that make optimal use of the characteristics of the light generated by the storage ring. These tools, including X-ray absorption spectroscopy, diffraction, tomography and scattering, have been exceptionally valuable for materials characterization and have been key in solving many research and development issues. Progress in optics and detectors, as well as a large effort put into the improvement of data analysis codes, has resulted in reliable and reproducible procedures for materials characterization. Research with photons has contributed to the development of a wide variety of products such as plastics, cosmetics, chemicals, building materials, packaging materials and pharmaceuticals. This review highlights a few examples of successful cooperation that solved a variety of industrial technological problems, together with lessons learned from the Science Link project, supported by the European Commission, as a new approach to increasing the number of commercial users at large-scale research infrastructures. PMID:25485139
Toward a framework for computer-mediated collaborative design in medical informatics.
Patel, V L; Kaufman, D R; Allen, V G; Shortliffe, E H; Cimino, J J; Greenes, R A
1999-09-01
The development and implementation of enabling tools and methods that provide ready access to knowledge and information are among the central goals of medical informatics. The need for multi-institutional collaboration in the development of such tools and methods is increasingly being recognized. Collaboration depends on communication, which has traditionally occurred among individuals working together at the same location. With the evolution of electronic modalities for communication, we seek to understand the role that such technologies can play in supporting collaboration, especially when the participants are geographically separated. Using the InterMed Collaboratory as a subject of study, we have analyzed their activities as an exercise in computer- and network-mediated collaborative design. We report on the cognitive, sociocultural, and logistical issues encountered when scientists from diverse organizations and backgrounds use communications technologies while designing and implementing shared products. Results demonstrate that it is important to match carefully the content with the mode of communication, identifying, for example, suitable uses of E-mail, conference calls, and face-to-face meetings. The special role of leaders in guiding and facilitating the group activities can also be seen, regardless of the communication setting in which the interactions occur. Most important is the proper use of technology to support the evolution of a shared vision of group goals and methods, an element that is clearly necessary before successful collaborative designs can proceed.
Vargas-Salinas, Fernando; Amézquita, Adolfo
2013-01-01
According to the acoustic adaptation hypothesis, communication signals are evolutionarily shaped in ways that minimize their degradation and maximize their contrast against background noise. To compare the importance for call divergence of acoustic adaptation and of hybridization, an evolutionary force allegedly promoting phenotypic variation, we compared the mate recognition signal of two species of poison frogs (Oophaga histrionica and O. lehmanni) at five localities: two (one per species) alongside noisy streams, two away from streams, and one interspecific hybrid. We recorded the calls of 47 males and characterized the microgeographic variation in their spectral and temporal features, measuring ambient noise level, body size, and body temperature as covariates. As predicted, frogs living in noisy habitats uttered higher-frequency calls and, in one species, were much smaller in size. These results support a previously unconsidered role of stream noise as a selective force promoting an increase in call frequency, with pleiotropic effects on body size. Regarding hybrid frogs, their calls overlapped in signal space with the calls of one of the parental lineages. Our data support acoustic adaptation following two evolutionary routes but do not support the presumed role of hybridization in promoting phenotypic diversity.
"How can I help?" Nurse call openings on a cancer helpline and implications for call progressivity.
Leydon, Geraldine Marie; Ekberg, Katie; Drew, Paul
2013-07-01
Helplines are a key service used for information and support by people affected by cancer, yet little is known about the process of delivering and seeking cancer-related telephone help. Using conversation analysis, 52 calls between callers and specialist nurses on a major UK cancer helpline are analysed, focusing on how the specialist nurses open the calls. The helpline uses a triage system in which a frontline call-taker transfers callers to a specialist nurse, and this triage system introduces interactional challenges for nurses and callers. This paper demonstrates how calls commence and outlines the implications for how they progress. Four key elements of the nurse's initial opening were identified, which together contribute to managing an effective transition from the frontline call-taker to the current call with the specialist cancer nurse. The smooth exchange of information and provision of support in a trusted call environment is a critical goal of the cancer helpline; an effective call opening in a triage environment may significantly improve the chances of this goal being realised. A simple strategy is recommended to avoid the difficulties identified: a script for how the triaged call openings may be optimally formulated.
The experiences of frequent users of crisis helplines: A qualitative interview study.
Middleton, Aves; Gunn, Jane; Bassilios, Bridget; Pirkis, Jane
2016-11-01
To understand why some users call crisis helplines frequently. Nineteen semi-structured telephone interviews were conducted with callers to Lifeline Australia who reported calling 20 times or more in the past month and provided informed consent. Interviews were audio-recorded and transcribed verbatim. Inductive thematic analysis was used to generate common themes. Approval was granted by The University of Melbourne Human Research Ethics Committee. Three overarching themes emerged from the data and included reasons for calling, service response and calling behaviours. Respondents called seeking someone to talk to, help with their mental health issues and assistance with negative life events. When they called, they found short-term benefits in the unrestricted support offered by the helpline. Over time they called about similar issues and described reactive, support-seeking and dependent calling behaviours. Frequent users of crisis helplines call about ongoing issues. They have developed distinctive calling behaviours which appear to occur through an interaction between their reasons for calling and the response they receive from the helpline. The ongoing nature of the issues prompting frequent users to call suggests that a service model that includes a continuity of care component may be more efficient in meeting their needs.
ERIC Educational Resources Information Center
Mankowska, Anna
2016-01-01
The current literature contains little, if any, examination of play-based tools for exploring children's opinions in research. This paper is meant to address that gap and to showcase a study on the use of a specific play-based methodological tool in qualitative research. This specific tool called social board…
S.T.A. Pickett; M.L. Cadenasso; J.M. Grove
2004-01-01
Urban designers, ecologists, and social scientists have called for closer links among their disciplines. We examine a promising new tool for promoting this linkage: the metaphor of "cities of resilience." To put this tool to best use, we indicate how metaphor fits with other conceptual tools in science. We then present the two opposing definitions of...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Figen, J.
1981-09-10
Ratfor (RATional FORtran) is a dialect of Fortran that allows structured programming and the use of simple macros. It is the language of the Software Tools package, and is documented in the book Software Tools. It has proved significantly easier than Fortran to read, write, and understand. Although debugging is slightly harder in Ratfor than in Fortran, there is usually less of it to do, since Ratfor lends itself to better program design. Ratfor operates as a preprocessor to any existing Fortran system. It is relatively easy, using Ratfor, to write programs that are portable with little or no change to any environment that supports standard Fortran. REP (Ratfor Extended and Portable) is an extended version of Ratfor. It is fully upward compatible with the Addison-Wesley translator, though there are a few divergences from certain Unix and Software Tools User Group dialects. The principal feature of REP is its fully developed macro facility, a language within a language, capable of doing such things as creating new data types, data structuring, and recursive procedures, portably and in the spirit of Ratfor; there are also many lesser though nevertheless important extensions.
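As a toy illustration of what a structured-programming preprocessor such as Ratfor does (rewriting structured constructs into plain Fortran control flow), the Python sketch below expands a single, non-nested while/end block into labelled IF/GOTO statements. It is not the real Ratfor translator, and it handles none of Ratfor's macro facilities; the construct names and label scheme are assumptions made for the example.

```python
# Toy preprocessor sketch: rewrite one structured "while (cond) ... end"
# block into labelled Fortran-style IF/GOTO statements, the general kind
# of translation a tool like Ratfor performs before the Fortran compiler
# ever sees the code. Handles a single non-nested block only.

def expand_while(lines, label=100):
    out = []
    for line in lines:
        stripped = line.strip()
        if stripped.startswith("while"):
            cond = stripped[len("while"):].strip()     # e.g. "(i .lt. 10)"
            out.append(f"{label} if (.not. {cond}) goto {label + 1}")
        elif stripped == "end":
            out.append(f"      goto {label}")          # jump back to the test
            out.append(f"{label + 1} continue")        # loop exit point
        else:
            out.append("      " + stripped)            # loop body, unchanged
    return out

source = [
    "while (i .lt. 10)",
    "i = i + 1",
    "end",
]
print("\n".join(expand_while(source)))
```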
AquaCrop-OS: A tool for resilient management of land and water resources in agriculture
NASA Astrophysics Data System (ADS)
Foster, Timothy; Brozovic, Nicholas; Butler, Adrian P.; Neale, Christopher M. U.; Raes, Dirk; Steduto, Pasquale; Fereres, Elias; Hsiao, Theodore C.
2017-04-01
Water managers, researchers, and other decision makers worldwide face the challenge of increasing food production under population growth, drought, and rising water scarcity. Crop simulation models are valuable tools in this effort and, importantly, provide a means of rapidly quantifying crop yield response to water, climate, and field management practices. Here, we introduce a new open-source crop modelling tool called AquaCrop-OS (Foster et al., 2017), which extends the functionality of the globally used FAO AquaCrop model. Through case studies focused on groundwater-fed irrigation in the High Plains and in the Central Valley of California, in the United States, we demonstrate how AquaCrop-OS can be used to understand the local biophysical, behavioural, and institutional drivers of water risks in agricultural production. Furthermore, we illustrate how AquaCrop-OS can be combined effectively with hydrologic and economic models to support drought risk mitigation and decision-making around water resource management at a range of spatial and temporal scales, and we highlight future plans for model development and training. T. Foster, et al. (2017) AquaCrop-OS: An open source version of FAO's crop water productivity model. Agricultural Water Management. 181: 18-22. http://dx.doi.org/10.1016/j.agwat.2016.11.015.
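AquaCrop-OS itself simulates canopy growth and biomass water productivity day by day; as a far simpler stand-in for the yield-water responses such models quantify, the sketch below uses the classic FAO yield-response-to-water relation from FAO Irrigation and Drainage Paper 33, 1 - Ya/Ym = Ky (1 - ETa/ETm), with illustrative numbers only.

```python
# FAO-33 yield response to water: relative yield loss is proportional
# (via the crop-specific factor Ky) to the relative evapotranspiration
# deficit. This is not AquaCrop's internal algorithm, only a compact
# illustration of the yield-water relationship such models quantify.

def relative_yield(eta, etm, ky):
    """Relative yield Ya/Ym given actual (eta) and maximum (etm)
    evapotranspiration and the yield response factor Ky."""
    deficit = max(0.0, 1.0 - eta / etm)
    return max(0.0, 1.0 - ky * deficit)

# Illustrative numbers: a crop with Ky = 1.25 receiving 80% of its
# seasonal water demand loses 25% of its potential yield.
print(relative_yield(eta=400.0, etm=500.0, ky=1.25))   # -> 0.75
```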
Effect-directed analysis supporting monitoring of aquatic ...
Aquatic environments are often contaminated with complex mixtures of chemicals that may pose a risk to ecosystems and human health. This contamination cannot be addressed with target analysis alone; tools are required to reduce this complexity and identify those chemicals that might cause adverse effects. Effect-directed analysis (EDA) is designed to meet this challenge and faces increasing interest in water and sediment quality monitoring. Thus, the present paper summarizes current experience with the EDA approach and the tools required, and provides practical advice on their application. The paper highlights the need for proper problem formulation and gives general advice for study design. As the EDA approach is directed by toxicity, basic principles for the selection of bioassays are given as well as a comprehensive compilation of appropriate assays, including their strengths and weaknesses. A specific focus is given to strategies for sampling, extraction and bioassay dosing, since they strongly impact prioritization of toxicants in EDA. Reduction of sample complexity mainly relies on fractionation procedures, which are discussed in this paper, including quality assurance and quality control. Automated combinations of fractionation, biotesting and chemical analysis using so-called hyphenated tools can enhance throughput and might reduce the risk of artifacts in laboratory work. The key to determining the chemical structures causing effects is analytical toxi…
Tool Support for Software Lookup Table Optimization
Wilcox, Chris; Strout, Michelle Mills; Bieman, James M.
2011-01-01
A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology andmore » tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.« less