(abstract) Generic Modeling of a Life Support System for Process Technology Comparisons
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.
NASA Technical Reports Server (NTRS)
Thomas, Stan J.
1993-01-01
KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. In order to bring KATE to the level of performance, functionality, and integratability needed for firing room applications, efforts are underway to implement KATE in the C++ programming language using an X-windows interface. Two programs which were designed and added to the collection of tools which comprise the KATE toolbox are described. The first tool, called the schematic viewer, gives the KATE user the capability to view digitized schematic drawings in the KATE environment. The second tool, called the model editor, gives the KATE model builder a tool for creating and editing knowledge base files. Design and implementation issues having to do with these two tools are discussed. This discussion will be useful to anyone maintaining or extending either the schematic viewer or the model editor.
Software Engineering Tools for Scientific Models
NASA Technical Reports Server (NTRS)
Abrams, Marc; Saboo, Pallabi; Sonsini, Mike
2013-01-01
Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.
Generic Modeling of a Life Support System for Process Technology Comparison
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.
Detection of Cutting Tool Wear using Statistical Analysis and Regression Model
NASA Astrophysics Data System (ADS)
Ghani, Jaharah A.; Rizal, Muhammad; Nuawi, Mohd Zaki; Haron, Che Hassan Che; Ramli, Rizauddin
2010-10-01
This study presents a new method for detecting cutting tool wear based on measured cutting force signals. A statistical method, the Integrated Kurtosis-based Algorithm for Z-Filter technique (I-kaz), was used to develop a regression model and a 3D graphic presentation of the I-kaz 3D coefficient during the machining process. The machining tests were carried out on a Colchester Master Tornado T4 CNC turning machine under dry cutting conditions. A Kistler 9255B dynamometer was used to measure the cutting force signals, which were transmitted, analyzed, and displayed in the DasyLab software. Various force signals from the machining operation were analyzed, and each has its own I-kaz 3D coefficient. This coefficient was examined and its relationship with the flank wear land (VB) was determined. A regression model was developed from this relationship, and the results show that the I-kaz 3D coefficient value decreases as tool wear increases. The result can then be used for real-time tool wear monitoring.
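As a rough illustration of the monitoring step described above, the sketch below fits a simple linear regression between a signal-derived coefficient and flank wear VB and uses it for prediction; the I-kaz 3D computation itself is not reproduced, and all coefficient and wear values are synthetic stand-ins, not data from the study.

```python
# Hypothetical sketch: relate a kurtosis-style signal coefficient to flank wear VB.
# The I-kaz 3D coefficient itself is not reproduced here; z_coeff stands in for it.
import numpy as np

# Example measurements (synthetic): coefficient per test, flank wear VB in mm
z_coeff = np.array([0.92, 0.81, 0.74, 0.66, 0.57, 0.49, 0.41])
vb_mm   = np.array([0.05, 0.10, 0.15, 0.20, 0.25, 0.30, 0.35])

# Fit a first-order regression VB = a*z + b (the study reports the coefficient
# decreasing as wear increases, hence the negative slope in this toy data)
a, b = np.polyfit(z_coeff, vb_mm, deg=1)
r = np.corrcoef(z_coeff, vb_mm)[0, 1]
print(f"VB ~ {a:.3f} * z + {b:.3f}  (correlation r = {r:.3f})")

def estimate_wear(z):
    """Predict flank wear from a newly measured coefficient (monitoring use)."""
    return a * z + b

print(estimate_wear(0.60))
```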
2016-04-01
IND Response Decision-Making: Models for Government–Industry Collaboration for the Development of Game-Based Training Tools. R.M. Seater, C.E. Rose, A.S. Norige, K.C..., Group 44. Report 1208, Lexington, Massachusetts. EXECUTIVE SUMMARY: Game-based training tools, sometimes called "serious
2016-05-05
Training for IND Response Decision-Making: Models for Government–Industry Collaboration for the Development of Game-Based Training Tools. R.M. Seater... Skill Transfer and Virtual Training for IND Response Decision-Making: Models for Government–Industry Collaboration for the Development of Game-Based... EXECUTIVE SUMMARY: Game-based training tools, sometimes called "serious games," are becoming
SCOUT: A Fast Monte-Carlo Modeling Tool of Scintillation Camera Output
Hunter, William C. J.; Barrett, Harrison H.; Lewellen, Thomas K.; Miyaoka, Robert S.; Muzi, John P.; Li, Xiaoli; McDougald, Wendy; MacDonald, Lawrence R.
2011-01-01
We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:22072297
SCOUT: a fast Monte-Carlo modeling tool of scintillation camera output†
Hunter, William C J; Barrett, Harrison H.; Muzi, John P.; McDougald, Wendy; MacDonald, Lawrence R.; Miyaoka, Robert S.; Lewellen, Thomas K.
2013-01-01
We have developed a Monte-Carlo photon-tracking and readout simulator called SCOUT to study the stochastic behavior of signals output from a simplified rectangular scintillation-camera design. SCOUT models the salient processes affecting signal generation, transport, and readout of a scintillation camera. Presently, we compare output signal statistics from SCOUT to experimental results for both a discrete and a monolithic camera. We also benchmark the speed of this simulation tool and compare it to existing simulation tools. We find this modeling tool to be relatively fast and predictive of experimental results. Depending on the modeled camera geometry, we found SCOUT to be 4 to 140 times faster than other modeling tools. PMID:23640136
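For a flavor of the stochastic signal generation and readout such a simulator studies, here is a heavily simplified sketch of a 1-D sensor row with Gaussian light sharing and multinomial photon counting; the geometry, light-spread model, and photon yield are assumptions for illustration, not SCOUT's physics.

```python
# Minimal sketch (not SCOUT itself): stochastic light sharing and readout for a
# simplified 1-D scintillation camera, to illustrate the kind of signal statistics
# a photon-tracking/readout simulator produces.
import numpy as np

rng = np.random.default_rng(0)
sensor_x = np.linspace(-20.0, 20.0, 8)   # sensor centers, mm (assumed geometry)
light_spread_mm = 6.0                    # assumed optical spread of the light cone
photons_per_event = 2000                 # assumed scintillation light reaching sensors

def readout(event_x):
    # Fraction of light falling on each sensor: Gaussian light-sharing model (assumption)
    w = np.exp(-0.5 * ((sensor_x - event_x) / light_spread_mm) ** 2)
    p = w / w.sum()
    # Photon-counting noise: multinomial split of the detected photons
    return rng.multinomial(photons_per_event, p)

events = rng.uniform(-15, 15, size=1000)
estimates = []
for x in events:
    c = readout(x)
    estimates.append(np.dot(sensor_x, c) / c.sum())   # simple centroid positioning

err = np.array(estimates) - events
print(f"centroid bias {err.mean():.3f} mm, resolution (std) {err.std():.3f} mm")
```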
Information on urban morphological features at high resolution is needed to properly model and characterize the meteorological and air quality fields in urban areas. We describe a new project called the National Urban Database with Access Portal Tool (NUDAPT) that addresses this nee...
ERIC Educational Resources Information Center
Clossey, Laurene; Mehnert, Kevin; Silva, Sara
2011-01-01
This article describes an organizational development tool called appreciative inquiry (AI) and its use in mental health to aid agencies implementing recovery model services. AI is a discursive tool with the power to shift dominant organizational cultures. Its philosophical underpinnings emphasize values consistent with recovery: community,…
Semantic Importance Sampling for Statistical Model Checking
2015-01-16
SMT calls while maintaining correctness. Finally, we implement SIS in a tool called osmosis and use it to verify a number of stochastic systems with... Section 2 surveys related work. Section 3 presents background definitions and concepts. Section 4 presents SIS, and Section 5 presents our tool osmosis.
Computer-Aided Air-Traffic Control In The Terminal Area
NASA Technical Reports Server (NTRS)
Erzberger, Heinz
1995-01-01
Developmental computer-aided system for automated management and control of arrival traffic at large airport includes three integrated subsystems: one subsystem, called Traffic Management Advisor; another subsystem, called Descent Advisor; and a third subsystem, called Final Approach Spacing Tool. Data base that includes current wind measurements and mathematical models of performances of types of aircraft contributes to effective operation of system.
The 3 "C" Design Model for Networked Collaborative E-Learning: A Tool for Novice Designers
ERIC Educational Resources Information Center
Bird, Len
2007-01-01
This paper outlines a model for online course design aimed at the mainstream majority of university academics rather than at the early adopters of technology. It has been developed from work at Coventry Business School where tutors have been called upon to design online modules for the first time. Like many good tools, the model's key strength is…
NATIONAL URBAN DATABASE AND ACCESS PORTAL TOOL
Current mesoscale weather prediction and microscale dispersion models are limited in their ability to perform accurate assessments in urban areas. A project called the National Urban Database with Access Portal Tool (NUDAPT) is beginning to provide urban data and improve the para...
OCSEGen: Open Components and Systems Environment Generator
NASA Technical Reports Server (NTRS)
Tkachuk, Oksana
2014-01-01
To analyze a large system, one often needs to break it into smaller components. To analyze a component or unit under analysis, one needs to model its context of execution, called the environment, which represents the components with which the unit interacts. Environment generation is a challenging problem, because the environment needs to be general enough to uncover unit errors, yet precise enough to make the analysis tractable. In this paper, we present a tool for automated environment generation for open components and systems. The tool, called OCSEGen, is implemented on top of the Soot framework. We present the tool's current support and discuss its possible future extensions.
Demonstration of C-Tools for Community Exposure Assessment
The presentation describes a new community-scale tool called C-PORT to model emissions related to all port-related activities – including, but not limited to ships, trucks, cranes, etc. – and predict concentrations at fine spatial scales in the near-source environment...
Imamizu, Hiroshi; Kuroda, Tomoe; Yoshioka, Toshinori; Kawato, Mitsuo
2004-02-04
An internal model is a neural mechanism that can mimic the input-output properties of a controlled object such as a tool. Recent research interests have moved on to how multiple internal models are learned and switched under a given context of behavior. Two representative computational models for task switching propose distinct neural mechanisms, thus predicting different brain activity patterns in the switching of internal models. In one model, called the mixture-of-experts architecture, switching is commanded by a single executive called a "gating network," which is different from the internal models. In the other model, called the MOSAIC (MOdular Selection And Identification for Control), the internal models themselves play crucial roles in switching. Consequently, the mixture-of-experts model predicts that neural activities related to switching and internal models can be temporally and spatially segregated, whereas the MOSAIC model predicts that they are closely intermingled. Here, we directly examined the two predictions by analyzing functional magnetic resonance imaging activities during the switching of one common tool (an ordinary computer mouse) and two novel tools: a rotated mouse, the cursor of which appears in a rotated position, and a velocity mouse, the cursor velocity of which is proportional to the mouse position. The switching and internal model activities temporally and spatially overlapped each other in the cerebellum and in the parietal cortex, whereas the overlap was very small in the frontal cortex. These results suggest that switching mechanisms in the frontal cortex can be explained by the mixture-of-experts architecture, whereas those in the cerebellum and the parietal cortex are explained by the MOSAIC model.
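The following is a rough, illustrative sketch of the MOSAIC-style switching idea, in which each internal (forward) model predicts the sensory consequence of a movement and a responsibility signal is accumulated from its prediction error; the rotation angle, noise scale, and update rule are assumptions and do not reproduce the authors' experiment or the exact MOSAIC equations.

```python
# Illustrative sketch (assumptions, not the authors' implementation): MOSAIC-style
# switching, where each internal model predicts the cursor response and a
# responsibility signal is computed from its prediction error.
import numpy as np

rng = np.random.default_rng(1)
theta = np.deg2rad(75.0)                       # assumed rotation for the "rotated mouse"
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

forward_models = {
    "ordinary": lambda dm, m: dm,              # cursor displacement = mouse displacement
    "rotated":  lambda dm, m: R @ dm,          # cursor displacement rotated
    "velocity": lambda dm, m: 0.2 * m,         # cursor velocity proportional to position
}
names = list(forward_models)
resp = np.ones(len(names)) / len(names)        # responsibilities (start uniform)
sigma = 0.5                                    # assumed prediction-noise scale

def update(dm, m, observed):
    global resp
    errs = np.array([np.sum((observed - f(dm, m)) ** 2) for f in forward_models.values()])
    like = np.exp(-errs / (2 * sigma ** 2))    # likelihood of each internal model
    resp = resp * like
    resp = resp / resp.sum()                   # normalise to get responsibilities
    return resp

# Simulate trials with the rotated mouse: responsibility should shift to "rotated".
m = np.zeros(2)
for _ in range(20):
    dm = rng.normal(size=2)
    m = m + dm
    observed = R @ dm + rng.normal(scale=0.05, size=2)
    update(dm, m, observed)
print(dict(zip(names, np.round(resp, 3))))
```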
Integration of Air Quality & Exposure Models for Health Studies
The presentation describes a new community-scale tool called exposure model for individuals (EMI), which predicts five tiers of individual-level exposure metrics for ambient PM using outdoor concentrations, questionnaires, weather, and time-location information. In this modeling ...
Comparison of BrainTool to other UML modeling and model transformation tools
NASA Astrophysics Data System (ADS)
Nikiforova, Oksana; Gusarovs, Konstantins
2017-07-01
In the last 30 years, numerous model-driven software development approaches have been offered to address problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims for Unified Modeling Language (UML) models at different levels of abstraction, and software development automation using CASE tools is said to enable a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for traditional ones, and ending, for the most advanced ones, with a code generator (possibly based on a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor for defining new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.
A GIS-BASED MODAL MODEL OF AUTOMOBILE EXHAUST EMISSIONS
The report presents progress toward the development of a computer tool called MEASURE, the Mobile Emission Assessment System for Urban and Regional Evaluation. The tool works toward a goal of providing researchers and planners with a way to assess new mobile emission mitigation s...
ERIC Educational Resources Information Center
Sheehan, Mark D.; Thorpe, Todd; Dunn, Robert
2015-01-01
Much has been gained over the years in various educational fields that have taken advantage of CALL. In many cases, CALL has facilitated learning and provided teachers and students access to materials and tools that would have remained out of reach were it not for technology. Nonetheless, there are still cases where a lack of funding or access to…
Pathway Tools version 13.0: integrated software for pathway/genome informatics and systems biology
Paley, Suzanne M.; Krummenacker, Markus; Latendresse, Mario; Dale, Joseph M.; Lee, Thomas J.; Kaipa, Pallavi; Gilham, Fred; Spaulding, Aaron; Popescu, Liviu; Altman, Tomer; Paulsen, Ian; Keseler, Ingrid M.; Caspi, Ron
2010-01-01
Pathway Tools is a production-quality software environment for creating a type of model-organism database called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc integrates the evolving understanding of the genes, proteins, metabolic network and regulatory network of an organism. This article provides an overview of Pathway Tools capabilities. The software performs multiple computational inferences including prediction of metabolic pathways, prediction of metabolic pathway hole fillers and prediction of operons. It enables interactive editing of PGDBs by DB curators. It supports web publishing of PGDBs, and provides a large number of query and visualization tools. The software also supports comparative analyses of PGDBs, and provides several systems biology analyses of PGDBs including reachability analysis of metabolic networks, and interactive tracing of metabolites through a metabolic network. More than 800 PGDBs have been created using Pathway Tools by scientists around the world, many of which are curated DBs for important model organisms. Those PGDBs can be exchanged using a peer-to-peer DB sharing system called the PGDB Registry. PMID:19955237
Software Certification for Temporal Properties With Affordable Tool Qualification
NASA Technical Reports Server (NTRS)
Xia, Songtao; DiVito, Benedetto L.
2005-01-01
It has been recognized that a framework based on proof-carrying code (also called semantic-based software certification in its community) could be used as a candidate software certification process for the avionics industry. To meet this goal, tools in the "trust base" of a proof-carrying code system must be qualified by regulatory authorities. A family of semantic-based software certification approaches is described, each different in expressive power, level of automation and trust base. Of particular interest is the so-called abstraction-carrying code, which can certify temporal properties. When a pure abstraction-carrying code method is used in the context of industrial software certification, the fact that the trust base includes a model checker would incur a high qualification cost. This position paper proposes a hybrid of abstraction-based and proof-based certification methods so that the model checker used by a client can be significantly simplified, thereby leading to lower cost in tool qualification.
An MIP model to schedule the call center workforce and organize the breaks
NASA Astrophysics Data System (ADS)
Türker, Turgay; Demiriz, Ayhan
2016-06-01
In modern economies, companies place a premium on managing their workforce efficiently, especially in the labor-intensive service sector, since services have become a significant portion of these economies. Tour scheduling is an important tool for minimizing overall workforce costs while satisfying minimum service level constraints. In this study, we consider the workforce management problem of an inbound call center: satisfying the call demand within short time periods at minimum cost. We propose a mixed-integer programming model to assign workers to the daily shifts, to determine the weekly off-days, and to determine the timings of lunch and other daily breaks for each worker. The proposed model has been verified on weekly demand data observed at a specific call center location of a satellite TV operator. The model was run on both 15- and 10-minute demand estimation periods (planning time intervals).
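To make the structure of such a formulation concrete, here is a toy mixed-integer program with binary shift-assignment variables and per-period coverage constraints; the worker set, shift definitions, demand profile, and PuLP-based formulation are illustrative assumptions, and the paper's full model additionally handles off-days and break timing.

```python
# Toy sketch of the kind of MIP used for shift scheduling (names and data are
# hypothetical; the paper's full model also handles off-days and break placement).
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, value

workers = [f"w{i}" for i in range(6)]
shifts = {"early": range(0, 16), "late": range(8, 24)}     # periods covered by each shift
demand = {t: 2 if 8 <= t < 20 else 1 for t in range(24)}   # agents required per period

prob = LpProblem("call_center_tour_scheduling", LpMinimize)
x = {(w, s): LpVariable(f"x_{w}_{s}", cat=LpBinary) for w in workers for s in shifts}

# Each worker takes at most one shift per day
for w in workers:
    prob += lpSum(x[w, s] for s in shifts) <= 1

# Staffing must cover demand in every period
for t in range(24):
    prob += lpSum(x[w, s] for w in workers for s in shifts if t in shifts[s]) >= demand[t]

# Minimise total assigned worker-periods (a stand-in for workforce cost)
prob += lpSum(len(shifts[s]) * x[w, s] for w in workers for s in shifts)

prob.solve()
print("cost:", value(prob.objective))
print([ws for ws, var in x.items() if var.value() == 1])
```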
VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox
NASA Astrophysics Data System (ADS)
Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.
2016-12-01
VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL is also enabled with two novel features; the first one being a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second feature is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features in conjunction with bootstrapping enable the user to monitor the stability, robustness, and convergence of GSA with the increase in sample size for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
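A minimal sketch of the variogram-based idea (not VARS-TOOL itself): estimate a directional variogram of the model response for each parameter from star-like perturbations and integrate it over a range of scales to obtain an IVARS-like metric. The test function, sample sizes, and perturbation grid are assumptions.

```python
# Simplified, illustrative sketch of a variogram-based sensitivity metric in the
# spirit of IVARS (not the VARS-TOOL implementation): sample "star" centers,
# estimate a directional variogram per parameter, and integrate it up to a scale.
import numpy as np

rng = np.random.default_rng(2)

def model(x):                        # toy 3-parameter test function (assumption)
    return np.sin(2 * np.pi * x[..., 0]) + 0.5 * x[..., 1] ** 2 + 0.05 * x[..., 2]

n_stars, n_pars = 50, 3
h_grid = np.arange(0.1, 0.6, 0.1)    # perturbation sizes (fraction of unit range)
centers = rng.uniform(0, 1, size=(n_stars, n_pars))

ivars = np.zeros(n_pars)
for i in range(n_pars):
    gamma = []
    for h in h_grid:
        xp = centers.copy()
        xp[:, i] = np.clip(xp[:, i] + h, 0, 1)
        d = model(xp) - model(centers)
        gamma.append(0.5 * np.mean(d ** 2))   # directional variogram estimate
    ivars[i] = np.trapz(gamma, h_grid)        # integrate variogram over scales
print("IVARS-like sensitivity per parameter:", np.round(ivars, 4))
```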
Model-Driven Useware Engineering
NASA Astrophysics Data System (ADS)
Meixner, Gerrit; Seissler, Marc; Breiner, Kai
User-oriented hardware and software development relies on a systematic development process based on a comprehensive analysis focusing on the users' requirements and preferences. Such a development process calls for the integration of numerous disciplines, from psychology and ergonomics to computer sciences and mechanical engineering. Hence, a correspondingly interdisciplinary team must be equipped with suitable software tools to allow it to handle the complexity of a multimodal and multi-device user interface development approach. An abstract, model-based development approach seems to be adequate for handling this complexity. This approach comprises different levels of abstraction requiring adequate tool support. Thus, in this chapter, we present the current state of our model-based software tool chain. We introduce the use model as the core model of our model-based process, transformation processes, and a model-based architecture, and we present different software tools that provide support for creating and maintaining the models or performing the necessary model transformations.
The Mission Planning Lab: A Visualization and Analysis Tool
NASA Technical Reports Server (NTRS)
Daugherty, Sarah C.; Cervantes, Benjamin W.
2009-01-01
Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called The Mission Planning Lab (MPL).
S.T.A. Pickett; M.L. Cadenasso; J.M. Grove
2004-01-01
Urban designers, ecologists, and social scientists have called for closer links among their disciplines. We examine a promising new tool for promoting this linkage: the metaphor of "cities of resilience." To put this tool to best use, we indicate how metaphor fits with other conceptual tools in science. We then present the two opposing definitions of...
Seismic hazard assessment of Oregon highway truck routes.
DOT National Transportation Integrated Search
2012-06-01
This research project developed a seismic risk assessment model along the major truck routes in Oregon. The study adopted federally developed software tools called Risk for Earthquake Damage to Roadway Systems (REDARS2) and HAZUS-MH. The model ...
A new approach for developing adjoint models
NASA Astrophysics Data System (ADS)
Farrell, P. E.; Funke, S. W.
2011-12-01
Many data assimilation algorithms rely on the availability of gradients of misfit functionals, which can be efficiently computed with adjoint models. However, the development of an adjoint model for a complex geophysical code is generally very difficult. Algorithmic differentiation (AD, also called automatic differentiation) offers one strategy for simplifying this task: it takes the abstraction that a model is a sequence of primitive instructions, each of which may be differentiated in turn. While extremely successful, this low-level abstraction runs into time-consuming difficulties when applied to the whole codebase of a model, such as differentiating through linear solves, model I/O, calls to external libraries, language features that are unsupported by the AD tool, and the use of multiple programming languages. While these difficulties can be overcome, it requires a large amount of technical expertise and an intimate familiarity with both the AD tool and the model. An alternative to applying the AD tool to the whole codebase is to assemble the discrete adjoint equations and use these to compute the necessary gradients. With this approach, the AD tool must be applied to the nonlinear assembly operators, which are typically small, self-contained units of the codebase. The disadvantage of this approach is that the assembly of the discrete adjoint equations is still very difficult to perform correctly, especially for complex multiphysics models that perform temporal integration; as it stands, this approach is as difficult and time-consuming as applying AD to the whole model. In this work, we have developed a library which greatly simplifies and automates the alternate approach of assembling the discrete adjoint equations. We propose a complementary, higher-level abstraction to that of AD: that a model is a sequence of linear solves. The developer annotates model source code with library calls that build a 'tape' of the operators involved and their dependencies, and supplies callbacks to compute the action of these operators. The library, called libadjoint, is then capable of symbolically manipulating the forward annotation to automatically assemble the adjoint equations. Libadjoint is open source, and is explicitly designed to be bolted-on to an existing discrete model. It can be applied to any discretisation, steady or time-dependent problems, and both linear and nonlinear systems. Using libadjoint has several advantages. It requires the application of an AD tool only to small pieces of code, making the use of AD far more tractable. As libadjoint derives the adjoint equations, the expertise required to develop an adjoint model is greatly diminished. One major advantage of this approach is that the model developer is freed from implementing complex checkpointing strategies for the adjoint model: libadjoint has sufficient information about the forward model to re-play the entire forward solve when necessary, and thus the checkpointing algorithm can be implemented entirely within the library itself. Examples are shown using the Fluidity/ICOM framework, a complex ocean model under development at Imperial College London.
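To illustrate the "model as a sequence of linear solves" abstraction, the sketch below records two forward linear solves, assembles the adjoint by solving the transposed systems in reverse order, and checks the resulting gradient against finite differences; the operators, functional, and dimensions are arbitrary, and this is not libadjoint's API.

```python
# Minimal sketch of the abstraction described above: treat the forward model as a
# sequence of linear solves and assemble the adjoint by solving the transposed
# systems in reverse order (the idea only, not libadjoint itself).
import numpy as np

rng = np.random.default_rng(3)
n = 5
A1 = rng.normal(size=(n, n)) + 5 * np.eye(n)   # operators of the two forward solves
A2 = rng.normal(size=(n, n)) + 5 * np.eye(n)
B = rng.normal(size=(n, n))                    # coupling between the two solves
c = rng.normal(size=n)                         # defines the functional J = c . x2
b = rng.normal(size=n)                         # source/control we differentiate against

# Forward model: two linear solves recorded on a (conceptual) tape
x1 = np.linalg.solve(A1, b)
x2 = np.linalg.solve(A2, B @ x1)
J = c @ x2

# Adjoint model: transposed solves, replayed in reverse through the tape
lam2 = np.linalg.solve(A2.T, c)
lam1 = np.linalg.solve(A1.T, B.T @ lam2)
grad_adj = lam1                                # dJ/db

# Check against a finite-difference gradient
eps, grad_fd = 1e-6, np.zeros(n)
for k in range(n):
    bp = b.copy()
    bp[k] += eps
    grad_fd[k] = (c @ np.linalg.solve(A2, B @ np.linalg.solve(A1, bp)) - J) / eps
print("max abs difference adjoint vs FD:", np.abs(grad_adj - grad_fd).max())
```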
On the design of computer-based models for integrated environmental science.
McIntosh, Brian S; Jeffrey, Paul; Lemon, Mark; Winder, Nick
2005-06-01
The current research agenda in environmental science is dominated by calls to integrate science and policy to better understand and manage links between social (human) and natural (nonhuman) processes. Freshwater resource management is one area where such calls can be heard. Designing computer-based models for integrated environmental science poses special challenges to the research community. At present it is not clear whether such tools, or their outputs, receive much practical policy or planning application. It is argued that this is a result of (1) a lack of appreciation within the research modeling community of the characteristics of different decision-making processes, including policy, planning, and participation, (2) a lack of appreciation of the characteristics of different decision-making contexts, (3) the technical difficulties in implementing the necessary support tool functionality, and (4) the socio-technical demands of designing tools to be of practical use. This article presents a critical synthesis of ideas from each of these areas and interprets them in terms of design requirements for computer-based models being developed to provide scientific information support for policy and planning. Illustrative examples are given from the field of freshwater resources management. Although computer-based diagramming and modeling tools can facilitate processes of dialogue, they lack adequate simulation capabilities. Component-based models and modeling frameworks provide such functionality and may be suited to supporting problematic or messy decision contexts. However, significant technical (implementation) and socio-technical (use) challenges need to be addressed before such ambition can be realized.
Operator function modeling: An approach to cognitive task analysis in supervisory control systems
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1987-01-01
In a study of models of operators in complex, automated space systems, an operator function model (OFM) methodology was extended to represent cognitive as well as manual operator activities. Development continued on a software tool called OFMdraw, which facilitates construction of an OFM by permitting construction of a heterarchic network of nodes and arcs. Emphasis was placed on development of OFMspert, an expert system designed both to model human operation and to assist real human operators. The system uses a blackboard method of problem solving to make an on-line representation of operator intentions, called ACTIN (actions interpreter).
Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform
Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong
2016-01-01
We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features. PMID:27304979
Modeling the Pulse Signal by Wave-Shape Function and Analyzing by Synchrosqueezing Transform.
Wu, Hau-Tieng; Wu, Han-Kuei; Wang, Chun-Li; Yang, Yueh-Lung; Wu, Wen-Hsiang; Tsai, Tung-Hu; Chang, Hen-Hong
2016-01-01
We apply the recently developed adaptive non-harmonic model based on the wave-shape function, as well as the time-frequency analysis tool called synchrosqueezing transform (SST) to model and analyze oscillatory physiological signals. To demonstrate how the model and algorithm work, we apply them to study the pulse wave signal. By extracting features called the spectral pulse signature, and based on functional regression, we characterize the hemodynamics from the radial pulse wave signals recorded by the sphygmomanometer. Analysis results suggest the potential of the proposed signal processing approach to extract health-related hemodynamics features.
The comparative risk assessment framework and tools (CRAFT)
Southern Research Station USDA Forest Service
2010-01-01
To help address these challenges, the USDA Forest Service's Eastern Forest Environmental Threat Assessment Center (EFETAC) and the University of North Carolina Asheville's National Environmental Modeling and Analysis Center (NEMAC) designed a planning framework, called the Comparative Risk Assessment Framework and Tools (CRAFT). CRAFT is...
A front-end automation tool supporting design, verification and reuse of SOC.
Yan, Xiao-lang; Yu, Long-li; Wang, Jie-bing
2004-09-01
This paper describes an in-house developed language tool called VPerl used in developing a 250 MHz 32-bit high-performance low power embedded CPU core. The authors showed that use of this tool can compress the Verilog code by more than a factor of 5, increase the efficiency of the front-end design, and reduce the bug rate significantly. This tool can be used to enhance the reusability of an intellectual property model and to facilitate porting a design to different platforms.
ERIC Educational Resources Information Center
Stratford, Steven J.; Krajcik, Joseph; Soloway, Elliot
This paper presents the results of a study of the cognitive strategies in which ninth-grade science students engaged as they used a learner-centered dynamic modeling tool (called Model-It) to make original models based upon stream ecosystem scenarios. The research questions were: (1) In what Cognitive Strategies for Modeling (analyzing, reasoning,…
USDA-ARS?s Scientific Manuscript database
Greenhouse gas (GHG) emissions and their potential impact on the environment has become an important national and international concern. Animal agriculture is a recognized source of GHG emissions, but good information does not exist on the net emissions from our farms. A software tool called the Dai...
ERIC Educational Resources Information Center
Smith, Philip A.; Webb, Geoffrey I.
2000-01-01
Describes "Glass-box Interpreter" a low-level program visualization tool called Bradman designed to provide a conceptual model of C program execution for novice programmers and makes visible aspects of the programming process normally hidden from the user. Presents an experiment that tests the efficacy of Bradman, and provides…
Coordinating complex decision support activities across distributed applications
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1994-01-01
Knowledge-based technologies have been applied successfully to automate planning and scheduling in many problem domains. Automation of decision support can be increased further by integrating task-specific applications with supporting database systems, and by coordinating interactions between such tools to facilitate collaborative activities. Unfortunately, the technical obstacles that must be overcome to achieve this vision of transparent, cooperative problem-solving are daunting. Intelligent decision support tools are typically developed for standalone use, rely on incompatible, task-specific representational models and application programming interfaces (API's), and run on heterogeneous computing platforms. Getting such applications to interact freely calls for platform-independent capabilities for distributed communication, as well as tools for mapping information across disparate representations. Symbiotics is developing a layered set of software tools (called NetWorks!) for integrating and coordinating heterogeneous distributed applications. The top layer of tools consists of an extensible set of generic, programmable coordination services. Developers access these services via high-level API's to implement the desired interactions between distributed applications.
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was recently built using the OpenMDAO framework. This tool uses equilibrium chemistry based thermodynamics, and provides analytic derivatives. This allows for stable and efficient use of gradient-based optimization and sensitivity analysis methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a multi-point turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
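The sketch below illustrates, in generic SciPy terms rather than the Pycycle/OpenMDAO API, why analytic derivatives help: supplying an exact gradient to a gradient-based optimizer typically needs fewer, more stable function evaluations than letting it approximate gradients by finite differences. The two-variable "cycle metric" is a stand-in function, not an engine model.

```python
# Hedged illustration of gradient-based optimisation with analytic derivatives
# (generic SciPy usage, not the Pycycle/OpenMDAO API).
import numpy as np
from scipy.optimize import minimize

def cycle_metric(x):
    # Stand-in "cycle model": a smooth function of two design variables (assumption)
    return (x[0] - 1.5) ** 2 + 10 * (x[1] - x[0] ** 2) ** 2

def cycle_metric_grad(x):
    # Analytic derivatives of the stand-in model
    return np.array([2 * (x[0] - 1.5) - 40 * x[0] * (x[1] - x[0] ** 2),
                     20 * (x[1] - x[0] ** 2)])

x0 = np.array([0.0, 0.0])
with_grad = minimize(cycle_metric, x0, jac=cycle_metric_grad, method="BFGS")
fd_only   = minimize(cycle_metric, x0, method="BFGS")   # gradients by finite differences
print("analytic gradient:", with_grad.nfev, "function evals ->", with_grad.x)
print("finite differences:", fd_only.nfev, "function evals ->", fd_only.x)
```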
Using TI-Nspire in a Modelling Teacher's Training Course
ERIC Educational Resources Information Center
Flores, Ángel Homero; Gómez, Adriana; Chávez, Xochitl
2015-01-01
Using Mathematical Modelling has become a useful tool in teaching-learning mathematics at all levels. This is so because mathematical objects are seen from their very applications, giving them meaning from the beginning. In this paper we present some details on the development of a teacher's training course called Modelling in the Teaching of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahmadi, Rouhollah, E-mail: rouhollahahmadi@yahoo.com; Khamehchi, Ehsan
Conditioning stochastic simulations is very important in many geostatistical applications that call for the introduction of nonlinear and multiple-point data in reservoir modeling. Here, a new methodology is proposed for the incorporation of different data types into multiple-point statistics (MPS) simulation frameworks. Unlike previous techniques that call for an approximate forward model (filter) for integration of secondary data into geologically constructed models, the proposed approach develops an intermediate space onto which all the primary and secondary data are easily mapped. Definition of the intermediate space, as may be achieved via application of artificial intelligence tools like neural networks and fuzzy inference systems, eliminates the need for using filters as in previous techniques. The applicability of the proposed approach in conditioning MPS simulations to static and geologic data is verified by modeling a real example of discrete fracture networks using conventional well-log data. The training patterns are well reproduced in the realizations, while the model is also consistent with the map of secondary data.
Coordinating complex problem-solving among distributed intelligent agents
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1992-01-01
A process-oriented control model is described for distributed problem solving. The model coordinates the transfer and manipulation of information across independent networked applications, both intelligent and conventional. The model was implemented using SOCIAL, a set of object-oriented tools for distributed computing. Complex sequences of distributed tasks are specified in terms of high-level scripts. Scripts are executed by SOCIAL objects called Manager Agents, which realize an intelligent coordination model that routes individual tasks to suitable server applications across the network. These tools are illustrated in a prototype distributed system for decision support of ground operations for NASA's Space Shuttle fleet.
NASA Technical Reports Server (NTRS)
Keller, Richard M.
1991-01-01
The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.
HexSim version 2.0 is soon to be released by EPA's Western Ecology Division (WED). More than three years of work have gone into the development of this tool, which grew out of an EPA model called PATCH. HexSim makes it possible for non-programmers to develop sophisticated simula...
Metamodeling Techniques to Aid in the Aggregation Process of Large Hierarchical Simulation Models
2008-08-01
[Diagram residue: campaign-level model and outputs; aggregation; metamodeling; complexity (spatial, temporal, etc.).] ...Techniques that reduce the variance of simulation outputs are called variance reduction techniques (VRT) [Law, 2006]. The implementation of some type of VRT can prove to be a very valuable tool
Iterating between Tools to Create and Edit Visualizations.
Bigelow, Alex; Drucker, Steven; Fisher, Danyel; Meyer, Miriah
2017-01-01
A common workflow for visualization designers begins with a generative tool, like D3 or Processing, to create the initial visualization; and proceeds to a drawing tool, like Adobe Illustrator or Inkscape, for editing and cleaning. Unfortunately, this is typically a one-way process: once a visualization is exported from the generative tool into a drawing tool, it is difficult to make further, data-driven changes. In this paper, we propose a bridge model to allow designers to bring their work back from the drawing tool to re-edit in the generative tool. Our key insight is to recast this iteration challenge as a merge problem - similar to when two people are editing a document and changes between them need to be reconciled. We also present a specific instantiation of this model, a tool called Hanpuku, which bridges between D3 scripts and Illustrator. We show several examples of visualizations that are iteratively created using Hanpuku in order to illustrate the flexibility of the approach. We further describe several hypothetical tools that bridge between other visualization tools to emphasize the generality of the model.
Verus: A Tool for Quantitative Analysis of Finite-State Real-Time Systems.
1996-08-12
Symbolic model checking is a technique for verifying finite-state concurrent systems that has been extended to handle real-time systems. Models with... up to 10^30 states can often be verified in minutes. In this paper, we present a new tool to analyze real-time systems, based on this technique... We have designed a language, called Verus, for the description of real-time systems. Such a description is compiled into a state-transition graph and
Teaching the Teacher: Tutoring SimStudent Leads to More Effective Cognitive Tutor Authoring
ERIC Educational Resources Information Center
Matsuda, Noboru; Cohen, William W.; Koedinger, Kenneth R.
2015-01-01
SimStudent is a machine-learning agent initially developed to help novice authors to create cognitive tutors without heavy programming. Integrated into an existing suite of software tools called Cognitive Tutor Authoring Tools (CTAT), SimStudent helps authors to create an expert model for a cognitive tutor by tutoring SimStudent on how to solve…
Vision Algorithms Catch Defects in Screen Displays
NASA Technical Reports Server (NTRS)
2014-01-01
Andrew Watson, a senior scientist at Ames Research Center, developed a tool called the Spatial Standard Observer (SSO), which models human vision for use in robotic applications. Redmond, Washington-based Radiant Zemax LLC licensed the technology from NASA and combined it with its imaging colorimeter system, creating a powerful tool that high-volume manufacturers of flat-panel displays use to catch defects in screens.
Agent-based modeling as a tool for program design and evaluation.
Lawlor, Jennifer A; McGirr, Sara
2017-12-01
Recently, systems thinking and systems science approaches have gained popularity in the field of evaluation; however, there has been relatively little exploration of how evaluators could use quantitative tools to assist in the implementation of systems approaches therein. The purpose of this paper is to explore potential uses of one such quantitative tool, agent-based modeling, in evaluation practice. To this end, we define agent-based modeling and offer potential uses for it in typical evaluation activities, including: engaging stakeholders, selecting an intervention, modeling program theory, setting performance targets, and interpreting evaluation results. We provide demonstrative examples from published agent-based modeling efforts both inside and outside the field of evaluation for each of the evaluative activities discussed. We further describe potential pitfalls of this tool and offer cautions for evaluators who may choose to implement it in their practice. Finally, the article concludes with a discussion of the future of agent-based modeling in evaluation practice and a call for more formal exploration of this tool as well as other approaches to simulation modeling in the field. Copyright © 2017 Elsevier Ltd. All rights reserved.
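As a small, self-contained illustration of the kind of model an evaluator might build, the sketch below simulates program uptake among agents whose enrollment probability rises with the number of enrolled contacts; all parameter values and the network structure are hypothetical.

```python
# Toy agent-based model of program uptake (all parameters hypothetical), of the sort
# an evaluator might use to explore how peer influence could shape intervention reach.
import random

random.seed(42)
N_AGENTS, N_STEPS = 200, 30
BASE_RATE = 0.02          # chance an agent enrolls on their own each step (assumption)
PEER_WEIGHT = 0.15        # added chance per enrolled contact (assumption)
CONTACTS_PER_AGENT = 4

agents = [{"enrolled": False,
           "contacts": random.sample(range(N_AGENTS), CONTACTS_PER_AGENT)}
          for _ in range(N_AGENTS)]

history = []
for step in range(N_STEPS):
    for a in agents:
        if a["enrolled"]:
            continue
        peers = sum(agents[j]["enrolled"] for j in a["contacts"])
        if random.random() < BASE_RATE + PEER_WEIGHT * peers:
            a["enrolled"] = True
    history.append(sum(a["enrolled"] for a in agents))

print("enrolled per step:", history)
```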
Greased Lightning (GL-10) Performance Flight Research: Flight Data Report
NASA Technical Reports Server (NTRS)
McSwain, Robert G.; Glaab, Louis J.; Theodore, Colin R.; Rhew, Ray D. (Editor); North, David D. (Editor)
2017-01-01
Modern aircraft design methods have produced acceptable designs for large conventional aircraft performance. With revolutionary electronic propulsion technologies fueled by the growth in the small UAS (Unmanned Aerial Systems) industry, these same prediction models are being applied to new smaller, and experimental design concepts requiring a VTOL (Vertical Take Off and Landing) capability for ODM (On Demand Mobility). A 50% sub-scale GL-10 flight model was built and tested to demonstrate the transition from hover to forward flight utilizing DEP (Distributed Electric Propulsion)[1][2]. In 2016 plans were put in place to conduct performance flight testing on the 50% sub-scale GL-10 flight model to support a NASA project called DELIVER (Design Environment for Novel Vertical Lift Vehicles). DELIVER was investigating the feasibility of including smaller and more experimental aircraft configurations into a NASA design tool called NDARC (NASA Design and Analysis of Rotorcraft)[3]. This report covers the performance flight data collected during flight testing of the GL-10 50% sub-scale flight model conducted at Beaver Dam Airpark, VA. Overall the flight test data provides great insight into how well our existing conceptual design tools predict the performance of small scale experimental DEP concepts. Low fidelity conceptual design tools estimated the (L/D)max of the GL-10 50% sub-scale flight model to be 16. Experimentally measured (L/D)max for the GL-10 50% scale flight model was 7.2. The aerodynamic performance predicted versus measured highlights the complexity of wing and nacelle interactions which is not currently accounted for in existing low fidelity tools.
NASA Astrophysics Data System (ADS)
Farroni, Flavio; Lamberti, Raffaele; Mancinelli, Nicolò; Timpone, Francesco
2018-03-01
Tyres play a key role in ground vehicle dynamics because they are responsible for traction, braking and cornering. A proper tyre-road interaction model is essential for a useful and reliable vehicle dynamics model. In the last two decades Pacejka's Magic Formula (MF) has become a standard in the simulation field. This paper presents a tool, called TRIP-ID (Tyre Road Interaction Parameters IDentification), developed to characterize and identify, with a high degree of accuracy and reliability, MF micro-parameters from experimental data deriving from telemetry or from a test rig. The tool interactively guides the user through the identification process on the basis of strong diagnostic considerations about the experimental data, made evident by the tool itself. A motorsport application of the tool is shown as a case study.
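For orientation, the sketch below identifies the four macro-parameters of Pacejka's simplified Magic Formula from synthetic slip/force data with a generic least-squares fit; TRIP-ID works on the full micro-parameter set with interactive diagnostics, none of which is reproduced here, and the data are invented.

```python
# Sketch: fit the four macro-parameters of the simplified Magic Formula,
# y = D*sin(C*arctan(B*x - E*(B*x - arctan(B*x)))), to synthetic slip/force data.
import numpy as np
from scipy.optimize import curve_fit

def magic_formula(x, B, C, D, E):
    bx = B * x
    return D * np.sin(C * np.arctan(bx - E * (bx - np.arctan(bx))))

# Synthetic "telemetry": lateral force vs slip angle (rad) with measurement noise
rng = np.random.default_rng(4)
slip = np.linspace(-0.25, 0.25, 80)
true = magic_formula(slip, B=10.0, C=1.6, D=4000.0, E=0.5)
meas = true + rng.normal(scale=100.0, size=slip.size)

p0 = [8.0, 1.5, 3500.0, 0.0]                     # rough initial guess
params, _ = curve_fit(magic_formula, slip, meas, p0=p0)
print("identified B, C, D, E:", np.round(params, 2))
```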
Student Evaluation of CALL Tools during the Design Process
ERIC Educational Resources Information Center
Nesbitt, Dallas
2013-01-01
This article discusses the comparative effectiveness of student input at different times during the design of CALL tools for learning kanji, the Japanese characters of Chinese origin. The CALL software "package" consisted of tools to facilitate the writing, reading and practising of kanji characters in context. A pre-design questionnaire…
OTLA: A New Model for Online Teaching, Learning and Assessment in Higher Education
ERIC Educational Resources Information Center
Ghilay, Yaron; Ghilay, Ruth
2013-01-01
The study examined a new asynchronous model for online teaching, learning and assessment, called OTLA. It is designed for higher-education institutions and is based on LMS (Learning Management System) as well as other relevant IT tools. The new model includes six digital basic components: text, hypertext, text reading, lectures (voice/video),…
T. W. Appelboom; G. M. Chescheir; R. W. Skaggs; J. W. Gilliam; Devendra M. Amatya
2006-01-01
Watershed modeling has become an important tool for researchers with the high costs of water quality monitoring. When modeling nitrate transport within drainage networks, denitrification within the sediments needs to be accounted for. Birgand et al. developed an equation using a term called a mass transfer coefficient to mathematically describe sediment...
T.W. Appelboom; G.M. Chescheir; F. Birgand; R.W. Skaggs; J.W. Gilliam; D. Amatya
2010-01-01
Watershed modeling has become an important tool for researchers. Modeling nitrate transport within drainage networks requires quantifying the denitrification within the sediments in canals and streams. In a previous study, several of the authors developed an equation using a term called a mass transfer coefficient to mathematically describe sediment denitrification....
Acoustic/seismic signal propagation and sensor performance modeling
NASA Astrophysics Data System (ADS)
Wilson, D. Keith; Marlin, David H.; Mackay, Sean
2007-04-01
Performance, optimal employment, and interpretation of data from acoustic and seismic sensors depend strongly and in complex ways on the environment in which they operate. Software tools for guiding non-expert users of acoustic and seismic sensors are therefore much needed. However, such tools require that many individual components be constructed and correctly connected together. These components include the source signature and directionality, representation of the atmospheric and terrain environment, calculation of the signal propagation, characterization of the sensor response, and mimicking of the data processing at the sensor. Selection of an appropriate signal propagation model is particularly important, as there are significant trade-offs between output fidelity and computation speed. Attenuation of signal energy, random fading, and (for array systems) variations in wavefront angle-of-arrival should all be considered. Characterization of the complex operational environment is often the weak link in sensor modeling: important issues for acoustic and seismic modeling activities include the temporal/spatial resolution of the atmospheric data, knowledge of the surface and subsurface terrain properties, and representation of ambient background noise and vibrations. Design of software tools that address these challenges is illustrated with two examples: a detailed target-to-sensor calculation application called the Sensor Performance Evaluator for Battlefield Environments (SPEBE) and a GIS-embedded approach called Battlefield Terrain Reasoning and Awareness (BTRA).
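As a back-of-envelope illustration of one ingredient such tools must represent, the sketch below combines spherical-spreading loss with a linear atmospheric absorption term to estimate received level versus range; the source level and absorption coefficient are assumed values, not parameters from SPEBE or BTRA.

```python
# Simple acoustic propagation ingredient: geometric spreading plus absorption.
# The numbers are assumptions for illustration only.
import numpy as np

def transmission_loss_db(r_m, alpha_db_per_km=5.0, r_ref_m=1.0):
    """Spherical-spreading loss plus linear absorption, in dB re the reference range."""
    spreading = 20.0 * np.log10(r_m / r_ref_m)
    absorption = alpha_db_per_km * (r_m / 1000.0)
    return spreading + absorption

source_level_db = 110.0                       # assumed source level at 1 m
for r in (100, 500, 1000, 3000):
    rx = source_level_db - transmission_loss_db(r)
    print(f"range {r:5d} m -> received level ~ {rx:5.1f} dB")
```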
The National energy modeling system
NASA Astrophysics Data System (ADS)
The DOE uses a variety of energy and economic models to forecast energy supply and demand. It also uses a variety of more narrowly focussed analytical tools to examine energy policy options. For the purpose of the scope of this work, this set of models and analytical tools is called the National Energy Modeling System (NEMS). The NEMS is the result of many years of development of energy modeling and analysis tools, many of which were developed for different applications and under different assumptions. As such, NEMS is believed to be less than satisfactory in certain areas. For example, NEMS is difficult to keep updated and expensive to use. Various outputs are often difficult to reconcile. Products were not required to interface, but were designed to stand alone. Because different developers were involved, the inner workings of the NEMS are often not easily or fully understood. Even with these difficulties, however, NEMS comprises the best tools currently identified to deal with our global, national and regional energy modeling, and energy analysis needs.
Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education
NASA Astrophysics Data System (ADS)
Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki
The current paper presents browser-based multimedia-rich software tools and e-learning curriculum to support the design and modeling process of power electronics circuits and to explain sometimes rather sophisticated phenomena. Two projects will be discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU). It is a collaborative project between numerous EU universities and institutes to develop state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with participation of Japanese, European and Australian institutes, focuses especially on developing e-learning curriculum and interactive design and modeling tools, and furthermore on development of a virtual laboratory. Snapshots from these two projects will be presented.
Semantic Importance Sampling for Statistical Model Checking
2014-10-18
we implement SIS in a tool called osmosis and use it to verify a number of stochastic systems with rare events. Our results indicate that SIS reduces... Section 3 presents background definitions and concepts. Section 4 presents SIS, and Section 5 presents our tool osmosis. In Section 6, we present our experiments and results... [Fig. 5: Architecture of osmosis.]
THE AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT TOOL
A toolkit for distributed hydrologic modeling at multiple scales using a geographic information system is presented. This open-source, freely available software was developed through a collaborative endeavor involving two Universities and two government agencies. Called the Auto...
Collaborative Workspaces within Distributed Virtual Environments.
1996-12-01
such as a text document, a 3D model, or a captured image using a collaborative workspace called the InPerson Whiteboard. The Whiteboard contains a... commands for editing objects drawn on the screen. Finally, when the call is completed, the Whiteboard can be saved to a file for future use. IRIS Annotator... use, and a shared whiteboard that includes a number of multimedia annotation tools. Both systems are also mindful of bandwidth limitations and can
ISW-galaxy cross-correlation in K-mouflage
NASA Astrophysics Data System (ADS)
Benevento, G.; Bartolo, N.; Liguori, M.
2018-01-01
Cross-correlations between the cosmic microwave background and the galaxy distribution can probe the linear growth rate of cosmic structures, thus providing a powerful tool to investigate different Dark Energy and Modified Gravity models. We explore the possibility of using this observable to probe a particular class of Modified Gravity models, called K-mouflage.
The US EPA is developing an open and publically available software program called the Human Exposure Model (HEM) to provide near-field exposure information for Life Cycle Impact Assessments (LCIAs). Historically, LCIAs have often omitted impacts from near-field sources of exposur...
Representing climate, disturbance, and vegetation interactions in landscape models
Robert E. Keane; Donald McKenzie; Donald A. Falk; Erica A.H. Smithwick; Carol Miller; Lara-Karena B. Kellogg
2015-01-01
The prospect of rapidly changing climates over the next century calls for methods to predict their effects on myriad, interactive ecosystem processes. Spatially explicit models that simulate ecosystem dynamics at fine (plant, stand) to coarse (regional, global) scales are indispensable tools for meeting this challenge under a variety of possible futures. A special...
2015-06-01
...and tools, called model-integrated computing (MIC) [3], relies on the use of domain-specific modeling languages for creating models of the system to be... hence giving reflective capabilities to it. We have followed the MIC method here: we designed a domain-specific modeling language for modeling... are produced one-off and not for the mass market, the scope for price reduction based on market demands is non-existent. Processes to create...
Information visualisation based on graph models
NASA Astrophysics Data System (ADS)
Kasyanov, V. N.; Kasyanova, E. V.
2013-05-01
Information visualisation is a key component of support tools for many applications in science and engineering. A graph is an abstract structure that is widely used to model information for its visualisation. In this paper, we consider a practical and general graph formalism called hierarchical graphs and present the Higres and Visual Graph systems, which are aimed at supporting information visualisation on the basis of hierarchical graph models.
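To make the formalism concrete, here is a minimal, hypothetical sketch of a hierarchical graph structure in Python: ordinary nodes and edges plus nested fragments that group them. It illustrates the general idea only and is not the data model used by Higres or Visual Graph.

```python
# Minimal sketch of a hierarchical graph: ordinary nodes and edges, plus
# "fragments" that group nodes and may be nested inside one another.
# Illustrative only; not the Higres/Visual Graph data model.

class HierarchicalGraph:
    def __init__(self):
        self.nodes = set()
        self.edges = set()          # pairs (u, v)
        self.parent = {}            # node/fragment -> enclosing fragment
        self.fragments = {}         # fragment name -> set of members

    def add_node(self, n, fragment=None):
        self.nodes.add(n)
        if fragment is not None:
            self.fragments.setdefault(fragment, set()).add(n)
            self.parent[n] = fragment

    def add_edge(self, u, v):
        self.edges.add((u, v))

    def add_fragment(self, name, parent=None):
        self.fragments.setdefault(name, set())
        if parent is not None:
            self.fragments.setdefault(parent, set()).add(name)
            self.parent[name] = parent

    def members(self, fragment):
        """Members of a fragment, including those of nested fragments."""
        out = set()
        for m in self.fragments.get(fragment, ()):
            out.add(m)
            if m in self.fragments:
                out |= self.members(m)
        return out

g = HierarchicalGraph()
g.add_fragment("module_A")
g.add_fragment("submodule_A1", parent="module_A")
g.add_node("x", fragment="submodule_A1")
g.add_node("y", fragment="module_A")
g.add_edge("x", "y")
print(g.members("module_A"))   # submodule_A1, x, y (set order may vary)
```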
van den Akker, Jeroen; Mishne, Gilad; Zimmer, Anjali D; Zhou, Alicia Y
2018-04-17
Next generation sequencing (NGS) has become a common technology for clinical genetic tests. The quality of NGS calls varies widely and is influenced by features like reference sequence characteristics, read depth, and mapping accuracy. With recent advances in NGS technology and software tools, the majority of variants called using NGS alone are in fact accurate and reliable. However, a small subset of difficult-to-call variants still exists that requires orthogonal confirmation. For this reason, many clinical laboratories confirm NGS results using orthogonal technologies such as Sanger sequencing. Here, we report the development of a deterministic machine-learning-based model to differentiate between these two types of variant calls: those that do not require confirmation using an orthogonal technology (high confidence), and those that require additional quality testing (low confidence). This approach allows reliable NGS-based calling in a clinical setting by identifying the few important variant calls that require orthogonal confirmation. We developed and tested the model using a set of 7179 variants identified by a targeted NGS panel and re-tested by Sanger sequencing. The model incorporated several signals of sequence characteristics and call quality to determine if a variant was identified at high or low confidence. The model was tuned to eliminate false positives, defined as variants that were called by NGS but not confirmed by Sanger sequencing. The model achieved very high accuracy: 99.4% (95% confidence interval: +/- 0.03%). It categorized 92.2% (6622/7179) of the variants as high confidence, and 100% of these were confirmed to be present by Sanger sequencing. Among the variants that were categorized as low confidence, defined as NGS calls of low quality that are likely to be artifacts, 92.1% (513/557) were found to be not present by Sanger sequencing. This work shows that NGS data contains sufficient characteristics for a machine-learning-based model to differentiate low from high confidence variants. Additionally, it reveals the importance of incorporating site-specific features as well as variant call features in such a model.
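As a rough illustration of this kind of model, the following Python sketch trains a classifier on synthetic call-quality features and applies a strict probability cutoff so that only high-confidence calls skip orthogonal confirmation. The features, thresholds, and training data are invented for the example and are not those of the published model.

```python
# Illustrative sketch of a confidence classifier for NGS variant calls.
# Feature names, labels, and the cutoff are hypothetical; the published
# model, its features, and its tuning are described in the paper itself.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for call-quality features: read depth, variant
# allele fraction, mapping quality, and a difficult-context flag.
n = 2000
X = np.column_stack([
    rng.integers(5, 400, n),        # read depth
    rng.uniform(0.05, 1.0, n),      # allele fraction
    rng.uniform(10, 60, n),         # mapping quality
    rng.integers(0, 2, n),          # difficult sequence context (0/1)
])
# Synthetic label: 1 = confirmed by orthogonal testing, 0 = not confirmed.
y = ((X[:, 0] > 30) & (X[:, 1] > 0.2) & (X[:, 2] > 30)).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# To avoid false positives, only calls above a strict probability cutoff
# are labelled "high confidence"; everything else goes to confirmation.
proba = clf.predict_proba(X)[:, 1]
high_confidence = proba > 0.99
print(f"{high_confidence.mean():.1%} of calls skip orthogonal confirmation")
```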
Determining the spatial altitude of the hydraulic fractures.
NASA Astrophysics Data System (ADS)
Khamiev, Marsel; Kosarev, Victor; Goncharova, Galina
2016-04-01
Mathematical modeling and numerical simulation are among the most widely used approaches for solving geological problems; they rely on software tools based on the Monte Carlo method. The results of this project show the possibility of using a PNL tool to determine fracture location. The modeled medium is a homogeneous rock (limestone) cut by a vertical borehole (d = 216 mm) with a 9 mm thick metal casing. The cement sheath is 35 mm thick. The borehole is filled with fresh water. The rock mass is cut by a crack filled with a mixture of doped (gadolinium oxide, Gd2O3) proppant (75%) and water (25%). A pulse neutron logging (PNL) tool is used for quality control in hydraulic fracturing operations. It includes a fast neutron source (a so-called "neutron generator") and a set of thermal (or epithermal) neutron-sensing devices forming the so-called near (ND) and far (FD) detectors. To evaluate the neutron properties of various segments (sectors) of the rock mass, the detector must register only neutrons that come from that particular formation. This is possible if the detecting block includes several (for example, 6) thermal neutron detectors arranged circumferentially inside the tool. As a result we obtain several independent well logs, each corresponding to a particular rock sector. After processing these synthetic logs we can determine the spatial position of the hydraulic fracture.
SBML-PET: a Systems Biology Markup Language-based parameter estimation tool.
Zi, Zhike; Klipp, Edda
2006-11-01
The estimation of model parameters from experimental data remains a bottleneck for a major breakthrough in systems biology. We present a Systems Biology Markup Language (SBML) based Parameter Estimation Tool (SBML-PET). The tool is designed to enable parameter estimation for biological models including signaling pathways, gene regulation networks and metabolic pathways. SBML-PET supports import and export of the models in the SBML format. It can estimate the parameters by fitting a variety of experimental data from different experimental conditions. SBML-PET has a unique feature of supporting event definition in the SBML model. SBML models can also be simulated in SBML-PET. Stochastic Ranking Evolution Strategy (SRES) is incorporated in SBML-PET for parameter estimation jobs. A classic ODE solver called ODEPACK is used to solve the Ordinary Differential Equation (ODE) system. Available at http://sysbio.molgen.mpg.de/SBML-PET/. The website also contains detailed documentation for SBML-PET.
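For readers unfamiliar with this workflow, the sketch below shows generic ODE-based parameter estimation in Python with SciPy rather than SBML-PET itself: a toy two-species pathway is integrated (SciPy's LSODA stands in for ODEPACK) and its rate constants are recovered from noisy synthetic data by least squares instead of SRES. The model and data are illustrative assumptions only.

```python
# Generic sketch of ODE-based parameter estimation, in the spirit of
# SBML-PET but not using it: a two-species pathway model is integrated
# and its parameters fitted to synthetic data by least squares.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

def pathway(t, y, k1, k2):
    s, p = y                        # substrate, product
    return [-k1 * s, k1 * s - k2 * p]

t_obs = np.linspace(0, 10, 20)
true_params = (0.8, 0.3)
sol = solve_ivp(pathway, (0, 10), [1.0, 0.0], t_eval=t_obs,
                args=true_params, method="LSODA")
data = sol.y + np.random.default_rng(1).normal(0, 0.01, sol.y.shape)

def residuals(params):
    fit = solve_ivp(pathway, (0, 10), [1.0, 0.0], t_eval=t_obs,
                    args=tuple(params), method="LSODA")
    return (fit.y - data).ravel()

est = least_squares(residuals, x0=[0.1, 0.1], bounds=(0, 10))
print("estimated k1, k2:", est.x)   # should be close to (0.8, 0.3)
```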
National Urban Database and Access Portal Tool
Based on the need for advanced treatments of high resolution urban morphological features (e.g., buildings, trees) in meteorological, dispersion, air quality and human exposure modeling systems for future urban applications, a new project was launched called the National Urban Da...
Probability and Statistics in Sensor Performance Modeling
2010-12-01
...language software program is called Environmental Awareness for Sensor and Emitter Employment (EASEE). Some important numerical issues in the implementation... Statistical analysis for measuring sensor performance... [The remainder of the extracted record is acronym-glossary residue (cdf, DST, EASEE).]
ACTG: novel peptide mapping onto gene models.
Choi, Seunghyuk; Kim, Hyunwoo; Paek, Eunok
2017-04-15
In many proteogenomic applications, mapping peptide sequences onto genome sequences can be very useful, because it allows us to understand the origins of the gene products. Existing software tools either take the genomic position of a peptide start site as an input or assume that the peptide sequence exactly matches the coding sequence of a given gene model. In the case of novel peptides resulting from genomic variations, especially structural variations such as alternative splicing, these existing tools cannot be directly applied unless users supply information about the variant, either its genomic position or its transcription model. Mapping potentially novel peptides to genome sequences, while allowing certain genomic variations, requires introducing novel gene models when aligning peptide sequences to gene structures. We have developed a new tool called ACTG (Amino aCids To Genome), which maps peptides to the genome, assuming all possible single exon skipping, junction variation allowing three edit distances from the original splice sites, exon extension and frame shift. In addition, it can also consider SNVs (single nucleotide variations) during the mapping phase if a user provides a VCF (variant call format) file as an input. Available at http://prix.hanyang.ac.kr/ACTG/search.jsp. Contact: eunokpaek@hanyang.ac.kr. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
Computerized Biomechanical Man-Model
1976-07-01
The COMputerized BIomechanical MAN-Model (called COMBIMAN) is a computer interactive graphics... the concept was to build a mock-up which permitted the designer to visualize the... The use of mock-ups for biomechanical evaluation has long been a tool... can become an obstacle to design change. At the Aerospace Medical Research Laboratory (Air Force Systems Command, Wright-Patterson AFB, Ohio), we are developing a computerized biomechanical man-model...
iTree-Hydro: Snow hydrology update for the urban forest hydrology model
Yang Yang; Theodore A. Endreny; David J. Nowak
2011-01-01
This article presents snow hydrology updates made to iTree-Hydro, previously called the Urban Forest Effects-Hydrology model. iTree-Hydro Version 1 was a warm climate model developed by the USDA Forest Service to provide a process-based planning tool with robust water quantity and quality predictions given data limitations common to most urban areas. Cold climate...
ERIC Educational Resources Information Center
Bidarra, José; Rusman, Ellen
2017-01-01
This paper proposes a design framework to support science education through blended learning, based on a participatory and interactive approach supported by ICT-based tools, called "Science Learning Activities Model" (SLAM). The development of this design framework started as a response to complex changes in society and education (e.g.…
Revel8or: Model Driven Capacity Planning Tool Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Liming; Liu, Yan; Bui, Ngoc B.
2007-05-31
Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8tor. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.
Occupational voice demands and their impact on the call-centre industry.
Hazlett, D E; Duffy, O M; Moorhead, S A
2009-04-20
Within the last decade there has been a growth in the call-centre industry in the UK, with a growing awareness of the voice as an important tool for successful communication. Occupational voice problems such as occupational dysphonia, in a business which relies on healthy, effective voice as the primary professional communication tool, may threaten working ability and occupational health and safety of workers. While previous studies of telephone call-agents have reported a range of voice symptoms and functional vocal health problems, there have been no studies investigating the use and impact of vocal performance in the communication industry within the UK. This study aims to address a significant gap in the evidence-base of occupational health and safety research. The objectives of the study are: 1. to investigate the work context and vocal communication demands for call-agents; 2. to evaluate call-agents' vocal health, awareness and performance; and 3. to identify key risks and training needs for employees and employers within call-centres. This is an occupational epidemiological study, which plans to recruit call-centres throughout the UK and Ireland. Data collection will consist of three components: 1. interviews with managers from each participating call-centre to assess their communication and training needs; 2. an online biopsychosocial questionnaire will be administered to investigate the work environment and vocal demands of call-agents; and 3. voice acoustic measurements of a random sample of participants using the Multi-dimensional Voice Program (MDVP). Qualitative content analysis from the interviews will identify underlying themes and issues. A multivariate analysis approach will be adopted using Structural Equation Modelling (SEM), to develop voice measurement models in determining the construct validity of potential factors contributing to occupational dysphonia. Quantitative data will be analysed using SPSS version 15. Ethical approval is granted for this study from the School of Communication, University of Ulster. The results from this study will provide the missing element of voice-based evidence, by appraising the interactional dimensions of vocal health and communicative performance. This information will be used to inform training for call-agents and to contribute to health policies within the workplace, in order to enhance vocal health.
NASA Astrophysics Data System (ADS)
Prettyman, T. H.; Gardner, R. P.; Verghese, K.
1993-08-01
A new specific purpose Monte Carlo code called McENL for modeling the time response of epithermal neutron lifetime tools is described. The weight windows technique, employing splitting and Russian roulette, is used with an automated importance function based on the solution of an adjoint diffusion model to improve the code efficiency. Complete composition and density correlated sampling is also included in the code, and can be used to study the effect on tool response of small variations in the formation, borehole, or logging tool composition and density. An illustration of the latter application is given for the density of a thermal neutron filter. McENL was benchmarked against test-pit data for the Mobil pulsed neutron porosity tool and was found to be very accurate. Results of the experimental validation and details of code performance are presented.
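The weight-window game that the splitting/Russian-roulette scheme plays can be summarized in a few lines. The sketch below is a generic illustration with made-up window bounds, not code from McENL.

```python
# Minimal sketch of the weight-window game (splitting / Russian roulette)
# used by variance-reduction schemes such as the one in McENL.  The
# window bounds and particle model here are illustrative only.
import random

def apply_weight_window(weight, w_low, w_high):
    """Return the list of surviving particle weights."""
    if weight > w_high:
        # Split: replace one heavy particle with n lighter copies.
        n = int(weight / w_high) + 1
        return [weight / n] * n
    if weight < w_low:
        # Russian roulette: survive with probability weight / w_survive,
        # so the expected weight is preserved.
        w_survive = 0.5 * (w_low + w_high)
        if random.random() < weight / w_survive:
            return [w_survive]
        return []                     # particle killed
    return [weight]                   # inside the window: unchanged

random.seed(0)
for w in (0.01, 0.2, 5.0):
    print(w, "->", apply_weight_window(w, w_low=0.1, w_high=1.0))
```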
77 FR 72830 - Request for Comments on Request for Continued Examination (RCE) Practice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-06
... the submission of written comments using a Web-based collaboration tool called IdeaScale®; and... collaboration tool called IdeaScale®. The tool allows users to post comments on a topic, and view and...
Stereo imaging with spaceborne radars
NASA Technical Reports Server (NTRS)
Leberl, F.; Kobrick, M.
1983-01-01
Stereo viewing is a valuable tool in photointerpretation and is used for the quantitative reconstruction of the three dimensional shape of a topographical surface. Stereo viewing refers to a visual perception of space by presenting an overlapping image pair to an observer so that a three dimensional model is formed in the brain. Some of the observer's function can be performed by machine correlation of the overlapping images - so-called automated stereo correlation. The direct perception of space with two eyes is often called natural binocular vision; the technique of generating three dimensional models of the surface from two sets of monocular image measurements is the topic of stereology.
Documenting AUTOGEN and APGEN Model Files
NASA Technical Reports Server (NTRS)
Gladden, Roy E.; Khanampompan, Teerapat; Fisher, Forest W.; DelGuericio, Chris C.
2008-01-01
A computer program called "autogen hypertext map generator" satisfies a need for documenting and assisting in visualization of, and navigation through, model files used in the AUTOGEN and APGEN software mentioned in the two immediately preceding articles. This program parses autogen script files, autogen model files, PERL scripts, and apgen activity-definition files and produces a hypertext map of the files to aid in the navigation of the model. This program also provides a facility for adding notes and descriptions, beyond what is in the source model represented by the hypertext map. Further, this program provides access to a summary of the model through variable, function, subroutine, activity and resource declarations as well as providing full access to the source model and source code. The use of the tool enables easy access to the declarations and the ability to traverse routines and calls while analyzing the model.
ERIC Educational Resources Information Center
Luan, Jing; Willett, Terrence
This paper discusses data mining--an end-to-end (ETE) data analysis tool that is used by researchers in higher education. It also relates data mining and other software programs to a brand new concept called "Knowledge Management." The paper culminates in the Tier Knowledge Management Model (TKMM), which seeks to provide a stable…
Miller, Brian W.; Morisette, Jeffrey T.
2014-01-01
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
Clossey, Laurene; Mehnert, Kevin; Silva, Sara
2011-11-01
This article describes an organizational development tool called appreciative inquiry (AI) and its use in mental health to aid agencies implementing recovery model services. AI is a discursive tool with the power to shift dominant organizational cultures. Its philosophical underpinnings emphasize values consistent with recovery: community, empowerment, and positive focus. Recent research in the field of mental health demonstrates the salience of organizational cultural context in affecting new service adoption. This article explores how AI could be helpful in shifting an organization's culture to render it compatible with recovery through descriptions of two mental health centers' use of the tool. The experiences described indicate that AI, if used consistently, empowers staff. The article concludes with consideration of the implications of this empowerment for recovery model implementation and directions for future research.
NASA Astrophysics Data System (ADS)
El Naqa, I.; Suneja, G.; Lindsay, P. E.; Hope, A. J.; Alaly, J. R.; Vicic, M.; Bradley, J. D.; Apte, A.; Deasy, J. O.
2006-11-01
Radiotherapy treatment outcome models are a complicated function of treatment, clinical and biological factors. Our objective is to provide clinicians and scientists with an accurate, flexible and user-friendly software tool to explore radiotherapy outcomes data and build statistical tumour control or normal tissue complication models. The software tool, called the dose response explorer system (DREES), is based on Matlab, and uses a named-field structure array data type. DREES/Matlab in combination with another open-source tool (CERR) provides an environment for analysing treatment outcomes. DREES provides many radiotherapy outcome modelling features, including (1) fitting of analytical normal tissue complication probability (NTCP) and tumour control probability (TCP) models, (2) combined modelling of multiple dose-volume variables (e.g., mean dose, max dose, etc.) and clinical factors (age, gender, stage, etc.) using multi-term regression modelling, (3) manual or automated selection of logistic or actuarial model variables using bootstrap statistical resampling, (4) estimation of uncertainty in model parameters, (5) performance assessment of univariate and multivariate analyses using Spearman's rank correlation and chi-square statistics, boxplots, nomograms, Kaplan-Meier survival plots, and receiver operating characteristic curves, and (6) graphical capabilities to visualize NTCP or TCP prediction versus selected variable models using various plots. DREES provides clinical researchers with a tool customized for radiotherapy outcome modelling. DREES is freely distributed. We expect to continue developing DREES based on user feedback.
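As a hedged illustration of one such modelling task, the Python sketch below fits a single-variable logistic NTCP curve (complication probability versus mean dose) by maximum likelihood on synthetic data. DREES itself is Matlab-based and supports far richer multi-variable and actuarial models; every number in this example is assumed for illustration.

```python
# Hedged sketch of one DREES-style task: fitting a logistic NTCP model
# (complication probability vs. mean dose) by maximum likelihood on a
# synthetic cohort.  Data and parameters are illustrative only.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(2)
dose = rng.uniform(10, 70, 200)                     # mean organ dose (Gy)
true_b0, true_b1 = -6.0, 0.12
outcome = rng.random(200) < expit(true_b0 + true_b1 * dose)  # 0/1 complication

def neg_log_likelihood(beta):
    p = expit(beta[0] + beta[1] * dose)
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(outcome * np.log(p) + (1 - outcome) * np.log(1 - p))

fit = minimize(neg_log_likelihood, x0=[0.0, 0.0], method="Nelder-Mead")
b0, b1 = fit.x
print(f"TD50 ~ {-b0 / b1:.1f} Gy")   # dose giving 50% complication risk
```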
Performance Support Tools: Delivering Value when and where It Is Needed
ERIC Educational Resources Information Center
McManus, Paul; Rossett, Allison
2006-01-01
Some call them Electronic Performance Support Systems (EPSSs). Others prefer Performance Support Tools (PSTs) or decision support tools. One might call EPSSs or PSTs job aids on steroids, technological tools that provide critical information or advice needed to move forward at a particular moment in time. Characteristic advantages of an EPSS or a…
NASA Astrophysics Data System (ADS)
Adesta, Erry Yulian T.; Riza, Muhammad; Avicena
2018-03-01
Tool wear prediction plays a significant role in the machining industry for proper planning and control of machining parameters and optimization of cutting conditions. This paper investigates the effect of two tool path strategies, contour-in and zigzag, on tool wear during the pocket milling process. The experiments were carried out on a CNC vertical machining centre using PVD-coated carbide inserts. Cutting speed, feed rate and depth of cut were varied. For an experiment with three factors at three levels, a Response Surface Method (RSM) design of experiment with a standard design called the Central Composite Design (CCD) was employed. The results indicate that tool wear increases significantly at the higher range of feed per tooth compared to cutting speed and depth of cut. This experimental result is then confirmed statistically by developing an empirical model. The prediction model for the tool wear response developed in this research for the contour-in strategy shows good agreement with the experimental work.
Automatic welding detection by an intelligent tool pipe inspection
NASA Astrophysics Data System (ADS)
Arizmendi, C. J.; Garcia, W. L.; Quintero, M. A.
2015-07-01
This work provides a model based on machine learning techniques for weld recognition, using signals obtained with an in-line inspection tool called a "smart pig" in Oil and Gas pipelines. The model includes a signal noise reduction phase based on pre-processing algorithms and attribute-selection techniques. The noise reduction techniques were selected after a literature review and testing with survey data. Subsequently, the model was trained using recognition and classification algorithms, specifically artificial neural networks and support vector machines. Finally, the trained model was validated with different data sets, and its performance was measured with cross validation and ROC analysis. The results show that it is possible to identify welds automatically with an efficiency between 90 and 98 percent.
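A rough sketch of the classification stage alone is shown below: an SVM trained on standardized signal features and scored with cross-validated ROC analysis, using scikit-learn and synthetic data. Feature extraction from the actual "smart pig" signals, and the neural-network branch of the study, are outside this sketch.

```python
# Conceptual sketch of the classification stage only: an SVM trained on
# pre-processed inspection-signal features, evaluated with cross
# validation and ROC analysis.  All data here are synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
# Synthetic feature vectors (stand-ins for windowed signal statistics)
# and labels (1 = weld present, 0 = plain pipe wall).
X = rng.normal(size=(600, 8))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.5, 600) > 0).astype(int)

# Standardize features, then classify with an RBF-kernel SVM.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print("cross-validated ROC AUC per fold:", np.round(auc, 3))
```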
NASA Astrophysics Data System (ADS)
Albano, Raffaele; Manfreda, Salvatore; Celano, Giuseppe
The paper introduces a minimalist water-driven crop model for sustainable irrigation management using an eco-hydrological approach. This model, called MY SIRR, uses a relatively small number of parameters and attempts to balance simplicity, accuracy, and robustness. MY SIRR is a quantitative tool for assessing water requirements and agricultural production across different climates, soil types, crops, and irrigation strategies. The MY SIRR source code is published under a copyleft license. The FOSS approach could lower the financial barriers for smallholders, especially in developing countries, to using tools for better decision-making on strategies for short- and long-term water resource management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Profile Interface Generator (PIG) is a tool for loosely coupling applications and performance tools. It enables applications to write code that looks like standard C and Fortran function calls, without requiring that applications link to specific implementations of those function calls. Performance tools can register with PIG in order to listen to only the calls that give information they care about. This interface reduces the build and configuration burden on application developers and allows semantic instrumentation to live in production codes without interfering with production runs.
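The loose-coupling pattern can be illustrated with a short conceptual sketch. The example below is written in Python for brevity (PIG itself targets C and Fortran codes) and uses hypothetical event and function names; it only shows the register/emit idea of tools listening to selected calls, not PIG's actual interface.

```python
# Conceptual illustration (in Python, for brevity) of the loose-coupling
# pattern described above: the application emits named events through
# plain function calls, and performance tools register to hear only the
# events they care about.  Names here are hypothetical.
_listeners = {}

def register(event, callback):
    _listeners.setdefault(event, []).append(callback)

def emit(event, **payload):
    # A no-op unless some tool registered for this event, so the
    # instrumentation can stay in production code.
    for cb in _listeners.get(event, []):
        cb(**payload)

# A performance tool subscribes only to timestep events.
register("timestep_end", lambda step, dt: print(f"step {step}: {dt:.3f} s"))

# Application code calls emit() like an ordinary function.
for step in range(3):
    emit("timestep_end", step=step, dt=0.016 * (step + 1))
    emit("io_flush", bytes_written=4096)   # ignored: nobody registered
```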
Integrated performance and reliability specification for digital avionics systems
NASA Technical Reports Server (NTRS)
Brehm, Eric W.; Goettge, Robert T.
1995-01-01
This paper describes an automated tool for performance and reliability assessment of digital avionics systems, called the Automated Design Tool Set (ADTS). ADTS is based on an integrated approach to design assessment that unifies traditional performance and reliability views of system designs, and that addresses interdependencies between performance and reliability behavior via exchange of parameters and results between mathematical models of each type. A multi-layer tool set architecture has been developed for ADTS that separates the concerns of system specification, model generation, and model solution. Performance and reliability models are generated automatically as a function of candidate system designs, and model results are expressed within the system specification. The layered approach helps deal with the inherent complexity of the design assessment process, and preserves long-term flexibility to accommodate a wide range of models and solution techniques within the tool set structure. ADTS research and development to date has focused on development of a language for specification of system designs as a basis for performance and reliability evaluation. A model generation and solution framework has also been developed for ADTS that will ultimately encompass an integrated set of analytic and simulation-based techniques for performance, reliability, and combined design assessment.
"Extreme Programming" in a Bioinformatics Class
ERIC Educational Resources Information Center
Kelley, Scott; Alger, Christianna; Deutschman, Douglas
2009-01-01
The importance of Bioinformatics tools and methodology in modern biological research underscores the need for robust and effective courses at the college level. This paper describes such a course designed on the principles of cooperative learning based on a computer software industry production model called "Extreme Programming" (EP).…
simBio: a Java package for the development of detailed cell models.
Sarai, Nobuaki; Matsuoka, Satoshi; Noma, Akinori
2006-01-01
Quantitative dynamic computer models, which integrate a variety of molecular functions into a cell model, provide a powerful tool to create and test working hypotheses. We have developed a new modeling tool, the simBio package (freely available from ), which can be used for constructing cell models, such as cardiac cells (the Kyoto model from Matsuoka et al., 2003, 2004 a, b, the LRd model from Faber and Rudy, 2000, and the Noble 98 model from Noble et al., 1998), epithelial cells (Strieter et al., 1990) and pancreatic beta cells (Magnus and Keizer, 1998). The simBio package is written in Java, uses XML and can solve ordinary differential equations. In an attempt to mimic biological functional structures, a cell model is, in simBio, composed of independent functional modules called Reactors, such as ion channels and the sarcoplasmic reticulum, and dynamic variables called Nodes, such as ion concentrations. The interactions between Reactors and Nodes are described by the graph theory and the resulting graph represents a blueprint of an intricate cellular system. Reactors are prepared in a hierarchical order, in analogy to the biological classification. Each Reactor can be composed or improved independently, and can easily be reused for different models. This way of building models, through the combination of various modules, is enabled through the use of object-oriented programming concepts. Thus, simBio is a straightforward system for the creation of a variety of cell models on a common database of functional modules.
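simBio itself is a Java package, so the following is only a compact Python paraphrase of the Reactor/Node idea described above: Nodes hold state variables such as ion concentrations, Reactors contribute rates of change to the Nodes they touch, and a simple integrator advances the whole graph. Class and variable names are invented for the illustration and do not reflect the simBio API.

```python
# Python paraphrase of the simBio Reactor/Node architecture (the real
# package is written in Java); names and numbers are illustrative only.
class Node:
    def __init__(self, name, value):
        self.name, self.value = name, value

class LeakChannel:                         # a minimal "Reactor"
    def __init__(self, conc_in, conc_out, rate):
        self.conc_in, self.conc_out, self.rate = conc_in, conc_out, rate

    def derivatives(self):
        # Flux proportional to the concentration difference across the membrane.
        flux = self.rate * (self.conc_out.value - self.conc_in.value)
        return {self.conc_in: +flux, self.conc_out: -flux}

def euler_step(reactors, dt):
    d = {}
    for r in reactors:                     # collect rates from all Reactors
        for node, rate in r.derivatives().items():
            d[node] = d.get(node, 0.0) + rate
    for node, rate in d.items():           # then update all Nodes
        node.value += dt * rate

na_in, na_out = Node("Na_i", 10.0), Node("Na_o", 140.0)
reactors = [LeakChannel(na_in, na_out, rate=0.01)]
for _ in range(100):
    euler_step(reactors, dt=0.1)
print(round(na_in.value, 2), round(na_out.value, 2))
```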
Automated support for experience-based software management
NASA Technical Reports Server (NTRS)
Valett, Jon D.
1992-01-01
To effectively manage a software development project, the software manager must have access to key information concerning a project's status. This information includes not only data relating to the project of interest, but also, the experience of past development efforts within the environment. This paper describes the concepts and functionality of a software management tool designed to provide this information. This tool, called the Software Management Environment (SME), enables the software manager to compare an ongoing development effort with previous efforts and with models of the 'typical' project within the environment, to predict future project status, to analyze a project's strengths and weaknesses, and to assess the project's quality. In order to provide these functions the tool utilizes a vast corporate memory that includes a data base of software metrics, a set of models and relationships that describe the software development environment, and a set of rules that capture other knowledge and experience of software managers within the environment. Integrating these major concepts into one software management tool, the SME is a model of the type of management tool needed for all software development organizations.
2014-04-25
...Leveraging SPARX EA's Java application programming interface (API), the team built a tool called OWL2EA that can ingest an OWL file and generate the corresponding UML... ObjectItemStructure specification shown in Figure 10. Running this script in the relational database server MySQL creates the physical schema that...
Awareness effects of a youth suicide prevention media campaign in Louisiana.
Jenner, Eric; Jenner, Lynne Woodward; Matthews-Sterling, Maya; Butts, Jessica K; Williams, Trina Evans
2010-08-01
Research on the efficacy of mediated suicide awareness campaigns is limited. The impacts of a state-wide media campaign on call volumes to a national hotline were analyzed to determine if the advertisements have raised awareness of the hotline. We use a quasi-experimental design to compare call volumes from ZIP codes where and when the campaign is active with those where and when the campaign is not active. Multilevel model estimates suggest that the campaign appears to have significantly and substantially increased calls to the hotline. Results from this study add evidence to the growing public health literature that suggests that mediated campaigns can be an effective tool for raising audience awareness.
NASA Astrophysics Data System (ADS)
Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad
2015-12-01
Vibration of the skull causes a hearing sensation; we call it Bone Conduction (BC) sound. There have been several investigations of the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is to segment CT medical images (DICOM) and to generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to build a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions and to solve the FE equations. The tool uses the PAK solver, the open-source software implemented in the SIFEM FP7 project, for calculation of the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to estimate how well the obtained results match experimental measurements.
Monte Carlo-based searching as a tool to study carbohydrate structure
USDA-ARS?s Scientific Manuscript database
A torsion angle-based Monte-Carlo searching routine was developed and applied to several carbohydrate modeling problems. The routine was developed as a Unix shell script that calls several programs, which allows it to be interfaced with multiple potential functions and various functions for evaluat...
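A generic torsion-angle Metropolis Monte Carlo search, in the spirit of the routine described above, can be sketched briefly. The energy function below is a purely hypothetical stand-in for the external potential functions the actual shell-script routine calls, and all parameters are illustrative.

```python
# Generic sketch of a torsion-angle Metropolis Monte Carlo search.  The
# toy energy function stands in for a call to an external potential
# program; the real routine is a Unix shell script driving such programs.
import math
import random

def energy(angles):
    # Hypothetical potential with a minimum at all torsions = 60 degrees.
    return sum(1.0 - math.cos(math.radians(a - 60.0)) for a in angles)

def mc_search(n_torsions=4, steps=5000, step_deg=30.0, kT=0.5, seed=0):
    rng = random.Random(seed)
    angles = [rng.uniform(-180, 180) for _ in range(n_torsions)]
    e = energy(angles)
    best = (e, list(angles))
    for _ in range(steps):
        i = rng.randrange(n_torsions)          # perturb one torsion angle
        trial = list(angles)
        trial[i] = (trial[i] + rng.uniform(-step_deg, step_deg) + 180) % 360 - 180
        e_trial = energy(trial)
        # Metropolis acceptance: always accept downhill, sometimes uphill.
        if e_trial < e or rng.random() < math.exp(-(e_trial - e) / kT):
            angles, e = trial, e_trial
            if e < best[0]:
                best = (e, list(angles))
    return best

e_min, conformer = mc_search()
print(round(e_min, 3), [round(a, 1) for a in conformer])
```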
Hypnosis in the Treatment of Alcoholism: A Theoretical Perspective.
ERIC Educational Resources Information Center
Steffenhagen, R. A.
1983-01-01
Reviews the history and theory of alcoholism and hypnosis and proposes a theoretical model of alcoholism based on self-esteem. Suggests that hypnosis may be an effective tool in the treatment of alcoholism with cure as the goal, and calls for more consistency in theory and practice. (JAC)
Mullinix, C.; Hearn, P.; Zhang, H.; Aguinaldo, J.
2009-01-01
Federal, State, and local water quality managers charged with restoring the Chesapeake Bay ecosystem require tools to maximize the impact of their limited resources. To address this need, the U.S. Geological Survey (USGS) and the Environmental Protection Agency's Chesapeake Bay Program (CBP) are developing a suite of Web-based tools called the Chesapeake Online Assessment Support Toolkit (COAST). The goal of COAST is to help CBP partners identify geographic areas where restoration activities would have the greatest effect, select the appropriate management strategies, and improve coordination and prioritization among partners. As part of the COAST suite of tools focused on environmental restoration, a water quality management visualization component called the Nutrient Yields Mapper (NYM) tool is being developed by USGS. The NYM tool is a web application that uses watershed yield estimates from USGS SPAtially Referenced Regressions On Watershed (SPARROW) attributes model (Schwarz et al., 2006) [6] to allow water quality managers to identify important sources of nitrogen and phosphorous within the Chesapeake Bay watershed. The NYM tool utilizes new open source technologies that have become popular in geospatial web development, including components such as OpenLayers and GeoServer. This paper presents examples of water quality data analysis based on nutrient type, source, yield, and area of interest using the NYM tool for the Chesapeake Bay watershed. In addition, we describe examples of map-based techniques for identifying high and low nutrient yield areas; web map engines; and data visualization and data management techniques.
2015-09-01
...shown have units of pF/m; this is the capacitance matrix for the 115-kV 3-phase circuit seen in Fig. 24. Fig. 29 shows the window that appears when one clicks "Calculate Lambdas"; these are the linear charge densities for the same 115-kV 3-phase circuit... calculate the capacitance matrix (Fig. 28). The diagonal entries are called the coefficients of capacitance, and the non-diagonal entries are called...
HRLSim: a high performance spiking neural network simulator for GPGPU clusters.
Minkovich, Kirill; Thibeault, Corey M; O'Brien, Michael John; Nogin, Aleksey; Cho, Youngkwan; Srinivasa, Narayan
2014-02-01
Modeling of large-scale spiking neural models is an important tool in the quest to understand brain function and subsequently create real-world applications. This paper describes a spiking neural network simulator environment called HRL Spiking Simulator (HRLSim). This simulator is suitable for implementation on a cluster of general purpose graphical processing units (GPGPUs). Novel aspects of HRLSim are described and an analysis of its performance is provided for various configurations of the cluster. With the advent of inexpensive GPGPU cards and compute power, HRLSim offers an affordable and scalable tool for design, real-time simulation, and analysis of large-scale spiking neural networks.
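To give a sense of the core computation such a simulator performs each time step, the following is a tiny CPU-only leaky integrate-and-fire sketch in NumPy. HRLSim itself distributes this work across GPGPU clusters, and nothing here reflects its actual API; all parameters are illustrative assumptions.

```python
# Tiny, CPU-only illustration of the per-step update in a spiking-network
# simulation (leaky integrate-and-fire neurons, random all-to-all weights).
import numpy as np

rng = np.random.default_rng(4)
n, dt, tau = 1000, 1e-3, 20e-3          # neurons, time step (s), membrane tau (s)
v_thresh, v_reset = 1.0, 0.0
weights = rng.normal(0.0, 0.02, (n, n)) # random synaptic weight matrix
v = rng.uniform(0.0, 1.0, n)            # initial membrane potentials

for step in range(100):
    spikes = v >= v_thresh              # which neurons fire this step
    v[spikes] = v_reset                 # reset the neurons that fired
    synaptic = weights @ spikes.astype(float)   # input from firing neurons
    external = rng.normal(0.05, 0.01, n)        # small external drive
    v += (dt / tau) * (-v) + synaptic + external
    if step % 20 == 0:
        print(f"step {step:3d}: {int(spikes.sum())} spikes")
```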
Deformable complex network for refining low-resolution X-ray structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Chong; Wang, Qinghua; Ma, Jianpeng, E-mail: jpma@bcm.edu
2015-10-27
A new refinement algorithm called the deformable complex network, which combines a novel angular network-based restraint with a deformable elastic network model in the target function, has been developed to aid structural refinement in macromolecular X-ray crystallography. In macromolecular X-ray crystallography, building more accurate atomic models based on lower resolution experimental diffraction data remains a great challenge. Previous studies have used a deformable elastic network (DEN) model to aid in low-resolution structural refinement. In this study, the development of a new refinement algorithm called the deformable complex network (DCN) is reported that combines a novel angular network-based restraint with the DEN model in the target function. Testing of DCN on a wide range of low-resolution structures demonstrated that it consistently leads to significantly improved structural models as judged by multiple refinement criteria, thus representing a new effective refinement tool for low-resolution structure determination.
Optimization of Turbine Engine Cycle Analysis with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Hearn, Tristan; Hendricks, Eric; Chin, Jeffrey; Gray, Justin; Moore, Kenneth T.
2016-01-01
A new engine cycle analysis tool, called Pycycle, was built using the OpenMDAO framework. Pycycle provides analytic derivatives allowing for an efficient use of gradient-based optimization methods on engine cycle models, without requiring the use of finite difference derivative approximation methods. To demonstrate this, a gradient-based design optimization was performed on a turbofan engine model. Results demonstrate very favorable performance compared to an optimization of an identical model using finite-difference approximated derivatives.
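The benefit of analytic derivatives can be illustrated with a plain SciPy optimization rather than Pycycle/OpenMDAO: supplying an exact gradient usually costs far fewer function evaluations than letting the optimizer approximate it by finite differences. The objective below is an arbitrary stand-in for a cycle model, not anything from Pycycle.

```python
# Hedged illustration of the analytic-derivative point using plain SciPy:
# the same BFGS optimization run with an exact gradient and with
# finite-difference gradients, comparing function-evaluation counts.
import numpy as np
from scipy.optimize import minimize

def f(x):                       # stand-in "cycle model" objective
    return (x[0] - 3.0) ** 2 + 10.0 * (x[1] + 1.0) ** 2 + x[0] * x[1]

def grad(x):                    # analytic derivatives of f
    return np.array([2.0 * (x[0] - 3.0) + x[1],
                     20.0 * (x[1] + 1.0) + x[0]])

x0 = np.zeros(2)
with_grad = minimize(f, x0, jac=grad, method="BFGS")
fd_only = minimize(f, x0, method="BFGS")   # gradient by finite differences
print("evaluations with analytic gradient:", with_grad.nfev)
print("evaluations with finite differences:", fd_only.nfev)
```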
MTK: An AI tool for model-based reasoning
NASA Technical Reports Server (NTRS)
Erickson, William K.; Rudokas, Mary R.
1988-01-01
A 1988 goal for the Systems Autonomy Demonstration Project Office of the NASA Ames Research Office is to apply model-based representation and reasoning techniques in a knowledge-based system that will provide monitoring, fault diagnosis, control, and trend analysis of the Space Station Thermal Control System (TCS). A number of issues raised during the development of the first prototype system inspired the design and construction of a model-based reasoning tool called MTK, which was used in the building of the second prototype. These issues are outlined here with examples from the thermal system to highlight the motivating factors behind them, followed by an overview of the capabilities of MTK, which was developed to address these issues in a generic fashion.
Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool
NASA Technical Reports Server (NTRS)
Maul, William A.; Fulton, Christopher E.
2011-01-01
This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.
2014-06-01
...from the ODM standard. Leveraging SPARX EA's Java application programming interface (API), the team built a tool called OWL2EA that can ingest an OWL... Running this script in the relational database server MySQL creates the physical schema that enables a user to store and retrieve data conforming to the vocabulary of the JC3IEDM.
Community Coordinated Modeling Center Support of Science Needs for Integrated Data Environment
NASA Technical Reports Server (NTRS)
Kuznetsova, M. M.; Hesse, M.; Rastatter, L.; Maddox, M.
2007-01-01
Space science models are an essential component of an integrated data environment. They are indispensable tools that facilitate effective use of a wide variety of distributed scientific sources and place multi-point local measurements into a global context. The Community Coordinated Modeling Center (CCMC) hosts a set of state-of-the-art space science models ranging from the solar atmosphere to the Earth's upper atmosphere. The majority of models residing at the CCMC are comprehensive, computationally intensive, physics-based models. To allow the models to be driven by data relevant to particular events, the CCMC developed an online data file generation tool that automatically downloads data from data providers and transforms them into the required format. The CCMC provides a tailored web-based visualization interface for the model output, as well as the capability to download simulation output in a portable standard format with comprehensive metadata, along with a user-friendly model-output analysis library of routines that can be called from any C-supporting language. The CCMC is developing data interpolation tools that make it possible to present model output in the same format as observations. The CCMC invites community comments and suggestions to better address science needs for the integrated data environment.
Drezen, Erwan; Guyet, Thomas; Happe, André
2018-02-01
Medico-administrative data like SNDS (Système National de Données de Santé) are not initially collected for epidemiological purposes. Moreover, the data model and the tools offered to SNDS users make their in-depth exploitation difficult. We propose a data model, called the ePEPS model, based on healthcare trajectories to provide a medical view of raw data. A data abstraction process enables the clinician to have an intuitive medical view of raw data and to design a study-specific view. This view is based on a generic model of the care trajectory, that is, a sequence of time-stamped medical events for a given patient. This model is combined with tools to manipulate care trajectories efficiently. © 2017 Société Française de Pharmacologie et de Thérapeutique.
Climate Model Diagnostic Analyzer Web Service System
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.
2013-12-01
The latest Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with newly available global observations. The traditional approach to climate model evaluation, which compares a single parameter at a time, identifies symptomatic model biases and errors but fails to diagnose the model problems. The model diagnosis process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally- and data-intensive. To address these challenges, we are developing a parallel, distributed web-service system that enables the physics-based multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation and (2) many of CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, and (4) the calculation of difference between two variables. A web user interface is chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA is planned to be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. The requirements of the educational tool are defined with the interaction with the school organizers, and CMDA is customized to meet the requirements accordingly. The tool needs to be production quality for 30+ simultaneous users. The summer school will thus serve as a valuable testbed for the tool development, preparing CMDA to serve the Earth-science modeling and model-analysis community at the end of the project. This work was funded by the NASA Earth Science Program called Computational Modeling Algorithms and Cyberinfrastructure (CMAC).
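A minimal sketch of the wrapping step described above, assuming a hypothetical endpoint name and payload, is shown below: an existing analysis routine (here a toy seasonal-mean function) is exposed as a web service with Flask. In production a WSGI server such as Gunicorn would host the app, as the abstract notes; nothing here reflects CMDA's actual API.

```python
# Minimal sketch of wrapping an existing analysis routine as a web
# service with Flask.  Endpoint name and payload format are hypothetical.
import numpy as np
from flask import Flask, jsonify, request

app = Flask(__name__)

def seasonal_mean(values, months, season=(6, 7, 8)):
    """Existing 'science code': mean of the values falling in given months."""
    values, months = np.asarray(values, float), np.asarray(months, int)
    return float(values[np.isin(months, season)].mean())

@app.route("/seasonal_mean", methods=["POST"])
def seasonal_mean_service():
    payload = request.get_json()
    result = seasonal_mean(payload["values"], payload["months"])
    return jsonify({"seasonal_mean": result})

if __name__ == "__main__":
    # For local testing only; a WSGI server (e.g., Gunicorn) would host
    # the app in production.
    app.run(port=8080)
```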
Karp, Peter D; Paley, Suzanne; Romero, Pedro
2002-01-01
Bioinformatics requires reusable software tools for creating model-organism databases (MODs). The Pathway Tools is a reusable, production-quality software environment for creating a type of MOD called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc (see http://ecocyc.org) integrates our evolving understanding of the genes, proteins, metabolic network, and genetic network of an organism. This paper provides an overview of the four main components of the Pathway Tools: The PathoLogic component supports creation of new PGDBs from the annotated genome of an organism. The Pathway/Genome Navigator provides query, visualization, and Web-publishing services for PGDBs. The Pathway/Genome Editors support interactive updating of PGDBs. The Pathway Tools ontology defines the schema of PGDBs. The Pathway Tools makes use of the Ocelot object database system for data management services for PGDBs. The Pathway Tools has been used to build PGDBs for 13 organisms within SRI and by external users.
Learning, remembering, and predicting how to use tools: Distributed neurocognitive mechanisms
Buxbaum, Laurel J.
2016-01-01
The reasoning-based approach championed by Francois Osiurak and Arnaud Badets (Osiurak & Badets, 2016) denies the existence of sensory-motor memories of tool use except in limited circumstances, and suggests instead that most tool use is subserved solely by online technical reasoning about tool properties. In this commentary, I highlight the strengths and limitations of the reasoning-based approach and review a number of lines of evidence that manipulation knowledge is in fact used in tool action tasks. In addition, I present a “two route” neurocognitive model of tool use called the “Two Action Systems Plus (2AS+)” framework that posits a complementary role for online and stored information and specifies the neurocognitive substrates of task-relevant action selection. This framework, unlike the reasoning based approach, has the potential to integrate the existing psychological and functional neuroanatomic data in the tool use domain. PMID:28358565
ERIC Educational Resources Information Center
Marshall, Neil; Buteau, Chantal
2014-01-01
As part of their undergraduate mathematics curriculum, students at Brock University learn to create and use computer-based tools with dynamic, visual interfaces, called Exploratory Objects, developed for the purpose of conducting pure or applied mathematical investigations. A student's Development Process Model of creating and using an Exploratory…
A New Paradigm for Intelligent Tutoring Systems: Example-Tracing Tutors
ERIC Educational Resources Information Center
Aleven, Vincent; McLaren, Bruce M.; Sewall, Jonathan; Koedinger, Kenneth R.
2009-01-01
The Cognitive Tutor Authoring Tools (CTAT) support creation of a novel type of tutors called example-tracing tutors. Unlike other types of ITSs (e.g., model-tracing tutors, constraint-based tutors), example-tracing tutors evaluate student behavior by flexibly comparing it against generalized examples of problem-solving behavior. Example-tracing…
ERIC Educational Resources Information Center
Phung, Dan; Valetto, Giuseppe; Kaiser, Gail E.; Liu, Tiecheng; Kender, John R.
2007-01-01
The increasing popularity of online courses has highlighted the need for collaborative learning tools for student groups. In this article, we present an e-Learning architecture and adaptation model called AI2TV (Adaptive Interactive Internet Team Video), which allows groups of students to collaboratively view instructional videos in synchrony.…
If You Build It, They Will Scan: Oxford University's Exploration of Community Collections
ERIC Educational Resources Information Center
Lee, Stuart D.; Lindsay, Kate
2009-01-01
Traditional large digitization projects demand massive resources from the central unit (library, museum, or university) that has acquired funding for them. Another model, enabled by easy access to cameras, scanners, and web tools, calls for public contributions to community collections of artifacts. In 2009, the University of Oxford ran a…
Facilitating Attuned Interactions: Using the FAN Approach to Family Engagement
ERIC Educational Resources Information Center
Gilkerson, Linda
2015-01-01
Erikson Institute's Fussy Baby Network® (FBN) is a national model prevention program known for its approach to family engagement called the FAN (Gilkerson & Gray, 2014; Gilkerson et al., 2012). The FAN is both a conceptual framework and a practical tool to facilitate attunement in helping relationships and promote reflective practice. This…
Strengthening the case for saproxylic arthropod conservation: a call for ecosystem services research
Michael Ulyshen
2013-01-01
While research on the ecosystem services provided by biodiversity is becoming widely embraced as an important tool in conservation, the services provided by saproxylic arthropods - an especially diverse and threatened assemblage dependent on dead or dying wood - remain unmeasured. A conceptual model depicting the reciprocal relationships between dead wood and...
Managing Change to a Quality Philosophy: A Partnership Perspective.
ERIC Educational Resources Information Center
Snyder, Karolyn J.; Acker-Hocevar, Michele
Within the past 5 years there has been an international movement to adapt the principles and practices of Total Quality Management work environments to school-restructuring agendas. This paper reports on the development of a model called the Educational Quality System, a benchmark assessment tool for identifying the essential elements of quality…
Answering the Call for Accountability: An Activity and Cost Analysis Case Study
ERIC Educational Resources Information Center
Carducci, Rozana; Kisker, Carrie B.; Chang, June; Schirmer, James
2007-01-01
This article summarizes the findings of a case study on the creation and application of an activity-based cost accounting model that links community college salary expenditures to mission-critical practices within academic divisions of a southern California community college. Although initially applied as a financial management tool in private…
An Investigation of Software Scaffolds Supporting Modeling Practices
NASA Astrophysics Data System (ADS)
Fretz, Eric B.; Wu, Hsin-Kai; Zhang, Baohui; Davis, Elizabeth A.; Krajcik, Joseph S.; Soloway, Elliot
2002-08-01
Modeling of complex systems and phenomena is of value in science learning and is increasingly emphasised as an important component of science teaching and learning. Modeling engages learners in desired pedagogical activities. These activities include practices such as planning, building, testing, analysing, and critiquing. Designing realistic models is a difficult task. Computer environments allow the creation of dynamic and even more complex models. One way of bringing the design of models within reach is through the use of scaffolds. Scaffolds are intentional assistance provided to learners from a variety of sources, allowing them to complete tasks that would otherwise be out of reach. Currently, our understanding of how scaffolds in software tools assist learners is incomplete. In this paper the scaffolds designed into a dynamic modeling software tool called Model-It are assessed in terms of their ability to support learners' use of modeling practices. Four pairs of middle school students were video-taped as they used the modeling software for three hours, spread over a two week time frame. Detailed analysis of coded videotape transcripts provided evidence of the importance of scaffolds in supporting the use of modeling practices. Learners used a variety of modeling practices, the majority of which occurred in conjunction with scaffolds. The use of three tool scaffolds was assessed as directly as possible, and these scaffolds were seen to support a variety of modeling practices. An argument is made for the continued empirical validation of types and instances of tool scaffolds, and further investigation of the important role of teacher and peer scaffolding in the use of scaffolded tools.
An effective automatic procedure for testing parameter identifiability of HIV/AIDS models.
Saccomani, Maria Pia
2011-08-01
Realistic HIV models tend to be rather complex and many recent models proposed in the literature could not yet be analyzed by traditional identifiability testing techniques. In this paper, we check a priori global identifiability of some of these nonlinear HIV models taken from the recent literature, by using a differential algebra algorithm based on previous work of the author. The algorithm is implemented in a software tool, called DAISY (Differential Algebra for Identifiability of SYstems), which has been recently released (DAISY is freely available on the web site http://www.dei.unipd.it/~pia/ ). The software can be used to automatically check global identifiability of (linear and) nonlinear models described by polynomial or rational differential equations, thus providing a general and reliable tool to test global identifiability of several HIV models proposed in the literature. It can be used by researchers with a minimum of mathematical background.
NASA Astrophysics Data System (ADS)
O'Neill, B. C.; Kauffman, B.; Lawrence, P.
2016-12-01
Integrated analysis of questions regarding land, water, and energy resources often requires integration of models of different types. One type of integration is between human and earth system models, since both societal and physical processes influence these resources. For example, human processes such as changes in population, economic conditions, and policies govern the demand for land, water and energy, while the interactions of these resources with physical systems determine their availability and environmental consequences. We have begun to develop and use a toolkit for linking human and earth system models called the Toolbox for Human-Earth System Integration and Scaling (THESIS). THESIS consists of models and software tools to translate, scale, and synthesize information from and between human system models and earth system models (ESMs), with initial application to linking the NCAR integrated assessment model, iPETS, with the NCAR earth system model, CESM. Initial development is focused on urban areas and agriculture, sectors that are explicitly represented in both CESM and iPETS. Tools are being made available to the community as they are completed (see https://www2.cgd.ucar.edu/sections/tss/iam/THESIS_tools). We discuss four general types of functions that THESIS tools serve (Spatial Distribution, Spatial Properties, Consistency, and Outcome Evaluation). Tools are designed to be modular and can be combined in order to carry out more complex analyses. We illustrate their application both to the exposure of population to climate extremes and to the evaluation of climate impacts on the agriculture sector. For example, projecting exposure to climate extremes involves the use of THESIS tools for spatial population, spatial urban land cover, the characteristics of both, and a tool to bring urban climate information together with spatial population information. Development of THESIS tools is continuing and open to the research community.
Optimization of Multi-Fidelity Computer Experiments via the EQIE Criterion
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Xu; Tuo, Rui; Jeff Wu, C. F.
2017-01-31
Computer experiments based on mathematical models are powerful tools for understanding physical processes. This article addresses the problem of kriging-based optimization for deterministic computer experiments with tunable accuracy. Our approach is to use multi-fidelity computer experiments with increasing accuracy levels and a nonstationary Gaussian process model. We propose an optimization scheme that sequentially adds new computer runs by following two criteria. The first criterion, called EQI, scores candidate inputs at a given level of accuracy, and the second criterion, called EQIE, scores candidate combinations of inputs and accuracy. In simulation results and a real example using finite element analysis, our method outperforms the expected improvement (EI) criterion, which is designed for single-accuracy experiments.
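For context, the criteria above are benchmarked against the classical expected improvement; its standard closed form under a Gaussian process posterior is given below as a reference formula (not the paper's EQI or EQIE definitions):

```latex
% Expected improvement at a candidate input x, for minimization, given a GP
% posterior with mean mu(x) and standard deviation sigma(x) and current best
% observed value y_min; Phi and varphi are the standard normal CDF and PDF.
\[
  \mathrm{EI}(x) \;=\; \mathrm{E}\bigl[\max\{\,y_{\min} - Y(x),\, 0\}\bigr]
  \;=\; \bigl(y_{\min} - \mu(x)\bigr)\,\Phi(z) \;+\; \sigma(x)\,\varphi(z),
  \qquad z = \frac{y_{\min} - \mu(x)}{\sigma(x)}.
\]
```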
Aeroservoelastic Modeling of Body Freedom Flutter for Control System Design
NASA Technical Reports Server (NTRS)
Ouellette, Jeffrey
2017-01-01
One of the most severe forms of coupling between aeroelasticity and flight dynamics is an instability called body freedom flutter. The existing tools often assume relatively weak coupling, and are therefore unable to accurately model body freedom flutter. Because the existing tools were developed from traditional flutter analysis models, the resulting models contain inconsistencies and are not compatible with control system design tools. To resolve these issues, a number of small but significant changes have been made to the existing approaches. A frequency domain transformation is used with the unsteady aerodynamics to ensure a more physically consistent stability-axis rational function approximation of the unsteady aerodynamic model. The aerodynamic model is augmented with additional terms to account for limitations of the baseline unsteady aerodynamic model and for the gravity forces. An assumed modes method is used for the structural model to ensure a consistent definition of the aircraft states across the flight envelope. The X-56A stiff-wing flight-test data were used to validate the current modeling approach. The flight-test data do not show body freedom flutter, but do show coupling between the flight dynamics and the aeroelastic dynamics, and the effects of the fuel weight.
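For readers unfamiliar with rational function approximations of unsteady aerodynamics, a commonly used form (Roger's approximation; the paper's stability-axis formulation may differ in detail) is:

```latex
% Q is the unsteady aerodynamic force matrix, p = s c/(2V) the nondimensional
% Laplace variable (c: reference chord, V: airspeed), A_j are fitted matrices,
% and b_j are fixed aerodynamic lag roots.
\[
  Q(p) \;\approx\; A_0 \;+\; A_1\,p \;+\; A_2\,p^{2}
  \;+\; \sum_{j=1}^{N} A_{j+2}\,\frac{p}{p + b_j}
\]
```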
2014-06-19
urgent and compelling. Recent efforts in this area automate program analysis techniques using model checking and symbolic execution [2, 5–7]. These...bounded model checking tool for x86 binary programs developed at the Air Force Institute of Technology (AFIT). Jiseki creates a bit-vector logic model based...assume there are n different paths through the function foo. The program could potentially call the function foo a bounded number of times, resulting in n
Standardized data sharing in a paediatric oncology research network--a proof-of-concept study.
Hochedlinger, Nina; Nitzlnader, Michael; Falgenhauer, Markus; Welte, Stefan; Hayn, Dieter; Koumakis, Lefteris; Potamias, George; Tsiknakis, Manolis; Saraceno, Davide; Rinaldi, Eugenia; Ladenstein, Ruth; Schreier, Günter
2015-01-01
Data that have been collected in the course of clinical trials are potentially valuable for additional scientific research questions in so-called secondary use scenarios. This is of particular importance in rare disease areas like paediatric oncology. If data from several research projects need to be connected, so-called Core Datasets can be used to define which information needs to be extracted from every involved source system. In this work, the utility of the Clinical Data Interchange Standards Consortium (CDISC) Operational Data Model (ODM) as a format for Core Datasets was evaluated, and a web tool was developed which received source ODM XML files and, via Extensible Stylesheet Language Transformations (XSLT), generated standardized Core Dataset ODM XML files. Using this tool, data from different source systems were extracted and pooled for joint analysis in a proof-of-concept study, facilitating both basic syntactic and semantic interoperability.
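A minimal sketch of the transformation step described above, using the lxml library; the file names are hypothetical placeholders, and the actual web tool and its stylesheets are not described beyond this abstract:

```python
# Apply an XSLT stylesheet to a source CDISC ODM XML export to produce a
# standardized Core Dataset ODM file (illustration only; file names assumed).
from lxml import etree

transform = etree.XSLT(etree.parse("core_dataset_mapping.xslt"))   # hypothetical stylesheet
source_odm = etree.parse("source_system_export_odm.xml")           # hypothetical source export

core_dataset = transform(source_odm)
with open("core_dataset_odm.xml", "wb") as f:
    f.write(etree.tostring(core_dataset, pretty_print=True,
                           xml_declaration=True, encoding="UTF-8"))
```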
Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing
NASA Technical Reports Server (NTRS)
Ordaz, Irian
2011-01-01
Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capability of VSP is demonstrated for component-based point definition geometries in a conceptual analysis and design framework.
Logistics Process Analysis Tool (LPAT)
DOE Office of Scientific and Technical Information (OSTI.GOV)
2008-03-31
LPAT is the integrated system resulting from the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, and the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.
Staccini, Pascal M.; Joubert, Michel; Quaranta, Jean-Francois; Fieschi, Marius
2002-01-01
Growing attention is being given to the use of process modeling methodology for user requirements elicitation. In the analysis phase of hospital information systems, the usefulness of care-process models has been investigated to evaluate their conceptual applicability and practical understandability by clinical staff and members of user teams. Nevertheless, there still remains a gap between users and analysts in their mutual ability to share conceptual views and vocabulary, keeping the meaning of the clinical context while providing elements for analysis. One of the solutions for filling this gap is to consider the process model itself in the role of a hub, as a centralized means of facilitating communication between team members. Starting with a robust and descriptive technique for process modeling called IDEF0/SADT, we refined the basic data model by extracting concepts from ISO 9000 process analysis and from enterprise ontology. We defined a web-based architecture to serve as a collaborative tool and implemented it using an object-oriented database. The prospects of such a tool are discussed, notably regarding its ability to generate data dictionaries and to be used as a navigation tool through the medium of hospital-wide documentation. PMID:12463921
Cloud-Based Tools to Support High-Resolution Modeling (Invited)
NASA Astrophysics Data System (ADS)
Jones, N.; Nelson, J.; Swain, N.; Christensen, S.
2013-12-01
The majority of watershed models developed to support decision-making by water management agencies are simple, lumped-parameter models. Maturity in research codes and advances in the computational power from multi-core processors on desktop machines, commercial cloud-computing resources, and supercomputers with thousands of cores have created new opportunities for employing more accurate, high-resolution distributed models for routine use in decision support. The barriers to using such models on a more routine basis include the massive amounts of spatial data that must be processed for each new scenario and the lack of efficient visualization tools. In this presentation we will review a current NSF-funded project called CI-WATER that is intended to overcome many of the roadblocks associated with high-resolution modeling. We are developing a suite of tools that will make it possible to deploy customized web-based apps for running custom scenarios for high-resolution models with minimal effort. These tools are based on a software stack that includes 52 North, MapServer, PostGIS, HTCondor, CKAN, and Python. This open source stack provides a simple scripting environment for quickly configuring new custom applications for running high-resolution models as geoprocessing workflows. The HTCondor component facilitates simple access to local distributed computers or commercial cloud resources when necessary for stochastic simulations. The CKAN framework provides a powerful suite of tools for hosting such workflows in a web-based environment that includes visualization tools and storage of model simulations in a database for archival, querying, and sharing of model results. Prototype applications including land use change, snow melt, and burned area analysis will be presented. This material is based upon work supported by the National Science Foundation under Grant No. 1135482
Automatic building information model query generation
Jiang, Yufei; Yu, Nan; Ming, Jiang; ...
2015-12-01
Energy efficient building design and construction calls for extensive collaboration between different subfields of the Architecture, Engineering and Construction (AEC) community. Performing building design and construction engineering raises challenges on data integration and software interoperability. Using a Building Information Modeling (BIM) data hub to host and integrate building models is a promising solution to address those challenges, which can ease building design information management. However, the partial model query mechanism of the current BIM data hub collaboration model has several limitations, which prevent designers and engineers from taking advantage of BIM. To address this problem, we propose a general and effective approach to generate query code based on a Model View Definition (MVD). This approach is demonstrated through a software prototype called QueryGenerator. By demonstrating a case study using multi-zone air flow analysis, we show how our approach and tool can help domain experts use BIM to drive building design with less labour and lower overhead cost.
Crisis and emergency risk communication as an integrative model.
Reynolds, Barbara; W Seeger, Matthew
2005-01-01
This article describes a model of communication known as crisis and emergency risk communication (CERC). The model is outlined as a merger of many traditional notions of health and risk communication with work in crisis and disaster communication. The specific kinds of communication activities that should be called for at various stages of disaster or crisis development are outlined. Although crises are by definition uncertain, equivocal, and often chaotic situations, the CERC model is presented as a tool health communicators can use to help manage these complex events.
An advanced environment for hybrid modeling of biological systems based on modelica.
Pross, Sabrina; Bachmann, Bernhard
2011-01-20
Biological systems are often very complex, so an appropriate formalism is needed for modeling their behavior. Hybrid Petri Nets, consisting of time-discrete Petri Net elements as well as continuous ones, have proven to be ideal for this task. Therefore, a new Petri Net library was implemented based on the object-oriented modeling language Modelica, which allows the modeling of discrete, stochastic and continuous Petri Net elements by differential, algebraic and discrete equations. An appropriate Modelica tool performs the hybrid simulation, handling discrete events and the solution of continuous differential equations. A special sub-library contains so-called wrappers for specific reactions to simplify the modeling process. The Modelica models can be connected to Simulink models for parameter optimization, sensitivity analysis and stochastic simulation in MATLAB. The present paper illustrates the implementation of the Petri Net component models, their usage within the modeling process and the coupling between the Modelica tool Dymola and MATLAB/Simulink. The application is demonstrated by modeling the metabolism of Chinese Hamster Ovary cells.
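A minimal, self-contained sketch of the hybrid idea (an illustration in Python, not the Modelica library itself): discrete transitions fire on token thresholds while continuous transitions update markings through a simple explicit-Euler step.

```python
# Hypothetical, minimal hybrid Petri net simulator (illustration only).

def simulate(places, discrete_transitions, continuous_transitions, t_end, dt=0.01):
    """places: dict name -> marking.
    discrete_transitions: list of (inputs, outputs) dicts of arc weights.
    continuous_transitions: list of (inputs, outputs, rate_fn), rate_fn(marking) -> flow rate."""
    t, history = 0.0, [(0.0, dict(places))]
    while t < t_end:
        # Fire each enabled discrete transition once per step (a simplification).
        for inputs, outputs in discrete_transitions:
            if all(places[p] >= w for p, w in inputs.items()):
                for p, w in inputs.items():
                    places[p] -= w
                for p, w in outputs.items():
                    places[p] += w
        # Explicit-Euler update for the continuous part.
        for inputs, outputs, rate_fn in continuous_transitions:
            flow = rate_fn(places) * dt
            if inputs:  # never drive an input place negative
                flow = min([flow] + [places[p] / w for p, w in inputs.items()])
            for p, w in inputs.items():
                places[p] -= w * flow
            for p, w in outputs.items():
                places[p] += w * flow
        t += dt
        history.append((t, dict(places)))
    return history

# Example: substrate S is converted continuously into product P, while a
# discrete "sampling" transition removes one unit of P whenever one is available.
places = {"S": 10.0, "P": 0.0, "samples": 0.0}
continuous = [({"S": 1.0}, {"P": 1.0}, lambda m: 0.5 * m["S"])]
discrete = [({"P": 1.0}, {"samples": 1.0})]
print(simulate(places, discrete, continuous, t_end=5.0)[-1])
```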
Evaluating Variant Calling Tools for Non-Matched Next-Generation Sequencing Data
NASA Astrophysics Data System (ADS)
Sandmann, Sarah; de Graaf, Aniek O.; Karimi, Mohsen; van der Reijden, Bert A.; Hellström-Lindberg, Eva; Jansen, Joop H.; Dugas, Martin
2017-02-01
Valid variant calling results are crucial for the use of next-generation sequencing in clinical routine. However, there are numerous variant calling tools that usually differ in algorithms, filtering strategies, recommendations and thus, also in the output. We evaluated eight open-source tools regarding their ability to call single nucleotide variants and short indels with allelic frequencies as low as 1% in non-matched next-generation sequencing data: GATK HaplotypeCaller, Platypus, VarScan, LoFreq, FreeBayes, SNVer, SAMtools and VarDict. We analysed two real datasets from patients with myelodysplastic syndrome, covering 54 Illumina HiSeq samples and 111 Illumina NextSeq samples. Mutations were validated by re-sequencing on the same platform, re-sequencing on a different platform, and expert-based review. In addition we considered two simulated datasets with varying coverage and error profiles, covering 50 samples each. In all cases an identical target region consisting of 19 genes (42,322 bp) was analysed. Altogether, no tool succeeded in calling all mutations. High sensitivity was always accompanied by low precision. The influence of varying coverage and background noise on variant calling was generally low. Taking everything into account, VarDict performed best. However, our results indicate that there is a need to improve the reproducibility of the results in the context of multithreading.
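A minimal sketch (not the authors' evaluation pipeline) of how calls from any of these tools can be scored against a validated truth set; the example variant records are hypothetical:

```python
# Score called variants against a validated truth set using exact matching of
# (chromosome, position, ref, alt) records.

def benchmark(called, validated):
    called, validated = set(called), set(validated)
    tp = len(called & validated)          # true positives: validated calls
    fp = len(called - validated)          # false positives: unconfirmed calls
    fn = len(validated - called)          # false negatives: missed mutations
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    precision = tp / (tp + fp) if (tp + fp) else float("nan")
    return sensitivity, precision

# Hypothetical example records.
truth = [("chr2", 25457242, "C", "T"), ("chr4", 106196829, "TA", "T")]
calls = [("chr2", 25457242, "C", "T"), ("chr17", 7577120, "G", "A")]
print(benchmark(calls, truth))  # -> (0.5, 0.5)
```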
Structural Embeddings: Mechanization with Method
NASA Technical Reports Server (NTRS)
Munoz, Cesar; Rushby, John
1999-01-01
The most powerful tools for analysis of formal specifications are general-purpose theorem provers and model checkers, but these tools provide scant methodological support. Conversely, those approaches that do provide a well-developed method generally have less powerful automation. It is natural, therefore, to try to combine the better-developed methods with the more powerful general-purpose tools. An obstacle is that the methods and the tools often employ very different logics. We argue that methods are separable from their logics and are largely concerned with the structure and organization of specifications. We propose a technique called structural embedding that allows the structural elements of a method to be supported by a general-purpose tool, while substituting the logic of the tool for that of the method. We have found this technique quite effective and we provide some examples of its application. We also suggest how general-purpose systems could be restructured to support this activity better.
Radom, Marcin; Rybarczyk, Agnieszka; Szawulak, Bartlomiej; Andrzejewski, Hubert; Chabelski, Piotr; Kozak, Adam; Formanowicz, Piotr
2017-12-01
Model development and its analysis are fundamental steps in systems biology, and the theory of Petri nets offers a tool for such a task. With the rapid development of computer science, a variety of tools for Petri nets have emerged, offering various analytical algorithms. From this follows the problem of having to use different programs to analyse a single model. Many file formats and different representations of results make the analysis much harder. Especially for larger nets, the ability to visualize the results in a proper form is a huge help in understanding their significance. We present a new tool for Petri net development and analysis called Holmes. Our program contains algorithms for model analysis based on different types of Petri nets, e.g. an invariant generator, Maximum Common Transitions (MCT) sets and cluster modules, simulation algorithms and knockout analysis tools. A very important feature is the ability to visualize the results of almost all analytical modules. The integration of such modules into one graphical environment allows a researcher to fully devote his or her time to model building and analysis. Availability: http://www.cs.put.poznan.pl/mradom/Holmes/holmes.html. Contact: piotr@cs.put.poznan.pl.
Evaluating the ecological benefits of wildfire by integrating fire and ecosystem simulation models
Robert E. Keane; Eva Karau
2010-01-01
Fire managers are now realizing that wildfires can be beneficial because they can reduce hazardous fuels and restore fire-dominated ecosystems. A software tool that assesses potential beneficial and detrimental ecological effects from wildfire would be helpful to fire management. This paper presents a simulation platform called FLEAT (Fire and Landscape Ecology...
Forage resource evaluation system for habitat—deer: an interactive deer habitat model
Thomas A. Hanley; Donald E. Spalinger; Kenrick J. Mock; Oran L. Weaver; Grant M. Harris
2012-01-01
We describe a food-based system for quantitatively evaluating habitat quality for deer called the Forage Resource Evaluation System for Habitat and provide its rationale and suggestions for use. The system was developed as a tool for wildlife biologists and other natural resource managers and planners interested in evaluating habitat quality and, especially, comparing...
ERIC Educational Resources Information Center
Kennedy, Michael J.; Thomas, Cathy Newman; Meyer, J. Patrick; Alves, Kat D.; Lloyd, John Wills
2014-01-01
Universal Design for Learning (UDL) is a framework that is commonly used for guiding the construction and delivery of instruction intended to support all students. In this study, we used a related model to guide creation of a multimedia-based instructional tool called content acquisition podcasts (CAPs). CAPs delivered vocabulary instruction…
ERIC Educational Resources Information Center
1996
This paper discusses a model of integrated instruction and assessment called SMART (Special Multimedia Arenas for Refining Thinking). SMART involves interactive use of the Internet and multimedia software. The Internet serves three important functions: it acts as a formative assessment tool by providing individualized feedback to students, creates…
The MeqTrees software system and its use for third-generation calibration of radio interferometers
NASA Astrophysics Data System (ADS)
Noordam, J. E.; Smirnov, O. M.
2010-12-01
Context. The formulation of the radio interferometer measurement equation (RIME) for a generic radio telescope by Hamaker et al. has provided us with an elegant mathematical apparatus for better understanding, simulation and calibration of existing and future instruments. The calibration of the new radio telescopes (LOFAR, SKA) would be unthinkable without the RIME formalism, and new software to exploit it. Aims: The MeqTrees software system is designed to implement numerical models, and to solve for arbitrary subsets of their parameters. It may be applied to many problems, but was originally geared towards implementing Measurement Equations in radio astronomy for the purposes of simulation and calibration. The technical goal of MeqTrees is to provide a tool for rapid implementation of such models, while offering performance comparable to hand-written code. We are also pursuing the wider goal of increasing the rate of evolution of radio astronomical software, by offering a tool that facilitates rapid experimentation, and exchange of ideas (and scripts). Methods: MeqTrees is implemented as a Python-based front-end called the meqbrowser, and an efficient (C++-based) computational back-end called the meqserver. Numerical models are defined on the front-end via a Python-based Tree Definition Language (TDL), then rapidly executed on the back-end. The use of TDL facilitates an extremely short turn-around time (hours rather than weeks or months) for experimentation with new ideas. This is also helped by unprecedented visualization capabilities for all final and intermediate results. A flexible data model and a number of important optimizations in the back-end ensures that the numerical performance is comparable to that of hand-written code. Results: MeqTrees is already widely used as the simulation tool for new instruments (LOFAR, SKA) and technologies (focal plane arrays). It has demonstrated that it can achieve a noise-limited dynamic range in excess of a million, on WSRT data. It is the only package that is specifically designed to handle what we propose to call third-generation calibration (3GC), which is needed for the new generation of giant radio telescopes, but can also improve the calibration of existing instruments.
Jo, Youngji; Labrique, Alain B.; Lefevre, Amnesty E.; Mehl, Garrett; Pfaff, Teresa; Walker, Neff; Friberg, Ingrid K.
2014-01-01
While the importance of mHealth scale-up has been broadly emphasized in the mHealth community, it is necessary to guide scale-up efforts and investment in ways that help achieve the mortality reduction targets set by global calls to action such as the Millennium Development Goals, not merely to expand programs. We used the Lives Saved Tool (LiST), an evidence-based modeling software package, to identify priority areas for maternal and neonatal health services by formulating six individual and combined intervention scenarios for two countries, Bangladesh and Uganda. Our findings show that skilled birth attendance and increased facility delivery as targets for mHealth strategies are likely to provide the biggest mortality impact relative to other intervention scenarios. Although further validation of this model is desirable, tools such as LiST can help us leverage the benefit of mHealth by articulating the most appropriate delivery points in the continuum of care to save lives. PMID:25014008
A flexible tool for diagnosing water, energy, and entropy budgets in climate models
NASA Astrophysics Data System (ADS)
Lembo, Valerio; Lucarini, Valerio
2017-04-01
We have developed a new flexible software package for studying the global energy budget, the hydrological cycle, and the material entropy production of global climate models. The program receives as input radiative, latent and sensible energy fluxes, with the requirement that the variable names are in agreement with the Climate and Forecast (CF) conventions for the production of NetCDF datasets. Annual mean maps, meridional sections and time series are computed by means of the Climate Data Operators (CDO) collection of command-line operators developed at the Max Planck Institute for Meteorology (MPI-M). If a land-sea mask is provided, the program also computes the required quantities separately over the continents and the oceans. Depending on the user's choice, the program also calls MATLAB to compute meridional heat transports and the locations and intensities of their peaks in the two hemispheres. We are currently planning to adapt the program so that it can be included in the Earth System Model eValuation Tool (ESMValTool) community diagnostics.
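A minimal sketch of one such diagnostic, assuming CF-style variable names (rsdt, rsut, rlut) and a hypothetical input file; the actual program relies on CDO and MATLAB rather than the xarray calls shown here:

```python
# Global, area-weighted mean of the net top-of-atmosphere energy budget from a
# CF-compliant NetCDF file (illustration only; variable and file names assumed).
import numpy as np
import xarray as xr

ds = xr.open_dataset("toa_fluxes.nc")
net_toa = ds["rsdt"] - ds["rsut"] - ds["rlut"]    # incoming SW - reflected SW - outgoing LW

weights = np.cos(np.deg2rad(ds["lat"]))            # area weights on a regular lat-lon grid
global_mean = net_toa.weighted(weights).mean(dim=("lat", "lon"))

print(global_mean.mean(dim="time").values)         # long-term mean imbalance (W m-2)
```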
Expert models and modeling processes associated with a computer-modeling tool
NASA Astrophysics Data System (ADS)
Zhang, Baohui; Liu, Xiufeng; Krajcik, Joseph S.
2006-07-01
Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using a think-aloud technique and video recording, we captured their on-screen modeling activities and thinking processes. We also interviewed them the day following their modeling sessions to further probe the rationale of their modeling practices. We analyzed both the audio-video transcripts and the experts' models. We found that the experts' modeling processes followed the linear sequence built into the modeling program, with few instances of moving back and forth. They specified their goals up front and spent a long time thinking through an entire model before acting. They specified relationships with accurate and convincing evidence. Factors (i.e., variables) in expert models were clustered and represented by specialized technical terms. Based on the above findings, we made suggestions for improving model-based science teaching and learning using Model-It.
Patient-specific finite element modeling of bones.
Poelert, Sander; Valstar, Edward; Weinans, Harrie; Zadpoor, Amir A
2013-04-01
Finite element modeling is an engineering tool for structural analysis that has been used for many years to assess the relationship between load transfer and bone morphology and to optimize the design and fixation of orthopedic implants. Due to recent developments in finite element model generation, for example, improved computed tomography imaging quality, improved segmentation algorithms, and faster computers, the accuracy of finite element modeling has increased vastly, and finite element models simulating the anatomy and properties of an individual patient can be constructed. Such so-called patient-specific finite element models are potentially valuable tools for orthopedic surgeons in fracture risk assessment or pre- and intraoperative planning of implant placement. The aim of this article is to provide a critical overview of current themes in patient-specific finite element modeling of bones. In addition, the state of the art in patient-specific modeling of bones is compared with the requirements for a clinically applicable patient-specific finite element method, and the feasibility of applying patient-specific finite element modeling as part of clinical orthopedic routine is assessed. It is concluded that further development in certain aspects of patient-specific finite element modeling is needed before finite element modeling can be used as a routine clinical tool.
Data Visualization Saliency Model: A Tool for Evaluating Abstract Data Visualizations
Matzen, Laura E.; Haass, Michael J.; Divis, Kristin M.; ...
2017-08-29
Evaluating the effectiveness of data visualizations is a challenging undertaking and often relies on one-off studies that test a visualization in the context of one specific task. Researchers across the fields of data science, visualization, and human-computer interaction are calling for foundational tools and principles that could be applied to assessing the effectiveness of data visualizations in a more rapid and generalizable manner. One possibility for such a tool is a model of visual saliency for data visualizations. Visual saliency models are typically based on the properties of the human visual cortex and predict which areas of a scene have visual features (e.g. color, luminance, edges) that are likely to draw a viewer's attention. While these models can accurately predict where viewers will look in a natural scene, they typically do not perform well for abstract data visualizations. In this paper, we discuss the reasons for the poor performance of existing saliency models when applied to data visualizations. We introduce the Data Visualization Saliency (DVS) model, a saliency model tailored to address some of these weaknesses, and we test the performance of the DVS model and existing saliency models by comparing the saliency maps produced by the models to eye tracking data obtained from human viewers. In conclusion, we describe how modified saliency models could be used as general tools for assessing the effectiveness of visualizations, including the strengths and weaknesses of this approach.
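A minimal sketch of one common way a saliency map can be compared with eye tracking data (the paper's exact metrics may differ): the correlation between the model's map and a smoothed fixation-density map.

```python
# Compare a saliency map with observed fixations via Pearson correlation
# against a Gaussian-smoothed fixation-density map (illustration only).
import numpy as np
from scipy.ndimage import gaussian_filter

def fixation_density(fixations, shape, sigma=25):
    """fixations: iterable of (row, col) pixel coordinates."""
    density = np.zeros(shape)
    for r, c in fixations:
        density[int(r), int(c)] += 1.0
    return gaussian_filter(density, sigma)

def correlation_score(saliency_map, fixations):
    density = fixation_density(fixations, saliency_map.shape)
    s = (saliency_map - saliency_map.mean()) / saliency_map.std()
    d = (density - density.mean()) / density.std()
    return float((s * d).mean())

# Hypothetical example: a random "saliency map" scored against three fixations.
rng = np.random.default_rng(0)
smap = rng.random((480, 640))
print(correlation_score(smap, [(100, 320), (240, 200), (300, 500)]))
```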
NASA Astrophysics Data System (ADS)
Block, J.; Crawl, D.; Artes, T.; Cowart, C.; de Callafon, R.; DeFanti, T.; Graham, J.; Smarr, L.; Srivas, T.; Altintas, I.
2016-12-01
The NSF-funded WIFIRE project has designed a web-based wildfire modeling, simulation and visualization tool called FireMap. The tool executes FARSITE to model fire propagation using dynamic weather and fire data, configuration settings provided by the user, and static topography and fuel datasets already built in. Using GIS capabilities combined with scalable big data integration and processing, FireMap enables simple execution of the model with options for running ensembles by taking the information uncertainty into account. The results are easily viewable, sharable, repeatable, and can be animated as a time series. From these capabilities, users can model real-time fire behavior, analyze what-if scenarios, and keep a history of model runs over time for sharing with collaborators. FireMap runs FARSITE with national and local sensor networks for real-time weather data ingestion and High-Resolution Rapid Refresh (HRRR) weather for forecasted weather. The HRRR is a NOAA/NCEP operational weather prediction system comprising a numerical forecast model and an analysis/assimilation system to initialize the model. It is run with a horizontal resolution of 3 km, has 50 vertical levels, and has a temporal resolution of 15 minutes. The HRRR requires an Environmental Data Exchange (EDEX) server to receive the feed and generate secondary products out of it for the modeling. UCSD's EDEX server, funded by NSF, makes high-resolution weather data available to researchers worldwide and enables visualization of weather systems and weather events lasting months or even years. The high-speed server aggregates weather data from the University Corporation for Atmospheric Research by way of a subscription service from the Corporation called the Internet Data Distribution system. These features are part of WIFIRE's long-term goals to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. Although FireMap is a research product of WIFIRE, developed in collaboration with a number of fire departments, the tool is operational in pilot form for providing big data-driven predictive fire spread modeling. Most recently, FireMap was used for situational awareness in the July 2016 Sand Fire by LA City and LA County Fire Departments.
PSPVDC: An Adaptation of the PSP that Incorporates Verified Design by Contract
2013-05-01
characteristics mentioned above, including the following: • Java Modeling Language (JML) implements DbC in Java. VDbC can then be carried out using tools like...Extended Static Checking (ESC/Java) [Cok 2005] or TACO [Galeotti 2010]. • Perfect Developer [Crocker 2003] is a specification and modeling language...These are written in the language employed in the environment (e.g., as Java Boolean expressions, if JML is used), which we call the carrier language
Applying AI tools to operational space environmental analysis
NASA Technical Reports Server (NTRS)
Krajnak, Mike; Jesse, Lisa; Mucks, John
1995-01-01
The U.S. Air Force and National Oceanic and Atmospheric Administration (NOAA) space environmental operations centers are facing increasingly complex challenges meeting the needs of their growing user community. These centers provide current space environmental information and short-term forecasts of geomagnetic activity. Recent advances in modeling and data access have provided sophisticated tools for making accurate and timely forecasts, but have introduced new problems associated with handling and analyzing large quantities of complex data. AI (Artificial Intelligence) techniques have been considered as potential solutions to some of these problems. Fielding AI systems has proven more difficult than expected, in part because of operational constraints. Using systems which have been demonstrated successfully in the operational environment will provide a basis for a useful data fusion and analysis capability. Our approach uses a general-purpose AI system already in operational use within the military intelligence community, called the Temporal Analysis System (TAS). TAS is an operational suite of tools supporting data processing, data visualization, historical analysis, situation assessment and predictive analysis. TAS includes expert system tools to analyze incoming events for indications of particular situations and to predict future activity. The expert system operates on a knowledge base of temporal patterns encoded using a knowledge representation called Temporal Transition Models (TTMs) and an event database maintained by the other TAS tools. The system also includes a robust knowledge acquisition and maintenance tool for creating TTMs using a graphical specification language. The ability to manipulate TTMs in a graphical format gives non-computer specialists an intuitive way of accessing and editing the knowledge base. To support space environmental analyses, we used TAS's ability to define domain-specific event analysis abstractions. The prototype system defines events covering reports of natural phenomena such as solar flares, bursts, geomagnetic storms, and five others pertinent to space environmental analysis. With our preliminary event definitions we experimented with TAS's support for temporal pattern analysis using X-ray flare and geomagnetic storm forecasts as case studies. We are currently working on a framework for integrating advanced graphics and space environmental models into this analytical environment.
Muver, a computational framework for accurately calling accumulated mutations.
Burkholder, Adam B; Lujan, Scott A; Lavender, Christopher A; Grimm, Sara A; Kunkel, Thomas A; Fargo, David C
2018-05-09
Identification of mutations from next-generation sequencing data typically requires a balance between sensitivity and accuracy. This is particularly true of DNA insertions and deletions (indels), which can impart significant phenotypic consequences on cells but are harder to call than substitution mutations from whole genome mutation accumulation experiments. To overcome these difficulties, we present muver, a computational framework that integrates established bioinformatics tools with novel analytical methods to generate mutation calls with the extremely low false positive rates and high sensitivity required for accurate mutation rate determination and comparison. Muver uses statistical comparison of ancestral and descendant allelic frequencies to identify variant loci and assigns genotypes with models that include per-sample assessments of sequencing errors by mutation type and repeat context. Muver identifies maximally parsimonious mutation pathways that connect these genotypes, differentiating potential allelic conversion events and delineating ambiguities in mutation location, type, and size. Benchmarking with a human gold-standard father-son pair demonstrates muver's sensitivity and low false positive rates. In DNA mismatch repair (MMR)-deficient Saccharomyces cerevisiae, muver detects multi-base deletions in homopolymers longer than the replicative polymerase footprint at rates greater than predicted for sequential single-base deletions, implying a novel multi-repeat-unit slippage mechanism. Benchmarking results demonstrate the high accuracy and sensitivity achieved with muver, particularly for indels, relative to available tools. Applied to an MMR-deficient Saccharomyces cerevisiae system, muver mutation calls facilitate mechanistic insights into DNA replication fidelity.
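A minimal sketch of the kind of ancestral-versus-descendant comparison described above (an illustration, not muver's actual statistical model); the read counts and threshold are hypothetical:

```python
# Flag loci whose allelic frequencies differ between an ancestral and a
# descendant sample using a Fisher's exact test on ref/alt read counts.
from scipy.stats import fisher_exact

def candidate_mutation(ancestor_counts, descendant_counts, alpha=1e-6):
    """Each argument is a (ref_reads, alt_reads) tuple for one locus."""
    table = [list(ancestor_counts), list(descendant_counts)]
    _, p_value = fisher_exact(table)
    return p_value < alpha, p_value

# Hypothetical example: the descendant gains a heterozygous variant.
print(candidate_mutation((98, 2), (55, 45)))
```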
ERIC Educational Resources Information Center
Heys, Chris
2008-01-01
Excel, Microsoft's spreadsheet program, offers several tools which have proven useful in solving some optimization problems that arise in operations research. We will look at two such tools, the Excel modules called Solver and Goal Seek, after first deriving an equation, called the "cash accumulation equation", to be used in conjunction with them.
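The abstract does not reproduce the article's equation; a standard cash-accumulation relation of the kind such spreadsheet exercises typically solve, stated here as an assumption rather than the article's own derivation, is:

```latex
% Balance F after n periods with initial principal P, periodic deposit D,
% and per-period interest rate i.
\[
  F \;=\; P\,(1+i)^{n} \;+\; D\,\frac{(1+i)^{n}-1}{i}
\]
% Goal Seek or Solver can then be used to find, for example, the deposit D or
% the number of periods n that reaches a target balance F.
```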
Application of structured analysis to a telerobotic system
NASA Technical Reports Server (NTRS)
Dashman, Eric; Mclin, David; Harrison, F. W.; Soloway, Donald; Young, Steven
1990-01-01
The analysis and evaluation of a multiple arm telerobotic research and demonstration system developed by the NASA Intelligent Systems Research Laboratory (ISRL) is described. Structured analysis techniques were used to develop a detailed requirements model of an existing telerobotic testbed. Performance models generated during this process were used to further evaluate the total system. A commercial CASE tool called Teamwork was used to carry out the structured analysis and development of the functional requirements model. A structured analysis and design process using the ISRL telerobotic system as a model is described. Evaluation of this system focused on the identification of bottlenecks in this implementation. The results demonstrate that the use of structured methods and analysis tools can give useful performance information early in a design cycle. This information can be used to ensure that the proposed system meets its design requirements before it is built.
Jayashree, B; Jagadeesh, V T; Hoisington, D
2008-05-01
The availability of complete, annotated genomic sequence information in model organisms is a rich resource that can be extended to understudied orphan crops through comparative genomic approaches. We report here a software tool (cisprimertool) for the identification of conserved intron scanning regions using expressed sequence tag alignments to a completely sequenced model crop genome. The method used is based on earlier studies reporting the assessment of conserved intron scanning primers (called CISP) within relatively conserved exons located near exon-intron boundaries from onion, banana, sorghum and pearl millet alignments with rice. The tool is freely available to academic users at http://www.icrisat.org/gt-bt/CISPTool.htm.
NASA Technical Reports Server (NTRS)
Windley, P. J.
1991-01-01
In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.
Narratives of Inquiry Learning in Middle-School Geographic Inquiry Class
ERIC Educational Resources Information Center
Kuisma, Merja
2018-01-01
This study aimed at modifying a teaching and learning model for a geographic inquiry to enhance both the subject-related skills of geography and so-called twenty-first century skills in middle-school students (14-15 years old). The purpose of this research is to extend our understanding of the user experiences concerning certain tools for learning…
Model-based quality assessment and base-calling for second-generation sequencing data.
Bravo, Héctor Corrada; Irizarry, Rafael A
2010-09-01
Second-generation sequencing (sec-gen) technology can sequence millions of short fragments of DNA in parallel, making it capable of assembling complex genomes for a small fraction of the price and time of previous technologies. In fact, a recently formed international consortium, the 1000 Genomes Project, plans to fully sequence the genomes of approximately 1200 people. The prospect of comparative analysis at the sequence level of a large number of samples across multiple populations may be achieved within the next five years. These data present unprecedented challenges in statistical analysis. For instance, analysis operates on millions of short nucleotide sequences, or reads (strings of A, C, G, or T between 30 and 100 characters long), which are the result of complex processing of noisy continuous fluorescence intensity measurements known as base-calling. The complexity of the base-calling discretization process results in reads of widely varying quality within and across sequence samples. This variation in processing quality results in infrequent but systematic errors that we have found to mislead downstream analysis of the discretized sequence read data. For instance, a central goal of the 1000 Genomes Project is to quantify across-sample variation at the single nucleotide level. At this resolution, small error rates in sequencing prove significant, especially for rare variants. Sec-gen sequencing is a relatively new technology for which potential biases and sources of obscuring variation are not yet fully understood. Therefore, modeling and quantifying the uncertainty inherent in the generation of sequence reads is of utmost importance. In this article, we present a simple model to capture uncertainty arising in the base-calling procedure of the Illumina/Solexa GA platform. Model parameters have a straightforward interpretation in terms of the chemistry of base-calling, allowing for informative and easily interpretable metrics that capture the variability in sequencing quality. Our model provides these informative estimates readily usable in quality assessment tools while significantly improving base-calling performance.
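As a generic illustration of the setting (not the authors' model), base-calling assigns the most probable base given the four-channel fluorescence intensities and converts the residual uncertainty into a quality score:

```latex
% Given channel intensities I = (I_A, I_C, I_G, I_T) at one cycle, a call
% picks the base with the highest posterior probability; the quality score
% reflects the probability that the call is wrong (Phred-style scale).
\[
  \hat{b} \;=\; \arg\max_{b \in \{A,C,G,T\}} \, p(b \mid I),
  \qquad
  Q \;=\; -10 \,\log_{10}\!\bigl(1 - p(\hat{b} \mid I)\bigr)
\]
```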
SedInConnect: a stand-alone, free and open source tool for the assessment of sediment connectivity
NASA Astrophysics Data System (ADS)
Crema, Stefano; Cavalli, Marco
2018-02-01
There is a growing call, within the scientific community, for solid theoretical frameworks and usable indices/models to assess sediment connectivity. Connectivity plays a significant role in characterizing structural properties of the landscape and, when considered in combination with forcing processes (e.g., rainfall-runoff modelling), can represent a valuable analysis for improved landscape management. In this work, the authors present the development and application of SedInConnect: a free, open source and stand-alone application for the computation of the Index of Connectivity (IC), as expressed in Cavalli et al. (2013), with the addition of specific innovative features. The tool is intended for a wide variety of users, both from the scientific community and from the authorities involved in environmental planning. Thanks to its open source nature, the tool can be adapted and/or integrated according to the users' requirements. Furthermore, presenting an easy-to-use interface and being a stand-alone application, the tool can help management experts in the quantitative assessment of sediment connectivity in the context of hazard and risk assessment. An application to a sample dataset and an overview of up-to-date applications of the approach and of the tool show the development potential of such analyses. The modelled connectivity, in fact, appears suitable not only for characterizing sediment dynamics at the catchment scale but also for integration with prediction models and as an aid to geomorphological interpretation.
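For reference, the Index of Connectivity in the Cavalli et al. (2013) formulation (building on Borselli et al., 2008) is commonly written as:

```latex
% D_up characterizes the upslope contributing area and D_dn the downslope flow
% path to the selected target; W is a weighting factor (e.g. surface
% roughness), S the slope gradient, A the contributing area, and d_i the flow
% length across the i-th downslope cell.
\[
  IC \;=\; \log_{10}\!\left(\frac{D_{up}}{D_{dn}}\right),
  \qquad
  D_{up} \;=\; \bar{W}\,\bar{S}\,\sqrt{A},
  \qquad
  D_{dn} \;=\; \sum_{i} \frac{d_i}{W_i\,S_i}
\]
```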
A virtual therapeutic environment with user projective agents.
Ookita, S Y; Tokuda, H
2001-02-01
Today, we see the Internet as more than just an information infrastructure: it is also a socializing place and a safe outlet for inner feelings. Many online personalities develop apart from real-world life because of the anonymity of the environment. Virtual world interactions are bringing about new psychological illnesses ranging from net addiction to technostress, as well as online personality disorders and conflicts among the multiple identities that exist in the virtual world. Presently, there are no standard therapy models for the virtual environment, and there are very few therapeutic environments or tools made especially for virtual therapy. The goal of our research is to provide a therapy model and middleware tools for psychologists to use in virtual therapeutic environments. We propose the Cyber Therapy Model and Projective Agents, a tool used in the therapeutic environment. To evaluate the effectiveness of the tool, we created a prototype system, called the Virtual Group Counseling System, which is a therapeutic environment that allows users to participate in group counseling through the eyes of their Projective Agent. Projective Agents inherit the user's personality traits. During virtual group counseling, the user's Projective Agent interacts and collaborates to foster recovery and psychological growth. The prototype system provides a simulation environment where psychologists can adjust the parameters and customize their own simulation environment. The model and tool are a first attempt toward simulating online personalities that may exist only online, and they provide data for observation.
NASA Astrophysics Data System (ADS)
Chakraborty, Abhishek
Detection of particulate matter thinly dispersed in a fluid medium, with the aid of the difference in electrical conductivity between the pure fluid and the particles, has been practiced for at least the last 50 to 60 years. The first such instruments were employed to measure cell counts in samples of biological fluid. Following a detailed study of the physics and principles operating within the device, known as the Electric Sensing Zone (ESZ) principle, a new device called the Liquid Metal Cleanliness Analyzer (LiMCA) was invented, which could measure and count inclusion particles in molten metal. It provided a fast and fairly accurate tool for online measurement of the quality of steel during refining and casting operations. Along similar lines of development as the LiMCA, a water analogue of the device, called the Aqueous Particle Sensor (APS), was developed for physical modeling experiments of metal refining operations involving water models. The APS can detect and measure simulated inclusion particles added to the working fluid (water). The present study involves the design, construction and application of a new and improved APS in water modeling experiments to study inclusion behavior in a tundish operation. The custom-built instrument shows superior performance and applicability in experiments involving physical modeling of metal refining operations, compared to its commercial counterparts. In addition to higher accuracy and a wider range of operating parameters, its capability to take real-time experimental data for extended periods of time helps to reduce the total number of experiments required to reach a result, and makes it suitable for analyzing temporal changes occurring in unsteady systems. With the modern impetus on the quality of the final product of metallurgical operations, the new APS can prove to be an indispensable research tool to study and put forward innovative design and parametric changes in industrially practised metallurgical operations.
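For reference, the first-order relation commonly used to describe the ESZ principle (valid for particles much smaller than the orifice) is:

```latex
% A non-conducting particle of diameter d passing through an orifice of
% diameter D filled with a fluid of resistivity rho_f produces a resistive
% pulse whose height scales with the particle volume:
\[
  \Delta R \;\approx\; \frac{4\,\rho_f\, d^{3}}{\pi D^{4}}
\]
% Counting pulses gives the particle concentration; the pulse-height
% distribution gives the particle (inclusion) size distribution.
```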
Burnside, Elizabeth S.; Lee, Sandra J.; Bennette, Carrie; Near, Aimee M.; Alagoz, Oguzhan; Huang, Hui; van den Broek, Jeroen J.; Kim, Joo Yeon; Ergun, Mehmet A.; van Ravesteyn, Nicolien T.; Stout, Natasha K.; de Koning, Harry J.; Mandelblatt, Jeanne S.
2017-01-01
Background: There are no publicly available tools designed specifically to assist policy makers to make informed decisions about the optimal ages of breast cancer screening initiation for different populations of US women. Objective: To use three established simulation models to develop a web-based tool called Mammo OUTPuT. Methods: The simulation models use the 1970 US birth cohort and common parameters for incidence, digital screening performance, and treatment effects. Outcomes include breast cancers diagnosed, breast cancer deaths averted, breast cancer mortality reduction, false-positive mammograms, benign biopsies, and overdiagnosis. The Mammo OUTPuT tool displays these outcomes for combinations of age at screening initiation (every year from 40 to 49), annual versus biennial interval, lifetime versus 10-year horizon, and breast density, compared to waiting to start biennial screening at age 50 and continuing to 74. The tool was piloted by decision makers (n = 16) who completed surveys. Results: The tool demonstrates that benefits in the 40s increase linearly with earlier initiation age, without a specific threshold age. Likewise, the harms of screening increase monotonically with earlier ages of initiation in the 40s. The tool also shows users how the balance of benefits and harms varies with breast density. Surveys revealed that 100% of users (16/16) liked the appearance of the site; 94% (15/16) found the tool helpful; and 94% (15/16) would recommend the tool to a colleague. Conclusions: This tool synthesizes a representative subset of the most current CISNET (Cancer Intervention and Surveillance Modeling Network) simulation model outcomes to provide policy makers with quantitative data on the benefits and harms of screening women in the 40s. Ultimate decisions will depend on program goals, the population served, and informed judgments about the weight of benefits and harms. PMID:29376135
Probabilistic Priority Message Checking Modeling Based on Controller Area Networks
NASA Astrophysics Data System (ADS)
Lin, Cheng-Min
Although the probabilistic model checking tool called PRISM has been applied in many communication systems, such as wireless local area networks, Bluetooth, and ZigBee, the technique has not been used in a controller area network (CAN). In this paper, we use PRISM to model the mechanism of priority messages for CAN because this mechanism has allowed CAN to become the leader in serial communication for automotive and industrial control. Through modeling CAN, it is easy to analyze the characteristics of CAN for further improving the security and efficiency of automobiles. The Markov chain model helps us to model the behaviour of priority messages.
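The PRISM model itself is not reproduced in the abstract; as a rough illustration of the underlying idea, the sketch below analyzes a toy discrete-time Markov chain for a low-priority CAN frame competing for the bus, with made-up transition probabilities.

```python
import numpy as np

# Toy discrete-time Markov chain for a low-priority CAN frame competing
# for the bus. All probabilities are illustrative assumptions, not
# parameters from the paper's PRISM model.
p_busy = 0.6   # probability a higher-priority frame wins arbitration in a slot

# States: 0 = waiting for bus, 1 = transmitting, 2 = done (absorbing)
P = np.array([
    [p_busy, 1 - p_busy, 0.0],  # keep waiting vs. win arbitration
    [0.0,    0.0,        1.0],  # transmission completes in one slot
    [0.0,    0.0,        1.0],
])

# Transient analysis: probability the frame has been delivered within k slots,
# the kind of property a probabilistic model checker would evaluate.
state = np.array([1.0, 0.0, 0.0])
for k in range(1, 11):
    state = state @ P
    print(f"P(delivered within {k:2d} slots) = {state[2]:.3f}")

# Expected number of arbitration attempts until the frame wins (geometric law).
print("expected arbitration attempts:", 1.0 / (1.0 - p_busy))
```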
Computational Modeling in Structural Materials Processing
NASA Technical Reports Server (NTRS)
Meyyappan, Meyya; Arnold, James O. (Technical Monitor)
1997-01-01
High temperature materials such as silicon carbide, a variety of nitrides, and ceramic matrix composites find use in aerospace, automotive, and machine tool industries and in high speed civil transport applications. Chemical vapor deposition (CVD) is widely used in processing such structural materials. Variations of CVD include deposition on substrates, coating of fibers, deposition inside cavities and on complex objects, and infiltration within preforms, called chemical vapor infiltration (CVI). Our current knowledge of the process mechanisms, ability to optimize processes, and scale-up for large scale manufacturing is limited. In this regard, computational modeling of the processes is valuable since a validated model can be used as a design tool. The effort is similar to traditional chemically reacting flow modeling, with emphasis on multicomponent diffusion, thermal diffusion, large sets of homogeneous reactions, and surface chemistry. In the case of CVI, models for pore infiltration are needed. In the present talk, examples of SiC, nitride, and boron deposition from the author's past work will be used to illustrate the utility of computational process modeling.
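For the CVI case mentioned above, a minimal illustration of pore-infiltration modeling is a steady one-dimensional diffusion equation with an assumed effective first-order consumption term; the finite-difference sketch below uses illustrative parameter values, not data from the talk.

```python
import numpy as np

# Precursor depletion along a preform pore during CVI:
# D * c'' = k * c, fixed concentration at the pore mouth, zero flux at the
# closed end. D, k, and L are assumed values for illustration only.
D = 1e-5      # effective diffusivity, m^2/s
k = 5.0       # effective volumetric consumption rate, 1/s
L = 0.01      # pore depth, m
c0 = 1.0      # normalized precursor concentration at the pore mouth

n = 200
x = np.linspace(0.0, L, n)
h = x[1] - x[0]

# Assemble the finite-difference system A c = b.
A = np.zeros((n, n))
b = np.zeros(n)
A[0, 0] = 1.0
b[0] = c0                                  # Dirichlet condition at the mouth
for i in range(1, n - 1):
    A[i, i - 1] = D / h**2
    A[i, i]     = -2.0 * D / h**2 - k
    A[i, i + 1] = D / h**2
A[-1, -1] = 1.0                            # zero-flux (mirror) at the closed end
A[-1, -2] = -1.0

c = np.linalg.solve(A, b)
print("concentration at mid-depth: ", round(c[n // 2], 4))
print("concentration at closed end:", round(c[-1], 4))
# Strong depletion toward the closed end signals diffusion-limited infiltration.
```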
NASA Technical Reports Server (NTRS)
James, John T.; Lam, Chiu-wing; Scully, Robert R.
2013-01-01
Brief exposures of Apollo astronauts to lunar dust occasionally elicited upper respiratory irritation; however, no limits were ever set for prolonged exposure to lunar dust. Habitats for exploration, whether mobile or fixed, must be designed to limit human exposure to lunar dust to safe levels. We have used a new technique we call Comparative Benchmark Dose Modeling to estimate safe exposure limits for lunar dust collected during the Apollo 14 mission.
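The Comparative Benchmark Dose Modeling technique is not detailed in the abstract; the sketch below only illustrates the generic benchmark-dose idea, fitting an assumed exponential dose-response model to made-up data and solving for the dose giving 10% extra risk (BMD10).

```python
import numpy as np
from scipy.optimize import curve_fit, brentq

# Hypothetical dose-response data (dose in mg/m^3, fraction responding).
dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
resp = np.array([0.02, 0.05, 0.09, 0.18, 0.33, 0.55])

# Assumed exponential dose-response model with a background response.
def model(d, bg, beta):
    return bg + (1.0 - bg) * (1.0 - np.exp(-beta * d))

(bg, beta), _ = curve_fit(model, dose, resp, p0=[0.02, 0.1])

# Benchmark dose: dose at which extra risk over background reaches 10%.
def extra_risk(d):
    return (model(d, bg, beta) - bg) / (1.0 - bg) - 0.10

bmd10 = brentq(extra_risk, 1e-6, 50.0)
print(f"fitted background={bg:.3f}, beta={beta:.3f}, BMD10={bmd10:.2f} mg/m^3")
```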
1987-06-01
to a field of research called Computer-Aided Instruction (CAI). CAI is a powerful methodology for enhancing the overall quality and effectiveness of...provides a very powerful tool for statistical inference, especially when pooling information from different sources is appropriate. Thus, prior...The power of the model lies in its ability to adapt a diagnostic session to the level of knowledge
Characterization of Cloud Water-Content Distribution
NASA Technical Reports Server (NTRS)
Lee, Seungwon
2010-01-01
The development of realistic cloud parameterizations for climate models requires accurate characterizations of subgrid distributions of thermodynamic variables. To this end, a software tool was developed to characterize cloud water-content distributions in climate-model sub-grid scales. This software characterizes distributions of cloud water content with respect to cloud phase, cloud type, precipitation occurrence, and geo-location using CloudSat radar measurements. It uses a statistical method called maximum likelihood estimation to estimate the probability density function of the cloud water content.
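As a small illustration of the maximum likelihood estimation step described above, the sketch below fits a lognormal probability density function to synthetic water-content samples; the distributional choice and parameters are assumptions, not CloudSat results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic stand-in for cloud water content samples within one sub-grid
# cell (g/m^3); the lognormal form and parameters are assumed for illustration.
samples = rng.lognormal(mean=-1.0, sigma=0.6, size=5000)

# Maximum likelihood fit of a lognormal probability density function.
shape, loc, scale = stats.lognorm.fit(samples, floc=0.0)
print(f"fitted sigma = {shape:.3f}, median = {scale:.3f} g/m^3")

# The fitted PDF can then be evaluated on a grid, e.g. for a parameterization.
grid = np.linspace(0.01, 2.0, 5)
print(stats.lognorm.pdf(grid, shape, loc=loc, scale=scale).round(3))
```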
Much ado about mice: Standard-setting in model organism research.
Hardesty, Rebecca A
2018-04-11
Recently there has been a practice turn in the philosophy of science that has called for analyses to be grounded in the actual doings of everyday science. This paper is in furtherance of this call and it does so by employing participant-observation ethnographic methods as a tool for discovering epistemological features of scientific practice in a neuroscience lab. The case I present focuses on a group of neurobiologists researching the genetic underpinnings of cognition in Down syndrome (DS) and how they have developed a new mouse model which they argue should be regarded as the "gold standard" for all DS mouse research. Through use of ethnographic methods, interviews, and analyses of publications, I uncover how the lab constructed their new mouse model. Additionally, I describe how model organisms can serve as abstract standards for scientific work that impact the epistemic value of scientific claims, regulate practice, and constrain future work. Copyright © 2018 Elsevier Ltd. All rights reserved.
Knowledge-acquisition tools for medical knowledge-based systems.
Lanzola, G; Quaglini, S; Stefanelli, M
1995-03-01
Knowledge-based systems (KBS) have been proposed to solve a large variety of medical problems. A strategic issue for KBS development and maintenance are the efforts required for both knowledge engineers and domain experts. The proposed solution is building efficient knowledge acquisition (KA) tools. This paper presents a set of KA tools we are developing within a European Project called GAMES II. They have been designed after the formulation of an epistemological model of medical reasoning. The main goal is that of developing a computational framework which allows knowledge engineers and domain experts to interact cooperatively in developing a medical KBS. To this aim, a set of reusable software components is highly recommended. Their design was facilitated by the development of a methodology for KBS construction. It views this process as comprising two activities: the tailoring of the epistemological model to the specific medical task to be executed and the subsequent translation of this model into a computational architecture so that the connections between computational structures and their knowledge level counterparts are maintained. The KA tools we developed are illustrated taking examples from the behavior of a KBS we are building for the management of children with acute myeloid leukemia.
2007-11-01
Engineering Research Laboratory is currently developing a set of facility 'architectural' programming tools, called Facility Composer™ (FC). FC...requirements in the early phases of project development. As the facility program, criteria, and requirements are chosen, these tools populate the IFC...developing a set of facility "architectural" programming tools, called Facility Composer (FC), to support the capture and tracking of facility criteria
NASA Technical Reports Server (NTRS)
Bailin, Sydney; Paterra, Frank; Henderson, Scott; Truszkowski, Walt
1993-01-01
This paper presents a discussion of current work in the area of graphical modeling and model-based reasoning being undertaken by the Automation Technology Section, Code 522.3, at Goddard. The work was initially motivated by the growing realization that the knowledge acquisition process was a major bottleneck in the generation of fault detection, isolation, and repair (FDIR) systems for application in automated Mission Operations. As with most research activities, this work started out with a simple objective: to develop a proof-of-concept system demonstrating that a draft rule-base for an FDIR system could be automatically realized by reasoning from a graphical representation of the system to be monitored. This work was called Knowledge From Pictures (KFP) (Truszkowski et al. 1992). As the work has successfully progressed, the KFP tool has become an environment populated by a set of tools that support a more comprehensive approach to model-based reasoning. This paper continues by giving an overview of the graphical modeling objectives of the work, describing the three tools that now populate the KFP environment, briefly presenting a discussion of related work in the field, and indicating future directions for the KFP environment.
A numerical tool for reproducing driver behaviour: experiments and predictive simulations.
Casucci, M; Marchitto, M; Cacciabue, P C
2010-03-01
This paper presents the simulation tool called SDDRIVE (Simple Simulation of Driver performance), which is the numerical computerised implementation of the theoretical architecture describing Driver-Vehicle-Environment (DVE) interactions, contained in Cacciabue and Carsten [Cacciabue, P.C., Carsten, O. A simple model of driver behaviour to sustain design and safety assessment of automated systems in automotive environments, 2010]. Following a brief description of the basic algorithms that simulate the performance of drivers, the paper presents and discusses a set of experiments carried out in a full-scale Virtual Reality simulator for validating the simulation. Then the predictive potential of the tool is shown by discussing two case studies of DVE interactions, performed in the presence of different driver attitudes in similar traffic conditions.
Sethi, Suresh; Linden, Daniel; Wenburg, John; Lewis, Cara; Lemons, Patrick R.; Fuller, Angela K.; Hare, Matthew P.
2016-01-01
Error-tolerant likelihood-based match calling presents a promising technique to accurately identify recapture events in genetic mark–recapture studies by combining probabilities of latent genotypes and probabilities of observed genotypes, which may contain genotyping errors. Combined with clustering algorithms to group samples into sets of recaptures based upon pairwise match calls, these tools can be used to reconstruct accurate capture histories for mark–recapture modelling. Here, we assess the performance of a recently introduced error-tolerant likelihood-based match-calling model and sample clustering algorithm for genetic mark–recapture studies. We assessed both biallelic (i.e. single nucleotide polymorphisms; SNP) and multiallelic (i.e. microsatellite; MSAT) markers using a combination of simulation analyses and case study data on Pacific walrus (Odobenus rosmarus divergens) and fishers (Pekania pennanti). A novel two-stage clustering approach is demonstrated for genetic mark–recapture applications. First, repeat captures within a sampling occasion are identified. Subsequently, recaptures across sampling occasions are identified. The likelihood-based matching protocol performed well in simulation trials, demonstrating utility for use in a wide range of genetic mark–recapture studies. Moderately sized SNP (64+) and MSAT (10–15) panels produced accurate match calls for recaptures and accurate non-match calls for samples from closely related individuals in the face of low to moderate genotyping error. Furthermore, matching performance remained stable or increased as the number of genetic markers increased, genotyping error notwithstanding.
Li, John; Maclehose, Rich; Smith, Kirk; Kaehler, Dawn; Hedberg, Craig
2011-01-01
Foodborne illness surveillance based on consumer complaints detects outbreaks by finding common exposures among callers, but this process is often difficult. Laboratory testing of ill callers could also help identify potential outbreaks. However, collection of stool samples from all callers is not feasible. Methods to help screen calls for etiology are needed to increase the efficiency of complaint surveillance systems and increase the likelihood of detecting foodborne outbreaks caused by Salmonella. Data from the Minnesota Department of Health foodborne illness surveillance database (2000 to 2008) were analyzed. Complaints with identified etiologies were examined to create a predictive model for Salmonella. Bootstrap methods were used to internally validate the model. Seventy-one percent of complaints in the foodborne illness database with known etiologies were due to norovirus. The predictive model had a good discriminatory ability to identify Salmonella calls. Three cutoffs for the predictive model were tested: one that maximized sensitivity, one that maximized specificity, and one that maximized predictive ability, providing sensitivities and specificities of 32 and 96%, 100 and 54%, and 89 and 72%, respectively. Development of a predictive model for Salmonella could help screen calls for etiology. The cutoff that provided the best predictive ability for Salmonella corresponded to a caller reporting diarrhea and fever with no vomiting, and five or fewer people ill. Screening calls for etiology would help identify complaints for further follow-up and result in identifying Salmonella cases that would otherwise go unconfirmed; in turn, this could lead to the identification of more outbreaks.
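The best-performing cutoff reported above corresponds to a simple screening rule. A minimal sketch of applying that rule to incoming complaint records follows; the record field names are hypothetical, not the Minnesota Department of Health schema.

```python
# Screening rule from the reported best-predictive cutoff: flag a complaint
# for Salmonella follow-up when the caller reports diarrhea and fever,
# no vomiting, and five or fewer people ill.
# The dictionary field names below are hypothetical.

def flag_for_salmonella(call: dict) -> bool:
    return (
        call["diarrhea"]
        and call["fever"]
        and not call["vomiting"]
        and call["num_ill"] <= 5
    )

calls = [
    {"id": 1, "diarrhea": True,  "fever": True,  "vomiting": False, "num_ill": 2},
    {"id": 2, "diarrhea": True,  "fever": False, "vomiting": True,  "num_ill": 8},
    {"id": 3, "diarrhea": True,  "fever": True,  "vomiting": False, "num_ill": 4},
]

flagged = [c["id"] for c in calls if flag_for_salmonella(c)]
print("complaints flagged for stool sample follow-up:", flagged)
```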
Akuna: An Open Source User Environment for Managing Subsurface Simulation Workflows
NASA Astrophysics Data System (ADS)
Freedman, V. L.; Agarwal, D.; Bensema, K.; Finsterle, S.; Gable, C. W.; Keating, E. H.; Krishnan, H.; Lansing, C.; Moeglein, W.; Pau, G. S. H.; Porter, E.; Scheibe, T. D.
2014-12-01
The U.S. Department of Energy (DOE) is investing in development of a numerical modeling toolset called ASCEM (Advanced Simulation Capability for Environmental Management) to support modeling analyses at legacy waste sites. ASCEM is an open source and modular computing framework that incorporates new advances and tools for predicting contaminant fate and transport in natural and engineered systems. The ASCEM toolset includes both a Platform with Integrated Toolsets (called Akuna) and a High-Performance Computing multi-process simulator (called Amanzi). The focus of this presentation is on Akuna, an open-source user environment that manages subsurface simulation workflows and associated data and metadata. In this presentation, key elements of Akuna are demonstrated, which includes toolsets for model setup, database management, sensitivity analysis, parameter estimation, uncertainty quantification, and visualization of both model setup and simulation results. A key component of the workflow is in the automated job launching and monitoring capabilities, which allow a user to submit and monitor simulation runs on high-performance, parallel computers. Visualization of large outputs can also be performed without moving data back to local resources. These capabilities make high-performance computing accessible to the users who might not be familiar with batch queue systems and usage protocols on different supercomputers and clusters.
Design Requirements for Communication-Intensive Interactive Applications
NASA Astrophysics Data System (ADS)
Bolchini, Davide; Garzotto, Franca; Paolini, Paolo
Online interactive applications call for new requirements paradigms to capture the growing complexity of computer-mediated communication. Crafting successful interactive applications (such as websites and multimedia) involves modeling the requirements for the user experience, including those leading to content design, usable information architecture and interaction, in profound coordination with the communication goals of all stakeholders involved, ranging from persuasion to social engagement, to call for action. To face this grand challenge, we propose a methodology for modeling communication requirements and provide a set of operational conceptual tools to be used in complex projects with multiple stakeholders. Through examples from real-life projects and lessons-learned from direct experience, we draw on the concepts of brand, value, communication goals, information and persuasion requirements to systematically guide analysts to master the multifaceted connections of these elements as drivers to inform successful communication designs.
Bangbang Zhang; Gary Feng; Lajpat R. Ahuja; Xiangbin Kong; Ying Ouyang; Ardeshir Adeli; Johnie N. Jenkins
2018-01-01
Crop production as a function of water use or water applied, called the crop water production function (CWPF), is a useful tool for irrigation planning, design, and management. However, these functions are not only crop and variety specific, but they also vary with soil types and climatic conditions (locations). Derivation of multi-year average CWPFs through field...
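A CWPF is commonly represented as a simple empirical curve of yield versus seasonal water applied. The sketch below fits an assumed quadratic form to made-up observations; it is only an illustration of the concept, not the multi-year derivation discussed in the paper.

```python
import numpy as np

# Hypothetical (water applied, yield) observations for one crop and site:
# seasonal irrigation in mm, yield in t/ha. Values are illustrative only.
water = np.array([0.0, 100.0, 200.0, 300.0, 400.0, 500.0])
yield_t = np.array([2.1, 4.0, 5.6, 6.5, 6.9, 6.8])

# Fit a quadratic crop water production function y = a*W^2 + b*W + c.
a, b, c = np.polyfit(water, yield_t, deg=2)
print(f"CWPF: yield = {a:.2e}*W^2 + {b:.4f}*W + {c:.2f}")

# Water application at which marginal yield gain drops to zero (dy/dW = 0),
# a quantity often used in irrigation planning.
w_peak = -b / (2.0 * a)
print(f"yield plateaus near {w_peak:.0f} mm of applied water")
```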
A Practical Guide to the Technology and Adoption of Software Process Automation
1994-03-01
IDE's integration of Software through Pictures, CodeCenter, and FrameMaker). However, successful use of integrated tools, as reflected in actual...tool for a specific platform. Thus, when a Work Context calls for a word processor, the weaver.tis file can be set up to call FrameMaker for the Sun4
NASA Technical Reports Server (NTRS)
Chaparro, Daniel; Fujiwara, Gustavo E. C.; Ting, Eric; Nguyen, Nhan
2016-01-01
The need to rapidly scan large design spaces during conceptual design calls for computationally inexpensive tools such as the vortex lattice method (VLM). Although some VLM tools, such as Vorview have been extended to model fully-supersonic flow, VLM solutions are typically limited to inviscid, subcritical flow regimes. Many transport aircraft operate at transonic speeds, which limits the applicability of VLM for such applications. This paper presents a novel approach to correct three-dimensional VLM through coupling of two-dimensional transonic small disturbance (TSD) solutions along the span of an aircraft wing in order to accurately predict transonic aerodynamic loading and wave drag for transport aircraft. The approach is extended to predict flow separation and capture the attenuation of aerodynamic forces due to boundary layer viscosity by coupling the TSD solver with an integral boundary layer (IBL) model. The modeling framework is applied to the NASA General Transport Model (GTM) integrated with a novel control surface known as the Variable Camber Continuous Trailing Edge Flap (VCCTEF).
vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments
2010-01-01
Background The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/. PMID:20482791
vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments.
Ma, Jingming; Dykes, Carrie; Wu, Tao; Huang, Yangxin; Demeter, Lisa; Wu, Hulin
2010-05-18
The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
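The central idea, regressing on all observed time points rather than two endpoints, can be illustrated very simply: in a competition assay the log ratio of the two variants' frequencies changes approximately linearly with time, and its slope estimates the fitness difference. The sketch below uses made-up data and ignores the dilution-factor handling of the actual tool.

```python
import numpy as np

# Illustrative competition-assay data: proportion of variant 1 over time.
days = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
frac_variant1 = np.array([0.50, 0.42, 0.33, 0.26, 0.19])
frac_variant2 = 1.0 - frac_variant1

log_ratio = np.log(frac_variant1 / frac_variant2)

# Ordinary least squares over all time points (not just two-point endpoints).
slope, intercept = np.polyfit(days, log_ratio, deg=1)
print(f"estimated fitness difference (per day) = {slope:.3f}")
print("variant 1 is less fit than variant 2" if slope < 0 else "variant 1 is fitter")
```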
NASA Astrophysics Data System (ADS)
Peckham, S. D.; Kelbert, A.; Rudan, S.; Stoica, M.
2016-12-01
Standardized metadata for models is the key to reliable and greatly simplified coupling in model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System). This model metadata also helps model users to understand the important details that underpin computational models and to compare the capabilities of different models. These details include simplifying assumptions on the physics, governing equations and the numerical methods used to solve them, discretization of space (the grid) and time (the time-stepping scheme), state variables (input or output), model configuration parameters. This kind of metadata provides a "deep description" of a computational model that goes well beyond other types of metadata (e.g. author, purpose, scientific domain, programming language, digital rights, provenance, execution) and captures the science that underpins a model. While having this kind of standardized metadata for each model in a repository opens up a wide range of exciting possibilities, it is difficult to collect this information and a carefully conceived "data model" or schema is needed to store it. Automated harvesting and scraping methods can provide some useful information, but they often result in metadata that is inaccurate or incomplete, and this is not sufficient to enable the desired capabilities. In order to address this problem, we have developed a browser-based tool called the MCM Tool (Model Component Metadata) which runs on notebooks, tablets and smart phones. This tool was partially inspired by the TurboTax software, which greatly simplifies the necessary task of preparing tax documents. It allows a model developer or advanced user to provide a standardized, deep description of a computational geoscience model, including hydrologic models. Under the hood, the tool uses a new ontology for models built on the CSDMS Standard Names, expressed as a collection of RDF files (Resource Description Framework). This ontology is based on core concepts such as variables, objects, quantities, operations, processes and assumptions. The purpose of this talk is to present details of the new ontology and to then demonstrate the MCM Tool for several hydrologic models.
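The MCM Tool's ontology is not reproduced here; the snippet below is only a hypothetical sketch of recording a few "deep description" facts about a model as RDF triples with rdflib. The namespace, class, and property names are invented placeholders, not CSDMS Standard Names.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF

# Hypothetical namespace and property names, standing in for a real
# model-metadata ontology such as the one described above.
EX = Namespace("http://example.org/model-metadata#")

g = Graph()
g.bind("ex", EX)

model = EX.SnowmeltModel
g.add((model, RDF.type, EX.ComputationalModel))
g.add((model, EX.timeSteppingScheme, Literal("explicit Euler")))
g.add((model, EX.spatialGrid, Literal("uniform rectilinear")))
g.add((model, EX.outputVariable, Literal("snowpack__melt_volume_flux")))
g.add((model, EX.assumption, Literal("degree-day melt approximation")))

# Serialize the metadata as Turtle, one possible exchange format.
print(g.serialize(format="turtle"))
```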
Cognitive simulation as a tool for cognitive task analysis.
Roth, E M; Woods, D D; Pople, H E
1992-10-01
Cognitive simulations are runnable computer programs that represent models of human cognitive activities. We show how one cognitive simulation built as a model of some of the cognitive processes involved in dynamic fault management can be used in conjunction with small-scale empirical data on human performance to uncover the cognitive demands of a task, to identify where intention errors are likely to occur, and to point to improvements in the person-machine system. The simulation, called Cognitive Environment Simulation or CES, has been exercised on several nuclear power plant accident scenarios. Here we report one case to illustrate how a cognitive simulation tool such as CES can be used to clarify the cognitive demands of a problem-solving situation as part of a cognitive task analysis.
Heuristic Diagrams as a Tool to Teach History of Science
NASA Astrophysics Data System (ADS)
Chamizo, José A.
2012-05-01
The graphic organizer called here the heuristic diagram, an improvement of Gowin's Vee heuristic, is proposed as a tool to teach the history of science. Heuristic diagrams have the purpose of helping students (or teachers, or researchers) to understand their own research, considering that asking questions and problem-solving are central to scientific activity. The left side, originally related in Gowin's Vee to philosophies, theories, models, laws or regularities, now agrees with Toulmin's concepts (language, models as representation techniques and application procedures). Mexican science teachers without experience in science education research used the heuristic diagram to learn about the history of chemistry, considering also on the left side two different historical times: past and present. Through a semantic differential scale, teachers' attitudes toward the heuristic diagram were evaluated and its usefulness was demonstrated.
Modeling crime events by d-separation method
NASA Astrophysics Data System (ADS)
Aarthee, R.; Ezhilmaran, D.
2017-11-01
Problematic legal cases have recently called for a scientifically founded method of dealing with the qualitative and quantitative roles of evidence in a case [1]. To deal with the quantitative role, we propose a d-separation method for modeling crime events. A d-separation is a graphical criterion for identifying independence in a directed acyclic graph. By developing a d-separation method, we aim to lay the foundations for the development of a software support tool that can deal with evidential reasoning in legal cases. Such a tool is meant to be used by a judge or juror, in alliance with various experts who can provide information about the details. This will hopefully improve the communication between judges or jurors and experts. The proposed method can uncover more valid independencies than other graphical criteria.
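A standard way to test d-separation, checking connectivity in the moralized ancestral graph after removing the conditioning set, can be sketched in a few lines; the implementation below and the toy crime-evidence network are illustrative, not the paper's software tool.

```python
import networkx as nx
from itertools import combinations

def d_separated(G: nx.DiGraph, xs, ys, zs) -> bool:
    """Check whether node sets xs and ys are d-separated given zs in DAG G,
    via the moralized-ancestral-graph construction."""
    relevant = set(xs) | set(ys) | set(zs)
    ancestral = set(relevant)
    for node in relevant:
        ancestral |= nx.ancestors(G, node)

    sub = G.subgraph(ancestral)
    moral = nx.Graph(sub)                      # drop edge directions
    for child in sub.nodes:                    # moralize: marry co-parents
        for p, q in combinations(sub.predecessors(child), 2):
            moral.add_edge(p, q)

    moral.remove_nodes_from(zs)                # condition on zs
    return not any(
        x in moral and y in moral and nx.has_path(moral, x, y)
        for x in xs for y in ys
    )

# Toy crime-evidence DAG (node names are illustrative, not from the paper):
# Motive -> CrimeCommitted <- Opportunity, CrimeCommitted -> DNAMatch
G = nx.DiGraph([
    ("Motive", "CrimeCommitted"),
    ("Opportunity", "CrimeCommitted"),
    ("CrimeCommitted", "DNAMatch"),
])

print(d_separated(G, {"Motive"}, {"Opportunity"}, set()))               # True
print(d_separated(G, {"Motive"}, {"Opportunity"}, {"CrimeCommitted"}))  # False (collider observed)
print(d_separated(G, {"Motive"}, {"DNAMatch"}, {"CrimeCommitted"}))     # True
```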
NASA Technical Reports Server (NTRS)
Chatterjee, Sharmista
1993-01-01
Our first goal in this project was to perform a systems analysis of a closed loop Environmental Control and Life Support System (ECLSS). This pertains to the development of a model of an existing real system from which to assess the state or performance of the existing system. Systems analysis is applied to conceptual models obtained from a system design effort. For our modelling purposes we used a simulator tool called ASPEN (Advanced System for Process Engineering). Our second goal was to evaluate the thermodynamic efficiency of the different components comprising an ECLSS. Use is made of the second law of thermodynamics to determine the irreversibility, or energy loss, of each component. This will aid design scientists in selecting the components generating the least entropy, as our ultimate goal is to keep the entropy generation of the whole system at a minimum.
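As a one-line example of the second-law bookkeeping described above, heat transferred across a finite temperature difference generates entropy at the rate S_gen = Q(1/T_cold - 1/T_hot); the sketch below uses arbitrary numbers, not ECLSS component data.

```python
# Entropy generation for a single component: heat Q transferred from a hot
# stream to a cold stream, S_gen = Q * (1/T_cold - 1/T_hot) >= 0.
# Values are arbitrary examples, not ECLSS data.
Q = 500.0        # W, heat transfer rate
T_hot = 350.0    # K
T_cold = 295.0   # K

S_gen = Q * (1.0 / T_cold - 1.0 / T_hot)   # W/K
print(f"entropy generation rate: {S_gen:.3f} W/K")

# Ranking components by S_gen highlights where the largest thermodynamic
# losses (irreversibilities) occur; the figures below are invented.
components = {"heat exchanger": 0.27, "CO2 sorbent bed": 0.90, "water processor": 0.40}
for name, s in sorted(components.items(), key=lambda kv: -kv[1]):
    print(f"{name:16s} {s:.2f} W/K")
```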
From direct-space discrepancy functions to crystallographic least squares.
Giacovazzo, Carmelo
2015-01-01
Crystallographic least squares are a fundamental tool for crystal structure analysis. In this paper their properties are derived from functions estimating the degree of similarity between two electron-density maps. The new approach leads also to modifications of the standard least-squares procedures, potentially able to improve their efficiency. The role of the scaling factor between observed and model amplitudes is analysed: the concept of unlocated model is discussed and its scattering contribution is combined with that arising from the located model. Also, the possible use of an ancillary parameter, to be associated with the classical weight related to the variance of the observed amplitudes, is studied. The crystallographic discrepancy factors, basic tools often combined with least-squares procedures in phasing approaches, are analysed. The mathematical approach here described includes, as a special case, the so-called vector refinement, used when accurate estimates of the target phases are available.
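One quantity discussed above, the scale factor between observed and calculated amplitudes, has a simple closed-form weighted least-squares solution, k = Σw|Fo||Fc| / Σw|Fc|². The sketch below demonstrates it on synthetic amplitudes together with a conventional R factor; it is not the modified procedure proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic observed and calculated structure-factor amplitudes.
F_calc = rng.uniform(1.0, 50.0, size=200)
F_obs = 1.7 * F_calc + rng.normal(0.0, 1.0, size=200)   # true scale 1.7 plus noise
weights = 1.0 / (1.0 + 0.05 * F_obs) ** 2                # assumed variance-based weights

# Closed-form weighted least-squares scale factor minimizing
# sum of w * (|F_obs| - k*|F_calc|)^2.
k = np.sum(weights * F_obs * F_calc) / np.sum(weights * F_calc**2)

# Conventional crystallographic discrepancy (R) factor at that scale.
R = np.sum(np.abs(F_obs - k * F_calc)) / np.sum(np.abs(F_obs))
print(f"scale factor k = {k:.3f}, R factor = {R:.3f}")
```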
Modeling protein structure at near atomic resolutions with Gorgon.
Baker, Matthew L; Abeysinghe, Sasakthi S; Schuh, Stephen; Coleman, Ross A; Abrams, Austin; Marsh, Michael P; Hryc, Corey F; Ruths, Troy; Chiu, Wah; Ju, Tao
2011-05-01
Electron cryo-microscopy (cryo-EM) has played an increasingly important role in elucidating the structure and function of macromolecular assemblies in near native solution conditions. Typically, however, only non-atomic resolution reconstructions have been obtained for these large complexes, necessitating computational tools for integrating and extracting structural details. With recent advances in cryo-EM, maps at near-atomic resolutions have been achieved for several macromolecular assemblies from which models have been manually constructed. In this work, we describe a new interactive modeling toolkit called Gorgon targeted at intermediate to near-atomic resolution density maps (10-3.5 Å), particularly from cryo-EM. Gorgon's de novo modeling procedure couples sequence-based secondary structure prediction with feature detection and geometric modeling techniques to generate initial protein backbone models. Beyond model building, Gorgon is an extensible interactive visualization platform with a variety of computational tools for annotating a wide variety of 3D volumes. Examples from cryo-EM maps of Rotavirus and Rice Dwarf Virus are used to demonstrate its applicability to modeling protein structure. Copyright © 2011 Elsevier Inc. All rights reserved.
AquaCrop-OS: A tool for resilient management of land and water resources in agriculture
NASA Astrophysics Data System (ADS)
Foster, Timothy; Brozovic, Nicholas; Butler, Adrian P.; Neale, Christopher M. U.; Raes, Dirk; Steduto, Pasquale; Fereres, Elias; Hsiao, Theodore C.
2017-04-01
Water managers, researchers, and other decision makers worldwide are faced with the challenge of increasing food production under population growth, drought, and rising water scarcity. Crop simulation models are valuable tools in this effort and, importantly, provide a means of rapidly quantifying crop yield response to water, climate, and field management practices. Here, we introduce a new open-source crop modelling tool called AquaCrop-OS (Foster et al., 2017), which extends the functionality of the globally used FAO AquaCrop model. Through case studies focused on groundwater-fed irrigation in the High Plains and the Central Valley of California in the United States, we demonstrate how AquaCrop-OS can be used to understand the local biophysical, behavioural, and institutional drivers of water risks in agricultural production. Furthermore, we also illustrate how AquaCrop-OS can be combined effectively with hydrologic and economic models to support drought risk mitigation and decision-making around water resource management at a range of spatial and temporal scales, and highlight future plans for model development and training. T. Foster, et al. (2017) AquaCrop-OS: An open source version of FAO's crop water productivity model. Agricultural Water Management. 181: 18-22. http://dx.doi.org/10.1016/j.agwat.2016.11.015.
Schwartz, Carolyn E; Rapkin, Bruce D
2004-01-01
The increasing evidence for response shift phenomena in quality of life (QOL) assessment points to the necessity to reconsider both the measurement model and the application of psychometric analyses. The proposed psychometric model posits that the QOL true score is always contingent upon parameters of the appraisal process. This new model calls into question existing methods for establishing the reliability and validity of QOL assessment tools and suggests several new approaches for describing the psychometric properties of these scales. Recommendations for integrating the assessment of appraisal into QOL research and clinical practice are discussed. PMID:15038830
The Earth Gravitational Model 1996: The NCCS: Resource for Development, Resource for the Future
NASA Technical Reports Server (NTRS)
2002-01-01
For centuries, men have attempted to understand the climate system through observations obtained from Earth's surface. These observations yielded preliminary understanding of the ocean currents, tides, and prevailing winds using visual observation and simple mechanical tools as their instruments. Today's sensitive, downward-looking radar systems, called altimeters, onboard satellites can measure globally the precise height of the ocean surface. This surface is largely that of the equipotential gravity surface, called the geoid - the level surface to which the oceans would conform if there were no forces acting on them apart from gravity - as well as having a significant 1-2-meter-level signal arising from the motion of the ocean's currents.
Survival Data and Regression Models
NASA Astrophysics Data System (ADS)
Grégoire, G.
2014-12-01
We start this chapter by introducing some basic elements for the analysis of censored survival data. Then we focus on right-censored data and develop two types of regression models. The first one concerns the so-called accelerated failure time (AFT) models, which are parametric models where a function of a parameter depends linearly on the covariables. The second one is a semiparametric model, where the covariables enter in a multiplicative form in the expression of the hazard rate function. The main statistical tool for analysing these regression models is the maximum likelihood methodology and, although we recall some essential results of ML theory, we refer to the chapter "Logistic Regression" for a more detailed presentation.
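As a minimal example of the censored-likelihood machinery referred to above, the sketch below fits the simplest parametric case, an exponential regression model for right-censored data, by maximum likelihood; it stands in for, but is not, the AFT or semiparametric formulations of the chapter.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Simulated right-censored survival data with one covariate x.
n = 400
x = rng.binomial(1, 0.5, n)
true_rate = np.exp(-1.0 + 0.8 * x)                # hazard increases when x = 1
t_event = rng.exponential(1.0 / true_rate)
t_cens = rng.exponential(4.0, n)                  # independent censoring times
t = np.minimum(t_event, t_cens)
delta = (t_event <= t_cens).astype(float)         # 1 = event observed, 0 = censored

# Negative log-likelihood of an exponential regression model with rate
# lambda_i = exp(b0 + b1 * x_i); censored subjects contribute only the
# survival term: log L = sum(delta * log(rate) - rate * t).
def neg_loglik(beta):
    rate = np.exp(beta[0] + beta[1] * x)
    return -np.sum(delta * np.log(rate) - rate * t)

fit = minimize(neg_loglik, x0=np.zeros(2), method="BFGS")
print("estimated (b0, b1):", fit.x.round(3), " true: (-1.0, 0.8)")
```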
CALL in the Zone of Proximal Development: Novelty Effects and Teacher Guidance
ERIC Educational Resources Information Center
Karlström, Petter; Lundin, Eva
2013-01-01
Digital tools are not always used in the manner their designers had in mind. Therefore, it is not enough to assume that learning through CALL tools occurs in intended ways, if at all. We have studied the use of an enhanced word processor for writing essays in Swedish as a second language. The word processor contained natural language processing…
Connecting Requirements to Architecture and Analysis via Model-Based Systems Engineering
NASA Technical Reports Server (NTRS)
Cole, Bjorn F.; Jenkins, J. Steven
2015-01-01
In traditional systems engineering practice, architecture, concept development, and requirements development are related but still separate activities. Concepts for operation, key technical approaches, and related proofs of concept are developed. These inform the formulation of an architecture at multiple levels, starting with the overall system composition and functionality and progressing into more detail. As this formulation is done, a parallel activity develops a set of English statements that constrain solutions. These requirements are often called "shall statements" since they are formulated to use "shall." The separation of requirements from design is exacerbated by well-meaning tools like the Dynamic Object-Oriented Requirements System (DOORS) that remain separate from engineering design tools. With the Europa Clipper project, efforts are being made to change the requirements development approach from a separate activity to one intimately embedded in the formulation effort. This paper presents a modeling approach and related tooling to generate English requirement statements from constraints embedded in the architecture definition.
NASA Astrophysics Data System (ADS)
Farmer, J. Doyne; Gallegati, M.; Hommes, C.; Kirman, A.; Ormerod, P.; Cincotti, S.; Sanchez, A.; Helbing, D.
2012-11-01
We outline a vision for an ambitious program to understand the economy and financial markets as a complex evolving system of coupled networks of interacting agents. This is a completely different vision from that currently used in most economic models. This view implies new challenges and opportunities for policy and managing economic crises. The dynamics of such models inherently involve sudden and sometimes dramatic changes of state. Further, the tools and approaches we use emphasize the analysis of crises rather than of calm periods. In this they respond directly to the calls of Governors Bernanke and Trichet for new approaches to macroeconomic modelling.
The Singularity Mystery Associated with a Radially Continuous Maxwell Viscoelastic Structure
NASA Technical Reports Server (NTRS)
Fang, Ming; Hager, Bradford H.
1995-01-01
The singularity problem associated with a radially continuous Maxwell viscoelastic structure is investigated. A special tool called the isolation function is developed. Results calculated using the isolation function show that the discrete model assumption is no longer valid when the viscoelastic parameter becomes a continuous function of radius. Continuous variations in the upper mantle viscoelastic parameter are especially powerful in destroying the mode-like structures. The contribution of the singularities to the load Love numbers is sensitive to the convexity of the viscoelastic parameter models. The difference between the vertical response and the horizontal response found in layered viscoelastic parameter models remains with continuous models.
Multimodal visualization interface for data management, self-learning and data presentation.
Van Sint Jan, S; Demondion, X; Clapworthy, G; Louryan, S; Rooze, M; Cotten, A; Viceconti, M
2006-10-01
A multimodal visualization software package, called the Data Manager (DM), has been developed to increase interdisciplinary communication around the topic of visualization and modeling of various aspects of the human anatomy. Numerous tools used in radiology are integrated in the interface, which runs on standard personal computers. The available tools, combined with hierarchical data management and custom layouts, allow analysis of medical imaging data using advanced features outside radiological premises (for example, for patient review, conference presentation or tutorial preparation). The system is free and based on an open-source software development architecture, so updates of the system for custom applications are possible.
Prioritization of malaria endemic zones using self-organizing maps in the Manipur state of India.
Murty, Upadhyayula Suryanarayana; Srinivasa Rao, Mutheneni; Misra, Sunil
2008-09-01
A huge amount of epidemiological and public health data is now available and requires analysis and interpretation with appropriate mathematical tools, to support existing methods of controlling mosquitoes and mosquito-borne diseases more effectively; data-mining tools are used to make sense of this chaos. Using data-mining tools, one can develop predictive models, patterns, association rules, and clusters of diseases, which can help decision-makers in controlling the diseases. This paper focuses on the application of data-mining tools, used here for the first time to prioritize the malaria endemic regions of Manipur state, using Self-Organizing Maps (SOM). The SOM results (two-dimensional images called Kohonen maps) clearly show the visual classification of malaria endemic zones into high, medium and low in the different districts of Manipur, and are discussed in the paper.
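The paper's SOM configuration is not given in the abstract; the following compact, from-scratch self-organizing map sketch on made-up village-level indicators only illustrates how records could be mapped onto a small Kohonen grid and grouped into strata.

```python
import numpy as np

rng = np.random.default_rng(4)

# Made-up feature vectors for villages: [cases per 1000, vector density index,
# rainfall index]. These are illustrative, not Manipur surveillance data.
data = rng.random((120, 3))

# A compact self-organizing map (Kohonen map) trained from scratch.
grid_x, grid_y, dim = 4, 4, data.shape[1]
weights = rng.random((grid_x, grid_y, dim))
n_iter, lr0, sigma0 = 2000, 0.5, 2.0

gx, gy = np.meshgrid(np.arange(grid_x), np.arange(grid_y), indexing="ij")

for it in range(n_iter):
    frac = it / n_iter
    lr = lr0 * (1.0 - frac)                  # decaying learning rate
    sigma = sigma0 * (1.0 - frac) + 0.5      # decaying neighborhood radius
    sample = data[rng.integers(len(data))]

    # Best-matching unit (BMU): node whose weight vector is closest to the sample.
    dist = np.linalg.norm(weights - sample, axis=2)
    bi, bj = np.unravel_index(np.argmin(dist), dist.shape)

    # Move the BMU and its grid neighbors toward the sample.
    grid_dist2 = (gx - bi) ** 2 + (gy - bj) ** 2
    h = np.exp(-grid_dist2 / (2.0 * sigma**2))[..., None]
    weights += lr * h * (sample - weights)

# Assign each village to its BMU; cluster sizes hint at high/medium/low strata.
counts = {}
for v in data:
    d = np.linalg.norm(weights - v, axis=2)
    node = tuple(int(i) for i in np.unravel_index(np.argmin(d), d.shape))
    counts[node] = counts.get(node, 0) + 1
print("villages mapped per SOM node:", counts)
```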
Climate Model Diagnostic Analyzer
NASA Technical Reports Server (NTRS)
Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei
2015-01-01
The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. With an exploratory nature of climate data analyses and an explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.
VIPER: a web application for rapid expert review of variant calls.
Wöste, Marius; Dugas, Martin
2018-06-01
With the rapid development in next-generation sequencing, cost and time requirements for genomic sequencing are decreasing, enabling applications in many areas such as cancer research. Many tools have been developed to analyze genomic variation ranging from single nucleotide variants to whole chromosomal aberrations. As sequencing throughput increases, the number of variants called by such tools also grows. Often employed manual inspection of such calls is thus becoming a time-consuming procedure. We developed the Variant InsPector and Expert Rating tool (VIPER) to speed up this process by integrating the Integrative Genomics Viewer into a web application. Analysts can then quickly iterate through variants, apply filters and make decisions based on the generated images and variant metadata. VIPER was successfully employed in analyses with manual inspection of more than 10 000 calls. VIPER is implemented in Java and Javascript and is freely available at https://github.com/MarWoes/viper. marius.woeste@uni-muenster.de. Supplementary data are available at Bioinformatics online.
Mirel, Barbara; Görg, Carsten
2014-04-26
A common class of biomedical analysis is to explore expression data from high throughput experiments for the purpose of uncovering functional relationships that can lead to a hypothesis about mechanisms of a disease. We call this analysis expression driven, -omics hypothesizing. In it, scientists use interactive data visualizations and read deeply in the research literature. Little is known, however, about the actual flow of reasoning and behaviors (sense making) that scientists enact in this analysis, end-to-end. Understanding this flow is important because if bioinformatics tools are to be truly useful they must support it. Sense making models of visual analytics in other domains have been developed and used to inform the design of useful and usable tools. We believe they would be helpful in bioinformatics. To characterize the sense making involved in expression-driven, -omics hypothesizing, we conducted an in-depth observational study of one scientist as she engaged in this analysis over six months. From findings, we abstracted a preliminary sense making model. Here we describe its stages and suggest guidelines for developing visualization tools that we derived from this case. A single case cannot be generalized. But we offer our findings, sense making model and case-based tool guidelines as a first step toward increasing interest and further research in the bioinformatics field on scientists' analytical workflows and their implications for tool design.
42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems
NASA Technical Reports Server (NTRS)
Stoneking, Eric
2018-01-01
Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 is publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.
NASA Technical Reports Server (NTRS)
Mainger, Steve
2004-01-01
As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Research into methods for overcoming some of these selected hurdles has been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (a sub-project of VAMS), and they responded with the development of the Airspace Concept Evaluation System (ACES). As one examines the ACES environment from a communication, navigation, or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions and effects of CNS must be part of ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities that support the ACES environment. NASA GRC initiated the development of a communications traffic loading analysis tool, called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS), as part of this support. This tool allows for forecasting of communications load with the understanding that there is no single, common source for loading models used to evaluate the existing and planned communications channels, and that consensus and accuracy in the traffic load models are very important inputs to the decisions being made on the acceptability of communication techniques used to fulfill the aeronautical requirements. Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion, or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionic systems be of higher fidelity and better consistency than is present in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS, with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.
Propel: Tools and Methods for Practical Source Code Model Checking
NASA Technical Reports Server (NTRS)
Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem
2003-01-01
The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset are called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit-state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.
Digitizing for Computer-Aided Finite Element Model Generation.
1979-10-10
this approach is a collection of programs developed over the last eight years at the University of Arizona, and called the GIFTS system. This paper...briefly describes the latest version of the system, GIFTS-5, and demonstrates its suitability in a design environment by simple examples. The programs...constituting the GIFTS system were used as a tool for research in many areas, including mesh generation, finite element data base design, interactive
ERIC Educational Resources Information Center
DeVane, Benjamin
2017-01-01
In this review article, I argue that games are complementary, not self-supporting, learning tools for democratic education because they can: (a) offer "simplified, but often not simple, outlines" (later called "models") of complex social systems that generate further inquiry; (b) provide "practice spaces" for…
A simple method to predict regional fish abundance: an example in the McKenzie River Basin, Oregon
D.J. McGarvey; J.M. Johnston
2011-01-01
Regional assessments of fisheries resources are increasingly called for, but tools with which to perform them are limited. We present a simple method that can be used to estimate regional carrying capacity and apply it to the McKenzie River Basin, Oregon. First, we use a macroecological model to predict trout densities within small, medium, and large streams in the...
2015-03-26
albeit powerful, method available for exploring CAS. As discussed above, there are many useful mathematical tools appropriate for CAS modeling. Agent-based...cells, telephone calls, and sexual contacts approach power-law distributions. [48] Networks in general are robust against random failures, but...targeted failures can have powerful effects – provided the targeter has a good understanding of the network structure. Some argue (convincingly) that all
A random walk model to evaluate autism
NASA Astrophysics Data System (ADS)
Moura, T. R. S.; Fulco, U. L.; Albuquerque, E. L.
2018-02-01
A common test administered during neurological examination in children is the analysis of their social communication and interaction across multiple contexts, including repetitive patterns of behavior. Poor performance may be associated with neurological conditions characterized by impairments in executive function, such as the so-called pervasive developmental disorders (PDDs), a particular condition of the autism spectrum disorders (ASDs). Inspired by these diagnostic tools, mainly those related to repetitive movements and behaviors, we study here how the diffusion regimes of two discrete-time random walkers, mimicking the lack of social interaction and restricted interests developed by children with PDDs, are affected. Our model, which is based on the so-called elephant random walk (ERW) approach, considers that one of the random walkers can learn and imitate the microscopic behavior of the other with probability f (1 - f otherwise). The diffusion regimes, measured by the Hurst exponent (H), are then obtained, whose changes may indicate a different degree of autism.
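A simplified sketch of the ERW-with-imitation idea is given below: walker A follows a standard ERW memory rule with parameter p, walker B copies A's current step with probability f and otherwise applies the ERW rule to its own history, and the Hurst exponent is estimated from the scaling of the mean squared displacement. The rule details and parameter values are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate(p=0.85, f=0.3, n_steps=1500, n_real=150):
    """Simulate walker B, which imitates an ERW walker A with probability f."""
    xB = np.zeros((n_real, n_steps + 1))
    for r in range(n_real):
        stepsA = [rng.choice([-1, 1])]
        stepsB = [rng.choice([-1, 1])]
        for _ in range(1, n_steps):
            # Walker A: repeat a remembered step with prob p, reverse otherwise.
            memA = stepsA[rng.integers(len(stepsA))]
            stepsA.append(memA if rng.random() < p else -memA)
            if rng.random() < f:
                stepsB.append(stepsA[-1])            # imitation of walker A
            else:
                memB = stepsB[rng.integers(len(stepsB))]
                stepsB.append(memB if rng.random() < p else -memB)
        xB[r, 1:] = np.cumsum(stepsB)
    return xB

xB = simulate()
t = np.arange(1, xB.shape[1])
msd = (xB[:, 1:] ** 2).mean(axis=0)

# Hurst exponent from the scaling MSD ~ t^(2H), fitted on the later portion.
mask = t > 50
slope, _ = np.polyfit(np.log(t[mask]), np.log(msd[mask]), 1)
print(f"estimated Hurst exponent H = {slope / 2:.2f}")
```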
Teamwork tools and activities within the hazard component of the Global Earthquake Model
NASA Astrophysics Data System (ADS)
Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.
2013-05-01
The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called OpenQuake-engine (http://globalquakemodel.org). In this communication we'll provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently on-going initiatives like the creation of a suite of tools for the creation of PSHA input models. Discussion, comments and criticism by the colleagues in the audience will be highly appreciated.
NASA Technical Reports Server (NTRS)
Dubos, Gregory F.; Cornford, Steven
2012-01-01
While the ability to model the state of a space system over time is essential during spacecraft operations, the use of time-based simulations remains rare in preliminary design. The absence of the time dimension in most traditional early design tools can however become a hurdle when designing complex systems whose development and operations can be disrupted by various events, such as delays or failures. As the value delivered by a space system is highly affected by such events, exploring the trade space for designs that yield the maximum value calls for the explicit modeling of time. This paper discusses the use of discrete-event models to simulate spacecraft development schedule as well as operational scenarios and on-orbit resources in the presence of uncertainty. It illustrates how such simulations can be utilized to support trade studies, through the example of a tool developed for DARPA's F6 program to assist the design of "fractionated spacecraft".
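As a generic illustration of discrete-event scheduling (not the F6 trade-study tool itself), the sketch below uses a simple event queue in which hypothetical development tasks finish after a nominal duration plus a random slip, and a possible failure event alters the downstream timeline.

```python
import heapq
import random

random.seed(42)

# Minimal discrete-event sketch: tasks of a hypothetical spacecraft development
# schedule finish after a nominal duration plus a random delay, and a possible
# launch failure pushes the schedule further. Task names, durations, and
# probabilities are invented for illustration.
events = []  # priority queue of (time in months, description)

t = 0.0
for name, nominal in [("design review", 3.0), ("integration", 6.0), ("environmental test", 4.0)]:
    t += nominal + random.expovariate(1.0 / 1.5)   # random schedule slip
    heapq.heappush(events, (t, f"{name} complete"))

if random.random() < 0.15:                          # assumed failure probability
    heapq.heappush(events, (t + 1.0, "launch failure - replan required"))
else:
    heapq.heappush(events, (t + 1.0, "launch"))

while events:
    time, what = heapq.heappop(events)
    print(f"t = {time:5.1f} months: {what}")
```

Running many such randomized schedules and scoring the delivered value per run is one way a time-explicit trade study can compare design alternatives.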
sbml-diff: A Tool for Visually Comparing SBML Models in Synthetic Biology.
Scott-Brown, James; Papachristodoulou, Antonis
2017-07-21
We present sbml-diff, a tool that is able to read a model of a biochemical reaction network in SBML format and produce a range of diagrams showing different levels of detail. Each diagram type can be used to visualize a single model or to visually compare two or more models. The default view depicts species as ellipses, reactions as rectangles, rules as parallelograms, and events as diamonds. A cartoon view replaces the symbols used for reactions on the basis of the associated Systems Biology Ontology terms. An abstract view represents species as ellipses and draws edges between them to indicate whether a species increases or decreases the production or degradation of another species. sbml-diff is freely licensed under the three-clause BSD license and can be downloaded from https://github.com/jamesscottbrown/sbml-diff and used as a python package called from other software, as a free-standing command-line application, or online using the form at http://sysos.eng.ox.ac.uk/tebio/upload.
ON DEVELOPING TOOLS AND METHODS FOR ENVIRONMENTALLY BENIGN PROCESSES
Two types of tools are generally needed for designing processes and products that are cleaner from environmental impact perspectives. The first kind is called process tools. Process tools are based on information obtained from experimental investigations in chemistry, materials...
DOT National Transportation Integrated Search
2003-04-01
Introduction Motorist aid call boxes are used to provide motorist assistance, improve safety, and can serve as an incident detection tool. More recently, Intelligent Transportation Systems (ITS) applications have been added to call box systems to enh...
González Carrascosa, R; Bayo Montó, J L; Meneu Barreira, T; García Segovia, P; Martínez-Monzó, J
2011-01-01
To introduce and describe a new tool called UPV-FFQ for evaluating the dietary intake of the university population. The new tool consists principally of a self-administered online food frequency questionnaire (FFQ). The UPV-FFQ was developed as a set of web pages built with ASP.NET 2.0 and backed by a SQL Server 2005 database. The paper-and-pencil FFQ called "Dieta, salud y antropometría en la población universitaria" was used as the model for developing the FFQ. The tool has three parts: (1) a homepage, (2) a general questionnaire and (3) an FFQ. The FFQ has a closed list of 84 food items commonly consumed in the Valencia region. Respondents indicate the food items that they consume (2 possible options), the frequency of consumption (9 response options) and the quantity consumed (7 response options). The UPV-FFQ includes approximately 250 color photographs representing three portion sizes; the photographs help respondents choose the portion size that best matches their habitual portion. The new tool provides quantitative information on the habitual intake of 31 nutritional parameters and qualitative information from the general questionnaire. A pilot study was carried out with a total of 57 respondents. The mean time needed to fill in the questionnaire was 15 minutes. The pilot study concluded that the questionnaire was easy to use, low cost and time efficient. The format and the sequence of the questions were easily understood.
Integrating interactive computational modeling in biology curricula.
Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A
2015-03-01
While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
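Cell Collective itself is a web platform, but the kind of logical-model simulation it supports can be illustrated with a minimal Boolean network sketch in Python; the three-node regulatory rules below are hypothetical and are not taken from any Cell Collective model.

```python
# Minimal synchronous Boolean network simulation (illustrative only).
# Hypothetical rules: A activates B, B activates C, C represses A.
rules = {
    "A": lambda s: not s["C"],
    "B": lambda s: s["A"],
    "C": lambda s: s["B"],
}

def step(state):
    """Synchronous update: every node is recomputed from the previous state."""
    return {node: rule(state) for node, rule in rules.items()}

state = {"A": True, "B": False, "C": False}
for t in range(8):
    print(t, state)
    state = step(state)
```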
NASA Technical Reports Server (NTRS)
Zheng, Yihua; Kuznetsova, Maria M.; Pulkkinen, Antti A.; Maddox, Marlo M.; Mays, Mona Leila
2015-01-01
The Space Weather Research Center (http://swrc.gsfc.nasa.gov) at NASA Goddard, part of the Community Coordinated Modeling Center (http://ccmc.gsfc.nasa.gov), is committed to providing research-based forecasts and notifications to address NASA's space weather needs, in addition to its critical role in space weather education. It provides a host of services including spacecraft anomaly resolution, historical impact analysis, real-time monitoring and forecasting, tailored space weather alerts and products, and weekly summaries and reports. In this paper, we focus on how (near) real-time data (both in space and on ground), in combination with modeling capabilities and an innovative dissemination system called the integrated Space Weather Analysis system (http://iswa.gsfc.nasa.gov), enable monitoring, analyzing, and predicting the spacecraft charging environment for spacecraft users. Relevant tools and resources are discussed.
Modeling the Multi-Body System Dynamics of a Flexible Solar Sail Spacecraft
NASA Technical Reports Server (NTRS)
Kim, Young; Stough, Robert; Whorton, Mark
2005-01-01
Solar sail propulsion systems enable a wide range of space missions that are not feasible with current propulsion technology. Hardware concepts and analytical methods have matured through ground development to the point that a flight validation mission is now realizable. Much attention has been given to modeling the structural dynamics of the constituent elements, but to date an integrated system level dynamics analysis has been lacking. Using a multi-body dynamics and control analysis tool called TREETOPS, the coupled dynamics of the sailcraft bus, sail membranes, flexible booms, and control system sensors and actuators of a representative solar sail spacecraft are investigated to assess system level dynamics and control issues. With this tool, scaling issues and parametric trade studies can be performed to study achievable performance, control authority requirements, and control/structure interaction assessments.
Anvil Forecast Tool in the Advanced Weather Interactive Processing System (AWIPS)
NASA Technical Reports Server (NTRS)
Barrett, Joe H., III; Hood, Doris
2009-01-01
Launch Weather Officers (LWOs) from the 45th Weather Squadron (45 WS) and forecasters from the National Weather Service (NWS) Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violating the Lightning Launch Commit Criteria (LLCC) (Krider et al. 2006; Space Shuttle Flight Rules (FR), NASA/JSC 2004). As a result, the Applied Meteorology Unit (AMU) developed a tool that creates an anvil threat corridor graphic that can be overlaid on satellite imagery using the Meteorological Interactive Data Display System (MIDDS, Short and Wheeler, 2002). The tool helps forecasters estimate the locations of thunderstorm anvils at one, two, and three hours into the future. It has been used extensively in launch and landing operations by both the 45 WS and SMG. The Advanced Weather Interactive Processing System (AWIPS) is now used along with MIDDS for weather analysis and display at SMG. In Phase I of this task, SMG tasked the AMU to transition the tool from MIDDS to AWIPS (Barrett et al., 2007). For Phase II, SMG requested the AMU make the Anvil Forecast Tool in AWIPS more configurable by creating the capability to read model gridded data from user-defined model files instead of hard-coded files. An NWS local AWIPS application called AGRID was used to accomplish this. In addition, SMG needed to be able to define the pressure levels for the model data, instead of hard-coding the bottom level as 300 mb and the top level as 150 mb. This paper describes the initial development of the Anvil Forecast Tool for MIDDS, followed by the migration of the tool to AWIPS in Phase I. It then gives a detailed presentation of the Phase II improvements to the AWIPS tool.
Mobile phone call data as a regional socio-economic proxy indicator.
Šćepanović, Sanja; Mishkovski, Igor; Hui, Pan; Nurminen, Jukka K; Ylä-Jääski, Antti
2015-01-01
The advent of publishing anonymized call detail records opens the door for temporal and spatial human dynamics studies. Such studies, besides being useful for creating universal models of mobility patterns, could also be used to create new socio-economic proxy indicators that do not rely only on local or state institutions. In this paper, from the frequency of calls at different times of the day in different small regional units (sub-prefectures) in Côte d'Ivoire, we infer users' home and work sub-prefectures. This division of users enables us to analyze different mobility and calling patterns for the different regions. We then compare how those patterns correlate with data from other sources, such as news for particular events in the given period, census data, economic activity, poverty index, power plants and energy grid data. Our results show high correlation in many of the cases, revealing the diversity of socio-economic insights that can be inferred using only mobile phone call data. The methods and the results may be particularly relevant to policy-makers engaged in poverty reduction initiatives as they can provide an affordable tool in the context of resource-constrained developing economies, such as Côte d'Ivoire's.
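As a rough illustration of the home/work inference described above (not the authors' exact procedure), the sketch below assigns each user a home sub-prefecture from their most frequent night-time call location and a work sub-prefecture from their most frequent daytime location; the record format and sub-prefecture codes are hypothetical.

```python
from collections import Counter, defaultdict

# Hypothetical call records: (user_id, hour_of_day, sub_prefecture)
calls = [
    ("u1", 22, "SP-12"), ("u1", 23, "SP-12"), ("u1", 10, "SP-03"),
    ("u1", 14, "SP-03"), ("u2", 21, "SP-07"), ("u2", 11, "SP-07"),
]

night = defaultdict(Counter)   # calls placed 20:00-06:00
day = defaultdict(Counter)     # calls placed 08:00-18:00
for user, hour, sp in calls:
    if hour >= 20 or hour < 6:
        night[user][sp] += 1
    elif 8 <= hour < 18:
        day[user][sp] += 1

home = {u: c.most_common(1)[0][0] for u, c in night.items()}
work = {u: c.most_common(1)[0][0] for u, c in day.items()}
print(home)   # {'u1': 'SP-12', 'u2': 'SP-07'}
print(work)   # {'u1': 'SP-03', 'u2': 'SP-07'}
```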
ART-Ada design project, phase 2
NASA Technical Reports Server (NTRS)
Lee, S. Daniel; Allen, Bradley P.
1990-01-01
Interest in deploying expert systems in Ada has increased. An Ada-based expert system tool called ART-Ada is described; it was built to support research into the language and methodological issues of expert systems in Ada. ART-Ada allows applications of an existing expert system tool called ART-IM (Automated Reasoning Tool for Information Management) to be deployed in various Ada environments. ART-IM, a C-based expert system tool, is used to generate Ada source code that is compiled and linked with an Ada-based inference engine to produce an Ada executable image. ART-Ada is being used to implement several expert systems for NASA's Space Station Freedom Program and the U.S. Air Force.
Terminal Area Conflict Detection and Resolution Tool
NASA Technical Reports Server (NTRS)
Verma, Savita Arora
2011-01-01
This poster will describe an analysis of a conflict detection and resolution tool for the terminal area called T-TSAFE. With altitude clearance information, the tool can reduce false alerts to as low as 2 per hour.
... is usually done using a tool called a stethoscope. Health care providers routinely listen to a person's ... unborn infants. This can be done with a stethoscope or with sound waves (called Doppler ultrasound). Auscultation ...
ERIC Educational Resources Information Center
Parmaxi, Antigoni; Zaphiris, Panayiotis
2017-01-01
This study explores the research development pertaining to the use of Web 2.0 technologies in the field of Computer-Assisted Language Learning (CALL). Published research manuscripts related to the use of Web 2.0 tools in CALL have been explored, and the following research foci have been determined: (1) Web 2.0 tools that dominate second/foreign…
Trinkaus, Hans L; Gaisser, Andrea E
2010-09-01
Nearly 30,000 individual inquiries are answered annually by the telephone cancer information service (CIS, KID) of the German Cancer Research Center (DKFZ). The aim was to develop a tool for evaluating these calls, and to support the complete counseling process interactively. A novel software tool is introduced, based on a structure similar to a music score. Treating the interaction as a "duet", guided by the CIS counselor, the essential contents of the dialogue are extracted automatically. For this, "trained speech recognition" is applied to the (known) counselor's part, and "keyword spotting" is used on the (unknown) client's part to pick out specific items from the "word streams". The outcomes fill an abstract score representing the dialogue. Pilot tests performed on a prototype of SACA (Software Assisted Call Analysis) resulted in a basic proof of concept: Demographic data as well as information regarding the situation of the caller could be identified. The study encourages following up on the vision of an integrated SACA tool for supporting calls online and performing statistics on its knowledge database offline. Further research perspectives are to check SACA's potential in comparison with established interaction analysis systems like RIAS. Copyright (c) 2010 Elsevier Ireland Ltd. All rights reserved.
ToTem: a tool for variant calling pipeline optimization.
Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka
2018-06-26
High-throughput bioinformatics analyses of next-generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process and with the possibility of plugging in almost any tool or code. To prevent over-fitting of pipeline parameters, ToTem ensures their reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are presented as interactive graphs and tables, allowing an optimal pipeline to be selected based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is a tool for automated pipeline optimization which is freely available as a web application at https://totem.software .
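The benchmarking that ToTem automates ultimately reduces to comparing called variants against a truth set. A minimal sketch of that comparison (not ToTem's own code; the variant tuples are hypothetical) is shown below.

```python
def benchmark(called, truth):
    """Compute precision, recall and F-measure for a set of variant calls.
    Variants are represented as (chromosome, position, ref, alt) tuples."""
    called, truth = set(called), set(truth)
    tp = len(called & truth)
    fp = len(called - truth)
    fn = len(truth - called)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

truth = {("chr1", 100, "A", "T"), ("chr1", 250, "G", "C"), ("chr2", 30, "T", "G")}
calls = {("chr1", 100, "A", "T"), ("chr2", 30, "T", "G"), ("chr2", 99, "C", "A")}
print(benchmark(calls, truth))   # (0.67, 0.67, 0.67) approximately
```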
NASA Technical Reports Server (NTRS)
Leveson, Nancy G.; Heimdahl, Mats P. E.; Reese, Jon Damon
1999-01-01
Previously, we defined a blackbox formal system modeling language called RSML (Requirements State Machine Language). The language was developed over several years while specifying the system requirements for a collision avoidance system for commercial passenger aircraft. During the language development, we received continual feedback and evaluation by FAA employees and industry representatives, which helped us to produce a specification language that is easily learned and used by application experts. Since the completion of the RSML project, we have continued our research on specification languages. This research is part of a larger effort to investigate the more general problem of providing tools to assist in developing embedded systems. Our latest experimental toolset is called SpecTRM (Specification Tools and Requirements Methodology), and the formal specification language is SpecTRM-RL (SpecTRM Requirements Language). This paper describes what we have learned from our use of RSML and how those lessons were applied to the design of SpecTRM-RL. We discuss our goals for SpecTRM-RL and the design features that support each of these goals.
Pohjola, Mikko V; Pohjola, Pasi; Tainio, Marko; Tuomisto, Jouni T
2013-06-26
The calls for knowledge-based policy and policy-relevant research invoke a need to evaluate and manage environment and health assessments and models according to their societal outcomes. This review explores how well the existing approaches to assessment and model performance serve this need. The perspectives on assessment and model performance in the scientific literature can be called: (1) quality assurance/control, (2) uncertainty analysis, (3) technical assessment of models, (4) effectiveness and (5) other perspectives, according to what is primarily seen to constitute the goodness of assessments and models. The categorization is not strict, and methods, tools and frameworks in different perspectives may overlap. However, altogether it seems that most approaches to assessment and model performance are relatively narrow in their scope. The focus in most approaches is on the outputs and the making of assessments and models. Practical application of the outputs and the consequential outcomes are often left unaddressed. It appears that more comprehensive approaches that combine the essential characteristics of different perspectives are needed. This necessitates a better account of the mechanisms of collective knowledge creation and the relations between knowledge and practical action. Some new approaches to assessment, modeling and their evaluation and management span the chain from knowledge creation to societal outcomes, but the complexity of evaluating societal outcomes remains a challenge.
NASA Technical Reports Server (NTRS)
1984-01-01
A key tool of Redken Laboratories' new line of hair styling appliances is an instrument called a thermograph, a heat-sensing device originally developed by Hughes Aircraft Co. under U.S. Army and NASA funding. Redken Laboratories bought one of the early models of the Hughes Probeye Thermal Video System, or TVS, which detects the various degrees of heat emitted by an object and displays the results in color on a TV monitor, with colors representing the different temperatures detected.
2013-11-01
by existing cyber-attack detection tools far exceeds the analysts' cognitive capabilities. Grounded in perceptual and cognitive theory, many visual... Processes. Inspired by the sense-making theory discussed earlier, we model the analytical reasoning process of cyber analysts using three key... analyst are called "working hypotheses"; each hypothesis could trigger further actions to confirm or disconfirm it. New actions will lead to new...
Automated simulation as part of a design workstation
NASA Technical Reports Server (NTRS)
Cantwell, Elizabeth; Shenk, T.; Robinson, P.; Upadhye, R.
1990-01-01
A development project for a design workstation for advanced life-support systems (called the DAWN Project, for Design Assistant Workstation), incorporating qualitative simulation, required the implementation of a useful qualitative simulation capability and the integration of qualitative and quantitative simulation such that simulation capabilities are maximized without duplication. The reason is that to produce design solutions to a system goal, the behavior of the system in both a steady and a perturbed state must be represented. The discussion focuses on the Qualitative Simulation Tool (QST), on an expert-system-like model-building and simulation interface tool called ScratchPad (SP), and on the integration of QST and SP with more conventional, commercially available simulation packages now being applied in the evaluation of life-support system processes and components.
Optimal-adaptive filters for modelling spectral shape, site amplification, and source scaling
Safak, Erdal
1989-01-01
This paper introduces some applications of optimal filtering techniques to earthquake engineering by using the so-called ARMAX models. Three applications are presented: (a) spectral modelling of ground accelerations, (b) site amplification (i.e., the relationship between two records obtained at different sites during an earthquake), and (c) source scaling (i.e., the relationship between two records obtained at a site during two different earthquakes). A numerical example for each application is presented by using recorded ground motions. The results show that the optimal filtering techniques provide elegant solutions to the above problems and can be a useful tool in earthquake engineering.
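The paper's methods use the ARMAX model family; as a simplified stand-in, the numpy-only sketch below fits an AR(p) model (a special case within that family) to a synthetic ground-acceleration record by least squares and uses the fitted coefficients to describe the spectral shape. The record and model order are illustrative, not taken from the paper.

```python
import numpy as np

def fit_ar(y, p):
    """Least-squares fit of an AR(p) model y[t] = a1*y[t-1] + ... + ap*y[t-p] + e[t]."""
    X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    coeffs, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    resid = y[p:] - X @ coeffs
    return coeffs, resid.var()

def ar_spectrum(coeffs, sigma2, freqs, dt):
    """AR model spectral shape: sigma^2 / |1 - sum_k a_k exp(-i 2 pi f k dt)|^2."""
    k = np.arange(1, len(coeffs) + 1)
    denom = 1.0 - np.exp(-2j * np.pi * np.outer(freqs * dt, k)) @ coeffs
    return sigma2 / np.abs(denom) ** 2

# Synthetic "ground acceleration": band-limited noise sampled at 50 Hz.
rng = np.random.default_rng(0)
dt, n = 0.02, 2000
y = np.convolve(rng.standard_normal(n), np.hanning(9), mode="same")

coeffs, sigma2 = fit_ar(y, p=4)
freqs = np.linspace(0.1, 25.0, 200)
spec = ar_spectrum(coeffs, sigma2, freqs, dt)
print("dominant model frequency (Hz):", freqs[np.argmax(spec)])
```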
The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.
Adolf-Bryfogle, Jared; Dunbrack, Roland L
2013-01-01
The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.
Augmenting Conceptual Design Trajectory Tradespace Exploration with Graph Theory
NASA Technical Reports Server (NTRS)
Dees, Patrick D.; Zwack, Mathew R.; Steffens, Michael; Edwards, Stephen
2016-01-01
Within conceptual design, changes occur rapidly due to a combination of uncertainty and shifting requirements. To stay relevant in this fluid time, trade studies must also be performed rapidly. In order to drive down analysis time while improving the information gained by these studies, surrogate models can be created to represent the complex output of a tool or tools within a specified tradespace. In order to create this model, however, a large amount of data must be collected in a short amount of time. Under this approach, the historical practice of relying on subject matter experts to generate the required data is schedule infeasible; by implementing automation and distributed analysis, the required data can instead be generated in a fraction of the time. Previous work focused on setting up a tool called multiPOST, capable of orchestrating many simultaneous runs of an analysis tool and assessing these automated analyses using heuristics gleaned from the best practices of current subject matter experts. In this update to the previous work, elements of graph theory are included to further drive down analysis time by leveraging data previously gathered. The approach is shown to outperform the previous method in both the time required and the quantity and quality of data produced.
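A minimal sketch of the surrogate-modelling step described above, assuming scikit-learn is available: sampled (input, output) pairs from an expensive analysis run are used to fit a Gaussian-process surrogate that can then be queried cheaply across the tradespace. The toy objective function stands in for the real trajectory analysis code and is not part of multiPOST.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_analysis(x):
    """Stand-in for a long-running trajectory/vehicle analysis run."""
    return np.sin(3.0 * x) + 0.1 * x ** 2

# Sample the tradespace (in practice these runs would be automated and distributed).
X_train = np.linspace(0.0, 3.0, 15).reshape(-1, 1)
y_train = expensive_analysis(X_train).ravel()

surrogate = GaussianProcessRegressor().fit(X_train, y_train)

# Query the surrogate densely; predictions return almost instantly.
X_query = np.linspace(0.0, 3.0, 200).reshape(-1, 1)
y_pred, y_std = surrogate.predict(X_query, return_std=True)
print("max predicted response:", y_pred.max(), "+/-", y_std[y_pred.argmax()])
```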
Coupling of snow and permafrost processes using the Basic Modeling Interface (BMI)
NASA Astrophysics Data System (ADS)
Wang, K.; Overeem, I.; Jafarov, E. E.; Piper, M.; Stewart, S.; Clow, G. D.; Schaefer, K. M.
2017-12-01
We developed a permafrost modeling tool by implementing the Kudryavtsev empirical permafrost active layer depth model (the so-called "Ku" component). The model is specifically set up with a Basic Model Interface (BMI), which enhances its potential for coupling to other earth surface process model components. The model is accessible through the Web Modeling Tool in the Community Surface Dynamics Modeling System (CSDMS). The Kudryavtsev model has been applied across Alaska to model permafrost distribution at high spatial resolution, and model predictions have been verified against Circumpolar Active Layer Monitoring (CALM) in-situ observations. The Ku component uses monthly meteorological forcing, including air temperature, snow depth, and snow density, and predicts active layer thickness (ALT) and the temperature at the top of permafrost (TTOP), which are important factors in snow-hydrological processes. BMI provides an easy approach for coupling models with each other. Here, we present a case of coupling the Ku component to snow process components, including the Snow-Degree-Day (SDD) and Snow-Energy-Balance (SEB) methods, which are existing components in the hydrological model TOPOFLOW. The workflow is to (1) get variables from the meteorology component, set their values in the snow process component, and advance the snow process component, (2) get variables from the meteorology and snow components, provide these to the Ku component and advance it, and (3) get variables from the snow process component, set their values in the meteorology component, and advance the meteorology component. The next phase is to couple the permafrost component with the fully BMI-compliant TOPOFLOW hydrological model, which could provide a useful tool for investigating permafrost hydrological effects.
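The three-step workflow above can be expressed compactly against the Basic Model Interface, whose components expose initialize/update/get_value/set_value/finalize methods. The sketch below uses hypothetical component objects and variable names and is not code from the Ku, TOPOFLOW, or CSDMS distributions.

```python
# Illustrative BMI-style coupling loop (hypothetical components and variable names).
# Each component is assumed to expose the standard BMI methods:
#   initialize(cfg), update(), get_value(name), set_value(name, value), finalize().

def couple(meteorology, snow, ku_permafrost, n_months, cfgs):
    meteorology.initialize(cfgs["met"])
    snow.initialize(cfgs["snow"])
    ku_permafrost.initialize(cfgs["ku"])

    for month in range(n_months):
        # (1) meteorology -> snow component, then advance the snow component
        snow.set_value("air_temperature", meteorology.get_value("air_temperature"))
        snow.update()

        # (2) meteorology + snow -> Ku permafrost component, then advance it
        ku_permafrost.set_value("air_temperature", meteorology.get_value("air_temperature"))
        ku_permafrost.set_value("snow_depth", snow.get_value("snow_depth"))
        ku_permafrost.set_value("snow_density", snow.get_value("snow_density"))
        ku_permafrost.update()

        # (3) snow feedback -> meteorology component, then advance it
        meteorology.set_value("snow_cover_fraction", snow.get_value("snow_cover_fraction"))
        meteorology.update()

    alt = ku_permafrost.get_value("active_layer_thickness")
    for comp in (meteorology, snow, ku_permafrost):
        comp.finalize()
    return alt
```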
SMARTE: SUSTAINABLE MANAGEMENT APPROACHES AND REVITALIZATION TOOLS-ELECTRONIC (BELFAST, IRELAND)
The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based, decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...
PredicT-ML: a tool for automating machine learning model building with big clinical data.
Luo, Gang
2016-01-01
Predictive modeling is fundamental to transforming large clinical data sets, or "big clinical data," into actionable knowledge for various healthcare applications. Machine learning is a major predictive modeling approach, but two barriers make its use in healthcare challenging. First, a machine learning tool user must choose an algorithm and assign one or more model parameters called hyper-parameters before model training. The algorithm and hyper-parameter values used typically impact model accuracy by over 40 %, but their selection requires many labor-intensive manual iterations that can be difficult even for computer scientists. Second, many clinical attributes are repeatedly recorded over time, requiring temporal aggregation before predictive modeling can be performed. Many labor-intensive manual iterations are required to identify a good pair of aggregation period and operator for each clinical attribute. Both barriers result in time and human resource bottlenecks, and preclude healthcare administrators and researchers from asking a series of what-if questions when probing opportunities to use predictive models to improve outcomes and reduce costs. This paper describes our design of and vision for PredicT-ML (prediction tool using machine learning), a software system that aims to overcome these barriers and automate machine learning model building with big clinical data. The paper presents the detailed design of PredicT-ML. PredicT-ML will open the use of big clinical data to thousands of healthcare administrators and researchers and increase the ability to advance clinical research and improve healthcare.
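The two barriers described above are, in conventional workflows, handled by hand with code along the lines of the sketch below. This is generic scikit-learn/pandas usage, not PredicT-ML itself: a temporal aggregation of a repeatedly recorded attribute, followed by a cross-validated grid search over hyper-parameter values. All data are synthetic.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# --- Temporal aggregation (hypothetical data): collapse repeated glucose
# readings to one monthly mean per patient before predictive modeling.
readings = pd.DataFrame({
    "patient": ["p1"] * 4 + ["p2"] * 3,
    "time": pd.to_datetime(["2016-01-01", "2016-01-10", "2016-02-02", "2016-02-20",
                            "2016-01-05", "2016-01-25", "2016-02-15"]),
    "glucose": [110, 150, 140, 130, 95, 100, 105],
})
monthly = (readings.set_index("time")
                   .groupby("patient")["glucose"]
                   .resample("MS").mean())
print(monthly)

# --- Hyper-parameter selection: cross-validated grid search over algorithm settings.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                   # synthetic aggregated features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic outcome labels
search = GridSearchCV(RandomForestClassifier(random_state=0),
                      {"n_estimators": [50, 200], "max_depth": [3, None]},
                      cv=5).fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```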
Examining A Health Care Price Transparency Tool: Who Uses It, And How They Shop For Care.
Sinaiko, Anna D; Rosenthal, Meredith B
2016-04-01
Calls for transparency in health care prices are increasing, in an effort to encourage and enable patients to make value-based decisions. Yet there is very little evidence of whether and how patients use health care price transparency tools. We evaluated the experiences, in the period 2011-12, of an insured population of nonelderly adults with Aetna's Member Payment Estimator, a web-based tool that provides real-time, personalized, episode-level price estimates. Overall, use of the tool increased during the study period but remained low. Nonetheless, for some procedures the number of people searching for prices of services (called searchers) was high relative to the number of people who received the service (called patients). Among Aetna patients who had an imaging service, childbirth, or one of several outpatient procedures, searchers for price information were significantly more likely to be younger and healthier and to have incurred higher annual deductible spending than patients who did not search for price information. A campaign to deliver price information to consumers may be important to increase patients' engagement with price transparency tools. Project HOPE—The People-to-People Health Foundation, Inc.
Gopalaswamy, Arjun M.; Royle, J. Andrew; Hines, James E.; Singh, Pallavi; Jathanna, Devcharan; Kumar, N. Samba; Karanth, K. Ullas
2012-01-01
1. The advent of spatially explicit capture-recapture models is changing the way ecologists analyse capture-recapture data. However, the advantages offered by these new models are not fully exploited because they can be difficult to implement. 2. To address this need, we developed a user-friendly software package, created within the R programming environment, called SPACECAP. This package implements Bayesian spatially explicit hierarchical models to analyse spatial capture-recapture data. 3. Given that a large number of field biologists prefer software with graphical user interfaces for analysing their data, SPACECAP is particularly useful as a tool to increase the adoption of Bayesian spatially explicit capture-recapture methods in practice.
NASA Astrophysics Data System (ADS)
Ivannikova, E.; Kruglyakov, M.; Kuvshinov, A. V.; Rastaetter, L.; Pulkkinen, A. A.; Ngwira, C. M.
2017-12-01
During extreme space weather events, electric currents in the Earth's magnetosphere and ionosphere experience large variations, which leads to a dramatic intensification of the fluctuating magnetic field at the surface of the Earth. According to Faraday's law of induction, the fluctuating geomagnetic field in turn induces an electric field that generates harmful currents (so-called "geomagnetically induced currents"; GICs) in grounded technological systems. Understanding (via modeling) the spatio-temporal evolution of the geoelectric field during enhanced geomagnetic activity is a key consideration in estimating the hazard to technological systems from space weather. We present the results of ground geoelectric field modeling for the Northeast United States, performed with our novel numerical tool based on an integral equation approach. The tool exploits realistic regional three-dimensional (3-D) models of the Earth's electrical conductivity and realistic global models of the spatio-temporal evolution of the magnetospheric and ionospheric current systems responsible for geomagnetic disturbances. We also explore in detail the manifestation of the coastal effect (the anomalous intensification of the geoelectric field near the coasts) in this region.
Model-based diagnostics for Space Station Freedom
NASA Technical Reports Server (NTRS)
Fesq, Lorraine M.; Stephan, Amy; Martin, Eric R.; Lerutte, Marcel G.
1991-01-01
An innovative approach to fault management was recently demonstrated for the NASA LeRC Space Station Freedom (SSF) power system testbed. This project capitalized on research in model-based reasoning, which uses knowledge of a system's behavior to monitor its health. The fault management system (FMS) can isolate failures online, or in a post analysis mode, and requires no knowledge of failure symptoms to perform its diagnostics. An in-house tool called MARPLE was used to develop and run the FMS. MARPLE's capabilities are similar to those available from commercial expert system shells, although MARPLE is designed to build model-based as opposed to rule-based systems. These capabilities include functions for capturing behavioral knowledge, a reasoning engine that implements a model-based technique known as constraint suspension, and a tool for quickly generating new user interfaces. The prototype produced by applying MARPLE to SSF not only demonstrated that model-based reasoning is a valuable diagnostic approach, but it also suggested several new applications of MARPLE, including an integration and testing aid, and a complement to state estimation.
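Constraint suspension, the model-based technique mentioned above, can be illustrated with a small sketch (not MARPLE itself): each component contributes a constraint relating its inputs and outputs, and a component becomes a fault candidate if suspending only its constraint makes the remaining model consistent with the observations. The three-component circuit is hypothetical, and for simplicity all internal signals are assumed observable.

```python
# Constraint-suspension diagnosis sketch (illustrative, not MARPLE).
# A tiny system: adder A1 = x + y, adder A2 = y + z, multiplier M1 = a1 * a2.

def constraints(vals):
    """Return each component's constraint as a name -> satisfied? mapping."""
    return {
        "A1": vals["a1"] == vals["x"] + vals["y"],
        "A2": vals["a2"] == vals["y"] + vals["z"],
        "M1": vals["m1"] == vals["a1"] * vals["a2"],
    }

def diagnose(observations):
    """A component is a candidate if suspending only its constraint
    leaves every other constraint satisfied by the observations."""
    checks = constraints(observations)
    candidates = []
    for suspended in checks:
        if all(ok for name, ok in checks.items() if name != suspended):
            candidates.append(suspended)
    return candidates

# Observed values: A1's output is wrong (should be 5), everything else consistent.
obs = {"x": 2, "y": 3, "z": 4, "a1": 9, "a2": 7, "m1": 63}
print(diagnose(obs))   # ['A1']
```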
Combining Domain-driven Design and Mashups for Service Development
NASA Astrophysics Data System (ADS)
Iglesias, Carlos A.; Fernández-Villamor, José Ignacio; Del Pozo, David; Garulli, Luca; García, Boni
This chapter presents the Romulus project approach to service development using Java-based web technologies. Romulus aims to improve the productivity of service development by providing a tool-supported model for conceiving Java-based web applications. This model follows a Domain-Driven Design approach, which states that the primary focus of software projects should be the core domain and domain logic. Romulus proposes a tool-supported model, the Roma Metaframework, that provides an abstraction layer on top of existing web frameworks and automates application generation from the domain model. This metaframework follows an object-centric approach and complements Domain-Driven Design by identifying the most common cross-cutting concerns (security, service, view, ...) of web applications. The metaframework uses annotations for enriching the domain model with these cross-cutting concerns, so-called aspects. In addition, the chapter presents the usage of mashup technology in the metaframework for service composition, using the web mashup editor MyCocktail. This approach is applied to a scenario of the Mobile Phone Service Portability case study for the development of a new service.
Electrical conductivity of metal powders under pressure
NASA Astrophysics Data System (ADS)
Montes, J. M.; Cuevas, F. G.; Cintas, J.; Urban, P.
2011-12-01
A model for calculating the electrical conductivity of a compressed powder mass consisting of oxide-coated metal particles has been derived. A theoretical tool previously developed by the authors, the so-called `equivalent simple cubic system', was used in the model deduction. This tool is based on relating the actual powder system to an equivalent one consisting of deforming spheres packed in a simple cubic lattice, which is much easier to examine. The proposed model relates the effective electrical conductivity of the powder mass under compression to its level of porosity. Other physically measurable parameters in the model are the conductivities of the metal and oxide constituting the powder particles, their radii, the mean thickness of the oxide layer and the tap porosity of the powder. Two additional parameters controlling the effect of the descaling of the particle oxide layer were empirically introduced. The proposed model was experimentally verified by measurements of the electrical conductivity of aluminium, bronze, iron, nickel and titanium powders under pressure. The consistency between theoretical predictions and experimental results was reasonably good in all cases.
Application of uncertainty and sensitivity analysis to the air quality SHERPA modelling tool
NASA Astrophysics Data System (ADS)
Pisoni, E.; Albrecht, D.; Mara, T. A.; Rosati, R.; Tarantola, S.; Thunis, P.
2018-06-01
Air quality has significantly improved in Europe over the past few decades. Nonetheless, high concentrations are still measured, mainly in specific regions or cities. This dimensional shift, from EU-wide to hot-spot exceedances, calls for a novel approach to regional air quality management (to complement existing EU-wide policies). The SHERPA (Screening for High Emission Reduction Potentials on Air quality) modelling tool was developed in this context. It provides an additional tool to be used in support of regional/local decision makers responsible for the design of air quality plans. It is therefore important to evaluate the quality of the SHERPA model and its behavior in the face of various kinds of uncertainty. Uncertainty and sensitivity analysis techniques can be used for this purpose. They both reveal the links between assumptions and forecasts, help in model simplification and may highlight unexpected relationships between inputs and outputs. Thus, a policy-steered SHERPA module - predicting the air quality improvement linked to emission reduction scenarios - was evaluated by means of (1) uncertainty analysis (UA) to quantify uncertainty in the model output, and (2) sensitivity analysis (SA) to identify the most influential input sources of this uncertainty. The results of this study provide relevant information about the key variables driving the SHERPA output uncertainty, and advise policy-makers and modellers where to place their efforts for an improved decision-making process.
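A stripped-down illustration of the UA/SA workflow (generic Monte Carlo, not the SHERPA module itself): inputs are sampled from their uncertainty ranges, the model is run for each sample, the spread of outputs quantifies uncertainty, and simple correlation measures rank the influence of each input. The toy source-receptor model and its parameters are hypothetical.

```python
import numpy as np

def toy_model(emission_reduction, background, meteorology_factor):
    """Stand-in for a source-receptor air quality model."""
    return background * (1.0 - 0.8 * emission_reduction) * meteorology_factor

rng = np.random.default_rng(42)
n = 5000
# Uncertainty analysis: sample the uncertain inputs...
inputs = {
    "emission_reduction": rng.uniform(0.1, 0.5, n),
    "background": rng.normal(20.0, 3.0, n),
    "meteorology_factor": rng.uniform(0.8, 1.2, n),
}
# ...and propagate them through the model.
outputs = toy_model(**inputs)
print("output mean/std:", outputs.mean(), outputs.std())

# Sensitivity analysis: rank inputs by (squared) correlation with the output.
for name, values in inputs.items():
    r = np.corrcoef(values, outputs)[0, 1]
    print(f"{name}: r^2 = {r**2:.2f}")
```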
GAPIT: genome association and prediction integrated tool.
Lipka, Alexander E; Tian, Feng; Wang, Qishan; Peiffer, Jason; Li, Meng; Bradbury, Peter J; Gore, Michael A; Buckler, Edward S; Zhang, Zhiwu
2012-09-15
Software programs that conduct genome-wide association studies and genomic prediction and selection need to use methodologies that maximize statistical power, provide high prediction accuracy and run in a computationally efficient manner. We developed an R package called Genome Association and Prediction Integrated Tool (GAPIT) that implements advanced statistical methods including the compressed mixed linear model (CMLM) and CMLM-based genomic prediction and selection. The GAPIT package can handle large datasets in excess of 10 000 individuals and 1 million single-nucleotide polymorphisms with minimal computational time, while providing user-friendly access and concise tables and graphs to interpret results. http://www.maizegenetics.net/GAPIT. zhiwu.zhang@cornell.edu Supplementary data are available at Bioinformatics online.
Carney, Timothy Jay; Morgan, Geoffrey P.; Jones, Josette; McDaniel, Anna M.; Weaver, Michael; Weiner, Bryan; Haggstrom, David A.
2014-01-01
Our conceptual model demonstrates our goal to investigate the impact of clinical decision support (CDS) utilization on cancer screening improvement strategies in the community health care (CHC) setting. We employed a dual modeling technique using both statistical and computational modeling to evaluate impact. Our statistical model used the Spearman's Rho test to evaluate the strength of the relationship between our proximal outcome measures (CDS utilization) and our distal outcome measure (provider self-reported cancer screening improvement). Our computational model relied on network evolution theory and made use of a tool called Construct-TM to model the use of CDS as measured by the rate of organizational learning. We made use of previously collected survey data from the community health centers' Cancer Health Disparities Collaborative (HDCC). Our intent is to demonstrate the added value gained by using a computational modeling tool in conjunction with a statistical analysis when evaluating the impact of a health information technology, in the form of CDS, on health care quality process outcomes such as facility-level screening improvement. Significant simulated disparities in organizational learning over time were observed between community health centers beginning the simulation with high and low clinical decision support capability. PMID:24953241
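For the statistical half of the dual-modeling approach, the Spearman's Rho test mentioned above is a single call in scipy; the sketch below uses hypothetical facility-level values rather than the HDCC survey data.

```python
from scipy.stats import spearmanr

# Hypothetical facility-level measures (not the HDCC survey data):
cds_utilization = [0.2, 0.35, 0.5, 0.6, 0.75, 0.9]            # proximal outcome
screening_improvement = [0.05, 0.10, 0.12, 0.20, 0.18, 0.30]  # distal outcome

rho, p_value = spearmanr(cds_utilization, screening_improvement)
print(f"Spearman's rho = {rho:.2f}, p = {p_value:.3f}")
```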
Kawano, Shin; Watanabe, Tsutomu; Mizuguchi, Sohei; Araki, Norie; Katayama, Toshiaki; Yamaguchi, Atsuko
2014-07-01
TogoTable (http://togotable.dbcls.jp/) is a web tool that adds user-specified annotations to a table that a user uploads. Annotations are drawn from several biological databases that use the Resource Description Framework (RDF) data model. TogoTable uses database identifiers (IDs) in the table as a query key for searching. RDF data, which form a network called Linked Open Data (LOD), can be searched from SPARQL endpoints using a SPARQL query language. Because TogoTable uses RDF, it can integrate annotations from not only the reference database to which the IDs originally belong, but also externally linked databases via the LOD network. For example, annotations in the Protein Data Bank can be retrieved using GeneID through links provided by the UniProt RDF. Because RDF has been standardized by the World Wide Web Consortium, any database with annotations based on the RDF data model can be easily incorporated into this tool. We believe that TogoTable is a valuable Web tool, particularly for experimental biologists who need to process huge amounts of data such as high-throughput experimental output. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
NASA Astrophysics Data System (ADS)
Ines, A. V. M.; Han, E.; Baethgen, W.
2017-12-01
Advances in seasonal climate forecasts (SCFs) during the past decades have brought great potential to improve agricultural climate risk management associated with inter-annual climate variability. In spite of the popular use of crop simulation models in addressing climate risk problems, the models cannot readily take seasonal climate predictions issued in the format of tercile probabilities of the most likely rainfall categories (i.e., below-, near- and above-normal). When a skillful SCF is linked with crop simulation models, the informative climate information can be further translated into actionable agronomic terms and thus better support strategic and tactical decisions. In other words, crop modeling connected with a given SCF allows "what-if" scenarios with different crop choices or management practices to be simulated, better informing the decision makers. In this paper, we present a decision support tool, called CAMDT (Climate Agriculture Modeling and Decision Tool), which seamlessly integrates probabilistic SCFs with the DSSAT-CSM-Rice model to guide decision-makers in adopting appropriate crop and agricultural water management practices for given climatic conditions. The CAMDT has the functionality to disaggregate a probabilistic SCF into daily weather realizations (using either a parametric or a non-parametric disaggregation method) and to run DSSAT-CSM-Rice with the disaggregated weather realizations. The convenient graphical user interface allows easy implementation of several "what-if" scenarios by non-technical users and visualization of the results of the scenario runs. In addition, the CAMDT translates crop model outputs into economic terms once the user provides the expected crop price and cost. The CAMDT is a practical tool for real-world applications, specifically for agricultural climate risk management in the Bicol region, Philippines, and is flexible enough to be adapted to other crops or regions of the world. CAMDT GitHub: https://github.com/Agro-Climate/CAMDT
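The disaggregation step described above (turning tercile probabilities into daily weather realizations) can be sketched non-parametrically: historical seasons are grouped into below-, near-, and above-normal terciles of seasonal rainfall, and whole analog seasons are resampled with the forecast probabilities. This is a schematic of the general idea, not CAMDT's algorithm, and all data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic historical record: 30 seasons x 90 days of rainfall (mm/day).
history = rng.gamma(shape=0.5, scale=8.0, size=(30, 90))
season_totals = history.sum(axis=1)

# Classify each historical season into rainfall terciles.
t1, t2 = np.quantile(season_totals, [1 / 3, 2 / 3])
categories = np.digitize(season_totals, [t1, t2])   # 0=below, 1=near, 2=above

# Hypothetical SCF: 20% below-, 30% near-, 50% above-normal.
forecast_probs = [0.2, 0.3, 0.5]

def disaggregate(n_realizations=100):
    """Resample whole historical seasons with tercile weights from the SCF."""
    realizations = []
    for _ in range(n_realizations):
        cat = rng.choice(3, p=forecast_probs)
        analog = rng.choice(np.where(categories == cat)[0])
        realizations.append(history[analog])
    return np.array(realizations)   # shape: (n_realizations, 90 days)

daily = disaggregate()
print("mean seasonal total of realizations:", daily.sum(axis=1).mean())
```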
NASA Astrophysics Data System (ADS)
Illing, Sebastian; Schuster, Mareike; Kadow, Christopher; Kröner, Igor; Richling, Andy; Grieger, Jens; Kruschke, Tim; Lang, Benjamin; Redl, Robert; Schartner, Thomas; Cubasch, Ulrich
2016-04-01
MiKlip is a project for medium-term climate prediction funded by the German Federal Ministry of Education and Research (BMBF) that aims to create a model system able to provide reliable decadal climate forecasts. During the first project phase of MiKlip, the sub-project INTEGRATION, located at Freie Universität Berlin, developed a framework for scientific infrastructures (FREVA). More information about FREVA can be found in EGU2016-13060. An instance of this framework is used as the Central Evaluation System (CES) during the MiKlip project. Throughout the first project phase, the various sub-projects developed over 25 analysis tools - so-called plugins - for the CES. The main focus of these plugins is on the evaluation and verification of decadal climate prediction data, but most plugins are not limited to this scope. They target a wide range of scientific questions, ranging from preprocessing tools like the "LeadtimeSelector", which creates lead-time-dependent time series from decadal hindcast sets, through tracking tools like the "Zykpak" plugin, which can objectively locate and track mid-latitude cyclones, to plugins like "MurCSS" or "SPECS", which calculate deterministic and probabilistic skill metrics. We also integrated some analyses from the Model Evaluation Tools (MET), which was developed at NCAR. We will show the theoretical background, technical implementation strategies, and some interesting results of the evaluation of the MiKlip Prototype decadal prediction system for a selected set of these tools.
Ketso: A New Tool for Extension Professionals
ERIC Educational Resources Information Center
Bates, James S.
2016-01-01
Extension professionals employ many techniques and tools to obtain feedback, input, information, and data from stakeholders, research participants, and program learners. An information-gathering tool called Ketso is described in this article. This tool and its associated techniques can be used in all phases of program development, implementation,…
Publishing and sharing of hydrologic models through WaterHUB
NASA Astrophysics Data System (ADS)
Merwade, V.; Ruddell, B. L.; Song, C.; Zhao, L.; Kim, J.; Assi, A.
2011-12-01
Most hydrologists use hydrologic models to simulate hydrologic processes in order to understand hydrologic pathways and fluxes for research, decision making and engineering design. Once these tasks are complete, including publication of results, the models generally are not published or made available to the public for further use and improvement. Although publication or sharing of models is not required for journal publications, sharing of models may open doors for new collaborations and avoids duplication of effort if other researchers are interested in simulating a particular watershed for which a model already exists. For researchers who are interested in sharing models, there are limited avenues for publishing their models to the wider community. Towards filling this gap, a prototype cyberinfrastructure (CI), called WaterHUB, has been developed for sharing hydrologic data and modeling tools in an interactive environment. To test the utility of WaterHUB for sharing hydrologic models, a system to publish and share SWAT (Soil and Water Assessment Tool) models was developed. Users can utilize WaterHUB to search and download existing SWAT models, and also upload new SWAT models. Metadata such as the name of the watershed, the name of the person or agency who developed the model, the simulation period, the time step, and the list of calibrated parameters are also published with each model.
A call for benchmarking transposable element annotation methods.
Hoen, Douglas R; Hickey, Glenn; Bourque, Guillaume; Casacuberta, Josep; Cordaux, Richard; Feschotte, Cédric; Fiston-Lavier, Anna-Sophie; Hua-Van, Aurélie; Hubley, Robert; Kapusta, Aurélie; Lerat, Emmanuelle; Maumus, Florian; Pollock, David D; Quesneville, Hadi; Smit, Arian; Wheeler, Travis J; Bureau, Thomas E; Blanchette, Mathieu
2015-01-01
DNA derived from transposable elements (TEs) constitutes large parts of the genomes of complex eukaryotes, with major impacts not only on genomic research but also on how organisms evolve and function. Although a variety of methods and tools have been developed to detect and annotate TEs, there are as yet no standard benchmarks-that is, no standard way to measure or compare their accuracy. This lack of accuracy assessment calls into question conclusions from a wide range of research that depends explicitly or implicitly on TE annotation. In the absence of standard benchmarks, toolmakers are impeded in improving their tools, annotators cannot properly assess which tools might best suit their needs, and downstream researchers cannot judge how accuracy limitations might impact their studies. We therefore propose that the TE research community create and adopt standard TE annotation benchmarks, and we call for other researchers to join the authors in making this long-overdue effort a success.
Piette, John D; Mendoza-Avelares, Milton O; Ganser, Martha; Mohamed, Muhima; Marinec, Nicolle; Krishnan, Sheila
2011-06-01
Although interactive voice response (IVR) calls can be an effective tool for chronic disease management, many regions of the world lack the infrastructure to provide these services. This study evaluated the feasibility and potential impact of an IVR program using a cloud-computing model to improve diabetes management in Honduras. A single-group, pre-post study was conducted between June and August 2010. The telecommunications infrastructure was maintained on a U.S. server, and calls were directed to patients' cell phones using VoIP. Eighty-five diabetes patients in Honduras received weekly IVR disease management calls for 6 weeks, with automated follow-up e-mails to clinicians, and voicemail reports to family caregivers. Patients completed interviews at enrollment and a 6-week follow-up. Other measures included patients' glycemic control (HbA1c) and data from the IVR calling system. A total of 53% of participants completed at least half of their IVR calls and 23% of participants completed 80% or more. Higher baseline blood pressures, greater diabetes burden, greater distance from the clinic, and better medication adherence were related to higher call completion rates. Nearly all participants (98%) reported that because of the program, they improved in aspects of diabetes management such as glycemic control (56%) or foot care (89%). Mean HbA1c's decreased from 10.0% at baseline to 8.9% at follow-up (p<0.01). Most participants (92%) said that if the service were available in their clinic they would use it again. Cloud computing is a feasible strategy for providing IVR services globally. IVR self-care support may improve self-care and glycemic control for patients in underdeveloped countries. Published by Elsevier Inc.
Piette, John D.; Mendoza-Avelares, Milton O.; Ganser, Martha; Mohamed, Muhima; Marinec, Nicolle; Krishnan, Sheila
2013-01-01
Background Although interactive voice response (IVR) calls can be an effective tool for chronic disease management, many regions of the world lack the infrastructure to provide these services. Objective This study evaluated the feasibility and potential impact of an IVR program using a cloud-computing model to improve diabetes management in Honduras. Methods A single group, pre-post study was conducted between June and August 2010. The telecommunications infrastructure was maintained on a U.S. server, and calls were directed to patients’ cell phones using VoIP. Eighty-five diabetes patients in Honduras received weekly IVR disease management calls for six weeks, with automated follow-up emails to clinicians, and voicemail reports to family caregivers. Patients completed interviews at enrollment and a six week follow-up. Other measures included patients’ glycemic control (A1c) and data from the IVR calling system. Results 55% of participants completed the majority of their IVR calls and 33% completed 80% or more. Higher baseline blood pressures, greater diabetes burden, greater distance from the clinic, and better adherence were related to higher call completion rates. Nearly all participants (98%) reported that because of the program, they improved in aspects of diabetes management such as glycemic control (56%) or foot care (89%). Mean A1c’s decreased from 10.0% at baseline to 8.9% at follow-up (p<.01). Most participants (92%) said that if the service were available in their clinic they would use it again. Conclusions Cloud computing is a feasible strategy for providing IVR services globally. IVR self-care support may improve self-care and glycemic control for patients in under-developed countries. PMID:21565655
DengueTools: innovative tools and strategies for the surveillance and control of dengue.
Wilder-Smith, Annelies; Renhorn, Karl-Erik; Tissera, Hasitha; Abu Bakar, Sazaly; Alphey, Luke; Kittayapong, Pattamaporn; Lindsay, Steve; Logan, James; Hatz, Christoph; Reiter, Paul; Rocklöv, Joacim; Byass, Peter; Louis, Valérie R; Tozan, Yesim; Massad, Eduardo; Tenorio, Antonio; Lagneau, Christophe; L'Ambert, Grégory; Brooks, David; Wegerdt, Johannah; Gubler, Duane
2012-01-01
Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of 'Comprehensive control of Dengue fever under changing climatic conditions'. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named 'DengueTools' to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas: Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of 'DengueTools'. DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools.
Using telephony data to facilitate discovery of clinical workflows.
Rucker, Donald W
2017-04-19
Discovery of clinical workflows to target for redesign using methods such as Lean and Six Sigma is difficult. VoIP telephone call pattern analysis may complement direct observation and EMR-based tools in understanding clinical workflows at the enterprise level by allowing visualization of institutional telecommunications activity. To build an analytic framework mapping repetitive and high-volume telephone call patterns in a large medical center to their associated clinical units using an enterprise unified communications server log file, and to support visualization of specific call patterns using graphical networks. Consecutive call detail records from the medical center's unified communications server were parsed to cross-correlate telephone call patterns and map associated phone numbers to a cost center dictionary. Hashed data structures were built to allow construction of edge and node files representing high-volume call patterns for display with an open source graph network tool. Summary statistics for an analysis of exactly one week's call detail records at a large academic medical center showed that 912,386 calls were placed with a total duration of 23,186 hours. Approximately half of all calling/called number pairs had an average call duration under 60 seconds, and of these the average call duration was 27 seconds. Cross-correlation of phone calls identified by clinical cost center can be used to generate graphical displays of clinical enterprise communications. Many calls are short. The compact data transfers within short calls may serve as automation or re-design targets. The large absolute amount of time medical center employees were engaged in VoIP telecommunications suggests that analysis of telephone call patterns may offer additional insights into core clinical workflows.
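A minimal sketch of the kind of cross-correlation described above (not the authors' production code): call detail records are reduced to an edge list of calling/called cost centers with call counts and total durations, which a graph tool can then display. The record layout and cost-center dictionary are hypothetical.

```python
from collections import defaultdict
import csv
import io

# Hypothetical cost-center dictionary mapping extensions to clinical units.
cost_center = {"4101": "Pharmacy", "4102": "ICU", "4103": "Radiology"}

# Hypothetical call detail records: caller, callee, duration in seconds.
cdr_text = """caller,callee,duration
4101,4102,25
4102,4101,40
4101,4102,30
4103,4102,300
"""

edges = defaultdict(lambda: {"calls": 0, "seconds": 0})
for row in csv.DictReader(io.StringIO(cdr_text)):
    src = cost_center.get(row["caller"], "Unknown")
    dst = cost_center.get(row["callee"], "Unknown")
    edges[(src, dst)]["calls"] += 1
    edges[(src, dst)]["seconds"] += int(row["duration"])

# Emit an edge listing suitable for an open-source graph viewer.
for (src, dst), stats in sorted(edges.items()):
    print(f"{src} -> {dst}: {stats['calls']} calls, {stats['seconds']} s total")
```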
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawson, M.; Yu, Y. H.; Nelessen, A.
2014-05-01
Wave energy converters (WECs) are commonly designed and analyzed using numerical models that combine multi-body dynamics with hydrodynamic models based on the Cummins Equation and linearized hydrodynamic coefficients. These modeling methods are attractive design tools because they are computationally inexpensive and do not require the use of high performance computing resources necessitated by high-fidelity methods, such as Navier Stokes computational fluid dynamics. Modeling hydrodynamics using linear coefficients assumes that the device undergoes small motions and that the wetted surface area of the devices is approximately constant. WEC devices, however, are typically designed to undergo large motions in order to maximize power extraction, calling into question the validity of assuming that linear hydrodynamic models accurately capture the relevant fluid-structure interactions. In this paper, we study how calculating buoyancy and Froude-Krylov forces from the instantaneous position of a WEC device (referred to as instantaneous buoyancy and Froude-Krylov forces from herein) changes WEC simulation results compared to simulations that use linear hydrodynamic coefficients. First, we describe the WEC-Sim tool used to perform simulations and how the ability to model instantaneous forces was incorporated into WEC-Sim. We then use a simplified one-body WEC device to validate the model and to demonstrate how accounting for these instantaneously calculated forces affects the accuracy of simulation results, such as device motions, hydrodynamic forces, and power generation.
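The distinction the paper draws between linearized and instantaneously evaluated hydrostatic forces can be illustrated for the simplest possible case, a heaving vertical cylinder: the linear model applies a constant restoring stiffness about the equilibrium draft, while the instantaneous model recomputes the submerged volume from the body position and saturates when the body leaves the water or becomes fully submerged. The sketch below is illustrative only (hypothetical geometry) and is not WEC-Sim code.

```python
import numpy as np

RHO, G = 1025.0, 9.81                   # sea water density (kg/m^3), gravity (m/s^2)
RADIUS, HEIGHT, DRAFT = 1.0, 2.0, 1.0   # hypothetical cylinder geometry (m)
AREA = np.pi * RADIUS ** 2

def buoyancy_linear(z):
    """Linearized net vertical hydrostatic force about equilibrium (z up, m)."""
    return -RHO * G * AREA * z

def buoyancy_instantaneous(z):
    """Net vertical force (buoyancy minus weight) from the instantaneous
    submerged volume; saturates when the body emerges or fully submerges."""
    submerged = np.clip(DRAFT - z, 0.0, HEIGHT)   # instantaneous draft
    weight = RHO * G * AREA * DRAFT               # neutrally buoyant at z = 0
    return RHO * G * AREA * submerged - weight

# For this constant-waterplane body the two agree for small motions and
# diverge once the heave amplitude lifts the cylinder clear of the water.
for z in (-0.5, 0.0, 0.5, 1.5):
    print(f"z = {z:+.1f} m  linear: {buoyancy_linear(z):10.0f} N"
          f"  instantaneous: {buoyancy_instantaneous(z):10.0f} N")
```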
NASA Astrophysics Data System (ADS)
Courageot, Estelle; Sayah, Rima; Huet, Christelle
2010-05-01
Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.
Graphical Technique to Support the Teaching/Learning Process of Software Process Reference Models
NASA Astrophysics Data System (ADS)
Espinosa-Curiel, Ismael Edrein; Rodríguez-Jacobo, Josefina; Fernández-Zepeda, José Alberto
In this paper, we propose a set of diagrams to visualize software process reference models (PRM). The diagrams, called dimods, are the combination of some visual and process modeling techniques such as rich pictures, mind maps, IDEF and RAD diagrams. We show the use of this technique by designing a set of dimods for the Mexican Software Industry Process Model (MoProSoft). Additionally, we perform an evaluation of the usefulness of dimods. The result of the evaluation shows that dimods may be a support tool that facilitates the understanding, memorization, and learning of software PRMs in both software development organizations and universities. The results also show that dimods may have advantages over the traditional description methods for these types of models.
A hierarchical distributed control model for coordinating intelligent systems
NASA Technical Reports Server (NTRS)
Adler, Richard M.
1991-01-01
A hierarchical distributed control (HDC) model for coordinating cooperative problem-solving among intelligent systems is described. The model was implemented using SOCIAL, an innovative object-oriented tool for integrating heterogeneous, distributed software systems. SOCIAL embeds applications in 'wrapper' objects called Agents, which supply predefined capabilities for distributed communication, control, data specification, and translation. The HDC model is realized in SOCIAL as a 'Manager' Agent that coordinates interactions among application Agents. The HDC Manager: indexes the capabilities of application Agents; routes request messages to suitable server Agents; and stores results in a commonly accessible 'Bulletin-Board'. This centralized control model is illustrated in a fault diagnosis application for launch operations support of the Space Shuttle fleet at NASA, Kennedy Space Center.
Construction and completion of flux balance models from pathway databases.
Latendresse, Mario; Krummenacker, Markus; Trupp, Miles; Karp, Peter D
2012-02-01
Flux balance analysis (FBA) is a well-known technique for genome-scale modeling of metabolic flux. Typically, an FBA formulation requires the accurate specification of four sets: biochemical reactions, biomass metabolites, nutrients and secreted metabolites. The development of FBA models can be time consuming and tedious because of the difficulty in assembling completely accurate descriptions of these sets, and in identifying errors in the composition of these sets. For example, the presence of a single non-producible metabolite in the biomass will make the entire model infeasible. Other difficulties in FBA modeling are that model distributions, and predicted fluxes, can be cryptic and difficult to understand. We present a multiple gap-filling method to accelerate the development of FBA models using a new tool, called MetaFlux, based on mixed integer linear programming (MILP). The method suggests corrections to the sets of reactions, biomass metabolites, nutrients and secretions. The method generates FBA models directly from Pathway/Genome Databases. Thus, FBA models developed in this framework are easily queried and visualized using the Pathway Tools software. Predicted fluxes are more easily comprehended by visualizing them on diagrams of individual metabolic pathways or of metabolic maps. MetaFlux can also remove redundant high-flux loops, solve FBA models once they are generated and model the effects of gene knockouts. MetaFlux has been validated through construction of FBA models for Escherichia coli and Homo sapiens. Pathway Tools with MetaFlux is freely available to academic users, and for a fee to commercial users. Download from: biocyc.org/download.shtml. mario.latendresse@sri.com Supplementary data are available at Bioinformatics online.
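For readers unfamiliar with the underlying optimization, the sketch below shows FBA in its simplest form as a linear program on a toy three-reaction network; it is not MetaFlux and omits gap filling, biomass composition, and the MILP corrections entirely.

```python
# Minimal FBA sketch (not MetaFlux): maximize a "biomass" flux subject to
# steady-state mass balance S.v = 0 and flux bounds, for a toy network
#   R1: -> A (uptake),  R2: A -> B,  R3: B -> biomass
import numpy as np
from scipy.optimize import linprog

S = np.array([[ 1, -1,  0],    # metabolite A
              [ 0,  1, -1]])   # metabolite B
bounds = [(0, 10), (0, 10), (0, 10)]   # flux bounds for R1..R3
c = [0, 0, -1]                         # linprog minimizes, so negate the biomass flux

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
print("optimal fluxes:", res.x, " biomass flux:", -res.fun)
```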
FLUSH: A tool for the design of slush hydrogen flow systems
NASA Technical Reports Server (NTRS)
Hardy, Terry L.
1990-01-01
As part of the National Aerospace Plane Project an analytical model was developed to perform calculations for in-line transfer of solid-liquid mixtures of hydrogen. This code, called FLUSH, calculates pressure drop and solid fraction loss for the flow of slush hydrogen through pipe systems. The model solves the steady-state, one-dimensional equation of energy to obtain slush loss estimates. A description of the code is provided as well as a guide for users of the program. Preliminary results are also presented showing the anticipated degradation of slush hydrogen solid content for various piping systems.
A Cellular Automata Model of Infection Control on Medical Implants
Prieto-Langarica, Alicia; Kojouharov, Hristo; Chen-Charpentier, Benito; Tang, Liping
2011-01-01
S. epidermidis infections on medically implanted devices are a common problem in modern medicine due to the abundance of the bacteria. Once inside the body, S. epidermidis gather in communities called biofilms and can become extremely hard to eradicate, causing the patient serious complications. We simulate the complex S. epidermidis-Neutrophils interactions in order to determine the optimum conditions for the immune system to be able to contain the infection and avoid implant rejection. Our cellular automata model can also be used as a tool for determining the optimal amount of antibiotics for combating biofilm formation on medical implants. PMID:23543851
Towards a Pattern-Driven Topical Ontology Modeling Methodology in Elderly Care Homes
NASA Astrophysics Data System (ADS)
Tang, Yan; de Baer, Peter; Zhao, Gang; Meersman, Robert; Pudkey, Kevin
This paper presents a pattern-driven ontology modeling methodology, which is used to create topical ontologies in the human resource management (HRM) domain. An ontology topic is used to group concepts from different contexts (or even from different domain ontologies). We use the Organization for Economic Co-operation and Development (OECD) and the National Vocational Qualification (NVQ) as the resources to create the topical ontologies in this paper. The methodology is implemented in a tool suite called PAD-ON. The approach is illustrated with a use case from elderly care homes in the UK.
Soldering Tool for Integrated Circuits
NASA Technical Reports Server (NTRS)
Takahashi, Ted H.
1987-01-01
Many connections soldered simultaneously in confined spaces. Improved soldering tool bonds integrated circuits onto printed-circuit boards. Intended especially for use with so-called "leadless-carrier" integrated circuits.
GrowYourIC: A Step Toward a Coherent Model of the Earth's Inner Core Seismic Structure
NASA Astrophysics Data System (ADS)
Lasbleis, Marine; Waszek, Lauren; Day, Elizabeth A.
2017-11-01
A complex inner core structure has been well established from seismic studies, showing radial and lateral heterogeneities at various length scales. Yet no geodynamic model is able to explain all the features observed. One of the main limits for this is the lack of tools to compare seismic observations and numerical models successfully. We use here a new Python tool called GrowYourIC to compare models of inner core structure. We calculate properties of geodynamic models of the inner core along seismic raypaths, for random or user-specified data sets. We test kinematic models which simulate fast lateral translation, superrotation, and differential growth. We explore first the influence on a real inner core data set, which has a sparse coverage of the inner core boundary. Such a data set is however able to successfully constrain the hemispherical boundaries due to a good sampling of latitudes. Combining translation and rotation could explain some of the features of the boundaries separating the inner core hemispheres. The depth shift of the boundaries, observed by some authors, seems unlikely to be modeled by a fast translation but could be produced by slow translation associated with superrotation.
NASA Astrophysics Data System (ADS)
Ouyang, Qin; Chen, Quansheng; Zhao, Jiewen
2016-02-01
The approach presented herein reports the application of near infrared (NIR) spectroscopy, in contrast with human sensory panel, as a tool for estimating Chinese rice wine quality; concretely, to achieve the prediction of the overall sensory scores assigned by the trained sensory panel. Back propagation artificial neural network (BPANN) combined with adaptive boosting (AdaBoost) algorithm, namely BP-AdaBoost, as a novel nonlinear algorithm, was proposed in modeling. First, the optimal spectra intervals were selected by synergy interval partial least square (Si-PLS). Then, BP-AdaBoost model based on the optimal spectra intervals was established, called Si-BP-AdaBoost model. These models were optimized by cross validation, and the performance of each final model was evaluated according to correlation coefficient (Rp) and root mean square error of prediction (RMSEP) in prediction set. Si-BP-AdaBoost showed excellent performance in comparison with other models. The best Si-BP-AdaBoost model was achieved with Rp = 0.9180 and RMSEP = 2.23 in the prediction set. It was concluded that NIR spectroscopy combined with Si-BP-AdaBoost was an appropriate method for the prediction of the sensory quality in Chinese rice wine.
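The following hedged sketch illustrates only the general boosted-regression idea on synthetic "spectra"; the paper's actual pipeline uses Si-PLS interval selection and back-propagation neural networks as the boosted base learners, whereas this example uses scikit-learn's AdaBoostRegressor with its default tree base and made-up data.

```python
# Conceptual sketch only: boosted regression on synthetic spectral features.
# This is NOT the Si-BP-AdaBoost pipeline of the paper.
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 60))          # 120 samples x 60 "wavelengths" (synthetic)
y = 70 + 5 * X[:, 10] - 3 * X[:, 25] + rng.normal(scale=1.0, size=120)  # sensory score

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = AdaBoostRegressor(n_estimators=100, learning_rate=0.5, random_state=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
rp = np.corrcoef(y_te, pred)[0, 1]              # correlation coefficient of prediction
rmsep = mean_squared_error(y_te, pred) ** 0.5   # root mean square error of prediction
print(f"Rp = {rp:.3f}  RMSEP = {rmsep:.2f}")
```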
Bioinformatics and molecular modeling in glycobiology
Schloissnig, Siegfried
2010-01-01
The field of glycobiology is concerned with the study of the structure, properties, and biological functions of the family of biomolecules called carbohydrates. Bioinformatics for glycobiology is a particularly challenging field, because carbohydrates exhibit a high structural diversity and their chains are often branched. Significant improvements in experimental analytical methods over recent years have led to a tremendous increase in the amount of carbohydrate structure data generated. Consequently, the availability of databases and tools to store, retrieve and analyze these data in an efficient way is of fundamental importance to progress in glycobiology. In this review, the various graphical representations and sequence formats of carbohydrates are introduced, and an overview of newly developed databases, the latest developments in sequence alignment and data mining, and tools to support experimental glycan analysis are presented. Finally, the field of structural glycoinformatics and molecular modeling of carbohydrates, glycoproteins, and protein–carbohydrate interaction are reviewed. PMID:20364395
On the several molecules and nanostructures of water.
Whitney, Cynthia Kolb
2012-01-01
This paper investigates the water molecule from a variety of viewpoints. Water can involve different isotopes of Hydrogen and Oxygen, it can form differently shaped isomer molecules, and, when frozen, it occupies space differently than most other substances do. The tool for conducting the investigation of all this is called 'Algebraic Chemistry'. This tool is a quantitative model for predicting the energy budget for all sorts of changes between different ionization states of atoms that are involved in chemical reactions and in changes of physical state. The model is based on consistent patterns seen in empirical data about ionization potentials, together with rational scaling laws that can interpolate and extrapolate for situations where no data are available. The results of the investigation of the water molecule include comments, both positive and negative, about technologies involving heavy water, poly water, Brown's gas, and cold fusion.
Variable context Markov chains for HIV protease cleavage site prediction.
Oğul, Hasan
2009-06-01
Deciphering the knowledge of HIV protease specificity and developing computational tools for detecting its cleavage sites in a protein polypeptide chain are very desirable for designing efficient and specific chemical inhibitors to prevent acquired immunodeficiency syndrome. In this study, we developed a generative model based on a generalization of variable order Markov chains (VOMC) for peptide sequences and adapted the model for prediction of their cleavability by certain proteases. The new method, called variable context Markov chains (VCMC), attempts to identify context equivalence based on the evolutionary similarities between individual amino acids. It was applied to the HIV-1 protease cleavage site prediction problem and shown to outperform existing methods in terms of prediction accuracy on a common dataset. In general, the method is a promising tool for prediction of cleavage sites of all proteases and is encouraged to be used for any kind of peptide classification problem as well.
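A minimal stand-in for this kind of model is sketched below: a first-order Markov chain per class scoring octamer peptides by log-likelihood ratio. The real VCMC uses variable-length contexts and evolutionary similarity between amino acids, and the tiny training sets here are purely illustrative.

```python
# Simplified stand-in for the paper's model: one first-order Markov chain per
# class (cleaved / non-cleaved), scoring octamers by log-likelihood ratio.
from collections import defaultdict
from math import log

AMINO = "ACDEFGHIKLMNPQRSTVWY"

def train_markov(peptides, alpha=1.0):
    """Count start symbols and transitions with add-alpha smoothing."""
    start = defaultdict(lambda: alpha)
    trans = defaultdict(lambda: defaultdict(lambda: alpha))
    for p in peptides:
        start[p[0]] += 1
        for a, b in zip(p, p[1:]):
            trans[a][b] += 1
    return start, trans

def log_prob(p, model):
    start, trans = model
    s_total = sum(start[a] for a in AMINO)
    lp = log(start[p[0]] / s_total)
    for a, b in zip(p, p[1:]):
        row_total = sum(trans[a][c] for c in AMINO)
        lp += log(trans[a][b] / row_total)
    return lp

# Hypothetical training octamers (real work would use curated HIV-1 datasets).
cleaved = ["SQNYPIVQ", "ARVLAEAM", "ATIMMQRG"]
uncleaved = ["AAAAAAAA", "GGGGGGGG", "KKKKLLLL"]
pos, neg = train_markov(cleaved), train_markov(uncleaved)

query = "SQNYPIVQ"
score = log_prob(query, pos) - log_prob(query, neg)
print(f"{query}: log-likelihood ratio = {score:.2f} "
      f"({'cleavable' if score > 0 else 'not cleavable'})")
```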
Cullis, B R; Smith, A B; Beeck, C P; Cowling, W A
2010-11-01
Exploring and exploiting variety by environment (V × E) interaction is one of the major challenges facing plant breeders. In paper I of this series, we presented an approach to modelling V × E interaction in the analysis of complex multi-environment trials using factor analytic models. In this paper, we develop a range of statistical tools which explore V × E interaction in this context. These tools include graphical displays such as heat-maps of genetic correlation matrices as well as so-called E-scaled uniplots that are a more informative alternative to the classical biplot for large plant breeding multi-environment trials. We also present a new approach to prediction for multi-environment trials that include pedigree information. This approach allows meaningful selection indices to be formed either for potential new varieties or potential parents.
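One of the displays mentioned, a heat-map of a genetic correlation matrix among environments, can be sketched as follows; the loadings and correlations are randomly generated stand-ins for estimates from a fitted factor analytic model.

```python
# Sketch of a heat-map of an (illustrative, randomly generated) genetic
# correlation matrix among trial environments.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
L = rng.normal(size=(8, 2))                      # toy factor loadings, 8 environments
G = L @ L.T + np.diag(rng.uniform(0.1, 0.5, 8))  # toy genetic covariance matrix
d = np.sqrt(np.diag(G))
corr = G / np.outer(d, d)                        # genetic correlation matrix

fig, ax = plt.subplots()
im = ax.imshow(corr, vmin=-1, vmax=1, cmap="RdBu_r")
envs = [f"Env{i+1}" for i in range(8)]
ax.set_xticks(range(8))
ax.set_xticklabels(envs, rotation=45)
ax.set_yticks(range(8))
ax.set_yticklabels(envs)
fig.colorbar(im, ax=ax, label="genetic correlation")
plt.tight_layout()
plt.show()
```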
NASA Astrophysics Data System (ADS)
Hoebelheinrich, N. J.; Lynnes, C.; West, P.; Ferritto, M.
2014-12-01
Two problems common to many geoscience domains are the difficulties in finding tools to work with a given dataset collection, and conversely, the difficulties in finding data for a known tool. A collaborative team from the Earth Science Information Partnership (ESIP) has gotten together to design and create a web service, called ToolMatch, to address these problems. The team began their efforts by defining an initial, relatively simple conceptual model that addressed the two use cases briefly described above. The conceptual model is expressed as an ontology using OWL (Web Ontology Language) and DCterms (Dublin Core Terms), and utilizing standard ontologies such as DOAP (Description of a Project), FOAF (Friend of a Friend), SKOS (Simple Knowledge Organization System) and DCAT (Data Catalog Vocabulary). The ToolMatch service will be taking advantage of various Semantic Web and Web standards, such as OpenSearch, RESTful web services, SWRL (Semantic Web Rule Language) and SPARQL (Simple Protocol and RDF Query Language). The first version of the ToolMatch service was deployed in early fall 2014. While more complete testing is required, a number of communities besides ESIP member organizations have expressed interest in collaborating to create, test and use the service and incorporate it into their own web pages, tools and / or services, including the USGS Data Catalog service, DataONE, the Deep Carbon Observatory, Virtual Solar Terrestrial Observatory (VSTO), and the U.S. Global Change Research Program. In this session, presenters will discuss the inception and development of the ToolMatch service, the collaborative process used to design, refine, and test the service, and future plans for the service.
Durr, W
1998-01-01
Call centers are strategically and tactically important to many industries, including the healthcare industry. Call centers play a key role in acquiring and retaining customers. The ability to deliver high-quality and timely customer service without much expense is the basis for the proliferation and expansion of call centers. Call centers are unique blends of people and technology, where performance depends on combining appropriate technology tools with sound management practices built on key operational data. While the technology is fascinating, the people working in call centers and the skill of the management team ultimately make a difference to their companies.
Marketing, Management and Performance: Multilingualism as Commodity in a Tourism Call Centre
ERIC Educational Resources Information Center
Duchene, Alexandre
2009-01-01
This paper focuses on the ways an institution of the new economy--a tourism call centre in Switzerland--markets, manages and performs multilingual services. In particular, it explores the ways multilingualism operates as a strategic and managerial tool within tourism call centres and how the institutional regulation of language practices…
Enabling eHealth as a Pathway for Patient Engagement: a Toolkit for Medical Practice.
Graffigna, Guendalina; Barello, Serena; Triberti, Stefano; Wiederhold, Brenda K; Bosio, A Claudio; Riva, Giuseppe
2014-01-01
Patient engagement is rapidly earning academic and managerial attention and becoming a necessary tool for researchers, clinicians and policymakers worldwide to manage the increasing burden of chronic conditions. The concept of patient engagement calls for a reframing of healthcare organizations' models and approaches to care. This also requires innovations that facilitate exchanges between patients and the healthcare system. eHealth, namely the use of new communication technologies to provide healthcare, has shown promise for innovating healthcare organizations and improving exchanges between patients and health providers. However, little attention has yet been devoted to how best to design eHealth tools to engage patients in their care. eHealth tools have to be designed according to patients' specific unmet needs and the priorities featuring in the different phases of the engagement process. Building on the Patient Engagement model and on the Positive Technology paradigm, we suggest a toolkit of phase-specific technological resources, highlighting their potential for fostering the patient engagement process.
Amanzi: An Open-Source Multi-process Simulator for Environmental Applications
NASA Astrophysics Data System (ADS)
Moulton, J. D.; Molins, S.; Johnson, J. N.; Coon, E.; Lipnikov, K.; Day, M.; Barker, E.
2014-12-01
The Advanced Simulation Capability for Environmental Management (ASCEM) program is developing an approach and open-source tool suite for standardized risk and performance assessments at legacy nuclear waste sites. These assessments begin with simplified models, and add geometric and geologic complexity as understanding is gained. The Platform toolset (Akuna) generates these conceptual models, and Amanzi provides the computational engine to perform the simulations, returning the results for analysis and visualization. In this presentation we highlight key elements of the design, algorithms and implementations used in Amanzi. In particular, the hierarchical and modular design is aligned with the coupled processes being simulated, and naturally supports a wide range of model complexity. This design leverages a dynamic data manager and the synergy of two graphs (one from the high-level perspective of the models, the other from the dependencies of the variables in the model) to enable this flexible model configuration at run time. Moreover, to model sites with complex hydrostratigraphy, as well as engineered systems, we are developing a dual unstructured/structured capability. Recently, these capabilities have been collected in a framework named Arcos, and efforts have begun to improve interoperability between the unstructured and structured AMR approaches in Amanzi. To leverage a range of biogeochemistry capability from the community (e.g., CrunchFlow, PFLOTRAN, etc.), a biogeochemistry interface library was developed called Alquimia. To ensure that Amanzi is truly an open-source community code we require a completely open-source tool chain for our development. We will comment on elements of this tool chain, including testing and documentation development tools such as docutils and Sphinx. Finally, we will show simulation results from our phased demonstrations, including the geochemically complex Savannah River F-Area seepage basins.
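The dependency-graph idea can be illustrated with a small Python sketch (Amanzi itself is C++; the variable names and evaluators below are hypothetical): variables are updated in an order derived from a declared dependency graph, which is essentially what lets the model configuration stay flexible at run time.

```python
# Minimal sketch of dependency-driven evaluation (not Amanzi's data manager).
# Variable names, dependencies, and evaluators are hypothetical.
from graphlib import TopologicalSorter   # Python 3.9+

# variable -> set of variables it depends on
deps = {
    "saturation": {"pressure"},
    "rel_permeability": {"saturation"},
    "darcy_flux": {"pressure", "rel_permeability"},
    "transport": {"darcy_flux"},
}

evaluators = {
    "pressure": lambda state: 101325.0,
    "saturation": lambda state: min(1.0, state["pressure"] / 2.0e5),
    "rel_permeability": lambda state: state["saturation"] ** 3,
    "darcy_flux": lambda state: -state["rel_permeability"] * 1e-7 * state["pressure"],
    "transport": lambda state: abs(state["darcy_flux"]) * 0.5,
}

state = {}
for var in TopologicalSorter(deps).static_order():   # dependencies come first
    state[var] = evaluators[var](state)
    print(f"{var:>16s} = {state[var]:.4g}")
```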
Dfam: a database of repetitive DNA based on profile hidden Markov models.
Wheeler, Travis J; Clements, Jody; Eddy, Sean R; Hubley, Robert; Jones, Thomas A; Jurka, Jerzy; Smit, Arian F A; Finn, Robert D
2013-01-01
We present a database of repetitive DNA elements, called Dfam (http://dfam.janelia.org). Many genomes contain a large fraction of repetitive DNA, much of which is made up of remnants of transposable elements (TEs). Accurate annotation of TEs enables research into their biology and can shed light on the evolutionary processes that shape genomes. Identification and masking of TEs can also greatly simplify many downstream genome annotation and sequence analysis tasks. The commonly used TE annotation tools RepeatMasker and Censor depend on sequence homology search tools such as cross_match and BLAST variants, as well as Repbase, a collection of known TE families each represented by a single consensus sequence. Dfam contains entries corresponding to all Repbase TE entries for which instances have been found in the human genome. Each Dfam entry is represented by a profile hidden Markov model, built from alignments generated using RepeatMasker and Repbase. When used in conjunction with the hidden Markov model search tool nhmmer, Dfam produces a 2.9% increase in coverage over consensus sequence search methods on a large human benchmark, while maintaining low false discovery rates, and coverage of the full human genome is 54.5%. The website provides a collection of tools and data views to support improved TE curation and annotation efforts. Dfam is also available for download in flat file format or in the form of MySQL table dumps.
Design of a Pictorial Program Reference Language.
1984-08-01
Tools and Approaches for the Construction of Knowledge Models from the Neuroscientific Literature
Burns, Gully A. P. C.; Khan, Arshad M.; Ghandeharizadeh, Shahram; O’Neill, Mark A.; Chen, Yi-Shin
2015-01-01
Within this paper, we describe a neuroinformatics project (called “NeuroScholar,” http://www.neuroscholar.org/) that enables researchers to examine, manage, manipulate, and use the information contained within the published neuroscientific literature. The project is built within a multi-level, multi-component framework constructed with the use of software engineering methods that themselves provide code-building functionality for neuroinformaticians. We describe the different software layers of the system. First, we present a hypothetical usage scenario illustrating how NeuroScholar permits users to address large-scale questions in a way that would otherwise be impossible. We do this by applying NeuroScholar to a “real-world” neuroscience question: How is stress-related information processed in the brain? We then explain how the overall design of NeuroScholar enables the system to work and illustrate different components of the user interface. We then describe the knowledge management strategy we use to store interpretations. Finally, we describe the software engineering framework we have devised (called the “View-Primitive-Data Model framework,” [VPDMf]) to provide an open-source, accelerated software development environment for the project. We believe that NeuroScholar will be useful to experimental neuroscientists by helping them interact with the primary neuroscientific literature in a meaningful way, and to neuroinformaticians by providing them with useful, affordable software engineering tools. PMID:15055395
Meyer, Denny; Abbott, Jo-Anne; Rehm, Imogen; Bhar, Sunil; Barak, Azy; Deng, Gary; Wallace, Klaire; Ogden, Edward; Klein, Britt
2017-04-01
Suicidal patients often visit healthcare professionals in their last month before suicide, but medical practitioners are unlikely to raise the issue of suicide with patients because of time constraints and uncertainty regarding an appropriate approach. A brief tool called the e-PASS Suicidal Ideation Detector (eSID) was developed for medical practitioners to help detect the presence of suicidal ideation (SI) in their clients. If SI is detected, the system alerts medical practitioners to address this issue with a client. The eSID tool was developed due to the absence of an easy-to-use, evidence-based SI detection tool for general practice. The tool was developed using binary logistic regression analyses of data provided by clients accessing an online psychological assessment function. Ten primary healthcare professionals provided advice regarding the use of the tool. The analysis identified eleven factors in addition to the Kessler-6 for inclusion in the model used to predict the probability of recent SI. The model performed well across gender and age groups 18-64 (AUR 0.834, 95% CI 0.828-0.841, N = 16,703). Healthcare professionals were interviewed; they recommended that the tool be incorporated into existing medical software systems and that additional resources be supplied, tailored to the level of risk identified. The eSID is expected to trigger risk assessments by healthcare professionals when this is necessary. Initial reactions of healthcare professionals to the tool were favorable, but further testing and in situ development are required.
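A minimal sketch of the modeling approach (binary logistic regression on the Kessler-6 plus additional items) is given below; the extra predictors, simulated data, and coefficients are invented for illustration and do not reproduce the published eSID model or its eleven additional factors.

```python
# Minimal sketch of the modelling approach only; all features and data are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 2000
k6 = rng.integers(0, 25, n)                  # Kessler-6 total score (0-24)
prior_attempt = rng.integers(0, 2, n)        # hypothetical extra predictor
hopelessness = rng.integers(0, 5, n)         # hypothetical extra predictor
X = np.column_stack([k6, prior_attempt, hopelessness])

# Simulate "recent suicidal ideation" labels from an assumed true model.
logit = -5.0 + 0.15 * k6 + 1.2 * prior_attempt + 0.4 * hopelessness
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
auc = roc_auc_score(y, model.predict_proba(X)[:, 1])
print("in-sample AUC:", round(auc, 3))
print("P(SI) for K6=20, prior attempt, hopelessness=3:",
      round(model.predict_proba([[20, 1, 3]])[0, 1], 3))
```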
Grid Stiffened Structure Analysis Tool
NASA Technical Reports Server (NTRS)
1999-01-01
The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprised of two tasks: (1) Create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft EXCEL spreadsheet, that was originally developed at Stanford University and later further developed by the Air Force, and (2) Write a program that functions as a NASTRAN pre-processor to generate an FEM code for grid-stiffened structure. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool it already has; Task 2 was proposed by Boeing because it also would be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD-ROM and diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.
NASA Astrophysics Data System (ADS)
Smith, B.
2015-12-01
In 2014, eight Department of Energy (DOE) national laboratories, four academic institutions, one company, and the National Center for Atmospheric Research combined forces in a project called Accelerated Climate Modeling for Energy (ACME) with the goal of speeding Earth system model development for climate and energy. Over the planned 10-year span, the project will conduct simulations and modeling on DOE's most powerful high-performance computing systems at the Oak Ridge, Argonne, and Lawrence Berkeley leadership computing facilities. A key component of the ACME project is the development of an interactive test bed for the advanced Earth system model. Its execution infrastructure will accelerate model development and testing cycles. The ACME Workflow Group is leading the efforts to automate labor-intensive tasks, provide intelligent support for complex tasks and reduce duplication of effort through collaboration support. As part of this new workflow environment, we have created a diagnostic, metric, and intercomparison Python framework, called UVCMetrics, to aid in the testing-to-production execution of the ACME model. The framework exploits similarities among different diagnostics to compactly support diagnosis of new models. It presently focuses on atmosphere and land but is designed to support ocean and sea ice model components as well. This framework is built on top of the existing open-source software framework known as the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT). Because of its flexible framework design, scientists and modelers can now generate thousands of possible diagnostic outputs. These diagnostics can compare model runs, compare model vs. observation, or simply verify a model is physically realistic. Additional diagnostics are easily integrated into the framework, and our users have already added several. Diagnostics can be generated, viewed, and manipulated from the UV-CDAT graphical user interface, Python command line scripts and programs, and web browsers. The framework is designed to be scalable to large datasets, yet easy to use and familiar to scientists using previous tools. Integration in the ACME overall user interface facilitates data publication, further analysis, and quick feedback to model developers and scientists making component or coupled model runs.
NASA Technical Reports Server (NTRS)
Unal, Resit
1999-01-01
Multidisciplinary design optimization (MDO) is an important step in the design and evaluation of launch vehicles, since it has a significant impact on performance and lifecycle cost. The objective in MDO is to search the design space to determine the values of design parameters that optimize the performance characteristics subject to system constraints. The Vehicle Analysis Branch (VAB) at NASA Langley Research Center has computerized analysis tools in many of the disciplines required for the design and analysis of launch vehicles. Vehicle performance characteristics can be determined by the use of these computerized analysis tools. The next step is to optimize the system performance characteristics subject to multidisciplinary constraints. However, most of the complex sizing and performance evaluation codes used for launch vehicle design are stand-alone tools, operated by disciplinary experts. They are, in general, difficult to integrate and use directly for MDO. An alternative has been to utilize response surface methodology (RSM) to obtain polynomial models that approximate the functional relationships between performance characteristics and design variables. These approximation models, called response surface models, are then used to integrate the disciplines using mathematical programming methods for efficient system level design analysis, MDO and fast sensitivity simulations. A second-order response surface model of the form y = β₀ + Σᵢ βᵢxᵢ + Σᵢ βᵢᵢxᵢ² + Σᵢ<ⱼ βᵢⱼxᵢxⱼ has been commonly used in RSM, since in many cases it can provide an adequate approximation, especially if the region of interest is sufficiently limited.
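A quick sketch of this workflow, under the assumption of two normalized design variables and a synthetic "expensive" analysis code, fits such a second-order surface by least squares and then queries the cheap surrogate:

```python
# Sketch of fitting a second-order response surface to (synthetic) runs of a
# disciplinary analysis code; the code output and variables are stand-ins.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, size=(30, 2))               # two normalized design variables
y = (5 + 2 * X[:, 0] - X[:, 1] + 1.5 * X[:, 0] * X[:, 1] - 0.8 * X[:, 0]**2
     + rng.normal(scale=0.05, size=30))            # "expensive" code output + noise

quad = PolynomialFeatures(degree=2, include_bias=False)   # x1, x2, x1^2, x1*x2, x2^2
surrogate = LinearRegression().fit(quad.fit_transform(X), y)

x_new = np.array([[0.2, -0.4]])
print("surrogate prediction:", surrogate.predict(quad.transform(x_new))[0])
```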
OpenMP parallelization of a gridded SWAT (SWATG)
NASA Astrophysics Data System (ADS)
Zhang, Ying; Hou, Jinliang; Cao, Yongpan; Gu, Juan; Huang, Chunlin
2017-12-01
Large-scale, long-term, high spatial resolution simulation is a common challenge in environmental modeling. A Gridded Hydrologic Response Unit (HRU)-based Soil and Water Assessment Tool (SWATG) that integrates a grid modeling scheme with different spatial representations also presents such problems: the computational cost limits applications of very high resolution, large-scale watershed modeling. The OpenMP (Open Multi-Processing) application programming interface was therefore integrated with SWATG (the parallel version is called SWATGP) to accelerate grid modeling at the HRU level. Such a parallel implementation takes better advantage of the computational power of a shared-memory computer system. We conducted two experiments at multiple temporal and spatial scales of hydrological modeling using SWATG and SWATGP on a high-end server. At 500-m resolution, SWATGP was found to be up to nine times faster than SWATG in modeling a roughly 2000 km2 watershed with one CPU and a 15-thread configuration. The results demonstrate that parallel models save considerable time relative to traditional sequential simulation runs. Parallel computation of environmental models is beneficial for model applications, especially at large spatial and temporal scales and at high resolutions. The proposed SWATGP model is thus a promising tool for large-scale, high-resolution water resources research and management, in addition to offering data fusion and model coupling ability.
Xu, Lingyang; Hou, Yali; Bickhart, Derek M; Song, Jiuzhou; Liu, George E
2013-06-25
Copy number variations (CNVs) are gains and losses of genomic sequence between two individuals of a species when compared to a reference genome. The data from single nucleotide polymorphism (SNP) microarrays are now routinely used for genotyping, but they also can be utilized for copy number detection. Substantial progress has been made in array design and CNV calling algorithms and at least 10 comparison studies in humans have been published to assess them. In this review, we first survey the literature on existing microarray platforms and CNV calling algorithms. We then examine a number of CNV calling tools to evaluate their impacts using bovine high-density SNP data. Large incongruities in the results from different CNV calling tools highlight the need for standardizing array data collection, quality assessment and experimental validation. Only after careful experimental design and rigorous data filtering can the impacts of CNVs on both normal phenotypic variability and disease susceptibility be fully revealed.
a Web-Based Interactive Tool for Multi-Resolution 3d Models of a Maya Archaeological Site
NASA Astrophysics Data System (ADS)
Agugiaro, G.; Remondino, F.; Girardi, G.; von Schwerin, J.; Richards-Rissetto, H.; De Amicis, R.
2011-09-01
Continuous technological advances in surveying, computing and digital-content delivery are strongly contributing to a change in the way Cultural Heritage is "perceived": new tools and methodologies for documentation, reconstruction and research are being created to assist not only scholars, but also to reach more potential users (e.g. students and tourists) willing to access more detailed information about art history and archaeology. 3D computer-simulated models, sometimes set in virtual landscapes, offer for example the chance to explore possible hypothetical reconstructions, while on-line GIS resources can help interactive analyses of relationships and change over space and time. While for some research purposes a traditional 2D approach may suffice, this is not the case for more complex analyses concerning spatial and temporal features of architecture, like for example the relationship of architecture and landscape, visibility studies etc. The project therefore aims at creating a tool, called "QueryArch3D", which enables the web-based visualisation and queries of an interactive, multi-resolution 3D model in the framework of Cultural Heritage. More specifically, a complete Maya archaeological site, located in Copan (Honduras), has been chosen as a case study to test and demonstrate the platform's capabilities. Much of the site has been surveyed and modelled at different levels of detail (LoD) and the geometric model has been semantically segmented and integrated with attribute data gathered from several external data sources. The paper describes the characteristics of the research work, along with its implementation issues and the initial results of the developed prototype.
FFI: A software tool for ecological monitoring
Duncan C. Lutes; Nathan C. Benson; MaryBeth Keifer; John F. Caratti; S. Austin Streetman
2009-01-01
A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool. FFI provides software...
SMARTE: IMPROVING REVITALIZATION DECISIONS (BERLIN, GERMANY)
The U.S.-German Bilateral Working Group is developing Site-specific Management Approaches and Redevelopment Tools (SMART). In the U.S., the SMART compilation is housed in a web-based, decision support tool called SMARTe. All tools within SMARTe that are developed specifically for...
Enhancement of the EPA Stormwater BMP Decision-Support Tool (SUSTAIN)
U.S. Environmental Protection Agency (EPA) has been developing and improving a decision-support tool for placement of stormwater best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment and Analysis...
2D Sub-Pixel Disparity Measurement Using QPEC / Medicis
NASA Astrophysics Data System (ADS)
Cournet, M.; Giros, A.; Dumas, L.; Delvit, J. M.; Greslou, D.; Languille, F.; Blanchet, G.; May, S.; Michel, J.
2016-06-01
In the frame of its earth observation missions, CNES created a library called QPEC and one of its launchers, called Medicis. QPEC / Medicis is a sub-pixel two-dimensional stereo matching algorithm that works on an image pair. This tool is a block matching algorithm, which means that it is based on a local method. Moreover, it does not regularize the results found. It proposes several matching costs, such as the Zero mean Normalised Cross-Correlation or statistical measures (the Mutual Information being one of them), and different match validation flags. QPEC / Medicis is able to compute a two-dimensional dense disparity map with subpixel precision. Hence, it is more versatile than disparity estimation methods found in the computer vision literature, which often assume an epipolar geometry. CNES uses Medicis, among other applications, during the in-orbit image quality commissioning of earth observation satellites. For instance, the Pléiades-HR 1A & 1B and the Sentinel-2 geometric calibrations are based on this block matching algorithm. Over the years, it has become a common tool in ground segments for in-flight monitoring purposes. For these two kinds of applications, the two-dimensional search and the local sub-pixel measure without regularization can be essential. This tool is also used to automatically generate digital elevation models, a purpose for which it was not initially dedicated. This paper deals with the QPEC / Medicis algorithm. It also presents some of its CNES applications (in-orbit commissioning, in-flight monitoring or digital elevation model generation). Medicis software is distributed outside the CNES as well. This paper finally describes some of these external applications using Medicis, such as ground displacement measurement or intra-oral scanning in the dental domain.
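The core matching-cost idea can be sketched as follows: a zero-mean normalised cross-correlation (ZNCC) block match with parabolic sub-pixel refinement along one axis. This is only a one-dimensional toy version of what QPEC / Medicis does in 2D with several cost functions and validation flags.

```python
# Toy ZNCC block-matching sketch with parabolic sub-pixel refinement along one
# axis (QPEC / Medicis searches in 2D and offers several matching costs).
import numpy as np

def zncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def match_column(left, right, row, col, half=4, search=6):
    ref = left[row - half:row + half + 1, col - half:col + half + 1]
    scores = []
    for d in range(-search, search + 1):
        c = col + d
        cand = right[row - half:row + half + 1, c - half:c + half + 1]
        scores.append(zncc(ref, cand))
    scores = np.array(scores)
    i = int(scores.argmax())
    d_int = i - search
    if 0 < i < len(scores) - 1:                      # parabolic sub-pixel peak
        s_m, s_0, s_p = scores[i - 1], scores[i], scores[i + 1]
        d_sub = 0.5 * (s_m - s_p) / (s_m - 2 * s_0 + s_p)
    else:
        d_sub = 0.0
    return d_int + d_sub, scores[i]

# Synthetic pair: the right image is the left image shifted by 3 pixels.
rng = np.random.default_rng(7)
left = rng.random((64, 64))
right = np.roll(left, 3, axis=1)
disp, score = match_column(left, right, row=32, col=30)
print(f"estimated disparity: {disp:.2f} px (ZNCC = {score:.3f})")
```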
Estimating Software-Development Costs With Greater Accuracy
NASA Technical Reports Server (NTRS)
Baker, Dan; Hihn, Jairus; Lum, Karen
2008-01-01
COCOMOST is a computer program for use in estimating software development costs. The goal in the development of COCOMOST was to increase estimation accuracy in three ways: (1) develop a set of sensitivity software tools that return not only estimates of costs but also the estimation error; (2) using the sensitivity software tools, precisely define the quantities of data needed to adequately tune cost estimation models; and (3) build a repository of software-cost-estimation information that NASA managers can retrieve to improve the estimates of costs of developing software for their project. COCOMOST implements a methodology, called '2cee', in which a unique combination of well-known pre-existing data-mining and software-development-effort-estimation techniques are used to increase the accuracy of estimates. COCOMOST utilizes multiple models to analyze historical data pertaining to software-development projects and performs an exhaustive data-mining search over the space of model parameters to improve the performances of effort-estimation models. Thus, it is possible to both calibrate and generate estimates at the same time. COCOMOST is written in the C language for execution in the UNIX operating system.
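The sketch below is not the '2cee' methodology; it only illustrates the general idea of calibrating a COCOMO-style power-law effort model on historical projects and reporting an estimate together with an error band, using synthetic data.

```python
# Sketch: calibrate  effort = a * KSLOC**b  on (synthetic) past projects and
# report an estimate with a rough uncertainty band. Not COCOMOST's '2cee'.
import numpy as np

rng = np.random.default_rng(5)
ksloc = rng.uniform(5, 400, 40)                         # past project sizes
effort = 2.8 * ksloc**1.1 * rng.lognormal(0, 0.25, 40)  # person-months, with noise

# Linear least squares in log space: log(E) = log(a) + b*log(KSLOC)
A = np.column_stack([np.ones_like(ksloc), np.log(ksloc)])
coef, *_ = np.linalg.lstsq(A, np.log(effort), rcond=None)
a, b = np.exp(coef[0]), coef[1]
resid_sd = np.std(np.log(effort) - A @ coef)            # log-space residual spread

new_size = 120.0
estimate = a * new_size**b
low, high = estimate * np.exp(-resid_sd), estimate * np.exp(resid_sd)
print(f"a={a:.2f}, b={b:.2f}; estimate for {new_size:.0f} KSLOC: "
      f"{estimate:.0f} person-months (rough band: {low:.0f}-{high:.0f})")
```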
Solar Field Optical Characterization at Stillwater Geothermal/Solar Hybrid Plant
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Guangdong; Turchi, Craig
Concentrating solar power (CSP) can provide additional thermal energy to boost geothermal plant power generation. For a newly constructed solar field at a geothermal power plant site, it is critical to properly characterize its performance so that the prediction of thermal power generation can be derived to develop an optimum operating strategy for a hybrid system. In the past, laboratory characterization of a solar collector has often extended into the solar field performance model and has been used to predict the actual solar field performance, disregarding realistic impacting factors. In this work, an extensive measurement on mirror slope error and receiver position error has been performed in the field by using the optical characterization tool called Distant Observer (DO). Combining a solar reflectance sampling procedure, a newly developed solar characterization program called FirstOPTIC and public software for annual performance modeling called System Advisor Model (SAM), a comprehensive solar field optical characterization has been conducted, thus allowing for an informed prediction of solar field annual performance. The paper illustrates this detailed solar field optical characterization procedure and demonstrates how the results help to quantify an appropriate tracking-correction strategy to improve solar field performance. In particular, it is found that an appropriate tracking-offset algorithm can improve the solar field performance by about 15%. The work here provides a valuable reference for the growing CSP industry.
Mobile Phone Call Data as a Regional Socio-Economic Proxy Indicator
Šćepanović, Sanja; Mishkovski, Igor; Hui, Pan; Nurminen, Jukka K.; Ylä-Jääski, Antti
2015-01-01
The advent of publishing anonymized call detail records opens the door for temporal and spatial human dynamics studies. Such studies, besides being useful for creating universal models for mobility patterns, could be also used for creating new socio-economic proxy indicators that will not rely only on the local or state institutions. In this paper, from the frequency of calls at different times of the day, in different small regional units (sub-prefectures) in Côte d'Ivoire, we infer users' home and work sub-prefectures. This division of users enables us to analyze different mobility and calling patterns for the different regions. We then compare how those patterns correlate to the data from other sources, such as: news for particular events in the given period, census data, economic activity, poverty index, power plants and energy grid data. Our results show high correlation in many of the cases revealing the diversity of socio-economic insights that can be inferred using only mobile phone call data. The methods and the results may be particularly relevant to policy-makers engaged in poverty reduction initiatives as they can provide an affordable tool in the context of resource-constrained developing economies, such as Côte d'Ivoire's. PMID:25897957
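The home/work assignment step can be sketched with a few lines of Python; the record layout and hour thresholds below are assumptions, not the paper's exact procedure.

```python
# Sketch of the home/work inference idea: "home" is the sub-prefecture with the
# most night-time calls, "work" the one with the most working-hour calls.
from collections import Counter, defaultdict

# (user_id, hour_of_day, sub_prefecture) call records -- illustrative only
records = [
    (1, 22, "SP-12"), (1, 23, "SP-12"), (1, 10, "SP-03"), (1, 14, "SP-03"),
    (2, 21, "SP-07"), (2, 2, "SP-07"), (2, 11, "SP-07"), (2, 15, "SP-09"),
]

night = defaultdict(Counter)   # calls placed 20:00-06:00
day = defaultdict(Counter)     # calls placed 09:00-17:00
for user, hour, sp in records:
    if hour >= 20 or hour < 6:
        night[user][sp] += 1
    elif 9 <= hour < 17:
        day[user][sp] += 1

for user in sorted(night):
    home = night[user].most_common(1)[0][0]
    work = day[user].most_common(1)[0][0] if day[user] else home
    print(f"user {user}: home={home}  work={work}")
```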
Ghosh, Arup; Qin, Shiming; Lee, Jooyeoun; Wang, Gi-Nam
2016-01-01
Operational faults and behavioural anomalies associated with PLC control processes take place often in a manufacturing system. Real time identification of these operational faults and behavioural anomalies is necessary in the manufacturing industry. In this paper, we present an automated tool, called PLC Log-Data Analysis Tool (PLAT) that can detect them by using log-data records of the PLC signals. PLAT automatically creates a nominal model of the PLC control process and employs a novel hash table based indexing and searching scheme to satisfy those purposes. Our experiments show that PLAT is significantly fast, provides real time identification of operational faults and behavioural anomalies, and can execute within a small memory footprint. In addition, PLAT can easily handle a large manufacturing system with a reasonable computing configuration and can be installed in parallel to the data logging system to identify operational faults and behavioural anomalies effectively.
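A minimal sketch of the general idea, assuming a hypothetical log layout and much-simplified logic relative to PLAT, indexes the signal transitions observed in nominal runs in a hash table (a Python dict) and flags transitions in a new log that were never observed:

```python
# Dict-based nominal model of PLC log data (much simpler than PLAT itself);
# signal names and log layout are hypothetical.

def index_transitions(log):
    """log: ordered list of (signal, value). Returns {signal: {(prev, new), ...}}."""
    last, nominal = {}, {}
    for signal, value in log:
        if signal in last:
            nominal.setdefault(signal, set()).add((last[signal], value))
        last[signal] = value
    return nominal

def find_anomalies(log, nominal):
    """Flag transitions in a new log that were never seen in the nominal model."""
    last, anomalies = {}, []
    for i, (signal, value) in enumerate(log):
        if signal in last and (last[signal], value) not in nominal.get(signal, set()):
            anomalies.append((i, signal, last[signal], value))
        last[signal] = value
    return anomalies

nominal_log = [("conveyor", 0), ("conveyor", 1), ("gripper", 0),
               ("gripper", 1), ("conveyor", 0), ("gripper", 0)]
new_log = [("conveyor", 0), ("conveyor", 1), ("conveyor", 2),  # unseen jump 1 -> 2
           ("gripper", 0), ("gripper", 1)]

model = index_transitions(nominal_log)
print("anomalous transitions:", find_anomalies(new_log, model))
```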
CRIE: An automated analyzer for Chinese texts.
Sung, Yao-Ting; Chang, Tao-Hsing; Lin, Wei-Chun; Hsieh, Kuan-Sheng; Chang, Kuo-En
2016-12-01
Textual analysis has been applied to various fields, such as discourse analysis, corpus studies, text leveling, and automated essay evaluation. Several tools have been developed for analyzing texts written in alphabetic languages such as English and Spanish. However, currently there is no tool available for analyzing Chinese-language texts. This article introduces a tool for the automated analysis of simplified and traditional Chinese texts, called the Chinese Readability Index Explorer (CRIE). Composed of four subsystems and incorporating 82 multilevel linguistic features, CRIE is able to conduct the major tasks of segmentation, syntactic parsing, and feature extraction. Furthermore, the integration of linguistic features with machine learning models enables CRIE to provide leveling and diagnostic information for texts in language arts, texts for learning Chinese as a foreign language, and texts with domain knowledge. The usage and validation of the functions provided by CRIE are also introduced.
Information Technology Research Services: Powerful Tools to Keep Up with a Rapidly Moving Field
NASA Technical Reports Server (NTRS)
Hunter, Paul
2010-01-01
Many firms offer Information Technology Research reports, analyst calls, conferences, seminars, tools, leadership development, etc. These entities include Gartner, Forrester Research, IDC, The Burton Group, Society for Information Management, InfoTech Research, The Corporate Executive Board, and so on. This talk will cover how a number of such services are being used at the Goddard Space Flight Center to improve our IT management practices, workforce skills, approach to innovation, and service delivery. These tools and services are used across the workforce, from the executive leadership to the IT worker. The presentation will cover the types of services each vendor provides and their primary engagement model. The use of these services at other NASA Centers and Headquarters will be included. In addition, I will explain how two of these services are available now to the entire NASA IT workforce through enterprise-wide subscriptions.
Reliability Analysis for AFTI-F16 SRFCS Using ASSIST and SURE
NASA Technical Reports Server (NTRS)
Wu, N. Eva
2001-01-01
This paper reports the results of a study on reliability analysis of an AFTI-F16 Self-Repairing Flight Control System (SRFCS) using the software tools SURE (Semi-Markov Unreliability Range Evaluator) and ASSIST (Abstract Semi-Markov Specification Interface to the SURE Tool). The purpose of the study is to investigate the potential utility of these software tools in the ongoing effort of the NASA Aviation Safety Program, where the class of systems must be extended beyond the originally intended class of electronic digital processors. The study concludes that SURE and ASSIST are applicable to reliability analysis of flight control systems. They are especially efficient for sensitivity analysis that quantifies the dependence of system reliability on model parameters. The study also confirms an earlier finding on the dominant role of a parameter called failure coverage. The paper will remark on issues related to the improvement of coverage and the optimization of redundancy level.
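To see why coverage tends to dominate, consider a textbook duplex (two-unit) system rather than the AFTI-F16 model itself: with per-unit failure rate lam, no repair, and coverage c, the mission reliability has a standard closed form, and the unreliability is governed by the uncovered-failure term once (1 - c) exceeds the order of lam*t.

```python
# Coverage sensitivity on a textbook duplex system (not the AFTI-F16 SRFCS model):
#   R(t) = exp(-2*lam*t) + 2*c*(exp(-lam*t) - exp(-2*lam*t))
import math

lam = 1e-4          # failures per hour (assumed)
t = 10.0            # mission time in hours (assumed)

for c in [0.90, 0.99, 0.999, 0.9999]:
    r = math.exp(-2 * lam * t) + 2 * c * (math.exp(-lam * t) - math.exp(-2 * lam * t))
    print(f"coverage={c:<7}  unreliability={1 - r:.3e}")
```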
NASA Technical Reports Server (NTRS)
Thakur, Siddarth; Wright, Jeffrey
2006-01-01
The traditional design and analysis practice for advanced propulsion systems, particularly chemical rocket engines, relies heavily on expensive full-scale prototype development and testing. Over the past decade, use of high-fidelity analysis and design tools such as CFD early in the product development cycle has been identified as one way to alleviate testing costs and to develop these devices better, faster and cheaper. Increased emphasis is being placed on developing and applying CFD models to simulate the flow field environments and performance of advanced propulsion systems. This necessitates the development of next generation computational tools which can be used effectively and reliably in a design environment by non-CFD specialists. A computational tool, called Loci-STREAM is being developed for this purpose. It is a pressure-based, Reynolds-averaged Navier-Stokes (RANS) solver for generalized unstructured grids, which is designed to handle all-speed flows (incompressible to hypersonic) and is particularly suitable for solving multi-species flow in fixed-frame combustion devices. Loci-STREAM integrates proven numerical methods for generalized grids and state-of-the-art physical models in a novel rule-based programming framework called Loci which allows: (a) seamless integration of multidisciplinary physics in a unified manner, and (b) automatic handling of massively parallel computing. The objective of the ongoing work is to develop a robust simulation capability for combustion problems in rocket engines. As an initial step towards validating this capability, a model problem is investigated in the present study which involves a gaseous oxygen/gaseous hydrogen (GO2/GH2) shear coaxial single element injector, for which experimental data are available. The sensitivity of the computed solutions to grid density, grid distribution, different turbulence models, and different near-wall treatments is investigated. A refined grid, which is clustered in the vicinity of the solid walls as well as the flame, is used to obtain a steady state solution which may be considered as the best solution attainable with the steady-state RANS methodology. From a design point of view, quick turnaround times are desirable; with this in mind, coarser grids are also employed and the resulting solutions are evaluated with respect to the fine grid solution.
HELIOGate, a Portal for the Heliophysics Community
NASA Astrophysics Data System (ADS)
Pierantoni, Gabriele; Carley, Eoin
2014-10-01
Heliophysics is the branch of physics that investigates the interactions between the Sun and the other bodies of the solar system. Heliophysicists rely on data collected from numerous sources scattered across the Solar System. The data collected from these sources is processed to extract metadata, and the metadata extracted in this fashion is then used to build indexes of features and events called catalogues. Heliophysicists also develop conceptual and mathematical models of the phenomena and the environment of the Solar System. More specifically, they investigate the physical characteristics of the phenomena and they simulate how they propagate throughout the Solar System with mathematical and physical abstractions called propagation models. HELIOGate aims at addressing the need to combine and orchestrate existing web services in a flexible and easily configurable fashion to tackle different scientific questions. HELIOGate also offers a tool capable of connecting to sizeable computation and storage infrastructures to execute data processing codes that are needed to calibrate raw data and to extract metadata.
Gelkopf, Marc; Haimov, Sigal; Lapid, Liron
2015-02-01
Long-term tele-counseling can potentially be a potent intervention mode in war- and terror-related community crisis situations. We aimed to examine a unique long-term telephone-administered intervention, targeting community trauma-related crisis situations by use of various techniques and approaches. 142 participants were evaluated using a non-intrusive by-proxy methodology appraising counselors' standard verbatim reports. Various background measures and elements in the intervention were quantitatively assessed, along with symptomatology and functioning at the onset and end of intervention. About 1/4 of the wide variety of clients called for someone else in addition to themselves, and most called due to a past event rather than a present crisis situation. The intervention successfully reduced posttraumatic stress symptoms and improved functioning. Most interventions included psychosocial education with additional elements, e.g., self-help tools, and almost 60% included also in-depth processes. In sum, tele-counseling might be a viable and effective intervention model for community-related traumatic stress.
Solar Wind Acceleration: Modeling Effects of Turbulent Heating in Open Flux Tubes
NASA Astrophysics Data System (ADS)
Woolsey, Lauren N.; Cranmer, Steven R.
2014-06-01
We present two self-consistent coronal heating models that determine the properties of the solar wind generated and accelerated in magnetic field geometries that are open to the heliosphere. These models require only the radial magnetic field profile as input. The first code, ZEPHYR (Cranmer et al. 2007) is a 1D MHD code that includes the effects of turbulent heating created by counter-propagating Alfven waves rather than relying on empirical heating functions. We present the analysis of a large grid of modeled flux tubes (> 400) and the resulting solar wind properties. From the models and results, we recreate the observed anti-correlation between wind speed at 1 AU and the so-called expansion factor, a parameterization of the magnetic field profile. We also find that our models follow the same observationally-derived relation between temperature at 1 AU and wind speed at 1 AU. We continue our analysis with a newly-developed code written in Python called TEMPEST (The Efficient Modified-Parker-Equation-Solving Tool) that runs an order of magnitude faster than ZEPHYR due to a set of simplifying relations between the input magnetic field profile and the temperature and wave reflection coefficient profiles. We present these simplifying relations as a useful result in themselves as well as the anti-correlation between wind speed and expansion factor also found with TEMPEST. Due to the nature of the algorithm TEMPEST utilizes to find solar wind solutions, we can effectively separate the two primary ways in which Alfven waves contribute to solar wind acceleration: 1) heating the surrounding gas through a turbulent cascade and 2) providing a separate source of wave pressure. We intend to make TEMPEST easily available to the public and suggest that TEMPEST can be used as a valuable tool in the forecasting of space weather, either as a stand-alone code or within an existing modeling framework.
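As a rough illustration of the kind of critical-point wind solution that codes such as ZEPHYR and TEMPEST build on, the sketch below solves the classic isothermal Parker equation rather than the modified, turbulence-heated equation those tools use; the temperature, mean molecular weight and radii are illustrative assumptions only.

```python
import numpy as np
from scipy.optimize import brentq

GM_SUN = 1.327e20                                  # m^3 s^-2

def parker_speed(r, T=1.5e6, mu=0.6):
    """Isothermal Parker wind speed v(r) in m/s (illustrative, not the TEMPEST model)."""
    k_B, m_p = 1.3807e-23, 1.6726e-27
    cs = np.sqrt(k_B * T / (mu * m_p))             # isothermal sound speed
    rc = GM_SUN / (2.0 * cs**2)                    # critical (sonic) radius

    def f(u, rr):                                  # transcendental Parker relation, u = v/cs
        return u**2 - np.log(u**2) - (4.0 * np.log(rr / rc) + 4.0 * rc / rr - 3.0)

    r = np.atleast_1d(np.asarray(r, dtype=float))
    v = np.empty_like(r)
    for i, rr in enumerate(r):
        if rr < rc:                                # subsonic branch below the critical point
            v[i] = cs * brentq(f, 1e-8, 1.0 - 1e-12, args=(rr,))
        else:                                      # supersonic wind branch beyond it
            v[i] = cs * brentq(f, 1.0 + 1e-12, 50.0, args=(rr,))
    return v

if __name__ == "__main__":
    v_1au = parker_speed(np.array([1.496e11]))[0]  # 1 AU
    print(f"v(1 AU) ~ {v_1au / 1e3:.0f} km/s")
```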
Predicting silicon pore optics
NASA Astrophysics Data System (ADS)
Vacanti, Giuseppe; Barriére, Nicolas; Bavdaz, Marcos; Chatbi, Abdelhakim; Collon, Maximilien; Dekker, Danielle; Girou, David; Günther, Ramses; van der Hoeven, Roy; Landgraf, Boris; Sforzini, Jessica; Vervest, Mark; Wille, Eric
2017-09-01
Continuing improvement of Silicon Pore Optics (SPO) calls for regular extension and validation of the tools used to model and predict their X-ray performance. In this paper we present an updated geometrical model for the SPO optics and describe how we make use of the surface metrology collected during each of the SPO manufacturing runs. The new geometrical model affords the user a finer degree of control over the mechanical details of the SPO stacks, while a standard interface has been developed to make use of any type of metrology that can return changes in the local surface normal of the reflecting surfaces. Comparisons between the predicted and actual performance of sample optics will be shown and discussed.
VCFR: A package to manipulate and visualize variant call format data in R
USDA-ARS?s Scientific Manuscript database
Software to call single nucleotide polymorphisms or related genetic variants has converged on the variant call format (vcf) as their output format of choice. This has created a need for tools to work with vcf files. While an increasing number of software exists to read vcf data, many of them only ex...
ERIC Educational Resources Information Center
Hamel, Marie-Josee; Caws, Catherine
2010-01-01
This article discusses CALL development from both educational and ergonomic perspectives. It focuses on the learner-task-tool interaction, in particular on the aspects contributing to its overall quality, herein called "usability." Two pilot studies are described that were carried out with intermediate to advanced learners of French in two…
Flexible Environmental Modeling with Python and Open - GIS
NASA Astrophysics Data System (ADS)
Pryet, Alexandre; Atteia, Olivier; Delottier, Hugo; Cousquer, Yohann
2015-04-01
Numerical modeling now represents a prominent task of environmental studies. During the last decades, numerous commercial programs have been made available to environmental modelers. These software applications offer user-friendly graphical user interfaces that allow an efficient management of many case studies. However, they suffer from a lack of flexibility, and closed-source policies impede source code reviewing and enhancement for original studies. Advanced modeling studies require flexible tools capable of managing thousands of model runs for parameter optimization, uncertainty and sensitivity analysis. In addition, there is a growing need for the coupling of various numerical models associating, for instance, groundwater flow modeling with multi-species geochemical reactions. Researchers have produced hundreds of powerful open-source command line programs. However, there is a need for a flexible graphical user interface allowing an efficient processing of the geospatial data that comes along with any environmental study. Here, we present the advantages of using the free and open-source QGIS platform and the Python scripting language for conducting environmental modeling studies. The interactive graphical user interface is first used for the visualization and pre-processing of input geospatial datasets. The Python scripting language is then employed for further input data processing, calls to one or several models, and post-processing of model outputs. Model results are eventually sent back to the GIS program, processed and visualized. This approach combines the advantages of interactive graphical interfaces and the flexibility of the Python scripting language for data processing and model calls. The numerous Python modules available facilitate geospatial data processing and numerical analysis of model outputs. Once input data have been prepared with the graphical user interface, models may be run thousands of times from the command line with sequential or parallel calls. We illustrate this approach with several case studies in groundwater hydrology and geochemistry and provide links to several Python libraries that facilitate pre- and post-processing operations.
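A minimal sketch of the scripted modelling loop described above, under assumed parameter names and a hypothetical model; a real study would replace the stand-in function with a subprocess call to the modelling executable and exchange data through input/output files or GIS layers.

```python
import itertools
from concurrent.futures import ProcessPoolExecutor

def run_model(hydraulic_conductivity, recharge):
    # Stand-in for one model call.  A real study would write an input file and
    # invoke the executable, e.g. subprocess.run(["my_model", "input.dat"], check=True),
    # then read the output grid back; a toy formula keeps the sketch runnable.
    return 100.0 - 5.0 * hydraulic_conductivity + 0.01 * recharge

if __name__ == "__main__":
    k_values = [1.0, 5.0, 10.0]                    # m/day (illustrative)
    recharge_values = [100.0, 300.0]               # mm/year (illustrative)
    runs = list(itertools.product(k_values, recharge_values))

    with ProcessPoolExecutor() as pool:            # thousands of runs can be dispatched in parallel
        heads = list(pool.map(run_model, *zip(*runs)))

    for (k, r), h in zip(runs, heads):
        print(f"K={k:5.1f}  recharge={r:6.1f}  simulated head={h:7.2f}")
```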
Enhancement of the EPA Stormwater BMP Decision-Support Tool (SUSTAIN) - slides
U.S. Environmental Protection Agency (EPA) has been developing and improving a decision-support tool for placement of stormwater best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment and Analysis...
3D Visualization for Phoenix Mars Lander Science Operations
NASA Technical Reports Server (NTRS)
Edwards, Laurence; Keely, Leslie; Lees, David; Stoker, Carol
2012-01-01
Planetary surface exploration missions present considerable operational challenges in the form of substantial communication delays, limited communication windows, and limited communication bandwidth. 3D visualization software was developed and delivered to the 2008 Phoenix Mars Lander (PML) mission. The components of the system include an interactive 3D visualization environment called Mercator, terrain reconstruction software called the Ames Stereo Pipeline, and a server providing distributed access to terrain models. The software was successfully utilized during the mission for science analysis, site understanding, and science operations activity planning. A terrain server was implemented that provided distribution of terrain models from a central repository to clients running the Mercator software. The Ames Stereo Pipeline generates accurate, high-resolution, texture-mapped, 3D terrain models from stereo image pairs. These terrain models can then be visualized within the Mercator environment. The central cross-cutting goal for these tools is to provide an easy-to-use, high-quality, full-featured visualization environment that enhances the mission science team's ability to develop low-risk, productive science activity plans. In addition, for the Mercator and Viz visualization environments, extensibility and adaptability to different missions and application areas are key design goals.
Unifying Model-Based and Reactive Programming within a Model-Based Executive
NASA Technical Reports Server (NTRS)
Williams, Brian C.; Gupta, Vineet; Norvig, Peter (Technical Monitor)
1999-01-01
Real-time, model-based deduction has recently emerged as a vital component in AI's tool box for developing highly autonomous reactive systems. Yet one of the current hurdles towards developing model-based reactive systems is the number of methods simultaneously employed, and their corresponding melange of programming and modeling languages. This paper offers an important step towards unification. We introduce RMPL, a rich modeling language that combines probabilistic, constraint-based modeling with reactive programming constructs, while offering a simple semantics in terms of hidden state Markov processes. We introduce probabilistic, hierarchical constraint automata (PHCA), which allow Markov processes to be expressed in a compact representation that preserves the modularity of RMPL programs. Finally, a model-based executive, called Reactive Burton, is described that exploits this compact encoding to perform efficient simulation, belief state update and control sequence generation.
Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.
Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades
2015-01-01
DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. This work presents an automated genotyping tool from DNA gel electrophoresis images, called GELect, which was written in Java and made available through the imageJ framework. With a novel automated image processing workflow, the tool can accurately segment lanes from a gel matrix, intelligently extract distorted and even doublet bands that are difficult to identify by existing image processing tools. Consequently, genotyping from DNA gel electrophoresis can be performed automatically allowing users to efficiently conduct large scale DNA fingerprinting via DNA gel electrophoresis. The software is freely available from http://www.biotec.or.th/gi/tools/gelect.
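As a toy illustration of the first workflow step (lane segmentation), the sketch below applies the common projection-profile idea to a synthetic image; it is not the GELect algorithm, and the image, thresholds and peak parameters are assumptions for demonstration only.

```python
import numpy as np
from scipy.signal import find_peaks

def lane_centres(gel_image, min_separation=20):
    """Estimate lane centre columns from a 2-D intensity array (toy approach)."""
    profile = gel_image.sum(axis=0)                          # project intensity onto the x-axis
    profile = (profile - profile.min()) / (np.ptp(profile) + 1e-9)
    peaks, _ = find_peaks(profile, distance=min_separation, prominence=0.1)
    return peaks

if __name__ == "__main__":
    rng = np.random.default_rng(0)                           # synthetic "gel" with 3 bright lanes
    img = rng.normal(0.05, 0.01, size=(200, 300))
    for x in (60, 150, 240):
        img[:, x - 5:x + 5] += 1.0
    print("estimated lane centres:", lane_centres(img))
```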
2013-01-01
Background The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. Results We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method out-performed commonly used RNA-Seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Conclusions Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools. PMID:24209455
Sturgill, David; Malone, John H; Sun, Xia; Smith, Harold E; Rabinow, Leonard; Samson, Marie-Laure; Oliver, Brian
2013-11-09
The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method out-performed commonly used RNA-Seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools.
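A small, hypothetical illustration of the quantitative/qualitative filtering idea applied to junction calls; the record fields and thresholds below are assumptions for the sketch, not Spanki's actual criteria.

```python
# Hypothetical junction records; fields and thresholds are assumptions for the sketch.
junctions = [
    {"id": "chr2L:1000-2000", "coverage": 42, "min_overhang": 12, "canonical_motif": True},
    {"id": "chr2L:1000-2100", "coverage": 2,  "min_overhang": 4,  "canonical_motif": False},
    {"id": "chr3R:500-900",   "coverage": 17, "min_overhang": 8,  "canonical_motif": True},
]

def high_confidence(j, min_cov=5, min_overhang=8):
    """Require read support, long anchors and a canonical motif (guards against
    the false alignments to minor motifs described in the abstract)."""
    return (j["coverage"] >= min_cov
            and j["min_overhang"] >= min_overhang
            and j["canonical_motif"])

kept = [j["id"] for j in junctions if high_confidence(j)]
print(f"{len(kept)} of {len(junctions)} junctions retained:", kept)
```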
NASA Astrophysics Data System (ADS)
Abdel-Aal, H. A.; Mansori, M. El
2012-12-01
Cutting tools are subject to extreme thermal and mechanical loads during operation. The state of loading is intensified in a dry cutting environment, especially when cutting so-called hard-to-cut materials. Although the effect of mechanical loads on tool failure has been extensively studied, detailed studies on the effect of thermal dissipation on the deterioration of the cutting tool are rather scarce. In this paper we study failure of coated carbide tools due to thermal loading. The study emphasizes the role assumed by the thermo-physical properties of the tool material in enhancing or preventing mass attrition of the cutting elements within the tool. It is shown that, within a comprehensive view of the nature of conduction in the tool zone, thermal conduction is not solely affected by temperature. Rather, it is a function of the so-called thermodynamic forces: the stress, the strain, the strain rate, the rate of temperature rise, and the temperature gradient. Although the description of thermal conduction is non-linear within such a framework, it is beneficial to employ this form because it facilitates a full mechanistic understanding of the thermal activation of tool wear.
Numerical modeling of axi-symmetrical cold forging process by "Pseudo Inverse Approach"
NASA Astrophysics Data System (ADS)
Halouani, A.; Li, Y. M.; Abbes, B.; Guo, Y. Q.
2011-05-01
The incremental approach is widely used for forging process modeling; it gives good strain and stress estimation, but it is time consuming. A fast Inverse Approach (IA) has been developed for axi-symmetric cold forging modeling [1-2]. This approach exploits to the maximum the knowledge of the final part's shape, and the assumptions of proportional loading and simplified tool actions make the IA simulation very fast. The IA has proved very useful for tool design and optimization because of its rapidity and good strain estimation. However, the assumptions mentioned above cannot provide good stress estimation because the loading history is neglected. A new approach called "Pseudo Inverse Approach" (PIA) was proposed by Batoz, Guo et al. [3] for sheet forming modeling, which keeps the IA's advantages but gives good stress estimation by taking the loading history into consideration. Our aim in this paper is to adapt the PIA to cold forging modeling. The main developments in the PIA are summarized as follows: a few intermediate configurations are generated for the given tool positions to account for the deformation history; the strain increment is calculated by the inverse method between the previous and current configurations; and an incremental algorithm for the plastic integration is used in the PIA instead of the total constitutive law used in the IA. An example is used to show the effectiveness and limitations of the PIA for cold forging process modeling.
Mission Assignment Model and Simulation Tool for Different Types of Unmanned Aerial Vehicles
2008-09-01
(Excerpts) Abbreviations: AAA (Anti Aircraft Artillery), ATO (Air Tasking Order), BDA (Battle Damage Assessment), DES (Discrete Event Simulation) ... the clock is advanced in small, fixed time steps. Since the value of simulated time is important in DES, an internal variable called the simulation clock ... Yücel Alver, Captain, Turkish Air Force, B.S., Turkish Air Force Academy, 2000; Murat Özdoğan, 1st Lieutenant, Turkish Air Force, B.S., Turkish ...
Simulation of Etching in Chlorine Discharges Using an Integrated Feature Evolution-Plasma Model
NASA Technical Reports Server (NTRS)
Hwang, Helen H.; Bose, Deepak; Govindan, T. R.; Meyyappan, M.; Biegel, Bryan (Technical Monitor)
2002-01-01
To better utilize its vast collection of heterogeneous resources that are geographically distributed across the United States, NASA is constructing a computational grid called the Information Power Grid (IPG). This paper describes various tools and techniques that we are developing to measure and improve the performance of a broad class of NASA applications when run on the IPG. In particular, we are investigating the areas of grid benchmarking, grid monitoring, user-level application scheduling, and decentralized system-level scheduling.
Modeling and modification of medical 3D objects. The benefit of using a haptic modeling tool.
Kling-Petersen, T; Rydmark, M
2000-01-01
The Computer Laboratory of the medical faculty in Goteborg (Mednet) has, since the end of 1998, been one of a limited number of participants in the development of a new modeling tool together with SensAble Technologies Inc. [http://www.sensable.com/]. The software, called SensAble FreeForm, was officially released at Siggraph in September 1999. Briefly, the software mimics the modeling techniques traditionally used by clay artists. An imported model or a user-defined block of "clay" can be modified using different tools such as a ball, square block, scrape, etc., via the use of a SensAble Technologies PHANToM haptic arm. The model will deform in 3D as a result of touching the "clay" with any selected tool, and the amount of deformation is linearly related to the force applied. By getting instantaneous haptic as well as visual feedback, precise and intuitive changes are easily made. While SensAble FreeForm lacks several of the features normally associated with a 3D modeling program (such as text handling, application of surface and bump maps, high-end rendering engines, etc.), its strength lies in the ability to rapidly create non-geometric 3D models. For medical use, very few anatomically correct models are created from scratch. However, FreeForm features tools that enable advanced modification of reconstructed or 3D-scanned models. One of the main problems with 3D laser scanning of medical specimens is that the technique usually leaves holes or gaps in the dataset corresponding to areas in shadow, such as orifices, deep grooves, etc. By using FreeForm's different tools, these defects are easily corrected and gaps are filled in. Similarly, traditional 3D reconstruction (based on serial sections, etc.) often shows artifacts as a result of the triangulation and/or tessellation processes. These artifacts usually manifest as unnatural ridges or uneven areas ("the accordion effect"). FreeForm contains a smoothing algorithm that enables the user to select an area to be modified and subsequently apply any given amount of smoothing to the object. While the final objects need to be exported for further 3D graphic manipulation, FreeForm addresses one of the most time-consuming problems of 3D modeling: the modification and creation of non-geometric 3D objects.
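As a simplified stand-in for the smoothing step described above, the sketch below applies area-restricted Laplacian smoothing to selected vertices of a tiny triangle mesh; FreeForm itself operates on a volumetric clay representation, so this is only an analogy, and the mesh, selection and parameters are illustrative.

```python
import numpy as np

def laplacian_smooth(vertices, faces, selected, iterations=10, lam=0.5):
    """Move each selected vertex toward the average of its mesh neighbours."""
    n = len(vertices)
    neighbours = [set() for _ in range(n)]
    for a, b, c in faces:                          # build vertex adjacency from triangles
        neighbours[a] |= {b, c}
        neighbours[b] |= {a, c}
        neighbours[c] |= {a, b}

    v = np.asarray(vertices, dtype=float).copy()
    for _ in range(iterations):
        new_v = v.copy()
        for i in selected:                         # only the user-selected region is touched
            if neighbours[i]:
                centroid = v[list(neighbours[i])].mean(axis=0)
                new_v[i] = v[i] + lam * (centroid - v[i])
        v = new_v
    return v

if __name__ == "__main__":
    # Tiny strip with one vertex pushed up to mimic a ridge ("accordion") artifact.
    verts = [(0, 0, 0), (1, 0, 0), (2, 0, 0), (0, 1, 0), (1, 1, 0.8), (2, 1, 0)]
    faces = [(0, 1, 4), (0, 4, 3), (1, 2, 5), (1, 5, 4)]
    smoothed = laplacian_smooth(verts, faces, selected=[4])
    print("vertex 4 before:", verts[4], "after:", np.round(smoothed[4], 3))
```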
Teaching Web Security Using Portable Virtual Labs
ERIC Educational Resources Information Center
Chen, Li-Chiou; Tao, Lixin
2012-01-01
We have developed a tool called Secure WEb dEvelopment Teaching (SWEET) to introduce security concepts and practices for web application development. This tool provides introductory tutorials, teaching modules utilizing virtualized hands-on exercises, and project ideas in web application security. In addition, the tool provides pre-configured…
Simulink/PARS Integration Support
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vacaliuc, B.; Nakhaee, N.
2013-12-18
The state of the art for signal processor hardware has far out-paced the development tools for placing applications on that hardware. In addition, signal processors are available in a variety of architectures, each uniquely capable of handling specific types of signal processing efficiently. With these processors becoming smaller and demanding less power, it has become possible to group multiple processors, a heterogeneous set of processors, into single systems. Different portions of the desired problem set can be assigned to different processor types as appropriate. As software development tools do not keep pace with these processors, especially when multiple processors of different types are used, a method is needed to enable software code portability among multiple processors and multiple types of processors along with their respective software environments. Sundance DSP, Inc. has developed a software toolkit called "PARS", whose objective is to provide a framework that uses suites of tools provided by different vendors, along with modeling tools and a real time operating system, to build an application that spans different processor types. The software language used to express the behavior of the system is a very high level modeling language, "Simulink", a MathWorks product. ORNL has used this toolkit to effectively implement several deliverables. This CRADA describes this collaboration between ORNL and Sundance DSP, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Femec, D.A.
This report describes two code-generating tools used to speed design and implementation of relational databases and user interfaces: CREATE-SCHEMA and BUILD-SCREEN. CREATE-SCHEMA produces the SQL commands that actually create and define the database. BUILD-SCREEN takes templates for data entry screens and generates the screen management system routine calls to display the desired screen. Both tools also generate the related FORTRAN declaration statements and precompiled SQL calls. Included with this report is the source code for a number of FORTRAN routines and functions used by the user interface. This code is broadly applicable to a number of different databases.
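A minimal sketch of the code-generation idea behind a CREATE-SCHEMA-like tool, with a purely hypothetical table specification; the FORTRAN declarations and precompiled SQL calls mentioned in the report are omitted here.

```python
def create_table_sql(table_name, columns):
    """columns: list of (name, sql_type, nullable) tuples -> CREATE TABLE statement."""
    lines = []
    for name, sql_type, nullable in columns:
        null_clause = "" if nullable else " NOT NULL"
        lines.append(f"    {name} {sql_type}{null_clause}")
    return f"CREATE TABLE {table_name} (\n" + ",\n".join(lines) + "\n);"

if __name__ == "__main__":
    # Hypothetical table specification, for illustration only.
    spec = [
        ("sample_id", "INTEGER", False),
        ("collected_on", "DATE", False),
        ("operator", "VARCHAR(40)", True),
    ]
    print(create_table_sql("samples", spec))
```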
NASA Astrophysics Data System (ADS)
Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.
2010-12-01
A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web services based queries. Together, these servers and central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third party collaborators include functions for data search and discovery, extensive graphing, data editing and export, HydroServer exploration, integration with the OpenMI workflow and modeling system, and an interface for data analysis through the R statistical package.
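As a rough sketch of the consume-and-analyse pattern a HydroDesktop-style client follows after discovering a series on a HydroServer, the example below parses a simplified, hypothetical WaterML-like fragment; the real WaterML schema is namespaced and far richer, and no actual web service is contacted.

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical WaterML-like fragment (the real schema is namespaced and richer).
WATERML_LIKE = """
<timeSeriesResponse>
  <timeSeries siteCode="SITE:0001" variableCode="discharge">
    <value dateTime="2010-10-01T00:00:00">120.5</value>
    <value dateTime="2010-10-02T00:00:00">118.2</value>
    <value dateTime="2010-10-03T00:00:00">131.7</value>
  </timeSeries>
</timeSeriesResponse>
"""

def parse_series(xml_text):
    root = ET.fromstring(xml_text)
    series = root.find("timeSeries")
    values = [(v.attrib["dateTime"], float(v.text)) for v in series.findall("value")]
    return series.attrib["siteCode"], series.attrib["variableCode"], values

if __name__ == "__main__":
    site, variable, values = parse_series(WATERML_LIKE)
    mean = sum(v for _, v in values) / len(values)
    print(f"{site} {variable}: {len(values)} values, mean = {mean:.1f}")
```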
Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.
Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real time application of this method at an ICU it is essential to adjust this mathematical tool for high sensitivity and distinct reliability. In this study, we will introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses.
Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology
Faltermeier, Rupert; Proescholdt, Martin A.; Bele, Sylvia; Brawanski, Alexander
2015-01-01
Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real time application of this method at an ICU it is essential to adjust this mathematical tool for high sensitivity and distinct reliability. In this study, we will introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses. PMID:26693250
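A generic illustration of the windowing-plus-coherence idea on synthetic ABP/ICP-like signals; the sampling rate, window length, overlap and frequency band below are exactly the kind of parameters such an optimization tunes, and none of the values come from the published method.

```python
import numpy as np
from scipy.signal import coherence

fs = 10.0                                          # Hz, assumed sampling rate
t = np.arange(0, 600, 1 / fs)                      # 10 minutes of synthetic data
rng = np.random.default_rng(1)
abp = np.sin(2 * np.pi * 0.02 * t) + 0.3 * rng.standard_normal(t.size)
icp = 0.8 * np.sin(2 * np.pi * 0.02 * t) + 0.3 * rng.standard_normal(t.size)

window_len = int(120 * fs)                         # 2-minute analysis windows
step = window_len // 2                             # 50 % overlap

for start in range(0, t.size - window_len + 1, step):
    f, cxy = coherence(abp[start:start + window_len],
                       icp[start:start + window_len], fs=fs, nperseg=512)
    band = (f > 0.005) & (f < 0.05)                # slow-wave band, illustrative
    print(f"t = {t[start]:5.0f} s   mean coherence in band = {cxy[band].mean():.2f}")
```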
Pohjola, Mikko V.; Pohjola, Pasi; Tainio, Marko; Tuomisto, Jouni T.
2013-01-01
The calls for knowledge-based policy and policy-relevant research invoke a need to evaluate and manage environment and health assessments and models according to their societal outcomes. This review explores how well the existing approaches to assessment and model performance serve this need. The perspectives to assessment and model performance in the scientific literature can be called: (1) quality assurance/control, (2) uncertainty analysis, (3) technical assessment of models, (4) effectiveness and (5) other perspectives, according to what is primarily seen to constitute the goodness of assessments and models. The categorization is not strict and methods, tools and frameworks in different perspectives may overlap. However, altogether it seems that most approaches to assessment and model performance are relatively narrow in their scope. The focus in most approaches is on the outputs and making of assessments and models. Practical application of the outputs and the consequential outcomes are often left unaddressed. It appears that more comprehensive approaches that combine the essential characteristics of different perspectives are needed. This necessitates a better account of the mechanisms of collective knowledge creation and the relations between knowledge and practical action. Some new approaches to assessment, modeling and their evaluation and management span the chain from knowledge creation to societal outcomes, but the complexity of evaluating societal outcomes remains a challenge. PMID:23803642
ESIF Call for High-Impact Integrated Projects | Energy Systems Integration
As a U.S. Department of Energy user facility, the Energy Systems Integration Facility (ESIF) at NREL develops the concepts, tools, and technologies needed to measure, analyze, predict, protect, and control the grid. The page announces the ESIF Call for High-Impact Integrated Projects.
Using WebQuests as Idea Banks for Fostering Autonomy in Online Language Courses
ERIC Educational Resources Information Center
Sadaghian, Shirin; Marandi, S. Susan
2016-01-01
The concept of language learner autonomy has influenced Computer-Assisted Language Learning (CALL) to the extent that Schwienhorst (2012) informs us of a paradigm change in CALL design in the light of learner autonomy. CALL is not considered a tool anymore, but a learner environment available to language learners anywhere in the world. Based on a…
Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science.
Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel
2016-01-01
One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets.
Advanced Online Survival Analysis Tool for Predictive Modelling in Clinical Data Science
Montes-Torres, Julio; Subirats, José Luis; Ribelles, Nuria; Urda, Daniel; Franco, Leonardo; Alba, Emilio; Jerez, José Manuel
2016-01-01
One of the prevailing applications of machine learning is the use of predictive modelling in clinical survival analysis. In this work, we present our view of the current situation of computer tools for survival analysis, stressing the need of transferring the latest results in the field of machine learning to biomedical researchers. We propose a web based software for survival analysis called OSA (Online Survival Analysis), which has been developed as an open access and user friendly option to obtain discrete time, predictive survival models at individual level using machine learning techniques, and to perform standard survival analysis. OSA employs an Artificial Neural Network (ANN) based method to produce the predictive survival models. Additionally, the software can easily generate survival and hazard curves with multiple options to personalise the plots, obtain contingency tables from the uploaded data to perform different tests, and fit a Cox regression model from a number of predictor variables. In the Materials and Methods section, we depict the general architecture of the application and introduce the mathematical background of each of the implemented methods. The study concludes with examples of use showing the results obtained with public datasets. PMID:27532883
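As a sketch of the discrete-time, individual-level survival modelling idea behind an OSA-style tool, the example below expands synthetic follow-up data into person-period records and fits a small scikit-learn neural network as the hazard model; the data, covariates and settings are invented and this is not the OSA implementation.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n = 300
age = rng.uniform(40, 80, n)
treated = rng.integers(0, 2, n)
true_rate = 0.02 * np.exp(0.03 * (age - 60) - 0.5 * treated)   # synthetic hazard
event_time = rng.exponential(1.0 / true_rate)
follow_up = np.minimum(event_time, 10.0)                       # administrative censoring at 10 years
event = (event_time <= 10.0).astype(int)

rows, labels = [], []                                          # person-period expansion
for a, tr, fu, ev in zip(age, treated, follow_up, event):
    for year in range(int(np.ceil(fu))):
        rows.append([a, tr, year])                             # covariates + interval index
        labels.append(int(ev and year + 1 >= fu))              # 1 only in the event interval

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(np.array(rows), np.array(labels))

surv = 1.0                                                     # predicted survival, 70-year-old untreated
for year in range(10):
    hazard = clf.predict_proba([[70.0, 0, year]])[0, 1]
    surv *= 1.0 - hazard
    print(f"year {year + 1:2d}: S(t) = {surv:.2f}")
```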
BS-virus-finder: virus integration calling using bisulfite sequencing data.
Gao, Shengjie; Hu, Xuesong; Xu, Fengping; Gao, Changduo; Xiong, Kai; Zhao, Xiao; Chen, Haixiao; Zhao, Shancen; Wang, Mengyao; Fu, Dongke; Zhao, Xiaohui; Bai, Jie; Mao, Likai; Li, Bo; Wu, Song; Wang, Jian; Li, Shengbin; Yang, Huangming; Bolund, Lars; Pedersen, Christian N S
2018-01-01
DNA methylation plays a key role in the regulation of gene expression and carcinogenesis. Bisulfite sequencing studies mainly focus on calling single nucleotide polymorphisms, identifying differentially methylated regions, and finding allele-specific DNA methylation. Until now, only a few software tools have focused on virus integration using bisulfite sequencing data. We have developed a new and easy-to-use software tool, named BS-virus-finder (BSVF, RRID:SCR_015727), to detect viral integration breakpoints in whole human genomes. The tool is hosted at https://github.com/BGI-SZ/BSVF. BS-virus-finder demonstrates high sensitivity and specificity. It is useful in epigenetic studies and in revealing the relationship between viral integration and DNA methylation. BS-virus-finder is the first software tool to detect virus integration loci by using bisulfite sequencing data. © The Authors 2017. Published by Oxford University Press.
NETIMIS: Dynamic Simulation of Health Economics Outcomes Using Big Data.
Johnson, Owen A; Hall, Peter S; Hulme, Claire
2016-02-01
Many healthcare organizations are now making good use of electronic health record (EHR) systems to record clinical information about their patients and the details of their healthcare. Electronic data in EHRs are generated by people engaged in complex processes within complex environments, and their human input, albeit shaped by computer systems, is compromised by many human factors. These data are potentially valuable to health economists and outcomes researchers but are sufficiently large and complex to be considered part of the new frontier of 'big data'. This paper describes emerging methods that draw together data mining, process modelling, activity-based costing and dynamic simulation models. Our research infrastructure includes safe links to Leeds hospital's EHRs with 3 million secondary and tertiary care patients. We created a multidisciplinary team of health economists, clinical specialists, and data and computer scientists, and developed a dynamic simulation tool called NETIMIS (Network Tools for Intervention Modelling with Intelligent Simulation; http://www.netimis.com) suitable for visualization of both human-designed and data-mined processes, which can then be used for 'what-if' analysis by stakeholders interested in costing, designing and evaluating healthcare interventions. We present two examples of model development to illustrate how dynamic simulation can be informed by big data from an EHR. We found the tool provided a focal point for multidisciplinary team work, helping the team to iteratively and collaboratively 'deep dive' into big data.
Ferrari, Thomas; Lombardo, Anna; Benfenati, Emilio
2018-05-14
Several methods exist to develop QSAR models automatically. Some are based on indices of the presence of atoms, others on the most similar compounds, and others on molecular descriptors. Here we introduce QSARpy v1.0, a new QSAR modeling tool based on a different approach: dissimilarity. This tool fragments the molecules of the training set to extract fragments that can be associated with a difference in the property/activity value, called modulators. If the target molecule shares part of its structure with a molecule of the training set and the differences can be explained with one or more modulators, the property/activity value of that training set molecule is adjusted using the value associated with the modulator(s). The tool is tested here on the n-octanol/water partition coefficient (Kow, usually expressed in logarithmic units as log Kow). It is a key parameter in risk assessment since it is a measure of hydrophobicity, and its widespread use makes estimation methods very useful for reducing testing costs. Using QSARpy v1.0, we obtained a new model to predict log Kow with accurate performance (RMSE 0.43 and R2 0.94 for the external test set), comparing favorably with other programs. QSARpy is freely available on request. Copyright © 2018 Elsevier B.V. All rights reserved.
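A toy sketch of the modulator idea on invented fragments, increments and log Kow values (not QSARpy's fragmentation scheme or knowledge base): the prediction starts from a structurally close training compound and adjusts its measured value by the increment associated with the fragment that differs.

```python
# Hypothetical training compounds (fragment sets, measured log Kow) and modulators.
training = {
    "benzene": ({"C6H5", "H"}, 2.13),
    "toluene": ({"C6H5", "CH3"}, 2.73),
    "phenol":  ({"C6H5", "OH"}, 1.46),
}
modulators = {("H", "CH3"): +0.56, ("H", "Cl"): +0.71, ("H", "OH"): -0.67}

def predict_logkow(target_fragments):
    """Adjust the closest training compound by the applicable fragment modulator."""
    for name, (frags, value) in training.items():
        removed, added = frags - target_fragments, target_fragments - frags
        if len(removed) == 1 and len(added) == 1:
            key = (removed.pop(), added.pop())
            if key in modulators:
                return value + modulators[key], name, key
    return None

result = predict_logkow({"C6H5", "Cl"})            # "chlorobenzene" as benzene with H -> Cl
if result is not None:
    pred, parent, swap = result
    print(f"predicted log Kow = {pred:.2f} (from {parent}, modulator {swap})")
```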
VoiceThread: A Useful Program Evaluation Tool
ERIC Educational Resources Information Center
Mott, Rebecca
2018-01-01
With today's technology, Extension professionals have a variety of tools available for program evaluation. This article describes an innovative platform called VoiceThread that has been used in many classrooms but also is useful for conducting virtual focus group research. I explain how this tool can be used to collect qualitative participant…
Rapid Response to Decision Making for Complex Issues - How Technologies of Cooperation Can Help
2005-11-01
... creating bottom-up taxonomies, called folksonomies, using metadata tools like del.icio.us (in which users create their own tags for bookmarking Web ... tools such as RSS, tagging (and the consequent development of folksonomies), wikis, and group visualization tools all help multiply the individual ...
Software Model Checking Without Source Code
NASA Technical Reports Server (NTRS)
Chaki, Sagar; Ivers, James
2009-01-01
We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable (such as legacy and COTS software) and programs that use features (such as pointers, structures, and object-orientation) that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.
A Web Tool for Research in Nonlinear Optics
NASA Astrophysics Data System (ADS)
Prikhod'ko, Nikolay V.; Abramovsky, Viktor A.; Abramovskaya, Natalia V.; Demichev, Andrey P.; Kryukov, Alexandr P.; Polyakov, Stanislav P.
2016-02-01
This paper presents a project to develop a web platform called WebNLO for computer modeling of nonlinear optics phenomena. We discuss the general scheme of the platform and a model for interaction between the platform modules. The platform is built as a set of interacting RESTful web services (a SaaS approach). Users can interact with the platform through a web browser or a command line interface. Such a resource has no analogue in the field of nonlinear optics and is being created for the first time; it will allow researchers to access high-performance computing resources and thereby significantly reduce the cost of the research and development process.
ERIC Educational Resources Information Center
Upitis, Rena; Brook, Julia
2017-01-01
Even though there are demonstrated benefits of using online tools to support student musicians, there is a persistent challenge of providing sufficient and effective professional development for independent music teachers to use such tools successfully. This paper describes several methods for helping teachers use an online tool called iSCORE,…
Improving mapping and SNP-calling performance in multiplexed targeted next-generation sequencing
2012-01-01
Background Compared to classical genotyping, targeted next-generation sequencing (tNGS) can be custom-designed to interrogate entire genomic regions of interest, in order to detect novel as well as known variants. To bring down the per-sample cost, one approach is to pool barcoded NGS libraries before sample enrichment. Still, we lack a complete understanding of how this multiplexed tNGS approach and the varying performance of the ever-evolving analytical tools can affect the quality of variant discovery. Therefore, we evaluated the impact of different software tools and analytical approaches on the discovery of single nucleotide polymorphisms (SNPs) in multiplexed tNGS data. To generate our own test model, we combined a sequence capture method with NGS in three experimental stages of increasing complexity (E. coli genes, multiplexed E. coli, and multiplexed HapMap BRCA1/2 regions). Results We successfully enriched barcoded NGS libraries instead of genomic DNA, achieving reproducible coverage profiles (Pearson correlation coefficients of up to 0.99) across multiplexed samples, with <10% strand bias. However, the SNP calling quality was substantially affected by the choice of tools and mapping strategy. With the aim of reducing computational requirements, we compared conventional whole-genome mapping and SNP-calling with a new faster approach: target-region mapping with subsequent ‘read-backmapping’ to the whole genome to reduce the false detection rate. Consequently, we developed a combined mapping pipeline, which includes standard tools (BWA, SAMtools, etc.), and tested it on public HiSeq2000 exome data from the 1000 Genomes Project. Our pipeline saved 12 hours of run time per Hiseq2000 exome sample and detected ~5% more SNPs than the conventional whole genome approach. This suggests that more potential novel SNPs may be discovered using both approaches than with just the conventional approach. Conclusions We recommend applying our general ‘two-step’ mapping approach for more efficient SNP discovery in tNGS. Our study has also shown the benefit of computing inter-sample SNP-concordances and inspecting read alignments in order to attain more confident results. PMID:22913592
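A small example of the inter-sample SNP-concordance check recommended above, with made-up positions and genotypes standing in for records parsed from the samples' VCF files.

```python
def concordance(calls_a, calls_b):
    """Fraction of positions called in both samples with identical genotype."""
    shared = set(calls_a) & set(calls_b)
    if not shared:
        return 0.0, 0
    agree = sum(calls_a[pos] == calls_b[pos] for pos in shared)
    return agree / len(shared), len(shared)

# Made-up call sets; in practice these would be parsed from VCF output.
sample1 = {("chr17", 1000): "0/1", ("chr17", 2500): "1/1", ("chr13", 4200): "0/0"}
sample2 = {("chr17", 1000): "0/1", ("chr17", 2500): "0/1", ("chr13", 4200): "0/0"}

rate, n = concordance(sample1, sample2)
print(f"genotype concordance: {rate:.2f} over {n} shared sites")
```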
Workflow Agents vs. Expert Systems: Problem Solving Methods in Work Systems Design
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Seah, Chin
2009-01-01
During the 1980s, a community of artificial intelligence researchers became interested in formalizing problem solving methods as part of an effort called "second generation expert systems" (2nd GES). How do the motivations and results of this research relate to building tools for the workplace today? We provide an historical review of how the theory of expertise has developed, a progress report on a tool for designing and implementing model-based automation (Brahms), and a concrete example of how we apply 2nd GES concepts today in an agent-based system for space flight operations (OCAMS). Brahms incorporates an ontology for modeling work practices, what people are doing in the course of a day, characterized as "activities." OCAMS was developed using a simulation-to-implementation methodology, in which a prototype tool was embedded in a simulation of future work practices. OCAMS uses model-based methods to interactively plan its actions and keep track of the work to be done. The problem solving methods of practice are interactive, employing reasoning for and through action in the real world. Analogously, it is as if a medical expert system were charged not just with interpreting culture results, but actually interacting with a patient. Our perspective shifts from building a "problem solving" (expert) system to building an actor in the world. The reusable components in work system designs include entire "problem solvers" (e.g., a planning subsystem), interoperability frameworks, and workflow agents that use and revise models dynamically in a network of people and tools. Consequently, the research focus shifts so "problem solving methods" include ways of knowing that models do not fit the world, and ways of interacting with other agents and people to gain or verify information and (ultimately) adapt rules and procedures to resolve problematic situations.
Economic Consequence Analysis of Disasters: The ECAT Software Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Adam; Prager, Fynn; Chen, Zhenhua
This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.
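As a generic illustration of the reduced-form step, the sketch below fits an ordinary least-squares regression to runs of a toy function standing in for the CGE model; the explanatory variables (duration, severity, resilience) are illustrative assumptions, not E-CAT's actual variable set.

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs = 500
duration = rng.uniform(1, 30, n_runs)              # days of disruption (illustrative)
severity = rng.uniform(0.1, 1.0, n_runs)           # fraction of capacity lost
resilience = rng.uniform(0.0, 0.8, n_runs)         # share of losses recaptured

def cge_stand_in(d, s, r):
    # Toy synthetic-data generator playing the role of the CGE model runs.
    return 2.0 * d * s * (1.0 - r) + rng.normal(0.0, 0.5, d.shape)

gdp_loss = cge_stand_in(duration, severity, resilience)

X = np.column_stack([np.ones(n_runs), duration, severity, resilience])
coef, *_ = np.linalg.lstsq(X, gdp_loss, rcond=None)            # reduced-form OLS fit
print("reduced-form coefficients (intercept, duration, severity, resilience):",
      np.round(coef, 3))

x_new = np.array([1.0, 10.0, 0.5, 0.3])                        # rapid estimate for a new scenario
print("estimated loss:", round(float(x_new @ coef), 2))
```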
Ouyang, Qin; Chen, Quansheng; Zhao, Jiewen
2016-02-05
The approach presented herein reports the application of near infrared (NIR) spectroscopy, in contrast with a human sensory panel, as a tool for estimating Chinese rice wine quality; concretely, to achieve the prediction of the overall sensory scores assigned by the trained sensory panel. A back propagation artificial neural network (BPANN) combined with the adaptive boosting (AdaBoost) algorithm, namely BP-AdaBoost, was proposed as a novel nonlinear modeling algorithm. First, the optimal spectral intervals were selected by synergy interval partial least squares (Si-PLS). Then, a BP-AdaBoost model based on the optimal spectral intervals was established, called the Si-BP-AdaBoost model. These models were optimized by cross validation, and the performance of each final model was evaluated according to the correlation coefficient (Rp) and root mean square error of prediction (RMSEP) in the prediction set. Si-BP-AdaBoost showed excellent performance in comparison with other models. The best Si-BP-AdaBoost model was achieved with Rp = 0.9180 and RMSEP = 2.23 in the prediction set. It was concluded that NIR spectroscopy combined with Si-BP-AdaBoost is an appropriate method for the prediction of sensory quality in Chinese rice wine. Copyright © 2015 Elsevier B.V. All rights reserved.
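A simplified sketch of the BP-AdaBoost idea: back-propagation networks trained on weighted resamples of the calibration data and combined by weighted averaging, written here as a generic AdaBoost.R2-flavoured loop around scikit-learn MLPRegressors on synthetic "spectra" (Si-PLS interval selection omitted); it is not the authors' implementation, and all settings are illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 50))                                 # pseudo NIR spectra (illustrative)
y = 2.0 * X[:, 5] + X[:, 20] - 0.5 * X[:, 35] + rng.normal(0, 0.2, 120)

def fit_bp_adaboost(X, y, n_estimators=5):
    weights = np.full(len(y), 1.0 / len(y))
    models, betas = [], []
    for _ in range(n_estimators):
        idx = rng.choice(len(y), size=len(y), replace=True, p=weights)
        net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                           random_state=0).fit(X[idx], y[idx])
        loss = np.abs(net.predict(X) - y)
        loss /= loss.max() + 1e-12                             # linear loss in [0, 1]
        avg_loss = np.sum(weights * loss)
        beta = avg_loss / (1.0 - avg_loss + 1e-12)             # AdaBoost.R2-style confidence
        weights *= beta ** (1.0 - loss)                        # down-weight easy samples
        weights /= weights.sum()
        models.append(net)
        betas.append(np.log(1.0 / (beta + 1e-12)))
    return models, np.array(betas)

def ensemble_predict(models, betas, X):
    preds = np.column_stack([m.predict(X) for m in models])
    return preds @ (betas / betas.sum())                       # weighted average of the networks

models, betas = fit_bp_adaboost(X[:90], y[:90])
rmsep = np.sqrt(np.mean((ensemble_predict(models, betas, X[90:]) - y[90:]) ** 2))
print(f"RMSEP on held-out samples: {rmsep:.2f}")
```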
Integrated Modeling of Complex Optomechanical Systems
NASA Astrophysics Data System (ADS)
Andersen, Torben; Enmark, Anita
2011-09-01
Mathematical modeling and performance simulation are playing an increasing role in large, high-technology projects. There are two reasons; first, projects are now larger than they were before, and the high cost calls for detailed performance prediction before construction. Second, in particular for space-related designs, it is often difficult to test systems under realistic conditions beforehand, and mathematical modeling is then needed to verify in advance that a system will work as planned. Computers have become much more powerful, permitting calculations that were not possible before. At the same time mathematical tools have been further developed and found acceptance in the community. Particular progress has been made in the fields of structural mechanics, optics and control engineering, where new methods have gained importance over the last few decades. Also, methods for combining optical, structural and control system models into global models have found widespread use. Such combined models are usually called integrated models and were the subject of this symposium. The objective was to bring together people working in the fields of groundbased optical telescopes, ground-based radio telescopes, and space telescopes. We succeeded in doing so and had 39 interesting presentations and many fruitful discussions during coffee and lunch breaks and social arrangements. We are grateful that so many top ranked specialists found their way to Kiruna and we believe that these proceedings will prove valuable during much future work.
New Tool for Benefit-Cost Analysis in Evaluating Transportation Alternatives
DOT National Transportation Integrated Search
1997-01-01
The Intermodal Surface Transportation Efficiency Act (ISTEA) emphasizes assessment of multi-modal alternatives and demand management strategies. In 1995, the Federal Highway Administration (FHWA) developed a corridor sketch planning tool called the S...
Master Middle Ware: A Tool to Integrate Water Resources and Fish Population Dynamics Models
NASA Astrophysics Data System (ADS)
Yi, S.; Sandoval Solis, S.; Thompson, L. C.; Kilduff, D. P.
2017-12-01
Linking models that investigate separate components of ecosystem processes has the potential to unify messages regarding management decisions by evaluating potential trade-offs in a cohesive framework. This project aimed to improve the ability of riparian resource managers to forecast future water availability conditions and resultant fish habitat suitability, in order to better inform their management decisions. To accomplish this goal, we developed a middleware tool that is capable of linking and overseeing the operations of two existing models: a water resource planning tool, the Water Evaluation and Planning (WEAP) model, and a habitat-based fish population dynamics model (WEAPhish). First, we designed the Master Middle Ware (MMW) software in Visual Basic for Applications® within one Excel® file that provided a familiar framework for both data input and output. Second, MMW was used to link and jointly operate WEAP and WEAPhish, using Visual Basic for Applications (VBA) macros to implement system-level calls to run the models. To demonstrate the utility of this approach, hydrological, biological, and middleware model components were developed for the Butte Creek basin. This tributary of the Sacramento River, California, is managed for both hydropower and the persistence of a threatened population of spring-run Chinook salmon (Oncorhynchus tschawytscha). While we have demonstrated the use of MMW for a particular watershed and fish population, MMW can be customized for use with different rivers and fish populations, assuming basic data requirements are met. This model integration improves on ad hoc linkages for managing data transfer between software programs by providing a consistent, user-friendly, and familiar interface across different model implementations. Furthermore, the data-viewing capabilities of MMW facilitate the rapid interpretation of model results by hydrologists, fisheries biologists, and resource managers, in order to accelerate learning and management decision making.
PyRhO: A Multiscale Optogenetics Simulation Platform
Evans, Benjamin D.; Jarvis, Sarah; Schultz, Simon R.; Nikolic, Konstantin
2016-01-01
Optogenetics has become a key tool for understanding the function of neural circuits and controlling their behavior. An array of directly light driven opsins have been genetically isolated from several families of organisms, with a wide range of temporal and spectral properties. In order to characterize, understand and apply these opsins, we present an integrated suite of open-source, multi-scale computational tools called PyRhO. The purpose of developing PyRhO is three-fold: (i) to characterize new (and existing) opsins by automatically fitting a minimal set of experimental data to three-, four-, or six-state kinetic models, (ii) to simulate these models at the channel, neuron and network levels, and (iii) provide functional insights through model selection and virtual experiments in silico. The module is written in Python with an additional IPython/Jupyter notebook based GUI, allowing models to be fit, simulations to be run and results to be shared through simply interacting with a webpage. The seamless integration of model fitting algorithms with simulation environments (including NEURON and Brian2) for these virtual opsins will enable neuroscientists to gain a comprehensive understanding of their behavior and rapidly identify the most suitable variant for application in a particular biological system. This process may thereby guide not only experimental design and opsin choice but also alterations of the opsin genetic code in a neuro-engineering feed-back loop. In this way, we expect PyRhO will help to significantly advance optogenetics as a tool for transforming biological sciences. PMID:27148037
NASA Technical Reports Server (NTRS)
Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2004-01-01
This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors, called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components, or sensor tolerances. Uncertainties in these variables cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g. whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output in order to optimize the power available for experiments.
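A minimal Monte Carlo sketch of the kind of uncertainty propagation described here, with made-up random variables and a deliberately simplified power equation (the actual ISS electrical power system model is far more detailed):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Hypothetical random variables: solar flux, array efficiency, pointing error (degrees).
    flux = rng.normal(1361.0, 15.0, n)          # W/m^2
    efficiency = rng.normal(0.14, 0.005, n)     # conversion efficiency
    pointing = rng.normal(0.0, 2.0, n)          # degrees off-sun

    area = 2500.0  # m^2, illustrative array area
    power = flux * area * efficiency * np.cos(np.radians(pointing))

    print(f"mean power  : {power.mean() / 1e3:8.1f} kW")
    print(f"95% interval: {np.percentile(power, 2.5) / 1e3:8.1f} - {np.percentile(power, 97.5) / 1e3:8.1f} kW")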
Assar, Rodrigo; Montecino, Martín A; Maass, Alejandro; Sherman, David J
2014-07-01
In order to describe the dynamic behavior of a complex biological system, it is useful to combine models integrating processes at different levels and with temporal dependencies. Such combinations are necessary for modeling acclimatization, a phenomenon in which changes in environmental conditions can induce drastic changes in the behavior of a biological system. In this article we formalize the use of hybrid systems as a tool to model this kind of biological behavior. A modeling scheme called strong switches is proposed. It allows one to take into account both minor adjustments to the coefficients of a continuous model and, more interestingly, large-scale changes to the structure of the model. We illustrate the proposed methodology with two applications: acclimatization in wine fermentation kinetics, and acclimatization of an osteo-adipo differentiation system linking stimulus signals to bone mass. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
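A toy sketch of the "strong switch" idea under stated assumptions: a growth model whose governing equation, not just its coefficients, changes when an environmental variable crosses a threshold. The regimes, thresholds, and rates below are invented for illustration.

    import numpy as np

    def step(x, t, temperature):
        """One Euler step of a toy growth model whose structure switches with the environment."""
        if temperature(t) < 18.0:
            # Regime A: logistic growth (minor adjustments would only change r and K).
            r, K = 0.30, 10.0
            return r * x * (1.0 - x / K)
        # Regime B ("strong switch"): a structurally different model, decay plus constant input.
        return -0.10 * x + 0.50

    temperature = lambda t: 15.0 + 8.0 * np.sin(2 * np.pi * t / 24.0)  # daily cycle, illustrative
    x, dt = 1.0, 0.01
    for k in range(int(48 / dt)):  # simulate two days
        x += dt * step(x, k * dt, temperature)
    print("biomass after 48 h:", round(x, 3))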
Decades of CALL Development: A Retrospective of the Work of James Pusack and Sue Otto
ERIC Educational Resources Information Center
Otto, Sue E. K.
2010-01-01
In this article, the author describes a series of projects that James Pusack and the author engaged in together, a number of them to develop CALL authoring tools. With their shared love of technology and dedication to language teaching and learning, they embarked on a long and immensely enjoyable career in CALL during which each project evolved…
Multiagent Work Practice Simulation: Progress and Challenges
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten; Shaffe, Michael G. (Technical Monitor)
2001-01-01
Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and a computer system. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3D space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).
Multiagent Work Practice Simulation: Progress and Challenges
NASA Technical Reports Server (NTRS)
Clancey, William J.; Sierhuis, Maarten
2002-01-01
Modeling and simulating complex human-system interactions requires going beyond formal procedures and information flows to analyze how people interact with each other. Such work practices include conversations, modes of communication, informal assistance, impromptu meetings, workarounds, and so on. To make these social processes visible, we have developed a multiagent simulation tool, called Brahms, for modeling the activities of people belonging to multiple groups, situated in a physical environment (geographic regions, buildings, transport vehicles, etc.) consisting of tools, documents, and computer systems. We are finding many useful applications of Brahms for system requirements analysis, instruction, implementing software agents, and as a workbench for relating cognitive and social theories of human behavior. Many challenges remain for representing work practices, including modeling: memory over multiple days, scheduled activities combining physical objects, groups, and locations on a timeline (such as a Space Shuttle mission), habitat vehicles with trajectories (such as the Shuttle), agent movement in 3d space (e.g., inside the International Space Station), agent posture and line of sight, coupled movements (such as carrying objects), and learning (mimicry, forming habits, detecting repetition, etc.).
Financial Data Analysis by means of Coupled Continuous-Time Random Walk in Rachev-Rüschendorf Model
NASA Astrophysics Data System (ADS)
Jurlewicz, A.; Wyłomańska, A.; Żebrowski, P.
2008-09-01
We adapt the continuous-time random walk formalism to describe asset price evolution. We expand the idea proposed by Rachev and Rüschendorf, who analyzed the binomial pricing model in discrete time with randomization of the number of price changes. As a result, in the framework of the proposed model we obtain a mixture of the Gaussian and a generalized arcsine law as the limiting distribution of log-returns. Moreover, we derive a European call option price that is an extension of the Black-Scholes formula. We apply the obtained theoretical results to model actual financial data and try to show that the continuous-time random walk offers alternative tools to deal with several complex issues of financial markets.
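The following sketch simulates a plain (uncoupled) continuous-time random walk for log-prices, with exponential waiting times and Gaussian jumps; it illustrates the general CTRW mechanism rather than the specific Rachev-Rüschendorf coupling or its limiting mixture law.

    import numpy as np

    rng = np.random.default_rng(1)

    def ctrw_log_price(T=1.0, lam=250.0, sigma=0.01):
        """Sample a log-price path from a simple continuous-time random walk:
        exponential waiting times between price changes and Gaussian jump sizes."""
        t, log_p, path = 0.0, 0.0, [(0.0, 0.0)]
        while True:
            t += rng.exponential(1.0 / lam)   # waiting time to the next price change
            if t > T:
                break
            log_p += rng.normal(0.0, sigma)   # jump in the log-price
            path.append((t, log_p))
        return path

    terminal = [ctrw_log_price()[-1][1] for _ in range(5000)]
    print("std of terminal log-return:", round(float(np.std(terminal)), 4))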
Measuring infrastructure: A key step in program evaluation and planning.
Schmitt, Carol L; Glasgow, LaShawn; Lavinghouze, S Rene; Rieker, Patricia P; Fulmer, Erika; McAleer, Kelly; Rogers, Todd
2016-06-01
State tobacco prevention and control programs (TCPs) require a fully functioning infrastructure to respond effectively to the Surgeon General's call for accelerating the national reduction in tobacco use. The literature describes common elements of infrastructure; however, a lack of valid and reliable measures has made it difficult for program planners to monitor relevant infrastructure indicators and address observed deficiencies, or for evaluators to determine the association among infrastructure, program efforts, and program outcomes. The Component Model of Infrastructure (CMI) is a comprehensive, evidence-based framework that facilitates TCP program planning efforts to develop and maintain their infrastructure. Measures of CMI components were needed to evaluate the model's utility and predictive capability for assessing infrastructure. This paper describes the development of CMI measures and results of a pilot test with nine state TCP managers. Pilot test findings indicate that the tool has good face validity and is clear and easy to follow. The CMI tool yields data that can enhance public health efforts in a funding-constrained environment and provides insight into program sustainability. Ultimately, the CMI measurement tool could facilitate better evaluation and program planning across public health programs. Copyright © 2016 Elsevier Ltd. All rights reserved.
Liebling, Steven L; Palenzuela, Carlos
2017-01-01
The idea of stable, localized bundles of energy has strong appeal as a model for particles. In the 1950s, John Wheeler envisioned such bundles as smooth configurations of electromagnetic energy that he called geons, but none were found. Instead, particle-like solutions were found in the late 1960s with the addition of a scalar field, and these were given the name boson stars. Since then, boson stars find use in a wide variety of models as sources of dark matter, as black hole mimickers, in simple models of binary systems, and as a tool in finding black holes in higher dimensions with only a single Killing vector. We discuss important varieties of boson stars, their dynamic properties, and some of their uses, concentrating on recent efforts.
A modified homogeneous relaxation model for CO2 two-phase flow in vapour ejector
NASA Astrophysics Data System (ADS)
Haida, M.; Palacz, M.; Smolka, J.; Nowak, A. J.; Hafner, A.; Banasiak, K.
2016-09-01
In this study, the homogeneous relaxation model (HRM) for CO2 flow in a two-phase ejector was modified in order to increase the accuracy of the numerical simulations. The two-phase flow model was implemented in the effective computational tool called ejectorPL for fully automated and systematic computations of various ejector shapes and operating conditions. The modification of the HRM was performed by changing the relaxation time and the constants included in the relaxation time equation, based on experimental results under operating conditions typical for supermarket refrigeration systems. The modified HRM was compared with the homogeneous equilibrium model (HEM); the comparison was based on the motive nozzle and suction nozzle mass flow rates.
NASA Astrophysics Data System (ADS)
Berselli, Luigi C.; Spirito, Stefano
2018-06-01
Obtaining reliable numerical simulations of turbulent fluids is a challenging problem in computational fluid mechanics. The large eddy simulation (LES) models are efficient tools to approximate turbulent fluids, and an important step in the validation of these models is the ability to reproduce relevant properties of the flow. In this paper, we consider a fully discrete approximation of the Navier-Stokes-Voigt model by an implicit Euler algorithm (with respect to the time variable) and a Fourier-Galerkin method (in the space variables). We prove the convergence to weak solutions of the incompressible Navier-Stokes equations satisfying the natural local entropy condition, hence selecting the so-called physically relevant solutions.
The Three C's for Urban Science Education
ERIC Educational Resources Information Center
Emdin, Chris
2008-01-01
In this article, the author outlines briefly what he calls the three C's--a set of tools that can be used to improve urban science education. The author then describes ways that these tools can support students who have traditionally been marginalized. These three aligned and closely connected tools provide practical ways to engage students in…
Challenging Google, Microsoft Unveils a Search Tool for Scholarly Articles
ERIC Educational Resources Information Center
Carlson, Scott
2006-01-01
Microsoft has introduced a new search tool to help people find scholarly articles online. The service, which includes journal articles from prominent academic societies and publishers, puts Microsoft in direct competition with Google Scholar. The new free search tool, which should work on most Web browsers, is called Windows Live Academic Search…
Evaluation of Knowla: An Online Assessment and Learning Tool
ERIC Educational Resources Information Center
Thompson, Meredith Myra; Braude, Eric John
2016-01-01
The assessment of learning in large online courses requires tools that are valid, reliable, easy to administer, and can be automatically scored. We have evaluated an online assessment and learning tool called Knowledge Assembly, or Knowla. Knowla measures a student's knowledge in a particular subject by having the student assemble a set of…
ERIC Educational Resources Information Center
Weldeana, Hailu Nigus; Sbhatu, Desta Berhe
2017-01-01
Background: This article reports contributions of an assessment tool called Portfolio of Evidence (PE) in learning college geometry. Material and methods: Two classes of second-year students from one Ethiopian teacher education college, assigned into Treatment and Comparison classes, participated. The assessment tools used in the Treatment…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walker, G.
1984-09-01
Two classifications of fishing jobs are discussed: open hole and cased hole. When there is no casing in the area of the fish, it is called open hole fishing. When the fish is inside the casing, it is called cased hole fishing. The article lists various things that can become a fish (stuck equipment in the well), including broken drill pipe, drill collars, bits, bit cones, hand tools dropped in the well, sanded-up or mud-stuck tubing, stuck packers, and much more. It is suggested that on a fishing job all parties involved should cooperate with each other, and that fishing tool people obtain all the information concerning the well. That way they can select the right tools and methods to clean out the well as quickly as possible.
Verbist, Bie M P; Thys, Kim; Reumers, Joke; Wetzels, Yves; Van der Borght, Koen; Talloen, Willem; Aerssens, Jeroen; Clement, Lieven; Thas, Olivier
2015-01-01
In virology, massively parallel sequencing (MPS) opens many opportunities for studying viral quasi-species, e.g. in HIV-1- and HCV-infected patients. This is essential for understanding pathways to resistance, which can substantially improve treatment. Although MPS platforms allow in-depth characterization of sequence variation, their measurements still involve substantial technical noise. For Illumina sequencing, single base substitutions are the main error source and impede powerful assessment of low-frequency mutations. Fortunately, base calls are complemented with quality scores (Qs) that are useful for differentiating errors from real low-frequency mutations. A variant calling tool, Q-cpileup, is proposed, which exploits the Qs of nucleotides in a filtering strategy to increase specificity. The tool is embedded in an open-source pipeline, VirVarSeq, which allows variant calling starting from fastq files. Using both plasmid mixtures and clinical samples, we show that Q-cpileup is able to reduce the number of false-positive findings. The filtering strategy is adaptive and provides an optimized threshold for individual samples in each sequencing run. Additionally, linkage information is kept between single-nucleotide polymorphisms as variants are called at the codon level. This enables virologists to have an immediate biological interpretation of the reported variants with respect to their antiviral drug responses. A comparison with existing SNP caller tools reveals that calling variants at the codon level with Q-cpileup results in outstanding sensitivity while maintaining good specificity for variants with frequencies down to 0.5%. VirVarSeq is available, together with a user's guide and test data, at SourceForge: http://sourceforge.net/projects/virtools/?source=directory. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
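A simplified sketch of quality-aware, codon-level variant counting in the spirit described above; it is not the actual Q-cpileup algorithm, and the read data, quality threshold, and frequency cut-off are invented.

    from collections import Counter

    def call_codon_variants(observations, q_threshold=30, min_freq=0.005):
        """Count codon-level variants, keeping only codons whose three base-quality
        scores all pass the threshold; report variants above a frequency cut-off."""
        counts = Counter()
        kept = 0
        for codon, quals in observations:
            if min(quals) >= q_threshold:   # quality-based filtering step
                counts[codon] += 1
                kept += 1
        return {c: n / kept for c, n in counts.items() if n / kept >= min_freq}

    # Invented pileup at one codon position: mostly ATG, a real low-frequency ATA, noisy ATC.
    reads = [("ATG", (38, 37, 39))] * 980 + [("ATA", (35, 36, 34))] * 15 + [("ATC", (12, 18, 20))] * 5
    print(call_codon_variants(reads))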
Kasahara, Kota; Kinoshita, Kengo
2016-01-01
Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., the complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of MD simulations and a structure file, users can generate several images and statistics of ion conduction processes. A complex network named ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. The novel method for analysis of ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. This software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
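A minimal sketch of building an ion-binding state graph and writing it in GML (viewable in Cytoscape), assuming the networkx package; the state labels and transition counts are invented rather than taken from a real MD trajectory.

    import networkx as nx

    # Hypothetical ion-binding states (which binding sites hold an ion) and
    # transition counts tallied from an imaginary MD trajectory.
    transitions = {
        ("S1,S3", "S0,S2,S4"): 120,
        ("S0,S2,S4", "S1,S3"): 115,
        ("S1,S3", "S1,S4"): 8,
    }

    G = nx.DiGraph()
    for (src, dst), count in transitions.items():
        G.add_edge(src, dst, weight=count)

    nx.write_gml(G, "ion_binding_states.gml")  # standard GML output
    print(G.number_of_nodes(), "states,", G.number_of_edges(), "transitions")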
Modular workcells: modern methods for laboratory automation.
Felder, R A
1998-12-01
Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation', to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure success of this revolutionary new technology.
ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package
NASA Astrophysics Data System (ADS)
Jaggi, S.
1993-02-01
The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics and Space Administration (NASA). To perform system design trade-offs and analyses, and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called Analytical Tools for Thermal InfraRed Engineering (ATTIRE), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as the Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), and Noise Equivalent Temperature Difference (NETD). This paper describes the uses of the package and the physics that were used to derive the performance parameters.
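As a rough illustration of one system-level figure of merit mentioned above, the sketch below estimates NETD numerically as a noise-equivalent radiance divided by the derivative of band-integrated Planck radiance with respect to temperature; the band limits and NER value are illustrative, not ATTIRE outputs.

    import numpy as np

    H, C, KB = 6.626e-34, 2.998e8, 1.381e-23

    def planck(wl, T):
        """Spectral radiance (W m^-2 sr^-1 m^-1) at wavelength wl (m) and temperature T (K)."""
        return (2 * H * C**2 / wl**5) / (np.exp(H * C / (wl * KB * T)) - 1.0)

    def band_radiance(T, band=(8e-6, 12e-6), n=500):
        """Radiance integrated over a spectral band (here 8-12 micrometres)."""
        wl = np.linspace(*band, n)
        return np.trapz(planck(wl, T), wl)

    # Noise-equivalent temperature difference for a given noise-equivalent radiance (NER).
    T0, dT = 300.0, 0.01
    dL_dT = (band_radiance(T0 + dT) - band_radiance(T0 - dT)) / (2 * dT)
    NER = 5e-3  # W m^-2 sr^-1, illustrative system noise level
    print("NETD ~", round(NER / dL_dT * 1e3, 1), "mK")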
CNTRO: A Semantic Web Ontology for Temporal Relation Inferencing in Clinical Narratives.
Tao, Cui; Wei, Wei-Qi; Solbrig, Harold R; Savova, Guergana; Chute, Christopher G
2010-11-13
Using Semantic-Web specifications to represent temporal information in clinical narratives is an important step for temporal reasoning and answering time-oriented queries. Existing temporal models are either not compatible with the powerful reasoning tools developed for the Semantic Web, or designed only for structured clinical data and therefore are not ready to be applied on natural-language-based clinical narrative reports directly. We have developed a Semantic-Web ontology which is called Clinical Narrative Temporal Relation ontology. Using this ontology, temporal information in clinical narratives can be represented as RDF (Resource Description Framework) triples. More temporal information and relations can then be inferred by Semantic-Web based reasoning tools. Experimental results show that this ontology can represent temporal information in real clinical narratives successfully.
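A minimal sketch of representing a temporal relation as RDF triples, assuming the rdflib package; the namespace and property names below are invented placeholders and do not reproduce CNTRO's actual vocabulary.

    from rdflib import Graph, Namespace

    EX = Namespace("http://example.org/cntro-demo#")   # hypothetical namespace
    g = Graph()
    g.bind("ex", EX)

    # "Chest pain started before admission" expressed as temporal-relation triples.
    g.add((EX.chestPainOnset, EX.hasValidTime, EX.t1))
    g.add((EX.admission, EX.hasValidTime, EX.t2))
    g.add((EX.t1, EX.before, EX.t2))

    print(g.serialize(format="turtle"))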
Interactive intelligent remote operations: application to space robotics
NASA Astrophysics Data System (ADS)
Dupuis, Erick; Gillett, G. R.; Boulanger, Pierre; Edwards, Eric; Lipsett, Michael G.
1999-11-01
A set of tools addressing the problems specific to the control and monitoring of remote robotic systems from extreme distances has been developed. The tools include the capability to model and visualize the remote environment, to generate and edit complex task scripts, to execute the scripts in supervisory control mode, and to monitor and diagnose equipment from multiple remote locations. Two prototype systems were implemented for demonstration. The first demonstration, using a prototype joint design called Dexter, shows the applicability of the approach to space robotic operations in low Earth orbit. The second demonstration uses a remotely controlled excavator in an operational open-pit tar sand mine. This demonstrates that the tools developed can be used for planetary exploration operations as well as for terrestrial mining applications.
Kruser, Jacqueline M; Nabozny, Michael J; Steffens, Nicole M; Brasel, Karen J; Campbell, Toby C; Gaines, Martha E; Schwarze, Margaret L
2015-09-01
To evaluate a communication tool called "Best Case/Worst Case" (BC/WC) based on an established conceptual model of shared decision-making. Focus group study. Older adults (four focus groups) and surgeons (two focus groups) evaluated and revised the communication tool using modified questions from the Decision Aid Acceptability Scale and the Decisional Conflict Scale. Individuals aged 60 and older were recruited from senior centers (n = 37), and surgeons from academic and private practices in Wisconsin (n = 17). Qualitative content analysis was used to explore themes and concepts that focus group respondents identified. Seniors and surgeons praised the tool for the unambiguous illustration of multiple treatment options and the clarity gained from presentation of an array of treatment outcomes. Participants noted that the tool provides an opportunity for in-the-moment, preference-based deliberation about options and a platform for further discussion with other clinicians and loved ones. Older adults worried that the format of the tool was not universally accessible for people with different educational backgrounds, and surgeons had concerns that the tool was vulnerable to physicians' subjective biases. The BC/WC tool is a novel decision support intervention that may help facilitate difficult decision-making for older adults and their physicians when considering invasive, acute medical treatments such as surgery. © 2015, Copyright the Authors Journal compilation © 2015, The American Geriatrics Society.
SmaggIce 2D Version 1.8: Software Toolkit Developed for Aerodynamic Simulation Over Iced Airfoils
NASA Technical Reports Server (NTRS)
Choo, Yung K.; Vickerman, Mary B.
2005-01-01
SmaggIce 2D version 1.8 is a software toolkit developed at the NASA Glenn Research Center that consists of tools for modeling the geometry of and generating the grids for clean and iced airfoils. Plans call for the completed SmaggIce 2D version 2.0 to streamline the entire aerodynamic simulation process--the characterization and modeling of ice shapes, grid generation, and flow simulation--and to be closely coupled with the public-domain application flow solver, WIND. Grids generated using version 1.8, however, can be used by other flow solvers. SmaggIce 2D will help researchers and engineers study the effects of ice accretion on airfoil performance, which is difficult to do with existing software tools because of complex ice shapes. Using SmaggIce 2D, when fully developed, to simulate flow over an iced airfoil will help to reduce the cost of performing flight and wind-tunnel tests for certifying aircraft in natural and simulated icing conditions.
HEAVY-DUTY GREENHOUSE GAS EMISSIONS MODEL ...
Class 2b-8 vocational truck manufacturers and Class 7/8 tractor manufacturers would be subject to vehicle-based fuel economy and emission standards that would use a truck simulation model to evaluate the impact of the truck tires and/or tractor cab design on vehicle compliance with any new standards. The EPA has created a model called the “GHG Emissions Model (GEM)”, which is specifically tailored to predict truck GHG emissions. As the model is designed for the express purpose of vehicle compliance demonstration, it is less configurable than similar commercial products and its only outputs are GHG emissions and fuel consumption. This approach gives a simple and compact tool for vehicle compliance without the overhead and costs of a more sophisticated model. The model enables evaluation of both fuel consumption and CO2 emissions from heavy-duty highway vehicles through whole-vehicle operation simulation.
Adaptive LINE-P: An Adaptive Linear Energy Prediction Model for Wireless Sensor Network Nodes.
Ahmed, Faisal; Tamberg, Gert; Le Moullec, Yannick; Annus, Paul
2018-04-05
In the context of wireless sensor networks, energy prediction models are increasingly useful tools that can facilitate the power management of the wireless sensor network (WSN) nodes. However, most of the existing models suffer from the so-called fixed weighting parameter, which limits their applicability when it comes to, e.g., solar energy harvesters with varying characteristics. Thus, in this article we propose the Adaptive LINE-P (all cases) model that calculates adaptive weighting parameters based on the stored energy profiles. Furthermore, we also present a profile compression method to reduce the memory requirements. To determine the performance of our proposed model, we have used real data for the solar and wind energy profiles. The simulation results show that our model achieves 90-94% accuracy and that the compressed method reduces memory overheads by 50% as compared to state-of-the-art models.
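A small sketch of the general idea of adaptive linear energy prediction: blend the latest observation with a stored profile using a weight that adapts to how far current conditions deviate from that profile. The adaptation rule and the numbers below are illustrative, not the paper's formulas.

    import numpy as np

    def predict(history, profile, alpha):
        """One-step-ahead harvested-energy prediction: weighted blend of the latest
        observation and the stored profile value for the next time slot."""
        return alpha * history[-1] + (1.0 - alpha) * profile[len(history) % len(profile)]

    def adapt_alpha(history, profile):
        """Illustrative adaptation: trust recent observations more when they deviate
        strongly from the stored profile (e.g., an unusually cloudy day)."""
        recent = np.mean(history[-3:])
        expected = profile[(len(history) - 1) % len(profile)]
        deviation = abs(recent - expected) / (expected + 1e-9)
        return float(np.clip(0.5 + deviation, 0.5, 0.95))

    profile = np.array([0, 5, 20, 45, 60, 55, 30, 10, 0, 0, 0, 0], dtype=float)  # stored solar profile (J/slot)
    history = [0.0, 4.0, 15.0, 30.0]  # today's observations so far (cloudier than the profile)
    alpha = adapt_alpha(history, profile)
    print("alpha =", round(alpha, 2), "-> next-slot prediction:", round(predict(history, profile, alpha), 1), "J")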
Towards a Food Safety Knowledge Base Applicable in Crisis Situations and Beyond
Falenski, Alexander; Weiser, Armin A.; Thöns, Christian; Appel, Bernd; Käsbohrer, Annemarie; Filter, Matthias
2015-01-01
In case of contamination in the food chain, fast action is required in order to reduce the numbers of affected people. In such situations, being able to predict the fate of agents in foods would help risk assessors and decision makers in assessing the potential effects of a specific contamination event and thus enable them to deduce the appropriate mitigation measures. One efficient strategy supporting this is using model based simulations. However, application in crisis situations requires ready-to-use and easy-to-adapt models to be available from the so-called food safety knowledge bases. Here, we illustrate this concept and its benefits by applying the modular open source software tools PMM-Lab and FoodProcess-Lab. As a fictitious sample scenario, an intentional ricin contamination at a beef salami production facility was modelled. Predictive models describing the inactivation of ricin were reviewed, relevant models were implemented with PMM-Lab, and simulations on residual toxin amounts in the final product were performed with FoodProcess-Lab. Due to the generic and modular modelling concept implemented in these tools, they can be applied to simulate virtually any food safety contamination scenario. Apart from the application in crisis situations, the food safety knowledge base concept will also be useful in food quality and safety investigations. PMID:26247028
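A minimal sketch of the kind of predictive inactivation model that could populate such a knowledge base, using the classical first-order D/z-value formulation; the parameter values are illustrative and are not measured ricin kinetics.

    def log10_reduction(time_min, temp_c, d_ref=3.0, temp_ref=75.0, z=8.0):
        """First-order thermal inactivation with D/z-values:
        D(T) = D_ref * 10**((T_ref - T) / z); log10(N/N0) = -t / D(T).
        All parameters here are illustrative placeholders."""
        d_value = d_ref * 10.0 ** ((temp_ref - temp_c) / z)
        return -time_min / d_value

    for temp in (70.0, 75.0, 80.0):
        print(f"{temp:.0f} C, 10 min -> {log10_reduction(10.0, temp):6.2f} log10 change in residual toxin")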
A brief review on relaxor ferroelectrics and selected issues in lead-free relaxors
NASA Astrophysics Data System (ADS)
Ahn, Chang Won; Hong, Chang-Hyo; Choi, Byung-Yul; Kim, Hwang-Pill; Han, Hyoung-Su; Hwang, Younghun; Jo, Wook; Wang, Ke; Li, Jing-Feng; Lee, Jae-Shin; Kim, Ill Won
2016-06-01
Relaxor ferroelectricity is one of the most widely investigated but least understood material classes in condensed matter physics. This is largely due to the lack of experimental tools that decisively confirm the existing theoretical models. In spite of the diversity in the models, they share the core idea that the observed features in relaxors are closely related to localized chemical heterogeneity. Given this, this review attempts to survey the existing models of importance chronologically, from the diffuse phase transition model to the random-field model, and to show how the core idea has been reflected in them to better shape our insight into the nature of relaxor-related phenomena. Then, the discussion will be directed to how the models of common consensus, developed with the so-called canonical relaxors such as Pb(Mg1/3Nb2/3)O3 (PMN) and (Pb, La)(Zr, Ti)O3 (PLZT), are compatible with phenomenological explanations for the recently identified relaxors such as (Bi1/2Na1/2)TiO3 (BNT)-based lead-free ferroelectrics. This review will be finalized with a discussion on the theoretical aspects of the recently introduced 0-3 and 2-2 ferroelectric/relaxor composites as a practical tool for strain engineering.
Evolutionary neural networks for anomaly detection based on the behavior of a program.
Han, Sang-Jun; Cho, Sung-Bae
2006-06-01
The process of learning the behavior of a given program by using machine-learning techniques (based on system-call audit data) is effective to detect intrusions. Rule learning, neural networks, statistics, and hidden Markov models (HMMs) are some of the representative methods for intrusion detection. Among them, neural networks are known for good performance in learning system-call sequences. In order to apply this knowledge to real-world problems successfully, it is important to determine the structures and weights of the neural networks. However, finding the appropriate structures requires very long time periods because there are no suitable analytical solutions. In this paper, a novel intrusion-detection technique based on evolutionary neural networks (ENNs) is proposed. One advantage of using ENNs is that it takes less time to obtain superior neural networks than when using conventional approaches. This is because they discover the structures and weights of the neural networks simultaneously. Experimental results with the 1999 Defense Advanced Research Projects Agency (DARPA) Intrusion Detection Evaluation (IDEVAL) data confirm that ENNs are promising tools for intrusion detection.
Metabolic pathway analysis and kinetic studies for production of nattokinase in Bacillus subtilis.
Unrean, Pornkamol; Nguyen, Nhung H A
2013-01-01
We have constructed a reaction network model of Bacillus subtilis. The model was analyzed using a pathway analysis tool called elementary mode analysis (EMA). The analysis tool was used to study the network capabilities and the possible effects of altered culturing conditions on the production of a fibrinolytic enzyme, nattokinase (NK) by B. subtilis. Based on all existing metabolic pathways, the maximum theoretical yield for NK synthesis in B. subtilis under different substrates and oxygen availability was predicted and the optimal culturing condition for NK production was identified. To confirm model predictions, experiments were conducted by testing these culture conditions for their influence on NK activity. The optimal culturing conditions were then applied to batch fermentation, resulting in high NK activity. The EMA approach was also applied for engineering B. subtilis metabolism towards the most efficient pathway for NK synthesis by identifying target genes for deletion and overexpression that enable the cell to produce NK at the maximum theoretical yield. The consistency between experiments and model predictions proves the feasibility of EMA being used to rationally design culture conditions and genetic manipulations for the efficient production of desired products.
Capturing remote mixing due to internal tides using multi-scale modeling tool: SOMAR-LES
NASA Astrophysics Data System (ADS)
Santilli, Edward; Chalamalla, Vamsi; Scotti, Alberto; Sarkar, Sutanu
2016-11-01
Internal tides that are generated during the interaction of an oscillating barotropic tide with the bottom bathymetry dissipate only a fraction of their energy near the generation region. The rest is radiated away in the form of low- and high-mode internal tides. These internal tides dissipate energy at remote locations when they interact with the upper ocean pycnocline, continental slope, and large scale eddies. Capturing the wide range of length and time scales involved during the life cycle of internal tides is computationally very expensive. A recently developed multi-scale modeling tool called SOMAR-LES combines the adaptive grid refinement features of SOMAR with the turbulence modeling features of a Large Eddy Simulation (LES) to capture multi-scale processes at a reduced computational cost. Numerical simulations of internal tide generation at idealized bottom bathymetries are performed to demonstrate this multi-scale modeling technique. Although each of the remote mixing phenomena has been considered independently in previous studies, this work aims to capture remote mixing processes during the life cycle of an internal tide in more realistic settings, by allowing multi-level (coarse and fine) grids to co-exist and exchange information during the time stepping process.
Chodkiewicz, Michał L; Migacz, Szymon; Rudnicki, Witold; Makal, Anna; Kalinowski, Jarosław A; Moriarty, Nigel W; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Adams, Paul D; Dominiak, Paulina Maria
2018-02-01
It has been recently established that the accuracy of structural parameters from X-ray refinement of crystal structures can be improved by using a bank of aspherical pseudoatoms instead of the classical spherical model of atomic form factors. This comes, however, at the cost of increased complexity of the underlying calculations. In order to facilitate the adoption of this more advanced electron density model by the broader community of crystallographers, a new software implementation called DiSCaMB ('densities in structural chemistry and molecular biology') has been developed. It addresses the challenge of providing high performance on modern computing architectures. With parallelization options for both multi-core processors and graphics processing units (using CUDA), the library features calculation of X-ray scattering factors and their derivatives with respect to structural parameters, gives access to intermediate steps of the scattering factor calculations (thus allowing for experimentation with modifications of the underlying electron density model), and provides tools for basic structural crystallographic operations. Permissively (MIT) licensed, DiSCaMB is an open-source C++ library that can be embedded in both academic and commercial tools for X-ray structure refinement.
The Generalized Quantum Episodic Memory Model.
Trueblood, Jennifer S; Hemmer, Pernille
2017-11-01
Recent evidence suggests that experienced events are often mapped to too many episodic states, including those that are logically or experimentally incompatible with one another. For example, episodic over-distribution patterns show that the probability of accepting an item under different mutually exclusive conditions violates the disjunction rule. A related example, called subadditivity, occurs when the probability of accepting an item under mutually exclusive and exhaustive instruction conditions sums to a number >1. Both the over-distribution effect and subadditivity have been widely observed in item and source-memory paradigms. These phenomena are difficult to explain using standard memory frameworks, such as signal-detection theory. A dual-trace model called the over-distribution (OD) model (Brainerd & Reyna, 2008) can explain the episodic over-distribution effect, but not subadditivity. Our goal is to develop a model that can explain both effects. In this paper, we propose the Generalized Quantum Episodic Memory (GQEM) model, which extends the Quantum Episodic Memory (QEM) model developed by Brainerd, Wang, and Reyna (2013). We test GQEM by comparing it to the OD model using data from a novel item-memory experiment and a previously published source-memory experiment (Kellen, Singmann, & Klauer, 2014) examining the over-distribution effect. Using the best-fit parameters from the over-distribution experiments, we conclude by showing that the GQEM model can also account for subadditivity. Overall these results add to a growing body of evidence suggesting that quantum probability theory is a valuable tool in modeling recognition memory. Copyright © 2016 Cognitive Science Society, Inc.
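A tiny worked example of the subadditivity check described above, with invented acceptance probabilities:

    # Toy acceptance probabilities under three mutually exclusive and exhaustive
    # instruction conditions (values are illustrative, not from the cited experiments).
    p_accept = {"target": 0.55, "related_lure": 0.35, "unrelated_lure": 0.25}

    total = sum(p_accept.values())
    print("sum of acceptance probabilities:", total)
    print("subadditivity observed" if total > 1.0 else "no violation")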
Diagnosing ΛHDE model with statefinder hierarchy and fractional growth parameter
NASA Astrophysics Data System (ADS)
Zhou, LanJun; Wang, Shuang
2016-07-01
Recently, a new dark energy model called ΛHDE was proposed. In this model, dark energy consists of two parts: the cosmological constant Λ and holographic dark energy (HDE). Two key parameters of this model are the fractional density of the cosmological constant, Ω_Λ0, and the dimensionless HDE parameter c. Since these two parameters determine the dynamical properties of DE and the destiny of the universe, it is important to study the impacts of different values of Ω_Λ0 and c on the ΛHDE model. In this paper, we apply various DE diagnostic tools to diagnose ΛHDE models with different values of Ω_Λ0 and c; these tools include the statefinder hierarchy {S_3^(1), S_4^(1)}, the fractional growth parameter ε, and the composite null diagnostic (CND), which is a combination of {S_3^(1), S_4^(1)} and ε. We find that: (1) adopting different values of Ω_Λ0 only has quantitative impacts on the evolution of the ΛHDE model, while adopting different c has qualitative impacts; (2) compared with S_3^(1), S_4^(1) can give larger differences among the cosmic evolutions of the ΛHDE model associated with different Ω_Λ0 or different c; (3) compared with the case of using a single diagnostic, adopting a CND pair has a much stronger ability to diagnose the ΛHDE model.
A web GIS based integrated flood assessment modeling tool for coastal urban watersheds
NASA Astrophysics Data System (ADS)
Kulkarni, A. T.; Mohanty, J.; Eldho, T. I.; Rao, E. P.; Mohan, B. K.
2014-03-01
Urban flooding has become an increasingly important issue in many parts of the world. In this study, an integrated flood assessment model (IFAM) is presented for coastal urban flood simulation. A web-based GIS framework has been adopted to organize the spatial datasets for the study area considered and to run the model within this framework. The integrated flood model consists of a mass-balance-based 1-D overland flow model, a 1-D finite element based channel flow model based on the diffusion wave approximation, and a quasi-2-D raster flood inundation model based on the continuity equation. The model code is written in MATLAB and the application is integrated within a web GIS server product, Web Gram Server™ (WGS), developed at IIT Bombay using Java, JSP and jQuery technologies. Its user interface is developed using OpenLayers, and the attribute data are stored in the MySQL open source DBMS. The model is integrated within WGS and is called via JavaScript. The application has been demonstrated for two coastal urban watersheds of Navi Mumbai, India. Simulated flood extents for the extreme rainfall event of 26 July 2005 in the two urban watersheds of Navi Mumbai city are presented and discussed. The study demonstrates the effectiveness of the flood simulation tool in a web GIS environment to facilitate data access and visualization of GIS datasets and simulation results.
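A highly simplified sketch of a quasi-2-D, continuity-based raster inundation step in the spirit of the model described above (not the IFAM equations): each cell passes a fraction of its water toward lower neighbouring cells while total volume is conserved.

    import numpy as np

    def spread_step(depth, elevation, frac=0.25):
        """One explicit continuity step on a raster grid: each cell sends a fraction of
        its water toward lower neighbours; the transfer is mass-conservative."""
        head = elevation + depth
        new = depth.copy()
        rows, cols = depth.shape
        for i in range(rows):
            for j in range(cols):
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < rows and 0 <= nj < cols and head[i, j] > head[ni, nj]:
                        q = frac * min(depth[i, j], 0.5 * (head[i, j] - head[ni, nj]))
                        new[i, j] -= q
                        new[ni, nj] += q
        return new

    elevation = np.array([[2.0, 1.5, 1.0], [2.0, 1.2, 0.8], [2.0, 1.0, 0.5]])
    depth = np.zeros_like(elevation)
    depth[0, 0] = 1.0   # rainfall excess dumped in one cell
    for _ in range(20):
        depth = spread_step(depth, elevation)
    print("total volume conserved:", round(float(depth.sum()), 3))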
ERIC Educational Resources Information Center
Ambrose, Regina Maria; Palpanathan, Shanthini
2017-01-01
Computer-assisted language learning (CALL) has evolved through various stages in both technology as well as the pedagogical use of technology (Warschauer & Healey, 1998). Studies show that the CALL trend has facilitated students in their English language writing with useful tools such as computer based activities and word processing. Students…
NASA Astrophysics Data System (ADS)
Koutiva, Ifigeneia; Makropoulos, Christos
2015-04-01
The urban water system's sustainable evolution requires tools that can analyse and simulate the complete cycle, including both the physical and the cultural environment. One of the main challenges, in this regard, is the design and development of tools that are able to simulate society's water demand behaviour and the way policy measures affect it. The effects of these policy measures are a function of personal opinions that subsequently lead to the formation of people's attitudes, and these attitudes eventually form behaviours. This work presents the design of an ABM tool for addressing the social dimension of the urban water system. The created tool, called the Urban Water Agents' Behaviour (UWAB) model, was implemented using the NetLogo agent programming language. The main aim of the UWAB model is to capture the effects of policies and environmental pressures on the water conservation behaviour of urban households. The model consists of agents representing urban households that are linked to each other, creating a social network that influences the water conservation behaviour of its members. Household agents are influenced as well by policies and environmental pressures, such as drought. The UWAB model simulates behaviour resulting in the evolution of water conservation within an urban population. The final outcome of the model is the evolution of the distribution of different conservation levels (no, low, high) across the selected urban population. In addition, UWAB is implemented in combination with an existing urban water management simulation tool, the Urban Water Optioneering Tool (UWOT), in order to create a modelling platform aiming to facilitate an adaptive approach to water resources management. For the purposes of this proposed modelling platform, UWOT is used in a twofold manner: (1) to simulate domestic water demand evolution and (2) to simulate the response of the water system to the domestic water demand evolution. The main advantage of the UWAB-UWOT model integration is that it allows the investigation of the effects of different water demand management strategies on an urban population's water demand behaviour and, ultimately, the effects of these policies on the volume of domestic water demand and the water resources system. The proposed modelling platform is optimised to simulate the effects of water policies during the Athens drought period of 1988-1994. The calibrated modelling platform is then applied to evaluate scenarios of water supply, water demand, and water demand management strategies.
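A stripped-down sketch of the kind of agent-based dynamics described above (households on a social network adjusting conservation behaviour under peer, policy, and drought pressure); the network, thresholds, and weights are invented and do not reproduce the NetLogo implementation of UWAB.

    import random

    random.seed(42)
    N, LEVELS = 200, ("no", "low", "high")

    # Random social network: each household is linked to a handful of peers.
    neighbours = {i: random.sample([j for j in range(N) if j != i], 5) for i in range(N)}
    state = {i: "no" for i in range(N)}

    def step(policy_pressure, drought):
        """One synchronous update: a household raises its conservation level when enough
        neighbours already conserve or when external pressure (policy, drought) is high."""
        new = {}
        for i in range(N):
            conserving = sum(state[j] != "no" for j in neighbours[i]) / len(neighbours[i])
            pressure = 0.5 * conserving + 0.3 * policy_pressure + 0.2 * drought
            if pressure > 0.6:
                new[i] = "high"
            elif pressure > 0.3:
                new[i] = "low"
            else:
                new[i] = state[i]
        state.update(new)

    for year in range(7):   # multi-year drought with a conservation campaign starting in year 2
        step(policy_pressure=1.0 if year >= 2 else 0.2, drought=0.8)
    print({lvl: sum(s == lvl for s in state.values()) for lvl in LEVELS})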
EPA Registers Innovative Tool to Control Corn Rootworm
Ribonucleic acid interference (RNAi) based Plant Incorporated Protectant (PIP) technology is a new and innovative scientific tool utilized by U.S. growers. Learn more about RNAi technology and the 4 new products containing the RNAi based PIP called SMARTST
Concepts, tools, and strategies for effluent testing: An international survey
Whole effluent testing (also called Direct Toxicity Assessment) remains a critical long-term assessment tool for aquatic environmental protection. Use of animal alternative approaches for wastewater testing is expected to increase as more regulatory authorities routinely require ...
How Can I Deal with My Asthma?
... had trouble with it and why. Use asthma management tools. Even if you're feeling absolutely fine, don't abandon tools like daily long-term control medicines (also called "controller" or "maintenance" medicines) if they're a part of your ...
DengueTools: innovative tools and strategies for the surveillance and control of dengue
Wilder-Smith, Annelies; Renhorn, Karl-Erik; Tissera, Hasitha; Abu Bakar, Sazaly; Alphey, Luke; Kittayapong, Pattamaporn; Lindsay, Steve; Logan, James; Hatz, Christoph; Reiter, Paul; Rocklöv, Joacim; Byass, Peter; Louis, Valérie R.; Tozan, Yesim; Massad, Eduardo; Tenorio, Antonio; Lagneau, Christophe; L'Ambert, Grégory; Brooks, David; Wegerdt, Johannah; Gubler, Duane
2012-01-01
Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of ‘Comprehensive control of Dengue fever under changing climatic conditions’. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named ‘DengueTools’ to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas: Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of ‘DengueTools’. DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools. PMID:22451836
ERIC Educational Resources Information Center
Bustamante, Rebecca M.
2006-01-01
This module is designed to introduce educational leaders to an organizational assessment tool called a "culture audit." Literature on organizational cultural competence suggests that culture audits are a valuable tool for determining how well school policies, programs, and practices respond to the needs of diverse groups and prepare…
FFI: What it is and what it can do for you
Duncan C. Lutes; MaryBeth Keifer; Nathan C. Benson; John F. Caratti
2009-01-01
A new monitoring tool called FFI (FEAT/FIREMON Integrated) has been developed to assist managers with collection, storage and analysis of ecological information. The tool was developed through the complementary integration of two fire effects monitoring systems commonly used in the United States: FIREMON and the Fire Ecology Assessment Tool (FEAT). FFI provides...
Active Reading Documents (ARDs): A Tool to Facilitate Meaningful Learning through Reading
ERIC Educational Resources Information Center
Dubas, Justin M.; Toledo, Santiago A.
2015-01-01
Presented here is a practical tool called the Active Reading Document (ARD) that can give students the necessary incentive to engage with the text/readings. By designing the tool to incrementally develop student understanding of the material through reading using Marzano's Taxonomy as a framework, the ARD offers support through scaffolding as they…
HYPATIA--An Online Tool for ATLAS Event Visualization
ERIC Educational Resources Information Center
Kourkoumelis, C.; Vourakis, S.
2014-01-01
This paper describes an interactive tool for analysis of data from the ATLAS experiment taking place at the world's highest energy particle collider at CERN. The tool, called HYPATIA/applet, enables students of various levels to become acquainted with particle physics and look for discoveries in a similar way to that of real research.
Developing Multimedia Courseware for the Internet's Java versus Shockwave.
ERIC Educational Resources Information Center
Majchrzak, Tina L.
1996-01-01
Describes and compares two methods for developing multimedia courseware for use on the Internet: an authoring tool called Shockwave, and an object-oriented language called Java. Topics include vector graphics, browsers, interaction with network protocols, data security, multithreading, and computer languages versus development environments. (LRW)
Progress in understanding heavy-ion stopping
NASA Astrophysics Data System (ADS)
Sigmund, P.; Schinner, A.
2016-09-01
We report some highlights of our work with heavy-ion stopping in the energy range where Bethe stopping theory breaks down. Main tools are our binary stopping theory (PASS code), the reciprocity principle, and Paul's data base. Comparisons are made between PASS and three alternative theoretical schemes (CasP, HISTOP and SLPA). In addition to equilibrium stopping we discuss frozen-charge stopping, deviations from linear velocity dependence below the Bragg peak, application of the reciprocity principle in low-velocity stopping, modeling of equilibrium charges, and the significance of the so-called effective charge.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morozov, Dmitriy; Weber, Gunther H.
2014-03-31
Topological techniques provide robust tools for data analysis. They are used, for example, for feature extraction, for data de-noising, and for comparison of data sets. This chapter concerns contour trees, a topological descriptor that records the connectivity of the isosurfaces of scalar functions. These trees are fundamental to analysis and visualization of physical phenomena modeled by real-valued measurements. We study the parallel analysis of contour trees. After describing a particular representation of a contour tree, called local-global representation, we illustrate how different problems that rely on contour trees can be solved in parallel with minimal communication.
On Constructing, Grouping and Using Topical Ontology for Semantic Matching
NASA Astrophysics Data System (ADS)
Tang, Yan; de Baer, Peter; Zhao, Gang; Meersman, Robert
An ontology topic is used to group concepts from different contexts (or even from different domain ontologies). This paper presents a pattern-driven modeling methodology for constructing and grouping topics in an ontology (the PAD-ON methodology), which is used for matching similarities between competences in the human resource management (HRM) domain. The methodology is supported by a tool called PAD-ON. This paper demonstrates our recent achievements in the EC Prolix project. The proposed approach is applied to the training processes at British Telecom as a test bed.
GDSCalc: A Web-Based Application for Evaluating Discrete Graph Dynamical Systems
Elmeligy Abdelhamid, Sherif H.; Kuhlman, Chris J.; Marathe, Madhav V.; Mortveit, Henning S.; Ravi, S. S.
2015-01-01
Discrete dynamical systems are used to model various realistic systems in network science, from social unrest in human populations to regulation in biological networks. A common approach is to model the agents of a system as vertices of a graph, and the pairwise interactions between agents as edges. Agents are in one of a finite set of states at each discrete time step and are assigned functions that describe how their states change based on neighborhood relations. Full characterization of state transitions of one system can give insights into fundamental behaviors of other dynamical systems. In this paper, we describe a discrete graph dynamical systems (GDSs) application called GDSCalc for computing and characterizing system dynamics. It is an open access system that is used through a web interface. We provide an overview of GDS theory. This theory is the basis of the web application; i.e., an understanding of GDS provides an understanding of the software features, while abstracting away implementation details. We present a set of illustrative examples to demonstrate its use in education and research. Finally, we compare GDSCalc with other discrete dynamical system software tools. Our perspective is that no single software tool will perform all computations that may be required by all users; tools typically have particular features that are more suitable for some tasks. We situate GDSCalc within this space of software tools. PMID:26263006
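A tiny example of the kind of synchronous graph dynamical system GDSCalc characterizes: Boolean vertex states updated by 2-threshold functions over a small graph, iterated until a previously seen state recurs. The graph and update rule are arbitrary choices for illustration.

    # A small synchronous graph dynamical system with Boolean states and 2-threshold updates.
    edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]
    n = 4
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)

    def update(states):
        """Each vertex turns on (state 1) iff at least two of its neighbours were on."""
        return tuple(1 if sum(states[w] for w in adj[v]) >= 2 else 0 for v in range(n))

    state = (1, 1, 0, 0)
    trajectory = [state]
    while True:                       # iterate until we reach a fixed point or a cycle
        state = update(state)
        if state in trajectory:
            break
        trajectory.append(state)
    print(" -> ".join(str(s) for s in trajectory + [state]))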
NASA Astrophysics Data System (ADS)
Faqih, A.
2017-03-01
Providing information regarding future climate scenarios is very important in climate change studies. Climate scenarios can be used as basic information to support adaptation and mitigation studies. In order to deliver future climate scenarios over a specific region, baseline and projection data from the outputs of global climate models (GCMs) are needed. However, due to its coarse resolution, the data have to be downscaled and bias corrected in order to obtain scenario data with better spatial resolution that matches the characteristics of the observed data. Generating these downscaled data is often difficult for scientists who do not have a specific background, experience and skill in dealing with the complex data from GCM outputs. In this regard, it is necessary to develop a tool that simplifies the downscaling process in order to help scientists, especially in Indonesia, generate future climate scenario data for their climate change-related studies. In this paper, we introduce a tool called “Statistical Bias Correction for Climate Scenarios (SiBiaS)”. The tool is specially designed to facilitate the use of CMIP5 GCM data outputs and to perform their statistical bias correction relative to reference observational data. It is prepared to support capacity building in climate modeling in Indonesia as part of the Indonesia 3rd National Communication (TNC) project activities.
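The abstract does not state which statistical correction SiBiaS implements; a common choice for this task is empirical quantile mapping, sketched below under that assumption. The synthetic rainfall arrays and the number of quantiles are illustrative.

```python
# Empirical quantile-mapping bias correction sketch (an assumption: the
# abstract does not specify SiBiaS's algorithm).
import numpy as np

def quantile_map(model_hist, obs_hist, model_future, n_quantiles=100):
    """Map model values onto the observed distribution via matched quantiles."""
    q = np.linspace(0.0, 1.0, n_quantiles)
    model_q = np.quantile(model_hist, q)   # model climatology quantiles
    obs_q = np.quantile(obs_hist, q)       # observed climatology quantiles
    # Interpolate each future model value through the quantile transfer function.
    return np.interp(model_future, model_q, obs_q)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gcm_hist = rng.gamma(2.0, 4.0, size=3650)     # synthetic GCM rainfall
    obs_hist = rng.gamma(2.0, 5.0, size=3650)     # synthetic station rainfall
    gcm_future = rng.gamma(2.0, 4.5, size=3650)
    corrected = quantile_map(gcm_hist, obs_hist, gcm_future)
    print(corrected.mean(), gcm_future.mean())
```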
Real-Time Visualization of Network Behaviors for Situational Awareness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Best, Daniel M.; Bohn, Shawn J.; Love, Douglas V.
Plentiful, complex, and dynamic data make understanding the state of an enterprise network difficult. Although visualization can help analysts understand baseline behaviors in network traffic and identify off-normal events, visual analysis systems often do not scale well to operational data volumes (in the hundreds of millions to billions of transactions per day) nor to analysis of emergent trends in real-time data. We present a system that combines multiple, complementary visualization techniques coupled with in-stream analytics, behavioral modeling of network actors, and a high-throughput processing platform called MeDICi. This system provides situational understanding of real-time network activity to help analysts take proactive response steps. We have developed these techniques using requirements gathered from the government users for which the tools are being developed. By linking multiple visualization tools to a streaming analytic pipeline, and designing each tool to support a particular kind of analysis (from high-level awareness to detailed investigation), analysts can understand the behavior of a network across multiple levels of abstraction.
A new tool called DISSECT for analysing large genomic data sets using a Big Data approach
Canela-Xandri, Oriol; Law, Andy; Gray, Alan; Woolliams, John A.; Tenesa, Albert
2015-01-01
Large-scale genetic and genomic data are increasingly available, and the major bottleneck in their analysis is a lack of sufficiently scalable computational tools. To address this problem in the context of complex-trait analysis, we present DISSECT. DISSECT is new, freely available software that exploits the distributed-memory parallel computational architectures of compute clusters to perform a wide range of genomic and epidemiologic analyses, which currently can only be carried out on reduced sample sizes or under restricted conditions. We demonstrate the usefulness of our new tool by addressing the challenge of predicting phenotypes from genotype data in human populations using mixed-linear model analysis. We analyse simulated traits from 470,000 individuals genotyped for 590,004 SNPs in ∼4 h using the combined computational power of 8,400 processor cores. We find that prediction accuracies in excess of 80% of the theoretical maximum could be achieved with large sample sizes. PMID:26657010
PRANAS: A New Platform for Retinal Analysis and Simulation.
Cessac, Bruno; Kornprobst, Pierre; Kraria, Selim; Nasser, Hassan; Pamplona, Daniela; Portelli, Geoffrey; Viéville, Thierry
2017-01-01
The retina encodes visual scenes by trains of action potentials that are sent to the brain via the optic nerve. In this paper, we describe new, freely accessible end-user software that helps to better understand this coding. It is called PRANAS (https://pranas.inria.fr), standing for Platform for Retinal ANalysis And Simulation. PRANAS targets neuroscientists and modelers by providing a unique set of retina-related tools. PRANAS integrates a retina simulator that allows large-scale simulations while maintaining strong biological plausibility, together with a toolbox for the analysis of spike train population statistics. The statistical method (entropy maximization under constraints) takes into account both spatial and temporal correlations as constraints, allowing analysis of the effects of memory on statistics. PRANAS also integrates a tool for computing and representing receptive fields in 3D (time-space). All these tools are accessible through a friendly graphical user interface. The most CPU-costly of them have been implemented to run in parallel.
Fractal characteristic in the wearing of cutting tool
NASA Astrophysics Data System (ADS)
Mei, Anhua; Wang, Jinghui
1995-11-01
This paper studies cutting tool wear with fractal geometry. The wear image of the flank was collected by a machine vision system consisting of a CCD camera and a personal computer. After processing by means of edge-preserving smoothing, binarization and edge extraction, a clear boundary enclosing the worn area was obtained. The fractal dimension of the worn surface is calculated by the methods called 'Slit Island' and 'Profile'. The experiments and calculations lead to the conclusion that the worn surface is enclosed by an irregular boundary curve with a measurable fractal dimension and characteristics of self-similarity. Furthermore, the relation between the cutting velocity and the fractal dimension of the worn region is presented. This paper presents a series of methods for processing and analyzing the fractal information in flank wear, which can be applied to investigate the relation between the fractal structure and the wear state, and to establish a fractal model of cutting tool wear.
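The paper names the 'Slit Island' and 'Profile' methods; as a related, standard illustration of estimating a fractal dimension from a binarized wear image, a box-counting sketch is given below. The synthetic image and box sizes are assumptions, and box counting is not the authors' specific method.

```python
# Box-counting fractal dimension of a binary image (illustrative; the paper
# uses the "Slit Island" and "Profile" methods, not box counting).
import numpy as np

def box_count(img, size):
    """Count boxes of the given size containing at least one foreground pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, size):
        for j in range(0, w, size):
            if img[i:i + size, j:j + size].any():
                count += 1
    return count

def fractal_dimension(img, sizes=(2, 4, 8, 16, 32)):
    counts = [box_count(img, s) for s in sizes]
    # Slope of log(count) vs log(1/size) estimates the dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    wear = rng.random((256, 256)) > 0.7      # stand-in for a binarized wear image
    print("estimated dimension:", fractal_dimension(wear))
```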
Comet sample acquisition for ROSETTA lander mission
NASA Astrophysics Data System (ADS)
Marchesi, M.; Campaci, R.; Magnani, P.; Mugnuolo, R.; Nista, A.; Olivier, A.; Re, E.
2001-09-01
ROSETTA/Lander is being developed through a combined effort of European countries, coordinated by German institutes. The commitment to such a challenging probe will provide a unique opportunity for in-situ analysis of a comet nucleus. The payload for coring, sampling and investigation of comet materials is called SD2 (Sampling Drilling and Distribution). The paper presents the drill/sampler tool and the sample transfer through the modeling, design and testing phases. Expected drilling parameters are then compared with experimental data; limited torque consumption and axial thrust on the tool constrain the operation and determine the success of the tests. The qualification campaign involved the structural part and related vibration tests, the auger/bit parts and drilling tests, and the coring mechanism with related sampling tests. A mechanical check of specimen volume is also reported, with emphasis on the measurement procedure and on the mechanical unit. The drill tool and all parts of the transfer chain were tested in the hypothetical comet environment, characterized by frozen material at extremely low temperature and high vacuum (-160 °C, 10⁻³ Pa).
eFarm: A Tool for Better Observing Agricultural Land Systems
Yu, Qiangyi; Shi, Yun; Tang, Huajun; Yang, Peng; Xie, Ankun; Liu, Bin; Wu, Wenbin
2017-01-01
Currently, observations of an agricultural land system (ALS) largely depend on remotely-sensed images, focusing on its biophysical features. While social surveys capture socioeconomic features, this information was inadequately integrated with the biophysical features of an ALS, and applications are limited by the cost and effort of carrying out detailed, comparable social surveys over large spatial extents. In this paper, we introduce a smartphone-based app called eFarm: a crowdsourcing and human-sensing tool that collects geotagged ALS information at the land-parcel level, based on high-resolution remotely-sensed images. We illustrate its main functionalities, including map visualization, data management, and data sensing. Results of the trial test suggest the system works well. We believe the tool can acquire integrated human–land information that is broadly covered and timely updated, and thus presents great potential for improving the sensing, mapping, and modeling of ALS studies. PMID:28245554
a Discrete Mathematical Model to Simulate Malware Spreading
NASA Astrophysics Data System (ADS)
Del Rey, A. Martin; Sánchez, G. Rodriguez
2012-10-01
With the advent and worldwide development of the Internet, the study and control of malware spreading has become very important. In this sense, some mathematical models to simulate malware propagation have been proposed in the scientific literature, and usually they are based on differential equations exploiting the similarities with mathematical epidemiology. The great majority of these models study the behavior of a particular type of malware called computer worms; indeed, to the best of our knowledge, no model has been proposed to simulate the spreading of a computer virus (the traditional type of malware, which differs from computer worms in several aspects). In this sense, the purpose of this work is to introduce a new mathematical model, based not on continuous mathematics but on discrete tools, to analyze and study the epidemic behavior of computer viruses. Specifically, cellular automata are used to design such a model.
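A minimal sketch of the general idea, a cellular automaton with susceptible/infected/recovered cell states evolving on a grid, is given below. The Moore neighborhood, the transition probabilities, and the state set are illustrative assumptions, not the authors' specific model.

```python
# Cellular-automaton epidemic sketch on a grid (S=0, I=1, R=2).
# Neighborhood and probabilities are illustrative assumptions only.
import numpy as np

def step(grid, p_infect=0.3, p_recover=0.1, rng=None):
    rng = rng or np.random.default_rng()
    new = grid.copy()
    infected = (grid == 1)
    # Count infected Moore neighbors with periodic boundaries.
    neigh = sum(np.roll(np.roll(infected, di, 0), dj, 1)
                for di in (-1, 0, 1) for dj in (-1, 0, 1)
                if (di, dj) != (0, 0))
    catch = (grid == 0) & (rng.random(grid.shape) < 1 - (1 - p_infect) ** neigh)
    recover = infected & (rng.random(grid.shape) < p_recover)
    new[catch] = 1
    new[recover] = 2
    return new

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    g = np.zeros((50, 50), dtype=int)
    g[25, 25] = 1                      # single initially infected host
    for _ in range(30):
        g = step(g, rng=rng)
    print("infected:", int((g == 1).sum()), "recovered:", int((g == 2).sum()))
```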
Modelling clinical systemic lupus erythematosus: similarities, differences and success stories
Celhar, Teja
2017-01-01
Mouse models of SLE have been indispensable tools to study disease pathogenesis, to identify genetic susceptibility loci and targets for drug development, and for preclinical testing of novel therapeutics. Recent insights into immunological mechanisms of disease progression have boosted a revival in SLE drug development. Despite promising results in mouse studies, many novel drugs have failed to meet clinical end points. This is probably because of the complexity of the disease, which is driven by polygenic predisposition and diverse environmental factors, resulting in a heterogeneous clinical presentation. Each mouse model recapitulates limited aspects of lupus, especially in terms of the mechanism underlying disease progression. The main mouse models have been fairly successful for the evaluation of broad-acting immunosuppressants. However, the advent of targeted therapeutics calls for a selection of the most appropriate model(s) for testing and, ultimately, identification of patients who will be most likely to respond. PMID:28013204
Mixture models with entropy regularization for community detection in networks
NASA Astrophysics Data System (ADS)
Chang, Zhenhai; Yin, Xianjun; Jia, Caiyan; Wang, Xiaoyang
2018-04-01
Community detection is a key exploratory tool in network analysis and has received much attention in recent years. NMM (Newman's mixture model) is one of the best models for exploring a range of network structures including community structure, bipartite and core-periphery structures, etc. However, NMM needs to know the number of communities in advance. Therefore, in this study, we have proposed an entropy-regularized mixture model (called EMM), which is capable of inferring the number of communities and identifying the network structure contained in a network simultaneously. In the model, by minimizing the entropy of the mixing coefficients of NMM using an EM (expectation-maximization) solution, the small clusters containing little information can be discarded step by step. The empirical study on both synthetic networks and real networks has shown that the proposed model EMM is superior to the state-of-the-art methods.
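As a hedged sketch of the general idea (the exact penalty used in EMM may differ), an entropy-regularized mixture objective adds a negative-entropy term on the mixing coefficients to the usual mixture log-likelihood, so components with little support are driven toward zero weight during EM; here p(x_i | θ_k) stands generically for the component likelihood of node i's connection pattern:

```latex
\mathcal{J}(\boldsymbol{\pi},\boldsymbol{\theta})
  = \sum_{i=1}^{N} \log \sum_{k=1}^{K} \pi_k\, p(x_i \mid \theta_k)
  \;+\; \lambda \sum_{k=1}^{K} \pi_k \log \pi_k ,
  \qquad \pi_k \ge 0,\quad \sum_{k=1}^{K} \pi_k = 1 .
```

As λ grows, small π_k collapse toward zero and the surviving components give the inferred number of communities.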
The ABLe change framework: a conceptual and methodological tool for promoting systems change.
Foster-Fishman, Pennie G; Watson, Erin R
2012-06-01
This paper presents a new approach to the design and implementation of community change efforts like a System of Care. Called the ABLe Change Framework, the model provides simultaneous attention to the content and process of the work, ensuring effective implementation and the pursuit of systems change. Three key strategies are employed in this model to ensure the integration of content and process efforts and effective mobilization of broad scale systems change: Systemic Action Learning Teams, Simple Rules, and Small Wins. In this paper we describe the ABLe Change Framework and present a case study in which we successfully applied this approach to one system of care effort in Michigan.
Network Models: An Underutilized Tool in Wildlife Epidemiology?
Craft, Meggan E.; Caillaud, Damien
2011-01-01
Although the approach of contact network epidemiology has been increasing in popularity for studying transmission of infectious diseases in human populations, it has generally been an underutilized approach for investigating disease outbreaks in wildlife populations. In this paper we explore the differences between the type of data that can be collected on human and wildlife populations, provide an update on recent advances that have been made in wildlife epidemiology by using a network approach, and discuss why networks might have been underutilized and why networks could and should be used more in the future. We conclude with ideas for future directions and a call for field biologists and network modelers to engage in more cross-disciplinary collaboration. PMID:21527981
ONCHIT security in distributed environments: a proposed model for implantable devices.
Lorence, Daniel; Lee, James; Richards, Michael
2010-08-01
Recent ONCHIT mandates call for increased individual health data collection efforts as well as heightened security measures. To date most healthcare organizations have been reluctant to exchange information, citing confidentiality concerns and unshared costs incurred by specific organizations. Implantable monitoring and treatment devices are rapidly emerging as data collection interface tools in response to such mandates. Proposed here is a translational, device-independent consumer-based solution, which focuses on information controlled by specific patients, and functions within a distributed (organization neutral) environment. While the conceptual applications employed in this technology set are provided by way of illustration, they may also serve as a transformative model for emerging EMR/EHR requirements.
Simulation of amide I' band profiles of trans polyproline based on an excitonic coupling model
NASA Astrophysics Data System (ADS)
Measey, Thomas; Schweitzer-Stenner, Reinhard
2005-06-01
We measured the amide I' band profile of the IR, isotropic Raman, anisotropic Raman, and Vibrational Circular Dichroism spectrum of poly-L-proline in D₂O. The band shapes were modeled by using an algorithm that exploits the delocalized character of the excited vibrational states [R. Schweitzer-Stenner, J. Phys. Chem. B 108 (2004) 16965]. The band shapes could be quantitatively reproduced by invoking the polyproline II or 3₁-helix conformation for all peptide residues. This corroborates the notion that the combined use of the above spectroscopies is an ideal tool to discriminate different conformations associated with the so-called random coil state of peptides.
Is the Jeffreys' scale a reliable tool for Bayesian model comparison in cosmology?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nesseris, Savvas; García-Bellido, Juan, E-mail: savvas.nesseris@uam.es, E-mail: juan.garciabellido@uam.es
2013-08-01
We are entering an era where progress in cosmology is driven by data, and alternative models will have to be compared and ruled out according to some consistent criterion. The most conservative and widely used approach is Bayesian model comparison. In this paper we explicitly calculate the Bayes factors for all models that are linear with respect to their parameters. We do this in order to test the so-called Jeffreys' scale and determine analytically how accurate its predictions are in a simple case where we fully understand and can calculate everything analytically. We also discuss the case of nested models, e.g. one with M₁ and another with M₂ ⊃ M₁ parameters, and we derive analytic expressions for both the Bayes factor and the Figure of Merit, defined as the inverse area of the model parameters' confidence contours. With all this machinery and the use of an explicit example we demonstrate that the threshold nature of Jeffreys' scale is not a "one size fits all" reliable tool for model comparison and that it may lead to biased conclusions. Furthermore, we discuss the importance of choosing the right basis in the context of models that are linear with respect to their parameters and how that basis affects the parameter estimation and the derived constraints.
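For orientation, a hedged sketch of the quantities discussed (the definitions are standard; the exact threshold convention in the paper may differ): the Bayes factor between models M₁ and M₂ is the ratio of their evidences,

```latex
B_{12} = \frac{E_1}{E_2},
\qquad
E_i = \int \mathcal{L}(\mathbf{d}\mid\boldsymbol{\theta}_i, M_i)\,
        \pi(\boldsymbol{\theta}_i\mid M_i)\,\mathrm{d}\boldsymbol{\theta}_i ,
```

and one common reading of the Jeffreys scale grades |ln B₁₂| roughly as: below 1 inconclusive, 1 to 2.5 weak, 2.5 to 5 moderate, and above 5 strong evidence.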
Software architecture and design of the web services facilitating climate model diagnostic analysis
NASA Astrophysics Data System (ADS)
Pan, L.; Lee, S.; Zhang, J.; Tang, B.; Zhai, C.; Jiang, J. H.; Wang, W.; Bao, Q.; Qi, M.; Kubar, T. L.; Teixeira, J.
2015-12-01
Climate model diagnostic analysis is a computationally- and data-intensive task because it involves multiple numerical model outputs and satellite observation data that can both be high resolution. We have built an online tool that facilitates this process. The tool is called the Climate Model Diagnostic Analyzer (CMDA). It employs web service technology and provides a web-based user interface. The benefits of these choices include: (1) no installation of any software other than a browser, hence platform compatibility; (2) co-location of computation and big data on the server side, with only small results and plots downloaded to the client side, hence high data efficiency; (3) a multi-threaded implementation to achieve parallel performance on multi-core servers; and (4) cloud deployment so each user has a dedicated virtual machine. In this presentation, we will focus on the computer science aspects of this tool, namely the architectural design, the infrastructure of the web services, the implementation of the web-based user interface, the mechanism of provenance collection, the approach to virtualization, and the Amazon Cloud deployment. As an example, we will describe our methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). Another example is the use of Docker, a light-weight virtualization container, to distribute and deploy CMDA onto an Amazon EC2 instance. Our CMDA tool has been successfully used in the 2014 Summer School hosted by the JPL Center for Climate Science. Students gave positive feedback in general, and we will report their comments. An enhanced version of CMDA with several new features, some requested by the 2014 students, will be used in the 2015 Summer School soon.
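As a hedged illustration of the wrapping pattern described (not CMDA's actual code), a legacy analysis routine can be exposed as a web service with Flask; the endpoint path, parameter names, and the stand-in analysis body below are invented for the example.

```python
# Minimal Flask wrapper exposing a legacy analysis routine as a web service.
# Endpoint, parameter names, and the analysis body are illustrative assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)

def legacy_difference_plot(var1, var2):
    """Stand-in for an existing science application call."""
    return {"variables": [var1, var2], "status": "computed"}

@app.route("/svc/differencePlot", methods=["GET"])
def difference_plot():
    var1 = request.args.get("var1", "cloud_fraction_model")
    var2 = request.args.get("var2", "cloud_fraction_obs")
    return jsonify(legacy_difference_plot(var1, var2))

if __name__ == "__main__":
    # In production one would typically run the app under Gunicorn, e.g.:
    #   gunicorn -w 4 wrapper:app
    app.run(port=8080)
```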
2013-01-01
Background Out-of-hospital cardiac arrest (OHCA) is a frequent and acute medical condition that requires immediate care. We estimate survival rates from OHCA in the area of Stockholm by developing an analytical tool for evaluating Emergency Medical Services (EMS) system design changes. The study also attempts to validate the proposed model used to generate the outcome measures for the study. Methods and results This was done by combining a geographic information systems (GIS) simulation of driving times with register data on survival rates. The emergency resources comprised ambulance alone and ambulance plus fire services. The simulation model predicted a baseline survival rate of 3.9 per cent, and reducing the ambulance response time by one minute increased survival to 4.6 per cent. Adding the fire services as first responders (dual dispatch) increased survival to 6.2 per cent from the baseline level. The model predictions were validated using empirical data. Conclusion We have presented an analytical tool that can easily be generalized to other regions or countries. The model can be used to predict outcomes of cardiac arrest prior to investment in EMS design changes that affect the alarm process, e.g. (1) static changes such as trimming the emergency call handling time or (2) dynamic changes such as the location of emergency resources or which resources should carry a defibrillator. PMID:23415045
NASA Technical Reports Server (NTRS)
White, Allan L.; Palumbo, Daniel L.
1991-01-01
Semi-Markov processes have proved to be an effective and convenient tool for constructing models of systems that achieve reliability by redundancy and reconfiguration. These models are able to depict complex system architectures and to capture the dynamics of fault arrival and system recovery. A disadvantage of this approach is that the models can be extremely large, which poses both a modeling and a computational problem. Techniques are needed to reduce the model size. Because these systems are used in critical applications where failure can be expensive, there must be an analytically derived bound for the error produced by the model reduction technique. A model reduction technique called trimming is presented that can be applied to a popular class of systems. Automatic model generation programs were written to help the reliability analyst produce models of complex systems. This method, trimming, is easy to implement, and the error bound is easy to compute. Hence, the method lends itself to inclusion in an automatic model generator.
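As a simplified illustration of the kind of reliability question such models answer (a plain continuous-time Markov chain rather than a semi-Markov process, and not the paper's trimming technique), the sketch below computes the probability of reaching an absorbing failed state within a mission time via the matrix exponential; the rates and mission time are assumptions.

```python
# Simplified reliability illustration: a CTMC for a duplex system with
# reconfiguration (states: 2 good, 1 good, failed). Rates are assumed values;
# this is not the paper's semi-Markov model or its trimming technique.
import numpy as np
from scipy.linalg import expm

lam = 1e-4      # per-hour failure rate of one unit (assumed)
mu = 1e-1       # per-hour recovery/reconfiguration rate (assumed)

# Generator matrix Q: rows sum to zero; state 2 ("failed") is absorbing.
Q = np.array([
    [-2 * lam, 2 * lam, 0.0],
    [mu, -(mu + lam), lam],
    [0.0, 0.0, 0.0],
])

mission_hours = 10.0
p0 = np.array([1.0, 0.0, 0.0])            # start with both units healthy
p_t = p0 @ expm(Q * mission_hours)        # state distribution at mission end
print("probability of system failure:", p_t[2])
```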
Variant Review with the Integrative Genomics Viewer.
Robinson, James T; Thorvaldsdóttir, Helga; Wenger, Aaron M; Zehir, Ahmet; Mesirov, Jill P
2017-11-01
Manual review of aligned reads for confirmation and interpretation of variant calls is an important step in many variant calling pipelines for next-generation sequencing (NGS) data. Visual inspection can greatly increase the confidence in calls, reduce the risk of false positives, and help characterize complex events. The Integrative Genomics Viewer (IGV) was one of the first tools to provide NGS data visualization, and it currently provides a rich set of tools for inspection, validation, and interpretation of NGS datasets, as well as other types of genomic data. Here, we present a short overview of IGV's variant review features for both single-nucleotide variants and structural variants, with examples from both cancer and germline datasets. IGV is freely available at https://www.igv.org. Cancer Res; 77(21); e31-34. ©2017 American Association for Cancer Research.
Enhancement Approach of Object Constraint Language Generation
NASA Astrophysics Data System (ADS)
Salemi, Samin; Selamat, Ali
2018-01-01
OCL is the most prevalent language for documenting system constraints annotated in UML. Writing OCL specifications is not an easy task due to the complexity of the OCL syntax. Therefore, an approach to assist developers in writing OCL specifications is needed. Two such approaches exist: first, creating OCL specifications with a tool called COPACABANA; second, an MDA-based approach that helps developers write OCL specifications with another tool, called NL2OCLviaSBVR, which generates OCL specifications automatically. This study presents another MDA-based approach, called En2OCL, whose objective is twofold: (1) to improve the precision of the existing work, and (2) to present a benchmark of these approaches. The benchmark shows that the accuracies of COPACABANA, NL2OCLviaSBVR, and En2OCL are 69.23, 84.64, and 88.40, respectively.
Robust detection, isolation and accommodation for sensor failures
NASA Technical Reports Server (NTRS)
Emami-Naeini, A.; Akhter, M. M.; Rock, S. M.
1986-01-01
The objective is to extend the recent advances in robust control system design of multivariable systems to sensor failure detection, isolation, and accommodation (DIA), and to estimator design. This effort provides analysis tools to quantify the trade-off between performance robustness and DIA sensitivity, which are to be used to achieve higher levels of performance robustness for given levels of DIA sensitivity. An innovations-based DIA scheme is used. Estimators, which depend upon a model of the process and process inputs and outputs, are used to generate these innovations. Thresholds used to determine failure detection are computed based on bounds on modeling errors, noise properties, and the class of failures. The applicability of the newly developed tools is demonstrated on a multivariable aircraft turbojet engine example. A new concept called the threshold selector was developed. It represents a significant and innovative tool for the analysis and synthesis of DIA algorithms. The estimators were made robust by introduction of an internal model and by frequency shaping. The internal model provides asymptotically unbiased filter estimates. The incorporation of frequency shaping of the Linear Quadratic Gaussian cost functional modifies the estimator design to make it suitable for sensor failure DIA. The results are compared with previous studies, which used thresholds that were selected empirically. Comparison of these two techniques on a nonlinear dynamic engine simulation shows improved performance of the new method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Zi-Kui; Gleeson, Brian; Shang, Shunli
This project developed computational tools that can complement and support experimental efforts in order to enable discovery and more efficient development of Ni-base structural materials and coatings. The project goal was reached through an integrated computation-predictive and experimental-validation approach, including first-principles calculations, thermodynamic CALPHAD (CALculation of PHAse Diagram), and experimental investigations on compositions relevant to Ni-base superalloys and coatings in terms of oxide layer growth and microstructure stabilities. The developed description included composition ranges typical for coating alloys and, hence, allows for prediction of thermodynamic properties for these material systems. The calculation of phase compositions, phase fractions, and phase stabilities, which are directly related to properties such as ductility and strength, was a valuable contribution, along with the collection of computational tools that are required to meet the increasing demands for strong, ductile and environmentally-protective coatings. Specifically, a suitable thermodynamic description for the Ni-Al-Cr-Co-Si-Hf-Y system was developed for bulk alloy and coating compositions. Experiments were performed to validate and refine the thermodynamics from the CALPHAD modeling approach. Additionally, alloys produced using predictions from the current computational models were studied in terms of their oxidation performance. Finally, results obtained from experiments aided in the development of a thermodynamic modeling automation tool called ESPEI/pycalphad for more rapid discovery and development of new materials.
Eyetracking Methodology in SCMC: A Tool for Empowering Learning and Teaching
ERIC Educational Resources Information Center
Stickler, Ursula; Shi, Lijing
2017-01-01
Computer-assisted language learning, or CALL, is an interdisciplinary area of research, positioned between science and social science, computing and education, linguistics and applied linguistics. This paper argues that by appropriating methods originating in some areas of CALL-related research, for example human-computer interaction (HCI) or…
On the Edge: Intelligent CALL in the 1990s.
ERIC Educational Resources Information Center
Underwood, John
1989-01-01
Examines the possibilities of developing computer-assisted language learning (CALL) based on the best of modern technology, arguing that artificial intelligence (AI) strategies will radically improve the kinds of exercises that can be performed. Recommends combining AI technology with other tools for delivering instruction, such as simulation and…
Rethinking Transfer: Learning from CALL Teacher Education as Consequential Transition
ERIC Educational Resources Information Center
Chao, Chin-chi
2015-01-01
Behind CALL teacher education (CTE) there is an unproblematized consensus of transfer, which suggests a positivist and tool-centered view of learning gains that differs from the sociocultural focus of recent teacher education research. Drawing on Beach's (2003) conceptualization of transfer as "consequential transition," this qualitative…
Educational and Scientific Applications of Climate Model Diagnostic Analyzer
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.
2016-12-01
Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threading computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As CMDA infrastructure and technology have matured, we have developed educational and scientific applications of CMDA. Educationally, CMDA has supported the summer school of the JPL Center for Climate Sciences for three years since 2014. In the summer school, the students work on group research projects for which CMDA provides datasets and analysis tools. Each student is assigned a virtual machine with CMDA installed in Amazon Web Services. A provenance management system for CMDA was developed to keep track of students' usage of CMDA and to recommend datasets and analysis tools for their research topics. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case is described and listed in terms of a scientific goal, the datasets used, the analysis tools used, the scientific results discovered, an analysis result such as output plots and data files, and a link to the exact analysis service call with all the input arguments filled in. For example, one science use case is the evaluation of the NCAR CAM5 model with MODIS total cloud fraction. The analysis service used is the Difference Plot Service of Two Variables, and the datasets used are NCAR CAM total cloud fraction and MODIS total cloud fraction. The scientific highlight of the use case is that the CAM5 model overall does a fairly decent job of simulating total cloud cover, though it simulates too few clouds, especially near and offshore of the eastern ocean basins where low clouds are dominant.
NASA Technical Reports Server (NTRS)
Mahalingam, Sudhakar; Menart, James A.
2005-01-01
Computational modeling of the plasma in the discharge chamber of an ion engine is an important activity for enhancing the development and design of the next generation of ion engines. In this work a computational tool called XOOPIC is used to model the primary electrons, secondary electrons, and ions inside the discharge chamber. The details of this computational tool are discussed in this paper. Preliminary results from XOOPIC are presented. The results presented include particle number density distributions for the primary electrons, the secondary electrons, and the ions. In addition, the total number of a particular particle species in the discharge chamber as a function of time, electric potential maps, and magnetic field maps are presented. A primary electron number density plot from PRIMA is given in this paper so that the results of XOOPIC can be compared to it. PRIMA is a computer code that the present investigators have used in much of their previous work; it provides results that compare well with experimental results. PRIMA models only the primary electrons in the discharge chamber. Modeling ions and secondary electrons, as well as the primary electrons, will greatly increase our ability to predict different characteristics of the plasma discharge used in an ion engine.
Integrated modeling approach for optimal management of water, energy and food security nexus
NASA Astrophysics Data System (ADS)
Zhang, Xiaodong; Vesselinov, Velimir V.
2017-03-01
Water, energy and food (WEF) are inextricably interrelated. Effective planning and management of limited WEF resources to meet current and future socioeconomic demands for sustainable development is challenging. WEF production and delivery may also produce environmental impacts; as a result, greenhouse-gas emission control will affect WEF nexus management as well. Nexus management for WEF security necessitates integrated tools for predictive analysis that are capable of identifying the tradeoffs among various sectors and of generating cost-effective planning and management strategies and policies. To address these needs, we have developed an integrated model analysis framework and tool called WEFO. WEFO provides a multi-period socioeconomic model for predicting how to satisfy WEF demands based on model inputs representing production costs, socioeconomic demands, and environmental controls. WEFO is applied to quantitatively analyze the interrelationships and trade-offs among system components, including energy supply, electricity generation, water supply and demand, food production, and mitigation of environmental impacts. WEFO is demonstrated on a hypothetical nexus management problem consistent with real-world management scenarios. Model parameters are analyzed using global sensitivity analysis and their effects on total system cost are quantified. The obtained results demonstrate how these types of analyses can help decision-makers and stakeholders make cost-effective decisions for optimal WEF management.
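As a hedged toy illustration of the kind of cost-minimization such a tool performs (a single-period linear program, not WEFO's multi-period formulation), the sketch below allocates water, energy, and food production subject to coupled demands; all costs, demands, and coupling coefficients are invented.

```python
# Toy single-period water-energy-food allocation as a linear program.
# Costs, demands, and coupling coefficients are invented for illustration.
import numpy as np
from scipy.optimize import linprog

# Decision variables: x = [water_supplied, energy_generated, food_produced]
cost = np.array([0.5, 2.0, 1.0])          # unit production costs (assumed)

# Each row enforces "supply covers end-use demand plus process consumption",
# written in <= form for linprog.
A_ub = np.array([
    [-1.0, 0.3, 0.6],   # water: supply >= 100 + water used by energy and food
    [0.0, -1.0, 0.2],   # energy: supply >= 40 + energy used by food production
    [0.0, 0.0, -1.0],   # food: production >= 60
])
b_ub = np.array([-100.0, -40.0, -60.0])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print("allocation:", res.x, "total cost:", res.fun)
```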
NASA Astrophysics Data System (ADS)
Brodersen, R. W.
1984-04-01
A scaled version of the RISC II chip has been fabricated and tested, and these new chips have a cycle time that would outperform a VAX 11/780 by about a factor of two on compiled integer C programs. The architectural work on a RISC chip designed for a Smalltalk implementation has been completed. This chip, called SOAR (Smalltalk On a RISC), should run programs 4-15 times faster than the Xerox 1100 (Dolphin), a TTL minicomputer, and about as fast as the Xerox 1132 (Dorado), a $100,000 ECL minicomputer. The 1983 VLSI tools tape has been converted for use under the latest UNIX release (4.2). The Magic (formerly called Caddy) layout system will be a unified set of highly automated tools that cover all aspects of the layout process, including stretching, compaction, tiling and routing. A multiple window package and design rule checker for this system have just been completed, and compaction and stretching are partially implemented. New slope-based timing models for the Crystal timing analyzer are now fully implemented and in regular use. In an accuracy test using a dozen critical paths from the RISC II processor and cache chips, it was found that Crystal's estimates were within 5-10% of SPICE's estimates, while being about 10,000 times faster.
Managing biological networks by using text mining and computer-aided curation
NASA Astrophysics Data System (ADS)
Yu, Seok Jong; Cho, Yongseong; Lee, Min-Ho; Lim, Jongtae; Yoo, Jaesoo
2015-11-01
In order to understand a biological mechanism in a cell, a researcher must collect a huge number of protein interactions, with experimental data, from experiments and the literature. Text mining systems that extract biological interactions from papers have been used to construct biological networks for a few decades. Even though text mining of the literature is necessary to construct a biological network, few systems with a text mining tool are available for biologists who want to construct their own biological networks. We have developed a biological network construction system called BioKnowledge Viewer that can generate a biological interaction network by using a text mining tool and biological taggers. It also includes Boolean simulation software that provides a biological modeling system to simulate the model built with the text mining tool. A user can download PubMed articles and construct a biological network by using the Multi-level Knowledge Emergence Model (KMEM), MetaMap, and A Biomedical Named Entity Recognizer (ABNER) as text mining tools. To evaluate the system, we constructed an aging-related biological network consisting of 9,415 nodes (genes) by using manual curation. With network analysis, we found that several genes, including JNK, AP-1, and BCL-2, were highly related in the aging biological network. We provide a semi-automatic curation environment so that users can obtain a graph database for managing text mining results that are generated in the server system and can navigate the network with BioKnowledge Viewer, which is freely available at http://bioknowledgeviewer.kisti.re.kr.
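As a hedged sketch of the first step of such a pipeline (retrieving PubMed abstracts for downstream text mining; the KMEM, MetaMap, and ABNER stages are not reproduced here), Biopython's Entrez utilities can be used as shown below. The query string and email address are placeholders.

```python
# Fetching PubMed abstracts as raw text for downstream text mining
# (a minimal Biopython sketch, not BioKnowledge Viewer's own pipeline).
from Bio import Entrez

Entrez.email = "you@example.org"          # required by NCBI; placeholder address

def fetch_abstracts(query, retmax=5):
    """Search PubMed and return plain-text abstracts for the top hits."""
    handle = Entrez.esearch(db="pubmed", term=query, retmax=retmax)
    record = Entrez.read(handle)
    handle.close()
    ids = record["IdList"]
    if not ids:
        return ""
    handle = Entrez.efetch(db="pubmed", id=",".join(ids),
                           rettype="abstract", retmode="text")
    text = handle.read()
    handle.close()
    return text

if __name__ == "__main__":
    print(fetch_abstracts("aging[Title] AND BCL-2"))
```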
Formalization of treatment guidelines using Fuzzy Cognitive Maps and semantic web tools.
Papageorgiou, Elpiniki I; Roo, Jos De; Huszka, Csaba; Colaert, Dirk
2012-02-01
Therapy decision making and support in medicine deals with uncertainty and needs to take into account the patient's clinical parameters, the context of illness, and the medical knowledge of the physician and guidelines in order to recommend a treatment therapy. This research study is focused on the formalization of medical knowledge using a cognitive modeling process called Fuzzy Cognitive Maps (FCMs) and a semantic web approach. The FCM technique is capable of dealing with situations that include uncertain descriptions, using a procedure similar to human reasoning. Thus, it was selected for modeling and knowledge integration of clinical practice guidelines. Semantic web tools were used to implement the FCM approach. The knowledge base was constructed from the clinical guidelines in the form of if-then fuzzy rules. These fuzzy rules were transferred to the FCM modeling technique and, through the semantic web tools, the whole formalization was accomplished. The problem of urinary tract infection (UTI) in the adult community was examined with the proposed approach. Forty-seven clinical concepts and eight therapy concepts were identified for the antibiotic treatment therapy problem of UTIs. A preliminary pilot-evaluation study with 55 patient cases showed interesting findings; 91% of the antibiotic treatments proposed by the implemented approach were in full agreement with the guidelines and the physicians' opinions. The results have shown that the suggested approach formalizes medical knowledge efficiently and gives a front-end decision on antibiotic suggestions for cystitis. In conclusion, modeling medical knowledge and therapeutic guidelines using cognitive methods and semantic web tools is both reliable and useful. Copyright © 2011 Elsevier Inc. All rights reserved.
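As a hedged sketch of the standard FCM inference step (concept activations are repeatedly combined through a weight matrix and squashed by a sigmoid until they stabilize), the toy three-concept map below is invented for illustration and is not the paper's 47-concept UTI model.

```python
# Fuzzy Cognitive Map inference sketch. The 3-concept weight matrix is an
# invented toy, not the UTI guideline map from the paper.
import numpy as np

def sigmoid(x, slope=1.0):
    return 1.0 / (1.0 + np.exp(-slope * x))

def fcm_infer(weights, activation, steps=50, tol=1e-5):
    """Iterate A(t+1) = f(A(t) + A(t) @ W) until a fixed point is reached."""
    for _ in range(steps):
        updated = sigmoid(activation + activation @ weights)
        if np.max(np.abs(updated - activation)) < tol:
            return updated
        activation = updated
    return activation

if __name__ == "__main__":
    # Concepts: [dysuria, bacteriuria, antibiotic_therapy] (toy example)
    W = np.array([
        [0.0, 0.0, 0.6],    # dysuria supports starting therapy
        [0.7, 0.0, 0.8],    # bacteriuria drives symptoms and therapy
        [-0.5, -0.9, 0.0],  # therapy suppresses symptoms and bacteriuria
    ])
    A0 = np.array([0.8, 0.9, 0.0])
    print(fcm_infer(W, A0))
```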
Drewes, Rich; Zou, Quan; Goodman, Philip H
2009-01-01
Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.
mrtailor: a tool for PDB-file preparation for the generation of external restraints.
Gruene, Tim
2013-09-01
Model building starting from, for example, a molecular-replacement solution with low sequence similarity introduces model bias, which can be difficult to detect, especially at low resolution. The program mrtailor removes low-similarity regions from a template PDB file according to sequence similarity between the target sequence and the template sequence and maps the target sequence onto the PDB file. The modified PDB file can be used to generate external restraints for low-resolution refinement with reduced model bias and can be used as a starting point for model building and refinement. The program can call ProSMART [Nicholls et al. (2012), Acta Cryst. D68, 404-417] directly in order to create external restraints suitable for REFMAC5 [Murshudov et al. (2011), Acta Cryst. D67, 355-367]. Both a command-line version and a GUI exist.
Dineen, Kelly K.; DuBois, James M.
2017-01-01
Prescription opioids are an important tool for physicians in treating pain but also carry significant risks of harm when prescribed inappropriately or misused by patients or others. Recent increases in opioid-related morbidity and mortality have reignited scrutiny of prescribing practices by law enforcement, regulatory agencies, and state medical boards. At the same time, the predominant 4D model of misprescribers is outdated and insufficient; it groups physician misprescribers as dated, duped, disabled, or dishonest. The weaknesses and inaccuracies of the 4D model are explored, along with the serious consequences of its application. This article calls for development of an evidence base in this area and suggests an alternative model of misprescribers, the 3C model, which more accurately characterizes misprescribers as careless, corrupt, or compromised by impairment. PMID:27263262
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poliakoff, David; Legendre, Matt
2017-03-29
GOTCHA is a runtime API for intercepting function calls between shared libraries. It is intended to be used by HPC tools (i.e., performance analysis tools such as Open/SpeedShop, HPCToolkit, TAU, etc.). These tools can use GOTCHA to intercept interesting functions, such as MPI functions, and collect performance metrics about those functions. We intend for this to be open-source software that gets adopted by other open-source tools used at LLNL.
North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg
2007-10-11
Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.
How freight moves : estimating mileage and routes using an innovative GIS tool
DOT National Transportation Integrated Search
2007-06-01
The Bureau of Transportation Statistics (BTS) has developed an innovative software tool, called GeoMiler, that is helping researchers better estimate freight travel. GeoMiler is being used to compute mileages along likely routes for the nearly 6 mill...
Feasibility of lane closures using probe data : technical brief.
DOT National Transportation Integrated Search
2017-04-01
This study developed an on-line system analysis tool called the Work Zone Interactive Management Application - Planning (WIMAP-P), an easy-to-use and easy-to-learn tool for predicting the traffic impact caused by work zone lane closures on freewa...
ERIC Educational Resources Information Center
Wilson, Courtney R.; Trautmann, Nancy M.; MaKinster, James G.; Barker, Barbara J.
2010-01-01
A new online tool called "Science Pipes" allows students to conduct biodiversity investigations. With this free tool, students create and run analyses that would otherwise require access to unwieldy data sets and the ability to write computer code. Using these data, students can conduct guided inquiries or hypothesis-driven research to…
Electronic Assessment and Feedback Tool in Supervision of Nursing Students during Clinical Training
ERIC Educational Resources Information Center
Mettiäinen, Sari
2015-01-01
The aim of this study was to determine nursing teachers' and students' attitudes to and experiences of using an electronic assessment and feedback tool in supervision of clinical training. The tool was called eTaitava, and it was developed in Finland. During the pilot project, the software was used by 12 nursing teachers and 430 nursing students.…
ERIC Educational Resources Information Center
European Training Foundation, Turin (Italy).
This document presents a management tool kit on training needs assessment and program design for countries in transition to a market economy. Chapter 1 describes the tool's development within the framework of the project called Strengthening of Partnership between Management Training Institutions and Companies, Ukraine-Kazakhstan-Kyrgyzstan.…
Developing tools for digital radar image data evaluation
NASA Technical Reports Server (NTRS)
Domik, G.; Leberl, F.; Raggam, J.
1986-01-01
The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated by satellite radar are combined with standard image processing techniques to create a user environment for manipulating and analyzing airborne and satellite radar images. One aim is to create radar products from the original data that make the contents easier for the user to understand. The results are called secondary image products and derive from the original digital images. Another aim is to support interactive SAR image analysis. Software methods permit use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps, or color combinations of different component products. Efforts are ongoing to integrate individual tools into a combined hardware/software environment for interactive radar image analysis.
National Combustion Code: A Multidisciplinary Combustor Design System
NASA Technical Reports Server (NTRS)
Stubbs, Robert M.; Liu, Nan-Suey
1997-01-01
The Internal Fluid Mechanics Division conducts both basic research and technology, and system technology research for aerospace propulsion systems components. The research within the division, which is both computational and experimental, is aimed at improving fundamental understanding of flow physics in inlets, ducts, nozzles, turbomachinery, and combustors. This article and the following three articles highlight some of the work accomplished in 1996. A multidisciplinary combustor design system is critical for optimizing the combustor design process. Such a system should include sophisticated computer-aided design (CAD) tools for geometry creation, advanced mesh generators for creating solid model representations, a common framework for fluid flow and structural analyses, modern postprocessing tools, and parallel processing. The goal of the present effort is to develop some of the enabling technologies and to demonstrate their overall performance in an integrated system called the National Combustion Code.
Analysis of In-Route Wireless Charging for the Shuttle System at Zion National Park
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meintz, Andrew; Prohaska, Robert; Konan, Arnaud
System right-sizing is critical to implementation of wireless power transfer (WPT) for electric vehicles (EVs). This study will analyze potential WPT scenarios for the electrification of shuttle buses at Zion National Park utilizing a modelling tool developed by NREL called WPTSim. This tool uses second-by-second speed, location, and road grade data from the conventional shuttles in operation to simulate the incorporation of WPT at fine granularity. Vehicle power and state of charge are simulated over the drive cycle to evaluate potential system designs. The required battery capacity is determined based on the rated power at a variable number of charging locations. The outcome of this work is an analysis of the design tradeoffs for the electrification of the shuttle fleet with wireless charging versus conventional overnight charging.
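As a hedged sketch of the simulation idea (integrating battery state of charge second by second over a drive cycle, with charge added whenever the vehicle sits over an in-route charger), the example below uses an invented load profile, battery size, and charger power; it is not WPTSim.

```python
# Second-by-second battery state-of-charge sketch for an electrified shuttle
# with in-route wireless charging. Power draw, charger power, and battery
# size are illustrative assumptions, not WPTSim parameters.
import numpy as np

def simulate_soc(load_kw, at_charger, batt_kwh=60.0, charger_kw=50.0,
                 eta=0.9, soc0=0.9):
    """Integrate SOC over a 1 Hz drive cycle; charge whenever at a charger."""
    soc = np.empty(len(load_kw))
    energy = soc0 * batt_kwh
    for t, (p, charging) in enumerate(zip(load_kw, at_charger)):
        energy -= p / 3600.0                       # traction energy, kWh per second
        if charging:
            energy += eta * charger_kw / 3600.0    # wireless charge received
        energy = min(energy, batt_kwh)
        soc[t] = energy / batt_kwh
    return soc

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    cycle = np.clip(rng.normal(25, 15, size=3600), 0, None)   # 1 h of driving load
    stops = np.zeros(3600, dtype=bool)
    stops[600:660] = stops[1800:1860] = True                   # two 60 s charge stops
    soc = simulate_soc(cycle, stops)
    print("final SOC:", round(float(soc[-1]), 3))
```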
Grants4Targets - an innovative approach to translate ideas from basic research into novel drugs.
Lessl, Monika; Schoepe, Stefanie; Sommer, Anette; Schneider, Martin; Asadullah, Khusru
2011-04-01
Collaborations between industry and academia are steadily gaining importance. To combine expertise, Bayer Healthcare has set up a novel open innovation approach called Grants4Targets. Ideas on novel drug targets can easily be submitted to http://www.grants4targets.com. After a review process, grants are provided to perform focused experiments to further validate the proposed targets. In addition to financial support, specific know-how on target validation and drug discovery is provided. Experienced scientists are nominated as project partners and, depending on the project, tools or specific models are provided. Around 280 applications have been received and 41 projects granted. In our experience, this type of bridging fund combined with joint efforts provides a valuable tool to foster drug discovery collaborations. Copyright © 2010 Elsevier Ltd. All rights reserved.
A modeling paradigm for interdisciplinary water resources modeling: Simple Script Wrappers (SSW)
NASA Astrophysics Data System (ADS)
Steward, David R.; Bulatewicz, Tom; Aistrup, Joseph A.; Andresen, Daniel; Bernard, Eric A.; Kulcsar, Laszlo; Peterson, Jeffrey M.; Staggenborg, Scott A.; Welch, Stephen M.
2014-05-01
Holistic understanding of a water resources system requires tools capable of model integration. This team has developed an adaptation of the OpenMI (Open Modelling Interface) that allows easy interaction with the data passed between models. Capabilities have been developed to allow programs written in common languages such as matlab, python and scilab to share their data with other programs and to accept other programs' data. We call this interface the Simple Script Wrapper (SSW). An implementation of SSW is shown that integrates groundwater, economic, and agricultural models in the High Plains region of Kansas. Output from these models illustrates the interdisciplinary discovery facilitated through the use of SSW-implemented models. Reference: Bulatewicz, T., A. Allen, J.M. Peterson, S. Staggenborg, S.M. Welch, and D.R. Steward, The Simple Script Wrapper for OpenMI: Enabling interdisciplinary modeling studies, Environmental Modelling & Software, 39, 283-294, 2013. http://dx.doi.org/10.1016/j.envsoft.2012.07.006 http://code.google.com/p/simple-script-wrapper/
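As a hedged sketch of the wrapper pattern such an interface relies on (a script exposes initialize/update/finalize hooks so a controller can exchange values with other models each time step), the class and the trivial crop model below are invented for illustration and are not the SSW code itself.

```python
# Minimal OpenMI-style script wrapper pattern. Method names and the trivial
# crop model are illustrative assumptions, not the SSW implementation.
class CropScript:
    def initialize(self, config):
        self.total_yield = 0.0
        self.water_use_per_step = config.get("water_use", 2.0)

    def update(self, inputs):
        """Receive water from another model; return demand and current yield."""
        water_received = inputs.get("water_supplied", 0.0)
        self.total_yield += 0.1 * min(water_received, self.water_use_per_step)
        return {"water_demand": self.water_use_per_step, "yield": self.total_yield}

    def finalize(self):
        return {"total_yield": self.total_yield}

if __name__ == "__main__":
    crop = CropScript()
    crop.initialize({"water_use": 2.0})
    for step in range(5):                              # controller loop stand-in
        out = crop.update({"water_supplied": 1.5})     # value from another model
    print(crop.finalize())
```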
NIKE: a new clinical tool for establishing levels of indications for cataract surgery.
Lundström, Mats; Albrecht, Susanne; Håkansson, Ingemar; Lorefors, Ragnhild; Ohlsson, Sven; Polland, Werner; Schmid, Andrea; Svensson, Göran; Wendel, Eva
2006-08-01
The purpose of this study was to construct a new clinical tool for establishing levels of indications for cataract surgery, and to validate this tool. Teams from nine eye clinics reached an agreement about the need to develop a clinical tool for setting levels of indications for cataract surgery and about the items that should be included in the tool. The tool was to be called 'NIKE' (Nationell Indikationsmodell för Kataraktextraktion). The Canadian Cataract Priority Criteria Tool served as a model for the NIKE tool, which was modified for Swedish conditions. Items included in the tool were visual acuity of both eyes, patients' perceived difficulties in day-to-day life, cataract symptoms, the ability to live independently, and medical/ophthalmic reasons for surgery. The tool was validated and tested in 343 cataract surgery patients. Validity, stability and reliability were tested and the outcome of surgery was studied in relation to the indication setting. Four indication groups (IGs) were suggested. The group with the greatest indications for surgery was named group 1 and that with the lowest, group 4. Validity was proved to be good. Surgery had the greatest impact on the group with the highest indications for surgery. Test-retest reliability test and interexaminer tests of indication settings showed statistically significant intraclass correlations (intraclass correlation coefficients [ICCs] 0.526 and 0.923, respectively). A new clinical tool for indication setting in cataract surgery is presented. This tool, the NIKE, takes into account both visual acuity and the patient's perceived problems in day-to-day life because of cataract. The tool seems to be stable and reliable and neutral towards different examiners.
NASA Astrophysics Data System (ADS)
Verkaik, J.
2013-12-01
The Netherlands Hydrological Instrument (NHI) model predicts water demands in periods of drought, supporting Dutch decision makers in taking operational as well as long-term decisions with respect to the water supply. Other applications of NHI include predicting fresh-salt interaction, nutrient loadings, and agricultural change. The NHI model consists of several coupled models: a saturated groundwater model (MODFLOW), an unsaturated groundwater model (MetaSWAP), a sub-catchment surface water model (MOZART), and a surface water distribution network model (DM/SOBEK). Each of these models requires specific, usually large, input data that may be the result of sophisticated schematization workflows. Input data can also be interdependent; for example, the precipitation data is input for the unsaturated zone model (cells) as well as for the surface water models (polygons). For efficient data management, we developed several Python tools so that the modeler or stakeholder can use the model in a user-friendly manner and data are managed in a consistent, transparent and reproducible way. Two open source Python tools are presented here: the data version control module for the workflow manager VisTrails, called FileSync, and the NHI model control script that uses FileSync. VisTrails is an open-source scientific workflow and provenance management system that provides support for simulations, data exploration and visualization. Since VisTrails does not directly support version control, we developed a version control module called FileSync. With this generic module, the user can synchronize data from and to his workflow through a dialog window. The FileSync dialog calls the FileSync script, which is command-line based and performs the actual data synchronization. This script allows the user to easily create a model repository, upload and download data, create releases, and define scenarios. The data synchronization approach applied here differs from systems such as Subversion or Git, since these systems do not perform well for large (binary) model data files. For this reason, a new concept of parameterization and data splitting has been implemented. Each file, or set of files, is uniquely labeled as a parameter, and for this parameter metadata is maintained by Subversion. The metadata contains file hashes to identify data content and the location where the actual bulk data are stored, which can be reached by FTP. The NHI model control script is a command-line driven Python script for pre-processing, running, and post-processing the NHI model, and it uses one single configuration file for all computational kernels. This configuration file is an easy-to-use, keyword-driven, Windows INI-file, with separate sections for all the kernels. It also includes a FileSync data section where the user can specify version-controlled model data to be used as input. The NHI control script keeps all the data consistent during pre-processing. Furthermore, this script is able to handle the model state when the NHI model is used for ensemble forecasting.
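As a hedged sketch of the parameter-metadata idea described above (keep small, versionable metadata containing a content hash and the FTP location of the bulk file, rather than the large binary itself), the field names and JSON layout below are assumptions, not FileSync's actual format.

```python
# Sketch of storing parameter metadata (hash + bulk-data location) instead of
# the large binary file. Field names and layout are assumptions.
import hashlib
import json
import os

def file_sha256(path, chunk=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def write_parameter_metadata(param_name, data_path, ftp_url, meta_dir="meta"):
    os.makedirs(meta_dir, exist_ok=True)
    meta = {
        "parameter": param_name,
        "sha256": file_sha256(data_path),
        "size_bytes": os.path.getsize(data_path),
        "bulk_location": ftp_url,          # where the actual data lives
    }
    out = os.path.join(meta_dir, param_name + ".json")
    with open(out, "w") as f:
        json.dump(meta, f, indent=2)
    return out

if __name__ == "__main__":
    with open("precipitation.idf", "wb") as f:     # stand-in bulk model file
        f.write(b"\x00" * 1024)
    print(write_parameter_metadata("precipitation",
                                   "precipitation.idf",
                                   "ftp://example.org/nhi/precipitation.idf"))
```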
NASA Astrophysics Data System (ADS)
Wright, D. J.; O'Dea, E.; Cushing, J. B.; Cuny, J. E.; Toomey, D. R.; Hackett, K.; Tikekar, R.
2001-12-01
The East Pacific Rise (EPR) from 9-10° N is currently our best-studied section of fast-spreading mid-ocean ridge. During several decades of investigation it has been explored by the full spectrum of ridge investigators, including chemists, biologists, geologists and geophysicists. These studies, and those that are ongoing, provide a wealth of observational data, results and data-driven theoretical (often numerical) studies that have not yet been fully utilized either by research scientists or by professional educators. While the situation is improving, a large amount of data, results, and related theoretical models still exist either in an inert, non-interactive form (e.g., journal publications) or as unlinked and currently incompatible computer data or algorithms. Infrastructure is needed not just for ready access to data, but also for linkage of disparate data sets (data to data) as well as data to models in order to quantitatively evaluate hypotheses, refine numerical simulations, and explore new relations between observables. The prototype of a computational environment and toolset, called the Virtual Research Vessel (VRV), is being developed to provide scientists and educators with ready access to data, results and numerical models. While this effort is focused on the EPR 9N region, the resulting software tools and infrastructure should be helpful in establishing similar systems for other sections of the global mid-ocean ridge. Work in progress includes efforts to develop: (1) a virtual database to incorporate diverse data types with domain-specific metadata into a global schema that allows web-query across different marine geology data sets, and an analogous declarative (database available) description of tools and models; (2) the ability to move data between GIS and the above DBMS, and tools to encourage data submission to archives; (3) tools for finding and viewing archives, and translating between formats; (4) support for "computational steering" (tool composition) and model coupling (e.g., ability to run tool composition locally but access input data from the web, APIs to support coupling such as invoking programs that are running remotely, and help in writing data wrappers to publish programs); (5) support of migration paths for prototyped model coupling; and (6) export of marine geological data and data analysis to the undergraduate classroom (VRV-ET, "Educational Tool"). See the main VRV web site at http://oregonstate.edu/dept/vrv and the VRV-ET web site at: http://www.cs.uoregon.edu/research/vrv-et.
Information dissemination model for social media with constant updates
NASA Astrophysics Data System (ADS)
Zhu, Hui; Wu, Heng; Cao, Jin; Fu, Gang; Li, Hui
2018-07-01
With the development of social media tools and the pervasiveness of smart terminals, social media has become a significant source of information for many individuals. However, false information can spread rapidly, which may result in negative social impacts and serious economic losses. Thus, reducing the unfavorable effects of false information has become an urgent challenge. In this paper, a new competitive model called DMCU is proposed to describe the dissemination of information with constant updates in social media. In the model, we focus on the competitive relationship between the original false information and updated information, and then propose the priority of related information. To more effectively evaluate the effectiveness of the proposed model, data sets containing actual social media activity are utilized in experiments. Simulation results demonstrate that the DMCU model can precisely describe the process of information dissemination with constant updates, and that it can be used to forecast information dissemination trends on social media.
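To make the competitive-dissemination idea concrete, here is a hedged Python sketch of a generic two-message competitive spreading model (susceptible users, spreaders of the original false information, spreaders of the updated information). The equations, rates and initial shares are assumptions for illustration only, not the published DMCU formulation.

# Illustrative sketch only: a generic two-message competitive spreading model.
# S = susceptible users, F = spreaders of the original false information,
# U = spreaders of the updated information. Rates are illustrative assumptions.
import numpy as np
from scipy.integrate import odeint

def competitive_spread(y, t, beta_f, beta_u, mu):
    S, F, U = y
    dS = -beta_f * S * F - beta_u * S * U
    dF = beta_f * S * F - mu * F * U      # contact with updated information suppresses F
    dU = beta_u * S * U + mu * F * U
    return [dS, dF, dU]

t = np.linspace(0, 30, 300)
y0 = [0.97, 0.02, 0.01]                   # small seed of updated information already present
sol = odeint(competitive_spread, y0, t, args=(0.6, 0.8, 0.3))
print("final shares (S, F, U):", sol[-1].round(3))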
'spup' - an R package for uncertainty propagation in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2016-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for a universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes is also accommodated for. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy tool to apply and can be used in multi-disciplinary research and model-based decision support.
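The 'spup' package itself is written in R; the following is a hedged Python sketch of the same Monte Carlo propagation idea, running a toy model for Latin hypercube samples of two uncertain inputs. The model, parameter names and distributions are assumptions for illustration.

# Hedged sketch of Monte Carlo uncertainty propagation with Latin hypercube
# sampling; the toy model and input distributions are assumptions.
import numpy as np
from scipy.stats import qmc, norm

def toy_model(a, b):
    # stand-in for an environmental model called once per realization
    return a * np.exp(-b)

n = 1000
sampler = qmc.LatinHypercube(d=2, seed=42)
u = sampler.random(n)                       # uniform [0, 1) design
a = norm.ppf(u[:, 0], loc=2.0, scale=0.2)   # uncertain input a ~ N(2.0, 0.2)
b = norm.ppf(u[:, 1], loc=0.5, scale=0.1)   # uncertain input b ~ N(0.5, 0.1)

y = toy_model(a, b)                         # propagate through the model
print("mean = %.3f, sd = %.3f" % (y.mean(), y.std(ddof=1)))
print("95%% interval:", np.percentile(y, [2.5, 97.5]).round(3))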
'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling
NASA Astrophysics Data System (ADS)
Sawicka, Kasia; Heuvelink, Gerard
2017-04-01
Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for a universal applicability and being able to deal with case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes is also accommodated for. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in statistics can be used to summarize and visualize uncertainty about the measured input, model parameters and output of the uncertainty propagation. We demonstrate that the 'spup' package is an effective and easy tool to apply and can be used in multi-disciplinary research and model-based decision support.
Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.
2004-01-01
The interdisciplinary nature and increasing complexity of water- and environmental-resource problems require the use of modeling approaches that can incorporate knowledge from a broad range of scientific disciplines. The large number of distributed hydrological and ecosystem models currently available are composed of a variety of different conceptualizations of the associated processes they simulate. Assessment of the capabilities of these distributed models requires evaluation of the conceptualizations of the individual processes, and the identification of which conceptualizations are most appropriate for various combinations of criteria, such as problem objectives, data constraints, and spatial and temporal scales of application. With this knowledge, "optimal" models for specific sets of criteria can be created and applied. The U.S. Geological Survey (USGS) Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide these model development and application capabilities. MMS supports the integration of models and tools at a variety of levels of modular design. These include individual process models, tightly coupled models, loosely coupled models, and fully-integrated decision support systems. A variety of visualization and statistical tools are also provided. MMS has been coupled with the Bureau of Reclamation (BOR) object-oriented reservoir and river-system modeling framework, RiverWare, under a joint USGS-BOR program called the Watershed and River System Management Program. MMS and RiverWare are linked using a shared relational database. The resulting database-centered decision support system provides tools for evaluating and applying optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. Management issues being addressed include efficiency of water-resources management, environmental concerns such as meeting flow needs for endangered species, and optimizing operations within the constraints of multiple objectives such as power generation, irrigation, and water conservation. This decision support system approach is being developed, tested, and implemented in the Gunnison, Yakima, San Juan, Rio Grande, and Truckee River basins of the western United States. Copyright ASCE 2004.
Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)
NASA Astrophysics Data System (ADS)
Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.
2017-12-01
We have developed a system that supports the full life cycle of a data analysis process, from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system called Climate Model Diagnostic Analyzer (CMDA) is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle. Data System manages datasets used by CMDA analysis tools, Analysis System manages CMDA analysis tools which are all web services, Provenance System manages the meta data of CMDA datasets and the provenance of CMDA analysis history, and Recommendation System extracts knowledge from CMDA usage history and recommends datasets/analysis tools to users. These four subsystems are not only highly integrated but also easily expandable. New datasets can be easily added to Data System and scanned to be visible to the other subsystems. New analysis tools can be easily registered to be available in the Analysis System and Provenance System. With CMDA, a user can start a data analysis process by discovering datasets of relevance to their research topic using the Recommendation System. Next, the user can customize the discovered datasets for their scientific use (e.g. anomaly calculation, regridding, etc) with tools in the Analysis System. Next, the user can do their analysis with the tools (e.g. conditional sampling, time averaging, spatial averaging) in the Analysis System. Next, the user can reanalyze the datasets based on the previously stored analysis provenance in the Provenance System. Further, they can publish their analysis process and result to the Provenance System to share with other users. Finally, any user can reproduce the published analysis process and results. By supporting the full life cycle of climate data analysis, CMDA improves the research productivity and collaboration level of its user.
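To give a flavor of the customization and analysis steps listed above (anomaly calculation, time averaging, spatial averaging), here is a minimal numpy sketch on a toy gridded field. The array shape, climatology definition and latitude weighting are assumptions, not CMDA's API.

# Minimal numpy sketch of anomaly calculation, time averaging and
# area-weighted spatial averaging on a toy (month, lat, lon) field.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(288.0, 2.0, size=(120, 18, 36))   # 10 years of monthly data

# Anomaly calculation: subtract the 12-month climatology.
clim = data.reshape(10, 12, 18, 36).mean(axis=0)
anom = data - np.tile(clim, (10, 1, 1))

# Time averaging and area-weighted spatial averaging.
time_mean = anom.mean(axis=0)
lats = np.linspace(-85, 85, 18)
weights = np.cos(np.deg2rad(lats))[:, None]
spatial_mean = (time_mean * weights).sum() / (weights.sum() * 36)
print("global-mean anomaly of the time mean: %.4f" % spatial_mean)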
Bat detective-Deep learning tools for bat acoustic signal detection.
Mac Aodha, Oisin; Gibb, Rory; Barlow, Kate E; Browning, Ella; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R; Newson, Stuart E; Pandourski, Ivan; Parsons, Stuart; Russ, Jon; Szodoray-Paradi, Abigel; Szodoray-Paradi, Farkas; Tilova, Elena; Girolami, Mark; Brostow, Gabriel; Jones, Kate E
2018-03-01
Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio.
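As a rough illustration of the detection approach, the following is a hedged PyTorch sketch of a small convolutional classifier over spectrogram windows (call / no-call). The architecture, window size and class layout are assumptions, not the published Bat Detective network.

# Hedged sketch: small CNN scoring spectrogram windows for the presence of a call.
import torch
import torch.nn as nn

class CallDetectorCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # spectrogram windows assumed to be 64 (frequency bins) x 64 (time frames)
        self.classifier = nn.Linear(32 * 16 * 16, 2)   # call / no-call

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

model = CallDetectorCNN()
windows = torch.randn(8, 1, 64, 64)          # batch of spectrogram windows
scores = model(windows)
probs = torch.softmax(scores, dim=1)[:, 1]   # probability a window contains a call
print(probs.detach().numpy().round(3))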
Bat detective—Deep learning tools for bat acoustic signal detection
Barlow, Kate E.; Firman, Michael; Freeman, Robin; Harder, Briana; Kinsey, Libby; Mead, Gary R.; Newson, Stuart E.; Pandourski, Ivan; Russ, Jon; Szodoray-Paradi, Abigel; Tilova, Elena; Girolami, Mark; Jones, Kate E.
2018-01-01
Passive acoustic sensing has emerged as a powerful tool for quantifying anthropogenic impacts on biodiversity, especially for echolocating bat species. To better assess bat population trends there is a critical need for accurate, reliable, and open source tools that allow the detection and classification of bat calls in large collections of audio recordings. The majority of existing tools are commercial or have focused on the species classification task, neglecting the important problem of first localizing echolocation calls in audio which is particularly problematic in noisy recordings. We developed a convolutional neural network based open-source pipeline for detecting ultrasonic, full-spectrum, search-phase calls produced by echolocating bats. Our deep learning algorithms were trained on full-spectrum ultrasonic audio collected along road-transects across Europe and labelled by citizen scientists from www.batdetective.org. When compared to other existing algorithms and commercial systems, we show significantly higher detection performance of search-phase echolocation calls with our test sets. As an example application, we ran our detection pipeline on bat monitoring data collected over five years from Jersey (UK), and compared results to a widely-used commercial system. Our detection pipeline can be used for the automatic detection and monitoring of bat populations, and further facilitates their use as indicator species on a large scale. Our proposed pipeline makes only a small number of bat specific design decisions, and with appropriate training data it could be applied to detecting other species in audio. A crucial novelty of our work is showing that with careful, non-trivial, design and implementation considerations, state-of-the-art deep learning methods can be used for accurate and efficient monitoring in audio. PMID:29518076
Halvade-RNA: Parallel variant calling from transcriptomic data using MapReduce.
Decap, Dries; Reumers, Joke; Herzeel, Charlotte; Costanza, Pascal; Fostier, Jan
2017-01-01
Given the current cost-effectiveness of next-generation sequencing, the amount of DNA-seq and RNA-seq data generated is ever increasing. One of the primary objectives of NGS experiments is calling genetic variants. While highly accurate, most variant calling pipelines are not optimized to run efficiently on large data sets. However, as variant calling in genomic data has become common practice, several methods have been proposed to reduce runtime for DNA-seq analysis through the use of parallel computing. Determining the effectively expressed variants from transcriptomics (RNA-seq) data has only recently become possible, and as such does not yet benefit from efficiently parallelized workflows. We introduce Halvade-RNA, a parallel, multi-node RNA-seq variant calling pipeline based on the GATK Best Practices recommendations. Halvade-RNA makes use of the MapReduce programming model to create and manage parallel data streams on which multiple instances of existing tools such as STAR and GATK operate concurrently. Whereas the single-threaded processing of a typical RNA-seq sample requires ∼28h, Halvade-RNA reduces this runtime to ∼2h using a small cluster with two 20-core machines. Even on a single, multi-core workstation, Halvade-RNA can significantly reduce runtime compared to using multi-threading, thus providing for a more cost-effective processing of RNA-seq data. Halvade-RNA is written in Java and uses the Hadoop MapReduce 2.0 API. It supports a wide range of distributions of Hadoop, including Cloudera and Amazon EMR.
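The pipeline itself is a Java/Hadoop system wrapping STAR and GATK; the following hedged sketch only illustrates the map/reduce pattern in plain Python, splitting reads into chunks processed in parallel and merging per-chunk results in a reduce step. The "variant calling" here is a placeholder function, purely an assumption.

# Illustrative map/reduce-style sketch: chunk the reads, process chunks in
# parallel, merge the per-chunk results. The per-chunk work is a placeholder.
from multiprocessing import Pool
from collections import Counter

def map_chunk(reads):
    # placeholder for running an aligner + caller on one chunk of reads
    calls = Counter()
    for read in reads:
        if "A" * 5 in read:                # toy "variant" criterion
            calls["chr1:12345:A>T"] += 1
    return calls

def reduce_counts(partials):
    total = Counter()
    for c in partials:
        total.update(c)
    return total

if __name__ == "__main__":
    reads = ["ACGT" * 10, "AAAAACGT", "GGGCAAAAA", "ACGTACGT"] * 1000
    chunks = [reads[i::4] for i in range(4)]          # 4 chunks for 4 workers
    with Pool(4) as pool:
        partials = pool.map(map_chunk, chunks)
    print(reduce_counts(partials))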
A survey on annotation tools for the biomedical literature.
Neves, Mariana; Leser, Ulf
2014-03-01
New approaches to biomedical text mining crucially depend on the existence of comprehensive annotated corpora. Such corpora, commonly called gold standards, are important for learning patterns or models during the training phase, for evaluating and comparing the performance of algorithms and also for better understanding the information sought for by means of examples. Gold standards depend on human understanding and manual annotation of natural language text. This process is very time-consuming and expensive because it requires high intellectual effort from domain experts. Accordingly, the lack of gold standards is considered one of the main bottlenecks for developing novel text mining methods. This situation led to the development of tools that support humans in annotating texts. Such tools should be intuitive to use, should support a range of different input formats, should include visualization of annotated texts and should generate an easy-to-parse output format. Today, a range of tools which implement some of these functionalities are available. Here, we present a comprehensive survey of tools for supporting annotation of biomedical texts. Altogether, we considered almost 30 tools, 13 of which were selected for an in-depth comparison. The comparison was performed using predefined criteria and was accompanied by hands-on experiences whenever possible. Our survey shows that current tools can support many of the tasks in biomedical text annotation in a satisfying manner, but also that no tool can be considered a truly comprehensive solution.
Current Capabilities and Planned Enhancements of SUSTAIN - Paper
Efforts have been under way by the U.S. Environmental Protection Agency (EPA) since 2003 to develop a decision-support tool for placement of best management practices (BMPs) at strategic locations in urban watersheds. The tool is called the System for Urban Stormwater Treatment ...
Watershed Central: A New Gateway to Watershed Information
Many communities across the country struggle to find the right approaches, tools and data to use in their watershed plans. EPA recently posted a new Web site called "Watershed Central," a "one-stop" tool to help watershed organizations and others find key resources to protect their ...
Visualizing Qualitative Information
ERIC Educational Resources Information Center
Slone, Debra J.
2009-01-01
The abundance of qualitative data in today's society and the need to easily scrutinize, digest, and share this information calls for effective visualization and analysis tools. Yet, no existing qualitative tools have the analytic power, visual effectiveness, and universality of familiar quantitative instruments like bar charts, scatter-plots, and…
STS-57 Pilot Duffy uses TDS soldering tool in SPACEHAB-01 aboard OV-105
1993-07-01
STS057-30-021 (21 June-1 July 1993) --- Astronaut Brian Duffy, pilot, handles a soldering tool onboard the Earth-orbiting Space Shuttle Endeavour. The Soldering Experiment (SE) called for a crew member to solder on a printed circuit board containing 45 connection points, then de-solder 35 points on a similar board. The SE was part of a larger project called the Tools and Diagnostic Systems (TDS), sponsored by the Space and Life Sciences Directorate at Johnson Space Center (JSC). TDS represents a group of equipment selected from the tools and diagnostic hardware to be supported by the International Space Station program. TDS was designed to demonstrate the maintenance of experiment hardware on-orbit and to evaluate the adequacy of its design and the crew interface. Duffy and five other NASA astronauts spent almost ten days aboard the Space Shuttle Endeavour in Earth-orbit supporting the SpaceHab mission, retrieving the European Retrievable Carrier (EURECA) and conducting various experiments.
Chen, J.; Wu, Y.
2012-01-01
This paper presents a study of the integration of the Soil and Water Assessment Tool (SWAT) model and the TOPographic MODEL (TOPMODEL) features for enhancing the physical representation of hydrologic processes. In SWAT, four hydrologic processes, which are surface runoff, baseflow, groundwater re-evaporation and deep aquifer percolation, are modeled using a group of empirical equations. The empirical equations usually constrain the simulation capability of relevant processes. To replace these equations and to model the influences of topography and water table variation on streamflow generation, the TOPMODEL features are integrated into SWAT, and a new model, the so-called SWAT-TOP, is developed. In the new model, the process of deep aquifer percolation is removed, the concept of groundwater re-evaporation is refined, and the processes of surface runoff and baseflow are remodeled. Consequently, three parameters in SWAT are discarded, and two new parameters to reflect the TOPMODEL features are introduced. SWAT-TOP and SWAT are applied to the East River basin in South China, and the results reveal that, compared with SWAT, the new model can provide a more reasonable simulation of the hydrologic processes of surface runoff, groundwater re-evaporation, and baseflow. This study demonstrates that an established hydrologic model can be further improved by integrating the features of another model, which is a possible way to enhance our understanding of the workings of catchments.
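Two classic TOPMODEL ingredients referred to above are the topographic wetness index ln(a / tan(beta)) and the exponential baseflow relation Qb = Q0 * exp(-S/m). The Python sketch below illustrates both; the parameter values are illustrative only and are not taken from SWAT-TOP.

# Hedged sketch of the TOPMODEL topographic index and exponential baseflow.
import numpy as np

def topographic_index(upslope_area, slope_rad):
    """ln(a / tan(beta)) per cell, with a = specific upslope contributing area."""
    return np.log(upslope_area / np.tan(slope_rad))

def baseflow(storage_deficit, q0=5.0, m=0.03):
    """Exponential store: baseflow decreases as the mean storage deficit grows."""
    return q0 * np.exp(-storage_deficit / m)

area = np.array([50.0, 500.0, 5000.0])          # m2 per unit contour length
slope = np.deg2rad(np.array([15.0, 5.0, 1.0]))
print("wetness index:", topographic_index(area, slope).round(2))
print("baseflow at deficits 0.00/0.05/0.10 m:",
      [round(baseflow(s), 3) for s in (0.0, 0.05, 0.1)])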
ERIC Educational Resources Information Center
Wood, Marianne
2007-01-01
This article presents a lesson called Memory Palaces. A memory palace is a memory tool used to remember information, usually as visual images, in a sequence that is logical to the person remembering it. In his book, "In the Palaces of Memory", George Johnson calls them "...structure(s) for arranging knowledge. Lots of connections to language arts,…
ERIC Educational Resources Information Center
Marshall, Harriet
2011-01-01
This paper exposes the tensions between different agendas and calls for what is loosely called "global citizenship education" by developing a set of sociological conceptual tools useful for engaging with associated educational forms and ideals. It presents the instrumentalist and normative agendas at play within global citizenship education…
A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J
Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, a lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute a vulnerability score, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: data layer (that contains spatial data, socio-economic and environmental data, and analytic data), middle layer (that handles data processing, model management, and GIS operation), and application layer (that provides climate impacts forecast, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
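The grid-cell scoring idea can be sketched in a few lines of Python: normalize a handful of per-cell metrics and combine them into a weighted vulnerability score. The metric names and weights below are assumptions, not Urban-CAT's actual variables.

# Minimal sketch of per-cell vulnerability scoring on a toy grid.
import numpy as np

rng = np.random.default_rng(1)
n_cells = 10_000
metrics = {
    "impervious_fraction": rng.uniform(0, 1, n_cells),
    "elevation_m":         rng.uniform(150, 300, n_cells),
    "population_density":  rng.lognormal(5, 1, n_cells),
}
weights = {"impervious_fraction": 0.5, "elevation_m": -0.2, "population_density": 0.3}

def min_max(x):
    return (x - x.min()) / (x.max() - x.min())

score = sum(w * min_max(metrics[name]) for name, w in weights.items())
score = min_max(score)                      # rescale to 0..1 for mapping
print("cells in the top decile of vulnerability:", int((score > 0.9).sum()))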
Cancer systems biology: signal processing for cancer research
Yli-Harja, Olli; Ylipää, Antti; Nykter, Matti; Zhang, Wei
2011-01-01
In this editorial we introduce the research paradigms of signal processing in the era of systems biology. Signal processing is a field of science traditionally focused on modeling electronic and communications systems, but recently it has turned to biological applications with astounding results. The essence of signal processing is to describe the natural world by mathematical models and then, based on these models, develop efficient computational tools for solving engineering problems. Here, we underline, with examples, the endless possibilities which arise when the battle-hardened tools of engineering are applied to solve the problems that have tormented cancer researchers. Based on this approach, a new field has emerged, called cancer systems biology. Despite its short history, cancer systems biology has already produced several success stories tackling previously impracticable problems. Perhaps most importantly, it has been accepted as an integral part of the major endeavors of cancer research, such as analyzing the genomic and epigenomic data produced by The Cancer Genome Atlas (TCGA) project. Finally, we show that signal processing and cancer research, two fields that are seemingly distant from each other, have merged into a field that is indeed more than the sum of its parts. PMID:21439242
Palomar, Esther; Chen, Xiaohong; Liu, Zhiming; Maharjan, Sabita; Bowen, Jonathan
2016-10-28
Smart city systems embrace major challenges associated with climate change, energy efficiency, mobility and future services by embedding the virtual space into a complex cyber-physical system. Those systems are constantly evolving and scaling up, involving a wide range of integration among users, devices, utilities, public services and also policies. Modelling such complex dynamic systems' architectures has always been essential for the development and application of techniques/tools to support design and deployment of integration of new components, as well as for the analysis, verification, simulation and testing to ensure trustworthiness. This article reports on the definition and implementation of a scalable component-based architecture that supports a cooperative energy demand response (DR) system coordinating energy usage between neighbouring households. The proposed architecture, called refinement of Cyber-Physical Component Systems (rCPCS), which extends the refinement calculus for component and object system (rCOS) modelling method, is implemented using Eclipse Extensible Coordination Tools (ECT), i.e., Reo coordination language. With rCPCS implementation in Reo, we specify the communication, synchronisation and co-operation amongst the heterogeneous components of the system assuring, by design scalability and the interoperability, correctness of component cooperation.
Exploration of cellular reaction systems.
Kirkilionis, Markus
2010-01-01
We discuss and review different ways to map cellular components and their temporal interaction with other such components to different non-spatially explicit mathematical models. The essential choices made in the literature are between discrete and continuous state spaces, between rule and event-based state updates and between deterministic and stochastic series of such updates. The temporal modelling of cellular regulatory networks (dynamic network theory) is compared with static network approaches in two first introductory sections on general network modelling. We concentrate next on deterministic rate-based dynamic regulatory networks and their derivation. In the derivation, we include methods from multiscale analysis and also look at structured large particles, here called macromolecular machines. It is clear that mass-action systems and their derivatives, i.e. networks based on enzyme kinetics, play the most dominant role in the literature. The tools to analyse cellular reaction networks are without doubt most complete for mass-action systems. We devote a long section at the end of the review to make a comprehensive review of related tools and mathematical methods. The emphasis is to show how cellular reaction networks can be analysed with the help of different associated graphs and the dissection into modules, i.e. sub-networks.
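As a concrete example of the deterministic rate-based (mass-action) networks discussed above, the following hedged Python sketch integrates a reversible binding step A + B <-> C followed by a conversion C -> D as ODEs. Species names and rate constants are illustrative assumptions.

# Hedged sketch of a small mass-action reaction network integrated as ODEs.
import numpy as np
from scipy.integrate import odeint

def mass_action(y, t, k1, k2, k3):
    A, B, C, D = y
    v_bind = k1 * A * B          # A + B -> C
    v_unbind = k2 * C            # C -> A + B
    v_conv = k3 * C              # C -> D
    return [-v_bind + v_unbind,
            -v_bind + v_unbind,
             v_bind - v_unbind - v_conv,
             v_conv]

t = np.linspace(0, 50, 500)
sol = odeint(mass_action, [1.0, 0.8, 0.0, 0.0], t, args=(1.0, 0.1, 0.05))
print("concentrations at t=50:", sol[-1].round(4))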
Furuhama, A; Toida, T; Nishikawa, N; Aoki, Y; Yoshioka, Y; Shiraishi, H
2010-07-01
The KAshinhou Tool for Ecotoxicity (KATE) system, including ecotoxicity quantitative structure-activity relationship (QSAR) models, was developed by the Japanese National Institute for Environmental Studies (NIES) using the database of aquatic toxicity results gathered by the Japanese Ministry of the Environment and the US EPA fathead minnow database. In this system, chemicals can be entered according to their one-dimensional structures and classified by substructure. The QSAR equations for predicting the toxicity of a chemical compound assume a linear correlation between its log P value and its aquatic toxicity. KATE uses a structural domain called C-judgement, defined by the substructures of specified functional groups in the QSAR models. Internal validation by the leave-one-out method confirms that the QSAR equations, with r(2) > 0.7, RMSE
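The linear relationship the abstract describes has the form log(1/LC50) = a*logP + b. The hedged Python sketch below shows the least-squares fitting step on a few illustrative placeholder points; these numbers are not KATE's data or coefficients.

# Hedged sketch of fitting a linear logP-toxicity QSAR by least squares.
import numpy as np

log_p = np.array([0.5, 1.2, 2.0, 2.8, 3.5, 4.1])            # illustrative values
log_inv_lc50 = np.array([0.3, 0.9, 1.6, 2.2, 2.9, 3.4])     # illustrative values

a, b = np.polyfit(log_p, log_inv_lc50, deg=1)
pred = a * log_p + b
ss_res = np.sum((log_inv_lc50 - pred) ** 2)
ss_tot = np.sum((log_inv_lc50 - log_inv_lc50.mean()) ** 2)
print("log(1/LC50) = %.2f*logP + %.2f, r2 = %.3f" % (a, b, 1 - ss_res / ss_tot))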
Palomar, Esther; Chen, Xiaohong; Liu, Zhiming; Maharjan, Sabita; Bowen, Jonathan
2016-01-01
Smart city systems embrace major challenges associated with climate change, energy efficiency, mobility and future services by embedding the virtual space into a complex cyber-physical system. Those systems are constantly evolving and scaling up, involving a wide range of integration among users, devices, utilities, public services and also policies. Modelling such complex dynamic systems’ architectures has always been essential for the development and application of techniques/tools to support design and deployment of integration of new components, as well as for the analysis, verification, simulation and testing to ensure trustworthiness. This article reports on the definition and implementation of a scalable component-based architecture that supports a cooperative energy demand response (DR) system coordinating energy usage between neighbouring households. The proposed architecture, called refinement of Cyber-Physical Component Systems (rCPCS), which extends the refinement calculus for component and object system (rCOS) modelling method, is implemented using Eclipse Extensible Coordination Tools (ECT), i.e., Reo coordination language. With rCPCS implementation in Reo, we specify the communication, synchronisation and co-operation amongst the heterogeneous components of the system assuring, by design scalability and the interoperability, correctness of component cooperation. PMID:27801829
ERIC Educational Resources Information Center
Mankowska, Anna
2016-01-01
Little, if any, examination of using play-based tools to examine children's opinions in research exists in the current literature. Therefore, this paper is meant to address that gap within the literature and showcase the study about the use of a specific play-based methodological tool in qualitative research. This specific tool called social board…
Validation of Robotic Surgery Simulator (RoSS).
Kesavadas, Thenkurussi; Stegemann, Andrew; Sathyaseelan, Gughan; Chowriappa, Ashirwad; Srimathveeravalli, Govindarajan; Seixas-Mikelus, Stéfanie; Chandrasekhar, Rameella; Wilding, Gregory; Guru, Khurshid
2011-01-01
Recent growth of daVinci Robotic Surgical System as a minimally invasive surgery tool has led to a call for better training of future surgeons. In this paper, a new virtual reality simulator, called RoSS is presented. Initial results from two studies - face and content validity, are very encouraging. 90% of the cohort of expert robotic surgeons felt that the simulator was excellent or somewhat close to the touch and feel of the daVinci console. Content validity of the simulator received 90% approval in some cases. These studies demonstrate that RoSS has the potential of becoming an important training tool for the daVinci surgical robot.
Ellett, Kevin M.; Middleton, Richard S.; Stauffer, Philip H.; ...
2017-08-18
The application of integrated system models for evaluating carbon capture and storage technology has expanded steadily over the past few years. To date, such models have focused largely on hypothetical scenarios of complex source-sink matching involving numerous large-scale CO2 emitters, and high-volume, continuous reservoirs such as deep saline formations to function as geologic sinks for carbon storage. Though these models have provided unique insight on the potential costs and feasibility of deploying complex networks of integrated infrastructure, there remains a pressing need to translate such insight to the business community if this technology is to ever achieve a truly meaningful impact in greenhouse gas mitigation. Here, we present a new integrated system modelling tool termed SimCCUS aimed at providing crucial decision support for businesses by extending the functionality of a previously developed model called SimCCS. The primary innovation of the SimCCUS tool development is the incorporation of stacked geological reservoir systems with explicit consideration of processes and costs associated with the operation of multiple CO2 utilization and storage targets from a single geographic location. Such locations provide significant efficiencies through economies of scale, effectively minimizing CO2 storage costs while simultaneously maximizing revenue streams via the utilization of CO2 as a commodity for enhanced hydrocarbon recovery.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ellett, Kevin M.; Middleton, Richard S.; Stauffer, Philip H.
The application of integrated system models for evaluating carbon capture and storage technology has expanded steadily over the past few years. To date, such models have focused largely on hypothetical scenarios of complex source-sink matching involving numerous large-scale CO2 emitters, and high-volume, continuous reservoirs such as deep saline formations to function as geologic sinks for carbon storage. Though these models have provided unique insight on the potential costs and feasibility of deploying complex networks of integrated infrastructure, there remains a pressing need to translate such insight to the business community if this technology is to ever achieve a truly meaningful impact in greenhouse gas mitigation. Here, we present a new integrated system modelling tool termed SimCCUS aimed at providing crucial decision support for businesses by extending the functionality of a previously developed model called SimCCS. The primary innovation of the SimCCUS tool development is the incorporation of stacked geological reservoir systems with explicit consideration of processes and costs associated with the operation of multiple CO2 utilization and storage targets from a single geographic location. Such locations provide significant efficiencies through economies of scale, effectively minimizing CO2 storage costs while simultaneously maximizing revenue streams via the utilization of CO2 as a commodity for enhanced hydrocarbon recovery.
Borrel, Alexandre; Fourches, Denis
2017-12-01
There is a growing interest for the broad use of Augmented Reality (AR) and Virtual Reality (VR) in the fields of bioinformatics and cheminformatics to visualize complex biological and chemical structures. AR and VR technologies allow for stunning and immersive experiences, offering untapped opportunities for both research and education purposes. However, preparing 3D models ready to use for AR and VR is time-consuming and requires a technical expertise that severely limits the development of new contents of potential interest for structural biologists, medicinal chemists, molecular modellers and teachers. Herein we present the RealityConvert software tool and associated website, which allow users to easily convert molecular objects to high quality 3D models directly compatible for AR and VR applications. For chemical structures, in addition to the 3D model generation, RealityConvert also generates image trackers, useful to universally call and anchor that particular 3D model when used in AR applications. The ultimate goal of RealityConvert is to facilitate and boost the development and accessibility of AR and VR contents for bioinformatics and cheminformatics applications. http://www.realityconvert.com. dfourch@ncsu.edu. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
NASA Astrophysics Data System (ADS)
Itzá Balam, Reymundo; Iturrarán-Viveros, Ursula; Parra, Jorge O.
2018-03-01
Two main stages of seismic modeling are geological model building and numerical computation of seismic response for the model. The quality of the computed seismic response is partly related to the type of model that is built. Therefore, the model building approaches become as important as seismic forward numerical methods. For this purpose, three petrophysical facies (sands, shales and limestones) are extracted from reflection seismic data and some seismic attributes via the clustering method called Self-Organizing Maps (SOM), which, in this context, serves as a geological model building tool. This model with all its properties is the input to the Optimal Implicit Staggered Finite Difference (OISFD) algorithm to create synthetic seismograms for poroelastic, poroacoustic and elastic media. The results show a good agreement between observed and 2-D synthetic seismograms. This demonstrates that the SOM classification method enables us to extract facies from seismic data and allows us to integrate the lithology at the borehole scale with the 2-D seismic data.
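To illustrate the clustering step, here is a minimal self-organizing map (SOM) written in numpy, grouping toy attribute vectors into nodes the way seismic-attribute vectors are grouped into facies classes above. The grid size, learning schedule and the three synthetic clusters are assumptions, not the study's configuration.

# Minimal numpy SOM sketch: find the best-matching unit and pull its
# neighborhood toward each sample, with decaying learning rate and radius.
import numpy as np

rng = np.random.default_rng(0)
# toy attribute vectors (e.g. amplitude, frequency, impedance) in 3 clusters
X = np.vstack([rng.normal(m, 0.1, size=(200, 3)) for m in (0.2, 0.5, 0.8)])

rows, cols, dim = 4, 4, X.shape[1]
W = rng.random((rows * cols, dim))                  # SOM node weights
grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)

n_iter, lr0, sigma0 = 2000, 0.5, 2.0
for it in range(n_iter):
    x = X[rng.integers(len(X))]
    bmu = np.argmin(((W - x) ** 2).sum(axis=1))     # best-matching unit
    frac = it / n_iter
    lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
    dist2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
    h = np.exp(-dist2 / (2 * sigma ** 2))           # neighborhood function
    W += lr * h[:, None] * (x - W)

labels = np.argmin(((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=2), axis=1)
print("samples assigned per SOM node:", np.bincount(labels, minlength=rows * cols))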
Continuous Coordination Tools and their Evaluation
NASA Astrophysics Data System (ADS)
Sarma, Anita; Al-Ani, Ban; Trainer, Erik; Silva Filho, Roberto S.; da Silva, Isabella A.; Redmiles, David; van der Hoek, André
This chapter discusses a set of co-ordination tools (the Continuous Co-ordination (CC) tool suite that includes Ariadne, Workspace Activity Viewer (WAV), Lighthouse, Palantír, and YANCEES) and details of our evaluation framework for these tools. Specifically, we discuss how we assessed the usefulness and the usability of these tools within the context of a predefined evaluation framework called
Mobility analysis, simulation, and scale model testing for the design of wheeled planetary rovers
NASA Technical Reports Server (NTRS)
Lindemann, Randel A.; Eisen, Howard J.
1993-01-01
The use of computer based techniques to model and simulate wheeled rovers on rough natural terrains is considered. Physical models of a prototype vehicle can be used to test the correlation of the simulations in scaled testing. The computer approaches include a quasi-static planar or two dimensional analysis and design tool based on the traction necessary for the vehicle to have imminent mobility. The computer program modeled a six by six wheel drive vehicle of original kinematic configuration, called the Rocker Bogie. The Rocker Bogie was optimized using the quasi-static software with respect to its articulation parameters prior to fabrication of a prototype. In another approach used, the dynamics of the Rocker Bogie vehicle in 3-D space was modeled on an engineering workstation using commercial software. The model included the complex and nonlinear interaction of the tire and terrain. The results of the investigation yielded numerical and graphical results of the rover traversing rough terrain on the earth, moon, and Mars. In addition, animations of the rover excursions were also generated. A prototype vehicle was then used in a series of testbed and field experiments. Correspondence was then established between the computer models and the physical model. The results indicated the utility of the quasi-static tool for configurational design, as well as the predictive ability of the 3-D simulation to model the dynamic behavior of the vehicle over short traverses.
Adapting customer service to consumer-directed health care.
Hammer, David C
2006-09-01
A growing number of hospitals are implementing new tools that provide convenient, more detailed patient access to billing information. These tools are paying off for hospitals through reduced calls to their billing offices, decreased mailing costs, and increased payments, as well as higher rates of customer satisfaction.
RAFCON: A Graphical Tool for Engineering Complex, Robotic Tasks
2016-10-09
Robotic tasks are becoming increasingly complex, and with them so are the robotic systems. This requires new tools to manage this complexity and to ... execution of robotic tasks, called RAFCON. These tasks are described in hierarchical state machines supporting concurrency. A formal notation of this concept
Elwyn, Glyn; Burstin, Helen; Barry, Michael J; Corry, Maureen P; Durand, Marie Anne; Lessler, Daniel; Saigal, Christopher
2018-04-27
Efforts to implement the use of patient decision aids to stimulate shared decision making are gaining prominence. Patient decision aids have been designed to help patients participate in making specific choices among health care options. Because these tools clearly influence decisions, poor quality, inaccurate or unbalanced presentations or misleading tools are a risk to patients. As payer interest in these tools increases, so does the risk that patients are harmed by the use of tools that are described as patient decision aids yet fail to meet established standards. To address this problem, the National Quality Forum (NQF) in the USA convened a multi-stakeholder expert panel in 2016 to propose national standards for a patient decision aid certification process. In 2017, NQF established an Action Team to foster shared decision making, and to call for a national certification process as one recommendation among others to stimulate improvement. A persistent barrier to the setup of a national patient decision aids certification process is the lack of a sustainable financial model to support the work. Copyright © 2018 The Author(s). Published by Elsevier B.V. All rights reserved.
Near-Infrared Neuroimaging with NinPy
Strangman, Gary E.; Zhang, Quan; Zeffiro, Thomas
2009-01-01
There has been substantial recent growth in the use of non-invasive optical brain imaging in studies of human brain function in health and disease. Near-infrared neuroimaging (NIN) is one of the most promising of these techniques and, although NIN hardware continues to evolve at a rapid pace, software tools supporting optical data acquisition, image processing, statistical modeling, and visualization remain less refined. Python, a modular and computationally efficient development language, can support functional neuroimaging studies of diverse design and implementation. In particular, Python's easily readable syntax and modular architecture allow swift prototyping followed by efficient transition to stable production systems. As an introduction to our ongoing efforts to develop Python software tools for structural and functional neuroimaging, we discuss: (i) the role of non-invasive diffuse optical imaging in measuring brain function, (ii) the key computational requirements to support NIN experiments, (iii) our collection of software tools to support NIN, called NinPy, and (iv) future extensions of these tools that will allow integration of optical with other structural and functional neuroimaging data sources. Source code for the software discussed here will be made available at www.nmr.mgh.harvard.edu/Neural_SystemsGroup/software.html. PMID:19543449
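A core computation in NIN analysis is the modified Beer-Lambert law, which relates optical-density changes at two wavelengths to changes in oxy- and deoxy-hemoglobin concentration. The hedged Python sketch below solves the resulting 2x2 linear system; the extinction coefficients, path length, DPF values and OD changes are illustrative placeholders, not calibrated constants from NinPy.

# Hedged sketch of the modified Beer-Lambert step:
# delta_OD(lambda) = (eps_HbO*d[HbO] + eps_HbR*d[HbR]) * L * DPF(lambda)
import numpy as np

# rows = wavelengths, columns = (HbO, HbR) extinction coefficients (assumed units)
eps = np.array([[1.0, 2.5],     # ~690 nm: HbR absorbs more
                [2.0, 1.5]])    # ~830 nm: HbO absorbs more
source_detector_cm = 3.0
dpf = np.array([6.0, 5.5])      # differential pathlength factor per wavelength

delta_od = np.array([0.010, 0.015])          # measured OD changes (illustrative)
effective_path = source_detector_cm * dpf    # per-wavelength effective path length

# Solve delta_od = (eps @ delta_conc) * effective_path for delta_conc.
delta_conc = np.linalg.solve(eps * effective_path[:, None], delta_od)
print("delta[HbO], delta[HbR] =", delta_conc.round(5))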
Event-Driven Process Chains (EPC)
NASA Astrophysics Data System (ADS)
Mendling, Jan
This chapter provides a comprehensive overview of Event-driven Process Chains (EPCs) and introduces a novel definition of EPC semantics. EPCs became popular in the 1990s as a conceptual business process modeling language in the context of reference modeling. Reference modeling refers to the documentation of generic business operations in a model such as service processes in the telecommunications sector, for example. It is claimed that reference models can be reused and adapted as best-practice recommendations in individual companies (see [230, 168, 229, 131, 400, 401, 446, 127, 362, 126]). The roots of reference modeling can be traced back to the Kölner Integrationsmodell (KIM) [146, 147] that was developed in the 1960s and 1970s. In the 1990s, the Institute of Information Systems (IWi) in Saarbrücken worked on a project with SAP to define a suitable business process modeling language to document the processes of the SAP R/3 enterprise resource planning system. There were two results from this joint effort: the definition of EPCs [210] and the documentation of the SAP system in the SAP Reference Model (see [92, 211]). The extensive database of this reference model contains almost 10,000 sub-models: 604 of them non-trivial EPC business process models. The SAP Reference model had a huge impact with several researchers referring to it in their publications (see [473, 235, 127, 362, 281, 427, 415]) as well as motivating the creation of EPC reference models in further domains including computer integrated manufacturing [377, 379], logistics [229] or retail [52]. The wide-spread application of EPCs in business process modeling theory and practice is supported by their coverage in seminal text books for business process management and information systems in general (see [378, 380, 49, 384, 167, 240]). EPCs are frequently used in practice due to a high user acceptance [376] and extensive tool support. Some examples of tools that support EPCs are ARIS Toolset by IDS Scheer AG, AENEIS by ATOSS Software AG, ADONIS by BOC GmbH, Visio by Microsoft Corp., Nautilus by Gedilan Consulting GmbH, and Bonapart by Pikos GmbH. In order to facilitate the interchange of EPC business process models between these tools, there is a tool neutral interchange format called EPC Markup Language (EPML) [283, 285, 286, 287, 289, 290, 291].
Constructing Optimal Coarse-Grained Sites of Huge Biomolecules by Fluctuation Maximization.
Li, Min; Zhang, John Zenghui; Xia, Fei
2016-04-12
Coarse-grained (CG) models are valuable tools for the study of functions of large biomolecules on large length and time scales. The definition of CG representations for huge biomolecules is always a formidable challenge. In this work, we propose a new method called fluctuation maximization coarse-graining (FM-CG) to construct the CG sites of biomolecules. The defined residual in FM-CG converges to a maximal value as the number of CG sites increases, allowing an optimal CG model to be rigorously defined on the basis of the maximum. More importantly, we developed a robust algorithm called stepwise local iterative optimization (SLIO) to accelerate the process of coarse-graining large biomolecules. By means of the efficient SLIO algorithm, the computational cost of coarse-graining large biomolecules is reduced to within the time scale of seconds, which is far lower than that of conventional simulated annealing. The coarse-graining of two huge systems, chaperonin GroEL and lengsin, indicates that our new methods can coarse-grain huge biomolecular systems with up to 10,000 residues within the time scale of minutes. The further parametrization of CG sites derived from FM-CG allows us to construct the corresponding CG models for studies of the functions of huge biomolecular systems.
Computerized planning of prostate cryosurgery using variable cryoprobe insertion depth.
Rossi, Michael R; Tanaka, Daigo; Shimada, Kenji; Rabin, Yoed
2010-02-01
The current study presents a computerized planning scheme for prostate cryosurgery using a variable insertion depth strategy. This study is a part of an ongoing effort to develop computerized tools for cryosurgery. Based on typical clinical practices, previous automated planning schemes have required that all cryoprobes be aligned at a single insertion depth. The current study investigates the benefit of removing this constraint, in comparison with results based on uniform insertion depth planning as well as the so-called "pullback procedure". Planning is based on the so-called "bubble-packing method", and its quality is evaluated with bioheat transfer simulations. This study is based on five 3D prostate models, reconstructed from ultrasound imaging, and cryoprobe active length in the range of 15-35 mm. The variable insertion depth technique is found to consistently provide superior results when compared to the other placement methods. Furthermore, it is shown that both the optimal active length and the optimal number of cryoprobes vary among prostate models, based on the size and shape of the target region. Due to its low computational cost, the new scheme can be used to determine the optimal cryoprobe layout for a given prostate model in real time. Copyright 2008 Elsevier Inc. All rights reserved.
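The bioheat transfer simulations used to evaluate a cryoprobe layout are typically based on the Pennes equation, rho*c*dT/dt = k*d2T/dx2 + w_b*rho_b*c_b*(T_a - T) + q_met. The hedged Python sketch below is a 1D explicit finite-difference version with a fixed cryoprobe boundary temperature; the tissue properties, grid and boundary values are illustrative assumptions, and latent heat and perfusion shutoff in frozen tissue are deliberately ignored.

# Hedged 1D explicit finite-difference sketch of the Pennes bioheat equation.
import numpy as np

k, rho, c = 0.5, 1050.0, 3600.0          # W/m/K, kg/m3, J/kg/K (illustrative)
w_b, rho_b, c_b, T_a = 0.0005, 1060.0, 3600.0, 37.0   # perfusion terms
q_met = 400.0                            # metabolic heat, W/m3
nx, dx, dt, n_steps = 101, 1e-3, 0.05, 4000

T = np.full(nx, 37.0)                    # initial body temperature
alpha = k / (rho * c)
assert alpha * dt / dx**2 < 0.5          # explicit-scheme stability check

for _ in range(n_steps):
    T[0] = -145.0                        # cryoprobe surface held at -145 C
    T[-1] = 37.0                         # far-field boundary
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    perfusion = w_b * rho_b * c_b * (T_a - T[1:-1])
    # simplification: no latent heat and no perfusion shutoff in frozen tissue
    T[1:-1] += dt * (k * lap + perfusion + q_met) / (rho * c)

print("temperature 5 mm from the probe after %.0f s: %.1f C" % (n_steps * dt, T[5]))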
Model diagnostics in reduced-rank estimation
Chen, Kun
2016-01-01
Reduced-rank methods are very popular in high-dimensional multivariate analysis for conducting simultaneous dimension reduction and model estimation. However, the commonly-used reduced-rank methods are not robust, as the underlying reduced-rank structure can be easily distorted by only a few data outliers. Anomalies are bound to exist in big data problems, and in some applications they themselves could be of the primary interest. While naive residual analysis is often inadequate for outlier detection due to potential masking and swamping, robust reduced-rank estimation approaches could be computationally demanding. Under Stein's unbiased risk estimation framework, we propose a set of tools, including leverage score and generalized information score, to perform model diagnostics and outlier detection in large-scale reduced-rank estimation. The leverage scores give an exact decomposition of the so-called model degrees of freedom to the observation level, which lead to exact decomposition of many commonly-used information criteria; the resulting quantities are thus named information scores of the observations. The proposed information score approach provides a principled way of combining the residuals and leverage scores for anomaly detection. Simulation studies confirm that the proposed diagnostic tools work well. A pattern recognition example with hand-writing digital images and a time series analysis example with monthly U.S. macroeconomic data further demonstrate the efficacy of the proposed approaches. PMID:28003860
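As a heavily hedged illustration of the ideas above, the Python sketch below forms a rank-constrained regression estimate by truncating the SVD of the OLS fitted values and computes classical hat-matrix leverage scores as a simple analogue of observation-level diagnostics. This is not the paper's exact leverage or information-score construction.

# Heavily hedged sketch: rank-truncated fitted values plus classical leverage
# scores and residual norms for flagging a planted outlier.
import numpy as np

rng = np.random.default_rng(0)
n, p, q, r = 200, 8, 5, 2
X = rng.normal(size=(n, p))
B_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, q))   # true rank-2 coefficient
Y = X @ B_true + 0.3 * rng.normal(size=(n, q))
Y[0] += 8.0                                                  # plant one outlier

B_ols = np.linalg.lstsq(X, Y, rcond=None)[0]
U, s, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
fitted_rr = U[:, :r] * s[:r] @ Vt[:r]                        # rank-r fitted values

H = X @ np.linalg.inv(X.T @ X) @ X.T                         # hat matrix
leverage = np.diag(H)                                        # classical leverage scores
resid = np.linalg.norm(Y - fitted_rr, axis=1)

print("largest residual at observation:", int(resid.argmax()))
print("mean leverage (= p/n):", round(leverage.mean(), 3))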