-
Parallelizing Timed Petri Net simulations
NASA Technical Reports Server (NTRS)
Nicol, David M.
1993-01-01
The possibility of using parallel processing to accelerate the simulation of Timed Petri Nets (TPNs) was studied. It was recognized that complex system development tools often transform system descriptions into TPNs or TPN-like models, which are then simulated to obtain information about system behavior. Viewed this way, it was important that the parallelization of TPNs be as automatic as possible, to admit the possibility of the parallelization being embedded in the system design tool. Later years of the grant were devoted to examining the problem of joint performance and reliability analysis, to explore whether both types of analysis could be accomplished within a single framework. In this final report, the results of our studies are summarized. We believe that the problem of automatically parallelizing TPNs for MIMD architectures has been almost completely solved for a large and important class of problems. Our initial investigations into joint performance/reliability analysis are two-fold: we showed that Monte Carlo simulation with importance sampling offers promise of joint analysis in the context of a single tool, and we developed methods for the parallel simulation of general Continuous Time Markov Chains, a model framework within which joint performance/reliability models can be cast. However, considerably more work is needed to determine the scope and generality of these approaches. The results obtained in our two studies, future directions for this type of work, and a list of publications are included.
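As a rough illustration of the kind of model being simulated, the following is a minimal discrete-event sketch of timed-transition firing in Python. The net structure, function name, and fixed-delay semantics are illustrative assumptions for a sequential simulator, not the report's parallel MIMD algorithm:

```python
import heapq

def simulate_tpn(marking, transitions, horizon):
    """Discrete-event simulation of a timed Petri net with fixed firing delays.

    marking: dict mapping place name -> token count
    transitions: list of (name, inputs, outputs, delay), where inputs/outputs
    map place names to token counts consumed/produced on firing.
    Returns the final marking and a log of (completion_time, transition) pairs.
    """
    marking = dict(marking)
    clock, counter, log, pending = 0.0, 0, [], []
    while clock <= horizon:
        # Start every transition that is currently enabled, consuming input tokens.
        started = True
        while started:
            started = False
            for name, inputs, outputs, delay in transitions:
                if all(marking.get(p, 0) >= n for p, n in inputs.items()):
                    for p, n in inputs.items():
                        marking[p] -= n
                    # counter breaks ties between simultaneous completions
                    heapq.heappush(pending, (clock + delay, counter, name, outputs))
                    counter += 1
                    started = True
        if not pending:
            break
        # Advance the clock to the earliest completion and deposit output tokens.
        clock, _, name, outputs = heapq.heappop(pending)
        if clock > horizon:
            break
        for p, n in outputs.items():
            marking[p] = marking.get(p, 0) + n
        log.append((clock, name))
    return marking, log
```

A parallel version would partition the net across processors and synchronize on token exchanges, which is exactly where the automatic-parallelization problem studied in the report arises.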
-
An Annotated Reading List for Concurrent Engineering
DTIC Science & Technology
1989-07-01
(The seven tools are sometimes referred to as the seven old tools.) Ishikawa, Kaoru, What Is Total Quality Control? The Japanese Way, Prentice-Hall. Ishikawa (1982) presents a practical guide (with easy-to-use tools) for implementing quality control at the working level. Ishikawa, Kaoru, Guide to Quality Control, Kraus International Publications, White Plains, NY, 1982.
-
msBiodat analysis tool, big data analysis for high-throughput experiments.
PubMed
Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver
2016-01-01
Mass spectrometry (MS) encompasses a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, typically presented as a list of hundreds or thousands of proteins. Filtering those data efficiently is the first step for extracting biologically relevant information. The filtering can be enriched by merging previous results with data obtained from public databases, yielding an accurate list of proteins that meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application designed to bring big-data analysis to proteomics. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that allows researchers without programming experience to take advantage of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.
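As a rough sketch of the kind of filtering step described above, the snippet below selects proteins whose Gene Ontology annotations intersect a requested term set. The function name and data shapes are hypothetical; msBiodat's actual query interface is web-based and is not reproduced here:

```python
def filter_by_go(proteins, annotations, go_terms):
    """Keep proteins annotated with at least one of the requested GO terms.

    proteins: iterable of protein identifiers (e.g. UniProt accessions)
    annotations: dict mapping identifier -> set of GO term ids
    go_terms: collection of GO ids to select for
    """
    wanted = set(go_terms)
    return [p for p in proteins
            if annotations.get(p, set()) & wanted]
```

In a real pipeline the `annotations` mapping would be populated from a public database rather than built by hand.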
-
Comprehensive Assessment of Models and Events based on Library tools (CAMEL)
NASA Astrophysics Data System (ADS)
Rastaetter, L.; Boblitt, J. M.; DeZeeuw, D.; Mays, M. L.; Kuznetsova, M. M.; Wiegand, C.
2017-12-01
At the Community Coordinated Modeling Center (CCMC), the assessment of modeling skill using a library of model-data comparison metrics is taken to the next level by fully integrating the ability to request a series of runs with the same model parameters for a list of events. The CAMEL framework initiates and runs a series of selected, pre-defined simulation settings for participating models (e.g., WSA-ENLIL and SWMF-SC+IH for the heliosphere; SWMF-GM, OpenGGCM, LFM, and GUMICS for the magnetosphere) and performs post-processing using existing tools for a host of different output parameters. The framework compares the resulting time series data with the respective observational data and computes a suite of metrics such as Prediction Efficiency, Root Mean Square Error, Probability of Detection, Probability of False Detection, and Heidke Skill Score for each model-data pair. The system then plots scores by event and aggregated over all events for all participating models and run settings. We are building on past experience with model-data comparisons of magnetosphere and ionosphere model outputs in the GEM2008, GEM-CEDAR CETI2010 and Operational Space Weather Model challenges (2010-2013). The framework can also be applied to solar-heliosphere as well as radiation belt models. The CAMEL framework takes advantage of model simulations described with Space Physics Archive Search and Extract (SPASE) metadata and a database backend design developed for a next-generation Run-on-Request system at the CCMC.
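The metrics named above have standard definitions; a minimal sketch of three of them (RMSE, prediction efficiency, and the contingency-table scores) might look as follows. The function names are illustrative and are not CAMEL's actual code:

```python
import math

def rmse(model, obs):
    """Root mean square error between model and observed time series."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def prediction_efficiency(model, obs):
    """1 - SSE/variance: 1 is a perfect model, 0 matches the observed mean."""
    mean = sum(obs) / len(obs)
    sse = sum((m - o) ** 2 for m, o in zip(model, obs))
    var = sum((o - mean) ** 2 for o in obs)
    return 1.0 - sse / var

def contingency_scores(hits, misses, false_alarms, correct_negatives):
    """Probability of detection, probability of false detection, Heidke skill score."""
    pod = hits / (hits + misses)
    pofd = false_alarms / (false_alarms + correct_negatives)
    n = hits + misses + false_alarms + correct_negatives
    # Expected number of correct forecasts by chance, from the marginals.
    expected = ((hits + misses) * (hits + false_alarms)
                + (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
    hss = (hits + correct_negatives - expected) / (n - expected)
    return pod, pofd, hss
```

Scores like these are computed per model-data pair and then aggregated over events, as the abstract describes.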
-
ISAAC - InterSpecies Analysing Application using Containers.
PubMed
Baier, Herbert; Schultz, Jörg
2014-01-15
Information about genes, transcripts and proteins is spread over a wide variety of databases. Different tools have been developed using these databases to identify biological signals in gene lists from large-scale analyses; mostly, they search for enrichments of specific features. However, these tools do not allow an explorative walk through different views, nor modification of the gene lists as new questions arise. To fill this niche, we have developed ISAAC, the InterSpecies Analysing Application using Containers. The central idea of this web-based tool is to enable the analysis of sets of genes, transcripts and proteins under different biological viewpoints and to interactively modify these sets at any point of the analysis. Detailed history and snapshot information allows tracing of each action. Furthermore, one can easily switch back to previous states and perform new analyses. Currently, sets can be viewed in the context of genomes, protein functions, protein interactions, pathways, regulation, diseases and drugs. Additionally, users can switch between species with an automatic, orthology-based translation of existing gene sets. As today's research is usually performed in larger teams and consortia, ISAAC provides group-based functionality: sets as well as analysis results can be exchanged between members of groups. ISAAC fills the gap between primary databases and tools for the analysis of large gene lists. With its highly modular, Java EE-based design, the implementation of new modules is straightforward. Furthermore, ISAAC comes with an extensive web-based administration interface including tools for the integration of third-party data, so a local installation is easily feasible. In summary, ISAAC is tailor-made for highly explorative, interactive analyses of gene, transcript and protein sets in a collaborative environment.
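The snapshot-and-revert behaviour described above can be sketched with a small container class. The class and method names are hypothetical and this is not ISAAC's Java EE implementation:

```python
class GeneSetContainer:
    """Toy gene-set container with snapshot history and revert (names hypothetical)."""

    def __init__(self, genes):
        # Every state is stored as an immutable snapshot.
        self.history = [frozenset(genes)]

    @property
    def current(self):
        return self.history[-1]

    def modify(self, add=(), remove=()):
        """Apply an edit and record the resulting state as a new snapshot."""
        new = (self.current | set(add)) - set(remove)
        self.history.append(frozenset(new))

    def revert(self, index):
        """Return to an earlier snapshot by appending it as the new current state."""
        self.history.append(self.history[index])
```

Keeping every state as an immutable snapshot makes "switch back to previous states and perform new analyses" a constant-time operation.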
-
Integrated software environment based on COMKAT for analyzing tracer pharmacokinetics with molecular imaging.
PubMed
Fang, Yu-Hua Dean; Asthana, Pravesh; Salinas, Cristian; Huang, Hsuan-Ming; Muzic, Raymond F
2010-01-01
An integrated software package, Compartment Model Kinetic Analysis Tool (COMKAT), is presented in this report. COMKAT is an open-source software package with many functions for incorporating pharmacokinetic analysis in molecular imaging research and has both command-line and graphical user interfaces. With COMKAT, users may load and display images, draw regions of interest, load input functions, select kinetic models from a predefined list, or create a novel model and perform parameter estimation, all without having to write any computer code. For image analysis, COMKAT image tool supports multiple image file formats, including the Digital Imaging and Communications in Medicine (DICOM) standard. Image contrast, zoom, reslicing, display color table, and frame summation can be adjusted in COMKAT image tool. It also displays and automatically registers images from 2 modalities. Parametric imaging capability is provided and can be combined with the distributed computing support to enhance computation speeds. For users without MATLAB licenses, a compiled, executable version of COMKAT is available, although it currently has only a subset of the full COMKAT capability. Both the compiled and the noncompiled versions of COMKAT are free for academic research use. Extensive documentation, examples, and COMKAT itself are available on its wiki-based Web site, http://comkat.case.edu. Users are encouraged to contribute, sharing their experience, examples, and extensions of COMKAT. With integrated functionality specifically designed for imaging and kinetic modeling analysis, COMKAT can be used as a software environment for molecular imaging and pharmacokinetic analysis.
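As a minimal illustration of the kind of compartment model COMKAT estimates, here is an Euler-integration sketch of a one-tissue model, dCt/dt = K1·Cp(t) − k2·Ct(t). The function name, integration scheme, and parameters are illustrative assumptions, not COMKAT's actual code:

```python
def one_tissue_model(k1, k2, plasma, dt):
    """Euler integration of dCt/dt = K1*Cp(t) - k2*Ct(t).

    plasma: sampled plasma input function Cp at uniform time steps of width dt
    Returns the tissue concentration Ct at each step.
    """
    ct, out = 0.0, []
    for cp in plasma:
        ct += dt * (k1 * cp - k2 * ct)
        out.append(ct)
    return out
```

Parameter estimation, as in COMKAT, would wrap a model like this in a least-squares fit of K1 and k2 against measured tissue curves.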
-
Popularity and Novelty Dynamics in Evolving Networks.
PubMed
Abbas, Khushnood; Shang, Mingsheng; Abbasi, Alireza; Luo, Xin; Xu, Jian Jun; Zhang, Yu-Xia
2018-04-20
Network science plays an important role in representing real-world phenomena such as the user-item bipartite networks found in e-commerce and social media platforms, and it provides researchers with tools and techniques to solve complex real-world problems. Identifying and predicting the future popularity and importance of items on these platforms is a challenging task. Some items gain popularity repeatedly over time, while others become popular and novel only once. This work aims to identify two key factors: popularity and novelty. To do so, we consider two types of novelty prediction: items appearing in the popular ranking list for the first time; and items which were not in the popular list in the recent past time window, but might have been popular before it. Identifying popular items requires careful macro-level analysis. We propose a model that exploits item-level information over a span of time to rank the importance of items, considering an ageing (decay) effect along with each item's recent link gain. We test the proposed model on four real-world datasets using four information-retrieval-based metrics.
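A decay-weighted popularity score of the general kind described above can be sketched as follows. The exponential ageing kernel, time scale, and function name are illustrative assumptions, not the paper's exact model:

```python
import math

def popularity_score(link_times, now, tau):
    """Score an item by its link gains, each discounted by exponential ageing.

    link_times: timestamps at which the item gained links
    now: current time; tau: decay time scale (larger = slower forgetting)
    """
    return sum(math.exp(-(now - t) / tau) for t in link_times)
```

Under this kind of kernel, a recent burst of links outweighs an equally large burst long ago, which is the qualitative behaviour an ageing-plus-recent-link-gain model aims for.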
-
Continual improvement: A bibliography with indexes, 1992-1993
NASA Technical Reports Server (NTRS)
1994-01-01
This bibliography lists 606 references to reports and journal articles entered into the NASA Scientific and Technical Information Database during 1992 to 1993. Topics cover the philosophy and history of Continual Improvement (CI), basic approaches and strategies for implementation, and lessons learned from public and private sector models. Entries are arranged according to the following categories: Leadership for Quality, Information and Analysis, Strategic Planning for CI, Human Resources Utilization, Management of Process Quality, Supplier Quality, Assessing Results, Customer Focus and Satisfaction, TQM Tools and Philosophies, and Applications. Indexes include subject, personal author, corporate source, contract number, report number, and accession number.
-
Validity of a quantitative clinical measurement tool of trunk posture in idiopathic scoliosis.
PubMed
Fortin, Carole; Feldman, Debbie E; Cheriet, Farida; Labelle, Hubert
2010-09-01
Concurrent validity between postural indices obtained from digital photographs (two-dimensional [2D]), surface topography imaging (three-dimensional [3D]), and radiographs. To assess the validity of a quantitative clinical postural assessment tool of the trunk based on photographs (2D) as compared to a surface topography system (3D) as well as indices calculated from radiographs. To monitor progression of scoliosis or change in posture over time in young persons with idiopathic scoliosis (IS), noninvasive and nonionizing methods are recommended. In a clinical setting, posture can be quite easily assessed by calculating key postural indices from photographs. Quantitative postural indices of 70 subjects aged 10 to 20 years with IS (Cobb angle, 15 degrees to 60 degrees) were measured from photographs and from 3D trunk surface images taken in the standing position. Shoulder, scapula, trunk list, pelvis, scoliosis, and waist angle indices were calculated with specially designed software. Frontal and sagittal Cobb angles and trunk list were also calculated on radiographs. The Pearson correlation coefficient (r) was used to estimate concurrent validity of the 2D clinical postural tool of the trunk with indices extracted from the 3D system and with those obtained from radiographs. The correlation between 2D and 3D indices was good to excellent for shoulder, pelvis, trunk list, and thoracic scoliosis (0.81 ≤ r ≤ 0.97; P < 0.01) but fair to moderate for thoracic kyphosis, lumbar lordosis, and thoracolumbar or lumbar scoliosis (0.30 ≤ r ≤ 0.56; P < 0.05). The correlation between 2D and radiographic spinal indices was fair to good (-0.33 to -0.80 with Cobb angles and 0.76 for trunk list; P < 0.05). This tool will facilitate clinical practice by monitoring trunk posture among persons with IS. Further, it may contribute to a reduction in the use of radiographs to monitor scoliosis progression.
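The concurrent-validity statistic reported above is the ordinary Pearson correlation coefficient; a self-contained sketch (function name hypothetical) is:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Applied to paired 2D and 3D postural indices, values near ±1 indicate strong agreement, matching the "good to excellent" versus "fair to moderate" ranges the study reports.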
-
The Use of Climatic Niches in Screening Procedures for Introduced Species to Evaluate Risk of Spread: A Case with the American Eastern Grey Squirrel
PubMed Central
Di Febbraro, Mirko; Lurz, Peter W. W.; Genovesi, Piero; Maiorano, Luigi; Girardello, Marco; Bertolino, Sandro
2013-01-01
Species introduction represents one of the most serious threats for biodiversity. The realized climatic niche of an invasive species can be used to predict its potential distribution in new areas, providing a basis for screening procedures in the compilation of black and white lists to prevent new introductions. We tested this assertion by modeling the realized climatic niche of the Eastern grey squirrel Sciurus carolinensis. Maxent was used to develop three models: one considering only records from the native range (NRM), a second including records from native and invasive range (NIRM), a third calibrated with invasive occurrences and projected in the native range (RCM). Niche conservatism was tested considering both a niche equivalency and a niche similarity test. NRM failed to predict suitable parts of the currently invaded range in Europe, while RCM underestimated the suitability in the native range. NIRM accurately predicted both the native and invasive range. The niche equivalency hypothesis was rejected due to a significant difference between the grey squirrel’s niche in native and invasive ranges. The niche similarity test yielded no significant results. Our analyses support the hypothesis of a shift in the species’ climatic niche in the area of introductions. Species Distribution Models (SDMs) appear to be a useful tool in the compilation of black lists, allowing identification of areas vulnerable to invasions. We advise caution in the use of SDMs based only on the native range of a species for the compilation of white lists for other geographic areas, due to the significant risk of underestimating its potential invasive range. PMID:23843957
-
GALEN: a third generation terminology tool to support a multipurpose national coding system for surgical procedures.
PubMed
Trombert-Paviot, B; Rodrigues, J M; Rogers, J E; Baud, R; van der Haring, E; Rassinoux, A M; Abrial, V; Clavel, L; Idir, H
2000-09-01
Generalised architecture for languages, encyclopaedias and nomenclatures in medicine (GALEN) has developed a new generation of terminology tools based on a language-independent model that describes the semantics of the domain and allows computer processing, multiple reuse, and natural-language-understanding applications, in order to facilitate the sharing and maintenance of consistent medical knowledge. During the European Union Fourth Framework Programme project GALEN-IN-USE, and later within two contracts with the national health authorities, we applied the modelling and the tools to the development of a new multipurpose coding system for surgical procedures, named CCAM, in a minority-language country, France. On the one hand, we contributed to a language-independent knowledge repository and multilingual semantic dictionaries for multicultural Europe. On the other hand, we supported the traditional process for creating a new coding system in medicine, which is highly labour-intensive, with artificial-intelligence tools using a medically oriented recursive ontology and natural language processing. We used an integrated software package named CLAW (for classification workbench) to process French professional medical language rubrics, produced by the national colleges of surgeons as domain experts, into intermediate dissections and then into the GRAIL reference ontology model representation. From this language-independent concept model representation, on the one hand, we generated controlled French natural language with the LNAT natural language generator to support the finalization of the linguistic labels (first generation) in relation to the meanings of the conceptual system structure. On the other hand, the CLAW classification manager proved very powerful for retrieving the initial domain experts' rubrics list with different categories of concepts (second generation) within a semantically structured representation (third generation) bridging to the detailed terminology of the electronic patient record.
-
Effect of preventive zinc supplementation on linear growth in children under 5 years of age in developing countries: a meta-analysis of studies for input to the lives saved tool
PubMed Central
2011-01-01
Introduction Zinc plays an important role in cellular growth, cellular differentiation and metabolism. The results of previous meta-analyses evaluating the effect of zinc supplementation on linear growth are inconsistent. We have updated and evaluated the available evidence according to Grading of Recommendations, Assessment, Development and Evaluation (GRADE) criteria and tried to explain the differences in the results of the previous reviews. Methods A literature search was done on PubMed, the Cochrane Library, the IZiNCG database and WHO regional databases using different terms for zinc and linear growth (height). Data were abstracted in a standardized form and analyzed in two ways: as weighted mean difference (effect size) and as pooled mean difference for absolute increment in length in centimeters. Random-effects models were used for these pooled estimates. We give our recommendations for the effectiveness of zinc supplementation in the form of the absolute increment in length (cm) in the zinc-supplemented group compared to control, for input to the Lives Saved Tool (LiST). Results Thirty-six studies assessed the effect of zinc supplementation on linear growth in children < 5 years from developing countries. In eleven of these studies, zinc was given in combination with other micronutrients (iron, vitamin A, etc.). The final effect size after pooling all the data sets (zinc ± iron etc.) showed a significant positive effect of zinc supplementation on linear growth [effect size: 0.13 (95% CI 0.04, 0.21), random model] in the developing countries. A subgroup analysis excluding those data sets where zinc was supplemented in combination with iron showed a more pronounced effect of zinc supplementation on linear growth [weighted mean difference 0.19 (95% CI 0.08, 0.30), random model].
A subgroup analysis of studies that reported actual increase in length (cm) showed that a dose of 10 mg zinc/day for a duration of 24 weeks led to a net gain of 0.37 (±0.25) cm in the zinc-supplemented group compared to placebo. This estimate is recommended for inclusion in the Lives Saved Tool (LiST) model. Conclusions Zinc supplementation has a significant positive effect on linear growth, especially when administered alone, and should be included in national strategies to reduce stunting in children < 5 years of age in developing countries. PMID:21501440
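Random-effects pooling of the kind described in the Methods is commonly done with the DerSimonian-Laird estimator; the sketch below illustrates that standard method, assuming per-study effect sizes and variances are available (it is not the review's actual analysis code):

```python
def dersimonian_laird(effects, variances):
    """Random-effects pooled estimate with DerSimonian-Laird between-study variance.

    effects: per-study effect sizes; variances: their within-study variances.
    Returns (pooled effect, tau-squared).
    """
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sw
    # Cochran's Q heterogeneity statistic.
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sw - sum(wi * wi for wi in w) / sw
    tau2 = max(0.0, (q - df) / c)  # between-study variance, truncated at zero
    # Re-weight each study by total (within + between) variance.
    wr = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(wr, effects)) / sum(wr)
    return pooled, tau2
```

When the studies agree (Q below its degrees of freedom), tau-squared is zero and the result reduces to the fixed-effect estimate.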
-
DynaSim: A MATLAB Toolbox for Neural Modeling and Simulation
PubMed Central
Sherfey, Jason S.; Soplata, Austin E.; Ardid, Salva; Roberts, Erik A.; Stanley, David A.; Pittman-Polletta, Benjamin R.; Kopell, Nancy J.
2018-01-01
DynaSim is an open-source MATLAB/GNU Octave toolbox for rapid prototyping of neural models and batch simulation management. It is designed to speed up and simplify the process of generating, sharing, and exploring network models of neurons with one or more compartments. Models can be specified by equations directly (similar to XPP or the Brian simulator) or by lists of predefined or custom model components. The higher-level specification supports arbitrarily complex population models and networks of interconnected populations. DynaSim also includes a large set of features that simplify exploring model dynamics over parameter spaces, running simulations in parallel using both multicore processors and high-performance computer clusters, and analyzing and plotting large numbers of simulated data sets in parallel. It also includes a graphical user interface (DynaSim GUI) that supports full functionality without requiring user programming. The software has been implemented in MATLAB to enable advanced neural modeling using MATLAB, given its popularity and a growing interest in modeling neural systems. The design of DynaSim incorporates a novel schema for model specification to facilitate future interoperability with other specifications (e.g., NeuroML, SBML), simulators (e.g., NEURON, Brian, NEST), and web-based applications (e.g., Geppetto) outside MATLAB. DynaSim is freely available at http://dynasimtoolbox.org. This tool promises to reduce barriers for investigating dynamics in large neural models, facilitate collaborative modeling, and complement other tools being developed in the neuroinformatics community. PMID:29599715
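As a flavour of the equation-based population models DynaSim manages (DynaSim itself is MATLAB/GNU Octave; this analogous sketch is in Python, and all names and parameters are illustrative):

```python
def simulate_population(n, dt, steps, tau=10.0, i_ext=1.5):
    """Euler integration of dv/dt = (-v + i_ext)/tau for a population of n cells.

    Returns the voltage trace of the first cell; all cells here are identical,
    standing in for a DynaSim-style homogeneous population.
    """
    v = [0.0] * n
    trace = []
    for _ in range(steps):
        v = [vi + dt * (-vi + i_ext) / tau for vi in v]
        trace.append(v[0])
    return trace
```

In DynaSim the same model would be written declaratively as equation strings and swept over parameters such as `i_ext` in parallel batch simulations.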
-
NetActivism: How Citizens Use the Internet. First Edition.
ERIC Educational Resources Information Center
Schwartz, Edward
This book guides citizens in using the Internet for community, social, and political action. Following an in-depth introduction, Chapter 1, "Getting Connected," and Chapter 2, "Tools," explain the two Internet tools central to organizing for activism--electronic mail lists and the World Wide Web--and the hardware and software…
-
Welding. Module 8 of the Vocational Education Readiness Test (VERT).
ERIC Educational Resources Information Center
Thomas, Edward L., Comp.
Focusing on welding, this module is one of eight included in the Vocational Education Readiness Tests (VERT). The module begins by listing the objectives of the module and describing tools and equipment needed. The remainder of the module contains sections on manipulative skills, trade vocabulary, tool identification, trade computational skills,…
-
Basic Wiring. Module 2 of the Vocational Education Readiness Test (VERT).
ERIC Educational Resources Information Center
Thomas, Edward L., Comp.
Focusing on basic wiring, this module is one of eight included in the Vocational Education Readiness Test (VERT). The module begins by listing the objectives of the module and describing tools and equipment needed. The remainder of the module contains sections on manipulative skills, trade vocabulary, tool identification, trade computation…
-
The Use of Hand Tools in Agricultural Mechanics.
ERIC Educational Resources Information Center
Montana State Univ., Bozeman. Dept. of Agricultural and Industrial Education.
This document contains a unit for teaching the use of hand tools in agricultural mechanics in Montana. It consists of an outline of the unit and seven lesson plans. The unit outline contains the following components: situation, aims and goals, list of lessons, student activities, teacher activities, special equipment needed, and references. The…
-
Masonry. Module 5 of the Vocational Education Readiness Test (VERT).
ERIC Educational Resources Information Center
Thomas, Edward L., Comp.
Focusing on masonry, this module is one of eight included in the Vocational Education Readiness Tests (VERT). The module begins by listing the objectives of the module and describing tools and equipment needed. The remainder of the module contains sections on manipulative skills, trade vocabulary, tool identification, trade computational skills,…
-
An Empirical Derivation of the Run Time of the Bubble Sort Algorithm.
ERIC Educational Resources Information Center
Gonzales, Michael G.
1984-01-01
Suggests a moving pictorial tool to help teach principles in the bubble sort algorithm. Develops such a tool applied to an unsorted list of numbers and describes a method to derive the run time of the algorithm. The method can be modified to derive the run times of various other algorithms. (JN)
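An instrumented bubble sort of the kind used for empirical run-time derivation can be sketched as follows (function name hypothetical); counting comparisons gives the quantity whose growth the article's method derives:

```python
def bubble_sort_count(items):
    """Bubble sort that also counts comparisons, for empirical run-time study."""
    data = list(items)
    n = len(data)
    comparisons = 0
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            comparisons += 1
            if data[j] > data[j + 1]:
                data[j], data[j + 1] = data[j + 1], data[j]
                swapped = True
        if not swapped:
            # Early exit: a pass with no swaps means the list is sorted.
            break
    return data, comparisons
```

Plotting the comparison count against list length for random inputs exhibits the quadratic growth that an empirical derivation of bubble sort's run time would recover.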