Sample records for specialized analysis tools

  1. Multi-mission space vehicle subsystem analysis tools

    NASA Technical Reports Server (NTRS)

    Kordon, M.; Wood, E.

    2003-01-01

    Spacecraft engineers often rely on specialized simulation tools to facilitate the analysis, design and operation of space systems. Unfortunately, these tools are often designed for one phase of a single mission and cannot be easily adapted to other phases or other missions. The Multi-Mission Space Vehicle Subsystem Analysis Tools are designed to provide a solution to this problem.

  2. Design and Analysis Techniques for Concurrent Blackboard Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Mcmanus, John William

    1992-01-01

    Blackboard systems are a natural progression of knowledge-based systems into a more powerful problem solving technique. They provide a way for several highly specialized knowledge sources to cooperate to solve large, complex problems. Blackboard systems incorporate the concepts developed by rule-based and expert systems programmers and include the ability to add conventionally coded knowledge sources. The small and specialized knowledge sources are easier to develop and test, and can be hosted on hardware specifically suited to the task that they are solving. The Formal Model for Blackboard Systems was developed to provide a consistent method for describing a blackboard system. A set of blackboard system design tools has been developed and validated for implementing systems that are expressed using the Formal Model. The tools are used to test and refine a proposed blackboard system design before the design is implemented. My research has shown that the level of independence and specialization of the knowledge sources directly affects the performance of blackboard systems. Using the design, simulation, and analysis tools, I developed a concurrent object-oriented blackboard system that is faster, more efficient, and more powerful than existing systems. The use of the design and analysis tools provided the highly specialized and independent knowledge sources required for my concurrent blackboard system to achieve its design goals.

  3. New Results in Software Model Checking and Analysis

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.

    2010-01-01

    This introductory article surveys new techniques, supported by automated tools, for the analysis of software to ensure reliability and safety. Special focus is on model checking techniques. The article also introduces the five papers that are enclosed in this special journal volume.

  4. Discriminant Analysis as a Tool for Admission Selection to Special Academic Programs. AIR 1986 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Kissel, Mary Ann

    The use of stepwise discriminant analysis as a means to select entering students who would benefit from a special program for the disadvantaged was studied. In fall 1984, 278 full-time black students were admitted as first-time students to a large urban university. Of the total, 200 entered a special program for the disadvantaged and 78 entered…

  5. Projectiles, pendula, and special relativity

    NASA Astrophysics Data System (ADS)

    Price, Richard H.

    2005-05-01

    The kind of flat-earth gravity used in introductory physics appears in an accelerated reference system in special relativity. From this viewpoint, we work out the special relativistic description of a ballistic projectile and a simple pendulum, two examples of simple motion driven by earth-surface gravity. The analysis uses only the basic mathematical tools of special relativity typical of a first-year university course.
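    For reference (a standard result, not quoted from the abstract), the special-relativistic trajectory under constant proper acceleration g, which underlies such "flat-earth gravity" analyses, is hyperbolic motion:

```latex
x(t) \;=\; \frac{c^{2}}{g}\left(\sqrt{1+\left(\frac{gt}{c}\right)^{2}}\;-\;1\right),
\qquad
x(t) \;\approx\; \tfrac{1}{2}\,g t^{2} \quad \text{for } gt \ll c,
```

    which recovers the Newtonian projectile limit at everyday speeds, consistent with the paper's claim that only first-year mathematical tools are needed.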

  6. Elementary Analysis of the Special Relativistic Combination of Velocities, Wigner Rotation and Thomas Precession

    ERIC Educational Resources Information Center

    O'Donnell, Kane; Visser, Matt

    2011-01-01

    The purpose of this paper is to provide an elementary introduction to the qualitative and quantitative results of velocity combination in special relativity, including the Wigner rotation and Thomas precession. We utilize only the most familiar tools of special relativity, in arguments presented at three differing levels: (1) utterly elementary,…
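    For context (the standard result the paper builds on, not quoted from the abstract), the combination law for collinear velocities u and v in special relativity is

```latex
w \;=\; \frac{u+v}{1+\dfrac{uv}{c^{2}}},
```

    while for non-collinear velocities the composition is neither commutative nor associative; the mismatch between the two orders of composition is precisely the Wigner rotation discussed in the paper.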

  7. Special population planner 4 : an open source release.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuiper, J.; Metz, W.; Tanzman, E.

    2008-01-01

    Emergencies like Hurricane Katrina and the recent California wildfires underscore the critical need to meet the complex challenge of planning for individuals with special needs and for institutionalized special populations. People with special needs and special populations often have difficulty responding to emergencies or taking protective actions, and emergency responders may be unaware of their existence and situations during a crisis. Special Population Planner (SPP) is an ArcGIS-based emergency planning system released as an open source product. SPP provides for easy production of maps, reports, and analyses to develop and revise emergency response plans. It includes tools to manage a voluntary registry of data for people with special needs, integrated links to plans and documents, tools for response planning and analysis, preformatted reports and maps, and data on locations of special populations, facility and resource characteristics, and contacts. The system can be readily adapted for new settings without programming and is broadly applicable. Full documentation and a demonstration database are included in the release.

  8. State Civic Education Policy: Framework and Gap Analysis Tool. Special Report

    ERIC Educational Resources Information Center

    Baumann, Paul; Brennan, Jan

    2017-01-01

    The civic education policy framework and gap analysis tool are intended to guide state leaders as they address the complexities of preparing students for college, career and civic life. They allow for adaptation to state- and site-specific circumstances and may be adopted in whole or in piecemeal fashion, according to states' individual…

  9. 48 CFR 45.101 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., special test equipment or special tooling. Government-furnished property means property in the possession... contractor-acquired property. Government property includes material, equipment, special tooling, special test... end-item. Material does not include equipment, special tooling, special test equipment or real...

  10. Applying Dataflow Architecture and Visualization Tools to In Vitro Pharmacology Data Automation.

    PubMed

    Pechter, David; Xu, Serena; Kurtz, Marc; Williams, Steven; Sonatore, Lisa; Villafania, Artjohn; Agrawal, Sony

    2016-12-01

    The pace and complexity of modern drug discovery places ever-increasing demands on scientists for data analysis and interpretation. Data flow programming and modern visualization tools address these demands directly. Three different requirements (one for allosteric modulator analysis, one for a specialized clotting analysis, and one for enzyme global progress curve analysis) are reviewed, and their execution in a combined data flow/visualization environment is outlined. © 2016 Society for Laboratory Automation and Screening.

  11. Requirements management for Gemini Observatory: a small organization with big development projects

    NASA Astrophysics Data System (ADS)

    Close, Madeline; Serio, Andrew; Cordova, Martin; Hardie, Kayla

    2016-08-01

    Gemini Observatory is an astronomical observatory operating two premier 8m-class telescopes, one in each hemisphere. As an operational facility, a majority of Gemini's resources are spent on operations; however, the observatory undertakes major development projects as well. Current projects include new facility science instruments, an operational paradigm shift to full remote operations, and new operations tools for planning, configuration and change control. Three years ago, Gemini determined that a specialized requirements management tool was needed. Over the next year, the Gemini Systems Engineering Group investigated several tools, selected one for a trial period and configured it for use. Configuration activities included the definition of systems engineering processes, the development of a requirements framework, and the assignment of project roles to tool roles. Test projects were implemented in the tool. At the conclusion of the trial, the group determined that Gemini could meet its requirements management needs without use of a specialized requirements management tool, and the group identified a number of lessons learned, which are described in the last major section of this paper. These lessons learned include how to conduct an organizational needs analysis prior to pursuing a tool; caveats concerning tool criteria and the selection process; the prerequisites and sequence of activities necessary to achieve an optimum configuration of the tool; the need for adequate staff resources and staff training; and a special note regarding organizations in transition and archiving of requirements.

  12. 14 CFR 147.19 - Materials, special tools, and shop equipment requirements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 14 Aeronautics and Space 3 2014-01-01 2014-01-01 false Materials, special tools, and shop... TECHNICIAN SCHOOLS Certification Requirements § 147.19 Materials, special tools, and shop equipment... additional rating, must have an adequate supply of material, special tools, and such of the shop equipment as...

  13. 14 CFR 147.19 - Materials, special tools, and shop equipment requirements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 14 Aeronautics and Space 3 2013-01-01 2013-01-01 false Materials, special tools, and shop... TECHNICIAN SCHOOLS Certification Requirements § 147.19 Materials, special tools, and shop equipment... additional rating, must have an adequate supply of material, special tools, and such of the shop equipment as...

  14. 14 CFR 147.19 - Materials, special tools, and shop equipment requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 3 2011-01-01 2011-01-01 false Materials, special tools, and shop... TECHNICIAN SCHOOLS Certification Requirements § 147.19 Materials, special tools, and shop equipment... additional rating, must have an adequate supply of material, special tools, and such of the shop equipment as...

  15. 14 CFR 147.19 - Materials, special tools, and shop equipment requirements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 3 2012-01-01 2012-01-01 false Materials, special tools, and shop... TECHNICIAN SCHOOLS Certification Requirements § 147.19 Materials, special tools, and shop equipment... additional rating, must have an adequate supply of material, special tools, and such of the shop equipment as...

  16. Noise Reduction in High-Throughput Gene Perturbation Screens

    USDA-ARS?s Scientific Manuscript database

    Motivation: Accurate interpretation of perturbation screens is essential for a successful functional investigation. However, the screened phenotypes are often distorted by noise, and their analysis requires specialized statistical analysis tools. The number and scope of statistical methods available...

  17. Distributed and Collaborative Software Analysis

    NASA Astrophysics Data System (ADS)

    Ghezzi, Giacomo; Gall, Harald C.

    Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of software analysis, such as source code analysis, co-change analysis or bug prediction. However, easy and straightforward synergies between these analyses and tools rarely exist because of their stand-alone nature, their platform dependence, their different input and output formats and the variety of data to analyze. As a consequence, distributed and collaborative software analysis scenarios, and in particular interoperability, are severely limited. We describe a distributed and collaborative software analysis platform that allows for a seamless interoperability of software analysis tools across platform, geographical and organizational boundaries. We realize software analysis tools as services that can be accessed and composed over the Internet. These distributed analysis services shall be widely accessible in our incrementally augmented Software Analysis Broker, where organizations and tool providers can register and share their tools. To allow (semi-)automatic use and composition of these tools, they are classified and mapped into a software analysis taxonomy and adhere to specific meta-models and ontologies for their category of analysis.

  18. Video Analysis of Projectile Motion Using Tablet Computers as Experimental Tools

    ERIC Educational Resources Information Center

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and "g" in order to explore the underlying laws of motion. This experiment…

  19. AstrodyToolsWeb an e-Science project in Astrodynamics and Celestial Mechanics fields

    NASA Astrophysics Data System (ADS)

    López, R.; San-Juan, J. F.

    2013-05-01

    Astrodynamics Web Tools, AstrodyToolsWeb (http://tastrody.unirioja.es), is an ongoing collaborative Web Tools computing infrastructure project which has been specially designed to support scientific computation. AstrodyToolsWeb provides project collaborators with all the technical and human facilities in order to wrap, manage, and use specialized noncommercial software tools in Astrodynamics and Celestial Mechanics fields, with the aim of optimizing the use of resources, both human and material. However, this project is open to collaboration from the whole scientific community in order to create a library of useful tools and their corresponding theoretical backgrounds. AstrodyToolsWeb offers a user-friendly web interface in order to choose applications, introduce data, and select appropriate constraints in an intuitive and easy way for the user. After that, the application is executed in real time, whenever possible; then the critical information about program behavior (errors and logs) and output, including the postprocessing and interpretation of its results (graphical representation of data, statistical analysis or whatever manipulation therein), are shown via the same web interface or can be downloaded to the user's computer.

  20. On Special Functions in the Context of Clifford Analysis

    NASA Astrophysics Data System (ADS)

    Malonek, H. R.; Falcão, M. I.

    2010-09-01

    Considering the foundation of Quaternionic Analysis by R. Fueter and his collaborators in the beginning of the 1930s as the starting point of Clifford Analysis, we can look back on 80 years of work in this field. However, the interest in multivariate analysis using Clifford algebras only started to grow significantly in the 70s. Since then a great number of papers on Clifford Analysis treating different classes of Special Functions have appeared. This situation may have been triggered by a more systematic treatment of monogenic functions by their multiple series development derived from Gegenbauer or associated Legendre polynomials (and not only by their integral representation). Also, approaches to Special Functions by means of algebraic methods, either Lie algebras or Lie groups and symmetric spaces, gained importance by that time and influenced their treatment in Clifford Analysis. In our talk we will rely on the generalization of the classical approach to Special Functions through differential equations with respect to the hypercomplex derivative, which is a more recently developed tool in Clifford Analysis. In this context special attention will be paid to the role of Special Functions as an intermediator between continuous and discrete mathematics. This corresponds to a more recent trend in combinatorics, since it has been revealed that many algebraic structures have hidden combinatorial underpinnings.

  1. System data communication structures for active-control transport aircraft, volume 1

    NASA Technical Reports Server (NTRS)

    Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.

    1981-01-01

    Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.

  2. Microbial Genome Analysis and Comparisons: Web-based Protocols and Resources

    USDA-ARS?s Scientific Manuscript database

    Fully annotated genome sequences of many microorganisms are publicly available as a resource. However, in-depth analysis of these genomes using specialized tools is required to derive meaningful information. We describe here the utility of three powerful publicly available genome databases and ana...

  3. Cytoscape: the network visualization tool for GenomeSpace workflows.

    PubMed

    Demchak, Barry; Hull, Tim; Reich, Michael; Liefeld, Ted; Smoot, Michael; Ideker, Trey; Mesirov, Jill P

    2014-01-01

    Modern genomic analysis often requires workflows incorporating multiple best-of-breed tools. GenomeSpace is a web-based visual workbench that combines a selection of these tools with mechanisms that create data flows between them. One such tool is Cytoscape 3, a popular application that enables analysis and visualization of graph-oriented genomic networks. As Cytoscape runs on the desktop, and not in a web browser, integrating it into GenomeSpace required special care in creating a seamless user experience and enabling appropriate data flows. In this paper, we present the design and operation of the Cytoscape GenomeSpace app, which accomplishes this integration, thereby providing critical analysis and visualization functionality for GenomeSpace users. It has been downloaded over 850 times since the release of its first version in September, 2013.

  4. Cytoscape: the network visualization tool for GenomeSpace workflows

    PubMed Central

    Demchak, Barry; Hull, Tim; Reich, Michael; Liefeld, Ted; Smoot, Michael; Ideker, Trey; Mesirov, Jill P.

    2014-01-01

    Modern genomic analysis often requires workflows incorporating multiple best-of-breed tools. GenomeSpace is a web-based visual workbench that combines a selection of these tools with mechanisms that create data flows between them. One such tool is Cytoscape 3, a popular application that enables analysis and visualization of graph-oriented genomic networks. As Cytoscape runs on the desktop, and not in a web browser, integrating it into GenomeSpace required special care in creating a seamless user experience and enabling appropriate data flows. In this paper, we present the design and operation of the Cytoscape GenomeSpace app, which accomplishes this integration, thereby providing critical analysis and visualization functionality for GenomeSpace users. It has been downloaded over 850 times since the release of its first version in September, 2013. PMID:25165537

  5. Transonic CFD applications at Boeing

    NASA Technical Reports Server (NTRS)

    Tinoco, E. N.

    1989-01-01

    The use of computational methods for three dimensional transonic flow design and analysis at the Boeing Company is presented. A range of computational tools is described, consisting of production tools for everyday use by project engineers, expert user tools for special applications by computational researchers, and an emerging tool which may see considerable use in the near future. These methods include full potential and Euler solvers, some coupled to three dimensional boundary layer analysis methods, for transonic flow analysis about nacelle, wing-body, wing-body-strut-nacelle, and complete aircraft configurations. As the examples presented show, such a toolbox of codes is necessary for the variety of applications typical of an industrial environment. Such a toolbox of codes makes possible aerodynamic advances not previously achievable in a timely manner, if at all.

  6. Interactive visualization to advance earthquake simulation

    USGS Publications Warehouse

    Kellogg, L.H.; Bawden, G.W.; Bernardin, T.; Billen, M.; Cowgill, E.; Hamann, B.; Jadamec, M.; Kreylos, O.; Staadt, O.; Sumner, D.

    2008-01-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. Virtual mapping tools allow virtual "field studies" in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations. © Birkhäuser 2008.

  7. Making Space for Specialized Astronomy Resources

    NASA Astrophysics Data System (ADS)

    MacMillan, D.

    2007-10-01

    With the growth of both free and subscription-based resources, articles on astronomy have never been easier to find. Locating the best and most current materials for any given search, however, now requires multiple tools and strategies dependent on the query. An analysis of the tools currently available shows that while astronomy is well-served by Google Scholar, Scopus and Inspec, its literature is best accessed through specialized resources such as ADS (Astrophysics Data System). While no surprise to astronomers, this has major implications for those of us who teach information literacy skills to astronomy students and work in academic settings where astronomy is just one of many subjects for which our non-specialist colleagues at the reference desk provide assistance. This paper will examine some of the implications of this analysis for library instruction, reference assistance and training, and library webpage development.

  8. PREFACE: Anti-counterfeit Image Analysis Methods (A Special Session of ICSXII)

    NASA Astrophysics Data System (ADS)

    Javidi, B.; Fournel, T.

    2007-06-01

    The International Congress for Stereology is dedicated to theoretical and applied aspects of stochastic tools, image analysis and mathematical morphology. A special emphasis on `anti-counterfeit image analysis methods' has been given this year for the XIIth edition (ICSXII). Facing the economic and social threat of counterfeiting, this devoted session presents recent advances and original solutions in the field. A first group of methods are related to marks located either on the product (physical marks) or on the data (hidden information) to be protected. These methods concern laser fs 3D encoding and source separation for machine-readable identification, moiré and `guilloche' engraving for visual verification and watermarking. Machine-readable travel documents are well-suited examples introducing the second group of methods which are related to cryptography. Used in passports for data authentication and identification (of people), cryptography provides some powerful tools. Opto-digital processing allows some efficient implementations described in the papers and promising applications. We would like to thank the reviewers who have contributed to a session of high quality, and the authors for their fine and hard work. We would like to address some special thanks to the invited lecturers, namely Professor Roger Hersch and Dr Isaac Amidror for their survey of moiré methods, Prof. Serge Vaudenay for his survey of existing protocols concerning machine-readable travel documents, and Dr Elisabet Pérez-Cabré for her presentation on optical encryption for multifactor authentication. We also thank Professor Dominique Jeulin, President of the International Society for Stereology, Professor Michel Jourlin, President of the organizing committee of ICSXII, for their help and advice, and Mr Graham Douglas, the Publisher of Journal of Physics: Conference Series at IOP Publishing, for his efficiency. 
We hope that this collection of papers will be useful as a tool to further develop a very important field. Bahram Javidi University of Connecticut (USA) Thierry Fournel University of Saint-Etienne (France) Chairs of the special session on `Anti-counterfeit image analysis methods', July 2007

  9. Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS

    NASA Astrophysics Data System (ADS)

    Joshi, D. M.; Patel, H. K.

    2015-10-01

    Cryogenic engineering deals with the development and improvement of low temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, for the design, analysis, and optimization of process plants, has features that accommodate the special requirements of cryogenics and therefore can be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the values that maximize the liquefaction of the plant subject to constraints on the other parameters. The resulting analysis gives a clear idea of the parameter values to choose before the actual plant is implemented in the field. It also indicates the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
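    As context (a standard textbook relation for the ideal Linde-Hampson cycle, not stated in the abstract), the liquid yield follows from an energy balance around the heat exchanger, throttling valve and separator:

```latex
y \;=\; \frac{\dot m_f}{\dot m} \;=\; \frac{h_1 - h_2}{h_1 - h_f},
```

    where h_1 and h_2 are the enthalpies of the low- and high-pressure streams at the warm end of the heat exchanger and h_f is the saturated-liquid enthalpy; a simulator such as Aspen HYSYS supplies these enthalpies from its property packages, which is what makes this kind of optimization practical.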

  10. Some applications of mathematics in theoretical physics - A review

    NASA Astrophysics Data System (ADS)

    Bora, Kalpana

    2016-06-01

    Mathematics is a very beautiful subject and very much an indispensable tool for Physics, more so for Theoretical Physics (by which we mean here mainly Field Theory and High Energy Physics). These branches of Physics are based on Quantum Mechanics and the Special Theory of Relativity, and many mathematical concepts are used in them. In this work, we shall elaborate on only some of them, such as differential geometry, infinite series, Mellin transforms, Fourier and integral transforms, special functions, calculus, complex algebra, topology, group theory, Riemannian geometry, functional analysis, linear algebra, operator algebra, etc. We shall also present some physics issues where these mathematical tools are used. It is not wrong to say that Mathematics is such a powerful tool that without it there cannot be any theory of Physics! A brief review of our research work is also presented.

  11. Pathology economic model tool: a novel approach to workflow and budget cost analysis in an anatomic pathology laboratory.

    PubMed

    Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens

    2010-08-01

    The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.

  12. Using Microsoft PowerPoint as an Astronomical Image Analysis Tool

    NASA Astrophysics Data System (ADS)

    Beck-Winchatz, Bernhard

    2006-12-01

    Engaging students in the analysis of authentic scientific data is an effective way to teach them about the scientific process and to develop their problem solving, teamwork and communication skills. In astronomy, several image processing and analysis software tools have been developed for use in school environments. However, the practical implementation in the classroom is often difficult because the teachers may not have the comfort level with computers necessary to install and use these tools, they may not have adequate computer privileges and/or support, and they may not have the time to learn how to use specialized astronomy software. To address this problem, we have developed a set of activities in which students analyze astronomical images using basic tools provided in PowerPoint. These include measuring sizes, distances, and angles, and blinking images. In contrast to specialized software, PowerPoint is broadly available on school computers. Many teachers are already familiar with PowerPoint, and the skills developed while learning how to analyze astronomical images are highly transferable. We will discuss several practical examples of measurements, including the following:
    - Variations in the distances to the sun and moon from their angular sizes
    - Magnetic declination from images of shadows
    - Diameter of the moon from lunar eclipse images
    - Sizes of lunar craters
    - Orbital radii of the Jovian moons and mass of Jupiter
    - Supernova and comet searches
    - Expansion rate of the universe from images of distant galaxies
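    As a hypothetical illustration of the first measurement above (the function name and numbers are illustrative, not taken from the paper), the small-angle relation distance ≈ diameter / angular size (in radians) recovers the Earth-Moon distance from an on-screen size measurement:

```python
import math

def distance_from_angular_size(diameter_km, angular_size_deg):
    """Small-angle approximation: distance = diameter / angular size (radians)."""
    return diameter_km / math.radians(angular_size_deg)

# Moon: true diameter ~3474 km, apparent angular size ~0.52 degrees
d = distance_from_angular_size(3474, 0.52)
print(f"{d:.0f} km")  # about 3.8e5 km, close to the mean Earth-Moon distance
```

    Students measure only the angular size on screen (calibrated against the image's field of view); the same relation then yields the variation in the Moon's distance over a month from the variation in its apparent size.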

  13. COVD-QOL questionnaire: An adaptation for school vision screening using Rasch analysis

    PubMed Central

    Abu Bakar, Nurul Farhana; Ai Hong, Chen; Pik Pin, Goh

    2012-01-01

    Purpose: To adapt the College of Optometrists in Vision Development (COVD-QOL) questionnaire as a vision screening tool for primary school children. Methods: An interview session was conducted with children, teachers, or guardians regarding the visual symptoms of 88 children (45 from special education classes and 43 from mainstream classes) in government primary schools. Data were assessed for response categories, item fit (infit/outfit: 0.6–1.4), and separation reliability (item/person: 0.80). The COVD-QOL questionnaire results were compared with a vision assessment in identifying three categories of vision disorders: reduced visual acuity, accommodative response anomaly, and convergence insufficiency. Screening performance of the simplified version of the questionnaire was evaluated using receiver-operating characteristic analysis for detection of any of the target conditions in both types of classes. Predictive validity was assessed using a Spearman rank correlation (criterion >0.3). Results: Two of the response categories were underutilized and were therefore collapsed into the adjacent category, and the items were reduced to 14. Item separation reliability for the simplified version of the questionnaire was acceptable (0.86), but person separation reliability was inadequate for special education classes (0.79), as it was for mainstream classes (0.78). The discriminant cut-off scores of 9 (mainstream classes) and 3 (special education classes) on the 14 items provided sensitivity and specificity of 65% and 54%, and 78% and 80%, with Spearman rank correlations of 0.16 and 0.40, respectively. Conclusion: The simplified version of the COVD-QOL questionnaire (14 items) performs adequately among children in special education classes, suggesting its suitability as a vision screening tool.
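
    The cut-off evaluation can be illustrated with a toy calculation (the scores below are invented, not the study's data): a symptom score at or above the cut-off counts as screening positive, and sensitivity and specificity follow from comparison with the reference vision assessment:

    ```python
    def screen_performance(scores, disordered, cutoff):
        """Sensitivity and specificity of a questionnaire cut-off.
        scores: symptom scores; disordered: True where the reference vision
        assessment found a disorder. Score >= cutoff is screening positive."""
        tp = sum(s >= cutoff and d for s, d in zip(scores, disordered))
        fn = sum(s < cutoff and d for s, d in zip(scores, disordered))
        tn = sum(s < cutoff and not d for s, d in zip(scores, disordered))
        fp = sum(s >= cutoff and not d for s, d in zip(scores, disordered))
        return tp / (tp + fn), tn / (tn + fp)

    # Toy data: 4 children with disorders, 4 without, cut-off score of 9
    sens, spec = screen_performance([10, 9, 4, 12, 2, 8, 1, 3],
                                    [True, True, True, True,
                                     False, False, False, False],
                                    cutoff=9)
    print(sens, spec)  # 0.75 1.0
    ```

    Sweeping the cut-off over the score range and plotting sensitivity against (1 - specificity) produces the receiver-operating characteristic curve used in the study.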

  14. An admissions system to select veterinary medical students with an interest in food animals and veterinary public health.

    PubMed

    Haarhuis, Jan C M; Muijtjens, Arno M M; Scherpbier, Albert J J A; van Beukelen, Peter

    2009-01-01

    Interest in the areas of food animals (FA) and veterinary public health (VPH) appears to be declining among prospective students of veterinary medicine. To address the expected shortage of veterinarians in these areas, the Utrecht Faculty of Veterinary Medicine has developed an admissions procedure to select undergraduates whose aptitude and interests are suited to these areas. A study using expert meetings, open interviews, and document analysis identified personal characteristics that distinguished veterinarians working in the areas of FA and VPH from their colleagues who specialized in companion animals (CA) and equine medicine (E). The outcomes were used to create a written selection tool. We validated this tool in a study among undergraduate veterinary students in their final (sixth) year before graduation. The applicability of the tool was verified in a study among first-year students who had opted to pursue either FA/VPH or CA/E. The tool revealed statistically significant differences with acceptable effect sizes between the two student groups. Because the written selection tool did not cover all of the differences between the veterinarians who specialized in FA/VPH and those who specialized in CA/E, we developed a prestructured panel interview and added it to the questionnaire. The evaluation of the written component showed that it was suitable for selecting those students who were most likely to succeed in the FA/VPH track.

  15. The Undergraduate Biomechanics Experience at Iowa State University.

    ERIC Educational Resources Information Center

    Francis, Peter R.

    This paper discusses the objectives of a program in biomechanics--the analysis of sports skills and movement--and the evolution of the biomechanics program at Iowa State University. The primary objective of such a course is to provide the student with the basic tools necessary for adequate analysis of human movement, with special emphasis upon…

  16. Pyrolysis kinetics and combustion of thin wood using advanced cone calorimetry test method

    Treesearch

    Mark A. Dietenberger

    2011-01-01

    Mechanistic pyrolysis kinetics analysis of extractives, holocellulose, and lignin in solid wood over the entire heating regime was possible using a specialized cone calorimeter test and new mathematical analysis tools. Added hardware components include: a modified sample holder for thin specimens with tiny thermocouples, a methane ring burner with stainless steel mesh above cone...

  17. Microcomputers, Software and Foreign Languages for Special Purposes: An Analysis of TXTPRO.

    ERIC Educational Resources Information Center

    Tang, Michael S.

    TXTPRO, a computer program developed as a graduate-level research tool for descriptive linguistic analysis, produces simple alphabetic and word frequency lists, analyzes word combinations, and develops concordances. With modifications, a teacher could enter the program into a mainframe or a microcomputer and use it for text analyses to develop…

  18. Computing tools for implementing standards for single-case designs.

    PubMed

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were found to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.

  19. 47 CFR 73.9007 - Robustness requirements for covered demodulator products.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... RADIO SERVICES RADIO BROADCAST SERVICES Digital Broadcast Television Redistribution Control § 73.9007...-available tools or equipment also means specialized electronic tools or software tools that are widely... requirements set forth in this subpart. Such specialized electronic tools or software tools includes, but is...

  20. Interactive Visualization to Advance Earthquake Simulation

    NASA Astrophysics Data System (ADS)

    Kellogg, Louise H.; Bawden, Gerald W.; Bernardin, Tony; Billen, Magali; Cowgill, Eric; Hamann, Bernd; Jadamec, Margarete; Kreylos, Oliver; Staadt, Oliver; Sumner, Dawn

    2008-04-01

    The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. For example, simulations of earthquake-related processes typically generate complex, time-varying data sets in two or more dimensions. To facilitate interpretation and analysis of these data sets, evaluate the underlying models, and drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth’s surface and interior. Virtual mapping tools allow virtual “field studies” in inaccessible regions. Interactive tools allow us to manipulate shapes in order to construct models of geological features for geodynamic models, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulation or field observations, thereby enabling us to improve our interpretation of the dynamical processes that drive earthquakes. VR has traditionally been used primarily as a presentation tool, albeit with active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for scientific analysis requires building on the method’s strengths, that is, using both 3D perception and interaction with observed or simulated data. This approach also takes advantage of the specialized skills of geological scientists, who are trained to interpret the often limited geological and geophysical data available from field observations.

  1. Remote sensing change detection tools for natural resource managers: Understanding concepts and tradeoffs in the design of landscape monitoring projects

    Treesearch

    Robert E. Kennedy; Philip A. Townsend; John E. Gross; Warren B. Cohen; Paul Bolstad; Wang Y. Q.; Phyllis Adams

    2009-01-01

    Remote sensing provides a broad view of landscapes and can be consistent through time, making it an important tool for monitoring and managing protected areas. An impediment to broader use of remote sensing science for monitoring has been the need for resource managers to understand the specialized capabilities of an ever-expanding array of image sources and analysis...

  2. The PhytoClust tool for metabolic gene clusters discovery in plant genomes

    PubMed Central

    Fuchs, Lisa-Maria

    2017-01-01

    Abstract The existence of Metabolic Gene Clusters (MGCs) in plant genomes has recently attracted increasing interest. Thus far, MGCs were commonly identified for pathways of specialized metabolism, mostly those associated with terpene type products. For efficient identification of novel MGCs, computational approaches are essential. Here, we present PhytoClust, a tool for the detection of candidate MGCs in plant genomes. The algorithm employs a collection of enzyme families related to plant specialized metabolism, translated into hidden Markov models, to mine given genome sequences for physically co-localized metabolic enzymes. Our tool accurately identifies previously characterized plant MGCs. An exhaustive search of 31 plant genomes detected 1232 and 5531 putative gene cluster types and candidates, respectively. Clustering analysis of putative MGC types by species reflected plant taxonomy. Furthermore, enrichment analysis revealed taxa- and species-specific enrichment of certain enzyme families in MGCs. When operating through our web-interface, PhytoClust users can mine a genome either based on a list of known cluster types or by defining new cluster rules. Moreover, for selected plant species, the output can be complemented by co-expression analysis. Altogether, we envisage PhytoClust to enhance novel MGCs discovery which will in turn impact the exploration of plant metabolism. PMID:28486689

  3. The PhytoClust tool for metabolic gene clusters discovery in plant genomes.

    PubMed

    Töpfer, Nadine; Fuchs, Lisa-Maria; Aharoni, Asaph

    2017-07-07

    The existence of Metabolic Gene Clusters (MGCs) in plant genomes has recently attracted increasing interest. Thus far, MGCs were commonly identified for pathways of specialized metabolism, mostly those associated with terpene type products. For efficient identification of novel MGCs, computational approaches are essential. Here, we present PhytoClust, a tool for the detection of candidate MGCs in plant genomes. The algorithm employs a collection of enzyme families related to plant specialized metabolism, translated into hidden Markov models, to mine given genome sequences for physically co-localized metabolic enzymes. Our tool accurately identifies previously characterized plant MGCs. An exhaustive search of 31 plant genomes detected 1232 and 5531 putative gene cluster types and candidates, respectively. Clustering analysis of putative MGC types by species reflected plant taxonomy. Furthermore, enrichment analysis revealed taxa- and species-specific enrichment of certain enzyme families in MGCs. When operating through our web-interface, PhytoClust users can mine a genome either based on a list of known cluster types or by defining new cluster rules. Moreover, for selected plant species, the output can be complemented by co-expression analysis. Altogether, we envisage PhytoClust to enhance novel MGCs discovery which will in turn impact the exploration of plant metabolism. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  4. User Instructions for the Policy Analysis Modeling System (PAMS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNeil, Michael A.; Letschert, Virginie E.; Van Buskirk, Robert D.

    PAMS uses country-specific and product-specific data to calculate estimates of the impacts of a Minimum Efficiency Performance Standard (MEPS) program. The analysis tool is self-contained in a Microsoft Excel spreadsheet and requires no links to external data or special code additions to run. The analysis can be customized to a particular program without additional user input, through the use of the pull-down menus located on the Summary page. In addition, the spreadsheet contains many areas into which user-generated input data can be entered for increased accuracy of projection. The following is a step-by-step guide for using and customizing the tool.

  5. TACIT: An open-source text analysis, crawling, and interpretation tool.

    PubMed

    Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra

    2017-04-01

    As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.

  6. Pyrolysis kinetics and combustion of thin wood by an advanced cone caorimetry test method

    Treesearch

    Mark Dietenberger

    2012-01-01

    Pyrolysis kinetics analysis of extractives, holocellulose, and lignin in solid redwood over the entire heating regime was possible using a specialized cone calorimeter test and new mathematical analysis tools. Added hardware components include: a modified sample holder for the thin specimen with tiny thermocouples, the methane ring burner with stainless-steel mesh above...

  7. Challenges of working with FIADB17 data: the SOLE experience

    Treesearch

    Michael Spinney; Paul Van Deusen

    2007-01-01

    The Southern On Line Estimator (SOLE) is an Internet-based Forest Inventory and Analysis (FIA) data analysis tool. SOLE is based on data downloaded from the publicly available FIA database (FIADB) and summarized by plot condition. The tasks of downloading, processing, and summarizing FIADB data require specialized expertise in inventory theory and data manipulation....

  8. Video analysis of projectile motion using tablet computers as experimental tools

    NASA Astrophysics Data System (ADS)

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and g in order to explore the underlying laws of motion. This experiment can easily be performed by students themselves, providing more autonomy in their problem-solving processes than traditional learning approaches. We believe that this autonomy and the authenticity of the experimental tool both foster their motivation.

  9. Food Safety Practices Assessment Tool: An Innovative Way to Test Food Safety Skills among Individuals with Special Needs

    ERIC Educational Resources Information Center

    Carbone, Elena T.; Scarpati, Stanley E.; Pivarnik, Lori F.

    2013-01-01

    This article describes an innovative assessment tool designed to evaluate the effectiveness of a food safety skills curriculum for learners receiving special education services. As schools respond to the increased demand for training students with special needs about food safety, the need for effective curricula and tools is also increasing. A…

  10. 48 CFR 31.205-40 - Special tooling and special test equipment costs.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Special tooling and special test equipment costs. 31.205-40 Section 31.205-40 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL CONTRACTING REQUIREMENTS CONTRACT COST PRINCIPLES AND PROCEDURES Contracts With Commercial Organizations 31.205-40...

  11. Bibliometric mapping: eight decades of analytical chemistry, with special focus on the use of mass spectrometry.

    PubMed

    Waaijer, Cathelijn J F; Palmblad, Magnus

    2015-01-01

    In this Feature we use automatic bibliometric mapping tools to visualize the history of analytical chemistry from the 1920s until the present. In particular, we have focused on the application of mass spectrometry in different fields. The analysis shows major shifts in research focus and use of mass spectrometry. We conclude by discussing the application of bibliometric mapping and visualization tools in analytical chemists' research.

  12. Acoustic prediction methods for the NASA generalized advanced propeller analysis system (GAPAS)

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Block, P. J. W.

    1984-01-01

    Classical methods of propeller performance analysis are coupled with state-of-the-art Aircraft Noise Prediction Program (ANOPP) techniques to yield a versatile design tool for novel quiet, efficient propellers: the NASA Generalized Advanced Propeller Analysis System (GAPAS). ANOPP is a collection of modular specialized programs. GAPAS as a whole addresses blade geometry and aerodynamics, rotor performance and loading, and subsonic propeller noise.

  13. Preliminary design review package on air flat plate collector for solar heating and cooling system

    NASA Technical Reports Server (NTRS)

    1977-01-01

    Guidelines to be used in the development and fabrication of a prototype air flat plate collector subsystem containing 320 square feet (ten 4 ft x 8 ft panels) of collector area are presented. Topics discussed include: (1) verification plan; (2) thermal analysis; (3) safety hazard analysis; (4) drawing list; (5) special handling, installation and maintenance tools; (6) structural analysis; and (7) selected drawings.

  14. ReSeqTools: an integrated toolkit for large-scale next-generation sequencing based resequencing analysis.

    PubMed

    He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z

    2013-12-04

    Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.

  15. Microscopy image segmentation tool: Robust image data analysis

    NASA Astrophysics Data System (ADS)

    Valmianski, Ilya; Monton, Carlos; Schuller, Ivan K.

    2014-03-01

    We present a software package called Microscopy Image Segmentation Tool (MIST). MIST is designed for analysis of microscopy images which contain large collections of small regions of interest (ROIs). Originally developed for analysis of porous anodic alumina scanning electron images, MIST capabilities have been expanded to allow use in a large variety of problems including analysis of biological tissue, inorganic and organic film grain structure, as well as nano- and meso-scopic structures. MIST provides a robust segmentation algorithm for the ROIs, includes many useful analysis capabilities, and is highly flexible allowing incorporation of specialized user developed analysis. We describe the unique advantages MIST has over existing analysis software. In addition, we present a number of diverse applications to scanning electron microscopy, atomic force microscopy, magnetic force microscopy, scanning tunneling microscopy, and fluorescent confocal laser scanning microscopy.

  16. Tool for Crimping Flexible Circuit Leads

    NASA Technical Reports Server (NTRS)

    Hulse, Aaron; Diftler, Myron A.

    2009-01-01

    A hand tool has been developed for crimping leads in flexible tails that are parts of some electronic circuits -- especially some sensor circuits. The tool is used to cut the tails to desired lengths and attach solder tabs to the leads. For tailoring small numbers of circuits for special applications, this hand tool is a less expensive alternative to a commercially available automated crimping tool. The crimping tool consists of an off-the-shelf hand crimping tool plus a specialized crimping insert designed specifically for the intended application.

  17. Stone tools from the ancient Tongan state reveal prehistoric interaction centers in the Central Pacific

    PubMed Central

    Clark, Geoffrey R.; Reepmeyer, Christian; Melekiola, Nivaleti; Woodhead, Jon; Dickinson, William R.; Martinsson-Wallin, Helene

    2014-01-01

    Tonga was unique in the prehistoric Pacific for developing a maritime state that integrated the archipelago under a centralized authority and for undertaking long-distance economic and political exchanges in the second millennium A.D. To establish the extent of Tonga’s maritime polity, we geochemically analyzed stone tools excavated from the central places of the ruling paramounts, particularly lithic artifacts associated with stone-faced chiefly tombs. The lithic networks of the Tongan state focused on Samoa and Fiji, with one adze sourced to the Society Islands 2,500 km from Tongatapu. To test the hypothesis that nonlocal lithics were especially valued by Tongan elites and were an important source of political capital, we analyzed prestate lithics from Tongatapu and stone artifacts from Samoa. In the Tongan state, 66% of worked stone tools were long-distance imports, indicating that interarchipelago connections intensified with the development of the Tongan polity after A.D. 1200. In contrast, stone tools found in Samoa were from local sources, including tools associated with a monumental structure contemporary with the Tongan state. Network analysis of lithics entering the Tongan state and of the distribution of Samoan adzes in the Pacific identified a centralized polity and the products of specialized lithic workshops, respectively. These results indicate that a significant consequence of social complexity was the establishment of new types of specialized sites in distant geographic areas. Specialized sites were loci of long-distance interaction and formed important centers for the transmission of information, people, and materials in prehistoric Oceania. PMID:25002481

  18. Stone tools from the ancient Tongan state reveal prehistoric interaction centers in the Central Pacific

    NASA Astrophysics Data System (ADS)

    Clark, Geoffrey R.; Reepmeyer, Christian; Melekiola, Nivaleti; Woodhead, Jon; Dickinson, William R.; Martinsson-Wallin, Helene

    2014-07-01

    Tonga was unique in the prehistoric Pacific for developing a maritime state that integrated the archipelago under a centralized authority and for undertaking long-distance economic and political exchanges in the second millennium A.D. To establish the extent of Tonga's maritime polity, we geochemically analyzed stone tools excavated from the central places of the ruling paramounts, particularly lithic artifacts associated with stone-faced chiefly tombs. The lithic networks of the Tongan state focused on Samoa and Fiji, with one adze sourced to the Society Islands 2,500 km from Tongatapu. To test the hypothesis that nonlocal lithics were especially valued by Tongan elites and were an important source of political capital, we analyzed prestate lithics from Tongatapu and stone artifacts from Samoa. In the Tongan state, 66% of worked stone tools were long-distance imports, indicating that interarchipelago connections intensified with the development of the Tongan polity after A.D. 1200. In contrast, stone tools found in Samoa were from local sources, including tools associated with a monumental structure contemporary with the Tongan state. Network analysis of lithics entering the Tongan state and of the distribution of Samoan adzes in the Pacific identified a centralized polity and the products of specialized lithic workshops, respectively. These results indicate that a significant consequence of social complexity was the establishment of new types of specialized sites in distant geographic areas. Specialized sites were loci of long-distance interaction and formed important centers for the transmission of information, people, and materials in prehistoric Oceania.

  19. Data Albums: An Event Driven Search, Aggregation and Curation Tool for Earth Science

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Kulkarni, Ajinkya; Maskey, Manil; Bakare, Rohan; Basyal, Sabin; Li, Xiang; Flynn, Shannon

    2014-01-01

    Approaches used in Earth science research such as case study analysis and climatology studies involve discovering and gathering diverse data sets and information to support the research goals. Gathering relevant data and information for case studies and climatology analysis is both tedious and time-consuming. Current Earth science data systems are designed with the assumption that researchers access data primarily by instrument or geophysical parameter. In cases where researchers are interested in studying a significant event, they have to manually assemble a variety of datasets relevant to it by searching the different distributed data systems. This paper presents a specialized search, aggregation and curation tool for Earth science to address these challenges. The search tool automatically creates curated 'Data Albums', aggregated collections of information related to a specific event, containing links to relevant data files [granules] from different instruments, tools and services for visualization and analysis, and information about the event contained in news reports, images or videos to supplement research analysis. Curation in the tool is driven via an ontology-based relevancy ranking algorithm to filter out non-relevant information and data.

  20. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
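
    A minimal sketch of the kind of Markov reliability model such tools construct automatically, assuming a hypothetical two-component redundant system with a constant per-component failure rate (the example is illustrative, not the proposed tool's output):

    ```python
    import math

    # States: 2 components working -> 1 working -> system failed.
    # With failure rate lam per component, the 2-working state fails at
    # rate 2*lam, the 1-working state at rate lam.
    def reliability(lam, t, steps=100000):
        """System reliability at time t via explicit Euler steps of the chain."""
        dt = t / steps
        p = [1.0, 0.0, 0.0]  # start with both components working
        for _ in range(steps):
            p2, p1, pf = p
            p = [p2 - 2*lam*p2*dt,
                 p1 + (2*lam*p2 - lam*p1)*dt,
                 pf + lam*p1*dt]
        return p[0] + p[1]  # system survives while any component works

    lam, t = 1e-3, 1000.0
    r = reliability(lam, t)
    analytic = 1 - (1 - math.exp(-lam*t))**2  # closed-form parallel redundancy
    print(round(r, 4), round(analytic, 4))
    ```

    The Euler-stepped chain agrees closely with the closed-form parallel-redundancy reliability, which is the kind of independent check an automatically constructed model makes practical.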

  1. Examining Interrater Agreement Analyses of a Pilot Special Education Observation Tool

    ERIC Educational Resources Information Center

    Johnson, Evelyn S.; Semmelroth, Carrie L.

    2012-01-01

    This paper reports the results of interrater agreement analyses on a pilot special education teacher evaluation instrument, the Recognizing Effective Special Education Teachers (RESET) Observation Tool (OT). Using evidence-based instructional practices as the basis for the evaluation, the RESET OT is designed for the spectrum of different…

  2. Validating an Observation Protocol to Measure Special Education Teacher Effectiveness

    ERIC Educational Resources Information Center

    Johnson, Evelyn S.; Semmelroth, Carrie L.

    2015-01-01

    This study used Kane's (2013) Interpretation/Use Argument (IUA) to measure validity on the Recognizing Effective Special Education Teachers (RESET) observation tool. The RESET observation tool is designed to evaluate special education teacher effectiveness using evidence-based instructional practices as the basis for evaluation. In alignment with…

  4. T.Rex

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-06-08

    T.Rex is used to explore tabular data sets containing up to ten million records to help rapidly understand a previously unknown data set. Analysis can quickly identify patterns of interest and the records and fields that capture those patterns. T.Rex contains a growing set of deep analytical tools, and selected data can be exported to other specialized tools for further analysis. T.Rex is flexible in ingesting different types and formats of data, allowing the user to interactively experiment and make trial-and-error guesses about the structure of the data. It also has a variety of linked visual analytic tools that enable exploration of the data to find relevant content, relationships among content, and trends within the content, and to capture knowledge about the content. Finally, T.Rex has a rich export capability to extract relevant subsets of a larger data source for further analysis in other analytic tools.

  5. NASA Tech Briefs, November/December 1986, Special Edition

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Topics: Computing: The View from NASA Headquarters; Earth Resources Laboratory Applications Software: Versatile Tool for Data Analysis; The Hypercube: Cost-Effective Supercomputing; Artificial Intelligence: Rendezvous with NASA; NASA's Ada Connection; COSMIC: NASA's Software Treasurehouse; Golden Oldies: Tried and True NASA Software; Computer Technical Briefs; NASA TU Services; Digital Fly-by-Wire.

  6. Break-even Analysis: Tool for Budget Planning

    ERIC Educational Resources Information Center

    Lohmann, Roger A.

    1976-01-01

    Multiple funding sources create special management problems for the administrator of a human service agency. This article presents a useful analytic technique adapted from business practice that can help the administrator draw up and balance a unified budget. Such a budget also affords a reliable overview of the agency's financial status. (Author)
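
    The break-even technique the abstract refers to reduces to one relation: break-even volume = fixed costs / (price − unit variable cost). A minimal sketch, with hypothetical agency figures (not from the article):

```python
def break_even_units(fixed_costs, price_per_unit, variable_cost_per_unit):
    """Units of service at which total revenue equals total cost."""
    contribution_margin = price_per_unit - variable_cost_per_unit
    if contribution_margin <= 0:
        raise ValueError("price must exceed variable cost per unit")
    return fixed_costs / contribution_margin

# A hypothetical human-service agency: $60,000 in fixed costs, a $50 fee
# per client session, and $20 in variable cost per session.
units = break_even_units(60_000, 50, 20)
print(units)  # 2000.0 sessions to break even
```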

  7. Parents of Autistic Children and Their Experiences with Assistive Technology

    ERIC Educational Resources Information Center

    Curran, David

    2017-01-01

    Assistive Technology (AT) has become an important tool used by special needs children for improving their quality of life by empowering their abilities, therefore improving their personal independence. The purpose of this Interpretative Phenomenological Analysis (IPA) study was to closely examine the experiences and meaning-making of parents, of…

  8. Special Focus

    PubMed Central

    Nawrocki, Eric P.; Burge, Sarah W.

    2013-01-01

    The development of RNA bioinformatic tools began more than 30 y ago with the description of the Nussinov and Zuker dynamic programming algorithms for single sequence RNA secondary structure prediction. Since then, many tools have been developed for various RNA sequence analysis problems such as homology search, multiple sequence alignment, de novo RNA discovery, read-mapping, and many more. In this issue, we have collected a sampling of reviews and original research that demonstrate some of the many ways bioinformatics is integrated with current RNA biology research. PMID:23948768

  9. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection

    PubMed Central

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-01-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only presented with an abundance of information, but unfortunately also continuously confronted with risks associated with the erroneous copying of another's material. In parallel, Information and Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewer's and the editor's perspective, it is now possible to check the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial on specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software packages. PMID:21487489
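
    As a rough illustration of how text-overlap plagiarism detectors work, here is a generic word n-gram sketch; it is not the method of any of the seven tools compared in the paper, and the sentences are invented:

```python
def ngrams(text, n=3):
    """Set of word n-grams in a text (case-folded)."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_overlap(a, b, n=3):
    """Jaccard similarity of the two texts' n-gram sets: 0 (disjoint) to 1 (identical)."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    if not ga or not gb:
        return 0.0
    return len(ga & gb) / len(ga | gb)

original = "the quick brown fox jumps over the lazy dog near the river"
suspect  = "the quick brown fox jumps over a sleepy dog near the river"
print(round(jaccard_overlap(original, suspect), 3))  # → 0.429
```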

  10. Revisiting Information Technology tools serving authorship and editorship: a case-guided tutorial to statistical analysis and plagiarism detection.

    PubMed

    Bamidis, P D; Lithari, C; Konstantinidis, S T

    2010-12-01

    With the number of scientific papers published in journals, conference proceedings, and international literature ever increasing, authors and reviewers are not only presented with an abundance of information, but unfortunately also continuously confronted with risks associated with the erroneous copying of another's material. In parallel, Information and Communication Technology (ICT) tools provide researchers with novel and continuously more effective ways to analyze and present their work. Software tools for statistical analysis offer scientists the chance to validate their work and enhance the quality of published papers. Moreover, from the reviewer's and the editor's perspective, it is now possible to check the (text-content) originality of a scientific article with automated software tools for plagiarism detection. In this paper, we provide a step-by-step demonstration of two categories of tools, namely, statistical analysis and plagiarism detection. The aim is not to come up with a specific tool recommendation, but rather to provide useful guidelines on the proper use and efficiency of either category of tools. In the context of this special issue, this paper offers a useful tutorial on specific problems concerned with scientific writing and review discourse. A specific neuroscience experimental case example is utilized to illustrate the young researcher's statistical analysis burden, while a test scenario is purpose-built using open access journal articles to exemplify the use and comparative outputs of seven plagiarism detection software packages.

  11. [Influence of surgeon specialization upon the results of colon cancer surgery. Usefulness of propensity scores].

    PubMed

    Martínez-Ramos, D; Escrig-Sos, J; Miralles-Tena, J M; Rivadulla-Serrano, M I; Daroca-José, J M; Salvador Sanchís, J L

    2008-07-01

    Surgeon influence on colorectal cancer surgery outcomes has been repeatedly studied in the scientific literature, but conclusions have been contradictory. Here we study whether surgeon specialization is a determinant factor for outcome in these patients. The importance of propensity scores (PS) in surgical research is also studied. A retrospective study was performed and medical records were reviewed for 236 patients who underwent surgery for colon cancer in Castellon General Hospital (Spain). Cases were divided into two groups (specialist and non-specialist surgeons), and both 5-year survival and disease-free survival were compared. Comparisons were first made with no adjustments, and then using PS analysis. The initial (non-adjusted) analysis was clearly favourable to the specialist surgeon group (5-year survival, 64.3 vs. 79.3%, p = 0.028). After adjusting for PS, no statistical significance was obtained. Surgeon specialization had no significant impact on patient outcome after colon cancer surgery. Propensity score analysis is an important tool in the analysis of surgical non-randomized studies, particularly when the events under scrutiny are rare.
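
    The propensity-score adjustment the abstract describes can be sketched on synthetic data. In the sketch below the propensity is known by construction (in practice it would be estimated, e.g. by logistic regression on patient covariates); all numbers are illustrative, not from the study:

```python
import math
import random

random.seed(7)
n = 4000
rows = []
for _ in range(n):
    severity = random.gauss(0, 1)                       # confounder
    p_specialist = 1 / (1 + math.exp(-1.5 * severity))  # sicker -> specialist
    specialist = random.random() < p_specialist
    # Survival depends on severity only: specialization has NO true effect.
    p_survive = 1 / (1 + math.exp(-(1.0 - 1.2 * severity)))
    rows.append((specialist, random.random() < p_survive, p_specialist))

def rate(group):
    return sum(survived for _, survived, _ in group) / len(group)

treated = [r for r in rows if r[0]]
control = [r for r in rows if not r[0]]
crude = rate(treated) - rate(control)   # biased: specialists see sicker patients

# Stratify on propensity-score quintiles and average the within-stratum
# differences: confounding by severity is largely removed within each stratum.
rows.sort(key=lambda r: r[2])
k = n // 5
diffs = []
for i in range(5):
    stratum = rows[i * k:(i + 1) * k]
    t = [r for r in stratum if r[0]]
    c = [r for r in stratum if not r[0]]
    if t and c:
        diffs.append(rate(t) - rate(c))
adjusted = sum(diffs) / len(diffs)
print(round(crude, 3), round(adjusted, 3))  # adjusted estimate is closer to 0
```

    This mirrors the study's pattern: a crude comparison shows a spurious effect, while the PS-adjusted estimate does not.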

  12. Some applications of mathematics in theoretical physics - A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bora, Kalpana

    2016-06-21

    Mathematics is a very beautiful subject, and very much an indispensable tool for Physics, more so for Theoretical Physics (by which we mean here mainly Field Theory and High Energy Physics). These branches of Physics are based on Quantum Mechanics and the Special Theory of Relativity, and many mathematical concepts are used in them. In this work, we shall elucidate only some of them: differential geometry, infinite series, Mellin transforms, Fourier and integral transforms, special functions, calculus, complex algebra, topology, group theory, Riemannian geometry, functional analysis, linear algebra, operator algebra, etc. We shall also present some physics issues where these mathematical tools are used. It is not wrong to say that Mathematics is such a powerful tool that without it there cannot be any Physics theory! A brief review of our research work is also presented.

  13. A thermal biosensor based on enzyme reaction.

    PubMed

    Zheng, Yi-Hua; Hua, Tse-Chao; Xu, Fei

    2005-01-01

    Application of the thermal biosensor as an analytical tool is promising due to advantages such as universality, simplicity, and quick response. A novel thermal biosensor based on an enzyme reaction has been developed. This biosensor is a flow injection analysis system and consists of two channels with an enzyme reaction column and a reference column. The reference column, which is included to eliminate unspecific heat, is inactive toward the specific enzyme reaction of the ingredient to be detected. The specific enzyme reaction takes place in the enzyme reaction column at a constant temperature maintained by a thermoelectric thermostat. A thermal sensor based on a thermoelectric module containing 127 serial BiTe thermocouples is used to monitor the temperature difference between the two streams from the enzyme reaction column and the reference column. An analytical example for dichlorvos shows that this biosensor can be used as an analytical tool in medicine and biology.

  14. Open source software projects of the caBIG In Vivo Imaging Workspace Software special interest group.

    PubMed

    Prior, Fred W; Erickson, Bradley J; Tarbox, Lawrence

    2007-11-01

    The cancer Biomedical Informatics Grid (caBIG) program was created by the National Cancer Institute to facilitate sharing of IT infrastructure, data, and applications among the National Cancer Institute-sponsored cancer research centers. The program was launched in February 2004 and now links more than 50 cancer centers. In April 2005, the In Vivo Imaging Workspace was added to promote the use of imaging in cancer clinical trials. At the inaugural meeting, four special interest groups (SIGs) were established. The Software SIG was charged with identifying projects that focus on open-source software for image visualization and analysis. To date, two projects have been defined by the Software SIG. The eXtensible Imaging Platform project has produced a rapid application development environment that researchers may use to create targeted workflows customized for specific research projects. The Algorithm Validation Tools project will provide a set of tools and data structures that will be used to capture measurement information and associated metadata needed to allow a gold standard to be defined for the given database against which change analysis algorithms can be tested. Through these and future efforts, the caBIG In Vivo Imaging Workspace Software SIG endeavors to advance imaging informatics and provide new open-source software tools to advance cancer research.

  15. CoryneBase: Corynebacterium Genomic Resources and Analysis Tools at Your Fingertips

    PubMed Central

    Tan, Mui Fern; Jakubovics, Nick S.; Wee, Wei Yee; Mutha, Naresh V. R.; Wong, Guat Jah; Ang, Mia Yang; Yazdi, Amir Hessam; Choo, Siew Woh

    2014-01-01

    Corynebacteria are used for a wide variety of industrial purposes, but some species are associated with human diseases. With an increasing number of corynebacterial genomes having been sequenced, comparative analysis of these strains may provide a better understanding of their biology, phylogeny, virulence, and taxonomy that may lead to the discovery of beneficial industrial strains or contribute to better management of diseases. To facilitate the ongoing research of corynebacteria, a specialized central repository and analysis platform for the corynebacterial research community is needed to host the fast-growing amount of genomic data and facilitate the analysis of these data. Here we present CoryneBase, a genomic database for Corynebacterium with diverse functionality for genome analysis, which aims to provide: (1) annotated genome sequences of Corynebacterium, where 165,918 coding sequences and 4,180 RNAs can be found in 27 species; (2) access to comprehensive Corynebacterium data through the use of advanced web technologies for interactive web interfaces; and (3) advanced bioinformatic analysis tools consisting of standard BLAST for homology search, VFDB BLAST for sequence homology search against the Virulence Factor Database (VFDB), a Pairwise Genome Comparison (PGC) tool for comparative genomic analysis, and a newly designed Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomic analysis. CoryneBase offers access to a range of Corynebacterium genomic resources as well as analysis tools for comparative genomics and pathogenomics. It is publicly available at http://corynebacterium.um.edu.my/. PMID:24466021

  16. Preparing Effective Special Education Teachers. What Works for Special-Needs Learners Series

    ERIC Educational Resources Information Center

    Mamlin, Nancy

    2012-01-01

    What tools are in the toolkit of an excellent special educator, and how can teacher preparation programs provide these tools in the most efficient, effective way possible? This practical, clearly written book is grounded in current research and policy as well as the author's extensive experience as a teacher educator. It identifies what special…

  17. Coherent tools for physics-based simulation and characterization of noise in semiconductor devices oriented to nonlinear microwave circuit CAD

    NASA Astrophysics Data System (ADS)

    Riah, Zoheir; Sommet, Raphael; Nallatamby, Jean C.; Prigent, Michel; Obregon, Juan

    2004-05-01

    We present in this paper a set of coherent tools for noise characterization and physics-based analysis of noise in semiconductor devices. This noise toolbox relies on a low frequency noise measurement setup with special high current capabilities thanks to an accurate and original calibration. It relies also on a simulation tool based on the drift diffusion equations and the linear perturbation theory, associated with the Green's function technique. This physics-based noise simulator has been implemented successfully in the Scilab environment and is specifically dedicated to HBTs. Some results are given and compared to those existing in the literature.

  18. Development of advanced structural analysis methodologies for predicting widespread fatigue damage in aircraft structures

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Starnes, James H., Jr.; Newman, James C., Jr.

    1995-01-01

    NASA is developing a 'tool box' that includes a number of advanced structural analysis computer codes which, taken together, represent the comprehensive fracture mechanics capability required to predict the onset of widespread fatigue damage. These structural analysis tools have complementary and specialized capabilities ranging from a finite-element-based stress-analysis code for two- and three-dimensional built-up structures with cracks to a fatigue and fracture analysis code that uses stress-intensity factors and material-property data found in 'look-up' tables or from equations. NASA is conducting critical experiments necessary to verify the predictive capabilities of the codes, and these tests represent a first step in the technology-validation and industry-acceptance processes. NASA has established cooperative programs with aircraft manufacturers to facilitate the comprehensive transfer of this technology by making these advanced structural analysis codes available to industry.
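
    The fatigue and fracture codes described above build on the standard stress-intensity relation K = Y·σ·√(πa), with the geometry factor Y and material properties coming from look-up tables. A minimal sketch with hypothetical values (not from the NASA codes):

```python
import math

def stress_intensity_factor(stress_mpa, crack_length_m, geometry_factor=1.0):
    """K = Y * sigma * sqrt(pi * a), in MPa*sqrt(m).

    geometry_factor (Y) depends on the crack and structure geometry and
    would normally come from a look-up table or handbook equation.
    """
    return geometry_factor * stress_mpa * math.sqrt(math.pi * crack_length_m)

def is_critical(stress_mpa, crack_length_m, fracture_toughness, Y=1.0):
    """Crack growth becomes unstable when K reaches the material's K_Ic."""
    return stress_intensity_factor(stress_mpa, crack_length_m, Y) >= fracture_toughness

# Hypothetical panel: 100 MPa stress, 5 mm crack, K_Ic of 25 MPa*sqrt(m).
K = stress_intensity_factor(100.0, 0.005)
print(round(K, 2), is_critical(100.0, 0.005, 25.0))  # 12.53 False
```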

  19. APMS: An Integrated Suite of Tools for Measuring Performance and Safety

    NASA Technical Reports Server (NTRS)

    Statler, Irving C.; Lynch, Robert E.; Connors, Mary M. (Technical Monitor)

    1997-01-01

    This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality, and data interchangeability, among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs. 
They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.
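
    The contrast drawn above, special-event counting versus statistical evaluation across groups of flights, can be sketched as follows; the parameter, threshold, and data values are hypothetical, not APMS values:

```python
# Hypothetical flight-data series: touchdown vertical speeds (ft/min)
# for ten flights in a fleet.
touchdown_fpm = [-180, -220, -150, -610, -240, -200, -170, -560, -190, -210]

# Special-event counting (what existing tools emphasize): flag exceedances
# of a fixed threshold on each individual flight.
HARD_LANDING_FPM = -500
events = [v for v in touchdown_fpm if v <= HARD_LANDING_FPM]

# Statistical evaluation across the group of flights (the APMS goal):
# characterize the fleet-wide distribution, not just the outliers.
mean = sum(touchdown_fpm) / len(touchdown_fpm)
var = sum((v - mean) ** 2 for v in touchdown_fpm) / (len(touchdown_fpm) - 1)
std = var ** 0.5
print(len(events), round(mean, 1), round(std, 1))  # 2 -273.0 166.8
```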

  20. The Aviation Performance Measuring System (APMS): An Integrated Suite of Tools for Measuring Performance and Safety

    NASA Technical Reports Server (NTRS)

    Statler, Irving C.; Connor, Mary M. (Technical Monitor)

    1998-01-01

    This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS offers to the air transport community an open, voluntary standard for flight-data-analysis software; a standard that will help to ensure suitable functionality and data interchangeability among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of aircrews in mind. APMS tools must serve the needs of the government and air carriers, as well as aircrews, to fully support the FOQA and AQP programs.
They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but also through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the aircrew.

  1. APMS: An Integrated Suite of Tools for Measuring Performance and Safety

    NASA Technical Reports Server (NTRS)

    Statler, Irving C. (Technical Monitor)

    1997-01-01

    This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality and data interchangeability among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs.
They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.

  2. APMS: An Integrated Set of Tools for Measuring Safety

    NASA Technical Reports Server (NTRS)

    Statler, Irving C.; Reynard, William D. (Technical Monitor)

    1996-01-01

    This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality, and data interchangeability, among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs. 
They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.

  3. Amy Robertson | NREL

    Science.gov Websites

    Amy.Robertson@nrel.gov | 303-384-7157. Amy's expertise is in structural dynamics modeling, verification and validation, and data analysis. At NREL, Amy specializes in the modeling of offshore wind system dynamics and the verification and validation of offshore wind modeling tools. Prior to joining NREL, Amy worked as an independent consultant for…

  4. Open Simulation Laboratories [Guest editors' introduction

    DOE PAGES

    Alexander, Francis J.; Meneveau, Charles

    2015-09-01

    In this introduction to the special issue on open simulation laboratories (OSLs), the guest editors describe how OSLs will become more common as their potential is better understood and as they begin providing access to valuable datasets to much larger segments of the scientific community. Moreover, new analysis tools and new ways to do science will inevitably develop as a result.

  5. Establishing the Validity of the Task-Based English Speaking Test (TBEST) for International Teaching Assistants

    ERIC Educational Resources Information Center

    Witt, Autumn Song

    2010-01-01

    This dissertation follows an oral language assessment tool from initial design and implementation to validity analysis. The specialized variables of this study are the population: international teaching assistants and the purpose: spoken assessment as a hiring prerequisite. However, the process can easily be applied to other populations and…

  6. Children with Disabilities: Constructing Metaphors and Meanings through Art

    ERIC Educational Resources Information Center

    Saldaña, Claudia

    2016-01-01

    The aim of this qualitative study is to explore how art, as a semiotic tool, transforms children with disabilities. To achieve this purpose, one must listen to the voices of teachers and childcare workers in the field of special education. The study's preliminary findings found three main categories through data analysis: 1) Teachers' perceptions…

  7. Wheat crown rot pathogens Fusarium graminearum and F. pseudograminearum lack specialization.

    PubMed

    Chakraborty, Sukumar; Obanor, Friday; Westecott, Rhyannyn; Abeywickrama, Krishanthi

    2010-10-01

    This article reports a lack of pathogenic specialization among Australian Fusarium graminearum and F. pseudograminearum causing crown rot (CR) of wheat using analysis of variance (ANOVA), principal component and biplot analysis, Kendall's coefficient of concordance (W), and κ statistics. Overall, F. pseudograminearum was more aggressive than F. graminearum, supporting earlier delineation of the crown-infecting group as a new species. Although significant wheat line-pathogen isolate interaction in ANOVA suggested putative specialization when seedlings of 60 wheat lines were inoculated with 4 pathogen isolates or 26 wheat lines were inoculated with 10 isolates, significant W and κ showed agreement in rank order of wheat lines, indicating a lack of specialization. The first principal component representing nondifferential aggressiveness explained a large part (up to 65%) of the variation in CR severity. The differential components were small and more pronounced in seedlings than in adult plants. By maximizing variance on the first two principal components, biplots were useful for highlighting the association between isolates and wheat lines. A key finding of this work is that a range of analytical tools are needed to explore pathogenic specialization, and a statistically significant interaction in an ANOVA cannot be taken as conclusive evidence of specialization. With no highly resistant wheat cultivars, Fusarium isolates mostly differ in aggressiveness; however, specialization may appear as more resistant cultivars become widespread.
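
    Cohen's kappa, one of the agreement statistics used in the study above, corrects raw rater agreement for the agreement expected by chance. A minimal two-rater sketch on toy resistance ratings (not the study's data):

```python
def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa: (observed - expected agreement) / (1 - expected)."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    labels = set(ratings_a) | set(ratings_b)
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies.
    expected = sum(
        (ratings_a.count(l) / n) * (ratings_b.count(l) / n) for l in labels
    )
    return (observed - expected) / (1 - expected)

# Two hypothetical raters scoring ten wheat lines R(esistant)/S(usceptible).
a = ["R", "R", "S", "S", "R", "S", "S", "R", "R", "S"]
b = ["R", "R", "S", "S", "S", "S", "S", "R", "R", "R"]
print(round(cohens_kappa(a, b), 2))  # → 0.6
```

    A kappa near 1 indicates agreement in how raters rank the lines; values near 0 indicate chance-level agreement, which in this context would suggest specialization rather than consistent aggressiveness.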

  8. The integration of a LANDSAT analysis capability with a geographic information system

    NASA Technical Reports Server (NTRS)

    Nordstrand, E. A.

    1981-01-01

    The integration of LANDSAT data was achieved through the development of a flexible, compatible analysis tool and the use of an existing data base to select the usable data from a LANDSAT analysis. The software package allows manipulation of grid cell data, plus the flexibility to let the user include FORTRAN statements for special functions. Using this combination of capabilities, the user can classify a LANDSAT image and then selectively merge the results with other data that may exist for the study area.

  9. An evolutionary computation based algorithm for calculating solar differential rotation by automatic tracking of coronal bright points

    NASA Astrophysics Data System (ADS)

    Shahamatnia, Ehsan; Dorotovič, Ivan; Fonseca, Jose M.; Ribeiro, Rita A.

    2016-03-01

    Developing specialized software tools is essential to support studies of solar activity evolution. With new space missions such as the Solar Dynamics Observatory (SDO), solar images are being produced in unprecedented volumes. To capitalize on that huge data availability, the scientific community needs a new generation of software tools for automatic and efficient data processing. In this paper a prototype of a modular framework for solar feature detection, characterization, and tracking is presented. To develop an efficient system capable of automatic solar feature tracking and measuring, a hybrid approach combining specialized image processing, evolutionary optimization, and soft computing algorithms is being followed. The specialized hybrid algorithm for tracking solar features allows automatic feature tracking while gathering characterization details about the tracked features. The hybrid algorithm takes advantage of the snake model, a specialized image processing algorithm widely used in applications such as boundary delineation, image segmentation, and object tracking. Further, it exploits the flexibility and efficiency of Particle Swarm Optimization (PSO), a stochastic population-based optimization algorithm. PSO has been used successfully in a wide range of applications including combinatorial optimization, control, clustering, robotics, scheduling, and image processing and video analysis applications. The proposed tool, denoted PSO-Snake model, was already successfully tested in other works for tracking sunspots and coronal bright points. In this work, we discuss the application of the PSO-Snake algorithm for calculating the sidereal rotational angular velocity of the solar corona. To validate the results, we compare them with published results obtained manually by an expert.
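
    The global-best PSO variant underlying approaches like the PSO-Snake model can be sketched in a few lines; the inertia and acceleration weights below are common defaults, not the paper's values, and the test objective is a simple sphere function rather than a snake energy:

```python
import random

def pso(f, dim=2, particles=20, iters=200, lo=-5.0, hi=5.0, seed=1):
    """Minimal global-best particle swarm optimization (minimizes f)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]              # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm's best so far
    w, c1, c2 = 0.72, 1.5, 1.5               # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, value = pso(lambda p: sum(x * x for x in p))
print(value < 1e-3)  # swarm converges near the minimum at the origin
```

    In a snake-based tracker, f would instead score a candidate contour's fit to the solar feature, and each particle would encode contour parameters.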

  10. Analysis of semantic search within the domains of uncertainty: using Keyword Effectiveness Indexing as an evaluation tool.

    PubMed

    Lorence, Daniel; Abraham, Joanna

    2006-01-01

    Medical and health-related searches pose a special case of risk when using the web as an information resource. Uninsured consumers, lacking access to a trained provider, will often rely on information from the internet for self-diagnosis and treatment. In areas where treatments are uncertain or controversial, most consumers lack the knowledge to make an informed decision. This exploratory technology assessment examines the use of Keyword Effectiveness Indexing (KEI) analysis as a potential tool for profiling information search and keyword retrieval patterns. Results demonstrate that the KEI methodology can be useful in identifying e-health search patterns, but is limited by semantic or text-based web environments.
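KEI profiling of the kind described above can be sketched with the popular popularity-squared-over-competition variant of the index. The formula choice, keyword strings, and counts below are illustrative assumptions, not data or definitions from the study:

```python
def kei(search_volume, competing_pages):
    """Keyword Effectiveness Index: one common formulation squares demand
    (monthly search volume) and divides by supply (competing pages).
    Higher KEI suggests a keyword that is searched often but weakly
    covered. NOTE: this popularity**2 / competition form is an assumed,
    widely used variant; the paper does not fix a single formula here."""
    if competing_pages == 0:
        return float("inf")
    return search_volume ** 2 / competing_pages

keywords = {
    # hypothetical e-health search terms with made-up counts
    "chronic pain treatment": (5400, 2_000_000),
    "off-label gabapentin use": (320, 45_000),
}
ranked = sorted(keywords, key=lambda k: kei(*keywords[k]), reverse=True)
```

Ranking candidate terms this way is how KEI-style analysis profiles which health queries consumers can pursue effectively, which is the retrieval-pattern question the study examines.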

  11. USSR Report, Machine Tools and Metalworking Equipment, No. 6

    DTIC Science & Technology

    1983-05-18

production output per machine tool at a tool plant averages 2-3 times the figures for tool shops. This is explained by the well-known advantages of... specialized production. Specifically, the advantages of standardization and unification of machine-attachment design can be fully exploited in... effective utilization of appropriate special equipment... million thread-cutting dies, and 2.3 million milling cutters. The advantages of

  12. Using the Frailty Assessment for Care Planning Tool (FACT) to screen elderly chronic kidney disease patients for frailty: the nurse experience.

    PubMed

    Moffatt, Heather; Moorhouse, Paige; Mallery, Laurie; Landry, David; Tennankore, Karthik

    2018-01-01

Recent evidence supports the prognostic significance of frailty for functional decline and poor health outcomes in patients with chronic kidney disease. Yet, despite the development of clinical tools to screen for frailty, little is known about the experiential impact of screening for frailty in this setting. The Frailty Assessment for Care Planning Tool (FACT) evaluates frailty across 4 domains: mobility, function, social circumstances, and cognition. The purpose of this qualitative study was as follows: 1) explore the nurse experience of screening for frailty using the FACT tool in a specialized outpatient renal clinic; 2) determine how, if at all, provider perceptions of frailty changed after implementation of the frailty screening tool; and 3) determine the perceived factors that influence uptake and administration of the FACT screening tool in a specialized clinical setting. Semi-structured interviews with 5 nurses from the Nova Scotia Health Authority, Central Zone Renal Clinic were conducted. A grounded theory approach was used to generate thematic categories and analysis models. Four primary themes emerged in the data analysis: "we were skeptical", "we made it work", "we learned how", and "we understand". As the renal nurses gained a sense of confidence in their ability to implement the FACT tool, initial barriers to implementation were attenuated. Implementation factors - such as realistic goals, clear guidelines, and ongoing training - were important for successful uptake of the frailty screening initiative. Nurse participants reported an overall positive experience using the FACT method to screen for frailty and indicated that their understanding of the multiple dimensions and subtleties of "frailty" was enhanced. Future nurse-led FACT screening initiatives should incorporate those factors identified as being integral to program success: realistic goals, clear guidelines, and ongoing training.
Adopting the evaluation of frailty as a priority within clinical departments will encourage sustainability.

  13. SPICE for ESA Planetary Missions: geometry and visualization support to studies, operations and data analysis within your reach

    NASA Astrophysics Data System (ADS)

    Costa, Marc

    2018-05-01

JUICE is a mission chosen in the framework of the Cosmic Vision 2015-2024 program of the SRE. JUICE will survey the Jovian system with a special focus on the three icy Galilean moons. Currently the mission is in its Definition Phase, during which future mission scenarios are being studied by the Science Working Team (SWT). The Mission Analysis and Payload Support (MAPPS) and the Solar System Science Operations Laboratory (SOLab) tools are being used to provide active support to the SWT, in synergy with other operational tools used in the Department, in order to evaluate the feasibility of those scenarios. This contribution will outline the capabilities, synergies, and use cases of these tools, focusing on the support provided during JUICE's study phase to the study of its critical operational scenarios and to the early development of its Science Ground Segment, demonstrating the added value that such tools provide to planetary science missions.

  14. Children's Books about Special Needs Used as a Mediating Tool, The Perceptions of Inclusion Classroom Teachers in Mainstream Schools

    ERIC Educational Resources Information Center

    Lea, Baratz

    2015-01-01

The current study addresses the disparity between the awareness of teachers in special education frameworks regarding the important role of books as a mediating tool and their reticence to use this tool. Twenty-three interviews were conducted in two stages: before and after using the book "Shelley the Hyperactive Turtle" in the…

  15. Grid Generation for Multidisciplinary Design and Optimization of an Aerospace Vehicle: Issues and Challenges

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2000-01-01

    The purpose of this paper is to discuss grid generation issues and to challenge the grid generation community to develop tools suitable for automated multidisciplinary analysis and design optimization of aerospace vehicles. Special attention is given to the grid generation issues of computational fluid dynamics and computational structural mechanics disciplines.

  16. The MicronEye Motion Monitor: A New Tool for Class and Laboratory Demonstrations.

    ERIC Educational Resources Information Center

    Nissan, M.; And Others

    1988-01-01

    Describes a special camera that can be directly linked to a computer that has been adapted for studying movement. Discusses capture, processing, and analysis of two-dimensional data with either IBM PC or Apple II computers. Gives examples of a variety of mechanical tests including pendulum motion, air track, and air table. (CW)

  17. Geoscience data visualization and analysis using GeoMapApp

    NASA Astrophysics Data System (ADS)

    Ferrini, Vicki; Carbotte, Suzanne; Ryan, William; Chan, Samantha

    2013-04-01

    Increased availability of geoscience data resources has resulted in new opportunities for developing visualization and analysis tools that not only promote data integration and synthesis, but also facilitate quantitative cross-disciplinary access to data. Interdisciplinary investigations, in particular, frequently require visualizations and quantitative access to specialized data resources across disciplines, which has historically required specialist knowledge of data formats and software tools. GeoMapApp (www.geomapapp.org) is a free online data visualization and analysis tool that provides direct quantitative access to a wide variety of geoscience data for a broad international interdisciplinary user community. While GeoMapApp provides access to online data resources, it can also be packaged to work offline through the deployment of a small portable hard drive. This mode of operation can be particularly useful during field programs to provide functionality and direct access to data when a network connection is not possible. Hundreds of data sets from a variety of repositories are directly accessible in GeoMapApp, without the need for the user to understand the specifics of file formats or data reduction procedures. Available data include global and regional gridded data, images, as well as tabular and vector datasets. In addition to basic visualization and data discovery functionality, users are provided with simple tools for creating customized maps and visualizations and to quantitatively interrogate data. Specialized data portals with advanced functionality are also provided for power users to further analyze data resources and access underlying component datasets. Users may import and analyze their own geospatial datasets by loading local versions of geospatial data and can access content made available through Web Feature Services (WFS) and Web Map Services (WMS). 
Once data are loaded in GeoMapApp, a variety of options are provided to export data and/or 2D/3D visualizations into common formats including grids, images, text files, spreadsheets, etc. Examples of interdisciplinary investigations that make use of GeoMapApp visualization and analysis functionality will be provided.

  18. Human-like, population-level specialization in the manufacture of pandanus tools by New Caledonian crows Corvus moneduloides.

    PubMed Central

    Hunt, G R

    2000-01-01

    The main way of gaining insight into the behaviour and neurological faculties of our early ancestors is to study artefactual evidence for the making and use of tools, but this places severe constraints on what knowledge can be obtained. New Caledonian crows, however, offer a potential analogous model system for learning about these difficult-to-establish aspects of prehistoric humans. I found new evidence of human-like specialization in crows' manufacture of hook tools from pandanus leaves: functional lateralization or 'handedness' and the shaping of these tools to a rule system. These population-level features are unprecedented in the tool behaviour of free-living non-humans and provide the first demonstration that a population bias for handedness in tool-making and the shaping of tools to rule systems are not concomitant with symbolic thought and language. It is unknown how crows obtain their tool behaviour. Nevertheless, at the least they can be studied in order to learn about the neuropsychology associated with early specialized and/or advanced population features in tool-making such as hook use, handedness and the shaping of tools to rule systems. PMID:10722223

  19. Neandertals made the first specialized bone tools in Europe

    PubMed Central

    Soressi, Marie; McPherron, Shannon P.; Lenoir, Michel; Dogandžić, Tamara; Goldberg, Paul; Jacobs, Zenobia; Maigrot, Yolaine; Martisius, Naomi L.; Miller, Christopher E.; Rendu, William; Richards, Michael; Skinner, Matthew M.; Steele, Teresa E.; Talamo, Sahra; Texier, Jean-Pierre

    2013-01-01

Modern humans replaced Neandertals ∼40,000 y ago. Close to the time of replacement, Neandertals show behaviors similar to those of the modern humans arriving into Europe, including the use of specialized bone tools, body ornaments, and small blades. It is highly debated whether these modern behaviors developed before or as a result of contact with modern humans. Here we report the identification of a type of specialized bone tool, lissoir, previously only associated with modern humans. The microwear preserved on one of these lissoirs is consistent with the use of lissoirs in modern times to obtain supple, lustrous, and more impermeable hides. These tools are from a Neandertal context preceding the replacement period and are the oldest specialized bone tools in Europe. As such, they are either a demonstration of independent invention by Neandertals or an indication that modern humans started influencing European Neandertals much earlier than previously believed. Because these finds clearly predate the oldest known age for the use of similar objects in Europe by anatomically modern humans, they could also be evidence for cultural diffusion from Neandertals to modern humans. PMID:23940333

  20. Object-oriented design of medical imaging software.

    PubMed

    Ligier, Y; Ratib, O; Logean, M; Girard, C; Perrier, R; Scherrer, J R

    1994-01-01

    A special software package for interactive display and manipulation of medical images was developed at the University Hospital of Geneva, as part of a hospital wide Picture Archiving and Communication System (PACS). This software package, called Osiris, was especially designed to be easily usable and adaptable to the needs of noncomputer-oriented physicians. The Osiris software has been developed to allow the visualization of medical images obtained from any imaging modality. It provides generic manipulation tools, processing tools, and analysis tools more specific to clinical applications. This software, based on an object-oriented paradigm, is portable and extensible. Osiris is available on two different operating systems: the Unix X-11/OSF-Motif based workstations, and the Macintosh family.

  1. Tools for surveying and improving the quality of life: people with special needs in focus.

    PubMed

    Hoyningen-Süess, Ursula; Oberholzer, David; Stalder, René; Brügger, Urs

    2012-01-01

This article seeks to describe online tools for surveying and improving quality of life for people with disabilities living in assisted living centers and special education service organizations. Ensuring a decent quality of life for disabled people is an important welfare state goal. Using well-accepted quality of life conceptions, online diagnostic and planning tools were developed during an Institute for Education, University of Zurich, research project. The diagnostic tools measure, evaluate and analyze disabled people's quality of life. The planning tools identify factors that can affect their quality of life and suggest improvements. Instrument validity and reliability have not yet been tested according to standard statistical procedures; this will be done at a more advanced stage of the project. Instead, the tool is developed, refined and adjusted in cooperation with practitioners who are constantly judging it according to best practice standards. The tools support staff in assisted living centers and special education service organizations. These tools offer comprehensive resources for surveying, quantifying, evaluating, describing and simulating quality of life elements.

  2. 48 CFR 245.608-5 - Special items screening.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Disposal of Contractor Inventory 245.608-5 Special items screening. (a) Special test equipment with standard components. (1) The contractor shall report any excess special test equipment (STE) using SF 1432, Inventory Schedule D (Special Tooling and Special Test Equipment). The contractor shall list and describe on...

  3. PROVAT: a tool for Voronoi tessellation analysis of protein structures and complexes.

    PubMed

    Gore, Swanand P; Burke, David F; Blundell, Tom L

    2005-08-01

    Voronoi tessellation has proved to be a useful tool in protein structure analysis. We have developed PROVAT, a versatile public domain software that enables computation and visualization of Voronoi tessellations of proteins and protein complexes. It is a set of Python scripts that integrate freely available specialized software (Qhull, Pymol etc.) into a pipeline. The calculation component of the tool computes Voronoi tessellation of a given protein system in a way described by a user-supplied XML recipe and stores resulting neighbourhood information as text files with various styles. The Python pickle file generated in the process is used by the visualization component, a Pymol plug-in, that offers a GUI to explore the tessellation visually. PROVAT source code can be downloaded from http://raven.bioc.cam.ac.uk/~swanand/Provat1, which also provides a webserver for its calculation component, documentation and examples.
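The core neighbourhood computation behind tools like PROVAT can be sketched in a few lines: two sites are "contacts" when their Voronoi cells share a facet. This is a generic illustration only, assuming 2D toy coordinates and SciPy's Qhull wrapper rather than PROVAT's actual Qhull/Pymol pipeline:

```python
# Sketch of the Voronoi-neighbourhood idea (not PROVAT's code): derive
# contact relations from the ridges of a Voronoi tessellation.
import numpy as np
from scipy.spatial import Voronoi

# toy "atom" coordinates: four corners and one interior point
points = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0],
                   [2.0, 2.0], [1.0, 0.9]])
vor = Voronoi(points)

neighbours = {i: set() for i in range(len(points))}
for a, b in vor.ridge_points:   # each ridge separates exactly two input points
    neighbours[int(a)].add(int(b))
    neighbours[int(b)].add(int(a))
```

In PROVAT the same neighbourhood information, computed in 3D over atoms or residues, is what gets serialized to text files and to the Python pickle consumed by the Pymol visualization plug-in.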

  4. DNA marker technology for wildlife conservation

    PubMed Central

    Arif, Ibrahim A.; Khan, Haseeb A.; Bahkali, Ali H.; Al Homaidan, Ali A.; Al Farhan, Ahmad H.; Al Sadoon, Mohammad; Shobrak, Mohammad

    2011-01-01

    Use of molecular markers for identification of protected species offers a greater promise in the field of conservation biology. The information on genetic diversity of wildlife is necessary to ascertain the genetically deteriorated populations so that better management plans can be established for their conservation. Accurate classification of these threatened species allows understanding of the species biology and identification of distinct populations that should be managed with utmost care. Molecular markers are versatile tools for identification of populations with genetic crisis by comparing genetic diversities that in turn helps to resolve taxonomic uncertainties and to establish management units within species. The genetic marker analysis also provides sensitive and useful tools for prevention of illegal hunting and poaching and for more effective implementation of the laws for protection of the endangered species. This review summarizes various tools of DNA markers technology for application in molecular diversity analysis with special emphasis on wildlife conservation. PMID:23961128

  5. YersiniaBase: a genomic resource and analysis platform for comparative analysis of Yersinia.

    PubMed

    Tan, Shi Yang; Dutta, Avirup; Jakubovics, Nicholas S; Ang, Mia Yang; Siow, Cheuk Chuen; Mutha, Naresh Vr; Heydari, Hamed; Wee, Wei Yee; Wong, Guat Jah; Choo, Siew Woh

    2015-01-16

Yersinia is a genus of Gram-negative bacteria that includes serious pathogens such as Yersinia pestis, which causes plague, as well as Yersinia pseudotuberculosis and Yersinia enterocolitica. The remaining species are generally considered non-pathogenic to humans, although there is evidence that at least some of these species can cause occasional infections using distinct mechanisms from the more pathogenic species. With the advances in sequencing technologies, many genomes of Yersinia have been sequenced. However, there is currently no specialized platform to hold the rapidly growing Yersinia genomic data and to provide analysis tools, particularly for comparative analyses, which are required to provide improved insights into their biology, evolution and pathogenicity. To facilitate the ongoing and future research of Yersinia, especially the generally considered non-pathogenic species, a well-defined repository and analysis platform is needed to hold the Yersinia genomic data and analysis tools for the Yersinia research community. Hence, we have developed YersiniaBase, a robust and user-friendly Yersinia resource and analysis platform for the analysis of Yersinia genomic data. YersiniaBase has a total of twelve species and 232 genome sequences, of which the majority are Yersinia pestis. In order to smooth the process of searching genomic data in a large database, we implemented an Asynchronous JavaScript and XML (AJAX)-based real-time searching system in YersiniaBase. Besides incorporating existing tools, which include the JavaScript-based genome browser (JBrowse) and the Basic Local Alignment Search Tool (BLAST), YersiniaBase also has in-house developed tools: (1) Pairwise Genome Comparison tool (PGC) for comparing two user-selected genomes; (2) Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomics analysis of Yersinia genomes; (3) YersiniaTree for constructing phylogenetic trees of Yersinia.
We ran analyses based on the tools and genomic data in YersiniaBase and the preliminary results showed differences in virulence genes found in Yersinia pestis and Yersinia pseudotuberculosis compared to other Yersinia species, and differences between Yersinia enterocolitica subsp. enterocolitica and Yersinia enterocolitica subsp. palearctica. YersiniaBase offers free access to wide range of genomic data and analysis tools for the analysis of Yersinia. YersiniaBase can be accessed at http://yersinia.um.edu.my .

  6. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4

    NASA Astrophysics Data System (ADS)

    Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.

    2015-02-01

    The System for Automated Geoscientific Analyses (SAGA) is an open-source Geographic Information System (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modular organized software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, an easily approachable graphical user interface with many visualization options, a command line interpreter, and interfaces to scripting and low level programming languages like R and Python. The current version 2.1.4 offers more than 700 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Further, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies with special emphasis on the core application areas digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.

  7. System for Automated Geoscientific Analyses (SAGA) v. 2.1.4

    NASA Astrophysics Data System (ADS)

    Conrad, O.; Bechtel, B.; Bock, M.; Dietrich, H.; Fischer, E.; Gerlitz, L.; Wehberg, J.; Wichmann, V.; Böhner, J.

    2015-07-01

    The System for Automated Geoscientific Analyses (SAGA) is an open source geographic information system (GIS), mainly licensed under the GNU General Public License. Since its first release in 2004, SAGA has rapidly developed from a specialized tool for digital terrain analysis to a comprehensive and globally established GIS platform for scientific analysis and modeling. SAGA is coded in C++ in an object oriented design and runs under several operating systems including Windows and Linux. Key functional features of the modular software architecture comprise an application programming interface for the development and implementation of new geoscientific methods, a user friendly graphical user interface with many visualization options, a command line interpreter, and interfaces to interpreted languages like R and Python. The current version 2.1.4 offers more than 600 tools, which are implemented in dynamically loadable libraries or shared objects and represent the broad scopes of SAGA in numerous fields of geoscientific endeavor and beyond. In this paper, we inform about the system's architecture, functionality, and its current state of development and implementation. Furthermore, we highlight the wide spectrum of scientific applications of SAGA in a review of published studies, with special emphasis on the core application areas digital terrain analysis, geomorphology, soil science, climatology and meteorology, as well as remote sensing.

  8. 14 CFR 147.19 - Materials, special tools, and shop equipment requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., DEPARTMENT OF TRANSPORTATION (CONTINUED) SCHOOLS AND OTHER CERTIFICATED AGENCIES AVIATION MAINTENANCE TECHNICIAN SCHOOLS Certification Requirements § 147.19 Materials, special tools, and shop equipment requirements. An applicant for an aviation maintenance technician school certificate and rating, or for an...

  9. Googling DNA sequences on the World Wide Web.

    PubMed

    Hajibabaei, Mehrdad; Singer, Gregory A C

    2009-11-10

New web-based technologies provide an excellent opportunity for sharing and accessing information and for using the web as a platform for interaction and collaboration. Although several specialized tools are available for analyzing DNA sequence information, conventional web-based tools have not been utilized for bioinformatics applications. We have developed a novel algorithm and implemented it for searching species-specific genomic sequences, DNA barcodes, by using popular web-based methods such as Google. We developed an alignment-independent, character-based algorithm based on dividing a sequence library (DNA barcodes) and a query sequence into words. The actual search is conducted by conventional search tools such as the freely available Google Desktop Search. We implemented our algorithm in two exemplar packages. We developed pre- and post-processing software to provide customized input and output services, respectively. Our analysis of all publicly available DNA barcode sequences shows high accuracy as well as rapid results. Our method makes use of conventional web-based technologies for specialized genetic data. It provides a robust and efficient solution for sequence search on the web. The integration of our search method for large-scale sequence libraries such as DNA barcodes provides an excellent web-based tool for accessing this information and linking it to other available categories of information on the web.
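The word-based decomposition described above can be sketched as follows. The word length, function names, and toy sequences are illustrative assumptions; the authors' packages delegate the actual matching to a text search engine such as Google Desktop Search rather than scoring in Python:

```python
# Sketch of an alignment-free, word-based barcode lookup: sequences are
# decomposed into overlapping fixed-length words, and a query matches a
# library entry by word overlap -- the kind of token matching a text
# search engine performs. (Illustrative sketch, not the authors' code.)

def words(seq, k=8):
    """Overlapping k-letter words of a DNA sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def best_match(query, library, k=8):
    """Return the library entry sharing the most k-words with the query."""
    qwords = words(query, k)
    scores = {name: len(qwords & words(seq, k)) for name, seq in library.items()}
    return max(scores, key=scores.get)

library = {  # toy barcode library with made-up sequences
    "species_A": "ATGGCATTACGGATCCGTTAACGGT",
    "species_B": "TTGACCGGTATCCGAACTGGATCCA",
}
query = "GGCATTACGGATCCGTTAAC"  # fragment of species_A's toy barcode
```

Because matching is set intersection on words rather than alignment, the lookup needs no position-by-position comparison, which is what makes it compatible with off-the-shelf text indexing.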

  10. Steinberg ``AUDIOMAPS'' Music Appreciation-Via-Understanding: Special-Relativity + Expectations ``Quantum-Theory'': a Quantum-ACOUSTO/MUSICO-Dynamics (QA/MD)

    NASA Astrophysics Data System (ADS)

    Fender, Lee; Steinberg, Russell; Siegel, Edward Carl-Ludwig

    2011-03-01

Steinberg's wildly popular "AUDIOMAPS" music enjoyment/appreciation-via-understanding methodology, versus art, music-dynamics evolves, telling a story in (3+1)-dimensions: trails, frames, timbres, + dynamics amplitude vs. music-score time-series (formal-inverse power-spectrum) surprisingly closely parallels (3+1)-dimensional Einstein (1905) special-relativity "+" (with its enjoyment-expectations) a manifestation of quantum-theory expectation-values, together a music quantum-ACOUSTO/MUSICO-dynamics (QA/MD). Analysis via Derrida deconstruction enabled Siegel-Baez "Category-Semantics" "FUZZYICS"="CATEGORYICS" ("TRIZ") Aristotle SoO DEduction, irrespective of the Boon-Klimontovich vs. Voss-Clark [PRL(77)] music power-spectrum analysis sampling-time/duration controversy: part versus whole, shows QA/MD reigns supreme as THE music appreciation-via-analysis tool for the listener in musicology!!! The connection to the Deutsch-Hartmann-Levitin [This is Your Brain on Music, (06)] brain/mind-barrier brain/mind-music connection is subtle/compelling/immediate!!!

  11. Human factors issues and approaches in the spatial layout of a space station control room, including the use of virtual reality as a design analysis tool

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P., II

    1994-01-01

    Human Factors Engineering support was provided for the 30% design review of the late Space Station Freedom Payload Control Area (PCA). The PCA was to be the payload operations control room, analogous to the Spacelab Payload Operations Control Center (POCC). This effort began with a systematic collection and refinement of the relevant requirements driving the spatial layout of the consoles and PCA. This information was used as input for specialized human factors analytical tools and techniques in the design and design analysis activities. Design concepts and configuration options were developed and reviewed using sketches, 2-D Computer-Aided Design (CAD) drawings, and immersive Virtual Reality (VR) mockups.

  12. Efficient bibliographic searches on allergy using ISI databases.

    PubMed

    Sáez Gómez, J M; Annan, J W; Negro Alvarez, J M; Guillen-Grima, F; Bozzola, C M; Ivancevich, J C; Aguinaga Ontoso, E

    2008-01-01

The aim of this article is to provide an introduction to using databases from the Thomson ISI Web of Knowledge, with special reference to the Citation Indexes as an analysis tool for publications, and also to explain the meaning of the well-known Impact Factor. We present the partially modified new Consultation Interface to enhance information search routines of these databases. It introduces distinctive methods in bibliographic searching, including the correct application of analysis tools, paying particular attention to Journal Citation Reports and the Impact Factor. We finish this article with comments on the consequences of using the Impact Factor as a quality indicator for the assessment of journals and publications, and on how to ensure measures for indexing in the Thomson ISI Databases.

  13. [Application of Fourier transform attenuated total reflection infrared spectroscopy in analysis of pulp and paper industry].

    PubMed

    Zhang, Yong; Cao, Chun-yu; Feng, Wen-ying; Xu, Ming; Su, Zhen-hua; Liu, Xiao-meng; Lü, Wei-jun

    2011-03-01

As one of the most powerful tools for investigating the composition of raw materials and the properties of pulp and paper, infrared spectroscopy has played an important role in the pulp and paper industry. However, traditional transmission infrared spectroscopy has not met the requirements of the production processes because of its disadvantages of time consumption and sample destruction, so new techniques are needed. Fourier transform attenuated total reflection infrared spectroscopy (ATR-FTIR) is an advanced spectroscopic tool for nondestructive evaluation and can rapidly and accurately estimate the production properties of each process in the pulp and paper industry. The present review describes the application of ATR-FTIR in the analysis of the pulp and paper industry. The analysis processes include: pulping, papermaking, environmental protection, special processing and paper identification.

  14. Technology Enhanced Learning for People with Intellectual Disabilities and Cerebral Paralysis: The MAS Platform

    NASA Astrophysics Data System (ADS)

    Colomo-Palacios, Ricardo; Paniagua-Martín, Fernando; García-Crespo, Ángel; Ruiz-Mezcua, Belén

Education for students with disabilities now takes place in a wide range of settings and thus involves a wider range of assistive tools. As a result, one of the most interesting application domains of technology enhanced learning is the adoption of learning technologies and designs for people with disabilities. Following this unstoppable trend, this paper presents MAS, a software platform aimed at helping people with severe intellectual disabilities and cerebral paralysis in their learning processes. MAS, as a technology enhanced learning platform, provides several tools that support learning and monitoring for people with special needs, including adaptive games, data processing and monitoring tools. Installed in a special needs education institution in Madrid, Spain, MAS provides special educators with a tool that improves students' educational processes.

  15. Using the simplified case mix tool (sCMT) to identify cost in special care dental services to support commissioning.

    PubMed

    Duane, B G; Freeman, R; Richards, D; Crosbie, S; Patel, P; White, S; Humphris, G

    2017-03-01

To commission dental services for vulnerable (special care) patient groups effectively, consistently and fairly, an evidence base is needed of the costs involved. The simplified Case Mix Tool (sCMT) can assess treatment mode complexity for these patient groups. To determine if the sCMT can be used to identify costs of service provision, patients (n=495) attending the Sussex Community NHS Trust Special Care Dental Service for care were assessed using the sCMT. sCMT scores and costs (staffing, laboratory fees, etc.) were recorded, along with patient age, new-patient status and use of general anaesthesia/intravenous sedation. Statistical analysis (adjusted linear regression modelling) compared sCMT scores with costs; sensitivity analyses of the costings to age, new-patient status and sedation use were then undertaken. Regression tables were produced to present estimates of service costs. Costs increased with sCMT total scale and single item values in a predictable manner in all analyses except for 'cooperation'. Costs increased with the use of IV sedation, with each rising level of the sCMT, and with complexity in every sCMT category except cooperation. Costs increased with increase in complexity of treatment mode as measured by sCMT scores. Measures such as the sCMT can provide predictions of the resource allocations required when commissioning special care dental services. Copyright© 2017 Dennis Barber Ltd.
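The kind of linear model the study fits can be sketched with ordinary least squares on a single predictor. The score-cost pairs below are invented for illustration, not the paper's data, and the real analysis adjusts for covariates such as age and sedation use:

```python
# Hedged sketch of cost regressed on sCMT complexity score (illustrative
# numbers only): commissioners can read off a predicted resource
# allocation for each complexity level from the fitted line.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b  # intercept, slope

# hypothetical (sCMT score, per-visit cost in GBP) observations
scores = [0, 1, 2, 3, 4, 5]
costs = [80, 95, 115, 128, 150, 161]
a, b = fit_line(scores, costs)
predicted_cost = a + b * 4   # predicted cost at sCMT score 4
```

A positive, roughly constant slope is the pattern the study reports: cost rises predictably with each level of the sCMT, which is what makes the tool usable as a commissioning predictor.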

  16. DoD Key Technologies Plan

    DTIC Science & Technology

    1992-07-01

    methodologies; software performance analysis; software testing; and concurrent languages. Finally, efforts in algorithms, which are primarily designed to upgrade...These codes provide a powerful research tool for testing new concepts and designs prior to experimental implementation. DoE's laser program has also...development, and specially designed production facilities. World leadership in both non-fluorinated and fluorinated materials resides in the U.S., but Japan

  17. An Exploratory Factor Analysis of the Sheltered Instruction Observation Protocol as an Evaluation Tool to Measure Teaching Effectiveness

    ERIC Educational Resources Information Center

    Polat, Nihat; Cepik, Saban

    2016-01-01

    To narrow the achievement gap between English language learners (ELLs) and their native-speaking peers in K-12 settings in the United States, effective instructional models must be identified. However, identifying valid observation protocols that can measure the effectiveness of specially designed instructional practices is not an easy task. This…

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Ucilia

    This report has the following articles: (1) Deconstructing Microbes--metagenomic research on bugs in termites relies on new data analysis tools; (2) Popular Science--a nanomaterial research paper in Nano Letters drew strong interest from the scientific community; (3) Direct Approach--researchers employ an algorithm to solve an energy-reduction issue essential in describing complex physical systems; and (4) SciDAC Special--A science journal features research on petascale enabling technologies.

  19. A trade-off analysis design tool. Aircraft interior noise-motion/passenger satisfaction model

    NASA Technical Reports Server (NTRS)

    Jacobson, I. D.

    1977-01-01

    A design tool was developed to enhance aircraft passenger satisfaction. The effect of aircraft interior motion and noise on passenger comfort and satisfaction was modelled. Effects of individual aircraft noise sources were accounted for, and the impact of noise on passenger activities and noise levels to safeguard passenger hearing were investigated. The motion noise effect models provide a means for tradeoff analyses between noise and motion variables, and also provide a framework for optimizing noise reduction among noise sources. Data for the models were collected onboard commercial aircraft flights and specially scheduled tests.

  20. A guide to understanding meta-analysis.

    PubMed

    Israel, Heidi; Richter, Randy R

    2011-07-01

    With the focus on evidence-based practice in healthcare, a well-conducted systematic review that includes a meta-analysis where indicated represents a high level of evidence for treatment effectiveness. The purpose of this commentary is to assist clinicians in understanding meta-analysis as a statistical tool, using both published articles and explanations of the components of the technique. We describe what meta-analysis is, what heterogeneity is and how it affects meta-analysis, effect size, the modeling techniques of meta-analysis, and the strengths and weaknesses of meta-analysis. Common components such as forest plot interpretation and available software, special cases of meta-analysis such as subgroup analysis, individual patient data, and meta-regression, and a discussion of criticisms are also included.
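
    The core computation the commentary explains can be made concrete. Below is a minimal fixed-effect inverse-variance meta-analysis with hypothetical study data (not from the article): each study's effect size is weighted by the inverse of its variance, and Cochran's Q gives a simple heterogeneity statistic:

```python
import math

# Hypothetical per-study effect sizes and variances (illustration only).
effects = [0.30, 0.45, 0.25, 0.50]
variances = [0.02, 0.05, 0.01, 0.04]

# Fixed-effect model: weight each study by 1/variance.
weights = [1.0 / v for v in variances]
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
se_pooled = math.sqrt(1.0 / sum(weights))

# Cochran's Q: weighted squared deviations from the pooled effect,
# a common starting point for assessing heterogeneity.
q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))

print(round(pooled, 3), round(se_pooled, 3), round(q, 3))  # → 0.315 0.072 1.654
```

    When Q suggests substantial heterogeneity, a random-effects model is typically preferred, one of the modeling choices the commentary discusses.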

  1. Specialized CFD Grid Generation Methods for Near-Field Sonic Boom Prediction

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Campbell, Richard L.; Elmiligui, Alaa; Cliff, Susan E.; Nayani, Sudheer N.

    2014-01-01

    Ongoing interest in analysis and design of low sonic boom supersonic transports requires accurate and efficient Computational Fluid Dynamics (CFD) tools. Specialized grid generation techniques are employed to predict near-field acoustic signatures of these configurations. A fundamental examination of grid properties is performed including grid alignment with flow characteristics and element type. The issues affecting the robustness of cylindrical surface extrusion are illustrated. This study will compare three methods in the extrusion family of grid generation methods that produce grids aligned with the freestream Mach angle. These methods are applied to configurations from the First AIAA Sonic Boom Prediction Workshop.

  2. Influence of Punch Geometry on Process Parameters in Cold Backward Extrusion

    NASA Astrophysics Data System (ADS)

    Plančak, M.; Barišić, B.; Car, Z.; Movrin, D.

    2011-01-01

    In cold extrusion of steel, tools make direct contact with the metal to be extruded. These tools are exposed to high contact stresses which, in certain cases, may be a limiting factor in applying this technology. The present paper investigates the influence of punch head design on radial stress at the container wall in the process of cold backward extrusion. Five different punch head geometries were investigated. Radial stress on the container wall was measured by the pin load cell technique. Special tooling for the experimental investigation was designed and made. The process was also analyzed by the finite element (FE) method: 2D models of the tools were obtained with UGS NX, and Simufact Forming GP software was used for the FE analysis. The experimental and FE results were compared and analyzed, and an optimal punch head geometry is suggested.

  3. Generic trending and analysis system

    NASA Technical Reports Server (NTRS)

    Keehan, Lori; Reese, Jay

    1994-01-01

    The Generic Trending and Analysis System (GTAS) is a generic spacecraft performance monitoring tool developed by NASA Code 511 and Loral Aerosys. It is designed to facilitate quick anomaly resolution and trend analysis. Traditionally, the job of off-line analysis has been performed using hardware and software systems developed for real-time spacecraft contacts, supplemented with a collection of tools developed by Flight Operations Team (FOT) members. Since the number of upcoming missions is increasing, NASA can no longer afford to operate in this manner. GTAS improves control center productivity and effectiveness because it provides a generic solution across multiple missions, eliminating the need for each individual mission to develop duplicate capabilities. It also allows more sophisticated tools to be developed because it draws resources from several projects. In addition, the GTAS software system incorporates commercial off-the-shelf (COTS) software packages and reuses components of other NASA-developed systems wherever possible. GTAS has incorporated lessons learned from previous missions by involving the users early in the development process. GTAS users took a proactive role in requirements analysis, design, development, and testing. Because of user involvement, several special tools were designed and are now being developed. GTAS users expressed considerable interest in facilitating data collection for long-term trending and analysis. As a result, GTAS provides easy access to large volumes of processed telemetry data directly in the control center. The GTAS archival and retrieval capabilities are supported by the integration of optical disk technology and a COTS relational database management system.

  4. Supporting secure programming in web applications through interactive static analysis.

    PubMed

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2014-07-01

    Many security incidents are caused by software developers' failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis, to integrate static analysis into Integrated Development Environment (IDE) and provide in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required nor are there any assumptions on ways programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases.
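
    The actual tool is an Eclipse plug-in for Java, but the kind of check such an in-IDE analyzer performs can be sketched in a few lines. Below is a much-simplified, hypothetical illustration using Python's ast module: it flags calls to an execute() method whose first argument is built by string concatenation, %-formatting, or an f-string, the classic SQL-injection pattern that parameterized queries avoid:

```python
import ast

# Simplified sketch of one static-analysis rule (illustration only, not
# the paper's tool): find execute() calls whose query string is assembled
# dynamically rather than passed with bound parameters.
def find_risky_queries(source: str):
    findings = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Attribute)
                and node.func.attr == "execute"
                and node.args):
            arg = node.args[0]
            if isinstance(arg, ast.BinOp) and isinstance(arg.op, (ast.Add, ast.Mod)):
                findings.append(node.lineno)      # concatenation / %-format
            elif isinstance(arg, ast.JoinedStr):
                findings.append(node.lineno)      # f-string
    return findings

sample = '''
def lookup(cur, user_id):
    cur.execute("SELECT * FROM users WHERE id = " + user_id)      # risky
    cur.execute("SELECT * FROM users WHERE id = %s", (user_id,))  # parameterized
'''
print(find_risky_queries(sample))  # → [3]
```

    An interactive tool would surface such a finding in the editor as the developer types, which is the in-situ support the paper argues for.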

  5. Supporting secure programming in web applications through interactive static analysis

    PubMed Central

    Zhu, Jun; Xie, Jing; Lipford, Heather Richter; Chu, Bill

    2013-01-01

    Many security incidents are caused by software developers’ failure to adhere to secure programming practices. Static analysis tools have been used to detect software vulnerabilities. However, their wide usage by developers is limited by the special training required to write rules customized to application-specific logic. Our approach is interactive static analysis, to integrate static analysis into Integrated Development Environment (IDE) and provide in-situ secure programming support to help developers prevent vulnerabilities during code construction. No additional training is required nor are there any assumptions on ways programs are built. Our work is motivated in part by the observation that many vulnerabilities are introduced due to failure to practice secure programming by knowledgeable developers. We implemented a prototype interactive static analysis tool as a plug-in for Java in Eclipse. Our technical evaluation of our prototype detected multiple zero-day vulnerabilities in a large open source project. Our evaluations also suggest that false positives may be limited to a very small class of use cases. PMID:25685513

  6. CsSNP: A Web-Based Tool for the Detecting of Comparative Segments SNPs.

    PubMed

    Wang, Yi; Wang, Shuangshuang; Zhou, Dongjie; Yang, Shuai; Xu, Yongchao; Yang, Chao; Yang, Long

    2016-07-01

    SNP (single nucleotide polymorphism) analysis is a popular tool for the study of genetic diversity, evolution, and other areas. It is therefore desirable to have a convenient, robust, rapid, and open-source SNP-detection tool available to all researchers. Since the detection of SNPs needs special software and a series of steps including alignment, detection, analysis and presentation, the study of SNPs is limited for nonprofessional users. CsSNP (Comparative segments SNP, http://biodb.sdau.edu.cn/cssnp/ ) is a freely available web tool based on the Blat, Blast, and Perl programs to detect comparative segment SNPs and to show detailed information about them. The results are filtered and presented in statistics figures and a GBrowse map. The platform contains the reference genomic sequences and coding sequences of 60 plant species, and provides new opportunities for users to detect SNPs easily. CsSNP gives nonprofessional users a convenient way to find comparative segment SNPs in their own sequences, provides information and analysis of the SNPs, and displays these data in a dynamic map. It provides a new method to detect SNPs and may accelerate related studies.
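
    The service wraps Blat/Blast alignment in Perl; the core comparison it automates is simple to state. As a toy sketch (not CsSNP's code), given two segments already aligned to equal length, candidate SNPs are the positions where the bases differ, skipping alignment gaps:

```python
# Toy illustration of SNP calling on two aligned segments (the real
# pipeline first aligns the sequences with Blat/Blast).
def call_snps(ref: str, query: str):
    if len(ref) != len(query):
        raise ValueError("segments must be aligned to equal length")
    snps = []
    for pos, (a, b) in enumerate(zip(ref, query), start=1):
        if a != b and a != '-' and b != '-':   # '-' marks an alignment gap
            snps.append((pos, a, b))            # 1-based position, ref, alt
    return snps

print(call_snps("ATGCTAGCA", "ATGTTAGCC"))  # → [(4, 'C', 'T'), (9, 'A', 'C')]
```

    Real tools add filtering (quality, depth, flanking-sequence checks) before presenting results, which is where a curated platform like CsSNP earns its keep.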

  7. Advanced Risk Reduction Tool (ARRT) Special Case Study Report: Science and Engineering Technical Assessments (SETA) Program

    NASA Technical Reports Server (NTRS)

    Kirsch, Paul J.; Hayes, Jane; Zelinski, Lillian

    2000-01-01

    This special case study report presents the Science and Engineering Technical Assessments (SETA) team's findings for exploring the correlation between the underlying models of Advanced Risk Reduction Tool (ARRT) relative to how it identifies, estimates, and integrates Independent Verification & Validation (IV&V) activities. The special case study was conducted under the provisions of SETA Contract Task Order (CTO) 15 and the approved technical approach documented in the CTO-15 Modification #1 Task Project Plan.

  8. CyREST: Turbocharging Cytoscape Access for External Tools via a RESTful API.

    PubMed

    Ono, Keiichiro; Muetze, Tanja; Kolishovski, Georgi; Shannon, Paul; Demchak, Barry

    2015-01-01

    As bioinformatic workflows become increasingly complex and involve multiple specialized tools, so does the difficulty of reliably reproducing those workflows. Cytoscape is a critical workflow component for executing network visualization, analysis, and publishing tasks, but it can be operated only manually via a point-and-click user interface. Consequently, Cytoscape-oriented tasks are laborious and often error prone, especially with multistep protocols involving many networks. In this paper, we present the new cyREST Cytoscape app and accompanying harmonization libraries. Together, they improve workflow reproducibility and researcher productivity by enabling popular languages (e.g., Python and R, JavaScript, and C#) and tools (e.g., IPython/Jupyter Notebook and RStudio) to directly define and query networks, and perform network analysis, layouts and renderings. We describe cyREST's API and overall construction, and present Python- and R-based examples that illustrate how Cytoscape can be integrated into large scale data analysis pipelines. cyREST is available in the Cytoscape app store (http://apps.cytoscape.org) where it has been downloaded over 1900 times since its release in late 2014.
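
    The point of cyREST is that a script, rather than a human at the point-and-click interface, defines and queries networks. A minimal sketch from Python: build a network in Cytoscape.js-style JSON and POST it to the local cyREST endpoint (default port 1234). The HTTP call is left commented out because it needs a running Cytoscape instance with cyREST installed, and the response-key name is an assumption here:

```python
import json

# Cytoscape.js-style network JSON, the format cyREST accepts for
# network creation.
network = {
    "data": {"name": "demo network"},
    "elements": {
        "nodes": [{"data": {"id": "a"}}, {"data": {"id": "b"}}],
        "edges": [{"data": {"source": "a", "target": "b"}}],
    },
}
payload = json.dumps(network)

# With Cytoscape + cyREST running locally (uncomment to use):
# import requests
# r = requests.post("http://localhost:1234/v1/networks", data=payload,
#                   headers={"Content-Type": "application/json"})
# network_id = r.json()["networkSUID"]   # key name assumed

print(sorted(network["elements"]))  # → ['edges', 'nodes']
```

    Because the whole interaction is plain HTTP + JSON, the same approach works from R, JavaScript, or C#, which is the language-agnosticism the paper emphasizes.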

  9. CyREST: Turbocharging Cytoscape Access for External Tools via a RESTful API

    PubMed Central

    Ono, Keiichiro; Muetze, Tanja; Kolishovski, Georgi; Shannon, Paul; Demchak, Barry

    2015-01-01

    As bioinformatic workflows become increasingly complex and involve multiple specialized tools, so does the difficulty of reliably reproducing those workflows. Cytoscape is a critical workflow component for executing network visualization, analysis, and publishing tasks, but it can be operated only manually via a point-and-click user interface. Consequently, Cytoscape-oriented tasks are laborious and often error prone, especially with multistep protocols involving many networks. In this paper, we present the new cyREST Cytoscape app and accompanying harmonization libraries. Together, they improve workflow reproducibility and researcher productivity by enabling popular languages (e.g., Python and R, JavaScript, and C#) and tools (e.g., IPython/Jupyter Notebook and RStudio) to directly define and query networks, and perform network analysis, layouts and renderings. We describe cyREST’s API and overall construction, and present Python- and R-based examples that illustrate how Cytoscape can be integrated into large scale data analysis pipelines. cyREST is available in the Cytoscape app store (http://apps.cytoscape.org) where it has been downloaded over 1900 times since its release in late 2014. PMID:26672762

  10. Research Directions in Real-Time Systems.

    DTIC Science & Technology

    1996-09-01

    This report summarizes a survey of published research in real-time systems. Material is presented that provides an overview of the topic, focusing on...communications protocols and scheduling techniques. It is noted that real-time systems deserve special attention separate from other areas because of...formal tools for design and analysis of real-time systems. The early work on applications as well as notable theoretical advances are summarized

  11. MASS SPECTROMETRY-BASED METABOLOMICS

    PubMed Central

    Dettmer, Katja; Aronov, Pavel A.; Hammock, Bruce D.

    2007-01-01

    This review presents an overview of the dynamically developing field of mass spectrometry-based metabolomics. Metabolomics aims at the comprehensive and quantitative analysis of wide arrays of metabolites in biological samples. These numerous analytes have very diverse physico-chemical properties and occur at different abundance levels. Consequently, comprehensive metabolomics investigations are primarily a challenge for analytical chemistry, and mass spectrometry specifically has vast potential as a tool for this type of investigation. Metabolomics requires special approaches for sample preparation, separation, and mass spectrometric analysis. Current examples of those approaches are described in this review. It primarily focuses on metabolic fingerprinting, a technique that analyzes all detectable analytes in a given sample with subsequent classification of samples and identification of differentially expressed metabolites, which define the sample classes. To perform this complex task, data analysis tools, metabolite libraries, and databases are required. Therefore, recent advances in metabolomics bioinformatics are also discussed. PMID:16921475

  12. 48 CFR 1545.309 - Providing Government production and research property under special restrictions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... improvements necessary for installing special tooling, special test equipment, or plant equipment, shall not be... production and research property under special restrictions. 1545.309 Section 1545.309 Federal Acquisition Regulations System ENVIRONMENTAL PROTECTION AGENCY CONTRACT MANAGEMENT GOVERNMENT PROPERTY Providing...

  13. Discrete event simulation tool for analysis of qualitative models of continuous processing systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T. (Inventor); Basham, Bryan D. (Inventor); Harris, Richard A. (Inventor)

    1990-01-01

    An artificial intelligence design and qualitative modeling tool is disclosed for creating computer models and simulating continuous activities, functions, and/or behavior using developed discrete event techniques. Conveniently, the tool is organized in four modules: library design module, model construction module, simulation module, and experimentation and analysis. The library design module supports the building of library knowledge including component classes and elements pertinent to a particular domain of continuous activities, functions, and behavior being modeled. The continuous behavior is defined discretely with respect to invocation statements, effect statements, and time delays. The functionality of the components is defined in terms of variable cluster instances, independent processes, and modes, further defined in terms of mode transition processes and mode dependent processes. Model construction utilizes the hierarchy of libraries and connects them with appropriate relations. The simulation executes a specialized initialization routine and executes events in a manner that includes selective inherency of characteristics through a time and event schema until the event queue in the simulator is emptied. The experimentation and analysis module supports analysis through the generation of appropriate log files and graphics developments and includes the ability of log file comparisons.
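
    The simulation module described above executes events from a queue "until the event queue in the simulator is emptied". A minimal discrete-event kernel in that spirit (an illustration of the general technique, not the patented tool) keeps a time-ordered heap of events, and each event may schedule further events, which is how continuous behavior gets modeled discretely via time delays:

```python
import heapq

# Minimal discrete-event simulation kernel (sketch of the general
# technique): pop events in time order until the queue is empty.
class Simulator:
    def __init__(self):
        self.queue = []      # heap of (time, seq, action)
        self.now = 0.0
        self.log = []
        self._seq = 0        # tie-breaker so equal times never compare actions

    def schedule(self, delay, action):
        heapq.heappush(self.queue, (self.now + delay, self._seq, action))
        self._seq += 1

    def run(self):
        while self.queue:
            self.now, _, action = heapq.heappop(self.queue)
            action(self)

# A two-mode component: it 'fills' until full, then transitions to draining,
# analogous to a mode-transition process in the modeling tool.
def fill(sim):
    sim.log.append((sim.now, "full"))
    sim.schedule(3.0, drain)

def drain(sim):
    sim.log.append((sim.now, "empty"))

sim = Simulator()
sim.schedule(2.0, fill)
sim.run()
print(sim.log)  # → [(2.0, 'full'), (5.0, 'empty')]
```

    The invocation/effect/time-delay structure in the patent maps naturally onto this pattern: an invocation statement schedules an event, and its effect statement runs when the event fires.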

  14. S-MART, a software toolbox to aid RNA-Seq data analysis.

    PubMed

    Zytnicki, Matthias; Quesneville, Hadi

    2011-01-01

    High-throughput sequencing is now routinely performed in many experiments, but the analysis of the millions of sequences generated is often beyond the expertise of wet labs that have no personnel specializing in bioinformatics. Whereas several tools are now available to map high-throughput sequencing data on a genome, few of these can extract biological knowledge from the mapped reads. We have developed a toolbox called S-MART, which handles mapped RNA-Seq data. S-MART is an intuitive and lightweight tool which performs many of the tasks usually required for the analysis of mapped RNA-Seq reads. S-MART does not require any computer science background and thus can be used by the whole biologist community through a graphical interface. S-MART can run on any personal computer, yielding results within an hour even for gigabytes of data for most queries. S-MART may perform the entire analysis of the mapped reads, without any need for other ad hoc scripts. With this tool, biologists can easily perform most of the analyses of their RNA-Seq data on their own computers, from the mapped data to the discovery of important loci.

  15. S-MART, A Software Toolbox to Aid RNA-seq Data Analysis

    PubMed Central

    Zytnicki, Matthias; Quesneville, Hadi

    2011-01-01

    High-throughput sequencing is now routinely performed in many experiments, but the analysis of the millions of sequences generated is often beyond the expertise of wet labs that have no personnel specializing in bioinformatics. Whereas several tools are now available to map high-throughput sequencing data on a genome, few of these can extract biological knowledge from the mapped reads. We have developed a toolbox called S-MART, which handles mapped RNA-Seq data. S-MART is an intuitive and lightweight tool which performs many of the tasks usually required for the analysis of mapped RNA-Seq reads. S-MART does not require any computer science background and thus can be used by the whole biologist community through a graphical interface. S-MART can run on any personal computer, yielding results within an hour even for gigabytes of data for most queries. S-MART may perform the entire analysis of the mapped reads, without any need for other ad hoc scripts. With this tool, biologists can easily perform most of the analyses of their RNA-Seq data on their own computers, from the mapped data to the discovery of important loci. PMID:21998740

  16. Introducing W.A.T.E.R.S.: a workflow for the alignment, taxonomy, and ecology of ribosomal sequences.

    PubMed

    Hartman, Amber L; Riddle, Sean; McPhillips, Timothy; Ludäscher, Bertram; Eisen, Jonathan A

    2010-06-12

    For more than two decades microbiologists have used a highly conserved microbial gene as a phylogenetic marker for bacteria and archaea. The small-subunit ribosomal RNA gene, also known as 16S rRNA, is encoded by ribosomal DNA (16S rDNA) and has provided a powerful comparative tool to microbial ecologists. Over time, the microbial ecology field has matured from small-scale studies in a select number of environments to massive collections of sequence data that are paired with dozens of corresponding collection variables. As the complexity of data and tool sets has grown, the need for flexible automation and maintenance of the core processes of 16S rDNA sequence analysis has increased correspondingly. We present WATERS, an integrated approach for 16S rDNA analysis that bundles a suite of publicly available 16S rDNA analysis software tools into a single software package. The "toolkit" includes sequence alignment, chimera removal, OTU determination, taxonomy assignment, and phylogenetic tree construction, as well as a host of ecological analysis and visualization tools. WATERS employs a flexible, collection-oriented 'workflow' approach using the open-source Kepler system as a platform. By packaging available software tools into a single automated workflow, WATERS simplifies 16S rDNA analyses, especially for those without specialized bioinformatics or programming expertise. In addition, WATERS, like some of the newer comprehensive rRNA analysis tools, allows researchers to minimize the time dedicated to carrying out tedious informatics steps and to focus their attention instead on the biological interpretation of the results. One advantage of WATERS over other comprehensive tools is that the use of the Kepler workflow system facilitates result interpretation and reproducibility via a data provenance sub-system. Furthermore, new "actors" can be added to the workflow as desired, and we see WATERS as an initial seed for a sizeable and growing repository of interoperable, easy-to-combine tools for asking increasingly complex microbial ecology questions.
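
    One step in the bundled pipeline, OTU determination, can be made concrete with a toy greedy clusterer (real pipelines use dedicated tools such as CD-HIT or mothur; this sketch only illustrates the idea): each read joins the first existing cluster whose representative it matches at, say, 97% identity, otherwise it founds a new OTU:

```python
# Toy OTU clustering at a 97% identity threshold (illustration only).
def identity(a: str, b: str) -> float:
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / max(len(a), len(b))

def cluster_otus(seqs, threshold=0.97):
    otus = []          # representative sequence per cluster
    assignment = []    # OTU index for each input read
    for s in seqs:
        for i, rep in enumerate(otus):
            if identity(s, rep) >= threshold:
                assignment.append(i)
                break
        else:
            otus.append(s)                 # founds a new OTU
            assignment.append(len(otus) - 1)
    return otus, assignment

reads = ["ACGTACGTACGTACGTACGTACGTACGTACGTACGT",  # 36 bases
         "ACGTACGTACGTACGTACGTACGTACGTACGTACGA",  # 1 mismatch (~97.2%)
         "TTTTACGTACGTACGTACGTACGTACGTACGTACGT"]  # 3 mismatches (~91.7%)
otus, assignment = cluster_otus(reads)
print(len(otus), assignment)  # → 2 [0, 0, 1]
```

    Wiring this step between alignment and taxonomy assignment, with each stage as a reusable "actor", is precisely the kind of composition the Kepler workflow platform provides.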

  17. 48 CFR 970.1504-1-8 - Special equipment purchases.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... boilers, generators, machine tools, and large electrical equipment. In some cases, it would also include... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Special equipment....1504-1-8 Special equipment purchases. (a) Special equipment is sometimes procured in conjunction with...

  18. wft4galaxy: a workflow testing tool for galaxy.

    PubMed

    Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi

    2017-12-01

    Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container; the latter reduces installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.

  19. Food abundance, prey morphology, and diet specialization influence individual sea otter tool use

    USGS Publications Warehouse

    Fujii, Jessica A.; Ralls, Katherine; Tinker, M. Tim

    2017-01-01

    Sea otters are well-known tool users, employing objects such as rocks or shells to break open invertebrate prey. We used a series of generalized linear mixed effect models to examine observational data on prey capture and tool use from 211 tagged individuals from 5 geographically defined study areas throughout the sea otter’s range in California. Our best supported model was able to explain 75% of the variation in the frequency of tool use by individual sea otters with only ecological and demographic variables. In one study area, where sea otter food resources were abundant, all individuals had similar diets focusing on preferred prey items and used tools at low to moderate frequencies (4–38% of prey captures). In the remaining areas, where sea otters were food-limited, individuals specialized on different subsets of the available prey and had a wider range of average tool-use frequency (0–98% of prey captures). The prevalence of difficult-to-access prey in individual diets was a major predictor of tool use and increased the likelihood of using tools on prey that were not difficult to access as well. Age, sex, and feeding habitat also contributed to the probability of tool use but to a smaller extent. We developed a conceptual model illustrating how food abundance, the prevalence of difficult-to-access prey, and individual diet specialization interacted to determine the likelihood that individual sea otters would use tools and considered the model’s relevance to other tool-using species.

  20. Plastic Clamp Retains Clevis Pin

    NASA Technical Reports Server (NTRS)

    Cortes, R. G.

    1983-01-01

    Plastic clamp requires no special installation or removal tools. The clamp slips easily over the end of the pin and, once engaged in the groove, holds the pin securely. It is installed and removed easily without special tools; a screwdriver or putty knife is adequate for prying it out of the groove. Used to retain bearings, rollers, pulleys, and other parts that rotate. Applications include slowly and intermittently rotating parts in appliances.

  1. Ramping and Uncertainty Prediction Tool - Analysis and Visualization of Wind Generation Impact on Electrical Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel; Makarov, Yuri; Subbarao, Kris

    RUT software is designed for use by the Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. This deficiency of balancing resources can cause serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty including wind, solar and load forecast errors. The tool evaluates required generation for a worst case scenario, with a user-specified confidence level.
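
    The abstract's core idea, balancing requirements derived from forecast-error uncertainty at a user-specified confidence level, can be illustrated with a deliberately simplified sketch (the actual RUT algorithm is more elaborate; the error model below is assumed): combine wind, solar and load forecast-error samples and read off upper and lower quantiles as the extra up/down capacity needed:

```python
import numpy as np

# Simplified illustration (not the RUT algorithm): Monte Carlo samples of
# combined net forecast error, with the balancing requirement taken as a
# quantile of that distribution.
rng = np.random.default_rng(42)
n = 5000
errors = (rng.normal(0, 50, n)    # wind forecast error (MW, assumed)
          + rng.normal(0, 20, n)  # solar forecast error
          + rng.normal(0, 30, n)) # load forecast error

confidence = 0.95  # user-specified confidence level
up_requirement = np.quantile(errors, confidence)          # extra up capacity
down_requirement = -np.quantile(errors, 1 - confidence)   # extra down capacity

print(round(float(up_requirement), 1), round(float(down_requirement), 1))
```

    Raising the confidence level widens both requirements, which is the trade-off a Balancing Authority tunes between reliability and the cost of holding reserve.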

  2. Avanti lipid tools: connecting lipids, technology, and cell biology.

    PubMed

    Sims, Kacee H; Tytler, Ewan M; Tipton, John; Hill, Kasey L; Burgess, Stephen W; Shaw, Walter A

    2014-08-01

    Lipid research is challenging owing to the complexity and diversity of the lipidome. Here we review a set of experimental tools developed for the seasoned lipid researcher, as well as those who are new to the field of lipid research. Novel tools for probing protein-lipid interactions, applications for lipid binding antibodies, enhanced systems for the cellular delivery of lipids, improved visualization of lipid membranes using gold-labeled lipids, and advances in mass spectrometric analysis techniques will be discussed. Because lipid mediators are known to participate in a host of signal transduction and trafficking pathways within the cell, a comprehensive lipid toolbox that aids the science of lipidomics research is essential to better understand the molecular mechanisms of interactions between cellular components. This article is part of a Special Issue entitled Tools to study lipid functions. Copyright © 2014. Published by Elsevier B.V.

  3. Modeling interdependencies between business and communication processes in hospitals.

    PubMed

    Brigl, Birgit; Wendt, Thomas; Winter, Alfred

    2003-01-01

    The optimization and redesign of business processes in hospitals is an important challenge for hospital information management, which has to design and implement a suitable HIS architecture. Nevertheless, no tools are available that specialize in modeling information-driven business processes and their consequences for the communication between information processing tools. Therefore, we present an approach which facilitates the representation and analysis of business processes and the resulting communication processes between application components, along with their interdependencies. This approach aims not only to visualize those processes, but also to evaluate whether there are weaknesses in the information processing infrastructure that hinder the smooth implementation of the business processes.

  4. Cutting force measurement of electrical jigsaw by strain gauges

    NASA Astrophysics Data System (ADS)

    Kazup, L.; Varadine Szarka, A.

    2016-11-01

    This paper describes a measuring method based on strain gauges for accurate specification of an electric jigsaw's cutting force. The goal of the measurement is to provide an overall perspective on the forces generated in a jigsaw's gearbox during a cutting period; the lifetime of the tool is primarily affected by these forces. This analysis is part of a research and development project aiming to develop a special linear magnetic brake for automatic lifetime tests of electric jigsaws and similar handheld tools. Accurate specification of the cutting force makes it possible to define realistic test cycles during the automatic lifetime test. The well-described cutting-force characteristic and the possibility of automation provide a new dimension for lifetime testing of handheld tools with alternating movement.

  5. Forensic Analysis of Windows Hosts Using UNIX-based Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cory Altheide

    2004-07-19

    Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They use these utilities for this very specific task because, in many cases, these tools are the only ones available for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

  6. Steinberg "AUDIOMAPS" Music Appreciation-Via-Understanding: Special-Relativity + Expectations "Quantum-Theory": a Quantum-ACOUSTO/MUSICO-Dynamics (QA/MD)

    NASA Astrophysics Data System (ADS)

    Steinberg, R.; Siegel, E.

    2010-03-01

    "AUDIOMAPS" music enjoyment/appreciation-via-understanding methodology, versus art, music-dynamics evolves, telling a story in (3+1)-dimensions: trails, frames, timbres, + dynamics amplitude vs. music-score time-series (formal-inverse power-spectrum) surprisingly closely parallels (3+1)-dimensional Einstein (1905) special-relativity "+" (with its enjoyment-expectations) a manifestation of quantum-theory expectation-values, together a music quantum-ACOUSTO/MUSICO-dynamics (QA/MD). Analysis via Derrida deconstruction enabled Siegel-Baez "Category-Semantics" "FUZZYICS"="CATEGORYICS" ("SON of TRIZ") classic Aristotle "Square-of-Opposition" (SoO) DEduction-logic, irrespective of Boon-Klimontovich versus Voss-Clark [PRL (77)] music power-spectrum analysis sampling-time/duration controversy: part versus whole, shows that "AUDIOMAPS" QA/MD reigns supreme as THE music appreciation-via-analysis tool for the listener in musicology!!! Connection to Deutsch-Hartmann-Levitin [This is Your Brain on Music (2006)] brain/mind-barrier brain/mind-music connection is both subtle and compelling and immediate!!!

  7. Systems Analysis Directorate Activities Summary, Aug 1976

    DTIC Science & Technology

    1976-09-01

  8. Timing characterization and analysis of the Linux-based, closed loop control computer for the Subaru Telescope laser guide star adaptive optics system

    NASA Astrophysics Data System (ADS)

    Dinkins, Matthew; Colley, Stephen

    2008-07-01

    Hardware and software specialized for real-time control reduce the timing jitter of executables when compared to off-the-shelf hardware and software. However, these specialized environments are costly in both money and development time. While conventional systems have a cost advantage, the jitter in these systems is much larger and potentially problematic. This study analyzes the timing characteristics of a standard Dell server running a fully featured Linux operating system to determine whether such a system is capable of meeting the timing requirements for closed-loop operations. Investigations are performed on the effectiveness of tools designed to bring off-the-shelf system performance closer to that of specialized real-time systems. The GNU Compiler Collection (gcc) is compared to the Intel C Compiler (icc), compiler optimizations are investigated, and real-time extensions to Linux are evaluated.
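    The kind of characterization described, measuring how far a periodic control loop's wake-ups drift from their deadlines on a general-purpose OS, can be sketched as below. This is a generic illustration, not the study's actual instrumentation (which targeted a C control loop); the period and iteration count are arbitrary.

```python
# Minimal jitter-measurement sketch for a periodic loop on a non-real-time OS:
# target a fixed period, record when the loop actually wakes relative to each
# deadline, and collect the absolute deviations (jitter samples).

import time

def measure_jitter(period_s=0.001, iterations=50):
    jitter = []
    next_deadline = time.perf_counter() + period_s
    for _ in range(iterations):
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # OS scheduler decides when we actually wake
        now = time.perf_counter()
        jitter.append(abs(now - next_deadline))  # deviation from the deadline
        next_deadline += period_s
    return jitter

samples = measure_jitter()
print(len(samples), max(samples) >= 0.0)
```

    On a stock kernel the worst-case sample typically dominates the mean by orders of magnitude, which is exactly the behavior real-time extensions aim to bound.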

  9. Tool Preloads Screw and Applies Locknut

    NASA Technical Reports Server (NTRS)

    Wood, K. E.

    1982-01-01

    A special tool reaches through structural members inside the Space Shuttle to fasten a nut on a preloaded screw that holds a thermal protection tile against the outside skin of the vehicle. The tool attaches tiles with accurately controlled tensile loading.

  10. Influenza Virus Database (IVDB): an integrated information resource and analysis platform for influenza virus research.

    PubMed

    Chang, Suhua; Zhang, Jiajie; Liao, Xiaoyun; Zhu, Xinxing; Wang, Dahai; Zhu, Jiang; Feng, Tao; Zhu, Baoli; Gao, George F; Wang, Jian; Yang, Huanming; Yu, Jun; Wang, Jing

    2007-01-01

    Frequent outbreaks of highly pathogenic avian influenza and the increasing data available for comparative analysis require a central database specialized in influenza viruses (IVs). We have established the Influenza Virus Database (IVDB) to integrate information and create an analysis platform for genetic, genomic, and phylogenetic studies of the virus. IVDB hosts complete genome sequences of influenza A virus generated by Beijing Institute of Genomics (BIG) and curates all other published IV sequences after expert annotation. Our Q-Filter system classifies and ranks all nucleotide sequences into seven categories according to sequence content and integrity. IVDB provides a series of tools and viewers for comparative analysis of the viral genomes, genes, genetic polymorphisms and phylogenetic relationships. A search system has been developed for users to retrieve a combination of different data types by setting search options. To facilitate analysis of global viral transmission and evolution, the IV Sequence Distribution Tool (IVDT) has been developed to display the worldwide geographic distribution of chosen viral genotypes and to couple genomic data with epidemiological data. The BLAST, multiple sequence alignment and phylogenetic analysis tools were integrated for online data analysis. Furthermore, IVDB offers instant access to pre-computed alignments and polymorphisms of IV genes and proteins, and presents the results as SNP distribution plots and minor allele distributions. IVDB is publicly available at http://influenza.genomics.org.cn.

  11. Orion Entry, Descent, and Landing Simulation

    NASA Technical Reports Server (NTRS)

    Hoelscher, Brian R.

    2007-01-01

    The Orion Entry, Descent, and Landing simulation was created over the past two years to serve as the primary Crew Exploration Vehicle guidance, navigation, and control (GN&C) design and analysis tool at the National Aeronautics and Space Administration (NASA). The Advanced NASA Technology Architecture for Exploration Studies (ANTARES) simulation is a six degree-of-freedom tool with a unique design architecture which has a high level of flexibility. This paper describes the decision history and motivations that guided the creation of this simulation tool. The capabilities of the models within ANTARES are presented in detail. Special attention is given to features of the highly flexible GN&C architecture and the details of the implemented GN&C algorithms. ANTARES provides a foundation simulation for the Orion Project that has already been successfully used for requirements analysis, system definition analysis, and preliminary GN&C design analysis. ANTARES will find useful application in engineering analysis, mission operations, crew training, avionics-in-the-loop testing, etc. This paper focuses on the entry simulation aspect of ANTARES, which is part of a bigger simulation package supporting the entire mission profile of the Orion vehicle. The unique aspects of entry GN&C design are covered, including how the simulation is being used for Monte Carlo dispersion analysis and for support of linear stability analysis. Sample simulation output from ANTARES is presented in an appendix.

  12. Introduction to the special issue on recentering science: Replication, robustness, and reproducibility in psychophysiology.

    PubMed

    Kappenman, Emily S; Keil, Andreas

    2017-01-01

    In recent years, the psychological and behavioral sciences have increased efforts to strengthen methodological practices and publication standards, with the ultimate goal of enhancing the value and reproducibility of published reports. These issues are especially important in the multidisciplinary field of psychophysiology, which yields rich and complex data sets with a large number of observations. In addition, the technological tools and analysis methods available in the field of psychophysiology are continually evolving, widening the array of techniques and approaches available to researchers. This special issue presents articles detailing rigorous and systematic evaluations of tasks, measures, materials, analysis approaches, and statistical practices in a variety of subdisciplines of psychophysiology. These articles highlight challenges in conducting and interpreting psychophysiological research and provide data-driven, evidence-based recommendations for overcoming those challenges to produce robust, reproducible results in the field of psychophysiology. © 2016 Society for Psychophysiological Research.

  13. Cost Analysis of an Air Brayton Receiver for a Solar Thermal Electric Power System in Selected Annual Production Volumes

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Pioneer Engineering and Manufacturing Company estimated the cost of manufacturing an Air Brayton Receiver for a Solar Thermal Electric Power System as designed by the AiResearch Division of the Garrett Corporation. Production costs were estimated at annual volumes of 100; 1,000; 5,000; 10,000; 50,000; 100,000 and 1,000,000 units. These costs included direct labor, direct material, and manufacturing burden. A make-or-buy analysis was made of each part at each volume. At high volumes, special fabrication concepts were used to reduce operation cycle times. All costs were estimated at an assumed 100% plant capacity. Economic feasibility determined the level of production at which special concepts were to be introduced. Estimated costs were based on the economics of the last half of 1980. Tooling and capital equipment costs were estimated for each volume. Infrastructure and personnel requirements were also estimated.

  14. GoPros™ as an underwater photogrammetry tool for citizen science

    PubMed Central

    David, Peter A.; Dupont, Sally F.; Mathewson, Ciaran P.; O’Neill, Samuel J.; Powell, Nicholas N.; Williamson, Jane E.

    2016-01-01

    Citizen science can increase the scope of research in the marine environment; however, it suffers from necessitating specialized training and simplified methodologies that reduce research output. This paper presents a simplified, novel survey methodology for citizen scientists, which combines GoPro imagery and structure from motion to construct an ortho-corrected 3D model of habitats for analysis. Results using a coral reef habitat were compared to surveys conducted with traditional snorkelling methods for benthic cover, holothurian counts, and coral health. Results were comparable between the two methods, and structure from motion allows the results to be analysed off-site for any chosen visual analysis. The GoPro method outlined in this study is thus an effective tool for citizen science in the marine environment, especially for comparing changes in coral cover or volume over time. PMID:27168973

  15. GoPros™ as an underwater photogrammetry tool for citizen science.

    PubMed

    Raoult, Vincent; David, Peter A; Dupont, Sally F; Mathewson, Ciaran P; O'Neill, Samuel J; Powell, Nicholas N; Williamson, Jane E

    2016-01-01

    Citizen science can increase the scope of research in the marine environment; however, it suffers from necessitating specialized training and simplified methodologies that reduce research output. This paper presents a simplified, novel survey methodology for citizen scientists, which combines GoPro imagery and structure from motion to construct an ortho-corrected 3D model of habitats for analysis. Results using a coral reef habitat were compared to surveys conducted with traditional snorkelling methods for benthic cover, holothurian counts, and coral health. Results were comparable between the two methods, and structure from motion allows the results to be analysed off-site for any chosen visual analysis. The GoPro method outlined in this study is thus an effective tool for citizen science in the marine environment, especially for comparing changes in coral cover or volume over time.

  16. PARPs database: A LIMS systems for protein-protein interaction data mining or laboratory information management system

    PubMed Central

    Droit, Arnaud; Hunter, Joanna M; Rouleau, Michèle; Ethier, Chantal; Picard-Cloutier, Aude; Bourgais, David; Poirier, Guy G

    2007-01-01

    Background In the "post-genome" era, mass spectrometry (MS) has become an important method for the analysis of proteins and the rapid advancement of this technique, in combination with other proteomics methods, results in an increasing amount of proteome data. This data must be archived and analysed using specialized bioinformatics tools. Description We herein describe "PARPs database," a data analysis and management pipeline for liquid chromatography tandem mass spectrometry (LC-MS/MS) proteomics. PARPs database is a web-based tool whose features include experiment annotation, protein database searching, protein sequence management, as well as data-mining of the peptides and proteins identified. Conclusion Using this pipeline, we have successfully identified several interactions of biological significance between PARP-1 and other proteins, namely RFC-1, 2, 3, 4 and 5. PMID:18093328

  17. Analyzing the effectiveness of flare dispensing programs against pulse width modulation seekers using self-organizing maps

    NASA Astrophysics Data System (ADS)

    Şahingil, Mehmet C.; Aslan, Murat Š.

    2013-10-01

    Infrared guided missile seekers that utilize pulse width modulation (PWM) in target tracking are one of the threats against air platforms. To achieve "soft-kill" protection of one's own platform against these threats, one needs to examine carefully the seeker operating principle together with its special electronic counter-countermeasure (ECCM) capability. One cost-effective means of soft-kill protection is to use flare decoys according to an optimized dispensing program. Such an optimization requires a good understanding of the threat seeker, the capabilities of the air platform, and the engagement scenario between them. Modeling and simulation is a very powerful tool for gaining valuable insight and understanding the underlying phenomenology. A careful interpretation of simulation results is crucial for inferring valuable conclusions from the data, and many factors (features) affect the results. Therefore, powerful statistical tools and pattern recognition algorithms are of special interest in the analysis. In this paper, we show how self-organizing maps (SOMs), one such tool, can be used in analyzing the effectiveness of various flare dispensing programs against a PWM seeker. We perform several Monte Carlo runs for a typical engagement scenario in a MATLAB-based simulation environment. In each run, we randomly change the flare dispensing program and obtain the corresponding class, "successful" or "unsuccessful", depending on whether the flare dispensing program deceives the seeker. Then, in the analysis phase, we use SOMs to interpret and visualize the results.
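    The analysis step can be illustrated with a toy self-organizing map, sketched below in pure Python. The two-feature encoding of a dispensing program (e.g. flare count and dispense interval, both normalized) is entirely hypothetical; the point is only how labeled Monte Carlo outcomes get projected onto a small node grid so that similar programs land on nearby nodes.

```python
# Toy 1-D self-organizing map, pure Python: nodes compete for each input
# vector, the best-matching unit (BMU) and its neighbors are pulled toward
# the input, with learning rate and neighborhood radius decaying over epochs.

import random

def best_matching_unit(nodes, x):
    dists = [sum((wk - xk) ** 2 for wk, xk in zip(w, x)) for w in nodes]
    return dists.index(min(dists))

def train_som(data, n_nodes=4, epochs=30, lr=0.5, seed=1):
    rng = random.Random(seed)
    dim = len(data[0])
    nodes = [[rng.random() for _ in range(dim)] for _ in range(n_nodes)]
    for epoch in range(epochs):
        radius = max(1.0, n_nodes / 2.0 * (1 - epoch / epochs))
        alpha = lr * (1 - epoch / epochs)
        for x in data:
            bmu = best_matching_unit(nodes, x)
            for i, w in enumerate(nodes):
                d = abs(i - bmu)  # grid distance to the BMU
                if d <= radius:
                    h = alpha * (1 - d / (radius + 1))
                    for k in range(dim):
                        w[k] += h * (x[k] - w[k])
    return nodes

# Hypothetical normalized features: (flare count, dispense interval)
successful = [(0.9, 0.2), (0.8, 0.3), (0.85, 0.25)]
unsuccessful = [(0.1, 0.8), (0.2, 0.9), (0.15, 0.85)]
som = train_som(successful + unsuccessful)
print(best_matching_unit(som, (0.9, 0.25)), best_matching_unit(som, (0.1, 0.9)))
```

    In the paper's setting the feature vectors would carry many more dimensions, which is where the SOM's visualization value lies.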

  18. Modeling languages for biochemical network simulation: reaction vs equation based approaches.

    PubMed

    Wiechert, Wolfgang; Noack, Stephan; Elsheikh, Atya

    2010-01-01

    Biochemical network modeling and simulation is an essential task in any systems biology project. The Systems Biology Markup Language (SBML) was established as a standardized model exchange language for mechanistic models. A specific strength of SBML is that numerous tools for formulating, processing, simulating, and analyzing models are freely available. Interestingly, in the field of multidisciplinary simulation, the problem of model exchange between different simulation tools arose much earlier. Several general modeling languages, such as Modelica, were developed in the 1990s. Modelica enables an equation-based, modular specification of arbitrary hierarchical differential algebraic equation models. Moreover, libraries for special application domains can be rapidly developed. This contribution compares the reaction-based approach of SBML with the equation-based approach of Modelica and explains the specific strengths of both tools. Several biological examples illustrating essential SBML and Modelica concepts are given. The chosen criteria for tool comparison are flexibility of constraint specification, different modeling flavors, and hierarchical, modular, and multidisciplinary modeling. Additionally, support for spatially distributed systems, event handling, and network analysis features is discussed. As a major result, it is shown that the choice of modeling tool has a strong impact on the expressivity of the specified models but also depends strongly on the requirements of the application context.
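    The contrast between the two styles can be sketched in a few lines. Below, a reaction-based model (SBML-like: stoichiometry plus a rate law, from which ODEs are derived mechanically) and an equation-based model (Modelica-like: the ODEs written directly) of the same toy network A → B are integrated with explicit Euler; the network and rate constant are invented for illustration.

```python
# Reaction-based style: the model is a list of (stoichiometry, rate law)
# pairs; the ODE right-hand side is assembled from them automatically.
def simulate_reactions(reactions, state, dt=0.01, steps=1000):
    s = dict(state)
    for _ in range(steps):
        deriv = {k: 0.0 for k in s}
        for stoich, rate in reactions:
            v = rate(s)
            for species, coeff in stoich.items():
                deriv[species] += coeff * v
        for k in s:
            s[k] += dt * deriv[k]  # explicit Euler step
    return s

k1 = 0.5
reactions = [({"A": -1, "B": +1}, lambda s: k1 * s["A"])]  # A -> B, mass action
reaction_result = simulate_reactions(reactions, {"A": 1.0, "B": 0.0})

# Equation-based style: write dA/dt = -k1*A, dB/dt = +k1*A directly.
a, b, dt = 1.0, 0.0, 0.01
for _ in range(1000):
    da = -k1 * a
    a, b = a + dt * da, b - dt * da

print(round(reaction_result["A"], 6), round(a, 6))
```

    Both styles describe the same dynamics; the reaction-based form keeps the network structure explicit, while the equation-based form generalizes beyond kinetics to arbitrary differential algebraic systems.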

  19. ASTROS: A multidisciplinary automated structural design tool

    NASA Technical Reports Server (NTRS)

    Neill, D. J.

    1989-01-01

    ASTROS (Automated Structural Optimization System) is a finite-element-based multidisciplinary structural optimization procedure developed under Air Force sponsorship to perform automated preliminary structural design. The design task is the determination of the structural sizes that provide an optimal structure while satisfying numerous constraints from many disciplines. In addition to its automated design features, ASTROS provides a general transient and frequency response capability, as well as a special feature to perform a transient analysis of a vehicle subjected to a nuclear blast. The motivation for the development of a single multidisciplinary design tool is that such a tool can provide improved structural designs in less time than is currently needed. The role of such a tool is even more apparent as modern materials come into widespread use. Balancing conflicting requirements for the structure's strength and stiffness while exploiting the benefits of material anisotropy is perhaps an impossible task without assistance from an automated design tool. Finally, the use of a single tool can bring the design task into better focus among design team members, thereby improving their insight into the overall task.

  20. Special Education Teachers' Lived Experiences in the Implementation of the iPad as an Instructional Tool for Students with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Epps, Takisha Salander

    2016-01-01

    The purpose of this transcendental phenomenological study was to describe the lived experience of 11 special education teachers, who implemented iPads as an instructional tool for elementary students with intellectual disabilities. This study was conducted in a North Carolina school district. The theories, which guided this study were Vygotsky's…

  1. Scribl: an HTML5 Canvas-based graphics library for visualizing genomic data over the web.

    PubMed

    Miller, Chase A; Anthony, Jon; Meyer, Michelle M; Marth, Gabor

    2013-02-01

    High-throughput biological research requires simultaneous visualization as well as analysis of genomic data, e.g. read alignments, variant calls and genomic annotations. Traditionally, such integrative analysis required desktop applications operating on locally stored data. Many current terabyte-size datasets generated by large public consortia projects, however, are already only feasibly stored at specialist genome analysis centers. As even small laboratories can afford very large datasets, local storage and analysis are becoming increasingly limiting, and it is likely that most such datasets will soon be stored remotely, e.g. in the cloud. These developments will require web-based tools that enable users to access, analyze and view vast remotely stored data with a level of sophistication and interactivity that approximates desktop applications. As rapidly dropping cost enables researchers to collect data intended to answer questions in very specialized contexts, developers must also provide software libraries that empower users to implement customized data analyses and data views for their particular application. Such specialized, yet lightweight, applications would empower scientists to better answer specific biological questions than possible with general-purpose genome browsers currently available. Using recent advances in core web technologies (HTML5), we developed Scribl, a flexible genomic visualization library specifically targeting coordinate-based data such as genomic features, DNA sequence and genetic variants. Scribl simplifies the development of sophisticated web-based graphical tools that approach the dynamism and interactivity of desktop applications. Software is freely available online at http://chmille4.github.com/Scribl/ and is implemented in JavaScript with all modern browsers supported.

  2. Need a Special Tool? Make It Yourself!

    ERIC Educational Resources Information Center

    Mordini, Robert D.

    2007-01-01

    People seem to have created a tool for every purpose. If a person searches diligently, he can usually find the tool he needs. However, several things may affect this process such as time, cost of the tool, and limited tool sources. The solution to all these is to make the tool yourself. People have made tools for many thousands of years, and with…

  3. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    NASA Astrophysics Data System (ADS)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, even those applicable to Earth science, have targeted the business industry; in fact, the literature is nearly devoid of discussion of Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science datasets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, goals which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of data analytics tool/technique requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  4. GAMES II Project: a general architecture for medical knowledge-based systems.

    PubMed

    Bruno, F; Kindler, H; Leaning, M; Moustakis, V; Scherrer, J R; Schreiber, G; Stefanelli, M

    1994-10-01

    GAMES II aims at developing a comprehensive and commercially viable methodology to avoid problems ordinarily occurring in KBS development. GAMES II methodology proposes to design a KBS starting from an epistemological model of medical reasoning (the Select and Test Model). The design is viewed as a process of adding symbol level information to the epistemological model. The architectural framework provided by GAMES II integrates the use of different formalisms and techniques providing a large set of tools. The user can select the most suitable one for representing a piece of knowledge after a careful analysis of its epistemological characteristics. Special attention is devoted to the tools dealing with knowledge acquisition (both manual and automatic). A panel of practicing physicians are assessing the medical value of such a framework and its related tools by using it in a practical application.

  5. 76 FR 11361 - Defense Federal Acquisition Regulation Supplement; Preservation of Tooling for Major Defense...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-02

    ... tooling, but should include ``all property, i.e., special test equipment, ground support equipment, machine tools and machines and other intangibles to maintain capability.'' Response: DoD is fully...

  6. An Overview of the Runtime Verification Tool Java PathExplorer

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Rosu, Grigore; Clancy, Daniel (Technical Monitor)

    2002-01-01

    We present an overview of the Java PathExplorer runtime verification tool, in short referred to as JPAX. JPAX can monitor the execution of a Java program and check that it conforms with a set of user provided properties formulated in temporal logic. JPAX can in addition analyze the program for concurrency errors such as deadlocks and data races. The concurrency analysis requires no user provided specification. The tool facilitates automated instrumentation of a program's bytecode, which when executed will emit an event stream, the execution trace, to an observer. The observer dispatches the incoming event stream to a set of observer processes, each performing a specialized analysis, such as the temporal logic verification, the deadlock analysis and the data race analysis. Temporal logic specifications can be formulated by the user in the Maude rewriting logic, where Maude is a high-speed rewriting system for equational logic, but here extended with executable temporal logic. The Maude rewriting engine is then activated as an event driven monitoring process. Alternatively, temporal specifications can be translated into efficient automata, which check the event stream. JPAX can be used during program testing to gain increased information about program executions, and can potentially furthermore be applied during operation to survey safety critical systems.
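    The observer pattern described, a monitor consuming an event stream and checking a temporal property, can be sketched as below. This is a generic illustration of runtime verification, not the JPAX API; the property ("no release before a matching acquire") and the event names are invented.

```python
# Minimal runtime-verification sketch: scan an execution trace (event stream)
# and check the past-time temporal property "effect never occurs before cause
# has occurred". Returns a verdict plus the index of the first violation.

def monitor_always_precedes(trace, cause, effect):
    seen_cause = False
    for i, event in enumerate(trace):
        if event == cause:
            seen_cause = True
        elif event == effect and not seen_cause:
            return (False, i)  # property violated at event i
    return (True, None)

ok_trace = ["init", "acquire", "work", "release"]
bad_trace = ["init", "release", "acquire"]
print(monitor_always_precedes(ok_trace, "acquire", "release"))   # (True, None)
print(monitor_always_precedes(bad_trace, "acquire", "release"))  # (False, 1)
```

    A tool like the one described generalizes this idea: instrumented bytecode emits the trace, and the property is specified declaratively (e.g. in temporal logic) rather than hard-coded as above.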

  7. Big Data and Neuroimaging.

    PubMed

    Webb-Vargas, Yenny; Chen, Shaojie; Fisher, Aaron; Mejia, Amanda; Xu, Yuting; Crainiceanu, Ciprian; Caffo, Brian; Lindquist, Martin A

    2017-12-01

    Big Data are of increasing importance in a variety of areas, especially in the biosciences. There is an emerging critical need for Big Data tools and methods, because of the potential impact of advancements in these areas. Importantly, statisticians and statistical thinking have a major role to play in creating meaningful progress in this arena. We would like to emphasize this point in this special issue, as it highlights both the dramatic need for statistical input for Big Data analysis and for a greater number of statisticians working on Big Data problems. We use the field of statistical neuroimaging to demonstrate these points. As such, this paper covers several applications and novel methodological developments of Big Data tools applied to neuroimaging data.

  8. BDA special care case mix model.

    PubMed

    Bateman, P; Arnold, C; Brown, R; Foster, L V; Greening, S; Monaghan, N; Zoitopoulos, L

    2010-04-10

    Routine dental care provided in special care dentistry is complicated by patient-specific factors which increase the time taken and the cost of treatment. The BDA have developed and conducted a field trial of a case mix tool to measure this complexity. For each episode of care, the case mix tool assesses the following on a four-point scale: 'ability to communicate', 'ability to cooperate', 'medical status', 'oral risk factors', 'access to oral care' and 'legal and ethical barriers to care'. The tool is reported to be easy to use and to capture sufficient detail to discriminate between the types of service and special care dentistry provided. It offers potential as a simple-to-use and clinically relevant source of performance management and commissioning data. This paper describes the model, demonstrates how it is currently being used, and considers future developments in its use.
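    The scoring structure described, six criteria each graded on a four-point scale, can be sketched as below. The 0-3 grading, the unweighted sum, and the banding thresholds are all hypothetical illustrations, not the BDA tool's published scoring rules.

```python
# Hypothetical case-mix scoring sketch: six criteria on a 0-3 scale, summed
# into an overall complexity score and banded. Thresholds are invented.

CRITERIA = [
    "ability to communicate", "ability to cooperate", "medical status",
    "oral risk factors", "access to oral care",
    "legal and ethical barriers to care",
]

def case_mix_score(grades):
    if set(grades) - set(CRITERIA):
        raise ValueError("unknown criterion")
    if any(not 0 <= grades[c] <= 3 for c in grades):
        raise ValueError("grades must be on a 0-3 scale")
    return sum(grades.get(c, 0) for c in CRITERIA)  # unweighted sum (assumed)

def complexity_band(score):  # invented banding thresholds
    return "standard" if score <= 3 else "moderate" if score <= 9 else "severe"

episode = {c: 0 for c in CRITERIA}
episode["medical status"] = 3
episode["ability to cooperate"] = 2
score = case_mix_score(episode)
print(score, complexity_band(score))  # 5 moderate
```

    Structured scores of this kind are what make the tool usable as commissioning data: episodes become comparable across services.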

  9. Interpolation problem for the solutions of linear elasticity equations based on monogenic functions

    NASA Astrophysics Data System (ADS)

    Grigor'ev, Yuri; Gürlebeck, Klaus; Legatiuk, Dmitrii

    2017-11-01

    Interpolation is an important tool for many practical applications, and very often it is beneficial to interpolate not with a simple basis system, but rather with solutions of a certain differential equation, e.g. the elasticity equation. A typical example of this type of interpolation is the collocation method, widely used in practice. It is known that interpolation theory is fully developed in the framework of classical complex analysis. However, in quaternionic analysis, which shows many analogies to complex analysis, the situation is more complicated due to the non-commutative multiplication: a fundamental theorem of algebra is not available, and standard tools from linear algebra cannot be applied in the usual way. To overcome these problems, a special system of monogenic polynomials, the so-called Pseudo Complex Polynomials, sharing some properties of complex powers, is used. In this paper, we present an approach to the interpolation problem in which solutions of the elasticity equations in three dimensions are used as an interpolation basis.

  10. Assessing How Participators Combine Acts in Their "Political Tool Kits": A Person-Centered Measurement Approach for Analyzing Citizen Participation.

    PubMed

    Oser, Jennifer

    2017-01-01

    Scholars have recognized that a recent increase in the ways citizens participate beyond the electoral arena may be a promising avenue of renewal for citizen participation. In this article we test the theory that different kinds of citizenship norms motivate some citizens to specialize in electoral-oriented activities (e.g. voting), while others specialize in non-institutionalized activities (e.g. protest). The latent class analysis of data from the U.S. Citizen, Involvement and Democracy Survey (2005) in the current study assesses how actors combine a variety of acts in their "political tool kits" of participation, and facilitates a comparison to prior findings that analyze single political behaviors. Results indicate a participatory type that specializes in non-institutionalized acts, but the group's high probability of voting does not align with the expectations in the literature. An electoral-oriented specialist type is not identified; instead, the findings show that a majority of the population is best characterized as disengaged, while a small group of all-around activists embrace all possible opportunities for political action. The actor-centered theoretical and measurement approach in this study identifies caveats to the theory that changing citizenship norms are leading to civic and political renewal. We discuss the implications of these findings for measuring different aspects of democratic (dis)engagement and participatory (in)equality.

  11. 78 FR 40921 - Amendment to the International Traffic in Arms Regulations: Continued Implementation of Export...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-08

    ... specially designed parts, components, accessories, and attachments therefor (a) in production, (b... production, testing and inspection equipment, and tooling, specially designed for plants or facilities..., a definition for specially designed, and responses to public comments and changes to other sections...

  12. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

    This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on the mechanical design reliability of propulsion systems. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  13. Implementation of lean manufacturing for frozen fish process at PT. XYZ

    NASA Astrophysics Data System (ADS)

    Setiyawan, D. T.; Pertiwijaya, H. R.; Effendi, U.

    2018-03-01

    PT. XYZ is a company specialized in the processing of fishery products, particularly frozen fish fillets. The purpose of this research was to identify the types of waste and determine recommendations for minimizing them. A lean manufacturing approach was used to identify waste by drawing the Value Stream Mapping (VSM) and selecting tools from the Value Stream Analysis Tools (VALSAT). The results of this research showed that the largest waste generated was defective (leaking) packaging on fillet products, with an average of 1.21%. In addition to defects, other wastes were found, such as unnecessary motion, unnecessary overhead, and waiting time. Recommended improvements include reducing time at several stages of the process, making production schedules, and conducting regular machine maintenance. The VSM analysis shows a lead time reduced from 582.04 minutes to 572.01 minutes.

  14. Interactive Visualization of Computational Fluid Dynamics using Mosaic

    NASA Technical Reports Server (NTRS)

    Clucas, Jean; Watson, Velvin; Chancellor, Marisa K. (Technical Monitor)

    1994-01-01

    The Web provides new methods for accessing information world-wide, but the current text-and-pictures approach neither utilizes all of the Web's possibilities nor compensates for its limitations. While the inclusion of pictures and animations in a paper communicates more effectively than text alone, it is essentially an extension of the concept of "publication." Also, as use of the Web increases, putting images and animations online will quickly overload even the "Information Superhighway." We need to find forms of communication that take advantage of the special nature of the Web. This paper presents one approach: the use of the Internet and the Mosaic interface for data sharing and collaborative analysis. We describe (and, in the presentation, demonstrate) our approach: using FAST (Flow Analysis Software Toolkit), a scientific visualization package, as a data viewer and interactive tool called from Mosaic. Our intent is to stimulate the development of other tools that utilize the unique nature of electronic communication.

  15. Snoopy--a unifying Petri net framework to investigate biomolecular networks.

    PubMed

    Rohr, Christian; Marwan, Wolfgang; Heiner, Monika

    2010-04-01

    To investigate biomolecular networks, Snoopy provides a unifying Petri net framework comprising a family of related Petri net classes. Models can be hierarchically structured, allowing for the mastering of larger networks. To move easily between the qualitative, stochastic and continuous modelling paradigms, models can be converted into each other. We get models sharing structure, but specialized by their kinetic information. The analysis and iterative reverse engineering of biomolecular networks is supported by the simultaneous use of several Petri net classes, while the graphical user interface adapts dynamically to the active one. Built-in animation and simulation are complemented by exports to various analysis tools. Snoopy facilitates the addition of new Petri net classes thanks to its generic design. Our tool with Petri net samples is available free of charge for non-commercial use at http://www-dssz.informatik.tu-cottbus.de/snoopy.html; supported operating systems: Mac OS X, Windows and Linux (selected distributions).
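
    The stochastic paradigm mentioned in the abstract can be illustrated with a minimal Gillespie simulation of a two-transition Petri net (production and degradation of a single species). The net structure and rate constants below are invented for illustration and are not a Snoopy model file.

```python
import random

def ssa(k_prod=5.0, k_deg=0.1, x0=0, t_end=100.0, seed=42):
    """Gillespie SSA for a Petri net with two transitions: *->X and X->*."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    while t < t_end:
        a1, a2 = k_prod, k_deg * x      # transition propensities
        a0 = a1 + a2
        t += rng.expovariate(a0)        # exponential waiting time to next firing
        if rng.random() < a1 / a0:
            x += 1                       # production transition fires
        else:
            x -= 1                       # degradation transition fires
    return x

final_tokens = ssa()
```

    At steady state the token count fluctuates around k_prod/k_deg (here 50), which is the continuous-paradigm fixed point of the same net.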

  16. Multilayer perceptron with local constraint as an emerging method in spatial data analysis

    NASA Astrophysics Data System (ADS)

    de Bollivier, M.; Dubois, G.; Maignan, M.; Kanevsky, M.

    1997-02-01

    The use of Geographic Information Systems has revolutionized the handling and visualization of geo-referenced data and has underlined the critical role of spatial analysis. The usual tool for this purpose is geostatistics, which is widely used in the Earth sciences. Geostatistics is based on several hypotheses that are not always verified in practice. Artificial Neural Networks (ANNs), on the other hand, can a priori be used without special assumptions and are known to be flexible. This paper discusses the application of ANNs to the interpolation of a geo-referenced variable.
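
    The ANN alternative can be sketched as follows: a one-hidden-layer perceptron trained by plain gradient descent on scattered spatial samples, with no stationarity hypothesis. The field, network size, and learning rate below are arbitrary illustrations, not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic geo-referenced samples: a smooth field z = x^2 + y observed at
# 200 scattered (x, y) locations (purely illustrative data).
pts = rng.uniform(0.0, 1.0, size=(200, 2))
vals = pts[:, 0] ** 2 + pts[:, 1]

H, lr = 32, 0.05                              # hidden units, learning rate
W1 = rng.normal(0.0, 1.0, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.1, (H, 1)); b2 = np.zeros(1)
for _ in range(3000):
    h = np.tanh(pts @ W1 + b1)                # hidden layer
    err = (h @ W2 + b2).ravel() - vals        # prediction error
    g_out = (2.0 / len(vals)) * err[:, None]  # d(MSE)/d(output)
    g_h = (g_out @ W2.T) * (1.0 - h ** 2)     # backprop through tanh
    W2 -= lr * h.T @ g_out; b2 -= lr * g_out.sum(0)
    W1 -= lr * pts.T @ g_h; b1 -= lr * g_h.sum(0)

# The trained network is the interpolator: evaluate it at any coordinate.
pred = (np.tanh(pts @ W1 + b1) @ W2 + b2).ravel()
mse = float(np.mean((pred - vals) ** 2))
```

    Unlike kriging, nothing here assumes a stationary covariance structure; the trade-off is that the fit quality depends on architecture and training choices.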

  17. An annotation system for 3D fluid flow visualization

    NASA Technical Reports Server (NTRS)

    Loughlin, Maria M.; Hughes, John F.

    1995-01-01

    Annotation is a key activity of data analysis. However, current systems for data analysis focus almost exclusively on visualization. We propose a system that integrates annotations into a visualization system. Annotations are embedded in the 3D data space, using the Post-it metaphor. This embedding allows context-based information storage and retrieval, and facilitates information sharing in collaborative environments. We provide a traditional database filter and a Magic Lens filter to create specialized views of the data. The system has been customized for fluid flow applications, with features that allow users to store parameters of visualization tools and sketch 3D volumes.

  18. Development and Validation of a Decision Tool for Early Identification of Adult Patients with Severe and Complex Eating Disorder Psychopathology in Need of Highly Specialized Care.

    PubMed

    Dingemans, Alexandra E; Goorden, Maartje; Lötters, Freek J B; Bouwmans, Clazien; Danner, Unna N; van Elburg, Annemarie A; van Furth, Eric F; Hakkaart-van Roijen, Leona

    2017-09-01

    Patients with complex and severe eating disorders often receive a number of ineffective and/or insufficient treatments. Direct referral of these patients to highly specialized tertiary treatment facilities at an earlier stage of the disorder is likely to be more (cost-)effective. The aim of the study was to develop a decision tool that aids clinicians in early identification of these patients. After criteria indicative of the severity and complexity of eating disorder psychopathology were identified by means of a systematic review of the literature and consultation of a focus group, a Delphi method was applied to obtain consensus from experts on the list of relevant criteria. Finally, the decision tool was validated in clinical practice, and cut-off criteria were established. The tool demonstrated good feasibility and validity for identifying patients for highly specialized tertiary care. The final decision tool consisted of five criteria that can easily be implemented in clinical practice. Copyright © 2017 John Wiley & Sons, Ltd and Eating Disorders Association.

  19. Precession technique and electron diffractometry as new tools for crystal structure analysis and chemical bonding determination.

    PubMed

    Avilov, A; Kuligin, K; Nicolopoulos, S; Nickolskiy, M; Boulahya, K; Portillo, J; Lepeshov, G; Sobolev, B; Collette, J P; Martin, N; Robins, A C; Fischione, P

    2007-01-01

    We have developed a new fast electron diffractometer with high dynamic range and linearity for crystal structure determination. Electron diffraction (ED) patterns can be scanned serially in front of a Faraday cage detector; the total measurement time for several hundred ED reflections can be tens of seconds, with high statistical accuracy (1-2%) for all measured intensities. This new tool can be installed on any type of TEM without any column modification and is linked to a specially developed electron beam precession "Spinning Star" system. Precession of the electron beam (Vincent-Midgley technique) reduces dynamical effects, also allowing the use of accurate intensities for crystal structure analysis. We describe the technical characteristics of this new tool together with the first experimental results. Accurate measurement of electron diffraction intensities by electron diffractometry opens new possibilities not only for solving unknown structures, but also for electrostatic potential determination and chemical bonding investigation. As an example, we present detailed atomic bonding information for CaF(2) as revealed for the first time by precise electron diffractometry.

  20. Special tool kit aids heavily garmented workers

    NASA Technical Reports Server (NTRS)

    Holmes, A. E.

    1966-01-01

    A triangular aluminum tool kit, filled with polyurethane, is constructed to receive various tools and hold them in a snug but quick-release fit as an aid to heavily gloved workers. The kit is designed to allow mounting within easily accessible reach and to protect the tools during storage.

  1. Reducing tool wear by partial cladding of critical zones in hot form tool by laser metal deposition

    NASA Astrophysics Data System (ADS)

    Vollmer, Robert; Sommitsch, Christof

    2017-10-01

    This paper presents a production method to reduce tool wear in hot stamping applications. Tool wear is usually observed in locally highly stressed areas subject to gliding movement between blank and tool surface. The solution shown is based on partial laser cladding of the tool surface with a wear-resistant coating to increase the lifespan of tool inserts. Preliminary studies showed good results with a material combination of tungsten carbide particles embedded in a metallic matrix. Different nickel-based alloys welded onto hot-work tool steel (1.2343) were tested mechanically in the interface zone. The material with the best bonding characteristics was chosen and reinforced with spherical tungsten carbide particles in a second laser welding step. Since the machining of tungsten carbides is very elaborate, a special manufacturing strategy was developed to reduce the milling effort as much as possible. Milling tests were carried out on special test specimens to prove machinability. As an outlook, a tool insert for a B-pillar is coated to perform real hot forming tests.

  2. Study of a direct visualization display tool for space applications

    NASA Astrophysics Data System (ADS)

    Pereira do Carmo, J.; Gordo, P. R.; Martins, M.; Rodrigues, F.; Teodoro, P.

    2017-11-01

    The study of a Direct Visualization Display Tool (DVDT) for space applications is reported. The review of novel technologies for a compact display tool is described. Several applications for this tool have been identified with the support of ESA astronauts and are presented. A baseline design is proposed. It consists mainly of OLEDs as the image source; a specially designed optical prism as relay optics; a Personal Digital Assistant (PDA) with a data acquisition card as the control unit; and voice control and a simplified keyboard as interfaces. The optical analysis and the final estimated performance are reported. The system is able to display information (text, pictures and/or video) at SVGA resolution directly to the astronaut over a Field of View (FOV) of 20x14.5 degrees. The image delivery system is a monocular Head Mounted Display (HMD) that weighs less than 100 g. The HMD optical system has an eye pupil of 7 mm and an eye relief distance of 30 mm.

  3. Artificial Intelligence Applications in Special Education: How Feasible? Final Report.

    ERIC Educational Resources Information Center

    Hofmeister, Alan M.; Ferrara, Joseph M.

    The research project investigated whether expert system tools have become sophisticated enough to be applied efficiently to problems in special education. (Expert systems are a development of artificial intelligence that combines the computer's capacity for storing specialized knowledge with a general set of rules intended to replicate the…

  4. Tools for Teaming: Resources for Linking Vocational Programs with Special Populations.

    ERIC Educational Resources Information Center

    Tavares, Barbara, Ed.

    This publication provides resources for linking vocational programs with five special populations. Sections 1-5 each focus on one special population and contain some or all of these resources: activities; recruitment; teacher tips; laws; staff development; funding streams; parent advice; instructional modifications; websites; community resources;…

  5. iCanPlot: Visual Exploration of High-Throughput Omics Data Using Interactive Canvas Plotting

    PubMed Central

    Sinha, Amit U.; Armstrong, Scott A.

    2012-01-01

    Increasing use of high-throughput genomic-scale assays requires effective visualization and analysis techniques to facilitate data interpretation. Moreover, existing tools often require programming skills, which discourages bench scientists from examining their own data. We have created iCanPlot, a compelling platform for visual data exploration based on the latest technologies. Using the recently adopted HTML5 Canvas element, we have developed a highly interactive tool to visualize tabular data and identify interesting patterns in an intuitive fashion without the need of any specialized computing skills. A module for geneset overlap analysis has been implemented on the Google App Engine platform: when the user selects a region of interest in the plot, the genes in the region are analyzed on the fly. The visualization and analysis are amalgamated for a seamless experience. Further, users can easily upload their data for analysis, which also makes it simple to share the analysis with collaborators. We illustrate the power of iCanPlot by showing an example of how it can be used to interpret histone modifications in the context of gene expression. PMID:22393367
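
    The on-the-fly geneset overlap analysis can be modeled, for instance, as a hypergeometric tail test: given N genes total, K in a gene set, and n selected in the plot region, how surprising is an overlap of k or more? This is a generic formulation of gene-set overlap, not iCanPlot's actual server code.

```python
from math import comb

def overlap_pvalue(N, K, n, k):
    """P(overlap >= k) when n genes are drawn from N, of which K are in the set."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# Toy example: 10 genes total, 4 in the set, 3 selected, all 3 in the set.
p = overlap_pvalue(10, 4, 3, 3)
```

    In practice N would be the background gene universe and the test would be repeated per gene set, with multiple-testing correction applied afterwards.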

  6. Editing of EIA coded, numerically controlled, machine tool tapes

    NASA Technical Reports Server (NTRS)

    Weiner, J. M.

    1975-01-01

    Editing of numerically controlled (N/C) machine tool tapes (8-level paper tape) using an interactive graphic display processor is described. A rapid technique required for correcting production errors in N/C tapes was developed using the interactive text editor on the IMLAC PDS-ID graphic display system and two special programs resident on disk. The correction technique and special programs for processing N/C tapes coded to EIA specifications are discussed.

  7. How to Enhance Awareness on Bullying for Special Needs Students Using "Edpuzzle" a Web 2.0 Tool

    ERIC Educational Resources Information Center

    Abou Afach, Sara; Kiwan, Elias; Semaan, Charbel

    2018-01-01

    The purpose for this study is to be able to deliver messages and life tips for special needs students in an easy way. For that, we used a web 2.0 visual tool "EdPuzzle" to show a video about bullying, having in it some questions to know if the message is delivered and understood by these students. The outcome of the study was positive…

  8. Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-10-01

    We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
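
    The "local specialized database of raw data plus extracted features" can be sketched with the standard-library sqlite3 module. The schema and feature names below are hypothetical illustrations, not the authors' NeuroManager schema or the Allen SDK API.

```python
import sqlite3

def build_feature_db(path, records):
    """Store per-cell electrophysiology features in a queryable local table."""
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS cell_features (
                        cell_id INTEGER PRIMARY KEY,
                        resting_potential_mv REAL,
                        spike_threshold_mv REAL,
                        mean_firing_rate_hz REAL)""")
    conn.executemany(
        "INSERT OR REPLACE INTO cell_features VALUES (?, ?, ?, ?)", records)
    conn.commit()
    return conn

def fast_spiking_cells(conn, rate_hz):
    # Select cells whose extracted firing-rate feature exceeds a cutoff
    cur = conn.execute("SELECT cell_id FROM cell_features "
                       "WHERE mean_firing_rate_hz > ? ORDER BY cell_id",
                       (rate_hz,))
    return [row[0] for row in cur]
```

    A workflow engine can then select model-fitting targets by querying this table instead of re-downloading and re-parsing raw sweeps.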

  9. Commodities Trading: An Essential Economic Tool.

    ERIC Educational Resources Information Center

    Welch, Mary A., Ed.

    1989-01-01

    This issue focuses on commodities trading as an essential economic tool. Activities include critical thinking about marketing decisions and discussion on how futures markets and options are used as important economic tools. Discussion questions and a special student project are included. (EH)

  10. Productivity improvement through cycle time analysis

    NASA Astrophysics Data System (ADS)

    Bonal, Javier; Rios, Luis; Ortega, Carlos; Aparicio, Santiago; Fernandez, Manuel; Rosendo, Maria; Sanchez, Alejandro; Malvar, Sergio

    1996-09-01

    A cycle time (CT) reduction methodology has been developed at the Lucent Technologies facility (former AT&T) in Madrid, Spain. It is based on a comparison of the contribution of each process step in each technology with a target generated by a cycle time model. These targeted cycle times are obtained using capacity data for the machines processing those steps, queuing theory, and theory of constraints (TOC) principles (buffers to protect the bottleneck and low cycle time/inventory everywhere else). Overall equipment effectiveness (OEE)-like analysis is done on the machine groups with major differences between their target cycle times and real values. Comparisons between the current values of the parameters that govern their capacity (process times, availability, idles, reworks, etc.) and the engineering standards are made to detect the causes of excess contribution to cycle time. Several friendly graphical tools have been developed to track and analyze those capacity parameters. Two tools have proved especially important: ASAP (analysis of scheduling, arrivals and performance) and Performer, which analyzes interrelation problems among machines, procedures, and direct labor. Performer is designed for a detailed, daily analysis of an isolated machine. The extensive use of this tool by the whole labor force has demonstrated impressive results in the elimination of multiple small inefficiencies, with direct positive implications for OEE. As for ASAP, it shows the lots in process/queue for different machines at the same time. ASAP is a powerful tool for analyzing product flow management and the assigned capacity for interdependent operations such as cleaning and oxidation/diffusion. Additional tools have been developed to track, analyze and improve process times and availability.
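
    The OEE-style decomposition behind this kind of analysis is the standard product availability x performance x quality. The figures in the example below are invented for illustration, not data from the Madrid facility.

```python
def oee(planned_min, downtime_min, ideal_cycle_min, units, good_units):
    """Standard OEE decomposition: availability x performance x quality."""
    run_time = planned_min - downtime_min
    availability = run_time / planned_min              # share of planned time running
    performance = ideal_cycle_min * units / run_time   # actual vs. ideal speed
    quality = good_units / units                       # first-pass yield
    return availability * performance * quality

# Example shift: 480 min planned, 60 min down, 0.8 min ideal cycle,
# 450 units produced, 430 good units.
score = oee(480, 60, 0.8, 450, 430)   # about 0.72
```

    Tracking the three factors separately, as the Performer tool does per machine, shows whether lost capacity comes from downtime, slow cycles, or rework.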

  11. What's Special about Human Imitation? A Comparison with Enculturated Apes.

    PubMed

    Subiaul, Francys

    2016-07-07

    What, if anything, is special about human imitation? An evaluation of enculturated apes' imitation skills, a "best case scenario" of non-human apes' imitation performance, reveals important similarities and differences between this special population of apes and human children. Candidates for shared imitation mechanisms include the ability to imitate various familiar transitive responses and object-object actions that involve familiar tools. Candidates for uniquely derived imitation mechanisms include imitating novel transitive actions and novel tool-using responses, as well as imitating opaque or intransitive gestures, regardless of familiarity. While the evidence demonstrates that enculturated apes outperform non-enculturated apes and perform more like human children, all apes, regardless of rearing history, generally excel at imitating familiar, over-rehearsed responses and are poor, relative to human children, at imitating novel, opaque or intransitive responses. Given the similarities between the sensory and motor systems of preschool-age human children and non-human apes, it is unlikely that differences in sensory input and/or motor output alone explain the observed discontinuities in imitation performance. The special rearing history of enculturated apes, including imitation-specific training, further diminishes arguments suggesting that differences are experience-dependent. Here, it is argued that such differences are best explained by distinct, specialized mechanisms that have evolved for copying rules and responses in particular content domains. Uniquely derived social and imitation learning mechanisms may represent adaptations for learning novel communicative gestures and complex tool use. Given our species' dependence on both language and tools, mechanisms that accelerated learning in these domains are likely to have faced intense selective pressures, starting with the earliest human ancestors.

  12. Diagnostic flexible pharyngo-laryngoscopy: development of a procedure specific assessment tool using a Delphi methodology.

    PubMed

    Melchiors, Jacob; Henriksen, Mikael Johannes Vuokko; Dikkers, Frederik G; Gavilán, Javier; Noordzij, J Pieter; Fried, Marvin P; Novakovic, Daniel; Fagan, Johannes; Charabi, Birgitte W; Konge, Lars; von Buchwald, Christian

    2018-05-01

    Proper training and assessment of skill in flexible pharyngo-laryngoscopy are central to the education of otorhinolaryngologists. To facilitate an evidence-based approach to curriculum development in this field, a structured analysis of what constitutes flexible pharyngo-laryngoscopy is necessary. Our aim was to develop an assessment tool based on this analysis. We conducted an international Delphi study involving experts from twelve countries on five continents. Utilizing reiterative assessment, the panel defined the procedure and reached consensus (defined as 80% agreement) on the phrasing of an assessment tool. Fifty panelists completed the Delphi process. The median age of the panelists was 44 years (range 33-64 years), and median experience in otorhinolaryngology was 15 years (range 6-35 years). Twenty-five were specialized in laryngology, 16 were head and neck surgeons, and nine were general otorhinolaryngologists. An assessment tool was created consisting of twelve distinct items. In conclusion, the gathering of validity evidence for assessment of core procedural skills within otorhinolaryngology is central to the development of competence-based education. The use of an international Delphi panel allows for the creation of an assessment tool that is widely applicable and valid. This work allows for an informed approach to technical skills training for flexible pharyngo-laryngoscopy and, as further validity evidence is gathered, for a valid assessment of clinical performance within this important skillset.

  13. Integrating diverse databases into an unified analysis framework: a Galaxy approach

    PubMed Central

    Blankenberg, Daniel; Coraor, Nathan; Von Kuster, Gregory; Taylor, James; Nekrutenko, Anton

    2011-01-01

    Recent technological advances have led to the ability to generate large amounts of data for model and non-model organisms. Whereas in the past there were a relatively small number of central repositories serving genomic data, an increasing number of distinct specialized data repositories and resources have been established. Here, we describe a generic approach that provides for the integration of a diverse spectrum of data resources into a unified analysis framework, Galaxy (http://usegalaxy.org). This approach allows the simplified coupling of external data resources with the data analysis tools available to Galaxy users, while leveraging the native data mining facilities of the external data resources. Database URL: http://usegalaxy.org PMID:21531983

  14. Scribl: an HTML5 Canvas-based graphics library for visualizing genomic data over the web

    PubMed Central

    Miller, Chase A.; Anthony, Jon; Meyer, Michelle M.; Marth, Gabor

    2013-01-01

    Motivation: High-throughput biological research requires simultaneous visualization as well as analysis of genomic data, e.g. read alignments, variant calls and genomic annotations. Traditionally, such integrative analysis required desktop applications operating on locally stored data. Many current terabyte-size datasets generated by large public consortia projects, however, are already only feasibly stored at specialist genome analysis centers. As even small laboratories can afford very large datasets, local storage and analysis are becoming increasingly limiting, and it is likely that most such datasets will soon be stored remotely, e.g. in the cloud. These developments will require web-based tools that enable users to access, analyze and view vast remotely stored data with a level of sophistication and interactivity that approximates desktop applications. As rapidly dropping cost enables researchers to collect data intended to answer questions in very specialized contexts, developers must also provide software libraries that empower users to implement customized data analyses and data views for their particular application. Such specialized, yet lightweight, applications would empower scientists to better answer specific biological questions than possible with general-purpose genome browsers currently available. Results: Using recent advances in core web technologies (HTML5), we developed Scribl, a flexible genomic visualization library specifically targeting coordinate-based data such as genomic features, DNA sequence and genetic variants. Scribl simplifies the development of sophisticated web-based graphical tools that approach the dynamism and interactivity of desktop applications. Availability and implementation: Software is freely available online at http://chmille4.github.com/Scribl/ and is implemented in JavaScript with all modern browsers supported. 
Contact: gabor.marth@bc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23172864

  15. Meta-analysis in Stata using gllamm.

    PubMed

    Bagos, Pantelis G

    2015-12-01

    There are several user-written programs for performing meta-analysis in Stata (Stata Statistical Software: College Station, TX: StataCorp LP). These include metan, metareg, mvmeta, and glst. However, there are several cases for which these programs do not suffice. For instance, there is no software for performing univariate meta-analysis with correlated estimates, for multilevel or hierarchical meta-analysis, or for meta-analysis of longitudinal data. In this work, we show with practical applications that many disparate models, including but not limited to the ones mentioned earlier, can be fitted using gllamm. The software is very versatile and can handle a wide variety of models with applications in a wide range of disciplines. The method presented here takes advantage of these modeling capabilities and makes use of appropriate transformations, based on the Cholesky decomposition of the inverse of the covariance matrix (generalized least squares), in order to handle correlated data. The models described earlier can be thought of as special instances of a general linear mixed-model formulation, but to the author's knowledge, a general exposition that incorporates all the available models for meta-analysis as special cases, together with instructions to fit them in Stata, has not been presented so far. Source code is available at http://www.compgen.org/tools/gllamm. Copyright © 2015 John Wiley & Sons, Ltd.
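
    The Cholesky-based GLS transformation described in the abstract can be sketched outside Stata as well. The numpy version below, with made-up effect sizes and covariance matrix, whitens correlated estimates with the Cholesky factor of V and pools them under an intercept-only design; it illustrates the transformation, not gllamm itself.

```python
import numpy as np

def gls_pooled(y, V):
    """Pool correlated effect sizes by GLS via Cholesky whitening."""
    y = np.asarray(y, dtype=float)
    X = np.ones((len(y), 1))        # intercept-only design: one common effect
    L = np.linalg.cholesky(np.asarray(V, dtype=float))  # V = L L'
    z = np.linalg.solve(L, y)       # whitened outcomes  L^{-1} y
    W = np.linalg.solve(L, X)       # whitened design    L^{-1} X
    beta, *_ = np.linalg.lstsq(W, z, rcond=None)  # OLS on whitened data = GLS
    se = float(np.sqrt(1.0 / (W[:, 0] @ W[:, 0])))  # sqrt of (X' V^{-1} X)^{-1}
    return float(beta[0]), se

# Two correlated estimates (e.g. a shared control group) plus one independent.
V = [[0.04, 0.02, 0.0],
     [0.02, 0.09, 0.0],
     [0.0,  0.0,  0.01]]
beta, se = gls_pooled([0.5, 0.7, 0.6], V)
```

    With a richer design matrix X the same whitening step yields correlated meta-regression, which is the mechanism the gllamm-based approach exploits.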

  16. The Social Construction of Ability in Movement Assessment Tools

    ERIC Educational Resources Information Center

    Tidén, Anna; Redelius, Karin; Lundvall, Suzanne

    2017-01-01

    This paper focuses on how "ability" is conceptualised, configured and produced in movement assessment tools. The aim of the study was to critically analyse assessment tools used for healthy and typically developed children. The sample consists of 10 tools from 6 different countries. In the study, we pay special attention to content and…

  17. Operant Conditioning: A Tool for Special Physical Educators in the 1980s.

    ERIC Educational Resources Information Center

    Dunn, John M.; French, Ron

    1982-01-01

    The usefulness of operant conditioning for the special physical educator in managing behavior problems is pointed out, and steps to follow in applying operant conditioning techniques are outlined. (SB)

  18. The West Virginia Special Education Technology Integration Specialist (SETIS) Program: 2012-2014 Evaluation Report

    ERIC Educational Resources Information Center

    Stohr, Amber D.

    2015-01-01

    The Special Education Technology Integration Specialist (SETIS) program provides professional development for special education teachers to assist them in achieving proficiency with 21st Century Technology Tools. The program completed its eighth and ninth rounds during the 2012-2013 and 2013-2014 school years, training more than 30 special…

  19. A Study of Special Education Teachers' Knowledge of Assistive Technology for Children with Reading Difficulties

    ERIC Educational Resources Information Center

    Sydeski, Randal T.

    2013-01-01

    This study investigated high school special education teachers' knowledge of assistive technology (AT) for students with reading difficulties in Southwestern Pennsylvania. A survey was disseminated via e-mail using the "SurveyMonkey" online survey tool to 201 special education teachers. The survey asked questions pertaining to the…

  20. The Critical Success Factors Method: Its Application in a Special Library Environment.

    ERIC Educational Resources Information Center

    Borbely, Jack

    1981-01-01

    Discusses the background and theory of the Critical Success Factors (CSF) management method, as well as its application in an information center or other special library environment. CSF is viewed as a management tool that can enhance the viability of the special library within its parent organization. (FM)

  1. Music Integration Therapy: An Instructional Tool for Students with Special Needs

    ERIC Educational Resources Information Center

    Rodriguez, Delilah

    2017-01-01

    Students with special needs are required by law to have an individualized education plan based on their unique educational needs. Special education teachers understand these needs and provide students with instructional strategies that allow them to succeed. Music has often been used to provide students with disabilities alternative ways to learn…

  2. Web-based platform for collaborative medical imaging research

    NASA Astrophysics Data System (ADS)

    Rittner, Leticia; Bento, Mariana P.; Costa, André L.; Souza, Roberto M.; Machado, Rubens C.; Lotufo, Roberto A.

    2015-03-01

Medical imaging research depends fundamentally on the availability of large image collections, image processing and analysis algorithms, hardware, and a multidisciplinary research team. It must be reproducible, free of errors, fast, accessible from a wide variety of devices spread across research centers, and conducted simultaneously by a multidisciplinary team. We therefore propose a collaborative research environment, named Adessowiki, in which tools and datasets are integrated and readily available on the Internet through a web browser. Moreover, the processing history and all intermediate results are stored and displayed in automatically generated web pages for each object in the research project or clinical study. Adessowiki requires no installation or configuration on the client side and offers centralized tools and specialized hardware resources, since processing takes place in the cloud.

  3. Nonlinear dynamics of a machining system with two interdependent delays

    NASA Astrophysics Data System (ADS)

    Gouskov, Alexander M.; Voronov, Sergey A.; Paris, Henri; Batzer, Stephen A.

    2002-12-01

The dynamics of turning by a tool head with two rows, each containing several cutters, is considered. A mathematical model of a process with two interdependent delays, with the possibility of cutting discontinuity, is analyzed. The domains of dynamic instability are derived, and the influence of technological parameters on system response is presented. The numerical analysis shows that there exist specific conditions in given regimes under which one row of cutters produces an intermittent chip while the other row produces continuous chips. It is demonstrated that the contribution of parametric excitation by the shape roughness of an imperfect (unmachined) cylindrical workpiece surface is not substantial, due to the special filtering properties of cutters that are uniformly distributed circumferentially along the tool head.
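The regenerative mechanism behind such models can be sketched with the textbook single-delay turning equation (a generic illustration only; the authors' formulation couples two interdependent delays, one per cutter row):

```latex
% Single-degree-of-freedom regenerative turning model (generic sketch):
%   x(t)  relative tool-workpiece displacement
%   h_0   nominal chip thickness,  w  chip width,  K_c  cutting-force coefficient
%   tau   delay between successive passes over the same surface point
m\,\ddot{x}(t) + c\,\dot{x}(t) + k\,x(t) = K_c\,w\,\bigl[h_0 + x(t-\tau) - x(t)\bigr]
```

With two rows of cutters, the chip thickness seen by each row depends on the surface last cut by the other row, so two such delayed terms appear and the delays become interdependent.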

  4. Visual readability analysis: how to make your writings easier to read.

    PubMed

    Oelke, Daniela; Spretke, David; Stoffel, Andreas; Keim, Daniel A

    2012-05-01

We present a tool that is specifically designed to support a writer in revising a draft version of a document. In addition to showing which paragraphs and sentences are difficult to read and understand, we assist the writer in understanding why this is the case. This requires features that are expressive predictors of readability and are also semantically understandable. In the first part of the paper, we therefore discuss a semiautomatic feature selection approach that is used to choose appropriate measures from a collection of 141 candidate readability features. In the second part, we present the visual analysis tool VisRA, which allows the user to analyze the feature values across the text and within single sentences. Users can choose between different visual representations accounting for differences in the size of the documents and the availability of information about the physical and logical layout of the documents. We put special emphasis on providing as much transparency as possible to ensure that the user can purposefully improve the readability of a sentence. Several case studies are presented that show the wide range of applicability of our tool. Furthermore, an in-depth evaluation assesses the quality of the measures and investigates how well users do in revising a text with the help of the tool.
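Surface features of this kind are simple to compute. As a minimal illustration (not VisRA's actual feature set, which draws on 141 candidate measures), two classic readability predictors can be derived directly from the text:

```python
import re

def readability_features(text):
    """Compute two simple surface features often used as readability
    predictors: average sentence length (in words) and average word
    length (in characters)."""
    # Split into sentences on terminal punctuation (a rough heuristic).
    sentences = [s for s in re.split(r'[.!?]+', text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    avg_sentence_len = len(words) / len(sentences)
    avg_word_len = sum(len(w) for w in words) / len(words)
    return avg_sentence_len, avg_word_len

sent_len, word_len = readability_features(
    "Short words help. Long convoluted sentences hinder comprehension.")
```

Real readability measures add syntactic and semantic features on top of such surface statistics, which is what makes semiautomatic feature selection necessary.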

  5. A Knowledge Portal and Collaboration Environment for the Earth Sciences

    NASA Astrophysics Data System (ADS)

    D'Agnese, F. A.

    2008-12-01

Earth Knowledge is developing a web-based 'Knowledge Portal and Collaboration Environment' that will serve as the information-technology-based foundation of a modular Internet-based Earth-Systems Monitoring, Analysis, and Management Tool. This 'Knowledge Portal' is essentially a 'mash-up' of web-based and client-based tools and services that support on-line collaboration, community discussion, and broad public dissemination of earth and environmental science information in a wide-area distributed network. In contrast to specialized knowledge-management or geographic-information systems developed for long-term and incremental scientific analysis, this system will exploit familiar software tools using industry standard protocols, formats, and APIs to discover, process, fuse, and visualize existing environmental datasets using Google Earth and Google Maps. An early form of these tools and services is being used by Earth Knowledge to facilitate the investigations and conversations of scientists, resource managers, and citizen-stakeholders addressing water resource sustainability issues in the Great Basin region of the desert southwestern United States. These ongoing projects will serve as use cases for the further development of this information-technology infrastructure. This 'Knowledge Portal' will accelerate the deployment of Earth-system data and information into an operational knowledge management system that may be used by decision-makers concerned with stewardship of water resources in the American Desert Southwest.

  6. Compact, Non-Pneumatic Rock-Powder Samplers

    NASA Technical Reports Server (NTRS)

    Sherrit, Stewart; Bar-Cohen, Yoseph; Badescu, Mircea; Bao, Xiaoqi; Chang, Zensheu; Jones, Christopher; Aldrich, Jack

    2008-01-01

Tool bits that automatically collect the powdered rock, permafrost, or other hard material generated in repeated hammering action have been invented. The present invention pertains to the special case in which it is desired to collect samples in powder form for analysis by x-ray diffraction and possibly other techniques. It eliminates the need for both the mechanical collection equipment and crushing chamber and the pneumatic collection equipment of prior approaches, so that the overall sample-acquisition apparatus can be made more compact.

  7. A Longitudinal Analysis of the Acceptance Rates of the Navy’s Voluntary Separation Incentive/Special Separation Benefit (VSI/SSB) Program

    DTIC Science & Technology

    1993-09-23

Authorization act, as one of the most visible policy tools in its current strategy to downsize the military. The program has been fairly successful in...as substantial reenlistment bonuses to keep quality personnel. These policies have been successful. Today’s military is the most senior of any in the...last 50 years. Ironically, it is the successes of manpower planners in developing these policies, coupled with their increased understanding of the

  8. Mutational Signatures in Cancer (MuSiCa): a web application to implement mutational signatures analysis in cancer samples.

    PubMed

    Díaz-Gay, Marcos; Vila-Casadesús, Maria; Franch-Expósito, Sebastià; Hernández-Illán, Eva; Lozano, Juan José; Castellví-Bel, Sergi

    2018-06-14

Mutational signatures have proved to be a valuable pattern in somatic genomics, mainly in cancer, with potential application as a biomarker in clinical practice. To date, several bioinformatic packages addressing this topic have been developed in different languages/platforms. MutationalPatterns has emerged as the most efficient tool for comparison with the signatures currently reported in the Catalogue of Somatic Mutations in Cancer (COSMIC) database. However, the analysis of mutational signatures is currently restricted to a small community of bioinformatics experts. In this work we present Mutational Signatures in Cancer (MuSiCa), a new web tool based on MutationalPatterns and built using the Shiny framework in the R language. By means of a simple interface suited to non-specialized researchers, it provides a comprehensive analysis of the somatic mutational status of the supplied cancer samples. It permits characterizing the profile and burden of mutations, as well as quantifying COSMIC-reported mutational signatures. It also allows classifying samples according to the above signature contributions. MuSiCa is a helpful web application for characterizing mutational signatures in cancer samples. It is accessible online at http://bioinfo.ciberehd.org/GPtoCRC/en/tools.html and source code is freely available at https://github.com/marcos-diazg/musica.
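The core of comparing a sample against reported signatures is a similarity computation over mutation-channel profiles. A minimal sketch (toy 4-channel profiles with invented values, not MutationalPatterns' actual API; real COSMIC signatures use 96 trinucleotide-context channels):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two mutation-channel profiles."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy reference "signatures" over 4 mutation channels; the values are
# illustrative only.
signatures = {
    "SigA": [0.7, 0.1, 0.1, 0.1],
    "SigB": [0.1, 0.1, 0.1, 0.7],
}
sample = [14, 2, 2, 2]   # observed mutation counts per channel

# Pick the reference signature most similar to the sample profile.
best = max(signatures, key=lambda s: cosine_similarity(sample, signatures[s]))
```

Signature quantification goes a step further than this nearest-signature match, decomposing the sample profile into a non-negative combination of reference signatures.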

  9. An Interior Signage System for the USAF Academy Hospital

    DTIC Science & Technology

    1979-08-01

    manner. Graphic Design - Graphic design is a design for visual communication . Graphic Design Tools - There are four basic graphic design tools available...specializes in the design of two dimensional visual communication components. The graphic designer utilizes the four graphic design tools in developing

  10. Marshall Space Flight Center's Virtual Reality Applications Program 1993

    NASA Technical Reports Server (NTRS)

    Hale, Joseph P., II

    1993-01-01

A Virtual Reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. Other NASA Centers, most notably Ames Research Center (ARC), have contributed to the development of the VR enabling technologies and VR systems. This VR technology development has now reached a level of maturity where specific applications of VR as a tool can be considered. The objectives of the MSFC VR Applications Program are to develop, validate, and utilize VR as a Human Factors design and operations analysis tool and to assess and evaluate VR as a tool in other applications (e.g., training, operations development, mission support, teleoperations planning, etc.). The long-term goals of this technology program are to enable specialized Human Factors analyses earlier in the hardware and operations development process and to develop more effective training and mission support systems. The capability to perform specialized Human Factors analyses earlier in the hardware and operations development process is required to better refine and validate requirements during the requirements definition phase. This leads to a more efficient design process where perturbations caused by late-occurring requirements changes are minimized. A validated set of VR analytical tools must be developed to enable a more efficient process for the design and development of space systems and operations. Similarly, training and mission support systems must exploit state-of-the-art computer-based technologies to maximize training effectiveness and enhance mission support. The approach of the VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies.
These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems.

  11. SurvMicro: assessment of miRNA-based prognostic signatures for cancer clinical outcomes by multivariate survival analysis.

    PubMed

    Aguirre-Gamboa, Raul; Trevino, Victor

    2014-06-01

MicroRNAs (miRNAs) play a key role in the post-transcriptional regulation of mRNA levels. Their function in cancer has been studied by high-throughput methods generating valuable sources of public information, and miRNA signatures predicting cancer clinical outcomes are thus emerging. An important step in proposing miRNA-based biomarkers before clinical validation is their evaluation in independent cohorts. Although this can be carried out using public data, such a task is time-consuming and requires specialized analysis. Therefore, to aid and simplify the evaluation of prognostic miRNA signatures in cancer, we developed SurvMicro, a free and easy-to-use web tool that assesses miRNA signatures from publicly available miRNA profiles using multivariate survival analysis. SurvMicro is composed of a wide and updated database of >40 cohorts in different tissues and a web tool where survival analysis can be done in minutes. We present evaluations that portray the straightforward functionality of SurvMicro in liver and lung cancer. To our knowledge, SurvMicro is the only bioinformatic tool that aids the evaluation of multivariate prognostic miRNA signatures in cancer. SurvMicro and its tutorial are freely available at http://bioinformatica.mty.itesm.mx/SurvMicro.
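The kind of evaluation such tools automate can be sketched in a few lines: split a cohort on the median of a signature-derived risk score, then estimate survival per group with the Kaplan-Meier product-limit formula (the scores and follow-up data below are invented for illustration; this is not SurvMicro's implementation):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.  times: follow-up times;
    events: 1 = event observed, 0 = censored.  Returns a list of
    (time, S(t)) pairs at each observed event time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv, curve = 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = n_at_t = 0
        # Group tied follow-up times together.
        while i < len(order) and times[order[i]] == t:
            n_at_t += 1
            deaths += events[order[i]]
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= n_at_t
    return curve

# Split a cohort on the (upper) median of a hypothetical miRNA risk
# score, then estimate survival in the high-risk group.
scores = [0.2, 0.9, 0.4, 0.8, 0.1, 0.7]
times = [30, 5, 25, 10, 40, 8]
events = [0, 1, 1, 1, 0, 1]
median = sorted(scores)[len(scores) // 2]
high = [i for i, s in enumerate(scores) if s >= median]
km_high = kaplan_meier([times[i] for i in high], [events[i] for i in high])
```

A multivariate analysis would replace the single-score split with a Cox proportional-hazards model over several miRNAs and clinical covariates.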

  12. GenomeCAT: a versatile tool for the analysis and integrative visualization of DNA copy number variants.

    PubMed

    Tebel, Katrin; Boldt, Vivien; Steininger, Anne; Port, Matthias; Ebert, Grit; Ullmann, Reinhard

    2017-01-06

The analysis of DNA copy number variants (CNV) has increasing impact in the field of genetic diagnostics and research. However, the interpretation of CNV data derived from high-resolution array CGH or NGS platforms is complicated by the considerable variability of the human genome. Therefore, tools for multidimensional data analysis and comparison of patient cohorts are needed to assist in discriminating clinically relevant CNVs from others. We developed GenomeCAT, a standalone Java application for the analysis and integrative visualization of CNVs. GenomeCAT is composed of three modules dedicated, respectively, to the inspection of single cases, comparative analysis of multidimensional data, and group comparisons aimed at identifying recurrent aberrations in patients sharing the same phenotype. Its flexible import options ease the comparative analysis of a user's own results derived from microarray or NGS platforms with data from the literature or public repositories. Multidimensional data obtained from different experiment types can be merged into a common data matrix to enable common visualization and analysis. All results are stored in the integrated MySQL database, but can also be exported as tab-delimited files for further statistical calculations in external programs. GenomeCAT offers a broad spectrum of visualization and analysis tools that assist in the evaluation of CNVs in the context of other experiment data and annotations. The use of GenomeCAT does not require any specialized computer skills. The various R packages implemented for data analysis are fully integrated into GenomeCAT's graphical user interface, and the installation process is supported by a wizard. The flexibility in terms of data import and export, in combination with the ability to create a common data matrix, makes the program also well suited as an interface between genomic data from heterogeneous sources and external software tools. Due to its modular architecture, the functionality of GenomeCAT can be easily extended by further R packages or customized plug-ins to meet future requirements.
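One building block of such group comparisons, finding regions aberrant in a minimum number of patients, reduces to interval overlap counting. A minimal sweep-line sketch (not GenomeCAT's implementation; it assumes non-overlapping intervals within each sample):

```python
def recurrent_regions(cnvs_per_sample, min_samples):
    """Given one list of (start, end) CNV intervals per sample, return
    genomic segments (start, end, count) carried by at least
    min_samples samples.  Coordinates are half-open [start, end)."""
    boundary = []
    for intervals in cnvs_per_sample:
        for start, end in intervals:
            boundary.append((start, +1))
            boundary.append((end, -1))
    boundary.sort()
    regions, depth, prev = [], 0, None
    for pos, delta in boundary:
        # Emit the segment [prev, pos) if enough samples covered it.
        if prev is not None and pos > prev and depth >= min_samples:
            regions.append((prev, pos, depth))
        depth += delta
        prev = pos
    return regions

# Three samples; the region 12-20 is aberrant in all three.
samples = [[(5, 25)], [(10, 30)], [(12, 20)]]
hits = recurrent_regions(samples, min_samples=3)
```

Real CNV tools additionally track chromosome, aberration type (gain versus loss), and per-call quality, but the sweep over sorted breakpoints is the same.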

  13. Using Interactive Visualization to Analyze Solid Earth Data and Geodynamics Models

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.; Kreylos, O.; Billen, M. I.; Hamann, B.; Jadamec, M. A.; Rundle, J. B.; van Aalsburg, J.; Yikilmaz, M. B.

    2008-12-01

The geological sciences are challenged to manage and interpret increasing volumes of data as observations and simulations increase in size and complexity. Major projects such as EarthScope and GeoEarthScope are producing the data needed to characterize the structure and kinematics of Earth's surface and interior at unprecedented resolution. At the same time, high-performance computing enables high-precision and fine-detail simulation of geodynamics processes, complementing the observational data. To facilitate interpretation and analysis of these datasets, to evaluate models, and to drive future calculations, we have developed methods of interactive visualization with a special focus on using immersive virtual reality (VR) environments to interact with models of Earth's surface and interior. VR has traditionally been used primarily as a presentation tool allowing active navigation through data. Reaping the full intellectual benefits of immersive VR as a tool for accelerated scientific analysis requires building on the method's strengths, that is, using both 3D perception and interaction with observed or simulated data. Our approach to VR takes advantage of the specialized skills of geoscientists who are trained to interpret geological and geophysical data generated from field observations. Interactive tools allow the scientist to explore and interpret geodynamic models, tomographic models, and topographic observations, while feature extraction tools support quantitative measurement of structures that emerge from numerical simulations or field observations. The use of VR technology enables us to improve our interpretation of crust and mantle structure and of geodynamical processes. Mapping tools based on computer visualization allow virtual "field studies" in inaccessible regions, and an interactive tool allows us to construct digital fault models for use in numerical models.
Using the interactive tools on a high-end platform, such as an immersive virtual reality room known as a Cave Automatic Virtual Environment (CAVE), enables the scientist to stand inside a three-dimensional dataset while taking measurements. The CAVE involves three or more projection surfaces arranged as walls in a room. Stereo projectors combined with a motion tracking system recreate the immersive experience of carrying out research in the field. This high-end system provides significant advantages for scientists working with complex volumetric data. The interactive tools also work on low-cost platforms that provide stereo views and the potential for interactivity, such as a Geowall or a 3D-enabled TV. The Geowall is also a well-established tool for education and, in combination with the tools we have developed, enables the rapid transfer of research data and new knowledge to the classroom. The interactive visualization tools can also be used on a desktop or laptop with or without stereo capability. Further information about the Virtual Reality User Interface (VRUI), the 3DVisualizer, the virtual mapping tools, and the LIDAR viewer can be found on the KeckCAVES website, www.keckcaves.org.

  14. Quantifying Parental Influence on Youth Athlete Specialization: A Survey of Athletes' Parents.

    PubMed

    Padaki, Ajay S; Ahmad, Christopher S; Hodgins, Justin L; Kovacevic, David; Lynch, Thomas Sean; Popkin, Charles A

    2017-09-01

Youth athlete specialization has been linked to decreased enjoyment, burnout, and increased injury risk, although the impact of specialization on athletic success is unknown. The extent to which parents exert extrinsic influence on this phenomenon remains unclear. The goal of this study was to assess parental influences placed on young athletes to specialize. It was hypothesized that parents generate both direct and indirect pressures on specialized athletes. Cross-sectional study; Level of evidence, 3. A survey tool was designed by an interdisciplinary medical team to evaluate parental influence on youth specialization. Surveys were administered to parents of the senior author's orthopaedic pediatric patients. Of the 211 parents approached, 201 (95.3%) completed the assessment tool. One-third of parents stated that their children played a single sport only, 53.2% had children who played multiple sports but had a favorite sport, and 13.4% had children who balanced their multiple sports equally. Overall, 115 (57.2%) parents hoped for their children to play collegiately or professionally, and 100 (49.7%) parents encouraged their children to specialize in a single sport. Parents of highly specialized and moderately specialized athletes were more likely to report directly influencing their children's specialization (P = .038) and to expect their children to play collegiately or professionally (P = .014). Finally, parents who hired personal trainers for their children were more likely to believe that their children held collegiate or professional aspirations (P = .009). Parents influence youth athlete specialization both directly and by investment in elite coaching and personal instruction. Parents of more specialized athletes exert more influence than parents of unspecialized athletes.

  15. FunGene: the functional gene pipeline and repository.

    PubMed

    Fish, Jordan A; Chai, Benli; Wang, Qiong; Sun, Yanni; Brown, C Titus; Tiedje, James M; Cole, James R

    2013-01-01

    Ribosomal RNA genes have become the standard molecular markers for microbial community analysis for good reasons, including universal occurrence in cellular organisms, availability of large databases, and ease of rRNA gene region amplification and analysis. As markers, however, rRNA genes have some significant limitations. The rRNA genes are often present in multiple copies, unlike most protein-coding genes. The slow rate of change in rRNA genes means that multiple species sometimes share identical 16S rRNA gene sequences, while many more species share identical sequences in the short 16S rRNA regions commonly analyzed. In addition, the genes involved in many important processes are not distributed in a phylogenetically coherent manner, potentially due to gene loss or horizontal gene transfer. While rRNA genes remain the most commonly used markers, key genes in ecologically important pathways, e.g., those involved in carbon and nitrogen cycling, can provide important insights into community composition and function not obtainable through rRNA analysis. However, working with ecofunctional gene data requires some tools beyond those required for rRNA analysis. To address this, our Functional Gene Pipeline and Repository (FunGene; http://fungene.cme.msu.edu/) offers databases of many common ecofunctional genes and proteins, as well as integrated tools that allow researchers to browse these collections and choose subsets for further analysis, build phylogenetic trees, test primers and probes for coverage, and download aligned sequences. Additional FunGene tools are specialized to process coding gene amplicon data. For example, FrameBot produces frameshift-corrected protein and DNA sequences from raw reads while finding the most closely related protein reference sequence. These tools can help provide better insight into microbial communities by directly studying key genes involved in important ecological processes.

  16. A computer-aided ECG diagnostic tool.

    PubMed

    Oweis, Rami; Hijazi, Lily

    2006-03-01

Jordan lacks companies that provide local medical facilities with products that are of help in daily performed medical procedures. Because of this, the country imports most of these expensive products. Consequently, a local interest in producing such products has emerged and resulted in serious research efforts in this area. The main goal of this paper is to provide local (northern Jordan) clinics with a computer-aided electrocardiogram (ECG) diagnostic tool in an attempt to reduce time and work demands on busy physicians, especially in areas where only one general medicine doctor is employed and a bulk of cases must be diagnosed. The tool was designed to help detect heart defects such as arrhythmias and heart blocks using ECG signal analysis based on the time-domain representation, the frequency-domain spectrum, and the relationship between them. The application studied here represents a state-of-the-art ECG diagnostic tool that was designed, implemented, and tested in Jordan to serve a wide spectrum of the population, including patients from poor families. The results of applying the tool to a randomly selected representative sample showed about 99% agreement with results obtained at specialized medical facilities. Costs, ease of interface, and accuracy indicated the usefulness of the tool and its value as an assisting diagnostic tool.
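Time-domain ECG analysis of the kind described starts from R-peak detection. A deliberately simplified sketch (threshold-based peak picking on a synthetic trace, far cruder than a clinical detector, which would filter and adapt its threshold):

```python
def heart_rate_bpm(signal, fs, threshold):
    """Estimate heart rate from an ECG trace by naive threshold-based
    R-peak detection: a peak is a local maximum above the threshold.
    fs is the sampling rate in Hz."""
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > threshold
             and signal[i] >= signal[i - 1]
             and signal[i] > signal[i + 1]]
    if len(peaks) < 2:
        return None
    # Mean R-R interval in seconds -> beats per minute.
    rr = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(rr) / len(rr))

# Synthetic trace: an R-like spike every 100 samples at fs = 100 Hz,
# i.e. one beat per second.
fs = 100
sig = [1.0 if i % fs == 0 else 0.0 for i in range(400)]
bpm = heart_rate_bpm(sig, fs, threshold=0.5)
```

Irregular R-R intervals from the same detection step are what flag arrhythmias; block detection additionally needs the P-wave and QRS timing relationships.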

  17. Building Professional Learning Communities in Special Education through Social Networking: Directions for Future Research

    ERIC Educational Resources Information Center

    Hardman, Elizabeth L.

    2011-01-01

    This paper examines the challenges inherent in building professional learning communities (PLCs) in special education and describes how two Web 2.0 tools were used to build a community that engages general and special education teachers, school administrators, and teacher educators in implementing research based inclusive practices that are known…

  18. Pre-Service Special Education Teachers Acceptance and Use of ICT: A Structural Equation Model

    ERIC Educational Resources Information Center

    Yeni, Sabiha; Gecu-Parmaksiz, Zeynep

    2016-01-01

    Information and communication technology (ICT) supported education helps the individuals with special educational needs to take their attention to the course content and to concentrate their attention on the task they need to perform. The mechanical advantages of ICT tools make them attractive for individuals with special educational needs. If…

  19. New Earth Science Data and Access Methods

    NASA Technical Reports Server (NTRS)

    Moses, John F.; Weinstein, Beth E.; Farnham, Jennifer

    2004-01-01

NASA's Earth Science Enterprise, working with its domestic and international partners, provides scientific data and analysis to improve life here on Earth. NASA provides science data products that cover a wide range of physical, geophysical, biochemical and other parameters, as well as services for interdisciplinary Earth science studies. Management and distribution of these products are administered through the Earth Observing System Data and Information System (EOSDIS) Distributed Active Archive Centers (DAACs), each of which holds data within a different Earth science discipline. This paper will highlight selected EOS datasets and will focus on how these observations contribute to the improvement of essential services such as weather forecasting, climate prediction, air quality, and agricultural efficiency. Emphasis will be placed on new data products derived from instruments on board Terra, Aqua and ICESat as well as new regional data products and field campaigns. A variety of data tools and services are available to the user community. This paper will introduce primary and specialized DAAC-specific methods for finding, ordering and using these data products. Special sections will focus on orienting users unfamiliar with DAAC resources, HDF-EOS formatted data and the use of desktop research and application tools.

  20. Visualization of International Solar-Terrestrial Physics Program (ISTP) data

    NASA Technical Reports Server (NTRS)

    Kessel, Ramona L.; Candey, Robert M.; Hsieh, Syau-Yun W.; Kayser, Susan

    1995-01-01

    The International Solar-Terrestrial Physics Program (ISTP) is a multispacecraft, multinational program whose objective is to promote further understanding of the Earth's complex plasma environment. Extensive data sharing and data analysis will be needed to ensure the success of the overall ISTP program. For this reason, there has been a special emphasis on data standards throughout ISTP. One of the key tools will be the common data format (CDF), developed, maintained, and evolved at the National Space Science Data Center (NSSDC), with the set of ISTP implementation guidelines specially designed for space physics data sets by the Space Physics Data Facility (associated with the NSSDC). The ISTP guidelines were developed to facilitate searching, plotting, merging, and subsetting of data sets. We focus here on the plotting application. A prototype software package was developed to plot key parameter (KP) data from the ISTP program at the Science Planning and Operations Facility (SPOF). The ISTP Key Parameter Visualization Tool is based on the Interactive Data Language (IDL) and is keyed to the ISTP guidelines, reading data stored in CDF. With the combination of CDF, the ISTP guidelines, and the visualization software, we can look forward to easier and more effective data sharing and use among ISTP scientists.
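One way such guidelines ease automated searching and plotting is by mandating per-variable metadata. A small validator sketch (the attribute subset below reflects commonly cited ISTP required variable attributes and is illustrative, not the full guideline list):

```python
# Representative subset of per-variable attributes required by the
# ISTP/SPDF metadata guidelines (the full guideline list is longer).
REQUIRED_VAR_ATTRS = {"CATDESC", "FIELDNAM", "UNITS", "FILLVAL",
                      "VALIDMIN", "VALIDMAX"}

def missing_istp_attrs(variable_attrs):
    """Return, sorted, the required ISTP attributes a variable lacks."""
    return sorted(REQUIRED_VAR_ATTRS - set(variable_attrs))

# A hypothetical magnetic-field variable missing its validity range.
bx_attrs = {"CATDESC": "Magnetic field, X GSE component",
            "FIELDNAM": "Bx GSE",
            "UNITS": "nT",
            "FILLVAL": -1.0e31}
gaps = missing_istp_attrs(bx_attrs)
```

Because every compliant CDF variable carries these attributes, a plotting tool can label axes, mask fill values, and clip to valid ranges without dataset-specific code.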

  1. Overview of codes and tools for nuclear engineering education

    NASA Astrophysics Data System (ADS)

    Yakovlev, D.; Pryakhin, A.; Medvedeva, L.

    2017-01-01

Recent world trends in nuclear education have moved toward social education, networking, and virtual tools and codes. MEPhI, as a global leader in the world education market, implements new advanced technologies for distance and online learning and for student research work. MEPhI has produced special codes, tools and web resources based on an internet platform to support education in the field of nuclear technology. At the same time, MEPhI actively uses codes and tools from third parties. Several types of tools are considered: calculation codes, nuclear data visualization tools, virtual labs, PC-based educational simulators for nuclear power plants (NPP), CLP4NET, educational web platforms, and distance courses (MOOCs and controlled and managed content systems). The university pays special attention to integrated products such as CLP4NET, which is not a learning course but serves to automate the process of learning through distance technologies. CLP4NET organizes all tools in a single information space. Up to now, MEPhI has achieved significant results in the field of distance education and online system implementation.

  2. 75 FR 26321 - Seventeenth Plenary Meeting: RTCA Special Committee 203: Unmanned Aircraft Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-11

    ... RTCA Workspace Web Tool Special Committee Status Overview Workgroup Updates WG1--Systems Engineering..., Washington, DC 20036; telephone (202) 833-9339; fax (202) 833-9434; Web site http://www.rtca.org...

  3. What’s Special about Human Imitation? A Comparison with Enculturated Apes

    PubMed Central

    Subiaul, Francys

    2016-01-01

    What, if anything, is special about human imitation? An evaluation of enculturated apes’ imitation skills, a “best case scenario” of non-human apes’ imitation performance, reveals important similarities and differences between this special population of apes and human children. Candidates for shared imitation mechanisms include the ability to imitate various familiar transitive responses and object–object actions that involve familiar tools. Candidates for uniquely derived imitation mechanisms include: imitating novel transitive actions and novel tool-using responses as well as imitating opaque or intransitive gestures, regardless of familiarity. While the evidence demonstrates that enculturated apes outperform non-enculturated apes and perform more like human children, all apes, regardless of rearing history, generally excel at imitating familiar, over-rehearsed responses and are poor, relative to human children, at imitating novel, opaque or intransitive responses. Given the similarities between the sensory and motor systems of preschool age human children and non-human apes, it is unlikely that differences in sensory input and/or motor-output alone explain the observed discontinuities in imitation performance. The special rearing history of enculturated apes—including imitation-specific training—further diminishes arguments suggesting that differences are experience-dependent. Here, it is argued that such differences are best explained by distinct, specialized mechanisms that have evolved for copying rules and responses in particular content domains. Uniquely derived social and imitation learning mechanisms may represent adaptations for learning novel communicative gestures and complex tool-use. Given our species’ dependence on both language and tools, mechanisms that accelerated learning in these domains are likely to have faced intense selective pressures, starting with the earliest of human ancestors. PMID:27399786

  4. Integrated omics analysis of specialized metabolism in medicinal plants.

    PubMed

    Rai, Amit; Saito, Kazuki; Yamazaki, Mami

    2017-05-01

    Medicinal plants are a rich source of highly diverse specialized metabolites with important pharmacological properties. Until recently, plant biologists were limited in their ability to explore the biosynthetic pathways of these metabolites, mainly due to the scarcity of plant genomics resources. However, recent advances in high-throughput large-scale analytical methods have enabled plant biologists to discover biosynthetic pathways for important plant-based medicinal metabolites. The reduced cost of generating omics datasets and the development of computational tools for their analysis and integration have led to the elucidation of biosynthetic pathways of several bioactive metabolites of plant origin. These discoveries have inspired synthetic biology approaches to develop microbial systems to produce bioactive metabolites originating from plants, an alternative sustainable source of medicinally important chemicals. Since the demand for medicinal compounds is increasing with the world's population, understanding the complete biosynthesis of specialized metabolites becomes important for identifying or developing reliable sources in the future. Here, we review the contributions of major omics approaches and their integration to our understanding of the biosynthetic pathways of bioactive metabolites. We briefly discuss different approaches for integrating omics datasets to extract biologically relevant knowledge and the application of omics datasets in the construction and reconstruction of metabolic models. © 2017 The Authors The Plant Journal © 2017 John Wiley & Sons Ltd.

  5. Segmentation-free image processing and analysis of precipitate shapes in 2D and 3D

    NASA Astrophysics Data System (ADS)

    Bales, Ben; Pollock, Tresa; Petzold, Linda

    2017-06-01

    Segmentation-based image analysis techniques are routinely employed for quantitative analysis of complex microstructures containing two or more phases. The primary advantage of these approaches is that spatial information on the distribution of phases is retained, enabling subjective judgements of the quality of the segmentation and subsequent analysis process. The downside is that computing micrograph segmentations with data from morphologically complex microstructures gathered with error-prone detectors is challenging and, if no special care is taken, the artifacts of the segmentation will make any subsequent analysis and conclusions uncertain. In this paper we demonstrate, using a two-phase nickel-base superalloy microstructure as a model system, a new methodology for analysis of precipitate shapes using a segmentation-free approach based on the histogram of oriented gradients feature descriptor, a classic tool in image analysis. The benefits of this methodology for analysis of microstructure in two and three dimensions are demonstrated.
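
    The histogram of oriented gradients descriptor that this approach builds on can be sketched in a few lines of NumPy. This is an illustration of the classic single-cell computation, not the paper's implementation; the 9-bin unsigned-orientation setup is the conventional default, assumed here.

```python
import numpy as np

def hog_cell(image, n_bins=9):
    """Histogram of oriented gradients for a single image patch.

    A minimal sketch: compute finite-difference gradients, then bin
    gradient orientations (0-180 degrees) weighted by magnitude.
    """
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    # Unsigned orientation in [0, 180) degrees, as in the classic descriptor
    orientation = np.rad2deg(np.arctan2(gy, gx)) % 180.0
    bins = np.floor(orientation / (180.0 / n_bins)).astype(int) % n_bins
    hist = np.bincount(bins.ravel(), weights=magnitude.ravel(), minlength=n_bins)
    norm = np.linalg.norm(hist)
    return hist / norm if norm > 0 else hist

# A patch containing a vertical edge: all gradient weight falls in the
# 0-degree (horizontal-gradient) bin
patch = np.zeros((8, 8))
patch[:, 4:] = 1.0
descriptor = hog_cell(patch)
```

    A full descriptor concatenates such cell histograms over the image with block normalization; the shape statistics the paper derives come from aggregating these orientation distributions rather than from a segmented mask.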

  6. Review of free software tools for image analysis of fluorescence cell micrographs.

    PubMed

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with no or little knowledge of image processing. In this review, we give an overview of available tools and guidelines about which tools the users should use to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone, Matlab-based, ImageJ-based, free demo versions of commercial tools and data sharing tools. The review consists of two parts: First, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks with figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and to use. Most important is that the usability matches the functionality of a tool. To be usable, specialized tools with less functionality need to fulfill less usability criteria, whereas multipurpose tools need a well-structured menu and intuitive graphical user interface. © 2014 Fraunhofer-Institute for Integrated Circuits IIS Journal of Microscopy © 2014 Royal Microscopical Society.
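
    The figure-ground separation by thresholding that the case study relies on is typically automated with Otsu's method, which picks the threshold maximizing between-class variance. Below is a minimal NumPy sketch for illustration only; each reviewed tool ships its own thresholding implementation.

```python
import numpy as np

def otsu_threshold(image, n_bins=256):
    """Return the threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(image.ravel(), bins=n_bins)
    p = hist.astype(float) / hist.sum()          # probability mass per bin
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(p)                            # weight of background class
    w1 = 1.0 - w0
    mu0 = np.cumsum(p * centers)                 # unnormalized background mean
    mu_total = mu0[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mu_total * w0 - mu0) ** 2 / (w0 * w1)
    between[~np.isfinite(between)] = 0.0
    return centers[np.argmax(between)]

# Synthetic micrograph: dim background with one bright "cell"
rng = np.random.default_rng(0)
img = rng.normal(0.2, 0.05, (64, 64))
img[16:32, 16:32] = rng.normal(0.8, 0.05, (16, 16))
t = otsu_threshold(img)
mask = img > t          # figure-ground mask
```

    Cell separation, the step most tools in the review struggled with, requires going beyond this global mask, e.g. with watershed or seeded region growing.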

  7. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research.

    PubMed

    Campagnola, Luke; Kratz, Megan B; Manis, Paul B

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org.

  8. Bibliographic Projects and Tools in Israel.

    ERIC Educational Resources Information Center

    Kedar, Rochelle

    This paper presents several of the most prominent bibliographic tools and projects current in Israel, as well as a few specialized and less well-known projects. Bibliographic tools include the Israel Union Catalog and the Israel Union List of Serials. The following are the major bibliographic projects described: the National Jewish Bibliography…

  9. A survey of compiler development aids. [concerning lexical, syntax, and semantic analysis

    NASA Technical Reports Server (NTRS)

    Buckles, B. P.; Hodges, B. C.; Hsia, P.

    1977-01-01

    A theoretical background was established for the compilation process by dividing it into five phases and explaining the concepts and algorithms that underpin each. The five selected phases were lexical analysis, syntax analysis, semantic analysis, optimization, and code generation. Graph theoretical optimization techniques were presented, and approaches to code generation were described for both one-pass and multipass compilation environments. Following the initial tutorial sections, more than 20 tools that were developed to aid in the process of writing compilers were surveyed. Eight of the more recent compiler development aids were selected for special attention - SIMCMP/STAGE2, LANG-PAK, COGENT, XPL, AED, CWIC, LIS, and JOCIT. The impact of compiler development aids was assessed, and some of their shortcomings and some of the areas of research currently in progress were inspected.
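
    The lexical-analysis phase that opens the tutorial can be illustrated with a small regular-expression tokenizer. This Python sketch is generic; the surveyed aids targeted the languages of their era, and the token classes below are invented for illustration.

```python
import re

# Token classes for a toy expression language (illustrative only)
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("LPAREN", r"\("),
    ("RPAREN", r"\)"),
    ("SKIP",   r"\s+"),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Lexical analysis: split source text into (kind, lexeme) pairs."""
    tokens = []
    pos = 0
    while pos < len(source):
        m = MASTER.match(source, pos)
        if m is None:
            raise SyntaxError(f"illegal character at position {pos}")
        if m.lastgroup != "SKIP":            # drop whitespace tokens
            tokens.append((m.lastgroup, m.group()))
        pos = m.end()
    return tokens

tokens = tokenize("area = width * 2")
# tokens: [("IDENT", "area"), ("OP", "="), ("IDENT", "width"),
#          ("OP", "*"), ("NUMBER", "2")]
```

    The token stream produced here is exactly what the syntax-analysis phase consumes next; tools like LANG-PAK automated the generation of such lexers from declarative specifications.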

  10. NEFI: Network Extraction From Images

    PubMed Central

    Dirnberger, M.; Kehl, T.; Neumann, A.

    2015-01-01

    Networks are amongst the central building blocks of many systems. Given a graph of a network, methods from graph theory enable a precise investigation of its properties. Software for the analysis of graphs is widely available and has been applied to study various types of networks. In some applications, graph acquisition is relatively simple. However, for many networks data collection relies on images where graph extraction requires domain-specific solutions. Here we introduce NEFI, a tool that extracts graphs from images of networks originating in various domains. Regarding previous work on graph extraction, theoretical results are fully accessible only to an expert audience and ready-to-use implementations for non-experts are rarely available or insufficiently documented. NEFI provides a novel platform allowing practitioners to easily extract graphs from images by combining basic tools from image processing, computer vision and graph theory. Thus, NEFI constitutes an alternative to tedious manual graph extraction and special purpose tools. We anticipate NEFI to enable time-efficient collection of large datasets. The analysis of these novel datasets may open up the possibility to gain new insights into the structure and function of various networks. NEFI is open source and available at http://nefi.mpi-inf.mpg.de. PMID:26521675

  11. OpenVigil FDA - Inspection of U.S. American Adverse Drug Events Pharmacovigilance Data and Novel Clinical Applications.

    PubMed

    Böhm, Ruwen; von Hehn, Leocadie; Herdegen, Thomas; Klein, Hans-Joachim; Bruhn, Oliver; Petri, Holger; Höcker, Jan

    2016-01-01

    Pharmacovigilance contributes to health care. However, direct access to the underlying data for academic institutions and individual physicians or pharmacists is intricate, and easily employable analysis modes for everyday clinical situations are missing. This underlines the need for a tool to bring pharmacovigilance to the clinics. To address these issues, we have developed OpenVigil FDA, a novel web-based pharmacovigilance analysis tool which uses the openFDA online interface of the Food and Drug Administration (FDA) to access U.S. American and international pharmacovigilance data from the Adverse Event Reporting System (AERS). OpenVigil FDA provides disproportionality analyses to (i) identify the drug most likely evoking a new adverse event, (ii) compare two drugs concerning their safety profile, (iii) check arbitrary combinations of two drugs for unknown drug-drug interactions and (iv) enhance the relevance of results by identifying confounding factors and eliminating them using background correction. We present examples for these applications and discuss the promises and limits of pharmacovigilance, openFDA and OpenVigil FDA. OpenVigil FDA is the first publicly available tool to apply pharmacovigilance findings directly to real-life clinical problems. OpenVigil FDA does not require special licenses or statistical programs.
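
    Disproportionality analyses of the kind OpenVigil FDA offers are conventionally computed from a 2×2 contingency table of reports; one standard measure is the proportional reporting ratio (PRR). The sketch below uses made-up counts and a standard formula, not OpenVigil FDA's actual code.

```python
def proportional_reporting_ratio(a, b, c, d):
    """PRR from a 2x2 report table.

    a: reports with the drug AND the event of interest
    b: reports with the drug, other events
    c: reports with other drugs AND the event
    d: reports with other drugs, other events
    PRR = [a / (a + b)] / [c / (c + d)]
    """
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts: the event is reported 5x more often with the drug
prr = proportional_reporting_ratio(a=20, b=180, c=100, d=4900)
# 20/200 = 0.10 vs 100/5000 = 0.02 -> PRR = 5.0
```

    A PRR well above 1 flags a potential signal; in practice tools also report confidence intervals and apply background correction, as the abstract notes, to suppress confounding.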

  12. Increased flexibility for modeling telemetry and nest-survival data using the multistate framework

    USGS Publications Warehouse

    Devineau, Olivier; Kendall, William L.; Doherty, Paul F.; Shenk, Tanya M.; White, Gary C.; Lukacs, Paul M.; Burnham, Kenneth P.

    2014-01-01

    Although telemetry is one of the most common tools used in the study of wildlife, advances in the analysis of telemetry data have lagged compared to progress in the development of telemetry devices. We demonstrate how standard known-fate telemetry and related nest-survival data analysis models are special cases of the more general multistate framework. We present a short theoretical development, and 2 case examples regarding the American black duck and the mallard. We also present a more complex lynx data analysis. Although not necessary in all situations, the multistate framework provides additional flexibility to analyze telemetry data, which may help analysts and biologists better deal with the vagaries of real-world data collection.

  13. PGSB/MIPS PlantsDB Database Framework for the Integration and Analysis of Plant Genome Data.

    PubMed

    Spannagl, Manuel; Nussbaumer, Thomas; Bader, Kai; Gundlach, Heidrun; Mayer, Klaus F X

    2017-01-01

    Plant Genome and Systems Biology (PGSB), formerly Munich Institute for Protein Sequences (MIPS) PlantsDB, is a database framework for the integration and analysis of plant genome data, developed and maintained for more than a decade now. Major components of that framework are genome databases and analysis resources focusing on individual (reference) genomes providing flexible and intuitive access to data. Another main focus is the integration of genomes from both model and crop plants to form a scaffold for comparative genomics, assisted by specialized tools such as the CrowsNest viewer to explore conserved gene order (synteny). Data exchange and integrated search functionality with/over many plant genome databases is provided within the transPLANT project.

  14. Photothermal and infrared thermography characterizations of thermal diffusion in hydroxyapatite materials

    NASA Astrophysics Data System (ADS)

    Bante-Guerra, J.; Conde-Contreras, M.; Trujillo, S.; Martinez-Torres, P.; Cruz-Jimenez, B.; Quintana, P.; Alvarado-Gil, J. J.

    2009-02-01

    Non-destructive analysis of hydroxyapatite materials is an active research area mainly in the study of dental pieces and bones due to the importance these pieces have in medicine, archeology, dentistry, forensics and anthropology. Infrared thermography and photothermal techniques constitute highly valuable tools in those cases. In this work the quantitative analysis of thermal diffusion in bones is presented. The results obtained using thermographic images are compared with the ones obtained from photothermal radiometry. Special emphasis is placed on the analysis of samples with previous thermal damage. Our results show that the treatments induce changes in the physical properties of the samples. These results could be useful in the identification of the agents that induced modifications of unknown origin in hydroxyapatite structures.

  15. Blogs and Military Information Strategy

    DTIC Science & Technology

    2006-06-01

    organization of the US Special Operations Command (USSOCOM), MacDill Air Force Base, Florida. The mission of the Joint Special Operations...tion in academic, interagency and US military communities. The JSOU portal is https://jsou.socom.mil. Joint Special Operations University Brigadier...long-term conflict where the use of the global communications tool, the internet, plays a prominent role. The authors examine blogging from a

  16. Captive chimpanzees' manual laterality in tool use context: Influence of communication and of sociodemographic factors.

    PubMed

    Prieur, Jacques; Pika, Simone; Blois-Heulin, Catherine; Barbu, Stéphanie

    2018-04-14

    Understanding variations of apes' laterality between activities is a central issue when investigating the evolutionary origins of human hemispheric specialization of manual functions and language. We assessed laterality of 39 chimpanzees in a non-communication action similar to termite fishing that we compared with data on five frequent conspecific-directed gestures involving a tool previously exploited in the same subjects. We evaluated, first, population-level manual laterality for tool-use in non-communication actions; second, the influence of sociodemographic factors (age, sex, group, and hierarchy) on manual laterality in both non-communication actions and gestures. No significant right-hand bias at the population level was found for non-communication tool use, contrary to our previous findings for gestures involving a tool. A multifactorial analysis revealed that hierarchy and age particularly modulated manual laterality. Dominants and immatures were more right-handed when using a tool in gestures than in non-communication actions. On the contrary, subordinates, adolescents, young and mature adults as well as males were more right-handed when using a tool in non-communication actions than in gestures. Our findings support the hypothesis that some primate species may have a specific left-hemisphere system for processing gestures, distinct from the cerebral system processing non-communication manual actions, and partly support the tool-use hypothesis. Copyright © 2018 Elsevier B.V. All rights reserved.

  17. Tripartite community structure in social bookmarking data

    NASA Astrophysics Data System (ADS)

    Neubauer, Nicolas; Obermayer, Klaus

    2011-12-01

    Community detection is a branch of network analysis concerned with identifying strongly connected subnetworks. Social bookmarking sites aggregate datasets of often hundreds of millions of triples (document, user, and tag), which, when interpreted as edges of a graph, give rise to special networks called 3-partite, 3-uniform hypergraphs. We identify challenges and opportunities of generalizing community detection and in particular modularity optimization to these structures. Two methods for community detection are introduced that preserve the hypergraph's special structure to different degrees. Their performance is compared on synthetic datasets, showing the benefits of structure preservation. Furthermore, a tool for interactive exploration of the community detection results is introduced and applied to examples from real datasets. We find additional evidence for the importance of structure preservation and, more generally, demonstrate how tripartite community detection can help understand the structure of social bookmarking data.

  18. Development and exploratory analysis of the Neurorehabilitation Program Styles Survey.

    PubMed

    McCorkel, Beth A; Glueckauf, Robert L; Ecklund-Johnson, Eric P; Tomusk, Allison B; Trexler, Lance E; Diller, Leonard

    2003-01-01

    To develop a survey instrument that assesses implementation of key components of outpatient neurorehabilitation programs and test the capacity of this instrument to differentiate between rehabilitation approaches. The Neurorehabilitation Program Styles Survey (NPSS) was administered to 18 outpatient facilities: 10 specialized and 8 discipline-specific outpatient neurorehabilitation programs. Scores were compared between types of programs using independent samples t tests. The NPSS showed good reliability and contrasted groups validity, significantly differentiating between types of programs. The NPSS holds considerable promise as a tool for distinguishing among different types of brain injury programs, and for assessing the differential effectiveness of specialized versus discipline-specific outpatient brain rehabilitation programs. Future research on the NPSS will assess the stability of the instrument over time, its content validity, and capacity to differentiate the full continuum of neurorehabilitation programs.

  19. Practical End-to-End Performance Testing Tool for High Speed 3G-Based Networks

    NASA Astrophysics Data System (ADS)

    Shinbo, Hiroyuki; Tagami, Atsushi; Ano, Shigehiro; Hasegawa, Toru; Suzuki, Kenji

    High speed IP communication is a killer application for 3rd generation (3G) mobile systems. Thus 3G network operators should perform extensive tests to check whether expected end-to-end performances are provided to customers under various environments. An important objective of such tests is to check whether network nodes fulfill requirements on packet-processing durations, because long processing durations cause performance degradation. This requires testers (persons who do tests) to know precisely how long a packet is held by various network nodes. Without any tool's help, this task is time-consuming and error prone. Thus we propose a multi-point packet header analysis tool which extracts and records packet headers with synchronized timestamps at multiple observation points. Such recorded packet headers enable testers to calculate these holding durations. The notable feature of this tool is that it is implemented on off-the-shelf hardware platforms, i.e., laptop personal computers. The key challenges of the implementation are precise clock synchronization without any special hardware and a sophisticated header extraction algorithm without any drop.
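
    Once packet headers are recorded with synchronized timestamps at two observation points, the holding duration inside the node between them reduces to a per-packet timestamp difference. A minimal sketch of that post-processing step follows; the pairing by packet identifier and the field layout are hypothetical, not the tool's actual trace format.

```python
def holding_durations(ingress, egress):
    """Match packets seen at two observation points and return per-packet
    holding durations (egress time - ingress time) in seconds.

    ingress/egress: lists of (packet_id, timestamp) pairs, with clocks at
    both observation points already synchronized.
    """
    seen = dict(ingress)
    return {
        pkt: t_out - seen[pkt]
        for pkt, t_out in egress
        if pkt in seen          # ignore packets dropped before egress
    }

# Hypothetical capture: packet "p2" is held 4 ms by the node under test
ingress = [("p1", 0.000), ("p2", 0.010)]
egress = [("p1", 0.001), ("p2", 0.014)]
d = holding_durations(ingress, egress)
```

    The accuracy of such differences is bounded by the residual clock offset between observation points, which is why the paper's software-only synchronization is the critical piece.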

  20. PubFinder: a tool for improving retrieval rate of relevant PubMed abstracts.

    PubMed

    Goetz, Thomas; von der Lieth, Claus-Wilhelm

    2005-07-01

    Since it is becoming increasingly laborious to manually extract useful information embedded in the ever-growing volumes of literature, automated intelligent text analysis tools are becoming more and more essential to assist in this task. PubFinder (www.glycosciences.de/tools/PubFinder) is a publicly available web tool designed to improve the retrieval rate of scientific abstracts relevant for a specific scientific topic. Only the selection of a representative set of abstracts is required, which are central for a scientific topic. No special knowledge concerning the query-syntax is necessary. Based on the selected abstracts, a list of discriminating words is automatically calculated, which is subsequently used for scoring all defined PubMed abstracts for their probability of belonging to the defined scientific topic. This results in a hit-list of references in the descending order of their likelihood score. The algorithms and procedures implemented in PubFinder facilitate the perpetual task for every scientist of staying up-to-date with current publications dealing with a specific subject in biomedicine.
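
    The ranking scheme described, deriving discriminating words from a seed set of abstracts and scoring all other abstracts against them, can be sketched with smoothed log-frequency ratios. This is an illustrative reconstruction; PubFinder's exact weighting formula is not given in the abstract, and the example corpora are invented.

```python
from collections import Counter
import math

def discriminating_weights(topic_docs, background_docs):
    """Weight each word by how much more frequent it is in the topic
    abstracts than in a background corpus (add-one smoothed log ratio)."""
    topic = Counter(w for d in topic_docs for w in d.lower().split())
    back = Counter(w for d in background_docs for w in d.lower().split())
    n_t, n_b = sum(topic.values()), sum(back.values())
    return {
        w: math.log(((topic[w] + 1) / (n_t + 1)) / ((back[w] + 1) / (n_b + 1)))
        for w in topic
    }

def score(abstract, weights):
    """Likelihood-style score: sum of the weights of the abstract's words."""
    return sum(weights.get(w, 0.0) for w in abstract.lower().split())

# Invented seed and background abstracts for illustration
seed = ["glycan binding protein", "protein glycosylation pathway"]
background = ["cardiac muscle cell", "neural signal pathway"]
w = discriminating_weights(seed, background)
```

    Scoring every PubMed abstract with such weights and sorting descending yields the hit-list the abstract describes; an on-topic text like "glycan glycosylation" outscores an unrelated one like "cardiac neural".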

  1. Analysis of metabolomic data: tools, current strategies and future challenges for omics data integration.

    PubMed

    Cambiaghi, Alice; Ferrario, Manuela; Masseroli, Marco

    2017-05-01

    Metabolomics is a rapidly growing field consisting of the analysis of a large number of metabolites at a system scale. The two major goals of metabolomics are the identification of the metabolites characterizing each organism state and the measurement of their dynamics under different situations (e.g. pathological conditions, environmental factors). Knowledge about metabolites is crucial for the understanding of most cellular phenomena, but this information alone is not sufficient to gain a comprehensive view of all the biological processes involved. Integrated approaches combining metabolomics with transcriptomics and proteomics are thus required to obtain much deeper insights than any of these techniques alone. Although this information is available, multilevel integration of different 'omics' data is still a challenge. The handling, processing, analysis and integration of these data require specialized mathematical, statistical and bioinformatics tools, and several technical problems hampering a rapid progress in the field exist. Here, we review four of the main tools (MetaCore™, MetaboAnalyst, InCroMAP and 3Omics), selected on the basis of number of users or provided features, out of the several available for metabolomic data analysis and integration with other 'omics' data, highlighting their strong and weak aspects; a number of related issues affecting data analysis and integration are also identified and discussed. Overall, we provide an objective description of how some of the main currently available software packages work, which may help the experimental practitioner in the choice of a robust pipeline for metabolomic data analysis and integration. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  2. Supporting performance and configuration management of GTE cellular networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Ming; Lafond, C.; Jakobson, G.

    GTE Laboratories, in cooperation with GTE Mobilnet, has developed and deployed PERFEX (PERFormance EXpert), an intelligent system for performance and configuration management of cellular networks. PERFEX assists cellular network performance and radio engineers in the analysis of large volumes of cellular network performance and configuration data. It helps them locate and determine the probable causes of performance problems, and provides intelligent suggestions about how to correct them. The system combines an expert cellular network performance tuning capability with a map-based graphical user interface, data visualization programs, and a set of special cellular engineering tools. PERFEX is in daily use at more than 25 GTE Mobile Switching Centers. Since the first deployment of the system in late 1993, PERFEX has become a major GTE cellular network performance optimization tool.

  3. AVO helps seismic imaging in deepwater environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skidmore, C.; Lindsay, R.O.; Ratcliff, D.

    1997-11-03

    Amplitude and frequency variations related to offset should be analyzed routinely during interpretation of seismic data acquired in deepwater environments. Amplitude variation with offset (AVO) in three dimensions is the key exploration tool in deep waters of the Gulf of Mexico. But application of the tool requires special care. Three-dimensional AVO helps the interpreter understand stratigraphy and the meaning of amplitude anomalies. Used in conjunction with well log data, it can help the interpreter distinguish amplitudes related to the presence of hydrocarbons from those that result from, for example, rock-property changes within a non-hydrocarbon-bearing layer, such as a shale, or residual gas (fizz water) in high-porosity sands. The paper discusses examples from the Gulf of Mexico, well control, application, improving detail, and frequency-dependent analysis.

  4. Simulator for heterogeneous dataflow architectures

    NASA Technical Reports Server (NTRS)

    Malekpour, Mahyar R.

    1993-01-01

    A new simulator is developed to simulate the execution of an algorithm graph in accordance with the Algorithm to Architecture Mapping Model (ATAMM) rules. ATAMM is a Petri Net model which describes the periodic execution of large-grained, data-independent dataflow graphs and which provides predictable steady state time-optimized performance. This simulator extends the ATAMM simulation capability from a homogeneous set of resources, or functional units, to a more general heterogeneous architecture. Simulation test cases show that the simulator accurately executes the ATAMM rules for both a heterogeneous architecture and a homogeneous architecture, which is the special case for only one processor type. The simulator forms one tool in an ATAMM Integrated Environment which contains other tools for graph entry, graph modification for performance optimization, and playback of simulations for analysis.

  5. Benefit from NASA

    NASA Image and Video Library

    2001-08-01

    Apollo-era technology spurred the development of cordless products that we take for granted every day. In the 1960s, NASA asked Black & Decker to develop a special drill that would be powerful enough to cut through hard layers of the lunar surface and be lightweight, compact, and operate under its own power source, allowing Apollo astronauts to collect lunar samples further away from the Lunar Excursion Module. In response, Black & Decker developed a computer program that analyzed and optimized drill motor operations. From their analysis, engineers were able to design a motor that was powerful yet required minimal battery power to operate. Since those first days of cordless products, Black & Decker has continued to refine this technology, and they now sell their rechargeable products worldwide (e.g., the Dustbuster, cordless tools for home and industrial use, and medical tools).

  6. Cordless Products

    NASA Technical Reports Server (NTRS)

    2001-01-01

    Apollo-era technology spurred the development of cordless products that we take for granted every day. In the 1960s, NASA asked Black & Decker to develop a special drill that would be powerful enough to cut through hard layers of the lunar surface and be lightweight, compact, and operate under its own power source, allowing Apollo astronauts to collect lunar samples further away from the Lunar Excursion Module. In response, Black & Decker developed a computer program that analyzed and optimized drill motor operations. From their analysis, engineers were able to design a motor that was powerful yet required minimal battery power to operate. Since those first days of cordless products, Black & Decker has continued to refine this technology, and they now sell their rechargeable products worldwide (e.g., the Dustbuster, cordless tools for home and industrial use, and medical tools).

  7. Quantifying Parental Influence on Youth Athlete Specialization: A Survey of Athletes’ Parents

    PubMed Central

    Padaki, Ajay S.; Ahmad, Christopher S.; Hodgins, Justin L.; Kovacevic, David; Lynch, Thomas Sean; Popkin, Charles A.

    2017-01-01

    Background: Youth athlete specialization has been linked to decreased enjoyment, burnout, and increased injury risk, although the impact of specialization on athletic success is unknown. The extent to which parents exert extrinsic influence on this phenomenon remains unclear. Purpose/Hypothesis: The goal of this study was to assess parental influences placed on young athletes to specialize. It was hypothesized that parents generate both direct and indirect pressures on specialized athletes. Study Design: Cross-sectional study; Level of evidence, 3. Methods: A survey tool was designed by an interdisciplinary medical team to evaluate parental influence on youth specialization. Surveys were administered to parents of the senior author’s orthopaedic pediatric patients. Results: Of the 211 parents approached, 201 (95.3%) completed the assessment tool. One-third of parents stated that their children played a single sport only, 53.2% had children who played multiple sports but had a favorite sport, and 13.4% had children who balanced their multiple sports equally. Overall, 115 (57.2%) parents hoped for their children to play collegiately or professionally, and 100 (49.7%) parents encouraged their children to specialize in a single sport. Parents of highly specialized and moderately specialized athletes were more likely to report directly influencing their children’s specialization (P = .038) and to expect their children to play collegiately or professionally (P = .014). Finally, parents who hired personal trainers for their children were more likely to believe that their children held collegiate or professional aspirations (P = .009). Conclusion: Parents influence youth athlete specialization both directly and by investment in elite coaching and personal instruction. Parents of more specialized athletes exert more influence than parents of unspecialized athletes. PMID:28975135

  8. Software on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    Development Tools: view a list of tools for build automation, version control, and high-level or specialized scripting. Toolchains: learn about the available toolchains to build applications from source code.

  9. Zion National Park Propane-to-Electric Shuttle Bus Testing | Transportation

    Science.gov Websites

    storage requirements based on the fleet's unique operation. NREL will process and analyze the data using specialized tools-including the Drive-Cycle Rapid Investigation, Visualization, and Evaluation (DRIVE) tool

  10. VDJServer: A Cloud-Based Analysis Portal and Data Commons for Immune Repertoire Sequences and Rearrangements.

    PubMed

    Christley, Scott; Scarborough, Walter; Salinas, Eddie; Rounds, William H; Toby, Inimary T; Fonner, John M; Levin, Mikhail K; Kim, Min; Mock, Stephen A; Jordan, Christopher; Ostmeyer, Jared; Buntzman, Adam; Rubelt, Florian; Davila, Marco L; Monson, Nancy L; Scheuermann, Richard H; Cowell, Lindsay G

    2018-01-01

    Recent technological advances in immune repertoire sequencing have created tremendous potential for advancing our understanding of adaptive immune response dynamics in various states of health and disease. Immune repertoire sequencing produces large, highly complex data sets, however, which require specialized methods and software tools for their effective analysis and interpretation. VDJServer is a cloud-based analysis portal for immune repertoire sequence data that provides access to a suite of tools for a complete analysis workflow, including modules for preprocessing and quality control of sequence reads, V(D)J gene segment assignment, repertoire characterization, and repertoire comparison. VDJServer also provides sophisticated visualizations for exploratory analysis. It is accessible through a standard web browser via a graphical user interface designed for use by immunologists, clinicians, and bioinformatics researchers. VDJServer provides a data commons for public sharing of repertoire sequencing data, as well as private sharing of data between users. We describe the main functionality and architecture of VDJServer and demonstrate its capabilities with use cases from cancer immunology and autoimmunity. VDJServer provides a complete analysis suite for human and mouse T-cell and B-cell receptor repertoire sequencing data. The combination of its user-friendly interface and high-performance computing allows large immune repertoire sequencing projects to be analyzed with no programming or software installation required. VDJServer is a web-accessible cloud platform that provides access through a graphical user interface to a data management infrastructure, a collection of analysis tools covering all steps in an analysis, and an infrastructure for sharing data along with workflows, results, and computational provenance. VDJServer is a free, publicly available, and open-source licensed resource.

  11. High-Performance Data Analysis Tools for Sun-Earth Connection Missions

    NASA Technical Reports Server (NTRS)

    Messmer, Peter

    2011-01-01

    The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms require access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters and graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project now enables scientists to solve demanding data analysis problems in IDL that previously required specialized software, and it allows those problems to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters.
The products developed in this project have the potential to interact, so one can build a cluster of PCs, each equipped with a GPU, and use mpiDL to communicate between the nodes and GPULib to accelerate the computations on each node.
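    The task-farm pattern behind TaskDL (many independent tasks distributed over a pool of workers, with results gathered at the end) can be sketched outside IDL. Below is a minimal Python illustration using only the standard library; `analyze_frame` is a hypothetical stand-in for a per-task computation, and a real task farm would run across cluster nodes rather than local threads:

```python
# Minimal sketch of the task-farm pattern TaskDL manages for IDL users:
# independent tasks fanned out to a pool of workers, results collected.
# Illustrative only -- TaskDL itself is an IDL tool.
from concurrent.futures import ThreadPoolExecutor

def analyze_frame(frame_id):
    """Stand-in for one independent analysis task (hypothetical)."""
    return frame_id, frame_id ** 2  # pretend per-frame result

def run_task_farm(frame_ids, max_workers=4):
    # The tasks share no state, so the farm can schedule them freely
    # across whatever workers are available.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(analyze_frame, frame_ids))
```

Because the tasks require no communication, this pattern scales almost linearly with the number of workers, which is exactly why task farms are the easiest entry point into parallel data analysis.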

  12. Development of Anthropometric Analogous Headforms. Phase 1.

    DTIC Science & Technology

    1994-10-31

    shown in figure 5. This surface mesh can then be transformed into polygon faces that are able to be rendered by the AutoCAD rendering tools. Rendering of... computer-generated surfaces. The material removal techniques require the programming of the tool path of the cutter and in some cases requires specialized... tooling. Tool path programs are available to transfer the computer-generated surface into actual paths of the cutting tool. In cases where the

  13. A measuring tool for tree-rings analysis

    NASA Astrophysics Data System (ADS)

    Shumilov, Oleg; Kanatjev, Alexander; Kasatkina, Elena

    2013-04-01

    A special tool has been created for measuring and analyzing annual tree-ring widths. It consists of a professional scanner, a computer system, and software. In many respects this complex is not inferior to similar systems (LINTAB, WinDENDRO), and in comparison with manual measurement systems it offers a number of advantages: a gain in productivity, the possibility of archiving measurement results at any stage of processing, and operator comfort. New software has been developed that allows processing of samples of different types (cores, saw cuts), including those that are difficult to process because of complex wood structure (inhomogeneous growth in different directions; missing, light, and false rings, etc.). This software can analyze images made with optical scanners or with analog or digital cameras. The software was written in C++ and is compatible with modern Windows operating systems. Annual ring widths are measured along interactively traced paths. These paths can have any orientation and can be created so that ring widths are measured perpendicular to ring boundaries. A graph of ring width as a function of year is displayed on screen during the analysis and can be used for visual and numerical cross-dating and for comparison with other series or master chronologies. Ring widths are saved to text files in a special format, and those files are converted to the format accepted for data conservation in the International Tree-Ring Data Bank. The complex is universal in application, which will allow its use for solving various problems in biology and ecology. With its help, long-term juniper (1328-2004) and pine (1445-2005) tree-ring chronologies have been reconstructed from samples collected on the Kola Peninsula (northwestern Russia).
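    The perpendicular-measurement step described above reduces to a small calculation: given ring-boundary positions recorded along a traced path, successive differences give ring widths, and when the path crosses the boundaries at an angle each width is projected by the cosine of that angle. A hedged sketch of the geometry only, not the system's C++ code:

```python
import math

def ring_widths(boundaries_mm, path_angle_deg=0.0):
    """Ring widths from ring-boundary positions along a measurement path.

    boundaries_mm  : positions (mm) where the path crosses ring boundaries
    path_angle_deg : angle between the path and the boundary normal; a
                     non-perpendicular path overestimates widths, so each
                     successive difference is projected by cos(angle).
    Illustrative sketch only.
    """
    c = math.cos(math.radians(path_angle_deg))
    return [(b - a) * c for a, b in zip(boundaries_mm, boundaries_mm[1:])]
```

For a path perpendicular to the boundaries the widths are simply successive differences; a path inclined at 60 degrees halves each measured distance.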

  14. Developing tools and resources for the biomedical domain of the Greek language.

    PubMed

    Vagelatos, Aristides; Mantzari, Elena; Pantazara, Mavina; Tsalidis, Christos; Kalamara, Chryssoula

    2011-06-01

    This paper presents the design and implementation of terminological and specialized textual resources that were produced in the framework of the Greek research project "IATROLEXI". The aim of the project was to create the critical infrastructure for the Greek language, i.e. linguistic resources and tools for use in high-level Natural Language Processing (NLP) applications in the domain of biomedicine. The project was built upon existing resources developed by the project partners and further enhanced within its framework, i.e. a Greek morphological lexicon of about 100,000 words and language processing tools such as a lemmatiser and a morphosyntactic tagger. Additionally, it developed new assets, such as a specialized corpus of biomedical texts and an ontology of medical terminology.

  15. Special Issue: Book Reviews 2002-2003.

    ERIC Educational Resources Information Center

    Grauer, Barbara Ellman, Ed.

    2003-01-01

    This special issue reviews 71 books on the following topics: career management; career opportunities for people with disabilities; federal government career information; college career development/counseling; job search strategies, tools, methods; coaching; retirement issues; strategies for managers; women and careers; general career books; and…

  16. A Petri Net Approach Based Elementary Siphons Supervisor for Flexible Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Abdul-Hussin, Mowafak Hassan

    2015-05-01

    This paper presents an approach to constructing a class of S3PR nets for the modeling, simulation, and control of processes occurring in flexible manufacturing systems (FMSs), based on the elementary siphons of a Petri net. Siphons are central to the analysis and control of deadlocks in FMSs, which is the principal motivation for studying them. Petri net models support efficient structural analysis of FMSs, and different control policies can be implemented to achieve deadlock prevention. We present an effective deadlock-free policy for a special class of Petri nets called S3PR. Both structural analysis and reachability graph analysis are used for the analysis and control of the Petri nets. Petri nets have been used successfully as one of the most powerful tools for modelling FMSs; using structural analysis, we show that liveness of such systems can be attributed to the absence of undermarked siphons.
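    The siphon concept central to this analysis can be made concrete: a set of places S is a siphon if every transition that puts a token into S also takes a token from S, so once a siphon empties it can never be re-marked, which is how deadlocks arise. A minimal Python check of this definition, assuming a simple dictionary encoding of the net (the encoding and example net are illustrative, not taken from the paper):

```python
def is_siphon(places, pre, post):
    """True if `places` is a siphon: every transition that deposits a
    token into the set also consumes a token from it, so an emptied
    siphon stays empty (the deadlock mechanism a control policy must
    guard against).

    pre[t]  : input places of transition t (tokens consumed from these)
    post[t] : output places of transition t (tokens produced into these)
    """
    s = set(places)
    feeders  = {t for t, outs in post.items() if outs & s}  # transitions into S
    drainers = {t for t, ins in pre.items() if ins & s}     # transitions out of S
    return feeders <= drainers

# Toy net: two places exchanging a token through two transitions.
# {p1, p2} is a siphon; {p1} alone is not, since t2 feeds it from outside.
pre  = {"t1": {"p1"}, "t2": {"p2"}}
post = {"t1": {"p2"}, "t2": {"p1"}}
```

Deadlock-prevention policies such as the one in the paper work by adding control places that keep such siphons sufficiently marked.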

  17. Auditory neuroimaging with fMRI and PET.

    PubMed

    Talavage, Thomas M; Gonzalez-Castillo, Javier; Scott, Sophie K

    2014-01-01

    For much of the past 30 years, investigations of auditory perception and language have been enhanced or even driven by the use of functional neuroimaging techniques that specialize in localization of central responses. Beginning with investigations using positron emission tomography (PET) and gradually shifting primarily to usage of functional magnetic resonance imaging (fMRI), auditory neuroimaging has greatly advanced our understanding of the organization and response properties of brain regions critical to the perception of and communication with the acoustic world in which we live. As the complexity of the questions being addressed has increased, the techniques, experiments and analyses applied have also become more nuanced and specialized. A brief review of the history of these investigations sets the stage for an overview and analysis of how these neuroimaging modalities are becoming ever more effective tools for understanding the auditory brain. We conclude with a brief discussion of open methodological issues as well as potential clinical applications for auditory neuroimaging. This article is part of a Special Issue entitled Human Auditory Neuroimaging. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Design, Specification and Construction of Specialized Measurement System in the Experimental Building

    NASA Astrophysics Data System (ADS)

    Fedorczak-Cisak, Malgorzata; Kwasnowski, Pawel; Furtak, Marcin; Hayduk, Grzegorz

    2017-10-01

    Experimental buildings for “in situ” research are a very important tool for collecting data on the energy efficiency of energy-saving technologies. One of the most advanced buildings of this type in Poland is the Malopolskie Laboratory of Energy-saving Buildings at Cracow University of Technology. The building itself is used by scientists as a research object and a research tool to test energy-saving technologies. It is equipped with a specialized measuring system consisting of approximately 3,000 different sensors distributed in the technical installations, in structural elements of the building (walls, ceilings, cornices), and in the ground. The authors of the paper will present the innovative design and technology of this specialized instrumentation and will discuss issues arising during the implementation and use of the building.

  19. Parallel adaptive discontinuous Galerkin approximation for thin layer avalanche modeling

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Nichita, C. C.; Bauer, A. C.; Pitman, E. B.; Bursik, M.; Sheridan, M. F.

    2006-08-01

    This paper describes the development of highly accurate adaptive discontinuous Galerkin schemes for the solution of the equations arising from a thin layer type model of debris flows. Such flows have wide applicability in the analysis of avalanches induced by many natural calamities, e.g. volcanoes, earthquakes, etc. These schemes are coupled with special parallel solution methodologies to produce a simulation tool capable of very high-order numerical accuracy. The methodology successfully replicates cold rock avalanches at Mount Rainier, Washington and hot volcanic particulate flows at Colima Volcano, Mexico.

  20. Domestic and Foreign Trade Position of the United States Aircraft Turbine Engine Industry. Task Six. Short-Term Gas Turbine Propulsion Analysis and Assessment

    DTIC Science & Technology

    1991-06-01

    500 remaining machine tool firms had less than twenty employees each. Manufacturing rationalization was negligible; product specialization and combined... terms, most of the fuselage. Over 130 Japanese employees were dispatched to Seattle during the 767 development, even though the agreement was for... through, and consider not just the name plates, but who’s involved in sharing the risk--and the rewards, if any--you recite lots of other names: M.T.U

  1. Reconfigurable Flight Control Using Nonlinear Dynamic Inversion with a Special Accelerometer Implementation

    NASA Technical Reports Server (NTRS)

    Bacon, Barton J.; Ostroff, Aaron J.

    2000-01-01

    This paper presents an approach to on-line control design for aircraft that have suffered either actuator failure, missing effector surfaces, surface damage, or any combination. The approach is based on a modified version of nonlinear dynamic inversion. The approach does not require a model of the baseline vehicle (effectors at zero deflection), but does require feedback of accelerations and effector positions. Implementation issues are addressed and the method is demonstrated on an advanced tailless aircraft. An experimental simulation analysis tool is used to directly evaluate the nonlinear system's stability robustness.

  2. [The recent news in endoscopic surgery: a review of the literature and meta-analysis].

    PubMed

    Klimenko, K É

    2012-01-01

    During the past few years, endonasal surgery has become the principal tool for the operative treatment of many pathologies affecting the base of the skull. The present work was designed to estimate the possibilities of using endoscopic endonasal surgery to treat sinus and skull base lesions and to illustrate recent progress in the development of endoscopic equipment and instrumentation. A meta-analysis of the results of on-going research on the application of endonasal endoscopic technology is described, with special emphasis on the plastic repair of cerebrospinal fluid (liquor) fistulas, removal of juvenile nasopharyngeal angiofibromas, and treatment of pathological changes in the clival region and the odontoid cervicomedullary junction.

  3. Book Review: Book review

    NASA Astrophysics Data System (ADS)

    Tweed, Fiona S.

    2017-08-01

    This special edition of Zeitschrift für Geomorphologie (ZfG) is based on presentations given at a conference entitled 'Hydrological Extreme Events in Historic and Prehistoric Times' which took place in Bonn in June 2014. The volume consists of an editorial introduction and nine research papers reflecting a range of approaches to understanding past events, including modelling, analysis of historical data and studies that focus on a consistent approach to collection and analysis of data from different areas. The HEX project, which generated the conference in Bonn, adopted a multidisciplinary approach and this is reflected in the collection of papers, which emphasise the importance of combining a range of approaches and analyses as tools for decoding both landscapes and processes.

  4. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering-enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available through Amazon's AWS Marketplace VM images and open source. Hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase the scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  5. Tools for quantifying isotopic niche space and dietary variation at the individual and population level.

    USGS Publications Warehouse

    Newsome, Seth D.; Yeakel, Justin D.; Wheatley, Patrick V.; Tinker, M. Tim

    2012-01-01

    Ecologists are increasingly using stable isotope analysis to inform questions about variation in resource and habitat use from the individual to community level. In this study we investigate data sets from 2 California sea otter (Enhydra lutris nereis) populations to illustrate the advantages and potential pitfalls of applying various statistical and quantitative approaches to isotopic data. We have subdivided these tools, or metrics, into 3 categories: IsoSpace metrics, stable isotope mixing models, and DietSpace metrics. IsoSpace metrics are used to quantify the spatial attributes of isotopic data that are typically presented in bivariate (e.g., δ13C versus δ15N) 2-dimensional space. We review IsoSpace metrics currently in use and present a technique by which uncertainty can be included to calculate the convex hull area of consumers or prey, or both. We then apply a Bayesian-based mixing model to quantify the proportion of potential dietary sources to the diet of each sea otter population and compare this to observational foraging data. Finally, we assess individual dietary specialization by comparing a previously published technique, variance components analysis, to 2 novel DietSpace metrics that are based on mixing model output. As the use of stable isotope analysis in ecology continues to grow, the field will need a set of quantitative tools for assessing isotopic variance at the individual to community level. Along with recent advances in Bayesian-based mixing models, we hope that the IsoSpace and DietSpace metrics described here will provide another set of interpretive tools for ecologists.
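    The convex hull IsoSpace metric mentioned above is straightforward to compute for bivariate isotope data. A self-contained sketch (monotone-chain hull plus the shoelace formula) is shown below; it is an illustrative stand-in for the authors' own implementation, which additionally propagates measurement uncertainty into the hull area:

```python
def hull_area(points):
    """Convex hull area of bivariate isotope data, e.g. (d13C, d15N)
    pairs for individual consumers -- a common IsoSpace metric.
    Monotone-chain hull, then the shoelace formula for the area."""
    pts = sorted(set(points))
    if len(pts) < 3:
        return 0.0

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    def half_hull(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h

    # Lower and upper hulls, dropping each chain's last point (duplicated)
    hull = half_hull(pts)[:-1] + half_hull(reversed(pts))[:-1]
    n = len(hull)
    return 0.5 * abs(sum(hull[i][0] * hull[(i + 1) % n][1]
                         - hull[(i + 1) % n][0] * hull[i][1]
                         for i in range(n)))
```

For example, four consumers at the corners of a unit square in isotope space enclose a hull area of 1.0 regardless of any individuals falling inside the hull.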

  6. StreptoBase: An Oral Streptococcus mitis Group Genomic Resource and Analysis Platform.

    PubMed

    Zheng, Wenning; Tan, Tze King; Paterson, Ian C; Mutha, Naresh V R; Siow, Cheuk Chuen; Tan, Shi Yang; Old, Lesley A; Jakubovics, Nicholas S; Choo, Siew Woh

    2016-01-01

    The oral streptococci are spherical Gram-positive bacteria categorized under the phylum Firmicutes and are among the most common causative agents of bacterial infective endocarditis (IE), as well as important agents of septicaemia in neutropenic patients. The Streptococcus mitis group comprises 13 species, including some of the most common human oral colonizers such as S. mitis, S. oralis, S. sanguinis and S. gordonii, as well as species such as S. tigurinus, S. oligofermentans and S. australis that have only recently been classified and are poorly understood at present. We present StreptoBase, a specialized free resource focusing on the genomic analyses of oral species from the mitis group. It currently hosts 104 S. mitis group genomes, including 27 novel mitis group strains that we sequenced using the high-throughput Illumina HiSeq technology platform, and provides a comprehensive set of genome sequences for analyses, particularly comparative analyses and visualization of both cross-species and cross-strain characteristics of S. mitis group bacteria. StreptoBase incorporates sophisticated in-house designed bioinformatics web tools such as the Pairwise Genome Comparison (PGC) tool and the Pathogenomic Profiling Tool (PathoProT), which facilitate comparative pathogenomics analysis of Streptococcus strains. Examples are provided to demonstrate how StreptoBase can be employed to compare the genome structure of different S. mitis group bacteria and putative virulence gene profiles across multiple streptococcal strains. In conclusion, StreptoBase offers access to a range of streptococcal genomic resources as well as analysis tools and will be an invaluable platform for accelerating research on streptococci. Database URL: http://streptococcus.um.edu.my.

  7. A One-Hand Nut and Bolt Assembly Tool

    NASA Technical Reports Server (NTRS)

    Spencer, J. M.

    1984-01-01

    Special wrench speeds nut and bolt assembly when insufficient room to hold nut behind bolthole with standard tool. C-clamp shaped box-and-socket-wrench assembly holds nut on blind side in alignment to receive bolt from open side.

  8. 48 CFR 17.106-1 - General.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ..., labor learning, and other nonrecurring costs to be incurred by an “average” prime contractor or... applicable, as plant or equipment relocation or rearrangement, special tooling and special test equipment... learning. They shall not include any costs of labor or materials, or other expenses (except as indicated...

  9. Improving Flood Risk Management for California's Central Valley: How the State Developed a Toolbox for Large, System-wide Studies

    NASA Astrophysics Data System (ADS)

    Pingel, N.; Liang, Y.; Bindra, A.

    2016-12-01

    More than 1 million Californians live and work in the floodplains of the Sacramento-San Joaquin Valley where flood risks are among the highest in the nation. In response to this threat to people, property and the environment, the Department of Water Resources (DWR) has been called to action to improve flood risk management. This has transpired through significant advances in development of flood information and tools, analysis, and planning. Senate Bill 5 directed DWR to prepare the Central Valley Flood Protection Plan (CVFPP) and update it every 5 years. A key component of this aggressive planning approach is answering the question: What is the current flood risk, and how would proposed improvements change flood risk throughout the system? Answering this question is a substantial challenge due to the size and complexity of the watershed and flood control system. The watershed is roughly 42,000 sq mi, and flows are controlled by numerous reservoirs, bypasses, and levees. To overcome this challenge, the State invested in development of a comprehensive analysis "tool box" through various DWR programs. Development of the tool box included: collection of hydro-meteorological, topographic, geotechnical, and economic data; development of rainfall-runoff, reservoir operation, hydraulic routing, and flood risk analysis models; and development of specialized applications and computing schemes to accelerate the analysis. With this toolbox, DWR is analyzing flood hazard, flood control system performance, exposure and vulnerability of people and property to flooding, consequence of flooding for specific events, and finally flood risk for a range of CVFPP alternatives. Based on the results, DWR will put forward a State Recommended Plan in the 2017 CVFPP. Further, the value of the analysis tool box extends beyond the CVFPP. 
It will serve as a foundation for other flood studies for years to come and has already been successfully applied for inundation mapping to support emergency response, reservoir operation analysis, and others.

  10. Visual analysis of online social media to open up the investigation of stance phenomena

    PubMed Central

    Kucher, Kostiantyn; Schamp-Bjerede, Teri; Kerren, Andreas; Paradis, Carita; Sahlgren, Magnus

    2015-01-01

    Online social media are a perfect text source for stance analysis. Stance in human communication is concerned with speaker attitudes, beliefs, feelings and opinions. Expressions of stance are associated with the speakers' view of what they are talking about and what is up for discussion and negotiation in the intersubjective exchange. Taking stance is thus crucial for the social construction of meaning. Increased knowledge of stance can be useful for many application fields such as business intelligence, security analytics, or social media monitoring. In order to process large amounts of text data for stance analyses, linguists need interactive tools to explore the textual sources as well as the processed data based on computational linguistics techniques. Both original texts and derived data are important for refining the analyses iteratively. In this work, we present a visual analytics tool for online social media text data that can be used to open up the investigation of stance phenomena. Our approach complements traditional linguistic analysis techniques and is based on the analysis of utterances associated with two stance categories: sentiment and certainty. Our contributions include (1) the description of a novel web-based solution for analyzing the use and patterns of stance meanings and expressions in human communication over time; and (2) specialized techniques used for visualizing analysis provenance and corpus overview/navigation. We demonstrate our approach by means of text media on a highly controversial scandal with regard to expressions of anger and provide an expert review from linguists who have been using our tool. PMID:29249903

  11. Visual analysis of online social media to open up the investigation of stance phenomena.

    PubMed

    Kucher, Kostiantyn; Schamp-Bjerede, Teri; Kerren, Andreas; Paradis, Carita; Sahlgren, Magnus

    2016-04-01

    Online social media are a perfect text source for stance analysis. Stance in human communication is concerned with speaker attitudes, beliefs, feelings and opinions. Expressions of stance are associated with the speakers' view of what they are talking about and what is up for discussion and negotiation in the intersubjective exchange. Taking stance is thus crucial for the social construction of meaning. Increased knowledge of stance can be useful for many application fields such as business intelligence, security analytics, or social media monitoring. In order to process large amounts of text data for stance analyses, linguists need interactive tools to explore the textual sources as well as the processed data based on computational linguistics techniques. Both original texts and derived data are important for refining the analyses iteratively. In this work, we present a visual analytics tool for online social media text data that can be used to open up the investigation of stance phenomena. Our approach complements traditional linguistic analysis techniques and is based on the analysis of utterances associated with two stance categories: sentiment and certainty. Our contributions include (1) the description of a novel web-based solution for analyzing the use and patterns of stance meanings and expressions in human communication over time; and (2) specialized techniques used for visualizing analysis provenance and corpus overview/navigation. We demonstrate our approach by means of text media on a highly controversial scandal with regard to expressions of anger and provide an expert review from linguists who have been using our tool.

  12. Early development of Science Opportunity Analysis tools for the Jupiter Icy Moons Explorer (JUICE) mission

    NASA Astrophysics Data System (ADS)

    Cardesin Moinelo, Alejandro; Vallat, Claire; Altobelli, Nicolas; Frew, David; Llorente, Rosario; Costa, Marc; Almeida, Miguel; Witasse, Olivier

    2016-10-01

    JUICE is the first large mission in the framework of ESA's Cosmic Vision 2015-2025 program. JUICE will survey the Jovian system with a special focus on three of the Galilean Moons: Europa, Ganymede and Callisto. The mission has recently been adopted, and significant efforts are being made by the Science Operations Center (SOC) at the European Space and Astronomy Centre (ESAC) in Madrid to develop tools that provide the necessary support to the Science Working Team (SWT) for science opportunity analysis and early assessment of science operation scenarios. This contribution will outline some of the tools being developed within ESA and in collaboration with the Navigation and Ancillary Information Facility (NAIF) at JPL. The Mission Analysis and Payload Planning Support (MAPPS) tool is developed by ESA and has been used by most of ESA's planetary missions to generate and validate science observation timelines for the simulation of payload and spacecraft operations. MAPPS has the capability to compute and display all the necessary geometrical information, such as the distances, illumination angles and projected field-of-view of an imaging instrument on the surface of a given body, and a preliminary setup is already in place for the early assessment of JUICE science operations. NAIF provides valuable SPICE support to the JUICE mission, and several tools are being developed to compute and visualize science opportunities. In particular, the WebGeoCalc and Cosmographia systems are provided by NAIF to compute time windows and create animations of the observation geometry available via traditional SPICE data files, such as planet orbits, spacecraft trajectory, spacecraft orientation, instrument field-of-view "cones" and instrument footprints.
Other software tools are being developed by ESA and other collaborating partners to support the science opportunity analysis for all missions, like the SOLab (Science Operations Laboratory) or new interfaces for observation definitions and opportunity window databases.
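    One of the geometric quantities such tools compute, the solar illumination (incidence) angle at a surface point, reduces to the angle between the outward surface normal and the direction to the Sun. A toy Python sketch of that single calculation follows; in practice the vectors come from SPICE kernels (spacecraft trajectory, body orientation), which are assumed as given inputs here:

```python
import math

def incidence_angle_deg(surface_normal, sun_dir):
    """Solar incidence angle at a surface point: the angle between the
    outward surface normal and the vector toward the Sun (0 deg means
    the Sun is directly overhead; 90 deg means grazing illumination).
    Toy stand-in for geometry that SPICE/MAPPS derives from kernels."""
    dot = sum(a * b for a, b in zip(surface_normal, sun_dir))
    nn = math.sqrt(sum(a * a for a in surface_normal))
    ns = math.sqrt(sum(b * b for b in sun_dir))
    return math.degrees(math.acos(dot / (nn * ns)))
```

Opportunity-analysis tools evaluate such angles over candidate observation windows to flag, for example, surface targets that are adequately illuminated for imaging.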

  13. Agroforestry landscapes and global change: landscape ecology tools for management and conservation

    Treesearch

    Guillermo Martinez Pastur; Emilie Andrieu; Louis R. Iverson; Pablo Luis Peri

    2012-01-01

    Forest ecosystems are impacted by multiple uses under the influence of global drivers, and where landscape ecology tools may substantially facilitate the management and conservation of the agroforestry ecosystems. The use of landscape ecology tools was described in the eight papers of the present special issue, including changes in forested landscapes due to...

  14. Influence of Co and W powders on viscosity of composite solders during soldering of specially shaped diamond-abrasive tools

    NASA Astrophysics Data System (ADS)

    Sokolov, E. G.; Aref’eva, S. A.; Svistun, L. I.

    2018-03-01

    The influence of Co and W powders on the structure and viscosity of the composite solders Sn-Cu-Co-W used for the manufacture of specially shaped diamond tools has been studied. The solders were obtained by mixing the metallic powders with an organic binder. Mixtures with and without diamonds were applied to steel rollers and shaped substrates. Sintering was carried out in a vacuum at 820 °C with a holding time of 40 minutes. The influence of Co and W powders on the viscosity of the solders was evaluated on the basis of a study of the structures and the results of sintering specially shaped diamond tools. It was found that, to provide the necessary viscosity and to obtain uniform diamond-containing layers on complex-shaped surfaces, the Sn-Cu-Co-W solder should contain 27–35 vol % of solid phase. This is achieved with a total solder content of 24–32 wt % of cobalt powder and 7 wt % of tungsten powder.

  15. Skill networks and measures of complex human capital

    PubMed Central

    2017-01-01

    We propose a network-based method for measuring worker skills. We illustrate the method using data from an online freelance website. Using the tools of network analysis, we divide skills into endogenous categories based on their relationship with other skills in the market. Workers who specialize in these different areas earn dramatically different wages. We then show that, in this market, network-based measures of human capital provide additional insight into wages beyond traditional measures. In particular, we show that workers with diverse skills earn higher wages than those with more specialized skills. Moreover, we can distinguish between two different types of workers benefiting from skill diversity: jacks-of-all-trades, whose skills can be applied independently on a wide range of jobs, and synergistic workers, whose skills are useful in combination and fill a hole in the labor market. On average, workers whose skills are synergistic earn more than jacks-of-all-trades. PMID:29133397
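    The co-occurrence idea behind such network-based skill measures can be sketched in a few lines. This is an illustrative toy, not the paper's actual methodology; `synergy_score` and the tiny market below are invented for the example:

```python
from itertools import combinations
from collections import Counter

def cooccurrence(workers):
    """Count how often each skill pair appears together in a worker profile."""
    pairs = Counter()
    for skills in workers:
        for a, b in combinations(sorted(set(skills)), 2):
            pairs[(a, b)] += 1
    return pairs

def synergy_score(skills, pairs):
    """Mean market co-occurrence over a worker's own skill pairs.

    High values suggest a 'synergistic' bundle of skills that are useful
    in combination; low values suggest independently applicable,
    jack-of-all-trades skills.
    """
    own = list(combinations(sorted(set(skills)), 2))
    if not own:
        return 0.0
    return sum(pairs.get(p, 0) for p in own) / len(own)

market = [
    {"python", "statistics", "ml"},
    {"python", "statistics"},
    {"writing", "translation"},
    {"python", "writing"},
]
pairs = cooccurrence(market)
print(synergy_score({"python", "statistics"}, pairs))      # → 2.0
print(synergy_score({"statistics", "translation"}, pairs))  # → 0.0
```

    The actual study derives endogenous skill categories from the full network structure rather than from raw pair counts.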

  16. Development and analysis of the Software Implemented Fault-Tolerance (SIFT) computer

    NASA Technical Reports Server (NTRS)

    Goldberg, J.; Kautz, W. H.; Melliar-Smith, P. M.; Green, M. W.; Levitt, K. N.; Schwartz, R. L.; Weinstock, C. B.

    1984-01-01

    SIFT (Software Implemented Fault Tolerance) is an experimental, fault-tolerant computer system designed to meet the extreme reliability requirements for safety-critical functions in advanced aircraft. Errors are masked by performing a majority voting operation over the results of identical computations, and faulty processors are removed from service by reassigning computations to the nonfaulty processors. This scheme has been implemented in a special architecture using a set of standard Bendix BDX930 processors, augmented by a special asynchronous-broadcast communication interface that provides direct, processor-to-processor communication among all processors. Fault isolation is accomplished in hardware; all other fault-tolerance functions, together with scheduling and synchronization, are implemented exclusively by executive system software. The system reliability is predicted by a Markov model. Mathematical consistency of the system software with respect to the reliability model has been partially verified, using recently developed tools for machine-aided proof of program correctness.
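    The error-masking scheme described, majority voting over redundant results with removal of disagreeing processors, can be sketched as follows. This is an illustrative toy, not the SIFT executive's actual code; `vote` and the processor ids are invented:

```python
from collections import Counter

def vote(results):
    """Mask errors by majority vote over redundant computation results.

    results maps processor id -> computed value; returns (voted value,
    set of processors whose result disagreed and should be removed from
    service by reassigning their computations).
    """
    tally = Counter(results.values())
    winner, count = tally.most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority: too many simultaneous faults")
    faulty = {pid for pid, v in results.items() if v != winner}
    return winner, faulty

# Three-way redundancy: processor p2 returns a corrupted value.
value, faulty = vote({"p0": 42, "p1": 42, "p2": 41})
print(value, faulty)  # → 42 {'p2'}
```

    Triple redundancy, as here, tolerates a single fault per voting round; the real system's reliability over time is what the Markov model predicts.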

  17. [Social values and addiction: applicability and psychometric properties of VAL-89 questionnaire].

    PubMed

    Pedrero Perez, Eduardo Jose; Rojo Mota, Gloria; Olivar Arroyo, Alvaro

    2008-01-01

    To study the psychometric properties of the VAL-89 questionnaire and its possible use with addicted individuals who seek treatment. Analysis of the psychometric properties of the questionnaire and its factorial structure, applying it to 792 individuals: 365 of them were substance users seeking treatment and 427 were from the general population. The reliability of the questionnaire is confirmed, although its factorial structure appears to differ from the original. Twelve factors appear in our study instead of the original 10. These factors are named: Power, Stimulation, Submission, Tradition, Spirituality, Self-Sufficiency, Hedonism, Sociability, Universality, Conventionalism, Idealism and Self-Realization. These factors are distributed across several dimensions represented by four axes: individual-social, dominance-equality, tradition-pleasure and great values-anomie. The VAL-89 questionnaire seems to be a useful tool for exploring which social values are most appreciated, and it is of special interest to know which values are particularly selected by addicted individuals.

  18. Building a Prototype of LHC Analysis Oriented Computing Centers

    NASA Astrophysics Data System (ADS)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium of four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized toward end-user chaotic analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make analysis tasks easier for a physics user who is not an expert in computing. On the storage side, we are experimenting with storage techniques allowing remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow exhaustive monitoring of user processes at the site, and an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  19. PGSB PlantsDB: updates to the database framework for comparative plant genome research.

    PubMed

    Spannagl, Manuel; Nussbaumer, Thomas; Bader, Kai C; Martis, Mihaela M; Seidel, Michael; Kugler, Karl G; Gundlach, Heidrun; Mayer, Klaus F X

    2016-01-04

    PGSB (Plant Genome and Systems Biology; formerly MIPS) PlantsDB (http://pgsb.helmholtz-muenchen.de/plant/index.jsp) is a database framework for the comparative analysis and visualization of plant genome data. The resource has been updated with new data sets and types as well as specialized tools and interfaces to address user demands for intuitive access to complex plant genome data. In its latest incarnation, we have re-worked both the layout and navigation structure and implemented new keyword search options and a new BLAST sequence search functionality. Actively involved in the corresponding sequencing consortia, PlantsDB has dedicated special effort to the integration and visualization of complex Triticeae genome data, especially for barley, wheat and rye. We enhanced CrowsNest, a tool to visualize syntenic relationships between genomes, with data from the wheat sub-genome progenitor Aegilops tauschii and added functionality to the PGSB RNASeqExpressionBrowser. GenomeZipper results were integrated for the genomes of barley, rye, wheat and perennial ryegrass, and interactive access is granted through PlantsDB interfaces. Data exchange and cross-linking between PlantsDB and other plant genome databases is stimulated by the transPLANT project (http://transplantdb.eu/). © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  20. Application of principal component analysis (PCA) as a sensory assessment tool for fermented food products.

    PubMed

    Ghosh, Debasree; Chattopadhyay, Parimal

    2012-06-01

    The objective of the work was to use the method of quantitative descriptive analysis (QDA) to describe the sensory attributes of fermented food products prepared with the incorporation of lactic cultures. Panellists were selected and trained to evaluate various attributes, especially color and appearance, body texture, flavor, overall acceptability and acidity, of fermented food products such as cow milk curd and soymilk curd, idli, sauerkraut and probiotic ice cream. Principal component analysis (PCA) identified six significant principal components that accounted for more than 90% of the variance in the sensory attribute data. Overall product quality was modelled as a function of the principal components using multiple least squares regression (R² = 0.8). The results from the PCA were statistically analyzed by analysis of variance (ANOVA). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the fermented food product attributes that are important for consumer acceptability.
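    For just two attributes, the PCA step reduces to eigendecomposition of a 2x2 covariance matrix, which has a closed form. A pure-Python sketch, illustrative only (`pca_2d` and the sample ratings are invented; the study itself retained six components over more attributes):

```python
import math

def pca_2d(samples):
    """First principal component of two-attribute sensory scores.

    samples is a list of (x, y) panel ratings; returns the fraction of
    variance explained by PC1 and its direction as a unit vector.
    """
    n = len(samples)
    mx = sum(x for x, _ in samples) / n
    my = sum(y for _, y in samples) / n
    a = sum((x - mx) ** 2 for x, _ in samples) / n        # var(x)
    c = sum((y - my) ** 2 for _, y in samples) / n        # var(y)
    b = sum((x - mx) * (y - my) for x, y in samples) / n  # cov(x, y)
    # Closed-form eigenvalues of the symmetric 2x2 covariance matrix.
    mean, half = (a + c) / 2, math.hypot((a - c) / 2, b)
    l1, l2 = mean + half, mean - half
    theta = 0.5 * math.atan2(2 * b, a - c)  # angle of the first eigenvector
    return l1 / (l1 + l2), (math.cos(theta), math.sin(theta))

# Flavor and acceptability ratings that move together: PC1 dominates.
scores = [(6.1, 6.0), (7.2, 7.1), (5.0, 5.2), (8.1, 7.9), (6.6, 6.5)]
explained, direction = pca_2d(scores)
print(round(explained, 2))  # close to 1.0 for strongly correlated attributes
```

    With more attributes one diagonalizes the full covariance matrix and keeps the components that jointly explain most of the variance, as the abstract's six components do for over 90% of it.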

  1. Web Feet Guide to Search Engines: Finding It on the Net.

    ERIC Educational Resources Information Center

    Web Feet, 2001

    2001-01-01

    This guide to search engines for the World Wide Web discusses selecting the right search engine; interpreting search results; major search engines; online tutorials and guides; search engines for kids; specialized search tools for various subjects; and other specialized engines and gateways. (LRW)

  2. Special Events: Planning for Success, Second Edition.

    ERIC Educational Resources Information Center

    Harris, April L.

    This book is intended to serve as a practical reference tool for advancement services professionals, illustrating the importance of special events as a way to communicate with and personalize contact between the higher education institution and donors, community leaders, students, elected officials, and others. Each chapter offers comprehensive…

  3. Coping with terrorism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerr, D.M.

    1985-01-01

    Terrorism has emerged as a tool of low-intensity conflict used to undermine Western and moderate governments. There is evidence that the US faces a new threshold of terrorist threat both at home and abroad, because the tools are available, media attention is global and often undisciplined, and the motives for terrorist attack span a wide spectrum. The US has no internal consensus on how to respond to acts of terrorism. The goal of the terrorists is to erode faith in the government and the democratic system. The author analyzes the threat and examines opportunities for an adequate response. Among his recommendations are making infrastructure networks more robust and less vulnerable, using new technologies that enhance security, establishing clear guidelines for intelligence gathering and analysis, specially trained response forces, and political moderation and cooperation.

  4. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers.

  5. Lateral conduction effects on heat-transfer data obtained with the phase-change paint technique

    NASA Technical Reports Server (NTRS)

    Maise, G.; Rossi, M. J.

    1974-01-01

    A computerized tool, CAPE (Conduction Analysis Program using Eigenvalues), has been developed to account for lateral heat conduction in wind tunnel models in the data reduction of the phase-change paint technique. The tool also accounts for the effects of finite thickness (thin wings) and surface curvature. A special reduction procedure using just one time of melt is also possible on leading edges. A novel iterative numerical scheme, with discretized spatial coordinates but analytic integration in time, was used to solve the inverse conduction problem involved in the data reduction. A yes-no chart is provided that tells the test engineer when the various corrections are large enough that CAPE should be used. The accuracy of the phase-change paint technique in the presence of finite thickness and lateral conduction is also investigated.

  6. Optimization of droplets for UV-NIL using coarse-grain simulation of resist flow

    NASA Astrophysics Data System (ADS)

    Sirotkin, Vadim; Svintsov, Alexander; Zaitsev, Sergey

    2009-03-01

    A mathematical model and numerical method are described that make it possible to simulate the ultraviolet ("step and flash") nanoimprint lithography (UV-NIL) process adequately, even on standard personal computers. The model is derived from the 3D Navier-Stokes equations with the understanding that the resist motion is largely directed along the substrate surface and is characterized by ultra-low values of the Reynolds number. For the numerical approximation of the model, a special finite-difference (coarse-grain) method is applied. The coarse-grain modeling tool is tested for detailed analysis of resist spreading in UV-NIL at the structure-scale level. The obtained results demonstrate the tool's high ability to calculate optimal droplet dispensing for a given stamp design and process parameters. This dispensing provides uniformly filled areas and a homogeneous residual-layer thickness in UV-NIL.

  7. IAC - INTEGRATED ANALYSIS CAPABILITY

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. With the goal of supporting the unique needs of engineering analysis groups concerned with interdisciplinary problems, IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a data base, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automatic data transfer among analysis programs. IAC 2.5, designed to be compatible as far as possible with Level 1.5, contains a major upgrade in executive and database management system capabilities, and includes interfaces to enable thermal, structures, optics, and control interaction dynamics analysis. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation interfaces are supplied for building and viewing models. Advanced graphics capabilities are provided within particular analysis modules such as INCA and NASTRAN. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 
6) Special purpose modules are included, such as MIMIC (Model Integration via Mesh Interpolation Coefficients), which transforms field values from one model to another; LINK, which simplifies incorporation of user specific modules into IAC modules; and DATAPAC, the National Bureau of Standards statistical analysis package. The IAC database contains structured files which provide a common basis for communication between modules and the executive system, and can contain unstructured files such as NASTRAN checkpoint files, DISCOS plot files, object code, etc. The user can define groups of data and relations between them. A full data manipulation and query system operates with the database. The current interface modules comprise five groups: 1) Structural analysis - IAC contains a NASTRAN interface for standalone analysis or certain structural/control/thermal combinations. IAC provides enhanced structural capabilities for normal modes and static deformation analysis via special DMAP sequences. IAC 2.5 contains several specialized interfaces from NASTRAN in support of multidisciplinary analysis. 2) Thermal analysis - IAC supports finite element and finite difference techniques for steady state or transient analysis. There are interfaces for the NASTRAN thermal analyzer, SINDA/SINFLO, and TRASYS II. FEMNET, which converts finite element structural analysis models to finite difference thermal analysis models, is also interfaced with the IAC database. 3) System dynamics - The DISCOS simulation program which allows for either nonlinear time domain analysis or linear frequency domain analysis, is fully interfaced to the IAC database management capability. 4) Control analysis - Interfaces for the ORACLS, SAMSAN, NBOD2, and INCA programs allow a wide range of control system analyses and synthesis techniques. 
Level 2.5 includes EIGEN, which provides tools for large-order system eigenanalysis, and BOPACE, which allows for geometric capabilities and finite element analysis with nonlinear material. Also included in IAC level 2.5 is SAMSAN 3.1, an engineering analysis program which contains a general-purpose library of over 600 subroutines.

  8. Mentoring as an Induction Tool in Special Education Administration

    ERIC Educational Resources Information Center

    Smith, Cynthia Sonderegger; Arsenault, Kimberly

    2014-01-01

    Mentoring is a widely used method of induction into a variety of professional roles, including educational leadership. However, little scholarly literature has focused on the role of mentoring in the career development of special education administrators. In this examination of 14 such mentoring relationships, the existence of career and…

  9. The Individual Family Support Plan: A Tool to Assist Special Populations of Gifted Learners.

    ERIC Educational Resources Information Center

    Damiani, Victoria B.

    1996-01-01

    This article describes Project Mandela, a federally funded enrichment and family support program for special populations (such as culturally diverse and economically disadvantaged) of gifted learners. Eighty-seven families participated in development of Individual Family Support Plans to enhance children's educational progress. The project found…

  10. Teacher Motivation and Retention in High School Special Education

    ERIC Educational Resources Information Center

    Lawrence, Matthew Daniel

    2017-01-01

    The purpose of this study was to identify the best motivational tools for recruiting high-quality high school special education teachers; and to identify effective techniques and strategies to facilitate those individuals' experiencing career longevity within the profession. This study also sought to highlight the specific issues that actively…

  11. Preservice Teachers in Special Education: Using Edublogs for Transition Collaboration

    ERIC Educational Resources Information Center

    Seabrooks-Blackmore, Janice; Patterson, Karen B.

    2013-01-01

    This was an exploratory study that examined the introduction and use of Edublogs as a collaborative communication tool in an undergraduate preservice special education course. Participants were enrolled in a course that addressed transition and the development of individualized transition education plans for students with disabilities. Pre- and…

  12. 48 CFR 1852.245-79 - Use of Government-owned property.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-owned facilities (real property or plant equipment), special test equipment, or special tooling... property. 1852.245-79 Section 1852.245-79 Federal Acquisition Regulations System NATIONAL AERONAUTICS AND... and Clauses 1852.245-79 Use of Government-owned property. As prescribed in 1845.106-70(i), insert the...

  13. Special Education Curriculum (Computerized IEP Catalog).

    ERIC Educational Resources Information Center

    Garland Independent School District, TX.

    This special education curriculum, developed by the Garland (Texas) Independent School District, outlines the basic tools for preparing an Individual Educational Plan (IEP) for each handicapped student. The curricular information is organized and coded to facilitate computerized printing of the IEP. The document begins with a list of 13…

  14. WebQuests: A Tool for All Teachers

    ERIC Educational Resources Information Center

    Rader, Laura

    2009-01-01

    Classroom teachers are assuming more and more responsibility for meeting the needs of students from a larger number of diverse backgrounds and with increasingly diverse special needs. Many practicing teachers identify students with special needs as their greatest concern and challenge, but often one of their greatest rewards. One way of…

  15. Teachers and Occupational Therapists: A Partnership for Children with Special Needs (Pre-Vocational).

    ERIC Educational Resources Information Center

    Rourk, Jane Davis

    Occupational therapists should work cooperatively with special educators and vocational educators in the prevocational and vocational education of handicapped adolescent students. The occupational therapist's knowledge of disabling conditions, the therapeutic relationship, and methods of adapting tools and equipment are of particular value in such…

  16. Mind the Gap: Accountability, Observation and Special Education

    ERIC Educational Resources Information Center

    Crowe, Christina C.; Rivers, Susan E.; Bertoli, Michelle C.

    2017-01-01

    There is an absence of observation-based tools designed to evaluate teaching in special education classrooms. Evaluations derived from classroom observations are integral to the accountability process, adding value to understanding teaching and learning by providing a lens into the classroom that test scores cannot capture. The present paper…

  17. 19 CFR 145.34 - Personal and household effects and tools of trade.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 2 2011-04-01 2011-04-01 false Personal and household effects and tools of trade. 145.34 Section 145.34 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) MAIL IMPORTATIONS Special Classes of Merchandise § 145.34 Personal and household effects and tools of...

  18. 19 CFR 145.34 - Personal and household effects and tools of trade.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Personal and household effects and tools of trade. 145.34 Section 145.34 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) MAIL IMPORTATIONS Special Classes of Merchandise § 145.34 Personal and household effects and tools of...

  19. 19 CFR 145.34 - Personal and household effects and tools of trade.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 2 2012-04-01 2012-04-01 false Personal and household effects and tools of trade. 145.34 Section 145.34 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) MAIL IMPORTATIONS Special Classes of Merchandise § 145.34 Personal and household effects and tools of...

  20. 19 CFR 145.34 - Personal and household effects and tools of trade.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 2 2013-04-01 2013-04-01 false Personal and household effects and tools of trade. 145.34 Section 145.34 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) MAIL IMPORTATIONS Special Classes of Merchandise § 145.34 Personal and household effects and tools of...

  1. Screening of mutations affecting protein stability and dynamics of FGFR1—A simulation analysis

    PubMed Central

    Doss, C. George Priya; Rajith, B.; Garwasis, Nimisha; Mathew, Pretty Raju; Raju, Anand Solomon; Apoorva, K.; William, Denise; Sadhana, N.R.; Himani, Tanwar; Dike, IP.

    2012-01-01

    Single amino acid substitutions in Fibroblast Growth Factor Receptor 1 (FGFR1) destabilize the protein and have been implicated in several genetic disorders, including various forms of cancer, Kallmann syndrome, Pfeiffer syndrome and Jackson-Weiss syndrome. To gain functional insight into the effects of amino acid substitutions on protein function and expression, special emphasis was laid on molecular dynamics simulation techniques in combination with in silico tools such as SIFT, PolyPhen 2.0, I-Mutant 3.0 and SNAP. An estimated 68% of nsSNPs were predicted to be deleterious by I-Mutant, slightly higher than SIFT (37%), PolyPhen 2.0 (61%) and SNAP (58%). From the observed results, the P722S mutation was found to be the most deleterious by comparing the results of all in silico tools. By a molecular dynamics approach, we have shown that the P722S mutation leads to increased flexibility and greater deviation from the native structure, which was supported by a decrease in the number of hydrogen bonds. In addition, biophysical analysis revealed a clear loss of stability due to the P722S mutation in the FGFR1 protein. The majority of mutations predicted by these in silico tools were in good concordance with the experimental results. PMID:27896051

  2. Application of bioinformatics tools and databases in microbial dehalogenation research (a review).

    PubMed

    Satpathy, R; Konkimalla, V B; Ratha, J

    2015-01-01

    Microbial dehalogenation is a biochemical process in which halogenated substances are enzymatically converted into their non-halogenated form. Microorganisms have a wide range of organohalogen degradation abilities, both specific and non-specific in nature. Most of these halogenated organic compounds are pollutants that need to be remediated; therefore, current approaches explore the potential of microbes at the molecular level for effective biodegradation of these substances. Several microorganisms with dehalogenation activity have been identified and characterized. In this respect, bioinformatics plays a key role in gaining deeper knowledge in the field of dehalogenation. To facilitate data mining, many tools have been developed to annotate the data in databases. Therefore, with the discovery of a microorganism, one can predict a gene/protein and perform sequence analysis, structural modelling, metabolic pathway analysis, biodegradation studies and so on. This review highlights bioinformatics approaches, describing the application of various databases and specific tools in the microbial dehalogenation field, with special focus on dehalogenase enzymes. Attempts have also been made to describe some recent applications of in silico modelling methods, comprising gene finding, protein modelling, Quantitative Structure-Biodegradability Relationship (QSBR) studies and the reconstruction of metabolic pathways employed in the dehalogenation research area.

  3. Screening of mutations affecting protein stability and dynamics of FGFR1-A simulation analysis.

    PubMed

    Doss, C George Priya; Rajith, B; Garwasis, Nimisha; Mathew, Pretty Raju; Raju, Anand Solomon; Apoorva, K; William, Denise; Sadhana, N R; Himani, Tanwar; Dike, I P

    2012-12-01

    Single amino acid substitutions in Fibroblast Growth Factor Receptor 1 (FGFR1) destabilize the protein and have been implicated in several genetic disorders, including various forms of cancer, Kallmann syndrome, Pfeiffer syndrome and Jackson-Weiss syndrome. To gain functional insight into the effects of amino acid substitutions on protein function and expression, special emphasis was laid on molecular dynamics simulation techniques in combination with in silico tools such as SIFT, PolyPhen 2.0, I-Mutant 3.0 and SNAP. An estimated 68% of nsSNPs were predicted to be deleterious by I-Mutant, slightly higher than SIFT (37%), PolyPhen 2.0 (61%) and SNAP (58%). From the observed results, the P722S mutation was found to be the most deleterious by comparing the results of all in silico tools. By a molecular dynamics approach, we have shown that the P722S mutation leads to increased flexibility and greater deviation from the native structure, which was supported by a decrease in the number of hydrogen bonds. In addition, biophysical analysis revealed a clear loss of stability due to the P722S mutation in the FGFR1 protein. The majority of mutations predicted by these in silico tools were in good concordance with the experimental results.

  4. Research on the EDM Technology for Micro-holes at Complex Spatial Locations

    NASA Astrophysics Data System (ADS)

    Y Liu, J.; Guo, J. M.; Sun, D. J.; Cai, Y. H.; Ding, L. T.; Jiang, H.

    2017-12-01

    To meet the demands of machining micro-holes at complex spatial locations, several key technical problems are solved, including development of the micro-electrical discharge machining (micro-EDM) power supply system, design of the host structure, and the machining process technology. By developing the low-voltage power supply circuit, high-voltage circuit, micro- and precision-machining circuit and clearance detection system, a narrow-pulse, high-frequency power supply system for six-axis EDM machining is developed to meet the demands of micro-hole discharge machining. By combining CAD structural design, CAE simulation analysis, modal testing, ODS (Operational Deflection Shapes) testing and theoretical analysis, the host construction and key axes of the machine tool are optimized to meet the positioning demands of the micro-holes. A special deionized-water filtration system is developed to keep the machining process stable. The machining equipment and process technology developed in this paper are verified by developing the micro-hole processing flow and testing it on the real machine tool. The final test results show that the efficient micro-EDM pulse power supply system, machine-tool host system, deionized-water filtration system and processing method developed in this paper meet the demands of machining micro-holes at complex spatial locations.

  5. Image-Based Single Cell Profiling: High-Throughput Processing of Mother Machine Experiments

    PubMed Central

    Sachs, Christian Carsten; Grünberger, Alexander; Helfrich, Stefan; Probst, Christopher; Wiechert, Wolfgang; Kohlheyer, Dietrich; Nöh, Katharina

    2016-01-01

    Background Microfluidic lab-on-chip technology combined with live-cell imaging has enabled the observation of single cells in their spatio-temporal context. The mother machine (MM) cultivation system is particularly attractive for the long-term investigation of rod-shaped bacteria, since it facilitates continuous cultivation and observation of individual cells over many generations in a highly parallelized manner. To date, the lack of fully automated image analysis software limits the practical applicability of the MM as a phenotypic screening tool. Results We present an image analysis pipeline for the automated processing of MM time-lapse image stacks. The pipeline supports all analysis steps, i.e., image registration, orientation correction, channel/cell detection, cell tracking, and result visualization. Tailored algorithms account for the specialized MM layout to enable a robust automated analysis. Image data generated in a two-day growth study (≈ 90 GB) is analyzed in ≈ 30 min, with negligible differences in growth rate between automated and manual evaluation. The proposed methods are implemented in the software molyso (MOther machine AnaLYsis SOftware), which provides a new profiling tool for the unbiased analysis of hitherto inaccessible large-scale MM image stacks. Conclusion Presented is the software molyso, a ready-to-use open-source software tool (BSD-licensed) for the unsupervised analysis of MM time-lapse image stacks. molyso source code and user manual are available at https://github.com/modsim/molyso. PMID:27661996
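    As an illustration of the kind of per-cell statistic such a pipeline derives, a growth rate can be estimated from tracked cell lengths by a log-linear fit over time. This is a minimal sketch of the idea, not molyso's actual implementation:

```python
import math

def growth_rate(times, lengths):
    """Least-squares slope of ln(length) versus time, i.e. the
    exponential growth rate of a tracked cell."""
    n = len(times)
    logs = [math.log(l) for l in lengths]
    mean_t = sum(times) / n
    mean_l = sum(logs) / n
    num = sum((t - mean_t) * (l - mean_l) for t, l in zip(times, logs))
    den = sum((t - mean_t) ** 2 for t in times)
    return num / den

# Synthetic track: length doubles every hour, so the rate is ln 2.
times = [0.0, 1.0, 2.0, 3.0]
lengths = [1.0, 2.0, 4.0, 8.0]
print(round(growth_rate(times, lengths), 3))  # 0.693
```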

  6. Minimal and moderate oral sedation in the adult special needs patient.

    PubMed

    Coke, John M; Edwards, Michael D

    2009-04-01

    Oral minimal/moderate sedation can be an effective tool to aid in the dental management of adult special needs patients. The dentist must choose specific sedative drugs that can be used safely and effectively with these patients. This article focuses on a select number of these drugs, specific medical and pharmacologic challenges presented by adult special needs patients, and techniques to safely administer oral minimal and moderate sedation.

  7. The FOT tool kit concept

    NASA Technical Reports Server (NTRS)

    Fatig, Michael

    1993-01-01

    Flight operations and the preparation for them have become increasingly complex as mission complexity increases. Further, the mission model dictates that a significant increase in flight operations activities is upon us. Finally, there is a need for process improvement and economy in the operations arena. It is therefore time to recognize flight operations as a complex process requiring a defined, structured, life-cycle approach vitally linked to the space segment, ground segment, and science operations processes. With this recognition, an FOT Tool Kit consisting of six major components was developed to provide tools that guide flight operations activities throughout the mission life cycle. The major components of the FOT Tool Kit, and the concepts behind the flight operations life-cycle process as developed at NASA's GSFC for GSFC-based missions, are addressed. The Tool Kit is intended to improve the productivity, quality, cost, and schedule performance of flight operations tasks through the use of documented, structured methodologies; knowledge of past lessons learned and upcoming new technology; and reuse and sharing of key products and special application programs, made possible through the development of standardized key products and special program directories.

  8. ACQ4: an open-source software platform for data acquisition and analysis in neurophysiology research

    PubMed Central

    Campagnola, Luke; Kratz, Megan B.; Manis, Paul B.

    2014-01-01

    The complexity of modern neurophysiology experiments requires specialized software to coordinate multiple acquisition devices and analyze the collected data. We have developed ACQ4, an open-source software platform for performing data acquisition and analysis in experimental neurophysiology. This software integrates the tasks of acquiring, managing, and analyzing experimental data. ACQ4 has been used primarily for standard patch-clamp electrophysiology, laser scanning photostimulation, multiphoton microscopy, intrinsic imaging, and calcium imaging. The system is highly modular, which facilitates the addition of new devices and functionality. The modules included with ACQ4 provide for rapid construction of acquisition protocols, live video display, and customizable analysis tools. Position-aware data collection allows automated construction of image mosaics and registration of images with 3-dimensional anatomical atlases. ACQ4 uses free and open-source tools including Python, NumPy/SciPy for numerical computation, PyQt for the user interface, and PyQtGraph for scientific graphics. Supported hardware includes cameras, patch clamp amplifiers, scanning mirrors, lasers, shutters, Pockels cells, motorized stages, and more. ACQ4 is available for download at http://www.acq4.org. PMID:24523692

  9. An efficient annotation and gene-expression derivation tool for Illumina Solexa datasets.

    PubMed

    Hosseini, Parsa; Tremblay, Arianne; Matthews, Benjamin F; Alkharouf, Nadim W

    2010-07-02

    An Illumina flow cell with all eight lanes occupied produces well over a terabyte of images, and gigabytes of reads following sequence alignment. The ability to translate such reads into meaningful annotation is therefore of great concern and importance. One can easily be flooded with such a great volume of textual, unannotated data, irrespective of read quality or size. CASAVA, an optional analysis tool for Illumina sequencing experiments, provides INDEL detection, SNP information, and allele calling. It is therefore of significant value not only to extract from such analysis a measure of gene expression in the form of tag counts, but also to annotate the reads. We developed TASE (Tag counting and Analysis of Solexa Experiments), a rapid tag-counting and annotation software tool specifically designed for Illumina CASAVA sequencing datasets. Developed in Java and deployed using the jTDS JDBC driver and a SQL Server backend, TASE provides an extremely fast means of calculating gene expression through tag counts while annotating sequenced reads with each gene's presumed function, from any given CASAVA build. Such a build is generated for both DNA and RNA sequencing. Analysis is broken into two distinct components: DNA sequence or read concatenation, followed by tag counting and annotation. The end result is output containing the homology-based functional annotation and the respective gene expression measure, signifying how many times sequenced reads were found within the genomic ranges of functional annotations. TASE is a powerful tool that facilitates the process of annotating a given Illumina Solexa sequencing dataset. Our results indicate that both homology-based annotation and tag-count analysis are achieved in very efficient times, allowing researchers to delve deep into a given CASAVA build and maximize information extraction from a sequencing dataset. TASE is specially designed to translate sequence data in a CASAVA build into functional annotations while producing corresponding gene expression measurements. Such analysis is executed in an ultrafast and highly efficient manner, whether for a single-read or paired-end sequencing experiment. TASE is a user-friendly and freely available application, allowing rapid analysis and annotation of any given Illumina Solexa sequencing dataset with ease.
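    The core tag-counting idea, counting how many aligned reads fall within the genomic range of each functional annotation, can be sketched as follows. The gene names, coordinates, and read positions are invented for illustration; TASE's SQL-backed implementation is far more involved:

```python
# Hypothetical annotations: gene name -> (start, end) genomic range.
genes = {"geneA": (100, 500), "geneB": (600, 900)}
# Hypothetical alignment start positions of sequenced reads.
read_positions = [120, 130, 450, 610, 899, 950]

def tag_counts(genes, positions):
    """Count reads whose alignment position falls inside each gene's
    range; the per-gene total is the tag-count expression measure."""
    counts = {gene: 0 for gene in genes}
    for pos in positions:
        for gene, (start, end) in genes.items():
            if start <= pos <= end:
                counts[gene] += 1
    return counts

print(tag_counts(genes, read_positions))  # {'geneA': 3, 'geneB': 2}
```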

  10. An update on the use of cerebrospinal fluid analysis as a diagnostic tool in multiple sclerosis.

    PubMed

    Gastaldi, Matteo; Zardini, Elisabetta; Franciotta, Diego

    2017-01-01

    Intrathecal B-lymphocyte activation is a hallmark of multiple sclerosis (MS), a multi-factorial inflammatory-demyelinating disease of the central nervous system. Such activation has a counterpart in the cerebrospinal fluid (CSF) oligoclonal IgG bands (OCB), whose diagnostic role in MS has been downgraded within the current McDonald criteria. With a theoretico-practical approach, the authors review the physiopathological basis of CSF dynamics and the state of the art of routine CSF analysis and CSF biomarkers in MS. Areas covered: The authors discuss the pros and cons of CSF analysis, including critical evaluations of both well-established and promising diagnostic and prognostic laboratory tools. New findings on CSF and cerebral interstitial fluid dynamics are also presented. The authors searched the PubMed database for English-language articles published between January 2010 and June 2016, using the key words 'multiple sclerosis', 'cerebrospinal fluid', and 'oligoclonal bands'. Reference lists of relevant articles were scanned for additional studies. Expert commentary: The availability of high-quality routine CSF tests in specialized laboratories, the emerging potential of novel CSF biomarkers, and the trend toward early treatment should prompt a reappraisal of CSF analysis for diagnostic and prognostic purposes in MS. Further procedural and methodological improvements seem necessary in both research and translational diagnostic CSF settings.

  11. XS: a FASTQ read simulator.

    PubMed

    Pratas, Diogo; Pinho, Armando J; Rodrigues, João M O S

    2014-01-16

    Next-generation sequencing (NGS) is bringing, besides naturally huge amounts of data, an avalanche of new specialized tools (for analysis, compression, and alignment, among others) and large public and private network infrastructures. A direct need is therefore arising for specific simulation tools for testing and benchmarking, such as a flexible and portable FASTQ read simulator that does not need a reference sequence yet produces approximately the same characteristics as real data. We present XS, a FASTQ read simulation tool that is flexible, portable (it does not need a reference sequence) and tunable in terms of sequence complexity. It has several running modes, depending on the time and memory available, and is aimed at testing computing infrastructures, namely cloud computing for large-scale projects, and at testing FASTQ compression algorithms. Moreover, XS offers the possibility of simulating the three main FASTQ components individually (headers, DNA sequences and quality scores). XS provides an efficient and convenient method for fast simulation of FASTQ files, such as those from Ion Torrent (currently uncovered by other simulators), Roche-454, Illumina and ABI-SOLiD sequencing machines. This tool is publicly available at http://bioinformatics.ua.pt/software/xs/.
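    A minimal sketch of reference-free FASTQ read simulation is shown below. It only illustrates the record layout (header, sequence, separator, Phred+33 quality line); XS's actual sequence-complexity and per-machine quality models are far more elaborate, and the function and header names here are invented:

```python
import random

def simulate_fastq(n_reads, read_len, seed=0):
    """Yield synthetic FASTQ records with random DNA and quality scores.

    Each record is four lines: '@' header, sequence, '+', quality string.
    Qualities 20-40 are encoded as Phred+33 (ASCII chr(quality + 33)).
    """
    rng = random.Random(seed)
    for i in range(n_reads):
        seq = "".join(rng.choice("ACGT") for _ in range(read_len))
        qual = "".join(chr(33 + rng.randint(20, 40)) for _ in range(read_len))
        yield "@sim_read_{}\n{}\n+\n{}".format(i, seq, qual)

for record in simulate_fastq(2, 12):
    print(record)
```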

  12. IMGT/3Dstructure-DB and IMGT/StructuralQuery, a database and a tool for immunoglobulin, T cell receptor and MHC structural data

    PubMed Central

    Kaas, Quentin; Ruiz, Manuel; Lefranc, Marie-Paule

    2004-01-01

    IMGT/3Dstructure-DB and IMGT/StructuralQuery are a novel 3D structure database and a new tool for immunological proteins. They are part of IMGT, the international ImMunoGenetics information system®, a high-quality integrated knowledge resource specializing in immunoglobulins (IG), T cell receptors (TR), major histocompatibility complex (MHC) and related proteins of the immune system (RPI) of human and other vertebrate species, which consists of databases, Web resources and interactive on-line tools. IMGT/3Dstructure-DB data are described according to the IMGT Scientific chart rules based on the IMGT-ONTOLOGY concepts. IMGT/3Dstructure-DB provides IMGT gene and allele identification of IG, TR and MHC proteins with known 3D structures, domain delimitations, amino acid positions according to the IMGT unique numbering, and renumbered coordinate flat files. Moreover, IMGT/3Dstructure-DB provides 2D graphical representations (or Collier de Perles) and results of contact analysis. The IMGT/StructuralQuery tool allows searching this database by specific structural characteristics. IMGT/3Dstructure-DB and IMGT/StructuralQuery are freely available at http://imgt.cines.fr. PMID:14681396

  13. Geothopica and the interactive analysis and visualization of the updated Italian National Geothermal Database

    NASA Astrophysics Data System (ADS)

    Trumpy, Eugenio; Manzella, Adele

    2017-02-01

    The Italian National Geothermal Database (BDNG) is the largest collection of Italian geothermal data and was set up in the 1980s. It has since been updated both in terms of content and of management tools: information on deep wells and thermal springs (with temperature > 30 °C) is currently organized and stored in a PostgreSQL relational database management system, which guarantees high performance, data security and easy access through different client applications. The BDNG is the core of the Geothopica web site, whose webGIS tool allows different types of user to access geothermal data, visualize multiple types of datasets, and perform integrated analyses. The webGIS tool has recently been improved by two specially designed, programmed and implemented visualization tools that display data on well lithology and underground temperatures. This paper describes the contents of the database and its software and data updates, as well as the webGIS tool, including the new tools for lithology and temperature visualization. The geoinformation organized in the database and accessible through Geothopica is of use not only for geothermal purposes, but also for any kind of georesource and CO2 storage project requiring the organization of, and access to, deep underground data. Geothopica also supports project developers, researchers, and decision makers in the assessment, management and sustainable deployment of georesources.

  14. 18 CFR 367.57 - Equipment.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... include those costs incurred in connection with the first clearing and grading of land and rights-of-way... accounting as service company property is verified by current inventories. Special tools acquired and... drills and similar tool equipment when used in connection with the operation and maintenance of a...

  15. Spontaneous imbibition in fractal tortuous micro-nano pores considering dynamic contact angle and slip effect: phase portrait analysis and analytical solutions.

    PubMed

    Li, Caoxiong; Shen, Yinghao; Ge, Hongkui; Zhang, Yanjun; Liu, Tao

    2018-03-02

    Shales have abundant micro-nano pores. Meanwhile, a considerable amount of fracturing liquid is imbibed spontaneously in the hydraulic fracturing process. The spontaneous imbibition in tortuous micro-nano pores is special to shale, and dynamic contact angle and slippage are two important characteristics. In this work, we mainly investigate spontaneous imbibition considering dynamic contact angle and slip effect in fractal tortuous capillaries. We introduce phase portrait analysis to analyse the dynamic state and stability of imbibition. Moreover, analytical solutions to the imbibition equation are derived under special situations, and the solutions are verified by published data. Finally, we discuss the influences of slip length, dynamic contact angle and gravity on spontaneous imbibition. The analysis shows that phase portrait is an ideal tool for analysing spontaneous imbibition because it can evaluate the process without solving the complex governing ordinary differential equations. Moreover, dynamic contact angle and slip effect play an important role in fluid imbibition in fractal tortuous capillaries. Neglecting slip effect in micro-nano pores apparently underestimates imbibition capability, and ignoring variations in contact angle causes inaccuracy in predicting imbibition speed at the initial stage of the process. Finally, gravity is one of the factors that control the stabilisation of the imbibition process.

  16. Contact Modelling in Isogeometric Analysis: Application to Sheet Metal Forming Processes

    NASA Astrophysics Data System (ADS)

    Cardoso, Rui P. R.; Adetoro, O. B.; Adan, D.

    2016-08-01

    Isogeometric Analysis (IGA) has been growing in popularity in the past few years, essentially due to the extra flexibility it introduces by using higher degrees in the basis functions, leading to higher convergence rates. IGA also offers the capability of easily reproducing discontinuous displacement and/or strain fields simply by manipulating the multiplicity of the knot parametric coordinates. Another advantage of IGA is that it uses the Non-Uniform Rational B-Spline (NURBS) basis functions that are very common in CAD solid modelling, and consequently it eases the transition from CAD models to numerical analysis. In this work, contact analysis in IGA is explored for both implicit and explicit time integration schemes. Special focus is given to contact search and contact detection techniques on NURBS patches, for both the rigid tools and the deformed sheet blank.
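    The B-spline basis functions underlying NURBS can be evaluated with the classical Cox-de Boor recursion, sketched below. A NURBS basis additionally weights these functions rationally; this sketch is a standard illustration of the basis, not part of the authors' contact implementation:

```python
def bspline_basis(i, p, t, knots):
    """Cox-de Boor recursion: value of the i-th B-spline basis function
    of degree p at parameter t (half-open support convention)."""
    if p == 0:
        return 1.0 if knots[i] <= t < knots[i + 1] else 0.0
    left = right = 0.0
    denom = knots[i + p] - knots[i]
    if denom > 0.0:
        left = (t - knots[i]) / denom * bspline_basis(i, p - 1, t, knots)
    denom = knots[i + p + 1] - knots[i + 1]
    if denom > 0.0:
        right = (knots[i + p + 1] - t) / denom * bspline_basis(i + 1, p - 1, t, knots)
    return left + right

# Open knot vector, degree 2, five basis functions. A NURBS basis would
# weight each function rationally; the partition of unity already holds.
knots = [0.0, 0.0, 0.0, 1.0, 2.0, 3.0, 3.0, 3.0]
print(sum(bspline_basis(i, 2, 1.5, knots) for i in range(5)))  # 1.0
```

Repeating a knot value raises its multiplicity, which is exactly the mechanism the abstract mentions for reproducing discontinuous displacement or strain fields.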

  17. YBYRÁ facilitates comparison of large phylogenetic trees.

    PubMed

    Machado, Denis Jacob

    2015-07-01

    The number and size of tree topologies being compared by phylogenetic systematists are increasing due to technological advancements in high-throughput DNA sequencing. However, we still lack tools to facilitate comparison among phylogenetic trees with a large number of terminals. The "YBYRÁ" project integrates software solutions for data analysis in phylogenetics. It comprises tools for (1) topological distance calculation based on the number of shared splits or clades, (2) sensitivity analysis and automatic generation of sensitivity plots and (3) clade diagnoses based on different categories of synapomorphies. YBYRÁ also provides (4) an original framework to facilitate the search for potential rogue taxa based on how much they affect average matching split distances (using MSdist). YBYRÁ facilitates comparison of large phylogenetic trees and outperforms competing software in terms of usability and time efficiency, especially for large data sets. The programs that comprise this toolkit are written in Python; hence, they do not require installation and have minimal dependencies. The entire project is available under an open-source licence at http://www.ib.usp.br/grant/anfibios/researchSoftware.html.
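    A split-based topological distance of the kind described in item (1) can be sketched by representing each tree as the set of its splits and counting the splits the trees do not share (the Robinson-Foulds idea). This toy example is illustrative only and is not YBYRÁ's implementation; the tree contents are invented:

```python
def split_distance(splits_a, splits_b):
    """Number of splits present in one tree but not the other
    (symmetric difference of the two split sets)."""
    return len(splits_a ^ splits_b)

# Each split is the frozenset of terminals on one side of an internal
# edge; these two hypothetical trees share only the {A, B} split.
tree1 = {frozenset({"A", "B"}), frozenset({"A", "B", "C"})}
tree2 = {frozenset({"A", "B"}), frozenset({"C", "D"})}
print(split_distance(tree1, tree2))  # 2
```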

  18. Specialization in and within sexual offending in England and Wales.

    PubMed

    Howard, Philip D; Barnett, Georgia D; Mann, Ruth E

    2014-06-01

    Existing evidence suggests that offenders tend not to specialize in sexual offending in general, but that there is some specialization in particular types of sexual offending. This study examined the sexual histories and reoffending of a large, national data set of offenders convicted of a sexual offense and managed in England and Wales by the National Offender Management Service (N = 14,804). The study found that specialization in sexual offending, as compared with nonsexual offending, was most evident for offenders with convictions for accessing indecent images. We also found considerable evidence of specialization within sexual offending, most notably for noncontact offenders, and again especially for indecent images offenders. Crossover between sexual offense types was very rare for those with contact adult offenses and for noncontact offenders, although those with child contact offenses sometimes crossed over to indecent images reoffending. If specialization within sexual offending exists, the use of single risk assessment instruments to predict all types of sexual recidivism may be less effective than previously assumed. A comparison of different prediction models indicated that some items presently used in one-size-fits-all risk tools to predict any sexual reoffending only effectively predict certain subtypes of sexual offending. Statistically, there appear to be some potential benefits to creating specialist risk predictors for different subtypes of offending, but further work is needed to justify the implementation demands that would be caused by abandoning one-size-fits-all tools.

  19. Heat Treatment of Tools in Light Industry

    NASA Astrophysics Data System (ADS)

    Petukhov, V. A.

    2005-09-01

    Heat treatment processes for some tools (knitting needles, travelers for thimbles of spinning and doubling frames, thread-forming spinnerets) used for the production of cloths, hosiery, and other articles) in the knitting and textile industries are considered. Problems of the choice of steel and the kind and parameters of heat treatment are discussed in connection with the special features of tool design and operating conditions.

  20. Constraint-based component-modeling for knowledge-based design

    NASA Technical Reports Server (NTRS)

    Kolb, Mark A.

    1992-01-01

    The paper describes the application of various advanced programming techniques derived from artificial intelligence research to the development of flexible design tools for conceptual design. Special attention is given to two techniques which appear to be readily applicable to such design tools: constraint propagation and object-oriented programming. The implementation of these techniques in a prototype computer tool, Rubber Airplane, is described.
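    A minimal sketch of constraint propagation is shown below: whenever enough values of a relation are known, the remaining one is deduced and pushed onward through the network. The wing-sizing relation and the class names are invented for illustration and do not reproduce Rubber Airplane's actual implementation:

```python
class Cell:
    """A value holder that notifies its constraints when it is set."""
    def __init__(self, name):
        self.name, self.value, self.constraints = name, None, []

    def set(self, value):
        if self.value is None:          # set once, then propagate
            self.value = value
            for constraint in self.constraints:
                constraint.propagate()

class Product:
    """Constraint out = x * y, solved for whichever cell is unknown."""
    def __init__(self, x, y, out):
        self.x, self.y, self.out = x, y, out
        for cell in (x, y, out):
            cell.constraints.append(self)

    def propagate(self):
        x, y, out = self.x.value, self.y.value, self.out.value
        if x is not None and y is not None and out is None:
            self.out.set(x * y)
        elif out is not None and x is not None and y is None:
            self.y.set(out / x)
        elif out is not None and y is not None and x is None:
            self.x.set(out / y)

# Hypothetical wing sizing: area = span * chord. Setting any two of
# the three cells determines the third, in either direction.
span, chord, area = Cell("span"), Cell("chord"), Cell("area")
Product(span, chord, area)
span.set(10.0)
area.set(25.0)
print(chord.value)  # 2.5
```

The bidirectionality is the point: the same network answers "what area does this span and chord give?" and "what chord do I need for this area?" without separate code paths.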

  1. Wireline system for multiple direct push tool usage

    DOEpatents

    Bratton, Wesley L.; Farrington, Stephen P.; Shinn, II, James D.; Nolet, Darren C.

    2003-11-11

    A tool latching and retrieval system allows the deployment and retrieval of a variety of direct push subsurface characterization tools through an embedded rod string during a single penetration without requiring withdrawal of the string from the ground. This enables the in situ interchange of different tools, as well as the rapid retrieval of soil core samples from multiple depths during a single direct push penetration. The system includes specialized rods that make up the rod string, a tool housing which is integral to the rod string, a lock assembly, and several tools which mate to the lock assembly.

  2. Using Contemporary Technology Tools to Improve the Effectiveness of Teacher Educators in Special Education

    ERIC Educational Resources Information Center

    O'Brien, Chris; Aguinaga, Nancy J.; Hines, Rebecca; Hartshorne, Richard

    2011-01-01

    Ongoing developments in educational technology, including web-based instruction, streaming video, podcasting, video-conferencing, and the use of wikis and blogs to create learning communities, have substantial impact on distance education and preparation of special educators in rural communities. These developments can be overwhelming, however,…

  3. 75 FR 76930 - Amendment to the International Traffic in Arms Regulations: Revision of U.S. Munitions List...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-10

    ... equipment. (1) (Tier 2) Production equipment, tooling, and test equipment ``specially designed'' for armored... designed'' for the articles controlled in this Category. Note 1 to paragraph (b): For production of major..., inspection and production equipment ``specially designed'' for a subsystem or component not specifically...

  4. Coloring Outside the Rules

    ERIC Educational Resources Information Center

    Cleaver, Samantha

    2006-01-01

    This article describes how the author finds a tool for behavior management in the special education teacher's classroom. The author had just begun teaching a self-contained special education class of five-year-olds with developmental delays when a student hit her. In the moment, she chose to ignore it. However, ignoring him was a mistake--he…

  5. Using the Internet in Rural Special Education: Accessing Resources.

    ERIC Educational Resources Information Center

    Bull, Kay S.; Kimball, Sarah L.

    This paper provides basic information on searching the Internet and describes sites of interest in the area of rural special education. The first section traces the evolution of the Internet through various phases--ARPANET, NSFNET, CERNET, and the beginnings of the World Wide Web--and describes various protocols (methods and tools) developed to…

  6. Special Education Related Services and Distance Education in the 21st Century Classroom

    ERIC Educational Resources Information Center

    Pantazis, Mary Ellen

    2013-01-01

    This exploratory study addresses special education related services and the requirements when a school-aged student with a disability attends school using synchronous distance education tools to access the least restrictive environment. The researcher examines these placements to explore the implications virtual schooling has on students receiving…

  7. Refocusing the Lens: Enhancing Elementary Special Education Reading Instruction through Video Self-Reflection

    ERIC Educational Resources Information Center

    Osipova, Anna; Prichard, Brooke; Boardman, Alison Gould; Kiely, Mary Theresa; Carroll, Patricia E.

    2011-01-01

    This article presents the findings from a pilot study exploring the use of video as a self-reflection tool combined with high-quality, collaborative professional development (PD). Participants were in-service, upper-elementary, special education instructors teaching word study and fluency to students with learning disabilities. Participants…

  8. Special Education Management System Project Document. 2. Santa Cruz BCP Observation Booklet.

    ERIC Educational Resources Information Center

    Santa Cruz County Superintendent of Schools, CA.

    Presented in booklet and chart form is the Behavioral Characteristics Progression (BCP), part of the Santa Cruz Special Education Management Project, consisting of 2400 observable traits grouped into 50 behavioral strands. The BCP is seen to be a nonstandardized criterion referenced tool which replaces conventional age and disability labels with…

  9. The Design of an IEP Decision Aid: A Tool for Diverse Parents of Children with Autism

    ERIC Educational Resources Information Center

    Schuttler, Jessica Oeth

    2012-01-01

    Decision-making is a universal process that occurs constantly in life. Parent participation in educational decision-making is recognized as important by special education law and by the special education and school psychology literature (Christenson & Sheridan, 2001; IDEIA, 2004). Partnership in decision-making is especially important for parents of…

  10. KSC-2014-1352

    NASA Image and Video Library

    2014-02-10

    CAPE CANAVERAL, Fla. – Special Rescue Operations firefighters with NASA Fire Rescue Services in the Protective Services Office at NASA’s Kennedy Space Center in Florida practice vehicle extrication training at an auto salvage yard near the center. In the foreground, a firefighter with an axe assists as another firefighter uses a special tool to punch through the door of the vehicle. A special hydraulic cutting tool and reciprocating saw were used to cut through and remove the roof. In the background, other firefighters are practicing with the Jaws of Life to simulate the rescue of a trapped and injured person. Kennedy’s firefighters recently achieved Pro Board Certification in aerial fire truck operations. With the completion of vehicle extrication and Jaws of Life training, the Protective Services Office is one step closer to achieving certification in vehicle machinery extrication. Kennedy’s firefighters are with G4S Government Solutions Inc., on the Kennedy Protective Services Contract. Photo credit: NASA/Daniel Casper

  11. KSC-2014-1353

    NASA Image and Video Library

    2014-02-10

    CAPE CANAVERAL, Fla. – Special Rescue Operations firefighters with NASA Fire Rescue Services in the Protective Services Office at NASA’s Kennedy Space Center in Florida practice vehicle extrication training at an auto salvage yard near the center. In the foreground, a firefighter with an axe assists as another firefighter uses a special tool to punch through the door of the vehicle. A special hydraulic cutting tool and reciprocating saw were used to cut through and remove the roof. In the background, other firefighters are practicing with the Jaws of Life to simulate the rescue of a trapped and injured person. Kennedy’s firefighters recently achieved Pro Board Certification in aerial fire truck operations. With the completion of vehicle extrication and Jaws of Life training, the Protective Services Office is one step closer to achieving certification in vehicle machinery extrication. Kennedy’s firefighters are with G4S Government Solutions Inc., on the Kennedy Protective Services Contract. Photo credit: NASA/Daniel Casper

  12. KSC-2014-1350

    NASA Image and Video Library

    2014-02-10

    CAPE CANAVERAL, Fla. – Special Rescue Operations firefighters with NASA Fire Rescue Services in the Protective Services Office at NASA’s Kennedy Space Center in Florida practice vehicle extrication training at an auto salvage yard near the center. Firefighters have removed the roof of the car using a special hydraulic cutting tool and reciprocating saw. Other firefighters have used axes and special tools to punch through and clear away the windshield and windows. Another firefighter uses the Jaws of Life on the car to simulate the rescue of a trapped and injured person. Kennedy’s firefighters recently achieved Pro Board Certification in aerial fire truck operations. With the completion of vehicle extrication and Jaws of Life training, the Protective Services Office is one step closer to achieving certification in vehicle machinery extrication. Kennedy’s firefighters are with G4S Government Solutions Inc., on the Kennedy Protective Services Contract. Photo credit: NASA/Daniel Casper

  13. KSC-2014-1349

    NASA Image and Video Library

    2014-02-10

    CAPE CANAVERAL, Fla. – Special Rescue Operations firefighters with NASA Fire Rescue Services in the Protective Services Office at NASA’s Kennedy Space Center in Florida practice vehicle extrication training at an auto salvage yard near the center. Firefighters carry away the roof of the car that was removed using a special hydraulic cutting tool and reciprocating saw. Other firefighters used axes and special tools to punch through and clear away the windshield and windows. They will use the Jaws of Life to simulate the rescue of a trapped and injured person. Kennedy’s firefighters recently achieved Pro Board Certification in aerial fire truck operations. With the completion of vehicle extrication and Jaws of Life training, the Protective Services Office is one step closer to achieving certification in vehicle machinery extrication. Kennedy’s firefighters are with G4S Government Solutions Inc., on the Kennedy Protective Services Contract. Photo credit: NASA/Daniel Casper

  14. KSC-2014-1351

    NASA Image and Video Library

    2014-02-10

    CAPE CANAVERAL, Fla. – Special Rescue Operations firefighters with NASA Fire Rescue Services in the Protective Services Office at NASA’s Kennedy Space Center in Florida practice vehicle extrication training at an auto salvage yard near the center. Two firefighters assist as another firefighter uses the Jaws of Life on the car to simulate the rescue of a trapped and injured person. A special hydraulic cutting tool and reciprocating saw were used to remove the roof of the vehicle. Other firefighters used axes and special tools to punch through and clear away the windshield and the windows. Kennedy’s firefighters recently achieved Pro Board Certification in aerial fire truck operations. With the completion of vehicle extrication and Jaws of Life training, the Protective Services Office is one step closer to achieving certification in vehicle machinery extrication. Kennedy’s firefighters are with G4S Government Solutions Inc., on the Kennedy Protective Services Contract. Photo credit: NASA/Daniel Casper

  15. KSC-2014-1355

    NASA Image and Video Library

    2014-02-10

    CAPE CANAVERAL, Fla. – Special Rescue Operations firefighters with NASA Fire Rescue Services in the Protective Services Office at NASA’s Kennedy Space Center in Florida practice vehicle extrication training at an auto salvage yard near the center. A firefighter uses the Jaws of Life to finish removing the door from the vehicle and simulate the rescue of a trapped and injured person. A special hydraulic cutting tool and reciprocating saw were used to cut through and remove the roof. An axe and other special tools were used to punch through and clear away the windshield and windows. Kennedy’s firefighters recently achieved Pro Board Certification in aerial fire truck operations. With the completion of vehicle extrication and Jaws of Life training, the Protective Services Office is one step closer to achieving certification in vehicle machinery extrication. Kennedy’s firefighters are with G4S Government Solutions Inc., on the Kennedy Protective Services Contract. Photo credit: NASA/Daniel Casper

  16. NeuronRead, an open source semi-automated tool for morphometric analysis of phase contrast and fluorescence neuronal images.

    PubMed

    Dias, Roberto A; Gonçalves, Bruno P; da Rocha, Joana F; da Cruz E Silva, Odete A B; da Silva, Augusto M F; Vieira, Sandra I

    2017-12-01

    Neurons are specialized cells of the Central Nervous System whose function is intricately related to the neuritic network they develop to transmit information. Morphological evaluation of this network and other neuronal structures is required to establish relationships between neuronal morphology and function, and may allow monitoring of physiological and pathophysiological alterations. Fluorescence-based microphotographs are the most widely used in cellular bioimaging, but phase contrast (PhC) microphotographs are easier to obtain, more affordable, and do not require invasive, complicated and disruptive techniques. Despite the various freeware tools available for fluorescence-based image analysis, few exist that can tackle the more elusive and harder-to-analyze PhC images. To overcome this, an interactive semi-automated image processing workflow was developed to easily extract relevant information (e.g. total neuritic length, average cell body area) from both PhC and fluorescence neuronal images. This workflow, named 'NeuronRead', was developed in the form of an ImageJ macro. Its robustness and adaptability were tested and validated on rat cortical primary neurons under control and differentiation-inhibitory conditions. Validation included a comparison to manual determinations and to a gold-standard freeware tool for fluorescence image analysis. NeuronRead was subsequently applied to PhC images of neurons at distinct differentiation days, exposed or not to DAPT, a pharmacological inhibitor of the γ-secretase enzyme, which cleaves the well-known Alzheimer's amyloid precursor protein (APP) and the Notch receptor. The data obtained confirm a neuritogenic regulatory role for γ-secretase products and validate NeuronRead as a useful, time- and cost-effective monitoring tool. Copyright © 2017. Published by Elsevier Inc.
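
    NeuronRead itself is an ImageJ macro, so the sketch below is only a language-neutral illustration of one of the measures named above (average cell body area from an already-segmented binary mask); the function name, 4-connectivity, and pixel-area parameter are assumptions, not NeuronRead's implementation.

```python
import numpy as np

def average_body_area(mask, pixel_area=1.0):
    """Average connected-component area in a binary soma mask.

    Uses a plain 4-connected flood fill; returns the mean area in the
    units of pixel_area. Illustrative only -- NeuronRead performs its
    measurements inside ImageJ.
    """
    mask = np.asarray(mask, bool)
    seen = np.zeros_like(mask)
    h, w = mask.shape
    areas = []
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                stack, area = [(sy, sx)], 0
                seen[sy, sx] = True
                while stack:                       # iterative flood fill
                    y, x = stack.pop()
                    area += 1
                    for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                areas.append(area * pixel_area)
    return sum(areas) / len(areas) if areas else 0.0
```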

  17. Concordium 2015: Strategic Uses of Evidence to Transform Delivery Systems

    PubMed Central

    Holve, Erin; Weiss, Samantha

    2016-01-01

    In September 2015 the EDM Forum hosted AcademyHealth’s newest national conference, Concordium. The 11 papers featured in the eGEMs “Concordium 2015” special issue successfully reflect the major themes and issues discussed at the meeting. Many of the papers address informatics or methodological approaches to natural language processing (NLP) or text analysis, which is indicative of the importance of analyzing text data to gain insights into care coordination and patient-centered outcomes. Perspectives on the tools and infrastructure requirements that are needed to build learning health systems were also recurrent themes. PMID:27683671

  18. DAB user's guide

    NASA Technical Reports Server (NTRS)

    Trosin, J.

    1985-01-01

    Use of the Display AButments (DAB) program, which plots PAN AIR geometries, is presented. The DAB program creates hidden-line displays of PAN AIR geometries and labels specified geometry components, such as abutments, networks, and network edges. It is used to alleviate the very time-consuming and error-prone abutment-list-checking phase of developing a valid PAN AIR geometry, and therefore represents a valuable tool for debugging complex PAN AIR geometry definitions. DAB is written in FORTRAN 77 and runs on a Digital Equipment Corporation VAX 11/780 under VMS. It utilizes a special color version of the SKETCH hidden-line analysis routine.

  19. Man-rated flight software for the F-8 DFBW program

    NASA Technical Reports Server (NTRS)

    Bairnsfather, R. R.

    1976-01-01

    The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program assembly control, simulator configuration control, erasable-memory load generation, change procedures and anomaly reporting are discussed. The primary verification tools are described, as well as the program test plans and their implementation on the various simulators. Failure effects analysis and the creation of special failure generating software for testing purposes are described.

  20. Haemocompatibility of iron oxide nanoparticles synthesized for theranostic applications: a high-sensitivity microfluidic tool

    NASA Astrophysics Data System (ADS)

    Rodrigues, Raquel O.; Bañobre-López, Manuel; Gallo, Juan; Tavares, Pedro B.; Silva, Adrián M. T.; Lima, Rui; Gomes, Helder T.

    2016-07-01

    The poor heating efficiency of the most reported magnetic nanoparticles (MNPs), allied to the lack of comprehensive biocompatibility and haemodynamic studies, hampers the spread of multifunctional nanoparticles as the next generation of therapeutic bio-agents in medicine. The present work reports the synthesis and characterization, with special focus on biological/toxicological compatibility, of superparamagnetic nanoparticles with diameter around 18 nm, suitable for theranostic applications (i.e. simultaneous diagnosis and therapy of cancer). Envisioning more insights into the complex nanoparticle-red blood cells (RBCs) membrane interaction, the deformability of the human RBCs in contact with magnetic nanoparticles (MNPs) was assessed for the first time with a microfluidic extensional approach, and used as an indicator of haematological disorders in comparison with a conventional haematological test, i.e. the haemolysis analysis. Microfluidic results highlight the potential of this microfluidic tool over traditional haemolysis analysis, by detecting small increments in the rigidity of the blood cells, when traditional haemotoxicology analysis showed no significant alteration (haemolysis rates lower than 2 %). The detected rigidity has been predicted to be due to the wrapping of small MNPs by the bilayer membrane of the RBCs, which is directly related to MNPs size, shape and composition. The proposed microfluidic tool adds a new dimension into the field of nanomedicine, allowing to be applied as a high-sensitivity technique capable of bringing a better understanding of the biological impact of nanoparticles developed for clinical applications.

  1. sRNAdb: A small non-coding RNA database for gram-positive bacteria

    PubMed Central

    2012-01-01

    Background The class of small non-coding RNA molecules (sRNA) regulates gene expression by different mechanisms and enables bacteria to mount a physiological response due to adaptation to the environment or infection. Over the last decades the number of sRNAs has been increasing rapidly. Several databases like Rfam or fRNAdb were extended to include sRNAs as a class of its own. Furthermore new specialized databases like sRNAMap (gram-negative bacteria only) and sRNATarBase (target prediction) were established. To the best of the authors’ knowledge no database focusing on sRNAs from gram-positive bacteria is publicly available so far. Description In order to understand sRNA’s functional and phylogenetic relationships we have developed sRNAdb and provide tools for data analysis and visualization. The data compiled in our database is assembled from experiments as well as from bioinformatics analyses. The software enables comparison and visualization of gene loci surrounding the sRNAs of interest. To accomplish this, we use a client–server based approach. Offline versions of the database including analyses and visualization tools can easily be installed locally on the user’s computer. This feature facilitates customized local addition of unpublished sRNA candidates and related information such as promoters or terminators using tab-delimited files. Conclusion sRNAdb allows a user-friendly and comprehensive comparative analysis of sRNAs from available sequenced gram-positive prokaryotic replicons. Offline versions including analysis and visualization tools facilitate complex user specific bioinformatics analyses. PMID:22883983
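
    The description above says unpublished sRNA candidates and related information such as promoters or terminators can be added locally from tab-delimited files. The exact column layout is not given in the abstract, so the columns below are purely illustrative; a minimal reader for such a file might look like:

```python
import csv
import io

# Assumed, illustrative column layout -- NOT sRNAdb's documented format.
SAMPLE = (
    "name\treplicon\tstart\tend\tstrand\n"
    "sRNA_cand1\tNC_003923\t1200\t1320\t+\n"
)

def read_candidates(text):
    """Parse tab-delimited sRNA candidate records into dicts,
    converting coordinates to integers."""
    reader = csv.DictReader(io.StringIO(text), delimiter="\t")
    rows = []
    for row in reader:
        row["start"], row["end"] = int(row["start"]), int(row["end"])
        rows.append(row)
    return rows
```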

  2. Alining Large Cylinders for Welding

    NASA Technical Reports Server (NTRS)

    Ehl, J. H.

    1985-01-01

    Special tooling alines and holds internally-stiffened large-diameter cylindrical parts for welding. Alinement brackets attached to strengthening fins on insides of cylindrical tank sections. Jackscrews on brackets raised or lowered to eliminate mismatches between adjacent sections. Tooling substantially reduces costs while allowing more precise control and improved quality.

  3. U.S. - GERMAN BILATERAL WORKING GROUP WORKSHOP ON: ECONOMIC TOOLS FOR SUSTAINABLE BROWNFIELDS REDEVELOPMENT

    EPA Science Inventory

    This CD-ROM contains information from a two-day workshop discussing innovative brownfields financing and economic strategies in the United States and Germany. A special emphasis was given to the identification of advantages and disadvantages of different financial tools, economi...

  4. Update 76: Selected Recent Works in the Social Sciences.

    ERIC Educational Resources Information Center

    Pike, Mary L., Ed.; Lusignan, Louise, Ed.

    This is a selected bibliography of current reference and acquisition tools in the social sciences. The tools include sourcebooks, dictionaries, indexes, conference proceedings, special bibliographies, directories, research reports, and journals. Most citations represent works published since 1970 and new editions of important earlier works.…

  5. Prosthetic Hand For Holding Rods, Tools, And Handles

    NASA Technical Reports Server (NTRS)

    Belcher, Jewell G., Jr.; Vest, Thomas W.

    1995-01-01

    Prosthetic hand with quick-grip/quick-release lever broadens range of specialized functions available to lower-arm amputee by providing improved capabilities for gripping rods, tools, handles, and like. Includes two stationary lower fingers opposed by one pivoting upper finger. Lever operates in conjunction with attached bracket.

  6. Tool Measures Depths of Defects on a Case Tang Joint

    NASA Technical Reports Server (NTRS)

    Ream, M. Bryan; Montgomery, Ronald B.; Mecham, Brent A.; Keirstead, Burns W.

    2005-01-01

    A special-purpose tool has been developed for measuring the depths of defects on an O-ring seal surface. The surface lies in a specially shaped ringlike fitting, called a capture feature tang, located on an end of a cylindrical segment of a case that contains a solid-fuel booster rocket motor for launching a space shuttle. The capture feature tang is a part of a tang-and-clevis, O-ring joint between the case segment and a similar, adjacent cylindrical case segment. When the segments are joined, the tang makes an interference fit with the clevis and squeezes the O-ring at the side of the gap.

  7. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

    We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for the analysis of continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of differential expression of each protein is based on a standardized Z-statistic derived from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known Empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with an accompanying computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method, originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
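
    The abstract names two ingredients: a standardized statistic on the log fold change and a false discovery rate estimate. As a hedged illustration of those two ideas only (not QPROT's Bayesian model, which also handles missing values), a plain-normal sketch with Benjamini-Hochberg adjustment might look like this; the function name is illustrative:

```python
import numpy as np
from math import erf

def simple_differential(log_a, log_b):
    """Toy differential-expression test: a standardized Z on the log fold
    change plus Benjamini-Hochberg FDR adjustment. Illustrative only --
    QPROT uses a posterior distribution and Empirical Bayes FDR."""
    diff = log_a.mean(axis=1) - log_b.mean(axis=1)            # log fold change
    se = np.sqrt(log_a.var(axis=1, ddof=1) / log_a.shape[1]
                 + log_b.var(axis=1, ddof=1) / log_b.shape[1])
    z = diff / se
    # two-sided p-values under a normal approximation
    p = np.array([2.0 * (1.0 - 0.5 * (1.0 + erf(abs(v) / 2 ** 0.5))) for v in z])
    # Benjamini-Hochberg step-up adjusted p-values
    m = len(p)
    order = np.argsort(p)
    adj = np.empty(m)
    prev = 1.0
    for rank, idx in enumerate(order[::-1]):                  # largest p first
        prev = min(prev, p[idx] * m / (m - rank))
        adj[idx] = prev
    return z, p, adj
```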

  8. Diversity and Evolution of Mycobacterium tuberculosis: Moving to Whole-Genome-Based Approaches

    PubMed Central

    Niemann, Stefan; Supply, Philip

    2014-01-01

    Genotyping of clinical Mycobacterium tuberculosis complex (MTBC) strains has become a standard tool for epidemiological tracing and for the investigation of the local and global strain population structure. Of special importance is the analysis of the expansion of multidrug (MDR) and extensively drug-resistant (XDR) strains. Classical genotyping and, more recently, whole-genome sequencing have revealed that the strains of the MTBC are more diverse than previously anticipated. Globally, several phylogenetic lineages can be distinguished whose geographical distribution is markedly variable. Strains of particular (sub)lineages, such as Beijing, seem to be more virulent and associated with enhanced resistance levels and fitness, likely fueling their spread in certain world regions. The upcoming generalization of whole-genome sequencing approaches will expectedly provide more comprehensive insights into the molecular and epidemiological mechanisms involved and lead to better diagnostic and therapeutic tools. PMID:25190252

  9. Bioinformatics tools in predictive ecology: applications to fisheries

    PubMed Central

    Tucker, Allan; Duplisea, Daniel

    2012-01-01

    There has been a huge effort in the advancement of analytical techniques for molecular biological data over the past decade. This has led to many novel algorithms that are specialized to deal with data associated with biological phenomena, such as gene expression and protein interactions. In contrast, ecological data analysis has remained focused to some degree on off-the-shelf statistical techniques though this is starting to change with the adoption of state-of-the-art methods, where few assumptions can be made about the data and a more explorative approach is required, for example, through the use of Bayesian networks. In this paper, some novel bioinformatics tools for microarray data are discussed along with their ‘crossover potential’ with an application to fisheries data. In particular, a focus is made on the development of models that identify functionally equivalent species in different fish communities with the aim of predicting functional collapse. PMID:22144390

  10. MOLNs: A cloud platform for interactive, reproducible, and scalable spatial stochastic computational experiments in systems biology using PyURDME.

    PubMed

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.

  11. SeqCompress: an algorithm for biological sequence compression.

    PubMed

    Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz; Bajwa, Hassan

    2014-10-01

    The growth of Next Generation Sequencing technologies presents significant research challenges, specifically to design bioinformatics tools that handle massive amounts of data efficiently. The cost of storing biological sequence data has become a noticeable proportion of the total cost of its generation and analysis. In particular, the increase in the DNA sequencing rate is significantly outstripping the rate of increase in disk storage capacity, and may eventually exceed it. It is essential to develop algorithms that handle large data sets via better memory management. This article presents a DNA sequence compression algorithm, SeqCompress, that copes with the space complexity of biological sequences. The algorithm is based on lossless data compression and uses a statistical model as well as arithmetic coding to compress DNA sequences. The proposed algorithm is compared with recent specialized compression tools for biological sequences. Experimental results show that the proposed algorithm achieves better compression gain than other existing algorithms. Copyright © 2014 Elsevier Inc. All rights reserved.
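
    SeqCompress itself combines a statistical model with arithmetic coding; as a much simpler baseline that shows why DNA sequences compress well at all, plain 2-bit packing of the four bases is sketched below (this is not SeqCompress's algorithm):

```python
# 2-bit packing of A/C/G/T: a far simpler scheme than SeqCompress's
# statistical model + arithmetic coder, shown only as the naive
# baseline that specialized DNA compressors must beat.
CODE = {'A': 0, 'C': 1, 'G': 2, 'T': 3}
BASE = 'ACGT'

def pack(seq):
    """Pack an ACGT string into bytes, 4 bases per byte.
    Returns (packed bytes, original length) -- the length is
    needed to undo the padding in the final byte."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for j, ch in enumerate(seq[i:i + 4]):
            byte |= CODE[ch] << (2 * j)
        out.append(byte)
    return bytes(out), len(seq)

def unpack(data, n):
    """Inverse of pack: recover the first n bases from packed bytes."""
    return ''.join(BASE[(data[i // 4] >> (2 * (i % 4))) & 3] for i in range(n))
```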

  12. Life Participation for Parents: a tool for family-centered occupational therapy.

    PubMed

    Fingerhut, Patricia E

    2013-01-01

    This study describes the continued development of the Life Participation for Parents (LPP), a measurement tool to facilitate family-centered pediatric practice. LPP questionnaires were completed by 162 parents of children with special needs receiving intervention at 15 pediatric private practice clinics. Results were analyzed to establish instrument reliability and validity. Good internal consistency (α = .90) and test-retest reliability (r = .89) were established. Construct validity was examined through assessment of internal structure and comparison of the instrument to related variables. A principal components analysis resulted in a two-factor model accounting for 43.81% of the variance. As hypothesized, the LPP correlated only moderately with the Parenting Stress Index-Short Form (r = .54). The variables of child's diagnoses, age, and time in therapy did not predict parental responses. The LPP is a reliable and valid instrument for measuring satisfaction with parental participation in life occupations. Copyright © 2013 by the American Occupational Therapy Association, Inc.
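
    The internal-consistency figure quoted above (α = .90) is Cronbach's alpha, which can be computed from a respondents-by-items score matrix with the standard formula; this sketch is generic and is not the LPP study's own analysis code:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a (respondents x items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)
```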

  13. A MATLAB toolbox and Excel workbook for calculating the densities, seismic wave speeds, and major element composition of minerals and rocks at pressure and temperature

    NASA Astrophysics Data System (ADS)

    Abers, Geoffrey A.; Hacker, Bradley R.

    2016-02-01

    To interpret seismic images, rock seismic velocities need to be calculated at elevated pressure and temperature for arbitrary compositions. This technical report describes an algorithm, software, and data to make such calculations from the physical properties of minerals. It updates a previous compilation and Excel® spreadsheet and includes new MATLAB® tools for the calculations. The database of 60 mineral end-members includes all parameters needed to estimate density and elastic moduli for many crustal and mantle rocks at conditions relevant to the upper few hundreds of kilometers of Earth. The behavior of α and β quartz is treated as a special case, owing to its unusual Poisson's ratio and thermal expansion that vary rapidly near the α-β transition. The MATLAB tools allow integration of these calculations into a variety of modeling and data analysis projects.
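
    At the core of such calculations are the standard isotropic relations Vp = sqrt((K + 4G/3)/ρ) and Vs = sqrt(G/ρ). A minimal sketch follows (in Python rather than the toolbox's MATLAB, with illustrative olivine-like moduli rather than values drawn from the 60-mineral database):

```python
from math import sqrt

def seismic_velocities(K, G, rho):
    """Isotropic P- and S-wave speeds from bulk modulus K and shear
    modulus G (Pa) and density rho (kg/m^3); returns (Vp, Vs) in m/s.
    These are the standard relations such toolboxes evaluate after
    adjusting moduli and density for pressure and temperature."""
    vp = sqrt((K + 4.0 * G / 3.0) / rho)
    vs = sqrt(G / rho)
    return vp, vs

# Roughly olivine-like moduli (illustrative room-condition magnitudes):
vp, vs = seismic_velocities(K=129e9, G=78e9, rho=3300.0)
```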

  14. From direct-space discrepancy functions to crystallographic least squares.

    PubMed

    Giacovazzo, Carmelo

    2015-01-01

    Crystallographic least squares are a fundamental tool for crystal structure analysis. In this paper their properties are derived from functions estimating the degree of similarity between two electron-density maps. The new approach leads also to modifications of the standard least-squares procedures, potentially able to improve their efficiency. The role of the scaling factor between observed and model amplitudes is analysed: the concept of unlocated model is discussed and its scattering contribution is combined with that arising from the located model. Also, the possible use of an ancillary parameter, to be associated with the classical weight related to the variance of the observed amplitudes, is studied. The crystallographic discrepancy factors, basic tools often combined with least-squares procedures in phasing approaches, are analysed. The mathematical approach here described includes, as a special case, the so-called vector refinement, used when accurate estimates of the target phases are available.
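
    One of the discrepancy factors discussed, together with the role of the scale factor between observed and model amplitudes, can be illustrated with the conventional crystallographic R factor; the least-squares scale k below is the standard choice, not the modified procedure proposed in the paper:

```python
import numpy as np

def r_factor(f_obs, f_calc):
    """Conventional crystallographic R = sum(| |Fo| - k|Fc| |) / sum(|Fo|),
    with k chosen to minimize sum((|Fo| - k|Fc|)^2) -- the scaling role
    between observed and model amplitudes. Illustrative only."""
    f_obs = np.abs(np.asarray(f_obs, float))
    f_calc = np.abs(np.asarray(f_calc, float))
    k = (f_obs * f_calc).sum() / (f_calc ** 2).sum()   # least-squares scale
    return np.abs(f_obs - k * f_calc).sum() / f_obs.sum()
```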

  15. STRUTEX: A prototype knowledge-based system for initially configuring a structure to support point loads in two dimensions

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Sobieszczanski-Sobieski, Jaroslaw

    1989-01-01

    Only recently have engineers begun making use of Artificial Intelligence (AI) tools in the area of conceptual design. To continue filling this void in the design process, a prototype knowledge-based system, called STRUTEX has been developed to initially configure a structure to support point loads in two dimensions. This prototype was developed for testing the application of AI tools to conceptual design as opposed to being a testbed for new methods for improving structural analysis and optimization. This system combines numerical and symbolic processing by the computer with interactive problem solving aided by the vision of the user. How the system is constructed to interact with the user is described. Of special interest is the information flow between the knowledge base and the data base under control of the algorithmic main program. Examples of computed and refined structures are presented during the explanation of the system.

  16. Bioinformatics tools in predictive ecology: applications to fisheries.

    PubMed

    Tucker, Allan; Duplisea, Daniel

    2012-01-19

    There has been a huge effort in the advancement of analytical techniques for molecular biological data over the past decade. This has led to many novel algorithms that are specialized to deal with data associated with biological phenomena, such as gene expression and protein interactions. In contrast, ecological data analysis has remained focused to some degree on off-the-shelf statistical techniques though this is starting to change with the adoption of state-of-the-art methods, where few assumptions can be made about the data and a more explorative approach is required, for example, through the use of Bayesian networks. In this paper, some novel bioinformatics tools for microarray data are discussed along with their 'crossover potential' with an application to fisheries data. In particular, a focus is made on the development of models that identify functionally equivalent species in different fish communities with the aim of predicting functional collapse.

  17. Data mining in radiology

    PubMed Central

    Kharat, Amit T; Singh, Amarjit; Kulkarni, Vilas M; Shah, Digish

    2014-01-01

    Data mining facilitates the study of radiology data in various dimensions. It converts large patient image and text datasets into useful information that helps in improving patient care and provides informative reports. Data mining technology analyzes data within the Radiology Information System and the Hospital Information System using specialized software that assesses relationships and agreement in the available information. By using such data analysis tools, radiologists can make informed decisions and predict the future outcome of a particular imaging finding. Data, information and knowledge are the components of data mining. Classes, clusters, associations, sequential patterns, classification, prediction and decision trees are the various types of data mining. Data mining has the potential to make the delivery of health care affordable and to ensure that the best imaging practices are followed. It is also a tool for academic research. Data mining is considered ethically neutral; however, concerns regarding privacy and legality exist and need to be addressed to ensure the success of data mining. PMID:25024513

  18. Genomes in the cloud: balancing privacy rights and the public good.

    PubMed

    Ohno-Machado, Lucila; Farcas, Claudiu; Kim, Jihoon; Wang, Shuang; Jiang, Xiaoqian

    2013-01-01

    The NIH-funded iDASH National Center for Biomedical Computing was created in 2010 with the goal of developing infrastructure, algorithms, and tools to integrate Data for Analysis, 'anonymization,' and SHaring. iDASH is based on the premise that, while a strong case for not sharing information to preserve individual privacy can be made, an equally compelling case for sharing genome information for the public good (i.e., to support new discoveries that promote health or alleviate the burden of disease) should also be made. In fact, these cases do not need to be mutually exclusive: genome data sharing on a cloud does not necessarily have to compromise individual privacy, although current practices need significant improvement. So far, protection of subject data from re-identification and misuse has been relying primarily on regulations such as HIPAA, the Common Rule, and GINA. However, protection of biometrics such as a genome requires specialized infrastructure and tools.

  19. [Teacher's performance assessment in Family Medicine specialization].

    PubMed

    Martínez-González, Adrián; Gómez-Clavelina, Francisco J; Hernández-Torres, Isaías; Flores-Hernández, Fernando; Sánchez-Mendiola, Melchor

    2016-01-01

    In Mexico there is no systematic evaluation of teachers in medical specialties, which makes it difficult to identify appropriate teaching practices; this lack of evaluation has limited the recognition and improvement of teaching. The objective of this study was to analyze student feedback on the teaching activities of the teacher-tutors responsible for the specialization course in family medicine, and to evaluate the evidence of reliability and validity of the instrument, which was administered online. It was an observational, cross-sectional study. Seventy-eight Family Medicine residency teachers were evaluated through the opinions of 734 residents. The anonymous questionnaire assesses teaching performance from the residents' perspective and is composed of 5 dimensions rated on a Likert scale. Descriptive and inferential statistics (t test, one-way ANOVA and factor analysis) were used. Residents rated teaching performance as acceptable, with an average of 4.25 ± 0.93. The best-rated dimension was "Methodology", with an average of 4.34 ± 0.92, in contrast to the "Assessment" dimension with 4.16 ± 1.04. In the residents' opinion, teachers of the family medicine specialization perform acceptably, and the online assessment tool meets the criteria of validity and reliability.

  20. Machine Tool Advanced Skills Technology (MAST). Common Ground: Toward a Standards-Based Training System for the U.S. Machine Tool and Metal Related Industries. Volume 5: Mold Making, of a 15-Volume Set of Skill Standards and Curriculum Training Materials for the Precision Manufacturing Industry.

    ERIC Educational Resources Information Center

    Texas State Technical Coll., Waco.

    This document is intended to help education and training institutions deliver the Machine Tool Advanced Skills Technology (MAST) curriculum to a variety of individuals and organizations. MAST consists of industry-specific skill standards and model curricula for 15 occupational speciality areas within the U.S. machine tool and metals-related…

  1. Tools for Local and Distributed Climate Data Access

    NASA Astrophysics Data System (ADS)

    Schweitzer, R.; O'Brien, K.; Burger, E. F.; Smith, K. M.; Manke, A. B.; Radhakrishnan, A.; Balaji, V.

    2017-12-01

    Last year we reported on our efforts to adapt existing tools to facilitate model development. During the lifecycle of a Climate Model Intercomparison Project (CMIP), data must be quality controlled before it can be published and studied. Like previous efforts, the next CMIP6 will produce an unprecedented volume of data. For an institution, modelling group or modeller the volume of data is unmanageable without tools that organize and automate as many processes as possible. Even if a modelling group has tools for data and metadata management, it often falls on individuals to do the initial quality assessment for a model run with bespoke tools. Using individually crafted tools can lead to interruptions when project personnel change and may result in inconsistencies and duplication of effort across groups. This talk will expand on our experiences using available tools (Ferret/PyFerret, the Live Access Server, the GFDL Curator, the GFDL Model Development Database Interface and the THREDDS Data Server) to seamlessly automate the data assembly process to give users "one-click" access to a rich suite of Web-based analysis and comparison tools. On the surface, it appears that this collection of tools is well suited to the task, but our experience of the last year taught us that the data volume and distributed storage adds a number of challenges in adapting the tools for this task. Quality control and initial evaluation add their own set of challenges. We will discuss how we addressed the needs of QC researchers by expanding standard tools to include specialized plots and leveraged the configurability of the tools to add specific user defined analysis operations so they are available to everyone using the system. We also report on our efforts to overcome some of the technical barriers for wide adoption of the tools by providing pre-built containers that are easily deployed in virtual machine and cloud environments. 
Finally, we will offer some suggestions for added features, configuration options and improved robustness that can make future implementations of similar systems operate faster and more reliably. Solving these challenges for data sets distributed narrowly across networks and storage systems points the way to solving similar problems associated with sharing data distributed across institutions and continents.

  2. Research on the spatial analysis method of seismic hazard for island

    NASA Astrophysics Data System (ADS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: microscopically, its results provide parameters for seismic design; macroscopically, it is prerequisite work for the earthquake and comprehensive disaster prevention components of island conservation planning, during the exploitation and construction of both inhabited and uninhabited islands. Existing seismic hazard analysis methods are compared with respect to their applications, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then proposed to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS's ModelBuilder platform.
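    The fuzzy comprehensive evaluation step can be sketched as a weighted combination of per-index membership vectors over hazard grades. The following is a minimal illustration only: the index names, weights, membership values and three-grade scale are hypothetical placeholders, not the paper's 11-index model.

```python
# Fuzzy comprehensive evaluation: combine hazard indices into a
# membership vector over hazard grades via B = W . R.
# All index names, weights and membership values are illustrative.

def fuzzy_evaluate(weights, membership):
    """weights: per-index weights summing to 1.
    membership: rows = indices, columns = hazard grades.
    Returns the weighted membership vector over grades."""
    n_grades = len(membership[0])
    return [sum(w * row[g] for w, row in zip(weights, membership))
            for g in range(n_grades)]

# Three illustrative indices (e.g. fault density, historical seismicity,
# gravity anomaly gradient) with illustrative weights:
weights = [0.4, 0.35, 0.25]
membership = [
    [0.1, 0.3, 0.6],   # index 1 leans toward "high" hazard
    [0.5, 0.4, 0.1],   # index 2 leans toward "low"
    [0.2, 0.5, 0.3],   # index 3 leans toward "medium"
]
grades = fuzzy_evaluate(weights, membership)
best = max(range(len(grades)), key=grades.__getitem__)
print(["low", "medium", "high"][best])  # grade with maximum membership
```

    In a GIS workflow such as the one described, each membership row would be derived from a spatially computed index raster rather than typed in by hand.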

  3. Database resources of the National Center for Biotechnology Information

    PubMed Central

    Wheeler, David L.; Barrett, Tanya; Benson, Dennis A.; Bryant, Stephen H.; Canese, Kathi; Chetvernin, Vyacheslav; Church, Deanna M.; DiCuccio, Michael; Edgar, Ron; Federhen, Scott; Geer, Lewis Y.; Helmberg, Wolfgang; Kapustin, Yuri; Kenton, David L.; Khovayko, Oleg; Lipman, David J.; Madden, Thomas L.; Maglott, Donna R.; Ostell, James; Pruitt, Kim D.; Schuler, Gregory D.; Schriml, Lynn M.; Sequeira, Edwin; Sherry, Stephen T.; Sirotkin, Karl; Souvorov, Alexandre; Starchenko, Grigory; Suzek, Tugba O.; Tatusov, Roman; Tatusova, Tatiana A.; Wagner, Lukas; Yaschenko, Eugene

    2006-01-01

    In addition to maintaining the GenBank(R) nucleic acid sequence database, the National Center for Biotechnology Information (NCBI) provides analysis and retrieval resources for the data in GenBank and other biological data made available through NCBI's Web site. NCBI resources include Entrez, the Entrez Programming Utilities, MyNCBI, PubMed, PubMed Central, Entrez Gene, the NCBI Taxonomy Browser, BLAST, BLAST Link (BLink), Electronic PCR, OrfFinder, Spidey, Splign, RefSeq, UniGene, HomoloGene, ProtEST, dbMHC, dbSNP, Cancer Chromosomes, Entrez Genomes and related tools, the Map Viewer, Model Maker, Evidence Viewer, Clusters of Orthologous Groups, Retroviral Genotyping Tools, HIV-1, Human Protein Interaction Database, SAGEmap, Gene Expression Omnibus, Entrez Probe, GENSAT, Online Mendelian Inheritance in Man, Online Mendelian Inheritance in Animals, the Molecular Modeling Database, the Conserved Domain Database, the Conserved Domain Architecture Retrieval Tool and the PubChem suite of small molecule databases. Augmenting many of the Web applications are custom implementations of the BLAST program optimized to search specialized datasets. All of the resources can be accessed through the NCBI home page at: . PMID:16381840

  4. Database resources of the National Center for Biotechnology Information.

    PubMed

    Sayers, Eric W; Barrett, Tanya; Benson, Dennis A; Bolton, Evan; Bryant, Stephen H; Canese, Kathi; Chetvernin, Vyacheslav; Church, Deanna M; Dicuccio, Michael; Federhen, Scott; Feolo, Michael; Fingerman, Ian M; Geer, Lewis Y; Helmberg, Wolfgang; Kapustin, Yuri; Krasnov, Sergey; Landsman, David; Lipman, David J; Lu, Zhiyong; Madden, Thomas L; Madej, Tom; Maglott, Donna R; Marchler-Bauer, Aron; Miller, Vadim; Karsch-Mizrachi, Ilene; Ostell, James; Panchenko, Anna; Phan, Lon; Pruitt, Kim D; Schuler, Gregory D; Sequeira, Edwin; Sherry, Stephen T; Shumway, Martin; Sirotkin, Karl; Slotta, Douglas; Souvorov, Alexandre; Starchenko, Grigory; Tatusova, Tatiana A; Wagner, Lukas; Wang, Yanli; Wilbur, W John; Yaschenko, Eugene; Ye, Jian

    2012-01-01

    In addition to maintaining the GenBank® nucleic acid sequence database, the National Center for Biotechnology Information (NCBI) provides analysis and retrieval resources for the data in GenBank and other biological data made available through the NCBI Website. NCBI resources include Entrez, the Entrez Programming Utilities, MyNCBI, PubMed, PubMed Central (PMC), Gene, the NCBI Taxonomy Browser, BLAST, BLAST Link (BLink), Primer-BLAST, COBALT, Splign, RefSeq, UniGene, HomoloGene, ProtEST, dbMHC, dbSNP, dbVar, Epigenomics, Genome and related tools, the Map Viewer, Model Maker, Evidence Viewer, Trace Archive, Sequence Read Archive, BioProject, BioSample, Retroviral Genotyping Tools, HIV-1/Human Protein Interaction Database, Gene Expression Omnibus (GEO), Probe, Online Mendelian Inheritance in Animals (OMIA), the Molecular Modeling Database (MMDB), the Conserved Domain Database (CDD), the Conserved Domain Architecture Retrieval Tool (CDART), Biosystems, Protein Clusters and the PubChem suite of small molecule databases. Augmenting many of the Web applications are custom implementations of the BLAST program optimized to search specialized data sets. All of these resources can be accessed through the NCBI home page at www.ncbi.nlm.nih.gov.

  5. Investigation of Friction Stir Welding of Al Metal Matrix Composite Materials

    NASA Technical Reports Server (NTRS)

    Diwan, Ravinder M.

    2003-01-01

    The innovative process of Friction Stir Welding (FSW) has generated tremendous interest since its inception, with the first patent filed in 1991 by TWI of Cambridge, England, as reflected in many recent international conferences and publications on the subject. Still, the process needs both intensive basic study of the deformation mechanisms at work during FSW and analysis and feasibility studies to evaluate production methods that will yield high-quality, strong welds from the stirring action of the appropriate pin tool into the weld plate materials. Development of production processes is a complex task that involves the effects of material thickness, material weldability, pin tool design, pin height, and pin shoulder diameter, along with related control conditions. The frictional heating at the rotational speeds of the pin tool as it plunges into the material, and the ensuing plastic flow arising during the traverse of the welding faying surfaces, provide the known special advantages of the FSW process in this new area of advanced joining technology.

  6. Database resources of the National Center for Biotechnology

    PubMed Central

    Wheeler, David L.; Church, Deanna M.; Federhen, Scott; Lash, Alex E.; Madden, Thomas L.; Pontius, Joan U.; Schuler, Gregory D.; Schriml, Lynn M.; Sequeira, Edwin; Tatusova, Tatiana A.; Wagner, Lukas

    2003-01-01

    In addition to maintaining the GenBank(R) nucleic acid sequence database, the National Center for Biotechnology Information (NCBI) provides data analysis and retrieval resources for the data in GenBank and other biological data made available through NCBI's Web site. NCBI resources include Entrez, PubMed, PubMed Central (PMC), LocusLink, the NCBI Taxonomy Browser, BLAST, BLAST Link (BLink), Electronic PCR (e-PCR), Open Reading Frame (ORF) Finder, Reference Sequence (RefSeq), UniGene, HomoloGene, ProtEST, the Database of Single Nucleotide Polymorphisms (dbSNP), the Human/Mouse Homology Map, the Cancer Chromosome Aberration Project (CCAP), Entrez Genomes and related tools, the Map Viewer, Model Maker (MM), Evidence Viewer (EV), the Clusters of Orthologous Groups (COGs) database, Retroviral Genotyping Tools, SAGEmap, Gene Expression Omnibus (GEO), Online Mendelian Inheritance in Man (OMIM), the Molecular Modeling Database (MMDB), and the Conserved Domain Database (CDD) and Conserved Domain Architecture Retrieval Tool (CDART). Augmenting many of the Web applications are custom implementations of the BLAST program optimized to search specialized data sets. All of the resources can be accessed through the NCBI home page at: http://www.ncbi.nlm.nih.gov. PMID:12519941

  7. A special purpose silicon compiler for designing supercomputing VLSI systems

    NASA Technical Reports Server (NTRS)

    Venkateswaran, N.; Murugavel, P.; Kamakoti, V.; Shankarraman, M. J.; Rangarajan, S.; Mallikarjun, M.; Karthikeyan, B.; Prabhakar, T. S.; Satish, V.; Venkatasubramaniam, P. R.

    1991-01-01

    Design of general/special purpose supercomputing VLSI systems for numeric algorithm execution involves tackling two important aspects, namely their computational and communication complexities. Developing software tools for designing such systems is itself complex, so a novel design methodology has to be developed. Designing such complex systems calls for a special purpose silicon compiler in which: the computational and communication structures of different numeric algorithms are taken into account to simplify the silicon compiler design; the approach is macrocell based; and the software tools at different levels (from the algorithm down to the VLSI circuit layout) are integrated. In this paper a special purpose silicon (SPS) compiler based on PACUBE macrocell VLSI arrays for designing supercomputing VLSI systems is presented. It is shown that turn-around time and silicon real estate are reduced compared with silicon compilers based on PLAs, SLAs, and gate arrays. The first two silicon compiler characteristics mentioned above enable the SPS compiler to perform systolic mapping (at the macrocell level) of algorithms whose computational structures are of GIPOP (generalized inner product outer product) form. Direct systolic mapping on PLAs, SLAs, and gate arrays is very difficult as they are micro-cell based. A novel GIPOP processor is under development using this special purpose silicon compiler.

  8. Decision generation tools and Bayesian inference

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software subsystems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDG based on Bayesian inference and related to adverse (hostile) networks, including such important applications as terrorism-related and organized-crime networks.
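    The Bayesian-inference core of such a DDG tool reduces to posterior updates over hypotheses about network nodes. A minimal sketch follows; the prior and likelihood values are hypothetical placeholders, not figures from the paper.

```python
# Bayes update: posterior probability that a network node is "hostile"
# given one observed indicator. All numbers are illustrative only.

def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Return P(H | E) from prior P(H) and likelihoods P(E|H), P(E|~H)."""
    evidence = likelihood_h * prior + likelihood_not_h * (1.0 - prior)
    return likelihood_h * prior / evidence

prior = 0.05            # assumed base rate of hostile nodes
p_obs_hostile = 0.8     # assumed P(indicator | hostile)
p_obs_benign = 0.1      # assumed P(indicator | benign)

posterior = bayes_update(prior, p_obs_hostile, p_obs_benign)
print(round(posterior, 3))  # a single observation raises 0.05 to ~0.296
```

    Chaining such updates over successive observations, with the previous posterior becoming the next prior, is the usual way this style of inference is applied to evidence accumulating about a node.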

  9. Improving Disability Awareness among Extension Agents

    ERIC Educational Resources Information Center

    Mahadevan, Lakshmi; Peterson, Rick L.; Grenwelge, Cheryl

    2014-01-01

    Increasing prevalence rates and legislative mandates imply that educators, parents, and Extension agents will need better tools and resources to meet the needs of special populations. The Texas A&M AgriLife Extension Service addresses this issue by using e-learning tools. Extension agents can take advantage of these courses to gain critical…

  10. Biodiversity Conservation and Conservation Biotechnology Tools

    USDA-ARS?s Scientific Manuscript database

    This special issue is dedicated to the in vitro tools and methods used to conserve the genetic diversity of rare and threatened species from around the world. Species that are on the brink of extinction, due to the rapid loss of genetic diversity and habitat, come mainly from resource poor areas the...

  11. Puppetry: Opportunities for Success.

    ERIC Educational Resources Information Center

    Roysdon, Douglas

    The booklet describes ways in which puppetry can promote growth in special and regular education students. Current usage is traced in four categories: puppetry as a demonstrative teaching tool, as an approach to help develop language and communication skills, as a therapeutic tool, and as a form of theater and school arts. Guidelines are presented…

  12. The use of a new ecosystem services assessment tool, EPA H2O, for identifying, quantifying, and valuing ecosystem services production.

    EPA Science Inventory

    The task of estimating ecosystem service production and delivery deserves special attention. Assessment tools that incorporate both supply and delivery of ecosystem services are needed to better understand how ecosystem services production becomes realized benefits. Here, we de...

  13. PROS: An IRAF based system for analysis of x ray data

    NASA Technical Reports Server (NTRS)

    Conroy, M. A.; Deponte, J.; Moran, J. F.; Orszak, J. S.; Roberts, W. P.; Schmidt, D.

    1992-01-01

    PROS is an IRAF based software package for the reduction and analysis of x-ray data. The use of a standard, portable, integrated environment provides for both multi-frequency and multi-mission analysis. The analysis of x-ray data differs from optical analysis due to the nature of the x-ray data and its acquisition under constantly varying conditions. The scarcity of data, the low signal-to-noise ratio and the large gaps in exposure time make data screening and masking an important part of the analysis. PROS was developed to support the analysis of data from the ROSAT and Einstein missions, but many of the tasks have been used on data from other missions. IRAF/PROS provides a complete end-to-end system for x-ray data analysis: (1) a set of tools for importing and exporting data via FITS format -- in particular, IRAF provides a specialized event-list format, QPOE, that is compatible with its IMAGE (2-D array) format; (2) a powerful set of IRAF system capabilities for both temporal and spatial event filtering; (3) a full set of imaging and graphics tasks; (4) specialized packages for scientific analysis such as spatial, spectral and timing analysis -- these consist of both general and mission-specific tasks; and (5) complete system support, including ftp and magnetic tape releases, electronic and conventional mail hotline support, and electronic mail distribution of solutions to frequently asked questions and currently known bugs. We discuss the philosophy, architecture and development environment used by PROS to generate a portable, multi-mission software environment. PROS is available on all platforms that support IRAF, including Sun/Unix, VAX/VMS, HP, and DECstations. It is available on request at no charge.

  14. Demonstration of a Safety Analysis on a Complex System

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy; Alfaro, Liliana; Alvarado, Christine; Brown, Molly; Hunt, Earl B.; Jaffe, Matt; Joslyn, Susan; Pinnell, Denise; Reese, Jon; Samarziya, Jeffrey

    1997-01-01

    For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.

  15. Visualization of protein interaction networks: problems and solutions

    PubMed Central

    2013-01-01

    Background Visualization concerns the representation of data visually and is an important task in scientific research. Protein-protein interactions (PPI) are discovered using either wet lab techniques, such as mass spectrometry, or in silico prediction tools, resulting in large collections of interactions stored in specialized databases. The set of all interactions of an organism forms a protein-protein interaction network (PIN) and is an important tool for studying the behaviour of the cell machinery. Since graphic representation of PINs may highlight important substructures, e.g. protein complexes, visualization is increasingly used to study the underlying graph structure of PINs. Although graphs are well known data structures, there are several open problems regarding PIN visualization: the high number of nodes and connections, the heterogeneity of nodes (proteins) and edges (interactions), and the possibility to annotate proteins and interactions with biological information extracted from ontologies (e.g. Gene Ontology), which enriches the PINs with semantic information but complicates their visualization. Methods In recent years many software tools for the visualization of PINs have been developed. Initially intended for visualization only, some of them have subsequently been enriched with new functions for PPI data management and PIN analysis. The paper analyzes the main software tools for PIN visualization considering four main criteria: (i) technology, i.e. availability/license of the software and supported OS (Operating System) platforms; (ii) interoperability, i.e. ability to import/export networks in various formats, ability to export data in a graphic format, and extensibility of the system, e.g. through plug-ins; (iii) visualization, i.e. supported layout and rendering algorithms and availability of parallel implementations; (iv) analysis, i.e. availability of network analysis functions, such as clustering or mining of the graph, and the possibility to interact with external databases. Results Currently, many tools are available and it is not easy for users to choose among them. Some tools offer sophisticated 2D and 3D network visualization with many layout algorithms; other tools are more data-oriented and support integration of interaction data coming from different sources, as well as data annotation. Finally, some specialized tools are dedicated to the analysis of pathways and cellular processes and are oriented toward systems biology studies, where the dynamic aspects of the processes being studied are central. Conclusion A current trend is the deployment of open, extensible visualization tools (e.g. Cytoscape) that may be incrementally enriched by the interactomics community with novel and more powerful functions for PIN analysis through the development of plug-ins. Another emerging trend is the efficient, parallel implementation of the visualization engine, which may provide high interactivity and near real-time response times, as in NAViGaTOR. From a technological point of view, open-source, free and extensible tools like Cytoscape guarantee long-term sustainability owing to the size of their developer and user communities, and provide great flexibility since new functions are continuously added by the developer community through new plug-ins; on the other hand, the emerging parallel, often closed-source tools like NAViGaTOR can offer near real-time response times even in the analysis of very large PINs. PMID:23368786

  16. Simulations of a epidemic model with parameters variation analysis for the dengue fever

    NASA Astrophysics Data System (ADS)

    Jardim, C. L. T. F.; Prates, D. B.; Silva, J. M.; Ferreira, L. A. F.; Kritz, M. V.

    2015-09-01

    Mathematical models for describing and analyzing epidemics are widely found in the literature. Models that use differential equations for such descriptions are especially sensitive to the parameters involved in the modelling. In this work, an established model, called SIR, is analyzed when applied to a dengue fever epidemic scenario. This choice is motivated by the useful extensions of the original model, which allow the inclusion of different aspects of the dengue fever disease, such as its seasonal characteristics, the presence of more than one strain of the vector, and the biological factor of cross-immunity. The analysis and interpretation of results are performed through numerical solutions of the model, with special attention given to the different solutions generated by different values of the model's parameters. Slight variations are applied either dynamically or statically to those parameters, mimicking hypothesized changes in the biological scenario of the simulation and providing a way to evaluate how those changes would affect the outcome of the epidemic in a population.
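    The kind of parameter-variation study described can be sketched with the basic SIR equations and a simple Euler integration. This is an illustrative sketch only: the transmission and recovery rates, initial conditions and step size below are assumed placeholders, not the paper's dengue parameterization.

```python
# Basic SIR model (dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I,
# dR/dt = gamma*I) integrated with Euler steps. Varying beta mimics
# the static parameter-variation analysis described above.
# All parameter values are illustrative placeholders.

def simulate_sir(beta, gamma=0.1, s0=0.99, i0=0.01, days=200, dt=0.1):
    """Return the peak infected fraction for a given transmission rate."""
    s, i, r = s0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        dr = gamma * i
        s, i, r = s + ds * dt, i + di * dt, r + dr * dt
        peak = max(peak, i)
    return peak

# Slight variations in beta change the epidemic outcome markedly:
for beta in (0.15, 0.25, 0.35):
    print(f"beta={beta:.2f}  peak infected fraction={simulate_sir(beta):.3f}")
```

    Dynamic variation, e.g. a seasonal transmission rate, would replace the constant beta with a function of time inside the loop; multi-strain and cross-immunity extensions add further compartments in the same fashion.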

  17. Specialization in the Human Brain: The Case of Numbers

    PubMed Central

    Kadosh, Roi Cohen; Bahrami, Bahador; Walsh, Vincent; Butterworth, Brian; Popescu, Tudor; Price, Cathy J.

    2011-01-01

    How numerical representation is encoded in the adult human brain is important for a basic understanding of human brain organization, its typical and atypical development, its evolutionary precursors, cognitive architectures, education, and rehabilitation. Previous studies have shown that numerical processing activates the same intraparietal regions irrespective of the presentation format (e.g., symbolic digits or non-symbolic dot arrays). This has led to claims that there is a single format-independent numerical representation. In the current study we used a functional magnetic resonance adaptation paradigm and effective connectivity analysis to re-examine whether numerical processing in the intraparietal sulci is dependent on or independent of the format of the stimuli. We obtained two novel results. First, the whole-brain analysis revealed that format change (e.g., from dots to digits), in the absence of a change in magnitude, activated the same intraparietal regions as magnitude change, but to a greater degree. Second, using dynamic causal modeling as a tool to disentangle neuronal specialization across regions that are commonly activated, we found that the connectivity between the left and right intraparietal sulci is format-dependent. Together, this line of results supports the idea that numerical representation is subserved by multiple mechanisms within the same parietal regions. PMID:21808615

  18. Indicators of patients with major depressive disorder in need of highly specialized care: A systematic review.

    PubMed

    van Krugten, Frédérique C W; Kaddouri, Meriam; Goorden, Maartje; van Balkom, Anton J L M; Bockting, Claudi L H; Peeters, Frenk P M L; Hakkaart-van Roijen, Leona

    2017-01-01

    Early identification of patients with major depressive disorder (MDD) that cannot be managed by secondary mental health services and who require highly specialized mental healthcare could enhance need-based patient stratification. This, in turn, may reduce the number of treatment steps needed to achieve and sustain an adequate treatment response. The development of a valid tool to identify patients with MDD in need of highly specialized care is hampered by the lack of a comprehensive understanding of indicators that distinguish patients with and without a need for highly specialized MDD care. The aim of this study, therefore, was to systematically review studies on indicators of patients with MDD likely in need of highly specialized care. A structured literature search was performed on the PubMed and PsycINFO databases following PRISMA guidelines. Two reviewers independently assessed study eligibility and determined the quality of the identified studies. Three reviewers independently executed data extraction by using a pre-piloted, standardized extraction form. The resulting indicators were grouped by topical similarity, creating a concise summary of the findings. The systematic search of all databases yielded a total of 7,360 references, of which sixteen were eligible for inclusion. The sixteen papers yielded a total of 48 unique indicators. Overall, a more pronounced depression severity, a younger age of onset, a history of prior poor treatment response, psychiatric comorbidity, somatic comorbidity, childhood trauma, psychosocial impairment, older age, and a socioeconomically disadvantaged status were found to be associated with proxies of need for highly specialized MDD care. Several indicators are associated with the need for highly specialized MDD care. These indicators provide easily measurable factors that may serve as a starting point for the development of a valid tool to identify patients with MDD in need of highly specialized care.

  19. Development and Validation of a Method for Determining Tridimensional Angular Displacements with Special Applications to Ice Hockey Motions.

    ERIC Educational Resources Information Center

    Gagnon, Micheline; And Others

    1983-01-01

    A method for determining the tridimensional angular displacement of skates during the two-legged stop in ice hockey was developed and validated. The angles were measured by geometry, using a cinecamera and specially equipped skates. The method provides a new tool for kinetic analyses of skating movements. (Authors/PP)

  20. Special Strategies Observation System-Revised: A Useful Tool for Educational Research and Evaluation

    ERIC Educational Resources Information Center

    Meehan, Merrill L.; Cowley, Kimberly S.; Finch, Nicole L.; Chadwick, Kristine L.; Ermolov, Lisa D.; Riffle, M. Joy S.

    2004-01-01

    A review of the critical literature provides a brief history of systematic observation of classroom behaviors, long valued as an important data collection method in educational research. Milestones in systematic observation of classrooms are traced back to 1914 and the development and use of the Special Strategies Observation System (SSOS) through…

  1. Training Pre-Service Special Education Teachers to Facilitate Meaningful Parent Participation in IEPs Using Simulated Meetings

    ERIC Educational Resources Information Center

    Holdren, Natalie Robin O'Connor

    2017-01-01

    The current study sought to establish whether simulated Individualized Education Plan (IEP) meetings using scenarios and actors may serve as an effective tool for assessing and improving pre-service special education teachers' ability to facilitate parent participation in legally correct IEP meetings with the introduction of a training…

  2. Hidden Uses of Presentation Software--The Ideal Tool for Making Customized Materials for Special Needs Students and Clients.

    ERIC Educational Resources Information Center

    Gilden, Deborah

    This paper discusses how presentation software can be used to design custom materials for a variety of people with special needs, including children and adults with low vision, people with developmental disabilities, and stroke patients with cognitive impairments. Benefits of using presentation software include: (1) presentation software gives the…

  3. Observation of Pupils and Teachers in Mainstream and Special Education Settings: Alternative Strategies.

    ERIC Educational Resources Information Center

    Weinberg, Richard A., Ed.; Wood, Frank H., Ed.

    Presented are 12 papers which focus on four systematized methods of classroom observation. Stressed is the importance of formal, systematic observation as a tool for viewing and recording pupil behaviors and insuring that the individual child's needs are met in both the mainstream and special education settings. R. Brandt offers an historical…

  4. Examining the Nature of Critical Incidents during Interactions between Special Education Teachers and Virtual Coaches

    ERIC Educational Resources Information Center

    Snyder, Kathleen

    2013-01-01

    Coaching is a powerful tool for improving special education teachers' use of evidence-based practices. Recent technological advances have great potential to influence the manner in which coaching is implemented. Virtual coaching is an innovative cycle of coaching that utilizes video-conferencing and bug-in-ear technology in internet-mediated…

  5. Solution of task related to control of swiss-type automatic lathe to get planes parallel to part axis

    NASA Astrophysics Data System (ADS)

    Tabekina, N. A.; Chepchurov, M. S.; Evtushenko, E. I.; Dmitrievsky, B. S.

    2018-05-01

    The work solves the problem of automating a machining process, namely turning, to produce parts having planes parallel to the axis of rotation of the part without using special tools. According to the results, equipping the lathe with a high-speed electromechanical drive to control its operative movements makes it possible to obtain planes parallel to the part axis. The method for obtaining planes parallel to the part axis is based on a mathematical model, presented as a functional dependency between the conveying velocity of the driven element and time, which describes the operative movements of the lathe over the whole tool path. Using this model of tool movement, it has been found that the conveying velocity varies from its maximum down to zero, which allows the drive to be reversed. A scheme of tool placement relative to the workpiece has been proposed for unidirectional movement of the driven element at high conveying velocity. The control method for CNC machines can be used for producing geometrically complex parts on the lathe without using special milling tools.
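    The velocity-versus-time dependency described (maximum conveying velocity decaying to zero before drive reversal) can be sketched as a simple piecewise function. The linear ramp shape and all numeric values below are assumptions for illustration, not the authors' actual model.

```python
# Piecewise conveying-velocity profile for the driven element:
# constant at v_max, then a ramp down to zero, at which point the
# drive may reverse. Ramp shape and values are illustrative only.

def conveying_velocity(t, v_max=0.5, t_flat=2.0, t_ramp=1.0):
    """Velocity (m/s) of the driven element at time t (s)."""
    if t < 0:
        raise ValueError("time must be non-negative")
    if t <= t_flat:
        return v_max                                  # full speed
    if t <= t_flat + t_ramp:
        return v_max * (1.0 - (t - t_flat) / t_ramp)  # linear decay
    return 0.0                                        # stopped: reverse here

print(conveying_velocity(1.0))   # on the flat segment
print(conveying_velocity(2.5))   # partway down the ramp
print(conveying_velocity(3.5))   # at rest before reversal
```

    A CNC controller would evaluate such a profile at each servo cycle to command the drive, with the zero-velocity segment marking where the reverse stroke begins.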

  6. MED32/442: Internet Lectures: A five years experience at the University of Vienna

    PubMed Central

    Kritz, H; Najemnik, C; Sinzinger, H

    1999-01-01

    Introduction Internet lectures are a very useful specialized tool for distributing information to interested people. Our experience started in 1994, and we want to give a survey of it. Methods We started with a special lecture on atherosclerosis and have improved the method over the last years. We had many problems with certification and identification procedures, but they are solved now. In recent years we added web-casting tools. Results At this time we have 80-140 students per term participating in our lectures; these are students attending a certificate course. Additionally, we distribute our content over two homepages that we have installed to support this work: http://www.lipidforum.at and http://www.billrothhaus.at (10,000 hits/month). Discussion The Internet will be the most important tool for future teaching. It is an interesting experience to teach students from all over the world, and you learn a lot from these people yourself.

  7. Preventing War: Special Operations Engagement in Support of Security Sector Reform

    DTIC Science & Technology

    2014-12-04

    This study analyzes recent special operations engagement in Mali and the Philippines. Through that analysis, enduring engagement and special operations engagement campaigns…

  8. New research perspectives from a novel approach to quantify tracheid wall thickness.

    PubMed

    Prendin, Angela Luisa; Petit, Giai; Carrer, Marco; Fonti, Patrick; Björklund, Jesper; von Arx, Georg

    2017-07-01

    The analysis of xylem cell anatomical features in dated tree rings provides insights into xylem functional responses and past growth conditions at intra-annual resolution. So far, special focus has been given to the lumen of the water-conducting cells, whereas the equally relevant cell wall thickness (CWT) has been less investigated due to methodological limitations. Here we present a novel approach to measure tracheid CWT in high-resolution images of wood cross-sections that is implemented within the specialized image-analysis tool 'ROXAS'. Compared with the traditional manual line measurements along a selection of few radial files, this novel image-analysis tool can: (i) measure CWT of all tracheids in a tree-ring cross-section, thus increasing the number of individual tracheid measurements by a factor of ~10-20; (ii) measure the tangential and radial walls separately; and (iii) laterally integrate the measurements in a customizable way from only the thinnest central part of the cell walls up to the thickest part of the tracheids at the corners. Cell wall thickness measurements performed with our novel approach and the traditional manual approach showed comparable accuracy for several image resolutions, with an optimal accuracy-efficiency balance at 100× magnification. The configurable settings intended to underscore different cell wall properties indeed changed the absolute levels and intra- and inter-annual patterns of CWT. This versatility, together with the high data production capacity, makes it possible to tailor the measurements of CWT to the specific goal of each study, which opens new research perspectives, e.g., for investigating structure-function relationships, tree stress responses and carbon allocation patterns, and for reconstructing climate based on intra- and inter-annual variability of anatomical wood density. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
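
    The measurement idea described above can be illustrated in one dimension: given a binarized radial profile across a cross-section (1 = cell wall, 0 = lumen), wall thickness is the length of each run of wall pixels times the pixel size. ROXAS works on full 2-D images; this toy sketch with made-up data is only an illustration of the principle, not the tool's algorithm.

```python
# Toy 1-D sketch of cell wall thickness (CWT) measurement.
# The profile and pixel size below are made-up illustration data.

def wall_runs(profile, pixel_um):
    """Lengths (in micrometres) of consecutive wall-pixel runs."""
    runs, n = [], 0
    for v in profile:
        if v:
            n += 1          # extend the current wall run
        elif n:
            runs.append(n * pixel_um)  # lumen reached: close the run
            n = 0
    if n:                    # profile ended inside a wall
        runs.append(n * pixel_um)
    return runs

# Hypothetical binarized radial profile: walls of 3, 2 and 4 pixels
profile = [1, 1, 1, 0, 0, 0, 0, 1, 1, 0, 0, 1, 1, 1, 1]
thicknesses = wall_runs(profile, pixel_um=0.5)
print(thicknesses)  # wall thicknesses in micrometres
```

A 2-D implementation would repeat this along many radial and tangential transects, which is what yields the factor ~10-20 more measurements per ring.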

  9. Teachers with learning disabilities: a view from both sides of the desk.

    PubMed

    Ferri, B A; Keefe, C H; Gregg, N

    2001-01-01

    The purpose of this qualitative multicase study was to explore the perceptions of individuals who could speak from both sides of the special education desk--as students and as teachers. The three participants for this study each received special education services for learning disabilities while in school and were currently teaching students with learning disabilities. Specifically the study focused on how participants' past experiences with receiving special education services influenced their current practice as special education teachers. Participants' views on service delivery models, the importance of teacher expectations, and the value of conceiving a learning disability as a tool rather than a deficit are discussed.

  10. Web-Based Geographic Information System Tool for Accessing Hanford Site Environmental Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Triplett, Mark B.; Seiple, Timothy E.; Watson, David J.

    Data volume, complexity, and access issues pose severe challenges for analysts, regulators and stakeholders attempting to efficiently use legacy data to support decision making at the U.S. Department of Energy’s (DOE) Hanford Site. DOE has partnered with the Pacific Northwest National Laboratory (PNNL) on the PHOENIX (PNNL-Hanford Online Environmental Information System) project, which seeks to address data access, transparency, and integration challenges at Hanford to provide effective decision support. PHOENIX is a family of spatially-enabled web applications providing quick access to decades of valuable scientific data and insight through intuitive query, visualization, and analysis tools. PHOENIX realizes broad, public accessibility by relying only on ubiquitous web-browsers, eliminating the need for specialized software. It accommodates a wide range of users with intuitive user interfaces that require little or no training to quickly obtain and visualize data. Currently, PHOENIX is actively hosting three applications focused on groundwater monitoring, groundwater clean-up performance reporting, and in-tank monitoring. PHOENIX-based applications are being used to streamline investigative and analytical processes at Hanford, saving time and money. But more importantly, by integrating previously isolated datasets and developing relevant visualization and analysis tools, PHOENIX applications are enabling DOE to discover new correlations hidden in legacy data, allowing them to more effectively address complex issues at Hanford.

  11. A generally applicable lightweight method for calculating a value structure for tools and services in bioinformatics infrastructure projects.

    PubMed

    Mayer, Gerhard; Quast, Christian; Felden, Janine; Lange, Matthias; Prinz, Manuel; Pühler, Alfred; Lawerenz, Chris; Scholz, Uwe; Glöckner, Frank Oliver; Müller, Wolfgang; Marcus, Katrin; Eisenacher, Martin

    2017-10-30

    Sustainable noncommercial bioinformatics infrastructures are a prerequisite to use and take advantage of the potential of big data analysis for research and economy. Consequently, funders, universities and institutes as well as users ask for a transparent value model for the tools and services offered. In this article, a generally applicable lightweight method is described by which bioinformatics infrastructure projects can estimate the value of tools and services offered without determining exactly the total costs of ownership. Five representative scenarios for value estimation, from a rough estimation to a detailed breakdown of costs, are presented. To account for the diversity in bioinformatics applications and services, the notion of service-specific 'service provision units' is introduced together with the factors influencing them and the main underlying assumptions for these 'value influencing factors'. Special attention is given to how to handle personnel costs and indirect costs such as electricity. Four examples are presented for the calculation of the value of tools and services provided by the German Network for Bioinformatics Infrastructure (de.NBI): one for tool usage, one for (Web-based) database analyses, one for consulting services and one for bioinformatics training events. Finally, from the discussed values, the costs of direct funding and the costs of payment of services by funded projects are calculated and compared. © The Author 2017. Published by Oxford University Press.
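
    The core arithmetic of a lightweight value estimate can be sketched as follows. This is a minimal illustration of the "service provision unit" idea, assuming only that personnel and indirect costs are pooled and divided by the number of units delivered; the function name and all figures are hypothetical, not values or formulas from the article.

```python
# Hypothetical sketch: value of one "service provision unit"
# (e.g. one analysis run) from annual personnel and indirect costs.
# All names and numbers are illustrative assumptions.

def value_per_unit(personnel_cost, indirect_cost, units_provided):
    """Rough per-unit value: total annual cost spread over units delivered."""
    if units_provided <= 0:
        raise ValueError("units_provided must be positive")
    return (personnel_cost + indirect_cost) / units_provided

# Example: 60,000 EUR/yr personnel plus 5,000 EUR/yr electricity/hosting,
# serving 10,000 analysis runs per year.
cost = value_per_unit(60_000, 5_000, 10_000)
print(cost)  # per-run value in EUR
```

A detailed scenario would break `personnel_cost` and `indirect_cost` into the individual "value influencing factors" instead of two lump sums.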

  12. Training Tools for Nontechnical Skills for Surgeons-A Systematic Review.

    PubMed

    Wood, Thomas Charles; Raison, Nicholas; Haldar, Shreya; Brunckhorst, Oliver; McIlhenny, Craig; Dasgupta, Prokar; Ahmed, Kamran

    Development of nontechnical skills for surgeons has been recognized as an important factor in surgical care. Training tools for this specific domain are being created and validated to maximize the surgeon's nontechnical ability. This systematic review aims to outline, address, and recommend these training tools. A full and comprehensive literature search, using a systematic format, was performed on ScienceDirect and PubMed, with data extraction occurring in line with specified inclusion criteria. Systematic review was performed fully at King's College London. A total of 84 heterogeneous articles were used in this review. Further, 23 training tools including scoring systems, training programs, and mixtures of the two for a range of specialities were identified in the literature. Most can be applied to surgery overall, although some tools target specific specialities (such as neurosurgery). Interrater reliability, construct, content, and face validation statuses were variable according to the specific tool in question. Study results pertaining to nontechnical skill training tools have thus far been universally positive, but further studies are required for those more recently developed and less extensively used tools. Recommendations can be made for individual training tools based on their level of validation and for their target audience. Based on the number of studies performed and their status of validity, NOTSS and Oxford NOTECHS II can be considered the gold standard for individual- and team-based nontechnical skills training, respectively, especially when used in conjunction with a training program. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  13. Quality Management Tools: Facilitating Clinical Research Data Integrity by Utilizing Specialized Reports with Electronic Case Report Forms

    PubMed Central

    Trocky, NM; Fontinha, M

    2005-01-01

    Data collected throughout the course of a clinical research trial must be reviewed for accuracy and completeness continually. The Oracle Clinical® (OC) data management application utilized to capture clinical data facilitates data integrity through pre-programmed validations, edit and range checks, and discrepancy management modules. These functions were not enough. Coupled with the use of specially created reports in Oracle Discoverer® and Integrated Review TM, both ad-hoc query and reporting tools, research staff have enhanced their ability to clean, analyze and report more accurate data captured within and among electronic Case Report Forms (eCRFs), by individual study or across multiple studies. PMID:16779428

  14. Economic analysis of cancer treatment costs: another tool for oncology managers.

    PubMed

    Chirikos, T N; Ruckdeschel, J C; Krischer, J P

    2001-01-01

    Oncology managers increasingly need more information about how much and why treatment costs vary across cancer patients. In response to this need, our Center is building an analytic capacity for investigating economic aspects of cancer treatment. Economic analysis is characterized by a simultaneous consideration of treatment costs and outcomes; it focuses on how treatment cost/outcome ratios vary across patient populations with similar diseases. In this paper, we present an overview of our work, with special emphasis on the measurement of outcomes and the inputs or costs of treatment, the variability of cost/outcome ratios, and the analysis of the factors that predict or explain this observed variation. We illustrate how the analysis is conducted, set out selected results relating to lung and breast cancer patients, and assess some of the advantages and disadvantages of the approach. Among other things, we conclude that economic analysis of cancer treatment costs is feasible and that it can provide useful data for managerial decision making.

  15. Comparing Student Evaluations of Certified and Non-Certified Nurse Educators

    ERIC Educational Resources Information Center

    Grobe, Jennifer L.

    2017-01-01

    Educational experts identified certification as a measure of knowledge and a tool to promote excellence that can be used to enhance education and advance specialized competency. As this tool is promoted in the preparation of nurse educators, resources must focus on how the continuing education efforts impact the classroom environment. Little has…

  16. Mining Hidden Gems Beneath the Surface: A Look At the Invisible Web.

    ERIC Educational Resources Information Center

    Carlson, Randal D.; Repman, Judi

    2002-01-01

    Describes resources for researchers called the Invisible Web that are hidden from the usual search engines and other tools and contrasts them with those resources available on the surface Web. Identifies specialized search tools, databases, and strategies that can be used to locate credible in-depth information. (Author/LRW)

  17. 75 FR 25159 - Defense Federal Acquisition Regulation Supplement; Preservation of Tooling for Major Defense...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-07

    ... Authorization Act for Fiscal Year 2009. Section 815 requires acquisition plans for major weapons systems to... hardware for major defense acquisition programs through the end of the service life of the related weapons... affects all contracts for major weapons that will require special tooling associated with the production...

  18. Computer Mathematical Tools: Practical Experience of Learning to Use Them

    ERIC Educational Resources Information Center

    Semenikhina, Elena; Drushlyak, Marina

    2014-01-01

    The article contains general information about the use of specialized mathematics software in the preparation of math teachers. The authors indicate the reasons to study the mathematics software. In particular, they analyze the possibility of presenting basic mathematical courses using mathematical computer tools from both a teacher and a student,…

  19. The Use of Hand Tools in Agricultural Mechanics.

    ERIC Educational Resources Information Center

    Montana State Univ., Bozeman. Dept. of Agricultural and Industrial Education.

    This document contains a unit for teaching the use of hand tools in agricultural mechanics in Montana. It consists of an outline of the unit and seven lesson plans. The unit outline contains the following components: situation, aims and goals, list of lessons, student activities, teacher activities, special equipment needed, and references. The…

  20. A Web-Based Tool to Support Data-Based Early Intervention Decision Making

    ERIC Educational Resources Information Center

    Buzhardt, Jay; Greenwood, Charles; Walker, Dale; Carta, Judith; Terry, Barbara; Garrett, Matthew

    2010-01-01

    Progress monitoring and data-based intervention decision making have become key components of providing evidence-based early childhood special education services. Unfortunately, there is a lack of tools to support early childhood service providers' decision-making efforts. The authors describe a Web-based system that guides service providers…

  1. Enhancing Classroom Effectiveness through Social Networking Tools

    ERIC Educational Resources Information Center

    Kurthakoti, Raghu; Boostrom, Robert E., Jr.; Summey, John H.; Campbell, David A.

    2013-01-01

    To determine the usefulness of social networking Web sites such as Ning.com as a communication tool in marketing courses, a study was designed with special concern for social network use in comparison to Blackboard. Students from multiple marketing courses were surveyed. Assessments of Ning.com and Blackboard were performed both to understand how…

  2. 76 FR 34744 - Notice of Proposed Information Collection for Public Comment; Continuum of Care Check-up...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-14

    ... Information Collection for Public Comment; Continuum of Care Check-up Assessment Tool AGENCY: U.S. Department...: Continuum of Care Check-up Assessment Tool. Description of the need for the information proposed: The CoC... FURTHER INFORMATION CONTACT: Ann Marie Oliva, Director, Office of Special Needs Assistance Programs...

  3. Overview and insights regarding the JEQ soil and water assessment tool (SWAT) special issue

    USDA-ARS?s Scientific Manuscript database

    The Soil and Water Assessment Tool (SWAT) model has emerged as one of the most widely used water quality watershed- and river basin-scale models worldwide, and has been extensively applied for a broad range of hydrologic and/or environmental problems. Factors driving the international use of SWAT i...

  4. Practical Tools for Designing and Weighting Survey Samples

    ERIC Educational Resources Information Center

    Valliant, Richard; Dever, Jill A.; Kreuter, Frauke

    2013-01-01

    Survey sampling is fundamentally an applied field. The goal in this book is to put an array of tools at the fingertips of practitioners by explaining approaches long used by survey statisticians, illustrating how existing software can be used to solve survey problems, and developing some specialized software where needed. This book serves at least…

  5. Evidence-Based Assessment of Autism Spectrum Disorders in Children and Adolescents

    ERIC Educational Resources Information Center

    Ozonoff, Sally; Goodlin-Jones, Beth L.; Solomon, Marjorie

    2005-01-01

    This article reviews evidence-based criteria that can guide practitioners in the selection, use, and interpretation of assessment tools for autism spectrum disorders (ASD). As Mash and Hunsley (2005) discuss in this special section, evidence-based assessment tools not only demonstrate adequate psychometric qualities, but also have relevance to the…

  6. Perspective: Reaches of chemical physics in biology.

    PubMed

    Gruebele, Martin; Thirumalai, D

    2013-09-28

    Chemical physics as a discipline contributes many experimental tools, algorithms, and fundamental theoretical models that can be applied to biological problems. This is especially true now as the molecular level and the systems level descriptions begin to connect, and multi-scale approaches are being developed to solve cutting edge problems in biology. In some cases, the concepts and tools got their start in non-biological fields, and migrated over, such as the idea of glassy landscapes, fluorescence spectroscopy, or master equation approaches. In other cases, the tools were specifically developed with biological physics applications in mind, such as modeling of single molecule trajectories or super-resolution laser techniques. In this introduction to the special topic section on chemical physics of biological systems, we consider a wide range of contributions, all the way from the molecular level, to molecular assemblies, chemical physics of the cell, and finally systems-level approaches, based on the contributions to this special issue. Chemical physicists can look forward to an exciting future where computational tools, analytical models, and new instrumentation will push the boundaries of biological inquiry.

  7. Perspective: Reaches of chemical physics in biology

    PubMed Central

    Gruebele, Martin; Thirumalai, D.

    2013-01-01

    Chemical physics as a discipline contributes many experimental tools, algorithms, and fundamental theoretical models that can be applied to biological problems. This is especially true now as the molecular level and the systems level descriptions begin to connect, and multi-scale approaches are being developed to solve cutting edge problems in biology. In some cases, the concepts and tools got their start in non-biological fields, and migrated over, such as the idea of glassy landscapes, fluorescence spectroscopy, or master equation approaches. In other cases, the tools were specifically developed with biological physics applications in mind, such as modeling of single molecule trajectories or super-resolution laser techniques. In this introduction to the special topic section on chemical physics of biological systems, we consider a wide range of contributions, all the way from the molecular level, to molecular assemblies, chemical physics of the cell, and finally systems-level approaches, based on the contributions to this special issue. Chemical physicists can look forward to an exciting future where computational tools, analytical models, and new instrumentation will push the boundaries of biological inquiry. PMID:24089712

  8. Cluster Method Analysis of K. S. C. Image

    NASA Technical Reports Server (NTRS)

    Rodriguez, Joe, Jr.; Desai, M.

    1997-01-01

    Information obtained from satellite-based systems has moved to the forefront as a method for identifying many land cover types. Identification of different land features through remote sensing is an effective tool for regional and global assessment of geometric characteristics. Classification data acquired from remote sensing images have a wide variety of applications. In particular, analysis of remote sensing images has special applications in the classification of various types of vegetation. Results obtained from classification studies of a particular area or region serve towards a greater understanding of what parameters (ecological, temporal, etc.) affect the region being analyzed. In this paper, we make a distinction between the two types of classification approaches, although focus is given to the unsupervised classification method, using 1987 Thematic Mapper (TM) images of Kennedy Space Center.
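
    Unsupervised classification of this kind is commonly done with a clustering algorithm such as k-means applied to per-pixel spectral vectors. The pure-Python sketch below is a generic illustration of that technique on made-up two-band "pixels", not the paper's actual procedure; real TM imagery would use the full multispectral bands and a library implementation.

```python
# Minimal k-means sketch for unsupervised pixel clustering.
# Toy two-band reflectance values; all data here are made up.
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # initialize from random points
    for _ in range(iters):
        # assign each point to its nearest center (squared distance)
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: sum((a - b) ** 2
                                      for a, b in zip(p, centers[i])))
            clusters[j].append(p)
        # recompute each center as the mean of its cluster
        for i, c in enumerate(clusters):
            if c:
                centers[i] = tuple(sum(v) / len(c) for v in zip(*c))
    return centers, clusters

# Hypothetical pixels: water-like (low values) vs vegetation-like (high NIR)
pixels = [(0.10, 0.05), (0.12, 0.07), (0.11, 0.06),
          (0.30, 0.80), (0.35, 0.75), (0.32, 0.78)]
centers, clusters = kmeans(pixels, k=2)
```

With well-separated spectral groups like these, the two recovered clusters correspond to the two cover types regardless of initialization.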

  9. Fluorescence fluctuations analysis in nanoapertures: physical concepts and biological applications.

    PubMed

    Lenne, Pierre-François; Rigneault, Hervé; Marguet, Didier; Wenger, Jérôme

    2008-11-01

    During the past years, nanophotonics has provided new approaches to study biological processes below the optical diffraction limit. How single molecules diffuse, bind and assemble can now be studied at the nanometric level, not only in solutions but also in complex and crowded environments such as live cells. In this context fluorescence fluctuation spectroscopy is a unique tool, since it has proven easy to use in combination with nanostructures that confine light in nanometric volumes. We review here recent advances in fluorescence fluctuation analysis below the optical diffraction limit, with a special focus on nanoapertures milled in metallic films. We discuss applications in the field of single-molecule detection, DNA sequencing and membrane organization, and underscore some potential perspectives of this new emerging technology.

  10. A Liquid Array Platform For the Multiplexed Analysis of Synthetic Molecule-Protein Interactions

    PubMed Central

    Doran, Todd M.; Kodadek, Thomas

    2014-01-01

    Synthetic molecule microarrays, consisting of many different compounds spotted onto a planar surface such as modified glass or cellulose, have proven to be useful tools for the multiplexed analysis of small molecule- and peptide-protein interactions. However, these arrays are technically difficult to manufacture and use with high reproducibility and require specialized equipment. Here we report a more convenient alternative comprised of color-encoded beads that display a small molecule protein ligand on the surface. Quantitative, multiplexed assay of protein binding to up to 24 different ligands can be achieved using a common flow cytometer for the readout. This technology should be useful for evaluating hits from library screening efforts, the determination of structure activity relationships and for certain types of serological analyses. PMID:24245981

  11. Holography as a tool for widespread industrial applications: analysis and comments

    NASA Astrophysics Data System (ADS)

    Smigielski, Paul

    1991-10-01

    During the last national meeting of the Holographic Club of the French Optical Society, held at SAUMUR, 22-23 November 1990, on `Vibration analysis with the help of holographic and associated methods,' more than 80% of attendees were industrialists. Some scientists specializing in coherent optics said that it is not necessary to be an optician to use holography in industry. That means that real progress has been achieved since the discovery of holographic interferometry in 1965. On the other hand, too few industrialists use holographic techniques. This paper critically examines the evolution of holographic interferometry through concrete examples and shows that hopes for industrial uses of holography are more credible today than before, because of new developments expected in hardware (lasers, recording materials, etc.) and software.

  12. Advances in Quantum Mechanochemistry: Electronic Structure Methods and Force Analysis.

    PubMed

    Stauch, Tim; Dreuw, Andreas

    2016-11-23

    In quantum mechanochemistry, quantum chemical methods are used to describe molecules under the influence of an external force. The calculation of geometries, energies, transition states, reaction rates, and spectroscopic properties of molecules on the force-modified potential energy surfaces is the key to gain an in-depth understanding of mechanochemical processes at the molecular level. In this review, we present recent advances in the field of quantum mechanochemistry and introduce the quantum chemical methods used to calculate the properties of molecules under an external force. We place special emphasis on quantum chemical force analysis tools, which can be used to identify the mechanochemically relevant degrees of freedom in a deformed molecule, and spotlight selected applications of quantum mechanochemical methods to point out their synergistic relationship with experiments.

  13. Part 3 Specialized aspects of GIS and spatial analysis. Garage band science and dynamic spatial models

    NASA Astrophysics Data System (ADS)

    Box, Paul W.

    GIS and spatial analysis are suited mainly to static pictures of the landscape, but many of the processes that need exploring are dynamic in nature. Dynamic processes can be complex when put in a spatial context; our ability to study such processes will probably come with advances in understanding complex systems in general. Cellular automata and agent-based models are two prime candidates for exploring complex spatial systems, but are difficult to implement. Innovative tools that help build complex simulations will create larger user communities, who will probably find novel solutions for understanding complexity. A significant source for such innovations is likely to be the collective efforts of hobbyists and part-time programmers, who have been dubbed ``garage-band scientists'' in the popular press.

  14. An Export-Marketing Model for Pharmaceutical Firms (The Case of Iran)

    PubMed Central

    Mohammadzadeh, Mehdi; Aryanpour, Narges

    2013-01-01

    Internationalization is a matter of committed decision-making that starts with export marketing, in which an organization tries to diagnose and use opportunities in target markets based on a realistic evaluation of internal strengths and weaknesses, together with analysis of the macro- and microenvironments, in order to gain a presence in other countries. A developed model for the export and international marketing of pharmaceutical companies is introduced. The paper reviews common theories of the internationalization process, then examines different methods and models for assessing readiness for export activities, and tests a conceptual model using a single-case-study method on a basket of seven leading domestic firms, with questionnaires as the main data-gathering tool and interviews for bias reduction. Finally, in keeping with the study objectives, the special aspects of the pharmaceutical marketing environment are covered, revealing special dimensions of pharmaceutical marketing that have been embedded within the appropriate base model. The new model for the international activities of pharmaceutical companies was refined by expert opinions extracted from the questionnaire results. PMID:24250597

  15. Objective Video Quality Assessment Based on Machine Learning for Underwater Scientific Applications

    PubMed Central

    Moreno-Roldán, José-Miguel; Luque-Nieto, Miguel-Ángel; Poncela, Javier; Otero, Pablo

    2017-01-01

    Video services are meant to be a fundamental tool in the development of oceanic research. The current technology for underwater networks (UWNs) imposes strong constraints in the transmission capacity since only a severely limited bitrate is available. However, previous studies have shown that the quality of experience (QoE) is enough for ocean scientists to consider the service useful, although the perceived quality can change significantly for small ranges of variation of video parameters. In this context, objective video quality assessment (VQA) methods become essential in network planning and real time quality adaptation fields. This paper presents two specialized models for objective VQA, designed to match the special requirements of UWNs. The models are built upon machine learning techniques and trained with actual user data gathered from subjective tests. Our performance analysis shows how both of them can successfully estimate quality as a mean opinion score (MOS) value and, for the second model, even compute a distribution function for user scores. PMID:28333123
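
    The first model described above estimates a mean opinion score (MOS) from objective video parameters. As a generic illustration of that mapping (not the paper's machine learning models), a minimal least-squares fit from a single parameter to MOS can be sketched; the bitrates and scores below are made-up assumptions.

```python
# Illustrative sketch only: least-squares line mapping one video
# parameter (bitrate, kb/s) to a mean opinion score (MOS).
# The operating points and scores are hypothetical demonstration data.

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

bitrate = [50, 100, 150, 200]   # hypothetical operating points, kb/s
mos     = [1.5, 2.5, 3.5, 4.5]  # hypothetical subjective scores (1-5)

a, b = fit_line(bitrate, mos)
predicted = a + b * 125         # estimated MOS at 125 kb/s
```

The actual models in the paper are trained on real subjective test data and, in the second model, predict a full distribution of user scores rather than a single value.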

  16. Auditory Neuroimaging with fMRI and PET

    PubMed Central

    Talavage, Thomas M.; Gonzalez-Castillo, Javier; Scott, Sophie K.

    2013-01-01

    For much of the past 30 years, investigations of auditory perception and language have been enhanced or even driven by the use of functional neuroimaging techniques that specialize in localization of central responses. Beginning with investigations using positron emission tomography (PET) and gradually shifting primarily to usage of functional magnetic resonance imaging (fMRI), auditory neuroimaging has greatly advanced our understanding of the organization and response properties of brain regions critical to the perception of and communication with the acoustic world in which we live. As the complexity of the questions being addressed has increased, the techniques, experiments and analyses applied have also become more nuanced and specialized. A brief review of the history of these investigations sets the stage for an overview and analysis of how these neuroimaging modalities are becoming ever more effective tools for understanding the auditory brain. We conclude with a brief discussion of open methodological issues as well as potential clinical applications for auditory neuroimaging. PMID:24076424

  17. Skill networks and measures of complex human capital.

    PubMed

    Anderson, Katharine A

    2017-11-28

    We propose a network-based method for measuring worker skills. We illustrate the method using data from an online freelance website. Using the tools of network analysis, we divide skills into endogenous categories based on their relationship with other skills in the market. Workers who specialize in these different areas earn dramatically different wages. We then show that, in this market, network-based measures of human capital provide additional insight into wages beyond traditional measures. In particular, we show that workers with diverse skills earn higher wages than those with more specialized skills. Moreover, we can distinguish between two different types of workers benefiting from skill diversity: jacks-of-all-trades, whose skills can be applied independently on a wide range of jobs, and synergistic workers, whose skills are useful in combination and fill a hole in the labor market. On average, workers whose skills are synergistic earn more than jacks-of-all-trades. Copyright © 2017 the Author(s). Published by PNAS.
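
    The network idea above can be illustrated with a small co-occurrence sketch: build counts of how often pairs of skills appear together across worker profiles, then score a worker's diversity as the share of their skill pairs that rarely co-occur in the market. The profiles, threshold, and this particular diversity measure are illustrative assumptions, not the paper's exact method or data.

```python
# Hedged sketch: skill co-occurrence network and a toy diversity score.
# Worker profiles below are made up for illustration.
from collections import Counter
from itertools import combinations

workers = {
    "w1": {"python", "statistics", "writing"},  # mixes rarely-combined skills
    "w2": {"python", "statistics"},             # common specialized pairing
    "w3": {"python", "statistics"},
    "w4": {"writing", "editing"},
}

# co-occurrence counts over all skill pairs in all profiles
cooc = Counter()
for skills in workers.values():
    for pair in combinations(sorted(skills), 2):
        cooc[pair] += 1

def diversity(skills, threshold=2):
    """Share of a worker's skill pairs that are rare in the market."""
    pairs = list(combinations(sorted(skills), 2))
    if not pairs:
        return 0.0
    rare = sum(1 for p in pairs if cooc[p] < threshold)
    return rare / len(pairs)

scores = {name: diversity(skills) for name, skills in workers.items()}
```

Under this toy measure, w1 (whose skills span otherwise-disconnected parts of the network) scores higher than the specialized w2, mirroring the paper's jack-of-all-trades versus specialist contrast.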

  18. An export-marketing model for pharmaceutical firms (the case of Iran).

    PubMed

    Mohammadzadeh, Mehdi; Aryanpour, Narges

    2013-01-01

    Internationalization is a matter of committed decision-making that starts with export marketing, in which an organization tries to diagnose and use opportunities in target markets based on a realistic evaluation of internal strengths and weaknesses, together with analysis of the macro- and microenvironments, in order to gain a presence in other countries. A developed model for the export and international marketing of pharmaceutical companies is introduced. The paper reviews common theories of the internationalization process, then examines different methods and models for assessing readiness for export activities, and tests a conceptual model using a single-case-study method on a basket of seven leading domestic firms, with questionnaires as the main data-gathering tool and interviews for bias reduction. Finally, in keeping with the study objectives, the special aspects of the pharmaceutical marketing environment are covered, revealing special dimensions of pharmaceutical marketing that have been embedded within the appropriate base model. The new model for the international activities of pharmaceutical companies was refined by expert opinions extracted from the questionnaire results.

  19. SOURCE EXPLORER: Towards Web Browser Based Tools for Astronomical Source Visualization and Analysis

    NASA Astrophysics Data System (ADS)

    Young, M. D.; Hayashi, S.; Gopu, A.

    2014-05-01

    As a new generation of large-format, high-resolution imagers comes online (ODI, DECam, LSST, etc.), we are faced with the daunting prospect of astronomical images containing upwards of hundreds of thousands of identifiable sources. Visualizing and interacting with such large datasets using traditional astronomical tools is infeasible, and a new approach is required. We present here a method for the display and analysis of arbitrarily large source datasets using dynamically scaling levels of detail, enabling scientists to move rapidly from large-scale spatial overviews down to the level of individual sources and everything in between. Based on the recognized standards of HTML5 and JavaScript, we enable observers and archival users to interact with their images and sources from any modern computer without having to install specialized software. We demonstrate the ability to produce large-scale source lists from the images themselves, as well as to overlay data from publicly available source catalogs (2MASS, GALEX, SDSS, etc.) or user-provided source lists. A high-availability cluster of computational nodes allows us to produce these source maps on demand, customized to user input. User-generated source lists and maps persist across sessions and are available for further plotting, analysis, refinement, and culling.
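
    One simple way to realize dynamically scaling levels of detail is grid-based decimation: at coarse zoom, only the brightest source per cell is sent to the browser. The sketch below is a hypothetical illustration of that idea, not SOURCE EXPLORER's actual algorithm; the function name and toy catalog are invented.

```python
# Hypothetical sketch of level-of-detail decimation for a source catalog:
# bin sources into a grid and keep only the brightest source per cell, so
# coarse zoom levels never send more points than a browser can render.

def decimate(sources, cell_deg):
    """Keep the brightest source (lowest magnitude) per cell_deg-sized bin."""
    best = {}
    for ra, dec, mag in sources:
        key = (int(ra // cell_deg), int(dec // cell_deg))
        if key not in best or mag < best[key][2]:
            best[key] = (ra, dec, mag)
    return sorted(best.values())

catalog = [(10.1, 20.2, 14.3), (10.4, 20.3, 12.1), (55.0, -3.0, 16.7)]
overview = decimate(catalog, 1.0)  # coarse zoom: one source per 1-degree cell
detail = decimate(catalog, 0.1)    # fine zoom: all three sources survive
```

    Serving pre-decimated tiles per zoom level keeps the client payload roughly constant regardless of how many sources the full image contains.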

  20. Open source GIS based tools to improve hydrochemical water resources management in EU H2020 FREEWAT platform

    NASA Astrophysics Data System (ADS)

    Criollo, Rotman; Velasco, Violeta; Vázquez-Suñé, Enric; Nardi, Albert; Marazuela, Miguel A.; Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura; Cannata, Massimiliano; De Filippis, Giovanna

    2017-04-01

    Due to the general increase in water scarcity (Steduto et al., 2012), water quantity and quality must be well characterized to ensure proper access to water resources in compliance with local and regional directives. This can be supported by tools that facilitate data management and analysis, and that give researchers, professionals, policy makers and users the ability to manage water resources according to standard regulatory guidelines. Compliance with those guidelines (with a special focus on requirements deriving from the Groundwater Directive) requires effective monitoring, evaluation, and interpretation of a large number of physical and chemical parameters. These large datasets have to be assessed and interpreted by: (i) integrating data from different sources, gathered with different data-access techniques and formats; (ii) managing data with varying temporal and spatial extent; and (iii) integrating groundwater quality information with other relevant information, such as further hydrogeological data (Velasco et al., 2014), and pre-processing these data, generally for the realization of groundwater models. In this context, the Hydrochemical Analysis Tools (akvaGIS Tools) have been implemented within the H2020 FREEWAT project, which aims to support water resource management through modelling in an open-source GIS platform (QGIS desktop). The main goal of the akvaGIS Tools is to improve water quality analysis by refining the case-study conceptual model, managing all related data in a geospatial database (implemented in SpatiaLite), and providing a set of tools for the harmonization, integration, standardization, visualization and interpretation of hydrochemical data. To achieve this, different commands cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data, and facilitate the pre-processing needed for groundwater modelling. They include ionic balance calculations, chemical time-series analysis, correlation of chemical parameters, and calculation of common hydrochemical diagrams (salinity, Schöeller-Berkaloff, Piper, and Stiff), among others. Furthermore, the tools can generate maps of the spatial distribution of parameters and diagrams, as well as thematic maps of the parameters measured and classified in the queried area. References: Rossetto, R., Borsi, I., Schifani, C., Bonari, E., Mogorovich, P., Primicerio, M. (2013). SID&GRID: Integrating hydrological modeling in GIS environment. Rendiconti Online Societa Geologica Italiana, Vol. 24, 282-283. Steduto, P., Faurès, J.M., Hoogeveen, J., Winpenny, J.T., Burke, J.J. (2012). Coping with water scarcity: an action framework for agriculture and food security. ISSN 1020-1203; 38. Velasco, V., Tubau, I., Vázquez-Suñé, E., Gogu, R., Gaitanaru, D., Alcaraz, M., Sanchez-Vila, X. (2014). GIS-based hydrogeochemical analysis tools (QUIMET). Computers & Geosciences, 70, 164-180.
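
    The ionic balance calculation listed among the commands is standard hydrochemistry and can be sketched as follows. The molar masses are standard values; the sample concentrations, function names, and the ~5% acceptance rule of thumb are illustrative assumptions, not the toolbox's actual implementation.

```python
# Sketch of an ionic (charge) balance check on a water analysis:
# concentrations in mg/L are converted to meq/L and the percent
# charge-balance error is computed.

CHARGE_MASS = {  # ion: (charge, molar mass in g/mol)
    "Ca": (2, 40.08), "Mg": (2, 24.31), "Na": (1, 22.99), "K": (1, 39.10),
    "HCO3": (1, 61.02), "SO4": (2, 96.06), "Cl": (1, 35.45), "NO3": (1, 62.00),
}

def meq(ion, mg_per_l):
    """Convert a concentration in mg/L to milliequivalents per litre."""
    charge, mass = CHARGE_MASS[ion]
    return mg_per_l * charge / mass

def charge_balance_error(cations, anions):
    """Percent error; |CBE| below roughly 5% is commonly taken as acceptable."""
    s_cat = sum(meq(i, c) for i, c in cations.items())
    s_an = sum(meq(i, c) for i, c in anions.items())
    return 100.0 * (s_cat - s_an) / (s_cat + s_an)

sample_cations = {"Ca": 80.0, "Mg": 24.0, "Na": 46.0, "K": 4.0}
sample_anions = {"HCO3": 244.0, "SO4": 96.0, "Cl": 71.0}
print(round(charge_balance_error(sample_cations, sample_anions), 2))
```

    A large imbalance usually flags an analytical error or a missing major ion, which is why such a check belongs at the start of any hydrochemical QC workflow.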

  1. quantGenius: implementation of a decision support system for qPCR-based gene quantification.

    PubMed

    Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina

    2017-05-25

    Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for controlling errors in final results. Because several factors can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial, and together with the development of high-throughput qPCR platforms, there is a need for a tool allowing robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps; the input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is a user-guided, QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, the assays' limits of detection and quantification, and the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable as a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of a proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support; it will help scientists obtain reliable results as the basis for biologically meaningful data interpretation.

  2. Insightful problem solving and creative tool modification by captive nontool-using rooks.

    PubMed

    Bird, Christopher D; Emery, Nathan J

    2009-06-23

    The ability to use tools has been suggested to indicate advanced physical cognition in animals. Here we show that rooks, members of the corvid family that do not appear to use tools in the wild, are capable of insightful problem solving related to sophisticated tool use, including spontaneously modifying and using a variety of tools, shaping hooks out of wire, and using a series of tools in sequence to gain a reward. It is remarkable that a species that does not use tools in the wild appears to possess an understanding of tools rivaling that of habitual tool users such as New Caledonian crows and chimpanzees. Our findings suggest that the ability to represent tools may be a domain-general cognitive capacity rather than an adaptive specialization, and they call into question the relationship between physical intelligence and tool use in the wild.

  3. WHATIF: an open-source desktop application for extraction and management of the incidental findings from next-generation sequencing variant data

    PubMed Central

    Ye, Zhan; Kadolph, Christopher; Strenn, Robert; Wall, Daniel; McPherson, Elizabeth; Lin, Simon

    2015-01-01

    Background: Identification and evaluation of incidental findings in patients following whole exome sequencing (WES) or whole genome sequencing (WGS) is challenging for both practicing physicians and researchers. The American College of Medical Genetics and Genomics (ACMG) recently recommended a list of reportable incidental genetic findings; however, no informatics tools are currently available to support evaluation of incidental findings in next-generation sequencing data. Methods: The Wisconsin Hierarchical Analysis Tool for Incidental Findings (WHATIF) was developed as a stand-alone Windows-based desktop executable to support the interactive analysis of incidental findings in the context of the ACMG recommendations. WHATIF integrates the European Bioinformatics Institute Variant Effect Predictor (VEP) tool for biological interpretation and the National Center for Biotechnology Information ClinVar resource for clinical interpretation. Results: An open-source desktop program was created to annotate incidental findings and present the results in a user-friendly interface. Further, an index (the WHATIF Index) was devised for each gene to facilitate ranking the relative importance of variants and to estimate the potential workload associated with their further evaluation. Our WHATIF application is available at: http://tinyurl.com/WHATIF-SOFTWARE Conclusions: The WHATIF application offers a user-friendly interface and allows users to investigate the extracted variant information efficiently and intuitively, while always accessing up-to-date variant information via application programming interface (API) connections. WHATIF's highly flexible design and straightforward implementation help users customize the source code to meet their own special needs. PMID:25890833

  4. Use of sonic tomography to detect and quantify wood decay in living trees1

    PubMed Central

    Gilbert, Gregory S.; Ballesteros, Javier O.; Barrios-Rodriguez, Cesar A.; Bonadies, Ernesto F.; Cedeño-Sánchez, Marjorie L.; Fossatti-Caballero, Nohely J.; Trejos-Rodríguez, Mariam M.; Pérez-Suñiga, José Moises; Holub-Young, Katharine S.; Henn, Laura A. W.; Thompson, Jennifer B.; García-López, Cesar G.; Romo, Amanda C.; Johnston, Daniel C.; Barrick, Pablo P.; Jordan, Fulvia A.; Hershcovich, Shiran; Russo, Natalie; Sánchez, Juan David; Fábrega, Juan Pablo; Lumpkin, Raleigh; McWilliams, Hunter A.; Chester, Kathleen N.; Burgos, Alana C.; Wong, E. Beatriz; Diab, Jonathan H.; Renteria, Sonia A.; Harrower, Jennifer T.; Hooton, Douglas A.; Glenn, Travis C.; Faircloth, Brant C.; Hubbell, Stephen P.

    2016-01-01

    Premise of the study: Field methodology and image analysis protocols using acoustic tomography were developed and evaluated as a tool to estimate the amount of internal decay and damage of living trees, with special attention to tropical rainforest trees with irregular trunk shapes. Methods and Results: Living trunks of a diversity of tree species in tropical rainforests in the Republic of Panama were scanned using an Argus Electronic PiCUS 3 Sonic Tomograph and evaluated for the amount and patterns of internal decay. A protocol using ImageJ analysis software was used to quantify the proportions of intact and compromised wood. The protocols provide replicable estimates of internal decay and cavities for trees of varying shapes, wood density, and bark thickness. Conclusions: Sonic tomography, coupled with image analysis, provides an efficient, noninvasive approach to evaluate decay patterns and structural integrity of even irregularly shaped living trees. PMID:28101433

  5. Modeling And Simulation Of Bar Code Scanners Using Computer Aided Design Software

    NASA Astrophysics Data System (ADS)

    Hellekson, Ron; Campbell, Scott

    1988-06-01

    Many optical systems face demanding requirements to package the system in a small three-dimensional space. Computer graphics tools can be a tremendous aid to the designer in analyzing the optical problems created by smaller and less costly systems. The Spectra-Physics grocery store bar code scanner employs an especially complex three-dimensional scan pattern to read bar code labels. Using a specially written program that interfaces with a computer-aided design system, we have simulated many of the functions of this complex optical system. In this paper we illustrate how a recent version of the scanner was designed, and we discuss the use of computer graphics in the design process, including interactive tweaking of the scan pattern, analysis of collected light, analysis of scan pattern density, and analysis of the manufacturing tolerances used to build the scanner.

  6. Aspiring to Unintended Consequences of Natural Language Processing: A Review of Recent Developments in Clinical and Consumer-Generated Text Processing.

    PubMed

    Demner-Fushman, D; Elhadad, N

    2016-11-10

    This paper reviews work over the past two years in Natural Language Processing (NLP) applied to clinical and consumer-generated texts. We included any application or methodological publication that leverages text to facilitate healthcare and address the health-related needs of consumers and populations. Many important developments in clinical text processing, both foundational and task-oriented, were addressed in community-wide evaluations and discussed in corresponding special issues that are referenced in this review. These focused issues, and in-depth reviews of several other active research areas such as pharmacovigilance and summarization, allowed us to discuss in greater depth disease modeling and predictive analytics using clinical texts; text analysis in social media for healthcare quality assessment; trends towards online interventions based on rapid analysis of health-related posts; and consumer health question answering, among other issues. Our analysis shows that although clinical NLP continues to advance towards practical applications, and more NLP methods are used in large-scale live health information applications, more needs to be done to make NLP use in clinical applications a routine, widespread reality. Progress in clinical NLP is mirrored by developments in social media text analysis: the research is moving from capturing trends to addressing individual health-related posts, showing potential to become a tool for precision medicine and a valuable addition to the standard healthcare quality evaluation tools.

  7. Based on the Theory of TRIZ Solving the Problem of 18650 Battery Electrolyte Filling

    NASA Astrophysics Data System (ADS)

    Shao-hua, Cui; Jiang-ping, Mei; Ling-hua, Zhang; Xiao, Du

    2017-12-01

    As a standardized cylindrical cell, the 18650 lithium-ion battery is widely used in the new-energy vehicle industry because of its advantages over other formats: it can be mass-produced without changes to the cell type. However, under the pressure of rising capacity requirements, the electrolyte filling (E/L) process has become increasingly difficult. Reducing production efficiency eases the E/L problem but introduces performance and safety issues, so the problem cannot be solved with common industry knowledge alone. Rather than applying lean manufacturing or Six Sigma methods, this paper uses TRIZ theory to analyze the E/L difficulty in detail, using causal analysis, technical contradiction analysis, substance-field analysis, physical contradiction analysis, and other tools. The existing single-cell mechanical E/L tooling is replaced by an atmosphere of alternating vacuum and pressure; hot air is blown in to raise the electrolyte temperature; and the jelly roll (J/R) is immersed in an electrolyte tank pressurized with 0.3 MPa nitrogen. Without reducing production efficiency, and while ensuring performance and safety, this approach solves the E/L difficulty and can be applied in the construction of new production lines in the new factory.

  8. Differentiating Second Language Acquisition from Specific Learning Disability: An Observational Tool Assessing Dual Language Learners' Pragmatic Competence

    ERIC Educational Resources Information Center

    Farnsworth, Megan

    2018-01-01

    Overrepresentation of Dual Language Learners (DLLs) in special education remains a problem even after 40 years of inquiry. One factor is that the U.S. federal government has neither clearly explained the definition of Specific Learning Disability (SLD) nor operationally defined it to identify children for special education services. This lack of…

  9. A Corpus-Based EAP Course for NNS Doctoral Students: Moving from Available Specialized Corpora to Self-Compiled Corpora

    ERIC Educational Resources Information Center

    Lee, David; Swales, John

    2006-01-01

    This paper presents a discussion of an experimental, innovative course in corpus-informed EAP for doctoral students. Participants were given access to specialized corpora of academic writing and speaking, instructed in the tools of the trade (web- and PC-based concordancers) and gradually inducted into the skills needed to best exploit the data…

  10. The Inclusion of Children with Special Educational Needs in an Intensive French as a Second-Language Program: From Theory to Practice

    ERIC Educational Resources Information Center

    Joy, Rhonda; Murphy, Elizabeth

    2012-01-01

    This paper portrays the activity system of eight classes of Grade 6 children with special educational needs in an Intensive French as a second-language education program. Classroom norms and tools reflected a social-interactionist and social-constructivist approach with scaffolding, social interaction, multiple modes of representing, holistic,…

  11. To Be the Best That We Can Be: A Self-Study Guide for Early Childhood Special Education Programs and Staff.

    ERIC Educational Resources Information Center

    Gaetz, Joan; And Others

    This self-study guide facilitates evaluation of early childhood special education programs by providing a tool for identifying both strengths and areas for improvement. Steps are outlined for completing a program self-study. Then forms are offered for assessing the quality of specific program areas. A section on necessary relationships examines…

  12. Amazing "Speaking Dynamically" Tools Utilized in High School Special Education Classroom.

    ERIC Educational Resources Information Center

    Robinson, Deann A.; Attix, Gerald V.

    This paper describes how Alameda High School in California is using the Speaking Dynamically Pro v2.0 software by Mayer-Johnson to help 13 special education students (ages 14-18) communicate and make the world more accessible. The students have a range of disabilities and skill levels. This computer program enables the students to make choices,…

  13. Panning for Gold: Utility of the World Wide Web for Metadata and Authority Control in Special Collections.

    ERIC Educational Resources Information Center

    Ellero, Nadine P.

    2002-01-01

    Describes the use of the World Wide Web as a name authority resource and tool for special collections' analytic-level cataloging, based on experiences at The Claude Moore Health Sciences Library. Highlights include primary documents and metadata; authority control and the Web as authority source information; and future possibilities. (Author/LRW)

  14. Sustaining School-Community Partnerships To Enhance Outcomes for Children and Youth. A Guidebook and Tool Kit.

    ERIC Educational Resources Information Center

    California Univ., Los Angeles. Center for Mental Health in Schools.

    Too many good programs initiated as specially funded projects, pilots, and demonstrations tend to be lost when the period of special funding ends. This guide/toolkit is designed as a resource aid for those in schools and communities who are concerned about sustaining valuable initiatives and innovations. Optimally, sustainability should be a focus…

  15. Simulating Disabilities as a Tool for Altering Individual Perceptions of Working with Children with Special Needs

    ERIC Educational Resources Information Center

    Colwell, Cynthia M.

    2013-01-01

    The purpose of this study was to examine the impact of disability simulations on the attitudes of individuals who will be working with children with special needs in music settings and to compare these attitudes between student music therapists and pre-service music educators. Each participant completed a questionnaire on the first day of class…

  16. Integration of Audio Visual Multimedia for Special Education Pre-Service Teachers' Self Reflections in Developing Teaching Competencies

    ERIC Educational Resources Information Center

    Sediyani, Tri; Yufiarti; Hadi, Eko

    2017-01-01

    This study aims to develop a model of learning by integrating multimedia and audio-visual self-reflective learners. This multimedia was developed as a tool for prospective teachers as learners in the education of children with special needs to reflect on their teaching competencies before entering the world of education. Research methods to…

  17. New User-Friendly Approach to Obtain an Eisenberg Plot and Its Use as a Practical Tool in Protein Sequence Analysis

    PubMed Central

    Keller, Rob C.A.

    2011-01-01

    The Eisenberg plot or hydrophobic moment plot methodology is one of the most frequently used methods of bioinformatics. Bioinformatics is more and more recognized as a helpful tool in Life Sciences in general, and recent developments in approaches recognizing lipid binding regions in proteins are promising in this respect. In this study a bioinformatics approach specialized in identifying lipid binding helical regions in proteins was used to obtain an Eisenberg plot. The validity of the Heliquest generated hydrophobic moment plot was checked and exemplified. This study indicates that the Eisenberg plot methodology can be transferred to another hydrophobicity scale and renders a user-friendly approach which can be utilized in routine checks in protein–lipid interaction and in protein and peptide lipid binding characterization studies. A combined approach seems to be advantageous and results in a powerful tool in the search of helical lipid-binding regions in proteins and peptides. The strength and limitations of the Eisenberg plot approach itself are discussed as well. The presented approach not only leads to a better understanding of the nature of the protein–lipid interactions but also provides a user-friendly tool for the search of lipid-binding regions in proteins and peptides. PMID:22016610

  18. New user-friendly approach to obtain an Eisenberg plot and its use as a practical tool in protein sequence analysis.

    PubMed

    Keller, Rob C A

    2011-01-01

    The Eisenberg plot or hydrophobic moment plot methodology is one of the most frequently used methods of bioinformatics. Bioinformatics is more and more recognized as a helpful tool in Life Sciences in general, and recent developments in approaches recognizing lipid binding regions in proteins are promising in this respect. In this study a bioinformatics approach specialized in identifying lipid binding helical regions in proteins was used to obtain an Eisenberg plot. The validity of the Heliquest generated hydrophobic moment plot was checked and exemplified. This study indicates that the Eisenberg plot methodology can be transferred to another hydrophobicity scale and renders a user-friendly approach which can be utilized in routine checks in protein-lipid interaction and in protein and peptide lipid binding characterization studies. A combined approach seems to be advantageous and results in a powerful tool in the search of helical lipid-binding regions in proteins and peptides. The strength and limitations of the Eisenberg plot approach itself are discussed as well. The presented approach not only leads to a better understanding of the nature of the protein-lipid interactions but also provides a user-friendly tool for the search of lipid-binding regions in proteins and peptides.
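
    The hydrophobic moment underlying the Eisenberg plot is a published vector-sum formula: each residue of an assumed alpha-helix is rotated 100 degrees from the previous one and its hydrophobicity is summed as a vector. The sketch below uses the Eisenberg consensus scale; the example sequences and function name are invented for illustration, and this is not the Heliquest implementation.

```python
import math

# Sketch of the mean hydrophobic moment <uH> for a helical window, using
# the Eisenberg consensus hydrophobicity scale.

EISENBERG = {
    "A": 0.62, "R": -2.53, "N": -0.78, "D": -0.90, "C": 0.29, "Q": -0.85,
    "E": -0.74, "G": 0.48, "H": -0.40, "I": 1.38, "L": 1.06, "K": -1.50,
    "M": 0.64, "F": 1.19, "P": 0.12, "S": -0.18, "T": -0.05, "W": 0.81,
    "Y": 0.26, "V": 1.08,
}

def hydrophobic_moment(seq, delta_deg=100.0):
    """Mean hydrophobic moment per residue for an ideal helix (100 deg/residue)."""
    sin_sum = sum(EISENBERG[a] * math.sin(math.radians(delta_deg * i))
                  for i, a in enumerate(seq))
    cos_sum = sum(EISENBERG[a] * math.cos(math.radians(delta_deg * i))
                  for i, a in enumerate(seq))
    return math.hypot(sin_sum, cos_sum) / len(seq)

amphipathic = hydrophobic_moment("LKLLKKLLKKLLKLL")  # hydrophobic/charged faces
uniform = hydrophobic_moment("L" * 15)               # no amphipathy
```

    Plotting each window's mean hydrophobicity against this moment reproduces the two axes of the Eisenberg plot; amphipathic, potentially lipid-binding helices stand out as high-moment points.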

  19. Occipital cortical thickness in very low birth weight born adolescents predicts altered neural specialization of visual semantic category related neural networks.

    PubMed

    Klaver, Peter; Latal, Beatrice; Martin, Ernst

    2015-01-01

    Very low birth weight (VLBW) prematurely born infants are at high risk of developing visual perceptual and learning deficits, as well as widespread functional and structural brain abnormalities, during infancy and childhood. Whether and how prematurity alters neural specialization within visual neural networks is still unknown. We used functional and structural brain imaging to examine the visual semantic system of VLBW born (<1250 g, gestational age 25-32 weeks) adolescents (13-15 years, n = 11, 3 males) and matched term born control participants (13-15 years, n = 11, 3 males). Neurocognitive assessment revealed no group differences except for lower scores on an adaptive visuomotor integration test. All adolescents were scanned while viewing pictures of animals and tools and scrambled versions of these pictures. Both groups demonstrated animal and tool category related neural networks. Term born adolescents showed tool category related neural activity, i.e. tool pictures elicited more activity than animal pictures, in temporal and parietal brain areas. Animal category related activity was found in the occipital, temporal and frontal cortex. VLBW born adolescents showed reduced tool category related activity in the dorsal visual stream compared with controls, specifically the left anterior intraparietal sulcus, and enhanced animal category related activity in the left middle occipital gyrus and right lingual gyrus. Lower birth weight of VLBW adolescents correlated with larger thickness of the pericalcarine gyrus in the occipital cortex and smaller surface area of the superior temporal gyrus in the lateral temporal cortex. Moreover, larger thickness of the pericalcarine gyrus and smaller surface area of the superior temporal gyrus correlated with reduced tool category related activity in the parietal cortex. Together, our data suggest that very low birth weight predicts alterations of higher order visual semantic networks, particularly in the dorsal stream. The differences in neural specialization may be associated with aberrant cortical development of areas in the visual system that develop early in childhood. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Earth Observation oriented teaching materials development based on OGC Web services and Bashyt generated reports

    NASA Astrophysics Data System (ADS)

    Stefanut, T.; Gorgan, D.; Giuliani, G.; Cau, P.

    2012-04-01

    Creating e-learning materials in the Earth Observation domain is a difficult task, especially for non-technical specialists who have to deal with distributed repositories, large amounts of information and intensive processing requirements. Furthermore, due to the lack of specialized applications for developing teaching resources, technical knowledge is also required for defining data presentation structures and for developing and customizing user interaction techniques for better teaching results. In response to these issues, during the GiSHEO FP7 project [1] and later in the EnviroGRIDS FP7 project [2], we developed the eGLE e-Learning Platform [3], a tool-based application that provides Earth Observation specialists with dedicated functionalities for developing teaching materials. The proposed architecture is built around a client-server design that provides the core functionalities (e.g. user management, tool integration, teaching material settings) and has been extended with a distributed component implemented through the tools that are integrated into the platform, as described further. Our approach to dealing with multiple transfer protocol types, heterogeneous data formats and various user interaction techniques involves the development and integration of very specialized elements (tools) that trainers can customize in a visual manner through simple user interfaces. In our concept, each tool is dedicated to a specific data type, implementing optimized mechanisms for searching, retrieving, visualizing and interacting with it. At the same time, any number of tools can be integrated into each learning resource through drag-and-drop interaction, allowing the teacher to retrieve pieces of data of various types (e.g. images, charts, tables, text, videos) from different sources (e.g. OGC web services, charts created through the Bashyt application) through different protocols (e.g. WMS, the Bashyt API, FTP, HTTP) and to display them all together in a unitary manner using the same visual structure [4]. Addressing the high-performance computing requirements met while processing environmental data, our platform can easily be extended with tools that connect to grid infrastructures, WCS web services, the Bashyt API (for creating specialized hydrological reports) or any other specialized services (e.g. graphics cluster visualization) reachable over the Internet. At run time, each tool is launched asynchronously on the trainee's computer and connects to the data source established by the teacher, retrieving and displaying the information to the user. The data transfer is accomplished directly between the trainee's computer and the corresponding services (e.g. OGC, the Bashyt API) without passing through the core platform server. In this manner, the eGLE application can provide better and more responsive connections to a large number of users.

  1. Universal primers for amplification of the complete mitochondrial control region in marine fish species.

    PubMed

    Cheng, Y Z; Xu, T J; Jin, X X; Tang, D; Wei, T; Sun, Y Y; Meng, F Q; Shi, G; Wang, R X

    2012-01-01

    Through multiple alignment analysis of mitochondrial tRNA-Thr and tRNA-Phe sequences from 161 fishes, new universal primers specifically targeting the entire mitochondrial control region were designed. This new primer set successfully amplified the expected PCR products from a variety of marine fish species belonging to different families, and the amplified segments were confirmed by sequencing to be the control region. These primers provide a useful tool for studying control region diversity in economically important fish species, possible mechanisms of control region evolution, and the functions of the conserved motifs in the control region.
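
    The conserved-region scan that underlies this kind of universal primer design can be sketched as follows. The toy alignment and function name are invented for illustration; real primer design additionally screens candidates for melting temperature, GC content and secondary structure, which this sketch ignores.

```python
# Sketch of a multiple-alignment scan for candidate priming sites: slide a
# window across aligned sequences and report positions where all sequences
# agree and no gaps are present.

def conserved_windows(aligned, width):
    """Yield start positions where all aligned sequences are identical."""
    length = len(aligned[0])
    for start in range(length - width + 1):
        window = aligned[0][start:start + width]
        if "-" in window:
            continue  # skip gapped columns
        if all(seq[start:start + width] == window for seq in aligned):
            yield start

alignment = [
    "ACGTTTACGGA",
    "ACGTATACGGA",
    "ACGTCTACGGA",
]
print(list(conserved_windows(alignment, 4)))  # variable column 4 is excluded
```

    Runs of adjacent hits (here positions 5-7) mark the longer conserved blocks, such as flanking tRNA genes, from which primers would actually be drawn.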

  2. Program Model Checking as a New Trend

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing-like technologies, and static analysis.

  3. Mining microarray data at NCBI's Gene Expression Omnibus (GEO)*.

    PubMed

    Barrett, Tanya; Edgar, Ron

    2006-01-01

    The Gene Expression Omnibus (GEO) at the National Center for Biotechnology Information (NCBI) has emerged as the leading fully public repository for gene expression data. This chapter describes how to use Web-based interfaces, applications, and graphics to effectively explore, visualize, and interpret the hundreds of microarray studies and millions of gene expression patterns stored in GEO. Data can be examined from both experiment-centric and gene-centric perspectives using user-friendly tools that do not require specialized expertise in microarray analysis or time-consuming download of massive data sets. The GEO database is publicly accessible through the World Wide Web at http://www.ncbi.nlm.nih.gov/geo.

  4. [Soldiers and HIV: impact of medical knowledge on the analysis of discrimination in the constitution].

    PubMed

    Pou Giménez, Francisca

    2012-01-01

    In 2007 the Mexican Supreme Court issued several opinions dealing with military personnel dismissed from the Army because of their being HIV-positive. The author describes the main questions under discussion and the core arguments developed by the Court, and stresses three reasons why these cases deserve close attention: positively, because they reinforced the use of the proportionality principle as a tool for identifying discriminatory norms and because they opened the door to the use of specialized scientific knowledge in constitutional adjudication; negatively, because they failed to build on the direct normative efficacy of the right to health.

  5. Assortativity Patterns in Multi-dimensional Inter-organizational Networks: A Case Study of the Humanitarian Relief Sector

    NASA Astrophysics Data System (ADS)

    Zhao, Kang; Ngamassi, Louis-Marie; Yen, John; Maitland, Carleen; Tapia, Andrea

    We use computational tools to study assortativity patterns in multi-dimensional inter-organizational networks on the basis of different node attributes. In a case study of an inter-organizational network in the humanitarian relief sector, we consider not only macro-level topological patterns but also assortativity on the basis of micro-level organizational attributes. Unlike assortative social networks, this inter-organizational network exhibits disassortative or random patterns on three node attributes. We believe that organizations' pursuit of complementarity is one of the main reasons for these special patterns. Our analysis also provides insights into how to promote collaboration among humanitarian relief organizations.
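    Attribute assortativity can be computed with Newman's coefficient r = (tr e − Σᵢ aᵢbᵢ)/(1 − Σᵢ aᵢbᵢ), where e is the normalized mixing matrix over attribute categories. A self-contained sketch (the organization names, sectors, and edges are hypothetical, not the study's data):

```python
# Illustrative sketch (not the authors' code): Newman's attribute
# assortativity coefficient for an undirected network, computed from the
# normalized mixing matrix e.  Organizations and sectors are invented.
from collections import defaultdict

def attribute_assortativity(edges, attr):
    """r = (tr(e) - sum_i a_i*b_i) / (1 - sum_i a_i*b_i)."""
    e = defaultdict(float)
    for u, v in edges:
        e[(attr[u], attr[v])] += 1.0
        e[(attr[v], attr[u])] += 1.0          # symmetrize: undirected graph
    total = sum(e.values())
    cats = sorted(set(attr.values()))
    trace = sum(e[(c, c)] for c in cats) / total
    a = {c: sum(e[(c, d)] for d in cats) / total for c in cats}  # marginals
    ab = sum(a[c] * a[c] for c in cats)
    return (trace - ab) / (1.0 - ab)

sector = {"ReliefCorp": "logistics", "AidNet": "logistics",
          "MedAssist": "medical", "HealthFirst": "medical",
          "InfoHub": "information"}
# Collaborations that mostly cross sector lines -> disassortative mixing
edges = [("ReliefCorp", "MedAssist"), ("ReliefCorp", "InfoHub"),
         ("AidNet", "HealthFirst"), ("AidNet", "InfoHub"),
         ("MedAssist", "InfoHub")]
print(f"sector assortativity: {attribute_assortativity(edges, sector):.3f}")
```

    A negative coefficient, as in this toy network, indicates the disassortative mixing the abstract describes: organizations preferentially partner with organizations of a different type.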

  6. Virtual biomedical universities and e-learning.

    PubMed

    Beux, P Le; Fieschi, M

    2007-01-01

    In this special issue on virtual biomedical universities and e-learning, we survey the principal existing applications of ICT used for teaching in medical schools around the world. We identify five types of research and experiments in the field of medical e-learning and virtual medical universities. The topics of this special issue range from educational computer programs that create and simulate virtual patients, with a wide variety of medical conditions in different clinical settings and over different time frames, to the use of distance learning in developed and developing countries for training clinicians in medical informatics. We also present the need for good indexing and search tools for training resources, together with workflows to manage the multiple-source content of virtual campuses or universities and their digital video resources. Special attention is given to training new generations of clinicians in the ICT tools and methods to be used in clinical settings as well as in medical schools.

  7. Nicotine uses and abuses: from brain probe to public health menace.

    PubMed

    Pomerleau, O F; Pomerleau, C S

    1989-01-01

    There has been a notable lack of dialogue between neuroscientists, who use nicotine in their work as they would any other pharmacological tool, and public policy and health researchers, who view nicotine dependence with increasing dismay and see the continued use of tobacco products as a modern day scourge. This special journal issue attempts to foster communication among nicotine researchers working along the continuum from basic to applied science. An additional objective is to convey a sense for the special problems and opportunities in the study of nicotine and tobacco use that may be of general interest to those concerned with substance abuse. The articles that follow explore two themes, (1) nicotine as a tool to probe neural activity, and (2) tobacco use as a health hazard and societal problem, by examining nicotine from pharmacochemical, biobehavioral, and econo-social perspectives. The rationale for the integration is that there may be benefits from viewing nicotine in a context broader than those dictated by custom and technological specialization.

  8. KSC-2014-1354

    NASA Image and Video Library

    2014-02-10

    CAPE CANAVERAL, Fla. – Special Rescue Operations firefighters with NASA Fire Rescue Services in the Protective Services Office at NASA’s Kennedy Space Center in Florida practice vehicle extrication training at an auto salvage yard near the center. A firefighter uses a spreader to push the dashboard away from the seat. They used the Jaws of Life to remove the door from the vehicle and simulate the rescue of a trapped and injured person. A special hydraulic cutting tool and reciprocating saw were used to cut through and remove the roof. An axe and other special tools were used to punch through and clear away the windshield and windows. Kennedy’s firefighters recently achieved Pro Board Certification in aerial fire truck operations. With the completion of vehicle extrication and Jaws of Life training, the Protective Services Office is one step closer to achieving certification in vehicle machinery extrication. Kennedy’s firefighters are with G4S Government Solutions Inc., on the Kennedy Protective Services Contract. Photo credit: NASA/Daniel Casper

  9. Deriving Earth Science Data Analytics Requirements

    NASA Technical Reports Server (NTRS)

    Kempler, Steven J.

    2015-01-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, even those sometimes applicable to Earth science, have targeted the business industry; in fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of the increasing number and volume of Earth science data sources has become more prevalent, ushered in by the plethora of data generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end-goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide an early analysis of the data analytics tool/technique requirements that would support specific ESDA goal types. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  10. Bayesian network modeling: A case study of an epidemiologic system analysis of cardiovascular risk.

    PubMed

    Fuster-Parra, P; Tauler, P; Bennasar-Veny, M; Ligęza, A; López-González, A A; Aguiló, A

    2016-04-01

    An extensive, in-depth study of cardiovascular risk factors (CVRF) is of crucial importance in cardiovascular disease (CVD) research in order to prevent, or reduce, the chance of developing or dying from CVD. The main focus of the data analysis is on models able to discover and explain the relationships between different CVRF. This paper reports on applying Bayesian network (BN) modeling to discover the relationships among thirteen relevant epidemiological features of the heart age domain in order to analyze cardiovascular lost years (CVLY), cardiovascular risk score (CVRS), and metabolic syndrome (MetS). Furthermore, the induced BN was used to make inferences based on three reasoning patterns: causal reasoning, evidential reasoning, and intercausal reasoning. Application of BN tools led to the discovery of several direct and indirect relationships between different CVRF. The BN analysis showed several interesting results, among them: CVLY was highly influenced by smoking, with men being the group at highest risk in CVLY; MetS was highly influenced by physical activity (PA), again with men at highest risk, while smoking showed no influence on MetS. BNs produce an intuitive, transparent, graphical representation of the relationships between different CVRF. The ability of BNs to predict new scenarios when hypothetical information is introduced makes BN modeling an Artificial Intelligence (AI) tool of special interest in epidemiological studies. As CVD is multifactorial, BNs appear to be an adequate modeling tool.
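    The three reasoning patterns can be illustrated on a toy two-cause Bayesian network, with smoking and physical inactivity as parents of an elevated risk indicator. All probabilities below are invented for illustration and are not the study's estimates:

```python
# Toy Bayesian network: smoke -> risk <- inactive.  Inference is done by
# brute-force enumeration over all worlds.  CPT values are hypothetical.
from itertools import product

P_SMOKE, P_INACT = 0.3, 0.3
P_RISK = {(1, 1): 0.9, (1, 0): 0.6, (0, 1): 0.6, (0, 0): 0.1}  # P(risk|s,i)

def joint(s, i, r):
    ps = P_SMOKE if s else 1 - P_SMOKE
    pi = P_INACT if i else 1 - P_INACT
    pr = P_RISK[(s, i)] if r else 1 - P_RISK[(s, i)]
    return ps * pi * pr

def query(target, evidence):
    """P(target | evidence) by summing the joint over consistent worlds."""
    num = den = 0.0
    for s, i, r in product((0, 1), repeat=3):
        world = {"smoke": s, "inactive": i, "risk": r}
        if any(world[k] != v for k, v in evidence.items()):
            continue
        p = joint(s, i, r)
        den += p
        if all(world[k] == v for k, v in target.items()):
            num += p
    return num / den

print(f"causal      P(risk|smoke)           = {query({'risk': 1}, {'smoke': 1}):.3f}")
print(f"evidential  P(smoke|risk)           = {query({'smoke': 1}, {'risk': 1}):.3f}")
print(f"intercausal P(smoke|risk, inactive) = {query({'smoke': 1}, {'risk': 1, 'inactive': 1}):.3f}")
```

    Causal reasoning runs from cause to effect; evidential reasoning raises the probability of a cause given its observed effect; and the intercausal query shows "explaining away": once inactivity is known, the observed risk lends less support to smoking.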

  11. Development of efficient and cost-effective distributed hydrological modeling tool MWEasyDHM based on open-source MapWindow GIS

    NASA Astrophysics Data System (ADS)

    Lei, Xiaohui; Wang, Yuhui; Liao, Weihong; Jiang, Yunzhong; Tian, Yu; Wang, Hao

    2011-09-01

    Many regions of China are still threatened by frequent floods and water shortages. Consequently, reproducing and predicting the hydrological processes in watersheds is a difficult but unavoidable task for reducing the risk of damage and loss, and an efficient, cost-effective hydrological tool is needed given the many areas that must be modeled. Established tools such as Mike SHE and ArcSWAT (the soil and water assessment tool based on ArcGIS) have shown significant power in improving the precision of hydrological modeling in China by considering spatial variability in both land cover and soil type. However, adopting such commercial tools in a large developing country comes at a high cost. Commercial modeling tools usually involve large numbers of formulas, complicated data formats, and many preprocessing or postprocessing steps that can make simulation difficult for the user, lowering the efficiency of the modeling process. Moreover, commercial hydrological models usually cannot be modified or improved to suit some of the special hydrological conditions in China, while other hydrological models are open source but are integrated into commercial GIS systems. Therefore, by integrating the hydrological simulation code EasyDHM, a hydrological simulation tool named MWEasyDHM was developed based on the open-source MapWindow GIS. Its purpose is to establish the first open-source GIS-based distributed hydrological modeling tool in China by integrating modules for preprocessing, model computation, parameter estimation, result display, and analysis. MWEasyDHM provides users with a friendly MapWindow GIS interface, selectable multifunctional hydrological processing modules, and, more importantly, an efficient and cost-effective hydrological simulation tool.
The general construction of MWEasyDHM consists of four major parts: (1) a general GIS module for hydrological analysis, (2) a preprocessing module for modeling inputs, (3) a model calibration module, and (4) a postprocessing module. The general GIS module for hydrological analysis is developed on the basis of totally open-source GIS software, MapWindow, which contains basic GIS functions. The preprocessing module is made up of three submodules including a DEM-based submodule for hydrological analysis, a submodule for default parameter calculation, and a submodule for the spatial interpolation of meteorological data. The calibration module contains parallel computation, real-time computation, and visualization. The postprocessing module includes model calibration and model results spatial visualization using tabular form and spatial grids. MWEasyDHM makes it possible for efficient modeling and calibration of EasyDHM, and promises further development of cost-effective applications in various watersheds.

  12. ART/Ada and CLIPS/Ada

    NASA Technical Reports Server (NTRS)

    Culbert, Chris

    1990-01-01

    Although they have reached a point of commercial viability, expert systems were originally developed in artificial intelligence (AI) research environments. Many of the available tools still work best in such environments. These environments typically utilize special hardware such as LISP machines and relatively unfamiliar languages such as LISP or Prolog. Space Station applications will require deep integration of expert system technology with applications developed in conventional languages, specifically Ada. The ability to apply automation to Space Station functions could be greatly enhanced by widespread availability of state-of-the-art expert system tools based on Ada. Although there have been some efforts to examine the use of Ada for AI applications, there are few, if any, existing products which provide state-of-the-art AI capabilities in an Ada tool. The goal of the ART/Ada Design Project is to conduct research into the implementation in Ada of state-of-the-art hybrid expert systems building tools (ESBT's). This project takes the following approach: using the existing design of the ART-IM ESBT as a starting point, analyze the impact of the Ada language and Ada development methodologies on that design; redesign the system in Ada; and analyze its performance. The research project will attempt to achieve a comprehensive understanding of the potential for embedding expert systems in Ada systems for eventual application in future Space Station Freedom projects. During Phase 1 of the project, initial requirements analysis, design, and implementation of the kernel subset of ART-IM functionality was completed. During Phase 2, the effort has been focused on the implementation and performance analysis of several versions with increasing functionality. 
Since production-quality ART/Ada tools will not be available for a considerable time, an additional subtask of this project will be the completion of an Ada version of the CLIPS expert system shell developed by NASA. This tool will provide full syntactic compatibility with any eventual products of the ART/Ada design while allowing SSFP developers early access to this technology.

  13. IAC - INTEGRATED ANALYSIS CAPABILITY

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    The objective of the Integrated Analysis Capability (IAC) system is to provide a highly effective, interactive analysis tool for the integrated design of large structures. With the goal of supporting the unique needs of engineering analysis groups concerned with interdisciplinary problems, IAC was developed to interface programs from the fields of structures, thermodynamics, controls, and system dynamics with an executive system and database to yield a highly efficient multi-disciplinary system. Special attention is given to user requirements such as data handling and on-line assistance with operational features, and the ability to add new modules of the user's choice at a future date. IAC contains an executive system, a data base, general utilities, interfaces to various engineering programs, and a framework for building interfaces to other programs. IAC has shown itself to be effective in automatic data transfer among analysis programs. IAC 2.5, designed to be compatible as far as possible with Level 1.5, contains a major upgrade in executive and database management system capabilities, and includes interfaces to enable thermal, structures, optics, and control interaction dynamics analysis. The IAC system architecture is modular in design. 1) The executive module contains an input command processor, an extensive data management system, and driver code to execute the application modules. 2) Technical modules provide standalone computational capability as well as support for various solution paths or coupled analyses. 3) Graphics and model generation interfaces are supplied for building and viewing models. Advanced graphics capabilities are provided within particular analysis modules such as INCA and NASTRAN. 4) Interface modules provide for the required data flow between IAC and other modules. 5) User modules can be arbitrary executable programs or JCL procedures with no pre-defined relationship to IAC. 
6) Special purpose modules are included, such as MIMIC (Model Integration via Mesh Interpolation Coefficients), which transforms field values from one model to another; LINK, which simplifies incorporation of user specific modules into IAC modules; and DATAPAC, the National Bureau of Standards statistical analysis package. The IAC database contains structured files which provide a common basis for communication between modules and the executive system, and can contain unstructured files such as NASTRAN checkpoint files, DISCOS plot files, object code, etc. The user can define groups of data and relations between them. A full data manipulation and query system operates with the database. The current interface modules comprise five groups: 1) Structural analysis - IAC contains a NASTRAN interface for standalone analysis or certain structural/control/thermal combinations. IAC provides enhanced structural capabilities for normal modes and static deformation analysis via special DMAP sequences. IAC 2.5 contains several specialized interfaces from NASTRAN in support of multidisciplinary analysis. 2) Thermal analysis - IAC supports finite element and finite difference techniques for steady state or transient analysis. There are interfaces for the NASTRAN thermal analyzer, SINDA/SINFLO, and TRASYS II. FEMNET, which converts finite element structural analysis models to finite difference thermal analysis models, is also interfaced with the IAC database. 3) System dynamics - The DISCOS simulation program which allows for either nonlinear time domain analysis or linear frequency domain analysis, is fully interfaced to the IAC database management capability. 4) Control analysis - Interfaces for the ORACLS, SAMSAN, NBOD2, and INCA programs allow a wide range of control system analyses and synthesis techniques. 
Level 2.5 includes EIGEN, which provides tools for large order system eigenanalysis, and BOPACE, which allows for geometric capabilities and finite element analysis with nonlinear material. Also included in IAC level 2.5 is SAMSAN 3.1, an engineering analysis program which contains a general purpose library of over 600 subroutines for numerical analysis. 5) Graphics - The graphics package IPLOT is included in IAC. IPLOT generates vector displays of tabular data in the form of curves, charts, correlation tables, etc. Either DI3000 or PLOT-10 graphics software is required for full graphic capability. In addition to these analysis tools, IAC 2.5 contains an IGES interface which allows the user to read arbitrary IGES files into an IAC database and to edit and output new IGES files. IAC is available by license for a period of 10 years to approved U.S. licensees. The licensed program product includes one set of supporting documentation. Additional copies may be purchased separately. IAC is written in FORTRAN 77 and has been implemented on a DEC VAX series computer operating under VMS. IAC can be executed by multiple concurrent users in batch or interactive mode. The program is structured to allow users to easily delete those program capabilities and "how to" examples they do not want in order to reduce the size of the package. The basic central memory requirement for IAC is approximately 750KB. The following programs are also available from COSMIC as separate packages: NASTRAN, SINDA/SINFLO, TRASYS II, DISCOS, ORACLS, SAMSAN, NBOD2, and INCA. The development of level 2.5 of IAC was completed in 1989.

  14. An efficient annotation and gene-expression derivation tool for Illumina Solexa datasets

    PubMed Central

    2010-01-01

    Background: The data produced by an Illumina flow cell with all eight lanes occupied amounts to well over a terabyte of images, with gigabytes of reads following sequence alignment. The ability to translate such reads into meaningful annotation is therefore of great concern and importance; one can very easily be flooded with a great volume of textual, unannotated data, irrespective of read quality or size. CASAVA, an optional analysis tool for Illumina sequencing experiments, enables INDEL detection, SNP information, and allele calling. Extracting from such analysis a measure of gene expression in the form of tag counts, and furthermore annotating the reads, is therefore of significant value. Findings: We developed TASE (Tag counting and Analysis of Solexa Experiments), a rapid tag-counting and annotation software tool specifically designed for Illumina CASAVA sequencing datasets. Developed in Java and deployed using the jTDS JDBC driver and a SQL Server backend, TASE provides an extremely fast means of calculating gene expression through tag counts while annotating sequenced reads with each gene's presumed function, from any given CASAVA build. Such a build is generated for both DNA and RNA sequencing. Analysis is broken into two distinct components: DNA sequence or read concatenation, followed by tag counting and annotation. The end result is output containing the homology-based functional annotation and the respective gene expression measure, signifying how many times sequenced reads were found within the genomic ranges of functional annotations. Conclusions: TASE is a powerful tool that facilitates the process of annotating a given Illumina Solexa sequencing dataset. Our results indicate that both homology-based annotation and tag-count analysis are achieved in very efficient times, allowing researchers to delve deep into a given CASAVA build and maximize the information extracted from a sequencing dataset. 
TASE is specially designed to translate the sequence data in a CASAVA build into functional annotations while producing corresponding gene expression measurements. This analysis is executed in an ultrafast and highly efficient manner, whether the experiment is single-read or paired-end. TASE is a user-friendly and freely available application, allowing rapid analysis and annotation of any given Illumina Solexa sequencing dataset with ease. PMID:20598141
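    The tag-counting step the abstract describes amounts to counting aligned read positions that fall inside each annotated gene range. A toy sketch of that idea (not TASE's actual code; the gene names, coordinates, and reads are invented):

```python
# Sketch of per-gene tag counting: for each annotated range, count the
# aligned reads whose position falls inside it.  All data are hypothetical.
from collections import Counter

# (gene, chromosome, start, end) -- hypothetical annotation ranges
annotations = [
    ("geneA", "chr1", 100, 500),
    ("geneB", "chr1", 800, 1200),
    ("geneC", "chr2", 50, 400),
]
# (chromosome, alignment position) for each sequenced read
reads = [("chr1", 150), ("chr1", 480), ("chr1", 900),
         ("chr2", 60), ("chr2", 399), ("chr2", 700)]

def tag_counts(annotations, reads):
    counts = Counter()
    for gene, chrom, start, end in annotations:
        counts[gene] = sum(1 for c, pos in reads
                           if c == chrom and start <= pos <= end)
    return counts

print(tag_counts(annotations, reads))  # geneA: 2, geneB: 1, geneC: 2
```

    A production tool would use an interval index rather than this quadratic scan, and would attach the homology-based functional annotation to each count, but the expression measure is the same: reads per annotated range.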

  15. Initial implementation of a comparative data analysis ontology.

    PubMed

    Prosdocimi, Francisco; Chisham, Brandon; Pontelli, Enrico; Thompson, Julie D; Stoltzfus, Arlin

    2009-07-03

    Comparative analysis is used throughout biology. When entities under comparison (e.g. proteins, genomes, species) are related by descent, evolutionary theory provides a framework that, in principle, allows N-ary comparisons of entities, while controlling for non-independence due to relatedness. Powerful software tools exist for specialized applications of this approach, yet it remains under-utilized in the absence of a unifying informatics infrastructure. A key step in developing such an infrastructure is the definition of a formal ontology. The analysis of use cases and existing formalisms suggests that a significant component of evolutionary analysis involves a core problem of inferring a character history, relying on key concepts: "Operational Taxonomic Units" (OTUs), representing the entities to be compared; "character-state data" representing the observations compared among OTUs; "phylogenetic tree", representing the historical path of evolution among the entities; and "transitions", the inferred evolutionary changes in states of characters that account for observations. Using the Web Ontology Language (OWL), we have defined these and other fundamental concepts in a Comparative Data Analysis Ontology (CDAO). CDAO has been evaluated for its ability to represent token data sets and to support simple forms of reasoning. With further development, CDAO will provide a basis for tools (for semantic transformation, data retrieval, validation, integration, etc.) that make it easier for software developers and biomedical researchers to apply evolutionary methods of inference to diverse types of data, so as to integrate this powerful framework for reasoning into their research.
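    The "core problem of inferring a character history" from OTUs, character-state data, and a tree can be made concrete with Fitch's small-parsimony algorithm, which counts the minimum number of state transitions needed to explain the observed states. A hedged sketch (the tree and states are invented; CDAO itself is an OWL ontology, not code):

```python
# Fitch small parsimony: minimum transitions for one character on a binary
# tree.  Leaves are OTUs; internal nodes get the intersection of child
# state sets when possible, else the union at the cost of one transition.
def fitch(tree, states):
    """tree: nested 2-tuples of OTU names; states: OTU -> character state.
    Returns (state set at the root, minimum transition count)."""
    if isinstance(tree, str):                     # leaf / OTU
        return {states[tree]}, 0
    lset, lcost = fitch(tree[0], states)
    rset, rcost = fitch(tree[1], states)
    common = lset & rset
    if common:                                    # agreement: no transition
        return common, lcost + rcost
    return lset | rset, lcost + rcost + 1         # conflict: one transition

tree = (("A", "B"), ("C", "D"))
states = {"A": "0", "B": "0", "C": "1", "D": "0"}
root_set, transitions = fitch(tree, states)
print(f"minimum transitions: {transitions}")  # 1
```

    In CDAO terms, the leaves are the OTUs, `states` holds the character-state data, the nested tuples encode the phylogenetic tree, and the inferred state changes are the transitions.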

  16. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  17. Stress Management for Special Educators: The Self-Administered Tool for Awareness and Relaxation (STAR)

    ERIC Educational Resources Information Center

    Williams, Krista; Poel, Elissa Wolfe

    2006-01-01

    The Self-Administered Tool for Awareness and Relaxation (STAR) is a stress management strategy designed to facilitate awareness of the physical, mental, emotional, and physiological effects of stress through the interconnectedness of the brain, body, and emotions. The purpose of this article is to present a stress-management model for teachers,…

  18. Continuous Symmetry and Chemistry Teachers: Learning Advanced Chemistry Content through Novel Visualization Tools

    ERIC Educational Resources Information Center

    Tuvi-Arad, Inbal; Blonder, Ron

    2010-01-01

    In this paper we describe the learning process of a group of experienced chemistry teachers in a specially designed workshop on molecular symmetry and continuous symmetry. The workshop was based on interactive visualization tools that allow molecules and their symmetry elements to be rotated in three dimensions. The topic of continuous symmetry is…

  19. Construction Mechanic, Engine Tune-Up I, 8-7. Military Curriculum Materials for Vocational and Technical Education.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. National Center for Research in Vocational Education.

    This course, adapted from military curriculum materials for use in vocational and technical education, teaches students to perform a complete engine tune-up using appropriate hand tools, special tools, and testing equipment. Students completing the course will be able to diagnose gasoline-engine performance and perform corrective measures to…

  20. Learning Hypotheses and an Associated Tool to Design and to Analyse Teaching-Learning Sequences. Special Issue

    ERIC Educational Resources Information Center

    Buty, Christian; Tiberghien, Andree; Le Marechal, Jean-Francois

    2004-01-01

    This contribution presents a tool elaborated from a theoretical framework linking epistemological, learning and didactical hypotheses. This framework lead to design teaching sequences from a socio-constructivist perspective, and is based on the role of models in physics or chemistry, and on the role of students' initial knowledge in learning…

  1. Multiprocessor programming environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, M.B.; Fornaro, R.

    Programming tools and techniques have been well developed for traditional uniprocessor computer systems. The focus of this research project is on the development of a programming environment for a high speed real time heterogeneous multiprocessor system, with special emphasis on languages and compilers. The new tools and techniques will allow a smooth transition for programmers with experience only on single processor systems.

  2. La terminologie, outil et/ou objet pedagogique (Terminology, Instructional Tool and/or Object).

    ERIC Educational Resources Information Center

    Van Deth, Jean-Pierre

    1990-01-01

    It is proposed that while interest has focused on specialized professional vocabulary as an object of language instruction, the same vocabulary can be viewed as a tool for teaching. The exchange of concepts and terminology between student and teacher improves student understanding of the language and the problems of translation. (MSE)

  3. Special Education Pupils Find Learning Tool in iPad Applications

    ERIC Educational Resources Information Center

    Shah, Nirvi

    2011-01-01

    iPads and other tablet computers are more than a novelty for many students with disabilities, including deaf students in Pennsylvania, youngsters with autism in Southern California, and children with Down syndrome. They are tools that pave a fresh path to learning. Tablet computers are useful for students with disabilities because some of the…

  4. Research flight software engineering and MUST, an integrated system of support tools

    NASA Technical Reports Server (NTRS)

    Straeter, T. A.; Foudriat, E. C.; Will, R. W.

    1977-01-01

    Consideration is given to software development to support NASA flight research. The Multipurpose User-Oriented Software Technology (MUST) program, designed to integrate digital systems into flight research, is discussed. Particular attention is given to the program's special interactive user interface, subroutine library, assemblers, compiler, automatic documentation tools, and test and simulation subsystems.

  5. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.
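
    The report does not spell out Hydra's detection algorithm here. As a general illustration of automatic phase-arrival detection, below is a minimal sketch of the classic short-term-average/long-term-average (STA/LTA) trigger widely used in seismic monitoring; the window lengths, threshold, and synthetic data are illustrative assumptions, not Hydra's actual configuration.

```python
import numpy as np

def running_mean(x, n):
    """Causal running mean: out[i] = mean of x[i-n+1 .. i] (zero before the window fills)."""
    c = np.cumsum(np.insert(x, 0, 0.0))
    out = np.zeros_like(x)
    out[n - 1:] = (c[n:] - c[:-n]) / n
    return out

def sta_lta_pick(trace, fs, sta_win=1.0, lta_win=30.0, threshold=4.0):
    """Return the first sample index where STA/LTA exceeds the threshold, else None."""
    sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
    energy = trace.astype(float) ** 2
    ratio = running_mean(energy, sta_n) / np.maximum(running_mean(energy, lta_n), 1e-12)
    ratio[:lta_n] = 0.0  # ignore the startup region before the long-term window fills
    hits = np.nonzero(ratio > threshold)[0]
    return int(hits[0]) if hits.size else None

# Synthetic 2-minute seismogram at 20 Hz: background noise, with a stronger
# "arrival" beginning at t = 60 s.
rng = np.random.default_rng(0)
fs = 20.0
trace = rng.normal(0.0, 1.0, int(120 * fs))
onset = int(60 * fs)
trace[onset:] += rng.normal(0.0, 6.0, trace.size - onset)
pick = sta_lta_pick(trace, fs)
```

    The short-term energy average reacts quickly to an arrival while the long-term average tracks the background level, so their ratio spikes near the onset.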

  6. Grey Relational Analysis Coupled with Principal Component Analysis for Optimization of Stereolithography Process to Enhance Part Quality

    NASA Astrophysics Data System (ADS)

    Raju, B. S.; Sekhar, U. Chandra; Drakshayani, D. N.

    2017-08-01

    The paper investigates optimization of the stereolithography process for SL5530 epoxy resin material to enhance part quality. The performance characteristics selected to evaluate the process are tensile strength, flexural strength, impact strength and density, and the corresponding process parameters are layer thickness, orientation and hatch spacing. Because the process inherently involves tuning multiple parameters, grey relational analysis, which uses the grey relational grade as a performance index, is adopted to determine the optimal combination of process parameters. In addition, principal component analysis is applied to evaluate the weighting values of the various performance characteristics so that their relative importance can be assessed properly and objectively. The results of confirmation experiments reveal that grey relational analysis coupled with principal component analysis can effectively identify the optimal combination of process parameters. This confirms that the proposed approach can be a useful tool for improving process parameters in stereolithography, which is valuable information for machine designers as well as RP machine users.
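
    As a sketch of the method's mechanics (with made-up response values, not the paper's experimental data), the grey relational grade can be computed from normalized responses and then weighted by squared first-principal-component loadings:

```python
import numpy as np

# Hypothetical experimental matrix: rows = parameter combinations (runs),
# columns = responses (tensile, flexural, impact strength; density),
# all treated as larger-the-better for this sketch.
y = np.array([[42.0, 70.0, 1.9, 1.18],
              [45.0, 74.0, 2.1, 1.20],
              [40.0, 68.0, 1.8, 1.17],
              [47.0, 77.0, 2.3, 1.21]])

# 1. Grey relational generation: normalize each response to [0, 1].
norm = (y - y.min(axis=0)) / (y.max(axis=0) - y.min(axis=0))

# 2. Grey relational coefficients against the ideal sequence (all ones).
delta = np.abs(1.0 - norm)
zeta = 0.5  # distinguishing coefficient, conventional value
grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# 3. PCA on the coefficients; squared loadings of the first principal
#    component serve as objective response weights.
cov = np.cov(grc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigh returns ascending eigenvalues
w = eigvecs[:, -1] ** 2                 # first PC = largest eigenvalue (last column)
w /= w.sum()

# 4. Weighted grey relational grade; the best run maximizes it.
grade = grc @ w
best = int(np.argmax(grade))
```

    In this toy matrix the last run dominates every response, so it attains the maximum grade of 1 regardless of the PCA weighting; with real, conflicting responses the weights decide the trade-off.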

  7. The BioMart community portal: an innovative alternative to large, centralized data repositories

    PubMed Central

    Smedley, Damian; Haider, Syed; Durinck, Steffen; Pandini, Luca; Provero, Paolo; Allen, James; Arnaiz, Olivier; Awedh, Mohammad Hamza; Baldock, Richard; Barbiera, Giulia; Bardou, Philippe; Beck, Tim; Blake, Andrew; Bonierbale, Merideth; Brookes, Anthony J.; Bucci, Gabriele; Buetti, Iwan; Burge, Sarah; Cabau, Cédric; Carlson, Joseph W.; Chelala, Claude; Chrysostomou, Charalambos; Cittaro, Davide; Collin, Olivier; Cordova, Raul; Cutts, Rosalind J.; Dassi, Erik; Genova, Alex Di; Djari, Anis; Esposito, Anthony; Estrella, Heather; Eyras, Eduardo; Fernandez-Banet, Julio; Forbes, Simon; Free, Robert C.; Fujisawa, Takatomo; Gadaleta, Emanuela; Garcia-Manteiga, Jose M.; Goodstein, David; Gray, Kristian; Guerra-Assunção, José Afonso; Haggarty, Bernard; Han, Dong-Jin; Han, Byung Woo; Harris, Todd; Harshbarger, Jayson; Hastings, Robert K.; Hayes, Richard D.; Hoede, Claire; Hu, Shen; Hu, Zhi-Liang; Hutchins, Lucie; Kan, Zhengyan; Kawaji, Hideya; Keliet, Aminah; Kerhornou, Arnaud; Kim, Sunghoon; Kinsella, Rhoda; Klopp, Christophe; Kong, Lei; Lawson, Daniel; Lazarevic, Dejan; Lee, Ji-Hyun; Letellier, Thomas; Li, Chuan-Yun; Lio, Pietro; Liu, Chu-Jun; Luo, Jie; Maass, Alejandro; Mariette, Jerome; Maurel, Thomas; Merella, Stefania; Mohamed, Azza Mostafa; Moreews, Francois; Nabihoudine, Ibounyamine; Ndegwa, Nelson; Noirot, Céline; Perez-Llamas, Cristian; Primig, Michael; Quattrone, Alessandro; Quesneville, Hadi; Rambaldi, Davide; Reecy, James; Riba, Michela; Rosanoff, Steven; Saddiq, Amna Ali; Salas, Elisa; Sallou, Olivier; Shepherd, Rebecca; Simon, Reinhard; Sperling, Linda; Spooner, William; Staines, Daniel M.; Steinbach, Delphine; Stone, Kevin; Stupka, Elia; Teague, Jon W.; Dayem Ullah, Abu Z.; Wang, Jun; Ware, Doreen; Wong-Erasmus, Marie; Youens-Clark, Ken; Zadissa, Amonida; Zhang, Shi-Jian; Kasprzyk, Arek

    2015-01-01

    The BioMart Community Portal (www.biomart.org) is a community-driven effort to provide a unified interface to biomedical databases that are distributed worldwide. The portal provides access to numerous database projects supported by 30 scientific organizations. It includes over 800 different biological datasets spanning genomics, proteomics, model organisms, cancer data, ontology information and more. All resources available through the portal are independently administered and funded by their host organizations. The BioMart data federation technology provides a unified interface to all the available data. The latest version of the portal comes with many new databases that have been created by our ever-growing community. It also comes with better support and extensibility for data analysis and visualization tools. A new addition to our toolbox, the enrichment analysis tool is now accessible through graphical and web service interfaces. The BioMart community portal averages over one million requests per day. Building on this level of service and the wealth of information that has become available, the BioMart Community Portal has introduced a new, more scalable and cheaper alternative to the large data stores maintained by specialized organizations. PMID:25897122

  8. Extraction of Modal Parameters from Spacecraft Flight Data

    NASA Technical Reports Server (NTRS)

    James, George H.; Cao, Timothy T.; Fogt, Vincent A.; Wilson, Robert L.; Bartkowicz, Theodore J.

    2010-01-01

    The modeled response of spacecraft systems must be validated using flight data, as ground tests cannot adequately represent the flight environment. Tools from the field of operational modal analysis would typically be brought to bear on such structures. However, spacecraft systems present several complicating issues: 1. High amplitudes of loads; 2. Compressive loads on the vehicle in flight; 3. Lack of generous time-synchronized flight data; 4. Changing properties during the flight; and 5. Major vehicle changes due to staging. A particularly vexing parameter to extract is modal damping. Damping estimation has become a more critical issue as new mass-driven vehicle designs seek to use the highest damping value possible. The paper focuses on recent efforts to utilize spacecraft flight data to extract system parameters, with special interest in modal damping. This work utilizes the analysis of correlation functions derived from a sliding-window technique applied to the time record. Four different case studies are reported in the sequence that drove the authors' understanding. The insights derived from these four exercises are preliminary conclusions for the general state of the art, but may be of specific utility to similar problems approached with similar tools.
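
    The paper's specific sliding-window procedure is not reproduced here, but the core idea behind correlation-function methods, reading modal damping off the exponential decay of the correlation function's envelope, can be sketched on a synthetic single-mode signal (frequency, damping ratio and record length are illustrative assumptions):

```python
import numpy as np

# Synthetic free-decay-like response: one mode at 2 Hz with 2% damping.
fs, f_n, zeta_true = 200.0, 2.0, 0.02
t = np.arange(0.0, 20.0, 1.0 / fs)
wd = 2 * np.pi * f_n * np.sqrt(1 - zeta_true**2)  # damped natural frequency
x = np.exp(-zeta_true * 2 * np.pi * f_n * t) * np.cos(wd * t)

# Correlation function of the response; for a lightly damped mode its
# envelope decays like exp(-zeta * w_n * tau).
r = np.correlate(x, x, mode="full")[x.size - 1:]  # non-negative lags
r /= r[0]

# Log-decrement style fit: regress log of the early positive peaks of
# r(tau) on tau to recover the decay rate, hence zeta.
peaks = [i for i in range(1, r.size - 1)
         if r[i] > r[i - 1] and r[i] > r[i + 1] and r[i] > 0]
peaks = peaks[:20]                     # early, well-conditioned peaks only
tau = np.array(peaks) / fs
slope = np.polyfit(tau, np.log(r[peaks]), 1)[0]   # slope = -zeta * w_n
zeta_est = -slope / (2 * np.pi * f_n)
```

    With real flight data the correlation functions would come from the measured (noisy, multi-mode) response within each sliding window, which is what makes the extraction far harder than this clean single-mode case.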

  9. Insightful problem solving and creative tool modification by captive nontool-using rooks

    PubMed Central

    Bird, Christopher D.; Emery, Nathan J.

    2009-01-01

    The ability to use tools has been suggested to indicate advanced physical cognition in animals. Here we show that rooks, members of the corvid family that do not appear to use tools in the wild, are capable of insightful problem solving related to sophisticated tool use, including spontaneously modifying and using a variety of tools, shaping hooks out of wire, and using a series of tools in a sequence to gain a reward. It is remarkable that a species that does not use tools in the wild appears to possess an understanding of tools rivaling that of habitual tool users such as New Caledonian crows and chimpanzees. Our findings suggest that the ability to represent tools may be a domain-general cognitive capacity rather than an adaptive specialization, and they call into question the relationship between physical intelligence and wild tool use. PMID:19478068

  10. New Metrics, Measures, and Uses for Fluency Data: An Introduction to a Special Issue on the Assessment of Reading Fluency

    ERIC Educational Resources Information Center

    Biancarosa, Gina; Cummings, Kelli D.

    2015-01-01

    The primary objective of this special issue is to synthesize results from recent reading fluency research endeavors, and to link these findings to practical uses of reading curriculum-based measurement (R-CBM) tools. Taken together, the manuscripts presented in this issue discuss measurement work related to new metrics of indexing student reading…

  11. Photorefractive Polymers for Updateable 3D Displays

    DTIC Science & Technology

    2010-02-24

    Holographic 3D displays provide highly realistic images without the need for special eyewear, making them valuable tools for applications that require... “situational awareness” such as medical, industrial, and military imaging. A considerable amount of research has been dedicated to the development of... imaging techniques that rely on special eyewear such as polarizing goggles have unwanted side-effects such as eye fatigue and motion sickness and

  12. Students' Understanding of the Special Theory of Relativity and Design for a Guided Visit to a Science Museum

    ERIC Educational Resources Information Center

    Guisasola, Jenaro; Solbes, Jordi; Barragues, Jose-Ignacio; Morentin, Maite; Moreno, Antonio

    2009-01-01

    The present paper describes the design of teaching materials that are used as learning tools in school visits to a science museum. An exhibition on "A century of the Special Theory of Relativity", in the Kutxaespacio Science Museum, in San Sebastian, Spain, was used to design a visit for first-year engineering students at the university…

  13. The Feasibility and Acceptability of Using a Portfolio to Assess Professional Competence

    PubMed Central

    Tuekam, Rosine

    2011-01-01

    ABSTRACT Purpose: Little is known about physical therapists' views on the use of portfolios to evaluate professional competence. The purpose of this study was to gather the opinions of physical therapists on the feasibility and acceptability of a portfolio prepared to demonstrate evidence of clinical specialization through reported activities and accomplishments related to professional development, leadership, and research. Methods: Twenty-nine Canadian physical therapists practising in the neurosciences area were given 8 weeks to prepare a professional portfolio. Participants submitted the portfolio along with a survey addressing the preparation of the portfolio and its role as an assessment tool. Qualitative content analysis was used to interpret the participants' comments. Results: Participants reported that maintaining organized records facilitated the preparation of their portfolio. They experienced pride when reviewing their completed portfolios, which summarized their professional activities and highlighted their achievements. Concerns were noted about the veracity of self-reported records and the ability of the documentation to provide a comprehensive view of the full scope of the professional competencies required for clinical specialization (e.g., clinical skills). Conclusion: The study's findings support the feasibility and acceptability of a portfolio review to assess professional competence and clinical specialization in physical therapy and have implications for both physical therapists and professional agencies. PMID:22210983

  14. The feasibility and acceptability of using a portfolio to assess professional competence.

    PubMed

    Miller, Patricia A; Tuekam, Rosine

    2011-01-01

    Little is known about physical therapists' views on the use of portfolios to evaluate professional competence. The purpose of this study was to gather the opinions of physical therapists on the feasibility and acceptability of a portfolio prepared to demonstrate evidence of clinical specialization through reported activities and accomplishments related to professional development, leadership, and research. Twenty-nine Canadian physical therapists practising in the neurosciences area were given 8 weeks to prepare a professional portfolio. Participants submitted the portfolio along with a survey addressing the preparation of the portfolio and its role as an assessment tool. Qualitative content analysis was used to interpret the participants' comments. Participants reported that maintaining organized records facilitated the preparation of their portfolio. They experienced pride when reviewing their completed portfolios, which summarized their professional activities and highlighted their achievements. Concerns were noted about the veracity of self-reported records and the ability of the documentation to provide a comprehensive view of the full scope of the professional competencies required for clinical specialization (e.g., clinical skills). The study's findings support the feasibility and acceptability of a portfolio review to assess professional competence and clinical specialization in physical therapy and have implications for both physical therapists and professional agencies.

  15. Grinding arrangement for ball nose milling cutters

    NASA Technical Reports Server (NTRS)

    Burch, C. F. (Inventor)

    1974-01-01

    A grinding arrangement for spiral fluted ball nose end mills and like tools includes a tool holder for positioning the tool relative to a grinding wheel. The tool is mounted in a spindle within the tool holder for rotation about its centerline and the tool holder is pivotably mounted for angular movement about an axis which intersects that centerline. A follower arm of a cam follower secured to the spindle cooperates with a specially shaped cam to provide rotation of the tool during the angular movement of the tool holder during the grinding cycle, by an amount determined by the cam profile. In this way the surface of the cutting edge in contact with the grinding wheel is maintained at the same height on the grinding wheel throughout the angular movement of the tool holder during the grinding cycle.

  16. Product competitiveness analysis for e-commerce platform of special agricultural products

    NASA Astrophysics Data System (ADS)

    Wan, Fucheng; Ma, Ning; Yang, Dongwei; Xiong, Zhangyuan

    2017-09-01

    On the basis of analyzing the factors that influence product competitiveness on e-commerce platforms for special agricultural products, and the characteristics of methods for analyzing that competitiveness, the price, sales volume, postage-included service, store reputation, and popularity were selected in this paper as the dimensions for analyzing the competitiveness of the agricultural products, with principal component factor analysis as the analysis method. Specifically, a web crawler was used to capture information on various special agricultural products from the e-commerce platform chi.taobao.com. The captured raw data were preprocessed, and a MySQL database was used to build an information library for the special agricultural products. Principal component factor analysis was then applied, using SPSS, to establish the analysis model for the competitiveness of the special agricultural products and to obtain a competitiveness evaluation factor system (support-degree factor, price factor, service factor and evaluation factor). Finally, linear regression was used to establish a competitiveness index equation for estimating the competitiveness of the special agricultural products.
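
    With invented product records standing in for the crawled data (all values and the outcome variable are hypothetical), the factor-analysis-plus-regression pipeline described above can be sketched as:

```python
import numpy as np

# Hypothetical scraped product records: columns = price, monthly sales,
# free-postage flag, store reputation, popularity (illustrative only).
X = np.array([[35.0, 1200, 1, 4.8, 8.1],
              [52.0,  300, 0, 4.6, 5.2],
              [28.0, 2100, 1, 4.9, 9.0],
              [45.0,  650, 1, 4.7, 6.4],
              [60.0,  150, 0, 4.5, 4.0]])

# 1. Standardize, since the dimensions have incommensurate units.
Z = (X - X.mean(axis=0)) / X.std(axis=0)

# 2. Principal component factors: keep enough components to explain ~85%
#    of the variance, then compute factor scores per product.
cov = np.cov(Z, rowvar=False)
vals, vecs = np.linalg.eigh(cov)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]
k = int(np.searchsorted(np.cumsum(vals) / vals.sum(), 0.85) + 1)
F = Z @ vecs[:, :k]

# 3. Linear regression of an observed outcome on the factor scores yields
#    the coefficients of a competitiveness index equation. Sales is used
#    here as a stand-in outcome purely for the sketch.
y = X[:, 1]
A = np.column_stack([np.ones(len(F)), F])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
index = A @ coef   # fitted competitiveness index per product
```

    The regression coefficients on the retained factors are the "index equation"; new products can then be scored by projecting their standardized features onto the same factors.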

  17. Genome-based exploration of the specialized metabolic capacities of the genus Rhodococcus.

    PubMed

    Ceniceros, Ana; Dijkhuizen, Lubbert; Petrusma, Mirjan; Medema, Marnix H

    2017-08-09

    Bacteria of the genus Rhodococcus are well known for their ability to degrade a large range of organic compounds. Some rhodococci are free-living, saprophytic bacteria; others are animal and plant pathogens. Recently, several studies have shown that their genomes encode putative pathways for the synthesis of a large number of specialized metabolites that are likely to be involved in microbe-microbe and host-microbe interactions. To systematically explore the specialized metabolic potential of this genus, we here performed a comprehensive analysis of the biosynthetic coding capacity across publicly available rhodococcal genomes, and compared these with those of several Mycobacterium strains as well as that of their mutual close relative Amycolicicoccus subflavus. Comparative genomic analysis shows that most predicted biosynthetic gene cluster families in these strains are clade-specific and lack any homology with gene clusters encoding the production of known natural products. Interestingly, many of these clusters appear to encode the biosynthesis of lipopeptides, which may play key roles in the diverse environments where rhodococci thrive, by acting as biosurfactants, pathogenicity factors or antimicrobials. We also identified several gene cluster families that are universally shared among all three genera, which therefore may have a more 'primary' role in their physiology. Inactivation of these clusters by mutagenesis might help to generate weaker strains that can be used as live vaccines. The genus Rhodococcus thus provides an interesting target for natural product discovery, in view of its large and mostly uncharacterized biosynthetic repertoire, its relatively fast growth and the availability of effective genetic tools for its genomic modification.

  18. Comparative study of tool machinery sliding systems; comparison between plane and cylindrical basic shapes

    NASA Astrophysics Data System (ADS)

    Glăvan, D. O.; Babanatsas, T.; Babanatis Merce, R. M.; Glăvan, A.

    2018-01-01

    The paper draws attention to the importance of a machine tool's sliding system for the final precision of manufacturing. We compare two types of slides, one constructed with plane surfaces and the other with circular cross-sections (known as cylindrical slides), analysing each solution from the point of view of its manufacturing technology, the precision that the particular slides transfer to the machine tool, production cost, etc. Special attention is given to demonstrating theoretically, and confirming through experimental work, what happens to the stress distribution in the case of plane slides and cylindrical slides, in both longitudinal and cross sections. By composing the results obtained for the stress distribution in the transversal and longitudinal cross sections, we can obtain the stress distribution on the semicircular slide. Based on these results, special solutions have been developed for establishing the stress distribution between two surfaces that do not interact in the contact zone.

  19. Astronauts Newman and Walz evaluate tools for use on HST servicing mission

    NASA Image and Video Library

    1993-09-16

    STS051-06-023 (16 Sept 1993) --- Astronauts James H. Newman (in bay) and Carl E. Walz, mission specialists, practice space walking techniques and evaluate tools to be used on the first Hubble Space Telescope (HST) servicing mission scheduled for later this year. Walz rehearses using the Power Ratchet Tool (PRT), one of several special pieces of gear to be put to duty during the scheduled five periods of extravehicular activity (EVA) on the STS-61 mission.

  20. Database resources of the National Center for Biotechnology Information

    PubMed Central

    Sayers, Eric W.; Barrett, Tanya; Benson, Dennis A.; Bolton, Evan; Bryant, Stephen H.; Canese, Kathi; Chetvernin, Vyacheslav; Church, Deanna M.; DiCuccio, Michael; Federhen, Scott; Feolo, Michael; Fingerman, Ian M.; Geer, Lewis Y.; Helmberg, Wolfgang; Kapustin, Yuri; Krasnov, Sergey; Landsman, David; Lipman, David J.; Lu, Zhiyong; Madden, Thomas L.; Madej, Tom; Maglott, Donna R.; Marchler-Bauer, Aron; Miller, Vadim; Karsch-Mizrachi, Ilene; Ostell, James; Panchenko, Anna; Phan, Lon; Pruitt, Kim D.; Schuler, Gregory D.; Sequeira, Edwin; Sherry, Stephen T.; Shumway, Martin; Sirotkin, Karl; Slotta, Douglas; Souvorov, Alexandre; Starchenko, Grigory; Tatusova, Tatiana A.; Wagner, Lukas; Wang, Yanli; Wilbur, W. John; Yaschenko, Eugene; Ye, Jian

    2012-01-01

    In addition to maintaining the GenBank® nucleic acid sequence database, the National Center for Biotechnology Information (NCBI) provides analysis and retrieval resources for the data in GenBank and other biological data made available through the NCBI Website. NCBI resources include Entrez, the Entrez Programming Utilities, MyNCBI, PubMed, PubMed Central (PMC), Gene, the NCBI Taxonomy Browser, BLAST, BLAST Link (BLink), Primer-BLAST, COBALT, Splign, RefSeq, UniGene, HomoloGene, ProtEST, dbMHC, dbSNP, dbVar, Epigenomics, Genome and related tools, the Map Viewer, Model Maker, Evidence Viewer, Trace Archive, Sequence Read Archive, BioProject, BioSample, Retroviral Genotyping Tools, HIV-1/Human Protein Interaction Database, Gene Expression Omnibus (GEO), Probe, Online Mendelian Inheritance in Animals (OMIA), the Molecular Modeling Database (MMDB), the Conserved Domain Database (CDD), the Conserved Domain Architecture Retrieval Tool (CDART), Biosystems, Protein Clusters and the PubChem suite of small molecule databases. Augmenting many of the Web applications are custom implementations of the BLAST program optimized to search specialized data sets. All of these resources can be accessed through the NCBI home page at www.ncbi.nlm.nih.gov. PMID:22140104

  1. Database resources of the National Center for Biotechnology Information

    PubMed Central

    2013-01-01

    In addition to maintaining the GenBank® nucleic acid sequence database, the National Center for Biotechnology Information (NCBI, http://www.ncbi.nlm.nih.gov) provides analysis and retrieval resources for the data in GenBank and other biological data made available through the NCBI web site. NCBI resources include Entrez, the Entrez Programming Utilities, MyNCBI, PubMed, PubMed Central, Gene, the NCBI Taxonomy Browser, BLAST, BLAST Link (BLink), Primer-BLAST, COBALT, Splign, RefSeq, UniGene, HomoloGene, ProtEST, dbMHC, dbSNP, dbVar, Epigenomics, the Genetic Testing Registry, Genome and related tools, the Map Viewer, Model Maker, Evidence Viewer, Trace Archive, Sequence Read Archive, BioProject, BioSample, Retroviral Genotyping Tools, HIV-1/Human Protein Interaction Database, Gene Expression Omnibus, Probe, Online Mendelian Inheritance in Animals, the Molecular Modeling Database, the Conserved Domain Database, the Conserved Domain Architecture Retrieval Tool, Biosystems, Protein Clusters and the PubChem suite of small molecule databases. Augmenting many of the web applications are custom implementations of the BLAST program optimized to search specialized data sets. All of these resources can be accessed through the NCBI home page. PMID:23193264

  2. PLAYGROUND: preparing students for the cyber battleground

    NASA Astrophysics Data System (ADS)

    Nielson, Seth James

    2016-12-01

    Attempting to educate practitioners of computer security can be difficult if for no other reason than the breadth of knowledge required today. The security profession includes widely diverse subfields including cryptography, network architectures, programming, programming languages, design, coding practices, software testing, pattern recognition, economic analysis, and even human psychology. While an individual may choose to specialize in one of these more narrow elements, there is a pressing need for practitioners that have a solid understanding of the unifying principles of the whole. We created the Playground network simulation tool and used it in the instruction of a network security course to graduate students. This tool was created for three specific purposes. First, it provides simulation sufficiently powerful to permit rigorous study of desired principles while simultaneously reducing or eliminating unnecessary and distracting complexities. Second, it permitted the students to rapidly prototype a suite of security protocols and mechanisms. Finally, with equal rapidity, the students were able to develop attacks against the protocols that they themselves had created. Based on our own observations and student reviews, we believe that these three features combine to create a powerful pedagogical tool that provides students with a significant amount of breadth and intense emotional connection to computer security in a single semester.

  3. Applying differential dynamic logic to reconfigurable biological networks.

    PubMed

    Figueiredo, Daniel; Martins, Manuel A; Chaves, Madalena

    2017-09-01

    Qualitative and quantitative modeling frameworks are widely used for analysis of biological regulatory networks, the former giving a preliminary overview of the system's global dynamics and the latter providing more detailed solutions. Another approach is to model biological regulatory networks as hybrid systems, i.e., systems which can display both continuous and discrete dynamic behaviors. Indeed, the development of synthetic biology has shown that this is a suitable way to think about biological systems, which can often be constructed as networks with discrete controllers and which present hybrid behaviors. In this paper we discuss this approach as a special case of the reconfigurability paradigm, well studied in Computer Science (CS). In CS there are well-developed computational tools to reason about hybrid systems, and we argue that it is worth applying such tools in a biological context. One interesting tool is differential dynamic logic (dL), which has recently been developed by Platzer and applied to many case studies. In this paper we discuss some simple examples of biological regulatory networks to illustrate how dL can be used as an alternative, or also as a complement, to methods already in use.
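
    dL itself is a proof calculus for hybrid programs (implemented in theorem provers such as KeYmaera X) rather than a simulation tool, but the kind of system the paper targets, continuous dynamics reconfigured by discrete jumps, can be illustrated with a minimal, entirely hypothetical gene-expression switch (all rates and thresholds invented):

```python
import numpy as np

def simulate(x0=0.0, t_end=50.0, dt=0.01, k_prod=1.0, k_deg=0.2,
             low=1.0, high=3.0):
    """Hybrid model of one gene product x: continuous first-order decay plus
    a discrete production switch with hysteresis (on below `low`, off above
    `high`). Euler integration for the continuous flow."""
    x, on = x0, True
    xs = []
    for _ in range(int(t_end / dt)):
        # Discrete jumps: the controller reconfigures the continuous mode.
        if on and x >= high:
            on = False
        elif not on and x <= low:
            on = True
        # Continuous flow within the current mode.
        dx = (k_prod if on else 0.0) - k_deg * x
        x += dx * dt
        xs.append(x)
    return np.array(xs)

traj = simulate()  # sustained oscillation between the two thresholds
```

    A dL treatment would not simulate trajectories like this; it would prove invariants such as "x eventually stays within [low, high]" for all runs of the hybrid program.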

  4. Database resources of the National Center for Biotechnology Information.

    PubMed

    Wheeler, David L; Barrett, Tanya; Benson, Dennis A; Bryant, Stephen H; Canese, Kathi; Chetvernin, Vyacheslav; Church, Deanna M; DiCuccio, Michael; Edgar, Ron; Federhen, Scott; Geer, Lewis Y; Kapustin, Yuri; Khovayko, Oleg; Landsman, David; Lipman, David J; Madden, Thomas L; Maglott, Donna R; Ostell, James; Miller, Vadim; Pruitt, Kim D; Schuler, Gregory D; Sequeira, Edwin; Sherry, Steven T; Sirotkin, Karl; Souvorov, Alexandre; Starchenko, Grigory; Tatusov, Roman L; Tatusova, Tatiana A; Wagner, Lukas; Yaschenko, Eugene

    2007-01-01

    In addition to maintaining the GenBank nucleic acid sequence database, the National Center for Biotechnology Information (NCBI) provides analysis and retrieval resources for the data in GenBank and other biological data made available through NCBI's Web site. NCBI resources include Entrez, the Entrez Programming Utilities, My NCBI, PubMed, PubMed Central, Entrez Gene, the NCBI Taxonomy Browser, BLAST, BLAST Link(BLink), Electronic PCR, OrfFinder, Spidey, Splign, RefSeq, UniGene, HomoloGene, ProtEST, dbMHC, dbSNP, Cancer Chromosomes, Entrez Genome, Genome Project and related tools, the Trace and Assembly Archives, the Map Viewer, Model Maker, Evidence Viewer, Clusters of Orthologous Groups (COGs), Viral Genotyping Tools, Influenza Viral Resources, HIV-1/Human Protein Interaction Database, Gene Expression Omnibus (GEO), Entrez Probe, GENSAT, Online Mendelian Inheritance in Man (OMIM), Online Mendelian Inheritance in Animals (OMIA), the Molecular Modeling Database (MMDB), the Conserved Domain Database (CDD), the Conserved Domain Architecture Retrieval Tool (CDART) and the PubChem suite of small molecule databases. Augmenting many of the Web applications are custom implementations of the BLAST program optimized to search specialized data sets. These resources can be accessed through the NCBI home page at www.ncbi.nlm.nih.gov.

  5. Database resources of the National Center for Biotechnology Information.

    PubMed

    Sayers, Eric W; Barrett, Tanya; Benson, Dennis A; Bryant, Stephen H; Canese, Kathi; Chetvernin, Vyacheslav; Church, Deanna M; DiCuccio, Michael; Edgar, Ron; Federhen, Scott; Feolo, Michael; Geer, Lewis Y; Helmberg, Wolfgang; Kapustin, Yuri; Landsman, David; Lipman, David J; Madden, Thomas L; Maglott, Donna R; Miller, Vadim; Mizrachi, Ilene; Ostell, James; Pruitt, Kim D; Schuler, Gregory D; Sequeira, Edwin; Sherry, Stephen T; Shumway, Martin; Sirotkin, Karl; Souvorov, Alexandre; Starchenko, Grigory; Tatusova, Tatiana A; Wagner, Lukas; Yaschenko, Eugene; Ye, Jian

    2009-01-01

    In addition to maintaining the GenBank nucleic acid sequence database, the National Center for Biotechnology Information (NCBI) provides analysis and retrieval resources for the data in GenBank and other biological data made available through the NCBI web site. NCBI resources include Entrez, the Entrez Programming Utilities, MyNCBI, PubMed, PubMed Central, Entrez Gene, the NCBI Taxonomy Browser, BLAST, BLAST Link (BLink), Electronic PCR, OrfFinder, Spidey, Splign, RefSeq, UniGene, HomoloGene, ProtEST, dbMHC, dbSNP, Cancer Chromosomes, Entrez Genomes and related tools, the Map Viewer, Model Maker, Evidence Viewer, Clusters of Orthologous Groups (COGs), Retroviral Genotyping Tools, HIV-1/Human Protein Interaction Database, Gene Expression Omnibus (GEO), Entrez Probe, GENSAT, Online Mendelian Inheritance in Man (OMIM), Online Mendelian Inheritance in Animals (OMIA), the Molecular Modeling Database (MMDB), the Conserved Domain Database (CDD), the Conserved Domain Architecture Retrieval Tool (CDART) and the PubChem suite of small molecule databases. Augmenting many of the web applications are custom implementations of the BLAST program optimized to search specialized data sets. All of the resources can be accessed through the NCBI home page at www.ncbi.nlm.nih.gov.

  6. High-throughput two-dimensional root system phenotyping platform facilitates genetic analysis of root growth and development.

    PubMed

    Clark, Randy T; Famoso, Adam N; Zhao, Keyan; Shaff, Jon E; Craft, Eric J; Bustamante, Carlos D; McCouch, Susan R; Aneshansley, Daniel J; Kochian, Leon V

    2013-02-01

    High-throughput phenotyping of root systems requires a combination of specialized techniques and adaptable plant growth, root imaging and software tools. A custom phenotyping platform was designed to capture images of whole root systems, and novel software tools were developed to process and analyse these images. The platform and its components are adaptable to a wide range of root phenotyping studies using diverse growth systems (hydroponics, paper pouches, gel and soil) involving several plant species, including, but not limited to, rice, maize, sorghum, tomato and Arabidopsis. The RootReader2D software tool is free and publicly available and was designed with both user-guided and automated features that increase flexibility and enhance efficiency when measuring root growth traits from specific roots or entire root systems during large-scale phenotyping studies. To demonstrate the unique capabilities and high-throughput capacity of this phenotyping platform for studying root systems, genome-wide association studies on rice (Oryza sativa) and maize (Zea mays) root growth were performed and root traits related to aluminium (Al) tolerance were analysed on the parents of the maize nested association mapping (NAM) population.

  7. Big data analytics in immunology: a knowledge-based approach.

    PubMed

    Zhang, Guang Lan; Sun, Jing; Chitkushev, Lou; Brusic, Vladimir

    2014-01-01

    With the vast amount of immunological data available, immunology research is entering the big data era. These data vary in granularity, quality, and complexity and are stored in various formats, including publications, technical reports, and databases. The challenge is to make the transition from data to actionable knowledge and wisdom, bridging both the knowledge gap and the application gap. We report a knowledge-based approach based on a framework called KB-builder that facilitates data mining by enabling fast development and deployment of web-accessible immunological data knowledge warehouses. Immunological knowledge discovery relies heavily on both the availability of accurate, up-to-date, and well-organized data and the proper analytics tools. We propose the use of knowledge-based approaches by developing knowledgebases that combine well-annotated data with specialized analytical tools and integrating them into an analytical workflow. A set of well-defined workflow types with rich summarization and visualization capacity facilitates the transformation from data to critical information and knowledge. By using KB-builder, we streamlined the normally time-consuming process of database development. The knowledgebases built using KB-builder will speed up rational vaccine design by providing accurate and well-annotated data coupled with tailored computational analysis tools and workflows.

  8. Evolution of Technique in Endoscopic Transsphenoidal Surgery for Pituitary Adenoma: A Single Institution Experience from 220 Procedures

    PubMed Central

    Hansasuta, Ake; Pokanan, Siriwut; Punyawai, Pritsana

    2018-01-01

    Introduction Endoscopic transsphenoidal surgery (ETSS) for pituitary adenoma (PA) represents a recent shift from the traditional microscopic technique. Although some literature demonstrated the superiority of ETSS over the microscopic method, and some evaluated mono- vs. binostril access within ETSS, none had explored the potential influence of dedicated instruments, as this procedure evolved, on patients' outcomes compared to traditional microscopic tools. Objective To investigate our own clinical and radiographic outcomes of ETSS for PA over its technical evolution, as well as the significance of having versus lacking specialized endoscopic tools. Methods Included patients underwent ETSS for PA performed by the first author (AH). Prospectively recorded patient data concerning pre-, intra- and postoperative clinical and radiographic assessments were subject to analysis. Three groups of evolving ETSS techniques, progressing from mononostril (MN) to binostril ETSS with standard microsurgical instruments (BN1) and, lastly, binostril ETSS with specially designed endoscopic tools (BN2), were examined for their impact on intraoperative and short- and long-term postoperative results. Survival after ETSS for PA, as defined by the need for reintervention in each technical group, was also appraised. Results From January 2006 to 2012, there were 47, 101 and 72 ETSS procedures, from 183 patients, in the MN, BN1 and BN2 cohorts, respectively. Significant preoperative findings were a greater proportion of patients with prior surgery (p=0.01) and tumors with parasellar extension (p=0.02) in the binostril (BN1&2) groups than in the MN group. Substantially shorter operative time and less blood loss were evident as our technique evolved (p<0.001). Despite a higher incidence, and more advanced grades, of cerebrospinal fluid leakage in the binostril groups (p<0.001), the requirement for post-ETSS surgical repair was lower than in the mononostril cohort (p=0.04). At six-month follow-up (n=214), quantitative radiographic outcome analysis was markedly superior in BN2. Consequently, long-term results were better in this latest technical group. Important negative risk factors from multivariate Cox regression analysis were prior surgery, Knosp grade, and firm tumor, while BN1, BN2 and the percentage of anteroposterior PA removal had a positive effect on longer survival. Conclusion The evolution of the ETSS technique for PA from MN to BN2 has shown its efficacy by improving intra- and postoperative outcomes in our study cohorts. Based on our results, a neurosurgeon wishing to begin performing ETSS should not only enroll in formal fellowship training but should also utilize advanced endoscopic tools, which our results showed to be superior in dealing with PA. PMID:29515939

  9. Evolution of Technique in Endoscopic Transsphenoidal Surgery for Pituitary Adenoma: A Single Institution Experience from 220 Procedures.

    PubMed

    Hansasuta, Ake; Pokanan, Siriwut; Punyawai, Pritsana; Mahattanakul, Wattana

    2018-01-01

    Introduction Endoscopic transsphenoidal surgery (ETSS) for pituitary adenoma (PA) represents a recent shift from the traditional microscopic technique. Although some literature demonstrated the superiority of ETSS over the microscopic method, and some evaluated mono- vs. binostril access within ETSS, none had explored the potential influence of dedicated instruments, as this procedure evolved, on patients' outcomes compared to traditional microscopic tools. Objective To investigate our own clinical and radiographic outcomes of ETSS for PA over its technical evolution, as well as the significance of having versus lacking specialized endoscopic tools. Methods Included patients underwent ETSS for PA performed by the first author (AH). Prospectively recorded patient data concerning pre-, intra- and postoperative clinical and radiographic assessments were subject to analysis. Three groups of evolving ETSS techniques, progressing from mononostril (MN) to binostril ETSS with standard microsurgical instruments (BN1) and, lastly, binostril ETSS with specially designed endoscopic tools (BN2), were examined for their impact on intraoperative and short- and long-term postoperative results. Survival after ETSS for PA, as defined by the need for reintervention in each technical group, was also appraised. Results From January 2006 to 2012, there were 47, 101 and 72 ETSS procedures, from 183 patients, in the MN, BN1 and BN2 cohorts, respectively. Significant preoperative findings were a greater proportion of patients with prior surgery (p=0.01) and tumors with parasellar extension (p=0.02) in the binostril (BN1&2) groups than in the MN group. Substantially shorter operative time and less blood loss were evident as our technique evolved (p<0.001). Despite a higher incidence, and more advanced grades, of cerebrospinal fluid leakage in the binostril groups (p<0.001), the requirement for post-ETSS surgical repair was lower than in the mononostril cohort (p=0.04). At six-month follow-up (n=214), quantitative radiographic outcome analysis was markedly superior in BN2. Consequently, long-term results were better in this latest technical group. Important negative risk factors from multivariate Cox regression analysis were prior surgery, Knosp grade, and firm tumor, while BN1, BN2 and the percentage of anteroposterior PA removal had a positive effect on longer survival. Conclusion The evolution of the ETSS technique for PA from MN to BN2 has shown its efficacy by improving intra- and postoperative outcomes in our study cohorts. Based on our results, a neurosurgeon wishing to begin performing ETSS should not only enroll in formal fellowship training but should also utilize advanced endoscopic tools, which our results showed to be superior in dealing with PA.

  10. Indicators of patients with major depressive disorder in need of highly specialized care: A systematic review

    PubMed Central

    Kaddouri, Meriam; Goorden, Maartje; van Balkom, Anton J. L. M.; Bockting, Claudi L. H.; Peeters, Frenk P. M. L.; Hakkaart-van Roijen, Leona

    2017-01-01

    Objectives Early identification of patients with major depressive disorder (MDD) that cannot be managed by secondary mental health services and who require highly specialized mental healthcare could enhance need-based patient stratification. This, in turn, may reduce the number of treatment steps needed to achieve and sustain an adequate treatment response. The development of a valid tool to identify patients with MDD in need of highly specialized care is hampered by the lack of a comprehensive understanding of indicators that distinguish patients with and without a need for highly specialized MDD care. The aim of this study, therefore, was to systematically review studies on indicators of patients with MDD likely in need of highly specialized care. Methods A structured literature search was performed on the PubMed and PsycINFO databases following PRISMA guidelines. Two reviewers independently assessed study eligibility and determined the quality of the identified studies. Three reviewers independently executed data extraction by using a pre-piloted, standardized extraction form. The resulting indicators were grouped by topical similarity, creating a concise summary of the findings. Results The systematic search of all databases yielded a total of 7,360 references, of which sixteen were eligible for inclusion. The sixteen papers yielded a total of 48 unique indicators. Overall, a more pronounced depression severity, a younger age of onset, a history of prior poor treatment response, psychiatric comorbidity, somatic comorbidity, childhood trauma, psychosocial impairment, older age, and a socioeconomically disadvantaged status were found to be associated with proxies of need for highly specialized MDD care. Conclusions Several indicators are associated with the need for highly specialized MDD care. 
These indicators provide easily measurable factors that may serve as a starting point for the development of a valid tool to identify patients with MDD in need of highly specialized care. PMID:28178306

  11. Usefulness of an ad hoc questionnaire (Acro-CQ) for the systematic assessment of acromegaly comorbidities at diagnosis and their management at follow-up.

    PubMed

    Guaraldi, F; Gori, D; Beccuti, G; Prencipe, N; Giordano, R; Mints, Y; Di Giacomo, V S; Berton, A; Lorente, M; Gasco, V; Ghigo, E; Salvatori, R; Grottoli, S

    2016-11-01

    To determine the validity of a self-administered questionnaire (Acro-CQ) developed to systematically assess the presence, type and time of onset of acromegaly comorbidities. This is a cross-sectional study; 105 acromegaly patients and 147 controls with other types of pituitary adenoma, referred to a specialized Italian center, independently completed Acro-CQ in an outpatient clinical setting. To test its reliability in a different setting, Acro-CQ was administered via mail to 78 patients with acromegaly and 100 with other pituitary adenomas, referred to a specialized US center. Data obtained from questionnaires in both settings were compared with medical records (gold standard). Demographics of patients and controls from both countries were similar. In both settings, >95 % of the questionnaires were filled in completely; only one item was missed in the remainder. Concordance with medical records was excellent (k > 0.85) for most items, independent of the mode of administration, patient age, gender and nationality, pituitary adenoma type and disease activity. Acro-CQ is an inexpensive, reliable tool that is well accepted by patients and is recommended to expedite the systematic collection of relevant clinical data in acromegaly at diagnosis, to be replicated at follow-ups. This tool may guide targeted, cost-effective management of complications. Moreover, it could be applied to retrieve data for survey studies in both acromegaly and other pituitary adenomas, as the information is easily and rapidly accessible for statistical analysis.

  12. Bridging history and social psychology: what, how and why.

    PubMed

    Glăveanu, Vlad; Yamamoto, Koji

    2012-12-01

    This special issue aims to bridge history and social psychology by bringing together historians and social psychologists in an exercise of reading and learning from each other's work. This interdisciplinary exercise is not only timely but of great importance for both disciplines. Social psychologists can benefit from engaging with historical sources by being able to contextualise their findings and enrich their theoretical models. Not only do all social and psychological phenomena have a history, but this history is very much part of present-day and future developments. On the other hand, historians can enhance their analysis of historical sources by drawing upon the conceptual tools developed in social psychology. They can "test" these tools and contribute to their validation and enrichment from completely different perspectives. Most importantly, as contributions to this special issue amply demonstrate, psychology's "historical turn" has the potential to shed new light on striking, yet underexplored, similarities between contemporary public spheres and their pre-modern counterparts. This issue thereby calls into question the dichotomy between traditional and de-traditionalized societies, a distinction that lies at the heart of many social psychology accounts of the world we live in. The present editorial introduces this act of bridging history and social psychology by focusing on three main questions: What is the bridge made of? How can the two disciplines be bridged? And why do we cross this interdisciplinary bridge? In the end, a reflection on the future of this collaboration is offered.

  13. Mortality, migration, income, and air pollution: a comparative study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bozzo, S.R.; Novak, K.M.; Galdos, F.

    1978-06-02

    The interrelationships among different demographic factors, specific causes of death, median family income, and estimated air pollution emissions were examined. Using the Medical Data Base (MEDABA) developed at Brookhaven National Laboratory, the entire population of the United States was cross-tabulated by income and emission levels of air pollutants. Path analysis was used to examine a number of patterns and relationships for each age, race, and sex group containing a minimum of 10,000 persons. Competitive and complementary effects were observed. These effects were frequently age dependent and occasionally sex related. This specialized data base, the application of path analysis, and the development of a dynamic population and mortality model, in combination, proved to be a useful tool for investigating the effects of energy-related pollutants on the exposed population.
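    The path analysis described above decomposes observed correlations into direct effects. As a minimal sketch, with hypothetical data standing in for the MEDABA records, here is how standardized path coefficients can be computed for a two-predictor model (e.g., emissions and income as joint causes of a mortality index), using only the standard library:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

def path_coefficients(x1, x2, y):
    """Standardized direct-effect (path) coefficients of y on two correlated causes."""
    r_y1, r_y2, r_12 = pearson(y, x1), pearson(y, x2), pearson(x1, x2)
    denom = 1.0 - r_12 ** 2
    p1 = (r_y1 - r_12 * r_y2) / denom
    p2 = (r_y2 - r_12 * r_y1) / denom
    return p1, p2

# Hypothetical data: emissions index, income index, and a mortality index
emissions = [3.1, 4.0, 5.2, 6.1, 7.3]
income = [5.0, 4.2, 3.9, 3.1, 2.5]
mortality = [2.0, 2.6, 3.1, 3.8, 4.4]
p_emis, p_inc = path_coefficients(emissions, income, mortality)
```

    Each coefficient is the direct effect of one cause with the other held fixed; the indirect part of a correlation is recovered via the product of paths.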

  14. Assessing the Kansas water-level monitoring program: An example of the application of classical statistics to a geological problem

    USGS Publications Warehouse

    Davis, J.C.

    2000-01-01

    Geologists may feel that geological data are not amenable to statistical analysis, or at best require specialized approaches such as nonparametric statistics and geostatistics. However, there are many circumstances, particularly in systematic studies conducted for environmental or regulatory purposes, where traditional parametric statistical procedures can be beneficial. An example is the application of analysis of variance to data collected in an annual program of measuring groundwater levels in Kansas. Influences such as well conditions, operator effects, and use of the water can be assessed and wells that yield less reliable measurements can be identified. Such statistical studies have resulted in yearly improvements in the quality and reliability of the collected hydrologic data. Similar benefits may be achieved in other geological studies by the appropriate use of classical statistical tools.
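    The analysis of variance applied above reduces to a small computation. A minimal sketch, using hypothetical water-level measurements grouped by operator, of the one-way ANOVA F statistic:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic for a list of measurement groups."""
    all_vals = [v for g in groups for v in g]
    n, k = len(all_vals), len(groups)
    grand_mean = sum(all_vals) / n
    # Between-group sum of squares: how far group means sit from the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: scatter of measurements around their group mean
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical water-level measurements (m below surface) grouped by operator
f_stat = one_way_anova_f([
    [12.1, 12.3, 12.2, 12.4],   # operator A
    [12.0, 12.2, 12.1, 12.3],   # operator B
    [13.0, 13.2, 13.1, 12.9],   # operator C
])
```

    A large F relative to the F distribution with (k-1, n-k) degrees of freedom flags an operator effect of the kind the study describes.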

  15. Analyzing free fall with a smartphone acceleration sensor

    NASA Astrophysics Data System (ADS)

    Vogt, Patrik; Kuhn, Jochen

    2012-03-01

    This paper provides a first example of experiments in this column using smartphones as experimental tools. More examples concerning this special tool will follow in the next issues. The difference between a smartphone and a ``regular'' cell phone is that smartphones offer more advanced computing ability and connectivity. Smartphones combine the functions of personal digital assistants (PDAs) and cell phones.
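    The analysis behind such a free-fall experiment is simple kinematics. A minimal sketch, with illustrative numbers, of estimating g from drop height and fall time (h = 1/2 g t^2) and of extracting the free-fall duration from acceleration-sensor samples (the 1 m/s^2 near-zero threshold is an assumption):

```python
def g_from_fall(height_m, fall_time_s):
    """Estimate gravitational acceleration from h = 1/2 * g * t**2."""
    return 2.0 * height_m / fall_time_s ** 2

def fall_duration(accel_samples, dt, threshold=1.0):
    """Total time (s) the sensor reads near zero, i.e. the phone is in free fall.

    accel_samples: acceleration magnitudes in m/s^2; dt: sampling interval in s.
    The threshold below which a reading counts as free fall is an assumption.
    """
    return dt * sum(1 for a in accel_samples if abs(a) < threshold)

# Illustrative numbers: a phone dropped 1.226 m hits the floor after 0.5 s
g_est = g_from_fall(1.226, 0.5)   # about 9.81 m/s^2
```

    During free fall the accelerometer reads near zero because sensor and phone fall together; the reading jumps back toward g at release and impact, which is what makes the fall interval easy to pick out.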

  16. Specially Made for Science: Researchers Develop Online Tools For Collaborations

    ERIC Educational Resources Information Center

    Guterman, Lila

    2008-01-01

    Blogs, wikis, and social-networking sites such as Facebook may get media buzz these days, but for scientists, engineers, and doctors, they are not even on the radar. The most effective tools of the Internet for such people tend to be efforts more narrowly aimed at their needs, such as software that helps geneticists replicate one another's…

  17. Patient classification tool in home health care.

    PubMed

    Pavasaris, B

    1989-01-01

    Medicare's system of diagnosis related groups for health care cost reimbursements is inadequate for the special requirements of home health care. A visiting nurses association's patient classification tool correlates a meticulous record of professional time spent per patient with patient diagnosis and level of care, aimed at helping policymakers develop a more equitable DRG-based prospective payment formula for home care costs.

  18. An Android Research and Development Program.

    DTIC Science & Technology

    1983-03-01

    reprogrammable multifunctional manipulator designed to move material, parts, tools, or special devices through variable programmed motions for the performance...thesis: 1. An ’industrial robot’ is a [mechanized,] reprogrammable multifunctional manipulator designed to move material, parts, tools, or...insertion is also well defined in space. These manipulators are currently in use in the automobile industry, and two were demonstrated by Kohol

  19. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    PubMed Central

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948
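    PyURDME itself solves spatial reaction-diffusion models; as a minimal non-spatial illustration of the Monte Carlo workflow such tools are built on (not PyURDME's API), here is the Gillespie direct method for a single degradation reaction X -> 0:

```python
import random

def ssa_decay(n0, k, t_end, seed=0):
    """Gillespie direct method for the degradation reaction X -> 0 (rate k per molecule)."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    times, counts = [0.0], [n0]
    while n > 0:
        propensity = k * n                 # total reaction propensity a(x)
        t += rng.expovariate(propensity)   # exponentially distributed waiting time
        if t > t_end:
            break
        n -= 1                             # one degradation event fires
        times.append(t)
        counts.append(n)
    return times, counts

# One stochastic trajectory: 100 molecules decaying at rate 0.5 per molecule
times, counts = ssa_decay(100, 0.5, 50.0)
```

    Spatial solvers such as PyURDME extend this scheme by also treating diffusive jumps between mesh voxels as reaction events, which is what drives the large computational cost the abstract mentions.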

  20. An advanced environment for hybrid modeling of biological systems based on modelica.

    PubMed

    Pross, Sabrina; Bachmann, Bernhard

    2011-01-20

    Biological systems are often very complex, so an appropriate formalism is needed for modeling their behavior. Hybrid Petri nets, consisting of time-discrete Petri net elements as well as continuous ones, have proven to be ideal for this task. Therefore, a new Petri net library was implemented based on the object-oriented modeling language Modelica, which allows the modeling of discrete, stochastic and continuous Petri net elements by differential, algebraic and discrete equations. An appropriate Modelica tool performs the hybrid simulation with discrete events and the solution of continuous differential equations. A special sub-library contains so-called wrappers for specific reactions to simplify the modeling process. The Modelica models can be connected to Simulink models for parameter optimization, sensitivity analysis and stochastic simulation in Matlab. The present paper illustrates the implementation of the Petri net component models, their usage within the modeling process and the coupling between the Modelica tool Dymola and Matlab/Simulink. The application is demonstrated by modeling the metabolism of Chinese hamster ovary cells.
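    The library described is written in Modelica; as a minimal illustration of the discrete Petri net semantics it builds on (not the library's own API), here is a token-firing sketch in Python:

```python
class PetriNet:
    """Minimal discrete Petri net: places hold tokens, transitions consume and produce them."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place name -> token count
        self.transitions = {}          # transition name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        """Arcs are dicts mapping place names to arc weights."""
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        """A transition is enabled when every input place holds enough tokens."""
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= w for p, w in inputs.items())

    def fire(self, name):
        """Fire an enabled transition: consume input tokens, produce output tokens."""
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, w in inputs.items():
            self.marking[p] -= w
        for p, w in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + w

# Illustrative enzyme-binding step: S + E -> ES
net = PetriNet({"S": 2, "E": 1, "ES": 0})
net.add_transition("bind", {"S": 1, "E": 1}, {"ES": 1})
net.fire("bind")
```

    A hybrid formalism like the one in the abstract replaces some of these integer markings with continuous quantities governed by differential equations, while discrete transitions still fire as events.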
