Sample records for generation tool decider

  1. DECIDE: a Decision Support Tool to Facilitate Parents' Choices Regarding Genome-Wide Sequencing.

    PubMed

    Birch, Patricia; Adam, S; Bansback, N; Coe, R R; Hicklin, J; Lehman, A; Li, K C; Friedman, J M

    2016-12-01

    We describe the rationale, development, and usability testing for an integrated e-learning tool and decision aid for parents facing decisions about genome-wide sequencing (GWS) for their children with a suspected genetic condition. The online tool, DECIDE, is designed to provide decision-support and to promote high quality decisions about undergoing GWS with or without return of optional incidental finding results. DECIDE works by integrating educational material with decision aids. Users may tailor their learning by controlling both the amount of information and its format - text and diagrams and/or short videos. The decision aid guides users to weigh the importance of various relevant factors in their own lives and circumstances. After considering the pros and cons of GWS and return of incidental findings, DECIDE summarizes the user's responses and apparent preferred choices. In a usability study of 16 parents who had already chosen GWS after conventional genetic counselling, all participants found DECIDE to be helpful. Many would have been satisfied to use it alone to guide their GWS decisions, but most would prefer to have the option of consulting a health care professional as well to aid their decision. Further testing is necessary to establish the effectiveness of using DECIDE as an adjunct to or instead of conventional pre-test genetic counselling for clinical genome-wide sequencing.

  2. Tools, information sources, and methods used in deciding on drug availability in HMOs.

    PubMed

    Barner, J C; Thomas, J

    1998-01-01

    The use and importance of specific decision-making tools, information sources, and drug-use management methods in determining drug availability and use in HMOs were studied. A questionnaire was sent to 303 randomly selected HMOs. Respondents were asked to rate their use of each of four formal decision-making tools and its relative importance, as well as the use and importance of eight information sources and 11 methods for managing drug availability and use, on a 5-point scale. The survey response rate was 28%. Approximately half of the respondents reported that their HMOs used decision analysis or multiattribute analysis in deciding on drug availability. If used, these tools were rated as very important. There were significant differences in levels of use by HMO type, membership size, and age. Journal articles and reference books were reported most often as information sources. Retrospective drug-use review was used very often and perceived to be very important in managing drug use. Other management methods were used only occasionally, but the importance placed on these tools when used ranged from moderately to very important. Older organizations used most of the management methods more often than did other HMOs. Decision analysis and multiattribute analysis were the most commonly used tools for deciding on which drugs to make available to HMO members, and reference books and journal articles were the most commonly used information sources. Retrospective and prospective drug-use reviews were the most commonly applied methods for managing HMO members' access to drugs.

  3. E-DECIDER: Using Earth Science Data and Modeling Tools to Develop Decision Support for Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, Margaret T.; Wang, Jun; Pierce, Marlon E.; Yoder, Mark R.; Parker, Jay W.; Burl, Michael C.; Stough, Timothy M.; Granat, Robert A.; Donnellan, Andrea; Rundle, John B.; Ma, Yu; Bawden, Gerald W.; Yuen, Karen

    2015-08-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. E-DECIDER incorporates the earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, allow us to provide both long-term planning information for disaster management decision makers and short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). This in turn is delivered through standards-compliant web services for desktop and hand-held devices.

  4. E-DECIDER Decision Support Gateway For Earthquake Disaster Response

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Stough, T. M.; Parker, J. W.; Burl, M. C.; Donnellan, A.; Blom, R. G.; Pierce, M. E.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.

    2013-12-01

    Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing capabilities for decision-making utilizing remote sensing data and modeling software in order to provide decision support for earthquake disaster management and response. E-DECIDER incorporates earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project in order to produce standards-compliant map data products to aid in decision-making following an earthquake. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, help provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER utilizes a service-based GIS model for its cyber-infrastructure in order to produce standards-compliant products for different user types with multiple service protocols (such as KML, WMS, WFS, and WCS). The goal is to make complex GIS processing and domain-specific analysis tools more accessible to general users through software services as well as provide system sustainability through infrastructure services. The system comprises several components, which include: a GeoServer for thematic mapping and data distribution, a geospatial database for storage and spatial analysis, web service APIs, including simple-to-use REST APIs for complex GIS functionalities, and geoprocessing tools including python scripts to produce standards-compliant data products. These are then served to the E-DECIDER decision support gateway (http://e-decider.org), the E-DECIDER mobile interface, and to the Department of Homeland Security decision support middleware UICDS (Unified Incident Command and Decision Support). The E-DECIDER decision support gateway features a web interface that
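
    The gateway's standards-compliant services (WMS, WFS, WCS, KML) can be exercised with ordinary OGC client requests. The sketch below is a minimal, hypothetical example of fetching a map image over WMS with Python: the endpoint URL and layer name are placeholders, not E-DECIDER's actual service addresses, while the request parameters follow the standard WMS 1.1.1 GetMap interface.

```python
# Hypothetical WMS GetMap request against a GeoServer endpoint of the kind the
# E-DECIDER gateway describes. The host and layer name are placeholders.
import requests

params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": "edecider:deformation",    # placeholder layer name
    "bbox": "-120.0,34.0,-118.0,36.0",   # lon/lat bounding box
    "srs": "EPSG:4326",
    "width": 512,
    "height": 512,
    "format": "image/png",
}

resp = requests.get("https://example.org/geoserver/wms", params=params, timeout=30)
resp.raise_for_status()
with open("deformation_map.png", "wb") as fh:
    fh.write(resp.content)               # save the rendered map tile
```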

  5. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) using a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.

  6. Using bio.tools to generate and annotate workbench tool descriptions

    PubMed Central

    Hillion, Kenzo-Hugo; Kuzmin, Ivan; Khodak, Anton; Rasche, Eric; Crusoe, Michael; Peterson, Hedi; Ison, Jon; Ménager, Hervé

    2017-01-01

    Workbench and workflow systems such as Galaxy, Taverna, Chipster, or Common Workflow Language (CWL)-based frameworks facilitate access to bioinformatics tools in a user-friendly, scalable and reproducible way. Still, the integration of tools in such environments remains a cumbersome, time-consuming and error-prone process. A major consequence is the incomplete or outdated description of tools that are often missing important information, including parameters and metadata such as publication or links to documentation. ToolDog (Tool DescriptiOn Generator) facilitates the integration of tools - which have been registered in the ELIXIR tools registry (https://bio.tools) - into workbench environments by generating tool description templates. ToolDog includes two modules. The first module analyses the source code of the bioinformatics software with language-specific plugins, and generates a skeleton for a Galaxy XML or CWL tool description. The second module is dedicated to the enrichment of the generated tool description, using metadata provided by bio.tools. This last module can also be used on its own to complete or correct existing tool descriptions with missing metadata. PMID:29333231
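
    To make the two-module workflow concrete, the sketch below emits the kind of skeleton CWL CommandLineTool description a generator in this spirit could produce before metadata enrichment. It is an illustrative sketch, not ToolDog's actual output: the executable name, inputs and outputs are invented, while the field names follow the public CWL v1.0 schema.

```python
# Illustrative sketch: emit a skeleton CWL CommandLineTool description, to be
# enriched later with registry (bio.tools) metadata. Tool name and I/O are invented.
import yaml  # PyYAML

skeleton = {
    "cwlVersion": "v1.0",
    "class": "CommandLineTool",
    "baseCommand": "mytool",                      # placeholder executable
    "label": "mytool",
    "doc": "Skeleton generated from source-code analysis; enrich with registry metadata.",
    "inputs": {
        "input_file": {"type": "File", "inputBinding": {"position": 1}},
    },
    "outputs": {
        "result": {"type": "File", "outputBinding": {"glob": "output.txt"}},
    },
}

with open("mytool.cwl", "w") as fh:
    yaml.safe_dump(skeleton, fh, sort_keys=False)
```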

  7. Web Tools: The Second Generation

    ERIC Educational Resources Information Center

    Pascopella, Angela

    2008-01-01

    Web 2.0 tools and technologies, or second generation tools, help districts to save time and money, and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, which falls under the 21st-century skills. The second-generation tools are growing in popularity…

  8. Systems Prototyping with Fourth Generation Tools.

    ERIC Educational Resources Information Center

    Sholtys, Phyllis

    1983-01-01

    The development of information systems using an engineering approach that uses both traditional programming techniques and fourth generation software tools is described. Fourth generation application tools are used to quickly develop a prototype system that is revised as the user clarifies requirements. (MLW)

  9. GridTool: A surface modeling and grid generation tool

    NASA Technical Reports Server (NTRS)

    Samareh-Abolhassani, Jamshid

    1995-01-01

    GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to but not on the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool is stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI 'C', the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. Memory is allocated dynamically; therefore, the memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure, which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources and surfaces. At any given time there is always an active object, which is drawn in magenta or in its highlighted color as defined by the resource file, which will be discussed later.

  10. Sense, decide, act, communicate (SDAC): next generation of smart sensor systems

    NASA Astrophysics Data System (ADS)

    Berry, Nina; Davis, Jesse; Ko, Teresa H.; Kyker, Ron; Pate, Ron; Stark, Doug; Stinnett, Regan; Baker, James; Cushner, Adam; Van Dyke, Colin; Kyckelhahn, Brian

    2004-09-01

    The recent war on terrorism and increased urban warfare have been a major catalyst for increased interest in the development of disposable unattended wireless ground sensors. While the application of these sensors to hostile domains has been generally governed by specific tasks, this research explores a unique paradigm capitalizing on the fundamental functionality related to sensor systems. This functionality includes a sensor's ability to Sense - multi-modal sensing of environmental events, Decide - smart analysis of sensor data, Act - response to environmental events, and Communicate - internal to the system and external to humans (SDAC). The main concept behind SDAC sensor systems is to integrate the hardware, software, and networking to generate 'knowledge and not just data'. This research explores the usage of wireless SDAC units to collectively make up a sensor system capable of persistent, adaptive, and autonomous behavior. These systems are based on the evaluation of scenarios and existing systems covering various domains. This paper presents a promising view of sensor network characteristics, which will eventually yield smart (intelligent collectives) network arrays of SDAC sensing units generally applicable to multiple related domains. This paper will also discuss and evaluate the demonstration system developed to test the concepts related to SDAC systems.

  11. Disaster Response Tools for Decision Support and Data Discovery - E-DECIDER and GeoGateway

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Donnellan, A.; Parker, J. W.; Granat, R. A.; Lyzenga, G. A.; Pierce, M. E.; Wang, J.; Grant Ludwig, L.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2015-12-01

    Providing actionable data for situational awareness following an earthquake or other disaster is critical to decision makers in order to improve their ability to anticipate requirements and provide appropriate resources for response. E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) is a decision support system producing remote sensing and geophysical modeling products that are relevant to the emergency preparedness and response communities and serves as a gateway to enable the delivery of actionable information to these communities. GeoGateway is a data product search and analysis gateway for scientific discovery, field use, and disaster response focused on NASA UAVSAR and GPS data that integrates with fault data, seismicity and models. Key information on the nature, magnitude and scope of damage, or Essential Elements of Information (EEI), necessary to achieve situational awareness are often generated from a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. We have worked in partnership with the California Earthquake Clearinghouse to develop actionable data products for use in their response efforts, particularly in regularly scheduled, statewide exercises like the recent May 2015 Capstone/SoCal NLE/Ardent Sentry Exercises and in the August 2014 South Napa earthquake activation. We also provided a number of products, services, and consultation to the NASA agency-wide response to the April 2015 Gorkha, Nepal earthquake. We will present perspectives on developing tools for decision support and data discovery in partnership with the Clearinghouse and for the Nepal earthquake. Products delivered included map layers as part of the common operational data plan for the Clearinghouse, delivered through XchangeCore Web Service Data Orchestration, enabling users to create merged datasets from multiple providers. For the Nepal response effort, products included models

  12. Who decides and what are people willing-to-pay for whole genome sequencing information?

    PubMed Central

    Marshall, DA; Gonzalez, JM; Johnson, FR; MacDonald, KV; Pugh, A; Douglas, MP; Phillips, KA

    2016-01-01

    PURPOSE Whole genome sequencing (WGS) can be used as a powerful diagnostic tool which could also be used for screening but may generate anxiety, unnecessary testing and overtreatment. Current guidelines suggest reporting clinically actionable secondary findings when diagnostic testing is performed. We estimated preferences for receiving WGS results. METHODS A US nationally representative survey (n=410 adults) was used to rank preferences for who decides (expert panel, your doctor, you) which WGS results are reported. We estimated the value of information about variants with varying levels of clinical usefulness using willingness-to-pay contingent valuation questions. RESULTS 43% preferred to decide themselves what information is included in the WGS report. 38% (95% CI:33–43%) would not pay for actionable variants, and 3% (95% CI:1–5%) would pay more than $1000. 55% (95% CI:50–60%) would not pay for variants in which medical treatment is currently unclear, and 7% (95% CI:5–9%) would pay more than $400. CONCLUSION Most people prefer to decide what WGS results are reported. Despite valuing actionable information more, some respondents perceive that genetic information could negatively impact them. Preference heterogeneity for WGS information should be considered in the development of policies, particularly to integrate patient preferences with personalized medicine and shared decision making. PMID:27253734

  13. Semi-autonomous remote sensing time series generation tool

    NASA Astrophysics Data System (ADS)

    Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher

    2017-10-01

    High spatial and temporal resolution data are vital for crop monitoring and phenology change detection. Due to constraints of satellite architecture and frequent cloud cover, the availability of daily data at high spatial resolution is still far from reality. Generating remote sensing time series of high spatial and temporal resolution data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a framework for a Geographic Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool eliminates the difficulties by automating all the steps and enables users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Then two main frameworks are created: one performs all the pre-processing steps on various satellite data, and the other performs data fusion to generate the time series. The two frameworks can be used individually to perform specific tasks, or they can be combined to perform both processes in one go. The tool can handle most of the known geo-data formats currently available, which makes it a generic tool for time series generation from various remote sensing satellite data. The tool is developed as a common platform with an interface that provides many functionalities to enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.

  14. MCM generator: a Java-based tool for generating medical metadata.

    PubMed

    Munoz, F; Hersh, W

    1998-01-01

    In a previous paper we introduced the need to implement a mechanism to facilitate the discovery of relevant Web medical documents. We maintained that the use of META tags, specifically ones that define the medical subject and resource type of a document, helps towards this goal. We have now developed a tool to facilitate the generation of these tags for the authors of medical documents. Written entirely in Java, this tool makes use of the SAPHIRE server, and helps the author identify the Medical Subject Heading terms that most appropriately describe the subject of the document. Furthermore, it allows the author to generate metadata tags for the 15 elements that the Dublin Core considers as core elements in the description of a document. This paper describes the use of this tool in the cataloguing of Web and non-Web medical documents, such as image, movie, and sound files.
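
    For illustration, the sketch below generates Dublin Core META tags of the kind the abstract describes. The element names (title, creator, subject, date, type, format) are standard Dublin Core elements; the document values, including the MeSH subject heading, are invented examples, and the helper function is not part of the MCM generator itself.

```python
# Illustrative generator of Dublin Core META tags for a medical Web document.
# Element names are standard Dublin Core; the values are invented examples.
from html import escape

def dublin_core_meta(fields):
    """Return one <meta> tag per supplied Dublin Core element."""
    return "\n".join(
        f'<meta name="DC.{escape(name)}" content="{escape(value)}">'
        for name, value in fields.items()
    )

print(dublin_core_meta({
    "title": "Management of Type 2 Diabetes",
    "creator": "Example Clinic",
    "subject": "Diabetes Mellitus, Type 2",   # example MeSH heading
    "date": "1998-01-01",
    "type": "Text",
    "format": "text/html",
}))
```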

  15. Governance and Factions--Who Decides Who Decides?

    ERIC Educational Resources Information Center

    Hodgkinson, Harold L.

    In several projects, the Center is studying the question: who will decide which factions will be represented in the decision-making process. In the Campus Governance Project investigating the nature of governance, over 3,000 questionnaires were administered and 900 intensive interviews conducted at 19 institutions. The questionnaire was designed…

  16. Protocol for a randomised controlled trial of a web-based healthy relationship tool and safety decision aid for women experiencing domestic violence (I-DECIDE).

    PubMed

    Hegarty, Kelsey; Tarzia, Laura; Murray, Elizabeth; Valpied, Jodie; Humphreys, Cathy; Taft, Angela; Gold, Lisa; Glass, Nancy

    2015-08-01

    Domestic violence is a serious problem affecting the health and wellbeing of women globally. Interventions in health care settings have primarily focused on screening and referral, however, women often may not disclose abuse to health practitioners. The internet offers a confidential space in which women can assess the health of their relationships and make a plan for safety and wellbeing for themselves and their children. This randomised controlled trial is testing the effectiveness of a web-based healthy relationship tool and safety decision aid (I-DECIDE). Based broadly on the IRIS trial in the United States, it has been adapted for the Australian context where it is conducted entirely online and uses the Psychosocial Readiness Model as the basis for the intervention. In this two arm, pragmatic randomised controlled trial, women who have experienced abuse or fear of a partner in the previous 6 months will be computer randomised to receive either the I-DECIDE website or a comparator website (basic relationship and safety advice). The intervention includes self-directed reflection exercises on their relationship, danger level, priority setting, and results in an individualised, tailored action plan. Primary self-reported outcomes are: self-efficacy (General Self-Efficacy Scale) immediately after completion, 6 and 12 months post-baseline; and depressive symptoms (Centre for Epidemiologic Studies Depression Scale, Revised, 6 and 12 months post-baseline). Secondary outcomes include mean number of helpful actions for safety and wellbeing, mean level of fear of partner and cost-effectiveness. This fully-automated trial will evaluate a web-based self-information, self-reflection and self-management tool for domestic violence. We hypothesise that the improvement in self-efficacy and mental health will be mediated by increased perceived support and awareness encouraging positive change. If shown to be effective, I-DECIDE could be easily incorporated into the community

  17. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess the algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of the Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
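
    The raster comparison described above reduces to differencing each generated DEM against the reference surface and summarising the residuals. Below is a minimal sketch, assuming NumPy arrays already resampled to a common grid; real DEMs would be read from GeoTIFFs with a library such as rasterio or GDAL, and the synthetic grids here are only for demonstration.

```python
# Difference a generated DEM against a reference DEM and report min, max,
# mean error and RMSE. Grids are assumed to share the same extent and resolution.
import numpy as np

def dem_error_stats(reference, generated):
    diff = generated - reference
    return {
        "min": float(np.nanmin(diff)),
        "max": float(np.nanmax(diff)),
        "mean": float(np.nanmean(diff)),
        "rmse": float(np.sqrt(np.nanmean(diff ** 2))),
    }

# Example with synthetic 100 x 100 elevation grids (metres).
rng = np.random.default_rng(0)
reference = rng.uniform(500, 1500, size=(100, 100))
generated = reference + rng.normal(0, 0.3, size=(100, 100))
print(dem_error_stats(reference, generated))
```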

  18. EDCATS: An Evaluation Tool

    NASA Technical Reports Server (NTRS)

    Heard, Pamala D.

    1998-01-01

    The purpose of this research is to explore the development of Marshall Space Flight Center Unique Programs. These academic tools provide the Education Program Office with important information from the Education Computer Aided Tracking System (EDCATS). This system is equipped to provide on-line data entry, evaluation, analysis, and report generation, with full archiving for all phases of the evaluation process. Another purpose is to develop reports and data that are tailored to Marshall Space Flight Center Unique Programs. It also attempts to acquire knowledge on how, why, and where information is derived. As a result, a user will be better prepared to decide which available tool is the most feasible for their reports.

  19. Implementing the I-DECIDED clinical decision-making tool for peripheral intravenous catheter assessment and safe removal: protocol for an interrupted time-series study.

    PubMed

    Ray-Barruel, Gillian; Cooke, Marie; Mitchell, Marion; Chopra, Vineet; Rickard, Claire M

    2018-06-04

    Millions of acute care hospital patients need a peripheral intravenous catheter (PIVC) each year. However, up to half of PIVCs remain in situ when not being used, and 30%-50% of intravenous (IV) catheters develop complications or stop working before treatment is finished, requiring the insertion of a new device. Improved assessment could prompt timely removal of redundant catheters and prevent IV complications. This study aims to validate an evidence-based PIVC assessment and decision-making tool called I-DECIDED and evaluate the effect of implementing this tool into acute hospital clinical practice. The protocol outlines a prospective, multicentre, mixed-methods study using an interrupted time-series (multiple measures preintervention and postintervention) implementation at three Australian hospitals between August 2017 and July 2018. The study will examine the effectiveness of the I-DECIDED assessment and decision-making tool in clinical practice on prompting timely PIVC removal and early detection of complications. Primary outcomes are prevalence of redundant PIVCs (defined as device in situ without a clear purpose), IV complications (occlusion, dislodgement, infiltration, extravasation and phlebitis) and substandard dressings (loose, lifting, moist or soiled); device utilisation ratios; and primary bloodstream infection rates. Secondary outcomes including staff barriers and enablers to PIVC assessment and removal, patient participation, documentation of PIVC assessment and decisions taken to continue or remove the PIVC will be recorded. Using the Promoting Action on Research Implementation in Health Services framework, we will undertake staff focus groups, bedside patient interviews and PIVC assessments and chart audits. Patients aged 18 years or more with a PIVC will be eligible for inclusion. Ethical approval from Queensland Health (HREC/17/QPCH/47), Griffith University (Ref No. 2017/152) and St Vincent's Health and Aged Care Human Research and Ethics

  20. Next Generation CTAS Tools

    NASA Technical Reports Server (NTRS)

    Erzberger, Heinz

    2000-01-01

    The FAA's Free Flight Phase 1 Office is in the process of deploying the current generation of CTAS tools, which are the Traffic Management Advisor (TMA) and the passive Final Approach Spacing Tool (pFAST), at selected centers and airports. Research at NASA is now focussed on extending the CTAS software and computer human interfaces to provide more advanced capabilities. The Multi-center TMA (McTMA) is designed to operate at airports where arrival flows originate from two or more centers whose boundaries are in close proximity to the TRACON boundary. McTMA will also include techniques for routing arrival flows away from congested airspace and around airspace reserved for arrivals into other hub airports. NASA is working with FAA and MITRE to build a prototype McTMA for the Philadelphia airport. The active Final Approach Spacing Tool (aFAST) provides speed and heading advisories to help controllers achieve accurate spacing between aircraft on final approach. These advisories will be integrated with those in the existing pFAST to provide a set of comprehensive advisories for controlling arrival traffic from the TRACON boundary to touchdown at complex, high-capacity airports. A research prototype of aFAST, designed for the Dallas-Fort Worth area, is in an advanced stage of development. The Expedite Departure Path (EDP) and Direct-To tools are designed to help controllers guide departing aircraft out of the TRACON airspace and to climb to cruise altitude along the most efficient routes.

  1. Health. DECIDE.

    ERIC Educational Resources Information Center

    Huffman, Ruth E.; And Others

    This module, Health, is one of five from Project DECIDE, which was created to design, develop, write, and implement materials to provide adult basic education administrators, instructors, para-professionals, and other personnel with curriculum to accompany the Indiana Adult Basic Education Curriculum Guide, "Learning for Everyday…

  2. Nearly arc-length tool path generation and tool radius compensation algorithm research in FTS turning

    NASA Astrophysics Data System (ADS)

    Zhao, Minghui; Zhao, Xuesen; Li, Zengqiang; Sun, Tao

    2014-08-01

    In the generation of non-rotationally symmetrical microstructure surfaces by turning with a Fast Tool Servo (FTS), non-uniform distribution of the interpolation data points leads to long processing cycles and poor surface quality. To improve this situation, a nearly arc-length tool path generation algorithm is proposed, which generates tool tip trajectory points at nearly equal arc-length intervals instead of the traditional equal-angle interpolation rule, and adds tool radius compensation. All the interpolation points are equidistant in the radial direction because of the constant feed speed of the X slider; the high-frequency tool radius compensation components lie in both the X direction and the Z direction, which makes it difficult for the X slider to follow the input orders due to its large mass. The Newton iterative method is used to calculate the coordinate of the neighbouring contour tangent point, with the interpolation point's X position as the initial value; in this way the new Z coordinate value is obtained, and the high-frequency motion component in the X direction is decomposed into the Z direction. Taking as a test a typical microstructure with a 4 μm PV value, composed of two 70 μm wavelength sine waves, the maximum profile error at an angle of fifteen degrees is less than 0.01 μm when turning with a diamond tool with a large radius of 80 μm. The sinusoidal grid was machined successfully on an ultra-precision lathe; the wavelength is 70.2278 μm and the Ra value is 22.81 nm, evaluated from data points generated by filtering out the first five harmonics.
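
    The core idea of nearly arc-length interpolation, resampling the tool path at (nearly) equal increments of arc length rather than of spindle angle, can be sketched as below. The sinusoidal profile and its parameters are illustrative stand-ins, not the exact surface machined in the paper, and no tool radius compensation is included.

```python
# Resample a tool-path profile at (nearly) equal arc-length increments instead
# of equal angular/parameter increments. Profile parameters are illustrative.
import numpy as np

wavelength = 70e-3   # mm (70 um)
amplitude = 2e-3     # mm (half of a 4 um peak-to-valley value)

x_dense = np.linspace(0.0, 2 * wavelength, 20001)
z_dense = amplitude * np.sin(2 * np.pi * x_dense / wavelength)

# Cumulative arc length along the densely sampled profile.
s = np.concatenate(([0.0], np.cumsum(np.hypot(np.diff(x_dense), np.diff(z_dense)))))

# Resample at equal arc-length steps, then recover x and z by interpolation.
n_points = 200
s_uniform = np.linspace(0.0, s[-1], n_points)
x_path = np.interp(s_uniform, s, x_dense)
z_path = amplitude * np.sin(2 * np.pi * x_path / wavelength)
```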

  3. Efficient Data Generation and Publication as a Test Tool

    NASA Technical Reports Server (NTRS)

    Einstein, Craig Jakob

    2017-01-01

    A tool to facilitate the generation and publication of test data was created to test the individual components of a command and control system designed to launch spacecraft. Specifically, this tool was built to ensure messages are properly passed between system components. The tool can also be used to test whether the appropriate groups have access (read/write privileges) to the correct messages. The messages passed between system components take the form of unique identifiers with associated values. These identifiers are alphanumeric strings that identify the type of message and the additional parameters that are contained within the message. The values that are passed with the message depend on the identifier. The data generation tool allows for the efficient creation and publication of these messages. A configuration file can be used to set the parameters of the tool and also specify which messages to pass.
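
    A minimal sketch of the pattern described above: read identifier/value messages from a configuration file and publish each one so downstream components can be verified. The identifiers, the JSON configuration format, and the publish stub are hypothetical; the actual middleware used by the launch control system is not specified here.

```python
# Hypothetical test-data generator: load identifier/value pairs from a config
# file and publish each message. The publish() stub stands in for the real bus.
import json

def load_messages(config_path):
    with open(config_path) as fh:
        return json.load(fh)          # e.g. {"GSE.TEMP.SENSOR1": 72.4, ...}

def publish(identifier, value):
    # Placeholder for the actual middleware call used by the test tool.
    print(f"PUBLISH {identifier} = {value}")

if __name__ == "__main__":
    for identifier, value in load_messages("messages.json").items():
        publish(identifier, value)
```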

  4. Decision generation tools and Bayesian inference

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDG based on Bayesian Inference, related to adverse (hostile) networks, including such important applications as terrorism-related networks and organized crime networks.
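
    A worked, single-step example of the Bayesian update such decision-generation tools rest on: revising the probability that a network link is hostile after one observation. The prior and likelihood values are invented for illustration and are not taken from the paper.

```python
# One-step Bayes update: revise P(hostile) after observing an event whose
# likelihood differs under the hostile and benign hypotheses. Values are invented.
def bayes_update(prior, likelihood_given_h, likelihood_given_not_h):
    evidence = likelihood_given_h * prior + likelihood_given_not_h * (1.0 - prior)
    return likelihood_given_h * prior / evidence

prior_hostile = 0.10            # prior belief the link is hostile
p_obs_if_hostile = 0.70         # P(observation | hostile)
p_obs_if_benign = 0.05          # P(observation | benign)

posterior = bayes_update(prior_hostile, p_obs_if_hostile, p_obs_if_benign)
print(f"posterior P(hostile | observation) = {posterior:.3f}")   # ~0.609
```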

  5. On the next generation of reliability analysis tools

    NASA Technical Reports Server (NTRS)

    Babcock, Philip S., IV; Leong, Frank; Gai, Eli

    1987-01-01

    The current generation of reliability analysis tools concentrates on improving the efficiency of the description and solution of the fault-handling processes and providing a solution algorithm for the full system model. The tools have improved user efficiency in these areas to the extent that the problem of constructing the fault-occurrence model is now the major analysis bottleneck. For the next generation of reliability tools, it is proposed that techniques be developed to improve the efficiency of the fault-occurrence model generation and input. Further, the goal is to provide an environment permitting a user to provide a top-down design description of the system from which a Markov reliability model is automatically constructed. Thus, the user is relieved of the tedious and error-prone process of model construction, permitting an efficient exploration of the design space, and an independent validation of the system's operation is obtained. An additional benefit of automating the model construction process is the opportunity to reduce the specialized knowledge required. Hence, the user need only be an expert in the system he is analyzing; the expertise in reliability analysis techniques is supplied.
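
    As a concrete illustration of the kind of fault-occurrence model such tools would construct automatically, the sketch below solves a small continuous-time Markov reliability model for an invented two-component redundant system; the failure rate and mission time are assumed values, not drawn from the paper.

```python
# Solve a small continuous-time Markov reliability model. States: 0 = both
# components up, 1 = one up, 2 = system failed. No repair transitions.
import numpy as np
from scipy.linalg import expm

lam = 1e-4   # failures per hour (assumed)

# Generator matrix Q: row i holds the transition rates out of state i.
Q = np.array([
    [-2 * lam, 2 * lam, 0.0],
    [0.0,      -lam,    lam],
    [0.0,      0.0,     0.0],
])

p0 = np.array([1.0, 0.0, 0.0])        # start with both components working
t = 1000.0                            # mission time in hours
p_t = p0 @ expm(Q * t)                # state probabilities at time t
print(f"P(system failed by {t:.0f} h) = {p_t[2]:.4e}")
```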

  6. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    NASA Astrophysics Data System (ADS)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measuring as a coordinate matrix, has as its goal determining the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error as a basis of the total error. The generation modelling allows highlighting the potential errors of the generating tool, in order to correct its profile before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new method, namely the method of "relative generating trajectories". The analytical foundation is presented, as well as some applications to known models of rack-gear type tools used on Maag teething machines.

  7. Next generation tools for genomic data generation, distribution, and visualization

    PubMed Central

    2010-01-01

    Background With the rapidly falling cost and availability of high throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Results Here we present three open-source, platform independent, software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting edge command line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. Conclusions These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq. PMID:20828407

  8. A NEO population generation and observation simulation software tool

    NASA Astrophysics Data System (ADS)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs and which observation strategies work the best. Because of this, a sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool comprises the components "Population Generator" and "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool "gnuplot". The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as

  9. UIVerify: A Web-Based Tool for Verification and Automatic Generation of User Interfaces

    NASA Technical Reports Server (NTRS)

    Shiffman, Smadar; Degani, Asaf; Heymann, Michael

    2004-01-01

    In this poster, we describe a web-based tool for verification and automatic generation of user interfaces. The verification component of the tool accepts as input a model of a machine and a model of its interface, and checks that the interface is adequate (correct). The generation component of the tool accepts a model of a given machine and the user's task, and then generates a correct and succinct interface. This write-up will demonstrate the usefulness of the tool by verifying the correctness of a user interface to a flight-control system. The poster will include two more examples of using the tool: verification of the interface to an espresso machine, and automatic generation of a succinct interface to a large hypothetical machine.

  10. Re-Imagining Specialized STEM Academies: Igniting and Nurturing "Decidedly Different Minds", by Design

    ERIC Educational Resources Information Center

    Marshall, Stephanie Pace

    2010-01-01

    This article offers a personal vision and conceptual design for reimagining specialized science, technology, engineering, and mathematics (STEM) academies designed to nurture "decidedly different" STEM minds and ignite a new generation of global STEM talent, innovation, and entrepreneurial leadership. This design enables students to engage…

  11. The parser generator as a general purpose tool

    NASA Technical Reports Server (NTRS)

    Noonan, R. E.; Collins, W. R.

    1985-01-01

    The parser generator has proven to be an extremely useful, general purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Some of the application areas for which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.

  12. The DECIDE evidence to recommendation framework adapted to the public health field in Sweden.

    PubMed

    Guldbrandsson, Karin; Stenström, Nils; Winzer, Regina

    2016-12-01

    Organizations worldwide compile results from scientific studies, and grade the evidence of interventions, in order to assist policy makers. However, quality of evidence alone is seldom sufficient to make a recommendation. The Developing and Evaluating Communication Strategies to Support Informed Decisions and Practice Based on Evidence (DECIDE) framework aims to facilitate decision making and to improve dissemination and implementation of recommendations in the healthcare and public health sector. The aim of this study was to investigate whether the DECIDE framework is applicable in the public health field in Sweden. The DECIDE framework was presented and discussed in interviews with stakeholders and governmental organizations and tested in panels. Content analyses were performed. In general, the informants were positive about the DECIDE framework. However, the stakeholders felt that two questions were missing from the framework: the first regarding individual autonomy and the second regarding method sustainability. The informants also raised the importance of the composition of the DECIDE stakeholder panel, as well as the significant role of the chair. Further, the informants raised concerns about the general lack of research evidence based on RCT design regarding universal methods in the public health sector. Finally, the informants raised the question of the local, regional and national levels' responsibility for dissemination and implementation of recommendations. The DECIDE framework might be useful as a tool for dissemination and implementation of recommendations in the public health field in Sweden. Important questions for further research are whether these findings are suitable for other public health topics and in other public health settings.

  13. Experiences with a generator tool for building clinical application modules.

    PubMed

    Kuhn, K A; Lenz, R; Elstner, T; Siegele, H; Moll, R

    2003-01-01

    To elaborate the main system characteristics and relevant deployment experiences for the health information system (HIS) Orbis/OpenMed, which is in widespread use in Germany, Austria, and Switzerland. In a deployment phase of 3 years in a 1,200-bed university hospital, where the system underwent significant improvements, the system's functionality and its software design have been analyzed in detail. We focus on an integrated CASE tool for generating embedded clinical applications and for incremental system evolution. We present a participatory and iterative software engineering process developed for efficient utilization of such a tool. The system's functionality is comparable to that of other commercial products; its components are embedded in a vendor-specific application framework, and standard interfaces are used for connecting subsystems. The integrated generator tool is a remarkable feature; it became a key factor of our project. Tool-generated applications are workflow enabled and embedded into the overall database schema. Rapid prototyping and iterative refinement are supported, so application modules can be adapted to the users' work practice. We consider tools supporting an iterative and participatory software engineering process highly relevant for health information system architects. The potential of a system to continuously evolve and to be effectively adapted to changing needs may be more important than sophisticated but hard-coded HIS functionality. More work will focus on HIS software design and on software engineering. Methods and tools are needed for quick and robust adaptation of systems to health care processes and changing requirements.

  14. An Infrastructure for UML-Based Code Generation Tools

    NASA Astrophysics Data System (ADS)

    Wehrmeister, Marco A.; Freitas, Edison P.; Pereira, Carlos E.

    The use of Model-Driven Engineering (MDE) techniques in the domain of distributed embedded real-time systems is gaining importance in order to cope with the increasing design complexity of such systems. This paper discusses an infrastructure created to build GenERTiCA, a flexible tool that supports an MDE approach, which uses aspect-oriented concepts to handle non-functional requirements from the embedded and real-time systems domain. GenERTiCA generates source code from UML models, and also performs weaving of aspects, which have been specified within the UML model. Additionally, this paper discusses the Distributed Embedded Real-Time Compact Specification (DERCS), a PIM created to support UML-based code generation tools. Some heuristics to transform UML models into DERCS, which have been implemented in GenERTiCA, are also discussed.

  15. A decision support tool for landfill methane generation and gas collection.

    PubMed

    Emkes, Harriet; Coulon, Frédéric; Wagland, Stuart

    2015-09-01

    This study presents a decision support tool (DST) to enhance methane generation at individual landfill sites. To date there is no such tool available to provide landfill decision makers with clear and simplified information to evaluate biochemical processes within a landfill site, to assess the performance of gas production and to identify potential remedies to any issues. The current lack of understanding stems from the complexity of the landfill waste degradation process. Two scoring sets for landfill gas production performance are calculated with the tool: (1) a methane output score, which measures the deviation of the actual methane output rate at each site from the prediction generated by the first-order decay model LandGEM; and (2) a landfill gas indicators score, which measures the deviation of the landfill gas indicators from their ideal ranges for optimal methane generation conditions. Landfill gas indicators include moisture content, temperature, alkalinity, pH, BOD, COD, BOD/COD ratio, ammonia, chloride, iron and zinc. A total landfill gas indicator score is provided using multi-criteria analysis to calculate the sum of weighted scores for each indicator. The weights for each indicator are calculated using an analytical hierarchical process. The tool is tested against five real scenarios for landfill sites in the UK with a range of good, average and poor landfill methane generation over a one-year period (2012). An interpretation of the results is given for each scenario and recommendations are highlighted for methane output rate enhancement. Results demonstrate how the tool can help landfill managers and operators to enhance their understanding of methane generation at a site-specific level, track landfill methane generation over time, compare and rank sites, and identify problem areas within a landfill site.
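
    A hedged sketch of the two scores described above. The decay estimate follows the standard first-order form used by LandGEM (summing k·L0·(Mi/10)·e^(−k·t) over waste-acceptance years and sub-yearly increments); the waste tonnages, k, L0, indicator ranges and weights are illustrative values, not those used in the study, and the deviation normalisation is one plausible choice rather than the paper's exact scheme.

```python
# Illustrative versions of (1) a LandGEM-style first-order decay estimate and
# (2) a weighted deviation score for landfill gas indicators. Values are invented.
import math

def landgem_methane(annual_waste_tonnes, k=0.05, L0=100.0, year=10):
    """Rough CH4 generation estimate (m3/yr) in a given year from yearly waste inputs."""
    total = 0.0
    for i, mass in enumerate(annual_waste_tonnes):       # mass accepted in year i
        for j in range(10):                              # 0.1-year sub-increments
            t = (year - i) + j / 10.0                    # age of this waste section
            if t > 0:
                total += k * L0 * (mass / 10.0) * math.exp(-k * t)
    return total

def indicator_score(values, ideal_ranges, weights):
    """Weighted sum of deviations of gas indicators from their ideal ranges."""
    score = 0.0
    for name, value in values.items():
        low, high = ideal_ranges[name]
        if value < low:
            deviation = (low - value) / (high - low)
        elif value > high:
            deviation = (value - high) / (high - low)
        else:
            deviation = 0.0
        score += weights[name] * deviation
    return score

waste = [50000.0] * 15                                    # tonnes/year, illustrative
print(f"LandGEM-style CH4 estimate: {landgem_methane(waste):.0f} m3/yr")
print(indicator_score({"pH": 6.4, "temperature": 30.0},
                      {"pH": (6.8, 7.4), "temperature": (30.0, 45.0)},
                      {"pH": 0.6, "temperature": 0.4}))
```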

  16. Next-Generation Tools For Next-Generation Surveys

    NASA Astrophysics Data System (ADS)

    Murray, S. G.

    2017-04-01

    The next generation of large-scale galaxy surveys, across the electromagnetic spectrum, loom on the horizon as explosively game-changing datasets, in terms of our understanding of cosmology and structure formation. We are on the brink of a torrent of data that is set to both confirm and constrain current theories to an unprecedented level, and potentially overturn many of our conceptions. One of the great challenges of this forthcoming deluge is to extract maximal scientific content from the vast array of raw data. This challenge requires not only well-understood and robust physical models, but a commensurate network of software implementations with which to efficiently apply them. The halo model, a semi-analytic treatment of cosmological spatial statistics down to nonlinear scales, provides an excellent mathematical framework for exploring the nature of dark matter. This thesis presents a next-generation toolkit based on the halo model formalism, intended to fulfil the requirements of next-generation surveys. Our toolkit comprises three tools: (i) hmf, a comprehensive and flexible calculator for halo mass functions (HMFs) within extended Press-Schechter theory, (ii) the MRP distribution for extremely efficient analytic characterisation of HMFs, and (iii) halomod, an extension of hmf which provides support for the full range of halo model components. In addition to the development and technical presentation of these tools, we apply each to the task of physical modelling. With hmf, we determine the precision of our knowledge of the HMF, due to uncertainty in our knowledge of the cosmological parameters, over the past decade of cosmic microwave background (CMB) experiments. We place rule-of-thumb uncertainties on the predicted HMF for the Planck cosmology, and find that current limits on the precision are driven by modeling uncertainties rather than those from cosmological parameters. With the MRP, we create and test a method for robustly fitting the HMF to observed

  17. Improving Breast Cancer Surgical Treatment Decision Making: The iCanDecide Randomized Clinical Trial.

    PubMed

    Hawley, Sarah T; Li, Yun; An, Lawrence C; Resnicow, Kenneth; Janz, Nancy K; Sabel, Michael S; Ward, Kevin C; Fagerlin, Angela; Morrow, Monica; Jagsi, Reshma; Hofer, Timothy P; Katz, Steven J

    2018-03-01

    Purpose This study was conducted to determine the effect of iCanDecide, an interactive and tailored breast cancer treatment decision tool, on the rate of high-quality patient decisions-both informed and values concordant-regarding locoregional breast cancer treatment and on patient appraisal of decision making. Methods We conducted a randomized clinical trial of newly diagnosed patients with early-stage breast cancer making locoregional treatment decisions. From 22 surgical practices, 537 patients were recruited and randomly assigned online to the iCanDecide interactive and tailored Web site (intervention) or the iCanDecide static Web site (control). Participants completed a baseline survey and were mailed a follow-up survey 4 to 5 weeks after enrollment to assess the primary outcome of a high-quality decision, which consisted of two components, high knowledge and values-concordant treatment, and secondary outcomes (decision preparation, deliberation, and subjective decision quality). Results Patients in the intervention arm had higher odds of making a high-quality decision than did those in the control arm (odds ratio, 2.00; 95% CI, 1.37 to 2.92; P = .0004), which was driven primarily by differences in the rates of high knowledge between groups. The majority of patients in both arms made values-concordant treatment decisions (78.6% in the intervention arm and 81.4% in the control arm). More patients in the intervention arm had high decision preparation (estimate, 0.18; 95% CI, 0.02 to 0.34; P = .027), but there were no significant differences in the other decision appraisal outcomes. The effect of the intervention was similar for women who were leaning strongly toward a treatment option at enrollment compared with those who were not. Conclusion The tailored and interactive iCanDecide Web site, which focused on knowledge building and values clarification, positively affected high-quality decisions largely by improving knowledge compared with static online

  18. The Exercise: An Exercise Generator Tool for the SOURCe Project

    ERIC Educational Resources Information Center

    Kakoyianni-Doa, Fryni; Tziafa, Eleni; Naskos, Athanasios

    2016-01-01

    The Exercise, an Exercise generator in the SOURCe project, is a tool that complements the properties and functionalities of the SOURCe project, which includes the search engine for the Searchable Online French-Greek parallel corpus for the UniveRsity of Cyprus (SOURCe) (Kakoyianni-Doa & Tziafa, 2013), the PENCIL (an alignment tool)…

  19. PLQP & Company: Decidable Logics for Quantum Algorithms

    NASA Astrophysics Data System (ADS)

    Baltag, Alexandru; Bergfeld, Jort; Kishida, Kohei; Sack, Joshua; Smets, Sonja; Zhong, Shengyang

    2014-10-01

    We introduce a probabilistic modal (dynamic-epistemic) quantum logic PLQP for reasoning about quantum algorithms. We illustrate its expressivity by using it to encode the correctness of the well-known quantum search algorithm, as well as of a quantum protocol known to solve one of the paradigmatic tasks from classical distributed computing (the leader election problem). We also provide a general method (extending an idea employed in the decidability proof in Dunn et al. (J. Symb. Log. 70:353-359, 2005)) for proving the decidability of a range of quantum logics, interpreted on finite-dimensional Hilbert spaces. We give general conditions for the applicability of this method, and in particular we apply it to prove the decidability of PLQP.

  20. 36 CFR 215.8 - Appeal Deciding Officer.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... names, the Appeal Deciding Officer shall identify all qualified appellants (§ 215.13). (ii) The Appeal Deciding Officer may appoint the first name listed as the lead appellant (§ 215.2) to act on behalf of all parties to that appeal when the appeal does not specify a lead appellant (§ 215.14(b)(3)). (3) Appeal...

  1. 36 CFR 215.8 - Appeal Deciding Officer.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... names, the Appeal Deciding Officer shall identify all qualified appellants (§ 215.13). (ii) The Appeal Deciding Officer may appoint the first name listed as the lead appellant (§ 215.2) to act on behalf of all parties to that appeal when the appeal does not specify a lead appellant (§ 215.14(b)(3)). (3) Appeal...

  2. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools, and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  3. The mission events graphic generator software: A small tool with big results

    NASA Technical Reports Server (NTRS)

    Lupisella, Mark; Leibee, Jack; Scaffidi, Charles

    1993-01-01

    Utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal computer-based software tool that implements straightforward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG) software, which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.

  4. Analysis and design of friction stir welding tool

    NASA Astrophysics Data System (ADS)

    Jagadeesha, C. B.

    2016-12-01

    Since its inception, no systematic analysis and design of the friction stir welding (FSW) tool has been reported; initial tool dimensions are typically decided by educated guess. Optimum stresses on the tool pin were determined at optimized parameters for bead-on-plate welding of AZ31B-O Mg alloy plate. Fatigue analysis showed that the FSW tool chosen for the welding experiment does not have infinite life; its life was determined to be 2.66×10⁵ cycles (revolutions). One can therefore conclude that an arbitrarily decided FSW tool generally has finite life and cannot be assumed to last indefinitely. In general, this analysis allows the suitability of a tool and its material for FSW of given workpiece materials to be determined in advance in terms of the fatigue life of the tool.
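
    As a rough illustration of how a finite fatigue life such as the one reported above can be estimated, the sketch below applies a standard stress-life (Basquin) relation. The abstract does not state which fatigue method or material constants were used, so the coefficients, stress amplitude, and rotation speed in the code are hypothetical placeholders only.

      # Minimal stress-life (Basquin) fatigue estimate for a rotating tool pin.
      # All material constants and the operating stress below are hypothetical
      # placeholders; the abstract does not report the values it used.

      def basquin_life(stress_amplitude_mpa, sigma_f_prime_mpa=900.0, b=-0.09):
          """Return cycles to failure N from sigma_a = sigma_f' * (2N)**b."""
          n_reversals = (stress_amplitude_mpa / sigma_f_prime_mpa) ** (1.0 / b)
          return n_reversals / 2.0  # two reversals per cycle

      if __name__ == "__main__":
          sigma_a = 180.0  # MPa, assumed alternating bending stress on the pin
          cycles = basquin_life(sigma_a)
          rpm = 1200.0     # assumed tool rotation speed
          print(f"Estimated life: {cycles:.3e} cycles "
                f"(~{cycles / rpm / 60.0:.1f} hours at {rpm:.0f} rpm)")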

  5. Pharmacy career deciding: making choice a "good fit".

    PubMed

    Willis, Sarah Caroline; Shann, Phillip; Hassell, Karen

    2009-01-01

    The purpose of this article is to explore factors influencing career deciding amongst pharmacy students and graduates in the U.K. Group interviews were used to devise a topic guide for five subsequent focus groups with pharmacy students and graduates. Focus groups were tape-recorded, recordings transcribed, and transcripts analysed. Key themes and interlinking factors relating to pharmacy career deciding were identified in the transcripts, following a constructivist approach. Participants described making a "good fit" between themselves, their experiences, social networks, etc., and pharmacy. Central to a coherent career deciding narrative were: having a job on graduation; and the instrumental advantage of studying a vocational course. Focusing on career deciding of UK pharmacy students and graduates may limit the study's generalisability to other countries. However, our findings are relevant to those interested in understanding students' motivations for healthcare careers, since our results suggest that making a "good fit" describes a general process of matching between a healthcare career and personal experience. As we have found that pharmacy career deciding was not, usually, a planned activity, career advisors and those involved in higher education recruitment should take into account the roles played by personal preferences and values in choosing a degree course. A qualitative study like this can illustrate how career deciding occurs and provide insight into the process from a student's perspective. This can help inform guidance processes, selection to healthcare professions courses within the higher education sector, and stimulate debate amongst those involved with recruitment of healthcare workers about desirable motivators for healthcare careers.

  6. The generation of criteria for selecting analytical tools for landscape management

    Treesearch

    Marilyn Duffey-Armstrong

    1979-01-01

    This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...

  7. The efficiency of geophysical adjoint codes generated by automatic differentiation tools

    NASA Astrophysics Data System (ADS)

    Vlasenko, A. V.; Köhl, A.; Stammer, D.

    2016-02-01

    The accuracy of numerical models that describe complex physical or chemical processes depends on the choice of model parameters. Estimating an optimal set of parameters by optimization algorithms requires knowledge of the sensitivity of the process of interest to model parameters. Typically the sensitivity computation involves differentiation of the model, which can be performed by applying algorithmic differentiation (AD) tools to the underlying numerical code. However, existing AD tools differ substantially in design, legibility and computational efficiency. In this study we show that, for geophysical data assimilation problems of varying complexity, the performance of adjoint codes generated by the existing AD tools (i) Open_AD, (ii) Tapenade, (iii) NAGWare and (iv) Transformation of Algorithms in Fortran (TAF) can be vastly different. Based on simple test problems, we evaluate the efficiency of each AD tool with respect to computational speed, accuracy of the adjoint, the efficiency of memory usage, and the capability of each AD tool to handle modern FORTRAN 90-95 elements such as structures and pointers, which are new elements that either combine groups of variables or provide aliases to memory addresses, respectively. We show that, while operator overloading tools are the only ones suitable for modern codes written in object-oriented programming languages, their computational efficiency lags behind source transformation by orders of magnitude, rendering the application of these modern tools to practical assimilation problems prohibitive. In contrast, the application of source transformation tools appears to be the most efficient choice, allowing handling even large geophysical data assimilation problems. However, they can only be applied to numerical models written in earlier generations of programming languages. Our study indicates that applying existing AD tools to realistic geophysical problems faces limitations that urgently need to be solved to allow the
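
    To make concrete what the adjoint codes produced by these tools provide, the sketch below computes a reverse-mode (adjoint) gradient of a tiny least-squares cost by hand and checks it against finite differences. This is a toy Python example illustrating the concept only; it uses none of the Fortran AD tools named above, and the model and data are made up.

      # Minimal illustration of an "adjoint": reverse-mode derivatives of a tiny
      # model J(p) = sum((p0*x + p1*x**2 - obs)**2), checked against finite
      # differences.  Toy example, not output of any AD tool named above.
      import numpy as np

      x = np.linspace(0.0, 1.0, 50)
      obs = 2.0 * x + 0.5 * x**2            # synthetic observations

      def cost(p):
          r = p[0] * x + p[1] * x**2 - obs  # residual
          return np.sum(r**2)

      def cost_adjoint(p):
          """Hand-derived reverse-mode (adjoint) gradient of cost()."""
          r = p[0] * x + p[1] * x**2 - obs
          r_bar = 2.0 * r                   # dJ/dr
          return np.array([np.sum(r_bar * x), np.sum(r_bar * x**2)])

      p = np.array([1.0, 1.0])
      g_adj = cost_adjoint(p)
      eps = 1e-6
      g_fd = np.array([(cost(p + eps * np.eye(2)[i]) - cost(p)) / eps
                       for i in range(2)])
      print("adjoint gradient:", g_adj)
      print("finite-difference check:", g_fd)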

  8. Genome Annotation Generator: a simple tool for generating and correcting WGS annotation tables for NCBI submission.

    PubMed

    Geib, Scott M; Hall, Brian; Derego, Theodore; Bremer, Forest T; Cannoles, Kyle; Sim, Sheina B

    2018-04-01

    One of the most overlooked, yet critical, components of a whole genome sequencing (WGS) project is the submission and curation of the data to a genomic repository, most commonly the National Center for Biotechnology Information (NCBI). While large genome centers or genome groups have developed software tools for post-annotation assembly filtering, annotation, and conversion into the NCBI's annotation table format, these tools typically require back-end setup and connection to an Structured Query Language (SQL) database and/or some knowledge of programming (Perl, Python) to implement. With WGS becoming commonplace, genome sequencing projects are moving away from the genome centers and into the ecology or biology lab, where fewer resources are present to support the process of genome assembly curation. To fill this gap, we developed software to assess, filter, and transfer annotation and convert a draft genome assembly and annotation set into the NCBI annotation table (.tbl) format, facilitating submission to the NCBI Genome Assembly database. This software has no dependencies, is compatible across platforms, and utilizes a simple command to perform a variety of simple and complex post-analysis, pre-NCBI submission WGS project tasks. The Genome Annotation Generator is a consistent and user-friendly bioinformatics tool that can be used to generate a .tbl file that is consistent with the NCBI submission pipeline. The Genome Annotation Generator achieves the goal of providing a publicly available tool that will facilitate the submission of annotated genome assemblies to the NCBI. It is useful for any individual researcher or research group that wishes to submit a genome assembly of their study system to the NCBI.
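
    For readers unfamiliar with the target format, the sketch below writes a minimal NCBI feature table (.tbl) entry for a single gene. It reflects the general layout of the format (a ">Feature" header, tab-separated coordinate/key lines, and tab-indented qualifiers); the coordinates, locus tag, and product names are made-up examples, and this is not the Genome Annotation Generator's own implementation.

      # Minimal sketch of writing an NCBI feature table (.tbl) entry for one gene.
      # Coordinates, locus tag and product name are made-up; this illustrates the
      # general .tbl layout, not the Genome Annotation Generator's code.

      def tbl_feature(start, stop, key, qualifiers):
          lines = [f"{start}\t{stop}\t{key}"]
          lines += [f"\t\t\t{k}\t{v}" for k, v in qualifiers]
          return lines

      def write_tbl(seq_id, features, path):
          with open(path, "w") as fh:
              fh.write(f">Feature {seq_id}\n")
              for start, stop, key, quals in features:
                  fh.write("\n".join(tbl_feature(start, stop, key, quals)) + "\n")

      features = [
          (1, 1200, "gene", [("locus_tag", "EXAMPLE_0001")]),
          (1, 1200, "mRNA", [("product", "hypothetical protein")]),
          (1, 1200, "CDS",  [("product", "hypothetical protein"),
                             ("protein_id", "gnl|center|EXAMPLE_0001")]),
      ]
      write_tbl("contig_1", features, "example.tbl")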

  9. Genome Annotation Generator: a simple tool for generating and correcting WGS annotation tables for NCBI submission

    PubMed Central

    Hall, Brian; Derego, Theodore; Bremer, Forest T; Cannoles, Kyle

    2018-01-01

    Abstract Background One of the most overlooked, yet critical, components of a whole genome sequencing (WGS) project is the submission and curation of the data to a genomic repository, most commonly the National Center for Biotechnology Information (NCBI). While large genome centers or genome groups have developed software tools for post-annotation assembly filtering, annotation, and conversion into the NCBI’s annotation table format, these tools typically require back-end setup and connection to a Structured Query Language (SQL) database and/or some knowledge of programming (Perl, Python) to implement. With WGS becoming commonplace, genome sequencing projects are moving away from the genome centers and into the ecology or biology lab, where fewer resources are present to support the process of genome assembly curation. To fill this gap, we developed software to assess, filter, and transfer annotation and convert a draft genome assembly and annotation set into the NCBI annotation table (.tbl) format, facilitating submission to the NCBI Genome Assembly database. This software has no dependencies, is compatible across platforms, and utilizes a simple command to perform a variety of simple and complex post-analysis, pre-NCBI submission WGS project tasks. Findings The Genome Annotation Generator is a consistent and user-friendly bioinformatics tool that can be used to generate a .tbl file that is consistent with the NCBI submission pipeline. Conclusions The Genome Annotation Generator achieves the goal of providing a publicly available tool that will facilitate the submission of annotated genome assemblies to the NCBI. It is useful for any individual researcher or research group that wishes to submit a genome assembly of their study system to the NCBI. PMID:29635297

  10. PC graphics generation and management tool for real-time applications

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1992-01-01

    A graphics tool was designed and developed for easy generation and management of personal computer graphics. It also provides methods and 'run-time' software for many common artificial intelligence (AI) or expert system (ES) applications.

  11. An Authoring Tool for User Generated Mobile Services

    NASA Astrophysics Data System (ADS)

    Danado, José; Davies, Marcin; Ricca, Paulo; Fensel, Anna

    Imagine what kind of applications become possible when our mobile devices not only present data but provide valuable information to other users. Users become able to instantaneously create services and to publish content and knowledge on their own mobile device, which can be discovered and accessed remotely by other mobile users in a simple way. To achieve the vision of customizable and context-aware user-generated mobile services, we present a mobile authoring tool for end-users to create, customize and deploy mobile services while on the go. This tool is designed to allow users with different levels of technical expertise to create mobile services. The paper also gives insight into the usability evaluations performed, namely user interviews and an online survey.

  12. MEAT: An Authoring Tool for Generating Adaptable Learning Resources

    ERIC Educational Resources Information Center

    Kuo, Yen-Hung; Huang, Yueh-Min

    2009-01-01

    Mobile learning (m-learning) is a new trend in the e-learning field. The learning services in m-learning environments are supported by fundamental functions, especially the content and assessment services, which need an authoring tool to rapidly generate adaptable learning resources. To fulfill this pressing demand, this study proposes an…

  13. Enzyme Function Initiative-Enzyme Similarity Tool (EFI-EST): A web tool for generating protein sequence similarity networks.

    PubMed

    Gerlt, John A; Bouvier, Jason T; Davidson, Daniel B; Imker, Heidi J; Sadkhin, Boris; Slater, David R; Whalen, Katie L

    2015-08-01

    The Enzyme Function Initiative, an NIH/NIGMS-supported Large-Scale Collaborative Project (EFI; U54GM093342; http://enzymefunction.org/), is focused on devising and disseminating bioinformatics and computational tools as well as experimental strategies for the prediction and assignment of functions (in vitro activities and in vivo physiological/metabolic roles) to uncharacterized enzymes discovered in genome projects. Protein sequence similarity networks (SSNs) are visually powerful tools for analyzing sequence relationships in protein families (H.J. Atkinson, J.H. Morris, T.E. Ferrin, and P.C. Babbitt, PLoS One 2009, 4, e4345). However, the members of the biological/biomedical community have not had access to the capability to generate SSNs for their "favorite" protein families. In this article we announce the EFI-EST (Enzyme Function Initiative-Enzyme Similarity Tool) web tool (http://efi.igb.illinois.edu/efi-est/) that is available without cost for the automated generation of SSNs by the community. The tool can create SSNs for the "closest neighbors" of a user-supplied protein sequence from the UniProt database (Option A) or of members of any user-supplied Pfam and/or InterPro family (Option B). We provide an introduction to SSNs, a description of EFI-EST, and a demonstration of the use of EFI-EST to explore sequence-function space in the OMP decarboxylase superfamily (PF00215). This article is designed as a tutorial that will allow members of the community to use the EFI-EST web tool for exploring sequence/function space in protein families. Copyright © 2015 Elsevier B.V. All rights reserved.
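
    To illustrate the underlying data structure, the sketch below builds a tiny sequence similarity network: each sequence is a node, an edge connects two sequences whose pairwise score exceeds a user-chosen threshold, and connected components give candidate clusters. The pairwise scores are made-up placeholders; EFI-EST derives its scores from all-by-all sequence comparison, which is not reproduced here.

      # Minimal sketch of building a sequence similarity network (SSN): nodes are
      # sequences, edges connect pairs whose score passes a threshold.  Scores
      # below are hypothetical placeholders.

      pairwise_scores = {            # made-up -log10(E-value)-like scores
          ("seqA", "seqB"): 85.0,
          ("seqA", "seqC"): 12.0,
          ("seqB", "seqC"): 10.0,
          ("seqC", "seqD"): 95.0,
      }
      threshold = 40.0               # tightening the threshold splits clusters

      edges = [(a, b) for (a, b), s in pairwise_scores.items() if s >= threshold]
      nodes = {n for pair in pairwise_scores for n in pair}

      # simple connected-component clustering on the thresholded network
      adjacency = {n: set() for n in nodes}
      for a, b in edges:
          adjacency[a].add(b)
          adjacency[b].add(a)

      clusters, seen = [], set()
      for n in nodes:
          if n in seen:
              continue
          stack, comp = [n], set()
          while stack:
              cur = stack.pop()
              if cur in comp:
                  continue
              comp.add(cur)
              stack.extend(adjacency[cur] - comp)
          seen |= comp
          clusters.append(sorted(comp))

      print("edges:", edges)
      print("clusters at threshold", threshold, ":", clusters)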

  14. Adaptive scallop height tool path generation for robot-based incremental sheet metal forming

    NASA Astrophysics Data System (ADS)

    Seim, Patrick; Möllensiep, Dennis; Störkle, Denis Daniel; Thyssen, Lars; Kuhlenkötter, Bernd

    2016-10-01

    Incremental sheet metal forming is an emerging process for the production of individualized products or prototypes in low batch sizes and with short times to market. In these processes, the desired shape is produced by the incremental inward motion of the workpiece-independent forming tool in the depth direction and its movement along the contour in the lateral direction. Given this way of producing the shape, tool path generation is a key factor influencing, for example, the resulting geometric accuracy, the resulting surface quality, and the working time. This paper presents an innovative tool path generation based on a commercial milling CAM package considering the surface quality and working time. This approach offers the ability to define a specific scallop height as an indicator of the surface quality for specific faces of a component. Moreover, it decreases the required working time for the production of the entire component compared to the use of a commercial software package without this adaptive approach. Different forming experiments have been performed to verify the newly developed tool path generation. Mainly, this approach serves to resolve the existing trade-off between working time and surface quality within the process of incremental sheet metal forming.
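
    The abstract treats scallop height as the per-face surface-quality target that drives the tool path spacing. As a minimal sketch of that relation, the code below uses the standard geometric approximation for a hemispherical tool on a locally flat face, h ≈ s²/(8R), and inverts it to choose the lateral step for a requested scallop height. The tool radius and targets are assumed values, and the paper's adaptive, face-specific algorithm may use a different formulation.

      # Minimal sketch relating scallop height to lateral step-over for a
      # hemispherical tool on a (locally) flat face:
      #   h ≈ s**2 / (8 * R)   =>   s ≈ sqrt(8 * R * h)
      # Textbook approximation; the paper's adaptive algorithm may differ.
      import math

      def step_over_for_scallop(tool_radius_mm, scallop_height_mm):
          """Lateral step-over that keeps the scallop at the requested height."""
          return math.sqrt(8.0 * tool_radius_mm * scallop_height_mm)

      if __name__ == "__main__":
          R = 5.0                        # mm, assumed hemispherical tool-tip radius
          for h in (0.005, 0.02, 0.05):  # mm, per-face scallop-height targets
              print(f"target h = {h:.3f} mm -> step-over = "
                    f"{step_over_for_scallop(R, h):.3f} mm")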

  15. The role of optimization in the next generation of computer-based design tools

    NASA Technical Reports Server (NTRS)

    Rogan, J. Edward

    1989-01-01

    There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.

  16. 49 CFR 40.377 - Who decides whether to issue a PIE?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Public Interest Exclusions § 40.377 Who decides whether to issue a PIE? (a) The ODAPC Director, or his or her designee, decides whether to issue a PIE. If a designee... 49 Transportation 1 2010-10-01 2010-10-01 false Who decides whether to issue a PIE? 40.377 Section...

  17. Who Should Decide How Machines Make Morally Laden Decisions?

    PubMed

    Martin, Dominic

    2017-08-01

    Who should decide how a machine will decide what to do when it is driving a car, performing a medical procedure, or, more generally, when it is facing any kind of morally laden decision? More and more, machines are making complex decisions with a considerable level of autonomy. We should be much more preoccupied by this problem than we currently are. After a series of preliminary remarks, this paper will go over four possible answers to the question raised above. First, we may claim that it is the maker of a machine that gets to decide how it will behave in morally laden scenarios. Second, we may claim that the users of a machine should decide. Third, that decision may have to be made collectively or, fourth, by other machines built for this special purpose. The paper argues that each of these approaches suffers from its own shortcomings, and it concludes by showing, among other things, which approaches should be emphasized for different types of machines, situations, and/or morally laden decisions.

  18. Deciding where to Stop Speaking

    ERIC Educational Resources Information Center

    Tydgat, Ilse; Stevens, Michael; Hartsuiker, Robert J.; Pickering, Martin J.

    2011-01-01

    This study investigated whether speakers strategically decide where to interrupt their speech once they need to stop. We conducted four naming experiments in which pictures of colored shapes occasionally changed in color or shape. Participants then merely had to stop (Experiment 1); or they had to stop and resume speech (Experiments 2-4). They…

  19. S3D: An interactive surface grid generation tool

    NASA Technical Reports Server (NTRS)

    Luh, Raymond Ching-Chung; Pierce, Lawrence E.; Yip, David

    1992-01-01

    S3D, an interactive software tool for surface grid generation, is described. S3D provides the means with which a geometry definition based either on a discretized curve set or a rectangular set can be quickly processed towards the generation of a surface grid for computational fluid dynamics (CFD) applications. This is made possible as a result of implementing commonly encountered surface gridding tasks in an environment with a highly efficient and user friendly graphical interface. Some of the more advanced features of S3D include surface-surface intersections, optimized surface domain decomposition and recomposition, and automated propagation of edge distributions to surrounding grids.

  20. 5 CFR 890.1023 - Information considered in deciding a contest.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Imposed Against Health Care Providers Permissive Debarments § 890.1023 Information considered in deciding... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false Information considered in deciding a contest. 890.1023 Section 890.1023 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED...

  1. A Tool for Model-Based Generation of Scenario-driven Electric Power Load Profiles

    NASA Technical Reports Server (NTRS)

    Rozek, Matthew L.; Donahue, Kenneth M.; Ingham, Michel D.; Kaderka, Justin D.

    2015-01-01

    Power consumption during all phases of spacecraft flight is of great interest to the aerospace community. As a result, significant analysis effort is exerted to understand the rates of electrical energy generation and consumption under many operational scenarios of the system. Previously, no standard tool existed for creating and maintaining a power equipment list (PEL) of spacecraft components that consume power, and no standard tool existed for generating power load profiles based on this PEL information during mission design phases. This paper presents the Scenario Power Load Analysis Tool (SPLAT) as a model-based systems engineering tool aiming to solve those problems. SPLAT is a plugin for MagicDraw (No Magic, Inc.) that aids in creating and maintaining a PEL, and also generates a power and temporal variable constraint set, in Maple language syntax, based on specified operational scenarios. The constraint set can be solved in Maple to show electric load profiles (i.e. power consumption from loads over time). SPLAT creates these load profiles from three modeled inputs: 1) a list of system components and their respective power modes, 2) a decomposition hierarchy of the system into these components, and 3) the specification of at least one scenario, which consists of temporal constraints on component power modes. In order to demonstrate how this information is represented in a system model, a notional example of a spacecraft planetary flyby is introduced. This example is also used to explain the overall functionality of SPLAT, and how this is used to generate electric power load profiles. Lastly, a cursory review of the usage of SPLAT on the Cold Atom Laboratory project is presented to show how the tool was used in an actual space hardware design application.
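
    As a minimal illustration of the kind of computation described above, the sketch below turns a small power equipment list (components with power modes) plus a scenario of timed mode changes into a total electric load profile. The component names, mode powers, and timeline are invented for the example; SPLAT itself operates inside MagicDraw and emits Maple constraint sets rather than Python.

      # Minimal sketch: PEL (component power modes) + timed scenario -> load profile.
      # All names, powers and times are made-up placeholders.

      pel = {                     # component -> {mode: power draw in watts}
          "transponder": {"off": 0.0, "standby": 2.0, "transmit": 35.0},
          "camera":      {"off": 0.0, "idle": 5.0, "imaging": 20.0},
          "heater":      {"off": 0.0, "on": 15.0},
      }

      # scenario: (time in minutes, component, mode), e.g. a notional flyby
      scenario = [
          (0,  "transponder", "standby"),
          (0,  "camera",      "off"),
          (0,  "heater",      "on"),
          (10, "camera",      "imaging"),
          (25, "transponder", "transmit"),
          (40, "camera",      "idle"),
          (40, "transponder", "standby"),
      ]

      def load_profile(pel, scenario, horizon_min, step_min=1):
          """Total power draw per time step, using each component's latest mode."""
          modes = {c: "off" for c in pel}
          events = sorted(scenario)
          profile, e = [], 0
          for t in range(0, horizon_min, step_min):
              while e < len(events) and events[e][0] <= t:
                  _, comp, mode = events[e]
                  modes[comp] = mode
                  e += 1
              profile.append((t, sum(pel[c][m] for c, m in modes.items())))
          return profile

      for t, watts in load_profile(pel, scenario, horizon_min=60, step_min=5):
          print(f"t = {t:3d} min : {watts:5.1f} W")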

  2. 5 CFR 890.1036 - Information considered in deciding a contest.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Imposed Against Health Care Providers Suspension § 890.1036 Information considered in deciding a contest... 5 Administrative Personnel 2 2014-01-01 2014-01-01 false Information considered in deciding a contest. 890.1036 Section 890.1036 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED...

  3. 5 CFR 890.1036 - Information considered in deciding a contest.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Imposed Against Health Care Providers Suspension § 890.1036 Information considered in deciding a contest... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Information considered in deciding a contest. 890.1036 Section 890.1036 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED...

  4. 5 CFR 890.1036 - Information considered in deciding a contest.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Imposed Against Health Care Providers Suspension § 890.1036 Information considered in deciding a contest... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false Information considered in deciding a contest. 890.1036 Section 890.1036 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED...

  5. AQME: A forensic mitochondrial DNA analysis tool for next-generation sequencing data.

    PubMed

    Sturk-Andreaggi, Kimberly; Peck, Michelle A; Boysen, Cecilie; Dekker, Patrick; McMahon, Timothy P; Marshall, Charla K

    2017-11-01

    The feasibility of generating mitochondrial DNA (mtDNA) data has expanded considerably with the advent of next-generation sequencing (NGS), specifically in the generation of entire mtDNA genome (mitogenome) sequences. However, the analysis of these data has emerged as the greatest challenge to implementation in forensics. To address this need, a custom toolkit for use in the CLC Genomics Workbench (QIAGEN, Hilden, Germany) was developed through a collaborative effort between the Armed Forces Medical Examiner System - Armed Forces DNA Identification Laboratory (AFMES-AFDIL) and QIAGEN Bioinformatics. The AFDIL-QIAGEN mtDNA Expert, or AQME, generates an editable mtDNA profile that employs forensic conventions and includes the interpretation range required for mtDNA data reporting. AQME also integrates an mtDNA haplogroup estimate into the analysis workflow, which provides the analyst with phylogenetic nomenclature guidance and a profile quality check without the use of an external tool. Supplemental AQME outputs such as nucleotide-per-position metrics, configurable export files, and an audit trail are produced to assist the analyst during review. AQME is applied to standard CLC outputs and thus can be incorporated into any mtDNA bioinformatics pipeline within CLC regardless of sample type, library preparation or NGS platform. An evaluation of AQME was performed to demonstrate its functionality and reliability for the analysis of mitogenome NGS data. The study analyzed Illumina mitogenome data from 21 samples (including associated controls) of varying quality and sample preparations with the AQME toolkit. A total of 211 tool edits were automatically applied to 130 of the 698 total variants reported in an effort to adhere to forensic nomenclature. Although additional manual edits were required for three samples, supplemental tools such as mtDNA haplogroup estimation assisted in identifying and guiding these necessary modifications to the AQME-generated profile. Along

  6. Developing next-generation telehealth tools and technologies: patients, systems, and data perspectives.

    PubMed

    Ackerman, Michael J; Filart, Rosemarie; Burgess, Lawrence P; Lee, Insup; Poropatich, Ronald K

    2010-01-01

    The major goals of telemedicine today are to develop next-generation telehealth tools and technologies to enhance healthcare delivery to medically underserved populations using telecommunication technology, to increase access to medical specialty services while decreasing healthcare costs, and to provide training of healthcare providers, clinical trainees, and students in health-related fields. Key drivers for these tools and technologies are the need and interest to collaborate among telehealth stakeholders, including patients, patient communities, research funders, researchers, healthcare services providers, professional societies, industry, healthcare management/economists, and healthcare policy makers. In the development, marketing, adoption, and implementation of these tools and technologies, communication, training, cultural sensitivity, and end-user customization are critical pieces to the process. Next-generation tools and technologies are vehicles toward personalized medicine, extending the telemedicine model to include cell phones and Internet-based telecommunications tools for remote and home health management with video assessment, remote bedside monitoring, and patient-specific care tools with event logs, patient electronic profile, and physician note-writing capability. Telehealth is ultimately a system of systems in scale and complexity. To cover the full spectrum of dynamic and evolving needs of end-users, we must appreciate system complexity as telehealth moves toward increasing functionality, integration, interoperability, outreach, and quality of service. Toward that end, our group addressed three overarching questions: (1) What are the high-impact topics? (2) What are the barriers to progress? and (3) What roles can the National Institutes of Health and its various institutes and centers play in fostering the future development of telehealth?

  7. Career Cruising Impact on the Self Efficacy of Deciding Majors

    ERIC Educational Resources Information Center

    Smother, Anthony William

    2012-01-01

    The purpose of this study was to analyze the impact of "Career Cruising"© on the self-efficacy of deciding majors in a university setting. The self-assessment instrument "Career Cruising"© was used to measure career decision-making self-efficacy in a pre- and post-test with deciding majors. The independent…

  8. The Generation Rate of Respirable Dust from Cutting Fiber Cement Siding Using Different Tools

    PubMed Central

    Qi, Chaolong; Echt, Alan; Gressel, Michael G

    2017-01-01

    This article describes the evaluation of the generation rate of respirable dust (GAPS, defined as the mass of respirable dust generated per unit linear length cut) from cutting fiber cement siding using different tools in a laboratory testing system. We used an aerodynamic particle sizer spectrometer (APS) to continuously monitor the real-time size distributions of the dust throughout cutting tests when using a variety of tools, and calculated the generation rate of respirable dust for each testing condition using the size distribution data. The test result verifies that power shears provided an almost dust-free operation with a GAPS of 0.006 gram per meter (g m⁻¹) at the testing condition. For the same power saws, the cuts using saw blades with more teeth generated more respirable dust. Using the same blade for all four miter saws tested in this study, a positive linear correlation was found between a saw’s blade rotating speed and its dust generation rate. In addition, a circular saw running at the highest blade rotating speed of 9068 RPM generated the greatest amount of dust. All the miter saws generated less dust in the ‘chopping mode’ than in the ‘chopping and sliding’ mode. For the tested saws, GAPS consistently decreased with increases in the saw cutting feed rate and the number of boards in the stack. All the test results point out that fewer cutting interactions between the saw blade’s teeth and the siding board for a unit linear length of cut tend to result in a lower generation rate of respirable dust. These results may help guide optimal operation in practice and future tool development aimed at minimizing dust generation while producing a satisfactory cut. PMID:28395343
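
    As a small numerical illustration of the GAPS metric and of the reported positive linear relation between blade speed and dust generation, the sketch below computes GAPS from a respirable mass and cut length and fits a straight line across conditions. All the numbers are made-up placeholders; the study derived dust mass from APS size distributions, which is not reproduced here.

      # Minimal sketch of GAPS (respirable dust mass per linear meter of cut) and
      # a linear fit of GAPS versus blade rotating speed.  Numbers are made-up.
      import numpy as np

      def gaps(respirable_mass_g, cut_length_m):
          """GAPS = respirable dust mass (g) per linear meter of cut."""
          return respirable_mass_g / cut_length_m

      # hypothetical per-condition measurements: (blade rpm, mass in g, length in m)
      conditions = [(3500, 0.9, 2.4), (5000, 1.4, 2.4), (7000, 2.1, 2.4), (9000, 2.8, 2.4)]

      rpm = np.array([c[0] for c in conditions], dtype=float)
      g_per_m = np.array([gaps(c[1], c[2]) for c in conditions])

      slope, intercept = np.polyfit(rpm, g_per_m, 1)   # positive slope: more dust at higher rpm
      print("GAPS (g/m) per condition:", np.round(g_per_m, 3))
      print(f"linear fit: GAPS ≈ {slope:.2e} * rpm + {intercept:.3f}")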

  9. The Generation Rate of Respirable Dust from Cutting Fiber Cement Siding Using Different Tools.

    PubMed

    Qi, Chaolong; Echt, Alan; Gressel, Michael G

    2017-03-01

    This article describes the evaluation of the generation rate of respirable dust (GAPS, defined as the mass of respirable dust generated per unit linear length cut) from cutting fiber cement siding using different tools in a laboratory testing system. We used an aerodynamic particle sizer spectrometer (APS) to continuously monitor the real-time size distributions of the dust throughout cutting tests when using a variety of tools, and calculated the generation rate of respirable dust for each testing condition using the size distribution data. The test result verifies that power shears provided an almost dust-free operation with a GAPS of 0.006 g m⁻¹ at the testing condition. For the same power saws, the cuts using saw blades with more teeth generated more respirable dust. Using the same blade for all four miter saws tested in this study, a positive linear correlation was found between a saw's blade rotating speed and its dust generation rate. In addition, a circular saw running at the highest blade rotating speed of 9068 rpm generated the greatest amount of dust. All the miter saws generated less dust in the 'chopping mode' than in the 'chopping and sliding' mode. For the tested saws, GAPS consistently decreased with increases in the saw cutting feed rate and the number of boards in the stack. All the test results point out that fewer cutting interactions between the saw blade's teeth and the siding board for a unit linear length of cut tend to result in a lower generation rate of respirable dust. These results may help guide optimal operation in practice and future tool development aimed at minimizing dust generation while producing a satisfactory cut. Published by Oxford University Press on behalf of The British Occupational Hygiene Society 2017.

  10. Evaluating Variant Calling Tools for Non-Matched Next-Generation Sequencing Data

    NASA Astrophysics Data System (ADS)

    Sandmann, Sarah; de Graaf, Aniek O.; Karimi, Mohsen; van der Reijden, Bert A.; Hellström-Lindberg, Eva; Jansen, Joop H.; Dugas, Martin

    2017-02-01

    Valid variant calling results are crucial for the use of next-generation sequencing in clinical routine. However, there are numerous variant calling tools that usually differ in algorithms, filtering strategies, recommendations and, thus, also in the output. We evaluated eight open-source tools regarding their ability to call single nucleotide variants and short indels with allelic frequencies as low as 1% in non-matched next-generation sequencing data: GATK HaplotypeCaller, Platypus, VarScan, LoFreq, FreeBayes, SNVer, SAMtools and VarDict. We analysed two real datasets from patients with myelodysplastic syndrome, covering 54 Illumina HiSeq samples and 111 Illumina NextSeq samples. Mutations were validated by re-sequencing on the same platform, on a different platform, and by expert-based review. In addition, we considered two simulated datasets with varying coverage and error profiles, covering 50 samples each. In all cases an identical target region consisting of 19 genes (42,322 bp) was analysed. Altogether, no tool succeeded in calling all mutations. High sensitivity was always accompanied by low precision. The influence of varying coverage and background noise on variant calling was generally low. Taking everything into account, VarDict performed best. However, our results indicate that there is a need to improve reproducibility of the results in the context of multithreading.
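
    The trade-off reported above is expressed through two standard metrics. As a minimal sketch, the code below computes sensitivity (recall) and precision from validated call counts; the per-tool counts are made-up placeholders, not the study's data.

      # Minimal sketch of the evaluation metrics behind the reported trade-off:
      # sensitivity (recall) and precision from validated variant calls.
      # The per-tool confusion counts below are hypothetical.

      def sensitivity(tp, fn):
          return tp / (tp + fn)

      def precision(tp, fp):
          return tp / (tp + fp)

      # hypothetical counts per caller: (true positives, false positives,
      # false negatives) against the validated mutation list
      calls = {
          "caller_A": (92, 310, 8),    # sensitive but noisy
          "caller_B": (70, 15, 30),    # conservative
      }

      for tool, (tp, fp, fn) in calls.items():
          print(f"{tool}: sensitivity = {sensitivity(tp, fn):.2f}, "
                f"precision = {precision(tp, fp):.2f}")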

  11. Helping Youth Decide: A Workshop Guide.

    ERIC Educational Resources Information Center

    Duquette, Donna Marie; Boo, Katherine

    This guide was written to complement the publication "Helping Youth Decide," a manual designed to help parents develop effective parent-child communication and help their children make responsible decisions during the adolescent years. The workshop guide is intended to assist people who work with families to provide additional information and…

  12. TrawlerWeb: an online de novo motif discovery tool for next-generation sequencing datasets.

    PubMed

    Dang, Louis T; Tondl, Markus; Chiu, Man Ho H; Revote, Jerico; Paten, Benedict; Tano, Vincent; Tokolyi, Alex; Besse, Florence; Quaife-Ryan, Greg; Cumming, Helen; Drvodelic, Mark J; Eichenlaub, Michael P; Hallab, Jeannette C; Stolper, Julian S; Rossello, Fernando J; Bogoyevitch, Marie A; Jans, David A; Nim, Hieu T; Porrello, Enzo R; Hudson, James E; Ramialison, Mirana

    2018-04-05

    A strong focus of the post-genomic era is mining of the non-coding regulatory genome in order to unravel the function of regulatory elements that coordinate gene expression (Nat 489:57-74, 2012; Nat 507:462-70, 2014; Nat 507:455-61, 2014; Nat 518:317-30, 2015). Whole-genome approaches based on next-generation sequencing (NGS) have provided insight into the genomic location of regulatory elements throughout different cell types, organs and organisms. These technologies are now widespread and commonly used in laboratories from various fields of research. This highlights the need for fast and user-friendly software tools dedicated to extracting cis-regulatory information contained in these regulatory regions; for instance transcription factor binding site (TFBS) composition. Ideally, such tools should not require prior programming knowledge to ensure they are accessible for all users. We present TrawlerWeb, a web-based version of the Trawler_standalone tool (Nat Methods 4:563-5, 2007; Nat Protoc 5:323-34, 2010), to allow for the identification of enriched motifs in DNA sequences obtained from next-generation sequencing experiments in order to predict their TFBS composition. TrawlerWeb is designed for online queries with standard options common to web-based motif discovery tools. In addition, TrawlerWeb provides three unique new features: 1) TrawlerWeb allows the input of BED files directly generated from NGS experiments, 2) it automatically generates an input-matched biologically relevant background, and 3) it displays resulting conservation scores for each instance of the motif found in the input sequences, which assists the researcher in prioritising the motifs to validate experimentally. Finally, to date, this web-based version of Trawler_standalone remains the fastest online de novo motif discovery tool compared to other popular web-based software, while generating predictions with high accuracy. TrawlerWeb provides users with a fast, simple and easy-to-use web

  13. Forecasting municipal solid waste generation using prognostic tools and regression analysis.

    PubMed

    Ghinea, Cristina; Drăgoi, Elena Niculina; Comăniţă, Elena-Diana; Gavrilescu, Marius; Câmpean, Teofil; Curteanu, Silvia; Gavrilescu, Maria

    2016-11-01

    For adequate planning of waste management systems, accurate forecasting of waste generation is an essential step, since various factors can affect waste trends. Predictive and prognostic models are useful tools, providing reliable support for decision-making processes. In this paper, indicators such as the number of residents, population age, urban life expectancy, and total municipal solid waste were used as input variables in prognostic models in order to predict the amount of solid waste fractions. We applied the Waste Prognostic Tool, regression analysis and time series analysis to forecast municipal solid waste generation and composition for the Iasi, Romania case study. Regression equations were determined for six solid waste fractions (paper, plastic, metal, glass, biodegradable and other waste). Accuracy measures were calculated and the results showed that the S-curve trend model is the most suitable for municipal solid waste (MSW) prediction. Copyright © 2016 Elsevier Ltd. All rights reserved.
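
    As a minimal sketch of the S-curve trend model the study found most suitable, the code below fits a logistic curve to a yearly waste series and extrapolates it. The series is synthetic (generated from the model with noise) and the parameter names are illustrative; it is not the Iasi data or the study's exact procedure.

      # Minimal sketch: fit an S-curve (logistic) trend to a yearly municipal
      # solid waste series and forecast forward.  The series is synthetic.
      import numpy as np
      from scipy.optimize import curve_fit

      def s_curve(t, K, r, t0):
          """Logistic trend: saturation level K, growth rate r, midpoint year t0."""
          return K / (1.0 + np.exp(-r * (t - t0)))

      years = np.arange(2000, 2016, dtype=float)
      rng = np.random.default_rng(0)
      waste_kt = s_curve(years, 130.0, 0.45, 2006.0) + rng.normal(0.0, 1.5, years.size)

      popt, _ = curve_fit(s_curve, years, waste_kt, p0=[120.0, 0.3, 2005.0])
      K, r, t0 = popt
      print(f"fitted S-curve: K = {K:.1f} kt/yr, r = {r:.3f} /yr, midpoint = {t0:.1f}")
      print("forecast for 2020:", round(s_curve(2020.0, *popt), 1), "kt/yr")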

  14. Human/autonomy collaboration for the automated generation of intelligence products

    NASA Astrophysics Data System (ADS)

    DiBona, Phil; Schlachter, Jason; Kuter, Ugur; Goldman, Robert

    2017-05-01

    Intelligence Analysis remains a manual process despite trends toward autonomy in information processing. Analysts need agile decision-support tools that can adapt to the evolving information needs of the mission, allowing the analyst to pose novel analytic questions. Our research enables analysts to provide only a constrained English specification of what the intelligence product should be. Using hierarchical task network (HTN) planning, the autonomy discovers, decides, and generates a workflow of algorithms to create the intelligence product. Therefore, the analyst can quickly and naturally communicate to the autonomy what information product is needed, rather than how to create it.
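
    To show the flavor of HTN planning in this setting, the sketch below recursively decomposes a high-level product request into a flat workflow of primitive algorithm steps. The task names, method library, and decomposition are invented for illustration; this is not the system's planner.

      # Toy hierarchical task network (HTN) decomposition: a compound request is
      # expanded by "methods" into primitive steps that form a workflow.
      # Task and method names are made-up placeholders.

      methods = {  # compound task -> ordered list of subtasks
          "produce_activity_report": ["collect_sources", "analyze", "render_product"],
          "collect_sources": ["query_imagery", "query_signals"],
          "analyze": ["geolocate_entities", "correlate_tracks"],
      }
      primitive = {"query_imagery", "query_signals", "geolocate_entities",
                   "correlate_tracks", "render_product"}

      def plan(task):
          """Return a flat workflow (list of primitive steps) for a task."""
          if task in primitive:
              return [task]
          if task not in methods:
              raise ValueError(f"no method decomposes task {task!r}")
          workflow = []
          for sub in methods[task]:
              workflow.extend(plan(sub))
          return workflow

      print(plan("produce_activity_report"))
      # ['query_imagery', 'query_signals', 'geolocate_entities',
      #  'correlate_tracks', 'render_product']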

  15. Ramping and Uncertainty Prediction Tool - Analysis and Visualization of Wind Generation Impact on Electrical Grid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etingov, Pavel; Makarov, PNNL Yuri; Subbarao, PNNL Kris

    RUT software is designed for use by the Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. This deficiency of balancing resources can cause serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty including wind, solar and load forecast errors. The tool evaluates required generation for a worst-case scenario, with a user-specified confidence level.
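
    As a minimal sketch of a confidence-based balancing requirement, the code below samples combined load, wind and solar forecast errors for an upcoming hour and reports the capacity needed at a user-specified confidence level. The error standard deviations are assumed values, and the Monte Carlo quantile shown here is only one simple stand-in for the tool's probabilistic algorithm.

      # Minimal sketch: balancing capacity requirement at a chosen confidence
      # level, from sampled load/wind/solar forecast errors.  Error magnitudes
      # are made-up placeholders; this is not the RUT algorithm.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000                             # Monte Carlo samples of the next hour

      load_err  = rng.normal(0.0, 120.0, n)   # MW, load forecast error
      wind_err  = rng.normal(0.0, 200.0, n)   # MW, wind forecast error
      solar_err = rng.normal(0.0,  80.0, n)   # MW, solar forecast error

      # positive net-load error = more balancing-up capacity needed than forecast
      net_err = load_err - wind_err - solar_err

      confidence = 0.95
      cap_up   = np.quantile(net_err, confidence)         # MW of upward balancing
      cap_down = -np.quantile(net_err, 1.0 - confidence)  # MW of downward balancing
      print(f"extra capacity needed at {confidence:.0%} confidence: "
            f"{cap_up:.0f} MW up / {cap_down:.0f} MW down")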

  16. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school boards decide... 25 Indians 1 2012-04-01 2011-04-01 true Who decides how Language Development funds can be used? 39...

  17. Video Games as a Training Tool to Prepare the Next Generation of Cyber Warriors

    DTIC Science & Technology

    2014-10-01

    Report documentation page and table-of-contents fragments; recoverable section headings include the cybersecurity workforce shortage, the case that greater cybersecurity education is needed, and how video games can be effective learning tools.

  18. What are people willing to pay for whole-genome sequencing information, and who decides what they receive?

    PubMed

    Marshall, Deborah A; Gonzalez, Juan Marcos; Johnson, F Reed; MacDonald, Karen V; Pugh, Amy; Douglas, Michael P; Phillips, Kathryn A

    2016-12-01

    Whole-genome sequencing (WGS) can be used as a powerful diagnostic tool as well as for screening, but it may lead to anxiety, unnecessary testing, and overtreatment. Current guidelines suggest reporting clinically actionable secondary findings when diagnostic testing is performed. We examined preferences for receiving WGS results. A US nationally representative survey (n = 410 adults) was used to rank preferences for who decides (an expert panel, your doctor, you) which WGS results are reported. We estimated the value of information about variants with varying levels of clinical usefulness by using willingness-to-pay contingent valuation questions. The results were as follows: 43% preferred to decide themselves what information is included in the WGS report. 38% (95% confidence interval (CI): 33-43%) would not pay for actionable variants, and 3% (95% CI: 1-5%) would pay more than $1,000. 55% (95% CI: 50-60%) would not pay for variants for which medical treatment is currently unclear, and 7% (95% CI: 5-9%) would pay more than $400. Most people prefer to decide what WGS results are reported. Despite valuing actionable information more, some respondents perceive that genetic information could negatively impact them. Preference heterogeneity for WGS information should be considered in the development of policies, particularly to integrate patient preferences with personalized medicine and shared decision making. Genet Med 18(12):1295-1302.

  19. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  20. E-DECIDER Disaster Response and Decision Support Cyberinfrastructure: Technology and Challenges

    NASA Astrophysics Data System (ADS)

    Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.

    2014-12-01

    Timely delivery of critical information to decision makers during a disaster is essential to response and damage assessment. Key issues for an efficient emergency response after a natural disaster include rapidly processing and delivering this critical information to emergency responders and reducing human intervention as much as possible. Essential elements of information necessary to achieve situational awareness are often generated by a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. A key challenge is that the current state of practice does not easily support information sharing and technology interoperability. NASA E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) has worked with the California Earthquake Clearinghouse and its partners to address these issues and challenges by adopting the XChangeCore Web Service Data Orchestration technology and participating in several earthquake response exercises. The E-DECIDER decision support system provides rapid delivery of advanced situational awareness data products to operations centers and emergency responders in the field. Remote sensing and hazard data, model-based map products, information from simulations, damage detection, and crowdsourcing are integrated into a single geospatial view and delivered through a service-oriented architecture for improved decision-making and then directly to mobile devices of responders. By adopting a Service Oriented Architecture based on Open Geospatial Consortium standards, the system provides an extensible, comprehensive framework for geospatial data processing and distribution on Cloud platforms and other distributed environments. While the Clearinghouse and its partners are not first responders, they do support the emergency response community by providing information about the damaging effects of earthquakes. It is critical for decision makers to maintain a situational awareness

  1. Genomic tools to improve progress and preserve variation for future generations

    USDA-ARS?s Scientific Manuscript database

    Use of genomic tools has greatly decreased generation intervals and increased genetic progress in dairy cattle, but faster selection cycles can also increase rates of inbreeding per unit of time. Average pedigree inbreeding of Holstein cows increased from 4.6% in 2000 to 5.6% in 2009 to 6.6% in 201...

  2. Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces

    DTIC Science & Technology

    2012-03-01

    with Remotely Piloted Aircraft (RPA) has resulted in the need for a platform to evaluate interface design. The Vigilant Spirit Control Station (VSCS ...Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and...time of the original VSCS interface. These results revealed the effectiveness of the tool and demonstrated in the design of future generation

  3. Deciding How To Decide: Decision Making in Schools. A Presenter's Guide. Research Based Training for School Administrators. Revised.

    ERIC Educational Resources Information Center

    Oregon Univ., Eugene. Center for Educational Policy and Management.

    This workshop presenter's guide is intended for use by administrators in training one another in the Project Leadership program developed by the Association of California School Administrators (ACSA). The purposes of the guide are: to provide administrators with a framework for deciding when others (particularly subordinates) should participate in…

  4. NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL

    EPA Science Inventory

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than th...
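
    For readers unfamiliar with the finite-difference, time-domain (FDTD) method mentioned above, the sketch below runs a minimal 1-D Yee-style update loop in free space with a Gaussian source. It is a generic textbook illustration in normalized units with simple reflecting boundaries, not the borehole or logging-tool model from the study.

      # Minimal 1-D FDTD (Yee) update loop: free space, normalized units,
      # soft Gaussian source, reflecting boundaries.  Generic illustration only.
      import numpy as np

      nz, nsteps = 400, 600
      ez = np.zeros(nz)          # electric field
      hy = np.zeros(nz - 1)      # magnetic field (staggered half-cell)
      courant = 0.5              # Courant number (stable for <= 1 in 1-D)
      src = nz // 4              # source location

      for n in range(nsteps):
          # update H from the spatial difference (curl) of E
          hy += courant * (ez[1:] - ez[:-1])
          # update E (interior points) from the spatial difference of H
          ez[1:-1] += courant * (hy[1:] - hy[:-1])
          # soft Gaussian source injected at one cell
          ez[src] += np.exp(-((n - 60) / 20.0) ** 2)

      print("peak |Ez| after propagation:", float(np.max(np.abs(ez))))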

  5. Generative Text Sets: Tools for Negotiating Critically Inclusive Early Childhood Teacher Education Pedagogical Practices

    ERIC Educational Resources Information Center

    Souto-Manning, Mariana

    2017-01-01

    Through a case study, this article sheds light onto generative text sets as tools for developing and enacting critically inclusive early childhood teacher education pedagogies. In doing so, it positions teaching and learning processes as sociocultural, historical, and political acts as it inquires into the use of generative text sets in one early…

  6. The Society-Deciders Model and Fairness in Nations

    NASA Astrophysics Data System (ADS)

    Flomenbom, Ophir

    2015-05-01

    Modeling the dynamics in nations from economic and sociological perspectives is a central theme in economics and sociology. Accurate models can make predictions and therefore help all the world's citizens. Yet, recent years have shown that current models fall short. Here, we develop a dynamical society-deciders model that can explain the stability in a nation, based on concepts from dynamics, ecology and socio-econo-physics; a nation has two groups that interconnect, the deciders and the society. We show that a nation is either stable or it collapses. This depends on just two coefficients that we relate to sociological and economic indicators. We define a new socio-economic indicator, fairness. Fairness can measure the stability in a nation and how probable a change favoring the society is. We compute fairness among all the world's nations. Interestingly, in comparison with other indicators, fairness shows that the USA loses its rank among Western democracies, India is the best among the 15 most populated nations, and Egypt, Libya and Tunisia have significantly improved their rankings as a result of recent revolutions, further increasing the probability of additional positive changes. Within the model, long-lasting crises are solved not with increased governmental spending or cuts, but with regulations that reduce the stability of the deciders, namely by increasing fairness while, for example, shifting wealth in the direction of the people, and therefore further increasing opportunities.

  7. Collaborative writing: Tools and tips.

    PubMed

    Eapen, Bell Raj

    2007-01-01

    The majority of technical writing is done by groups of experts, and various web-based applications have made this collaboration easy. Email exchange of word processor documents with tracked changes used to be the standard technique for collaborative writing. However, web-based tools like Google Docs and Spreadsheets have made the process fast and efficient. Various versioning tools and synchronous editors are available for those who need additional functionality. Having a group leader who decides the scheduling, communication and conflict-resolution protocols is important for successful collaboration.

  8. Trajectory Assessment and Modification Tools for Next Generation Air Traffic Management Operations

    NASA Technical Reports Server (NTRS)

    Brasil, Connie; Lee, Paul; Mainini, Matthew; Lee, Homola; Lee, Hwasoo; Prevot, Thomas; Smith, Nancy

    2011-01-01

    This paper reviews three Next Generation Air Transportation System (NextGen)-based, high-fidelity air traffic control human-in-the-loop (HITL) simulations, with a focus on the expected requirement of enhanced automated trajectory assessment and modification tools to support future air traffic flow management (ATFM) planning positions. The simulations were conducted at the National Aeronautics and Space Administration (NASA) Ames Research Center's Airspace Operations Laboratory (AOL) in 2009 and 2010. The test airspace for all three simulations assumed the mid-term NextGen En-Route high-altitude environment utilizing high-altitude sectors from the Kansas City and Memphis Air Route Traffic Control Centers. Trajectory assessment, modification and coordination decision support tools were developed at the AOL in order to perform future ATFM tasks. Overall tool usage results and user acceptability ratings were collected across three areas of NextGen operations to evaluate the tools. In addition to the usefulness and usability feedback, feasibility issues, benefits, and future requirements were also addressed. Overall, the tool sets were rated very useful and usable, and many elements of the tools received high scores and were used frequently and successfully. Tool utilization results in all three HITLs showed both user and system benefits including better airspace throughput, reduced controller workload, and highly effective communication protocols in both full Data Comm and mixed-equipage environments.

  9. ReSeqTools: an integrated toolkit for large-scale next-generation sequencing based resequencing analysis.

    PubMed

    He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z

    2013-12-04

    Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.

  10. Criteria for deciding about forestry research programs

    Treesearch

    Robert Z. Callaham

    1981-01-01

    In early 1979, the Forest Service, U.S. Department of Agriculture, was required to decide several significant issues affecting its future research program. These decisions were in response to requirements of the Forest and Rangeland Renewable Resources Planning Act of 1974 (RPA). The decisions required information that was neither available nor assembled. Most...

  11. 20 CFR 416.1881 - Deciding whether someone is your parent or stepparent.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Deciding whether someone is your parent or... SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered Your Parent § 416.1881 Deciding whether someone is your parent or stepparent. (a) We consider your parent to be— (1) Your natural...

  12. 48 CFR 750.7106 - Standards for deciding cases.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Standards for deciding cases. 750.7106 Section 750.7106 Federal Acquisition Regulations System AGENCY FOR INTERNATIONAL DEVELOPMENT CONTRACT MANAGEMENT EXTRAORDINARY CONTRACTUAL ACTIONS Extraordinary Contractual Actions To Protect...

  13. 13 CFR 124.1008 - When will SBA not decide an SDB protest?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false When will SBA not decide an SDB... SDB protest? (a) SBA will not decide a protest as to disadvantaged status of any concern other than... protested concern's circumstances have materially changed since SBA certified it as an SDB, or that the...

  14. 13 CFR 124.1008 - When will SBA not decide an SDB protest?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false When will SBA not decide an SDB... SDB protest? (a) SBA will not decide a protest as to disadvantaged status of any concern other than... protested concern's circumstances have materially changed since SBA certified it as an SDB, or that the...

  15. 13 CFR 124.1008 - When will SBA not decide an SDB protest?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false When will SBA not decide an SDB... SDB protest? (a) SBA will not decide a protest as to disadvantaged status of any concern other than... protested concern's circumstances have materially changed since SBA certified it as an SDB, or that the...

  16. 13 CFR 124.1008 - When will SBA not decide an SDB protest?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false When will SBA not decide an SDB... SDB protest? (a) SBA will not decide a protest as to disadvantaged status of any concern other than... protested concern's circumstances have materially changed since SBA certified it as an SDB, or that the...

  17. 13 CFR 124.1008 - When will SBA not decide an SDB protest?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false When will SBA not decide an SDB... SDB protest? (a) SBA will not decide a protest as to disadvantaged status of any concern other than... protested concern's circumstances have materially changed since SBA certified it as an SDB, or that the...

  18. World wide matching of registration metrology tools of various generations

    NASA Astrophysics Data System (ADS)

    Laske, F.; Pudnos, A.; Mackey, L.; Tran, P.; Higuchi, M.; Enkrich, C.; Roeth, K.-D.; Schmidt, K.-H.; Adam, D.; Bender, J.

    2008-10-01

    Turnaround time/cycle time is a key success criterion in the semiconductor photomask business. Therefore, global mask suppliers typically allocate workloads based on fab capability and utilization capacity. From a logistical point of view, the manufacturing location of a photomask should be transparent to the customer (mask user). Matching capability of production equipment, and especially of metrology tools, is considered a key enabler to guarantee cross-site manufacturing flexibility. Toppan, with manufacturing sites in eight countries worldwide, has an on-going program to match the registration metrology systems of all its production sites. This allows for manufacturing flexibility and risk mitigation. In cooperation with Vistec Semiconductor Systems, Toppan has recently completed a program to match the Vistec LMS IPRO systems at all production sites worldwide. Vistec has developed a new software feature which allows for significantly improved matching of LMS IPRO(x) registration metrology tools of various generations. We will report on the results of the global matching campaign of several of the leading Toppan sites.

  19. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    NASA Astrophysics Data System (ADS)

    Battaglieri, M.; Briscoe, B. J.; Celentano, A.; Chung, S.-U.; D'Angelo, A.; De Vita, R.; Döring, M.; Dudek, J.; Eidelman, S.; Fegan, S.; Ferretti, J.; Filippi, A.; Fox, G.; Galata, G.; García-Tecocoatzi, H.; Glazier, D. I.; Grube, B.; Hanhart, C.; Hoferichter, M.; Hughes, S. M.; Ireland, D. G.; Ketzer, B.; Klein, F. J.; Kubis, B.; Liu, B.; Masjuan, P.; Mathieu, V.; McKinnon, B.; Mitchel, R.; Nerling, F.; Paul, S.; Peláez, J. R.; Rademacker, J.; Rizzo, A.; Salgado, C.; Santopinto, E.; Sarantsev, A. V.; Sato, T.; Schlüter, T.; da Silva, M. L. L.; Stankovic, I.; Strakovsky, I.; Szczepaniak, A.; Vassallo, A.; Walford, N. K.; Watts, D. P.; Zana, L.

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  20. Analysis Tools for Next-Generation Hadron Spectroscopy Experiments

    DOE PAGES

    Battaglieri, Marco; Briscoe, William; Celentano, Andrea; ...

    2015-01-01

    The series of workshops on New Partial-Wave Analysis Tools for Next-Generation Hadron Spectroscopy Experiments was initiated with the ATHOS 2012 meeting, which took place in Camogli, Italy, June 20-22, 2012. It was followed by ATHOS 2013 in Kloster Seeon near Munich, Germany, May 21-24, 2013. The third, ATHOS3, meeting is planned for April 13-17, 2015 at The George Washington University Virginia Science and Technology Campus, USA. The workshops focus on the development of amplitude analysis tools for meson and baryon spectroscopy, and complement other programs in hadron spectroscopy organized in the recent past including the INT-JLab Workshop on Hadron Spectroscopy in Seattle in 2009, the International Workshop on Amplitude Analysis in Hadron Spectroscopy at the ECT*-Trento in 2011, the School on Amplitude Analysis in Modern Physics in Bad Honnef in 2011, the Jefferson Lab Advanced Study Institute Summer School in 2012, and the School on Concepts of Modern Amplitude Analysis Techniques in Flecken-Zechlin near Berlin in September 2013. The aim of this document is to summarize the discussions that took place at the ATHOS 2012 and ATHOS 2013 meetings. We do not attempt a comprehensive review of the field of amplitude analysis, but offer a collection of thoughts that we hope may lay the ground for such a document.

  1. A Useful Laboratory Tool

    ERIC Educational Resources Information Center

    Johnson, Samuel A.; Tutt, Tye

    2008-01-01

    Recently, a high school Science Club generated a large number of questions involving temperature. Therefore, they decided to construct a thermal gradient apparatus in order to conduct a wide range of experiments beyond the standard "cookbook" labs. They felt that this apparatus could be especially useful in future ninth-grade biology classes, in…

  2. Case-Based Pedagogy Using Student-Generated Vignettes: A Pre-Service Intercultural Awareness Tool

    ERIC Educational Resources Information Center

    Cournoyer, Amy

    2010-01-01

    This qualitative study investigated the effectiveness of case-based pedagogy as an instructional tool aimed at increasing cultural awareness and competence in the preparation of 18 pre-service and in-service students enrolled in an Intercultural Education course. Each participant generated a vignette based on an instructional challenge identified…

  3. Consumer Economics, Book I [and] Book II. DECIDE.

    ERIC Educational Resources Information Center

    Huffman, Ruth E.; And Others

    This module, Consumer Economics, is one of five from Project DECIDE, which was created to design, develop, write, and implement materials to provide adult basic education administrators, instructors, para-professionals, and other personnel with curriculum to accompany the Indiana Adult Basic Education Curriculum Guide, "Learning for Everyday…

  4. miRanalyzer: a microRNA detection and analysis tool for next-generation sequencing experiments.

    PubMed

    Hackenberg, Michael; Sturm, Martin; Langenberger, David; Falcón-Pérez, Juan Manuel; Aransay, Ana M

    2009-07-01

    Next-generation sequencing now allows the sequencing of small RNA molecules and the estimation of their expression levels. Consequently, there will be a high demand for bioinformatics tools to cope with the several gigabytes of sequence data generated in each single deep-sequencing experiment. Given this scenario, we developed miRanalyzer, a web server tool for the analysis of deep-sequencing experiments for small RNAs. The web server tool requires a simple input file containing a list of unique reads and their copy numbers (expression levels). Using these data, miRanalyzer (i) detects all known microRNA sequences annotated in miRBase, (ii) finds all perfect matches against other libraries of transcribed sequences and (iii) predicts new microRNAs. The prediction of new microRNAs is an especially important point as there are many species with very few known microRNAs. Therefore, we implemented a highly accurate machine learning algorithm for the prediction of new microRNAs that reaches AUC values of 97.9% and recall values of up to 75% on unseen data. The web tool summarizes all the described steps in a single output page, which provides a comprehensive overview of the analysis, adding links to more detailed output pages for each analysis module. miRanalyzer is available at http://web.bioinformatics.cicbiogune.es/microRNA/.
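
    As a concrete illustration of the input described above (unique reads plus copy numbers), the short Python sketch below collapses a FASTQ file of small-RNA reads into that form. The file names and the tab-separated output layout are assumptions for illustration, not miRanalyzer's documented format.

        from collections import Counter
        import gzip

        def collapse_reads(fastq_path, out_path):
            """Count identical read sequences and write 'sequence<TAB>copy_number' lines."""
            counts = Counter()
            opener = gzip.open if fastq_path.endswith(".gz") else open
            with opener(fastq_path, "rt") as fq:
                for i, line in enumerate(fq):
                    if i % 4 == 1:                  # the 2nd of every 4 FASTQ lines is the sequence
                        counts[line.strip()] += 1
            with open(out_path, "w") as out:
                for seq, n in counts.most_common():
                    out.write(f"{seq}\t{n}\n")

        collapse_reads("small_rna_reads.fastq", "unique_reads.txt")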

  5. Power and ambivalence in intergenerational communication: Deciding to institutionalize in Shanghai.

    PubMed

    Chen, Lin

    2017-04-01

    China's tradition of taking care of one's aging parents continues to evolve, as evidenced by the growth in nursing home residents in Shanghai. However, how these families make the decision to institutionalize remains unclear. To fill this gap, this study draws on power relations to examine communication dynamics when the oldest-old and their adult children decide to institutionalize. This study used a phenomenological approach. Twelve dyads of matched elderly residents and their children participated in face-to-face, in-depth interviews (N=24). The format and content of intergenerational communication indicated that both conflicts and compromises took place. Adult children achieved greater decision-making power than their frail parents, which evoked older adults' ambivalent feelings. A discrepancy in perceived filial piety between generations also emerged. These dynamics of caregiving decision-making offer insight into evolving filial piety in urban China. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2010-04-01 2010-04-01 false Who decides how Language Development funds can be used...

  7. Decided and Undecided Students: Career Self-Efficacy, Negative Thinking, and Decision-Making Difficulties

    ERIC Educational Resources Information Center

    Bullock-Yowell, Emily; McConnell, Amy E.; Schedin, Emily A.

    2014-01-01

    The career concern differences between undecided and decided college students (N = 223) are examined. Undecided college students (n = 83) reported lower career decision-making self-efficacy, higher incidences of negative career thoughts, and more career decision-making difficulties than their decided peers (n = 143). Results reveal that undecided…

  8. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2014-04-01 2014-04-01 false Who decides how Language Development funds can be used...

  9. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2013-04-01 2013-04-01 false Who decides how Language Development funds can be used...

  10. 25 CFR 39.133 - Who decides how Language Development funds can be used?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 1 2011-04-01 2011-04-01 false Who decides how Language Development funds can be used... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school...

  11. GALEN: a third generation terminology tool to support a multipurpose national coding system for surgical procedures.

    PubMed

    Trombert-Paviot, B; Rodrigues, J M; Rogers, J E; Baud, R; van der Haring, E; Rassinoux, A M; Abrial, V; Clavel, L; Idir, H

    2000-09-01

    Generalised Architecture for Languages, Encyclopedias and Nomenclatures in medicine (GALEN) has developed a new generation of terminology tools based on a language-independent model describing the semantics and allowing computer processing and multiple reuses, as well as natural language understanding systems applications, to facilitate the sharing and maintenance of consistent medical knowledge. During the European Union 4th Framework Programme project GALEN-IN-USE, and later on within two contracts with the national health authorities, we applied the modelling and the tools to the development of a new multipurpose coding system for surgical procedures named CCAM in a minority-language country, France. On one hand, we contributed to a language-independent knowledge repository and multilingual semantic dictionaries for multicultural Europe. On the other hand, we support the traditional process for creating a new coding system in medicine, which is very labour-intensive, with artificial intelligence tools using a medically oriented recursive ontology and natural language processing. We used an integrated software package named CLAW (for classification workbench) to process French professional medical language rubrics produced by the national colleges of surgeons' domain experts into intermediate dissections and into the GRAIL reference ontology model representation. From this language-independent concept model representation, on one hand, we generate controlled French natural language with the LNAT natural language generator to support the finalization of the linguistic labels (first generation) in relation with the meanings of the conceptual system structure. On the other hand, the CLAW classification manager proves very powerful for retrieving the initial domain experts' rubrics list with different categories of concepts (second generation) within a semantic structured representation (third generation), bridging to the detailed terminology of the electronic patient record.

  12. Destination memory for self-generated actions.

    PubMed

    El Haj, Mohamad

    2016-10-01

    There is a substantial body of literature showing memory enhancement for self-generated information in normal aging. The present paper investigated this outcome for destination memory or memory for outputted information. In Experiment 1, younger adults and older adults had to place (self-generated actions) and observe an experimenter placing (experiment-generated actions) items into two different destinations (i.e., a black circular box and a white square box). On a subsequent recognition task, the participants had to decide into which box each item had originally been placed. These procedures showed better destination memory for self- than experimenter-generated actions. In Experiment 2, destination and source memory were assessed for self-generated actions. Younger adults and older adults had to place items into the two boxes (self-generated actions), take items out of the boxes (self-generated actions), and observe an experimenter taking items out of the boxes (experiment-generated actions). On a subsequent recognition task, they had to decide into which box (destination memory)/from which box (source memory) each item had originally been placed/taken. For both populations, source memory was better than destination memory for self-generated actions, and both were better than source memory for experimenter-generated actions. Taken together, these findings highlight the beneficial effect of self-generation on destination memory in older adults.

  13. Chimera Grid Tools

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert

    2005-01-01

    Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.

  14. 75 FR 62482 - List of Nonconforming Vehicles Decided To Be Eligible for Importation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-12

    ... [Docket No. NHTSA-2010-0125; Notice 2] List of Nonconforming Vehicles Decided To Be Eligible for... Register on Tuesday, September 21, 2010, (75 FR 57396) that revised the list of vehicles not originally manufactured to conform to the Federal Motor Vehicle Safety Standards (FMVSS) that NHTSA has decided to be...

  15. A Planning Tool for Estimating Waste Generated by a Radiological Incident and Subsequent Decontamination Efforts - 13569

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boe, Timothy; Lemieux, Paul; Schultheisz, Daniel

    2013-07-01

    Management of debris and waste from a wide-area radiological incident would probably constitute a significant percentage of the total remediation cost and effort. The U.S. Environmental Protection Agency's (EPA's) Waste Estimation Support Tool (WEST) is a unique planning tool for estimating the potential volume and radioactivity levels of waste generated by a radiological incident and subsequent decontamination efforts. The WEST was developed to support planners and decision makers by generating a first-order estimate of the quantity and characteristics of waste resulting from a radiological incident. The tool then allows the user to evaluate the impact of various decontamination/demolition strategies on the waste types and volumes generated. WEST consists of a suite of standalone applications and Esri® ArcGIS® scripts for rapidly estimating waste inventories and levels of radioactivity generated from a radiological contamination incident as a function of user-defined decontamination and demolition approaches. WEST accepts Geographic Information System (GIS) shapefiles defining contaminated areas and extent of contamination. Building stock information, including square footage, building counts, and building composition estimates, is then generated using the Federal Emergency Management Agency's (FEMA's) Hazus®-MH software. WEST then identifies outdoor surfaces based on the application of pattern recognition to overhead aerial imagery. The results from the GIS calculations are then fed into a Microsoft Excel® 2007 spreadsheet with a custom graphical user interface where the user can examine the impact of various decontamination/demolition scenarios on the quantity, characteristics, and residual radioactivity of the resulting waste streams. (authors)
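
    The sketch below gives a sense of the first-order arithmetic such a planning tool performs: building stock in the contaminated footprint is combined with a decontamination/demolition scenario to yield a waste estimate. All categories, factors, and numbers are invented for illustration and are not values or formulas from WEST itself.

        # Illustrative building stock (e.g., derived from Hazus-type data) and debris factors.
        building_stock_sqft = {"residential": 2.5e6, "commercial": 1.2e6}
        demolition_ton_per_sqft = {"residential": 0.05, "commercial": 0.08}

        def estimate_waste(demolish_fraction, strip_fraction, strip_ton_per_sqft=0.005):
            """Rough tons of waste: demolish some buildings outright, strip surfaces on others."""
            total = 0.0
            for btype, sqft in building_stock_sqft.items():
                total += sqft * demolish_fraction * demolition_ton_per_sqft[btype]  # whole-building debris
                total += sqft * strip_fraction * strip_ton_per_sqft                 # surface-removal waste
            return total

        for name, demo, strip in [("aggressive demolition", 0.30, 0.10), ("decontamination-heavy", 0.05, 0.60)]:
            print(f"{name}: ~{estimate_waste(demo, strip):,.0f} tons")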

  16. 20 CFR 670.200 - Who decides where Job Corps centers will be located?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 4 2014-04-01 2014-04-01 false Who decides where Job Corps centers will be... LABOR (CONTINUED) THE JOB CORPS UNDER TITLE I OF THE WORKFORCE INVESTMENT ACT Site Selection and Protection and Maintenance of Facilities § 670.200 Who decides where Job Corps centers will be located? (a...

  17. 5 CFR 890.1023 - Information considered in deciding a contest.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL EMPLOYEES HEALTH BENEFITS PROGRAM Administrative Sanctions Imposed Against Health Care Providers Permissive Debarments § 890.1023 Information considered in deciding...

  18. 25 CFR 162.531 - How will BIA decide whether to approve a WEEL?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve a WEEL? 162.531 Section 162.531 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Weel Approval § 162.531 How will BIA decide whether to approve a...

  19. 25 CFR 162.531 - How will BIA decide whether to approve a WEEL?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve a WEEL? 162.531 Section 162.531 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Weel Approval § 162.531 How will BIA decide whether to approve a...

  20. A generative tool for building health applications driven by ISO 13606 archetypes.

    PubMed

    Menárguez-Tortosa, Marcos; Martínez-Costa, Catalina; Fernández-Breis, Jesualdo Tomás

    2012-10-01

    The use of Electronic Healthcare Records (EHR) standards in the development of healthcare applications is crucial for achieving the semantic interoperability of clinical information. Advanced EHR standards make use of the dual model architecture, which provides a solution for clinical interoperability based on the separation of information and knowledge. However, the impact of such standards is hampered by the limited availability of tools that facilitate their usage and practical implementation. In this paper, we present an approach for the automatic generation of clinical applications for the ISO 13606 EHR standard, which is based on the dual model architecture. This generator has been designed generically, so it can be easily adapted to other dual model standards and can generate applications for multiple technological platforms. Such good properties are based on the combination of standards for the representation of generic user interfaces and model-driven engineering techniques.

  1. Generating "Random" Integers

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2011-01-01

    One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so in fact, that the author decided to organise a workshop, open both to undergraduates and…

  2. A Report Generator Volume 1

    DTIC Science & Technology

    1988-01-01

    A Generator for Natural Language Interfaces," Computational Linguistics. Vol. 11, Number 4, October-December, 1985. pp. 219-242. de Joia, A., and ... employ in order to communicate to their intended audience. Production, therefore, encompasses issues of deciding what is pertinent as well as de ... rhetorical predicates; design of a system motivated by the desire for domain and language independence, semantic connection of the generation system

  3. Galen: a third generation terminology tool to support a multipurpose national coding system for surgical procedures.

    PubMed

    Trombert-Paviot, B; Rodrigues, J M; Rogers, J E; Baud, R; van der Haring, E; Rassinoux, A M; Abrial, V; Clavel, L; Idir, H

    1999-01-01

    GALEN has developed a new generation of terminology tools based on a language-independent concept reference model using a compositional formalism allowing computer processing and multiple reuses. During the 4th Framework Programme project Galen-In-Use, we applied the modelling and the tools to the development of a new multipurpose coding system for surgical procedures (CCAM) in France. On one hand, we contributed to a language-independent knowledge repository for multicultural Europe. On the other hand, we support the traditional process for creating a new coding system in medicine, which is very labour-intensive, with artificial intelligence tools using a medically oriented recursive ontology and natural language processing. We used an integrated software package named CLAW to process French professional medical language rubrics produced by the national colleges of surgeons into intermediate dissections and into the GRAIL reference ontology model representation. From this language-independent concept model representation, on one hand, we generate controlled French natural language to support the finalization of the linguistic labels in relation with the meanings of the conceptual system structure. On the other hand, the third-generation classification manager proves very powerful for retrieving the initial professional rubrics with different categories of concepts within a semantic network.

  4. Deciding in Democracies: A Role for Thinking Skills?

    ERIC Educational Resources Information Center

    Gardner, Peter

    2014-01-01

    In societies that respect our right to decide many things for ourselves, exercising that right can be a source of anxiety. We want to make the right decisions, which is difficult when we are confronted with complex issues that are usually the preserve of specialists. But is help at hand? Are thinking skills the very things that non-specialists…

  5. The Requirements Generation System: A tool for managing mission requirements

    NASA Technical Reports Server (NTRS)

    Sheppard, Sylvia B.

    1994-01-01

    Historically, NASA's cost for developing mission requirements has been a significant part of a mission's budget. Large amounts of time have been allocated in mission schedules for the development and review of requirements by the many groups who are associated with a mission. Additionally, tracing requirements from a current document to a parent document has been time-consuming and costly. The Requirements Generation System (RGS) is a computer-supported cooperative-work tool that assists mission developers in the online creation, review, editing, tracing, and approval of mission requirements as well as in the production of requirements documents. This paper describes the RGS and discusses some lessons learned during its development.

  6. Projectile-generating explosive access tool

    DOEpatents

    Jakaboski, Juan-Carlos; Hughs, Chance G; Todd, Steven N

    2013-06-11

    A method for generating a projectile using an explosive device that can generate a projectile from the opposite side of a wall from the side where the explosive device is detonated. The projectile can be generated without breaching the wall of the structure or container. The device can optionally open an aperture in a solid wall of a structure or a container and form a high-kinetic-energy projectile from the portion of the wall removed to create the aperture.

  7. Novel integrative genomic tool for interrogating lithium response in bipolar disorder

    PubMed Central

    Hunsberger, J G; Chibane, F L; Elkahloun, A G; Henderson, R; Singh, R; Lawson, J; Cruceanu, C; Nagarajan, V; Turecki, G; Squassina, A; Medeiros, C D; Del Zompo, M; Rouleau, G A; Alda, M; Chuang, D-M

    2015-01-01

    We developed a novel integrative genomic tool called GRANITE (Genetic Regulatory Analysis of Networks Investigational Tool Environment) that can effectively analyze large complex data sets to generate interactive networks. GRANITE is an open-source tool and invaluable resource for a variety of genomic fields. Although our analysis is confined to static expression data, GRANITE has the capability of evaluating time-course data and generating interactive networks that may shed light on acute versus chronic treatment, as well as evaluating dose response and providing insight into mechanisms that underlie therapeutic versus sub-therapeutic doses or toxic doses. As a proof-of-concept study, we investigated lithium (Li) response in bipolar disorder (BD). BD is a severe mood disorder marked by cycles of mania and depression. Li is one of the most commonly prescribed and decidedly effective treatments for many patients (responders), although its mode of action is not yet fully understood, nor is it effective in every patient (non-responders). In an in vitro study, we compared vehicle versus chronic Li treatment in patient-derived lymphoblastoid cells (LCLs) (derived from either responders or non-responders) using both microRNA (miRNA) and messenger RNA gene expression profiling. We present both Li responder and non-responder network visualizations created by our GRANITE analysis in BD. We identified by network visualization that the Let-7 family is consistently downregulated by Li in both groups where this miRNA family has been implicated in neurodegeneration, cell survival and synaptic development. We discuss the potential of this analysis for investigating treatment response and even providing clinicians with a tool for predicting treatment response in their patients, as well as for providing the industry with a tool for identifying network nodes as targets for novel drug discovery. PMID:25646593

  8. Novel integrative genomic tool for interrogating lithium response in bipolar disorder.

    PubMed

    Hunsberger, J G; Chibane, F L; Elkahloun, A G; Henderson, R; Singh, R; Lawson, J; Cruceanu, C; Nagarajan, V; Turecki, G; Squassina, A; Medeiros, C D; Del Zompo, M; Rouleau, G A; Alda, M; Chuang, D-M

    2015-02-03

    We developed a novel integrative genomic tool called GRANITE (Genetic Regulatory Analysis of Networks Investigational Tool Environment) that can effectively analyze large complex data sets to generate interactive networks. GRANITE is an open-source tool and invaluable resource for a variety of genomic fields. Although our analysis is confined to static expression data, GRANITE has the capability of evaluating time-course data and generating interactive networks that may shed light on acute versus chronic treatment, as well as evaluating dose response and providing insight into mechanisms that underlie therapeutic versus sub-therapeutic doses or toxic doses. As a proof-of-concept study, we investigated lithium (Li) response in bipolar disorder (BD). BD is a severe mood disorder marked by cycles of mania and depression. Li is one of the most commonly prescribed and decidedly effective treatments for many patients (responders), although its mode of action is not yet fully understood, nor is it effective in every patient (non-responders). In an in vitro study, we compared vehicle versus chronic Li treatment in patient-derived lymphoblastoid cells (LCLs) (derived from either responders or non-responders) using both microRNA (miRNA) and messenger RNA gene expression profiling. We present both Li responder and non-responder network visualizations created by our GRANITE analysis in BD. We identified by network visualization that the Let-7 family is consistently downregulated by Li in both groups where this miRNA family has been implicated in neurodegeneration, cell survival and synaptic development. We discuss the potential of this analysis for investigating treatment response and even providing clinicians with a tool for predicting treatment response in their patients, as well as for providing the industry with a tool for identifying network nodes as targets for novel drug discovery.

  9. 13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Who decides if a contract opportunity for SDVO competition exists? 125.17 Section 125.17 Business Credit and Assistance SMALL BUSINESS... opportunity for SDVO competition exists? The contracting officer for the contracting activity decides if a...

  10. Differential Tuning of Ventral and Dorsal Streams during the Generation of Common and Uncommon Tool Uses.

    PubMed

    Matheson, Heath E; Buxbaum, Laurel J; Thompson-Schill, Sharon L

    2017-11-01

    Our use of tools is situated in different contexts. Prior evidence suggests that diverse regions within the ventral and dorsal streams represent information supporting common tool use. However, given the flexibility of object concepts, these regions may be tuned to different types of information when generating novel or uncommon uses of tools. To investigate this, we collected fMRI data from participants who reported common or uncommon tool uses in response to visually presented familiar objects. We performed a pattern dissimilarity analysis in which we correlated cortical patterns with behavioral measures of visual, action, and category information. The results showed that evoked cortical patterns within the dorsal tool use network reflected action and visual information to a greater extent in the uncommon use group, whereas evoked neural patterns within the ventral tool use network reflected categorical information more strongly in the common use group. These results reveal the flexibility of cortical representations of tool use and the situated nature of cortical representations more generally.

  11. The Project Manager's Tool Kit

    NASA Technical Reports Server (NTRS)

    Cameron, W. Scott

    2003-01-01

    Project managers are rarely described as being funny. Moreover, a good sense of humor rarely seems to be one of the deciding factors in choosing someone to be a project manager, or something that pops up as a major discussion point at an annual performance review. Perhaps this is because people think you aren't serious about your work if you laugh. I disagree with this assessment, but that's not really my point. As I talk to people either pursuing a career in project management, or broadening their assignment to include project management, I encourage them to consider what tools they need to be successful. I suggest that they consider any strength they have to be part of their Project Management (PM) Tool Kit, and being funny could be one of the tools they need.

  12. Two New Tools for Glycopeptide Analysis Researchers: A Glycopeptide Decoy Generator and a Large Data Set of Assigned CID Spectra of Glycopeptides.

    PubMed

    Lakbub, Jude C; Su, Xiaomeng; Zhu, Zhikai; Patabandige, Milani W; Hua, David; Go, Eden P; Desaire, Heather

    2017-08-04

    The glycopeptide analysis field is tightly constrained by a lack of effective tools that translate mass spectrometry data into meaningful chemical information, and perhaps the most challenging aspect of building effective glycopeptide analysis software is designing an accurate scoring algorithm for MS/MS data. We provide the glycoproteomics community with two tools to address this challenge. The first tool, a curated set of 100 expert-assigned CID spectra of glycopeptides, contains a diverse set of spectra from a variety of glycan types; the second tool, Glycopeptide Decoy Generator, is a new software application that generates glycopeptide decoys de novo. We developed these tools so that emerging methods of assigning glycopeptides' CID spectra could be rigorously tested. Software developers or those interested in developing skills in expert (manual) analysis can use these tools to facilitate their work. We demonstrate the tools' utility in assessing the quality of one particular glycopeptide software package, GlycoPep Grader, which assigns glycopeptides to CID spectra. We first acquired the set of 100 expert assigned CID spectra; then, we used the Decoy Generator (described herein) to generate 20 decoys per target glycopeptide. The assigned spectra and decoys were used to test the accuracy of GlycoPep Grader's scoring algorithm; new strengths and weaknesses were identified in the algorithm using this approach. Both newly developed tools are freely available. The software can be downloaded at http://glycopro.chem.ku.edu/GPJ.jar.
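
    One simple way to produce the kind of decoys described above is sketched below: shuffle the peptide backbone so that residue composition (and hence precursor mass) is preserved while the true sequence is destroyed, keeping the glycan unchanged. This is only a minimal illustration of the "N decoys per target" idea, not the algorithm of the published Decoy Generator.

        import random

        def make_decoys(peptide, glycan, n=20, seed=0):
            """Return n shuffled-backbone decoys paired with the original glycan."""
            rng = random.Random(seed)
            decoys, seen = [], {peptide}
            while len(decoys) < n:
                residues = list(peptide)
                rng.shuffle(residues)                 # same residues, therefore same peptide mass
                decoy = "".join(residues)
                if decoy not in seen:                 # skip duplicates and the target itself
                    seen.add(decoy)
                    decoys.append((decoy, glycan))    # glycan composition left unchanged
            return decoys

        for pep, gly in make_decoys("NVSNKTLADGR", "HexNAc4Hex5NeuAc2", n=5):
            print(pep, gly)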

  13. A Web Tool for Generating High Quality Machine-readable Biological Pathways.

    PubMed

    Ramirez-Gaona, Miguel; Marcu, Ana; Pon, Allison; Grant, Jason; Wu, Anthony; Wishart, David S

    2017-02-08

    PathWhiz is a web server built to facilitate the creation of colorful, interactive, visually pleasing pathway diagrams that are rich in biological information. The pathways generated by this online application are machine-readable and fully compatible with essentially all web-browsers and computer operating systems. It uses a specially developed, web-enabled pathway drawing interface that permits the selection and placement of different combinations of pre-drawn biological or biochemical entities to depict reactions, interactions, transport processes and binding events. This palette of entities consists of chemical compounds, proteins, nucleic acids, cellular membranes, subcellular structures, tissues, and organs. All of the visual elements in it can be interactively adjusted and customized. Furthermore, because this tool is a web server, all pathways and pathway elements are publicly accessible. This kind of pathway "crowd sourcing" means that PathWhiz already contains a large and rapidly growing collection of previously drawn pathways and pathway elements. Here we describe a protocol for the quick and easy creation of new pathways and the alteration of existing pathways. To further facilitate pathway editing and creation, the tool contains replication and propagation functions. The replication function allows existing pathways to be used as templates to create or edit new pathways. The propagation function allows one to take an existing pathway and automatically propagate it across different species. Pathways created with this tool can be "re-styled" into different formats (KEGG-like or text-book like), colored with different backgrounds, exported to BioPAX, SBGN-ML, SBML, or PWML data exchange formats, and downloaded as PNG or SVG images. The pathways can easily be incorporated into online databases, integrated into presentations, posters or publications, or used exclusively for online visualization and exploration. This protocol has been successfully applied to

  14. Pediatric post-thrombotic syndrome in children: Toward the development of a new diagnostic and evaluative measurement tool.

    PubMed

    Avila, M L; Brandão, L R; Williams, S; Ward, L C; Montoya, M I; Stinson, J; Kiss, A; Lara-Corrales, I; Feldman, B M

    2016-08-01

    Our goal was to conduct the item generation and piloting phases of a new discriminative and evaluative tool for pediatric post-thrombotic syndrome. We followed a formative model for the development of the tool, focusing on the signs/symptoms (items) that define post-thrombotic syndrome. For item generation, pediatric thrombosis experts and subjects diagnosed with extremity post-thrombotic syndrome during childhood nominated items. In the piloting phase, items were cross-sectionally measured in children with limb deep vein thrombosis to examine item performance. Twenty-three experts and 16 subjects listed 34 items, which were then measured in 140 subjects with previous diagnosis of limb deep vein thrombosis (70 upper extremity and 70 lower extremity). The items with strongest correlation with post-thrombotic syndrome severity and largest area under the curve were pain (in older children), paresthesia, and swollen limb for the upper extremity group, and pain (in older children), tired limb, heaviness, tightness and paresthesia for the lower extremity group. The diagnostic properties of the items and their correlations with post-thrombotic syndrome severity varied according to the assessed venous territory. The information gathered in this study will help experts decide which item should be considered for inclusion in the new tool. Copyright © 2016 Elsevier Ltd. All rights reserved.
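
    The item-performance screening described above (correlation with severity and area under the ROC curve per sign/symptom) can be reproduced with a few lines of standard Python, as sketched below on made-up data; the item names come from the abstract, everything else is illustrative.

        import numpy as np
        from scipy.stats import spearmanr
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        n = 140
        severity = rng.integers(0, 4, n)               # 0 = no PTS ... 3 = severe (illustrative scale)
        diagnosis = (severity > 0).astype(int)         # simplistic label for the ROC analysis
        items = {
            "pain":        np.clip(severity + rng.integers(-1, 2, n), 0, 3),
            "paresthesia": rng.integers(0, 2, n),
        }

        for name, scores in items.items():
            rho, _ = spearmanr(scores, severity)       # correlation with PTS severity
            auc = roc_auc_score(diagnosis, scores)     # discrimination against the diagnosis
            print(f"{name:12s} rho={rho:+.2f}  AUC={auc:.2f}")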

  15. Wireless sensor systems for sense/decide/act/communicate.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, Nina M.; Cushner, Adam; Baker, James A.

    2003-12-01

    After 9/11, the United States (U.S.) was suddenly pushed into challenging situations it could no longer ignore as a simple spectator. The War on Terrorism (WoT) was suddenly ignited, and no one knows when this war will end. While the government is exploring many existing and potential technologies, the area of wireless sensor networks (WSN) has emerged as a foundation for establishing future national security. Unlike other technologies, WSN could provide the virtual presence capabilities needed for precision awareness and response in military, intelligence, and homeland security applications. The Advance Concept Group (ACG) vision of the Sense/Decide/Act/Communicate (SDAC) sensor system is an instantiation of the WSN concept that takes a 'systems of systems' view. Each sensing node will exhibit the ability to: Sense the environment around it, Decide as a collective what the situation of its environment is, Act in an intelligent and coordinated manner in response to this situational determination, and Communicate its actions amongst the other nodes and to a human command. This LDRD report provides a review of the research and development done to bring the SDAC vision closer to reality.
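
    To make the Sense/Decide/Act/Communicate cycle concrete, the toy sketch below shows the loop a single node might run, with a collective decision based on its own and its neighbours' readings. The thresholds, readings, and message format are placeholders, not the design from the report.

        import random

        class SDACNode:
            def __init__(self, node_id, alarm_threshold=0.8):
                self.node_id = node_id
                self.alarm_threshold = alarm_threshold

            def sense(self):
                return random.random()                        # stand-in for a real sensor reading

            def decide(self, reading, neighbor_readings):
                # Collective decision: alarm only if the local reading and the
                # neighbourhood average both exceed the threshold.
                avg = sum(neighbor_readings) / max(len(neighbor_readings), 1)
                return reading > self.alarm_threshold and avg > self.alarm_threshold

            def act(self, alarm):
                if alarm:
                    print(f"node {self.node_id}: raising local alarm")

            def communicate(self, reading):
                return {"node": self.node_id, "reading": reading}  # would be a radio broadcast in practice

        node = SDACNode("n1")
        for _ in range(3):
            reading = node.sense()
            message = node.communicate(reading)
            node.act(node.decide(reading, [message["reading"]]))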

  16. Development of the software tool for generation and visualization of the finite element head model with bone conduction sounds

    NASA Astrophysics Data System (ADS)

    Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad

    2015-12-01

    Vibration of the skull causes a hearing sensation. We call it Bone Conduction (BC) sound. There have been several investigations into the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of the finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is to segment CT medical images (DICOM) and to generate surface mesh files (STL). Each STL file presents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to create a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions and to solve the FE equations. This tool uses the PAK solver, open-source software implemented in the SIFEM FP7 project, for calculations of the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to estimate how well the obtained results match experimental measurements.

  17. Evaluation of Computer Tools for Idea Generation and Team Formation in Project-Based Learning

    ERIC Educational Resources Information Center

    Ardaiz-Villanueva, Oscar; Nicuesa-Chacon, Xabier; Brene-Artazcoz, Oscar; Sanz de Acedo Lizarraga, Maria Luisa; Sanz de Acedo Baquedano, Maria Teresa

    2011-01-01

    The main objective of this research was to validate the effectiveness of Wikideas and Creativity Connector tools to stimulate the generation of ideas and originality by university students organized into groups according to their indexes of creativity and affinity. Another goal of the study was to evaluate the classroom climate created by these…

  18. Validity of Measured Interest for Decided and Undecided Students.

    ERIC Educational Resources Information Center

    Bartling, Herbert C.; Hood, Albert B.

    The usefulness of vocational interest measures has been questioned by those who have studied the predictive validity of expressed choice. The predictive validities of measured interest for decided and undecided students, expressed choice and measured interest, and expressed choice and measured interest when they are congruent and incongruent were…

  19. Automating Initial Guess Generation for High Fidelity Trajectory Optimization Tools

    NASA Technical Reports Server (NTRS)

    Villa, Benjamin; Lantoine, Gregory; Sims, Jon; Whiffen, Gregory

    2013-01-01

    Many academic studies in spaceflight dynamics rely on simplified dynamical models, such as restricted three-body models or averaged forms of the equations of motion of an orbiter. In practice, the end result of these preliminary orbit studies needs to be transformed into more realistic models, in particular to generate good initial guesses for high-fidelity trajectory optimization tools like Mystic. This paper reviews and extends some of the approaches used in the literature to perform such a task, and explores the inherent trade-offs of such a transformation with a view toward automating it for the case of ballistic arcs. Sample test cases in the libration point regimes and small body orbiter transfers are presented.

  20. Decision-making tool for applying adaptive traffic control systems : final report.

    DOT National Transportation Integrated Search

    2016-03-01

    Adaptive traffic signal control technologies have been increasingly deployed in real world situations. The objective of this project was to develop a decision-making tool to guide traffic engineers and decision-makers who must decide whether or not a...

  1. 25 CFR 162.566 - How will BIA decide whether to approve a WSR lease?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve a WSR lease? 162.566 Section 162.566 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Approval § 162.566 How will BIA decide whether to...

  2. 25 CFR 162.566 - How will BIA decide whether to approve a WSR lease?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve a WSR lease? 162.566 Section 162.566 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Approval § 162.566 How will BIA decide whether to...

  3. RADC SCAT automated sneak circuit analysis tool

    NASA Astrophysics Data System (ADS)

    Depalma, Edward L.

    The sneak circuit analysis tool (SCAT) provides a PC-based system for real-time identification (during the design phase) of sneak paths and design concerns. The tool utilizes an expert system shell to assist the analyst so that prior experience with sneak analysis is not necessary for performance. Both sneak circuits and design concerns are targeted by this tool, with both digital and analog circuits being examined. SCAT focuses the analysis at the assembly level, rather than the entire system, so that most sneak problems can be identified and corrected by the responsible design engineer in a timely manner. The SCAT program identifies the sneak circuits to the designer, who then decides what course of action is necessary.

  4. Podcasting: Connecting with a New Generation

    ERIC Educational Resources Information Center

    Halderson, Jeanne

    2006-01-01

    In this article, the author describes how she uses podcasting as an educational tool for her seventh grade students. Using only the applications that come pre-loaded on the Mac iBook, they work together to develop the content, write storyboards, produce and edit the podcasts, and analyze their work. From creating the script to deciding how to…

  5. MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.

    PubMed

    Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y

    2017-08-14

    Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools available in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations in the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results in different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms from different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available in the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at: https://gitlab.com/rki_bioinformatics .
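
    The co-occurrence idea behind the integration step can be illustrated with a small sketch: keep taxa that at least a minimum number of tools agree on and merge their abundances. The minimum-support rule and the simple averaging below are illustrative choices and not MetaMeta's exact algorithm.

        def integrate_profiles(profiles, min_support=2):
            """profiles: list of {taxon: relative_abundance} dicts, one per tool."""
            support = {}
            for prof in profiles:
                for taxon in prof:
                    support[taxon] = support.get(taxon, 0) + 1
            merged = {}
            for taxon, count in support.items():
                if count >= min_support:                      # co-occurrence filter
                    values = [p[taxon] for p in profiles if taxon in p]
                    merged[taxon] = sum(values) / len(values) # average across agreeing tools
            total = sum(merged.values()) or 1.0
            return {t: a / total for t, a in merged.items()}  # renormalize to a profile

        tool_a = {"E. coli": 0.60, "B. subtilis": 0.30, "spurious_sp.": 0.10}
        tool_b = {"E. coli": 0.55, "B. subtilis": 0.35}
        tool_c = {"E. coli": 0.70}
        print(integrate_profiles([tool_a, tool_b, tool_c]))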

  6. Tool for Generation of MAC/GMC Representative Unit Cell for CMC/PMC Analysis

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Pineda, Evan J.

    2016-01-01

    This document describes a recently developed analysis tool that enhances the resident capabilities of the Micromechanics Analysis Code with the Generalized Method of Cells (MAC/GMC) 4.0. This tool is especially useful in analyzing ceramic matrix composites (CMCs), where higher fidelity with improved accuracy of local response is needed. The tool, however, can be used for analyzing polymer matrix composites (PMCs) as well. MAC/GMC 4.0 is a composite material and laminate analysis software package developed at NASA Glenn Research Center. The software package has been built around the concept of the generalized method of cells (GMC). The computer code is developed with a user-friendly framework, along with a library of local inelastic, damage, and failure models. Further, application of simulated thermomechanical loading, generation of output results, and selection of architectures to represent the composite material have been automated to increase the user friendliness, as well as to make it more robust in terms of input preparation and code execution. Finally, classical lamination theory has been implemented within the software, wherein GMC is used to model the composite material response of each ply. Thus, the full range of GMC composite material capabilities is available for analysis of arbitrary laminate configurations as well. The primary focus of the current effort is to provide a graphical user interface (GUI) capability that generates a number of different user-defined repeating unit cells (RUCs). In addition, the code has provisions for generation of a MAC/GMC-compatible input text file that can be merged with any MAC/GMC input file tailored to analyze composite materials. Although the primary intention was to address the three different constituents and phases that are usually present in CMCs (namely, fibers, matrix, and interphase), it can be easily modified to address two-phase polymer matrix composite (PMC) materials where an interphase is absent. Currently, the
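
    As a rough picture of what "generating a repeating unit cell" means, the sketch below tags the subcells of an N x N grid as fiber, interphase, or matrix for a square-packed architecture, based on distance from the cell centre. The volume fraction, interphase thickness, and text output are assumptions for illustration; the actual tool writes MAC/GMC-formatted input files instead.

        import math

        def square_packed_ruc(n=16, fiber_volume_fraction=0.4, interphase_thickness=0.05):
            """Label each subcell of an n x n unit cell as F (fiber), I (interphase) or M (matrix)."""
            r_fiber = math.sqrt(fiber_volume_fraction / math.pi)   # fiber radius in a unit square cell
            r_interphase = r_fiber + interphase_thickness
            grid = []
            for i in range(n):
                row = []
                for j in range(n):
                    x = (i + 0.5) / n - 0.5                        # subcell centre, cell spans [-0.5, 0.5]
                    y = (j + 0.5) / n - 0.5
                    r = math.hypot(x, y)
                    row.append("F" if r <= r_fiber else "I" if r <= r_interphase else "M")
                grid.append(row)
            return grid

        for row in square_packed_ruc(n=12):
            print(" ".join(row))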

  7. From Modelling to Execution of Enterprise Integration Scenarios: The GENIUS Tool

    NASA Astrophysics Data System (ADS)

    Scheibler, Thorsten; Leymann, Frank

    One of the predominant problems IT companies are facing today is Enterprise Application Integration (EAI). Most of the infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns are gaining importance for non-technical business users as a way to ease and harmonize the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way. Therefore, one can use those patterns to describe enterprise architectures in a technology-neutral manner. However, patterns are documentation only, used by developers and systems architects to decide how to implement an integration scenario manually. Thus, patterns were not originally conceived as artefacts that can be executed directly. This paper presents a tool supporting a method by which EAI patterns can be used to generate executable artefacts for various target platforms automatically, using a model-driven development approach, hence turning patterns into something executable. We therefore introduce a continuous tool chain beginning at the design phase and ending in the execution of an integration solution in a completely automatic manner. For evaluation purposes, we introduce a scenario demonstrating how the tool is utilized for modelling and actually executing an integration scenario.

  8. 5 CFR 890.1038 - Deciding a contest without additional fact-finding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL EMPLOYEES HEALTH BENEFITS PROGRAM Administrative Sanctions Imposed Against Health Care Providers Suspension § 890.1038 Deciding a contest without additional...

  9. Proteasix: a tool for automated and large-scale prediction of proteases involved in naturally occurring peptide generation.

    PubMed

    Klein, Julie; Eales, James; Zürbig, Petra; Vlahou, Antonia; Mischak, Harald; Stevens, Robert

    2013-04-01

    In this study, we have developed Proteasix, an open-source peptide-centric tool that can be used to predict in silico the proteases involved in naturally occurring peptide generation. We developed a curated cleavage site (CS) database containing 3500 entries on human protease/CS combinations. On top of this database, we built a tool, Proteasix, which retrieves CSs and their associated proteases from a list of peptides. To establish the proof of concept of the approach, we used a list of 1388 peptides identified from human urine samples, and compared the prediction to the analysis of 1003 randomly generated amino acid sequences. Metalloprotease activity was predominantly involved in urinary peptide generation, particularly for peptides associated with extracellular matrix remodelling, compared with proteins of other origins. In comparison, random sequences returned almost no results, highlighting the specificity of the prediction. This study provides a tool that can facilitate linking of identified protein fragments to predicted protease activity, and thereby to presumed mechanisms of disease. Experiments are needed to confirm the in silico hypotheses; nevertheless, this approach may be of great help in better understanding molecular mechanisms of disease and in defining new biomarkers and therapeutic targets. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
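
    To illustrate the kind of peptide-centric lookup such a tool performs, the hypothetical Python sketch below matches the residues flanking each peptide's termini against a small protease/cleavage-site table. The table contents, peptide records, and function names are invented for demonstration and do not reproduce the Proteasix database or code.

      # Hypothetical sketch of a cleavage-site lookup (not the Proteasix implementation).
      # Each peptide record carries the residues flanking its termini in the parent protein;
      # the toy table maps (P1, P1') residue pairs to proteases known to cut between them.

      CLEAVAGE_TABLE = {           # (P1, P1') -> candidate proteases (illustrative only)
          ("G", "L"): ["MMP-2", "MMP-9"],
          ("K", "S"): ["Plasmin"],
          ("R", "A"): ["Thrombin"],
      }

      def predict_proteases(peptides):
          """Return, for every peptide, the proteases compatible with each terminus."""
          predictions = {}
          for pep in peptides:
              nterm = CLEAVAGE_TABLE.get((pep["before"], pep["seq"][0]), [])
              cterm = CLEAVAGE_TABLE.get((pep["seq"][-1], pep["after"]), [])
              predictions[pep["seq"]] = {"N-terminal": nterm, "C-terminal": cterm}
          return predictions

      if __name__ == "__main__":
          # 'before'/'after' are the residues flanking the peptide in the parent protein.
          peptides = [{"seq": "LPGERG", "before": "G", "after": "A"}]
          print(predict_proteases(peptides))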

  10. Considerations in Deciding to Treat Contaminated Unsaturated Soils In Situ

    EPA Pesticide Factsheets

    The purpose of this Issue Paper is to assist the user in deciding if in situ treatment of contaminated soil is a potentially feasible remedial alternative and to assist in the process of reviewing and screening in situ technologies.

  11. Migrating the Belle II collaborative services and tools

    NASA Astrophysics Data System (ADS)

    Braun, N.; Dossett, D.; Dramburg, M.; Frost, O.; Gellrich, A.; Grygier, J.; Hauth, T.; Jahnke-Zumbusch, D.; Knittel, D.; Kuhr, T.; Levonian, S.; Moser, H.-G.; Li, L.; Nakao, N.; Prim, M.; Reest, P. v. d.; Schwenssen, F.; Urquijo, P.; Vennemann, B.

    2017-10-01

    The Belle II collaboration decided in 2016 to migrate its collaborative services and tools into the existing IT infrastructure at DESY. The goal was to reduce the maintenance effort for solutions operated by Belle II members as well as to deploy state-of-the-art technologies. In addition, some new services and tools were or will be introduced. Planning and migration work was carried out by small teams consisting of experts from Belle II and the involved IT divisions. The migration was successfully accomplished before the KEK computer centre replacement in August 2016.

  12. Nanopore sequencing technology and tools for genome assembly: computational analysis of the current state, bottlenecks and future directions.

    PubMed

    Senol Cali, Damla; Kim, Jeremie S; Ghose, Saugata; Alkan, Can; Mutlu, Onur

    2018-04-02

    Nanopore sequencing technology has the potential to render other sequencing technologies obsolete with its ability to generate long reads and provide portability. However, high error rates of the technology pose a challenge while generating accurate genome assemblies. The tools used for nanopore sequence analysis are of critical importance, as they should overcome the high error rates of the technology. Our goal in this work is to comprehensively analyze current publicly available tools for nanopore sequence analysis to understand their advantages, disadvantages and performance bottlenecks. It is important to understand where current tools fall short in order to develop better ones. To this end, we (1) analyze the multiple steps and the associated tools in the genome assembly pipeline using nanopore sequence data, and (2) provide guidelines for determining the appropriate tools for each step. Based on our analyses, we make four key observations: (1) the choice of the tool for basecalling plays a critical role in overcoming the high error rates of nanopore sequencing technology. (2) Read-to-read overlap finding tools, GraphMap and Minimap, perform similarly in terms of accuracy. However, Minimap has a lower memory usage, and it is faster than GraphMap. (3) There is a trade-off between accuracy and performance when deciding on the appropriate tool for the assembly step. The fast but less accurate assembler Miniasm can be used for quick initial assembly, and further polishing can be applied on top of it to increase the accuracy, which leads to faster overall assembly. (4) The state-of-the-art polishing tool, Racon, generates high-quality consensus sequences while providing a significant speedup over another polishing tool, Nanopolish. We analyze various combinations of different tools and expose the trade-offs between accuracy, performance, memory usage and scalability. We conclude that our observations can guide researchers and practitioners in making conscious choices among the available tools at each step of the nanopore genome assembly pipeline.
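
    A minimal sketch of the assemble-then-polish flow described in observations (3) and (4) is given below, chaining Minimap, Miniasm, and Racon through Python's subprocess module. File names are placeholders, the commands are reduced to their positional arguments, and the recommended presets and flags of each tool are deliberately omitted; consult each tool's documentation before real use.

      # Sketch of a fast nanopore assembly followed by one polishing round.
      # Assumes minimap, miniasm and racon are on PATH; reads.fastq is a placeholder.
      import subprocess

      def run(cmd, stdout_path):
          # Helper: run a command and capture its standard output in a file.
          with open(stdout_path, "w") as out:
              subprocess.run(cmd, stdout=out, check=True)

      # 1. All-vs-all read overlaps (recommended minimap presets omitted for brevity).
      run(["minimap", "reads.fastq", "reads.fastq"], "overlaps.paf")

      # 2. Quick, less accurate assembly with Miniasm (GFA output).
      run(["miniasm", "-f", "reads.fastq", "overlaps.paf"], "assembly.gfa")

      # 3. Extract contig sequences from the GFA into FASTA (S-lines carry the sequence).
      with open("assembly.gfa") as gfa, open("assembly.fasta", "w") as fasta:
          for line in gfa:
              if line.startswith("S"):
                  _, name, seq = line.split("\t")[:3]
                  fasta.write(f">{name}\n{seq}\n")

      # 4. Map reads back to the draft and polish it with Racon to raise accuracy.
      run(["minimap", "assembly.fasta", "reads.fastq"], "mappings.paf")
      run(["racon", "reads.fastq", "mappings.paf", "assembly.fasta"], "polished.fasta")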

  13. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    NASA Technical Reports Server (NTRS)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert in organizing and understanding the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task, thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C for IBM PC-series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2 MB of RAM, a Microsoft-compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. The standard distribution medium for TARGET is one 5.25-inch 360K diskette.

  14. Projectile-generating explosive access tool

    DOEpatents

    Jakaboski, Juan-Carlos [Albuquerque, NM; Hughs, Chance G [Tijeras, NM; Todd, Steven N [Rio Rancho, NM

    2011-10-18

    An explosive device that can generate a projectile from the opposite side of a wall from the side where the explosive device is detonated. The projectile can be generated without breaching the wall of the structure or container. The device can optionally open an aperture in a solid wall of a structure or a container and form a high-kinetic-energy projectile from the portion of the wall removed to create the aperture.

  15. Synthetic biology in mammalian cells: Next generation research tools and therapeutics

    PubMed Central

    Lienert, Florian; Lohmueller, Jason J; Garg, Abhishek; Silver, Pamela A

    2014-01-01

    Recent progress in DNA manipulation and gene circuit engineering has greatly improved our ability to programme and probe mammalian cell behaviour. These advances have led to a new generation of synthetic biology research tools and potential therapeutic applications. Programmable DNA-binding domains and RNA regulators are leading to unprecedented control of gene expression and elucidation of gene function. Rebuilding complex biological circuits such as T cell receptor signalling in isolation from their natural context has deepened our understanding of network motifs and signalling pathways. Synthetic biology is also leading to innovative therapeutic interventions based on cell-based therapies, protein drugs, vaccines and gene therapies. PMID:24434884

  16. 32 CFR 1653.4 - File to be returned after appeal to the President is decided.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... SELECTIVE SERVICE SYSTEM APPEAL TO THE PRESIDENT § 1653.4 File to be returned after appeal to the President is decided. When the appeal to the President has been decided, the file shall be returned as... 32 National Defense 6 2011-07-01 2011-07-01 false File to be returned after appeal to the...

  17. 42 CFR 83.16 - How will the Secretary decide the outcome(s) of a petition?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false How will the Secretary decide the outcome(s) of a... AS MEMBERS OF THE SPECIAL EXPOSURE COHORT UNDER THE ENERGY EMPLOYEES OCCUPATIONAL ILLNESS... Secretary decide the outcome(s) of a petition? (a) The Director of NIOSH will propose a decision to add or...

  18. Development and evaluation of the DECIDE to move! Physical activity educational video.

    PubMed

    Majid, Haseeb M; Schumann, Kristina P; Doswell, Angela; Sutherland, June; Hill Golden, Sherita; Stewart, Kerry J; Hill-Briggs, Felicia

    2012-01-01

    To develop a video that provides accessible and usable information about the importance of physical activity to type 2 diabetes self-management and ways of incorporating physical activity into everyday life. A 15-minute physical activity educational video narrated by US Surgeon General Dr Regina Benjamin was developed and evaluated. The video addresses the following topics: the effects of exercise on diabetes, preparations for beginning physical activity, types of physical activity, safety considerations (eg, awareness of symptoms of hypoglycemia during activity), and goal setting. Two patient screening groups were held for evaluation and revision of the video. Patient satisfaction ratings ranged 4.6 to 4.9 out of a possible 5.0 on dimensions of overall satisfaction, how informative they found the video to be, how well the video held their interest and attention, how easy the video was to understand, and how easy the video was to see and hear. Patients reported the educational video effective in empowering them to take strides toward increasing and maintaining physical activity in their lives. The tool is currently used in a clinical research trial, Project DECIDE, as one component of a diabetes and cardiovascular disease self-management program.

  19. Rating a Teacher Observation Tool: Five Ways to Ensure Classroom Observations are Focused and Rigorous

    ERIC Educational Resources Information Center

    New Teacher Project, 2011

    2011-01-01

    This "Rating a Teacher Observation Tool" identifies five simple questions and provides an easy-to-use scorecard to help policymakers decide whether an observation framework is likely to produce fair and accurate results. The five questions are: (1) Do the criteria and tools cover the classroom performance areas most connected to student outcomes?…

  20. Deciding when It's Time to Buy a New PC

    ERIC Educational Resources Information Center

    Goldsborough, Reid

    2004-01-01

    How to best decide when it's time to replace your PC, whether at home or at work, is always tricky. Spending on computers can make you more productive, but it's money you otherwise cannot spend, invest or save, and faster systems always await you in the future. What is clear is that the computer industry really wants you to buy, and the computer…

  1. Next generation sequencing (NGS): a golden tool in forensic toolkit.

    PubMed

    Aly, S M; Sabri, D M

    DNA analysis is a cornerstone of contemporary forensic sciences. DNA sequencing technologies are powerful tools that enriched the molecular sciences in the past, based on Sanger sequencing, and continue to advance these sciences through next-generation sequencing (NGS). Next-generation sequencing has excellent potential to expand molecular applications in forensic sciences by overcoming the pitfalls of the conventional sequencing method. The main advantage of NGS compared with the conventional method is that it simultaneously examines a large number of genetic markers and yields high-resolution genetic data. These advantages will help in solving several challenges, such as mixture analysis and dealing with minute, degraded samples. Based on these new technologies, many markers can be examined to obtain important biological information such as age, geographical origin, tissue type, externally visible traits and monozygotic twin identification. NGS can also yield data related to microbes, insects, plants and soil, which are of great medico-legal importance. Despite dozens of forensic studies involving NGS, several requirements must be met before this technology can be used routinely in forensic cases. Thus, there is a great need for more studies that address the robustness of these techniques. Therefore, this work highlights the applications of forensic sciences in the era of massively parallel sequencing.

  2. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    PubMed

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible.

  3. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    PubMed Central

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400

  4. 13 CFR 126.604 - Who decides if a contract opportunity for HUBZone set-aside competition exists?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Who decides if a contract opportunity for HUBZone set-aside competition exists? 126.604 Section 126.604 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION HUBZONE PROGRAM Contractual Assistance § 126.604 Who decides if a contract...

  5. Deciding for themselves.

    PubMed

    Mccormack, J; Nelson, C

    1985-11-01

    World Education, in collaboration with PfP/International and with funding from US AID, has begun a comprehensive program in Kenya that offers non-governmental organizations non-formal training, technical assistance in organization and business management, and financial assistance in the form of loans for revolving credit funds. The approach emphasizes Kenyans deciding for themselves about the directions projects should take. This article discusses the Tototo Home Industries' rural economic development program. After receiving a loan from Tototo, the women of Bofu village planned to stock their small village shop with matches, kerosene, soap, salt, and cooking oil. The remainder of the loan was saved to purchase future stock. For this project, bookkeeping and management skills were necessary. To meet this need, the Tototo small business advisor designed a simple cash book system to be used by all the groups. Sessions in accounting were included in the annual training of trainers workshop. Currently, accounts advisors visit the groups monthly to provide follow-up training and assistance to ensure the women understand how to record project transactions accurately. The lesson to be drawn from these projects is simple. It is not unrealistic to set high expectations for project participants, but it is important to remain aware of the difficulty of the new concepts presented to these groups. In order for them to adequately master both concept and practice, the participants must be given sufficient time and support. In fact, consistent follow-up and close contact with the villages is the key to Tototo's success.

  6. GeneratorSE: A Sizing Tool for Variable-Speed Wind Turbine Generators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sethuraman, Latha; Dykes, Katherine L

    This report documents a set of analytical models employed by the optimization algorithms within the GeneratorSE framework. The initial values and boundary conditions employed for the generation of the various designs and initial estimates for basic design dimensions, masses, and efficiency for the four different models of generators are presented and compared with empirical data collected from previous studies and some existing commercial turbines. These models include designs applicable for variable-speed, high-torque application featuring direct-drive synchronous generators and low-torque application featuring induction generators. In all of the four models presented, the main focus of optimization is electromagnetic design, with the exception of permanent-magnet and wire-wound synchronous generators, wherein the structural design is also optimized. Thermal design is accommodated in GeneratorSE as a secondary attribute by limiting the winding current densities to acceptable limits. A preliminary validation of electromagnetic design was carried out by comparing the optimized magnetic loading against those predicted by numerical simulation in FEMM4.2, a finite-element software for analyzing electromagnetic and thermal physics problems for electrical machines. For direct-drive synchronous generators, the analytical models for the structural design are validated by static structural analysis in ANSYS.

  7. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... debarment. 890.1013 Section 890.1013 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL EMPLOYEES HEALTH BENEFITS PROGRAM Administrative Sanctions Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a...

  8. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... debarment. 890.1013 Section 890.1013 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL EMPLOYEES HEALTH BENEFITS PROGRAM Administrative Sanctions Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a...

  9. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... debarment. 890.1013 Section 890.1013 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL EMPLOYEES HEALTH BENEFITS PROGRAM Administrative Sanctions Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a...

  10. 5 CFR 890.1038 - Deciding a contest without additional fact-finding.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... fact-finding. 890.1038 Section 890.1038 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL EMPLOYEES HEALTH BENEFITS PROGRAM Administrative Sanctions Imposed Against Health Care Providers Suspension § 890.1038 Deciding a contest without additional...

  11. 5 CFR 890.1038 - Deciding a contest without additional fact-finding.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... fact-finding. 890.1038 Section 890.1038 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL EMPLOYEES HEALTH BENEFITS PROGRAM Administrative Sanctions Imposed Against Health Care Providers Suspension § 890.1038 Deciding a contest without additional...

  12. NullSeq: A Tool for Generating Random Coding Sequences with Desired Amino Acid and GC Contents.

    PubMed

    Liu, Sophia S; Hockenberry, Adam J; Lancichinetti, Andrea; Jewett, Michael C; Amaral, Luís A N

    2016-11-01

    The existence of over- and under-represented sequence motifs in genomes provides evidence of selective evolutionary pressures on biological mechanisms such as transcription, translation, ligand-substrate binding, and host immunity. In order to accurately identify motifs and other genome-scale patterns of interest, it is essential to be able to generate accurate null models that are appropriate for the sequences under study. While many tools have been developed to create random nucleotide sequences, protein coding sequences are subject to a unique set of constraints that complicates the process of generating appropriate null models. There are currently no tools available that allow users to create random coding sequences with specified amino acid composition and GC content for the purpose of hypothesis testing. Using the principle of maximum entropy, we developed a method that generates unbiased random sequences with pre-specified amino acid and GC content, which we have developed into a python package. Our method is the simplest way to obtain maximally unbiased random sequences that are subject to GC usage and primary amino acid sequence constraints. Furthermore, this approach can easily be expanded to create unbiased random sequences that incorporate more complicated constraints such as individual nucleotide usage or even di-nucleotide frequencies. The ability to generate correctly specified null models will allow researchers to accurately identify sequence motifs which will lead to a better understanding of biological processes as well as more effective engineering of biological systems.
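
    The sketch below illustrates the kind of constraint the tool handles: it draws a random coding sequence with a fixed amino acid composition while biasing synonymous-codon choice toward a target GC content. It is a simplified greedy illustration, not the maximum-entropy procedure implemented in NullSeq, and the codon table is truncated to a few amino acids for brevity.

      # Simplified illustration: random coding sequence with fixed amino acid counts
      # and a crude bias of synonymous-codon choice toward a target GC content.
      # (NullSeq itself derives unbiased codon probabilities via maximum entropy.)
      import random

      CODONS = {  # truncated synonymous-codon table, illustration only
          "A": ["GCT", "GCC", "GCA", "GCG"],
          "G": ["GGT", "GGC", "GGA", "GGG"],
          "K": ["AAA", "AAG"],
          "F": ["TTT", "TTC"],
      }

      def gc(seq):
          return (seq.count("G") + seq.count("C")) / max(len(seq), 1)

      def random_cds(aa_counts, target_gc, rng=random.Random(0)):
          aas = [aa for aa, n in aa_counts.items() for _ in range(n)]
          rng.shuffle(aas)
          seq = ""
          for aa in aas:
              # Greedy bias: pick the synonymous codon that keeps GC closest to target.
              choices = rng.sample(CODONS[aa], k=len(CODONS[aa]))
              seq += min(choices, key=lambda c: abs(gc(seq + c) - target_gc))
          return seq

      if __name__ == "__main__":
          s = random_cds({"A": 5, "G": 5, "K": 5, "F": 5}, target_gc=0.45)
          print(s, round(gc(s), 3))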

  13. Project DECIDE. Business Enterprise Approach to Career Exploration. Implementation Handbook.

    ERIC Educational Resources Information Center

    Post, John O., Jr.; And Others

    The purpose of this document is to describe project DECIDE, a business enterprise career exploration program, in the form of an implementation handbook. Chapter 1 presents the major characteristics of the model, which focuses on providing special needs students and regular junior high students the opportunity to improve their personal, social, and…

  14. A survey of tools for variant analysis of next-generation genome sequencing data

    PubMed Central

    Pabinger, Stephan; Dander, Andreas; Fischer, Maria; Snajder, Rene; Sperk, Michael; Efremova, Mirjana; Krabichler, Birgit; Speicher, Michael R.; Zschocke, Johannes

    2014-01-01

    Recent advances in genome sequencing technologies provide unprecedented opportunities to characterize individual genomic landscapes and identify mutations relevant for diagnosis and therapy. Specifically, whole-exome sequencing using next-generation sequencing (NGS) technologies is gaining popularity in the human genetics community due to the moderate costs, manageable data amounts and straightforward interpretation of analysis results. While whole-exome and, in the near future, whole-genome sequencing are becoming commodities, data analysis still poses significant challenges and led to the development of a plethora of tools supporting specific parts of the analysis workflow or providing a complete solution. Here, we surveyed 205 tools for whole-genome/whole-exome sequencing data analysis supporting five distinct analytical steps: quality assessment, alignment, variant identification, variant annotation and visualization. We report an overview of the functionality, features and specific requirements of the individual tools. We then selected 32 programs for variant identification, variant annotation and visualization, which were subjected to hands-on evaluation using four data sets: one set of exome data from two patients with a rare disease for testing identification of germline mutations, two cancer data sets for testing variant callers for somatic mutations, copy number variations and structural variations, and one semi-synthetic data set for testing identification of copy number variations. Our comprehensive survey and evaluation of NGS tools provides a valuable guideline for human geneticists working on Mendelian disorders, complex diseases and cancers. PMID:23341494

  15. Use of Erythropoietin-Stimulating Agents (ESA) in Patients With End-Stage Renal Failure Decided to Forego Dialysis: Palliative Perspective.

    PubMed

    Cheng, Hon Wai Benjamin; Chan, Kwok Ying; Lau, Hoi To; Man, Ching Wah; Cheng, Suk Ching; Lam, Carman

    2017-05-01

    Normochromic normocytic anemia is a common complication of chronic kidney disease (CKD) and is associated with many adverse clinical consequences. Erythropoiesis-stimulating agents (ESAs) act to replace endogenous erythropoietin in patients with end-stage renal disease who have anemia. Today, ESAs remain the main tool for treating anemia associated with CKD. In current practice, the use of ESAs is not limited to patients on renal replacement therapy but has extended to nondialysis patients under palliative care (PC). Current evidence on ESA use in patients with CKD who have decided to forego dialysis often has to draw on studies conducted in other groups of patients with CKD, including pre-dialysis patients and those on renal replacement therapy. There is a paucity of studies targeting the use of ESAs in renal PC patients. A small-scale retrospective study in renal PC patients suggested a clinical advantage of ESAs in terms of hemoglobin improvement, reduction in fatigue, and hospitalization rate. With the expected growth in the number of elderly patients with CKD who decide to forego dialysis and are managed conservatively, there remains an urgent need for large-scale prospective trials exploring the efficacy of ESAs in this population, targeting quality-of-life and symptom-improvement outcomes. This article also reviews the mechanism of action, pharmacology, adverse effects, and clinical trial evidence for ESAs in patients with CKD under renal PC.

  16. Advance care planning for nursing home residents with dementia: Influence of 'we DECide' on policy and practice.

    PubMed

    Ampe, Sophie; Sevenants, Aline; Smets, Tinne; Declercq, Anja; Van Audenhove, Chantal

    2017-01-01

    (1) To pilot 'we DECide' in terms of influence on advance care planning policy and practice in nursing home dementia care units. (2) To investigate barriers and facilitators for implementing 'we DECide'. This was a pre-test-post-test study in 18 nursing homes. Measurements included: compliance with best practice of advance care planning policy (ACP-audit); advance care planning practice (ACP criteria: degree to which advance care planning was discussed, and OPTION scale: degree of involvement of residents and families in conversations). Advance care planning policy was significantly more compliant with best practice after 'we DECide'; policy in the control group was not. Advance care planning was not discussed more frequently, nor were residents and families involved to a higher degree in conversations after 'we DECide'. Barriers to realizing advance care planning included staff's limited responsibilities; facilitators included support by management staff, and involvement of the whole organization. 'We DECide' had a positive influence on advance care planning policy. Daily practice, however, did not change. Future studies should pay more attention to long-term implementation strategies. Long-term implementation of advance care planning requires involvement of the whole organization and a continuing support system for health care professionals. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  17. Deciding What to Research: An Overview of a Participatory Workshop

    ERIC Educational Resources Information Center

    Northway, Ruth; Hurley, Karen; O'Connor, Chris; Thomas, Helen; Howarth, Joyce; Langley, Emma; Bale, Sue

    2014-01-01

    While recent years have seen an increase in the number of participatory and inclusive research studies being undertaken where people with learning disabilities are active members of the research team, little has been published about how teams decide what to research. This paper aims to fill this gap by discussing how in one area of Wales a…

  18. Development of an Optimum Rescue Tool, Detailed Prototype Concept Design.

    DTIC Science & Technology

    1981-06-01

    ...penetrate hardened metal structures of aircraft. A large number of tools are transported to the scene of a crashed aircraft. Valuable time is lost deciding... A carrying system is necessary for transport of the tool to allow the operator free use of his hands. Such a system would be a shoulder sling assembly.

  19. 5 CFR 890.1013 - Deciding whether to propose a permissive debarment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL EMPLOYEES HEALTH BENEFITS PROGRAM Administrative Sanctions Imposed Against Health Care Providers Permissive Debarments § 890.1013 Deciding whether to propose a... pose a risk to the health and safety of FEHBP-covered individuals or to the integrity of FEHBP...

  20. The Five Stages of Deciding on a Purchase...or a Job.

    ERIC Educational Resources Information Center

    Summey, John H.; Anderson, Carol H.

    1992-01-01

    Describes five stages of deciding on purchase or job: recognition of employment need; career information search; evaluation of career alternatives; identification and acceptance of employment; and postchoice evaluation. Evaluated importance of freedom/significance, growth, and variety in career decisions of 362 college students. Concludes…

  1. Development of the Assessment of Burden of COPD tool: an integrated tool to measure the burden of COPD.

    PubMed

    Slok, Annerika H M; in 't Veen, Johannes C C M; Chavannes, Niels H; van der Molen, Thys; Rutten-van Mölken, Maureen P M H; Kerstjens, Huib A M; Salomé, Philippe L; Holverda, Sebastiaan; Dekhuijzen, P N Richard; Schuiten, Denise; Asijee, Guus M; van Schayck, Onno C P

    2014-07-10

    In deciding on the treatment plan for patients with chronic obstructive pulmonary disease (COPD), the burden of COPD as experienced by patients should be the core focus. It is therefore important for daily practice to develop a tool that can both assess the burden of COPD and facilitate communication with patients in clinical practice. This paper describes the development of an integrated tool to assess the burden of COPD in daily practice. A definition of the burden of COPD was formulated by a Dutch expert team. Interviews showed that patients and health-care providers agreed on this definition. We found no existing instruments that fully measured burden of disease according to this definition. However, the Clinical COPD Questionnaire meets most requirements, and was therefore used and adapted. The adapted questionnaire is called the Assessment of Burden of COPD (ABC) scale. In addition, the ABC tool was developed, of which the ABC scale is the core part. The ABC tool is a computer program with an algorithm that visualises outcomes and provides treatment advice. The next step in the development of the tool is to test the validity and effectiveness of both the ABC scale and tool in daily practice.

  2. Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.

    Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
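
    A minimal usage sketch, based on the tool's documented quick-start workflow, is shown below; the dataset path, sightline endpoints, ion list, and instrument preset are placeholders, and the exact argument names should be checked against the current trident documentation.

      # Sketch of trident's basic workflow: load a simulation, cast a sightline,
      # and synthesize an absorption spectrum for a chosen instrument.
      # Dataset path and instrument/line choices are placeholders.
      import yt
      import trident

      ds = yt.load("path/to/simulation_output")          # any yt-supported hydro code

      ray = trident.make_simple_ray(
          ds,
          start_position=ds.domain_left_edge,
          end_position=ds.domain_right_edge,
          lines=["H", "C", "O", "Mg"],                   # species to deposit along the ray
          data_filename="ray.h5",
      )

      sg = trident.SpectrumGenerator("COS-G130M")        # mimic a Cosmic Origins Spectrograph grating
      sg.make_spectrum(ray, lines=["H", "C", "O", "Mg"])
      sg.save_spectrum("spectrum.txt")
      sg.plot_spectrum("spectrum.png")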

  3. Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations

    NASA Astrophysics Data System (ADS)

    Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.

    2017-09-01

    Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.

  4. A Fair Tribunal: Governing Board Bias and the Power to Decide.

    ERIC Educational Resources Information Center

    Uerling, Donald F.

    This paper presents examples of judicial reasoning in conflicts involving governing board bias and the power to decide in higher education and in various public school settings. Two cases, "Simard v. Board of Education" and "Hortonville Joint School District No. 1 v. Hortonville Education Association," provide the general…

  5. Rapid SAW Sensor Development Tools

    NASA Technical Reports Server (NTRS)

    Wilson, William C.; Atkinson, Gary M.

    2007-01-01

    The lack of integrated design tools for Surface Acoustic Wave (SAW) devices has led us to develop tools for the design, modeling, analysis, and automatic layout generation of SAW devices. These tools enable rapid development of wireless SAW sensors. The tools developed have been designed to integrate into existing Electronic Design Automation (EDA) tools to take advantage of existing 3D modeling, and Finite Element Analysis (FEA). This paper presents the SAW design, modeling, analysis, and automated layout generation tools.

  6. Systems Prototyping with Fourth Generation Tools: One Answer to the Productivity Puzzle? AIR 1983 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Sholtys, Phyllis A.

    The development of information systems using an engineering approach employing both traditional programming techniques and nonprocedural languages is described. A fourth generation application tool is used to develop a prototype system that is revised and expanded as the user clarifies individual requirements. When fully defined, a combination of…

  7. Combining SLBL routine with landslide-generated tsunami model for a quick hazard assessment tool

    NASA Astrophysics Data System (ADS)

    Franz, Martin; Rudaz, Benjamin; Jaboyedoff, Michel; Podladchikov, Yury

    2016-04-01

    Regions with steep topography are potentially subject to landslide-induced tsunamis, because of the proximity between lakes, rivers, sea shores and potential instabilities. The concentration of population and infrastructure on water body shores and in downstream valleys could lead to catastrophic consequences. In order to assess this phenomenon comprehensively, together with the induced risks, we have developed a tool which allows the construction of the landslide geometry and is able to simulate its propagation, the generation and propagation of the wave, and eventually the spread on the shores or the associated downstream flow. The tool is developed in the Matlab© environment, with a graphical user interface (GUI) to select the parameters in a user-friendly manner. The whole process is done in three steps involving different methods. Firstly, the geometry of the sliding mass is constructed using the Sloping Local Base Level (SLBL) concept. Secondly, the propagation of this volume is performed using a model based on viscous flow equations. Finally, the wave generation and its propagation are simulated using the shallow water equations stabilized by the Lax-Friedrichs scheme. The transition between wet and dry bed is handled by the combination of the two latter sets of equations. The intensity map is based on the flood criterion used in Switzerland, provided by the OFEG, and results from the multiplication of the velocity and the depth obtained during the simulation. The tool can be used for hazard assessment in the case of well-known landslides, where the SLBL routine can be constrained and checked for realistic construction of the geometrical model. In less well-known cases, various failure plane geometries can be automatically built within a given range and thus a multi-scenario approach is used. In any case, less well-known parameters such as the landslide velocity, its run-out distance, etc. can also be set to vary within given ranges, leading to a multi-scenario assessment.
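
    For the wave-propagation step, the abstract mentions the shallow water equations stabilized by the Lax-Friedrichs scheme. The sketch below shows a one-dimensional version of such a step with placeholder initial conditions, purely to illustrate that numerical ingredient; it is not the authors' Matlab implementation.

      # One-dimensional shallow water step with the Lax-Friedrichs scheme (illustrative only).
      # State: h (depth) and hu (momentum); g is gravity; flat bed; boundary cells are
      # left untouched for simplicity.
      import numpy as np

      g = 9.81

      def flux(h, hu):
          u = np.where(h > 0, hu / np.maximum(h, 1e-12), 0.0)
          return np.array([hu, hu * u + 0.5 * g * h**2])

      def lax_friedrichs_step(h, hu, dx, dt):
          q = np.array([h, hu])
          f = flux(h, hu)
          q_new = q.copy()
          # Average of neighbours plus flux difference: the classic Lax-Friedrichs update.
          q_new[:, 1:-1] = 0.5 * (q[:, 2:] + q[:, :-2]) - dt / (2 * dx) * (f[:, 2:] - f[:, :-2])
          return q_new[0], q_new[1]

      if __name__ == "__main__":
          x = np.linspace(0, 1000, 501)               # 1 km channel, placeholder grid
          h = np.where(x < 500, 2.0, 1.0)             # dam-break style initial depth (m)
          hu = np.zeros_like(x)
          dx, dt = x[1] - x[0], 0.05                  # dt chosen to satisfy the CFL condition here
          for _ in range(200):
              h, hu = lax_friedrichs_step(h, hu, dx, dt)
          print("max depth:", h.max(), "max |velocity|:", np.abs(hu / h).max())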

  8. Automatic Methods and Tools for the Verification of Real Time Systems

    DTIC Science & Technology

    1997-11-30

    We developed formal methods and tools for the verification of real-time systems. This was accomplished by extending techniques, based on automata...embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous... real-time systems, and we identified the exact boundary between decidability and undecidability of real-time reasoning.

  9. Numerical study of electromagnetic waves generated by a prototype dielectric logging tool

    USGS Publications Warehouse

    Ellefsen, K.J.; Abraham, J.D.; Wright, D.L.; Mazzella, A.T.

    2004-01-01

    To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference, time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than that in the formation (e.g., an air-filled borehole in the unsaturated zone), only a guided wave propagated along the borehole. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave radiated electromagnetic energy into the formation, causing its amplitude to decrease. When the propagation velocity in the borehole was less than that in the formation (e.g., a water-filled borehole in the saturated zone), both a refracted wave and a guided wave propagated along the borehole. The velocity of the refracted wave equaled the phase velocity of a plane wave in the formation, and the refracted wave preceded the guided wave. As the frequency decreased, both the phase and the group velocities of the guided wave asymptotically approached the phase velocity of a plane wave in the formation. The guided wave did not radiate electromagnetic energy into the formation. To analyze traces recorded by the prototype tool during laboratory tests, they were compared to traces calculated with the finite-difference method. The first parts of both the recorded and the calculated traces were similar, indicating that guided and refracted waves indeed propagated along the prototype tool. ?? 2004 Society of Exploration Geophysicists. All rights reserved.

  10. Analyzing Item Generation with Natural Language Processing Tools for the "TOEIC"® Listening Test. Research Report. ETS RR-17-52

    ERIC Educational Resources Information Center

    Yoon, Su-Youn; Lee, Chong Min; Houghton, Patrick; Lopez, Melissa; Sakano, Jennifer; Loukina, Anastasia; Krovetz, Bob; Lu, Chi; Madani, Nitin

    2017-01-01

    In this study, we developed assistive tools and resources to support TOEIC® Listening test item generation. There has recently been an increased need for a large pool of items for these tests. This need has, in turn, inspired efforts to increase the efficiency of item generation while maintaining the quality of the created items. We aimed to…

  11. Power Plant Model Validation Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool enables the automation of the process of power plant model validation using disturbance recordings. The tool uses PMU and SCADA measurements as input information. The tool automatically adjusts all required EPCL scripts and interacts with GE PSLF in batch mode. The main tool features include: interaction with GE PSLF; use of the GE PSLF Play-In Function for generator model validation; a database of projects (model validation studies); a database of historic events; a database of power plants; advanced visualization capabilities; and automatic report generation.

  12. Comparison of BrainTool to other UML modeling and model transformation tools

    NASA Astrophysics Data System (ADS)

    Nikiforova, Oksana; Gusarovs, Konstantins

    2017-07-01

    In the last 30 years, numerous model-generated software systems have been offered, targeting problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar claims regarding Unified Modeling Language (UML) models at different levels of abstraction. It is said that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones and ending with a code generator (possibly using a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from the manually created ones, and a transformation definition editor to define new transformations for the most advanced ones. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.

  13. A Statistical Bias Correction Tool for Generating Climate Change Scenarios in Indonesia based on CMIP5 Datasets

    NASA Astrophysics Data System (ADS)

    Faqih, A.

    2017-03-01

    Providing information on future climate scenarios is very important in climate change studies. Climate scenarios can be used as basic information to support adaptation and mitigation studies. In order to deliver future climate scenarios over a specific region, baseline and projection data from the outputs of global climate models (GCMs) are needed. However, due to their coarse resolution, the data have to be downscaled and bias-corrected in order to obtain scenario data with better spatial resolution that match the characteristics of the observed data. Generating these downscaled data is often difficult for scientists who do not have a specific background, experience and skill in dealing with the complex data from GCM outputs. In this regard, it is necessary to develop a tool that can simplify the downscaling process in order to help scientists, especially in Indonesia, generate future climate scenario data that can be used in their climate change-related studies. In this paper, we introduce a tool called "Statistical Bias Correction for Climate Scenarios (SiBiaS)". The tool is specially designed to facilitate the use of CMIP5 GCM data outputs and process their statistical bias corrections relative to reference data from observations. It is prepared to support capacity building in climate modeling in Indonesia as part of the Indonesia 3rd National Communication (TNC) project activities.
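
    The abstract does not spell out the exact statistical correction used, so the sketch below shows one common choice for this kind of tool, empirical quantile mapping, which adjusts each GCM value so that the corrected series reproduces the distribution of the observations over the baseline period. All arrays and numbers are placeholders; this is a generic example, not the SiBiaS algorithm.

      # Empirical quantile mapping: a common statistical bias-correction technique
      # (shown as a generic example; SiBiaS's exact method is not specified in the abstract).
      import numpy as np

      def quantile_map(gcm_baseline, obs_baseline, gcm_future):
          """Map future GCM values through baseline GCM quantiles onto observed quantiles."""
          quantiles = np.linspace(0.01, 0.99, 99)
          gcm_q = np.quantile(gcm_baseline, quantiles)
          obs_q = np.quantile(obs_baseline, quantiles)
          # Find where each future value sits in the baseline GCM distribution,
          # then read off the corresponding observed value (linear interpolation).
          return np.interp(gcm_future, gcm_q, obs_q)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          obs = rng.gamma(2.0, 5.0, size=3650)          # placeholder daily observations
          gcm_hist = obs * 0.7 + 1.0                    # synthetic biased "model" baseline
          gcm_future = rng.gamma(2.2, 5.0, size=3650) * 0.7 + 1.0
          corrected = quantile_map(gcm_hist, obs, gcm_future)
          print(round(obs.mean(), 2), round(gcm_future.mean(), 2), round(corrected.mean(), 2))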

  14. Governing a new generation of philanthropy: key leadership tools for success.

    PubMed

    Rice, James A

    2008-01-01

    Philanthropy has taken center stage again after the rapid growth of hospitals in the 1990s. It is an essential resource, not only because today's hospitals need the money more than ever, but also because great philanthropy helps forge rewarding relationships with the community. In meetings with more than 1,000 hospital board members and leaders at The Governance Institute's 2007 conferences, it became clear that maximizing philanthropy in the future will require boards to enhance three initiatives: a bolder service mission, more effective stakeholder engagement tools and enhanced planned giving programs. Health care philanthropy boards and the boards of their related organizations would be wise to devote time for robust conversations about their strategies for feeding the voracious capital appetites of contemporary health care systems and for examining their ability to govern a new generation of philanthropy.

  15. 5 CFR 890.1041 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL EMPLOYEES HEALTH BENEFITS PROGRAM Administrative Sanctions Imposed Against Health Care Providers Suspension § 890.1041 Deciding a contest after a fact-finding... official shall issue a final written decision that either sustains, modifies, or terminates the suspension...

  16. Generation of Tutorial Dialogues: Discourse Strategies for Active Learning

    DTIC Science & Technology

    1998-05-29

    Generation of Tutorial Dialogues: Discourse Strategies for Active Learning. Author: Dr. Martha Evens. ...time the student starts in on a new topic. Michael and Rovick constantly attempt to promote active learning. They regularly use hints and only resort... Controlling active learning: How tutors decide when to generate hints. Proceedings of FLAIRS. Melbourne Beach, FL. 157-161. Hume, G., Michael

  17. Next Generation Electromagnetic Pump Analysis Tools (PLM DOC-0005-2188). Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stregy, Seth; Dasilva, Ana; Yilmaz, Serkan

    2015-10-29

    This report provides the broad historical review of EM Pump development and details of MATRIX development under this project. This report summarizes the efforts made to modernize the legacy performance models used in previous EM Pump designs and the improvements made to the analysis tools. This report provides information on Tasks 1, 3, and 4 of the entire project. The research for Task 4 builds upon Task 1: Update EM Pump Databank and Task 3: Modernize the Existing EM Pump Analysis Model, which are summarized within this report. Where research for Task 2: Insulation Materials Development and Evaluation identified parameters applicable to the analysis model with Task 4, the analysis code was updated, and analyses were made for additional materials. The important design variables for the manufacture and operation of an EM Pump that the model improvement can evaluate are: space constraints; voltage capability of insulation system; maximum flux density through iron; flow rate and outlet pressure; efficiency and manufacturability. The development of the next-generation EM Pump analysis tools during this two-year program provides information in three broad areas: status of analysis model development; improvements made to older simulations; and comparison to experimental data.

  18. Mobile Phones Democratize and Cultivate Next-Generation Imaging, Diagnostics and Measurement Tools

    PubMed Central

    Ozcan, Aydogan

    2014-01-01

    In this article, I discuss some of the emerging applications and the future opportunities and challenges created by the use of mobile phones and their embedded components for the development of next-generation imaging, sensing, diagnostics and measurement tools. The massive volume of mobile phone users, which has now reached ~7 billion, drives the rapid improvements of the hardware, software and high-end imaging and sensing technologies embedded in our phones, transforming the mobile phone into a cost-effective and yet extremely powerful platform to run e.g., biomedical tests and perform scientific measurements that would normally require advanced laboratory instruments. This rapidly evolving and continuing trend will help us transform how medicine, engineering and sciences are practiced and taught globally. PMID:24647550

  19. Development of the Assessment of Burden of COPD tool: an integrated tool to measure the burden of COPD

    PubMed Central

    Slok, Annerika H M; in ’t Veen, Johannes C C M; Chavannes, Niels H; van der Molen, Thys; Rutten-van Mölken, Maureen P M H; Kerstjens, Huib A M; Salomé, Philippe L; Holverda, Sebastiaan; Dekhuijzen, PN Richard; Schuiten, Denise; Asijee, Guus M; van Schayck, Onno C P

    2014-01-01

    In deciding on the treatment plan for patients with chronic obstructive pulmonary disease (COPD), the burden of COPD as experienced by patients should be the core focus. It is therefore important for daily practice to develop a tool that can both assess the burden of COPD and facilitate communication with patients in clinical practice. This paper describes the development of an integrated tool to assess the burden of COPD in daily practice. A definition of the burden of COPD was formulated by a Dutch expert team. Interviews showed that patients and health-care providers agreed on this definition. We found no existing instruments that fully measured burden of disease according to this definition. However, the Clinical COPD Questionnaire meets most requirements, and was therefore used and adapted. The adapted questionnaire is called the Assessment of Burden of COPD (ABC) scale. In addition, the ABC tool was developed, of which the ABC scale is the core part. The ABC tool is a computer program with an algorithm that visualises outcomes and provides treatment advice. The next step in the development of the tool is to test the validity and effectiveness of both the ABC scale and tool in daily practice. PMID:25010353

  20. 5 CFR 890.1029 - Deciding a contest after a fact-finding proceeding.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL EMPLOYEES HEALTH BENEFITS PROGRAM Administrative Sanctions Imposed Against Health Care Providers Permissive Debarments § 890.1029 Deciding a contest after a fact... is made to debar a provider, the debarring official has the discretion to set the period of debarment...

  1. NeuroTessMesh: A Tool for the Generation and Visualization of Neuron Meshes and Adaptive On-the-Fly Refinement.

    PubMed

    Garcia-Cantero, Juan J; Brito, Juan P; Mata, Susana; Bayona, Sofia; Pastor, Luis

    2017-01-01

    Gaining a better understanding of the human brain continues to be one of the greatest challenges for science, largely because of the overwhelming complexity of the brain and the difficulty of analyzing the features and behavior of dense neural networks. Regarding analysis, 3D visualization has proven to be a useful tool for the evaluation of complex systems. However, the large number of neurons in non-trivial circuits, together with their intricate geometry, makes the visualization of a neuronal scenario an extremely challenging computational problem. Previous work in this area dealt with the generation of 3D polygonal meshes that approximated the cells' overall anatomy but did not attempt to deal with the extremely high storage and computational cost required to manage a complex scene. This paper presents NeuroTessMesh, a tool specifically designed to cope with many of the problems associated with the visualization of neural circuits that are comprised of large numbers of cells. In addition, this method facilitates the recovery and visualization of the 3D geometry of cells included in databases, such as NeuroMorpho, and provides the tools needed to approximate missing information such as the soma's morphology. This method takes as its only input the available compact, yet incomplete, morphological tracings of the cells as acquired by neuroscientists. It uses a multiresolution approach that combines an initial, coarse mesh generation with subsequent on-the-fly adaptive mesh refinement stages using tessellation shaders. For the coarse mesh generation, a novel approach, based on the Finite Element Method, allows approximation of the 3D shape of the soma from its incomplete description. Subsequently, the adaptive refinement process performed in the graphic card generates meshes that provide good visual quality geometries at a reasonable computational cost, both in terms of memory and rendering time. All the described techniques have been integrated into NeuroTessMesh.

  2. NeuroTessMesh: A Tool for the Generation and Visualization of Neuron Meshes and Adaptive On-the-Fly Refinement

    PubMed Central

    Garcia-Cantero, Juan J.; Brito, Juan P.; Mata, Susana; Bayona, Sofia; Pastor, Luis

    2017-01-01

    Gaining a better understanding of the human brain continues to be one of the greatest challenges for science, largely because of the overwhelming complexity of the brain and the difficulty of analyzing the features and behavior of dense neural networks. Regarding analysis, 3D visualization has proven to be a useful tool for the evaluation of complex systems. However, the large number of neurons in non-trivial circuits, together with their intricate geometry, makes the visualization of a neuronal scenario an extremely challenging computational problem. Previous work in this area dealt with the generation of 3D polygonal meshes that approximated the cells’ overall anatomy but did not attempt to deal with the extremely high storage and computational cost required to manage a complex scene. This paper presents NeuroTessMesh, a tool specifically designed to cope with many of the problems associated with the visualization of neural circuits that are comprised of large numbers of cells. In addition, this method facilitates the recovery and visualization of the 3D geometry of cells included in databases, such as NeuroMorpho, and provides the tools needed to approximate missing information such as the soma’s morphology. This method takes as its only input the available compact, yet incomplete, morphological tracings of the cells as acquired by neuroscientists. It uses a multiresolution approach that combines an initial, coarse mesh generation with subsequent on-the-fly adaptive mesh refinement stages using tessellation shaders. For the coarse mesh generation, a novel approach, based on the Finite Element Method, allows approximation of the 3D shape of the soma from its incomplete description. Subsequently, the adaptive refinement process performed in the graphic card generates meshes that provide good visual quality geometries at a reasonable computational cost, both in terms of memory and rendering time. All the described techniques have been integrated into Neuro

  3. USSR Report Machine Tools and Metalworking Equipment.

    DTIC Science & Technology

    1986-04-22

    directors decided to teach the Bulat a new trade. This generator is now used to strengthen high-speed cutting mills by hardening them in a medium of...modules (GPM) and flexible production complexes (GPK). The flexible automated line is usually used for mass production of components. Here the...of programmable coordinates (without grip): 5 / 4; method of programming: teaching; memory capacity of robot system, points: 300; positioning error, mm

  4. A Study on Tooling and Its Effect on Heat Generation and Mechanical Properties of Welded Joints in Friction Stir Welding

    NASA Astrophysics Data System (ADS)

    Tikader, Sujoy; Biswas, Pankaj; Puri, Asit Baran

    2018-04-01

    Friction stir welding (FSW) has become one of the most attractive solid-state welding processes because it offers numerous advantages, such as good mechanical and metallurgical properties. Aluminium alloys that are difficult to fusion weld, such as the 5XXX and 7XXX series, can be readily joined by this process. In the present study a mathematical model was developed and experiments were performed to evaluate the mechanical properties of FSW joints in a similar aluminium alloy (AA1100) for different process parameters and two main tool geometries (straight cylindrical and conical, i.e. cylindrically tapered, pins with a flat shoulder). Tensile strength and microhardness of the welded plate samples are reported for the different process parameters. It was found that in FSW of this similar alloy with a tool made of SS-310 tool steel, friction is the major contributor to heat generation. Tool geometry, tool rotational speed, plunging force and traverse speed all have a significant effect on the tensile strength and hardness of the friction stir welded joints.
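    The frictional-heat contribution mentioned above can be written down compactly. The sketch below is a minimal illustration of a standard analytical sliding-friction estimate for a flat-shoulder tool (shoulder, pin-side and pin-tip contributions); the friction coefficient, contact pressure and tool dimensions are illustrative assumptions, not values taken from this study.

```python
import math

def fsw_heat_sliding(mu, p_contact, omega_rpm, r_shoulder, r_pin, h_pin):
    """Analytical frictional-heat estimate for a flat-shoulder FSW tool
    (sliding condition, tau = mu * p), split into shoulder, pin-tip and
    pin-side contributions. Lengths in metres, pressure in Pa; returns watts."""
    omega = omega_rpm * 2.0 * math.pi / 60.0           # spindle speed in rad/s
    tau = mu * p_contact                                # contact shear stress
    q_shoulder = (2.0 / 3.0) * math.pi * tau * omega * (r_shoulder**3 - r_pin**3)
    q_pin_tip  = (2.0 / 3.0) * math.pi * tau * omega * r_pin**3
    q_pin_side = 2.0 * math.pi * tau * omega * r_pin**2 * h_pin
    return q_shoulder + q_pin_tip + q_pin_side

# Illustrative numbers only (not taken from the study):
print(fsw_heat_sliding(mu=0.4, p_contact=40e6, omega_rpm=1000,
                       r_shoulder=10e-3, r_pin=3e-3, h_pin=5e-3))
```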

  5. Improving Vocational Students' Consideration of Source Information When Deciding about Science Controversies

    ERIC Educational Resources Information Center

    Stadtler, Marc; Scharrer, Lisa; Macedo-Rouet, Monica; Rouet, Jean-François; Bromme, Rainer

    2016-01-01

    We present an empirical investigation of a classroom training fostering vocational students' consideration of source information when deciding about science-based controversies. The training was specifically aimed at raising students' awareness of the division of cognitive labor and the resulting need to take a source's competence into account…

  6. SynBioSS designer: a web-based tool for the automated generation of kinetic models for synthetic biological constructs

    PubMed Central

    Weeding, Emma; Houle, Jason

    2010-01-01

    Modeling tools can play an important role in synthetic biology the same way modeling helps in other engineering disciplines: simulations can quickly probe mechanisms and provide a clear picture of how different components influence the behavior of the whole. We present a brief review of available tools and present SynBioSS Designer. The Synthetic Biology Software Suite (SynBioSS) is used for the generation, storing, retrieval and quantitative simulation of synthetic biological networks. SynBioSS consists of three distinct components: the Desktop Simulator, the Wiki, and the Designer. SynBioSS Designer takes as input molecular parts involved in gene expression and regulation (e.g. promoters, transcription factors, ribosome binding sites, etc.), and automatically generates complete networks of reactions that represent transcription, translation, regulation, induction and degradation of those parts. Effectively, Designer uses DNA sequences as input and generates networks of biomolecular reactions as output. In this paper we describe how Designer uses universal principles of molecular biology to generate models of any arbitrary synthetic biological system. These models are useful as they explain biological phenotypic complexity in mechanistic terms. In turn, such mechanistic explanations can assist in designing synthetic biological systems. We also discuss, giving practical guidance to users, how Designer interfaces with the Registry of Standard Biological Parts, the de facto compendium of parts used in synthetic biology applications. PMID:20639523
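    As a rough illustration of the parts-to-reactions idea described above (not SynBioSS Designer's actual schema or rate laws), the following sketch expands a single promoter-RBS-gene cassette into transcription, translation and degradation reactions; all species names and rate constants are hypothetical placeholders.

```python
# Minimal sketch: expand a promoter-RBS-gene cassette into a reaction list
# (transcription, translation, degradation). Species and rate constants are
# illustrative placeholders, not SynBioSS Designer's actual model.

def cassette_to_reactions(promoter, rbs, gene,
                          k_tx=0.05, k_tl=0.1, k_deg_m=0.005, k_deg_p=0.001):
    mrna, prot = f"mRNA_{gene}", f"Protein_{gene}"
    return [
        {"name": f"transcription_{gene}", "reactants": [promoter],
         "products": [promoter, mrna], "rate": k_tx},
        {"name": f"translation_{gene}", "reactants": [mrna],
         "products": [mrna, prot], "rate": k_tl},
        {"name": f"mRNA_decay_{gene}", "reactants": [mrna], "products": [], "rate": k_deg_m},
        {"name": f"protein_decay_{gene}", "reactants": [prot], "products": [], "rate": k_deg_p},
    ]

for rxn in cassette_to_reactions("pLac", "RBS_strong", "gfp"):
    print(rxn["name"], rxn["reactants"], "->", rxn["products"], rxn["rate"])
```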

  7. The Creative task Creator: a tool for the generation of customized, Web-based creativity tasks.

    PubMed

    Pretz, Jean E; Link, John A

    2008-11-01

    This article presents a Web-based tool for the creation of divergent-thinking and open-ended creativity tasks. A Java program generates HTML forms with PHP scripting that run an Alternate Uses Task and/or open-ended response items. Researchers may specify their own instructions, objects, and time limits, or use default settings. Participants can also be prompted to select their best responses to the Alternate Uses Task (Silvia et al., 2008). Minimal programming knowledge is required. The program runs on any server, and responses are recorded in a standard MySQL database. Responses can be scored using the consensual assessment technique (Amabile, 1996) or Torrance's (1998) traditional scoring method. Adoption of this Web-based tool should facilitate creativity research across cultures and access to eminent creators. The Creative Task Creator may be downloaded from the Psychonomic Society's Archive of Norms, Stimuli, and Data, www.psychonomic.org/archive.

  8. Virtual Tool Mark Generation for Efficient Striation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekstrand, Laura; Zhang, Song; Grieve, Taylor

    2014-02-16

    This study introduces a tool mark analysis approach based upon 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. An open-source 3D graphics software package is utilized to simulate the marking process as the projection of the tip's geometry in the direction of tool travel. The edge of this projection becomes a virtual tool mark that is compared to cross-sections of the marked plate geometry using the statistical likelihood algorithm introduced by Chumbley et al. In a study with both sides of six screwdriver tips and 34 corresponding marks, the method distinguished known matches from known nonmatches with zero false-positive matches and two false-negative matches. For matches, it could predict the correct marking angle within ±5–10°. Individual comparisons could be made in seconds on a desktop computer, suggesting that the method could save time for examiners.
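    A minimal sketch of the geometric core of this approach is shown below: the tip point cloud is projected along the direction of travel to obtain a virtual mark profile, which is then compared to a measured plate cross-section. The comparison here is a plain normalized cross-correlation standing in for the Chumbley et al. likelihood statistic, and the array layout is an assumption.

```python
import numpy as np

def virtual_mark_profile(tip_points, n_bins=500):
    """Project a 3D tip point cloud (x across the mark, y along travel,
    z depth) onto x and keep, per x-bin, the deepest z value: the leading
    edge that would scratch the plate."""
    x, z = tip_points[:, 0], tip_points[:, 2]
    bins = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.digitize(x, bins) - 1, 0, n_bins - 1)
    profile = np.full(n_bins, np.nan)
    for i in range(n_bins):
        sel = idx == i
        if sel.any():
            profile[i] = z[sel].min()
    return profile

def similarity(profile_a, profile_b):
    """Normalized cross-correlation of two mean-removed striation profiles."""
    a = profile_a - np.nanmean(profile_a)
    b = profile_b - np.nanmean(profile_b)
    a, b = np.nan_to_num(a), np.nan_to_num(b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```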

  9. A web based tool for storing and visualising data generated within a smart home.

    PubMed

    McDonald, H A; Nugent, C D; Moore, G; Finlay, D D; Hallberg, J

    2011-01-01

    There is a growing need to re-assess the current approaches available to researchers for storing and managing heterogeneous data generated within a smart home environment. In our current work we have developed the homeML Application; a web based tool to support researchers engaged in the area of smart home research as they perform experiments. Within this paper the homeML Application is presented which includes the fundamental components of the homeML Repository and the homeML Toolkit. Results from a usability study conducted by 10 computer science researchers are presented; the initial results of which have been positive.

  10. 45 CFR 150.417 - Issues to be heard and decided by ALJ.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Issues to be heard and decided by ALJ. 150.417 Section 150.417 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings § 150.417...

  11. 45 CFR 150.417 - Issues to be heard and decided by ALJ.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Issues to be heard and decided by ALJ. 150.417 Section 150.417 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS CMS ENFORCEMENT IN GROUP AND INDIVIDUAL INSURANCE MARKETS Administrative Hearings § 150.417...

  12. A review and evaluation of numerical tools for fractional calculus and fractional order controls

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Liu, Lu; Dehghan, Sina; Chen, YangQuan; Xue, Dingyü

    2017-06-01

    In recent years, as fractional calculus has become more broadly used in research across different academic disciplines, there is increasing demand for numerical tools for the computation of fractional integration/differentiation and the simulation of fractional-order systems. Asked from time to time which tool is suitable for a specific application, the authors decided to carry out this survey to present recapitulative information on the tools available in the literature, in the hope of benefiting researchers from different academic backgrounds. With this motivation, the present article collects the scattered tools into a dashboard view, briefly introduces their usage and algorithms, evaluates their accuracy, compares their performance, and provides informative comments for selection.
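    For readers unfamiliar with such numerical tools, the sketch below shows one of the simplest schemes they implement, a Grünwald-Letnikov approximation of a fractional derivative on a uniform grid; it is a didactic illustration, not code from any of the surveyed packages.

```python
import numpy as np

def gl_fractional_derivative(f, alpha, h):
    """Grünwald-Letnikov approximation of the order-alpha derivative of the
    samples f (uniform step h). Coefficients follow the standard recurrence
    c_0 = 1, c_k = c_{k-1} * (1 - (alpha + 1) / k)."""
    n = len(f)
    c = np.empty(n)
    c[0] = 1.0
    for k in range(1, n):
        c[k] = c[k - 1] * (1.0 - (alpha + 1.0) / k)
    d = np.empty(n)
    for i in range(n):
        d[i] = np.dot(c[: i + 1], f[i::-1]) / h**alpha
    return d

# Sanity check: the half-derivative of f(t) = t is 2 * sqrt(t / pi)
t = np.linspace(0, 2, 401)
print(gl_fractional_derivative(t, 0.5, t[1] - t[0])[-1], 2 * np.sqrt(2 / np.pi))
```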

  13. Ventromedial Hypothalamus and the Generation of Aggression

    PubMed Central

    Hashikawa, Yoshiko; Hashikawa, Koichi; Falkner, Annegret L.; Lin, Dayu

    2017-01-01

    Aggression is a costly behavior, sometimes with severe consequences including death. Yet aggression is prevalent across animal species ranging from insects to humans, demonstrating its essential role in the survival of individuals and groups. The question of how the brain decides when to generate this costly behavior has intrigued neuroscientists for over a century and has led to the identification of relevant neural substrates. Various lesion and electric stimulation experiments have revealed that the hypothalamus, an ancient structure situated deep in the brain, is essential for expressing aggressive behaviors. More recently, studies using precise circuit manipulation tools have identified a small subnucleus in the medial hypothalamus, the ventrolateral part of the ventromedial hypothalamus (VMHvl), as a key structure for driving both aggression and aggression-seeking behaviors. Here, we provide an updated summary of the evidence that supports a role of the VMHvl in aggressive behaviors. We will consider our recent findings detailing the physiological response properties of populations of VMHvl cells during aggressive behaviors and provide new understanding regarding the role of the VMHvl embedded within the larger whole-brain circuit for social sensation and action. PMID:29375329

  14. 12 CFR 617.7615 - What should the System institution do when it decides to lease acquired agricultural real estate?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15 days...

  15. 12 CFR 617.7615 - What should the System institution do when it decides to lease acquired agricultural real estate?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15 days...

  16. 12 CFR 617.7610 - What should the System institution do when it decides to sell acquired agricultural real estate?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it must...

  17. 12 CFR 617.7615 - What should the System institution do when it decides to lease acquired agricultural real estate?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15 days...

  18. 12 CFR 617.7620 - What should the System institution do when it decides to sell acquired agricultural real estate...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... decides to sell acquired agricultural real estate at a public auction? 617.7620 Section 617.7620 Banks and... What should the System institution do when it decides to sell acquired agricultural real estate at a public auction? System institutions electing to sell or lease acquired agricultural real estate or a...

  19. 12 CFR 617.7610 - What should the System institution do when it decides to sell acquired agricultural real estate?

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it must...

  20. 12 CFR 617.7610 - What should the System institution do when it decides to sell acquired agricultural real estate?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it must...

  1. 12 CFR 617.7615 - What should the System institution do when it decides to lease acquired agricultural real estate?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15 days...

  2. 12 CFR 617.7610 - What should the System institution do when it decides to sell acquired agricultural real estate?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it must...

  3. 12 CFR 617.7620 - What should the System institution do when it decides to sell acquired agricultural real estate...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... decides to sell acquired agricultural real estate at a public auction? 617.7620 Section 617.7620 Banks and... What should the System institution do when it decides to sell acquired agricultural real estate at a public auction? System institutions electing to sell or lease acquired agricultural real estate or a...

  4. 12 CFR 617.7615 - What should the System institution do when it decides to lease acquired agricultural real estate?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15 days...

  5. 76 FR 75876 - Record of Decision for the Modification of the Groton Generation Station Interconnection...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

    ...) received a request from Basin Electric Power Cooperative (Basin Electric) to modify its Large Generator Interconnection Agreement (LGIA) with Basin Electric for the Groton Generation Station to eliminate current... considered the environmental impacts and has decided to modify its LGIA with Basin Electric for the Groton...

  6. Energy solutions in rural Africa: mapping electrification costs of distributed solar and diesel generation versus grid extension

    NASA Astrophysics Data System (ADS)

    Szabó, S.; Bódis, K.; Huld, T.; Moner-Girona, M.

    2011-07-01

    Three rural electrification options are analysed, showing the cost-optimal conditions for sustainable energy development using renewable energy sources in Africa. A spatial electricity cost model has been designed to determine whether diesel generators, photovoltaic systems or extension of the grid are the least-cost option in off-grid areas. The resulting mapping application supports decisions on which regions could best be electrified by grid extension and which by an isolated mini-grid. Donor programs and National Rural Electrification Agencies (or equivalent governmental departments) could use this type of delineation to set their program boundaries and then apply local optimization tools adapted to the prevailing parameters. The views expressed in this paper are those of the authors and do not necessarily represent European Commission and UNEP policy.
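    The per-location comparison underlying such a mapping can be illustrated with a toy levelized-cost calculation; all cost figures below are made-up placeholders, not the model's GIS inputs, and grid extension is charged a simple per-kilometre line cost.

```python
def cheapest_option(distance_to_grid_km,
                    demand_kwh_yr=20_000,
                    pv_lcoe=0.35, diesel_lcoe=0.55,
                    grid_energy_cost=0.10,
                    line_cost_per_km=15_000, line_lifetime_yr=30):
    """Return the least-cost electrification option for one settlement.
    All costs ($/kWh, $/km) are illustrative placeholders."""
    grid_lcoe = grid_energy_cost + (line_cost_per_km * distance_to_grid_km
                                    / (line_lifetime_yr * demand_kwh_yr))
    options = {"grid extension": grid_lcoe,
               "PV mini-grid": pv_lcoe,
               "diesel mini-grid": diesel_lcoe}
    best = min(options, key=options.get)
    return best, options

for d in (2, 10, 50):
    print(d, "km ->", cheapest_option(d)[0])
```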

  7. Deciding Who Lives: Considered Risk Casualty Decisions in Homeland Security

    DTIC Science & Technology

    2008-12-01

    conflicts exist in their decision environment, why they exist, and how their associated biases can be minimized. They should also be alerted to the...live and who will die. This concern is part of a wider concern about how the qualities of being a person limit what it is permissible to do to...believe. You can’t train a political leader, a governor, a mayor to say that, all right, when this happens, this is how you should decide. It’s

  8. Unexpected benefits of deciding by mind wandering

    PubMed Central

    Giblin, Colleen E.; Morewedge, Carey K.; Norton, Michael I.

    2013-01-01

    The mind wanders, even when people are attempting to make complex decisions. We suggest that mind wandering—allowing one's thoughts to wander until the “correct” choice comes to mind—can positively impact people's feelings about their decisions. We compare post-choice satisfaction from choices made by mind wandering to reason-based choices and randomly assigned outcomes. Participants chose a poster by mind wandering or deliberating, or were randomly assigned a poster. Whereas forecasters predicted that participants who chose by mind wandering would evaluate their outcome as inferior to participants who deliberated (Experiment 1), participants who used mind wandering as a decision strategy evaluated their choice just as positively as did participants who used deliberation (Experiment 2). In some cases, it appears that people can spare themselves the effort of deliberation and instead “decide by mind wandering,” yet experience no decrease in satisfaction. PMID:24046760

  9. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    PubMed

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  10. 20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... injury later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1, an...

  11. 20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1, an...

  12. 20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... injury later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1, an...

  13. 20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1, an...

  14. 20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... injury later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1, an...

  15. Deciding Not to Un-Do the "I Do:" Therapy Experiences of Women Who Consider Divorce But Decide to Remain Married.

    PubMed

    Kanewischer, Erica J W; Harris, Steven M

    2015-07-01

    This study explores women's experience of marital therapy while they navigated decision making around divorce. A qualitative method was used to gain a deeper understanding of the participants' therapy and relationship decision-making experiences. How are women's decisions whether or not to exit their marriage affected by therapy? The researchers interviewed 15 women who had considered initiating divorce before they turned 40 and had attended at least five marital therapy sessions but ultimately decided not to divorce. In general, participants reported that the therapy was helpful to them, their decision-making process and their marriages. Five main themes emerged from the interviews: Women Initiated Therapy, Therapist Was Experienced as Unbiased, Therapy was Helpful, Importance of Extra-therapeutic Factors, and Gradual Process. © 2014 American Association for Marriage and Family Therapy.

  16. 20 CFR 416.1881 - Deciding whether someone is your parent or stepparent.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... mother or father; or (2) A person who legally adopted you. (b) We consider your stepparent to be the... 20 Employees' Benefits 2 2011-04-01 2011-04-01 false Deciding whether someone is your parent or... SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered Your Parent § 416.1881...

  17. 20 CFR 416.1881 - Deciding whether someone is your parent or stepparent.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... mother or father; or (2) A person who legally adopted you. (b) We consider your stepparent to be the... 20 Employees' Benefits 2 2012-04-01 2012-04-01 false Deciding whether someone is your parent or... SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered Your Parent § 416.1881...

  18. 20 CFR 416.1881 - Deciding whether someone is your parent or stepparent.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... mother or father; or (2) A person who legally adopted you. (b) We consider your stepparent to be the... 20 Employees' Benefits 2 2014-04-01 2014-04-01 false Deciding whether someone is your parent or... SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered Your Parent § 416.1881...

  19. 20 CFR 416.1881 - Deciding whether someone is your parent or stepparent.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... mother or father; or (2) A person who legally adopted you. (b) We consider your stepparent to be the... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false Deciding whether someone is your parent or... SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered Your Parent § 416.1881...

  20. The next generation of melanocyte data: Genetic, epigenetic, and transcriptional resource datasets and analysis tools.

    PubMed

    Loftus, Stacie K

    2018-05-01

    The number of melanocyte- and melanoma-derived next generation sequence genome-scale datasets have rapidly expanded over the past several years. This resource guide provides a summary of publicly available sources of melanocyte cell derived whole genome, exome, mRNA and miRNA transcriptome, chromatin accessibility and epigenetic datasets. Also highlighted are bioinformatic resources and tools for visualization and data queries which allow researchers a genome-scale view of the melanocyte. Published 2018. This article is a U.S. Government work and is in the public domain in the USA.

  1. Evolutionary algorithm based optimization of hydraulic machines utilizing a state-of-the-art block coupled CFD solver and parametric geometry and mesh generation tools

    NASA Astrophysics Data System (ADS)

    S, Kyriacou; E, Kontoleontos; S, Weissenberger; L, Mangani; E, Casartelli; I, Skouteropoulou; M, Gattringer; A, Gehrer; M, Buchmayr

    2014-03-01

    An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (EASY software), a fast solver (block coupled CFD) and a flexible geometry generation tool. EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA (PCA)) that can be used in both single- (SOO) and multiobjective optimization (MOO) problems. In MAEAs, low cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem specific evaluation, here the solution of Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. This method, apart from being robust and fast, also provides a big gain in terms of computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and the optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.
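    The surrogate-screening loop of a metamodel-assisted evolutionary algorithm can be sketched in a few lines. The example below uses an inverse-distance k-nearest-neighbour surrogate and a quadratic toy objective as stand-ins for the PCA-driven metamodel and the expensive CFD evaluation; it illustrates the screening idea only, under those assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def expensive_eval(x):
    """Stand-in for the costly CFD objective (illustrative only)."""
    return np.sum((x - 0.3) ** 2)

def surrogate_predict(x, X_seen, y_seen, k=5):
    """Inverse-distance-weighted k-NN surrogate built from past evaluations."""
    d = np.linalg.norm(X_seen - x, axis=1) + 1e-12
    idx = np.argsort(d)[:k]
    w = 1.0 / d[idx]
    return float(np.dot(w, y_seen[idx]) / w.sum())

dim, pop, keep = 4, 40, 8
X = rng.random((pop, dim))
y = np.array([expensive_eval(x) for x in X])          # initial exact evaluations

for gen in range(20):
    children = np.clip(X[rng.integers(0, pop, pop)] +
                       0.1 * rng.normal(size=(pop, dim)), 0, 1)
    # Screen offspring with the cheap surrogate; evaluate only the best few exactly
    scores = np.array([surrogate_predict(c, X, y) for c in children])
    promising = children[np.argsort(scores)[:keep]]
    y_new = np.array([expensive_eval(c) for c in promising])
    X, y = np.vstack([X, promising]), np.concatenate([y, y_new])
    order = np.argsort(y)[:pop]                        # keep the best designs
    X, y = X[order], y[order]

print("best objective:", y.min())
```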

  2. An Integrated Suite of Text and Data Mining Tools - Phase II

    DTIC Science & Technology

    2005-08-30

    ...with Georgia Tech Research Corporation developed a desktop text-mining software tool named TechOASIS (known commercially as VantagePoint). By the...of this dataset and groups the Corporate Source items that co-occur with the found items. He decides he is only interested in the institutions

  3. A GIS semiautomatic tool for classifying and mapping wetland soils

    NASA Astrophysics Data System (ADS)

    Moreno-Ramón, Héctor; Marqués-Mateu, Angel; Ibáñez-Asensio, Sara

    2016-04-01

    Wetlands are among the most productive and biodiverse ecosystems in the world. Water is the main resource and controls the relationships between the agents and factors that determine the quality of a wetland; however, vegetation, wildlife and soils are also essential to understanding these environments. Soils are possibly the least studied resource because of their sampling problems, and as a result wetland soils have sometimes been classified only broadly. The traditional methodology states that homogeneous soil units should be based on the five soil-forming factors. A problem arises when the variation of one soil-forming factor is too small to differentiate a change in soil units, or when another factor that is not taken into account (e.g. a fluctuating water table) does vary. This is the case of the Albufera of Valencia, a coastal wetland located on the eastern coast of the Iberian Peninsula (Spain). The saline water table fluctuates throughout the year and generates differences in soils. To address this problem, the objectives of this study were to establish a reliable methodology that avoids these issues and to develop a GIS tool for defining homogeneous soil units in wetlands. This step is essential for the soil scientist, who has to decide the number of soil profiles in a study. The research was conducted with data from 133 soil pits of a previous study in the wetland, in which soil parameters of 401 samples (organic carbon, salinity, carbonates, n-value, etc.) were analysed. In a first stage, GIS layers were generated according to depth using Bayesian Maximum Entropy. Subsequently, a program based on decision tree algorithms was designed in a GIS environment. The goal of this tool was to create a single layer for each soil variable according to the different diagnostic criteria of Soil Taxonomy (properties, horizons and diagnostic epipedons). At the end, the program

  4. 20 CFR 670.200 - Who decides where Job Corps centers will be located?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for making decisions concerning the establishment, relocation, expansion, or closing of contract... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Who decides where Job Corps centers will be located? 670.200 Section 670.200 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF...

  5. Evaluating an image-fusion algorithm with synthetic-image-generation tools

    NASA Astrophysics Data System (ADS)

    Gross, Harry N.; Schott, John R.

    1996-06-01

    An algorithm that combines spectral mixing and nonlinear optimization is used to fuse multiresolution images. Image fusion merges images of different spatial and spectral resolutions to create a high spatial resolution multispectral combination. High spectral resolution allows identification of materials in the scene, while high spatial resolution locates those materials. In this algorithm, conventional spectral mixing estimates the percentage of each material (called endmembers) within each low resolution pixel. Three spectral mixing models are compared; unconstrained, partially constrained, and fully constrained. In the partially constrained application, the endmember fractions are required to sum to one. In the fully constrained application, all fractions are additionally required to lie between zero and one. While negative fractions seem inappropriate, they can arise from random spectral realizations of the materials. In the second part of the algorithm, the low resolution fractions are used as inputs to a constrained nonlinear optimization that calculates the endmember fractions for the high resolution pixels. The constraints mirror the low resolution constraints and maintain consistency with the low resolution fraction results. The algorithm can use one or more higher resolution sharpening images to locate the endmembers to high spatial accuracy. The algorithm was evaluated with synthetic image generation (SIG) tools. A SIG developed image can be used to control the various error sources that are likely to impair the algorithm performance. These error sources include atmospheric effects, mismodeled spectral endmembers, and variability in topography and illumination. By controlling the introduction of these errors, the robustness of the algorithm can be studied and improved upon. The motivation for this research is to take advantage of the next generation of multi/hyperspectral sensors. Although the hyperspectral images will be of modest to low resolution
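    The fully constrained mixing step can be illustrated with bounded least squares plus a heavily weighted sum-to-one equation; this is a generic sketch of constrained unmixing, not the optimizer used in the paper, and the tiny spectral library is synthetic.

```python
import numpy as np
from scipy.optimize import lsq_linear

def fully_constrained_unmix(pixel, endmembers, sum_weight=1e3):
    """Estimate endmember fractions for one low-resolution pixel.
    endmembers: (n_bands, n_endmembers) spectral library.
    Sum-to-one is enforced softly via a heavily weighted extra equation;
    the bounds enforce 0 <= fraction <= 1 (the 'fully constrained' case)."""
    n_end = endmembers.shape[1]
    A = np.vstack([endmembers, sum_weight * np.ones((1, n_end))])
    b = np.concatenate([pixel, [sum_weight]])
    res = lsq_linear(A, b, bounds=(0.0, 1.0))
    return res.x

# Tiny synthetic check: mix two known spectra 70/30 and recover the fractions
E = np.array([[0.1, 0.9], [0.4, 0.6], [0.8, 0.2]])
true = np.array([0.7, 0.3])
print(fully_constrained_unmix(E @ true, E))
```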

  6. LCFM - LIVING COLOR FRAME MAKER: PC GRAPHICS GENERATION AND MANAGEMENT TOOL FOR REAL-TIME APPLICATIONS

    NASA Technical Reports Server (NTRS)

    Truong, L. V.

    1994-01-01

    Computer graphics are often applied for better understanding and interpretation of data under observation. These graphics become more complicated when animation is required during "run-time", as found in many typical modern artificial intelligence and expert systems. Living Color Frame Maker is a solution to many of these real-time graphics problems. Living Color Frame Maker (LCFM) is a graphics generation and management tool for IBM or IBM compatible personal computers. To eliminate graphics programming, the graphic designer can use LCFM to generate computer graphics frames. The graphical frames are then saved as text files, in a readable and disclosed format, which can be easily accessed and manipulated by user programs for a wide range of "real-time" visual information applications. For example, LCFM can be implemented in a frame-based expert system for visual aids in management of systems. For monitoring, diagnosis, and/or controlling purposes, circuit or systems diagrams can be brought to "life" by using designated video colors and intensities to symbolize the status of hardware components (via real-time feedback from sensors). Thus status of the system itself can be displayed. The Living Color Frame Maker is user friendly with graphical interfaces, and provides on-line help instructions. All options are executed using mouse commands and are displayed on a single menu for fast and easy operation. LCFM is written in C++ using the Borland C++ 2.0 compiler for IBM PC series computers and compatible computers running MS-DOS. The program requires a mouse and an EGA/VGA display. A minimum of 77K of RAM is also required for execution. The documentation is provided in electronic form on the distribution medium in WordPerfect format. A sample MS-DOS executable is provided on the distribution medium. The standard distribution medium for this program is one 5.25 inch 360K MS-DOS format diskette. The contents of the diskette are compressed using the PKWARE archiving tools

  7. Nematode.net update 2011: addition of data sets and tools featuring next-generation sequencing data

    PubMed Central

    Martin, John; Abubucker, Sahar; Heizer, Esley; Taylor, Christina M.; Mitreva, Makedonka

    2012-01-01

    Nematode.net (http://nematode.net) has been a publicly available resource for studying nematodes for over a decade. In the past 3 years, we reorganized Nematode.net to provide more user-friendly navigation through the site, a necessity due to the explosion of data from next-generation sequencing platforms. Organism-centric portals containing dynamically generated data are available for over 56 different nematode species. Next-generation data has been added to the various data-mining portals hosted, including NemaBLAST and NemaBrowse. The NemaPath metabolic pathway viewer builds associations using KOs, rather than ECs to provide more accurate and fine-grained descriptions of proteins. Two new features for data analysis and comparative genomics have been added to the site. NemaSNP enables the user to perform population genetics studies in various nematode populations using next-generation sequencing data. HelmCoP (Helminth Control and Prevention) as an independent component of Nematode.net provides an integrated resource for storage, annotation and comparative genomics of helminth genomes to aid in learning more about nematode genomes, as well as drug, pesticide, vaccine and drug target discovery. With this update, Nematode.net will continue to realize its original goal to disseminate diverse bioinformatic data sets and provide analysis tools to the broad scientific community in a useful and user-friendly manner. PMID:22139919

  8. ODG: Omics database generator - a tool for generating, querying, and analyzing multi-omics comparative databases to facilitate biological understanding.

    PubMed

    Guhlin, Joseph; Silverstein, Kevin A T; Zhou, Peng; Tiffin, Peter; Young, Nevin D

    2017-08-10

    Rapid generation of omics data in recent years has resulted in vast amounts of disconnected datasets without systemic integration and knowledge building, while individual groups have made customized, annotated datasets available on the web with few ways to link them to in-lab datasets. With so many research groups generating their own data, the ability to relate it to the larger genomic and comparative genomic context is becoming increasingly crucial to make full use of the data. The Omics Database Generator (ODG) allows users to create customized databases that utilize published genomics data integrated with experimental data which can be queried using a flexible graph database. When provided with omics and experimental data, ODG will create a comparative, multi-dimensional graph database. ODG can import definitions and annotations from other sources such as InterProScan, the Gene Ontology, ENZYME, UniPathway, and others. This annotation data can be especially useful for studying new or understudied species for which transcripts have only been predicted, and rapidly give additional layers of annotation to predicted genes. In better studied species, ODG can perform syntenic annotation translations or rapidly identify characteristics of a set of genes or nucleotide locations, such as hits from an association study. ODG provides a web-based user-interface for configuring the data import and for querying the database. Queries can also be run from the command-line and the database can be queried directly through programming language hooks available for most languages. ODG supports most common genomic formats as well as a generic, easy-to-use tab-separated value format for user-provided annotations. ODG is a user-friendly database generation and query tool that adapts to the supplied data to produce a comparative genomic database or multi-layered annotation database. ODG provides rapid comparative genomic annotation and is therefore particularly useful for non-model or

  9. Method and Tool for Design Process Navigation and Automatic Generation of Simulation Models for Manufacturing Systems

    NASA Astrophysics Data System (ADS)

    Nakano, Masaru; Kubota, Fumiko; Inamori, Yutaka; Mitsuyuki, Keiji

    Manufacturing system designers should concentrate on designing and planning manufacturing systems instead of spending their efforts on creating the simulation models to verify the design. This paper proposes a method and its tool to navigate the designers through the engineering process and generate the simulation model automatically from the design results. The design agent also supports collaborative design projects among different companies or divisions with distributed engineering and distributed simulation techniques. The idea was implemented and applied to a factory planning process.

  10. STOP using just GO: a multi-ontology hypothesis generation tool for high throughput experimentation

    PubMed Central

    2013-01-01

    Background Gene Ontology (GO) enrichment analysis remains one of the most common methods for hypothesis generation from high throughput datasets. However, we believe that researchers strive to test other hypotheses that fall outside of GO. Here, we developed and evaluated a tool for hypothesis generation from gene or protein lists using ontological concepts present in manually curated text that describes those genes and proteins. Results As a consequence we have developed the method Statistical Tracking of Ontological Phrases (STOP) that expands the realm of testable hypotheses in gene set enrichment analyses by integrating automated annotations of genes to terms from over 200 biomedical ontologies. While not as precise as manually curated terms, we find that the additional enriched concepts have value when coupled with traditional enrichment analyses using curated terms. Conclusion Multiple ontologies have been developed for gene and protein annotation, by using a dataset of both manually curated GO terms and automatically recognized concepts from curated text we can expand the realm of hypotheses that can be discovered. The web application STOP is available at http://mooneygroup.org/stop/. PMID:23409969
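    The enrichment test at the heart of such tools is typically a one-sided hypergeometric (Fisher) test per term. A minimal sketch with a synthetic gene universe and term is shown below; it is not STOP's implementation, and the gene identifiers are placeholders.

```python
from scipy.stats import hypergeom

def term_enrichment(study_genes, term_genes, population_size):
    """One-sided hypergeometric enrichment p-value for a single ontology
    term: probability of seeing at least this many annotated genes in the
    study list by chance."""
    study_genes, term_genes = set(study_genes), set(term_genes)
    k = len(study_genes & term_genes)          # annotated genes in the study list
    M, n, N = population_size, len(term_genes), len(study_genes)
    return hypergeom.sf(k - 1, M, n, N)

# Toy example: 8 of 20 study genes carry a term annotated to 100 of 15000 genes
print(term_enrichment(range(20), list(range(8)) + list(range(1000, 1092)), 15000))
```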

  11. Gravity Waves Generated by Convection: A New Idealized Model Tool and Direct Validation with Satellite Observations

    NASA Astrophysics Data System (ADS)

    Alexander, M. Joan; Stephan, Claudia

    2015-04-01

    In climate models, gravity waves remain too poorly resolved to be directly modelled. Instead, simplified parameterizations are used to include gravity wave effects on model winds. A few climate models link some of the parameterized waves to convective sources, providing a mechanism for feedback between changes in convection and gravity wave-driven changes in circulation in the tropics and above high-latitude storms. These convective wave parameterizations are based on limited case studies with cloud-resolving models, but they are poorly constrained by observational validation, and tuning parameters have large uncertainties. Our new work distills results from complex, full-physics cloud-resolving model studies to essential variables for gravity wave generation. We use the Weather Research Forecast (WRF) model to study relationships between precipitation, latent heating/cooling and other cloud properties to the spectrum of gravity wave momentum flux above midlatitude storm systems. Results show the gravity wave spectrum is surprisingly insensitive to the representation of microphysics in WRF. This is good news for use of these models for gravity wave parameterization development since microphysical properties are a key uncertainty. We further use the full-physics cloud-resolving model as a tool to directly link observed precipitation variability to gravity wave generation. We show that waves in an idealized model forced with radar-observed precipitation can quantitatively reproduce instantaneous satellite-observed features of the gravity wave field above storms, which is a powerful validation of our understanding of waves generated by convection. The idealized model directly links observations of surface precipitation to observed waves in the stratosphere, and the simplicity of the model permits deep/large-area domains for studies of wave-mean flow interactions. This unique validated model tool permits quantitative studies of gravity wave driving of regional

  12. 30 CFR 204.208 - May a State decide that it will or will not allow one or both of the relief options under this...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... MARGINAL PROPERTIES Accounting and Auditing Relief § 204.208 May a State decide that it will or will not... not grant that type of marginal property relief. (b) To help States decide whether to allow one or... Report of Marginal Properties by October 1 preceding the calendar year. (c) If a State decides under...

  13. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Etmektzoglou, A; Mishra, P; Svatos, M

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to
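    The parameterization idea can be sketched outside a spreadsheet as well: the snippet below samples a circular "virtual isocenter" path as a function of MU and serializes it to XML. The element names are generic placeholders and do not reproduce the actual TrueBeam Developer Mode XML schema.

```python
import math
import xml.etree.ElementTree as ET

def circular_trajectory(n_points=36, radius_mm=50.0, mu_total=100.0):
    """Parameterized circular couch path with synchronized gantry rotation,
    sampled against cumulative MU. Element names below are generic
    placeholders, not the real Developer Mode XML schema."""
    root = ET.Element("Trajectory")
    for i in range(n_points + 1):
        t = i / n_points
        pt = ET.SubElement(root, "ControlPoint")
        ET.SubElement(pt, "MU").text = f"{mu_total * t:.3f}"
        ET.SubElement(pt, "GantryDeg").text = f"{360.0 * t:.2f}"
        ET.SubElement(pt, "CouchLatMM").text = f"{radius_mm * math.cos(2*math.pi*t):.3f}"
        ET.SubElement(pt, "CouchLngMM").text = f"{radius_mm * math.sin(2*math.pi*t):.3f}"
    return ET.tostring(root, encoding="unicode")

print(circular_trajectory(n_points=4))
```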

  14. Customised 3D Printing: An Innovative Training Tool for the Next Generation of Orbital Surgeons.

    PubMed

    Scawn, Richard L; Foster, Alex; Lee, Bradford W; Kikkawa, Don O; Korn, Bobby S

    2015-01-01

    Additive manufacturing or 3D printing is the process by which three dimensional data fields are translated into real-life physical representations. 3D printers create physical printouts using heated plastics in a layered fashion resulting in a three-dimensional object. We present a technique for creating customised, inexpensive 3D orbit models for use in orbital surgical training using 3D printing technology. These models allow trainee surgeons to perform 'wet-lab' orbital decompressions and simulate upcoming surgeries on orbital models that replicate a patient's bony anatomy. We believe this represents an innovative training tool for the next generation of orbital surgeons.

  15. ViSAPy: a Python tool for biophysics-based generation of virtual spiking activity for evaluation of spike-sorting algorithms.

    PubMed

    Hagen, Espen; Ness, Torbjørn V; Khosrowshahi, Amir; Sørensen, Christina; Fyhn, Marianne; Hafting, Torkel; Franke, Felix; Einevoll, Gaute T

    2015-04-30

    New, silicon-based multielectrodes comprising hundreds or more electrode contacts offer the possibility to record spike trains from thousands of neurons simultaneously. This potential cannot be realized unless accurate, reliable automated methods for spike sorting are developed, in turn requiring benchmarking data sets with known ground-truth spike times. We here present a general simulation tool for computing benchmarking data for evaluation of spike-sorting algorithms entitled ViSAPy (Virtual Spiking Activity in Python). The tool is based on a well-established biophysical forward-modeling scheme and is implemented as a Python package built on top of the neuronal simulator NEURON and the Python tool LFPy. ViSAPy allows for arbitrary combinations of multicompartmental neuron models and geometries of recording multielectrodes. Three example benchmarking data sets are generated, i.e., tetrode and polytrode data mimicking in vivo cortical recordings and microelectrode array (MEA) recordings of in vitro activity in salamander retinas. The synthesized example benchmarking data mimics salient features of typical experimental recordings, for example, spike waveforms depending on interspike interval. ViSAPy goes beyond existing methods as it includes biologically realistic model noise, synaptic activation by recurrent spiking networks, finite-sized electrode contacts, and allows for inhomogeneous electrical conductivities. ViSAPy is optimized to allow for generation of long time series of benchmarking data, spanning minutes of biological time, by parallel execution on multi-core computers. ViSAPy is an open-ended tool as it can be generalized to produce benchmarking data for arbitrary recording-electrode geometries and with various levels of complexity. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
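    The forward-modelling idea behind such benchmarking data can be illustrated with the simplest point-source approximation, V(t) = Σ_i I_i(t) / (4πσ r_i). This is a simplified stand-in for the compartmental scheme the tool actually builds on, with made-up currents, positions and conductivity.

```python
import numpy as np

def extracellular_potential(currents_nA, source_xyz_um, electrode_xyz_um, sigma=0.3):
    """Point-source forward model: sum each compartment's transmembrane
    current divided by 4*pi*sigma*r, with conductivity sigma in S/m.
    Positions in micrometres, currents in nA; returns volts."""
    r = np.linalg.norm(source_xyz_um - electrode_xyz_um, axis=1) * 1e-6   # metres
    return (currents_nA * 1e-9 / (4.0 * np.pi * sigma * r[:, None])).sum(axis=0)

# Two compartments, 200 time samples of (made-up) transmembrane current
I = np.vstack([np.sin(np.linspace(0, 2 * np.pi, 200)),
               -np.sin(np.linspace(0, 2 * np.pi, 200))])
pos = np.array([[0.0, 0.0, 0.0], [0.0, 0.0, 50.0]])
elec = np.array([30.0, 0.0, 25.0])
print(extracellular_potential(I, pos, elec).max())
```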

  16. Modeling Guidelines for Code Generation in the Railway Signaling Context

    NASA Technical Reports Server (NTRS)

    Ferrari, Alessio; Bacherini, Stefano; Fantechi, Alessandro; Zingoni, Niccolo

    2009-01-01

    recommendations has been performed for the automotive control systems domain in order to enforce code generation [7]. The MAAB guidelines have also been found profitable in the aerospace/avionics sector [1] and have been adopted by the MathWorks Aerospace Leadership Council (MALC). General Electric Transportation Systems (GETS) is a well-known railway signaling systems manufacturer leading in Automatic Train Protection (ATP) systems technology. As part of an effort to adopt formal methods within its own development process, GETS decided to introduce system modeling by means of the MathWorks tools [2], and in 2008 chose to move to code generation. This article reports the experience of GETS in developing its own modeling standard by customizing the MAAB rules for the railway signaling domain, and shows the result of this experience with a successful product development story.

  17. ISPATOM: A Generic Real-Time Data Processing Tool Without Programming

    NASA Technical Reports Server (NTRS)

    Dershowitz, Adam

    2007-01-01

    Information Sharing Protocol Advanced Tool of Math (ISPATOM) is an application program allowing for the streamlined generation of comps, which subscribe to streams of incoming telemetry data, perform any necessary computations on the data, then send the data to other programs for display and/or further processing in NASA mission control centers. Heretofore, the development of comps was difficult, expensive, and time-consuming: Each comp was custom written manually, in a low-level computing language, by a programmer attempting to follow requirements of flight controllers. ISPATOM enables a flight controller who is not a programmer to write a comp by simply typing in one or more equation(s) at a command line or retrieving the equation(s) from a text file. ISPATOM then subscribes to the necessary input data, performs all of the necessary computations, and sends out the results. It sends out new results whenever the input data change. The use of equations in ISPATOM is no more difficult than entering equations in a spreadsheet. The time involved in developing a comp is thus limited to the time taken to decide on the necessary equations. Thus, ISPATOM is a real-time dynamic calculator.
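    The comp concept, user-supplied equations over named telemetry symbols re-evaluated whenever an input changes, can be illustrated with a toy evaluator. The telemetry symbol and unit conversion below are hypothetical, and a real comp would subscribe through the Information Sharing Protocol rather than receive values from a Python call.

```python
# Toy illustration of a 'comp': equations over named telemetry symbols,
# re-evaluated whenever an input sample changes. Symbols and the conversion
# are hypothetical placeholders, not actual mission telemetry.

class Comp:
    def __init__(self, equations):
        # equations: {"output_symbol": "expression over input symbols"}
        self.equations = equations
        self.values = {}

    def update(self, symbol, value):
        self.values[symbol] = value
        results = {}
        for out, expr in self.equations.items():
            try:
                results[out] = eval(expr, {"__builtins__": {}}, dict(self.values))
            except NameError:        # some inputs have not arrived yet
                continue
        return results               # downstream displays would consume this

comp = Comp({"CABIN_PRESS_PSI": "CABIN_PRESS_KPA * 0.145038"})
print(comp.update("CABIN_PRESS_KPA", 101.3))
```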

  18. Next Generation Analytic Tools for Large Scale Genetic Epidemiology Studies of Complex Diseases

    PubMed Central

    Mechanic, Leah E.; Chen, Huann-Sheng; Amos, Christopher I.; Chatterjee, Nilanjan; Cox, Nancy J.; Divi, Rao L.; Fan, Ruzong; Harris, Emily L.; Jacobs, Kevin; Kraft, Peter; Leal, Suzanne M.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Province, Michael A.; Ramos, Erin M.; Ritchie, Marylyn D.; Roeder, Kathryn; Schaid, Daniel J.; Stephens, Matthew; Thomas, Duncan C.; Weinberg, Clarice R.; Witte, John S.; Zhang, Shunpu; Zöllner, Sebastian; Feuer, Eric J.; Gillanders, Elizabeth M.

    2012-01-01

    Over the past several years, genome-wide association studies (GWAS) have succeeded in identifying hundreds of genetic markers associated with common diseases. However, most of these markers confer relatively small increments of risk and explain only a small proportion of familial clustering. To identify obstacles to future progress in genetic epidemiology research and provide recommendations to NIH for overcoming these barriers, the National Cancer Institute sponsored a workshop entitled “Next Generation Analytic Tools for Large-Scale Genetic Epidemiology Studies of Complex Diseases” on September 15–16, 2010. The goal of the workshop was to facilitate discussions on (1) statistical strategies and methods to efficiently identify genetic and environmental factors contributing to the risk of complex disease; and (2) how to develop, apply, and evaluate these strategies for the design, analysis, and interpretation of large-scale complex disease association studies in order to guide NIH in setting the future agenda in this area of research. The workshop was organized as a series of short presentations covering scientific (gene-gene and gene-environment interaction, complex phenotypes, and rare variants and next generation sequencing) and methodological (simulation modeling and computational resources and data management) topic areas. Specific needs to advance the field were identified during each session and are summarized. PMID:22147673

  19. ImageParser: a tool for finite element generation from three-dimensional medical images

    PubMed Central

    Yin, HM; Sun, LZ; Wang, G; Yamada, T; Wang, J; Vannier, MW

    2004-01-01

    Background The finite element method (FEM) is a powerful mathematical tool to simulate and visualize the mechanical deformation of tissues and organs during medical examinations or interventions. It is yet a challenge to build up an FEM mesh directly from a volumetric image partially because the regions (or structures) of interest (ROIs) may be irregular and fuzzy. Methods A software package, ImageParser, is developed to generate an FEM mesh from 3-D tomographic medical images. This software uses a semi-automatic method to detect ROIs from the context of the image, including neighboring tissues and organs, completes segmentation of different tissues, and meshes the organ into elements. Results The ImageParser is shown to build up an FEM model for simulating the mechanical responses of the breast based on 3-D CT images. The breast is compressed by two plate paddles under an overall displacement as large as 20% of the initial distance between the paddles. The strain and tangential Young's modulus distributions are specified for the biomechanical analysis of breast tissues. Conclusion The ImageParser can successfully extract the geometry of ROIs from a complex medical image and generate the FEM mesh with customer-defined segmentation information. PMID:15461787
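
    ImageParser itself is not available here; the sketch below only illustrates the general voxel-to-hexahedron idea behind such meshing tools: threshold a 3-D image into an ROI mask and emit one eight-node brick element per retained voxel. Simple thresholding stands in for the paper's semi-automatic segmentation, and all values are assumptions.

        import numpy as np

        def voxels_to_hex_mesh(image, lo, hi, spacing=(1.0, 1.0, 1.0)):
            """Return (nodes, elements) for voxels whose intensity lies in [lo, hi]."""
            mask = (image >= lo) & (image <= hi)          # crude stand-in for segmentation
            nodes, node_id, elements = [], {}, []

            def nid(i, j, k):                             # lazily create corner nodes
                key = (i, j, k)
                if key not in node_id:
                    node_id[key] = len(nodes)
                    nodes.append((i * spacing[0], j * spacing[1], k * spacing[2]))
                return node_id[key]

            for i, j, k in zip(*np.nonzero(mask)):        # one hexahedron per voxel
                c = [(i, j, k), (i+1, j, k), (i+1, j+1, k), (i, j+1, k),
                     (i, j, k+1), (i+1, j, k+1), (i+1, j+1, k+1), (i, j+1, k+1)]
                elements.append([nid(*p) for p in c])
            return np.asarray(nodes), np.asarray(elements)

        volume = np.random.default_rng(1).integers(0, 255, size=(20, 20, 20))
        nodes, elems = voxels_to_hex_mesh(volume, 120, 255)
        print(len(nodes), "nodes,", len(elems), "hex elements")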

  20. Advanced Welding Tool

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Accutron Tool & Instrument Co.'s welder was originally developed as a tool specifically for joining parts made of plastic or composite materials in any atmosphere to include the airless environment of space. Developers decided on induction or magnetic heating to avoid causing deformation and it also can be used with almost any type of thermoplastic material. Induction coil transfers magnetic flux through the plastic to a metal screen that is sandwiched between the sheets of plastic to be joined. When welder is energized, alternating current produces inductive heating on the screen causing the adjacent plastic surfaces to melt and flow into the mesh, creating a bond on the total surface area. Dave Brown, owner of Great Falls Canoe and Kayak Repair, Vienna, VA, uses a special repair technique based on operation of the Induction Toroid Welder to fix canoes. Whitewater canoeing poses the problem of frequent gashes that are difficult to repair. The main reason is that many canoes are made of plastics. The commercial Induction model is a self-contained, portable welding gun with a switch on the handle to regulate the temperature of the plastic melting screen. Welder has a broad range of applications in the automobile, appliance, aerospace and construction industries.

  1. 42 CFR 405.1038 - Deciding a case without a hearing before an ALJ.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Deciding a case without a hearing before an ALJ. 405.1038 Section 405.1038 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH..., Redeterminations, Reconsiderations, and Appeals Under Original Medicare (Part A and Part B) Alj Hearings § 405.1038...

  2. 42 CFR 405.1016 - Time frames for deciding an appeal before an ALJ.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 2 2011-10-01 2011-10-01 false Time frames for deciding an appeal before an ALJ. 405.1016 Section 405.1016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH..., Redeterminations, Reconsiderations, and Appeals Under Original Medicare (Part A and Part B) Alj Hearings § 405.1016...

  3. 42 CFR 405.1016 - Time frames for deciding an appeal before an ALJ.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Time frames for deciding an appeal before an ALJ. 405.1016 Section 405.1016 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH..., Redeterminations, Reconsiderations, and Appeals Under Original Medicare (Part A and Part B) Alj Hearings § 405.1016...

  4. 25 CFR 115.618 - What happens if at the conclusion of the notice and hearing process we decide to encumber your...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... hearing process we decide to encumber your IIM account because of an administrative error which resulted... process we decide to encumber your IIM account because of an administrative error which resulted in funds... INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process...

  5. 25 CFR 115.618 - What happens if at the conclusion of the notice and hearing process we decide to encumber your...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process... hearing process we decide to encumber your IIM account because of an administrative error which resulted... process we decide to encumber your IIM account because of an administrative error which resulted in funds...

  6. 25 CFR 115.618 - What happens if at the conclusion of the notice and hearing process we decide to encumber your...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process... hearing process we decide to encumber your IIM account because of an administrative error which resulted... process we decide to encumber your IIM account because of an administrative error which resulted in funds...

  7. 25 CFR 115.618 - What happens if at the conclusion of the notice and hearing process we decide to encumber your...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process... hearing process we decide to encumber your IIM account because of an administrative error which resulted... process we decide to encumber your IIM account because of an administrative error which resulted in funds...

  8. 25 CFR 115.618 - What happens if at the conclusion of the notice and hearing process we decide to encumber your...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process... hearing process we decide to encumber your IIM account because of an administrative error which resulted... process we decide to encumber your IIM account because of an administrative error which resulted in funds...

  9. Community-based participatory research and user-centered design in a diabetes medication information and decision tool.

    PubMed

    Henderson, Vida A; Barr, Kathryn L; An, Lawrence C; Guajardo, Claudia; Newhouse, William; Mase, Rebecca; Heisler, Michele

    2013-01-01

    Together, community-based participatory research (CBPR), user-centered design (UCD), and health information technology (HIT) offer promising approaches to improve health disparities in low-resource settings. This article describes the application of CBPR and UCD principles to the development of iDecide/Decido, an interactive, tailored, web-based diabetes medication education and decision support tool delivered by community health workers (CHWs) to African American and Latino participants with diabetes in Southwest and Eastside Detroit. The decision aid is offered in English or Spanish and is delivered on an iPad in participants' homes. The overlapping principles of CBPR and UCD used to develop iDecide/Decido include a user-focused or community approach, equitable academic and community partnership in all study phases, an iterative development process that relies on input from all stakeholders, and a program experience that is specified, adapted, and implemented with the target community. Collaboration between community members, researchers, and developers is especially evident in the program's design concept, animations, pictographs, issue cards, goal setting, tailoring, and additional CHW tools. The principles of CBPR and UCD can be successfully applied in developing health information tools that are easy to use and understand, interactive, and target health disparities.

  10. System diagnostic builder: a rule-generation tool for expert systems that do intelligent data evaluation

    NASA Astrophysics Data System (ADS)

    Nieten, Joseph L.; Burke, Roger

    1993-03-01

    The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state-of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data is captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule-bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule-bases can be used in any knowledge-based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the SMS can be used as black box simulations by intelligent computer-aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production, or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.
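
    The SDB's specific induction algorithm is not given in the abstract; as a rough modern analogue of the workflow it describes (expert-labelled data in, readable rules out), the sketch below trains a small decision tree on labelled samples and prints its rules. scikit-learn is assumed to be available, and the sensor names and data are invented; this is not the SDB itself.

        from sklearn.tree import DecisionTreeClassifier, export_text

        # Expert-classified observations of a subject system: two sensor readings
        # per sample, labelled "nominal" or "fault" by a subject matter expert.
        X = [[0.1, 10.2], [0.2, 9.8], [0.3, 10.1], [2.5, 14.0], [2.7, 15.2], [3.0, 13.8]]
        y = ["nominal", "nominal", "nominal", "fault", "fault", "fault"]

        tree = DecisionTreeClassifier(max_depth=2).fit(X, y)

        # The induced tree can be read off as if-then rules for a knowledge base.
        print(export_text(tree, feature_names=["vibration", "temperature"]))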

  11. S&T converging trends in dealing with disaster: A review on AI tools

    NASA Astrophysics Data System (ADS)

    Hasan, Abu Bakar; Isa, Mohd. Hafez Mohd.

    2016-01-01

    Science and Technology (S&T) has been able to help mankind to solve or minimize problems when they arise. Different methodologies, techniques and tools were developed or used for specific cases by researchers, engineers, scientists throughout the world, and numerous papers and articles have been written by them. Nine selected cases such as flash flood, earthquakes, workplace accident, fault in aircraft industry, seismic vulnerability, disaster mitigation and management, and early fault detection in nuclear industry have been studied. This paper looked at those cases, and their results showed that nearly 60% used artificial intelligence (AI) as a tool. This paper also presents a review that will help young researchers in deciding the types of AI tools to be selected, thus pointing to future trends in S&T.

  12. An online tool for tracking soil nitrogen

    NASA Astrophysics Data System (ADS)

    Wang, J.; Umar, M.; Banger, K.; Pittelkow, C. M.; Nafziger, E. D.

    2016-12-01

    Near real-time crop models can be useful tools for optimizing agricultural management practices. For example, model simulations can potentially provide current estimates of nitrogen availability in soil, helping growers decide whether more nitrogen needs to be applied in a given season. Traditionally, crop models have been used at point locations (i.e. single fields) with homogeneous soil, climate and initial conditions. However, nitrogen availability across fields with varied weather and soil conditions at a regional or national level is necessary to guide better management decisions. This study presents the development of a publicly available, online tool that automates the integration of high-spatial-resolution forecast and past weather and soil data in DSSAT to estimate nitrogen availability for individual fields in Illinois. The model has been calibrated with field experiments from the past year at six research corn fields across Illinois. These sites were treated with applications of different N fertilizer timings and amounts. The tool requires minimal management information from growers and yet has the capability to simulate nitrogen-water-crop interactions with calibrated parameters that are more appropriate for Illinois. The results from the tool will be combined with incoming field experiment data from 2016 for model validation and further improvement of the model's predictive accuracy. The tool has the potential to help guide better nitrogen management practices to maximize economic and environmental benefits.
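
    DSSAT's nitrogen routines are far more detailed than can be shown here; the toy daily balance below only illustrates the kind of bookkeeping such a tool automates for a grower (fertilizer and mineralization in, crop uptake and leaching out). Every coefficient and input is an illustrative assumption, not a DSSAT parameter.

        def soil_n_balance(days, n0=40.0, fert=None, rain=None):
            """Toy daily soil nitrate balance in kg N/ha; all rates are illustrative."""
            fert = fert or {}                 # day -> kg N/ha applied
            rain = rain or {}                 # day -> mm rainfall
            n = n0
            series = []
            for d in range(days):
                n += fert.get(d, 0.0)         # fertilizer application
                n += 0.4                      # assumed daily net mineralization
                n -= min(n, 1.2)              # assumed daily crop uptake
                n -= 0.02 * rain.get(d, 0.0)  # crude leaching proportional to rain
                n = max(n, 0.0)
                series.append(n)
            return series

        available = soil_n_balance(60, fert={10: 80.0}, rain={20: 35.0, 21: 20.0})
        print(f"estimated plant-available N on day 60: {available[-1]:.1f} kg/ha")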

  13. Communicating with the New Generations. The Challenge for Pediatric Dentists.

    PubMed

    Saadia, Marc; Valencia, Roberto

    2015-01-01

    Most of the children and parents are virtuous and will give us plenty of reasons to enjoy what we do. Unfortunately, we all know that something is somehow wrong with these new generations. Parents and children sometimes place pediatric dentists in a dilemma. The social structure changes every few years, placing a burden on how to deal with these families. For this reason, dentists might decide to sedate or go to the operating room when these children might be potentially good dental patients. Deciding this course of action does not allow us to bond with them. Bonding with children must be worked at and nurtured. This is part of what pediatric dentists are trained for. This manuscript will illustrate the major changes seen with the new generations of parents and children and how they affect the way we work in our offices. We will show the importance of bonding with parents and children, moving beyond the biological aspects and venturing into the psycho-social and cultural issues. Knowing our children and adolescents will allow us to detect potentially hazardous physical or emotional behavior.

  14. Split views among parents regarding children's right to decide about participation in research: a questionnaire survey.

    PubMed

    Swartling, U; Helgesson, G; Hansson, M G; Ludvigsson, J

    2009-07-01

    Based on extensive questionnaire data, this paper focuses on parents' views about children's right to decide about participation in research. The data originates from 4000 families participating in a longitudinal prospective screening since 1997. Although current regulations and recommendations underline that children should have influence over their participation, many parents in this study disagree. Most (66%) were positive about providing information to the child about relevant aspects of the study. However, responding parents were split about whether or not children should at some point be allowed decisional authority when participating in research: 41.6% of the parents reported being against or unsure. Those who responded positively believed that children should be allowed to decide about blood-sampling procedures (70%), but to a lesser extent about participation (48.5%), analyses of samples (19.7%) and biological bank storage (15.4%). That as many as possible should remain in the study, and that children do not have the competence to understand the consequences for research, was strongly stressed by respondents who do not think children should have a right to decide. When asked what interests they consider most important in paediatric research, child autonomy and decision-making was ranked lowest. We discuss the implications of these findings.

  15. Tracking PACS usage with open source tools.

    PubMed

    French, Todd L; Langer, Steve G

    2011-08-01

    A typical choice faced by Picture Archiving and Communication System (PACS) administrators is deciding how many PACS workstations are needed and where they should be sited. Oftentimes, the social consequences of having too few are severe enough to encourage oversupply and underutilization. This is costly, at best in terms of hardware and electricity, and at worst (depending on the PACS licensing and support model) in capital costs and maintenance fees. The PACS administrator needs tools to assess accurately the use to which her fleet is being subjected, and thus make informed choices before buying more workstations. Lacking a vended solution for this challenge, we developed our own.

  16. 41 CFR 302-12.109 - What must we consider in deciding whether to use the fixed-fee or cost-reimbursable contracting...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... deciding whether to use the fixed-fee or cost-reimbursable contracting method? 302-12.109 Section 302-12... Services Company § 302-12.109 What must we consider in deciding whether to use the fixed-fee or cost...-fee or cost-reimbursable contracting method: (a) Risk of alternative methods. Under a fixed fee...

  17. Next generation human skin constructs as advanced tools for drug development.

    PubMed

    Abaci, H E; Guo, Zongyou; Doucet, Yanne; Jacków, Joanna; Christiano, Angela

    2017-11-01

    Many diseases, as well as side effects of drugs, manifest themselves through skin symptoms. Skin is a complex tissue that hosts various specialized cell types and performs many roles including physical barrier, immune and sensory functions. Therefore, modeling skin in vitro presents technical challenges for tissue engineering. Since the first attempts at engineering human epidermis in 1970s, there has been a growing interest in generating full-thickness skin constructs mimicking physiological functions by incorporating various skin components, such as vasculature and melanocytes for pigmentation. Development of biomimetic in vitro human skin models with these physiological functions provides a new tool for drug discovery, disease modeling, regenerative medicine and basic research for skin biology. This goal, however, has long been delayed by the limited availability of different cell types, the challenges in establishing co-culture conditions, and the ability to recapitulate the 3D anatomy of the skin. Recent breakthroughs in induced pluripotent stem cell (iPSC) technology and microfabrication techniques such as 3D-printing have allowed for building more reliable and complex in vitro skin models for pharmaceutical screening. In this review, we focus on the current developments and prevailing challenges in generating skin constructs with vasculature, skin appendages such as hair follicles, pigmentation, immune response, innervation, and hypodermis. Furthermore, we discuss the promising advances that iPSC technology offers in order to generate in vitro models of genetic skin diseases, such as epidermolysis bullosa and psoriasis. We also discuss how future integration of the next generation human skin constructs onto microfluidic platforms along with other tissues could revolutionize the early stages of drug development by creating reliable evaluation of patient-specific effects of pharmaceutical agents. Impact statement Skin is a complex tissue that hosts various

  18. SIMBA: a web tool for managing bacterial genome assembly generated by Ion PGM sequencing technology.

    PubMed

    Mariano, Diego C B; Pereira, Felipe L; Aguiar, Edgar L; Oliveira, Letícia C; Benevides, Leandro; Guimarães, Luís C; Folador, Edson L; Sousa, Thiago J; Ghosh, Preetam; Barh, Debmalya; Figueiredo, Henrique C P; Silva, Artur; Ramos, Rommel T J; Azevedo, Vasco A C

    2016-12-15

    The evolution of Next-Generation Sequencing (NGS) has considerably reduced the cost per sequenced-base, allowing a significant rise in sequencing projects, mainly in prokaryotes. However, the range of available NGS platforms requires different strategies and software to correctly assemble genomes. Different strategies are necessary to properly complete an assembly project, in addition to the installation or modification of various software. This requires users to have significant expertise in this software and command-line scripting experience on Unix platforms, in addition to basic expertise in methodologies and techniques for genome assembly. These difficulties often delay complete genome assembly projects. In order to overcome this, we developed SIMBA (SImple Manager for Bacterial Assemblies), a freely available web tool that integrates several component tools for assembling and finishing bacterial genomes. SIMBA provides a friendly and intuitive user interface so bioinformaticians, even with low computational expertise, can work under a centralized administrative control system of assemblies managed by the assembly center head. SIMBA guides users through the assembly process via simple and interactive pages. The SIMBA workflow is divided into three modules: (i) projects: allows a general vision of genome sequencing projects, in addition to data quality analysis and data format conversions; (ii) assemblies: allows de novo assemblies with the software Mira, Minia, Newbler and SPAdes, as well as assembly quality validation using the QUAST software; and (iii) curation: presents methods for finishing assemblies through tools for scaffolding contigs and closing gaps. We also presented a case study that validated the efficacy of SIMBA in managing bacterial assembly projects sequenced using Ion Torrent PGM. Besides being a web tool for genome assembly, SIMBA is a complete genome assembly project management system, which can be useful for managing several
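
    SIMBA wraps these assemblers behind a web interface; a bare command-line equivalent of one pass through its assembly and validation modules might look like the sketch below, assuming SPAdes and QUAST are installed and that the reads are single-end Ion Torrent FASTQ. File names, flags and output paths are illustrative assumptions, not SIMBA's own configuration.

        import subprocess

        reads = "ion_torrent_reads.fastq"      # illustrative input file

        # De novo assembly with SPAdes in Ion Torrent mode (one of the assemblers
        # SIMBA integrates), followed by assembly quality metrics from QUAST.
        subprocess.run(["spades.py", "--iontorrent", "-s", reads, "-o", "asm_out"],
                       check=True)
        subprocess.run(["quast.py", "asm_out/contigs.fasta", "-o", "quast_out"],
                       check=True)
        print("see quast_out/report.txt for N50, #contigs and misassembly metrics")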

  19. Generation X Teaches College: Generation Construction as Pedagogical Tool in the Writing Classroom.

    ERIC Educational Resources Information Center

    Hassel, Holly; Epp, Dawn Vernooy

    In the 1996 book "Generation X Goes to College: An Eye-Opening Account of Teaching in Post-Modern America," Peter Sacks probes the "decay" of higher education in the United States; a decay he attributes to listless, entitled students. This paper interrogates the paradigm of Boomers and Generation Xers poised in opposition to…

  20. Deciding Termination for Ancestor Match- Bounded String Rewriting Systems

    NASA Technical Reports Server (NTRS)

    Geser, Alfons; Hofbauer, Dieter; Waldmann, Johannes

    2005-01-01

    Termination of a string rewriting system can be characterized by termination on suitable recursively defined languages. This kind of termination criteria has been criticized for its lack of automation. In an earlier paper we have shown how to construct an automated termination criterion if the recursion is aligned with the rewrite relation. We have demonstrated the technique with Dershowitz's forward closure criterion. In this paper we show that a different approach is suitable when the recursion is aligned with the inverse of the rewrite relation. We apply this idea to Kurth's ancestor graphs and obtain ancestor match-bounded string rewriting systems. Termination is shown to be decidable for this class. The resulting method improves upon those based on match-boundedness or inverse match-boundedness.

  1. 25 CFR 12.62 - Who decides what uniform an Indian country law enforcement officer can wear and who pays for it?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., DEPARTMENT OF THE INTERIOR LAW AND ORDER INDIAN COUNTRY LAW ENFORCEMENT Support Functions § 12.62 Who decides what uniform an Indian country law enforcement officer can wear and who pays for it? Each local law... 25 Indians 1 2011-04-01 2011-04-01 false Who decides what uniform an Indian country law...

  2. 25 CFR 12.62 - Who decides what uniform an Indian country law enforcement officer can wear and who pays for it?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ..., DEPARTMENT OF THE INTERIOR LAW AND ORDER INDIAN COUNTRY LAW ENFORCEMENT Support Functions § 12.62 Who decides what uniform an Indian country law enforcement officer can wear and who pays for it? Each local law... 25 Indians 1 2014-04-01 2014-04-01 false Who decides what uniform an Indian country law...

  3. 25 CFR 12.62 - Who decides what uniform an Indian country law enforcement officer can wear and who pays for it?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ..., DEPARTMENT OF THE INTERIOR LAW AND ORDER INDIAN COUNTRY LAW ENFORCEMENT Support Functions § 12.62 Who decides what uniform an Indian country law enforcement officer can wear and who pays for it? Each local law... 25 Indians 1 2013-04-01 2013-04-01 false Who decides what uniform an Indian country law...

  4. 25 CFR 12.62 - Who decides what uniform an Indian country law enforcement officer can wear and who pays for it?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... THE INTERIOR LAW AND ORDER INDIAN COUNTRY LAW ENFORCEMENT Support Functions § 12.62 Who decides what uniform an Indian country law enforcement officer can wear and who pays for it? Each local law enforcement... 25 Indians 1 2012-04-01 2011-04-01 true Who decides what uniform an Indian country law enforcement...

  5. Next generation tools for high-throughput promoter and expression analysis employing single-copy knock-ins at the Hprt1 locus.

    PubMed

    Yang, G S; Banks, K G; Bonaguro, R J; Wilson, G; Dreolini, L; de Leeuw, C N; Liu, L; Swanson, D J; Goldowitz, D; Holt, R A; Simpson, E M

    2009-03-01

    We have engineered a set of useful tools that facilitate targeted single copy knock-in (KI) at the hypoxanthine guanine phosphoribosyl transferase 1 (Hprt1) locus. We employed fine scale mapping to delineate the precise breakpoint location at the Hprt1(b-m3) locus allowing allele specific PCR assays to be established. Our suite of tools contains four targeting expression vectors and a complementing series of embryonic stem cell lines. Two of these vectors encode enhanced green fluorescent protein (EGFP) driven by the human cytomegalovirus immediate-early enhancer/modified chicken beta-actin (CAG) promoter, whereas the other two permit flexible combinations of a chosen promoter combined with a reporter and/or gene of choice. We have validated our tools as part of the Pleiades Promoter Project (http://www.pleiades.org), with the generation of brain-specific EGFP positive germline mouse strains.

  6. Generation of Look-Up Tables for Dynamic Job Shop Scheduling Decision Support Tool

    NASA Astrophysics Data System (ADS)

    Oktaviandri, Muchamad; Hassan, Adnan; Mohd Shaharoun, Awaluddin

    2016-02-01

    The majority of existing scheduling techniques are based on static demand and deterministic processing time, while most job shop scheduling problems are concerned with dynamic demand and stochastic processing time. As a consequence, the solutions obtained from traditional scheduling techniques are ineffective wherever changes occur to the system. Therefore, this research intends to develop a decision support tool (DST) based on promising artificial intelligence techniques that are able to accommodate the dynamics that regularly occur in job shop scheduling problems. The DST was designed through three phases, i.e. (i) the look-up table generation, (ii) inverse model development and (iii) integration of DST components. This paper reports the generation of look-up tables for various scenarios as part of the development of the DST. A discrete event simulation model was used to compare the performance among SPT, EDD, FCFS, S/OPN and Slack rules; the best performance measures (mean flow time, mean tardiness and mean lateness) and the job order requirements (inter-arrival time, due date tightness and setup time ratio) were compiled into look-up tables. The well-known 6/6/J/Cmax Problem from Muth and Thompson (1963) was used as a case study. In the future, the performance measure of various scheduling scenarios and the job order requirement will be mapped using an ANN inverse model.
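
    The paper's discrete event simulation is not reproduced here; for a flavour of how such look-up-table entries are obtained, the sketch below scores two dispatching rules (SPT and EDD) on a toy single-machine job set and reports mean flow time and mean tardiness. The job data are invented and the single-machine setting is a simplification of the job shop case.

        # jobs: (name, processing_time, due_date) -- invented data
        jobs = [("J1", 4, 10), ("J2", 2, 6), ("J3", 7, 18), ("J4", 3, 20), ("J5", 5, 12)]

        def score(sequence):
            t, flow, tard = 0, [], []
            for _, p, due in sequence:
                t += p                         # completion time on a single machine
                flow.append(t)
                tard.append(max(0, t - due))
            return sum(flow) / len(flow), sum(tard) / len(tard)

        rules = {
            "SPT": sorted(jobs, key=lambda j: j[1]),   # shortest processing time first
            "EDD": sorted(jobs, key=lambda j: j[2]),   # earliest due date first
        }
        for name, seq in rules.items():
            mft, mt = score(seq)
            print(f"{name}: mean flow time = {mft:.1f}, mean tardiness = {mt:.1f}")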

  7. 34 CFR 647.20 - How does the Secretary decide which new grants to make?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary decide which new grants to make? 647.20 Section 647.20 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION RONALD E. MCNAIR POSTBACCALAUREATE ACHIEVEMENT...

  8. Live and Let Die: CITES--How We Decide the Fate of the World's Species.

    ERIC Educational Resources Information Center

    Beasley, Conger, Jr.

    1992-01-01

    Discusses the significance of the decisions made at the Eighth Convention on the International Trade of Endangered Species (CITES) when governmental delegates and nongovernmental organizations from around the world decided the fate of potentially threatened and endangered species of plants and animals. Particular emphasis is placed on the politics…

  9. 14 CFR 11.87 - Are there circumstances in which FAA may decide not to publish a summary of my petition for...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... decide not to publish a summary of my petition for exemption? 11.87 Section 11.87 Aeronautics and Space... in which FAA may decide not to publish a summary of my petition for exemption? The FAA may not publish a summary of your petition for exemption and request comments if you present or we find good cause...

  10. Evaluation of the efficiency and fault density of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1993-01-01

    Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically with inputs through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking. Some check only the finished product while some allow checking of individual modules and combined sets of modules as well. Considering NASA's requirement for reliability, an in-house manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification is warranted.

  11. S&T converging trends in dealing with disaster: A review on AI tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hasan, Abu Bakar, E-mail: abakarh@usim.edu.my; Isa, Mohd Hafez Mohd.

    Science and Technology (S&T) has been able to help mankind to solve or minimize problems when they arise. Different methodologies, techniques and tools were developed or used for specific cases by researchers, engineers, scientists throughout the world, and numerous papers and articles have been written by them. Nine selected cases such as flash flood, earthquakes, workplace accident, fault in aircraft industry, seismic vulnerability, disaster mitigation and management, and early fault detection in nuclear industry have been studied. This paper looked at those cases, and their results showed that nearly 60% used artificial intelligence (AI) as a tool. This paper also presents a review that will help young researchers in deciding the types of AI tools to be selected, thus pointing to future trends in S&T.

  12. Iterating between Tools to Create and Edit Visualizations.

    PubMed

    Bigelow, Alex; Drucker, Steven; Fisher, Danyel; Meyer, Miriah

    2017-01-01

    A common workflow for visualization designers begins with a generative tool, like D3 or Processing, to create the initial visualization; and proceeds to a drawing tool, like Adobe Illustrator or Inkscape, for editing and cleaning. Unfortunately, this is typically a one-way process: once a visualization is exported from the generative tool into a drawing tool, it is difficult to make further, data-driven changes. In this paper, we propose a bridge model to allow designers to bring their work back from the drawing tool to re-edit in the generative tool. Our key insight is to recast this iteration challenge as a merge problem - similar to when two people are editing a document and changes between them need to be reconciled. We also present a specific instantiation of this model, a tool called Hanpuku, which bridges between D3 scripts and Illustrator. We show several examples of visualizations that are iteratively created using Hanpuku in order to illustrate the flexibility of the approach. We further describe several hypothetical tools that bridge between other visualization tools to emphasize the generality of the model.

  13. Sustainability Tools Inventory - Initial Gaps Analysis

    EPA Pesticide Factsheets

    This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities to achieve sustainability. It contributes to SHC 1.61.4

  14. 49 CFR 40.387 - What matters does the Director decide concerning a proposed PIE?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 1 2011-10-01 2011-10-01 false What matters does the Director decide concerning a... TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Public Interest Exclusions § 40.387 What matters does... complete information needed for a decision, the Director may remand the matter to the initiating official...

  15. Do physico-chemical properties of silver nanoparticles decide their interaction with biological media and bactericidal action? A review.

    PubMed

    Pareek, Vikram; Gupta, Rinki; Panwar, Jitendra

    2018-09-01

    The unprecedented increase in antibiotic resistance in this era has resuscitated the attention of the scientific community to exploit silver and its various species as antimicrobial agents. Plenty of studies have been done to measure the antimicrobial potential of silver species (cationic silver, metallic Ag(0) or silver nanoparticles, silver oxide particulates etc.) and indicated membrane damage, oxidative stress, protein dysfunction and DNA damage to be the possible causes of injury to the microbial cell. However, the precise molecular mechanism of their mode of action has remained unclear, which is an obstacle to the generation of potential antibacterial agents against various pathogenic and multidrug resistant (MDR) bacteria. In order to address this issue, one should first have a complete understanding of the resistance mechanisms present in bacteria that can be a therapeutic target for the silver-based drug formulations. Apart from this, in-depth understanding of the interactions of various silver species (with the biological media) is a probable deciding factor for the synthesis of silver-based drug formulations because the particular form and physico-chemical properties of silver can ultimately decide their antimicrobial action. In the context of the above-mentioned concerns, the present article aims to discuss the mechanisms by which bacteria resist various drugs and the effect of physico-chemical properties of silver species on their bactericidal action, as well as critically evaluates the available reports on bacterial transcriptomic and proteomic profiles upon the exposure of various silver species. Further, this review states the mechanism of action that needs to be followed for the complete understanding of the toxic potential of silver nanoparticles, which will open a possibility to synthesize new silver nanoparticle-based antimicrobial systems with desired properties to ensure their safe use, exposure over extended period

  16. How do patients with end-stage ankle arthritis decide between two surgical treatments? A qualitative study.

    PubMed

    Zaidi, Razi; Pfeil, Michael; Macgregor, Alexander J; Goldberg, Andy

    2013-01-01

    To examine how patients decide between ankle fusion and ankle replacement in end-stage ankle arthritis. Purposive patient selection, semistructured interviews, thematic analysis. Royal National Orthopaedic Hospital, Stanmore, UK. 14 patients diagnosed with end-stage ankle osteoarthritis. We interviewed 6 men and 8 women with a mean age of 58 years (range 41-83). All had opted for surgery after failure of at least 6 months of conservative management, sequentially trading-off daily activities to limit the evolving pain. To decide between two offered treatments of ankle fusion and total ankle replacement (TAR), three major sources informed the patients' decision-making process: their surgeon, peers and the internet. The treating surgeon was viewed as the most reliable and influential source of information. Information gleaned from other patients was also important, but with questionable reliability, as was information from the internet, both of which invariably required validation by the surgeon and in some cases the general practitioner. Patients seek knowledge from a wealth of sources including the internet, web forums and other patients. While they leverage each of these sources to guide decision-making, the most important and influential factor in governing how patients decide on any particular surgical intervention is their surgeon. A high quality doctor-patient relationship, coupled with clear, balanced and complete information is essential to enable shared decision-making to become a standard model of care.

  17. 25 CFR 12.62 - Who decides what uniform an Indian country law enforcement officer can wear and who pays for it?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 1 2010-04-01 2010-04-01 false Who decides what uniform an Indian country law enforcement officer can wear and who pays for it? 12.62 Section 12.62 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAW AND ORDER INDIAN COUNTRY LAW ENFORCEMENT Support Functions § 12.62 Who decides what uniform an Indian country law enforcemen...

  18. 20 CFR 405.340 - Deciding a claim without a hearing before an administrative law judge.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... administrative law judge. 405.340 Section 405.340 Employees' Benefits SOCIAL SECURITY ADMINISTRATION ADMINISTRATIVE REVIEW PROCESS FOR ADJUDICATING INITIAL DISABILITY CLAIMS Administrative Law Judge Hearing § 405.340 Deciding a claim without a hearing before an administrative law judge. (a) Decision wholly...

  19. Test Generators: Teacher's Tool or Teacher's Headache?

    ERIC Educational Resources Information Center

    Eiser, Leslie

    1988-01-01

    Discusses the advantages and disadvantages of test generation programs. Includes setting up, printing exams and "bells and whistles." Reviews eight computer packages for Apple and IBM personal computers. Compares features, costs, and usage. (CW)

  20. Machine tool locator

    DOEpatents

    Hanlon, John A.; Gill, Timothy J.

    2001-01-01

    Machine tools can be accurately measured and positioned on manufacturing machines within very small tolerances by use of an autocollimator on a 3-axis mount on a manufacturing machine and positioned so as to focus on a reference tooling ball or a machine tool, a digital camera connected to the viewing end of the autocollimator, and a marker and measure generator for receiving digital images from the camera, then displaying or measuring distances between the projection reticle and the reference reticle on the monitoring screen, and relating the distances to the actual position of the autocollimator relative to the reference tooling ball. The images and measurements are used to set the position of the machine tool and to measure the size and shape of the machine tool tip, and examine cutting edge wear.

  1. 5 CFR 890.1069 - Information the debarring official must consider in deciding a provider's contest of proposed...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL... deciding a provider's contest of proposed penalties and assessments. (a) Documentary material and written...

  2. Green Tool

    EPA Pesticide Factsheets

    The Green Tool represents infiltration-based stormwater control practices. It allows modelers to select a BMP type, channel shape and BMP unit dimensions, outflow control devices, and infiltration method. The program generates an HSPF-formatted FTABLE.
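
    Details of the Green Tool's output format are not reproduced in this record; the sketch below generates an HSPF-style FTABLE (rows of depth, surface area, storage volume and outflow) for a simple trapezoidal BMP cell with an orifice outlet. The unit conventions, column layout and orifice coefficients are stated assumptions rather than the tool's exact format.

        import math

        def ftable_trapezoid(depth_max_ft, length_ft, bottom_w_ft, side_slope,
                             orifice_d_ft, cd=0.6, rows=11):
            """Yield (depth ft, area acres, volume acre-ft, outflow cfs) rows."""
            g = 32.2                                        # ft/s^2
            orifice_area = math.pi * (orifice_d_ft / 2) ** 2
            for i in range(rows):
                d = depth_max_ft * i / (rows - 1)
                top_w = bottom_w_ft + 2 * side_slope * d    # trapezoidal cross-section
                area_sf = length_ft * top_w                 # water surface area, sq ft
                vol_cf = length_ft * d * (bottom_w_ft + top_w) / 2.0
                q = cd * orifice_area * math.sqrt(2 * g * d)  # simple orifice outflow
                yield d, area_sf / 43560.0, vol_cf / 43560.0, q

        print(f"{'DEPTH':>10}{'AREA':>10}{'VOLUME':>10}{'OUTFLOW':>10}")
        for row in ftable_trapezoid(4.0, 100.0, 10.0, 3.0, 0.5):
            print("".join(f"{v:10.4f}" for v in row))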

  3. Metabolomics as a Hypothesis-Generating Functional Genomics Tool for the Annotation of Arabidopsis thaliana Genes of “Unknown Function”

    PubMed Central

    Quanbeck, Stephanie M.; Brachova, Libuse; Campbell, Alexis A.; Guan, Xin; Perera, Ann; He, Kun; Rhee, Seung Y.; Bais, Preeti; Dickerson, Julie A.; Dixon, Philip; Wohlgemuth, Gert; Fiehn, Oliver; Barkan, Lenore; Lange, Iris; Lange, B. Markus; Lee, Insuk; Cortes, Diego; Salazar, Carolina; Shuman, Joel; Shulaev, Vladimir; Huhman, David V.; Sumner, Lloyd W.; Roth, Mary R.; Welti, Ruth; Ilarslan, Hilal; Wurtele, Eve S.; Nikolau, Basil J.

    2012-01-01

    Metabolomics is the methodology that identifies and measures global pools of small molecules (of less than about 1,000 Da) of a biological sample, which are collectively called the metabolome. Metabolomics can therefore reveal the metabolic outcome of a genetic or environmental perturbation of a metabolic regulatory network, and thus provide insights into the structure and regulation of that network. Because of the chemical complexity of the metabolome and limitations associated with individual analytical platforms for determining the metabolome, it is currently difficult to capture the complete metabolome of an organism or tissue, which is in contrast to genomics and transcriptomics. This paper describes the analysis of Arabidopsis metabolomics data sets acquired by a consortium that includes five analytical laboratories, bioinformaticists, and biostatisticians, which aims to develop and validate metabolomics as a hypothesis-generating functional genomics tool. The consortium is determining the metabolomes of Arabidopsis T-DNA mutant stocks, grown in standardized controlled environment optimized to minimize environmental impacts on the metabolomes. Metabolomics data were generated with seven analytical platforms, and the combined data is being provided to the research community to formulate initial hypotheses about genes of unknown function (GUFs). A public database (www.PlantMetabolomics.org) has been developed to provide the scientific community with access to the data along with tools to allow for its interactive analysis. Exemplary datasets are discussed to validate the approach, which illustrate how initial hypotheses can be generated from the consortium-produced metabolomics data, integrated with prior knowledge to provide a testable hypothesis concerning the functionality of GUFs. PMID:22645570

  4. Second generation of ``Miranda procedure'' for CP violation in Dalitz studies of B (and D and τ) decays

    NASA Astrophysics Data System (ADS)

    Bediaga, I.; Miranda, J.; dos Reis, A. C.; Bigi, I. I.; Gomes, A.; Otalora Goicochea, J. M.; Veiga, A.

    2012-08-01

    The “Miranda procedure” proposed for analyzing Dalitz plots for CP asymmetries in charged B and D decays in a model-independent manner is extended and refined in this paper. The complexity of Cabibbo-Kobayashi-Maskawa CP phenomenology through order λ^6 is needed in searches for new dynamics (ND). Detailed analyses of three-body final states offer great advantages: (i) They give us more powerful tools for deciding whether an observed CP asymmetry represents the manifestation of ND and its features. (ii) Many advantages can already be obtained by the Miranda procedure without construction of a detailed Dalitz plot description. (iii) One studies CP asymmetries independent of production asymmetries. We illustrate the power of a second-generation Miranda procedure with examples with time-integrated rates for B_d/B̄_d decays to final states K_Sπ+π- as trial runs, with comments on B±→K±π+π-/K±K+K-.
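
    The abstract does not spell out the per-bin test statistic; in the published Miranda-procedure literature the significance in Dalitz-plot bin i is commonly written in the form below, with α correcting for a global normalization difference between the two samples. The exact form quoted here should be treated as an assumption rather than a transcription from this paper.

        S_{CP}^{i} \;=\; \frac{N_i - \alpha\,\bar{N}_i}{\sqrt{\alpha\,\bigl(N_i + \bar{N}_i\bigr)}},
        \qquad
        \alpha \;=\; \frac{\sum_i N_i}{\sum_i \bar{N}_i}

    where N_i and N̄_i are the numbers of particle and antiparticle candidates in bin i; in the absence of CP violation the S_{CP}^{i} values are expected to follow a standard normal distribution across the Dalitz plot.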

  5. Computer-generated holograms (CGH) realization: the integration of dedicated software tool with digital slides printer

    NASA Astrophysics Data System (ADS)

    Guarnieri, Vittorio; Francini, Franco

    1997-12-01

    The latest generation of digital printers is usually characterized by a spatial resolution high enough to allow the designer to realize a binary CGH directly on a transparent film, avoiding photographic reduction techniques. These devices are able to produce slides or offset prints. Furthermore, services supplied by commercial printing companies provide an inexpensive method to rapidly verify the validity of the design by means of a test-and-trial process. Notably, this low-cost approach appears to be suitable for a didactical environment. On the basis of these considerations, a set of software tools able to design CGH's has been developed. The guidelines inspiring the work have been the following ones: (1) a ray-tracing approach, considering the object to be reproduced as a source of spherical waves; (2) optimization and speed-up of the algorithms used, in order to produce a portable code, runnable on several hardware platforms. In this paper calculation methods to obtain some fundamental geometric functions (points, lines, curves) are described. Furthermore, by the juxtaposition of these primitive functions it is possible to produce the holograms of more complex objects. Many examples of generated CGHs are presented.
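
    The authors' geometric primitives are not reproduced here; the sketch below computes only the simplest case they describe — the binary interference pattern of a single point source (a spherical wave) with an on-axis plane reference wave, thresholded for a binary printing device. Wavelength, pixel pitch and geometry are illustrative assumptions.

        import numpy as np

        wavelength = 633e-9                      # HeNe laser, metres (illustrative)
        k = 2 * np.pi / wavelength
        pitch = 10e-6                            # printer/film pixel pitch (illustrative)
        npix = 1024

        x = (np.arange(npix) - npix / 2) * pitch
        X, Y = np.meshgrid(x, x)
        z0 = 0.2                                 # point source 20 cm behind the plate

        # Path length from the point source to every hologram pixel; the reference
        # wave is a plane wave travelling along the optical axis (zero extra phase).
        r = np.sqrt(X**2 + Y**2 + z0**2)
        fringe = np.cos(k * (r - z0))            # interference term
        binary_cgh = (fringe > 0).astype(np.uint8)   # thresholded for a binary device

        np.save("binary_point_cgh.npy", binary_cgh)  # ready to be rendered/printed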

  6. Early visual analysis tool using magnetoencephalography for treatment and recovery of neuronal dysfunction.

    PubMed

    Rasheed, Waqas; Neoh, Yee Yik; Bin Hamid, Nor Hisham; Reza, Faruque; Idris, Zamzuri; Tang, Tong Boon

    2017-10-01

    Functional neuroimaging modalities play an important role in deciding the diagnosis and course of treatment of neuronal dysfunction and degeneration. This article presents an analytical tool with visualization by exploiting the strengths of the MEG (magnetoencephalographic) neuroimaging technique. The tool automates MEG data import (in tSSS format), channel information extraction, time/frequency decomposition, and circular graph visualization (connectogram) for simple result inspection. For advanced users, the tool also provides magnitude squared coherence (MSC) values allowing personalized threshold levels, and the computation of default model from MEG data of control population. Default model obtained from healthy population data serves as a useful benchmark to diagnose and monitor neuronal recovery during treatment. The proposed tool further provides optional labels with international 10-10 system nomenclature in order to facilitate comparison studies with EEG (electroencephalography) sensor space. Potential applications in epilepsy and traumatic brain injury studies are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
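
    The tool's own pipeline (tSSS import, connectogram rendering, default-model comparison) is not reproduced here; the sketch below only shows the core quantity it thresholds — magnitude squared coherence between two sensor time series — using SciPy, with a user-chosen threshold turning a band-averaged MSC value into a connectogram edge. The sampling rate, band and threshold are illustrative assumptions.

        import numpy as np
        from scipy.signal import coherence

        fs = 1000.0                                   # Hz, illustrative sampling rate
        rng = np.random.default_rng(0)
        t = np.arange(0, 10, 1 / fs)

        # Two toy "sensor" signals sharing a 10 Hz component plus independent noise.
        shared = np.sin(2 * np.pi * 10 * t)
        ch_a = shared + 0.5 * rng.standard_normal(t.size)
        ch_b = 0.8 * shared + 0.5 * rng.standard_normal(t.size)

        f, msc = coherence(ch_a, ch_b, fs=fs, nperseg=1024)   # magnitude squared coherence

        threshold = 0.7                               # personalized threshold, illustrative
        band = (f >= 8) & (f <= 12)                   # alpha-band example
        connected = msc[band].mean() > threshold      # would become one connectogram edge
        print(f"mean alpha-band MSC = {msc[band].mean():.2f}, edge drawn: {connected}")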

  7. Tsunami Generation Modelling for Early Warning Systems

    NASA Astrophysics Data System (ADS)

    Annunziato, A.; Matias, L.; Ulutas, E.; Baptista, M. A.; Carrilho, F.

    2009-04-01

    In the frame of a collaboration between the European Commission Joint Research Centre and the Institute of Meteorology in Portugal, a complete analytical tool to support Early Warning Systems is being developed. The tool will be part of the Portuguese National Early Warning System and will also be used in the frame of the UNESCO North Atlantic Section of the Tsunami Early Warning System. The system, called Tsunami Analysis Tool (TAT), includes a worldwide scenario database that has been pre-calculated using the SWAN-JRC code (Annunziato, 2007). This code uses a simplified fault generation mechanism and the hydraulic model is based on the SWAN code (Mader, 1988). In addition to the pre-defined scenarios, a system of computers is always ready to start a new calculation whenever a new earthquake is detected by the seismic networks (such as USGS or EMSC) and is judged capable of generating a tsunami. The calculation is performed using minimal parameters (epicentre and the magnitude of the earthquake): the programme calculates the rupture length and rupture width by using the empirical relationships proposed by Ward (2002). The database calculations, as well as the newly generated calculations with the current conditions, are therefore available to TAT where the real online analysis is performed. The system also allows analysis of sea level measurements available worldwide in order to compare them and decide if a tsunami is really occurring or not. Although TAT, connected with the scenario database and the online calculation system, is at the moment the only software that can support the tsunami analysis on a global scale, we are convinced that the fault generation mechanism is too simplified to give a correct tsunami prediction. Furthermore, short tsunami arrival times in particular require earthquake source parameter data on tectonic features of the faults, such as strike, dip, rake and slip, in order to minimize real-time uncertainty in the rupture parameters. Indeed the earthquake
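
    Ward's (2002) exact relationships are not quoted in the abstract; the sketch below only shows the general shape of such empirical magnitude-to-rupture scaling (log-linear in moment magnitude). The coefficient values are placeholders in the spirit of published subsurface-rupture regressions, not the relations used operationally by SWAN-JRC or TAT.

        import math

        def rupture_dimensions(mw, a_len=-2.44, b_len=0.59, a_wid=-1.01, b_wid=0.32):
            """Empirical log-linear scaling: log10(L or W, km) = a + b * Mw.

            Coefficients are illustrative placeholders, not the operational values.
            """
            length_km = 10 ** (a_len + b_len * mw)
            width_km = 10 ** (a_wid + b_wid * mw)
            return length_km, width_km

        for mw in (7.0, 8.0, 9.0):
            L, W = rupture_dimensions(mw)
            print(f"Mw {mw:.1f}: rupture ~{L:.0f} km x {W:.0f} km")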

  8. Generating community-built tools for data sharing and analysis in environmental networks

    USGS Publications Warehouse

    Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David

    2016-01-01

    Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.

  9. Virtual tool mark generation for efficient striation analysis in forensic science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekstrand, Laura

    In 2009, a National Academy of Sciences report called for investigation into the scientific basis behind tool mark comparisons (National Academy of Sciences, 2009). Answering this call, Chumbley et al. (2010) attempted to prove or disprove the hypothesis that tool marks are unique to a single tool. They developed a statistical algorithm that could, in most cases, discern matching and non-matching tool marks made at different angles by sequentially numbered screwdriver tips. Moreover, in the cases where the algorithm misinterpreted a pair of marks, an experienced forensics examiner could discern the correct outcome. While this research served to confirm the basic assumptions behind tool mark analysis, it also suggested that statistical analysis software could help to reduce the examiner's workload. This led to a new tool mark analysis approach, introduced in this thesis, that relies on 3D scans of screwdriver tip and marked plate surfaces at the micrometer scale from an optical microscope. These scans are carefully cleaned to remove noise from the data acquisition process and assigned a coordinate system that mathematically defines angles and twists in a natural way. The marking process is then simulated by using a 3D graphics software package to impart rotations to the tip and take the projection of the tip's geometry in the direction of tool travel. The edge of this projection, retrieved from the 3D graphics software, becomes a virtual tool mark. Using this method, virtual marks are made at increments of 5 and compared to a scan of the evidence mark. The previously developed statistical package from Chumbley et al. (2010) performs the comparison, comparing the similarity of the geometry of both marks to the similarity that would occur due to random chance. The resulting statistical measure of the likelihood of the match informs the examiner of the angle of the best matching virtual mark, allowing the examiner to focus his/her mark analysis on a smaller range
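
    The statistical algorithm of Chumbley et al. (2010) is not reproduced in this record; as a simplified stand-in for the comparison step, the sketch below slides one simulated mark profile over another and reports the best normalized correlation and the offset at which it occurs. The simulated profiles and noise level are invented.

        import numpy as np

        rng = np.random.default_rng(3)
        true_mark = np.cumsum(rng.standard_normal(2000))        # invented "tool mark" profile

        # An "evidence" mark: a shifted window of the same profile plus measurement noise.
        shift = 137
        evidence = true_mark[shift:shift + 800] + 0.5 * rng.standard_normal(800)

        def best_match(profile, window):
            """Best normalized correlation of `window` against all offsets of `profile`."""
            best = (-1.0, None)
            for off in range(len(profile) - len(window) + 1):
                seg = profile[off:off + len(window)]
                r = np.corrcoef(seg, window)[0, 1]
                if r > best[0]:
                    best = (r, off)
            return best

        r, off = best_match(true_mark, evidence)
        print(f"best correlation {r:.3f} at offset {off} (true offset {shift})")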

  10. Building energy analysis tool

    DOEpatents

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars

    2016-04-12

    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.

  11. Decision Analysis Tools for Volcano Observatories

    NASA Astrophysics Data System (ADS)

    Hincks, T. H.; Aspinall, W.; Woo, G.

    2005-12-01

    Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
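
    Neither the EXPLORIS event tree nor the open-source BBN library is reproduced here; the sketch below is only a two-node, hand-coded Bayes update of the kind such a tool automates — revising the probability of magmatic unrest when elevated seismicity is observed. All probabilities are invented for illustration.

        # Prior belief about the volcano's state (invented numbers).
        p_unrest = 0.10

        # Likelihood of observing elevated seismicity under each state (invented).
        p_seis_given_unrest = 0.80
        p_seis_given_quiet = 0.15

        # Bayes' rule: update the unrest probability after seeing elevated seismicity.
        p_seis = p_seis_given_unrest * p_unrest + p_seis_given_quiet * (1 - p_unrest)
        p_unrest_given_seis = p_seis_given_unrest * p_unrest / p_seis

        print(f"P(unrest | elevated seismicity) = {p_unrest_given_seis:.2f}")  # ~0.37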

  12. Hybrid texture generator

    NASA Astrophysics Data System (ADS)

    Miyata, Kazunori; Nakajima, Masayuki

    1995-04-01

    A method is given for synthesizing a texture by using the interface of a conventional drawing tool. The majority of conventional texture generation methods are based on the procedural approach, and can generate a variety of textures that are adequate for generating a realistic image. But it is hard for a user to imagine what kind of texture will be generated simply by looking at its parameters. Furthermore, it is difficult to design a new texture freely without a knowledge of all the procedures for texture generation. Our method offers a solution to these problems, and has the following four merits: First, a variety of textures can be obtained by combining a set of feature lines and attribute functions. Second, data definitions are flexible. Third, the user can preview a texture together with its feature lines. Fourth, people can design their own textures interactively and freely by using the interface of a conventional drawing tool. For users who want to build this texture generation method into their own programs, we also give the language specifications for generating a texture. This method can interactively provide a variety of textures, and can also be used for typographic design.
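
    The combination of feature lines and attribute functions can be pictured with a small sketch: a distance field is computed from user-drawn feature points and an attribute function maps that distance to intensity. Both the feature line and the attribute function below are invented examples, not taken from the paper.

    ```python
    import numpy as np

    def feature_line_texture(size, points, attribute):
        """Apply an attribute function to the distance field of a feature line
        (given here simply as a list of vertices) to get a grey-scale texture."""
        ys, xs = np.mgrid[0:size, 0:size]
        dist = np.full((size, size), np.inf)
        for px, py in points:
            dist = np.minimum(dist, np.hypot(xs - px, ys - py))
        return attribute(dist)

    # Example attribute function: damped cosine of distance -> ring-like pattern.
    rings = lambda d: 0.5 + 0.5 * np.cos(d / 4.0) * np.exp(-d / 64.0)
    tex = feature_line_texture(128, [(32, 64), (64, 64), (96, 64)], rings)
    print(tex.shape, float(tex.min()), float(tex.max()))
    ```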

  13. Development of a high-temperature diagnostics-while-drilling tool.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chavira, David J.; Huey, David; Hetmaniak, Chris

    2009-01-01

    The proof-of-concept (POC) DWD system has been fielded in cost-sharing efforts with an industrial partner to support the development of new generation hard-rock drag bits. Following the demonstrated success of the POC DWD system, efforts were initiated in FY05 to design, fabricate and test a high-temperature (HT) capable version of the DWD system. The design temperature for the HT DWD system was 225 C. Programmatic requirements dictated that a HT DWD tool be developed during FY05 and that a working system be demonstrated before the end of FY05. During initial design discussions regarding a high-temperature system it was decided that, to the extent possible, the HT DWD system would maintain functionality similar to the low-temperature system; that is, the HT DWD system would also be designed to provide the driller with real-time information on bit and bottom-hole-assembly (BHA) dynamics while drilling. Additionally, because of time and fiscal constraints associated with the HT system development, the design of the HT DWD tool would follow that of the LT tool. The downhole electronics package would be contained in a concentrically located pressure barrel, and externally applied strain gages with thru-tool connectors would also be carried over to the new design. Also, in order to maximize the potential wells available for the HT DWD system and to allow better comparison with the low-temperature design, the diameter of the tool was maintained at 7 inches. This report discusses the efforts associated with the development of a DWD system capable of sustained operation at 225 C. It documents work performed in the second phase of the Diagnostics-While-Drilling (DWD) project, in which a high-temperature (HT) version of the phase 1 low-temperature (LT) proof-of-concept (POC) DWD tool was built and tested. Descriptions of the design, fabrication and field testing of the HT tool are provided. Background on prior phases of the project can be found in SAND2003-2069 and SAND2000-0239.

  14. Flexible Energy Scheduling Tool for Integrating Variable Generation | Grid

    Science.gov Websites

    FESTIV, the Flexible Energy Scheduling Tool for Integrating Variable Generation, combines unit commitment, security-constrained economic dispatch, and automatic generation control sub-models. Different temporal resolutions and operating strategies can be explored for each sub-model. FESTIV produces not only economic metrics but also reliability metrics of system operations.

  15. 50 CFR 23.2 - How do I decide if these regulations apply to my shipment or me?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... ENDANGERED SPECIES OF WILD FAUNA AND FLORA (CITES) Introduction § 23.2 How do I decide if these regulations... plant species (including parts, products, derivatives, whether wild-collected, or born or propagated in...

  16. 50 CFR 23.2 - How do I decide if these regulations apply to my shipment or me?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... ENDANGERED SPECIES OF WILD FAUNA AND FLORA (CITES) Introduction § 23.2 How do I decide if these regulations... plant species (including parts, products, derivatives, whether wild-collected, or born or propagated in...

  17. 50 CFR 23.2 - How do I decide if these regulations apply to my shipment or me?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... ENDANGERED SPECIES OF WILD FAUNA AND FLORA (CITES) Introduction § 23.2 How do I decide if these regulations... plant species (including parts, products, derivatives, whether wild-collected, or born or propagated in...

  18. 50 CFR 23.2 - How do I decide if these regulations apply to my shipment or me?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... ENDANGERED SPECIES OF WILD FAUNA AND FLORA (CITES) Introduction § 23.2 How do I decide if these regulations... plant species (including parts, products, derivatives, whether wild-collected, or born or propagated in...

  19. I-DECIDE: An Online Intervention Drawing on the Psychosocial Readiness Model for Women Experiencing Domestic Violence.

    PubMed

    Tarzia, Laura; Murray, Elizabeth; Humphreys, Cathy; Glass, Nancy; Taft, Angela; Valpied, Jodie; Hegarty, Kelsey

    2016-01-01

    Domestic violence (DV) perpetrated by men against women is a pervasive global problem with significant physical and emotional consequences. Although some face-to-face interventions in health care settings have shown promise, there are barriers to disclosure to health care practitioners and women may not be ready to access or accept help, reducing uptake. Similar to the mental health field, interventions from clinical practice can be adapted to be delivered by technology. This article outlines the theoretical and conceptual development of I-DECIDE, an online healthy relationship tool and safety decision aid for women experiencing DV. The article explores the use of the Psychosocial Readiness Model (PRM) as a theoretical framework for the intervention and evaluation. This is a theoretical article drawing on current theory and literature around health care and online interventions for DV. The article argues that the Internet as a method of intervention delivery for DV might overcome many of the barriers present in health care settings. Using the PRM as a framework for an online DV intervention may help women on a pathway to safety and well-being for themselves and their children. This hypothesis will be tested in a randomized, controlled trial in 2015/2016. This article highlights the importance of using a theoretical model in intervention development and evaluation. Copyright © 2016 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.

  20. Involving people with dementia in developing an interactive web tool for shared decision-making: experiences with a participatory design approach.

    PubMed

    Span, Marijke; Hettinga, Marike; Groen-van de Ven, Leontine; Jukema, Jan; Janssen, Ruud; Vernooij-Dassen, Myrra; Eefsting, Jan; Smits, Carolien

    2018-06-01

    The aim of this study was to gain insight into the participatory design approach of involving people with dementia in the development of the DecideGuide, an interactive web tool facilitating shared decision-making in their care networks. An explanatory case study design was used in developing the DecideGuide. A secondary analysis focused on the data gathered from the participating people with dementia during the development stages: semi-structured interviews (n = 23), four focus group interviews (n = 18), usability tests (n = 3), and a field study (n = 4). Content analysis was applied to the data. Four themes emerged as important regarding the experiences of involving people with dementia in research: valuable feedback on content and design of the DecideGuide, motivation to participate, perspectives of people with dementia and others about distress related to involvement, and time investment. People with dementia can give essential feedback and, therefore, their contribution is useful and valuable. Meaningful participation of people with dementia takes time, which should be taken into account. It is important for people with dementia to be able to reciprocate the efforts others make and to feel of significance to others. Implications for Rehabilitation: People with dementia can contribute meaningfully to the content and design, and their perspective is essential for developing useful and user-friendly tools. Participating in research activities may contribute to social inclusion, empowerment, and quality of life of people with dementia.

  1. Study protocol for 'we DECide': implementation of advance care planning for nursing home residents with dementia.

    PubMed

    Ampe, Sophie; Sevenants, Aline; Coppens, Evelien; Spruytte, Nele; Smets, Tinne; Declercq, Anja; van Audenhove, Chantal

    2015-05-01

    To evaluate the effects of 'we DECide', an educational intervention for nursing home staff on shared decision-making in the context of advance care planning for residents with dementia. Advance care planning (preparing care choices for when persons no longer have decision-making capacity) is of utmost importance for nursing home residents with dementia, but is mostly not realized for this group. Advance care planning consists of discussing care choices and making decisions and corresponds to shared decision-making (the involvement of persons and their families in care and treatment decisions). This quasi-experimental pre-test-post-test study is conducted in 19 nursing homes (Belgium). Participants are nursing home staff. 'We DECide' focuses on three crucial moments for discussing advance care planning: the time of admission, crisis situations and everyday conversations. The 'ACP-audit' assesses participants' views on the organization of advance care planning (organizational level), the 'OPTION scale' evaluates the degree of shared decision-making in individual conversations (clinical level) and the 'IFC-SDM Questionnaire' assesses participants' views on Importance, Frequency and Competence of realizing shared decision-making (clinical level). (Project funded: July 2010). The study hypothesis is that 'we DECide' results in a higher realization of shared decision-making in individual conversations on advance care planning. A better implementation of advance care planning will lead to a higher quality of end-of-life care and more person-centred care. We believe our study will be of interest to researchers and to professional nursing home caregivers and policy-makers. © 2014 John Wiley & Sons Ltd.

  2. Advanced genetic tools for plant biotechnology.

    PubMed

    Liu, Wusheng; Yuan, Joshua S; Stewart, C Neal

    2013-11-01

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  3. 20 CFR 416.1448 - Deciding a case without an oral hearing before an administrative law judge.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Deciding a case without an oral hearing before an administrative law judge. 416.1448 Section 416.1448 Employees' Benefits SOCIAL SECURITY... Review Process, and Reopening of Determinations and Decisions Administrative Law Judge Hearing Procedures...

  4. Dynamic Hurricane Data Analysis Tool

    NASA Technical Reports Server (NTRS)

    Knosp, Brian W.; Li, Peggy; Vu, Quoc A.

    2009-01-01

    A dynamic hurricane data analysis tool allows users of the JPL Tropical Cyclone Information System (TCIS) to analyze data over a Web medium. The TCIS software is described in the previous article, Tropical Cyclone Information System (TCIS) (NPO-45748). This tool interfaces with the TCIS database to pull in data from several different atmospheric and oceanic data sets observed by instruments. Users can use this information to generate histograms, maps, and profile plots for specific storms. The tool also displays statistical values for the user-selected parameter, including the mean, standard deviation, median, minimum, and maximum values. There is little wait time, allowing for fast data plots over date and spatial ranges. Users may also zoom in for a closer look at a particular spatial range. This is version 1 of the software. Researchers will use the data and tools on the TCIS to understand hurricane processes, improve hurricane forecast models, and identify what types of measurements the next generation of instruments will need to collect.
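
    The statistics the tool reports can be reproduced in a few lines; the sea-surface temperature values below are made up for illustration.

    ```python
    import numpy as np

    def storm_summary(values):
        """Mean, standard deviation, median, minimum and maximum of a
        user-selected storm parameter, as listed by the tool."""
        v = np.asarray(values, dtype=float)
        return {"mean": float(v.mean()), "std": float(v.std(ddof=1)),
                "median": float(np.median(v)), "min": float(v.min()), "max": float(v.max())}

    print(storm_summary([28.4, 28.9, 29.1, 27.8, 28.6]))  # hypothetical SSTs (deg C)
    ```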

  5. A Systematic Literature Review on Evaluation of Digital Tools for Authoring Evidence-Based Clinical Guidelines.

    PubMed

    Khodambashi, Soudabeh; Nytrø, Øystein

    2017-01-01

    To facilitate the clinical guideline (GL) development process, different groups of researchers have proposed tools that enable computer-supported authoring and publishing of GLs. In a previous study we interviewed GL authors in different Norwegian institutions and identified tool shortcomings. In this follow-up study our goal is to explore to what extent GL authoring tools have been evaluated by researchers, guideline organisations, or GL authors. This article presents results from a systematic literature review of evaluation (including usability) of GL authoring tools. A controlled database search and backward snow-balling were used to identify relevant articles. From the 12692 abstracts found, 188 papers were fully reviewed and 26 papers were identified as relevant. The GRADEPro tool has attracted some evaluation; however, popular tools and platforms such as DECIDE, Doctor Evidence, JBI-SUMARI, and the G-I-N library have not been subject to specific evaluation from an authoring perspective. Overall, we found that little attention has been paid to the evaluation of these tools. We could not find any evaluation relevant to how tools integrate with and support the complex GL development workflow. The results of this paper are highly relevant to GL authors, tool developers and GL publishing organisations in order to improve and control the GL development and maintenance process.

  6. 20 CFR 404.948 - Deciding a case without an oral hearing before an administrative law judge.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Deciding a case without an oral hearing before an administrative law judge. 404.948 Section 404.948 Employees' Benefits SOCIAL SECURITY... Process, and Reopening of Determinations and Decisions Administrative Law Judge Hearing Procedures § 404...

  7. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    PubMed

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.

  8. deepTools2: a next generation web server for deep-sequencing data analysis.

    PubMed

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-08

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continue to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. Are women deciding against home births in low and middle income countries?

    PubMed

    Amoako Johnson, Fiifi; Padmadas, Sabu S; Matthews, Zoë

    2013-01-01

    Although there is evidence tracking progress towards facility births within the UN Millennium Development Goals framework, we do not know whether women are deciding against home birth over their reproductive lives. Using Demographic and Health Surveys (DHS) data from 44 countries, this study aims to investigate the patterns and shifts in childbirth locations and to determine whether these shifts are in favour of home or health settings. The analyses considered 108,777 women who had at least two births in the five years preceding the most recent DHS over the period 2000-2010. The vast majority of women opted for the same place of childbirth for their successive births. However, about 14% did switch their place, and not all these decisions favoured a health facility over a home setting. In 24 of the 44 countries analysed, a higher proportion of women switched from a health facility to home. Multilevel regression analyses show significantly higher odds of switching from home to a facility for high parity women, those with frequent antenatal visits and more wealth. However, in countries with high infant mortality rates, low parity women had an increased probability of switching from home to a health facility. There is clear evidence that women do change their childbirth locations over successive births in low and middle income countries. After two decades of efforts to improve maternal health, it might be expected that a higher proportion of women will be deciding against home births in favour of facility births. The results from this analysis show that this is not the case.

  10. Advanced genetic tools for plant biotechnology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, WS; Yuan, JS; Stewart, CN

    2013-10-09

    Basic research has provided a much better understanding of the genetic networks and regulatory hierarchies in plants. To meet the challenges of agriculture, we must be able to rapidly translate this knowledge into generating improved plants. Therefore, in this Review, we discuss advanced tools that are currently available for use in plant biotechnology to produce new products in plants and to generate plants with new functions. These tools include synthetic promoters, 'tunable' transcription factors, genome-editing tools and site-specific recombinases. We also review some tools with the potential to enable crop improvement, such as methods for the assembly and synthesis of large DNA molecules, plant transformation with linked multigenes and plant artificial chromosomes. These genetic technologies should be integrated to realize their potential for applications to pressing agricultural and environmental problems.

  11. Low-energy electron beam proximity projection lithography (LEEPL): the world's first e-beam production tool, LEEPL 3000

    NASA Astrophysics Data System (ADS)

    Behringer, Uwe F. W.

    2004-06-01

    In June 2000 the companies Accretech and LEEPL Corporation decided to develop an e-beam lithography tool for high-throughput wafer exposure, called LEEPL. In an amazingly short time the alpha tool was built. In 2002 the beta tool was installed at Accretech. Today the first production tool, the LEEPL 3000, is ready to be shipped. The 2 keV e-beam tool will be used in a mix-and-match lithography strategy with optical exposure tools to expose critical levels such as gate structures, contact holes (CH), and via patterns of the 90 nm and 65 nm nodes. At the SEMATECH EPL workshop on September 22nd in Cambridge, England, it was mentioned that the number of these levels will increase very rapidly (8 in 2007; 13 in 2010; 17 in 2013). The production tool for the 45 nm node is scheduled for mid 2005, and for the 32 nm node in 2008. Figure 1 shows, from left to right, the α-tool, the β-tool and the production tool LEEPL 3000; it also shows the timetable of the four LEEPL forums, all held in Japan.

  12. A Graphic Symbol Tool for the Evaluation of Communication, Satisfaction and Priorities of Individuals with Intellectual Disability Who Use a Speech Generating Device

    ERIC Educational Resources Information Center

    Valiquette, Christine; Sutton, Ann; Ska, Bernadette

    2010-01-01

    This article reports on the views of individuals with learning disability (LD) on their use of their speech generating devices (SGDs), their satisfaction about their communication, and their priorities. The development of an interview tool made of graphic symbols and entitled Communication, Satisfaction and Priorities of SGD Users (CSPU) is…

  13. Involving Communities in Deciding What Benefits They Receive in Multinational Research

    PubMed Central

    Wendler, David; Shah, Seema

    2015-01-01

    There is wide agreement that communities in lower-income countries should benefit when they participate in multinational research. Debate now focuses on how and to what extent these communities should benefit. This debate has identified compelling reasons to reject the claim that whatever benefits a community agrees to accept are necessarily fair. Yet, those who conduct clinical research may conclude from this rejection that there is no reason to involve communities in the process of deciding how they benefit. Against this possibility, the present manuscript argues that involving host communities in this process helps to promote four important goals: (1) protecting host communities, (2) respecting host communities, (3) promoting transparency, and (4) enhancing social value. PMID:26224724

  14. ICO amplicon NGS data analysis: a Web tool for variant detection in common high-risk hereditary cancer genes analyzed by amplicon GS Junior next-generation sequencing.

    PubMed

    Lopez-Doriga, Adriana; Feliubadaló, Lídia; Menéndez, Mireia; Lopez-Doriga, Sergio; Morón-Duran, Francisco D; del Valle, Jesús; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Campos, Olga; Gómez, Carolina; Pineda, Marta; González, Sara; Moreno, Victor; Capellá, Gabriel; Lázaro, Conxi

    2014-03-01

    Next-generation sequencing (NGS) has revolutionized genomic research and is set to have a major impact on genetic diagnostics thanks to the advent of benchtop sequencers and flexible kits for targeted libraries. Among the main hurdles in NGS are the difficulty of performing bioinformatic analysis of the huge volume of data generated and the high number of false positive calls that could be obtained, depending on the NGS technology and the analysis pipeline. Here, we present the development of a free and user-friendly Web data analysis tool that detects and filters sequence variants, provides coverage information, and allows the user to customize some basic parameters. The tool has been developed to provide accurate genetic analysis of targeted sequencing of common high-risk hereditary cancer genes using amplicon libraries run in a GS Junior System. The Web resource is linked to our own mutation database, to assist in the clinical classification of identified variants. We believe that this tool will greatly facilitate the use of the NGS approach in routine laboratories.
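
    A minimal sketch of the kind of coverage- and frequency-based filtering such a pipeline applies to amplicon variant calls; the thresholds, record layout, and coordinates below are assumptions for illustration, not the tool's actual rules.

    ```python
    MIN_COVERAGE = 50           # minimum reads covering the position
    MIN_VARIANT_FRACTION = 0.2  # minimum fraction of reads supporting the variant

    # Hypothetical amplicon calls: (gene, position, ref, alt, depth, alt_reads)
    variants = [
        ("BRCA1", 41245466, "G", "A", 812, 390),
        ("BRCA2", 32907420, "T", "C", 35, 20),    # fails the coverage filter
        ("MLH1",  37053568, "C", "T", 640, 15),   # fails the frequency filter
    ]

    def passes_filters(depth, alt_reads):
        return depth >= MIN_COVERAGE and alt_reads / depth >= MIN_VARIANT_FRACTION

    for gene, pos, ref, alt, depth, alt_reads in variants:
        status = "PASS" if passes_filters(depth, alt_reads) else "FILTERED"
        print(f"{gene}:{pos} {ref}>{alt} depth={depth} vaf={alt_reads / depth:.2f} {status}")
    ```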

  15. Regulatory sequence analysis tools.

    PubMed

    van Helden, Jacques

    2003-07-01

    The web resource Regulatory Sequence Analysis Tools (RSAT) (http://rsat.ulb.ac.be/rsat) offers a collection of software tools dedicated to the prediction of regulatory sites in non-coding DNA sequences. These tools include sequence retrieval, pattern discovery, pattern matching, genome-scale pattern matching, feature-map drawing, random sequence generation and other utilities. Alternative formats are supported for the representation of regulatory motifs (strings or position-specific scoring matrices) and several algorithms are proposed for pattern discovery. RSAT currently holds >100 fully sequenced genomes and these data are regularly updated from GenBank.
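
    Pattern matching against a position-specific scoring matrix, one of the motif representations RSAT supports, can be sketched as follows; the matrix, threshold, and sequence are invented.

    ```python
    # Log-odds scores for A, C, G, T at each position of a 4-bp motif (made up).
    pssm = [
        {"A": 1.2, "C": -1.0, "G": -0.8, "T": 0.1},
        {"A": -1.0, "C": 1.3, "G": -0.5, "T": -0.9},
        {"A": -0.7, "C": -0.6, "G": 1.1, "T": -1.1},
        {"A": 1.0, "C": -1.2, "G": -0.9, "T": 0.2},
    ]

    def scan(sequence, matrix, threshold=2.0):
        """Slide the matrix along the sequence and report windows above threshold."""
        width = len(matrix)
        hits = []
        for i in range(len(sequence) - width + 1):
            window = sequence[i:i + width]
            score = sum(col[base] for col, base in zip(matrix, window))
            if score >= threshold:
                hits.append((i, window, round(score, 2)))
        return hits

    print(scan("TTACGACAGATACGT", pssm))
    ```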

  16. Next-generation sequencing in clinical virology: Discovery of new viruses.

    PubMed

    Datta, Sibnarayan; Budhauliya, Raghvendra; Das, Bidisha; Chatterjee, Soumya; Vanlalhmuaka; Veer, Vijay

    2015-08-12

    Viruses are a cause of significant health problems worldwide, especially in the developing nations. Due to different anthropological activities, human populations are exposed to different viral pathogens, many of which emerge as outbreaks. In such situations, discovery of novel viruses is of utmost importance for deciding prevention and treatment strategies. Since the last century, a number of different virus discovery methods, based on cell culture inoculation and sequence-independent PCR, have been used for identification of a variety of viruses. However, the recent emergence and commercial availability of next-generation sequencers (NGS) has entirely changed the field of virus discovery. These massively parallel sequencing platforms can sequence a mixture of genetic materials from a very heterogeneous mix, with high sensitivity. Moreover, these platforms work in a sequence-independent manner, making them ideal tools for virus discovery. However, for their application in clinics, sample preparation or enrichment is necessary to detect low abundance virus populations. A number of techniques have also been developed for enrichment of viral nucleic acids. In this manuscript, we review the evolution of sequencing, the NGS technologies available today, and widely used virus enrichment technologies. We also discuss the challenges associated with their applications in clinical virus discovery.

  17. Developing and Evaluating Communication Strategies to Support Informed Decisions and Practice Based on Evidence (DECIDE): protocol and preliminary results.

    PubMed

    Treweek, Shaun; Oxman, Andrew D; Alderson, Philip; Bossuyt, Patrick M; Brandt, Linn; Brożek, Jan; Davoli, Marina; Flottorp, Signe; Harbour, Robin; Hill, Suzanne; Liberati, Alessandro; Liira, Helena; Schünemann, Holger J; Rosenbaum, Sarah; Thornton, Judith; Vandvik, Per Olav; Alonso-Coello, Pablo

    2013-01-09

    Healthcare decision makers face challenges when using guidelines, including understanding the quality of the evidence or the values and preferences upon which recommendations are made, which are often not clear. GRADE is a systematic approach towards assessing the quality of evidence and the strength of recommendations in healthcare. GRADE also gives advice on how to go from evidence to decisions. It has been developed to address the weaknesses of other grading systems and is now widely used internationally. The Developing and Evaluating Communication Strategies to Support Informed Decisions and Practice Based on Evidence (DECIDE) consortium (http://www.decide-collaboration.eu/), which includes members of the GRADE Working Group and other partners, will explore methods to ensure effective communication of evidence-based recommendations targeted at key stakeholders: healthcare professionals, policymakers, and managers, as well as patients and the general public. Surveys and interviews with guideline producers and other stakeholders will explore how presentation of the evidence could be improved to better meet their information needs. We will collect further stakeholder input from advisory groups, via consultations and user testing; this will be done across a wide range of healthcare systems in Europe, North America, and other countries. Targeted communication strategies will be developed, evaluated in randomized trials, refined, and assessed during the development of real guidelines. Results of the DECIDE project will improve the communication of evidence-based healthcare recommendations. Building on the work of the GRADE Working Group, DECIDE will develop and evaluate methods that address communication needs of guideline users. The project will produce strategies for communicating recommendations that have been rigorously evaluated in diverse settings, and it will support the transfer of research into practice in healthcare systems globally.

  18. Aided generation of search interfaces to astronomical archives

    NASA Astrophysics Data System (ADS)

    Zorba, Sonia; Bignamini, Andrea; Cepparo, Francesco; Knapic, Cristina; Molinaro, Marco; Smareglia, Riccardo

    2016-07-01

    Astrophysical data provider organizations that host web-based interfaces for access to data resources have to cope with possible changes in data management that imply partial rewrites of web applications. To avoid doing this manually, it was decided to develop a dynamically configurable Java EE web application that can set itself up by reading the needed information from configuration files. The specification of what information the astronomical archive database has to expose is managed using the TAP_SCHEMA schema from the IVOA TAP recommendation, which can be edited using a graphical interface. When the configuration steps are done, the tool builds a war file to allow easy deployment of the application.

  19. College or Training Programs: How to Decide. PACER Center ACTion Information Sheets. PHP-c115

    ERIC Educational Resources Information Center

    PACER Center, 2006

    2006-01-01

    A high school diploma opens the door to many exciting new options. These might include a first full-time job, or part-time or full-time attendance at a technical school, community college, or university. Students might want to obtain a certificate, an associate degree, or a diploma. With so many choices, it can be a challenge to decide which path…

  20. Benchmarking short sequence mapping tools

    PubMed Central

    2013-01-01

    Background The development of next-generation sequencing instruments has led to the generation of millions of short sequences in a single run. The process of aligning these reads to a reference genome is time consuming and demands the development of fast and accurate alignment tools. However, the current proposed tools make different compromises between the accuracy and the speed of mapping. Moreover, many important aspects are overlooked while comparing the performance of a newly developed tool to the state of the art. Therefore, there is a need for an objective evaluation method that covers all the aspects. In this work, we introduce a benchmarking suite to extensively analyze sequencing tools with respect to various aspects and provide an objective comparison. Results We applied our benchmarking tests on 9 well known mapping tools, namely, Bowtie, Bowtie2, BWA, SOAP2, MAQ, RMAP, GSNAP, Novoalign, and mrsFAST (mrFAST) using synthetic data and real RNA-Seq data. MAQ and RMAP are based on building hash tables for the reads, whereas the remaining tools are based on indexing the reference genome. The benchmarking tests reveal the strengths and weaknesses of each tool. The results show that no single tool outperforms all others in all metrics. However, Bowtie maintained the best throughput for most of the tests while BWA performed better for longer read lengths. The benchmarking tests are not restricted to the mentioned tools and can be further applied to others. Conclusion The mapping process is still a hard problem that is affected by many factors. In this work, we provided a benchmarking suite that reveals and evaluates the different factors affecting the mapping process. Still, there is no tool that outperforms all of the others in all the tests. Therefore, the end user should clearly specify his needs in order to choose the tool that provides the best results. PMID:23758764
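
    The core accuracy measurement in such a benchmark can be sketched simply: compare each simulated read's true origin with the position a mapper reports, within a tolerance. The read names, positions, and tolerance below are placeholders.

    ```python
    def evaluate(truth, reported, tolerance=5):
        """Recall and precision of reported mapping positions against the
        known origins of simulated reads."""
        correct = sum(1 for read_id, true_pos in truth.items()
                      if read_id in reported and abs(reported[read_id] - true_pos) <= tolerance)
        mapped = len(reported)
        return {"recall": correct / len(truth),
                "precision": correct / mapped if mapped else 0.0}

    truth = {"r1": 100, "r2": 2_050, "r3": 999_000}   # simulated read origins
    reported = {"r1": 102, "r2": 7_000}               # hypothetical mapper output
    print(evaluate(truth, reported))  # throughput would be reads / wall-clock mapping time
    ```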

  1. Fluid sampling tool

    DOEpatents

    Johnston, Roger G.; Garcia, Anthony R. E.; Martinez, Ronald K.

    2001-09-25

    The invention includes a rotatable tool for collecting fluid through the wall of a container. The tool includes a fluid collection section with a cylindrical shank having an end portion for drilling a hole in the container wall when the tool is rotated, and a threaded portion for tapping the hole in the container wall. A passageway in the shank in communication with at least one radial inlet hole in the drilling end and an opening at the end of the shank is adapted to receive fluid from the container. The tool also includes a cylindrical chamber affixed to the end of the shank opposite to the drilling portion thereof for receiving and storing fluid passing through the passageway. The tool also includes a flexible, deformable gasket that provides a fluid-tight chamber to confine kerf generated during the drilling and tapping of the hole. The invention also includes a fluid extractor section for extracting fluid samples from the fluid collecting section.

  2. Towards a New Generation of Time-Series Visualization Tools in the ESA Heliophysics Science Archives

    NASA Astrophysics Data System (ADS)

    Perez, H.; Martinez, B.; Cook, J. P.; Herment, D.; Fernandez, M.; De Teodoro, P.; Arnaud, M.; Middleton, H. R.; Osuna, P.; Arviset, C.

    2017-12-01

    During the last decades, a varied set of Heliophysics missions has allowed the scientific community to gain a better knowledge of the solar atmosphere and activity. The remote sensing images of missions such as SOHO have paved the way for Helio-based spatial data visualization software such as JHelioViewer/Helioviewer. On the other hand, the huge amount of in-situ measurements provided by other missions such as Cluster provides a wide base for plot visualization software whose reach is still far from being fully exploited. The Heliophysics Science Archives within the ESAC Science Data Center (ESDC) already provide a first generation of tools for time-series visualization focusing on each mission's needs: visualization of quicklook plots, cross-calibration time series, pre-generated/on-demand multi-plot stacks (Cluster), basic plot zoom in/out options (Ulysses), and easy navigation through the plots in time (Ulysses, Cluster, ISS-Solaces). However, as needs evolve, scientists involved in new missions require multi-variable plots, heat-map stacks, interactive synchronization, and axis variable selection, among other improvements. The new Heliophysics archives (such as Solar Orbiter) and the evolution of existing ones (Cluster) intend to address these new challenges. This paper provides an overview of the different approaches to visualizing time series followed within the ESA Heliophysics Archives and their foreseen evolution.

  3. Formal Methods Tool Qualification

    NASA Technical Reports Server (NTRS)

    Wagner, Lucas G.; Cofer, Darren; Slind, Konrad; Tinelli, Cesare; Mebsout, Alain

    2017-01-01

    Formal methods tools have been shown to be effective at finding defects in safety-critical digital systems including avionics systems. The publication of DO-178C and the accompanying formal methods supplement DO-333 allows applicants to obtain certification credit for the use of formal methods without providing justification for them as an alternative method. This project conducted an extensive study of existing formal methods tools, identifying obstacles to their qualification and proposing mitigations for those obstacles. Further, it interprets the qualification guidance for existing formal methods tools and provides case study examples for open source tools. This project also investigates the feasibility of verifying formal methods tools by generating proof certificates which capture proof of the formal methods tool's claim, which can be checked by an independent, proof certificate checking tool. Finally, the project investigates the feasibility of qualifying this proof certificate checker, in the DO-330 framework, in lieu of qualifying the model checker itself.

  4. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to Shuttle Mission Simulator

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.
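
    A toy sketch of inductive rule generation from expert-classified data, in the spirit of (but much simpler than) the SDB: a one-attribute threshold rule is chosen to best separate 'nominal' from 'fault' records. The attribute names and values are invented.

    ```python
    records = [  # hypothetical classified telemetry snapshots
        {"pump_pressure": 31.0, "valve_temp": 140, "label": "nominal"},
        {"pump_pressure": 12.0, "valve_temp": 150, "label": "fault"},
        {"pump_pressure": 15.5, "valve_temp": 230, "label": "fault"},
        {"pump_pressure": 29.8, "valve_temp": 145, "label": "nominal"},
    ]

    def induce_rule(data, attribute):
        """Pick the threshold on one attribute that best separates the labels."""
        best = None
        for cut in sorted(r[attribute] for r in data):
            predict = lambda r: "fault" if r[attribute] <= cut else "nominal"
            accuracy = sum(predict(r) == r["label"] for r in data) / len(data)
            if best is None or accuracy > best[1]:
                best = (cut, accuracy)
        return f"IF {attribute} <= {best[0]} THEN fault", best[1]

    for attr in ("pump_pressure", "valve_temp"):
        print(induce_rule(records, attr))
    ```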

  5. Architecture for the Next Generation System Management Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallard, Jerome; Lebre, I Adrien; Morin, Christine

    2011-01-01

    To get more results or greater accuracy, computational scientists execute their applications on distributed computing platforms such as Clusters, Grids and Clouds. These platforms are different in terms of hardware and software resources as well as locality: some span across multiple sites and multiple administrative domains whereas others are limited to a single site/domain. As a consequence, in order to scale their applications up the scientists have to manage technical details for each target platform. From our point of view, this complexity should be hidden from the scientists who, in most cases, would prefer to focus on their research rather than spending time dealing with platform configuration concerns. In this article, we advocate for a system management framework that aims to automatically set up the whole run-time environment according to the applications' needs. The main difference with regard to usual approaches is that they generally only focus on the software layer whereas we address both the hardware and the software expectations through a unique system. For each application, scientists describe their requirements through the definition of a Virtual Platform (VP) and a Virtual System Environment (VSE). Relying on the VP/VSE definitions, the framework is in charge of: (i) the configuration of the physical infrastructure to satisfy the VP requirements, (ii) the setup of the VP, and (iii) the customization of the execution environment (VSE) upon the former VP. We propose a new formalism that the system can rely upon to successfully perform each of these three steps without burdening the user with the specifics of the configuration for the physical resources and system management tools. This formalism leverages Goldberg's theory for recursive virtual machines by introducing new concepts based on system virtualization (identity, partitioning, aggregation) and emulation (simple, abstraction). This enables the definition of complex VP

  6. 2 CFR 180.845 - What does the debarring official consider in deciding whether to debar me?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false What does the debarring official consider in deciding whether to debar me? 180.845 Section 180.845 Grants and Agreements OFFICE OF MANAGEMENT AND BUDGET.... The record includes— (1) All information in support of the debarring official's proposed debarment; (2...

  7. The Complexity Analysis Tool

    DTIC Science & Technology

    1988-10-01

    This document provides an overview of the complexity analysis tool (CAT), an automated tool which will analyze mission critical computer resources (MCCR) software. CAT is based on a complexity metric that it automates for BASIC (HP-71), ATLAS (EQUATE), and Ada (subset, UNIX 5.2). CAT analyzes source code and computes complexity on a module basis. CAT also generates graphic representations of the logic flow paths.
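
    The idea of computing complexity per module can be illustrated with a small stand-in (not CAT itself, and not necessarily its metric): counting decision points as a proxy for cyclomatic complexity.

    ```python
    import re

    # Decision-point keywords; real tools parse the language instead of using regexes.
    DECISION_KEYWORDS = re.compile(r"\b(if|elif|for|while|case|and|or)\b")

    def module_complexity(source):
        """1 + number of decision points, a common cyclomatic-complexity proxy."""
        return 1 + len(DECISION_KEYWORDS.findall(source))

    sample_module = """
    if pressure > MAX and not vented:
        open_relief_valve()
    for reading in sensor_log:
        if reading.bad():
            flag(reading)
    """
    print("complexity:", module_complexity(sample_module))  # prints: complexity: 5
    ```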

  8. EERE's State & Local Energy Data Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shambarger, Erick; DeCesaro, Jennifer

    2014-06-23

    EERE's State and Local Energy Data (SLED) Tool provides basic energy market information that can help state and local governments plan and implement clean energy projects, including electricity generation; fuel sources and costs; applicable policies, regulations, and financial incentives; and renewable energy resource potential. Watch this video to learn more about the tool and hear testimonials from real users about the benefits of using this tool.

  9. EERE's State & Local Energy Data Tool

    ScienceCinema

    Shambarger, Erick; DeCesaro, Jennifer

    2018-05-30

    EERE's State and Local Energy Data (SLED) Tool provides basic energy market information that can help state and local governments plan and implement clean energy projects, including electricity generation; fuel sources and costs; applicable policies, regulations, and financial incentives; and renewable energy resource potential. Watch this video to learn more about the tool and hear testimonials from real users about the benefits of using this tool.

  10. Development of Next Generation Synthetic Biology Tools for Use in Streptomyces venezuelae

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phelan, Ryan M.; Sachs, Daniel; Petkiewicz, Shayne J.

    Streptomyces have a rich history as producers of important natural products and this genus of bacteria has recently garnered attention for its potential applications in the broader context of synthetic biology. However, the dearth of genetic tools available to control and monitor protein production precludes rapid and predictable metabolic engineering that is possible in hosts such as Escherichia coli or Saccharomyces cerevisiae. In an effort to improve genetic tools for Streptomyces venezuelae, we developed a suite of standardized, orthogonal integration vectors and an improved method to monitor protein production in this host. These tools were applied to characterize heterologous promoters and various attB chromosomal integration sites. A final study leveraged the characterized toolset to demonstrate its use in producing the biofuel precursor bisabolene using a chromosomally integrated expression system. In conclusion, these tools advance S. venezuelae to be a practical host for future metabolic engineering efforts.

  11. 40 CFR 155.46 - Deciding that a registration review is complete and additional review is not needed.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Deciding that a registration review is complete and additional review is not needed. 155.46 Section 155.46 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) PESTICIDE PROGRAMS REGISTRATION STANDARDS AND REGISTRATION REVIEW Registration...

  12. Developing a tool to estimate water withdrawal and consumption in electricity generation in the United States.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, M.; Peng, J.; NE)

    2011-02-24

    Freshwater consumption for electricity generation is projected to increase dramatically in the next couple of decades in the United States. The increased demand is likely to further strain freshwater resources in regions where water has already become scarce. Meanwhile, the automotive industry has stepped up its research, development, and deployment efforts on electric vehicles (EVs) and plug-in hybrid electric vehicles (PHEVs). Large-scale, escalated production of EVs and PHEVs nationwide would require increased electricity production, and so meeting the water demand becomes an even greater challenge. The goal of this study is to provide a baseline assessment of freshwater use in electricity generation in the United States and at the state level. Freshwater withdrawal and consumption requirements for power generated from fossil, nonfossil, and renewable sources via various technologies and by use of different cooling systems are examined. A data inventory has been developed that compiles data from government statistics, reports, and literature issued by major research institutes. A spreadsheet-based model has been developed to conduct the estimates by means of a transparent and interactive process. The model further allows us to project future water withdrawal and consumption in electricity production under the forecasted increases in demand. This tool is intended to provide decision makers with the means to make a quick comparison among various fuel, technology, and cooling system options. The model output can be used to address water resource sustainability when considering new projects or expansion of existing plants.
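
    The estimate itself reduces to generation multiplied by technology- and cooling-system-specific intensity factors; a sketch follows, with placeholder factors rather than values from the study's inventory.

    ```python
    # (technology, cooling system) -> (withdrawal, consumption) in gallons per MWh.
    # The numbers are illustrative placeholders only.
    FACTORS_GAL_PER_MWH = {
        ("coal steam", "once-through"): (30_000, 300),
        ("coal steam", "recirculating"): (600, 500),
        ("gas combined cycle", "recirculating"): (250, 200),
        ("wind", "none"): (0, 0),
    }

    def water_use(generation_mwh, technology, cooling):
        withdrawal, consumption = FACTORS_GAL_PER_MWH[(technology, cooling)]
        return generation_mwh * withdrawal, generation_mwh * consumption

    w, c = water_use(1_000_000, "coal steam", "recirculating")
    print(f"withdrawal: {w:,} gal, consumption: {c:,} gal")
    ```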

  13. Evaluation of the efficiency and reliability of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1994-01-01

    There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

  14. User Manual for the PROTEUS Mesh Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Micheal A.; Shemon, Emily R

    2016-09-19

    PROTEUS is built around a finite element representation of the geometry for visualization. In addition, the PROTEUS-SN solver was built to solve the even-parity transport equation on a finite element mesh provided as input. Similarly, PROTEUS-MOC and PROTEUS-NEMO were built to apply the method of characteristics on unstructured finite element meshes. Given the complexity of real world problems, experience has shown that using a commercial mesh generator to create rather simple input geometries is overly complex and slow. As a consequence, significant effort has been put into place to create multiple codes that help assist in the mesh generation and manipulation. There are three input means to create a mesh in PROTEUS: UFMESH, GRID, and NEMESH. At present, the UFMESH is a simple way to generate two-dimensional Cartesian and hexagonal fuel assembly geometries. The UFmesh input allows for simple assembly mesh generation while the GRID input allows the generation of Cartesian, hexagonal, and regular triangular structured grid geometry options. The NEMESH is a way for the user to create their own mesh or convert another mesh file format into a PROTEUS input format. Given that one has an input mesh format acceptable for PROTEUS, we have constructed several tools which allow further mesh and geometry construction (i.e. mesh extrusion and merging). This report describes the various mesh tools that are provided with the PROTEUS code giving both descriptions of the input and output. In many cases the examples are provided with a regression test of the mesh tools. The most important mesh tools for any user to consider using are the MT_MeshToMesh.x and the MT_RadialLattice.x codes. The former allows the conversion between most mesh types handled by PROTEUS while the second allows the merging of multiple (assembly) meshes into a radial structured grid. Note that the mesh generation process is recursive in nature and that each input specific for a given mesh tool (such as

  15. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    PubMed

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are the gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs the LifeMap Science's GeneCards suite, including the GeneCards®--the human gene database; the MalaCards-the human diseases database; and the PathCards--the biological pathways database. Expression-based analysis in GeneAnalytics relies on the LifeMap Discovery®--the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analyses and interpretation tool for translation of data to knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics
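
    The basic scoring idea behind gene set enrichment can be shown with a hypergeometric overlap test; this is a generic sketch with made-up counts, not GeneAnalytics' proprietary algorithm.

    ```python
    from math import comb

    def enrichment_p_value(background, set_size, query_size, overlap):
        """P(X >= overlap) for the overlap between a query gene list and an
        annotated gene set drawn from a common background (hypergeometric tail)."""
        return sum(comb(set_size, k) * comb(background - set_size, query_size - k)
                   for k in range(overlap, min(set_size, query_size) + 1)) / comb(background, query_size)

    # Hypothetical numbers: 20,000 background genes, a 150-gene pathway,
    # and a 300-gene query list sharing 12 genes with that pathway.
    print(f"p = {enrichment_p_value(20_000, 150, 300, 12):.2e}")
    ```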

  16. Latest generation of flat detector CT as a peri-interventional diagnostic tool: a comparative study with multidetector CT.

    PubMed

    Leyhe, Johanna Rosemarie; Tsogkas, Ioannis; Hesse, Amélie Carolina; Behme, Daniel; Schregel, Katharina; Papageorgiou, Ismini; Liman, Jan; Knauth, Michael; Psychogios, Marios-Nikos

    2017-12-01

    Flat detector CT (FDCT) has been used as a peri-interventional diagnostic tool in numerous studies with mixed results regarding image quality and detection of intracranial lesions. We compared the diagnostic aspects of the latest generation FDCT with standard multidetector CT (MDCT). 102 patients were included in our retrospective study. All patients had undergone interventional procedures. FDCT was acquired peri-interventionally and compared with postinterventional MDCT regarding depiction of ventricular/subarachnoidal spaces, detection of intracranial hemorrhage, and delineation of ischemic lesions using an ordinal scale. Ischemic lesions were quantified with the Alberta Stroke Program Early CT Scale (ASPECTS) on both examinations. Two neuroradiologists with varying grades of experience and a medical student scored the anonymized images separately, blinded to the clinical history. The two methods were of equal diagnostic value regarding evaluation of the ventricular system and the subarachnoidal spaces. Subarachnoidal, intraventricular, and parenchymal hemorrhages were detected with a sensitivity of 95%, 97%, and 100% and specificity of 97%, 100%, and 99%, respectively, using FDCT. Gray-white differentiation was feasible in the majority of FDCT scans, and ischemic lesions were detected with a sensitivity of 71% on FDCT, compared with MDCT scans. The mean difference in ASPECTS values on FDCT and MDCT was 0.5 points (95% CI 0.12 to 0.88). The latest generation of FDCT is a reliable and accurate tool for the detection of intracranial hemorrhage. Gray-white differentiation is feasible in the supratentorial region. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  17. Micro Slot Generation by μ-ED Milling

    NASA Astrophysics Data System (ADS)

    Dave, H. K.; Mayanak, M. K.; Rajpurohit, S. R.; Mathai, V. J.

    2016-08-01

    Micro electro discharge machining is one of the most widely used advanced micro machining techniques owing to its capability to fabricate micro features on any electrically conductive material irrespective of its material properties. Despite its wide acceptability, the process is adversely affected by issues like wear on the tool electrode, which results in the generation of inaccurate features. Micro ED milling, a process variant in which the tool electrode is simultaneously rotated and scanned during machining, is reported to have high process efficiency for the generation of complicated 3D shapes and features with relatively low electrode wear intensity. In the present study an attempt has been made to study the effect of two process parameters, viz. capacitance and scanning speed of the tool electrode, on the end wear of the tool electrode and the overcut of micro slots generated by micro ED milling. The experiments were conducted on Al 1100 alloy with a tungsten electrode having a diameter of 300 μm. The results suggest that wear on the tool electrode and overcut of the generated micro features are highly influenced by the level of capacitance employed during machining. For the parameter ranges employed in the present study, however, no significant effect of variation of scanning speed was observed on either response.

  18. 30 CFR 227.107 - When will the MMS Director decide whether to approve a State's delegation proposal?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false When will the MMS Director decide whether to approve a State's delegation proposal? 227.107 Section 227.107 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT OF THE INTERIOR MINERALS REVENUE MANAGEMENT DELEGATION TO STATES Delegation Process...

  19. 30 CFR 285.429 - What criteria will MMS consider in deciding whether to renew a lease or grant?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... deciding whether to renew a lease or grant: (a) Design life of existing technology. (b) Availability and feasibility of new technology. (c) Environmental and safety record of the lessee or grantee. (d) Operational... electrical distribution and transmission system. ...

  20. Volunteers Help Decide Where to Point Mars Camera

    NASA Image and Video Library

    2015-07-22

    This series of images from NASA's Mars Reconnaissance Orbiter successively zooms into "spider" features -- or channels carved in the surface in radial patterns -- in the south polar region of Mars. In a new citizen-science project, volunteers will identify features like these using wide-scale images from the orbiter. Their input will then help mission planners decide where to point the orbiter's high-resolution camera for more detailed views of interesting terrain. Volunteers will start with images from the orbiter's Context Camera (CTX), which provides wide views of the Red Planet. The first two images in this series are from CTX; the top right image zooms into a portion of the image at left. The top right image highlights the geological spider features, which are carved into the terrain in the Martian spring when dry ice turns to gas. By identifying unusual features like these, volunteers will help the mission team choose targets for the orbiter's High Resolution Imaging Science Experiment (HiRISE) camera, which can reveal more detail than any other camera ever put into orbit around Mars. The final image in this series (bottom right) shows a HiRISE close-up of one of the spider features. http://photojournal.jpl.nasa.gov/catalog/PIA19823

  1. MRO Sequence Checking Tool

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Gladden, Roy; Khanampornpan, Teerapat

    2008-01-01

    The MRO Sequence Checking Tool program, mro_check, automates significant portions of the MRO (Mars Reconnaissance Orbiter) sequence checking procedure. Though MRO has checks similar to those of the ODY (Mars Odyssey) Mega Check tool, the checks needed for MRO are unique to the MRO spacecraft. The MRO sequence checking tool automates the majority of the sequence validation procedure and checklists that are used to validate the sequences generated by the MRO MPST (mission planning and sequencing team). The tool performs more than 50 different checks on the sequence. The automation varies from summarizing data about the sequence needed for visual verification of the sequence, to performing automated checks on the sequence and providing a report for each step. To allow for the addition of new checks as needed, this tool is built in a modular fashion.
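
    The abstract describes a modular design in which each of the more than 50 checks is an independent step that contributes a line to the overall report. The sketch below is a minimal, hypothetical illustration in Python of such a check registry, not the actual mro_check code; the command format and check names are assumptions.

    ```python
    # Minimal sketch of a modular sequence-checking framework (illustrative only,
    # not the actual mro_check implementation). Each check is a function that
    # inspects the command sequence and returns one report line; new checks are
    # added by registering another function.
    from typing import Callable, Dict, List

    Command = Dict[str, str]                 # hypothetical command record
    Check = Callable[[List[Command]], str]

    CHECKS: List[Check] = []

    def register(check: Check) -> Check:
        """Register a check so the tool picks it up automatically."""
        CHECKS.append(check)
        return check

    @register
    def check_not_empty(sequence: List[Command]) -> str:
        return "PASS: sequence contains commands" if sequence else "FAIL: sequence is empty"

    @register
    def summarize_command_counts(sequence: List[Command]) -> str:
        # A summary-style "check": data for visual verification rather than pass/fail.
        counts: Dict[str, int] = {}
        for cmd in sequence:
            counts[cmd["name"]] = counts.get(cmd["name"], 0) + 1
        return "SUMMARY: " + ", ".join(f"{k}={v}" for k, v in sorted(counts.items()))

    def run_checks(sequence: List[Command]) -> List[str]:
        """Run every registered check and collect one report line per step."""
        return [check(sequence) for check in CHECKS]

    if __name__ == "__main__":
        demo = [{"name": "TURN"}, {"name": "IMAGE"}, {"name": "TURN"}]
        for line in run_checks(demo):
            print(line)
    ```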

  2. Datasets2Tools, repository and search engine for bioinformatics datasets, tools and canned analyses

    PubMed Central

    Torre, Denis; Krawczuk, Patrycja; Jagodnik, Kathleen M.; Lachmann, Alexander; Wang, Zichen; Wang, Lily; Kuleshov, Maxim V.; Ma’ayan, Avi

    2018-01-01

    Biomedical data repositories such as the Gene Expression Omnibus (GEO) enable the search and discovery of relevant biomedical digital data objects. Similarly, resources such as OMICtools index bioinformatics tools that can extract knowledge from these digital data objects. However, systematic access to pre-generated ‘canned’ analyses applied by bioinformatics tools to biomedical digital data objects is currently not available. Datasets2Tools is a repository indexing 31,473 canned bioinformatics analyses applied to 6,431 datasets. The Datasets2Tools repository also contains the indexing of 4,901 published bioinformatics software tools, and all the analyzed datasets. Datasets2Tools enables users to rapidly find datasets, tools, and canned analyses through an intuitive web interface, a Google Chrome extension, and an API. Furthermore, Datasets2Tools provides a platform for contributing canned analyses, datasets, and tools, as well as evaluating these digital objects according to their compliance with the findable, accessible, interoperable, and reusable (FAIR) principles. By incorporating community engagement, Datasets2Tools promotes sharing of digital resources to stimulate the extraction of knowledge from biomedical research data. Datasets2Tools is freely available from: http://amp.pharm.mssm.edu/datasets2tools. PMID:29485625

  3. Datasets2Tools, repository and search engine for bioinformatics datasets, tools and canned analyses.

    PubMed

    Torre, Denis; Krawczuk, Patrycja; Jagodnik, Kathleen M; Lachmann, Alexander; Wang, Zichen; Wang, Lily; Kuleshov, Maxim V; Ma'ayan, Avi

    2018-02-27

    Biomedical data repositories such as the Gene Expression Omnibus (GEO) enable the search and discovery of relevant biomedical digital data objects. Similarly, resources such as OMICtools index bioinformatics tools that can extract knowledge from these digital data objects. However, systematic access to pre-generated 'canned' analyses applied by bioinformatics tools to biomedical digital data objects is currently not available. Datasets2Tools is a repository indexing 31,473 canned bioinformatics analyses applied to 6,431 datasets. The Datasets2Tools repository also contains the indexing of 4,901 published bioinformatics software tools, and all the analyzed datasets. Datasets2Tools enables users to rapidly find datasets, tools, and canned analyses through an intuitive web interface, a Google Chrome extension, and an API. Furthermore, Datasets2Tools provides a platform for contributing canned analyses, datasets, and tools, as well as evaluating these digital objects according to their compliance with the findable, accessible, interoperable, and reusable (FAIR) principles. By incorporating community engagement, Datasets2Tools promotes sharing of digital resources to stimulate the extraction of knowledge from biomedical research data. Datasets2Tools is freely available from: http://amp.pharm.mssm.edu/datasets2tools.

  4. End-user satisfaction of a patient education tool manual versus computer-generated tool.

    PubMed

    Tronni, C; Welebob, E

    1996-01-01

    This article reports a nonexperimental comparative study of end-user satisfaction before and after implementation of a vendor supplied computerized system (Micromedex, Inc) for providing up-to-date patient instructions regarding diseases, injuries, procedures, and medications. The purpose of this research was to measure the satisfaction of nurses who directly interact with a specific patient educational software application and to compare user satisfaction with manual versus computer generated materials. A computing satisfaction questionnaire that uses a scale of 1 to 5 (1 being the lowest) was used to measure end-user computing satisfaction in five constructs: content, accuracy, format, ease of use, and timeliness. Summary statistics were used to calculate mean ratings for each of the questionnaire's 12 items and for each of the five constructs. Mean differences between the ratings before and after implementation of the five constructs were significant by paired t test. Total user satisfaction improved with the computerized system, and the computer generated materials were given a higher rating than were the manual materials. Implications of these findings are discussed.
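
    The study's comparison rests on paired t tests of mean ratings before and after implementation, one per construct. The sketch below, using made-up ratings for just two of the five constructs, shows how that paired comparison can be computed with SciPy; it is an illustration, not the authors' analysis.

    ```python
    # Paired t-test per construct on hypothetical before/after ratings
    # (illustrative data only; the study used a 12-item questionnaire
    # covering five constructs on a 1-to-5 scale).
    from scipy import stats

    before = {"content": [3.1, 2.8, 3.4, 3.0], "accuracy": [3.5, 3.2, 3.6, 3.3]}
    after = {"content": [4.2, 4.0, 4.4, 4.1], "accuracy": [4.4, 4.1, 4.5, 4.3]}

    for construct in before:
        t_stat, p_value = stats.ttest_rel(before[construct], after[construct])
        print(f"{construct}: t = {t_stat:.2f}, p = {p_value:.4f}")
    ```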

  5. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    PubMed Central

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  6. Generative Processes: Thick Drawing

    ERIC Educational Resources Information Center

    Wallick, Karl

    2012-01-01

    This article presents techniques and theories of generative drawing as a means for developing complex content in architecture design studios. Appending the word "generative" to drawing adds specificity to the most common representation tool and clarifies that such drawings are not singularly about communication or documentation but are…

  7. Analytical Tools for Space Suit Design

    NASA Technical Reports Server (NTRS)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach with emphasis on building prototypes then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state of the art software and tools are discussed.

  8. Automated branching pattern report generation for laparoscopic surgery assistance

    NASA Astrophysics Data System (ADS)

    Oda, Masahiro; Matsuzaki, Tetsuro; Hayashi, Yuichiro; Kitasaka, Takayuki; Misawa, Kazunari; Mori, Kensaku

    2015-05-01

    This paper presents a method for generating branching pattern reports of abdominal blood vessels for laparoscopic gastrectomy. In gastrectomy, it is very important to understand the branching structure of the abdominal arteries and veins, which feed and drain specific abdominal organs including the stomach, the liver, and the pancreas. In the real clinical setting, a surgeon creates a diagnostic report of the patient anatomy. This report summarizes the branching patterns of the blood vessels related to the stomach, and the surgeon decides on the actual operative procedure accordingly. This paper shows an automated method to generate a branching pattern report for abdominal blood vessels based on automated anatomical labeling. The report contains a 3D rendering showing important blood vessels and descriptions of the branching patterns of each vessel. We applied this method to fifty cases of 3D abdominal CT scans and confirmed that the proposed method can automatically generate branching pattern reports of abdominal arteries.
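
    As a rough illustration of the reporting step, the sketch below turns a small, hand-written branching tree into the kind of textual descriptions the abstract mentions. The vessel labels are examples; in the actual method they are produced by automated anatomical labeling of the CT data, and the report also includes a 3D rendering.

    ```python
    # Illustrative only: generate a textual branching-pattern report from a
    # hand-written vessel tree (the real method labels vessels automatically).
    branching = {
        "celiac artery": ["left gastric artery", "common hepatic artery", "splenic artery"],
        "common hepatic artery": ["gastroduodenal artery", "proper hepatic artery"],
    }

    def report(tree: dict) -> str:
        lines = ["Branching pattern report (abdominal arteries):"]
        for parent, children in tree.items():
            lines.append(f"- {parent} branches into: " + ", ".join(children))
        return "\n".join(lines)

    print(report(branching))
    ```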

  9. Involving Communities in Deciding What Benefits They Receive in Multinational Research.

    PubMed

    Wendler, David; Shah, Seema

    2015-10-01

    There is wide agreement that communities in lower-income countries should benefit when they participate in multinational research. Debate now focuses on how and to what extent these communities should benefit. This debate has identified compelling reasons to reject the claim that whatever benefits a community agrees to accept are necessarily fair. Yet, those who conduct clinical research may conclude from this rejection that there is no reason to involve communities in the process of deciding how they benefit. Against this possibility, the present manuscript argues that involving host communities in this process helps to promote four important goals: (1) protecting host communities, (2) respecting host communities, (3) promoting transparency, and (4) enhancing social value. Published by Oxford University Press on behalf of the Journal of Medicine and Philosophy, Inc. 2015.

  10. Decide now, pay later: Early influences in math and science education

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malcom, S.

    1995-12-31

    Who are the people deciding to major in science, math or engineering in college? The early interest in science and math education, which can lead to science and engineering careers, is shaped as much by the encompassing world of the child as it is by formal education experiences. This paper documents what we know and what we need to know about the influences on children from pre-kindergarten through sixth grade, including the home, pre-school groups, science and math programs in churches, community groups, the media, cultural institutions (museums, zoos, botanical gardens), libraries, and schools (curriculum, instruction, policies and assessment). It also covers the nature and quality of curricular and intervention programs, and identifies strategies that appear to be most effective for various groups.

  11. Slide system for machine tools

    DOEpatents

    Douglass, S.S.; Green, W.L.

    1980-06-12

    The present invention relates to a machine tool which permits the machining of nonaxisymmetric surfaces on a workpiece while rotating the workpiece about a central axis of rotation. The machine tool comprises a conventional two-slide system (X-Y) with one of these slides being provided with a relatively short travel high-speed auxiliary slide which carries the material-removing tool. The auxiliary slide is synchronized with the spindle speed and the position of the other two slides and provides a high-speed reciprocating motion required for the displacement of the cutting tool for generating a nonaxisymmetric surface at a selected location on the workpiece.

  12. Slide system for machine tools

    DOEpatents

    Douglass, Spivey S.; Green, Walter L.

    1982-01-01

    The present invention relates to a machine tool which permits the machining of nonaxisymmetric surfaces on a workpiece while rotating the workpiece about a central axis of rotation. The machine tool comprises a conventional two-slide system (X-Y) with one of these slides being provided with a relatively short travel high-speed auxiliary slide which carries the material-removing tool. The auxiliary slide is synchronized with the spindle speed and the position of the other two slides and provides a high-speed reciprocating motion required for the displacement of the cutting tool for generating a nonaxisymmetric surface at a selected location on the workpiece.

  13. POST II Trajectory Animation Tool Using MATLAB, V1.0

    NASA Technical Reports Server (NTRS)

    Raiszadeh, Behzad

    2005-01-01

    A trajectory animation tool has been developed for accurately depicting the position and attitude of bodies in flight. The movies generated by this MATLAB-based tool serve as an engineering analysis aid for gaining further understanding of the dynamic behavior of bodies in flight. The tool has been designed to interface with the output generated from POST II simulations and is able to animate a single vehicle as well as multiple vehicles in flight.

  14. The cognitive and neural basis of option generation and subsequent choice.

    PubMed

    Kaiser, Stefan; Simon, Joe J; Kalis, Annemarie; Schweizer, Sophie; Tobler, Philippe N; Mojzisch, Andreas

    2013-12-01

    Decision-making research has thoroughly investigated how people choose from a set of externally provided options. However, in ill-structured real-world environments, possible options for action are not defined by the situation but have to be generated by the agent. Here, we apply behavioral analysis (Study 1) and functional magnetic resonance imaging (Study 2) to investigate option generation and subsequent choice. For this purpose, we employ a new experimental task that requires participants to generate options for simple real-world scenarios and to subsequently decide among the generated options. Correlational analysis with a cognitive test battery suggests that retrieval of options from long-term memory is a relevant process during option generation. The results of the fMRI study demonstrate that option generation in simple real-world scenarios recruits the anterior prefrontal cortex. Furthermore, we show that choice behavior and its neural correlates differ between self-generated and externally provided options. Specifically, choice between self-generated options is associated with stronger recruitment of the dorsal anterior cingulate cortex. This impact of option generation on subsequent choice underlines the need for an expanded model of decision making to accommodate choice between self-generated options.

  15. 20 CFR 10.609 - How does OWCP decide whether new evidence requires modification of the prior decision?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false How does OWCP decide whether new evidence requires modification of the prior decision? 10.609 Section 10.609 Employees' Benefits OFFICE OF WORKERS.... (b) A claims examiner who did not participate in making the contested decision will conduct the merit...

  16. 20 CFR 416.630 - How will we notify you when we decide you need a representative payee?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 2 2011-04-01 2011-04-01 false How will we notify you when we decide you need a representative payee? 416.630 Section 416.630 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Representative Payment § 416.630 How will we...

  17. 20 CFR 416.630 - How will we notify you when we decide you need a representative payee?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 2 2012-04-01 2012-04-01 false How will we notify you when we decide you need a representative payee? 416.630 Section 416.630 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Representative Payment § 416.630 How will we...

  18. 20 CFR 416.630 - How will we notify you when we decide you need a representative payee?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false How will we notify you when we decide you need a representative payee? 416.630 Section 416.630 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SUPPLEMENTAL SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Representative Payment § 416.630 How will we...

  19. 30 CFR 585.429 - What criteria will BOEM consider in deciding whether to renew a lease or grant?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... criteria in deciding whether to renew a lease or grant: (a) Design life of existing technology. (b) Availability and feasibility of new technology. (c) Environmental and safety record of the lessee or grantee... reliability within the regional electrical distribution and transmission system. ...

  20. 30 CFR 585.429 - What criteria will BOEM consider in deciding whether to renew a lease or grant?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... criteria in deciding whether to renew a lease or grant: (a) Design life of existing technology. (b) Availability and feasibility of new technology. (c) Environmental and safety record of the lessee or grantee... reliability within the regional electrical distribution and transmission system. ...

  1. 30 CFR 585.429 - What criteria will BOEM consider in deciding whether to renew a lease or grant?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... criteria in deciding whether to renew a lease or grant: (a) Design life of existing technology. (b) Availability and feasibility of new technology. (c) Environmental and safety record of the lessee or grantee... reliability within the regional electrical distribution and transmission system. ...

  2. Spacecraft Guidance, Navigation, and Control Visualization Tool

    NASA Technical Reports Server (NTRS)

    Mandic, Milan; Acikmese, Behcet; Blackmore, Lars

    2011-01-01

    G-View is a 3D visualization tool for supporting spacecraft guidance, navigation, and control (GN&C) simulations relevant to small-body exploration and sampling (see figure). The tool is developed in MATLAB using Virtual Reality Toolbox and provides users with the ability to visualize the behavior of their simulations, regardless of which programming language (or machine) is used to generate simulation results. The only requirement is that multi-body simulation data is generated and placed in the proper format before applying G-View.

  3. Stage Separation CFD Tool Development and Evaluation

    NASA Technical Reports Server (NTRS)

    Droege, Alan; Gomez, Reynaldo; Wang, Ten-See

    2002-01-01

    This viewgraph presentation evaluates CFD (Computational Fluid Dynamics) tools for solving stage separation problems. The demonstration and validation of the tools is for a second generation RLV (Reusable Launch Vehicle) stage separation. The flow solvers are: Cart3D; Overflow/Overflow-D; Unic.

  4. Scheduling Results for the THEMIS Observation Scheduling Tool

    NASA Technical Reports Server (NTRS)

    Mclaren, David; Rabideau, Gregg; Chien, Steve; Knight, Russell; Anwar, Sadaat; Mehall, Greg; Christensen, Philip

    2011-01-01

    We describe a scheduling system intended to assist in the development of instrument data acquisitions for the THEMIS instrument, onboard the Mars Odyssey spacecraft, and compare results from multiple scheduling algorithms. This tool schedules both (a) observations of targeted geographical regions of interest and (b) general mapping observations, while respecting spacecraft constraints such as data volume, observation timing, visibility, lighting, season, and science priorities. This tool therefore must address both geometric and state/timing/resource constraints. We describe a tool that maps geometric polygon overlap constraints to set covering constraints using a grid-based approach. These set covering constraints are then incorporated into a greedy optimization scheduling algorithm incorporating operations constraints to generate feasible schedules. The resultant tool generates schedules of hundreds of observations per week out of potential thousands of observations. This tool is currently under evaluation by the THEMIS observation planning team at Arizona State University.
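
    The abstract reduces polygon overlap to set covering on a grid and then applies a greedy pass. The sketch below illustrates that greedy set-cover step with made-up observation footprints; it omits the operational constraints (data volume, timing, visibility, lighting, season, priorities) that the real scheduler also enforces.

    ```python
    # Greedy set cover over grid cells (illustrative; not the THEMIS planner).
    # Each candidate observation covers a set of grid cells; repeatedly pick the
    # observation that covers the most still-uncovered cells.
    from typing import Dict, List, Set

    def greedy_cover(target: Set[int], candidates: Dict[str, Set[int]]) -> List[str]:
        uncovered = set(target)
        chosen: List[str] = []
        while uncovered:
            best = max(candidates, key=lambda name: len(candidates[name] & uncovered))
            gain = candidates[best] & uncovered
            if not gain:
                break  # no remaining candidate helps; stop with partial coverage
            chosen.append(best)
            uncovered -= gain
        return chosen

    target_cells = {1, 2, 3, 4, 5, 6}
    observations = {"obs_A": {1, 2, 3}, "obs_B": {3, 4}, "obs_C": {5, 6}, "obs_D": {2, 4}}
    print(greedy_cover(target_cells, observations))  # e.g. ['obs_A', 'obs_C', 'obs_B']
    ```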

  5. PDS4: Harnessing the Power of Generate and Apache Velocity

    NASA Astrophysics Data System (ADS)

    Padams, J.; Cayanan, M.; Hardman, S.

    2018-04-01

    The PDS4 Generate Tool is a Java-based command-line tool developed by the Cartography and Imaging Sciences Nodes (PDSIMG) for generating PDS4 XML labels from Apache Velocity templates and input metadata.

  6. 42 CFR 55a.103 - What criteria has HHS established for deciding which grant application to fund?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false What criteria has HHS established for deciding which grant application to fund? 55a.103 Section 55a.103 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GRANTS PROGRAM GRANTS FOR BLACK LUNG CLINICS General Provisions § 55a...

  7. 42 CFR 55a.103 - What criteria has HHS established for deciding which grant application to fund?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 1 2011-10-01 2011-10-01 false What criteria has HHS established for deciding which grant application to fund? 55a.103 Section 55a.103 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GRANTS PROGRAM GRANTS FOR BLACK LUNG CLINICS General Provisions § 55a...

  8. 42 CFR 23.5 - What are the criteria for deciding which applications for assignment will be approved?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... development agency to which an application was submitted for review under § 23.4(c). (6) Comments received... applications for assignment will be approved? 23.5 Section 23.5 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT... Service Corps Personnel § 23.5 What are the criteria for deciding which applications for assignment will...

  9. 20 CFR 408.630 - How will we notify you when we decide you need a representative payee?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false How will we notify you when we decide you need a representative payee? 408.630 Section 408.630 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SPECIAL BENEFITS FOR CERTAIN WORLD WAR II VETERANS Representative Payment § 408.630 How will we notify you...

  10. 20 CFR 408.630 - How will we notify you when we decide you need a representative payee?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 2 2011-04-01 2011-04-01 false How will we notify you when we decide you need a representative payee? 408.630 Section 408.630 Employees' Benefits SOCIAL SECURITY ADMINISTRATION SPECIAL BENEFITS FOR CERTAIN WORLD WAR II VETERANS Representative Payment § 408.630 How will we notify you...

  11. Rover Team Decides: Safety First

    NASA Technical Reports Server (NTRS)

    2006-01-01

    NASA's Mars Exploration Rover Spirit recorded this view while approaching the northwestern edge of 'Home Plate,' a circular plateau-like area of bright, layered outcrop material roughly 80 meters (260 feet) in diameter. The images combined into this mosaic were taken by Spirit's navigation camera during the rover's 746th, 748th and 750th Martian days, or sols (Feb. 7, 9 and 11, 2006).

    With Martian winter closing in, engineers and scientists working with NASA's Mars Exploration Rover Spirit decided to play it safe for the time being rather than attempt to visit the far side of Home Plate in search of rock layers that might show evidence of a past watery environment. This feature has been one of the major milestones of the mission. Though it's conceivable that rock layers might be exposed on the opposite side, sunlight is diminishing on the rover's solar panels and team members chose not to travel in a counterclockwise direction that would take the rover to the west and south slopes of the plateau. Slopes in that direction are hidden from view and team members chose, following a long, thorough discussion, to have the rover travel clockwise and remain on north-facing slopes rather than risk sending the rover deeper into unknown terrain.

    In addition to studying numerous images from Spirit's cameras, team members studied three-dimensional models created with images from the Mars Orbiter Camera on NASA's Mars Global Surveyor orbiter. The models showed a valley on the southern side of Home Plate, the slopes of which might cause the rover's solar panels to lose power for unknown lengths of time. In addition, images from Spirit's cameras showed a nearby, talus-covered section of slope on the west side of Home Plate, rather than the exposed rock layers scientists eventually hope to investigate.

    Home Plate has been on the rover's potential itinerary since the early days of the mission, when it stood out in images taken by the Mars Orbiter Camera shortly after

  12. Plug-in Plan Tool v3.0.3.1

    NASA Technical Reports Server (NTRS)

    Andrea-Liner, Kathleen E.; Au, Brion J.; Fisher, Blake R.; Rodbumrung, Watchara; Hamic, Jeffrey C.; Smith, Kary; Beadle, David S.

    2012-01-01

    The role of PLUTO (Plug-in Port UTilization Officer) and the growth of the International Space Station (ISS) have exceeded the capabilities of the current tool PiP (Plug-in Plan). Its users (crew and flight controllers) have expressed an interest in a new, easy-to-use tool with a higher level of interactivity and functionality that is not bound by the limitations of Excel. The PiP Tool assists crewmembers and ground controllers in making real-time decisions concerning the safety and compatibility of hardware plugged into the UOPs (Utility Outlet Panels) onboard the ISS. The PiP Tool also provides a reference to the current configuration of the hardware plugged in to the UOPs, and enables the PLUTO and crew to test Plug-in locations for constraint violations (such as cable connector mismatches or amp limit violations), to see the amps and volts for an end item, to see whether or not the end item uses 1553 data, and the cable length between the outlet and the end item. As new equipment is flown or returned, the database can be updated appropriately as needed. The current tool is a macro-heavy Excel spreadsheet with its own database and reporting functionality. The new tool captures the capabilities of the original tool, ports them to new software, defines a new dataset, and compensates for ever-growing unique constraints associated with the Plug-in Plan. New constraints were designed into the tool, and updates to existing constraints were added to provide more flexibility and customizability. In addition, there is an option to associate a "Flag" with each device that will let the user know there is a unique constraint associated with it when they use it. This helps improve the safety and efficiency of real-time calls by limiting the amount of "corporate knowledge" overhead that has to be trained and learned through use. The tool helps save time by automating previous manual processes, such as calculating connector types and deciding which cables are required and in
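
    As a rough illustration of the real-time constraint checks described, the sketch below flags amp-limit violations, connector mismatches, and flagged devices for a proposed plug-in configuration. The field names and the limit value are invented for the example and are not actual ISS data.

    ```python
    # Illustrative plug-in constraint check (hypothetical fields and limits,
    # not actual ISS/UOP values).
    UOP_AMP_LIMIT = 10.0  # invented limit for the example

    def check_plug_in(outlet: dict, devices: list) -> list:
        problems = []
        total_amps = sum(d["amps"] for d in devices)
        if total_amps > UOP_AMP_LIMIT:
            problems.append(f"amp limit exceeded: {total_amps:.1f} A > {UOP_AMP_LIMIT} A")
        for d in devices:
            if d["connector"] != outlet["connector"]:
                problems.append(f"connector mismatch for {d['name']}")
            if d.get("flag"):
                problems.append(f"{d['name']} carries a unique constraint: {d['flag']}")
        return problems

    outlet = {"id": "UOP-3", "connector": "type-A"}
    devices = [
        {"name": "laptop", "amps": 4.0, "connector": "type-A"},
        {"name": "payload heater", "amps": 7.5, "connector": "type-B", "flag": "requires ground approval"},
    ]
    for problem in check_plug_in(outlet, devices):
        print(problem)
    ```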

  13. Tools to support evidence-informed public health decision making.

    PubMed

    Yost, Jennifer; Dobbins, Maureen; Traynor, Robyn; DeCorby, Kara; Workentine, Stephanie; Greco, Lori

    2014-07-18

    Public health professionals are increasingly expected to engage in evidence-informed decision making to inform practice and policy decisions. Evidence-informed decision making involves the use of research evidence along with expertise, existing public health resources, knowledge about community health issues, the local context and community, and the political climate. The National Collaborating Centre for Methods and Tools has identified a seven step process for evidence-informed decision making. Tools have been developed to support public health professionals as they work through each of these steps. This paper provides an overview of tools used in three Canadian public health departments involved in a study to develop capacity for evidence-informed decision making. As part of a knowledge translation and exchange intervention, a Knowledge Broker worked with public health professionals to identify and apply tools for use with each of the steps of evidence-informed decision making. The Knowledge Broker maintained a reflective journal and interviews were conducted with a purposive sample of decision makers and public health professionals. This paper presents qualitative analysis of the perceived usefulness and usability of the tools. Tools were used in the health departments to assist in: question identification and clarification; searching for the best available research evidence; assessing the research evidence for quality through critical appraisal; deciphering the 'actionable message(s)' from the research evidence; tailoring messages to the local context to ensure their relevance and suitability; deciding whether and planning how to implement research evidence in the local context; and evaluating the effectiveness of implementation efforts. Decision makers provided descriptions of how the tools were used within the health departments and made suggestions for improvement. Overall, the tools were perceived as valuable for advancing and sustaining evidence

  14. New generation of exploration tools: interactive modeling software and microcomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  15. Support for Systematic Code Reviews with the SCRUB Tool

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerald J.

    2010-01-01

    SCRUB is a code review tool that supports both large, team-based software development efforts (e.g., for mission software) as well as individual tasks. The tool was developed at JPL to support a new, streamlined code review process that combines human-generated review reports with program-generated review reports from a customizable range of state-of-the-art source code analyzers. The leading commercial tools include Codesonar, Coverity, and Klocwork, each of which can achieve a reasonably low rate of false-positives in the warnings that they generate. The time required to analyze code with these tools can vary greatly. In each case, however, the tools produce results that would be difficult to realize with human code inspections alone. There is little overlap in the results produced by the different analyzers, and each analyzer used generally increases the effectiveness of the overall effort. The SCRUB tool allows all reports to be accessed through a single, uniform interface (see figure) that facilitates browsing code and reports. Improvements over existing software include significant simplification, and leveraging of a range of commercial, static source code analyzers in a single, uniform framework. The tool runs as a small stand-alone application, avoiding the security problems related to tools based on Web browsers. A developer or reviewer, for instance, must have already obtained access rights to a code base before that code can be browsed and reviewed with the SCRUB tool. The tool cannot open any files or folders to which the user does not already have access. This means that the tool does not need to enforce or administer any additional security policies. The analysis results presented through the SCRUB tool's user interface are always computed off-line, given that, especially for larger projects, this computation can take longer than appropriate for interactive tool use. The recommended code review process that is supported by the SCRUB tool consists of
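
    The central idea of presenting human and analyzer reports through one uniform interface can be illustrated with a small aggregation step. The sketch below merges made-up warning records from several sources into a per-file view; it is not SCRUB's implementation, and the tool and file names are invented.

    ```python
    # Illustrative aggregation of review reports from multiple sources into a
    # single per-file view (made-up records; not SCRUB's code).
    from collections import defaultdict

    reports = [
        {"tool": "analyzer_1", "file": "nav.c", "line": 120, "msg": "possible null dereference"},
        {"tool": "analyzer_2", "file": "nav.c", "line": 87, "msg": "unchecked return value"},
        {"tool": "reviewer", "file": "gnc.c", "line": 42, "msg": "clarify unit conversion"},
    ]

    by_file = defaultdict(list)
    for r in reports:
        by_file[r["file"]].append(r)

    for path, items in sorted(by_file.items()):
        print(path)
        for r in sorted(items, key=lambda rec: rec["line"]):
            print(f"  line {r['line']:4d} [{r['tool']}] {r['msg']}")
    ```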

  16. [Adaptation of the Medical Office Survey on Patient Safety Culture (MOSPSC) tool].

    PubMed

    Silvestre-Busto, C; Torijano-Casalengua, M L; Olivera-Cañadas, G; Astier-Peña, M P; Maderuelo-Fernández, J A; Rubio-Aguado, E A

    2015-01-01

    To adapt the Medical Office Survey on Patient Safety Culture (MOSPSC) Excel(®) tool for its use by Primary Care Teams of the Spanish National Public Health System. The process of translation and adaptation of MOSPSC from the Agency for Healthcare and Research in Quality (AHRQ) was performed in five steps: Original version translation, Conceptual equivalence evaluation, Acceptability and viability assessment, Content validity and Questionnaire test and response analysis, and psychometric properties assessment. After confirming MOSPSC as a valid, reliable, consistent and useful tool for assessing patient safety culture in our setting, an Excel(®) worksheet was translated and adapted in the same way. It was decided to develop a tool to analyze the "Spanish survey" and to keep it linked to the "Original version" tool. The "Spanish survey" comparison data are those obtained in a 2011 nationwide Spanish survey, while the "Original version" comparison data are those provided by the AHRQ in 2012. The translated and adapted tool and the analysis of the results from a 2011 nationwide Spanish survey are available on the website of the Ministry of Health, Social Services and Equality. It allows the questions which are decisive in the different dimensions to be determined, and it provides a comparison of the results with graphical representation. Translation and adaptation of this tool enables a patient safety culture in Primary Care in Spain to be more effectively applied. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.

  17. Study of Tools for Network Discovery and Network Mapping

    DTIC Science & Technology

    2003-11-01

    iv. Accessibility of historical data and event data: in general, network discovery tools keep a history of the collected data. The tool has the following software dependencies: Java Virtual Machine, Perl modules, RRD Tool, Tomcat, and PostgreSQL. Reported strengths include providing a simple view of the current network status, generating alarms on status change, generating a history of status changes, and providing a visual map.

  18. An FMS Dynamic Production Scheduling Algorithm Considering Cutting Tool Failure and Cutting Tool Life

    NASA Astrophysics Data System (ADS)

    Setiawan, A.; Wangsaputra, R.; Martawirya, Y. Y.; Halim, A. H.

    2016-02-01

    This paper deals with Flexible Manufacturing System (FMS) production rescheduling due to the unavailability of cutting tools, caused either by cutting tool failure or by reaching the cutting tool life limit. The FMS consists of parallel identical machines integrated with an automatic material handling system, and it runs fully automatically. Each machine has the same cutting tool configuration, consisting of different geometrical cutting tool types in each tool magazine. A job usually takes two stages, and each stage has sequential operations allocated to machines with the cutting tool life taken into account. In practice, a cutting tool can fail before its expected life is reached. The objective of this paper is to develop a dynamic scheduling algorithm for the case in which a cutting tool breaks during unmanned operation and rescheduling is needed. The algorithm consists of four steps: the first step generates the initial schedule, the second step determines the cutting tool failure time, the third step determines the system status at the cutting tool failure time, and the fourth step reschedules the unfinished jobs, as sketched below. The approaches used to solve the problem are complete-reactive scheduling and robust-proactive scheduling. The new schedules differ from the initial schedule in the starting and completion times of each operation.
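
    The four steps can be pictured as a small pipeline. The sketch below is a minimal, illustrative skeleton of that flow with invented job data; it is not the authors' algorithm and ignores machine assignment, tool magazines, and the reactive/proactive distinction.

    ```python
    # Illustrative skeleton of the four-step rescheduling flow (invented data).
    def initial_schedule(jobs):                 # step 1: build the initial schedule
        return [(job, op) for job in jobs for op in job["ops"]]

    def failure_time(tool_events):              # step 2: first reported tool failure
        return min(tool_events) if tool_events else None

    def system_status(schedule, t):             # step 3: finished vs. unfinished work at t
        finished = [(j, o) for (j, o) in schedule if o["end"] <= t]
        unfinished = [(j, o) for (j, o) in schedule if o["end"] > t]
        return finished, unfinished

    def reschedule(unfinished, t):              # step 4: shift remaining operations after t
        new_schedule, clock = [], t
        for job, op in unfinished:
            start, clock = clock, clock + op["duration"]
            new_schedule.append((job["name"], op["name"], start, clock))
        return new_schedule

    jobs = [{"name": "J1", "ops": [{"name": "rough", "end": 4, "duration": 4},
                                   {"name": "finish", "end": 9, "duration": 5}]}]
    schedule = initial_schedule(jobs)
    t_fail = failure_time([6.0])
    _, unfinished = system_status(schedule, t_fail)
    print(reschedule(unfinished, t_fail))       # [('J1', 'finish', 6.0, 11.0)]
    ```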

  19. 25 CFR 224.71 - What standards will the Secretary use to decide to approve a final proposed TERA?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... INTERIOR ENERGY AND MINERALS TRIBAL ENERGY RESOURCE AGREEMENTS UNDER THE INDIAN TRIBAL ENERGY DEVELOPMENT AND SELF DETERMINATION ACT Approval of Tribal Energy Resource Agreements § 224.71 What standards will... interests of the tribe and the Federal policy of promoting tribal self-determination in deciding whether to...

  20. Economic and Financial Analysis Tools | Energy Analysis | NREL

    Science.gov Websites

    Use these economic and financial analysis tools for energy analysis from NREL. Job and Economic Development Impact (JEDI) Model: use these easy-to-use, spreadsheet-based tools to analyze the economic impacts of constructing and operating power generation and biofuel plants at the

  1. Model-Driven Test Generation of Distributed Systems

    NASA Technical Reports Server (NTRS)

    Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin

    2012-01-01

    This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.

  2. How many invariant polynomials are needed to decide local unitary equivalence of qubit states?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maciążek, Tomasz (Faculty of Physics, University of Warsaw, ul. Hoża 69, 00-681 Warszawa); Oszmaniec, Michał

    2013-09-15

    Given L-qubit states with fixed spectra of the reduced one-qubit density matrices, we find a formula for the minimal number of invariant polynomials needed to solve the local unitary (LU) equivalence problem, that is, the problem of deciding whether two states can be connected by local unitary operations. Interestingly, this number is not the same for every collection of spectra: some spectra require fewer polynomials to solve the LU equivalence problem than others. The result is obtained using geometric methods, i.e., by calculating the dimensions of the reduced spaces stemming from the symplectic reduction procedure.

  3. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives.

    PubMed

    Zhao, Min; Wang, Qingguo; Wang, Quan; Jia, Peilin; Zhao, Zhongming

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays were the standard technologies for detecting large genomic regions subject to copy number changes until, most recently, high-resolution sequence data could be analyzed by next-generation sequencing (NGS). During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled the development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole-genome and whole-exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development.

  4. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives

    PubMed Central

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays were the standard technologies for detecting large genomic regions subject to copy number changes until, most recently, high-resolution sequence data could be analyzed by next-generation sequencing (NGS). During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled the development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole-genome and whole-exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development. PMID:24564169

  5. Phenotype Instance Verification and Evaluation Tool (PIVET): A Scaled Phenotype Evidence Generation Framework Using Web-Based Medical Literature

    PubMed Central

    Ke, Junyuan; Ho, Joyce C; Ghosh, Joydeep; Wallace, Byron C

    2018-01-01

    Background Researchers are developing methods to automatically extract clinically relevant and useful patient characteristics from raw healthcare datasets. These characteristics, often capturing essential properties of patients with common medical conditions, are called computational phenotypes. Being generated by automated or semiautomated, data-driven methods, such potential phenotypes need to be validated as clinically meaningful (or not) before they are acceptable for use in decision making. Objective The objective of this study was to present Phenotype Instance Verification and Evaluation Tool (PIVET), a framework that uses co-occurrence analysis on an online corpus of publically available medical journal articles to build clinical relevance evidence sets for user-supplied phenotypes. PIVET adopts a conceptual framework similar to the pioneering prototype tool PheKnow-Cloud that was developed for the phenotype validation task. PIVET completely refactors each part of the PheKnow-Cloud pipeline to deliver vast improvements in speed without sacrificing the quality of the insights PheKnow-Cloud achieved. Methods PIVET leverages indexing in NoSQL databases to efficiently generate evidence sets. Specifically, PIVET uses a succinct representation of the phenotypes that corresponds to the index on the corpus database and an optimized co-occurrence algorithm inspired by the Aho-Corasick algorithm. We compare PIVET’s phenotype representation with PheKnow-Cloud’s by using PheKnow-Cloud’s experimental setup. In PIVET’s framework, we also introduce a statistical model trained on domain expert–verified phenotypes to automatically classify phenotypes as clinically relevant or not. Additionally, we show how the classification model can be used to examine user-supplied phenotypes in an online, rather than batch, manner. Results PIVET maintains the discriminative power of PheKnow-Cloud in terms of identifying clinically relevant phenotypes for the same corpus with
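
    At its core, building an evidence set rests on counting how often phenotype terms co-occur in the literature. The sketch below shows that co-occurrence idea on a toy corpus with naive substring matching; the terms and abstracts are invented, and the real framework instead works over an indexed corpus with an optimized, Aho-Corasick-inspired multi-pattern matcher.

    ```python
    # Toy co-occurrence counting for phenotype terms in article text
    # (naive substring search; illustrative only).
    from collections import Counter
    from itertools import combinations

    articles = [
        "patients with type 2 diabetes and hypertension showed elevated hba1c",
        "hypertension frequently co-occurs with chronic kidney disease",
        "type 2 diabetes is a risk factor for chronic kidney disease",
    ]
    phenotype_terms = ["type 2 diabetes", "hypertension", "chronic kidney disease"]

    co_occurrence = Counter()
    for text in articles:
        present = [term for term in phenotype_terms if term in text]
        for pair in combinations(sorted(present), 2):
            co_occurrence[pair] += 1

    for (a, b), n in co_occurrence.most_common():
        print(f"{a} + {b}: {n} supporting article(s)")
    ```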

  6. Distributed Coordination for Optimal Energy Generation and Distribution in Cyber-Physical Energy Networks.

    PubMed

    Ahn, Hyo-Sung; Kim, Byeong-Yeon; Lim, Young-Hun; Lee, Byung-Hun; Oh, Kwang-Kyo

    2018-03-01

    This paper proposes three coordination laws for optimal energy generation and distribution in an energy network composed of a physical flow layer and a cyber communication layer. Energy flows through the physical layer, but its generation and flow are coordinated by distributed algorithms on the basis of information exchanged over the communication layer. First, distributed energy generation and energy distribution laws are proposed in a decoupled manner, without considering the interactive characteristics between energy generation and energy distribution. Second, a joint coordination law is designed that treats energy generation and energy distribution in a coupled manner, taking these interactive characteristics into account. Third, to handle cases of over- or under-generation, an energy distribution law for networks with batteries is designed. The coordination laws proposed in this paper are fully distributed in the sense that they are decided optimally using only relative information among neighboring nodes. Numerical simulations illustrate the validity of the proposed distributed coordination laws.
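
    The phrase "decided optimally using only relative information among neighboring nodes" describes a consensus-style update. The sketch below illustrates such a neighbor-relative update on a toy three-node network, driving each node's generation plan toward an equal share of the total; it shows only the distributed mechanism, not the paper's coordination laws.

    ```python
    # Toy consensus update using only relative differences with neighbors
    # (illustrative; not the paper's coordination laws).
    neighbors = {0: [1], 1: [0, 2], 2: [1]}     # simple line network
    output = {0: 9.0, 1: 3.0, 2: 6.0}           # initial generation plan (total 18)

    step = 0.3
    for _ in range(200):
        updated = {}
        for i, nbrs in neighbors.items():
            # each node moves toward its neighbors using only output[j] - output[i]
            updated[i] = output[i] + step * sum(output[j] - output[i] for j in nbrs)
        output = updated

    print({i: round(v, 3) for i, v in output.items()})  # all values close to 6.0
    ```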

  7. 25 CFR 162.573 - How will BIA decide whether to approve an amendment to a WSR lease?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve an amendment to a WSR lease? 162.573 Section 162.573 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Amendments § 162.573 How will...

  8. 25 CFR 162.577 - How will BIA decide whether to approve an assignment of a WSR lease?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve an assignment of a WSR lease? 162.577 Section 162.577 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Assignments § 162.577 How will...

  9. 25 CFR 162.573 - How will BIA decide whether to approve an amendment to a WSR lease?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve an amendment to a WSR lease? 162.573 Section 162.573 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Amendments § 162.573 How will...

  10. 25 CFR 162.577 - How will BIA decide whether to approve an assignment of a WSR lease?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve an assignment of a WSR lease? 162.577 Section 162.577 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Assignments § 162.577 How will...

  11. 25 CFR 162.581 - How will BIA decide whether to approve a sublease of a WSR lease?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve a sublease of a WSR lease? 162.581 Section 162.581 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Subleases § 162.581 How will BIA...

  12. 43 CFR 4.908 - What is the administrative record for my appeal if it is deemed decided?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Oil and Gas Royalties and Related Matters § 4.908 What is the administrative record for my appeal if... 43 Public Lands: Interior 1 2011-10-01 2011-10-01 false What is the administrative record for my appeal if it is deemed decided? 4.908 Section 4.908 Public Lands: Interior Office of the Secretary of the...

  13. 25 CFR 162.581 - How will BIA decide whether to approve a sublease of a WSR lease?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve a sublease of a WSR lease? 162.581 Section 162.581 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Subleases § 162.581 How will BIA...

  14. Diagnostic tool for red blood cell membrane disorders: Assessment of a new generation ektacytometer

    PubMed Central

    Da Costa, Lydie; Suner, Ludovic; Galimand, Julie; Bonnel, Amandine; Pascreau, Tiffany; Couque, Nathalie; Fenneteau, Odile; Mohandas, Narla

    2016-01-01

    Inherited red blood cell (RBC) membrane disorders, such as hereditary spherocytosis, elliptocytosis and hereditary ovalocytosis, result from mutations in genes encoding various RBC membrane and skeletal proteins. The RBC membrane, a composite structure composed of a lipid bilayer linked to a spectrin/actin-based membrane skeleton, confers upon the RBC unique features of deformability and mechanical stability. The disease severity is primarily dependent on the extent of membrane surface area loss. RBC membrane disorders can be readily diagnosed by various laboratory approaches that include RBC cytology, flow cytometry, ektacytometry, electrophoresis of RBC membrane proteins and genetics. The reference technique for diagnosis of RBC membrane disorders is osmotic gradient ektacytometry. However, in spite of its recognition as the reference technique, this technique is rarely used as a routine diagnosis tool for RBC membrane disorders due to its limited availability. This may soon change as a new generation of ektacytometer has been recently engineered. In this review, we describe the workflow of the samples shipped to our Hematology laboratory for RBC membrane disorder analysis and the data obtained for a large cohort of French patients presenting with RBC membrane disorders using a newly available version of the ektacytometer. PMID:26603718

  15. Test Generator for MATLAB Simulations

    NASA Technical Reports Server (NTRS)

    Henry, Joel

    2011-01-01

    MATLAB Automated Test Tool, version 3.0 (MATT 3.0) is a software package that provides automated tools that reduce the time needed for extensive testing of simulation models that have been constructed in the MATLAB programming language by use of the Simulink and Real-Time Workshop programs. MATT 3.0 runs on top of the MATLAB engine application-program interface to communicate with the Simulink engine. MATT 3.0 automatically generates source code from the models, generates custom input data for testing both the models and the source code, and generates graphs and other presentations that facilitate comparison of the outputs of the models and the source code for the same input data. Context-sensitive and fully searchable help is provided in HyperText Markup Language (HTML) format.

  16. A new generation of tools for search, recovery and quality evaluation of World Wide Web medical resources.

    PubMed

    Aguillo, I

    2000-01-01

    Although the Internet is already a valuable information resource in medicine, there are important challenges to be faced before physicians and general users will have extensive access to this information. As a result of a research effort to compile a health-related Internet directory, new tools and strategies have been developed to solve key problems derived from the explosive growth of medical information on the Net and the great concern over the quality of such critical information. The current Internet search engines lack some important capabilities. We suggest using second generation tools (client-side based) able to deal with large quantities of data and to increase the usability of the records recovered. We tested the capabilities of these programs to solve health-related information problems, recognising six groups according to the kind of topics addressed: Z39.50 clients, downloaders, multisearchers, tracing agents, indexers and mappers. The evaluation of the quality of health information available on the Internet could require a large amount of human effort. A possible solution may be to use quantitative indicators based on the hypertext visibility of the Web sites. The cybermetric measures are valid for quality evaluation if they are derived from indirect peer review by experts with Web pages citing the site. The hypertext links acting as citations need to be extracted from a controlled sample of quality super-sites.

  17. Epidemiology of transmissible diseases: Array hybridization and next generation sequencing as universal nucleic acid-mediated typing tools.

    PubMed

    Michael Dunne, W; Pouseele, Hannes; Monecke, Stefan; Ehricht, Ralf; van Belkum, Alex

    2017-09-21

    The magnitude of interest in the epidemiology of transmissible human diseases is reflected in the vast number of tools and methods developed recently with the express purpose of characterizing and tracking evolutionary changes that occur in agents of these diseases over time. Within the past decade a new suite of such tools has become available with the emergence of the so-called "omics" technologies. Among these, two are exponents of the ongoing genomic revolution. Firstly, high-density nucleic acid probe arrays have been proposed and developed using various chemical and physical approaches. Via hybridization-mediated detection of entire genes or genetic polymorphisms in such genes and intergenic regions, these so-called "DNA chips" have been successfully applied for distinguishing very closely related microbial species and strains. Secondly, and even more remarkably, next generation sequencing (NGS) has facilitated the assessment of the complete nucleotide sequence of entire microbial genomes. This technology currently provides the most detailed level of bacterial genotyping and hence allows for the resolution of microbial spread and short-term evolution in minute detail. We will here review the very recent history of these two technologies, sketch their usefulness in the elucidation of the spread and epidemiology of mostly hospital-acquired infections and discuss future developments. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Managing expectations when publishing tools and methods for computational proteomics.

    PubMed

    Martens, Lennart; Kohlbacher, Oliver; Weintraub, Susan T

    2015-05-01

    Computational tools are pivotal in proteomics because they are crucial for identification, quantification, and statistical assessment of data. The gateway to finding the best choice of a tool or approach for a particular problem is frequently journal articles, yet there is often an overwhelming variety of options that makes it hard to decide on the best solution. This is particularly difficult for nonexperts in bioinformatics. The maturity, reliability, and performance of tools can vary widely because publications may appear at different stages of development. A novel idea might merit early publication despite only offering proof-of-principle, while it may take years before a tool can be considered mature, and by that time it might be difficult for a new publication to be accepted because of a perceived lack of novelty. After discussions with members of the computational mass spectrometry community, we describe here proposed recommendations for organization of informatics manuscripts as a way to set the expectations of readers (and reviewers) through three different manuscript types that are based on existing journal designations. Brief Communications are short reports describing novel computational approaches where the implementation is not necessarily production-ready. Research Articles present both a novel idea and mature implementation that has been suitably benchmarked. Application Notes focus on a mature and tested tool or concept and need not be novel but should offer advancement from improved quality, ease of use, and/or implementation. Organizing computational proteomics contributions into these three manuscript types will facilitate the review process and will also enable readers to identify the maturity and applicability of the tool for their own workflows.

  19. Tools to support evidence-informed public health decision making

    PubMed Central

    2014-01-01

    Background Public health professionals are increasingly expected to engage in evidence-informed decision making to inform practice and policy decisions. Evidence-informed decision making involves the use of research evidence along with expertise, existing public health resources, knowledge about community health issues, the local context and community, and the political climate. The National Collaborating Centre for Methods and Tools has identified a seven step process for evidence-informed decision making. Tools have been developed to support public health professionals as they work through each of these steps. This paper provides an overview of tools used in three Canadian public health departments involved in a study to develop capacity for evidence-informed decision making. Methods As part of a knowledge translation and exchange intervention, a Knowledge Broker worked with public health professionals to identify and apply tools for use with each of the steps of evidence-informed decision making. The Knowledge Broker maintained a reflective journal and interviews were conducted with a purposive sample of decision makers and public health professionals. This paper presents qualitative analysis of the perceived usefulness and usability of the tools. Results Tools were used in the health departments to assist in: question identification and clarification; searching for the best available research evidence; assessing the research evidence for quality through critical appraisal; deciphering the ‘actionable message(s)’ from the research evidence; tailoring messages to the local context to ensure their relevance and suitability; deciding whether and planning how to implement research evidence in the local context; and evaluating the effectiveness of implementation efforts. Decision makers provided descriptions of how the tools were used within the health departments and made suggestions for improvement. Overall, the tools were perceived as valuable for advancing

  20. Experiences on developing digital down conversion algorithms using Xilinx system generator

    NASA Astrophysics Data System (ADS)

    Xu, Chengfa; Yuan, Yuan; Zhao, Lizhi

    2013-07-01

    The Digital Down Conversion (DDC) algorithm is a classical signal processing method that is widely used in radar and communication systems. In this paper, the DDC function is implemented on an FPGA with the Xilinx System Generator tool. System Generator is an FPGA design tool provided by Xilinx Inc. and MathWorks Inc. It makes it convenient for programmers to manipulate a design and debug its function, especially for complex algorithms. The development of the DDC function with System Generator shows that it is a fast and efficient tool for FPGA design.
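
    For readers unfamiliar with the DDC algorithm itself, the sketch below shows its classical software form (mix with a numerically controlled oscillator, low-pass filter, decimate) using NumPy/SciPy. It is only an illustration of the signal processing chain, not the FPGA implementation built in System Generator; the filter length and cutoff are arbitrary choices.

      import numpy as np
      from scipy.signal import firwin, lfilter

      def ddc(x, fs, f_carrier, decimation, num_taps=63):
          """Digital down conversion: mix to baseband, low-pass filter, decimate."""
          n = np.arange(len(x))
          nco = np.exp(-2j * np.pi * f_carrier * n / fs)    # numerically controlled oscillator
          baseband = x * nco                                # shift the band of interest to 0 Hz
          taps = firwin(num_taps, cutoff=0.8 / decimation)  # low-pass FIR (cutoff relative to Nyquist)
          filtered = lfilter(taps, 1.0, baseband)
          return filtered[::decimation]                     # reduce the sample rate

    For example, ddc(x, fs=100e6, f_carrier=20e6, decimation=8) would bring a signal centred at 20 MHz in a 100 MS/s stream down to complex baseband at 12.5 MS/s.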

  1. Brain mechanisms of perceiving tools and imagining tool use acts: a functional MRI study.

    PubMed

    Wadsworth, Heather M; Kana, Rajesh K

    2011-06-01

    The ability to conceptualize and manipulate tools in a complex manner is a distinguishing characteristic of humans, and forms a promising milestone in human evolution. While using tools is a motor act, proposals for executing such acts may be evoked by the mere perception of a tool. Imagining an action using a tool may invoke mental readjustment of body posture, planning motor movements, and matching such plans with the model action. This fMRI study examined the brain response in 32 healthy adults when they either viewed a tool or imagined using it. While both viewing and imagining tasks recruited similar regions, imagined tool use showed greater activation in motor areas, and in areas around the bilateral temporoparietal junction. Viewing tools, on the other hand, produced robust activation in the inferior frontal, occipital, parietal, and ventral temporal areas. Analysis of gender differences indicated males recruiting medial prefrontal and anterior cingulate cortices and females, left supramarginal gyrus and left anterior insula. While tool viewing seems to generate prehensions about using them, the imagined action using a tool mirrored brain responses underlying functional use of it. The findings of this study may suggest that perception and imagination of tools may form precursors to overt actions. Published by Elsevier Ltd.

  2. Deciding Where to Turn: A Qualitative Investigation of College Students' Helpseeking Decisions After Sexual Assault.

    PubMed

    DeLoveh, Heidi L M; Cattaneo, Lauren Bennett

    2017-03-01

    Sexual assault is a widespread problem on college campuses that has been the subject of substantial attention in recent years (Ali, 2011; Krebs, Lindquist, Berzofsky, Shook-Sa, & Peterson, 2016). Resources designed to address the problem exist, but there is evidence that they are underutilized by survivors (Campbell, 2008). The current study used grounded theory to explore how sexual assault survivors make decisions about helpseeking. In-depth interviews were conducted with 14 college sexual assault survivors to develop a theoretical model for their decision-making process. The resulting model, Deciding Where to Turn, suggests that survivors engage in three key decision points: determining if there is a problem related to the sexual assault (Do I Need Help), considering options (What Can I Do), and weighing the consequences of these options (What Will I Do). This process results in one of four behavioral choices: cope on one's own, seek support from friends/family, seek support from formal resources, or covert helpseeking, where needs are met without disclosure. Deciding Where to Turn contributes to the literature by providing a framework for understanding helpseeking decisions after sexual assault, highlighting the need to match reactions to survivor perceptions. The concept of covert helpseeking in particular adds to the way researchers and practitioners think about helpseeking. Research and practice implications are discussed. © Society for Community Research and Action 2017.

  3. Automated Euler and Navier-Stokes Database Generation for a Glide-Back Booster

    NASA Technical Reports Server (NTRS)

    Chaderjian, Neal M.; Rogers, Stuart E.; Aftosmis, Mike J.; Pandya, Shishir A.; Ahmad, Jasim U.; Tejnil, Edward

    2004-01-01

    The past two decades have seen a sustained increase in the use of high fidelity Computational Fluid Dynamics (CFD) in basic research, aircraft design, and the analysis of post-design issues. As the fidelity of a CFD method increases, the number of cases that can be readily and affordably computed greatly diminishes. However, computer speeds now exceed 2 GHz, hundreds of processors are currently available and more affordable, and advances in parallel CFD algorithms scale more readily with large numbers of processors. All of these factors make it feasible to compute thousands of high fidelity cases. However, there still remains the overwhelming task of monitoring the solution process. This paper presents an approach to automate the CFD solution process. A new software tool, AeroDB, is used to compute thousands of Euler and Navier-Stokes solutions for a 2nd generation glide-back booster in one week. The solution process exploits a common job-submission grid environment, the NASA Information Power Grid (IPG), using 13 computers located at 4 different geographical sites. Process automation and web-based access to a MySql database greatly reduces the user workload, removing much of the tedium and tendency for user input errors. The AeroDB framework is shown. The user submits/deletes jobs, monitors AeroDB's progress, and retrieves data and plots via a web portal. Once a job is in the database, a job launcher uses an IPG resource broker to decide which computers are best suited to run the job. Job/code requirements, the number of CPUs free on a remote system, and queue lengths are some of the parameters the broker takes into account. The Globus software provides secure services for user authentication, remote shell execution, and secure file transfers over an open network. AeroDB automatically decides when a job is completed. Currently, the Cart3D unstructured flow solver is used for the Euler equations, and the Overflow structured overset flow solver is used for the
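
    The broker logic sketched in the abstract (pick the computer best suited to a job from its CPU requirements, free CPUs and queue lengths) can be expressed as a toy scoring function. This is only a hedged illustration of the idea; the actual IPG resource broker worked differently, and the field names used here (free_cpus, queue_length, cpus_needed) are assumptions, not AeroDB's interface.

      def score_host(host, job):
          """Score a compute resource for a pending CFD job: hosts that
          cannot satisfy the CPU requirement are excluded, otherwise
          prefer more free CPUs and shorter queues."""
          if host["free_cpus"] < job["cpus_needed"]:
              return float("-inf")
          return host["free_cpus"] - 2.0 * host["queue_length"]

      def pick_host(hosts, job):
          """Return the best-scoring host for the job."""
          return max(hosts, key=lambda h: score_host(h, job))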

  4. 25 CFR 115.617 - What happens when the BIA decides to supervise or encumber your IIM account after your hearing?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ..., DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process for Restricting an IIM Account § 115.617 What happens when the BIA decides to supervise or...

  5. 25 CFR 115.617 - What happens when the BIA decides to supervise or encumber your IIM account after your hearing?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ..., DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process for Restricting an IIM Account § 115.617 What happens when the BIA decides to supervise or...

  6. 25 CFR 115.617 - What happens when the BIA decides to supervise or encumber your IIM account after your hearing?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ..., DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process for Restricting an IIM Account § 115.617 What happens when the BIA decides to supervise or...

  7. 25 CFR 115.617 - What happens when the BIA decides to supervise or encumber your IIM account after your hearing?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process for Restricting an IIM Account § 115.617 What happens when the BIA decides to supervise or...

  8. 25 CFR 115.617 - What happens when the BIA decides to supervise or encumber your IIM account after your hearing?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., DEPARTMENT OF THE INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process for Restricting an IIM Account § 115.617 What happens when the BIA decides to supervise or...

  9. Grid Stiffened Structure Analysis Tool

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Grid Stiffened Analysis Tool contract is a contract performed by Boeing under NASA purchase order H30249D. The contract calls for a "best effort" study comprising two tasks: (1) create documentation for a composite grid-stiffened structure analysis tool, in the form of a Microsoft EXCEL spreadsheet, that was originally developed at Stanford University and later further developed by the Air Force, and (2) write a program that functions as a NASTRAN pre-processor to generate an FEM model for grid-stiffened structures. In performing this contract, Task 1 was given higher priority because it enables NASA to make efficient use of a unique tool it already has; Task 2 was proposed by Boeing because it would also be beneficial to the analysis of composite grid-stiffened structures, specifically in generating models for preliminary design studies. The contract is now complete; this package includes copies of the user's documentation for Task 1 and a CD-ROM and diskette with an electronic copy of the user's documentation and an updated version of the "GRID 99" spreadsheet.

  10. Financial management: a necessary tool for generating cash.

    PubMed

    Humphrey, E; Cilwik, C J

    1994-01-01

    This article is an introduction to four types of financial analysis and a foundation for additional exposure to financial analysis. If you don't like working with numbers, consider hiring an accountant or a qualified industry consultant to help you analyze your business. Eventually, you will learn what financial clues to look for when analyzing your business and how to reach your objectives and generate cash to reinvest in your business.

  11. Generated spiral bevel gears: Optimal machine-tool settings and tooth contact analysis

    NASA Technical Reports Server (NTRS)

    Litvin, F. L.; Tsung, W. J.; Coy, J. J.; Heine, C.

    1985-01-01

    Geometry and kinematic errors were studied for Gleason generated spiral bevel gears. A new method was devised for choosing optimal machine settings. These settings provide zero kinematic errors and an improved bearing contact. The kinematic errors are a major source of noise and vibration in spiral bevel gears. The improved bearing contact gives improved conditions for lubrication. A computer program for tooth contact analysis was developed, and thereby the new generation process was confirmed. The new process is governed by the requirement that during the generation process there is directional constancy of the common normal of the contacting surfaces for generator and generated surfaces of pinion and gear.

  12. JVM: Java Visual Mapping tool for next generation sequencing read.

    PubMed

    Yang, Ye; Liu, Juan

    2015-01-01

    We developed a program, JVM (Java Visual Mapping), for mapping next-generation sequencing reads to a reference sequence. The program is implemented in Java and is designed to deal with the millions of short reads generated by the Illumina sequencing technology. It employs a seed index strategy and octal encoding operations for sequence alignment. JVM is useful for DNA-Seq and RNA-Seq when dealing with single-end resequencing. JVM is a desktop application that supports read volumes from 1 MB to 10 GB.
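
    The seed-index strategy mentioned above can be illustrated with a minimal seed-and-verify mapper: hash every k-mer of the reference, look up seeds from the read to obtain candidate positions, then verify each candidate by counting mismatches. This sketch uses plain Python strings rather than JVM's octal encoding, so it conveys only the idea, not the program's actual data layout.

      def build_seed_index(reference, k=12):
          """Map every k-mer (seed) of the reference to its start positions."""
          index = {}
          for i in range(len(reference) - k + 1):
              index.setdefault(reference[i:i + k], []).append(i)
          return index

      def map_read(read, reference, index, k=12, max_mismatches=2):
          """Collect candidate positions from seed hits, then verify them."""
          hits = set()
          for offset in range(0, len(read) - k + 1, k):
              for pos in index.get(read[offset:offset + k], []):
                  start = pos - offset
                  if start < 0 or start + len(read) > len(reference):
                      continue
                  window = reference[start:start + len(read)]
                  if sum(a != b for a, b in zip(read, window)) <= max_mismatches:
                      hits.add(start)
          return sorted(hits)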

  13. Automatic Chinese Factual Question Generation

    ERIC Educational Resources Information Center

    Liu, Ming; Rus, Vasile; Liu, Li

    2017-01-01

    Question generation is an emerging research area of artificial intelligence in education. Question authoring tools are important in educational technologies, e.g., intelligent tutoring systems, as well as in dialogue systems. Approaches to generate factual questions, i.e., questions that have concrete answers, mainly make use of the syntactical…

  14. How Do Mathematics Teachers Decide What to Teach? Curriculum Authority and Sources of Information Accessed by Australian Teachers

    ERIC Educational Resources Information Center

    Clarke, David J.; Clarke, Doug M.; Sullivan, Peter

    2012-01-01

    Essential to teachers' planning are decisions regarding what should be taught. Curriculum documents are the most obvious authority. But what is a "curriculum document" for a mathematics teacher in Australia? Are there other credible sources of information that Australian teachers draw on when deciding what to teach? This article examines…

  15. 32 CFR 37.530 - What criteria do I use in deciding whether to accept a recipient's cost sharing?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 1 2012-07-01 2012-07-01 false What criteria do I use in deciding whether to accept a recipient's cost sharing? 37.530 Section 37.530 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE DoD GRANT AND AGREEMENT REGULATIONS TECHNOLOGY INVESTMENT AGREEMENTS Pre...

  16. 32 CFR 37.530 - What criteria do I use in deciding whether to accept a recipient's cost sharing?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 1 2011-07-01 2011-07-01 false What criteria do I use in deciding whether to accept a recipient's cost sharing? 37.530 Section 37.530 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE DoD GRANT AND AGREEMENT REGULATIONS TECHNOLOGY INVESTMENT AGREEMENTS Pre...

  17. 32 CFR 37.530 - What criteria do I use in deciding whether to accept a recipient's cost sharing?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 1 2010-07-01 2010-07-01 false What criteria do I use in deciding whether to accept a recipient's cost sharing? 37.530 Section 37.530 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE DoD GRANT AND AGREEMENT REGULATIONS TECHNOLOGY INVESTMENT AGREEMENTS Pre...

  18. 32 CFR 37.530 - What criteria do I use in deciding whether to accept a recipient's cost sharing?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 1 2014-07-01 2014-07-01 false What criteria do I use in deciding whether to accept a recipient's cost sharing? 37.530 Section 37.530 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE DoD GRANT AND AGREEMENT REGULATIONS TECHNOLOGY INVESTMENT AGREEMENTS Pre...

  19. 32 CFR 37.530 - What criteria do I use in deciding whether to accept a recipient's cost sharing?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 32 National Defense 1 2013-07-01 2013-07-01 false What criteria do I use in deciding whether to accept a recipient's cost sharing? 37.530 Section 37.530 National Defense Department of Defense OFFICE OF THE SECRETARY OF DEFENSE DoD GRANT AND AGREEMENT REGULATIONS TECHNOLOGY INVESTMENT AGREEMENTS Pre...

  20. Generation of a Knockout Mouse Embryonic Stem Cell Line Using a Paired CRISPR/Cas9 Genome Engineering Tool.

    PubMed

    Wettstein, Rahel; Bodak, Maxime; Ciaudo, Constance

    2016-01-01

    CRISPR/Cas9, originally discovered as a bacterial immune system, has recently been engineered into the latest tool to successfully introduce site-specific mutations in a variety of different organisms. Composed only of the Cas9 protein and one engineered guide RNA, this system is much less complex in its setup and easier to handle than other guided nucleases such as zinc-finger nucleases or TALENs. Here, we describe the simultaneous transfection of two paired CRISPR sgRNA-Cas9 plasmids into mouse embryonic stem cells (mESCs), resulting in the knockout of the selected target gene. Together with a four-primer evaluation system, this provides an efficient way to generate new independent knockout mouse embryonic stem cell lines.

  1. OCSEGen: Open Components and Systems Environment Generator

    NASA Technical Reports Server (NTRS)

    Tkachuk, Oksana

    2014-01-01

    To analyze a large system, one often needs to break it into smaller components. To analyze a component or unit under analysis, one needs to model its context of execution, called the environment, which represents the components with which the unit interacts. Environment generation is a challenging problem, because the environment needs to be general enough to uncover unit errors, yet precise enough to make the analysis tractable. In this paper, we present a tool for automated environment generation for open components and systems. The tool, called OCSEGen, is implemented on top of the Soot framework. We present the tool's current support and discuss its possible future extensions.

  2. Electromagnetic anti-jam telemetry tool

    DOEpatents

    Ganesan, Harini [Sugar Land, TX; Mayzenberg, Nataliya [Missouri City, TX

    2008-02-12

    A mud-pulse telemetry tool includes a tool housing, a motor disposed in the tool housing, and a magnetic coupling coupled to the motor and having an inner shaft and an outer shaft. The tool may also include a stator coupled to the tool housing and a restrictor disposed proximate the stator and coupled to the magnetic coupling, the restrictor and the stator being adapted to generate selected pulses in a drilling fluid when the restrictor is selectively rotated. The tool may also include a first anti-jam magnet coupled to the tool housing, and a second anti-jam magnet disposed proximate the first anti-jam magnet and coupled to the inner shaft and/or the outer shaft, wherein at least one of the first anti-jam magnet and the second anti-jam magnet is an electromagnet, and wherein the first anti-jam magnet and the second anti-jam magnet are positioned with adjacent like poles.

  3. Reliability model generator

    NASA Technical Reports Server (NTRS)

    Cohen, Gerald C. (Inventor); McMann, Catherine M. (Inventor)

    1991-01-01

    An improved method and system for automatically generating reliability models for use with a reliability evaluation tool is described. The reliability model generator of the present invention includes means for storing a plurality of low level reliability models which represent the reliability characteristics for low level system components. In addition, the present invention includes means for defining the interconnection of the low level reliability models via a system architecture description. In accordance with the principles of the present invention, a reliability model for the entire system is automatically generated by aggregating the low level reliability models based on the system architecture description.
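
    A hedged sketch of the aggregation step: component reliabilities are looked up from a library of low-level models and combined according to a nested architecture description. The series/parallel algebra and the tuple-based architecture format are illustrative assumptions, not the patented generator's actual representation.

      def series(values):
          """All components must work: reliabilities multiply."""
          out = 1.0
          for r in values:
              out *= r
          return out

      def parallel(values):
          """Redundant components: the system fails only if all fail."""
          out = 1.0
          for r in values:
              out *= (1.0 - r)
          return 1.0 - out

      def evaluate(node, library):
          """Aggregate low-level reliabilities over a nested architecture
          description of the form ('series'|'parallel', [children])."""
          if isinstance(node, str):
              return library[node]            # leaf: a named low-level model
          kind, children = node
          values = [evaluate(child, library) for child in children]
          return series(values) if kind == "series" else parallel(values)

    For example, evaluate(("series", ["cpu", ("parallel", ["bus_a", "bus_b"])]), {"cpu": 0.999, "bus_a": 0.99, "bus_b": 0.99}) combines a single processor in series with a redundant pair of buses.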

  4. Selection and application of microbial source tracking tools for water-quality investigations

    USGS Publications Warehouse

    Stoeckel, Donald M.

    2005-01-01

    Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.

  5. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    PubMed

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has either little or no experience with such tools. As such, this educational glitch is limiting both the potential use of such tools as well as the potential for tighter cooperation between the designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than that which is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.

  6. Effect of DECIDE (Decision-making Education for Choices In Diabetes Everyday) Program Delivery Modalities on Clinical and Behavioral Outcomes in Urban African Americans With Type 2 Diabetes: A Randomized Trial.

    PubMed

    Fitzpatrick, Stephanie L; Golden, Sherita Hill; Stewart, Kerry; Sutherland, June; DeGross, Sharie; Brown, Tina; Wang, Nae-Yuh; Allen, Jerilyn; Cooper, Lisa A; Hill-Briggs, Felicia

    2016-12-01

    To compare the effectiveness of three delivery modalities of Decision-making Education for Choices In Diabetes Everyday (DECIDE), a nine-module, literacy-adapted diabetes and cardiovascular disease (CVD) education and problem-solving training, compared with an enhanced usual care (UC), on clinical and behavioral outcomes among urban African Americans with type 2 diabetes. Eligible participants (n = 182) had a suboptimal CVD risk factor profile (A1C, blood pressure, and/or lipids). Participants were randomized to DECIDE Self-Study (n = 46), DECIDE Individual (n = 45), DECIDE Group (n = 46), or Enhanced UC (n = 45). Intervention duration was 18-20 weeks. Outcomes were A1C, blood pressure, lipids, problem-solving, disease knowledge, and self-care activities, all measured at baseline, 1 week, and 6 months after completion of the intervention. DECIDE modalities and Enhanced UC did not significantly differ in clinical outcomes at 6 months postintervention. In participants with A1C ≥7.5% (58 mmol/mol) at baseline, A1C declined in each DECIDE modality at 1 week postintervention (P < 0.05) and only in Self-Study at 6 months postintervention (b = -0.24, P < 0.05). There was significant reduction in systolic blood pressure in Self-Study (b = -4.04) and Group (b = -3.59) at 6 months postintervention. Self-Study, Individual, and Enhanced UC had significant declines in LDL and Self-Study had an increase in HDL (b = 1.76, P < 0.05) at 6 months postintervention. Self-Study and Individual had a higher increase in knowledge than Enhanced UC (P < 0.05), and all arms improved in problem-solving (P < 0.01) at 6 months postintervention. DECIDE modalities showed benefits after intervention. Self-Study demonstrated robust improvements across clinical and behavioral outcomes, suggesting program suitability for broader dissemination to populations with similar educational and literacy levels. © 2016 by the American Diabetes Association.

  7. Ergonomic design intervention strategy for work tools development for women agro based workers in Northeast India.

    PubMed

    Chakrabarti, Debkumar; Bhattachheriya, Nandita

    2012-01-01

    Finding the appropriate strategy for work tool development has become a crucial issue in the occupational wellness of the varied women workforce of Northeast India. This paper deals with ergonomics intervention through a sustainable work tool design and development process. Workers in unorganised small-scale fruit processing units, where productivity is directly tied to the harvesting season, frequently shift between different activities and therefore require different work tools for specific tasks; mostly, the workers themselves manage their own work tools with the local resources available. In contrast, tea-leaf pluckers are engaged in a single task throughout the year, and their work schedule and equipment are decided and supplied to them by corporate decision, with no individual control by the workers. Observations confirm the need for organising participatory workshops specific to trade-based occupational well-being, and for different work tools for different tasks in the mostly privately owned unorganised sector. For tea-leaf plucking, in which the workers are engaged in full-time employment, developing a single work tool that supports a crucial component of the task can, through a corporate decision, reach a large number of users and have a good effect.

  8. Non-therapeutic research with minors: how do chairpersons of German research ethics committees decide?

    PubMed Central

    Lenk, C; Radenbach, K; Dahl, M; Wiesemann, C

    2004-01-01

    Objectives: Clinical trials in humans in Germany—as in many other countries—must be approved by local research ethics committees (RECs). The current study has been designed to document and evaluate decisions of chairpersons of RECs in the problematic field of non-therapeutic research with minors. The authors' purpose was to examine whether non-therapeutic research was acceptable for chairpersons at all, and whether there was certainty on how to decide in research trials involving more than minimal risk. Design: In a questionnaire, REC chairpersons had to evaluate five different scenarios with (in parts) non-therapeutic research. The scenarios described realistic potential research projects with minors, involving increasing levels of risk for the research participants. The chairpersons had to decide whether the respective projects should be approved. Methods: A total of 49 German REC chairpersons were sent questionnaires; 29 questionnaires were returned. The main measurements were approval or rejection of research scenarios. Results: Chairpersons of German RECs generally tend to accept non-therapeutic research with minors if the apparent risk for the participating children is low. If the risk is clearly higher than "minimal", the chairpersons' decisions differ widely. Conclusion: The fact that there seem to be different attitudes of chairpersons to non-therapeutic research with minors is problematic from an ethical point of view. It suggests a general uncertainty about the standards of protection for minor research participants in Germany. Therefore, further ethical and legal regulation of non-therapeutic research with minors in Germany seems necessary. PMID:14872082

  9. Public access management as an adaptive wildlife management tool

    USGS Publications Warehouse

    Ouren, Douglas S.; Watts, Raymond D.

    2005-01-01

    One key issue in the Black Mesa – Black Canyon area is the interaction between motorized vehicles and wildlife. The working hypothesis for this study is that early-season elk movement onto private lands and the National Park is precipitated by increased use of Off Highway Vehicles (OHVs). Data on the intensity of motorized use are extremely limited. In this study, we monitor the intensity of motorized vehicle and trail use, track its effects on elk movements and habitat usage, and analyze the interactions. If management agencies decide to alter accessibility, we will monitor wildlife responses to changes in the human-use regime. This provides a unique opportunity for adaptive management experimentation based on coordinated research and monitoring. The products from this project will provide natural resource managers across the nation with tools and information to better meet these resource challenges.

  10. Deciding Optimal Noise Monitoring Sites with Matrix Gray Absolute Relation Degree Theory

    NASA Astrophysics Data System (ADS)

    Gao, Zhihua; Li, Yadan; Zhao, Limin; Wang, Shuangwei

    2015-08-01

    Noise maps are used to assess noise levels in cities all around the world. There are two main ways of producing them: one is theoretical simulation based on the surrounding conditions, such as traffic flow and building distribution; the other is calculating noise levels from actual measurement data collected by noise monitors. The current literature mainly focuses on incorporating more factors that affect sound propagation into theoretical simulations, and on interpolation methods for producing noise maps from noise measurements. Although many factors are considered during simulation, noise maps still have to be calibrated against actual noise measurements. Therefore, the way noise data are obtained matters for both producing and calibrating a noise map. However, little has been written about how to decide on the right monitoring sites when only a specified number of noise sensors can be placed, or about the deviation of a noise map produced from their data. In this work, using matrix Gray Absolute Relation Degree Theory, we calculated the relation degrees between the most precise noise surface and surfaces interpolated from different combinations of a specified number of noise measurements. We found that surfaces plotted from different combinations of noise data had different relation degrees with the most precise one. We then identified the least significant measurement among the total and calculated the corresponding deviation when it was excluded from the noise surface. Processing the remaining noise data in the same way, we found the least significant measurement among the remaining data one by one. With this method, we optimized the noise sensor distribution in an area of about 2 km2, and we also calculated the bias of surfaces with the least significant data removed. Our practice offers a practical solution to the situation faced by most governments, where only a limited financial budget is available for noise monitoring, especially in
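
    The site-ranking procedure described above amounts to greedy backward elimination: repeatedly drop the measurement whose removal changes the interpolated surface the least. The sketch below assumes two user-supplied callables, interpolate (builds a surface from a set of sensor readings) and similarity (standing in for the matrix Gray Absolute Relation Degree); both are placeholders, not the authors' implementation.

      def rank_sensors(sensors, interpolate, similarity):
          """Order sensors from least to most significant by greedy removal.

          sensors: dict mapping sensor id -> reading/location record."""
          reference = interpolate(sensors)          # most precise surface (all sensors)
          remaining = dict(sensors)
          order = []
          while len(remaining) > 1:
              best_key, best_sim = None, float("-inf")
              for key in remaining:
                  subset = {k: v for k, v in remaining.items() if k != key}
                  sim = similarity(reference, interpolate(subset))
                  if sim > best_sim:                # removal changes the surface the least
                      best_key, best_sim = key, sim
              order.append((best_key, best_sim))
              remaining.pop(best_key)
          return order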

  11. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    NASA Astrophysics Data System (ADS)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions that complement it with magnetoresistance and spin-transfer-torque calculations, as well as selection of local magnetization data for output. Our software allows the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI that automates the creation of Matlab scripts suitable for analyzing such data with Fourier and wavelet transforms as well as user-defined operations.
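
    The core idea of generating many Mif files from a GUI-specified parameter sweep can be sketched as simple template filling. The template below is deliberately skeletal: a real OOMMF problem needs full Specify blocks for the mesh, energies, evolver and driver, so treat the stanza and parameter names here as illustrative assumptions only.

      from itertools import product
      from pathlib import Path

      TEMPLATE = """# MIF 2.1
      # Illustrative fragment only -- not a complete OOMMF problem description.
      Parameter field_mT {field_mT}
      Parameter alpha    {alpha}
      """

      def generate_mifs(outdir, fields_mT, alphas):
          """Write one Mif file for every point of a two-parameter sweep."""
          outdir = Path(outdir)
          outdir.mkdir(parents=True, exist_ok=True)
          for h, a in product(fields_mT, alphas):
              path = outdir / f"sweep_H{h}_a{a}.mif"
              path.write_text(TEMPLATE.format(field_mT=h, alpha=a))
          return sorted(outdir.glob("*.mif"))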

  12. Software Tools for Weed Seed Germination Modeling

    USDA-ARS?s Scientific Manuscript database

    The next generation of weed seed germination models will need to account for variable soil microclimate conditions. In order to predict this microclimate environment we have developed a suite of individual tools (models) that can be used in conjunction with the next generation of weed seed germinati...

  13. Metrology laboratory requirements for third-generation synchrotron radiation sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Takacs, P.Z.; Quian, Shinan

    1997-11-01

    New third-generation synchrotron radiation sources that have now, or will soon, come on line will need to decide how to handle the testing of optical components delivered for use in their beam lines. In many cases it is desirable to establish an in-house metrology laboratory to do the work. We review the history behind the formation of the Optical Metrology Laboratory at Brookhaven National Laboratory and the rationale for its continued existence. We offer suggestions to those who may be contemplating setting up similar facilities, based on our experiences over the past two decades.

  14. Dormancy and germination: How does the crop seed decide?

    PubMed

    Shu, K; Meng, Y J; Shuai, H W; Liu, W G; Du, J B; Liu, J; Yang, W Y

    2015-11-01

    Whether seeds germinate or maintain dormancy is decided upon through very intricate physiological processes. Correct timing of these processes is most important for the plant's life cycle. If moist conditions are encountered, a low dormancy level causes pre-harvest sprouting in various crop species, such as wheat, corn and rice; this decreases crop yield and negatively impacts downstream industrial processing. In contrast, a deep level of seed dormancy prevents normal germination even under favourable conditions, resulting in a low emergence rate during agricultural production. Therefore, an optimal seed dormancy level is valuable for modern mechanised agricultural systems. Over the past several years, numerous studies have demonstrated that diverse endogenous and environmental factors regulate the balance between dormancy and germination, such as light, temperature, water status and bacteria in soil, and phytohormones such as ABA (abscisic acid) and GA (gibberellic acid). In this updated review, we highlight recent advances regarding the molecular mechanisms underlying regulation of seed dormancy and germination processes, including the external environmental and internal hormonal cues, primarily focusing on staple crop species. Furthermore, future challenges and research directions for developing a full understanding of crop seed dormancy and germination are also discussed. © 2015 German Botanical Society and The Royal Botanical Society of the Netherlands.

  15. tkLayout: a design tool for innovative silicon tracking detectors

    NASA Astrophysics Data System (ADS)

    Bianchi, G.

    2014-03-01

    A new CMS tracker is scheduled to become operational for the LHC Phase 2 upgrade in the early 2020's. tkLayout is a software package developed to create 3d models for the design of the CMS tracker and to evaluate its fundamental performance figures. The new tracker will have to cope with much higher luminosity conditions, resulting in increased track density, harsher radiation exposure and, especially, much higher data acquisition bandwidth, such that equipping the tracker with triggering capabilities is envisaged. The design of an innovative detector involves deciding on an architecture offering the best trade-off among many figures of merit, such as tracking resolution, power dissipation, bandwidth, cost and so on. Quantitatively evaluating these figures of merit as early as possible in the design phase is of capital importance and it is best done with the aid of software models. tkLayout is a flexible modeling tool: new performance estimates and support for different detector geometries can be quickly added, thanks to its modular structure. Besides, the software executes very quickly (about two minutes), so that many possible architectural variations can be rapidly modeled and compared, to help in the choice of a viable detector layout and then to optimize it. A tracker geometry is generated from simple configuration files, defining the module types, layout and materials. Support structures are automatically added and services routed to provide a realistic tracker description. The tracker geometries thus generated can be exported to the standard CMS simulation framework (CMSSW) for full Monte Carlo studies. tkLayout has proven essential in giving guidance to CMS in studying different detector layouts and exploring the feasibility of innovative solutions for tracking detectors, in terms of design, performance and projected costs. This tool has been one of the keys to making important design decisions for over five years now and has also enabled project engineers

  16. Sliding versus Deciding in Relationships: Associations with Relationship Quality, Commitment, and Infidelity

    PubMed Central

    Owen, Jesse; Rhoades, Galena K.; Stanley, Scott M.

    2013-01-01

    From choosing a partner to date to deciding to cohabit or marry, individuals are faced with many relationship choices. Given the costs of failed relationships (e.g., personal distress, problems with work, lower well-being for children, lost opportunities to meet other partners), it is important to consider how individuals are approaching these decisions. The current study tested whether more thoughtful and clear relationship decision-making processes would relate to individuals’ levels of satisfaction with and dedication to their partners as well as their extra-dyadic involvements. In a sample of 252 men and women, the results showed that regardless of relationship status (i.e., dating, cohabiting, or married), those who reported more thoughtful decision-making processes also reported more dedication to their partners, higher satisfaction with the relationship, and fewer extra-dyadic involvements. PMID:23690736

  17. Learning to merge: a new tool for interactive mapping

    NASA Astrophysics Data System (ADS)

    Porter, Reid B.; Lundquist, Sheng; Ruggiero, Christy

    2013-05-01

    The task of turning raw imagery into semantically meaningful maps and overlays is a key area of remote sensing activity. Image analysts, in applications ranging from environmental monitoring to intelligence, use imagery to generate and update maps of terrain, vegetation, road networks, buildings and other relevant features. Often these tasks can be cast as a pixel labeling problem, and several interactive pixel labeling tools have been developed. These tools exploit training data, which is generated by analysts using simple and intuitive paint-program annotation tools, in order to tailor the labeling algorithm for the particular dataset and task. In other cases, the task is best cast as a pixel segmentation problem. Interactive pixel segmentation tools have also been developed, but these tools typically do not learn from training data like the pixel labeling tools do. In this paper we investigate tools for interactive pixel segmentation that also learn from user input. The input has the form of segment merging (or grouping). Merging examples are 1) easily obtained from analysts using vector annotation tools, and 2) more challenging to exploit than traditional labels. We outline the key issues in developing these interactive merging tools, and describe their application to remote sensing.
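
    A minimal sketch of how analyst merge examples can train a pairwise classifier: each adjacent segment pair is summarized by a few features, labelled merge/keep-separate from the analyst's vector edits, and a standard classifier predicts merges for new pairs. The feature set and the segment record layout (mean, size, id, boundary) are hypothetical; the paper's actual features and learning machinery are not specified here.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def pair_features(seg_a, seg_b):
          """Describe an adjacent segment pair: appearance difference,
          size ratio, and shared boundary length (all precomputed)."""
          return [abs(seg_a["mean"] - seg_b["mean"]),
                  min(seg_a["size"], seg_b["size"]) / max(seg_a["size"], seg_b["size"]),
                  seg_a["boundary"].get(seg_b["id"], 0)]

      def train_merge_model(pairs, labels):
          """Fit merge/keep-separate decisions from analyst grouping examples."""
          X = np.array([pair_features(a, b) for a, b in pairs])
          model = RandomForestClassifier(n_estimators=100, random_state=0)
          model.fit(X, np.array(labels))
          return model

      def should_merge(model, seg_a, seg_b, threshold=0.5):
          """Predict whether a new adjacent pair should be merged."""
          prob = model.predict_proba([pair_features(seg_a, seg_b)])[0, 1]
          return prob >= threshold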

  18. Downhole tool

    DOEpatents

    Hall, David R.; Muradov, Andrei; Pixton, David S.; Dahlgren, Scott Steven; Briscoe, Michael A.

    2007-03-20

    A double shouldered downhole tool connection comprises box and pin connections having mating threads intermediate mating primary and secondary shoulders. The connection further comprises a secondary shoulder component retained in the box connection intermediate a floating component and the primary shoulders. The secondary shoulder component and the pin connection cooperate to transfer a portion of makeup load to the box connection. The downhole tool may be selected from the group consisting of drill pipe, drill collars, production pipe, and reamers. The floating component may be selected from the group consisting of electronics modules, generators, gyroscopes, power sources, and stators. The secondary shoulder component may comprise an interface to the box connection selected from the group consisting of radial grooves, axial grooves, tapered grooves, radial protrusions, axial protrusions, tapered protrusions, shoulders, and threads.

  19. Development of micromachine tool prototypes for microfactories

    NASA Astrophysics Data System (ADS)

    Kussul, E.; Baidyk, T.; Ruiz-Huerta, L.; Caballero-Ruiz, A.; Velasco, G.; Kasatkina, L.

    2002-11-01

    At present, many areas of industry have strong tendencies towards miniaturization of products. Mechanical components of these products as a rule are manufactured using conventional large-scale equipment or micromechanical equipment based on microelectronic technology (MEMS). The first method has some drawbacks because conventional large-scale equipment consumes much energy, space and material. The second method seems to be more advanced but has some limitations, for example, two-dimensional (2D) or 2.5-dimensional shapes of components and materials compatible with silicon technology. In this paper, we consider an alternative technology of micromechanical device production. This technology is based on micromachine tools (MMT) and microassembly devices, which can be produced as sequential generations of microequipment. The first generation can be produced by conventional large-scale equipment. The machine tools of this generation can have overall sizes of 100-200 mm. Using microequipment of this generation, second generation microequipment having smaller overall sizes can be produced. This process can be repeated to produce generations of micromachine tools having overall sizes of some millimetres. In this paper we describe the efforts and some results of first generation microequipment prototyping. A micromachining centre having an overall size of 130 × 160 × 85 mm3 was produced and characterized. This centre has allowed us to manufacture micromechanical details having sizes from 50 µm to 5 mm. These details have complex three-dimensional shapes (for example, screw, gear, graduated shaft, conic details, etc), and are made from different materials, such as brass, steel, different plastics etc. We have started to investigate and to make prototypes of the assembly microdevices controlled by a computer vision system. In this paper we also describe an example of the applications (microfilters) for the proposed technology.

  20. When Groups Decide to Use Asynchronous Online Discussions: Collaborative Learning and Social Presence under a Voluntary Participation Structure

    ERIC Educational Resources Information Center

    So, H.-J.

    2009-01-01

    The purpose of this study is to explore how groups decide to use asynchronous online discussion forums in a non-mandatory setting, and, after the group decision is made, how group members use online discussion forums to complete a collaborative learning project requiring complex data gathering and research processes. While a large body of research…

  1. 25 CFR 162.585 - How will BIA decide whether to approve a leasehold mortgage of a WSR lease?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve a leasehold mortgage of a WSR lease? 162.585 Section 162.585 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Leasehold Mortgages § 162.585...

  2. 25 CFR 162.585 - How will BIA decide whether to approve a leasehold mortgage of a WSR lease?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve a leasehold mortgage of a WSR lease? 162.585 Section 162.585 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Leasehold Mortgages § 162.585...

  3. Use of a Local Immunotherapy as an Adjunctive Tool for the Generation of Human Monoclonal Antibodies from Regional Lymph Nodes of Colonic Cancer Patients

    PubMed Central

    Yagyu, Toshio; Monden, Takushi; Tamaki, Yasuhiro; Morimoto, Hideki; Takeda, Tsutomu; Kobayashi, Tetsuro; Shimano, Takashi; Murakami, Hiroki; Mori, Takesada

    1992-01-01

    Human hybridomas were generated through the fusion of the human B‐lymphoblastoid cell line HO‐323 with the regional lymph node lymphocytes of colonic cancer patients who had received a local immunotherapy. A total of 353 hybridomas were obtained from 4 patients and 116 of these were found to secrete ≧ 100 ng/ml human immunoglobulin. The efficiency was remarkably high as compared with that from patients without the local immunotherapy. Further immunohistological examination showed that 5 hybridomas secreted IgM which selectively reacted with colonic cancers. The results indicate that local immunotherapy could be an adjunctive tool for the generation of highly potent human hybridomas through augmenting the host's immunity. PMID:1544869

  4. Automatic Certification of Kalman Filters for Reliable Code Generation

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann; Richardson, Julian

    2005-01-01

    AUTOFILTER is a tool for automatically deriving Kalman filter code from high-level declarative specifications of state estimation problems. It can generate code with a range of algorithmic characteristics and for several target platforms. The tool has been designed with reliability of the generated code in mind and is able to automatically certify that the code it generates is free from various error classes. Since documentation is an important part of software assurance, AUTOFILTER can also automatically generate various human-readable documents, containing both design and safety related information. We discuss how these features address software assurance standards such as DO-178B.
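
    For context, the code that a tool like AUTOFILTER ultimately produces revolves around the standard linear Kalman filter recursion. The sketch below is the textbook predict/update step in NumPy, not AUTOFILTER's generated output or its specification language.

      import numpy as np

      def kalman_step(x, P, z, F, H, Q, R):
          """One predict/update cycle of a linear Kalman filter.
          x, P: prior state estimate and covariance; z: measurement;
          F, H: state-transition and observation matrices;
          Q, R: process and measurement noise covariances."""
          # Predict
          x_pred = F @ x
          P_pred = F @ P @ F.T + Q
          # Update
          innovation = z - H @ x_pred
          S = H @ P_pred @ H.T + R
          K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
          x_new = x_pred + K @ innovation
          P_new = (np.eye(len(x_new)) - K @ H) @ P_pred
          return x_new, P_new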

  5. FluxSuite: a New Scientific Tool for Advanced Network Management and Cross-Sharing of Next-Generation Flux Stations

    NASA Astrophysics Data System (ADS)

    Burba, G. G.; Johnson, D.; Velgersdyk, M.; Beaty, K.; Forgione, A.; Begashaw, I.; Allyn, D.

    2015-12-01

    Significant increases in data generation and computing power in recent years have greatly improved spatial and temporal flux data coverage on multiple scales, from a single station to continental flux networks. At the same time, operating budgets for flux teams and stations infrastructure are getting ever more difficult to acquire and sustain. With more stations and networks, larger data flows from each station, and smaller operating budgets, modern tools are needed to effectively and efficiently handle the entire process. This would help maximize time dedicated to answering research questions, and minimize time and expenses spent on data processing, quality control and station management. Cross-sharing the stations with external institutions may also help leverage available funding, increase scientific collaboration, and promote data analyses and publications. FluxSuite, a new advanced tool combining hardware, software and web-service, was developed to address these specific demands. It automates key stages of flux workflow, minimizes day-to-day site management, and modernizes the handling of data flows:
    - Each next-generation station measures all parameters needed for flux computations
    - Field microcomputer calculates final fully-corrected flux rates in real time, including computation-intensive Fourier transforms, spectra, co-spectra, multiple rotations, stationarity, footprint, etc.
    - Final fluxes, radiation, weather and soil data are merged into a single quality-controlled file
    - Multiple flux stations are linked into an automated time-synchronized network
    - Flux network manager, or PI, can see all stations in real time, including fluxes, supporting data, automated reports, and email alerts
    - PI can assign rights, allow or restrict access to stations and data: selected stations can be shared via rights-managed access internally or with external institutions
    - Researchers without stations could form "virtual networks" for specific projects by collaborating with PIs from
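
    At the heart of the on-station flux computation described above is the eddy-covariance estimate: the covariance of vertical wind speed and a scalar concentration over an averaging block. The sketch below is a deliberately simplified illustration of that single step, assuming a 10 Hz sampling rate; it omits the rotations, stationarity tests, spectral corrections and footprint calculations that the field microcomputer performs.

      import numpy as np

      def eddy_flux(w, c, fs=10.0, block_minutes=30):
          """Covariance flux w'c' for one averaging block, using simple
          block-mean removal (no rotations or corrections)."""
          n = int(block_minutes * 60 * fs)
          w = np.asarray(w[:n], dtype=float)
          c = np.asarray(c[:n], dtype=float)
          return float(np.mean((w - w.mean()) * (c - c.mean())))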

  6. BrainCheck - a very brief tool to detect incipient cognitive decline: optimized case-finding combining patient- and informant-based data.

    PubMed

    Ehrensperger, Michael M; Taylor, Kirsten I; Berres, Manfred; Foldi, Nancy S; Dellenbach, Myriam; Bopp, Irene; Gold, Gabriel; von Gunten, Armin; Inglin, Daniel; Müri, René; Rüegger, Brigitte; Kressig, Reto W; Monsch, Andreas U

    2014-01-01

    Optimal identification of subtle cognitive impairment in the primary care setting requires a very brief tool combining (a) patients' subjective impairments, (b) cognitive testing, and (c) information from informants. The present study developed a new, very quick and easily administered case-finding tool combining these assessments ('BrainCheck') and tested the feasibility and validity of this instrument in two independent studies. We developed a case-finding tool comprised of patient-directed (a) questions about memory and depression and (b) clock drawing, and (c) the informant-directed 7-item version of the Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE). Feasibility study: 52 general practitioners rated the feasibility and acceptance of the patient-directed tool. Validation study: An independent group of 288 Memory Clinic patients (mean ± SD age = 76.6 ± 7.9, education = 12.0 ± 2.6; 53.8% female) with diagnoses of mild cognitive impairment (n = 80), probable Alzheimer's disease (n = 185), or major depression (n = 23) and 126 demographically matched, cognitively healthy volunteer participants (age = 75.2 ± 8.8, education = 12.5 ± 2.7; 40% female) partook. All patient and healthy control participants were administered the patient-directed tool, and informants of 113 patient and 70 healthy control participants completed the very short IQCODE. Feasibility study: General practitioners rated the patient-directed tool as highly feasible and acceptable. Validation study: A Classification and Regression Tree analysis generated an algorithm to categorize patient-directed data which resulted in a correct classification rate (CCR) of 81.2% (sensitivity = 83.0%, specificity = 79.4%). Critically, the CCR of the combined patient- and informant-directed instruments (BrainCheck) reached nearly 90% (that is 89.4%; sensitivity = 97.4%, specificity = 81.6%). A new and very brief instrument for
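
    The abstract's Classification and Regression Tree step can be illustrated with a small decision-tree sketch. The feature layout used here (memory complaint, depression item, clock-drawing score, short IQCODE mean) and the use of scikit-learn are assumptions for illustration; the published BrainCheck algorithm itself is a fixed set of branching rules derived from the authors' data.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      def fit_case_finding_tree(X, y, max_depth=3):
          """Fit a shallow CART-style tree: rows of X are participants
          (e.g. [memory_complaint, depression_item, clock_score, iqcode7]),
          y is 1 for cognitively impaired, 0 for healthy."""
          tree = DecisionTreeClassifier(max_depth=max_depth, random_state=0)
          tree.fit(np.asarray(X), np.asarray(y))
          return tree

      def classify(tree, features):
          """Predicted class (1 = impaired, 0 = healthy) for one participant."""
          return int(tree.predict([features])[0])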

  7. EGG: Empirical Galaxy Generator

    NASA Astrophysics Data System (ADS)

    Schreiber, C.; Elbaz, D.; Pannella, M.; Merlin, E.; Castellano, M.; Fontana, A.; Bourne, N.; Boutsia, K.; Cullen, F.; Dunlop, J.; Ferguson, H. C.; Michałowski, M. J.; Okumura, K.; Santini, P.; Shu, X. W.; Wang, T.; White, C.

    2018-04-01

    The Empirical Galaxy Generator (EGG) generates fake galaxy catalogs and images with realistic positions, morphologies and fluxes from the far-ultraviolet to the far-infrared. The catalogs are generated by egg-gencat and stored in binary FITS tables (column oriented). Another program, egg-2skymaker, is used to convert the generated catalog into ASCII tables suitable for ingestion by SkyMaker (ascl:1010.066) to produce realistic high resolution images (e.g., Hubble-like), while egg-gennoise and egg-genmap can be used to generate the low resolution images (e.g., Herschel-like). These tools can be used to test source extraction codes, or to evaluate the reliability of any map-based science (stacking, dropout identification, etc.).
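
    Because the egg-gencat output is a column-oriented binary FITS table, it can be inspected with any standard FITS reader before being handed to SkyMaker or the map generators. A minimal sketch using astropy (not part of EGG; the file name is a placeholder and the available columns depend on the EGG configuration):

        # Open an egg-gencat catalog and take a quick look at its contents.
        from astropy.io import fits

        with fits.open("egg_catalog.fits") as hdul:
            catalog = hdul[1].data                 # first extension: binary table
            print(f"{len(catalog)} simulated galaxies")
            print(catalog.columns.names[:10])      # peek at the first few columns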

  8. Core-Cutoff Tool

    NASA Technical Reports Server (NTRS)

    Gheen, Darrell

    2007-01-01

    A tool makes a cut perpendicular to the cylindrical axis of a core hole at a predetermined depth to free the core at that depth. The tool does not damage the surrounding material from which the core was cut, and it operates within the core-hole kerf. Coring usually begins with use of a hole saw or a hollow cylindrical abrasive cutting tool to make an annular hole that leaves the core (sometimes called the plug ) in place. In this approach to coring as practiced heretofore, the core is removed forcibly in a manner chosen to shear the core, preferably at or near the greatest depth of the core hole. Unfortunately, such forcible removal often damages both the core and the surrounding material (see Figure 1). In an alternative prior approach, especially applicable to toxic or fragile material, a core is formed and freed by means of milling operations that generate much material waste. In contrast, the present tool eliminates the damage associated with the hole-saw approach and reduces the extent of milling operations (and, hence, reduces the waste) associated with the milling approach. The present tool (see Figure 2) includes an inner sleeve and an outer sleeve and resembles the hollow cylindrical tool used to cut the core hole. The sleeves are thin enough that this tool fits within the kerf of the core hole. The inner sleeve is attached to a shaft that, in turn, can be attached to a drill motor or handle for turning the tool. This tool also includes a cutting wire attached to the distal ends of both sleeves. The cutting wire is long enough that with sufficient relative rotation of the inner and outer sleeves, the wire can cut all the way to the center of the core. The tool is inserted in the kerf until its distal end is seated at the full depth. The inner sleeve is then turned. During turning, frictional drag on the outer core pulls the cutting wire into contact with the core. The cutting force of the wire against the core increases with the tension in the wire and

  9. Graphics processing unit (GPU) real-time infrared scene generation

    NASA Astrophysics Data System (ADS)

    Christie, Chad L.; Gouthas, Efthimios (Themie); Williams, Owen M.

    2007-04-01

    VIRSuite, the GPU-based suite of software tools developed at DSTO for real-time infrared scene generation, is described. The tools include the painting of scene objects with radiometrically-associated colours, translucent object generation, polar plot validation and versatile scene generation. Special features include radiometric scaling within the GPU and the presence of zoom anti-aliasing at the core of VIRSuite. Extension of the zoom anti-aliasing construct to cover target embedding and the treatment of translucent objects is described.

  10. Ontology-based configuration of problem-solving methods and generation of knowledge-acquisition tools: application of PROTEGE-II to protocol-based decision support.

    PubMed

    Tu, S W; Eriksson, H; Gennari, J H; Shahar, Y; Musen, M A

    1995-06-01

    PROTEGE-II is a suite of tools and a methodology for building knowledge-based systems and domain-specific knowledge-acquisition tools. In this paper, we show how PROTEGE-II can be applied to the task of providing protocol-based decision support in the domain of treating HIV-infected patients. To apply PROTEGE-II, (1) we construct a decomposable problem-solving method called episodic skeletal-plan refinement, (2) we build an application ontology that consists of the terms and relations in the domain, and of method-specific distinctions not already captured in the domain terms, and (3) we specify mapping relations that link terms from the application ontology to the domain-independent terms used in the problem-solving method. From the application ontology, we automatically generate a domain-specific knowledge-acquisition tool that is custom-tailored for the application. The knowledge-acquisition tool is used for the creation and maintenance of domain knowledge used by the problem-solving method. The general goal of the PROTEGE-II approach is to produce systems and components that are reusable and easily maintained. This is the rationale for constructing ontologies and problem-solving methods that can be composed from a set of smaller-grained methods and mechanisms. This is also why we tightly couple the knowledge-acquisition tools to the application ontology that specifies the domain terms used in the problem-solving systems. Although our evaluation is still preliminary, for the application task of providing protocol-based decision support, we show that these goals of reusability and easy maintenance can be achieved. We discuss design decisions and the tradeoffs that have to be made in the development of the system.
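
    Step (3), the mapping relations, is the glue between the application ontology and the domain-independent problem-solving method. The toy sketch below illustrates only the idea; the terms, roles, and representation are hypothetical rather than PROTEGE-II's actual formalism.

        # Toy mapping from application-ontology terms to the domain-independent
        # terms a skeletal-plan-refinement method might expect (all names hypothetical).
        application_ontology = {
            "AZT_protocol": {"kind": "protocol", "steps": ["induction", "maintenance"]},
            "CD4_count": {"kind": "patient_datum", "unit": "cells/uL"},
        }

        mapping_relations = {
            "protocol": "skeletal_plan",        # term the planning method reasons over
            "patient_datum": "world_fact",      # term consulted during plan refinement
        }

        def to_method_view(ontology, mappings):
            return {name: {**entry, "method_role": mappings[entry["kind"]]}
                    for name, entry in ontology.items()}

        print(to_method_view(application_ontology, mapping_relations))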

  11. An Intelligent Crop Planning Tool for Controlled Ecological Life Support Systems

    NASA Technical Reports Server (NTRS)

    Whitaker, Laura O.; Leon, Jorge

    1996-01-01

    This paper describes a crop planning tool developed for the Controlled Ecological Life Support Systems (CELSS) project which is in the research phases at various NASA facilities. The Crop Planning Tool was developed to assist in the understanding of the long term applications of a CELSS environment. The tool consists of a crop schedule generator as well as a crop schedule simulator. The importance of crop planning tools such as the one developed is discussed. The simulator is outlined in detail while the schedule generator is touched upon briefly. The simulator consists of data inputs, plant and human models, and various other CELSS activity models such as food consumption and waste regeneration. The program inputs such as crew data and crop states are discussed. References are included for all nominal parameters used. Activities including harvesting, planting, plant respiration, and human respiration are discussed using mathematical models. Plans provided to the simulator by the plan generator are evaluated for their 'fitness' to the CELSS environment with an objective function based upon daily reservoir levels. Sample runs of the Crop Planning Tool and future needs for the tool are detailed.
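
    The simulator scores each candidate plan with an objective function based on daily reservoir levels. The following is a minimal sketch of such a fitness evaluation, with hypothetical reservoirs and target bands rather than the tool's actual parameters.

        # Penalize a plan for every unit a simulated daily reservoir level falls
        # outside its target band (reservoir names and bands are hypothetical).
        TARGET_BANDS = {"food": (20.0, 80.0), "o2": (30.0, 90.0), "water": (40.0, 95.0)}

        def plan_fitness(daily_levels):
            """daily_levels: one dict per simulated day mapping reservoir -> level (% full)."""
            penalty = 0.0
            for day in daily_levels:
                for reservoir, (low, high) in TARGET_BANDS.items():
                    level = day[reservoir]
                    if level < low:
                        penalty += low - level
                    elif level > high:
                        penalty += level - high
            return -penalty            # closer to zero = fitter plan

        print(plan_fitness([{"food": 55, "o2": 62, "water": 70},
                            {"food": 18, "o2": 61, "water": 72}]))   # -> -2.0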

  12. ICLUS Tools and Datasets (Version 1.2) and User's Manual: Arcgis Tools and Datasets for Modeling US Housing Density (External Review Draft)

    EPA Science Inventory

    This draft Geographic Information System (GIS) tool can be used to generate scenarios of housing-density changes and calculate impervious surface cover for the conterminous United States. A draft User’s Guide accompanies the tool. This product distributes the population project...

  13. Diagnostic tool for red blood cell membrane disorders: Assessment of a new generation ektacytometer.

    PubMed

    Da Costa, Lydie; Suner, Ludovic; Galimand, Julie; Bonnel, Amandine; Pascreau, Tiffany; Couque, Nathalie; Fenneteau, Odile; Mohandas, Narla

    2016-01-01

    Inherited red blood cell (RBC) membrane disorders, such as hereditary spherocytosis, elliptocytosis and hereditary ovalocytosis, result from mutations in genes encoding various RBC membrane and skeletal proteins. The RBC membrane, a composite structure composed of a lipid bilayer linked to a spectrin/actin-based membrane skeleton, confers upon the RBC unique features of deformability and mechanical stability. The disease severity is primarily dependent on the extent of membrane surface area loss. RBC membrane disorders can be readily diagnosed by various laboratory approaches that include RBC cytology, flow cytometry, ektacytometry, electrophoresis of RBC membrane proteins and genetics. The reference technique for diagnosis of RBC membrane disorders is osmotic gradient ektacytometry. However, in spite of its recognition as the reference technique, this technique is rarely used as a routine diagnostic tool for RBC membrane disorders due to its limited availability. This may soon change as a new generation of ektacytometer has been recently engineered. In this review, we describe the workflow of the samples shipped to our Hematology laboratory for RBC membrane disorder analysis and the data obtained for a large cohort of French patients presenting with RBC membrane disorders using a newly available version of the ektacytometer. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Application of Multimedia Design Principles to Visuals Used in Course-Books: An Evaluation Tool

    ERIC Educational Resources Information Center

    Kuzu, Abdullah; Akbulut, Yavuz; Sahin, Mehmet Can

    2007-01-01

    This paper introduces an evaluation tool prepared to examine the quality of visuals in course-books. The tool is based on Mayer's Cognitive Theory of Multimedia Learning (i.e. Generative Theory) and its principles regarding the correct use of illustrations within text. The reason to generate the tool, the development process along with the…

  15. 5 CFR 894.202 - If I enroll for self plus one, may I decide which family member to cover?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false If I enroll for self plus one, may I decide which family member to cover? 894.202 Section 894.202 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL EMPLOYEES DENTAL AND VISION INSURANCE...

  16. 5 CFR 894.202 - If I enroll for self plus one, may I decide which family member to cover?

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false If I enroll for self plus one, may I decide which family member to cover? 894.202 Section 894.202 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) FEDERAL EMPLOYEES DENTAL AND VISION INSURANCE...

  17. Phenotype Instance Verification and Evaluation Tool (PIVET): A Scaled Phenotype Evidence Generation Framework Using Web-Based Medical Literature.

    PubMed

    Henderson, Jette; Ke, Junyuan; Ho, Joyce C; Ghosh, Joydeep; Wallace, Byron C

    2018-05-04

    Researchers are developing methods to automatically extract clinically relevant and useful patient characteristics from raw healthcare datasets. These characteristics, often capturing essential properties of patients with common medical conditions, are called computational phenotypes. Being generated by automated or semiautomated, data-driven methods, such potential phenotypes need to be validated as clinically meaningful (or not) before they are acceptable for use in decision making. The objective of this study was to present the Phenotype Instance Verification and Evaluation Tool (PIVET), a framework that uses co-occurrence analysis on an online corpus of publicly available medical journal articles to build clinical relevance evidence sets for user-supplied phenotypes. PIVET adopts a conceptual framework similar to the pioneering prototype tool PheKnow-Cloud that was developed for the phenotype validation task. PIVET completely refactors each part of the PheKnow-Cloud pipeline to deliver vast improvements in speed without sacrificing the quality of the insights PheKnow-Cloud achieved. PIVET leverages indexing in NoSQL databases to efficiently generate evidence sets. Specifically, PIVET uses a succinct representation of the phenotypes that corresponds to the index on the corpus database and an optimized co-occurrence algorithm inspired by the Aho-Corasick algorithm. We compare PIVET's phenotype representation with PheKnow-Cloud's by using PheKnow-Cloud's experimental setup. In PIVET's framework, we also introduce a statistical model trained on domain expert-verified phenotypes to automatically classify phenotypes as clinically relevant or not. Additionally, we show how the classification model can be used to examine user-supplied phenotypes in an online, rather than batch, manner. PIVET maintains the discriminative power of PheKnow-Cloud in terms of identifying clinically relevant phenotypes for the same corpus with which PheKnow-Cloud was originally developed, but
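
    PIVET's evidence sets rest on counting how often phenotype items co-occur in a corpus of medical journal articles. The sketch below shows that basic idea with naive substring matching over a couple of made-up abstracts; PIVET itself relies on NoSQL indexing and an optimized Aho-Corasick-inspired matcher, so this is illustrative only.

        # Count pairwise co-occurrences of phenotype terms across article texts
        # (naive substring matching; terms and abstracts are made up for illustration).
        from collections import Counter
        from itertools import combinations

        def cooccurrence_counts(phenotype_terms, abstracts):
            counts = Counter()
            for text in abstracts:
                lowered = text.lower()
                present = sorted(term for term in phenotype_terms if term in lowered)
                for a, b in combinations(present, 2):
                    counts[(a, b)] += 1
            return counts

        abstracts = [
            "Patients with type 2 diabetes and peripheral neuropathy received metformin.",
            "Metformin remains first-line therapy for type 2 diabetes.",
        ]
        print(cooccurrence_counts({"type 2 diabetes", "neuropathy", "metformin"}, abstracts))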

  18. Conceptual Design of Electron-Beam Generated Plasma Tools

    NASA Astrophysics Data System (ADS)

    Agarwal, Ankur; Rauf, Shahid; Dorf, Leonid; Collins, Ken; Boris, David; Walton, Scott

    2015-09-01

    Realization of the next generation of high-density nanostructured devices is predicated on etching features with atomic layer resolution, no damage and high selectivity. High energy electron beams generate plasmas with unique features that make them attractive for applications requiring monolayer precision. In these plasmas, high energy beam electrons ionize the background gas and the resultant daughter electrons cool to low temperatures via collisions with gas molecules and lack of any accelerating fields. For example, an electron temperature of <0.6 eV with densities comparable to conventional plasma sources can be obtained in molecular gases. The chemistry in such plasmas can significantly differ from RF plasmas as the ions/radicals are produced primarily by beam electrons rather than those in the tail of a low energy distribution. In this work, we will discuss the conceptual design of an electron beam based plasma processing system. Plasma properties will be discussed for Ar, Ar/N2, and O2 plasmas using a computational plasma model, and comparisons made to experiments. The fluid plasma model is coupled to a Monte Carlo kinetic model for beam electrons which considers gas phase collisions and the effect of electric and magnetic fields on electron motion. The impact of critical operating parameters such as magnetic field, beam energy, and gas pressure on plasma characteristics in electron-beam plasma processing systems will be discussed. Partially supported by the NRL base program.

  19. Developing Oral Case Presentation Skills: Peer and Self-Evaluations as Instructional Tools.

    PubMed

    Williams, Dustyn E; Surakanti, Shravani

    2016-01-01

    Oral case presentation is an essential skill in clinical practice that is decidedly varied and understudied in teaching curricula. We developed a curriculum to improve oral case presentation skills in medical students. As part of an internal medicine clerkship, students receive instruction in the elements of a good oral case presentation and then present a real-world case in front of a video camera. Each student self-evaluates his/her presentation and receives evaluations from his/her peers. We expect peer and self-evaluation to be meaningful tools for developing skills in oral presentation. We hope to not only improve the quality of oral case presentations by students but also to reduce the time burden on faculty.

  20. SEQ-POINTER: Next generation, planetary spacecraft remote sensing science observation design tool

    NASA Technical Reports Server (NTRS)

    Boyer, Jeffrey S.

    1994-01-01

    Since Mariner, NASA-JPL planetary missions have been supported by ground software to plan and design remote sensing science observations. The software used by the science and sequence designers to plan and design observations has evolved with mission and technological advances. The original program, PEGASIS (Mariners 4, 6, and 7), was re-engineered as POGASIS (Mariner 9, Viking, and Mariner 10), and again later as POINTER (Voyager and Galileo). Each of these programs was developed under technological, political, and fiscal constraints that limited its adaptability to other missions and spacecraft designs. Implementation of a multi-mission tool, SEQ POINTER, under the auspices of the JPL Multimission Operations Systems Office (MOSO) is in progress. This version has been designed to address the limitations experienced on previous versions as they were being adapted to a new mission and spacecraft. The tool has been modularly designed with subroutine interface structures to support interchangeable celestial body and spacecraft definition models. The computational and graphics modules have also been designed to interface with data collected from previous spacecraft, or on-going observations, which describe the surface of each target body. These enhancements make SEQ POINTER a candidate for low-cost mission usage, when a remote sensing science observation design capability is required. The current and planned capabilities of the tool will be discussed. The presentation will also include a 5-10 minute video demonstrating the capabilities of a proto-Cassini Project version that was adapted to test the tool. The work described in this abstract was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.

  1. SEQ-POINTER: Next generation, planetary spacecraft remote sensing science observation design tool

    NASA Astrophysics Data System (ADS)

    Boyer, Jeffrey S.

    1994-11-01

    Since Mariner, NASA-JPL planetary missions have been supported by ground software to plan and design remote sensing science observations. The software used by the science and sequence designers to plan and design observations has evolved with mission and technological advances. The original program, PEGASIS (Mariners 4, 6, and 7), was re-engineered as POGASIS (Mariner 9, Viking, and Mariner 10), and again later as POINTER (Voyager and Galileo). Each of these programs was developed under technological, political, and fiscal constraints that limited its adaptability to other missions and spacecraft designs. Implementation of a multi-mission tool, SEQ POINTER, under the auspices of the JPL Multimission Operations Systems Office (MOSO) is in progress. This version has been designed to address the limitations experienced on previous versions as they were being adapted to a new mission and spacecraft. The tool has been modularly designed with subroutine interface structures to support interchangeable celestial body and spacecraft definition models. The computational and graphics modules have also been designed to interface with data collected from previous spacecraft, or on-going observations, which describe the surface of each target body. These enhancements make SEQ POINTER a candidate for low-cost mission usage, when a remote sensing science observation design capability is required. The current and planned capabilities of the tool will be discussed. The presentation will also include a 5-10 minute video demonstrating the capabilities of a proto-Cassini Project version that was adapted to test the tool. The work described in this abstract was performed by the Jet Propulsion Laboratory, California Institute of Technology, under contract to the National Aeronautics and Space Administration.

  2. Batch mode grid generation: An endangered species

    NASA Technical Reports Server (NTRS)

    Schuster, David M.

    1992-01-01

    Non-interactive grid generation schemes should thrive as emphasis shifts from development of numerical analysis and design methods to application of these tools to real engineering problems. A strong case is presented for the continued development and application of non-interactive geometry modeling methods. Guidelines, strategies, and techniques for developing and implementing these tools are presented using current non-interactive grid generation methods as examples. These schemes play an important role in the development of multidisciplinary analysis methods and some of these applications are also discussed.

  3. Effect of DECIDE (Decision-making Education for Choices In Diabetes Everyday) Program Delivery Modalities on Clinical and Behavioral Outcomes in Urban African Americans With Type 2 Diabetes: A Randomized Trial

    PubMed Central

    Fitzpatrick, Stephanie L.; Golden, Sherita Hill; Stewart, Kerry; Sutherland, June; DeGross, Sharie; Brown, Tina; Wang, Nae-Yuh; Allen, Jerilyn; Cooper, Lisa A.

    2016-01-01

    OBJECTIVE To compare the effectiveness of three delivery modalities of Decision-making Education for Choices In Diabetes Everyday (DECIDE), a nine-module, literacy-adapted diabetes and cardiovascular disease (CVD) education and problem-solving training, compared with an enhanced usual care (UC), on clinical and behavioral outcomes among urban African Americans with type 2 diabetes. RESEARCH DESIGN AND METHODS Eligible participants (n = 182) had a suboptimal CVD risk factor profile (A1C, blood pressure, and/or lipids). Participants were randomized to DECIDE Self-Study (n = 46), DECIDE Individual (n = 45), DECIDE Group (n = 46), or Enhanced UC (n = 45). Intervention duration was 18–20 weeks. Outcomes were A1C, blood pressure, lipids, problem-solving, disease knowledge, and self-care activities, all measured at baseline, 1 week, and 6 months after completion of the intervention. RESULTS DECIDE modalities and Enhanced UC did not significantly differ in clinical outcomes at 6 months postintervention. In participants with A1C ≥7.5% (58 mmol/mol) at baseline, A1C declined in each DECIDE modality at 1 week postintervention (P < 0.05) and only in Self-Study at 6 months postintervention (b = −0.24, P < 0.05). There was significant reduction in systolic blood pressure in Self-Study (b = −4.04) and Group (b = −3.59) at 6 months postintervention. Self-Study, Individual, and Enhanced UC had significant declines in LDL and Self-Study had an increase in HDL (b = 1.76, P < 0.05) at 6 months postintervention. Self-Study and Individual had a higher increase in knowledge than Enhanced UC (P < 0.05), and all arms improved in problem-solving (P < 0.01) at 6 months postintervention. CONCLUSIONS DECIDE modalities showed benefits after intervention. Self-Study demonstrated robust improvements across clinical and behavioral outcomes, suggesting program suitability for broader dissemination to populations with similar educational and literacy levels. PMID:27879359

  4. Computer Generated Optical Illusions: A Teaching and Research Tool.

    ERIC Educational Resources Information Center

    Bailey, Bruce; Harman, Wade

    Interactive computer-generated simulations that highlight psychological principles were investigated in this study in which 33 female and 19 male undergraduate college student volunteers of median age 21 matched line and circle sizes in six variations of Ponzo's illusion. Prior to working with the illusions, data were collected based on subjects'…

  5. Simultaneous Scheduling of Jobs, AGVs and Tools Considering Tool Transfer Times in Multi Machine FMS By SOS Algorithm

    NASA Astrophysics Data System (ADS)

    Sivarami Reddy, N.; Ramamurthy, D. V., Dr.; Prahlada Rao, K., Dr.

    2017-08-01

    This article addresses the simultaneous scheduling of machines, AGVs, and tools in a multi-machine Flexible Manufacturing System (FMS), where machines are allowed to share tools and the transfer times of jobs and tools between machines are considered, in order to generate optimal sequences that minimize makespan. FMS performance is expected to improve through effective utilization of its resources and through proper integration and synchronization of their scheduling. The Symbiotic Organisms Search (SOS) algorithm is a potent tool that has proven to be a strong alternative for solving optimization problems such as scheduling. The proposed SOS algorithm is first tested on 22 job sets, with makespan as the objective, for scheduling of machines and tools where machines share tools but transfer times of jobs and tools are not considered, and the results are compared with those of existing methods; the results show that SOS outperforms them. The same SOS algorithm is then used for simultaneous scheduling of machines, AGVs, and tools, where machines share tools and transfer times of jobs and tools are considered, to determine the best sequences that minimize makespan.
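
    A metaheuristic such as SOS has to evaluate the makespan of each candidate dispatch sequence many times. The sketch below is a simplified evaluation that charges a fixed tool transfer delay whenever a shared tool must move between machines; it ignores AGV job transfers and is an illustration under those assumptions, not the authors' model.

        # Simplified makespan of a dispatch sequence with shared tools
        # (hypothetical data; AGV transfer of jobs is ignored for brevity).
        from collections import defaultdict

        def makespan(operations, tool_transfer_time):
            """operations: list of (job, machine, tool, processing_time) in dispatch order."""
            machine_free = defaultdict(float)
            job_free = defaultdict(float)
            tool_free = defaultdict(float)
            tool_location = {}
            finish = 0.0
            for job, machine, tool, proc in operations:
                tool_ready = tool_free[tool]
                if tool_location.get(tool, machine) != machine:
                    tool_ready += tool_transfer_time     # tool must travel to this machine
                start = max(machine_free[machine], job_free[job], tool_ready)
                end = start + proc
                machine_free[machine] = job_free[job] = tool_free[tool] = end
                tool_location[tool] = machine
                finish = max(finish, end)
            return finish

        ops = [("J1", "M1", "T1", 4), ("J2", "M2", "T1", 3), ("J1", "M2", "T2", 5)]
        print(makespan(ops, tool_transfer_time=2))   # -> 14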

  6. Portable spark-gap arc generator

    NASA Technical Reports Server (NTRS)

    Ignaczak, L. R.

    1978-01-01

    A self-contained spark generator that simulates the electrical noise caused by the discharge of static charge is a useful tool when checking sensitive components and equipment. In a test setup, the device introduces repeatable noise pulses while the behavior of components is monitored. The generator uses only standard commercial parts, weighs only 4 pounds, and runs from a portable dc power supply. Two configurations of the generator have been developed: one is a free-running arc source, and one delivers a spark in response to a triggering pulse.

  7. Quantifying Traces of Tool Use: A Novel Morphometric Analysis of Damage Patterns on Percussive Tools

    PubMed Central

    Caruana, Matthew V.; Carvalho, Susana; Braun, David R.; Presnyakova, Darya; Haslam, Michael; Archer, Will; Bobe, Rene; Harris, John W. K.

    2014-01-01

    Percussive technology continues to play an increasingly important role in understanding the evolution of tool use. Comparing the archaeological record with extractive foraging behaviors in nonhuman primates has focused on percussive implements as a key to investigating the origins of lithic technology. Despite this, archaeological approaches towards percussive tools have been obscured by a lack of standardized methodologies. Central to this issue have been the use of qualitative, non-diagnostic techniques to identify percussive tools from archaeological contexts. Here we describe a new morphometric method for distinguishing anthropogenically-generated damage patterns on percussive tools from naturally damaged river cobbles. We employ a geomatic approach through the use of three-dimensional scanning and geographical information systems software to statistically quantify the identification process in percussive technology research. This will strengthen current technological analyses of percussive tools in archaeological frameworks and open new avenues for translating behavioral inferences of early hominins from percussive damage patterns. PMID:25415303

  8. Digital test assembly of truck parts with the IMMA-tool--an illustrative case.

    PubMed

    Hanson, L; Högberg, D; Söderholm, M

    2012-01-01

    Several digital human modelling (DHM) tools have been developed for simulation and visualisation of human postures and motions. In 2010 the DHM tool IMMA (Intelligently Moving Manikins) was introduced as a DHM tool that uses advanced path planning techniques to generate collision free and biomechanically acceptable motions for digital human models (as well as parts) in complex assembly situations. The aim of the paper is to illustrate how the IPS/IMMA tool is used at Scania CV AB in a digital test assembly process, and to compare the tool with other DHM tools on the market. The illustrated case of using the IMMA tool, here combined with the path planner tool IPS, indicates that the tool is promising. The major strengths of the tool are its user friendly interface, the motion generation algorithms, the batch simulation of manikins and the ergonomics assessment methods that consider time.

  9. Model Based Analysis and Test Generation for Flight Software

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  10. Improving Organizational Learning: Defining Units of Learning from Social Tools

    ERIC Educational Resources Information Center

    Menolli, André Luís Andrade; Reinehr, Sheila; Malucelli, Andreia

    2013-01-01

    New technologies, such as social networks, wikis, blogs and other social tools, enable collaborative work and are important facilitators of the social learning process. Many companies are using these types of tools as substitutes for their intranets, especially software development companies. However, the content generated by these tools in many…

  11. An interactive multi-block grid generation system

    NASA Technical Reports Server (NTRS)

    Kao, T. J.; Su, T. Y.; Appleby, Ruth

    1992-01-01

    A grid generation procedure combining interactive and batch grid generation programs was put together to generate multi-block grids for complex aircraft configurations. The interactive section provides the tools for 3D geometry manipulation, surface grid extraction, boundary domain construction for 3D volume grid generation, and block-block relationships and boundary conditions for flow solvers. The procedure improves the flexibility and quality of grid generation to meet the design/analysis requirements.

  12. Deciding alternative left turn signal phases using expert systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, E.C.P.

    1988-01-01

    The Texas Transportation Institute (TTI) conducted a study to investigate the feasibility of applying artificial intelligence (AI) technology and expert systems (ES) design concepts to a traffic engineering problem. Prototype systems were developed to analyze user input, evaluate various lines of reasoning, and suggest a suitable left-turn phase treatment. These systems were developed using AI programming tools on IBM PC/XT/AT-compatible microcomputers. Two slightly different systems were designed using AI languages; another was built with a knowledge engineering tool. These systems include the PD PROLOG and TURBO PROLOG AI programs, as well as the INSIGHT Production Rule Language.

  13. RSAT: regulatory sequence analysis tools.

    PubMed

    Thomas-Chollier, Morgane; Sand, Olivier; Turatsinze, Jean-Valéry; Janky, Rekin's; Defrance, Matthieu; Vervisch, Eric; Brohée, Sylvain; van Helden, Jacques

    2008-07-01

    The regulatory sequence analysis tools (RSAT, http://rsat.ulb.ac.be/rsat/) is a software suite that integrates a wide collection of modular tools for the detection of cis-regulatory elements in genome sequences. The suite includes programs for sequence retrieval, pattern discovery, phylogenetic footprint detection, pattern matching, genome scanning and feature map drawing. Random controls can be performed with random gene selections or by generating random sequences according to a variety of background models (Bernoulli, Markov). Beyond the original word-based pattern-discovery tools (oligo-analysis and dyad-analysis), we recently added a battery of tools for matrix-based detection of cis-acting elements, with some original features (adaptive background models, Markov-chain estimation of P-values) that do not exist in other matrix-based scanning tools. The web server offers an intuitive interface, where each program can be accessed either separately or connected to the other tools. In addition, the tools are now available as web services, enabling their integration in programmatic workflows. Genomes are regularly updated from various genome repositories (NCBI and EnsEMBL) and 682 organisms are currently supported. Since 1998, the tools have been used by several hundreds of researchers from all over the world. Several predictions made with RSAT were validated experimentally and published.
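
    The random controls mentioned above draw sequences from background models such as Bernoulli or Markov chains. A small sketch of a first-order Markov background generator in that spirit (not RSAT code; the training sequence and pseudocounts are illustrative):

        # Estimate first-order transition probabilities from a training sequence
        # (with Laplace pseudocounts) and sample a random background sequence.
        import random

        def train_markov1(sequence):
            counts = {a: {b: 1 for b in "ACGT"} for a in "ACGT"}
            for a, b in zip(sequence, sequence[1:]):
                counts[a][b] += 1
            return {a: {b: n / sum(row.values()) for b, n in row.items()}
                    for a, row in counts.items()}

        def generate(model, length, rng=random.Random(0)):
            seq = [rng.choice("ACGT")]
            while len(seq) < length:
                probs = model[seq[-1]]
                seq.append(rng.choices(list(probs), weights=list(probs.values()))[0])
            return "".join(seq)

        background = train_markov1("ACGTACGGTCCATGCAATGCACGTTGACCA")
        print(generate(background, 40))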

  14. Test-Case Generation using an Explicit State Model Checker Final Report

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Gao, Jimin

    2003-01-01

    In the project 'Test-Case Generation using an Explicit State Model Checker' we have extended an existing tools infrastructure for formal modeling to export Java code so that we can use the NASA Ames tool Java Pathfinder (JPF) for test case generation. We have completed a translator from our source language RSML-e to Java and conducted initial studies of how JPF can be used as a testing tool. In this final report, we provide a detailed description of the translation approach as implemented in our tools.

  15. Applying CASE Tools for On-Board Software Development

    NASA Astrophysics Data System (ADS)

    Brammer, U.; Hönle, A.

    For many space projects the software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix) featuring UML and ISG (BSSE) that provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  16. Forensic surface metrology: tool mark evidence.

    PubMed

    Gambino, Carol; McLaughlin, Patrick; Kuo, Loretta; Kammerman, Frani; Shenkin, Peter; Diaczuk, Peter; Petraco, Nicholas; Hamby, James; Petraco, Nicholas D K

    2011-01-01

    Over the last several decades, forensic examiners of impression evidence have come under scrutiny in the courtroom due to analysis methods that rely heavily on subjective morphological comparisons. Currently, there is no universally accepted system that generates numerical data to independently corroborate visual comparisons. Our research attempts to develop such a system for tool mark evidence, proposing a methodology that objectively evaluates the association of striated tool marks with the tools that generated them. In our study, 58 primer shear marks on 9 mm cartridge cases, fired from four Glock model 19 pistols, were collected using high-resolution white light confocal microscopy. The resulting three-dimensional surface topographies were filtered to extract all "waviness surfaces"-the essential "line" information that firearm and tool mark examiners view under a microscope. Extracted waviness profiles were processed with principal component analysis (PCA) for dimension reduction. Support vector machines (SVM) were used to make the profile-gun associations, and conformal prediction theory (CPT) for establishing confidence levels. At the 95% confidence level, CPT coupled with PCA-SVM yielded an empirical error rate of 3.5%. Complementary, bootstrap-based computations for estimated error rates were 0%, indicating that the error rate for the algorithmic procedure is likely to remain low on larger data sets. Finally, suggestions are made for practical courtroom application of CPT for assigning levels of confidence to SVM identifications of tool marks recorded with confocal microscopy. Copyright © 2011 Wiley Periodicals, Inc.
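
    The pipeline described above (PCA for dimension reduction, SVM for the profile-gun association, conformal prediction for confidence) can be sketched as follows. The data are synthetic stand-ins rather than confocal waviness profiles, so the numbers are illustrative only.

        # PCA + SVM + a simple inductive conformal step, mirroring the described
        # pipeline on synthetic stand-in data (not the study's actual profiles).
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        profiles = rng.normal(size=(58, 200))      # stand-in "waviness" profiles
        labels = rng.integers(0, 4, size=58)       # stand-in labels for four pistols

        X_train, X_cal, y_train, y_cal = train_test_split(
            profiles, labels, test_size=0.3, random_state=0)

        pca = PCA(n_components=10).fit(X_train)    # dimension reduction
        svm = SVC(probability=True, random_state=0).fit(pca.transform(X_train), y_train)

        # Nonconformity = 1 - probability assigned to the true class on a held-out
        # calibration set; a new mark then gets one p-value per candidate gun.
        cal_prob = svm.predict_proba(pca.transform(X_cal))
        true_col = np.searchsorted(svm.classes_, y_cal)
        cal_scores = 1.0 - cal_prob[np.arange(len(y_cal)), true_col]

        def conformal_p_values(profile):
            prob = svm.predict_proba(pca.transform(profile.reshape(1, -1)))[0]
            return [(np.sum(cal_scores >= 1.0 - p) + 1) / (len(cal_scores) + 1) for p in prob]

        print(conformal_p_values(profiles[0]))     # high p-value = plausible source gun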

  17. 42 CFR 59.7 - What criteria will the Department of Health and Human Services use to decide which family...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 1 2014-10-01 2014-10-01 false What criteria will the Department of Health and Human Services use to decide which family planning services projects to fund and in what amount? 59.7 Section 59.7 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GRANTS GRANTS...

  18. 42 CFR 59.7 - What criteria will the Department of Health and Human Services use to decide which family...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false What criteria will the Department of Health and Human Services use to decide which family planning services projects to fund and in what amount? 59.7 Section 59.7 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GRANTS GRANTS...

  19. 42 CFR 59.7 - What criteria will the Department of Health and Human Services use to decide which family...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 1 2012-10-01 2012-10-01 false What criteria will the Department of Health and Human Services use to decide which family planning services projects to fund and in what amount? 59.7 Section 59.7 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GRANTS GRANTS...

  20. 42 CFR 59.7 - What criteria will the Department of Health and Human Services use to decide which family...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 42 Public Health 1 2013-10-01 2013-10-01 false What criteria will the Department of Health and Human Services use to decide which family planning services projects to fund and in what amount? 59.7 Section 59.7 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GRANTS GRANTS...