ERIC Educational Resources Information Center
Johnson, Samuel A.; Tutt, Tye
2008-01-01
Recently, a high school Science Club generated a large number of questions involving temperature. Therefore, they decided to construct a thermal gradient apparatus in order to conduct a wide range of experiments beyond the standard "cookbook" labs. They felt that this apparatus could be especially useful in future ninth-grade biology classes, in…
Podcasting: Connecting with a New Generation
ERIC Educational Resources Information Center
Halderson, Jeanne
2006-01-01
In this article, the author describes how she uses podcasting as an educational tool for her seventh grade students. Using only the applications that come pre-loaded on the Mac iBook, they work together to develop the content, write storyboards, produce and edit the podcasts, and analyze their work. From creating the script to deciding how to…
NASA Technical Reports Server (NTRS)
Heard, Pamala D.
1998-01-01
The purpose of this research is to explore the development of Marshall Space Flight Center Unique Programs. These academic tools provide the Education Program Office with important information from the Education Computer Aided Tracking System (EDCATS). This system provides on-line data entry, evaluation, analysis, and report generation, with full archiving for all phases of the evaluation process. Another purpose is to develop reports and data that are tailored to Marshall Space Flight Center Unique Programs. The research also seeks to establish how, why, and where information is derived. As a result, a user will be better prepared to decide which available tool is best suited to their reports.
Measurement of Spindle Rigidity by using a Magnet Loader
NASA Astrophysics Data System (ADS)
Yamazaki, Taku; Matsubara, Atsushi; Fujita, Tomoya; Muraki, Toshiyuki; Asano, Kohei; Kawashima, Kazuyuki
The static rigidity of a rotating spindle in the radial direction is investigated in this research. A magnetic loading device (magnet loader) has been developed for the measurement. The magnet loader, which has coils and iron cores, generates an electromagnetic force and attracts a dummy tool attached to the spindle. However, eddy currents are generated in the dummy tool as the spindle rotates and reduce the attractive force at high spindle speeds. In order to understand the magnetic flux and eddy currents in the dummy tool, an electromagnetic field analysis by FEM was carried out. Grooves on the attraction surface of the dummy tool were designed to interrupt the eddy current flow. The dimensions of the grooves were decided based on the FEM analysis, and the designed tool was manufactured and tested. The test results show that the designed tool successfully reduces the eddy currents and recovers the attractive force. By using the magnet loader and the grooved tool, the spindle rigidity can be measured while the spindle rotates at speeds up to 10,000 min^-1.
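As a first-order orientation only (not the authors' FEM model), the attractive force of such an electromagnet across an air gap is commonly estimated from the flux density B over the pole-face area A as

F \approx \frac{B^2 A}{2\mu_0},

where \mu_0 is the permeability of free space. Eddy currents induced in the rotating dummy tool oppose the change in flux and lower the effective B, which is why interrupting their paths with grooves recovers the attractive force.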
USSR Report Machine Tools and Metalworking Equipment.
1986-04-22
…directors decided to teach the Bulat a new trade. This generator is now used to strengthen high-speed cutting mills by hardening them in a medium of… …modules (GPM) and flexible production complexes (GPK). The flexible automated line is usually used for mass production of components. … [Specification fragment: number of programmable coordinates (without grip): 5/4; method of programming: teaching; memory capacity of robot system: 300 points; positioning error, mm: …]
A survey of tools and resources for the next generation analyst
NASA Astrophysics Data System (ADS)
Hall, David L.; Graham, Jake; Catherman, Emily
2015-05-01
We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.
Human/autonomy collaboration for the automated generation of intelligence products
NASA Astrophysics Data System (ADS)
DiBona, Phil; Schlachter, Jason; Kuter, Ugur; Goldman, Robert
2017-05-01
Intelligence analysis remains a manual process despite trends toward autonomy in information processing. Analysts need agile decision-support tools that can adapt to the evolving information needs of the mission, allowing the analyst to pose novel analytic questions. Our research enables analysts to provide only a constrained-English specification of what the intelligence product should be. Using HTN planning, the autonomy discovers, decides, and generates a workflow of algorithms to create the intelligence product. Therefore, the analyst can quickly and naturally communicate to the autonomy what information product is needed, rather than how to create it.
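For orientation, a minimal sketch of HTN-style task decomposition is given below; it is illustrative only, and the task and method names are hypothetical rather than those of the system described in the abstract.

# Minimal HTN-style decomposition sketch (illustrative only; not the system
# described in the abstract). Task and method names are hypothetical.

# Methods map a compound task to an ordered list of subtasks.
METHODS = {
    "produce_intel_product": [
        ["collect_reports", "fuse_entities", "render_product"],
    ],
    "fuse_entities": [
        ["resolve_names", "link_events"],
    ],
}

# Primitive tasks correspond to concrete algorithms or services.
PRIMITIVES = {"collect_reports", "resolve_names", "link_events", "render_product"}


def plan(task):
    """Recursively decompose a task into a workflow of primitive steps."""
    if task in PRIMITIVES:
        return [task]
    for subtasks in METHODS.get(task, []):
        workflow = []
        for sub in subtasks:
            step = plan(sub)
            if step is None:
                break
            workflow.extend(step)
        else:
            return workflow
    return None  # no applicable method


if __name__ == "__main__":
    print(plan("produce_intel_product"))
    # ['collect_reports', 'resolve_names', 'link_events', 'render_product']

The idea is that the analyst only names the top-level product; the planner recursively applies methods until only primitive, executable steps remain.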
Who decides and what are people willing-to-pay for whole genome sequencing information?
Marshall, DA; Gonzalez, JM; Johnson, FR; MacDonald, KV; Pugh, A; Douglas, MP; Phillips, KA
2016-01-01
PURPOSE Whole genome sequencing (WGS) can be used as a powerful diagnostic tool which could also be used for screening but may generate anxiety, unnecessary testing and overtreatment. Current guidelines suggest reporting clinically actionable secondary findings when diagnostic testing is performed. We estimated preferences for receiving WGS results. METHODS A US nationally representative survey (n=410 adults) was used to rank preferences for who decides (expert panel, your doctor, you) which WGS results are reported. We estimated the value of information about variants with varying levels of clinical usefulness using willingness-to-pay contingent valuation questions. RESULTS 43% preferred to decide themselves what information is included in the WGS report. 38% (95% CI:33–43%) would not pay for actionable variants, and 3% (95% CI:1–5%) would pay more than $1000. 55% (95% CI:50–60%) would not pay for variants in which medical treatment is currently unclear, and 7% (95% CI:5–9%) would pay more than $400. CONCLUSION Most people prefer to decide what WGS results are reported. Despite valuing actionable information more, some respondents perceive that genetic information could negatively impact them. Preference heterogeneity for WGS information should be considered in the development of policies, particularly to integrate patient preferences with personalized medicine and shared decision making. PMID:27253734
Lo, Andrea C; Olson, Robert; Feldman-Stewart, Deb; Truong, Pauline T; Aquino-Parsons, Christina; Bottorff, Joan L; Carolan, Hannah
2017-12-01
To evaluate the information needs of ductal carcinoma in situ (DCIS) patients. Four focus groups involving 24 previously treated DCIS patients were conducted to develop a comprehensive list of questions they felt were important to have answered at the time of diagnosis. Using a survey, a separate group of patients treated for DCIS then rated the importance of having each of these questions addressed before treatment decision making. Response options were "essential," "desired," "not important," "no opinion," and "avoid." For each essential/desired question, respondents specified how addressing it would help them: "understand," "decide," "plan," "not sure," or "other." Focus group participants generated 117 questions used in the survey. Fifty-seven patients completed the survey (55% response rate). Respondents rated a median of 66 questions as essential. The most commonly cited reason for rating a question essential was to "understand," followed by to "decide." The top questions women deemed essential to help them understand were disease specific, whereas the top questions deemed essential to help women decide were predominantly treatment specific, pertaining to available options, recurrence and survival outcomes, and timelines to decide and start treatment. DCIS patients want a large number of questions answered, mostly for understanding, and also for deciding and planning. A core set of questions that most patients consider essential for decision making has been formulated and may be used in the clinical setting and in research to develop educational resources and decision-making tools specific to DCIS.
E-DECIDER Decision Support Gateway For Earthquake Disaster Response
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Stough, T. M.; Parker, J. W.; Burl, M. C.; Donnellan, A.; Blom, R. G.; Pierce, M. E.; Wang, J.; Ma, Y.; Rundle, J. B.; Yoder, M. R.
2013-12-01
Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing capabilities for decision-making utilizing remote sensing data and modeling software in order to provide decision support for earthquake disaster management and response. E-DECIDER incorporates earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project in order to produce standards-compliant map data products to aid in decision-making following an earthquake. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, help provide both long-term planning information for disaster management decision makers as well as short-term information following earthquake events (i.e. identifying areas where the greatest deformation and damage has occurred and emergency services may need to be focused). E-DECIDER utilizes a service-based GIS model for its cyber-infrastructure in order to produce standards-compliant products for different user types with multiple service protocols (such as KML, WMS, WFS, and WCS). The goal is to make complex GIS processing and domain-specific analysis tools more accessible to general users through software services as well as provide system sustainability through infrastructure services. The system comprises several components, which include: a GeoServer for thematic mapping and data distribution, a geospatial database for storage and spatial analysis, web service APIs, including simple-to-use REST APIs for complex GIS functionalities, and geoprocessing tools including python scripts to produce standards-compliant data products. These are then served to the E-DECIDER decision support gateway (http://e-decider.org), the E-DECIDER mobile interface, and to the Department of Homeland Security decision support middleware UICDS (Unified Incident Command and Decision Support). The E-DECIDER decision support gateway features a web interface that delivers map data products including deformation modeling results (slope change and strain magnitude) and aftershock forecasts, with remote sensing change detection results under development. These products are event triggered (from the USGS earthquake feed) and will be posted to event feeds on the E-DECIDER webpage and accessible via the mobile interface and UICDS. E-DECIDER also features a KML service that provides infrastructure information from the FEMA HAZUS database through UICDS and the mobile interface. The back-end GIS service architecture and front-end gateway components form a decision support system that is designed for ease-of-use and extensibility for end-users.
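As an illustration of consuming such standards-compliant services, the Python sketch below issues an OGC WMS GetMap request; the endpoint URL and layer name are hypothetical placeholders, not actual E-DECIDER identifiers.

# Illustrative only: fetching a map layer from a standards-compliant OGC WMS
# endpoint of the kind the abstract describes (KML/WMS/WFS/WCS services).
# The endpoint URL and layer name below are hypothetical.
import requests

WMS_URL = "http://example.org/geoserver/wms"  # hypothetical GeoServer endpoint

params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": "edecider:slope_change",   # hypothetical layer name
    "styles": "",
    "bbox": "-125,32,-114,42",           # lon/lat bounding box (California)
    "srs": "EPSG:4326",
    "width": 512,
    "height": 512,
    "format": "image/png",
}

response = requests.get(WMS_URL, params=params, timeout=30)
response.raise_for_status()
with open("slope_change.png", "wb") as f:
    f.write(response.content)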
DECIDE: a Decision Support Tool to Facilitate Parents' Choices Regarding Genome-Wide Sequencing.
Birch, Patricia; Adam, S; Bansback, N; Coe, R R; Hicklin, J; Lehman, A; Li, K C; Friedman, J M
2016-12-01
We describe the rationale, development, and usability testing for an integrated e-learning tool and decision aid for parents facing decisions about genome-wide sequencing (GWS) for their children with a suspected genetic condition. The online tool, DECIDE, is designed to provide decision-support and to promote high quality decisions about undergoing GWS with or without return of optional incidental finding results. DECIDE works by integrating educational material with decision aids. Users may tailor their learning by controlling both the amount of information and its format - text and diagrams and/or short videos. The decision aid guides users to weigh the importance of various relevant factors in their own lives and circumstances. After considering the pros and cons of GWS and return of incidental findings, DECIDE summarizes the user's responses and apparent preferred choices. In a usability study of 16 parents who had already chosen GWS after conventional genetic counselling, all participants found DECIDE to be helpful. Many would have been satisfied to use it alone to guide their GWS decisions, but most would prefer to have the option of consulting a health care professional as well to aid their decision. Further testing is necessary to establish the effectiveness of using DECIDE as an adjunct to or instead of conventional pre-test genetic counselling for clinical genome-wide sequencing.
Analysis and design of friction stir welding tool
NASA Astrophysics Data System (ADS)
Jagadeesha, C. B.
2016-12-01
Since its inception, no formal analysis and design of the FSW tool has been reported; initial tool dimensions are typically decided by educated guess. In this work, the stresses on the tool pin were determined at optimized parameters for bead-on-plate welding of AZ31B-O Mg alloy plate. Fatigue analysis showed that the FSW tool chosen for the welding experiment does not have infinite life; its life was determined to be 2.66×10^5 cycles (revolutions). One can therefore conclude that an arbitrarily designed FSW tool generally has a finite life and cannot be assumed to last indefinitely. More generally, this type of analysis allows the suitability of a tool and its material for FSW of given workpiece materials to be assessed in advance in terms of the tool's fatigue life.
NASA Astrophysics Data System (ADS)
Glasscoe, Margaret T.; Wang, Jun; Pierce, Marlon E.; Yoder, Mark R.; Parker, Jay W.; Burl, Michael C.; Stough, Timothy M.; Granat, Robert A.; Donnellan, Andrea; Rundle, John B.; Ma, Yu; Bawden, Gerald W.; Yuen, Karen
2015-08-01
Earthquake Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response (E-DECIDER) is a NASA-funded project developing new capabilities for decision making utilizing remote sensing data and modeling software to provide decision support for earthquake disaster management and response. E-DECIDER incorporates the earthquake forecasting methodology and geophysical modeling tools developed through NASA's QuakeSim project. Remote sensing and geodetic data, in conjunction with modeling and forecasting tools, allow us to provide both long-term planning information for disaster management decision makers and short-term information following earthquake events (i.e., identifying areas where the greatest deformation and damage have occurred and where emergency services may need to be focused). This in turn is delivered through standards-compliant web services for desktop and hand-held devices.
Clinical guideline representation in a CDS: a human information processing method.
Kilsdonk, Ellen; Riezebos, Rinke; Kremer, Leontien; Peute, Linda; Jaspers, Monique
2012-01-01
The Dutch Childhood Oncology Group (DCOG) has developed evidence-based guidelines for screening childhood cancer survivors for possible late complications of treatment. These paper-based guidelines appeared to not suit clinicians' information retrieval strategies; it was thus decided to communicate the guidelines through a Computerized Decision Support (CDS) tool. To ensure high usability of this tool, an analysis of clinicians' cognitive strategies in retrieving information from the paper-based guidelines was used as requirements elicitation method. An information processing model was developed through an analysis of think aloud protocols and used as input for the design of the CDS user interface. Usability analysis of the user interface showed that the navigational structure of the CDS tool fitted well with the clinicians' mental strategies employed in deciding on survivors screening protocols. Clinicians were more efficient and more complete in deciding on patient-tailored screening procedures when supported by the CDS tool than by the paper-based guideline booklet. The think-aloud method provided detailed insight into users' clinical work patterns that supported the design of a highly usable CDS system.
Avila, M L; Brandão, L R; Williams, S; Ward, L C; Montoya, M I; Stinson, J; Kiss, A; Lara-Corrales, I; Feldman, B M
2016-08-01
Our goal was to conduct the item generation and piloting phases of a new discriminative and evaluative tool for pediatric post-thrombotic syndrome. We followed a formative model for the development of the tool, focusing on the signs/symptoms (items) that define post-thrombotic syndrome. For item generation, pediatric thrombosis experts and subjects diagnosed with extremity post-thrombotic syndrome during childhood nominated items. In the piloting phase, items were cross-sectionally measured in children with limb deep vein thrombosis to examine item performance. Twenty-three experts and 16 subjects listed 34 items, which were then measured in 140 subjects with previous diagnosis of limb deep vein thrombosis (70 upper extremity and 70 lower extremity). The items with strongest correlation with post-thrombotic syndrome severity and largest area under the curve were pain (in older children), paresthesia, and swollen limb for the upper extremity group, and pain (in older children), tired limb, heaviness, tightness and paresthesia for the lower extremity group. The diagnostic properties of the items and their correlations with post-thrombotic syndrome severity varied according to the assessed venous territory. The information gathered in this study will help experts decide which item should be considered for inclusion in the new tool. Copyright © 2016 Elsevier Ltd. All rights reserved.
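The kind of item-performance metrics mentioned (correlation with severity and area under the ROC curve) can be illustrated with the Python sketch below; the data are synthetic and this is not the study's analysis code.

# Illustration of item-performance metrics (correlation with severity and AUC).
# Data are synthetic; this is not the study's analysis code.
import numpy as np
from scipy.stats import pointbiserialr
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 140
severity = rng.integers(0, 4, size=n)                              # e.g., a 0-3 severity score
item_present = (severity + rng.normal(0, 1, n) > 1.5).astype(int)  # binary sign/symptom

# Point-biserial correlation between a binary item and the severity score
r, p = pointbiserialr(item_present, severity)

# AUC: how well the item discriminates cases with any post-thrombotic syndrome
auc = roc_auc_score((severity > 0).astype(int), item_present)

print(f"point-biserial r = {r:.2f} (p = {p:.3f}), AUC = {auc:.2f}")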
van Asselt, Thea; Ramaekers, Bram; Corro Ramos, Isaac; Joore, Manuela; Al, Maiwenn; Lesman-Leegte, Ivonne; Postma, Maarten; Vemer, Pepijn; Feenstra, Talitha
2018-01-01
The costs of performing research are an important input in value of information (VOI) analyses but are difficult to assess. The aim of this study was to investigate the costs of research, serving two purposes: (1) estimating research costs for use in VOI analyses; and (2) developing a costing tool to support reviewers of grant proposals in assessing whether the proposed budget is realistic. For granted study proposals from the Netherlands Organization for Health Research and Development (ZonMw), type of study, potential cost drivers, proposed budget, and general characteristics were extracted. Regression analysis was conducted in an attempt to generate a 'predicted budget' for certain combinations of cost drivers, for implementation in the costing tool. Of 133 drug-related research grant proposals, 74 were included for complete data extraction. Because an association between cost drivers and budgets was not confirmed, we could not generate a predicted budget based on regression analysis, but only historic reference budgets given certain study characteristics. The costing tool was designed accordingly, i.e. with given selection criteria the tool returns the range of budgets in comparable studies. This range can be used in VOI analysis to estimate whether the expected net benefit of sampling will be positive to decide upon the net value of future research. The absence of association between study characteristics and budgets may indicate inconsistencies in the budgeting or granting process. Nonetheless, the tool generates useful information on historical budgets, and the option to formally relate VOI to budgets. To our knowledge, this is the first attempt at creating such a tool, which can be complemented with new studies being granted, enlarging the underlying database and keeping estimates up to date.
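A minimal sketch of the costing tool's core behaviour as described (given selection criteria, return the range of budgets of comparable historical studies) is shown below; the column names and figures are hypothetical, not ZonMw data.

# Minimal sketch of the costing tool's core idea: given selection criteria,
# return the range of historical budgets of comparable studies.
# Column names and figures are hypothetical placeholders.
import pandas as pd

grants = pd.DataFrame({
    "study_type": ["RCT", "RCT", "observational", "RCT", "model-based"],
    "multicentre": [True, False, True, True, False],
    "budget_eur": [650_000, 310_000, 420_000, 880_000, 150_000],
})

def reference_budgets(df, **criteria):
    """Return min/median/max budget of studies matching all given criteria."""
    subset = df
    for column, value in criteria.items():
        subset = subset[subset[column] == value]
    if subset.empty:
        return None
    return subset["budget_eur"].agg(["min", "median", "max"])

print(reference_budgets(grants, study_type="RCT", multicentre=True))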
Novel Problem Solving - The NASA Solution Mechanism Guide
NASA Technical Reports Server (NTRS)
Keeton, Kathryn E.; Richard, Elizabeth E.; Davis, Jeffrey R.
2014-01-01
Over the past five years, the Human Health and Performance (HH&P) Directorate at the NASA Johnson Space Center (JSC) has conducted a number of pilot and ongoing projects in collaboration and open innovation. These projects involved the use of novel open innovation competitions that sought solutions from "the crowd": non-traditional problem solvers. The projects expanded to include virtual collaboration centers such as the NASA Human Health and Performance Center (NHHPC) and, more recently, a collaborative research project between NASA and the National Science Foundation (NSF). These novel problem-solving tools produced effective results, and the HH&P wanted to capture the knowledge from these new tools, teach the results to the directorate, and implement new project management tools and coursework. To capture and teach the results of these novel problem-solving tools, the HH&P decided to create a web-based tool to capture best practices and case studies, teach novice users how to use new problem-solving tools, and change project management training. This web-based tool was developed by a small, multi-disciplinary group and named the Solution Mechanism Guide (SMG). An alpha version was developed and tested with several user groups to gather feedback on the SMG and determine a future course for development. The feedback was very positive, and the HH&P decided to move to the beta phase of development. To develop the web-based tool, the HH&P utilized the NASA Tournament Lab (NTL) to develop the software with TopCoder under an existing contract. In this way, the HH&P is using one new tool (the NTL and TopCoder) to develop the next-generation tool, the SMG. The beta phase of the SMG is planned for release in the spring of 2014, and results of the beta-phase testing will be available for the IAC meeting in September. The SMG is intended to disrupt the way problem solvers and project managers approach problem solving and to increase the use of novel and more cost- and time-effective problem-solving tools such as open innovation, collaborative research, and virtual collaborative project centers. The HH&P envisions changing project management coursework by including the SMG in the teaching of project management problem-solving tools.
Tools, information sources, and methods used in deciding on drug availability in HMOs.
Barner, J C; Thomas, J
1998-01-01
The use and importance of specific decision-making tools, information sources, and drug-use management methods in determining drug availability and use in HMOs were studied. A questionnaire was sent to 303 randomly selected HMOs. Respondents were asked to rate their use of each of four formal decision-making tools and its relative importance, as well as the use and importance of eight information sources and 11 methods for managing drug availability and use, on a 5-point scale. The survey response rate was 28%. Approximately half of the respondents reported that their HMOs used decision analysis or multiattribute analysis in deciding on drug availability. If used, these tools were rated as very important. There were significant differences in levels of use by HMO type, membership size, and age. Journal articles and reference books were reported most often as information sources. Retrospective drug-use review was used very often and perceived to be very important in managing drug use. Other management methods were used only occasionally, but the importance placed on these tools when used ranged from moderately to very important. Older organizations used most of the management methods more often than did other HMOs. Decision analysis and multiattribute analysis were the most commonly used tools for deciding on which drugs to make available to HMO members, and reference books and journal articles were the most commonly used information sources. Retrospective and prospective drug-use reviews were the most commonly applied methods for managing HMO members' access to drugs.
Aided generation of search interfaces to astronomical archives
NASA Astrophysics Data System (ADS)
Zorba, Sonia; Bignamini, Andrea; Cepparo, Francesco; Knapic, Cristina; Molinaro, Marco; Smareglia, Riccardo
2016-07-01
Astrophysical data provider organizations that host web-based interfaces for access to data resources have to cope with possible changes in data management that imply partial rewrites of web applications. To avoid doing this manually, it was decided to develop a dynamically configurable Java EE web application that can set itself up by reading the needed information from configuration files. The specification of what information the astronomical archive database has to expose is managed using the TAP_SCHEMA schema from the IVOA TAP recommendation, which can be edited using a graphical interface. Once the configuration steps are complete, the tool builds a WAR file to allow easy deployment of the application.
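For illustration, the sketch below shows how exposing one archive column can be driven by a row in the IVOA TAP_SCHEMA metadata (here a reduced subset of TAP_SCHEMA.columns held in SQLite); this is an assumption-laden simplification, not the tool's actual configuration mechanism.

# Simplified sketch: a reduced subset of the TAP_SCHEMA.columns metadata table
# (IVOA TAP recommendation) held in SQLite. Not the tool's actual schema editor.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE tap_schema_columns (
        table_name  TEXT,
        column_name TEXT,
        datatype    TEXT,
        ucd         TEXT,
        unit        TEXT,
        principal   INTEGER,   -- 1 = show in the default result set
        indexed     INTEGER,
        std         INTEGER
    )
""")

# Expose the right-ascension column of a hypothetical observation table.
conn.execute(
    "INSERT INTO tap_schema_columns VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
    ("ivoa.obscore", "s_ra", "adql:DOUBLE", "pos.eq.ra", "deg", 1, 1, 1),
)

# A search-interface generator can now read this metadata to build its UI/queries.
for row in conn.execute("SELECT table_name, column_name, unit FROM tap_schema_columns"):
    print(row)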
ERIC Educational Resources Information Center
Dyrud, Marilyn A.; Worley, Rebecca B.; Schultz, Benjamin
2005-01-01
Blogs are communication tools; they serve as vehicles to transmit messages. Before deciding to blog, one needs to devise a strategy for how this medium will fit in with one's communication needs. This will also help later in deciding which features to include in the blog. This article discusses ways to start and…
NASA Astrophysics Data System (ADS)
Szabó, S.; Bódis, K.; Huld, T.; Moner-Girona, M.
2011-07-01
Three rural electrification options are analysed showing the cost optimal conditions for a sustainable energy development applying renewable energy sources in Africa. A spatial electricity cost model has been designed to point out whether diesel generators, photovoltaic systems or extension of the grid are the least-cost option in off-grid areas. The resulting mapping application offers support to decide in which regions the communities could be electrified either within the grid or in an isolated mini-grid. Donor programs and National Rural Electrification Agencies (or equivalent governmental departments) could use this type of delineation for their program boundaries and then could use the local optimization tools adapted to the prevailing parameters. The views expressed in this paper are those of the authors and do not necessarily represent European Commission and UNEP policy.
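A back-of-the-envelope Python sketch of the kind of least-cost comparison such a spatial model performs (diesel genset vs. PV vs. grid extension as a function of distance to the grid) is given below; all cost figures are illustrative placeholders, not the study's values.

# Back-of-the-envelope least-cost comparison for off-grid electrification.
# All cost figures are illustrative placeholders, not the study's values.

def diesel_cost_per_kwh(fuel_price=1.2, litres_per_kwh=0.3, om=0.05):
    return fuel_price * litres_per_kwh + om

def pv_cost_per_kwh(capex_per_kwp=3000.0, kwh_per_kwp_year=1800.0,
                    lifetime_years=20, om_share=0.02):
    annualised = capex_per_kwp * (1 / lifetime_years + om_share)
    return annualised / kwh_per_kwp_year

def grid_cost_per_kwh(distance_km, line_cost_per_km=15000.0,
                      annual_demand_kwh=50000.0, lifetime_years=30,
                      energy_price=0.10):
    extension = line_cost_per_km * distance_km / (lifetime_years * annual_demand_kwh)
    return energy_price + extension

for d in (2, 10, 40):
    options = {
        "diesel": diesel_cost_per_kwh(),
        "pv": pv_cost_per_kwh(),
        "grid": grid_cost_per_kwh(d),
    }
    best = min(options, key=options.get)
    print(f"{d:>3} km from grid: " +
          ", ".join(f"{k} {v:.2f} EUR/kWh" for k, v in options.items()) +
          f" -> least-cost: {best}")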
CQI: using the Hoshin planning system to design an orientation process.
Platt, D; Laird, C
1995-01-01
The Hoshin planning system, developed in Japan after World War II, includes management tools intended specifically for planning new processes. There are seven tools, which can be used individually or in any combination: affinity diagrams, interrelationship digraphs, systematic diagrams, matrix diagrams, process decision program charts, arrow diagrams and prioritization matrices. The radiology department at Carson-Tahoe Hospital formed a CQI team to improve the training of front office clerks. The team quickly discovered that a new orientation program was needed and decided to use Hoshin tools to create one. Using the tools, the team identified and prioritized all relevant factors, described specific tasks needed to complete the planning process and how long each would take, anticipated problems, and assigned areas of responsibility to members of the team. Each time the team grew weary or discouraged, the clarity and organization afforded by the tools helped them feel productive and in control of the process. The team was amazed at the creative ideas they generated through this 3-month-long process. Not only did they develop and implement a new orientation program, they also cultivated a stronger sense of pride and confidence in their work and each other.
Donated chemical probes for open science
Ackloo, Suzanne; Arrowsmith, Cheryl H; Bauser, Marcus; Baryza, Jeremy L; Blagg, Julian; Böttcher, Jark; Bountra, Chas; Brown, Peter J; Bunnage, Mark E; Carter, Adrian J; Damerell, David; Dötsch, Volker; Drewry, David H; Edwards, Aled M; Edwards, James; Elkins, Jon M; Fischer, Christian; Frye, Stephen V; Gollner, Andreas; Grimshaw, Charles E; IJzerman, Adriaan; Hanke, Thomas; Hartung, Ingo V; Hitchcock, Steve; Howe, Trevor; Hughes, Terry V; Laufer, Stefan; Li, Volkhart MJ; Liras, Spiros; Marsden, Brian D; Matsui, Hisanori; Mathias, John; O'Hagan, Ronan C; Owen, Dafydd R; Pande, Vineet; Rauh, Daniel; Rosenberg, Saul H; Roth, Bryan L; Schneider, Natalie S; Scholten, Cora; Singh Saikatendu, Kumar; Simeonov, Anton; Takizawa, Masayuki; Tse, Chris; Thompson, Paul R; Treiber, Daniel K; Viana, Amélia YI; Wells, Carrow I; Willson, Timothy M; Zuercher, William J; Knapp, Stefan
2018-01-01
Potent, selective and broadly characterized small molecule modulators of protein function (chemical probes) are powerful research reagents. The pharmaceutical industry has generated many high-quality chemical probes and several of these have been made available to academia. However, probe-associated data and control compounds, such as inactive structurally related molecules and their associated data, are generally not accessible. The lack of data and guidance makes it difficult for researchers to decide which chemical tools to choose. Several pharmaceutical companies (AbbVie, Bayer, Boehringer Ingelheim, Janssen, MSD, Pfizer, and Takeda) have therefore entered into a pre-competitive collaboration to make available a large number of innovative high-quality probes, including all probe-associated data, control compounds and recommendations on use (https://openscienceprobes.sgc-frankfurt.de/). Here we describe the chemical tools and target-related knowledge that have been made available, and encourage others to join the project. PMID:29676732
Senol Cali, Damla; Kim, Jeremie S; Ghose, Saugata; Alkan, Can; Mutlu, Onur
2018-04-02
Nanopore sequencing technology has the potential to render other sequencing technologies obsolete with its ability to generate long reads and provide portability. However, high error rates of the technology pose a challenge while generating accurate genome assemblies. The tools used for nanopore sequence analysis are of critical importance, as they should overcome the high error rates of the technology. Our goal in this work is to comprehensively analyze current publicly available tools for nanopore sequence analysis to understand their advantages, disadvantages and performance bottlenecks. It is important to understand where the current tools do not perform well to develop better tools. To this end, we (1) analyze the multiple steps and the associated tools in the genome assembly pipeline using nanopore sequence data, and (2) provide guidelines for determining the appropriate tools for each step. Based on our analyses, we make four key observations: (1) the choice of the tool for basecalling plays a critical role in overcoming the high error rates of nanopore sequencing technology. (2) Read-to-read overlap finding tools, GraphMap and Minimap, perform similarly in terms of accuracy. However, Minimap has a lower memory usage, and it is faster than GraphMap. (3) There is a trade-off between accuracy and performance when deciding on the appropriate tool for the assembly step. The fast but less accurate assembler Miniasm can be used for quick initial assembly, and further polishing can be applied on top of it to increase the accuracy, which leads to faster overall assembly. (4) The state-of-the-art polishing tool, Racon, generates high-quality consensus sequences while providing a significant speedup over another polishing tool, Nanopolish. We analyze various combinations of different tools and expose the trade-offs between accuracy, performance, memory usage and scalability. We conclude that our observations can guide researchers and practitioners in making conscious and effective choices for each step of the genome assembly pipeline using nanopore sequence data. Also, with the help of bottlenecks we have found, developers can improve the current tools or build new ones that are both accurate and fast, to overcome the high error rates of the nanopore sequencing technology.
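The overlap-layout-polish pipeline discussed above can be sketched as follows; the invocations reflect typical community usage (with minimap2 standing in for the overlap/mapping step), and the exact flags and file names are illustrative rather than the paper's benchmark commands.

# Sketch of the overlap -> layout -> polish pipeline (Minimap/Miniasm assembly,
# Racon polishing). Flags and file names are illustrative typical usage.
import subprocess

reads = "reads.fastq"

# 1) All-vs-all read overlaps (ONT preset), then layout with miniasm.
with open("overlaps.paf", "w") as out:
    subprocess.run(["minimap2", "-x", "ava-ont", reads, reads], stdout=out, check=True)
with open("assembly.gfa", "w") as out:
    subprocess.run(["miniasm", "-f", reads, "overlaps.paf"], stdout=out, check=True)

# 2) Extract contig sequences from the GFA ('S' lines) into FASTA.
with open("assembly.gfa") as gfa, open("assembly.fasta", "w") as fasta:
    for line in gfa:
        if line.startswith("S"):
            _, name, seq = line.split("\t")[:3]
            fasta.write(f">{name}\n{seq}\n")

# 3) Map reads back to the draft and polish one round with racon.
with open("mapping.paf", "w") as out:
    subprocess.run(["minimap2", "-x", "map-ont", "assembly.fasta", reads],
                   stdout=out, check=True)
with open("polished.fasta", "w") as out:
    subprocess.run(["racon", reads, "mapping.paf", "assembly.fasta"],
                   stdout=out, check=True)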
Serendipia: Castilla-La Mancha telepathology network
Peces, Carlos; García-Rojo, Marcial; Sacristán, José; Gallardo, Antonio José; Rodríguez, Ambrosio
2008-01-01
Nowadays, there is no standard solution for acquiring, archiving, and communicating Pathology digital images. In addition, no commercial Pathology Information System (LIS) can manage the relationship between the reports generated by the pathologist and their corresponding images. Given this situation, the Healthcare Service of Castilla-La Mancha decided to create a completely digital Pathology Department; the project is called SERENDIPIA. The SERENDIPIA project provides all the image-acquiring devices needed to cover every kind of image that can be generated in a Pathology Department. In addition, within the SERENDIPIA project an Information System was developed that, on the one hand, covers the daily workflow of a Pathology Department (including the storage and management of reports and their images) and, on the other hand, provides a web telepathology portal with collaborative tools such as second opinion. PMID:18673519
Multicriteria analysis for the selection of the most appropriate energy crops: the case of Cyprus
NASA Astrophysics Data System (ADS)
Kylili, Angeliki; Christoforou, Elias; Fokaides, Paris A.; Polycarpou, Polycarpos
2016-01-01
Energy crops are considered key actors in meeting the international and European carbon reduction targets, increasing national energy security through renewable energy production, mitigating climate change impacts, and promoting sustainability. Multicriteria analysis is a suitable decision-making tool for the energy sector, where final decisions have to account for a range of aspects, and it can also be used for deciding on appropriate energy crops. In this paper, a popular multicriteria method, PROMETHEE, is employed to identify the most suitable energy crops for exploitation in Cyprus. The criteria and their weights are defined, and accordingly five different scenarios are developed and examined. The results indicate that the promotion of second-generation energy crops is better aligned with the set objectives, as well as more sustainable, than the exploitation of any first-generation energy crop.
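A minimal PROMETHEE II sketch (usual preference function, maximising criteria only) is given below to illustrate the outranking computation behind the method; the crop names, scores and weights are made up, not the study's data.

# Minimal PROMETHEE II sketch (usual preference function, maximising criteria
# only). Crop names, scores and weights are made up, not the study's data.
import numpy as np

alternatives = ["miscanthus", "switchgrass", "cardoon"]
# Rows: alternatives; columns: criteria (e.g., yield, water need*, GHG saving)
# *water need is entered as its negative so that "larger is better" everywhere.
scores = np.array([
    [14.0, -450.0, 0.8],
    [10.0, -300.0, 0.7],
    [ 9.0, -250.0, 0.5],
])
weights = np.array([0.5, 0.3, 0.2])  # must sum to 1

n = len(alternatives)
pi = np.zeros((n, n))  # aggregated preference of alternative a over b
for a in range(n):
    for b in range(n):
        if a == b:
            continue
        pref = (scores[a] > scores[b]).astype(float)  # usual criterion
        pi[a, b] = np.dot(weights, pref)

phi_plus = pi.sum(axis=1) / (n - 1)   # leaving (positive) flow
phi_minus = pi.sum(axis=0) / (n - 1)  # entering (negative) flow
net_flow = phi_plus - phi_minus

for name, phi in sorted(zip(alternatives, net_flow), key=lambda t: -t[1]):
    print(f"{name:12s} net flow = {phi:+.3f}")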
Novel integrative genomic tool for interrogating lithium response in bipolar disorder
Hunsberger, J G; Chibane, F L; Elkahloun, A G; Henderson, R; Singh, R; Lawson, J; Cruceanu, C; Nagarajan, V; Turecki, G; Squassina, A; Medeiros, C D; Del Zompo, M; Rouleau, G A; Alda, M; Chuang, D-M
2015-01-01
We developed a novel integrative genomic tool called GRANITE (Genetic Regulatory Analysis of Networks Investigational Tool Environment) that can effectively analyze large complex data sets to generate interactive networks. GRANITE is an open-source tool and invaluable resource for a variety of genomic fields. Although our analysis is confined to static expression data, GRANITE has the capability of evaluating time-course data and generating interactive networks that may shed light on acute versus chronic treatment, as well as evaluating dose response and providing insight into mechanisms that underlie therapeutic versus sub-therapeutic doses or toxic doses. As a proof-of-concept study, we investigated lithium (Li) response in bipolar disorder (BD). BD is a severe mood disorder marked by cycles of mania and depression. Li is one of the most commonly prescribed and decidedly effective treatments for many patients (responders), although its mode of action is not yet fully understood, nor is it effective in every patient (non-responders). In an in vitro study, we compared vehicle versus chronic Li treatment in patient-derived lymphoblastoid cells (LCLs) (derived from either responders or non-responders) using both microRNA (miRNA) and messenger RNA gene expression profiling. We present both Li responder and non-responder network visualizations created by our GRANITE analysis in BD. We identified by network visualization that the Let-7 family is consistently downregulated by Li in both groups where this miRNA family has been implicated in neurodegeneration, cell survival and synaptic development. We discuss the potential of this analysis for investigating treatment response and even providing clinicians with a tool for predicting treatment response in their patients, as well as for providing the industry with a tool for identifying network nodes as targets for novel drug discovery. PMID:25646593
Decision-making tool for applying adaptive traffic control systems : final report.
DOT National Transportation Integrated Search
2016-03-01
Adaptive traffic signal control technologies have been increasingly deployed in real world situations. The objective of this project was to develop a decision-making tool to guide traffic engineers and decision-makers who must decide whether or not a...
From Modelling to Execution of Enterprise Integration Scenarios: The GENIUS Tool
NASA Astrophysics Data System (ADS)
Scheibler, Thorsten; Leymann, Frank
One of the predominant problems IT companies are facing today is Enterprise Application Integration (EAI). Most of the infrastructures built to tackle integration issues are proprietary because no standards exist for how to model, develop, and actually execute integration scenarios. EAI patterns are gaining importance for non-technical business users to ease and harmonize the development of EAI scenarios. These patterns describe recurring EAI challenges and propose possible solutions in an abstract way. Therefore, one can use these patterns to describe enterprise architectures in a technology-neutral manner. However, patterns are documentation only, used by developers and systems architects to decide how to implement an integration scenario manually. Thus, patterns are not intended to stand for artefacts that can be executed immediately. This paper presents a tool supporting a method by which EAI patterns can be used to generate executable artefacts for various target platforms automatically, using a model-driven development approach, hence turning patterns into something executable. We therefore introduce a continuous tool chain beginning at the design phase and ending with the execution of an integration solution in a completely automatic manner. For evaluation purposes we introduce a scenario demonstrating how the tool is utilized for modelling and actually executing an integration scenario.
ERIC Educational Resources Information Center
Marshall, Stephanie Pace
2010-01-01
This article offers a personal vision and conceptual design for reimagining specialized science, technology, engineering, and mathematics (STEM) academies designed to nurture "decidedly different" STEM minds and ignite a new generation of global STEM talent, innovation, and entrepreneurial leadership. This design enables students to engage…
ERIC Educational Resources Information Center
New Teacher Project, 2011
2011-01-01
This "Rating a Teacher Observation Tool" identifies five simple questions and provides an easy-to-use scorecard to help policymakers decide whether an observation framework is likely to produce fair and accurate results. The five questions are: (1) Do the criteria and tools cover the classroom performance areas most connected to student outcomes?…
ERIC Educational Resources Information Center
Baldwin, Grover H.
The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…
77 FR 292 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-04
… Pricing Tool. Ultimately, CMS decides whether to approve the plan pricing (i.e., payment and premium) …: Bid Pricing Tool (BPT) for Medicare Advantage (MA) Plans and Prescription Drug Plans (PDP); Use: Under … to submit an actuarial pricing "bid" for each plan offered to Medicare beneficiaries for approval…
Illuminating Apps for Fourth Grade
ERIC Educational Resources Information Center
Lennex, Lesia; Bodenlos, Emily
2014-01-01
Elementary science is chock-full of wonderful experiences for students. Do children see iPads as a tool for learning about science? Using Prensky (2010) as a guide, the researchers decided to see if "assessing students with their own" tools (p.178) using iPad apps would support learning discrete knowledge for electricity and light…
Automatic 3D virtual scenes modeling for multisensors simulation
NASA Astrophysics Data System (ADS)
Latger, Jean; Le Goff, Alain; Cathala, Thierry; Larive, Mathieu
2006-05-01
SEDRIS, which stands for Synthetic Environment Data Representation and Interchange Specification, is a DoD/DMSO initiative to federate and make interoperable 3D mock-ups in the frame of virtual reality and simulation. This paper shows an original application of the SEDRIS concept for physical multi-sensor research simulation, whereas SEDRIS is more classically known for training simulation. CHORALE (simulated Optronic Acoustic Radar battlefield) is used by the French DGA/DCE (Directorate for Test and Evaluation of the French Ministry of Defense) to perform multi-sensor simulations. CHORALE enables the user to create virtual and realistic multi-spectral 3D scenes and to generate the physical signal received by a sensor, typically an IR sensor. In the scope of this CHORALE workshop, the French DGA has decided to introduce a SEDRIS-based 3D terrain modeling tool that automatically creates 3D databases directly usable by the physical sensor simulation renderers of CHORALE. This AGETIM tool turns geographical source data (including GIS facilities) into meshed geometry enhanced with the sensor physical extensions, fitted to the ray tracing rendering of CHORALE for the infrared, electromagnetic and acoustic spectra. The basic idea is to enhance the 2D source level directly with the physical data, rather than enhancing the 3D meshed level, which is more efficient (rapid database generation) and more reliable (the database can be regenerated many times, changing only some parameters). The paper concludes with the latest evolution of AGETIM in the scope of mission rehearsal for urban warfare using sensors. This evolution includes indoor modeling for the automatic generation of the inner parts of buildings.
NASA Astrophysics Data System (ADS)
Bediaga, I.; Miranda, J.; dos Reis, A. C.; Bigi, I. I.; Gomes, A.; Otalora Goicochea, J. M.; Veiga, A.
2012-08-01
The “Miranda procedure” proposed for analyzing Dalitz plots for CP asymmetries in charged B and D decays in a model-independent manner is extended and refined in this paper. The complexity of Cabibbo-Kobayashi-Maskawa CP phenomenology through order λ^6 is needed in searches for new dynamics (ND). Detailed analyses of three-body final states offer great advantages: (i) They give us more powerful tools for deciding whether an observed CP asymmetry represents the manifestation of ND and its features. (ii) Many advantages can already be obtained by the Miranda procedure without construction of a detailed Dalitz plot description. (iii) One studies CP asymmetries independent of production asymmetries. We illustrate the power of a second-generation Miranda procedure with examples with time-integrated rates for B_d/B̄_d decays to final states K_S π^+ π^- as trial runs, with comments on B^± → K^± π^+ π^- / K^± K^+ K^-.
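For orientation, one common form of the per-bin comparison underlying the Miranda procedure (not necessarily the exact definition refined in this paper) is

S_{CP}^{i} = \frac{N_i - \alpha \bar{N}_i}{\sqrt{N_i + \alpha^2 \bar{N}_i}}, \qquad \alpha = \frac{\sum_i N_i}{\sum_i \bar{N}_i},

where N_i and \bar{N}_i are the particle and antiparticle yields in Dalitz-plot bin i and \alpha removes any global normalization (production) asymmetry. In the absence of CP violation the S_{CP}^{i} are approximately standard-normal distributed, so \sum_i (S_{CP}^{i})^2 can be tested against a \chi^2 distribution.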
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pardo-Bosch, Francesc, E-mail: francesc.pardo@upc.edu; Political Science Department, University of California - Berkeley; Aguado, Antonio, E-mail: antonio.aguado@upc.edu
Infrastructure construction, one of the biggest driving forces of the economy nowadays, requires thorough analysis and clear transparency to decide which projects should be executed with the few resources available. With the aim of providing public administrations with a tool that makes their decisions easier, the Sustainability Index of Infrastructure Projects (SIIP) has been defined using a multi-criteria decision system called MIVES, in order to classify non-uniform investments. This index evaluates, in two inseparable stages, the contribution of each infrastructure project to sustainable development, analyzing its social, environmental and economic impact. The resulting SIIP allows the order in which projects will be prioritized to be decided. The case study developed proves the adaptability and utility of this tool for ordinary budget management.
Application of Spacesuit Glove Requirements Tools to Athletic and Personal Protective Equipment
NASA Technical Reports Server (NTRS)
England, Scott; Benson, Elizabeth; Melsoh, Miranda; Thompson, Shelby; Rajulu, Sudhakar
2010-01-01
Despite decades of ongoing improvement, astronauts must still struggle with inhibited dexterity and accelerated fatigue due to the requirement of wearing a pressurized Extra-Vehicular Activity (EVA) glove. Recent research in the Anthropometry and Biomechanics Facility at NASA's Johnson Space Center has focused on developing requirements for improvements in the design of the next generation of EVA glove. In the course of this research, it was decided to expand the scope of the testing to include a variety of commercially available athletic and consumer gloves to help provide a more recognizable comparison for investigators and designers to evaluate the current state of EVA glove mobility and strength. This comparison is being provided with the hope that innovative methods may help commercial development of gloves for various athletic and personal protective endeavors.
Comparative Study of Platforms for E-Learning in the Higher Education
ERIC Educational Resources Information Center
Mondejar-Jimenez, Jose; Mondejar-Jimenez, Juan-Antonio; Vargas-Vargas, Manuel; Meseguer-Santamaria, Maria-Leticia
2008-01-01
Castilla-La Mancha University has decided to implement two tools, WebCT and Moodle, from which its "Virtual Campus" has emerged: www.campusvirtual.ulcm.es. This paper is dedicated to the analysis of said tool as a primary mode of e-learning expansion in the university environment. It can be used to carry out standard educational university activities…
The Social Network: Keeping in Touch with Alumni through Online Media
ERIC Educational Resources Information Center
Bunker, Matt
2011-01-01
Not all social-networking tools are created equal. Knowing where alumni are and what they're doing online is key when deciding what social networks to use. Knowing how to address and employ social networking can change the way institutions engage alumni. Social media help institutions connect with alumni; these tools help build, sustain, and even…
ERIC Educational Resources Information Center
Milshtein, Amy
2001-01-01
Examines development considerations and tips for controlling costs when a university decides to develop an online distance learning service. Use of the interactive Web Site for Determining Costs tool for unveiling hidden costs is highlighted. (GR)
Span, Marijke; Hettinga, Marike; Groen-van de Ven, Leontine; Jukema, Jan; Janssen, Ruud; Vernooij-Dassen, Myrra; Eefsting, Jan; Smits, Carolien
2018-06-01
The aim of this study was to gain insight into the participatory design approach of involving people with dementia in the development of the DecideGuide, an interactive web tool facilitating shared decision-making in their care networks. An explanatory case study design was used when developing the DecideGuide. A secondary analysis focused on the data gathered from the participating people with dementia during the development stages: semi-structured interviews (n = 23), four focus group interviews (n = 18), usability tests (n = 3), and a field study (n = 4). Content analysis was applied to the data. Four themes proved important regarding the experience of involving people with dementia in research: valuable feedback on the content and design of the DecideGuide, motivation to participate, perspectives of people with dementia and others about distress related to involvement, and time investment. People with dementia can give essential feedback and, therefore, their contribution is useful and valuable. Meaningful participation of people with dementia takes time, and that should be taken into account. It is important for people with dementia to be able to reciprocate the efforts others make and to feel of significance to others. Implications for Rehabilitation: People with dementia can contribute meaningfully to the content and design, and their perspective is essential for developing useful and user-friendly tools. Participating in research activities may contribute to social inclusion, empowerment, and quality of life of people with dementia.
Automatic Methods and Tools for the Verification of Real Time Systems
1997-11-30
We developed formal methods and tools for the verification of real-time systems. This was accomplished by extending techniques, based on automata… …embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous… …real-time systems, and we identified the exact boundary between decidability and undecidability of real-time reasoning.
Donated chemical probes for open science.
Müller, Susanne; Ackloo, Suzanne; Arrowsmith, Cheryl H; Bauser, Marcus; Baryza, Jeremy L; Blagg, Julian; Böttcher, Jark; Bountra, Chas; Brown, Peter J; Bunnage, Mark E; Carter, Adrian J; Damerell, David; Dötsch, Volker; Drewry, David H; Edwards, Aled M; Edwards, James; Elkins, Jon M; Fischer, Christian; Frye, Stephen V; Gollner, Andreas; Grimshaw, Charles E; IJzerman, Adriaan; Hanke, Thomas; Hartung, Ingo V; Hitchcock, Steve; Howe, Trevor; Hughes, Terry V; Laufer, Stefan; Li, Volkhart Mj; Liras, Spiros; Marsden, Brian D; Matsui, Hisanori; Mathias, John; O'Hagan, Ronan C; Owen, Dafydd R; Pande, Vineet; Rauh, Daniel; Rosenberg, Saul H; Roth, Bryan L; Schneider, Natalie S; Scholten, Cora; Singh Saikatendu, Kumar; Simeonov, Anton; Takizawa, Masayuki; Tse, Chris; Thompson, Paul R; Treiber, Daniel K; Viana, Amélia Yi; Wells, Carrow I; Willson, Timothy M; Zuercher, William J; Knapp, Stefan; Mueller-Fahrnow, Anke
2018-04-20
Potent, selective and broadly characterized small molecule modulators of protein function (chemical probes) are powerful research reagents. The pharmaceutical industry has generated many high-quality chemical probes and several of these have been made available to academia. However, probe-associated data and control compounds, such as inactive structurally related molecules and their associated data, are generally not accessible. The lack of data and guidance makes it difficult for researchers to decide which chemical tools to choose. Several pharmaceutical companies (AbbVie, Bayer, Boehringer Ingelheim, Janssen, MSD, Pfizer, and Takeda) have therefore entered into a pre-competitive collaboration to make available a large number of innovative high-quality probes, including all probe-associated data, control compounds and recommendations on use (https://openscienceprobes.sgc-frankfurt.de/). Here we describe the chemical tools and target-related knowledge that have been made available, and encourage others to join the project.
Integration of an expert system into a user interface language demonstration
NASA Technical Reports Server (NTRS)
Stclair, D. C.
1986-01-01
The need for a User Interface Language (UIL) has been recognized by the Space Station Program Office as a necessary tool to aid in minimizing the cost of software generation by multiple users. Previous history in the Space Shuttle Program has shown that many different areas of software generation, such as operations, integration, and testing, each used a different user command language although the types of operations being performed were similar in many respects. Since the Space Station represents a much more complex software task, a common user command language--a user interface language--is required to support the large spectrum of Space Station software developers and users. To assist in the selection of an appropriate set of definitions for a UIL, a series of demonstration programs was generated with which to test UIL concepts against specific Space Station scenarios, using operators from the astronaut and scientific community. Because of the importance of expert systems in the Space Station, it was decided that an expert system should be embedded in the UIL. This would not only provide insight into the required UIL components but would also indicate the effectiveness with which an expert system could function in such an environment.
Applying Results Findings: The Recovery Potential Project
The document describes a pilot study using the Illinois 303(d) listed waters, aimed at developing tools and data to help state TMDL and restoration programs decide where best to use their limited restoration resources.
Destination memory for self-generated actions.
El Haj, Mohamad
2016-10-01
There is a substantial body of literature showing memory enhancement for self-generated information in normal aging. The present paper investigated this outcome for destination memory or memory for outputted information. In Experiment 1, younger adults and older adults had to place (self-generated actions) and observe an experimenter placing (experiment-generated actions) items into two different destinations (i.e., a black circular box and a white square box). On a subsequent recognition task, the participants had to decide into which box each item had originally been placed. These procedures showed better destination memory for self- than experimenter-generated actions. In Experiment 2, destination and source memory were assessed for self-generated actions. Younger adults and older adults had to place items into the two boxes (self-generated actions), take items out of the boxes (self-generated actions), and observe an experimenter taking items out of the boxes (experiment-generated actions). On a subsequent recognition task, they had to decide into which box (destination memory)/from which box (source memory) each item had originally been placed/taken. For both populations, source memory was better than destination memory for self-generated actions, and both were better than source memory for experimenter-generated actions. Taken together, these findings highlight the beneficial effect of self-generation on destination memory in older adults.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-05
...) received a request from Basin Electric Power Cooperative (Basin Electric) to modify its Large Generator Interconnection Agreement (LGIA) with Basin Electric for the Groton Generation Station to eliminate current... considered the environmental impacts and has decided to modify its LGIA with Basin Electric for the Groton...
Hierarchical dispatch using two-stage optimisation for electricity markets in smart grid
NASA Astrophysics Data System (ADS)
Yang, Jie; Zhang, Guoshan; Ma, Kai
2016-11-01
This paper proposes a hierarchical dispatch method for the electricity markets consisting of wholesale markets and retail markets. In the wholesale markets, the generators and the retailers decide the generation and the purchase according to the market-clearing price. In the retail markets, the retailers set the retail price to adjust the electricity consumption of the consumers. Due to the two-way communications in smart grid, the retailers can decide the electricity purchase from the wholesale markets based on the information on electricity usage of consumers in the retail markets. We establish the hierarchical dispatch model for the wholesale markets and the retail markets and develop distributed algorithms to search for the optimal generation, purchase, and consumption. Numerical results show the balance between the supply and demand, the profits of the retailers, and the convergence of the distributed algorithms.
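A minimal sketch of the two-level price-response idea described above, not the authors' algorithm: a wholesale price is iterated until the generator's profit-maximising output matches the demand that consumers choose under the retailer's marked-up price. The quadratic generation cost, logarithmic consumer utility, fixed retail markup, and all parameter values are illustrative assumptions.

```python
# Minimal sketch (not the authors' algorithm): a wholesale price iteration where
# generators and consumers respond to prices until supply meets demand.
# Cost/utility forms, parameters, and the retail markup are illustrative assumptions.

def generation(p_w, a=0.5):
    """Profit-maximising output for quadratic cost a*g^2: argmax p_w*g - a*g^2."""
    return p_w / (2 * a)

def consumption(p_r, u=10.0):
    """Demand maximising u*ln(1+d) - p_r*d, clamped at zero."""
    return max(u / p_r - 1.0, 0.0)

def hierarchical_dispatch(markup=1.2, step=0.05, tol=1e-6, max_iter=10000):
    p_w = 1.0                                   # initial wholesale market-clearing price
    for _ in range(max_iter):
        supply = generation(p_w)                # generator's decision in the wholesale market
        p_r = markup * p_w                      # retailer passes the price through with a markup
        demand = consumption(p_r)               # consumer's decision in the retail market
        if abs(demand - supply) < tol:
            break
        p_w += step * (demand - supply)         # raise price on shortage, lower on surplus
    return p_w, supply, demand

if __name__ == "__main__":
    p_w, supply, demand = hierarchical_dispatch()
    print(f"wholesale price={p_w:.3f}, supply={supply:.3f}, demand={demand:.3f}")
```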
MetaMeta: integrating metagenome analysis tools to improve taxonomic profiling.
Piro, Vitor C; Matschkowski, Marcel; Renard, Bernhard Y
2017-08-14
Many metagenome analysis tools are presently available to classify sequences and profile environmental samples. In particular, taxonomic profiling and binning methods are commonly used for such tasks. Tools in these two categories make use of several techniques, e.g., read mapping, k-mer alignment, and composition analysis. Variations in the construction of the corresponding reference sequence databases are also common. In addition, different tools provide good results on different datasets and configurations. All this variation creates a complicated scenario for researchers deciding which methods to use. Installation, configuration, and execution can also be difficult, especially when dealing with multiple datasets and tools. We propose MetaMeta: a pipeline to execute and integrate results from metagenome analysis tools. MetaMeta provides an easy workflow to run multiple tools with multiple samples, producing a single enhanced output profile for each sample. MetaMeta includes database generation, pre-processing, execution, and integration steps, allowing easy execution and parallelization. The integration relies on the co-occurrence of organisms from different methods as the main feature to improve community profiling while accounting for differences in their databases. In a controlled case with simulated and real data, we show that the integrated profiles of MetaMeta outperform the best single profile. Using the same input data, it provides more sensitive and reliable results, with the presence of each organism being supported by several methods. MetaMeta uses Snakemake and has six pre-configured tools, all available through the BioConda channel for easy installation (conda install -c bioconda metameta). The MetaMeta pipeline is open-source and can be downloaded at https://gitlab.com/rki_bioinformatics .
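As a rough illustration of the co-occurrence idea (this is not MetaMeta's actual integration algorithm), the sketch below merges several per-tool taxonomic profiles by keeping only taxa reported by a minimum number of tools and averaging their relative abundances; the `min_support` threshold and the toy profiles are assumptions.

```python
# Illustrative sketch only (not MetaMeta's actual integration algorithm): merge
# per-tool taxonomic profiles by keeping taxa that co-occur in at least
# `min_support` tool outputs and averaging their relative abundances.
from collections import defaultdict

def merge_profiles(profiles, min_support=2):
    """profiles: list of dicts mapping taxon name -> relative abundance (one per tool)."""
    support = defaultdict(int)
    total = defaultdict(float)
    for profile in profiles:
        for taxon, abundance in profile.items():
            support[taxon] += 1
            total[taxon] += abundance
    merged = {t: total[t] / support[t] for t in support if support[t] >= min_support}
    norm = sum(merged.values()) or 1.0
    return {t: a / norm for t, a in merged.items()}      # re-normalise to sum to 1

if __name__ == "__main__":
    tool_a = {"E. coli": 0.6, "B. subtilis": 0.3, "Spurious sp.": 0.1}
    tool_b = {"E. coli": 0.5, "B. subtilis": 0.5}
    tool_c = {"E. coli": 0.7, "B. subtilis": 0.2, "Other sp.": 0.1}
    print(merge_profiles([tool_a, tool_b, tool_c]))
```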
Collaborative writing: Tools and tips.
Eapen, Bell Raj
2007-01-01
The majority of technical writing is done by groups of experts, and various web-based applications have made this collaboration easy. Email exchange of word processor documents with tracked changes used to be the standard technique for collaborative writing. However, web-based tools like Google Docs and Spreadsheets have made the process fast and efficient. Various versioning tools and synchronous editors are available for those who need additional functionality. Having a group leader who decides the scheduling, communication, and conflict-resolution protocols is important for successful collaboration.
Ventromedial Hypothalamus and the Generation of Aggression
Hashikawa, Yoshiko; Hashikawa, Koichi; Falkner, Annegret L.; Lin, Dayu
2017-01-01
Aggression is a costly behavior, sometimes with severe consequences including death. Yet aggression is prevalent across animal species ranging from insects to humans, demonstrating its essential role in the survival of individuals and groups. The question of how the brain decides when to generate this costly behavior has intrigued neuroscientists for over a century and has led to the identification of relevant neural substrates. Various lesion and electric stimulation experiments have revealed that the hypothalamus, an ancient structure situated deep in the brain, is essential for expressing aggressive behaviors. More recently, studies using precise circuit manipulation tools have identified a small subnucleus in the medial hypothalamus, the ventrolateral part of the ventromedial hypothalamus (VMHvl), as a key structure for driving both aggression and aggression-seeking behaviors. Here, we provide an updated summary of the evidence that supports a role of the VMHvl in aggressive behaviors. We will consider our recent findings detailing the physiological response properties of populations of VMHvl cells during aggressive behaviors and provide new understanding regarding the role of the VMHvl embedded within the larger whole-brain circuit for social sensation and action. PMID:29375329
A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing
NASA Technical Reports Server (NTRS)
Takaki, Mitsuo; Cavalcanti, Diego; Gheyi, Rohit; Iyoda, Juliano; dAmorim, Marcelo; Prudencio, Ricardo
2009-01-01
The complexity of constraints is a major obstacle for constraint-based software verification. Automatic constraint solvers are fundamentally incomplete: input constraints often build on some undecidable theory or some theory the solver does not support. This paper proposes and evaluates several randomized solvers to address this issue. We compare the effectiveness of a symbolic solver (CVC3), a random solver, three hybrid solvers (i.e., a mix of random and symbolic), and two heuristic search solvers. We evaluate the solvers on two benchmarks: one consisting of manually generated constraints and another generated with a concolic execution of 8 subjects. In addition to fully decidable constraints, the benchmarks include constraints with non-linear integer arithmetic, integer modulo and division, bitwise arithmetic, and floating-point arithmetic. As expected, symbolic solving (in particular, CVC3) subsumes the other solvers for the concolic execution of subjects that only generate decidable constraints. For the remaining subjects, the solvers are complementary.
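A minimal sketch of the purely random solver idea under stated assumptions (bounded integer domains and a fixed sample budget; not the paper's implementation): candidate assignments are drawn uniformly and the constraint is tested as an executable predicate, which works even when the underlying theory is hard or undecidable for symbolic solvers.

```python
# Minimal sketch of a purely random solver for constraints expressed as executable
# predicates (mirrors the idea, not the paper's implementation). The variable
# domain bounds and the sample budget are illustrative assumptions.
import random

def random_solve(constraint, variables, low=-1000, high=1000, budget=100000, seed=0):
    """Try random integer assignments until `constraint` is satisfied or the budget runs out."""
    rng = random.Random(seed)
    for _ in range(budget):
        assignment = {v: rng.randint(low, high) for v in variables}
        if constraint(**assignment):
            return assignment
    return None

if __name__ == "__main__":
    # Non-linear constraint with integer modulo, awkward for many symbolic solvers.
    sat = random_solve(lambda x, y: x * x + y % 7 == 53 and y != 0, ["x", "y"])
    print(sat)
```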
Implementation of a Distributed Object-Oriented Database Management System
1989-03-01
and heuristic algorithms. A method for determining unit allocation by splitting relations in the conceptual schema based on queries and updates is...level frameworks can provide to the user the appearance of many tools being closely integrated. In particular, the KBSA tools use many high-level...development process should begin first with conceptual design of the system. Approximately one month should be used to decide how the new projects
Henderson, Vida A; Barr, Kathryn L; An, Lawrence C; Guajardo, Claudia; Newhouse, William; Mase, Rebecca; Heisler, Michele
2013-01-01
Together, community-based participatory research (CBPR), user-centered design (UCD), and health information technology (HIT) offer promising approaches to improve health disparities in low-resource settings. This article describes the application of CBPR and UCD principles to the development of iDecide/Decido, an interactive, tailored, web-based diabetes medication education and decision support tool delivered by community health workers (CHWs) to African American and Latino participants with diabetes in Southwest and Eastside Detroit. The decision aid is offered in English or Spanish and is delivered on an iPad in participants' homes. The overlapping principles of CBPR and UCD used to develop iDecide/Decido include a user-focused or community approach, equitable academic and community partnership in all study phases, an iterative development process that relies on input from all stakeholders, and a program experience that is specified, adapted, and implemented with the target community. Collaboration between community members, researchers, and developers is especially evident in the program's design concept, animations, pictographs, issue cards, goal setting, tailoring, and additional CHW tools. The principles of CBPR and UCD can be successfully applied in developing health information tools that are easy to use and understand, interactive, and target health disparities.
The DECIDE evidence to recommendation framework adapted to the public health field in Sweden.
Guldbrandsson, Karin; Stenström, Nils; Winzer, Regina
2016-12-01
Organizations worldwide compile results from scientific studies and grade the evidence of interventions in order to assist policy makers. However, quality of evidence alone is seldom sufficient to make a recommendation. The Developing and Evaluating Communication Strategies to Support Informed Decisions and Practice Based on Evidence (DECIDE) framework aims to facilitate decision making and to improve dissemination and implementation of recommendations in the healthcare and public health sectors. The aim of this study was to investigate whether the DECIDE framework is applicable in the public health field in Sweden. The DECIDE framework was presented and discussed in interviews with stakeholders and governmental organizations and tested in panels. Content analyses were performed. In general, the informants were positive toward the DECIDE framework. However, two questions, the first regarding individual autonomy and the second regarding method sustainability, were felt by the stakeholders to be missing from the framework. The informants highlighted the importance of the composition of the DECIDE stakeholder panel, as well as the significant role of the chair. Further, the informants raised concerns about the general lack of research evidence based on RCT designs regarding universal methods in the public health sector. Finally, the informants raised the question of the local, regional, and national levels' responsibility for dissemination and implementation of recommendations. The DECIDE framework might be useful as a tool for dissemination and implementation of recommendations in the public health field in Sweden. Important questions for further research are whether these findings hold for other public health topics and in other public health settings.
3D satellite puzzles for young and old kids
NASA Astrophysics Data System (ADS)
Biondi, Riccardo; Galoforo, Germana
2017-04-01
The Italian Space Agency (ASI) is active in outreach, aiming to increase the interest of young generations and the general public in space activities. ASI proposes educational programmes for supporting and encouraging the development of a European society based on knowledge, inspiring and motivating the young generations. One of the initiatives promoted by ASI in this regard is the 3D satellite puzzles. The idea was born in 2007 from the will to conceive an educational product for promoting and explaining to students the small all-Italian mission AGILE (Astrorivelatore Gamma ad Immagini ultra Leggero), conceived as a tool for students aged 8-13. Working with this age group is very productive in terms of the imprint left on the kids; it is useful to produce things they can use, touch, and play with, taking an active approach instead of a passive one. Therefore it was decided to produce something that kids could build and use at home with their parents or friends, or all together at school with teachers and classmates. Other puzzles followed AGILE: one about the COSMO-SkyMed Earth observation satellites and also a broader one of the International Space Station. During these 10 years the puzzles were mostly used as outreach tools for school children, but they surprisingly also met with great success among older generations. So far the 3D puzzles have been printed in more than 10 thousand copies and distributed for free to students of hundreds of schools in Italy, and to the general public through science associations, planetaria, and museums. Recently they have also been used during special events such as the international Geoscience Communication School (as a best-practice outreach tool), EXPO 2015, and the European Researchers' Night at the Parlamentarium in Brussels in 2016. While the students are building the puzzles, the tutor explains to them the different components that they are assembling, what the importance of the satellite is, and how it works. It is interesting to see how the students can spend hours building their own satellite and how passionate they are about this type of tool.
1988-01-01
"A Generator for Natural Language Interfaces," Computational Linguistics, Vol. 11, Number 4, October-December 1985, pp. 219-242. de Joia, A., and...employ in order to communicate to their intended audience. Production, therefore, encompasses issues of deciding what is pertinent as well as de...rhetorical predicates; design of a system motivated by the desire for domain and language independence; semantic connection of the generation system
Migrating the Belle II collaborative services and tools
NASA Astrophysics Data System (ADS)
Braun, N.; Dossett, D.; Dramburg, M.; Frost, O.; Gellrich, A.; Grygier, J.; Hauth, T.; Jahnke-Zumbusch, D.; Knittel, D.; Kuhr, T.; Levonian, S.; Moser, H.-G.; Li, L.; Nakao, N.; Prim, M.; Reest, P. v. d.; Schwenssen, F.; Urquijo, P.; Vennemann, B.
2017-10-01
The Belle II collaboration decided in 2016 to migrate its collaborative services and tools into the existing IT infrastructure at DESY. The goal was to reduce the maintenance effort for solutions operated by Belle II members as well as to deploy state-of-the-art technologies. In addition, some new services and tools were or will be introduced. Planning and migration work was carried out by small teams consisting of experts from Belle II and the involved IT divisions. The migration was successfully accomplished before the KEK computer centre replacement in August 2016.
ERIC Educational Resources Information Center
Griffiths, Martin
2011-01-01
One of the author's undergraduate students recently asked him whether it was possible to generate a random positive integer. After some thought, the author realised that there were plenty of interesting mathematical ideas inherent in her question. So much so, in fact, that the author decided to organise a workshop, open both to undergraduates and…
Hegarty, Kelsey; Tarzia, Laura; Murray, Elizabeth; Valpied, Jodie; Humphreys, Cathy; Taft, Angela; Gold, Lisa; Glass, Nancy
2015-08-01
Domestic violence is a serious problem affecting the health and wellbeing of women globally. Interventions in health care settings have primarily focused on screening and referral; however, women often may not disclose abuse to health practitioners. The internet offers a confidential space in which women can assess the health of their relationships and make a plan for safety and wellbeing for themselves and their children. This randomised controlled trial is testing the effectiveness of a web-based healthy relationship tool and safety decision aid (I-DECIDE). Based broadly on the IRIS trial in the United States, it has been adapted for the Australian context, where it is conducted entirely online and uses the Psychosocial Readiness Model as the basis for the intervention. In this two-arm, pragmatic randomised controlled trial, women who have experienced abuse or fear of a partner in the previous 6 months will be computer randomised to receive either the I-DECIDE website or a comparator website (basic relationship and safety advice). The intervention includes self-directed reflection exercises on their relationship, danger level, and priority setting, and results in an individualised, tailored action plan. Primary self-reported outcomes are: self-efficacy (General Self-Efficacy Scale) immediately after completion and at 6 and 12 months post-baseline; and depressive symptoms (Centre for Epidemiologic Studies Depression Scale, Revised, at 6 and 12 months post-baseline). Secondary outcomes include mean number of helpful actions for safety and wellbeing, mean level of fear of partner, and cost-effectiveness. This fully automated trial will evaluate a web-based self-information, self-reflection and self-management tool for domestic violence. We hypothesise that the improvement in self-efficacy and mental health will be mediated by increased perceived support and awareness encouraging positive change. If shown to be effective, I-DECIDE could be easily incorporated into the community sector and health care settings, providing an alternative to formal services for women not ready or able to acknowledge abuse and access specialised services. The trial was registered on 15th December 2014 with the Australian New Zealand Clinical Trials Registry, ACTRN12614001306606.
A Summary of Some Discrete-Event System Control Problems
NASA Astrophysics Data System (ADS)
Rudie, Karen
A summary of the area of control of discrete-event systems is given. In this research area, automata and formal language theory is used as a tool to model physical problems that arise in technological and industrial systems. The key ingredients to discrete-event control problems are a process that can be modeled by an automaton, events in that process that cannot be disabled or prevented from occurring, and a controlling agent that manipulates the events that can be disabled to guarantee that the process under control either generates all the strings in some prescribed language or as many strings as possible in some prescribed language. When multiple controlling agents act on a process, decentralized control problems arise. In decentralized discrete-event systems, it is presumed that the agents effecting control cannot each see all event occurrences. Partial observation leads to some problems that cannot be solved in polynomial time and some others that are not even decidable.
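The standard controllability test from supervisory control theory can be sketched for finite, prefix-closed languages as below; this is an illustrative toy (real tools operate on automata rather than explicit string sets), and the event names are hypothetical.

```python
# Minimal sketch of the standard controllability test from supervisory control
# theory, for plant language L and specification K given as finite, prefix-closed
# sets of event strings (tuples). Illustrative only; real tools work on automata.

def prefixes(strings):
    """All prefixes of the given strings (prefix closure), including the empty string."""
    closure = {()}
    for s in strings:
        for i in range(1, len(s) + 1):
            closure.add(s[:i])
    return closure

def is_controllable(K, L, uncontrollable):
    """K is controllable w.r.t. L iff pref(K).Sigma_uc intersected with pref(L) stays in pref(K)."""
    pK, pL = prefixes(K), prefixes(L)
    return all(s + (e,) in pK
               for s in pK for e in uncontrollable
               if s + (e,) in pL)

if __name__ == "__main__":
    # Plant: 'request' then either 'grant' (controllable) or 'fault' (uncontrollable).
    L = {("request", "grant"), ("request", "fault")}
    K_bad = {("request", "grant")}        # tries to forbid the uncontrollable 'fault'
    K_ok = {("request", "grant"), ("request", "fault")}
    print(is_controllable(K_bad, L, {"fault"}))   # False: no supervisor can disable 'fault'
    print(is_controllable(K_ok, L, {"fault"}))    # True
```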
Fitzpatrick, Stephanie L; Hill-Briggs, Felicia
2017-06-01
Purpose: The purpose of this study was to identify effective strategies for sustained weight management used by African American patients with obesity and type 2 diabetes. Methods: In this study, nominal group technique was used to identify effective strategies for weight management used by 12 African Americans with overweight/obesity and type 2 diabetes who successfully lost or maintained their weight after completing DECIDE (Decision-making Education for Choices In Diabetes Everyday), a 9-module, literacy-adapted diabetes and cardiovascular disease (CVD) education and problem-solving training program. Results: Participants generated a list of 101 strategies that covered 4 domains: nutrition, physical activity, cognitive-behavioral strategies, and other. Self-monitoring and relying on social support were the top 2 strategies for weight maintenance. Conclusion: Future obesity studies should consider including friends/family as well as electronic tools to facilitate self-monitoring and regular practice of behavioral strategies for long-term success.
An Integrated Suite of Text and Data Mining Tools - Phase II
2005-08-30
Riverside, CA, USA; Mazda Motor Corp., Japan; Univ. of Darmstadt, Darmstadt, Germany; Navy Center for Applied Research in Artificial Intelligence; Univ. of...with Georgia Tech Research Corporation developed a desktop text-mining software tool named TechOASIS (known commercially as VantagePoint). By the...of this dataset and groups the Corporate Source items that co-occur with the found items. He decides he is only interested in the institutions
Creating scientific and technical talent through educational outreach
NASA Astrophysics Data System (ADS)
Diggs, Darnell E.; Grote, James G.; Fielding, Jennifer; Jones, Keith W.; Jenkins, Larry C.; Turner, I. Leon
2007-09-01
Using descriptive and explanatory research methodologies, researchers have qualitatively investigated factors influential in causing our nation's youth to decide whether to select science, technology, engineering, and mathematics (STEM) as their academic majors. Furthermore, researchers have also examined what causes African American men and women to decide whether they desire to become engineers and scientists. From preexisting studies, numerous themes have emerged from these data which support educational outreach as a powerful tool for encouraging our youth to consider the STEM disciplines. This paper highlights Air Force Research Laboratory researchers' efforts in combating the forces that could jeopardize our nation's position as one of the leaders in technology and scientific innovation.
Disaster Response and Decision Support in Partnership with the California Earthquake Clearinghouse
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Rosinski, A.; Vaughan, D.; Morentz, J.
2014-12-01
Getting the right information to the right people at the right time is critical during a natural disaster. E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) is a NASA decision support system designed to produce remote sensing and geophysical modeling data products that are relevant to the emergency preparedness and response communities and serve as a gateway to enable the delivery of NASA decision support products to these communities. The E-DECIDER decision support system has several tools, services, and products that have been used to support end-user exercises in partnership with the California Earthquake Clearinghouse since 2012, including near real-time deformation modeling results and on-demand maps of critical infrastructure that may have been potentially exposed to damage by a disaster. E-DECIDER's underlying service architecture allows the system to facilitate delivery of NASA decision support products to the Clearinghouse through XchangeCore Web Service Data Orchestration that allows trusted information exchange among partner agencies. This in turn allows Clearinghouse partners to visualize data products produced by E-DECIDER and other NASA projects through incident command software such as SpotOnResponse or ArcGIS Online.
ISPATOM: A Generic Real-Time Data Processing Tool Without Programming
NASA Technical Reports Server (NTRS)
Dershowitz, Adam
2007-01-01
Information Sharing Protocol Advanced Tool of Math (ISPATOM) is an application program allowing for the streamlined generation of comps, which subscribe to streams of incoming telemetry data, perform any necessary computations on the data, then send the data to other programs for display and/or further processing in NASA mission control centers. Heretofore, the development of comps was difficult, expensive, and time-consuming: each comp was custom written manually, in a low-level computing language, by a programmer attempting to follow the requirements of flight controllers. ISPATOM enables a flight controller who is not a programmer to write a comp by simply typing in one or more equation(s) at a command line or retrieving the equation(s) from a text file. ISPATOM then subscribes to the necessary input data, performs all of the necessary computations, and sends out the results. It sends out new results whenever the input data change. The use of equations in ISPATOM is no more difficult than entering equations in a spreadsheet. The time involved in developing a comp is thus limited to the time taken to decide on the necessary equations. Thus, ISPATOM is a real-time dynamic calculator.
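A hypothetical sketch of the general idea of a comp, an equation compiled once and re-evaluated whenever a subscribed input changes, is shown below; it is not ISPATOM's implementation, and the class name, the restricted math namespace, and the dynamic-pressure example are illustrative assumptions.

```python
# Hypothetical sketch of the general idea behind a "comp": an equation string is
# compiled once, then re-evaluated whenever any of its input telemetry values
# change. This is not ISPATOM's implementation; names and the math namespace
# are illustrative assumptions.
import math

class Comp:
    def __init__(self, name, equation, inputs):
        self.name = name
        self.inputs = dict(inputs)                      # current values of subscribed telemetry
        self.code = compile(equation, name, "eval")     # parse the equation once

    def update(self, **changed):
        """Apply changed telemetry values and publish (here: print) the new result."""
        self.inputs.update(changed)
        result = eval(self.code, {"__builtins__": {}, "math": math}, dict(self.inputs))
        print(f"{self.name} = {result:.3f}")
        return result

if __name__ == "__main__":
    # Dynamic pressure q = 0.5 * rho * v^2 computed from two telemetry streams.
    q = Comp("dynamic_pressure", "0.5 * rho * v**2", {"rho": 1.225, "v": 0.0})
    q.update(v=50.0)                 # new velocity sample arrives
    q.update(v=60.0, rho=1.100)      # both inputs change
```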
Social voting advice applications-definitions, challenges, datasets and evaluation.
Katakis, Ioannis; Tsapatsoulis, Nicolas; Mendez, Fernando; Triga, Vasiliki; Djouvas, Constantinos
2014-07-01
Voting advice applications (VAAs) are online tools that have become increasingly popular and purportedly aid users in deciding which party/candidate to vote for during an election. In this paper we present an innovation to current VAA design which is based on the introduction of a social network element. We refer to this new type of online tool as a social voting advice application (SVAA). SVAAs extend VAAs by providing (a) community-based recommendations, (b) comparison of users' political opinions, and (c) a channel of user communication. In addition, SVAAs enriched with data mining modules can operate as citizen sensors recording the sentiment of the electorate on issues and candidates. Drawing on VAA datasets generated by the Preference Matcher research consortium, we evaluate the results of the first VAA, Choose4Greece, which incorporated social voting features and was launched during the landmark Greek national elections of 2012. We demonstrate how an SVAA can provide community-based features and, at the same time, serve as a citizen sensor. Evaluation of the proposed techniques is realized on a series of datasets collected from various VAAs, including Choose4Greece. The collection is made available online in order to promote research in the field.
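A toy sketch of the community-based recommendation element, not the method used in Choose4Greece: the user's issue-position vector is compared with other users' vectors by cosine similarity, and the party most common among the nearest neighbours is suggested. The rating scale, the neighbourhood size k, and the sample data are assumptions.

```python
# Toy sketch of a community-based recommendation in an SVAA (not the method used
# in Choose4Greece): find the users with the most similar issue-position vectors
# and recommend the party most common among those neighbours.
import math
from collections import Counter

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = math.sqrt(sum(x * x for x in a)), math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def recommend(user_positions, community, k=3):
    """community: list of (positions, declared_party); positions are answers on a -2..2 scale."""
    neighbours = sorted(community, key=lambda item: cosine(user_positions, item[0]), reverse=True)[:k]
    return Counter(party for _, party in neighbours).most_common(1)[0][0]

if __name__ == "__main__":
    community = [
        ([2, -1, 1, 2], "Party A"),
        ([2, -2, 1, 1], "Party A"),
        ([-2, 2, -1, -2], "Party B"),
        ([-1, 2, -2, -1], "Party B"),
    ]
    print(recommend([1, -1, 2, 2], community, k=3))   # expected: "Party A"
```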
Toward a molecular programming language for algorithmic self-assembly
NASA Astrophysics Data System (ADS)
Patitz, Matthew John
Self-assembly is the process whereby relatively simple components autonomously combine to form more complex objects. Nature exhibits self-assembly to form everything from microscopic crystals to living cells to galaxies. With a desire to both form increasingly sophisticated products and to understand the basic components of living systems, scientists have developed and studied artificial self-assembling systems. One such framework is the Tile Assembly Model introduced by Erik Winfree in 1998. In this model, simple two-dimensional square 'tiles' are designed so that they self-assemble into desired shapes. The work in this thesis consists of a series of results which build toward the future goal of designing an abstracted, high-level programming language for designing the molecular components of self-assembling systems which can perform powerful computations and form into intricate structures. The first two sets of results demonstrate self-assembling systems which perform infinite series of computations that characterize computably enumerable and decidable languages, and exhibit tools for algorithmically generating the necessary sets of tiles. In the next chapter, methods for generating tile sets which self-assemble into complicated shapes, namely a class of discrete self-similar fractal structures, are presented. Next, a software package for graphically designing tile sets, simulating their self-assembly, and debugging designed systems is discussed. Finally, a high-level programming language which abstracts much of the complexity and tedium of designing such systems, while preventing many of the common errors, is presented. The summation of this body of work presents a broad coverage of the spectrum of desired outputs from artificial self-assembling systems and a progression in the sophistication of tools used to design them. By creating a broader and deeper set of modular tools for designing self-assembling systems, we hope to increase the complexity which is attainable. These tools provide a solid foundation for future work in both the Tile Assembly Model and explorations into more advanced models.
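A minimal sketch of the abstract Tile Assembly Model attachment rule follows, not the thesis software: a tile may attach at an empty site if the strengths of its glues that match already-placed neighbours sum to at least the temperature. The tiny temperature-2 tile set is an illustrative "cooperation" example, and the glue labels are hypothetical.

```python
# Minimal sketch of the abstract Tile Assembly Model (aTAM) attachment rule, not
# the thesis software: a tile may attach at an empty site if the strengths of its
# glues that match already-placed neighbours sum to at least the temperature.

TEMPERATURE = 2
# Each tile type maps side -> (glue label, strength); missing sides carry no glue.
TILES = {
    "SEED": {"E": ("x", 2), "N": ("y", 2)},
    "A":    {"W": ("x", 2), "N": ("c1", 1)},
    "B":    {"S": ("y", 2), "E": ("c2", 1)},
    "C":    {"S": ("c1", 1), "W": ("c2", 1)},   # needs both neighbours: 1 + 1 >= 2
}
OPPOSITE = {"N": "S", "S": "N", "E": "W", "W": "E"}
OFFSET = {"N": (0, 1), "S": (0, -1), "E": (1, 0), "W": (-1, 0)}

def binding_strength(tile, pos, assembly):
    total = 0
    for side, (dx, dy) in OFFSET.items():
        neighbour = assembly.get((pos[0] + dx, pos[1] + dy))
        if neighbour is None or side not in TILES[tile]:
            continue
        label, strength = TILES[tile][side]
        if TILES[neighbour].get(OPPOSITE[side], (None, 0))[0] == label:
            total += strength
    return total

def grow(assembly):
    changed = True
    while changed:
        changed = False
        frontier = {(x + dx, y + dy) for (x, y) in assembly for dx, dy in OFFSET.values()}
        for pos in sorted(frontier - set(assembly)):
            for tile in TILES:
                if tile != "SEED" and binding_strength(tile, pos, assembly) >= TEMPERATURE:
                    assembly[pos] = tile
                    changed = True
                    break
    return assembly

if __name__ == "__main__":
    result = grow({(0, 0): "SEED"})
    print(result)   # tile C appears at (1, 1) only after A and B provide cooperative glues
```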
Systems Engineering for Contingency Basing
2012-11-30
partner practices. Reduced Environmental, Safety and Occupational Health (ESOH) Risks. The processes and tools need to enable the measurement... [List-of-figures excerpt: Figure 43: Fuel environmental considerations decision hierarchy; Figure 44: A medical facility; Figure 46: Screen shot of CB-Decider, a web-based collaborative modeling environment]
ERIC Educational Resources Information Center
Wragg, Paul H.; Allen, Rodney F.
Four lessons for secondary school geography instruction focus on creative thinking through generating alternatives, imagining consequences, generating analogies, and creating products. In lesson 1 students lay out alternatives to solving a problem and decide the best course of action. Suggested activities include consideration of alternatives in…
Hawley, Sarah T; Li, Yun; An, Lawrence C; Resnicow, Kenneth; Janz, Nancy K; Sabel, Michael S; Ward, Kevin C; Fagerlin, Angela; Morrow, Monica; Jagsi, Reshma; Hofer, Timothy P; Katz, Steven J
2018-03-01
Purpose: This study was conducted to determine the effect of iCanDecide, an interactive and tailored breast cancer treatment decision tool, on the rate of high-quality patient decisions (both informed and values-concordant) regarding locoregional breast cancer treatment and on patient appraisal of decision making. Methods: We conducted a randomized clinical trial of newly diagnosed patients with early-stage breast cancer making locoregional treatment decisions. From 22 surgical practices, 537 patients were recruited and randomly assigned online to the iCanDecide interactive and tailored Web site (intervention) or the iCanDecide static Web site (control). Participants completed a baseline survey and were mailed a follow-up survey 4 to 5 weeks after enrollment to assess the primary outcome of a high-quality decision, which consisted of two components, high knowledge and values-concordant treatment, and secondary outcomes (decision preparation, deliberation, and subjective decision quality). Results: Patients in the intervention arm had higher odds of making a high-quality decision than did those in the control arm (odds ratio, 2.00; 95% CI, 1.37 to 2.92; P = .0004), which was driven primarily by differences in the rates of high knowledge between groups. The majority of patients in both arms made values-concordant treatment decisions (78.6% in the intervention arm and 81.4% in the control arm). More patients in the intervention arm had high decision preparation (estimate, 0.18; 95% CI, 0.02 to 0.34; P = .027), but there were no significant differences in the other decision appraisal outcomes. The effect of the intervention was similar for women who were leaning strongly toward a treatment option at enrollment compared with those who were not. Conclusion: The tailored and interactive iCanDecide Web site, which focused on knowledge building and values clarification, positively affected high-quality decisions, largely by improving knowledge, compared with static online information. To be effective, future patient-facing decision tools should be integrated into the clinical workflow to improve decision making.
CrossTalk: The Journal of Defense Software Engineering. Volume 21, Number 2
2008-02-01
data systems are selected as tools to provide the data. For example, the Air Force decided to initiate a Reliability Pathfinder to study and define...small projects, and someone just sitting will certainly be noticed more readily. Providing tools for communication such as white boards and open space...community may have to pay for both. In the case of DRILS, we saw that it could potentially interface with the Reliability and Maintainability Information
Disaster Response Tools for Decision Support and Data Discovery - E-DECIDER and GeoGateway
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Donnellan, A.; Parker, J. W.; Granat, R. A.; Lyzenga, G. A.; Pierce, M. E.; Wang, J.; Grant Ludwig, L.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.
2015-12-01
Providing actionable data for situational awareness following an earthquake or other disaster is critical to decision makers in order to improve their ability to anticipate requirements and provide appropriate resources for response. E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) is a decision support system producing remote sensing and geophysical modeling products that are relevant to the emergency preparedness and response communities and serves as a gateway to enable the delivery of actionable information to these communities. GeoGateway is a data product search and analysis gateway for scientific discovery, field use, and disaster response focused on NASA UAVSAR and GPS data that integrates with fault data, seismicity and models. Key information on the nature, magnitude and scope of damage, or Essential Elements of Information (EEI), necessary to achieve situational awareness are often generated from a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. We have worked in partnership with the California Earthquake Clearinghouse to develop actionable data products for use in their response efforts, particularly in regularly scheduled, statewide exercises like the recent May 2015 Capstone/SoCal NLE/Ardent Sentry Exercises and in the August 2014 South Napa earthquake activation. We also provided a number of products, services, and consultation to the NASA agency-wide response to the April 2015 Gorkha, Nepal earthquake. We will present perspectives on developing tools for decision support and data discovery in partnership with the Clearinghouse and for the Nepal earthquake. Products delivered included map layers as part of the common operational data plan for the Clearinghouse, delivered through XchangeCore Web Service Data Orchestration, enabling users to create merged datasets from multiple providers. For the Nepal response effort, products included models, damage and loss estimates, and aftershock forecasts that were posted to a NASA information site and delivered directly to end-users such as USAID, OFDA, World Bank, and UNICEF.
Development of an Optimum Rescue Tool, Detailed Prototype Concept Design.
1981-06-01
...penetrate hardened metal structures of aircraft. A large number of tools are transported to the scene of a crashed aircraft. Valuable time is lost deciding...carrying system is necessary for transport of the tool to allow the operator free use of his hands. Such a system would be a shoulder sling assembly
The First Israeli Hydro-Electric Pumped Storage Power Plant Gilboa PSPP
NASA Astrophysics Data System (ADS)
Maruzewski, P., Dr.; Sautereau, T.; Sapir, Y.; Barak, H.; Hénard, F.; Blaix, J.-C.
2016-11-01
The Israeli Public Utilities Authority, PUA, decided to increase the instantaneous power available on the grid by adding Pumped Storage Power Plants, PSPP, to the existing generation capacity. PSP Investments Ltd. is a private investor that decided to develop the Gilboa PSPP, with a capacity of 300 MWe. The project's performance has to comply with PUA regulations for PSPP and with all relevant Israeli laws and IECo standards. This paper presents an overview of the Gilboa PSPP through short summaries of the units' components, from the design stage to the manufacturing processes.
Targeting Neutrophil Protease-Mediated Degradation of Tsp-1 to Induce Metastatic Dormancy
2017-10-01
have decided to use CRISPR-Cas9 mediated gene knockout as it results in complete depletion of gene products. The sequences for gRNAs (Fig. 1B) have...express Tsp-1 receptor CD36. Fig. 1. (A) Tsp-1 receptor CD36 is expressed by tumor cells. (B) Design of CRISPR-Cas9 gRNAs for generating biallelic...DWLPK peptide solubility were created and optimized. For Aim 1, Subtask 2.1: We have decided to use CRISPR-Cas9 as they are more potent in
Generation of Tutorial Dialogues: Discourse Strategies for Active Learning
1998-05-29
time the student starts in on a new topic. Michael and Rovick constantly attempt to promote active learning. They regularly use hints and only resort...Controlling active learning: How tutors decide when to generate hints. Proceedings of FLAIRS. Melbourne Beach, FL. 157-161. Hume, G., Michael
A review and evaluation of numerical tools for fractional calculus and fractional order controls
NASA Astrophysics Data System (ADS)
Li, Zhuo; Liu, Lu; Dehghan, Sina; Chen, YangQuan; Xue, Dingyü
2017-06-01
In recent years, as fractional calculus becomes more and more broadly used in research across different academic disciplines, there are increasing demands for numerical tools for the computation of fractional integration/differentiation and the simulation of fractional order systems. Being asked from time to time which tool is suitable for a specific application, the authors decided to carry out this survey to present recapitulative information on the available tools in the literature, in the hope of benefiting researchers with different academic backgrounds. With this motivation, the present article collects the scattered tools into a dashboard view, briefly introduces their usage and algorithms, evaluates the accuracy, compares the performance, and provides informative comments for selection.
Casanueva, Felipe F; Barkan, Ariel L; Buchfelder, Michael; Klibanski, Anne; Laws, Edward R; Loeffler, Jay S; Melmed, Shlomo; Mortini, Pietro; Wass, John; Giustina, Andrea
2017-10-01
With the goal of generating uniform criteria among centers dealing with pituitary tumors and enhancing patient care, the Pituitary Society decided to develop criteria for Pituitary Tumors Centers of Excellence (PTCOE). To accomplish this task, a group of ten experts served as a Task Force, and through two years of iterative work an initial draft was elaborated. This draft was discussed, modified, and finally approved by the Board of Directors of the Pituitary Society. The document was presented and debated at a specific session of the Congress of the Pituitary Society, Orlando 2017, and suggestions were incorporated. Finally, the document was distributed to a large group of global experts who introduced further modifications before final endorsement. After five years of iterative work, a document with the ideal criteria for a PTCOE is presented. Acknowledging that very few centers in the world, if any, likely fulfill the requirements presented here, the document may serve as a tool to guide improvements in the delivery of care to patients with pituitary disorders. All these criteria must be accommodated to the regulations and organization of health care in a given country.
Next-generation sequencing in clinical virology: Discovery of new viruses.
Datta, Sibnarayan; Budhauliya, Raghvendra; Das, Bidisha; Chatterjee, Soumya; Vanlalhmuaka; Veer, Vijay
2015-08-12
Viruses are a cause of significant health problems worldwide, especially in the developing nations. Due to different anthropological activities, human populations are exposed to different viral pathogens, many of which emerge as outbreaks. In such situations, the discovery of novel viruses is of utmost importance for deciding prevention and treatment strategies. Since the last century, a number of different virus discovery methods, based on cell culture inoculation and sequence-independent PCR, have been used for the identification of a variety of viruses. However, the recent emergence and commercial availability of next-generation sequencers (NGS) has entirely changed the field of virus discovery. These massively parallel sequencing platforms can sequence a mixture of genetic materials from a very heterogeneous mix, with high sensitivity. Moreover, these platforms work in a sequence-independent manner, making them ideal tools for virus discovery. However, for their application in clinics, sample preparation or enrichment is necessary to detect low-abundance virus populations. A number of techniques have also been developed for the enrichment of viral nucleic acids. In this manuscript, we review the evolution of sequencing and the NGS technologies available today, as well as widely used virus enrichment technologies. We also discuss the challenges associated with their applications in clinical virus discovery.
The role of university research in primary and secondary education
NASA Astrophysics Data System (ADS)
Redondo, A.; Llopart, M.; Ramos, L.; Roger, T.; Rafols, R.; Redondo, J. M.
2009-04-01
One of the most important roles of educators at all levels (transversally and inter-generationally, between adult education, university, and the primary schools), especially in the sciences, is to stimulate the quest for new knowledge and to help provide the basic thinking tools of the proper scientific method. An innovative plan has been set up through the Campus Universitari de la Mediterrania that integrates the UPC, the local education authorities, and the local government in Vilanova i la Geltru, Barcelona. The plan coordinates university professors invited to lecture in summer courses, so that their research and lecturing materials may be used as school-level material (as a CD collection), and helps younger students initiate their own research projects. During 2006-2008 a series of environmental science seminars and group projects, decided by the students or proposed jointly by the CUM, were started. Examples of these works, such as cetacean communication (with the help of the Laboratory of Bioacustic Applications of the UPC), shapes and patterns in the environment (Cosmocaixa Science Museum), the rainbow, waves and tides, turbulence, the growth of snails and the Fibonacci sequence, etc., will be presented, showing the importance of communicating scientific interest to the younger generations.
RADC SCAT automated sneak circuit analysis tool
NASA Astrophysics Data System (ADS)
Depalma, Edward L.
The sneak circuit analysis tool (SCAT) provides a PC-based system for real-time identification (during the design phase) of sneak paths and design concerns. The tool utilizes an expert system shell to assist the analyst, so that prior experience with sneak analysis is not necessary to perform the analysis. Both sneak circuits and design concerns are targeted by this tool, and both digital and analog circuits are examined. SCAT focuses the analysis at the assembly level, rather than the entire system, so that most sneak problems can be identified and corrected by the responsible design engineer in a timely manner. The SCAT program identifies the sneak circuits to the designer, who then decides what course of action is necessary.
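Purely as an illustration of what sneak-path identification means, not of SCAT's expert-system rules, the sketch below models a circuit as a graph of components, enumerates simple paths from power to ground, and flags paths that bypass every intended load; the component names are hypothetical.

```python
# Illustrative sketch only (not RADC SCAT): model a circuit as a graph of nodes
# joined by components, enumerate simple paths from power to ground, and flag
# any path that does not pass through an intended load as a potential sneak path.
from collections import defaultdict

def build_graph(edges):
    graph = defaultdict(set)
    for a, b, name in edges:
        graph[a].add((b, name))
        graph[b].add((a, name))
    return graph

def find_paths(graph, node, target, visited=(), parts=()):
    if node == target:
        yield parts
        return
    for nxt, name in graph[node]:
        if nxt not in visited:
            yield from find_paths(graph, nxt, target, visited + (node,), parts + (name,))

def sneak_paths(edges, source, ground, intended_loads):
    graph = build_graph(edges)
    return [p for p in find_paths(graph, source, ground)
            if not intended_loads & set(p)]

if __name__ == "__main__":
    edges = [("BAT", "N1", "SW1"), ("N1", "GND", "LAMP"),
             ("N1", "N2", "SW2"), ("N2", "GND", "MOTOR"),
             ("N2", "GND", "DIODE_REV")]          # unintended return path through a reverse diode
    print(sneak_paths(edges, "BAT", "GND", {"LAMP", "MOTOR"}))
```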
ERIC Educational Resources Information Center
Malzer, Maris; Popovic, Milica; Striedinger, Angelika; Bjorklund, Karin; Olsson, Anna-Clara; Elstad, Linda; Brus, Sanja; Stark, Kat; Stojanovic, Marko; Scholz, Christine
2009-01-01
"Tolerance is not enough, discrimination must be fought" is what ESU staff stated in their Seminar on Equality in London, last May. Following their seminar, they decided to provide members with more practical tools to fight discrimination in higher education. This handbook aims at as part of that strategy. Focusing on several issues that…
The Value of Interactive Assignments in the Online Learning Environment
ERIC Educational Resources Information Center
Florenthal, Bela
2016-01-01
The offerings of Web-based supplemental material for textbooks have been increasingly growing. When deciding to adopt a textbook, instructors examine the added value of the associated supplements, also called "e-learning tools," to enhance students' learning of course concepts. In this study, one such supplement, interactive assignments,…
Source Selection Simulation: Intact Team Training on Picking a Provider
2015-06-01
seat of a new $100 million stealth fighter before giving her flight simulation time. The argument for source-selection simulation (SSS) training is...dynamic is the creation of the SSS Tool. Drawing on his success in using a similar tool in contingency contracting, Long decided we should use a Web...of SSS intact team training. On Sept. 30–Oct. 3, 2014, Professors Long and Elsesser delivered DAU's first-ever Intact Team SSS Training to Eglin's
Service management at CERN with Service-Now
NASA Astrophysics Data System (ADS)
Toteva, Z.; Alvarez Alonso, R.; Alvarez Granda, E.; Cheimariou, M.-E.; Fedorko, I.; Hefferman, J.; Lemaitre, S.; Clavo, D. Martin; Martinez Pedreira, P.; Pera Mira, O.
2012-12-01
The Information Technology (IT) and the General Services (GS) departments at CERN have decided to combine their extensive experience in support for IT and non-IT services towards a common goal: to bring the services closer to the end user, based on Information Technology Infrastructure Library (ITIL) best practice. The collaborative efforts have so far produced definitions for the incident and request fulfilment processes, which are based on a unique two-dimensional service catalogue that combines both the user and the support-team views of all services. After an extensive evaluation of the available industrial solutions, Service-now was selected as the tool to implement the CERN Service-Management processes. The initial release of the tool provided an attractive web portal for the users and successfully implemented two basic ITIL processes: the incident management and the request fulfilment processes. It also integrated with the CERN personnel databases and the LHC GRID ticketing system. Subsequent releases continued to integrate with other third-party tools, such as the facility management systems of CERN, and to implement new processes such as change management. Independently of those new development activities, it was decided to simplify the request fulfilment process in order to achieve easier acceptance by the CERN user community. We believe that, due to the high modularity of the Service-now tool, the parallel design of ITIL processes (e.g., event management) and non-ITIL processes (e.g., computer centre hardware management) will be easily achieved. This presentation will describe the experience that we have acquired and the techniques that were followed to achieve the CERN customization of the Service-Now tool.
30 CFR 285.429 - What criteria will MMS consider in deciding whether to renew a lease or grant?
Code of Federal Regulations, 2011 CFR
2011-07-01
... existing technology. (b) Availability and feasibility of new technology. (c) Environmental and safety... generation capacity and reliability within the regional electrical distribution and transmission system. ...
Ray-Barruel, Gillian; Cooke, Marie; Mitchell, Marion; Chopra, Vineet; Rickard, Claire M
2018-06-04
Millions of acute care hospital patients need a peripheral intravenous catheter (PIVC) each year. However, up to half of PIVCs remain in situ when not being used, and 30%-50% of intravenous (IV) catheters develop complications or stop working before treatment is finished, requiring the insertion of a new device. Improved assessment could prompt timely removal of redundant catheters and prevent IV complications. This study aims to validate an evidence-based PIVC assessment and decision-making tool called I-DECIDED and evaluate the effect of implementing this tool in acute hospital clinical practice. The protocol outlines a prospective, multicentre, mixed-methods study using an interrupted time-series design (multiple measures preintervention and postintervention) implemented at three Australian hospitals between August 2017 and July 2018. The study will examine the effectiveness of the I-DECIDED assessment and decision-making tool in clinical practice in prompting timely PIVC removal and early detection of complications. Primary outcomes are prevalence of redundant PIVCs (defined as a device in situ without a clear purpose), IV complications (occlusion, dislodgement, infiltration, extravasation and phlebitis) and substandard dressings (loose, lifting, moist or soiled); device utilisation ratios; and primary bloodstream infection rates. Secondary outcomes, including staff barriers and enablers to PIVC assessment and removal, patient participation, documentation of PIVC assessment, and decisions taken to continue or remove the PIVC, will be recorded. Using the Promoting Action on Research Implementation in Health Services framework, we will undertake staff focus groups, bedside patient interviews, PIVC assessments and chart audits. Patients aged 18 years or more with a PIVC will be eligible for inclusion. Ethical approval was obtained from Queensland Health (HREC/17/QPCH/47), Griffith University (Ref No. 2017/152) and St Vincent's Health and Aged Care Human Research and Ethics Committee (Ref No. 17/28). Results will be published. Trial registration: ANZCTR 12617000067370; Pre-results.
75 FR 1059 - Agency Information Collection Activities: Submission for OMB Review; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-08
... decides whether to approve the plan pricing (i.e., payment and premium) proposed by each organization... collection; Title of Information Collection: CY 2011 Bid Pricing Tool (BPT) for Medicare Advantage (MA) Plans...) and Prescription Drug Plans are required to submit an actuarial pricing ``bid'' for each plan offered...
KPI Student Executive Summary Report, Winter 2002.
ERIC Educational Resources Information Center
Sheridan Coll. (Ontario).
In the mid-1990s, the Ontario Government decided to enhance the accountability of the Colleges of Applied Arts and Technology by measuring and rewarding their performance in meeting specific goals and outcomes. The KPI (Key Performance Indicators) Satisfaction Survey is a tool developed by the Ministry of Training Colleges and Universities in…
How Do I Start a Property Records System?
ERIC Educational Resources Information Center
Whyman, Wynne
2003-01-01
A property records system organizes data to be utilized by a camp's facilities department and integrated into other areas. Start by deciding what records to keep and allotting the time. Then develop consistent procedures, including organizing data, creating a catalog, making back-up copies, and integrating procedures. Use software tools. A good…
Essentials of Teacher Training Sessions with GeoGebra
ERIC Educational Resources Information Center
Andresen, Mette; Misfeldt, Morten
2010-01-01
Formal requests were recently introduced for integration of ICT in secondary school mathematics. As the main issue, students must develop competence to decide when and how it is appropriate to use available ICT tools and to use them. These new requests put demands on those teachers who have not developed corresponding competencies themselves.…
Analysis of Wastewater and Water System Renewal Decision-Making Tools and Approaches
With regard to the development of software for decision support for pipeline renewal, most of the attention to date has been paid to the development of asset management models, which help an owner decide which portions of a system to prioritize for needed actions. There has not ...
Electronic Information: Literacy Skills for a Computer Age.
ERIC Educational Resources Information Center
Johnston, Jerome
Intended to identify essential skills for academics and students as our society comes to depend increasingly on electronic text, and to decide how, when, and where these skills should be taught, this paper begins by discussing the tools of electronic information processing, i.e., telecommunications, computers, and software. A summary of the skills…
Multimedia Projects in Education: Designing, Producing, and Assessing, Third Edition
ERIC Educational Resources Information Center
Ivers, Karen S.; Barron, Ann E.
2005-01-01
Building on the materials in the two previous successful editions, this book features approximately 40% all new material and updates the previous information. The authors use the DDD-E model (Decide, Design, Develop--Evaluate) to show how to select and plan multimedia projects, use presentation and development tools, manage graphics, audio, and…
From Test Takers to Test Makers
ERIC Educational Resources Information Center
Smith, Kari
2009-01-01
As a classroom teacher, Kari Smith realized that traditional objective tests don't always assess what students actually know. But tests are so deeply embedded in the education system that it would be difficult to do away with them entirely. Smith decided to make tests into learning tools. In this article, Smith describes three strategies for…
Once Established, What Techniques Work Best for Monitoring the District?
ERIC Educational Resources Information Center
Triverio, Louis E.
Monitoring school budget expenditures is as important as budgeting. School boards should decide which broad financial policies will provide control of expenditures, what financial tools to use in monitoring expenditures, and what areas outside of the budget should be monitored. A board's financial policy ought to deal with the line item transfers,…
Reflections on a Technology-Rich Mathematics Classroom
ERIC Educational Resources Information Center
Hodges, Thomas E.; Conner, Elizabeth
2011-01-01
Integrating technology into the mathematics classroom means more than just new teaching tools--it is an opportunity to redefine what it means to teach and learn mathematics. Yet deciding when a particular form of technology may be appropriate for a specific mathematics topic can be difficult. Such decisions center on what is commonly being…
Alternative IT Sourcing Strategies: Six Views
ERIC Educational Resources Information Center
Mahon, Ed; McPherson, Michael R.; Vaughan, Joseph; Rowe, Theresa; Pickett, Michael P.; Bielec, John A.
2011-01-01
IT leaders today must not only provide but also decide: which tools and services should they continue to supply, which are better delivered by others, and perhaps most critically, which methods from among the bewildering array of alternative sourcing strategies will best serve their faculty, staff, and students. In 2009, the EDUCAUSE Center for…
Weaving Social Media into a Business Proposal Project
ERIC Educational Resources Information Center
Li, Xiaoli
2012-01-01
Given that students are enthusiastic about social media or even have expertise in some social media tools, the author decided to design a class project in her Writing for Careers (Business Communication) class that integrates social media in terms of content and project management. This article intends to describe such a class project design as…
A GIS semiautomatic tool for classifying and mapping wetland soils
NASA Astrophysics Data System (ADS)
Moreno-Ramón, Héctor; Marqués-Mateu, Angel; Ibáñez-Asensio, Sara
2016-04-01
Wetlands are among the most productive and biodiverse ecosystems in the world. Water is the main resource and controls the relationships between the agents and factors that determine the quality of the wetland. However, vegetation, wildlife, and soils are also essential factors for understanding these environments. Soils have possibly been the least studied resource due to sampling problems, and as a result wetland soils have sometimes been classified only broadly. The traditional methodology states that homogeneous soil units should be based on the five soil-forming factors. A problem can appear when the variation of one soil-forming factor is too small to differentiate a change in soil units, or when there is another factor that is not taken into account (e.g. a fluctuating water table). This is the case of the Albufera of Valencia, a coastal wetland located in the central-eastern Iberian Peninsula (Spain). The saline water table fluctuates throughout the year and generates differences in soils. To solve this problem, the objectives of this study were to establish a reliable methodology that avoids these problems and to develop a GIS tool that allows homogeneous soil units in wetlands to be defined. This step is essential for the soil scientist, who has to decide the number of soil profiles in a study. The research was conducted with data from 133 soil pits of a previous study in the wetland, in which soil parameters of 401 samples (organic carbon, salinity, carbonates, n-value, etc.) were analysed. In a first stage, GIS layers were generated according to depth, using the Bayesian Maximum Entropy method. Subsequently, a program based on decision-tree algorithms was designed in a GIS environment. The goal of this tool was to create a single layer for each soil variable according to the different diagnostic criteria of Soil Taxonomy (properties, horizons, and diagnostic epipedons). In the end, the program generated a set of layers with the geographical information corresponding to each diagnostic criterion. Finally, the superposition of layers generated the different homogeneous soil units in which the soil scientist should locate the soil profiles. Historically, the Albufera of Valencia has been classified as a single homogeneous soil unit, but application of the methodology and the GIS tool demonstrated that there are six homogeneous units. In that regard, the outcome reveals that only six profiles would have been necessary, compared with the 19 profiles opened when the original study was carried out. In conclusion, the methodology and the GIS tool demonstrated that they can be employed in areas where the soil-forming factors cannot be distinguished. The application of rapid measurement methods together with this methodology could make the process of defining homogeneous units more economical.
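A toy sketch of the overlay step is given below; the thresholds are hypothetical and are not Soil Taxonomy criteria. Each interpolated variable raster is classified against a diagnostic threshold, and the class layers are overlaid so that every unique combination of classes becomes one candidate homogeneous unit.

```python
# Toy sketch of the overlay idea (thresholds are hypothetical, not Soil Taxonomy
# criteria): classify each interpolated variable raster against a diagnostic
# threshold, then overlay the class layers so every unique combination of classes
# becomes one homogeneous soil unit.
import numpy as np

def classify(raster, threshold):
    """Binary diagnostic layer: 1 where the variable meets the criterion."""
    return (raster >= threshold).astype(int)

def homogeneous_units(layers):
    """Overlay class layers; each unique tuple of classes gets its own unit id."""
    stacked = np.stack(layers, axis=-1)
    flat = stacked.reshape(-1, stacked.shape[-1])
    _, unit_ids = np.unique(flat, axis=0, return_inverse=True)
    return unit_ids.reshape(stacked.shape[:-1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    salinity = rng.uniform(0, 16, size=(5, 5))       # dS/m, interpolated raster
    organic_carbon = rng.uniform(0, 4, size=(5, 5))  # %, interpolated raster
    units = homogeneous_units([classify(salinity, 8.0), classify(organic_carbon, 2.0)])
    print(units)                                     # grid of unit ids (0..3 here)
```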
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neal, L.
1995-12-31
A new rule makes nuclear power plant license renewal a viable option. A small group of corporate executives will soon face one of the toughest decisions of their careers: a decision that will affect 17 million American homes. Forty-five commercial nuclear power plants will reach the end of their operating licenses early in the next century. They represent billions of dollars in capital investment, and the companies that own them must decide whether to keep them on the grid or scrap them. But before a company decides whether to pull the plug on a big generating plant, it will have to do some homework. Company executives will have to roll up their sleeves and dig deep into projections of electricity demand, assessments of generating options and cold, hard economics. At the same time, they must keep wary eyes on the political landscape, scanning ahead for roadblocks and quicksand.
CRAB3: Establishing a new generation of services for distributed analysis at CMS
NASA Astrophysics Data System (ADS)
Cinquilli, M.; Spiga, D.; Grandi, C.; Hernàndez, J. M.; Konstantinov, P.; Mascheroni, M.; Riahi, H.; Vaandering, E.
2012-12-01
In CMS Computing the highest priorities for analysis tools are the improvement of the end users' ability to produce and publish reliable samples and analysis results, as well as a transition to a sustainable development and operations model. To achieve these goals CMS decided to incorporate analysis processing into the same framework as data and simulation processing. This strategy foresees that all workload tools (Tier0, Tier1, production, analysis) share a common core with long-term maintainability, as well as the standardization of the operator interfaces. The re-engineered analysis workload manager, called CRAB3, makes use of newer technologies, such as RESTful web services and NoSQL databases, aiming to increase the scalability and reliability of the system. As opposed to CRAB2, in CRAB3 all work is centrally injected and managed in a global queue. A pool of agents, which can be geographically distributed, consumes work from the central services serving the user tasks. The new architecture of CRAB substantially changes the deployment model and operations activities. In this paper we present the implementation of CRAB3, emphasizing how the new architecture improves workflow automation and simplifies maintainability. In particular, we highlight the impact of the new design on daily operations.
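As a rough illustration of the architecture described (central injection into a global queue, with a pool of agents consuming user tasks), the following generic Python sketch uses a thread pool and an in-process queue. It is not CRAB3 code and does not reflect its actual interfaces; all names are invented.

```python
# Generic sketch of the "central queue + pool of agents" pattern; not CRAB3.
import queue
import threading

central_queue = queue.Queue()

def agent(agent_id):
    while True:
        task = central_queue.get()
        if task is None:             # poison pill: shut the agent down
            central_queue.task_done()
            break
        print(f"agent {agent_id} processing user task {task}")
        central_queue.task_done()

if __name__ == "__main__":
    agents = [threading.Thread(target=agent, args=(i,)) for i in range(3)]
    for a in agents:
        a.start()
    for task_id in range(10):        # central injection of user tasks
        central_queue.put(task_id)
    central_queue.join()
    for _ in agents:                 # stop every agent cleanly
        central_queue.put(None)
    for a in agents:
        a.join()
```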
Communicating with the New Generations. The Challenge for Pediatric Dentists.
Saadia, Marc; Valencia, Roberto
2015-01-01
Most children and parents are virtuous and will give us plenty of reasons to enjoy what we do. Unfortunately, we all know that something is somehow wrong with these new generations. Parents and children sometimes place pediatric dentists in a dilemma. The social structure changes every few years, creating a burden in deciding how to deal with these families. For this reason, dentists might decide to sedate or go to the operating room when these children might be potentially good dental patients. Deciding on this course of action does not allow us to bond with them. Bonding with children must be worked at and nurtured; this is part of what pediatric dentists are trained for. This manuscript illustrates the major changes seen with the new generations of parents and children and how they affect the way we work in our offices. We show the importance of bonding with parents and children, moving beyond the biological aspects and venturing into psycho-social and cultural issues. Knowing our children and adolescents will allow us to detect potentially hazardous physical or emotional behavior.
Linear GLP-algebras and their elementary theories
NASA Astrophysics Data System (ADS)
Pakhomov, F. N.
2016-12-01
The polymodal provability logic GLP was introduced by Japaridze in 1986. It is the provability logic of certain chains of provability predicates of increasing strength. Every polymodal logic corresponds to a variety of polymodal algebras. Beklemishev and Visser asked whether the elementary theory of the free GLP-algebra generated by the constants 0, 1 is decidable [1]. For every positive integer n we solve the corresponding question for the logics GLP_n that are the fragments of GLP with n modalities. We prove that the elementary theory of the free GLP_n-algebra generated by the constants 0, 1 is decidable for all n. We introduce the notion of a linear GLP_n-algebra and prove that all free GLP_n-algebras generated by the constants 0, 1 are linear. We also consider the more general case of the logics GLP_α whose modalities are indexed by the elements of a linearly ordered set α: we define the notion of a linear algebra and prove the latter result in this case.
[Manipulation of the human genome: ethics and law].
Goulart, Maria Carolina Vaz; Iano, Flávia Godoy; Silva, Paulo Maurício; Sales-Peres, Silvia Helena de Carvalho; Sales-Peres, Arsênio
2010-06-01
Molecular biology has provided geneticists with the basic tools for probing the molecular mechanisms that influence different diseases. The scientific and moral responsibility of researchers should be noted, because scientists must anticipate the moral consequences of the commercial application of genetic tests, since this involves not only individuals and their families but the entire population. It is also necessary to reflect on how information from the human genome will be used, for good or ill. The objective of this review was to bring to light data on the ethical application of molecular biology, linking it with human rights. After reviewing the literature, it can be observed that the Human Genome Project has generated several possibilities, such as the identification of genes associated with diseases with synergistic properties, but it also opens the way to genetic intervention in humans to modify behavior, bringing social benefits or harm. The big challenge is to decide what humanity wants from this giant leap.
Building a generalized distributed system model
NASA Technical Reports Server (NTRS)
Mukkamala, Ravi; Foudriat, E. C.
1991-01-01
A modeling tool for both analysis and design of distributed systems is discussed. Since many research institutions have access to networks of workstations, the researchers decided to build a tool running on top of the workstations to function as a prototype as well as a distributed simulator for a computing system. The effects of system modeling on performance prediction in distributed systems and the effect of static locking and deadlocks on the performance predictions of distributed transactions are also discussed. While the probability of deadlock is considerably small, its effects on performance could be significant.
Evaluating geographic information systems technology
Guptill, Stephen C.
1989-01-01
Computerized geographic information systems (GISs) are emerging as the spatial data handling tools of choice for solving complex geographical problems. However, few guidelines exist for assisting potential users in identifying suitable hardware and software. A process to be followed in evaluating the merits of GIS technology is presented. Related standards and guidelines, software functions, hardware components, and benchmarking are discussed. By making users aware of all aspects of adopting GIS technology, they can decide if GIS is an appropriate tool for their application and, if so, which GIS should be used.
Korving, H; Clemens, F
2002-01-01
In recent years, decision analysis has become an important technique in many disciplines. It provides a methodology for rational decision-making allowing for uncertainties in the outcome of several possible actions to be undertaken. An example in urban drainage is the situation in which an engineer has to decide upon a major reconstruction of a system in order to prevent pollution of receiving waters due to CSOs. This paper describes the possibilities of Bayesian decision-making in urban drainage. In particular, the utility of monitoring prior to deciding on the reconstruction of a sewer system to reduce CSO emissions is studied. Our concern is with deciding whether a price should be paid for new information and which source of information is the best choice given the expected uncertainties in the outcome. The influence of specific uncertainties (sewer system data and model parameters) on the probability of CSO volumes is shown to be significant. Using Bayes' rule, to combine prior impressions with new observations, reduces the risks linked with the planning of sewer system reconstructions.
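The value-of-monitoring question can be made concrete with a small preposterior calculation: compare the expected cost of deciding on the prior alone with the expected cost after updating the prior with monitoring data via Bayes' rule. The sketch below is purely illustrative and is not the paper's model; the states, probabilities, and costs are invented.

```python
# Toy preposterior (value-of-information) calculation, not the paper's model.
priors = {"high_CSO": 0.4, "low_CSO": 0.6}                # prior on system state
cost = {("reconstruct", "high_CSO"): 2.0,                  # M EUR, illustrative
        ("reconstruct", "low_CSO"):  2.0,
        ("do_nothing",  "high_CSO"): 5.0,                  # pollution damage
        ("do_nothing",  "low_CSO"):  0.0}
likelihood = {("obs_high", "high_CSO"): 0.8,               # monitoring accuracy
              ("obs_high", "low_CSO"):  0.2,
              ("obs_low",  "high_CSO"): 0.2,
              ("obs_low",  "low_CSO"):  0.8}

def expected_cost(p):
    """Expected cost of the best action under state probabilities p."""
    return min(sum(p[s] * cost[(a, s)] for s in p)
               for a in ("reconstruct", "do_nothing"))

# Expected cost when acting on the prior only
prior_cost = expected_cost(priors)

# Expected cost after observing monitoring data, averaged over possible observations
post_cost = 0.0
for obs in ("obs_high", "obs_low"):
    p_obs = sum(likelihood[(obs, s)] * priors[s] for s in priors)
    posterior = {s: likelihood[(obs, s)] * priors[s] / p_obs for s in priors}
    post_cost += p_obs * expected_cost(posterior)

print(f"expected value of monitoring information: {prior_cost - post_cost:.3f} M EUR")
```

If that difference exceeds the price of the monitoring campaign, paying for the new information is worthwhile under these (invented) numbers.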
Marshall, Deborah A; Gonzalez, Juan Marcos; Johnson, F Reed; MacDonald, Karen V; Pugh, Amy; Douglas, Michael P; Phillips, Kathryn A
2016-12-01
Whole-genome sequencing (WGS) can be used as a powerful diagnostic tool as well as for screening, but it may lead to anxiety, unnecessary testing, and overtreatment. Current guidelines suggest reporting clinically actionable secondary findings when diagnostic testing is performed. We examined preferences for receiving WGS results. A US nationally representative survey (n = 410 adults) was used to rank preferences for who decides (an expert panel, your doctor, you) which WGS results are reported. We estimated the value of information about variants with varying levels of clinical usefulness by using willingness-to-pay contingent valuation questions. The results were as follows: 43% preferred to decide themselves what information is included in the WGS report; 38% (95% confidence interval (CI): 33-43%) would not pay for actionable variants, and 3% (95% CI: 1-5%) would pay more than $1,000; 55% (95% CI: 50-60%) would not pay for variants for which medical treatment is currently unclear, and 7% (95% CI: 5-9%) would pay more than $400. Most people prefer to decide what WGS results are reported. Despite valuing actionable information more, some respondents perceive that genetic information could negatively impact them. Preference heterogeneity for WGS information should be considered in the development of policies, particularly to integrate patient preferences with personalized medicine and shared decision making. Genet Med 18(12), 1295-1302.
Humanities in Gross Anatomy Project: A Novel Humanistic Learning Tool at Des Moines University
ERIC Educational Resources Information Center
Canby, Craig A.; Bush, Traci A.
2010-01-01
Gross anatomy affords physical therapy students an opportunity to discover human morphology by intimately studying the dead. Moreover, it also exposes future physical therapists to the humanistic aspects of the profession. In 2007, anatomy faculty decided to socialize students to the humanities with a new course requirement: Humanities in Gross…
KPI Employer Executive Summary Report, Summer 2000-Winter 2001.
ERIC Educational Resources Information Center
Sheridan Coll. (Ontario).
In the mid 1990s, the Ontario Government decided to enhance the accountability of the Colleges of Applied Arts and Technology by measuring and rewarding their performance in meeting specific goals and outcomes. The KPI Satisfaction Survey is a tool developed by the Ministry of Training Colleges and Universities in conjunction with the colleges to…
ERIC Educational Resources Information Center
Weber, Berta N.
1971-01-01
Cultural study provides an invaluable tool for the motivation and enrichment of work in the language classroom. The teacher of German, having decided to embark on a culture study program, must not, however, make the mistake of concentrating on the past, nor of letting current political boundaries restrict his approach; rather, he will find that…
Goal Programming: A New Tool for the Christmas Tree Industry
Bruce G. Hansen
1977-01-01
Goal programming (GP) can be useful for decision making in the natural Christmas tree industry. Its usefulness is demonstrated through an analysis of a hypothetical problem in which two potential growers decide how to use 10 acres for growing Christmas trees. Though the physical settings are identical, distinct differences between their goals significantly influence the...
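A goal program of this kind can be stated with deviation variables that measure under- and over-achievement of each goal. The sketch below is an invented, simplified formulation of the 10-acre example (not the paper's model), using the PuLP package; all coefficients, goals, and weights are hypothetical.

```python
# Illustrative goal program for the 10-acre example; coefficients are invented.
from pulp import LpProblem, LpVariable, LpMinimize, LpStatus

prob = LpProblem("christmas_tree_goal_program", LpMinimize)

# Decision variables: acres planted of each species (names are illustrative)
scotch = LpVariable("scotch_pine_acres", lowBound=0)
fir = LpVariable("douglas_fir_acres", lowBound=0)

# Deviation variables measuring under- and over-achievement of each goal
inc_under = LpVariable("income_under", lowBound=0)
inc_over = LpVariable("income_over", lowBound=0)
lab_under = LpVariable("labour_under", lowBound=0)
lab_over = LpVariable("labour_over", lowBound=0)

# Objective: penalise falling short of the income goal and exceeding the labour budget
prob += 10 * inc_under + 1 * lab_over

# Hard constraint: only 10 acres are available
prob += scotch + fir <= 10

# Goal 1 (income, thousand $): 1.5/acre scotch, 1.0/acre fir, target 12
prob += 1.5 * scotch + 1.0 * fir + inc_under - inc_over == 12

# Goal 2 (labour, hours): 30 h/acre scotch, 50 h/acre fir, budget 400
prob += 30 * scotch + 50 * fir + lab_under - lab_over == 400

prob.solve()
print(LpStatus[prob.status])
print("scotch acres:", scotch.value(), "fir acres:", fir.value())
print("income shortfall:", inc_under.value(), "labour overrun:", lab_over.value())
```

Two growers with identical land but different goal weights would simply change the penalties in the objective, which is the point the abstract makes.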
ERIC Educational Resources Information Center
Watson, Jennifer; Obersteller, Elizabeth A.; Rennie, Linda; Whitbread, Cherie
2001-01-01
Participatory research in Australia's Northern Territory sought opinions from nurses, general practitioners, Aboriginal health workers, and Aboriginal and Torres Strait Islanders on the development of culturally relevant foot care education for Indigenous people with diabetes. They decided to use a visual approach (posters and flip charts) to…
Predicting Boundary-Layer Transition on Space-Shuttle Re-Entry
NASA Technical Reports Server (NTRS)
Berry, Scott; Horvath, Tom; Merski, Ron; Liechty, Derek; Greene, Frank; Bibb, Karen; Buck, Greg; Hamilton, Harris; Weilmuenster, Jim; Campbell, Chuck;
2008-01-01
The BLT Prediction Tool ("BLT" signifies "Boundary Layer Transition") is provided as part of the Damage Assessment Team analysis package, which is utilized for analyzing local aerothermodynamics environments of damaged or repaired space-shuttle thermal protection tiles. Such analyses are helpful in deciding whether to repair launch-induced damage before re-entering the terrestrial atmosphere.
Exploring Yellowstone National Park with Mathematical Modeling
ERIC Educational Resources Information Center
Wickstrom, Megan H.; Carr, Ruth; Lackey, Dacia
2017-01-01
Mathematical modeling, a practice standard in the Common Core State Standards for Mathematics (CCSSM) (CCSSI 2010), is a process by which students develop and use mathematics as a tool to make sense of the world around them. Students investigate a real-world situation by asking mathematical questions; along the way, they need to decide how to use…
The Use of an Information Brokering Tool in an Electronic Museum Environment.
ERIC Educational Resources Information Center
Zimmermann, Andreas; Lorenz, Andreas; Specht, Marcus
When art and technology meet, a huge information flow has to be managed. The LISTEN project conducted by the Fraunhofer Institut in St. Augustin (Germany) augments every day environments with audio information. In order to distribute and administer this information in an efficient way, the Institute decided to employ an information brokering tool…
ERIC Educational Resources Information Center
Berman, Ruth Aronson
1978-01-01
Contrastive analysis is suggested as a tool in language teaching for such areas as: (1) deciding how much to focus on different aspects of the target language; (2) making generalizations about its structure; and (3) explaining texts or constructions which might otherwise be incomprehensible. The claim is made that such procedures need to be based…
Social Networking Tools in a University Setting: A Student's Perspective
ERIC Educational Resources Information Center
Haytko, Diana L.; Parker, R. Stephen
2012-01-01
As Professors, we are challenged to reach ever-changing cohorts of college students as they flow through our classes and our lives. Technological advancements happen daily and we need to decide which, if any, to incorporate into our classrooms. Our students constantly check Facebook, Twitter, MySpace and other online social networks. Should we be…
Risk Assessment as an Environmental Management Tool: Considerations for Freshwater Wetlands
A. Dennis Lemly
1997-01-01
This paper presents a foundation for improving the risk assessment process for freshwater wetlands. Integrating wetland science, i.e., use of an ecosystem-based approach, is the key concept. Each biotic and abiotic wetland component should be identified and its contribution to ecosystem functions and societal values determined when deciding whether a stressor poses an...
Office management of mild head injury in children and adolescents.
Garcia-Rodriguez, Juan Antonio; Thomas, Roger E
2014-06-01
To provide family physicians with updated, practical, evidence-based information about mild head injury (MHI) and concussion in the pediatric population. MEDLINE (1950 to February 2013), the Cochrane Database of Systematic Reviews (2005 to 2013), the Cochrane Central Register of Controlled Trials (2005 to 2013), and DARE (2005 to 2013) were searched using terms relevant to concussion and head trauma. Guidelines, position statements, articles, and original research relevant to MHI were selected. Trauma is the main cause of death in children older than 1 year of age, and within this group head trauma is the leading cause of disability and death. Nine percent of reported athletic injuries in high school students involve MHI. Family physicians need to take a focused history, perform physical and neurologic examinations, use standardized evaluation instruments (Glasgow Coma Scale; the Sport Concussion Assessment Tool, version 3; the child version of the Sport Concussion Assessment Tool; and the Balance Error Scoring System), instruct parents how to monitor their children, decide when caregivers are not an appropriately responsible resource, follow up with patients promptly, guide a safe return to play and to learning, and decide when neuropsychological testing for longer-term follow-up is required. A thorough history, physical and neurologic assessment, the use of validated tools to provide an objective framework, and periodic follow-up are the basis of family physician management of pediatric MHI. Copyright© the College of Family Physicians of Canada.
The Project Manager's Tool Kit
NASA Technical Reports Server (NTRS)
Cameron, W. Scott
2003-01-01
Project managers are rarely described as being funny. Moreover, a good sense of humor rarely seems to be one of the deciding factors in choosing someone to be a project manager, or something that pops up as a major discussion point at an annual performance review. Perhaps this is because people think you aren't serious about your work if you laugh. I disagree with this assessment, but that's not really my point. As I talk to people either pursuing a career in project management, or broadening their assignment to include project management, I encourage them to consider what tools they need to be successful. I suggest that they consider any strength they have to be part of their Project Management (PM) Tool Kit, and being funny could be one of the tools they need.
S&T converging trends in dealing with disaster: A review on AI tools
NASA Astrophysics Data System (ADS)
Hasan, Abu Bakar; Isa, Mohd. Hafez Mohd.
2016-01-01
Science and Technology (S&T) has been able to help mankind solve or minimize problems when they arise. Different methodologies, techniques and tools have been developed or used for specific cases by researchers, engineers and scientists throughout the world, and numerous papers and articles have been written by them. Nine selected cases, such as flash flood, earthquakes, workplace accidents, faults in the aircraft industry, seismic vulnerability, disaster mitigation and management, and early fault detection in the nuclear industry, have been studied. This paper looked at those cases, and their results showed that nearly 60% used artificial intelligence (AI) as a tool. The paper also provides a review that will help young researchers decide which types of AI tools to select, thus pointing to future trends in S&T.
ERIC Educational Resources Information Center
Wheeler, Melissa
2014-01-01
Researchers have reported that the graduation and retention rate of students whose parents do not hold college degrees (first-generation college students or FGCS) are lower than that of their peers whose parents do hold college degrees. FGCS are 1.3 times more likely to leave college after their first year compared to their non-FGCS peers…
Our Stories Matter: Storytelling and Social Justice in the Hollaback! Movement
ERIC Educational Resources Information Center
Wånggren, Lena
2016-01-01
As feminist and anti-racist scholars and activists have long known, which stories predominate and which are marginalised is always a question of power and authority--about who is entitled to speak, and who has the authority to decide the meanings of words and actions. Storytelling can be used as a tool for social justice, as exemplified by the…
An Anthropologist Explores the Culture of Video Blogging
ERIC Educational Resources Information Center
Young, Jeffrey R.
2007-01-01
Michael L. Wesch, an assistant professor of cultural anthropology at Kansas State University, was writing a paper about social networking and other interactive tools, which are collectively referred to as Web 2.0, when he decided to make use of the technology to spread his message. So he put together a short video with examples of Web 2.0…
2016-12-01
…based on life expectancy and the TSP account selected. The TSP Growth and Annuity Element also estimates how taxes will increase at the time the service...the BRS. Subject terms: military retirement, blended retirement, HIGH-36, thrift savings plan, investment risk, retirement taxes, net present...
Using Smart Phones in Language Learning--A Pilot Study to Turn CALL into MALL
ERIC Educational Resources Information Center
Kétyi, András
2013-01-01
The popularity of smart phones has increased enormously in the last few years. Because of the increasing penetration of these devices and the above-average willingness of our students to use new tools and devices in language courses, we decided to design a voluntary pilot project for mobile language learning for students who learn German as a…
Modeling Guidelines for Code Generation in the Railway Signaling Context
NASA Technical Reports Server (NTRS)
Ferrari, Alessio; Bacherini, Stefano; Fantechi, Alessandro; Zingoni, Niccolo
2009-01-01
Modeling guidelines constitute one of the fundamental cornerstones for Model Based Development. Their relevance is essential when dealing with code generation in the safety-critical domain. This article presents the experience of a railway signaling systems manufacturer on this issue. The introduction of Model-Based Development (MBD) and code generation in the industrial safety-critical sector created a crucial paradigm shift in the development process of dependable systems. While traditional software development focuses on the code, with MBD practices the focus shifts to model abstractions. The change has fundamental implications for safety-critical systems, which still need to guarantee a high degree of confidence also at code level. Usage of the Simulink/Stateflow platform for modeling, which is a de facto standard in control software development, does not by itself ensure production of high-quality dependable code. This issue has been addressed by companies through the definition of modeling rules imposing restrictions on the usage of design tool components, in order to enable production of qualified code. The MAAB Control Algorithm Modeling Guidelines (MathWorks Automotive Advisory Board) [3] are a well-established set of publicly available rules for modeling with Simulink/Stateflow. This set of recommendations has been developed by a group of OEMs and suppliers of the automotive sector with the objective of enforcing and easing the usage of the MathWorks tools within the automotive industry. The guidelines were published in 2001 and afterwards revisited in 2007 in order to integrate some additional rules developed by the Japanese division of MAAB [5]. The scope of the current edition of the guidelines ranges from model maintainability and readability to code generation issues. The rules are conceived as a reference baseline and therefore need to be tailored to comply with the characteristics of each industrial context. Customization of these recommendations has been performed for the automotive control systems domain in order to enforce code generation [7]. The MAAB guidelines have also been found profitable in the aerospace/avionics sector [1] and have been adopted by the MathWorks Aerospace Leadership Council (MALC). General Electric Transportation Systems (GETS) is a well-known railway signaling systems manufacturer leading in Automatic Train Protection (ATP) systems technology. As part of an effort to adopt formal methods within its own development process, GETS decided to introduce system modeling by means of the MathWorks tools [2], and in 2008 chose to move to code generation. This article reports the experience of GETS in developing its own modeling standard by customizing the MAAB rules for the railway signaling domain and shows the result of this experience with a successful product development story.
Deciding alternative left turn signal phases using expert systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, E.C.P.
1988-01-01
The Texas Transportation Institute (TTI) conducted a study to investigate the feasibility of applying artificial intelligence (AI) technology and expert systems (ES) design concepts to a traffic engineering problem. Prototype systems were developed to analyze user input, evaluate various lines of reasoning, and suggest a suitable left-turn phase treatment. These systems were developed using AI programming tools on IBM PC/XT/AT-compatible microcomputers. Two slightly different systems were designed using AI languages; another was built with a knowledge engineering tool. These systems include the PD PROLOG and TURBO PROLOG AI programs, as well as the INSIGHT Production Rule Language.
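The rule-based reasoning of such prototypes can be suggested with a toy example. The sketch below is written in Python rather than the PROLOG/INSIGHT tools named above, and its thresholds and rules are invented; it only illustrates the general idea of mapping user inputs to a suggested left-turn phase treatment.

```python
# Toy rule-based sketch (not the original TTI systems); rules and thresholds invented.
def suggest_left_turn_phase(left_turn_volume, opposing_volume,
                            opposing_lanes, left_turn_crashes_per_year):
    """Return a suggested phase treatment and the rule that fired."""
    cross_product = left_turn_volume * opposing_volume
    if left_turn_crashes_per_year >= 6:
        return "protected-only", "rule: high left-turn crash history"
    if cross_product >= 100_000 and opposing_lanes >= 2:
        return "protected-only", "rule: heavy conflicting volumes, multilane opposing"
    if cross_product >= 50_000:
        return "protected-permissive", "rule: moderate conflicting volumes"
    return "permissive-only", "rule: low conflicting volumes"

if __name__ == "__main__":
    phase, why = suggest_left_turn_phase(left_turn_volume=180,
                                         opposing_volume=600,
                                         opposing_lanes=2,
                                         left_turn_crashes_per_year=2)
    print(phase, "--", why)
```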
Has computational creativity successfully made it "Beyond the Fence" in musical theatre?
NASA Astrophysics Data System (ADS)
Jordanous, Anna
2017-10-01
A significant test for software is to task it with replicating human performance, as done recently with creative software and the commercial project Beyond the Fence (undertaken for a television documentary Computer Says Show). The remit of this project was to use computer software as much as possible to produce "the world's first computer-generated musical". Several creative systems were used to generate this musical, which was performed in London's West End in 2016. This paper considers the challenge of evaluating this project. Current computational creativity evaluation methods are ill-suited to evaluating projects that involve creative input from multiple systems and people. Following recent inspiration within computational creativity research from interaction design, here the DECIDE evaluation framework is applied to evaluate the Beyond the Fence project. Evaluation finds that the project was reasonably successful at achieving the task of using computational generation to produce a credible musical. Lessons have been learned for future computational creativity projects though, particularly for affording creative software more agency and enabling software to interact with other creative partners. Upon reflection, the DECIDE framework emerges as a useful evaluation "checklist" (if not a tangible operational methodology) for evaluating multiple creative systems participating in a creative task.
Innovation on Energy Power Technology (1)
NASA Astrophysics Data System (ADS)
Nagano, Susumu; Kakishima, Masayoshi
After the war, the output of single steam turbine generators produced with Japan's own technology returned to prewar levels. Electric power companies imported large-capacity, high-efficiency steam turbine generators from foreign manufacturers to support the sudden increase in electric power demand. At the same time, they decided to produce them domestically in order to improve industrial technology. The domestic production of a large-capacity 125 MW steam turbine generator overcame many difficulties and contributed greatly to later domestic technical progress.
75 FR 6020 - Electrical Interconnection of the Lower Snake River Wind Energy Project
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-05
... River Wind Energy Project AGENCY: Bonneville Power Administration (BPA), Department of Energy (DOE... (BPA) has decided to offer Puget Sound Energy Inc., a Large Generator Interconnection Agreement for... and Columbia counties, Washington. To interconnect the Wind Project, BPA will construct a new...
Can Man Control His Biological Evolution? A Symposium on Genetic Engineering. Xeroxing Human Beings
ERIC Educational Resources Information Center
Freund, Paul A.
1972-01-01
If the aim of new research is to improve the genetic inheritance of future generations, then decisions regarding who should decide what research should be done needs to be established. Positive and negative eugenics need to be considered thoroughly. (PS)
A systematized WYSIWYG pipeline for digital stereoscopic 3D filmmaking
NASA Astrophysics Data System (ADS)
Mueller, Robert; Ward, Chris; Hušák, Michal
2008-02-01
Digital tools are transforming stereoscopic 3D content creation and delivery, creating an opportunity for the broad acceptance and success of stereoscopic 3D films. Beginning in late 2005, a series of mostly CGI features has successfully introduced the public to this new generation of highly comfortable, artifact-free digital 3D. While the response has been decidedly favorable, a lack of high-quality live-action films could hinder long-term success. Live-action stereoscopic films have historically been more time-consuming, costly, and creatively limiting than 2D films - thus a need arises for a live-action 3D filmmaking process which minimizes such limitations. A unique 'systematized' what-you-see-is-what-you-get (WYSIWYG) pipeline is described which allows the efficient, intuitive and accurate capture and integration of 3D and 2D elements from multiple shoots and sources - both live-action and CGI. Throughout this pipeline, digital tools utilize a consistent algorithm to provide meaningful and accurate visual depth references with respect to the viewing audience in the target theater environment. This intuitive, visual approach introduces efficiency and creativity to the 3D filmmaking process by eliminating both the need for a 'mathematician mentality' of spreadsheets and calculators and any trial-and-error guesswork, while enabling the most comfortable, 'pixel-perfect', artifact-free 3D product possible.
Journey to Library 2.0: One Library Trains Staff on the Social Tools Users Employ
ERIC Educational Resources Information Center
Hastings, Robin
2007-01-01
In December 2006, the Missouri River Regional Library (MRRL) in Jefferson City, embarked on a journey. They had been watching the Public Library of Charlotte-Mecklenburg County (PLCMC), NC, on its adventure through the wilds of Web 2.0, and they decided to follow the trail it had blazed. What PLCMC had done was pretty revolutionary. The library…
ERIC Educational Resources Information Center
LaPlante, Josephine M.; Durham, Taylor R.
A revised edition of PS-14, "An Introduction to Benefit-Cost Analysis for Evaluating Public Programs," presents concepts and techniques of benefit-cost analysis as tools that can be used to assist in deciding between alternatives. The goals of the new edition include teaching students to think about the possible benefits and costs of each…
Automated Euler and Navier-Stokes Database Generation for a Glide-Back Booster
NASA Technical Reports Server (NTRS)
Chaderjian, Neal M.; Rogers, Stuart E.; Aftosmis, Mike J.; Pandya, Shishir A.; Ahmad, Jasim U.; Tejnil, Edward
2004-01-01
The past two decades have seen a sustained increase in the use of high-fidelity Computational Fluid Dynamics (CFD) in basic research, aircraft design, and the analysis of post-design issues. As the fidelity of a CFD method increases, the number of cases that can be readily and affordably computed greatly diminishes. However, computer speeds now exceed 2 GHz, hundreds of processors are currently available and more affordable, and advances in parallel CFD algorithms scale more readily with large numbers of processors. All of these factors make it feasible to compute thousands of high-fidelity cases. However, there still remains the overwhelming task of monitoring the solution process. This paper presents an approach to automate the CFD solution process. A new software tool, AeroDB, is used to compute thousands of Euler and Navier-Stokes solutions for a 2nd-generation glide-back booster in one week. The solution process exploits a common job-submission grid environment, the NASA Information Power Grid (IPG), using 13 computers located at 4 different geographical sites. Process automation and web-based access to a MySQL database greatly reduce the user workload, removing much of the tedium and tendency for user input errors. The AeroDB framework is shown. The user submits/deletes jobs, monitors AeroDB's progress, and retrieves data and plots via a web portal. Once a job is in the database, a job launcher uses an IPG resource broker to decide which computers are best suited to run the job. Job/code requirements, the number of CPUs free on a remote system, and queue lengths are some of the parameters the broker takes into account. The Globus software provides secure services for user authentication, remote shell execution, and secure file transfers over an open network. AeroDB automatically decides when a job is completed. Currently, the Cart3D unstructured flow solver is used for the Euler equations, and the Overflow structured overset flow solver is used for the Navier-Stokes equations. Other codes can be readily included in the AeroDB framework.
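The broker's selection step can be illustrated with a minimal sketch that ranks hosts by free CPUs and queue length, two of the parameters mentioned above. This is not the actual AeroDB/IPG broker; the host data and scoring rule are invented.

```python
# Simplified host-selection sketch (not the actual AeroDB/IPG broker).
def pick_host(hosts, cpus_required):
    """Choose the eligible host with the most free CPUs, breaking ties on queue length."""
    eligible = [h for h in hosts if h["free_cpus"] >= cpus_required]
    if not eligible:
        return None                       # job stays queued until resources free up
    return min(eligible, key=lambda h: (-h["free_cpus"], h["queue_length"]))

if __name__ == "__main__":
    hosts = [
        {"name": "site_a", "free_cpus": 64, "queue_length": 12},
        {"name": "site_b", "free_cpus": 128, "queue_length": 30},
        {"name": "site_c", "free_cpus": 128, "queue_length": 4},
    ]
    best = pick_host(hosts, cpus_required=96)
    print("launch job on:", best["name"] if best else "no host available")
```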
Grant, Sean; Agniel, Denis; Almirall, Daniel; Burkhart, Q; Hunter, Sarah B; McCaffrey, Daniel F; Pedersen, Eric R; Ramchand, Rajeev; Griffin, Beth Ann
2017-12-19
Over 1.6 million adolescents in the United States meet criteria for substance use disorders (SUDs). While there are promising treatments for SUDs, adolescents respond to these treatments differentially in part based on the setting in which treatments are delivered. One way to address such individualized response to treatment is through the development of adaptive interventions (AIs): sequences of decision rules for altering treatment based on an individual's needs. This protocol describes a project with the overarching goal of beginning the development of AIs that provide recommendations for altering the setting of an adolescent's substance use treatment. This project has three discrete aims: (1) explore the views of various stakeholders (parents, providers, policymakers, and researchers) on deciding the setting of substance use treatment for an adolescent based on individualized need, (2) generate hypotheses concerning candidate AIs, and (3) compare the relative effectiveness among candidate AIs and non-adaptive interventions commonly used in everyday practice. This project uses a mixed-methods approach. First, we will conduct an iterative stakeholder engagement process, using RAND's ExpertLens online system, to assess the importance of considering specific individual needs and clinical outcomes when deciding the setting for an adolescent's substance use treatment. Second, we will use results from the stakeholder engagement process to analyze an observational longitudinal data set of 15,656 adolescents in substance use treatment, supported by the Substance Abuse and Mental Health Services Administration, using the Global Appraisal of Individual Needs questionnaire. We will utilize methods based on Q-learning regression to generate hypotheses about candidate AIs. Third, we will use robust statistical methods that aim to appropriately handle casemix adjustment on a large number of covariates (marginal structural modeling and inverse probability of treatment weights) to compare the relative effectiveness among candidate AIs and non-adaptive decision rules that are commonly used in everyday practice. This project begins filling a major gap in clinical and research efforts for adolescents in substance use treatment. Findings could be used to inform the further development and revision of influential multi-dimensional assessment and treatment planning tools, or lay the foundation for subsequent experiments to further develop or test AIs for treatment planning.
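One of the named methods, inverse probability of treatment weighting, can be illustrated on simulated data. The sketch below is not the project's analysis code; the data-generating process and variable names are invented, and it only shows how propensity scores yield stabilized weights that remove confounding by a single covariate.

```python
# Minimal IPTW illustration on simulated data; not the project's analysis code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 5000
severity = rng.normal(size=n)                              # a confounder
treated = rng.binomial(1, 1 / (1 + np.exp(-severity)))     # setting depends on severity
outcome = 1.0 * treated - 0.8 * severity + rng.normal(size=n)

# Propensity score model: P(treated | covariates)
ps_model = LogisticRegression().fit(severity.reshape(-1, 1), treated)
ps = ps_model.predict_proba(severity.reshape(-1, 1))[:, 1]

# Stabilized inverse probability of treatment weights
p_treat = treated.mean()
weights = np.where(treated == 1, p_treat / ps, (1 - p_treat) / (1 - ps))

# Weighted difference in mean outcomes approximates the treatment effect
effect = (np.average(outcome[treated == 1], weights=weights[treated == 1])
          - np.average(outcome[treated == 0], weights=weights[treated == 0]))
print(f"IPTW-estimated effect: {effect:.2f} (true effect is 1.0)")
```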
NASA Astrophysics Data System (ADS)
Behringer, Uwe F. W.
2004-06-01
In June 2000 ago the company Accretech and LEEPL corporation decided to develop an E-beam lithography tool for high throughput wafer exposure, called LEEPL. In an amazing short time the alpha tool was built. In 2002 the beta tool was installed at Accretech. Today the first production tool the LEEPL 3000 is ready to be shipped. The 2keV E-beam tool will be used in the first lithography strategy to expose (in mix and match mode with optical exposure tools) critical levels like gate structures, contact holes (CH), and via pattern of the 90 nm and 65 nm node. At the SEMATECH EPL workshop on September 22nd in Cambridge, England it was mentioned that the amount of these levels will increase very rapidly (8 in 2007; 13 in 2010 and 17 in 2013). The schedule of the production tool for 45 nm node is mid 2005 and for the 32 nm node 2008. The Figure 1 shows from left to right α-tool, the β-tool and the production tool LEEPL 3000. Figure 1 also shows the timetable of the 4 LEEPL forum all held in Japan.
NASA Technical Reports Server (NTRS)
Shyam, Vikram
2010-01-01
A preprocessor for the Computational Fluid Dynamics (CFD) code TURBO has been developed and tested. The preprocessor converts grids produced by GridPro (Program Development Company (PDC)) into a format readable by TURBO and generates the necessary input files associated with the grid. The preprocessor also generates information that enables the user to decide how to allocate the computational load in a multiple block per processor scenario.
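The load-allocation decision the preprocessor supports can be illustrated with a generic balancing heuristic: assign grid blocks, largest first, to the processor with the smallest running total of cells. This is not the TURBO preprocessor's algorithm; block names and cell counts are invented.

```python
# Generic block-to-processor balancing heuristic; not the TURBO preprocessor's method.
import heapq

def allocate_blocks(block_cells, n_procs):
    """Greedy largest-first assignment of grid blocks to processors."""
    heap = [(0, p, []) for p in range(n_procs)]       # (total_cells, proc_id, blocks)
    heapq.heapify(heap)
    for block, cells in sorted(block_cells.items(), key=lambda kv: -kv[1]):
        total, proc, blocks = heapq.heappop(heap)     # least-loaded processor
        heapq.heappush(heap, (total + cells, proc, blocks + [block]))
    return sorted(heap, key=lambda t: t[1])

if __name__ == "__main__":
    blocks = {"blade": 900_000, "tip_gap": 150_000, "inlet": 400_000,
              "outlet": 350_000, "hub": 500_000}       # invented cell counts
    for total, proc, assigned in allocate_blocks(blocks, n_procs=2):
        print(f"processor {proc}: {assigned} ({total} cells)")
```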
Generative Themes and At-Risk Students
ERIC Educational Resources Information Center
Thelin, William H.; Taczak, Kara
2007-01-01
At the University of Akron, the administration decided to segregate the students previously called "provisional" from the "regular" population. As an open-access institution, the university directly admits only approximately 15 percent of the students to a program of study. The vast majority of students start in University College and transfer to…
Tracking PACS usage with open source tools.
French, Todd L; Langer, Steve G
2011-08-01
A typical choice faced by Picture Archiving and Communication System (PACS) administrators is deciding how many PACS workstations are needed and where they should be sited. Oftentimes, the social consequences of having too few are severe enough to encourage oversupply and underutilization. This is costly, at best in terms of hardware and electricity, and at worst (depending on the PACS licensing and support model) in capital costs and maintenance fees. The PACS administrator needs tools to assess accurately the use to which her fleet is being subjected, and thus make informed choices before buying more workstations. Lacking a vended solution for this challenge, we developed our own.
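A home-grown tracker of this kind often reduces to parsing session logs. The sketch below is a hedged illustration (not the authors' tool) that tallies session counts and active hours per workstation from an invented login/logout log format.

```python
# Hedged sketch of workstation usage tracking; the log format is invented.
from collections import defaultdict
from datetime import datetime

SAMPLE_LOG = """\
2011-03-01T08:02,WS-READING-01,login
2011-03-01T11:47,WS-READING-01,logout
2011-03-01T09:15,WS-ER-02,login
2011-03-01T09:40,WS-ER-02,logout
"""

def workstation_usage(log_text):
    """Return {workstation: (session_count, total_hours)} from a simple session log."""
    hours = defaultdict(float)
    counts = defaultdict(int)
    open_logins = {}
    for line in log_text.strip().splitlines():
        stamp, station, event = line.split(",")
        t = datetime.fromisoformat(stamp)
        if event == "login":
            open_logins[station] = t
        elif event == "logout" and station in open_logins:
            hours[station] += (t - open_logins.pop(station)).total_seconds() / 3600
            counts[station] += 1
    return {s: (counts[s], round(hours[s], 2)) for s in hours}

print(workstation_usage(SAMPLE_LOG))
```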
Sue Miller; Bill Elliot; Pete Robichaud; Randy Foltz; Dennis Flanagan; Erin Brooks
2014-01-01
Forest erosion can lead to topsoil loss, and also to damaging deposits of sediment in aquatic ecosystems. For this reason, forest managers must be able to estimate the erosion potential of both planned management activities and catastrophic events, in order to decide where to use limited funds to focus erosion control efforts. To meet this need, scientists from RMRS (...
ERIC Educational Resources Information Center
Agho, Jude; Oseghale, Francis
2008-01-01
Feminism, especially the womanist brand, has been a very popular critical tool that most critics, men and women alike, have employed in their critical appraisal of African literary works. This is decidedly a very fertile area of contemporary scholarship. The emergence of this critical methodology in the African context stems from the perceived…
Development and pilot testing of a decision aid for drivers with dementia.
Carmody, John; Potter, Jan; Lewis, Kate; Bhargava, Sanjay; Traynor, Victoria; Iverson, Don
2014-03-19
An increasing number of older adults drive automobiles. Given that the prevalence of dementia is rising, it is necessary to address the issue of driving retirement. The purpose of this study is to evaluate how a self-administered decision aid contributed to decision making about driving retirement by individuals living with dementia. The primary outcome measure in this study was decisional conflict. Knowledge, decision, satisfaction with decision, booklet use and booklet acceptability were the secondary outcome measures. A mixed methods approach was adopted. Drivers with dementia were recruited from an Aged Care clinic and a Primary Care center in NSW, Australia. Telephone surveys were conducted before and after participants read the decision aid. Twelve participants were recruited (mean age 75, SD 6.7). The primary outcome measure, decisional conflict, improved following use of the decision aid. Most participants felt that the decision aid: (i) was balanced; (ii) presented information well; and (iii) helped them decide about driving. In addition, mean knowledge scores improved after booklet use. This decision aid shows promise as an acceptable, useful and low-cost tool for drivers with dementia. A self-administered decision aid can be used to assist individuals with dementia decide about driving retirement. A randomized controlled trial is underway to evaluate the effectiveness of the tool.
NASA Astrophysics Data System (ADS)
Kumar, Girish; Jain, Vipul; Gandhi, O. P.
2018-03-01
Maintenance helps to extend equipment life by improving its condition and avoiding catastrophic failures. An appropriate model or mechanism is thus needed to quantify system availability vis-a-vis a given maintenance strategy, which will assist in decision-making for optimal utilization of maintenance resources. This paper deals with semi-Markov process (SMP) modeling for steady-state availability analysis of mechanical systems that follow condition-based maintenance (CBM) and with evaluation of the optimal condition monitoring interval. The developed SMP model is solved using a two-stage analytical approach for steady-state availability analysis of the system. The CBM interval is then decided so as to maximize system availability using a genetic algorithm approach. The main contribution of the paper is a predictive tool for system availability that will help in deciding the optimum CBM policy. The proposed methodology is demonstrated for a centrifugal pump.
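A much-simplified numerical illustration of steady-state availability is shown below. It models a healthy/degraded/failed cycle as a continuous-time Markov chain and solves πQ = 0 with the normalisation Σπ = 1; the paper's two-stage semi-Markov treatment and genetic-algorithm optimisation are not reproduced, and all rates are invented.

```python
# Toy continuous-time Markov availability model; not the paper's semi-Markov analysis.
import numpy as np

lam_degrade = 1 / 2000    # healthy -> degraded, per hour (invented)
lam_fail    = 1 / 500     # degraded -> failed, if not maintained in time
mu_cbm      = 1 / 48      # degraded -> healthy via condition-based maintenance
mu_repair   = 1 / 120     # failed -> healthy via corrective repair

# States: 0 = healthy, 1 = degraded (still operating), 2 = failed
Q = np.array([
    [-lam_degrade,            lam_degrade,             0.0      ],
    [ mu_cbm,     -(mu_cbm + lam_fail),                lam_fail ],
    [ mu_repair,              0.0,                    -mu_repair],
])

# Solve pi @ Q = 0 together with sum(pi) = 1 (least squares on the stacked system)
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

availability = pi[0] + pi[1]        # healthy or degraded-but-operating
print(f"steady-state availability: {availability:.4f}")
```

Sweeping the maintenance rate (which stands in for the monitoring interval) and re-solving would show how availability responds to the CBM policy, which is the trade-off the paper optimises.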
Kopelman, Loretta M; Kopelman, Arthur E
2007-01-01
Clinicians sometimes disagree about how much to honor surrogates' deeply held cultural values or traditions when they differ from those of the host country. Such a controversy arose when parents requested a cultural accommodation to let their infant die by withdrawing life saving care. While both the parents and clinicians claimed to be using the Best Interests Standard to decide what to do, they were at an impasse. This standard is analyzed into three necessary and jointly sufficient conditions and used to resolve the question of how much to accommodate cultural preferences and how to treat this infant. The extreme versions of absolutism and relativism are rejected. Properly understood, the Best Interests Standard can serve as a powerful tool in settling disputes about how to make good decisions for those who cannot decide for themselves.
NASA Astrophysics Data System (ADS)
Barbasiewicz, Adrianna; Widerski, Tadeusz; Daliga, Karol
2018-01-01
This article was created as a result of research conducted for a master's thesis. The purpose of the measurements was to analyze the accuracy of point positioning by computer programs. The selected software was specialized software dedicated to photogrammetric work; for comparative purposes it was decided to use tools with similar functionality. The resolution of the photographs on which the key points were searched was selected as the basic parameter affecting the results. In order to determine the location of the points, it was decided to follow the photogrammetric resection rule, and to automate the measurement, measurement session planning was omitted. The coordinates of the points collected by tachymetric measurement were used as the reference system. The resulting deviations and linear displacements are on the order of millimetres. The visual aspects of the point clouds have also been briefly analyzed.
Breaking up is hard to do: how teens end violent dating relationships.
Martsolf, Donna S; Draucker, Claire Burke; Brandau, Melvina
2013-01-01
Dating violence affects nearly 30% of teens and is associated with numerous negative health outcomes. Teens do not tend to use adult or peer assistance to end violent dating relationships, and little is known about how they manage to end them. The purpose of this study was to determine the common ways in which teens end violent dating relationships. Grounded theory methods were used to analyze transcribed interviews conducted with a community sample of 83 young adults who had experienced dating violence as teens. Participants described six ways of ending violent dating relationships: deciding enough is enough; becoming interested in someone else; being on again, off again; fading away; deciding it's best for us both; and moving away. Professionals working with teens can present the six ways of breaking up as a tool to initiate discussion about the issues involved in ending violent dating relationships.
Rasheed, Waqas; Neoh, Yee Yik; Bin Hamid, Nor Hisham; Reza, Faruque; Idris, Zamzuri; Tang, Tong Boon
2017-10-01
Functional neuroimaging modalities play an important role in deciding the diagnosis and course of treatment of neuronal dysfunction and degeneration. This article presents an analytical tool with visualization by exploiting the strengths of the MEG (magnetoencephalographic) neuroimaging technique. The tool automates MEG data import (in tSSS format), channel information extraction, time/frequency decomposition, and circular graph visualization (connectogram) for simple result inspection. For advanced users, the tool also provides magnitude squared coherence (MSC) values allowing personalized threshold levels, and the computation of default model from MEG data of control population. Default model obtained from healthy population data serves as a useful benchmark to diagnose and monitor neuronal recovery during treatment. The proposed tool further provides optional labels with international 10-10 system nomenclature in order to facilitate comparison studies with EEG (electroencephalography) sensor space. Potential applications in epilepsy and traumatic brain injury studies are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
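The magnitude squared coherence values the tool reports can be computed per channel pair as in the short sketch below, which uses SciPy on simulated signals. It is not the tool's code; the sampling rate, signals, and frequency band are invented.

```python
# Illustrative magnitude squared coherence between two simulated channels; not the tool's code.
import numpy as np
from scipy.signal import coherence

fs = 1000.0                                  # sampling rate in Hz (invented)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
shared = np.sin(2 * np.pi * 10 * t)          # common 10 Hz component
ch_a = shared + 0.5 * rng.normal(size=t.size)
ch_b = 0.8 * shared + 0.5 * rng.normal(size=t.size)

f, msc = coherence(ch_a, ch_b, fs=fs, nperseg=1024)

# A per-pair value like this could then be thresholded to draw connectogram edges
band = (f >= 8) & (f <= 12)
print(f"mean MSC in the 8-12 Hz band: {msc[band].mean():.2f}")
```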
Generational Mentorship: How to Be a Mentor
ERIC Educational Resources Information Center
Altamirano, Ismari
2016-01-01
Ismari Altamirano has worked in higher education since 2009 and is currently the Associate Director in the registrar office. Here she describes her early mentee experience as learning by watching and listening rather than being an active participant in the relationship. Her mentor had her attend meetings where managers decided how to redistribute…
Embedded Managers in Informal Learning Spaces
ERIC Educational Resources Information Center
Raisin, Victoria; Fennewald, Joseph
2016-01-01
Many universities have decided to invest in updating their informal learning spaces. One decision to be made in planning the space is how to staff it. The researchers carried out a qualitative case study to better understand the perspective of learning space managers who work in offices within their assigned space. The research generated six…
Fact sheet to help academic laboratories decide whether to opt into the alternate set of hazardous waste requirements for eligible academic laboratories found in RCRA subpart K, how to plan for the transition to subpart K, and what first steps to take.
Cadres Decide Everything: The Pay and Pension Security of Workers in Science
ERIC Educational Resources Information Center
Rimashevskaia, N. M.; Zubova, L. T.; Antropova, O. A.
2011-01-01
Russian science is experiencing processes of personnel aging and stagnation, which are disrupting the continuity of the generations and are limiting prospective workers' opportunities for professional and career growth. The decline in the prestige of science work, the exodus of specialists into other, more attractive segments of economic activity…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Daci, Lulzime, E-mail: lulzime.daci@nodlandssykehuset.no; Malkaj, Partizan, E-mail: malkaj-p@hotmail.com
2016-03-25
In this study we analyzed and compared the dose distributions of different IMRT and VMAT plans with the intent of providing pre-treatment quality assurance using two different tools. Materials/Methods: We used the electronic portal imaging device (EPID), after calibration to dose and correction for the background offset signal, and also the Delta4 phantom, after an evaluation of its angular sensitivity. The Delta4 phantom has a two-dimensional array with ionization chambers. We analyzed three plans for each anatomical site calculated by the Eclipse treatment planning system. The measurements were analyzed using the γ-evaluation method with passing criteria of 3% absolute dose and 3 mm distance to agreement (DTA). For all the plans the scores ranged from 97% to 99% with the gantry fixed at 0°, while for rotational plans the pass rates were slightly lower but above 95%. Point measurements with an ionization chamber were done in addition to check the accuracy of portal dosimetry and to evaluate the response of the Delta4 device to various dose rates. Conclusions: Both Delta4 and portal dosimetry show good agreement between the measured and calculated doses. While Delta4 is more accurate in measurements, EPID is more time efficient. We have decided to use both methods in the first steps of IMRT and VMAT implementation and later on to decide which of the tools to use depending on the complexity of the plans, how accurate we want to be, and the time we have on the machine.
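The γ-evaluation with 3%/3 mm criteria can be illustrated in one dimension. The sketch below is a simplified, global-normalisation gamma computation on invented dose profiles; clinical QA tools such as those compared here operate on 2-D or 3-D dose grids and handle interpolation, thresholds, and local normalisation.

```python
# Simplified 1-D gamma evaluation (3% / 3 mm, global normalisation); invented profiles.
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dd=0.03, dta=3.0):
    """Return the gamma index at each reference point."""
    d_max = ref_dose.max()
    gammas = np.empty_like(ref_dose)
    for i, (r, dr) in enumerate(zip(ref_pos, ref_dose)):
        dist_term = ((eval_pos - r) / dta) ** 2
        dose_term = ((eval_dose - dr) / (dd * d_max)) ** 2
        gammas[i] = np.sqrt((dist_term + dose_term).min())
    return gammas

if __name__ == "__main__":
    pos = np.linspace(-50, 50, 201)                           # mm
    calculated = 100 * np.exp(-(pos / 30) ** 2)               # toy calculated profile
    measured = 1.02 * 100 * np.exp(-((pos - 0.5) / 30) ** 2)  # small dose and shift errors
    g = gamma_1d(pos, measured, pos, calculated)
    print(f"gamma pass rate: {100 * (g <= 1).mean():.1f}%")
```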
Khodambashi, Soudabeh; Nytrø, Øystein
2017-01-01
To facilitate the clinical guideline (GL) development process, different groups of researchers have proposed tools that enable computer-supported tools for authoring and publishing GLs. In a previous study we interviewed GL authors in different Norwegian institutions and identified tool shortcomings. In this follow-up study our goal is to explore to what extent GL authoring tools have been evaluated by researchers, guideline organisations, or GL authors. This article presents results from a systematic literature review of evaluation (including usability) of GL authoring tools. A controlled database search and backward snow-balling were used to identify relevant articles. From the 12692 abstracts found, 188 papers were fully reviewed and 26 papers were identified as relevant. The GRADEPro tool has attracted some evaluation, however popular tools and platforms such as DECIDE, Doctor Evidence, JBI-SUMARI, G-I-N library have not been subject to specific evaluation from an authoring perspective. Therefore, we found that little attention was paid to the evaluation of the tools in general. We could not find any evaluation relevant to how tools integrate and support the complex GL development workflow. The results of this paper are highly relevant to GL authors, tool developers and GL publishing organisations in order to improve and control the GL development and maintenance process.
Futility: a concept in evolution.
Burns, Jeffrey P; Truog, Robert D
2007-12-01
The debate about how to resolve cases in which patients and families demand interventions that clinicians regard as futile has been in evolution over the past 20 years. This debate can be divided into three generations. The first generation was characterized by attempts to define futility in terms of certain clinical criteria. These attempts failed because they proposed limitations to care based on value judgments for which there is no consensus among a significant segment of society. The second generation was a procedural approach that empowered hospitals, through their ethics committees, to decide whether interventions demanded by families were futile. Many hospitals adopted such policies, and some states incorporated this approach into legislation. This approach has also failed because it gives hospitals authority to decide whether or not to accede to demands that the clinicians regard as unreasonable, when any national consensus on what is a "beneficial treatment" remains under intense debate. Absent such a consensus, procedural mechanisms to resolve futility disputes inevitably confront the same insurmountable barriers as attempts to define futility. We therefore predict emergence of a third generation, focused on communication and negotiation at the bedside. We present a paradigm that has proven successful in business and law. In the small number of cases in which even the best efforts at communication and negotiation fail, we suggest that clinicians should find ways to better support each other in providing this care, rather than seeking to override the requests of these patients and families.
Ehrensperger, Michael M; Taylor, Kirsten I; Berres, Manfred; Foldi, Nancy S; Dellenbach, Myriam; Bopp, Irene; Gold, Gabriel; von Gunten, Armin; Inglin, Daniel; Müri, René; Rüegger, Brigitte; Kressig, Reto W; Monsch, Andreas U
2014-01-01
Optimal identification of subtle cognitive impairment in the primary care setting requires a very brief tool combining (a) patients' subjective impairments, (b) cognitive testing, and (c) information from informants. The present study developed a new, very quick and easily administered case-finding tool combining these assessments ('BrainCheck') and tested the feasibility and validity of this instrument in two independent studies. We developed a case-finding tool comprised of patient-directed (a) questions about memory and depression and (b) clock drawing, and (c) the informant-directed 7-item version of the Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE). Feasibility study: 52 general practitioners rated the feasibility and acceptance of the patient-directed tool. Validation study: An independent group of 288 Memory Clinic patients (mean ± SD age = 76.6 ± 7.9, education = 12.0 ± 2.6; 53.8% female) with diagnoses of mild cognitive impairment (n = 80), probable Alzheimer's disease (n = 185), or major depression (n = 23) and 126 demographically matched, cognitively healthy volunteer participants (age = 75.2 ± 8.8, education = 12.5 ± 2.7; 40% female) partook. All patient and healthy control participants were administered the patient-directed tool, and informants of 113 patient and 70 healthy control participants completed the very short IQCODE. Feasibility study: General practitioners rated the patient-directed tool as highly feasible and acceptable. Validation study: A Classification and Regression Tree analysis generated an algorithm to categorize patient-directed data which resulted in a correct classification rate (CCR) of 81.2% (sensitivity = 83.0%, specificity = 79.4%). Critically, the CCR of the combined patient- and informant-directed instruments (BrainCheck) reached nearly 90% (that is 89.4%; sensitivity = 97.4%, specificity = 81.6%). A new and very brief instrument for general practitioners, 'BrainCheck', combined three sources of information deemed critical for effective case-finding (that is, patients' subject impairments, cognitive testing, informant information) and resulted in a nearly 90% CCR. Thus, it provides a very efficient and valid tool to aid general practitioners in deciding whether patients with suspected cognitive impairments should be further evaluated or not ('watchful waiting').
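The Classification and Regression Tree step can be illustrated with a toy example using scikit-learn on simulated patient-directed data. This is not the BrainCheck algorithm or its data; the features, labels, and cut-offs are invented.

```python
# Toy CART illustration on simulated patient-directed data; not the BrainCheck algorithm.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(7)
n = 400
memory_complaint = rng.integers(0, 2, n)          # 0/1 patient-reported complaint
depression_score = rng.integers(0, 7, n)          # 0-6 screening items
clock_drawing    = rng.integers(0, 8, n)          # 0-7 points
# Simulated "impaired" label loosely driven by the three features
impaired = ((memory_complaint == 1) & (clock_drawing <= 4)
            | (depression_score >= 5)).astype(int)

X = np.column_stack([memory_complaint, depression_score, clock_drawing])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, impaired)

print(f"training correct classification rate: {tree.score(X, impaired):.2f}")
print(export_text(tree, feature_names=["memory_complaint",
                                       "depression_score",
                                       "clock_drawing"]))
```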
Expanding the Toolkit and Resource Environment to Assist Translation (TREAT) and Its User Base
2011-06-01
Figure 2. Screenshot of TREAT (translation of Arabic source into English target) and two corresponding markup tool windows on Arabic source. ...With the initial framework in place, we decided to expand TREAT to provide support to two new groups of users: students learning to be Arabic-language translators and teachers training them. The students and the teachers are native English speakers, so the training includes learning how to read Arabic...
Prototypical Images in Condom Scripts among AIDS-Bereaved Adolescents
ERIC Educational Resources Information Center
Reich, Warren A.; Rubin, Rachel M.
2007-01-01
Twenty-five HIV-negative late adolescents (13 women and 12 men) who had lost a parent to AIDS generated vignettes in which the characters were deciding whether to use a condom (condom scripts). Two clinically trained judges rated the interpersonal tone of the condom scripts on 17 semantic differential scales. Three other clinically trained raters…
41 CFR 102-192.145 - Which program levels should have a mail manager?
Code of Federal Regulations, 2010 CFR
2010-07-01
Section 102-192.145, Public Contracts and Property Management: Which program levels should have a mail manager? Every program level within a Federal agency that generates a significant quantity of outgoing mail should have its own mail manager. Each agency must decide which programs will have a...
ERIC Educational Resources Information Center
Poon, OiYan
2014-01-01
Despite their popular portrayal as high achieving and structurally incorporated, race continues to shape the career choices of Asian American college students. As second-generation Americans, Asian Americans negotiate a constellation of factors when deciding their career choices, most notably, pressures from immigrant parents, awareness of labor…
The Effect of Using Relative and Absolute Criteria to Decide Students' Passing or Failing a Course
ERIC Educational Resources Information Center
Sayin, Ayfer
2016-01-01
In formation education carried out within undergraduate and non-thesis graduate programs at the same university, different criteria are used to evaluate students' success. In this study, the classification accuracy of letter grades generated to evaluate students' success using relative and absolute criteria and…
NASA Astrophysics Data System (ADS)
Sasidhar, Jaladanki; Muthu, D.; Venkatasubramanian, C.; Ramakrishnan, K.
2017-07-01
The success of any construction project depends on managing resources efficiently so that the project is completed within a reasonable budget and time without compromising quality. Late procurement of materials, failure to deploy adequate labor at the right time, and delayed mobilization of machinery all cause delay and loss of quality and ultimately increase the project cost. It is well known that project cost can be controlled by taking corrective action on the mobilization of resources at the right time. This research focuses on integrating management systems with the computer to generate a model that uses an OOM data structure incorporating automatic commodity code generation, automatic takeoff execution, intelligent purchase order generation, and components of design and schedule integration to overcome stock-out problems. To overcome problems in the equipment management system, an inventory management module is suggested; a data set comprising equipment registration number, equipment number, description, date of purchase, manufacturer, equipment price, market value, life of equipment, and production data of the equipment (equipment number, date, name of the job, hourly rate, insurance, depreciation cost of the equipment, taxes, storage cost, interest, and oil, grease, and fuel consumption, etc.) is analyzed, and decision support systems are generated to overcome problems arising from improper management. Labor problems are managed through scheduling and strategic management of human resources. With the generated decision support tool, resources are mobilized at the right time, helping the project manager finish the project on time, avoid abnormal project costs, and see the percentage of improvement that can be achieved. The research also focuses on determining the percentage of delays caused by poor management of materials, manpower, and machinery in different types of projects and how this percentage varies from project to project.
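As a rough illustration of the equipment-inventory data set described above, the following sketch defines one structured record and a naive owning-cost calculation; all field names, the cost rule, and the numbers are assumptions for illustration, not the paper's actual module.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class EquipmentRecord:
    """One row of a hypothetical equipment-inventory module."""
    registration_no: str
    equipment_no: str
    description: str
    purchase_date: date
    manufacturer: str
    purchase_price: float
    market_value: float
    life_years: int
    hourly_rate: float      # production data: charge-out cost per operating hour
    fuel_per_hour: float    # litres of fuel consumed per operating hour

def hourly_owning_cost(rec: EquipmentRecord,
                       hours_per_year: float = 1500.0,
                       interest: float = 0.08) -> float:
    """Very rough straight-line depreciation plus interest, per operating hour."""
    depreciation = (rec.purchase_price - rec.market_value) / rec.life_years
    capital_charge = rec.purchase_price * interest
    return (depreciation + capital_charge) / hours_per_year

# Example usage with made-up numbers
excavator = EquipmentRecord("REG-001", "EX-07", "20 t excavator",
                            date(2015, 3, 1), "ACME", 120000.0, 40000.0,
                            10, 85.0, 18.0)
print(round(hourly_owning_cost(excavator), 2))
```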
Using bio.tools to generate and annotate workbench tool descriptions
Hillion, Kenzo-Hugo; Kuzmin, Ivan; Khodak, Anton; Rasche, Eric; Crusoe, Michael; Peterson, Hedi; Ison, Jon; Ménager, Hervé
2017-01-01
Workbench and workflow systems such as Galaxy, Taverna, Chipster, or Common Workflow Language (CWL)-based frameworks, facilitate the access to bioinformatics tools in a user-friendly, scalable and reproducible way. Still, the integration of tools in such environments remains a cumbersome, time consuming and error-prone process. A major consequence is the incomplete or outdated description of tools that are often missing important information, including parameters and metadata such as publication or links to documentation. ToolDog (Tool DescriptiOn Generator) facilitates the integration of tools - which have been registered in the ELIXIR tools registry (https://bio.tools) - into workbench environments by generating tool description templates. ToolDog includes two modules. The first module analyses the source code of the bioinformatics software with language-specific plugins, and generates a skeleton for a Galaxy XML or CWL tool description. The second module is dedicated to the enrichment of the generated tool description, using metadata provided by bio.tools. This last module can also be used on its own to complete or correct existing tool descriptions with missing metadata. PMID:29333231
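As a rough illustration of the enrichment idea, the sketch below pulls one registry entry and maps a few of its fields onto a minimal CWL-like skeleton. The API path, the fields used, and the baseCommand guess are assumptions; this is not ToolDog's actual code.

```python
import json
import urllib.request

BIOTOOLS_API = "https://bio.tools/api/tool/{tool_id}?format=json"  # assumed endpoint

def fetch_biotools_entry(tool_id: str) -> dict:
    """Download the registry record for one tool (network access required)."""
    with urllib.request.urlopen(BIOTOOLS_API.format(tool_id=tool_id)) as resp:
        return json.load(resp)

def cwl_skeleton(entry: dict) -> dict:
    """Map a few registry fields onto a minimal CommandLineTool skeleton."""
    return {
        "cwlVersion": "v1.0",
        "class": "CommandLineTool",
        "label": entry.get("name", ""),
        "doc": entry.get("description", ""),
        "baseCommand": entry.get("name", "").lower(),  # placeholder guess
        "inputs": {},    # to be completed by hand or by source-code analysis
        "outputs": {},
    }

if __name__ == "__main__":
    entry = fetch_biotools_entry("samtools")   # hypothetical tool id
    print(json.dumps(cwl_skeleton(entry), indent=2))
```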
Report on Lincoln Electric System gas turbine inlet air cooling. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ebeling, J.A.; Buecker, B.J.; Kitchen, B.J.
1993-12-01
As a result of increased electric power demand, the Lincoln Electric System (LES) of Lincoln, Nebraska (USA) decided to upgrade the generating capacity of their system. Based on capacity addition studies, the utility elected to improve performance of a GE MS7001B combustion turbine located at their Rokeby station. The turbine is used to meet summer-time peak loads, and as is common among combustion turbines, capacity declines as ambient air temperature rises. To improve the turbine capacity, LES decided to employ the proven technique of inlet air cooling, but with a novel approach: off-peak ice generation to be used for peak-load air cooling. EPRI contributed design concept definition and preliminary engineering. The American Public Power Association provided co-funding. Burns & McDonnell Engineering Company, under contract to Lincoln Electric System, provided detailed design and construction documents. LES managed the construction, start-up, and testing of the cooling system. This report describes the technical basis for the cooling system design, and it discusses combustion turbine performance, project economics, and potential system improvements. Control logic and P&ID drawings are also included. The inlet air cooling system has been available since the fall of 1991. When in use, the cooling system has increased turbine capacity by up to 17% at a cost of less than $200 per increased kilowatt of generation.
Improving Cancer-Related Outcomes with Connected Health - Part 2: Objective 2
A core principle of connected health is that individuals are empowered to decide when, whether, and how much to participate in their health and healthcare (see Principles of Connected Health in Part 1). Decisions about participation may change over time. Connected health tools are needed to ensure that people at risk for cancer, cancer patients, and cancer survivors have access to the information they need when they need it and in formats that meet their needs.
Analysis of Decisions Made Using the Analytic Hierarchy Process
2013-09-01
country petroleum pipelines (Dey, 2003), deciding how best to manage U.S. watersheds (De Steiguer, Duberstein, and Lopes, 2003), and the U.S. Army... many benefits to its use. Primarily these fall under the heading of managing chaos. Specifically, the AHP is a tool that can be used to simplify and... originally. The commonly used scenario is this: the waiter asks if you want chicken or fish, and you reply fish. The waiter then remembers that steak is
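For readers unfamiliar with the mechanics, a minimal sketch of the AHP's core computation, deriving priority weights from a pairwise-comparison matrix and checking Saaty's consistency ratio, might look like this; the example matrix values are invented for illustration.

```python
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> np.ndarray:
    """Principal-eigenvector priority weights for an AHP comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    principal = np.argmax(vals.real)
    w = np.abs(vecs[:, principal].real)
    return w / w.sum()

def consistency_ratio(pairwise: np.ndarray) -> float:
    """Saaty consistency ratio, using standard random-index values for n = 3..5."""
    n = pairwise.shape[0]
    lam_max = np.max(np.linalg.eigvals(pairwise).real)
    ci = (lam_max - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]
    return ci / ri

# Invented example: weighting three criteria (taste, price, speed) for a menu choice
criteria = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])
print(ahp_weights(criteria).round(3), round(consistency_ratio(criteria), 3))
```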
More Chinese Students Abroad Are Deciding Not To Return Home.
ERIC Educational Resources Information Center
Hertling, James
1997-01-01
In 18 years, over 260,000 Chinese students have left China to study abroad, and only about one-third have returned. Their flight is compounding the devastation of China's knowledge and talent pool that began with Mao Ze-dong. China is encouraging study abroad, to rectify the loss of a generation of academics, and is most interested in science and…
A comparison of techniques for generating forest ownership spatial products
Brett J. Butler; Jaketon H. Hewes; Greg C. Liknes; Mark D. Nelson; Stephanie A. Snyder
2014-01-01
To fully understand forest resources, it is imperative to understand the social context in which the forests exist. A pivotal part of that context is the forest ownership. It is the owners, operating within biophysical and social constraints, who ultimately decide if the land will remain forested, how the resources will be used, and by whom. Forest ownership patterns...
The Stresses of the Second-Year Generation Y Medical Student: A Phenomenological Study
ERIC Educational Resources Information Center
Ivins, Margaret
2013-01-01
The second year of medical school is widely considered a difficult year. During the second year, the students may experience their first patient interaction as well as working with physicians directly in a hospital or in a clinic. In addition, during the second year of medical school, students may decide that they do not like working with patients…
USDA-ARS?s Scientific Manuscript database
A first step in exploring population structure in crop plants and other organisms is to define the number of subpopulations that exist for a given data set. The genetic marker data sets being generated have become increasingly large over time and commonly are the high-dimension, low sample size (HDL...
Open source 3D visualization and interaction dedicated to hydrological models
NASA Astrophysics Data System (ADS)
Richard, Julien; Giangola-Murzyn, Agathe; Gires, Auguste; Tchiguirinskaia, Ioulia; Schertzer, Daniel
2014-05-01
Climate change and surface urbanization strongly modify the hydrological cycle in urban areas, increasing the consequences of extreme events such as floods or droughts. These issues led to the development of the Multi-Hydro model at the Ecole des Ponts ParisTech (A. Giangola-Murzyn et al., 2012). This fully distributed model computes the hydrological response of urban and peri-urban areas. Unfortunately, such models are seldom user friendly: generating the inputs before launching a new simulation is usually a tricky task, and understanding and interpreting the outputs remain specialist tasks not accessible to the wider public. The MH-AssimTool was developed to overcome these issues. To enable an easier and improved understanding of the model outputs, we decided to convert the raw output data (grid files in ASCII format) to a 3D display. Some commercial models provide 3D visualization, but the cost of their licenses may put such tools out of reach of the most concerned stakeholders. We are therefore developing a new tool based on C++ for the computation, Qt for the graphical user interface, QGIS for the geographical side and OpenGL for the 3D display. All these languages and libraries are open source and multi-platform. We will discuss some preprocessing issues in the data conversion from 2.5D to 3D. Indeed, the GIS data are considered 2.5D (i.e. a 2D polygon plus one height), and transforming them for 3D display requires a number of algorithms. For example, to visualize one building in 3D, each point needs coordinates and an elevation consistent with the topography, and new points have to be created to represent the walls. Finally, the interactions between the model and stakeholders through this new interface, and how this helps convert a research tool into an efficient operational decision tool, will be discussed. This ongoing research on the improvement of visualization methods is supported by the KIC-Climate Blue Green Dream project.
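A minimal sketch of the 2.5D-to-3D step discussed above, extruding a building footprint (2D polygon plus one height) from a ground elevation into one wall quad per edge, is shown below; the function and data layout are assumptions for illustration, not MH-AssimTool's implementation.

```python
from typing import List, Tuple

Point2D = Tuple[float, float]
Point3D = Tuple[float, float, float]

def extrude_footprint(footprint: List[Point2D],
                      ground_z: float,
                      height: float) -> List[List[Point3D]]:
    """Turn a closed 2D polygon into one quad (4 vertices) per wall."""
    walls = []
    n = len(footprint)
    for i in range(n):
        (x0, y0), (x1, y1) = footprint[i], footprint[(i + 1) % n]
        walls.append([
            (x0, y0, ground_z),            # bottom edge follows the terrain
            (x1, y1, ground_z),
            (x1, y1, ground_z + height),   # top edge at roof level
            (x0, y0, ground_z + height),
        ])
    return walls

# Square 10 m x 10 m building, ground at 35 m elevation, 6 m tall
quads = extrude_footprint([(0, 0), (10, 0), (10, 10), (0, 10)], 35.0, 6.0)
print(len(quads), quads[0])
```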
NASA Technical Reports Server (NTRS)
Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert
2005-01-01
Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.
Sense, decide, act, communicate (SDAC): next generation of smart sensor systems
NASA Astrophysics Data System (ADS)
Berry, Nina; Davis, Jesse; Ko, Teresa H.; Kyker, Ron; Pate, Ron; Stark, Doug; Stinnett, Regan; Baker, James; Cushner, Adam; Van Dyke, Colin; Kyckelhahn, Brian
2004-09-01
The recent war on terrorism and increased urban warfare has been a major catalysis for increased interest in the development of disposable unattended wireless ground sensors. While the application of these sensors to hostile domains has been generally governed by specific tasks, this research explores a unique paradigm capitalizing on the fundamental functionality related to sensor systems. This functionality includes a sensors ability to Sense - multi-modal sensing of environmental events, Decide - smart analysis of sensor data, Act - response to environmental events, and Communication - internal to system and external to humans (SDAC). The main concept behind SDAC sensor systems is to integrate the hardware, software, and networking to generate 'knowledge and not just data'. This research explores the usage of wireless SDAC units to collectively make up a sensor system capable of persistent, adaptive, and autonomous behavior. These systems are base on the evaluation of scenarios and existing systems covering various domains. This paper presents a promising view of sensor network characteristics, which will eventually yield smart (intelligent collectives) network arrays of SDAC sensing units generally applicable to multiple related domains. This paper will also discuss and evaluate the demonstration system developed to test the concepts related to SDAC systems.
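The SDAC functionality can be summarised as a simple node-level control loop; the schematic sketch below uses invented thresholds, sensing models, and message formats, and is only meant to make the four roles concrete.

```python
import random
import time

THRESHOLD = 0.8   # invented detection threshold

def sense() -> float:
    """Stand-in for a multi-modal sensor reading, normalised to [0, 1]."""
    return random.random()

def decide(reading: float) -> bool:
    """Smart analysis reduced to a single threshold test for illustration."""
    return reading > THRESHOLD

def act(reading: float) -> None:
    print(f"actuating local response for reading {reading:.2f}")

def communicate(reading: float) -> None:
    print(f"reporting event {reading:.2f} to neighbours / operator")

def sdac_loop(cycles: int = 5) -> None:
    for _ in range(cycles):
        r = sense()
        if decide(r):
            act(r)
            communicate(r)
        time.sleep(0.1)   # duty-cycle to conserve the node's battery

sdac_loop()
```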
Cheng, Hon Wai Benjamin; Chan, Kwok Ying; Lau, Hoi To; Man, Ching Wah; Cheng, Suk Ching; Lam, Carman
2017-05-01
Normochromic normocytic anemia is a common complication in chronic kidney disease (CKD) and is associated with many adverse clinical consequences. Erythropoiesis-stimulating agents (ESAs) act to replace endogenous erythropoietin in patients with end-stage renal disease who have anemia. Today, ESAs remain the main tool for treating anemia associated with CKD. In current practice, the use of ESAs is not limited to patients on renal replacement therapy but has extended to nondialysis patients under palliative care (PC). Current evidence on ESA usage in patients with CKD who have decided to forego dialysis often has to be drawn from studies conducted in other groups of patients with CKD, including pre-dialysis patients and those on renal replacement therapy. There is a paucity of studies targeting the use of ESAs in renal PC patients. A small-scale retrospective study in renal PC patients suggested a clinical advantage of ESAs in terms of hemoglobin improvement, reduction in fatigue, and hospitalization rate. With the expected growth in the number of elderly patients with CKD who decide to forego dialysis and are managed conservatively, there remains an urgent need for a large-scale prospective trial exploring the efficacy of ESAs in this population, targeting quality of life and symptom improvement outcomes. This article also reviews the mechanism of action, pharmacology, adverse effects, and clinical trial evidence for ESAs in patients with CKD under renal PC.
Kleene Monads: Handling Iteration in a Framework of Generic Effects
NASA Astrophysics Data System (ADS)
Goncharov, Sergey; Schröder, Lutz; Mossakowski, Till
Monads are a well-established tool for modelling various computational effects. They form the semantic basis of Moggi’s computational metalanguage, the metalanguage of effects for short, which made its way into modern functional programming in the shape of Haskell’s do-notation. Standard computational idioms call for specific classes of monads that support additional control operations. Here, we introduce Kleene monads, which additionally feature nondeterministic choice and Kleene star, i.e. nondeterministic iteration, and we provide a metalanguage and a sound calculus for Kleene monads, the metalanguage of control and effects, which is the natural joint extension of Kleene algebra and the metalanguage of effects. This provides a framework for studying abstract program equality focussing on iteration and effects. These aspects are known to have decidable equational theories when studied in isolation. However, it is well known that decidability breaks easily; e.g. the Horn theory of continuous Kleene algebras fails to be recursively enumerable. Here, we prove several negative results for the metalanguage of control and effects; in particular, already the equational theory of the unrestricted metalanguage of control and effects over continuous Kleene monads fails to be recursively enumerable. We proceed to identify a fragment of this language which still contains both Kleene algebra and the metalanguage of effects and for which the natural axiomatisation is complete, and indeed the equational theory is decidable.
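For orientation, the two characteristic laws that any treatment of Kleene star must account for are the unfolding and induction (least-fixpoint) laws of standard Kleene algebra, shown below in conventional notation; this is background material, not a statement of the paper's specific calculus.

```latex
% Kleene star: unfolding and left-induction laws of Kleene algebra
\begin{align*}
  1 + p\,p^{*} &\le p^{*}, \\
  q + p\,x \le x \;&\Longrightarrow\; p^{*} q \le x .
\end{align*}
```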
Bhattacharya, Arpita; Kolovson, Samantha; Sung, Yi-Chen; Eacker, Mike; Chen, Michael; Munson, Sean A; Kientz, Julie A
2018-03-01
Most health technologies are designed to support people who have already decided to work toward better health. Thus, there remains an opportunity to design technologies to help motivate people who have not yet decided to make a change. Understanding the experiences of people who have already started to make a health behavior change and how they made a pivotal decision can be useful in understanding how to design such tools. In this paper, we describe results from data collected in 2 phases. Phase 1 consisted of 127 surveys and 13 interviews with adults who have already accomplished behavior change(s). Phase 2 consisted of 117 surveys and 12 interviews with adults who have either already accomplished their behavior change(s) or are currently working toward them. We identified four factors that lead to pivotal experiences: (1) prolonged discontent and desire to change, (2) significant changes that increase fear or hope of future, (3) increased understanding of one's behavior and personal data, and (4) social accountability. We also describe a design space for designing technology-based interventions for encouraging people to decide to make a change to improve their health. Based on feedback from participants, we discuss opportunities for further exploration of the design space for people who are not yet motivated to change and for ethical considerations for this type of intervention. Copyright © 2018 Elsevier Inc. All rights reserved.
Cardiovascular point of care initiative: enhancements in clinical data management.
Robertson, Jane
2003-01-01
The Department of Cardiovascular Surgery at East Alabama Medical Center (EAMC) initiated a program in 1996 to improve the quality and usefulness of clinical outcomes data. After years of using a commercial vendor product and enduring a tedious collection process, the department decided to develop its own tools to support quality improvement efforts. Using a hand-held personal data assistant (PDA), the team developed tools that allowed ongoing data collection at the point of care delivery. The tools and methods facilitated the collection of real time, accurate information that allowed EAMC to participate in multiple clinical quality initiatives. The ability to conduct rapid-cycle performance improvement studies propelled EAMC's Cardiovascular Surgery Program into the Top 100 as recognized by HCIA, now Solucient, for 3 consecutive years (1999-2001). This report will describe the evolution of the data collection process as well as the quality improvements that resulted.
O'Hanley, Jesse R; Wright, Jed; Diebel, Matthew; Fedora, Mark A; Soucy, Charles L
2013-08-15
Systematic methods for prioritizing the repair and removal of fish passage barriers, while growing of late, have hitherto focused almost exclusively on meeting the needs of migratory fish species (e.g., anadromous salmonids). An important but as of yet unaddressed issue is the development of new modeling approaches which are applicable to resident fish species habitat restoration programs. In this paper, we develop a budget constrained optimization model for deciding which barriers to repair or remove in order to maximize habitat availability for stream resident fish. Habitat availability at the local stream reach is determined based on the recently proposed C metric, which accounts for the amount, quality, distance and level of connectivity to different stream habitat types. We assess the computational performance of our model using geospatial barrier and stream data collected from the Pine-Popple Watershed, located in northeast Wisconsin (USA). The optimization model is found to be an efficient and practical decision support tool. Optimal solutions, which are useful in informing basin-wide restoration planning efforts, can be generated on average in only a few minutes. Copyright © 2013 Elsevier Ltd. All rights reserved.
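The decision problem has the flavour of a budget-constrained selection (knapsack) model; the toy sketch below makes strong simplifying assumptions, in particular independent, additive habitat gains per barrier, which ignores the connectivity structure that the C metric actually captures, and all costs and gains are invented.

```python
from itertools import combinations
from typing import Dict, List, Tuple

def best_barrier_set(cost: Dict[str, float],
                     gain: Dict[str, float],
                     budget: float) -> Tuple[List[str], float]:
    """Brute-force the subset of barriers maximising habitat gain under a budget.

    Fine for a handful of barriers; real watershed problems need ILP or dynamic
    programming rather than enumeration.
    """
    best, best_gain = [], 0.0
    names = list(cost)
    for r in range(len(names) + 1):
        for subset in combinations(names, r):
            if sum(cost[b] for b in subset) <= budget:
                g = sum(gain[b] for b in subset)
                if g > best_gain:
                    best, best_gain = list(subset), g
    return best, best_gain

# Invented repair costs (k$) and habitat gains (habitat units)
cost = {"culvert_A": 40, "dam_B": 120, "culvert_C": 25, "weir_D": 60}
gain = {"culvert_A": 3.0, "dam_B": 9.5, "culvert_C": 1.8, "weir_D": 5.2}
print(best_barrier_set(cost, gain, budget=130))
```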
NASA Astrophysics Data System (ADS)
Song, Xiaolong; Yang, Jianxin; Lu, Bin; Yang, Dong
2017-01-01
China is now facing e-waste problems from both growing domestic generation and illegal imports. Many stakeholders are involved in the e-waste treatment system due to the complexity of e-waste life cycle. Beginning with the state of the e-waste treatment industry in China, this paper summarizes the latest progress in e-waste management from such aspects as the new edition of the China RoHS Directive, new Treatment List, new funding subsidy standard, and eco-design pilots. Thus, a conceptual model for life cycle management of e-waste is generalized. The operating procedure is to first identify the life cycle stages of the e-waste and extract the important life cycle information. Then, life cycle tools can be used to conduct a systematic analysis to help decide how to maximize the benefits from a series of life cycle engineering processes. Meanwhile, life cycle thinking is applied to improve the legislation relating to e-waste so as to continuously improve the sustainability of the e-waste treatment system. By providing an integrative framework, the life cycle management of e-waste should help to realize sustainable management of e-waste in developing countries.
Cumulative hazard: The case of nuisance flooding
NASA Astrophysics Data System (ADS)
Moftakhari, Hamed R.; AghaKouchak, Amir; Sanders, Brett F.; Matthew, Richard A.
2017-02-01
The cumulative cost of frequent events (e.g., nuisance floods) over time may exceed the costs of the extreme but infrequent events for which societies typically prepare. Here we analyze the likelihood of exceedances above mean higher high water and the corresponding property value exposure for minor, major, and extreme coastal floods. Our results suggest that, in response to sea level rise, nuisance flooding (NF) could generate property value exposure comparable to, or larger than, extreme events. Determining whether (and when) low cost, nuisance incidents aggregate into high cost impacts and deciding when to invest in preventive measures are among the most difficult decisions for policymakers. It would be unfortunate if efforts to protect societies from extreme events (e.g., 0.01 annual probability) left them exposed to a cumulative hazard with enormous costs. We propose a Cumulative Hazard Index (CHI) as a tool for framing the future cumulative impact of low cost incidents relative to infrequent extreme events. CHI suggests that in New York, NY, Washington, DC, Miami, FL, San Francisco, CA, and Seattle, WA, a careful consideration of socioeconomic impacts of NF for prioritization is crucial for sustainable coastal flood risk management.
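The comparison underlying the argument, cumulative exposure from frequent minor floods versus a single rare extreme event, can be sketched as a sum of expected annual exposures over a planning horizon; every number below is invented, and this is not the paper's CHI definition.

```python
def expected_annual_exposure(annual_rate: float, exposure_per_event: float) -> float:
    """Expected exposure per year = event rate (events/yr) x exposure per event."""
    return annual_rate * exposure_per_event

# Invented figures: nuisance floods are cheap but frequent,
# while the extreme event is costly but has a 0.01 annual probability.
nuisance = expected_annual_exposure(annual_rate=10.0, exposure_per_event=2e6)
extreme = expected_annual_exposure(annual_rate=0.01, exposure_per_event=1.5e9)

horizon_years = 30
print("nuisance exposure over horizon:", nuisance * horizon_years)
print("extreme exposure over horizon: ", extreme * horizon_years)
```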
Dodaro, Antonio; Recchia, Virginia
2011-11-01
The phenomenon of inappropriateness in ionizing imaging and medical interventions is widespread and increasing. This tendency causes considerable harm to health and to patient autonomy, and it drives a large increase in health expenditures, waiting lists, organizational conflicts, judicial disputes, and insurance compensations. The current practice of passive signatures on unreadable informed-consent templates in Italian hospitals is, as a matter of fact, a central node of the inappropriateness problem. This way of managing informed consent, the "event" model, undermines the patient's right to decide freely and deliberately, since the patient remains unaware of the biological consequences of clinical and therapeutic interventions for his own health and that of his progeny. The physician, in turn, may perform arbitrary clinical acts, with heavy deontological and legal consequences. Hence, informed consent in ionizing imaging requires a dedicated "process" management, useful for steering a series of other clinical and organizational processes toward a full realization of the therapeutic alliance between physician and patient. This review aims to highlight, from a juridical and communicative perspective, a range of tools applicable to countering the hospital overuse of ionizing radiation, in defense of patients' health and dignity as persons and citizens of a state governed by the rule of law.
Valpied, Jodie; Koziol-McLain, Jane; Glass, Nancy; Hegarty, Kelsey
2017-01-01
The use of Web-based methods to deliver and evaluate interventions is growing in popularity, particularly in a health care context. They have shown particular promise in responding to sensitive or stigmatized issues such as mental health and sexually transmitted infections. In the field of domestic violence (DV), however, the idea of delivering and evaluating interventions via the Web is still relatively new. Little is known about how to successfully navigate several challenges encountered by the researchers while working in this area. This paper uses the case study of I-DECIDE, a Web-based healthy relationship tool and safety decision aid for women experiencing DV, developed in Australia. The I-DECIDE website has recently been evaluated through a randomized controlled trial, and we outline some of the methodological and ethical challenges encountered during recruitment, retention, and evaluation. We suggest that with careful consideration of these issues, randomized controlled trials can be safely conducted via the Web in this sensitive area. PMID:28351830
ANFIS multi criteria decision making for overseas construction projects: a methodology
NASA Astrophysics Data System (ADS)
Utama, W. P.; Chan, A. P. C.; Zulherman; Zahoor, H.; Gao, R.; Jumas, D. Y.
2018-02-01
A critical issue for a company targeting a foreign market is how to make better decisions about potential project selection. Since the different attributes of information are often incomplete, imprecise and ill-defined in overseas project selection, making decisions by relying on experience and intuition alone is risky. This paper aims to demonstrate a decision support method for deciding on overseas construction projects (OCPs). An Adaptive Neuro-Fuzzy Inference System (ANFIS), an amalgamation of neural networks and fuzzy theory, was used as a decision support tool for the go/no-go decision on OCPs. Root mean square error (RMSE) and the coefficient of correlation (R) were employed to identify the ANFIS configuration giving an optimal and efficient result. The optimum result was obtained from an ANFIS network with two input membership functions, the Gaussian membership function (gaussmf), and the hybrid optimization method. The results show that ANFIS may help the decision-making process for the go/no-go decision on OCPs.
Slok, Annerika H M; in 't Veen, Johannes C C M; Chavannes, Niels H; van der Molen, Thys; Rutten-van Mölken, Maureen P M H; Kerstjens, Huib A M; Salomé, Philippe L; Holverda, Sebastiaan; Dekhuijzen, P N Richard; Schuiten, Denise; Asijee, Guus M; van Schayck, Onno C P
2014-07-10
In deciding on the treatment plan for patients with chronic obstructive pulmonary disease (COPD), the burden of COPD as experienced by patients should be the core focus. It is therefore important for daily practice to develop a tool that can both assess the burden of COPD and facilitate communication with patients in clinical practice. This paper describes the development of an integrated tool to assess the burden of COPD in daily practice. A definition of the burden of COPD was formulated by a Dutch expert team. Interviews showed that patients and health-care providers agreed on this definition. We found no existing instruments that fully measured burden of disease according to this definition. However, the Clinical COPD Questionnaire meets most requirements, and was therefore used and adapted. The adapted questionnaire is called the Assessment of Burden of COPD (ABC) scale. In addition, the ABC tool was developed, of which the ABC scale is the core part. The ABC tool is a computer program with an algorithm that visualises outcomes and provides treatment advice. The next step in the development of the tool is to test the validity and effectiveness of both the ABC scale and tool in daily practice.
Developing Oral Case Presentation Skills: Peer and Self-Evaluations as Instructional Tools.
Williams, Dustyn E; Surakanti, Shravani
2016-01-01
Oral case presentation is an essential skill in clinical practice that is decidedly varied and understudied in teaching curricula. We developed a curriculum to improve oral case presentation skills in medical students. As part of an internal medicine clerkship, students receive instruction in the elements of a good oral case presentation and then present a real-world case in front of a video camera. Each student self-evaluates his/her presentation and receives evaluations from his/her peers. We expect peer and self-evaluation to be meaningful tools for developing skills in oral presentation. We hope to not only improve the quality of oral case presentations by students but also to reduce the time burden on faculty.
Web Tools: The Second Generation
ERIC Educational Resources Information Center
Pascopella, Angela
2008-01-01
Web 2.0 tools and technologies, or second generation tools, help districts to save time and money, and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, which falls under the 21st-century skills. The second-generation tools are growing in popularity…
Patterns in Safety-Related Projects
NASA Astrophysics Data System (ADS)
Parsons, Mike; Hunter, Charles
Within Logica UK, safety-related projects are run in a variety of ways depending on the constraints imposed and how the risks and mitigations are owned and handled. A total of eight different types of project development patterns have been identified and this paper discusses each type. A simple decision tool has been developed based on the patterns which is used as an aid in deciding how to bid a safety project, allowing tradeoffs between risk ownership, development methods and cost to be assessed.
Tsunami Generation Modelling for Early Warning Systems
NASA Astrophysics Data System (ADS)
Annunziato, A.; Matias, L.; Ulutas, E.; Baptista, M. A.; Carrilho, F.
2009-04-01
In the frame of a collaboration between the European Commission Joint Research Centre and the Institute of Meteorology in Portugal, a complete analytical tool to support Early Warning Systems is being developed. The tool will be part of the Portuguese National Early Warning System and will also be used in the frame of the UNESCO North Atlantic section of the Tsunami Early Warning System. The system, called the Tsunami Analysis Tool (TAT), includes a worldwide scenario database that has been pre-calculated using the SWAN-JRC code (Annunziato, 2007). This code uses a simplified fault generation mechanism, and its hydraulic model is based on the SWAN code (Mader, 1988). In addition to the pre-defined scenarios, a system of computers is always ready to start a new calculation whenever a new earthquake is detected by the seismic networks (such as USGS or EMSC) and is judged capable of generating a tsunami. The calculation is performed using minimal parameters (the epicentre and the magnitude of the earthquake): the programme estimates the rupture length and rupture width using the empirical relationships proposed by Ward (2002). The database calculations, as well as the newly generated calculations for the current conditions, are therefore available to TAT, where the real online analysis is performed. The system also allows sea level measurements available worldwide to be analyzed, in order to compare them with the calculations and decide whether a tsunami is really occurring or not. Although TAT, connected with the scenario database and the online calculation system, is at the moment the only software that can support tsunami analysis on a global scale, we are convinced that the fault generation mechanism is too simplified to give a correct tsunami prediction. Furthermore, short tsunami arrival times in particular require earthquake source parameter data on the tectonic features of the faults, such as strike, dip, rake and slip, in order to minimize the real-time uncertainty of the rupture parameters. Indeed, the earthquake parameters available right after an earthquake are preliminary and can be inaccurate. Determining which earthquake source parameters affect the initial height and time series of tsunamis will show the sensitivity of the tsunami time series to seismic source details. Therefore a new fault generation model will be adopted, according to the seismotectonic properties of the different regions, and finally included in the calculation scheme. To do this, within the collaboration framework with the Portuguese authorities, a new model is being defined, starting from the seismic sources in the North Atlantic, the Caribbean and the Gulf of Cadiz. As earthquakes occurring in the North Atlantic and Caribbean sources may affect mainland Portugal and the Azores and Madeira archipelagos, these sources will also be included in the analysis. We have first started to examine the geometries of those sources that spawn tsunamis, to understand the effect of fault geometry and earthquake depths. References: Annunziato, A., 2007. The Tsunami Assessment Modelling System by the Joint Research Centre, Science of Tsunami Hazards, Vol. 26, pp. 70-92. Mader, C.L., 1988. Numerical Modelling of Water Waves, University of California Press, Berkeley, California. Ward, S.N., 2002. Tsunamis, Encyclopedia of Physical Science and Technology, Vol. 17, pp. 175-191, ed. Meyers, R.A., Academic Press.
Metrology laboratory requirements for third-generation synchrotron radiation sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takacs, P.Z.; Quian, Shinan
1997-11-01
New third-generation synchrotron radiation sources that are now, or will soon, come on line will need to decide how to handle the testing of optical components delivered for use in their beam lines. In many cases it is desirable to establish an in-house metrology laboratory to do the work. We review the history behind the formation of the Optical Metrology Laboratory at Brookhaven National Laboratory and the rationale for its continued existence. We offer suggestions to those who may be contemplating setting up similar facilities, based on our experiences over the past two decades.
Undecidability of the elementary theory of the semilattice of GLP-words
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pakhomov, Fedor N
The Lindenbaum algebra of Peano arithmetic (PA) can be enriched by the n-consistency operators, which assign to a given formula the statement that the formula is compatible with the theory PA extended by the set of all true Πn-sentences. In the Lindenbaum algebra of PA, a lower semilattice is generated from 1 by the n-consistency operators. We prove the undecidability of the elementary theory of this semilattice and the decidability of the elementary theory of its subsemilattice generated by the 0-consistency and 1-consistency operators only. Bibliography: 16 titles.
Data Sharing Interviews with Crop Sciences Faculty: Why They Share Data and How the Library Can Help
ERIC Educational Resources Information Center
Williams, Sarah C.
2013-01-01
This study was designed to generate a deeper understanding of data sharing by targeting faculty members who had already made data publicly available. During interviews, crop scientists at the University of Illinois at Urbana-Champaign were asked why they decided to share data, why they chose a data sharing method (e. g., supplementary file,…
NASA Astrophysics Data System (ADS)
Karki, Rajesh
Renewable energy application in electric power systems is growing rapidly worldwide due to enhanced public concern over adverse environmental impacts and the escalation in energy costs associated with conventional energy sources. Photovoltaic and wind energy sources are increasingly recognized as cost-effective generation sources. A comprehensive evaluation of reliability and cost is required to analyze the actual benefits of utilizing these energy sources. The reliability aspects of utilizing renewable energy sources have largely been ignored in the past due to the relatively insignificant contribution of these sources in major power systems, and consequently due to the lack of appropriate techniques. Renewable energy sources have the potential to play a significant role in meeting the electrical energy requirements of small isolated power systems, which are primarily supplied by costly diesel fuel. A relatively high renewable energy penetration can significantly reduce system fuel costs but can also have a considerable impact on system reliability. Small isolated systems routinely plan their generating facilities using deterministic adequacy methods that cannot incorporate the highly erratic behavior of renewable energy sources. The use of a single probabilistic risk index has not been generally accepted in small isolated system evaluation, despite its adoption in most large power utilities. Deterministic and probabilistic techniques are combined in this thesis using a system well-being approach to provide useful adequacy indices for small isolated systems that include renewable energy. This thesis presents an evaluation model for small isolated systems containing renewable energy sources by integrating simulation models that generate appropriate atmospheric data, evaluate chronological renewable power outputs, and combine total available energy and load to provide useful system indices. A software tool, SIPSREL+, has been developed which generates risk, well-being, and energy-based indices to provide realistic cost/reliability measures of utilizing renewable energy. The concepts presented and the examples illustrated in this thesis will help system planners decide on appropriate installation sites, the types and mix of different energy generating sources, the optimum operating policies, and the optimum generation expansion plans required to meet increasing load demands in small isolated power systems containing photovoltaic and wind energy sources.
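A drastically simplified sketch of the sequential Monte Carlo idea behind such adequacy indices, sampling hourly wind output and load and counting shortfall hours, is shown below; the distributions, capacities, and the absence of unit outages are all invented simplifications, and SIPSREL+ itself is far more detailed.

```python
import random

def simulate_lole(hours: int = 8760,
                  diesel_capacity: float = 4.0,   # MW, assumed always available
                  wind_capacity: float = 2.0,     # MW of installed wind
                  peak_load: float = 5.0) -> int:
    """Return estimated loss-of-load hours per year from a crude Monte Carlo."""
    shortfall_hours = 0
    for _ in range(hours):
        wind = wind_capacity * random.betavariate(2, 5)   # crude wind-output model
        load = peak_load * random.uniform(0.4, 1.0)       # crude hourly load model
        if diesel_capacity + wind < load:
            shortfall_hours += 1
    return shortfall_hours

random.seed(1)
print("LOLE estimate (h/yr):", simulate_lole())
```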
Chakrabarti, Debkumar; Bhattachheriya, Nandita
2012-01-01
Finding the appropriate strategy for work tool development has become a crucial issue for the occupational wellness of the varied women workforce of Northeast India. This paper deals with ergonomics intervention through a sustainable work tool design and development process. Workers in unorganised small-scale fruit processing units, where productivity is directly tied to the harvesting season, frequently shift between different activities and therefore require different work tools for specific tasks; for the most part, workers manage their own tools with the local resources available. In contrast, tea-leaf pluckers are engaged in a single task throughout the year, and their work schedule and work equipment are decided and supplied on the basis of corporate decisions over which the workers have no individual control. Observations confirm the need for participatory workshops tailored to trade-specific occupational well-being and for different work tools for different tasks in the mostly privately owned unorganised sector, as well as the value of developing a single-variety work tool that supports a crucial component of tea-leaf plucking, in which the workers are engaged as full-time employment and where, through a corporate decision, a single design can benefit a large number of users.
Power and ambivalence in intergenerational communication: Deciding to institutionalize in Shanghai.
Chen, Lin
2017-04-01
China's tradition of taking care of one's aging parents continues to evolve, as evidenced by the growth in nursing home residents in Shanghai. However, how these families make the decision to institutionalize remains unclear. To fill this gap, this study draws on power relations to examine communication dynamics when oldest-old and their adult children decide to institutionalize. This study used a phenomenological approach. Twelve dyads of matched elderly residents and their children participated in face-to-face, in-depth interviews (N=24). The format and content of intergenerational communication indicated that both conflicts and compromises took place. Adult children achieved greater decision-making power than their frail parents, which evoked older adults' ambivalent feelings. A discrepancy in perceived filial piety between generations also emerged. These dynamics of caregiving decision-making offer insight in understanding evolving filial piety in urban China. Copyright © 2017 Elsevier Inc. All rights reserved.
Survey of Ophthalmologists Regarding Practice Patterns for Dry Eye and Sjogren Syndrome.
Bunya, Vatinee Y; Fernandez, Karen B; Ying, Gui-Shuang; Massaro-Giordano, Mina; Macchi, Ilaria; Sulewski, Michael E; Hammersmith, Kristin M; Nagra, Parveen K; Rapuano, Christopher J; Orlin, Stephen E
2018-01-15
To survey ophthalmologists about current practice patterns regarding the evaluation of dry eye patients and referrals for a Sjogren syndrome (SS) workup. An online survey was sent to ophthalmologists affiliated with the Scheie Eye Institute or Wills Eye Hospital using REDCap in August 2015. Descriptive statistics were used to summarize the data. Four hundred seventy-four survey invitations were sent out, and 101 (21%) ophthalmologists completed the survey. The most common traditional dry eye test performed was corneal fluorescein staining (62%), and the most common newer dry eye test performed was tear osmolarity (18%). Half of the respondents (51%) refer fewer than 5% of their dry eye patients for SS workups, with 18% reporting that they never refer any patients. The most common reasons for referral included a positive review of systems (60%), severe dry eye symptoms (51%) or ocular signs (47%), or dry eye that is refractory to treatment (42%). The majority (83%) felt that there is a need for an evidence-based standardized screening tool for dry eye patients to decide who should be referred for evaluation for SS. Ophthalmologists continue to prefer traditional dry eye tests in practice, the most common being corneal fluorescein staining. Dry eye patients are under-referred for SS workups, which contributes to the continued underdiagnosis of the disease. Most respondents felt that there was a need for an evidence-based standardized screening tool to decide which dry eye patients should be referred for SS evaluations.
Development of a high-temperature diagnostics-while-drilling tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavira, David J.; Huey, David; Hetmaniak, Chris
2009-01-01
The envisioned benefits of Diagnostics-While-Drilling (DWD) are based on the principle that high-speed, real-time information from the downhole environment will promote better control of the drilling process. Although in practice a DWD system could provide information related to any aspect of exploration and production of subsurface resources, the current DWD system provides data on drilling dynamics. This particular set of new tools provided by DWD will allow quicker detection of problems, reduce drilling flat-time and facilitate more efficient drilling (drilling optimization) with the overarching result of decreased drilling costs. In addition to providing the driller with an improved, real-time picture of the drilling conditions downhole, data generated from DWD systems provide researchers with valuable, high-fidelity data sets necessary for developing and validating enhanced understanding of the drilling process. Toward this end, the availability of DWD creates a synergy with other Sandia Geothermal programs, such as the hard-rock bit program, where the introduction of alternative rock-reduction technologies is contingent on the reduction or elimination of damaging dynamic effects. The rationale for the program and early development efforts are described in more detail by others [SAND2003-2069 and SAND2000-0239]. A first-generation low-temperature (LT) DWD system was fielded in a series of proof-of-concept (POC) tests to validate functionality. Using the LT system, DWD was subsequently used to support a single-laboratory/multiple-partner CRADA (Cooperative Research and Development Agreement) entitled Advanced Drag Bits for Hard-Rock Drilling. The drag-bit CRADA was established between Sandia and four bit companies, and involved testing of a PDC bit from each company [Wise, et al., 2003, 2004] in the same lithologic interval at the Gas Technology Institute (GTI) test facility near Catoosa, OK. In addition, the LT DWD system has been fielded in cost-sharing efforts with an industrial partner to support the development of new-generation hard-rock drag bits. Following the demonstrated success of the POC DWD system, efforts were initiated in FY05 to design, fabricate and test a high-temperature (HT) capable version of the DWD system. The design temperature for the HT DWD system was 225 °C. Programmatic requirements dictated that an HT DWD tool be developed during FY05 and that a working system be demonstrated before the end of FY05. During initial design discussions regarding a high-temperature system it was decided that, to the extent possible, the HT DWD system would maintain functionality similar to the low-temperature system, that is, the HT DWD system would also be designed to provide the driller with real-time information on bit and bottom-hole-assembly (BHA) dynamics while drilling. Additionally, because of time and fiscal constraints associated with the HT system development, the design of the HT DWD tool would follow that of the LT tool. The downhole electronics package would be contained in a concentrically located pressure barrel, and externally applied strain gages with thru-tool connectors would also be used in the new design. Also, in order to maximize the potential wells available for the HT DWD system and to allow better comparison with the low-temperature design, the diameter of the tool was maintained at 7 inches.
This report discusses the efforts associated with the development of a DWD system capable of sustained operation at 225 °C. It documents work performed in the second phase of the Diagnostics-While-Drilling (DWD) project, in which a high-temperature (HT) version of the phase 1 low-temperature (LT) proof-of-concept (POC) DWD tool was built and tested. Descriptions of the design, fabrication and field testing of the HT tool are provided. Background on prior phases of the project can be found in SAND2003-2069 and SAND2000-0239.
An Overview of the Technological and Scientific Achievements of the Terahertz
NASA Astrophysics Data System (ADS)
Rostami, Ali; Rasooli, Hassan; Baghban, Hamed
2011-01-01
Due to the importance of terahertz radiation over the past several years in spectroscopy, astrophysics, and imaging techniques, notably for biomedical applications (its low interference and non-ionizing characteristics make it a good candidate for safe, in vivo medical imaging), we decided to review terahertz technology and its associated scientific achievements. The review covers terahertz terminology, different applications, and the main components used for the detection and generation of terahertz radiation. A brief theoretical study of the generation and detection of terahertz pulses is also considered. Finally, the chapter ends with the use of organic materials for the generation and detection of terahertz radiation.
Ahn, Hyo-Sung; Kim, Byeong-Yeon; Lim, Young-Hun; Lee, Byung-Hun; Oh, Kwang-Kyo
2018-03-01
This paper proposes three coordination laws for optimal energy generation and distribution in an energy network composed of a physical flow layer and a cyber communication layer. Energy flows through the physical layer, but its generation and flow are coordinated by distributed coordination algorithms on the basis of communication information. First, distributed energy generation and energy distribution laws are proposed in a decoupled manner, without considering the interactive characteristics between energy generation and energy distribution. Second, a joint coordination law is designed that treats energy generation and energy distribution in a coupled manner, taking account of their interactive characteristics. Third, to handle cases of over- or under-generation, an energy distribution law for networks with batteries is designed. The coordination laws proposed in this paper are fully distributed in the sense that they are decided optimally using only relative information among neighboring nodes. The validity of the proposed distributed coordination laws is illustrated through numerical simulations.
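A minimal sketch of one common flavour of such neighbour-only coordination, consensus averaging of local mismatch and capacity followed by a proportional generation set-point, is given below; the topology, gains, and the proportional rule are all invented for illustration and are not the paper's three coordination laws.

```python
import numpy as np

# Undirected ring of 4 nodes (invented topology): each node's neighbour list
neighbours = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
demand = np.array([1.0, 2.0, 0.5, 1.5])      # local loads (MW)
capacity = np.array([2.0, 1.0, 1.5, 1.5])    # local generation limits (MW)

def consensus(values: np.ndarray, steps: int = 200, alpha: float = 0.3) -> np.ndarray:
    """Neighbour-only averaging: every node converges to the network mean."""
    x = values.astype(float).copy()
    for _ in range(steps):
        x_new = x.copy()
        for i, nbrs in neighbours.items():
            x_new[i] += alpha * sum(x[j] - x[i] for j in nbrs)
        x = x_new
    return x

avg_mismatch = consensus(demand - capacity)   # each entry ~ mean(demand - capacity)
avg_capacity = consensus(capacity)            # each entry ~ mean(capacity)

# Each node scales its own output so that, network-wide, generation meets demand.
# This toy proportional rule assumes total capacity is adequate and ignores limits.
ratio = 1.0 + avg_mismatch / avg_capacity
setpoints = capacity * ratio
print("set-points (MW):", setpoints.round(3), "total:", round(setpoints.sum(), 3))
```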
tkLayout: a design tool for innovative silicon tracking detectors
NASA Astrophysics Data System (ADS)
Bianchi, G.
2014-03-01
A new CMS tracker is scheduled to become operational for the LHC Phase 2 upgrade in the early 2020's. tkLayout is a software package developed to create 3d models for the design of the CMS tracker and to evaluate its fundamental performance figures. The new tracker will have to cope with much higher luminosity conditions, resulting in increased track density, harsher radiation exposure and, especially, much higher data acquisition bandwidth, such that equipping the tracker with triggering capabilities is envisaged. The design of an innovative detector involves deciding on an architecture offering the best trade-off among many figures of merit, such as tracking resolution, power dissipation, bandwidth, cost and so on. Quantitatively evaluating these figures of merit as early as possible in the design phase is of capital importance and it is best done with the aid of software models. tkLayout is a flexible modeling tool: new performance estimates and support for different detector geometries can be quickly added, thanks to its modular structure. Besides, the software executes very quickly (about two minutes), so that many possible architectural variations can be rapidly modeled and compared, to help in the choice of a viable detector layout and then to optimize it. A tracker geometry is generated from simple configuration files, defining the module types, layout and materials. Support structures are automatically added and services routed to provide a realistic tracker description. The tracker geometries thus generated can be exported to the standard CMS simulation framework (CMSSW) for full Monte Carlo studies. tkLayout has proven essential in giving guidance to CMS in studying different detector layouts and exploring the feasibility of innovative solutions for tracking detectors, in terms of design, performance and projected costs. This tool has been one of the keys to making important design decisions for over five years now and has also enabled project engineers and simulation experts to focus their efforts on other important or specific issues. Even if tkLayout was designed for the CMS tracker upgrade project, its flexibility makes it experiment-agnostic, so that it could be easily adapted to model other tracking detectors. The technology behind tkLayout is presented, as well as some of the results obtained in the context of the CMS silicon tracker design studies.
UIVerify: A Web-Based Tool for Verification and Automatic Generation of User Interfaces
NASA Technical Reports Server (NTRS)
Shiffman, Smadar; Degani, Asaf; Heymann, Michael
2004-01-01
In this poster, we describe a web-based tool for verification and automatic generation of user interfaces. The verification component of the tool accepts as input a model of a machine and a model of its interface, and checks that the interface is adequate (correct). The generation component of the tool accepts a model of a given machine and the user's task, and then generates a correct and succinct interface. This write-up will demonstrate the usefulness of the tool by verifying the correctness of a user interface to a flight-control system. The poster will include two more examples of using the tool: verification of the interface to an espresso machine, and automatic generation of a succinct interface to a large hypothetical machine.
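One way to make "the interface is adequate" concrete is to check that every machine transition is mirrored by the abstracted interface transition; the toy sketch below uses simplified explicit state machines and an invented autopilot-like example, and is an assumption-laden illustration rather than the tool's actual algorithm.

```python
from typing import Dict, Tuple

State, Event = str, str
Trans = Dict[Tuple[State, Event], State]

def interface_consistent(machine: Trans,
                         interface: Trans,
                         abstraction: Dict[State, State]) -> bool:
    """Every machine step must be mirrored by the corresponding interface step."""
    for (s, e), s_next in machine.items():
        key = (abstraction[s], e)
        if key not in interface or interface[key] != abstraction[s_next]:
            print(f"mismatch at machine state {s!r} on event {e!r}")
            return False
    return True

# Toy machine with two capture modes that the interface lumps into one state
machine = {("armed", "capture"): "hold_A", ("hold_A", "disturb"): "armed",
           ("armed", "override"): "hold_B", ("hold_B", "disturb"): "hold_B"}
interface = {("ARMED", "capture"): "HOLD", ("HOLD", "disturb"): "ARMED",
             ("ARMED", "override"): "HOLD"}
abstraction = {"armed": "ARMED", "hold_A": "HOLD", "hold_B": "HOLD"}

print(interface_consistent(machine, interface, abstraction))  # False: inadequate
```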
Assuming a Pharmacy Organization Leadership Position: A Guide for Pharmacy Leaders.
Shay, Blake; Weber, Robert J
2015-11-01
Important and influential pharmacy organization leadership positions, such as president, board member, or committee chair, are volunteer positions and require a commitment of personal and professional time. These positions provide excellent opportunities for leadership development, personal promotion, and advancement of the profession. In deciding to assume a leadership position, interested individuals must consider the impact on their personal and professional commitments and relationships, career planning, employer support, current and future department projects, employee support, and personal readiness. This article reviews these factors and also provides an assessment tool that leaders can use to determine their readiness to assume leadership positions. By using an assessment tool, pharmacy leaders can better understand their ability to assume an important and influential leadership position while achieving job and personal goals.
Assuming a Pharmacy Organization Leadership Position: A Guide for Pharmacy Leaders
Shay, Blake; Weber, Robert J.
2015-01-01
Important and influential pharmacy organization leadership positions, such as president, board member, or committee chair, are volunteer positions and require a commitment of personal and professional time. These positions provide excellent opportunities for leadership development, personal promotion, and advancement of the profession. In deciding to assume a leadership position, interested individuals must consider the impact on their personal and professional commitments and relationships, career planning, employer support, current and future department projects, employee support, and personal readiness. This article reviews these factors and also provides an assessment tool that leaders can use to determine their readiness to assume leadership positions. By using an assessment tool, pharmacy leaders can better understand their ability to assume an important and influential leadership position while achieving job and personal goals. PMID:27621512
Health literacy and usability of clinical trial search engines.
Utami, Dina; Bickmore, Timothy W; Barry, Barbara; Paasche-Orlow, Michael K
2014-01-01
Several web-based search engines have been developed to assist individuals to find clinical trials for which they may be interested in volunteering. However, these search engines may be difficult for individuals with low health and computer literacy to navigate. The authors present findings from a usability evaluation of clinical trial search tools with 41 participants across the health and computer literacy spectrum. The study consisted of 3 parts: (a) a usability study of an existing web-based clinical trial search tool; (b) a usability study of a keyword-based clinical trial search tool; and (c) an exploratory study investigating users' information needs when deciding among 2 or more candidate clinical trials. From the first 2 studies, the authors found that users with low health literacy have difficulty forming queries using keywords and have significantly more difficulty using a standard web-based clinical trial search tool compared with users with adequate health literacy. From the third study, the authors identified the search factors most important to individuals searching for clinical trials and how these varied by health literacy level.
Static versus dynamic sampling for data mining
DOE Office of Scientific and Technical Information (OSTI.GOV)
John, G.H.; Langley, P.
1996-12-31
As data warehouses grow to the point where one hundred gigabytes is considered small, the computational efficiency of data-mining algorithms on large databases becomes increasingly important. Using a sample from the database can speed up the data-mining process, but this is only acceptable if it does not reduce the quality of the mined knowledge. To this end, we introduce the "Probably Close Enough" criterion to describe the desired properties of a sample. Sampling usually refers to the use of static statistical tests to decide whether a sample is sufficiently similar to the large database, in the absence of any knowledge of the tools the data miner intends to use. We discuss dynamic sampling methods, which take into account the mining tool being used and can thus give better samples. We describe dynamic schemes that observe a mining tool's performance on training samples of increasing size and use these results to determine when a sample is sufficiently large. We evaluate these sampling methods on data from the UCI repository and conclude that dynamic sampling is preferable.
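The abstract describes the dynamic scheme only in outline. A minimal sketch of the general idea (grow the training sample until the mining tool's held-out accuracy stops improving by more than a tolerance) is shown below; the classifier, dataset, growth schedule, and stopping tolerance are illustrative choices, not the authors' experimental setup.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)
X_pool, X_hold, y_pool, y_hold = train_test_split(X, y, test_size=0.3, random_state=0)

def dynamic_sample_size(X_pool, y_pool, X_hold, y_hold,
                        start=100, growth=2, tol=0.005):
    """Grow the sample geometrically; stop when accuracy gains fall below tol."""
    n, prev_acc = start, 0.0
    while n <= len(X_pool):
        model = DecisionTreeClassifier(random_state=0).fit(X_pool[:n], y_pool[:n])
        acc = model.score(X_hold, y_hold)
        print(f"sample size {n:5d}: holdout accuracy {acc:.3f}")
        if acc - prev_acc < tol:
            return n            # sample judged "close enough" for this tool
        prev_acc, n = acc, n * growth
    return len(X_pool)

print("chosen sample size:", dynamic_sample_size(X_pool, y_pool, X_hold, y_hold))
```

Because the stopping rule watches the actual mining tool, a different tool (or dataset) can legitimately settle on a different sample size, which is the point of dynamic over static sampling.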
Review of Orbiter Flight Boundary Layer Transition Data
NASA Technical Reports Server (NTRS)
Mcginley, Catherine B.; Berry, Scott A.; Kinder, Gerald R.; Barnell, Maria; Wang, Kuo C.; Kirk, Benjamin S.
2006-01-01
In support of the Shuttle Return to Flight program, a tool was developed to predict when boundary layer transition would occur on the lower surface of the orbiter during reentry due to the presence of protuberances and cavities in the thermal protection system. This predictive tool was developed based on extensive wind tunnel tests conducted after the loss of the Space Shuttle Columbia. Recognizing that wind tunnels cannot simulate the exact conditions an orbiter encounters as it re-enters the atmosphere, a preliminary attempt was made to use the documented flight related damage and the orbiter transition times, as deduced from flight instrumentation, to calibrate the predictive tool. After flight STS-114, the Boundary Layer Transition Team decided that a more in-depth analysis of the historical flight data was needed to better determine the root causes of the occasional early transition times of some of the past shuttle flights. In this paper we discuss our methodology for the analysis, the various sources of shuttle damage information, the analysis of the flight thermocouple data, and how the results compare to the Boundary Layer Transition prediction tool designed for Return to Flight.
Multi-Sector Sustainability Browser (MSSB) User Manual: A ...
EPA’s Sustainable and Healthy Communities (SHC) Research Program is developing methodologies, resources, and tools to assist community members and local decision makers in implementing policy choices that facilitate sustainable approaches in managing their resources affecting the built environment, natural environment, and human health. In order to assist communities and decision makers in implementing sustainable practices, EPA is developing computer-based systems including models, databases, web tools, and web browsers to help communities decide upon approaches that support their desired outcomes. Communities need access to resources that will allow them to achieve their sustainability objectives through intelligent decisions in four key sustainability areas: • Land Use • Buildings and Infrastructure • Transportation • Materials Management (i.e., Municipal Solid Waste [MSW] processing and disposal) The Multi-Sector Sustainability Browser (MSSB) is designed to support sustainable decision-making for communities, local and regional planners, and policy and decision makers. This document is an EPA Technical Report: the user manual for the Multi-Sector Sustainability Browser (MSSB) tool. The purpose of the document is to provide basic guidance on use of the tool for users.
Selection and application of microbial source tracking tools for water-quality investigations
Stoeckel, Donald M.
2005-01-01
Microbial source tracking (MST) is a complex process that includes many decision-making steps. Once a contamination problem has been defined, the potential user of MST tools must thoroughly consider study objectives before deciding upon a source identifier, a detection method, and an analytical approach to apply to the problem. Regardless of which MST protocol is chosen, underlying assumptions can affect the results and interpretation. It is crucial to incorporate tests of those assumptions in the study quality-control plan to help validate results and facilitate interpretation. Detailed descriptions of MST objectives, protocols, and assumptions are provided in this report to assist in selection and application of MST tools for water-quality investigations. Several case studies illustrate real-world applications of MST protocols over a range of settings, spatial scales, and types of contamination. Technical details of many available source identifiers and detection methods are included as appendixes. By use of this information, researchers should be able to formulate realistic expectations for the information that MST tools can provide and, where possible, successfully execute investigations to characterize sources of fecal contamination to resource waters.
NASA Astrophysics Data System (ADS)
Harrington, David M.; Snik, Frans; Keller, Christoph U.; Sueoka, Stacey R.; van Harten, Gerard
2017-10-01
We outline polarization fringe predictions derived from an application of the Berreman calculus for the Daniel K. Inouye Solar Telescope (DKIST) retarder optics. The DKIST retarder baseline design used six crystals, single-layer antireflection coatings, thick cover windows, and oil between all optical interfaces. This tool estimates polarization fringes and optic Mueller matrices as functions of all optical design choices. The amplitude and period of polarized fringes under design changes, manufacturing errors, tolerances, and several physical factors can now be estimated. This tool compares well with observations of fringes for data collected with the spectropolarimeter for infrared and optical regions at the Dunn Solar Telescope using bicrystalline achromatic retarders as well as laboratory tests. With this tool, we show impacts of design decisions on polarization fringes as impacted by antireflection coatings, oil refractive indices, cover window presence, and part thicknesses. This tool helped DKIST decide to remove retarder cover windows and also recommends reconsideration of coating strategies for DKIST. We anticipate this tool to be essential in designing future retarders for mitigation of polarization and intensity fringe errors in other high spectral resolution astronomical systems.
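The abstract refers to Mueller-matrix and Berreman-calculus modeling without reproducing any formulas. For orientation only, the standard Mueller matrix of an ideal linear retarder with fast-axis angle θ and retardance δ, which underlies this kind of crystal-retarder fringe modeling, can be written as follows (sign conventions for the circular-polarization terms vary between references):

```latex
M_{\mathrm{ret}}(\theta,\delta) =
\begin{pmatrix}
1 & 0 & 0 & 0\\
0 & C^{2}+S^{2}\cos\delta & CS\,(1-\cos\delta) & -S\sin\delta\\
0 & CS\,(1-\cos\delta) & S^{2}+C^{2}\cos\delta & C\sin\delta\\
0 & S\sin\delta & -C\sin\delta & \cos\delta
\end{pmatrix},
\qquad C=\cos 2\theta,\quad S=\sin 2\theta .
```

Roughly speaking, multiple-beam interference between the crystal and coating interfaces makes the effective retardance (and transmission) oscillate with wavelength, so the measured polarized spectra inherit a periodic fringe modulation of the kind the tool predicts.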
Managing complex research datasets using electronic tools: A meta-analysis exemplar
Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.
2013-01-01
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256
Managing complex research datasets using electronic tools: a meta-analysis exemplar.
Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L
2013-06-01
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.
El Fenomeno Chavez: Hugo Chavez of Venezuela, Modern Day Bolivar
2007-03-01
Venezuela’s state owned oil company, Petroleos de Venezuela. Ever seeking opportunities to provoke the giant United States, Chavez agreed to provide...controlled joint venture, Petroleos de Venezuela SA (PDVSA). Exxon Mobil Corporation decided to sell their stakes among...exploited. 9. Major oil companies in Venezuela: 1. Petroleos de Venezuela (PdVSA) – government-owned; generates 1/3 of national GDP; monopolized the
Systems Prototyping with Fourth Generation Tools.
ERIC Educational Resources Information Center
Sholtys, Phyllis
1983-01-01
The development of information systems using an engineering approach that uses both traditional programming techniques and fourth generation software tools is described. Fourth generation applications tools are used to quickly develop a prototype system that is revised as the user clarifies requirements. (MLW)
NASA Astrophysics Data System (ADS)
Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.
2016-11-01
The modelling of a family of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measurement as a coordinate matrix, has as its goal determining the quality of generation for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the geometrical generation error, as a basis of the total error. Modelling the generation process also allows highlighting the potential errors of the generating tool, in order to correct its profile before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of "relative generating trajectories". The analytical foundation is presented, together with applications to known models of rack-gear type tools used on Maag gear-cutting machines.
2007-09-01
is necessary to convert the solids to a 3-D computational mesh. The user must decide how many layers of mesh elements are required for each material ...together to define the geology gives the user more control over the material contacts. Secondly, the tool to convert directly to a 3-D mesh from the...included in the model. Rocks, cracks, fissures, and plant material can affect the flow characteristics, but cannot be included in a model on this scale
HMDs as enablers of situation awareness: the OODA loop and sense-making
NASA Astrophysics Data System (ADS)
Melzer, James E.
2012-06-01
Helmet-Mounted Displays have been shown to be powerful tools that can unlock the pilot from the interior of the cockpit or the forward line of sight of the Head-Up Display. Imagery that is presented in one of three reference frames can enable the pilots to do their job more effectively while simultaneously decreasing workload. This paper will review key attributes of Situation Awareness, the Observe/Orient/Decide/Act (OODA) Loop and Sensemaking and how HMDs can aid the pilot in achieving these ideal cognitive states.
Summarizing health inequalities in a Balanced Scorecard. Methodological considerations.
Auger, Nathalie; Raynault, Marie-France
2006-01-01
The association between social determinants and health inequalities is well recognized. What are now needed are tools to assist in disseminating such information. This article describes how the Balanced Scorecard may be used for summarizing data on health inequalities. The process begins by selecting appropriate social groups and indicators, and is followed by the measurement of differences across person, place, or time. The next step is to decide whether to focus on absolute versus relative inequality. The last step is to determine the scoring method, including whether to address issues of depth of inequality.
Selb, Melissa; Gimigliano, Francesca; Prodinger, Birgit; Stucki, Gerold; Pestelli, Germano; Iocco, Maurizio; Boldrini, Paolo
2017-04-01
As part of international efforts to develop and implement national models including the specification of ICF-based clinical data collection tools, the Italian rehabilitation community initiated a project to develop simple, intuitive descriptions of the ICF Rehabilitation Set, highlighting the core concept of each category in user-friendly language. This paper outlines the Italian experience in developing simple, intuitive descriptions of the ICF Rehabilitation Set as an ICF-based clinical data collection tool for Italy. Consensus process. Expert conference. Multidisciplinary group of rehabilitation professionals. The first of a two-stage consensus process involved developing an initial proposal for simple, intuitive descriptions of each ICF Rehabilitation Set category, based on descriptions generated in a similar process in China. Stage two involved a consensus conference. Divided into three working groups, participants discussed and voted (vote A) on whether the initially proposed description of each ICF Rehabilitation Set category was simple and intuitive enough for use in daily practice. Afterwards, the categories with descriptions considered ambiguous, i.e. not simple and intuitive enough, were divided among the working groups, who were asked to propose a new description for the allocated categories. These proposals were then voted on (vote B) in a plenary session. The last step of the consensus conference required each working group to develop a new proposal for each of the categories whose descriptions were still considered ambiguous. Participants then voted (final vote) for which of the three proposed descriptions they preferred. Nineteen clinicians from diverse rehabilitation disciplines from various regions of Italy participated in the consensus process. Three ICF categories already achieved consensus in vote A, while 20 ICF categories were accepted in vote B. The remaining 7 categories were decided in the final vote. The findings were discussed in light of current efforts toward developing strategies for ICF implementation, specifically for the application of an ICF-based clinical data collection tool, not only for Italy but also for the rest of Europe. The descriptions are promising as minimal standards for monitoring the impact of interventions and for standardized reporting of functioning as a relevant outcome in rehabilitation.
Evaluating models of healthcare delivery using the Model of Care Evaluation Tool (MCET).
Hudspeth, Randall S; Vogt, Marjorie; Wysocki, Ken; Pittman, Oralea; Smith, Susan; Cooke, Cindy; Dello Stritto, Rita; Hoyt, Karen Sue; Merritt, T Jeanne
2016-08-01
Our aim was to provide the outcome of a structured Model of Care (MoC) Evaluation Tool (MCET), developed by an FAANP Best-practices Workgroup, that can be used to guide the evaluation of existing MoCs being considered for use in clinical practice. Multiple MoCs are available, but deciding which model of health care delivery to use can be confusing. This five-component tool provides a structured assessment approach to model selection and has universal application. A literature review using CINAHL, PubMed, Ovid, and EBSCO was conducted. The MCET evaluation process includes five sequential components with a feedback loop from component 5 back to component 3 for reevaluation of any refinements. The components are as follows: (1) Background, (2) Selection of an MoC, (3) Implementation, (4) Evaluation, and (5) Sustainability and Future Refinement. This practical resource considers an evidence-based approach to use in determining the best model to implement based on need, stakeholder considerations, and feasibility. ©2015 American Association of Nurse Practitioners.
FUJIFILM X10 white orbs and DeOrbIt
NASA Astrophysics Data System (ADS)
Dietz, Henry Gordon
2013-01-01
The FUJIFILM X10 is a high-end enthusiast compact digital camera using an unusual sensor design. Unfortunately, upon its Fall 2011 release, the camera quickly became infamous for the uniquely disturbing "white orbs" that often appeared in areas where the sensor was saturated. FUJIFILM's first attempt at a fix was firmware released on February 25, 2012, but it had little effect. In April 2012, a sensor replacement essentially solved the problem. This paper explores the "white orb" phenomenon in detail. After FUJIFILM's attempt at a firmware fix failed, the author decided to create a post-processing tool that could automatically repair existing images. DeOrbIt was released as a free tool on March 7, 2012. To better understand the problem and how to fix it, the WWW form version of the tool logs images, processing parameters, and evaluations by users. The current paper describes the technical problem, the novel computational photography methods used by DeOrbIt to repair affected images, and the public perceptions revealed by this experiment.
Estimating in vivo airway surface liquid concentration in trials of inhaled antibiotics.
Hasan, M A; Lange, C F
2007-01-01
Antibiotic drugs exhibit concentration dependence in their efficacy. Therefore, ensuring appropriate concentration of these drugs in the relevant body fluid is important for obtaining the desired therapeutic and physiological action. Until recently there had been no suitable method available to measure or estimate concentration of drugs in the human airways resulting from inhaled aerosols or to determine the amount of inhaled antibiotics required to ensure minimum inhibitory concentration of a drug in the airway surface liquid (ASL). In this paper a numerical method is used for estimating local concentration of inhaled pharmaceutical aerosols in different generations of the human tracheobronchial airways. The method utilizes a mathematical lung deposition model to estimate amounts of aerosols depositing in different lung generations, and a recent ASL model along with deposition results to assess the concentration of deposited drugs immediately following inhalation. Concentration estimates are presented for two case studies: one for the antibiotic tobramycin against Pseudomonas aeruginosa, and another for taurolidine against Burkholderia cepacia. The aerosol characteristics, breathing pattern and properties of nebulized solutions were adopted from two recent clinical studies on efficacy of these drugs in cystic fibrosis (CF) patients and from other sources in the literature. While the clinically effective tobramycin showed a concentration higher than the required in vivo concentration, that for the ineffective taurolidine was found to be below the speculated required in vivo concentration. Results of this study thus show that the mathematical ASL model combined with the lung deposition model can be an effective tool for helping decide the optimum dosage of inhaled antibiotic drugs delivered during human clinical trials.
Monte Carlo simulation of edge placement error
NASA Astrophysics Data System (ADS)
Kobayashi, Shinji; Okada, Soichiro; Shimura, Satoru; Nafus, Kathleen; Fonseca, Carlos; Estrella, Joel; Enomoto, Masashi
2018-03-01
In the discussion of edge placement error (EPE), we proposed interactive pattern fidelity error (IPFE) as an indicator to judge pass/fail of integrated patterns. IPFE consists of the lower- and upper-layer EPEs (CD and center of gravity, COG) and overlay, and is determined from the combination of the maximum variation of each. We succeeded in obtaining the IPFE density function by Monte Carlo simulation. The results also show that the standard deviation (σ) of each indicator should be controlled to within 4.0σ at semiconductor scale, for example 100 billion patterns per die. Moreover, CD, COG and overlay were analyzed by analysis of variance (ANOVA), so that all variations from wafer to wafer (WTW), pattern to pattern (PTP), line width roughness (LWR) and stochastic pattern noise (SPN) can be discussed on an equal footing. From the analysis results, we can determine which process and tools each variation belongs to. Furthermore, the measurement length of LWR is also discussed in the ANOVA; we propose that the measurement length for IPFE analysis should not be fixed at the micrometer order, such as >2 μm, but should instead be chosen according to the device that is actually targeted.
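The abstract does not include the simulation itself. As a rough, generic illustration of estimating a combined edge-placement-style error distribution by Monte Carlo (not the authors' IPFE definition), the sketch below samples hypothetical CD, center-of-gravity and overlay variations and estimates how often their combination exceeds a budget; all sigma values, the budget, and the way the contributors are combined are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000                      # number of simulated pattern instances

# Hypothetical 1-sigma values in nm for each contributor (illustrative only).
sigma_cd, sigma_cog, sigma_ovl = 0.8, 0.6, 1.0
budget = 4.0                       # allowed combined edge error in nm (assumed)

cd  = rng.normal(0.0, sigma_cd,  n)    # half of the CD variation shifts each edge
cog = rng.normal(0.0, sigma_cog, n)
ovl = rng.normal(0.0, sigma_ovl, n)

combined = np.abs(cd / 2 + cog + ovl)  # one simple way to combine edge shifts
fail_rate = np.mean(combined > budget)

print(f"estimated fail probability per pattern: {fail_rate:.2e}")
print(f"expected failures per 1e11 patterns:   {fail_rate * 1e11:.0f}")
```

Scaling the per-pattern failure probability by the pattern count per die is what motivates the tight (several-sigma) control limits the abstract mentions.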
NASA Astrophysics Data System (ADS)
Horswell, I.; Gimenez, E. N.; Marchal, J.; Tartoni, N.
2011-01-01
Hybrid silicon photon-counting detectors are becoming standard equipment for many synchrotron applications. The latest in the Medipix family of read-out chips designed as part of the Medipix Collaboration at CERN is the Medipix3, which, while maintaining the same pixel size as its predecessor, offers increased functionality and additional operating modes. The active area of the Medipix3 chip is approximately 14 mm × 14 mm (containing 256 × 256 pixels), which is not large enough for many detector applications; this results in the need to tile many sensors and chips. As a first step on the road to developing such a detector, it was decided to build a prototype single-chip readout system to gain the necessary experience in operating a Medipix3 chip. To provide a flexible learning and development tool it was decided to build an interface based on the recently released FlexRIO™ system from National Instruments and to use the LabVIEW™ graphical programming environment. This system and the achieved performance are described in this paper.
Design for the Maintainer: Projecting Maintenance Performance from Design Characteristics.
1981-07-01
of Kahneman and Tversky (Tversky & Kahneman, 1974; Kahneman & Tversky, 1979). They have observed some general principles to which human decision makers tend to adhere. The first of these is the "representativeness heuristic". According to this principle, the question "will event A be generated by process B?" will be decided affirmatively to the extent that event A resembles process B. According to this principle, if failure in a computer
Russia’s Role in the Emerging Eurasian Security Environment
2008-01-01
China for generations, racism, raw materials and land, are even more pressing in today's globalized environment. As far as Iran goes, Russian...multipolar counterbalance, but neither Russia nor China...a decided amount of realpolitik geo-strategy guiding Russian relations with the United States, the European Union, China, the Central Asian states as
An Approach to Establishing System Benefits for Technologies In NASA's Spaceliner Investment Area
NASA Technical Reports Server (NTRS)
Hueter, Uwe; Pannell, Bill; Lyles, Garry M. (Technical Monitor)
2001-01-01
NASA has established long term goals for access-to-space. The third generation launch systems are to be fully reusable and operational around 2025. The goals for the third generation launch system are to significantly reduce cost and improve safety over current systems. The Advanced Space Transportation Program Office (ASTP) at NASA's Marshall Space Flight Center in Huntsville, AL has the agency lead to develop space transportation technologies. Within ASTP, under the Spaceliner Investment Area, third generation technologies are being pursued. The Spaceliner Investment Area's primary objective is to mature vehicle technologies to enable substantial increases in the design and operating margins of third generation RLVs (the current Space Shuttle is considered the first generation RLV) by incorporating advanced propulsion systems, materials, structures, thermal protection systems, power, and avionics technologies. Advancements in design tools and better characterization of the operational environment will result in reduced design and operational variabilities, leading to improvements in margins. Improvements in operational efficiencies will be obtained through the introduction of integrated vehicle health management, operations and range technologies. Investments in these technologies will enable the reduction in the high operational costs associated with today's vehicles by allowing components to operate well below their design points, resulting in improved component operating life, reliability, and safety, which in turn reduces both maintenance and refurbishment costs. The introduction of advanced technologies may enable horizontal takeoff by significantly reducing the takeoff weight and allowing use of existing infrastructure. This would be a major step toward the goal of airline-like operation. These factors, in conjunction with increased flight rates resulting from reductions in transportation costs, will result in significant improvements of future vehicles. The real-world problem is that resources are limited and technologies need to be prioritized to assure the resources are spent on technologies that provide the highest system level benefits. Toward that end, a systems approach is being taken to determine the benefits of technologies for the Spaceliner Investment Area. Technologies identified as enabling will be funded. However, the other technologies will be funded based on their system-level benefits. Since the final launch system concept will not be decided for many years, several vehicle concepts are being evaluated to determine technology benefits. Not only performance, but also cost and operability are being assessed. This will become an annual process to assess these technologies against their goals and the benefits to various launch system concepts. The paper describes the system process, tools and concepts used to determine the technology benefits. Preliminary results will be presented along with the current technology investments that are being made by ASTP's Spaceliner Investment Area.
Young, J M; Austin, J J; Weyrich, L S
2017-02-01
Analysis of physical evidence is typically a deciding factor in forensic casework by establishing what transpired at a scene or who was involved. Forensic geoscience is an emerging multi-disciplinary science that can offer significant benefits to forensic investigations. Soil is a powerful, nearly 'ideal' contact trace evidence, as it is highly individualistic, easy to characterise, has a high transfer and retention probability, and is often overlooked in attempts to conceal evidence. However, many real-life cases encounter close proximity soil samples or soils with low inorganic content, which cannot be easily discriminated based on current physical and chemical analysis techniques. The capability to improve forensic soil discrimination, and identify key indicator taxa from soil using the organic fraction is currently lacking. The development of new DNA sequencing technologies offers the ability to generate detailed genetic profiles from soils and enhance current forensic soil analyses. Here, we discuss the use of DNA metabarcoding combined with high-throughput sequencing (HTS) technology to distinguish between soils from different locations in a forensic context. Specifically, we provide recommendations for best practice, outline the potential limitations encountered in a forensic context and describe the future directions required to integrate soil DNA analysis into casework. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
de Andrade Lopes, Alneu; Minghim, Rosane; Melo, Vinícius; Paulovich, Fernando V.
2006-01-01
The current availability of information often impairs the tasks of searching, browsing and analyzing information pertinent to a topic of interest. This paper presents a methodology to create a meaningful graphical representation of document corpora targeted at supporting exploration of correlated documents. The purpose of such an approach is to produce a map of a document body on a research topic or field based on the analysis of their contents and the similarities among articles. The document map is generated, after text pre-processing, by projecting the data in two dimensions using Latent Semantic Indexing. The projection is followed by hierarchical clustering to support sub-area identification. The map can be interactively explored, helping to narrow down the search for relevant articles. Tests were performed using a collection of documents pre-classified into three research subject classes: Case-Based Reasoning, Information Retrieval, and Inductive Logic Programming. The map produced was capable of separating the main areas and placing similar documents close together, revealing possible topics and identifying boundaries between them. The tool can deal with the exploration of inter-topic and intra-topic relationships and is useful in many contexts that require deciding on relevant articles to read, such as scientific research, education, and training.
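The pipeline described above (pre-processing, two-dimensional projection with Latent Semantic Indexing, then hierarchical clustering) can be approximated with off-the-shelf libraries. The sketch below is a generic illustration under that reading, not the authors' implementation, and the example documents are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.cluster import AgglomerativeClustering

docs = [
    "case based reasoning retrieval of prior cases",
    "adaptation and reuse in case based reasoning",
    "information retrieval ranking of documents by relevance",
    "query expansion for text information retrieval",
    "inductive logic programming learns rules from examples",
    "predicate invention in inductive logic programming",
]

# Pre-process and weight terms.
tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)

# Latent Semantic Indexing: project the term-document space to 2D map coordinates.
coords = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Hierarchical clustering on the 2D map to suggest sub-areas.
labels = AgglomerativeClustering(n_clusters=3).fit_predict(coords)

for doc, (x, y), label in zip(docs, coords, labels):
    print(f"cluster {label}  ({x:+.2f}, {y:+.2f})  {doc[:45]}")
```

Each document becomes a point on the map, and the cluster labels play the role of the candidate sub-areas that the interactive exploration would then let a user drill into.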
MCNP-REN - A Monte Carlo Tool for Neutron Detector Design Without Using the Point Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abhold, M.E.; Baker, M.C.
1999-07-25
The development of neutron detectors makes extensive use of the predictions of detector response through the use of Monte Carlo techniques in conjunction with the point reactor model. Unfortunately, the point reactor model fails to accurately predict detector response in common applications. For this reason, the general Monte Carlo N-Particle code (MCNP) was modified to simulate the pulse streams that would be generated by a neutron detector and normally analyzed by a shift register. This modified code, MCNP - Random Exponentially Distributed Neutron Source (MCNP-REN), along with the Time Analysis Program (TAP), predicts neutron detector response without using the point reactor model, making it unnecessary for the user to decide whether or not the assumptions of the point model are met for their application. MCNP-REN is capable of simulating standard neutron coincidence counting as well as neutron multiplicity counting. Measurements of MOX fresh fuel made using the Underwater Coincidence Counter (UWCC) as well as measurements of HEU reactor fuel using the active neutron Research Reactor Fuel Counter (RRFC) are compared with calculations. The method used in MCNP-REN is demonstrated to be fundamentally sound and shown to eliminate the need to use the point model for detector performance predictions.
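The abstract mentions simulating detector pulse streams that would normally be analyzed by a shift register. As a loose, simplified illustration of that idea (not the MCNP-REN algorithm), the sketch below generates a random pulse train with exponentially distributed inter-arrival times and counts pulse pairs falling within a coincidence gate; the rate, measurement time and gate width are assumed, order-of-magnitude values only.

```python
import numpy as np

rng = np.random.default_rng(1)

rate = 5_000.0          # assumed mean detector count rate, counts per second
duration = 10.0         # simulated measurement time in seconds
gate = 64e-6            # coincidence gate width in seconds (typical order only)

# Exponentially distributed inter-arrival times -> Poisson-like pulse train.
inter_arrivals = rng.exponential(1.0 / rate, int(rate * duration * 1.2))
times = np.cumsum(inter_arrivals)
times = times[times < duration]

# For each pulse, count how many later pulses fall inside the gate.
gated_counts = np.searchsorted(times, times + gate) - np.arange(1, len(times) + 1)

singles = len(times)
doubles = gated_counts.sum()           # gate-correlated pairs (accidentals only here)
expected_accidentals = singles * rate * gate

print(f"singles: {singles}")
print(f"gated pairs: {doubles}, expected accidental pairs: {expected_accidentals:.0f}")
```

With a purely random source the gated pairs are all accidentals; a multiplying sample would add correlated pairs on top of this baseline, which is what coincidence and multiplicity counting exploit.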
DOE Office of Scientific and Technical Information (OSTI.GOV)
Le Goaller, C.; Doutreluingne, C.; Berton, M.A.
2007-07-01
This paper describes the methodology followed by the French Atomic Energy Commission (CEA) to decommission the buildings of former research facilities for demolition or possible reuse. It is a well known fact that the French nuclear safety authority has decided not to define any general release level for the decommissioning of nuclear facilities, thus effectively prohibiting radiological measurement-driven decommissioning. The decommissioning procedure therefore requires an intensive in-depth examination of each nuclear plant. This requires a good knowledge of the past history of the plant, and should be initiated as early as possible. The paper first describes the regulatory framework recently unveiled by the French Safety Authority, then reviews its application to ongoing decommissioning projects. The cornerstone of the strategy is the definition of waste zoning in the buildings to segregate areas producing conventional waste from those generating nuclear waste. After dismantling, suitable measurements are carried out to confirm the conventional state of the remaining walls. This requires low-level measurement methods providing a suitable detection limit within an acceptable measuring time. Although this generally involves particle counting and in-situ low level gamma spectrometry, the paper focuses on gamma spectrometry. Finally, the lessons learned from ongoing projects are discussed. (authors)
Automated branching pattern report generation for laparoscopic surgery assistance
NASA Astrophysics Data System (ADS)
Oda, Masahiro; Matsuzaki, Tetsuro; Hayashi, Yuichiro; Kitasaka, Takayuki; Misawa, Kazunari; Mori, Kensaku
2015-05-01
This paper presents a method for generating branching pattern reports of abdominal blood vessels for laparoscopic gastrectomy. In gastrectomy, it is very important to understand the branching structure of the abdominal arteries and veins, which feed and drain specific abdominal organs including the stomach, the liver and the pancreas. In the real clinical setting, a surgeon creates a diagnostic report of the patient anatomy. This report summarizes the branching patterns of the blood vessels related to the stomach, and the surgeon decides on the actual operative procedure. This paper presents an automated method to generate a branching pattern report for abdominal blood vessels based on automated anatomical labeling. The report contains a 3D rendering showing important blood vessels and descriptions of the branching patterns of each vessel. We have applied this method to fifty cases of 3D abdominal CT scans and confirmed that the proposed method can automatically generate branching pattern reports of abdominal arteries.
The cognitive and neural basis of option generation and subsequent choice.
Kaiser, Stefan; Simon, Joe J; Kalis, Annemarie; Schweizer, Sophie; Tobler, Philippe N; Mojzisch, Andreas
2013-12-01
Decision-making research has thoroughly investigated how people choose from a set of externally provided options. However, in ill-structured real-world environments, possible options for action are not defined by the situation but have to be generated by the agent. Here, we apply behavioral analysis (Study 1) and functional magnetic resonance imaging (Study 2) to investigate option generation and subsequent choice. For this purpose, we employ a new experimental task that requires participants to generate options for simple real-world scenarios and to subsequently decide among the generated options. Correlational analysis with a cognitive test battery suggests that retrieval of options from long-term memory is a relevant process during option generation. The results of the fMRI study demonstrate that option generation in simple real-world scenarios recruits the anterior prefrontal cortex. Furthermore, we show that choice behavior and its neural correlates differ between self-generated and externally provided options. Specifically, choice between self-generated options is associated with stronger recruitment of the dorsal anterior cingulate cortex. This impact of option generation on subsequent choice underlines the need for an expanded model of decision making to accommodate choice between self-generated options.
NASA Technical Reports Server (NTRS)
Young, William D.
1992-01-01
The application of formal methods to the analysis of computing systems promises to provide higher and higher levels of assurance as the sophistication of our tools and techniques increases. Improvements in tools and techniques come about as we pit the current state of the art against new and challenging problems. A promising area for the application of formal methods is in real-time and distributed computing. Some of the algorithms in this area are both subtle and important. In response to this challenge and as part of an ongoing attempt to verify an implementation of the Interactive Convergence Clock Synchronization Algorithm (ICCSA), we decided to undertake a proof of the correctness of the algorithm using the Boyer-Moore theorem prover. This paper describes our approach to proving the ICCSA using the Boyer-Moore prover.
Public access management as an adaptive wildlife management tool
Ouren, Douglas S.; Watts, Raymond D.
2005-01-01
One key issue in the Black Mesa – Black Canyon area is the interaction between motorized vehicles and wildlife. The working hypothesis for this study is that early season elk movement onto private lands and the National Park is precipitated by increased use of Off Highway Vehicles (OHVs). Data on the intensity of motorized use are extremely limited. In this study, we monitor the intensity of motorized vehicle and trail use, track its effect on elk movements and habitat usage, and analyze the interactions. If management agencies decide to alter accessibility, we will monitor wildlife responses to changes in the human-use regime. This provides a unique opportunity for adaptive management experimentation based on coordinated research and monitoring. The products from this project will provide natural resource managers across the nation with tools and information to better meet these resource challenges.
LHCb migration from Subversion to Git
NASA Astrophysics Data System (ADS)
Clemencic, M.; Couturier, B.; Closier, J.; Cattaneo, M.
2017-10-01
Due to user demand and to support new development workflows based on code review and multiple development streams, LHCb decided to port its source code management from Subversion to Git, using the CERN GitLab hosting service. Although tools exist for this kind of migration, LHCb specificities and development models required careful planning of the migration, development of migration tools, changes to the development model, and redefinition of the release procedures. Moreover, we had to support a hybrid situation with some software projects hosted in Git and others still in Subversion, or even branches of one project hosted in different systems. We present the way we addressed the special LHCb requirements, the technical details of migrating large non-standard Subversion repositories, and how we managed to smoothly migrate the software projects following the schedule of each project manager.
BioBrick assembly standards and techniques and associated software tools.
Røkke, Gunvor; Korvald, Eirin; Pahr, Jarle; Oyås, Ove; Lale, Rahmi
2014-01-01
The BioBrick idea was developed to introduce the engineering principles of abstraction and standardization into synthetic biology. BioBricks are DNA sequences that serve a defined biological function and can be readily assembled with any other BioBrick parts to create new BioBricks with novel properties. In order to achieve this, several assembly standards can be used. Which assembly standards a BioBrick is compatible with depends on the prefix and suffix sequences surrounding the part. In this chapter, five of the most common assembly standards are described, as well as some of the most used assembly techniques and cloning procedures, together with a presentation of the available software tools that can be used for deciding on the best method for assembling different BioBricks and for searching for BioBrick parts in the Registry of Standard Biological Parts database.
Pérez Vaquero, M Á; Gorria, C; Lezaun, M; López, F J; Monge, J; Eguizabal, C; Vesga, M A
2016-05-01
The management of platelet concentrate (PC) stocks is not simple given their short shelf life and variable demand. In general, managers decide on PC production based on personal experience. The objective of this study was to provide a tool to help decide how many PC units to produce each day in a more rational and objective way. From the historical data on PCs produced, transfused and discarded in the Basque Country in 2012, a mathematical model was built, based on the normality of the time series of the transfusions performed on each day of the week throughout the year. This model was implemented in an easy-to-use Excel spreadsheet and validated using real production data from 2013. Comparing with real 2013 data, in the best scenario, the number of PC units that expired was 87·7% lower, PC production, 14·3% lower and the age of the PCs transfused nearly 1-day younger in the simulation. If we want to ensure a minimum stock at the end of each day, the outdating rate and average age of the transfused PCs progressively increase. The practical application of the designed tool can facilitate decision-making about how many PC units to produce each day, resulting in very significant reductions in PC production and wastage and corresponding cost savings, together with an almost 1 day decrease in the mean age of PCs transfused. © 2016 The Authors. Vox Sanguinis published by John Wiley & Sons Ltd on behalf of International Society of Blood Transfusion.
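The abstract describes the model only at a high level (daily transfusion counts treated as approximately normal for each weekday, used to decide how many units to produce). The sketch below is a generic order-up-to illustration under that reading, with all demand figures, the service level, and the stock parameters invented rather than taken from the study.

```python
from statistics import mean, stdev
from math import ceil

# Hypothetical history: platelet units transfused on past Mondays.
monday_history = [18, 22, 19, 25, 21, 17, 23, 20, 24, 19]

mu, sigma = mean(monday_history), stdev(monday_history)
z = 1.645                      # ~95% one-sided service level (assumed target)
safety_buffer = 0              # extra units beyond demand cover (assumed)

def units_to_produce(current_stock, expiring_today):
    """Order-up-to rule: cover mean + z*sigma of the weekday demand with usable stock."""
    usable = current_stock - expiring_today
    target = mu + z * sigma + safety_buffer
    return max(0, ceil(target - usable))

print("produce today:", units_to_produce(current_stock=12, expiring_today=3))
```

Tightening or relaxing the target stock is what drives the trade-off reported in the abstract between outdating rate, production volume and the age of the units transfused.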
Predictive models in cancer management: A guide for clinicians.
Kazem, Mohammed Ali
2017-04-01
Predictive tools in cancer management are used to predict different outcomes, including survival probability or risk of recurrence. The uptake of these tools by clinicians involved in cancer management has not been as common as that of other clinical tools, which may be due to the complexity of some of these tools or a lack of understanding of how they can aid decision-making in particular clinical situations. The aim of this article is to improve clinicians' knowledge and understanding of predictive tools used in cancer management, including how they are built, how they can be applied to medical practice, and what their limitations may be. A literature review was conducted to investigate the role of predictive tools in cancer management. All predictive models share similar characteristics, but depending on the type of tool, its ability to predict an outcome will differ. Each type has its own pros and cons, and its generalisability will depend on the cohort used to build the tool. These factors will affect the clinician's decision whether to apply the model to their cohort or not. Before a model is used in clinical practice, it is important to appreciate how the model is constructed, what its use may add over and above traditional decision-making tools, and what problems or limitations may be associated with it. Understanding all the above is an important step for any clinician who wants to decide whether or not to use predictive tools in their practice. Copyright © 2016 Royal College of Surgeons of Edinburgh (Scottish charity number SC005317) and Royal College of Surgeons in Ireland. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Olivieri, Pierre
Non-destructive testing (NDT) plays an important role in the aerospace industry during the fabrication and maintenance of the structures built and is used, among other useful applications, to detect flaws such as cracks at an early stage. However, NDT techniques are still mainly done manually, especially on complex aeronautical structures, which then results in several drawbacks. In addition to being difficult and time-consuming, reliability and repeatability of inspection results are likely to be affected, since they rely on each operator's experience and dexterity. The present thesis is part of a larger project (MANU-418) of the Consortium for Research and Innovation in Aerospace in Quebec (CRIAQ). In this project, it has been proposed to develop a system using a 6-DOF manipulator arm to automate three particular NDT techniques often needed in the aerospace industry: eddy current testing (ECT), fluorescent penetrant inspection (FPI), and infrared thermography (IRT). The main objective of the MANU-418 project is to demonstrate the efficiency of the developed system and provide inspection results of surface and near surface flaws (cracks usually) at least as reliably and repeatably as inspection results from a human operator. One specific objective stemming from the main objective of the project is to develop a methodology and a software tool to generate covering paths adapted for the three aforementioned NDT techniques to inspect the complex surfaces of aerospace structures. The present thesis aims at reaching this specific objective. At first, geometrical and topological properties of the surfaces considered in this project are defined (flat surfaces, round and straight edges, cylindrical or near cylindrical surfaces, holes). It is also assumed that the 3D model of the surface to inspect is known in advance. Moreover, it has been decided within the framework of the MANU-418 project to give priority to the automation of ECT over the other techniques (FPI and IRT). As a result, the methodology developed to generate inspection paths is more closely focused on path constraints relative to the manual operations of ECT using a differential eddy current probe (named here EC probe), but it is developed to be flexible enough to be used with the other techniques as well. Common inspection paths for ECT are usually defined by a sweeping motion using a zigzag pattern with the EC probe in mild contact with the inspected surface. Moreover, the main axis of the probe must remain normal to the surface, and the alignment of its two coils must always be oriented along the direction of its motion. A first methodology is then proposed to generate covering paths on the whole surface of interest while meeting all EC probe motion constraints. First, the surface is meshed with triangular facets, and then it is subdivided into several patches such that their geometry and topology are simpler than the whole surface. Paths are then generated on each patch by intersecting their facets with offset section planes defined along a sweeping direction. Furthermore, another methodology is developed to generate paths around an indication (namely a small area where the presence of a flaw is suspected) whose position and orientation are assumed to be known a priori. Then, a software tool with a graphical user interface has been developed in the MATLAB environment to generate inspection paths based on these methodologies.
A set of path parameters can be changed by the user to obtain the desired paths (distance between passes, sweep direction, etc.). Once the paths are computed, an ordered list of coordinates (positions and orientations) of the tool is exported to an Excel spreadsheet so that it can be used with a real robot. In this research, these data are then used to perform simulations of trajectories (a path described as a function of time) with a MotoMan robot (model SV3XL) using the MotoSim software. After validation of these trajectories in this software (absence of collisions, reachability of all positions, etc.), they are finally converted into instructions for the real MotoMan robot to proceed with experimental tests. These first simulations and experiments with the generated paths on a MotoMan robot have given results close to the expected inspection trajectories used manually in the NDT techniques considered, especially for the ECT technique. Nevertheless, it is strongly recommended to validate this path generation method with more experimental tests. For instance, a "test" tool could be manufactured to measure errors of position and orientation of this tool with respect to expected trajectories on a typical complex aeronautical structure. (Abstract shortened by UMI.)
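The path-generation approach summarized above (mesh the surface, then intersect facets with offset section planes along a sweeping direction) is not detailed further in the abstract. The following sketch illustrates only the basic facet/plane intersection step on a toy triangulated surface; the geometry, plane spacing and data layout are invented for illustration.

```python
import numpy as np

def slice_triangles(vertices, triangles, axis=0, spacing=5.0):
    """Intersect a triangle mesh with planes axis = k*spacing and return,
    for each plane, the list of intersection segments (pairs of 3D points)."""
    lo, hi = vertices[:, axis].min(), vertices[:, axis].max()
    paths = {}
    for level in np.arange(lo + spacing, hi, spacing):
        segments = []
        for tri in triangles:
            pts = []
            for i, j in ((0, 1), (1, 2), (2, 0)):       # the three triangle edges
                a, b = vertices[tri[i]], vertices[tri[j]]
                da, db = a[axis] - level, b[axis] - level
                if da * db < 0:                          # edge crosses the plane
                    t = da / (da - db)
                    pts.append(a + t * (b - a))
            if len(pts) == 2:
                segments.append((pts[0], pts[1]))
        paths[round(float(level), 3)] = segments
    return paths

# Toy surface: two triangles forming a flat 20 x 10 rectangle in the XY plane.
vertices = np.array([[0., 0., 0.], [20., 0., 0.], [20., 10., 0.], [0., 10., 0.]])
triangles = [(0, 1, 2), (0, 2, 3)]

for level, segs in slice_triangles(vertices, triangles).items():
    print(f"x = {level}: {len(segs)} segment(s)")
```

Chaining the segments of each section plane and alternating the traversal direction between consecutive planes would give the zigzag covering pattern described for the eddy current probe.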
Generating DEM from LIDAR data - comparison of available software tools
NASA Astrophysics Data System (ADS)
Korzeniowska, K.; Lacka, M.
2011-12-01
In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools: both commercial and open source ones. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
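The comparison statistics listed above (minimum, maximum and mean difference plus RMSE between each generated DEM and the reference DEM) are straightforward raster operations. A minimal sketch, assuming both DEMs are already co-registered arrays of equal shape with NaNs marking no-data cells, is shown below; the tiny example grids are synthetic.

```python
import numpy as np

def dem_difference_stats(dem, reference):
    """Return min, max, mean difference and RMSE between two aligned DEM grids."""
    diff = dem - reference
    valid = diff[~np.isnan(diff)]          # ignore no-data cells
    rmse = np.sqrt(np.mean(valid ** 2))
    return valid.min(), valid.max(), valid.mean(), rmse

# Tiny synthetic example standing in for two co-registered elevation rasters.
reference = np.array([[100.0, 101.0], [102.0, np.nan]])
generated = np.array([[100.2, 100.7], [102.5, np.nan]])

dmin, dmax, dmean, rmse = dem_difference_stats(generated, reference)
print(f"min {dmin:+.2f} m, max {dmax:+.2f} m, mean {dmean:+.2f} m, RMSE {rmse:.2f} m")
```

Running the same function for each software tool's DEM against the common reference gives directly comparable error figures of the kind the study reports.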
The dark side of photovoltaic — 3D simulation of glare assessing risk and discomfort
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Thomas; Wollert, Alexander
2015-04-15
Photovoltaic (PV) systems form an important force in the implementation of renewable energies, but as we all know, the force always has its dark side. Besides efficiency considerations and discussions about architectures of power distribution networks, the increasing number of installations of PV systems for implementing renewable energies has secondary effects. PV systems can generate glare due to optical reflections and hence might be a serious concern. On the one hand, glare could affect safety, e.g. regarding traffic. On the other hand, glare is a constant source of discomfort in the vicinity of PV systems. Hence, assessment of glare is decisive for the success of renewable energies near municipalities and traffic zones. Several courts have decided on the modification of PV systems and even on their de-installation because of glare effects. Thus, location-based assessments are required to limit potential reflections and to avoid risks for public infrastructure or discomfort of residents. The question arises of how to calculate reflections accurately according to the environment's topography. Our approach is founded on a 3D-based simulation methodology to calculate and visualize reflections based on the geometry of the environment of PV systems. This computational model is implemented by an interactive tool for simulation and visualization. Hence, project planners receive flexible assistance for adjusting the parameters of solar panels amid the planning process and in particular before the installation of a PV system. - Highlights: • Solar panels cause glare that impacts neighborhoods and traffic infrastructures. • Glare might cause disability and discomfort. • 3D environment for the calculation of glare • Interactive tool to simulate and visualize reflections • Impact assessment of solar power plant farms.
Cost Optimal Design of a Power Inductor by Sequential Gradient Search
NASA Astrophysics Data System (ADS)
Basak, Raju; Das, Arabinda; Sanyal, Amarnath
2018-05-01
Power inductors are used for compensating the VAR generated by long EHV transmission lines and in electronic circuits. For EHV lines, the rating of the inductor is decided by techno-economic considerations on the basis of the line susceptance. It is a high-voltage, high-current device, absorbing little active power and a large reactive power. The cost is quite high; hence the design should be made cost-optimally. The 3-phase power inductor is similar in construction to a 3-phase core-type transformer, with the exception that it has only one winding per phase and each limb is provided with an air-gap, the length of which is decided by the inductance required. In this paper, a design methodology based on the sequential gradient search technique, and the corresponding algorithm leading to a cost-optimal design of a 3-phase EHV power inductor, has been presented. The case study has been made on a 220 kV long line of NHPC running from Chukha HPS to Birpara of Coochbihar.
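The abstract names the optimization technique but gives no algorithmic detail. As a generic illustration of a sequential gradient search (here, a simple variable-by-variable numerical-gradient descent on a made-up two-variable cost surface, not the paper's inductor cost model), consider:

```python
def cost(x):
    """Hypothetical smooth cost surface standing in for an inductor cost model."""
    core_area, turns = x
    return (core_area - 3.0) ** 2 + 0.5 * (turns - 40.0) ** 2 + 0.1 * core_area

def sequential_gradient_search(f, x0, step=0.05, h=1e-4, iters=500):
    """Descend along one design variable at a time using a numerical gradient."""
    x = list(x0)
    for _ in range(iters):
        for i in range(len(x)):
            x_plus = x.copy()
            x_plus[i] += h
            grad_i = (f(x_plus) - f(x)) / h     # forward-difference slope
            x[i] -= step * grad_i               # move this variable downhill
    return x

optimum = sequential_gradient_search(cost, [1.0, 10.0])
print("design variables at optimum:", [round(v, 3) for v in optimum])
print("cost at optimum:", round(cost(optimum), 4))
```

In a real design the variables would be quantities such as core area, number of turns and air-gap length, subject to constraints on flux density and temperature rise, but the sequential descent idea is the same.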
NASA Astrophysics Data System (ADS)
Deep, Prakash; Paninjath, Sankaranarayanan; Pereira, Mark; Buck, Peter
2016-05-01
At advanced technology nodes, mask complexity has increased because of the large-scale use of resolution enhancement technologies (RET), which include Optical Proximity Correction (OPC), Inverse Lithography Technology (ILT) and Source Mask Optimization (SMO). The number of defects detected during inspection of such masks has increased drastically, and differentiating critical from non-critical defects has become more challenging, complex and time consuming. Because of the significant defectivity of EUVL masks and the non-availability of actinic inspection, it is important, and also challenging, to predict the criticality of defects for printability on the wafer. This is one of the significant barriers for the adoption of EUVL for semiconductor manufacturing. Techniques to decide the criticality of defects from non-actinic inspection images are desired until actinic inspection becomes available. High resolution inspection of photomask images detects many defects which are used for process and mask qualification. Repairing all defects is not practical and probably not required; however, it is imperative to know which defects are severe enough to impact the wafer before repair. Additionally, a wafer printability check is always desired after repairing a defect. AIMS™ review is the industry standard for this; however, performing AIMS™ review for all defects is expensive and very time consuming. A fast, accurate and economical mechanism is desired that can predict defect printability on the wafer quickly from images captured using a high resolution inspection machine. Predicting defect printability from such images is challenging because the high resolution images do not correlate with the actual mask contours. The challenge is increased by the use of optical conditions during inspection that differ from the actual scanner conditions, so that defects found in such images do not correlate directly with their actual impact on the wafer. Our automated defect simulation tool predicts printability of defects at wafer level and automates the process of defect dispositioning from images captured using a high resolution inspection machine. It first eliminates false defects due to registration, focus errors, image capture errors and random noise caused during inspection. For the remaining real defects, actual mask-like contours are generated using the Calibre® ILT solution [1][2], which is enhanced to predict the actual mask contours from high resolution defect images. It enables accurate prediction of defect contours, which is not possible directly from the inspection images because some information is already lost due to optical effects. Calibre's simulation engine is used to generate images at wafer level using scanner optical conditions and mask-like contours as input. The tool then analyses the simulated images and predicts defect printability. It automatically calculates the maximum CD variation and decides which defects are severe enough to affect patterns on the wafer. In this paper, we assess the printability of defects for masks of advanced technology nodes. In particular, we compare the recovered mask contours with contours extracted from SEM images of the mask and compare simulation results with AIMS™ for a variety of defects and patterns. The results of the printability assessment and the accuracy of the comparison are presented. We also suggest how this method can be extended to predict the printability of defects identified on EUV photomasks.
Cameron, Duncan H; Zucchero Sarracini, Carla; Rozmovits, Linda; Naglie, Gary; Herrmann, Nathan; Molnar, Frank; Jordan, John; Byszewski, Anna; Tang-Wai, David; Dow, Jamie; Frank, Christopher; Henry, Blair; Pimlott, Nicholas; Seitz, Dallas; Vrkljan, Brenda; Taylor, Rebecca; Masellis, Mario; Rapoport, Mark J
2017-09-01
Driving in persons with dementia poses risks that must be counterbalanced against the importance of autonomy and mobility for the patient. Physicians often face substantial challenges in the assessment and reporting of driving safety for persons with dementia. This paper describes a driving in dementia decision tool (DD-DT) developed to aid physicians in deciding when to report older drivers with either mild dementia or mild cognitive impairment to local transportation administrators. A multi-faceted, computerized decision support tool was developed, using a systematic literature and guideline review, expert opinion from an earlier Delphi study, as well as qualitative interviews and focus groups with physicians, caregivers of former drivers with dementia, and transportation administrators. The tool integrates inputs from the physician-user about the patient's clinical and driving history as well as cognitive findings, and it produces a recommendation for reporting to transportation administrators. This recommendation is translated into a customized reporting form for the transportation authority, if applicable, and additional resources are provided for the patient and caregiver. An innovative approach was needed to develop the DD-DT. The literature and guideline review confirmed the algorithm derived from the earlier Delphi study, and barriers identified in the qualitative research were incorporated into the design of the tool.
A Survey of Current Russian RTG Capabilities
NASA Technical Reports Server (NTRS)
Chmielewski, A.; Borshchevsky, A.; Lange, R.; Cook, B.
1994-01-01
Supplying radioisotope thermoelectric generators (RTG) to American space missions has become very complex. The process is marred by many obstacles: high cost, lack of new developments, difficult launch approval and NEPA compliance. At the same time there are many ambitious space missions for which an RTG would indisputably be the lightest, smallest and most robust power source. An American delegation investigated the status of RTG production in Russia to decide if our product line could be supplemented by the Russian designs.
Semi-inclusive polarised lepton-nucleon scattering and the anomalous gluon contribution
NASA Astrophysics Data System (ADS)
Güllenstern, St.; Veltri, M.; Górnicki, P.; Mankiewicz, L.; Schäfer, A.
1993-08-01
We discuss a new observable for semi-inclusive pion production in polarised lepton-nucleon collisions. This observable is sensitive to the polarised and unpolarised strange quark distribution and the anomalous gluon contribution, provided that their fragmentation functions into pions differ substantially from that of light quarks. From Monte Carlo data generated with our PEPSI code we conclude that HERMES might be able to decide whether the polarized strange quark and gluon distributions are large.
Transmission expansion with smart switching under demand uncertainty and line failures
Schumacher, Kathryn M.; Chen, Richard Li-Yang; Cohn, Amy E. M.
2016-06-07
One of the major challenges in deciding where to build new transmission lines is that there is uncertainty regarding future loads, renewable generation output and equipment failures. We propose a robust optimization model whose transmission expansion solutions ensure that demand can be met over a wide range of conditions. Specifically, we require feasible operation for all loads and renewable generation levels within given ranges, and for all single transmission line failures. Furthermore, we consider transmission switching as an allowable recovery action. This relatively inexpensive method of redirecting power flows improves resiliency, but introduces computational challenges. Lastly, we present a novel algorithm to solve this model. Computational results are discussed.
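The robustness requirement described above can be illustrated with a toy capacity check: a candidate expansion plan is kept only if peak demand can still be served under every single-line failure. The sketch below uses an invented single-corridor example and a simple transport-style check; it is not the paper's model (no DC power flow, no transmission switching, and not the novel algorithm).

```python
from itertools import combinations

# Toy illustration of the robustness requirement: a candidate expansion plan
# is accepted only if demand can be served at its peak for every single-line
# failure. Simple capacity (transport) check on one corridor; all numbers are
# invented and transmission switching is not modeled.
existing_lines = [100.0, 80.0]            # capacities (MW) of existing lines
candidate_lines = [60.0, 120.0]           # capacities of lines that could be built
demand_range = (90.0, 170.0)              # low / high load to be served

def robust_feasible(built):
    lines = existing_lines + list(built)
    for outage in range(len(lines)):                  # every single-line failure
        surviving = [c for i, c in enumerate(lines) if i != outage]
        if sum(surviving) < demand_range[1]:          # must still cover peak demand
            return False
    return True

# Enumerate expansion plans (subsets of candidate lines) and keep robust ones.
plans = []
for r in range(len(candidate_lines) + 1):
    for built in combinations(candidate_lines, r):
        if robust_feasible(built):
            plans.append(built)

print("robust expansion plans (capacities built):", plans)
```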
Rapid SAW Sensor Development Tools
NASA Technical Reports Server (NTRS)
Wilson, William C.; Atkinson, Gary M.
2007-01-01
The lack of integrated design tools for Surface Acoustic Wave (SAW) devices has led us to develop tools for the design, modeling, analysis, and automatic layout generation of SAW devices. These tools enable rapid development of wireless SAW sensors. The tools developed have been designed to integrate into existing Electronic Design Automation (EDA) tools to take advantage of existing 3D modeling, and Finite Element Analysis (FEA). This paper presents the SAW design, modeling, analysis, and automated layout generation tools.
EMI datalib - joining the best of ARC and gLite data libraries
NASA Astrophysics Data System (ADS)
Nilsen, J. K.; Cameron, D.; Devresse, A.; Molnar, Zs; Nagy, Zs; Salichos, M.
2012-12-01
To manage data in the grid, with its jungle of protocols and enormous amount of data in different storage solutions, it is important to have a strong, versatile and reliable data management library. While there are several data management tools and libraries available, they all have different strengths and weaknesses, and it can be hard to decide which tool to use for which purpose. EMI is a collaboration between the European middleware providers aiming to take the best out of each middleware to create one consolidated, all-purpose grid middleware. When EMI started there were two main tools for managing data - gLite had lcg_util and the GFAL library, ARC had the ARC data tools and libarcdata2. While different in design and purpose, they both have the same goal: to manage data in the grid. The design of the new EMI datalib was ready by the end of 2011, and a first prototype is now implemented and going through a thorough testing phase. This presentation will give the latest results of the consolidated library together with an overview of the design, test plan and roadmap of EMI datalib.
Guidelines for the analysis of free energy calculations
Klimovich, Pavel V.; Shirts, Michael R.; Mobley, David L.
2015-01-01
Free energy calculations based on molecular dynamics (MD) simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub at https://github.com/choderalab/pymbar-examples, that implements the analysis practices reviewed here for several reference simulation packages, which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope these tools and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations. PMID:25808134
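As a minimal illustration of the thermodynamic integration estimator that the review covers, the sketch below integrates hypothetical <dU/dλ> averages over λ with trapezoid weights. It is not the alchemical-analysis.py implementation; the numbers and array names are placeholders.

```python
import numpy as np

# Minimal sketch of a thermodynamic integration (TI) estimate, assuming
# <dU/dlambda> has already been averaged at each lambda window. Illustration
# only, not the alchemical-analysis.py implementation; data are hypothetical.
lambdas = np.array([0.0, 0.25, 0.5, 0.75, 1.0])       # coupling parameter
dudl_means = np.array([52.1, 30.4, 14.9, 5.2, 0.8])   # <dU/dlambda>, kJ/mol
dudl_sems = np.array([0.6, 0.5, 0.4, 0.3, 0.2])       # standard errors per window

# Trapezoid quadrature weights over the lambda grid.
weights = np.empty_like(lambdas)
weights[1:-1] = (lambdas[2:] - lambdas[:-2]) / 2.0
weights[0] = (lambdas[1] - lambdas[0]) / 2.0
weights[-1] = (lambdas[-1] - lambdas[-2]) / 2.0

delta_G = np.sum(weights * dudl_means)                 # TI free energy difference

# Crude error propagation assuming independent windows.
delta_G_err = np.sqrt(np.sum((weights * dudl_sems) ** 2))

print(f"Delta G (TI) = {delta_G:.2f} +/- {delta_G_err:.2f} kJ/mol")
```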
An online tool for tracking soil nitrogen
NASA Astrophysics Data System (ADS)
Wang, J.; Umar, M.; Banger, K.; Pittelkow, C. M.; Nafziger, E. D.
2016-12-01
Near real-time crop models can be useful tools for optimizing agricultural management practices. For example, model simulations can potentially provide current estimates of nitrogen availability in soil, helping growers decide whether more nitrogen needs to be applied in a given season. Traditionally, crop models have been used at point locations (i.e. single fields) with homogeneous soil, climate and initial conditions. However, nitrogen availability across fields with varied weather and soil conditions at a regional or national level is necessary to guide better management decisions. This study presents the development of a publicly available, online tool that automates the integration of high-spatial-resolution forecast and past weather and soil data in DSSAT to estimate nitrogen availability for individual fields in Illinois. The model has been calibrated with field experiments from the past year at six research corn fields across Illinois. These sites received N fertilizer applications with different timings and amounts. The tool requires minimal management information from growers and yet has the capability to simulate nitrogen-water-crop interactions with calibrated parameters that are more appropriate for Illinois. The results from the tool will be combined with incoming field experiment data from 2016 for model validation and further improvement of the model's predictive accuracy. The tool has the potential to help guide better nitrogen management practices to maximize economic and environmental benefits.
Memory function and supportive technology
Charness, Neil; Best, Ryan; Souders, Dustin
2013-01-01
Episodic and working memory processes show pronounced age-related decline, with other memory processes such as semantic, procedural, and metamemory less affected. Older adults tend to complain the most about prospective and retrospective memory failures. We introduce a framework for deciding how to mitigate memory decline using augmentation and substitution and discuss techniques that change the user, through mnemonics training, and change the tool or environment, by providing environmental support. We provide examples of low-tech and high-tech memory supports and discuss constraints on the utility of high-tech systems including effectiveness of devices, attitudes toward memory aids, and reliability of systems. PMID:24379752
Yellow fever vaccine: worthy friend or stealthy foe?
Seligman, Stephen J; Casanova, Jean-Laurent
2016-06-01
Recognition that the live yellow fever vaccine may rarely be associated with viscerotropic disease (YEL-AVD) has diminished its safety status. However, the vaccine remains the principal tool for limiting the occurrence of yellow fever, making large portions of Africa and South America more habitable. The subject has previously been exhaustively reviewed. Novel concepts in the current report include the description of a systematic method for deciding whom to vaccinate, recommendations for obtaining data helpful in making that decision, and suggestions for additional study. The vaccine is indeed a worthy friend, but its adverse reactions need to be recognized.
Temporal logics and real time expert systems.
Blom, J A
1996-10-01
This paper introduces temporal logics. Due to the eternal compromise between expressive adequacy and reasoning efficiency that must be decided upon in any application, full (first order logic or modal logic based) temporal logics are frequently not suitable. This is especially true in real time expert systems, where a fixed (and usually small) response time must be guaranteed. One such expert system, Fagan's VM, is reviewed, and a delineation is given of how to formally describe and reason with time in medical protocols. It is shown that Petri net theory is a useful tool to check the correctness of formalised protocols.
[Adaptation of the Medical Office Survey on Patient Safety Culture (MOSPSC) tool].
Silvestre-Busto, C; Torijano-Casalengua, M L; Olivera-Cañadas, G; Astier-Peña, M P; Maderuelo-Fernández, J A; Rubio-Aguado, E A
2015-01-01
To adapt the Medical Office Survey on Patient Safety Culture (MOSPSC) Excel(®) tool for its use by Primary Care Teams of the Spanish National Public Health System. The process of translation and adaptation of MOSPSC from the Agency for Healthcare Research and Quality (AHRQ) was performed in five steps: Original version translation, Conceptual equivalence evaluation, Acceptability and viability assessment, Content validity and Questionnaire test and response analysis, and psychometric properties assessment. After confirming MOSPSC as a valid, reliable, consistent and useful tool for assessing patient safety culture in our setting, an Excel(®) worksheet was translated and adapted in the same way. It was decided to develop a tool to analyze the "Spanish survey" and to keep it linked to the "Original version" tool. The "Spanish survey" comparison data are those obtained in a 2011 nationwide Spanish survey, while the "Original version" comparison data are those provided by the AHRQ in 2012. The translated and adapted tool and the analysis of the results from a 2011 nationwide Spanish survey are available on the website of the Ministry of Health, Social Services and Equality. It allows the questions which are decisive in the different dimensions to be determined, and it provides a comparison of the results with graphical representation. Translation and adaptation of this tool enables a patient safety culture in Primary Care in Spain to be more effectively applied. Copyright © 2014 SECA. Published by Elsevier España. All rights reserved.
25 CFR 39.133 - Who decides how Language Development funds can be used?
Code of Federal Regulations, 2012 CFR
2012-04-01
... SCHOOL EQUALIZATION PROGRAM, Indian School Equalization Formula, Language Development Programs. § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school boards decide...
DECIDE: a software for computer-assisted evaluation of diagnostic test performance.
Chiecchio, A; Bo, A; Manzone, P; Giglioli, F
1993-05-01
The evaluation of the performance of clinical tests is a complex problem involving different steps and many statistical tools, not always structured in an organic and rational system. This paper presents software that provides an organic system of statistical tools to help evaluate clinical test performance. The program allows (a) the building and organization of a working database, (b) the selection of the minimal set of tests with the maximum information content, (c) the search for the model best fitting the distribution of the test values, (d) the selection of the optimal diagnostic cut-off value of the test for every positive/negative situation, and (e) the evaluation of the performance of combinations of correlated and uncorrelated tests. The uncertainty associated with all the variables involved is evaluated. The program works in an MS-DOS environment with an EGA or better graphics card.
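Step (d), selecting an optimal diagnostic cut-off, can be illustrated with the common Youden-index criterion, as in the sketch below; the simulated test values are hypothetical and this is not the DECIDE program itself.

```python
import numpy as np

# Illustrative sketch of one step described above, (d): choosing an optimal
# diagnostic cut-off with the common Youden-index criterion. Hypothetical
# test values; this is not the DECIDE program itself.
rng = np.random.default_rng(0)
healthy = rng.normal(10.0, 2.0, 200)     # test values, disease-negative subjects
diseased = rng.normal(14.0, 3.0, 120)    # test values, disease-positive subjects

values = np.concatenate([healthy, diseased])
labels = np.concatenate([np.zeros_like(healthy), np.ones_like(diseased)])

best_cut, best_j = None, -1.0
for cut in np.unique(values):
    sens = np.mean(values[labels == 1] >= cut)   # sensitivity at this cut-off
    spec = np.mean(values[labels == 0] < cut)    # specificity at this cut-off
    j = sens + spec - 1.0                        # Youden's J statistic
    if j > best_j:
        best_cut, best_j = cut, j

print(f"optimal cut-off ~ {best_cut:.2f} (Youden J = {best_j:.2f})")
```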
Experience Using Formal Methods for Specifying a Multi-Agent System
NASA Technical Reports Server (NTRS)
Rouff, Christopher; Rash, James; Hinchey, Michael; Szczur, Martha R. (Technical Monitor)
2000-01-01
The process and results of using formal methods to specify the Lights Out Ground Operations System (LOGOS) is presented in this paper. LOGOS is a prototype multi-agent system developed to show the feasibility of providing autonomy to satellite ground operations functions at NASA Goddard Space Flight Center (GSFC). After the initial implementation of LOGOS the development team decided to use formal methods to check for race conditions, deadlocks and omissions. The specification exercise revealed several omissions as well as race conditions. After completing the specification, the team concluded that certain tools would have made the specification process easier. This paper gives a sample specification of two of the agents in the LOGOS system and examples of omissions and race conditions found. It concludes with describing an architecture of tools that would better support the future specification of agents and other concurrent systems.
Experiences on developing digital down conversion algorithms using Xilinx system generator
NASA Astrophysics Data System (ADS)
Xu, Chengfa; Yuan, Yuan; Zhao, Lizhi
2013-07-01
The Digital Down Conversion (DDC) algorithm is a classical signal processing method which is widely used in radar and communication systems. In this paper, the DDC function is implemented with the Xilinx System Generator tool on an FPGA. System Generator is an FPGA design tool provided by Xilinx Inc. and MathWorks Inc. It makes it very convenient for programmers to manipulate the design and debug the function, especially for complex algorithms. The development of the DDC function with System Generator shows that System Generator is a very fast and efficient tool for FPGA design.
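For readers unfamiliar with the DDC chain itself, the sketch below is a floating-point behavioural reference of the classical algorithm (NCO mixing, low-pass filtering, decimation) in Python; it is illustrative only and not the fixed-point Xilinx System Generator design described in the paper. The sample rate, IF and decimation factor are arbitrary example values.

```python
import numpy as np
from scipy.signal import firwin, lfilter

# Floating-point reference of the classical DDC chain (mix with an NCO,
# low-pass filter, decimate). Behavioural sketch only, not the System
# Generator fixed-point design; all parameters are example values.
fs = 1.0e6          # input sample rate, Hz
f_if = 250.0e3      # intermediate frequency to shift to baseband
decim = 8           # decimation factor

t = np.arange(10000) / fs
x = np.cos(2 * np.pi * (f_if + 2.0e3) * t)          # test tone 2 kHz above the IF

nco = np.exp(-2j * np.pi * f_if * t)                # numerically controlled oscillator
baseband = x * nco                                  # complex mix down to ~2 kHz

taps = firwin(numtaps=129, cutoff=fs / (2 * decim), fs=fs)  # anti-alias low-pass FIR
filtered = lfilter(taps, 1.0, baseband)
y = filtered[::decim]                               # decimated complex baseband output

print(y.shape, np.abs(y[200]))
```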
PDS4: Harnessing the Power of Generate and Apache Velocity
NASA Astrophysics Data System (ADS)
Padams, J.; Cayanan, M.; Hardman, S.
2018-04-01
The PDS4 Generate Tool is a Java-based command-line tool developed by the Cartography and Imaging Sciences Nodes (PDSIMG) for generating PDS4 XML labels from Apache Velocity templates and input metadata.
E-DECIDER Disaster Response and Decision Support Cyberinfrastructure: Technology and Challenges
NASA Astrophysics Data System (ADS)
Glasscoe, M. T.; Parker, J. W.; Pierce, M. E.; Wang, J.; Eguchi, R. T.; Huyck, C. K.; Hu, Z.; Chen, Z.; Yoder, M. R.; Rundle, J. B.; Rosinski, A.
2014-12-01
Timely delivery of critical information to decision makers during a disaster is essential to response and damage assessment. Key issues for an efficient emergency response after a natural disaster include rapidly processing and delivering this critical information to emergency responders and reducing human intervention as much as possible. Essential elements of information necessary to achieve situational awareness are often generated by a wide array of organizations and disciplines, using any number of geospatial and non-geospatial technologies. A key challenge is that the current state of practice does not easily support information sharing and technology interoperability. NASA E-DECIDER (Emergency Data Enhanced Cyber-Infrastructure for Disaster Evaluation and Response) has worked with the California Earthquake Clearinghouse and its partners to address these issues and challenges by adopting the XChangeCore Web Service Data Orchestration technology and participating in several earthquake response exercises. The E-DECIDER decision support system provides rapid delivery of advanced situational awareness data products to operations centers and emergency responders in the field. Remote sensing and hazard data, model-based map products, information from simulations, damage detection, and crowdsourcing are integrated into a single geospatial view and delivered through a service oriented architecture for improved decision-making and then directly to mobile devices of responders. By adopting a Service Oriented Architecture based on Open Geospatial Consortium standards, the system provides an extensible, comprehensive framework for geospatial data processing and distribution on Cloud platforms and other distributed environments. While the Clearinghouse and its partners are not first responders, they do support the emergency response community by providing information about the damaging effects of earthquakes. It is critical for decision makers to maintain situational awareness of potential and current conditions, possible impacts on populations and infrastructure, and other key information. E-DECIDER and the Clearinghouse have worked together to address many of these issues and challenges to deliver interoperable, authoritative decision support products.
NASA Technical Reports Server (NTRS)
Noble, Viveca K.
1994-01-01
When data is transmitted through a noisy channel, errors are produced within the data, rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes the CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2^8) with an interleave depth of 5 as the outermost code, a (7, 1/2) convolutional code as an inner code and the CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise ratio required for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability length is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
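The role of interleaving mentioned above can be illustrated with a minimal block interleaver at depth 5: a channel burst is spread so that each Reed-Solomon codeword sees only a few symbol errors. The sketch below is a generic illustration with invented data, not the simulator's implementation.

```python
import numpy as np

# Minimal block interleaver/deinterleaver sketch at interleave depth 5,
# illustrating how burst errors are spread across Reed-Solomon codewords.
# Generic illustration with invented data, not the MSFC simulator code.
DEPTH = 5          # interleave depth used with the (255,223) RS code
N = 255            # RS codeword length in symbols

def interleave(symbols):
    # Write row-by-row (one codeword per row), read column-by-column.
    return np.asarray(symbols).reshape(DEPTH, N).T.flatten()

def deinterleave(symbols):
    # Inverse operation: write column-by-column, read row-by-row.
    return np.asarray(symbols).reshape(N, DEPTH).T.flatten()

data = np.arange(DEPTH * N)
channel = interleave(data)
channel[100:110] = -1                      # a 10-symbol burst on the channel
received = deinterleave(channel)

# After deinterleaving, the burst is spread so each codeword sees at most
# ceil(10 / DEPTH) = 2 corrupted symbols, well within RS(255,223) capability.
per_codeword = [(received.reshape(DEPTH, N)[i] == -1).sum() for i in range(DEPTH)]
print(per_codeword)
```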
NASA Astrophysics Data System (ADS)
Pérez-Aparicio, Elena; Lillo-Bravo, Isidoro; Moreno-Tejera, Sara; Silva-Pérez, Manuel
2017-06-01
Thermal energy for industrial processes can be generated using solar thermal (ST) or photovoltaic (PV) solar energy. ST energy has traditionally been the most favorable option due to its cost and efficiency. Current cost and efficiency values make PV solar energy an alternative to ST energy as a supplier of industrial process heat. The aim of this study is to provide a useful tool to decide in each case which option is economically and environmentally the most suitable alternative. The methodology used to compare ST and PV systems is based on the calculation of the levelized cost of energy (LCOE) and greenhouse gas emissions (GHG) avoided by using renewable technologies instead of conventional sources of energy. In both cases, these calculations depend on costs and efficiencies associated with ST or PV systems and the conversion factor from thermal or electrical energy to GHG. To make these calculations, a series of hypotheses are assumed related to consumer and energy prices, operation, maintenance and replacement costs, lifetime of the system or working temperature of the industrial process. This study applies the methodology at five different sites which have been selected taking into account their radiometric and meteorological characteristics. In the case of ST energy three technologies are taken into account: compound parabolic concentrator (CPC), linear Fresnel collector (LFC) and parabolic trough collector (PTC). The PV option includes two ways of using the generated electricity: an electrical resistance or a combination of an electrical resistance and a heat pump (HP). Current cost and efficiency values mean that the ST system remains the most favorable option. These parameters may vary significantly over time, and their evolution may make PV systems the most favorable option for particular applications.
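A minimal sketch of the two comparison metrics (LCOE and avoided GHG emissions) is given below; the formula is the standard discounted-cost-over-discounted-energy definition, and all capital costs, yields and emission factors are invented placeholders rather than the study's assumptions.

```python
import numpy as np

# Minimal sketch of the comparison metrics described above: levelized cost of
# energy (LCOE) and avoided greenhouse-gas emissions. All numbers are
# hypothetical placeholders, not the study's assumptions.
def lcoe(capex, annual_opex, annual_energy_kwh, lifetime_yr, discount_rate):
    """Discounted lifetime cost divided by discounted lifetime energy."""
    years = np.arange(1, lifetime_yr + 1)
    disc = (1 + discount_rate) ** -years
    costs = capex + np.sum(annual_opex * disc)
    energy = np.sum(annual_energy_kwh * disc)
    return costs / energy        # currency per kWh of process heat

# Solar-thermal vs PV + electric-resistance options for the same heat demand.
lcoe_st = lcoe(capex=120000, annual_opex=1800, annual_energy_kwh=210000,
               lifetime_yr=25, discount_rate=0.05)
lcoe_pv = lcoe(capex=90000, annual_opex=1200, annual_energy_kwh=160000,
               lifetime_yr=25, discount_rate=0.05)

# GHG avoided relative to a natural-gas boiler (illustrative emission factor).
ef_gas_kg_per_kwh_th = 0.2
ghg_avoided_st = 210000 * ef_gas_kg_per_kwh_th     # kg CO2-eq per year

print(f"LCOE ST = {lcoe_st:.3f}, LCOE PV = {lcoe_pv:.3f} EUR/kWh_th")
print(f"GHG avoided (ST) = {ghg_avoided_st:.0f} kg CO2-eq/yr")
```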
Irrelevance Reasoning in Knowledge Based Systems
NASA Technical Reports Server (NTRS)
Levy, A. Y.
1993-01-01
This dissertation considers the problem of reasoning about irrelevance of knowledge in a principled and efficient manner. Specifically, it is concerned with two key problems: (1) developing algorithms for automatically deciding what parts of a knowledge base are irrelevant to a query and (2) the utility of relevance reasoning. The dissertation describes a novel tool, the query-tree, for reasoning about irrelevance. Based on the query-tree, we develop several algorithms for deciding what formulas are irrelevant to a query. Our general framework sheds new light on the problem of detecting independence of queries from updates. We present new results that significantly extend previous work in this area. The framework also provides a setting in which to investigate the connection between the notion of irrelevance and the creation of abstractions. We propose a new approach to research on reasoning with abstractions, in which we investigate the properties of an abstraction by considering the irrelevance claims on which it is based. We demonstrate the potential of the approach for the cases of abstraction of predicates and projection of predicate arguments. Finally, we describe an application of relevance reasoning to the domain of modeling physical devices.
Development and evaluation of the DECIDE to move! Physical activity educational video.
Majid, Haseeb M; Schumann, Kristina P; Doswell, Angela; Sutherland, June; Hill Golden, Sherita; Stewart, Kerry J; Hill-Briggs, Felicia
2012-01-01
To develop a video that provides accessible and usable information about the importance of physical activity to type 2 diabetes self-management and ways of incorporating physical activity into everyday life. A 15-minute physical activity educational video narrated by US Surgeon General Dr Regina Benjamin was developed and evaluated. The video addresses the following topics: the effects of exercise on diabetes, preparations for beginning physical activity, types of physical activity, safety considerations (eg, awareness of symptoms of hypoglycemia during activity), and goal setting. Two patient screening groups were held for evaluation and revision of the video. Patient satisfaction ratings ranged 4.6 to 4.9 out of a possible 5.0 on dimensions of overall satisfaction, how informative they found the video to be, how well the video held their interest and attention, how easy the video was to understand, and how easy the video was to see and hear. Patients reported the educational video effective in empowering them to take strides toward increasing and maintaining physical activity in their lives. The tool is currently used in a clinical research trial, Project DECIDE, as one component of a diabetes and cardiovascular disease self-management program.
Binomial test statistics using Psi functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, Kimiko o
2007-01-01
For the negative binomial model (probability generating function (p + 1 - pt)^(-k)) a logarithmic derivative is the psi function difference ψ(k + x) - ψ(k); this and its derivatives lead to a test statistic to decide on the validity of a specified model. The test statistic uses a data base so there exists a comparison available between theory and application. Note that the test function is not dominated by outliers. Applications to (i) Fisher's tick data, (ii) accidents data, (iii) Weldon's dice data are included.
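The psi-function building block can be evaluated directly with scipy, as in the sketch below; the dispersion parameter and counts are invented, and this illustrates only the ψ(k + x) − ψ(k) term, not the full test statistic reported in the paper.

```python
import numpy as np
from scipy.special import digamma

# Sketch of the psi-function building block described above: for negative
# binomial counts x with dispersion parameter k, evaluate psi(k + x) - psi(k).
# The counts and k are illustrative; this is not the paper's full statistic.
k = 2.5
counts = np.array([0, 1, 1, 2, 3, 0, 5, 2, 1, 4])    # hypothetical observed counts

# For integer x this equals sum_{j=0}^{x-1} 1/(k + j).
psi_diff = digamma(k + counts) - digamma(k)
print(psi_diff.mean())
```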
Diagnosis and Threat Detection Capabilities of the SERENITY Monitoring Framework
NASA Astrophysics Data System (ADS)
Tsigkritis, Theocharis; Spanoudakis, George; Kloukinas, Christos; Lorenzoli, Davide
The SERENITY monitoring framework offers mechanisms for diagnosing the causes of violations of security and dependability (S&D) properties and detecting potential violations of such properties, called "Cthreats". Diagnostic information and threat detection are often necessary for deciding what an appropriate reaction to a violation is and taking pre-emptive actions against predicted violations, respectively. In this chapter, we describe the mechanisms of the SERENITY monitoring framework which generate diagnostic information for violations of S&D properties and detecting threats.
Leyrat, Clémence; Caille, Agnès; Foucher, Yohann; Giraudeau, Bruno
2016-01-22
Despite randomization, baseline imbalance and confounding bias may occur in cluster randomized trials (CRTs). Covariate imbalance may jeopardize the validity of statistical inferences if it occurs on prognostic factors. Thus, the diagnosis of such an imbalance is essential to adjust the statistical analysis if required. We developed a tool based on the c-statistic of the propensity score (PS) model to detect global baseline covariate imbalance in CRTs and assess the risk of confounding bias. We performed a simulation study to assess the performance of the proposed tool and applied this method to analyze the data from 2 published CRTs. The proposed method had good performance for large sample sizes (n = 500 per arm) and when the number of unbalanced covariates was not too small as compared with the total number of baseline covariates (≥40% of unbalanced covariates). We also provide a strategy for preselection of the covariates to be included in the PS model to enhance imbalance detection. The proposed tool could be useful in deciding whether covariate adjustment is required before performing statistical analyses of CRTs.
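The core idea, using the c-statistic of a propensity score model for arm membership as a global imbalance measure, can be sketched as below with simulated data; the authors' tool additionally addresses clustering and covariate preselection, which this illustration omits.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Sketch of the core idea above: the c-statistic of a propensity-score model
# (treatment arm regressed on baseline covariates) as a global measure of
# covariate imbalance. Simulated data; the authors' tool also handles
# clustering and covariate preselection, which this sketch omits.
rng = np.random.default_rng(1)
n = 1000
X = rng.normal(size=(n, 6))                               # baseline covariates
arm = rng.binomial(1, 1 / (1 + np.exp(-0.4 * X[:, 0])))   # mild imbalance on X1

ps_model = LogisticRegression(max_iter=1000).fit(X, arm)
ps = ps_model.predict_proba(X)[:, 1]                      # estimated propensity scores
c_stat = roc_auc_score(arm, ps)

# A c-statistic near 0.5 suggests good balance; larger values flag imbalance.
print(f"propensity-score c-statistic = {c_stat:.3f}")
```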
Niu, Sheng-Yong; Yang, Jinyu; McDermaid, Adam; Zhao, Jing; Kang, Yu; Ma, Qin
2017-05-08
Metagenomic and metatranscriptomic sequencing approaches are more frequently being used to link microbiota to important diseases and ecological changes. Many analyses have been used to compare the taxonomic and functional profiles of microbiota across habitats or individuals. While a large portion of metagenomic analyses focus on species-level profiling, some studies use strain-level metagenomic analyses to investigate the relationship between specific strains and certain circumstances. Metatranscriptomic analysis provides another important insight into activities of genes by examining gene expression levels of microbiota. Hence, combining metagenomic and metatranscriptomic analyses will help understand the activity or enrichment of a given gene set, such as drug-resistant genes among microbiome samples. Here, we summarize existing bioinformatics tools of metagenomic and metatranscriptomic data analysis, the purpose of which is to assist researchers in deciding the appropriate tools for their microbiome studies. Additionally, we propose an Integrated Meta-Function mapping pipeline to incorporate various reference databases and accelerate functional gene mapping procedures for both metagenomic and metatranscriptomic analyses. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Evaluation of the efficiency and reliability of software generated by code generators
NASA Technical Reports Server (NTRS)
Schreur, Barbara
1994-01-01
There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.
Choice and Decision Processes and Careers. Information Series No. 7.
ERIC Educational Resources Information Center
Tiedeman, David V.; Miller-Tiedeman, Anna
The state of the art review depicts comprehension of choice and decision making as knowable and demonstrates that something is already known about both processes and how to facilitate their comprehension. The assumption is made that deciding to choose, choosing to decide, deciding in order to choose, and choosing in order to decide must be…
Test-Case Generation using an Explicit State Model Checker Final Report
NASA Technical Reports Server (NTRS)
Heimdahl, Mats P. E.; Gao, Jimin
2003-01-01
In the project 'Test-Case Generation using an Explicit State Model Checker' we have extended an existing tools infrastructure for formal modeling to export Java code so that we can use the NASA Ames tool Java Pathfinder (JPF) for test case generation. We have completed a translator from our source language RSML-e to Java and conducted initial studies of how JPF can be used as a testing tool. In this final report, we provide a detailed description of the translation approach as implemented in our tools.
Bayesian networks for satellite payload testing
NASA Astrophysics Data System (ADS)
Przytula, Krzysztof W.; Hagen, Frank; Yung, Kar
1999-11-01
Satellite payloads are fast increasing in complexity, resulting in commensurate growth in the cost of manufacturing and operation. A need exists for a software tool that would assist engineers in the production and operation of satellite systems. We have designed and implemented a software tool that performs part of this task. The tool aids a test engineer in debugging satellite payloads during system testing. At this stage of satellite integration and testing, both the tested payload and the testing equipment represent complicated systems consisting of a very large number of components and devices. When an error is detected during execution of a test procedure, the tool presents to the engineer a ranked list of potential sources of the error and a list of recommended further tests. The engineer then decides on this basis whether to perform some of the recommended additional tests or to replace the suspect component. The tool has been installed in a payload testing facility. The tool is based on Bayesian networks, a graphical method of representing uncertainty in terms of probabilistic influences. The Bayesian network was configured using detailed flow diagrams of testing procedures and block diagrams of the payload and testing hardware. The conditional and prior probability values were initially obtained from experts and refined in later stages of design. The Bayesian network provided a very informative model of the payload and testing equipment and inspired many new ideas regarding future test procedures and testing equipment configurations. The tool is the first step in developing a family of tools for various phases of satellite integration and operation.
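The ranking behaviour described above can be illustrated with a much simpler calculation than a full Bayesian network: a per-component Bayes update given a failed test, as in the hypothetical sketch below. Priors and likelihoods are invented; the actual tool models the payload and test equipment jointly.

```python
import numpy as np

# Minimal illustration of the ranking idea: given a failed test step, rank
# candidate error sources by posterior probability of being faulty. This is a
# two-hypothesis Bayes update per component (faulty vs OK), ignoring
# interactions; priors and likelihoods are invented placeholders, not values
# from the real tool, which uses a full Bayesian network.
candidates = ["payload_receiver", "test_cable", "signal_generator", "software_config"]
prior = np.array([0.05, 0.15, 0.10, 0.20])                # P(component faulty)
p_fail_given_fault = np.array([0.95, 0.90, 0.80, 0.60])   # P(test fails | faulty)
p_fail_given_ok = 0.02                                    # false-alarm rate

# Posterior per component, considering each component in isolation.
joint_fault = prior * p_fail_given_fault
joint_ok = (1 - prior) * p_fail_given_ok
posterior = joint_fault / (joint_fault + joint_ok)

for name, p in sorted(zip(candidates, posterior), key=lambda t: -t[1]):
    print(f"{name:18s} P(faulty | test failed) = {p:.2f}")
```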
ESAS Deliverable PS 1.1.2.3: Customer Survey on Code Generations in Safety-Critical Applications
NASA Technical Reports Server (NTRS)
Schumann, Johann; Denney, Ewen
2006-01-01
Automated code generators (ACG) are tools that convert a (higher-level) model of a software (sub-)system into executable code without the necessity for a developer to actually implement the code. Although both commercially supported and in-house tools have been used in many industrial applications, little data exists on how these tools are used in safety-critical domains (e.g., spacecraft, aircraft, automotive, nuclear). The aims of the survey, therefore, were threefold: 1) to determine if code generation is primarily used as a tool for prototyping, including design exploration and simulation, or for flight/production code; 2) to determine the verification issues with code generators relating, in particular, to qualification and certification in safety-critical domains; and 3) to determine perceived gaps in functionality of existing tools.
Stock or cash? The trade-offs for buyers and sellers in mergers and acquisitions.
Rappaport, A; Sirower, M L
1999-01-01
In 1988, less than 2% of large deals were paid for entirely in stock; by 1998, that number had risen to 50%. The shift has profound ramifications for shareholders of both the acquiring and acquired companies. In this article, the authors provide a framework and two simple tools to guide boards of both companies through the issues they need to consider when making decisions about how to pay for--and whether to accept--a deal. First an acquirer has to decide whether to finance the deal using stock or pay cash. Second, if the acquirer decides to issue stock, it then must decide whether to offer a fixed value of shares or a fixed number of them. Offering cash places all the potential risks and rewards with the acquirer--and sends a strong signal to the markets that it has confidence in the value not only of the deal but in its own stock. By issuing shares, however, an acquirer in essence offers to share the newly merged company with the stockholders of the acquired company--a signal the market often interprets as a lack of confidence in the value of the acquirer's stock. Offering a fixed number of shares reinforces that impression because it requires the selling stockholders to share the risk that the value of the acquirer's stock will decline before the deal goes through. Offering a fixed value of shares sends a more confident signal to the markets, as the acquirer assumes all of that risk. The choice between cash and stock should never be made without full and careful consideration of the potential consequences. The all-too-frequent disappointing returns from stock transactions underscore how important the method of payment truly is.
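The difference between a fixed-number and a fixed-value share offer can be made concrete with a small numeric example, sketched below with invented prices; it shows who bears the pre-closing price risk in each structure.

```python
# Numeric illustration of the payment choices discussed above: in a
# fixed-number-of-shares offer the seller's shareholders bear pre-closing
# price risk, while in a fixed-value offer the acquirer does. Prices invented.
deal_value = 1_000_000_000          # agreed value of the target, $
price_at_announcement = 50.0        # acquirer share price when the deal is signed
price_at_closing = 40.0             # acquirer share price when the deal closes

# Fixed number of shares: the ratio is set at announcement, the value floats.
fixed_shares = deal_value / price_at_announcement
value_to_seller_fixed_shares = fixed_shares * price_at_closing

# Fixed value of shares: the share count adjusts at closing, the value holds.
shares_issued_fixed_value = deal_value / price_at_closing
value_to_seller_fixed_value = shares_issued_fixed_value * price_at_closing

print(f"fixed shares : seller receives ${value_to_seller_fixed_shares:,.0f}")
print(f"fixed value  : seller receives ${value_to_seller_fixed_value:,.0f} "
      f"but acquirer issues {shares_issued_fixed_value:,.0f} shares")
```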
Decision-making under uncertainty: results from an experiment conducted at EGU 2012
NASA Astrophysics Data System (ADS)
Ramos, Maria-Helena; van Andel, Schalk Jan; Pappenberger, Florian
2013-04-01
Do probabilistic forecasts lead to better decisions? At the EGU General Assembly 2012, we conducted a laboratory-style experiment to address this question. Several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision makers. Participants were prompted to make decisions when forecasts were provided with and without uncertainty information. They had to decide whether to open or not a gate which was the inlet of a retention basin designed to protect a town. The rules were such that: if they decided to open the gate, the retention basin was flooded and the farmers in this basin demanded a compensation for flooding their land; if they decided not to open the gate and a flood occurred on the river, the town was flooded and they had to pay a fine to the town. Participants were encouraged to keep note of their individual decisions in a worksheet. About 100 worksheets were collected at the end of the game and the results of their evaluation are presented here. In general, they show that decisions are based on a combination of what is displayed by the expected (forecast) value and what is given by the uncertainty information. In the absence of uncertainty information, decision makers are compelled towards a more risk-averse attitude. Besides, more money was lost by a large majority of participants when they had to make decisions without uncertainty information. Limitations of the experiment setting are discussed, as well as the importance of the development of training tools to increase effectiveness in the use of probabilistic predictions to support decisions under uncertainty.
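The decision in the game can be framed as an expected-cost comparison, sketched below with hypothetical costs (not the values used in the experiment): open the gate only when the certain compensation is smaller than the expected fine.

```python
# Sketch of the expected-cost reasoning the game above embodies: with a
# probabilistic forecast, open the gate when the certain compensation to the
# farmers is less than the expected fine from flooding the town. The costs
# are hypothetical, not the values used in the EGU 2012 experiment.
compensation = 40_000        # cost of flooding the retention basin (certain)
fine = 200_000               # fine if the town floods (incurred only if a flood occurs)

def decide(prob_flood):
    expected_cost_open = compensation
    expected_cost_keep_closed = prob_flood * fine
    return "open gate" if expected_cost_open < expected_cost_keep_closed else "keep closed"

for p in (0.05, 0.2, 0.5):
    print(f"P(flood)={p:.2f}: {decide(p)}")
```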
FY15 Report on Thermomechanical Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Francis D.; Buchholz, Stuart
2015-08-01
Sandia is participating in the third phase of a United States (US)-German Joint Project that compares constitutive models and simulation procedures on the basis of model calculations of the thermomechanical behavior and healing of rock salt (Salzer et al. 2015). The first goal of the project is to evaluate the ability of numerical modeling tools to correctly describe the relevant deformation phenomena in rock salt under various influences. Among the numerical modeling tools required to address this are constitutive models that are used in computer simulations for the description of the thermal, mechanical, and hydraulic behavior of the host rock under various influences and for the long-term prediction of this behavior. Achieving this goal will lead to increased confidence in the results of numerical simulations related to the secure disposal of radioactive wastes in rock salt. Results of the Joint Project may ultimately be used to make various assertions regarding stability analysis of an underground repository in salt during the operating phase as well as long-term integrity of the geological barrier in the post-operating phase. A primary evaluation of constitutive model capabilities comes by way of predicting large-scale field tests. The Joint Project partners decided to model Waste Isolation Pilot Plant (WIPP) Rooms B & D, which are full-scale rooms having the same dimensions. Room D deformed under natural, ambient conditions while Room B was thermally driven by an array of waste-simulating heaters (Munson et al. 1988; 1990). Existing laboratory test data for WIPP salt were carefully scrutinized and the partners decided that additional testing would be needed to help evaluate advanced features of the constitutive models. The German partners performed over 140 laboratory tests on WIPP salt at no charge to the US Department of Energy (DOE).
Lakbub, Jude C; Su, Xiaomeng; Zhu, Zhikai; Patabandige, Milani W; Hua, David; Go, Eden P; Desaire, Heather
2017-08-04
The glycopeptide analysis field is tightly constrained by a lack of effective tools that translate mass spectrometry data into meaningful chemical information, and perhaps the most challenging aspect of building effective glycopeptide analysis software is designing an accurate scoring algorithm for MS/MS data. We provide the glycoproteomics community with two tools to address this challenge. The first tool, a curated set of 100 expert-assigned CID spectra of glycopeptides, contains a diverse set of spectra from a variety of glycan types; the second tool, Glycopeptide Decoy Generator, is a new software application that generates glycopeptide decoys de novo. We developed these tools so that emerging methods of assigning glycopeptides' CID spectra could be rigorously tested. Software developers or those interested in developing skills in expert (manual) analysis can use these tools to facilitate their work. We demonstrate the tools' utility in assessing the quality of one particular glycopeptide software package, GlycoPep Grader, which assigns glycopeptides to CID spectra. We first acquired the set of 100 expert assigned CID spectra; then, we used the Decoy Generator (described herein) to generate 20 decoys per target glycopeptide. The assigned spectra and decoys were used to test the accuracy of GlycoPep Grader's scoring algorithm; new strengths and weaknesses were identified in the algorithm using this approach. Both newly developed tools are freely available. The software can be downloaded at http://glycopro.chem.ku.edu/GPJ.jar.
Semi-autonomous remote sensing time series generation tool
NASA Astrophysics Data System (ADS)
Babu, Dinesh Kumar; Kaufmann, Christof; Schmidt, Marco; Dhams, Thorsten; Conrad, Christopher
2017-10-01
High spatial and temporal resolution data is vital for crop monitoring and phenology change detection. Due to the lack of satellite architecture and frequent cloud cover issues, availability of daily high spatial data is still far from reality. Remote sensing time series generation of high spatial and temporal data by data fusion seems to be a practical alternative. However, it is not an easy process, since it involves multiple steps and also requires multiple tools. In this paper, a framework of Geo Information System (GIS) based tool is presented for semi-autonomous time series generation. This tool will eliminate the difficulties by automating all the steps and enable the users to generate synthetic time series data with ease. Firstly, all the steps required for the time series generation process are identified and grouped into blocks based on their functionalities. Later, two main frameworks are created, one to perform all the pre-processing steps on various satellite data and the other one to perform data fusion to generate time series. The two frameworks can be used individually to perform specific tasks or combined to perform both processes in one go. This tool can handle most of the known geo data formats currently available, which makes it a generic tool for time series generation of various remote sensing satellite data. This tool is developed as a common platform with a good interface that provides many functions to enable further development of more remote sensing applications. A detailed description of the capabilities and advantages of the frameworks is given in this paper.
49 CFR 40.377 - Who decides whether to issue a PIE?
Code of Federal Regulations, 2010 CFR
2010-10-01
... WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS, Public Interest Exclusions. § 40.377 Who decides whether to issue a PIE? (a) The ODAPC Director, or his or her designee, decides whether to issue a PIE. If a designee...
Development of Anthropometric Analogous Headforms. Phase 1.
1994-10-31
shown in figure 5. This surface mesh can then be transformed into polygon faces that are able to be rendered by the AutoCAD rendering tools. Rendering of... computer-generated surfaces. The material removal techniques require the programming of the tool path of the cutter and in some cases requires specialized... tooling. Tool path programs are available to transfer the computer-generated surface into actual paths of the cutting tool. In cases where the
Power Plant Model Validation Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
The PPMV is used to validate generator models using disturbance recordings. The PPMV tool contains a collection of power plant models and model validation studies, as well as disturbance recordings from a number of historic grid events. The user can import data from a new disturbance into the database, which converts PMU and SCADA data into GE PSLF format, and then run the tool to validate (or invalidate) the model for a specific power plant against its actual performance. The PNNL PPMV tool enables the automation of the process of power plant model validation using disturbance recordings. The tool uses PMU and SCADA measurements as input information. The tool automatically adjusts all required EPCL scripts and interacts with GE PSLF in batch mode. The main tool features include: the tool interacts with GE PSLF; the tool uses the GE PSLF Play-In Function for generator model validation; a database of projects (model validation studies); a database of the historic events; a database of the power plants; advanced visualization capabilities; and automatic report generation.
Application of Multimedia Design Principles to Visuals Used in Course-Books: An Evaluation Tool
ERIC Educational Resources Information Center
Kuzu, Abdullah; Akbulut, Yavuz; Sahin, Mehmet Can
2007-01-01
This paper introduces an evaluation tool prepared to examine the quality of visuals in course-books. The tool is based on Mayer's Cognitive Theory of Multimedia Learning (i.e. Generative Theory) and its principles regarding the correct use of illustrations within text. The reason to generate the tool, the development process along with the…
NASA Astrophysics Data System (ADS)
Peng, Zhuoyin; Liu, Zhou; Chen, Jianlin; Liao, Lida; Chen, Jian; Li, Cong; Li, Wei
2018-06-01
With the development of the photovoltaic industry, the cost of photovoltaic power generation has become a significant issue, and the metallization process largely determines the raw material cost and the photovoltaic efficiency of the solar cells. Nowadays, a double-printing process has been introduced instead of the one-step printing process for the front contact of polycrystalline silicon solar cells, which can effectively improve their photovoltaic conversion efficiency. Here, relatively cheap Cu paste replaces the expensive Ag paste to form an Ag/Cu composite front contact for silicon solar cells. The photovoltaic performance and the cost of photovoltaic power generation have been investigated. With optimization of the structure and height of the Cu finger layer for the Ag/Cu composite double-printed front contact, the silicon solar cells exhibited a photovoltaic conversion efficiency of 18.41%, reducing the cost of photovoltaic power generation by 3.42 cents per watt.
A Study of Economical Incentives for Voltage Profile Control Method in Future Distribution Network
NASA Astrophysics Data System (ADS)
Tsuji, Takao; Sato, Noriyuki; Hashiguchi, Takuhei; Goda, Tadahiro; Tange, Seiji; Nomura, Toshio
In a future distribution network, it is difficult to maintain system voltage because a large number of distributed generators are introduced to the system. The authors have proposed a "voltage profile control method" using power factor control of distributed generators in previous work. However, an economic disbenefit is caused by the decrease in active power when the power factor is controlled in order to increase reactive power. Therefore, proper incentives must be given to the customers that cooperate with the voltage profile control method. Thus, in this paper, we develop new rules that decide the economic incentives given to these customers. The method is tested on a one-feeder distribution network model and its effectiveness is shown.
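The trade-off behind the economic disbenefit can be illustrated with the apparent-power relation P² + Q² = S²: lowering the power factor frees reactive power for voltage control but reduces active power output. The sketch below uses an illustrative 10 kVA unit.

```python
import numpy as np

# Illustration of the trade-off behind the "economic disbenefit" described
# above: for a distributed generator with apparent-power limit S, operating at
# a lower power factor provides reactive power for voltage control but reduces
# the active power (and hence energy revenue). All values are illustrative.
S = 10.0                                  # inverter apparent-power limit, kVA
for pf in (1.00, 0.95, 0.90, 0.85):
    P = S * pf                            # active power output, kW
    Q = S * np.sqrt(1.0 - pf ** 2)        # available reactive power, kvar
    lost = S * (1.0 - pf)                 # active power given up vs unity power factor
    print(f"pf={pf:.2f}: P={P:4.1f} kW, Q={Q:4.1f} kvar, lost P={lost:4.2f} kW")
```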
Affective modulation of external misattribution bias in source monitoring in schizophrenia.
Costafreda, S G; Brébion, G; Allen, P; McGuire, P K; Fu, C H Y
2008-06-01
Schizophrenic patients tend to attribute internal events to external agents, a bias that may be linked to positive symptoms. We investigated the effect of emotional valence on the cognitive bias. Male schizophrenic subjects (n=30) and an experimenter alternatively produced neutral and negative words. The subject then decided whether he or the experimenter had generated the item. External misattributions were more common than self-misattributions, and the bias was greater for patients with active hallucinations and delusions relative to patients in remission. Actively psychotic patients but not patients in remission were more likely to generate external misattributions with negative relative to neutral words. Affective modulation of the externalizing cognitive bias in source monitoring is evident in patients with hallucinations and delusions.
Quantum control on entangled bipartite qubits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delgado, Francisco
2010-04-15
Ising interactions between qubits can produce distortion on entangled pairs generated for engineering purposes (e.g., for quantum computation or quantum cryptography). The presence of parasitic magnetic fields destroys or alters the expected behavior for which the pair was intended. In addition, these pairs are generated with some dispersion in their original configuration, so their discrimination is necessary for applications. Nevertheless, discrimination should be made after Ising distortion. Quantum control helps with both problems: making some projective measurements upon the pair to decide the original state and replace it, or just trying to reconstruct it using procedures which do not alter its quantum nature. Results about the performance of these procedures are reported. First, we will work with pure systems, studying restrictions and advantages. Then, we will extend these operations to mixed states generated with uncertainty in the time of distortion, correcting them by assuming the control prescriptions for the most probable one.
The mission events graphic generator software: A small tool with big results
NASA Technical Reports Server (NTRS)
Lupisella, Mark; Leibee, Jack; Scaffidi, Charles
1993-01-01
Utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal computer based software tool that implements straight-forward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG) software which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.
Making Decisions About Participating in Elections
ERIC Educational Resources Information Center
Patrick, John J.
1976-01-01
Suggestions are offered for teaching about three kinds of decisions in electoral politics which all citizens face: deciding whether to participate, deciding how to participate, and deciding for whom to vote. (AV)
Zhang, Qi; Zeng, Xin; Younkin, Sam; Kawli, Trupti; Snyder, Michael P; Keleş, Sündüz
2016-02-24
Chromatin immunoprecipitation followed by sequencing (ChIP-seq) experiments revolutionized genome-wide profiling of transcription factors and histone modifications. Although maturing sequencing technologies allow these experiments to be carried out with short (36-50 bps), long (75-100 bps), single-end, or paired-end reads, the impact of these read parameters on the downstream data analysis are not well understood. In this paper, we evaluate the effects of different read parameters on genome sequence alignment, coverage of different classes of genomic features, peak identification, and allele-specific binding detection. We generated 101 bps paired-end ChIP-seq data for many transcription factors from human GM12878 and MCF7 cell lines. Systematic evaluations using in silico variations of these data as well as fully simulated data, revealed complex interplay between the sequencing parameters and analysis tools, and indicated clear advantages of paired-end designs in several aspects such as alignment accuracy, peak resolution, and most notably, allele-specific binding detection. Our work elucidates the effect of design on the downstream analysis and provides insights to investigators in deciding sequencing parameters in ChIP-seq experiments. We present the first systematic evaluation of the impact of ChIP-seq designs on allele-specific binding detection and highlights the power of pair-end designs in such studies.
[Practical guidelines for the diagnosis and treatment of obstructive sleep apnea syndrome].
Nogueira, Facundo; Nigro, Carlos; Cambursano, Hugo; Borsini, Eduardo; Silio, Julio; Avila, Jorge
2013-01-01
Obstructive sleep apnoea syndrome (OSAS) is one of the most relevant chronic respiratory pathologies due to its high prevalence and impact in morbidity and mortality. In 2001, the Asociación Argentina de Medicina Respiratoria (AAMR) published the first Argentinean Consensus on Sleep-Related breathing Disorders. Since then, wide new scientific evidence has emerged, increasing significantly the knowledge about this pathology. According to this, the Sleep-Related breathing Disorders and Oxygen Therapy Section of the AAMR, decided to update its Consensus, developing this Practical Guidelines on Management of patients with OSAS. A working group was created with members belonging to the section, experts in OSAS. They extensively reviewed the literature and wrote these guidelines, orientated to practical resolution of clinical problems and giving answers to questions emerged from dealing with patients who suffer from this syndrome. The document defines OSAS and describes the diagnosis and severity criteria, as well as the risk factors, ways of presentation and epidemiology. Clinical consequences, mainly on cognition, cardiovascular system and metabolism are pointed out. Different diagnostic methods, with their indications and technical aspects for validation and interpretation are detailed. Finally, we describe therapeutic alternatives, as well as practical aspects of their implementation. The authors' aim was to generate an accessible tool for teaching and spreading the knowledge on these disorders, which have a great impact in public health.
Guidelines for the analysis of free energy calculations.
Klimovich, Pavel V; Shirts, Michael R; Mobley, David L
2015-05-01
Free energy calculations based on molecular dynamics simulations show considerable promise for applications ranging from drug discovery to prediction of physical properties and structure-function studies. But these calculations are still difficult and tedious to analyze, and best practices for analysis are not well defined or propagated. Essentially, each group analyzing these calculations needs to decide how to conduct the analysis and, usually, develop its own analysis tools. Here, we review and recommend best practices for analysis yielding reliable free energies from molecular simulations. Additionally, we provide a Python tool, alchemical-analysis.py, freely available on GitHub as part of the pymbar package (located at http://github.com/choderalab/pymbar), that implements the analysis practices reviewed here for several reference simulation packages, which can be adapted to handle data from other packages. Both this review and the tool cover analysis of alchemical calculations generally, including free energy estimates via both thermodynamic integration and free energy perturbation-based estimators. Our Python tool also handles output from multiple types of free energy calculations, including expanded ensemble and Hamiltonian replica exchange, as well as standard fixed ensemble calculations. We also survey a range of statistical and graphical ways of assessing the quality of the data and free energy estimates, and provide prototypes of these in our tool. We hope this tool and discussion will serve as a foundation for more standardization of and agreement on best practices for analysis of free energy calculations.
Iterating between Tools to Create and Edit Visualizations.
Bigelow, Alex; Drucker, Steven; Fisher, Danyel; Meyer, Miriah
2017-01-01
A common workflow for visualization designers begins with a generative tool, like D3 or Processing, to create the initial visualization, and proceeds to a drawing tool, like Adobe Illustrator or Inkscape, for editing and cleaning. Unfortunately, this is typically a one-way process: once a visualization is exported from the generative tool into a drawing tool, it is difficult to make further, data-driven changes. In this paper, we propose a bridge model to allow designers to bring their work back from the drawing tool to re-edit in the generative tool. Our key insight is to recast this iteration challenge as a merge problem - similar to when two people are editing a document and changes between them need to be reconciled. We also present a specific instantiation of this model, a tool called Hanpuku, which bridges between D3 scripts and Illustrator. We show several examples of visualizations that are iteratively created using Hanpuku in order to illustrate the flexibility of the approach. We further describe several hypothetical tools that bridge between other visualization tools to emphasize the generality of the model.
Static Structural and Modal Analysis of Gas Turbine Blade
NASA Astrophysics Data System (ADS)
Ranjan Kumar, Ravi; Pandey, K. M., Prof.
2017-08-01
Gas turbine is one of the most versatile items of turbomachinery nowadays. It is used in different modes such as power generation, oil and gas, process plants, aviation, domestic and related small industries. This paper addresses the problems of blade profile selection, material selection and turbine rotor blade vibration that seriously impact the induced stress-deformation and structural functioning of a developmental gas turbine engine. In this paper, the blade profile and material for generating a specific power by rotating the blade at a specific RPM have been decided by static structural analysis. The gas turbine rotating blade RPM is decided by modal analysis so that the natural frequency of the blade does not match the excitation frequency. For this, the blade profile has been modeled in SOLIDWORKS and the analysis has been done in ANSYS WORKBENCH 14. The existing NACA6409 profile has been selected as the base model and then modified by bending it through 72.5° and 145°. These three different blade profiles have been analyzed for three different materials, viz. Super Alloy X, Nimonic 80A and Inconel 625, at three different speeds, viz. 20000, 40000 and 60000 RPM. It is found that NACA6409 with the 72.5° bend gives the best result for all materials at all speeds. Among all the materials, Inconel 625 gives the best result. Hence a blade of Inconel 625 with the 72.5° bend profile is the best combination for all RPM.
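The RPM-selection step described above amounts to keeping the blade's natural frequencies away from the per-revolution excitation frequencies and their harmonics. The sketch below illustrates that separation check in Python; the natural frequencies, engine orders, and 10% margin are assumptions for illustration, not values from the paper.

```python
# Resonance-avoidance sketch: flag operating speeds whose engine-order excitation
# frequencies fall too close to the blade's natural frequencies. All numbers are
# hypothetical; the paper's modal results are not reproduced here.

natural_freqs_hz = [850.0, 2100.0, 4300.0]     # hypothetical blade modes from FE modal analysis
speeds_rpm = [20000, 40000, 60000]             # operating speeds considered in the paper
engine_orders = range(1, 6)                    # 1x, 2x, ... per-revolution excitation
margin = 0.10                                  # assumed 10% separation criterion

for rpm in speeds_rpm:
    rev_hz = rpm / 60.0
    for order in engine_orders:
        excitation = order * rev_hz
        for mode, fn in enumerate(natural_freqs_hz, start=1):
            if abs(excitation - fn) / fn < margin:
                print(f"{rpm} RPM: order {order} ({excitation:.0f} Hz) "
                      f"is within {margin:.0%} of mode {mode} ({fn:.0f} Hz)")
```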
Architecture of Explanatory Inference in the Human Prefrontal Cortex
Barbey, Aron K.; Patterson, Richard
2011-01-01
Causal reasoning is a ubiquitous feature of human cognition. We continuously seek to understand, at least implicitly and often explicitly, the causal scenarios in which we live, so that we may anticipate what will come next, plan a potential response and envision its outcome, decide among possible courses of action in light of their probable outcomes, make midstream adjustments in our goal-related activities as our situation changes, and so on. A considerable body of research shows that the lateral prefrontal cortex (PFC) is crucial for causal reasoning, but also that there are significant differences in the manner in which ventrolateral PFC, dorsolateral PFC, and anterolateral PFC support causal reasoning. We propose, on the basis of research on the evolution, architecture, and functional organization of the lateral PFC, a general framework for understanding its roles in the many and varied sorts of causal reasoning carried out by human beings. Specifically, the ventrolateral PFC supports the generation of basic causal explanations and inferences; dorsolateral PFC supports the evaluation of these scenarios in light of some given normative standard (e.g., of plausibility or correctness in light of real or imagined causal interventions); and anterolateral PFC supports explanation and inference at an even higher level of complexity, coordinating the processes of generation and evaluation with further cognitive processes, and especially with computations of hedonic value and emotional implications of possible behavioral scenarios – considerations that are often critical both for understanding situations causally and for deciding about our own courses of action. PMID:21845182
Sonification for geoscience: Exploring faults from the inside
NASA Astrophysics Data System (ADS)
Mair, K.; Barrett, N.
2013-12-01
Sonification is the use of non-speech audio to convey information or display data. It can involve techniques such as the direct transposition of a signal into the audible range, or the mapping of specific sounds to a signal or event. Since our ears are very sensitive to unusual sonic events or sound patterns, sonification provides a potentially powerful way to perceptualize (or hear) our data. The addition of 3D audio to the sonification tool palette allows us to connect more accurately to spatial data in a novel, engaging and highly accessible approach. Here we investigate the use of sonification for geoscience by sonifying the data generated in computer models of earthquake processes. We present results from our recent 3D DEM (discrete element method) models where granular debris is sheared between rough walls to simulate an evolving fault. To best appreciate the inherently 3D nature of the crushing and sliding events (continuously tracked in our models) that occur as faults slip, we use Ambisonics (a sound field recreation technology). This allows the position of individual events to be preserved, generating a virtual 3D soundscape so we can explore faults - from the inside! During sonification, events such as grain-scale fracturing, grain motions and interactions are mapped to specific sounds whose pitch, timbre, and volume reflect properties such as the depth, character, and size of the individual events. Active listeners can explore the 3D soundscape and listen to evolving processes in the data by physically moving through the space or navigating via a 3D mouse controller. The soundscape can be heard either through an array of 16 speakers or using a pair of headphones. Emergent phenomena in the models generate sound patterns that are easily spotted by experts and youngsters alike. Also, because our ears are excellent signal-to-noise filters, events are easily recognizable above the background noise. From a learning perspective, the exploration process required to interact with the data is attractive: it is curiosity driven and requires active listening to decide where to go next, and each listener has a unique experience that they choose and control. In addition, the fact that this approach uses a different sense (and part of the brain) means that the listener's appreciation will depend more on their audio awareness than on prior education or aptitude. It is also an accessible tool for the visually impaired. From a scientist's perspective, depending on the datasets in question, sonification may become a viable alternative or a complementary tool used alongside standard visualisation techniques. We anticipate significant potential for the use of sonification in the presentation, interpretation and communication of geoscience datasets in the classroom, conference hall, boardroom and public art gallery. Visitors to this poster will be encouraged to interact with and explore our 3D fault soundscape using a laptop and headphone interface.
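As a rough illustration of the parameter mapping described above (event properties driving pitch, timbre, and volume), the sketch below maps hypothetical event size and depth to a frequency and gain. The field names, ranges, and mapping functions are assumptions; the actual tool additionally spatializes each event in a 3D Ambisonic sound field.

```python
import numpy as np

# Parameter-mapping sketch: map simulated fault-zone events (hypothetical fields:
# size, depth) to a pitch and a loudness. The 3D spatialization used by the real
# tool is omitted here.

rng = np.random.default_rng(0)
events = [{"size": s, "depth": d} for s, d in zip(rng.lognormal(0, 1, 5),
                                                  rng.uniform(0, 5, 5))]

def map_event(size, depth,
              f_lo=200.0, f_hi=2000.0, max_size=20.0, max_depth=5.0):
    """Larger events -> lower pitch and louder; deeper events -> quieter."""
    pitch_hz = f_hi - (f_hi - f_lo) * min(size / max_size, 1.0)
    loudness = min(size / max_size, 1.0) * (1.0 - 0.5 * depth / max_depth)
    return pitch_hz, loudness

for ev in events:
    f, a = map_event(ev["size"], ev["depth"])
    print(f"size={ev['size']:.2f} depth={ev['depth']:.2f} -> {f:.0f} Hz, gain {a:.2f}")
```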
Code of Federal Regulations, 2010 CFR
2010-07-01
... MARGINAL PROPERTIES Accounting and Auditing Relief § 204.208 May a State decide that it will or will not... not grant that type of marginal property relief. (b) To help States decide whether to allow one or... Report of Marginal Properties by October 1 preceding the calendar year. (c) If a State decides under...
Spontaneous giving and calculated greed.
Rand, David G; Greene, Joshua D; Nowak, Martin A
2012-09-20
Cooperation is central to human social behaviour. However, choosing to cooperate requires individuals to incur a personal cost to benefit others. Here we explore the cognitive basis of cooperative decision-making in humans using a dual-process framework. We ask whether people are predisposed towards selfishness, behaving cooperatively only through active self-control; or whether they are intuitively cooperative, with reflection and prospective reasoning favouring 'rational' self-interest. To investigate this issue, we perform ten studies using economic games. We find that across a range of experimental designs, subjects who reach their decisions more quickly are more cooperative. Furthermore, forcing subjects to decide quickly increases contributions, whereas instructing them to reflect and forcing them to decide slowly decreases contributions. Finally, an induction that primes subjects to trust their intuitions increases contributions compared with an induction that promotes greater reflection. To explain these results, we propose that cooperation is intuitive because cooperative heuristics are developed in daily life where cooperation is typically advantageous. We then validate predictions generated by this proposed mechanism. Our results provide convergent evidence that intuition supports cooperation in social dilemmas, and that reflection can undermine these cooperative impulses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1985-01-06
A fatal accident circumstance and epidemiology report on an incident occurring in a confined space and involving two fatalities is presented. Two employees of a petroleum company were determining whether an empty 10,000-gallon toluene tank needed cleaning. Due to limited visibility, one worker decided to enter the tank. As he descended through a 16 inch opening in the top of the tank, he apparently fell into the tank. The other worker called the city fire department. The responding unit decided to use a K 12 saw to cut an opening in the side of the tank. Although water sprays were used to minimize spark generation, an explosion occurred and a fireman was killed by the concussion. Preliminary medical information indicates that the worker inside the tank was dead prior to the explosion. Recommendations include city fire departments establishing a registry of confined spaces and toxic or explosive substances in the area in which they serve and conducting research to determine the best methods to gain entry into enclosed spaces containing inflammable or explosive atmospheres.
NASA Technical Reports Server (NTRS)
Goldin, Daniel S.
1992-01-01
Given here is the address of NASA Administrator Daniel S. Goldin to the Association of Space Explorers. Mr. Goldin's remarks are on the topic of why we should go to Mars, a subject he approaches by first answering the question, What would it mean if we decided today not to go to Mars? After a discussion of the meaning of Columbus' voyage to America, he answers the question by saying that if we decide not to go to Mars, our generation will truly achieve a first in human history - we will be the first to stop at a frontier. After noting that the need to explore is intrinsic to life itself, Mr. Goldin presents several reasons why we should go to the Moon and go to Mars. One reason is economic, another is to increase our scientific knowledge, and yet another is to further the political evolution of humankind through the international cooperation required for building settlements on the Moon and Mars. He concludes by expanding upon the idea that this nation has never been one to shrink from a challenge.
Risk Acceptance Personality Paradigm: How We View What We Don't Know We Don't Know
NASA Technical Reports Server (NTRS)
Massie, Michael J.; Morris, A. Terry
2011-01-01
The purpose of integrated hazard analyses, probabilistic risk assessments, failure modes and effects analyses, fault trees and many other similar tools is to give managers of a program some idea of the risks associated with their program. All risk tools establish a set of undesired events and then try to evaluate the risk to the program by assessing the severity of the undesired event and the likelihood of that event occurring. Some tools provide qualitative results, some provide quantitative results and some do both. However, in the end the program manager and his/her team must decide which risks are acceptable and which are not. Even with a wide array of analysis tools available, risk acceptance is often a controversial and difficult decision making process. And yet, today's space exploration programs are moving toward more risk based design approaches. Thus, risk identification and good risk assessment is becoming even more vital to the engineering development process. This paper explores how known and unknown information influences risk-based decisions by looking at how the various parts of our personalities are affected by what they know and what they don't know. This paper then offers some criteria for consideration when making risk-based decisions.
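As a minimal illustration of the severity-and-likelihood assessment these risk tools share, the sketch below scores a few hypothetical undesired events and flags those above an assumed acceptance threshold. The scales, events, and threshold are illustrative only, not an agency standard.

```python
# Severity x likelihood scoring sketch for a set of undesired events. The 1-5
# scales, the events themselves, and the acceptance threshold are assumptions,
# not values taken from any specific program's risk process.

undesired_events = [
    {"name": "loss of telemetry", "severity": 3, "likelihood": 2},
    {"name": "engine shutdown",   "severity": 5, "likelihood": 1},
    {"name": "ground delay",      "severity": 1, "likelihood": 4},
]

ACCEPT_THRESHOLD = 8  # risks scoring above this need an explicit acceptance rationale

for ev in undesired_events:
    score = ev["severity"] * ev["likelihood"]
    decision = "review/mitigate" if score > ACCEPT_THRESHOLD else "candidate for acceptance"
    print(f"{ev['name']:20s} severity={ev['severity']} likelihood={ev['likelihood']} "
          f"score={score:2d} -> {decision}")
```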
A software tool for determination of breast cancer treatment methods using data mining approach.
Cakır, Abdülkadir; Demirel, Burçin
2011-12-01
In this work, breast cancer treatment methods are determined using data mining. For this purpose, software was developed to help oncology doctors by suggesting treatment methods for breast cancer patients. Data from 462 breast cancer patients, obtained from Ankara Oncology Hospital, were used to determine treatment methods for new patients. This dataset was processed with the Weka data mining tool. Classification algorithms were applied one by one to this dataset and the results were compared to find the proper treatment method. The developed software program, called "Treatment Assistant", applies different algorithms (IB1, Multilayer Perceptron and Decision Table) to find out which one gives the better result for each attribute to be predicted, using a Java NetBeans interface. Treatment methods are determined for the post-surgical treatment of breast cancer patients using this developed software tool. At the modeling step of the data mining process, different Weka algorithms are used for the output attributes. For the hormonotherapy output, IB1; for the tamoxifen and radiotherapy outputs, Multilayer Perceptron; and for the chemotherapy output, the Decision Table algorithm shows the best accuracy performance compared with the others. In conclusion, this work shows that the data mining approach can be a useful tool for medical applications, particularly at the treatment decision step. Data mining helps the doctor to decide in a short time.
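The per-output classifier comparison described above was done in Weka (IB1, Multilayer Perceptron, Decision Table). The sketch below reproduces the shape of that workflow with scikit-learn stand-ins on random placeholder data; it is not the Treatment Assistant code, and Weka's Decision Table has no direct scikit-learn equivalent, so it is omitted.

```python
# Sketch of a per-output classifier comparison using scikit-learn stand-ins:
# IB1 ~ 1-nearest-neighbour, Multilayer Perceptron ~ MLPClassifier. The data are
# random placeholders, not the Ankara Oncology Hospital dataset.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
X = rng.normal(size=(462, 10))                    # 462 patients, 10 placeholder attributes
outputs = {name: rng.integers(0, 2, size=462)     # one binary label per treatment output
           for name in ["hormonotherapy", "tamoxifen", "radiotherapy", "chemotherapy"]}

candidates = {
    "1-NN (IB1 analogue)": KNeighborsClassifier(n_neighbors=1),
    "MLP": MLPClassifier(max_iter=500, random_state=0),
}

for output_name, y in outputs.items():
    for clf_name, clf in candidates.items():
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(f"{output_name:15s} {clf_name:20s} accuracy={acc:.2f}")
```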
Managing expectations when publishing tools and methods for computational proteomics.
Martens, Lennart; Kohlbacher, Oliver; Weintraub, Susan T
2015-05-01
Computational tools are pivotal in proteomics because they are crucial for identification, quantification, and statistical assessment of data. The gateway to finding the best choice of a tool or approach for a particular problem is frequently journal articles, yet there is often an overwhelming variety of options that makes it hard to decide on the best solution. This is particularly difficult for nonexperts in bioinformatics. The maturity, reliability, and performance of tools can vary widely because publications may appear at different stages of development. A novel idea might merit early publication despite only offering proof-of-principle, while it may take years before a tool can be considered mature, and by that time it might be difficult for a new publication to be accepted because of a perceived lack of novelty. After discussions with members of the computational mass spectrometry community, we describe here proposed recommendations for organization of informatics manuscripts as a way to set the expectations of readers (and reviewers) through three different manuscript types that are based on existing journal designations. Brief Communications are short reports describing novel computational approaches where the implementation is not necessarily production-ready. Research Articles present both a novel idea and mature implementation that has been suitably benchmarked. Application Notes focus on a mature and tested tool or concept and need not be novel but should offer advancement from improved quality, ease of use, and/or implementation. Organizing computational proteomics contributions into these three manuscript types will facilitate the review process and will also enable readers to identify the maturity and applicability of the tool for their own workflows.
pyLIMA : an open source microlensing software
NASA Astrophysics Data System (ADS)
Bachelet, Etienne
2017-01-01
Planetary microlensing is a unique tool for detecting cold planets around low-mass stars, and it is approaching a watershed in discoveries as near-future missions incorporate dedicated surveys. NASA and ESA have decided to complement WFIRST-AFTA and Euclid with microlensing programs to enrich our statistics about this planetary population. Of the many challenges inherent in these missions, the data analysis is of primary importance, yet it is often perceived as a time-consuming, complex and daunting barrier to participation in the field. We present the first open source modeling software for conducting a microlensing analysis. This software is written in Python and uses existing packages as much as possible.
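As a minimal sketch of the kind of model such software fits, the code below evaluates the standard point-source point-lens (Paczyński) magnification curve; the event parameters are hypothetical and pyLIMA's actual API is not shown.

```python
import numpy as np

# Point-source point-lens (Paczynski) light curve sketch: the single-lens
# magnification A(u) with u(t) = sqrt(u0^2 + ((t - t0)/tE)^2). The event
# parameters t0, u0, tE below are hypothetical.

def pspl_magnification(t, t0, u0, tE):
    u = np.sqrt(u0 ** 2 + ((t - t0) / tE) ** 2)
    return (u ** 2 + 2) / (u * np.sqrt(u ** 2 + 4))

t = np.linspace(-50, 50, 201)          # days relative to an arbitrary epoch
A = pspl_magnification(t, t0=0.0, u0=0.1, tE=20.0)
print(f"peak magnification ~ {A.max():.1f}")
```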
Berger-Fiffy, Jill
2012-01-01
Harvard Vanguard Medical Associates (Harvard Vanguard) decided to develop a Shared Medical Appointment (SMA) program in 2007 for a variety of reasons. The program has launched 86 SMAs in 17 specialties at 12 sites and has exceeded 13 000 patient visits. Currently, the practice offers 54 SMAs and is believed to be the largest program in the country. This article provides an overview regarding staffing, space and equipment, project planning, promotional materials, training programs, workflow development, and the use of quality improvement (ie, LEAN) tools used to monitor the work to be completed and the metrics to date.
Pharmaceutical structure montages as catalysts for design and discovery.
Njarðarson, Jon T
2012-05-01
The majority of pharmaceuticals are small-molecule organic compounds. Their structures are most effectively described and communicated using the graphical language of organic chemistry. A few years ago, we decided to harness this powerful language to create new educational tools that could serve well for data mining and as catalysts for discovery. The results were the Top 200 drug posters, which we have posted online for everyone to enjoy and which we update yearly. This article details the origin of and motivation for our design and highlights the value of this graphical format by presenting and analyzing a new pharmaceutical structure montage (poster) focused on US FDA approved drugs in 2011.
Betan, E J; Stanton, A L
1999-06-01
Ethical dilemmas are inherently challenging. By definition, clinicians decide between conflicting principles of welfare and naturally confront competing pulls and inclinations. This investigation of students' responses to an ethical scenario highlights how emotions and concerns can interfere with willingness to implement ethical knowledge. Clear-cut rules are the exception in psychotherapy, and clinicians must judge ethical issues on the basis of the unique context of each case. As such, subjectivity and emotional involvement are essential tools for determining ethical action, but they must be integrated with rational analysis. Strategies for attending to influential emotions and contextual factors in order to mobilize ethical commitment are described.
Computational Tools and Facilities for the Next-Generation Analysis and Design Environment
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)
1997-01-01
This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.
Infrastructure for the life sciences: design and implementation of the UniProt website.
Jain, Eric; Bairoch, Amos; Duvaud, Severine; Phan, Isabelle; Redaschi, Nicole; Suzek, Baris E; Martin, Maria J; McGarvey, Peter; Gasteiger, Elisabeth
2009-05-08
The UniProt consortium was formed in 2002 by groups from the Swiss Institute of Bioinformatics (SIB), the European Bioinformatics Institute (EBI) and the Protein Information Resource (PIR) at Georgetown University, and soon afterwards the website http://www.uniprot.org was set up as a central entry point to UniProt resources. Requests to this address were redirected to one of the three organisations' websites. While these sites shared a set of static pages with general information about UniProt, their pages for searching and viewing data were different. To provide users with a consistent view and to cut the cost of maintaining three separate sites, the consortium decided to develop a common website for UniProt. Following several years of intense development and a year of public beta testing, the http://www.uniprot.org domain was switched to the newly developed site described in this paper in July 2008. The UniProt consortium is the main provider of protein sequence and annotation data for much of the life sciences community. The http://www.uniprot.org website is the primary access point to this data and to documentation and basic tools for the data. These tools include full text and field-based text search, similarity search, multiple sequence alignment, batch retrieval and database identifier mapping. This paper discusses the design and implementation of the new website, which was released in July 2008, and shows how it improves data access for users with different levels of experience, as well as for machines through programmatic access. http://www.uniprot.org/ is open for both academic and commercial use. The site was built with open source tools and libraries. Feedback is very welcome and should be sent to help@uniprot.org. The new UniProt website makes accessing and understanding UniProt easier than ever. The two main lessons learned are that getting the basics right for such a data provider website has huge benefits, but is not trivial and easy to underestimate, and that there is no substitute for using empirical data throughout the development process to decide on what is and what is not working for your users.
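As a minimal sketch of the programmatic access mentioned above, the snippet below retrieves one entry in FASTA format over HTTP using the accession-plus-format URL style the paper's site exposed; current deployments may redirect such requests to the newer REST service, and the accession used is just an example.

```python
# Programmatic-access sketch: fetch one UniProtKB entry as FASTA by appending a
# format suffix to the entry URL, as the site described in the paper allowed.
# Current servers may redirect to rest.uniprot.org; P12345 is just an example.
import requests

accession = "P12345"
url = f"https://www.uniprot.org/uniprot/{accession}.fasta"

response = requests.get(url, timeout=30)
response.raise_for_status()
print(response.text.splitlines()[0])   # FASTA header line for the entry
```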
SU-E-T-278: Realization of Dose Verification Tool for IMRT Plan Based On DPM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Jinfeng; Cao, Ruifen; Dai, Yumei
Purpose: To build a Monte Carlo dose verification tool for IMRT plans by implementing an irradiation source model into the DPM code, and to extend the ability of DPM to calculate any incident angle and irregular, inhomogeneous fields. Methods: With the virtual source and the energy spectrum unfolded from the accelerator measurement data, combined with optimized intensity maps, the dose distribution of the irradiated irregular, inhomogeneous field was calculated. The irradiation source model of the accelerator was substituted by a grid-based surface source. The contour and the intensity distribution of the surface source were optimized by the ARTS (Accurate/Advanced Radiotherapy System) optimization module based on the tumor configuration. The weight of each emitter was decided by the grid intensity. The direction of each emitter was decided by the combination of the virtual source and the emitter's position. The photon energy spectrum unfolded from the accelerator measurement data was adjusted by compensating for the contaminant electron source. For verification, measured data and a realistic clinical IMRT plan were compared with the DPM dose calculation. Results: The regular field was verified by comparison with the measured data. The differences were acceptable (<2% inside the field, 2-3 mm in the penumbra). The dose calculation of the irregular field by DPM simulation was also compared with that of FSPB (Finite Size Pencil Beam), and the passing rate of the gamma analysis was 95.1% for a peripheral lung cancer case. The regular field and the irregular rotational field were all within the permitted error range. The computing time for regular fields was less than 2 h, and the peripheral lung cancer test took 160 min. Through parallel processing, the adapted DPM could complete the calculation of an IMRT plan within half an hour. Conclusion: The adapted, parallelized DPM code with the irradiation source model is faster than classic Monte Carlo codes. Its computational accuracy and speed satisfy clinical requirements, and it is expected to become a Monte Carlo dose verification tool for IMRT plans. Strategic Priority Research Program of the Chinese Academy of Sciences (XDA03040000); National Natural Science Foundation of China (81101132)
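As a rough sketch of one element of the source model described in the abstract, the code below samples emitter positions on a grid-based surface source with probabilities proportional to an intensity map and aims each particle from a virtual point source through the sampled emitter. The geometry, distances, and intensity map are placeholders, not the ARTS-optimized values.

```python
import numpy as np

# Surface-source sampling sketch: emitter grid points are drawn with probability
# proportional to an intensity map (grid intensity sets emitter weight), and each
# particle direction runs from a virtual point source through the sampled emitter.
# Geometry, distances and the intensity map are placeholders.

rng = np.random.default_rng(1)
nx, ny = 20, 20
intensity = rng.random((nx, ny))                  # placeholder optimized intensity map
prob = (intensity / intensity.sum()).ravel()

virtual_source = np.array([0.0, 0.0, 100.0])      # assumed source position, cm
xs = np.linspace(-5, 5, nx)                       # emitter grid coordinates at z = 50 cm
ys = np.linspace(-5, 5, ny)

def sample_particles(n):
    idx = rng.choice(prob.size, size=n, p=prob)
    ix, iy = np.unravel_index(idx, (nx, ny))
    emitters = np.column_stack([xs[ix], ys[iy], np.full(n, 50.0)])
    directions = emitters - virtual_source
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    return emitters, directions

positions, directions = sample_particles(5)
print(directions)
```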
NetList(+): A simple interface language for chip design
NASA Astrophysics Data System (ADS)
Wuu, Tzyh-Yung
1991-04-01
NetList(+) is a design specification language developed at MOSIS for rapid-turnaround cell-based ASIC prototyping. By using NetList(+), a uniform representation is achieved for the specification, simulation, and physical description of a design. The goal is to establish an interfacing methodology between design specification and independent computer aided design tools. Designers need only specify a system by writing a corresponding netlist. This netlist is used for both functional simulation and timing simulation. The same netlist also drives the low-level physical tools that generate the layout. Another goal of using NetList(+) is to generate parts of a design by running it through different kinds of placement and routing (P and R) tools. For example, some parts of a design may be generated by standard cell P and R tools. Other parts may be generated by a layout tiler, i.e., a datapath compiler, RAM/ROM generator, or PLA generator. Finally, all the different parts of a design can be integrated into a single chip by general block P and R tools. The NetList(+) language can thus act as an interface among tools. Section 2 shows a flowchart to illustrate the NetList(+) system and its relation to other design tools. Section 3 shows how to write a NetList(+) description from the block diagram of a circuit. Section 4 discusses how to prepare a cell library or several cell libraries for a design system. Section 5 gives a few designs written in NetList(+) and shows their simulation and layout results.
Video Games as a Training Tool to Prepare the Next Generation of Cyber Warriors
2014-10-01
[Report documentation page and table-of-contents fragments; recoverable section headings include: The Cybersecurity Workforce Shortage; Greater Cybersecurity Education; How Video Games Can Be Effective Learning Tools.]
The Exercise: An Exercise Generator Tool for the SOURCe Project
ERIC Educational Resources Information Center
Kakoyianni-Doa, Fryni; Tziafa, Eleni; Naskos, Athanasios
2016-01-01
The Exercise, an Exercise generator in the SOURCe project, is a tool that complements the properties and functionalities of the SOURCe project, which includes the search engine for the Searchable Online French-Greek parallel corpus for the UniveRsity of Cyprus (SOURCe) (Kakoyianni-Doa & Tziafa, 2013), the PENCIL (an alignment tool)…
Use of the MATRIXx Integrated Toolkit on the Microwave Anisotropy Probe Attitude Control System
NASA Technical Reports Server (NTRS)
Ward, David K.; Andrews, Stephen F.; McComas, David C.; ODonnell, James R., Jr.
1999-01-01
Recent advances in analytical software tools allow the analysis, simulation, flight code, and documentation of an algorithm to be generated from a single source, all within one integrated analytical design package. NASA's Microwave Anisotropy Probe project has used one such package, Integrated Systems' MATRIXx suite, in the design of the spacecraft's Attitude Control System. The project's experience with the linear analysis, simulation, code generation, and documentation tools will be presented and compared with more traditional development tools. In particular, the quality of the flight software generated will be examined in detail. Finally, lessons learned on each of the tools will be shared.
TIDE TOOL: Open-Source Sea-Level Monitoring Software for Tsunami Warning Systems
NASA Astrophysics Data System (ADS)
Weinstein, S. A.; Kong, L. S.; Becker, N. C.; Wang, D.
2012-12-01
A tsunami warning center (TWC) typically decides to issue a tsunami warning bulletin when initial estimates of earthquake source parameters suggest it may be capable of generating a tsunami. A TWC, however, relies on sea-level data to provide prima facie evidence for the existence or non-existence of destructive tsunami waves and to constrain tsunami wave height forecast models. In the aftermath of the 2004 Sumatra disaster, the International Tsunami Information Center asked the Pacific Tsunami Warning Center (PTWC) to develop a platform-independent, easy-to-use software package to give nascent TWCs the ability to process WMO Global Telecommunications System (GTS) sea-level messages and to analyze the resulting sea-level curves (marigrams). In response, PTWC developed TIDE TOOL, which has since steadily grown in sophistication to become PTWC's operational sea-level processing system. TIDE TOOL has two main parts: a decoder that reads GTS sea-level message logs, and a graphical user interface (GUI) written in the open-source, platform-independent graphical toolkit scripting language Tcl/Tk. This GUI consists of dynamic map-based clients that allow the user to select and analyze a single station or groups of stations by displaying their marigrams in strip-chart or screen-tiled forms. TIDE TOOL also includes detail maps of each station to show each station's geographical context and reverse tsunami travel time contours to each station. TIDE TOOL can also be coupled to the GEOWARE™ TTT program to plot tsunami travel times and to indicate the expected tsunami arrival time on the marigrams. Because sea-level messages are structured in a rich variety of formats, TIDE TOOL includes a metadata file, COMP_META, that contains all of the information needed by TIDE TOOL to decode sea-level data as well as basic information such as the geographical coordinates of each station. TIDE TOOL can therefore continuously decode these sea-level messages in real time and display the time-series data in the GUI as well. This GUI also includes mouse-clickable functions such as zooming or expanding the time-series display, measuring tsunami signal characteristics (arrival time, wave period and amplitude, etc.), and removing the tide signal from the time-series data. De-tiding of the time series is necessary to obtain accurate measurements of tsunami wave parameters and to maintain accurate historical tsunami databases. With TIDE TOOL, de-tiding is accomplished with a set of tide harmonic coefficients routinely computed and updated at PTWC for many of the stations in PTWC's inventory (~570). PTWC also uses the decoded time series files (previous 3-5 days' worth) to compute on-the-fly tide coefficients. The latter is useful in cases where the station is new and a long-term stable set of tide coefficients is not available or cannot be easily obtained due to various non-astronomical effects. The international tsunami warning system is coordinated globally by the UNESCO IOC, and a number of countries in the Pacific and Indian Oceans and the Caribbean depend on TIDE TOOL to monitor tsunamis in real time.
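The de-tiding step described above can be pictured as subtracting a predicted tide reconstructed from harmonic constituents. The sketch below does this for a synthetic series; the constituent amplitudes, periods, and phases are placeholders rather than PTWC's station coefficients.

```python
import numpy as np

# De-tiding sketch: reconstruct the predicted tide as a sum of harmonic
# constituents and subtract it from the observed series so that tsunami wave
# parameters can be measured. Constituent values below are placeholders.

t = np.arange(0, 3 * 24 * 3600, 60.0)              # 3 days of 1-minute samples (seconds)

constituents = [                                    # (amplitude m, period h, phase rad) - assumed
    (0.50, 12.42, 0.3),                             # M2-like
    (0.20, 12.00, 1.1),                             # S2-like
    (0.15, 23.93, 2.0),                             # K1-like
]

def predicted_tide(t_sec):
    tide = np.zeros_like(t_sec)
    for amp, period_h, phase in constituents:
        omega = 2 * np.pi / (period_h * 3600.0)
        tide += amp * np.cos(omega * t_sec + phase)
    return tide

observed = predicted_tide(t) + 0.02 * np.random.default_rng(0).normal(size=t.size)
residual = observed - predicted_tide(t)             # de-tided series: signal + noise
print(f"residual std = {residual.std():.3f} m")
```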
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1965-04-30
The manual serves as a guide to the important factors to consider in establishing a small-scale community electric system. Financial requirements include labor costs, machinery, equipment, utilities and administrative costs, raw materials (for diesel fuel to run the generators). Tables on cost estimates are given, with a blank column for actual cost statements; the summary provides questions that will help the planner decide what is necessary for setting up the plant and whether the requirements can be met.
CE-SAM: A Conversational Interface for ISR Mission Support
2013-06-01
"soft data", that is, data which is not produced directly by ISR "hard" sensing assets but which is instead generated via other means, e.g. by humans... [Citation fragment: Mott, de Mel, and Pham, "Integrating hard and soft information sources for D2D using controlled natural language," 2012.] ...We need to decide which ISR sensing asset (or group of assets) might be fit to satisfy our information request with a high success rate. This is a hard...
Design, synthesis and cellular metabolism study of 4'-selenonucleosides.
Yu, Jinha; Sahu, Pramod K; Kim, Gyudong; Qu, Shuhao; Choi, Yoojin; Song, Jayoung; Lee, Sang Kook; Noh, Minsoo; Park, Sunghyouk; Jeong, Lak Shin
2015-01-01
4'-Seleno-homonucleosides were synthesized as next-generation nucleosides, and their cellular phosphorylation was studied to test the hypothesis that a bulky selenium atom can sterically hinder the approach of cellular nucleoside kinases to the 5'-OH for phosphorylation. The 4'-seleno-homonucleosides (n = 2), with one-carbon homologation, were synthesized through a tandem seleno-Michael addition-SN2 ring cyclization. LC-MS analysis demonstrated that they were phosphorylated by cellular nucleoside kinases, resulting in anticancer activity. The bulky selenium atom played a key role in determining phosphorylation by cellular nucleoside kinases.
Pressurized fluidized bed offers promising route to cogeneration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1980-03-01
STAL-LAVAL has been monitoring the development of pressurized fluidized-bed combustion (PFBC) technology and has decided to apply it as a way to burn coal and satisfy the important criteria of efficiency, low cost, environmental acceptability, low investment cost, and the capacity to use a wide range of coal qualities. The present status of PFBC and co-generation technology is reviewed and examples of industrial as well as utility applications are cited. A successful commercialization of PFBC could contribute to the success of coal-utilization policies. (DCK)
LISA, the next generation: from a web-based application to a fat client.
Pierlet, Noëlla; Aerts, Werner; Vanautgaerden, Mark; Van den Bosch, Bart; De Deurwaerder, André; Schils, Erik; Noppe, Thomas
2008-01-01
The LISA application, developed by the University Hospitals Leuven, permits referring physicians to consult the electronic medical records of their patients over the internet in a highly secure way. We decided to completely change the way we secured the application, discard the existing web application and build a completely new application, based on the in-house developed hospital information system, used in the University Hospitals Leuven. The result is a fat Java client, running on a Windows Terminal Server, secured by a commercial SSL-VPN solution.
Efficient Data Generation and Publication as a Test Tool
NASA Technical Reports Server (NTRS)
Einstein, Craig Jakob
2017-01-01
A tool to facilitate the generation and publication of test data was created to test the individual components of a command and control system designed to launch spacecraft. Specifically, this tool was built to ensure messages are properly passed between system components. The tool can also be used to test whether the appropriate groups have access (read/write privileges) to the correct messages. The messages passed between system components take the form of unique identifiers with associated values. These identifiers are alphanumeric strings that identify the type of message and the additional parameters that are contained within the message. The values that are passed with the message depend on the identifier. The data generation tool allows for the efficient creation and publication of these messages. A configuration file can be used to set the parameters of the tool and also specify which messages to pass.
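As a minimal sketch of the idea (identifiers with associated values, generated from a configuration and then published), the code below reads a small configuration, generates values per message type, and hands them to a stub publish function. The identifier names, configuration keys, and publish stub are hypothetical; the real tool's message bus and schemas are not described in the abstract.

```python
import json
import random

# Test-data generation sketch: messages are unique identifiers with associated
# values, driven by a configuration, and handed to a publish stub. All names and
# keys here are hypothetical placeholders.

config = json.loads("""
{
  "messages": [
    {"identifier": "GSE.TEMP.SENSOR01", "type": "float", "min": 10.0, "max": 40.0},
    {"identifier": "GSE.VALVE.STATE",   "type": "enum",  "values": ["OPEN", "CLOSED"]}
  ],
  "count": 3
}
""")

def generate_value(spec):
    if spec["type"] == "float":
        return round(random.uniform(spec["min"], spec["max"]), 2)
    return random.choice(spec["values"])

def publish(identifier, value):
    # Stand-in for the real publication call to the command-and-control system.
    print(f"PUBLISH {identifier} = {value}")

for _ in range(config["count"]):
    for spec in config["messages"]:
        publish(spec["identifier"], generate_value(spec))
```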
An Ada programming support environment
NASA Technical Reports Server (NTRS)
Tyrrill, AL; Chan, A. David
1986-01-01
The toolset of an Ada Programming Support Environment (APSE) being developed at North American Aircraft Operations (NAAO) of Rockwell International is described. The APSE is resident on three different hosts and must support developments for the hosts and for embedded targets. Tools and developed software must be freely portable between the hosts. The toolset includes the usual editors, compilers, linkers, debuggers, configuration managers, and documentation tools. Generally, these are being supplied by the host computer vendors. Other tools, for example a pretty printer, cross referencer, compilation order tool, and management tools, were obtained from public-domain sources, are implemented in Ada, and are being ported to the hosts. Several tools being implemented in-house are of interest; these include an Ada Design Language processor based on compilable Ada. A Standalone Test Environment Generator facilitates test tool construction and partially automates unit level testing. A Code Auditor/Static Analyzer permits the Ada programs to be evaluated against measures of quality. An Ada Comment Box Generator partially automates generation of header comment boxes.
Comparison of BrainTool to other UML modeling and model transformation tools
NASA Astrophysics Data System (ADS)
Nikiforova, Oksana; Gusarovs, Konstantins
2017-07-01
Over the last 30 years, numerous systems for generating software from models have been offered, targeting problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays, the Object Management Group (OMG) makes similar arguments with regard to Unified Modeling Language (UML) models at different levels of abstraction. CASE tools are said to enable a significant level of automation in software development. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for the traditional ones, and ending, for the most advanced ones, with a code generator (which could use a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.
NASA Astrophysics Data System (ADS)
Zhao, Minghui; Zhao, Xuesen; Li, Zengqiang; Sun, Tao
2014-08-01
In the generation of non-rotationally symmetric microstructured surfaces by turning with a Fast Tool Servo (FTS), a non-uniform distribution of the interpolation data points leads to long processing cycles and poor surface quality. To improve this situation, a nearly arc-length tool path generation algorithm is proposed, which generates tool tip trajectory points at nearly equal arc-length intervals instead of following the traditional equal-angle interpolation rule, and which adds tool radius compensation. All the interpolation points are equidistant in the radial direction because of the constant feed speed of the X slider; the high-frequency tool radius compensation components lie in both the X and Z directions, which makes it difficult for the X slider to follow the commanded motion because of its large mass. The Newton iterative method is used to calculate the coordinates of the neighboring contour tangent point, with the interpolation point's X position as the initial value; in this way the new Z coordinate is obtained and the high-frequency motion component in the X direction is shifted into the Z direction. Taking as a test case a typical microstructure with a 4 μm PV value, composed of two 70 μm wavelength sine waves, the maximum profile error at an angle of fifteen degrees is less than 0.01 μm when turning with a diamond tool with a large radius of 80 μm. The sinusoidal grid was successfully machined on an ultra-precision lathe; the wavelength is 70.2278 μm and the Ra value is 22.81 nm, evaluated from data points generated by filtering out the first five harmonics.
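The Newton-based compensation step described above can be sketched as follows: for each commanded X position, iterate to the surface point where a tool of radius R is tangent, then place all of the compensation in Z. The surface below is a stand-in sinusoid loosely matching the reported test case (70 μm wavelength, 4 μm PV, 80 μm tool radius); the paper's exact surface and convergence settings are not reproduced.

```python
import numpy as np

# Tool-radius compensation sketch: for a commanded tool-centre X, Newton
# iteration finds the contact point on z = f(x) so the radius-R tool is tangent,
# and the compensated motion appears only in Z. The surface is a stand-in.

R = 80.0                                   # tool nose radius, um
wavelength, amplitude = 70.0, 2.0          # 4 um PV sine -> 2 um amplitude

f   = lambda x: amplitude * np.sin(2 * np.pi * x / wavelength)
fp  = lambda x: amplitude * (2 * np.pi / wavelength) * np.cos(2 * np.pi * x / wavelength)
fpp = lambda x: -amplitude * (2 * np.pi / wavelength) ** 2 * np.sin(2 * np.pi * x / wavelength)

def compensated_z(x_c, tol=1e-9, max_iter=50):
    """Return the tool-centre Z for commanded X so the tool is tangent to z = f(x)."""
    x = x_c                                # interpolation-point X as the initial value
    for _ in range(max_iter):
        s = np.sqrt(1.0 + fp(x) ** 2)
        g = x - R * fp(x) / s - x_c        # contact-point condition g(x) = 0
        dg = 1.0 - R * fpp(x) / s ** 3
        step = g / dg
        x -= step
        if abs(step) < tol:
            break
    return f(x) + R / np.sqrt(1.0 + fp(x) ** 2)

for x_cmd in (0.0, 17.5, 35.0):
    print(f"x = {x_cmd:5.1f} um -> tool-centre z = {compensated_z(x_cmd):8.4f} um")
```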
Testing of a De Nora polymer electrolyte fuel cell stack of 1 kW for naval applications
NASA Astrophysics Data System (ADS)
Schmal, D.; Kluiters, C. E.; Barendregt, I. P.
In a previous study, calculations were carried out for a navy frigate with respect to the energy consumption of a propulsion/electricity generation system based on fuel cells. The fuel consumption of the 'all-fuel cell' ship was compared with the consumption of the current propulsion/electricity generation system based on gas turbines and diesel engines; it showed potential energy savings of a fuel cell based system amounting to 25-30%. On the basis of these results, and taking into account various military aspects, it was decided to start tests with a polymer electrolyte fuel cell (PEFC) stack. For this purpose a De Nora 1 kW PEFC was chosen. Results of the first tests after installation are satisfactory.
The generation of criteria for selecting analytical tools for landscape management
Marilyn Duffey-Armstrong
1979-01-01
This paper presents an approach to generating criteria for selecting the analytical tools used to assess visual resources for various landscape management tasks. The approach begins by first establishing the overall parameters for the visual assessment task, and follows by defining the primary requirements of the various sets of analytical tools to be used. Finally,...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karp, Peter D.
Pathway Tools is a systems-biology software package written by SRI International (SRI) that produces Pathway/Genome Databases (PGDBs) for organisms with a sequenced genome. Pathway Tools also provides a wide range of capabilities for analyzing predicted metabolic networks and user-generated omics data. More than 5,000 academic, industrial, and government groups have licensed Pathway Tools. This user community includes researchers at all three DOE bioenergy centers, as well as academic and industrial metabolic engineering (ME) groups. An integral part of the Pathway Tools software is MetaCyc, a large, multiorganism database of metabolic pathways and enzymes that SRI and its academic collaborators manually curate. This project included two main goals: I. Enhance the MetaCyc content of bioenergy-related enzymes and pathways. II. Develop computational tools for engineering metabolic pathways that satisfy specified design goals, in particular for bioenergy-related pathways. In part I, SRI proposed to significantly expand the coverage of bioenergy-related metabolic information in MetaCyc, followed by the generation of organism-specific PGDBs for all energy-relevant organisms sequenced at the DOE Joint Genome Institute (JGI). Part I objectives included: 1: Expand the content of MetaCyc to include bioenergy-related enzymes and pathways. 2: Enhance the Pathway Tools software to enable display of complex polymer degradation processes. 3: Create new PGDBs for the energy-related organisms sequenced by JGI, update existing PGDBs with new MetaCyc content, and make these data available to JBEI via the BioCyc website. In part II, SRI proposed to develop an efficient computational tool for the engineering of metabolic pathways. Part II objectives included: 4: Develop computational tools for generating metabolic pathways that satisfy specified design goals, enabling users to specify parameters such as starting and ending compounds, and preferred or disallowed intermediate compounds. The pathways were to be generated using metabolic reactions from a reference database (DB). 5: Develop computational tools for ranking the pathways generated in objective (4) according to their optimality. The ranking criteria include stoichiometric yield, the number and cost of additional inputs and the cofactor compounds required by the pathway, pathway length, and pathway energetics. 6: Develop tools for visualizing generated pathways to facilitate the evaluation of a large space of generated pathways.
Embedded real-time image processing hardware for feature extraction and clustering
NASA Astrophysics Data System (ADS)
Chiu, Lihu; Chang, Grant
2003-08-01
Printronix, Inc. uses scanner-based image systems to perform print quality measurements for line-matrix printers. The size of the image samples and the image definition required make commercial scanners convenient to use. The image processing is relatively well defined, and we are able to simplify many of the calculations into hardware equations and "c" code. The process of rapidly prototyping the system using DSP-based "c" code gets the algorithms well defined early in the development cycle. Once a working system is defined, the rest of the process involves splitting the task up between the FPGA and the DSP implementation. Deciding which of the two to use, the DSP or the FPGA, is a simple matter of trial benchmarking. There are two kinds of benchmarking: one for speed, and the other for memory. The more memory-intensive algorithms should run in the DSP, and the simple real-time tasks can use the FPGA most effectively. Once the task is split, we can decide on which platform each algorithm should be executed. This involves prototyping all the code in the DSP and then timing various blocks of the algorithm. Slow routines can be optimized using the compiler tools and, if a further reduction in time is needed, moved into tasks that the FPGA can perform.
An Arbitrary First Order Theory Can Be Represented by a Program: A Theorem
NASA Technical Reports Server (NTRS)
Hosheleva, Olga
1997-01-01
How can we represent knowledge inside a computer? For formalized knowledge, classical logic seems to be the most adequate tool. Classical logic is behind all formalisms of classical mathematics, and behind many formalisms used in Artificial Intelligence. There is only one serious problem with classical logic: due to the famous Godel's theorem, classical logic is algorithmically undecidable; as a result, when knowledge is represented in the form of logical statements, it is very difficult to check whether, based on these statements, a given query is true or not. To make knowledge representation more algorithmic, a special field of logic programming was invented. An important portion of logic programming is algorithmically decidable. To cover knowledge that cannot be represented in this portion, several extensions of the decidable fragments have been proposed. In the spirit of logic programming, these extensions are usually introduced in such a way that even if a general algorithm is not available, good heuristic methods exist. It is important to check whether the already proposed extensions are sufficient, or whether further extensions are necessary. In the present paper, we show that one particular extension, namely logic programming with classical negation, introduced by M. Gelfond and V. Lifschitz, can represent (in some reasonable sense) an arbitrary first order logical theory.
NASA Technical Reports Server (NTRS)
1982-01-01
Accutron Tool & Instrument Co.'s welder was originally developed as a tool specifically for joining parts made of plastic or composite materials in any atmosphere, including the airless environment of space. The developers decided on induction, or magnetic, heating to avoid causing deformation, and the welder can be used with almost any type of thermoplastic material. An induction coil transfers magnetic flux through the plastic to a metal screen that is sandwiched between the sheets of plastic to be joined. When the welder is energized, alternating current produces inductive heating in the screen, causing the adjacent plastic surfaces to melt and flow into the mesh, creating a bond over the total surface area. Dave Brown, owner of Great Falls Canoe and Kayak Repair, Vienna, VA, uses a special repair technique based on operation of the Induction Toroid Welder to fix canoes. Whitewater canoeing poses the problem of frequent gashes that are difficult to repair, mainly because many canoes are made of plastics. The commercial induction model is a self-contained, portable welding gun with a switch on the handle to regulate the temperature of the plastic melting screen. The welder has a broad range of applications in the automobile, appliance, aerospace and construction industries.
Guidelines for appropriate care: the importance of empirical normative analysis.
Berg, M; Meulen, R T; van den Burg, M
2001-01-01
The Royal Dutch Medical Association recently completed a research project aimed at investigating how guidelines for 'appropriate medical care' should be construed. The project took as a starting point that explicit attention should be given to ethical and political considerations in addition to data about costs and effectiveness. In the project, two research groups set out to design guidelines and cost-effectiveness analyses (CEAs) for two circumscribed medical areas (angina pectoris and major depression). Our third group was responsible for the normative analysis. We undertook an explorative, qualitative pilot study of the normative considerations that played a role in constructing the guidelines and CEAs, and simultaneously interviewed specialists about the normative considerations that guided their diagnostic and treatment decisions. Explicating normative considerations, we argue, is important democratically: the issues at stake should not be left to decision analysts and guideline developers to decide. Moreover, it is a necessary condition for a successful implementation of such tools: those who draw upon these tools will only accept them when they can recognize themselves in the considerations implied. Empirical normative analysis, we argue, is a crucial tool in developing guidelines for appropriate medical care.
Satapathy, Sujata; Choudhary, Vandana; Sagar, Rajesh
2017-02-01
The absence of visible physical symptoms and a limited capacity to express trauma directly pose significant challenges in assessing the exact nature of trauma and its correlates in child sexual abuse (CSA). There are numerous assessment tools; however, deciding upon the appropriate one is often challenging in Asian socio-cultural and health care settings. A review would provide the practitioner with a ready reference to the clinical utility of the tools and would also guide them towards culture-specific modifications. Computerized databases, namely Medline, PsycINFO, Health and Psychosocial Instruments, and Social Sciences Citation Index, were used. 52 scales were obtained and analysed in terms of scale characteristics, reference to theory and DSM, and cultural competency. Despite a wide variety of methods and newer instruments, many of the traditionally used techniques for assessing a child's internal thinking and emotions appear outdated when reviewed against recent theories of CSA-related psychological trauma. An integrated format, incorporating child-parent-clinician ratings, with multiple domain-specific items and verbal and non-verbal tasks, is the current need in the Asian region. Copyright © 2016 Elsevier B.V. All rights reserved.
A Non-technical User-Oriented Display Notation for XACML Conditions
NASA Astrophysics Data System (ADS)
Stepien, Bernard; Felty, Amy; Matwin, Stan
Ideally, access control to resources in complex IT systems ought to be handled by the business decision makers who own a given resource (e.g., the pay and benefits section of an organization should decide and manage the access rules to the payroll system). To make this happen, the security and database communities need to develop vendor-independent access management tools usable by decision makers rather than by technical personnel detached from a given business function. We have developed and implemented such a tool, based on XACML. XACML is an important emerging tool for managing complex access control applications. As a formal notation based on an XML schema representing the grammar of a given application, XACML is precise and non-ambiguous. But this very property puts it out of reach of non-technical users. We propose a new notation for displaying and editing XACML rules that is independent of XML, and we develop an editor for it. Our notation combines a tree representation of logical expressions with an accessible natural language layer. Our early experience indicates that such rules can be grasped by non-technical users wishing to develop and control rules for accessing their own resources.
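As a rough illustration of the display idea (a tree of logical operators rendered into a natural-language layer), the sketch below prints a nested condition as indented plain-language text. The tree encoding, wording, and attribute names are assumptions, not the paper's notation or its XACML bindings.

```python
# Condition-display sketch: an access-control condition held as a tree of
# logical operators is rendered as indented natural language for non-technical
# users. The encoding and phrasing here are illustrative only.

condition = {
    "and": [
        {"equals": ["subject.department", "payroll"]},
        {"or": [
            {"greater-than": ["subject.clearance", 3]},
            {"equals": ["action", "read"]},
        ]},
    ]
}

PHRASES = {"equals": "{0} is {1}", "greater-than": "{0} is greater than {1}"}

def render(node, indent=0):
    op, args = next(iter(node.items()))
    pad = "  " * indent
    if op in ("and", "or"):
        lines = [f"{pad}{'all' if op == 'and' else 'any'} of the following:"]
        lines += [render(child, indent + 1) for child in args]
        return "\n".join(lines)
    return pad + PHRASES[op].format(*args)

print(render(condition))
```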
Preparing and Analyzing Iced Airfoils
NASA Technical Reports Server (NTRS)
Vickerman, Mary B.; Baez, Marivell; Braun, Donald C.; Cotton, Barbara J.; Choo, Yung K.; Coroneos, Rula M.; Pennline, James A.; Hackenberg, Anthony W.; Schilling, Herbert W.; Slater, John W.;
2004-01-01
SmaggIce version 1.2 is a computer program for preparing and analyzing iced airfoils. It includes interactive tools for (1) measuring ice-shape characteristics, (2) controlled smoothing of ice shapes, (3) curve discretization, (4) generation of artificial ice shapes, and (5) detection and correction of input errors. Measurements of ice shapes are essential for establishing relationships between characteristics of ice and the effects of ice on airfoil performance. The shape-smoothing tool helps prepare ice shapes for use with already available grid-generation and computational-fluid-dynamics software for studying the aerodynamic effects of smoothed ice on airfoils. The artificial ice-shape generation tool supports parametric studies, since ice-shape parameters can easily be controlled with the artificial ice. In such studies, artificial shapes generated by this program can supplement simulated ice obtained from icing research tunnels and real ice obtained from flight tests under icing weather conditions. SmaggIce also automatically detects geometry errors, such as tangles or duplicate points in the boundary, which may be introduced by digitization, and provides tools to correct these. By use of the interactive tools included in SmaggIce version 1.2, one can easily characterize ice shapes and prepare iced airfoils for grid generation and flow simulations.
POST II Trajectory Animation Tool Using MATLAB, V1.0
NASA Technical Reports Server (NTRS)
Raiszadeh, Behzad
2005-01-01
A trajectory animation tool has been developed for accurately depicting the position and attitude of bodies in flight. The movies generated by this MATLAB-based tool serve as an engineering analysis aid for gaining further understanding of the dynamic behavior of bodies in flight. The tool has been designed to interface with the output generated from POST II simulations, and it is able to animate a single vehicle as well as multiple vehicles in flight.
5 CFR 890.1023 - Information considered in deciding a contest.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Imposed Against Health Care Providers Permissive Debarments § 890.1023 Information considered in deciding... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false Information considered in deciding a contest. 890.1023 Section 890.1023 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED...
Development of an Easy-to-Use Tool for the Assessment of Emergency Department Physical Design.
Majidi, Alireza; Tabatabaey, Ali; Motamed, Hassan; Motamedi, Maryam; Forouzanfar, Mohammad Mehdi
2014-01-01
The physical design of the emergency department (ED) has an important effect on its role and function. To date, no guidelines have been introduced to set the standards for the construction of EDs in Iran. In this study, we aim to devise an easy-to-use tool, based on the available literature and expert opinion, for the quick and effective assessment of EDs with regard to their physical design. For this purpose, a comprehensive checklist was developed based on the current literature on emergency department design. This checklist was then analyzed by a panel consisting of the heads of three major EDs, and contradictory items were decided upon. 178 crude items were derived from the available literature. The items were categorized into three major domains: Physical Space, Equipment, and Accessibility. The final checklist approved by the panel consisted of 163 items categorized into six domains. Each item was phrased as a "Yes or No" question for ease of analysis, meaning that the criterion is either met or not.
NASA Astrophysics Data System (ADS)
Dewi, Cut; Nopera Rauzi, Era
2018-05-01
This paper discusses the role of architectural heritage as a tool for resilience in a community after a devastating disaster. It argues that architectural heritage is not merely a passive victim needing to be rescued; rather, it is also an active agent in providing resilience for survivors. This is evident in the ways it acts as a signifier of collective memories and place identities, serves as a place to seek refuge in an emergency, and is a place where central decisions are made during the reconstruction process. This paper explores several theories related to architectural heritage in the post-disaster context and juxtaposes them in a case study of Banda Aceh after the 2004 Tsunami Disaster. The paper is based on six months of anthropological fieldwork conducted in Banda Aceh in 2012 after the Tsunami Disaster. During the fieldwork, 166 respondents were interviewed to gain extensive insight into the ways architecture might play a role in post-disaster reconstruction.
A decision support tool for landfill methane generation and gas collection.
Emkes, Harriet; Coulon, Frédéric; Wagland, Stuart
2015-09-01
This study presents a decision support tool (DST) to enhance methane generation at individual landfill sites. To date, no such tool is available to provide landfill decision makers with clear and simplified information to evaluate biochemical processes within a landfill site, to assess the performance of gas production, and to identify potential remedies to any issues. The current lack in understanding stems from the complexity of the landfill waste degradation process. Two scoring sets for landfill gas production performance are calculated with the tool: (1) a methane output score, which measures the deviation of the actual methane output rate at each site from the prediction generated by the first-order decay model LandGEM; and (2) a landfill gas indicators' score, which measures the deviation of the landfill gas indicators from their ideal ranges for optimal methane generation conditions. Landfill gas indicators include moisture content, temperature, alkalinity, pH, BOD, COD, BOD/COD ratio, ammonia, chloride, iron and zinc. A total landfill gas indicator score is obtained using multi-criteria analysis to calculate the sum of weighted scores for each indicator. The weights for each indicator are calculated using an analytical hierarchy process. The tool is tested against five real scenarios for landfill sites in the UK with a range of good, average and poor landfill methane generation over a one-year period (2012). An interpretation of the results is given for each scenario and recommendations are highlighted for methane output rate enhancement. Results demonstrate how the tool can help landfill managers and operators to enhance their understanding of methane generation at a site-specific level, track landfill methane generation over time, compare and rank sites, and identify problem areas within a landfill site. Copyright © 2015 Elsevier Ltd. All rights reserved.
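The indicator scoring just described is essentially a weighted sum of deviations from ideal operating ranges. A minimal sketch of that calculation follows; the indicator set, ideal ranges, and weights shown here are illustrative placeholders, not the values published with the tool.

```python
# Illustrative ideal ranges for a few leachate/gas indicators; these are
# placeholders, NOT the ranges used in the published decision support tool.
IDEAL_RANGES = {
    "pH":          (6.8, 7.5),
    "temperature": (35.0, 45.0),    # degrees C
    "BOD_COD":     (0.1, 0.4),      # ratio
    "ammonia":     (500.0, 1500.0)  # mg/L
}

# Hypothetical weights; in the paper these come from an analytic hierarchy process.
WEIGHTS = {"pH": 0.35, "temperature": 0.30, "BOD_COD": 0.20, "ammonia": 0.15}

def deviation_score(value, low, high):
    """0 inside the ideal range, growing linearly with relative distance outside it."""
    if low <= value <= high:
        return 0.0
    edge = low if value < low else high
    return abs(value - edge) / (high - low)

def total_indicator_score(measured):
    """Weighted sum of deviation scores; lower means closer to optimal conditions."""
    return sum(WEIGHTS[k] * deviation_score(v, *IDEAL_RANGES[k])
               for k, v in measured.items())

if __name__ == "__main__":
    site = {"pH": 6.2, "temperature": 38.0, "BOD_COD": 0.55, "ammonia": 2100.0}
    print(f"total landfill gas indicator score: {total_indicator_score(site):.3f}")
```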
5 CFR 890.1036 - Information considered in deciding a contest.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Imposed Against Health Care Providers Suspension § 890.1036 Information considered in deciding a contest... 5 Administrative Personnel 2 2014-01-01 2014-01-01 false Information considered in deciding a contest. 890.1036 Section 890.1036 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED...
5 CFR 890.1036 - Information considered in deciding a contest.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Imposed Against Health Care Providers Suspension § 890.1036 Information considered in deciding a contest... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Information considered in deciding a contest. 890.1036 Section 890.1036 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED...
5 CFR 890.1036 - Information considered in deciding a contest.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Imposed Against Health Care Providers Suspension § 890.1036 Information considered in deciding a contest... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false Information considered in deciding a contest. 890.1036 Section 890.1036 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED...
ERIC Educational Resources Information Center
Souto-Manning, Mariana
2017-01-01
Through a case study, this article sheds light onto generative text sets as tools for developing and enacting critically inclusive early childhood teacher education pedagogies. In doing so, it positions teaching and learning processes as sociocultural, historical, and political acts as it inquires into the use of generative text sets in one early…
Kallenberg, F G J; Aalfs, C M; The, F O; Wientjes, C A; Depla, A C; Mundt, M W; Bossuyt, P M M; Dekker, E
2017-09-21
Identifying a hereditary colorectal cancer (CRC) syndrome or familial CRC (FCC) in a CRC patient may enable the patient and relatives to enroll in surveillance protocols. As these individuals are insufficiently recognized, we evaluated an online family history tool, consisting of a patient-administered family history questionnaire and an automated genetic referral recommendation, to facilitate the identification of patients with hereditary CRC or FCC. Between 2015 and 2016, all newly diagnosed CRC patients in five Dutch outpatient clinics were included in a trial with a stepped-wedge design when first visiting the clinic. Each hospital continued standard procedures for identifying patients at risk (control strategy) and then, after a predetermined period, switched to offering the family history tool to included patients (intervention strategy). After considering the tool-based recommendation, the health care provider could decide on and arrange the referral. The primary outcome was the relative number of CRC patients who received screening or surveillance recommendations for themselves or relatives because of hereditary CRC or FCC, provided by genetic counseling. The intervention effect was evaluated using a logit-linear model. With the tool, 46/489 (9.4%) patients received a screening or surveillance recommendation, compared to 35/292 (12.0%) in the control group. In the intention-to-treat analysis, accounting for time trends and hospital effects, this difference was not statistically significant (p = 0.58). A family history tool does not necessarily assist in increasing the number of CRC patients and relatives enrolled in screening or surveillance recommendations for hereditary CRC or FCC. Other interventions should be considered.
Digital test assembly of truck parts with the IMMA-tool--an illustrative case.
Hanson, L; Högberg, D; Söderholm, M
2012-01-01
Several digital human modelling (DHM) tools have been developed for simulation and visualisation of human postures and motions. In 2010 the DHM tool IMMA (Intelligently Moving Manikins) was introduced; it uses advanced path-planning techniques to generate collision-free and biomechanically acceptable motions for digital human models (as well as parts) in complex assembly situations. The aim of the paper is to illustrate how the IPS/IMMA tool is used at Scania CV AB in a digital test assembly process, and to compare the tool with other DHM tools on the market. The illustrated case of using the IMMA tool, here combined with the path-planner tool IPS, indicates that the tool is promising. The major strengths of the tool are its user-friendly interface, the motion generation algorithms, the batch simulation of manikins and the ergonomics assessment methods that consider time.
Sustainability Tools Inventory - Initial Gaps Analysis
This report identifies a suite of tools that address a comprehensive set of community sustainability concerns. The objective is to discover whether "gaps" exist in the tool suite’s analytic capabilities. These tools address activities that significantly influence resource consumption, waste generation, and hazard generation including air pollution and greenhouse gases. In addition, the tools have been evaluated using four screening criteria: relevance to community decision making, tools in an appropriate developmental stage, tools that may be transferrable to situations useful for communities, and tools requiring skill levels appropriate to communities. This document provides an initial gap analysis in the area of community sustainability decision support tools. It provides a reference to communities for existing decision support tools, and a set of gaps for those wishing to develop additional needed tools to help communities achieve sustainability. It contributes to SHC 1.61.4
PC graphics generation and management tool for real-time applications
NASA Technical Reports Server (NTRS)
Truong, Long V.
1992-01-01
A graphics tool was designed and developed for easy generation and management of personal computer graphics. It also provides methods and 'run-time' software for many common artificial intelligence (AI) or expert system (ES) applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Etingov, Pavel; Makarov, Yuri; Subbarao, Kris
RUT software is designed for use by the Balancing Authorities to predict and display additional requirements caused by the variability and uncertainty in load and generation. The prediction is made for the next operating hours as well as for the next day. The tool predicts possible deficiencies in generation capability and ramping capability. This deficiency of balancing resources can cause serious risks to power system stability and also impact real-time market energy prices. The tool dynamically and adaptively correlates changing system conditions with the additional balancing needs triggered by the interplay between forecasted and actual load and output of variable resources. The assessment is performed using a specially developed probabilistic algorithm incorporating multiple sources of uncertainty including wind, solar and load forecast errors. The tool evaluates required generation for a worst case scenario, with a user-specified confidence level.
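The worst-case requirement at a user-specified confidence level can be illustrated as a quantile of the combined forecast-error distribution. The following sketch assumes independent Gaussian load, wind, and solar forecast errors purely for demonstration; it is not the probabilistic algorithm used in the RUT software.

```python
import numpy as np

def balancing_requirement(load_sigma, wind_sigma, solar_sigma,
                          confidence=0.95, n_samples=100_000, seed=0):
    """
    Monte Carlo estimate of the extra upward balancing capacity (MW) needed to
    cover combined load/wind/solar forecast errors at the given confidence level.
    Independent Gaussian errors are a simplifying assumption for illustration.
    """
    rng = np.random.default_rng(seed)
    load_err = rng.normal(0.0, load_sigma, n_samples)    # + means load above forecast
    wind_err = rng.normal(0.0, wind_sigma, n_samples)    # + means wind below forecast
    solar_err = rng.normal(0.0, solar_sigma, n_samples)  # + means solar below forecast
    # Net shortfall: under-forecast load plus over-forecast renewables.
    shortfall = load_err + wind_err + solar_err
    return float(np.quantile(shortfall, confidence))

if __name__ == "__main__":
    need = balancing_requirement(load_sigma=150.0, wind_sigma=200.0,
                                 solar_sigma=80.0, confidence=0.95)
    print(f"upward balancing requirement at 95% confidence: {need:.0f} MW")
```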
MCM generator: a Java-based tool for generating medical metadata.
Munoz, F; Hersh, W
1998-01-01
In a previous paper we introduced the need to implement a mechanism to facilitate the discovery of relevant Web medical documents. We maintained that the use of META tags, specifically ones that define the medical subject and resource type of a document, helps towards this goal. We have now developed a tool to facilitate the generation of these tags for the authors of medical documents. Written entirely in Java, this tool makes use of the SAPHIRE server and helps the author identify the Medical Subject Heading terms that most appropriately describe the subject of the document. Furthermore, it allows the author to generate metadata tags for the 15 elements that the Dublin Core considers as core elements in the description of a document. This paper describes the use of this tool in the cataloguing of Web and non-Web medical documents, such as image, movie, and sound files.
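Rendering the 15 Dublin Core elements as HTML META tags follows a simple pattern, sketched below. The element names are the standard DCMES 1.1 labels, but the record contents and the idea of supplying MeSH headings directly (rather than via a SAPHIRE lookup) are assumptions for illustration.

```python
from html import escape

# The 15 Dublin Core elements (DCMES 1.1).
DUBLIN_CORE = ["title", "creator", "subject", "description", "publisher",
               "contributor", "date", "type", "format", "identifier",
               "source", "language", "relation", "coverage", "rights"]

def dublin_core_meta(record):
    """Render DC.* <meta> tags for the elements present in `record`."""
    tags = []
    for element in DUBLIN_CORE:
        for value in record.get(element, []):
            tags.append(f'<meta name="DC.{element}" content="{escape(value)}">')
    return "\n".join(tags)

if __name__ == "__main__":
    record = {
        "title":   ["Management of Community-Acquired Pneumonia"],
        "creator": ["Munoz, F.", "Hersh, W."],
        # MeSH headings would normally be suggested by a concept-matching server.
        "subject": ["Pneumonia", "Anti-Bacterial Agents"],
        "type":    ["Text.Guideline"],
        "date":    ["1998-01-01"],
    }
    print(dublin_core_meta(record))
```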
Decision Analysis Tools for Volcano Observatories
NASA Astrophysics Data System (ADS)
Hincks, T. H.; Aspinall, W.; Woo, G.
2005-12-01
Staff at volcano observatories are predominantly engaged in scientific activities related to volcano monitoring and instrumentation, data acquisition and analysis. Accordingly, the academic education and professional training of observatory staff tend to focus on these scientific functions. From time to time, however, staff may be called upon to provide decision support to government officials responsible for civil protection. Recognizing that Earth scientists may have limited technical familiarity with formal decision analysis methods, specialist software tools that assist decision support in a crisis should be welcome. A review is given of two software tools that have been under development recently. The first is for probabilistic risk assessment of human and economic loss from volcanic eruptions, and is of practical use in short and medium-term risk-informed planning of exclusion zones, post-disaster response, etc. A multiple branch event-tree architecture for the software, together with a formalism for ascribing probabilities to branches, have been developed within the context of the European Community EXPLORIS project. The second software tool utilizes the principles of the Bayesian Belief Network (BBN) for evidence-based assessment of volcanic state and probabilistic threat evaluation. This is of practical application in short-term volcano hazard forecasting and real-time crisis management, including the difficult challenge of deciding when an eruption is over. An open-source BBN library is the software foundation for this tool, which is capable of combining synoptically different strands of observational data from diverse monitoring sources. A conceptual vision is presented of the practical deployment of these decision analysis tools in a future volcano observatory environment. Summary retrospective analyses are given of previous volcanic crises to illustrate the hazard and risk insights gained from use of these tools.
Heath, Gemma; Abdin, Shanara; Begum, Rahima; Kearney, Shauna
2016-08-01
Against a backdrop of recommendations for increasing access to and uptake of early surgical intervention for children with medically intractable epilepsy, it is important to understand how parents and professionals decide to put children forward for epilepsy surgery and what their decisional support needs are. The aim of this study was to explore how parents and health professionals make decisions regarding putting children forward for pediatric epilepsy surgery. Individual interviews were conducted with nine parents of children who had undergone pediatric epilepsy surgery at a specialist children's hospital and ten healthcare professionals who made up the children's epilepsy surgery service multidisciplinary healthcare team (MDT). Three MDT meetings were also observed. Data were analyzed thematically. Four themes were generated from analysis of interviews with parents: presentation of surgery as a treatment option, decision-making, looking back, and interventions. Three themes were generated from analysis of interviews/observations with health professionals: triangulating information, team working, and patient and family perspectives. Parents wanted more information and support in deciding to put their child forward for epilepsy surgery. They attempted to balance the potential benefits of surgery against any risks of harm. For health professionals, a multidisciplinary approach was seen as crucial to the decision-making process. Advocating for the family was perceived to be the responsibility of nonmedical professionals. Decision-making can be supported by incorporating families into discussions regarding epilepsy surgery as a potential treatment option earlier in the process and by providing families with additional information and access to other parents with similar experiences. Copyright © 2016 Elsevier Inc. All rights reserved.
Generation of novel motor sequences: the neural correlates of musical improvisation.
Berkowitz, Aaron L; Ansari, Daniel
2008-06-01
While some motor behavior is instinctive and stereotyped or learned and re-executed, much action is a spontaneous response to a novel set of environmental conditions. The neural correlates of both pre-learned and cued motor sequences have been previously studied, but novel motor behavior has thus far not been examined through brain imaging. In this paper, we report a study of musical improvisation in trained pianists with functional magnetic resonance imaging (fMRI), using improvisation as a case study of novel action generation. We demonstrate that both rhythmic (temporal) and melodic (ordinal) motor sequence creation modulate activity in a network of brain regions comprised of the dorsal premotor cortex, the rostral cingulate zone of the anterior cingulate cortex, and the inferior frontal gyrus. These findings are consistent with a role for the dorsal premotor cortex in movement coordination, the rostral cingulate zone in voluntary selection, and the inferior frontal gyrus in sequence generation. Thus, the invention of novel motor sequences in musical improvisation recruits a network of brain regions coordinated to generate possible sequences, select among them, and execute the decided-upon sequence.
Governance and Factions--Who Decides Who Decides?
ERIC Educational Resources Information Center
Hodgkinson, Harold L.
In several projects, the Center is studying the question: who will decide which factions will be represented in the decision-making process. In the Campus Governance Project investigating the nature of governance, over 3,000 questionnaires were administered and 900 intensive interviews conducted at 19 institutions. The questionnaire was designed…
GridTool: A surface modeling and grid generation tool
NASA Technical Reports Server (NTRS)
Samareh-Abolhassani, Jamshid
1995-01-01
GridTool is designed around the concept that the surface grids are generated on a set of bi-linear patches. This type of grid generation is quite easy to implement, and it avoids the problems associated with complex CAD surface representations and associated surface parameterizations. However, the resulting surface grids are close to but not on the original CAD surfaces. This problem can be alleviated by projecting the resulting surface grids onto the original CAD surfaces. GridTool is designed primarily for unstructured grid generation systems. Currently, GridTool supports the VGRID and FELISA systems, and it can be easily extended to support other unstructured grid generation systems. The data in GridTool are stored parametrically so that once the problem is set up, one can modify the surfaces and the entire set of points, curves and patches will be updated automatically. This is very useful in a multidisciplinary design and optimization process. GridTool is written entirely in ANSI 'C', the interface is based on the FORMS library, and the graphics are based on the GL library. The code has been tested successfully on IRIS workstations running IRIX 4.0 and above. The memory is allocated dynamically; therefore, the memory size will depend on the complexity of the geometry/grid. The GridTool data structure is based on a linked-list structure, which allows the required memory to expand and contract dynamically according to the user's data size and actions. The data structure contains several types of objects such as points, curves, patches, sources and surfaces. At any given time there is always an active object, which is drawn in magenta or in its highlighted color as defined by the resource file discussed later.
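The bilinear-patch idea at the core of the tool is simple to state: a surface point is a blend of four corner points weighted by the parametric coordinates (u, v). The sketch below evaluates such a patch and samples a small structured grid on it; it is a generic illustration, not GridTool code.

```python
import numpy as np

def bilinear_patch(p00, p10, p01, p11, u, v):
    """
    Evaluate a bilinear patch at parameters (u, v) in [0, 1] x [0, 1].
    p00..p11 are the four corner points (3-vectors).
    """
    p00, p10, p01, p11 = (np.asarray(p, dtype=float) for p in (p00, p10, p01, p11))
    return ((1 - u) * (1 - v) * p00 + u * (1 - v) * p10
            + (1 - u) * v * p01 + u * v * p11)

def patch_grid(corners, n_u=5, n_v=5):
    """Return an (n_u, n_v, 3) array of surface grid points on one patch."""
    us = np.linspace(0.0, 1.0, n_u)
    vs = np.linspace(0.0, 1.0, n_v)
    grid = np.empty((n_u, n_v, 3))
    for i, u in enumerate(us):
        for j, v in enumerate(vs):
            grid[i, j] = bilinear_patch(*corners, u, v)
    return grid

if __name__ == "__main__":
    # A gently twisted quadrilateral patch.
    corners = [(0, 0, 0), (1, 0, 0.1), (0, 1, 0.1), (1, 1, 0.4)]
    g = patch_grid(corners, n_u=3, n_v=3)
    print("patch midpoint:", g[1, 1])
```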
Schmittdiel, Julie A; Desai, Jay; Schroeder, Emily B; Paolino, Andrea R; Nichols, Gregory A; Lawrence, Jean M; O'Connor, Patrick J; Ohnsorg, Kris A; Newton, Katherine M; Steiner, John F
2015-06-01
Engaging stakeholders in the research process has the potential to improve quality of care and the patient care experience. Online patient community surveys can elicit important topic areas for comparative effectiveness research. Stakeholder meetings with substantial patient representation, as well as representation from health care delivery systems and research funding agencies, are a valuable tool for selecting and refining pilot research and quality improvement projects. Giving patient stakeholders a deciding vote in selecting pilot research topics helps ensure their 'voice' is heard. Researchers and health care leaders should continue to develop best practices and strategies for increasing patient involvement in comparative effectiveness and delivery science research.
Picking Up Artifacts: Storyboarding as a Gateway to Reuse
NASA Astrophysics Data System (ADS)
Wahid, Shahtab; Branham, Stacy M.; Cairco, Lauren; McCrickard, D. Scott; Harrison, Steve
Storyboarding offers designers the opportunity to illustrate a visual narrative of use. Because designers often refer to past ideas, we argue storyboards can be constructed by reusing shared artifacts. We present a study in which we explore how designers reuse artifacts consisting of images and rationale during storyboard construction. We find that images can aid in accessing rationale and that connections among features aid in deciding what to reuse, creating new artifacts, and constructing storyboards. Based on requirements derived from our findings, we present a storyboarding tool, PIC-UP, to facilitate artifact sharing and reuse, and evaluate its use in an exploratory study. We conclude with remarks on facilitating reuse and future work.
Adrenal venous sampling in a patient with adrenal Cushing syndrome
Villa-Franco, Carlos Andrés; Román-Gonzalez, Alejandro; Velez-Hoyos, Alejandro; Echeverri-Isaza, Santiago
2015-01-01
Primary bilateral macronodular adrenal hyperplasia, or adrenocorticotropic hormone (ACTH)-independent bilateral nodular adrenal hyperplasia, is a rare cause of hypercortisolism; its diagnosis is challenging and there is no clear way to decide the best therapeutic approach. Adrenal venous sampling is commonly used to distinguish the source of hormonal production in patients with primary hyperaldosteronism. It could be a useful tool in this context because it might provide information to guide the treatment. We report the case of a patient with ACTH-independent Cushing syndrome in whom the use of adrenal venous sampling, with some modifications, radically changed the treatment and allowed the diagnosis of a macronodular adrenal hyperplasia. PMID:26309345
Identifying contributors of two-person DNA mixtures by familial database search.
Chung, Yuk-Ka; Fung, Wing K
2013-01-01
The role of familial database search as a crime-solving tool has been increasingly recognized by forensic scientists. As an enhancement to the existing familial search approach for single-source cases, this article presents our current progress in exploring the potential use of familial search in mixture cases. A novel method was established to predict the outcome of the search, from which a simple strategy for determining an appropriate scale of investigation by the police force is developed. Illustrated by an example using Swedish data, our approach is shown to have the potential for assisting the police force to decide on the scale of investigation, thereby achieving a desirable crime-solving rate at reasonable cost.
The detection and extraction of interleaved code segments
NASA Technical Reports Server (NTRS)
Rugaber, Spencer; Stirewalt, Kurt; Wills, Linda M.
1995-01-01
This project is concerned with a specific difficulty that arises when trying to understand and modify computer programs. In particular, it is concerned with the phenomenon of 'interleaving', in which one section of a program accomplishes several purposes, and disentangling the code responsible for each purpose is difficult. Unraveling interleaved code involves discovering the purpose of each strand of computation, as well as understanding why the programmer decided to interleave the strands. Increased understanding improves the productivity and quality of software maintenance, enhancement, and documentation activities. It is the goal of the project to characterize the phenomenon of interleaving as a prerequisite for building tools to detect and extract interleaved code fragments.
2005-03-01
Materials management information systems (MMISs) incorporate information tools that hospitals can use to automate certain business processes, increase staff compliance with these processes, and identify opportunities for cost savings. Recently, there has been a push by hospital administration to purchase enterprise resource planning (ERP) systems, information systems that promise to integrate many more facets of healthcare business. We offer this article to help materials managers, administrators, and others involved with information system selection understand the changes that have taken place in materials management information systems, decide whether they need a new system and, if so, whether a stand-alone MMIS or an ERP system will be the best choice.
NASA Astrophysics Data System (ADS)
Haqq-Misra, J.
2014-04-01
The idea that a planet or its biota may be intrinsically valuable, apart from its usefulness to humans, is contentious among ethicists, while difficulties abound in attempting to decide what is objectively better or worse for a planet or life. As a way of dissecting the issue of value and life, I present a two-axis comparative tool for ethical frameworks that considers the intrinsic or instrumental value placed upon organisms, environments, planetary systems, and space. I discuss ethical considerations relevant to contemporary space exploration, near-future human exploration of Solar System bodies, and long-term possibilities of interplanetary colonization. This allows for more transparent discussions of value with regard to future space exploration or the discovery of extraterrestrial life.
NASA Astrophysics Data System (ADS)
Boehnlein, Thomas R.; Kramb, Victoria
2018-04-01
Proper formal documentation of computer-acquired NDE experimental data generated during research is critical to the longevity and usefulness of the data. Without documentation describing how and why the data was acquired, NDE research teams lose capabilities such as the ability to generate new information from previously collected data or to provide adequate information so that their work can be replicated by others seeking to validate their research. Despite the critical nature of this issue, NDE data is still being generated in research labs without appropriate documentation. By generating documentation in series with data, equal priority is given to both activities during the research process. One way to achieve this is to use a reactive documentation system (RDS). RDS prompts an operator to document the data as it is generated rather than relying on the operator to decide when and what to document. This paper discusses how such a system can be implemented in a dynamic environment made up of in-house and third-party NDE data acquisition systems without creating an additional burden on the operator. The reactive documentation approach presented here is agnostic enough that the principles can be applied to any operator-controlled, computer-based data acquisition system.
Dislich, Claudia; Hettig, Elisabeth; Salecker, Jan; Heinonen, Johannes; Lay, Jann; Meyer, Katrin M; Wiegand, Kerstin; Tarigan, Suria
2018-01-01
Land-use changes have dramatically transformed tropical landscapes. We describe an ecological-economic land-use change model as an integrated, exploratory tool used to analyze how tropical land-use change affects ecological and socio-economic functions. The model analysis seeks to determine what kind of landscape mosaic can improve the ensemble of ecosystem functioning, biodiversity, and economic benefit based on the synergies and trade-offs that we have to account for. More specifically, (1) how do specific ecosystem functions, such as carbon storage, and economic functions, such as household consumption, relate to each other? (2) How do external factors, such as the output prices of crops, affect these relationships? (3) How do these relationships change when production inefficiency differs between smallholder farmers and learning is incorporated? We initialize the ecological-economic model with artificially generated land-use maps parameterized to our study region. The economic sub-model simulates smallholder land-use management decisions based on a profit maximization assumption. Each household determines factor inputs for all household fields and decides on land-use change based on available wealth. The ecological sub-model includes a simple account of carbon sequestration in above-ground and below-ground vegetation. We demonstrate model capabilities with results on household consumption and carbon sequestration from different output price and farming efficiency scenarios. The overall results reveal complex interactions between the economic and ecological spheres. For instance, model scenarios with heterogeneous crop-specific household productivity reveal a comparatively high inertia of land-use change. Our model analysis even shows such an increased temporal stability in landscape composition and carbon stocks of the agricultural area under dynamic price trends. These findings underline the utility of ecological-economic models, such as ours, to act as exploratory tools which can advance our understanding of the mechanisms underlying the trade-offs and synergies of ecological and economic functions in tropical landscapes.
A suite of models to support the quantitative assessment of spread in pest risk analysis.
Robinet, Christelle; Kehlenbeck, Hella; Kriticos, Darren J; Baker, Richard H A; Battisti, Andrea; Brunel, Sarah; Dupin, Maxime; Eyre, Dominic; Faccoli, Massimo; Ilieva, Zhenya; Kenis, Marc; Knight, Jon; Reynaud, Philippe; Yart, Annie; van der Werf, Wopke
2012-01-01
Pest Risk Analyses (PRAs) are conducted worldwide to decide whether and how exotic plant pests should be regulated to prevent invasion. There is an increasing demand for science-based risk mapping in PRA. Spread plays a key role in determining the potential distribution of pests, but there is no suitable spread modelling tool available for pest risk analysts. Existing models are species specific, biologically and technically complex, and data hungry. Here we present a set of four simple and generic spread models that can be parameterised with limited data. Simulations with these models generate maps of the potential expansion of an invasive species at continental scale. The models have one to three biological parameters. They differ in whether they treat spatial processes implicitly or explicitly, and in whether they consider pest density or pest presence/absence only. The four models represent four complementary perspectives on the process of invasion and, because they have different initial conditions, they can be considered as alternative scenarios. All models take into account habitat distribution and climate. We present an application of each of the four models to the western corn rootworm, Diabrotica virgifera virgifera, using historic data on its spread in Europe. Further tests as proof of concept were conducted with a broad range of taxa (insects, nematodes, plants, and plant pathogens). Pest risk analysts, the intended model users, found the model outputs to be generally credible and useful. The estimation of parameters from data requires insights into population dynamics theory, and this requires guidance. If used appropriately, these generic spread models provide a transparent and objective tool for evaluating the potential spread of pests in PRAs. Further work is needed to validate models, build familiarity in the user community and create a database of species parameters to help realize their potential in PRA practice.
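The four published models are deliberately simple, with one to three biological parameters and either implicit or explicit space. The sketch below is a generic presence/absence grid-spread illustration in the same spirit, with a habitat mask and two parameters (annual spread distance and establishment probability); it is an invented example, not one of the four models described in the paper.

```python
import numpy as np

def simulate_spread(habitat, start, years, spread_cells_per_year=1,
                    p_establish=0.6, seed=1):
    """
    Minimal presence/absence spread on a grid: each year, occupied cells can
    colonise suitable cells within `spread_cells_per_year` (Chebyshev distance)
    with probability `p_establish`. `habitat` is a boolean suitability mask.
    """
    rng = np.random.default_rng(seed)
    occupied = np.zeros_like(habitat, dtype=bool)
    occupied[start] = True
    r = spread_cells_per_year
    for _ in range(years):
        new = occupied.copy()
        rows, cols = np.where(occupied)
        for i, j in zip(rows, cols):
            lo_i, hi_i = max(0, i - r), min(habitat.shape[0], i + r + 1)
            lo_j, hi_j = max(0, j - r), min(habitat.shape[1], j + r + 1)
            window = habitat[lo_i:hi_i, lo_j:hi_j]
            trials = rng.random(window.shape) < p_establish
            new[lo_i:hi_i, lo_j:hi_j] |= window & trials
        occupied = new
    return occupied

if __name__ == "__main__":
    habitat = np.ones((40, 40), dtype=bool)
    habitat[:, 18:22] = False            # an unsuitable barrier strip
    final = simulate_spread(habitat, start=(20, 2), years=15)
    print("occupied cells after 15 years:", int(final.sum()))
```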
An Intelligent Crop Planning Tool for Controlled Ecological Life Support Systems
NASA Technical Reports Server (NTRS)
Whitaker, Laura O.; Leon, Jorge
1996-01-01
This paper describes a crop planning tool developed for the Controlled Ecological Life Support Systems (CELSS) project which is in the research phases at various NASA facilities. The Crop Planning Tool was developed to assist in the understanding of the long term applications of a CELSS environment. The tool consists of a crop schedule generator as well as a crop schedule simulator. The importance of crop planning tools such as the one developed is discussed. The simulator is outlined in detail while the schedule generator is touched upon briefly. The simulator consists of data inputs, plant and human models, and various other CELSS activity models such as food consumption and waste regeneration. The program inputs such as crew data and crop states are discussed. References are included for all nominal parameters used. Activities including harvesting, planting, plant respiration, and human respiration are discussed using mathematical models. Plans provided to the simulator by the plan generator are evaluated for their 'fitness' to the CELSS environment with an objective function based upon daily reservoir levels. Sample runs of the Crop Planning Tool and future needs for the tool are detailed.
Study of Tools for Network Discovery and Network Mapping
2003-11-01
... connected to the switch. (iv) Accessibility of historical data and event data: in general, network discovery tools keep a history of the collected ... The tool has the following software dependencies: Java Virtual Machine, Perl modules, RRD Tool, Tomcat, and PostgreSQL. Strengths and ...: provides a simple view of the current network status; generates alarms on status change; generates a history of status changes; visual map ...
ERIC Educational Resources Information Center
Millan, Eva; Belmonte, Maria-Victoria; Ruiz-Montiel, Manuela; Gavilanes, Juan; Perez-de-la-Cruz, Jose-Luis
2016-01-01
In this paper, we present BH-ShaDe, a new software tool to assist architecture students learning the ill-structured domain/task of housing design. The software tool provides students with automatic or interactively generated floor plan schemas for basic houses. The students can then use the generated schemas as initial seeds to develop complete…
Kill a brand, keep a customer.
Kumar, Nirmalya
2003-12-01
Most brands don't make much money. Year after year, businesses generate 80% to 90% of their profits from less than 20% of their brands. Yet most companies tend to ignore loss-making brands, unaware of the hidden costs they incur. That's because executives believe it's easy to erase a brand; they have only to stop investing in it, they assume, and it will die a natural death. But they're wrong. When companies drop brands clumsily, they antagonize loyal customers: Research shows that seven times out of eight, when firms merge two brands, the market share of the new brand never reaches the combined share of the two original ones. It doesn't have to be that way. Smart companies use a four-step process to kill brands methodically. First, CEOs make the case for rationalization by getting groups of senior executives to conduct joint audits of the brand portfolio. These audits make the need to prune brands apparent throughout the organization. In the next stage, executives need to decide how many brands will be retained, which they do either by setting broad parameters that all brands must meet or by identifying the brands they need in order to cater to all the customer segments in their markets. Third, executives must dispose of the brands they've decided to drop, deciding in each case whether it is appropriate to merge, sell, milk, or just eliminate the brand outright. Finally, it's critical that executives invest the resources they've freed to grow the brands they've retained. Done right, dropping brands will result in a company poised for new growth from the source where it's likely to be found--its profitable brands.
25 CFR 39.133 - Who decides how Language Development funds can be used?
Code of Federal Regulations, 2010 CFR
2010-04-01
... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2010-04-01 2010-04-01 false Who decides how Language Development funds can be used...
25 CFR 39.133 - Who decides how Language Development funds can be used?
Code of Federal Regulations, 2014 CFR
2014-04-01
... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2014-04-01 2014-04-01 false Who decides how Language Development funds can be used...
25 CFR 39.133 - Who decides how Language Development funds can be used?
Code of Federal Regulations, 2013 CFR
2013-04-01
... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school... 25 Indians 1 2013-04-01 2013-04-01 false Who decides how Language Development funds can be used...
25 CFR 39.133 - Who decides how Language Development funds can be used?
Code of Federal Regulations, 2011 CFR
2011-04-01
... 25 Indians 1 2011-04-01 2011-04-01 false Who decides how Language Development funds can be used... INDIAN SCHOOL EQUALIZATION PROGRAM Indian School Equalization Formula Language Development Programs § 39.133 Who decides how Language Development funds can be used? Tribal governing bodies or local school...
75 FR 62482 - List of Nonconforming Vehicles Decided To Be Eligible for Importation
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-12
... [Docket No. NHTSA-2010-0125; Notice 2] List of Nonconforming Vehicles Decided To Be Eligible for... Register on Tuesday, September 21, 2010, (75 FR 57396) that revised the list of vehicles not originally manufactured to conform to the Federal Motor Vehicle Safety Standards (FMVSS) that NHTSA has decided to be...
ERIC Educational Resources Information Center
Bullock-Yowell, Emily; McConnell, Amy E.; Schedin, Emily A.
2014-01-01
The career concern differences between undecided and decided college students (N = 223) are examined. Undecided college students (n = 83) reported lower career decision-making self-efficacy, higher incidences of negative career thoughts, and more career decision-making difficulties than their decided peers (n = 143). Results reveal that undecided…
13 CFR 125.17 - Who decides if a contract opportunity for SDVO competition exists?
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Who decides if a contract opportunity for SDVO competition exists? 125.17 Section 125.17 Business Credit and Assistance SMALL BUSINESS... opportunity for SDVO competition exists? The contracting officer for the contracting activity decides if a...
Integrated Control Modeling for Propulsion Systems Using NPSS
NASA Technical Reports Server (NTRS)
Parker, Khary I.; Felder, James L.; Lavelle, Thomas M.; Withrow, Colleen A.; Yu, Albert Y.; Lehmann, William V. A.
2004-01-01
The Numerical Propulsion System Simulation (NPSS), an advanced engineering simulation environment used to design and analyze aircraft engines, has been enhanced by integrating control development tools into it. One of these tools is a generic controller interface that allows NPSS to communicate with control development software environments such as MATLAB and EASY5. The other tool is a linear model generator (LMG) that gives NPSS the ability to generate linear, time-invariant state-space models. Integrating these tools into NPSS enables it to be used for control system development. This paper will discuss the development and integration of these tools into NPSS. In addition, it will show a comparison of transient model results of a generic, dual-spool, military-type engine model that has been implemented in NPSS and Simulink. It will also show the linear model generator's ability to approximate the dynamics of a nonlinear NPSS engine model.
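A linear model generator of this kind typically approximates the Jacobians of a nonlinear model about a trim point. The sketch below shows the generic central-difference approach for x_dot = f(x, u); the toy two-state "engine" model is an invented example, and none of this reflects the actual NPSS implementation.

```python
import numpy as np

def linearize(f, x0, u0, eps=1e-6):
    """
    Central-difference linearization of x_dot = f(x, u) about (x0, u0),
    returning the state-space matrices A = df/dx and B = df/du.
    """
    x0, u0 = np.asarray(x0, float), np.asarray(u0, float)
    n, m = x0.size, u0.size
    A = np.zeros((n, n))
    B = np.zeros((n, m))
    for i in range(n):
        dx = np.zeros(n); dx[i] = eps
        A[:, i] = (f(x0 + dx, u0) - f(x0 - dx, u0)) / (2 * eps)
    for j in range(m):
        du = np.zeros(m); du[j] = eps
        B[:, j] = (f(x0, u0 + du) - f(x0, u0 - du)) / (2 * eps)
    return A, B

if __name__ == "__main__":
    # Toy two-state model: spool speed and temperature driven by fuel flow.
    def f(x, u):
        n_spool, temp = x
        wf = u[0]
        return np.array([-0.5 * n_spool + 2.0 * wf,
                         -0.2 * temp + 0.8 * n_spool * wf])
    A, B = linearize(f, x0=[1.0, 0.5], u0=[0.3])
    print("A =\n", A, "\nB =\n", B)
```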
Model-Based GUI Testing Using Uppaal at Novo Nordisk
NASA Astrophysics Data System (ADS)
Hjort, Ulrik H.; Illum, Jacob; Larsen, Kim G.; Petersen, Michael A.; Skou, Arne
This paper details a collaboration between Aalborg University and Novo Nordisk in developing an automatic model-based test generation tool for system testing of the graphical user interface of a medical device on an embedded platform. The tool takes as input a UML state machine model and generates a test suite satisfying some testing criterion, such as edge or state coverage, and converts the individual test cases into a scripting language that can be automatically executed against the target. The tool has significantly reduced the time required for test construction and generation, and reduced the number of test scripts while increasing the coverage.
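Edge coverage of a state machine can be achieved with little machinery: for each uncovered transition, find a shortest event sequence from the initial state that exercises it. The sketch below is a generic illustration of that idea, not the Uppaal-based tool; the tiny menu-navigation machine at the bottom is an invented example.

```python
from collections import deque

def edge_coverage_tests(transitions, initial):
    """
    transitions: dict mapping (state, event) -> next_state.
    Returns a list of event sequences from `initial` that together cover
    every reachable transition at least once.
    """
    def shortest_path_covering(target_edge):
        # BFS over states, remembering the event sequence taken so far.
        queue = deque([(initial, [])])
        seen = {initial}
        while queue:
            state, seq = queue.popleft()
            for (s, event), nxt in transitions.items():
                if s != state:
                    continue
                new_seq = seq + [event]
                if (s, event) == target_edge:
                    return new_seq
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, new_seq))
        return None

    tests, covered = [], set()
    for edge in transitions:
        if edge in covered:
            continue
        seq = shortest_path_covering(edge)
        if seq is None:
            continue                      # edge unreachable from the initial state
        tests.append(seq)
        # Mark every edge exercised by this sequence as covered.
        state = initial
        for event in seq:
            covered.add((state, event))
            state = transitions[(state, event)]
    return tests

if __name__ == "__main__":
    # Tiny GUI-like state machine: menu navigation on a device.
    t = {("Home", "up"): "Menu", ("Menu", "ok"): "Settings",
         ("Menu", "back"): "Home", ("Settings", "back"): "Menu"}
    for seq in edge_coverage_tests(t, "Home"):
        print(" -> ".join(seq))
```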
Micro Slot Generation by μ-ED Milling
NASA Astrophysics Data System (ADS)
Dave, H. K.; Mayanak, M. K.; Rajpurohit, S. R.; Mathai, V. J.
2016-08-01
Micro electro discharge machining is one of the most widely used advanced micro machining techniques owing to its capability to fabricate micro features on any electrically conductive material irrespective of its material properties. Despite its wide acceptability, the process is adversely affected by issues such as wear of the tool electrode, which results in the generation of inaccurate features. Micro ED milling, a process variant in which the tool electrode is simultaneously rotated and scanned during machining, is reported to have high process efficiency for the generation of complicated 3D shapes and features with relatively low electrode wear intensity. In the present study an attempt has been made to study the effect of two process parameters, namely the capacitance and the scanning speed of the tool electrode, on the end wear of the tool electrode and the overcut of micro slots generated by micro ED milling. The experiments were conducted on Al 1100 alloy with a tungsten electrode having a diameter of 300 μm. Results suggest that wear of the tool electrode and overcut of the generated micro features are highly influenced by the level of capacitance employed during machining. For the parameter ranges employed in the present study, however, no significant effect of scanning speed was observed on either response.
Interactive Inverse Design Optimization of Fuselage Shape for Low-Boom Supersonic Concepts
NASA Technical Reports Server (NTRS)
Li, Wu; Shields, Elwood; Le, Daniel
2008-01-01
This paper introduces a tool called BOSS (Boom Optimization using Smoothest Shape modifications). BOSS utilizes interactive inverse design optimization to develop a fuselage shape that yields a low-boom aircraft configuration. A fundamental reason for developing BOSS is the need to generate feasible low-boom conceptual designs that are appropriate for further refinement using computational fluid dynamics (CFD) based preliminary design methods. BOSS was not developed to provide a numerical solution to the inverse design problem. Instead, BOSS was intended to help designers find the right configuration among an infinite number of possible configurations that are equally good using any numerical figure of merit. BOSS uses the smoothest shape modification strategy for modifying the fuselage radius distribution at 100 or more longitudinal locations to find a smooth fuselage shape that reduces the discrepancies between the design and target equivalent area distributions over any specified range of effective distance. For any given supersonic concept (with wing, fuselage, nacelles, tails, and/or canards), a designer can examine the differences between the design and target equivalent areas, decide which part of the design equivalent area curve needs to be modified, choose a desirable rate for the reduction of the discrepancies over the specified range, and select a parameter for smoothness control of the fuselage shape. BOSS will then generate a fuselage shape based on the designer's inputs in a matter of seconds. Using BOSS, within a few hours, a designer can either generate a realistic fuselage shape that yields a supersonic configuration with a low-boom ground signature or quickly eliminate any configuration that cannot achieve low-boom characteristics with fuselage shaping alone. A conceptual design case study is documented to demonstrate how BOSS can be used to develop a low-boom supersonic concept from a low-drag supersonic concept. The paper also contains a study on how perturbations in the equivalent area distribution affect the ground signature shape and how new target area distributions for low-boom signatures can be constructed using superposition of equivalent area distributions derived from the Seebass-George-Darden (SGD) theory.
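The core operation described above, nudging the fuselage radius distribution so the design equivalent area approaches the target over a chosen range of effective distance while keeping the shape smooth, can be illustrated with a simple area-to-radius correction followed by smoothing. The sketch below is only a schematic of that idea under stated assumptions (an A = pi*r^2 mapping and moving-average smoothing); it is not the BOSS smoothest-shape-modification algorithm.

```python
import numpy as np

def adjust_fuselage_radius(x, radius, a_design, a_target,
                           x_range, reduction=0.5, smooth_pts=7):
    """
    Nudge the fuselage radius distribution so the design equivalent area moves
    `reduction` of the way toward the target over x_range, then smooth the
    correction with a simple moving average of width `smooth_pts`.
    """
    x, radius = np.asarray(x, float), np.asarray(radius, float)
    d_area = np.where((x >= x_range[0]) & (x <= x_range[1]),
                      reduction * (np.asarray(a_target) - np.asarray(a_design)),
                      0.0)
    # For A = pi * r^2, a small area change dA maps to dr ~ dA / (2 * pi * r).
    d_r = d_area / (2.0 * np.pi * np.maximum(radius, 1e-6))
    kernel = np.ones(smooth_pts) / smooth_pts
    d_r_smooth = np.convolve(d_r, kernel, mode="same")
    return radius + d_r_smooth

if __name__ == "__main__":
    x = np.linspace(0.0, 100.0, 101)
    r = 1.5 + 0.5 * np.sin(np.pi * x / 100.0)            # notional fuselage radius
    a_design = np.pi * r**2
    a_target = a_design - 0.3 * np.exp(-((x - 60.0) / 10.0)**2)  # less area near x = 60
    r_new = adjust_fuselage_radius(x, r, a_design, a_target, x_range=(40.0, 80.0))
    print("max radius change:", float(np.max(np.abs(r_new - r))))
```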
Park, S B; Kim, H; Yao, M; Ellis, R; Machtay, M; Sohn, J W
2012-06-01
To quantify the systematic error of a Deformable Image Registration (DIR) system and establish a Quality Assurance (QA) procedure. To address the shortfall of the landmark approach, which is only available at significant visible feature points, we adopted a Deformation Vector Map (DVM) comparison approach. We used two CT image sets (R and T image sets) taken for the same patient at different times and generated a DVM, which includes the DIR systematic error. The DVM was calculated using fine-tuned B-Spline DIR and an L-BFGS optimizer. By utilizing this DVM we generated an R' image set to eliminate the systematic error in the DVM. Thus, we have a truth data set, the R' and T image sets, and the truth DVM. To test a DIR system, we supply the R' and T image sets to it. We compare the test DVM to the truth DVM. If there is no systematic error, they should be identical. We built a Deformation Error Histogram (DEH) for quantitative analysis. The test registration was performed with an in-house B-Spline DIR system using a stochastic gradient descent optimizer. Our example data set was generated with a head and neck patient case. We also tested CT to CBCT deformable registration. We found that skin regions which interface with the air have relatively larger errors. Mobile joints such as the shoulders also had larger errors. Average errors for ROIs were as follows; CTV: 0.4 mm, brain stem: 1.4 mm, shoulders: 1.6 mm, and normal tissues: 0.7 mm. We succeeded in building the DEH approach to quantify the DVM uncertainty. Our data sets are available for testing other systems on our web page. Utilizing the DEH, users can decide how much systematic error they will accept. The DEH and our data can be a tool for an AAPM task group to compose a DIR system QA guideline. This project is partially supported by the Agency for Healthcare Research and Quality (AHRQ) grant 1R18HS017424-01A2. © 2012 American Association of Physicists in Medicine.
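The comparison step reduces to taking the per-voxel magnitude of the difference between the truth and test displacement fields and binning those magnitudes into a histogram. The sketch below illustrates this with synthetic displacement fields; the array shapes, bin width, and ROI are assumptions for demonstration, not the authors' setup.

```python
import numpy as np

def deformation_error(truth_dvm, test_dvm):
    """Per-voxel magnitude (mm) of the difference between two displacement fields."""
    diff = np.asarray(test_dvm, float) - np.asarray(truth_dvm, float)
    return np.linalg.norm(diff, axis=-1)

def deformation_error_histogram(error, bin_width=0.5, max_err=10.0):
    """Histogram of error magnitudes (the 'DEH' used for quantitative analysis)."""
    bins = np.arange(0.0, max_err + bin_width, bin_width)
    counts, edges = np.histogram(error, bins=bins)
    return counts, edges

if __name__ == "__main__":
    # Synthetic 3-D displacement fields (shape: z, y, x, 3), values in mm.
    rng = np.random.default_rng(0)
    truth = rng.normal(0.0, 2.0, size=(20, 32, 32, 3))
    test = truth + rng.normal(0.0, 0.6, size=truth.shape)   # registration error ~1 mm
    err = deformation_error(truth, test)
    counts, edges = deformation_error_histogram(err)
    roi = np.zeros(err.shape, dtype=bool); roi[5:15, 8:24, 8:24] = True
    print(f"mean error (all voxels): {err.mean():.2f} mm")
    print(f"mean error (ROI):        {err[roi].mean():.2f} mm")
    print("first few histogram bins:", counts[:5])
```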
Self-organized flexible leadership promotes collective intelligence in human groups
Kurvers, Ralf H. J. M.; Wolf, Max; Naguib, Marc; Krause, Jens
2015-01-01
Collective intelligence refers to the ability of groups to outperform individual decision-makers. At present, relatively little is known about the mechanisms promoting collective intelligence in natural systems. We here test a novel mechanism generating collective intelligence: self-organization according to information quality. We tested this mechanism by performing simulated predator detection experiments using human groups. By continuously tracking the personal information of all members prior to collective decisions, we found that individuals adjusted their response time during collective decisions to the accuracy of their personal information. When individuals possessed accurate personal information, they decided quickly during collective decisions providing accurate information to the other group members. By contrast, when individuals had inaccurate personal information, they waited longer, allowing them to use social information before making a decision. Individuals deciding late during collective decisions had an increased probability of changing their decision leading to increased collective accuracy. Our results thus show that groups can self-organize according to the information accuracy of their members, thereby promoting collective intelligence. Interestingly, we find that individuals flexibly acted both as leader and as follower depending on the quality of their personal information at any particular point in time. PMID:27019718
Tools to support evidence-informed public health decision making.
Yost, Jennifer; Dobbins, Maureen; Traynor, Robyn; DeCorby, Kara; Workentine, Stephanie; Greco, Lori
2014-07-18
Public health professionals are increasingly expected to engage in evidence-informed decision making to inform practice and policy decisions. Evidence-informed decision making involves the use of research evidence along with expertise, existing public health resources, knowledge about community health issues, the local context and community, and the political climate. The National Collaborating Centre for Methods and Tools has identified a seven step process for evidence-informed decision making. Tools have been developed to support public health professionals as they work through each of these steps. This paper provides an overview of tools used in three Canadian public health departments involved in a study to develop capacity for evidence-informed decision making. As part of a knowledge translation and exchange intervention, a Knowledge Broker worked with public health professionals to identify and apply tools for use with each of the steps of evidence-informed decision making. The Knowledge Broker maintained a reflective journal and interviews were conducted with a purposive sample of decision makers and public health professionals. This paper presents qualitative analysis of the perceived usefulness and usability of the tools. Tools were used in the health departments to assist in: question identification and clarification; searching for the best available research evidence; assessing the research evidence for quality through critical appraisal; deciphering the 'actionable message(s)' from the research evidence; tailoring messages to the local context to ensure their relevance and suitability; deciding whether and planning how to implement research evidence in the local context; and evaluating the effectiveness of implementation efforts. Decision makers provided descriptions of how the tools were used within the health departments and made suggestions for improvement. Overall, the tools were perceived as valuable for advancing and sustaining evidence-informed decision making. Tools are available to support the process of evidence-informed decision making among public health professionals. The usability and usefulness of these tools for advancing and sustaining evidence-informed decision making are discussed, including recommendations for the tools' application in other public health settings beyond this study. Knowledge and awareness of these tools may assist other health professionals in their efforts to implement evidence-informed practice.
Rates for backup service under PURPA when the supplying utility has excess generating capacity
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Under PURPA, cogenerators are entitled to receive backup service. It is often said that tariffs for backup service should reflect the low probability that an unscheduled outage will occur during system peak. This memorandum concludes that probabilistic analysis of contribution to coincident peak demand is not relevant under PURPA during periods in which a utility system is experiencing generating capacity surpluses, and that in such situations backup rates should be designed so that, should the customer insist on installing a cogeneration system, the customer's contribution to fixed costs remains constant. The reason for this is to assure that prospective cogenerators receive appropriate pricing signals in their assessment of proposed cogeneration projects, and, should they decide to install cogeneration facilities requiring backup service, to hold the remaining customers on the system harmless.
On the multifractal effects generated by monofractal signals
NASA Astrophysics Data System (ADS)
Grech, Dariusz; Pamuła, Grzegorz
2013-12-01
We study quantitatively the level of false multifractal signal one may encounter while analyzing multifractal phenomena in time series within multifractal detrended fluctuation analysis (MF-DFA). The investigated effect appears as a result of the finite length of the data series used and is additionally amplified by any long-term memory the data may contain. We provide a detailed quantitative description of such apparent multifractal background signal as a threshold in the spread of generalized Hurst exponent values Δh, or a threshold in the width of the multifractal spectrum Δα, below which multifractal properties of the system are only apparent, i.e. do not exist, despite Δα≠0 or Δh≠0. We find this effect quite important for shorter or persistent series, and we argue it is linear with respect to the autocorrelation exponent γ. Its strength decays according to a power law with respect to the length of the time series. The influence of basic linear and nonlinear transformations applied to the initial data in finite time series with various levels of long memory is also investigated. This provides an additional set of semi-analytical results. The obtained formulas are significant in any interdisciplinary application of multifractality, including physics, financial data analysis or physiology, because they allow one to separate the ‘true’ multifractal phenomena from the apparent (artificial) multifractal effects. They should be a helpful first-choice tool for deciding whether, in a particular case, we are dealing with a signal with real multiscaling properties or not.
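The quantity Δh discussed above is the spread of the generalized Hurst exponents h(q) obtained from MF-DFA. The sketch below is a deliberately compact MF-DFA implementation (profile, non-overlapping segments, polynomial detrending, q-th order fluctuation functions, log-log slopes); the choice of q values, scales, and a white-noise test series are illustrative assumptions, and the code is not the authors' implementation. For a finite monofractal series it will typically return a nonzero Δh, which is exactly the apparent multifractality the paper quantifies.

```python
import numpy as np

def mfdfa_hurst(x, q_values, scales, poly_order=1):
    """
    Minimal MF-DFA: returns generalized Hurst exponents h(q) estimated as the
    log-log slopes of the q-th order fluctuation functions F_q(s) versus scale s.
    """
    x = np.asarray(x, float)
    profile = np.cumsum(x - x.mean())
    h = []
    for q in q_values:
        log_f = []
        for s in scales:
            n_seg = len(profile) // s
            f2 = []
            for v in range(n_seg):
                seg = profile[v * s:(v + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, poly_order), t)
                f2.append(np.mean((seg - trend) ** 2))
            f2 = np.asarray(f2)
            if q != 0:
                fq = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
            else:
                fq = np.exp(0.5 * np.mean(np.log(f2)))   # q = 0 limit
            log_f.append(np.log(fq))
        h.append(np.polyfit(np.log(scales), log_f, 1)[0])
    return np.asarray(h)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    series = rng.normal(size=4096)               # monofractal white noise, h ~ 0.5
    qs = np.array([-5, -3, -1, 1, 3, 5], dtype=float)
    scales = np.array([16, 32, 64, 128, 256, 512])
    h_q = mfdfa_hurst(series, qs, scales)
    print("h(q):", np.round(h_q, 3))
    print(f"apparent multifractality width Delta h = {h_q.max() - h_q.min():.3f}")
```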
Energy analysis of holographic lenses for solar concentration
NASA Astrophysics Data System (ADS)
Marín-Sáez, Julia; Collados, M. Victoria; Chemisana, Daniel; Atencia, Jesús
2017-05-01
The use of volume and phase holographic elements in the design of photovoltaic solar concentrators has become very popular as an alternative solution to refractive systems, due to their high efficiency, low cost and possibilities of building integration. The angular and chromatic selectivity of volume holograms can affect their behavior as solar concentrators. In holographic lenses, angular and chromatic selectivity varies along the lens plane. Besides, considering that the holographic materials are not sensitive to the wavelengths for which the solar cells are most efficient, the reconstruction wavelength is usually different from the recording one. As a consequence, not all points of the lens work at the Bragg condition for a given incident direction or wavelength. A software tool that calculates the direction and efficiency of solar rays at the output of a volume holographic element has been developed in this study. It allows the analysis of the total energy that reaches the solar cell, taking into account the sun's movement, the solar spectrum and the sensitivity of the solar cell. The dependence of the collected energy on the recording wavelength is studied with this software. As the recording angle differs along a holographic lens, some zones of the lens might not act as a volume hologram. The efficiency at the transition zones between volume and thin behavior in lenses recorded in Bayfol HX is analyzed experimentally in order to decide whether the energy of the generated higher diffraction orders has to be included in the simulation.
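For orientation only, the sketch below estimates the diffraction efficiency of a lossless, unslanted volume transmission grating under angular detuning using Kogelnik's coupled-wave expressions. It is not the software described above; the layer thickness, index modulation and grating period are invented placeholders rather than Bayfol HX values, and the dephasing term is the simplified unslanted-grating form.

```python
import numpy as np

def kogelnik_efficiency_transmission(d, n1, n0, lam, Lambda, dtheta=0.0, dlam=0.0):
    """Diffraction efficiency of a lossless, unslanted volume transmission grating
    (Kogelnik coupled-wave theory, s-polarization) for small angular (dtheta, rad,
    inside the medium) and wavelength (dlam) deviations from the Bragg condition."""
    K = 2.0 * np.pi / Lambda                        # grating vector magnitude
    theta_B = np.arcsin(lam / (2.0 * n0 * Lambda))  # internal Bragg angle
    c = np.cos(theta_B)
    nu = np.pi * n1 * d / (lam * c)                 # coupling strength
    vartheta = dtheta * K * c - dlam * K**2 / (4.0 * np.pi * n0)  # dephasing
    xi = vartheta * d / (2.0 * c)
    return np.sin(np.sqrt(nu**2 + xi**2))**2 / (1.0 + (xi / nu)**2)

# Illustrative (assumed) parameters: 16 um thick layer, index modulation 0.02,
# 0.5 um grating period, reconstruction at 633 nm in a medium of index 1.5.
d, n1, n0, lam, Lambda = 16e-6, 0.02, 1.5, 633e-9, 0.5e-6
for dth_deg in (0.0, 0.5, 1.0, 2.0):
    eta = kogelnik_efficiency_transmission(d, n1, n0, lam, Lambda,
                                           dtheta=np.radians(dth_deg))
    print(f"delta_theta = {dth_deg:>4} deg -> efficiency = {eta:.3f}")
```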
Forks in the road: choices in procedures for designing wildland linkages.
Beier, Paul; Majka, Daniel R; Spencer, Wayne D
2008-08-01
Models are commonly used to identify lands that will best maintain the ability of wildlife to move between wildland blocks through matrix lands after the remaining matrix has become incompatible with wildlife movement. We offer a roadmap of 16 choices and assumptions that arise in designing linkages to facilitate movement or gene flow of focal species between 2 or more predefined wildland blocks. We recommend designing linkages to serve multiple (rather than one) focal species likely to serve as a collective umbrella for all native species and ecological processes, explicitly acknowledging untested assumptions, and using uncertainty analysis to illustrate potential effects of model uncertainty. Such uncertainty is best displayed to stakeholders as maps of modeled linkages under different assumptions. We also recommend modeling corridor dwellers (species that require more than one generation to move their genes between wildland blocks) differently from passage species (for which an individual can move between wildland blocks within a few weeks). We identify a problem, which we call the subjective translation problem, that arises because the analyst must subjectively decide how to translate measurements of resource selection into resistance. This problem can be overcome by estimating resistance from observations of animal movement, genetic distances, or interpatch movements. There is room for substantial improvement in the procedures used to design linkages robust to climate change and in tools that allow stakeholders to compare an optimal linkage design to alternative designs that minimize costs or achieve other conservation goals.
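As a purely generic illustration of the resistance-surface modeling mentioned above (not the authors' procedure or software), the sketch below computes cumulative least-cost distances from two wildland blocks over a toy resistance raster and repeats the calculation under alternative, subjectively chosen translations of habitat suitability into resistance, mirroring the subjective translation problem.

```python
import heapq
import numpy as np

def cost_distance(resistance, sources):
    """Cumulative least-cost distance from a set of source cells over a
    resistance raster (4-neighbour moves, cost = mean resistance of both cells)."""
    rows, cols = resistance.shape
    dist = np.full((rows, cols), np.inf)
    heap = []
    for r, c in sources:
        dist[r, c] = 0.0
        heapq.heappush(heap, (0.0, r, c))
    while heap:
        d, r, c = heapq.heappop(heap)
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + 0.5 * (resistance[r, c] + resistance[nr, nc])
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    heapq.heappush(heap, (nd, nr, nc))
    return dist

# Toy landscape: habitat suitability in [0, 1]; several possible translations
# into resistance (the subjective step the authors highlight).
rng = np.random.default_rng(1)
suitability = rng.random((60, 80))
for exponent in (1, 2, 4):                 # alternative translation assumptions
    resistance = (1.0 - suitability) ** exponent * 99 + 1
    from_a = cost_distance(resistance, [(r, 0) for r in range(60)])    # block A: west edge
    from_b = cost_distance(resistance, [(r, 79) for r in range(60)])   # block B: east edge
    corridor = from_a + from_b             # low values = best linkage zone
    best = corridor <= np.percentile(corridor, 5)   # e.g. keep the best 5% of cells
    print(f"exponent {exponent}: corridor covers {best.sum()} cells")
```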
Economics In Optical Design, Analysis, And Production
NASA Astrophysics Data System (ADS)
Willey, Ronald R.
1983-10-01
There are indications that we are entering an era wherein economics will play an increasing role in the optical design and production process. Economics has always been a factor in the competition between commercial ventures in the product arena. Now, we may begin to see competition between different technologies for the scarce resources of society, including money. A proper design approach begins with a thorough examination and refinement of the requirements from the top down. The interrelationships of the various components must be properly understood and balanced. The specifications must be clear, complete, and realistic. Improper or incomplete system design can cause an extensive waste of resources. The detailed optical design to meet the performance requirements has sometimes been the only part of the process that the designer has considered his own responsibility. The final optimization should also consider economics-related factors: the cost of tolerances, and the available tools, test plates, materials, and test equipment. In the preliminary design stage, the designer should have decided which alignment and test methods are most appropriate to the system. The distribution of tolerances in an optical/mechanical system is a frequently neglected opportunity to reduce cost. We have reported previously on our work in this area, and expand further on it in the context of this paper. The designer now has an opportunity to generate better designs at a lower cost that are more economical to produce. The watchword for the 1980's may become the one found in the assembly automation industry: "more, better, for less".
The parser generator as a general purpose tool
NASA Technical Reports Server (NTRS)
Noonan, R. E.; Collins, W. R.
1985-01-01
The parser generator has proven to be an extremely useful, general purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Some of the application areas in which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.
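As a rough illustration of the kind of small grammar such a tool handles, the sketch below parses a toy query language. A hand-written recursive-descent parser stands in here for the table-driven parser a generator would emit; the grammar itself is invented for illustration.

```python
import re

# Toy grammar in the spirit of the examples above (query languages, menus):
#   query      -> "show" field_list "where" NAME "=" NUMBER
#   field_list -> NAME ("," NAME)*

def tokenize(text):
    return re.findall(r"[A-Za-z_]\w*|\d+|,|=", text)

def parse_query(text):
    """Recursive-descent parser returning a small dict 'parse tree'."""
    tokens = tokenize(text)
    pos = 0

    def expect(pred, what):
        nonlocal pos
        if pos >= len(tokens) or not pred(tokens[pos]):
            raise SyntaxError(f"expected {what} at token {pos}")
        pos += 1
        return tokens[pos - 1]

    expect(lambda t: t == "show", "'show'")
    fields = [expect(str.isidentifier, "field name")]
    while pos < len(tokens) and tokens[pos] == ",":
        pos += 1
        fields.append(expect(str.isidentifier, "field name"))
    expect(lambda t: t == "where", "'where'")
    name = expect(str.isidentifier, "field name")
    expect(lambda t: t == "=", "'='")
    value = int(expect(str.isdigit, "number"))
    return {"fields": fields, "condition": (name, "=", value)}

print(parse_query("show name, grade where year = 1985"))
```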
Graphics processing unit (GPU) real-time infrared scene generation
NASA Astrophysics Data System (ADS)
Christie, Chad L.; Gouthas, Efthimios (Themie); Williams, Owen M.
2007-04-01
VIRSuite, the GPU-based suite of software tools developed at DSTO for real-time infrared scene generation, is described. The tools include the painting of scene objects with radiometrically-associated colours, translucent object generation, polar plot validation and versatile scene generation. Special features include radiometric scaling within the GPU and the presence of zoom anti-aliasing at the core of VIRSuite. Extension of the zoom anti-aliasing construct to cover target embedding and the treatment of translucent objects is described.
2012-01-18
Publication-list fragment (only the following citations are recoverable from this record): … Ni, H. Morkoç, "Signature of hot phonons in reliability of nitride HFETs and signal delay," Acta Physica Polonica A 119(2), 225-227 (2011); … "… lines in AlInN/GaN heterostructures," Acta Physica Polonica A 119(2), 173-175 (2011); J. H. Leach, M. Wu, H. Morkoç, M. Ramonas, A. Matulionis, L. Ardaravičius, O. Kiprijanovič, and J. Liberis, "Hot-Phonon Decided Carrier Velocity in AlInN/GaN Based Two-Dimensional Channels," Acta Physica Polonica A.
Microbubble cloud characterization by nonlinear frequency mixing.
Cavaro, M; Payan, C; Moysan, J; Baqué, F
2011-05-01
In the framework of the fourth generation forum, France decided to develop sodium-cooled fast nuclear reactors. The French Safety Authority requires the associated monitoring of argon gas entrained in the sodium. This implies estimating the void fraction and a histogram describing the bubble population. In this context, the present letter studies the possibility of achieving an accurate determination of the histogram with acoustic methods. A nonlinear, two-frequency mixing technique has been implemented, and a specific optical device has been developed in order to validate the experimental results. The acoustically reconstructed histograms are in excellent agreement with those obtained using optical methods.
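The link between a detected resonance frequency and a bubble radius is commonly made through the Minnaert relation; the sketch below shows that mapping with illustrative water-like liquid properties (not the sodium parameters of the study) and with surface tension and viscosity neglected.

```python
import numpy as np

def minnaert_radius(f0, p0=101325.0, rho=1000.0, gamma=1.4):
    """Bubble radius (m) whose Minnaert resonance frequency is f0 (Hz).
    p0: static pressure (Pa), rho: liquid density (kg/m^3),
    gamma: heat-capacity ratio of the gas. Simple lossless form."""
    return np.sqrt(3.0 * gamma * p0 / rho) / (2.0 * np.pi * f0)

def minnaert_frequency(radius, p0=101325.0, rho=1000.0, gamma=1.4):
    """Inverse relation: resonance frequency (Hz) of a bubble of given radius (m)."""
    return np.sqrt(3.0 * gamma * p0 / rho) / (2.0 * np.pi * radius)

# Sweeping the pumping frequency and noting where the mixing products peak gives
# a set of resonance frequencies; mapping each back to a radius yields the
# bubble-size histogram. Illustrative values for a water-like liquid:
for f0_khz in (10, 30, 100, 300):
    r_um = minnaert_radius(f0_khz * 1e3) * 1e6
    print(f"resonance {f0_khz:>4} kHz -> radius ~ {r_um:7.1f} um")
```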
36 CFR 215.8 - Appeal Deciding Officer.
Code of Federal Regulations, 2010 CFR
2010-07-01
... names, the Appeal Deciding Officer shall identify all qualified appellants (§ 215.13). (ii) The Appeal Deciding Officer may appoint the first name listed as the lead appellant (§ 215.2) to act on behalf of all parties to that appeal when the appeal does not specify a lead appellant (§ 215.14(b)(3)). (3) Appeal...
36 CFR 215.8 - Appeal Deciding Officer.
Code of Federal Regulations, 2011 CFR
2011-07-01
... names, the Appeal Deciding Officer shall identify all qualified appellants (§ 215.13). (ii) The Appeal Deciding Officer may appoint the first name listed as the lead appellant (§ 215.2) to act on behalf of all parties to that appeal when the appeal does not specify a lead appellant (§ 215.14(b)(3)). (3) Appeal...
20 CFR 670.200 - Who decides where Job Corps centers will be located?
Code of Federal Regulations, 2014 CFR
2014-04-01
... 20 Employees' Benefits 4 2014-04-01 2014-04-01 false Who decides where Job Corps centers will be... LABOR (CONTINUED) THE JOB CORPS UNDER TITLE I OF THE WORKFORCE INVESTMENT ACT Site Selection and Protection and Maintenance of Facilities § 670.200 Who decides where Job Corps centers will be located? (a...
13 CFR 124.1008 - When will SBA not decide an SDB protest?
Code of Federal Regulations, 2013 CFR
2013-01-01
... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false When will SBA not decide an SDB... SDB protest? (a) SBA will not decide a protest as to disadvantaged status of any concern other than... protested concern's circumstances have materially changed since SBA certified it as an SDB, or that the...
13 CFR 124.1008 - When will SBA not decide an SDB protest?
Code of Federal Regulations, 2014 CFR
2014-01-01
... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false When will SBA not decide an SDB... SDB protest? (a) SBA will not decide a protest as to disadvantaged status of any concern other than... protested concern's circumstances have materially changed since SBA certified it as an SDB, or that the...
13 CFR 124.1008 - When will SBA not decide an SDB protest?
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false When will SBA not decide an SDB... SDB protest? (a) SBA will not decide a protest as to disadvantaged status of any concern other than... protested concern's circumstances have materially changed since SBA certified it as an SDB, or that the...
13 CFR 124.1008 - When will SBA not decide an SDB protest?
Code of Federal Regulations, 2012 CFR
2012-01-01
... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false When will SBA not decide an SDB... SDB protest? (a) SBA will not decide a protest as to disadvantaged status of any concern other than... protested concern's circumstances have materially changed since SBA certified it as an SDB, or that the...
13 CFR 124.1008 - When will SBA not decide an SDB protest?
Code of Federal Regulations, 2011 CFR
2011-01-01
... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false When will SBA not decide an SDB... SDB protest? (a) SBA will not decide a protest as to disadvantaged status of any concern other than... protested concern's circumstances have materially changed since SBA certified it as an SDB, or that the...
78 FR 38811 - Small Business Size and Status Integrity
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-28
... made by a judge, jury, or other decider of fact. Given the fact-specific nature of such a finding, SBA... certification is a factual determination best made by a judge, jury, or other decider of fact. One commenter..., jury or other decider of fact. SBA has made minor wording changes in the limitation of liability...
Career Cruising Impact on the Self Efficacy of Deciding Majors
ERIC Educational Resources Information Center
Smother, Anthony William
2012-01-01
The purpose of this study was to analyze the impact of "Career Cruising"© on the self-efficacy of deciding majors in a university setting. The self-assessment instrument "Career Cruising"© was used to measure career decision-making self-efficacy in a pre- and post-test with deciding majors. The independent…
20 CFR 416.1881 - Deciding whether someone is your parent or stepparent.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Deciding whether someone is your parent or... SECURITY INCOME FOR THE AGED, BLIND, AND DISABLED Relationship Who Is Considered Your Parent § 416.1881 Deciding whether someone is your parent or stepparent. (a) We consider your parent to be— (1) Your natural...
ERIC Educational Resources Information Center
Oregon Univ., Eugene. Center for Educational Policy and Management.
This workshop presenter's guide is intended for use by administrators in training one another in the Project Leadership program developed by the Association of California School Administrators (ACSA). The purposes of the guide are: to provide administrators with a framework for deciding when others (particularly subordinates) should participate in…
13 CFR 126.604 - Who decides if a contract opportunity for HUBZone set-aside competition exists?
Code of Federal Regulations, 2010 CFR
2010-01-01
... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Who decides if a contract opportunity for HUBZone set-aside competition exists? 126.604 Section 126.604 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION HUBZONE PROGRAM Contractual Assistance § 126.604 Who decides if a contract...
Tang, Hongying Lilian; Goh, Jonathan; Peto, Tunde; Ling, Bingo Wing-Kuen; Al Turk, Lutfiah Ismail; Hu, Yin; Wang, Su; Saleh, George Michael
2013-01-01
In any diabetic retinopathy screening program, about two-thirds of patients have no retinopathy. However, on average, it takes a human expert about one and a half times longer to decide an image is normal than to recognize an abnormal case with obvious features. In this work, we present an automated system for filtering out normal cases to facilitate a more effective use of grading time. The key aim with any such tool is to achieve high sensitivity and specificity to ensure patients' safety and service efficiency. There are many challenges to overcome, given the variation of images and characteristics to identify. The system combines computed evidence obtained from various processing stages, including segmentation of candidate regions, classification and contextual analysis through Hidden Markov Models. Furthermore, evolutionary algorithms are employed to optimize the Hidden Markov Models, feature selection and heterogeneous ensemble classifiers. In order to evaluate its capability of identifying normal images across diverse populations, a population-oriented study was undertaken comparing the software's output to grading by humans. In addition, population based studies collect large numbers of images on subjects expected to have no abnormality. These studies expect timely and cost-effective grading. Altogether 9954 previously unseen images taken from various populations were tested. All test images were masked so the automated system had not been exposed to them before. This system was trained using image subregions taken from about 400 sample images. Sensitivities of 92.2% and specificities of 90.4% were achieved varying between populations and population clusters. Of all images the automated system decided to be normal, 98.2% were true normal when compared to the manual grading results. These results demonstrate scalability and strong potential of such an integrated computational intelligence system as an effective tool to assist a grading service.
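For reference, the quoted sensitivity, specificity and "true normal" share (a negative predictive value) all derive from a 2x2 confusion matrix. The sketch below reproduces figures of that order from back-calculated, hypothetical counts totalling 9954 images; these are for illustration only and are not the study's actual breakdown.

```python
def screening_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and negative predictive value from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)   # abnormal images correctly flagged
    specificity = tn / (tn + fp)   # normal images correctly passed as normal
    npv = tn / (tn + fn)           # share of 'normal' calls that are truly normal
    return sensitivity, specificity, npv

# Hypothetical counts back-calculated from the quoted rates (NPV depends strongly
# on how many truly abnormal images are in the test set):
tp, fn, tn, fp = 1606, 136, 7424, 788
sens, spec, npv = screening_metrics(tp, fn, tn, fp)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} NPV={npv:.1%}")
```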
Tools to support evidence-informed public health decision making
2014-01-01
Background Public health professionals are increasingly expected to engage in evidence-informed decision making to inform practice and policy decisions. Evidence-informed decision making involves the use of research evidence along with expertise, existing public health resources, knowledge about community health issues, the local context and community, and the political climate. The National Collaborating Centre for Methods and Tools has identified a seven step process for evidence-informed decision making. Tools have been developed to support public health professionals as they work through each of these steps. This paper provides an overview of tools used in three Canadian public health departments involved in a study to develop capacity for evidence-informed decision making. Methods As part of a knowledge translation and exchange intervention, a Knowledge Broker worked with public health professionals to identify and apply tools for use with each of the steps of evidence-informed decision making. The Knowledge Broker maintained a reflective journal and interviews were conducted with a purposive sample of decision makers and public health professionals. This paper presents qualitative analysis of the perceived usefulness and usability of the tools. Results Tools were used in the health departments to assist in: question identification and clarification; searching for the best available research evidence; assessing the research evidence for quality through critical appraisal; deciphering the ‘actionable message(s)’ from the research evidence; tailoring messages to the local context to ensure their relevance and suitability; deciding whether and planning how to implement research evidence in the local context; and evaluating the effectiveness of implementation efforts. Decision makers provided descriptions of how the tools were used within the health departments and made suggestions for improvement. Overall, the tools were perceived as valuable for advancing and sustaining evidence-informed decision making. Conclusion Tools are available to support the process of evidence-informed decision making among public health professionals. The usability and usefulness of these tools for advancing and sustaining evidence-informed decision making are discussed, including recommendations for the tools’ application in other public health settings beyond this study. Knowledge and awareness of these tools may assist other health professionals in their efforts to implement evidence-informed practice. PMID:25034534
FY16 Status Report on NEAMS Neutronics Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Shemon, E. R.; Smith, M. A.
2016-09-30
The goal of the NEAMS neutronics effort is to develop a neutronics toolkit for use on sodium-cooled fast reactors (SFRs) which can be extended to other reactor types. The neutronics toolkit includes the high-fidelity deterministic neutron transport code PROTEUS and many supporting tools such as the cross section generation code MC2-3, a cross section library generation code, alternative cross section generation tools, mesh generation and conversion utilities, and an automated regression test tool. The FY16 effort for NEAMS neutronics focused on supporting the release of the SHARP toolkit and existing and new users, continuing to develop PROTEUS functions necessary for performance improvement as well as the SHARP release, verifying PROTEUS against available existing benchmark problems, and developing new benchmark problems as needed. The FY16 research effort was focused on further updates of PROTEUS-SN and PROTEUS-MOCEX and cross section generation capabilities as needed.
Generating Systems Biology Markup Language Models from the Synthetic Biology Open Language.
Roehner, Nicholas; Zhang, Zhen; Nguyen, Tramy; Myers, Chris J
2015-08-21
In the context of synthetic biology, model generation is the automated process of constructing biochemical models based on genetic designs. This paper discusses the use cases for model generation in genetic design automation (GDA) software tools and introduces the foundational concepts of standards and model annotation that make this process useful. Finally, this paper presents an implementation of model generation in the GDA software tool iBioSim and provides an example of generating a Systems Biology Markup Language (SBML) model from a design of a 4-input AND sensor written in the Synthetic Biology Open Language (SBOL).
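As a minimal illustration of programmatic SBML model generation (assuming the python-libsbml bindings are available, and without attempting the SBOL-to-SBML mapping performed by iBioSim), one might build a skeleton model as follows; all identifiers are invented.

```python
import libsbml  # python-libsbml; assumed to be installed

def build_minimal_sbml():
    """Build a tiny SBML Level 3 model programmatically. This is only a skeleton
    (no reactions, rules or units), not the iBioSim SBOL-to-SBML generation."""
    doc = libsbml.SBMLDocument(3, 1)
    model = doc.createModel()
    model.setId("and_sensor_sketch")

    comp = model.createCompartment()
    comp.setId("cell")
    comp.setConstant(True)
    comp.setSize(1.0)

    for sid in ("input_A", "input_B", "output_Y"):
        sp = model.createSpecies()
        sp.setId(sid)
        sp.setCompartment("cell")
        sp.setInitialAmount(0.0)
        sp.setHasOnlySubstanceUnits(False)
        sp.setBoundaryCondition(False)
        sp.setConstant(False)

    return libsbml.writeSBMLToString(doc)

print(build_minimal_sbml()[:200], "...")
```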
Reed, Shelby D; Li, Yanhong; Kamble, Shital; Polsky, Daniel; Graham, Felicia L; Bowers, Margaret T; Samsa, Gregory P; Paul, Sara; Schulman, Kevin A; Whellan, David J; Riegel, Barbara J
2012-01-01
Patient-centered health care interventions, such as heart failure disease management programs, are under increasing pressure to demonstrate good value. Variability in costing methods and assumptions in economic evaluations of such interventions limit the comparability of cost estimates across studies. Valid cost estimation is critical to conducting economic evaluations and for program budgeting and reimbursement negotiations. Using sound economic principles, we developed the Tools for Economic Analysis of Patient Management Interventions in Heart Failure (TEAM-HF) Costing Tool, a spreadsheet program that can be used by researchers and health care managers to systematically generate cost estimates for economic evaluations and to inform budgetary decisions. The tool guides users on data collection and cost assignment for associated personnel, facilities, equipment, supplies, patient incentives, miscellaneous items, and start-up activities. The tool generates estimates of total program costs, cost per patient, and cost per week and presents results using both standardized and customized unit costs for side-by-side comparisons. Results from pilot testing indicated that the tool was well-formatted, easy to use, and followed a logical order. Cost estimates of a 12-week exercise training program in patients with heart failure were generated with the costing tool and were found to be consistent with estimates published in a recent study. The TEAM-HF Costing Tool could prove to be a valuable resource for researchers and health care managers to generate comprehensive cost estimates of patient-centered interventions in heart failure or other conditions for conducting high-quality economic evaluations and making well-informed health care management decisions.
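The roll-up such a costing tool performs can be sketched in a few lines; the cost categories follow the list above, but every figure here is invented for illustration and is not a TEAM-HF estimate.

```python
def program_cost_summary(cost_items, n_patients, n_weeks):
    """Roll up itemized program costs into total, per-patient and per-week figures."""
    total = sum(cost_items.values())
    return {
        "total": total,
        "per_patient": total / n_patients,
        "per_week": total / n_weeks,
        "by_category": cost_items,
    }

# Illustrative (made-up) cost components for a 12-week exercise training program:
items = {
    "personnel": 38000.0,
    "facilities": 6000.0,
    "equipment": 4500.0,
    "supplies": 1200.0,
    "patient_incentives": 2500.0,
    "start_up": 3000.0,
}
summary = program_cost_summary(items, n_patients=50, n_weeks=12)
print(f"total ${summary['total']:,.0f}, "
      f"${summary['per_patient']:,.0f} per patient, "
      f"${summary['per_week']:,.0f} per week")
```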
20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?
Code of Federal Regulations, 2014 CFR
2014-04-01
... later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1, an...
20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?
Code of Federal Regulations, 2013 CFR
2013-04-01
... later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1, an...
25 CFR 162.566 - How will BIA decide whether to approve a WSR lease?
Code of Federal Regulations, 2013 CFR
2013-04-01
... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve a WSR lease? 162.566 Section 162.566 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Approval § 162.566 How will BIA decide whether to...
25 CFR 162.566 - How will BIA decide whether to approve a WSR lease?
Code of Federal Regulations, 2014 CFR
2014-04-01
... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve a WSR lease? 162.566 Section 162.566 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Wsr Lease Approval § 162.566 How will BIA decide whether to...
25 CFR 162.531 - How will BIA decide whether to approve a WEEL?
Code of Federal Regulations, 2013 CFR
2013-04-01
... 25 Indians 1 2013-04-01 2013-04-01 false How will BIA decide whether to approve a WEEL? 162.531 Section 162.531 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Weel Approval § 162.531 How will BIA decide whether to approve a...
25 CFR 162.531 - How will BIA decide whether to approve a WEEL?
Code of Federal Regulations, 2014 CFR
2014-04-01
... 25 Indians 1 2014-04-01 2014-04-01 false How will BIA decide whether to approve a WEEL? 162.531 Section 162.531 Indians BUREAU OF INDIAN AFFAIRS, DEPARTMENT OF THE INTERIOR LAND AND WATER LEASES AND PERMITS Wind and Solar Resource Leases Weel Approval § 162.531 How will BIA decide whether to approve a...
Code of Federal Regulations, 2012 CFR
2012-01-01
... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15 days...
Code of Federal Regulations, 2013 CFR
2013-01-01
... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15 days...
Code of Federal Regulations, 2012 CFR
2012-01-01
... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it must...
Code of Federal Regulations, 2010 CFR
2010-01-01
... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15 days...
Code of Federal Regulations, 2011 CFR
2011-01-01
... decides to sell acquired agricultural real estate at a public auction? 617.7620 Section 617.7620 Banks and... What should the System institution do when it decides to sell acquired agricultural real estate at a public auction? System institutions electing to sell or lease acquired agricultural real estate or a...
Code of Federal Regulations, 2013 CFR
2013-01-01
... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it must...
Code of Federal Regulations, 2014 CFR
2014-01-01
... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it must...
Code of Federal Regulations, 2014 CFR
2014-01-01
... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15 days...
Code of Federal Regulations, 2011 CFR
2011-01-01
... decides to sell acquired agricultural real estate? 617.7610 Section 617.7610 Banks and Banking FARM CREDIT... institution do when it decides to sell acquired agricultural real estate? (a) Notify the previous owner, (1) Within 15 days of the System institution's decision to sell acquired agricultural real estate, it must...
Code of Federal Regulations, 2010 CFR
2010-01-01
... decides to sell acquired agricultural real estate at a public auction? 617.7620 Section 617.7620 Banks and... What should the System institution do when it decides to sell acquired agricultural real estate at a public auction? System institutions electing to sell or lease acquired agricultural real estate or a...
Code of Federal Regulations, 2011 CFR
2011-01-01
... decides to lease acquired agricultural real estate? 617.7615 Section 617.7615 Banks and Banking FARM... the System institution do when it decides to lease acquired agricultural real estate? (a) Notify the... real estate at a rate equivalent to the appraised rental value of the property. (1) Within 15 days...
Code of Federal Regulations, 2014 CFR
2014-04-01
... INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process... hearing process we decide to encumber your IIM account because of an administrative error which resulted... process we decide to encumber your IIM account because of an administrative error which resulted in funds...
Code of Federal Regulations, 2013 CFR
2013-04-01
... INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process... hearing process we decide to encumber your IIM account because of an administrative error which resulted... process we decide to encumber your IIM account because of an administrative error which resulted in funds...
Code of Federal Regulations, 2011 CFR
2011-04-01
... INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process... hearing process we decide to encumber your IIM account because of an administrative error which resulted... process we decide to encumber your IIM account because of an administrative error which resulted in funds...
Code of Federal Regulations, 2012 CFR
2012-04-01
... INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process... hearing process we decide to encumber your IIM account because of an administrative error which resulted... process we decide to encumber your IIM account because of an administrative error which resulted in funds...
Code of Federal Regulations, 2010 CFR
2010-04-01
... hearing process we decide to encumber your IIM account because of an administrative error which resulted... process we decide to encumber your IIM account because of an administrative error which resulted in funds... INTERIOR FINANCIAL ACTIVITIES TRUST FUNDS FOR TRIBES AND INDIVIDUAL INDIANS IIM Accounts: Hearing Process...
32 CFR 1653.4 - File to be returned after appeal to the President is decided.
Code of Federal Regulations, 2011 CFR
2011-07-01
... SELECTIVE SERVICE SYSTEM APPEAL TO THE PRESIDENT § 1653.4 File to be returned after appeal to the President is decided. When the appeal to the President has been decided, the file shall be returned as... 32 National Defense 6 2011-07-01 2011-07-01 false File to be returned after appeal to the...
42 CFR 83.16 - How will the Secretary decide the outcome(s) of a petition?
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 1 2010-10-01 2010-10-01 false How will the Secretary decide the outcome(s) of a... AS MEMBERS OF THE SPECIAL EXPOSURE COHORT UNDER THE ENERGY EMPLOYEES OCCUPATIONAL ILLNESS... Secretary decide the outcome(s) of a petition? (a) The Director of NIOSH will propose a decision to add or...
ERIC Educational Resources Information Center
Yoon, Su-Youn; Lee, Chong Min; Houghton, Patrick; Lopez, Melissa; Sakano, Jennifer; Loukina, Anastasia; Krovetz, Bob; Lu, Chi; Madani, Nitin
2017-01-01
In this study, we developed assistive tools and resources to support TOEIC® Listening test item generation. There has recently been an increased need for a large pool of items for these tests. This need has, in turn, inspired efforts to increase the efficiency of item generation while maintaining the quality of the created items. We aimed to…
Model-Driven Test Generation of Distributed Systems
NASA Technical Reports Server (NTRS)
Easwaran, Arvind; Hall, Brendan; Schweiker, Kevin
2012-01-01
This report describes a novel test generation technique for distributed systems. Utilizing formal models and formal verification tools, specifically the Symbolic Analysis Laboratory (SAL) tool-suite from SRI, we present techniques to generate concurrent test vectors for distributed systems. These are initially explored within an informal test validation context and later extended to achieve full MC/DC coverage of the TTEthernet protocol operating within a system-centric context.
Knowledge-based approach for generating target system specifications from a domain model
NASA Technical Reports Server (NTRS)
Gomaa, Hassan; Kerschberg, Larry; Sugumaran, Vijayan
1992-01-01
Several institutions in industry and academia are pursuing research efforts in domain modeling to address unresolved issues in software reuse. To demonstrate the concepts of domain modeling and software reuse, a prototype software engineering environment is being developed at George Mason University to support the creation of domain models and the generation of target system specifications. This prototype environment, which is application domain independent, consists of an integrated set of commercial off-the-shelf software tools and custom-developed software tools. This paper describes the knowledge-based tool that was developed as part of the environment to generate target system specifications from a domain model.
OCSEGen: Open Components and Systems Environment Generator
NASA Technical Reports Server (NTRS)
Tkachuk, Oksana
2014-01-01
To analyze a large system, one often needs to break it into smaller components. To analyze a component or unit under analysis, one needs to model its context of execution, called environment, which represents the components with which the unit interacts. Environment generation is a challenging problem, because the environment needs to be general enough to uncover unit errors, yet precise enough to make the analysis tractable. In this paper, we present a tool for automated environment generation for open components and systems. The tool, called OCSEGen, is implemented on top of the Soot framework. We present the tool's current support and discuss its possible future extensions.
Economic figures in herd health programmes as motivation factors for farmers.
Anneberg, Inger; Østergaard, Søren; Ettema, Jehan Frans; Kudahl, Anne Braad
2016-11-01
Veterinarians often express frustration when farmers do not implement their advice, and farmers sometimes shake their heads when they receive veterinary advice that is practically infeasible. This is the background for the development of a focused, three-page economic report created in cooperation between veterinarians, farmers, advisers and researchers. Based on herd-specific key figures for management, the report presents the short- and long-term economic effects of changes in 15 management areas. Simulations are performed by the dairy herd simulation model "SimHerd". The aim is to assist the veterinarian in identifying the economically most favorable and feasible management improvements and thereby provide more relevant and prioritised advice to the farmer. In the development process, a prototype of the advisory tool was tested by 15 veterinarians on 55 farms. After the test period, a selection of farmers were asked to take part in a qualitative evaluation, questioning whether they had implemented the action plans suggested on the basis of the advisory tool and asking them to explain why they agreed or disagreed with the results from this new advisory tool. The aim of this process was to evaluate the farmers' receptiveness to advice based on these economic analyses. We found that the analysed advisory tool (the report) can be seen as valuable help and support for some farmers when deciding whether to implement the action plans. However, certain reservations were recognised. The trustworthiness of the tool depends on whether the veterinarians are able to suggest to the farmer which specific management changes are needed to obtain the estimated effects and what the related expenses (costs) might be. Without transparency of expenses, time limits, work hours and so on, farmers may not be convinced by the tool. Copyright © 2016 Elsevier B.V. All rights reserved.
Adaptive scallop height tool path generation for robot-based incremental sheet metal forming
NASA Astrophysics Data System (ADS)
Seim, Patrick; Möllensiep, Dennis; Störkle, Denis Daniel; Thyssen, Lars; Kuhlenkötter, Bernd
2016-10-01
Incremental sheet metal forming is an emerging process for the production of individualized products or prototypes in low batch sizes and with short times to market. In these processes, the desired shape is produced by the incremental inward motion of the workpiece-independent forming tool in the depth direction and its movement along the contour in the lateral direction. Because the shape is produced this way, the tool path generation is a key factor for, e.g., the resulting geometric accuracy, the resulting surface quality, and the working time. This paper presents an innovative tool path generation approach based on a commercial milling CAM package that considers surface quality and working time. This approach offers the ability to define a specific scallop height, as an indicator of the surface quality, for specific faces of a component. Moreover, it decreases the required working time for the production of the entire component compared to the use of a commercial software package without this adaptive approach. Different forming experiments have been performed to verify the newly developed tool path generation. Above all, this approach serves to resolve the existing conflict between working time and surface quality in incremental sheet metal forming.
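One plausible way to turn a per-face scallop-height target into a lateral step-over for a hemispherical forming tool is the flat-surface chord relation sketched below; this illustrates the geometric idea only and is not the authors' CAM integration. Face names and targets are invented.

```python
import math

def stepover_for_scallop(tool_radius, scallop_height):
    """Lateral step-over giving a target scallop height for a hemispherical tool
    on a locally flat face: h = R - sqrt(R^2 - (s/2)^2)  =>  s = 2*sqrt(2*R*h - h^2)."""
    h = min(scallop_height, tool_radius)      # guard against impossible targets
    return 2.0 * math.sqrt(2.0 * tool_radius * h - h * h)

# Per-face scallop targets (mm): visible faces get a finer finish than hidden ones.
tool_radius = 5.0                             # 10 mm hemispherical tool
face_targets = {"visible_face": 0.005, "transition": 0.02, "hidden_face": 0.1}
for face, h in face_targets.items():
    s = stepover_for_scallop(tool_radius, h)
    print(f"{face:>14}: scallop {h:.3f} mm -> step-over {s:.3f} mm")
```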
GUDM: Automatic Generation of Unified Datasets for Learning and Reasoning in Healthcare.
Ali, Rahman; Siddiqi, Muhammad Hameed; Idris, Muhammad; Ali, Taqdir; Hussain, Shujaat; Huh, Eui-Nam; Kang, Byeong Ho; Lee, Sungyoung
2015-07-02
A wide array of biomedical data are generated and made available to healthcare experts. However, due to the diverse nature of the data, it is difficult to predict outcomes from them. It is therefore necessary to combine these diverse data sources into a single unified dataset. This paper proposes a global unified data model (GUDM) to provide a global unified data structure for all data sources and generate a unified dataset by a "data modeler" tool. The proposed tool implements a user-centric, priority-based approach which can easily resolve the problems of unified data modeling and overlapping attributes across multiple datasets. The tool is illustrated using sample diabetes mellitus data. The diverse data sources to generate the unified dataset for diabetes mellitus include clinical trial information, a social media interaction dataset and physical activity data collected using different sensors. To realize the significance of the unified dataset, we adopted a well-known rough set theory based rule creation process to create rules from the unified dataset. The evaluation of the tool on six different sets of locally created diverse datasets shows that the tool, on average, reduces the time effort of the experts and knowledge engineer by 94.1% while creating unified datasets.
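A minimal sketch of priority-based resolution of overlapping attributes when merging heterogeneous sources is shown below (using pandas); the source names, columns and priorities are invented, and this is not the GUDM implementation.

```python
import pandas as pd

def unify(sources, priority, key="patient_id"):
    """Merge per-patient data sources into one unified dataset. When the same
    attribute appears in several sources, the value from the source with the
    highest user-assigned priority wins (a simple stand-in for the
    priority-based conflict resolution described above)."""
    unified = None
    for name in sorted(sources, key=lambda n: priority[n]):   # low priority first
        df = sources[name].set_index(key)
        unified = df if unified is None else df.combine_first(unified)
    return unified.reset_index()

# Illustrative sources: clinical records, sensor-derived activity, social media.
clinical = pd.DataFrame({"patient_id": [1, 2], "hba1c": [7.2, 8.1], "age": [54, 61]})
sensors  = pd.DataFrame({"patient_id": [1, 2], "steps_per_day": [4200, 6800], "age": [53, 61]})
social   = pd.DataFrame({"patient_id": [2], "self_reported_diet": ["low-carb"]})

unified = unify({"clinical": clinical, "sensors": sensors, "social": social},
                priority={"clinical": 3, "sensors": 2, "social": 1})
print(unified)
```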
NASA Astrophysics Data System (ADS)
Nikolić, Dalibor; Milošević, Žarko; Saveljić, Igor; Filipović, Nenad
2015-12-01
Vibration of the skull causes a hearing sensation; we call it Bone Conduction (BC) sound. There have been several investigations of the transmission properties of bone-conducted sound. The aim of this study was to develop a software tool for easy generation of a finite element (FE) model of the human head with different materials based on human head anatomy, and to calculate sound conduction through the head. The developed software tool generates a model in a few steps. The first step is to segment the CT medical images (DICOM) and to generate surface mesh files (STL). Each STL file represents a different layer of the human head with different material properties (brain, CSF, different layers of the skull bone, skin, etc.). The next steps are to create a tetrahedral mesh from the obtained STL files, to define the FE model boundary conditions and to solve the FE equations. This tool uses the PAK solver, the open-source software implemented in the SIFEM FP7 project, for calculations of the head vibration. The purpose of this tool is to show the impact of bone-conducted sound on the hearing system and to estimate how well the obtained results match experimental measurements.
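The first step of such a pipeline (thresholding a CT volume and exporting an iso-surface as STL) might look like the sketch below, assuming pydicom, scikit-image and numpy-stl are available; it is a simplified stand-in for the tool described, with no handling of voxel spacing, smoothing or per-material labelling.

```python
import glob
import numpy as np
import pydicom                      # assumed available
from skimage import measure        # scikit-image
from stl import mesh as stl_mesh   # numpy-stl

def ct_layer_to_stl(dicom_dir, hu_threshold, out_path):
    """Threshold a CT volume at a given Hounsfield level (e.g. cortical bone)
    and export the resulting iso-surface as an STL file. One call per tissue
    layer (skin, skull, brain, ...) with a different threshold would give the
    per-material surfaces described above. Assumes standard CT rescale tags."""
    slices = [pydicom.dcmread(f) for f in glob.glob(f"{dicom_dir}/*.dcm")]
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    volume = np.stack([s.pixel_array * s.RescaleSlope + s.RescaleIntercept
                       for s in slices])                    # convert to HU
    # vertices come back in voxel coordinates; physical spacing is ignored here
    verts, faces, _, _ = measure.marching_cubes(volume, level=hu_threshold)
    surface = stl_mesh.Mesh(np.zeros(faces.shape[0], dtype=stl_mesh.Mesh.dtype))
    for i, face in enumerate(faces):
        surface.vectors[i] = verts[face]
    surface.save(out_path)

# e.g. ct_layer_to_stl("head_ct", hu_threshold=300, out_path="skull.stl")
```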
The role of optimization in the next generation of computer-based design tools
NASA Technical Reports Server (NTRS)
Rogan, J. Edward
1989-01-01
There is a close relationship between design optimization and the emerging new generation of computer-based tools for engineering design. With some notable exceptions, the development of these new tools has not taken full advantage of recent advances in numerical design optimization theory and practice. Recent work in the field of design process architecture has included an assessment of the impact of next-generation computer-based design tools on the design process. These results are summarized, and insights into the role of optimization in a design process based on these next-generation tools are presented. An example problem has been worked out to illustrate the application of this technique. The example problem - layout of an aircraft main landing gear - is one that is simple enough to be solved by many other techniques. Although the mathematical relationships describing the objective function and constraints for the landing gear layout problem can be written explicitly and are quite straightforward, an approximation technique has been used in the solution of this problem that can just as easily be applied to integrate supportability or producibility assessments using theory of measurement techniques into the design decision-making process.
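As a generic illustration of posing a layout task as a constrained optimization (not the landing-gear formulation of the paper), the sketch below minimizes an explicit objective subject to explicit inequality constraints with SciPy; the variables and limits are invented placeholders.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for a layout problem with an explicit objective and constraints:
# place an attachment point (x, y) close to a preferred location while
# respecting simple geometric limits (proxies for tip-back, track, clearance).
preferred = np.array([3.0, 1.2])

def objective(p):
    return np.sum((p - preferred) ** 2)          # stay close to preferred layout

constraints = [
    {"type": "ineq", "fun": lambda p: p[0] - 2.5},   # x >= 2.5
    {"type": "ineq", "fun": lambda p: 1.5 - p[1]},   # y <= 1.5
    {"type": "ineq", "fun": lambda p: p[1] - 0.8},   # y >= 0.8
]

result = minimize(objective, x0=np.array([2.0, 1.0]), constraints=constraints)
print("optimal layout:", result.x, "objective:", round(result.fun, 4))
```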
Role of Tool Shoulder End Features on Friction Stir Weld Characteristics of 6082 Aluminum Alloy
NASA Astrophysics Data System (ADS)
Mugada, Krishna Kishore; Adepu, Kumar
2018-03-01
Understanding the temperature generation around the tool shoulder contact is one of the important aspects of the friction stir welding process. In the present study, the effects of various tool shoulder end features on the temperature and mechanical properties of 6082 aluminum alloy were investigated. The experimental results show that the axial force during welding is considerably reduced by using tools with shoulder end features. Detailed observation revealed that, around the tool shoulder contact, the amount of heat generation is higher from the trailing edge (TE) to the retreating-side leading-edge corner (RS-LE) in the counterclockwise direction, and lower from the RS-LE to the TE in the clockwise direction. Of the four tools with shoulder end features, the welds produced with the ridged-shoulder tool resulted in superior properties with a significantly lower axial force (approximately 32%) compared to the plane-shoulder tool.
NUMERICAL STUDY OF ELECTROMAGNETIC WAVES GENERATED BY A PROTOTYPE DIELECTRIC LOGGING TOOL
To understand the electromagnetic waves generated by a prototype dielectric logging tool, a numerical study was conducted using both the finite-difference time-domain method and a frequency-wavenumber method. When the propagation velocity in the borehole was greater than th...
Software Tools for Weed Seed Germination Modeling
USDA-ARS?s Scientific Manuscript database
The next generation of weed seed germination models will need to account for variable soil microclimate conditions. In order to predict this microclimate environment we have developed a suite of individual tools (models) that can be used in conjunction with the next generation of weed seed germinati...
Experiences with a generator tool for building clinical application modules.
Kuhn, K A; Lenz, R; Elstner, T; Siegele, H; Moll, R
2003-01-01
To elaborate the main system characteristics and relevant deployment experiences for the health information system (HIS) Orbis/OpenMed, which is in widespread use in Germany, Austria, and Switzerland. Over a deployment phase of 3 years in a 1,200-bed university hospital, during which the system underwent significant improvements, the system's functionality and its software design have been analyzed in detail. We focus on an integrated CASE tool for generating embedded clinical applications and for incremental system evolution. We present a participatory and iterative software engineering process developed for efficient utilization of such a tool. The system's functionality is comparable to that of other commercial products; its components are embedded in a vendor-specific application framework, and standard interfaces are used for connecting subsystems. The integrated generator tool is a remarkable feature; it became a key factor of our project. Tool-generated applications are workflow-enabled and embedded into the overall database schema. Rapid prototyping and iterative refinement are supported, so application modules can be adapted to the users' work practice. We consider tools supporting an iterative and participatory software engineering process highly relevant for health information system architects. The potential of a system to continuously evolve and to be effectively adapted to changing needs may be more important than sophisticated but hard-coded HIS functionality. More work will focus on HIS software design and on software engineering. Methods and tools are needed for quick and robust adaptation of systems to health care processes and changing requirements.
Novel aspects in diagnostic approach to respiratory patients: is it the time for a new semiotics?
Soldati, Gino; Smargiassi, Andrea; Mariani, Alberto A; Inchingolo, Riccardo
2017-01-01
The medical approach to patients is a fundamental step towards the correct diagnosis. The aim of this paper is to analyze some aspects of the reasoning process inherent in medical diagnosis in our era. Pathologic signs (anamnestic data, symptoms, semiotics, laboratory and instrumental findings) represent informative phenomena to be integrated for inferring a diagnosis. Thus, diagnosis begins with "signs" and ends in a probability of disease. The abductive reasoning process is the generation of a hypothesis to explain one or more observations (signs) in order to decide between alternative explanations in search of the best one. This process is iterative during the diagnostic activity, as further observations are collected, and it can be creative, generating new knowledge about what has not been experienced before. In the clinical setting, the abductive process is not only theoretical; on the contrary, the physical examination of the patient (palpation, percussion, auscultation) is always crucial. Through this manipulative abduction, new and still unexpressed information is discovered and evaluated, and physicians are able "to think through doing" to get the correct diagnosis. The abductive inferential path originates with an emotional reaction (discovery of the signs); step by step, explanations are formed; and it ends with another emotional reaction (diagnosis). Few bedside instruments are available to physicians to amplify their ability to search for signs. The stethoscope is an example. Similarities between ultrasound exploration and percussion can be found. Bedside ultrasonography can be considered an external amplifier of signs, a particular kind of percussion, and represents a valid example of abductive manipulation. In this search for signs, doctors act like detectives, and sometimes the discovery of a strategic, unsuspected sign during abductive manipulation can represent the key to the correct diagnosis. This condition is called serendipity. Ultrasound is a powerful tool for detecting soft, hidden, unexpected and strategic signs.
MagAl: A new tool to analyse galaxies photometric data
NASA Astrophysics Data System (ADS)
Schoenell, W.; Benítez, N.; Cid Fernandes, R.
2014-10-01
In galaxy spectra, one can find mainly two features: emission lines, which tell us about the ionised gas content, and the continuum plus absorption lines, which tell us about the stellar content. They thus allow us to derive gas-phase abundances, the main radiation sources, chemical enrichment and star formation histories. Broad-band photometry, on the other hand, is much more limited and hinders our ability to recover a galaxy's physical properties to such a degree of detail. However, with the recent development of redshift surveys using the technology of ultra-narrow filters (≈ 100 Å), such as ALHAMBRA, J-PAS and DES, it will be invaluable to be able to retrieve information on the physical properties of galaxies from photometric data. Motivated by this data avalanche (which goes up to the petabyte scale), we decided to build our own SED-fitting code: Magnitudes Analyser (MagAl), which has three modules. 1) A template library generation module: generates empirical and theoretical template libraries. 2) A Bayesian fitting module: calculates probability distribution functions (PDFs) for given observed and library template data. This is similar to the method used to measure photometric redshifts by Benitez (2000). 3) A result-analyser module: streamlines data analysis from the large output PDF files. A fourth module to manage 3D data is being developed, and a few preliminary tests are also shown. To investigate the reliability of results obtained by MagAl, we have created a mock galaxy sample for the ALHAMBRA survey filter system (http://alhambrasurvey.com) and tried to recover their physical properties. We show that for our sample of simulated galaxies we can measure stellar ages, metallicities and extinctions with a precision of less than 0.3 dex. Also, we apply the code to the ALHAMBRA survey catalog and show that we can measure stellar masses with an accuracy of 0.2 dex when comparing to previous results such as the COSMOS masses measured by Bundy et al. (2006).
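The Bayesian fitting step can be sketched as a chi-squared likelihood over a template library, marginalized into a PDF for one physical property; the code below is a toy stand-in for that module with synthetic templates and invented quantities, not the MagAl code itself.

```python
import numpy as np

def property_pdf(obs_flux, obs_err, template_fluxes, template_property, bins):
    """Chi^2-based likelihood of each library template given observed photometry,
    marginalized into a PDF over one physical property (e.g. log age).
    Each template is first scaled to the data by its analytic best-fit amplitude."""
    w = 1.0 / obs_err**2
    scale = (template_fluxes * obs_flux * w).sum(axis=1) / (template_fluxes**2 * w).sum(axis=1)
    chi2 = (((scale[:, None] * template_fluxes - obs_flux) / obs_err) ** 2).sum(axis=1)
    like = np.exp(-0.5 * (chi2 - chi2.min()))           # relative likelihoods
    pdf, edges = np.histogram(template_property, bins=bins, weights=like, density=True)
    return pdf, edges

# Toy example: 20-band photometry, a random template library with a known 'log age'.
rng = np.random.default_rng(2)
library = rng.random((500, 20)) + 0.1          # template fluxes (arbitrary units)
log_age = rng.uniform(7.0, 10.0, size=500)     # property attached to each template
true = library[123] * 2.0                      # 'observed' galaxy = scaled template 123
err = 0.05 * true
obs = true + rng.normal(0.0, err)
pdf, edges = property_pdf(obs, err, library, log_age, bins=np.linspace(7, 10, 16))
print("MAP log age bin:", edges[np.argmax(pdf)], "truth:", round(log_age[123], 2))
```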
Miniaturised Space Payloads for Outdoor Environmental Applications
NASA Astrophysics Data System (ADS)
de Souza, P. A.
2012-12-01
The need for portable, robust and accurate sensors has increased in recent years, driven by industrial and environmental needs. This paper describes a number of applications of engineering copies of the Moessbauer spectrometers (MIMOS II) used by the Mars Exploration Rovers, and the use of portable XRF spectrometers in the analysis of heavy metals in sediments. MIMOS II has been applied to the characterisation of Fe-bearing phases in airborne particles in industrialised urban centres. The results have allowed identification of sources of air pollution in near-real-time. The results also help to combine production parameters with pollution impact in the urban area. MIMOS II became a powerful tool because its constructive requirements for flight produced a robust, power-efficient, miniaturised and light instrument. On the limitation side, the technique takes some time to produce a good result and the instrument requires a radioactive source to operate. The MIMOS II team has reported a new generation of this instrument incorporating an XRF spectrometer that uses the radioactive source to generate fluorescence emissions from the sample. The author and their research group adapted a portable XRF spectrometer to an autonomous underwater vehicle (AUV) and conducted a heavy-metal survey of sediments across the Derwent Estuary in Tasmania, Australia. The AUV lands on suitable locations underwater, performs the chemical analysis and decides, based on the result, to move to a nearby location should a high concentration of the chemicals of interest be found, or to a more distant location otherwise. Beyond environmental applications, these instruments have been applied in archaeology and in industrial process control. [Figure captions: Moessbauer spectra recorded on airborne particles (Total Suspended Particles) collected at Ilha do Boi, Vitória, ES, Brazil; CSIRO's Autonomous Underwater Vehicle, 1.2 m long, carrying a miniaturised XRF spectrometer for underwater chemistry. Students involved in this project: Mr Jeremy Breen and Mr Andrew Davie. Collaborators: Dr Greg Timms (CSIRO) and Dr Robert Ollington (UTAS).]
Factors associated with health-related decision-making in older adults from Southern Brazil.
Morsch, Patricia; Mirandola, Andrea Ribeiro; Caberlon, Iride Cristofoli; Bós, Ângelo José Gonçalves
2017-05-01
To analyze older adults' health-related decision-making profile. Secondary analysis of a population-based study with 6945 older adults (aged ≥60 years) in Southern Brazil. Multiple logistic regressions were calculated to describe the odds of deciding alone or asking for advice, compared with the chance of letting someone else decide about health-related issues. Associated variables were age, sex, marital status, education level, number of chronic morbidities, having children and quality of life. The odds of asking for advice instead of letting others decide were significantly higher in the younger group and in those with better levels of quality of life, independent of other variables. The chance of asking for advice was lower for unmarried participants (62%), widowed participants (76%) and those with children (50%). The chance of men deciding for themselves about their health instead of letting others decide was 47% higher compared with women (P = 0.0002), but 45% lower in the older group (P < 0.0001). Participants who were unmarried and childless, and individuals with better levels of quality of life, were more likely to decide alone instead of letting others decide (P < 0.05). Decision-making is fundamental for older adults' good quality of life. Aging makes older adults more vulnerable to dependence; however, it does not necessarily mean that they lose or decrease their ability to make decisions regarding their own health and desires. Geriatr Gerontol Int 2017; 17: 798-803. © 2016 Japan Geriatrics Society.
Pharmacy career deciding: making choice a "good fit".
Willis, Sarah Caroline; Shann, Phillip; Hassell, Karen
2009-01-01
The purpose of this article is to explore factors influencing career deciding amongst pharmacy students and graduates in the U.K. Group interviews were used to devise a topic guide for five subsequent focus groups with pharmacy students and graduates. Focus groups were tape-recorded, recordings transcribed, and transcripts analysed. Key themes and interlinking factors relating to pharmacy career deciding were identified in the transcripts, following a constructivist approach. Participants described making a "good fit" between themselves, their experiences, social networks etc. and pharmacy. Central to a coherent career-deciding narrative were having a job on graduation and the instrumental advantage of studying a vocational course. Focusing on the career deciding of UK pharmacy students and graduates may limit the study's generalisability to other countries. However, our findings are relevant to those interested in understanding students' motivations for healthcare careers, since our results suggest that making a "good fit" describes a general process of matching between a healthcare career and personal experience. As we have found that pharmacy career deciding was not, usually, a planned activity, career advisors and those involved in higher education recruitment should take into account the roles played by personal preferences and values in choosing a degree course. A qualitative study like this can illustrate how career deciding occurs and provide insight into the process from a student's perspective. This can help inform guidance processes and selection to healthcare professions courses within the higher education sector, and stimulate debate amongst those involved with recruitment of healthcare workers about desirable motivators for healthcare careers.
Tarzia, Laura; Murray, Elizabeth; Humphreys, Cathy; Glass, Nancy; Taft, Angela; Valpied, Jodie; Hegarty, Kelsey
2016-01-01
Domestic violence (DV) perpetrated by men against women is a pervasive global problem with significant physical and emotional consequences. Although some face-to-face interventions in health care settings have shown promise, there are barriers to disclosure to health care practitioners and women may not be ready to access or accept help, reducing uptake. Similar to the mental health field, interventions from clinical practice can be adapted to be delivered by technology. This article outlines the theoretical and conceptual development of I-DECIDE, an online healthy relationship tool and safety decision aid for women experiencing DV. The article explores the use of the Psychosocial Readiness Model (PRM) as a theoretical framework for the intervention and evaluation. This is a theoretical article drawing on current theory and literature around health care and online interventions for DV. The article argues that the Internet as a method of intervention delivery for DV might overcome many of the barriers present in health care settings. Using the PRM as a framework for an online DV intervention may help women on a pathway to safety and well-being for themselves and their children. This hypothesis will be tested in a randomized, controlled trial in 2015/2016. This article highlights the importance of using a theoretical model in intervention development and evaluation. Copyright © 2016 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.
Using a blog as an integrated eLearning tool and platform.
Goh, Poh Sun
2016-06-01
Technology-enhanced learning, or eLearning, allows educators to expand access to educational content, promotes engagement with students and makes it easier for students to access educational material at a time, place and pace that suit them. The challenge for educators beginning their eLearning journey is to decide where to start, which includes the choice of an eLearning tool and platform. This article shares one educator's decision-making process and experience using blogs as a flexible and versatile integrated eLearning tool and platform. Apart from being a cost-effective or free tool and platform, blogs offer the possibility of creating a hyperlinked, indexed content repository for both created and curated educational material, as well as a distribution and engagement tool and platform. Incorporating pedagogically sound activities and educational practices into a blog promotes a structured, templated teaching process which can be reproduced. Moving from undergraduate to postgraduate training, educational blogs supported by a comprehensive online case-based repository offer the possibility of training beyond competency towards proficiency and expert-level performance through a process of deliberate practice. By documenting educational content, the student engagement and learning process, and feedback and personal reflection on educational sessions, blogs can also form the basis for a teaching portfolio, and provide evidence and data of scholarly teaching and educational scholarship. Looking into the future, having a collection of readily accessible, indexed, hyperlinked teaching material offers the potential to do on-the-spot teaching with illustrative material called up onto smart surfaces and displayed on holographic interfaces.
Koenig, Thomas W; Parrish, Samuel K; Terregino, Carol A; Williams, Joy P; Dunleavy, Dana M; Volsch, Joseph M
2013-05-01
Assessing applicants' personal competencies in the admission process has proven difficult because there is not an agreed-on set of personal competencies for entering medical students. In addition, there are questions about the measurement properties and costs of currently available assessment tools. The Association of American Medical Colleges' Innovation Lab Working Group (ILWG) and Admissions Initiative therefore engaged in a multistep, multiyear process to identify personal competencies important to entering students' success in medical school as well as ways to measure them early in the admission process. To identify core personal competencies, they conducted literature reviews, surveyed U.S. and Canadian medical school admission officers, and solicited input from the admission community. To identify tools with the potential to provide data in time for pre-interview screening, they reviewed the higher education and employment literature and evaluated tools' psychometric properties, group differences, risk of coaching/faking, likely applicant and admission officer reactions, costs, and scalability. This process resulted in a list of nine core personal competencies rated by stakeholders as very or extremely important for entering medical students: ethical responsibility to self and others; reliability and dependability; service orientation; social skills; capacity for improvement; resilience and adaptability; cultural competence; oral communication; and teamwork. The ILWG's research suggests that some tools hold promise for assessing personal competencies, but the authors caution that none are perfect for all situations. They recommend that multiple tools be used to evaluate information about applicants' personal competencies in deciding whom to interview.
Share2Quit: Web-Based Peer-Driven Referrals for Smoking Cessation
2013-01-01
Background: Smoking is the number one preventable cause of death in the United States. Effective Web-assisted tobacco interventions are often underutilized and require new and innovative engagement approaches. Web-based peer-driven chain referrals, used successfully outside health care, have the potential to increase the reach of Internet interventions. Objective: The objective of our study was to describe the protocol for the development and testing of proactive Web-based chain-referral tools for increasing access to Decide2Quit.org, a Web-assisted tobacco intervention system. Methods: We will build and refine proactive chain-referral tools, including email and Facebook referrals. In addition, we will implement respondent-driven sampling (RDS), a controlled chain-referral sampling technique designed to remove inherent biases in chain referrals and obtain a representative sample. We will begin our chain referrals with an initial recruitment of former and current smokers as seeds (initial participants), who will be trained to refer current smokers from their social networks using the developed tools. In turn, these newly referred smokers will also be provided the tools to refer other smokers from their social networks. We will model predictors of referral success using sample weights from the RDS to estimate the success of the system in the targeted population. Results: This protocol describes the evaluation of proactive Web-based chain-referral tools, which can be used in tobacco interventions to increase access to hard-to-reach populations and promote smoking cessation. Conclusions: Share2Quit represents an innovative advancement by capitalizing on naturally occurring technology trends to recruit smokers to Web-assisted tobacco interventions. PMID:24067329
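To sketch how RDS sample weights might enter such a model, the snippet below builds RDS-II style inverse-degree weights and fits a weighted logistic regression for referral success. It is illustrative only and not the Share2Quit analysis plan; the file and variable names are assumptions.

```python
# Minimal sketch, not the study's code: RDS-II style inverse-degree weights
# and a weighted logistic model of referral success. File and variable names
# below are hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("share2quit_referrals.csv")
# RDS-II weight: inversely proportional to each participant's reported
# network size (degree), normalized to sum to the sample size.
w = 1.0 / df["network_size"]
df["rds_weight"] = w * len(df) / w.sum()

X = sm.add_constant(df[["age", "female", "facebook_referral", "email_referral"]])
model = sm.GLM(df["made_referral"], X,
               family=sm.families.Binomial(),
               freq_weights=df["rds_weight"]).fit()
print(model.summary())
```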
The Generation Rate of Respirable Dust from Cutting Fiber Cement Siding Using Different Tools
Qi, Chaolong; Echt, Alan; Gressel, Michael G
2017-01-01
This article describes the evaluation of the generation rate of respirable dust (GAPS, defined as the mass of respirable dust generated per unit linear length cut) from cutting fiber cement siding using different tools in a laboratory testing system. We used an aerodynamic particle sizer spectrometer (APS) to continuously monitor the real-time size distributions of the dust throughout cutting tests with a variety of tools, and calculated the generation rate of respirable dust for each testing condition using the size distribution data. The test results verify that power shears provided an almost dust-free operation, with a GAPS of 0.006 gram per meter (g m−1) under the test conditions. For the same power saw, cuts using saw blades with more teeth generated more respirable dust. Using the same blade for all four miter saws tested in this study, a positive linear correlation was found between a saw's blade rotation speed and its dust generation rate. In addition, a circular saw running at the highest blade rotation speed of 9068 RPM generated the greatest amount of dust. All the miter saws generated less dust in the ‘chopping’ mode than in the ‘chopping and sliding’ mode. For the tested saws, GAPS consistently decreased with increases in the saw cutting feed rate and the number of boards in the stack. All the test results indicate that fewer cutting interactions between the saw blade’s teeth and the siding board per unit linear length of cut tend to result in a lower generation rate of respirable dust. These results may help guide optimal operation in practice and future tool development aimed at minimizing dust generation while producing a satisfactory cut. PMID:28395343
The Generation Rate of Respirable Dust from Cutting Fiber Cement Siding Using Different Tools.
Qi, Chaolong; Echt, Alan; Gressel, Michael G
2017-03-01
This article describes the evaluation of the generation rate of respirable dust (GAPS, defined as the mass of respirable dust generated per unit linear length cut) from cutting fiber cement siding using different tools in a laboratory testing system. We used an aerodynamic particle sizer spectrometer (APS) to continuously monitor the real-time size distributions of the dust throughout cutting tests with a variety of tools, and calculated the generation rate of respirable dust for each testing condition using the size distribution data. The test results verify that power shears provided an almost dust-free operation, with a GAPS of 0.006 g m-1 under the test conditions. For the same power saw, cuts using saw blades with more teeth generated more respirable dust. Using the same blade for all four miter saws tested in this study, a positive linear correlation was found between a saw's blade rotation speed and its dust generation rate. In addition, a circular saw running at the highest blade rotation speed of 9068 rpm generated the greatest amount of dust. All the miter saws generated less dust in the 'chopping' mode than in the 'chopping and sliding' mode. For the tested saws, GAPS consistently decreased with increases in the saw cutting feed rate and the number of boards in the stack. All the test results indicate that fewer cutting interactions between the saw blade's teeth and the siding board per unit linear length of cut tend to result in a lower generation rate of respirable dust. These results may help guide optimal operation in practice and future tool development aimed at minimizing dust generation while producing a satisfactory cut. Published by Oxford University Press on behalf of The British Occupational Hygiene Society 2017.
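A back-of-the-envelope version of the GAPS calculation (respirable mass derived from APS size distributions, divided by the linear length cut) might look like the sketch below. The particle density, respirable-convention curve, file layout and cut length are illustrative assumptions, not the authors' method details.

```python
# Rough sketch of the GAPS idea: respirable mass per unit length of cut.
# Density, size-convention curve, file layout and cut length are assumptions.
import numpy as np
import pandas as pd

RHO = 2.0e3   # assumed effective particle density of fiber cement dust, kg/m^3

def respirable_fraction(d_um):
    # Smooth approximation of the respirable sampling convention (~50% at 4 um)
    return 1.0 / (1.0 + np.exp((d_um - 4.25) / 1.5))

counts = pd.read_csv("aps_counts.csv")        # hypothetical: rows = 1 s records,
d = counts.columns.astype(float).to_numpy()   # columns = APS bin midpoints in um
mass_per_particle = RHO * (np.pi / 6.0) * (d * 1e-6) ** 3   # kg, spherical particles

resp_mass_kg = (counts.to_numpy() * mass_per_particle * respirable_fraction(d)).sum()
cut_length_m = 24.0                           # assumed total linear length cut
print(f"GAPS ~ {1e3 * resp_mass_kg / cut_length_m:.4f} g per metre of cut")
# Scaling from sampled mass to total generated mass (chamber volume, sampling
# efficiency) is omitted in this sketch.
```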
Analysis of Cryogenic Cycle with Process Modeling Tool: Aspen HYSYS
NASA Astrophysics Data System (ADS)
Joshi, D. M.; Patel, H. K.
2015-10-01
Cryogenic engineering deals with the development and improvement of low-temperature techniques, processes and equipment. A process simulator such as Aspen HYSYS, intended for the design, analysis, and optimization of process plants, has features that accommodate these special requirements and can therefore be used to simulate most cryogenic liquefaction and refrigeration processes. Liquefaction is the process of cooling or refrigerating a gas to a temperature below its critical temperature so that liquid can be formed at some suitable pressure below the critical pressure. Cryogenic processes require special attention to the integration of components such as heat exchangers, the Joule-Thomson valve, the turboexpander and the compressor. Here, Aspen HYSYS, a process modeling tool, is used to understand the behavior of the complete plant. This paper presents the analysis of an air liquefaction plant based on the Linde cryogenic cycle, performed using the Aspen HYSYS process modeling tool. It covers the technique used to find the optimum values that maximize liquefaction in the plant, subject to constraints on other parameters. The analysis results give a clear idea of the parameter values to choose before the actual plant is implemented in the field. They also indicate the productivity and profitability of the given plant configuration, which leads to the design of an efficient, productive plant.
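For orientation, the ideal Linde-Hampson liquid yield that this kind of analysis typically optimizes follows from an energy balance around the cold box: y = (h1 - h2)/(h1 - hf). The sketch below is a generic illustration of that relation, not taken from the paper; the numbers are placeholders rather than real air property data.

```python
# Generic sketch (not from the paper): ideal Linde-Hampson liquid yield per pass.
def linde_yield(h1, h2, hf):
    """Fraction of compressed gas liquefied per pass.
    h1: enthalpy of low-pressure gas returning at ambient temperature
    h2: enthalpy of high-pressure gas entering the cold box
    hf: enthalpy of saturated liquid at the low pressure
    (all in consistent units, e.g. kJ/kg)"""
    return (h1 - h2) / (h1 - hf)

# Placeholder numbers for illustration only; take real values from property
# data (e.g. an Aspen HYSYS stream report) for an actual plant study.
print(linde_yield(h1=420.0, h2=400.0, hf=0.0))   # ~0.048 liquid fraction
```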
MASTOS: Mammography Simulation Tool for design Optimization Studies.
Spyrou, G; Panayiotakis, G; Tzanakos, G
2000-01-01
Mammography is a high-quality imaging technique for the detection of breast lesions that requires dedicated equipment and optimum operation. The design parameters of a mammography unit have to be decided and evaluated before the construction of such high-cost apparatus. The optimum operational parameters must also be defined well before a real breast examination. MASTOS is a software package, based on Monte Carlo methods, that is designed to be used as a simulation tool in mammography. The input consists of the parameters that have to be specified when using a mammography unit, together with the parameters specifying the shape and composition of the breast phantom. In addition, the input may specify parameters needed in the design of a new mammographic apparatus. The main output of the simulation is a mammographic image and calculations of various factors that describe the image quality. The Monte Carlo simulation code is PC-based and is driven by a graphical user interface outer shell. The entire software package is a simulation tool for mammography and can be applied in basic research and/or in training in the fields of medical physics and biomedical engineering, as well as in the performance evaluation of new designs of mammography units and in the determination of optimum standards for the operational parameters of a mammography unit.
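To convey the flavor of a Monte Carlo photon-transport calculation of this kind, the toy sketch below samples exponential free paths through a uniform phantom and estimates primary transmission. The attenuation coefficient and phantom thickness are illustrative assumptions, and this is not MASTOS code.

```python
# Toy Monte Carlo sketch of primary photon transmission through a uniform
# breast phantom (scatter, spectrum and dose tallies omitted).
import numpy as np

rng = np.random.default_rng(0)
MU = 0.08          # assumed linear attenuation coefficient, 1/mm (illustrative)
THICKNESS = 45.0   # assumed compressed phantom thickness, mm
N = 100_000

# A photon is "detected" if its sampled free path exceeds the phantom thickness.
free_paths = rng.exponential(scale=1.0 / MU, size=N)
transmitted = (free_paths > THICKNESS).sum()
print(f"Primary transmission ~ {transmitted / N:.4f}")
print(f"Analytic check: exp(-mu*t) = {np.exp(-MU * THICKNESS):.4f}")
```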
UTIS as one example of standardization of subsea intervention systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haugen, F.G.
1995-12-31
The number of diverless subsea interventions has increased dramatically during the last few years. A number of types of tools and equipment have been designed and used. A typical procedure has been to develop new intervention tools under each new contract based on experience from the previous project. This is not at all optimal with regard to project cost and risk, and is no longer acceptable as the oil industry now calls for cost savings within all areas of field development. One answer to the problem will be to develop universal intervention systems with the capability to perform a range of related tasks, with only minor, planned modifications of the system. This philosophy will dramatically reduce planning, engineering, construction and interface work related to the intervention operation as the main work will be only to locate a standardized landing facility on the subsea structure. The operating procedures can be taken "off the shelf". To adapt to this philosophy within the tie-in area, KOS decided to standardize on a Universal Tie-In System (UTIS), which will be included in a Tool Pool for rental world-wide. This paper describes UTIS as a typical example of standardization of subsea intervention systems. 16 figs., 1 tab.
Turon, Clàudia; Comas, Joaquim; Torrens, Antonina; Molle, Pascal; Poch, Manel
2008-01-01
With the aim of improving effluent quality of waste stabilization ponds, different designs of vertical flow constructed wetlands and intermittent sand filters were tested on an experimental full-scale plant within the framework of a European project. The information extracted from this study was completed and updated with heuristic and bibliographic knowledge. The data and knowledge acquired were difficult to integrate into mathematical models because they involve qualitative information and expert reasoning. Therefore, it was decided to develop an environmental decision support system (EDSS-Filter-Design) as a tool to integrate mathematical models and knowledge-based techniques. This paper describes the development of this support tool, emphasizing the collection of data and knowledge and representation of this information by means of mathematical equations and a rule-based system. The developed support tool provides the main design characteristics of filters: (i) required surface, (ii) media type, and (iii) media depth. These design recommendations are based on wastewater characteristics, applied load, and required treatment level data provided by the user. The results of the EDSS-Filter-Design provide appropriate and useful information and guidelines on how to design filters, according to the expert criteria. The encapsulation of the information into a decision support system reduces the design period and provides a feasible, reasoned, and positively evaluated proposal.
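The rule-based part of such a decision-support tool can be pictured as a small function mapping user inputs to design characteristics, as in the sketch below. The thresholds, media types and areal loads are invented for illustration and do not reproduce the EDSS-Filter-Design knowledge base.

```python
# Illustrative rule-based sketch only; thresholds and design loads are assumed,
# not the EDSS-Filter-Design expert rules.
def design_filter(influent_bod_mg_l: float, bod_load_g_per_day: float,
                  treatment_level: str) -> dict:
    """Return indicative filter design characteristics: media, depth, surface."""
    if treatment_level == "nitrification" or influent_bod_mg_l > 150:
        media, depth_m = "fine sand", 1.0
    else:
        media, depth_m = "gravel/sand mix", 0.8
    # Required surface follows from the areal load the chosen media can accept.
    areal_design_load = 20.0 if media == "fine sand" else 35.0  # g BOD/m2/day (assumed)
    surface_m2 = bod_load_g_per_day / areal_design_load
    return {"media": media, "depth_m": depth_m, "surface_m2": round(surface_m2, 1)}

print(design_filter(influent_bod_mg_l=220, bod_load_g_per_day=3000,
                    treatment_level="nitrification"))
```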
A digital flight control system verification laboratory
NASA Technical Reports Server (NTRS)
De Feo, P.; Saib, S.
1982-01-01
A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus in particular on increasing the number of software test tools and on assessing cost effectiveness.
Flowgen: Flowchart-based documentation for C++ codes
NASA Astrophysics Data System (ADS)
Kosower, David A.; Lopez-Villarejo, J. J.
2015-11-01
We present the Flowgen tool, which generates flowcharts from annotated C++ source code. The tool generates a set of interconnected high-level UML activity diagrams, one for each function or method in the C++ sources. It provides a simple and visual overview of complex implementations of numerical algorithms. Flowgen is complementary to the widely-used Doxygen documentation tool. The ultimate aim is to render complex C++ computer codes accessible, and to enhance collaboration between programmers and algorithm or science specialists. We describe the tool and a proof-of-concept application to the VINCIA plug-in for simulating collisions at CERN's Large Hadron Collider.
Table-driven software architecture for a stitching system
NASA Technical Reports Server (NTRS)
Thrash, Patrick J. (Inventor); Miller, Jeffrey L. (Inventor); Pallas, Ken (Inventor); Trank, Robert C. (Inventor); Fox, Rhoda (Inventor); Korte, Mike (Inventor); Codos, Richard (Inventor); Korolev, Alexandre (Inventor); Collan, William (Inventor)
2001-01-01
Native code for a CNC stitching machine is generated by generating a geometry model of a preform; generating tool paths from the geometry model, the tool paths including stitching instructions for making stitches; and generating additional instructions indicating thickness values. The thickness values are obtained from a lookup table. When the stitching machine runs the native code, it accesses a lookup table to determine a thread tension value corresponding to the thickness value. The stitching machine accesses another lookup table to determine a thread path geometry value corresponding to the thickness value.
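The table-driven idea can be illustrated with a small lookup: the generated native code carries thickness values, and the controller maps each thickness to a thread tension and a thread-path geometry. The sketch below is a hypothetical illustration; the breakpoints and setpoints are invented, not values from the patent.

```python
# Hypothetical illustration of a thickness-driven lookup; table contents are
# invented placeholders, not the patented stitching-machine tables.
import bisect

THICKNESS_BREAKPOINTS = [2.0, 4.0, 6.0, 8.0]        # preform thickness, mm
THREAD_TENSION = [1.2, 1.8, 2.5, 3.3]               # tension setpoints (arbitrary units)
THREAD_PATH_GEOMETRY = ["short", "medium", "long", "extra-long"]

def lookup(thickness_mm: float) -> tuple[float, str]:
    """Pick the table row whose breakpoint first covers the given thickness."""
    i = min(bisect.bisect_left(THICKNESS_BREAKPOINTS, thickness_mm),
            len(THICKNESS_BREAKPOINTS) - 1)
    return THREAD_TENSION[i], THREAD_PATH_GEOMETRY[i]

for t in (1.5, 5.0, 9.0):
    tension, geometry = lookup(t)
    print(f"thickness {t} mm -> tension {tension}, path geometry {geometry}")
```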
Decision generation tools and Bayesian inference
NASA Astrophysics Data System (ADS)
Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas
2014-05-01
Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDG based on Bayesian inference, related to adverse (hostile) networks, including such important applications as terrorism-related and organized-crime networks.
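A minimal Bayesian-inference step of the kind such DDG tools chain together can be sketched as below; the prior and the likelihoods are invented for illustration, and this is not the authors' system.

```python
# Toy Bayesian update sketch: probability that a node belongs to an adverse
# network, updated as independent indicators arrive. All numbers are assumed.
def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """One Bayes step: P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)P(~H)]."""
    num = p_e_given_h * prior
    return num / (num + p_e_given_not_h * (1.0 - prior))

p = 0.05                                                   # prior probability of hostility
for likelihoods in [(0.7, 0.1), (0.6, 0.2), (0.9, 0.3)]:   # three observed indicators
    p = update(p, *likelihoods)
print(f"posterior after evidence: {p:.3f}")
```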
20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?
Code of Federal Regulations, 2012 CFR
2012-04-01
... injury later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1, an...
20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?
Code of Federal Regulations, 2011 CFR
2011-04-01
... injury later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1, an...
20 CFR 10.206 - May an employee who uses leave after an injury later decide to use COP instead?
Code of Federal Regulations, 2010 CFR
2010-04-01
... injury later decide to use COP instead? 10.206 Section 10.206 Employees' Benefits OFFICE OF WORKERS... THE FEDERAL EMPLOYEES' COMPENSATION ACT, AS AMENDED Continuation of Pay Eligibility for Cop § 10.206 May an employee who uses leave after an injury later decide to use COP instead? On Form CA-1, an...
Code of Federal Regulations, 2010 CFR
2010-07-01
... deciding whether to use the fixed-fee or cost-reimbursable contracting method? 302-12.109 Section 302-12... Services Company § 302-12.109 What must we consider in deciding whether to use the fixed-fee or cost...-fee or cost-reimbursable contracting method: (a) Risk of alternative methods. Under a fixed fee...
Code of Federal Regulations, 2011 CFR
2011-04-01
..., DEPARTMENT OF THE INTERIOR LAW AND ORDER INDIAN COUNTRY LAW ENFORCEMENT Support Functions § 12.62 Who decides what uniform an Indian country law enforcement officer can wear and who pays for it? Each local law... 25 Indians 1 2011-04-01 2011-04-01 false Who decides what uniform an Indian country law...
Code of Federal Regulations, 2014 CFR
2014-04-01
..., DEPARTMENT OF THE INTERIOR LAW AND ORDER INDIAN COUNTRY LAW ENFORCEMENT Support Functions § 12.62 Who decides what uniform an Indian country law enforcement officer can wear and who pays for it? Each local law... 25 Indians 1 2014-04-01 2014-04-01 false Who decides what uniform an Indian country law...
Code of Federal Regulations, 2013 CFR
2013-04-01
..., DEPARTMENT OF THE INTERIOR LAW AND ORDER INDIAN COUNTRY LAW ENFORCEMENT Support Functions § 12.62 Who decides what uniform an Indian country law enforcement officer can wear and who pays for it? Each local law... 25 Indians 1 2013-04-01 2013-04-01 false Who decides what uniform an Indian country law...
Code of Federal Regulations, 2012 CFR
2012-04-01
... THE INTERIOR LAW AND ORDER INDIAN COUNTRY LAW ENFORCEMENT Support Functions § 12.62 Who decides what uniform an Indian country law enforcement officer can wear and who pays for it? Each local law enforcement... 25 Indians 1 2012-04-01 2011-04-01 true Who decides what uniform an Indian country law enforcement...
Research development of thermal aberration in 193nm lithography exposure system
NASA Astrophysics Data System (ADS)
Wang, Yueqiang; Liu, Yong
2014-08-01
Lithographic exposure is the key process in the manufacture of integrated circuits, and the performance of the exposure system determines the level of microelectronics manufacturing technology. Nowadays, the 193 nm ArF immersion exposure tool is widely used by IC manufacturers. As the requirements on critical dimension uniformity (CDU) and overlay become tighter and the required throughput becomes higher, the thermal aberration caused by the lens material and structure absorbing laser energy can no longer be neglected. In this paper, we review the efforts and methods of researchers on thermal aberration and its control. These methods are then compared to show their respective pros and cons. Finally, we examine the challenges of thermal aberration control for state-of-the-art technologies.
Maranger, Julie; Malcolm, Janine; Liddy, Clare; Izzi, Sheryl; Brez, Sharon; LaBrecque, Kerri; Taljaard, Monica; Reid, Robert; Keely, Erin; Ooi, Teik Chye
2013-01-01
The epidemic of diabetes has increased pressure on the whole spectrum of the healthcare system including specialist centres. The authors' own specialist centre at The Ottawa Hospital has 20,000 annual visits for diabetes, 80% of which are follow-up visits. Since it is a tertiary facility, managers, administrators and clinicians would like to increase their ability to see newly referred patients and decrease the number of follow-up visits. In order to discharge appropriate diabetes patients, the authors decided it was essential to strengthen the transition process to decrease both the pressure on the centre and the risk for discontinuity of diabetes care after discharge.
The importance of documenting code, and how you might make yourself do it
NASA Astrophysics Data System (ADS)
Tollerud, Erik Jon; Astropy Project
2016-01-01
Your science code is awesome. It reduces data, performs some statistical analysis, or models a physical process better than anyone has done before. You wisely decide that it is worth sharing with your student/advisor, research collaboration, or the whole world. But when you send it out, no one seems willing to use it. Why? Most of the time, it's your documentation. You wrote the code for yourself, so you know what every function, procedure, or class is supposed to do. Unfortunately, your users (sometimes including you 6 months later) do not. In this talk, I will describe some of the tools, both technical and psychological, to make that documentation happen (particularly for the Python ecosystem).
Informed consent in otolaryngologic surgery: case scenario from a Nigerian specialist hospital.
Afolabi, O A; Fadare, J O; Ajiboye, O T
2014-01-01
Informed consent is a foundational concept necessary for the ethical conduct of clinical research and practice. It is a technical tool that shifts the autonomy to decide whether a medical procedure should be performed from the doctor to the patient. However, there is an ongoing discussion in bioethical circles on the level of comprehension of the informed consent process by patients and research participants. We present this case vignette and the discussion that follows to explore the question of to what extent a patient comprehends the information given to him or her before a surgical procedure is carried out. In other words, the question being asked here is how informed is informed consent in the context of otolaryngological practice.
Coagulopathy in liver disease: Lack of an assessment tool.
Blasi, Annabel
2015-09-21
There is a discrepancy between the information from clotting tests which have routinely been used in clinical practice and evidence regarding thrombotic and bleeding events in patients with liver disease. This discrepancy leads us to rely on other variables which have been shown to be involved in haemostasis in these patients and/or to extrapolate the behaviour of these patients to other settings in order to decide the best clinical approach. The aims of the present review are as follows: (1) to present the information provided by clotting tests in cirrhotic patients; (2) to present the factors that may influence clotting in these patients; (3) to review the clinical evidence; and (4) to put forward a clinical approach based on the first 3 points.
Coagulopathy in liver disease: Lack of an assessment tool
Blasi, Annabel
2015-01-01
There is a discrepancy between the information from clotting tests which have routinely been used in clinical practice and evidence regarding thrombotic and bleeding events in patients with liver disease. This discrepancy leads us to rely on other variables which have been shown to be involved in haemostasis in these patients and/or to extrapolate the behaviour of these patients to other settings in order to decide the best clinical approach. The aims of the present review are as follows: (1) to present the information provided by clotting tests in cirrhotic patients; (2) to present the factors that may influence clotting in these patients; (3) to review the clinical evidence; and (4) to put forward a clinical approach based on the first 3 points. PMID:26401071
Schmittdiel, Julie A.; Desai, Jay; Schroeder, Emily B.; Paolino, Andrea R.; Nichols, Gregory A.; Lawrence, Jean M.; O’Connor, Patrick J.; Ohnsorg, Kris A.; Newton, Katherine M.; Steiner, John F.
2016-01-01
Implementation lessons: Engaging stakeholders in the research process has the potential to improve quality of care and the patient care experience. Online patient community surveys can elicit important topic areas for comparative effectiveness research. Stakeholder meetings with substantial patient representation, as well as representation from health care delivery systems and research funding agencies, are a valuable tool for selecting and refining pilot research and quality improvement projects. Giving patient stakeholders a deciding vote in selecting pilot research topics helps ensure their 'voice' is heard. Researchers and health care leaders should continue to develop best practices and strategies for increasing patient involvement in comparative effectiveness and delivery science research. PMID:26179728
PLQP & Company: Decidable Logics for Quantum Algorithms
NASA Astrophysics Data System (ADS)
Baltag, Alexandru; Bergfeld, Jort; Kishida, Kohei; Sack, Joshua; Smets, Sonja; Zhong, Shengyang
2014-10-01
We introduce a probabilistic modal (dynamic-epistemic) quantum logic PLQP for reasoning about quantum algorithms. We illustrate its expressivity by using it to encode the correctness of the well-known quantum search algorithm, as well as of a quantum protocol known to solve one of the paradigmatic tasks from classical distributed computing (the leader election problem). We also provide a general method (extending an idea employed in the decidability proof in Dunn et al. (J. Symb. Log. 70:353-359, 2005)) for proving the decidability of a range of quantum logics, interpreted on finite-dimensional Hilbert spaces. We give general conditions for the applicability of this method, and in particular we apply it to prove the decidability of PLQP.
Divorce transition differences of midlife women.
Sakraida, Teresa J
2005-01-01
The divorce transition experienced by midlife women, and its influence upon their health, is not fully understood. Interviews were conducted with 24 divorced women who self-classified into decider status groups: initiator (the one who first decided to end the marriage), non-initiator (the recipient of the decision to end the marriage), and mutual decider (shared decision to end the marriage). Interpretive content analysis involving pattern coding was conducted. The divorce transition for initiators (n=8) included self-focused growth, optimism, and social support losses and opportunities, while the divorce transition for non-initiators (n=8) included being left, ruminating, vulnerability, and spiritual comfort. No profile emerged for the mutual-decider group (n=8). This study supports the conclusion that the divorce transition differs for initiators and non-initiators.
Socioeconomic Attainment in the Ellis Island Era
White, Michael J.; Mullen, Erica Jade
2017-01-01
Contemporary discussions of immigrant assimilation in the United States often take the experience of the late 19th and early 20th centuries as a benchmark, yet significant gaps remain in our understanding of the generality and rate of immigrant progress during that era. Using four decades of IPUMS census microdata, we apply both OLS microdata regression and a double-cohort methodology to examine socioeconomic assimilation across arrival cohort and country of origin during the Ellis Island era. Our results show, contrary to some accounts, that while the first generation (the foreign born) exhibits decidedly inferior labor market outcomes, socioeconomic attainment (measured by Socio-Economic Index [SEI] points) increased quickly with duration in the U.S. Persons of the second generation and those of mixed parentage show much less of a penalty than immigrants. At the same time, we uncover differences in outcome by European region that do not disappear over the decades we examine. PMID:28979054
Executing Medical Guidelines on the Web: Towards Next Generation Healthcare
NASA Astrophysics Data System (ADS)
Argüello, M.; Des, J.; Fernandez-Prieto, M. J.; Perez, R.; Paniagua, H.
There is still a lack of full integration between current Electronic Health Records (EHRs) and the medical guidelines that encapsulate evidence-based medicine. Thus, general practitioners (GPs) and specialised physicians still have to read document-based medical guidelines and decide among various options for managing common non-life-threatening conditions, where selecting the most appropriate therapeutic option for each individual patient can be a difficult task. This paper presents a simulation framework and computational test-bed, called the V.A.F. Framework, for supporting simulations of clinical situations. The framework combines Health Level Seven (HL7) and Semantic Web technologies (OWL, SWRL, and OWL-S) to achieve content-layer interoperability between online clinical cases and medical guidelines, demonstrating that tighter integration between EHRs and evidence-based medicine can be accomplished and could lead to a next generation of healthcare systems that provide more support to physicians and increase patient safety.
A simple generative model of collective online behavior.
Gleeson, James P; Cellai, Davide; Onnela, Jukka-Pekka; Porter, Mason A; Reed-Tsochas, Felix
2014-07-22
Human activities increasingly take place in online environments, providing novel opportunities for relating individual behaviors to population-level outcomes. In this paper, we introduce a simple generative model for the collective behavior of millions of social networking site users who are deciding between different software applications. Our model incorporates two distinct mechanisms: one is associated with recent decisions of users, and the other reflects the cumulative popularity of each application. Importantly, although various combinations of the two mechanisms yield long-time behavior that is consistent with data, the only models that reproduce the observed temporal dynamics are those that strongly emphasize the recent popularity of applications over their cumulative popularity. This demonstrates--even when using purely observational data without experimental design--that temporal data-driven modeling can effectively distinguish between competing microscopic mechanisms, allowing us to uncover previously unidentified aspects of collective online behavior.
A simple generative model of collective online behavior
Gleeson, James P.; Cellai, Davide; Onnela, Jukka-Pekka; Porter, Mason A.; Reed-Tsochas, Felix
2014-01-01
Human activities increasingly take place in online environments, providing novel opportunities for relating individual behaviors to population-level outcomes. In this paper, we introduce a simple generative model for the collective behavior of millions of social networking site users who are deciding between different software applications. Our model incorporates two distinct mechanisms: one is associated with recent decisions of users, and the other reflects the cumulative popularity of each application. Importantly, although various combinations of the two mechanisms yield long-time behavior that is consistent with data, the only models that reproduce the observed temporal dynamics are those that strongly emphasize the recent popularity of applications over their cumulative popularity. This demonstrates—even when using purely observational data without experimental design—that temporal data-driven modeling can effectively distinguish between competing microscopic mechanisms, allowing us to uncover previously unidentified aspects of collective online behavior. PMID:25002470
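A minimal simulation of the two mechanisms described (recent popularity versus cumulative popularity) is sketched below; the number of applications, window length and mixing weight are assumptions for illustration, not the fitted parameters from the paper.

```python
# Toy simulation of app adoption mixing recent and cumulative popularity;
# parameter values are assumed, not taken from the paper.
import numpy as np

rng = np.random.default_rng(1)
N_APPS, STEPS, WINDOW = 20, 5000, 50
ALPHA = 0.9          # weight on recent popularity (the regime the paper favors)

cumulative = np.ones(N_APPS)                 # every app starts with one install (smoothing)
recent = list(rng.integers(0, N_APPS, WINDOW))

for _ in range(STEPS):
    recent_counts = np.bincount(recent[-WINDOW:], minlength=N_APPS) + 1e-9
    probs = (ALPHA * recent_counts / recent_counts.sum()
             + (1 - ALPHA) * cumulative / cumulative.sum())
    choice = rng.choice(N_APPS, p=probs)     # one user picks one application
    cumulative[choice] += 1
    recent.append(choice)

print("final installation shares:", np.round(cumulative / cumulative.sum(), 3))
```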
NASA Technical Reports Server (NTRS)
Stapfer, G.; Truscello, V. C.
1975-01-01
For the Multi-Hundred Watt (MHW) Radioisotope Thermoelectric Generator (RTG), the silicon germanium unicouples are coated with silicon nitride to minimize degradation mechanisms which are directly attributable to material sublimation effects. A program is under way to determine the effective vapor suppression of this coating as a function of temperature and gas environment. The results of weight loss experiments, using Si3N4 coated hot shoes (SiMo), operating over a temperature range from 900 °C to 1200 °C, are analyzed and discussed. These experiments were conducted both in high vacuum and at different pressures of carbon monoxide (CO) to determine its effect on the coating. Although the results show a favorable vapor suppression at all operating temperatures, the pressure of the CO and the thickness of the coating have a decided effect on the useful lifetime of the coating.
Automatic Certification of Kalman Filters for Reliable Code Generation
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd; Schumann, Johann; Richardson, Julian
2005-01-01
AUTOFILTER is a tool for automatically deriving Kalman filter code from high-level declarative specifications of state estimation problems. It can generate code with a range of algorithmic characteristics and for several target platforms. The tool has been designed with reliability of the generated code in mind and is able to automatically certify that the code it generates is free from various error classes. Since documentation is an important part of software assurance, AUTOFILTER can also automatically generate various human-readable documents, containing both design and safety related information. We discuss how these features address software assurance standards such as DO-178B.
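For context, a hand-written linear Kalman filter of the kind AUTOFILTER is designed to generate automatically is sketched below for a one-dimensional constant-velocity state. This is a generic textbook filter, not AUTOFILTER output, and the noise covariances are assumed values.

```python
# Generic linear Kalman filter sketch (position/velocity state, position
# measurements); not AUTOFILTER-generated code. Noise covariances are assumed.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (position, velocity)
H = np.array([[1.0, 0.0]])                 # we measure position only
Q = 0.01 * np.eye(2)                       # process noise covariance (assumed)
R = np.array([[0.5]])                      # measurement noise covariance (assumed)

x = np.zeros((2, 1))                       # initial state estimate
P = np.eye(2)                              # initial estimate covariance

rng = np.random.default_rng(0)
truth = np.array([[0.0], [1.0]])
for _ in range(20):
    truth = F @ truth                                          # simulate the true state
    z = H @ truth + rng.normal(0.0, np.sqrt(R[0, 0]), (1, 1))  # noisy measurement
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(2) - K @ H) @ P

print("estimated position/velocity:", x.ravel(), "true:", truth.ravel())
```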
Model Based Analysis and Test Generation for Flight Software
NASA Technical Reports Server (NTRS)
Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep
2009-01-01
We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.
First non-OEM steam-generator replacement in US a success
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hendsbee, P.M.; Lees, M.D.; Smith, J.C.
1994-04-01
In selecting replacements for major powerplant components, a fresh approach can be advantageous--even when complex nuclear components are involved. This was the experience at Unit 2 of Millstone nuclear station, which features an 870-MW pressurized-water reactor (PWR) with two nuclear recirculating steam generators. The unit began operation in 1975. In the early 1980s, pitting problems surfaced in the steam generator tubing; by the mid eighties, tube corrosion had reached an unacceptable level. Virtually all of the 17,000 tubes in the two units were deteriorating, with 2500 plugged and 5000 sleeved. Several new problems also were identified, including secondary-side circumferential cracking of the Alloy 600 tubing near the tubesheet face, and deterioration of the carbon steel egg-crate tube supports. Despite improvements to primary and secondary steam-generator water chemistry, including almost complete copper removal from the condensate and feedwater loops, Northeast Utilities (NU) was unable to completely control degradation of the tube bundles. The utility decided in 1987 that full replacement was the most viable alternative. NU made a bold move, selecting a supplier other than the original equipment manufacturer (OEM).