Pre- and Post-Processing Tools to Create and Characterize Particle-Based Composite Model Structures
2017-11-01
ARL-TR-8213 ● NOV 2017 ● US Army Research Laboratory
Pre- and Post-Processing Tools to Streamline the CFD Process
NASA Technical Reports Server (NTRS)
Dorney, Suzanne Miller
2002-01-01
This viewgraph presentation provides information on software development tools to facilitate the use of CFD (Computational Fluid Dynamics) codes. The specific CFD codes FDNS and CORSAIR are profiled, and uses for software development tools with these codes during pre-processing, interim-processing, and post-processing are explained.
STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.
Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X
2009-08-01
This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.
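As a rough illustration of the kind of PBS-based parallelism described above, the sketch below fans out one post-processing job per subject on a cluster; the subject list, wrapper script name, and resource request are hypothetical and do not reflect STAMPS' actual job layout.

```python
import subprocess
from pathlib import Path

# Hypothetical subject list and processing script; STAMPS' real job layout may differ.
subjects = ["subj01", "subj02", "subj03"]
pipeline_script = Path("run_mri_postproc.sh")  # wraps the per-subject SPM/FSL/HAMMER steps

for subj in subjects:
    # One PBS job per subject, so independent subjects run in parallel across nodes.
    cmd = [
        "qsub",
        "-N", f"stamp_{subj}",                      # job name
        "-l", "nodes=1:ppn=4,walltime=08:00:00",    # resource request (illustrative)
        "-v", f"SUBJECT={subj}",                    # pass the subject ID to the script
        str(pipeline_script),
    ]
    subprocess.run(cmd, check=True)
```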
NASA Technical Reports Server (NTRS)
Ayap, Shanti; Fisher, Forest; Gladden, Roy; Khanampompan, Teerapat
2008-01-01
This software tool saves time and reduces risk by automating two labor-intensive and error-prone post-processing steps required for every DKF [DSN (Deep Space Network) Keyword File] that MRO (Mars Reconnaissance Orbiter) produces, and is being extended to post-process the corresponding TSOE (Text Sequence Of Events) as well. The need for this post-processing step stems from limitations in the seq-gen modeling resulting in incorrect DKF generation that is then cleaned up in post-processing.
Parallel workflow tools to facilitate human brain MRI post-processing
Cui, Zaixu; Zhao, Chenxi; Gong, Gaolang
2015-01-01
Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues. PMID:26029043
A cascading failure analysis tool for post processing TRANSCARE simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
This is a MATLAB-based tool to post process simulation results in the EPRI software TRANSCARE, for massive cascading failure analysis following severe disturbances. There are a few key modules available in this tool, including: 1. automatically creating a contingency list to run TRANSCARE simulations, including substation outages above a certain kV threshold, N-k (1, 2 or 3) generator outages and branch outages; 2. read in and analyze a CKO file of PCG definition, an initiating event list, and a CDN file; 3. post process all the simulation results saved in a CDN file and perform critical event corridor analysis; 4. provide a summary of TRANSCARE simulations; 5. identify the most frequently occurring event corridors in the system; and 6. rank the contingencies using a user-defined security index to quantify consequences in terms of total load loss, total number of cascades, etc.
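A minimal sketch of the contingency-ranking idea in module 6, assuming hypothetical per-contingency summaries and user-chosen weights; the field names and weighting below are illustrative, not the tool's actual schema or scoring.

```python
# Rank contingencies by a user-defined security index (illustrative only).
contingencies = [
    {"name": "N-2 gen outage A+B", "load_loss_mw": 850.0, "num_cascades": 12},
    {"name": "345 kV line outage", "load_loss_mw": 120.0, "num_cascades": 3},
    {"name": "substation outage X", "load_loss_mw": 2300.0, "num_cascades": 27},
]

def security_index(c, w_load=1.0, w_cascade=50.0):
    """Weighted sum of consequences; the weights are user-defined."""
    return w_load * c["load_loss_mw"] + w_cascade * c["num_cascades"]

for c in sorted(contingencies, key=security_index, reverse=True):
    print(f"{c['name']:25s}  index = {security_index(c):8.1f}")
```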
A Format for Phylogenetic Placements
Matsen, Frederick A.; Hoffman, Noah G.; Gallagher, Aaron; Stamatakis, Alexandros
2012-01-01
We have developed a unified format for phylogenetic placements, that is, mappings of environmental sequence data (e.g., short reads) into a phylogenetic tree. We are motivated to do so by the growing number of tools for computing and post-processing phylogenetic placements, and the lack of an established standard for storing them. The format is lightweight, versatile, extensible, and is based on the JSON format, which can be parsed by most modern programming languages. Our format is already implemented in several tools for computing and post-processing parsimony- and likelihood-based phylogenetic placements and has worked well in practice. We believe that establishing a standard format for analyzing read placements at this early stage will lead to a more efficient development of powerful and portable post-analysis tools for the growing applications of phylogenetic placement. PMID:22383988
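For readers unfamiliar with such placement files, the sketch below writes a small JSON document in the general spirit of the format (a tree with numbered edges, a field list, and per-read placement records). The exact keys and semantics here are illustrative; the published specification should be consulted for the normative definition.

```python
import json

# Illustrative structure only; field names follow the general idea of the format,
# not a verbatim copy of the published specification.
placement_doc = {
    "version": 3,
    "tree": "((A:0.1{0},B:0.2{1}):0.05{2},C:0.3{3}){4};",  # edge numbers in braces
    "fields": ["edge_num", "likelihood", "like_weight_ratio",
               "distal_length", "pendant_length"],
    "placements": [
        {"p": [[0, -1234.5, 0.87, 0.02, 0.11]], "nm": [["read_0001", 1]]},
    ],
    "metadata": {"invocation": "example only"},
}

with open("example.jplace", "w") as fh:
    json.dump(placement_doc, fh, indent=2)
```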
M. E. Miller; M. Billmire; W. J. Elliot; K. A. Endsley; P. R. Robichaud
2015-01-01
Preparation is key to utilizing Earth Observations and process-based models to support post-wildfire mitigation. Post-fire flooding and erosion can pose a serious threat to life, property and municipal water supplies. Increased runoff and sediment delivery due to the loss of surface cover and fire-induced changes in soil properties are of great concern. Remediation...
NASA Technical Reports Server (NTRS)
Chan, William M.; Rogers, Stuart E.; Nash, Steven M.; Buning, Pieter G.; Meakin, Robert
2005-01-01
Chimera Grid Tools (CGT) is a software package for performing computational fluid dynamics (CFD) analysis utilizing the Chimera-overset-grid method. For modeling flows with viscosity about geometrically complex bodies in relative motion, the Chimera-overset-grid method is among the most computationally cost-effective methods for obtaining accurate aerodynamic results. CGT contains a large collection of tools for generating overset grids, preparing inputs for computer programs that solve equations of flow on the grids, and post-processing of flow-solution data. The tools in CGT include grid editing tools, surface-grid-generation tools, volume-grid-generation tools, utility scripts, configuration scripts, and tools for post-processing (including generation of animated images of flows and calculating forces and moments exerted on affected bodies). One of the tools, denoted OVERGRID, is a graphical user interface (GUI) that serves to visualize the grids and flow solutions and provides central access to many other tools. The GUI facilitates the generation of grids for a new flow-field configuration. Scripts that follow the grid generation process can then be constructed to mostly automate grid generation for similar configurations. CGT is designed for use in conjunction with a computer-aided-design program that provides the geometry description of the bodies, and a flow-solver program.
Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val
1989-01-01
Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers.
Introducing GHOST: The Geospace/Heliosphere Observation & Simulation Tool-kit
NASA Astrophysics Data System (ADS)
Murphy, J. J.; Elkington, S. R.; Schmitt, P.; Wiltberger, M. J.; Baker, D. N.
2013-12-01
Simulation models of the heliospheric and geospace environments can provide key insights into the geoeffective potential of solar disturbances such as Coronal Mass Ejections and High Speed Solar Wind Streams. Advanced post-processing of the results of these simulations greatly enhances the utility of these models for scientists and other researchers. Currently, no supported centralized tool exists for performing these processing tasks. With GHOST, we introduce a toolkit for the ParaView visualization environment that provides a centralized suite of tools suited for Space Physics post-processing. Building on the work from the Center for Integrated Space Weather Modeling (CISM) Knowledge Transfer group, GHOST is an open-source tool suite for ParaView. The tool-kit plugin currently provides tools for reading LFM and Enlil data sets, and provides automated tools for data comparison with NASA's CDAweb database. As work progresses, many additional tools will be added and, through open-source collaboration, we hope to add readers for additional model types, as well as any additional tools deemed necessary by the scientific community. The ultimate goal of this work is to provide a complete Sun-to-Earth model analysis toolset.
Bahrami-Samani, Emad; Vo, Dat T.; de Araujo, Patricia Rosa; Vogel, Christine; Smith, Andrew D.; Penalva, Luiz O. F.; Uren, Philip J.
2014-01-01
Co- and post-transcriptional regulation of gene expression is complex and multi-faceted, spanning the complete RNA lifecycle from genesis to decay. High-throughput profiling of the constituent events and processes is achieved through a range of technologies that continue to expand and evolve. Fully leveraging the resulting data is non-trivial, and requires the use of computational methods and tools carefully crafted for specific data sources and often intended to probe particular biological processes. Drawing upon databases of information pre-compiled by other researchers can further elevate analyses. Within this review, we describe the major co- and post-transcriptional events in the RNA lifecycle that are amenable to high-throughput profiling. We place specific emphasis on the analysis of the resulting data, in particular the computational tools and resources available, as well as looking towards future challenges that remain to be addressed. PMID:25515586
NASA Technical Reports Server (NTRS)
Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.
1982-01-01
Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Day, John H. (Technical Monitor)
2000-01-01
Post-processing of data related to a GPS receiver test in a GPS simulator and test facility is an important step towards qualifying a receiver for space flight. Although the GPS simulator provides all the parameters needed to analyze a simulation, as well as excellent analysis tools on the simulator workstation, post-processing is not a GPS simulator or receiver function alone, and it must be planned as a separate pre-flight test program requirement. A GPS simulator is a critical resource, and it is desirable to move the pertinent test data off the simulator as soon as a test is completed. The receiver and simulator databases are used to extract the test data files for post-processing. These files are then usually moved from the simulator and receiver systems to a personal computer (PC) platform, where post-processing is typically done using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their functions are notoriously slow and are more often than not the bottleneck, even for short-duration simulator-based tests. There is a need to do post-processing faster, within an hour after test completion, including all required operations on the simulator and receiver to prepare and move off the post-processing files. This is especially significant in order to use feedback from the previous test for the next simulation setup, or to run near back-to-back simulation scenarios. Solving the post-processing timing problem is critical for the success of a pre-flight test program. Towards this goal, an approach was developed that allows post-processing to be sped up by an order of magnitude. It is based on improving the algorithm of the post-processing bottleneck function using a priori information that is specific to a GPS simulation application and using only the necessary volume of truth data. The presented post-processing scheme was used in support of a few successful space flight missions carrying GPS receivers.
Enhancing the science of the WFIRST coronagraph instrument with post-processing.
NASA Astrophysics Data System (ADS)
Pueyo, Laurent; WFIRST CGI data analysis and post-processing WG
2018-01-01
We summarize the results of a three-year effort investigating how to apply to the WFIRST coronagraph instrument (CGI) modern image analysis methods now routinely used with ground-based coronagraphs. In this work we quantify the gain associated with post-processing for WFIRST-CGI observing scenarios simulated between 2013 and 2017. We also show, based on simulations, that the spectrum of a planet can be confidently retrieved using these processing tools with an Integral Field Spectrograph. We then discuss our work using CGI experimental data and quantify coronagraph testbed post-processing gains. We finally introduce stability metrics that are simple to define and measure, and place useful lower and upper bounds on the achievable RDI post-processing contrast gain. We show that our bounds hold in the case of the testbed data.
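As a toy illustration of what an RDI post-processing contrast gain measures, the sketch below subtracts a least-squares-scaled reference frame from a science frame sharing the same quasi-static speckles and reports the reduction in residual scatter; the synthetic data and noise levels are invented and unrelated to the CGI simulations or testbed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy science and reference frames: a common (quasi-static) speckle pattern
# plus independent noise; purely illustrative of the RDI gain idea.
speckles  = rng.normal(0.0, 1.0, size=(64, 64))
science   = speckles + rng.normal(0.0, 0.1, size=(64, 64))
reference = speckles + rng.normal(0.0, 0.1, size=(64, 64))

# Scale the reference to minimize residuals in a least-squares sense, then subtract.
scale = np.sum(science * reference) / np.sum(reference**2)
residual = science - scale * reference

gain = np.std(science) / np.std(residual)
print(f"RDI post-processing contrast gain (toy example): {gain:.1f}x")
```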
Identifying the Machine Translation Error Types with the Greatest Impact on Post-editing Effort
Daems, Joke; Vandepitte, Sonia; Hartsuiker, Robert J.; Macken, Lieve
2017-01-01
Translation Environment Tools make translators’ work easier by providing them with term lists, translation memories and machine translation output. Ideally, such tools automatically predict whether it is more effortful to post-edit than to translate from scratch, and determine whether or not to provide translators with machine translation output. Current machine translation quality estimation systems heavily rely on automatic metrics, even though they do not accurately capture actual post-editing effort. In addition, these systems do not take translator experience into account, even though novices’ translation processes are different from those of professional translators. In this paper, we report on the impact of machine translation errors on various types of post-editing effort indicators, for professional translators as well as student translators. We compare the impact of MT quality on a product effort indicator (HTER) with that on various process effort indicators. The translation and post-editing process of student translators and professional translators was logged with a combination of keystroke logging and eye-tracking, and the MT output was analyzed with a fine-grained translation quality assessment approach. We find that most post-editing effort indicators (product as well as process) are influenced by machine translation quality, but that different error types affect different post-editing effort indicators, confirming that a more fine-grained MT quality analysis is needed to correctly estimate actual post-editing effort. Coherence, meaning shifts, and structural issues are shown to be good indicators of post-editing effort. The additional impact of experience on these interactions between MT quality and post-editing effort is smaller than expected. PMID:28824482
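HTER, the product effort indicator mentioned above, is essentially a translation edit rate computed between the raw MT output and its post-edited version. The sketch below computes a simplified word-level variant using plain edit distance (true TER/HTER also counts block shifts, which are omitted here); the sentences are made up.

```python
def word_edit_distance(hyp, ref):
    """Levenshtein distance over word tokens (insertions, deletions, substitutions).
    Note: true TER/HTER also counts block shifts, which this sketch omits."""
    m, n = len(hyp), len(ref)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if hyp[i - 1] == ref[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[m][n]

mt_output   = "the cat sat in the mat".split()
post_edited = "the cat sat on the mat".split()

hter = word_edit_distance(mt_output, post_edited) / len(post_edited)
print(f"approximate HTER = {hter:.3f}")  # 1 substitution / 6 reference words
```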
Updates to the CMAQ Post Processing and Evaluation Tools for 2016
In the spring of 2016, the evaluation tools distributed with the CMAQ model code were updated and new tools were added to the existing set of tools. Observation data files, compatible with the AMET software, were also made available on the CMAS website for the first time with the...
CFD Process Pre- and Post-processing Automation in Support of Space Propulsion
NASA Technical Reports Server (NTRS)
Dorney, Suzanne M.
2003-01-01
The use of Computational Fluid Dynamics or CFD has become standard practice in the design and analysis of the major components used for space propulsion. In an attempt to standardize and improve the CFD process a series of automated tools have been developed. Through the use of these automated tools the application of CFD to the design cycle has been improved and streamlined. This paper presents a series of applications in which deficiencies were identified in the CFD process and corrected through the development of automated tools.
The Western Aeronautical Test Range. Chapter 10 Tools
NASA Technical Reports Server (NTRS)
Knudtson, Kevin; Park, Alice; Downing, Robert; Sheldon, Jack; Harvey, Robert; Norcross, April
2011-01-01
The Western Aeronautical Test Range (WATR) staff at the NASA Dryden Flight Research Center is developing translation software called Chapter 10 Tools in response to challenges posed by post-flight processing of data files originating from various on-board digital recorders that follow the Range Commanders Council Inter-Range Instrumentation Group (IRIG) 106 Chapter 10 Digital Recording Standard but use differing interpretations of the Standard. The software will read the data files regardless of the vendor implementation of the source recorder, displaying data, identifying and correcting errors, and producing a data file that can be successfully processed post-flight.
Making practice transparent through e-portfolio.
Stewart, Sarah M
2013-12-01
Midwives are required to maintain a professional portfolio as part of their statutory requirements. Some midwives are using open social networking tools and processes to develop an e-portfolio. However, confidentiality of patient and client data and professional reputation have to be taken into consideration when using online public spaces for reflection. There is little evidence about how midwives use social networking tools for ongoing learning. It is uncertain how reflecting in an e-portfolio with an audience impacts on learning outcomes. This paper investigates ways in which reflective midwifery practice be carried out using e-portfolio in open, social networking platforms using collaborative processes. Using an auto-ethnographic approach I explored my e-portfolio and selected posts that had attracted six or more comments. I used thematic analysis to identify themes within the textual conversations in the posts and responses posted by readers. The analysis identified that my collaborative e-portfolio had four themes: to provide commentary and discuss issues; to reflect and process learning; to seek advice, brainstorm and process ideas for practice, projects and research, and provide evidence of professional development. E-portfolio using open social networking tools and processes is a viable option for midwives because it facilitates collaborative reflection and shared learning. However, my experience shows that concerns about what people think, and client confidentiality does impact on the nature of open reflection and learning outcomes. I conclude this paper with a framework for managing midwifery statutory obligations using online public spaces and social networking tools. Copyright © 2013 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Merry, Josh; Takeshita, Jennifer; Tweedy, Bryan; Burford, Dwight
2006-01-01
In this presentation, the results of a recent study on the effect of pin tool design for friction stir welding thin sheets (0.040") of aluminum alloys 2024 and 7075 are provided. The objective of this study was to investigate and document the effect of tool shoulder and pin diameter, as well as the presence of pin flutes, on the resultant microstructure and mechanical properties at both room temperature and cryogenic temperature. Specifically, the comparison between three tools will include: FSW process load analysis (tool forces required to fabricate the welds), Static Mechanical Properties (ultimate tensile strength, yield strength, and elongation), and Process window documenting the range of parameters that can be used with the three pin tools investigated. All samples were naturally aged for a period greater than 10 days. Prior research has shown 7075 may require post weld heat treatment. Therefore, an additional pair of room temperature and cryogenic temperature samples was post-weld aged to the 7075-T7 condition prior to mechanical testing.
Pre-genomic, genomic and post-genomic study of microbial communities involved in bioenergy.
Rittmann, Bruce E; Krajmalnik-Brown, Rosa; Halden, Rolf U
2008-08-01
Microorganisms can produce renewable energy in large quantities and without damaging the environment or disrupting food supply. The microbial communities must be robust and self-stabilizing, and their essential syntrophies must be managed. Pre-genomic, genomic and post-genomic tools can provide crucial information about the structure and function of these microbial communities. Applying these tools will help accelerate the rate at which microbial bioenergy processes move from intriguing science to real-world practice.
Tools to Develop or Convert MOVES Inputs
The following tools are designed to help users develop inputs to MOVES and post-process the output. With the release of MOVES2014, EPA strongly encourages state and local agencies to develop local inputs based on MOVES fleet and activity categories.
FEMME- post-Fire Emergency ManageMEnt tool.
NASA Astrophysics Data System (ADS)
Vieira, Diana; Serpa, Dalila; Rocha, João; Nunes, João; Keizer, Jacob
2017-04-01
Wildfires can have important impacts on hydrological and soil erosion processes in forest catchments, due to the destruction of vegetation cover and changes to soil properties. The processes involved, however, are non-linear and not fully understood. This has severely limited the understanding of the impacts of wildfires, and, as a consequence, current runoff-erosion models are poorly adapted to recently burned forest conditions. Furthermore, while post-fire forestry operations and, to a lesser extent, post-fire soil conservation measures are commonly applied, their hydrological and erosion impacts remain poorly known, hampering decision-making by land owners and managers. Past post-wildfire research in Portugal has involved simple adaptations of plot-scale runoff-erosion models to post-fire conditions. This follow-up study focusses on model adaptation to selected post-fire soil conservation measures. To this end, full stock is taken of various datasets collected by several past and ongoing research projects. The selected model is the Morgan-Morgan-Finney model (MMF; Morgan, 2001), which has already proved its suitability for post-fire conditions in Portugal (Vieira et al., 2010, 2014) as well as NW Spain (Fernández et al., 2010). The present results concern runoff and erosion under different burn severities and various post-fire mitigation treatments (mulch, hydromulch, needle cast, barriers), focussing on the plot and field scale. The results for both the first and the second year following the wildfire revealed good model efficiency, not only for burned and untreated conditions but also for burned and treated conditions. These results thus reinforce earlier findings that MMF is a suitable model for the envisaged post-fire soil erosion assessment tool, coined "FEMME". The data used for post-fire soil erosion calibration with the MMF already allow the delineation of the post-fire management FEMME tool. Nevertheless, further model assessment will address additional post-fire forestry operations (e.g., plowing) as well as upscaling to the catchment scale with the MMF model and comparing it with the SWAT model.
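Schematically, the MMF model predicts erosion as the minimum of total particle detachment and the flow's transport capacity. The sketch below shows only that structural skeleton with invented numbers; it is not the model's actual detachment or transport equations, nor the calibrated post-fire parameterization behind FEMME.

```python
# Schematic sketch of the Morgan-Morgan-Finney structure: predicted erosion is the
# minimum of total detachment and transport capacity. The inputs below are
# placeholders, not the model's equations or FEMME's calibrated post-fire values.

def mmf_erosion(detach_by_raindrop, detach_by_runoff, transport_capacity):
    """All quantities in the same units (e.g. kg/m2/yr)."""
    total_detachment = detach_by_raindrop + detach_by_runoff
    return min(total_detachment, transport_capacity)

# Hypothetical values for a burned, untreated plot vs. a mulched plot.
print("untreated:", mmf_erosion(detach_by_raindrop=2.4, detach_by_runoff=1.1,
                                transport_capacity=1.8))  # transport-limited
print("mulched:  ", mmf_erosion(detach_by_raindrop=0.6, detach_by_runoff=0.3,
                                transport_capacity=1.8))  # detachment-limited
```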
CBCT Post-Processing Tools to Manage the Progression of Invasive Cervical Resorption: A Case Report.
Vasconcelos, Karla de Faria; de-Azevedo-Vaz, Sergio Lins; Freitas, Deborah Queiroz; Haiter-Neto, Francisco
2016-01-01
This case report aimed to highlight the usefulness of cone beam computed tomography (CBCT) and its post-processing tools for the diagnosis, follow-up and treatment planning of invasive cervical resorption (ICR). A 16-year-old female patient was referred for periapical radiographic examination, which revealed an irregular but well demarcated radiolucency in the mandibular right central incisor. In addition, CBCT scanning was performed to distinguish between ICR and internal root resorption. After the diagnosis of ICR, the patient was advised to return shortly but did so only six years later. At that time, another CBCT scan was performed and CBCT registration and subtraction were done to document lesion progress. These imaging tools were able to show lesion progress and extent clearly and were fundamental for differential diagnosis and treatment decision.
Glemser, Philip A; Pfleiderer, Michael; Heger, Anna; Tremper, Jan; Krauskopf, Astrid; Schlemmer, Heinz-Peter; Yen, Kathrin; Simons, David
2017-03-01
The aim of this multi-reader feasibility study was to evaluate new post-processing CT imaging tools in rib fracture assessment of forensic cases by analyzing detection time and diagnostic accuracy. Thirty autopsy cases (20 with and 10 without rib fractures in autopsy) were randomly selected and included in this study. All cases received a native whole body CT scan prior to the autopsy procedure, which included dissection and careful evaluation of each rib. In addition to standard transverse sections (modality A), CT images were subjected to a reconstruction algorithm to compute axial labelling of the ribs (modality B) as well as "unfolding" visualizations of the rib cage (modality C, "eagle tool"). Three radiologists with different clinical and forensic experience who were blinded to autopsy results evaluated all cases in randomized order of modality and case. Each reader's rib fracture assessment was evaluated against autopsy and against a CT consensus read as the radiologic reference. A detailed evaluation of relevant test parameters revealed better agreement with the CT consensus read than with the autopsy. Modality C was the significantly quickest rib fracture detection modality despite slightly reduced statistical test parameters compared to modalities A and B. Modern CT post-processing software is able to shorten reading time and to increase sensitivity and specificity compared to standard autopsy alone. The easy-to-use eagle tool is suited for initial rib fracture screening prior to autopsy and can therefore be beneficial to forensic pathologists.
Contingency Analysis Post-Processing With Advanced Computing and Visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Glaesemann, Kurt; Fitzhenry, Erin
Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but not much work has been published on how to post-process the large amount of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool to help power engineers improve their work efficiency by fast information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.
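A minimal sketch of the kind of parallel post-processing described, assuming a hypothetical directory of per-contingency output files and a made-up one-violation-per-line text format; the real tool's file formats and summaries will differ.

```python
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

def summarize(result_file: Path) -> dict:
    """Parse one contingency output file and return a small summary record.
    The format assumed here (one 'violation' line per overloaded element) is hypothetical."""
    violations = 0
    for line in result_file.read_text().splitlines():
        if line.startswith("violation"):
            violations += 1
    return {"contingency": result_file.stem, "violations": violations}

if __name__ == "__main__":
    files = sorted(Path("ca_outputs").glob("*.txt"))
    with ProcessPoolExecutor() as pool:            # one worker per CPU core by default
        summaries = list(pool.map(summarize, files))
    # The summaries could then be serialized to JSON for a web-based visualization front end.
    worst = sorted(summaries, key=lambda s: s["violations"], reverse=True)[:10]
    print(worst)
```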
User-centered design in clinical handover: exploring post-implementation outcomes for clinicians.
Wong, Ming Chao; Cummings, Elizabeth; Turner, Paul
2013-01-01
This paper examines the outcomes for clinicians from their involvement in the development of an electronic clinical hand-over tool developed using principles of user-centered design. Conventional e-health post-implementation evaluations tend to emphasize technology-related (mostly positive) outcomes. More recently, unintended (mostly negative) consequences arising from the implementation of e-health technologies have also been reported. There remains limited focus on the post-implementation outcomes for users, particularly those directly involved in e-health design processes. This paper presents detailed analysis and insights into the outcomes experienced post-implementation by a cohort of junior clinicians involved in developing an electronic clinical handover tool in Tasmania, Australia. The qualitative methods used included observations, semi-structured interviews and analysis of clinical handover notes. Significantly, a number of unanticipated flow-on effects were identified that mitigated some of the challenges arising during the design and implementation of the tool. The paper concludes by highlighting the importance of identifying post-implementation user outcomes beyond conventional system adoption and use and also points to the need for more comprehensive evaluative frameworks to encapsulate these broader socio-technical user outcomes.
ERIC Educational Resources Information Center
Harmon, Marcel; Larroque, Andre; Maniktala, Nate
2012-01-01
The New Mexico Public School Facilities Authority (NMPSFA) is the agency responsible for administering state-funded capital projects for schools statewide. Post occupancy evaluation (POE) is the tool selected by NMPSFA for measuring project outcomes. The basic POE process for V. Sue Cleveland High School (VSCHS) consisted of a series of field…
Simulation Tools for Forest Health Analysis: An Application in the Red River Watershed, Idaho
Andrew J. McMahan; Eric L. Smith
2006-01-01
Software tools for landscape analyses--including FVS model extensions, and a number of FVS-related pre- and post-processing "tools"--are presented, using an analysis in the Red River Watershed, Nez Perce National Forest as an example. We present (1) a discussion of pre-simulation data analysis; (2) the Physiographic Information Extraction System (PIES), a tool that can...
Action learning: a tool for the development of strategic skills for Nurse Consultants?
Young, Sarah; Nixon, Eileen; Hinge, Denise; McFadyen, Jan; Wright, Vanessa; Lambert, Pauline; Pilkington, Carolyn; Newsome, Christine
2010-01-01
This paper will discuss the process of action learning and the outcomes of using action learning as a tool to achieve a more strategic function from Nurse Consultant posts. It is documented that one of the most challenging aspects of Nurse Consultant roles, in terms of leadership, is the strategic contribution they make at a senior corporate Trust level, often across organizations and local health economies. A facilitated action learning set was established in Brighton, England, to support the strategic leadership development of eight nurse consultant posts across two NHS Trusts. Benefits to patient care, with regard to patient pathways and cross-organizational working, have been evident outcomes associated with the nurse consultant posts involved in the action learning set. Commitment by organizational nurse leaders is essential to address the challenges facing nurse consultants in implementing change at strategic levels. Facilitated action learning has been a successful tool in developing the strategic skills of Nurse Consultant posts within this setting. Action learning sets may be successfully applied to a range of senior nursing posts with a strategic remit and may assist post holders in achieving better outcomes pertinent to their roles.
Stevenson, Katherine; Busch, Angela; Scott, Darlene J.; Henry, Carol; Wall, Patricia A.
2009-01-01
Objectives To develop and evaluate a classroom-based curriculum designed to promote interprofessional competencies by having undergraduate students from various health professions work together on system-based problems using quality improvement (QI) methods and tools to improve patient-centered care. Design Students from 4 health care programs (nursing, nutrition, pharmacy, and physical therapy) participated in an interprofessional QI activity. In groups of 6 or 7, students completed pre-intervention and post-intervention reflection tools on attitudes relating to interprofessional teams, and a tool designed to evaluate group process. Assessment One hundred thirty-four students (76.6%) completed both self-reflection instruments, and 132 (74.2%) completed the post-course group evaluation instrument. Although already high prior to the activity, students' mean post-intervention reflection scores increased for 12 of 16 items. Post-intervention group evaluation scores reflected a high level of satisfaction with the experience. Conclusion Use of a quality-based case study and QI methodology was an effective approach to enhancing interprofessional experiences among students. PMID:19657497
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Day, John H. (Technical Monitor)
2000-01-01
Post-processing of data related to a Global Positioning System (GPS) simulation is an important activity in qualification of a GPS receiver for space flight. Because a GPS simulator is a critical resource, it is desirable to move the pertinent simulation data off the simulator as soon as a test is completed. The simulator data files are usually moved to a Personal Computer (PC), where the post-processing of the receiver-logged measurements and solutions data and the simulated data is performed. Typically, post-processing is accomplished using PC-based commercial software languages and tools. Because of the generality of commercial software systems, their general-purpose functions are notoriously slow and more often than not are the bottleneck, even for short-duration experiments. For example, it may take 8 hours to post-process data from a 6-hour simulation. There is a need to do post-processing faster, especially in order to use the previous test results as feedback for the next simulation setup. This paper demonstrates that a fast software linear interpolation algorithm is applicable to a large class of engineering problems, like GPS simulation data post-processing, where computational time is a critical resource and is one of the most important considerations. An approach is developed that allows post-processing to be sped up by an order of magnitude. It is based on improving the post-processing bottleneck interpolation algorithm using a priori information that is specific to the GPS simulation application. The presented post-processing scheme was used in support of a few successful space flight missions carrying GPS receivers. A future approach to solving the post-processing performance problem using Field Programmable Gate Array (FPGA) technology is described.
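The paper does not spell out the interpolation algorithm itself, but one plausible reading is that the a priori information is a fixed, uniform simulator output time step, which lets each sample index be computed directly instead of searched for. The sketch below illustrates that idea with invented truth data and query epochs.

```python
import numpy as np

def interp_uniform(t_query, t0, dt, values):
    """Linear interpolation when samples are known a priori to be uniformly spaced.
    The sample index is computed directly from (t_query - t0) / dt, avoiding the
    per-query search that a general-purpose interpolator must perform."""
    idx = np.clip(((t_query - t0) // dt).astype(int), 0, len(values) - 2)
    frac = (t_query - (t0 + idx * dt)) / dt
    return values[idx] * (1.0 - frac) + values[idx + 1] * frac

# Hypothetical 1 Hz simulator truth data and 10 Hz receiver epochs.
t0, dt = 0.0, 1.0
truth = np.sin(np.arange(0.0, 3600.0, dt))      # any truth parameter
queries = np.arange(0.0, 3599.0, 0.1)
print(interp_uniform(queries, t0, dt, truth)[:5])
```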
1989-11-01
other design tools. RESULTS OF TEST/DEMONSTRATION: Training for the Design 4D Program was conducted at USACERL. Although nearly half of the test subjects had difficulty with the prompts, their understanding of the program improved after experimenting with the commands. After training, most felt...
fMRI paradigm designing and post-processing tools
James, Jija S; Rajesh, PG; Chandran, Anuvitha VS; Kesavadas, Chandrasekharan
2014-01-01
In this article, we first review some aspects of functional magnetic resonance imaging (fMRI) paradigm designing for major cognitive functions by using stimulus delivery systems like Cogent, E-Prime, Presentation, etc., along with their technical aspects. We also review the stimulus presentation possibilities (block, event-related) for visual or auditory paradigms and their advantage in both clinical and research settings. The second part mainly focuses on various fMRI data post-processing tools, such as Statistical Parametric Mapping (SPM) and Brain Voyager, and discusses the particulars of the various preprocessing steps involved (realignment, co-registration, normalization, smoothing) in these software packages, as well as the statistical analysis principles of General Linear Modeling for final interpretation of a functional activation result. PMID:24851001
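At the heart of the statistical analysis step is an ordinary least-squares fit of the General Linear Model at every voxel. The sketch below fits a GLM and a simple contrast t-map on synthetic data; the boxcar design (with no HRF convolution) and the dimensions are illustrative and not tied to SPM's or Brain Voyager's internals.

```python
import numpy as np

rng = np.random.default_rng(1)

n_scans, n_voxels = 120, 5000
# Design matrix: a boxcar task regressor plus an intercept (HRF convolution omitted
# for brevity; real packages build far richer designs).
task = np.tile(np.r_[np.zeros(10), np.ones(10)], 6)
X = np.column_stack([task, np.ones(n_scans)])

# Synthetic voxel time series: the first 200 voxels respond to the task.
beta_true = np.zeros((2, n_voxels))
beta_true[0, :200] = 2.0
Y = X @ beta_true + rng.normal(0, 1, size=(n_scans, n_voxels))

# Ordinary least-squares fit of the GLM: Y = X @ beta + error.
beta_hat, _, _, _ = np.linalg.lstsq(X, Y, rcond=None)

# t-statistic for the task regressor (contrast c = [1, 0]) at every voxel.
c = np.array([1.0, 0.0])
resid = Y - X @ beta_hat
dof = n_scans - np.linalg.matrix_rank(X)
sigma2 = np.sum(resid**2, axis=0) / dof
var_c = c @ np.linalg.inv(X.T @ X) @ c
t_map = (c @ beta_hat) / np.sqrt(sigma2 * var_c)
print("voxels exceeding |t| > 5:", int(np.sum(np.abs(t_map) > 5)))
```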
Assessment of the performance of electrode arrays using an image processing technique
NASA Astrophysics Data System (ADS)
Usman, N.; Khiruddin, A.; Nawawi, Mohd
2017-08-01
Interpreting inverted resistivity sections is time-consuming, tedious, and requires other sources of information to be geologically relevant. An image processing technique was used to perform post-inversion processing, which makes geophysical data interpretation easier. The inverted data sets were imported into PCI Geomatica 9.0.1 for further processing. The data sets were clipped and merged together in order to match the coordinates of the three layers and permit pixel-to-pixel analysis. The dipole-dipole array is more sensitive to resistivity variation with depth than the Wenner-Schlumberger and pole-dipole arrays. Image processing serves as a good post-inversion tool in geophysical data processing.
Tools for 3D scientific visualization in computational aerodynamics
NASA Technical Reports Server (NTRS)
Bancroft, Gordon; Plessel, Todd; Merritt, Fergus; Watson, Val
1989-01-01
The purpose is to describe the tools and techniques in use at the NASA Ames Research Center for performing visualization of computational aerodynamics, for example visualization of flow fields from computer simulations of fluid dynamics about vehicles such as the Space Shuttle. The hardware used for visualization is a high-performance graphics workstation connected to a supercomputer with a high-speed channel. At present, the workstation is a Silicon Graphics IRIS 3130, the supercomputer is a CRAY2, and the high-speed channel is a hyperchannel. The three techniques used for visualization are post-processing, tracking, and steering. Post-processing analysis is done after the simulation. Tracking analysis is done during a simulation but is not interactive, whereas steering analysis involves modifying the simulation interactively during the simulation. Using post-processing methods, a flow simulation is executed on a supercomputer and, after the simulation is complete, the results of the simulation are processed for viewing. The software in use and under development at NASA Ames Research Center for performing these types of tasks in computational aerodynamics is described. Workstation performance issues, benchmarking, and high-performance networks for this purpose are also discussed, as well as descriptions of other hardware for digital video and film recording.
Tool post modification allows easy turret lathe cutting-tool alignment
NASA Technical Reports Server (NTRS)
Fouts, L.
1966-01-01
Modified tool holder and tool post permit alignment of turret lathe cutting tools on the center of the spindle. The tool is aligned with the spindle by the holder which is kept in position by a hydraulic lock in feature of the tool post. The tool post is used on horizontal and vertical turret lathes and other engine lathes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tenney, J.L.
SARS is a data acquisition system designed to gather and process radar data from aircraft flights. A database of flight trajectories has been developed for Albuquerque, NM, and Amarillo, TX. The data is used for safety analysis and risk assessment reports. To support this database effort, Sandia developed a collection of hardware and software tools to collect and post process the aircraft radar data. This document describes the data reduction tools which comprise the SARS, and maintenance procedures for the hardware and software system.
ERIC Educational Resources Information Center
Diouf, Boucar; Rioux, Pierre
1999-01-01
Presents the rigor mortis process in brook charr (Salvelinus fontinalis) as a tool for better understanding skeletal muscle metabolism. Describes an activity that demonstrates how rigor mortis is related to the post-mortem decrease of muscular glycogen and ATP, how glycogen degradation produces lactic acid that lowers muscle pH, and how…
A Web simulation of medical image reconstruction and processing as an educational tool.
Papamichail, Dimitrios; Pantelis, Evaggelos; Papagiannis, Panagiotis; Karaiskos, Pantelis; Georgiou, Evangelos
2015-02-01
Web educational resources integrating interactive simulation tools provide students with an in-depth understanding of the medical imaging process. The aim of this work was the development of a purely Web-based, open access, interactive application, as an ancillary learning tool in graduate and postgraduate medical imaging education, including a systematic evaluation of learning effectiveness. The pedagogic content of the educational Web portal was designed to cover the basic concepts of medical imaging reconstruction and processing, through the use of active learning and motivation, including learning simulations that closely resemble actual tomographic imaging systems. The user can implement image reconstruction and processing algorithms under a single user interface and manipulate various factors to understand the impact on image appearance. A questionnaire for pre- and post-training self-assessment was developed and integrated in the online application. The developed Web-based educational application introduces the trainee in the basic concepts of imaging through textual and graphical information and proceeds with a learning-by-doing approach. Trainees are encouraged to participate in a pre- and post-training questionnaire to assess their knowledge gain. An initial feedback from a group of graduate medical students showed that the developed course was considered as effective and well structured. An e-learning application on medical imaging integrating interactive simulation tools was developed and assessed in our institution.
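Unfiltered backprojection is the kind of reconstruction algorithm such a teaching tool typically lets students experiment with. The sketch below forward-projects a square phantom and backprojects it with NumPy/SciPy, assuming parallel-beam geometry; it is a classroom illustration, not the application's actual code.

```python
import numpy as np
from scipy.ndimage import rotate

def forward_project(image, angles_deg):
    """Toy parallel-beam projector: rotate the image and sum along columns."""
    return np.array([rotate(image, -a, reshape=False, order=1).sum(axis=0)
                     for a in angles_deg])

def backproject(sinogram, angles_deg):
    """Unfiltered backprojection: smear each projection back across the image
    plane at its acquisition angle and accumulate (no ramp filtering, so the
    result is blurred -- which is exactly the teaching point)."""
    n = sinogram.shape[1]
    recon = np.zeros((n, n))
    for proj, a in zip(sinogram, angles_deg):
        recon += rotate(np.tile(proj, (n, 1)), a, reshape=False, order=1)
    return recon / len(angles_deg)

# Simple square phantom, 60 views over 180 degrees.
phantom = np.zeros((64, 64))
phantom[24:40, 24:40] = 1.0
angles = np.linspace(0.0, 180.0, 60, endpoint=False)
recon = backproject(forward_project(phantom, angles), angles)
print("reconstruction peak (blurred):", round(float(recon.max()), 2))
```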
AFM surface imaging of AISI D2 tool steel machined by the EDM process
NASA Astrophysics Data System (ADS)
Guu, Y. H.
2005-04-01
The surface morphology, surface roughness and micro-cracks of AISI D2 tool steel machined by the electrical discharge machining (EDM) process were analyzed by means of the atomic force microscopy (AFM) technique. Experimental results indicate that the surface texture after EDM is determined by the discharge energy during processing. An excellent machined finish can be obtained by setting the machine parameters at a low pulse energy. The surface roughness and the depth of the micro-cracks were proportional to the power input. Furthermore, the AFM application yielded information about the depth of the micro-cracks, which is particularly important in the post-treatment of AISI D2 tool steel machined by EDM.
Levy, Andrew E; Shah, Nishant R; Matheny, Michael E; Reeves, Ruth M; Gobbel, Glenn T; Bradley, Steven M
2018-04-25
Reporting standards promote clarity and consistency of stress myocardial perfusion imaging (MPI) reports, but do not require an assessment of post-test risk. Natural Language Processing (NLP) tools could potentially help estimate this risk, yet it is unknown whether reports contain adequate descriptive data to use NLP. Among VA patients who underwent stress MPI and coronary angiography between January 1, 2009 and December 31, 2011, 99 stress test reports were randomly selected for analysis. Two reviewers independently categorized each report for the presence of critical data elements essential to describing post-test ischemic risk. Few stress MPI reports provided a formal assessment of post-test risk within the impression section (3%) or the entire document (4%). In most cases, risk was determinable by combining critical data elements (74% impression, 98% whole). If ischemic risk was not determinable (25% impression, 2% whole), inadequate description of systolic function (9% impression, 1% whole) and inadequate description of ischemia (5% impression, 1% whole) were most commonly implicated. Post-test ischemic risk was determinable but rarely reported in this sample of stress MPI reports. This supports the potential use of NLP to help clarify risk. Further study of NLP in this context is needed.
The anatomy of E-Learning tools: Does software usability influence learning outcomes?
Van Nuland, Sonya E; Rogers, Kem A
2016-07-08
Reductions in laboratory hours have increased the popularity of commercial anatomy e-learning tools. It is critical to understand how the functionality of such tools can influence the mental effort required during the learning process, also known as cognitive load. Using dual-task methodology, two anatomical e-learning tools were examined to determine the effect of their design on cognitive load during two joint learning exercises. A.D.A.M. Interactive Anatomy is a simplistic, two-dimensional tool that presents like a textbook, whereas Netter's 3D Interactive Anatomy has a more complex three-dimensional usability that allows structures to be rotated. It was hypothesized that longer reaction times on an observation task would be associated with the more complex anatomical software (Netter's 3D Interactive Anatomy), indicating a higher cognitive load imposed by the anatomy software, which would result in lower post-test scores. Undergraduate anatomy students from Western University, Canada (n = 70) were assessed using a baseline knowledge test, Stroop observation task response times (a measure of cognitive load), mental rotation test scores, and an anatomy post-test. Results showed that reaction times and post-test outcomes were similar for both tools, whereas mental rotation test scores were positively correlated with post-test values when students used Netter's 3D Interactive Anatomy (P = 0.007), but not when they used A.D.A.M. Interactive Anatomy. This suggests that a simple e-learning tool, such as A.D.A.M. Interactive Anatomy, is as effective as more complicated tools, such as Netter's 3D Interactive Anatomy, and does not academically disadvantage those with poor spatial ability. Anat Sci Educ 9: 378-390. © 2015 American Association of Anatomists.
Mikalsen, Marius; Walderhaug, Ståle
2009-01-01
The objective of the study presented here was to perform an empirical investigation of factors affecting healthcare workers' acceptance and utilisation of e-learning in post-school healthcare education. E-learning benefits are realised when key features of e-learning are not only applied, but deemed useful, compatible with the learning process and supportive in order to reach the overall goals of the learning process. We conducted a survey of 14 state-enrolled nurses and skilled workers within the field of healthcare in Norway. The results show that perceived compatibility and subjective norm explain system usage of the e-learning tool amongst the students. We found that students' perception of the e-learning as compatible with the course in question had a positive effect on e-learning tool usage. We also found support for the notion that factors such as facilitating conditions and ease of use lead to the e-learning tool being considered useful.
Schoville, Benjamin J; Brown, Kyle S; Harris, Jacob A; Wilkins, Jayne
2016-01-01
The Middle Stone Age (MSA) is associated with early evidence for symbolic material culture and complex technological innovations. However, one of the most visible aspects of MSA technologies are unretouched triangular stone points that appear in the archaeological record as early as 500,000 years ago in Africa and persist throughout the MSA. How these tools were being used and discarded across a changing Pleistocene landscape can provide insight into how MSA populations prioritized technological and foraging decisions. Creating inferential links between experimental and archaeological tool use helps to establish prehistoric tool function, but is complicated by the overlaying of post-depositional damage onto behaviorally worn tools. Taphonomic damage patterning can provide insight into site formation history, but may preclude behavioral interpretations of tool function. Here, multiple experimental processes that form edge damage on unretouched lithic points from taphonomic and behavioral processes are presented. These provide experimental distributions of wear on tool edges from known processes that are then quantitatively compared to the archaeological patterning of stone point edge damage from three MSA lithic assemblages-Kathu Pan 1, Pinnacle Point Cave 13B, and Die Kelders Cave 1. By using a model-fitting approach, the results presented here provide evidence for variable MSA behavioral strategies of stone point utilization on the landscape consistent with armature tips at KP1, and cutting tools at PP13B and DK1, as well as damage contributions from post-depositional sources across assemblages. This study provides a method with which landscape-scale questions of early modern human tool-use and site-use can be addressed.
NASA Astrophysics Data System (ADS)
Zhang, Yunfei; Huang, Wen; Zheng, Yongcheng; Ji, Fang; Xu, Min; Duan, Zhixin; Luo, Qing; Liu, Qian; Xiao, Hong
2016-03-01
Zinc sulfide is a typical infrared optical material, commonly produced using single point diamond turning (SPDT). SPDT can efficiently produce zinc sulfide aspheric surfaces with micro-roughness and acceptable figure error. However, the tool marks left by the diamond turning process cause high micro-roughness that degrades the optical performance when the material is used in the visible region of the spectrum. Magnetorheological finishing (MRF) is a deterministic, sub-aperture polishing technology that is very helpful in improving both surface micro-roughness and surface figure. This paper mainly investigates the MRF technology of large-aperture off-axis aspheric optical surfaces for zinc sulfide. The topological structure and coordinate transformation of the MRF machine tool PKC1200Q2 are analyzed and its kinematics is calculated; then the post-processing algorithm model of MRF for an optical lens is established. Taking the post-processing of an off-axis aspheric surface as an example, a post-processing algorithm that can be used for a raster tool path is deduced and the errors produced by the approximate treatment are analyzed. A polishing algorithm for trajectory planning and dwell time based on a matrix equation and optimization theory is presented in this paper. Adopting this algorithm, an experiment was performed machining a large-aperture off-axis aspheric surface on the MRF machine developed by ourselves. After several polishing runs, the figure accuracy PV improved from 3.3λ to 2.0λ and RMS from 0.451λ to 0.327λ. This algorithm has also been used to polish other shapes, including spheres, aspheres, and prisms.
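The dwell-time problem described can be posed as a linear system r = A d (desired removal equals the influence matrix applied to the dwell times) solved under non-negativity constraints. The 1-D sketch below uses a made-up Gaussian influence function and target removal profile with SciPy's non-negative least squares; it is not the PKC1200Q2 machine's actual algorithm.

```python
import numpy as np
from scipy.optimize import nnls

# 1-D toy version of the dwell-time problem: desired removal r equals the
# influence (removal) function convolved with the dwell time d, written as r = A d.
n = 100
x = np.linspace(-1.0, 1.0, n)
target_removal = 0.5 + 0.3 * np.cos(np.pi * x)    # hypothetical figure error to remove

# Gaussian influence function (removal per unit dwell time at each offset) -- illustrative.
sigma = 0.08
offsets = x[:, None] - x[None, :]
A = np.exp(-0.5 * (offsets / sigma) ** 2)
A *= 0.01 / A.max()                                # peak removal rate, arbitrary units

dwell, residual_norm = nnls(A, target_removal)     # dwell times must be non-negative
print(f"total dwell time: {dwell.sum():.1f}, residual norm: {residual_norm:.3e}")
```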
Spencer, Jean L; Bhatia, Vivek N; Whelan, Stephen A; Costello, Catherine E; McComb, Mark E
2013-12-01
The identification of protein post-translational modifications (PTMs) is an increasingly important component of proteomics and biomarker discovery, but very few tools exist for performing fast and easy characterization of global PTM changes and differential comparison of PTMs across groups of data obtained from liquid chromatography-tandem mass spectrometry experiments. STRAP PTM (Software Tool for Rapid Annotation of Proteins: Post-Translational Modification edition) is a program that was developed to facilitate the characterization of PTMs using spectral counting and a novel scoring algorithm to accelerate the identification of differential PTMs from complex data sets. The software facilitates multi-sample comparison by collating, scoring, and ranking PTMs and by summarizing data visually. The freely available software (beta release) installs on a PC and processes data in protXML format obtained from files parsed through the Trans-Proteomic Pipeline. The easy-to-use interface allows examination of results at protein, peptide, and PTM levels, and the overall design offers tremendous flexibility that provides proteomics insight beyond simple assignment and counting.
NASA Astrophysics Data System (ADS)
Solecki, W. D.; Friedman, E. S.; Breitzer, R.
2016-12-01
Increasingly frequent extreme weather events are becoming an immediate priority for urban coastal practitioners and stakeholders, adding complexity to decisions concerning risk management for short-term action and long-term needs of city climate stakeholders. The conflict between the prioritization of short versus long-term events by decision-makers creates a disconnect between climate science and its applications. The Consortium for Climate Risk in the Urban Northeast (CCRUN), a NOAA RISA team, is developing a set of mechanisms to help bridge this gap. The mechanisms are designed to promote the application of climate science on extreme weather events and their aftermath. It is in the post-event policy window where significant opportunities for science-policy linkages exist. In particular, CCRUN is interested in producing actionable and useful information for city managers to use in decision-making processes surrounding extreme weather events and climate change. These processes include a sector-specific needs assessment survey instrument and two tools for urban coastal practitioners and stakeholders. The tools focus on post-event learning and connections between resilience and transformative adaptation. Elements of the two tools are presented. Post-event learning supports urban coastal practitioners and decision-makers concerned about maximizing opportunities for knowledge transfer and assimilation, and policy initiation and development following an extreme weather event. For the urban U.S. Northeast, post-event learning helps coastal stakeholders build the capacity to adapt to extreme weather events, and inform and develop their planning capacity through analysis of past actions and steps taken in response to Hurricane Sandy. Connecting resilience with transformative adaptation is intended to promote resilience in urban Northeast coastal settings to the long-term negative consequences of extreme weather events. This is done through a knowledge co-production engagement process that links innovative and flexible adaptation pathways that can address requirements for short-term action and long-term needs.
A service protocol for post-processing of medical images on the mobile device
NASA Astrophysics Data System (ADS)
He, Longjun; Ming, Xing; Xu, Lang; Liu, Qian
2014-03-01
With computing capability and display size growing, the mobile device has been used as a tool to help clinicians view patient information and medical images anywhere and anytime. However, it is difficult and time-consuming to transfer medical images with large data sizes from a picture archiving and communication system to a mobile client, since the wireless network is unstable and limited by bandwidth. Besides, limited by computing capability, memory and power endurance, it is hard to provide a satisfactory quality of experience for radiologists handling complex post-processing of medical images on the mobile device, such as real-time direct interactive three-dimensional visualization. In this work, remote rendering technology is employed to implement the post-processing of medical images instead of local rendering, and a service protocol is developed to standardize the communication between the render server and the mobile client. In order to make mobile devices with different platforms able to access post-processing of medical images, the Extensible Markup Language is used to describe this protocol, which contains four main parts: user authentication, medical image query/retrieval, 2D post-processing (e.g. window leveling, pixel value retrieval) and 3D post-processing (e.g. maximum intensity projection, multi-planar reconstruction, curved planar reformation and direct volume rendering). An instance was then implemented to verify the protocol. This instance allows the mobile device to access post-processing of medical image services on the render server via a client application or a web page.
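Note: as a rough illustration of an XML-described request in such a protocol, the sketch below builds a hypothetical 3D post-processing (multi-planar reconstruction) message in Python. The element and attribute names are invented for illustration and are not taken from the paper.

    import xml.etree.ElementTree as ET

    def build_mpr_request(session_token: str, study_uid: str, plane: str) -> bytes:
        """Build a hypothetical multi-planar reconstruction request message."""
        root = ET.Element("PostProcessingRequest", version="1.0")
        ET.SubElement(root, "Authentication", token=session_token)
        ET.SubElement(root, "Target", studyInstanceUID=study_uid)
        op = ET.SubElement(root, "Operation", type="MPR")    # a 3D post-processing operation
        ET.SubElement(op, "Plane").text = plane              # e.g. "sagittal"
        return ET.tostring(root, encoding="utf-8", xml_declaration=True)

    print(build_mpr_request("abc123", "1.2.840.999.1", "sagittal").decode())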
Fontan Surgical Planning: Previous Accomplishments, Current Challenges, and Future Directions.
Trusty, Phillip M; Slesnick, Timothy C; Wei, Zhenglun Alan; Rossignac, Jarek; Kanter, Kirk R; Fogel, Mark A; Yoganathan, Ajit P
2018-04-01
The ultimate goal of Fontan surgical planning is to provide additional insights into the clinical decision-making process. In its current state, surgical planning offers an accurate hemodynamic assessment of the pre-operative condition, provides anatomical constraints for potential surgical options, and produces decent post-operative predictions if boundary conditions are similar enough between the pre-operative and post-operative states. Moving forward, validation with post-operative data is a necessary step in order to assess the accuracy of surgical planning and determine which methodological improvements are needed. Future efforts to automate the surgical planning process will reduce the individual expertise needed and encourage use in the clinic by clinicians. As post-operative physiologic predictions improve, Fontan surgical planning will become a more effective tool to accurately model patient-specific hemodynamics.
NASA Astrophysics Data System (ADS)
Suckow, A. O.
2013-12-01
Measurements need post-processing to obtain results that are comparable between laboratories. Raw data may need to be corrected for blank, memory, drift (change of reference values with time), linearity (dependence of reference on signal height) and normalized to international reference materials. Post-processing parameters need to be stored for traceability of results. State-of-the-art stable isotope correction schemes are available based on MS Excel (Geldern and Barth, 2012; Gröning, 2011) or MS Access (Coplen, 1998). These are specialized to stable isotope measurements only, often only to the post-processing of a specific run. Embedding of the algorithms into a multipurpose database system was missing. This is necessary to combine results of different tracers (3H, 3He, 2H, 18O, CFCs, SF6...) or geochronological tools (sediment dating, e.g. with 210Pb or 137Cs), to relate to attribute data (submitter, batch, project, geographical origin, depth in core, well information etc.) and for further interpretation tools (e.g. lumped parameter modelling). Database sub-systems to the LabData laboratory management system (Suckow and Dumke, 2001) are presented for stable isotopes and for gas chromatographic CFC and SF6 measurements. The sub-system for stable isotopes allows the following post-processing: (1) automated import from measurement software (Isodat, Picarro, LGR); (2) correction for sample-to-sample memory, linearity and drift, and renormalization of the raw data. The sub-system for gas chromatography covers: (1) storage of all raw data; (2) storage of peak integration parameters; (3) correction for blank, efficiency and linearity. The user interface allows interactive and graphical control of the post-processing and all corrections by export to and plotting in MS Excel, and is a valuable tool for quality control. The sub-databases are integrated into LabData, a multi-user client-server architecture using MS SQL Server as back-end and an MS Access front-end, installed in four laboratories to date. Attribute data storage (unique ID for each subsample, origin, project context etc.) and laboratory management features are included. Export routines to Excel (depth profiles, time series, all possible tracer-versus-tracer plots...) and modelling capabilities are add-ons. The source code is public domain and available under the GNU General Public License (GNU-GPL). References: Coplen, T.B., 1998. A manual for a laboratory information management system (LIMS) for light stable isotopes. Version 7.0. USGS Open File Report 98-284. Geldern, R.v., Barth, J.A.C., 2012. Optimization of instrument setup and post-run corrections for oxygen and hydrogen stable isotope measurements of water by isotope ratio infrared spectroscopy (IRIS). Limnology and Oceanography: Methods 10, 1024-1036. Gröning, M., 2011. Improved water δ2H and δ18O calibration and calculation of measurement uncertainty using a simple software tool. Rapid Communications in Mass Spectrometry 25, 2711-2720. Suckow, A., Dumke, I., 2001. A database system for geochemical, isotope hydrological and geochronological laboratories. Radiocarbon 43, 325-337.
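Note: for readers unfamiliar with the corrections listed above, the sketch below shows the general shape of a linear drift correction followed by two-point normalization to reference materials (e.g. VSMOW/SLAP for water isotopes). It is a generic Python illustration, not the LabData implementation, and all numbers are invented.

    import numpy as np

    def correct_drift(delta_raw, run_time_hours, drift_per_hour):
        """Remove an assumed linear instrumental drift (per mil per hour)."""
        return np.asarray(delta_raw) - drift_per_hour * np.asarray(run_time_hours)

    def normalize_two_point(delta_measured, ref_measured, ref_true):
        """Two-point normalization: map measured reference values onto their accepted
        values and apply the same linear mapping to the sample measurements."""
        (m1, m2), (t1, t2) = ref_measured, ref_true
        slope = (t2 - t1) / (m2 - m1)
        return t1 + slope * (np.asarray(delta_measured) - m1)

    # Hypothetical run: three samples bracketed by two reference materials
    drift_corrected = correct_drift([-8.12, -7.95, -8.30], [1.0, 2.0, 3.0], drift_per_hour=0.01)
    normalized = normalize_two_point(drift_corrected, ref_measured=(0.21, -55.8), ref_true=(0.0, -55.5))
    print(normalized)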
Farrokh-Eslamlou, Hamidreza; Aghlmand, Siamak; Eslami, Mohammad; Homer, Caroline S E
2014-04-01
We investigated whether use of the World Health Organization's (WHO's) Decision-Making Tool (DMT) for Family Planning Clients and Providers would improve the process and outcome quality indicators of family planning (FP) services in Iran. The DMT was adapted for the Iranian setting. The study evaluated 24 FP quality key indicators grouped into two main areas, namely process and outcome. The tool was implemented in 52 urban and rural public health facilities in four selected and representative provinces of Iran. A pre-post methodology was undertaken to examine whether use of the tool improved the quality of FP services and client satisfaction with the services. Quantitative data were collected through observations of counselling and exit interviews with clients using structured questionnaires. Different numbers of FP clients were recruited during the baseline and the post-intervention rounds (n=448 vs 547, respectively). The DMT improved many client-provider interaction indicators, including verbal and non-verbal communication (p<0.05). The tool also impacted positively on the client's choice of contraceptive method, providers' technical competence, and quality of information provided to clients (p<0.05). Use of the tool improved the clients' satisfaction with FP services (from 72% to 99%; p<0.05). The adapted WHO's DMT has the potential to improve the quality of FP services.
2011-01-01
expensive post-weld machining; and (g) low environmental impact. However, some disadvantages of the FSW process have also been identified such as (a...material. Its density and thermal properties are next set to that of AISI-H13, a hot-worked tool steel, frequently used as the FSW-tool
2011-12-30
which reduces the need for expensive post-weld machining; and (g) low environmental impact. However, some disadvantages of the FSW process have also...next set to that of AISI-H13, a hot-worked tool steel, frequently used as the FSW-tool material (Ref 16). The work-piece material is assumed to be
M. E. Miller; William Elliot; M. Billmire; Pete Robichaud; K. A. Endsley
2016-01-01
Post-wildfire flooding and erosion can threaten lives, property and natural resources. Increased peak flows and sediment delivery due to the loss of surface vegetation cover and fire-induced changes in soil properties are of great concern to public safety. Burn severity maps derived from remote sensing data reflect fire-induced changes in vegetative cover and soil...
NASA Technical Reports Server (NTRS)
Wey, Thomas; Liu, Nan-Suey
2003-01-01
The overall objective of the current effort at NASA GRC is to evaluate, develop, and apply methodologies suitable for modeling intra-engine trace chemical changes over the post-combustor flow path relevant to the pollutant emissions from aircraft engines. At the present time, the focus is the high-pressure turbine environment. At first, the trace chemistry model of CNEWT was implemented into GLENN-HT as well as NCC. Then, CNEWT, CGLENN-HT, and NCC were applied to the trace species evolution in a cascade of Cambridge University's No. 2 rotor and in a turbine vane passage. In general, the results from these different codes exhibit similar features. However, the details of some of the quantities of interest can be sensitive to the differences between these codes. This report summarizes the implementation effort and presents the comparison of the No. 2 rotor results obtained from these different codes. The comparison of the turbine vane passage results is reported elsewhere. In addition to the implementation of the trace chemistry model into existing CFD codes, several pre/post-processing tools that can handle the manipulation of the geometry, the unstructured and structured grids, as well as the CFD solutions have also been enhanced and seamlessly tied with NCC, CGLENN-HT, and CNEWT. Thus, a complete CFD package consisting of pre/post-processing tools and flow solvers suitable for post-combustor intra-engine trace chemistry studies is assembled.
New Navigation Post-Processing Tools for Oceanographic Submersibles
NASA Astrophysics Data System (ADS)
Kinsey, J. C.; Whitcomb, L. L.; Yoerger, D. R.; Howland, J. C.; Ferrini, V. L.; Hegrenas, O.
2006-12-01
We report the development of Navproc, a new set of software tools for post-processing oceanographic submersible navigation data that exploits previously reported improvements in navigation sensing and estimation (e.g. Eos Trans. AGU, 84(46), Fall Meet. Suppl., Abstract OS32A-0225, 2003). The development of these tools is motivated by the need for post-processing software that allows users to compensate for errors in vehicle navigation, recompute the vehicle position, and then save the results for use with quantitative science data (e.g. bathymetric sonar data) obtained during the mission. Navproc does not provide real-time navigation or display of data, nor is it capable of high-resolution, three-dimensional (3D) data display. Navproc supports the ASCII data formats employed by the vehicles of the National Deep Submergence Facility (NDSF) operated by the Woods Hole Oceanographic Institution (WHOI). Post-processing of navigation data with Navproc comprises three tasks. First, data are converted from the logged ASCII file to a binary Matlab file. When loaded into Matlab, each sensor has a data structure containing the time-stamped data sampled at the native update rate of the sensor. An additional structure contains the real-time vehicle navigation data. Second, the data can be displayed using a Graphical User Interface (GUI), allowing users to visually inspect the quality of the data and graphically extract portions of the data. Third, users can compensate for errors in the real-time vehicle navigation. Corrections include: (i) manual filtering and median filtering of long baseline (LBL) ranges; (ii) estimation of the Doppler/gyro alignment using previously reported methodologies; and (iii) sound velocity, tide, and LBL transponder corrections. Using these corrections, the Doppler and LBL positions can be recomputed to provide improved estimates of the vehicle position compared to those computed in real time. The data can be saved in either binary or ASCII formats, allowing them to be merged with quantitative scientific data, such as bathymetric data. Navproc is written in the Matlab programming language, and is supported under the Windows, Macintosh, and Unix operating systems. To date, Navproc has been employed for post-processing data from the DSV Alvin Human Occupied Vehicle (HOV), the Jason II/Medea Remotely Operated Vehicle (ROV), and the ABE, Seabed, and Sentry Autonomous Underwater Vehicles (AUVs).
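Note: one of the corrections mentioned above, median filtering of long-baseline (LBL) ranges to reject acoustic outliers, can be sketched generically as below. The kernel size and rejection threshold are assumptions for illustration; this is not the Navproc code (which is written in Matlab).

    import numpy as np
    from scipy.ndimage import median_filter

    def reject_lbl_outliers(ranges_m, kernel=5, max_dev_m=10.0):
        """Return a boolean mask keeping LBL ranges that stay within max_dev_m
        of a running median (ranges_m: slant ranges to one transponder, metres)."""
        ranges_m = np.asarray(ranges_m, dtype=float)
        baseline = median_filter(ranges_m, size=kernel, mode="nearest")
        return np.abs(ranges_m - baseline) <= max_dev_m

    ranges = np.array([812.4, 813.1, 950.0, 813.8, 814.2, 815.0, 640.3, 815.9])
    mask = reject_lbl_outliers(ranges)
    print(ranges[mask])   # the spikes at 950.0 and 640.3 are rejected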
Probabilistic soil erosion modeling using the Erosion Risk Management Tool (ERMIT) after wildfires
P. R. Robichaud; W. J. Elliot; J. W. Wagenbrenner
2011-01-01
The decision of whether or not to apply post-fire hillslope erosion mitigation treatments, and if so, where these treatments are most needed, is a multi-step process. Land managers must assess the risk of damaging runoff and sediment delivery events occurring on the unrecovered burned hillslope. We developed the Erosion Risk Management Tool (ERMiT) to address this need...
NASA Astrophysics Data System (ADS)
Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo
2017-04-01
In situ time-series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time-series. The tool consists of an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools and calibration capacity.
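Note: a typical pre-processing step of the kind described, regularizing an irregular sensor time-series and filling short gaps before it is used as calibration observations, can be sketched with pandas as below. The resampling interval and gap limit are assumptions for illustration, not OAT defaults.

    import pandas as pd

    def regularize(series: pd.Series, freq: str = "1h", max_gap: int = 3) -> pd.Series:
        """Resample an irregular time-series to a fixed interval and linearly
        interpolate gaps of at most `max_gap` consecutive steps."""
        regular = series.resample(freq).mean()
        return regular.interpolate(method="linear", limit=max_gap)

    # Hypothetical groundwater-head observations at irregular times
    times = pd.to_datetime(["2017-04-01 00:05", "2017-04-01 01:10",
                            "2017-04-01 04:55", "2017-04-01 06:02"])
    obs = pd.Series([101.32, 101.30, 101.21, 101.18], index=times, name="head_m")
    print(regularize(obs))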
Hatler, Carol W; Grove, Charlene; Strickland, Stephanie; Barron, Starr; White, Bruce D
2012-01-01
Many critically ill patients in intensive care units (ICUs) are unable to communicate their wishes about goals of care, particularly about the use of life-sustaining treatments. Surrogates and clinicians struggle with medical decisions because of a lack of clarity regarding patients' preferences, leading to prolonged hospitalizations and increased costs. This project focused on the development and implementation of a tool to facilitate a better communication process by (1) assuring the early identification of a surrogate if indicated on admission and (2) clarifying the decision-making standards that the surrogate was to use when participating in decision making. Before introducing the tool into the admissions routine, the staff were educated about its use and value to the decision-making process. PROJECT AND METHODS: The study aimed to determine whether early use of a simple method of identifying a patient's surrogate and treatment preferences might impact length of stay (LOS) and total hospital charges. A pre- and post-intervention study design was used. Nurses completed the surrogacy information tool for all patients upon admission to the neuroscience ICU. Subjects (total N = 203) were critically ill patients who had been on a mechanical ventilator for 96 hours or longer, or in the ICU for seven days or longer. The project included staff education on biomedical ethics, critical communication skills, early identification of families and staff in crisis, and use of a simple tool to document patients' surrogates and previously expressed care wishes. Data on hospital LOS and hospital charges were collected through a retrospective review of medical records for similar four-month time frames pre- and post-implementation of the assessment tool. Significant differences were found between pre- and post-groups in terms of hospital LOS (F = 6.39, p = .01) and total hospital charges (F = 7.03, p = .009). Project findings indicate that the use of a simple admission assessment tool, supported by staff education about its completion, use, and available resources, can decrease LOS and lower total hospital charges. The reasons for the difference between the pre- and post-intervention groups remain unclear. Further research is needed to evaluate if the quality of communications between patients, their legally authorized representatives, and clinicians--as suggested in the literature--may have played a role in decreasing LOS and total hospital charges.
Additive Manufacturing of Tooling for Refrigeration Cabinet Foaming Processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Post, Brian K; Nuttall, David; Cukier, Michael
The primary objective of this project was to leverage the Big Area Additive Manufacturing (BAAM) process and materials into a long-term, quick-change tooling concept to drastically reduce product lead and development timelines and costs. Current refrigeration foam molds are complicated to manufacture, involving casting several aluminum parts in an approximate shape, machining components of the molds, and post-fitting and shimming of the parts in an articulated fixture. The total process timeline can take over 6 months. The foaming process is slower than required for production; therefore multiple fixtures, 10 to 27, are required per refrigerator model. Molds are particular to a specific product configuration, making mixed model assembly challenging for sequencing, mold changes or auto changeover features. The initial goal was to create a tool leveraging the ORNL materials and additive process to build a tool in 4 to 6 weeks or less. A secondary goal was to create common fixture cores and provide lightweight fixture sections that could be revised in a very short time to increase equipment flexibility, reduce lead times, lower the barriers to first production trials, and reduce tooling costs.
Online tools for nucleosynthesis studies
NASA Astrophysics Data System (ADS)
Göbel, K.; Glorius, J.; Koloczek, A.; Pignatari, M.; Plag, R.; Reifarth, R.; Ritter, C.; Schmidt, S.; Sonnabend, K.; Thomas, B.; Travaglio, C.
2018-01-01
The nucleosynthesis of the elements between iron and uranium involves many different astrophysical scenarios covering wide ranges of temperatures and densities. Thousands of nuclei and tens of thousands of reaction rates have to be included in the corresponding simulations. We investigate the impact of single rates on the predicted abundance distributions with post-processing nucleosynthesis simulations. We present online tools, which allow the investigation of sensitivities and integrated mass fluxes in different astrophysical scenarios.
Hovick, Shelly R; Bevers, Therese B; Vidrine, Jennifer Irvin; Kim, Stephanie; Dailey, Phokeng M; Jones, Lovell A; Peterson, Susan K
2017-03-01
Online cancer risk assessment tools, which provide personalized cancer information and recommendations based on personal data input by users, are a promising cancer education approach; however, few tools have been evaluated. A randomized controlled study was conducted to compare user impressions of one tool, Cancer Risk Check (CRC), to non-personalized educational information delivered online as series of self-advancing slides (the control). CRC users (N = 1452) rated the tool to be as interesting as the control (p > .05), but users were more likely to report that the information was difficult to understand and not applicable to them (p < .05). Information seeking and sharing also were lower among CRC users; thus, although impressions of CRC were favorable, it was not shown to be superior to existing approaches. We hypothesized CRC was less effective because it contained few visual and graphical elements; therefore, CRC was compared to a text-based control (online PDF file) post hoc. CRC users rated the information to be more interesting, less difficult to understand, and better able to hold their attention (p < .05). Post hoc results suggest the visual presentation of risk is critical to tool success.
Combining Simulation Tools for End-to-End Trajectory Optimization
NASA Technical Reports Server (NTRS)
Whitley, Ryan; Gutkowski, Jeffrey; Craig, Scott; Dawn, Tim; Williams, Jacobs; Stein, William B.; Litton, Daniel; Lugo, Rafael; Qu, Min
2015-01-01
Trajectory simulations with advanced optimization algorithms are invaluable tools in the process of designing spacecraft. Due to the need for complex models, simulations are often highly tailored to the needs of the particular program or mission. NASA's Orion and SLS programs are no exception. While independent analyses are valuable to assess individual spacecraft capabilities, a complete end-to-end trajectory from launch to splashdown maximizes potential performance and ensures a continuous solution. In order to obtain end-to-end capability, Orion's in-space tool (Copernicus) was made to interface directly with the SLS's ascent tool (POST2) and a new tool to optimize the full problem by operating both simulations simultaneously was born.
The Use of AMET & Automated Scripts for Model Evaluation
Brief overview of EPA’s new CMAQ website to be launched publicly in June 2017. Details on the upcoming release of the Atmospheric Model Evaluation Tool (AMET) and the creation of automated scripts for post-processing and evaluating air quality model data.
ESTEST: An Open Science Platform for Electronic Structure Research
ERIC Educational Resources Information Center
Yuan, Gary
2012-01-01
Open science platforms in support of data generation, analysis, and dissemination are becoming indispensible tools for conducting research. These platforms use informatics and information technologies to address significant problems in open science data interoperability, verification & validation, comparison, analysis, post-processing,…
A Rule-Based Modeling for the Description of Flexible and Self-healing Business Processes
NASA Astrophysics Data System (ADS)
Boukhebouze, Mohamed; Amghar, Youssef; Benharkat, Aïcha-Nabila; Maamar, Zakaria
In this paper we discuss the importance of ensuring that business processes are robust and agile at the same time. To this end, we consider reviewing the way business processes are managed. For instance, we consider offering a flexible way to model processes so that changes in regulations are handled through self-healing mechanisms. These changes may raise exceptions at run-time if not properly reflected in the processes. To this end we propose a new rule-based model that adopts ECA rules and is built upon formal tools. The business logic of a process can be summarized as a set of rules that implement an organization's policies. Each business rule is formalized using our ECAPE formalism (Event-Condition-Action-Post condition-Post event). This formalism allows a process to be translated into a graph of rules that is analyzed in terms of reliability and flexibility.
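Note: a minimal reading of the ECAPE idea (event, condition, action, plus a post-condition and a follow-up event) can be sketched as a plain data structure. The field names and the toy order-handling policy below are illustrative assumptions, not the paper's formalism.

    from dataclasses import dataclass
    from typing import Callable, Optional

    @dataclass
    class EcapeRule:
        """Illustrative Event-Condition-Action-Postcondition-PostEvent rule."""
        event: str                              # event name that triggers the rule
        condition: Callable[[dict], bool]       # guard evaluated on the process context
        action: Callable[[dict], None]          # business action to execute
        post_condition: Callable[[dict], bool]  # check that the action left a valid state
        post_event: Optional[str] = None        # event raised afterwards (chains rules)

    def fire(rules, event, context):
        """Fire every rule matching an event; raise follow-up events recursively."""
        for rule in rules:
            if rule.event == event and rule.condition(context):
                rule.action(context)
                if not rule.post_condition(context):
                    raise RuntimeError(f"post-condition failed for event {event!r}")
                if rule.post_event:
                    fire(rules, rule.post_event, context)

    # Hypothetical policy: validate an incoming order, then ship it
    rules = [
        EcapeRule("order_received", lambda c: c["amount"] > 0,
                  lambda c: c.update(status="validated"),
                  lambda c: c["status"] == "validated", "order_validated"),
        EcapeRule("order_validated", lambda c: True,
                  lambda c: c.update(status="shipped"),
                  lambda c: c["status"] == "shipped"),
    ]
    ctx = {"amount": 42, "status": "new"}
    fire(rules, "order_received", ctx)
    print(ctx["status"])   # shipped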
NASA Astrophysics Data System (ADS)
Wi, S.; Ray, P. A.; Brown, C.
2015-12-01
A software package developed to facilitate building distributed hydrologic models in a modular modeling system is presented. The software package provides a user-friendly graphical user interface that eases its practical use in water resources-related research and practice. The modular modeling system organizes the options available to users when assembling models according to the stages of hydrological cycle, such as potential evapotranspiration, soil moisture accounting, and snow/glacier melting processes. The software is intended to be a comprehensive tool that simplifies the task of developing, calibrating, validating, and using hydrologic models through the inclusion of intelligent automation to minimize user effort, and reduce opportunities for error. Processes so far automated include the definition of system boundaries (i.e., watershed delineation), climate and geographical input generation, and parameter calibration. Built-in post-processing toolkits greatly improve the functionality of the software as a decision support tool for water resources system management and planning. Example post-processing toolkits enable streamflow simulation at ungauged sites with predefined model parameters, and perform climate change risk assessment by means of the decision scaling approach. The software is validated through application to watersheds representing a variety of hydrologic regimes.
Balancing geo-privacy and spatial patterns in epidemiological studies.
Chen, Chien-Chou; Chuang, Jen-Hsiang; Wang, Da-Wei; Wang, Chien-Min; Lin, Bo-Cheng; Chan, Ta-Chien
2017-11-08
To balance the protection of geo-privacy and the accuracy of spatial patterns, we developed a geo-spatial tool (GeoMasker) intended to mask the residential locations of patients or cases in a geographic information system (GIS). To elucidate the effects of geo-masking parameters, we applied 2010 dengue epidemic data from Taiwan, testing the tool's performance in an empirical situation. The similarity of pre- and post-masking spatial patterns was measured by D statistics under a 95% confidence interval. In the empirical study, different magnitudes of anonymisation (estimated K-anonymity ≥ 10 and ≥ 100) were achieved and different degrees of agreement between the pre- and post-masking patterns were evaluated. The application is beneficial for public health workers and researchers when processing data with individuals' spatial information.
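Note: geographic masking of this kind is often implemented as a random displacement whose maximum radius is scaled so that enough other households fall within the displacement area ("donut" masking, one common route to K-anonymity). The sketch below illustrates that generic idea; the parameters and the density-based radius rule are assumptions, not the GeoMasker algorithm.

    import numpy as np

    def donut_mask(x, y, r_min, r_max, rng=None):
        """Displace a point by a random offset with radius in [r_min, r_max]
        (same length units as x and y), uniform over the annulus area."""
        rng = rng or np.random.default_rng()
        theta = rng.uniform(0.0, 2.0 * np.pi)
        r = np.sqrt(rng.uniform(r_min ** 2, r_max ** 2))
        return x + r * np.cos(theta), y + r * np.sin(theta)

    def radius_for_k_anonymity(k, household_density):
        """Smallest radius whose disc is expected to contain k households,
        assuming a locally uniform density (households per unit area)."""
        return np.sqrt(k / (np.pi * household_density))

    # Hypothetical case: projected coordinates in metres, 250 households per km^2
    r_max = radius_for_k_anonymity(k=100, household_density=250 / 1_000_000)
    print(donut_mask(302145.0, 2771830.0, r_min=50.0, r_max=r_max))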
NASA Astrophysics Data System (ADS)
Johnson, Daniel; Huerta, E. A.; Haas, Roland
2018-01-01
Numerical simulations of Einstein’s field equations provide unique insights into the physics of compact objects moving at relativistic speeds, and which are driven by strong gravitational interactions. Numerical relativity has played a key role to firmly establish gravitational wave astrophysics as a new field of research, and it is now paving the way to establish whether gravitational wave radiation emitted from compact binary mergers is accompanied by electromagnetic and astro-particle counterparts. As numerical relativity continues to blend in with routine gravitational wave data analyses to validate the discovery of gravitational wave events, it is essential to develop open source tools to streamline these studies. Motivated by our own experience as users and developers of the open source, community software, the Einstein Toolkit, we present an open source, Python package that is ideally suited to monitor and post-process the data products of numerical relativity simulations, and compute the gravitational wave strain at future null infinity in high performance environments. We showcase the application of this new package to post-process a large numerical relativity catalog and extract higher-order waveform modes from numerical relativity simulations of eccentric binary black hole mergers and neutron star mergers. This new software fills a critical void in the arsenal of tools provided by the Einstein Toolkit consortium to the numerical relativity community.
Novel Overhang Support Designs for Powder-Based Electron Beam Additive Manufacturing (EBAM)
NASA Technical Reports Server (NTRS)
Nabors, Sammy A.
2014-01-01
NASA Marshall Space Flight Center, in collaboration with the University of Alabama, has developed a contact-free support structure used to fabricate overhang-type geometries via EBAM. The support structure is used for 3-D metal-printed components for the aerospace, automotive, biomedical and other industries. Current techniques use support structures to address deformation challenges inherent in 3-D metal printing. However, these structures (overhangs) are bonded to the component and need to be removed in post-processing using a mechanical tool. This new technology improves the overhang support structure design for components by eliminating associated geometric defects and post-processing requirements.
DOE Office of Scientific and Technical Information (OSTI.GOV)
A. Alfonsi; C. Rabiti; D. Mandelli
The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed Thermal-Hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities: (1) derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; (2) perform both Monte-Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; (3) facilitate the input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.
Cottet, P; d'Hollander, A; Cahana, A; Van Gessel, E; Tassaux, D
2013-10-01
In the healthcare domain, different analytic tools focused on accidents appeared to be poorly adapted to sub-accidental issues. Improving local management and intra-institutional communication with simpler methods, allowing rapid and uncomplicated meta-reporting, could be an attractive alternative. A process-centered structure derived from the industrial domain - DEPOSE(E) - was selected and modified for use in the healthcare domain. The seven exclusive meta-categories defined - Patient, Equipment, Process, Actor, Supplies, work Room and Organization - constitute 7CARECAT™. A collection of 536 "improvement" reports from a tertiary hospital post-anesthesia care unit (PACU) was used, and four meta-categorization rules were edited prior to the analysis. Both the relevance of the meta-categories and of the rules were tested to build a meta-reporting methodology. The distribution of these categories was analyzed with a χ2 test. Five hundred and ninety independent facts were collected from the 536 reports. The frequencies of the categories are: Organization 44%, Actor 37%, Patient 11%, Process 3%, work Room 3%, Equipment 1% and Supplies 1%, with a p-value <0.005 (χ2). During the analysis, three more rules were edited. The reproducibility, tested randomly on 200 reports, showed a <2% error rate. This meta-reporting methodology, developed with the 7CARECAT™ structure and using a reduced number of operational rules, has successfully produced a stable and consistent classification of sub-accidental events voluntarily reported. This model represents a relevant tool for exchanging meta-information important for local and transversal communication in healthcare institutions. It could be used as a promising tool to improve quality and risk management. Copyright © 2013. Published by Elsevier SAS.
Assessing Subjectivity in Sensor Data Post Processing via a Controlled Experiment
NASA Astrophysics Data System (ADS)
Jones, A. S.; Horsburgh, J. S.; Eiriksson, D.
2017-12-01
Environmental data collected by in situ sensors must be reviewed to verify validity, and conducting quality control often requires making edits in post processing to generate approved datasets. This process involves decisions by technicians, data managers, or data users on how to handle problematic data. Options include: removing data from a series, retaining data with annotations, and altering data based on algorithms related to adjacent data points or the patterns of data at other locations or of other variables. Ideally, given the same dataset and the same quality control guidelines, multiple data quality control technicians would make the same decisions in data post processing. However, despite the development and implementation of guidelines aimed at ensuring consistent quality control procedures, we have faced ambiguity when performing post processing, and we have noticed inconsistencies in the practices of individuals performing quality control post processing. Technicians with the same level of training and using the same input datasets may produce different results, affecting the overall quality and comparability of finished data products. Different results may also be produced by technicians who do not have the same level of training. In order to assess the effect of subjective decision making by the individual technician on the end data product, we designed an experiment in which multiple users performed quality control post processing on the same datasets using a consistent set of guidelines, field notes, and tools. We also assessed the effect of technician experience and training by conducting the same procedures with a group of novices unfamiliar with the data and the quality control process and compared their results to those generated by a group of more experienced technicians. In this presentation, we report our observations of the degree of subjectivity in sensor data post processing, assessing and quantifying the impacts of the individual technician as well as of technician experience on quality-controlled data products.
Micromechanics Analysis Code Post-Processing (MACPOST) User Guide. 1.0
NASA Technical Reports Server (NTRS)
Goldberg, Robert K.; Comiskey, Michele D.; Bednarcyk, Brett A.
1999-01-01
As advanced composite materials have gained wider usage, the need for analytical models and computer codes to predict the thermomechanical deformation response of these materials has increased significantly. Recently, a micromechanics technique called the generalized method of cells (GMC) has been developed, which has the capability to fulfill this goal. To provide a framework for GMC, the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) has been developed. As MAC/GMC has been updated, significant improvements have been made to the post-processing capabilities of the code. Through the MACPOST program, which operates directly within the MSC/PATRAN graphical pre- and post-processing package, a direct link between the analysis capabilities of MAC/GMC and the post-processing capabilities of MSC/PATRAN has been established. MACPOST has simplified the production, printing, and exportation of results for unit cells analyzed by MAC/GMC. MACPOST allows different micro-level quantities to be plotted quickly and easily in contour plots. In addition, meaningful data for X-Y plots can be examined. MACPOST thus serves as an important analysis and visualization tool for the macro- and micro-level data generated by MAC/GMC. This report serves as the user's manual for the MACPOST program.
Morey, Diane J
2012-01-01
The purpose of this study was to evaluate the effectiveness of Web-based animated pedagogical agents on critical thinking among nursing students. A pedagogical agent or virtual character provides a possible innovative tool for critical thinking through active engagement of students by asking questions and providing feedback about a series of nursing case studies. This mixed methods experimental study used a pretest, posttest design with a control group. ANCOVA demonstrated no significant difference between the groups on the Critical Thinking Process Test. Pre- and post-think-alouds were analyzed using a rating tool and rubric for the presence of eight cognitive processes, level of critical thinking, and for accuracy of nursing diagnosis, conclusions, and evaluation. Chi-square analyses for each group revealed a significant difference for improvement of the critical thinking level and correct conclusions from pre-think-aloud to post-think-aloud, but only the pedagogical agent group had a significant result for appropriate evaluations.
Brown, James A L
2016-05-06
A pedagogic intervention, in the form of an inquiry-based peer-assisted learning project (as a practical student-led bioinformatics module), was assessed for its ability to increase students' engagement, practical bioinformatic skills and process-specific knowledge. Elements assessed were process-specific knowledge following module completion, qualitative student-based module evaluation and the novelty, scientific validity and quality of written student reports. Bioinformatics is often the starting point for laboratory-based research projects; therefore, high importance was placed on allowing students to individually develop and apply processes and methods of scientific research. Students led a bioinformatic inquiry-based project (within a framework of inquiry), discovering, justifying and exploring individually discovered research targets. Detailed assessable reports were produced, displaying data generated and the resources used. Mimicking research settings, undergraduates were divided into small collaborative groups, with distinctive central themes. The module was evaluated by assessing the quality and originality of the students' targets through reports, reflecting students' use and understanding of concepts and tools required to generate their data. Furthermore, evaluation of the bioinformatic module was assessed semi-quantitatively using pre- and post-module quizzes (a non-assessable activity, not contributing to their grade), which incorporated process- and content-specific questions (indicative of their use of the online tools). Qualitative assessment of the teaching intervention was performed using post-module surveys, exploring student satisfaction and other module-specific elements. Overall, a positive experience was found, as was a post-module increase in correct process-specific answers. In conclusion, an inquiry-based peer-assisted learning module increased students' engagement, practical bioinformatic skills and process-specific knowledge. © 2016 by The International Union of Biochemistry and Molecular Biology, 44:304-313, 2016.
Physics-based process model approach for detecting discontinuity during friction stir welding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shrivastava, Amber; Pfefferkorn, Frank E.; Duffie, Neil A.
2015-02-12
The goal of this work is to develop a method for detecting the creation of discontinuities during friction stir welding. This in situ weld monitoring method could significantly reduce the need for post-process inspection. A process force model and a discontinuity force model were created based on the state-of-the-art understanding of flow around a friction stir welding (FSW) tool. These models are used to predict the FSW forces and the size of discontinuities formed in the weld. Friction stir welds with discontinuities and welds without discontinuities were created, and the differences in force dynamics were observed. In this paper, discontinuities were generated by reducing the tool rotation frequency and increasing the tool traverse speed in order to create "cold" welds. Experimental force data for welds with discontinuities and welds without discontinuities compared favorably with the predicted forces. The model currently overpredicts the discontinuity size.
4D flow mri post-processing strategies for neuropathologies
NASA Astrophysics Data System (ADS)
Schrauben, Eric Mathew
4D flow MRI allows for the measurement of a dynamic 3D velocity vector field. Blood flow velocities in large vascular territories can be qualitatively visualized with the added benefit of quantitative probing. Within cranial pathologies theorized to have vascular-based contributions or effects, 4D flow MRI provides a unique platform for comprehensive assessment of hemodynamic parameters. Targeted blood flow derived measurements, such as flow rate, pulsatility, retrograde flow, or wall shear stress may provide insight into the onset or characterization of more complex neuropathologies. Therefore, the thorough assessment of each parameter within the context of a given disease has important medical implications. Not surprisingly, the last decade has seen rapid growth in the use of 4D flow MRI. Data acquisition sequences are available to researchers on all major scanner platforms. However, use has been limited mostly to small research trials. One major reason that has hindered more widespread use and application in larger clinical trials is the complexity of the post-processing tasks and the lack of adequate tools for these tasks. Post-processing of 4D flow MRI must be semi-automated, fast, user-independent, robust, and reliably consistent for use in a clinical setting, within large patient studies, or across a multicenter trial. Development of proper post-processing methods coupled with systematic investigation in normal and patient populations pushes 4D flow MRI closer to clinical realization while elucidating potential underlying neuropathological origins. Within this framework, the work in this thesis assesses venous flow reproducibility and internal consistency in a healthy population. A preliminary analysis of venous flow parameters in healthy controls and multiple sclerosis patients is performed in a large study employing 4D flow MRI. These studies are performed in the context of the chronic cerebrospinal venous insufficiency hypothesis. Additionally, a double-gated flow acquisition and reconstruction scheme demonstrates respiratory-induced changes in internal jugular vein flow. Finally, a semi-automated intracranial vessel segmentation and flow parameter measurement software tool for fast and consistent 4D flow post-processing analysis is developed, validated, and demonstrated in vivo.
Amblàs-Novellas, Jordi; Casas, Sílvia; Catalán, Rosa María; Oriol-Ruscalleda, Margarita; Lucchetti, Gianni Enrico; Quer-Vall, Francesc Xavier
2016-01-01
Shared decision-making between patients and healthcare professionals is crucial to guarantee adequate coherence between patient values and preferences, aims of care and treatment intensity, which is key for the provision of patient-centred healthcare. The assessment of such interventions is essential for continuity-of-care purposes. To do this, reliable and easy-to-use assessment systems are required. This study describes the results of the implementation of a hospital treatment intensity assessment tool. The pre-implementation and post-implementation results were compared between two cohorts of patients assessed for one month. Some record of care was registered in 6.1% of patients in the pre-implementation group (n=673) compared to 31.6% of patients in the post-implementation group (n=832) (P<.01), with differences between services. Hospital mortality in both cohorts is 1.9%; in the pre-implementation group, 93.75% of deceased patients had treatment intensity assessment. In hospital settings, the availability of a specific tool seems to encourage shared decision-making processes between patients and healthcare professionals very significantly, multiplying the rate of treatment intensity assessment by more than five. Moreover, such tools help with continuity of care between different teams and allow the personalisation of care interventions to be monitored. More research is needed to continue improving shared decision-making for hospital patients. Copyright © 2015 SEGG. Published by Elsevier Espana. All rights reserved.
NASA Astrophysics Data System (ADS)
Wang, S.; Ancell, B. C.; Huang, G. H.; Baetz, B. W.
2018-03-01
Data assimilation using the ensemble Kalman filter (EnKF) has been increasingly recognized as a promising tool for probabilistic hydrologic predictions. However, little effort has been made to conduct the pre- and post-processing of assimilation experiments, posing a significant challenge in achieving the best performance of hydrologic predictions. This paper presents a unified data assimilation framework for improving the robustness of hydrologic ensemble predictions. Statistical pre-processing of assimilation experiments is conducted through the factorial design and analysis to identify the best EnKF settings with maximized performance. After the data assimilation operation, statistical post-processing analysis is also performed through the factorial polynomial chaos expansion to efficiently address uncertainties in hydrologic predictions, as well as to explicitly reveal potential interactions among model parameters and their contributions to the predictive accuracy. In addition, the Gaussian anamorphosis is used to establish a seamless bridge between data assimilation and uncertainty quantification of hydrologic predictions. Both synthetic and real data assimilation experiments are carried out to demonstrate feasibility and applicability of the proposed methodology in the Guadalupe River basin, Texas. Results suggest that statistical pre- and post-processing of data assimilation experiments provide meaningful insights into the dynamic behavior of hydrologic systems and enhance robustness of hydrologic ensemble predictions.
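Note: the Gaussian anamorphosis step mentioned above maps a non-Gaussian variable (e.g. streamflow) to a standard normal variable through its empirical quantiles, so that the EnKF's Gaussian assumptions hold better. Below is a minimal normal-score transform sketch in Python under that reading; it is generic and not the authors' code.

    import numpy as np
    from scipy.stats import norm

    def anamorphosis(values):
        """Empirical normal-score (Gaussian anamorphosis) transform. Returns the
        transformed values plus the lookup tables needed to map new values."""
        values = np.asarray(values, dtype=float)
        order = np.argsort(values)
        p = (np.arange(1, len(values) + 1) - 0.5) / len(values)   # avoids probabilities 0 and 1
        z_sorted = norm.ppf(p)
        z = np.empty_like(values)
        z[order] = z_sorted
        return z, values[order], z_sorted

    def to_gaussian(new_values, table_x, table_z):
        """Map new physical values into Gaussian space by interpolating the table."""
        return np.interp(new_values, table_x, table_z)

    flows = np.array([12.0, 3.5, 150.0, 8.2, 45.0, 22.1])   # made-up streamflows
    z, table_x, table_z = anamorphosis(flows)
    print(to_gaussian([10.0, 100.0], table_x, table_z))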
Modeling Post-Accident Vehicle Egress
2013-01-01
interest for military situations may involve rolled-over vehicles for which detailed movement data are not available. In the current design process...test trials. These evaluations are expensive and time-consuming, and are often performed late in the design process when it is too difficult to...alter the design if weaknesses are discovered. Yet, due to the limitations of current software tools, digital human models (DHMs) are not yet widely
Fang, Bin; Hoffman, Melissa A.; Mirza, Abu-Sayeef; Mishall, Katie M.; Li, Jiannong; Peterman, Scott M.; Smalley, Keiran S. M.; Shain, Kenneth H.; Weinberger, Paul M.; Wu, Jie; Rix, Uwe; Haura, Eric B.; Koomen, John M.
2015-01-01
Cancer biologists and other healthcare researchers face an increasing challenge in addressing the molecular complexity of disease. Biomarker measurement tools and techniques now contribute to both basic science and translational research. In particular, liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM) for multiplexed measurements of protein biomarkers has emerged as a versatile tool for systems biology. Assays can be developed for specific peptides that report on protein expression, mutation, or post-translational modification; discovery proteomics data are rapidly translated into multiplexed quantitative approaches. Complementary advances in affinity purification enrich classes of enzymes or peptides representing post-translationally modified or chemically labeled substrates. Here, we illustrate the process for the relative quantification of hundreds of peptides in a single LC-MRM experiment. Desthiobiotinylated peptides produced by activity-based protein profiling (ABPP) using ATP probes and tyrosine-phosphorylated peptides are used as examples. These targeted quantification panels can be applied to further understand the biology of human disease. PMID:25782629
NASA Astrophysics Data System (ADS)
Dewi, Cut; Nopera Rauzi, Era
2018-05-01
This paper discusses the role of architectural heritage as a tool for resilience in a community after a major disaster. It argues that architectural heritage is not merely a passive victim needing to be rescued; rather, it is also an active agent in providing resilience for survivors. This is evident in the ways it acts as a signifier of collective memories and place identities, and as a place to seek refuge in times of emergency and to make central decisions during the reconstruction process. This paper explores several theories related to architectural heritage in the post-disaster context and juxtaposes them in a case study of Banda Aceh after the 2004 Tsunami Disaster. The paper is based on six months of anthropological fieldwork conducted in 2012 in Banda Aceh after the Tsunami Disaster. During the fieldwork, 166 respondents were interviewed to gain extensive insight into the ways architecture might play a role in post-disaster reconstruction.
NASA Technical Reports Server (NTRS)
Hairr, John W.; Dorris, William J.; Ingram, J. Edward; Shah, Bharat M.
1993-01-01
Interactive Stiffened Panel Analysis (ISPAN) modules, written in FORTRAN, were developed to provide an easy-to-use tool for creating finite element models of composite material stiffened panels. The modules allow the user to interactively construct, solve and post-process finite element models of four general types of structural panel configurations using only the panel dimensions and properties as input data. Linear, buckling and post-buckling solution capability is provided. This interactive input allows rapid model generation and solution by users who are not finite element specialists. The results of a parametric study of a blade stiffened panel are presented to demonstrate the usefulness of the ISPAN modules. Also, a non-linear analysis of a test panel was conducted and the results compared with measured data and a previous correlation analysis.
Dwyer, Robyn; Fraser, Suzanne
2017-06-01
It is widely accepted that alcohol and other drug consumption is profoundly gendered. Just where this gendering is occurring, however, remains the subject of debate. We contend that one important and overlooked site where the gendering of substance consumption and addiction is taking place is AOD research itself: in particular, the addiction screening and diagnostic tools designed to measure and track substance consumption and problems within populations. These tools establish key criteria and set numerical threshold scores for the identification of problems. In many of these tools, separate threshold scores for women and men are established or recommended. Drawing on Karen Barad's concept of post-humanist performativity, in this article we examine the ways in which gender itself is being materialised by these apparatuses of measurement. We focus primarily on the Drug Use Disorders Identification Test (DUDIT) tool as an exemplar of gendering processes that operate across addiction tools more broadly. We consider gendering processes operating through the tools' questions themselves, and we also examine the quantification and legitimation processes used in establishing gender difference and the implications these have for women. We find that the tools rely on and reproduce narrow and marginalising assumptions about women as essentially fragile and vulnerable, and simultaneously reinforce normative expectations that women sacrifice pleasure. The seemingly objective and neutral quantification processes operating in tools naturalise gender as they enact it. Copyright © 2017 Elsevier B.V. All rights reserved.
Shortcomings of low-cost imaging systems for viewing computed radiographs.
Ricke, J; Hänninen, E L; Zielinski, C; Amthauer, H; Stroszczynski, C; Liebig, T; Wolf, M; Hosten, N
2000-01-01
To assess potential advantages of a new PC-based viewing tool featuring image post-processing for viewing computed radiographs on low-cost hardware (PC) with a common display card and color monitor, and to evaluate the effect of using color versus monochrome monitors. Computed radiographs of a statistical phantom were viewed on a PC, with and without post-processing (spatial frequency and contrast processing), employing a monochrome or a color monitor. Findings were compared with viewing on a radiological workstation and evaluated with ROC analysis. Image post-processing improved the perception of low-contrast details significantly, irrespective of the monitor used. No significant difference in perception was observed between monochrome and color monitors. The review at the radiological workstation was superior to the review done using the PC with image processing. Lower-quality hardware (graphics card and monitor) used in low-cost PCs negatively affects perception of low-contrast details in computed radiographs. In this situation, it is highly recommended to use spatial frequency and contrast processing. No significant quality gain was observed for the high-end monochrome monitor compared to the color display. However, the color monitor was more strongly affected by high ambient illumination.
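Note: "spatial frequency processing" in this setting usually amounts to unsharp masking (adding back a high-pass version of the radiograph), followed by contrast windowing for display. The sketch below shows those two generic operations in Python; the kernel width, gain and window are assumptions, not the study's settings.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def unsharp_mask(image, sigma=8.0, gain=1.5):
        """Boost detail by adding back (image - blurred copy) scaled by `gain`."""
        image = image.astype(float)
        blurred = gaussian_filter(image, sigma=sigma)
        return image + gain * (image - blurred)

    def window(image, center, width):
        """Display windowing: clip to [center - width/2, center + width/2], scale to 0..255."""
        lo, hi = center - width / 2.0, center + width / 2.0
        return np.clip((image - lo) / (hi - lo), 0.0, 1.0) * 255.0

    rng = np.random.default_rng(0)
    radiograph = rng.normal(2048.0, 200.0, size=(256, 256))   # synthetic 12-bit-like image
    display = window(unsharp_mask(radiograph), center=2048.0, width=1200.0)
    print(display.min(), display.max())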
McCannon, Jessica B; O'Donnell, Walter J; Thompson, B Taylor; El-Jawahri, Areej; Chang, Yuchiao; Ananian, Lillian; Bajwa, Ednan K; Currier, Paul F; Parikh, Mihir; Temel, Jennifer S; Cooper, Zara; Wiener, Renda Soylemez; Volandes, Angelo E
2012-12-01
Effective communication between intensive care unit (ICU) providers and families is crucial given the complexity of decisions made regarding goals of therapy. Using video images to supplement medical discussions is an innovative process to standardize and improve communication. In this six-month, quasi-experimental, pre-post intervention study we investigated the impact of a cardiopulmonary resuscitation (CPR) video decision support tool upon knowledge about CPR among surrogate decision makers for critically ill adults. We interviewed surrogate decision makers for patients aged 50 and over, using a structured questionnaire that included a four-question CPR knowledge assessment similar to those used in previous studies. Surrogates in the post-intervention arm viewed a three-minute video decision support tool about CPR before completing the knowledge assessment and completed questions about perceived value of the video. We recruited 23 surrogates during the first three months (pre-intervention arm) and 27 surrogates during the latter three months of the study (post-intervention arm). Surrogates viewing the video had more knowledge about CPR (p=0.008); average scores were 2.0 (SD 1.1) and 2.9 (SD 1.2) (out of a total of 4) in pre-intervention and post-intervention arms. Surrogates who viewed the video were comfortable with its content (81% very) and 81% would recommend the video. CPR preferences for patients at the time of ICU discharge/death were distributed as follows: pre-intervention: full code 78%, DNR 22%; post-intervention: full code 59%, DNR 41% (p=0.23).
SABRE--A Novel Software Tool for Bibliographic Post-Processing.
ERIC Educational Resources Information Center
Burge, Cecil D.
1989-01-01
Describes the software architecture and application of SABRE (Semi-Automated Bibliographic Environment), which is one of the first products to provide a semi-automatic environment for relevancy ranking of citations obtained from searches of bibliographic databases. Features designed to meet the review, categorization, culling, and reporting needs…
Initial Navigation Alignment of Optical Instruments on GOES-R
NASA Technical Reports Server (NTRS)
Isaacson, Peter J.; DeLuccia, Frank J.; Reth, Alan D.; Igli, David A.; Carter, Delano R.
2016-01-01
Post-launch alignment errors for the Advanced Baseline Imager (ABI) and Geospatial Lightning Mapper (GLM) on GOES-R may be too large for the image navigation and registration (INR) processing algorithms to function without an initial adjustment to calibration parameters. We present an approach that leverages a combination of user-selected image-to-image tie points and image correlation algorithms to estimate this initial launch-induced offset and calculate adjustments to the Line of Sight Motion Compensation (LMC) parameters. We also present an approach to generate synthetic test images, to which shifts and rotations of known magnitude are applied. Results of applying the initial alignment tools to a subset of these synthetic test images are presented. The results for both ABI and GLM are within the specifications established for these tools, and indicate that application of these tools during the post-launch test (PLT) phase of GOES-R operations will enable the automated INR algorithms for both instruments to function as intended.
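To make the image-correlation component of the initial alignment concrete, the following minimal sketch estimates an integer-pixel offset between a reference image and a shifted test image by phase correlation; it is a generic illustration under stated assumptions, not the GOES-R LMC adjustment code.

```python
# Hedged sketch: integer-pixel shift estimation via phase correlation (NumPy only).
import numpy as np

def phase_correlation_shift(ref, img):
    """Return the (row, col) shift that maps ref onto img."""
    F_ref = np.fft.fft2(ref)
    F_img = np.fft.fft2(img)
    cross_power = np.conj(F_ref) * F_img
    cross_power /= np.abs(cross_power) + 1e-12          # normalize to unit magnitude
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap peaks beyond half the image size to negative shifts
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

ref = np.random.rand(128, 128)                          # synthetic reference image
test = np.roll(ref, shift=(5, -3), axis=(0, 1))         # synthetic shifted test image
print(phase_correlation_shift(ref, test))               # expected: (5, -3)
```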
Analysis of post-mining excavations as places for municipal waste
NASA Astrophysics Data System (ADS)
Górniak-Zimroz, Justyna
2018-01-01
Waste management planning is an interdisciplinary task covering a wide range of issues including costs, legal requirements, spatial planning, environmental protection, geography, demographics, and techniques used in collecting, transporting, processing and disposing of waste. Designing and analyzing this issue is difficult and requires the use of advanced analysis methods and tools available in geographic information systems (GIS), which contain readily available graphical and descriptive databases, data analysis tools providing expert decision support when selecting the best-designed alternative, and simulation models that allow the user to simulate many variants of waste management together with graphical visualization of the results of the performed analyses. As part of the research study, work has been undertaken concerning the use of multi-criteria data analysis in waste management in areas located in southwestern Poland. This work proposes the inclusion in waste management of post-mining excavations as places for the final or temporary collection of waste, assessed in terms of their suitability with the tools available in GIS systems.
Sidoli, Simone; Cheng, Lei; Jensen, Ole N
2012-06-27
Histone proteins contribute to the maintenance and regulation of the dynamic chromatin structure, to gene activation, DNA repair and many other processes in the cell nucleus. Site-specific reversible and irreversible post-translational modifications of histone proteins mediate biological functions, including recruitment of transcription factors to specific DNA regions, assembly of epigenetic reader/writer/eraser complexes onto DNA, and modulation of DNA-protein interactions. Histones thereby regulate chromatin structure and function, propagate inheritance and provide memory functions in the cell. Dysfunctional chromatin structures and misregulation may lead to pathogenic states, including diabetes and cancer, and the mapping and quantification of multivalent post-translational modifications has therefore attracted significant interest. Mass spectrometry has quickly been accepted as a versatile tool to achieve insights into chromatin biology and epigenetics. High sensitivity and high mass accuracy and the ability to sequence post-translationally modified peptides and perform large-scale analyses make this technique very well suited for histone protein characterization. In this review we discuss a range of analytical methods and various mass spectrometry-based approaches for histone analysis, from sample preparation to data interpretation. Mass spectrometry-based proteomics is already an integrated and indispensable tool in modern chromatin biology, providing insights into the mechanisms and dynamics of nuclear and epigenetic processes. This article is part of a Special Section entitled: Understanding genome regulation and genetic diversity by mass spectrometry. Copyright © 2011 Elsevier B.V. All rights reserved.
Post-Flight Data Analysis Tool
NASA Technical Reports Server (NTRS)
George, Marina
2018-01-01
A software tool that facilitates the retrieval and analysis of post-flight data. This allows our team and other teams to effectively and efficiently analyze and evaluate post-flight data in order to certify commercial providers.
Asomaning, Nana; Loftus, Carla
2014-07-01
To better meet the needs of older adults in the emergency department, Senior Friendly care processes, such as high-risk screening, are recommended. The Identification of Seniors at Risk (ISAR) tool is a validated 6-item screening tool for identifying elderly patients at risk of adverse outcomes after an ED visit. This paper describes the implementation of the tool in the Mount Sinai Hospital emergency department using a Plan-Do-Study-Act model, and demonstrates whether the tool predicts adverse outcomes. An observational study tracked tool implementation. A retrospective chart audit was completed to collect data about elderly ED patients during 2 time periods in 2010 and 2011. Data analysis compared the characteristics of patients with positive and negative screening tool results. The Identification of Seniors at Risk tool was completed for 51.6% of eligible patients, with 61.2% of patients having a positive result. Patients with positive screening results were more likely to be over age 79 (P = .003), to be admitted to hospital (P < .001), and to have a longer mean ED length of stay (P < .001). For patients admitted to hospital, those with positive screening results had a longer mean inpatient stay (P = .012). Implementing the Identification of Seniors at Risk tool was challenged by problematic compliance with tool completion. Strategies to address this included tool adaptation and providing staff with knowledge of ED and inpatient geriatric resources and feedback on completion rates. Positive screening results predicted adverse outcomes in elderly Mount Sinai Hospital ED patients. © 2014. Published by Elsevier Inc. All rights reserved.
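As a rough illustration of the group comparison reported above, the sketch below runs a chi-square test of association between a positive ISAR screen and hospital admission; the 2x2 counts are invented for illustration and are not the audit data.

```python
# Hedged sketch: chi-square test on an illustrative 2x2 table (not study data).
from scipy.stats import chi2_contingency

#                  admitted  not admitted
table = [[45, 30],   # ISAR positive (hypothetical counts)
         [12, 40]]   # ISAR negative (hypothetical counts)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```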
Catanuto, Giuseppe; Pappalardo, Francesco; Rocco, Nicola; Leotta, Marco; Ursino, Venera; Chiodini, Paolo; Buggi, Federico; Folli, Secondo; Catalano, Francesca; Nava, Maurizio B
2016-10-01
The increased complexity of the decisional process in breast cancer surgery is well documented. With this study we aimed to create a software tool able to assist patients and surgeons in taking proper decisions. We hypothesized that the endpoints of breast cancer surgery could be addressed by combining a set of decisional drivers. We created a decision support system software tool (DSS) and an interactive decision tree. A formal analysis estimated the information gain derived from each feature in the process. We tested the DSS on 52 patients and we analyzed the concordance of decisions obtained by different users and between the DSS suggestions and the actual surgery. We also tested the ability of the system to prevent post-breast-conservation deformities. The information gain revealed that patients' preferences are the root of our decision tree. Observed concordance was 0.98 when the DSS was used twice by an expert operator and 0.88 when it was used by a newly trained operator versus an expert one. The observed concordance between the DSS suggestion and the actual decision was 0.69. A significantly higher incidence of post-breast-conservation defects was reported among patients who did not follow the DSS decision (Type III of Fitoussi, N = 4; 33.3%, p = 0.004). The DSS decisions can be reproduced by operators with different experience. The concordance between suggestions and actual decisions is quite low; however, the DSS is able to prevent post-breast-conservation deformities. Copyright © 2016 Elsevier Ltd. All rights reserved.
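The abstract reports that information gain was used to rank decisional drivers and place patient preference at the root of the decision tree. The sketch below shows a standard information-gain calculation; the feature values and labels are illustrative assumptions, not the study data.

```python
# Hedged sketch: information gain of a candidate decision driver (toy data only).
import math
from collections import Counter

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    total = len(labels)
    remainder = 0.0
    for value in set(feature_values):
        subset = [l for f, l in zip(feature_values, labels) if f == value]
        remainder += len(subset) / total * entropy(subset)
    return entropy(labels) - remainder

# hypothetical example: surgical decision vs. stated patient preference
preference = ["conserve", "conserve", "mastectomy", "conserve", "mastectomy"]
decision = ["BCS", "BCS", "mastectomy", "BCS", "mastectomy"]
print(f"{information_gain(preference, decision):.2f} bits")   # ~0.97 on this toy data
```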
Comparison of different phantoms used in digital diagnostic imaging
NASA Astrophysics Data System (ADS)
Bor, Dogan; Unal, Elif; Uslu, Anil
2015-09-01
The organs of extremity, chest, skull and lumbar were physically simulated using uniform PMMA slabs with different thicknesses alone and using these slabs together with aluminum plates and air gaps (ANSI Phantoms). The variation of entrance surface air kerma and scatter fraction with X-ray beam qualities was investigated for these phantoms and the results were compared with those measured from anthropomorphic phantoms. A flat panel digital radiographic system was used for all the experiments. Considerable variations of entrance surface air kermas were found for the same organs of different designs, and highest doses were measured for the PMMA slabs. A low contrast test tool and a contrast detail test object (CDRAD) were used together with each organ simulation of PMMA slabs and ANSI phantoms in order to test the clinical image qualities. Digital images of these phantom combinations and anthropomorphic phantoms were acquired in raw and clinically processed formats. Variation of image quality with kVp and post processing was evaluated using the numerical metrics of these test tools and measured contrast values from the anthropomorphic phantoms. Our results indicated that design of some phantoms may not be efficient enough to reveal the expected performance of the post processing algorithms.
ERIC Educational Resources Information Center
Emstad, Anne Berit
2011-01-01
This article refers to a study on how the school principal engaged in the process after a school self-evaluation. The study examined how two primary schools followed up the evaluation. Although they both used the same evaluation tool, the schools' understanding and application of results differed greatly. This paper describes and discusses the…
ALMA from the Users' Perspective
NASA Astrophysics Data System (ADS)
Johnson, Kelsey
2010-05-01
After decades of dreaming and preparation, the call for early science with ALMA is just around the corner. The goal of this talk is to illustrate the process of preparing and carrying out a research program with ALMA. This presentation will step through the user interface for proposal preparation, proposal review, project tracking, data acquisition, and post-processing. Examples of the software tools, including the simulator and spectral line catalog, will be included.
NASA Astrophysics Data System (ADS)
Cucchiaro, S.; Maset, E.; Fusiello, A.; Cazorzi, F.
2018-05-01
In recent years, the combination of Structure-from-Motion (SfM) algorithms and UAV-based aerial images has revolutionised 3D topographic surveys for natural environment monitoring, offering low-cost, fast and high-quality data acquisition and processing. Continuous monitoring of morphological changes through multi-temporal (4D) SfM surveys allows, for example, analysis of torrent dynamics even in complex topographic environments such as debris-flow catchments, provided that appropriate tools and procedures are employed in the data processing steps. In this work we test two different software packages (3DF Zephyr Aerial and Agisoft Photoscan) on a dataset composed of both UAV and terrestrial images acquired on a debris-flow reach (Moscardo torrent - North-eastern Italian Alps). Unlike other papers in the literature, we evaluate the results not only on the raw point clouds generated by the Structure-from-Motion and Multi-View Stereo algorithms, but also on the Digital Terrain Models (DTMs) created after post-processing. Outcomes show differences between the DTMs that can be considered irrelevant for the geomorphological phenomena under analysis. This study confirms that SfM photogrammetry can be a valuable tool for monitoring sediment dynamics, but accurate point cloud post-processing is required to reliably localize geomorphological changes.
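A common way to localize geomorphological change from post-processed DTMs such as those described above is a DEM of Difference (DoD) thresholded by a minimum level of detection. The sketch below illustrates that idea with synthetic arrays; the 0.10 m detection threshold and 1 m cell size are assumptions, not values from the paper.

```python
# Hedged sketch: DEM-of-Difference change detection on synthetic DTMs.
import numpy as np

rng = np.random.default_rng(0)
dtm_before = rng.random((100, 100))                        # placeholder pre-event DTM (m)
dtm_after = dtm_before + rng.normal(0, 0.02, (100, 100))   # survey noise
dtm_after[40:60, 40:60] -= 0.5                             # synthetic erosion patch

dod = dtm_after - dtm_before                               # elevation change (m)
lod = 0.10                                                 # assumed minimum level of detection (m)
significant = np.where(np.abs(dod) >= lod, dod, 0.0)

cell_area = 1.0                                            # assumed 1 m x 1 m cells
erosion_volume = -significant[significant < 0].sum() * cell_area
print(f"detected erosion volume: {erosion_volume:.1f} m^3")
```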
The impact of CmapTools utilization towards students' conceptual change on optics topic
NASA Astrophysics Data System (ADS)
Rofiuddin, Muhammad Rifqi; Feranie, Selly
2017-05-01
Science teachers need to help students identify their prior ideas and modify them based on scientific knowledge. This process is called conceptual change. One of the essential tools for analyzing students' conceptual change is the concept map. Concept maps are graphical representations of knowledge that are comprised of concepts and the relationships between them. Concept map construction was implemented with technology support for the learning process, in line with Educational Ministry Regulation No. 68 of 2013. The Institute for Human and Machine Cognition (IHMC) has developed CmapTools, client-server software for easily constructing and visualizing concept maps. This research aims to investigate secondary students' conceptual change after experiencing a five-stage conceptual teaching model utilizing CmapTools in learning optics. A weak experimental method with a one-group pretest-posttest design was implemented in this study to collect preliminary and post-instruction concept maps as qualitative data. The sample was taken purposively from 8th-grade students (n = 22) at a private school in Bandung, West Java. Conceptual change, based on comparison of the preliminary and post concept map constructions, was assessed with a rubric for concept map scoring and structure. Results show significant conceptual change differences of 50.92%, elaborated by concept map elements: propositions and hierarchical levels in the high category, cross links in the medium category, and specific examples in the low category. These results are supported by the students' positive responses towards CmapTools, indicating improvements in motivation, interest, and behaviour towards physics lessons.
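For readers unfamiliar with rubric-based concept map scoring, the sketch below applies the widely cited Novak and Gowin weights to pre- and post-instruction maps; the weights and element counts are illustrative assumptions and may differ from the rubric actually used in the study.

```python
# Hedged sketch: Novak-style concept map scoring and relative gain (toy numbers).
def concept_map_score(propositions, hierarchy_levels, cross_links, examples):
    # Commonly cited weights: 1 per proposition, 5 per hierarchy level,
    # 10 per cross link, 1 per specific example (assumed here).
    return propositions * 1 + hierarchy_levels * 5 + cross_links * 10 + examples * 1

pre = concept_map_score(propositions=8, hierarchy_levels=2, cross_links=0, examples=1)
post = concept_map_score(propositions=15, hierarchy_levels=4, cross_links=2, examples=3)
print(f"pre={pre}, post={post}, relative gain={(post - pre) / pre * 100:.1f}%")
```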
Li, Ginny X H; Vogel, Christine; Choi, Hyungwon
2018-06-07
While tandem mass spectrometry can detect post-translational modifications (PTM) at the proteome scale, reported PTM sites are often incomplete and include false positives. Computational approaches can complement these datasets by additional predictions, but most available tools use prediction models pre-trained for a single PTM type by the developers, and it remains a difficult task to perform large-scale batch prediction for multiple PTMs with flexible user control, including the choice of training data. We developed an R package called PTMscape which predicts PTM sites across the proteome based on a unified and comprehensive set of descriptors of the physico-chemical microenvironment of modified sites, with additional downstream analysis modules to test enrichment of individual or pairs of PTMs in protein domains. PTMscape is flexible in its ability to process any major modifications, such as phosphorylation and ubiquitination, while achieving sensitivity and specificity comparable to single-PTM methods and outperforming other multi-PTM tools. Applying this framework, we expanded proteome-wide coverage of five major PTMs affecting different residues by prediction, especially for lysine and arginine modifications. Using a combination of experimentally acquired sites (PSP) and newly predicted sites, we discovered that crosstalk among multiple PTMs occurs more frequently than expected by random chance in key protein domains such as histone, protein kinase, and RNA recognition motifs, spanning various biological processes such as RNA processing, DNA damage response, signal transduction, and regulation of the cell cycle. These results provide a proteome-scale analysis of crosstalk among major PTMs and can be easily extended to other types of PTM.
A web platform for integrated surface water - groundwater modeling and data management
NASA Astrophysics Data System (ADS)
Fatkhutdinov, Aybulat; Stefan, Catalin; Junghanns, Ralf
2016-04-01
Model-based decision support systems are considered to be reliable and time-efficient tools for resources management in various hydrology-related fields. However, searching for and acquiring the required data, preparing the data sets for simulations, and post-processing, visualizing and publishing the simulation results often require significantly more work and time than performing the modeling itself. The purpose of the developed software is to combine data storage facilities, data processing instruments and modeling tools in a single platform, which can potentially reduce the time required for performing simulations and hence for decision making. The system is developed within the INOWAS (Innovative Web Based Decision Support System for Water Sustainability under a Changing Climate) project. The platform integrates spatially distributed catchment-scale rainfall-runoff, infiltration and groundwater flow models with data storage, processing and visualization tools. The concept is implemented in the form of a web-GIS application and is built from free and open-source components, including the PostgreSQL database management system, the Python programming language for modeling purposes, MapServer for visualizing and publishing the data, and OpenLayers for building the user interface, among others. The configuration of the system allows data input, storage, pre- and post-processing and visualization to be performed in a single uninterrupted workflow. In addition, realization of the decision support system as a web service makes it easy to retrieve and share data sets as well as simulation results over the internet, which gives significant advantages for collaborative work on projects and can significantly increase the usability of the decision support system.
NASA Astrophysics Data System (ADS)
Czettl, C.; Pohler, M.
2016-03-01
Increasing demands on the material properties of iron-based workpiece materials, e.g. for the turbine industry, complicate the machining process and reduce the lifetime of the cutting tools. Therefore, improved tool solutions, adapted to the requirements of the desired application, have to be developed. In particular, the interplay of macro- and micro-geometry, substrate material, coating and post-treatment processes is crucial for the durability of modern high-performance tool solutions. Improved and novel analytical methods allow a detailed understanding of the material properties responsible for the wear behaviour of the tools. These support the knowledge-based development of tailored cutting materials for selected applications. One important factor for such a solution is the proper choice of coating material, which can be synthesized by physical or chemical vapor deposition techniques. Within this work, an overview of state-of-the-art coated carbide grades is presented and application examples are shown to demonstrate their high efficiency. Machining processes for a material range from cast iron and low-carbon steels to high-alloyed steels are covered.
Sandia Advanced MEMS Design Tools v. 3.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yarberry, Victor R.; Allen, James J.; Lantz, Jeffrey W.
This is a major revision to the Sandia Advanced MEMS Design Tools. It replaces all previous versions. New features in this version: revised to support AutoCAD 2014 and 2015. This CD contains an integrated set of electronic files that: a) describe the SUMMiT V fabrication process; b) provide enabling educational information (including pictures, videos, technical information); c) facilitate the process of designing MEMS with the SUMMiT process (prototype file, Design Rule Checker, Standard Parts Library); d) facilitate the process of having MEMS fabricated at Sandia National Laboratories; and e) facilitate the process of having post-fabrication services performed. While there exist some files on the CD that are used in conjunction with the software package AutoCAD, these files are not intended for use independent of the CD. Note that the customer must purchase his/her own copy of AutoCAD to use with these files.
Post-OPC verification using a full-chip pattern-based simulation verification method
NASA Astrophysics Data System (ADS)
Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary
2005-11-01
In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes, i.e. 0.13 um, 0.11 um and 90 nm, are used in the investigation. Although it has been proven that in most cases our OPC technology is robust in general, due to the variety of tape-outs with complicated design styles and technologies, it is difficult to develop a "complete or bullet-proof" OPC algorithm that would cover every possible layout pattern. In the evaluation, among dozens of databases, errors were found in some OPC databases by model-based post-OPC checking; such errors could be costly in manufacturing - reticle, wafer processing, and, more importantly, production delay. From such full-chip OPC database verification, we have learned that optimizing OPC models and recipes on a limited set of test chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and fatal errors (such as pinch or bridge), poor CD distribution, and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of process for a range of designs. We therefore describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development as well as post-OPC verification after production release of the OPC. Lastly, we discuss the differentiation of the new pattern-based and conventional edge-based verification tools and summarize the advantages of our new tool and methodology: (1) accuracy: superior inspection algorithms, down to 1 nm accuracy, with the new pattern-based approach; (2) high-speed performance: pattern-centric algorithms that give the best full-chip inspection efficiency; (3) powerful analysis capability: flexible error distribution, grouping, interactive viewing and hierarchical pattern extraction to narrow down to unique patterns/cells.
Near Real Time Analytics of Human Sensor Networks in the Realm of Big Data
NASA Astrophysics Data System (ADS)
Aulov, O.; Halem, M.
2012-12-01
With the prolific development of social media, emergency responders have an increasing interest in harvesting social media from outlets such as Flickr, Twitter, and Facebook in order to assess the scale and specifics of extreme events including wildfires, earthquakes, terrorist attacks, oil spills, etc. A number of experimental platforms have successfully been implemented to demonstrate the utilization of social media data in extreme events, including the Twitter Earthquake Detector, which relied on tweets for earthquake monitoring; AirTwitter, which used tweets for air quality reporting; and our previous work, using Flickr data as boundary value forcings to improve the forecast of oil beaching in the aftermath of the Deepwater Horizon oil spill. The majority of these platforms addressed a narrow, specific type of emergency and harvested data from a particular outlet. We demonstrate an interactive framework for monitoring, mining and analyzing a plethora of heterogeneous social media sources for a diverse range of extreme events. Our framework consists of three major parts: a real-time social media aggregator, a data processing and analysis engine, and a web-based visualization and reporting tool. The aggregator gathers tweets, Facebook comments from fan pages, Google+ posts, forum discussions, blog posts (such as LiveJournal and Blogger.com), images from photo-sharing platforms (such as Flickr and Picasa), videos from video-sharing platforms (YouTube, Vimeo), and so forth. The data processing and analysis engine pre-processes the aggregated information and annotates it with geolocation and sentiment information. In many cases, the metadata of the social media posts does not contain geolocation information; however, a human reader can easily guess from the body of the text what location is discussed. We automate this task by use of Named Entity Recognition (NER) algorithms and a gazetteer service. The visualization and reporting tool provides a web-based, user-friendly interface that provides time-series analysis and plotting tools, geo-spatial visualization tools with interactive maps, and cause-effect inference tools. We demonstrate how we address the big data challenges of monitoring, aggregating and analyzing vast amounts of social media data in near real time. As a result, our framework not only allows emergency responders to augment their situational awareness with social media information, but can also allow them to extract geophysical data and incorporate it into their analysis models.
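To illustrate the NER-plus-gazetteer geolocation step described above, the sketch below uses spaCy as a stand-in NER engine and a small dictionary as a stand-in gazetteer; the model name, gazetteer entries, and coordinates are assumptions for illustration, since the abstract does not specify the framework's actual components.

```python
# Hedged sketch: extracting place names from a post and resolving them with a
# toy gazetteer. Requires the spaCy model "en_core_web_sm" to be installed.
import spacy

nlp = spacy.load("en_core_web_sm")
gazetteer = {"New Orleans": (29.95, -90.07), "Gulf of Mexico": (25.0, -90.0)}  # assumed entries

def geolocate(post_text):
    doc = nlp(post_text)
    places = [ent.text for ent in doc.ents if ent.label_ in ("GPE", "LOC")]
    return {place: gazetteer.get(place) for place in places}

print(geolocate("Oil washing up on beaches near New Orleans this morning."))
```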
Simulation in the Service of Design - Asking the Right Questions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Donn, Michael; Selkowitz, Stephen; Bordass, Bill
2009-03-01
This paper proposes an approach to the creation of design tools that address the real information needs of designers in the early stages of design of nonresidential buildings. Traditional simplified design tools are typically too limited to be of much use, even in conceptual design. The proposal is to provide access to the power of detailed simulation tools, at a stage in design when little is known about the final building, but at a stage also when the freedom to explore options is greatest. The proposed approach to tool design has been derived from consultation with design analysis teams as part of the COMFEN tool development. The paper explores how tools like COMFEN have been shaped by this consultation and how requests from these teams for real-world relevance might shape such tools in the future, drawing into the simulation process the lessons from Post Occupancy Evaluation (POE) of buildings.
Advances in Software Tools for Pre-processing and Post-processing of Overset Grid Computations
NASA Technical Reports Server (NTRS)
Chan, William M.
2004-01-01
Recent developments in three pieces of software for performing pre-processing and post-processing work on numerical computations using overset grids are presented. The first is the OVERGRID graphical interface which provides a unified environment for the visualization, manipulation, generation and diagnostics of geometry and grids. Modules are also available for automatic boundary conditions detection, flow solver input preparation, multiple component dynamics input preparation and dynamics animation, simple solution viewing for moving components, and debris trajectory analysis input preparation. The second is a grid generation script library that enables rapid creation of grid generation scripts. A sample of recent applications will be described. The third is the OVERPLOT graphical interface for displaying and analyzing history files generated by the flow solver. Data displayed include residuals, component forces and moments, number of supersonic and reverse flow points, and various dynamics parameters.
A novel processing platform for post tape out flows
NASA Astrophysics Data System (ADS)
Vu, Hien T.; Kim, Soohong; Word, James; Cai, Lynn Y.
2018-03-01
As the computational requirements for post tape out (PTO) flows increase at the 7nm and below technology nodes, there is a need to increase the scalability of the computational tools in order to reduce the turn-around time (TAT) of the flows. Utilization of design hierarchy has been one proven method to provide sufficient partitioning to enable PTO processing. However, as the data is processed through the PTO flow, its effective hierarchy is reduced. The reduction is necessary to achieve the desired accuracy. Also, the sequential nature of the PTO flow is inherently non-scalable. To address these limitations, we are proposing a quasi-hierarchical solution that combines multiple levels of parallelism to increase the scalability of the entire PTO flow. In this paper, we describe the system and present experimental results demonstrating the runtime reduction through scalable processing with thousands of computational cores.
Erberich, Stephan G; Bhandekar, Manasee; Chervenak, Ann; Kesselman, Carl; Nelson, Marvin D
2007-01-01
Functional MRI is successfully being used in clinical and research applications including preoperative planning, language mapping, and outcome monitoring. However, clinical use of fMRI is limited by the complexity of imaging, image workflow, and post-processing, and by the lack of algorithmic standards, which hinders comparability of results. As a consequence, widespread adoption of fMRI as a clinical tool is low, contributing to uncertainty among community physicians about how to integrate fMRI into practice. In addition, training of physicians with fMRI is in its infancy and requires clinical and technical understanding. Therefore, many institutions which perform fMRI have a team of basic researchers and physicians to perform fMRI as a routine imaging tool. In order to provide fMRI as an advanced diagnostic tool to the benefit of a larger patient population, image acquisition and image post-processing must be streamlined, standardized, and available at any institution which does not have these resources available. Here we describe a software architecture, the functional imaging laboratory (funcLAB/G), which addresses (i) standardized image processing using Statistical Parametric Mapping and (ii) its extension to secure sharing and availability for the community using standards-based Grid technology (Globus Toolkit). funcLAB/G carries the potential to overcome the limitations of fMRI in clinical use and thus makes standardized fMRI available to the broader healthcare enterprise utilizing the Internet and HealthGrid Web Services technology.
Curran, E; Harper, P; Loveday, H; Gilmour, H; Jones, S; Benneyan, J; Hood, J; Pratt, R
2008-10-01
Statistical process control (SPC) charts have previously been advocated for infection control quality improvement. To determine their effectiveness, a multicentre randomised controlled trial was undertaken to explore whether monthly SPC feedback from infection control nurses (ICNs) to healthcare workers of ward-acquired meticillin-resistant Staphylococcus aureus (WA-MRSA) colonisation or infection rates would produce any reductions in incidence. Seventy-five wards in 24 hospitals in the UK were randomised into three arms: (1) wards receiving SPC chart feedback; (2) wards receiving SPC chart feedback in conjunction with structured diagnostic tools; and (3) control wards receiving neither type of feedback. Twenty-five months of pre-intervention WA-MRSA data were compared with 24 months of post-intervention data. Statistically significant and sustained decreases in WA-MRSA rates were identified in all three arms (P<0.001; P=0.015; P<0.001). The mean percentage reduction was 32.3% for wards receiving SPC feedback, 19.6% for wards receiving SPC and diagnostic feedback, and 23.1% for control wards, but with no significant difference between the control and intervention arms (P=0.23). There were significantly more post-intervention 'out-of-control' episodes (P=0.021) in the control arm (averages of 0.60, 0.28, and 0.28 for Control, SPC and SPC+Tools wards, respectively). Participants identified SPC charts as an effective communication tool and valuable for disseminating WA-MRSA data.
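For readers unfamiliar with the SPC feedback idea, the sketch below builds a simple u-chart for monthly ward-acquired MRSA rates and flags months outside three-sigma control limits; the counts and bed-day denominators are invented for illustration and are not trial data.

```python
# Hedged sketch: u-chart control limits for monthly WA-MRSA rates (toy data).
import numpy as np

cases = np.array([4, 6, 3, 5, 7, 2, 12, 4, 3, 5])                         # monthly WA-MRSA cases
bed_days = np.array([900, 950, 870, 920, 980, 890, 940, 910, 930, 960]) / 1000.0

rates = cases / bed_days                      # cases per 1,000 occupied bed-days
u_bar = cases.sum() / bed_days.sum()          # centre line
ucl = u_bar + 3 * np.sqrt(u_bar / bed_days)   # upper control limit per month
lcl = np.maximum(u_bar - 3 * np.sqrt(u_bar / bed_days), 0)

for month, (rate, hi, lo) in enumerate(zip(rates, ucl, lcl), start=1):
    status = "out of control" if rate > hi or rate < lo else "in control"
    print(f"month {month:2d}: rate = {rate:5.2f}  ({status})")
```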
Analytical framework and tool kit for SEA follow-up
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran
2009-04-15
Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.
Image Navigation and Registration Performance Assessment Evaluation Tools for GOES-R ABI and GLM
NASA Technical Reports Server (NTRS)
Houchin, Scott; Porter, Brian; Graybill, Justin; Slingerland, Philip
2017-01-01
The GOES-R Flight Project has developed an Image Navigation and Registration (INR) Performance Assessment Tool Set (IPATS) for measuring Advanced Baseline Imager (ABI) and Geostationary Lightning Mapper (GLM) INR performance metrics in the post-launch period for performance evaluation and long term monitoring. IPATS utilizes a modular algorithmic design to allow user selection of data processing sequences optimized for generation of each INR metric. This novel modular approach minimizes duplication of common processing elements, thereby maximizing code efficiency and speed. Fast processing is essential given the large number of sub-image registrations required to generate INR metrics for the many images produced over a 24 hour evaluation period. This paper describes the software design and implementation of IPATS and provides preliminary test results.
Addressing multi-label imbalance problem of surgical tool detection using CNN.
Sahu, Manish; Mukhopadhyay, Anirban; Szengel, Angelika; Zachow, Stefan
2017-06-01
A fully automated surgical tool detection framework is proposed for endoscopic video streams. State-of-the-art surgical tool detection methods rely on supervised one-vs-all or multi-class classification techniques, completely ignoring the co-occurrence relationship of the tools and the associated class imbalance. In this paper, we formulate tool detection as a multi-label classification task where tool co-occurrences are treated as separate classes. In addition, imbalance on tool co-occurrences is analyzed and stratification techniques are employed to address the imbalance during convolutional neural network (CNN) training. Moreover, temporal smoothing is introduced as an online post-processing step to enhance runtime prediction. Quantitative analysis is performed on the M2CAI16 tool detection dataset to highlight the importance of stratification, temporal smoothing and the overall framework for tool detection. The analysis on tool imbalance, backed by the empirical results, indicates the need and superiority of the proposed framework over state-of-the-art techniques.
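The online temporal-smoothing step mentioned above can be illustrated with a simple exponential moving average over per-frame tool probabilities before thresholding; the smoothing factor, threshold, and probabilities below are assumptions, not the values used in the paper.

```python
# Hedged sketch: temporal smoothing of per-frame multi-label tool probabilities.
import numpy as np

def smooth_predictions(frame_probs, alpha=0.3, threshold=0.5):
    """frame_probs: (n_frames, n_tools) CNN output probabilities."""
    smoothed = np.zeros_like(frame_probs)
    smoothed[0] = frame_probs[0]
    for t in range(1, len(frame_probs)):
        smoothed[t] = alpha * frame_probs[t] + (1 - alpha) * smoothed[t - 1]
    return smoothed >= threshold                 # boolean multi-label decisions

probs = np.array([[0.2, 0.9], [0.8, 0.85], [0.3, 0.9], [0.7, 0.1]])  # toy 4 frames x 2 tools
print(smooth_predictions(probs).astype(int))
```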
Perception or Fact: Measuring the Effectiveness of the Terrorism Early Warning (TEW) Group
2005-09-01
...alternatives" (Campbell 2005). The logic model process is a tool that has been used by evaluators for many years to identify performance measures and... pertinent information is obtained, this cell is responsible for the development (pre-event) and use (trans- and post-event) of playbooks and...
Experiences with the Twitter Health Surveillance (THS) System
Rodríguez-Martínez, Manuel
2018-01-01
Social media has become an important platform to gauge public opinion on topics related to our daily lives. In practice, processing these posts requires big data analytics tools since the volume of data and the speed of production overwhelm single-server solutions. Building an application to capture and analyze posts from social media can be a challenge simply because it requires combining a set of complex software tools that often times are tricky to configure, tune, and maintain. In many instances, the application ends up being an assorted collection of Java/Scala programs or Python scripts that developers cobble together to generate the data products they need. In this paper, we present the Twitter Health Surveillance (THS) application framework. THS is designed as a platform to allow end-users to monitor a stream of tweets, and process the stream with a combination of built-in functionality and their own user-defined functions. We discuss the architecture of THS, and describe its implementation atop the Apache Hadoop Ecosystem. We also present several lessons learned while developing our current prototype. PMID:29607412
Optical technologies for TSV inspection
NASA Astrophysics Data System (ADS)
Aiyer, Arun A.; Maltsev, Nikolai; Ryu, Jae
2014-04-01
In this paper, Frontier Semiconductor will introduce a new technology that is referred to as Virtual Interface Technology (VIT™). VIT™ is a Fourier-domain technique that utilizes temporal phase shear of the measurement beam. The unique configuration of the sensor enables measurement of wafer and bonded stack thicknesses ranging from a few microns to millimeters, with measurement repeatability of ~nm and resolution of approximately 0.1% of nominal thickness or depth. We will present data on high-aspect-ratio via measurements (depth, top critical dimension, bottom critical dimension, via bottom profile and side wall angle), bonded wafer stack thickness, and Cu bump measurements. A complementary tool developed at FSM is a high-resolution μRaman spectrometer to measure stress change in the Si lattice induced by Through Silicon Via (TSV) processes. These measurements are important to determine the Keep-Out-Zone in the areas where devices are built so that the engineered gate strain is not altered by TSV-processing-induced strain. Applications include via post-etch, via post-fill, and bottom Cu nail stress measurements. The capabilities of and measurement results from both tools are discussed below.
NASA Astrophysics Data System (ADS)
Tillmann, W.; Schaak, C.; Biermann, D.; Aßmuth, R.; Goeke, S.
2017-03-01
Cemented carbide (hard metal) cutting tools are the first choice to machine hard materials or to conduct high-performance cutting processes. The main advantages of cemented carbide cutting tools are their high wear resistance (hardness) and good high-temperature strength. In contrast, cemented carbide cutting tools are characterized by low toughness and generate higher production costs, especially due to limited resources. Usually, cemented carbide cutting tools are produced by means of powder metallurgical processes. Compared to conventional manufacturing routes, these processes are more expensive and only a limited number of geometries can be realized. Furthermore, post-processing and preparing the cutting edges in order to achieve high-performance tools is often required. In the present paper, an alternative method to substitute solid cemented carbide cutting tools is presented. Cutting tools made of conventional high-speed steels (HSS) were coated with thick WC-Co (88/12) layers by means of thermal spraying (HVOF). The challenge is to obtain a dense, homogeneous, and near-net-shape coating on the flanks and the cutting edge. For this purpose, different coating strategies were realized using an industrial robot. The coating properties were subsequently investigated. After this initial step, the surfaces of the cutting tools were ground and selected cutting edges were prepared by means of wet abrasive jet machining to achieve a smooth and round micro shape. Machining tests were conducted with these coated, ground and prepared cutting tools. The occurring wear phenomena were analyzed and compared to conventional HSS cutting tools. Overall, the results of the experiments proved that the coating withstands mechanical stresses during machining. In the conducted experiments, the coated cutting tools showed less wear than conventional HSS cutting tools. With respect to the initial wear resistance, additional benefits can be obtained by preparing the cutting edge by means of wet abrasive jet machining.
Multiple approaches to fine-grained indexing of the biomedical literature.
Neveol, Aurelie; Shooshan, Sonya E; Humphrey, Susanne M; Rindflesh, Thomas C; Aronson, Alan R
2007-01-01
The number of articles in the MEDLINE database is expected to increase tremendously in the coming years. To ensure that all these documents are indexed with continuing high quality, it is necessary to develop tools and methods that help the indexers in their daily task. We present three methods addressing a novel aspect of automatic indexing of the biomedical literature, namely producing MeSH main heading/subheading pair recommendations. The methods (dictionary-based, post-processing rules, and natural language processing rules) are described and evaluated on a genetics-related corpus. The best overall performance is obtained for the subheading genetics (70% precision and 17% recall with post-processing rules; 48% precision and 37% recall with the dictionary-based method). Future work will address extending this work to all MeSH subheadings and a more thorough study of method combination.
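The precision and recall figures quoted above are computed over recommended versus indexer-assigned heading/subheading pairs; the sketch below shows that calculation on toy sets, with the pair values invented purely for illustration.

```python
# Hedged sketch: precision/recall of recommended MeSH heading/subheading pairs.
def precision_recall(recommended, gold):
    recommended, gold = set(recommended), set(gold)
    tp = len(recommended & gold)
    precision = tp / len(recommended) if recommended else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

recommended = [("Neoplasms", "genetics"), ("Anemia", "genetics"), ("Anemia", "therapy")]
gold = [("Neoplasms", "genetics"), ("Anemia", "genetics"), ("Neoplasms", "drug therapy")]
print(precision_recall(recommended, gold))   # approximately (0.67, 0.67) on this toy example
```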
Micro-optical fabrication by ultraprecision diamond machining and precision molding
NASA Astrophysics Data System (ADS)
Li, Hui; Li, Likai; Naples, Neil J.; Roblee, Jeffrey W.; Yi, Allen Y.
2017-06-01
Ultraprecision diamond machining combined with high-volume molding is becoming a viable process in the optical industry for manufacturing affordable, high-precision, high-performance micro-optical components at low cost. In this process, high-precision micro-optical molds are first fabricated using ultraprecision single-point diamond machining, followed by high-volume production methods such as compression or injection molding. In the last two decades, there have been steady improvements in ultraprecision machine design and performance, particularly with the introduction of both slow tool and fast tool servo. Today optical molds, including freeform surfaces and microlens arrays, are routinely diamond machined to final finish without post-machining polishing. For consumers, compression molding or injection molding provides efficient and high-quality optics at extremely low cost. In this paper, ultraprecision machine design and machining processes such as slow tool and fast tool servo are first described; then both compression molding and injection molding of polymer optics are discussed. To implement precision optical manufacturing by molding, numerical modeling can be included in the future as a critical part of the manufacturing process to ensure high product quality.
Ngune, Irene; Jiwa, Moyez; McManus, Alexandra; Hughes, Jeff; Parsons, Richard; Hodder, Rupert; Entriken, Fiona
2014-01-01
Treatment for colorectal cancer (CRC) may result in physical, social, and psychological needs that affect patients' quality of life post-treatment. A comprehensive assessment should be conducted to identify these needs in CRC patients post-treatment; however, there is a lack of tools and processes available in general practice. This study aimed to develop a patient-completed needs screening tool that identifies potentially unmet physical, psychological, and social needs in CRC and facilitates consultation with a general practitioner (GP) to address these needs. The development of the self-assessment tool for patients (SATp) included a review of the literature; face and content validity with reference to an expert panel; psychometric testing including readability, internal consistency, and test-retest reliability; and usability in clinical practice. The SATp contains 25 questions. The tool had internal consistency (Cronbach's alpha 0.70-0.97), readability (reading ease 82.5%), and test-retest reliability (kappa 0.689-1.000). A total of 66 patients piloted the SATp. Participants were on average 69.2 (SD 9.9) years old and had a median follow-up period of 26.7 months. The SATp identified a total of 547 needs (median 7 needs per patient; IQR 3-12.25). Needs were categorised into social (175 [32%]), psychological (175 [32%]), and physical (197 [36%]) domains. The SATp is a reliable self-assessment tool useful for identifying CRC patient needs. Further testing of this tool for validity and usability is underway.
Lessons Learned from Optical Payload for Lasercomm Science (OPALS) Mission Operations
NASA Technical Reports Server (NTRS)
Sindiy, Oleg V.; Abrahamson, Matthew J.; Biswas, Abhijit; Wright, Malcolm W.; Padams, Jordan H.; Konyha, Alexander L.
2015-01-01
This paper provides an overview of Optical Payload for Lasercomm Science (OPALS) activities and lessons learned during mission operations. Activities described cover the periods of commissioning, prime, and extended mission operations, during which primary and secondary mission objectives were achieved for demonstrating space-to-ground optical communications. Lessons learned cover Mission Operations System topics in areas of: architecture verification and validation, staffing, mission support area, workstations, workstation tools, interfaces with support services, supporting ground stations, team training, procedures, flight software upgrades, post-processing tools, and public outreach.
Solid State Joining of Magnesium to Steel
NASA Astrophysics Data System (ADS)
Jana, Saumyadeep; Hovanski, Yuri; Pilli, Siva P.; Field, David P.; Yu, Hao; Pan, Tsung-Yu; Santella, M. L.
Friction stir welding and ultrasonic welding techniques were applied to join automotive magnesium alloys to steel sheet. The effect of tooling and process parameters on the post-weld microstructure, texture and mechanical properties was investigated. Static and dynamic loading were utilized to investigate the joint strength of both cast and wrought magnesium alloys including their susceptibility and degradation under corrosive media. The conditions required to produce joint strengths in excess of 75% of the base metal strength were determined, and the effects of surface coatings, tooling and weld parameters on weld properties are presented.
Data System Architectures: Recent Experiences from Data Intensive Projects
NASA Astrophysics Data System (ADS)
Palanisamy, G.; Frame, M. T.; Boden, T.; Devarakonda, R.; Zolly, L.; Hutchison, V.; Latysh, N.; Krassovski, M.; Killeffer, T.; Hook, L.
2014-12-01
U.S. Federal agencies are frequently trying to address new data-intensive projects that require the next generation of data system architectures. This presentation will focus on two such new architectures: USGS's Science Data Catalog (SDC) and DOE's Next Generation Ecological Experiments - Arctic Data System. The U.S. Geological Survey (USGS) developed a Science Data Catalog (data.usgs.gov) to include records describing datasets, data collections, and observational or remotely-sensed data. The system was built using a service-oriented architecture and allows USGS scientists and data providers to create and register their data using either a standards-based metadata creation form or simply to register their already-created metadata records with the USGS SDC Dashboard. This dashboard then compiles the harvested metadata records and sends them to the post-processing and indexing service using the JSON format. The post-processing service, with the help of various ontologies and other geo-spatial validation services, auto-enhances these harvested metadata records and creates a Lucene index using the Solr enterprise search platform. Ultimately, metadata is made available via the SDC search interface. DOE's Next Generation Ecological Experiments (NGEE) Arctic project deployed a data system that allows scientists to prepare, publish, archive, and distribute data from field collections, lab experiments, sensors, and simulated model outputs. This architecture includes a metadata registration form, a data uploading and sharing tool, a Digital Object Identifier (DOI) tool, a Drupal-based content management tool (http://ngee-arctic.ornl.gov), and a data search and access tool based on ORNL's Mercury software (http://mercury.ornl.gov). The team also developed web-metric tools and a data ingest service to visualize geo-spatial and temporal observations.
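To illustrate the hand-off from the post-processing service to the Solr index described above, the sketch below pushes one enhanced metadata record as JSON to a Solr update endpoint; the core name, field names, and record content are assumptions for illustration and do not reflect the actual SDC schema.

```python
# Hedged sketch: indexing an enhanced metadata record into a Solr core over HTTP.
import requests

record = {
    "id": "example-dataset-001",                     # hypothetical identifier
    "title": "Example streamgage observations",
    "keywords": ["hydrology", "streamflow"],
    "bbox": "-110.0 40.0 -109.0 41.0",
}

resp = requests.post(
    "http://localhost:8983/solr/sdc/update?commit=true",   # assumed core name "sdc"
    json=[record],                                          # Solr accepts a JSON array of documents
    timeout=30,
)
resp.raise_for_status()
print("indexed", record["id"])
```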
Time Domain Tool Validation Using ARES I-X Flight Data
NASA Technical Reports Server (NTRS)
Hough, Steven; Compton, James; Hannan, Mike; Brandon, Jay
2011-01-01
The ARES I-X vehicle was launched from NASA's Kennedy Space Center (KSC) on October 28, 2009 at approximately 11:30 EDT. ARES I-X was the first test flight for NASA's ARES I launch vehicle, and it was the first non-Shuttle launch vehicle designed and flown by NASA since Saturn. The ARES I-X had a 4-segment solid rocket booster (SRB) first stage and a dummy upper stage (US) to emulate the properties of the ARES I US. During ARES I-X pre-flight modeling and analysis, six (6) independent time-domain simulation tools were developed and cross-validated. Each tool represents an independent implementation of a common set of models and parameters in a different simulation framework and architecture. Post-flight data and reconstructed models provide the means to validate a subset of the simulations against actual flight data and to assess the accuracy of pre-flight dispersion analysis. Post-flight data consist of telemetered Operational Flight Instrumentation (OFI) data, primarily focused on flight computer outputs and sensor measurements, as well as Best Estimated Trajectory (BET) data that estimate vehicle state information from all available measurement sources. While pre-flight models were found to provide a reasonable prediction of the vehicle flight, reconstructed models were generated to better represent and simulate the ARES I-X flight. Post-flight reconstructed models include: SRB propulsion model, thrust vector bias models, mass properties, base aerodynamics, and Meteorological Estimated Trajectory (wind and atmospheric data). The result of the effort is a set of independently developed, high-fidelity, time-domain simulation tools that have been cross-validated and validated against flight data. This paper presents the process and results of high-fidelity aerospace modeling, simulation, analysis and tool validation in the time domain.
Post-processing of 3D-printed parts using femtosecond and picosecond laser radiation
NASA Astrophysics Data System (ADS)
Mingareev, Ilya; Gehlich, Nils; Bonhoff, Tobias; Meiners, Wilhelm; Kelbassa, Ingomar; Biermann, Tim; Richardson, Martin C.
2014-03-01
Additive manufacturing, also known as 3D-printing, is a near-net shape manufacturing approach, delivering part geometry that can be considerably affected by various process conditions, heat-induced distortions, solidified melt droplets, partially fused powders, and surface modifications induced by the manufacturing tool motion and processing strategy. High-repetition-rate femtosecond and picosecond laser radiation was utilized to improve the surface quality of metal parts manufactured by laser additive techniques. Different laser scanning approaches were utilized to increase the ablation efficiency and to reduce the surface roughness while preserving the initial part geometry. We studied post-processing of 3D-shaped parts made of nickel- and titanium-base alloys, manufactured using Selective Laser Melting (SLM) and Laser Metal Deposition (LMD) as additive manufacturing techniques. Process parameters such as the pulse energy, the number of layers and their spatial separation were varied. Surface processing in several layers was necessary to remove the excessive material, such as individual powder particles, and to reduce the average surface roughness from the as-deposited 22-45 μm to a few microns. Due to the ultrafast laser-processing regime and the small heat-affected zone induced in materials, this novel integrated manufacturing approach can be used to post-process parts made of thermally and mechanically sensitive materials, and to attain complex designed shapes with micrometer precision.
Meat science: From proteomics to integrated omics towards system biology.
D'Alessandro, Angelo; Zolla, Lello
2013-01-14
Since the main ultimate goal of farm animal raising is the production of proteins for human consumption, research tools to investigate proteins play a major role in farm animal and meat science. Indeed, proteomics has been applied to the field of farm animal science to monitor in vivo performances of livestock animals (growth performance, fertility, milk quality, etc.), but also to further our understanding of the molecular processes at the basis of meat quality, which are largely dependent on the post mortem biochemistry of the muscle, often in a species-specific way. Post mortem alterations to the muscle proteome reflect the biological complexity of the process of "muscle to meat conversion," a process that, despite decades of advancements, is still far from fully understood. This is mainly due to the enormous number of variables affecting meat tenderness per se, including biological factors such as animal species, breed-specific characteristics, and the muscle under investigation. However, it is rapidly emerging that the tender meat phenotype is not only tied to genetics (livestock breeding selection), but also to extrinsic factors, such as the rearing environment, feeding conditions, physical activity, administration of hormonal growth promotants, pre-slaughter handling and stress, and post mortem handling. From this intricate scenario, biochemical approaches and systems-wide integrated investigations (metabolomics, transcriptomics, interactomics, phosphoproteomics, mathematical modeling), which have emerged as complementary tools to proteomics, have helped establish a few milestones in our understanding of the events leading from muscle to meat conversion. The growing integration of omics disciplines in the field of systems biology will soon contribute to taking further steps forward. Copyright © 2012 Elsevier B.V. All rights reserved.
Demner-Fushman, D; Elhadad, N
2016-11-10
This paper reviews work over the past two years in Natural Language Processing (NLP) applied to clinical and consumer-generated texts. We included any application or methodological publication that leverages text to facilitate healthcare and address the health-related needs of consumers and populations. Many important developments in clinical text processing, both foundational and task-oriented, were addressed in community-wide evaluations and discussed in corresponding special issues that are referenced in this review. These focused issues and in-depth reviews of several other active research areas, such as pharmacovigilance and summarization, allowed us to discuss in greater depth disease modeling and predictive analytics using clinical texts, and text analysis in social media for healthcare quality assessment, trends towards online interventions based on rapid analysis of health-related posts, and consumer health question answering, among other issues. Our analysis shows that although clinical NLP continues to advance towards practical applications and more NLP methods are used in large-scale live health information applications, more needs to be done to make NLP use in clinical applications a routine widespread reality. Progress in clinical NLP is mirrored by developments in social media text analysis: the research is moving from capturing trends to addressing individual health-related posts, thus showing potential to become a tool for precision medicine and a valuable addition to the standard healthcare quality evaluation tools.
Sanchez-Izquierdo-Riera, Jose Angel; Molano-Alvarez, Esteban; Saez-de la Fuente, Ignacio; Maynar-Moliner, Javier; Marín-Mateos, Helena; Chacón-Alves, Silvia
2016-01-01
The failure mode and effect analysis (FMEA) may improve the safety of continuous renal replacement therapies (CRRT) in the intensive care unit. We used this tool in three phases: 1) a retrospective observational study; 2) a process FMEA, with implementation of the improvement measures identified; 3) a cohort study after FMEA. We included 54 patients in the pre-FMEA group and 72 patients in the post-FMEA group. Comparing the risk frequencies per patient in both groups, there were fewer cases of filter survival time under 24 hours in the post-FMEA group (31 patients [57.4%] vs. 21 patients [29.6%]; p < 0.05); fewer patients suffered circuit coagulation with inability to return the blood to the patient (25 patients [46.3%] vs. 16 patients [22.2%]; p < 0.05); phosphorus levels were not monitored in 54 patients (100%) versus 5 (6.94%) (p < 0.05); and in 14 patients (25.9%) versus 0 (0%), the CRRT prescription did not appear on medical orders. As a measure of improvement, we adopted dynamic dosage management. After the process FMEA, there were several improvements in the management of intensive care unit patients receiving CRRT, and we consider it a useful tool for improving the safety of critically ill patients.
NASA Astrophysics Data System (ADS)
Bush, D. F.; Sieber, R.; Seiler, G.; Chandler, M. A.; Chmura, G. L.
2017-12-01
Efforts to address climate change require public understanding of Earth and climate science. To meet this need, educators require instructional approaches and scientific technologies that overcome cultural barriers to impart conceptual understanding of the work of climate scientists. We compared student inquiry learning based on the now-ubiquitous climate-education toy models, data and tools against inquiry learning that used a computational global climate model (GCM) from the National Aeronautics and Space Administration (NASA). Our study at McGill University and John Abbott College in Montreal, QC sheds light on how best to teach the research processes that are important to Earth and climate scientists studying atmospheric and Earth system processes but ill-understood by those outside the scientific community. We followed a pre/post, control/treatment experimental design that enabled detailed analysis and statistically significant results. Our research found that more students succeed at understanding climate change when exposed to actual climate research processes and instruments. Inquiry-based education with a GCM resulted in significantly higher scores from pre- to post-test on diagnostic exams (quantitatively) and more complete conceptual understandings (qualitatively). We recognize the difficulty in planning and teaching inquiry with complex technology, and we also found evidence that lectures support learning geared toward assessment exams.
Tree injury and mortality in fires: developing process-based models
Bret W. Butler; Matthew B. Dickinson
2010-01-01
Wildland fire managers are often required to predict tree injury and mortality when planning a prescribed burn or when considering wildfire management options; and, currently, statistical models based on post-fire observations are the only tools available for this purpose. Implicit in the derivation of statistical models is the assumption that they are strictly...
ERIC Educational Resources Information Center
Schmitt, Robert M.; Groves, David L.
1976-01-01
Results of pre- and post-testing the tree identification skills of a group of 4-H members showed that adolescents aged 9-11 respond better to lecture demonstration teaching methods, while adolescents aged 12-14 respond better to an inquiry process that used nature trails as the primary instructional tool. (MLH)
USDA-ARS?s Scientific Manuscript database
Fire is an inherent component of sagebrush steppe rangelands in western North America and can dramatically affect runoff and erosion processes. Post-fire flooding and erosion events pose substantial threats to proximal resources, property, and human life. Yet, prescribed fire can serve as a tool to ...
Using Bones to Shape Stones: MIS 9 Bone Retouchers at Both Edges of the Mediterranean Sea
Blasco, Ruth; Rosell, Jordi; Cuartero, Felipe; Fernández Peris, Josep; Gopher, Avi; Barkai, Ran
2013-01-01
A significant challenge in Prehistory is to understand the mechanisms involved in the behavioural evolution of human groups. The degree of technological and cultural development of prehistoric groups is assessed mainly through stone tools. However, other elements can provide valuable information as well. This paper presents two bone retouchers dated to the Middle Pleistocene MIS 9 used for the shaping of lithic artefacts. Originating from Bolomor Cave (Spain) and Qesem Cave (Israel), these two bone retouchers are among the earliest of the Old World. Although the emergence of such tools might be found in the latest phases of the Acheulean, their widespread use seems to coincide with independently emergent post-Acheulean cultural complexes at both ends of the Mediterranean Sea: the post-Acheulean/pre-Mousterian of Western Europe and the Acheulo Yabrudian Cultural Complex of the Levant. Both entities seem to reflect convergent processes that may be viewed in a wider cultural context as reflecting new technology-related behavioural patterns as well as new perceptions in stone tool manufacturing. PMID:24146928
2011-01-01
Background A framework for high quality in post graduate training has been defined by the World Federation of Medical Education (WFME). The objective of this paper is to perform a systematic review of reviews to find current evidence regarding aspects of quality of post graduate training and to organise the results following the 9 areas of the WFME framework. Methods The systematic literature review was conducted in 2009 in Medline Ovid, EMBASE, ERIC and RDRB databases from 1995 onward. The reviews were selected by two independent researchers and a quality appraisal was based on the SIGN tool. Results 31 reviews met inclusion criteria. The majority of the reviews provided information about the training process (WFME area 2), the assessment of trainees (WFME area 3) and the trainees (WFME area 4). One review covered the area 8 'governance and administration'. No review was found in relation to the mission and outcomes, the evaluation of the training process and the continuous renewal (respectively areas 1, 7 and 9 of the WFME framework). Conclusions The majority of the reviews provided information about the training process, the assessment of trainees and the trainees. Indicators used for quality assessment purposes of post graduate training should be based on this evidence but further research is needed for some areas in particular to assess the quality of the training process. PMID:21977898
Audio-visual aid in teaching "fatty liver".
Dash, Sambit; Kamath, Ullas; Rao, Guruprasad; Prakash, Jay; Mishra, Snigdha
2016-05-06
Use of audio visual tools to aid in medical education is ever on the rise. Our study intends to find the efficacy of a video prepared on "fatty liver," a topic that is often a challenge for pre-clinical teachers, in enhancing cognitive processing and ultimately learning. We prepared a video presentation of 11:36 min, incorporating various concepts of the topic, while keeping in view Mayer's and Ellaway's guidelines for multimedia presentation. A pre-post test study on subject knowledge was conducted for 100 students with the video shown as the intervention. A retrospective pre-study was conducted as a survey which inquired about students' understanding of the key concepts of the topic, and feedback on our video was collected. Students performed significantly better in the post-test (mean score 8.52 vs. 5.45 in the pre-test), responded positively in the retrospective pre-test and gave positive feedback for our video presentation. Well-designed multimedia tools can aid in cognitive processing and enhance working memory capacity, as shown in our study. In times when "smart" device penetration is high, information and communication tools in medical education, which can act as an essential aid and not as a replacement for traditional curricula, can be beneficial to students. © 2015 by The International Union of Biochemistry and Molecular Biology, 44:241-245, 2016. © 2015 The International Union of Biochemistry and Molecular Biology.
A comparison of ensemble post-processing approaches that preserve correlation structures
NASA Astrophysics Data System (ADS)
Schefzik, Roman; Van Schaeybroeck, Bert; Vannitsem, Stéphane
2016-04-01
Despite the fact that ensemble forecasts address the major sources of uncertainty, they exhibit biases and dispersion errors and are therefore known to improve with calibration or statistical post-processing. For instance, the ensemble model output statistics (EMOS) method, also known as the non-homogeneous regression approach (Gneiting et al., 2005), is known to strongly improve forecast skill. EMOS is based on fitting and adjusting a parametric probability density function (PDF). However, EMOS and other common post-processing approaches apply to a single weather quantity at a single location for a single look-ahead time. They are therefore unable to take into account spatial, inter-variable and temporal dependence structures. Recently, much research effort has been invested both in designing post-processing methods that resolve this drawback and in verification methods that enable the detection of dependence structures. New verification methods are applied to two classes of post-processing methods, both generating physically coherent ensembles. A first class uses the ensemble copula coupling (ECC) that starts from EMOS but adjusts the rank structure (Schefzik et al., 2013). The second class is a member-by-member post-processing (MBM) approach that maps each raw ensemble member to a corrected one (Van Schaeybroeck and Vannitsem, 2015). We compare variants of the EMOS-ECC and MBM classes and highlight a specific theoretical connection between them. All post-processing variants are applied in the context of the ensemble system of the European Centre for Medium-Range Weather Forecasts (ECMWF) and compared using multivariate verification tools including the energy score, the variogram score (Scheuerer and Hamill, 2015) and the band depth rank histogram (Thorarinsdottir et al., 2015). Gneiting, Raftery, Westveld, and Goldman, 2005: Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation. Mon. Wea. Rev., 133, 1098-1118. Scheuerer and Hamill, 2015: Variogram-based proper scoring rules for probabilistic forecasts of multivariate quantities. Mon. Wea. Rev., 143, 1321-1334. Schefzik, Thorarinsdottir, and Gneiting, 2013: Uncertainty quantification in complex simulation models using ensemble copula coupling. Statistical Science, 28, 616-640. Thorarinsdottir, Scheuerer, and Heinz, 2015: Assessing the calibration of high-dimensional ensemble forecasts using rank histograms, arXiv:1310.0236. Van Schaeybroeck and Vannitsem, 2015: Ensemble post-processing using member-by-member approaches: theoretical aspects. Q.J.R. Meteorol. Soc., 141, 807-818.
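As an illustration of the univariate EMOS step that both classes build on, the following minimal Python sketch fits the non-homogeneous Gaussian regression parameters by minimum CRPS estimation. The synthetic training data, starting values and variable names are assumptions for illustration only, not the ECMWF configuration used in the study.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def gaussian_crps(y, mu, sigma):
    # Closed-form CRPS of a normal forecast N(mu, sigma^2) for observation y.
    z = (y - mu) / sigma
    return sigma * (z * (2.0 * norm.cdf(z) - 1.0) + 2.0 * norm.pdf(z) - 1.0 / np.sqrt(np.pi))

def fit_emos(ens_mean, ens_var, obs):
    # Fit mu = a + b*ens_mean and sigma^2 = c + d*ens_var by minimum CRPS estimation.
    def mean_crps(params):
        a, b, c, d = params
        sigma = np.sqrt(np.maximum(c + d * ens_var, 1e-6))  # keep the variance positive
        return gaussian_crps(obs, a + b * ens_mean, sigma).mean()
    res = minimize(mean_crps, x0=[0.0, 1.0, 1.0, 1.0], method="Nelder-Mead")
    return res.x

# Illustrative synthetic training data (assumed, not ECMWF output): a biased raw ensemble.
rng = np.random.default_rng(0)
truth = rng.normal(15.0, 5.0, size=500)
members = truth[:, None] + 2.0 + rng.normal(0.0, 2.0, size=(500, 20))
a, b, c, d = fit_emos(members.mean(axis=1), members.var(axis=1), truth)
print("EMOS coefficients:", a, b, c, d)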
Technology to improve quality and accountability.
Kay, Jonathan
2006-01-01
A body of evidence has been accumulated to demonstrate that current practice is not sufficiently safe for several stages of central laboratory testing. In particular, while analytical and perianalytical steps that take place within the laboratory are subjected to quality control procedures, this is not the case for several pre- and post-analytical steps. The ubiquitous application of auto-identification technology seems to represent a valuable tool for reducing error rates. A series of projects in Oxford has attempted to improve processes which support several areas of laboratory medicine, including point-of-care testing, blood transfusion, delivery and interpretation of reports, and support of decision-making by clinicians. The key tools are auto-identification, Internet communication technology, process re-engineering, and knowledge management.
NONMEMory: a run management tool for NONMEM.
Wilkins, Justin J
2005-06-01
NONMEM is an extremely powerful tool for nonlinear mixed-effect modelling and simulation of pharmacokinetic and pharmacodynamic data. However, it is a console-based application whose output does not lend itself to rapid interpretation or efficient management. NONMEMory has been created to be a comprehensive project manager for NONMEM, providing detailed summary, comparison and overview of the runs comprising a given project, including the display of output data, simple post-run processing, fast diagnostic plots and run output management, complementary to other available modelling aids. Analysis time ought not to be spent on trivial tasks, and NONMEMory's role is to eliminate these as far as possible by increasing the efficiency of the modelling process. NONMEMory is freely available from http://www.uct.ac.za/depts/pha/nonmemory.php.
[The training of medical and scientific manpower in the system of postgraduate medical education].
Kabanova, S A; Lozhkevich, I Iu
2010-01-01
The research was conducted at the Petrovsky National Surgery Center and revealed certain regularities and trends testifying to the necessity of further strategic and tactical development of the training of graduate specialists through innovative optimization of the effectiveness of post-graduate training of medical personnel. The inclusion of socio-psychological monitoring of the educational process is obligatory. The implementation of sociological monitoring in any institution providing post-graduate training can be a powerful tool for enhancing the quality and efficiency of the training of medical professionals. This approach presupposes modernization of training programs to account for innovations and research data.
NASA Astrophysics Data System (ADS)
An, L.; Zhang, J.; Gong, L.
2018-04-01
Synthetic Aperture Radar (SAR) remote sensing plays an important role in gathering information on social infrastructure damage and is a useful tool for monitoring earthquake disasters. With the wide application of this technique, a standard method of comparing post-seismic to pre-seismic data has become common. However, multi-temporal SAR processing is not always achievable, so developing a building damage detection method that relies only on post-seismic data is of great importance. In this paper, the authors initiate an experimental investigation to establish an object-based feature-analysis classification method for building damage recognition.
Improving global CD uniformity by optimizing post-exposure bake and develop sequences
NASA Astrophysics Data System (ADS)
Osborne, Stephen P.; Mueller, Mark; Lem, Homer; Reyland, David; Baik, KiHo
2003-12-01
Improvements in the final uniformity of masks can be obscured by error contributions from many sources. The final Global CD Uniformity (GCDU) of a mask is degraded by individual contributions of the writing tool, the Post Applied Bake (PAB), the Post Exposure Bake (PEB), the Develop sequence and the Etch step. Final global uniformity will improve by isolating and minimizing the variability of the PEB and Develop. We achieved this de-coupling of the PEB and Develop process from the whole process stream by using "dark loss", which is the loss of unexposed resist during the develop process. We confirmed a correspondence between Angstroms of dark loss and nanometer-sized deviations in the chrome CD. A plate with a distinctive dark loss pattern was related to a nearly identical pattern in the chrome CD. This pattern was verified to have originated during the PEB process and displayed a [Δ(Final CD)/Δ(Dark Loss)] ratio of 6 for TOK REAP200 resist. Previous papers have reported a sensitive linkage between Angstroms of dark loss and nanometers in the final uniformity of the written plate. These initial studies reported using this method to improve the PAB of resists for greater uniformity of sensitivity and contrast. Similarly, this paper demonstrates an outstanding optimization of PEB and Develop processes.
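As a quick back-of-the-envelope illustration (not a calculation from the paper), the reported ratio can be used to translate a dark-loss variation into an expected chrome CD variation; the assumption that the ratio is applied with both quantities expressed in the same length units is mine.

# Minimal illustration of the reported sensitivity: a variation in post-develop "dark loss"
# maps to a roughly 6x larger variation in the final chrome CD
# (assuming the Delta(CD)/Delta(dark loss) ratio of 6 is treated as dimensionless).
CD_TO_DARK_LOSS_RATIO = 6.0

def cd_deviation_nm(dark_loss_variation_nm):
    return CD_TO_DARK_LOSS_RATIO * dark_loss_variation_nm

print(cd_deviation_nm(0.5))  # a 0.5 nm (5 Angstrom) dark-loss range implies ~3 nm of CD range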
[Three-dimensional computer aided design for individualized post-and-core restoration].
Gu, Xiao-yu; Wang, Ya-ping; Wang, Yong; Lü, Pei-jun
2009-10-01
To develop a method of three-dimensional computer aided design (CAD) of post-and-core restorations. Two plaster casts with extracted natural teeth were used in this study. The extracted teeth were prepared and scanned using a tomography method to obtain three-dimensional digitized models. According to the basic rules of post-and-core design, the posts, cores and cavity surfaces of the teeth were designed using the tools for processing point clouds, curves and surfaces in the forward-engineering software of the Tanglong prosthodontic system. Then the three-dimensional figures of the final restorations were corrected according to the configurations of anterior teeth, premolars and molars, respectively. Computer aided design of 14 post-and-core restorations was completed, and a good fit between the restorations and the three-dimensional digital models was obtained. Appropriate retention forms and enough space for the full crown restorations can be obtained through this method. The CAD of three-dimensional figures of the post-and-core restorations can fulfill clinical requirements. Therefore they can be used in computer-aided manufacture (CAM) of post-and-core restorations.
NASA Astrophysics Data System (ADS)
Rincón, A.; Jorba, O.; Baldasano, J. M.
2010-09-01
The increased contribution of solar energy in power generation sources requires an accurate estimation of surface solar irradiance, which is conditioned by geographical, temporal and meteorological factors. Knowledge of the variability of these factors is essential to estimate the expected energy production and therefore helps stabilize the electricity grid and increase the reliability of available solar energy. The use of numerical meteorological models in combination with statistical post-processing tools may have the potential to satisfy the requirements for short-term forecasting of solar irradiance for up to several days ahead and its application in solar devices. In this contribution, we present an assessment of a short-term irradiance prediction system based on the WRF-ARW mesoscale meteorological model (Skamarock et al., 2005) and several post-processing tools in order to improve the overall skill of the system in an annual simulation of the year 2004 in Spain. The WRF-ARW model is applied with 4 km x 4 km horizontal resolution and 38 vertical layers over the Iberian Peninsula. The hourly model irradiance is evaluated against more than 90 surface stations. The stations are used to assess the temporal and spatial fluctuations and trends of the system, evaluating three different post-processing methods: the Model Output Statistics technique (MOS; Glahn and Lowry, 1972), a recursive statistical method (REC; Boi, 2004) and a Kalman Filter Predictor (KFP; Bozic, 1994; Roeger et al., 2003). A first evaluation of the system without post-processing tools shows an overestimation of the surface irradiance, because attenuation by atmospheric absorbers other than clouds is not included in the meteorological model. This produces an annual BIAS of 16 W m-2 h-1, annual RMSE of 106 W m-2 h-1 and annual NMAE of 42%. The largest errors are observed in spring and summer, reaching RMSE of 350 W m-2 h-1. Results using the Kalman Filter Predictor show a reduction of 8% in RMSE and 83% in BIAS, and NMAE decreases down to 32%. The REC method shows a reduction of 6% in RMSE and 79% in BIAS, and NMAE decreases down to 28%. When comparing stations at different altitudes, the overestimation is enhanced at coastal stations (less than 200 m), up to 900 W m-2 h-1. The results allow us to analyze strengths and drawbacks of the irradiance prediction system and its application in the estimation of energy production from photovoltaic cells. References: Boi, P.: A statistical method for forecasting extreme daily temperatures using ECMWF 2-m temperatures and ground station measurements, Meteorol. Appl., 11, 245-251, 2004. Bozic, S.: Digital and Kalman filtering, John Wiley, Hoboken, New Jersey, 2nd edn., 1994. Glahn, H. and Lowry, D.: The use of Model Output Statistics (MOS) in Objective Weather Forecasting, Applied Meteorology, 11, 1203-1211, 1972. Roeger, C., Stull, R., McClung, D., Hacker, J., Deng, X., and Modzelewski, H.: Verification of Mesoscale Numerical Weather Forecasts in Mountainous Terrain for Application to Avalanche Prediction, Weather and Forecasting, 18, 1140-1160, 2003. Skamarock, W., Klemp, J., Dudhia, J., Gill, D., Barker, D. M., Wang, W., and Powers, J. G.: A Description of the Advanced Research WRF Version 2, Tech. Rep. NCAR/TN-468+STR, NCAR Technical Note, 2005.
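For concreteness, the kind of recursive bias removal a Kalman Filter Predictor performs can be sketched in a few lines of Python. The noise variances and the synthetic forecast/observation series below are illustrative assumptions, not the configuration or data used in this study.

import numpy as np

def kalman_bias_correction(forecasts, observations, q=0.01, r=1.0):
    # Recursively estimate a slowly varying forecast bias with a scalar Kalman filter
    # and return bias-corrected forecasts. q and r are assumed noise variances (tuning knobs).
    bias, p = 0.0, 1.0
    corrected = np.empty_like(forecasts, dtype=float)
    for k, (f, y) in enumerate(zip(forecasts, observations)):
        corrected[k] = f - bias          # correct today's forecast with yesterday's bias estimate
        p = p + q                        # prediction step: the bias is assumed to persist
        gain = p / (p + r)               # update step once the observation arrives
        bias = bias + gain * ((f - y) - bias)
        p = (1.0 - gain) * p
    return corrected

# Toy example (assumed numbers, not the WRF-ARW data): a persistent +50 W m-2 overestimation.
rng = np.random.default_rng(1)
obs = 600 + 50 * rng.standard_normal(200)
fcst = obs + 50 + 20 * rng.standard_normal(200)
corr = kalman_bias_correction(fcst, obs, q=0.5, r=400.0)
print("raw bias:", (fcst - obs).mean(), " corrected bias:", (corr - obs).mean())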
Tools to aid post-wildfire assessment and erosion-mitigation treatment decisions
Peter R. Robichaud; Louise E. Ashmun
2013-01-01
A considerable investment in post-fire research over the past decade has improved our understanding of wildfire effects on soil, hydrology, erosion and erosion-mitigation treatment effectiveness. Using this new knowledge, we have developed several tools to assist land managers with post-wildfire assessment and treatment decisions, such as prediction models, research...
3D reconstruction techniques made easy: know-how and pictures.
Luccichenti, Giacomo; Cademartiri, Filippo; Pezzella, Francesca Romana; Runza, Giuseppe; Belgrano, Manuel; Midiri, Massimo; Sabatini, Umberto; Bastianello, Stefano; Krestin, Gabriel P
2005-10-01
Three-dimensional reconstructions represent a visually based tool for illustrating the basis of three-dimensional post-processing such as interpolation, ray-casting, segmentation, percentage classification, gradient calculation, shading and illumination. Knowledge of the optimal scanning and reconstruction parameters facilitates the use of three-dimensional reconstruction techniques in clinical practice. The aim of this article is to explain the principles of multidimensional image processing in a pictorial way, together with the advantages and limitations of the different possibilities of 3D visualisation.
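Of the post-processing steps listed above, ray-casting is perhaps the easiest to make concrete in code: after interpolation and transfer-function classification, the samples along each ray are blended front to back. The sketch below shows only that accumulation step; the sample colours and opacities are invented for illustration.

import numpy as np

def composite_ray(sample_rgb, sample_alpha):
    # Front-to-back alpha compositing of one ray, the accumulation step used in
    # volume ray casting (after interpolation and classification have produced
    # per-sample colour and opacity values).
    out_rgb = np.zeros(3)
    out_alpha = 0.0
    for rgb, a in zip(sample_rgb, sample_alpha):
        out_rgb += (1.0 - out_alpha) * a * np.asarray(rgb, dtype=float)
        out_alpha += (1.0 - out_alpha) * a
        if out_alpha > 0.99:     # early ray termination
            break
    return out_rgb, out_alpha

# Toy ray: three samples classified as soft tissue, contrast-filled vessel, bone (made-up values).
rgb, alpha = composite_ray([(0.8, 0.6, 0.5), (1.0, 0.2, 0.2), (1.0, 1.0, 0.9)],
                           [0.05, 0.6, 0.9])
print(rgb, alpha)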
E-Labs - Learning with Authentic Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bardeen, Marjorie G.; Wayne, Mitchell
the success teachers have had providing an opportunity for students to: • Organize and conduct authentic research. • Experience the environment of scientific collaborations. • Possibly make real contributions to a burgeoning scientific field. We've created projects that are problem-based, student driven and technology dependent. Students reach beyond classroom walls to explore data with other students and experts and share results, publishing original work to a worldwide audience. Students can discover and extend the research of other students, modeling the processes of modern, large-scale research projects. From start to finish e-Labs are student-led, teacher-guided projects. Students need only a Web browser to access computing techniques employed by professional researchers. A Project Map with milestones allows students to set the research plan rather than follow a step-by-step process common in other online projects. Most importantly, e-Labs build the learning experience around the students' own questions and let them use the very tools that scientists use. Students contribute to and access shared data, most derived from professional research databases. They use common analysis tools, store their work and use metadata to discover, replicate and confirm the research of others. This is where real scientific collaboration begins. Using online tools, students correspond with other research groups, post comments and questions, prepare summary reports, and in general participate in the part of scientific research that is often left out of classroom experiments. Teaching tools such as student and teacher logbooks, pre- and post-tests and an assessment rubric aligned with learner outcomes help teachers guide student work. Constraints on interface designs and administrative tools such as registration databases give teachers the "one-stop-shopping" they seek for multiple e-Labs. Teaching and administrative tools also allow us to track usage and assess the impact on student learning.
Assessing the effect of adding interactive modeling to the geoscience curriculum
NASA Astrophysics Data System (ADS)
Castillo, A.; Marshall, J.; Cardenas, M.
2013-12-01
Technology and computer models enhance the learning experience when appropriately utilized. Moreover, learning is significantly improved when effective visualization is combined with models of processes allowing for inquiry-based problem solving. Still, hands-on experiences in real scenarios result in better contextualization of related problems compared to virtual laboratories. Therefore, the role of scientific visualization, technology, and computer modeling is to enhance, not displace, the learning experience by supplementing real-world problem solving and experiences, although in some circumstances, they can adequately serve to take the place of reality. The key to improving scientific education is to embrace an inquiry-based approach that favorably uses technology. This study will attempt to evaluate the effect of adding interactive modeling to the geological sciences curriculum. An assessment tool, designed to assess student understanding of physical hydrology, was used to evaluate a curriculum intervention based on student learning with a data- and modeling-driven approach using COMSOL Multiphysics software. This intervention was implemented in an upper division and graduate physical hydrology course in fall 2012. Students enrolled in the course in fall 2011 served as the control group. Interactive modeling was added to the curriculum in fall 2012 to replace the analogous mathematical modeling done by hand in fall 2011. Pre- and post-test results were used to assess and report its effectiveness. Student interviews were also used to probe student reactions to both the experimental and control curricula. The pre- and post-tests asked students to describe the significant processes in the hydrological cycle and describe the laws governing these processes. Their ability to apply their knowledge in a real-world problem was also assessed. Since the pre- and post-test data failed to meet the assumption of normality, a non-parametric Kruskal-Wallis test was run to determine if there were differences in pre- and post-test scores among the 2011 and 2012 groups. Results reveal significant differences in pretest and posttest scores among the 2011 and 2012 groups. Interview data revealed that students experience both affordances and barriers to using geoscience learning tools. Important affordances included COMSOL's modeling capabilities, the visualizations it offers, as well as the opportunity to use the software in the course. Barriers included lack of COMSOL experience, difficulty with COMSOL instructions, and lack of instruction with the software. Results from this study revealed that a well-designed pre- and post-assessment can be used to infer whether a given instructional intervention has caused a change in understanding in a given group of students, but the results are not necessarily generalizable. However, the student interviews, which were used to probe student reactions to both the experimental and control curricula, revealed that students experience both affordances and barriers to geoscience learning tools. This result has limitations including the number of participants, all from one institution, but the assessment tool was useful to assess the effect of adding interactive modeling to the geoscience curriculum. Supported by NSF CAREER grant (EAR-0955750).
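For readers unfamiliar with the statistic used above, the Kruskal-Wallis comparison of the two cohorts' scores can be reproduced in a few lines with SciPy. The score lists below are hypothetical stand-ins, not the study's data.

from scipy.stats import kruskal

# Hypothetical post-test gain scores for the control (2011) and intervention (2012) cohorts.
gains_2011 = [1, 2, 0, 3, 1, 2, 2, 1, 0, 2]
gains_2012 = [3, 4, 2, 5, 3, 4, 2, 5, 4, 3]

h_stat, p_value = kruskal(gains_2011, gains_2012)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")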
Review and Reward within the Computerised Peer-Assessment of Essays
ERIC Educational Resources Information Center
Davies, Phil
2009-01-01
This article details the implementation and use of a "Review Stage" within the CAP (computerised assessment by peers) tool as part of the assessment process for a post-graduate module in e-learning. It reports upon the effect of providing the students with a "second chance" in marking and commenting their peers' essays having been able to view the…
Ogourtsova, Tatiana; Archambault, Philippe S; Lamontagne, Anouk
2017-11-07
Hemineglect, defined as a failure to attend to the contralesional side of space, is a prevalent and disabling post-stroke deficit. Conventional hemineglect assessments lack sensitivity as they contain mainly non-functional tasks performed in near-extrapersonal space, using static, two-dimensional methods. This is of concern given that hemineglect is a strong predictor for functional deterioration, limited post-stroke recovery, and difficulty in community reintegration. With the emerging field of virtual reality, several virtual tools have been proposed and have reported better sensitivity in neglect-related deficits detection than conventional methods. However, these and future virtual reality-based tools are yet to be implemented in clinical practice. The present study aimed to explore the barriers/facilitators perceived by clinicians in the use of virtual reality for hemineglect assessment; and to identify features of an optimal virtual assessment. A qualitative descriptive process, in the form of focus groups, self-administered questionnaire and individual interviews was used. Two focus groups (n = 11 clinicians) were conducted and experts in the field (n = 3) were individually interviewed. Several barriers and facilitators, including personal, institutional, client suitability, and equipment factors, were identified. Clinicians and experts in the field reported numerous features for the virtual tool optimization. Factors identified through this study lay the foundation for the development of a knowledge translation initiative towards an implementation of a virtual assessment for hemineglect. Addressing the identified barriers/facilitators during implementation and incorporating the optimal features in the design of the virtual assessment could assist and promote its eventual adoption in clinical settings. Implications for rehabilitation A multimodal and active knowledge translation intervention built on the presently identified modifiable factors is suggested to be implemented to support the clinical integration of a virtual reality-based assessment for post-stroke hemineglect. To amplify application and usefulness of a virtual-reality based tool in the assessment of post-stroke hemineglect, optimal features identified in the present study should be incorporated in the design of such technology.
Nissan, Noam; Furman-Haran, Edna; Shapiro-Feinberg, Myra; Grobgeld, Dov; Degani, Hadassa
2017-09-01
Lactation and the return to the pre-conception state during post-weaning are regulated by hormonally induced processes that modify the microstructure of the mammary gland, leading to changes in the features of the ductal/glandular tissue, the stroma and the fat tissue. These changes create a challenge in the radiological workup of breast disorders during lactation and early post-weaning. Here we present non-invasive MRI protocols designed to record in vivo high-spatial-resolution, T2-weighted images and diffusion tensor images of the entire mammary gland. Advanced image-processing tools enabled tracking the changes in the anatomical and microstructural features of the mammary gland from the time of lactation to post-weaning. Specifically, by using diffusion tensor imaging (DTI) it was possible to quantitatively distinguish between the ductal/glandular tissue distention during lactation and the post-weaning involution. The application of T2-weighted imaging and DTI is completely safe and non-invasive, and uses intrinsic contrast based on differences in transverse relaxation rates and water diffusion rates in various directions, respectively. This study provides a basis for further in vivo monitoring of changes during the mammary developmental stages, as well as identifying changes due to malignant transformation in patients with pregnancy-associated breast cancer (PABC).
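The scalar most often derived from the diffusion tensor in DTI analyses of this kind is fractional anisotropy; the short sketch below computes it from a tensor's eigenvalues. The example tensors and their values are invented and are not taken from the study.

import numpy as np

def fractional_anisotropy(tensor):
    # Fractional anisotropy of a 3x3 diffusion tensor (standard DTI scalar metric).
    evals = np.linalg.eigvalsh(tensor)
    md = evals.mean()
    num = np.sqrt(((evals - md) ** 2).sum())
    den = np.sqrt((evals ** 2).sum())
    return np.sqrt(1.5) * num / den if den > 0 else 0.0

# Illustrative tensors (units of 1e-3 mm^2/s, values assumed): an elongated duct-like voxel
# versus a nearly isotropic one.
anisotropic = np.diag([2.0, 0.6, 0.5])
isotropic = np.diag([1.0, 0.95, 1.05])
print(fractional_anisotropy(anisotropic), fractional_anisotropy(isotropic))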
Semi-automated camera trap image processing for the detection of ungulate fence crossing events.
Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija
2017-09-27
Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, requiring the input of substantial time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset which had been previously processed by a human operator. Our program placed images into categories representing the confidence that a particular sequence of images contained a fence crossing event. This resulted in a reduction of 54.8% in the images that required further human operator characterization while retaining 72.6% of the known fence crossing events. This program can provide researchers using remote camera data with the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
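As a rough illustration of the background-subtraction idea described above (and not the published program's actual rules or thresholds), a still frame can be flagged when enough pixels deviate from a reference background frame:

import numpy as np

def flag_candidate_image(image, background, diff_threshold=25, min_changed_fraction=0.005):
    # Flag a still camera-trap frame if enough pixels differ from a reference background frame.
    # The thresholds here are illustrative assumptions, not the published program's settings.
    diff = np.abs(image.astype(np.int16) - background.astype(np.int16))
    changed = (diff > diff_threshold).mean()
    return changed > min_changed_fraction, changed

# Toy 8-bit grayscale frames.
rng = np.random.default_rng(2)
background = rng.integers(90, 110, size=(480, 640), dtype=np.uint8)
frame = background.copy()
frame[200:260, 300:420] = 200            # a bright, animal-sized blob entering the scene
print(flag_candidate_image(frame, background))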
NASA Astrophysics Data System (ADS)
Biermann, D.; Kahleyss, F.; Krebs, E.; Upmeier, T.
2011-07-01
Micro-sized applications are gaining more and more relevance for NiTi-based shape memory alloys (SMA). Different types of micro-machining offer unique possibilities for the manufacturing of NiTi components. The advantage of machining is the low thermal influence on the workpiece. This is important because the phase transformation temperatures of NiTi SMAs can otherwise be changed, and the components may then need extensive post-manufacturing treatment. The article offers a simulation-based approach to optimize five-axis micro-milling processes with respect to the special material properties of NiTi SMA. In particular, the influence of the various tool inclination angles is considered in order to introduce an intelligent tool-inclination optimization algorithm. Furthermore, aspects of micro deep-hole drilling of SMAs are discussed. Tools with diameters as small as 0.5 mm are used. The possible length-to-diameter ratio reaches up to 50. This process offers new possibilities in the manufacturing of microstents. The study concentrates on the influence of the cutting speed, the feed and the tool design on the tool wear and the quality of the drilled holes.
Alignment Tool For Inertia Welding
NASA Technical Reports Server (NTRS)
Snyder, Gary L.
1991-01-01
Compact, easy-to-use tool aligns drive bar of inertia welder over hole in stub. Ensures drive bar concentric to hole within 0.002 in. (0.051 mm). Holds two batteries and light bulb. Electrical circuit completed, providing current to bulb when pin in contact with post. When pin centered in post hole, it does not touch post, and lamp turns off. Built for use in making repair welds on liquid-oxygen-injector posts in Space Shuttle main engine. Version having suitably modified dimensions used to facilitate alignment on other types of posts.
Hadanny, Amir; Efrati, Shai
2016-08-01
Persistent post-concussion syndrome caused by mild traumatic brain injury has become a major cause of morbidity and poor quality of life. Unlike the acute care of concussion, there is no consensus for treatment of chronic symptoms. Moreover, most of the pharmacologic and non-pharmacologic treatments have failed to demonstrate significant efficacy on both the clinical symptoms as well as the pathophysiologic cascade responsible for the permanent brain injury. This article reviews the pathophysiology of PCS, the diagnostic tools and criteria, the current available treatments including pharmacotherapy and different cognitive rehabilitation programs, and promising new treatment directions. A most promising new direction is the use of hyperbaric oxygen therapy, which targets the basic pathological processes responsible for post-concussion symptoms; it is discussed here in depth.
NASA Astrophysics Data System (ADS)
Kern, Bastian; Jöckel, Patrick
2016-10-01
Numerical climate and weather models have advanced to finer scales, accompanied by large amounts of output data. The model systems hit the input and output (I/O) bottleneck of modern high-performance computing (HPC) systems. We aim to apply diagnostic methods online during the model simulation instead of applying them as a post-processing step to written output data, to reduce the amount of I/O. To include diagnostic tools in the model system, we implemented a standardised, easy-to-use interface based on the Modular Earth Submodel System (MESSy) into the ICOsahedral Non-hydrostatic (ICON) modelling framework. The integration of the diagnostic interface into the model system is briefly described. Furthermore, we present a prototype implementation of an advanced online diagnostic tool for the aggregation of model data onto a user-defined regular coarse grid. This diagnostic tool will be used to reduce the amount of model output in future simulations. Performance tests of the interface and of two different diagnostic tools show that the interface itself introduces no overhead in the form of additional runtime to the model system. The diagnostic tools, however, have a significant impact on the model system's runtime. This overhead strongly depends on the characteristics and implementation of the diagnostic tool. A diagnostic tool with high inter-process communication introduces large overhead, whereas the additional runtime of a diagnostic tool without inter-process communication is low. We briefly describe our efforts to reduce the additional runtime from the diagnostic tools, and present a brief analysis of memory consumption. Future work will focus on optimisation of the memory footprint and the I/O operations of the diagnostic interface.
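The basic idea of the coarse-grid aggregation diagnostic can be sketched with simple block averaging on a regular grid. This mirrors the concept only; the actual tool works on the ICON icosahedral grid, in parallel, inside the model, so the function and field below are illustrative assumptions.

import numpy as np

def block_average(field, factor):
    # Aggregate a 2-D model field onto a coarser regular grid by block averaging.
    ny, nx = field.shape
    assert ny % factor == 0 and nx % factor == 0, "grid must divide evenly for this sketch"
    return field.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

fine = np.random.default_rng(3).random((180, 360))   # e.g. a 1-degree lat-lon field
coarse = block_average(fine, 4)                      # aggregated to 4-degree boxes
print(fine.shape, "->", coarse.shape)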
NASA Astrophysics Data System (ADS)
Sinha, Neeraj; Zambon, Andrea; Ott, James; Demagistris, Michael
2015-06-01
Driven by the continuing rapid advances in high-performance computing, multi-dimensional high-fidelity modeling is an increasingly reliable predictive tool capable of providing valuable physical insight into complex post-detonation reacting flow fields. Utilizing a series of test cases featuring blast waves interacting with combustible dispersed clouds in a small-scale test setup under well-controlled conditions, the predictive capabilities of a state-of-the-art code are demonstrated and validated. Leveraging physics-based, first principle models and solving large system of equations on highly-resolved grids, the combined effects of finite-rate/multi-phase chemical processes (including thermal ignition), turbulent mixing and shock interactions are captured across the spectrum of relevant time-scales and length scales. Since many scales of motion are generated in a post-detonation environment, even if the initial ambient conditions are quiescent, turbulent mixing plays a major role in the fireball afterburning as well as in dispersion, mixing, ignition and burn-out of combustible clouds in its vicinity. Validating these capabilities at the small scale is critical to establish a reliable predictive tool applicable to more complex and large-scale geometries of practical interest.
Elhadad, N.
2016-01-01
Objectives: This paper reviews work over the past two years in Natural Language Processing (NLP) applied to clinical and consumer-generated texts. Methods: We included any application or methodological publication that leverages text to facilitate healthcare and address the health-related needs of consumers and populations. Results: Many important developments in clinical text processing, both foundational and task-oriented, were addressed in community-wide evaluations and discussed in corresponding special issues that are referenced in this review. These focused issues and in-depth reviews of several other active research areas, such as pharmacovigilance and summarization, allowed us to discuss in greater depth disease modeling and predictive analytics using clinical texts, and text analysis in social media for healthcare quality assessment, trends towards online interventions based on rapid analysis of health-related posts, and consumer health question answering, among other issues. Conclusions: Our analysis shows that although clinical NLP continues to advance towards practical applications and more NLP methods are used in large-scale live health information applications, more needs to be done to make NLP use in clinical applications a routine widespread reality. Progress in clinical NLP is mirrored by developments in social media text analysis: the research is moving from capturing trends to addressing individual health-related posts, thus showing potential to become a tool for precision medicine and a valuable addition to the standard healthcare quality evaluation tools. PMID:27830255
Capturing district nursing through a knowledge-based electronic caseload analysis tool (eCAT).
Kane, Kay
2014-03-01
The Electronic Caseload Analysis Tool (eCAT) is a knowledge-based software tool to assist the caseload analysis process. The tool provides a wide range of graphical reports, along with an integrated clinical advisor, to assist district nurses, team leaders, operational and strategic managers with caseload analysis by describing, comparing and benchmarking district nursing practice in the context of population need, staff resources, and service structure. District nurses and clinical lead nurses in Northern Ireland developed the tool, along with academic colleagues from the University of Ulster, working in partnership with a leading software company. The aim was to use the eCAT tool to identify the nursing need of local populations, along with the variances in district nursing practice, and match the workforce accordingly. This article reviews the literature, describes the eCAT solution and discusses the impact of eCAT on nursing practice, staff allocation, service delivery and workforce planning, using fictitious exemplars and a post-implementation evaluation from the trusts.
Dynamic Modeling of Process Technologies for Closed-Loop Water Recovery Systems
NASA Technical Reports Server (NTRS)
Allada, Rama Kumar; Lange, Kevin; Anderson, Molly
2011-01-01
Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which become extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents dynamic simulations of chemical processes for primary processor technologies including: the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system, the Wiped-Film Rotating Disk (WFRD), and post-distillation water polishing processes such as the Volatiles Removal Assembly (VRA) that were developed using the Aspen Custom Modeler and Aspen Plus process simulation tools. The results expand upon previous work for water recovery technology models and emphasize dynamic process modeling and results. The paper discusses system design, modeling details, and model results for each technology and presents some comparisons between the model results and available test data. Following these initial comparisons, some general conclusions and forward work are discussed.
NASA Astrophysics Data System (ADS)
gochis, David; hooper, Rick; parodi, Antonio; Jha, Shantenu; Yu, Wei; Zaslavsky, Ilya; Ganapati, Dinesh
2014-05-01
The community WRF-Hydro system is currently being used in a variety of flood prediction and regional hydroclimate impacts assessment applications around the world. Despite its increasingly wide use, certain cyberinfrastructure bottlenecks exist in the setup, execution and post-processing of WRF-Hydro model runs. These bottlenecks result in wasted time, labor, data transfer bandwidth and computational resource use. Appropriate development and use of cyberinfrastructure to set up and manage WRF-Hydro modeling applications will streamline the entire workflow of hydrologic model predictions. This talk will present recent advances in the development and use of new open-source cyberinfrastructure tools for the WRF-Hydro architecture. These tools include new web-accessible pre-processing applications, supercomputer job management applications and automated verification and visualization applications. The tools will be described successively and then demonstrated in a set of flash flood use cases for recent destructive flood events in the U.S. and in Europe. Throughout, emphasis is placed on the implementation and use of community data standards for data exchange.
FUJIFILM X10 white orbs and DeOrbIt
NASA Astrophysics Data System (ADS)
Dietz, Henry Gordon
2013-01-01
The FUJIFILM X10 is a high-end enthusiast compact digital camera using an unusual sensor design. Unfortunately, upon its Fall 2011 release, the camera quickly became infamous for the uniquely disturbing "white orbs" that often appeared in areas where the sensor was saturated. FUJIFILM's first attempt at a fix was firmware released on February 25, 2012, but it had little effect. In April 2012, a sensor replacement essentially solved the problem. This paper explores the "white orb" phenomenon in detail. After FUJIFILM's attempt at a firmware fix failed, the author decided to create a post-processing tool that could automatically repair existing images. DeOrbIt was released as a free tool on March 7, 2012. To better understand the problem and how to fix it, the WWW form version of the tool logs images, processing parameters, and evaluations by users. The current paper describes the technical problem, the novel computational photography methods used by DeOrbIt to repair affected images, and the public perceptions revealed by this experiment.
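The paper describes DeOrbIt's own (novel) repair methods. Purely as a generic illustration of the same class of problem, locating saturated regions and filling them from their surroundings, a simple OpenCV-based sketch might look like the following; it is not DeOrbIt's algorithm, the thresholds are assumptions, and the file name in the usage comment is a placeholder.

import cv2
import numpy as np

def repair_saturated_regions(bgr_image, sat_threshold=250, dilate_px=5):
    # Generic repair of blown-out regions: build a mask of (near-)saturated pixels,
    # grow it slightly, and fill it by diffusion-based inpainting.
    # This is only an illustration; it is NOT the algorithm used by DeOrbIt.
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    mask = (gray >= sat_threshold).astype(np.uint8) * 255
    mask = cv2.dilate(mask, np.ones((dilate_px, dilate_px), np.uint8))
    return cv2.inpaint(bgr_image, mask, 7, cv2.INPAINT_TELEA)

# Usage (the path is a placeholder):
# fixed = repair_saturated_regions(cv2.imread("x10_photo.jpg"))
# cv2.imwrite("x10_photo_fixed.jpg", fixed)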
Time-driven activity-based costing: A dynamic value assessment model in pediatric appendicitis.
Yu, Yangyang R; Abbas, Paulette I; Smith, Carolyn M; Carberry, Kathleen E; Ren, Hui; Patel, Binita; Nuchtern, Jed G; Lopez, Monica E
2017-06-01
Healthcare reform policies are emphasizing value-based healthcare delivery. We hypothesize that time-driven activity-based costing (TDABC) can be used to appraise healthcare interventions in pediatric appendicitis. Triage-based standing delegation orders, surgical advanced practice providers, and a same-day discharge protocol were implemented to target deficiencies identified in our initial TDABC model. Post-intervention process maps for a hospital episode were created using electronic time stamp data for simple appendicitis cases during February to March 2016. Total personnel and consumable costs were determined using TDABC methodology. The post-intervention TDABC model featured 6 phases of care, 33 processes, and 19 personnel types. Our interventions reduced duration and costs in the emergency department (-41min, -$23) and pre-operative floor (-57min, -$18). While post-anesthesia care unit duration and costs increased (+224min, +$41), the same-day discharge protocol eliminated post-operative floor costs (-$306). Our model incorporating all three interventions reduced total direct costs by 11% ($2753.39 to $2447.68) and duration of hospitalization by 51% (1984min to 966min). Time-driven activity-based costing can dynamically model changes in our healthcare delivery as a result of process improvement interventions. It is an effective tool to continuously assess the impact of these interventions on the value of appendicitis care. Level of evidence: II. Type of study: economic analysis. Copyright © 2017 Elsevier Inc. All rights reserved.
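The core TDABC calculation is simple enough to show in a few lines: each process step's time is multiplied by the capacity cost rate of the resource performing it, and the products are summed across the episode. The step names, minutes and rates below are made up for illustration and are not the paper's values.

def tdabc_cost(process_steps):
    # Time-driven activity-based costing:
    # cost = sum over steps of (minutes used) x (capacity cost rate per minute of the resource).
    return sum(minutes * rate_per_min for _, minutes, rate_per_min in process_steps)

# Hypothetical simple-appendicitis episode (personnel types, minutes and $/min are assumed).
episode = [
    ("ED triage nurse",        20, 0.80),
    ("ED physician",           15, 2.50),
    ("Surgeon, OR",            45, 4.00),
    ("Anesthesiologist, OR",   45, 3.00),
    ("PACU nurse",            120, 0.90),
]
print(f"Direct personnel cost: ${tdabc_cost(episode):,.2f}")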
Interventional MR: vascular applications.
Smits, H F; Bos, C; van der Weide, R; Bakker, C J
1999-01-01
Three strategies for visualisation of MR-dedicated guidewires and catheters have been proposed, namely active tracking, the technique of locally induced field inhomogeneity and passive susceptibility-based tracking. In this article the pros and cons of these techniques are discussed, including the development of MR-dedicated guidewires and catheters, scan techniques, post-processing tools, and display facilities for MR tracking. Finally, some of the results obtained with MR tracking are discussed.
2013-04-01
Forces can be computed at specific angular positions, and geometrical parameters can be evaluated. Much higher resolution models are required, along...composition engines (C#, C++, Python, Java ) Desert operates on the CyPhy model, converting from a design space alternative structure to a set of design...consists of scripts to execute dymola, post-processing of results to create metrics, and general management of the job sequence. An earlier version created
Validation of a Pre- and Post-Evaluation Process: A Tool for Adult Training in Food Handling
ERIC Educational Resources Information Center
Mastrantonio, Guido; Dulout, Mariana; González, María Lourdes; Zeinsteger, Pedro
2014-01-01
Education in food safety is a well-recognized health intervention, which allows the prevention of a wide range of diseases. Among the strategies of control and prevention of foodborne diseases, it is indicated that food safety education has the double advantage of having low costs and high potential effectiveness, as long as it is carried out with…
NASA Technical Reports Server (NTRS)
Englander, Jacob
2016-01-01
This set of tutorial slides is an introduction to the Evolutionary Mission Trajectory Generator (EMTG), NASA Goddard Space Flight Center's autonomous tool for preliminary design of interplanetary missions. This slide set covers the basics of creating and post-processing simple interplanetary missions in EMTG using both high-thrust chemical and low-thrust electric propulsion along with a variety of operational constraints.
Comparative study of resist stabilization techniques for metal etch processing
NASA Astrophysics Data System (ADS)
Becker, Gerry; Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Livesay, William R.
1999-06-01
This study investigates resist stabilization techniques as they are applied to a metal etch application. The techniques that are compared are conventional deep-UV/thermal stabilization, or UV bake, and electron beam stabilization. The electron beam tool used in this study, an ElectronCure system from AlliedSignal Inc., Electron Vision Group, utilizes a flood electron source and a non-thermal process. These stabilization techniques are compared with respect to a metal etch process. In this study, two types of resist are considered for stabilization and etch: a g/i-line resist, Shipley SPR-3012, and an advanced i-line, Shipley SPR 955-Cm. For each of these resists, the effects of stabilization on resist features are evaluated by post-stabilization SEM analysis. Etch selectivity in all cases is evaluated by using a timed metal etch, and measuring resist remaining relative to total metal thickness etched. Etch selectivity is presented as a function of stabilization condition. Analyses of the effects of the type of stabilization on this method of selectivity measurement are also presented. SEM analysis was also performed on the features after a complete etch process, and is detailed as a function of stabilization condition. Post-etch cleaning is also an important factor impacted by pre-etch resist stabilization. Results of post-etch cleaning are presented for both stabilization methods. SEM inspection is also detailed for the metal features after resist removal processing.
ERIC Educational Resources Information Center
Martin, Christie S.; Polly, Drew; Wang, Chuang; Lambert, Richard G.; Pugalee, David K.
2016-01-01
This study examined the influence of professional development on elementary school teachers' perceptions of and use of an internet-based formative assessment tool focused on students' number sense skills. Data sources include teacher-participants' pre and post survey, open ended response on post survey, use of the assessment tool and their written…
POST II Trajectory Animation Tool Using MATLAB, V1.0
NASA Technical Reports Server (NTRS)
Raiszadeh, Behzad
2005-01-01
A trajectory animation tool has been developed for accurately depicting the position and attitude of bodies in flight. The movies generated by this MATLAB-based tool serve as an engineering analysis aid for gaining further understanding of the dynamic behavior of bodies in flight. The tool has been designed to interface with the output generated from POST II simulations, and is able to animate a single vehicle as well as multiple vehicles in flight.
Using Web 2.0 tools to connect shore-based users to live science from the wide blue ocean
NASA Astrophysics Data System (ADS)
Cooper, S. K.; Peart, L.; Collins, J.
2009-12-01
The fast-expanding use of social networking tools, combined with improved connectivity available through satellite-provided internet on board the scientific ocean drilling vessel JOIDES Resolution (the JR), has allowed for a whole new kind of interaction. Unlike in the not-so-distant past, when non-participants were forced to wait for months to read about the results of ongoing research, web tools allow almost instantaneous participation in ship-based ocean science. Utilizing a brand new portal, joidesresolution.org, scientists and educators at sea can post daily blogs about their work and respond to questions and comments on those blogs, update the JR's Facebook and Twitter pages, and post videos and photos to YouTube and Flickr regularly. Live video conferencing tools also allow for direct interaction with scientists and a view into the work being done on board in real time. These tools have allowed students, teachers and families, groups and individuals on shore to follow along with the expeditions of the ship and its exciting scientific explorations -- and become a part of them. Building this community provides a whole range of rich interactions and brings seafloor research and the real process of science to those who would never before have had access to it. This presentation will include an overview of the web portal and its associated social networking sites, as well as a discussion of the challenges and lessons learned over nearly a year of utilizing these new tools. (Figure: the joidesresolution.org web portal home page.)
In-Line Monitoring of Fab Processing Using X-Ray Diffraction
NASA Astrophysics Data System (ADS)
Gittleman, Bruce; Kozaczek, Kris
2005-09-01
As the materials shift that started with Cu continues to advance in the semiconductor industry, new issues related to materials microstructure have arisen. While x-ray diffraction (XRD) has long been used in development applications, in this paper we show that results generated in real time by a unique, high-throughput, fully automated XRD metrology tool can be used to develop metrics for qualification and monitoring of critical processes in current and future manufacturing. It will be shown that these metrics provide a unique set of data that correlate to manufacturing issues. For example, ionized sputtering is the current deposition method of choice for both the Cu seed and TaNx/Ta barrier layers. The alpha phase of Ta is widely used in production for the upper layer of the barrier stack, but complete elimination of the beta phase requires a TaNx layer with sufficient N content, yet not so much as to start poisoning the target and generating particle issues. This is a well documented issue, but traditional monitoring by sheet resistance methods cannot guarantee the absence of the beta phase, whereas XRD can detect the presence of even small amounts of beta. Nickel silicide for gate metallization is another example where monitoring of phase is critical. As well as being able to qualify an anneal process that gives only the desired NiSi phase everywhere across the wafer, XRD can be used to determine whether full silicidation of the Ni has occurred and to characterize the crystallographic microstructure of the Ni to determine any effect of that microstructure on the anneal process. The post-anneal nickel silicide phase and the uniformity of the silicide microstructure can both be monitored in production. Other examples of the application of XRD to process qualification and production monitoring derive from the dependence of certain processes, some types of defect generation, and device performance on crystallographic texture. The data presented will show that CMP dishing problems could be traced to the texture of the barrier layer and mitigated by adjusting the barrier process. The density of pits developed during CMP of electrochemically deposited (ECD) Cu depends on the fraction of (111)-oriented grains. It must be emphasized that crystallographic texture is not only a key parameter for qualification of high-yielding and reliable processes, but also serves as a critical parameter for monitoring tool health. The textures of Cu and W are sensitive not only to deviations in the performance of the tool depositing or annealing a particular film, but also to the texture of the barrier underlayers, and thus to any performance deviations in those tools. The XRD metrology tool has been designed with production monitoring in mind and has been fully integrated into both 200 mm and 300 mm fabs. Rapid analysis is achieved by using a high-intensity fixed x-ray source coupled with a large-area 2D detector. The output metrics from one point are generated while the tool is measuring a subsequent point, giving true on-the-fly analysis; no post-processing of data is necessary. Spatial resolution on the wafer surface ranging from 35 μm to 1 mm is available, making the tool suitable for monitoring of product wafers. Typical analysis times range from 10 seconds to 2 minutes per point, depending on the film thickness and spot size. Current metrics used for process qualification and production monitoring are phase, FWHM of the primary phase peaks (for mean grain size tracking), and crystallographic texture.
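The FWHM-based grain-size metric mentioned above can be illustrated with the Scherrer equation, D = K·λ/(β·cos θ), a textbook relation between peak broadening and crystallite size. The short Python sketch below uses assumed wavelength, shape factor, and peak values for illustration; it is not necessarily the metric implemented in the commercial tool.

import numpy as np

def scherrer_size(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """Estimate mean crystallite size (nm) from XRD peak broadening.

    Scherrer equation: D = K * lambda / (beta * cos(theta)),
    where beta is the FWHM in radians and theta is the Bragg angle.
    Instrumental broadening is neglected here for simplicity.
    """
    beta = np.radians(fwhm_deg)             # FWHM in radians
    theta = np.radians(two_theta_deg / 2)   # Bragg angle
    return K * wavelength_nm / (beta * np.cos(theta))

# Illustrative numbers only: a Cu(111) peak near 2-theta = 43.3 deg with 0.4 deg FWHM
print(f"Mean grain size ~ {scherrer_size(0.4, 43.3):.1f} nm")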
Using Facebook to Facilitate Course-Related Discussion Between Students and Faculty Members
Kirwin, Jennifer L.
2012-01-01
Objectives. To use Facebook to facilitate online discussion of the content of a Comprehensive Disease Management course and to evaluate student use and perceptions of this exercise. Design. A Facebook page was created and coordinators encouraged students to “like” the page and to post and view study tips, links, or questions. At the end of the course, students’ use and perceptions were evaluated using an anonymous survey tool. Assessment. At the end of week 1, there were 81 followers, 5 wall posts, and 474 visits to the course Facebook page. At peak use, the page had 117 followers, 18 wall posts, and 1,326 visits. One hundred nineteen students (97% of the class) completed the survey tool. Twenty-six percent of students contributed posts compared to 11% who posted on the course discussion board on Blackboard. Students were more likely to post and be exposed to posts on Facebook than on Blackboard. Students found Facebook helpful and 57% said they would miss Facebook if use was not continued in subsequent courses. Conclusions. Students in a Comprehensive Disease Management course found the addition of a Facebook page a valuable study tool and thought most posts added to their learning. PMID:22438604
Using Facebook to facilitate course-related discussion between students and faculty members.
DiVall, Margarita V; Kirwin, Jennifer L
2012-03-12
To use Facebook to facilitate online discussion of the content of a Comprehensive Disease Management course and to evaluate student use and perceptions of this exercise. A Facebook page was created and coordinators encouraged students to "like" the page and to post and view study tips, links, or questions. At the end of the course, students' use and perceptions were evaluated using an anonymous survey tool. At the end of week 1, there were 81 followers, 5 wall posts, and 474 visits to the course Facebook page. At peak use, the page had 117 followers, 18 wall posts, and 1,326 visits. One hundred nineteen students (97% of the class) completed the survey tool. Twenty-six percent of students contributed posts compared to 11% who posted on the course discussion board on Blackboard. Students were more likely to post and be exposed to posts on Facebook than on Blackboard. Students found Facebook helpful and 57% said they would miss Facebook if use was not continued in subsequent courses. Students in a Comprehensive Disease Management course found the addition of a Facebook page a valuable study tool and thought most posts added to their learning.
FY17 Status Report on NEAMS Neutronics Activities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, C. H.; Jung, Y. S.; Smith, M. A.
2017-09-30
Under the U.S. DOE NEAMS program, the high-fidelity neutronics code system has been developed to support the multiphysics modeling and simulation capability named SHARP. The neutronics code system includes the high-fidelity neutronics code PROTEUS, the cross section library and preprocessing tools, the multigroup cross section generation code MC2-3, the in-house mesh generation tool, the perturbation and sensitivity analysis code PERSENT, and post-processing tools. The main objectives of the NEAMS neutronics activities in FY17 are to continue development of an advanced nodal solver in PROTEUS for use in nuclear reactor design and analysis projects, implement a simplified sub-channel based thermal-hydraulic (T/H) capability into PROTEUS to efficiently compute the thermal feedback, improve the performance of PROTEUS-MOCEX using numerical acceleration and code optimization, improve the cross section generation tools including MC2-3, and continue to perform verification and validation tests for PROTEUS.
Utilization of a postoperative adenotonsillectomy teaching video: A pilot study.
Khan, Sarah; Tumin, Dmitry; King, Adele; Rice, Julie; Jatana, Kris R; Tobias, Joseph D; Raman, Vidya T
2017-11-01
Pediatric tonsillectomies are increasingly being performed as an outpatient procedure, thereby increasing the parental role in post-operative pain management. However, it is unclear if parents receive adequate teaching regarding pain management. We introduced a video teaching tool and compared its efficacy alone and in combination with the standard verbal instruction. A prospective study randomized parents or caregivers of children undergoing tonsillectomy ± adenoidectomy into three groups: 1) standard verbal post-operative instructions; 2) the video teaching tool along with standard verbal instructions; or 3) the video teaching tool only. Parents completed pre- and post-instruction assessments of their knowledge of post-operative pain management, with responses scored from 0 to 8. Telephone assessments were conducted within 48 post-operative hours with a subjective rating of the helpfulness of the video teaching tool. The study cohort included 99 patients and their families. The median pre-instruction score was 5 of 8 points (interquartile range [IQR]: 4, 6) and this remained at 5 following instruction (IQR: 4, 6; p = 0.702 for the difference from baseline). Baseline scores did not vary across the groups (p = 0.156) and there was no increase in the knowledge score from pre- to post-test in any of the three groups. Groups B and C rated the helpfulness of the video teaching tool with a median score of 4 of 5 (IQR: 4, 5). A baseline deficit exists in parental understanding of post-operative pain management that did not statistically improve regardless of the form of post-operative instruction used (verbal vs. video-based instruction). However, the high helpfulness scores in both video groups support the use of video instruction as an alternative or complement to verbal instruction. Further identification of knowledge deficits is nonetheless required for optimization of post-operative educational materials. Copyright © 2017 Elsevier B.V. All rights reserved.
Validity and reliability of food security measures.
Cafiero, Carlo; Melgar-Quiñonez, Hugo R; Ballard, Terri J; Kepple, Anne W
2014-12-01
This paper reviews some of the existing food security indicators, discussing the validity of the underlying concept and the expected reliability of measures under reasonably feasible conditions. The main objective of the paper is to raise awareness on existing trade-offs between different qualities of possible food security measurement tools that must be taken into account when such tools are proposed for practical application, especially for use within an international monitoring framework. The hope is to provide a timely, useful contribution to the process leading to the definition of a food security goal and the associated monitoring framework within the post-2015 Development Agenda. © 2014 New York Academy of Sciences.
Etienne, Audrey; Génard, Michel; Bugaud, Christophe
2015-01-01
Citrate is one of the most important organic acids in many fruits and its concentration plays a critical role in organoleptic properties. The regulation of citrate accumulation throughout fruit development, and the origins of the phenotypic variability of the citrate concentration within fruit species remain to be clarified. In the present study, we developed a process-based model of citrate accumulation based on a simplified representation of the TCA cycle to predict citrate concentration in fruit pulp during the pre- and post-harvest stages. Banana fruit was taken as a reference because it has the particularity of having post-harvest ripening, during which citrate concentration undergoes substantial changes. The model was calibrated and validated on the two stages, using data sets from three contrasting cultivars in terms of citrate accumulation, and incorporated different fruit load, potassium supply, and harvest dates. The model predicted the pre and post-harvest dynamics of citrate concentration with fairly good accuracy for the three cultivars. The model suggested major differences in TCA cycle functioning among cultivars during post-harvest ripening of banana, and pointed to a potential role for NAD-malic enzyme and mitochondrial malate carriers in the genotypic variability of citrate concentration. The sensitivity of citrate accumulation to growth parameters and temperature differed among cultivars during post-harvest ripening. Finally, the model can be used as a conceptual basis to study citrate accumulation in fleshy fruits and may be a powerful tool to improve our understanding of fruit acidity.
Automated Extraction of Flow Features
NASA Technical Reports Server (NTRS)
Dorney, Suzanne (Technical Monitor); Haimes, Robert
2005-01-01
Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, re-circulation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information, but at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
Automated Extraction of Flow Features
NASA Technical Reports Server (NTRS)
Dorney, Suzanne (Technical Monitor); Haimes, Robert
2004-01-01
Computational Fluid Dynamics (CFD) simulations are routinely performed as part of the design process of most fluid handling devices. In order to efficiently and effectively use the results of a CFD simulation, visualization tools are often used. These tools are used in all stages of the CFD simulation including pre-processing, interim-processing, and post-processing, to interpret the results. Each of these stages requires visualization tools that allow one to examine the geometry of the device, as well as the partial or final results of the simulation. An engineer will typically generate a series of contour and vector plots to better understand the physics of how the fluid is interacting with the physical device. Of particular interest is detecting features such as shocks, recirculation zones, and vortices (which will highlight areas of stress and loss). As the demand for CFD analyses continues to increase, the need for automated feature extraction capabilities has become vital. In the past, feature extraction and identification were interesting concepts, but not required in understanding the physics of a steady flow field. This is because the results of the more traditional tools, like iso-surfaces, cuts, and streamlines, were more interactive and easily abstracted so they could be represented to the investigator. These tools worked and properly conveyed the collected information, but at the expense of a great deal of interaction. For unsteady flow-fields, the investigator does not have the luxury of spending time scanning only one "snapshot" of the simulation. Automated assistance is required in pointing out areas of potential interest contained within the flow. This must not require a heavy compute burden (the visualization should not significantly slow down the solution procedure for co-processing environments). Methods must be developed to abstract the feature of interest and display it in a manner that physically makes sense.
An SPM12 extension for multiple sclerosis lesion segmentation
NASA Astrophysics Data System (ADS)
Roura, Eloy; Oliver, Arnau; Cabezas, Mariano; Valverde, Sergi; Pareto, Deborah; Vilanova, Joan C.; Ramió-Torrentà, Lluís.; Rovira, Àlex; Lladó, Xavier
2016-03-01
Purpose: Magnetic resonance imaging is nowadays the hallmark for diagnosing multiple sclerosis (MS), characterized by white matter lesions. Several approaches have been recently presented to tackle the lesion segmentation problem, but none of them has been accepted as a standard tool in daily clinical practice. In this work we present yet another tool able to automatically segment white matter lesions, outperforming the current state-of-the-art approaches. Methods: This work is an extension of Roura et al. [1], where external and platform-dependent pre-processing libraries (brain extraction, noise reduction and intensity normalization) were required to achieve an optimal performance. Here we have updated and included all these required pre-processing steps into a single framework (the SPM software). Therefore, there is no need for external tools to achieve the desired segmentation results. Besides, we have changed the working space from T1w to FLAIR, reducing interpolation errors produced in the registration process from FLAIR to T1w space. Finally, a post-processing constraint based on shape and location has been added to reduce false positive detections. Results: The evaluation of the tool has been done on 24 MS patients. Qualitative and quantitative results are shown with both approaches in terms of lesion detection and segmentation. Conclusion: We have simplified both installation and implementation of the approach, providing a multiplatform tool integrated into the SPM software, which relies only on using T1w and FLAIR images. With this new version we have reduced the computation time of the previous approach while maintaining the performance.
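The shape- and location-based post-processing constraint described above can be illustrated with standard connected-component filtering. The following Python sketch (using scikit-image) assumes a binary lesion-candidate mask and a white-matter mask are already available; the voxel-count and overlap thresholds are placeholders, not the values used by the published SPM12 extension.

import numpy as np
from skimage import measure

def filter_candidates(lesion_mask, wm_mask, min_voxels=10, min_wm_overlap=0.5):
    """Remove lesion candidates that are too small or fall mostly outside white matter."""
    labels = measure.label(lesion_mask)              # connected components (2D or 3D)
    cleaned = np.zeros_like(lesion_mask, dtype=bool)
    for region in measure.regionprops(labels):
        coords = tuple(region.coords.T)              # voxel indices of this candidate
        wm_fraction = wm_mask[coords].mean()         # fraction lying inside white matter
        if region.area >= min_voxels and wm_fraction >= min_wm_overlap:
            cleaned[coords] = True                   # keep plausible candidates only
    return cleaned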
Henderson, Fiona; Hart, Philippa J; Pradillo, Jesus M; Kassiou, Michael; Christie, Lidan; Williams, Kaye J; Boutin, Herve; McMahon, Adam
2018-05-15
Stroke is a leading cause of disability worldwide. Understanding the recovery process post-stroke is essential; however, longer-term recovery studies are lacking. In vivo positron emission tomography (PET) can image biological recovery processes, but is limited by spatial resolution and its targeted nature. Untargeted mass spectrometry imaging offers high spatial resolution, providing an ideal ex vivo tool for brain recovery imaging. Magnetic resonance imaging (MRI) was used to image a rat brain 48 h after ischaemic stroke to locate the infarcted regions of the brain. PET was carried out 3 months post-stroke using the tracers [18F]DPA-714 for TSPO and [18F]IAM6067 for sigma-1 receptors to image neuroinflammation and neurodegeneration, respectively. The rat brain was flash-frozen immediately after PET scanning, and sectioned for matrix-assisted laser desorption/ionisation mass spectrometry (MALDI-MS) imaging. Three months post-stroke, PET imaging shows minimal detection of neurodegeneration and neuroinflammation, indicating that the brain has stabilised. However, MALDI-MS images reveal distinct differences in lipid distributions (e.g. phosphatidylcholine and sphingomyelin) between the scar and the healthy brain, suggesting that recovery processes are still in play. It is currently not known if the altered lipids in the scar will change on a longer time scale, or if they are stabilised products of the brain post-stroke. The data demonstrate the ability to combine MALDI-MS with in vivo PET to image different aspects of stroke recovery. Copyright © 2018 John Wiley & Sons, Ltd.
Pauwels, Evelyn; Van Hoof, Elke; Charlier, Caroline; Lechner, Lilian; De Bourdeaudhuij, Ilse
2012-10-03
On-line provision of information during the transition phase after treatment carries great promise in meeting shortcomings in post-treatment care for breast cancer survivors and their partners. The objectives of this study are to describe the development and process evaluation of a tailored informative website and to assess which characteristics of survivors and partners, participating in the feasibility study, are related to visiting the website. The development process included quantitative and qualitative assessments of survivors' and partners' care needs and preferences. Participants' use and evaluation of the website were explored by conducting baseline and post-measurements. During the intervening 10-12 weeks 57 survivors and 28 partners were granted access to the website. Fifty-seven percent (n=21) of survivors who took part in the post-measurement indicated that they had visited the website. Compared to non-visitors (n=16), they were more likely to have a partner and a higher income, reported higher levels of self-esteem and had completed treatment for a longer period of time. Partners who consulted the on-line information (42%, n=8) were younger and reported lower levels of social support compared to partners who did not visit the website (n=11). Visitors generally evaluated the content and lay-out positively, yet some believed the information was incomplete and impersonal. The website reached only about half of survivors and partners, yet was mostly well-received. Besides other ways of providing information and support, a website containing clear-cut and tailored information could be a useful tool in post-treatment care provision.
NASA Astrophysics Data System (ADS)
Abayan, Kenneth Munoz
Stoichiometry is a fundamental topic in chemistry that describes the quantifiable relationships between atoms, molecules, etc. Stoichiometry is usually taught using expository teaching methods: students are passively given information in the hope that they will retain it well enough to solve stoichiometry problems masterfully. Cognitive science research has shown that this kind of instructional method is not very effective in producing meaningful learning. Instead, students must take ownership of their learning. Students need to actively construct their own knowledge by receiving, interpreting, integrating and reorganizing information into their own mental schemas. In the absence of active learning practices, tools must be created in such a way as to scaffold difficult problems by encoding the opportunities necessary to make the construction of knowledge memorable, thereby creating a usable knowledge base. Using an online e-learning tool and its potential to create a dynamic and interactive learning environment may facilitate the learning of stoichiometry. The study entailed requests from volunteer students, an IRB consent form, a baseline questionnaire, random assignment of treatment, pre- and post-test assessments, and a post-assessment survey. These activities were given online. A stoichiometry-based assessment was given in a proctored examination at the University of Texas at Arlington (UTA) campus. The volunteer students who took part in these studies were at least 18 years of age and were enrolled in General Chemistry 1441 at the University of Texas at Arlington. Each participant gave informed consent to use their data in the following study. Students were randomly assigned to one of 4 treatment groups based on teaching methodology (Dimensional Analysis, Operational Method, Ratios and Proportions) or a control group that received instruction through lecture only. In this study, an e-learning tool was created to demonstrate several methodologies for solving stoichiometry problems, all supported by chemical education research. Comparisons of student performance based on pre- and post-test assessments and a stoichiometry-based examination were made to determine whether the information provided within the e-learning tool yielded greater learning outcomes compared to students in the absence of scaffolded learning material. The e-learning tool was created to help scaffold the problem-solving process necessary to help students (N=394) solve stoichiometry problems. The study investigated possible predictors of success on a stoichiometry-based examination, students' conceptual understanding of solving stoichiometry problems, and their explanations of reasoning. It was found that the way a student answered a given stoichiometry question (i.e. whether the student used dimensional analysis, the operational method or any other process) was not statistically significant (p=0.05). More importantly, students who were able to describe their thought process clearly scored significantly higher on the stoichiometry test (mean 84, p<0.05). This finding has major implications for teaching the topic, as lecturers tend to stress and focus on the method rather than the process of solving stoichiometry problems.
Social Media Listening for Routine Post-Marketing Safety Surveillance.
Powell, Gregory E; Seifert, Harry A; Reblin, Tjark; Burstein, Phil J; Blowers, James; Menius, J Alan; Painter, Jeffery L; Thomas, Michele; Pierce, Carrie E; Rodriguez, Harold W; Brownstein, John S; Freifeld, Clark C; Bell, Heidi G; Dasgupta, Nabarun
2016-05-01
Post-marketing safety surveillance primarily relies on data from spontaneous adverse event reports, medical literature, and observational databases. Limitations of these data sources include potential under-reporting, lack of geographic diversity, and time lag between event occurrence and discovery. There is growing interest in exploring the use of social media ('social listening') to supplement established approaches for pharmacovigilance. Although social listening is commonly used for commercial purposes, there are only anecdotal reports of its use in pharmacovigilance. Health information posted online by patients is often publicly available, representing an untapped source of post-marketing safety data that could supplement data from existing sources. The objective of this paper is to describe one methodology that could help unlock the potential of social media for safety surveillance. A third-party vendor acquired 24 months of publicly available Facebook and Twitter data, then processed the data by standardizing drug names and vernacular symptoms, removing duplicates and noise, masking personally identifiable information, and adding supplemental data to facilitate the review process. The resulting dataset was analyzed for safety and benefit information. In Twitter, a total of 6,441,679 Medical Dictionary for Regulatory Activities (MedDRA(®)) Preferred Terms (PTs) representing 702 individual PTs were discussed in the same post as a drug compared with 15,650,108 total PTs representing 946 individual PTs in Facebook. Further analysis revealed that 26 % of posts also contained benefit information. Social media listening is an important tool to augment post-marketing safety surveillance. Much work remains to determine best practices for using this rapidly evolving data source.
Avila, M L; Brandão, L R; Williams, S; Ward, L C; Montoya, M I; Stinson, J; Kiss, A; Lara-Corrales, I; Feldman, B M
2016-08-01
Our goal was to conduct the item generation and piloting phases of a new discriminative and evaluative tool for pediatric post-thrombotic syndrome. We followed a formative model for the development of the tool, focusing on the signs/symptoms (items) that define post-thrombotic syndrome. For item generation, pediatric thrombosis experts and subjects diagnosed with extremity post-thrombotic syndrome during childhood nominated items. In the piloting phase, items were cross-sectionally measured in children with limb deep vein thrombosis to examine item performance. Twenty-three experts and 16 subjects listed 34 items, which were then measured in 140 subjects with previous diagnosis of limb deep vein thrombosis (70 upper extremity and 70 lower extremity). The items with strongest correlation with post-thrombotic syndrome severity and largest area under the curve were pain (in older children), paresthesia, and swollen limb for the upper extremity group, and pain (in older children), tired limb, heaviness, tightness and paresthesia for the lower extremity group. The diagnostic properties of the items and their correlations with post-thrombotic syndrome severity varied according to the assessed venous territory. The information gathered in this study will help experts decide which item should be considered for inclusion in the new tool. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Prasetyo, T.; Amar, S.; Arendra, A.; Zam Zami, M. K.
2018-01-01
This study develops an on-line detection system to predict the wear of a DCMT070204 tool tip during the cutting process. The machine used in this research is a CNC ProTurn 9000 cutting an ST42 steel cylinder. The audio signal was captured using a microphone placed on the tool post and recorded in Matlab. The signal was recorded at a sampling rate of 44.1 kHz with a window size of 1024 samples. The recorded data comprise 110 signals derived from the audio captured while cutting with a normal tool and with a worn tool. Signal features were then extracted in the frequency domain using the Fast Fourier Transform, and features were selected based on correlation analysis. Tool wear classification was performed using an artificial neural network with the 33 selected input features, trained with the back-propagation method. Classification performance testing yields an accuracy of 74%.
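The processing chain described above (1024-sample frames, FFT-based features, correlation-based selection of 33 features, and a back-propagation neural network) can be sketched as follows. This is an illustrative Python reconstruction; the exact feature definitions, network architecture, and training settings of the study are not given in the abstract and are assumed here.

import numpy as np
from sklearn.neural_network import MLPClassifier

FS, FRAME = 44100, 1024  # sampling rate and window size taken from the abstract

def spectral_features(signal):
    """Average magnitude spectrum over non-overlapping 1024-sample frames."""
    n_frames = len(signal) // FRAME
    frames = signal[:n_frames * FRAME].reshape(n_frames, FRAME)
    spectra = np.abs(np.fft.rfft(frames * np.hanning(FRAME), axis=1))
    return spectra.mean(axis=0)  # one averaged spectrum per recording

def select_by_correlation(X, y, n_features=33):
    """Keep the spectral bins most correlated with the wear label (33 per the abstract)."""
    corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
    return np.argsort(corr)[-n_features:]

# Assumed data: X_raw is a list of audio recordings, y marks 0 = normal tool, 1 = worn tool
# X = np.array([spectral_features(s) for s in X_raw])
# idx = select_by_correlation(X, y)
# clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000).fit(X[:, idx], y)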
ERIC Educational Resources Information Center
Kirova, Anna; Massing, Christine; Prochner, Larry; Cleghorn, Ailie
2016-01-01
This study examines the use of PowerPoint as a teaching tool in a workplace- embedded program aimed at bridging immigrant/refugee early childhood educators into post-secondary studies, and how, in the process, it shapes students' "habits of mind" (Turkle, 2004). The premise of the study is that it is not only the bodies of knowledge…
A homology-based pipeline for global prediction of post-translational modification sites
NASA Astrophysics Data System (ADS)
Chen, Xiang; Shi, Shao-Ping; Xu, Hao-Dong; Suo, Sheng-Bao; Qiu, Jian-Ding
2016-05-01
The pathways of protein post-translational modifications (PTMs) have been shown to play particularly important roles in almost any biological process. Identification of PTM substrates, along with information on the exact sites, is fundamental for fully understanding or controlling biological processes. Alternative computational strategies would help to annotate PTMs in a high-throughput manner. Traditional algorithms are suited to the common organisms and tissues that have a complete PTM atlas or extensive experimental data, while annotation of rare PTMs in most organisms remains a clear challenge. In this work, to this end, we have developed a novel homology-based pipeline named PTMProber that allows identification of potential modification sites for most of the proteomes lacking PTM data. A cross-promotion E-value (CPE) has been used as a stringent benchmark in our pipeline to evaluate homology to known modification sites. Independent validation tests show that PTMProber achieves over 58.8% recall with high precision by the CPE benchmark. Comparisons with other machine-learning tools show that the PTMProber pipeline performs better on general predictions. We also developed a web-based tool integrating this pipeline at http://bioinfo.ncu.edu.cn/PTMProber/index.aspx. In addition to pre-constructed PTM prediction models, the website provides extended functionality allowing users to customize models.
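The homology-based idea of transferring experimentally verified sites to similar sequence contexts in a query protein can be illustrated with a much simplified window-matching sketch in Python. The cross-promotion E-value scoring used by PTMProber is not reproduced here; the identity threshold and window size below are arbitrary placeholders.

def transfer_sites(query, known_sites, flank=7, min_identity=0.8):
    """Map known modification sites onto a query protein by local window similarity.

    known_sites: list of (sequence, position) pairs with experimentally verified PTMs.
    Returns candidate 0-based positions in `query`.
    """
    hits = []
    for seq, pos in known_sites:
        if pos < flank or pos + flank >= len(seq):
            continue  # skip sites too close to the termini for a full window
        window = seq[pos - flank: pos + flank + 1]
        for i in range(len(query) - len(window) + 1):
            segment = query[i:i + len(window)]
            identity = sum(a == b for a, b in zip(segment, window)) / len(window)
            # require overall similarity and an identical central (modifiable) residue
            if identity >= min_identity and segment[flank] == window[flank]:
                hits.append(i + flank)
    return sorted(set(hits))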
High-throughput automatic defect review for 300mm blank wafers with atomic force microscope
NASA Astrophysics Data System (ADS)
Zandiatashbar, Ardavan; Kim, Byong; Yoo, Young-kook; Lee, Keibock; Jo, Ahjin; Lee, Ju Suk; Cho, Sang-Joon; Park, Sang-il
2015-03-01
While feature size in lithography process continuously becomes smaller, defect sizes on blank wafers become more comparable to device sizes. Defects with nm-scale characteristic size could be misclassified by automated optical inspection (AOI) and require post-processing for proper classification. Atomic force microscope (AFM) is known to provide high lateral and the highest vertical resolution by mechanical probing among all techniques. However, its low throughput and tip life in addition to the laborious efforts for finding the defects have been the major limitations of this technique. In this paper we introduce automatic defect review (ADR) AFM as a post-inspection metrology tool for defect study and classification for 300 mm blank wafers and to overcome the limitations stated above. The ADR AFM provides high throughput, high resolution, and non-destructive means for obtaining 3D information for nm-scale defect review and classification.
Real-time catheter localization and visualization using three-dimensional echocardiography
NASA Astrophysics Data System (ADS)
Kozlowski, Pawel; Bandaru, Raja Sekhar; D'hooge, Jan; Samset, Eigil
2017-03-01
Real-time three-dimensional transesophageal echocardiography (RT3D-TEE) is increasingly used during minimally invasive cardiac surgeries (MICS). In many cath labs, RT3D-TEE is already one of the requisite tools for image guidance during MICS. However, the visualization of the catheter is not always satisfactory, making 3D-TEE challenging to use as the only modality for guidance. We propose a novel technique for better visualization of the catheter along with the cardiac anatomy using TEE alone, exploiting both beamforming and post-processing methods. We extended our earlier method, called Delay and Standard Deviation (DASD) beamforming, to 3D in order to enhance specular reflections. The beamformed image was further post-processed by the Frangi filter to segment the catheter. Multivariate visualization techniques enabled us to render both the standard tissue image and the DASD beamformed image simultaneously on a clinical ultrasound scanner. A frame rate of 15 FPS was achieved.
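The Frangi-filter segmentation step mentioned above can be sketched with scikit-image. The snippet operates on a single 2D slice for brevity, whereas the described pipeline works on the 3D DASD-beamformed volume; the scale range and threshold are assumptions, not the authors' settings.

import numpy as np
from skimage.filters import frangi

def segment_catheter(slice_2d, threshold=0.05):
    """Highlight thin, bright tubular structures (e.g., a catheter) in an ultrasound slice."""
    lo, hi = slice_2d.min(), slice_2d.max()
    img = (slice_2d - lo) / (hi - lo + 1e-9)                  # normalize to [0, 1]
    vesselness = frangi(img, sigmas=range(1, 6), black_ridges=False)
    return vesselness > threshold                             # binary catheter candidate mask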
Measuring the success of electronic medical record implementation using electronic and survey data.
Keshavjee, K.; Troyan, S.; Holbrook, A. M.; VanderMolen, D.
2001-01-01
Computerization of physician practices is increasing. Stakeholders are demanding demonstrated value for their Electronic Medical Record (EMR) implementations. We developed survey tools to measure medical office processes, including administrative and physician tasks pre- and post-EMR implementation. We included variables that were expected to improve with EMR implementation and those that were not expected to improve, as controls. We measured the same processes pre-EMR, at six months and 18 months post-EMR. Time required for most administrative tasks decreased within six months of EMR implementation. Staff time spent on charting increased with time, in keeping with our anecdotal observations that nurses were given more responsibility for charting in many offices. Physician time to chart increased initially by 50%, but went down to original levels by 18 months. However, this may be due to the drop-out of those physicians who had a difficult time charting electronically. PMID:11825201
Hors, Cora; Goldberg, Anna Carla; Almeida, Ederson Haroldo Pereira de; Babio Júnior, Fernando Galan; Rizzo, Luiz Vicente
2012-01-01
To introduce a program for the management of scientific research in a general hospital employing the business management tools Lean Six Sigma and PMBOK for project management in this area. The Lean Six Sigma methodology was used to improve the management of the institution's scientific research through a specific tool (DMAIC) for identification and implementation of solutions, and their subsequent analysis based on PMBOK practices. We present our solutions for the management of institutional research projects at the Sociedade Beneficente Israelita Brasileira Albert Einstein. The solutions were classified into four headings: people, processes, systems and organizational culture. A preliminary analysis of these solutions showed them to be completely or partially compliant with the processes described in the PMBOK Guide. In this post facto study, we verified that the solutions drawn from a project using the Lean Six Sigma methodology and based on PMBOK enabled the improvement of our processes for the management of scientific research carried out in the institution, and constitute a model that can contribute to the search for innovative science management solutions by other institutions dealing with scientific research in Brazil.
Comparative evaluation of e-beam sensitive chemically amplified resists for mask making
NASA Astrophysics Data System (ADS)
Irmscher, Mathias; Beyer, Dirk; Butschke, Joerg; Constantine, Chris; Hoffmann, Thomas; Koepernik, Corinna; Krauss, Christian; Leibold, Bernd; Letzkus, Florian; Mueller, Dietmar; Springer, Reinhard; Voehringer, Peter
2002-07-01
Positive tone chemically amplified resists CAP209, EP012M (TOK), KRS-XE (JSR) and FEP171 (Fuji) were evaluated for mask making. The investigations were performed on an advanced tool set comprising a Steag coater ASR5000, Steag developer ASP5000, 50 kV e-beam writer Leica SB350, UNAXIS MASK ETCHER III, STS ICP silicon etcher and a CD-SEM KLA8100. We investigated and compared resolution, sensitivity, resist slope, dark field loss, CD uniformity, line edge roughness, and etch resistance of the evaluated resists. Furthermore, the influence of post coating delay, post exposure delay and other process parameters on the resist performance was determined.
Characterizing challenged Minnesota ballots
NASA Astrophysics Data System (ADS)
Nagy, George; Lopresti, Daniel; Barney Smith, Elisa H.; Wu, Ziyan
2011-01-01
Photocopies of the ballots challenged in the 2008 Minnesota elections, which constitute a public record, were scanned on a high-speed scanner and made available on a public radio website. The PDF files were downloaded, converted to TIF images, and posted on the PERFECT website. Based on a review of relevant image-processing aspects of paper-based election machinery and on additional statistics and observations on the posted sample data, robust tools were developed for determining the underlying grid of the targets on these ballots regardless of skew, clipping, and other degradations caused by high-speed copying and digitization. The accuracy and robustness of a method based on both index-marks and oval targets are demonstrated on 13,435 challenged ballot page images.
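The grid-finding step described above relies on the ballots' index-marks and oval targets; as a generic illustration of one building block of such processing, the Python sketch below estimates candidate grid rows and columns from ink-density projection profiles of a deskewed, binarized page image. It is not the authors' algorithm, and the peak-picking parameters are placeholders.

import numpy as np
from scipy.signal import find_peaks

def grid_lines(binary_img, min_separation=40):
    """Estimate candidate grid row/column positions from ink-density profiles.

    binary_img: 2D boolean array, True where the pixel is ink (already deskewed).
    Returns (row_positions, col_positions) in pixels.
    """
    row_profile = binary_img.sum(axis=1).astype(float)   # ink per row
    col_profile = binary_img.sum(axis=0).astype(float)   # ink per column
    rows, _ = find_peaks(row_profile, distance=min_separation,
                         height=0.5 * row_profile.max())
    cols, _ = find_peaks(col_profile, distance=min_separation,
                         height=0.5 * col_profile.max())
    return rows, cols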
NASA Astrophysics Data System (ADS)
Balbin, Jessie R.; Pinugu, Jasmine Nadja J.; Basco, Abigail Joy S.; Cabanada, Myla B.; Gonzales, Patrisha Melrose V.; Marasigan, Juan Carlos C.
2017-06-01
The research aims to build a tool for assessing patients for post-traumatic stress disorder (PTSD). The parameters used are heart rate, skin conductivity, and facial gestures. Facial gestures are recorded using OpenFace, an open-source face recognition program that uses facial action units to track facial movements. Heart rate and skin conductivity are measured through sensors operated by a Raspberry Pi. Results are stored in a database for easy and quick access, and the database is uploaded to a cloud platform so that doctors have direct access to the data. This research aims to analyze these parameters and give an accurate assessment of the patient.
Wysham, Nicholas G; Mularski, Richard A; Schmidt, David M; Nord, Shirley C; Louis, Deborah L; Shuster, Elizabeth; Curtis, J Randall; Mosen, David M
2014-06-01
Communication in the intensive care unit (ICU) is an important component of quality ICU care. In this report, we evaluate the long-term effects of a quality improvement (QI) initiative, based on the VALUE communication strategy, designed to improve communication with family members of critically ill patients. We implemented a multifaceted intervention to improve communication in the ICU and measured processes of care. Quality improvement components included posted VALUE placards, templated progress note inclusive of communication documentation, and a daily rounding checklist prompt. We evaluated care for all patients cared for by the intensivists during three separate 3 week periods, pre, post, and 3 years following the initial intervention. Care delivery was assessed in 38 patients and their families in the pre-intervention sample, 27 in the post-intervention period, and 41 in follow-up. Process measures of communication showed improvement across the evaluation periods, for example, daily updates increased from pre 62% to post 76% to current 84% of opportunities. Our evaluation of this quality improvement project suggests persistence and continued improvements in the delivery of measured aspects of ICU family communication. Maintenance with point-of-care-tools may account for some of the persistence and continued improvements. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kadow, Christopher; Illing, Sebastian; Kunst, Oliver; Schartner, Thomas; Kirchner, Ingo; Rust, Henning W.; Cubasch, Ulrich; Ulbrich, Uwe
2016-04-01
The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science. Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environments. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the metadata of the self-describing model, reanalysis and observational data sets in a database. This metadata system, with its advanced but easy-to-handle search tool, supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools to the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitating the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. The integrated web shell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gateway to the research project's HPC system. Plugins are able to integrate their results, e.g. post-processed data, into the user's database. This allows, for example, post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database. Configurations and results of the tools can be shared among scientists via the shell or web system. Therefore, plugged-in tools benefit from transparency and reproducibility. Furthermore, if configurations match when starting an evaluation plugin, the system suggests reusing results already produced by other users, saving CPU hours, I/O, disk space and time. The efficient interaction between the different technologies improves the Earth system modeling science framed by Freva.
Applying knowledge translation tools to inform policy: the case of mental health in Lebanon.
Yehia, Farah; El Jardali, Fadi
2015-06-06
Many reform efforts in health systems fall short because the use of research evidence to inform policy remains scarce. In Lebanon, one in four adults suffers from a mental illness, yet access to mental healthcare services in primary healthcare (PHC) settings is limited. Using an "integrated" knowledge framework to link research to action, this study examines the process of influencing the mental health agenda in Lebanon through the application of Knowledge Translation (KT) tools and the use of a KT Platform (KTP) as an intermediary between researchers and policymakers. This study employed the following KT tools: 1) development of a policy brief to address the lack of access to mental health services in PHC centres, 2) semi-structured interviews with 10 policymakers and key informants, 3) convening of a national policy dialogue, 4) evaluation of the policy brief and dialogue, and 5) a post-dialogue survey. Findings from the key informant interviews and a comprehensive synthesis of evidence were used to develop a policy brief which defined the problem and presented three elements of a policy approach to address it. This policy brief was circulated to 24 participants prior to the dialogue to inform the discussion. The policy dialogue validated the evidence synthesized in the brief, with integrating mental health into PHC services being the element most supported by the evidence as well as by participants. The post-dialogue survey showed that, in the following 6 months, several implementation steps were taken by stakeholders, including establishing a national taskforce, training PHC staff, and updating the national essential drug list to include psychiatric medications. Relationships among policymakers, researchers, and stakeholders were strengthened as they conducted their own workshops and meetings after the dialogue to further discuss implementation, and their awareness about and demand for KT tools increased. This case study showed that the use of KT tools in Lebanon to help generate evidence-informed programs is promising. This experience provided insights into the most helpful features of the tools. The role of the KTP in engaging stakeholders, particularly policymakers, prior to the dialogue and linking them with researchers was vital in securing their support for the KT process and uptake of the research evidence.
Atema, Jasper J; Ram, Kim; Schultz, Marcus J; Boermeester, Marja A
Timely identification of patients in need of an intervention for abdominal sepsis after initial surgical management of secondary peritonitis is vital but complex. The aim of this study was to validate a decision tool for this purpose and to evaluate its potential to guide post-operative management. A prospective cohort study was conducted on consecutive adult patients undergoing surgery for secondary peritonitis in a single hospital. Assessments using the decision tool, based on one intra-operative and five post-operative variables, were performed on the second and third post-operative days and when the patients' clinical status deteriorated. Scores were compared with the clinical reference standard of persistent sepsis based on the clinical course or findings at imaging or surgery. Additionally, the potential of the decision tool to guide management in terms of diagnostic imaging in three previously defined score categories (low, intermediate, and high) was evaluated. A total of 161 assessments were performed in 69 patients. The majority of cases of secondary peritonitis (68%) were caused by perforation of the gastrointestinal tract. Post-operative persistent sepsis occurred in 28 patients. The discriminative capacity of the decision tool score was fair (area under the curve of the receiver operating characteristic = 0.79). The incidence rate differed significantly between the three score categories (p < 0.001). The negative predictive value of a decision tool score categorized as low probability was 89% (95% confidence interval [CI] 82-94) and 65% (95% CI 47-79) for an intermediate score. Diagnostic imaging was performed more frequently when there was an intermediate score than when the score was categorized as low (46% vs. 24%; p < 0.001). In patients operated on for secondary peritonitis, the decision tool score predicts with fair accuracy whether persistent sepsis is present.
Quality assessment of systematic reviews on alveolar socket preservation.
Moraschini, V; Barboza, E Dos S P
2016-09-01
The aim of this overview was to evaluate and compare the quality of systematic reviews, with or without meta-analysis, that have evaluated studies on techniques or biomaterials used for the preservation of alveolar sockets post tooth extraction in humans. An electronic search was conducted without date restrictions using the Medline/PubMed, Cochrane Library, and Web of Science databases up to April 2015. Eligibility criteria included systematic reviews, with or without meta-analysis, focused on the preservation of post-extraction alveolar sockets in humans. Two independent authors assessed the quality of the included reviews using AMSTAR and the checklist proposed by Glenny et al. in 2003. After the selection process, 12 systematic reviews were included. None of these reviews obtained the maximum score using the quality assessment tools implemented, and the results of the analyses were highly variable. A significant statistical correlation was observed between the scores of the two checklists. A wide structural and methodological variability was observed between the systematic reviews published on the preservation of alveolar sockets post tooth extraction. None of the reviews evaluated obtained the maximum score using the two quality assessment tools implemented. Copyright © 2016 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greiner, Miles
Radial hydride formation in high-burnup used fuel cladding has the potential to radically reduce its ductility and suitability for long-term storage and eventual transport. To avoid this formation, the maximum post-reactor temperature must remain sufficiently low to limit the cladding hoop stress, so that hydrogen from the existing circumferential hydrides will not dissolve and become available to re-precipitate into radial hydrides under the slow cooling conditions during drying, transfer and early dry-cask storage. The objective of this research is to develop and experimentally benchmark computational fluid dynamics simulations of heat transfer in post-pool-storage drying operations, when high-burnup fuel cladding is likely to experience its highest temperature. These benchmarked tools can play a key role in evaluating dry cask storage systems for extended storage of high-burnup fuels and post-storage transportation, including fuel retrievability. The benchmarked tools will be used to aid the design of efficient drying processes, as well as to estimate variations of surface temperatures as a means of inferring helium integrity inside the canister or cask. This work will be conducted effectively because the principal investigator has experience developing these types of simulations and has constructed a test facility that can be used to benchmark them.
Inselect: Automating the Digitization of Natural History Collections
Hudson, Lawrence N.; Blagoderov, Vladimir; Heaton, Alice; Holtzhausen, Pieter; Livermore, Laurence; Price, Benjamin W.; van der Walt, Stéfan; Smith, Vincent S.
2015-01-01
The world’s natural history collections constitute an enormous evidence base for scientific research on the natural world. To facilitate these studies and improve access to collections, many organisations are embarking on major programmes of digitization. This requires automated approaches to mass-digitization that support rapid imaging of specimens and associated data capture, in order to process the tens of millions of specimens common to most natural history collections. In this paper we present Inselect—a modular, easy-to-use, cross-platform suite of open-source software tools that supports the semi-automated processing of specimen images generated by natural history digitization programmes. The software is made up of a Windows, Mac OS X, and Linux desktop application, together with command-line tools that are designed for unattended operation on batches of images. Blending image visualisation algorithms that automatically recognise specimens together with workflows to support post-processing tasks such as barcode reading, label transcription and metadata capture, Inselect fills a critical gap to increase the rate of specimen digitization. PMID:26599208
Inselect: Automating the Digitization of Natural History Collections.
Hudson, Lawrence N; Blagoderov, Vladimir; Heaton, Alice; Holtzhausen, Pieter; Livermore, Laurence; Price, Benjamin W; van der Walt, Stéfan; Smith, Vincent S
2015-01-01
The world's natural history collections constitute an enormous evidence base for scientific research on the natural world. To facilitate these studies and improve access to collections, many organisations are embarking on major programmes of digitization. This requires automated approaches to mass-digitization that support rapid imaging of specimens and associated data capture, in order to process the tens of millions of specimens common to most natural history collections. In this paper we present Inselect-a modular, easy-to-use, cross-platform suite of open-source software tools that supports the semi-automated processing of specimen images generated by natural history digitization programmes. The software is made up of a Windows, Mac OS X, and Linux desktop application, together with command-line tools that are designed for unattended operation on batches of images. Blending image visualisation algorithms that automatically recognise specimens together with workflows to support post-processing tasks such as barcode reading, label transcription and metadata capture, Inselect fills a critical gap to increase the rate of specimen digitization.
NASA Astrophysics Data System (ADS)
Gnyawali, Surya C.; Blum, Kevin; Pal, Durba; Ghatak, Subhadip; Khanna, Savita; Roy, Sashwati; Sen, Chandan K.
2017-01-01
Cutaneous microvasculopathy complicates wound healing. Functional assessment of gated individual dermal microvessels is therefore of outstanding interest. Functional performance of laser speckle contrast imaging (LSCI) systems is compromised by motion artefacts. To address such weakness, post-processing of stacked images is reported. We report the first post-processing of binary raw data from a high-resolution LSCI camera. Sharp images of low-flowing microvessels were enabled by introducing inverse variance in conjunction with speckle contrast in Matlab-based program code. Extended moving window averaging enhanced signal-to-noise ratio. Functional quantitative study of blood flow kinetics was performed on single gated microvessels using a free hand tool. Based on detection of flow in low-flow microvessels, a new sharp contrast image was derived. Thus, this work presents the first distinct image with quantitative microperfusion data from gated human foot microvasculature. This versatile platform is applicable to study a wide range of tissue systems including fine vascular network in murine brain without craniotomy as well as that in the murine dorsal skin. Importantly, the algorithm reported herein is hardware agnostic and is capable of post-processing binary raw data from any camera source to improve the sensitivity of functional flow data above and beyond standard limits of the optical system.
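The speckle-contrast computation underlying this work is the local ratio K = σ/μ over a small sliding window, which the authors combine with inverse variance and extended moving-window averaging; the original implementation is MATLAB-based. The Python sketch below shows only the standard spatial contrast step, with an assumed window size.

import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(frame, window=7):
    """Spatial speckle contrast K = sigma/mu over a sliding window.

    Lower K indicates faster flow (more blurring of the speckle pattern).
    """
    frame = frame.astype(float)
    mean = uniform_filter(frame, size=window)
    mean_sq = uniform_filter(frame ** 2, size=window)
    var = np.clip(mean_sq - mean ** 2, 0, None)   # local variance, clipped at zero
    return np.sqrt(var) / (mean + 1e-9)

# Temporal smoothing over a stack of frames (a simple moving-window average):
# K_avg = np.mean([speckle_contrast(f) for f in frame_stack], axis=0)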
NASA Astrophysics Data System (ADS)
Amblard, Gilles; Purdy, Sara; Cooper, Ryan; Hockaday, Marjory
2016-03-01
The overall quality and processing capability of lithographic materials are critical for ensuring high device yield and performance at sub-20nm technology nodes in a high volume manufacturing environment. Insufficient process margin and high line width roughness (LWR) cause poor manufacturing control, while high defectivity causes product failures. In this paper, we focus on the most critical layer of a sub-20nm technology node LSI device, and present an improved method for characterizing both lithographic and post-patterning defectivity performance of state-of-the-art immersion photoresists. Multiple formulations from different suppliers were used and compared. Photoresists were tested under various process conditions, and multiple lithographic metrics were investigated (depth of focus, exposure dose latitude, line width roughness, etc.). Results were analyzed and combined using an innovative approach based on advanced software, providing clearer results than previously available. This increased detail enables more accurate performance comparisons among the different photoresists. Post-patterning defectivity was also quantified, with defects reviewed and classified using state-of-the-art inspection tools. Correlations were established between the lithographic and post-patterning defectivity performances for each material, and overall ranking was established among the photoresists, enabling the selection of the best performer for implementation in a high volume manufacturing environment.
Literature and art therapy in post-stroke psychological disorders.
Eum, Yeongcheol; Yim, Jongeun
2015-01-01
Stroke is one of the leading causes of morbidity and long-term disability worldwide, and post-stroke depression (PSD) is a common and serious psychiatric complication of stroke. PSD makes patients have more severe deficits in activities of daily living, a worse functional outcome, more severe cognitive deficits and increased mortality as compared to stroke patients without depression. Therefore, to reduce or prevent mental problems of stroke patients, psychological treatment should be recommended. Literature and art therapy are highly effective psychological treatment for stroke patients. Literature therapy divided into poetry and story therapy is an assistive tool that treats neurosis as well as emotional or behavioral disorders. Poetry can add impression to the lethargic life of a patient with PSD, thereby acting as a natural treatment. Story therapy can change the gloomy psychological state of patients into a bright and healthy story, and therefore can help stroke patients to overcome their emotional disabilities. Art therapy is one form of psychological therapy that can treat depression and anxiety in stroke patients. Stroke patients can express their internal conflicts, emotions, and psychological status through art works or processes and it would be a healing process of mental problems. Music therapy can relieve the suppressed emotions of patients and add vitality to the body, while giving them the energy to share their feelings with others. In conclusion, literature and art therapy can identify the emotional status of patients and serve as a useful auxiliary tool to help stroke patients in their rehabilitation process.
Free software helps map and display data
NASA Astrophysics Data System (ADS)
Wessel, Paul; Smith, Walter H. F.
When creating camera-ready figures, most scientists are familiar with the sequence of raw data → processing → final illustration and with the spending of large sums of money to finalize papers for submission to scientific journals, prepare proposals, and create overheads and slides for various presentations. This process can be tedious and is often done manually, since available commercial or in-house software usually can do only part of the job.To expedite this process, we introduce the Generic Mapping Tools (GMT), which is a free, public domain software package that can be used to manipulate columns of tabular data, time series, and gridded data sets and to display these data in a variety of forms ranging from simple x-y plots to maps and color, perspective, and shaded-relief illustrations. GMT uses the PostScript page description language, which can create arbitrarily complex images in gray tones or 24-bit true color by superimposing multiple plot files. Line drawings, bitmapped images, and text can be easily combined in one illustration. PostScript plot files are device-independent, meaning the same file can be printed at 300 dots per inch (dpi) on an ordinary laserwriter or at 2470 dpi on a phototypesetter when ultimate quality is needed. GMT software is written as a set of UNIX tools and is totally self contained and fully documented. The system is offered free of charge to federal agencies and nonprofit educational organizations worldwide and is distributed over the computer network Internet.
Chart-stimulated Recall as a Learning Tool for Improving Radiology Residents' Reports.
Nadeem, Naila; Zafar, Abdul Mueed; Haider, Sonia; Zuberi, Rukhsana W; Ahmad, Muhammad Nadeem; Ojili, Vijayanadh
2017-08-01
Workplace-based assessments gauge the highest tier of clinical competence. Chart-stimulated recall (CSR) is a workplace-based assessment method that complements chart audit with an interview based on the residents' notes. It allows evaluation of the residents' knowledge and heuristics while providing opportunities for feedback and self-reflection. We evaluated the utility of CSR for improving the radiology residents' reporting skills. Residents in each year of training were randomly assigned to an intervention group (n = 12) or a control group (n = 13). Five pre-intervention and five post-intervention reports of each resident were independently evaluated by three blinded reviewers using a modified Bristol Radiology Report Assessment Tool. The study intervention comprised a CSR interview tailored to each individual resident's learning needs based on the pre-intervention assessment. The CSR process focused on the clinical relevance of the radiology reports. Student's t test (P < .05) was used to compare pre- and post-intervention scores of each group. A total of 125 pre-intervention and 125 post-intervention reports were evaluated (total 750 assessments). The Cronbach's alpha for the study tool was 0.865. A significant improvement was seen in the cumulative 19-item score (66% versus 73%, P < .001) and the global rating score (59% versus 72%, P < .001) of the intervention group after the CSR. The reports of the control group did not demonstrate any significant improvement. CSR is a feasible workplace-based assessment method for improving reporting skills of the radiology residents. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Barlow, J. E.; Goodrich, D. C.; Guertin, D. P.; Burns, I. S.
2016-12-01
Wildfires in the Western United States can alter landscapes by removing vegetation and changing soil properties. These altered landscapes produce more runoff than pre-fire landscapes which can lead to post-fire flooding that can damage infrastructure and impair natural resources. Resources, structures, historical artifacts and others that could be impacted by increased runoff are considered values at risk. .The Automated Geospatial Watershed Assessment tool (AGWA) allows users to quickly set up and execute the Kinematic Runoff and Erosion model (KINEROS2 or K2) in the ESRI ArcMap environment. The AGWA-K2 workflow leverages the visualization capabilities of GIS to facilitate evaluation of rapid watershed assessments for post-fire planning efforts. High relative change in peak discharge, as simulated by K2, provides a visual and numeric indicator to investigate those channels in the watershed that should be evaluated for more detailed analysis, especially if values at risk are within or near that channel. Modeling inundation extent along a channel would provide more specific guidance about risk along a channel. HEC-2 and HEC-RAS can be used for hydraulic modeling efforts at the reach and river system scale. These models have been used to address flood boundaries and, accordingly, flood risk. However, data collection and organization for hydraulic models can be time consuming and therefore a combined hydrologic-hydraulic modeling approach is not often employed for rapid assessments. A simplified approach could streamline this process and provide managers with a simple workflow and tool to perform a quick risk assessment for a single reach. By focusing on a single reach highlighted by large relative change in peak discharge, data collection efforts can be minimized and the hydraulic computations can be performed to supplement risk analysis. The incorporation of hydraulic analysis through a suite of Python tools (as outlined by HEC-2) with AGWA-K2 will allow more rapid applications of combined hydrologic-hydraulic modeling. This combined modeling approach is built in the ESRI ArcGIS application to enable rapid model preparation, execution and result visualization for risk assessment in post-fire environments.
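The workflow above flags channel reaches for closer hydraulic analysis using the relative change in simulated peak discharge between pre- and post-fire runs. A minimal sketch of that screening arithmetic follows; the hydrograph values, reach names, and flagging threshold are hypothetical.

```python
import numpy as np

def relative_peak_change(pre_q, post_q):
    """(post-fire peak discharge - pre-fire peak discharge) / pre-fire peak discharge."""
    q_pre, q_post = float(np.max(pre_q)), float(np.max(post_q))
    return (q_post - q_pre) / q_pre

# hypothetical simulated hydrographs (m^3/s) for one reach, pre- and post-fire
reaches = {
    "reach_12": (np.array([0.2, 1.1, 3.4, 2.0, 0.5]),
                 np.array([0.4, 4.8, 9.6, 5.1, 1.2])),
}
FLAG_THRESHOLD = 1.0   # flag reaches whose peak more than doubles (assumed cut-off)
for name, (pre, post) in reaches.items():
    rc = relative_peak_change(pre, post)
    if rc > FLAG_THRESHOLD:
        print(f"{name}: peak discharge up {rc:.0%}, candidate for hydraulic analysis")
```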
Development of a prenatal psychosocial screening tool for post-partum depression and anxiety.
McDonald, Sheila; Wall, Jennifer; Forbes, Kaitlin; Kingston, Dawn; Kehler, Heather; Vekved, Monica; Tough, Suzanne
2012-07-01
Post-partum depression (PPD) is the most common complication of pregnancy in developed countries, affecting 10-15% of new mothers. There has been a shift in thinking less in terms of PPD per se to a broader consideration of poor mental health, including anxiety after giving birth. Some risk factors for poor mental health in the post-partum period can be identified prenatally; however prenatal screening tools developed to date have had poor sensitivity and specificity. The objective of this study was to develop a screening tool that identifies women at risk of distress, operationalized by elevated symptoms of depression and anxiety in the post-partum period using information collected in the prenatal period. Using data from the All Our Babies Study, a prospective cohort study of pregnant women living in Calgary, Alberta (N = 1578), we developed an integer score-based prediction rule for the prevalence of PPD, as defined as scoring 10 or higher on the Edinburgh Postnatal Depression Scale (EPDS) at 4-months postpartum. The best fit model included known risk factors for PPD: depression and stress in late pregnancy, history of abuse, and poor relationship quality with partner. Comparison of the screening tool with the EPDS in late pregnancy showed that our tool had significantly better performance for sensitivity. Further validation of our tool was seen in its utility for identifying elevated symptoms of postpartum anxiety. This research heeds the call for further development and validation work using psychosocial factors identified prenatally for identifying poor mental health in the post-partum period. © 2012 Blackwell Publishing Ltd.
Dynamic Modeling of Process Technologies for Closed-Loop Water Recovery Systems
NASA Technical Reports Server (NTRS)
Allada, Rama Kumar; Lange, Kevin E.; Anderson, Molly S.
2012-01-01
Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which become extremely useful in trade-study analyses. NASA's Exploration Life Support technology development project recently made use of such models to complement a series of tests on different waste water distillation systems. This paper presents dynamic simulations of chemical processes for primary processor technologies including the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system, the Wiped-Film Rotating Disk (WFRD), and post-distillation water polishing processes such as the Volatiles Removal Assembly (VRA). These dynamic models were developed using the Aspen Custom Modeler® and Aspen Plus® process simulation tools. The results expand upon previous work for water recovery technology models and emphasize dynamic process modeling and results. The paper discusses system design, modeling details, and model results for each technology and presents some comparisons between the model results and available test data. Following these initial comparisons, some general conclusions and forward work are discussed.
CFD Extraction Tool for TecPlot From DPLR Solutions
NASA Technical Reports Server (NTRS)
Norman, David
2013-01-01
This invention is a TecPlot macro, a computer program written in the TecPlot programming language, that processes data from DPLR solutions in TecPlot format. DPLR (Data-Parallel Line Relaxation) is a NASA computational fluid dynamics (CFD) code, and TecPlot is a commercial CFD post-processing tool. The TecPlot data is in SI units (same as DPLR output). The invention converts the SI units into British units. The macro modifies the TecPlot data with unit conversions, and adds some extra calculations. After unit conversions, the macro cuts a slice, and adds vectors on the current plot for output format. The macro can also process surface solutions. Existing solutions use manual conversion and superposition. The conversion is complicated because it must be applied to a range of inter-related scalars and vectors to describe a 2D or 3D flow field. The macro processes the CFD solution to create superpositions/comparisons of scalars and vectors. The existing manual solution is cumbersome, open to errors, slow, and cannot be inserted into an automated process. This invention is quick and easy to use, and can be inserted into an automated data-processing algorithm.
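The macro itself is written in the TecPlot macro language; the sketch below only illustrates, in Python, the kind of SI-to-British conversions such a post-processing step applies to inter-related CFD fields. The conversion factors are standard, but the field names are assumptions.

```python
# Illustrative SI -> British conversions for a few common CFD fields.
# Field names are hypothetical; the actual macro operates on TecPlot variables.
M_TO_FT = 3.28084                 # metres to feet
PA_TO_PSF = 1.0 / 47.880258       # pascals to pounds-force per square foot
KGM3_TO_SLUGFT3 = 0.00194032      # kg/m^3 to slug/ft^3
K_TO_R = 1.8                      # kelvin to degrees Rankine (absolute scales)

def convert_fields(fields):
    """Convert a dict of SI-valued scalars/arrays to British units (sketch)."""
    out = dict(fields)
    out["velocity"] = fields["velocity"] * M_TO_FT          # m/s -> ft/s
    out["pressure"] = fields["pressure"] * PA_TO_PSF        # Pa -> lbf/ft^2
    out["density"] = fields["density"] * KGM3_TO_SLUGFT3    # kg/m^3 -> slug/ft^3
    out["temperature"] = fields["temperature"] * K_TO_R     # K -> deg R
    return out

print(convert_fields({"velocity": 340.0, "pressure": 101325.0,
                      "density": 1.225, "temperature": 288.15}))
```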
Ground Contact Model for Mars Science Laboratory Mission Simulations
NASA Technical Reports Server (NTRS)
Raiszadeh, Behzad; Way, David
2012-01-01
The Program to Optimize Simulated Trajectories II (POST 2) has been successful in simulating the flight of launch vehicles and entry bodies on Earth and other planets. POST 2 has been the primary simulation tool for the Entry, Descent, and Landing (EDL) phase of numerous Mars lander missions, such as Mars Pathfinder in 1997, the twin Mars Exploration Rovers (MER-A and MER-B) in 2004, and the Mars Phoenix lander in 2007, and it is now the main trajectory simulation tool for the Mars Science Laboratory (MSL) in 2012. In all previous missions, the POST 2 simulation ended before ground impact, and a tool other than POST 2 simulated landing dynamics. It would be ideal for one tool to simulate the entire EDL sequence, thus avoiding errors that could be introduced by handing off position, velocity, or other flight parameters from one simulation to the other. The desire to have one continuous end-to-end simulation was the motivation for developing the ground interaction model in POST 2. Rover landing, including the detection of the post-landing state, is a very critical part of the MSL mission, as the EDL landing sequence continues for a few seconds after landing. The method explained in this paper illustrates how a simple ground force interaction model has been added to POST 2, which allows simulation of the entire EDL from atmospheric entry through touchdown.
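The abstract does not spell out the form of the ground force interaction model, so the sketch below shows a generic penalty-type spring-damper ground reaction of the kind commonly added to trajectory simulations; the stiffness and damping values are placeholders, not POST 2 parameters.

```python
def ground_reaction(altitude, vertical_velocity, k=2.0e5, c=4.0e3):
    """Penalty-type ground contact force (a sketch, not the actual POST 2 model).

    altitude: contact-point height above the surface in metres (negative = penetration)
    vertical_velocity: positive-up velocity in m/s
    Returns an upward force in newtons; zero when the vehicle is not in contact."""
    if altitude >= 0.0:
        return 0.0
    penetration = -altitude
    force = k * penetration - c * vertical_velocity   # spring plus damper
    return max(force, 0.0)                            # ground can push but never pull

# Touchdown can then be declared once the contact force stays nonzero and the
# velocity magnitude remains below a small threshold for a set time.
print(ground_reaction(-0.02, -0.5))   # 2 cm penetration while still descending
```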
Recent progress in the synthesis of metal–organic frameworks
Sun, Yujia; Zhou, Hong -Cai
2015-09-25
Metal–organic frameworks (MOFs) have attracted considerable attention for various applications due to their tunable structure, porosity and functionality. In general, MOFs have been synthesized from isolated metal ions and organic linkers under hydrothermal or solvothermal conditions via one-pot reactions. The emerging precursor approach and kinetically tuned dimensional augmentation strategy add more diversity to this field. In addition, to speed up the crystallization process and create uniform crystals with reduced size, many alternative synthesis routes have been explored. Recent advances in microwave-assisted synthesis and electrochemical synthesis are presented in this review. In recent years, post-synthetic approaches have been shown to be powerful tools to synthesize MOFs with modified functionality, which cannot be attained via de novo synthesis. In this study, some current accomplishments of post-synthetic modification (PSM) based on covalent transformations and coordinative interactions as well as post-synthetic exchange (PSE) in robust MOFs are provided.
Exhaled breath condensate – from an analytical point of view
Dodig, Slavica; Čepelak, Ivana
2013-01-01
Over the past three decades, the goal of many researchers has been the analysis of exhaled breath condensate (EBC) as a noninvasively obtained sample. Total quality in the laboratory diagnostic process for EBC analysis was investigated across the pre-analytical (formation, collection, storage of EBC), analytical (sensitivity of applied methods, standardization) and post-analytical (interpretation of results) phases. EBC analysis is still used as a research tool. Limitations in the pre-analytical, analytical, and post-analytical phases of EBC analysis are numerous, e.g. low concentrations of EBC constituents, single-analyte methods lack sensitivity, multi-analyte methods have not been fully explored, and reference values are not established. When all pre-analytical, analytical and post-analytical requirements are met, EBC biomarkers as well as biomarker patterns can be selected and EBC analysis can hopefully be used in clinical practice, both in diagnosis and in the longitudinal follow-up of patients, resulting in a better outcome of disease. PMID:24266297
Modeling the fusion of cylindrical bioink particles in post bioprinting structure formation
NASA Astrophysics Data System (ADS)
McCune, Matt; Shafiee, Ashkan; Forgacs, Gabor; Kosztin, Ioan
2015-03-01
Cellular Particle Dynamics (CPD) is an effective computational method to describe the shape evolution and biomechanical relaxation processes in multicellular systems. Thus, CPD is a useful tool to predict the outcome of post-printing structure formation in bioprinting. The predictive power of CPD has been demonstrated for multicellular systems composed of spherical bioink units. Experiments and computer simulations were related through an independently developed theoretical formalism based on continuum mechanics. Here we generalize the CPD formalism to (i) include cylindrical bioink particles often used in specific bioprinting applications, (ii) describe the more realistic experimental situation in which both the length and the volume of the cylindrical bioink units decrease during post-printing structure formation, and (iii) directly connect CPD simulations to the corresponding experiments without the need of the intermediate continuum theory inherently based on simplifying assumptions. Work supported by NSF [PHY-0957914]. Computer time provided by the University of Missouri Bioinformatics Consortium.
Hoon, Lim Siew; Hong-Gu, He; Mackey, Sandra
Paediatric pain management remains a challenge in clinical settings. Parents can contribute to the effective and accurate pain assessment and management of their child. No systematic reviews regarding parental involvement in their child's post-operative pain management have been published. To determine the best available evidence regarding parental involvement in managing their children's post-operative pain in the hospital setting. The review considered studies that included parents of all ethnic groups with children aged between 6 and 12 years who were hospitalised and had undergone surgery of any kind with post-operative surgical or incision site pain where care was provided in acute hospital settings. The phenomena of interest were the experiences of parents in managing their children's post-operative pain. A three-step search strategy was utilised in each component of this review. Major databases searched included MEDLINE, CINAHL, Scopus, ScienceDirect, the Cochrane Library, PubMed and Google Scholar. The search included published studies and papers in English from 1990 to 2009. Each included study was assessed by two independent reviewers using the appropriate appraisal checklists developed by the Joanna Briggs Institute (JBI). Quantitative and qualitative data were extracted from the included papers using standardised data extraction tools from the JBI, the Meta-analysis Statistics Assessment and Review Instrument data extraction tool for descriptive/case series and the JBI-Qualitative Assessment and Review Instrument data extraction tool for interpretive and critical research. The five quantitative studies included in this review were not suitable for meta-analysis due to clinical and methodological heterogeneity, and therefore the findings are presented in a narrative form. The two qualitative papers were drawn from the same study; therefore, meta-synthesis was not possible and the results of these studies were also presented in a narrative format. Seven papers were included in this review. The evidence identified topics including: pharmacological and non-pharmacological interventions carried out by parents; the experience of concern, fear, helplessness, anxiety, depression, frustration and lack of support felt by parents during their child's hospitalisation; communication issues and knowledge deficits; and the need for information by parents to promote effective participation in managing their child's post-operative pain. This review revealed pharmacological and non-pharmacological interventions carried out by parents to alleviate their children's post-operative pain. Obstacles and promoting factors influencing parents' experiences, as well as their needs in the process of caring, were identified. Parents' roles in their child's surgical pain management should be clarified and their efforts acknowledged, which will encourage parents' active participation in their child's caring process. Nurses should provide guidance, education and support to parents. More studies are needed to examine parents' experiences in caring for their child, investigate the effectiveness of education and guidance provided to parents by the nurses and explore the influence of parents' cultural values and nurses' perceptions of parental participation in their child's care.
2017-01-01
Background Data concerning patients originates from a variety of sources on social media. Objective The aim of this study was to show how methodologies borrowed from different areas including computer science, econometrics, statistics, data mining, and sociology may be used to analyze Facebook data to investigate the patients’ perspectives on a given medical prescription. Methods To shed light on patients’ behavior and concerns, we focused on Crohn’s disease, a chronic inflammatory bowel disease, and the specific therapy with the biological drug Infliximab. To gain information from the basin of big data, we analyzed Facebook posts in the time frame from October 2011 to August 2015. We selected posts from patients affected by Crohn’s disease who were experiencing or had previously been treated with the monoclonal antibody drug Infliximab. The selected posts underwent further characterization and sentiment analysis. Finally, an ethnographic review was carried out by experts from different scientific research fields (eg, computer science vs gastroenterology) and by a software system running a sentiment analysis tool. The patient feeling toward the Infliximab treatment was classified as positive, neutral, or negative, and the results from computer science, gastroenterologist, and software tool were compared using the square weighted Cohen’s kappa coefficient method. Results The first automatic selection process returned 56,000 Facebook posts, 261 of which exhibited a patient opinion concerning Infliximab. The ethnographic analysis of these 261 selected posts gave similar results, with an interrater agreement between the computer science and gastroenterology experts amounting to 87.3% (228/261), a substantial agreement according to the square weighted Cohen’s kappa coefficient method (w2K=0.6470). A positive, neutral, and negative feeling was attributed to 36%, 27%, and 37% of posts by the computer science expert and 38%, 30%, and 32% by the gastroenterologist, respectively. Only a slight agreement was found between the experts’ opinion and the software tool. Conclusions We show how data posted on Facebook by Crohn’s disease patients are a useful dataset to understand the patient’s perspective on the specific treatment with Infliximab. The genuine, nonmedically influenced patients’ opinion obtained from Facebook pages can be easily reviewed by experts from different research backgrounds, with a substantial agreement on the classification of patients’ sentiment. The described method allows a fast collection of big amounts of data, which can be easily analyzed to gain insight into the patients’ perspective on a specific medical therapy. PMID:28793981
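The "square weighted Cohen's kappa" used to compare raters corresponds to quadratic weighting of disagreements, which scikit-learn exposes directly. A small sketch with hypothetical sentiment labels:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical sentiment labels assigned to the same posts by the two raters
cs_expert = ["pos", "neg", "neu", "pos", "neg", "neu", "pos", "neg"]
gastroenterologist = ["pos", "neg", "neu", "pos", "neu", "neu", "pos", "neg"]

# "Square weighted" kappa corresponds to quadratic weighting of disagreements
kappa = cohen_kappa_score(cs_expert, gastroenterologist, weights="quadratic")
print(f"quadratic-weighted Cohen's kappa: {kappa:.4f}")
```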
Remote sensing education and Internet/World Wide Web technology
Griffith, J.A.; Egbert, S.L.
2001-01-01
Remote sensing education is increasingly in demand across academic and professional disciplines. Meanwhile, Internet technology and the World Wide Web (WWW) are being more frequently employed as teaching tools in remote sensing and other disciplines. The current wealth of information on the Internet and World Wide Web must be distilled, nonetheless, to be useful in remote sensing education. An extensive literature base is developing on the WWW as a tool in education and in teaching remote sensing. This literature reveals benefits and limitations of the WWW, and can guide its implementation. Among the most beneficial aspects of the Web are increased access to remote sensing expertise regardless of geographic location, increased access to current material, and access to extensive archives of satellite imagery and aerial photography. As with other teaching innovations, using the WWW/Internet may well mean more work, not less, for teachers, at least at the stage of early adoption. Also, information posted on Web sites is not always accurate. Development stages of this technology range from on-line posting of syllabi and lecture notes to on-line laboratory exercises and animated landscape flyovers and on-line image processing. The advantages of WWW/Internet technology may likely outweigh the costs of implementing it as a teaching tool.
A Unique Opportunity to Test Whether Cell Fusion is a Mechanism of Breast Cancer Metastasis
2013-07-01
populations. Last cycle we optimized electroporation conditions for T47D and human mesenchymal stem cell populations and this cycle we have improved our...specific receptor-ligand interactions necessary for cell fusion, to produce a target for drug therapy. Post-fusion events might also be investigated...new tools for the study of the complex processes of cell fusion. The inducible bipartite nature of these strategies assures the accurate
Comparison of Regional Vulnerability Factors for Department of Defense (DOD) Installations
2006-09-01
research efforts are all designed to provide tools, data, expertise, and processes that help the DoD sustain and evolve mission operations, both...target audience for the indicators and the regional resource assessment are decisionmakers and planners who need broadly based information to inform...to mitigate severe on-post issues while the longer-term efforts are being negotiated. It should not be assumed that pursuing this option will de
Multicutter machining of compound parametric surfaces
NASA Astrophysics Data System (ADS)
Hatna, Abdelmadjid; Grieve, R. J.; Broomhead, P.
2000-10-01
Parametric free forms are used in industries as disparate as footwear, toys, sporting goods, ceramics, digital content creation, and conceptual design. Optimizing tool path patterns and minimizing the total machining time is a primordial issue in numerically controlled (NC) machining of free form surfaces. We demonstrate in the present work that multi-cutter machining can achieve as much as 60% reduction in total machining time for compound sculptured surfaces. The given approach is based upon the pre-processing as opposed to the usual post-processing of surfaces for the detection and removal of interference followed by precise tracking of unmachined areas.
NURBS-Based Geometry for Integrated Structural Analysis
NASA Technical Reports Server (NTRS)
Oliver, James H.
1997-01-01
This grant was initiated in April 1993 and completed in September 1996. The primary goal of the project was to exploit the emerging de facto CAD standard of Non-Uniform Rational B-spline (NURBS)-based curve and surface geometry to integrate and streamline the process of turbomachinery structural analysis. We focused our efforts on critical geometric modeling challenges typically posed by the requirements of structural analysts. We developed a suite of software tools that facilitate pre- and post-processing of NURBS-based turbomachinery blade models for finite element structural analyses. We also developed tools to facilitate the modeling of blades in their manufactured (or cold) state based on nominal operating shape and conditions. All of the software developed in the course of this research is written in the C++ language using the Iris Inventor 3D graphical interface toolkit from Silicon Graphics. In addition to enhanced modularity, improved maintainability, and efficient prototype development, this design facilitates the re-use of code developed for other NASA projects and provides a uniform and professional 'look and feel' for all applications developed by the Iowa State Team.
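For context, a NURBS curve point is the weighted rational combination C(u) = sum(N_i,p(u) w_i P_i) / sum(N_i,p(u) w_i) of control points P_i. The sketch below evaluates this with the Cox-de Boor recursion; it is a generic textbook evaluator for illustration, not the project's C++/Inventor code.

```python
import numpy as np

def bspline_basis(i, p, u, knots):
    """Cox-de Boor recursion for the B-spline basis function N_{i,p}(u)."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = right = 0.0
    if knots[i + p] != knots[i]:
        left = (u - knots[i]) / (knots[i + p] - knots[i]) \
               * bspline_basis(i, p - 1, u, knots)
    if knots[i + p + 1] != knots[i + 1]:
        right = (knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1]) \
                * bspline_basis(i + 1, p - 1, u, knots)
    return left + right

def nurbs_point(u, ctrl_pts, weights, knots, degree):
    """C(u) = sum_i N_{i,p}(u) w_i P_i / sum_i N_{i,p}(u) w_i."""
    ctrl_pts = np.asarray(ctrl_pts, dtype=float)
    num = np.zeros(ctrl_pts.shape[1])
    den = 0.0
    for i, (pt, w) in enumerate(zip(ctrl_pts, weights)):
        n = bspline_basis(i, degree, u, knots) * w
        num += n * pt
        den += n
    return num / den

# Quarter circle as a degree-2 NURBS: a standard worked example
ctrl = [[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]]
w = [1.0, np.sqrt(2.0) / 2.0, 1.0]
knots = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]
print(nurbs_point(0.5, ctrl, w, knots, degree=2))   # (0.7071, 0.7071), on the unit circle
```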
Sequence Segmentation with changeptGUI.
Tasker, Edward; Keith, Jonathan M
2017-01-01
Many biological sequences have a segmental structure that can provide valuable clues to their content, structure, and function. The program changept is a tool for investigating the segmental structure of a sequence, and can also be applied to multiple sequences in parallel to identify a common segmental structure, thus providing a method for integrating multiple data types to identify functional elements in genomes. In the previous edition of this book, a command line interface for changept is described. Here we present a graphical user interface for this package, called changeptGUI. This interface also includes tools for pre- and post-processing of data and results to facilitate investigation of the number and characteristics of segment classes.
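changept itself uses a more sophisticated Bayesian approach; purely to illustrate what recovering a "segmental structure" means, the sketch below performs naive binary segmentation on a numeric per-position score, splitting wherever a split sufficiently reduces the within-segment sum of squares. The threshold and data are made up.

```python
import numpy as np

def best_split(x):
    """Index that most reduces the within-segment sum of squared deviations."""
    total = ((x - x.mean()) ** 2).sum()
    best_i, best_gain = None, 0.0
    for i in range(2, len(x) - 2):
        left, right = x[:i], x[i:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if total - cost > best_gain:
            best_i, best_gain = i, total - cost
    return best_i, best_gain

def binary_segmentation(x, min_gain=25.0, offset=0, breaks=None):
    """Recursively split a numeric sequence (e.g. per-base scores) into segments."""
    if breaks is None:
        breaks = []
    i, gain = best_split(x)
    if i is not None and gain > min_gain:
        breaks.append(offset + i)
        binary_segmentation(x[:i], min_gain, offset, breaks)
        binary_segmentation(x[i:], min_gain, offset + i, breaks)
    return sorted(breaks)

rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(0, 1, 200), rng.normal(3, 1, 200)])
print(binary_segmentation(scores))   # expect a single breakpoint near position 200
```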
Tips and tricks for preparing lampbrush chromosome spreads from Xenopus tropicalis oocytes.
Penrad-Mobayed, May; Kanhoush, Rasha; Perrin, Caroline
2010-05-01
Due to their large size and fine organization, lampbrush chromosomes (LBCs) of amphibian oocytes have been for decades one of the favorite tools of biologists for the analysis of transcriptional and post-transcriptional processes at the cytological level. The emergence of the diploid amphibian Xenopus tropicalis as a model organism for vertebrate developmental genetics, together with the accumulation of sequence data made available by its recent genomic sequencing, strongly revives interest in LBCs as a powerful tool to study genes expressed during oogenesis. We describe here a detailed protocol for preparing LBCs from X. tropicalis oocytes and give practical advice to encourage a large number of researchers to become familiar with these chromosomes.
Emergent FDA biodefense issues for microarray technology: process analytical technology.
Weinberg, Sandy
2004-11-01
A successful biodefense strategy relies upon any combination of four approaches. A nation can protect its troops and citizenry first, by advanced mass vaccination; second, by responsive ring vaccination; and third, by post-exposure therapeutic treatment (including vaccine therapies). Finally, protection can be achieved by rapid detection followed by exposure limitation (suits and air filters) or immediate treatment (e.g., antibiotics, rapid vaccines and iodine pills). All of these strategies rely upon or are enhanced by microarray technologies. Microarrays can be used to screen, engineer and test vaccines. They are also used to construct early detection tools. While effective biodefense utilizes a variety of tactical tools, microarray technology is a valuable arrow in that quiver.
2012-01-01
Background On-line provision of information during the transition phase after treatment carries great promise in meeting shortcomings in post-treatment care for breast cancer survivors and their partners. The objectives of this study are to describe the development and process evaluation of a tailored informative website and to assess which characteristics of survivors and partners, participating in the feasibility study, are related to visiting the website. Methods The development process included quantitative and qualitative assessments of survivors’ and partners’ care needs and preferences. Participants’ use and evaluation of the website were explored by conducting baseline and post-measurements. During the intervening 10–12 weeks 57 survivors and 28 partners were granted access to the website. Results Fifty-seven percent (n=21) of survivors who took part in the post-measurement indicated that they had visited the website. Compared to non-visitors (n=16), they were more likely to have a partner and a higher income, reported higher levels of self-esteem and had completed treatment for a longer period of time. Partners who consulted the on-line information (42%, n=8) were younger and reported lower levels of social support compared to partners who did not visit the website (n=11). Visitors generally evaluated the content and lay-out positively, yet some believed the information was incomplete and impersonal. Conclusions The website reached only about half of survivors and partners, yet was mostly well-received. Besides other ways of providing information and support, a website containing clear-cut and tailored information could be a useful tool in post-treatment care provision. PMID:23034161
Mavaddat, Nahal; Ross, Sheila; Dobbin, Alastair; Williams, Kate; Graffy, Jonathan; Mant, Jonathan
2017-01-01
Post-stroke psychological problems predict poor recovery, while positive affect enables patients to focus on rehabilitation and may improve functional outcomes. Positive Mental Training (PosMT), a guided self-help audio programme, shows promise as a tool in promoting positivity, optimism and resilience. The aim was to assess the acceptability of training in positivity with PosMT for the prevention and management of post-stroke psychological problems and to help with coping with rehabilitation. A modified PosMT tool consisted of 12 audio tracks, each lasting 18 minutes, with one track listened to every day for a week. Survivors and carers were asked to listen for 4 weeks, but could volunteer to listen for more. Interviews took place about experiences of the tool after 4 and 12 weeks. Participants were 10 stroke survivors and 5 carers from Stroke Support Groups in the UK. Three stroke survivors did not engage with the tool. The remainder reported positive physical and psychological benefits including improved relaxation, better sleep and reduced anxiety after four weeks. Survivors who completed the programme gained a positive outlook on the future, increased motivation, confidence and ability to cope with rehabilitation. No adverse effects were reported. The PosMT shows potential as a tool for coping with rehabilitation and overcoming post-stroke psychological problems including anxiety and depression.
Lin, Steve; Morrison, Laurie J; Brooks, Steven C
2011-04-01
The widely accepted Utstein style has standardized data collection and analysis in resuscitation and post resuscitation research. However, collection of many of these variables poses significant practical challenges. In addition, several important variables in post resuscitation research are missing. Our aim was to develop a comprehensive data dictionary and web-based data collection tool as part of the Strategies for Post Arrest Resuscitation Care (SPARC) Network project, which implemented a knowledge translation program for post cardiac arrest therapeutic hypothermia in 37 Ontario hospitals. A list of data variables was generated based on the current Utstein style, previous studies and expert opinion within our group of investigators. We developed a data dictionary by creating clear definitions and establishing abstraction instructions for each variable. The data dictionary was integrated into a web-based collection form allowing for interactive data entry. Two blinded investigators piloted the data collection tool, by performing a retrospective chart review. A total of 454 variables were included of which 400 were Utstein, 2 were adapted from existing studies and 52 were added to address missing elements. Kappa statistics for two outcome variables, survival to discharge and induction of therapeutic hypothermia were 0.86 and 0.64, respectively. This is the first attempt in the literature to develop a data dictionary as part of a standardized, pragmatic data collection tool for post cardiac arrest research patients. In addition, our dataset defined important variables that were previously missing. This data collection tool can serve as a reference for future trials in post cardiac arrest care. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
ERIC Educational Resources Information Center
Adedokun, Omolola A.
2018-01-01
This article provides an illustrative description of the pre-post difference index (PPDI), a simple, nontechnical yet robust tool for examining the instructional sensitivity of assessment items. Extension educators often design pretest-posttest instruments to assess the impact of their curricula on participants' knowledge and understanding of the…
Beyond repair - family and community reintegration after obstetric fistula surgery: study protocol.
Byamugisha, Josaphat; El Ayadi, Alison; Obore, Susan; Mwanje, Haruna; Kakaire, Othman; Barageine, Justus; Lester, Felicia; Butrick, Elizabeth; Korn, Abner; Nalubwama, Hadija; Knight, Sharon; Miller, Suellen
2015-12-18
Obstetric fistula is a debilitating birth injury that affects an estimated 2-3 million women globally, most in sub-Saharan Africa and Asia. The urinary and/or fecal incontinence associated with fistula affects women physically, psychologically and socioeconomically. Surgical management of fistula is available with clinical success rates ranging from 65-95 %. Previous research on fistula repair outcomes has focused primarily on clinical outcomes without considering the broader goal of successful reintegration into family and community. The objectives for this study are to understand the process of family and community reintegration post fistula surgery and develop a measurement tool to assess long-term success of post-surgical family and community reintegration. This study is an exploratory sequential mixed-methods design including a preliminary qualitative component comprising in-depth interviews and focus group discussions to explore reintegration to family and community after fistula surgery. These results will be used to develop a reintegration tool, and the tool will be validated within a small longitudinal cohort (n = 60) that will follow women for 12 months after obstetric fistula surgery. Medical record abstraction will be conducted for patients managed within the fistula unit. Ethical approval for the study has been granted. This study will provide information regarding the success of family and community reintegration among women returning home after obstetric fistula surgery. The clinical and research community can utilize the standardized measurement tool in future studies of this patient population.
Mizukami, Naoki; Clark, Martyn P.; Sampson, Kevin; Nijssen, Bart; Mao, Yixin; McMillan, Hilary; Viger, Roland; Markstrom, Steven; Hay, Lauren E.; Woods, Ross; Arnold, Jeffrey R.; Brekke, Levi D.
2016-01-01
This paper describes the first version of a stand-alone runoff routing tool, mizuRoute. The mizuRoute tool post-processes runoff outputs from any distributed hydrologic model or land surface model to produce spatially distributed streamflow at various spatial scales from headwater basins to continental-wide river systems. The tool can utilize both traditional grid-based river network and vector-based river network data. Both types of river network include river segment lines and the associated drainage basin polygons, but the vector-based river network can represent finer-scale river lines than the grid-based network. Streamflow estimates at any desired location in the river network can be easily extracted from the output of mizuRoute. The routing process is simulated as two separate steps. First, hillslope routing is performed with a gamma-distribution-based unit-hydrograph to transport runoff from a hillslope to a catchment outlet. The second step is river channel routing, which is performed with one of two routing scheme options: (1) a kinematic wave tracking (KWT) routing procedure; and (2) an impulse response function – unit-hydrograph (IRF-UH) routing procedure. The mizuRoute tool also includes scripts (python, NetCDF operators) to pre-process spatial river network data. This paper demonstrates mizuRoute's capabilities to produce spatially distributed streamflow simulations based on river networks from the United States Geological Survey (USGS) Geospatial Fabric (GF) data set in which over 54 000 river segments and their contributing areas are mapped across the contiguous United States (CONUS). A brief analysis of model parameter sensitivity is also provided. The mizuRoute tool can assist model-based water resources assessments including studies of the impacts of climate change on streamflow.
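The hillslope routing step described here convolves local runoff with a gamma-distribution unit hydrograph. A small numpy sketch of that convolution follows; the shape and scale values are arbitrary placeholders rather than mizuRoute defaults.

```python
import numpy as np
from math import gamma as gamma_fn

def gamma_unit_hydrograph(shape, scale, n_steps, dt=1.0):
    """Discretised gamma-distribution unit hydrograph, normalised to sum to 1."""
    t = (np.arange(n_steps) + 0.5) * dt
    pdf = t ** (shape - 1.0) * np.exp(-t / scale) / (gamma_fn(shape) * scale ** shape)
    return pdf / pdf.sum()

def hillslope_route(runoff, shape=2.5, scale=1.0):
    """Convolve hillslope runoff with the unit hydrograph to get outlet inflow."""
    uh = gamma_unit_hydrograph(shape, scale, n_steps=len(runoff))
    return np.convolve(runoff, uh)[: len(runoff)]

runoff = np.array([0.0, 5.0, 12.0, 4.0, 1.0, 0.0, 0.0, 0.0])  # hypothetical mm per step
print(hillslope_route(runoff))   # a delayed, attenuated response at the catchment outlet
```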
Fabrication of a micromold using negative PMER
NASA Astrophysics Data System (ADS)
Kwon, Young A.; Chae, Kyoung-Soo; Jeoung, Dae S.; Kim, Jong Y.; Moon, Sung
2001-10-01
We fabricated a micromold using a UV-lithography process with a novel mold material, negative PMER. Negative PMER (TOK, PMER N-CA3000) is a chemically amplified negative tone photoresist on a novolak resin base. It can be processed using standard equipment such as a standard spin coater, baking with ovens or hotplates, and immersion development tools. Good-quality resist patterns of up to 36 μm thickness were achieved by means of this equipment in a short time. The conditions of this process were a pre-exposure bake of 110 °C/12 min, an exposure dose of 675 mJ/cm², a post-exposure bake of 100 °C/9 min, and development for 10 min.
Morawski, Markus; Kirilina, Evgeniya; Scherf, Nico; Jäger, Carsten; Reimann, Katja; Trampel, Robert; Gavriilidis, Filippos; Geyer, Stefan; Biedermann, Bernd; Arendt, Thomas; Weiskopf, Nikolaus
2017-11-28
Recent breakthroughs in magnetic resonance imaging (MRI) enabled quantitative relaxometry and diffusion-weighted imaging with sub-millimeter resolution. Combined with biophysical models of MR contrast the emerging methods promise in vivo mapping of cyto- and myelo-architectonics, i.e., in vivo histology using MRI (hMRI) in humans. The hMRI methods require histological reference data for model building and validation. This is currently provided by MRI on post mortem human brain tissue in combination with classical histology on sections. However, this well established approach is limited to qualitative 2D information, while a systematic validation of hMRI requires quantitative 3D information on macroscopic voxels. We present a promising histological method based on optical 3D imaging combined with a tissue clearing method, Clear Lipid-exchanged Acrylamide-hybridized Rigid Imaging compatible Tissue hYdrogel (CLARITY), adapted for hMRI validation. Adapting CLARITY to the needs of hMRI is challenging due to poor antibody penetration into large sample volumes and high opacity of aged post mortem human brain tissue. In a pilot experiment we achieved transparency of up to 8 mm-thick and immunohistochemical staining of up to 5 mm-thick post mortem brain tissue by a combination of active and passive clearing, prolonged clearing and staining times. We combined 3D optical imaging of the cleared samples with tailored image processing methods. We demonstrated the feasibility for quantification of neuron density, fiber orientation distribution and cell type classification within a volume with size similar to a typical MRI voxel. The presented combination of MRI, 3D optical microscopy and image processing is a promising tool for validation of MRI-based microstructure estimates. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Cooper, P L; Raja, R; Golder, J; Stewart, A J; Shaikh, R F; Apostolides, M; Savva, J; Sequeira, J L; Silvers, M A
2016-12-01
A standardised nutrition risk screening (NRS) programme with ongoing education is recommended for the successful implementation of NRS. This project aimed to develop and implement a standardised NRS and education process across the adult bed-based services of a large metropolitan health service and to achieve a 75% NRS compliance at 12 months post-implementation. A working party of Monash Health (MH) dietitians and a nutrition technician revised an existing NRS medical record form consisting of the Malnutrition Universal Screening Tool and nutrition management guidelines. Nursing staff across six MH hospital sites were educated in the use of this revised form and there was a formalised implementation process. Support from Executive Management, nurse educators and the Nutrition Risk Committee ensured the incorporation of NRS into nursing practice. Compliance audits were conducted pre- and post-implementation. At 12 months post-implementation, organisation-wide NRS compliance reached 34.3%. For those wards that had pre-implementation NRS performed by nursing staff, compliance increased from 7.1% to 37.9% at 12 months (P < 0.001). The improved NRS form is now incorporated into standard nursing practice and NRS is embedded in the organisation's 'Point of Care Audit', which is reported 6-monthly to the Nutrition Risk Committee and site Quality and Safety Committees. NRS compliance improved at MH with strong governance support and formalised implementation; however, the overall compliance achieved appears to have been affected by the complexity and diversity of multiple healthcare sites. Ongoing education, regular auditing and establishment of NRS routines and ward practices is recommended to further improve compliance. © 2016 The British Dietetic Association Ltd.
Begasse de Dhaem, Olivia; Barr, William B; Balcer, Laura J; Galetta, Steven L; Minen, Mia T
2017-12-01
Given that post-traumatic headache is one of the most prevalent and long-lasting post-concussion sequelae, causes significant morbidity, and might be associated with slower neurocognitive recovery, we sought to evaluate the use of concussion screening scores in a concussion clinic population to assess for post-traumatic headache. This is a retrospective cross-sectional study of 254 concussion patients from the New York University (NYU) Concussion Registry. Data on headache characteristics, concussion mechanism, and concussion screening scores were collected and analyzed. 72% of the patients had post-traumatic headache. About half (56.3%) were women. The mean age was 35 (SD 16.2). 90 (35%) patients suffered from sport-related concussions (SRC). Daily post-traumatic headache patients had higher Sport Concussion Assessment Tool (SCAT)-3 symptom severity scores than the non-daily post-traumatic headache and the headache-free patients (50.2 [SD 28.2] vs. 33.1 [SD 27.5] vs. 21.6 [SD 23], p < 0.001). Patients with SRC had lower headache intensity (4.47 [SD 2.5] vs. 6.24 [SD 2.28], p < 0.001) and SCAT symptom severity scores (33.9 [SD 27.4] vs. 51.4 [SD 27.7], p < 0.001) than the other patients, but there were no differences in post-traumatic headache prevalence, frequency, and Standardized Assessment of Concussion (SAC) scores. The presence and frequency of post-traumatic headache are associated with the SCAT-3 symptom severity score, which is the most important predictor of post-concussion recovery. The SCAT-3 symptom severity score might be a useful tool to help characterize patients' post-traumatic headache.
NASA Technical Reports Server (NTRS)
Mainger, Steve
2004-01-01
As NASA speculates on and explores the future of aviation, the technological and physical aspects of our environment increasingly become hurdles that must be overcome for success. Research into methods for overcoming some of these selected hurdles has been proposed by several NASA research partners as concepts. The task of establishing a common evaluation environment was placed on NASA's Virtual Airspace Simulation Technologies (VAST) project (sub-project of VAMS), and they responded with the development of the Airspace Concept Evaluation System (ACES). As one examines the ACES environment from a communication, navigation or surveillance (CNS) perspective, the simulation parameters are built with assumed perfection in the transactions associated with CNS. To truly evaluate these concepts in a realistic sense, the contributions/effects of CNS must be part of the ACES. NASA Glenn Research Center (GRC) has supported the Virtual Airspace Modeling and Simulation (VAMS) project through the continued development of CNS models and analysis capabilities that support the ACES environment. NASA GRC initiated the development of a communications traffic loading analysis tool, called the Future Aeronautical Sub-network Traffic Emulator for Communications, Navigation and Surveillance (FASTE-CNS), as part of this support. This tool allows for forecasting of communications load with the understanding that there is no single, common source for loading models used to evaluate the existing and planned communications channels, and that consensus and accuracy in the traffic load models is a very important input to the decisions being made on the acceptability of communication techniques used to fulfill the aeronautical requirements. Leveraging the existing capabilities of the FASTE-CNS tool, GRC has called for FASTE-CNS to have the functionality to pre- and post-process the simulation runs of ACES to report on instances when traffic density, frequency congestion or aircraft spacing/distance violations have occurred. The integration of these functions requires that the CNS models used to characterize these avionic systems be of higher fidelity and better consistency than is currently present in the FASTE-CNS system. This presentation will explore the capabilities of FASTE-CNS with renewed emphasis on the enhancements being added to perform these processing functions; the fidelity and reliability of CNS models necessary to make the enhancements work; and the benchmarking of FASTE-CNS results to improve confidence in the results of the new processing capabilities.
Automatic cloud coverage assessment of Formosat-2 image
NASA Astrophysics Data System (ADS)
Hsu, Kuo-Hsien
2011-11-01
The Formosat-2 satellite is equipped with a high-spatial-resolution (2 m ground sampling distance) remote sensing instrument. It has been operated on a daily-revisit mission orbit by the National Space Organization (NSPO) of Taiwan since May 21, 2004. NSPO also serves as one of the ground receiving stations, processing the received Formosat-2 images daily. The current cloud coverage assessment of Formosat-2 images in the NSPO Image Processing System generally consists of two major steps. Firstly, an unsupervised K-means method is used to automatically estimate the cloud statistics of a Formosat-2 image. Secondly, cloud coverage is estimated from the Formosat-2 image by manual examination. Apparently, a more accurate Automatic Cloud Coverage Assessment (ACCA) method would increase the efficiency of step 2 by providing a good prediction of the cloud statistics. In this paper, based mainly on the research results of Chang et al., Irish, and Gotoh, we propose a modified Formosat-2 ACCA method that includes pre-processing and post-processing analysis. For the pre-processing analysis, cloud statistics are determined using unsupervised K-means classification, Sobel's method, Otsu's method, non-cloudy pixel reexamination, and a cross-band filter method. A box-counting fractal method is used as a post-processing tool to double-check the results of the pre-processing analysis and increase the efficiency of manual examination.
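One building block of the pre-processing stage is Otsu's threshold, which picks the gray level that maximises between-class variance. The numpy-only sketch below applies it to a synthetic bright-cloud scene; it reproduces only this single step, not the full K-means/Sobel/cross-band/box-counting pipeline.

```python
import numpy as np

def otsu_threshold(band, nbins=256):
    """Pick the threshold that maximises the between-class variance."""
    hist, edges = np.histogram(band, bins=nbins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                                  # class-0 probability
    w1 = 1.0 - w0
    mu0 = np.cumsum(p * centers) / np.maximum(w0, 1e-12)
    mu1 = ((p * centers).sum() - np.cumsum(p * centers)) / np.maximum(w1, 1e-12)
    between = w0 * w1 * (mu0 - mu1) ** 2
    return centers[np.argmax(between[:-1])]            # skip the degenerate last bin

def initial_cloud_mask(band):
    """Pixels brighter than the Otsu threshold are candidate cloud (sketch)."""
    return band > otsu_threshold(band)

rng = np.random.default_rng(0)
img = rng.normal(40.0, 10.0, (512, 512))                        # clear-sky background
img[100:200, 100:220] = rng.normal(220.0, 10.0, (100, 120))     # synthetic bright cloud
mask = initial_cloud_mask(img)
print("cloud coverage estimate: %.1f%%" % (100.0 * mask.mean()))
```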
Förster resonance energy transfer as a tool to study photoreceptor biology
Hovan, Stephanie C.; Howell, Scott; Park, Paul S.-H.
2010-01-01
Vision is initiated in photoreceptor cells of the retina by a set of biochemical events called phototransduction. These events occur via coordinated dynamic processes that include changes in secondary messenger concentrations, conformational changes and post-translational modifications of signaling proteins, and protein-protein interactions between signaling partners. A complete description of the orchestration of these dynamic processes is still unavailable. Described in this work is the first step in the development of tools combining fluorescent protein technology, Förster resonance energy transfer (FRET), and transgenic animals that have the potential to reveal important molecular insights about the dynamic processes occurring in photoreceptor cells. We characterize the fluorescent proteins SCFP3A and SYFP2 for use as a donor-acceptor pair in FRET assays, which will facilitate the visualization of dynamic processes in living cells. We also demonstrate the targeted expression of these fluorescent proteins to the rod photoreceptor cells of Xenopus laevis, and describe a general method for detecting FRET in these cells. The general approaches described here can address numerous types of questions related to phototransduction and photoreceptor biology by providing a platform to visualize dynamic processes in molecular detail within a native context. PMID:21198205
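For readers unfamiliar with FRET quantification, the efficiency is commonly estimated from donor quenching, E = 1 - F_DA/F_D, and relates to donor-acceptor separation through E = 1/(1 + (r/R0)^6). A tiny sketch with hypothetical intensities and an assumed Forster radius:

```python
def fret_efficiency(f_donor_with_acceptor, f_donor_alone):
    """E = 1 - F_DA / F_D : fractional quenching of donor emission by the acceptor."""
    return 1.0 - f_donor_with_acceptor / f_donor_alone

def separation_from_efficiency(E, r0=50.0):
    """Invert E = 1 / (1 + (r/R0)^6); r0 is an assumed Forster radius in angstroms."""
    return r0 * ((1.0 - E) / E) ** (1.0 / 6.0)

E = fret_efficiency(620.0, 1000.0)   # hypothetical donor photon counts
print(f"E = {E:.2f}, apparent separation ~ {separation_from_efficiency(E):.1f} angstrom")
```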
Allen, Jacqui; Annells, Merilyn
2009-04-01
To explore through literature review the appropriateness of three common tools for use by community nurses to screen war veteran and war widow(er) clients for depression, anxiety and post-traumatic stress disorder. War veterans and, to a lesser extent, war widow(er)s, are prone to mental health challenges, especially depression, anxiety and post-traumatic stress disorder. Community nurses do not accurately identify such people with depression and related disorders although they are well positioned to do so. The use of valid and reliable self-report tools is one method of improving nurses' identification of people with actual or potential mental health difficulties for referral to a general practitioner or mental health practitioner for diagnostic assessment and treatment. The Geriatric Depression Scale, Depression Anxiety Stress Scales and Post-traumatic Stress Disorder Checklist are frequently recommended for mental health screening but the appropriateness of using the tools for screening war veteran and war widow(er) community nursing clients who are often aged and have functional impairment, is unknown. Systematic review. Current literature informs that the Geriatric Depression Scale accurately predicts a diagnosis of depression in community nursing cohorts. The three Depression Anxiety Stress Scales subscales of depression, anxiety and stress are valid; however, no studies were identified that compared the performance of the Depression Anxiety Stress Scales in predicting diagnoses of depression or anxiety. The Post-traumatic Stress Disorder Checklist predicts post-traumatic stress disorder in community cohorts although no studies meeting the selection criteria included male participants. This review provides recommendations for the use of the Geriatric Depression Scale, Depression Anxiety Stress Scales and The Post-traumatic Stress Disorder Checklist based on examination of the published evidence for the application of these screening tools in samples approximated to community nursing cohorts. Findings and recommendations would guide community nurses, managers and health planners in the selection of mental health screening tools to promote holistic community nursing care.
Look@NanoSIMS--a tool for the analysis of nanoSIMS data in environmental microbiology.
Polerecky, Lubos; Adam, Birgit; Milucka, Jana; Musat, Niculina; Vagner, Tomas; Kuypers, Marcel M M
2012-04-01
We describe an open-source freeware programme for high throughput analysis of nanoSIMS (nanometre-scale secondary ion mass spectrometry) data. The programme implements basic data processing and analytical functions, including display and drift-corrected accumulation of scanned planes, interactive and semi-automated definition of regions of interest (ROIs), and export of the ROIs' elemental and isotopic composition in graphical and text-based formats. Additionally, the programme offers new functions that were custom-designed to address the needs of environmental microbiologists. Specifically, it allows manual and automated classification of ROIs based on the information that is derived either from the nanoSIMS dataset itself (e.g. from labelling achieved by halogen in situ hybridization) or is provided externally (e.g. as a fluorescence in situ hybridization image). Moreover, by implementing post-processing routines coupled to built-in statistical tools, the programme allows rapid synthesis and comparative analysis of results from many different datasets. After validation of the programme, we illustrate how these new processing and analytical functions increase flexibility, efficiency and depth of the nanoSIMS data analysis. Through its custom-made and open-source design, the programme provides an efficient, reliable and easily expandable tool that can help a growing community of environmental microbiologists and researchers from other disciplines process and analyse their nanoSIMS data. © 2012 Society for Applied Microbiology and Blackwell Publishing Ltd.
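A core operation in this kind of analysis is accumulating the scanned planes and computing an isotope ratio over a region of interest. The sketch below shows that arithmetic with numpy; the count images, the circular ROI, and the omission of drift correction are simplifications for illustration, not Look@NanoSIMS internals.

```python
import numpy as np

def accumulate_planes(planes):
    """Sum repeatedly scanned planes into one count image (drift correction omitted)."""
    return np.sum(np.asarray(planes, dtype=float), axis=0)

def roi_isotope_ratio(counts_minor, counts_major, roi_mask):
    """Isotope ratio (e.g. 13C/12C) for one region of interest."""
    return counts_minor[roi_mask].sum() / counts_major[roi_mask].sum()

# hypothetical accumulated count images and a circular ROI mask
rng = np.random.default_rng(0)
c13 = rng.poisson(5, (256, 256)).astype(float)
c12 = rng.poisson(400, (256, 256)).astype(float)
yy, xx = np.mgrid[:256, :256]
roi = (yy - 128) ** 2 + (xx - 128) ** 2 < 20 ** 2
print("13C/12C in ROI: %.4f" % roi_isotope_ratio(c13, c12, roi))
```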
Detecting false positive sequence homology: a machine learning approach.
Fujimoto, M Stanley; Suvorov, Anton; Jensen, Nicholas O; Clement, Mark J; Bybee, Seth M
2016-02-24
Accurate detection of homologous relationships of biological sequences (DNA or amino acid) amongst organisms is an important and often difficult task that is essential to various evolutionary studies, ranging from building phylogenies to predicting functional gene annotations. There are many existing heuristic tools, most commonly based on bidirectional BLAST searches, that are used to identify homologous genes and group them into two fundamentally distinct classes: orthologs and paralogs. Because these methods use only heuristic filtering based on significance score cutoffs and have no cluster post-processing tools available, they can often produce clusters containing unrelated (non-homologous) sequences. Therefore, sequencing data extracted from incomplete genome/transcriptome assemblies, originating from low-coverage sequencing or produced by de novo assembly without a reference genome, are susceptible to high false positive rates of homology detection. In this paper we develop biologically informative features that can be extracted from multiple sequence alignments of putative homologous genes (orthologs and paralogs) and further utilized in the context of guided experimentation to verify false positive outcomes. We demonstrate that our machine learning method, trained on both known homology clusters obtained from OrthoDB and randomly generated sequence alignments (non-homologs), successfully identifies apparent false positives inferred by heuristic algorithms, especially among proteomes recovered from low-coverage RNA-seq data. Approximately 42% and 25% of the putative homologies predicted by InParanoid and HaMStR, respectively, were classified as false positives on the experimental data set. Our process increases the quality of output from other clustering algorithms by providing a novel post-processing method that is both fast and efficient at removing low-quality clusters of putative homologous genes recovered by heuristic-based approaches.
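As an illustration of the general idea (not the authors' actual feature set or classifier), the hedged Python sketch below derives two simple features from a multiple sequence alignment, mean pairwise identity and gap fraction, and trains a generic classifier to flag clusters that look like false-positive homologies. All sequences, labels and parameter choices are invented for demonstration.

```python
# Hedged sketch, not the published pipeline: alignment-derived features feeding
# a generic classifier to flag putative false-positive homology clusters.
from itertools import combinations
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def msa_features(msa):
    """msa: list of equal-length aligned sequences (strings); returns
    [mean pairwise identity, gap fraction]."""
    length = len(msa[0])
    gap_frac = sum(seq.count("-") for seq in msa) / (len(msa) * length)
    identities = []
    for a, b in combinations(msa, 2):
        matches = sum(1 for x, y in zip(a, b) if x == y and x != "-")
        identities.append(matches / length)
    return [float(np.mean(identities)), gap_frac]

# Toy training set: 1 = true homology cluster, 0 = false positive (non-homologs).
X = np.array([
    msa_features(["ACDEFG-HIK", "ACDEFGAHIK", "ACDEYG-HIK"]),  # similar sequences
    msa_features(["ACDEFG-HIK", "WLPQNS-TRM", "GGHHII-KKL"]),  # unrelated sequences
])
y = np.array([1, 0])

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict([msa_features(["ACDEFGAHIK", "ACDEFG-HIK"])]))  # expect [1]
```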
SensePath: Understanding the Sensemaking Process Through Analytic Provenance.
Nguyen, Phong H; Xu, Kai; Wheat, Ashley; Wong, B L William; Attfield, Simon; Fields, Bob
2016-01-01
Sensemaking is described as the process of comprehension, finding meaning and gaining insight from information, producing new knowledge and informing further action. Understanding the sensemaking process makes it possible to build effective visual analytics tools for making sense of large and complex datasets. Currently, understanding this process is often a manual and time-consuming undertaking: researchers collect observation data, transcribe screen capture videos and think-aloud recordings, identify recurring patterns, and eventually abstract the sensemaking process into a general model. In this paper, we propose a general approach to facilitate such a qualitative analysis process, and introduce a prototype, SensePath, to demonstrate the application of this approach with a focus on browser-based online sensemaking. The approach is based on a study of a number of qualitative research sessions including observations of users performing sensemaking tasks and post hoc analyses to uncover their sensemaking processes. Based on the study results and a follow-up participatory design session with HCI researchers, we decided to focus on the transcription and coding stages of thematic analysis. SensePath automatically captures users' sensemaking actions, i.e., analytic provenance, and provides multi-linked views to support their further analysis. A number of other requirements elicited from the design session are also implemented in SensePath, such as easy integration with existing qualitative analysis workflows and non-intrusiveness for participants. The tool was used by an experienced HCI researcher to analyze two sensemaking sessions. The researcher found the tool intuitive, and it considerably reduced analysis time, allowing a better understanding of the sensemaking process.
Conditions Affecting the Usefulness of Pre- and Post-Tests for Assessment Purposes
ERIC Educational Resources Information Center
Boyas, Elise; Bryan, Lois D.; Lee, Tanya
2012-01-01
Interest in measuring and evaluating student learning in higher education is growing. There are many tools available to assess student learning. However, the use of such tools may be more or less appropriate under various conditions. This study provides some evidence related to the appropriate use of pre/post-tests. The question of whether graded…
Randy B. Foltz; Peter R. Robichaud; Hakjun Rhee
2008-01-01
We synthesized post-fire road treatment information to assist BAER specialists in making road rehabilitation decisions. We developed a questionnaire; conducted 30 interviews of BAER team engineers and hydrologists; acquired and analyzed gray literature and other relevant publications; and reviewed road rehabilitation procedures and analysis tools. Post-fire road...
Validation of a probabilistic post-fire erosion model
Pete Robichaud; William J. Elliot; Sarah A. Lewis; Mary Ellen Miller
2016-01-01
Post-fire increases of runoff and erosion often occur and land managers need tools to be able to project the increased risk. The Erosion Risk Management Tool (ERMiT) uses the Water Erosion Prediction Project (WEPP) model as the underlying processor. ERMiT predicts the probability of a given amount of hillslope sediment delivery from a single rainfall or...
Reyes, E Michael; Sharma, Anjali; Thomas, Kate K; Kuehn, Chuck; Morales, José Rafael
2014-09-17
Little information exists on the technical assistance needs of local indigenous organizations charged with managing HIV care and treatment programs funded by the US President's Emergency Plan for AIDS Relief (PEPFAR). This paper describes the methods used to adapt the Primary Care Assessment Tool (PCAT) framework, which has successfully strengthened HIV primary care services in the US, into one that could strengthen the capacity of local partners to deliver priority health programs in resource-constrained settings by identifying their specific technical assistance needs. Qualitative methods and inductive reasoning approaches were used to conceptualize and adapt the new Clinical Assessment for Systems Strengthening (ClASS) framework. Stakeholder interviews, comparisons of existing assessment tools, and a pilot test helped determine the overall ClASS framework for use in low-resource settings. The framework was further refined one year post-ClASS implementation. Stakeholder interviews, assessment of existing tools, a pilot process and the one-year post-implementation assessment informed the adaptation of the ClASS framework for assessing and strengthening technical and managerial capacities of health programs at three levels: international partner, local indigenous partner, and local partner treatment facility. The PCAT focus on organizational strengths and systems strengthening was retained and implemented in the ClASS framework and approach. A modular format was chosen to allow the use of administrative, fiscal and clinical modules in any combination and to insert new modules as needed by programs. The pilot led to refined pre-visit planning, informed review team composition, increased visit duration, and restructured modules. A web-based toolkit was developed to capture three years of experiential learning; this kit can also be used for independent implementation of the ClASS framework. A systematic adaptation process has produced a qualitative framework that can inform implementation strategies in support of country-led HIV care and treatment programs. The framework, as a well-received iterative process focused on technical assistance, may have broader utility in other global programs.
Leeman, Jennifer; Myers, Allison; Grant, Jennifer C; Wangen, Mary; Queen, Tara L
2017-09-01
The US tobacco industry spends $8.2 billion annually on marketing at the point of sale (POS), a practice known to increase tobacco use. Evidence-based policy interventions (EBPIs) are available to reduce exposure to POS marketing, and nationwide, states are funding community-based tobacco control partnerships to promote local enactment of these EBPIs. Little is known, however, about what implementation strategies best support community partnerships' success enacting EBPI. Guided by Kingdon's theory of policy change, Counter Tools provides tools, training, and other implementation strategies to support community partnerships' performance of five core policy change processes: document local problem, formulate policy solutions, engage partners, raise awareness of problems and solutions, and persuade decision makers to enact new policy. We assessed Counter Tools' impact at 1 year on (1) partnership coordinators' self-efficacy, (2) partnerships' performance of core policy change processes, (3) community progress toward EBPI enactment, and (4) salient contextual factors. Counter Tools provided implementation strategies to 30 partnerships. Data on self-efficacy were collected using a pre-post survey. Structured interviews assessed performance of core policy change processes. Data also were collected on progress toward EBPI enactment and contextual factors. Analysis included descriptive and bivariate statistics and content analysis. Following 1-year exposure to implementation strategies, coordinators' self-efficacy increased significantly. Partnerships completed the greatest proportion of activities within the "engage partners" and "document local problem" core processes. Communities made only limited progress toward policy enactment. Findings can inform delivery of implementation strategies and tests of their effects on community-level efforts to enact EBPIs.
Pre-liver transplant psychosocial evaluation predicts post-transplantation outcomes.
Benson, Ariel A; Rowe, Mina; Eid, Ahmad; Bluth, Keren; Merhav, Hadar; Khalaileh, Abed; Safadi, Rifaat
2018-08-01
Psychosocial factors greatly impact the course of patients throughout the liver transplantation process. A retrospective chart review was performed of patients who underwent liver transplantation at Hadassah-Hebrew University Medical Center between 2002 and 2012. A composite psychosocial score was computed based on the patient's pre-transplant evaluation. Patients were divided into two groups based on compliance, support and insight: Optimal psychosocial score and Non-optimal psychosocial score. Post-liver transplantation survival and complication rates were evaluated. Out of 100 patients who underwent liver transplantation at the Hadassah-Hebrew University Medical Center between 2002 and 2012, 93% had a complete pre-liver transplant psychosocial evaluation in the medical record performed by professional psychologists and social workers. Post-liver transplantation survival was significantly higher in the Optimal group (85%) as compared to the Non-optimal group (56%, p = .002). Post-liver transplantation rate of renal failure was significantly lower in the Optimal group. No significant differences were observed between the groups in other post-transplant complications. A patient's psychosocial status may impact outcomes following transplantation as inferior psychosocial grades were associated with lower overall survival and increased rates of complications. Pre-liver transplant psychosocial evaluations are an important tool to help predict survival following transplantation.
A semi-automatic annotation tool for cooking video
NASA Astrophysics Data System (ADS)
Bianco, Simone; Ciocca, Gianluigi; Napoletano, Paolo; Schettini, Raimondo; Margherita, Roberto; Marini, Gianluca; Gianforme, Giorgio; Pantaleo, Giuseppe
2013-03-01
In order to create a cooking assistant application to guide the users in the preparation of the dishes relevant to their profile diets and food preferences, it is necessary to accurately annotate the video recipes, identifying and tracking the foods handled by the cook. These videos present particular annotation challenges such as frequent occlusions, food appearance changes, etc. Manually annotating the videos is a time-consuming, tedious and error-prone task. Fully automatic tools that integrate computer vision algorithms to extract and identify the elements of interest are not error free, and false positive and false negative detections need to be corrected in a post-processing stage. We present an interactive, semi-automatic tool for the annotation of cooking videos that integrates computer vision techniques under the supervision of the user. The annotation accuracy is increased with respect to completely automatic tools and the human effort is reduced with respect to completely manual ones. The performance and usability of the proposed tool are evaluated on the basis of the time and effort required to annotate the same video sequences.
Dimagno, Matthew J; Wamsteker, Erik-Jan; Rizk, Rafat S; Spaete, Joshua P; Gupta, Suraj; Sahay, Tanya; Costanzo, Jeffrey; Inadomi, John M; Napolitano, Lena M; Hyzy, Robert C; Desmond, Jeff S
2014-03-01
There are many published clinical guidelines for acute pancreatitis (AP). Implementation of these recommendations is variable. We hypothesized that a clinical decision support (CDS) tool would change clinician behavior and shorten hospital length of stay (LOS). Observational study, entitled The AP Early Response (TAPER) Project. Tertiary center emergency department (ED) and hospital. Two consecutive samplings of patients having ICD-9 code (577.0) for AP were generated from the emergency department (ED) or hospital admissions. Diagnosis of AP was based on conventional Atlanta criteria. The Pre-TAPER-CDS-Tool group (5/30/06-6/22/07) had 110 patients presenting to the ED with AP per 976 ICD-9 (577.0) codes and the Post-TAPER-CDS-Tool group (7/14/10-5/5/11) had 113 per 907 ICD-9 codes. The TAPER-CDS-Tool, developed 12/2008-7/14/2010, combines an early, automated paging-alert system, which text-pages ED clinicians about a patient with AP, with an intuitive web-based point-of-care instrument consisting of seven early management recommendations. The pre- vs. post-TAPER-CDS-Tool groups had similar baseline characteristics. The post-TAPER-CDS-Tool group met two management goals more frequently than the pre-TAPER-CDS-Tool group: risk stratification (P<0.0001) and intravenous fluids >6L/1st 0-24 h (P=0.0003). Mean (s.d.) hospital LOS was significantly shorter in the post-TAPER-CDS-Tool group (4.6 (3.1) vs. 6.7 (7.0) days, P=0.0126). Multivariate analysis identified four independent variables for hospital LOS: the TAPER-CDS-Tool, associated with shorter LOS (P=0.0049), and three variables associated with longer LOS: Japanese severity score (P=0.0361), persistent organ failure (P=0.0088), and local pancreatic complications (P<0.0001). The TAPER-CDS-Tool is associated with changed clinician behavior and shortened hospital LOS, which has significant financial implications.
APT - NASA ENHANCED VERSION OF AUTOMATICALLY PROGRAMMED TOOL SOFTWARE - STAND-ALONE VERSION
NASA Technical Reports Server (NTRS)
Premo, D. A.
1994-01-01
The APT code is one of the most widely used software tools for complex numerically controlled (N/C) machining. APT is an acronym for Automatically Programmed Tools and is used to denote both a language and the computer software that processes that language. Development of the APT language and software system was begun over twenty years ago as a U. S. government sponsored industry and university research effort. APT is a "problem oriented" language that was developed for the explicit purpose of aiding the N/C machine tools. Machine-tool instructions and geometry definitions are written in the APT language to constitute a "part program." The APT part program is processed by the APT software to produce a cutter location (CL) file. This CL file may then be processed by user supplied post processors to convert the CL data into a form suitable for a particular N/C machine tool. This June, 1989 offering of the APT system represents an adaptation, with enhancements, of the public domain version of APT IV/SSX8 to the DEC VAX-11/780 for use by the Engineering Services Division of the NASA Goddard Space Flight Center. Enhancements include the super pocket feature which allows concave and convex polygon shapes of up to 40 points including shapes that overlap, that leave islands of material within the pocket, and that have one or more arcs as part of the pocket boundary. Recent modifications to APT include a rework of the POCKET subroutine and correction of an error that prevented the use within a macro of a macro variable cutter move statement combined with macro variable double check surfaces. Former modifications included the expansion of array and buffer sizes to accommodate larger part programs, and the insertion of a few user friendly error messages. The APT system software on the DEC VAX-11/780 is organized into two separate programs: the load complex and the APT processor. The load complex handles the table initiation phase and is usually only run when changes to the APT processor capabilities are made. This phase initializes character recognition and syntax tables for the APT processor by creating FORTRAN block data programs. The APT processor consists of four components: the translator, the execution complex, the subroutine library, and the CL editor. The translator examines each APT statement in the part program for recognizable structure and generates a new statement, or series of statements, in an intermediate language. The execution complex processes all of the definition, motion, and related statements to generate cutter location coordinates. The subroutine library contains routines defining the algorithms required to process the sequenced list of intermediate language commands generated by the translator. The CL editor re-processes the cutter location coordinates according to user supplied commands to generate a final CL file. A sample post processor is also included which translates a CL file into a form for use with a Wales Strippit Fabramatic Model 30/30 sheet metal punch. The user should be able to readily develop post processors for other N/C machine tools. The APT language is a statement oriented, sequence dependent language. With the exception of such programming techniques as looping and macros, statements in an APT program are executed in a strict first-to-last sequence. 
In order to provide programming capability for the broadest possible range of parts and of machine tools, APT input (and output) is generalized, as represented by 3-dimensional geometry and tools, and arbitrarily uniform, as represented by the moving tool concept and output data in absolute coordinates. A command procedure allows the user to select the desired part program, ask for a graphics file of cutter motions in IGES format, and submit the procedure as a batch job, if desired. The APT system software is written in FORTRAN 77 for batch and interactive execution and has been implemented on a DEC VAX series computer under VMS 4.4. The enhancements for this version of APT were last updated in June, 1989. The NASA adaptation, with enhancements, of the public domain version of the APT IV/SSX8 software to the DEC VAX-11/780 is available by license for a period of ten (10) years to approved licensees. The licensed program product delivered includes the APT IV/SSX8 system source code, object code, executable images, and command procedures and one set of supporting documentation. Additional copies of the supporting documentation may be purchased at any time at the price indicated below.
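Since the abstract describes user-supplied post processors that turn CL data into machine-specific instructions, the following hedged Python sketch shows the flavor of such a converter. The CL record layout and the G-code-style output dialect are invented for illustration; real APT CL files and post processors are considerably more involved and machine-specific.

```python
# Illustrative only: a toy post processor turning simplified cutter-location
# (CL) records into G-code-like machine instructions.
from dataclasses import dataclass
from typing import List

@dataclass
class CLRecord:
    kind: str          # "SPINDLE", "RAPID", or "FEED" (hypothetical record types)
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    feed: float = 0.0  # feed rate for "FEED" moves
    rpm: int = 0       # spindle speed for "SPINDLE" records

def post_process(records: List[CLRecord]) -> List[str]:
    """Translate CL records into G-code-style output lines."""
    out = ["%", "G90 G21"]  # absolute positioning, metric units
    for r in records:
        if r.kind == "SPINDLE":
            out.append(f"S{r.rpm} M03")
        elif r.kind == "RAPID":
            out.append(f"G00 X{r.x:.3f} Y{r.y:.3f} Z{r.z:.3f}")
        elif r.kind == "FEED":
            out.append(f"G01 X{r.x:.3f} Y{r.y:.3f} Z{r.z:.3f} F{r.feed:.1f}")
    out += ["M05", "M30", "%"]  # stop spindle, end of program
    return out

if __name__ == "__main__":
    program = [
        CLRecord("SPINDLE", rpm=1200),
        CLRecord("RAPID", x=0.0, y=0.0, z=5.0),
        CLRecord("FEED", x=25.0, y=0.0, z=-2.0, feed=150.0),
    ]
    print("\n".join(post_process(program)))
```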
Rong, Y; Padron, A V; Hagerty, K J; Nelson, N; Chi, S; Keyhani, N O; Katz, J; Datta, S P A; Gomes, C; McLamore, E S
2018-04-30
Impedimetric biosensors for measuring small molecules based on weak/transient interactions between bioreceptors and target analytes are a challenge for detection electronics, particularly in field studies or in the analysis of complex matrices. Protein-ligand binding sensors have enormous potential for biosensing, but achieving accuracy in complex solutions is a major challenge. There is a need for simple post hoc analytical tools that are not computationally expensive, yet provide near real time feedback on data derived from impedance spectra. Here, we show the use of a simple, open source support vector machine learning algorithm for analyzing impedimetric data in lieu of using equivalent circuit analysis. We demonstrate two different protein-based biosensors to show that the tool can be used for various applications. We conclude with a mobile phone-based demonstration focused on the measurement of acetone, an important biomarker related to the onset of diabetic ketoacidosis. In all conditions tested, the open source classifier was capable of performing as well as, or better, than the equivalent circuit analysis for characterizing weak/transient interactions between a model ligand (acetone) and a small chemosensory protein derived from the tsetse fly. In addition, the tool has a low computational requirement, facilitating use for mobile acquisition systems such as mobile phones. The protocol is deployed through Jupyter notebook (an open source computing environment available for mobile phone, tablet or computer use) and the code was written in Python. For each of the applications, we provide step-by-step instructions in English, Spanish, Mandarin and Portuguese to facilitate widespread use. All codes were based on scikit-learn, an open source software machine learning library in the Python language, and were processed in Jupyter notebook, an open-source web application for Python. The tool can easily be integrated with the mobile biosensor equipment for rapid detection, facilitating use by a broad range of impedimetric biosensor users. This post hoc analysis tool can serve as a launchpad for the convergence of nanobiosensors in planetary health monitoring applications based on mobile phone hardware.
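The abstract describes a scikit-learn support vector machine classifier applied to impedance spectra in a Jupyter notebook. The sketch below is a minimal, hedged illustration of that style of workflow; the synthetic feature matrix, labels and kernel settings are assumptions, not the authors' data or published code.

```python
# Minimal sketch: classifying impedance spectra with an SVM (scikit-learn).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in data: each row holds |Z| at 50 frequencies; labels mark whether the
# target analyte (e.g., acetone) was present in the sample.
X = rng.normal(size=(200, 50))
y = rng.integers(0, 2, size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```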
A software tool for determination of breast cancer treatment methods using data mining approach.
Cakır, Abdülkadir; Demirel, Burçin
2011-12-01
In this work, breast cancer treatment methods are determined using data mining. For this purpose, software was developed to help oncology doctors suggest appropriate treatment methods for breast cancer patients. Data from 462 breast cancer patients, obtained from Ankara Oncology Hospital, are used to determine treatment methods for new patients. This dataset is processed with the Weka data mining tool. Classification algorithms are applied to the dataset one by one and the results are compared to find the most suitable treatment method. The developed software program, called "Treatment Assistant", uses different algorithms (IB1, Multilayer Perceptron and Decision Table) to find out which one gives the better result for each attribute to be predicted, and is built with a Java NetBeans interface. Treatment methods are determined for the post-surgical management of breast cancer patients using this software tool. At the modeling step of the data mining process, different Weka algorithms are used for the output attributes. IB1 for the hormonotherapy output, Multilayer Perceptron for the tamoxifen and radiotherapy outputs, and the Decision Table algorithm for the chemotherapy output show the best accuracy performance. In conclusion, this work shows that the data mining approach can be a useful tool for medical applications, particularly at the treatment decision step, and helps the doctor decide in a short time.
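For readers unfamiliar with the comparison step described above, the hedged Python sketch below reproduces its spirit with scikit-learn rather than Weka: a nearest-neighbour learner (analogous to IB1), a multilayer perceptron and a decision tree (standing in for Decision Table) are compared by cross-validated accuracy. The dataset and model settings are invented for illustration and are not the study's clinical data.

```python
# Hedged analogue of the classifier comparison, using scikit-learn.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for one output attribute of a 462-patient dataset.
X, y = make_classification(n_samples=462, n_features=12, random_state=0)

models = {
    "IB1-like (1-NN)": KNeighborsClassifier(n_neighbors=1),
    "Multilayer Perceptron": MLPClassifier(max_iter=1000, random_state=0),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
}

# Pick the model with the best mean cross-validated accuracy for this attribute.
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: {scores.mean():.3f}")
```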
Tran, Anh Phuong; Dafflon, Baptiste; Hubbard, Susan
2016-04-01
TOUGH2 and iTOUGH2 are powerful models that simulate heat and fluid flows in porous and fractured media, and perform parameter estimation, sensitivity analysis and uncertainty propagation analysis. However, setting up the input files is not only tedious but error prone, and processing output files is time consuming. Here, we present an open source Matlab-based tool (iMatTOUGH) that supports the generation of all necessary inputs for both TOUGH2 and iTOUGH2 and visualizes their outputs. The tool links the inputs of TOUGH2 and iTOUGH2, making sure the two input files are consistent. It supports the generation of rectangular computational meshes, i.e., it automatically generates the elements and connections as well as their properties as required by TOUGH2. The tool also allows the specification of initial and time-dependent boundary conditions for better subsurface heat and water flow simulations. The effectiveness of the tool is illustrated by an example that uses TOUGH2 and iTOUGH2 to estimate soil hydrological and thermal properties from soil temperature data and simulate the heat and water flows at the Rifle site in Colorado.
Using Option Grids: steps toward shared decision-making for neonatal circumcision.
Fay, Mary; Grande, Stuart W; Donnelly, Kyla; Elwyn, Glyn
2016-02-01
To assess the impact, acceptability and feasibility of a short encounter tool designed to enhance the process of shared decision-making and parental engagement. We analyzed video-recordings of clinical encounters, half undertaken before and half after a brief intervention that trained four clinicians how to use Option Grids, using an observer-based measure of shared decision-making. We also analyzed semi-structured interviews conducted with the clinicians four weeks after their exposure to the intervention. Observer OPTION(5) scores were higher at post-intervention, with a mean of 33.9 (SD=23.5) compared to a mean of 16.1 (SD=7.1) for pre-intervention, a significant difference of 17.8 (95% CI: 2.4, 33.2). Prior to using the intervention, clinicians used a consent document to frame circumcision as a default practice. Encounters with the Option Grid conferred agency to both parents and clinicians, and facilitated shared decision-making. Clinicians reported recognizing the tool's positive effect on their communication process. Tools such as Option Grids have the potential to make it easier for clinicians to achieve shared decision-making. Encounter tools have the potential to change practice. More research is needed to test their feasibility in routine practice. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Lessons learned: clinicians' post-occupancy perspective of facility design involvement.
Reno, Kathy; Okland, Kathy; Finis, Nanne; Lamantia, Gina; Call, Roger; Cardon, Kerrie; Gerber, Deborah; Zeigler, Janet
2014-01-01
The research was conducted to determine clinician knowledge needs for competent involvement with the facility design process as well as to gather lessons learned on building stronger design teams. As clinical stakeholders are invited to the healthcare facility design table, the question arises as to the ability of professionally diverse team members to translate each other's comments and ideas accurately. In the past, hospitals were designed by a handful of hospital leaders and architects. More recently, multiple players have become involved throughout the design and construction of new healthcare facilities. Clinical consultants from two international healthcare companies observed that many clinicians were unprepared to effectively translate their needs to the architectural community or to competently utilize architectural tools and documents. A qualitative, post-occupancy cross-case study was conducted to understand how clinicians could increase their competencies for successful involvement in facility design. Focus group interviews were held with teams from healthcare facilities occupying their new facility for more than 6 months and less than 2 years. Curriculum topics were validated and additional areas recommended based on the interviews. Open-ended questions on lessons learned provided several new dimensions to the research. Although validating the curriculum was the initial intent, the feedback from the focus groups on lessons learned provided rich concepts for practice implications and further research on post-occupancy. Decision-making, design process, interdisciplinary, planning, post-occupancy.
NASA Astrophysics Data System (ADS)
Tritscher, Torsten; Koched, Amine; Han, Hee-Siew; Filimundi, Eric; Johnson, Tim; Elzey, Sherrie; Avenido, Aaron; Kykal, Carsten; Bischof, Oliver F.
2015-05-01
Electrical mobility classification (EC) followed by Condensation Particle Counter (CPC) detection is the technique combined in Scanning Mobility Particle Sizers (SMPS) to retrieve nanoparticle size distributions in the range from 2.5 nm to 1 μm. The detectable size range of SMPS systems can be extended by the addition of an Optical Particle Sizer (OPS) that covers larger sizes from 300 nm to 10 μm. This optical sizing method reports an optical equivalent diameter, which is often different from the electrical mobility diameter measured by the standard SMPS technique. Multi-Instrument Manager (MIM™) software developed by TSI incorporates algorithms that facilitate merging SMPS data sets with data based on optical equivalent diameter to compile single, wide-range size distributions. Here we present MIM 2.0, the next-generation of the data merging tool that offers many advanced features for data merging and post-processing. MIM 2.0 allows direct data acquisition with OPS and NanoScan SMPS instruments to retrieve real-time particle size distributions from 10 nm to 10 μm, which we show in a case study at a fireplace. The merged data can be adjusted using one of the merging options, which automatically determines an overall aerosol effective refractive index. As a result an indirect and average characterization of aerosol optical and shape properties is possible. The merging tool allows several pre-settings, data averaging and adjustments, as well as the export of data sets and fitted graphs. MIM 2.0 also features several post-processing options for SMPS data and differences can be visualized in a multi-peak sample over a narrow size range.
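As a rough illustration of the merging step described above, the Python sketch below stitches an SMPS distribution (mobility diameter) and an OPS distribution (optical diameter) onto a common logarithmic grid with a simple blend in the overlap region. The diameter ranges are taken from the abstract, but the distributions are synthetic and the refractive-index adjustment performed by MIM 2.0 is deliberately omitted.

```python
# Hedged sketch: naive stitching of SMPS and OPS size distributions; the real
# merge also adjusts optical diameters via an effective refractive index.
import numpy as np

def merge_distributions(d_smps, n_smps, d_ops, n_ops, n_bins=120):
    grid = np.logspace(np.log10(d_smps.min()), np.log10(d_ops.max()), n_bins)
    f_smps = np.interp(np.log10(grid), np.log10(d_smps), n_smps,
                       left=np.nan, right=np.nan)
    f_ops = np.interp(np.log10(grid), np.log10(d_ops), n_ops,
                      left=np.nan, right=np.nan)
    merged = np.where(np.isnan(f_ops), f_smps,
              np.where(np.isnan(f_smps), f_ops, 0.5 * (f_smps + f_ops)))
    return grid, merged

d_smps = np.logspace(1, 3, 50)                    # 10 nm - 1 um (mobility diameter)
d_ops = np.logspace(np.log10(300), 4, 16)         # 0.3 - 10 um (optical diameter)
n_smps = np.exp(-(np.log(d_smps / 100.0)) ** 2)   # synthetic dN/dlogDp
n_ops = np.exp(-(np.log(d_ops / 800.0)) ** 2)

grid, merged = merge_distributions(d_smps, n_smps, d_ops, n_ops)
print(grid[:3], merged[:3])
```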
Harris, Joseph A.; McMahon, Alex R.; Woldorff, Marty G.
2015-01-01
Any information represented in the brain holds the potential to influence behavior. It is therefore of broad interest to determine the extent and quality of neural processing of stimulus input that occurs with and without awareness. The attentional blink is a useful tool for dissociating neural and behavioral measures of perceptual visual processing across conditions of awareness. The extent of higher-order visual information beyond basic sensory signaling that is processed during the attentional blink remains controversial. To determine what neural processing at the level of visual-object identification occurs in the absence of awareness, electrophysiological responses to images of faces and houses were recorded both within and outside of the attentional blink period during a rapid serial visual presentation (RSVP) stream. Electrophysiological results were sorted according to behavioral performance (correctly identified targets versus missed targets) within these blink and non-blink periods. An early index of face-specific processing (the N170, 140–220 ms post-stimulus) was observed regardless of whether the subject demonstrated awareness of the stimulus, whereas a later face-specific effect with the same topographic distribution (500–700 ms post-stimulus) was only seen for accurate behavioral discrimination of the stimulus content. The present findings suggest a multi-stage process of object-category processing, with only the later phase being associated with explicit visual awareness. PMID:23859644
Quantifiable and objective approach to organizational performance enhancement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholand, Andrew Joseph; Tausczik, Yla R.
This report describes a new methodology, social language network analysis (SLNA), that combines tools from social language processing and network analysis to identify socially situated relationships between individuals which, though subtle, are highly influential. Specifically, SLNA aims to identify and characterize the nature of working relationships by processing artifacts generated with computer-mediated communication systems, such as instant message texts or emails. Because social language processing is able to identify psychological, social, and emotional processes that individuals are not able to fully mask, social language network analysis can clarify and highlight complex interdependencies between group members, even when these relationships are latent or unrecognized. This report outlines the philosophical antecedents of SLNA, the mechanics of preprocessing, processing, and post-processing stages, and some example results obtained by applying this approach to a 15-month corporate discussion archive.
Best conditions for biodegradation of diesel oil by chemometric tools.
Kaczorek, Ewa; Bielicka-Daszkiewicz, Katarzyna; Héberger, Károly; Kemény, Sándor; Olszanowski, Andrzej; Voelkel, Adam
2014-01-01
Diesel oil biodegradation by different bacteria-yeast-rhamnolipids consortia was tested. Chromatographic analysis of post-biodegradation residue was completed with chemometric tools (ANOVA, and a novel ranking procedure based on the sum of ranking differences). These tools were used in the selection of the most effective systems. The best results of aliphatic fractions of diesel oil biodegradation were observed for a yeast consortia with Aeromonas hydrophila KR4. For these systems the positive effect of rhamnolipids on hydrocarbon biodegradation was observed. However, rhamnolipids addition did not always have a positive influence on the biodegradation process (e.g. in case of yeast consortia with Stenotrophomonas maltophila KR7). Moreover, particular differences in the degradation pattern were observed for lower and higher alkanes than in the case with C22. Normally, the best conditions for "lower" alkanes are Aeromonas hydrophila KR4 + emulsifier independently from yeasts and e.g. Pseudomonas stutzeri KR7 for C24 alkane.
Pore, Meenal; Sengeh, David M.; Mugambi, Purity; Purswani, Nuri V.; Sesay, Tom; Arnold, Anna Lena; Tran, Anh-Minh A.; Myers, Ralph
2017-01-01
During the 2014 West African Ebola Virus outbreak it became apparent that the initial response to the outbreak was hampered by limitations in the collection, aggregation, analysis and use of data for intervention planning. As part of the post-Ebola recovery phase, IBM Research Africa partnered with the Port Loko District Health Management Team (DHMT) in Sierra Leone and GOAL Global, to design, implement and deploy a web-based decision support tool for district-level disease surveillance. This paper discusses the design process and the functionality of the first version of the system. The paper presents evaluation results prior to a pilot deployment and identifies features for future iterations. A qualitative assessment of the tool prior to pilot deployment indicates that it improves the timeliness and ease of using data for making decisions at the DHMT level. PMID:29854209
Navigating freely-available software tools for metabolomics analysis.
Spicer, Rachel; Salek, Reza M; Moreno, Pablo; Cañueto, Daniel; Steinbeck, Christoph
2017-01-01
The field of metabolomics has expanded greatly over the past two decades, both as an experimental science with applications in many areas, as well as in regards to data standards and bioinformatics software tools. The diversity of experimental designs and instrumental technologies used for metabolomics has led to the need for distinct data analysis methods and the development of many software tools. To compile a comprehensive list of the most widely used freely available software and tools that are used primarily in metabolomics. The most widely used tools were selected for inclusion in the review by either ≥ 50 citations on Web of Science (as of 08/09/16) or the use of the tool being reported in the recent Metabolomics Society survey. Tools were then categorised by the type of instrumental data (i.e. LC-MS, GC-MS or NMR) and the functionality (i.e. pre- and post-processing, statistical analysis, workflow and other functions) they are designed for. A comprehensive list of the most used tools was compiled. Each tool is discussed within the context of its application domain and in relation to comparable tools of the same domain. An extended list including additional tools is available at https://github.com/RASpicer/MetabolomicsTools which is classified and searchable via a simple controlled vocabulary. This review presents the most widely used tools for metabolomics analysis, categorised based on their main functionality. As future work, we suggest a direct comparison of tools' abilities to perform specific data analysis tasks e.g. peak picking.
Geib, Scott M; Hall, Brian; Derego, Theodore; Bremer, Forest T; Cannoles, Kyle; Sim, Sheina B
2018-04-01
One of the most overlooked, yet critical, components of a whole genome sequencing (WGS) project is the submission and curation of the data to a genomic repository, most commonly the National Center for Biotechnology Information (NCBI). While large genome centers or genome groups have developed software tools for post-annotation assembly filtering, annotation, and conversion into the NCBI's annotation table format, these tools typically require back-end setup and connection to an Structured Query Language (SQL) database and/or some knowledge of programming (Perl, Python) to implement. With WGS becoming commonplace, genome sequencing projects are moving away from the genome centers and into the ecology or biology lab, where fewer resources are present to support the process of genome assembly curation. To fill this gap, we developed software to assess, filter, and transfer annotation and convert a draft genome assembly and annotation set into the NCBI annotation table (.tbl) format, facilitating submission to the NCBI Genome Assembly database. This software has no dependencies, is compatible across platforms, and utilizes a simple command to perform a variety of simple and complex post-analysis, pre-NCBI submission WGS project tasks. The Genome Annotation Generator is a consistent and user-friendly bioinformatics tool that can be used to generate a .tbl file that is consistent with the NCBI submission pipeline. The Genome Annotation Generator achieves the goal of providing a publicly available tool that will facilitate the submission of annotated genome assemblies to the NCBI. It is useful for any individual researcher or research group that wishes to submit a genome assembly of their study system to the NCBI.
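To make the target format concrete, the hedged sketch below emits a few lines in the NCBI feature table (.tbl) layout that tools of this kind produce: a ">Feature" header followed by tab-separated feature coordinates and indented qualifiers. The sequence identifier, locus tag and qualifiers are invented examples, and this is a simplified illustration rather than the Genome Annotation Generator's actual output.

```python
# Simplified, hedged illustration of the NCBI .tbl feature table layout.
def tbl_lines(seq_id, features):
    """features: list of dicts with keys 'start', 'stop', 'key', 'qualifiers'."""
    lines = [f">Feature {seq_id}"]
    for f in features:
        lines.append(f"{f['start']}\t{f['stop']}\t{f['key']}")
        for name, value in f.get("qualifiers", {}).items():
            lines.append(f"\t\t\t{name}\t{value}")
    return lines

features = [
    {"start": 1, "stop": 1500, "key": "gene",
     "qualifiers": {"locus_tag": "EXAMPLE_0001"}},
    {"start": 1, "stop": 1500, "key": "CDS",
     "qualifiers": {"product": "hypothetical protein",
                    "protein_id": "gnl|center|EXAMPLE_0001"}},
]
print("\n".join(tbl_lines("contig_1", features)))
```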
Assimilating the Future for Better Forecasts and Earlier Warnings
NASA Astrophysics Data System (ADS)
Du, H.; Wheatcroft, E.; Smith, L. A.
2016-12-01
Multi-model ensembles have become popular tools to account for some of the uncertainty due to model inadequacy in weather and climate simulation-based predictions. Current multi-model forecasts focus on combining single-model ensemble forecasts by means of statistical post-processing. Assuming each model is developed independently or with different primary target variables, each is likely to contain different dynamical strengths and weaknesses. Using statistical post-processing, such information is only carried by the simulations under a single model ensemble: no advantage is taken to influence simulations under the other models. A novel methodology, named Multi-model Cross Pollination in Time, is proposed as a multi-model ensemble scheme with the aim of integrating, operationally, the dynamical information regarding the future from each individual model. The proposed approach generates model states in time by applying data assimilation scheme(s) to yield truly "multi-model trajectories". It is demonstrated to outperform traditional statistical post-processing in the 40-dimensional Lorenz96 flow. Data assimilation approaches are originally designed to improve state estimation from the past to the current time. The aim of this talk is to introduce a framework that uses data assimilation to improve model forecasts at future times (not to argue for any one particular data assimilation scheme). An illustration of applying data assimilation "in the future" to provide early warning of future high-impact events is also presented.
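The 40-dimensional Lorenz96 system mentioned above is a standard low-order test bed for such schemes. The sketch below integrates it with a basic fourth-order Runge-Kutta step; the forcing F = 8 and the integration settings are conventional choices assumed here, not parameters quoted in the abstract.

```python
# Hedged sketch: the 40-variable Lorenz96 system with a simple RK4 integrator.
import numpy as np

def lorenz96_rhs(x, forcing=8.0):
    # dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt, forcing=8.0):
    k1 = lorenz96_rhs(x, forcing)
    k2 = lorenz96_rhs(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_rhs(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_rhs(x + dt * k3, forcing)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

state = 8.0 * np.ones(40)
state[0] += 0.01                  # small perturbation to trigger chaotic behaviour
for _ in range(1000):             # integrate forward in time
    state = rk4_step(state, dt=0.01)
print(state[:5])
```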
External Tank - The Structure Backbone
NASA Technical Reports Server (NTRS)
Welzyn, Kenneth; Pilet, Jeffrey C.; Diecidue-Conners, Dawn; Worden, Michelle; Guillot, Michelle
2011-01-01
The External Tank forms the structural backbone of the Space Shuttle in the launch configuration. Because the tank flies to orbital velocity with the Space Shuttle Orbiter, minimization of weight is mandatory, to maximize payload performance. Choice of lightweight materials both for structure and thermal conditioning was necessary. The tank is large, and unique manufacturing facilities, tooling, handling, and transportation operations were required. Weld processes and tooling evolved with the design as it matured through several block changes, to reduce weight. Non Destructive Evaluation methods were used to assure integrity of welds and thermal protection system materials. The aluminum-lithium alloy was used near the end of the program and weld processes and weld repair techniques had to be refined. Development and implementation of friction stir welding was a substantial technology development incorporated during the Program. Automated thermal protection system application processes were developed for the majority of the tank surface. Material obsolescence was an issue throughout the 40 year program. The final configuration and tank weight enabled international space station assembly in a high inclination orbit allowing international cooperation with the Russian Federal Space Agency. Numerous process controls were implemented to assure product quality, and innovative proof testing was accomplished prior to delivery. Process controls were implemented to assure cleanliness in the production environment, to control contaminants, and to preclude corrosion. Each tank was accepted via rigorous inspections, including non-destructive evaluation techniques, proof testing, and all systems testing. In the post STS-107 era, the project focused on ascent debris risk reduction. This was accomplished via stringent process controls, post flight assessment using substantially improved imagery, and selective redesigns. These efforts were supported with a number of test programs to simulate combined environments. Processing improvements included development and use of low spray guns for foam application, additional human factors considerations for production, use of high fidelity mockups during hardware processing with video review, improved tank access, extensive use of non destructive evaluation, and producibility enhancements. Design improvements included redesigned bipod fittings, a bellows heater, a feedline camera active during ascent flight, removal of the protuberance airload ramps, redesigned ice frost ramps, and titanium brackets replaced aluminum brackets on the liquid oxygen feedline. Post flight assessment improved due to significant addition of imagery assets, greatly improving situational awareness. The debris risk was reduced by two orders of magnitude. During this time a major natural disaster was overcome when Katrina damaged the manufacturing facility. Numerous lessons from these efforts are documented within the paper.
Synthetic biology: tools to design microbes for the production of chemicals and fuels.
Seo, Sang Woo; Yang, Jina; Min, Byung Eun; Jang, Sungho; Lim, Jae Hyung; Lim, Hyun Gyu; Kim, Seong Cheol; Kim, Se Yeon; Jeong, Jun Hong; Jung, Gyoo Yeol
2013-11-01
The engineering of biological systems to achieve specific purposes requires design tools that function in a predictable and quantitative manner. Recent advances in the field of synthetic biology, particularly in the programmable control of gene expression at multiple levels of regulation, have increased our ability to efficiently design and optimize biological systems to perform designed tasks. Furthermore, implementation of these designs in biological systems highlights the potential of using these tools to build microbial cell factories for the production of chemicals and fuels. In this paper, we review current developments in the design of tools for controlling gene expression at transcriptional, post-transcriptional and post-translational levels, and consider potential applications of these tools. Copyright © 2013 Elsevier Inc. All rights reserved.
Ewing, Gail; Austin, Lynn; Grande, Gunn
2016-04-01
The importance of supporting family carers is well recognised in healthcare policy. The Carer Support Needs Assessment Tool is an evidence-based, comprehensive measure of carer support needs to facilitate carer support in palliative home care. To examine practitioner perspectives of the role of the Carer Support Needs Assessment Tool intervention in palliative home care to identify its impact and mechanisms of action. Qualitative - practitioner accounts of implementation (interviews, focus groups, reflective audio diaries) plus researcher field notes. A total of 29 staff members from two hospice home-care services - contrasting geographical locations, different service sizes and staff composition. A thematic analysis was conducted. Existing approaches to identification of carer needs were informal and unstructured. Practitioners expressed some concerns, pre-implementation, about negative impacts of the Carer Support Needs Assessment Tool on carers and expectations raised about support available. In contrast, post-implementation, the Carer Support Needs Assessment Tool provided positive impacts when used as part of a carer-led assessment and support process: it made support needs visible, legitimised support for carers and opened up different conversations with carers. The mechanisms of action that enabled the Carer Support Needs Assessment Tool to make a difference were creating space for the separate needs of carers, providing an opportunity for carers to express support needs and responding to carers' self-defined priorities. The Carer Support Needs Assessment Tool delivered benefits through a change in practice to an identifiable, separate assessment process for carers, facilitated by practitioners but carer-led. Used routinely with all carers, the Carer Support Needs Assessment Tool has the potential to normalise carer assessment and support, facilitate delivery of carer-identified support and enable effective targeting of resources. © The Author(s) 2015.
Mravec, Jozef; Kračun, Stjepan K; Rydahl, Maja G; Westereng, Bjørge; Miart, Fabien; Clausen, Mads H; Fangel, Jonatan U; Daugaard, Mathilde; Van Cutsem, Pierre; De Fine Licht, Henrik H; Höfte, Herman; Malinovsky, Frederikke G; Domozych, David S; Willats, William G T
2014-12-01
Polysaccharides are major components of extracellular matrices and are often extensively modified post-synthetically to suit local requirements and developmental programmes. However, our current understanding of the spatiotemporal dynamics and functional significance of these modifications is limited by a lack of suitable molecular tools. Here, we report the development of a novel non-immunological approach for producing highly selective reciprocal oligosaccharide-based probes for chitosan (the product of chitin deacetylation) and for demethylesterified homogalacturonan. Specific reciprocal binding is mediated by the unique stereochemical arrangement of oppositely charged amino and carboxy groups. Conjugation of oligosaccharides to fluorophores or gold nanoparticles enables direct and rapid imaging of homogalacturonan and chitosan with unprecedented precision in diverse plant, fungal and animal systems. We demonstrated their potential for providing new biological insights by using them to study homogalacturonan processing during Arabidopsis thaliana root cap development and by analyzing sites of chitosan deposition in fungal cell walls and arthropod exoskeletons. © 2014. Published by The Company of Biologists Ltd.
Controlling Energy Performance on the Big Stage - The New York Times Company
DOE Office of Scientific and Technical Information (OSTI.GOV)
Settlemyre, Kevin; Regnier, Cindy
2015-08-01
The Times partnered with the U.S. Department of Energy (DOE) as part of DOE's Commercial Building Partnerships (CBP) Program to develop a post-occupancy evaluation (POE) of three EEMs that were implemented during the construction of The Times building between 2004-2006. With aggressive goals to reduce energy use and carbon emissions at a national level, one strategy of the US Department of Energy is looking to exemplary buildings that have already invested in new approaches to achieving the energy performance goals that are now needed at scale. The Times building incorporated a number of innovative technologies, systems and processes that make their project a model for widespread replication in new and existing buildings. The measured results from the post occupancy evaluation study, the tools and processes developed, and continuous improvements in the performance and cost of the systems studied suggest that these savings are scalable and replicable in a wide range of commercial buildings nationwide.
Organic scintillator detector response simulations with DRiFT
NASA Astrophysics Data System (ADS)
Andrews, M. T.; Bates, C. R.; McKigney, E. A.; Solomon, C. J.; Sood, A.
2016-09-01
This work presents the organic scintillation simulation capabilities of DRiFT, a post-processing Detector Response Function Toolkit for MCNP® output. DRiFT is used to create realistic scintillation detector response functions to incident neutron and gamma mixed-field radiation. As a post-processing tool, DRiFT leverages the extensively validated radiation transport capabilities of MCNP®6, which also provides the ability to simulate complex sources and geometries. DRiFT is designed to be flexible: it allows the user to specify scintillator material, PMT type, applied PMT voltage, and quenching data used in simulations. The toolkit's capabilities, which include the generation of pulse shape discrimination plots and full-energy detector spectra, are demonstrated in a comparison of measured and simulated neutron contributions from 252Cf and PuBe, and photon spectra from 22Na and 228Th sources. DRiFT reproduced energy resolution effects observed in EJ-301 measurements through the inclusion of scintillation yield variances, photon transport noise, and PMT photocathode and multiplication noise.
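As background to the pulse shape discrimination plots mentioned above, the hedged sketch below computes the classic charge-comparison PSD parameter (tail integral divided by total integral) for two toy pulses; in organic scintillators, neutron-induced pulses carry a larger slow component and therefore a larger PSD value. The pulse shapes, decay constants and gate positions are invented for illustration and are not DRiFT's internal models.

```python
# Hedged sketch of the charge-comparison PSD parameter on synthetic pulses.
import numpy as np

def psd_parameter(pulse, tail_start, total_start=0):
    """PSD = tail integral / total integral over the digitized pulse."""
    total = pulse[total_start:].sum()
    tail = pulse[tail_start:].sum()
    return tail / total if total > 0 else 0.0

t = np.arange(200)  # sample index (arbitrary units)
# Toy pulses: neutrons deposit relatively more light in the slow (tail) component.
gamma_pulse = np.exp(-t / 5.0) + 0.02 * np.exp(-t / 80.0)
neutron_pulse = np.exp(-t / 5.0) + 0.10 * np.exp(-t / 80.0)

print("gamma PSD:  ", round(psd_parameter(gamma_pulse, tail_start=30), 3))
print("neutron PSD:", round(psd_parameter(neutron_pulse, tail_start=30), 3))
```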
Your Personal Analysis Toolkit - An Open Source Solution
NASA Astrophysics Data System (ADS)
Mitchell, T.
2009-12-01
Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!
ERIC Educational Resources Information Center
George, David Alan; Tan, Poh-Ling; Clewett, Jeffrey Frank
2016-01-01
Using a participatory learning approach, we report on the delivery and evaluation of a climate change and risk assessment tool to help manage water risks within the agricultural sector. Post-graduate water-professional students from a range of countries, from both developed and emerging economies were involved in using this tool. Our approach…
ERIC Educational Resources Information Center
Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy
2006-01-01
We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…
Advances in deep-UV processing using cluster tools
NASA Astrophysics Data System (ADS)
Escher, Gary C.; Tepolt, Gary; Mohondro, Robert D.
1993-09-01
Deep-UV laser lithography has shown the capability of supporting the manufacture of multiple generations of integrated circuits (ICs) due to its wide process latitude and depth of focus (DOF) for 0.2 micrometers to 0.5 micrometers feature sizes. This capability has been attained through improvements in deep-UV wide field lens technology, excimer lasers, steppers and chemically amplified, positive deep-UV resists. Chemically amplified deep-UV resists are required for 248 nm lithography due to the poor absorption and sensitivity of conventional novolac resists. The acid catalyzation processes of the new resists require control of the thermal history and environmental conditions of the lithographic process. Work is currently underway at several resist vendors to reduce the need for these controls, but practical manufacturing solutions exist today. One of these solutions is the integration of steppers and resist tracks into a `cluster tool' or `Lithocell' to ensure a consistent thermal profile for the resist process and reduce the time the resist is exposed to atmospheric contamination. The work here reports processing and system integration results with a Machine Technology, Inc. (MTI) post-exposure bake (PEB) track interfaced with an advanced GCA XLS 7800 deep-UV stepper [31 mm diameter, variable NA (0.35 - 0.53) and variable sigma (0.3 - 0.74)].
On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment
NASA Astrophysics Data System (ADS)
Guterres, Rui M.
The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low speed wind tunnel environment. An analysis of the existing tools, their strengths, and their limitations is presented. Improvements to the existing analysis approaches were made. Software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three dimensional separated wake data is developed. A set of complete steps ranging from the basic mathematical concept to the applicable engineering equations is presented. An extensive experimental study was conducted. Three representative body types were studied in varying ground effect conditions. A detailed qualitative wake analysis using wake imaging and two and three dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed. Significant post-processing and data conditioning tools and precision analysis were developed. The quality of the data is shown to be in direct correlation with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty. These are quantified when possible and the accuracy of the computed results is seen to significantly improve. When post-processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained. Guidelines are established that will lead to more successful implementation of these tools in future research programs. Close attention is paid to implementation issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows at hand is qualitatively and quantitatively studied. Its impact on the accuracy of the computations as well as the wall drag incompatibility with the theoretical model followed are discussed. The newly developed quantitative analysis provides significantly increased accuracy. The aerodynamic drag coefficient is computed within one percent of the balance-measured value for the best cases.
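The drag computation in such wake surveys rests on the classical momentum-deficit integral, D = ρ ∬ u (U∞ − u) dA over the wake plane. The Python sketch below (assuming NumPy, with a synthetic wake profile, and neglecting the static-pressure and crossflow terms treated in the full derivation) shows the basic numerical integration; it illustrates the concept only, not the dissertation's software.

import numpy as np

# Minimal sketch of the momentum-deficit wake integral on a synthetic wake.
rho, U_inf = 1.225, 40.0               # air density (kg/m^3), freestream (m/s)
y = np.linspace(-0.5, 0.5, 101)        # wake-plane grid (m), assumed spacing
z = np.linspace(-0.3, 0.3, 61)
Y, Z = np.meshgrid(y, z)
u = U_inf * (1.0 - 0.2 * np.exp(-(Y**2 + Z**2) / 0.02))  # synthetic velocity field

deficit = rho * u * (U_inf - u)
drag = np.trapz(np.trapz(deficit, y, axis=1), z)   # integrate over y, then z
print(f"wake-integrated drag: {drag:.2f} N")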
Assessing Student Understanding of Physical Hydrology
NASA Astrophysics Data System (ADS)
Castillo, A. J.; Marshall, J.; Cardenas, M. B.
2012-12-01
Our objective is to characterize and assess upper division and graduate student thinking by developing and testing an assessment tool for a physical hydrology class. The class' learning goals are: (1) Quantitative process-based understanding of hydrologic processes, (2) Experience with different methods in hydrology, (3) Learning, problem solving, communication skills. These goals were translated into two measurable tasks asked of students in a questionnaire: (1) Describe the significant processes in the hydrological cycle and (2) Describe laws governing these processes. A third question below assessed the students' ability to apply their knowledge: You have been hired as a consultant by __ to (1) assess how urbanization and the current drought have affected a local spring and (2) predict what the effects will be in the future if the drought continues. What information would you need to gather? What measurements would you make? What analyses would you perform? Student and expert responses to the questions were then used to develop a rubric to score responses. Using the rubric, 3 researchers independently blind-coded the full set of pre and post artifacts, resulting in 89% inter-rater agreement on the pre-tests and 83% agreement on the post-tests. We present student scores to illustrate the use of the rubric and to characterize student thinking prior to and following a traditional course. Most students interpreted Q1 in terms of physical processes affecting the water cycle, the primary organizing framework for hydrology, as intended. On the pre-test, one student scored 0, indicating no response, on this question. Twenty students scored 1, indicating rudimentary understanding, 2 students scored a 2, indicating a basic understanding, and no student scored a 3. Student scores on this question improved on the post-test. On the 22 post-tests that were blind scored, 11 students demonstrated some recognition of concepts, 9 students showed a basic understanding, and 2 students had a full understanding of the processes linked to hydrology. Half the students had provided evidence of the desired understanding; however, half still demonstrated only a rudimentary understanding. Results on Q2 were similar. On the pre-test, 2 students scored 0, 21 students scored 1, indicating rudimentary understanding, 2 students scored a 2, and no student scored a 3. On the post-test, again approximately half the students achieved the desired understanding: 9 students showed some recognition of concepts, 12 students demonstrated a basic understanding; only one student exhibited full understanding. On Q3, no student scored 0, 9 scored 1, 15 scored 2 and 1 student scored 3. On the post-test, one student scored 1, 16 students scored 2, and 5 students scored 3. Students were significantly better at responding to Q3 (the application) as opposed to Q1 and Q2, which were more abstract. Research has shown that students are often better able to solve contextualized problems when they are unable to deal with more abstract tasks. This result has limitations including the small number of participants, all from one institution, and the fact that the rubric was still under development. Nevertheless, the high inter-rater agreement by a group of experts is significant; the rubric we developed is a potentially useful tool for assessment of learning and understanding physical hydrology. Supported by NSF CAREER grant (EAR-0955750).
Roccetti, Marco; Marfia, Gustavo; Salomoni, Paola; Prandi, Catia; Zagari, Rocco Maurizio; Gningaye Kengni, Faustine Linda; Bazzoli, Franco; Montagnani, Marco
2017-08-09
Data concerning patients originates from a variety of sources on social media. The aim of this study was to show how methodologies borrowed from different areas including computer science, econometrics, statistics, data mining, and sociology may be used to analyze Facebook data to investigate the patients' perspectives on a given medical prescription. To shed light on patients' behavior and concerns, we focused on Crohn's disease, a chronic inflammatory bowel disease, and the specific therapy with the biological drug Infliximab. To gain information from the basin of big data, we analyzed Facebook posts in the time frame from October 2011 to August 2015. We selected posts from patients affected by Crohn's disease who were experiencing or had previously been treated with the monoclonal antibody drug Infliximab. The selected posts underwent further characterization and sentiment analysis. Finally, an ethnographic review was carried out by experts from different scientific research fields (eg, computer science vs gastroenterology) and by a software system running a sentiment analysis tool. The patient feeling toward the Infliximab treatment was classified as positive, neutral, or negative, and the results from computer science, gastroenterologist, and software tool were compared using the square weighted Cohen's kappa coefficient method. The first automatic selection process returned 56,000 Facebook posts, 261 of which exhibited a patient opinion concerning Infliximab. The ethnographic analysis of these 261 selected posts gave similar results, with an interrater agreement between the computer science and gastroenterology experts amounting to 87.3% (228/261), a substantial agreement according to the square weighted Cohen's kappa coefficient method (w2K=0.6470). A positive, neutral, and negative feeling was attributed to 36%, 27%, and 37% of posts by the computer science expert and 38%, 30%, and 32% by the gastroenterologist, respectively. Only a slight agreement was found between the experts' opinion and the software tool. We show how data posted on Facebook by Crohn's disease patients are a useful dataset to understand the patient's perspective on the specific treatment with Infliximab. The genuine, nonmedically influenced patients' opinion obtained from Facebook pages can be easily reviewed by experts from different research backgrounds, with a substantial agreement on the classification of patients' sentiment. The described method allows a fast collection of big amounts of data, which can be easily analyzed to gain insight into the patients' perspective on a specific medical therapy. ©Marco Roccetti, Gustavo Marfia, Paola Salomoni, Catia Prandi, Rocco Maurizio Zagari, Faustine Linda Gningaye Kengni, Franco Bazzoli, Marco Montagnani. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 09.08.2017.
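A minimal sketch of the rater-agreement calculation mentioned above, using scikit-learn's quadratic- (square-) weighted Cohen's kappa; the two label lists are illustrative stand-ins for the experts' post classifications, not the study's data.

from sklearn.metrics import cohen_kappa_score

# Quadratic-weighted kappa between two raters classifying posts as
# negative (0), neutral (1), or positive (2). Labels are illustrative.
expert_cs   = [2, 2, 1, 0, 2, 0, 1, 1, 2, 0]  # computer-science expert
expert_gast = [2, 2, 1, 0, 1, 0, 1, 2, 2, 0]  # gastroenterologist

kappa = cohen_kappa_score(expert_cs, expert_gast, weights="quadratic")
print(f"quadratic-weighted kappa: {kappa:.3f}")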
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zarkesh, Ryan A.; Foster, Michael E.; Ichimura, Andrew S.
The ability to tune the steric envelope through redox events post-synthetically or in tandem with other chemical processes is a powerful tool that could assist in enabling new catalytic methodologies and understanding potential pitfalls in ligand design. The α-diimine ligand, dmp-BIAN, exhibits the peculiar and previously unreported feature of varying steric profiles depending on oxidation state when paired with a main group element. A study of the factors that give rise to this behaviour as well as its impact on the incorporation of other ligands is performed.
A Human-Centered Command and Control (C2) Assessment of an Experimental Campaign Planning Tool
2014-04-01
and control (team without the CPT) groups. The two groups were designed to have an equal number of members; however, one member of the experimental...the researchers to analyze the planning process and outcomes. 3.3 Design and Procedure An experimental versus control group design was implemented...the post-PFnet (figure 16b). Within the PFnets, a concept can be focused on in order to identify how the individual or group is defining or
Teh, Ruth C-A; Visvanathan, Renuka; Ranasinghe, Damith; Wilson, Anne
2018-06-01
To evaluate clinicians' perspectives, before and after clinical implementation (i.e. trial) of a handheld health information technology (HIT) tool, incorporating an iPad device and automatically generated visual cues for bedside display, for falls risk assessment and prevention in hospital. This pilot study utilized mixed-methods research with focus group discussions and Likert-scale surveys to elicit clinicians' attitudes. The study was conducted across three phases within two medical wards of the Queen Elizabeth Hospital. Phase 1 (pretrial) involved focus group discussion (five staff) and surveys (48 staff) to elicit preliminary perspectives on tool use, benefits and barriers to use and recommendations for improvement. Phase 2 (tool trial) involved HIT tool implementation on two hospital wards over consecutive 12-week periods. Phase 3 (post-trial) involved focus group discussion (five staff) and surveys (29 staff) following tool implementation, with similar themes as in Phase 1. Qualitative data were evaluated using content analysis, and quantitative data using descriptive statistics and logistic regression analysis, with subgroup analyses on user status (P ≤ 0.05). Four findings emerged on clinicians' experience, positive perceptions, negative perceptions and recommendations for improvement of the tool. Pretrial, clinicians were familiar with using visual cues in hospital falls prevention. They identified potential benefits of the HIT tool in obtaining timely, useful falls risk assessment to improve patient care. During the trial, the wards differed in methods of tool implementation, resulting in lower uptake by clinicians on the subacute ward. Post-trial, clinicians remained supportive for incorporating the tool into clinical practice; however, there were issues with usability and lack of time for tool use. Staff who had not used the tool had less appreciation for it improving their understanding of patients' falls risk factors (odds ratio 0.12), or effectively preventing hospital falls (odds ratio 0.12). Clinicians' recommendations resulted in subsequent technological refinement of the tool, and provision of an additional iPad device for more efficient use. This study adds to the limited pool of knowledge about clinicians' attitudes toward health technology use in falls avoidance. Clinicians were willing to use the HIT tool, and their concerns about its usability were addressed in ongoing tool improvement. Including end-users in the development and refinement processes, as well as having high staff uptake of new technologies, is important in improving their acceptance and usage, and in maximizing beneficial feedback to further inform tool development.
NASA Astrophysics Data System (ADS)
Nunes, João Pedro; Keizer, Jan Jacob
2017-04-01
Models can be invaluable tools to assess and manage the impacts of forest fires on hydrological and erosion processes. Immediately after fires, models can be used to identify priority areas for post-fire interventions or assess the risks of flooding and downstream contamination. In the long term, models can be used to evaluate the long-term implications of a fire regime for soil protection, surface water quality and potential management risks, or determine how changes to fire regimes, caused e.g. by climate change, can impact soil and water quality. However, several challenges make post-fire modelling particularly difficult:
• Fires change vegetation cover and properties, such as by changing soil water repellency or by adding an ash layer over the soil; these processes, however, are not described in currently used models, so that existing models need to be modified and tested.
• Vegetation and soils recover with time since fire, changing important model parameters, so that the recovery processes themselves also need to be simulated, including the role of post-fire interventions.
• During the window of vegetation and soil disturbance, particular weather conditions, such as the occurrence of severe droughts or extreme rainfall events, can have a large impact on the amount of runoff and erosion produced in burnt areas, so that models that smooth out these peak responses and rather simulate "long-term" average processes are less useful.
• While existing models can simulate reasonably well slope-scale runoff generation and associated sediment losses and their catchment-scale routing, few models can accommodate the role of the ash layer or its transport by overland flow, in spite of its importance for soil fertility losses and downstream contamination.
This presentation will provide an overview of the importance of post-fire hydrological and erosion modelling as well as of the challenges it faces and of recent efforts made to overcome these challenges. It will illustrate these challenges with two examples: probabilistic approaches to simulate the impact of different vegetation regrowth and post-fire climate combinations on runoff and erosion; and model developments for post-fire soil water repellency with different levels of complexity. It will also present an inventory of the current state of the art and propose future research directions, both on post-fire models themselves and on their integration with other models in large-scale water resource assessment and management.
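The probabilistic approach mentioned above can be sketched as a simple Monte Carlo experiment in Python with NumPy; the runoff relation, parameter distributions and threshold below are assumptions made purely for illustration, not a calibrated post-fire model.

import numpy as np

# Sample vegetation-recovery and storm-intensity combinations and propagate
# them through a toy post-fire runoff relation (placeholder, not a real model).
rng = np.random.default_rng(42)
n = 10_000
recovery = rng.uniform(0.0, 1.0, n)            # regrown fraction of pre-fire cover
rainfall = rng.gamma(2.0, 15.0, n)             # event rainfall in mm, assumed dist.

runoff_coeff = 0.6 * (1.0 - recovery) + 0.1    # burnt soil sheds more water
runoff = runoff_coeff * rainfall               # event runoff depth (mm)

# Exceedance probability of a nominal flooding threshold (assumed 20 mm)
print(f"P(runoff > 20 mm) = {np.mean(runoff > 20.0):.2f}")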
Building Energy Model Development for Retrofit Homes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chasar, David; McIlvaine, Janet; Blanchard, Jeremy
2012-09-30
Based on previous research conducted by Pacific Northwest National Laboratory and Florida Solar Energy Center providing technical assistance to implement 22 deep energy retrofits across the nation, 6 homes were selected in Florida and Texas for detailed post-retrofit energy modeling to assess realized energy savings (Chandra et al, 2012). However, assessing realized savings can be difficult for some homes where pre-retrofit occupancy and energy performance are unknown. Initially, savings had been estimated using a HERS Index comparison for these homes. However, this does not account for confounding factors such as occupancy and weather. This research addresses a method to more reliably assess energy savings achieved in deep energy retrofits for which pre-retrofit utility bills or occupancy information is not available. A metered home, Riverdale, was selected as a test case for development of a modeling procedure to account for occupancy and weather factors, potentially creating more accurate estimates of energy savings. This “true up” procedure was developed using Energy Gauge USA software and post-retrofit homeowner information and utility bills. The 12-step process adjusts the post-retrofit modeling results to correlate with post-retrofit utility bills and known occupancy information. The “trued” post-retrofit model is then used to estimate pre-retrofit energy consumption by changing the building efficiency characteristics to reflect the pre-retrofit condition, but keeping all weather and occupancy-related factors the same. This creates a pre-retrofit model that is more comparable to the post-retrofit energy use profile and can improve energy savings estimates. For this test case, a home for which pre- and post-retrofit utility bills were available was selected for comparison and assessment of the accuracy of the “true up” procedure. Based on the current method, this procedure is quite time intensive. However, streamlined processing spreadsheets or incorporation into existing software tools would improve the efficiency of the process. Retrofit activity appears to be gaining market share, and this would be a potentially valuable capability with relevance to marketing, program management, and retrofit success metrics.
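A toy Python sketch of the "true up" idea: a single scaling factor reconciles modeled monthly energy with post-retrofit bills and is then reused with pre-retrofit building characteristics. The numbers and the one-parameter adjustment are purely illustrative of the logic, not the Energy Gauge USA 12-step procedure.

import numpy as np

# Nudge modeled monthly energy toward post-retrofit utility bills, then reuse
# the adjustment with a model reflecting the pre-retrofit envelope.
bills_kwh = np.array([950, 880, 820, 700, 650, 900,
                      1100, 1150, 980, 760, 800, 930])   # post-retrofit bills
model_kwh = np.array([1020, 940, 860, 720, 640, 950,
                      1180, 1200, 1010, 790, 830, 990])  # post-retrofit model

factor = bills_kwh.sum() / model_kwh.sum()    # occupancy/weather adjustment
trued_post = model_kwh * factor

pre_model_kwh = model_kwh * 1.35              # assumed pre-retrofit model output
estimated_savings = (pre_model_kwh * factor - trued_post).sum()
print(f"adjustment factor {factor:.3f}, estimated savings {estimated_savings:.0f} kWh/yr")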
Picosecond and femtosecond lasers for industrial material processing
NASA Astrophysics Data System (ADS)
Mayerhofer, R.; Serbin, J.; Deeg, F. W.
2016-03-01
Cold laser materials processing using ultra short pulsed lasers has become one of the most promising new technologies for high-precision cutting, ablation, drilling and marking of almost all types of material, without causing unwanted thermal damage to the part. These characteristics have opened up new application areas and materials for laser processing, allowing previously impossible features to be created and also reducing the amount of post-processing required to an absolute minimum, saving time and cost. However, short pulse widths are only one part of the story for industrial manufacturing processes, which focus on total costs and maximum productivity and production yield. Like every other production tool, ultra-short pulse lasers have to provide high quality results with maximum reliability. Robustness and global on-site support are vital factors, as well as easy system integration.
Phillips, Nicole M; Kent, Bridie; Colgan, Stephen; Mohebbi, Mohammadreza
2015-01-01
Introduction While the risk of adverse events following surgery has been identified, the impact of nursing care on early detection of these events is not well established. A systematic review of the evidence and an expert consensus study in post-anaesthetic care identified essential criteria for nursing assessment of patient readiness for discharge from the post-anaesthetic care unit (PACU). These criteria were included in a new nursing assessment tool, the Post-Anaesthetic Care Tool (PACT), and incorporated into the post-anaesthetic documentation at a large health service. The aim of this study is to test the clinical reliability of the PACT and evaluate whether the use of PACT will (1) enhance the recognition and response to patients at risk of deterioration in PACU; (2) improve documentation for handover from PACU nurse to ward nurse; (3) result in improved patient outcomes and (4) reduce healthcare costs. Methods and analysis A prospective, non-randomised, pre-implementation and post-implementation design comparing: (1) patients (n=750) who have surgery prior to the implementation of the PACT and (2) patients (n=750) who have surgery after PACT. The study will examine the use of the tool through the observation of patient care and nursing handover. Patient outcomes and cost-effectiveness will be determined from health service data and medical record audit. Descriptive statistics will be used to describe the sample and compare the two patient groups (pre-intervention and post-intervention). Differences in patient outcomes between the two groups will be compared using the Cochran-Mantel-Haenszel test and regression analyses and reported as ORs with the corresponding 95% CIs. Conclusions This study will test the clinical reliability and cost-effectiveness of the PACT. It is hypothesised that the PACT will enable nurses to recognise and respond to patients at risk of deterioration, improve handover to ward nurses, improve patient outcomes, and reduce healthcare costs. PMID:26033942
Ravasini, Francesco; Fornari, Matteo; Bonanini, Mauro
2016-12-01
The use of photogrammetry may be a new method to quantify the amount of artificial dental material removed from the surface of each tooth during the grinding procedure (SG). SG is necessary in each denture to reach a correct occlusion. It consists of refining the surface of the prosthetic teeth using milling tools, with the aim of removing interferences (pre-contacts) between the upper and lower teeth during chewing. This measure is achieved by comparing pre- and post-grinding 3D models. This new application could be of interest for both dentists and dental technicians because it could be used to evaluate, with an accurate numerical description, the action applied to tooth surfaces during the grinding process. Furthermore, the results of the analysis could have some value for the dental industry, since the use of photogrammetry can improve the process and reduce costs during the design of artificial teeth; eventually, this method could also be used as a teaching tool for both dental students and "dental technician" high school students. The purpose of this work is to measure the thickness of the artificial enamel removed during the grinding phases. Usually, the dental technician adjusts the dental plate in the mouth of the patient following the traditional method, without a quantitative evaluation of the material removed. The photogrammetric method (PM) proposed here makes it possible to measure the amount of material removed during the grinding process, again by comparing pre- and post-grinding 3D models. Under the supervision of three teachers (experts in dentures made according to the Gerber method), ten complete denture arrangements (upper and lower arches) made by dental students at the Prosthodontic Department of the University of Parma, Italy, were analyzed with PM before and after SG. The average thickness variation between the pre- and post-grinding dentures is within the range of 0.1-0.4 mm. For the upper arches, the mean value of the SG process is 223 µm, while for the lower arches it is 240 µm. Results show that the greatest grinding in all models appears at the cusps, with values up to 1660 µm. At the fossae, on the other hand, the results show a moderate grinding action, with values around 200-300 µm. Contrary to the guidelines taught to students, cusps undergo a greater grinding process than fossae; consequently, cusps should be revised, at least in their technical and morphological aspects. The average thickness variation between the pre- and post-grinding dentures is within the range of 0.1-0.4 mm, which means an equal loss of vertical dimension. Furthermore, knowledge of the thickness of material removed during SG could be useful for the dental industry, providing important information that could be considered in the design of artificial teeth. The PM implemented in this article has given satisfactory preliminary results, showing good accuracy, low costs and high versatility. It is necessary to highlight that this is an experimental method and that the present analysis is a pilot study that needs further evaluation. Nevertheless, the results obtained could be of some value for medical companies, in order to improve the design of artificial teeth. Moreover, such a method may serve as an educational tool for dental students.
Fast assessment of planar chromatographic layers quality using pulse thermovision method.
Suszyński, Zbigniew; Świta, Robert; Loś, Joanna; Zarzycka, Magdalena B; Kaleniecka, Aleksandra; Zarzycki, Paweł K
2014-12-19
The main goal of this paper is to demonstrate the capability of the pulse thermovision (thermal-wave) methodology for sensitive detection of photothermal non-uniformities within light-scattering and semi-transparent planar stationary phases. Successful visualization of stationary phase defects required signal processing protocols based on wavelet filtration, correlation analysis and k-means 3D segmentation. Such a post-processing data handling approach allows extremely sensitive detection of thickness and structural changes within commercially available planar chromatographic layers. In particular, a number of TLC and HPTLC stationary phases including silica, cellulose, aluminum oxide, polyamide and octadecylsilane, coated with adsorbent layers ranging from 100 to 250 μm, were investigated. The presented detection protocol can be used as an efficient tool for fast screening of the overall heterogeneity of any layered material. Moreover, the described procedure is very fast (a few seconds including acquisition and data processing) and may be applied to the online control of fabrication processes. Apart from planar chromatographic plates, this protocol can be used for the assessment of different planar separation tools such as paper-based analytical devices or micro total analysis systems consisting of organic and non-organic layers. Copyright © 2014 Elsevier B.V. All rights reserved.
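The k-means segmentation step can be sketched in Python with scikit-learn on a synthetic thermal image; the wavelet filtration and correlation analysis that precede it in the described protocol are omitted here, and all parameters and the defect pattern are illustrative assumptions.

import numpy as np
from sklearn.cluster import KMeans

# Cluster pixels of a synthetic thermal image by temperature to separate
# a warmer patch (mimicking a layer defect) from the background.
rng = np.random.default_rng(0)
thermal = rng.normal(30.0, 0.2, size=(120, 160))
thermal[40:80, 60:110] += 1.5                 # warmer patch mimicking a defect

features = thermal.reshape(-1, 1)             # one feature per pixel: temperature
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
segmented = labels.reshape(thermal.shape)
print("pixels per segment:", np.bincount(segmented.ravel()))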
Three Dimensional Transient Turbulent Simulations of Scramjet Fuel Injection and Combustion
NASA Astrophysics Data System (ADS)
Bahbaz, Marwane
2011-11-01
The scramjet is a propulsion system that is more effective for hypersonic flight (M > 5). The main objective of the simulation is to understand both the mixing and the combustion of the air flow with hydrogen fuel in a high-speed environment. The understanding of this phenomenon is used to determine the number of fuel injectors required to increase combustion efficiency and energy transfer. Due to the complexity of this simulation, multiple software tools are used to achieve this objective. First, SolidWorks is used to draw a scramjet combustor with accurate measurements. The second software tool used is Gambit, which is used to make several types of meshes for the scramjet combustor. Finally, OpenFOAM and CFD++ are used to process and post-process the scramjet combustor. At this stage, the simulation is divided into two categories. The cold flow category is a series of simulations that include subsonic and supersonic turbulent air flow across the combustor channel with fuel interaction from one or more injectors. The second category is the combustion simulations, which involve fluid flow and fuel mixing with ignition. The simulation and modeling of the scramjet combustor will help investigate and understand the combustion process and energy transfer in a hypersonic environment.
CMOS-micromachined, two-dimensional transistor arrays for neural recording and stimulation.
Lin, J S; Chang, S R; Chang, C H; Lu, S C; Chen, H
2007-01-01
In-plane microelectrode arrays have proven to be useful tools for studying the connectivities and the functions of neural tissues. However, microelectrode arrays are seldom monolithically integrated with signal-processing circuits, without which the maximum number of electrodes is limited by the compromise between routing complexity and interference. This paper proposes a CMOS-compatible, two-dimensional array of oxide-semiconductor field-effect transistors (OSFETs), capable of both recording and stimulating neuronal activities. The fabrication of the OSFETs requires only a simple die-level, post-CMOS micromachining process and retains the metal layers for monolithic integration with signal-processing circuits. A CMOS microsystem containing the OSFET arrays and gain-programmable recording circuits has been fabricated and tested. The preliminary testing results are presented and discussed.
Functional Electrospun Nanofibrous Scaffolds for Biomedical Applications
Liang, Dehai; Hsiao, Benjamin S.; Chu, Benjamin
2009-01-01
Functional nanofibrous scaffolds produced by electrospinning have great potential in many biomedical applications, such as tissue engineering, wound dressing, enzyme immobilization and drug (gene) delivery. For a specific successful application, the chemical, physical and biological properties of electrospun scaffolds should be adjusted to match the environment by using a combination of multi-component compositions and fabrication techniques where electrospinning has often become a pivotal tool. The property of the nanofibrous scaffold can be further improved with innovative development in electrospinning processes, such as two-component electrospinning and in-situ mixing electrospinning. Post modifications of electrospun membranes also provide effective means to render the electrospun scaffolds with controlled anisotropy and porosity. In this review, we review the materials, techniques and post modification methods to functionalize electrospun nanofibrous scaffolds suitable for biomedical applications. PMID:17884240
Lamy, Francois R.; Daniulaityte, Raminta; Nahhas, Ramzi W.; Barratt, Monica J.; Smith, Alan G.; Sheth, Amit; Martins, Silvia S.; Boyer, Edward W.; Carlson, Robert G.
2017-01-01
Background Synthetic Cannabinoid Receptor Agonists (SCRA), also known as “K2” or “Spice,” have drawn considerable attention due to their potential of abuse and harmful consequences. More research is needed to understand user experiences of SCRA-related effects. We use semiautomated information processing techniques through eDrugTrends platform to examine SCRA-related effects and their variations through a longitudinal content analysis of web-forum data. Method English language posts from three drug-focused web-forums were extracted and analyzed between January 1st 2008 and September 30th 2015. Search terms are based on the Drug Abuse Ontology (DAO) created for this study (189 SCRA-related and 501 effect-related terms). EDrugTrends NLP-based text processing tools were used to extract posts mentioning SCRA and their effects. Generalized linear regression was used to fit restricted cubic spline functions of time to test whether the proportion of drug-related posts that mention SCRA (and no other drug) and the proportion of these “SCRA-only” posts that mention SCRA effects have changed over time, with an adjustment for multiple testing. Results 19,052 SCRA-related posts (Bluelight (n=2,782), Forum A (n=3,882), and Forum B (n=12,388)) posted by 2,543 international users were extracted. The most frequently mentioned effects were “getting high” (44.0%), “hallucinations” (10.8%), and “anxiety” (10.2%). The frequency of SCRA-only posts declined steadily over the study period. The proportions of SCRA-only posts mentioning positive effects (e.g., “High” and “Euphoria”) steadily decreased, while the proportions of SCRA-only posts mentioning negative effects (e.g., “Anxiety,” “Nausea,” “Overdose”) increased over the same period. Conclusion This study's findings indicate that the proportion of negative effects mentioned in web forum posts and linked to SCRA has increased over time, suggesting that recent generations of SCRA generate more harms. This is also one of the first studies to conduct automated content analysis of web forum data related to illicit drug use. PMID:28578250
Integration Process for Payloads in the Fluids and Combustion Facility
NASA Technical Reports Server (NTRS)
Free, James M.; Nall, Marsha M.
2001-01-01
The Fluids and Combustion Facility (FCF) is an ISS research facility located in the United States Laboratory (US Lab), Destiny. The FCF is a multi-discipline facility that performs microgravity research primarily in fluids physics science and combustion science. This facility remains on-orbit and provides accommodations for multi-user and Principal Investigator (PI)-unique hardware. The FCF is designed to accommodate 15 PIs per year. In order to allow for this number of payloads per year, the FCF has developed an end-to-end analytical and physical integration process. The process includes the provision of integration tools, products and interface management throughout the life of the payload. The payload is provided with a single point of contact from the facility and works with that interface from PI selection through post-flight processing. The process utilizes electronic tools for the creation of interface documents/agreements, storage of payload data and rollup for facility submittals to ISS. Additionally, the process provides integration to and testing with flight-like simulators prior to payload delivery to KSC. These simulators allow the payload to test in the flight configuration and perform final facility interface and science verifications. The process also provides for support to the payload from the FCF through the Payload Safety Review Panel (PSRP). Finally, the process includes support in the development of operational products and the operation of the payload on-orbit.
Informed-Proteomics: open-source software package for top-down proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Jungkap; Piehowski, Paul D.; Wilkins, Christopher
Top-down proteomics involves the analysis of intact proteins. This approach is very attractive as it allows for analyzing proteins in their endogenous form without proteolysis, preserving valuable information about post-translational modifications, isoforms, proteolytic processing, or their combinations, collectively called proteoforms. Moreover, the quality of top-down LC-MS/MS datasets is rapidly increasing due to advances in liquid chromatography and mass spectrometry instrumentation and sample processing protocols. However, top-down mass spectra are substantially more complex compared to the more conventional bottom-up data. To take full advantage of the increasing quality of the top-down LC-MS/MS datasets, there is an urgent need to develop algorithms and software tools for confident proteoform identification and quantification. In this study we present a new open source software suite for top-down proteomics analysis consisting of an LC-MS feature finding algorithm, a database search algorithm, and an interactive results viewer. The presented tool, along with several other popular tools, was evaluated using human-in-mouse xenograft luminal and basal breast tumor samples that are known to have significant differences in protein abundance based on bottom-up analysis.
Leo, Antonino; De Luca, Rosario; Russo, Margherita; Naro, Antonino; Bramanti, Placido; Calabrò, Rocco S
2016-01-01
Cognitive impairment after stroke is quite common and can cause important disability with a relevant impact on quality of life. Cognitive rehabilitation (CR) and related assistive technology may improve functional outcomes. A 30-year-old woman came to our research institute for an intensive CR cycle following a right parieto-temporal stroke. Because the patient was in the chronic phase, we decided to use 3 different rehabilitative protocols: (a) traditional cognitive training (TCT), (b) computerized cognitive training (CCT), and (c) CCT combined with transcranial direct stimulation (CCT plus) with a 2-week interval separating each session. Cognitive and language deficits were investigated using an ad-hoc psychometric battery at baseline (T0), post-TCT (T1), post-CCT (T2), and post-CCT plus (T3). Our patient showed the best neuropsychological improvement, with regard to attention processes and language domain, after T3. Our data showed that CCT plus should be considered a promising tool in the treatment of poststroke neuropsychological deficits.
A Diagnostics Tool to detect ensemble forecast system anomaly and guide operational decisions
NASA Astrophysics Data System (ADS)
Park, G. H.; Srivastava, A.; Shrestha, E.; Thiemann, M.; Day, G. N.; Draijer, S.
2017-12-01
The hydrologic community is moving toward using ensemble forecasts to take uncertainty into account during the decision-making process. The New York City Department of Environmental Protection (DEP) implements several types of ensemble forecasts in their decision-making process: ensemble products for a statistical model (Hirsch and enhanced Hirsch); the National Weather Service (NWS) Advanced Hydrologic Prediction Service (AHPS) forecasts based on the classical Ensemble Streamflow Prediction (ESP) technique; and the new NWS Hydrologic Ensemble Forecasting Service (HEFS) forecasts. To remove structural error and apply the forecasts to additional forecast points, the DEP post processes both the AHPS and the HEFS forecasts. These ensemble forecasts provide mass quantities of complex data, and drawing conclusions from these forecasts is time-consuming and difficult. The complexity of these forecasts also makes it difficult to identify system failures resulting from poor data, missing forecasts, and server breakdowns. To address these issues, we developed a diagnostic tool that summarizes ensemble forecasts and provides additional information such as historical forecast statistics, forecast skill, and model forcing statistics. This additional information highlights the key information that enables operators to evaluate the forecast in real-time, dynamically interact with the data, and review additional statistics, if needed, to make better decisions. We used Bokeh, a Python interactive visualization library, and a multi-database management system to create this interactive tool. This tool compiles and stores data into HTML pages that allows operators to readily analyze the data with built-in user interaction features. This paper will present a brief description of the ensemble forecasts, forecast verification results, and the intended applications for the diagnostic tool.
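Since the abstract above names Bokeh, a minimal Bokeh sketch of the kind of diagnostics panel it describes is shown below, plotting synthetic ensemble streamflow traces with their mean so an operator can eyeball spread; the data, file name and labels are placeholders, not the DEP tool's code.

import numpy as np
from bokeh.plotting import figure, output_file, save

# Synthetic 20-member ensemble of 30-day streamflow traces.
rng = np.random.default_rng(1)
days = np.arange(30)
members = np.cumsum(rng.normal(0, 5, size=(20, 30)), axis=1) + 100

p = figure(title="Ensemble streamflow forecast (synthetic)",
           x_axis_label="lead time (days)", y_axis_label="flow (cfs)")
for trace in members:
    p.line(days, trace, line_alpha=0.25)        # individual members, faint
p.line(days, members.mean(axis=0), line_width=3, legend_label="ensemble mean")

output_file("ensemble_diagnostics.html")
save(p)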
MAGIC: A Tool for Combining, Interpolating, and Processing Magnetograms
NASA Technical Reports Server (NTRS)
Allred, Joel
2012-01-01
Transients in the solar coronal magnetic field are ultimately the source of space weather. Models which seek to track the evolution of the coronal field require magnetogram images to be used as boundary conditions. These magnetograms are obtained by numerous instruments with different cadences and resolutions. A tool is required which allows modelers to find all available data and use them to craft accurate and physically consistent boundary conditions for their models. We have developed a software tool, MAGIC (MAGnetogram Interpolation and Composition), to perform exactly this function. MAGIC can manage the acquisition of magnetogram data, cast it into a source-independent format, and then perform the necessary spatial and temporal interpolation to provide magnetic field values as requested onto model-defined grids. MAGIC has the ability to patch magnetograms from different sources together, providing a more complete picture of the Sun's field than is possible from single magnetograms. In doing this, care must be taken so as not to introduce nonphysical current densities along the seam between magnetograms. We have designed a method which minimizes these spurious current densities. MAGIC also includes a number of post-processing tools which can provide additional information to models. For example, MAGIC includes an interface to the DAVE4VM tool which derives surface flow velocities from the time evolution of the surface magnetic field. MAGIC has been developed as an application of the KAMELEON data formatting toolkit which has been developed by the CCMC.
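A rough Python sketch of the kind of spatio-temporal interpolation described above, assuming SciPy: two synthetic magnetograms are blended linearly in time and then sampled onto a model-defined grid with bilinear interpolation. This illustrates the idea only and is not MAGIC's implementation.

import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Two synthetic magnetograms on a latitude/longitude grid, one time step apart.
lat = np.linspace(-90, 90, 181)
lon = np.linspace(0, 359, 360)
b_t0 = np.random.default_rng(0).normal(0, 10, (181, 360))        # field at t0
b_t1 = b_t0 + np.random.default_rng(1).normal(0, 2, (181, 360))  # field at t1

def sample(t_frac, model_lat, model_lon):
    """Linear time blend followed by bilinear spatial interpolation."""
    b_t = (1.0 - t_frac) * b_t0 + t_frac * b_t1
    interp = RegularGridInterpolator((lat, lon), b_t)
    pts = np.stack(np.meshgrid(model_lat, model_lon, indexing="ij"), axis=-1)
    return interp(pts)

grid = sample(0.5, np.linspace(-60, 60, 65), np.linspace(0, 359, 128))
print(grid.shape)   # (65, 128) field values on the model grid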
Measuring Quality in Ethics Consultation.
Bliss, Sally E; Oppenlander, Jane; Dahlke, Jacob M; Meyer, Gordon J; Williford, Eva M; Macauley, Robert C
2016-01-01
For all of the emphasis on quality improvement-as well as the acknowledged overlap between assessment of the quality of healthcare services and clinical ethics-the quality of clinical ethics consultation has received scant attention, especially in terms of empirical measurement. Recognizing this need, the second edition of Core Competencies for Health Care Ethics Consultation1 identified four domains of ethics quality: (1) ethicality, (2) stakeholders' satisfaction, (3) resolution of the presenting conflict/dilemma, and (4) education that translates into knowledge. This study is the first, to our knowledge, to directly measure all of these domains. Here we describe the quality improvement process undertaken at a tertiary care academic medical center, as well as the tools developed to measure the quality of ethics consultation, which include post-consultation satisfaction surveys and weekly case conferences. The information gained through these tools helps to improve not only the process of ethics consultation, but also the measurement and assurance of quality. Copyright 2016 The Journal of Clinical Ethics. All rights reserved.
Meehan, Thomas P; Qazi, Daniel J; Van Hoof, Thomas J; Ho, Shih-Yieh; Eckenrode, Sheila; Spenard, Ann; Pandolfi, Michelle; Johnson, Florence; Quetti, Deborah
2015-08-01
To describe and evaluate the impact of quality improvement (QI) support provided to skilled nursing facilities (SNFs) by a Quality Improvement Organization (QIO). Retrospective, mixed-method, process evaluation of a QI project intended to decrease preventable hospital readmissions from SNFs. Five SNFs in Connecticut. SNF Administrators, Directors of Nursing, Assistant Directors of Nursing, Admissions Coordinators, Registered Nurses, Certified Nursing Assistants, Receptionists, QIO Quality Improvement Consultant. QIO staff provided training and technical assistance to SNF administrative and clinical staff to establish or enhance QI infrastructure and implement an established set of QI tools [Interventions to Reduce Acute Care Transfers (INTERACT) tools]. Baseline SNF demographic, staffing, and hospital readmission data; baseline and follow-up SNF QI structure (QI Committee), processes (general and use of INTERACT tools), and outcome (30-day all-cause hospital readmission rates); details of QIO-provided training and technical assistance; QIO-perceived barriers to quality improvement; SNF leadership-perceived barriers, accomplishments, and suggestions for improvement of QIO support. Success occurred in establishing QI Committees and targeting preventable hospital readmissions, as well as implementing INTERACT tools in all SNFs; however, hospital readmission rates decreased in only 2 facilities. QIO staff and SNF leaders noted the ongoing challenge of engaging already busy SNF staff and leadership in QI activities. SNF leaders reported that they appreciated the training and technical assistance that their institutions received, although most noted that additional support was needed to bring about improvement in readmission rates. This process evaluation documented mixed clinical results but successfully identified opportunities to improve recruitment of and provision of technical support to participating SNFs. Recommendations are offered for others who wish to conduct similar projects. Copyright © 2015 AMDA – The Society for Post-Acute and Long-Term Care Medicine. All rights reserved.
NASA Astrophysics Data System (ADS)
Goodrich, D. C.; Clifford, T. J.; Guertin, D. P.; Sheppard, B. S.; Barlow, J. E.; Korgaonkar, Y.; Burns, I. S.; Unkrich, C. C.
2016-12-01
Wildfire disasters are common throughout the western US. While many feel fire suppression is the largest cost of wildfires, case studies note that rehabilitation costs often equal or greatly exceed suppression costs. Using geospatial data sets and post-fire burn severity products, coupled with the Automated Geospatial Watershed Assessment tool (AGWA - www.tucson.ars.ag.gov/agwa), the Dept. of Interior Burned Area Emergency Response (BAER) teams can rapidly analyze and identify at-risk areas to target rehabilitation efforts. AGWA employs nationally available geospatial elevation, soils, and land cover data to parameterize the KINEROS2 hydrology and erosion model. A pre-fire watershed simulation can be done prior to BAER deployment using design storms. As soon as the satellite-derived Burned Area Reflectance Classification (BARC) map is obtained, a post-fire watershed simulation using the same storm is conducted. The pre- and post-fire simulations can be spatially differenced in the GIS for rapid identification of areas at high risk of erosion or flooding. This difference map is used by BAER teams to prioritize field observations and in turn produce a final burn severity map that is used in AGWA/KINEROS2 simulations to provide report-ready results. The 2013 Elk Wildfire Complex that burned over 52,600 ha east of Boise, Idaho provides a tangible example of how BAER experts combined AGWA and geospatial data to achieve substantial rehabilitation cost savings. The BAER team initially identified approximately 6,500 burned ha for rehabilitation. The team then used the AGWA pre- and post-fire watershed simulation results, accessibility constraints, and land slope conditions in an interactive process to locate burned areas that posed the greatest threat to downstream values-at-risk. The group combined the treatable area, field observations, and the spatial results from AGWA to target seed and mulch treatments that most effectively reduced the threats. Using this process, the BAER team reduced the treatable area from the original 16,000 ha to between 800 and 1,600 ha depending on the selected alternative. The final awarded contract amounted to about $1,480/ha; therefore, a total savings of $7.2-$8.4 million was realized for mulch treatment alone.
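The pre/post-fire differencing step can be illustrated in Python with NumPy arrays standing in for the model's sediment-yield rasters; the synthetic burn factor and the risk threshold below are assumptions, and in practice the rasters would come from the AGWA/KINEROS2 output in a GIS.

import numpy as np

# Subtract simulated pre-fire yield from the post-fire run and flag cells
# whose increase exceeds a nominal threshold.
rng = np.random.default_rng(7)
pre_fire = rng.gamma(2.0, 0.5, size=(200, 200))          # t/ha per design storm
burn_factor = np.where(rng.random((200, 200)) < 0.3, 4.0, 1.2)
post_fire = pre_fire * burn_factor

change = post_fire - pre_fire
at_risk = change > 2.0                                    # assumed threshold, t/ha
print(f"high-risk cells: {at_risk.sum()} of {at_risk.size}")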
NASA Astrophysics Data System (ADS)
Avdelidis, N. P.; Kappatos, V.; Georgoulas, G.; Karvelis, P.; Deli, C. K.; Theodorakeas, P.; Giakas, G.; Tsiokanos, A.; Koui, M.; Jamurtas, A. Z.
2017-04-01
Exercise-induced muscle damage (EIMD) is usually experienced by i) humans who have been physically inactive for prolonged periods of time and then begin sudden training trials and ii) athletes who train beyond their normal limits. EIMD is not easy to detect and quantify with common measurement tools and methods. Thermography has been used successfully as a research detection tool in medicine for the last six decades, but very limited work has been reported in the EIMD area. The main purpose of this research is to assess and characterize EIMD using thermography and image processing techniques. The first step towards that goal is to develop a reliable segmentation technique to isolate the region of interest (ROI). Semi-automatic image processing software was designed, and regions of the left and right legs were segmented based on superpixels. The image is segmented into a number of regions, and the user is able to intervene by indicating which regions belong to each of the two legs. In order to validate the image processing software, an extensive experimental investigation was carried out, acquiring thermographic images of the rectus femoris muscle before, immediately post, and 24, 48 and 72 hours after an acute bout of eccentric exercise (5 sets of 15 maximum repetitions) in males and females (20-30 years old). Results indicate that the semi-automated approach provides an excellent benchmark that can be used as a clinically reliable tool.
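The superpixel oversegmentation step can be sketched in Python with scikit-image's SLIC on a synthetic grayscale thermogram (this assumes scikit-image >= 0.19 for the channel_axis argument, and all parameters are illustrative); in the described workflow a user would then assign superpixels to the left or right leg.

import numpy as np
from skimage.segmentation import slic

# Oversegment a synthetic grayscale thermogram into superpixels.
rng = np.random.default_rng(3)
thermogram = rng.normal(31.0, 0.3, size=(240, 320))
thermogram[:, 60:140] += 2.0        # warmer band mimicking the left leg
thermogram[:, 180:260] += 2.0       # warmer band mimicking the right leg

segments = slic(thermogram, n_segments=150, compactness=0.1, channel_axis=None)
print(f"{np.unique(segments).size} superpixels generated")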
Epoxy matrix with triaromatic mesogenic unit in dielectric spectroscopy observation
NASA Astrophysics Data System (ADS)
Włodarska, Magdalena; Mossety-Leszczak, Beata; Bąk, Grzegorz W.; Kisiel, Maciej; Dłużniewski, Maciej; Okrasa, Lidia
2018-04-01
This paper describes the dielectric response of a selected liquid crystal epoxy monomer (plain and in curing systems) in a wide range of frequency and temperature. The dielectric spectroscopy, thanks to its sensitivity, is a very good tool for studying phase transitions, reaction progress, or material properties. This sensitivity is important in the case of liquid crystal epoxy resins, where properties of the final network depend on the choice of monomers, curing agents, curing conditions and post-curing treatment, or applying an external electric or magnetic field during the reaction. In most of the obtained cured products, the collected dielectric data show two relaxation processes. The α-process is related to a structural reorientation; it can usually be linked with the glass transition and the mechanical properties of the material. The β-process can be identified as a molecular motion process, probably associated with the carboxyl groups in the mesogen. A transient Maxwell-Wagner relaxation observed in one of the compositions after the initial curing is removed by post-curing treatment at elevated temperatures. Post-curing is therefore necessary for obtaining uniformly cured products in those cases. In the investigated systems, the choice of a curing agent can change the glass transition temperature by at least 70 °C. The obtained results are in a good agreement with an earlier study employing other techniques. Finally, we assess the influence of the direction of mesogen alignment on the dielectric properties of one selected system, where a global order was induced by applying an external magnetic field in the course of curing.
2013-04-01
Neuropsychology (AACN). Chicago, Illinois. One of the challenges in assessing the essential neural features of mild TBI in veterans is that... Chicago, Illinois. The tool, preliminarily called the Minnesota Blast Exposure Screening Tool (MN-BEST; see Figure 12), complements current screening...the AACN. Chicago, Illinois. Examination of the number of post-concussive symptoms endorsed by the entire National Guard sample indicates that
Risk management and post-marketing surveillance of CNS drugs.
Henningfield, Jack E; Schuster, Charles R
2009-12-01
Drugs affecting the central nervous system span a broad range of chemical entities, dosage forms, indications, and risks. Unintended consequences include potential abuse and overdose in non-patient drug abusers, deliberate tampering of drug dosage forms, and criminal behavior associated with diversion. Regulators must consider diverse factors to find the appropriate conditions of approval to minimize unintended consequences while enabling a level of access desired by health care providers and patients. This commentary appears as part of a special issue of Drug and Alcohol Dependence that focuses on risk management and post-marketing surveillance and addresses key issues that pose real-world challenges to pharmaceutical sponsors and regulators in particular. For example, in the U.S., Controlled Substances Act drug scheduling can be considered a risk management strategy but its legal authorities and administrative processes are independent from those of risk management (including Risk Evaluation and Mitigation Strategies or REMS); better harmonization of these approaches is vital from drug development and regulatory perspectives. Risk management would ideally be implemented on a strong science foundation demonstrating that the tools employed to mitigate risks and ensure safe use are effective. In reality, research and evaluation of tools in this area is in its infancy and will necessarily be an evolutionary process; furthermore, there is little precedent for linking interventions and program evolution to unintended consequences such as regional outbreaks of abuse and diversion. How such issues are resolved has the potential to stimulate or stifle innovations in drug development and advance or imperil health care.
NASA Technical Reports Server (NTRS)
Clementel, N.; Madura, T. I.; Kruip, C. J. H.; Icke, V.; Gull, T. R.
2014-01-01
Eta Carinae is an ideal astrophysical laboratory for studying massive binary interactions and evolution, and stellar wind-wind collisions. Recent three-dimensional (3D) simulations set the stage for understanding the highly complex 3D flows in Eta Car. Observations of different broad high- and low-ionization forbidden emission lines provide an excellent tool to constrain the orientation of the system, the primary's mass-loss rate, and the ionizing flux of the hot secondary. In this work we present the first steps towards generating synthetic observations to compare with available and future HST/STIS data. We present initial results from full 3D radiative transfer simulations of the interacting winds in Eta Car. We use the SimpleX algorithm to post-process the output from 3D SPH simulations and obtain the ionization fractions of hydrogen and helium assuming three different mass-loss rates for the primary star. The resultant ionization maps of both species constrain the regions where the observed forbidden emission lines can form. Including collisional ionization is necessary to achieve a better description of the ionization states, especially in the areas shielded from the secondary's radiation. We find that reducing the primary's mass-loss rate increases the volume of ionized gas, creating larger areas where the forbidden emission lines can form. We conclude that post processing 3D SPH data with SimpleX is a viable tool to create ionization maps for Eta Car.
NASA Technical Reports Server (NTRS)
Clementel, N.; Madura, T. I.; Kruip, C.J.H.; Icke, V.; Gull, T. R.
2014-01-01
Eta Carinae is an ideal astrophysical laboratory for studying massive binary interactions and evolution, and stellar wind-wind collisions. Recent three-dimensional (3D) simulations set the stage for understanding the highly complex 3D flows in eta Car. Observations of different broad high- and low-ionization forbidden emission lines provide an excellent tool to constrain the orientation of the system, the primary's mass-loss rate, and the ionizing flux of the hot secondary. In this work we present the first steps towards generating synthetic observations to compare with available and future HST/STIS data. We present initial results from full 3D radiative transfer simulations of the interacting winds in eta Car. We use the SimpleX algorithm to post-process the output from 3D SPH simulations and obtain the ionization fractions of hydrogen and helium assuming three different mass-loss rates for the primary star. The resultant ionization maps of both species constrain the regions where the observed forbidden emission lines can form. Including collisional ionization is necessary to achieve a better description of the ionization states, especially in the areas shielded from the secondary's radiation. We find that reducing the primary's mass-loss rate increases the volume of ionized gas, creating larger areas where the forbidden emission lines can form. We conclude that post-processing 3D SPH data with SimpleX is a viable tool to create ionization maps for eta Car.
StagLab: Post-Processing and Visualisation in Geodynamics
NASA Astrophysics Data System (ADS)
Crameri, Fabio
2017-04-01
Despite being simplifications of nature, today's Geodynamic numerical models can, often do, and sometimes have to become very complex. Additionally, a steadily-increasing amount of raw model data results from more elaborate numerical codes and the still continuously-increasing computational power available for their execution. The current need for efficient post-processing and sensible visualisation is thus apparent. StagLab (www.fabiocrameri.ch/software) provides such much-needed strongly-automated post-processing in combination with state-of-the-art visualisation. Written in MATLAB, StagLab is simple, flexible, efficient and reliable. It produces figures and movies that are both fully-reproducible and publication-ready. StagLab's post-processing capabilities include numerous diagnostics for plate tectonics and mantle dynamics. Featured are accurate plate-boundary identification, slab-polarity recognition, plate-bending derivation, mantle-plume detection, and surface-topography component splitting. These and many other diagnostics are derived conveniently from only a few parameter fields thanks to powerful image processing tools and other capable algorithms. Additionally, StagLab aims to prevent scientific visualisation pitfalls that are, unfortunately, still too common in the Geodynamics community. The misinterpretation of raw data and the exclusion of colour-blind readers that come with the continued use of the rainbow (a.k.a. jet) colour scheme are just one, but a dramatic, example (e.g., Rogowitz and Treinish, 1998; Light and Bartlein, 2004; Borland and Taylor, 2007). StagLab is currently optimised for binary StagYY output (e.g., Tackley 2008), but is adjustable for potential use with other Geodynamic codes. Additionally, StagLab's post-processing routines are open-source. REFERENCES Borland, D., and R. M. Taylor II (2007), Rainbow color map (still) considered harmful, IEEE Computer Graphics and Applications, 27(2), 14-17. Light, A., and P. J. Bartlein (2004), The end of the rainbow? Color schemes for improved data graphics, Eos Trans. AGU, 85(40), 385-391. Rogowitz, B. E., and L. A. Treinish (1998), Data visualization: the end of the rainbow, IEEE Spectrum, 35(12), 52-59, doi:10.1109/6.736450. Tackley, P. J. (2008), Modelling compressible mantle convection with large viscosity contrasts in a three-dimensional spherical shell using the yin-yang grid, Physics of the Earth and Planetary Interiors, 171(1-4), 7-18.
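The colour-scheme pitfall called out above is easy to avoid in practice. The following is a minimal matplotlib sketch (not StagLab code, which is written in MATLAB) contrasting the rainbow/jet scheme with a perceptually uniform colormap; the field being plotted is synthetic.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic field standing in for a slice of a geodynamic model.
x, y = np.meshgrid(np.linspace(0, 1, 200), np.linspace(0, 1, 200))
field = np.sin(6 * x) * np.cos(4 * y)

fig, axes = plt.subplots(1, 2, figsize=(9, 4))
for ax, cmap in zip(axes, ["jet", "viridis"]):
    im = ax.imshow(field, cmap=cmap, origin="lower")
    ax.set_title(f"cmap = {cmap}")
    fig.colorbar(im, ax=ax, shrink=0.8)
# 'viridis' is perceptually uniform and remains readable for colour-blind
# viewers, avoiding the artificial banding that 'jet' introduces.
plt.tight_layout()
plt.show()
```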
NASA Astrophysics Data System (ADS)
Kadow, Christopher; Illing, Sebastian; Schartner, Thomas; Ulbrich, Uwe; Cubasch, Ulrich
2017-04-01
Operationalization processes are important for Weather and Climate Services. Complex data and work flows need to be combined quickly to fulfill the needs of service centers. Standards in data and software formats help in automatic solutions. In this study we show a software solution in between hindcasts, forecasts, and validation to be operationalized. Freva (see below) structures data and evaluation procedures and can easily be monitored. Especially in the development process of operationalized services, Freva supports scientists and project partners. The showcase of the decadal climate prediction project MiKlip (fona-miklip.de) shows such a complex development process. Different predictions, scientists' input, tasks, and time-evolving adjustments need to be combined to host precise climate information in a web environment without losing track of its evolution. The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science. Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the meta data information of the self-describing model, reanalysis and observational data sets in a database. This implemented meta data system with its advanced but easy-to-handle search tool supports users, developers and their plugins in retrieving the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. Facilitation of the provision and usage of tools and climate data automatically increases the number of scientists working with the data sets and identifying discrepancies. The integrated webshell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gate to the research project's HPC. Plugins are able to integrate, e.g., their post-processed results into the database of the user. This allows, e.g., post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database. Configurations and results of the tools can be shared among scientists via shell or web system. Therefore, plugged-in tools benefit from transparency and reproducibility. Furthermore, if configurations match while starting an evaluation plugin, the system suggests reusing results already produced by other users - saving CPU/h, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.
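The generic plugin interface and run-history mechanism described above can be pictured with a small sketch. The class and method names below are hypothetical, chosen only to illustrate the pattern of wrapping an arbitrary analysis tool behind a common interface and recording each run for reproducibility; they are not the actual Freva API.

```python
import json
import subprocess
import time
from abc import ABC, abstractmethod

class AnalysisPlugin(ABC):
    """Hypothetical common interface an evaluation-system plugin might expose."""

    name: str = "unnamed"

    @abstractmethod
    def run(self, config: dict) -> dict:
        """Execute the tool and return paths/metadata of its results."""

class ExternalScriptPlugin(AnalysisPlugin):
    """Wraps a tool written in any language by calling it as a subprocess."""

    name = "example-postproc"

    def run(self, config: dict) -> dict:
        cmd = [config["interpreter"], config["script"], config["input_file"]]
        subprocess.run(cmd, check=True)
        return {"output": config["input_file"] + ".out"}

def run_and_record(plugin: AnalysisPlugin, config: dict, history_file: str) -> dict:
    """Run a plugin and append its configuration and result to a history log,
    so identical configurations can later be detected and results reused."""
    result = plugin.run(config)
    record = {"plugin": plugin.name, "config": config,
              "result": result, "timestamp": time.time()}
    with open(history_file, "a") as fh:
        fh.write(json.dumps(record) + "\n")
    return result
```

The design point this illustrates is that the system itself never needs to understand the wrapped tool's language; it only needs the tool's configuration and the record of what was produced.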
Nutritional pyramid for post-gastric bypass patients.
Moizé, Violeta L; Pi-Sunyer, Xavier; Mochari, Heidi; Vidal, Josep
2010-08-01
Life-long nutrition education and diet evaluation are key to the long-term success of surgical treatment of obesity. Diet guidelines provided for bariatric surgery patients generally focus on a progression through dietary stages, from the immediate post-surgical period to 6 months after surgery. However, long-term dietary guidelines for those surgically treated for obesity are not readily available. Therefore, there is a need for dietary recommendations for meal planning and nutritional supplementation for bariatric surgery patients beyond the short-term, post-operative period. The purpose of this paper is to construct an educational tool to provide long-term nutritional and behavioral advice for the post-bariatric patient. The manuscript summarizes the current knowledge on dietary strategies and behaviors associated with beneficial nutritional outcomes in the long term of post-bariatric surgery patients. Dietary and nutritional recommendations are presented in the form of a "bariatric food pyramid" designed to be easily disseminated to patients. The development of educational tools that are easy to understand and follow is essential for effective patient management during the surgery follow-up period. The pyramid can be used as a tool to help both therapists and patients to understand nutrition recommendations and thus promote a healthy long-term post-op dietary pattern based on high-quality protein, balanced with nutrient-dense complex carbohydrates and healthy sources of essential fatty acids.
VESL: The Virtual Earth System Laboratory for Ice Sheet Modeling and Visualization
NASA Astrophysics Data System (ADS)
Cheng, D. L. C.; Larour, E. Y.; Quinn, J. D.; Halkides, D. J.
2016-12-01
We introduce the Virtual Earth System Laboratory (VESL), a scientific modeling and visualization tool delivered through an integrated web portal for dissemination of data, simulation of physical processes, and promotion of climate literacy. The current prototype leverages NASA's Ice Sheet System Model (ISSM), a state-of-the-art polar ice sheet dynamics model developed at the Jet Propulsion Lab and UC Irvine. We utilize the Emscripten source-to-source compiler to convert the C/C++ ISSM engine core to JavaScript, and bundle pre/post-processing JS scripts to be compatible with the existing ISSM Python/MATLAB API. Researchers using VESL will be able to effectively present their work for public dissemination with little-to-no additional post-processing. This will allow for faster publication in peer-reviewed journals and adaptation of results for educational applications. Through future application of this concept to multiple aspects of the Earth System, VESL has the potential to broaden data applications in the geosciences and beyond. At this stage, we seek feedback from the greater scientific and public outreach communities regarding the ease of use and feature set of VESL, as we plan its expansion, and aim to achieve more rapid communication and presentation of scientific results.
Mugisa, Dana J; Katimbo, Abia; Sempiira, John E; Kisaalita, William S
2016-05-01
Sub-Saharan African women on small-acreage farms carry a disproportionately higher labor burden, which is one of the main reasons they are unable to produce for both home and the market and realize higher incomes. Labor-saving interventions such as hand-tools are needed to save time and/or increase productivity in, for example, land preparation for crop and animal agriculture, post-harvest processing, and meeting daily energy and water needs. Development of such tools requires comprehensive and context-specific anthropometric data or body dimensions, and existing databases based on Western women may be less relevant. We conducted measurements on 89 women to provide preliminary results toward answering two questions. First, how well existing databases are applicable in the design of hand-tools for sub-Saharan African women. Second, how universal body dimension predictive models are among ethnic groups. Our results show that body dimensions between Bantu and Nilotic ethnolinguistic groups are different and both are different from American women. These results strongly support the need for establishing anthropometric databases for sub-Saharan African women, toward hand-tool design. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Modular workcells: modern methods for laboratory automation.
Felder, R A
1998-12-01
Laboratory automation is beginning to become an indispensable survival tool for laboratories facing difficult market competition. However, estimates suggest that only 8% of laboratories will be able to afford total laboratory automation systems. Therefore, automation vendors have developed alternative hardware configurations called 'modular automation' to fit the smaller laboratory. Modular automation consists of consolidated analyzers, integrated analyzers, modular workcells, and pre- and post-analytical automation. These terms will be defined in this paper. Using a modular automation model, the automated core laboratory will become a site where laboratory data is evaluated by trained professionals to provide diagnostic information to practising physicians. Modern software information management and process control tools will complement modular hardware. Proper standardization that will allow vendor-independent modular configurations will assure success of this revolutionary new technology.
FESetup: Automating Setup for Alchemical Free Energy Simulations.
Loeffler, Hannes H; Michel, Julien; Woods, Christopher
2015-12-28
FESetup is a new pipeline tool which can be used flexibly within larger workflows. The tool aims to support fast and easy setup of alchemical free energy simulations for molecular simulation packages such as AMBER, GROMACS, Sire, or NAMD. Post-processing methods like MM-PBSA and LIE can be set up as well. Ligands are automatically parametrized with AM1-BCC, and atom mappings for a single topology description are computed with a maximum common substructure search (MCSS) algorithm. An abstract molecular dynamics (MD) engine can be used for equilibration prior to free energy setup or standalone. Currently, all modern AMBER force fields are supported. Ease of use, robustness of the code, and automation where it is feasible are the main development goals. The project follows an open development model, and we welcome contributions.
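The atom-mapping step described here, a maximum common substructure search used to build a single-topology description, can be illustrated with RDKit. This is a hedged sketch of the MCSS idea, not FESetup's internal implementation; the two ligands are arbitrary examples.

```python
from rdkit import Chem
from rdkit.Chem import rdFMCS

# Two arbitrary ligands that differ by a single substituent.
mol_a = Chem.MolFromSmiles("c1ccccc1O")    # phenol
mol_b = Chem.MolFromSmiles("c1ccccc1N")    # aniline

# Maximum common substructure search between the two ligands.
mcs = rdFMCS.FindMCS([mol_a, mol_b])
core = Chem.MolFromSmarts(mcs.smartsString)

# Derive a one-to-one atom mapping over the common core, as a single-topology
# free energy setup requires.
match_a = mol_a.GetSubstructMatch(core)
match_b = mol_b.GetSubstructMatch(core)
atom_mapping = dict(zip(match_a, match_b))
print(f"{mcs.numAtoms} common atoms; mapping: {atom_mapping}")
```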
Implementing an International Consultation on Earth System Research Priorities Using Web 2.0 Tools
NASA Astrophysics Data System (ADS)
Goldfarb, L.; Yang, A.
2009-12-01
Leah Goldfarb, Paul Cutler, Andrew Yang*, Mustapha Mokrane, Jacinta Legg and Deliang Chen. The scientific community has been engaged in developing an international strategy on Earth system research. The initial consultation in this “visioning” process focused on gathering suggestions for Earth system research priorities that are interdisciplinary and address the most pressing societal issues. This was implemented through a website that utilized Web 2.0 capabilities. The website (http://www.icsu-visioning.org/) collected input from 15 July to 1 September 2009. This consultation was the first in which the international scientific community was asked to help shape the future of a research theme. The site attracted over 7000 visitors from 133 countries, more than 1000 of whom registered and took advantage of the site’s functionality to contribute research questions (~300 questions), comment on posts, and/or vote on questions. To facilitate analysis of results, the site captured a small set of voluntary information about each contributor and their contribution. A group of ~50 international experts were invited to analyze the inputs at a “Visioning Earth System Research” meeting held in September 2009. The outcome of this meeting—a prioritized list of research questions to be investigated over the next decade—was then posted on the visioning website for additional comment from the community through an online survey tool. In general, many lessons were learned in the development and implementation of this website, both in terms of the opportunities offered by Web 2.0 capabilities and the application of these capabilities. It is hoped that this process may serve as a model for other scientific communities. The International Council for Science (ICSU) in cooperation with the International Social Science Council (ISSC) is responsible for organizing this Earth system visioning process.
Link Analysis in the Mission Planning Lab
NASA Technical Reports Server (NTRS)
McCarthy, Jessica A.; Cervantes, Benjamin W.; Daugherty, Sarah C.; Arroyo, Felipe; Mago, Divyang
2011-01-01
The legacy communications link analysis software currently used at Wallops Flight Facility involves processes that are different for command destruct, radar, and telemetry. There is a clear advantage to developing an easy-to-use tool that combines all the processes in one application. Link Analysis in the Mission Planning Lab (MPL) uses custom software and algorithms integrated with Analytical Graphics Inc. Satellite Toolkit (AGI STK). The MPL link analysis tool uses pre/post-mission data to conduct a dynamic link analysis between ground assets and the launch vehicle. Just as the legacy methods do, the MPL link analysis tool calculates signal strength and signal-to-noise according to the accepted processes for command destruct, radar, and telemetry assets. Graphs and other custom data are generated rapidly in formats for reports and presentations. STK is used for analysis as well as to depict plume angles and antenna gain patterns in 3D. The MPL has developed two interfaces with the STK software (see figure). The first interface is an HTML utility, which was developed in Visual Basic to enhance analysis for plume modeling and to offer a more user-friendly, flexible tool. A graphical user interface (GUI) written in MATLAB (see figure upper right-hand corner) is also used to quickly depict link budget information for multiple ground assets. This new method yields a dramatic decrease in the time it takes to provide launch managers with the required link budgets to make critical pre-mission decisions. The software code used for these two custom utilities is a product of NASA's MPL.
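A link budget of the kind such a tool computes reduces to a few additive terms in decibels. The sketch below uses the standard free-space path loss formula with made-up numbers; it is not the MPL software, and every parameter value is a placeholder rather than a real asset characteristic.

```python
import math

def free_space_path_loss_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB for a distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

def link_margin_db(eirp_dbw, rx_gain_dbi, distance_km, freq_mhz,
                   misc_losses_db, required_power_dbw):
    """Received power minus the required power; a positive margin closes the link."""
    received = (eirp_dbw + rx_gain_dbi
                - free_space_path_loss_db(distance_km, freq_mhz)
                - misc_losses_db)
    return received - required_power_dbw

# Illustrative telemetry downlink (placeholder values, not real assets):
margin = link_margin_db(eirp_dbw=10.0, rx_gain_dbi=35.0, distance_km=500.0,
                        freq_mhz=2250.0, misc_losses_db=3.0,
                        required_power_dbw=-140.0)
print(f"link margin: {margin:.1f} dB")
```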
How well does the Post-fire Erosion Risk Management Tool (ERMiT) really work?
NASA Astrophysics Data System (ADS)
Robichaud, Peter; Elliot, William; Lewis, Sarah; Miller, Mary Ellen
2016-04-01
The decision of where, when, and how to apply the most effective postfire erosion mitigation treatments requires land managers to assess the risk of damaging runoff and erosion events occurring after a fire. The Erosion Risk Management Tool (ERMiT) was developed to assist post-fire assessment teams in identifying high erosion risk areas and the effectiveness of various mitigation treatments to reduce that risk. ERMiT is a web-based application that uses the Water Erosion Prediction Project (WEPP) technology to estimate erosion, in probabilistic terms, on burned and recovering forest, range, and chaparral lands with and without the application of mitigation treatments. User inputs are processed by ERMiT to combine rain event variability with spatial and temporal variabilities of hillslope burn severity and soil properties, which are then used as WEPP inputs. Since 2007, the model has been used in making hundreds of land management decisions in the US and elsewhere. We use eight published field study sites in the Western US to compare ERMiT predictions to observed hillslope erosion rates. Most sites experienced only a few rainfall events that produced runoff and sediment, except for a California site with a Mediterranean climate. When hillslope erosion occurred, significant correlations were found between the observed hillslope erosion and the ERMiT predictions. Significant correlation also held for most mitigation treatments as well as for the five recovery years. These model validation results suggest reasonable estimates of probabilistic post-fire hillslope sediment delivery when compared to observation.
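A validation of this kind reduces to correlating observed and predicted event sediment yields. The sketch below uses a rank correlation, which tolerates the skewed, zero-heavy distributions typical of post-fire erosion data; the arrays are invented placeholders, not the published field observations.

```python
import numpy as np
from scipy.stats import spearmanr

# Placeholder observed vs. ERMiT-predicted hillslope sediment delivery (Mg/ha).
observed  = np.array([0.0, 0.1, 0.4, 1.2, 3.5, 7.9, 0.0, 0.6])
predicted = np.array([0.1, 0.2, 0.3, 1.0, 2.8, 6.5, 0.2, 0.9])

rho, p_value = spearmanr(observed, predicted)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```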
Peled, Eli; Melamed, Eyal; Portal, Tali Banker; Axelman, Elena; Norman, Doron; Brenner, Benjamin; Nadir, Yona
2016-03-01
Trans-metatarsal operation for diabetic foot necrosis is a common procedure, although only half of the patients avoid a second amputation due to surgical wound ischemia. No current tools are available for early prediction of surgery success, and the clinical decision for a second operation may take weeks. Heparanase protein is involved in inflammation, angiogenesis and coagulation activation. The aim of the study was to evaluate heparanase level and procoagulant activity as an early predictor of success or failure of diabetic foot trans-metatarsal surgery. The study group included 40 patients with diabetic foot necrosis requiring trans-metatarsal surgical intervention. Eighteen patients, designated as the necrotic group, developed post-surgery necrosis at the surgery wound within the first month, requiring a second more proximal amputation. Skin biopsies from the proximal surgery edge were stained for heparanase, tissue factor (TF), TF pathway inhibitor (TFPI) and by hematoxylin and eosin. Plasma samples were drawn pre-surgery and at 1 h, 1 week and 1 month post-surgery. Samples were tested for heparanase levels by ELISA and for TF+heparanase activity, TF activity and heparanase procoagulant activity. Skin biopsy staining did not predict subsequent necrosis. In the non-necrotic group a significant rise in TF+heparanase activity, heparanase activity and heparanase levels was observed 1 h and 1 week post-surgery. The most significant increase was in heparanase procoagulant activity at the time point of 1 h post-surgery (P<0.0001). Pre-surgery TF activity was significantly lower in the non-necrotic group compared to the necrotic group (P<0.05). Measuring heparanase procoagulant activity pre-surgery and 1 h post-surgery could potentially serve as an early tool to predict the procedure success. The present results broaden our understanding regarding early involvement of heparanase in the wound healing process. Copyright © 2016 Elsevier Ltd. All rights reserved.
Assessing the technical efficiency of health posts in rural Guatemala: a data envelopment analysis.
Hernández, Alison R; San Sebastián, Miguel
2014-01-01
Strengthening health service delivery to the rural poor is an important means of redressing inequities. Meso-level managers can help enhance efficiency in the utilization of existing resources through the application of practical tools to analyze routinely collected data reflecting inputs and outputs. This study aimed to assess the efficiency and change in productivity of health posts over two years in a rural department of Guatemala. Data envelopment analysis was used to measure health posts' technical efficiency and productivity change for 2008 and 2009. Input/output data were collected from the regional health office of Alta Verapaz for 34 health posts from the 19 districts comprising the health region. Technical efficiency varied widely across health posts, with mean scores of 0.78 (SD=0.24) and 0.75 (SD=0.21) in 2008 and 2009, respectively. Overall, productivity increased by 4%, though 47% of health posts experienced a decline in productivity. Results were combined on a bivariate plot to identify health posts at the high and low extremes of efficiency, which should be followed up to determine how and why their production processes are operating differently. Assessing efficiency using the data that are available at the meso-level can serve as a first step in strengthening performance. Further work is required to support managers in the routine application of efficiency analysis and putting the results to use in guiding efforts to improve service delivery and increase utilization.
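Data envelopment analysis of this kind can be run directly from an input/output table. Below is a minimal sketch of the input-oriented CCR envelopment model solved as one linear program per health post; the small data matrix is invented for illustration and is not the Alta Verapaz dataset.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(inputs: np.ndarray, outputs: np.ndarray) -> np.ndarray:
    """Input-oriented CCR technical efficiency for each decision-making unit.

    inputs  : (n_units, n_inputs)  matrix, e.g. staff, supplies
    outputs : (n_units, n_outputs) matrix, e.g. consultations, vaccinations
    Solves, for each unit o:  min theta
      s.t.  sum_j lam_j * x_j <= theta * x_o   (inputs)
            sum_j lam_j * y_j >= y_o           (outputs)
            lam_j >= 0
    """
    n_units, n_in = inputs.shape
    n_out = outputs.shape[1]
    scores = np.empty(n_units)
    c = np.r_[1.0, np.zeros(n_units)]            # minimize theta
    for o in range(n_units):
        # Input rows:  -theta * x_o + sum_j lam_j * x_j <= 0
        a_in = np.hstack([-inputs[o].reshape(-1, 1), inputs.T])
        # Output rows: -sum_j lam_j * y_j <= -y_o
        a_out = np.hstack([np.zeros((n_out, 1)), -outputs.T])
        a_ub = np.vstack([a_in, a_out])
        b_ub = np.r_[np.zeros(n_in), -outputs[o]]
        bounds = [(None, None)] + [(0, None)] * n_units
        res = linprog(c, A_ub=a_ub, b_ub=b_ub, bounds=bounds, method="highs")
        scores[o] = res.x[0]
    return scores

# Invented example: 4 health posts, 2 inputs (staff, budget), 1 output (visits).
x = np.array([[2.0, 10.0], [3.0, 12.0], [2.0, 15.0], [4.0, 20.0]])
y = np.array([[400.0], [450.0], [380.0], [700.0]])
print(ccr_input_efficiency(x, y).round(2))
```

A score of 1 marks a post on the efficient frontier; lower scores give the proportional input reduction that an efficient peer combination would need to produce the same outputs.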
Ovesný, Martin; Křížek, Pavel; Borkovec, Josef; Švindrych, Zdeněk; Hagen, Guy M.
2014-01-01
Summary: ThunderSTORM is an open-source, interactive and modular plug-in for ImageJ designed for automated processing, analysis and visualization of data acquired by single-molecule localization microscopy methods such as photo-activated localization microscopy and stochastic optical reconstruction microscopy. ThunderSTORM offers an extensive collection of processing and post-processing methods so that users can easily adapt the process of analysis to their data. ThunderSTORM also offers a set of tools for creation of simulated data and quantitative performance evaluation of localization algorithms using Monte Carlo simulations. Availability and implementation: ThunderSTORM and the online documentation are both freely accessible at https://code.google.com/p/thunder-storm/ Contact: guy.hagen@lf1.cuni.cz Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24771516
Xie, Hongbo; Vucetic, Slobodan; Iakoucheva, Lilia M; Oldfield, Christopher J; Dunker, A Keith; Obradovic, Zoran; Uversky, Vladimir N
2007-05-01
Currently, the understanding of the relationships between function, amino acid sequence, and protein structure continues to represent one of the major challenges of modern protein science. As many as 50% of eukaryotic proteins are likely to contain functionally important long disordered regions. Many proteins are wholly disordered but still possess numerous biologically important functions. However, the number of experimentally confirmed disordered proteins with known biological functions is substantially smaller than their actual number in nature. Therefore, there is a crucial need for novel bioinformatics approaches that allow projection of the current knowledge from a few experimentally verified examples to much larger groups of known and potential proteins. The elaboration of a bioinformatics tool for the analysis of functional diversity of intrinsically disordered proteins and application of this data mining tool to >200 000 proteins from the Swiss-Prot database, each annotated with at least one of the 875 functional keywords, was described in the first paper of this series (Xie, H.; Vucetic, S.; Iakoucheva, L. M.; Oldfield, C. J.; Dunker, A. K.; Obradovic, Z.; Uversky, V. N. Functional anthology of intrinsic disorder. 1. Biological processes and functions of proteins with long disordered regions. J. Proteome Res. 2007, 5, 1882-1898). Using this tool, we have found that out of the 710 Swiss-Prot functional keywords associated with at least 20 proteins, 262 were strongly positively correlated with long intrinsically disordered regions, and 302 were strongly negatively correlated. Illustrative examples of functional disorder or order were found for the vast majority of keywords showing strongest positive or negative correlation with intrinsic disorder, respectively. Some 80 Swiss-Prot keywords associated with disorder- and order-driven biological processes and protein functions were described in the first paper (see above). The second paper of the series was devoted to the presentation of 87 Swiss-Prot keywords attributed to the cellular components, domains, technical terms, developmental processes, and coding sequence diversities possessing strong positive and negative correlation with long disordered regions (Vucetic, S.; Xie, H.; Iakoucheva, L. M.; Oldfield, C. J.; Dunker, A. K.; Obradovic, Z.; Uversky, V. N. Functional anthology of intrinsic disorder. 2. Cellular components, domains, technical terms, developmental processes, and coding sequence diversities correlated with long disordered regions. J. Proteome Res. 2007, 5, 1899-1916). Protein structure and functionality can be modulated by various post-translational modifications and/or as a result of binding of specific ligands. Numerous human diseases are associated with protein misfolding/misassembly/misfunctioning. This work concludes the series of papers dedicated to the functional anthology of intrinsic disorder and describes approximately 80 Swiss-Prot functional keywords that are related to ligands, post-translational modifications, and diseases possessing strong positive or negative correlation with the predicted long disordered regions in proteins.
Performance criteria and quality indicators for the post-analytical phase.
Sciacovelli, Laura; Aita, Ada; Padoan, Andrea; Pelloso, Michela; Antonelli, Giorgia; Piva, Elisa; Chiozza, Maria Laura; Plebani, Mario
2016-07-01
Quality indicators (QIs) used as performance measurements are an effective tool in accurately estimating quality, identifying problems that may need to be addressed, and monitoring the processes over time. In Laboratory Medicine, QIs should cover all steps of the testing process, as error studies have confirmed that most errors occur in the pre- and post-analytical phase of testing. The aim of the present study is to provide preliminary results on QIs and related performance criteria in the post-analytical phase. This work was conducted according to a previously described study design based on the voluntary participation of clinical laboratories in the project on QIs of the Working Group "Laboratory Errors and Patient Safety" (WG-LEPS) of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC). Overall, data collected highlighted an improvement or stability in performances over time for all reported indicators, thus demonstrating that the use of QIs is effective in the quality improvement strategy. Moreover, QIs data are an important source for defining the state-of-the-art concerning the error rate in the total testing process. The definition of performance specifications based on the state-of-the-art, as suggested by consensus documents, is a valuable benchmark point in evaluating the performance of each laboratory. Laboratory tests play a relevant role in the monitoring and evaluation of the efficacy of patient outcomes, thus assisting clinicians in decision-making. Laboratory performance evaluation is therefore crucial to providing patients with safe, effective and efficient care.
Predictable turn-around time for post tape-out flow
NASA Astrophysics Data System (ADS)
Endo, Toshikazu; Park, Minyoung; Ghosh, Pradiptya
2012-03-01
A typical post tape-out flow data path at an IC fabrication facility has the following major components of software-based processing - Boolean operations before the application of resolution enhancement techniques (RET) and optical proximity correction (OPC), the RET and OPC step [etch retargeting, sub-resolution assist feature insertion (SRAF) and OPC], post-OPC/RET Boolean operations and, sometimes in the same flow, simulation-based verification. There are two objectives that an IC fabrication tapeout flow manager wants to achieve with the flow - predictable completion time and fastest turn-around time (TAT). At times they may be competing. There have been studies in the literature modeling the turnaround time from historical data for runs with the same recipe and later using that to derive the resource allocation for subsequent runs [3]. This approach is more feasible in predominantly simulation-dominated tools, but for an edge-operation-dominated flow it may not be possible, especially if processing acceleration methods like pattern matching or hierarchical processing are involved. In this paper, we suggest an alternative method of providing a target turnaround time and managing the priority of jobs while not doing any upfront resource modeling and resource planning. The methodology then systematically either meets the turnaround time target or lets the user know as early as possible that it will not. This builds on top of the Calibre Cluster Management (CalCM) resource management work previously published [1][2]. The paper describes the initial demonstration of the concept.
Integrating Lean Exploration Loops Into Healthcare Facility Design.
Johnson, Kendra; M Mazur, Lukasz; Chadwick, Janet; Pooya, Pegah; Amos, Alison; McCreery, John
2017-04-01
To explore how Lean can add value during the schematic phase of design through providing additional resources and support to project leadership and the architectural design team. This case study-based research took place at one large academic hospital during design efforts for a surgical tower to house 19 operating rooms (ORs) and support spaces including pre- and post-op, central processing and distribution, and materials management. Surgical services project leadership asked for Lean practitioners' support during the design process. Lean Exploration Loops (LELs) were conducted to generate evidence to support stakeholders as they made important decisions about the new building design. The analyses conducted during LELs in the schematic phase were primarily carried out using express workouts (EWOs) and were focused on the flow of patients, staff, and family throughout the pavilion. LELs resulted in recommendations for key design features (e.g., number of pre- and post-op bays per OR floor, location of doors, scrub sinks, stretcher alcoves, equipment storage, and sterile core areas). Two-sided pre- and post-op bays with an inner clinical workspace and an outer patient transport corridor were recommended. A communicating elevator and a centrally located stairwell for staff, to alleviate stress on the main bank of elevators at peak usage times, were also suggested. We found Lean tools and methods to be of most value during the schematic phase when focused on detailed process and layout analysis, while acknowledging the usefulness of focused EWOs to generate the evidence needed for decision-making.
pySeismicDQA: open source post experiment data quality assessment and processing
NASA Astrophysics Data System (ADS)
Polkowski, Marcin
2017-04-01
pySeismicDQA (Seismic Data Quality Assessment) is a Python-based, open-source set of tools dedicated to data processing after passive seismic experiments. The primary goal of this toolset is the unification of data types and formats from different dataloggers necessary for further processing. This process requires additional data checks for errors, equipment malfunction, data format errors, abnormal noise levels, etc. In all such cases the user needs to decide (manually or by automatic threshold) whether data is removed from the output dataset. Additionally, the output dataset can be visualized in the form of a website with data availability charts and waveform visualization with an (external) earthquake catalog. Data processing can be extended with simple STA/LTA event detection. pySeismicDQA is designed and tested for two passive seismic experiments in central Europe: PASSEQ 2006-2008 and "13 BB Star" (2013-2016). National Science Centre Poland provided financial support for this work via NCN grant DEC-2011/02/A/ST10/00284.
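The STA/LTA detector mentioned above is simply a ratio of short-term to long-term average signal energy. The following is a minimal NumPy sketch of that idea, not the pySeismicDQA implementation; the window lengths, trigger level and synthetic trace are illustrative.

```python
import numpy as np

def sta_lta(trace: np.ndarray, nsta: int, nlta: int) -> np.ndarray:
    """Classic STA/LTA characteristic function on a 1-D seismic trace.

    nsta, nlta : short- and long-term window lengths in samples.
    Returns the STA/LTA ratio; values well above ~3 typically mark events.
    """
    energy = trace.astype(float) ** 2
    csum = np.cumsum(energy)
    sta = np.zeros_like(energy)
    lta = np.ones_like(energy)
    sta[nsta:] = (csum[nsta:] - csum[:-nsta]) / nsta
    lta[nlta:] = (csum[nlta:] - csum[:-nlta]) / nlta
    ratio = np.zeros_like(energy)
    valid = lta > 0
    ratio[valid] = sta[valid] / lta[valid]
    ratio[:nlta] = 0.0          # ignore the warm-up portion of the trace
    return ratio

# Synthetic trace: background noise with a short burst standing in for an event.
rng = np.random.default_rng(0)
trace = rng.normal(0, 1, 6000)
trace[3000:3200] += rng.normal(0, 8, 200)
cf = sta_lta(trace, nsta=50, nlta=1000)
print("max STA/LTA:", round(cf.max(), 1), "at sample", int(cf.argmax()))
```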
77 FR 72830 - Request for Comments on Request for Continued Examination (RCE) Practice
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-06
... the submission of written comments using a Web-based collaboration tool called IdeaScale[supreg]; and... collaboration tool called IdeaScale[supreg]. The tool allows users to post comments on a topic, and view and...
Post-disaster housing reconstruction: Perspectives of the NGO and local authorities on delay issues
NASA Astrophysics Data System (ADS)
Khalid, Khairin Norhashidah; Nifa, Faizatul Akmar Abdul; Ismail, Risyawati Mohamed; Lin, Chong Khai
2016-08-01
Post-disaster reconstruction is complex, dynamic and chaotic in nature and as such presents many challenges, because it is unlike normal construction. The time scale of reconstruction is shorter than that of normal construction, yet it often deals with uncertainties, and the scale of the construction activities required is relatively high. After a disaster impacts a country, many governments, institutions and aid organizations cooperate and become involved in the reconstruction process. This is seen as a tool for applying policies and programs designed to remedy the weaknesses in developmental policies, infrastructure and institutional arrangements. This paper reports part of an on-going research project on post-disaster housing reconstruction in Malaysia. An extensive literature review and pilot interviews were undertaken to establish the factors that contribute to delay in post-disaster reconstruction projects. Accordingly, this paper takes the perspective of recovery from the non-government organization (NGO) and local authorities, which act as providers of social services, builders of infrastructure, regulators of economic activity and managers of the natural environment. As a result, it is important to understand how those decisions are made, who is involved in the decision-making, and what the consequences of those decisions are.
NASA Technical Reports Server (NTRS)
Waters, Eric D.
2013-01-01
Recent high-level interest in the capability of small launch vehicles has placed significant demand on determining the trade space these vehicles occupy. This has led to the development of a zero-level analysis tool that can quickly determine the minimum expected vehicle gross liftoff weight (GLOW) in terms of vehicle stage specific impulse (Isp) and propellant mass fraction (pmf) for any given payload value. Drawing on extensive Earth-to-orbit trajectory experience, the total delta-v the vehicle must achieve can be estimated, including relevant loss terms. This foresight into expected losses allows for more specific assumptions relating to the initial estimates of thrust-to-weight values for each stage. This tool was further validated against a trajectory model, in this case the Program to Optimize Simulated Trajectories (POST), to determine if the initial sizing delta-v was adequate to meet payload expectations. Presented here is a description of how the tool is set up and the approach the analyst must take when using the tool. Also displayed are the expected outputs, which depend on the type of small launch vehicle being sized. The method of validation will be discussed, as well as where the sizing tool fits into the vehicle design process.
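The core of a zero-level sizing estimate of this kind is the rocket equation applied stage by stage. The sketch below, working from the top stage downward, is a generic illustration of how GLOW follows from payload, per-stage delta-v split, Isp and propellant mass fraction; it is not the tool described here, and all numbers are placeholders.

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def stage_mass(payload_kg: float, dv: float, isp: float, pmf: float) -> float:
    """Stage mass (propellant + dry) needed to give `payload_kg` a delta-v `dv`.

    From the rocket equation with mass ratio MR = exp(dv / (g0 * Isp)) and
    propellant mass fraction pmf = m_prop / m_stage:
        m_stage = payload * (MR - 1) / (1 - MR * (1 - pmf))
    """
    mr = math.exp(dv / (G0 * isp))
    denom = 1.0 - mr * (1.0 - pmf)
    if denom <= 0.0:
        raise ValueError("stage cannot achieve this delta-v with the given pmf/Isp")
    return payload_kg * (mr - 1.0) / denom

def glow(payload_kg: float, stages: list) -> float:
    """Gross liftoff mass for a stack described top-down as (dv, Isp, pmf) tuples."""
    mass_above = payload_kg
    for dv, isp, pmf in stages:
        mass_above += stage_mass(mass_above, dv, isp, pmf)
    return mass_above

# Placeholder two-stage vehicle: 9.5 km/s total ideal delta-v including losses.
stages_top_down = [(4500.0, 320.0, 0.88), (5000.0, 290.0, 0.90)]
print(f"GLOW for a 50 kg payload: {glow(50.0, stages_top_down):.0f} kg")
```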
Hart, Tae L; Blacker, Susan; Panjwani, Aliza; Torbit, Lindsey; Evans, Michael
2015-03-01
To create informational tools for breast cancer patients with low levels of health literacy. Tools were developed through a three-stage process. (1) Focus groups were conducted with breast cancer survivors and interviews were held with health educators to determine content, source of information, format and medium of the tools. (2) Based on this feedback, a suite of tools was developed. (3) Focus groups were reconvened and health educators re-interviewed to obtain feedback and determine satisfaction. We developed a suite of five informational tools using low health literacy principles, which focused on learning about breast cancer resources and learning about the members of one's healthcare team, understanding the "journey" or trajectory of care beginning at diagnosis, hearing from other breast cancer patients about their own journey, and becoming informed about what to expect pre-and post-surgery for breast cancer. The final products were rated highly by breast cancer survivors. The developed materials, designed for patients who read below an 8th grade level, reflect the informational needs reported by breast cancer patients. Healthcare providers must consider utilizing design principles and theories of adult learning appropriate for those with low health literacy. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
Availability of Ada and C++ Compilers, Tools, Education and Training
1991-07-01
executable mini-specs, to support import of existing code. Automated database population/change propagation. 9. Documentation generation: via FrameMaker. 10...formats. 12. Links to other tools: i. Atherton's Software Backplane. ii. 4GLS iii. Interleaf and FrameMaker publishing. 13. Output formats: PostScript...by end. 11. Output formats: ASCII, PostScript, Interleaf, HPGL, Troff, nroff, FrameMaker, WordPerfect. 12. User interface: Menu and mouse
Utilization of curve offsets in additive manufacturing
NASA Astrophysics Data System (ADS)
Haseltalab, Vahid; Yaman, Ulas; Dolen, Melik
2018-05-01
Curve offsets are utilized in different fields of engineering and science. Additive manufacturing, which has lately become an explicit requirement in the manufacturing industry, utilizes curve offsets widely. One use of offsetting is scaling, which is required if there is shrinkage after fabrication or if the surface quality of the resulting part is unacceptable; therefore, some post-processing is indispensable. But the major application of curve offsets in additive manufacturing processes is for generating head trajectories. In a point-wise AM process, a correct tool-path in each layer can substantially reduce costs and increase the surface quality of the fabricated parts. In this study, different curve offset generation algorithms are analyzed to show their capabilities and disadvantages through some test cases, and improvements on their drawbacks are suggested.
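Generating inward offsets of a layer contour, as a head trajectory requires, can be sketched with Shapely's buffer operation. This is only one way to compute polygon offsets and is not one of the algorithms analyzed in the paper; the square contour, offset distance and assumption of a simply connected slice are arbitrary.

```python
from shapely.geometry import Polygon

# Arbitrary layer contour: a 20 x 20 square slice outline.
contour = Polygon([(0, 0), (20, 0), (20, 20), (0, 20)])

# Successive inward offsets of the contour, e.g. one per extrusion pass.
nozzle_width = 0.8
paths = []
current = contour
while not current.is_empty:
    paths.append(list(current.exterior.coords))            # tool-path for this pass
    current = current.buffer(-nozzle_width, join_style=2)  # mitred inward offset

print(f"{len(paths)} concentric passes; innermost has {len(paths[-1])} vertices")
```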
Processing Optimization of Deformed Plain Woven Thermoplastic Composites
NASA Astrophysics Data System (ADS)
Smith, John R.; Vaidya, Uday K.
2013-12-01
This research addresses the processing optimization of post-manufactured, plain weave architecture composite panels consisting of four glass layers and thermoplastic polyurethane (TPU) when formed with only localized heating. Oftentimes, during the production of deep-drawn composite parts, a fabric preform experiences various defects, including non-isothermal heating and thickness variations. Minimizing these defects is of utmost importance for mass producibility in a practical manufacturing process. The broad objective of this research was to implement a design of experiments approach to minimize through-thickness composite panel variation during manufacturing by varying the heating time, the temperature of heated components and the clamping pressure. It was concluded that the heated tooling with the least contact area was most influential, followed by the length of heating time and the amount of clamping pressure.
NASA Technical Reports Server (NTRS)
Aldcroft, T.; Karovska, M.; Cresitello-Dittmar, M.; Cameron, R.
2000-01-01
The aspect system of the Chandra Observatory plays a key role in realizing the full potential of Chandra's x-ray optics and detectors. To achieve the highest spatial and spectral resolution (for grating observations), an accurate post-facto time history of the spacecraft attitude and internal alignment is needed. The CXC has developed a suite of tools which process sensor data from the aspect camera assembly and gyroscopes, and produce the spacecraft aspect solution. In this poster, the design of the aspect pipeline software is briefly described, followed by details of aspect system performance during the first eight months of flight. The two key metrics of aspect performance are: image reconstruction accuracy, which measures the x-ray image blurring introduced by aspect; and celestial location, which is the accuracy of detected source positions in absolute sky coordinates.
Weidner, Christopher; Fischer, Cornelius; Sauer, Sascha
2014-12-01
We introduce PHOXTRACK (PHOsphosite-X-TRacing Analysis of Causal Kinases), a user-friendly freely available software tool for analyzing large datasets of post-translational modifications of proteins, such as phosphorylation, which are commonly gained by mass spectrometry detection. In contrast to other currently applied data analysis approaches, PHOXTRACK uses full sets of quantitative proteomics data and applies non-parametric statistics to calculate whether defined kinase-specific sets of phosphosite sequences indicate statistically significant concordant differences between various biological conditions. PHOXTRACK is an efficient tool for extracting post-translational information of comprehensive proteomics datasets to decipher key regulatory proteins and to infer biologically relevant molecular pathways. PHOXTRACK will be maintained over the next years and is freely available as an online tool for non-commercial use at http://phoxtrack.molgen.mpg.de. Users will also find a tutorial at this Web site and can additionally give feedback at https://groups.google.com/d/forum/phoxtrack-discuss. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Tools and Techniques for Basin-Scale Climate Change Assessment
NASA Astrophysics Data System (ADS)
Zagona, E.; Rajagopalan, B.; Oakley, W.; Wilson, N.; Weinstein, P.; Verdin, A.; Jerla, C.; Prairie, J. R.
2012-12-01
The Department of Interior's WaterSMART Program seeks to secure and stretch water supplies to benefit future generations and identify adaptive measures to address climate change. Under WaterSMART, Basin Studies are comprehensive water studies to explore options for meeting projected imbalances in water supply and demand in specific basins. Such studies could be most beneficial with application of recent scientific advances in climate projections, stochastic simulation, operational modeling and robust decision-making, as well as computational techniques to organize and analyze many alternatives. A new integrated set of tools and techniques to facilitate these studies includes the following components: Future supply scenarios are produced by the Hydrology Simulator, which uses non-parametric K-nearest neighbor resampling techniques to generate ensembles of hydrologic traces based on historical data, optionally conditioned on long paleo-reconstructed data using various Markov Chain techniques. Resampling can also be conditioned on climate change projections from, e.g., downscaled GCM projections to capture increased variability; spatial and temporal disaggregation is also provided. The simulations produced are ensembles of hydrologic inputs to the RiverWare operations/infrastructure decision modeling software. Alternative demand scenarios can be produced with the Demand Input Tool (DIT), an Excel-based tool that allows modifying future demands by groups such as states; sectors, e.g., agriculture, municipal, energy; and hydrologic basins. The demands can be scaled at future dates or changes ramped over specified time periods. Resulting data is imported directly into the decision model. Different model files can represent infrastructure alternatives and different Policy Sets represent alternative operating policies, including options for noticing when conditions point to unacceptable vulnerabilities, which trigger dynamically executing changes in operations or other options. The over-arching Study Manager provides a graphical tool to create combinations of future supply scenarios, demand scenarios, infrastructure and operating policy alternatives; each scenario is executed as an ensemble of RiverWare runs, driven by the hydrologic supply. The Study Manager sets up and manages multiple executions on multi-core hardware. The sizeable results are typically direct model outputs, or post-processed indicators of performance based on model outputs. Post-processing statistical analysis of the outputs is possible using the Graphical Policy Analysis Tool or other statistical packages. Several Basin Studies undertaken have used RiverWare to evaluate future scenarios. The Colorado River Basin Study, the most complex and extensive to date, has taken advantage of these tools and techniques to generate supply scenarios, produce alternative demand scenarios and to set up and execute the many combinations of supplies, demands, policies, and infrastructure alternatives. The tools and techniques will be described with example applications.
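The K-nearest-neighbor resampling used for supply scenarios can be pictured compactly: given the current year's flow, find the K most similar historical years and sample one of their successors, with closer neighbors weighted more heavily. The sketch below is a generic illustration of that technique (with the usual 1/rank weighting), not the Hydrology Simulator code; the flow record is synthetic.

```python
import numpy as np

def knn_resample(history: np.ndarray, n_years: int, k: int = 5,
                 seed: int = 0) -> np.ndarray:
    """Generate a synthetic annual flow trace by K-nearest-neighbor resampling.

    At each step the K historical years closest to the current flow are found
    and one of their successors is sampled with weight proportional to 1/rank.
    """
    rng = np.random.default_rng(seed)
    weights = 1.0 / np.arange(1, k + 1)
    weights /= weights.sum()
    trace = [history[rng.integers(len(history) - 1)]]
    for _ in range(n_years - 1):
        # Distance from the current value to every historical year but the last
        # (the last year has no observed successor).
        dist = np.abs(history[:-1] - trace[-1])
        neighbors = np.argsort(dist)[:k]          # nearest first, matching weights
        picked = rng.choice(neighbors, p=weights)
        trace.append(history[picked + 1])         # successor of the chosen neighbor
    return np.array(trace)

# Synthetic 60-year annual flow record standing in for observed hydrology.
rng = np.random.default_rng(42)
flows = rng.lognormal(mean=7.0, sigma=0.3, size=60)
ensemble = np.array([knn_resample(flows, n_years=30, seed=s) for s in range(100)])
print("ensemble mean flow:", ensemble.mean().round(1))
```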
Instructor Debrief Training in SPOT
NASA Technical Reports Server (NTRS)
Martin, Lynne; Orasanu, Judith; Villeda, Eric; Conners, Mary M. (Technical Monitor)
2002-01-01
One way to enhance the effectiveness of Special Purpose Operational Training (SPOT) debriefing sessions may be for instructors to make explicit connections between the Crew Resource Management (CRM) concepts a carrier advocates and the behaviors displayed by the crew in question. A tool listing key behaviors from the scenario was devised, accompanied by an instructors' training session in which links were made between the behaviors and the underlying CRM processes they reflect. The aim of the tool is to assist instructors to focus the debriefing on the key SPOT/CRM issues, in this case on planning. A second tool suggested ways to facilitate the discussion. Fourteen instructors at a major U.S. carrier took part in the training session and used the toolkit in their subsequent debriefs. Pre- and post-training debriefing samples from each instructor were compared to assess whether there were any changes in instructors' approaches to discussions in terms of the topics they covered and how they raised the points.
CANARY Risk Management of Adenocarcinoma: The Future of Imaging?
Foley, Finbar; Rajagopalan, Srinivasan; Raghunath, Sushravya M; Boland, Jennifer M; Karwoski, Ronald A.; Maldonado, Fabien; Bartholmai, Brian J; Peikert, Tobias
2016-01-01
Increased clinical utilization of chest high-resolution computed tomography results in increased identification of lung adenocarcinomas and persistent sub-solid opacities. However, these lesions range from very indolent to extremely aggressive tumors. Clinically relevant diagnostic tools to non-invasively risk stratify and guide individualized management of these lesions are lacking. Research efforts investigating semi-quantitative measures to decrease inter- and intra-rater variability are emerging, and in some cases steps have been taken to automate this process. However, many such methods currently are still sub-optimal, require validation and are not yet clinically applicable. The Computer-Aided Nodule Assessment and Risk Yield (CANARY) software application represents a validated tool for automated, quantitative, non-invasive risk stratification of adenocarcinoma lung nodules. CANARY correlates well with consensus histology and post-surgical patient outcomes and therefore may help to guide individualized patient management, e.g. in the identification of nodules amenable to radiological surveillance or in need of adjunctive therapy. PMID:27568149
Gillespie, Mary; Shackell, Eileen
2017-11-01
In nursing education, physiological concepts are typically presented within a body 'systems' framework yet learners are often challenged to apply this knowledge in the holistic and functional manner needed for effective clinical decision-making and safe patient care. A nursing faculty addressed this learning challenge by developing an advanced organizer as a conceptual and integrative learning tool to support learners in diverse learning environments and practice settings. A mixed methods research study was conducted that explored the effectiveness of the Oxygen Supply and Demand Framework as a learning tool in undergraduate nursing education. A pretest/post-test assessment and reflective journal were used to gather data. Findings indicated the Oxygen Supply and Demand Framework guided the development of pattern recognition and thinking processes and supported knowledge development, knowledge application and clinical decision-making. The Oxygen Supply and Demand Framework supports undergraduate students learning to provide safe and effective nursing care. Copyright © 2017 Elsevier Ltd. All rights reserved.
Best conditions for biodegradation of diesel oil by chemometric tools
Kaczorek, Ewa; Bielicka-Daszkiewicz, Katarzyna; Héberger, Károly; Kemény, Sándor; Olszanowski, Andrzej; Voelkel, Adam
2014-01-01
Diesel oil biodegradation by different bacteria-yeast-rhamnolipid consortia was tested. Chromatographic analysis of the post-biodegradation residue was completed with chemometric tools (ANOVA, and a novel ranking procedure based on the sum of ranking differences). These tools were used in the selection of the most effective systems. The best biodegradation of the aliphatic fractions of diesel oil was observed for yeast consortia with Aeromonas hydrophila KR4. For these systems a positive effect of rhamnolipids on hydrocarbon biodegradation was observed. However, rhamnolipid addition did not always have a positive influence on the biodegradation process (e.g. in the case of yeast consortia with Stenotrophomonas maltophilia KR7). Moreover, distinct differences in the degradation pattern were observed for alkanes lighter and heavier than C22. Generally, the best conditions for “lower” alkanes were Aeromonas hydrophila KR4 + emulsifier, independently of the yeast used, and, e.g., Pseudomonas stutzeri KR7 for the C24 alkane. PMID:24948922
Automatically Detecting Failures in Natural Language Processing Tools for Online Community Text.
Park, Albert; Hartzler, Andrea L; Huh, Jina; McDonald, David W; Pratt, Wanda
2015-08-31
The prevalence and value of patient-generated health text are increasing, but processing such text remains problematic. Although existing biomedical natural language processing (NLP) tools are appealing, most were developed to process clinician- or researcher-generated text, such as clinical notes or journal articles. In addition to being constructed for different types of text, other challenges of using existing NLP include constantly changing technologies, source vocabularies, and characteristics of text. These continuously evolving challenges warrant the need for applying low-cost systematic assessment. However, the primarily accepted evaluation method in NLP, manual annotation, requires tremendous effort and time. The primary objective of this study is to explore an alternative approach-using low-cost, automated methods to detect failures (eg, incorrect boundaries, missed terms, mismapped concepts) when processing patient-generated text with existing biomedical NLP tools. We first characterize common failures that NLP tools can make in processing online community text. We then demonstrate the feasibility of our automated approach in detecting these common failures using one of the most popular biomedical NLP tools, MetaMap. Using 9657 posts from an online cancer community, we explored our automated failure detection approach in two steps: (1) to characterize the failure types, we first manually reviewed MetaMap's commonly occurring failures, grouped the inaccurate mappings into failure types, and then identified causes of the failures through iterative rounds of manual review using open coding, and (2) to automatically detect these failure types, we then explored combinations of existing NLP techniques and dictionary-based matching for each failure cause. Finally, we manually evaluated the automatically detected failures. From our manual review, we characterized three types of failure: (1) boundary failures, (2) missed term failures, and (3) word ambiguity failures. Within these three failure types, we discovered 12 causes of inaccurate mappings of concepts. We used automated methods to detect almost half of 383,572 MetaMap's mappings as problematic. Word sense ambiguity failure was the most widely occurring, comprising 82.22% of failures. Boundary failure was the second most frequent, amounting to 15.90% of failures, while missed term failures were the least common, making up 1.88% of failures. The automated failure detection achieved precision, recall, accuracy, and F1 score of 83.00%, 92.57%, 88.17%, and 87.52%, respectively. We illustrate the challenges of processing patient-generated online health community text and characterize failures of NLP tools on this patient-generated health text, demonstrating the feasibility of our low-cost approach to automatically detect those failures. Our approach shows the potential for scalable and effective solutions to automatically assess the constantly evolving NLP tools and source vocabularies to process patient-generated text.
Street, Maryann; Phillips, Nicole M; Kent, Bridie; Colgan, Stephen; Mohebbi, Mohammadreza
2015-06-01
While the risk of adverse events following surgery has been identified, the impact of nursing care on early detection of these events is not well established. A systematic review of the evidence and an expert consensus study in post-anaesthetic care identified essential criteria for nursing assessment of patient readiness for discharge from the post-anaesthetic care unit (PACU). These criteria were included in a new nursing assessment tool, the Post-Anaesthetic Care Tool (PACT), and incorporated into the post-anaesthetic documentation at a large health service. The aim of this study is to test the clinical reliability of the PACT and evaluate whether the use of PACT will (1) enhance the recognition and response to patients at risk of deterioration in PACU; (2) improve documentation for handover from PACU nurse to ward nurse; (3) result in improved patient outcomes and (4) reduce healthcare costs. A prospective, non-randomised, pre-implementation and post-implementation design comparing: (1) patients (n=750) who have surgery prior to the implementation of the PACT and (2) patients (n=750) who have surgery after PACT. The study will examine the use of the tool through the observation of patient care and nursing handover. Patient outcomes and cost-effectiveness will be determined from health service data and medical record audit. Descriptive statistics will be used to describe the sample and compare the two patient groups (pre-intervention and post-intervention). Differences in patient outcomes between the two groups will be compared using the Cochran-Mantel-Haenszel test and regression analyses and reported as ORs with the corresponding 95% CIs. This study will test the clinical reliability and cost-effectiveness of the PACT. It is hypothesised that the PACT will enable nurses to recognise and respond to patients at risk of deterioration, improve handover to ward nurses, improve patient outcomes, and reduce healthcare costs. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Vieira, D C S; Serpa, D; Nunes, J P C; Prats, S A; Neves, R; Keizer, J J
2018-08-01
Wildfires have become a recurrent threat for many Mediterranean forest ecosystems. The characteristics of the Mediterranean climate, with its warm and dry summers and mild and wet winters, make this a region prone to wildfire occurrence as well as to post-fire soil erosion. This threat is expected to be aggravated in the future due to climate change and land management practices and planning. The wide recognition of wildfires as a driver for runoff and erosion in burnt forest areas has created a strong demand for model-based tools for predicting the post-fire hydrological and erosion response and, in particular, for predicting the effectiveness of post-fire management operations to mitigate these responses. In this study, the effectiveness of two post-fire treatments (hydromulch and natural pine needle mulch) in reducing post-fire runoff and soil erosion was evaluated against control conditions (i.e. untreated conditions), at different spatial scales. The main objective of this study was to use field data to evaluate the ability of different erosion models: (i) empirical (RUSLE), (ii) semi-empirical (MMF), and (iii) physically-based (PESERA), to predict the hydrological and erosive response as well as the effectiveness of different mulching techniques in fire-affected areas. The results of this study showed that all three models were reasonably able to reproduce the hydrological and erosive processes occurring in burned forest areas. In addition, it was demonstrated that the models can be calibrated at a small spatial scale (0.5 m 2 ) but provide accurate results at greater spatial scales (10 m 2 ). From this work, the RUSLE model seems to be ideal for fast and simple applications (i.e. prioritization of areas-at-risk) mainly due to its simplicity and reduced data requirements. On the other hand, the more complex MMF and PESERA models would be valuable as a base of a possible tool for assessing the risk of water contamination in fire-affected water bodies and for testing different land management scenarios. Copyright © 2018 Elsevier Inc. All rights reserved.
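Of the three models compared, RUSLE is simple enough to sketch directly: annual soil loss is the product of a handful of factors. The function below implements that standard multiplication; the factor values are placeholders, and post-fire applications typically adjust the cover-management factor to reflect burn severity and the support practice factor to reflect treatments such as mulching.

```python
def rusle_soil_loss(r, k, ls, c, p):
    """Revised Universal Soil Loss Equation: A = R * K * LS * C * P.

    r  : rainfall-runoff erosivity factor
    k  : soil erodibility factor
    ls : slope length and steepness factor
    c  : cover-management factor (raised after a fire removes vegetation)
    p  : support practice factor (e.g. reduced by mulching treatments)
    Returns average annual soil loss in the units implied by R and K
    (commonly Mg ha^-1 yr^-1).
    """
    return r * k * ls * c * p

# Placeholder factors for an untreated burned hillslope vs. a mulched one.
untreated = rusle_soil_loss(r=1200, k=0.03, ls=5.0, c=0.20, p=1.0)
mulched   = rusle_soil_loss(r=1200, k=0.03, ls=5.0, c=0.20, p=0.4)
print(f"untreated: {untreated:.1f}  mulched: {mulched:.1f}")
```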
The Collins Center Update. Volume 1, Issue 3, December 1999
1999-12-01
CDN), developed and executed the FORO DE ESTRATEGIA NACIONAL 2005 Honduras en el Siglo XXI (FEN 2005) {National Strategy Forum 2005 Honduras...tools and processes used to make strategic leaders. Impressed with this program, Governor Patton requested a return visit with his entire...wide command post and field training exercise which tests and validates nuclear command and control, and execution procedures. It is based on a
76 FR 37136 - Post-Entry Amendment (PEA) Processing Test: Modification, Clarification, and Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-24
.... Customs and Border Protection's (CBP's) Post-Entry Amendment (PEA) Processing test, which allows the...: The Post-Entry Amendment (PEA) Processing test modification set forth in this document is effective...: Background I. Post-Entry Amendment Processing Test Program The Post-Entry Amendment (PEA) Processing test...
Implementation of Cyberinfrastructure and Data Management Workflow for a Large-Scale Sensor Network
NASA Astrophysics Data System (ADS)
Jones, A. S.; Horsburgh, J. S.
2014-12-01
Monitoring with in situ environmental sensors and other forms of field-based observation presents many challenges for data management, particularly for large-scale networks consisting of multiple sites, sensors, and personnel. The availability and utility of these data in addressing scientific questions relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into functional data products. It also depends on the ability of researchers to share and access the data in useable formats. In addition to addressing the challenges presented by the quantity of data, monitoring networks need practices to ensure high data quality, including procedures and tools for post processing. Data quality is further enhanced if practitioners are able to track equipment, deployments, calibrations, and other events related to site maintenance and associate these details with observational data. In this presentation we will describe the overall workflow that we have developed for research groups and sites conducting long term monitoring using in situ sensors. Features of the workflow include: software tools to automate the transfer of data from field sites to databases, a Python-based program for data quality control post-processing, a web-based application for online discovery and visualization of data, and a data model and web interface for managing physical infrastructure. By automating the data management workflow, the time from collection to analysis is reduced and sharing and publication is facilitated. The incorporation of metadata standards and descriptions and the use of open-source tools enhances the sustainability and reusability of the data. We will describe the workflow and tools that we have developed in the context of the iUTAH (innovative Urban Transitions and Aridregion Hydrosustainability) monitoring network. The iUTAH network consists of aquatic and climate sensors deployed in three watersheds to monitor Gradients Along Mountain to Urban Transitions (GAMUT). The variety of environmental sensors and the multi-watershed, multi-institutional nature of the network necessitate a well-planned and efficient workflow for acquiring, managing, and sharing sensor data, which should be useful for similar large-scale and long-term networks.
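In the same spirit as the Python-based quality control post-processing described above, the following sketch flags out-of-range values and sudden spikes in a raw sensor series. The column names, thresholds, and flag vocabulary are assumptions for illustration, not the iUTAH/GAMUT implementation.

```python
# Sketch of a sensor data QC post-processing step: flag missing values,
# out-of-range values and sudden spikes. Thresholds and flag names are
# illustrative assumptions.
import pandas as pd

def qc_flag(series: pd.Series, lo: float, hi: float, max_step: float) -> pd.Series:
    """Label each observation 'good', 'out_of_range', 'spike' or 'missing'."""
    flags = pd.Series("good", index=series.index)
    flags[(series < lo) | (series > hi)] = "out_of_range"
    flags[series.diff().abs() > max_step] = "spike"
    flags[series.isna()] = "missing"
    return flags

# Tiny in-memory example standing in for a raw data file from a field site.
raw = pd.DataFrame({
    "timestamp": pd.date_range("2014-07-01", periods=6, freq="15min"),
    "water_temp_C": [14.2, 14.3, None, 14.4, 39.8, 16.9],
})
raw["qc_flag"] = qc_flag(raw["water_temp_C"], lo=-0.5, hi=35.0, max_step=2.0)
print(raw)
```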
NASA Astrophysics Data System (ADS)
Kadow, C.; Illing, S.; Schartner, T.; Grieger, J.; Kirchner, I.; Rust, H.; Cubasch, U.; Ulbrich, U.
2017-12-01
The Freie Univ Evaluation System Framework (Freva - freva.met.fu-berlin.de) is a software infrastructure for standardized data and tool solutions in Earth system science (e.g. www-miklip.dkrz.de, cmip-eval.dkrz.de). Freva runs on high performance computers to handle customizable evaluation systems of research projects, institutes or universities. It combines different software technologies into one common hybrid infrastructure, including all features present in the shell and web environment. The database interface satisfies the international standards provided by the Earth System Grid Federation (ESGF). Freva indexes different data projects into one common search environment by storing the meta data information of the self-describing model, reanalysis and observational data sets in a database. This implemented meta data system with its advanced but easy-to-handle search tool supports users, developers and their plugins to retrieve the required information. A generic application programming interface (API) allows scientific developers to connect their analysis tools with the evaluation system independently of the programming language used. Users of the evaluation techniques benefit from the common interface of the evaluation system without any need to understand the different scripting languages. The integrated web-shell (shellinabox) adds a degree of freedom in the choice of the working environment and can be used as a gate to the research projects HPC. Plugins are able to integrate their e.g. post-processed results into the database of the user. This allows e.g. post-processing plugins to feed statistical analysis plugins, which fosters an active exchange between plugin developers of a research project. Additionally, the history and configuration sub-system stores every analysis performed with the evaluation system in a database. Configurations and results of the tools can be shared among scientists via shell or web system. Furthermore, if configurations match while starting an evaluation plugin, the system suggests to use results already produced by other users - saving CPU/h, I/O, disk space and time. The efficient interaction between different technologies improves the Earth system modeling science framed by Freva.
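The generic plugin idea described above, where an analysis tool written in any language is wrapped, run, and its outputs registered for reuse, can be pictured roughly as in the sketch below. The class and method names are invented for illustration and are not Freva's actual API.

```python
# Hypothetical sketch of a plugin-style wrapper for an evaluation framework
# like the one described above. Names and structure are invented, NOT Freva's
# actual programming interface.
import subprocess

class AnalysisPlugin:
    """A tool wrapper: declare parameters, run the tool, register outputs."""
    name = "bias_map"
    parameters = {"model_data": str, "reference_data": str, "outdir": str}

    def run(self, config: dict) -> dict:
        # The wrapped tool may be written in any language; here we shell out
        # to a hypothetical command-line program.
        cmd = ["compute_bias", config["model_data"],
               config["reference_data"], "-o", config["outdir"]]
        subprocess.run(cmd, check=True)
        # Returned metadata could then be indexed so that other plugins
        # (e.g., statistical analyses) can discover and reuse the result.
        return {"plugin": self.name, "outputs": [config["outdir"] + "/bias.nc"]}
```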
Palese, Alvisa; Marini, Eva; Guarnier, Annamaria; Barelli, Paolo; Zambiasi, Paola; Allegrini, Elisabetta; Bazoli, Letizia; Casson, Paola; Marin, Meri; Padovan, Marisa; Picogna, Michele; Taddia, Patrizia; Chiari, Paolo; Salmaso, Daniele; Marognolli, Oliva; Canzan, Federica; Ambrosi, Elisa; Saiani, Luisa; Grassetti, Luca
2016-10-01
There is growing interest in validating tools aimed at supporting the clinical decision-making process and research. However, an increased bureaucratization of clinical practice and redundancies in the measures collected have been reported by clinicians. Redundancies in clinical assessments affect negatively both patients and nurses. To validate a meta-tool measuring the risks/problems currently estimated by multiple tools used in daily practice. A secondary analysis of a database was performed, using a cross-validation and a longitudinal study designs. In total, 1464 patients admitted to 12 medical units in 2012 were assessed at admission with the Brass, Barthel, Conley and Braden tools. Pertinent outcomes such as the occurrence of post-discharge need for resources and functional decline at discharge, as well as falls and pressure sores, were measured. Explorative factor analysis of each tool, inter-tool correlations and a conceptual evaluation of the redundant/similar items across tools were performed. Therefore, the validation of the meta-tool was performed through explorative factor analysis, confirmatory factor analysis and the structural equation model to establish the ability of the meta-tool to predict the outcomes estimated by the original tools. High correlations between the tools have emerged (from r 0.428 to 0.867) with a common variance from 18.3% to 75.1%. Through a conceptual evaluation and explorative factor analysis, the items were reduced from 42 to 20, and the three factors that emerged were confirmed by confirmatory factor analysis. According to the structural equation model results, two out of three emerged factors predicted the outcomes. From the initial 42 items, the meta-tool is composed of 20 items capable of predicting the outcomes as with the original tools. © 2016 John Wiley & Sons, Ltd.
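A rough, simulated illustration of the exploratory factor analysis step used to condense the original items into a shorter meta-tool is sketched below with scikit-learn. The data, the three-factor choice, and the loading cut-off are all placeholders, not the study's actual item set or procedure.

```python
# Illustrative sketch (simulated data) of an exploratory factor analysis step
# like the one used to reduce the pooled assessment items to a meta-tool.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_patients, n_items, n_factors = 500, 42, 3
latent = rng.normal(size=(n_patients, n_factors))        # underlying risk dimensions
weights = rng.normal(size=(n_factors, n_items))
items = latent @ weights + rng.normal(scale=0.5, size=(n_patients, n_items))

fa = FactorAnalysis(n_components=n_factors, random_state=0)
fa.fit(items)
loadings = fa.components_.T                              # items x factors

# Keep items loading strongly on at least one factor (0.40 cut-off is arbitrary).
keep = np.flatnonzero(np.abs(loadings).max(axis=1) > 0.40)
print(f"{keep.size} of {n_items} items retained")
```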
A new tool to evaluate postgraduate training posts: the Job Evaluation Survey Tool (JEST).
Wall, David; Goodyear, Helen; Singh, Baldev; Whitehouse, Andrew; Hughes, Elizabeth; Howes, Jonathan
2014-10-02
Three reports in 2013 about healthcare and patient safety in the UK, namely Berwick, Francis and Keogh have highlighted the need for junior doctors' views about their training experience to be heard. In the UK, the General Medical Council (GMC) quality assures medical training programmes and requires postgraduate deaneries to undertake quality management and monitoring of all training posts in their area. The aim of this study was to develop a simple trainee questionnaire for evaluation of postgraduate training posts based on the GMC, UK standards and to look at the reliability and validity including comparison with a well-established and internationally validated tool, the Postgraduate Hospital Educational Environment Measure (PHEEM). The Job Evaluation Survey Tool (JEST), a fifteen item job evaluation questionnaire was drawn up in 2006, piloted with Foundation doctors (2007), field tested with specialist paediatric registrars (2008) and used over a three year period (2008-11) by Foundation Doctors. Statistical analyses including descriptives, reliability, correlation and factor analysis were undertaken and JEST compared with PHEEM. The JEST had a reliability of 0.91 in the pilot study of 76 Foundation doctors, 0.88 in field testing of 173 Paediatric specialist registrars and 0.91 in three years of general use in foundation training with 3367 doctors completing JEST. Correlation of JEST with PHEEM was 0.80 (p < 0.001). Factor analysis showed two factors, a teaching factor and a social and lifestyle one. The JEST has proved to be a simple, valid and reliable evaluation tool in the monitoring and evaluation of postgraduate hospital training posts.
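The reliability figures reported for the JEST are Cronbach's alpha values, which can be computed directly from a respondents-by-items score matrix; the sketch below uses simulated responses only to illustrate the calculation.

```python
# Sketch: Cronbach's alpha, the internal-consistency statistic reported for the
# JEST (e.g., 0.91), computed from a respondents x items score matrix.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = respondents, columns = questionnaire items."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated 15-item responses on a 1-5 scale (the JEST has fifteen items).
rng = np.random.default_rng(1)
base = rng.integers(1, 6, size=(200, 1))
responses = np.clip(base + rng.integers(-1, 2, size=(200, 15)), 1, 5)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```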
Active Wiki Knowledge Repository
2012-10-01
data using SPARQL queries or RESTful web-services; ‘gardening’ tools for examining the semantically tagged content in the wiki; high-level language tool...Tagging & RDF triple-store Fusion and inferences for collaboration Tools for Consuming Data SPARQL queries or RESTful WS Inference & Gardening tools...other stores using AW SPARQL queries and rendering templates; and 4) Interactively share maps and other content using annotation tools to post notes
NASA Planning for Orion Multi-Purpose Crew Vehicle Ground Operations
NASA Technical Reports Server (NTRS)
Letchworth, Gary; Schlierf, Roland
2011-01-01
The NASA Orion Ground Processing Team was originally formed by the Kennedy Space Center (KSC) Constellation (Cx) Project Office's Orion Division to define, refine and mature pre-launch and post-landing ground operations for the Orion human spacecraft. The multidisciplined KSC Orion team consisted of KSC civil servant, SAIC, Productivity Apex, Inc. and Boeing-CAPPS engineers, project managers and safety engineers, as well as engineers from Constellation's Orion Project and Lockheed Martin Orion Prime contractor. The team evaluated the Orion design configurations as the spacecraft concept matured between Systems Design Review (SDR), Systems Requirement Review (SRR) and Preliminary Design Review (PDR). The team functionally decomposed prelaunch and post-landing steps at three levels' of detail, or tiers, beginning with functional flow block diagrams (FFBDs). The third tier FFBDs were used to build logic networks and nominal timelines. Orion ground support equipment (GSE) was identified and mapped to each step. This information was subsequently used in developing lower level operations steps in a Ground Operations Planning Document PDR product. Subject matter experts for each spacecraft and GSE subsystem were used to define 5th - 95th percentile processing times for each FFBD step, using the Delphi Method. Discrete event simulations used this information and the logic network to provide processing timeline confidence intervals for launch rate assessments. The team also used the capabilities of the KSC Visualization Lab, the FFBDs and knowledge of the spacecraft, GSE and facilities to build visualizations of Orion pre-launch and postlanding processing at KSC. Visualizations were a powerful tool for communicating planned operations within the KSC community (i.e., Ground Systems design team), and externally to the Orion Project, Lockheed Martin spacecraft designers and other Constellation Program stakeholders during the SRR to PDR timeframe. Other operations planning tools included Kaizen/Lean events, mockups and human factors analysis. The majority of products developed by this team are applicable as KSC prepares 21st Century Ground Systems for the Orion Multi-Purpose Crew Vehicle and Space Launch System.
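The step of turning expert 5th/95th-percentile duration estimates into timeline confidence intervals can be pictured with a simple Monte Carlo sketch, shown below. The step names, durations, and lognormal assumption are invented; the actual assessment used discrete event simulation over the full FFBD logic network rather than a serial chain.

```python
# Sketch: convert expert 5th/95th-percentile step durations into a
# processing-timeline confidence interval by Monte Carlo sampling.
import numpy as np

rng = np.random.default_rng(42)

# (5th percentile, 95th percentile) duration, in shifts, for notional serial steps.
steps = {"spacecraft receiving": (2, 5), "offline processing": (10, 25),
         "integration and test": (8, 18), "pad operations": (3, 7)}

def sample_duration(p5, p95, size):
    # Fit a lognormal whose 5th/95th percentiles match the expert estimates.
    z = 1.645
    mu = (np.log(p5) + np.log(p95)) / 2.0
    sigma = (np.log(p95) - np.log(p5)) / (2.0 * z)
    return rng.lognormal(mu, sigma, size)

totals = sum(sample_duration(lo, hi, 20000) for lo, hi in steps.values())
print("median total:", np.percentile(totals, 50).round(1), "shifts")
print("90% interval:", np.percentile(totals, [5, 95]).round(1))
```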
Kaizer, Franceen; Kim, Angela; Van, My Tram; Korner-Bitensky, Nicol
2010-03-01
Patients with stroke should be screened for safety prior to starting a self-medication regime. An extensive literature review revealed no standardized self-medication tool tailored to the multi-faceted needs of the stroke population. The aim of this study was to create and validate a condition-specific tool to be used in screening for self-medication safety in individuals with stroke. Items were generated using expert consultation and review of the existing tools. The draft tool was pilot-tested on expert stroke clinicians to receive feedback on content, clarity, optimal cueing and domain omissions. The final version was piloted on patients with stroke using a structured interviewer-administered interview. The tool was progressively refined and validated according to feedback from the 11 expert reviewers. The subsequent version was piloted on patients with stroke. The final version includes 16 questions designed to elicit information on 5 domains: cognition, communication, motor, visual-perception and, judgement/executive function/self-efficacy. The Screening for Safe Self-medication post-Stroke Scale (S-5) has been created and validated for use by health professionals to screen self-medication safety readiness of patients after stroke. Its use should also help to guide clinicians' recommendations and interventions aimed at enhancing self-medication post-stroke.
NASA Astrophysics Data System (ADS)
Buongiorno, Maria Fabrizia; Musacchio, Massimo; Silvestri, Malvina; Spinetti, Claudia; Corradini, Stefano; Lombardo, Valerio; Merucci, Luca; Sansosti, Eugenio; Pugnagli, Sergio; Teggi, Sergio; Pace, Gaetano; Fermi, Marco; Zoffoli, Simona
2007-10-01
The project called Sistema Rischio Vulcanico (SRV) is funded by the Italian Space Agency (ASI) in the frame of the National Space Plan 2003-2005 under the Earth Observations section for natural risks management. The SRV Project is coordinated by the Istituto Nazionale di Geofisica e Vulcanologia (INGV), which is responsible at the national level for volcanic monitoring. The objective of the project is to develop a pre-operative system, based on the integration of EO data and ground measurements, to support the volcanic risk monitoring of the Italian Civil Protection Department, whose requirements and needs are well integrated in the GMES Emergency Core Services program. The project philosophy is to implement, by incremental versions, specific modules that allow EO-derived parameters to be processed, stored and visualized through Web GIS tools, considering three activity phases: 1) knowledge and prevention; 2) crisis; 3) post-crisis. In order to combine the EO data and the ground network measurements effectively, the system will implement a multi-parametric analysis tool, which represents a unique means of analyzing a large set of data contemporaneously in "near real time". The SRV project will test its operational capabilities on three Italian volcanoes: Etna, Vesuvio and Campi Flegrei.
Global and Cross-National Influences on Education in Post-Genocide Rwanda
ERIC Educational Resources Information Center
Schweisfurth, Michele
2006-01-01
In post-genocide Rwanda, education is being seen as a tool for development, reconstruction and reconciliation. This article explores three different ways in which international influence on the education agenda is being experienced, with particular focus on Rwanda as a post-conflict society. The three quite different dimensions and sources of…
Spatio-Temporal Process Variability in Watershed Scale Wetland Restoration Planning
NASA Astrophysics Data System (ADS)
Evenson, G. R.
2012-12-01
Watershed scale restoration decision making processes are increasingly informed by quantitative methodologies providing site-specific restoration recommendations - sometimes referred to as "systematic planning." The more advanced of these methodologies are characterized by a coupling of search algorithms and ecological models to discover restoration plans that optimize environmental outcomes. Yet while these methods have exhibited clear utility as decision support toolsets, they may be critiqued for flawed evaluations of spatio-temporally variable processes fundamental to watershed scale restoration. Hydrologic and non-hydrologic mediated process connectivity along with post-restoration habitat dynamics, for example, are commonly ignored yet known to appreciably affect restoration outcomes. This talk will present a methodology to evaluate such spatio-temporally complex processes in the production of watershed scale wetland restoration plans. Using the Tuscarawas Watershed in Eastern Ohio as a case study, a genetic algorithm will be coupled with the Soil and Water Assessment Tool (SWAT) to reveal optimal wetland restoration plans as measured by their capacity to maximize nutrient reductions. Then, a so-called "graphical" representation of the optimization problem will be implemented in-parallel to promote hydrologic and non-hydrologic mediated connectivity amongst existing wetlands and sites selected for restoration. Further, various search algorithm mechanisms will be discussed as a means of accounting for temporal complexities such as post-restoration habitat dynamics. Finally, generalized patterns of restoration plan optimality will be discussed as an alternative and possibly superior decision support toolset given the complexity and stochastic nature of spatio-temporal process variability.
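A compressed sketch of the search-algorithm component is given below: a genetic algorithm selects restoration sites to maximize a nutrient-reduction score under an area budget. The per-site scores are random placeholders; in the study described, each candidate plan's fitness would come from SWAT simulations together with connectivity and habitat-dynamics considerations, not a simple additive surrogate.

```python
# Minimal genetic-algorithm sketch for selecting wetland restoration sites to
# maximize nutrient reduction under an area budget. Scores are placeholders.
import numpy as np

rng = np.random.default_rng(7)
n_sites = 60
reduction = rng.uniform(0.1, 1.0, n_sites)   # surrogate nutrient removal per site
area = rng.uniform(1.0, 5.0, n_sites)        # ha required per site
budget = 60.0                                # total restorable area (ha)

def fitness(plan):
    # plan: boolean vector of selected sites; infeasible plans score zero.
    if (plan * area).sum() > budget:
        return 0.0
    return float((plan * reduction).sum())

pop = rng.random((100, n_sites)) < 0.2
for generation in range(200):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)][-50:]               # keep the best half
    kids = parents[rng.integers(0, 50, 50)].copy()
    mates = parents[rng.integers(0, 50, 50)]
    mask = rng.random(kids.shape) < 0.5                   # uniform crossover
    kids[mask] = mates[mask]
    kids ^= rng.random(kids.shape) < 0.02                 # bit-flip mutation
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(p) for p in pop])]
print("best plan removes", round(fitness(best), 2), "surrogate nutrient units")
```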
Post-Flight Assessment of Low Density Supersonic Decelerator Flight Dynamics Test 2 Simulation
NASA Technical Reports Server (NTRS)
Dutta, Soumyo; Bowes, Angela L.; White, Joseph P.; Striepe, Scott A.; Queen, Eric M.; O'Farrel, Clara; Ivanov, Mark C.
2016-01-01
NASA's Low Density Supersonic Decelerator (LDSD) project conducted its second Supersonic Flight Dynamics Test (SFDT-2) on June 8, 2015. The Program to Optimize Simulated Trajectories II (POST2) was one of the flight dynamics tools used to simulate and predict the flight performance and was a major tool used in the post-flight assessment of the flight trajectory. This paper compares the simulation predictions with the reconstructed trajectory. Additionally, off-nominal conditions seen during flight are modeled in the simulation to reconcile the predictions with flight data. These analyses are beneficial to characterize the results of the flight test and to improve the simulation and targeting of the subsequent LDSD flights.
Modular GIS Framework for National Scale Hydrologic and Hydraulic Modeling Support
NASA Astrophysics Data System (ADS)
Djokic, D.; Noman, N.; Kopp, S.
2015-12-01
Geographic information systems (GIS) have been extensively used for pre- and post-processing of hydrologic and hydraulic models at multiple scales. An extensible GIS-based framework was developed for characterization of drainage systems (stream networks, catchments, floodplain characteristics) and model integration. The framework is implemented as a set of free, open source, Python tools and builds on core ArcGIS functionality and uses geoprocessing capabilities to ensure extensibility. Utilization of COTS GIS core capabilities allows immediate use of model results in a variety of existing online applications and integration with other data sources and applications.The poster presents the use of this framework to downscale global hydrologic models to local hydraulic scale and post process the hydraulic modeling results and generate floodplains at any local resolution. Flow forecasts from ECMWF or WRF-Hydro are downscaled and combined with other ancillary data for input into the RAPID flood routing model. RAPID model results (stream flow along each reach) are ingested into a GIS-based scale dependent stream network database for efficient flow utilization and visualization over space and time. Once the flows are known at localized reaches, the tools can be used to derive the floodplain depth and extent for each time step in the forecast at any available local resolution. If existing rating curves are available they can be used to relate the flow to the depth of flooding, or synthetic rating curves can be derived using the tools in the toolkit and some ancillary data/assumptions. The results can be published as time-enabled spatial services to be consumed by web applications that use floodplain information as an input. Some of the existing online presentation templates can be easily combined with available online demographic and infrastructure data to present the impact of the potential floods on the local community through simple, end user products. This framework has been successfully used in both the data rich environments as well as in locales with minimum available spatial and hydrographic data.
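The rating-curve step mentioned above, relating routed flow to flood depth, is often expressed as a power law that can be inverted directly; the sketch below shows the idea with placeholder coefficients. In practice the coefficients come from gauged ratings or from synthetic curves built from channel geometry.

```python
# Sketch of the rating-curve step: relate routed streamflow Q to flow depth h
# with a power-law curve Q = a * h**b and invert it to get the depth used for
# floodplain mapping. Coefficients are hypothetical for one reach.
a, b = 14.0, 1.6

def depth_from_flow(q_m3s: float) -> float:
    """Invert Q = a * h**b analytically for the flow depth h (m)."""
    return (q_m3s / a) ** (1.0 / b)

for q in (5.0, 50.0, 500.0):          # example routed flows for one time step
    print(f"Q = {q:7.1f} m3/s -> depth = {depth_from_flow(q):.2f} m")
```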
Nepal, Chirag; Coolen, Marion; Hadzhiev, Yavor; Cussigh, Delphine; Mydel, Piotr; Steen, Vidar M.; Carninci, Piero; Andersen, Jesper B.; Bally-Cuif, Laure; Müller, Ferenc; Lenhard, Boris
2016-01-01
MicroRNAs (miRNAs) play a major role in the post-transcriptional regulation of target genes, especially in development and differentiation. Our understanding about the transcriptional regulation of miRNA genes is limited by inadequate annotation of primary miRNA (pri-miRNA) transcripts. Here, we used CAGE-seq and RNA-seq to provide genome-wide identification of the pri-miRNA core promoter repertoire and its dynamic usage during zebrafish embryogenesis. We assigned pri-miRNA promoters to 152 precursor-miRNAs (pre-miRNAs), the majority of which were supported by promoter associated post-translational histone modifications (H3K4me3, H2A.Z) and RNA polymerase II (RNAPII) occupancy. We validated seven miR-9 pri-miRNAs by in situ hybridization and showed similar expression patterns as mature miR-9. In addition, processing of an alternative intronic promoter of miR-9–5 was validated by 5′ RACE PCR. Developmental profiling revealed a subset of pri-miRNAs that are maternally inherited. Moreover, we show that promoter-associated H3K4me3, H2A.Z and RNAPII marks are not only present at pri-miRNA promoters but are also specifically enriched at pre-miRNAs, suggesting chromatin level regulation of pre-miRNAs. Furthermore, we demonstrated that CAGE-seq also detects 3′-end processing of pre-miRNAs on Drosha cleavage site that correlates with miRNA-offset RNAs (moRNAs) production and provides a new tool for detecting Drosha processing events and predicting pre-miRNA processing by a genome-wide assay. PMID:26673698
Examining the Relationships Among Self-Compassion, Social Anxiety, and Post-Event Processing.
Blackie, Rebecca A; Kocovski, Nancy L
2017-01-01
Post-event processing refers to negative and repetitive thinking following anxiety provoking social situations. Those who engage in post-event processing may lack self-compassion in relation to social situations. As such, the primary aim of this research was to evaluate whether those high in self-compassion are less likely to engage in post-event processing and the specific self-compassion domains that may be most protective. In study 1 ( N = 156 undergraduate students) and study 2 ( N = 150 individuals seeking help for social anxiety and shyness), participants completed a battery of questionnaires, recalled a social situation, and then rated state post-event processing. Self-compassion negatively correlated with post-event processing, with some differences depending on situation type. Even after controlling for self-esteem, self-compassion remained significantly correlated with state post-event processing. Given these findings, self-compassion may serve as a buffer against post-event processing. Future studies should experimentally examine whether increasing self-compassion leads to reduced post-event processing.
Modelling Peri-Perceptual Brain Processes in a Deep Learning Spiking Neural Network Architecture.
Gholami Doborjeh, Zohreh; Kasabov, Nikola; Gholami Doborjeh, Maryam; Sumich, Alexander
2018-06-11
Familiarity of marketing stimuli may affect consumer behaviour at a peri-perceptual processing level. The current study introduces a method for deep learning of electroencephalogram (EEG) data using a spiking neural network (SNN) approach that reveals the complexity of peri-perceptual processes of familiarity. The method is applied to data from 20 participants viewing familiar and unfamiliar logos. The results support the potential of SNN models as novel tools in the exploration of peri-perceptual mechanisms that respond differentially to familiar and unfamiliar stimuli. Specifically, the activation pattern of the time-locked response identified by the proposed SNN model at approximately 200 milliseconds post-stimulus suggests greater connectivity and more widespread dynamic spatio-temporal patterns for familiar than unfamiliar logos. The proposed SNN approach can be applied to study other peri-perceptual or perceptual brain processes in cognitive and computational neuroscience.
Extraction of UMLS® Concepts Using Apache cTAKES™ for German Language.
Becker, Matthias; Böckmann, Britta
2016-01-01
Automatic information extraction of medical concepts and classification with semantic standards from medical reports is useful for standardization and for clinical research. This paper presents an approach for an UMLS concept extraction with a customized natural language processing pipeline for German clinical notes using Apache cTAKES. The objectives are, to test the natural language processing tool for German language if it is suitable to identify UMLS concepts and map these with SNOMED-CT. The German UMLS database and German OpenNLP models extended the natural language processing pipeline, so the pipeline can normalize to domain ontologies such as SNOMED-CT using the German concepts. For testing, the ShARe/CLEF eHealth 2013 training dataset translated into German was used. The implemented algorithms are tested with a set of 199 German reports, obtaining a result of average 0.36 F1 measure without German stemming, pre- and post-processing of the reports.
Process and Post-Process: A Discursive History.
ERIC Educational Resources Information Center
Matsuda, Paul Kei
2003-01-01
Examines the history of process and post-process in composition studies, focusing on ways in which terms, such as "current-traditional rhetoric,""process," and "post-process" have contributed to the discursive construction of reality. Argues that use of the term post-process in the context of second language writing needs to be guided by a…
Capturing, using, and managing quality assurance knowledge for shuttle post-MECO flight design
NASA Technical Reports Server (NTRS)
Peters, H. L.; Fussell, L. R.; Goodwin, M. A.; Schultz, Roger D.
1991-01-01
Ascent initialization values used by the Shuttle's onboard computer for nominal and abort mission scenarios are verified by a six degrees of freedom computer simulation. The procedure that the Ascent Post Main Engine Cutoff (Post-MECO) group uses to perform quality assurance (QA) of the simulation is time consuming. Also, the QA data, checklists and associated rationale, though known by the group members, is not sufficiently documented, hindering transfer of knowledge and problem resolution. A new QA procedure which retains the current high level of integrity while reducing the time required to perform QA is needed to support the increasing Shuttle flight rate. Documenting the knowledge is also needed to increase its availability for training and problem resolution. To meet these needs, a knowledge capture process, embedded into the group activities, was initiated to verify the existing QA checks, define new ones, and document all rationale. The resulting checks were automated in a conventional software program to achieve the desired standardization, integrity, and time reduction. A prototype electronic knowledge base was developed with Macintosh's HyperCard to serve as a knowledge capture tool and data storage.
AN Fitting Reconditioning Tool
NASA Technical Reports Server (NTRS)
Lopez, Jason
2011-01-01
A tool was developed to repair or replace AN fittings on the shuttle external tank (ET). (The AN thread is a type of fitting used to connect flexible hoses and rigid metal tubing that carry fluid. It is a U.S. military-derived specification agreed upon by the Army and Navy, hence AN.) The tool is used on a drill and is guided by a pilot shaft that follows the inside bore. The cutting edge of the tool is a standard-size replaceable insert. In the typical Post Launch Maintenance/Repair process for the AN fittings, the six fittings are removed from the ET's GUCP (ground umbilical carrier plate) for reconditioning. The fittings are inspected for damage to the sealing surface per standard operations maintenance instructions. When damage is found on the sealing surface, the condition is documented. A new AN reconditioning tool is set up to cut and remove the surface damage. It is then inspected to verify the fitting still meets drawing requirements. The tool features a cone-shaped interior at 36.5°, and may be adjusted at a precise angle with go-no-go gauges to ensure that the cutting edge can be adjusted as it wears down. One tool, one setting block, and one go-no-go gauge were fabricated. At the time of this reporting, the tool has reconditioned/returned to spec 36 AN fittings with 100-percent success of no leakage. This tool provides a quick solution to repair a leaky AN fitting. The tool could easily be modified with different-sized pilot shafts to fit different-sized fittings.
Assessing the technical efficiency of health posts in rural Guatemala: a data envelopment analysis
Hernández, Alison R.; Sebastián, Miguel San
2014-01-01
Introduction Strengthening health service delivery to the rural poor is an important means of redressing inequities. Meso-level managers can help enhance efficiency in the utilization of existing resources through the application of practical tools to analyze routinely collected data reflecting inputs and outputs. This study aimed to assess the efficiency and change in productivity of health posts over two years in a rural department of Guatemala. Methods Data envelopment analysis was used to measure health posts’ technical efficiency and productivity change for 2008 and 2009. Input/output data were collected from the regional health office of Alta Verapaz for 34 health posts from the 19 districts comprising the health region. Results Technical efficiency varied widely across health posts, with mean scores of 0.78 (SD=0.24) and 0.75 (SD=0.21) in 2008 and 2009, respectively. Overall, productivity increased by 4%, though 47% of health posts experienced a decline in productivity. Results were combined on a bivariate plot to identify health posts at the high and low extremes of efficiency, which should be followed up to determine how and why their production processes are operating differently. Conclusions Assessing efficiency using the data that are available at the meso-level can serve as a first step in strengthening performance. Further work is required to support managers in the routine application of efficiency analysis and putting the results to use in guiding efforts to improve service delivery and increase utilization. PMID:24461356
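Technical-efficiency scores of this kind are typically obtained from the input-oriented CCR data envelopment analysis model, one linear program per health post; the sketch below solves that standard LP with SciPy on simulated inputs and outputs rather than the study's actual data.

```python
# Sketch of an input-oriented CCR DEA model, the standard LP behind
# technical-efficiency scores like those reported above. Data are placeholders.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
n = 34
X = rng.uniform(1, 10, size=(n, 2))   # inputs, e.g. staff hours, supplies budget
Y = rng.uniform(1, 10, size=(n, 3))   # outputs, e.g. consultations, vaccinations, visits

def ccr_efficiency(j0):
    """min theta s.t. sum_j lam_j*x_j <= theta*x_j0, sum_j lam_j*y_j >= y_j0, lam >= 0."""
    c = np.r_[1.0, np.zeros(n)]                      # decision vars: [theta, lambda_1..n]
    A_in = np.c_[-X[j0][:, None], X.T]               # X.T @ lam - theta*x_j0 <= 0
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]   # -Y.T @ lam <= -y_j0
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[1]), -Y[j0]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

scores = [ccr_efficiency(j) for j in range(n)]
print("mean efficiency:", round(float(np.mean(scores)), 2))
```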
Microstructural evolution during the homogenization heat treatment of 6XXX and 7XXX aluminum alloys
NASA Astrophysics Data System (ADS)
Priya, Pikee
Homogenization heat treatment of as-cast billets is an important step in the processing of aluminum extrusions. Microstructural evolution during homogenization involves elimination of the eutectic morphology by spheroidisation of the interdendritic phases, minimization of the microsegregation across the grains through diffusion, dissolution of the low-melting phases, which enhances the surface finish of the extrusions, and precipitation of nano-sized dispersoids (for Cr-, Zr-, Mn-, Sc-containing alloys), which inhibit grain boundary motion to prevent recrystallization. Post-homogenization cooling reprecipitates some of the phases, changing the flow stress required for subsequent extrusion. These precipitates, however, are deleterious for the mechanical properties of the alloy and also hamper the age-hardenability and are hence dissolved during solution heat treatment. Microstructural development during homogenization and subsequent cooling occurs both at the length scale of the Secondary Dendrite Arm Spacing (SDAS), in micrometers, and at that of the dispersoids, in nanometers. Numerical tools to simulate microstructural development at both length scales have been developed and validated against experiments. These tools provide easy and convenient means to study the process. A Cellular Automaton-Finite Volume-based model for evolution of interdendritic phases is coupled with a Particle Size Distribution-based model for precipitation of dispersoids across the grain. This comprehensive model has been used to study the effect of temperature, composition, as-cast microstructure, and cooling rates during post-homogenization quenching on microstructural evolution. The numerical study has been complemented with experiments involving Scanning Electron Microscopy, Energy Dispersive Spectroscopy, X-Ray Diffraction and Differential Scanning Calorimetry, and good agreement with the numerical results has been found. The current work aims to study the microstructural evolution during homogenization heat treatment at both length scales, which includes (i) the dissolution and transformation of the as-cast secondary phases; (ii) the precipitation of dispersoids; and (iii) the reprecipitation of some of the secondary phases during post-homogenization cooling. The kinetics of the phase transformations are mostly diffusion-controlled, except for the eta to S phase transformation in 7XXX alloys, which is interface-reaction-rate controlled and has been implemented using a novel approach. Recommendations for homogenization temperature, time, cooling rates and compositions are made for Al-Si-Mg-Fe-Mn and Al-Zn-Cu-Mg-Zr alloys. The numerical model developed has been applied to through-process solidification-homogenization modeling of a Direct-Chill cast AA7050 cylindrical billet to study the radial variation of microstructure after solidification, homogenization and post-homogenization cooling.
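The micrometer-scale part of the problem, relaxation of microsegregation across a dendrite arm by diffusion, can be illustrated with a one-dimensional Fickian diffusion sketch, shown below. The diffusivity, SDAS, and initial sinusoidal solute profile are placeholder values, not calibrated for a specific 6XXX or 7XXX alloy, and the real models are far richer (multicomponent, multi-phase, CA-FV coupled).

```python
# Sketch: 1-D Fickian diffusion across half a secondary dendrite arm, solved
# with an explicit finite difference scheme, as a toy picture of how
# microsegregation decays during homogenization. All values are placeholders.
import numpy as np

D = 1.0e-13          # solute diffusivity at homogenization temperature (m^2/s)
L = 20e-6            # half the secondary dendrite arm spacing (m)
nx, dx = 101, L / 100
dt = 0.4 * dx**2 / D                      # stable explicit time step
c = 1.0 + 0.5 * np.cos(np.pi * np.linspace(0, L, nx) / L)   # initial segregation (wt%)

t, target = 0.0, 0.05                     # stop when peak-to-valley < 0.05 wt%
while c.max() - c.min() > target:
    lap = np.empty_like(c)
    lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    lap[0] = 2 * (c[1] - c[0]) / dx**2    # zero-flux (symmetry) boundaries
    lap[-1] = 2 * (c[-2] - c[-1]) / dx**2
    c += D * dt * lap
    t += dt

print(f"residual segregation reaches {target} wt% after ~{t/3600:.1f} h")
```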
Gul, Naheed; Quadri, Mujtaba
2011-09-01
To evaluate the clinical diagnostic reasoning process as a tool to decrease the number of unnecessary endoscopies for diagnosing peptic ulcer disease. Study design: Cross-sectional KAP study. Shifa College of Medicine, Islamabad, from April to August 2010. Two hundred doctors were assessed with three common clinical scenarios of low, intermediate and high pre-test probability for peptic ulcer disease using a questionnaire. The differences between the reference estimates and the respondents' estimates of pre-test and post-test probability were used to assess the ability to estimate the pre-test and post-test probabilities of the disease. Doctors were also asked about the cost-effectiveness and safety of endoscopy. A consecutive sampling technique was used and the data were analyzed using SPSS version 16. In the low pre-test probability settings, overestimation of the disease probability suggested the doctors' inability to rule out the disease. The post-test probabilities were similarly overestimated. In intermediate pre-test probability settings, both over- and underestimation of probabilities were noticed. In the high pre-test probability setting, there was no significant difference between the reference and the responders' intuitive estimates of post-test probability. Doctors were more likely to consider ordering the test as the disease probability increased. Most respondents were of the opinion that endoscopy is not a cost-effective procedure and may be associated with potential harm. Improvement is needed in doctors' diagnostic ability through more emphasis on clinical decision-making and the application of Bayesian probabilistic thinking to real clinical situations.
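The Bayesian updating that the study asks clinicians to apply converts a pre-test probability into a post-test probability via odds and a likelihood ratio; the sketch below shows the arithmetic with hypothetical likelihood ratios, not published values for endoscopy.

```python
# Sketch of Bayesian pre-test/post-test updating:
# probability -> odds -> multiply by likelihood ratio -> back to probability.
# The likelihood ratios below are placeholders.

def post_test_probability(pre_test_p: float, likelihood_ratio: float) -> float:
    pre_odds = pre_test_p / (1.0 - pre_test_p)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

for pre_p in (0.05, 0.30, 0.80):           # low / intermediate / high pre-test settings
    pos = post_test_probability(pre_p, likelihood_ratio=8.0)   # hypothetical LR+
    neg = post_test_probability(pre_p, likelihood_ratio=0.2)   # hypothetical LR-
    print(f"pre-test {pre_p:.2f}: post-test {pos:.2f} if positive, {neg:.2f} if negative")
```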
Dynamic Lipid-dependent Modulation of Protein Topology by Post-translational Phosphorylation.
Vitrac, Heidi; MacLean, David M; Karlstaedt, Anja; Taegtmeyer, Heinrich; Jayaraman, Vasanthi; Bogdanov, Mikhail; Dowhan, William
2017-02-03
Membrane protein topology and folding are governed by structural principles and topogenic signals that are recognized and decoded by the protein insertion and translocation machineries at the time of initial membrane insertion and folding. We previously demonstrated that the lipid environment is also a determinant of initial protein topology, which is dynamically responsive to post-assembly changes in membrane lipid composition. However, the effect on protein topology of post-assembly phosphorylation of amino acids localized within initially cytoplasmically oriented extramembrane domains has never been investigated. Here, we show in a controlled in vitro system that phosphorylation of a membrane protein can trigger a change in topological arrangement. The rate of change occurred on a scale of seconds, comparable with the rates observed upon changes in the protein lipid environment. The rate and extent of topological rearrangement were dependent on the charges of extramembrane domains and the lipid bilayer surface. Using model membranes mimicking the lipid compositions of eukaryotic organelles, we determined that anionic lipids, cholesterol, sphingomyelin, and membrane fluidity play critical roles in these processes. Our results demonstrate how post-translational modifications may influence membrane protein topology in a lipid-dependent manner, both along the organelle trafficking pathway and at their final destination. The results provide further evidence that membrane protein topology is dynamic, integrating for the first time the effect of changes in lipid composition and regulators of cellular processes. The discovery of a new topology regulatory mechanism opens additional avenues for understanding unexplored structure-function relationships and the development of optimized topology prediction tools. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
Carlton, Connor D; Mitchell, Samantha; Lewis, Patrick
2018-01-01
Over the past decade, Structure from Motion (SfM) has increasingly been used as a means of digital preservation and for documenting archaeological excavations, architecture, and cultural material. However, few studies have tapped the potential of using SfM to document and analyze taphonomic processes affecting burials for forensic sciences purposes. This project utilizes SfM models to elucidate specific post-depositional events that affected a series of three human cadavers deposited at the South East Texas Applied Forensic Science Facility (STAFS). The aim of this research was to test the ability for untrained researchers to employ spatial software and photogrammetry for data collection purposes. For a series of three months a single lens reflex (SLR) camera was used to capture a series of overlapping images at periodic stages in the decomposition process of each cadaver. These images are processed through photogrammetric software that creates a 3D model that can be measured, manipulated, and viewed. This project used photogrammetric and geospatial software to map changes in decomposition and movement of the body from original deposition points. Project results indicate SfM and GIS as a useful tool for documenting decomposition and taphonomic processes. Results indicate photogrammetry is an efficient, relatively simple, and affordable tool for the documentation of decomposition. Copyright © 2017 Elsevier B.V. All rights reserved.
Surface topography analysis and performance on post-CMP images (Conference Presentation)
NASA Astrophysics Data System (ADS)
Lee, Jusang; Bello, Abner F.; Kakita, Shinichiro; Pieniazek, Nicholas; Johnson, Timothy A.
2017-03-01
Surface topography on post-CMP processing can be measured with white light interference microscopy to determine the planarity. Results are used to avoid under or over polishing and to decrease dishing. The numerical output of the surface topography is the RMS (root-mean-square) of the height. Beyond RMS, the topography image is visually examined and not further quantified. Subjective comparisons of the height maps are used to determine optimum CMP process conditions. While visual comparison of height maps can determine excursions, it's only through manual inspection of the images. In this work we describe methods of quantifying post-CMP surface topography characteristics that are used in other technical fields such as geography and facial-recognition. The topography image is divided into small surface patches of 7x7 pixels. Each surface patch is fitted to an analytic surface equation, in this case a third order polynomial, from which the gradient, directional derivatives, and other characteristics are calculated. Based on the characteristics, the surface patch is labeled as peak, ridge, flat, saddle, ravine, pit or hillside. The number of each label and thus the associated histogram is then used as a quantified characteristic of the surface topography, and could be used as a parameter for SPC (statistical process control) charting. In addition, the gradient for each surface patch is calculated, so the average, maximum, and other characteristics of the gradient distribution can be used for SPC. Repeatability measurements indicate high confidence where individual labels can be lower than 2% relative standard deviation. When the histogram is considered, an associated chi-squared value can be defined from which to compare other measurements. The chi-squared value of the histogram is a very sensitive and quantifiable parameter to determine the within wafer and wafer-to-wafer topography non-uniformity. As for the gradient histogram distribution, the chi-squared could again be calculated and used as yet another quantifiable parameter for SPC. In this work we measured the post Cu CMP of a die designed for 14nm technology. A region of interest (ROI) known to be indicative of the CMP processing is chosen for the topography analysis. The ROI, of size 1800 x 2500 pixels where each pixel represents 2um, was repeatably measured. We show the sensitivity based on measurements and the comparison between center and edge die measurements. The topography measurements and surface patch analysis were applied to hundreds of images representing the periodic process qualification runs required to control and verify CMP performance and tool matching. The analysis is shown to be sensitive to process conditions that vary in polishing time, type of slurry, CMP tool manufacturer, and CMP pad lifetime. Keywords: CMP, Topography, Image Processing, Metrology, Interference microscopy, surface processing
NASA Astrophysics Data System (ADS)
Reed, Judd E.; Rumberger, John A.; Buithieu, Jean; Behrenbeck, Thomas; Breen, Jerome F.; Sheedy, Patrick F., II
1995-05-01
Electron beam computed tomography is unparalleled in its ability to consistently produce high quality dynamic images of the human heart. Its use in quantification of left ventricular dynamics is well established in both clinical and research applications. However, the image analysis tools supplied with the scanners offer a limited number of analysis options. They are based on embedded computer systems which have not been significantly upgraded since the scanner was introduced over a decade ago in spite of the explosive improvements in available computer power which have occurred during this period. To address these shortcomings, a workstation-based ventricular analysis system has been developed at our institution. This system, which has been in use for over five years, is based on current workstation technology and therefore has benefited from the periodic upgrades in processor performance available to these systems. The dynamic image segmentation component of this system is an interactively supervised, semi-automatic surface identification and tracking system. It characterizes the endocardial and epicardial surfaces of the left ventricle as two concentric 4D hyper-space polyhedrons. Each of these polyhedrons has nearly ten thousand vertices which are deposited into a relational database. The right ventricle is also processed in a similar manner. This database is queried by other custom components which extract ventricular function parameters such as regional ejection fraction and wall stress. The interactive tool which supervises dynamic image segmentation has been enhanced with a temporal domain display. The operator interactively chooses the spatial location of the endpoints of a line segment while the corresponding space/time image is displayed. These images, with content resembling M-Mode echocardiography, benefit from electron beam computed tomography's high spatial and contrast resolution. The segmented surfaces are displayed along with the imagery. These displays give the operator valuable feedback pertaining to the contiguity of the extracted surfaces. As with M-Mode echocardiography, the velocity of moving structures can be easily visualized and measured. However, many views inaccessible to standard transthoracic echocardiography are easily generated. These features have augmented the interpretability of cine electron beam computed tomography and have prompted the recent cloning of this system into an 'omni-directional M-Mode display' system for use in digital post-processing of echocardiographic parasternal short axis tomograms. This enhances the functional assessment in orthogonal views of the left ventricle, accounting for shape changes particularly in the asymmetric post-infarction ventricle. Conclusions: A new tool has been developed for analysis and visualization of cine electron beam computed tomography. It has been found to be very useful in verifying the consistency of myocardial surface definition with a semi-automated segmentation tool. By drawing on M-Mode echocardiography experience, electron beam tomography's interpretability has been enhanced. Use of this feature, in conjunction with the existing image processing tools, will enhance the presentations of data on regional systolic and diastolic functions to clinicians in a format that is familiar to most cardiologists. Additionally, this tool reinforces the advantages of electron beam tomography as a single imaging modality for the assessment of left and right ventricular size, shape, and regional functions.
Downey, Rachel I; Hutchison, Michael G; Comper, Paul
2018-06-14
To examine the clinical utility of the Sport Concussion Assessment Tool-3 (SCAT3) in university athletes with concussion in the absence and presence of baseline data over time. Athletes with concussion (n = 23) and uninjured controls (n = 22) were prospectively evaluated at three time-points (baseline, 3-5 days, 3 weeks post-injury) with the SCAT3 components: (1) Post-Concussion Symptom Scale (PCSS); (2) Standardized Assessment of Concussion (SAC); and (3) modified Balance Error Scoring System (m-BESS). Sensitivity and specificity were calculated using reliable change indices and normative data from 458 athletes who completed baseline testing. The PCSS total symptom score yielded highest sensitivity (47.4-72.2%) and specificity (78.6-91.7%) 3-5 days post-injury, with the SAC and m-BESS demonstrating little discriminative ability when used more than 3 days post-concussion. The utility of the SCAT3 was comparable when baseline or normative data was used for predicting concussion. The SCAT is a clinically useful tool for assessing concussion in the absence or presence of baseline data within the first 3-5 days post-injury. Clinical utility of the SCAT3 was driven by symptoms, which remains consistent in the SCAT5. Future research should explore whether additional cognitive elements in the SCAT5 improve utility beyond this timeframe.
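Reliable change indices of the kind used above decide whether a post-injury change on a SCAT component exceeds measurement error; the sketch below shows the standard calculation with placeholder baseline SD and test-retest reliability values, not the study's normative figures.

```python
# Sketch of a reliable change index (RCI): (post - baseline) / SE of the
# difference, where SE_diff = sqrt(2) * SEM and SEM = SD * sqrt(1 - r).
# Baseline SD and test-retest reliability are placeholders.
import math

def reliable_change_index(baseline, post, sd_baseline, test_retest_r):
    sem = sd_baseline * math.sqrt(1.0 - test_retest_r)   # standard error of measurement
    se_diff = math.sqrt(2.0) * sem                       # SE of a difference score
    return (post - baseline) / se_diff

# Hypothetical PCSS total symptom scores (higher = worse) at baseline and 3-5 days.
rci = reliable_change_index(baseline=6, post=28, sd_baseline=8.0, test_retest_r=0.80)
print(f"RCI = {rci:.2f} -> {'reliable worsening' if rci > 1.645 else 'within noise'}")
```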
Validation of the Virtual MET as an assessment tool for executive functions.
Rand, Debbie; Basha-Abu Rukan, Soraya; Weiss, Patrice L Tamar; Katz, Noomi
2009-08-01
The purpose of this study was to establish ecological validity and initial construct validity of a Virtual Multiple Errands Test (VMET) as an assessment tool for executive functions. It was implemented within the Virtual Mall (VMall), a novel functional video-capture virtual shopping environment. The main objectives were (1) to examine the relationships between the performance of three groups of participants in the Multiple Errands Test (MET) carried out in a real shopping mall and their performance in the VMET, (2) to assess the relationships between the MET and VMET of the post-stroke participant's level of executive functioning and independence in instrumental activities of daily living, and (3) to compare the performance of post-stroke participants to those of healthy young and older controls in both the MET and VMET. The study population included three groups; post-stroke participants (n = 9), healthy young participants (n = 20), and healthy older participants (n = 20). The VMET was able to differentiate between two age groups of healthy participants and between healthy and post-stroke participants thus demonstrating that it is sensitive to brain injury and ageing and supports construct validity between known groups. In addition, significant correlations were found between the MET and the VMET for both the post-stroke participants and older healthy participants. This provides initial support for the ecological validity of the VMET as an assessment tool of executive functions. However, further psychometric data on temporal stability are needed, namely test-retest reliability and responsiveness, before it is ready for clinical application. Further research using the VMET as an assessment tool within the VMall with larger groups and in additional populations is also recommended.
Differences That Make A Difference: A Study in Collaborative Learning
NASA Astrophysics Data System (ADS)
Touchman, Stephanie
Collaborative learning is a common teaching strategy in classrooms across age groups and content areas. It is important to measure and understand the cognitive process involved during collaboration to improve teaching methods involving interactive activities. This research attempted to answer the question: why do students learn more in collaborative settings? Using three measurement tools, 142 participants from seven different biology courses at a community college and at a university were tested before and after collaborating about the biological process of natural selection. Three factors were analyzed to measure their effect on learning at the individual level and the group level. The three factors were: difference in prior knowledge, sex and religious beliefs. Gender and religious beliefs both had a significant effect on post-test scores.
Modelling information dissemination under privacy concerns in social media
NASA Astrophysics Data System (ADS)
Zhu, Hui; Huang, Cheng; Lu, Rongxing; Li, Hui
2016-05-01
Social media has recently become an important platform for users to share news, express views, and post messages. However, due to user privacy preservation in social media, many privacy setting tools are employed, which inevitably change the patterns and dynamics of information dissemination. In this study, a general stochastic model using dynamic evolution equations was introduced to illustrate how privacy concerns impact the process of information dissemination. Extensive simulations and analyses involving the privacy settings of general users, privileged users, and pure observers were conducted on real-world networks, and the results demonstrated that user privacy settings affect information dissemination differently. Finally, we also studied the process of information diffusion analytically and numerically with different privacy settings using two classic networks.
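To give a flavour of this class of model, the sketch below integrates a simple SIR-like set of evolution equations in which a fraction of users, by virtue of their privacy settings, do not propagate posts publicly. The equations and parameter values are illustrative only and are not the paper's actual formulation.

```python
# Minimal sketch of an SIR-like information-dissemination model in which a
# privacy-protected fraction of users does not forward content publicly.
import numpy as np
from scipy.integrate import odeint

beta, mu = 0.4, 0.1          # spreading rate and "stop sharing" rate
p_private = 0.3              # fraction of users whose posts are not publicly visible

def model(state, t):
    s, i, r = state          # unaware, spreaders, stiflers (fractions)
    effective_beta = beta * (1.0 - p_private)   # private users do not propagate further
    ds = -effective_beta * s * i
    di = effective_beta * s * i - mu * i
    dr = mu * i
    return [ds, di, dr]

t = np.linspace(0, 60, 300)
traj = odeint(model, [0.99, 0.01, 0.0], t)
print("final fraction ever informed:", round(1.0 - traj[-1, 0], 3))
```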
Thought Spot: Co-Creating Mental Health Solutions with Post-Secondary Students.
Wiljer, David; Johnson, Andrew; McDiarmid, Erica; Abi-Jaoude, Alexxa; Ferguson, Genevieve; Hollenberg, Elisa; van Heerwaarden, Nicole; Tripp, Tim; Law, Marcus
2017-01-01
It is difficult for the nearly 20% of Canadian 15- to 24-year olds reporting symptoms to seek the help they need within the current mental health system. Web-based and mobile health interventions are promising tools for reaching this group; having the capacity to reduce access-to-service barriers and engage youth in promoting their mental well-being. A three-phased, iterative, co-creation developmental approach was used to develop Thought Spot, a platform to better enable post-secondary students to seek mental health support. Co-creation activities included student development teams, hosting a hackathon, conducting focus groups and evidence-based workshops and student advisory groups. Evaluation results highlighted the need for greater role clarity and strategies for sustainable engagement in the co-creation process. Lessons learned are informing the project optimization phase and will be utilized to inform the design and implementation of an RCT, assessing impact on help seeking behaviour.
Extending the knowledge in histochemistry and cell biology.
Heupel, Wolfgang-Moritz; Drenckhahn, Detlev
2010-01-01
Central to modern Histochemistry and Cell Biology stands the need for visualization of cellular and molecular processes. In the past several years, a variety of techniques has been achieved bridging traditional light microscopy, fluorescence microscopy and electron microscopy with powerful software-based post-processing and computer modeling. Researchers now have various tools available to investigate problems of interest from bird's- up to worm's-eye of view, focusing on tissues, cells, proteins or finally single molecules. Applications of new approaches in combination with well-established traditional techniques of mRNA, DNA or protein analysis have led to enlightening and prudent studies which have paved the way toward a better understanding of not only physiological but also pathological processes in the field of cell biology. This review is intended to summarize articles standing for the progress made in "histo-biochemical" techniques and their manifold applications.
Downing, Katherine L; Campbell, Karen J; van der Pligt, Paige; Hesketh, Kylie D
2017-12-01
Social networking sites such as Facebook afford new opportunities for behavior-change interventions. Although often used as a recruitment tool, few studies have reported the use of Facebook as an intervention component to facilitate communication between researchers and participants. The aim of this study was to examine facilitator and participant use of a Facebook component of a community-based intervention for parents. First-time parent groups participating in the intervention arm of the extended Infant Feeding, Activity and Nutrition Trial (InFANT Extend) Program were invited to join their own private Facebook group. Facilitators mediated the Facebook groups, using them to share resources with parents, arrange group sessions, and respond to parent queries. Parents completed process evaluation questionnaires reporting on the usefulness of the Facebook groups. A total of 150 parents (from 27 first-time parent groups) joined their private Facebook group. There was a mean of 36.9 (standard deviation 11.1) posts/group, with the majority being facilitator posts. Facilitator administration posts (e.g., arranging upcoming group sessions) had the highest average comments (4.0), followed by participant health/behavior questions (3.5). The majority of participants reported that they enjoyed being a part of their Facebook group; however, the frequency of logging on to their group's page declined over the 36 months of the trial, as did their perceived usefulness of the group. Facebook appears to be a useful administrative tool in this context. Parents enjoyed being part of their Facebook group, but their reported use of and engagement with Facebook declined over time.
Automation bias: decision making and performance in high-tech cockpits.
Mosier, K L; Skitka, L J; Heers, S; Burdick, M
1997-01-01
Automated aids and decision support tools are rapidly becoming indispensable in high-technology cockpits and are assuming increasing control of "cognitive" flight tasks, such as calculating fuel-efficient routes, navigating, or detecting and diagnosing system malfunctions and abnormalities. This study was designed to investigate automation bias, a recently documented factor in the use of automated aids and decision support systems. The term refers to omission and commission errors resulting from the use of automated cues as a heuristic replacement for vigilant information seeking and processing. Glass-cockpit pilots flew flight scenarios involving automation events or opportunities for automation-related omission and commission errors. Although experimentally manipulated accountability demands did not significantly impact performance, post hoc analyses revealed that those pilots who reported an internalized perception of "accountability" for their performance and strategies of interaction with the automation were significantly more likely to double-check automated functioning against other cues and less likely to commit errors than those who did not share this perception. Pilots were also likely to erroneously "remember" the presence of expected cues when describing their decision-making processes.
Kusonmano, Kanthida; Vongsangnak, Wanwipa; Chumnanpuen, Pramote
2016-01-01
Metabolome profiling of biological systems has the powerful ability to provide a biological understanding of their metabolic functional states in response to environmental factors or other perturbations. Large volumes of metabolomics data have thus accumulated since the pre-metabolomics era, driven directly by high-throughput analytical techniques, especially mass spectrometry (MS)- and nuclear magnetic resonance (NMR)-based techniques. In parallel, a significant number of informatics techniques for data processing, statistical analysis, and data mining have been developed. The resulting tools and databases advance the metabolomics community by providing useful metabolomics information, e.g., chemical structures, mass spectrum patterns for peak identification, metabolite profiles, biological functions, dynamic metabolite changes, and biochemical transformations of thousands of small molecules. In this chapter, we introduce metabolomics studies from the pre- to the post-metabolomics era and their impact on society. Focusing on the post-metabolomics era, we provide a conceptual framework of informatics techniques for metabolomics and show useful examples of techniques, tools, and databases for metabolomics data analysis, from preprocessing through functional interpretation. The framework of informatics techniques provided here can further be used as a scaffold for translational biomedical research, which can in turn reveal new metabolite biomarkers, potential metabolic targets, or key metabolic pathways for future disease therapy.
A stochastic post-processing method for solar irradiance forecasts derived from NWPs models
NASA Astrophysics Data System (ADS)
Lara-Fanego, V.; Pozo-Vazquez, D.; Ruiz-Arias, J. A.; Santos-Alamillos, F. J.; Tovar-Pescador, J.
2010-09-01
Solar irradiance forecasting is an important area of research for the future of solar-based renewable energy systems. Numerical Weather Prediction (NWP) models have proved to be a valuable tool for solar irradiance forecasting with lead times up to a few days. Nevertheless, these models show low skill in forecasting solar irradiance under cloudy conditions. Additionally, climatological (seasonally averaged) aerosol loadings are usually assumed in these models, leading to considerable errors in Direct Normal Irradiance (DNI) forecasts during high-aerosol-load conditions. In this work we propose a post-processing method for the Global Horizontal Irradiance (GHI) and DNI forecasts derived from NWPs. In particular, the method is based on Autoregressive Moving Average models with eXogenous (external explanatory) variables (ARMAX). These stochastic models are applied to the residuals of the NWP forecasts and use as external variables the measured cloud fraction and aerosol loading of the day prior to the forecast. The method is evaluated on a one-month set of three-days-ahead GHI and DNI forecasts, obtained with the WRF mesoscale atmospheric model, for several locations in Andalusia (Southern Spain). The cloud fraction is derived from MSG satellite estimates and the aerosol loading from MODIS estimates; both sources of information are readily available at the time of the forecast. Results showed a considerable improvement in the forecasting skill of the WRF model when the proposed post-processing method is used. In particular, the relative improvement (in terms of RMSE) for DNI during summer is about 20%; a similar value is obtained for GHI during winter.
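As a rough illustration of this kind of residual post-processing (this is not the authors' code; the variable names, model order, and synthetic data are assumptions), the sketch below fits an ARMAX model to historical NWP forecast errors with statsmodels and adds the predicted residuals back onto a new raw forecast:

```python
# Minimal sketch of ARMAX post-processing of NWP irradiance forecasts, assuming hourly
# arrays of observed GHI, WRF-forecast GHI, and previous-day exogenous variables
# (cloud fraction, aerosol load). All names and numbers here are hypothetical.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

def armax_correct(ghi_obs, ghi_wrf, exog_train, exog_future, ghi_wrf_future, order=(2, 0, 1)):
    """Fit an ARMAX model to historical forecast residuals and correct a new forecast."""
    residuals = ghi_obs - ghi_wrf                          # NWP forecast errors on the training period
    fit = SARIMAX(residuals, exog=exog_train, order=order).fit(disp=False)
    # Predict the residuals over the forecast horizon, then add them back to the raw forecast.
    predicted_residuals = fit.forecast(steps=len(ghi_wrf_future), exog=exog_future)
    return ghi_wrf_future + predicted_residuals

# Hypothetical usage with synthetic data: 10 days of hourly values, last day corrected.
rng = np.random.default_rng(0)
ghi_wrf = rng.uniform(100, 800, 240)                       # raw WRF forecasts
ghi_obs = ghi_wrf + rng.normal(0, 50, 240)                 # pretend observations
exog = rng.uniform(0, 1, (240, 2))                         # previous-day cloud fraction, AOD
corrected = armax_correct(ghi_obs[:216], ghi_wrf[:216], exog[:216], exog[216:], ghi_wrf[216:])
```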
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-02
..., HUD. ACTION: Notice extension of deadline. SUMMARY: On August 26, 2010, HUD posted on http://www...--Round 1. Today's Federal Register notice announces that HUD has posted on http://www.Grants.gov a... to the Mapping Tool used to determine neighborhood eligibility. HUD will post on http://www.Grants...
Using remote sensing to monitor post-fire watershed recovery as a tool for management
Jess Clark; Marc Stamer; Kevin Cooper; Carolyn Napper; Terri Hogue; Alicia Kinoshita
2013-01-01
Post-fire watershed recovery is influenced by numerous variables but one of the most important factors is the rate of re-establishment of vegetative cover. Burned Area Emergency Response (BAER) teams, along with other agencies (Natural Resource Conservation Service, state, counties, cities, etc.), prescribe temporary post-fire mitigation treatments based on expected...
Sepulveda, Ana R; Wise, Caroline; Zabala, Maria; Todd, Gill; Treasure, Janet
2013-12-01
The aims of this study were to develop an eating disorder scenarios tool to assess the motivational interviewing (MI) skills of caregivers and evaluate the coding reliability of the instrument, and to test the sensitivity to change through a pre/post/follow-up design. The resulting Motivational Interview Scenarios Tool for Eating Disorders (MIST-ED) was administered to caregivers (n = 66) who were asked to provide oral and written responses before and after a skills-based intervention, and at a 3-month follow-up. Raters achieved excellent inter-rater reliability (intra-class correlations of 91.8% on MI adherent and 86.1% for MI non-adherent statements for written scenarios and 89.2%, and 85.3% for oral scenarios). Following the intervention, MI adherent statements increased (baseline = 9.4%, post = 61.5% and follow-up 47.2%) and non-MI adherent statements decreased (baseline = 90.6%, post = 38.5% and follow-up = 52.8%). This instrument can be used as a simple method to measure the acquisition of MI skills to improve coping and both response methods are adequate. The tool shows good sensitivity to improved skills. © 2013.
NASA Astrophysics Data System (ADS)
Telasang, Gururaj; Dutta Majumdar, Jyotsna; Wasekar, Nitin; Padmanabham, G.; Manna, Indranil
2015-05-01
This study reports a detailed investigation of the microstructure and mechanical properties (wear resistance and tensile strength) of a hardened and tempered AISI H13 tool steel substrate following laser cladding with AISI H13 tool steel powder, both as-clad and after post-cladding conventional bulk isothermal tempering [at 823 K (550 °C) for 2 hours]. Laser cladding was carried out on the AISI H13 tool steel substrate using a 6 kW continuous wave diode laser coupled with a fiber, delivering an energy density of 133 J/mm2 and equipped with a co-axial powder feeding nozzle capable of feeding powder at a rate of 13.3 × 10-3 g/mm2. The laser clad zone comprises martensite, retained austenite, and carbides, with an average hardness of 600 to 650 VHN. Subsequent isothermal tempering converted the microstructure into tempered martensite with a uniform dispersion of carbides and a hardness of 550 to 650 VHN. Interestingly, laser cladding introduced a residual compressive stress of 670 ± 15 MPa, which reduced to 580 ± 20 MPa following isothermal tempering. Micro-tensile testing with specimens machined from the clad zone across (transverse to) the cladding direction showed high strength but failure in a brittle mode. On the other hand, similar testing with samples sectioned from the clad zone parallel (longitudinal) to the direction of laser cladding, prior to and after post-cladding tempering, recorded lower strength but ductile failure with 4.7 and 8 pct elongation, respectively. The wear resistance of the laser-clad and post-cladding tempered samples (evaluated by fretting wear testing) was superior to that of conventional hardened and tempered AISI H13 tool steel.
NASA Astrophysics Data System (ADS)
McEver, Jimmie; Davis, Paul K.; Bigelow, James H.
2000-06-01
We have developed and used families of multiresolution and multiple-perspective models (MRM and MRMPM), both in our substantive analytic work for the Department of Defense and to learn more about how such models can be designed and implemented. This paper is a brief case history of our experience with a particular family of models addressing the use of precision fires in interdicting and halting an invading army. Our models were implemented as closed-form analytic solutions, in spreadsheets, and in the more sophisticated AnalyticaTM environment. We also drew on an entity-level simulation for data. The paper reviews the importance of certain key attributes of development environments (visual modeling, interactive languages, friendly use of array mathematics, facilities for experimental design and configuration control, statistical analysis tools, graphical visualization tools, interactive post-processing, and relational database tools). These can go a long way towards facilitating MRMPM work, but many of these attributes are not yet widely available (or available at all) in commercial model-development tools--especially for use with personal computers. We conclude with some lessons learned from our experience.
The Persistence of Mode 1 Technology in the Korean Late Paleolithic
Lee, Hyeong Woo
2013-01-01
Ssangjungri (SJ), an open-air site with several Paleolithic horizons, was recently discovered in South Korea. Most of the identified artifacts are simple core and flake tools that indicate an expedient knapping strategy. Bifacially worked core tools, which might be considered non-classic bifaces, have also been found. The prolific horizons at the site were dated by accelerator mass spectrometry (AMS) to about 30 kya. Another newly discovered Paleolithic open-air site, Jeungsan (JS), shows a homogeneous lithic pattern during this period. The dominant artifact types and raw material usage are similar in character to those from SJ, although JS yielded a larger number of simple core and flake tools with non-classic bifaces. Chronometric analyses by AMS and optically stimulated luminescence (OSL) indicate that the prime stratigraphic levels at JS also date to approximately 30 kya, and the numerous conjoining pieces indicate that the layers were not seriously affected by post-depositional processes. Thus, it can be confirmed that simple core and flake tools were produced at temporally and culturally independent sites until after 30 kya, supporting the hypothesis of a wide and persistent use of simple technology into the Late Pleistocene. PMID:23724113
Novel texture-based descriptors for tool wear condition monitoring
NASA Astrophysics Data System (ADS)
Antić, Aco; Popović, Branislav; Krstanović, Lidija; Obradović, Ratko; Milošević, Mijodrag
2018-01-01
All state-of-the-art tool condition monitoring (TCM) systems used for the tool wear recognition task, especially those that use vibration sensors, depend heavily on the choice of descriptors extracted from the sensor signals that carry information about the tool wear state. Subsequent post-processing techniques cannot increase the recognition precision if those descriptors are not discriminative enough. In this work, we propose a tool wear monitoring strategy that relies on novel texture-based descriptors. We consider the modulus of the Short-Term Discrete Fourier Transform (STDFT) spectrum obtained from a given vibration sensor signal utterance as a 2D textured image, identifying the time scale of the STDFT as the first dimension and the frequency scale as the second dimension of the image. The obtained textured image is then divided into 2D texture patches, each covering a part of the frequency range of interest. After applying an appropriate filter bank, 2D textons are extracted for each predefined frequency band. By averaging in time, we extract from the textons of each band of interest information about the Probability Density Function (PDF) in the form of lower-order moments, thus obtaining robust tool wear state descriptors. We validate the proposed features through experiments conducted on a real TCM system, obtaining high recognition accuracy.
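To make the descriptor idea concrete, here is a simplified Python sketch under stated assumptions: it treats the spectrogram magnitude as a 2D texture, splits it into contiguous frequency bands, and summarizes each band with low-order moments. The filter-bank/texton step of the actual method is omitted, and the band count, window length, and test signal are hypothetical.

```python
# Simplified texture-style descriptor for a vibration signal: spectrogram as a 2D image,
# band-wise patches, low-order moments per patch. Not the authors' implementation.
import numpy as np
from scipy.signal import spectrogram

def tool_wear_descriptor(signal, fs, n_bands=8, nperseg=256):
    """Return a per-band vector of low-order moments of the spectrogram magnitude."""
    f, t, Sxx = spectrogram(signal, fs=fs, nperseg=nperseg)    # time-frequency "texture" image
    band_indices = np.array_split(np.arange(len(f)), n_bands)  # contiguous frequency-band patches
    features = []
    for idx in band_indices:
        values = Sxx[idx, :].ravel()                            # 2D patch: band x time
        mean = values.mean()
        features.extend([mean,
                         values.std(),
                         ((values - mean) ** 3).mean()])        # mean, spread, skew-like moment
    return np.asarray(features)

# Hypothetical usage on a synthetic 1-second "vibration" trace sampled at 20 kHz
fs = 20_000
sig = np.random.default_rng(1).normal(size=fs)
descriptor = tool_wear_descriptor(sig, fs)
```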
González-Ferrer, Arturo; Valcárcel, M Ángel; Cuesta, Martín; Cháfer, Joan; Runkle, Isabelle
2017-07-01
Hyponatremia is the most common type of electrolyte imbalance, occurring when serum sodium falls below threshold levels, typically 135 mmol/L. Electrolyte balance has been identified as one of the most challenging subjects for medical students, but also as one of the most relevant areas to learn about according to physicians and researchers. We present a computer-interpretable guideline (CIG) model that will be used in medical training to teach how to improve the diagnosis of hyponatremia by applying an expert consensus document (ECD). We used the PROForma set of tools to develop the model, following an iterative process involving two knowledge engineers (a computer science Ph.D. and a preventive medicine specialist) and two expert endocrinologists. We also carried out an initial validation of the model and a qualitative post-analysis of the results of a retrospective study (N=65 patients), comparing the consensus diagnosis of two experts with the output of the tool. The model includes over two hundred "for", "against" and "neutral" arguments that are selectively triggered depending on the input values of more than forty patient-state variables. We share the methodology followed for the development process and the initial validation results, which achieved a high ratio of 61/65 agreements with the consensus diagnosis, with a kappa value of K=0.86 for overall agreement and K=0.80 for first-ranked agreement. Hospital care professionals involved in the project showed high expectations of using this tool for training, but the process to follow for successful diagnosis and application is not trivial, as reported in this manuscript. Secondary benefits of using these tools are associated with improving research knowledge and existing clinical practice guidelines (CPGs) or ECDs. Beyond point-of-care clinical decision support, knowledge-based decision support systems are very attractive as training tools, helping selected professionals better understand difficult diseases that are underdiagnosed and/or incorrectly managed. Copyright © 2017 Elsevier B.V. All rights reserved.
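As a toy illustration of the argument-based structure described above (this is neither PROForma nor the authors' model; the variables, thresholds, and arguments are invented teaching examples only), candidate diagnoses could be scored by summing the weights of "for", "against", and "neutral" arguments that trigger on patient-state variables:

```python
# Hypothetical argument-scoring sketch; clinical content is illustrative, not guidance.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class Argument:
    candidate: str                      # diagnosis the argument applies to
    weight: int                         # +1 "for", -1 "against", 0 "neutral"
    condition: Callable[[Dict], bool]   # triggers on patient-state variables

ARGUMENTS = [
    Argument("SIADH", +1, lambda p: p["urine_osmolality"] > 100),
    Argument("SIADH", -1, lambda p: p["volume_status"] == "hypovolemic"),
    Argument("hypovolemic hyponatremia", +1, lambda p: p["volume_status"] == "hypovolemic"),
]

def rank_candidates(patient: Dict) -> Dict[str, int]:
    """Sum the weights of all triggered arguments per candidate diagnosis."""
    scores: Dict[str, int] = {}
    for arg in ARGUMENTS:
        if arg.condition(patient):
            scores[arg.candidate] = scores.get(arg.candidate, 0) + arg.weight
    return dict(sorted(scores.items(), key=lambda kv: kv[1], reverse=True))

print(rank_candidates({"urine_osmolality": 450, "volume_status": "euvolemic"}))
```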
Mobility scores as a predictor of length of stay in general surgery: a prospective cohort study.
Carroll, Georgia M; Hampton, Jacob; Carroll, Rosemary; Smith, Stephen R
2018-05-22
Post-operative length of stay (LOS) is an increasingly important clinical indicator in general surgery. Despite this, no tool has been validated to predict LOS or readiness for discharge in general surgical patients. The de Morton Mobility Index (DEMMI) is a functional mobility assessment tool that has been validated in rehabilitation patient populations. In this prospective cohort study, we aimed to identify whether trends in DEMMI scores were associated with discharge within 1 week and overall LOS in general surgical patients. A total of 161 patients who underwent elective gastrointestinal resections were included. DEMMI scores were recorded preoperatively and on post-operative days 1, 2, 3, and 30. Statistical analysis was performed to identify any association between DEMMI scores and discharge within 1 week and LOS. Functional recovery (defined as achieving 80% of the baseline DEMMI score by post-operative day 1) was significantly associated with discharge within 1 week. Presence of a stoma was associated with longer LOS. The area under the receiver operating characteristic curve using functional recovery on post-operative day 1 as a predictor of discharge within 1 week was 0.772. The DEMMI is a fast, easy, and useful tool for predicting, on post-operative day 1, discharge within 1 week. Its utility is to act as an anticipatory trigger for more proactive and efficient discharge planning in the early post-operative period, and there is potential to use the DEMMI as a comparator in clinical trials to assess functional recovery. © 2018 Royal Australasian College of Surgeons.
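A minimal sketch of the reported predictor, using synthetic data and hypothetical variable names (this is not the study's analysis code): flag functional recovery when the day-1 DEMMI score reaches 80% of the preoperative baseline, then evaluate it against discharge within one week via the ROC AUC.

```python
# Synthetic example of the 80%-of-baseline rule and its ROC AUC; numbers are made up.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
baseline = rng.integers(50, 100, 161)                       # preoperative DEMMI scores
day1 = baseline * rng.uniform(0.4, 1.1, 161)                # post-operative day-1 scores
discharged_within_week = (day1 / baseline + rng.normal(0, 0.2, 161)) > 0.8

functional_recovery = (day1 >= 0.8 * baseline).astype(int)  # the 80%-of-baseline predictor
auc = roc_auc_score(discharged_within_week, functional_recovery)
print(f"AUC for day-1 functional recovery as a predictor: {auc:.3f}")
```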
NASA Astrophysics Data System (ADS)
Schatz, A.; Pantel, D.; Hanemann, T.
2017-09-01
Integration of lead zirconate titanate (Pb[Zrx,Ti1-x]O3 - PZT) thin films on complementary metal-oxide semiconductor substrates (CMOS) is difficult due to the usually high crystallization temperature of the piezoelectric perovskite PZT phase, which harms the CMOS circuits. In this work, a wafer-scale pulsed laser deposition tool was used to grow 1 μm thick PZT thin films on 150 mm diameter silicon wafers. Three different routes towards a post-CMOS compatible deposition process were investigated, maintaining a post-CMOS compatible thermal budget limit of 445 °C for 1 h (or 420 °C for 6 h). By crystallizing the perovskite LaNiO3 seed layer at 445 °C, the PZT deposition temperature can be lowered to below 400 °C, yielding a transverse piezoelectric coefficient e31,f of -9.3 C/m2. With the same procedure, applying a slightly higher PZT deposition temperature of 420 °C, an e31,f of -10.3 C/m2 can be reached. The low leakage current density of below 3 × 10-6 A/cm2 at 200 kV/cm allows for application of the post-CMOS compatible PZT thin films in low power micro-electro-mechanical-systems actuators.
Chung, Heaseung S.; Wang, Sheng-Bing; Venkatraman, Vidya; Murray, Christopher I.; Van Eyk, Jennifer E.
2014-01-01
In the cardiovascular system, changes in the oxidative balance can affect many aspects of cellular physiology through redox-signaling. Depending on their magnitude, fluctuations in the cell's production of reactive oxygen and nitrogen species can regulate normal metabolic processes, activate protective mechanisms, or be cytotoxic. Reactive oxygen and nitrogen species can have many effects, including the post-translational modification of proteins at critical cysteine (Cys) thiols. A subset can act as redox-switches, which elicit functional effects in response to changes in oxidative state. While the general concepts of redox-signaling have been established, the identity and function of many regulatory switches remain unclear. Characterizing the effects of individual modifications is the key to understanding how the cell interprets oxidative signals under physiological and pathological conditions. Here, we review the various Cys oxidative post-translational modifications (Ox-PTMs) and their ability to function as redox-switches that regulate the cell's response to oxidative stimuli. In addition, we discuss how these modifications have the potential to influence other post-translational modification signaling pathways through cross-talk. Finally, we review the growing number of tools being developed to identify and quantify the various Cys Ox-PTMs and how this will advance our understanding of redox-regulation. PMID:23329793
A Decade of Monitoring HIV Epidemics in Nigeria: Positioning for Post-2015 Agenda.
Akinwande, Oluyemisi; Bashorun, Adebobola; Azeez, Aderemi; Agbo, Francis; Dakum, Patrick; Abimiku, Alashle; Bilali, Camara; Idoko, John; Ogungbemi, Kayode
2017-07-01
Nigeria accounts for 9% of the global HIV burden and is a signatory to the Millennium Development Goals as well as the post-2015 Sustainable Development Goals. This paper reviews the maturation of her HIV M&E system and preparedness for monitoring the post-2015 agenda. Using the UNAIDS criteria for assessing a functional M&E system, a mixed-methods approach of desk review and expert consultations was employed. Following adoption of a multi-sectoral M&E system, Nigeria experienced improved HIV coordination at the National and State levels, capacity building for epidemic appraisals, spectrum estimation, and routine data quality assessments. National data and systems audit processes were instituted, which informed harmonization of tools and indicators. The M&E achievements of the HIV response enhanced the performance of the National Health Management Information System (NHMIS) using the DHIS2 platform following its re-introduction by the Federal Ministry of Health, and also enabled decentralization of data management to the periphery. A decade of implementing the National HIV M&E framework in Nigeria and the recent adoption of DHIS2 provide a strong base for monitoring the post-2015 agenda. There is, however, a need to strengthen inter-sectoral data linkages and reduce the rising burden of data collection at the global level.
Storm Water BMP Tool Implementation Testing
DOT National Transportation Integrated Search
2017-12-01
Under project 2015-ORIL 7, a screening tool was developed to assist local communities with selecting post-construction storm water best management practices (BMPs) to comply with the Ohio Environmental Protection Agency's (Ohio EPA) statewide Const...
Huang, Ming-Xiong; Anderson, Bill; Huang, Charles W.; Kunde, Gerd J.; Vreeland, Erika C.; Huang, Jeffrey W.; Matlashov, Andrei N.; Karaulanov, Todor; Nettles, Christopher P.; Gomez, Andrew; Minser, Kayla; Weldon, Caroline; Paciotti, Giulio; Harsh, Michael; Lee, Roland R.; Flynn, Edward R.
2017-01-01
Superparamagnetic Relaxometry (SPMR) is a highly sensitive technique for the in vivo detection of tumor cells and may improve early-stage detection of cancers. SPMR employs superparamagnetic iron oxide nanoparticles (SPION). After a brief magnetizing pulse is used to align the SPION, SPMR measures the time decay of the SPION using Superconducting Quantum Interference Device (SQUID) sensors. Substantial research has been carried out in developing the SQUID hardware and in improving the properties of the SPION. However, little research has been done on the pre-processing of sensor signals and the post-processing source modeling in SPMR. In the present study, we illustrate new pre-processing tools that were developed to: 1) remove trials contaminated with artifacts, 2) evaluate and ensure that a single decay process associated with bound SPION exists in the data, 3) automatically detect and correct flux jumps, and 4) accurately fit the sensor signals with different decay models. Furthermore, we developed an automated approach based on a multi-start dipole imaging technique to obtain the locations and magnitudes of multiple magnetic sources without initial guesses from the users. A regularization process was implemented to solve the ambiguity issue related to the SPMR source variables. A procedure based on a reduced chi-square cost function was introduced to objectively obtain the adequate number of dipoles that describe the data. The new pre-processing tools and multi-start source imaging approach have been successfully evaluated using phantom data. In conclusion, these tools and the multi-start source modeling approach substantially enhance the accuracy and sensitivity in detecting and localizing sources from SPMR signals. Furthermore, the multi-start approach with regularization provided robust and accurate solutions under a poor-SNR condition corresponding to an SPMR detection sensitivity on the order of 1000 cells. We believe such algorithms will help establish industrial standards for SPMR when applying the technique in pre-clinical and clinical settings. PMID:28072579
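As a small sketch of the decay-model fitting step (not the authors' pipeline; the logarithmic decay form, sampling rate, and noise level are assumptions), one candidate model can be fitted to a sensor trace with non-linear least squares and checked with a reduced chi-square:

```python
# Fit an assumed decay model to a synthetic SPMR sensor trace with non-linear least squares.
import numpy as np
from scipy.optimize import curve_fit

def log_decay(t, a, t1, c):
    """Logarithmic-style decay a * ln(1 + t1 / t) + c (an assumed model form)."""
    return a * np.log(1.0 + t1 / t) + c

# Synthetic trace: 2 s after the magnetizing pulse, with hypothetical Gaussian sensor noise.
t = np.linspace(0.05, 2.0, 400)
sigma = 0.05
trace = log_decay(t, 5.0, 0.3, 0.1) + np.random.default_rng(3).normal(0, sigma, t.size)

params, cov = curve_fit(log_decay, t, trace, p0=[1.0, 0.1, 0.0])
residual = trace - log_decay(t, *params)
chi2_reduced = np.sum((residual / sigma) ** 2) / (t.size - len(params))  # goodness-of-fit check
```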
"Easy-on, Easy-off" Blanket Fastener
NASA Technical Reports Server (NTRS)
Kolecki, Ronald E.; Clatterbuck, Carroll H.
1992-01-01
Fasteners hold flexible blanket on set of posts on supporting structure. Disk of silicone rubber cast on disk of Mylar, fastened to blanket and press-fit over post to nest securely in groove. No tools needed for installation or removal.
Wood, Jennifer P; Connelly, Denise M; Maly, Monica R
2010-11-01
To examine the process of community reintegration over the first year following stroke, from the patient's perspective. Qualitative, longitudinal, grounded theory study involving ten participants. During the first year post discharge from inpatient rehabilitation, 46 one-on-one semi-structured interviews were conducted with ten participants. Interviews were completed with participants before discharge from inpatient stroke rehabilitation and in their homes at two weeks, three months, six months and one year post discharge. Analysis was guided by grounded theory methods described by Corbin and Strauss. Four women and six men (mean age 59.6 ± 18.0, all with left hemiparesis and without aphasia) who had sustained their first hemispheric stroke and were returning to the community following inpatient rehabilitation. The process of community reintegration after stroke involved transitioning through a series of goals: gaining physical function, establishing independence, adjusting expectations and getting back to real living. The ultimate challenge for stroke survivors during this process of community reintegration was to create a balance between their expectations of themselves and their physical capacity to engage in meaningful roles. Over the first year after stroke, participants reported that the process of community reintegration was marked by ongoing changes in their goals. Formal and informal caregivers need to work with stroke survivors living in the community to facilitate realistic and achievable goal setting. Tools which identify meaningful activities should also be incorporated to provide stroke survivors with the opportunity to contribute and engage with others in the community.
Teachers' Perspectives on Digital Tools for Pedagogic Planning and Design
ERIC Educational Resources Information Center
Masterman, Elizabeth; Manton, Marion
2011-01-01
The authors introduce the concept of design support tools and situate them in the pedagogic context of professional development for technology-enhanced learning (TEL) and the research field of learning design. Through focusing on the development and evaluation of one such tool, Phoebe, they discuss their value to lecturers in post-compulsory…
Interagency Transition Team Development and Facilitation. Essential Tools.
ERIC Educational Resources Information Center
Stodden, Robert A.; Brown, Steven E.; Galloway, L. M.; Mrazek, Susan; Noy, Liora
2005-01-01
The purpose of this Essential Tool is to assist state-level transition coordinators and others responsible for forming, conducting, and evaluating the performance of interagency transition teams that are focused upon the school and post-school needs of youth with disabilities. This Essential Tool is designed to guide the coordination efforts of…
ERIC Educational Resources Information Center
Hudson, Chloe C.; Lambe, Laura; Pepler, Debra J.; Craig, Wendy M.
2016-01-01
The current study explored online preventive coping (privacy settings) and reactive coping (reporting tools) among youth and how the use of these online safety tools related to the frequency of cybervictimization. Surveys were administered to youth in elementary, secondary, and post-secondary school. Results indicated that the prevalence of…
Optical scheme for simulating post-quantum nonlocality distillation.
Chu, Wen-Jing; Yang, Ming; Pan, Guo-Zhu; Yang, Qing; Cao, Zhuo-Liang
2016-11-28
An optical scheme for simulating nonlocality distillation is proposed in the post-quantum regime. The nonlocal boxes are simulated by measurements on appropriately pre- and post-selected polarization-entangled photon pairs, i.e. post-quantum nonlocality is simulated by exploiting the fair-sampling loophole in a Bell test. Mod 2 addition on the outputs of two nonlocal boxes, combined with pre- and post-selection operations, constitutes the key operation for simulating nonlocality distillation. This scheme provides a possible tool for the experimental study of nonlocality in the post-quantum regime and of the exact physical principle precisely distinguishing physically realizable correlations from nonphysical ones.
An analysis of post-event processing in social anxiety disorder.
Brozovich, Faith; Heimberg, Richard G
2008-07-01
Research has demonstrated that self-focused thoughts and negative affect have a reciprocal relationship [Mor, N., Winquist, J. (2002). Self-focused attention and negative affect: A meta-analysis. Psychological Bulletin, 128, 638-662]. In the anxiety disorder literature, post-event processing has emerged as a specific construction of repetitive self-focused thoughts that pertain to social anxiety disorder. Post-event processing can be defined as an individual's repeated consideration and potential reconstruction of his performance following a social situation. Post-event processing can also occur when an individual anticipates a social or performance event and begins to brood about other, past social experiences. The present review examined the post-event processing literature in an attempt to organize and highlight the significant results. The methodologies employed to study post-event processing have included self-report measures, daily diaries, social or performance situations created in the laboratory, and experimental manipulations of post-event processing or anticipation of an upcoming event. Directions for future research on post-event processing are discussed.
Boundary Layer Transition Results From STS-114
NASA Technical Reports Server (NTRS)
Berry, Scott A.; Horvath, Thomas J.; Cassady, Amy M.; Kirk, Benjamin S.; Wang, K. C.; Hyatt, Andrew J.
2006-01-01
The tool for predicting the onset of boundary layer transition from damage to and/or repair of the thermal protection system developed in support of Shuttle Return to Flight is compared to the STS-114 flight results. The Boundary Layer Transition (BLT) Tool is part of a suite of tools that analyze the aerothermodynamic environment of the local thermal protection system to allow informed disposition of damage for making recommendations to fly as is or to repair. Using mission specific trajectory information and details of each damage site or repair, the expected time of transition onset is predicted to help determine the proper aerothermodynamic environment to use in the subsequent thermal and stress analysis of the local structure. The boundary layer transition criteria utilized for the tool was developed from ground-based measurements to account for the effect of both protuberances and cavities and has been calibrated against flight data. Computed local boundary layer edge conditions provided the means to correlate the experimental results and then to extrapolate to flight. During STS-114, the BLT Tool was utilized and was part of the decision making process to perform an extravehicular activity to remove the large gap fillers. The role of the BLT Tool during this mission, along with the supporting information that was acquired for the on-orbit analysis, is reviewed. Once the large gap fillers were removed, all remaining damage sites were cleared for reentry as is. Post-flight analysis of the transition onset time revealed excellent agreement with BLT Tool predictions.
Multidimensional proteomics for cell biology.
Larance, Mark; Lamond, Angus I
2015-05-01
The proteome is a dynamic system in which each protein has interconnected properties - dimensions - that together contribute to the phenotype of a cell. Measuring these properties has proved challenging owing to their diversity and dynamic nature. Advances in mass spectrometry-based proteomics now enable the measurement of multiple properties for thousands of proteins, including their abundance, isoform expression, turnover rate, subcellular localization, post-translational modifications and interactions. Complementing these experimental developments are new data analysis, integration and visualization tools as well as data-sharing resources. Together, these advances in the multidimensional analysis of the proteome are transforming our understanding of various cellular and physiological processes.
Data compression for sequencing data
2013-01-01
Post-Sanger sequencing methods produce tons of data, and there is a general agreement that the challenge to store and process them must be addressed with data compression. In this review we first answer the question “why compression” in a quantitative manner. Then we also answer the questions “what” and “how”, by sketching the fundamental compression ideas, describing the main sequencing data types and formats, and comparing the specialized compression algorithms and tools. Finally, we go back to the question “why compression” and give other, perhaps surprising answers, demonstrating the pervasiveness of data compression techniques in computational biology. PMID:24252160
Giustiniano, Mariateresa; Basso, Andrea; Mercalli, Valentina; Massarotti, Alberto; Novellino, Ettore; Tron, Gian Cesare; Zhu, Jieping
2017-03-06
The term functionalized isocyanides refers to all those isocyanides in which a neighbouring functional group can finely tune the reactivity of the isocyano group or can be exploited in post-functionalization processes. In this manuscript, we have reviewed all the isocyanides in which the pendant functional group either causes deviation from or reinforces the normal reactivity of the isocyano group, and categorized them to highlight their common features and differences. An analysis of their synthetic potential and of possible unexplored directions for future research is also presented.
4D imaging for target definition in stereotactic radiotherapy for lung cancer.
Slotman, Ben J; Lagerwaard, Frank J; Senan, Suresh
2006-01-01
Stereotactic radiotherapy of Stage I lung tumors has been reported to result in high local control rates that are far superior to those obtained with conventional radiotherapy techniques, and which approach those achieved with primary surgery. Breathing-induced motion of tumor and target tissues is an important issue in this technique and careful attention should be paid to the contouring and the generation of individualized margins. We describe our experience with the use of 4DCT scanning for this group of patients, the use of post-processing tools and the potential benefits of respiratory gating.
Big data sharing and analysis to advance research in post-traumatic epilepsy.
Duncan, Dominique; Vespa, Paul; Pitkanen, Asla; Braimah, Adebayo; Lapinlampi, Nina; Toga, Arthur W
2018-06-01
We describe the infrastructure and functionality of a centralized preclinical and clinical data repository and analytic platform to support importing heterogeneous multi-modal data, automatically and manually linking data across modalities and sites, and searching content. We have developed and applied innovative image and electrophysiology processing methods to identify candidate biomarkers from MRI, EEG, and multi-modal data. Based on heterogeneous biomarkers, we present novel analytic tools designed to study epileptogenesis in animal models and humans, with the goal of tracking the probability of developing epilepsy over time. Copyright © 2017. Published by Elsevier Inc.
GIS embedded hydrological modeling: the SID&GRID project
NASA Astrophysics Data System (ADS)
Borsi, I.; Rossetto, R.; Schifani, C.
2012-04-01
The SID&GRID research project, started April 2010 and funded by Regione Toscana (Italy) under the POR FSE 2007-2013, aims to develop a Decision Support System (DSS) for water resource management and planning based on open source and public domain solutions. In order to quantitatively assess water availability in space and time and to support the planning decision processes, the SID&GRID solution consists of hydrological models (coupling 3D existing and newly developed surface-water, groundwater and unsaturated zone modeling codes) embedded in a GIS interface, applications and library, where all the input and output data are managed by means of a DataBase Management System (DBMS). A graphical user interface (GUI) to manage, analyze and run the SID&GRID hydrological models, based on the open source gvSIG GIS framework (Asociación gvSIG, 2011), and a Spatial Data Infrastructure to share and interoperate with distributed geographical data are being developed. Such a GUI is conceived as a "master control panel" able to guide the user through pre-processing spatial and temporal data, running the hydrological models, and analyzing the outputs. To achieve the above-mentioned goals, the following codes have been selected and are being integrated: 1. PostgreSQL/PostGIS (PostGIS, 2011) for the geo-database management system; 2. gvSIG with Sextante (Olaya, 2011) geo-algorithm library capabilities and GRASS tools (GRASS Development Team, 2011) for the desktop GIS; 3. Geoserver and Geonetwork to share and discover spatial data on the web according to Open Geospatial Consortium standards; 4. new tools based on the Sextante GeoAlgorithm framework; 5. the MODFLOW-2005 (Harbaugh, 2005) groundwater modeling code; 6. MODFLOW-LGR (Mehl and Hill 2005) for local grid refinement; 7. VSF (Thoms et al., 2006) for the variably saturated flow component; 8. newly developed routines for overland flow; 9. new algorithms in Jython integrated in gvSIG to compute the net rainfall rate reaching the soil surface, as input for the unsaturated/saturated flow model. At this stage of the research (which will end April 2013), two primary components of the master control panel are being developed: i. a SID&GRID toolbar integrated into the gvSIG map context; ii. a new set of Sextante geo-algorithms to pre- and post-process the spatial data to run the hydrological models. The groundwater part of the code has been fully integrated and tested, and 3D visualization tools are being developed. The LGR capability has been extended to the 3D solution of Richards' equation in order to solve in detail the unsaturated zone where required. To be updated about the project, please follow us at the website: http://ut11.isti.cnr.it/SIDGRID/
Additive Manufacturing of Fuel Injectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadek Tadros, Dr. Alber Alphonse; Ritter, Dr. George W.; Drews, Charles Donald
Additive manufacturing (AM), also known as 3D-printing, has been shifting from a novelty prototyping paradigm to a legitimate manufacturing tool capable of creating components for highly complex engineered products. An emerging AM technology for producing metal parts is the laser powder bed fusion (L-PBF) process; however, industry manufacturing specifications and component design practices for L-PBF have not yet been established. Solar Turbines Incorporated (Solar), an industrial gas turbine manufacturer, has been evaluating AM technology for development and production applications with the desire to enable accelerated product development cycle times, overall turbine efficiency improvements, and supply chain flexibility relative to conventional manufacturing processes (casting, brazing, welding). Accordingly, Solar teamed with EWI on a joint two-and-a-half-year project with the goal of developing a production L-PBF AM process capable of consistently producing high-nickel alloy material suitable for high temperature gas turbine engine fuel injector components. The project plan tasks were designed to understand the interaction of the process variables and their combined impact on the resultant AM material quality. The composition of the high-nickel alloy powders selected for this program met the conventional cast Hastelloy X compositional limits and were commercially available in different particle size distributions (PSD) from two suppliers. Solar produced all the test articles and both EWI and Solar shared responsibility for analyzing them. The effects of powder metal input stock, laser parameters, heat treatments, and post-finishing methods were evaluated. This process knowledge was then used to generate tensile, fatigue, and creep material properties data curves suitable for component design activities. The key process controls for ensuring consistent material properties were documented in AM powder and process specifications. The basic components of the project were: • Powder metal input stock: Powder characterization, dimensional accuracy, metallurgical characterization, and mechanical properties evaluation. • Process parameters: Laser parameter effects, post-printing heat-treatment development, mechanical properties evaluation, and post-finishing technique. • Material design curves: Room and elevated temperature tensiles, low cycle fatigue, and creep rupture properties curves generated. • AM specifications: Key metal powder characteristics, laser parameters, and heat-treatment controls identified.
Fokkens, Andrea S; Groothoff, Johan W; van der Klink, Jac J L; Popping, Roel; Stewart, Roy E; van de Ven, Lex; Brouwer, Sandra; Tuinstra, Jolanda
2015-09-01
An assessment tool was developed to assess disability in veterans who suffer from post-traumatic stress disorder (PTSD) due to a military mission. The objective of this study was to determine the reliability, intra-rater and inter-rater variation of the Mental Disability Military (MDM) assessment tool. Twenty-four assessment interviews of veterans with an insurance physician were videotaped. Each videotaped interview was assessed by a group of five independent raters on limitations of the veterans using the MDM assessment tool. After 2 months the raters repeated this procedure. Next the intra-rater and inter-rater variation was assessed with an adjusted version of AG09 computing weighted percentage agreement. The results of this study showed that both the intra-rater variation and inter-rater variation on the ten subcategories of the MDM assessment tool were small, with an agreement of 84-100% within raters and 93-100% between raters. The MDM assessment tool proves to be a reliable instrument to measure PTSD limitations in functioning in Dutch military veterans who apply for disability compensation. Further research is needed to assess the validity of this instrument.
The Development and Application of the RAND Program Classification Tool. The RAND Toolkit, Volume 1
2014-01-01
Questionnaire excerpt: "…following outcome data (used to identify the results of a program's efforts)? (More than one may be selected.) Pretest/baseline only; Posttest only; Pre-post; Pre-post with comparison group; Randomized controlled trial."
Diagnosing alternative conceptions of Fermi energy among undergraduate students
NASA Astrophysics Data System (ADS)
Sharma, Sapna; Ahluwalia, Pardeep Kumar
2012-07-01
Physics education researchers have established that the understanding of new concepts and the interpretation of incoming information are strongly influenced by the preexisting knowledge and beliefs of students, called epistemological beliefs. This can lead to a gap between what students actually learn and what the teacher expects them to learn. In the classroom, it is desirable that the teacher tries to bridge this gap, at least for the key concepts of the field being taught. One such key concept, which crops up in statistical physics/solid-state physics courses and around which the behaviour of materials is described, is Fermi energy (εF). In this paper, we present the results on misconceptions about Fermi energy that emerged while administering a diagnostic tool called the Statistical Physics Concept Survey, developed by the authors. It deals with eight themes of basic importance in learning undergraduate solid-state physics and statistical physics. The question items of the tool were put through well-established sequential processes: definition of themes, a Delphi study, interviews with students, drafting of questions, administration, and validity and reliability testing of the tool. The tool was administered to a group of undergraduate students and postgraduate students in a pre-test and post-test design. In this paper, we have taken one of the themes of the diagnostic tool, i.e. Fermi energy, for our analysis and discussion. Students' responses and the reasoning comments given during interviews were analysed. This analysis helped us to identify prevailing misconceptions/learning gaps among students on this topic. We also present how spreadsheets can be used effectively to address the identified misconceptions and to help learners appreciate the finer nuances of the behaviour of the system around the Fermi energy, which are normally sidestepped by both teachers and learners.
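A short illustration of the kind of visualization the authors advocate (here in Python rather than a spreadsheet; the Fermi energy and temperatures are arbitrary examples): plotting the Fermi-Dirac occupation f(E) = 1/(exp((E - εF)/kT) + 1) at several temperatures shows how the step at εF smears out as T increases.

```python
# Fermi-Dirac occupation around an illustrative Fermi energy at several temperatures.
import numpy as np
import matplotlib.pyplot as plt

k_B = 8.617e-5                      # Boltzmann constant in eV/K
E_F = 5.0                           # illustrative Fermi energy in eV
E = np.linspace(4.0, 6.0, 500)

for T in (100, 300, 1000):
    f = 1.0 / (np.exp((E - E_F) / (k_B * T)) + 1.0)
    plt.plot(E, f, label=f"T = {T} K")

plt.axvline(E_F, linestyle="--", color="grey")   # occupation is exactly 1/2 at E_F for T > 0
plt.xlabel("Energy E (eV)")
plt.ylabel("Occupation f(E)")
plt.legend()
plt.show()
```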
Developing Web-based Tools for Collaborative Science and Public Outreach
NASA Astrophysics Data System (ADS)
Friedman, A.; Pizarro, O.; Williams, S. B.
2016-02-01
With the advances in high-bandwidth communications and the proliferation of social media tools, education and outreach activities have become commonplace on ocean-bound research cruises. In parallel, advances in underwater robotics and other data-collecting platforms have made it possible to collect copious amounts of oceanographic data. These data then typically undergo laborious, manual processing to transform them into quantitative information, which normally occurs post-cruise, resulting in significant lags between collecting data and using it for scientific discovery. This presentation discusses how appropriately designed software systems can be used to fulfill multiple objectives and leverage public engagement to complement science goals. We will present two software platforms: the first is a web-browser-based tool developed for real-time tracking of multiple underwater robots and ships. It was designed to allow anyone on board to view or control it on any device with a web browser. It opens up the possibility of remote teleoperation and engagement and was easily adapted to enable live streaming over the internet for public outreach. While the tracking system provided context and engaged people in real time, it also directed interested participants to Squidle, another online system. Developed for scientists, Squidle supports data management, exploration, and analysis and enables direct access to survey data, reducing the lag in data processing. It provides a user-friendly, streamlined interface that integrates advanced data management and online annotation tools. This system was adapted to provide a simplified user interface, tutorial instructions, and a gamified ranking system to encourage "citizen science" participation. These examples show that, through a flexible design approach, it is possible to leverage the development effort of creating science tools to facilitate outreach goals, opening up the possibility of acquiring large volumes of crowd-sourced data without compromising science objectives.
Petty, Julia
2013-01-01
Learning technology is increasingly being implemented for programmes of blended learning within nurse education. With a growing emphasis on self-directed study particularly in post-basic education, there is a need for learners to be guided in their learning away from practice and limited classroom time. Technology-enabled (TE) tools which engage learners actively can play a part in this. The effectiveness and value of interactive TE learning strategies within healthcare is the focus of this paper. To identify literature that explores the effectiveness of interactive, TE tools on knowledge acquisition and learner satisfaction within healthcare with a view to evaluating their use for post-basic nurse education. A Literature Review was performed focusing on papers exploring the comparative value and perceived benefit of TE tools compared to traditional modes of learning within healthcare. The Databases identified as most suitable due to their relevance to healthcare were accessed through EBSCOhost. Primary, Boolean and advanced searches on key terms were undertaken. Inclusion and exclusion criteria were applied which resulted in a final selection of 11 studies for critique. Analysis of the literature found that knowledge acquisition in most cases was enhanced and measured learner satisfaction was generally positive for interactive, self-regulated TE tools. However, TE education may not suit all learners and this is critiqued in the light of the identified limitations. Interactive self regulation and/or testing can be a valuable learning strategy that can be incorporated into self-directed programmes of study for post-registration learners. Whilst acknowledging the learning styles not suited to such tools, the concurrent use of self-directed TE tools with those learning strategies necessitating a more social presence can work together to support enhancement of knowledge required to deliver rationale for nursing practice. Copyright © 2012 Elsevier Ltd. All rights reserved.
Effectiveness of a web-based automated cell distribution system.
Niland, Joyce C; Stiller, Tracey; Cravens, James; Sowinski, Janice; Kaddis, John; Qian, Dajun
2010-01-01
In recent years, industries have turned to the field of operations research to help improve the efficiency of production and distribution processes. Largely absent is the application of this methodology to biological materials, such as the complex and costly procedure of human pancreas procurement and islet isolation. Pancreatic islets are used for basic science research and in a promising form of cell replacement therapy for a subset of patients afflicted with severe type 1 diabetes mellitus. Having an accurate and reliable system for cell distribution is therefore crucial. The Islet Cell Resource Center Consortium was formed in 2001 as the first and largest cooperative group of islet production and distribution facilities in the world. We previously reported on the development of a Matching Algorithm for Islet Distribution (MAID), an automated web-based tool used to optimize the distribution of human pancreatic islets by matching investigator requests to islet characteristics. This article presents an assessment of that algorithm and compares it to the manual distribution process used prior to MAID. A comparison was done using an investigator's ratio of the number of islets received divided by the number requested pre- and post-MAID. Although the supply of islets increased between the pre- versus post-MAID period, the median received-to-requested ratio remained around 60% due to an increase in demand post-MAID. A significantly smaller variation in the received-to-requested ratio was achieved in the post- versus pre-MAID period. In particular, the undesirable outcome of providing users with more islets than requested, ranging up to four times their request, was greatly reduced through the algorithm. In conclusion, this analysis demonstrates, for the first time, the effectiveness of using an automated web-based cell distribution system to facilitate efficient and consistent delivery of human pancreatic islets by enhancing the islet matching process.
i-ADHoRe 2.0: an improved tool to detect degenerated genomic homology using genomic profiles.
Simillion, Cedric; Janssens, Koen; Sterck, Lieven; Van de Peer, Yves
2008-01-01
i-ADHoRe is a software tool that combines gene content and gene order information of homologous genomic segments into profiles to detect highly degenerated homology relations within and between genomes. The new version offers, besides a significant increase in performance, several optimizations to the algorithm, most importantly to the profile alignment routine. As a result, the annotations of multiple genomes, or parts thereof, can be fed simultaneously into the program, after which it will report all regions of homology, both within and between genomes. The i-ADHoRe 2.0 package contains the C++ source code for the main program as well as various Perl scripts and a fully documented Perl API to facilitate post-processing. The software runs on any Linux- or UNIX-based platform. The package is freely available for academic users and can be downloaded from http://bioinformatics.psb.ugent.be/
WebbPSF: Updated PSF Models Based on JWST Ground Testing Results
NASA Astrophysics Data System (ADS)
Osborne, Shannon; Perrin, Marshall D.; Melendez Hernandez, Marcio
2018-06-01
WebbPSF is a widely used package that allows astronomers to create simulated point spread functions (PSFs) for the James Webb Space Telescope (JWST). WebbPSF provides the user with the flexibility to produce PSFs for direct imaging and coronagraphic modes, for a range of filters and masks, and across all the JWST instruments. These PSFs can then be analyzed with built-in evaluation tools or output for use with users' own tools. In the most recent round of updates, the accuracy of the PSFs has been improved with updated analyses of the instrument test data from NASA Goddard and with new data from the testing of the combined Optical Telescope Element and Integrated Science Instrument Module (OTIS) at NASA Johnson. A post-processing function applying detector effects and pupil distortions to input PSFs has also been added to the WebbPSF package.
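A brief usage sketch of the WebbPSF package described above (based on the package's documented Python API in recent releases; the filter choice, field of view, and output file name are arbitrary examples, not taken from this abstract):

```python
# Simulate a NIRCam PSF with WebbPSF and save it to a FITS file.
import webbpsf

nircam = webbpsf.NIRCam()            # instantiate a NIRCam instrument model
nircam.filter = "F200W"              # select a filter
# calc_psf returns a FITS HDUList containing the simulated PSF; oversample controls pixel sampling.
psf = nircam.calc_psf(fov_arcsec=4.0, oversample=4)
psf.writeto("nircam_f200w_psf.fits", overwrite=True)
```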
Quantifying Ubiquitin Signaling
Ordureau, Alban; Münch, Christian; Harper, J. Wade
2015-01-01
Ubiquitin (UB)-driven signaling systems permeate biology, and are often integrated with other types of post-translational modifications (PTMs), most notably phosphorylation. Flux through such pathways is typically dictated by the fractional stoichiometry of distinct regulatory modifications and protein assemblies as well as the spatial organization of pathway components. Yet, we rarely understand the dynamics and stoichiometry of rate-limiting intermediates along a reaction trajectory. Here, we review how quantitative proteomic tools and enrichment strategies are being used to quantify UB-dependent signaling systems, and to integrate UB signaling with regulatory phosphorylation events. A key regulatory feature of ubiquitylation is that the identity of UB chain linkage types can control downstream processes. We also describe how proteomic and enzymological tools can be used to identify and quantify UB chain synthesis and linkage preferences. The emergence of sophisticated quantitative proteomic approaches will set a new standard for elucidating biochemical mechanisms of UB-driven signaling systems. PMID:26000850
Khan, Arif Ul Maula; Torelli, Angelo; Wolf, Ivo; Gretz, Norbert
2018-05-08
In biological assays, automated cell/colony segmentation and counting is imperative owing to huge image sets. Problems caused by drifting image acquisition conditions, background noise, and high variation in colony features demand a user-friendly, adaptive, and robust image processing/analysis method. We present AutoCellSeg (based on MATLAB), which implements a supervised, automatic, and robust image segmentation method. AutoCellSeg utilizes multi-thresholding aided by a feedback-based watershed algorithm that takes segmentation plausibility criteria into account. It is usable in different operation modes and enables the user to select object features interactively for supervised segmentation. It also allows the user to correct results with a graphical interface. This publicly available tool outperforms tools like OpenCFU and CellProfiler in terms of accuracy and provides many additional useful features for end-users.
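For readers who want the gist of threshold-plus-watershed colony segmentation in code, here is a generic Python/scikit-image sketch. It is not AutoCellSeg itself (which is MATLAB-based and adds feedback-driven multi-thresholding, plausibility checks, and a GUI), and the peak-distance parameter is a hypothetical value that would need tuning per assay.

```python
# Generic colony segmentation: global threshold, distance transform, marker-based watershed.
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, measure
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def count_colonies(image):
    """Segment bright colonies on a darker background; return the label image and a count."""
    mask = image > filters.threshold_otsu(image)               # single global threshold (multi-thresholding omitted)
    distance = ndi.distance_transform_edt(mask)                # distance map used to seed the watershed
    peaks = peak_local_max(distance, min_distance=10, labels=mask)
    markers = np.zeros(image.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)     # one marker per detected colony center
    labels = watershed(-distance, markers, mask=mask)          # split touching colonies
    return labels, len(measure.regionprops(labels))
```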
Schloss, Patrick D
2017-05-23
The field of microbiology has experienced significant growth due to transformative advances in technology and the influx of scientists driven by a curiosity to understand how microbes sustain myriad biochemical processes that maintain Earth. With this explosion in scientific output, a significant bottleneck has been the ability to rapidly disseminate new knowledge to peers and the public. Preprints have emerged as a tool that a growing number of microbiologists are using to overcome this bottleneck. Posting preprints can help to transparently recruit a more diverse pool of reviewers prior to submitting to a journal for formal peer review. Although the use of preprints is still limited in the biological sciences, early indications are that preprints are a robust tool that can complement and enhance peer-reviewed publications. As publishing moves to embrace advances in Internet technology, there are many opportunities for preprints and peer-reviewed journals to coexist in the same ecosystem. Copyright © 2017 Schloss.
Introducing PLIA: Planetary Laboratory for Image Analysis
NASA Astrophysics Data System (ADS)
Peralta, J.; Hueso, R.; Barrado, N.; Sánchez-Lavega, A.
2005-08-01
We present a graphical software tool developed under IDL software to navigate, process and analyze planetary images. The software has a complete Graphical User Interface and is cross-platform. It can also run under the IDL Virtual Machine without the need to own an IDL license. The set of tools included allow image navigation (orientation, centring and automatic limb determination), dynamical and photometric atmospheric measurements (winds and cloud albedos), cylindrical and polar projections, as well as image treatment under several procedures. Being written in IDL, it is modular and easy to modify and grow for adding new capabilities. We show several examples of the software capabilities with Galileo-Venus observations: Image navigation, photometrical corrections, wind profiles obtained by cloud tracking, cylindrical projections and cloud photometric measurements. Acknowledgements: This work has been funded by Spanish MCYT PNAYA2003-03216, fondos FEDER and Grupos UPV 15946/2004. R. Hueso acknowledges a post-doc fellowship from Gobierno Vasco.
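The wind measurements mentioned above rely on tracking cloud features between images taken at different times; the snippet below is a minimal, FFT-based cross-correlation sketch of that general idea. It is not PLIA code, and the pixel scale, time separation and sign convention are placeholders of this example.

```python
import numpy as np

def patch_displacement(patch_t0, patch_t1):
    """Estimate the integer-pixel shift of a cloud feature between two image patches
    via FFT-based cross-correlation."""
    f0 = np.fft.fft2(patch_t0 - patch_t0.mean())
    f1 = np.fft.fft2(patch_t1 - patch_t1.mean())
    cc = np.fft.ifft2(f0 * np.conj(f1)).real
    dy, dx = np.unravel_index(np.argmax(cc), cc.shape)
    # Wrap shifts larger than half the patch size into negative displacements.
    ny, nx = cc.shape
    if dy > ny // 2:
        dy -= ny
    if dx > nx // 2:
        dx -= nx
    return dx, dy

# Convert the shift to a wind speed (placeholder scales):
# km_per_pixel, dt_seconds = 15.0, 3600.0
# dx, dy = patch_displacement(p0, p1)
# speed = np.hypot(dx, dy) * km_per_pixel * 1000.0 / dt_seconds   # m/s
```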
O-GlcNAc transferase inhibitors: current tools and future challenges.
Trapannone, Riccardo; Rafie, Karim; van Aalten, Daan M F
2016-02-01
The O-linked N-acetylglucosamine (O-GlcNAc) post-translational modification (O-GlcNAcylation) is the dynamic and reversible attachment of N-acetylglucosamine to serine and threonine residues of nucleocytoplasmic target proteins. It is abundant in metazoa, involving hundreds of proteins linked to a plethora of biological functions with implications in human diseases. The process is catalysed by two enzymes: O-GlcNAc transferase (OGT) and O-GlcNAcase (OGA) that add and remove sugar moieties respectively. OGT knockout is embryonic lethal in a range of animal models, hampering the study of the biological role of O-GlcNAc and the dissection of catalytic compared with non-catalytic roles of OGT. Therefore, selective and potent chemical tools are necessary to inhibit OGT activity in the context of biological systems. The present review focuses on the available OGT inhibitors and summarizes advantages, limitations and future challenges. © 2016 Authors; published by Portland Press Limited.
GWFASTA: server for FASTA search in eukaryotic and microbial genomes.
Issac, Biju; Raghava, G P S
2002-09-01
Similarity searches are a powerful method for solving important biological problems such as database scanning, evolutionary studies, gene prediction, and protein structure prediction. FASTA is a widely used sequence comparison tool for rapid database scanning. Here we describe the GWFASTA server that was developed to assist the FASTA user in similarity searches against partially and/or completely sequenced genomes. GWFASTA consists of more than 60 microbial genomes, eight eukaryote genomes, and the proteomes of annotated genomes. In fact, it provides the maximum number of databases for similarity searching from a single platform. GWFASTA allows the submission of more than one sequence as a single query for a FASTA search. It also provides integrated post-processing of FASTA output, including compositional analysis of proteins, multiple sequence alignment, and phylogenetic analysis. Furthermore, it summarizes the search results organism-wise for prokaryotes and chromosome-wise for eukaryotes. Thus, the integration of different tools for sequence analyses makes GWFASTA a powerful tool for biologists.
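One of the post-processing steps mentioned, compositional analysis of proteins, reduces to counting residue frequencies; the fragment below is a small, self-contained illustration of that calculation and makes no claim about how GWFASTA implements it.

```python
from collections import Counter

def aa_composition(protein_seq):
    """Return the fractional amino-acid composition of a protein sequence."""
    seq = protein_seq.upper().replace("*", "")
    counts = Counter(seq)
    total = sum(counts.values())
    return {aa: n / total for aa, n in sorted(counts.items())}

# usage:
# print(aa_composition("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"))
```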
NASA Astrophysics Data System (ADS)
Storesund, R.; Chin, A.; Florsheim, J. L.; O'Hirok, L.; Williams, K.; Austin, K. E.
2014-12-01
Mountain areas are increasingly susceptible to wildfires because of warming climates. Although knowledge of the hydro-geomorphological impacts of wildfire has advanced in recent years, much is still unknown regarding how environmental fluxes move through burned watersheds. Because of the loss of vegetation and hydrophobic soils, flash floods often accompany elevated runoff events from burned watersheds, making direct process measurements challenging. Direct measurements are also only partly successful at capturing the spatial variations of post-fire effects. Coupled with short temporal windows for observing such responses, opportunities are often missed for collecting the data needed to develop predictive models. Terrestrial LiDAR scanning (TLS) of burned areas allows detailed documentation of the post-fire topography to cm-level accuracy, providing pictures of geomorphic responses not previously possible. This paper reports a comparative study of hillslope-channel interactions, using repeat TLS, in two contrasting environments. Burned by the 2012 Waldo Canyon Fire and 2013 Springs Fire, in Colorado and California respectively, the study sites share many similarities including steep erosive slopes, small drainage areas, and step-pool channel morphologies. TLS provided a tool to test the central hypothesis that dry ravel, distinct in the California Mediterranean environment, would prompt a greater sedimentological response from the Springs Fire compared to the Waldo Canyon Fire. At selected sites in each area, TLS documented baseline conditions immediately following the fire. Repeat scanning after major storms allowed detection of changes in the landscape. Results show a tendency toward sedimentation in river channels at the study sites where channels interact with dry ravel on hillslopes, whereas erosion dominated the response from the Waldo Canyon Fire, where dry ravel was absent. These data provide clues for developing generalizations of post-fire effects at regional scales, which could assist with managing hazards from wildfires. TLS provides a promising tool to expand the range of studies concerning environmental responses through burned landscapes.
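Change detection from repeat TLS surveys is essentially the differencing of gridded surfaces; the snippet below is a minimal sketch of that step under the assumption that both scans have already been registered and gridded onto a common DEM, with the cell size and detection threshold as placeholders.

```python
import numpy as np

def dem_change(z_pre, z_post, cell_size=0.05, min_detectable=0.02):
    """Difference two co-registered DEMs (metres) and summarize erosion/deposition volumes."""
    dz = z_post - z_pre
    dz = np.where(np.abs(dz) < min_detectable, 0.0, dz)   # mask change below detection limit
    cell_area = cell_size ** 2
    deposition = dz[dz > 0].sum() * cell_area               # m^3 of material gained
    erosion = -dz[dz < 0].sum() * cell_area                 # m^3 of material lost
    return dz, erosion, deposition
```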
NASA Astrophysics Data System (ADS)
Dulom, Duyum
Buildings account for about 40 percent of total U.S. energy consumption. It is therefore important to shift our focus to measures that can make buildings more energy efficient. With the number of buildings rising day by day and resources dwindling, retrofitting buildings is key to an energy-efficient future. Post occupancy evaluation (POE) is an important tool and is ideal for the retrofitting process. POE helps identify the problem areas in a building and enables researchers and designers to develop solutions addressing inefficient energy usage as well as the overall wellbeing of the building's users. The post occupancy energy evaluation of Ronald Tutor Hall (RTH), located at the University of Southern California, is one small step in that direction. RTH was chosen for study because (a) access to the building data was relatively easy, (b) it was built in compliance with Title 24 2001, and (c) it was old enough to have post occupancy data. The energy modeling tool eQuest was used to simulate the RTH building using background information such as the internal thermal comfort profile, occupancy profile, building envelope profile, and internal heat gain profile. The simulation results from eQuest were then compared with the actual recorded building data to verify that the simulated model behaved like the actual building. Once the simulated model behaved like the actual building, changes were made to it, such as installing occupancy sensors in the classrooms and laboratories, changing the thermostat set points, and introducing solar shades on the northwest and southwest facades. The combined savings of the proposed interventions resulted in a 6% reduction in overall energy usage.
Modeling post-fire hydro-geomorphic recovery in the Waldo Canyon Fire
NASA Astrophysics Data System (ADS)
Kinoshita, Alicia; Nourbakhshbeidokhti, Samira; Chin, Anne
2016-04-01
Wildfire can have significant impacts on watershed hydrology and geomorphology by changing soil properties and removing vegetation, often increasing runoff, soil erosion and deposition, debris flows, and flooding. Watershed systems may take several years or longer to recover. During this time, post-fire channel changes have the potential to alter hydraulics that influence characteristics such as time of concentration, time to peak flow, flow capacity, and velocity. Using the case of the 2012 Waldo Canyon Fire in Colorado (USA), this research will leverage field-based surveys and terrestrial Light Detection and Ranging (LiDAR) data to parameterize KINEROS2 (KINematic runoff and EROSion), an event-oriented, physically-based watershed runoff and erosion model. We will use the Automated Geospatial Watershed Assessment (AGWA) tool, a GIS-based hydrologic modeling tool that uses commonly available GIS data layers to parameterize, execute, and spatially visualize runoff and sediment yield for watersheds impacted by the Waldo Canyon Fire. Specifically, two models are developed: one for an unburned watershed (Bear Creek) and one for a burned watershed (Williams). The models will simulate burn severity and treatment conditions. Field data will be used to validate the burned watersheds for pre- and post-fire changes in infiltration, runoff, peak flow, sediment yield, and sediment discharge. Spatial modeling will provide insight into post-fire patterns for varying treatment, burn severity, and climate scenarios. Results will also provide post-fire managers with improved hydro-geomorphic modeling and prediction tools for water resources management and mitigation efforts.
Online reporting of adverse drug reactions: a study from a French regional pharmacovigilance center.
Abadie, Delphine; Chebane, Leyla; Bert, Max; Durrieu, Geneviève; Montastruc, Jean-Louis
2014-01-01
In France, online reporting via a website is a new method for notifying adverse drug reactions (ADRs). The French Midi-Pyrénées Regional Pharmacovigilance Center (RPVC) set up a Web-based ADR reporting tool in July 2010 in order to improve the ADR reporting rate. The aims were to assess the feasibility, use and performance of this new ADR reporting system and to evaluate the main characteristics of the online reports. In a retrospective study, we evaluated the characteristics (numbers, ADR reporting and file processing times, type of reporters, suspected drugs, "seriousness" and nature of ADRs) of online notifications reported to the RPVC between July 7th, 2010 (first online notification) and December 31st, 2011. We performed comparisons with a random sample of "conventional" notifications, i.e. those spontaneously reported to the RPVC via traditional channels (post, fax, e-mail or telephone) during the same period. The total number of online reports was 312 over the 18-month period. There was a 45% increase in the number of reports from ambulatory healthcare professionals after the implementation of the new reporting tool. Online reports were transmitted to the French Medicine Agency on average almost one month (26 days) earlier than "conventional" ones. This difference was mainly due to a faster ADR notification process via the online form (on average, the reporting period was decreased by 19 days with the new tool). In comparison to "conventional" notifications, online reports came more often from ambulatory healthcare professionals and more frequently involved neuropsychiatric drugs and neuropsychiatric ADRs. No difference was observed for "seriousness" of ADRs. It is feasible to deploy an online ADR reporting system used by health professionals in current practice. We underline the efficiency of this new online reporting tool for increasing ADR reporting. Moreover, this is the first published study demonstrating that an online reporting tool can help to save time on ADR reporting and file processing, which is essential to generate early safety signals. © 2014 Société Française de Pharmacologie et de Thérapeutique.
Trident: A Universal Tool for Generating Synthetic Absorption Spectra from Astrophysical Simulations
NASA Astrophysics Data System (ADS)
Hummels, Cameron B.; Smith, Britton D.; Silvia, Devin W.
2017-09-01
Hydrodynamical simulations are increasingly able to accurately model physical systems on stellar, galactic, and cosmological scales; however, the utility of these simulations is often limited by our ability to directly compare them with the data sets produced by observers: spectra, photometry, etc. To address this problem, we have created trident, a Python-based open-source tool for post-processing hydrodynamical simulations to produce synthetic absorption spectra and related data. trident can (i) create absorption-line spectra for any trajectory through a simulated data set mimicking both background quasar and down-the-barrel configurations; (ii) reproduce the spectral characteristics of common instruments like the Cosmic Origins Spectrograph; (iii) operate across the ultraviolet, optical, and infrared using customizable absorption-line lists; (iv) trace simulated physical structures directly to spectral features; (v) approximate the presence of ion species absent from the simulation outputs; (vi) generate column density maps for any ion; and (vii) provide support for all major astrophysical hydrodynamical codes. trident was originally developed to aid in the interpretation of observations of the circumgalactic medium and intergalactic medium, but it remains a general tool applicable in other contexts.
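The basic workflow, as described in the package documentation, is to load a simulation with yt, draw a ray through it, and pass that ray to a spectrum generator; the sketch below follows that pattern with a placeholder dataset path and illustrative line and instrument choices, so treat the exact arguments as assumptions rather than a verified recipe.

```python
import yt
import trident

# Load a simulation snapshot (placeholder path).
ds = yt.load("my_simulation/output_0050")

# Draw a straight "simple ray" through the domain and deposit a few ion fields on it.
ray = trident.make_simple_ray(
    ds,
    start_position=ds.domain_left_edge,
    end_position=ds.domain_right_edge,
    data_filename="ray.h5",
    lines=["H", "C", "N", "O", "Mg"],
)

# Generate a mock spectrum mimicking the Cosmic Origins Spectrograph.
sg = trident.SpectrumGenerator("COS-G130M")
sg.make_spectrum(ray, lines=["H", "O VI"])
sg.save_spectrum("spectrum.txt")
sg.plot_spectrum("spectrum.png")
```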
NASA Technical Reports Server (NTRS)
Kellas, Sotiris; Knight, Norman F., Jr.
2002-01-01
A lightweight energy-absorbing keel-beam concept was developed and retrofitted in a general aviation type aircraft to improve crashworthiness performance. The energy-absorbing beam consisted of a foam-filled cellular structure with glass fiber and hybrid glass/Kevlar cell walls. Design, analysis, fabrication and testing of the keel beams prior to installation, and the subsequent full-scale crash testing of the aircraft, are described. Factors such as material and fabrication constraints, damage tolerance, crush stress/strain response, seat-rail loading, and post-crush integrity, which influenced the course of the design process, are also presented. A theory similar to the one often used for ductile metal box structures was employed, with appropriate modifications, to estimate the sustained crush loads for the beams. This analytical tool and dynamic finite element simulation using MSC.Dytran were the prime design and analysis tools. The validity of the theory as a reliable design tool was examined against test data from static crush tests of beam sections, while the overall performance of the energy-absorbing subfloor was assessed through dynamic testing of 24-in-long subfloor assemblies.
Optimization of Gate, Runner and Sprue in Two-Plate Family Plastic Injection Mould
NASA Astrophysics Data System (ADS)
Amran, M. A.; Hadzley, M.; Amri, S.; Izamshah, R.; Hassan, A.; Samsi, S.; Shahir, K.
2010-03-01
This paper describes the optimization of the size of the gate, runner and sprue in a two-plate family plastic injection mould. An Electronic Cash Register (ECR) plastic product was used in this study; it consists of three components: a top casing, a bottom casing and a paper holder. The objectives of this paper are to find the optimum size of the gate, runner and sprue, to locate the optimum layout of the cavities, and to identify the defect problems caused by incorrectly sized gates, runners and sprues. Three types of software were used in this study: Unigraphics as the CAD tool to create the 3D model, Rhinoceros as the post-processing tool to design the gate, runner and sprue, and Moldex as the simulation tool to analyze the plastic flow. As a result, some modifications were made to the size of the feeding system and the location of the cavities to eliminate the short-shot, over-filling and weld line problems in the two-plate family plastic injection mould.
Chemical and Biological Tools for the Preparation of Modified Histone Proteins
Howard, Cecil J.; Yu, Ruixuan R.; Gardner, Miranda L.; Shimko, John C.; Ottesen, Jennifer J.
2016-01-01
Eukaryotic chromatin is a complex and dynamic system in which the DNA double helix is organized and protected by interactions with histone proteins. This system is regulated through a large network of dynamic post-translational modifications (PTMs) that ensure proper gene transcription, DNA repair, and other processes involving DNA. Homogeneous protein samples with precisely characterized modification sites are necessary to better understand the functions of modified histone proteins. Here, we discuss sets of chemical and biological tools that have been developed for the preparation of modified histones, with a focus on the appropriate choice of tool for a given target. We start with genetic approaches for the creation of modified histones, including the incorporation of genetic mimics of histone modifications, chemical installation of modification analogs, and the use of the expanded genetic code to incorporate modified amino acids. Additionally, we will cover the chemical ligation techniques that have been invaluable in the generation of complex modified histones that are indistinguishable from their natural counterparts. Finally, we will end with a prospectus on future directions of synthetic chromatin in living systems. PMID:25863817
PPSP: prediction of PK-specific phosphorylation site with Bayesian decision theory.
Xue, Yu; Li, Ao; Wang, Lirong; Feng, Huanqing; Yao, Xuebiao
2006-03-20
As a reversible and dynamic post-translational modification (PTM) of proteins, phosphorylation plays essential regulatory roles in a broad spectrum of biological processes. Although many studies have addressed the molecular mechanisms of phosphorylation dynamics, the intrinsic features of substrate specificity remain elusive and have yet to be delineated. In this work, we present a novel, versatile and comprehensive program, PPSP (Prediction of PK-specific Phosphorylation site), built on Bayesian decision theory (BDT). PPSP can accurately predict potential phosphorylation sites for approximately 70 protein kinase (PK) groups. Compared with four existing tools, Scansite, NetPhosK, KinasePhos and GPS, PPSP is more accurate and powerful. Moreover, PPSP also provides predictions for many novel PKs, e.g., TRK, mTOR, SyK and MET/RON; the accuracy for these novel PKs is also satisfactory. Taken together, we propose that PPSP could be a powerful tool for experimentalists focusing on the identification of PK-specific phosphorylation sites in substrates. Moreover, the BDT strategy could also be a generally applicable approach for other PTMs, such as sumoylation and ubiquitination.
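PPSP's Bayesian decision theory formulation is not reproduced here; as a rough illustration of the underlying idea, kinase-specific predictors score a short sequence window centred on a candidate serine/threonine against position-specific residue statistics learned from known sites. The sketch below builds a simple log-odds position-specific scoring matrix for that purpose; the window size, pseudocount, uniform background and residue alphabet are assumptions of this example.

```python
import math
from collections import Counter

AA = "ACDEFGHIKLMNPQRSTVWY"

def build_pssm(positive_windows, pseudocount=1.0):
    """Per-position log-odds scores from known kinase-specific site windows,
    against a uniform background over the 20 amino acids."""
    width = len(positive_windows[0])
    background = 1.0 / len(AA)
    pssm = []
    for pos in range(width):
        counts = Counter(w[pos] for w in positive_windows)
        total = sum(counts.values()) + pseudocount * len(AA)
        pssm.append({aa: math.log(((counts.get(aa, 0) + pseudocount) / total) / background)
                     for aa in AA})
    return pssm

def score_window(pssm, window):
    """Sum of per-position log-odds; higher means more similar to the known sites."""
    return sum(col.get(aa, 0.0) for col, aa in zip(pssm, window))

# usage (7-residue windows centred on the candidate S/T):
# pssm = build_pssm(["RRASVAG", "RRPSLRE", "KRNSLVD"])
# print(score_window(pssm, "RRGSFEA"))
```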
Picturing academic learning: salutogenic and health promoting perspectives on drawings.
Garista, Patrizia; Pocetta, Giancarlo; Lindström, Bengt
2018-05-25
More than 20 years ago an article about the use of drawings in higher education appeared in a medical journal. After that, other papers explored the possible contribution of drawings in adult education, while only very few in the field of health promotion and education. This article aims to introduce the use of drawing in this field using the salutogenic lens to think, plan and reflect on academic learning. Reflections on what salutogenesis is and what we can consider a clear application of salutogenic principles to the learning process answer a hypothetical question for the reader concerning the relationship between drawings and health promotion theories. They appear as communication tools capable of exploring meaning-making processes, capturing data that is flexible to dynamic systems, power relations, as well as emotional and latent aspects of human experience. This article proposes a connection between salutogenesis and drawings through: a theoretical framework on salutogenic learning and drawings; a teacher practice and its tools focusing the critical point on visual data analysis in a learning environment; a learner case example for knowledge and capacity building through the drawing process; and a health promotion competency-based analysis. Our case example illustrates how drawings were introduced in a post-graduate course in Health Promotion and Education and argues their strengths and weaknesses.
Optimization of a hardware implementation for pulse coupled neural networks for image applications
NASA Astrophysics Data System (ADS)
Gimeno Sarciada, Jesús; Lamela Rivera, Horacio; Warde, Cardinal
2010-04-01
Pulse Coupled Neural Networks (PCNNs) are a very useful tool for image processing and visual applications, since they have the advantage of being invariant to image changes such as rotation, scale, or certain distortions. Among other characteristics, the PCNN changes a given input image into a temporal representation which can easily be analyzed later for pattern recognition. The structure of a PCNN, though, makes it necessary to determine all of its parameters very carefully in order for it to function optimally, so that the responses to the kinds of inputs it will be subjected to are clearly discriminated, allowing for easy and fast post-processing that yields useful results. This tweaking of the system is a taxing process. In this paper we analyze and compare two methods for modeling PCNNs. A purely mathematical model is programmed and a similar circuit-level model is also designed. Both are then used to determine the optimal values of the several parameters of a PCNN: gain, threshold, and the time constants for feeding, linking and threshold, leading to an optimal design for image recognition. The results are compared for usefulness, accuracy and speed, as well as the performance and time requirements for fast and easy design, thus providing a tool for future ease of management of a PCNN for different tasks.
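For readers unfamiliar with the model, a discrete PCNN iteration couples a feeding and a linking channel into an internal activity that fires against a decaying dynamic threshold; the sketch below implements one common textbook formulation in NumPy/SciPy, with parameter values chosen purely for illustration (the paper's circuit model and parameter optimization are not reproduced).

```python
import numpy as np
from scipy.ndimage import convolve

def pcnn(stimulus, steps=20, beta=0.2, alpha_f=0.1, alpha_l=0.3, alpha_t=0.2,
         v_f=0.5, v_l=0.2, v_t=20.0):
    """Run a basic pulse coupled neural network and return the firing map at each step."""
    kernel = np.array([[0.5, 1.0, 0.5],
                       [1.0, 0.0, 1.0],
                       [0.5, 1.0, 0.5]])           # local coupling weights
    F = np.zeros_like(stimulus, dtype=float)        # feeding channel
    L = np.zeros_like(F)                             # linking channel
    Y = np.zeros_like(F)                             # firing output
    T = np.ones_like(F)                              # dynamic threshold
    outputs = []
    for _ in range(steps):
        neighbours = convolve(Y, kernel, mode="constant")
        F = np.exp(-alpha_f) * F + v_f * neighbours + stimulus
        L = np.exp(-alpha_l) * L + v_l * neighbours
        U = F * (1.0 + beta * L)                     # internal activity
        Y = (U > T).astype(float)                    # neurons fire when activity exceeds threshold
        T = np.exp(-alpha_t) * T + v_t * Y           # threshold decays, then jumps where neurons fired
        outputs.append(Y.copy())
    return outputs

# The per-step firing counts form the "temporal signature" often used for recognition:
# signature = [y.sum() for y in pcnn(normalized_image)]
```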
Kitsos, Gemma; Harris, Dawn; Pollack, Michael; Hubbard, Isobel J
2011-01-01
In Australia, stroke is the leading cause of adult disability. For most stroke survivors, the recovery process is challenging, and in the first few weeks their recovery is supported with stroke rehabilitation services. Stroke clinicians are expected to apply an evidence-based approach to stroke rehabilitation and, in turn, use standardised and validated assessments to monitor stroke recovery. In 2008, the National Stroke Foundation conducted the first national audit of Australia's post-acute stroke rehabilitation services, and its findings identified a vast array of assessments being used by clinicians. This study undertook a sub-analysis of the audit's assessment-tools data with the aim of making clinically relevant recommendations concerning the validity of the most frequently selected assessments. Data reduction ranked the most frequently selected assessments across a series of sub-categories. A serial systematic review of relevant literature using Medline and the Cumulative Index to Nursing and Allied Health Literature identified post-stroke validity rankings. The study found that both standardised and non-standardised assessments are currently in use in stroke rehabilitation. It recommends further research in the sub-categories of strength, visual acuity, dysphagia, continence and nutrition, and found strengths in the sub-categories of balance and mobility, upper limb function and mood. This is the first study to map national usage of post-stroke assessments and review that usage against the evidence. It generates new knowledge concerning which assessments are currently used post-stroke and which should be used, and makes some practical post-stroke clinical recommendations.
Martinez, Bibiana; Dailey, Francis; Almario, Christopher V; Keller, Michelle S; Desai, Mansee; Dupuy, Taylor; Mosadeghi, Sasan; Whitman, Cynthia; Lasch, Karen; Ursos, Lyann; Spiegel, Brennan M R
2017-07-01
Few studies have examined inflammatory bowel disease (IBD) patients' knowledge and understanding of biologic therapies outside traditional surveys. Here, we used social media data to examine IBD patients' understanding of the risks and benefits associated with biologic therapies and how this affects decision-making. We collected posts from Twitter and e-forum discussions from >3000 social media sites posted between June 27, 2012 and June 27, 2015. Guided by natural language processing, we identified posts with specific IBD keywords that discussed the risks and/or benefits of biologics. We then manually coded the resulting posts and performed qualitative analysis using ATLAS.ti software. A hierarchical coding structure was developed based on the keyword list and relevant themes were identified through manual coding. We examined 1598 IBD-related posts, of which 452 (28.3%) centered on the risks and/or benefits of biologics. There were 5 main themes: negative experiences and concerns with biologics (n = 247; 54.6%), decision-making surrounding biologic use (n = 169; 37.4%), positive experiences with biologics (n = 168; 37.2%), information seeking from peers (n = 125; 27.7%), and cost (n = 38; 8.4%). Posts describing negative experiences primarily commented on side effects from biologics, concerns about potential side effects and increased cancer risk, and pregnancy safety concerns. Posts on decision-making focused on nonbiologic treatment options, hesitation to initiate biologics, and concerns about changing or discontinuing regimens. Social media reveals a wide range of themes governing patients' experience and choice with IBD biologics. The complexity of navigating their risk-benefit profiles suggests merit in creating online tailored decision tools to support IBD patients' decision-making with biologic therapies.
Chaudhuri, Rima; Sadrieh, Arash; Hoffman, Nolan J; Parker, Benjamin L; Humphrey, Sean J; Stöckli, Jacqueline; Hill, Adam P; James, David E; Yang, Jean Yee Hwa
2015-08-19
Most biological processes are influenced by protein post-translational modifications (PTMs). Identifying novel PTM sites in different organisms, including humans and model organisms, has expedited our understanding of key signal transduction mechanisms. However, with the increasing availability of deep, quantitative datasets in diverse species, there is a growing need for tools to facilitate cross-species comparison of PTM data. This is particularly important because functionally important modification sites are more likely to be evolutionarily conserved; yet cross-species comparison of PTMs is difficult since they often lie in structurally disordered protein domains. Current tools that address this can only map known PTMs between species based on known orthologous phosphosites, and do not enable the cross-species mapping of newly identified modification sites. Here, we addressed this by developing a web-based software tool, PhosphOrtholog (www.phosphortholog.com), that accurately maps protein modification sites between different species. This facilitates the comparison of datasets derived from multiple species, and should be a valuable tool for the proteomics community. Here we describe PhosphOrtholog, a web-based application for mapping known and novel orthologous PTM sites from experimental data obtained from different species. PhosphOrtholog is the only generic and automated tool that enables cross-species comparison of large-scale PTM datasets without relying on existing PTM databases. This is achieved through pairwise sequence alignment of orthologous protein residues. To demonstrate its utility we apply it to two sets of human and rat muscle phosphoproteomes generated following insulin and exercise stimulation, respectively, and one publicly available mouse phosphoproteome following cellular stress, revealing high mapping and coverage efficiency. Although coverage statistics are dataset dependent, PhosphOrtholog increased the number of cross-species mapped sites in all our example data sets by more than double when compared to those recovered using existing resources such as PhosphoSitePlus. PhosphOrtholog is the first tool that enables mapping of thousands of novel and known protein phosphorylation sites across species, accessible through an easy-to-use web interface. Identification of conserved PTMs across species from large-scale experimental data increases our knowledge base of functional PTM sites. Moreover, PhosphOrtholog is generic, being applicable to other PTM datasets such as acetylation, ubiquitination and methylation.
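The core operation described above, carrying a modification site from one protein to the aligned position in its ortholog, can be illustrated in a few lines of Python once a pairwise alignment is in hand; the sketch below assumes the alignment is already available as two gapped strings and is not PhosphOrtholog's actual implementation.

```python
def map_site(aligned_a, aligned_b, site_a):
    """Map a 1-based residue position in protein A to the aligned 1-based position
    in protein B, given their gapped alignment strings. Returns None if the site
    falls opposite a gap."""
    pos_a = pos_b = 0
    for col_a, col_b in zip(aligned_a, aligned_b):
        if col_a != "-":
            pos_a += 1
        if col_b != "-":
            pos_b += 1
        if col_a != "-" and pos_a == site_a:
            return pos_b if col_b != "-" else None
    return None

# usage: map a phosphosite at residue 5 of one sequence onto its ortholog
# seq_a = "MKT-SSPLRRA"
# seq_b = "MKTQSSPLR-A"
# print(map_site(seq_a, seq_b, 5))   # -> 6
```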
Duffa, Céline; Bailly du Bois, Pascal; Caillaud, Matthieu; Charmasson, Sabine; Couvez, Céline; Didier, Damien; Dumas, Franck; Fievet, Bruno; Morillon, Mehdi; Renaud, Philippe; Thébault, Hervé
2016-01-01
The Fukushima nuclear accident resulted in the largest ever accidental release of artificial radionuclides in coastal waters. This accident has shown the importance of marine assessment capabilities for emergency response and the need to develop tools for adequately predicting the evolution and potential impact of radioactive releases to the marine environment. The French Institute for Radiological Protection and Nuclear Safety (IRSN) equips its emergency response centre with operational tools to assist experts and decision makers in the event of accidental atmospheric releases and contamination of the terrestrial environment. The on-going project aims to develop tools for the management of marine contamination events in French coastal areas. This should allow us to evaluate and anticipate post-accident conditions, including potential contamination sites, contamination levels and potential consequences. In order to achieve this goal, two complementary tools are developed: site-specific marine data sheets and a dedicated simulation tool (STERNE, Simulation du Transport et du transfert d'Eléments Radioactifs dans l'environNEment marin). Marine data sheets are used to summarize the marine environment characteristics of the various sites considered, and to identify vulnerable areas requiring implementation of population protection measures, such as aquaculture areas, beaches or industrial water intakes, as well as areas of major ecological interest. Local climatological data (dominant sea currents as a function of meteorological or tidal conditions) serving as the basis for an initial environmental sampling strategy is provided whenever possible, along with a list of possible local contacts for operational management purposes. The STERNE simulation tool is designed to predict radionuclide dispersion and contamination in seawater and marine species by incorporating spatio-temporal data. 3D hydrodynamic forecasts are used as input data. Direct discharge points or atmospheric deposition source terms can be taken into account. STERNE calculates Eulerian radionuclide dispersion using advection and diffusion equations established offline from hydrodynamic calculations. A radioecological model based on dynamic transfer equations is implemented to evaluate activity concentrations in aquatic organisms. Essential radioecological parameters (concentration factors and single or multicomponent biological half-lives) have been compiled for main radionuclides and generic marine species (fish, molluscs, crustaceans and algae). Dispersion and transfer calculations are performed simultaneously on a 3D grid. Results can be plotted on maps, with possible tracking of spatio-temporal evolution. Post-processing and visualization can then be performed. Copyright © 2015 Elsevier Ltd. All rights reserved.
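The "dynamic transfer equations" mentioned for STERNE are, in their simplest single-compartment form, a first-order uptake/depuration balance in which the depuration rate follows from the biological half-life and the uptake rate follows from the concentration factor at equilibrium; the sketch below integrates that simple form and is only an illustration of the principle, not STERNE code, with all parameter values as placeholders.

```python
import math

def organism_activity(c_water, cf, t_half_days, days, dt=0.1, c0=0.0):
    """Single-compartment radioecological model:
       dC_org/dt = k_u * C_water(t) - k_e * C_org
       with k_e = ln(2)/T_half and k_u = CF * k_e, so C_org -> CF * C_water at equilibrium.
       c_water is a function of time (days) returning Bq/L; the result is in Bq/kg."""
    k_e = math.log(2.0) / t_half_days
    k_u = cf * k_e
    c_org, t, series = c0, 0.0, []
    while t <= days:
        series.append((t, c_org))
        c_org += dt * (k_u * c_water(t) - k_e * c_org)   # explicit Euler step
        t += dt
    return series

# usage: a 10 Bq/L pulse in seawater lasting 5 days, fish with CF=100 and a 50-day half-life
# curve = organism_activity(lambda t: 10.0 if t < 5 else 0.0, cf=100.0, t_half_days=50.0, days=60)
```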
Simulation of the Press Hardening Process and Prediction of the Final Mechanical Material Properties
NASA Astrophysics Data System (ADS)
Hochholdinger, Bernd; Hora, Pavel; Grass, Hannes; Lipp, Arnulf
2011-08-01
Press hardening is a well-established production process in the automotive industry today. The current trend in this process technology points towards the manufacturing of parts with tailored properties. Since knowledge of the mechanical properties of a structural part after forming and quenching is essential for evaluating, for example, its crash performance, a virtual assessment of the production process that is as accurate as possible is more necessary than ever. In order to achieve this, the definition of reliable input parameters and boundary conditions for the thermo-mechanically coupled simulation of the process steps is required. One of the most important input parameters, especially regarding the final properties of the quenched material, is the contact heat transfer coefficient (CHTC). The CHTC depends on the effective pressure or on the gap distance between part and tool. The CHTC at different contact pressures and gap distances is determined through inverse parameter identification. Furthermore, a simulation strategy for the subsequent steps of the press hardening process, as well as adequate modeling approaches for the part and the tools, are discussed. For the prediction of the yield curves of the material after press hardening, a phenomenological model is presented. This model requires knowledge of the microstructure within the part. By post-processing the nodal temperature history with a CCT diagram, the quantitative distribution of the phase fractions of martensite, bainite, ferrite and pearlite after press hardening is determined. The model itself is based on a Hockett-Sherby approach, with the Hockett-Sherby parameters defined as functions of the phase fractions and a characteristic cooling rate.
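To make the flow-curve model concrete, the Hockett-Sherby hardening law expresses the flow stress as a saturating function of plastic strain, sigma(eps) = sigma_sat - (sigma_sat - sigma_i) * exp(-a * eps**n). The sketch below evaluates that law and, purely as an assumed illustration, blends the parameters over phase fractions with a linear rule of mixtures; the paper's actual parameter functions (which also depend on the cooling rate) are not reproduced, and all numerical values are placeholders.

```python
import numpy as np

def hockett_sherby(eps, sigma_i, sigma_sat, a, n):
    """Hockett-Sherby flow stress (MPa) as a function of equivalent plastic strain."""
    return sigma_sat - (sigma_sat - sigma_i) * np.exp(-a * eps**n)

# Placeholder per-phase parameters: (sigma_i, sigma_sat, a, n)
PHASE_PARAMS = {
    "martensite":       (1000.0, 1600.0, 12.0, 0.8),
    "bainite":          (600.0, 1000.0, 10.0, 0.8),
    "ferrite_pearlite": (300.0, 600.0, 8.0, 0.9),
}

def mixed_flow_curve(eps, fractions):
    """Assumed linear rule of mixtures over phase fractions (fractions must sum to 1)."""
    return sum(f * hockett_sherby(eps, *PHASE_PARAMS[phase])
               for phase, f in fractions.items())

# usage:
# eps = np.linspace(0.0, 0.2, 50)
# sigma = mixed_flow_curve(eps, {"martensite": 0.7, "bainite": 0.2, "ferrite_pearlite": 0.1})
```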
Quality indicators in laboratory medicine: a fundamental tool for quality and patient safety.
Plebani, Mario; Sciacovelli, Laura; Marinova, Mariela; Marcuccitti, Jessica; Chiozza, Maria Laura
2013-09-01
The identification of reliable quality indicators (QIs) is a crucial step in enabling users to quantify the quality of laboratory services. The current lack of attention to extra-laboratory factors is in stark contrast with the body of evidence pointing to the multitude of errors that continue to occur in the pre- and post-analytical phases. Different QIs and terminologies are currently used and, therefore, there is the need to harmonize proposed QIs. A model of quality indicators (MQI) has been consensually developed by a group of clinical laboratories according to a project launched by a working group of the International Federation of Clinical Chemistry and Laboratory Medicine (IFCC). The model includes 57 QIs related to key processes (35 pre-, 7 intra- and 15 post-analytical phases) and 3 to support processes. The developed MQI and the data collected provide evidence of the feasibility of the project to harmonize currently available QIs, but further efforts should be done to involve more clinical laboratories and to collect a more consistent amount of data. Copyright © 2012 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Effects of caffeine supplementation in post-thaw human semen over different incubation periods.
Pariz, J R; Hallak, J
2016-11-01
This study aimed to evaluate the effects of caffeine supplementation in post-cryopreservation human semen over different incubation periods. After collection by masturbation, 17 semen samples were analysed according to World Health Organization criteria, processed and cryopreserved with TEST-yolk buffer (1 : 1) in liquid nitrogen. After a thawing protocol, samples were incubated with 2 mm of caffeine for 0, 5, 15, 30 or 60 min, followed by analysis of motility and mitochondrial activity using 3,3'-diaminobenzidine (DAB). Mean variance analysis was performed, and P < 0.05 was the adopted significance threshold. Samples incubated for 15 min showed increased progressive motility compared to other periods of incubation, as well as a reduced percentage of immotile spermatozoa (P < 0.05). In samples incubated for 5 min, increased mitochondrial activity above 50% was observed (DABI and DABII). Although cryosurvival rates were low after the cryopreservation process, incubation with caffeine was associated with an increase in sperm motility, particularly 15-min incubation, suggesting that incubation with caffeine can be an important tool in patients with worsening seminal quality undergoing infertility treatment. © 2016 Blackwell Verlag GmbH.
Biswas, Ambarish; Brown, Chris M
2014-06-08
Gene expression in vertebrate cells may be controlled post-transcriptionally through regulatory elements in mRNAs. These are usually located in the untranslated regions (UTRs) of mRNA sequences, particularly the 3'UTRs. Scan for Motifs (SFM) simplifies the process of identifying a wide range of regulatory elements on alignments of vertebrate 3'UTRs. SFM includes identification of both RNA Binding Protein (RBP) sites and targets of miRNAs. In addition to searching pre-computed alignments, the tool provides users the flexibility to search their own sequences or alignments. The regulatory elements may be filtered by expected value cutoffs and are cross-referenced back to their respective sources and literature. The output is an interactive graphical representation, highlighting potential regulatory elements and overlaps between them. The output also provides simple statistics and links to related resources for complementary analyses. The overall process is intuitive and fast. As SFM is a free web-application, the user does not need to install any software or databases. Visualisation of the binding sites of different classes of effectors that bind to 3'UTRs will facilitate the study of regulatory elements in 3' UTRs.
NASA Astrophysics Data System (ADS)
Paulsson, Adisa; Xing, Kezhao; Fosshaug, Hans; Lundvall, Axel; Bjoernberg, Charles; Karlsson, Johan
2005-05-01
Continuing improvement of the resist process is a necessity for high-end photomask fabrication. In advanced chemically amplified resist systems the lithographic performance is strongly influenced by diffusion of the acid and the acid quencher (i.e. base). Besides the resist properties, e.g. the size and volatility of the photoacid, the process conditions play an important role in diffusion control. Understanding and managing these properties influences lithographic characteristics on the photomask such as CD uniformity, CD and pitch linearity, resolution, substrate contamination, clear-dark bias and iso-dense bias. In this paper we have investigated effects on the lithographic characteristics with respect to post-exposure bake conditions when using the chemically amplified resist FEP-171. We used commercially available mask blanks from the Hoya Mask Blank Division with NTAR7 chrome and a resist thickness of 3200 Å, optimized for the 248 nm laser tool. The photomasks were exposed on the optical DUV (248 nm) Sigma7300 pattern generator. Additionally, we investigated the image stability between exposure and post-exposure bake.
Sgobba, Miriam; Caporuscio, Fabiana; Anighoro, Andrew; Portioli, Corinne; Rastelli, Giulio
2012-12-01
In the last decades, molecular docking has emerged as an increasingly useful tool in the modern drug discovery process, but it still needs to overcome many hurdles and limitations, such as how to account for protein flexibility and poor scoring function performance. For this reason, it has been recognized that in many cases docking results need to be post-processed to achieve significant agreement with experimental activities. In this study, we have evaluated the performance of the MM-PBSA and MM-GBSA scoring functions, implemented in our post-docking procedure BEAR, in rescoring docking solutions. For the first time, the performance of this post-docking procedure has been evaluated on six different biological targets (namely estrogen receptor, thymidine kinase, factor Xa, adenosine deaminase, aldose reductase, and enoyl ACP reductase) by using i) both a single and a multiple protein conformation approach, and ii) two different software packages, namely AutoDock and LibDock. The assessment has been based on two of the most important criteria for the evaluation of docking methods, i.e., the ability of known ligands to enrich the top positions of a ranked database with respect to molecular decoys, and the consistency of the docking poses with crystallographic binding modes. We found that, in many cases, MM-PBSA and MM-GBSA are able to yield higher enrichment factors compared to those obtained with the docking scoring functions alone. However, for only a minority of the cases, the enrichment factors obtained by using multiple protein conformations were higher than those obtained by using only one protein conformation. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
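The enrichment factor used as an evaluation criterion above has a simple definition: the fraction of known actives recovered in the top x% of the ranked list, divided by the fraction expected by chance. A minimal calculation is sketched below; the ranking convention (lower score = better) is an assumption of this example.

```python
import numpy as np

def enrichment_factor(scores, is_active, top_fraction=0.01):
    """Enrichment factor at a given fraction of a score-ranked database.
    scores: docking or rescoring values (lower = better, by assumption here);
    is_active: boolean array marking known actives among the decoys."""
    scores = np.asarray(scores)
    is_active = np.asarray(is_active, dtype=bool)
    n_total = len(scores)
    n_top = max(1, int(round(top_fraction * n_total)))
    order = np.argsort(scores)                    # best (lowest) scores first
    hit_rate_top = is_active[order[:n_top]].sum() / n_top
    hit_rate_all = is_active.sum() / n_total
    return hit_rate_top / hit_rate_all

# usage:
# ef1 = enrichment_factor(rescored_energies, active_flags, top_fraction=0.01)
```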
HTAPP: High-Throughput Autonomous Proteomic Pipeline
Yu, Kebing; Salomon, Arthur R.
2011-01-01
Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool is comprised of software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676
Recombinant fungal lectin as a new tool to investigate O-GlcNAcylation processes.
Machon, Oriane; Baldini, Steffi F; Ribeiro, João P; Steenackers, Agata; Varrot, Annabelle; Lefebvre, Tony; Imberty, Anne
2017-01-01
Glycosylation is a group of post-translational modifications that displays a large variety of structures and is implicated in a plethora of biological processes. Therefore, studying glycosylation requires different technical approaches and reliable tools, lectins being among them. Here, we describe the use of the recombinant mushroom lectin PVL to discriminate O-GlcNAcylation, a modification consisting of the attachment of a single N-acetylglucosamine residue to proteins confined within the cytosolic, nuclear and mitochondrial compartments. Recombinant Psathyrella velutina lectin (rPVL) displays significantly stronger affinity for GlcNAc over Neu5Ac residues, as verified by thermal shift assays and surface plasmon resonance experiments, making it an excellent alternative to WGA (wheat germ agglutinin). Labeling of rPVL with biotin or HRP (horseradish peroxidase) allows its efficient use in western blotting. The staining of whole cell lysates with labeled rPVL was dramatically decreased in response to O-GlcNAc transferase knockdown and seen to increase after pharmacological blockade of O-GlcNAcase. Also, HRP-rPVL appeared to be more sensitive than the anti-O-GlcNAc antibody RL2. Thus, rPVL is a potent new tool to selectively detect O-GlcNAcylated proteins. © The Author 2016. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Assessment of Rho GTPase signaling during neurite outgrowth.
Feltrin, Daniel; Pertz, Olivier
2012-01-01
Rho GTPases are key regulators of the cytoskeleton during the process of neurite outgrowth. Based on overexpression of dominant-positive and negative Rho GTPase constructs, the classic view is that Rac1 and Cdc42 are important for neurite elongation whereas RhoA regulates neurite retraction in response to collapsing agents. However, recent work has suggested a much finer control of spatiotemporal Rho GTPase signaling in this process. Understanding this complexity level necessitates a panel of more sensitive tools than previously used. Here, we discuss a novel assay that enables the biochemical fractionation of the neurite from the soma of differentiating N1E-115 neuronal-like cells. This allows for spatiotemporal characterization of a large number of protein components, interactions, and post-translational modifications using classic biochemical and also proteomics approaches. We also provide protocols for siRNA-mediated knockdown of genes and sensitive assays that allow quantitative analysis of the neurite outgrowth process.
Calibration of ultra-high frequency (UHF) partial discharge sensors using FDTD method
NASA Astrophysics Data System (ADS)
Ishak, Asnor Mazuan; Ishak, Mohd Taufiq
2018-02-01
Ultra-high frequency (UHF) partial discharge (PD) sensors are widely used for condition monitoring and defect location in the insulation systems of high voltage equipment. Designing sensors for specific applications often requires an iterative process of manufacturing, testing and mechanical modification. This paper demonstrates the use of the finite-difference time-domain (FDTD) technique as a tool to predict the frequency response of UHF PD sensors. Using this approach, the design process can be simplified and parametric studies can be conducted in order to assess the influence of component dimensions and material properties on the sensor response. The modelling approach is validated using a gigahertz transverse electromagnetic (GTEM) calibration system. The use of a transient excitation source is particularly suitable for modelling with FDTD, which is able to simulate the step-response output voltage of the sensor, from which the frequency response is obtained using the same post-processing applied to the physical measurement.
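The post-processing step referred to, turning a simulated (or measured) step response into a frequency response, usually amounts to differentiating the step response to approximate the impulse response and taking its Fourier transform; the snippet below is a generic NumPy sketch of that idea, not the specific procedure used in the paper, and the sampling interval is a placeholder.

```python
import numpy as np

def step_to_frequency_response(v_step, dt):
    """Estimate a sensor's frequency response from its step-response output voltage.
    v_step: sampled step response (volts); dt: sampling interval (seconds)."""
    impulse = np.gradient(v_step, dt)           # d/dt of the step response ~ impulse response
    spectrum = np.fft.rfft(impulse) * dt         # discrete approximation of the Fourier integral
    freqs = np.fft.rfftfreq(len(impulse), dt)
    magnitude_db = 20.0 * np.log10(np.abs(spectrum) + 1e-15)
    return freqs, magnitude_db

# usage with a 20 ps time step (placeholder):
# f, mag = step_to_frequency_response(simulated_step_voltage, dt=20e-12)
```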
An Optimal Partial Differential Equations-based Stopping Criterion for Medical Image Denoising.
Khanian, Maryam; Feizi, Awat; Davari, Ali
2014-01-01
Improving the quality of medical images for pre- and post-surgery operations is necessary for beginning and speeding up the recovery process. Partial differential equation-based models have become a powerful and well-known tool in different areas of image processing such as denoising, multiscale image analysis, edge detection, and other fields of image processing and computer vision. In this paper, an algorithm for medical image denoising using an anisotropic diffusion filter with a convenient stopping criterion is presented. In this regard, the paper introduces two strategies: utilizing the efficient explicit method, for its advantages, together with a software technique that effectively solves the anisotropic diffusion filter, which is mathematically unstable; and proposing an automatic stopping criterion that takes into consideration only the input image and the quality of the denoised image, as opposed to other stopping criteria, along with ease of use and computation time. Various medical images are examined to confirm the claim.
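As background for the method discussed, the classic anisotropic diffusion (Perona-Malik) filter smooths an image while preserving edges by weighting the diffusion with an edge-stopping function of the local gradient; the explicit update below is a standard textbook scheme in NumPy, shown with a fixed iteration count rather than the paper's automatic stopping criterion (kappa, lambda and the iteration count are illustrative).

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=30, kappa=30.0, lam=0.2):
    """Explicit Perona-Malik diffusion. lam <= 0.25 keeps the 4-neighbour scheme stable."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite differences to the four neighbours (periodic borders via np.roll,
        # which is adequate for illustration).
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # Edge-stopping conductance: small across strong gradients, ~1 in flat regions.
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        u += lam * (cn * dn + cs * ds + ce * de + cw * dw)
    return u
```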
The Effects of Science Models on Students' Understanding of Scientific Processes
NASA Astrophysics Data System (ADS)
Berglin, Riki Susan
This action research study investigated how the use of science models affected fifth-grade students' ability to transfer their science curriculum to a deeper understanding of scientific processes. This study implemented a variety of science models into a chemistry unit throughout a 6-week study. The research question addressed was: In what ways do using models to learn and teach science help students transfer classroom knowledge to a deeper understanding of the scientific processes? Qualitative and quantitative data were collected through pre- and post-science interest inventories, observations field notes, student work samples, focus group interviews, and chemistry unit tests. These data collection tools assessed students' attitudes, engagement, and content knowledge throughout their chemistry unit. The results of the data indicate that the model-based instruction program helped with students' engagement in the lessons and understanding of chemistry content. The results also showed that students displayed positive attitudes toward using science models.
Nowak, Sascha; Winter, Martin
2017-03-06
Quantitative electrolyte extraction from lithium ion batteries (LIB) is of great interest for recycling processes. Following the generally valid EU legal guidelines for the recycling of batteries, 50 wt % of a LIB cell has to be recovered, which cannot be achieved without the electrolyte; hence, the electrolyte represents a target component for the recycling of LIBs. Additionally, fluoride or fluorinated compounds, as inevitably present in LIB electrolytes, can hamper or even damage recycling processes in industry and have to be removed from the solid LIB parts, as well. Finally, extraction is a necessary tool for LIB electrolyte aging analysis as well as for post-mortem investigations in general, because a qualitative overview can already be achieved after a few minutes of extraction for well-aged, apparently "dry" LIB cells, where the electrolyte is deeply penetrated or even gellified in the solid battery materials.
Avoiding the ensemble decorrelation problem using member-by-member post-processing
NASA Astrophysics Data System (ADS)
Van Schaeybroeck, Bert; Vannitsem, Stéphane
2014-05-01
Forecast calibration or post-processing has become a standard tool in atmospheric and climatological science due to the presence of systematic initial condition and model errors. For ensemble forecasts the most competitive methods derive from the assumption of a fixed ensemble distribution. However, when such 'statistical' methods are applied independently at different locations, at different lead times, or for multiple variables, the correlation structure for individual ensemble members is destroyed. Instead of re-establishing the correlation structure as in Schefzik et al. (2013), we propose a calibration method that avoids this problem by correcting each ensemble member individually. Moreover, we analyse the fundamental mechanisms by which the probabilistic ensemble skill can be enhanced. In terms of the continuous ranked probability score, our member-by-member approach amounts to a skill gain that extends for lead times far beyond the error doubling time and which is as good as that of the most competitive statistical approach, non-homogeneous Gaussian regression (Gneiting et al. 2005). Besides the conservation of correlation structure, additional benefits arise, including the fact that higher-order ensemble moments like kurtosis and skewness are inherited from the uncorrected forecasts. Our detailed analysis is performed in the context of the Kuramoto-Sivashinsky equation and different simple models, but the results extend successfully to the ensemble forecast of the European Centre for Medium-Range Weather Forecasts (Van Schaeybroeck and Vannitsem, 2013, 2014). References: [1] Gneiting, T., Raftery, A. E., Westveld, A., Goldman, T., 2005: Calibrated probabilistic forecasting using ensemble model output statistics and minimum CRPS estimation. Mon. Weather Rev. 133, 1098-1118. [2] Schefzik, R., T.L. Thorarinsdottir, and T. Gneiting, 2013: Uncertainty Quantification in Complex Simulation Models Using Ensemble Copula Coupling. To appear in Statistical Science 28. [3] Van Schaeybroeck, B., and S. Vannitsem, 2013: Reliable probabilities through statistical post-processing of ensemble forecasts. Proceedings of the European Conference on Complex Systems 2012, Springer proceedings on complexity, XVI, p. 347-352. [4] Van Schaeybroeck, B., and S. Vannitsem, 2014: Ensemble post-processing using member-by-member approaches: theoretical aspects, under review.
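The member-by-member idea can be summarized as correcting each raw member with the same mean adjustment obtained from a regression on the ensemble mean, plus a rescaling of its deviation from that mean so that the ensemble spread matches the residual error. The sketch below is a deliberately simplified version of that scheme, fitted on a training set of past forecast/observation pairs; the method in the abstract estimates its coefficients by minimizing the CRPS, which is not done here.

```python
import numpy as np

def fit_mbm(train_members, train_obs):
    """train_members: (n_cases, n_members) raw ensemble forecasts; train_obs: (n_cases,).
    Returns (alpha, beta, tau) for x'_i = alpha + beta*xbar + tau*(x_i - xbar)."""
    xbar = train_members.mean(axis=1)
    beta, alpha = np.polyfit(xbar, train_obs, 1)             # linear mean correction
    resid_var = np.mean((train_obs - (alpha + beta * xbar)) ** 2)
    spread_var = np.mean(train_members.var(axis=1, ddof=1))
    tau = np.sqrt(resid_var / spread_var)                     # simple spread rescaling
    return alpha, beta, tau

def apply_mbm(members, alpha, beta, tau):
    """Correct each member individually; with tau > 0 the member ordering (and hence the
    correlation structure carried by each member) is preserved."""
    xbar = members.mean(axis=-1, keepdims=True)
    return alpha + beta * xbar + tau * (members - xbar)

# usage:
# params = fit_mbm(past_ensembles, past_observations)
# calibrated = apply_mbm(todays_ensemble, *params)
```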
NASA Astrophysics Data System (ADS)
McCormack, K. A.; Hesse, M.
2016-12-01
Remote sensing and geodetic measurements are providing a new wealth of spatially distributed, time-series data that have the ability to improve our understanding of co-seismic rupture and post-seismic processes in subduction zones. Following a large earthquake, large-scale deformation is influenced by a myriad of post-seismic processes occurring on different spatial and temporal scales. These include continued slip on the fault plane (after-slip), a poroelastic response due to the movement of over-pressurized groundwater and viscoelastic relaxation of the underlying mantle. Often, the only means of observing these phenomena are through surface deformation measurements - either GPS or InSAR. Such tools measure the combined result of all these processes, which makes studying the effects of any single process difficult. For the 2012 Mw 7.6 Costa Rica Earthquake, we formulate a Bayesian inverse problem to infer the slip distribution on the plate interface using an elastic finite element model and GPS surface deformation measurements. From this study we identify a horseshoe-shaped rupture area surrounding a locked patch that is likely to release stress in the future. The results of our inversion are then used as an initial condition in a coupled poroelastic forward model to investigate the role of poroelastic effects on post-seismic deformation and stress transfer. We model the co-seismic pore pressure change as well as the pressure evolution and resulting deformation in the months after the earthquake. The surface permeability field is constrained by pump-test data from 526 groundwater wells throughout the study area. The results of the forward model indicate that earthquake-induced pore pressure changes dissipate quickly in most areas near the surface, resulting in relaxation of the surface in the seven to twenty days following the earthquake. Near the subducting slab interface, pore pressure changes can be an order of magnitude larger and may persist for many months after the earthquake. Dissipation of earthquake-induced pore pressure in deeper, low permeability areas manifests as surface deformation over a much longer timescale - on the order of months - which may influence the interpretation of longer timescale post-seismic deformation as purely viscoelastic relaxation.
Sharon Hood; Barbara Bentz; Ken Gibson; Kevin Ryan; Gregg DeNitto
2007-01-01
Douglas-fir has life history traits that greatly enhance resistance to injury from fire, thereby increasing post-fire survival rates. Tools for predicting the probability of tree mortality following fire are important components of both pre-fire planning and post-fire management efforts. Using data from mixed-severity wildfire in Montana and Wyoming, Hood and Bentz (...
Remote Sensing Image Quality Assessment Experiment with Post-Processing
NASA Astrophysics Data System (ADS)
Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.
2018-04-01
This paper briefly describes a post-processing influence assessment experiment comprising three steps: physical simulation, image processing, and image quality assessment. The physical simulation models a sampled imaging system in the laboratory; the imaging system parameters are measured, and the digital images serving as the image processing input are produced by this imaging system with those same parameters. The gathered optically sampled images, acquired with the tested imaging parameters, are processed by three digital image processes: calibration pre-processing, lossy compression at different compression ratios, and image post-processing with different kernels. The image quality assessment method used is just-noticeable-difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of different imaging parameters and post-processing on image quality can be determined. The six JND subjective assessment datasets can be validated against each other. The main conclusions are: image post-processing can improve image quality; image post-processing can improve image quality even with lossy compression, although the improvement is smaller at higher compression ratios than at lower ones; and with our image post-processing method, image quality is better when the camera MTF lies within a small range.
Macdowall, Wendy; Parker, Rachael; Nanchahal, Kiran; Ford, Chris; Lowbury, Ruth; Robinson, Angela; Sherrard, Jackie; Martins, Helen; Fasey, Nicky; Wellings, Kaye
2010-12-01
To develop and pilot a communication aid aimed at increasing the frequency with which sexual health issues are raised proactively with young people in primary care. Group interviews among primary health care professionals to guide development of the tool, simulated consultations to pre-test it, and a pilot study to assess effectiveness. We developed an electronic consultation aid: Talking of Sex and piloted it in eight general practices across the UK. 188 patients and 58 practitioners completed questionnaires pre-intervention, and 92 patients and 45 practitioners post-intervention. There was a modest increase in the proportion of consultations in which sexual health was raised, from 28.1% pre-intervention to 32.6% post-intervention. In consultations with nurses the rise was more marked. More patients reported discussing preventive practices such as condom use post-intervention. Patients unanimously welcomed the opportunity to discuss sexual health matters with their practitioner. The tool has capacity to increase the frequency with which sexual health is raised in primary care, particularly by nurses, to influence the topics discussed, and to improve patient satisfaction. The tool has potential in increasing the proportion of young people whose sexual health needs are addressed in general practice. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
van der Kroft, G; Janssen-Heijnen, M L G; van Berlo, C L H; Konsten, J L M
2015-08-01
Nutritional Risk Screening-2002 (NRS-2002) and the Malnutrition Universal Screening Tool (MUST) are screening tools for nutritional risk that have also been used to predict post-operative complications and morbidity, though not all studies confirm the reliability of nutritional screening. Our study aims to evaluate the independent predictive value of nutritional risk screening in addition to currently documented medical, surgical and anesthesiological risk factors for post-operative complications, as well as length of hospital stay. This study is a prospective observational cohort study of 129 patients undergoing elective gastro-intestinal-surgery. Patients were screened for nutritional risk upon admission using both MUST and NRS-2002 screening tools. Univariate and multivariate analyses were performed to investigate the independent predictive value of nutritional risk for post-operative complications and length of hospital stay. MUST ≥2 (OR 2.87; 95% CI 1.05-7.87) and peri-operative transfusion (OR 2.78; 95% CI 1.05-7.40) were significant independent predictors for the occurrence of post-operative complications. Peri-operative transfusion (HR 2.40; 95% CI 1.45-4.00), age ≥70 (HR 1.50; 95% CI 1.05-2.16) and open surgery versus laparoscopic surgery (HR 1.39; 95% CI 0.94-2.05) were independent predictors for increased length of hospital stay, whereas American Society of Anesthesiology Score (ASA) and MUST were not. Nutritional risk screening (MUST ≥2) is an independent predictor for post-operative complications, but not for increased length of hospital stay. Copyright © 2015 European Society for Clinical Nutrition and Metabolism. Published by Elsevier Ltd. All rights reserved.
Hogg, Sandra; Roe, Yvette; Mills, Richard
2017-01-01
The Institute for Urban Indigenous Health believes that continuous quality improvement (CQI) contributes to the delivery of high-quality care, thereby improving health outcomes for Aboriginal and Torres Strait Islander people. The opening of a new health service in 2015 provided an opportunity to implement best practice CQI strategies and apply them to a regional influenza vaccination campaign. The aim of this project was to implement an evidence-based CQI process within one Aboriginal Community Controlled Health Service in South East Queensland and use staff engagement as a measure of success. A CQI tool was selected from the Joanna Briggs Institute Practical Application of Clinical Evidence System (PACES) to be implemented in the study site. The study site was a newly established Aboriginal and Torres Strait Islander Community Controlled Health Service located in the northern suburbs of Brisbane. This project used the evidence-based information collected in PACES to develop a set of questions related to known variables resulting in proven CQI uptake. A pre implementation clinical audit, education and self-directed learning, using the Plan Do Study Act framework, included a total of seven staff and was conducted in April 2015. A post implementation audit was conducted in July 2015. There were a total of 11 pre- and post-survey respondents which included representation from most of the clinical team and medical administration. The results of the pre implementation audit identified a number of possible areas to improve engagement with the CQI process including staff training and support, understanding CQI and its impacts on individual work areas, understanding clinical data extraction, clinical indicator benchmarking, strong internal leadership and having an external data extractor. There were improvements to all audit criteria in the post-survey, for example, knowledge regarding the importance of CQI activity, attendance at education and training sessions on CQI, active involvement with CQI activity and a multidisciplinary team approach to problem solving within the CQI process. The study found that the implementation of regular, formally organized CQI strategies does have an immediate impact on clinical practice, in this case, by increasing staff awareness regarding the uptake of influenza vaccination against regional targets. The Plan Do Study Act cycle is an efficient tool to record and monitor the change and to guide discussions. For the CQI process to be effective, continued education and training on data interpretation is pivotal to improve staff confidence to engage in regular data discussions, and this should be incorporated into all future CQI sessions.
New Directions in the NOAO Observing Proposal System
NASA Astrophysics Data System (ADS)
Gasson, David; Bell, Dave
For the past eight years NOAO has been refining its on-line observing proposal system. Virtually all related processes are now handled electronically. Members of the astronomical community can submit proposals through email, web form, or via the Gemini Phase I Tool. NOAO staff can use the system to do administrative tasks, scheduling, and compilation of various statistics. In addition, all information relevant to the TAC process is made available on-line, including the proposals themselves (in HTML, PDF and PostScript) and technical comments. Grades and TAC comments are entered and edited through web forms, and can be sorted and filtered according to specified criteria. Current developments include a move away from proprietary solutions, toward open standards such as SQL (in the form of the MySQL relational database system), Perl, PHP and XML.
Enabling the Public to Experience Science from Beginning to End (Invited)
NASA Astrophysics Data System (ADS)
Trouille, L.; Chen, Y.; Lintott, C.; Lynn, S.; Simmons, B.; Smith, A.; Tremonti, C.; Whyte, L.; Willett, K.; Zevin, M.; Science Team; Moderator Team, G.
2013-12-01
In this talk we present the results of an experiment in collaborative research and article writing within the citizen science context. During July-September 2013, astronomers and the Zooniverse team ran Galaxy Zoo Quench (quench.galaxyzoo.org), investigating the mechanism(s) that recently and abruptly shut off star formation in a sample of post-quenched galaxies. Through this project, the public had the opportunity to experience the entire process of science, including galaxy classification, reading background literature, data analysis, discussion, debate, drawing conclusions, and writing an article to submit to a professional journal. The context was galaxy evolution, however, the lessons learned are applicable across the disciplines. The discussion will focus on how to leverage online tools to authentically engage the public in the entire process of science.
Pain, Suffering, and Trauma in Labor and Prevention of Subsequent Posttraumatic Stress Disorder
Simkin, Penny
2011-01-01
In this column, Kimmelin Hull, community manager of Science & Sensibility, Lamaze International’s research blog, reprints and discusses a recent blog post series by acclaimed writer, lecturer, doula, and normal birth advocate Penny Simkin. Examined here is the fruitful dialog that ensued—including testimonies from blog readers about their own experiences with traumatic birth and subsequent posttraumatic stress disorder. Hull further highlights the impact traumatic birth has not only on the birthing woman but also on the labor team—including doulas and childbirth educators—and the implied need for debriefing processes for birth workers. Succinct tools for assessing a laboring woman’s experience of pain versus suffering are offered by Simkin, along with Hull’s added suggestions for application during the labor and birth process. PMID:22654466
Sciacovelli, Laura; O'Kane, Maurice; Skaik, Younis Abdelwahab; Caciagli, Patrizio; Pellegrini, Cristina; Da Rin, Giorgio; Ivanov, Agnes; Ghys, Timothy; Plebani, Mario
2011-05-01
The adoption of Quality Indicators (QIs) has prompted the development of tools to measure and evaluate the quality and effectiveness of laboratory testing, first in the hospital setting and subsequently in ambulatory and other care settings. While Laboratory Medicine has an important role in the delivery of high-quality care, no consensus exists as yet on the use of QIs focussing on all steps of the laboratory total testing process (TTP), and further research in this area is required. In order to reduce errors in laboratory testing, the IFCC Working Group on "Laboratory Errors and Patient Safety" (WG-LEPS) developed a series of Quality Indicators, specifically designed for clinical laboratories. In the first phase of the project, specific QIs for key processes of the TTP were identified, including all the pre-, intra- and post-analytic steps. The overall aim of the project is to create a common reporting system for clinical laboratories based on standardized data collection, and to define state-of-the-art and Quality Specifications (QSs) for each QI independent of: a) the size of organization and type of activities; b) the complexity of processes undertaken; and c) different degree of knowledge and ability of the staff. The aim of the present paper is to report the results collected from participating laboratories from February 2008 to December 2009 and to identify preliminary QSs. The results demonstrate that a Model of Quality Indicators managed as an External Quality Assurance Program can serve as a tool to monitor and control the pre-, intra- and post-analytical activities. It might also allow clinical laboratories to identify risks that lead to errors resulting in patient harm: identification and design of practices that eliminate medical errors; the sharing of information and education of clinical and laboratory teams on practices that reduce or prevent errors; the monitoring and evaluation of improvement activities.
Hall, William; Smith, Neale; Mitton, Craig; Urquhart, Bonnie; Bryan, Stirling
2018-01-01
Background: In order to meet the challenges presented by increasing demand and scarcity of resources, healthcare organizations are faced with difficult decisions related to resource allocation. Tools to facilitate evaluation and improvement of these processes could enable greater transparency and more optimal distribution of resources. Methods: The Resource Allocation Performance Assessment Tool (RAPAT) was implemented in a healthcare organization in British Columbia, Canada. Recommendations for improvement were delivered, and a follow-up evaluation exercise was conducted to assess the trajectory of the organization’s priority setting and resource allocation (PSRA) process two years after the original evaluation. Results: Implementation of RAPAT in the pilot organization identified strengths and weaknesses of the organization’s PSRA process at the time of the original evaluation. Strengths included the use of criteria and evidence, an ability to reallocate resources, and the involvement of frontline staff in the process. Weaknesses included training, communication, and lack of program budgeting. Although the follow-up revealed a regression from a more formal PSRA process, a legacy of explicit resource allocation was reported to be providing ongoing benefit for the organization. Conclusion: While past studies have taken a cross-sectional approach, this paper introduces the first longitudinal evaluation of PSRA in a healthcare organization. By including the strengths, weaknesses, and evolution of one organization’s journey, the authors intend this paper to assist other healthcare leaders in meeting the challenges of allocating scarce resources. PMID:29626400
An unsupervised classification scheme for improving predictions of prokaryotic TIS.
Tech, Maike; Meinicke, Peter
2006-03-09
Although it is not difficult for state-of-the-art gene finders to identify coding regions in prokaryotic genomes, exact prediction of the corresponding translation initiation sites (TIS) is still a challenging problem. Recently a number of post-processing tools have been proposed for improving the annotation of prokaryotic TIS. However, inherent difficulties of these approaches arise from the considerable variation of TIS characteristics across different species. Therefore prior assumptions about the properties of prokaryotic gene starts may cause suboptimal predictions for newly sequenced genomes with TIS signals differing from those of well-investigated genomes. We introduce a clustering algorithm for completely unsupervised scoring of potential TIS, based on positionally smoothed probability matrices. The algorithm requires an initial gene prediction and the genomic sequence of the organism to perform the reannotation. As compared with other methods for improving predictions of gene starts in bacterial genomes, our approach is not based on any specific assumptions about prokaryotic TIS. Despite the generality of the underlying algorithm, the prediction rate of our method is competitive on experimentally verified test data from E. coli and B. subtilis. Regarding genomes with high G+C content, in contrast to some previously proposed methods, our algorithm also provides good performance on P. aeruginosa, B. pseudomallei and R. solanacearum. On reliable test data we showed that our method provides good results in post-processing the predictions of the widely-used program GLIMMER. The underlying clustering algorithm is robust with respect to variations in the initial TIS annotation and does not require specific assumptions about prokaryotic gene starts. These features are particularly useful on genomes with high G+C content. The algorithm has been implemented in the tool "TICO" (TIs COrrector) which is publicly available from our web site.
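A minimal sketch of the scoring idea behind such a tool, a positionally smoothed probability matrix applied to candidate TIS windows, is shown below; the window length, smoothing width, and toy sequences are illustrative assumptions rather than TICO's actual parameters, and the full unsupervised method would additionally iterate between clustering candidates and re-estimating the matrix:

```python
import numpy as np

BASES = "ACGT"
IDX = {b: i for i, b in enumerate(BASES)}

def probability_matrix(windows, pseudocount=1.0):
    """Column-wise base probabilities from aligned candidate TIS windows."""
    counts = np.full((4, len(windows[0])), pseudocount)
    for w in windows:
        for pos, base in enumerate(w):
            counts[IDX[base], pos] += 1
    return counts / counts.sum(axis=0, keepdims=True)

def smooth_positions(pm, width=3):
    """Positional smoothing: moving average of probabilities along the sequence axis."""
    kernel = np.ones(width) / width
    smoothed = np.vstack([np.convolve(row, kernel, mode="same") for row in pm])
    return smoothed / smoothed.sum(axis=0, keepdims=True)

def log_score(window, pm):
    """Log-probability of a candidate window under the (smoothed) matrix."""
    return sum(np.log(pm[IDX[b], i]) for i, b in enumerate(window))

# Toy upstream windows around putative start codons (illustrative only).
cluster = ["AGGAGGACATG", "AGGAGGTTATG", "AAGAGGACATG", "AGGTGGACATG"]
pm = smooth_positions(probability_matrix(cluster))
print(log_score("AGGAGGACATG", pm), ">", log_score("CCCCCCCCATG", pm))
```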
Hortolà, Policarp
2010-01-01
When dealing with microscopic still images of some kinds of samples, the out-of-focus problem is a particularly serious limiting factor for the subsequent generation of fully sharp 3D animations. In order to produce fully focused 3D animations of strongly uneven surface microareas, a vertical stack of six digital secondary-electron SEM micrographs of a human bloodstain microarea was acquired. Single combined images were then generated using image post-processing software intended for macrophotography and light microscopy. Subsequently, 3D animations of texture and topography were obtained in different formats using a combination of software tools, and a 3D-like animation of a texture-topography composite was obtained in different formats using another combination of software tools. On the one hand, the results indicate that image post-processing software not primarily concerned with electron micrographs makes it easy to obtain fully focused images of strongly uneven surface microareas of bloodstains from small series of partially out-of-focus digital SEM micrographs. On the other hand, the results also indicate that such small series of electron micrographs can be used to generate 3D and 3D-like animations, subsequently convertible into different formats, using user-friendly software facilities that were not originally designed for SEM and are easily available from the Internet. Although the focus of this study was on bloodstains, the methods used here are probably also relevant for studying the surface microstructures of other organic or inorganic materials that are difficult to display sharply from a single SEM micrograph.
Sauer, Brian C; Jones, Barbara E; Globe, Gary; Leng, Jianwei; Lu, Chao-Chin; He, Tao; Teng, Chia-Chen; Sullivan, Patrick; Zeng, Qing
2016-01-01
Pulmonary function tests (PFTs) are objective estimates of lung function, but are not reliably stored within the Veteran Health Affairs data systems as structured data. The aim of this study was to validate the natural language processing (NLP) tool we developed-which extracts spirometric values and responses to bronchodilator administration-against expert review, and to estimate the number of additional spirometric tests identified beyond the structured data. All patients at seven Veteran Affairs Medical Centers with a diagnostic code for asthma Jan 1, 2006-Dec 31, 2012 were included. Evidence of spirometry with a bronchodilator challenge (BDC) was extracted from structured data as well as clinical documents. NLP's performance was compared against a human reference standard using a random sample of 1,001 documents. In the validation set NLP demonstrated a precision of 98.9 percent (95 percent confidence intervals (CI): 93.9 percent, 99.7 percent), recall of 97.8 percent (95 percent CI: 92.2 percent, 99.7 percent), and an F-measure of 98.3 percent for the forced vital capacity pre- and post pairs and precision of 100 percent (95 percent CI: 96.6 percent, 100 percent), recall of 100 percent (95 percent CI: 96.6 percent, 100 percent), and an F-measure of 100 percent for the forced expiratory volume in one second pre- and post pairs for bronchodilator administration. Application of the NLP increased the proportion identified with complete bronchodilator challenge by 25 percent. This technology can improve identification of PFTs for epidemiologic research. Caution must be taken in assuming that a single domain of clinical data can completely capture the scope of a disease, treatment, or clinical test.
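As a rough illustration of what such an extractor and its validation involve (the note text, regular expression, and counts below are invented examples, not the authors' actual patterns or results), one might write:

```python
import re

# Hypothetical note text; real clinical documents are far more variable.
note = "Spirometry: FEV1 pre 2.10 L, FEV1 post 2.45 L; FVC pre 3.30 L, FVC post 3.55 L."

# Toy pattern for pre/post bronchodilator pairs of a named measure.
pattern = re.compile(r"(FEV1|FVC)\s+pre\s+([\d.]+)\s*L.*?\1\s+post\s+([\d.]+)\s*L", re.IGNORECASE)
for measure, pre, post in pattern.findall(note):
    print(f"{measure}: pre={float(pre)} L, post={float(post)} L")

def precision_recall_f1(tp, fp, fn):
    """Metrics used to validate an extractor against a human reference standard."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

print(precision_recall_f1(tp=90, fp=1, fn=2))      # illustrative counts only
```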
Schilling, Birgit; Rardin, Matthew J; MacLean, Brendan X; Zawadzka, Anna M; Frewen, Barbara E; Cusack, Michael P; Sorensen, Dylan J; Bereman, Michael S; Jing, Enxuan; Wu, Christine C; Verdin, Eric; Kahn, C Ronald; Maccoss, Michael J; Gibson, Bradford W
2012-05-01
Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture labeling cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models.
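Conceptually, MS1 filtering builds an extracted ion chromatogram for each precursor from the full-scan data and integrates it; the simplified sketch below uses made-up scan data and is not Skyline's implementation:

```python
import numpy as np

# Each MS1 scan: (retention_time_minutes, mz_array, intensity_array). Toy data.
scans = [
    (rt, np.array([500.25, 501.25, 650.40]),
         np.array([1e4 * np.exp(-((rt - 22.0) ** 2) / 0.5), 3e3, 2e3]))
    for rt in np.arange(20.0, 24.0, 0.1)
]

def extract_xic(scans, target_mz, ppm_tol=10.0):
    """Sum MS1 intensity within a ppm window of target_mz for every scan."""
    tol = target_mz * ppm_tol * 1e-6
    times, intensities = [], []
    for rt, mz, inten in scans:
        mask = np.abs(mz - target_mz) <= tol
        times.append(rt)
        intensities.append(inten[mask].sum())
    return np.array(times), np.array(intensities)

rt, xic = extract_xic(scans, target_mz=500.25)
peak_area = np.trapz(xic, rt)                      # crude integration of the chromatographic peak
print(f"apex at {rt[xic.argmax()]:.1f} min, area {peak_area:.1f}")
```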
Brown, Jason L; Bennett, Joseph R; French, Connor M
2017-01-01
SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS software and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet is the careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting. This includes careful processing of occurrence data, environmental data, and model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates, to name only a few functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user.
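As an example of the kind of biodiversity quantification such toolboxes automate, species richness and corrected weighted endemism can be computed directly from stacked presence/absence grids; the NumPy sketch below follows the commonly used definitions and is independent of ArcGIS and SDMtoolbox itself:

```python
import numpy as np

# presence[s, y, x] = 1 where species s is predicted present (toy 3-species, 4x4 grid).
rng = np.random.default_rng(1)
presence = (rng.random((3, 4, 4)) > 0.5).astype(float)

richness = presence.sum(axis=0)                    # species per cell

# Weighted endemism: each species contributes 1 / (its total range size in cells).
range_size = presence.sum(axis=(1, 2))             # cells occupied per species
weights = np.where(range_size > 0, 1.0 / np.maximum(range_size, 1), 0.0)
weighted_endemism = np.tensordot(weights, presence, axes=(0, 0))

# Corrected weighted endemism divides out richness to highlight range-restricted species.
with np.errstate(divide="ignore", invalid="ignore"):
    cwe = np.where(richness > 0, weighted_endemism / richness, 0.0)

print("richness:\n", richness)
print("corrected weighted endemism:\n", np.round(cwe, 3))
```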
NASA Astrophysics Data System (ADS)
Holifield Collins, C.; Skirvin, S. M.; Kautz, M. A.; Metz, L. J.
2017-12-01
The Landsat Surface Reflectance (SR) Product is valuable for applied research of the Earth's surface and has been used in the development of a number of operational products. Landsat SR data available as of April 2017 have been processed through a new system and are publicly available as part of a new product set known as Collection 1. The impact of these changes has not yet been described, but knowing their nature and magnitude is vital for continued confidence in operational products produced from these datasets. The Rangeland Brush Estimation Toolbox (RaBET) developed for USDA Natural Resources Conservation Service (NRCS) land managers is based on relationships developed using the pre-April 2017 Landsat SR Product to derive estimates of woody cover in western rangelands. The maps produced from this tool will be used to aid in planning the implementation of brush removal treatments. Due to the vast expenditure of resources (millions of dollars per year) required to execute these treatments, it is imperative that the effects of the SR data processing changes are understood to allow for modifications in the tool if necessary. The objectives of this study are to: 1) determine where SR data processing changes have the greatest effect on the Landsat-based vegetation indices used within RaBET for Major Land Resource Areas (MLRAs) in Arizona and Texas, and 2) compare model outputs arising from Landsat SR data obtained pre- and post-April 2017 to assess the magnitude of the changes to the RaBET end product.
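A comparison of this kind reduces to computing the same index from both surface-reflectance versions and summarizing the per-pixel differences; the sketch below uses NDVI purely as a stand-in for RaBET's actual vegetation indices and random arrays in place of Landsat bands:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index; a stand-in for RaBET's indices."""
    return (nir - red) / np.maximum(nir + red, 1e-6)

rng = np.random.default_rng(7)
shape = (500, 500)
# Toy reflectance pairs standing in for pre- and post-Collection 1 SR bands.
red_pre, nir_pre = rng.uniform(0.05, 0.2, shape), rng.uniform(0.2, 0.5, shape)
red_post = red_pre + rng.normal(0, 0.005, shape)   # small processing-induced shift
nir_post = nir_pre + rng.normal(0, 0.005, shape)

diff = ndvi(nir_post, red_post) - ndvi(nir_pre, red_pre)
print(f"mean NDVI change {diff.mean():+.4f}, "
      f"95th percentile |change| {np.percentile(np.abs(diff), 95):.4f}")
```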
Development of High-Performance Cast Crankshafts. Final Technical Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauer, Mark E
The objective of this project was to develop technologies that would enable the production of cast crankshafts that can replace high performance forged steel crankshafts. To achieve this, the Ultimate Tensile Strength (UTS) of the new material needs to be 850 MPa with a desired minimum Yield Strength (YS; 0.2% offset) of 615 MPa and at least 10% elongation. Perhaps more challenging, the cast material needs to be able to achieve sufficient local fatigue properties to satisfy the durability requirements in today’s high performance gasoline and diesel engine applications. The project team focused on the development of cast steel alloys for application in crankshafts to take advantage of the higher stiffness over other potential material choices. The material and process developed should be able to produce high-performance crankshafts at no more than 110% of the cost of current production cast units, perhaps the most difficult objective to achieve. To minimize costs, the primary alloy design strategy was to design compositions that can achieve the required properties with minimal alloying and post-casting heat treatments. An Integrated Computational Materials Engineering (ICME) based approach was utilized, rather than relying only on traditional trial-and-error methods, which has been proven to accelerate alloy development time. Prototype melt chemistries designed using ICME were cast as test specimens and characterized iteratively to develop an alloy design within a stage-gate process. Standard characterization and material testing was done to validate the alloy performance against design targets and provide feedback to material design and manufacturing process models. Finally, the project called for Caterpillar and General Motors (GM) to develop optimized crankshaft designs using the final material and manufacturing processing path developed. A multi-disciplinary effort was to integrate finite element analyses by engine designers and geometry-specific casting simulations with existing materials models to optimize crankshaft cost and performance. Prototype crankshafts of the final design were to be produced and validated using laboratory bench testing and on-engine durability testing. ICME process simulation tools were used to investigate a broad range of processing concepts. These concepts included casting orientation, various mold and core materials, and various filling and feeding strategies. Each crankshaft was first simulated without gating and risers, which is termed natural solidification. The natural solidification results were used as a baseline for strategy development of each concept. Casting process simulations and ICME tools were proven to be reasonable predictors of real world results. Potential alloys were developed that could meet the project material property goals with appropriate normalization and temper treatments. For the alloys considered, post-normalization temper treatments proved to be necessary to achieve the desired yield strengths and elongations and appropriate heat treatments were designed using ICME tools. The experimental data of all the alloys were analyzed in combination with ICME tools to establish chemistry-process-structure relations. Several GM small gas engine (SGE) crankshafts were successfully cast in sand molds using two different sprue, runner, gate, riser, chill designs. These crankshafts were cast in two different steel alloys developed during the project, but casting finishing (e.g. riser removal) remains a cost challenge.
A long list of future work was left unfinished when this project was unexpectedly terminated.
Comparison of breast percent density estimation from raw versus processed digital mammograms
NASA Astrophysics Data System (ADS)
Li, Diane; Gavenonis, Sara; Conant, Emily; Kontos, Despina
2011-03-01
We compared breast percent density (PD%) measures obtained from raw and post-processed digital mammographic (DM) images. Bilateral raw and post-processed medio-lateral oblique (MLO) images from 81 screening studies were retrospectively analyzed. Image acquisition was performed with a GE Healthcare DS full-field DM system. Image post-processing was performed using the PremiumViewTM algorithm (GE Healthcare). Area-based breast PD% was estimated by a radiologist using a semi-automated image thresholding technique (Cumulus, Univ. Toronto). Comparison of breast PD% between raw and post-processed DM images was performed using the Pearson correlation (r), linear regression, and Student's t-test. Intra-reader variability was assessed with a repeat read on the same data-set. Our results show that breast PD% measurements from raw and post-processed DM images have a high correlation (r=0.98, R2=0.95, p<0.001). Paired t-test comparison of breast PD% between the raw and the post-processed images showed a statistically significant difference equal to 1.2% (p = 0.006). Our results suggest that the relatively small magnitude of the absolute difference in PD% between raw and post-processed DM images is unlikely to be clinically significant in breast cancer risk stratification. Therefore, it may be feasible to use post-processed DM images for breast PD% estimation in clinical settings. Since most breast imaging clinics routinely use and store only the post-processed DM images, breast PD% estimation from post-processed data may accelerate the integration of breast density in breast cancer risk assessment models used in clinical practice.
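The statistical comparison used in this study (Pearson correlation, linear regression, and a paired t-test between raw and processed PD% values) is straightforward to reproduce on any paired dataset; the sketch below uses synthetic percent-density values rather than the study's 81 cases:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
pd_raw = rng.uniform(5, 60, 81)                    # synthetic percent-density values
pd_processed = pd_raw + rng.normal(1.2, 2.0, 81)   # small systematic offset plus noise

r, p_corr = stats.pearsonr(pd_raw, pd_processed)
fit = stats.linregress(pd_raw, pd_processed)
t, p_paired = stats.ttest_rel(pd_raw, pd_processed)

print(f"Pearson r={r:.3f} (R^2={r**2:.3f}), regression slope={fit.slope:.3f}")
print(f"paired t-test: mean difference {np.mean(pd_processed - pd_raw):.2f}%, p={p_paired:.4f}")
```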
2012-01-01
Background MicroRNAs (miRNAs) are noncoding RNAs that direct post-transcriptional regulation of protein coding genes. Recent studies have shown miRNAs are important for controlling many biological processes, including nervous system development, and are highly conserved across species. Given their importance, computational tools are necessary for analysis, interpretation and integration of high-throughput (HTP) miRNA data in an increasing number of model species. The Bioinformatics Resource Manager (BRM) v2.3 is a software environment for data management, mining, integration and functional annotation of HTP biological data. In this study, we report recent updates to BRM for miRNA data analysis and cross-species comparisons across datasets. Results BRM v2.3 has the capability to query predicted miRNA targets from multiple databases, retrieve potential regulatory miRNAs for known genes, integrate experimentally derived miRNA and mRNA datasets, perform ortholog mapping across species, and retrieve annotation and cross-reference identifiers for an expanded number of species. Here we use BRM to show that developmental exposure of zebrafish to 30 uM nicotine from 6–48 hours post fertilization (hpf) results in behavioral hyperactivity in larval zebrafish and alteration of putative miRNA gene targets in whole embryos at developmental stages that encompass early neurogenesis. We show typical workflows for using BRM to integrate experimental zebrafish miRNA and mRNA microarray datasets with example retrievals for zebrafish, including pathway annotation and mapping to human ortholog. Functional analysis of differentially regulated (p<0.05) gene targets in BRM indicates that nicotine exposure disrupts genes involved in neurogenesis, possibly through misregulation of nicotine-sensitive miRNAs. Conclusions BRM provides the ability to mine complex data for identification of candidate miRNAs or pathways that drive phenotypic outcome and, therefore, is a useful hypothesis generation tool for systems biology. The miRNA workflow in BRM allows for efficient processing of multiple miRNA and mRNA datasets in a single software environment with the added capability to interact with public data sources and visual analytic tools for HTP data analysis at a systems level. BRM is developed using Java™ and other open-source technologies for free distribution (http://www.sysbio.org/dataresources/brm.stm). PMID:23174015
Induction of neuroplasticity and recovery in post-stroke aphasia by non-invasive brain stimulation
Shah, Priyanka P.; Szaflarski, Jerzy P.; Allendorfer, Jane; Hamilton, Roy H.
2013-01-01
Stroke victims tend to prioritize speaking, writing, and walking as the three most important rehabilitation goals. Of note is that two of these goals involve communication. This underscores the significance of developing successful approaches to aphasia treatment for the several hundred thousand new aphasia patients each year and over 1 million stroke survivors with chronic aphasia in the U.S. alone. After several years of growth as a research tool, non-invasive brain stimulation (NBS) is gradually entering the arena of clinical aphasiology. In this review, we first examine the current state of knowledge of post-stroke language recovery including the contributions from the dominant and non-dominant hemispheres. Next, we briefly discuss the methods and the physiologic basis of the use of inhibitory and excitatory repetitive transcranial magnetic stimulation (rTMS) and transcranial direct current stimulation (tDCS) as research tools in patients who experience post-stroke aphasia. Finally, we provide a critical review of the most influential evidence behind the potential use of these two brain stimulation methods as clinical rehabilitative tools. PMID:24399952
RAPID POST-FIRE HYDROLOGIC WATERSHED ASSESSMENT USING THE AGWA GIS-BASED HYDROLOGIC MODELING TOOL
Rapid post-fire watershed assessment to identify potential trouble spots for erosion and flooding can potentially aid land managers and Burned Area Emergency Rehabilitation (BAER) teams in deploying mitigation and rehabilitation resources.
These decisions are inherently co...
Ritchie, Linda; Wright-St Clair, Valerie A; Keogh, Justin; Gray, Marion
2014-01-01
To explore the scope, reliability, and validity of community integration measures for older adults after traumatic brain injury (TBI). A search of peer-reviewed articles in English from 1990 to April 2011 was conducted using the EBSCO Health and Scopus databases. Search terms included were community integration, traumatic brain injury or TBI, 65 plus or older adults, and assessment. Forty-three eligible articles were identified, with 11 selected for full review using a standardized critical review method. Common community integration measures were identified and ranked for relevance and psychometric properties. Of the 43 eligible articles, studies reporting community integration outcomes post-TBI were identified and critically reviewed. Older adults' community integration needs post-TBI from high quality studies were summarized. There is a relative lack of evidence pertaining to older adults post-TBI, but indicators are that older adults have poorer outcomes than their younger counterparts. The Community Integration Questionnaire (CIQ) is the most widely used community integration measurement tool used in research for people with TBI. Because of some limitations, many studies have used the CIQ in conjunction with other measures to better quantify and/or monitor changes in community integration. Enhancing integration of older adults after TBI into their community of choice, with particular emphasis on social integration and quality of life, should be a primary rehabilitation goal. However, more research is needed to inform best practice guidelines to meet the needs of this growing TBI population. It is recommended that subjective tools, such as quality of life measures, are used in conjunction with well-established community integration measures, such as the CIQ, during the assessment process. Copyright © 2014 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
Anshu; Sharma, M; Burdick, W P; Singh, T
2010-04-01
Group dynamics of online medical faculty development programs have not been analyzed and reported in literature. Knowledge of the types of content of posted messages will help to understand group dynamics and promote participation in an asynchronous learning environment. This paper assesses group dynamics and social interactivity in an online learning environment for medical teachers in the South Asian context. Participants of a medical education fellowship program conducted by the Foundation for Advancement of International Medical Education and Research (FAIMER) Regional Institute at Christian Medical College, Ludhiana (CMCL) in India interact on a listserv called the Mentoring-Learning Web (ML-Web). Monthly topics for online discussion are chosen by fellows through a standard tool called "multi-voting". Fellows volunteer to moderate sessions and direct the pace of the discussion. We analyzed the content and process of the discussion of one particular month. The emails were categorized as those that reflected cognitive presence (dealing with construction and exploration of knowledge), teacher presence (dealing with instructional material and learning resources), and social presence, or were administrative in nature. Social emails were further classified as: affective, cohesive and interactive. Social emails constituted one-third of the total emails. Another one-quarter of the emails dealt with sharing of resources and teacher presence, while cognitive emails comprised 36.2% of the total. More than half of the social emails were affective, while a little less than one-third were cohesive. Social posts are an inevitable part of online learning. These posts promote bonding between learners and contribute to better interaction and collaboration in online learning. Moderators should be aware of their presence and use them as tools to promote interactivity.
RNA Binding Proteins in Eye Development and Disease: Implication of Conserved RNA Granule Components
Dash, Soma; Siddam, Archana D.; Barnum, Carrie E.; Janga, Sarath Chandra
2016-01-01
The molecular biology of metazoan eye development is an area of intense investigation. These efforts have led to the surprising recognition that although insect and vertebrate eyes have dramatically different structures, the orthologs or family members of several conserved transcription and signaling regulators such as Pax6, Six3, Prox1 and Bmp4 are commonly required for their development. In contrast, our understanding of post-transcriptional regulation in eye development and disease, particularly regarding the function of RNA binding proteins (RBPs), is limited. We examine the present knowledge of RBPs in eye development in the insect model Drosophila, as well as several vertebrate models such as fish, frog, chicken and mouse. Interestingly, of the 42 RBPs that have been investigated for their expression or function in vertebrate eye development, 24 (~60%) are recognized in eukaryotic cells as components of RNA granules such as Processing bodies (P-bodies), Stress granules, or other specialized ribonucleoprotein (RNP) complexes. We discuss the distinct developmental and cellular events that may necessitate potential RBP/RNA granule-associated RNA regulon models to facilitate post-transcriptional control of gene expression in eye morphogenesis. In support of these hypotheses, three RBPs and RNP/RNA granule components Tdrd7, Caprin2 and Stau2 are linked to ocular developmental defects such as congenital cataract, Peters anomaly and microphthalmia in human patients or animal models. We conclude by discussing the utility of interdisciplinary approaches such as the bioinformatics tool iSyTE (integrated Systems Tool for Eye gene discovery) to prioritize RBPs for deriving post-transcriptional regulatory networks in eye development and disease. PMID:27133484
Explorers of the Universe: Interactive Collaborations via the Internet
NASA Astrophysics Data System (ADS)
Burks, G.
1999-05-01
This proposal details how self-directed, case-based research on earth/space investigations and instruction, together with collaborative interactions with teachers, students, scientists, and university educators using metacognitive tools (e.g., concept maps, interactive vee diagrams, and thematic organizers) and innovative technology, promotes meaningful learning in ways that differ from conventional and atypical educational settings. Our Explorers of the Universe Scientific/Literacy project (http://explorers.tsuniv.edu) promotes earth/space science inquiries in non-conventional learning environments with middle, secondary, and postsecondary students. Outlined are programs and educational processes and outcomes that meet both local and national contexts for achieving meaningful learner-centered science and mathematics goals. All information is entered electronically by students and collected for analyses in a database at our TSU web server. Scientists and university educators review and respond to these postings of students by writing in their electronic notebooks, commenting on their concept maps and interactive vee diagrams, and guiding them to pertinent papers and journal articles. Teachers are active learners with their students. They facilitate the learning process by guiding students in their inquiries, evoking discussions, and involving their students with other affiliated schools whose students may be engaged in similar research topics. Teachers manage their student electronic accounts by assigning passwords, determining the degree of portfolio sharing among students, and responding to student inquiries. Students post their thoughts, progress, inquiries, and data on their individualized electronic notebooks. Likewise, they plan, carry out, and finalize their case-based research using electronic transmission of their concept maps and interactive vee diagrams via e-mail and the Internet. Their peer-edited papers are posted on the WWW for others to read and react to. The final process involves students developing CDs of their case research report, which serves as a longitudinal case for others to pursue.
Point-cloud-to-point-cloud technique on tool calibration for dental implant surgical path tracking
NASA Astrophysics Data System (ADS)
Lorsakul, Auranuch; Suthakorn, Jackrit; Sinthanayothin, Chanjira
2008-03-01
Dental implants are one of the most popular methods of tooth root replacement used in prosthetic dentistry. A computerized navigation system built on a pre-surgical plan is offered to minimize the potential risk of damage to critical anatomic structures of patients. Dental tool tip calibration is an important intraoperative procedure that determines the relation between the hand-piece tool tip and the hand-piece's markers. Because coordinates must be transferred from preoperative CT data to the physical patient, this relation is one component of the typical registration problem and part of the navigation system to be developed for further integration. High accuracy is required, and the relation is computed by point-cloud-to-point-cloud rigid transformations using singular value decomposition (SVD) to minimize rigid registration errors. In earlier studies, commercial surgical navigation systems, such as those from BrainLAB and Materialize, have shown limited flexibility in tool tip calibration: their systems either require a special tool tip calibration device or cannot accommodate a change of tool. The proposed procedure uses the pointing device or hand-piece to touch a fixed pivot point; the transformation matrix is recalculated each time the hand-piece moves to a new orientation while the tool tip stays at the same point. The experiment drew on information from the tracking device, image acquisition, and image processing algorithms. The key result is that the point-cloud-to-point-cloud approach requires only three pose images of the tool to converge to a minimum error of 0.77%, and the obtained calibration correctly allowed the tool holder to track the path simulation line displayed in a graphic animation.
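The point-cloud-to-point-cloud rigid fit with SVD that this calibration relies on is the standard least-squares (Kabsch-style) estimation of a rotation and translation between corresponding points; a compact sketch with synthetic marker positions (not the system's calibration data) is:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t with R @ P_i + t ≈ Q_i (Kabsch/SVD)."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                      # cross-covariance of centered clouds
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = cQ - R @ cP
    return R, t

# Synthetic hand-piece marker points and their tracked counterparts.
rng = np.random.default_rng(4)
P = rng.uniform(-50, 50, (6, 3))
Q_orth, _ = np.linalg.qr(rng.normal(size=(3, 3)))
R_true = Q_orth if np.linalg.det(Q_orth) > 0 else -Q_orth        # proper rotation
t_true = np.array([10.0, -5.0, 2.0])
Q = P @ R_true.T + t_true + rng.normal(0, 0.1, P.shape)          # noisy measurements

R, t = rigid_transform(P, Q)
rmse = np.sqrt(np.mean(np.sum((P @ R.T + t - Q) ** 2, axis=1)))
print(f"registration RMSE: {rmse:.3f} (units assumed to be mm)")
```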
Biomolecular engineering of intracellular switches in eukaryotes
Pastuszka, M.K.; Mackay, J.A.
2010-01-01
Tools to selectively and reversibly control gene expression are useful to study and model cellular functions. When optimized, these cellular switches can turn a protein's function “on” and “off” based on cues designated by the researcher. These cues include small molecules, drugs, hormones, and even temperature variations. Here we review three distinct areas in gene expression that are commonly targeted when designing cellular switches. Transcriptional switches target gene expression at the level of mRNA polymerization, with examples including the tetracycline gene induction system as well as nuclear receptors. Translational switches target the process of turning the mRNA signal into protein, with examples including riboswitches and RNA interference. Post-translational switches control how proteins interact with one another to attenuate or relay signals. Examples of post-translational modification include dimerization and intein splicing. In general, the delay times between switch and effect decreases from transcription to translation to post-translation; furthermore, the fastest switches may offer the most elegant opportunities to influence and study cell behavior. We discuss the pros and cons of these strategies, which directly influence their usefulness to study and implement drug targeting at the tissue and cellular level. PMID:21209849
Context and hand posture modulate the neural dynamics of tool-object perception.
Natraj, Nikhilesh; Poole, Victoria; Mizelle, J C; Flumini, Andrea; Borghi, Anna M; Wheaton, Lewis A
2013-02-01
Prior research has linked visual perception of tools with plausible motor strategies. Thus, observing a tool activates the putative action-stream, including the left posterior parietal cortex. Observing a hand functionally grasping a tool involves the inferior frontal cortex. However, tool-use movements are performed in a contextual and grasp specific manner, rather than relative isolation. Our prior behavioral data has demonstrated that the context of tool-use (by pairing the tool with different objects) and varying hand grasp postures of the tool can interact to modulate subjects' reaction times while evaluating tool-object content. Specifically, perceptual judgment was delayed in the evaluation of functional tool-object pairings (Correct context) when the tool was non-functionally (Manipulative) grasped. Here, we hypothesized that this behavioral interference seen with the Manipulative posture would be due to increased and extended left parietofrontal activity possibly underlying motor simulations when resolving action conflict due to this particular grasp at time scales relevant to the behavioral data. Further, we hypothesized that this neural effect will be restricted to the Correct tool-object context wherein action affordances are at a maximum. 64-channel electroencephalography (EEG) was recorded from 16 right-handed subjects while viewing images depicting three classes of tool-object contexts: functionally Correct (e.g. coffee pot-coffee mug), functionally Incorrect (e.g. coffee pot-marker) and Spatial (coffee pot-milk). The Spatial context pairs a tool and object that would not functionally match, but may commonly appear in the same scene. These three contexts were modified by hand interaction: No Hand, Static Hand near the tool, Functional Hand posture and Manipulative Hand posture. The Manipulative posture is convenient for relocating a tool but does not afford a functional engagement of the tool on the target object. Subjects were instructed to visually assess whether the pictures displayed correct tool-object associations. EEG data was analyzed in time-voltage and time-frequency domains. Overall, Static Hand, Functional and Manipulative postures cause early activation (100-400ms post image onset) of parietofrontal areas, to varying intensity in each context, when compared to the No Hand control condition. However, when context is Correct, only the Manipulative Posture significantly induces extended neural responses, predominantly over right parietal and right frontal areas [400-600ms post image onset]. Significant power increase was observed in the theta band [4-8Hz] over the right frontal area, [0-500ms]. In addition, when context is Spatial, Manipulative posture alone significantly induces extended neural responses, over bilateral parietofrontal and left motor areas [400-600ms]. Significant power decrease occurred primarily in beta bands [12-16, 20-25Hz] over the aforementioned brain areas [400-600ms]. Here, we demonstrate that the neural processing of tool-object perception is sensitive to several factors. While both Functional and Manipulative postures in Correct context engage predominantly an early left parietofrontal circuit, the Manipulative posture alone extends the neural response and transitions to a late right parietofrontal network. This suggests engagement of a right neural system to evaluate action affordances when hand posture does not support action (Manipulative). 
Additionally, when tool-use context is ambiguous (Spatial context), there is increased bilateral parietofrontal activation and, extended neural response for the Manipulative posture. These results point to the existence of other networks evaluating tool-object associations when motoric affordances are not readily apparent and underlie corresponding delayed perceptual judgment in our prior behavioral data wherein Manipulative postures had exclusively interfered in judging tool-object content. Copyright © 2012 Elsevier Ltd. All rights reserved.
Lycke, Michelle; Lefebvre, Tessa; Pottel, Lies; Pottel, Hans; Ketelaars, Lore; Stellamans, Karin; Eygen, Koen Van; Vergauwe, Philippe; Werbrouck, Patrick; Goethals, Laurence; Schofield, Patricia; Boterberg, Tom; Debruyne, Philip R
2017-01-01
Research has indicated that cancer-related cognitive impairments (CRCI) may be influenced by psychosocial factors such as distress, worry and fatigue. Therefore, we aimed to validate the distress thermometer (DT) as a screening tool to detect CRCI six months post-treatment-initiation in a group of general cancer patients. Patients (≥18 years, n = 125) with a histologically confirmed diagnosis of a solid cancer or hematological malignancy, scheduled for a curative treatment, were evaluated at baseline (T0) and six months post-treatment-initiation (T1) for CRCI by a neuropsychological assessment, including patient-reported outcome measures (PROMs). Assessed cognitive domains included premorbid intelligence, attention, processing speed, flexibility, verbal and visual episodic memory and verbal fluency. PROMs entailed distress (DT, cut-off ≥4, range 0-10), anxiety and depression, fatigue (FACIT-fatigue scale) and subjective cognitive complaints. At T0, 60.4% of patients showed a DT score of ≥4, whereas 50% met this criterion at T1. According to the definition of the International Cognition and Cancer Task Force, 25.5% and 28.3% of patients presented with a CRCI at T0 and T1, respectively. When evaluating the DT as a screening tool for CRCI at T1, data showed an inverse relationship between the DT and CRCI. ROC-curve analysis revealed an AUC <0.5. ROC-curve analyses evaluating the DT and FACIT-fatigue scale as screening tools for subjective cognitive complaints showed an AUC ± SE of, respectively, 0.642 ± 0.067 and 0.794 ± 0.057. The DT at T0 cannot be used to screen for objective CRCI at T1, but both the DT and FACIT-fatigue scale at T0 showed potential as screening tools for subjective cognitive complaints at T1.
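The AUC reported above summarizes how well baseline DT scores rank patients who later report cognitive complaints; a minimal sketch of that computation, using invented scores and outcomes, is:

```python
import numpy as np

def auc(scores, labels):
    """Probability that a randomly chosen case outranks a randomly chosen non-case
    (equivalent to the area under the ROC curve), computed from pairwise comparisons."""
    scores, labels = np.asarray(scores, float), np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    # Count pairwise wins, with ties counted as half.
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# Invented example: baseline distress-thermometer scores (0-10) and whether the
# patient reported cognitive complaints six months post-treatment-initiation.
dt_scores = [2, 7, 5, 8, 1, 6, 4, 9, 3, 5]
complaints = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
print(f"AUC = {auc(dt_scores, np.array(complaints) == 1):.2f}")
```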
Designs for Risk Evaluation and Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Designs for Risk Evaluation and Management (DREAM) tool was developed as part of the effort to quantify the risk of geologic storage of carbon dioxide (CO2) under the U.S. Department of Energy's National Risk Assessment Partnership (NRAP). DREAM is an optimization tool created to identify optimal monitoring schemes that minimize the time to first detection of CO2 leakage from a subsurface storage formation. DREAM acts as a post-processor on user-provided output from subsurface leakage simulations. While DREAM was developed for CO2 leakage scenarios, it is applicable to any subsurface leakage simulation of the same output format. The DREAM tool comprises three main components: (1) a Java wizard used to configure and execute the simulations, (2) a visualization tool to view the domain space and optimization results, and (3) a plotting tool used to analyze the results. A secondary Java application is provided to aid users in converting common American Standard Code for Information Interchange (ASCII) output data to the standard DREAM hierarchical data format (HDF5). DREAM employs a simulated annealing approach that searches the solution space by iteratively mutating potential monitoring schemes built of various configurations of monitoring locations and leak detection parameters. This approach has proven to be orders of magnitude faster than an exhaustive search of the entire solution space. The user's manual illustrates the program graphical user interface (GUI), describes the tool inputs, and includes an example application.
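DREAM's simulated-annealing search iteratively mutates a candidate monitoring configuration and accepts worse configurations with a temperature-dependent probability; the generic sketch below uses a toy leak-detection objective and makes no claim about DREAM's actual cost function, parameters, or file formats:

```python
import math
import random

random.seed(0)
candidate_wells = [(x, y) for x in range(10) for y in range(10)]   # toy grid of locations
leaks = [(2, 3), (7, 8), (5, 1)]                                   # hypothetical leak sites

def detection_proxy(scheme):
    """Toy objective: mean distance from each leak to its nearest monitoring well."""
    return sum(min(math.dist(leak, w) for w in scheme) for leak in leaks) / len(leaks)

def mutate(scheme):
    """Swap one monitoring location for a random candidate location."""
    new = list(scheme)
    new[random.randrange(len(new))] = random.choice(candidate_wells)
    return new

scheme = random.sample(candidate_wells, 4)         # initial scheme of 4 monitoring wells
cost, temperature = detection_proxy(scheme), 5.0
for step in range(2000):
    trial = mutate(scheme)
    trial_cost = detection_proxy(trial)
    # Accept improvements always; accept worse schemes with Boltzmann probability.
    if trial_cost < cost or random.random() < math.exp((cost - trial_cost) / temperature):
        scheme, cost = trial, trial_cost
    temperature *= 0.997                            # geometric cooling schedule

print("final scheme:", scheme, "mean detection proxy:", round(cost, 3))
```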
Light-weight Parallel Python Tools for Earth System Modeling Workflows
NASA Astrophysics Data System (ADS)
Mickelson, S. A.; Paul, K.; Xu, H.; Dennis, J.; Brown, D. I.
2015-12-01
With the growth in computing power over the last 30 years, earth system modeling codes have become increasingly data-intensive. As an example, it is expected that the data required for the next Intergovernmental Panel on Climate Change (IPCC) Assessment Report (AR6) will increase by more than 10x to an expected 25PB per climate model. Faced with this daunting challenge, developers of the Community Earth System Model (CESM) have chosen to change the format of their data for long-term storage from time-slice to time-series, in order to reduce the required download bandwidth needed for later analysis and post-processing by climate scientists. Hence, efficient tools are required to (1) perform the transformation of the data from time-slice to time-series format and to (2) compute climatology statistics, needed for many diagnostic computations, on the resulting time-series data. To address the first of these two challenges, we have developed a parallel Python tool for converting time-slice model output to time-series format. To address the second of these challenges, we have developed a parallel Python tool to perform fast time-averaging of time-series data. These tools are designed to be light-weight, be easy to install, have very few dependencies, and can be easily inserted into the Earth system modeling workflow with negligible disruption. In this work, we present the motivation, approach, and testing results of these two light-weight parallel Python tools, as well as our plans for future research and development.
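A light-weight parallel climatology computation of the kind described can be built directly on multiprocessing and NumPy; the sketch below averages monthly fields across years, with synthetic arrays standing in for the CESM time-series files, and makes no claim about the actual tools' APIs:

```python
import numpy as np
from multiprocessing import Pool

N_YEARS, NLAT, NLON = 30, 96, 144

def load_month(year_month):
    """Stand-in for reading one time slice of a variable from a time-series file."""
    year, month = year_month
    rng = np.random.default_rng(year * 12 + month)
    return month, rng.normal(loc=float(month), scale=1.0, size=(NLAT, NLON))

def monthly_climatology(n_years):
    """Average each calendar month over all years, distributing the reads over processes."""
    tasks = [(y, m) for y in range(n_years) for m in range(12)]
    sums = np.zeros((12, NLAT, NLON))
    with Pool(processes=4) as pool:
        for month, field in pool.imap_unordered(load_month, tasks, chunksize=12):
            sums[month] += field
    return sums / n_years

if __name__ == "__main__":
    climo = monthly_climatology(N_YEARS)
    print("climatology shape:", climo.shape, "January mean:", round(float(climo[0].mean()), 3))
```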