Leading Change Step-by-Step: Tactics, Tools, and Tales
ERIC Educational Resources Information Center
Spiro, Jody
2010-01-01
"Leading Change Step-by-Step" offers a comprehensive and tactical guide for change leaders. Spiro's approach has been field-tested for more than a decade and proven effective in a wide variety of public sector organizations including K-12 schools, universities, international agencies and non-profits. The book is filled with proven tactics for…
ERIC Educational Resources Information Center
Heys, Chris
2008-01-01
Excel, Microsoft's spreadsheet program, offers several tools which have proven useful in solving some optimization problems that arise in operations research. We look at two such tools, the Excel modules called Solver and Goal Seek, after first deriving an equation, called the "cash accumulation equation," to be used in conjunction with them.
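A rough, self-contained illustration of the Goal Seek style of calculation described above: the article's own "cash accumulation equation" is not reproduced here, so this sketch assumes the standard future value of regular deposits, FV = P((1 + r)^n - 1)/r, and finds the unknown per-period rate by bisection, much as Goal Seek would. All numbers are hypothetical.

```python
# Hedged sketch: assumes the future-value-of-regular-deposits form of a
# cash accumulation equation, not the equation derived in the article.

def future_value(payment, rate, periods):
    """Accumulated value of `periods` equal deposits of `payment` at per-period `rate`."""
    return payment * ((1 + rate) ** periods - 1) / rate

def goal_seek_rate(payment, periods, target, lo=1e-6, hi=1.0, tol=1e-10):
    """Bisection analogue of Excel's Goal Seek: find the rate that hits `target`."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if future_value(payment, mid, periods) < target:
            lo = mid          # need a higher rate to reach the target
        else:
            hi = mid          # rate is already high enough
        if hi - lo < tol:
            break
    return (lo + hi) / 2

if __name__ == "__main__":
    # e.g. what monthly rate turns 120 deposits of 200 into 35,000?
    print(goal_seek_rate(200.0, 120, 35000.0))
```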
Titian: Data Provenance Support in Spark
Interlandi, Matteo; Shah, Kshitij; Tetali, Sai Deep; Gulzar, Muhammad Ali; Yoo, Seunghyun; Kim, Miryung; Millstein, Todd; Condie, Tyson
2015-01-01
Debugging data processing logic in Data-Intensive Scalable Computing (DISC) systems is a difficult and time consuming effort. Today’s DISC systems offer very little tooling for debugging programs, and as a result programmers spend countless hours collecting evidence (e.g., from log files) and performing trial and error debugging. To aid this effort, we built Titian, a library that enables data provenance—tracking data through transformations—in Apache Spark. Data scientists using the Titian Spark extension will be able to quickly identify the input data at the root cause of a potential bug or outlier result. Titian is built directly into the Spark platform and offers data provenance support at interactive speeds—orders-of-magnitude faster than alternative solutions—while minimally impacting Spark job performance; observed overheads for capturing data lineage rarely exceed 30% above the baseline job execution time. PMID:26726305
Applying Content Management to Automated Provenance Capture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schuchardt, Karen L.; Gibson, Tara D.; Stephan, Eric G.
2008-04-10
Workflows and data pipelines are becoming increasingly valuable in both computational and experimental sciences. These automated systems are capable of generating significantly more data within the same amount of time than their manual counterparts. Automatically capturing and recording data provenance and annotation as part of these workflows is critical for data management, verification, and dissemination. Our goal in addressing the provenance challenge was to develop an end-to-end system that demonstrates real-time capture, persistent content management, and ad-hoc searches of both provenance and metadata using open source software and standard protocols. We describe our prototype, which extends the Kepler workflow tools for the execution environment, the Scientific Annotation Middleware (SAM) content management software for data services, and an existing HTTP-based query protocol. Our implementation offers several unique capabilities, and through the use of standards, is able to provide access to the provenance record to a variety of commonly available client tools.
A Multi-Collaborative Ambient Assisted Living Service Description Tool
Falcó, Jorge L.; Vaquerizo, Esteban; Artigas, José Ignacio
2014-01-01
Collaboration among different stakeholders is a key factor in the design of Ambient Assisted Living (AAL) environments and services. Throughout several AAL projects we have found repeated difficulties in this collaboration and have learned lessons from the experience of solving real situations. This paper highlights identified critical items for collaboration among technicians, users, company and institutional stakeholders, and proposes, as a communication tool for a project steering committee, a service description tool that presents information from the different fields in a format comprehensible to the others. It was first generated in the MonAMI project to promote understanding among different workgroups, proved useful there, and was further tested in several other, smaller AAL projects. The concept of scalable service description has proven useful for mutual understanding across disciplines and for participatory decision making throughout the projects, allowing adaptation to the singularities and partial successes or failures of each action. This paper introduces the tool, relates it to existing methodologies for cooperation in AAL, and describes it with an example offered to the AAL community. Further work on this tool will significantly improve results in user-centered design of sustainable services in AAL. PMID:24897409
ERIC Educational Resources Information Center
Lehr, Camilla A.; Johnson, David R.; Bremer, Christine D.; Cosio, Anna; Thompson, Megan
2004-01-01
This manual provides a synthesis of research-based dropout prevention and intervention and offers examples of interventions that show evidence of effectiveness. This has proven to be a difficult task because the intervention research on dropout and school completion that can be used to inform practice is incomplete (Dynarski & Gleason, 2002;…
Screening, diagnosis, and treatment of post-traumatic stress disorder.
Wisco, Blair E; Marx, Brian P; Keane, Terence M
2012-08-01
Post-traumatic stress disorder (PTSD) is a prevalent problem among military personnel and veterans. Identification of effective screening tools, diagnostic technologies, and treatments for PTSD is essential to ensure that all individuals in need of treatment are offered interventions with proven efficacy. Well-validated methods for screening and diagnosing PTSD are now available, and effective pharmacological and psychological treatments can be offered. Despite these advances, many military personnel and veterans do not receive evidence-based care. We review the literature on screening, diagnosis, and treatment of PTSD in military populations, and discuss the challenges to implementing the best evidence-based practices in clinical settings.
Active Provenance in Data-intensive Research
NASA Astrophysics Data System (ADS)
Spinuso, Alessandro; Mihajlovski, Andrej; Filgueira, Rosa; Atkinson, Malcolm
2017-04-01
Scientific communities are building platforms where data-intensive workflows are crucial to conducting their research campaigns. However, managing the 'live' processes and effectively supporting their understanding, while fostering computational steering and the sharing and re-use of data and methods, presents several bottlenecks. These are often caused by the poor level of documentation of the methods and the data and of how users interact with them. This work explores how, in such systems, flexibility in the management of provenance and its adaptation to different users and application contexts can lead to new opportunities for its exploitation, improving productivity. In particular, it illustrates a conceptual and technical framework enabling tunable and actionable provenance in data-intensive workflow systems in support of reproducible science. It introduces the concept of Agile data-intensive systems to define the characteristics of our target platform. It shows a novel approach to the integration of provenance mechanisms, offering flexibility in the scale and precision of the provenance data collected, ensuring its relevance to the domain of the data-intensive task and fostering its rapid exploitation. The contributions address the scale of the provenance records, their usability, and their active role in the research life-cycle. We discuss the use of dynamically generated provenance types as the approach for integrating provenance mechanisms into a data-intensive workflow system. Enabling provenance can be transparent to the workflow user and developer, as well as fully controllable and customisable, depending on their expertise and the application's reproducibility, monitoring and validation requirements. The API that allows the realisation and adoption of a provenance type is presented, especially concerning the support of provenance profiling, contextualisation and precision. An actionable approach to provenance management is also discussed, enabling provenance-driven operations at runtime regardless of the enactment technologies and connectivity impediments. We propose a framework based on concepts such as provenance clusters and provenance sensors, envisaging new potential for exploiting large quantities of provenance traces at runtime. Finally, the work introduces how the underlying provenance model can be explored with big-data visualization techniques, aiming to produce comprehensive and interactive views on top of large and heterogeneous provenance data. We demonstrate the adoption of alternative visualisation methods, from detailed and localised interactive graphs to radial views, serving different purposes and levels of expertise. Combining provenance types, selective rules and extensible metadata with reactive clustering opens a new and more versatile role for lineage information in the research life-cycle, thanks to its improved usability. The flexible profiling of the proposed framework aids the human analysis of the process, with the support of advanced and intuitive interactive graphical tools. The Active Provenance methods are discussed in the context of a real implementation for a data-intensive library (dispel4py) and its adoption within use cases for computational seismology, climate studies and generic correlation analysis.
A Formal Approach to Requirements-Based Programming
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
No significant general-purpose method is currently available to mechanically transform system requirements into a provably equivalent model. The widespread use of such a method represents a necessary step toward high-dependability system engineering for numerous application domains. Current tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that the formal models cannot be proven to be equivalent to the requirements. We offer a method for mechanically transforming requirements into a provably equivalent formal model that can be used as the basis for code generation and other transformations. This method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. Finally, we describe further application areas we are investigating for use of the approach.
SEIS-PROV: Practical Provenance for Seismological Data
NASA Astrophysics Data System (ADS)
Krischer, L.; Smith, J. A.; Tromp, J.
2015-12-01
It is widely recognized that reproducibility is crucial to advance science, but at the same time it is very hard to actually achieve. This results in it being recognized but also mostly ignored by a large fraction of the community. A key ingredient towards full reproducibility is to capture and describe the history of data, an issue known as provenance. We present SEIS-PROV, a practical format and data model to store provenance information for seismological data. In a seismological context, provenance can be seen as information about the processes that generated and modified a particular piece of data. For synthetic waveforms the provenance information describes which solver and settings therein were used to generate it. When looking at processed seismograms, the provenance conveys information about the different time series analysis steps that led to it. Additional uses include the description of derived data types, such as cross-correlations and adjoint sources, enabling their proper storage and exchange. SEIS-PROV is based on W3C PROV (http://www.w3.org/TR/prov-overview/), a standard for generic provenance information. It then applies an additional set of constraints to make it suitable for seismology. We present a definition of the SEIS-PROV format, a way to check if any given file is a valid SEIS-PROV document, and two sample implementations: One in SPECFEM3D GLOBE (https://geodynamics.org/cig/software/specfem3d_globe/) to store the provenance information of synthetic seismograms and another one as part of the ObsPy (http://obspy.org) framework enabling automatic tracking of provenance information during a series of analysis and transformation stages. This, along with tools to visualize and interpret provenance graphs, offers a description of data history that can be readily tracked, stored, and exchanged.
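For readers unfamiliar with W3C PROV, the minimal sketch below uses the generic `prov` Python package to record a single processing step in the spirit of what SEIS-PROV describes; the namespace URI, identifiers and attribute names are illustrative placeholders, not the normative SEIS-PROV definitions.

```python
# Hedged sketch using the generic W3C PROV "prov" package; names below are
# illustrative placeholders rather than normative SEIS-PROV terms.
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace("seis_prov", "http://example.org/seis_prov/0.1/#")  # placeholder URI

raw = doc.entity("seis_prov:sp001_wf_raw", {"prov:label": "Raw waveform"})
detrended = doc.entity("seis_prov:sp002_wf_detrended", {"prov:label": "Detrended waveform"})
step = doc.activity("seis_prov:sp003_dt_detrend",
                    other_attributes={"seis_prov:detrending_method": "linear"})

doc.used(step, raw)                  # the processing step consumed the raw trace
doc.wasGeneratedBy(detrended, step)  # and produced the detrended trace

print(doc.get_provn())               # human-readable PROV-N serialization
```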
From Provenance Standards and Tools to Queries and Actionable Provenance
NASA Astrophysics Data System (ADS)
Ludaescher, B.
2017-12-01
The W3C PROV standard provides a minimal core for sharing retrospective provenance information for scientific workflows and scripts. PROV extensions such as DataONE's ProvONE model are necessary for linking runtime observables in retrospective provenance records with conceptual-level prospective provenance information, i.e., workflow (or dataflow) graphs. Runtime provenance recorders, such as DataONE's RunManager for R, or noWorkflow for Python capture retrospective provenance automatically. YesWorkflow (YW) is a toolkit that allows researchers to declare high-level prospective provenance models of scripts via simple inline comments (YW-annotations), revealing the computational modules and dataflow dependencies in the script. By combining and linking both forms of provenance, important queries and use cases can be supported that neither provenance model can afford on its own. We present existing and emerging provenance tools developed for the DataONE and SKOPE (Synthesizing Knowledge of Past Environments) projects. We show how the different tools can be used individually and in combination to model, capture, share, query, and visualize provenance information. We also present challenges and opportunities for making provenance information more immediately actionable for the researchers who create it in the first place. We argue that such a shift towards "provenance-for-self" is necessary to accelerate the creation, sharing, and use of provenance in support of transparent, reproducible computational and data science.
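To make the YesWorkflow idea concrete, here is a hedged illustration of YW-style inline annotations attached to a small Python script; the block name, port names and file names are invented for the example.

```python
# Illustrative only: YesWorkflow reads the @begin/@in/@out/@end comments below
# to reconstruct a prospective dataflow graph; the script itself is a toy.
# @begin clean_temperature_data @desc Drop rows with missing values.
# @in  raw_csv
# @out clean_csv

import csv

with open("raw_temperatures.csv") as src, \
     open("clean_temperatures.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    for row in csv.reader(src):
        if row and row[-1] not in ("", "NA"):   # keep only complete rows
            writer.writerow(row)

# @end clean_temperature_data
```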
NASA Astrophysics Data System (ADS)
Zeilik, M.; Garvin-Doxas, K.
2003-12-01
FLAG, the Field-tested Learning Assessment Guide (http://www.flaguide.org/), is an NSF-funded website that offers broadly applicable, self-contained modular classroom assessment techniques (CATs) and discipline-specific tools for STEM instructors creating new approaches to evaluate student learning, attitudes and performance. In particular, the FLAG contains proven techniques for alternative assessments---those needed for reformed, innovative STEM courses. Each tool has been developed, tested and refined in real classrooms at colleges and universities. The FLAG also contains an assessment primer, a section to help you select the most appropriate assessment technique(s) for your course goals, and other resources. In addition to references on instrument development and field-tested instruments on attitudes towards science, the FLAG also includes discipline-specific tools in Physics, Astronomy, Biology, and Mathematics. Building of the Geoscience collection is currently under way with the development of an instrument for detecting misconceptions of incoming freshmen about Space Science, created with the help of the Committee on Space Science and Astronomy of the American Association of Physics Teachers. Additional field-tested resources from the Geosciences are solicited from the community. Contributions should be sent to Michael Zeilik, zeilik@la.unm.edu. This work has been supported in part by NSF grant DUE 99-81155.
Aptamers as tools for target prioritization and lead identification.
Burgstaller, Petra; Girod, Anne; Blind, Michael
2002-12-15
The increasing number of potential drug target candidates has driven the development of novel technologies designed to identify functionally important targets and enhance the subsequent lead discovery process. Highly specific synthetic nucleic acid ligands--also known as aptamers--offer a new exciting route in the drug discovery process by linking target validation directly with HTS. Recently, aptamers have proven to be valuable tools for modulating the function of endogenous cellular proteins in their natural environment. A set of technologies has been developed to use these sophisticated ligands for the validation of potential drug targets in disease models. Moreover, aptamers that are specific antagonists of protein function can act as substitute interaction partners in HTS assays to facilitate the identification of small-molecule lead compounds.
ACSM Fitness Book: A Proven Step-By-Step Program from the Experts. Third Edition.
ERIC Educational Resources Information Center
American Coll. of Sports Medicine, Indianapolis, IN.
This book offers advice on the health benefits of regular physical activity. It includes a scientifically proven fitness test to determine one's starting point and monitor ongoing progress, offering step-by-step instructions, sample programs, and insights on nutrition, weight control, motivation, and overcoming setbacks. Seven chapters examine: (1)…
Applying the Karma Provenance tool to NASA's AMSR-E Data Production Stream
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Conover, H.; Regner, K.; Movva, S.; Goodman, H. M.; Pale, B.; Purohit, P.; Sun, Y.
2010-12-01
Current procedures for capturing and disseminating provenance, or data product lineage, are limited in both what is captured and how it is disseminated to the science community. For example, the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) Science Investigator-led Processing System (SIPS) generates Level 2 and Level 3 data products for a variety of geophysical parameters. Data provenance and quality information for these data sets is either very general (e.g., user guides, a list of anomalous data receipt and processing conditions over the life of the missions) or difficult to access or interpret (e.g., quality flags embedded in the data, production history files not easily available to users). Karma is a provenance collection and representation tool designed and developed for data-driven workflows such as the production streams used to produce EOS standard products. Karma records uniform and usable provenance metadata independent of the processing system while minimizing both the modification burden on the processing system and the overall performance overhead. Karma collects both the process and data provenance. The process provenance contains information about the workflow execution and the associated algorithm invocations. The data provenance captures metadata about the derivation history of the data product, including algorithms used and input data sources transformed to generate it. As part of an ongoing NASA-funded project, Karma is being integrated into the AMSR-E SIPS data production streams. Metadata gathered by the tool will be presented to the data consumers as provenance graphs, which are useful in validating the workflows and determining the quality of the data product. This presentation will discuss design and implementation issues faced while incorporating a provenance tool into a structured data production flow. Prototype results will also be presented in this talk.
Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools
NASA Astrophysics Data System (ADS)
Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.
2015-12-01
Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software used to execute analyses and models, and derived data and products that arise from these computations. This provenance is vital to interpretation and understanding of science, and provides an audit trail that researchers can use to understand and replicate computational workflows in the geosciences.
Rea, Shane L.; Graham, Brett H.; Nakamaru-Ogiso, Eiko; Kar, Adwitiya; Falk, Marni J.
2013-01-01
The extensive conservation of mitochondrial structure, composition, and function across evolution offers a unique opportunity to expand our understanding of human mitochondrial biology and disease. By investigating the biology of much simpler model organisms, it is often possible to answer questions that are unreachable at the clinical level. Here, we review the relative utility of four different model organisms, namely the bacteria Escherichia coli, the yeast Saccharomyces cerevisiae, the nematode Caenorhabditis elegans and the fruit fly Drosophila melanogaster, in studying the role of mitochondrial proteins relevant to human disease. E. coli are single cell, prokaryotic bacteria that have proven to be a useful model system in which to investigate mitochondrial respiratory chain protein structure and function. S. cerevisiae is a single-celled eukaryote that can grow equally well by mitochondrial-dependent respiration or by ethanol fermentation, a property that has proven to be a veritable boon for investigating mitochondrial functionality. C. elegans is a multi-cellular, microscopic worm that is organized into five major tissues and has proven to be a robust model animal for in vitro and in vivo studies of primary respiratory chain dysfunction and its potential therapies in humans. Studied for over a century, D. melanogaster is a classic metazoan model system offering an abundance of genetic tools and reagents that facilitates investigations of mitochondrial biology using both forward and reverse genetics. The respective strengths and limitations of each species relative to mitochondrial studies are explored. In addition, an overview is provided of major discoveries made in mitochondrial biology in each of these four model systems. PMID:20818735
Oratoria Online: The Use of Technology Enhanced Learning to Improve Students' Oral Skills
NASA Astrophysics Data System (ADS)
Dornaleteche, Jon
New ICTs have proven to be useful tools for implementing innovative didactic and pedagogical formulas oriented towards enhancing students' and teachers' creativity. The up-and-coming massive e-learning and blended learning projects are clear examples of this phenomenon. The teaching of oral communication offers a perfect scenario for experimenting with these formulas. Since the traditional face-to-face approach to teaching 'Speech techniques' does not keep up with the new digital environment that surrounds students, it is necessary to move towards an 'Online oratory' model focused on using TEL to improve oral skills.
Modern methods of cost saving of the production activity in construction
NASA Astrophysics Data System (ADS)
Silka, Dmitriy
2017-10-01
Every time the economy faces a recession, cost-saving questions acquire increased urgency. This article shows how companies in the construction industry have switched to a new kind of economic relations over recent years. It notes that the dominant type of economic relations does not allow companies to quickly reorient to the necessary tools in accordance with the new requirements of economic activity, so successful experience from the new environment is in demand. Cost-saving methods that have been proven in other industries are offered as a means of achieving efficiency and competitiveness. The analysis is performed on the example of the retail sector, which, according to authoritative analytical reviews, is extremely innovative at both the local and world economic levels. Among the methods offered, those based on today's unprecedented opportunities for communication and information exchange occupy a special place.
[Psychocardiology: clinically relevant recommendations regarding selected cardiovascular diseases].
Albus, C; Ladwig, K-H; Herrmann-Lingen, C
2014-03-01
Psychosocial risk factors (work stress, low socioeconomic status, impaired social support, anger, anxiety and depression), certain personality traits (e.g. hostility) and post-traumatic stress disorders may negatively influence the incidence and course of multiple cardiovascular disease conditions. Systematic screening for these factors may help to adequately assess the psychosocial risk pattern of a given patient and may also contribute to the treatment of these patients. Recommendations for treatment are based on current guidelines. The physician-patient interaction should follow the principle of patient-centered communication and should take gender- and age-specific aspects into consideration. Integrated biopsychosocial care is an effective, low-threshold option to treat psychosocial risk factors and should be offered on a regular basis. Patients with high blood pressure may profit from relaxation programs and biofeedback procedures (though with only moderate success). An individually adjusted multimodal treatment strategy should be offered to patients with coronary heart disease, heart failure and after heart surgery. It may incorporate educational tools, exercise therapy, motivational modules, relaxation and stress management programs. In case of affective comorbidity, psychotherapy may be indicated. Antidepressant pharmacotherapy with selective serotonin reuptake inhibitors (SSRIs) as first-line treatment should only be offered to patients with at least moderately severe depressive episodes. Psychotherapy and SSRIs, particularly sertraline, have proven to be safe and effective with regard to improvements of the patient's quality of life. A prognostic benefit has not been clearly proven so far. Patients with an implanted cardioverter/defibrillator (ICD) should receive psychosocial support on a regular basis. Concomitant psychotherapy and/or psychopharmacotherapy (SSRIs) should be offered in case of severe mental comorbidity. Generally, tricyclic antidepressants should be avoided in cardiac patients because of adverse side effects. © Georg Thieme Verlag KG Stuttgart · New York.
Effects of Cold Plasma on Food Quality: A Review.
Pankaj, Shashi K; Wan, Zifan; Keener, Kevin M
2018-01-01
Cold plasma (CP) technology has proven very effective as an alternative tool for food decontamination and shelf-life extension. The impact of CP on food quality is very crucial for its acceptance as an alternative food processing technology. Due to the non-thermal nature, CP treatments have shown no or minimal impacts on the physical, chemical, nutritional and sensory attributes of various products. This review also discusses the negative impacts and limitations posed by CP technology for food products. The limited studies on interactions of CP species with food components at the molecular level offers future research opportunities. It also highlights the need for optimization studies to mitigate the negative impacts on visual, chemical, nutritional and functional properties of food products. The design versatility, non-thermal, economical and environmentally friendly nature of CP offers unique advantages over traditional processing technologies. However, CP processing is still in its nascent form and needs further research to reach its potential.
The multi-copy simultaneous search methodology: a fundamental tool for structure-based drug design.
Schubert, Christian R; Stultz, Collin M
2009-08-01
Fragment-based ligand design approaches, such as the multi-copy simultaneous search (MCSS) methodology, have proven to be useful tools in the search for novel therapeutic compounds that bind pre-specified targets of known structure. MCSS offers a variety of advantages over more traditional high-throughput screening methods, and has been applied successfully to challenging targets. The methodology is quite general and can be used to construct functionality maps for proteins, DNA, and RNA. In this review, we describe the main aspects of the MCSS method and outline the general use of the methodology as a fundamental tool to guide the design of de novo lead compounds. We focus our discussion on the evaluation of MCSS results and the incorporation of protein flexibility into the methodology. In addition, we demonstrate on several specific examples how the information arising from the MCSS functionality maps has been successfully used to predict ligand binding to protein targets and RNA.
High productivity machining of holes in Inconel 718 with SiAlON tools
NASA Astrophysics Data System (ADS)
Agirreurreta, Aitor Arruti; Pelegay, Jose Angel; Arrazola, Pedro Jose; Ørskov, Klaus Bonde
2016-10-01
Inconel 718 is often employed in aerospace engines and power generation turbines. Numerous studies have demonstrated enhanced productivity when turning with ceramic tools compared to carbide ones; however, there is considerably less information with regard to milling. Moreover, nothing has been published about machining holes with this type of tool. Additional research on different machining techniques, such as circular ramping, is critical to expand the productivity improvements that ceramics can offer. In this work, a 3D model of the machining process and a number of experiments with SiAlON round inserts have been carried out in order to evaluate the effect of cutting speed and pitch on tool wear and chip generation. The results show that three different types of chips are generated and that there are three potential wear zones. Top slice wear is identified as the most critical wear type, followed by notch wear as a secondary wear mechanism. Flank wear and adhesion are also found in most of the tests.
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2004-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including sensor networks and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The "gap" unfilled by such tools and methods is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
NASA Astrophysics Data System (ADS)
West, P.; Michaelis, J.; Lebot, T.; McGuinness, D. L.; Fox, P. A.
2014-12-01
Providing proper citation and attribution for published data, derived data products, and the software tools used to generate them has always been an important aspect of scientific research. However, it is often the case that this type of detailed citation and attribution is lacking. This is in part because it often requires manual markup, since dynamic generation of this type of provenance information is not typically done by the tools used to access, manipulate, transform, and visualize data. In addition, the tools themselves lack the information needed to be properly cited. The OPeNDAP Hyrax Software Framework is a tool that provides access to different types of data in different data formats and the ability to constrain, manipulate, and transform them into a common format, the DAP (Data Access Protocol), in order to derive new data products. A user, or another software client, specifies an HTTP URL in order to access a particular piece of data and appropriately transform it to suit a specific purpose. The resulting data products, however, do not contain any information about what data was used to create them, or the software process used to generate them, let alone information that would allow proper citation and attribution by downstream researchers and tool developers. We will present our approach to provenance capture in Hyrax, including a mechanism that can be used to report back to the hosting site any derived products, such as publications and reports, using the pingback service described in the W3C PROV recommendations. We will demonstrate our utilization of Semantic Web and Web standards, the development of an information model that extends the PROV model for provenance capture, and the development of the pingback service. We will present our findings, as well as our practices for providing provenance information, visualizing the provenance information, and developing pingback services, to better enable scientists and tool developers to be recognized and properly cited for their contributions.
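As a minimal sketch of what such a pingback report might look like from a client, assuming the data server advertises a pingback URI in an HTTP Link header as outlined in the W3C PROV-AQ note; the URLs below are placeholders, not actual Hyrax endpoints.

```python
# Hedged sketch of a PROV-AQ style pingback; all URLs are hypothetical.
import requests

data_url = "https://example.org/opendap/granule.nc"   # placeholder dataset URL

# 1. Discover the pingback URI advertised by the server via a Link header.
head = requests.head(data_url)
pingback_uri = head.links.get("http://www.w3.org/ns/prov#pingback", {}).get("url")

# 2. Report provenance of a derived product back to the hosting site as a uri-list.
if pingback_uri:
    provenance_uris = "https://example.org/prov/my-derived-figure.provn\n"
    resp = requests.post(pingback_uri, data=provenance_uris,
                         headers={"Content-Type": "text/uri-list"})
    resp.raise_for_status()
```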
Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)
NASA Astrophysics Data System (ADS)
Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.
2017-12-01
We have developed a system that supports the full life cycle of a data analysis process, from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system called Climate Model Diagnostic Analyzer (CMDA) is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle. Data System manages datasets used by CMDA analysis tools, Analysis System manages CMDA analysis tools which are all web services, Provenance System manages the meta data of CMDA datasets and the provenance of CMDA analysis history, and Recommendation System extracts knowledge from CMDA usage history and recommends datasets/analysis tools to users. These four subsystems are not only highly integrated but also easily expandable. New datasets can be easily added to Data System and scanned to be visible to the other subsystems. New analysis tools can be easily registered to be available in the Analysis System and Provenance System. With CMDA, a user can start a data analysis process by discovering datasets of relevance to their research topic using the Recommendation System. Next, the user can customize the discovered datasets for their scientific use (e.g. anomaly calculation, regridding, etc) with tools in the Analysis System. Next, the user can do their analysis with the tools (e.g. conditional sampling, time averaging, spatial averaging) in the Analysis System. Next, the user can reanalyze the datasets based on the previously stored analysis provenance in the Provenance System. Further, they can publish their analysis process and result to the Provenance System to share with other users. Finally, any user can reproduce the published analysis process and results. By supporting the full life cycle of climate data analysis, CMDA improves the research productivity and collaboration level of its user.
NASA Astrophysics Data System (ADS)
Spinuso, Alessandro; Krause, Amy; Ramos Garcia, Clàudia; Casarotti, Emanuele; Magnoni, Federica; Klampanos, Iraklis A.; Frobert, Laurent; Krischer, Lion; Trani, Luca; David, Mario; Leong, Siew Hoon; Muraleedharan, Visakh
2014-05-01
The EU-funded project VERCE (Virtual Earthquake and seismology Research Community in Europe) aims to deploy technologies which satisfy the HPC and data-intensive requirements of modern seismology. As a result of VERCE's official collaboration with the EU project SCI-BUS, access to computational resources, like local clusters and international infrastructures (EGI and PRACE), is made homogeneous and integrated within a dedicated science gateway based on the gUSE framework. In this presentation we give a detailed overview of the progress achieved with the development of the VERCE Science Gateway, according to a use-case driven implementation strategy. More specifically, we show how the computational technologies and data services have been integrated within a tool for Seismic Forward Modelling, whose objective is to offer the possibility to perform simulations of seismic waves as a service to the seismological community. We will introduce the interactive components of the OGC map-based web interface and how it supports the user with setting up the simulation. We will go through the selection of input data, which are either fetched from federated seismological web services, adopting community standards, or provided by the users themselves by accessing their own document data store. The HPC scientific codes can be selected from a number of waveform simulators, currently available to the seismological community as batch tools or with limited configuration capabilities in their interactive online versions. The results will be staged out from the HPC via a secure GridFTP transfer to a VERCE data layer managed by iRODS. The provenance information of the simulation will be automatically cataloged by the data layer via NoSQL technologies. We will demonstrate how data access, validation and visualisation can be supported by a general-purpose provenance framework which, besides common provenance concepts imported from the OPM and the W3C-PROV initiatives, also offers an extensible metadata archive including community and user defined metadata and annotations. Finally, we will show how the VERCE Gateway platform will allow the customisation of pre- and post-processing phases of the simulation workflows, thanks to the availability of a registry of processing elements (PEs), which are easily developed and maintained by the seismologists.
ERIC Educational Resources Information Center
Allington, Richard L.
1992-01-01
Offers summaries of three proven programs (Reading Recovery, Success for All, and Accelerated Schools) for accelerating the reading and writing progress of low-achieving, low-income children. Provides addresses for more information. (SR)
Spatially explicit multi-criteria decision analysis for managing vector-borne diseases
2011-01-01
The complex epidemiology of vector-borne diseases creates significant challenges in the design and delivery of prevention and control strategies, especially in light of rapid social and environmental changes. Spatial models for predicting disease risk based on environmental factors such as climate and landscape have been developed for a number of important vector-borne diseases. The resulting risk maps have proven value for highlighting areas for targeting public health programs. However, these methods generally only offer technical information on the spatial distribution of disease risk itself, which may be incomplete for making decisions in a complex situation. In prioritizing surveillance and intervention strategies, decision-makers often also need to consider spatially explicit information on other important dimensions, such as the regional specificity of public acceptance, population vulnerability, resource availability, intervention effectiveness, and land use. There is a need for a unified strategy for supporting public health decision making that integrates available data for assessing spatially explicit disease risk, with other criteria, to implement effective prevention and control strategies. Multi-criteria decision analysis (MCDA) is a decision support tool that allows for the consideration of diverse quantitative and qualitative criteria using both data-driven and qualitative indicators for evaluating alternative strategies with transparency and stakeholder participation. Here we propose a MCDA-based approach to the development of geospatial models and spatially explicit decision support tools for the management of vector-borne diseases. We describe the conceptual framework that MCDA offers as well as technical considerations, approaches to implementation and expected outcomes. We conclude that MCDA is a powerful tool that offers tremendous potential for use in public health decision-making in general and vector-borne disease management in particular. PMID:22206355
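To make the MCDA approach concrete, the sketch below performs a spatially explicit evaluation with simple additive weighting; the criteria, weights and toy 2x2 grids are illustrative assumptions, not values from the paper.

```python
# Minimal simple-additive-weighting sketch; criterion names, weights and the
# toy grids are invented for illustration.
import numpy as np

# Normalized (0-1) raster layers on a common grid; higher = stronger case for
# targeting surveillance or intervention in that cell.
criteria = {
    "disease_risk":        np.array([[0.9, 0.4], [0.7, 0.2]]),
    "population_density":  np.array([[0.8, 0.6], [0.3, 0.1]]),
    "intervention_access": np.array([[0.5, 0.9], [0.6, 0.4]]),
}
weights = {"disease_risk": 0.5, "population_density": 0.3, "intervention_access": 0.2}

# Weighted linear combination: one priority score per grid cell.
priority = sum(weights[name] * layer for name, layer in criteria.items())
print(priority)   # the highest-scoring cells are candidate priority areas
```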
Valdez, Joshua; Rueschman, Michael; Kim, Matthew; Redline, Susan; Sahoo, Satya S
2016-10-01
Extraction of structured information from biomedical literature is a complex and challenging problem due to the complexity of the biomedical domain and the lack of appropriate natural language processing (NLP) techniques. High-quality domain ontologies model both data and metadata information at a fine level of granularity, which can be effectively used to accurately extract structured information from biomedical text. Extraction of provenance metadata, which describes the history or source of information, from published articles is an important task to support scientific reproducibility. Reproducibility of results reported by previous research studies is a foundational component of scientific advancement. This is highlighted by the recent initiative by the US National Institutes of Health called "Principles of Rigor and Reproducibility". In this paper, we describe an effective approach to extract provenance metadata from published biomedical research literature using an ontology-enabled NLP platform as part of the Provenance for Clinical and Healthcare Research (ProvCaRe) project. The ProvCaRe-NLP tool extends the clinical Text Analysis and Knowledge Extraction System (cTAKES) platform using both provenance and biomedical domain ontologies. We demonstrate the effectiveness of the ProvCaRe-NLP tool using a corpus of 20 peer-reviewed publications. The results of our evaluation demonstrate that the ProvCaRe-NLP tool has significantly higher recall in extracting provenance metadata as compared to existing NLP pipelines such as MetaMap.
Tyndall, Timothy; Tyndall, Ayami
2018-01-01
Healthcare directories are vital for interoperability among healthcare providers, researchers and patients. Past efforts at directory services have not provided the tools to allow integration of the diverse data sources. Many are overly strict, incompatible with legacy databases, and do not provide Data Provenance. A more architecture-independent system is needed to enable secure, GDPR-compatible (8) service discovery across organizational boundaries. We review our development of a portable Data Provenance Toolkit supporting provenance within Health Information Exchange (HIE) systems. The Toolkit has been integrated with client software and successfully leveraged in clinical data integration. The Toolkit validates provenance stored in a Blockchain or Directory record and creates provenance signatures, providing standardized provenance that moves with the data. This healthcare directory suite implements discovery of healthcare data by HIE and EHR systems via FHIR. Shortcomings of past directory efforts include the ability to map complex datasets and enabling interoperability via exchange endpoint discovery. By delivering data without dictating how it is stored we improve exchange and facilitate discovery on a multi-national level through open source, fully interoperable tools. With the development of Data Provenance resources we enhance exchange and improve security and usability throughout the health data continuum.
Chen, Ming-Jun; Cheng, Jian; Yuan, Xiao-Dong; Liao, Wei; Wang, Hai-Jun; Wang, Jing-He; Xiao, Yong; Li, Ming-Quan
2015-01-01
Repairing initial slight damage sites into stable structures by engineering techniques is the leading strategy to mitigate damage growth on large-size components used in laser-driven fusion facilities. For KH2PO4 crystals, which serve as frequency converters and optoelectronic switches (Pockels cells), micro-milling has proven to be the most promising method for fabricating these stable structures. However, tool marks inside the repaired pit are unavoidably introduced due to the wear of the milling cutter in the actual repair process. Here we quantitatively investigate the effect of tool marks on the repair quality of damaged crystal components by simulating the light intensification they induce and testing the laser-induced damage threshold. We found that, due to the formation of focusing hot spots and interference ripples, the light intensity is strongly enhanced in the presence of tool marks, especially those on rear surfaces. Moreover, the negative effect of tool marks is mark-density dependent, and multiple tool marks aggravate the light intensification. Laser damage tests verified the role of tool marks as weak points that reduce repair quality. This work offers a new criterion to comprehensively evaluate the quality of repaired optical surfaces in order to alleviate the bottleneck issue of low laser damage thresholds for optical components in laser-driven fusion facilities. PMID:26399624
Formal Requirements-Based Programming for Complex Systems
NASA Technical Reports Server (NTRS)
Rash, James L.; Hinchey, Michael G.; Rouff, Christopher A.; Gracanin, Denis
2005-01-01
Computer science as a field has not yet produced a general method to mechanically transform complex computer system requirements into a provably equivalent implementation. Such a method would be one major step towards dealing with complexity in computing, yet it remains the elusive holy grail of system development. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that such tools and methods leave unfilled is that the formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of complex systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations. While other techniques are available, this method is unique in offering full mathematical tractability while using notations and techniques that are well known and well trusted. We illustrate the application of the method to an example procedure from the Hubble Robotic Servicing Mission currently under study and preliminary formulation at NASA Goddard Space Flight Center.
Diode Lasers used in Plastic Welding and Selective Laser Soldering - Applications and Products
NASA Astrophysics Data System (ADS)
Reinl, S.
Aside from conventional welding methods, laser welding of plastics has established itself as a proven bonding method. The component-conserving and clean process offers numerous advantages and enables welding of sensitive assemblies in the automotive, electronic, medical, human care, food packaging and consumer electronics markets. Diode lasers have been established in plastic welding applications for years. Soft soldering using laser radiation is also becoming more and more significant in the field of direct diode laser applications. Fast power controllability combined with contactless temperature measurement to minimize thermal damage makes the diode laser an ideal tool for this application. These advantages come into full effect when soldering increasingly small parts in temperature-sensitive environments.
An intravital microscopy model to study early pancreatic inflammation in type 1 diabetes in NOD mice
Lehmann, Christian; Fisher, Nicholas B.; Tugwell, Barna; Zhou, Juan
2016-01-01
ABSTRACT Intravital microscopy (IVM) of the pancreas has been proven to be an invaluable tool in pancreatitis, transplantation and ischemia/reperfusion research. Also in type 1 diabetes (T1D) pancreatic IVM offers unique advantages for the elucidation of the disease process. Female non-obese diabetic (NOD) mice develop T1D spontaneously by 40 weeks of age. Our goal was to establish an IVM-based method to study early pancreatic inflammation in NOD mice, which can be used to screen novel medications to prevent or delay T1D in future studies. This included evaluation of leukocyte-endothelial interactions as well as disturbances of capillary perfusion in the pancreatic microcirculation. PMID:28243521
Automated Generation of Technical Documentation and Provenance for Reproducible Research
NASA Astrophysics Data System (ADS)
Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.
2017-12-01
Data provenance and detailed technical documentation are essential components of high-quality reproducible research, but are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be a difficult task to get right, as it is a time-consuming and often tedious process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, this can be a poor substitute for good technical documentation and is often more difficult for a third party to understand - particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules required by the tool to process the input data is exploited to also produce technical documentation and provenance records with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed, the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.
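The tool itself is not specified in detail here, so the following is only an illustrative sketch, under assumed rule and layer names, of the general pattern the abstract describes: applying ordered classification rules to input layers while regenerating a provenance record on every run.

```python
# Illustrative pattern only, not the tool described above.
import datetime
import json
import numpy as np

slope = np.array([[3.0, 12.0], [25.0, 8.0]])       # toy input layers
rain  = np.array([[900.0, 400.0], [650.0, 1200.0]])

rules = [  # (class id, human-readable rule, boolean mask function)
    (1, "slope < 10 and rain >= 600", lambda s, r: (s < 10) & (r >= 600)),
    (2, "slope < 20",                 lambda s, r: s < 20),
    (3, "otherwise",                  lambda s, r: np.ones_like(s, dtype=bool)),
]

result = np.zeros(slope.shape, dtype=int)
for class_id, _, mask_fn in rules:
    mask = mask_fn(slope, rain) & (result == 0)   # first matching rule wins
    result[mask] = class_id

# The same rule model drives both the classification and the documentation,
# so the provenance record is regenerated on every run.
provenance = {
    "generated_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    "inputs": ["slope", "rain"],
    "rules_applied": [text for _, text, _ in rules],
}
print(result)
print(json.dumps(provenance, indent=2))
```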
Requirements to Design to Code: Towards a Fully Formal Approach to Automatic Code Generation
NASA Technical Reports Server (NTRS)
Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.
2005-01-01
A general-purpose method to mechanically transform system requirements into a provably equivalent model has yet to appear. Such a method represents a necessary step toward high-dependability system engineering for numerous possible application domains, including distributed software systems, sensor networks, robot operation, complex scripts for spacecraft integration and testing, and autonomous systems. Currently available tools and methods that start with a formal model of a system and mechanically produce a provably equivalent implementation are valuable but not sufficient. The gap that current tools and methods leave unfilled is that their formal models cannot be proven to be equivalent to the system requirements as originated by the customer. For the classes of systems whose behavior can be described as a finite (but significant) set of scenarios, we offer a method for mechanically transforming requirements (expressed in restricted natural language, or in other appropriate graphical notations) into a provably equivalent formal model that can be used as the basis for code generation and other transformations.
Femtosecond lasers as novel tool in dental surgery
NASA Astrophysics Data System (ADS)
Serbin, J.; Bauer, T.; Fallnich, C.; Kasenbacher, A.; Arnold, W. H.
2002-09-01
Femtosecond lasers have proven potential for medical applications such as cornea shaping [1], ear surgery and dental surgery [2]. Minimally invasive treatment of carious tissue has become an increasingly important aspect of modern dentistry. State-of-the-art methods such as grinding with turbine-driven drills or ablation by Er:YAG lasers [3] generate mechanical and thermal stress, producing micro-cracks of several tens of microns in the enamel [4]. These cracks are starting points for new carious attacks and have to be avoided for the long-term success of dental treatment. By using femtosecond lasers (1 fs = 10⁻¹⁵ s) for ablating dental tissue, these drawbacks can be overcome. We have demonstrated that femtosecond laser ablation offers a tool for crack-free generation of cavities in dental tissue. Furthermore, spectral analysis of the laser-induced plasma has been used to identify carious oral tissue. Our latest results on femtosecond laser dentistry will be presented, demonstrating the great potential of this kind of laser technology in medicine.
ASDF: An Adaptable Seismic Data Format with Full Provenance
NASA Astrophysics Data System (ADS)
Smith, J. A.; Krischer, L.; Tromp, J.; Lefebvre, M. P.
2015-12-01
In order for seismologists to maximize their knowledge of how the Earth works, they must extract the maximum amount of useful information from all recorded seismic data available for their research. This requires assimilating large sets of waveform data, keeping track of vast amounts of metadata, using validated standards for quality control, and automating the workflow in a careful and efficient manner. In addition, there is a growing gap between CPU/GPU speeds and disk access speeds that leads to an I/O bottleneck in seismic workflows. This is made even worse by existing seismic data formats that were not designed for performance and are limited to a few fixed headers for storing metadata. The Adaptable Seismic Data Format (ASDF) is a new data format for seismology that solves the problems with existing seismic data formats and integrates full provenance into the definition. ASDF is a self-describing format that features parallel I/O using the parallel HDF5 library. This makes it a great choice for use on HPC clusters. The format integrates the standards QuakeML for seismic sources and StationXML for receivers. ASDF is suitable for storing earthquake data sets, where all waveforms for a single earthquake are stored in one file, as well as ambient noise cross-correlations and adjoint sources. The format comes with a user-friendly Python reader and writer that gives seismologists access to a full set of Python tools for seismology. There is also a faster C/Fortran library for integrating ASDF into performance-focused numerical wave solvers, such as SPECFEM3D_GLOBE. Finally, a GUI tool designed for visually exploring the format exists that provides a flexible interface for both research and educational applications. ASDF is a new seismic data format that offers seismologists high-performance parallel processing, organized and validated contents, and full provenance tracking for automated seismological workflows.
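As an informal illustration of the Python interface mentioned above, the sketch below assumes the pyasdf package (the ASDF reader/writer that builds on ObsPy); the file name and waveform tag are illustrative, and the exact calls should be checked against the pyasdf documentation.

```python
# Sketch: writing and re-reading an ASDF volume with pyasdf (assumed installed).
import obspy
import pyasdf

# Create a new ASDF file and attach event, station, and waveform data.
ds = pyasdf.ASDFDataSet("example_earthquake.h5")
ds.add_quakeml(obspy.read_events())          # ObsPy's bundled example catalogue
ds.add_stationxml(obspy.read_inventory())    # bundled example StationXML inventory
ds.add_waveforms(obspy.read(), tag="raw_recording")

# List the stored stations and pull waveforms back out as ObsPy Stream objects.
for name in ds.waveforms.list():
    stream = ds.waveforms[name].raw_recording
    print(name, len(stream), "trace(s)")
del ds  # flushes and closes the underlying HDF5 file
```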
NASA Astrophysics Data System (ADS)
Cleven, Nathan; Lin, Shoufa; Davis, Donald; Xiao, Wenjiao; Guilmette, Carl
2017-04-01
This work expands upon detrital zircon geochronology with a sampling and analysis strategy dating granitoid conglomerate clasts that exhibit differing degrees of internal ductile deformation. As deformation textures within clastic material reflect the variation and history of tectonization in the source region of a deposit, we outline a dating methodology that can provide details of the provenance's tectonomagmatic history from deformation-relative age distributions. The method involves bulk samples of solely granitoid clasts, as they are representative of the magmatic framework within the provenance. The clasts are classified and sorted into three subsets: undeformed, slightly deformed, and deformed. LA-ICPMS U-Pb geochronology is performed on zircon separates of each subset. Our case study, involving the Permian Hongliuhe Formation in the southern Central Asian Orogenic Belt, analyzes each of the three clast subsets, as well as sandstone detrital samples, at three stratigraphic levels to yield a profile of the unroofed provenance. The age spectra of the clast samples exhibit different, wider distributions than sandstone samples, considered an effect of proximity to the respective provenance. Comparisons of clast data to sandstone data, as well as comparisons between stratigraphic levels, yield indications of key tectonic processes, in addition to the typical characteristics provided by detrital geochronology. The clast data indicates a minimal lag time, implying rapid exhumation rates, whereas sandstone data alone would indicate a 90 m.y. lag time. Early Paleozoic arc building episodes appear as Ordovician peaks in sandstone data, and Silurian-Devonian peaks in clast data, indicating a younging of magmatism towards the proximal provenance. A magmatic hiatus starts in the Devonian, correlating with the latest age of deformed clasts, interpreted as timing of collisional tectonics. Provenance interpretation using the correlations seen between the clast and sandstone data proves to be more detailed and more robust than that determined from sandstone samples alone. The variably tectonized clast detrital geochronology method offers a regional reconnaissance tool that can address the practical limits of studying regional granitoid distributions.
Scientific Workflows + Provenance = Better (Meta-)Data Management
NASA Astrophysics Data System (ADS)
Ludaescher, B.; Cuevas-Vicenttín, V.; Missier, P.; Dey, S.; Kianmajd, P.; Wei, Y.; Koop, D.; Chirigati, F.; Altintas, I.; Belhajjame, K.; Bowers, S.
2013-12-01
The origin and processing history of an artifact is known as its provenance. Data provenance is an important form of metadata that explains how a particular data product came about, e.g., how and when it was derived in a computational process, which parameter settings and input data were used, etc. Provenance information provides transparency and helps to explain and interpret data products. Other common uses and applications of provenance include quality control, data curation, result debugging, and more generally, 'reproducible science'. Scientific workflow systems (e.g. Kepler, Taverna, VisTrails, and others) provide controlled environments for developing computational pipelines with built-in provenance support. Workflow results can then be explained in terms of workflow steps, parameter settings, input data, etc. using provenance that is automatically captured by the system. Scientific workflows themselves provide a user-friendly abstraction of the computational process and are thus a form of ('prospective') provenance in their own right. The full potential of provenance information is realized when combining workflow-level information (prospective provenance) with trace-level information (retrospective provenance). To this end, the DataONE Provenance Working Group (ProvWG) has developed an extension of the W3C PROV standard, called D-PROV. Whereas PROV provides a 'least common denominator' for exchanging and integrating provenance information, D-PROV adds new 'observables' that described workflow-level information (e.g., the functional steps in a pipeline), as well as workflow-specific trace-level information ( timestamps for each workflow step executed, the inputs and outputs used, etc.) Using examples, we will demonstrate how the combination of prospective and retrospective provenance provides added value in managing scientific data. The DataONE ProvWG is also developing tools based on D-PROV that allow scientists to get more mileage from provenance metadata. DataONE is a federation of member nodes that store data and metadata for discovery and access. By enriching metadata with provenance information, search and reuse of data is enhanced, and the 'social life' of data (being the product of many workflow runs, different people, etc.) is revealed. We are currently prototyping a provenance repository (PBase) to demonstrate what can be achieved with advanced provenance queries. The ProvExplorer and ProPub tools support advanced ad-hoc querying and visualization of provenance as well as customized provenance publications (e.g., to address privacy issues, or to focus provenance to relevant details). In a parallel line of work, we are exploring ways to add provenance support to widely-used scripting platforms (e.g. R and Python) and then expose that information via D-PROV.
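As a minimal, hedged sketch of the prospective/retrospective distinction described above, the snippet below uses the general-purpose Python prov package rather than D-PROV itself; all "ex:" identifiers are invented, and the hadMember/wasInfluencedBy links stand in for D-PROV's richer workflow relations.

```python
# Sketch: combining workflow-level (prospective) and trace-level (retrospective)
# provenance statements with the Python "prov" package.
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace("ex", "http://example.org/provenance/")

# Prospective side: the workflow definition and one of its declared steps.
workflow = doc.entity("ex:workflow-ocean-qc", {"ex:kind": "workflow-definition"})
step_def = doc.entity("ex:step-despike")
doc.hadMember(workflow, step_def)          # the step belongs to the workflow

# Retrospective side: one concrete execution of that step.
run = doc.activity("ex:run-despike-001")
raw = doc.entity("ex:raw-ctd-data")
clean = doc.entity("ex:clean-ctd-data")

doc.used(run, raw)
doc.wasGeneratedBy(clean, run)
doc.wasDerivedFrom(clean, raw)
# Simplified stand-in for D-PROV's workflow relations: tie the run back to
# the step definition it instantiates.
doc.wasInfluencedBy(run, step_def)

print(doc.get_provn())
```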
NASA Astrophysics Data System (ADS)
Kemp, C.; Car, N. J.
2016-12-01
Geoscience Australia (GA) is a government agency that provides advice on the geology and geography of Australia. It is the custodian of many digital and physical datasets of national significance. For several years GA has been implementing an enterprise approach to provenance management. The goal is transparency and reproducibility for all of GA's information products, an objective supported at the highest levels and explicitly listed in its Science Principles. Currently GA is finalising a set of enterprise tools to assist with provenance management and rolling out provenance reporting to different science areas. GA has adopted or developed: provenance storage systems; provenance collection code libraries (for use within automated systems); reporting interfaces (for manual use); and provenance representation capability within legacy catalogues. Using these tools within GA's science areas involves modelling the scenario first and then assessing whether the area has its data managed in such a way that allows links to data within provenance to be resolvable in perpetuity. We don't just want to represent provenance (demonstrating transparency), we want to access data via provenance (allowing for reproducibility). A subtask of GA's current work is to link physical samples to information products (datasets, reports, papers) by uniquely and persistently identifying samples using International GeoSample Numbers and then modelling automated and manual laboratory workflows and associated tasks, such as data delivery to corporate databases, using the W3C's PROV Data Model. We use PROV-DM throughout our modelling and systems. We are also moving to deliver all sample and digital dataset metadata across the agency in the Web Ontology Language (OWL) and exposing it via Linked Data methods in order to allow Semantic Web querying of multiple systems, allowing provenance to be leveraged through a single method and query point. Through the Science First Transformation Program, GA is undergoing a significant rethinking of its data architecture, curation and access to support the Digital Science capability, for which provenance management is an output.
2009-01-01
The means we use to record the process of carrying out research remains tied to the concept of a paginated paper notebook despite the advances over the past decade in web-based communication and publication tools. The development of these tools offers an opportunity to re-imagine what the laboratory record would look like if it were re-built in a web-native form. In this paper I describe a distributed approach to the laboratory record which uses the most appropriate tool available to house and publish each specific object created during the research process, whether that be a physical sample, a digital data object, or the record of how one was created from another. I propose that the web-native laboratory record would act as a feed of relationships between these items. This approach can be seen as complementary to, rather than competitive with, integrative approaches that aim to aggregate relevant objects together to describe knowledge. The potential for the recent announcement of the Google Wave protocol to have a significant impact on realizing this vision is discussed along with the issues of security and provenance that are raised by such an approach. PMID:20098590
[What do virtual reality tools bring to child and adolescent psychiatry?]
Bioulac, S; de Sevin, E; Sagaspe, P; Claret, A; Philip, P; Micoulaud-Franchi, J A; Bouvard, M P
2018-06-01
Virtual reality is a relatively new technology that enables individuals to immerse themselves in a virtual world. It offers several advantages, including a more realistic, lifelike environment that may allow subjects to "forget" they are being assessed, allow better participation and increase the generalization of learning. Moreover, the virtual reality system can provide multimodal stimuli, such as visual and auditory stimuli, and can also be used to evaluate the patient's multimodal integration and to aid rehabilitation of cognitive abilities. The use of virtual reality to treat various psychiatric disorders in adults (phobic anxiety disorders, post-traumatic stress disorder, eating disorders, addictions…) is supported by numerous studies. Similar research for children and adolescents is lagging behind. Virtual reality may be particularly beneficial to children, who often show great interest and considerable success on computer, console or videogame tasks. This article will present the main studies that have used virtual reality with children and adolescents suffering from psychiatric disorders. The use of virtual reality to treat anxiety disorders in adults is gaining popularity and its efficacy is supported by various studies. Most of the studies attest to the significant efficacy of virtual reality exposure therapy (or in virtuo exposure). In children, studies have covered arachnophobia, social anxiety and school refusal phobia. Despite the limited number of studies, results are very encouraging for treatment of anxiety disorders. Several studies have reported the clinical use of virtual reality technology for children and adolescents with autistic spectrum disorders (ASD). Extensive research has proven the efficiency of these technologies as support tools for therapy, with a focus on communication and on learning and social imitation skills. Virtual reality is also well accepted by subjects with ASD. The virtual environment offers the opportunity to administer controlled tasks such as typical neuropsychological tools, but in an environment much more like a standard classroom. The virtual reality classroom offers several advantages compared to classical tools, such as a more realistic and lifelike environment, while also recording various measures in standardized conditions. Most of the studies using a virtual classroom have found that children with Attention Deficit/Hyperactivity Disorder make significantly fewer correct hits and more commission errors compared with controls. The virtual classroom has proven to be a good clinical tool for evaluation of attention in ADHD. For eating disorders, a cognitive behavioural therapy (CBT) program enhanced by a body image-specific component using virtual reality techniques was shown to be more efficient than cognitive behavioural therapy alone. The body image-specific component using virtual reality techniques boosts efficiency and accelerates the CBT change process for eating disorders. Virtual reality is a relatively new technology and its application in child and adolescent psychiatry is recent. However, this technique is still in its infancy and much work is needed, including controlled trials, before it can be introduced in routine clinical use. Virtual reality interventions should also investigate how newly acquired skills are transferred to the real world. At present virtual reality can be considered a useful tool in evaluation and treatment of child and adolescent disorders. Copyright © 2017 L'Encéphale, Paris.
Published by Elsevier Masson SAS. All rights reserved.
X-ray imaging detectors for synchrotron and XFEL sources
Hatsui, Takaki; Graafsma, Heinz
2015-01-01
Current trends for X-ray imaging detectors based on hybrid and monolithic detector technologies are reviewed. Hybrid detectors with photon-counting pixels have proven to be very powerful tools at synchrotrons. Recent developments continue to improve their performance, especially for higher spatial resolution at higher count rates with higher frame rates. Recent developments for X-ray free-electron laser (XFEL) experiments provide high-frame-rate integrating detectors with both high sensitivity and high peak signal. Similar performance improvements are sought in monolithic detectors. The monolithic approach also offers a lower noise floor, which is required for the detection of soft X-ray photons. The link between technology development and detector performance is described briefly in the context of potential future capabilities for X-ray imaging detectors. PMID:25995846
Chemical mapping of cytosines enzymatically flipped out of the DNA helix
Liutkevičiūtė, Zita; Tamulaitis, Gintautas; Klimašauskas, Saulius
2008-01-01
Haloacetaldehydes can be employed for probing unpaired DNA structures involving cytosine and adenine residues. Using an enzyme that was structurally proven to flip its target cytosine out of the DNA helix, the HhaI DNA methyltransferase (M.HhaI), we demonstrate the suitability of the chloroacetaldehyde modification for mapping extrahelical (flipped-out) cytosine bases in protein–DNA complexes. The generality of this method was verified with two other DNA cytosine-5 methyltransferases, M.AluI and M.SssI, as well as with two restriction endonucleases, R.Ecl18kI and R.PspGI, which represent a novel class of base-flipping enzymes. Our results thus offer a simple and convenient laboratory tool for detection and mapping of flipped-out cytosines in protein–DNA complexes. PMID:18450817
Oller, Stephen D
2005-01-01
The pragmatic mapping process and its variants have proven effective in second language learning and teaching. The goal of this paper is to show that the same process applies in teaching and intervention with disordered populations. A secondary goal, ultimately more important, is to give clinicians, teachers, and other educators a tool-kit, or a framework, from which they can evaluate and implement interventions. What is offered is an introduction to a general theory of signs and some examples of how it can be applied in treating communication disorders. (1) Readers will be able to relate the three theoretical consistency requirements to language teaching and intervention. (2) Readers will be introduced to a general theory of signs that provides a basis for evaluating and implementing interventions.
2016-10-14
ABSTRACT: DTAGs have proven to be a valuable tool for the study of marine mammal acoustics and fine-scale motion. The success of the DTAG has resulted in... (keywords: Underwater Acoustics, Digital Communications) ...continuously monitor and improve the tag design. OBJECTIVES: DTAGs have proven to be a valuable tool for the study of marine mammal acoustics and fine-scale motion.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ragan, Eric D; Goodall, John R
2014-01-01
Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.
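As one illustration of the kind of quantitative step-ordering analysis mentioned above (not the authors' protocol), a recalled step sequence can be scored against the baseline process with a rank correlation; the step names below are invented and SciPy is assumed.

```python
# Illustrative sketch: scoring how well a recalled step ordering matches the
# baseline analytic process using Kendall's tau.
from scipy.stats import kendalltau

baseline_steps = ["load", "filter", "cluster", "label", "report"]
recalled_order = ["load", "cluster", "filter", "label", "report"]

# Convert the recalled sequence to ranks relative to the baseline ordering.
baseline_ranks = list(range(len(baseline_steps)))
recalled_ranks = [baseline_steps.index(step) for step in recalled_order]

tau, p_value = kendalltau(baseline_ranks, recalled_ranks)
print(f"step-order agreement: tau={tau:.2f} (p={p_value:.3f})")
```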
ERIC Educational Resources Information Center
Hope, Alice
2018-01-01
Literature that addresses young children's learning in galleries and museums typically concentrates on what is already offered and discusses what has proven to be effective, or not, in accommodating their needs. This article offers insight into how objects can be explored with early years children at school, to create greater understanding of…
Xu, Shen; Rogers, Toby; Fairweather, Elliot; Glenn, Anthony; Curran, James; Curcin, Vasa
2018-01-01
Data provenance is a technique that describes the history of digital objects. In health data settings, it can be used to deliver auditability and transparency, and to achieve trust in a software system. However, implementing data provenance in analytics software at an enterprise level presents a different set of challenges from the research environments where data provenance was originally devised. In this paper, the challenges of reporting provenance information to the user are presented. Provenance captured from analytics software can be large and complex, and visualizing a series of tasks over a long period can be overwhelming even for a domain expert, requiring visual aggregation mechanisms that fit with the complex human cognitive activities involved in the process. This research studied how provenance-based reporting can be integrated into health data analytics software, using the example of the Atmolytics visual reporting tool. PMID:29888084
Tracking Provenance of Earth Science Data
NASA Technical Reports Server (NTRS)
Tilmes, Curt; Yesha, Yelena; Halem, Milton
2010-01-01
Tremendous volumes of data have been captured, archived and analyzed. Sensors, algorithms and processing systems for transforming and analyzing the data are evolving over time. Web Portals and Services can create transient data sets on-demand. Data are transferred from organization to organization with additional transformations at every stage. Provenance in this context refers to the source of data and a record of the process that led to its current state. It encompasses the documentation of a variety of artifacts related to particular data. Provenance is important for understanding and using scientific datasets, and critical for independent confirmation of scientific results. Managing provenance throughout scientific data processing has gained interest lately and there are a variety of approaches. Large scale scientific datasets consisting of thousands to millions of individual data files and processes offer particular challenges. This paper uses the analogy of art history provenance to explore some of the concerns of applying provenance tracking to earth science data. It also illustrates some of the provenance issues with examples drawn from the Ozone Monitoring Instrument (OMI) Data Processing System (OMIDAPS) run at NASA's Goddard Space Flight Center by the first author.
Key Provenance of Earth Science Observational Data Products
NASA Astrophysics Data System (ADS)
Conover, H.; Plale, B.; Aktas, M.; Ramachandran, R.; Purohit, P.; Jensen, S.; Graves, S. J.
2011-12-01
As the sheer volume of data increases, particularly evidenced in the earth and environmental sciences, local arrangements for sharing data need to be replaced with reliable records about the what, who, how, and where of a data set or collection. This is frequently called the provenance of a data set. While observational data processing systems in the earth sciences have a long history of capturing metadata about the processing pipeline, current processes are limited in both what is captured and how it is disseminated to the science community. Provenance capture plays a role in scientific data preservation and stewardship precisely because it can automatically capture and represent a coherent picture of the what, how and who of a particular scientific collection. It reflects the transformations that a data collection underwent prior to its current form and the sequence of tasks that were executed and data products applied to generate a new product. In the NASA-funded Instant Karma project, we examine provenance capture in earth science applications, specifically the Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E) Science Investigator-led Processing System (SIPS). The project is integrating the Karma provenance collection and representation tool into the AMSR-E SIPS production environment, with an initial focus on Sea Ice. This presentation will describe capture and representation of provenance that is guided by the Open Provenance Model (OPM). Several things have become clear during the course of the project to date. One is that core OPM entities and relationships are not adequate for expressing the kinds of provenance that are of interest in the science domain. OPM supports name-value pair annotations that can be used to augment what is known about the provenance entities and relationships, but in Karma, annotations cannot be added during capture, only after the fact. This limits the capture system's ability to record something it learned about an entity after the event of its creation in the provenance record. We will discuss extensions to the Open Provenance Model (OPM) and modifications to the Karma tool suite to address this issue, more efficient representations of earth science kinds of provenance, and the definition of metadata structures for capturing related knowledge about the data products and science algorithms used to generate them. Use scenarios for provenance information are an active topic of investigation. It has additionally become clear through the project that not all provenance is created equal. In processing pipelines, some provenance is repetitive and uninteresting, and because of the volume of provenance, this obscures the interesting pieces of provenance. Methodologies to reveal science-relevant provenance will be presented, along with a preview of the AMSR-E Provenance Browser.
Navigating moral distress using the moral distress map.
Dudzinski, Denise Marie
2016-05-01
The plethora of literature on moral distress has substantiated and refined the concept, provided data about clinicians' (especially nurses') experiences, and offered advice for coping. Fewer scholars have explored what makes moral distress moral. If we acknowledge that patient care can be distressing in the best of ethical circumstances, then differentiating distress and moral distress may refine the array of actions that are likely to ameliorate it. This article builds upon scholarship exploring the normative and conceptual dimensions of moral distress and introduces a new tool to map moral distress from emotional source to corrective actions. The Moral Distress Map has proven useful in clinical teaching and ethics-related debriefings. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Hannerz, Mats; Westin, Johan
2005-09-01
Reforestation with provenances from locations remote from the planting site (transferred provenances) or the progeny of trees of local provenances selected for superior form and vigor (plus trees) offer alternative means to increase yield over that obtained by the use of seed from unselected trees of the local provenance. Under Swedish conditions, Norway spruce (Picea abies (L.) Karst.) of certain transferred provenances generally has an advantage in productivity relative to the local provenance comparable to that of progeny of plus trees. The aim of this study was to explore the extent to which productivity gains achieved by provenance transfer or the use of plus tree progeny are associated with reductions in autumn frost hardiness, relative to that of trees of the local provenance. In a field trial with 19-year-old trees in central Sweden, bud hardiness was tested on four occasions during the autumn of 2002. Trees of the local provenance were compared with trees of a south Swedish provenance originating 3 degrees of latitude to the south, a Belarusian provenance and the progeny of plus trees of local origin. The Belarusian provenance was the least hardy and the local provenance the most hardy, with plus tree progeny and the south Swedish provenance being intermediate in hardiness. Both the Belarusian provenance and the plus tree progeny were significantly taller than trees of the other populations. Within provenances, tree height was negatively correlated with autumn frost hardiness. Among the plus tree progeny, however, no such correlation between tree height and autumn frost hardiness was found. It is concluded that although the gain in productivity achieved by provenance transfer from Belarus was comparable to that achieved by using the progeny of plus trees of the local provenance, the use of trees of the Belarus provenance involved an increased risk of autumn frost damage because of later hardening.
The Discovery Dome: A Tool for Increasing Student Engagement
NASA Astrophysics Data System (ADS)
Brevik, Corinne
2015-04-01
The Discovery Dome is a portable full-dome theater that plays professionally-created science films. Developed by the Houston Museum of Natural Science and Rice University, this inflatable planetarium offers a state-of-the-art visual learning experience that can address many different fields of science for any grade level. It surrounds students with roaring dinosaurs, fascinating planets, and explosive storms - all immersive, engaging, and realistic. Dickinson State University has chosen to utilize its Discovery Dome to address Earth Science education at two levels. University courses across the science disciplines can use the Discovery Dome as part of their curriculum. The digital shows immerse the students in various topics ranging from astronomy to geology to weather and climate. The dome has proven to be a valuable tool for introducing new material to students as well as for reinforcing concepts previously covered in lectures or laboratory settings. The Discovery Dome also serves as an amazing science public-outreach tool. University students are trained to run the dome, and they travel with it to schools and libraries around the region. During the 2013-14 school year, our Discovery Dome visited over 30 locations. Many of the schools visited are in rural settings which offer students few opportunities to experience state-of-the-art science technology. The school kids are extremely excited when the Discovery Dome visits their community, and they will talk about the experience for many weeks. Traveling with the dome is also very valuable for the university students who get involved in the program. They become very familiar with the science content, and they gain experience working with teachers as well as the general public. They get to share their love of science, and they get to help inspire a new generation of scientists.
Provenance for Runtime Workflow Steering and Validation in Computational Seismology
NASA Astrophysics Data System (ADS)
Spinuso, A.; Krischer, L.; Krause, A.; Filgueira, R.; Magnoni, F.; Muraleedharan, V.; David, M.
2014-12-01
Provenance systems may be offered by modern workflow engines to collect metadata about the data transformations at runtime. If combined with effective visualisation and monitoring interfaces, these provenance recordings can speed up the validation process of an experiment, suggesting interactive or automated interventions with immediate effects on the lifecycle of a workflow run. For instance, in the field of computational seismology, if we consider research applications performing long-lasting cross-correlation analysis and high-resolution simulations, the immediate notification of logical errors and the rapid access to intermediate results can produce reactions which foster a more efficient progress of the research. These applications are often executed in secured and sophisticated HPC and HTC infrastructures, highlighting the need for a comprehensive framework that facilitates the extraction of fine-grained provenance and the development of provenance-aware components, leveraging the scalability characteristics of the adopted workflow engines, whose enactment can be mapped to different technologies (MPI, Storm clusters, etc). This work looks at the adoption of W3C-PROV concepts and data model within a user-driven processing and validation framework for seismic data, supporting also computational and data management steering. Validation needs to balance automation with user intervention, considering the scientist as part of the archiving process. Therefore, the provenance data is enriched with community-specific metadata vocabularies and control messages, making an experiment reproducible and its description consistent with the community understandings. Moreover, it can contain user-defined terms and annotations. The current implementation of the system is supported by the EU-funded VERCE project (http://verce.eu). It provides, as well as the provenance generation mechanisms, a prototypal browser-based user interface and a web API built on top of a NoSQL storage technology, experimenting with ways to ensure rapid and flexible access to the lineage traces. It supports the users with the visualisation of graphical products and offers combined operations to access and download the data which may be selectively stored at runtime into dedicated data archives.
Towards structured sharing of raw and derived neuroimaging data across existing resources
Keator, D.B.; Helmer, K.; Steffener, J.; Turner, J.A.; Van Erp, T.G.M.; Gadde, S.; Ashish, N.; Burns, G.A.; Nichols, B.N.
2013-01-01
Data sharing efforts increasingly contribute to the acceleration of scientific discovery. Neuroimaging data is accumulating in distributed domain-specific databases and there is currently no integrated access mechanism nor an accepted format for the critically important meta-data that is necessary for making use of the combined, available neuroimaging data. In this manuscript, we present work from the Derived Data Working Group, an open-access group sponsored by the Biomedical Informatics Research Network (BIRN) and the International Neuroimaging Coordinating Facility (INCF) focused on practical tools for distributed access to neuroimaging data. The working group develops models and tools facilitating the structured interchange of neuroimaging meta-data and is making progress towards a unified set of tools for such data and meta-data exchange. We report on the key components required for integrated access to raw and derived neuroimaging data as well as associated meta-data and provenance across neuroimaging resources. The components include (1) a structured terminology that provides semantic context to data, (2) a formal data model for neuroimaging with robust tracking of data provenance, (3) a web service-based application programming interface (API) that provides a consistent mechanism to access and query the data model, and (4) a provenance library that can be used for the extraction of provenance data by image analysts and imaging software developers. We believe that the framework and set of tools outlined in this manuscript have great potential for solving many of the issues the neuroimaging community faces when sharing raw and derived neuroimaging data across the various existing database systems for the purpose of accelerating scientific discovery. PMID:23727024
Giusti, Chad; Ghrist, Robert; Bassett, Danielle S
2016-08-01
The language of graph theory, or network science, has proven to be an exceptional tool for addressing myriad problems in neuroscience. Yet, the use of networks is predicated on a critical simplifying assumption: that the quintessential unit of interest in a brain is a dyad - two nodes (neurons or brain regions) connected by an edge. While rarely mentioned, this fundamental assumption inherently limits the types of neural structure and function that graphs can be used to model. Here, we describe a generalization of graphs that overcomes these limitations, thereby offering a broad range of new possibilities in terms of modeling and measuring neural phenomena. Specifically, we explore the use of simplicial complexes: a structure developed in the field of mathematics known as algebraic topology, of increasing applicability to real data due to a rapidly growing computational toolset. We review the underlying mathematical formalism as well as the budding literature applying simplicial complexes to neural data, from electrophysiological recordings in animal models to hemodynamic fluctuations in humans. Based on the exceptional flexibility of the tools and recent ground-breaking insights into neural function, we posit that this framework has the potential to eclipse graph theory in unraveling the fundamental mysteries of cognition.
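A minimal sketch of the idea, not taken from the paper: starting from an ordinary graph of pairwise connections, its clique complex promotes every set of mutually connected nodes to a higher-dimensional simplex. The toy "coactivity" graph below is invented, and the networkx package is assumed.

```python
# Illustrative sketch: from a graph (dyads only) to its clique complex, whose
# higher-dimensional simplices capture multi-node interactions.
import networkx as nx

# A toy coactivity graph among five brain regions.
G = nx.Graph([("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("D", "E"), ("C", "E")])

# Every clique of k+1 mutually connected nodes becomes a k-simplex.
simplices = sorted(
    (tuple(sorted(clique)) for clique in nx.enumerate_all_cliques(G)),
    key=len,
)
for s in simplices:
    print(f"{len(s) - 1}-simplex:", s)
# 0-simplices are the nodes, 1-simplices the edges, and ("A", "B", "C") appears
# as a 2-simplex: a triadic interaction a plain graph cannot represent directly.
```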
Lightweight Provenance Service for High-Performance Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dai, Dong; Chen, Yong; Carns, Philip
Provenance describes detailed information about the history of a piece of data, containing the relationships among elements such as users, processes, jobs, and workflows that contribute to the existence of data. Provenance is key to supporting many data management functionalities that are increasingly important in operations such as identifying data sources, parameters, or assumptions behind a given result; auditing data usage; or understanding details about how inputs are transformed into outputs. Despite its importance, however, provenance support is largely underdeveloped in highly parallel architectures and systems. One major challenge is the demanding requirements of providing provenance service in situ. The need to remain lightweight and to be always on often conflicts with the need to be transparent and offer an accurate catalog of details regarding the applications and systems. To tackle this challenge, we introduce a lightweight provenance service, called LPS, for high-performance computing (HPC) systems. LPS leverages a kernel instrument mechanism to achieve transparency and introduces representative execution and flexible granularity to capture comprehensive provenance with controllable overhead. Extensive evaluations and use cases have confirmed its efficiency and usability. We believe that LPS can be integrated into current and future HPC systems to support a variety of data management needs.
A Model for Developing Meta-Cognitive Tools in Teacher Apprenticeships
ERIC Educational Resources Information Center
Bray, Paige; Schatz, Steven
2013-01-01
This research investigates a model for developing meta-cognitive tools to be used by pre-service teachers during apprenticeship (student teaching) experience to operationalise the epistemological model of Cook and Brown (2009). Meta-cognitive tools have proven to be effective for increasing performance and retention of undergraduate students.…
BioQ: tracing experimental origins in public genomic databases using a novel data provenance model.
Saccone, Scott F; Quan, Jiaxi; Jones, Peter L
2012-04-15
Public genomic databases, which are often used to guide genetic studies of human disease, are now being applied to genomic medicine through in silico integrative genomics. These databases, however, often lack tools for systematically determining the experimental origins of the data. We introduce a new data provenance model that we have implemented in a public web application, BioQ, for assessing the reliability of the data by systematically tracing its experimental origins to the original subjects and biologics. BioQ allows investigators to both visualize data provenance as well as explore individual elements of experimental process flow using precise tools for detailed data exploration and documentation. It includes a number of human genetic variation databases such as the HapMap and 1000 Genomes projects. BioQ is freely available to the public at http://bioq.saclab.net.
Association Between Availability of a Price Transparency Tool and Outpatient Spending.
Desai, Sunita; Hatfield, Laura A; Hicks, Andrew L; Chernew, Michael E; Mehrotra, Ateev
2016-05-03
There is increasing interest in using price transparency tools to decrease health care spending. This study measured the association between offering a health care price transparency tool and outpatient spending. Two large employers represented in multiple market areas across the United States offered an online health care price transparency tool to their employees. One introduced it on April 1, 2011, and the other on January 1, 2012. The tool provided users information about what they would pay out of pocket for services from different physicians, hospitals, or other clinical sites. Using a matched difference-in-differences design, outpatient spending among employees offered the tool (n=148,655) was compared with that among employees from other companies not offered the tool (n=295,983) in the year before and after it was introduced. The exposure was availability of the price transparency tool; the outcomes were annual outpatient spending, outpatient out-of-pocket spending, and use rates of the tool. Mean outpatient spending among employees offered the tool was $2021 in the year before the tool was introduced and $2233 in the year after. In comparison, among controls, mean outpatient spending changed from $1985 to $2138. After adjusting for demographic and health characteristics, being offered the tool was associated with a mean $59 (95% CI, $25-$93) increase in outpatient spending. Mean outpatient out-of-pocket spending among those offered the tool was $507 in the year before introduction of the tool and $555 in the year after. Among the comparison group, mean outpatient out-of-pocket spending changed from $490 to $520. Being offered the price transparency tool was associated with a mean $18 (95% CI, $12-$25) increase in out-of-pocket spending after adjusting for relevant factors. In the first 12 months, 10% of employees who were offered the tool used it at least once. Among employees at 2 large companies, offering a price transparency tool was not associated with lower health care spending. The tool was used by only a small percentage of eligible employees.
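For orientation only, the unadjusted difference-in-differences arithmetic implied by the means above is sketched below; it happens to equal the reported $59 figure, although the study's estimate was additionally adjusted for demographic and health characteristics.

```python
# Minimal sketch (study means, not the underlying data): the unadjusted
# difference-in-differences estimate of the spending change.
pre_treated, post_treated = 2021.0, 2233.0   # employees offered the tool
pre_control, post_control = 1985.0, 2138.0   # comparison employees

did = (post_treated - pre_treated) - (post_control - pre_control)
print(f"difference-in-differences estimate: ${did:.0f} per employee")  # $59
```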
Traditional transcutaneous approaches in head and neck surgery
Goessler, Ulrich R.
2012-01-01
The treatment of laryngeal and hypopharyngeal malignancies remains a challenging task for the head and neck surgeon, as the chosen treatment modality often has to bridge the gap between oncologically sound radicality and preservation of function. Due to the increase in transoral laser surgery in early tumor stages and chemoradiation in advanced stages, the usage of traditional transcutaneous approaches has decreased over the recent past. In addition, the need for a function-sparing surgical approach as well as the highest possible quality of life has become evident. In view of these facts, the rationale and importance of traditional transcutaneous approaches to the treatment of laryngeal and hypopharyngeal malignancies are discussed in a contemporary context. The transcutaneous open partial laryngectomies remain a valuable tool in the surgeon's armamentarium for the treatment of early and advanced laryngeal carcinomas, especially in cases where laryngeal overview using the rigid laryngoscope is impossible. Open partial laryngectomies offer a superior overview and oncologic safety at the anterior commissure, especially in recurrences. In select advanced cases and salvage settings, the supracricoid laryngectomy offers a valuable tool for function-preserving but oncologically safe surgical therapy, at the cost of high postoperative morbidity and a very demanding rehabilitation of swallowing. In hypopharyngeal malignancies, the increasing use of transoral laser surgery has led to a decline in transcutaneous resections via partial pharyngectomy with partial laryngectomy in early tumor stages. In advanced stages of tumors of the piriform sinus and the postcricoid area with involvement of the larynx, total laryngectomy with partial pharyngectomy is an oncologically safe approach. The radical surgical approach using circumferential laryngopharyngectomy with or without esophagectomy is indicated in salvage cases with advanced recurrences, or as a primary surgical approach in patients where chemoradiation does not offer sufficient oncologic control or preservation of function. In cases requiring reconstruction, fasciocutaneous free flaps (anterolateral thigh flap, radial forearm flap) seem to offer superior results to enteric flaps in cases where the cervical esophagus is not involved, leading to better voice rehabilitation with fewer complications and less postoperative morbidity. In salvage situations, the gastroomental free flap has proven to be a valuable tool. In conclusion, the choice of a surgical treatment modality is influenced by the patient's anatomy, tumor size and location, as well as the surgeon's personal expertise. PMID:23320058
CT, MRI and PET imaging in peritoneal malignancy
Sahdev, Anju; Reznek, Rodney H.
2011-01-01
Abstract Imaging plays a vital role in the evaluation of patients with suspected or proven peritoneal malignancy. Nevertheless, despite significant advances in imaging technology and protocols, assessment of peritoneal pathology remains challenging. The combination of complex peritoneal anatomy, an extensive surface area that may host tumour deposits and the considerable overlap of imaging appearances of various peritoneal diseases often makes interpretation difficult. Contrast-enhanced multidetector computed tomography (MDCT) remains the most versatile tool in the imaging of peritoneal malignancy. However, conventional and emerging magnetic resonance imaging (MRI) and positron emission tomography (PET)/CT techniques offer significant advantages over MDCT in detection and surveillance. This article reviews established and new techniques in CT, MRI and PET imaging in both primary and secondary peritoneal malignancies and provides an overview of peritoneal anatomy, function and modes of disease dissemination with illustration of common sites and imaging features of peritoneal malignancy. PMID:21865109
Development of a micromachined epiretinal vision prosthesis
NASA Astrophysics Data System (ADS)
Stieglitz, Thomas
2009-12-01
Microsystems engineering offers the tools to develop highly sophisticated miniaturized implants to interface with the nervous system. One challenging application field is the development of neural prostheses to restore vision in persons that have become blind by photoreceptor degeneration due to retinitis pigmentosa. The fundamental work that has been done in one approach is presented here. An epiretinal vision prosthesis has been developed that allows hybrid integration of electronics on one part of a thin and flexible substrate. Polyimide as a substrate material is proven to be non-cytotoxic. Non-hermetic encapsulation with parylene C was stable for at least 3 months in vivo. Chronic animal experiments proved spatially selective cortical activation after epiretinal stimulation with a 25-channel implant. Research results have been transferred successfully to companies that currently work on the medical device approval of these retinal vision prostheses in Europe and in the USA.
Inflatable Structures Technology Handbook. Chapter 21; Inflatable Habitats
NASA Technical Reports Server (NTRS)
Kennedy, Kriss J.; Raboin, Jasen; Spexarth, Gary; Valle, Gerard
2000-01-01
The technologies required to design, fabricate, and utilize an inflatable module for space applications have been demonstrated and proven by the TransHab team during the development phase of the program. Through testing and hands-on development, several issues about inflatable space structures have been addressed, such as: ease of manufacturing, structural integrity, micrometeorite protection, folding, and vacuum deployment. The TransHab inflatable technology development program has proven that not only are inflatable structures a viable option, but they also offer significant advantages over conventional metallic structures.
Provenance Usage in the OceanLink Project
NASA Astrophysics Data System (ADS)
Narock, T.; Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Cheatham, M.; Fils, D.; Finin, T.; Hitzler, P.; Janowicz, K.; Jones, M.; Krisnadhi, A.; Lehnert, K. A.; Mickle, A.; Raymond, L. M.; Schildhauer, M.; Shepherd, A.; Wiebe, P. H.
2014-12-01
A wide spectrum of maturing methods and tools, collectively characterized as the Semantic Web, is helping to vastly improve the dissemination of scientific research. The OceanLink project, an NSF EarthCube Building Block, is utilizing semantic technologies to integrate geoscience data repositories, library holdings, conference abstracts, and funded research awards. Provenance is a vital component in meeting both the scientific and engineering requirements of OceanLink. Provenance plays a key role in justification and understanding when presenting users with results aggregated from multiple sources. In the engineering sense, provenance enables the identification of new data and the ability to determine which data sources to query. Additionally, OceanLink will leverage human and machine computation for crowdsourcing, text mining, and co-reference resolution. The results of these computations, and their associated provenance, will be folded back into the constituent systems to continually enhance precision and utility. We will touch on the various roles provenance is playing in OceanLink as well as present our use of the PROV Ontology and associated Ontology Design Patterns.
S-ProvFlow: provenance model and tools for scalable and adaptive analysis pipelines in geoscience.
NASA Astrophysics Data System (ADS)
Spinuso, A.; Mihajlovski, A.; Atkinson, M.; Filgueira, R.; Klampanos, I.; Sanchez, S.
2017-12-01
The reproducibility of scientific findings is essential to improve the quality and application of modern data-driven research. Delivering such reproducibility is challenging in the context of systems handling large data-streams with sophisticated computational methods. Similarly, the SKA (Square Kilometer Array) will collect an unprecedented volume of radio-wave signals that will have to be reduced and transformed into derived products, with impact on space-weather research. This highlights the importance of having cross-discipline mechanisms on the producer's side that rely on usable lineage data to support validation and traceability of the new artifacts. To be informative, provenance has to describe each method's abstractions and their implementation as mappings onto distributed platforms and their concurrent execution, capturing relevant internal dependencies at runtime. Producers and intelligent toolsets should be able to exploit the produced provenance, steering real-time monitoring activities and inferring adaptations of methods at runtime. We present a model of provenance (S-PROV) that extends W3C PROV and ProvONE, broadening coverage of provenance to aspects related to distribution, scale-up and steering of stateful streaming operators in analytic pipelines. This is supported by a technical framework for tuneable and actionable lineage, ensuring its relevance to the users' interests and fostering its rapid exploitation to facilitate research practices. By applying concepts such as provenance typing and profiling, users define rules to capture common provenance patterns and activate selective controls based on domain metadata. The traces are recorded in a document store with index optimisation, and a web API serves advanced interactive tools (S-ProvFlow, https://github.com/KNMI/s-provenance). These allow different classes of consumers to rapidly explore the provenance data. The system, which contributes to the SKA-Link initiative through technology and knowledge transfer events, will be discussed in the context of an existing data-intensive service for seismology (VERCE) and the newly funded project DARE (Delivering Agile Research Excellence), a generic solution for extreme data and methods in geosciences that domain experts can understand, change and use effectively.
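As a purely hypothetical sketch of querying such a document store for the lineage of one data product, the snippet below uses pymongo; the collection name, document fields, and identifier are assumptions for illustration, not the actual s-provenance schema.

```python
# Hypothetical sketch: retrieving every recorded invocation that read or wrote
# a given data granule from a MongoDB-backed provenance store.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
lineage = client["provenance"]["lineage"]

product_id = "example:waveform/NL.HGN..BHZ/2017-08-01"  # illustrative identifier
trace = lineage.find(
    {"$or": [{"inputs.id": product_id}, {"outputs.id": product_id}]}
).sort("startTime", 1)

for invocation in trace:
    print(invocation.get("component"), invocation.get("startTime"))
```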
In-vivo immunofluorescence confocal microscopy of herpes simplex virus type 1 keratitis
NASA Astrophysics Data System (ADS)
Kaufman, Stephen C.; Laird, Jeffery A.; Beuerman, Roger W.
1996-05-01
The white-light confocal microscope offers an in vivo, cellular-level resolution view of the cornea. This instrument has proven to be a valuable research and diagnostic tool for the study of infectious keratitis. In this study, we investigate the direct visualization of herpes simplex virus type 1 (HSV-1)-infected corneal epithelium, with in vivo confocal microscopy, using HSV-1 immunofluorescent antibodies. New Zealand white rabbits were infected with McKrae strain of HSV-1 in one eye; the other eye of each rabbit was used as an uninfected control. Four days later, the rabbits were anesthetized and a cellulose sponge was applied to each cornea, and a drop of direct HSV fluorescein-tagged antibody was placed on each sponge every 3 to 5 minutes for 1 hour. Fluorescence confocal microscopy was then performed. The HSV-infected corneas showed broad regions of hyperfluorescent epithelial cells. The uninfected corneas revealed no background fluorescence. Thus, using the confocal microscope with a fluorescent cube, we were able to visualize HSV-infected corneal epithelial cells tagged with a direct fluorescent antibody. This process may prove to be a useful clinical tool for the in vivo diagnosis of HSV keratitis.
Longo, Edoardo; Hussain, Rohanah; Siligardi, Giuliano
2015-03-01
Synchrotron radiation circular dichroism (SRCD) is a powerful tool for photo-stability assessment of proteins. Recently our research has focused on applying SRCD to develop screening methodologies for accelerated photo-stability assessment of monoclonal antibody formulations. Although the approach has proven reliable and applicable within a wide range of salt- and excipient-containing solutions, the presence of strongly absorbing far-UV (<260 nm) species (e.g., sodium chloride, histidine, arginine) in common formulations completely prevents the analysis. Herein, we propose a new method based on CD coupled with magnetic CD (MCD) to address the problem and offer an additional versatile tool for monitoring photo-stability. This is done by assessing the stability of the samples by looking at the near-UV band, as well as giving insights into the denaturation mechanism. We applied this method to four mAb formulations and correlated the results with dynamic light scattering data. Finally, we applied MCD to ligand interaction with key proteins such as lysozyme, comparing the human with the hen enzyme in the binding of N,N',N''-triacetylchitotriose. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
A CROSS-SPECIES APPROACH TO USING GENOMICS TOOLS IN AQUATIC TOXICOLOGY
Microarray technology has proven to be a useful tool for analyzing the transcriptome of various organisms representing conditions such as disease states, developmental stages, and responses to chemical exposure. Most commercially available arrays are limited to organisms that ha...
Towards a Unified Architecture for Data-Intensive Seismology in VERCE
NASA Astrophysics Data System (ADS)
Klampanos, I.; Spinuso, A.; Trani, L.; Krause, A.; Garcia, C. R.; Atkinson, M.
2013-12-01
Modern seismology involves managing, storing and processing large datasets, typically geographically distributed across organisations. Performing computational experiments using these data generates more data, which in turn have to be managed, further analysed and frequently be made available within or outside the scientific community. As part of the EU-funded project VERCE (http://verce.eu), we research and develop a number of use-cases, interfacing technologies to satisfy the data-intensive requirements of modern seismology. Our solution seeks to support: (1) familiar programming environments to develop and execute experiments, in particular via Python/ObsPy, (2) a unified view of heterogeneous computing resources, public or private, through the adoption of workflows, (3) monitoring the experiments and validating the data products at varying granularities, via a comprehensive provenance system, (4) reproducibility of experiments and consistency in collaboration, via a shared registry of processing units and contextual metadata (computing resources, data, etc.) Here, we provide a brief account of these components and their roles in the proposed architecture. Our design integrates heterogeneous distributed systems, while allowing researchers to retain current practices and control data handling and execution via higher-level abstractions. At the core of our solution lies the workflow language Dispel. While Dispel can be used to express workflows at fine detail, it may also be used as part of meta- or job-submission workflows. User interaction can be provided through a visual editor or through custom applications on top of parameterisable workflows, which is the approach VERCE follows. According to our design, the scientist may use versions of Dispel/workflow processing elements offered by the VERCE library or override them introducing custom scientific code, using ObsPy. This approach has the advantage that, while the scientist uses a familiar tool, the resulting workflow can be executed on a number of underlying stream-processing engines, such as STORM or OGSA-DAI, transparently. While making efficient use of arbitrarily distributed resources and large data-sets is of priority, such processing requires adequate provenance tracking and monitoring. Hiding computation and orchestration details via a workflow system, allows us to embed provenance harvesting where appropriate without impeding the user's regular working patterns. Our provenance model is based on the W3C PROV standard and can provide information of varying granularity regarding execution, systems and data consumption/production. A video demonstrating a prototype provenance exploration tool can be found at http://bit.ly/15t0Fz0. Keeping experimental methodology and results open and accessible, as well as encouraging reproducibility and collaboration, is of central importance to modern science. As our users are expected to be based at different geographical locations, to have access to different computing resources and to employ customised scientific codes, the use of a shared registry of workflow components, implementations, data and computing resources is critical.
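A simplified stand-in for the processing-element idea described above, using plain Python generators and ObsPy rather than Dispel or a stream-processing engine; the processing step and its parameters are illustrative.

```python
# Sketch: the scientist supplies ObsPy code for one streaming step, and the
# surrounding engine (simplified here to generator chaining) moves traces
# through it one at a time.
import obspy

def detrend_and_filter(traces):
    """A 'processing element' body written with familiar ObsPy calls."""
    for tr in traces:
        tr = tr.copy()
        tr.detrend("linear")
        tr.filter("bandpass", freqmin=0.01, freqmax=0.1)
        yield tr

def source():
    # ObsPy's bundled example stream stands in for data staged by the engine.
    for tr in obspy.read():
        yield tr

for trace in detrend_and_filter(source()):
    print(trace.id, trace.stats.npts, "samples after filtering")
```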
Kirkman, Matthew A; Muirhead, William; Nandi, Dipankar; Sevdalis, Nick
2014-01-01
Neurosurgical simulation training is becoming increasingly popular. Attitudes toward simulation among residents can contribute to the effectiveness of simulation training, but such attitudes remain poorly explored in neurosurgery with no psychometrically proven measure in the literature. The aim of the present study was to evaluate prospectively a newly developed tool for this purpose: the Neurosurgical Evaluation of Attitudes towards simulation Training (NEAT). The NEAT tool was prospectively developed in 2 stages and psychometrically evaluated (validity and reliability) in 2 administrations with the same participants. The tool comprises a questionnaire with 9 Likert scale items and 2 free-text sections assessing attitudes toward simulation in neurosurgery. The evaluation was completed with 31 neurosurgery residents in London, United Kingdom, who were generally favorable toward neurosurgical simulation. The internal consistency of the questionnaire was high, as demonstrated by the overall Cronbach α values (α=0.899 and α=0.955). All but 2 questionnaire items had "substantial" or "almost perfect" test-retest reliability following repeated survey administrations (median Pearson r correlation=0.688; range, 0.248-0.841). NEAT items were well correlated with each other on both occasions, showing good validity of content within the NEAT tool. There was no significant relationship between either gender or length of neurosurgical experience and item ratings. NEAT is the first psychometrically evaluated tool for evaluating attitudes toward simulation in neurosurgery. Further implementation of NEAT is required in wider neurosurgical populations to establish whether specific population groups differ. Use of NEAT in studies of neurosurgical simulation could offer an additional outcome measure to performance metrics, permitting evaluation of the impact of neurosurgical simulation on attitudes toward simulation both between participants and within the same participants over time. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Mihajlovski, A.; Spinuso, A.; Plieger, M.; Som de Cerff, W.
2016-12-01
Modern climate analysis platforms provide generic and standardized ways of accessing data and processing services, typically supported by a wide range of OGC formats and interfaces. However, the problem of instrumentally tracing the lineage of the transformations occurring on a dataset, and its provenance, remains an open challenge. It requires standards-driven and interoperable solutions to facilitate understanding and sharing of self-describing data products, fostering collaboration among peers. The CLIPC portal provided us with a real use case in which the need for instrumented provenance management is fundamental. CLIPC provides a single point of access for scientific information on climate change. The data about the physical environment that is used to inform climate change policy and adaptation measures comes from several categories: satellite measurements, terrestrial observing systems, model projections and simulations, and re-analyses. This is made possible through the Copernicus Earth Observation Programme for Europe. With a backbone combining WPS and OPeNDAP services, CLIPC has two themes: (1) harmonized access to climate datasets derived from models, observations and re-analyses, and (2) a climate impact tool kit to evaluate, rank and aggregate indicators. The climate impact tool kit is realised through the orchestration of a number of WPS that ingest, normalize and combine NetCDF files. The WPS allowing this specific computation are hosted by the climate4impact portal, which is a more generic climate data-access and processing service. In this context, guaranteeing validation and reproducibility of results is a clearly stated requirement to improve the quality of the results obtained by the combined analysis. Two core contributions are the enabling of a provenance wrapper around WPS services and the enabling of provenance tracing within the NetCDF format, which adopts and extends the W3C PROV model. To disseminate indicator data and create transformed data products, a standardized provenance, metadata and processing infrastructure is being researched for CLIPC. These efforts will lead towards the provision of tools for further web service processing development and optimisation, opening up possibilities to scale and administer abstract, user- and data-driven workflows.
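A minimal sketch of the second contribution named above, provenance tracing carried inside a NetCDF file, assuming the Python prov package (a W3C PROV implementation) and netCDF4; the attribute name, identifiers and URIs are illustrative, not CLIPC's actual schema.

```python
# A minimal sketch: build a tiny PROV document for one derived indicator and
# embed its JSON serialization as a global attribute of the NetCDF output.
import numpy as np
from netCDF4 import Dataset
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace("ex", "http://example.org/clipc-like/")  # illustrative namespace
src = doc.entity("ex:input-netcdf")
act = doc.activity("ex:wps-normalise")
out = doc.entity("ex:indicator-netcdf")
doc.used(act, src)
doc.wasGeneratedBy(out, act)
doc.wasDerivedFrom(out, src)

with Dataset("indicator.nc", "w") as nc:
    nc.createDimension("x", 4)
    var = nc.createVariable("indicator", "f4", ("x",))
    var[:] = np.array([0.1, 0.4, 0.2, 0.9], dtype="f4")
    # The provenance trace travels with the data product itself.
    nc.setncattr("prov_json", doc.serialize(format="json"))
```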
Ways to Prevent Percussion Overuse Injuries
ERIC Educational Resources Information Center
Fidyk, Steve
2009-01-01
It is a proven fact that the repetitive nature of percussion playing can cause carpal tunnel syndrome, bursitis, and tendinitis. This paper offers ways to prevent percussion overuse injuries, particularly by developing a healthy warmup routine.
BioQ: tracing experimental origins in public genomic databases using a novel data provenance model
Saccone, Scott F.; Quan, Jiaxi; Jones, Peter L.
2012-01-01
Motivation: Public genomic databases, which are often used to guide genetic studies of human disease, are now being applied to genomic medicine through in silico integrative genomics. These databases, however, often lack tools for systematically determining the experimental origins of the data. Results: We introduce a new data provenance model that we have implemented in a public web application, BioQ, for assessing the reliability of the data by systematically tracing its experimental origins to the original subjects and biologics. BioQ allows investigators both to visualize data provenance and to explore individual elements of experimental process flow using precise tools for detailed data exploration and documentation. It includes a number of human genetic variation databases such as the HapMap and 1000 Genomes projects. Availability and implementation: BioQ is freely available to the public at http://bioq.saclab.net Contact: ssaccone@wustl.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:22426342
Genome editing in livestock: Are we ready for a revolution in animal breeding industry?
Ruan, Jinxue; Xu, Jie; Chen-Tsai, Ruby Yanru; Li, Kui
2017-12-01
Genome editing is a powerful technology that can efficiently alter the genome of organisms to achieve targeted modification of endogenous genes and targeted integration of exogenous genes. Current genome-editing tools mainly include ZFN, TALEN and CRISPR/Cas9, which have been successfully applied to all species tested including zebrafish, humans, mice, rats, monkeys, pigs, cattle, sheep, goats and others. The application of genome editing has quickly swept through the entire biomedical field, including livestock breeding. Traditional livestock breeding is associated with rate-limiting issues such as long breeding cycles and limited genetic resources. Genome editing tools offer solutions to these problems at affordable costs. Generation of gene-edited livestock with improved traits has proven feasible and valuable. For example, the CD163 gene-edited pig is resistant to porcine reproductive and respiratory syndrome (PRRS, also referred to as "blue ear disease"), and an SP110 gene knock-in cow is less susceptible to tuberculosis. Given the high efficiency and low cost of genome editing tools, particularly CRISPR/Cas9, it is foreseeable that a significant number of genome-edited livestock animals will be produced in the near future; hence it is imperative to comprehensively evaluate the pros and cons they will bring to the livestock breeding industry. Only with these considerations in mind will we be able to take full advantage of the genome editing era in livestock breeding.
Using Art to Assess Environmental Education Outcomes
ERIC Educational Resources Information Center
Flowers, Ami A.; Carroll, John P.; Green, Gary T.; Larson, Lincoln R.
2015-01-01
Construction of developmentally appropriate tools for assessing the environmental attitudes and awareness of young learners has proven to be challenging. Art-based assessments that encourage creativity and accommodate different modes of expression may be a particularly useful complement to conventional tools (e.g. surveys), but their efficacy and…
Méndez, Verónica; Wood, Jamie R; Butler, Simon J
2018-05-01
Functional diversity metrics are increasingly used to augment or replace taxonomic diversity metrics to deliver more mechanistic insights into community structure and function. Metrics used to describe landscape structure and characteristics share many of the same limitations as taxonomy-based metrics, particularly their reliance on anthropogenically defined typologies with little consideration of structure, management, or function. However, the development of alternative metrics to describe landscape characteristics has been limited. Here, we extend the functional diversity framework to characterize landscapes based on the diversity of resources available across habitats present. We then examine the influence of resource diversity and provenance on the functional diversities of native and exotic avian communities in New Zealand. Invasive species are increasingly prevalent and considered a global threat to ecosystem function, but the characteristics of and interactions between sympatric native and exotic communities remain unresolved. Understanding their comparative responses to environmental change and the mechanisms underpinning them is of growing importance in predicting community dynamics and changing ecosystem function. We use (i) matrices of resource use (species) and resource availability (habitats) and (ii) occurrence data for 62 native and 25 exotic species and 19 native and 13 exotic habitats in 2015 10 × 10 km quadrats to examine the relationship between native and exotic avian and landscape functional diversity. The numbers of species in, and functional diversities of, native and exotic communities were positively related. Each community displayed evidence of environmental filtering, but it was significantly stronger for exotic species. Less environmental filtering occurred in landscapes providing a more diverse combination of resources, with resource provenance also an influential factor. Landscape functional diversity explained a greater proportion of variance in native and exotic community characteristics than the number of habitat types present. Resource diversity and provenance should be explicitly accounted for when characterizing landscape structure and change as they offer additional mechanistic understanding of the links between environmental filtering and community structure. Manipulating resource diversity through the design and implementation of management actions could prove a powerful tool for the delivery of conservation objectives, be they to protect native species, control exotic species, or maintain ecosystem service provision.
Software offers transparent, straightforward assessment of pavement additives : research spotlight.
DOT National Transportation Integrated Search
2015-04-01
Adding new materials to pavement layers is a proven technique to improve performance. Many types of additives, from engineered polymers and acids to recycled pavement, crumb rubber, shingles and glass, have been used to help construct better ...
Scale models: A proven cost-effective tool for outage planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, R.; Segroves, R.
1995-03-01
As generation costs for operating nuclear stations have risen, more nuclear utilities have initiated efforts to improve cost effectiveness. Nuclear plant owners are also being challenged with lower radiation exposure limits and newly revised radiation protection regulations (10 CFR 20), which place further stress on their budgets. As source term reduction activities continue to lower radiation fields, reducing the amount of time spent in radiation fields becomes one of the most cost-effective ways of reducing radiation exposure. An effective approach for minimizing time spent in radiation areas is to use a physical scale model for worker orientation planning and monitoring maintenance, modifications, and outage activities. To meet the challenge of continued reduction in annual cumulative radiation exposures, new cost-effective tools are required. One field-tested and proven tool is the physical scale model.
75 FR 27552 - Guidance for Federal Land Management in the Chesapeake Bay Watershed
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-17
... effective tools and practices available to reduce water pollution from a variety of nonpoint sources... describe "proven cost-effective tools and practices that reduce water pollution" that are appropriate to...: Katie Flahive, USEPA, Office of Water, Office of Wetlands, Oceans and Watersheds, 1200 Pennsylvania Ave...
Provenance in Data Interoperability for Multi-Sensor Intercomparison
NASA Technical Reports Server (NTRS)
Lynnes, Chris; Leptoukh, Greg; Berrick, Steve; Shen, Suhung; Prados, Ana; Fox, Peter; Yang, Wenli; Min, Min; Holloway, Dan; Enloe, Yonsook
2008-01-01
As our inventory of Earth science data sets grows, the ability to compare, merge and fuse multiple datasets grows in importance. This requires a deeper data interoperability than we have now. Efforts such as the Open Geospatial Consortium and OPeNDAP (Open-source Project for a Network Data Access Protocol) have broken down format barriers to interoperability; the next challenge is the semantic aspects of the data. Consider the issues when satellite data are merged, cross-calibrated, validated, inter-compared and fused. We must match up data sets that are related, yet different in significant ways: the phenomenon being measured, measurement technique, location in space-time or quality of the measurements. If subtle distinctions between similar measurements are not clear to the user, results can be meaningless or lead to an incorrect interpretation of the data. Most of these distinctions trace to how the data came to be: sensors, processing and quality assessment. For example, monthly averages of satellite-based aerosol measurements often show significant discrepancies, which might be due to differences in spatio-temporal aggregation, sampling issues, sensor biases, algorithm differences or calibration issues. Provenance information must be captured in a semantic framework that allows data inter-use tools to incorporate it and aid in the generation of comparison or merged products. Semantic web technology allows us to encode our knowledge of measurement characteristics, phenomena measured, space-time representation, and data quality attributes in a well-structured, machine-readable ontology and rulesets. An analysis tool can use this knowledge to show users the provenance-related distinctions between two variables, advising on options for further data processing and analysis. An additional problem for workflows distributed across heterogeneous systems is retrieval and transport of provenance. Provenance may be either embedded within the data payload, or transmitted from server to client in an out-of-band mechanism. The out-of-band mechanism is more flexible in the richness of provenance information that can be accommodated, but it relies on a persistent framework and can be difficult for legacy clients to use. We are prototyping the embedded model, incorporating provenance within metadata objects in the data payload. Thus, it always remains with the data. The downside is a limit to the size of provenance metadata that we can include, an issue that will eventually need resolution to encompass the richness of provenance information required for data intercomparison and merging.
Space exploration initiative (SEI) logistics support lessons from the DoD
NASA Astrophysics Data System (ADS)
Cox, John R.; McCoy, Walbert G.; Jenkins, Terence
Proven and innovative logistics management approaches and techniques used for developing and supporting DoD and Strategic Defense Initiative Office (SDIO) systems are described on the basis of input from DoD to the SEI Synthesis Group; SDIO-developed logistics initiatives, innovative tools, and methodologies; and logistics planning support provided to the NASA/Johnson Planet Surface System Office. The approach is tailored for lunar/Martian surface operations, and provides guidelines for the development and management of a crucial element of the SEI logistics support program. A case study is presented which shows how incorporation of DoD's proven and innovative logistics management approach, tools, and techniques can substantially benefit early logistics planning for SEI, while also implementing many of DoD's recommendations for SEI.
Helping Veterans with Disabilities Transition to Employment
ERIC Educational Resources Information Center
Ruh, Debra; Spicer, Paul; Vaughan, Kathleen
2009-01-01
Veterans with disabilities constitute a vast, capable, deserving, and under-utilized workforce, and many successful hiring campaigns have targeted the employment of veterans. Colleges offering comprehensive, individualized transitional services have proven successful in supporting veterans with disabilities reentering the civilian workforce. With…
A postprofessional distance-education program in neurodiagnostics and sleep science.
Overton, Auburne
2014-01-01
Sleep medicine is a quickly growing field of allied health and preventive medicine. The University of North Carolina has proven innovative and timely in offering a neurodiagnostics and sleep science bachelor's degree program for the sleep medicine profession.
Rider, Lisa G.; Dankó, Katalin; Miller, Frederick W.
2016-01-01
Purpose of review Clinical registries and biorepositories have proven extremely useful in many studies of diseases, especially rare diseases. Given their rarity and diversity, the idiopathic inflammatory myopathies, or myositis syndromes, have benefited from individual researchers’ collections of cohorts of patients. Major efforts are being made to establish large registries and biorepositories that will allow many additional studies to be performed that were not possible before. Here we describe the registries developed by investigators and patient support groups that are currently available for collaborative research purposes. Recent findings We have identified 46 myositis research registries, including many with biorepositories, which have been developed for a wide variety of purposes and have resulted in great advances in understanding the range of phenotypes, clinical presentations, risk factors, pathogenic mechanisms, outcome assessment, therapeutic responses, and prognoses. These are now available for collaborative use to undertake additional studies. Two myositis patient registries have been developed for research, and myositis patient support groups maintain demographic registries with large numbers of patients available to be contacted for potential research participation. Summary Investigator-initiated myositis research registries and biorepositories have proven extremely useful in understanding many aspects of these rare and diverse autoimmune diseases. These registries and biorepositories, in addition to those developed by myositis patient support groups, deserve continued support to maintain the momentum in this field as they offer major opportunities to improve understanding of the pathogenesis and treatment of these diseases in cost-effective ways. PMID:25225838
Rider, Lisa G; Dankó, Katalin; Miller, Frederick W
2014-11-01
Clinical registries and biorepositories have proven extremely useful in many studies of diseases, especially rare diseases. Given their rarity and diversity, the idiopathic inflammatory myopathies, or myositis syndromes, have benefited from individual researchers' collections of cohorts of patients. Major efforts are being made to establish large registries and biorepositories that will allow many additional studies to be performed that were not possible before. Here, we describe the registries developed by investigators and patient support groups that are currently available for collaborative research purposes. We have identified 46 myositis research registries, including many with biorepositories, which have been developed for a wide variety of purposes and have resulted in great advances in understanding the range of phenotypes, clinical presentations, risk factors, pathogenic mechanisms, outcome assessment, therapeutic responses, and prognoses. These are now available for collaborative use to undertake additional studies. Two myositis patient registries have been developed for research, and myositis patient support groups maintain demographic registries with large numbers of patients available to be contacted for potential research participation. Investigator-initiated myositis research registries and biorepositories have proven extremely useful in understanding many aspects of these rare and diverse autoimmune diseases. These registries and biorepositories, in addition to those developed by myositis patient support groups, deserve continued support to maintain the momentum in this field as they offer major opportunities to improve understanding of the pathogenesis and treatment of these diseases in cost-effective ways.
Enhancements to the Image Analysis Tool for Core Punch Experiments and Simulations (vs. 2014)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hogden, John Edward; Unal, Cetin
A previous paper (Hogden & Unal, 2012, Image Analysis Tool for Core Punch Experiments and Simulations) described an image processing computer program developed at Los Alamos National Laboratory. This program has proven useful, so development has continued. In this paper we describe enhancements to the program as of 2014.
RTI Success: Proven Tools and Strategies for Schools and Classrooms
ERIC Educational Resources Information Center
Whitten, Elizabeth; Esteves, Kelli J.; Woodrow, Alice
2009-01-01
What is Response to Intervention (RTI) and how can it benefit your school? Find out in "RTI Success", an all-in-one resource that provides information on this innovative model as well as step-by-step administrator guidelines and practical teacher tools for implementation. Despite ongoing federal initiatives meant to increase the profile…
ERIC Educational Resources Information Center
Chaudhary, Anil Kumar; Warner, Laura A.
2015-01-01
Most educational programs are designed to produce lower level outcomes, and Extension educators are challenged to produce behavior change in target audiences. Social norms are a very powerful proven tool for encouraging sustainable behavior change among Extension's target audiences. Minor modifications to program content to demonstrate the…
The parser generator as a general purpose tool
NASA Technical Reports Server (NTRS)
Noonan, R. E.; Collins, W. R.
1985-01-01
The parser generator has proven to be an extremely useful, general purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Some of the application areas for which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.
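For a modern flavour of the same idea, the sketch below uses the Python lark package as a stand-in parser generator (not the tool described in the paper): the author supplies only a grammar, with no knowledge of parsing theory, and receives a working parser. The arithmetic grammar is purely illustrative.

```python
# A minimal sketch, assuming the lark package; the grammar is an illustrative
# example of what a user with grammar knowledge alone can write.
from lark import Lark

GRAMMAR = r"""
    ?expr: expr "+" term     -> add
         | expr "-" term     -> sub
         | term
    ?term: term "*" factor   -> mul
         | factor
    ?factor: NUMBER          -> number
           | "(" expr ")"
    %import common.NUMBER
    %import common.WS
    %ignore WS
"""

parser = Lark(GRAMMAR, start="expr")   # the "generated" parser
print(parser.parse("2 * (3 + 4)").pretty())
```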
NASA Astrophysics Data System (ADS)
Hakulinen, T.; Klein, J.
2016-03-01
Two-photon (2P) microscopy based on tunable Ti:sapphire lasers has become a widespread tool for 3D imaging with sub-cellular resolution in living tissues. In recent years multi-photon microscopy with simpler fixed-wavelength femtosecond oscillators using Yb-doped tungstenates as gain material has raised increasing interest in life-sciences, because these lasers offer one order of magnitude more average power than Ti:sapphire lasers in the wavelength range around 1040 nm: Two-photon (2P) excitation of mainly red or yellow fluorescent dyes and proteins (e.g. YFP, mFruit series) simultaneously has been proven with a single IR laser wavelength. A new approach is to extend the usability of existing tunable Titanium sapphire lasers by adding a fixed IR wavelength with an Yb femtosecond oscillator. By that means a multitude of applications for multimodal imaging and optogenetics can be supported. Furthermore fs Yb-lasers are available with a repetition rate of typically 10 MHz and an average power of typically 5 W resulting in pulse energy of typically 500 nJ, which is comparably high for fs-oscillators. This makes them an ideal tool for two-photon spinning disk laser scanning microscopy and holographic patterning for simultaneous photoactivation of large cell populations. With this work we demonstrate that economical, small-footprint Yb fixed-wavelength lasers can present an interesting add-on to tunable lasers that are commonly used in multiphoton microscopy. The Yb fs-lasers hereby offer higher power for imaging of red fluorescent dyes and proteins, are ideally enhancing existing Ti:sapphire lasers with more power in the IR, and are supporting pulse energy and power hungry applications such as spinning disk microscopy and holographic patterning.
Data Provenance as a Tool for Debugging Hydrological Models based on Python
NASA Astrophysics Data System (ADS)
Wombacher, A.; Huq, M.; Wada, Y.; Van Beek, R.
2012-12-01
There is an increase in data volume used in hydrological modeling. The increasing data volume requires additional efforts in debugging models since a single output value is influenced by a multitude of input values. Thus, it is difficult to keep an overview among the data dependencies. Further, knowing these dependencies, it is a tedious job to infer all the relevant data values. The aforementioned data dependencies are also known as data provenance, i.e. the determination of how a particular value has been created and processed. The proposed tool infers the data provenance automatically from a python script and visualizes the dependencies as a graph without executing the script. To debug the model the user specifies the value of interest in space and time. The tool infers all related data values and displays them in the graph. The tool has been evaluated by hydrologists developing a model for estimating the global water demand [1]. The model uses multiple different data sources. The script we analysed has 120 lines of codes and used more than 3000 individual files, each of them representing a raster map of 360*720 cells. After importing the data of the files into a SQLite database, the data consumes around 40 GB of memory. Using the proposed tool a modeler is able to select individual values and infer which values have been used to calculate the value. Especially in cases of outliers or missing values it is a beneficial tool to provide the modeler with efficient information to investigate the unexpected behavior of the model. The proposed tool can be applied to many python scripts and has been tested with other scripts in different contexts. In case a python code contains an unknown function or class the tool requests additional information about the used function or class to enable the inference. This information has to be entered only once and can be shared with colleagues or in the community. Reference [1] Y. Wada, L. P. H. van Beek, D. Viviroli, H. H. Drr, R. Weingartner, and M. F. P. Bierkens, "Global monthly water stress: II. water demand and severity of water," Water Resources Research, vol. 47, 2011.
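A minimal sketch of the underlying idea, assuming only Python's standard ast module plus networkx: variable-level dependencies are read out of a script's syntax tree without executing it, so the provenance of an output can be traced back to its inputs. The real tool described above goes much further (files, rasters, per-cell values), so the script and variable names below are purely illustrative.

```python
# A minimal sketch of static, variable-level provenance extraction.
import ast
import networkx as nx

SCRIPT = """
demand = population * per_capita_use
supply = runoff + groundwater
stress = demand / supply
"""


def dependency_graph(source):
    """Return a DiGraph with an edge input -> output for every assignment."""
    graph = nx.DiGraph()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Assign):
            outputs = [t.id for t in node.targets if isinstance(t, ast.Name)]
            inputs = [n.id for n in ast.walk(node.value) if isinstance(n, ast.Name)]
            for out in outputs:
                for inp in inputs:
                    graph.add_edge(inp, out)
    return graph


g = dependency_graph(SCRIPT)
# All upstream values that influence 'stress' (its provenance at variable level).
print(sorted(nx.ancestors(g, "stress")))
```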
Personalising nutritional guidance for more effective behaviour change.
Celis-Morales, Carlos; Lara, Jose; Mathers, John C
2015-05-01
Improving diet and other lifestyle behaviours has considerable potential for reducing the global burden of non-communicable diseases, promoting better health across the life-course and increasing wellbeing. However, realising this potential will require the development, testing and implementation of much more effective behaviour change interventions than are used conventionally. Evidence-based, personalised (or stratified) interventions which incorporate effective behaviour change techniques (BCT) and which are delivered digitally are likely to be an important route to scalable and sustainable interventions. Progress in developing such interventions will depend on the outcomes of research on: (i) the best bases for personalisation of dietary advice; (ii) identification of BCT which are proven to enhance intervention efficacy; (iii) suitable platforms (digital-based tools) for collection of relevant participant characteristics (e.g. socioeconomic information, current diet and lifestyle and dietary preferences) linked with intelligent systems which use those characteristics to offer tailored feedback and advice in a cost-effective and acceptable manner. Future research should focus on such interventions aiming to reduce health inequalities and to improve overall public health.
AnthropMMD: An R package with a graphical user interface for the mean measure of divergence.
Santos, Frédéric
2018-01-01
The mean measure of divergence is a dissimilarity measure between groups of individuals described by dichotomous variables. It is well suited to datasets with many missing values, and it is generally used to compute distance matrices and represent phenograms. Although often used in biological anthropology and archaeozoology, this method suffers from a lack of implementation in common statistical software. A package for the R statistical software, AnthropMMD, is presented here. Offering a dynamic graphical user interface, it is the first one dedicated to Smith's mean measure of divergence. The package also provides facilities for graphical representations and the crucial step of trait selection, so that the entire analysis can be performed through the graphical user interface. Its use is demonstrated using an artificial dataset, and the impact of trait selection is discussed. Finally, AnthropMMD is compared to three other free tools available for calculating the mean measure of divergence, and is proven to be consistent with them. © 2017 Wiley Periodicals, Inc.
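For readers unfamiliar with the statistic, the following is a minimal Python sketch of one common formulation of Smith's mean measure of divergence between two groups, using the Anscombe angular transformation and the usual small-sample correction; AnthropMMD itself offers further options (alternative transformations, trait selection, distance matrices and phenograms).

```python
# A minimal sketch of one common MMD formulation; counts below are invented.
import numpy as np


def theta(k, n):
    """Anscombe angular transformation of a trait frequency k/n."""
    return np.arcsin(1.0 - 2.0 * (k + 3.0 / 8.0) / (n + 3.0 / 4.0))


def mmd(k_a, n_a, k_b, n_b):
    """Mean measure of divergence between groups A and B over matched traits."""
    k_a, n_a, k_b, n_b = map(np.asarray, (k_a, n_a, k_b, n_b))
    per_trait = (theta(k_a, n_a) - theta(k_b, n_b)) ** 2 \
        - (1.0 / (n_a + 0.5) + 1.0 / (n_b + 0.5))
    return per_trait.mean()


# k = individuals showing each trait, n = individuals scorable for that trait.
print(mmd(k_a=[10, 4, 7], n_a=[30, 28, 25],
          k_b=[3, 12, 20], n_b=[31, 29, 27]))
```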
Wu, Qifang; Xie, Lijuan; Xu, Huirong
2018-06-30
Nuts and dried fruits contain rich nutrients and are thus highly vulnerable to contamination with toxigenic fungi and aflatoxins because of poor weather, processing and storage conditions. Imaging and spectroscopic techniques have proven to be potential alternative tools to wet chemistry methods for efficient and non-destructive determination of contamination with fungi and toxins. Thus, this review provides an overview of the current developments and applications in frequently used food safety testing techniques, including near infrared spectroscopy (NIRS), mid-infrared spectroscopy (MIRS), conventional imaging techniques (colour imaging (CI) and hyperspectral imaging (HSI)), and fluorescence spectroscopy and imaging (FS/FI). Interesting classification and determination results can be found in both static and on/in-line real-time detection for contaminated nuts and dried fruits. Although these techniques offer many benefits over conventional methods, challenges remain in terms of heterogeneous distribution of toxins, background constituent interference, model robustness, detection limits, sorting efficiency, as well as instrument development. Copyright © 2018 Elsevier Ltd. All rights reserved.
Matusow, Harlan; Dickman, Samuel L; Rich, Josiah D; Fong, Chunki; Dumont, Dora M; Hardin, Carolyn; Marlowe, Douglas; Rosenblum, Andrew
2013-01-01
Drug treatment courts are an increasingly important tool in reducing the census of those incarcerated for non-violent drug offenses; medication assisted treatment (MAT) is proven to be an effective treatment for opioid addiction. However, little is known about the availability of and barriers to MAT provision for opioid-addicted people under drug court jurisdiction. Using an online survey, we assessed availability, barriers, and need for MAT (especially agonist medication) for opioid addiction in drug courts. Ninety-eight percent reported opioid-addicted participants, and 47% offered agonist medication (56% for all MAT including naltrexone). Barriers included cost and court policy. Responses revealed significant uncertainty, especially among non-MAT providing courts. Political, judicial and administrative opposition appear to affect MAT's inconsistent use and availability in drug court settings. These data suggest that a substantial, targeted educational initiative is needed to increase awareness of the treatment and criminal justice benefits of MAT in the drug courts. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Omran, Adel; Dietrich, Schröder; Abouelmagd, Abdou; Michael, Märker
2016-09-01
Damage caused by flash flood hazards is an increasing phenomenon, especially in arid and semi-arid areas. Thus, the need to evaluate these areas based on their flash flood risk using maps and hydrological models is also becoming more important. For ungauged watersheds a tentative analysis can be carried out based on the geomorphometric characteristics of the terrain. To process regions with larger watersheds, where perhaps hundreds of watersheds have to be delineated, processed and classified, the overall process needs to be automated. GIS packages such as ESRI's ArcGIS offer a number of sophisticated tools that support such analysis. Yet there are still gaps and pitfalls that need to be considered if the tools are combined into a geoprocessing model to automate the complete assessment workflow. These gaps include issues such as (i) assigning stream order according to Strahler theory, (ii) calculating the threshold value for the stream network extraction, and (iii) determining the pour points for each of the nodes of the Strahler-ordered stream network. In this study a completely automated workflow based on ArcGIS Model Builder using standard tools is introduced and discussed. Some additional tools have been implemented to complete the overall workflow; these tools have been programmed using Python and Java in the context of ArcObjects. The workflow has been applied to digital data from the southwestern Sinai Peninsula, Egypt. An optimum threshold value has been selected to optimize the drainage configuration by statistically comparing all of the stream configurations extracted from the DEM with the available reference data from topographic maps. The code has succeeded in estimating the correct ranking of specific stream orders automatically, without additional manual steps. As a result, the code has proven to save time and effort; hence it is considered a very useful tool for processing large catchment basins.
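A minimal sketch of the first gap listed above, assigning Strahler order, assuming a stream network already abstracted to a networkx tree directed downstream; the ArcGIS workflow in the abstract operates on raster-derived networks and adds threshold selection and pour-point extraction on top of this step.

```python
# A minimal sketch of Strahler ordering on a synthetic stream network.
import networkx as nx


def strahler(graph, outlet):
    """Return {node: Strahler order} for a tree directed towards the outlet."""
    order = {}

    def visit(node):
        upstream = list(graph.predecessors(node))
        if not upstream:
            order[node] = 1            # headwater link
            return 1
        child_orders = [visit(u) for u in upstream]
        top = max(child_orders)
        # Order increases only when two or more branches of the top order meet.
        order[node] = top + 1 if child_orders.count(top) >= 2 else top
        return order[node]

    visit(outlet)
    return order


# Tiny synthetic network: four headwater links joining towards the outlet "o".
g = nx.DiGraph([("a", "c"), ("b", "c"), ("d", "e"), ("f", "e"), ("c", "o"), ("e", "o")])
print(strahler(g, "o"))  # headwaters order 1, confluences 2, outlet 3
```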
ProphTools: general prioritization tools for heterogeneous biological networks.
Navarro, Carmen; Martínez, Victor; Blanco, Armando; Cano, Carlos
2017-12-01
Networks have been proven effective representations for the analysis of biological data. As such, there exist multiple methods to extract knowledge from biological networks. However, these approaches usually limit their scope to a single biological entity type of interest or they lack the flexibility to analyze user-defined data. We developed ProphTools, a flexible open-source command-line tool that performs prioritization on a heterogeneous network. ProphTools prioritization combines a Flow Propagation algorithm similar to a Random Walk with Restarts and a weighted propagation method. A flexible model for the representation of a heterogeneous network allows the user to define a prioritization problem involving an arbitrary number of entity types and their interconnections. Furthermore, ProphTools provides functionality to perform cross-validation tests, allowing users to select the best network configuration for a given problem. ProphTools core prioritization methodology has already been proven effective in gene-disease prioritization and drug repositioning. Here we make ProphTools available to the scientific community as flexible, open-source software and perform a new proof-of-concept case study on long noncoding RNAs (lncRNAs) to disease prioritization. ProphTools is robust prioritization software that provides the flexibility not present in other state-of-the-art network analysis approaches, enabling researchers to perform prioritization tasks on any user-defined heterogeneous network. Furthermore, the application to lncRNA-disease prioritization shows that ProphTools can reach the performance levels of ad hoc prioritization tools without losing its generality. © The Authors 2017. Published by Oxford University Press.
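A minimal numpy sketch of a Random Walk with Restarts on a single homogeneous network, the propagation step that ProphTools generalises to heterogeneous networks; the adjacency matrix, seed vector and restart probability below are illustrative.

```python
# A minimal sketch of Random Walk with Restarts for node prioritization.
import numpy as np


def random_walk_with_restart(adj, seeds, restart=0.3, tol=1e-8, max_iter=1000):
    """Steady-state visiting probabilities starting from the seed nodes."""
    w = adj / adj.sum(axis=0, keepdims=True)   # column-normalised transitions
    p0 = seeds / seeds.sum()
    p = p0.copy()
    for _ in range(max_iter):
        p_next = (1.0 - restart) * w @ p + restart * p0
        if np.abs(p_next - p).sum() < tol:
            return p_next
        p = p_next
    return p


adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
seeds = np.array([1.0, 0.0, 0.0, 0.0])   # prioritise nodes relative to node 0
print(random_walk_with_restart(adj, seeds))
```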
Jones, Brendon R; Brouwers, Luke B; Van Tonder, Warren D; Dippenaar, Matthys A
2017-05-01
The vadose zone typically comprises soil underlain by fractured rock. Often, surface water and groundwater parameters are readily available, but variably saturated flow through soil and rock are oversimplified or estimated as input for hydrological models. In this paper, a series of geotechnical centrifuge experiments are conducted to contribute to the knowledge gaps in: (i) variably saturated flow and dispersion in soil and (ii) variably saturated flow in discrete vertical and horizontal fractures. Findings from the research show that the hydraulic gradient, and not the hydraulic conductivity, is scaled for seepage flow in the geotechnical centrifuge. Furthermore, geotechnical centrifuge modelling has been proven as a viable experimental tool for the modelling of hydrodynamic dispersion as well as the replication of similar flow mechanisms for unsaturated fracture flow, as previously observed in literature. Despite the imminent challenges of modelling variable saturation in the vadose zone, the geotechnical centrifuge offers a powerful experimental tool to physically model and observe variably saturated flow. This can be used to give valuable insight into mechanisms associated with solid-fluid interaction problems under these conditions. Findings from future research can be used to validate current numerical modelling techniques and address the subsequent influence on aquifer recharge and vulnerability, contaminant transport, waste disposal, dam construction, slope stability and seepage into subsurface excavations.
Remote sensing techniques in monitoring areas affected by forest fire
NASA Astrophysics Data System (ADS)
Karagianni, Aikaterini Ch.; Lazaridou, Maria A.
2017-09-01
Forest fire is a part of nature, playing a key role in shaping ecosystems. However, fire's environmental impacts can be significant: affecting wildlife habitat and timber, human settlements, man-made technical constructions and various networks (road and power networks), and polluting the air with emissions harmful to human health. Furthermore, fire's effect on the landscape may be long-lasting. Monitoring the development of a fire is therefore an important aspect of the management of natural hazards in general. Among the methods used for monitoring, satellite data and remote sensing techniques can prove of particular importance. Satellite remote sensing offers a useful tool for forest fire detection, monitoring, management and damage assessment. Especially for fire scar detection and monitoring, satellite data derived from Landsat 8 can be a useful research tool. This paper includes critical considerations of the above and concerns in particular an example from Greece (Thasos Island). This specific area was hit by fires several times in the past and recently as well (September 2016). Landsat 8 satellite data are used (pre- and post-fire imagery) and digital image processing techniques are applied (enhancement techniques, calculation of various indices) for fire scar detection. Visual interpretation of the example area affected by the fires is also carried out, contributing to the overall study.
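One widely used index for fire scar detection with Landsat 8 is the Normalized Burn Ratio, computed from the NIR (band 5) and SWIR2 (band 7) reflectances; the sketch below uses synthetic arrays in place of real imagery, which would normally be read with a raster library.

```python
# A minimal sketch of NBR and dNBR with synthetic stand-in reflectances.
import numpy as np


def nbr(nir, swir2):
    """Normalized Burn Ratio: (NIR - SWIR2) / (NIR + SWIR2)."""
    nir = nir.astype(float)
    swir2 = swir2.astype(float)
    return (nir - swir2) / (nir + swir2 + 1e-10)


# Synthetic patches: healthy vegetation pre-fire, burn scar post-fire.
pre_nir, pre_swir2 = np.full((2, 2), 0.45), np.full((2, 2), 0.15)
post_nir, post_swir2 = np.full((2, 2), 0.20), np.full((2, 2), 0.30)

dnbr = nbr(pre_nir, pre_swir2) - nbr(post_nir, post_swir2)
print(dnbr)  # high positive dNBR values indicate burned areas
```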
Fluoromodule-based reporter/probes designed for in vivo fluorescence imaging
Zhang, Ming; Chakraborty, Subhasish K.; Sampath, Padma; Rojas, Juan J.; Hou, Weizhou; Saurabh, Saumya; Thorne, Steve H.; Bruchez, Marcel P.; Waggoner, Alan S.
2015-01-01
Optical imaging of whole, living animals has proven to be a powerful tool in multiple areas of preclinical research and has allowed noninvasive monitoring of immune responses, tumor and pathogen growth, and treatment responses in longitudinal studies. However, fluorescence-based studies in animals are challenging because tissue absorbs and autofluoresces strongly in the visible light spectrum. These optical properties drive development and use of fluorescent labels that absorb and emit at longer wavelengths. Here, we present a far-red absorbing fluoromodule–based reporter/probe system and show that this system can be used for imaging in living mice. The probe we developed is a fluorogenic dye called SC1 that is dark in solution but highly fluorescent when bound to its cognate reporter, Mars1. The reporter/probe complex, or fluoromodule, produced peak emission near 730 nm. Mars1 was able to bind a variety of structurally similar probes that differ in color and membrane permeability. We demonstrated that a tool kit of multiple probes can be used to label extracellular and intracellular reporter–tagged receptor pools with 2 colors. Imaging studies may benefit from this far-red excited reporter/probe system, which features tight coupling between probe fluorescence and reporter binding and offers the option of using an expandable family of fluorogenic probes with a single reporter gene. PMID:26348895
Neuroimaging Study Designs, Computational Analyses and Data Provenance Using the LONI Pipeline
Dinov, Ivo; Lozev, Kamen; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Zamanyan, Alen; Chakrapani, Shruthi; Van Horn, John; Parker, D. Stott; Magsipoc, Rico; Leung, Kelvin; Gutman, Boris; Woods, Roger; Toga, Arthur
2010-01-01
Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges—management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu. PMID:20927408
Evolution of computational models in BioModels Database and the Physiome Model Repository.
Scharm, Martin; Gebhardt, Tom; Touré, Vasundra; Bagnacani, Andrea; Salehzadeh-Yazdi, Ali; Wolkenhauer, Olaf; Waltemath, Dagmar
2018-04-12
A useful model is one that is being (re)used. The development of a successful model does not finish with its publication. During reuse, models are being modified, i.e. expanded, corrected, and refined. Even small changes in the encoding of a model can, however, significantly affect its interpretation. Our motivation for the present study is to identify changes in models and make them transparent and traceable. We analysed 13734 models from BioModels Database and the Physiome Model Repository. For each model, we studied the frequencies and types of updates between its first and latest release. To demonstrate the impact of changes, we explored the history of a Repressilator model in BioModels Database. We observed continuous updates in the majority of models. Surprisingly, even the early models are still being modified. We furthermore detected that many updates target annotations, which improves the information one can gain from models. To support the analysis of changes in model repositories we developed MoSt, an online tool for visualisations of changes in models. The scripts used to generate the data and figures for this study are available from GitHub https://github.com/binfalse/BiVeS-StatsGenerator and as a Docker image at https://hub.docker.com/r/binfalse/bives-statsgenerator/ . The website https://most.bio.informatik.uni-rostock.de/ provides interactive access to model versions and their evolutionary statistics. The reuse of models is still impeded by a lack of trust and documentation. A detailed and transparent documentation of all aspects of the model, including its provenance, will improve this situation. Knowledge about a model's provenance can avoid the repetition of mistakes that others already faced. More insights are gained into how the system evolves from initial findings to a profound understanding. We argue that it is the responsibility of the maintainers of model repositories to offer transparent model provenance to their users.
Wilks, Beth; Morgan, Ruth M; Rose, Neil L
2017-09-01
The use of geoforensic analysis in criminal investigations is continuing to develop, with the diversification of analytical techniques, many of which are semi-automated, facilitating prompt analysis of large sample sets at a relatively low cost. Whilst micro-scale geoforensic analysis has been shown to assist criminal investigations including homicide (Concheri et al., 2011 [1]), wildlife crime (Morgan et al., 2006 [2]), illicit drug distribution (Stanley, 1992 [3]), and burglary (Mildenhall, 2006 [4]), its application to the pressing international security threat posed by Improvised Explosive Devices (IEDs) is yet to be considered. This experimental study simulated an IED supply chain from the sourcing of raw materials through to device emplacement. Mineralogy, quartz grain surface texture analysis (QGSTA) and particle size analysis (PSA) were used to assess whether environmental materials were transferred and subsequently persisted on the different components of three pressure plate IEDs. The research also addressed whether these samples were comprised of material from single or multiple geographical provenances that represented supply chain activity nodes. The simulation demonstrated that material derived from multiple activity nodes, was transferred and persisted on device components. The results from the mineralogy and QGSTA illustrated the value these techniques offer for the analysis of mixed provenance samples. The results from the PSA, which produces a bulk signature of the sample, failed to distinguish multiple provenances. The study also considered how the environmental material recovered could be used to generate information regarding the geographical locations the device had been in contact with, in an intelligence style investigation, and demonstrated that geoforensic analysis has the potential to be of value to international counter-IED efforts. It is a tool that may be used to prevent the distribution of large quantities of devices, by aiding the identification of the geographical location of key activity nodes. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Allenspach, K; Vaden, S L; Harris, T S; Gröne, A; Doherr, M G; Griot-Wenk, M E; Bischoff, S C; Gaschen, F
2006-01-01
To evaluate the colonoscopic allergen provocation (COLAP) test as a new tool for the diagnosis of IgE-mediated food allergy. Oral food challenges as well as COLAP testing were performed in a colony of nine research dogs with proven immediate-type food allergic reactions. In addition, COLAP was performed in five healthy dogs. When compared with the oral challenge test, COLAP accurately determined 18 of 23 (73 per cent) positive oral challenge reactions in dogs with food allergies and was negative in the healthy dogs. The accuracy of this new test may be higher than that for gastric sensitivity testing. Therefore, COLAP holds promise as a new test to confirm the diagnosis of suspected IgE-mediated food allergy in dogs.
Accurate Micro-Tool Manufacturing by Iterative Pulsed-Laser Ablation
NASA Astrophysics Data System (ADS)
Warhanek, Maximilian; Mayr, Josef; Dörig, Christian; Wegener, Konrad
2017-12-01
Iterative processing solutions, including multiple cycles of material removal and measurement, are capable of achieving higher geometric accuracy by compensating for most deviations manifesting directly on the workpiece. The remaining error sources are the measurement uncertainty and the repeatability of the material-removal process, including clamping errors. Due to the absence of processing forces, process fluids and wear, pulsed-laser ablation has demonstrated high repeatability and can be realized directly on a measuring machine. This work takes advantage of this possibility by implementing an iterative, laser-based correction process for profile deviations registered directly on an optical measurement machine. In this way, efficient iterative processing is enabled which is precise, applicable to all tool materials including diamond, and which eliminates clamping errors. The concept is proven by a prototypical implementation on an industrial tool measurement machine and a nanosecond fibre laser. A number of measurements are performed on both the machine and the processed workpieces. Results show production deviations within a 2 μm diameter tolerance.
Tool geometry and damage mechanisms influencing CNC turning efficiency of Ti6Al4V
NASA Astrophysics Data System (ADS)
Suresh, Sangeeth; Hamid, Darulihsan Abdul; Yazid, M. Z. A.; Nasuha, Nurdiyanah; Ain, Siti Nurul
2017-12-01
Ti6Al4V, or Grade 5 titanium alloy, is widely used in the aerospace, medical, automotive and fabrication industries due to its distinctive combination of mechanical and physical properties. Ti6Al4V has always been difficult to machine, ironically because of the same mix of properties mentioned earlier. Ti6Al4V machining has resulted in shorter cutting tool life, which has led to objectionable surface integrity and rapid failure of the machined parts. However, the proven functional relevance of this material has prompted extensive research in the optimization of machine parameters and cutting tool characteristics. Cutting tool geometry plays a vital role in ensuring dimensional and geometric accuracy in machined parts. In this study, an experimental investigation is carried out to optimize the nose radius and relief angles of the cutting tools and their interaction with different levels of machining parameters. The low elastic modulus and thermal conductivity of Ti6Al4V contribute to rapid tool damage, and the impact of these properties on tool tip damage is studied. An experimental design approach is utilized in the CNC turning process of Ti6Al4V to statistically analyze and propose optimum levels of input parameters to lengthen tool life and enhance the surface characteristics of the machined parts. A greater tool nose radius with a straight flank, combined with low feed rates, has resulted in desirable surface integrity. The presence of a relief angle has proven to aggravate tool damage and dimensional instability in the CNC turning of Ti6Al4V.
Effective Therapy for College Students. Alternatives to Traditional Counseling.
ERIC Educational Resources Information Center
Hanfmann, Eugenia
Clinical professionals are offered practical guidelines for organizing psychological counseling services that will be acceptable and available to large numbers of students without being exorbitantly expensive. Detailed accounts are given on therapeutic and administrative procedures that have proven highly effective at Brandeis University's…
Need for Social Approval and Drug Use
ERIC Educational Resources Information Center
Scherer, Shawn E.; And Others
1972-01-01
The present results offer a partial explanation of drug use that is consistent with certain therapeutic techniques which have proven successful in the treatment of drug addicts. Approval motivation appears to play a significant role in both the initiation and treatment of illicit drug use. (Author)
Capturing, Harmonizing and Delivering Data and Quality Provenance
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory; Lynnes, Christopher
2011-01-01
Satellite remote sensing data have proven to be vital for various scientific and applications needs. However, the usability of these data depends not only on the data values but also on the ability of data users to assess and understand the quality of these data for various applications and for comparison or inter-usage of data from different sensors and models. In this paper, we describe some aspects of capturing, harmonizing and delivering this information to users in the framework of distributed web-based data tools.
NASA Technical Reports Server (NTRS)
Kahan, A. M. (Principal Investigator)
1975-01-01
The author has identified the following significant results. The LANDSAT data collection system has proven itself to be a valuable tool for control of cloud seeding operations and for verification of weather forecasts. These platforms have proven to be reliable weather resistant units suitable for the collection of hydrometeorological data from remote severe weather environments. The detailed design of the wind speed and direction system and the wire-wrapping of the logic boards were completed.
A Paradigm Shift From Brick and Mortar: Full-Time Nursing Faculty Off Campus.
Beck, Marlene; Bradley, Holly B; Cook, Linda L; Leasca, Joslin B; Lampley, Tammy; Gatti-Petito, JoAnne
The organizational structure for the Master of Science in Nursing's online program at Sacred Heart University offers a remarkably different innovative faculty model. Full-time, doctorally prepared faculty reside in several different states and teach online but are fully integrated and immersed in all aspects of the college of nursing. This untraditional model, which has proven to be successful over time using best practices for online education, is replicable and offers an innovative option for online learning.
Climate Data Provenance Tracking for Just-In-Time Computation
NASA Astrophysics Data System (ADS)
Fries, S.; Nadeau, D.; Doutriaux, C.; Williams, D. N.
2016-12-01
The "Climate Data Management System" (CDMS) was created in 1996 as part of the Climate Data Analysis Tools suite of software. It provides a simple interface into a wide variety of climate data formats, and creates NetCDF CF-Compliant files. It leverages the NumPy framework for high performance computation, and is an all-in-one IO and computation package. CDMS has been extended to track manipulations of data, and trace that data all the way to the original raw data. This extension tracks provenance about data, and enables just-in-time (JIT) computation. The provenance for each variable is packaged as part of the variable's metadata, and can be used to validate data processing and computations (by repeating the analysis on the original data). It also allows for an alternate solution for sharing analyzed data; if the bandwidth for a transfer is prohibitively expensive, the provenance serialization can be passed in a much more compact format and the analysis rerun on the input data. Data provenance tracking in CDMS enables far-reaching and impactful functionalities, permitting implementation of many analytical paradigms.
Educational and Scientific Applications of Climate Model Diagnostic Analyzer
NASA Astrophysics Data System (ADS)
Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Zhang, J.; Bao, Q.
2016-12-01
Climate Model Diagnostic Analyzer (CMDA) is a web-based information system designed for the climate modeling and model analysis community to analyze climate data from models and observations. CMDA provides tools to diagnostically analyze climate data for model validation and improvement, and to systematically manage analysis provenance for sharing results with other investigators. CMDA utilizes cloud computing resources, multi-threading computing, machine-learning algorithms, web service technologies, and provenance-supporting technologies to address technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. As CMDA infrastructure and technology have matured, we have developed the educational and scientific applications of CMDA. Educationally, CMDA supported the summer school of the JPL Center for Climate Sciences for three years since 2014. In the summer school, the students work on group research projects where CMDA provide datasets and analysis tools. Each student is assigned to a virtual machine with CMDA installed in Amazon Web Services. A provenance management system for CMDA is developed to keep track of students' usages of CMDA, and to recommend datasets and analysis tools for their research topic. The provenance system also allows students to revisit their analysis results and share them with their group. Scientifically, we have developed several science use cases of CMDA covering various topics, datasets, and analysis types. Each use case developed is described and listed in terms of a scientific goal, datasets used, the analysis tools used, scientific results discovered from the use case, an analysis result such as output plots and data files, and a link to the exact analysis service call with all the input arguments filled. For example, one science use case is the evaluation of NCAR CAM5 model with MODIS total cloud fraction. The analysis service used is Difference Plot Service of Two Variables, and the datasets used are NCAR CAM total cloud fraction and MODIS total cloud fraction. The scientific highlight of the use case is that the CAM5 model overall does a fairly decent job at simulating total cloud cover, though simulates too few clouds especially near and offshore of the eastern ocean basins where low clouds are dominant.
Earning a Master's of Science in Nursing through Distance Education.
ERIC Educational Resources Information Center
Tagg, Peggy Ingram; Arreola, Raoul A.
1996-01-01
The master's degree in nursing offered via distance education by the University of Tennessee requires educators to design instruction carefully. The most successful students are risk takers, assertive, and responsible for their own learning. Compressed interactive video has proven the most effective medium. (JOW)
Proven Formula for Employing Youth: 70001
ERIC Educational Resources Information Center
Kunerth, Jeff
1977-01-01
"70001 Ltd." is a national nonprofit organization which offers unemployed dropouts a way to complete high school without returning to the public school classroom and also to get employment in retail sales occupations. Its centers, frequently located in shopping centers, provide a business rather than a school environment. (MF)
Mission-Centered Network Models: Defending Mission-Critical Tasks From Deception
2015-09-29
…In military applications, networked operations offer an effective way to reduce the footprint of a force, but become a center of gravity… used by trust algorithms to assess quality and trustworthiness… Technical challenge: developing standard representations for provenance that…
The Tao of treating weeds: Reaching for restoration in the northern Rocky Mountains
Lisa-Natalie Anjozian
2008-01-01
Noxious weeds are a serious problem that is spreading across the West. Herbicides such as Picloram have proven to be powerful tools in reducing weed invaders, although use of this tool has often produced unintended consequences. Broadleaf herbicides kill forbs, such as the noxious knapweed, but also harm native forbs such as arrowleaf balsamroot. Removing weedy forbs...
Old Tools for New Problems: Modifying Master Gardener Training to Improve Food Access in Rural Areas
ERIC Educational Resources Information Center
Randle, Anne
2015-01-01
Extension faces ever-changing problems, which can be addressed by modifying successful tools rather than inventing new ones. The Master Gardener program has proven its effectiveness, but the cost and time commitment can make it inaccessible to rural, low-income communities, where training in home gardening may address issues of food access and…
Measurement of community empowerment in three community programs in Rapla (Estonia).
Kasmel, Anu; Andersen, Pernille Tanggaard
2011-03-01
Community empowerment approaches have been proven to be powerful tools for solving local health problems. However, the methods for measuring empowerment in the community remain unclear and open to dispute. This study aims to describe how a context-specific community empowerment measurement tool was developed and changes made to three health promotion programs in Rapla, Estonia. An empowerment expansion model was compiled and applied to three existing programs: Safe Community, Drug/HIV Prevention and Elderly Quality of Life. The consensus workshop method was used to create the measurement tool and collect data on the Organizational Domains of Community Empowerment (ODCE). The study demonstrated considerable increases in the ODCE among the community workgroup, which was initiated by community members and the municipality's decision-makers. The increase was within the workgroup, which had strong political and financial support on a national level but was not the community's priority. The program was initiated and implemented by the local community members, and continuous development still occurred, though at a reduced pace. The use of the empowerment expansion model has proven to be an applicable, relevant, simple and inexpensive tool for the evaluation of community empowerment.
Thakur, Shalabh; Guttman, David S
2016-06-30
Comparative analysis of whole genome sequence data from closely related prokaryotic species or strains is becoming an increasingly important and accessible approach for addressing both fundamental and applied biological questions. While there are a number of excellent tools developed for performing this task, most scale poorly when faced with hundreds of genome sequences, and many require extensive manual curation. We have developed a de novo genome analysis pipeline (DeNoGAP) for the automated, iterative and high-throughput analysis of data from comparative genomics projects involving hundreds of whole genome sequences. The pipeline is designed to perform reference-assisted and de novo gene prediction, homolog protein family assignment, ortholog prediction, functional annotation, and pan-genome analysis using a range of proven tools and databases. While most existing methods scale quadratically with the number of genomes, since they rely on pairwise comparisons among predicted protein sequences, DeNoGAP scales linearly, since the homology assignment is based on iteratively refined hidden Markov models. This iterative clustering strategy enables DeNoGAP to handle a very large number of genomes using minimal computational resources. Moreover, the modular structure of the pipeline permits easy updates as new analysis programs become available. DeNoGAP integrates bioinformatics tools and databases for comparative analysis of a large number of genomes. The pipeline offers tools and algorithms for annotation and analysis of completed and draft genome sequences. The pipeline is developed using Perl, BioPerl and SQLite on Ubuntu Linux version 12.04 LTS. The software package currently includes a script for automated installation of the necessary external programs on Ubuntu Linux; however, the pipeline should also be compatible with other Linux and Unix systems once the necessary external programs are installed. DeNoGAP is freely available at https://sourceforge.net/projects/denogap/ .
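The schematic below illustrates, under simplified assumptions, why incremental assignment to family models scales roughly linearly while all-against-all comparison scales quadratically; it is a toy stand-in (string similarity instead of HMM scoring), not the DeNoGAP implementation.

# Toy illustration of the scaling argument only; DeNoGAP itself uses iteratively
# refined HMMs, whereas this sketch uses a trivial similarity score.
from difflib import SequenceMatcher

def similar(a, b, threshold=0.8):
    return SequenceMatcher(None, a, b).ratio() >= threshold

def pairwise_clustering(proteins):
    """Classic approach: compute the full all-against-all similarity matrix
    (quadratic in the number of sequences), then single-linkage-cluster it."""
    n = len(proteins)
    comparisons = 0
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            comparisons += 1
            if similar(proteins[i], proteins[j]):
                parent[find(j)] = find(i)          # union of the two clusters
    families = {}
    for i in range(n):
        families.setdefault(find(i), []).append(proteins[i])
    return list(families.values()), comparisons

def incremental_clustering(proteins):
    """Compare each new protein only against one representative per family
    (a stand-in for an HMM profile): comparisons grow roughly linearly."""
    comparisons, families = 0, []
    for p in proteins:
        placed = False
        for fam in families:
            comparisons += 1
            if similar(p, fam[0]):                 # fam[0] acts as the family model
                fam.append(p)
                placed = True
                break
        if not placed:
            families.append([p])
    return families, comparisons

proteins = ["MKTAYIAKQR", "MKTAYIAKQK", "GGHVNPAVTF", "GGHVNPAVTL"] * 50
print(pairwise_clustering(proteins)[1], incremental_clustering(proteins)[1])
# For 200 sequences: 19900 pairwise comparisons vs. a few hundred incremental ones.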
Nash, David J; Coulson, Sheila; Staurset, Sigrid; Ullyott, J Stewart; Babutsi, Mosarwa; Hopkinson, Laurence; Smith, Martin P
2013-04-01
Lithic artifacts from the African Middle Stone Age (MSA) offer an avenue to explore a range of human behaviors, including mobility, raw material acquisition, trade and exchange. However, to date, in southern Africa it has not been possible to provenance the locations from which commonly used stone materials were acquired prior to transport to archaeological sites. Here we present results of the first investigation to geochemically fingerprint silcrete, a material widely used for tool manufacture across the subcontinent. The study focuses on the provenancing of silcrete artifacts from the MSA of White Paintings Shelter (WPS), Tsodilo Hills, in the Kalahari Desert of northwest Botswana. Our results suggest that: (i) despite having access to local quartz and quartzite at Tsodilo Hills, MSA peoples chose to transport silcrete over 220 km to WPS from sites south of the Okavango Delta; (ii) these sites were preferred to silcrete sources much closer to Tsodilo Hills; (iii) the same source areas were repeatedly used for silcrete supply throughout the 3 m MSA sequence; (iv) during periods of colder, wetter climate, silcrete may have been sourced from unknown, more distant, sites. Our results offer a new provenancing approach for exploring prehistoric behavior at other sites where silcrete is present in the archaeological record. Copyright © 2013 Elsevier Ltd. All rights reserved.
BASINS enables users to efficiently access nationwide environmental databases and local user-specified datasets, apply assessment and planning tools, and run a variety of proven nonpoint loading and water quality models within a single GIS format.
Improving Sugarcane for Biofuel: Engineering for an even better feedstock
USDA-ARS?s Scientific Manuscript database
Sugarcane is a proven biofuel feedstock and accounts for about half the biofuel production worldwide. It has a more favorable energy input/output ratio than that of corn, the other major biofuel feedstock. The rich resource of genetic diversity and the plasticity of autopolyploid genomes offer a wea...
Digital Technology and Student Cognitive Development
ERIC Educational Resources Information Center
Cavanaugh, J. Michael; Giapponi, Catherine C.; Golden, Timothy D.
2016-01-01
Digital technology has proven a beguiling, some even venture addictive, presence in the lives of our 21st century (millennial) students. And while screen technology may offer select cognitive benefits, there is mounting evidence in the cognitive neuroscience literature that digital technology is restructuring the way our students read and think,…
76 FR 76025 - World AIDS Day, 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-06
... to turn the corner on the HIV/AIDS pandemic by investing in research that promises new and proven.... And research is ongoing to devise new prevention methods that may one day offer innovative ways to... science available to prevent new HIV infections, and we are testing new approaches to integrating housing...
Team-Building Success: It's in the Cards
ERIC Educational Resources Information Center
Scarfino, Deborah; Roever, Carol
2009-01-01
Successful team outcomes frequently--if not always--rely upon proven techniques for managing diverse styles and strengths. In this article, the authors describe the Diversity Card Game and the benefits it offers for students and instructors. Building teams using Diversity gives students the knowledge to manage clashes that might otherwise create…
A Thematic Instruction Approach to Teaching Technology and Engineering
ERIC Educational Resources Information Center
Moyer, Courtney D.
2016-01-01
Thematic instruction offers flexible opportunities to engage students with real-world experiences in the technology and engineering community. Whether used in a broad unifying theme or specific project-based theme, research has proven that thematic instruction has the capacity to link cross-curricular subjects, facilitate active learning, and…
Power Plays: Proven Methods of Professional Learning Pack a Force
ERIC Educational Resources Information Center
Easton, Lois Brown
2005-01-01
Powerful professional learning is more than a one-shot workshop. It involves educators working collegially on a matter they care about with content arising directly from their classroom experiences. Educators know which strategies offer more powerful learning. Choosing the appropriate strategy requires answering just three questions.
Shark tales: a molecular species-level phylogeny of sharks (Selachimorpha, Chondrichthyes).
Vélez-Zuazo, Ximena; Agnarsson, Ingi
2011-02-01
Sharks are a diverse and ecologically important group, including some of the ocean's largest predatory animals. Sharks are also commercially important, with many species suffering overexploitation and facing extinction. However, despite a long evolutionary history and commercial and conservation importance, phylogenetic relationships within the sharks are poorly understood. To date, most studies have either focused on smaller clades within sharks, or sampled taxa sparsely across the group. A more detailed species-level phylogeny will offer further insights into shark taxonomy, provide a tool for comparative analyses, and facilitate phylogenetic estimates of conservation priorities. We used four mitochondrial and one nuclear gene to investigate the phylogenetic relationships of 229 species (all eight orders and 31 families) of sharks, more than quadrupling the number of taxa sampled in any prior study. The resulting Bayesian phylogenetic hypothesis agrees with prior studies on the major relationships of the shark phylogeny; however, on those relationships that have proven more controversial, it differs in several aspects from the most recent molecular studies. The phylogeny supports the division of sharks into two major groups, the Galeomorphii and Squalimorphii, rejecting the hypnosqualean hypothesis that places batoids within sharks. Within the squalimorphs the orders Hexanchiformes, Squatiniformes, Squaliformes, and Pristiophoriformes are broadly monophyletic, with minor exceptions apparently due to missing data. Similarly, within the galeomorphs, the orders Heterodontiformes, Lamniformes, Carcharhiniformes, and Orectolobiformes are broadly monophyletic, with a couple of species 'misplaced'. In contrast, many of the currently recognized shark families are not monophyletic according to our results. Our phylogeny offers some of the first clarification of the relationships among families of the order Squaliformes, a group that has thus far received relatively little phylogenetic attention. Our results suggest that the genus Echinorhinus is not a squaliform, but rather related to the saw sharks, a hypothesis that might be supported by both groups sharing 'spiny' snouts. In sum, our results offer the most detailed species-level phylogeny of sharks to date and a tool for comparative analyses. Copyright © 2010 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Seidel, Hannes; Schunk, Christian; Matiu, Michael; Menzel, Annette
2016-04-01
Climate warming and more frequent and severe drought events will alter the adaptedness and fitness of tree species. Scots pine forests in particular have been affected above average by die-off events during recent decades. Assisted migration of adapted provenances might help alleviate the impacts of recent climate change and successfully regenerate forests. However, the identification of suitable provenances based on established ecophysiological methods is time-consuming and sometimes invasive, and data on provenance-specific mortality are lacking. We studied the performance, stress and survival of potted Scots pine seedlings from 12 European provenances grown in a greenhouse experiment with multiple drought and warming treatments. In this paper, we present results on drought stress impacts monitored with four different thermal indices derived from infrared thermography imaging, as well as an ample mortality study. Percent soil water deficit (PSWD) was shown to be the main driver of the drought stress response in all thermal indices. In spite of wet and dry reference surfaces, however, fluctuating environmental conditions, mainly air temperature and humidity, altered the measured stress response. In linear mixed-effects models, the factors provenance and provenance-PSWD interaction were included alongside PSWD and meteorological covariates. The explanatory power of the models (R2) ranged from 0.51 to 0.83, and provenance-specific responses to strong and moderate drought and to subsequent recovery were thus revealed. However, obvious differences in the response magnitude of provenances to drought were difficult to link explicitly to general features such as Mediterranean versus continental type or the climate at the provenances' origin. We conclude that seedlings' drought resistance may be linked to summer precipitation and that their experienced stress levels depend, among other factors, on their above-ground dimensions under a given water supply. With respect to mortality, previous drought stress experience lowered the current risk, and the obvious provenance effects were largely related to different growth traits (dimensions). Our experimental results suggest, in addition to evidence for abiotic stress hardening, provenance-specific variation in drought resilience. Thus, there is room for provenance-based assisted migration as a tool for climate change adaptation in forestry.
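A hedged sketch of how a linear mixed-effects model of this kind might be fit in Python with statsmodels; the column names (thermal index, PSWD, covariates, provenance, seedling ID) are assumptions standing in for the study's actual data, and the original analysis was not necessarily done in Python.

# Sketch under assumed column names: a thermal stress index modeled from PSWD,
# meteorological covariates, provenance, and a provenance x PSWD interaction,
# with random effects for repeated measurements on the same seedling.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("thermal_index_measurements.csv")   # hypothetical input file

model = smf.mixedlm(
    "thermal_index ~ pswd * provenance + air_temp + rel_humidity",
    data=df,
    groups=df["seedling_id"],      # random intercept per seedling
)
result = model.fit()
print(result.summary())

# R^2-type summaries (the 0.51-0.83 range reported above) would be computed
# separately, e.g. from the variance explained by the fixed effects.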
Desai, Sunita; Hatfield, Laura A; Hicks, Andrew L; Sinaiko, Anna D; Chernew, Michael E; Cowling, David; Gautam, Santosh; Wu, Sze-Jung; Mehrotra, Ateev
2017-08-01
Insurers, employers, and states increasingly encourage price transparency so that patients can compare health care prices across providers. However, the evidence on whether price transparency tools encourage patients to receive lower-cost care and reduce overall spending remains limited and mixed. We examined the experience of a large insured population that was offered a price transparency tool, focusing on a set of "shoppable" services (lab tests, office visits, and advanced imaging services). Overall, offering the tool was not associated with lower shoppable services spending. Only 12 percent of employees who were offered the tool used it in the first fifteen months after it was introduced, and use of the tool was not associated with lower prices for lab tests or office visits. The average price paid for imaging services preceded by a price search was 14 percent lower than that paid for imaging services not preceded by a price search. However, only 1 percent of those who received advanced imaging conducted a price search. Simply offering a price transparency tool is not sufficient to meaningfully decrease health care prices or spending. Project HOPE—The People-to-People Health Foundation, Inc.
LA-ICP-MS as Tool for Provenance Analyses in Arctic Marine Sediments
NASA Astrophysics Data System (ADS)
Wildau, Antje; Garbe-Schönberg, Dieter
2015-04-01
The hydraulic transport of sediments is a major geological process in terrestrial and marine systems and is responsible for the loss, redistribution and accumulation of minerals. Provenance analyses are a powerful tool for assessing the origin and dispersion of material in ancient and modern fluvial and marine sediments. Provenance-specific heavy minerals (e.g., zircon, rutile, tourmaline) can therefore be used to provide valuable information on the formation of ore deposits (placer deposits) and on the reconstruction of paleogeography, hydrology, and climate conditions and their developments. The application of provenance analyses for the latter purpose is of specific interest, since there is a need for research on progressing climate change, and heavy minerals represent good proxies for the evaluation of recent and past changes in climate. The study of these fine particles provides information about potential regional or long-distance transport paths, glacial / ice drift and current flows, freezing and melting events, as well as depositional centers for the released sediments. Classic methods applied in provenance analyses are the mapping of the presence or absence of diagnostic minerals, their grain size distribution, modal mineralogy, and the analysis of variations in the ratio of two or more heavy minerals. Electron microprobe analysis has been established to detect changes in the mineral chemistry of individual mineral phases, which can indicate fluctuations or differences in provenance. All of these methods carry potential sources of error that lower the validity of the provenance analysis, for example the misclassification of mineral species due to indistinguishable optical properties, or the limited ability of the electron microprobe to detect trace elements and their variations. For this case study, marine sediments from the Arctic Ocean have been selected to test whether LA-ICP-MS can be established as a key technique for precise and reliable provenance analyses. The Laptev Sea is known to be a "sea ice formation factory" and represents a perfect source area, with numerous sediment-loaded rivers draining into the Arctic Ocean. Mineral grains become trapped in the sea ice, which is transported to the Fram Strait, the outflow area of the Transpolar Drift System. Thus, minerals in the Fram Strait and in the Laptev Sea should have the same provenance. In both areas zircon, garnet, ilmenite, magnetite, tourmaline, pyroxene and amphibole were identified (among others). The large number of potential source areas and the widespread occurrence of these accessory and rock-forming minerals result in the absolute need for a highly sensitive and precise method such as LA-ICP-MS. We report new data on the suitability of selected heavy minerals for provenance analyses in the Arctic Ocean. Based on the individual trace element composition, REE patterns and isotopic ratios, which reflect the conditions during formation, we report individual fingerprints for single mineral species. This enables us to allocate specific minerals from the Fram Strait and from the Laptev Sea to a common provenance. Furthermore, we evaluate the suitability of different heavy minerals as geochemical proxies for provenance analyses in Arctic sediments using LA-ICP-MS.
Turn over folders: a proven tool in succession management planning.
Engells, Thomas E
2011-01-01
The dual challenges of succession management and succession management planning are considerable. A tool, the Turn over Folder, was introduced and described in detail as a useful first step in succession management planning. The adoption of that tool will not in itself produce a succession management plan, but it will orientate the organization and its members to the reality of succession management in all important leadership and critical positions. Succession management is an important consideration in all progressive organizations and well worth the effort.
Providing Global Change Information for Decision-Making: Capturing and Presenting Provenance
NASA Technical Reports Server (NTRS)
Ma, Xiaogang; Fox, Peter; Tilmes, Curt; Jacobs, Katherine; Waple, Anne
2014-01-01
Global change information demands access to data sources and well-documented provenance to provide evidence needed to build confidence in scientific conclusions and, in specific applications, to ensure the information's suitability for use in decision-making. A new generation of Web technology, the Semantic Web, provides tools for that purpose. The topic of global change covers changes in the global environment (including alterations in climate, land productivity, oceans or other water resources, atmospheric composition and or chemistry, and ecological systems) that may alter the capacity of the Earth to sustain life and support human systems. Data and findings associated with global change research are of great public, government, and academic concern and are used in policy and decision-making, which makes the provenance of global change information especially important. In addition, since different types of decisions benefit from different types of information, understanding how to capture and present the provenance of global change information is becoming more of an imperative in adaptive planning.
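A minimal sketch of capturing such provenance with Semantic Web tooling (here, Python's rdflib and the W3C PROV ontology); the dataset, activity, and report URIs are placeholders, not identifiers from an actual global change information system.

# Minimal PROV-O sketch with placeholder URIs; it records that a finding was
# derived from an observational dataset by a documented analysis activity.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, XSD

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/globalchange/")       # placeholder namespace

g = Graph()
g.bind("prov", PROV)

finding   = EX["finding/temperature-trend-figure"]
dataset   = EX["dataset/surface-temperature-obs"]
analysis  = EX["activity/trend-analysis-2014"]
scientist = EX["agent/analysis-team"]

g.add((finding, RDF.type, PROV.Entity))
g.add((dataset, RDF.type, PROV.Entity))
g.add((analysis, RDF.type, PROV.Activity))
g.add((scientist, RDF.type, PROV.Agent))

g.add((finding, PROV.wasDerivedFrom, dataset))           # the evidence chain
g.add((finding, PROV.wasGeneratedBy, analysis))
g.add((analysis, PROV.used, dataset))
g.add((analysis, PROV.wasAssociatedWith, scientist))
g.add((analysis, PROV.endedAtTime,
       Literal("2014-05-01T00:00:00", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))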
NASA Astrophysics Data System (ADS)
Chakraborty, Debojyoti; Schueler, Silvio
2017-04-01
Adaptive management aiming at reducing vulnerability and enhancing the resilience of forested ecosystems is key to preserving the potential of forests to provide multiple ecosystem services under climate change. Planting alternative or non-native tree species adapted to future conditions, and utilizing the genetic variation within tree species, have been suggested as important adaptive management strategies under climate change. Therefore, knowledge of suitable provenances/populations is a key issue. Provenance trial experiments, in which several populations of a species are planted in a particular climate or throughout an appropriate climatic gradient, offer a great opportunity to understand adaptive genetic variation within a tree species. These trials were primarily established to identify populations with desired growth and fitness characteristics. Due to the increasing interest in climate change, such trials have been revisited to understand the relation between growth performance and climate and to recommend suitable populations for future conditions. Here we present lessons learned from provenance trials of Norway spruce and Douglas-fir in central Europe. With data from provenance trials planted across a wide range of environmental conditions in central Europe we developed multivariate models, Universal Response Functions (URFs). The URFs predict growth performance as a function of the climate of the planting location (i.e., environmental factors) and of the provenance/population origin (i.e., genetic factors). The flexibility of the URFs as a decision-making tool is remarkable: the model can be used to identify suitable planting material for a given site, and vice versa, and also as a species distribution model (SDM) with integrated genetic variation. Under current and climate change scenarios, the URFs were applied to predict populations with higher growth performance in central Europe and were also used as species distribution models for Douglas-fir (Pseudotsuga menziesii [Mirbel] Franco) and Norway spruce (Picea abies (L.) Karst). For both Douglas-fir and Norway spruce, wide variation in growth performance was detected. Populations of Douglas-fir identified by the URFs as optimal for central Europe under current climate and climate change scenarios originate from the western Cascades and coastal areas of British Columbia, Washington and Oregon. The current seed stands of Douglas-fir in North America, which provide planting material for central Europe under the legal framework of the Organization for Economic Cooperation and Development (OECD), were found to be suitable under future conditions. In the case of Norway spruce, provenances originating from warmer and drier regions of southeast Europe were found to be suitable for central Europe under future conditions. Even though they were calibrated with data from central Europe, when applied as SDMs the URFs predicted the observed occurrence of Douglas-fir in its native range in North America with reasonable accuracy compared to contemporary SDMs developed in North America. For both Douglas-fir and Norway spruce, significant variation in habitat suitability was found depending on the planted population or seed source, indicating the role of intraspecific variation in buffering the effects of climate change.
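A hedged sketch of the general shape of a Universal Response Function: growth modeled as a quadratic function of site climate and provenance-origin climate plus their interaction. The column names, scenario value, and exact functional form are assumptions for illustration, not the authors' fitted model.

# Illustrative URF-style regression with assumed column names and a simple
# quadratic-plus-interaction form; the published URFs may differ in detail.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

trials = pd.read_csv("provenance_trial_heights.csv")    # hypothetical input

urf = smf.ols(
    "height ~ site_temp + I(site_temp**2)"
    " + prov_temp + I(prov_temp**2)"
    " + site_temp:prov_temp",
    data=trials,
).fit()

# Once fitted, the same response surface can be read in two directions:
# (1) for a planting site's (future) climate, which provenance grows best, and
# (2) for a provenance, at which site climates it is expected to thrive,
# which is how a URF doubles as a species distribution model with genetics.
site_temp_2050 = 10.5                                    # assumed scenario value
candidates = pd.DataFrame({
    "site_temp": site_temp_2050,
    "prov_temp": np.linspace(2.0, 12.0, 21),             # candidate origin climates
})
candidates["predicted_height"] = urf.predict(candidates)
print(candidates.sort_values("predicted_height", ascending=False).head())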
NASA Technical Reports Server (NTRS)
Olsen, Lola M.
2006-01-01
The capabilities of the International Directory Network's (IDN) version MD9.5, along with a new version of the metadata authoring tool, "docBUILDER", will be presented during the Technology and Services Subgroup session of the Working Group on Information Systems and Services (WGISS). Feedback from the international community has proven instrumental in positively influencing the direction of the IDN's development. The international community was instrumental in encouraging support for the ISO international character set that is now available through the directory. Supporting metadata descriptions in additional languages encourages extended use of the IDN. Temporal and spatial attributes often prove pivotal in the search for data. Prior to the new software release, the IDN's geospatial and temporal searches suffered from browser incompatibilities and often resulted in unreliable performance for users attempting to initiate a spatial search using a map based on aging Java applet technology. The IDN now offers an integrated Google map and date search that replaces that technology. In addition, one of the most defining characteristics in the search for data relates to the temporal and spatial resolution of the data. The ability to refine a search to data sets meeting defined resolution requirements is now possible. Data set authors are encouraged to indicate the precise resolution values for their data sets and subsequently bin these into one of the pre-selected resolution ranges. New metadata authoring tools have been well received. In response to requests for a standalone metadata authoring tool, a new shareable software package called "docBUILDER solo" will soon be released to the public. This tool permits researchers to document their data during experiments and observational periods in the field. Interoperability has been enhanced through the use of the Open Archives Initiative's (OAI) Protocol for Metadata Harvesting (PMH). Harvesting of XML content through OAI-PMH has been successfully tested with several organizations. The protocol appears to be a prime candidate for sharing metadata throughout the international community. Data services for visualizing and analyzing data have become valuable assets in facilitating the use of data. Data providers are offering many of their data-related services through the directory. The IDN plans to develop a service-based architecture to further promote the use of web services. During the IDN Task Team session, ideas for further enhancements will be discussed.
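A brief sketch of harvesting directory metadata over OAI-PMH, as mentioned above; the endpoint URL is a placeholder, but the verb and argument names are part of the standard OAI-PMH 2.0 protocol.

# The harvesting endpoint below is a placeholder; the request parameters
# (verb, metadataPrefix, resumptionToken) follow the OAI-PMH 2.0 protocol.
import requests
import xml.etree.ElementTree as ET

ENDPOINT = "https://example.org/oai"                      # placeholder base URL
OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def harvest_titles(endpoint):
    params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
    while True:
        root = ET.fromstring(requests.get(endpoint, params=params, timeout=60).text)
        for record in root.iter(OAI + "record"):
            title = record.find(".//" + DC + "title")
            if title is not None:
                yield title.text
        token = root.find(".//" + OAI + "resumptionToken")  # paged harvesting
        if token is None or not (token.text or "").strip():
            break
        params = {"verb": "ListRecords", "resumptionToken": token.text.strip()}

for t in harvest_titles(ENDPOINT):
    print(t)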
Miller, Brian W.; Morisette, Jeffrey T.
2014-01-01
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
NASA Astrophysics Data System (ADS)
Shackley, S.
2010-12-01
For many decades now, geologists and archaeologists have been analyzing archaeological obsidian using a spate of techniques. No single technology, however, can solve all of the chemical, petrological, or archaeological problems that arise from this disordered substance. The future is indistinct for obsidian studies with the rising use and misuse of portable XRF (PXRF) and ICP-MS, the apparent decline of the use of neutron activation (NAA), continual misuse of megascopic source assignment, and the maturation of laboratory x-ray fluorescence spectrometry (XRF). Magnetic property analysis of obsidian is yet another tool for the understanding of source provenance and may very well become a tool that fills a gap in our analytical repertoire. This discussion is designed to provide historical context for this resurrected technique and serve as a reminder that we don’t always know what we know in geoarchaeological science.
Reduce Fraud Risk in Your District with Stronger Internal Controls
ERIC Educational Resources Information Center
Okrzesik, Daryl J.; Nuehring, Bert G.
2011-01-01
Internal accounts offer schools a faster, more convenient way to handle the income and expenses that result from student fees, school clubs and organizations, field trips, fund-raising, and similar activities. But this convenience also incurs the added risk of fraud. Fortunately, there are proven ways to strengthen internal controls and reduce…
ERIC Educational Resources Information Center
Marchel, Carol A.; Green, Susan K
2014-01-01
Increased use of field-based teacher preparation offers important opportunities to develop skills with diverse learners. However, limited focus on theoretical content restricts understanding and generalization of well-proven theoretical approaches, resulting in fragmented field applications unlikely to result in broad application. Inspired by Kurt…
Creating Healthful Environments in Schools through a Total Ban on Tobacco Use.
ERIC Educational Resources Information Center
Barta, Kathleen; Totten, Samuel
1995-01-01
Health and medical researchers have conclusively proven that smoking and secondhand smoke constitute a major health danger. Students get mixed messages when school staff smoke or use other tobacco products on campus. This article advocates a total ban on tobacco use in school settings, offers policymaking guidelines, summarizes state legislation,…
Teaching and Learning with Computers! A Method for American Indian Bilingual Classrooms.
ERIC Educational Resources Information Center
Bennett, Ruth
Computer instruction can offer particular benefits to the Indian child. Computer use emphasizes the visual facets of learning, teaches language based skills needed for higher education and careers, and provides types of instruction proven effective with Indian children, such as private self-testing and cooperative learning. The Hupa, Yurok, Karuk,…
ERIC Educational Resources Information Center
St. Clair, Guy
As funds for supporting library and information services dwindle, librarians are beginning to recognize the value of evaluating and justifying their library in terms that the decision makers--those who control the budgets--understand. This book offers proven techniques for implementing a program that both promotes information services and dispels…
Challenge 98: Sustaining the Work of a Regional Technology Integration Initiative
ERIC Educational Resources Information Center
Billig, Shelley H.; Sherry, Lorraine; Havelock, Bruce
2005-01-01
In this article, we offer a research-based theoretical framework for sustainability, describing the proven qualities of a project and the innovations that support its sustained existence over time. We then describe how a US Department of Education Technology Innovation Challenge grantee, working to promote technology integration in a…
Understanding Leadership: An Experimental-Experiential Model
ERIC Educational Resources Information Center
Hole, George T.
2014-01-01
Books about leadership are dangerous to readers who fantasize about being leaders or apply leadership ideas as if they were proven formulas. As an antidote, I offer an experimental framework in which any leadership-management model can be tested to gain experiential understanding of the model. As a result one can gain reality-based insights about…
Seamless Provenance Representation and Use in Collaborative Science Scenarios
NASA Astrophysics Data System (ADS)
Missier, P.; Ludaescher, B.; Bowers, S.; Altintas, I.; Anand, M. K.; Dey, S.; Sarkar, A.; Shrestha, B.; Goble, C.
2010-12-01
The notion of sharing scientific data has only recently begun to gain ground in science, where data is still considered a private asset. There is growing evidence, however, that the benefits of scientific collaboration through early data sharing during the course of a science project may outgrow the risk of losing exclusive ownership of the data. As exemplar success stories are making the headlines[1], principles of effective information sharing have become the subject of e-science research. In particular, any piece of published data should be self-describing, to the extent necessary for consumers to determine its suitability for reuse in their own projects. This is accomplished by associating a body of formally specified and machine-processable metadata to the data. When data is produced and reused by independent groups, however, metadata interoperability issues emerge. This is the case for provenance, a form of metadata that describes the history of a data product, Y. Provenance is typically expressed as a graph-structured set of dependencies that account for the sequence of computational or interactive steps that led to Y, often starting from some primary, observational data. Traversing dependency graphs is one of the mechanisms used to answer questions on data reliability. In the context of the NSF DataONE project[2], we have been studying issues of provenance interoperability in scientific collaboration scenarios. Consider a first scientist, Alice, who publishes a data product X along with its provenance, and a second scientist who further transforms X into a new product Y, also along with its provenance. A third scientist, who is interested in Y, expects to be able to trace Y's history up to the inputs used by Alice. This is only possible, however, if provenance accumulates into a single, uniform graph that can be seamlessly traversed. This becomes problematic when provenance is captured using different tools and computational models (i.e. workflow systems), as well as when data is published and reused using mechanisms that are not provenance-aware. In this presentation we discuss requirements for ensuring provenance-aware data publishing and reuse, and describe the design and implementation of a prototype toolkit that involves two specific, and broadly used, workflow models, Kepler [3] and Taverna [4]. The implementation is expected to be adopted as part of DataONE's investigators' toolkit, in support of its mission of large-scale data preservation. Refs. [1]Sharing of Data Leads to Progress on Alzheimer’s, G. Kolata, NYT, 8/12/2010 [2]http://www.dataone.org [3]Ludaescher B., Altintas I. et al. Scientific Workflow Management and the Kepler System. Special Issue: Workflow in Grid Systems. Concurrency and Computation: Practice & Experience 18(10): 1039-1065, 2006 [4]D. Hull, K. Wolstencroft, R. Stevens, C. Goble, M. R. Pocock, P. Li, T. Oinn. Taverna: a tool for building and running workflows of services. Nucl. Acids Res. 34: W729-W732, 2006
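A small sketch of the kind of seamless traversal the scenario above requires: two provenance graphs, captured independently, are merged and then walked backwards from the final product Y to Alice's primary inputs. The node names are invented for illustration and do not come from Kepler or Taverna.

# Toy provenance graphs with invented node names; each dict maps a data product
# to the set of inputs it was directly derived from (its upstream dependencies).
alice_provenance = {
    "X": {"obs_raw", "calibration_table"},        # Alice derived X from raw inputs
}
bob_provenance = {
    "Y": {"X", "model_params"},                   # Bob derived Y from Alice's X
}

def merge(*graphs):
    """Accumulate independently captured provenance into one uniform graph."""
    merged = {}
    for g in graphs:
        for product, inputs in g.items():
            merged.setdefault(product, set()).update(inputs)
    return merged

def lineage(graph, product):
    """Traverse dependency edges transitively: everything `product` depends on."""
    seen, stack = set(), [product]
    while stack:
        node = stack.pop()
        for parent in graph.get(node, ()):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return seen

combined = merge(alice_provenance, bob_provenance)
print(lineage(combined, "Y"))
# {'X', 'obs_raw', 'calibration_table', 'model_params'} -- the third scientist can
# trace Y back to Alice's primary inputs only because both graphs share the
# identifier "X" and were merged into one traversable graph.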
SeqHound: biological sequence and structure database as a platform for bioinformatics research
2002-01-01
Background SeqHound has been developed as an integrated biological sequence, taxonomy, annotation and 3-D structure database system. It provides a high-performance server platform for bioinformatics research in a locally-hosted environment. Results SeqHound is based on the National Center for Biotechnology Information data model and programming tools. It offers daily updated contents of all Entrez sequence databases in addition to 3-D structural data and information about sequence redundancies, sequence neighbours, taxonomy, complete genomes, functional annotation including Gene Ontology terms and literature links to PubMed. SeqHound is accessible via a web server through a Perl, C or C++ remote API or an optimized local API. It provides functionality necessary to retrieve specialized subsets of sequences, structures and structural domains. Sequences may be retrieved in FASTA, GenBank, ASN.1 and XML formats. Structures are available in ASN.1, XML and PDB formats. Emphasis has been placed on complete genomes, taxonomy, domain and functional annotation as well as 3-D structural functionality in the API, while fielded text indexing functionality remains under development. SeqHound also offers a streamlined WWW interface for simple web-user queries. Conclusions The system has proven useful in several published bioinformatics projects such as the BIND database and offers a cost-effective infrastructure for research. SeqHound will continue to develop and be provided as a service of the Blueprint Initiative at the Samuel Lunenfeld Research Institute. The source code and examples are available under the terms of the GNU public license at the Sourceforge site http://sourceforge.net/projects/slritools/ in the SLRI Toolkit. PMID:12401134
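The sketch below shows only the general pattern of retrieving a sequence record in FASTA format from a remote web API of this kind; the endpoint and parameter names are invented placeholders and are not SeqHound's actual API.

# Generic retrieval pattern only; the endpoint and parameters below are
# placeholders and do NOT reflect SeqHound's real remote API.
import requests

def fetch_fasta(record_id, base_url="https://example.org/seqdb/api"):
    """Fetch one sequence record in FASTA format from a hypothetical service."""
    response = requests.get(
        base_url + "/fetch",
        params={"id": record_id, "format": "fasta"},
        timeout=30,
    )
    response.raise_for_status()
    return response.text

print(fetch_fasta(123456))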
Conversion of the Aeronautics Interactive Workstation
NASA Technical Reports Server (NTRS)
Riveras, Nykkita L.
2004-01-01
This summer I am working in the Educational Programs Office. My task is to convert the Aeronautics Interactive Workstation from a Macintosh (Mac) platform to a Personal Computer (PC) platform. The Aeronautics Interactive Workstation is a workstation in the Aerospace Educational Laboratory (AEL), which is one of the three components of the Science, Engineering, Mathematics, and Aerospace Academy (SEMAA). The AEL is a state-of-the-art, electronically enhanced, computerized classroom that puts cutting-edge technology at the fingertips of participating students. It provides a unique learning experience regarding aerospace technology that features activities equipped with aerospace hardware and software that model real-world challenges. The Aeronautics Interactive Workstation, in particular, offers a variety of activities pertaining to the history of aeronautics. When the Aeronautics Interactive Workstation was first implemented into the AEL it was designed with Macromedia Director 4 for a Mac. Today it is being converted to Macromedia Director MX 2004 for a PC. Macromedia Director is the proven multimedia tool for building rich content and applications for CDs, DVDs, kiosks, and the Internet. It handles the widest variety of media and offers powerful features for building rich content that delivers real results, integrating interactive audio, video, bitmaps, vectors, text, fonts, and more. Macromedia Director currently offers two programming/scripting languages: Lingo, which is Director's own programming/scripting language, and JavaScript. In the workstation, Lingo is used for the programming/scripting since it was the only language available when the workstation was created. Because the workstation was created with an older version of Macromedia Director, it used significantly different programming/scripting protocols. In order to successfully accomplish my task, the final product required correction of Xtra and programming/scripting errors. I also had to convert the Mac platform file extensions into compatible file extensions for a PC.
High power disk lasers: advances and applications
NASA Astrophysics Data System (ADS)
Havrilla, David; Holzer, Marco
2011-02-01
Though the genesis of the disk laser concept dates to the early 90's, the disk laser continues to demonstrate the flexibility and the certain future of a breakthrough technology. On-going increases in power per disk, and improvements in beam quality and efficiency continue to validate the genius of the disk laser concept. As of today, the disk principle has not reached any fundamental limits regarding output power per disk or beam quality, and offers numerous advantages over other high power resonator concepts, especially over monolithic architectures. With well over 1000 high power disk lasers installations, the disk laser has proven to be a robust and reliable industrial tool. With advancements in running cost, investment cost and footprint, manufacturers continue to implement disk laser technology with more vigor than ever. This paper will explain important details of the TruDisk laser series and process relevant features of the system, like pump diode arrangement, resonator design and integrated beam guidance. In addition, advances in applications in the thick sheet area and very cost efficient high productivity applications like remote welding, remote cutting and cutting of thin sheets will be discussed.
Colour thresholding and objective quantification in bioimaging
NASA Technical Reports Server (NTRS)
Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.
1992-01-01
Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
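To make the contrast concrete, here is a hedged sketch of colour thresholding versus plain grayscale densitometry on an RGB image array; the ratio-based definition of the stain colour and the cutoff values are illustrative assumptions, not a validated histochemistry protocol.

# Illustrative only: the "reddish reaction product" definition and the cutoffs
# are assumptions chosen to show the idea, not calibrated values.
import numpy as np

def quantify(image_rgb):
    """Compare a colour threshold with a monochrome intensity threshold."""
    img = image_rgb.astype(float)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]

    # Colour thresholding: keep pixels whose hue is dominated by red,
    # separating a reddish stain from similarly dark but differently hued pixels.
    colour_mask = (r > 1.3 * g) & (r > 1.3 * b) & (r > 60)

    # Monochrome densitometry: a single intensity cutoff on the gray level,
    # which cannot distinguish two stains of similar darkness but different hue.
    gray = 0.299 * r + 0.587 * g + 0.114 * b
    gray_mask = gray < 120

    return colour_mask.sum(), gray_mask.sum()

# Tiny synthetic example: one reddish pixel, one bluish pixel of similar darkness.
demo = np.array([[[150, 40, 40], [40, 40, 150]]], dtype=np.uint8)
print(quantify(demo))   # colour threshold counts 1 pixel, gray threshold counts 2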
Cognitive simulators for medical education and training.
Kahol, Kanav; Vankipuram, Mithra; Smith, Marshall L
2009-08-01
Simulators for honing procedural skills (such as surgical skills and central venous catheter placement) have proven to be valuable tools for medical educators and students. While such simulations represent an effective paradigm in surgical education, there is an opportunity to add a layer of cognitive exercises to these basic simulations that can facilitate robust skill learning in residents. This paper describes a controlled methodology, inspired by neuropsychological assessment tasks and embodied cognition, to develop cognitive simulators for laparoscopic surgery. These simulators provide psychomotor skill training and offer the additional challenge of accomplishing cognitive tasks in realistic environments. A generic framework for design, development and evaluation of such simulators is described. The presented framework is generalizable and can be applied to different task domains. It is independent of the types of sensors, simulation environment and feedback mechanisms that the simulators use. A proof of concept of the framework is provided through developing a simulator that includes cognitive variations to a basic psychomotor task. The results of two pilot studies are presented that show the validity of the methodology in providing an effective evaluation and learning environments for surgeons.
Gradient optimization of finite projected entangled pair states
NASA Astrophysics Data System (ADS)
Liu, Wen-Yuan; Dong, Shao-Jun; Han, Yong-Jian; Guo, Guang-Can; He, Lixin
2017-05-01
Projected entangled pair states (PEPS) methods have been proven to be powerful tools for solving strongly correlated quantum many-body problems in two dimensions. However, due to the high computational scaling with the virtual bond dimension D, in practical applications PEPS are often limited to rather small bond dimensions, which may not be large enough for some highly entangled systems, for instance frustrated systems. Optimization of the ground state using the imaginary time evolution method with a simple update scheme can reach larger bond dimensions; however, the accuracy of the rough approximation to the environment of the local tensors is questionable. Here, we demonstrate that combining the imaginary time evolution method with a simple update, Monte Carlo sampling techniques, and gradient optimization offers an efficient method to calculate the PEPS ground state. By taking advantage of massive parallel computing, we can study quantum systems with larger bond dimensions, up to D = 10, without resorting to any symmetry. Benchmark tests of the method on the J1-J2 model give impressive accuracy compared with exact results.
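For context, the gradient-based part of such combined schemes typically minimizes the variational energy, with the gradient estimated from Monte Carlo samples. The expressions below are the standard variational Monte Carlo form, given here as a generic sketch rather than the paper's specific derivation; A_k denotes a PEPS tensor element, and the angle brackets are averages over configurations S sampled with probability |Psi_A(S)|^2.

% Generic variational Monte Carlo gradient sketch (assumed standard form).
E(A) = \frac{\langle \Psi_A | H | \Psi_A \rangle}{\langle \Psi_A | \Psi_A \rangle},
\qquad
E_{\mathrm{loc}}(S) = \sum_{S'} \frac{\langle S | H | S' \rangle\, \Psi_A(S')}{\Psi_A(S)},
\qquad
O_k(S) = \frac{\partial \ln \Psi_A(S)}{\partial A_k},

\frac{\partial E}{\partial A_k}
  = 2\,\mathrm{Re}\Big(
      \big\langle E_{\mathrm{loc}}(S)\, O_k^{*}(S) \big\rangle
      - \big\langle E_{\mathrm{loc}}(S) \big\rangle\,
        \big\langle O_k^{*}(S) \big\rangle
    \Big).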
Recchia, Gabriel L; Louwerse, Max M
2016-11-01
Computational techniques comparing co-occurrences of city names in texts allow the relative longitudes and latitudes of cities to be estimated algorithmically. However, these techniques have not been applied to estimate the provenance of artifacts with unknown origins. Here, we estimate the geographic origin of artifacts from the Indus Valley Civilization, applying methods commonly used in cognitive science to the Indus script. We show that these methods can accurately predict the relative locations of archeological sites on the basis of artifacts of known provenance, and we further apply these techniques to determine the most probable excavation sites of four sealings of unknown provenance. These findings suggest that inscription statistics reflect historical interactions among locations in the Indus Valley region, and they illustrate how computational methods can help localize inscribed archeological artifacts of unknown origin. The success of this method offers opportunities for the cognitive sciences in general and for computational anthropology specifically. Copyright © 2015 Cognitive Science Society, Inc.
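A rough sketch of the underlying idea, assuming a toy co-occurrence matrix: convert co-occurrence counts into dissimilarities and recover relative 2-D positions with multidimensional scaling. The counts are invented and the site names serve only as labels; the study's actual statistics and method details differ.

# Toy example of location estimation from co-occurrence statistics.
import numpy as np
from sklearn.manifold import MDS

sites = ["Harappa", "Mohenjo-daro", "Dholavira", "Lothal"]
# Symmetric co-occurrence counts: how often inscriptions from two sites share signs.
cooc = np.array([
    [0, 9, 3, 2],
    [9, 0, 4, 3],
    [3, 4, 0, 8],
    [2, 3, 8, 0],
], dtype=float)

# Higher co-occurrence is taken to mean "closer"; turn counts into dissimilarities.
dissimilarity = cooc.max() - cooc
np.fill_diagonal(dissimilarity, 0.0)

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)

for name, (x, y) in zip(sites, coords):
    print(f"{name:>12s}: ({x:6.2f}, {y:6.2f})")   # relative, unoriented positions

# An artifact of unknown provenance could then be placed by computing its sign
# co-occurrence profile against the known sites and embedding it the same way.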
NERIES: Seismic Data Gateways and User Composed Datasets Metadata Management
NASA Astrophysics Data System (ADS)
Spinuso, Alessandro; Trani, Luca; Kamb, Linus; Frobert, Laurent
2010-05-01
One of the NERIES EC project's main objectives is to establish and improve the networking of seismic waveform data exchange and access among four main data centers in Europe: INGV, GFZ, ORFEUS and IPGP. Besides the implementation of the data backbone, several investigations and developments have been conducted in order to offer users the data available from this network, either programmatically or interactively. One of the challenges is to understand how to enable users' activities such as discovering, aggregating, describing and sharing datasets, so as to reduce the replication of similar data queries against the network and spare the data centers from having to guess at and create pre-packaged products. We have started to transfer this task more and more to the user community, where user-composed data products can be extensively re-used. The main link to the data is a centralized web service (SeismoLink) acting as a single access point to the whole data network. Users can download either waveform data or seismic station inventories directly from their own software routines by connecting to this web service, which routes the request to the data centers. The provenance of the data is maintained and transferred to the users in the form of URIs that identify the dataset and implicitly refer to the data provider. SeismoLink, combined with other web services (e.g., the EMSC QuakeML earthquake catalog service), is used by a community gateway such as the NERIES web portal (http://www.seismicportal.eu). Here the user interacts with a map-based portlet that allows the dynamic composition of a data product, binding seismic event parameters with a set of seismic stations. The requested data is collected by the back-end processes of the portal, preserved, and offered to the user in a personal data cart, where metadata can be generated interactively on demand. The metadata, expressed in RDF, can also be ingested remotely. They carry rating, provenance and user annotation properties. Once generated, they are included in a proprietary taxonomy used by the overall architecture of the web portal. The metadata are made available through a SPARQL endpoint, thus allowing the datasets to be aggregated and shared among users in a meaningful way, while also enabling the development of third-party visualization tools beyond the portal infrastructure. The SEE-GRID-SCI and the JISC-funded RapidSeis projects investigate the use of this framework to enable waveform data processing over the Grid.
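A hedged sketch of how a third-party tool might pull user-composed dataset metadata from such a SPARQL endpoint; the endpoint URL and the property names in the query are assumptions, since the portal's proprietary taxonomy is not specified here.

# The endpoint URL and vocabulary below are placeholders; only the access
# pattern (SPARQL over HTTP via SPARQLWrapper) matches what the text describes.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://example.org/neries/sparql")   # placeholder endpoint
sparql.setReturnFormat(JSON)
sparql.setQuery("""
    PREFIX dct: <http://purl.org/dc/terms/>
    SELECT ?dataset ?provider ?rating WHERE {
        ?dataset dct:source ?provider .
        OPTIONAL { ?dataset <http://example.org/vocab#rating> ?rating . }
    }
    LIMIT 20
""")

for row in sparql.query().convert()["results"]["bindings"]:
    print(row["dataset"]["value"],
          row["provider"]["value"],
          row.get("rating", {}).get("value", "unrated"))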
4 Other Ways to Quit Smoking Besides Using Medication
There are other ways to quit smoking besides cold turkey and medication. Medications are a good tool, but they’re not a magic bullet. Boost your chances of quitting and staying quit by using these other proven methods.
LIVING SHORES GALLERY MX964015
An interactive computer kiosk will allow the Texas State Aquarium to deliver a considerable amount of information in an efficient and highly effective manner. Touch screen interactives have proven to be excellent teaching tools in the Aquarium's Jellies: Floating Phantoms galler...
ERIC Educational Resources Information Center
Krause, Jaclyn A.
2010-01-01
As Web 2.0 tools and technologies increase in popularity in consumer markets, enterprises are seeking ways to take advantage of the rich social knowledge exchanges that these tools offer. The problem this study addresses is that it remains unknown whether employees perceive that these tools offer value to the organization and therefore will be…
Control system design and analysis using the INteractive Controls Analysis (INCA) program
NASA Technical Reports Server (NTRS)
Bauer, Frank H.; Downing, John P.
1987-01-01
The INteractive Controls Analysis (INCA) program was developed at the Goddard Space Flight Center to provide a user friendly efficient environment for the design and analysis of linear control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. Moreover, the results of the analytic tools imbedded in INCA have been flight proven with at least three currently orbiting spacecraft. This paper describes the INCA program and illustrates, using a flight proven example, how the package can perform complex design analyses with relative ease.
EPA perspective - exposure and effects prediction and monitoring
Risk-based decisions for environmental chemicals often rely on estimates of human exposure and biological response. Biomarkers have proven a useful empirical tool for evaluating exposure and hazard predictions. In the United States, the Centers for Disease Control and Preventio...
Nuclear moisture-density evaluation : part II : final report.
DOT National Transportation Integrated Search
1966-06-01
The determination of in-place density by the use of nuclear moisture-density devices has proven to be an exceptionally useful tool to the modern Highway Engineer. In order to adequately adapt this new testing equipment to efficient field use, evaluat...
ERIC Educational Resources Information Center
Frees, Edward W.; Kim, Jee-Seon
2006-01-01
Multilevel models are proven tools in social research for modeling complex, hierarchical systems. In multilevel modeling, statistical inference is based largely on quantification of random variables. This paper distinguishes among three types of random variables in multilevel modeling--model disturbances, random coefficients, and future response…
Black cherry provenances for planting in northwestern Pennsylvania
Russell S. Walters; Russell S. Walters
1985-01-01
After 14 years, survival of 8 of 25 planted black cherry sources is greater than 70 percent, and there are no significant differences in height. These sources offer the greater potential for planting in northwestern Pennsylvania; they include four Pennsylvania sources plus one each from Tennessee, West Virginia, Ohio, and Virginia. Planted trees did not grow better...
ERIC Educational Resources Information Center
Lewis, Mary G., Comp.
This catalog contains descriptions of the science education programs in the National Diffusion Network (NDN). These programs are available to school systems or other educational institutions for implementation in their classrooms. Some programs may be able to offer consultant services and limited assistance with the training and materials…
The Literature Review: Six Steps to Success. Second Edition
ERIC Educational Resources Information Center
Machi, Lawrence A.; McEvoy, Brenda T.
2012-01-01
This new edition of the best-selling book offers graduate students in education and the social sciences a road map to developing and writing an effective literature review for a research project, thesis, or dissertation. Organized around a proven six-step model and incorporating technology into all of the steps, the book provides examples,…
Literacy Matters: Strategies Every Teacher Can Use. A SkyLight Guide.
ERIC Educational Resources Information Center
Fogarty, Robin
This book explores 15 practical literacy strategies that can be used across grade levels (K-College) and content areas. The easy-to-use book offers an overview of the research and best practices associated with each literacy strategy--and it defines each strategy. It explores proven instructional methods such as metacognition, literacy circles,…
ERIC Educational Resources Information Center
Barros, Ricardo
Focusing on the Chama Valley School District's attempt to plan and implement a community council as a foundation for community education efforts in the rural Hispanic community of Chama, this publication offers "hands-on" suggestions in methods of implementing a community education program. Following a description of the school district…
Make Your Load Lighter with STARS
ERIC Educational Resources Information Center
Braxton, Barbara
2005-01-01
The Student Teaching and Research Services (STARS) program that is in place in a particular elementary school has proven very successful, not only in improving the services that are offered, but also in helping its participants to build their self-esteem. Those who seek a safe haven in the library during breaks make such a significant and visible…
ERIC Educational Resources Information Center
Hieneman, Meme; Childs, Karen; Sergay, Jane
2006-01-01
Now the theory and research behind the positive behavior support (PBS) process--an approach already proven effective in schools and community programs--has been transformed into a practical, easy-to-use guide that's perfect for sharing with parents. Developed by educators and families, this user-friendly handbook offers parents easy-to-follow…
The Dark Night: A Model of Spiritual Formation for Emerging Young Adults in College
ERIC Educational Resources Information Center
Rendon-Reyes, Juan
2017-01-01
An increasing number of baptized emerging adults consider themselves spiritual (spiritual seekers) and religiously un-affiliated. In 2003 the Pontifical Council for Culture, aware of the reality of the human spiritual quest, stated that helping people in their spiritual search by offering proven techniques and experiences of real prayer could open…
ERIC Educational Resources Information Center
Hayes, Dianne
2012-01-01
With more than a quarter of a century of filmmaking under his belt, Spike Lee has begun working with students to provide opportunities for them to reach their dreams. The tough-minded director on set has proven to have a soft spot when it comes to youth and the next generation of filmmakers. Lee is a welcomed guest lecturer offering candid…
John B. Loomis; George Peterson; Patricia A. Champ; Thomas C. Brown; Beatrice Lucero
1998-01-01
Estimating empirical measures of an individual's willingness to accept that are consistent with conventional economic theory has proven difficult. The method of paired comparison offers a promising approach to estimating willingness to accept. This method involves having individuals make binary choices between receiving a particular good or a sum of money....
How the Men's Shed Idea Travels to Scandinavia
ERIC Educational Resources Information Center
Ahl, Helene; Hedegaard, Joel; Golding, Barry
2017-01-01
Australia has around 1,000 Men's Sheds--informal community-based workshops offering men beyond paid work somewhere to go, something to do and someone to talk to. They have proven to be of great benefit for older men's learning, health and wellbeing, social integration, and for developing a positive male identity focusing on community…
NASA Astrophysics Data System (ADS)
Hark, R. R.; Harmon, R. S.; Remus, J. J.; East, L. J.; Wise, M. A.; Tansi, B. M.; Shughrue, K. M.; Dunsin, K. S.; Liu, C.
2012-04-01
Laser-induced breakdown spectroscopy (LIBS) offers a means of rapidly distinguishing different places of origin for a mineral because the LIBS plasma emission spectrum provides the complete chemical composition (i.e. geochemical fingerprint) of a mineral in real time. An application of this approach with potentially significant commercial and political importance is the spectral fingerprinting of the 'conflict minerals' columbite-tantalite ("coltan"). Following a successful pilot study of three columbite-tantalite suites from the United States and Canada, a more geographically diverse set of samples from 37 locations worldwide was analyzed using a commercial laboratory LIBS system, and a subset of samples was also analyzed using a prototype broadband field-portable system. The spectral range from 250-490 nm was chosen for the laboratory analysis to encompass many of the intense emission lines for the major elements (Ta, Nb, Fe, Mn) and the significant trace elements (e.g., W, Ti, Zr, Sn, U, Sb, Ca, Zn, Pb, Y, Mg, and Sc) known to commonly substitute in the columbite-tantalite solid solution series crystal structure and in the columbite group minerals. The field-portable instrument offered an increased spectral range (198-1005 nm), over which all elements have spectral emission lines, and higher resolution than the laboratory instrument. In both cases, the LIBS spectra were analyzed using advanced multivariate statistical signal processing techniques. Partial Least Squares Discriminant Analysis (PLSDA) resulted in correct place-level geographic classification at success rates between 90 and 100%. The possible role of rare-earth elements (REEs) as a factor contributing to the high levels of sample discrimination was explored. Given that it can be deployed as a man-portable analytical technology, these results lend additional evidence that LIBS has the potential to be utilized in the field as a real-time tool to discriminate between columbite-tantalite ores of different provenance.
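Scikit-learn has no dedicated PLS-DA estimator, so a common stand-in (shown below on invented spectra) is PLS regression against one-hot class labels followed by an arg-max decision. With random data the printed success rate will sit near chance; real LIBS spectra carry the discriminating signal reported above.

```python
# PLS-DA workaround: PLS regression against one-hot class labels, arg-max decision.
# The spectra and labels here are purely hypothetical.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelBinarizer

rng = np.random.default_rng(1)
n_samples, n_channels, n_sites = 120, 500, 4        # hypothetical LIBS spectra
X = rng.normal(size=(n_samples, n_channels))
y = rng.integers(0, n_sites, size=n_samples)         # place-of-origin labels

lb = LabelBinarizer()
Y = lb.fit_transform(y)                               # one-hot encode the classes

X_tr, X_te, Y_tr, Y_te, y_tr, y_te = train_test_split(X, Y, y, random_state=0)
pls = PLSRegression(n_components=10).fit(X_tr, Y_tr)
y_pred = lb.classes_[np.argmax(pls.predict(X_te), axis=1)]
print("classification success rate: %.2f" % np.mean(y_pred == y_te))
```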
Bakhoum, Niokhor; Ndoye, Fatou; Kane, Aboubacry; Assigbetse, Komi; Fall, Dioumacor; Sylla, Samba Ndao; Noba, Kandioura; Diouf, Diégane
2012-07-01
Rhizobial inoculation has a positive impact on plant growth; however, there is little information about its effect on soil microbial communities and their activity in the rhizosphere. It was therefore necessary to test the effect of inoculating Acacia senegal (L.) Willd. seedlings with selected rhizobia on plant growth, on the structure and diversity of soil bacterial communities, and on soil functioning, in relation to plant provenance and soil origin. To carry out this experiment, three A. senegal seed provenances from Kenya, Niger, and Senegal were inoculated with selected rhizobial strains. The seedlings were then grown for 4 months under greenhouse conditions in two non-disinfected soils, Dahra and Goudiry, originating from arid and semi-arid areas, respectively. Principal component analysis (PCA) showed an inoculation effect on plant growth, rhizospheric bacterial diversity, and soil functioning. However, the performance of the rhizobial strains varied with the seed provenance and the soil origin. The selected rhizobial strains, the A. senegal provenance, and the soil origin modified the structure and diversity of soil bacterial communities, as measured by principal component analysis of denaturing gradient gel electrophoresis profiles. Interestingly, bacterial communities of the Dahra soil were structured mainly according to A. senegal provenance, whereas those of the Goudiry soil were structured in relation to rhizobial inoculation. In addition, the impact of inoculation on soil microbial activity, measured by fluorescein diacetate analyses, varied with plant provenance and soil origin; total microbial activity was about two times higher in the Goudiry soil than in the Dahra soil. Our results suggest that rhizobial inoculation is a suitable tool for improving plant growth and soil fertility, but that its impact depends on the inoculant, the plant provenance, and the soil origin. It will therefore be crucial to identify the appropriate rhizobial strains and plant provenances or species for a given soil type.
CLICK: The USGS Center for LIDAR Information Coordination & Knowledge
Menig, Jordan C.; Stoker, Jason M.
2007-01-01
While this technology has proven its use as a mapping tool - effective for generating bare earth DEMs at high resolutions (1-3 m) and with high vertical accuracies (15-18 cm) - obstacles remain for its application as a remote sensing tool:
* The high cost of collecting LIDAR
* The steep learning curve on research and application of using the entire point cloud
* The challenges of discovering whether data exist for regions of interest
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-27
Proposed Rule Change to Offer Risk Management Tools Designed to Allow Member Organizations to Monitor and Address Exposure to Risk. The Exchange proposes to offer risk management tools designed to allow member organizations to monitor and address exposure to risk.
An examination of the predictive validity of the risk matrix 2000 in England and wales.
Barnett, Georgia D; Wakeling, Helen C; Howard, Philip D
2010-12-01
This study examined the predictive validity of an actuarial risk-assessment tool with convicted sexual offenders in England and Wales. A modified version of the RM2000/s scale and the RM2000 v and c scales (Thornton et al., 2003) were examined for accuracy in predicting proven sexual violent, nonsexual violent, and combined sexual and/or nonsexual violent reoffending in a sample of sexual offenders who had either started a community sentence or been released from prison into the community by March 2007. Rates of proven reoffending were examined at 2 years for the majority of the sample (n = 4,946), and 4 years (n = 578) for those for whom these data were available. The predictive validity of the RM2000 scales was also explored for different subgroups of sexual offenders to assess the robustness of the tool. Both the modified RM2000/s and the complete v and c scales effectively classified offenders into distinct risk categories that differed significantly in rates of proven sexual and/or nonsexual violent reoffending. Survival analyses on the RM2000/s and v scales (N = 9,284) indicated that the higher risk groups offended more quickly and at a higher rate than lower risk groups. The relative predictive validity of the RM2000/s, v, and c, as calculated using Receiver Operating Characteristic (ROC) analyses, was moderate (.68) for RM2000/s and large for both the RM2000/c (.73) and RM2000/v (.80) at the 2-year follow-up. RM2000/s was moderately accurate in predicting relative risk of proven sexual reoffending for a variety of subgroups of sexual offenders.
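The ROC analysis quoted above can be reproduced in form (not in value) with a few lines; the sketch below scores an ordinal risk category against a binary reoffending outcome on synthetic data.

```python
# A minimal sketch of the ROC analysis reported above: the AUC of an ordinal
# risk-category score against a binary reoffending outcome (data synthetic).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
risk_category = rng.integers(0, 4, size=1000)            # e.g. low .. very high
# Higher categories reoffend more often in this synthetic example
p = np.array([0.02, 0.05, 0.12, 0.25])[risk_category]
reoffended = rng.random(1000) < p

auc = roc_auc_score(reoffended, risk_category)
print("AUC = %.2f" % auc)   # values near .68-.80 correspond to moderate-to-large effects
```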
Provenance-Powered Automatic Workflow Generation and Composition
NASA Astrophysics Data System (ADS)
Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.
2015-12-01
In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is often at odds with a domain scientist's daily routine of conducting research and exploration. We hope to resolve this mismatch. Imagine this: An Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototyping system to realize the aforementioned vision, in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from user behavior and to support the adaptability and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure has been established to capture richer provenance and further facilitate collaboration in the science community. We have also established a Petri-net-based verification instrument for provenance-based automatic workflow generation and recommendation.
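A toy sketch of the core idea, not the authors' system: log each service invocation as a provenance record, then derive workflow edges by matching the outputs of one activity to the inputs of another. All names below are hypothetical.

```python
# Toy sketch (not the authors' system): log each service call as a provenance
# record, then chain calls whose outputs feed later inputs into a workflow graph.
from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    inputs: list = field(default_factory=list)
    outputs: list = field(default_factory=list)

log = [
    Activity("fetch_argo_temps", inputs=["argo_raw"], outputs=["argo_grid"]),
    Activity("fetch_amsre_sst", inputs=["amsre_raw"], outputs=["sst_grid"]),
    Activity("compute_anomaly", inputs=["argo_grid", "sst_grid"], outputs=["anomaly_map"]),
]

def derive_workflow(activities):
    """Link activity pairs where an output of one is an input of another."""
    edges = []
    for a in activities:
        for b in activities:
            if a is not b and set(a.outputs) & set(b.inputs):
                edges.append((a.name, b.name))
    return edges

print(derive_workflow(log))
# [('fetch_argo_temps', 'compute_anomaly'), ('fetch_amsre_sst', 'compute_anomaly')]
```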
SSOAP - A USEPA Toolbox for Sanitary Sewer Overflow Analysis and Control Planning - Presentation
The United States Environmental Protection Agency (USEPA) has identified a need to use proven methodologies to develop computer tools that help communities properly characterize rainfall-derived infiltration and inflow (RDII) into sanitary sewer systems and develop sanitary sewer...
Hydrologic Landscape Classification to Estimate Bristol Bay Watershed Hydrology
The use of hydrologic landscapes has proven to be a useful tool for broad scale assessment and classification of landscapes across the United States. These classification systems help organize larger geographical areas into areas of similar hydrologic characteristics based on cl...
DOT National Transportation Integrated Search
1999-01-01
The electronic cone penetrometer is a popular in situ investigation tool for site characterization. This research report describes the application of this proven concept of the cone penetration test (CPT) to highway design and construction control by...
Simple approach to sediment provenance tracing using element analysis and fundamental principles
NASA Astrophysics Data System (ADS)
Matys Grygar, Tomas; Elznicova, Jitka; Popelka, Jan
2016-04-01
Common sediment fingerprinting techniques use either (1) extensive analytical datasets, sometimes nearly complete with respect to accessible characterization techniques, which are processed by multidimensional statistics based on certain assumptions about the distribution functions of the analytical results and the conservativeness/additivity of some components, or (2) analytically demanding characteristics such as isotope ratios, assumed to be unequivocal "labels" of the parent material unaltered by any catchment process. The inherent problem of approach (1) is that the interpretation of statistical components ("sources") is done ex post and remains purely formal. The problem of approach (2) is that catchment processes (weathering, transport, deposition) can modify most geochemical parameters of soils and sediments; in other words, the idea that some geochemical parameters are "conservative" may be idealistic. Grain-size effects and sediment provenance have a joint influence on the chemical composition of fluvial sediments that is not easy to disentangle. Attempts to separate these two components using statistics alone seem risky and equivocal, because the grain-size dependence of element composition is nearly individual for each element and reflects sediment maturity and catchment-specific formation and transport processes. We suppose that the use of less extensive datasets of analytical results, interpreted with respect to fundamental principles, should be more robust than purely statistical tools applied to overwhelming datasets. We examined sediment compositions, both published by other researchers and gathered by us, and found some general principles which are, in our opinion, relevant for fingerprinting: (1) concentrations of all elements are grain-size sensitive, i.e. there are no "conservative" elements in the conventional sense of tracing provenance or transport pathways; (2) fractionation by catchment processes and fluvial transport changes element ratios in solids slightly but systematically; (3) the geochemistry and fates of the finest particles, newly formed by weathering and reactive during transport and storage in the fluvial system, differ from those of the parent material and its less mature, coarse weathering products; and (4) most inter-element ratios and some grain-size effects are non-linear, which undermines the assumption of additivity when mixing components. We are aware that we offer only a conceptual model and not a novel algorithm for quantifying sediment sources that could be tested in practical studies. On the other hand, we consider element fractionation by exogenic processes fascinating, as it is poorly described but relevant not only for provenance tracing but also for environmental geochemistry in general.
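The grain-size argument can be illustrated with invented numbers: absolute concentrations differ strongly between grain-size fractions, while some inter-element ratios shift much less, as in the hypothetical example below.

```python
# Hypothetical numbers, for illustration only: absolute concentrations change
# strongly between grain-size fractions, while some inter-element ratios shift less.
fine = {"Al": 9.0, "Ti": 0.55, "Zr": 180.0}     # e.g. <63 um fraction (invented values)
coarse = {"Al": 5.5, "Ti": 0.33, "Zr": 260.0}   # e.g. sand fraction (invented values)

for el in ("Al", "Ti", "Zr"):
    change = 100.0 * (fine[el] - coarse[el]) / coarse[el]
    print(f"{el}: concentration differs by {change:+.0f}% between fractions")

for num, den in (("Ti", "Al"), ("Zr", "Al")):
    r_fine, r_coarse = fine[num] / fine[den], coarse[num] / coarse[den]
    print(f"{num}/{den}: {r_fine:.3f} (fine) vs {r_coarse:.3f} (coarse)")
```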
Persistent identifiers for web service requests relying on a provenance ontology design pattern
NASA Astrophysics Data System (ADS)
Car, Nicholas; Wang, Jingbo; Wyborn, Lesley; Si, Wei
2016-04-01
Delivering provenance information for datasets produced from static inputs is relatively straightforward: we represent the processing actions and data flow using provenance ontologies and link to stored copies of the inputs held in repositories. If appropriate detail is given, the provenance information can then describe what actions have occurred (transparency) and enable reproducibility. When web service-generated data is used by a process to create a dataset instead of static inputs, we need more sophisticated provenance representations of the web service request, as we can no longer just link to data stored in a repository. A graph-based provenance representation, such as the W3C's PROV standard, can be used to model the web service request both as a single conceptual dataset and as a small workflow with a number of components within the same provenance report. This dual representation does more than just allow simplified or detailed views of a dataset's production to be used where appropriate. It also allows persistent identifiers to be assigned to instances of web service requests, thus enabling one form of dynamic data citation, and lets those identifiers resolve to whatever level of detail implementers think appropriate in order for the web service request to be reproduced. In this presentation we detail our reasoning in representing web service requests as small workflows. In outline, this stems from the idea that web service requests are perdurant things and, in order to most easily persist knowledge of them for provenance, we should represent them as a nexus of relationships between endurant things, such as datasets and knowledge of particular system types, as these endurant things are far easier to persist. We also describe the ontology design pattern that we use to represent workflows in general and how we apply it to different types of web service requests. We give examples of specific web service request instances made by systems at Australia's National Computational Infrastructure and show how one can 'click' through provenance interfaces to see the dual representations of the requests using the provenance management tooling we have built.
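A minimal sketch of the dual representation, using the Python `prov` package (a W3C PROV implementation); the namespace, identifiers and attributes are placeholders, not the authors' actual model.

```python
# A hedged sketch using the W3C PROV reference library for Python ("prov");
# the namespace, identifiers and attributes below are hypothetical placeholders.
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace("ex", "http://example.org/provenance/")

# View 1: the web service request result as a single conceptual dataset
subset = doc.entity("ex:dataset/wcs-request-42",
                    {"ex:persistentIdentifier": "ex:pid/12345"})

# View 2: the same request as a small workflow (activity plus its inputs)
request = doc.activity("ex:activity/wcs-request-42")
source = doc.entity("ex:dataset/source-coverage")
params = doc.entity("ex:entity/request-parameters")
doc.used(request, source)
doc.used(request, params)
doc.wasGeneratedBy(subset, request)

print(doc.get_provn())
```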
SU-E-T-103: Development and Implementation of Web Based Quality Control Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Studinski, R; Taylor, R; Angers, C
Purpose: Historically many radiation medicine programs have maintained their Quality Control (QC) test results in paper records or Microsoft Excel worksheets. Both these approaches represent significant logistical challenges, and are not predisposed to data review and approval. It has been our group's aim to develop and implement web-based software designed not just to record and store QC data in a centralized database, but to provide scheduling and data review tools to help manage a radiation therapy clinic's equipment quality control program. Methods: The software was written in the Python programming language using the Django web framework. In order to promote collaboration and validation from other centres the code was made open source and is freely available to the public via an online source code repository. The code was written to provide a common user interface for data entry, formalize the review and approval process, and offer automated data trending and process control analysis of test results. Results: As of February 2014, our installation of QAtrack+ has 180 tests defined in its database and has collected ∼22 000 test results, all of which have been reviewed and approved by a physicist via QATrack+'s review tools. These results include records for quality control of Elekta accelerators, CT simulators, our brachytherapy programme, TomoTherapy and Cyberknife units. Currently at least 5 other centres are known to be running QAtrack+ clinically, forming the start of an international user community. Conclusion: QAtrack+ has proven to be an effective tool for collecting radiation therapy QC data, allowing for rapid review and trending of data for a wide variety of treatment units. As free and open source software, all source code, documentation and a bug tracker are available to the public at https://bitbucket.org/tohccmedphys/qatrackplus/.
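The following is not QATrack+'s actual schema, only a hypothetical Django models.py sketch of the kind of centralized, reviewable QC record the abstract describes.

```python
# Not QATrack+'s actual schema -- a minimal, hypothetical Django models.py sketch
# of the kind of centralized QC record the abstract describes (value, tolerance, review).
from django.db import models


class TreatmentUnit(models.Model):
    name = models.CharField(max_length=100)


class QCTest(models.Model):
    name = models.CharField(max_length=100)
    tolerance_low = models.FloatField(null=True, blank=True)
    tolerance_high = models.FloatField(null=True, blank=True)


class QCTestResult(models.Model):
    test = models.ForeignKey(QCTest, on_delete=models.CASCADE)
    unit = models.ForeignKey(TreatmentUnit, on_delete=models.CASCADE)
    value = models.FloatField()
    performed_on = models.DateTimeField(auto_now_add=True)
    reviewed = models.BooleanField(default=False)       # physicist sign-off

    def within_tolerance(self):
        lo, hi = self.test.tolerance_low, self.test.tolerance_high
        return (lo is None or self.value >= lo) and (hi is None or self.value <= hi)
```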
LiPD and CSciBox: A Case Study in Why Data Standards are Important for Paleoscience
NASA Astrophysics Data System (ADS)
Weiss, I.; Bradley, E.; McKay, N.; Emile-Geay, J.; de Vesine, L. R.; Anderson, K. A.; White, J. W. C.; Marchitto, T. M., Jr.
2016-12-01
CSciBox [1] is an integrated software system that helps geoscientists build and evaluate age models. Its user chooses from a number of built-in analysis tools, composing them into an analysis workflow and applying it to paleoclimate proxy datasets. CSciBox employs modern database technology to store both the data and the analysis results in an easily accessible and searchable form, and offers the user access to the computational toolbox, the data, and the results via a graphical user interface and a sophisticated plotter. Standards are a staple of modern life, and underlie any form of automation. Without data standards, it is difficult, if not impossible, to construct effective computer tools for paleoscience analysis. The LiPD (Linked Paleo Data) framework [2] enables the storage of both data and metadata in systematic, meaningful, machine-readable ways. LiPD has been a primary enabler of CSciBox's goals of usability, interoperability, and reproducibility. Building LiPD capabilities into CSciBox's importer, for instance, eliminated the need to ask the user about file formats, variable names, relationships between columns in the input file, etc. Building LiPD capabilities into the exporter facilitated the storage of complete details about the input data (provenance, preprocessing steps, etc.) as well as full descriptions of any analyses that were performed using the CSciBox tool, along with citations to appropriate references. This comprehensive collection of data and metadata, which is all linked together in a semantically meaningful, machine-readable way, not only completely documents the analyses and makes them reproducible. It also enables interoperability with any other software system that employs the LiPD standard. [1] www.cs.colorado.edu/~lizb/cscience.html [2] McKay & Emile-Geay, Climate of the Past 12:1093 (2016)
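As a rough illustration of why the LiPD packaging matters, the sketch below opens a LiPD-style bundle (JSON metadata plus CSV data tables in one archive) with the standard library. The file layout and key names are assumptions; the official LiPD tooling and specification should be consulted for the authoritative structure.

```python
# A hedged sketch: LiPD bundles machine-readable JSON metadata with CSV data
# tables in one archive. The layout shown is an assumption, not the formal spec.
import json
import zipfile

def peek_lipd(path):
    with zipfile.ZipFile(path) as z:
        names = z.namelist()
        meta_name = next(n for n in names if n.endswith((".json", ".jsonld")))
        meta = json.loads(z.read(meta_name))
    # Keys below are illustrative; real LiPD records carry richer metadata.
    print("dataset:", meta.get("dataSetName"))
    print("archive:", meta.get("archiveType"))
    print("tables bundled:", [n for n in names if n.endswith(".csv")])

# peek_lipd("example.lpd")
```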
NASA Astrophysics Data System (ADS)
Hobley, Daniel E. J.; Adams, Jordan M.; Nudurupati, Sai Siddhartha; Hutton, Eric W. H.; Gasparini, Nicole M.; Istanbulluoglu, Erkan; Tucker, Gregory E.
2017-01-01
The ability to model surface processes and to couple them to both subsurface and atmospheric regimes has proven invaluable to research in the Earth and planetary sciences. However, creating a new model typically demands a very large investment of time, and modifying an existing model to address a new problem typically means the new work is constrained to its detriment by model adaptations for a different problem. Landlab is an open-source software framework explicitly designed to accelerate the development of new process models by providing (1) a set of tools and existing grid structures - including both regular and irregular grids - to make it faster and easier to develop new process components, or numerical implementations of physical processes; (2) a suite of stable, modular, and interoperable process components that can be combined to create an integrated model; and (3) a set of tools for data input, output, manipulation, and visualization. A set of example models built with these components is also provided. Landlab's structure makes it ideal not only for fully developed modelling applications but also for model prototyping and classroom use. Because of its modular nature, it can also act as a platform for model intercomparison and epistemic uncertainty and sensitivity analyses. Landlab exposes a standardized model interoperability interface, and is able to couple to third-party models and software. Landlab also offers tools to allow the creation of cellular automata, and allows native coupling of such models to more traditional continuous differential equation-based modules. We illustrate the principles of component coupling in Landlab using a model of landform evolution, a cellular ecohydrologic model, and a flood-wave routing model.
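A minimal example of Landlab's grid-plus-component pattern is sketched below; it assumes a recent Landlab release, and exact field and keyword names may differ between versions.

```python
# A minimal sketch of Landlab's grid + component pattern (assumes a recent
# Landlab release; exact field and keyword names can differ between versions).
import numpy as np
from landlab import RasterModelGrid
from landlab.components import LinearDiffuser

grid = RasterModelGrid((40, 60), xy_spacing=10.0)            # 40 x 60 nodes, 10 m spacing
z = grid.add_zeros("topographic__elevation", at="node")      # shared model field
z += np.random.rand(grid.number_of_nodes)                    # small initial roughness

diffuser = LinearDiffuser(grid, linear_diffusivity=0.01)     # hillslope diffusion component

for _ in range(200):                                          # advance 200 time steps
    diffuser.run_one_step(dt=100.0)

print("relief after run:", z.max() - z.min())
```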
Aziz, Michael
2015-01-01
Recent technological advances have made airway management safer. Because difficult intubation remains challenging to predict, having tools readily available that can be used to manage a difficult airway in any setting is critical. Fortunately, video technology has resulted in improvements for intubation performance while using laryngoscopy by various means. These technologies have been applied to rigid optical stylets, flexible intubation scopes, and, most notably, rigid laryngoscopes. These tools have proven effective for the anticipated difficult airway as well as the unanticipated difficult airway.
U.S. Initiatives to Promote Global Internet Freedom: Issues, Policy, and Technology
2010-03-17
video sharing sites, and other tools of today’s communications technology has proven to be an unprecedented and often disruptive force in some closed...using newer tools, such as blogs, social networks, video sharing sites, and other aspects of today’s communications technology to express their...Iranians sent VOA over 300 videos a day, along with thousands of still pictures, e-mails, and telephone calls to the agency.2 A variety of control
Time Lapse of World’s Largest 3-D Printed Object
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2016-08-29
Researchers at the MDF have 3D-printed a large-scale trim tool for a Boeing 777X, the world’s largest twin-engine jet airliner. The additively manufactured tool was printed on the Big Area Additive Manufacturing (BAAM) machine over a 30-hour period. The team used a thermoplastic pellet composed of 80% ABS plastic and 20% carbon fiber from a local material supplier. In preliminary testing the tool has proven to decrease the time, labor, cost and errors associated with traditional manufacturing techniques and to increase energy savings; it will undergo further long-term testing.
Infrared Time Lapse of World’s Largest 3D-Printed Object
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Researchers at Oak Ridge National Laboratory have 3D-printed a large-scale trim tool for a Boeing 777X, the world’s largest twin-engine jet airliner. The additively manufactured tool was printed on the Big Area Additive Manufacturing (BAAM) machine over a 30-hour period. The team used a thermoplastic pellet composed of 80% ABS plastic and 20% carbon fiber from a local material supplier. In preliminary testing the tool has proven to decrease the time, labor, cost and errors associated with traditional manufacturing techniques and to increase energy savings; it will undergo further long-term testing.
ERIC Educational Resources Information Center
Sivertsen, Mary Lewis, Comp.
These programs are available to school systems or other educational institutions for implementation in the classroom. Some programs may be able to offer consultant services and limited assistance with the training and materials associated with installing one of these programs in schools. Information about the National Diffusion Network (NDN) is…
ERIC Educational Resources Information Center
Lewis, Mary G., Comp.
This catalog contains descriptions of the science education programs and materials in the National Diffusion Network (NDN). These programs and materials are available to school systems or other educational institutions for implementation in their classrooms. Some programs may be able to offer consultant services and limited assistance with the…
ERIC Educational Resources Information Center
Dirks-Naylor, Amie J.; Griffiths, Carrie L.; Gibson, Jacob L.; Luu, Jacqueline A.
2016-01-01
Exercise training has proven to be beneficial in the prevention of disease. In addition, exercise can improve the pathogenesis and symptoms associated with a variety of chronic disease states and can attenuate drug-induced adverse effects. Exercise is a drug-free polypill. Because the benefits of exercise are clear and profound, Exercise is…
[Centers for bariatric medicine: why and how?]
Mégevand, Jean-Marie; Vignaux, Laurence; Maghdessian, Raffi; Pralong, François
2018-03-21
Obesity is a chronic, complex and relapsing disease. Because of this complexity, the work-up and follow-up of affected patients involve different specialists working in synergy to diagnose and treat obesity and its complications. This follow-up is specialized and should be available in integrated centers of bariatric medicine offering all treatment modalities with proven efficacy, whether medical, surgical or psychotherapeutic.
No Shelf Required 2: Use and Management of Electronic Books
ERIC Educational Resources Information Center
Polanka, Sue, Ed.
2012-01-01
With their explosive sales and widespread availability over the past few years, e-books have definitively proven that they're here to stay. In this sequel to her best-selling book of the same title, expert Polanka dives even deeper into the world of digital distribution. Contributors from across the e-book world offer their perspectives on what's…
ERIC Educational Resources Information Center
van den Hoogen, Suzanne; Parrott, Denise
2012-01-01
Partnerships and collaborations among libraries are proven to enhance collective resources. The collaboration of multi-type libraries offers a unique opportunity to explore the potential of different libraries working together to provide the best possible service to their community members. This article provides a detailed report of a multi-type…
Development of acoustic emission evaluation method for repaired prestressed concrete bridge girders.
DOT National Transportation Integrated Search
2011-06-01
Acoustic emission (AE) monitoring has proven to be a useful nondestructive testing tool in ordinary reinforced concrete beams. Over the past decade, however, the technique has also been used to test other concrete structures. It has been seen that ac...
RTI in Middle School Classrooms: Proven Tools and Strategies
ERIC Educational Resources Information Center
Esteves, Kelli J.; Whitten, Elizabeth
2014-01-01
"RTI in Middle School Classrooms" provides practical, research-based instructional techniques and interventions--geared especially to middle school teachers and administrators--that target and address specific needs of individual students. Response to intervention allows educators to assess and meet the needs of struggling students…
Using Technology To Bring Abstract Concepts into Focus: A Programming Case Study.
ERIC Educational Resources Information Center
Crews, Thad; Butterfield, Jeff
2002-01-01
Discusses the three-step implementation of an instructional technology tool and associated pedagogy to support teaching and learning computer programming concepts. The Flowchart Interpreter (FLINT) was proven through experiments to support novice programmers better than the traditional textbook approach. (EV)
Microarray technology has proven to be a useful tool for analyzing the transcriptome of various organisms representing conditions such as disease states, developmental stages, and responses to chemical exposure. Although most commercially available arrays are limited to organism...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowen, Benjamin; Ruebel, Oliver; Fischer, Curt Fischer R.
BASTet is an advanced software library written in Python. BASTet serves as the analysis and storage library for the OpenMSI project. BASTet is an integrated framework for: i) storage of spectral imaging data, ii) storage of derived analysis data, iii) provenance of analyses, and iv) integration and execution of analyses via complex workflows. BASTet implements the API for the HDF5 storage format used by OpenMSI. Analyses developed using BASTet benefit from direct integration with the storage format, automatic tracking of provenance, and direct integration with command-line and workflow execution tools. BASTet also defines interfaces that enable developers to integrate their analyses directly with OpenMSI's web-based viewing infrastructure without having to know the OpenMSI internals. BASTet also provides numerous helper classes and tools to assist with the conversion of data files, ease parallel implementation of analysis algorithms, ease interaction with web-based functions, and describe methods for data reduction. BASTet also includes detailed developer documentation, user tutorials, iPython notebooks, and other supporting documents.
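The sketch below is not BASTet's API; it uses plain h5py to illustrate the general pattern the abstract describes: raw spectral-imaging data, a derived analysis, and its provenance stored side by side in one HDF5 file.

```python
# Not BASTet's actual API -- a generic h5py sketch of the pattern described:
# raw spectral-imaging data, a derived analysis, and its provenance in one file.
import h5py
import numpy as np

raw = np.random.rand(50, 60, 1000)                 # hypothetical x * y * m/z cube

with h5py.File("msi_example.h5", "w") as f:
    f.create_dataset("entry_0/data", data=raw, compression="gzip")

    reduced = raw.mean(axis=2)                     # a toy "analysis": mean-intensity image
    ana = f.create_group("entry_0/analysis_0")
    ana.create_dataset("output", data=reduced)

    # Provenance recorded alongside the derived data
    ana.attrs["analysis_type"] = "mean_intensity_image"
    ana.attrs["input_dataset"] = "entry_0/data"
    ana.attrs["parameters"] = "axis=2"
```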
Proven and novel strategies for efficient editing of the human genome.
Mussolino, Claudio; Mlambo, Tafadzwa; Cathomen, Toni
2015-10-01
Targeted gene editing with designer nucleases has become increasingly popular. The most commonly used designer nuclease platforms are engineered meganucleases, zinc-finger nucleases, transcription activator-like effector nucleases and the clustered regularly interspaced short palindromic repeat/Cas9 system. These powerful tools have greatly facilitated the generation of plant and animal models for basic research, and harbor an enormous potential for applications in biotechnology and gene therapy. This review recapitulates proven concepts of targeted genome engineering in primary human cells and elaborates on novel concepts that became possible with the dawn of RNA-guided nucleases and RNA-guided transcription factors. Copyright © 2015 Elsevier Ltd. All rights reserved.
Morey, G.B.; Setterholm, D.R.
1997-01-01
The relative abundance of rare earth elements in sediments has been suggested as a tool for determining their source rocks. This correlation requires that weathering, erosion, and sedimentation do not alter the REE abundances, or do so in a predictable manner. We find that the rare earth elements are mobilized and fractionated by weathering, and that sediments derived from the weathered materials can display modifications of the original rare earth element pattern due to grain-size sorting of the weathered material. However, the REE distribution pattern of the provenance terrane can be recognized in the sediments.
Space Flight Operations Center local area network
NASA Technical Reports Server (NTRS)
Goodman, Ross V.
1988-01-01
The existing Mission Control and Computer Center at JPL will be replaced by the Space Flight Operations Center (SFOC). One part of the SFOC is the LAN-based distribution system. The purpose of the LAN is to distribute the processed data among the various elements of the SFOC. The SFOC LAN will provide a robust subsystem that will support the Magellan launch configuration and future project adaptation. Its capabilities include (1) a proven cable medium as the backbone for the entire network; (2) hardware components that are reliable, varied, and follow OSI standards; (3) accurate and detailed documentation for fault isolation and future expansion; and (4) proven monitoring and maintenance tools.
Comparison of BrainTool to other UML modeling and model transformation tools
NASA Astrophysics Data System (ADS)
Nikiforova, Oksana; Gusarovs, Konstantins
2017-07-01
In the last 30 years, numerous model-generated software systems have been offered, targeting problems with development productivity and the resulting software quality. CASE tools developed to date are advertised as having "complete code-generation capabilities". Nowadays the Object Management Group (OMG) makes similar arguments regarding Unified Modeling Language (UML) models at different levels of abstraction. It is claimed that software development automation using CASE tools enables a significant level of automation. Today's CASE tools usually offer a combination of several features, starting with a model editor and a model repository for traditional tools, and ending, for the most advanced ones, with a code generator (which may use a scripting or domain-specific language (DSL)), a transformation tool to produce new artifacts from manually created ones, and a transformation definition editor to define new transformations. The present paper contains the results of a comparison of CASE tools (mainly UML editors) against the level of automation they offer.
Neic, Aurel; Campos, Fernando O; Prassl, Anton J; Niederer, Steven A; Bishop, Martin J; Vigmond, Edward J; Plank, Gernot
2017-10-01
Anatomically accurate and biophysically detailed bidomain models of the human heart have proven a powerful tool for gaining quantitative insight into the links between electrical sources in the myocardium and the concomitant current flow in the surrounding medium as they represent their relationship mechanistically based on first principles. Such models are increasingly considered as a clinical research tool with the perspective of being used, ultimately, as a complementary diagnostic modality. An important prerequisite in many clinical modeling applications is the ability of models to faithfully replicate potential maps and electrograms recorded from a given patient. However, while the personalization of electrophysiology models based on the gold standard bidomain formulation is in principle feasible, the associated computational expenses are significant, rendering their use incompatible with clinical time frames. In this study we report on the development of a novel computationally efficient reaction-eikonal (R-E) model for modeling extracellular potential maps and electrograms. Using a biventricular human electrophysiology model, which incorporates a topologically realistic His-Purkinje system (HPS), we demonstrate by comparing against a high-resolution reaction-diffusion (R-D) bidomain model that the R-E model predicts extracellular potential fields, electrograms as well as ECGs at the body surface with high fidelity and offers vast computational savings greater than three orders of magnitude. Due to their efficiency R-E models are ideally suitable for forward simulations in clinical modeling studies which attempt to personalize electrophysiological model features.
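For orientation, eikonal-based activation models are generally built around an anisotropic eikonal equation of the following form (a generic statement, not necessarily the exact formulation used in this paper), where V(x) is a tensor of squared conduction velocities along and across the fiber directions:

```latex
% Generic anisotropic eikonal equation for the activation time t_a(x), as
% commonly used in cardiac electrophysiology; not necessarily the exact
% formulation of the reaction-eikonal model above.
\[
  \sqrt{\nabla t_a^{\top}\, \mathbf{V}(\mathbf{x})\, \nabla t_a} \;=\; 1 ,
  \qquad t_a = t_0 \ \text{at the stimulus sites.}
\]
```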
Building Alliances Series: Workforce Development
ERIC Educational Resources Information Center
Brady, Cecilia
2009-01-01
Public-private partnerships done right are a powerful tool for development, providing enduring solutions to some of the greatest challenges. To help familiarize readers with the art of alliance building, the Global Development Alliance (GDA) office has created a series of practical guides that highlight proven practices in partnerships,…
Measuring STEM Students' Mathematical Identities
ERIC Educational Resources Information Center
Kaspersen, Eivind; Pepin, Birgit; Sikko, Svein Arne
2017-01-01
Studies on identity in general and mathematical identity in particular have gained much interest over the last decades. However, although measurements have been proven to be potent tools in many scientific fields, a lack of consensus on ontological, epistemological, and methodological issues has complicated measurements of mathematical identities.…
Open Science and the Monitoring of Aquatic Ecosystems
Open science represents both a philosophy and a set of tools that can be leveraged for more effective scientific analysis. At the core of the open science movement is the concept that research should be reproducible and transparent, in addition to having long-term provenance thro...
76 FR 159 - Discretionary Grant Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-03
... detection of iron deficiency, another pediatric health issue. Proficiency testing (PT) is a proven method... monthly PT and other lab quality improvement tools to nearly 600 laboratories across the U.S. and beyond... Competition: The participation of large numbers of these labs in voluntary proficiency was by design, and...
Meijón, Mónica; Feito, Isabel; Oravec, Michal; Delatorre, Carolina; Weckwerth, Wolfram; Majada, Juan; Valledor, Luis
2016-02-01
Natural variation of the metabolome of Pinus pinaster was studied to improve understanding of its role in the adaptation process and phenotypic diversity. The metabolomes of needles and of the apical and basal sections of buds were analysed in ten provenances of P. pinaster, selected from France, Spain and Morocco and grown in a common garden for 5 years. The employment of complementary mass spectrometry techniques (GC-MS and LC-Orbitrap-MS) together with bioinformatics tools allowed the reliable quantification of 2403 molecular masses. The analysis of the metabolome showed that differences were maintained across provenances and that the metabolites characteristic of each organ are mainly related to amino acid metabolism, while provenances were distinguishable essentially through secondary metabolism when organs were analysed independently. Integrative analyses of metabolome, environmental and growth data provided a comprehensive picture of adaptation plasticity in conifers. These analyses defined two major groups of plants, distinguished by secondary metabolism, corresponding to Atlantic and Mediterranean provenances. Needles were the most sensitive organ, where strong correlations were found between flavonoids and the water regime at the geographic origin of the provenance. The data obtained point to genome specialization aimed at maximizing the drought stress resistance of trees depending on their origin. © 2016 John Wiley & Sons Ltd.
Hsieh, Chang-tseh; Lin, Binshan
2011-01-01
The utilisation of IS/IT could offer a substantial competitive advantage to healthcare service providers through the realisation of improved clinical, financial, and administrative outcomes. In this study, 42 journal articles were reviewed and summarised with respect to the identified benefits and challenges of developing and implementing electronic medical records, tele-health, and electronic appointment reminders. The results of this study help build the knowledge foundation for behavioural healthcare management, showing how to apply state-of-the-art information technology to offer higher-quality, clinically proven effective services at lower costs than those of competitors.
Providing traceability for neuroimaging analyses.
McClatchey, Richard; Branson, Andrew; Anjum, Ashiq; Bloodsworth, Peter; Habib, Irfan; Munir, Kamran; Shamdasani, Jetendr; Soomro, Kamran
2013-09-01
With the increasingly digital nature of biomedical data and as the complexity of analyses in medical research increases, the need for accurate information capture, traceability and accessibility has become crucial to medical researchers in the pursuance of their research goals. Grid- or Cloud-based technologies, often based on so-called Service Oriented Architectures (SOA), are increasingly being seen as viable solutions for managing distributed data and algorithms in the bio-medical domain. For neuroscientific analyses, especially those centred on complex image analysis, traceability of processes and datasets is essential but up to now this has not been captured in a manner that facilitates collaborative study. Few examples exist of deployed medical systems based on Grids that provide the traceability of research data needed to facilitate complex analyses, and none have been evaluated in practice. Over the past decade, we have been working with mammographers, paediatricians and neuroscientists in three generations of projects to provide the data management and provenance services now required for 21st century medical research. This paper outlines the findings of a requirements study and a resulting system architecture for the production of services to support neuroscientific studies of biomarkers for Alzheimer's disease. The paper proposes a software infrastructure and services that provide the foundation for such support. It introduces the use of the CRISTAL software to provide provenance management as one of a number of services delivered on a SOA, deployed to manage neuroimaging projects that have been studying biomarkers for Alzheimer's disease. In the neuGRID and N4U projects a Provenance Service has been delivered that captures and reconstructs the workflow information needed to facilitate researchers in conducting neuroimaging analyses. The software enables neuroscientists to track the evolution of workflows and datasets. It also tracks the outcomes of various analyses and provides provenance traceability throughout the lifecycle of their studies. As the Provenance Service has been designed to be generic, it can be applied across the medical domain as a reusable tool for supporting medical researchers, thus providing communities of researchers for the first time with the necessary tools to conduct widely distributed collaborative programmes of medical analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Benchmarking of software tools for optical proximity correction
NASA Astrophysics Data System (ADS)
Jungmann, Angelika; Thiele, Joerg; Friedrich, Christoph M.; Pforr, Rainer; Maurer, Wilhelm
1998-06-01
The point when optical proximity correction (OPC) becomes a routine procedure for every design is not far away. For such daily use, the requirements for an OPC tool go far beyond the principal functionality of OPC, which has been demonstrated by a number of approaches and is well documented in the literature. In this paper we first discuss the requirements for a productive OPC tool. Against these requirements, a benchmark was performed with three different OPC tools available on the market (OPRX from TVT, OPTISSIMO from aiss and PROTEUS from TMA). Each of these tools uses a different approach to perform the correction (rules, simulation or model). To assess the accuracy of the correction, a test chip was fabricated which contains corrections done by each software tool. The advantages and weaknesses of the several solutions are discussed.
NASA Astrophysics Data System (ADS)
Boden, T. A.; Krassovski, M.; Yang, B.
2013-06-01
The Carbon Dioxide Information Analysis Center (CDIAC) at Oak Ridge National Laboratory (ORNL), USA has provided scientific data management support for the US Department of Energy and international climate change science since 1982. Among the many data archived and available from CDIAC are collections from long-term measurement projects. One current example is the AmeriFlux measurement network. AmeriFlux provides continuous measurements from forests, grasslands, wetlands, and croplands in North, Central, and South America and offers important insight about carbon cycling in terrestrial ecosystems. To successfully manage AmeriFlux data and support climate change research, CDIAC has designed flexible data systems using proven technologies and standards blended with new, evolving technologies and standards. The AmeriFlux data system, comprised primarily of a relational database, a PHP-based data interface and a FTP server, offers a broad suite of AmeriFlux data. The data interface allows users to query the AmeriFlux collection in a variety of ways and then subset, visualize and download the data. From the perspective of data stewardship, on the other hand, this system is designed for CDIAC to easily control database content, automate data movement, track data provenance, manage metadata content, and handle frequent additions and corrections. CDIAC and researchers in the flux community developed data submission guidelines to enhance the AmeriFlux data collection, enable automated data processing, and promote standardization across regional networks. Both continuous flux and meteorological data and irregular biological data collected at AmeriFlux sites are carefully scrutinized by CDIAC using established quality-control algorithms before the data are ingested into the AmeriFlux data system. Other tasks at CDIAC include reformatting and standardizing the diverse and heterogeneous datasets received from individual sites into a uniform and consistent network database, generating high-level derived products to meet the current demands from a broad user group, and developing new products in anticipation of future needs. In this paper, we share our approaches to meet the challenges of standardizing, archiving and delivering quality, well-documented AmeriFlux data worldwide to benefit others with similar challenges of handling diverse climate change data, to further heighten awareness and use of an outstanding ecological data resource, and to highlight expanded software engineering applications being used for climate change measurement data.
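The relational storage and provenance tracking described above can be pictured with a small, hypothetical schema; the sqlite3 sketch below is illustrative only and does not reflect CDIAC's actual database design.

```python
# Not CDIAC's actual schema -- a hypothetical sqlite3 sketch of the pattern the
# abstract describes: one network table of flux records plus provenance fields.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE flux_record (
        site_id        TEXT,
        timestamp_utc  TEXT,
        variable       TEXT,      -- e.g. NEE, LE, H
        value          REAL,
        qc_flag        INTEGER,   -- result of quality-control screening
        source_file    TEXT,      -- provenance: original site submission
        version        INTEGER    -- provenance: correction/revision number
    )
""")
con.execute(
    "INSERT INTO flux_record VALUES (?, ?, ?, ?, ?, ?, ?)",
    ("US-Xyz", "2012-07-01T00:30Z", "NEE", -4.2, 0, "US-Xyz_2012_v1.csv", 1),
)
rows = con.execute(
    "SELECT variable, value FROM flux_record WHERE site_id = ? AND qc_flag = 0",
    ("US-Xyz",),
).fetchall()
print(rows)
```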
ERIC Educational Resources Information Center
West, Martin R.; Peterson, Paul E.; Barrows, Samuel
2017-01-01
Over the past 25 years, charter schools have offered an increasing number of families an alternative to their local district schools. The charter option has proven particularly popular in large cities, but charter-school growth is often constrained by state laws that limit the number of students the sector can serve. The charter sector is the most…
ERIC Educational Resources Information Center
Dyck, Bruno
2013-01-01
Widespread agreement suggests that it is appropriate and desirable to develop and teach business theory and practice consistent with Catholic social teaching (CST) in Catholic business schools. Such a curriculum would cover the same mainstream material taught in other business schools, but then offer a CST approach to business that can be…
ERIC Educational Resources Information Center
Wang, Weina; Dermody, Kelly; Burgess, Colleen; Wang, Fangmin
2014-01-01
Technology lending has proven to be one of the most popular services that the Ryerson University Library and Archives (RULA) has offered in the past few years. Given the number of commuting digital natives comprising our student body, the library wanted to know how these students were using our current laptop loan program and how this service…
Strategic Planning and Management in Defense Systems Acquisition
2014-04-30
Strategic planning, according to Dr. John Bryson (2010), offers many benefits to public-sector organizations: Promotes strategic thinking, acting, and...planning in particular—and has proven its value (Barzelay & Campbell, 2003; Berman & West, 1998; Berry & Wechsler, 1995; Boyne & Gould-Williams, 2003...national survey. Public Administration Review, 55(2), 159–68. Boyne , G. A., & Gould-Williams, J. (2003). Planning and performance in public organizations
Strategic Planning and Management in Defense Systems Acquisition
2013-10-01
to Dr. John Bryson (2010), offers many benefits to public-sector organizations: • Promotes strategic thinking, acting, and learning; • Improves...and with strategic planning in particular—and has proven its value (Barzelay & Campbell, 2003; Berman & West, 1998; Berry & Wechsler, 1995; Boyne ...national survey. Public Administration Review, 55(2), 159–68. Boyne , G. A., & Gould-Williams, J. (2003). Planning and performance in public organizations
ERIC Educational Resources Information Center
National Center for Homeless Education at SERVE, 2010
2010-01-01
The National Center for Homeless Education and the Legal Center for Foster Care and Education present this brief to help educators and child welfare advocates work together to support the academic success of children and youth in out-of-home care. The brief offers practical, proven strategies for implementing two federal laws collaboratively: The…
Alexander, Robert W; Harrell, David B
2013-01-01
Objectives Provide background for use of acquiring autologous adipose tissue as a tissue graft and source of adult progenitor cells for use in cosmetic plastic surgery. Discuss the background and mechanisms of action of closed syringe vacuum lipoaspiration, with emphasis on accessing adipose-derived mesenchymal/stromal cells and the stromal vascular fraction (SVF) for use in aesthetic, structural reconstruction and regenerative applications. Explain a proven protocol for acquiring high-quality autologous fat grafts (AFG) with use of disposable, microcannula systems. Design Explain the components and advantage of use of the patented super luer-lock and microcannulas system for use with the closed-syringe system. A sequential explanation of equipment selection for minimally traumatic lipoaspiration in small volumes is presented, including use of blunt injection cannulas to reduce risk of embolism. Results Thousands of AFG have proven safe and efficacious for lipoaspiration techniques for large and small structural fat grafting procedures. The importance and advantages of gentle harvesting of the adipose tissue complex has become very clear in the past 5 years. The closed-syringe system offers a minimally invasive, gentle system with which to mobilize subdermal fat tissues in a suspension form. Resulting total nuclear counting of undifferentiated cells of the adipose-derived -SVF suggests that the yield achieved is better than use of always-on, constant mechanical pump applied vacuum systems. Conclusion Use of a closed-syringe lipoaspiration system featuring disposable microcannulas offers a safe and effective means of harvesting small volumes of nonmanipulated adipose tissues and its accompanying progenitor cells within the SVF. Closed syringes and microcannulas are available as safe, sterile, disposable, compact systems for acquiring high-quality AFG. Presented is a detailed, step-by-step, proven protocol for performing quality autologous structural adipose transplantation. PMID:23630430
The School Buddy System: The Practice of Collaboration.
ERIC Educational Resources Information Center
Bush, Gail
This book explains how to create a collaborative learning environment involving librarians, teachers, administrators, and all team players in K-12 education. Building on existing educational standards, the book features such proven tools as a ready-to-use framework for establishing a collaborative relationship, 40 discussion prompts to help…
DOT National Transportation Integrated Search
2014-03-01
Recent research in highway safety has focused on the more advanced and statistically proven techniques of highway safety analysis. This project focuses on the two most recent safety analysis tools, the Highway Safety Manual (HSM) and SafetyAnalys...
Learning to Write and Loving It! Preschool-Kindergarten
ERIC Educational Resources Information Center
Trehearne, Miriam P.
2011-01-01
"Learning to Write and Loving It!" equips teachers of young children with practical strategies, assessment tools, and motivating writing activities that are based on current research and proven practice and are easily applicable to all kinds of learning environments. Included are many authentic writing samples and photos to illustrate effective,…
USDA-ARS?s Scientific Manuscript database
Pheromone-based mating disruption has proven to be a powerful pest management tool in many cropping systems, helping to reduce reliance on insecticide applications. However, a sustainable mating disruption program has not yet been developed for cranberries. In the cranberry system, two of the major ...
Jessica Wright
2014-01-01
Combining data from provenance test studies with our current understanding of predicted climate change can be a powerful tool for informing reforestation efforts. However, the limitations of both sources of data need to be understood to develop an approach to ecological restoration that reduces risk and promotes the highest chance of successful reforestation.
EVALUATION OF MEMBRANE TYPE FOR USE IN DIFFUSION SAMPLERS TO MONITOR GROUND WATER QUALITY
The Discrete Multi-Level Sampler (DMLS®) system has proven to be a useful tool for obtaining discrete interval contaminant concentrations at hazardous waste sites. The DMLS® utilizes dialysis cells, which consist of a polypropylene vial, covered on both ends by a permeable membr...
Provides an overview of assessing IAQ by downloading EPA's School Assessment Mobile App, a “one-stop shop” for accessing guidance from EPA's IAQ Tools for Schools Action Kit with proven strategies for addressing important IAQ issues.
Recruitment. Getting Customers for Employment and Training Programs.
ERIC Educational Resources Information Center
Newton, Greg
This workbook presents the essential principles of successful marketing and applies the proven strategies used by the private sector to attract customers for their products to the recruitment of clients for employment and training programs. It also provides the tools and how-to's to develop recruitment strategies. Informative materials, lists of…
Early Violence Prevention: Tools for Teachers of Young Children.
ERIC Educational Resources Information Center
Slaby, Ronald G.; And Others
Based on the latest knowledge about early violence prevention and effective teaching strategies, this book describes practical ways for early childhood educators to handle children's aggression and shows how to help children become assertive, nonviolent problem solvers. The book's repertoire of proven approaches includes teaching children how to…
Factors Enabling the Use of Technology in Subject Teaching
ERIC Educational Resources Information Center
Cubukcuoglu, Begum
2013-01-01
Many research studies have shown information and communication technologies to be an effective way of supporting the teaching and learning process. Although many teachers do not use new technologies as instructional tools, some are integrating information and communication technologies…
76 FR 5501 - Request for Comments: Review and Improvement of EDA's Regulations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-01
... development and growth of RICs as proven economic development tools through which American regions can create... drivers of regional economic growth, such as local universities, government research centers, and/or other... higher education (including community colleges), and other public and private agencies and institutions...
Sky online: linking amateur and professional astronomers on the world wide web
NASA Astrophysics Data System (ADS)
Fienberg, Richard Tresch
SKY Online is the World Wide Web site of Sky Publishing Corporation, publisher of Sky & Telescope magazine. Conceived mainly as an electronic extension of the company's marketing and promotion efforts, SKY Online has also proven to be a useful tool for communication between amateur and professional astronomers.
Progress and Plans in Support of the Polar Community
NASA Technical Reports Server (NTRS)
Olsen, Lola M.; Meaux, Melanie F.
2006-01-01
Feedback provided by the Antarctic community has proven instrumental in positively influencing the direction of the GCMD's development. For example, in response to requests for a stand alone metadata authoring tool, a new shareable software package called docBUILDER solo will be released to the public in March 2006. This tool permits researchers to document their data during experiments and observational periods in the field. The international polar community has also played a key role in encouraging support for the foreign language character set in the metadata display and tools (10% of the records in the AMD hold foreign characters). In the upcoming release, the full ISO character set, which also includes mathematical symbols, will be supported. Additional upgrades include the ability for users to search for data sets based on pre-selected temporal and spatial resolution ranges. Data providers are strongly encouraged to populate the resolution fields for their data sets, although these fields are not currently required. In prior versions, browser incompatibilities often resulted in unreliable performance for users attempting to initiate a spatial search using a map based on Java applet technology. The GCMD will offer an integrated Google map and date search, replacing the applet technology and enhancing the geospatial and temporal searches. It is estimated that 30% of the records in the AMD have direct access to data. A growing number of these records can be accessed through data service links. Related data services are therefore becoming valuable assets in facilitating the use and visualization of data. Users will gain the ability to refine services using the same options as those available for data set searches. Data providers are encouraged to describe available data-related services through the directory. Future plans include offering web services through a SOAP interface and extending semantic queries for the polar regions through the use of ontologies. The Open Archives Initiative's (OAI) Protocol for Metadata Harvesting (PMH) has been successfully tested with several organizations and appears to be a prime candidate for sharing metadata within the community. The GCMD anticipates contributing to the design of the data management system for the International Polar Year and to the ongoing efforts in the years to come. Further enhancements will be discussed at the meeting.
High performance multi-spectral interrogation for surface plasmon resonance imaging sensors.
Sereda, A; Moreau, J; Canva, M; Maillart, E
2014-04-15
Surface plasmon resonance (SPR) sensing has proven to be a valuable tool in the field of surface interaction characterization, especially for biomedical applications where label-free techniques are of particular interest. In order to approach the theoretical resolution limit, most SPR-based systems have turned to either angular or spectral interrogation modes, which both offer very accurate real-time measurements, but at the expense of the 2-dimensional imaging capability, thereby decreasing the data throughput. In this article, we show numerically and experimentally how to combine the multi-spectral interrogation technique with 2D imaging, while finding an optimum in terms of resolution, accuracy, acquisition speed and reduction in data dispersion with respect to the classical reflectivity interrogation mode. This multi-spectral interrogation methodology is based on a robust five-parameter fitting of the spectral reflectivity curve, which enables monitoring of the reflectivity spectral shift with a resolution of the order of ten picometers, using only five wavelength measurements per point. Ultimately, such a multi-spectral plasmonic imaging system allows biomolecular interactions to be monitored in a linear regime, independently of variations in the buffer optical index, as illustrated on a DNA-DNA model case. © 2013 Elsevier B.V. All rights reserved.
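As an illustration of the kind of few-parameter dip fitting described above, the following Python sketch fits a five-parameter model (linear baseline plus a Lorentzian dip) to a synthetic reflectivity spectrum and reports the fitted dip position. The functional form, wavelengths, and noise level are assumptions; the authors' actual five-parameter model may differ.

import numpy as np
from scipy.optimize import curve_fit

def reflectivity(lam, a, b, amp, lam0, w):
    """Linear baseline minus a Lorentzian dip centered at lam0 (nm)."""
    return a + b * lam - amp * w**2 / ((lam - lam0)**2 + w**2)

# Synthetic spectral measurements around an assumed resonance near 780 nm.
lam = np.linspace(760, 800, 11)
true_params = (0.9, -1e-4, 0.55, 781.3, 6.0)
meas = reflectivity(lam, *true_params) + np.random.normal(0, 0.002, lam.size)

p0 = (0.9, 0.0, 0.5, 780.0, 5.0)              # rough initial guess
popt, _ = curve_fit(reflectivity, lam, meas, p0=p0)
print(f"fitted dip position: {popt[3]:.3f} nm")  # shifts track surface binding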
Investigational Notch and Hedgehog Inhibitors – Therapies for Cardiovascular disease
Redmond, EM; Guha, S; Walls, D; Cahill, PA
2011-01-01
Importance to the field During the past decade a variety of Notch and Hedgehog pathway inhibitors have been developed for the treatment of several cancers. An emerging paradigm suggests that these same gene regulatory networks are often recapitulated in the context of cardiovascular disease and may now offer an attractive target for therapeutic intervention. Areas Covered This article briefly reviews the profile of Notch and Hedgehog inhibitors that have reached preclinical and clinical testing for cancer treatment and discusses the clinical issues surrounding targeted use of these inhibitors in the treatment of vascular disorders. Expert Opinion Pre-clinical and clinical data using pan-Notch inhibitors (γ-secretase inhibitors) and selective antibodies to preferentially target Notch receptors and ligands have proven successful, but concerns remain over normal organ homeostasis and significant pathology in multiple organs. In contrast, the Hedgehog-based drug pipeline is rich, with more than a dozen Smoothened (SMO) inhibitors at various stages of development. Overall, refined strategies will be necessary to harness these pathways safely as a powerful tool to disrupt angiogenesis and vascular proliferative phenomena without causing the prohibitive side effects already seen in cancer models and patients. PMID:22007748
High-power disk lasers: advances and applications
NASA Astrophysics Data System (ADS)
Havrilla, David; Ryba, Tracey; Holzer, Marco
2012-03-01
Though the genesis of the disk laser concept dates to the early 1990s, the disk laser continues to demonstrate the flexibility and the certain future of a breakthrough technology. On-going increases in power per disk and improvements in beam quality and efficiency continue to validate the genius of the disk laser concept. As of today, the disk principle has not reached any fundamental limits regarding output power per disk or beam quality, and offers numerous advantages over other high-power resonator concepts, especially over monolithic architectures. With about 2,000 high-power disk laser installations and a demand upwards of 1,000 lasers per year, the disk laser has proven to be a robust and reliable industrial tool. With advancements in running cost, investment cost and footprint, manufacturers continue to implement disk laser technology with more vigor than ever. This paper will explain recent advances in disk laser technology and process-relevant features of the laser, such as pump diode arrangement, resonator design and integrated beam guidance. In addition, advances in thick-sheet applications and very cost-efficient, high-productivity applications such as remote welding, remote cutting and cutting of thin sheets will be discussed.
Stem Cells as a Tool for Breast Imaging
Padín-Iruegas, Maria Elena; López López, Rafael
2012-01-01
Stem cells are a scientific field of interest due to their therapeutic potential. They fall into different groups depending on their differentiation state. Individual stem cells can be found on their own, but they generally reside in niches. Stem cells do not survive forever; they are affected by senescence. Cancer stem cells are best defined functionally, as a subpopulation of tumor cells that is enriched for tumorigenic properties and can regenerate the heterogeneity of the original tumor. Circulating tumor cells are cells that have detached from a primary tumor and circulate in the bloodstream. They may constitute seeds for subsequent growth of additional tumors (metastasis) in different tissues. Advances in molecular imaging have allowed a deeper understanding of the in vivo behavior of stem cells and have proven to be indispensable in preclinical and clinical studies. One of the first imaging modalities for monitoring pluripotent stem cells in vivo, magnetic resonance imaging (MRI) offers high spatial and temporal resolution to obtain detailed morphological and functional information. Advantages of radioscintigraphic techniques include their picomolar sensitivity, good tissue penetration, and translation to clinical applications. Radionuclide imaging is the sole direct labeling technique used thus far in human studies, involving both autologous bone marrow-derived and peripheral stem cells. PMID:22848220
Batur, Fulya; Dedeurwaerdere, Tom
2014-12-01
Focused on the impact of stringent intellectual property mechanisms on the uses of plant agricultural biodiversity in crop improvement, the article delves into a systematic analysis of the relationship between institutional paradigms and their technological contexts of application, identified as mass selection, controlled hybridisation, molecular breeding tools and transgenics. While the strong property paradigm has proven effective in the context of major leaps forward in genetic engineering, it faces a systematic breakdown when extended to mass selection, where innovation often displays a collective nature. However, it also creates partial blockages in those innovation schemes situated between on-farm observation and genetic modification, i.e. conventional plant breeding and upstream molecular biology research tools. Neither overly strong intellectual property rights nor the absence of well-delineated protection has proven an optimal fit for these two intermediary socio-technological systems of cumulative incremental innovation. To address these challenges, the authors look at appropriate institutional alternatives which can create effective incentives for in situ agrobiodiversity conservation and the equitable distribution of technologies in plant improvement, using the flexibilities of the TRIPS Agreement, the liability rules set forth in patents or plant variety rights themselves (in the form of farmers', breeders' and research exceptions), and other ad hoc reward regimes.
Vakrman, Tomas; Kristoufek, Ladislav
2015-01-01
Online activity of Internet users has proven very useful in modeling various phenomena across a wide range of scientific disciplines. In our study, we focus on two stylized facts or puzzles surrounding the initial public offerings (IPOs) - the underpricing and the long-term underperformance. Using the Internet searches on Google, we proxy the investor attention before and during the day of the offering to show that the high attention IPOs have different characteristics than the low attention ones. After controlling for various effects, we show that investor attention still remains a strong component of the high initial returns (the underpricing), primarily for the high sentiment periods. Moreover, we demonstrate that the investor attention partially explains the overoptimistic market reaction and thus also a part of the long-term underperformance.
Big Data is a powerful tool for environmental improvements in the construction business
NASA Astrophysics Data System (ADS)
Konikov, Aleksandr; Konikov, Gregory
2017-10-01
This work investigates applying the Big Data method as a tool to implement environmental improvements in the construction business. The method is recognized as effective for analyzing large volumes of heterogeneous data, and all preconditions exist for it to be used successfully to resolve environmental issues in construction. It is shown that the principal Big Data techniques (cluster analysis, crowdsourcing, data mixing and integration) can be applied in the sphere in question, leading to the conclusion that Big Data is a truly powerful tool to implement environmental improvements in the construction business.
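As a concrete illustration of the cluster-analysis technique named above, the following Python sketch groups construction sites by made-up environmental indicators. The feature names and values are purely illustrative assumptions.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical indicators per site: [energy use, waste volume, water use]
sites = np.array([
    [120.0, 35.0, 18.0],
    [310.0, 80.0, 44.0],
    [115.0, 30.0, 20.0],
    [295.0, 75.0, 40.0],
    [150.0, 40.0, 22.0],
])
X = StandardScaler().fit_transform(sites)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # groups sites with similar environmental footprints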
Advanced composites: Fabrication processes for selected resin matrix materials
NASA Technical Reports Server (NTRS)
Welhart, E. K.
1976-01-01
This design note is based on the present state of the art for epoxy and polyimide matrix composite fabrication technology. Structural parts of boron or graphite fibers in epoxy or polyimide matrices can be successfully fabricated. Fabrication cycles for polyimide matrix composites have been shortened to near epoxy cycle times. Nondestructive testing has proven useful in detecting defects and anomalies in composite structure elements. Fabrication methods and tooling materials are discussed along with the advantages and disadvantages of different tooling materials. Types of honeycomb core, material costs and fabrication methods are shown in table form for comparison. Fabrication limits based on tooling size, pressure capabilities and various machining operations are also discussed.
Efficient volumetric estimation from plenoptic data
NASA Astrophysics Data System (ADS)
Anglin, Paul; Reeves, Stanley J.; Thurow, Brian S.
2013-03-01
The commercial release of the Lytro camera, and greater availability of plenoptic imaging systems in general, have given the image processing community cost-effective tools for light-field imaging. While this data is most commonly used to generate planar images at arbitrary focal depths, reconstruction of volumetric fields is also possible. Similarly, deconvolution is a technique that is conventionally used in planar image reconstruction, or deblurring, algorithms. However, when leveraged with the ability of a light-field camera to quickly reproduce multiple focal planes within an imaged volume, deconvolution offers a computationally efficient method of volumetric reconstruction. Related research has shown that light-field imaging systems in conjunction with tomographic reconstruction techniques are also capable of estimating the imaged volume and have been successfully applied to particle image velocimetry (PIV). However, while tomographic volumetric estimation through algorithms such as multiplicative algebraic reconstruction techniques (MART) has proven to be highly accurate, it is computationally intensive. In this paper, the reconstruction problem is shown to be solvable by deconvolution. Deconvolution offers significant improvement in computational efficiency through the use of fast Fourier transforms (FFTs) when compared to other tomographic methods. This work describes a deconvolution algorithm designed to reconstruct a 3-D particle field from simulated plenoptic data. A 3-D extension of existing 2-D FFT-based refocusing techniques is presented to further improve efficiency when computing object focal stacks and system point spread functions (PSF). Reconstruction artifacts are identified; their underlying source and methods of mitigation are explored where possible, and reconstructions of simulated particle fields are provided.
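The following Python sketch illustrates frequency-domain (Wiener) deconvolution of a synthetic 3-D particle field with a known point spread function, the general kind of FFT-based reconstruction discussed above. The PSF shape, volume size, and regularization constant are assumptions, not the authors' algorithm.

import numpy as np

def wiener_deconvolve(blurred, psf, k=1e-3):
    """Deconvolve a 3-D volume with a known PSF using FFTs."""
    H = np.fft.fftn(np.fft.ifftshift(psf), s=blurred.shape)
    G = np.fft.fftn(blurred)
    F = G * np.conj(H) / (np.abs(H)**2 + k)   # Wiener filter in Fourier space
    return np.real(np.fft.ifftn(F))

# Synthetic example: a few point particles blurred by an anisotropic Gaussian PSF.
rng = np.random.default_rng(0)
vol = np.zeros((32, 32, 32))
vol[tuple(rng.integers(8, 24, size=(3, 5)))] = 1.0

z, y, x = np.mgrid[-16:16, -16:16, -16:16]
psf = np.exp(-(x**2 + y**2 + 4 * z**2) / (2 * 2.0**2))
psf /= psf.sum()

blurred = np.real(np.fft.ifftn(np.fft.fftn(vol) * np.fft.fftn(np.fft.ifftshift(psf))))
estimate = wiener_deconvolve(blurred, psf, k=1e-3)
print(estimate.max())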
Symes, Craig; Skhosana, Felix; Butler, Mike; Gardner, Brett; Woodborne, Stephan
2017-12-01
Diet-tissue isotopic relationships established under controlled conditions are informative for determining the dietary sources and geographic provenance of organisms. We analysed δ13C, δ15N, and non-exchangeable δ2H values of captive African grey parrot Psittacus erithacus feathers grown on a fixed mixed diet and borehole water. Diet-feather Δ13C and Δ15N discrimination values were +3.8 ± 0.3 ‰ and +6.3 ± 0.7 ‰ respectively; significantly greater than expected. Non-exchangeable δ2H feather values (-62.4 ± 6.4 ‰) were more negative than those of the water (-26.1 ± 2.5 ‰) offered during feather growth. There was no positive relationship between the δ13C and δ15N values of samples taken along each feather and those of the associated food offered, nor between the feathers' non-exchangeable hydrogen isotope values and the δ2H values of the water, emphasising the complex processes involved in carbohydrate, protein, and income water routing to feather growth. Understanding the isotopic relationship between diet and feathers may provide greater clarity in the use of stable isotopes in feathers as a tool in determining origins of captive and wild-caught African grey parrots, a species that is widespread in aviculture and faces significant threats to wild populations. We suggest that these isotopic results, determined even in controlled laboratory conditions, be used with caution.
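For readers unfamiliar with the discrimination values quoted above, the following short Python sketch shows the underlying arithmetic, Δ = δ(tissue) − δ(diet); the diet and feather values are invented so that the printed offsets resemble the reported +3.8 ‰ and +6.3 ‰.

def discrimination(delta_tissue, delta_diet):
    """Diet-tissue discrimination: tissue value minus diet value (per mil)."""
    return delta_tissue - delta_diet

diet = {"d13C": -23.5, "d15N": 3.1}       # assumed mixed-diet values (per mil)
feather = {"d13C": -19.7, "d15N": 9.4}    # assumed feather values (per mil)

for iso in diet:
    print(iso, f"{discrimination(feather[iso], diet[iso]):+.1f} per mil")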
Lou, Yun-xiao; Fu, Xian-shu; Yu, Xiao-ping; Zhang, Ya-fen
2017-01-01
This paper focused on an effective method to discriminate the geographical origin of Wuyi-Rock tea by the stable isotope ratio (SIR) and metallic element profiling (MEP) combined with support vector machine (SVM) analysis. Wuyi-Rock tea (n = 99) collected from nine producing areas and non-Wuyi-Rock tea (n = 33) from eleven nonproducing areas were analysed for SIR and MEP by established methods. The SVM model based on coupled data produced the best prediction accuracy (0.9773). This prediction shows that instrumental methods combined with a classification model can provide an effective and stable tool for provenance discrimination. Moreover, every feature variable in stable isotope and metallic element data was ranked by its contribution to the model. The results show that δ2H, δ18O, Cs, Cu, Ca, and Rb contents are significant indications for provenance discrimination and not all of the metallic elements improve the prediction accuracy of the SVM model. PMID:28473941
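The following Python sketch outlines an SVM classifier on combined stable isotope and metallic element features, analogous in spirit to the approach described above; the feature values, labels, and kernel settings are illustrative assumptions, not the study's data or tuning.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Columns: [d2H, d18O, Cs, Cu, Ca, Rb] (units omitted; values made up)
X = np.array([
    [-58.0, 12.1, 0.8, 14.2, 4100.0, 52.0],
    [-60.5, 12.4, 0.9, 13.8, 4300.0, 55.0],
    [-41.2,  9.8, 0.3, 20.1, 2900.0, 30.0],
    [-43.7, 10.2, 0.4, 19.5, 3100.0, 28.0],
])
y = np.array([1, 1, 0, 0])   # 1 = Wuyi-Rock producing area, 0 = elsewhere

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)
print(clf.predict([[-59.0, 12.0, 0.85, 14.0, 4200.0, 53.0]]))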
Collections and user tools for utilization of persistent identifiers in cyberinfrastructures
NASA Astrophysics Data System (ADS)
Weigel, T.
2014-12-01
The main use of persistent identifiers (PIDs) for data objects has so far been for formal publication and citation purposes, with a focus on long-term availability and trust. This core use case has now evolved and broadened to include basic data management tasks, as identifiers are increasingly seen as a possible anchor element in the deluge of data for purposes of large-scale automation of tasks. The European Data Infrastructure (EUDAT), for instance, uses PIDs in its back-end services, and distinctly so for entities where the identifier may be more persistent than a resource with limited lifetime. Despite breaking with the traditional metaphor, this offers new opportunities for data management and end-user tools, but also requires a clearly demonstrated benefit of value-added services, because en masse identifier assignment does not come at zero cost. There are several obstacles to overcome when establishing identifiers at large scale. The administration of large numbers of identifiers can be cumbersome if they are treated in an isolated manner. Here, identifier collections can enable automated mass operations on groups of associated objects. Several use cases rely on base information that is rapidly available from the identifier systems without the need to retrieve objects, yet they will not work efficiently if the information is not consistently typed. Tools that span cyberinfrastructures and address scientific end-users unaware of the varying back-ends must overcome such obstacles. The Working Group on PID Information Types of the Research Data Alliance (RDA) has developed an interface specification and prototype to access and manipulate typed base information. Concrete prototypes for identifier collections exist as well. We will present a first set of data and provenance tracking tools that make extensive use of these recent developments and address different user needs that span from administrative tasks to individual end-user services, with particular focus on data available from the Earth System Grid Federation (ESGF). We will compare the tools along their respective use cases with existing approaches and discuss benefits and limitations.
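As a rough illustration of retrieving typed base information for a persistent identifier over HTTP, the following Python sketch queries a handle resolver's REST interface and returns a type-to-value view of the record. The proxy URL pattern and the example handle are assumptions made for illustration; the interfaces actually used by EUDAT and the RDA prototypes may differ.

import json
import urllib.request

def resolve_handle(handle, proxy="https://hdl.handle.net/api/handles/"):
    """Fetch the value records of a handle from a resolver's REST interface."""
    with urllib.request.urlopen(proxy + handle) as resp:
        record = json.load(resp)
    # Return a {type: value} view of the typed entries in the handle record.
    return {v["type"]: v["data"].get("value") for v in record.get("values", [])}

if __name__ == "__main__":
    info = resolve_handle("10.1000/1")   # illustrative handle only
    print(info.get("URL"))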
APMS: An Integrated Suite of Tools for Measuring Performance and Safety
NASA Technical Reports Server (NTRS)
Statler, Irving C.; Lynch, Robert E.; Connors, Mary M. (Technical Monitor)
1997-01-01
This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality, and data interchangeability, among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.
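The contrast drawn above between single-flight special-event detection and fleet-level statistical evaluation can be sketched in a few lines of Python; the parameter, threshold, and values below are illustrative assumptions, not APMS algorithms.

import pandas as pd

records = pd.DataFrame({
    "flight_id": ["A1", "A1", "A2", "A2", "A3", "A3"],
    "vertical_speed_fpm": [-650, -720, -800, -1150, -700, -690],  # at 500 ft AGL (made up)
})

THRESHOLD = -1000  # exceedance if descent rate is steeper than this

# Per-flight special-event detection (the traditional approach).
events = records[records["vertical_speed_fpm"] < THRESHOLD]
print(events)

# Fleet-level statistical evaluation across many flights.
fleet = records.groupby("flight_id")["vertical_speed_fpm"].min()
print(fleet.describe())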
NASA Technical Reports Server (NTRS)
Statler, Irving C.; Connor, Mary M. (Technical Monitor)
1998-01-01
This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS offers to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality and data interchangeability among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of aircrews in mind. APMS tools must serve the needs of the government and air carriers, as well as aircrews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but also through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the aircrew.
APMS: An Integrated Suite of Tools for Measuring Performance and Safety
NASA Technical Reports Server (NTRS)
Statler, Irving C. (Technical Monitor)
1997-01-01
This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality and data interchangeability among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.
APMS: An Integrated Set of Tools for Measuring Safety
NASA Technical Reports Server (NTRS)
Statler, Irving C.; Reynard, William D. (Technical Monitor)
1996-01-01
This is a report of work in progress. In it, I summarize the status of the research and development of the Aviation Performance Measuring System (APMS) for managing, processing, and analyzing digital flight-recorded data. The objectives of the NASA-FAA APMS research project are to establish a sound scientific and technological basis for flight-data analysis, to define an open and flexible architecture for flight-data-analysis systems, and to articulate guidelines for a standardized database structure on which to continue to build future flight-data-analysis extensions. APMS will offer to the air transport community an open, voluntary standard for flight-data-analysis software, a standard that will help to ensure suitable functionality, and data interchangeability, among competing software programs. APMS will develop and document the methodologies, algorithms, and procedures for data management and analyses to enable users to easily interpret the implications regarding safety and efficiency of operations. APMS does not entail the implementation of a nationwide flight-data-collection system. It is intended to provide technical tools to ease the large-scale implementation of flight-data analyses at both the air-carrier and the national-airspace levels in support of their Flight Operations and Quality Assurance (FOQA) Programs and Advanced Qualifications Programs (AQP). APMS cannot meet its objectives unless it develops tools that go substantially beyond the capabilities of the current commercially available software and supporting analytic methods that are mainly designed to count special events. These existing capabilities, while of proven value, were created primarily with the needs of air crews in mind. APMS tools must serve the needs of the government and air carriers, as well as air crews, to fully support the FOQA and AQP programs. They must be able to derive knowledge not only through the analysis of single flights (special-event detection), but through statistical evaluation of the performance of large groups of flights. This paper describes the integrated suite of tools that will assist analysts in evaluating the operational performance and safety of the national air transport system, the air carrier, and the air crew.
NASA Astrophysics Data System (ADS)
Harmon, T.; Hofmann, A. F.; Utz, R.; Deelman, E.; Hanson, P. C.; Szekely, P.; Villamizar, S. R.; Knoblock, C.; Guo, Q.; Crichton, D. J.; McCann, M. P.; Gil, Y.
2011-12-01
Environmental cyber-observatory (ECO) planning and implementation has been ongoing for more than a decade now, and several major efforts have recently come online or will soon. Some investigators in the relevant research communities will use ECO data, traditionally by developing their own client-side services to acquire data and then manually creating custom tools to integrate and analyze it. However, a significant portion of the aquatic ecosystem science community will need more custom services to manage locally collected data. The latter group represents enormous intellectual capacity when one envisions thousands of ecosystem scientists supplementing ECO baseline data by sharing their own locally intensive observational efforts. This poster summarizes the outcomes of the June 2011 Workshop for Aquatic Ecosystem Sustainability (WAES), which focused on the needs of aquatic ecosystem research on inland waters and oceans. Here we advocate new approaches to support scientists to model, integrate, and analyze data based on: 1) a new breed of software tools in which semantic provenance is automatically created and used by the system, 2) the use of open standards based on RDF and Linked Data Principles to facilitate sharing of data and provenance annotations, 3) the use of workflows to represent explicitly all data preparation, integration, and processing steps in a way that is automatically repeatable. Aquatic ecosystems workflow exemplars are provided and discussed in terms of their potential to broaden data sharing, analysis, and synthesis, thereby increasing the impact of aquatic ecosystem research.
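The following Python sketch, using rdflib and W3C PROV-O terms, shows one way provenance annotations of the kind advocated above could be expressed as RDF; the URIs and the dataset/activity names are illustrative assumptions.

from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, XSD

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/")

g = Graph()
g.bind("prov", PROV)

raw = EX["sensor-readings-2011-06"]          # hypothetical source dataset
derived = EX["daily-chlorophyll-summary"]    # hypothetical derived product
activity = EX["aggregation-run-42"]          # hypothetical workflow step

g.add((raw, RDF.type, PROV.Entity))
g.add((derived, RDF.type, PROV.Entity))
g.add((activity, RDF.type, PROV.Activity))
g.add((derived, PROV.wasDerivedFrom, raw))
g.add((derived, PROV.wasGeneratedBy, activity))
g.add((activity, PROV.startedAtTime,
       Literal("2011-06-30T12:00:00", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))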
NASA Astrophysics Data System (ADS)
Sivarami Reddy, N.; Ramamurthy, D. V., Dr.; Prahlada Rao, K., Dr.
2017-08-01
This article addresses simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share the tools and transfer times of jobs and tools between machines are considered, to generate optimal sequences that minimize makespan in a multi-machine Flexible Manufacturing System (FMS). FMS performance is expected to improve through effective utilization of its resources and through proper integration and synchronization of their scheduling. The Symbiotic Organisms Search (SOS) algorithm is a potent and proven alternative for solving optimization problems such as scheduling. The proposed SOS algorithm is tested on 22 job sets with makespan as the objective for scheduling of machines and tools, where machines are allowed to share tools without considering transfer times of jobs and tools, and the results are compared with those of existing methods. The results show that SOS outperforms the existing methods. The same SOS algorithm is then used for simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share tools and transfer times of jobs and tools are considered, to determine the optimal sequences that minimize makespan.
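For orientation, the following Python sketch shows the generic structure of the SOS metaheuristic (mutualism, commensalism, and parasitism phases) minimizing a toy continuous function; it only illustrates the algorithm family and is not the discrete, scheduling-specific variant used in the article.

import numpy as np

def sphere(x):                      # toy objective to minimize
    return float(np.sum(x**2))

def sos(obj, dim=5, pop=20, iters=100, lo=-5.0, hi=5.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (pop, dim))
    fit = np.array([obj(x) for x in X])
    for _ in range(iters):
        best = X[np.argmin(fit)]
        for i in range(pop):
            j = rng.choice([k for k in range(pop) if k != i])
            # Mutualism: i and j both move toward the best via their mutual vector.
            mutual = (X[i] + X[j]) / 2.0
            bf1, bf2 = rng.integers(1, 3, size=2)   # benefit factors in {1, 2}
            cand_i = np.clip(X[i] + rng.random(dim) * (best - mutual * bf1), lo, hi)
            cand_j = np.clip(X[j] + rng.random(dim) * (best - mutual * bf2), lo, hi)
            for idx, cand in ((i, cand_i), (j, cand_j)):
                f = obj(cand)
                if f < fit[idx]:
                    X[idx], fit[idx] = cand, f
            # Commensalism: i benefits from j, j is unaffected.
            cand = np.clip(X[i] + rng.uniform(-1, 1, dim) * (best - X[j]), lo, hi)
            if obj(cand) < fit[i]:
                X[i], fit[i] = cand, obj(cand)
            # Parasitism: a mutated copy of i tries to replace j.
            parasite = X[i].copy()
            mask = rng.random(dim) < 0.5
            parasite[mask] = rng.uniform(lo, hi, mask.sum())
            if obj(parasite) < fit[j]:
                X[j], fit[j] = parasite, obj(parasite)
    return X[np.argmin(fit)], float(fit.min())

best_x, best_f = sos(sphere)
print(best_f)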
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Frost-protected shallow foundations (FPSFs) offer a proven technology designed to substantially lower construction costs in colder climates, enhancing housing affordability for families in many parts of the United States. This document provides step-by-step procedures to assist building professionals in designing and laying a slab-on-grade FPSF. FPSFs save money over conventional designs by requiring less excavation to construct a frost-proof foundation. The foundation is specially insulated along its perimeter to raise the temperature of the surrounding ground and decrease frost penetration, thus allowing for the construction of a substantially shallower foundation. The FPSF is considered standard practice for homes in Scandinavia, where 40 years of field testing has proven it to be economical to construct, durable, and energy efficient. HUD strongly encourages widespread adoption of FPSF technology in the United States and its incorporation into major model building codes.
[Quantified self movement--the new mantra of life insurance companies].
Becher, St
2016-06-01
Wearables are small personal minicomputers that register biometric data. Using them, the insurance industry hopes to create new sales opportunities and products and to simplify underwriting. Lower premiums will promote the use of wearables. The related possibilities and unanswered questions are discussed in this article. Utilisation of big data offers the insurance industry a range of new opportunities; the benefit, however, must be proven in the future.
Rapid Building Assessment Project
2014-05-01
ongoing management of commercial energy efficiency. No other company offers all of these proven services on a seamless, integrated Software-as-a-Service ...FirstFuel has added a suite of additional Software-as-a-Service analytics capabilities to support the entire energy efficiency lifecycle, including...the client side. In this document, we refer to the service-side software as “BUILDER” and the client software as “BuilderRED,” following the Army
NASA Astrophysics Data System (ADS)
Lim, D. S.; Brady, A. L.; Cardman, Z.; Cowie, B. R.; Forrest, A.; Marinova, M.; Shepard, R.; Laval, B.; Slater, G. F.; Gernhardt, M.; Andersen, D. T.; Hawes, I.; Sumner, D. Y.; Trembanis, A. C.; McKay, C. P.
2009-12-01
Microbialites can be metre-scale or larger discrete structures that cover kilometre-scale regions, for example in Pavilion Lake, British Columbia, Canada, while the organisms associated with their growth and development are much smaller (less than millimeter scale). As such, a multi-scaled approach to understanding their provenance, maintenance and morphological characteristics is required. Research members of the Pavilion Lake Research Project (PLRP) (www.pavilionlake.com) have been working to understand microbialite morphogenesis in Pavilion Lake, B.C., Canada and the potential for biosignature preservation in these carbonate rocks using a combination of field- and lab-based techniques. PLRP research participants have been: (1) exploring the physical and chemical limnological properties of the lake, especially as these characteristics pertain to microbialite formation, (2) using geochemical and molecular tools to test the hypothesized biological origin of the microbialites and the associated meso-scale processes, and (3) using geochemical and microscopic tools to characterize potential biosignature preservation in the microbialites on the micro scale. To address these goals, PLRP identified the need to (a) map Pavilion Lake to gain a contextual understanding of microbialite distribution and possible correlation between their lake-wide distribution and the ambient growth conditions, and (b) sample the microbialites, including those from the deepest regions of the lake (60m). Initial assessments showed that PLRP science diving operations did not prove adequate for mapping and sample recovery in the large and deep (0.8 km x 5.7 km; 65m max depth) lake. As such, the DeepWorker Science and Exploration (DSE) program was established by the PLRP. At the heart of this program are two DeepWorker (DW) submersibles, single-person vehicles that offer Scientist-Pilots (SP) an opportunity to study the lake in a 1 atm pressurized environment. In addition, the use of Autonomous Underwater Vehicles (AUVs) for landscape-level geophysical mapping (side-scan and multibeam) provides an additional large-scale context for the microbialite associations. The multi-scaled approach undertaken by the PLRP team members has created an opportunity to weave together a comprehensive understanding of the modern microbialites in Pavilion Lake, and their relevance to interpreting ancient carbonate fabrics. An overview of the team’s findings to date and on-going research will be presented.
VirusDetect: An automated pipeline for efficient virus discovery using deep sequencing of small RNAs
USDA-ARS?s Scientific Manuscript database
Accurate detection of viruses in plants and animals is critical for agriculture production and human health. Deep sequencing and assembly of virus-derived siRNAs has proven to be a highly efficient approach for virus discovery. However, to date no computational tools specifically designed for both k...
Leading the Charge: Governors, Higher Education and Accountability
ERIC Educational Resources Information Center
American Council of Trustees and Alumni, 2014
2014-01-01
With this new tool, ACTA [American Council of Trustees and Alumni] is working to expand its outreach to governors nationwide on behalf of higher education reform, focusing on key issues of quality, cost, and accountability. ACTA has worked with governors and education leaders from across the country, and that experience has proven that innovative…
Adapting Stanford's Chronic Disease Self-Management Program to Hawaii's Multicultural Population
ERIC Educational Resources Information Center
Tomioka, Michiyo; Braun, Kathryn L.; Compton, Merlita; Tanoue, Leslie
2012-01-01
Purpose of the Study: Stanford's Chronic Disease Self-Management Program (CDSMP) has been proven to increase patients' ability to manage distress. We describe how we replicated CDSMP in Asian and Pacific Islander (API) communities. Design and Methods: We used the "track changes" tool to deconstruct CDSMP into its various components…
USDA-ARS?s Scientific Manuscript database
To reduce susceptibility to stressors and diseases, immune-modulators such as β-glucans have been proven effective tools to enhance the innate immune responses of fish. Consequently, commercial sources of this polysaccharide are becoming increasingly more available. Algamune™ is a commercial addi...
USDA-ARS?s Scientific Manuscript database
Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS) has proven to be a powerful tool for taxonomic resolution of microorganisms. In this proof-of-concept study, we assessed the effectiveness of this technique to track the current gene sequence-based phylogenet...
Total Lightning as an Indicator of Mesocyclone Behavior
NASA Technical Reports Server (NTRS)
Stough, Sarah M.; Carey, Lawrence D.; Schultz, Christopher J.
2014-01-01
The apparent relationship between total lightning (in-cloud and cloud-to-ground) and severe weather suggests its operational utility. The goal is the fusion of total lightning with proven tools (i.e., radar) and lightning algorithms. Preliminary work here investigates circulation from the Weather Surveillance Radar-1988 Doppler (WSR-88D) coupled with total lightning data from Lightning Mapping Arrays.
Tracy S. Hawkins; Emile S. Gardiner; Greg S. Comer
2009-01-01
Handheld chlorophyll meters have proven to be useful tools for rapid, nondestructive assessment of chlorophyll and nutrient status in various agricultural and arborescent plant species. We proposed that a SPAD-502 chlorophyll meter would provide valuable information when monitoring life cycle changes and intraspecific variation in...
Role of E-Learning in Capacity Building: An Alumni View
ERIC Educational Resources Information Center
Zaheer, Muhammad; Jabeen, Sadia; Qadri, Mubasher Majeed
2015-01-01
The concept of knowledge sharing has now expanded because of sophisticated communication tools. A common consensus has been generated for spreading knowledge beyond boundaries and making collective efforts for the development of individuals as well as nations. E-learning has proven its authenticity in this regard. In developing countries, access…
Integrating Traditional Learning and Games on Large Displays: An Experimental Study
ERIC Educational Resources Information Center
Ardito, Carmelo; Lanzilotti, Rosa; Costabile, Maria F.; Desolda, Giuseppe
2013-01-01
Current information and communication technology (ICT) has the potential to bring further changes to education. New learning techniques must be identified to take advantage of recent technological tools, such as smartphones, multimodal interfaces, multi-touch displays, etc. Game-based techniques that capitalize on ICT have proven to be very…
Captured by Motion: Dance, Action Understanding, and Social Cognition
ERIC Educational Resources Information Center
Sevdalis, Vassilis; Keller, Peter E.
2011-01-01
In this review article, we summarize the main findings from empirical studies that used dance-related forms of rhythmical full body movement as a research tool for investigating action understanding and social cognition. This work has proven to be informative about behavioral and brain mechanisms that mediate links between perceptual and motor…
An Earth Hazards Camp to Encourage Minority Participation in the Geosciences
ERIC Educational Resources Information Center
Sherman-Morris, Kathleen; Clary, Renee M.; McNeal, Karen S.; Diaz-Ramirez, Jairo; Brown, Michael E.
2017-01-01
Summer camps have proven to be effective tools to engage students in the geosciences. Findings from this study highlight perceptions and experiences of middle school students from predominantly African American school districts in Mississippi who attended a three-day residence camp focused on increasing interest in the geosciences through an earth…
USDA-ARS?s Scientific Manuscript database
Spectroscopy has proven to be an efficient tool for measuring the properties of meat. In this article, the hyperspectral imaging (HSI) technique is investigated for the determination of moisture content in cooked chicken breast over the VIS/NIR (400–1000 nm) spectral ranges. Moisture measurements we...
Effectiveness of Hexazinone as a Forestry Herbicide
Jerry L. Michael
1985-01-01
Hexazinone has proven to be a useful herbicide in southern forestry. Its effectiveness in controlling many woody and herbaceous weeds at application rates tolerated by pines provides foresters with a selective vegetation management tool. Hexazinone is an environmentally safe chemical because it is low in toxicity, is degraded readily, does not bioaccumulate, and does...
The Utility of Interaction Analysis for Generalizing Characteristics of Science Classrooms
ERIC Educational Resources Information Center
Crippen, Kent J.; Sangueza, Cheryl R.
2013-01-01
Validating and generalizing from holistic observation protocols of classroom practice have proven difficult. These tools miss crucial classroom characteristics, like the type of instruction, the organization of learners, and the level of cognitive engagement that occur differentially in the time span of a lesson. As a result, this study examined…
The presence and distribution of undesirable quantities of bioavailable nitrogenous compounds in the environment are issues of long-standing concern. Importantly for us today, deleterious effects associated with high levels of nitrogen in the ecosystem are becoming everyday news...
Equipping Novice Teachers with a Learning Map to Enhance Teaching Practice
ERIC Educational Resources Information Center
Xu, Zhe; Gu, Xiaoqing
2017-01-01
Using tools to support learning design has been proven feasible in improving the integration of technology into the curriculum. However, novice teachers are faced with two major issues, including their limited experience in learning design and limited ability in using new technologies. Learning map is explored and developed in e-Textbooks to…
Life Cycle Impact Analysis (LCIA) has proven to be a valuable tool for systematically comparing processes and products, and has been proposed for use in Chemical Alternatives Analysis (CAA). The exposure assessment portion of the human health impact scores of LCIA has historicall...
Converting Inhouse Subject Card Files to Electronic Keyword Files.
ERIC Educational Resources Information Center
Culmer, Carita M.
The library at Phoenix College developed the Controversial Issues Files (CIF), a "home made" card file containing references pertinent to specific ongoing assignments. Although the CIF had proven itself to be an excellent resource tool for beginning researchers, it was cumbersome to maintain in the card format, and was limited to very…
USDA-ARS?s Scientific Manuscript database
Agbiotechnology uses genetic engineering to improve the output and value of crops. Altering the expression of the plant Type I Proton-pumping Pyrophosphatase (H+-PPase) has already proven to be a useful tool to enhance crop productivity. Despite the effective use of this gene in translational resear...
Educational Technology along with the Uncritical Mass versus Ethics
ERIC Educational Resources Information Center
Sayadmansour, Alireza; Nassaji, Mehdi
2013-01-01
This paper considers the ethics of educational technology in terms of whether or not selected media and methods are beneficial to the teacher and student, or whether other motives and criteria determine the selection. Communications media have proven themselves to be powerful and efficient tools, used like "dynamite" for getting the most…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-24
... water pollution and requests public comment. The document was prepared pursuant to Executive Order (E.O... Chesapeake Bay watershed describing proven, cost-effective tools and practices that reduce water pollution... top right of the Web page, then follow the online instructions. Mail: Water Docket, Environmental...
USDA-ARS?s Scientific Manuscript database
Microbial contamination of waters in agricultural watersheds is a critical public health issue. Watershed-scale models have proven to be candidate tools for predicting microbial water quality and evaluating management practices. The Agricultural Policy/Environmental eXtender (APEX...
Durkin, Gregory J
2010-01-01
A wide variety of evaluation formats are available for new graduate nurses, but most of them are single-point evaluation tools that do not provide a clear picture of progress for orientee or educator. This article describes the development of a Web-based evaluation tool that combines learning taxonomies with the Synergy model into a rating scale based on independent performance. The evaluation tool and process provides open 24/7 access to evaluation documentation for members of the orientation team, demystifying the process and clarifying expectations. The implementation of the tool has proven to be transformative in the perceptions of evaluation and performance expectations of new graduates. This tool has been successful at monitoring progress, altering education, and opening dialogue about performance for over 125 new graduate nurses since inception.
Malikova, Marina A; Tkacz, Jaroslaw N; Slanetz, Priscilla J; Guo, Chao-Yu; Aakil, Adam; Jara, Hernan
2017-08-01
Early breast cancer detection is important for intervention and prognosis, and advances in treatment and outcomes require diagnostic tools with high positive predictive value. The aim was to study the potential role of quantitative MRI (qMRI) using T1/T2 ratios to differentiate benign from malignant breast lesions. In a cross-sectional study, 69 women with 69 known or suspicious breast lesions were scanned with a mixed turbo spin echo pulse sequence. Patients were grouped according to histopathological assessment of disease stage: untreated malignant tumor, treated malignancy, and benign disease. Elevated T1/T2 means were observed for biopsy-proven malignant lesions and for malignant lesions treated prior to qMRI with chemotherapy and/or radiation, as compared with benign lesions. The qMRI-obtained T1/T2 ratios correlated with histopathology, and analysis revealed a correlation between elevated T1/T2 ratio and disease stage. This could provide valuable complementary information on tissue properties as an additional diagnostic tool.
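The quantity compared across lesion groups above is a voxel-wise T1/T2 ratio; the following Python sketch shows how such a ratio map could be formed from quantitative relaxation maps, with array shapes and values that are purely illustrative, not the study's pipeline.

import numpy as np

t1 = np.array([[950.0, 1100.0], [1400.0, 1250.0]])   # ms, assumed T1 map
t2 = np.array([[80.0, 70.0], [55.0, 60.0]])          # ms, assumed T2 map

# Voxel-wise ratio, guarding against division by zero.
ratio = np.divide(t1, t2, out=np.zeros_like(t1), where=t2 > 0)
lesion_mask = np.array([[False, False], [True, True]])  # assumed ROI
print(f"mean lesion T1/T2: {ratio[lesion_mask].mean():.1f}")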
A Collaborative Web-Based Approach to Planning Research, Integration, and Testing Using a Wiki
NASA Technical Reports Server (NTRS)
Delaney, Michael M.; Koshimoto, Edwin T.; Noble, Deleena; Duggan, Christopher
2010-01-01
The National Aeronautics and Space Administration Integrated Vehicle Health Management program touches on many different research areas while striving to enable the automated detection, diagnosis, prognosis, and mitigation of adverse events at the aircraft and system level. At the system level, the research focus is on the evaluation of multidisciplinary integrated methods, tools, and technologies for achieving the program goal. The participating program members form a diverse group of government, industry, and academic researchers. The program team developed the Research and Test Integration Plan in order to track significant test and evaluation activities, which are important for understanding, demonstrating, and communicating the overall project state and project direction. The Plan is a living document, which allows the project team the flexibility to construct conceptual test scenarios and to track project resources. The Plan also incorporates several desirable feature requirements for Plan users and maintainers. A wiki has proven to be the most efficient and effective means of implementing the feature requirements for the Plan. The wiki has proven very valuable as a research project management tool, and there are plans to expand its scope.
Climate Model Diagnostic Analyzer
NASA Technical Reports Server (NTRS)
Lee, Seungwon; Pan, Lei; Zhai, Chengxing; Tang, Benyang; Kubar, Terry; Zhang, Zia; Wang, Wei
2015-01-01
The comprehensive and innovative evaluation of climate models with newly available global observations is critically needed for the improvement of climate model current-state representation and future-state predictability. A climate model diagnostic evaluation process requires physics-based multi-variable analyses that typically involve large-volume and heterogeneous datasets, making them both computation- and data-intensive. With an exploratory nature of climate data analyses and an explosive growth of datasets and service tools, scientists are struggling to keep track of their datasets, tools, and execution/study history, let alone sharing them with others. In response, we have developed a cloud-enabled, provenance-supported, web-service system called Climate Model Diagnostic Analyzer (CMDA). CMDA enables the physics-based, multivariable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. At the same time, CMDA provides a crowd-sourcing space where scientists can organize their work efficiently and share their work with others. CMDA is empowered by many current state-of-the-art software packages in web service, provenance, and semantic search.
Liang, Laurel; Abi Safi, Jhoni; Gagliardi, Anna R
2017-11-15
Guideline implementation tools (GI tools) can improve clinician behavior and patient outcomes. Analyses of guidelines published before 2010 found that many did not offer GI tools. Since 2010 standards, frameworks and instructions for GI tools have emerged. This study analyzed the number and types of GI tools offered by guidelines published in 2010 or later. Content analysis and a published GI tool framework were used to categorize GI tools by condition, country, and type of organization. English-language guidelines on arthritis, asthma, colorectal cancer, depression, diabetes, heart failure, and stroke management were identified in the National Guideline Clearinghouse. Screening and data extraction were in triplicate. Findings were reported with summary statistics. Eighty-five (67.5%) of 126 eligible guidelines published between 2010 and 2017 offered one or more of a total of 464 GI tools. The mean number of GI tools per guideline was 5.5 (median 4.0, range 1 to 28) and increased over time. The majority of GI tools were for clinicians (239, 51.5%), few were for patients (113, 24.4%), and fewer still were to support implementation (66, 14.3%) or evaluation (46, 9.9%). Most clinician GI tools were guideline summaries (116, 48.5%), and most patient GI tools were condition-specific information (92, 81.4%). Government agencies (patient 23.5%, clinician 28.9%, implementation 24.1%, evaluation 23.5%) and developers in the UK (patient 18.5%, clinician 25.2%, implementation 27.2%, evaluation 29.1%) were more likely to generate guidelines that offered all four types of GI tools. Professional societies were more likely to generate guidelines that included clinician GI tools. Many guidelines do not include any GI tools, or a variety of GI tools for different stakeholders that may be more likely to prompt guideline uptake (point-of-care forms or checklists for clinicians, decision-making or self-management tools for patients, implementation and evaluation tools for managers and policy-makers). While this may vary by country and type of organization, and suggests that developers could improve the range of GI tools they develop, further research is needed to identify determinants and potential solutions. Research is also needed to examine the cost-effectiveness of various types of GI tools so that developers know where to direct their efforts and scarce resources.
NASA Astrophysics Data System (ADS)
Sessa, Francesco; D'Angelo, Paola; Migliorati, Valentina
2018-01-01
In this work we have developed an analytical procedure to identify metal ion coordination geometries in liquid media based on the calculation of Combined Distribution Functions (CDFs) starting from Molecular Dynamics (MD) simulations. CDFs provide a fingerprint which can be easily and unambiguously assigned to a reference polyhedron. The CDF analysis has been tested on five systems and has proven to reliably identify the correct geometries of several ion coordination complexes. This tool is simple and general and can be efficiently applied to different MD simulations of liquid systems.
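The abstract does not give the exact definition of the CDFs used, but a combined distribution in this spirit can be sketched as a two-dimensional histogram over an ion-ligand distance and a ligand-ion-ligand angle collected along an MD trajectory; the function and synthetic inputs below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def combined_distribution(distances, angles, r_bins=100, a_bins=90,
                          r_range=(1.5, 4.0), a_range=(0.0, 180.0)):
    """2D distance-angle histogram as a simple stand-in for a CDF.

    `distances` (ion-ligand distances, Angstrom) and `angles`
    (ligand-ion-ligand angles, degrees) are flat arrays collected over
    an MD trajectory; both are hypothetical inputs here.
    """
    hist, r_edges, a_edges = np.histogram2d(
        distances, angles, bins=(r_bins, a_bins),
        range=(r_range, a_range), density=True)
    return hist, r_edges, a_edges

# Synthetic example: a roughly octahedral pattern (angles near 90/180 deg)
rng = np.random.default_rng(0)
d = rng.normal(2.4, 0.1, 6000)
a = np.concatenate([rng.normal(90, 5, 4000), rng.normal(180, 5, 2000)])
cdf, _, _ = combined_distribution(d, a)
print(cdf.shape)
```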
Brennan, Sean R.; Fernandez, Diego P.; Zimmerman, Christian E.; Cerling, Thure E.; Brown, Randy J.; Wooller, Matthew J.
2015-01-01
Heterogeneity in 87Sr/86Sr ratios of river-dissolved strontium (Sr) across geologically diverse environments provides a useful tool for investigating provenance, connectivity and movement patterns of various organisms and materials. Evaluation of site-specific 87Sr/86Sr temporal variability throughout study regions is a prerequisite for provenance research, but the dynamics driving temporal variability are generally system-dependent and not accurately predictable. We used the time-keeping properties of otoliths from non-migratory slimy sculpin (Cottus cognatus) to evaluate multi-scale 87Sr/86Sr temporal variability of river waters throughout the Nushagak River, a large (34,700 km2) remote watershed in Alaska, USA. Slimy sculpin otoliths incorporated site-specific temporal variation at sub-annual resolution and were able to record changes on the order of 0.0001 in the 87Sr/86Sr ratio. 87Sr/86Sr profiles of slimy sculpin collected in tributaries and main-stem channels of the upper watershed indicated that these regions were temporally stable, whereas the Lower Nushagak River exhibited some spatio-temporal variability. This study illustrates how the behavioral ecology of a non-migratory organism can be used to evaluate sub-annual 87Sr/86Sr temporal variability and has broad implications for provenance studies employing this tracer.
NASA Astrophysics Data System (ADS)
Jonell, T. N.; Li, Y.; Blusztajn, J.; Giosan, L.; Clift, P. D.
2017-12-01
Rare earth element (REE) radioisotope systems, such as neodymium (Nd), have been traditionally used as powerful tracers of source provenance, chemical weathering intensity, and sedimentary processes over geologic timescales. More recently, the effects of physical fractionation (hydraulic sorting) of sediments during transport have called into question the utility of Nd isotopes as a provenance tool. Is source terrane Nd provenance resolvable if sediment transport strongly induces noise? Can grain-size sorting effects be quantified? This study works to address such questions by utilizing grain size analysis, trace element geochemistry, and Nd isotope geochemistry of bulk and grain-size fractions (<63μm, 63-125 μm, 125-250 μm) from the Indus delta of Pakistan. Here we evaluate how grain size effects drive Nd isotope variability and further resolve the total uncertainties associated with Nd isotope compositions of bulk sediments. Results from the Indus delta indicate bulk sediment ɛNd compositions are most similar to the <63 µm fraction as a result of strong mineralogical control on bulk compositions by silt- to clay-sized monazite and/or allanite. Replicate analyses determine that the best reproducibility (± 0.15 ɛNd points) is observed in the 125-250 µm fraction. The bulk and finest fractions display the worst reproducibility (±0.3 ɛNd points). Standard deviations (2σ) indicate that bulk sediment uncertainties are no more than ±1.0 ɛNd points. This argues that excursions of ≥1.0 ɛNd points in any bulk Indus delta sediments must in part reflect an external shift in provenance irrespective of sample composition, grain size, and grain size distribution. Sample standard deviations (2s) estimate that any terrigenous bulk sediment composition should vary no greater than ±1.1 ɛNd points if provenance remains constant. Findings from this study indicate that although there are grain-size dependent Nd isotope effects, they are minimal in the Indus delta such that resolvable provenance-driven trends can be identified in bulk sediment ɛNd compositions over the last 20 k.y., and that overall provenance trends remain consistent with previous findings.
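For readers unfamiliar with the notation, ɛNd values are conventionally derived from measured 143Nd/144Nd ratios relative to the CHUR reference; a minimal sketch follows, with the sample ratio invented for illustration.

```python
CHUR_143ND_144ND = 0.512638  # commonly used CHUR reference value

def epsilon_nd(nd143_nd144_sample, chur=CHUR_143ND_144ND):
    """Convert a measured 143Nd/144Nd ratio to epsilon-Nd notation."""
    return (nd143_nd144_sample / chur - 1.0) * 1.0e4

# Hypothetical bulk-sediment ratio, illustrative only
print(round(epsilon_nd(0.511950), 2))  # approximately -13.4
```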
Aberration-free FTIR spectroscopic imaging of live cells in microfluidic devices.
Chan, K L Andrew; Kazarian, Sergei G
2013-07-21
The label-free, non-destructive chemical analysis offered by FTIR spectroscopic imaging is a very attractive and potentially powerful tool for studies of live biological cells. FTIR imaging of live cells is a challenging task, due to the fact that cells are cultured in an aqueous environment. While the synchrotron facility has proven to be a valuable tool for FTIR microspectroscopic studies of single live cells, we have demonstrated that high quality infrared spectra of single live cells using an ordinary Globar source can also be obtained by adding a pair of lenses to a common transmission liquid cell. The lenses, when placed on the transmission cell window, form pseudo-hemispheres, which remove the refraction of light and hence improve the imaging and spectral quality of the obtained data. This study demonstrates that infrared spectra of single live cells can be obtained without the focus-shifting effect at different wavenumbers caused by chromatic aberration. Spectra of the single cells have confirmed that the measured spectral region remains in focus across the whole range, while spectra of the single cells measured without the lenses have shown some erroneous features as a result of the shift of focus. It has also been demonstrated that the addition of lenses can be applied to the imaging of cells in microfabricated devices. We have shown that it was not possible to obtain a focused image of an isolated cell in a droplet of DPBS in oil unless the lenses were applied. The use of the approach described herein allows well-focused images of single cells in DPBS droplets to be obtained.
Wireless sEMG-Based Body-Machine Interface for Assistive Technology Devices.
Fall, Cheikh Latyr; Gagnon-Turcotte, Gabriel; Dube, Jean-Francois; Gagne, Jean Simon; Delisle, Yanick; Campeau-Lecours, Alexandre; Gosselin, Clement; Gosselin, Benoit
2017-07-01
Assistive technology (AT) tools and appliances are increasingly used and developed worldwide to improve the autonomy of people living with disabilities and ease interaction with their environment. This paper describes an intuitive, wireless, surface electromyography (sEMG) based body-machine interface for AT tools. Spinal cord injuries at the C5-C8 levels affect patients' control of their arms, forearms, hands, and fingers. Thus, using classical AT control interfaces (keypads, joysticks, etc.) is often difficult or impossible. The proposed system reads the AT users' residual functional capacities through their sEMG activity, and converts them into appropriate commands using a threshold-based control algorithm. It has proven to be suitable as a control alternative for assistive devices and has been tested with the JACO arm, an articulated assistive device whose vocation is to help people living with upper-body disabilities in their daily life activities. The wireless prototype, the architecture of which is based on a 3-channel sEMG measurement system and a 915-MHz wireless transceiver built around a low-power microcontroller, uses low-cost off-the-shelf commercial components. The embedded controller is compared with JACO's regular joystick-based interface, using combinations of forearm, pectoral, masseter, and trapezius muscles. The measured index of performance values are 0.88, 0.51, and 0.41 bits/s, respectively, with correlation coefficients to the Fitts model of 0.75, 0.85, and 0.67. These results demonstrate that the proposed controller offers an attractive alternative to conventional interfaces, such as joystick devices, for upper-body disabled people using ATs such as JACO.
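Index-of-performance figures of this kind are commonly computed from Fitts' law; the sketch below uses the Shannon formulation with invented task parameters and is not necessarily the exact protocol used in the paper.

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1.0)

def throughput(distance, width, movement_time):
    """Index of performance (bits/s) = ID / movement time (s)."""
    return index_of_difficulty(distance, width) / movement_time

# Hypothetical pointing task: 200 mm movement to a 20 mm target in 3.9 s
print(round(throughput(200.0, 20.0, 3.9), 2))  # approximately 0.89 bits/s
```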
Inspection and Verification of Domain Models with PlanWorks and Aver
NASA Technical Reports Server (NTRS)
Bedrax-Weiss, Tania; Frank, Jeremy; Iatauro, Michael; McGann, Conor
2006-01-01
When developing a domain model, it seems natural to bring the traditional informal tools of inspection and verification, debuggers and automated test suites, to bear upon the problems that will inevitably arise. Debuggers that allow inspection of registers and memory and stepwise execution have been a staple of software development of all sorts from the very beginning. Automated testing has repeatedly proven its considerable worth, to the extent that an entire design philosophy (Test Driven Development) has been developed around the writing of tests. Unfortunately, while not entirely without their uses, the limitations of these tools and the nature of the complexity of models and the underlying planning systems make the diagnosis of certain classes of problems and the verification of their solutions difficult or impossible. Debuggers provide a good local view of executing code, allowing a fine-grained look at algorithms and data. This view is, however, usually only at the level of the current scope in the implementation language, and the data-inspection capabilities of most debuggers usually consist of on-line print statements. More modern graphical debuggers offer a sort of tree view of data structures, but even this is too low-level and is often inappropriate for the kinds of structures created by planning systems. For instance, goal or constraint networks are at best awkward when visualized as trees. And any non-structural link between data structures, as through a lookup table, isn't captured at all. Further, while debuggers have powerful breakpointing facilities that are suitable for finding specific algorithmic errors, they have little use in the diagnosis of modeling errors.
Significance of Objective Structured Clinical Examinations to Plastic Surgery Residency Training.
Simmons, Brian J; Zoghbi, Yasmina; Askari, Morad; Birnbach, David J; Shekhter, Ilya; Thaller, Seth R
2017-09-01
Objective structured clinical examinations (OSCEs) have proven to be a powerful tool. They possess more than a 30-year track record in assessing the competency of medical students, residents, and fellows. Objective structured clinical examinations have been used successfully in a variety of medical specialties, including surgery. They have recently found their way into the subspecialty of plastic surgery. This article uses a systematic review of the available literature on OSCEs and their recent use in plastic surgery. It incorporates survey results assessing program directors' views on the use of OSCEs. Approximately 40% of programs surveyed use OSCEs to assess the Accreditation Council for Graduate Medical Education core competencies. We found that 40% use OSCEs to evaluate specific plastic surgery milestones. Objective structured clinical examinations are usually performed annually. They cost anywhere between $100 and more than $1000 per resident. Four milestones giving residents the most difficulties on OSCEs were congenital anomalies, noncancer breast surgery, breast reconstruction, and practice-based learning and improvement. It was determined that challenges with milestones were due to lack of adequate general knowledge and surgical ward patient care, as well as deficits in professionalism and system-based problems. Programs were able to remediate weakness found by OSCEs using a variety of methods. Objective structured clinical examinations offer a unique tool to objectively assess the proficiency of residents in key areas of the Accreditation Council for Graduate Medical Education core competencies. In addition, they can be used to assess the specific milestones that plastic surgery residents must meet. This allows programs to identify and improve identified areas of weakness.
Rosetta Structure Prediction as a Tool for Solving Difficult Molecular Replacement Problems.
DiMaio, Frank
2017-01-01
Molecular replacement (MR), a method for solving the crystallographic phase problem using phases derived from a model of the target structure, has proven extremely valuable, accounting for the vast majority of structures solved by X-ray crystallography. However, when the resolution of data is low, or the starting model is very dissimilar to the target protein, solving structures via molecular replacement may be very challenging. In recent years, protein structure prediction methodology has emerged as a powerful tool in model building and model refinement for difficult molecular replacement problems. This chapter describes some of the tools available in Rosetta for model building and model refinement specifically geared toward difficult molecular replacement cases.
NASA Technical Reports Server (NTRS)
Wissler, Steven S.; Maldague, Pierre; Rocca, Jennifer; Seybold, Calina
2006-01-01
The Deep Impact mission was ambitious and challenging. JPL's well proven, easily adaptable multi-mission sequence planning tools combined with integrated spacecraft subsystem models enabled a small operations team to develop, validate, and execute extremely complex sequence-based activities within very short development times. This paper focuses on the core planning tool used in the mission, APGEN. It shows how the multi-mission design and adaptability of APGEN made it possible to model spacecraft subsystems as well as ground assets throughout the lifecycle of the Deep Impact project, starting with models of initial, high-level mission objectives, and culminating in detailed predictions of spacecraft behavior during mission-critical activities.
Bolt, H. L.; Williams, C. E. J.; Brooks, R. V.; ...
2017-01-13
Hydrophobicity has proven to be an extremely useful parameter in small molecule drug discovery programmes given that it can be used as a predictive tool to enable rational design. For larger molecules, including peptoids, where folding is possible, the situation is more complicated and the average hydrophobicity (as determined by RP-HPLC retention time) may not always provide an effective predictive tool for rational design. Herein, we report the first ever application of partitioning experiments to determine the log D values for a series of peptoids. By comparing log D and average hydrophobicities we highlight the potential advantage of employing the former as a predictive tool in the rational design of biologically active peptoids.
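A shake-flask style partitioning experiment reduces to a simple ratio of concentrations; the sketch below assumes measured concentrations in the organic and aqueous phases (the values are invented) and is not the authors' experimental protocol.

```python
import math

def log_d(conc_organic, conc_aqueous):
    """Distribution coefficient: log D = log10([organic phase]/[aqueous phase]).

    Both concentrations must be in the same units; values below are invented.
    """
    return math.log10(conc_organic / conc_aqueous)

# e.g. 0.8 uM recovered in octanol vs 12.5 uM in aqueous buffer
print(round(log_d(0.8, 12.5), 2))  # approximately -1.19
```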
Delsing, Corine E.; Groenwold, Rolf H. H.; Wegdam-Blans, Marjolijn C. A.; Bleeker-Rovers, Chantal P.; de Jager-Leclercq, Monique G. L.; Hoepelman, Andy I. M.; van Kasteren, Marjo E.; Buijs, Jacqueline; Renders, Nicole H. M.; Nabuurs-Franssen, Marrigje H.; Oosterheert, Jan Jelrik; Wever, Peter C.
2014-01-01
Coxiella burnetii causes Q fever, a zoonosis, which has acute and chronic manifestations. From 2007 to 2010, the Netherlands experienced a large Q fever outbreak, which has offered a unique opportunity to analyze chronic Q fever cases. In an observational cohort study, baseline characteristics and clinical characteristics, as well as mortality, of patients with proven, probable, or possible chronic Q fever in the Netherlands, were analyzed. In total, 284 chronic Q fever patients were identified, of which 151 (53.7%) had proven, 64 (22.5%) probable, and 69 (24.3%) possible chronic Q fever. Among proven and probable chronic Q fever patients, vascular infection focus (56.7%) was more prevalent than endocarditis (34.9%). An acute Q fever episode was recalled by 27.0% of the patients. The all-cause mortality rate was 19.1%, while the chronic Q fever-related mortality rate was 13.0%, with mortality rates of 9.3% among endocarditis patients and 18% among patients with a vascular focus of infection. Increasing age (P = 0.004 and 0.010), proven chronic Q fever (P = 0.020 and 0.002), vascular chronic Q fever (P = 0.024 and 0.005), acute presentation with chronic Q fever (P = 0.002 and P < 0.001), and surgical treatment of chronic Q fever (P = 0.025 and P < 0.001) were significantly associated with all-cause mortality and chronic Q fever-related mortality, respectively. PMID:24599987
Open access chemical probes for epigenetic targets
Brown, Peter J; Müller, Susanne
2015-01-01
Background: High attrition rates in drug discovery call for new approaches to improve target validation. Academia is filling gaps, but often lacks the experience and resources of the pharmaceutical industry, resulting in poorly characterized tool compounds. Discussion: The SGC has established an open access chemical probe consortium, currently encompassing ten pharmaceutical companies. One of its mandates is to create well-characterized inhibitors (chemical probes) for epigenetic targets to enable new biology and target validation for drug development. Conclusion: Epigenetic probe compounds have proven to be very valuable and have not only spurred a plethora of novel biological findings, but also provided starting points for clinical trials. These probes have proven to be a critical complement to traditional genetic targeting strategies and have sometimes provided surprising results. PMID:26397018
Framework for ReSTful Web Services in OSGi
NASA Technical Reports Server (NTRS)
Shams, Khawaja S.; Norris, Jeffrey S.; Powell, Mark W.; Crockett, Thomas M.; Mittman, David S.; Fox, Jason M.; Joswig, Joseph C.; Wallick, Michael N.; Torres, Recaredo J.; Rabe, Kenneth
2009-01-01
Ensemble ReST is a software system that eases the development, deployment, and maintenance of server-side application programs to perform functions that would otherwise be performed by client software. Ensemble ReST takes advantage of the proven disciplines of ReST (Representational State Transfer). ReST leverages the standardized HTTP protocol to enable developers to offer services to a diverse variety of clients: from shell scripts to sophisticated Java application suites.
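Ensemble ReST itself is not shown here, but the ReST pattern it builds on can be illustrated with a minimal HTTP resource; the sketch below uses Python with Flask, and the /telemetry route and payload are hypothetical rather than part of the system described above.

```python
from flask import Flask, jsonify

# Generic ReST-style sketch: a resource identified by a URL, retrieved
# with a plain HTTP GET. Route and data are invented for illustration.
app = Flask(__name__)

TELEMETRY = {"battery": {"voltage": 28.1, "units": "V"}}

@app.route("/telemetry/<channel>", methods=["GET"])
def get_channel(channel):
    """GET returns the current representation of the named resource."""
    if channel not in TELEMETRY:
        return jsonify({"error": "unknown channel"}), 404
    return jsonify(TELEMETRY[channel])

if __name__ == "__main__":
    app.run(port=8080)
```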
The Military Railroads of the Civil War and Their Great Leaders.
these leadership styles as part of its curriculum to expose others to those traits which have proven to be successful. In the majority of instances...of leadership styles in the combat service and combat service support specialties. It is therefore a major contention that our curriculum does not...offer our students the broadest exposure to leadership styles. This study is intended to do that. Using the Military Railroads as the support function
A Proven Ground System Architecture for Promoting Collaboration and Common Solutions at NASA
NASA Technical Reports Server (NTRS)
Smith, Danford
2005-01-01
Requirement: Improve how NASA develops and maintains ground data systems for dozens of missions, with a couple new missions always in the development phase. Decided in 2001 on enhanced message-bus architecture. Users offered choices for major components. They plug and play because key interfaces are all the same. Can support COTS, heritage, and new software. Even the middleware can be switched. Project name: GMSEC. Goddard Mission Services Evolution Center.
ERIC Educational Resources Information Center
Adams, Dennis; Hamm, Mary
2011-01-01
"Shaping the Future with Math, Science, and Technology" examines how ingenuity, creativity, and teamwork skills are part of an intellectual toolbox associated with math, science, and technology. The book provides new ideas, proven processes, practical tools, and examples useful to educators who want to encourage students to solve problems and…
USDA-ARS?s Scientific Manuscript database
Microbial contamination of waters is a critical public health issue. Watershed-scale, process-based modeling of bacteria fate and transport (F&T) has proven to be a useful tool for predicting microbial water quality and evaluating management practices. The objective of this work is...
Examining Evolving Performance on the Force Concept Inventory Using Factor Analysis
ERIC Educational Resources Information Center
Semak, M. R.; Dietz, R. D.; Pearson, R. H.; Willis, C. W
2017-01-01
The application of factor analysis to the "Force Concept Inventory" (FCI) has proven to be problematic. Some studies have suggested that factor analysis of test results serves as a helpful tool in assessing the recognition of Newtonian concepts by students. Other work has produced at best ambiguous results. For the FCI administered as a…
ERIC Educational Resources Information Center
Mohamed, Fahim; Abdeslam, Jakimi; Lahcen, El Bermi
2017-01-01
Virtual Environments for Training (VET) are useful tools for visualization, discovery as well as for training. VETs are based on virtual reality technique to put learners in training situations that emulate genuine situations. VETs have proven to be advantageous in putting learners into varied training situations to acquire knowledge and…
Understanding Nature-Related Behaviors among Children through a Theory of Reasoned Action Approach
ERIC Educational Resources Information Center
Gotch, Chad; Hall, Troy
2004-01-01
The Theory of Reasoned Action has proven to be a valuable tool for predicting and understanding behavior and, as such, provides a potentially important basis for environmental education program design. This study used a Theory of Reasoned Action approach to examine a unique type of behavior (nature-related activities) and a unique population…
Effect of overstorey trees on understorey vegetation in California (USA) ponderosa pine plantations
Jianwei Zhang; David H. Young; William W. Oliver; Gary O. Fiddler
2016-01-01
Understorey vegetation plays a significant role in the structure and function of forest ecosystems. Controlling understorey vegetation has proven to be an effective tool for increasing tree growth and overstorey development. However, the long-term consequences of this practice for plant diversity are not fully understood. Here, we analyzed early development of overstorey and...
Generation of germline ablated male pigs by CRISPR/Cas9 editing of the NANOS2 gene
USDA-ARS?s Scientific Manuscript database
Genome editing tools have revolutionized the generation of genetically modified animals including livestock. In particular, the domestic pig is a proven model of human physiology and an agriculturally important species. In this study, we utilized the CRISPR/Cas9 system to edit the NANOS2 gene in p...
Using Bayesian Stable Isotope Mixing Models to Enhance Marine Ecosystem Models
The use of stable isotopes in food web studies has proven to be a valuable tool for ecologists. We investigated the use of Bayesian stable isotope mixing models as constraints for an ecosystem model of a temperate seagrass system on the Atlantic coast of France. δ13C and δ15N i...
Soil vapor extraction (SVE) and bioventing (BV) are proven strategies for remediation of unsaturated zone soils. Mathematical models are powerful tools that can be used to integrate and quantify the interaction of physical, chemical, and biological processes occurring in field sc...
ERIC Educational Resources Information Center
Chang, Y. C.; Peng, H. Y.; Chao, H. C.
2010-01-01
In recent years, games have been proven to be an effective tool in supplementing traditional teaching methods. Through game playing, students can strengthen their cognitive-recognition architecture and can gain satisfaction as well as a sense of achievement. This study presents a conceptual framework for examining various effective strategies by…
ERIC Educational Resources Information Center
2001
This guide contains all of the information, support and tools that community members need to implement "Talking About Mental Illness" in their community--an awareness program proven to be effective in bringing about positive change in young people's knowledge about mental illness, and in reducing stigma that surrounds mental illness. The…
From Newton to Gates--Digital Principia
ERIC Educational Resources Information Center
Beckwith, E. George; Cunniff, Daniel T.
2008-01-01
Computers are becoming the norm for teaching and learning. The Internet gives people ready access to text, visual and audio messages from around the world. For teachers, content is critical and the future dictates the need for major changes in the role of the teacher and learner. Today's digital tools and video games have proven to be well known…
ERIC Educational Resources Information Center
Emelyanova, Natalya; Voronina, Elena
2014-01-01
Learning management systems (LMS) have been proven to encourage a constructive approach to knowledge acquisition and support active learning. One of the keys to successful and efficient use of LMS is how the stakeholders adopt and perceive this learning tool. The present research is therefore motivated by the importance of understanding teachers'…
Silviculture of ponderosa pine in the Black Hills: The status of our knowledge
Charles E. Boldt; James L. Van Deusen
1974-01-01
This Paper, intended as a guide for professional foresters, describes major silvicultural conditions likely to be encountered in the Black Hills, reasonable treatment options, and probable results and implications of these treatments. It also describes silvical characteristics and behavior of Black Hills ponderosa pine, and a variety of proven silvicultural tools....
Methods Beyond Methods: A Model for Africana Graduate Methods Training.
Best, Latrica E; Byrd, W Carson
2014-06-01
A holistic graduate education can impart not just tools and knowledge, but critical positioning to fulfill many of the original missions of Africana Studies programs set forth in the 1960s and 1970s. As an interdisciplinary field with many approaches to examining the African Diaspora, the methodological training of graduate students can vary across graduate programs. Although qualitative methods courses are often required of graduate students in Africana Studies programs, and these programs offer such courses, graduate students in these programs are rarely if ever required to take quantitative methods courses, let alone have these courses offered in-house. These courses can offer Africana Studies graduate students new tools for their own research and, more importantly, improve their knowledge of quantitative research on diasporic communities. These tools and knowledge can assist with identifying flawed arguments about African-descended communities and their members. This article explores the importance of requiring and offering critical quantitative methods courses in graduate programs in Africana Studies, and discusses the methods requirements of one graduate program in the field as an example of the more rigorous training that other programs could offer graduate students.
Waller, Sarah; Masterson, Abigail; Evans, Simon C
2017-02-01
The need for more dementia friendly design in hospitals and other care settings is now widely acknowledged. Working with 26 NHS Trusts in England as part of a Department of Health commissioned programme, The King's Fund developed a set of overarching design principles and an environmental assessment tool for hospital wards in 2012. Following requests from other sectors, additional tools were developed for hospitals, care homes, health centres and housing with care. The tools have proven to be effective in both disseminating the principles of dementia friendly design and in enabling the case to be made for improvements that have a positive effect on patient outcomes and staff morale. This paper reports on the development, use and review of the environmental assessment tools, including further work that is now being taken forward by The Association for Dementia Studies, University of Worcester.
Organizational Readiness Tools for Global Health Intervention: A Review
Dearing, James W.
2018-01-01
The ability of non-governmental organizations, government agencies, and corporations to deliver and support the availability and use of interventions for improved global public health depends on their readiness to do so. Yet readiness has proven to be a rather fluid concept in global public health, perhaps due to its multidimensional nature and because scholars and practitioners have applied the concept at different levels such as the individual, organization, and community. This review concerns 30 publicly available tools created for the purpose of organizational readiness assessment in order to carry out global public health objectives. Results suggest that these tools assess organizational capacity in the absence of measuring organizational motivation, thus overlooking a key aspect of organizational readiness. Moreover, the tools reviewed are mostly untested by their developers to establish whether the tools do, in fact, measure capacity. These results suggest opportunities for implementation science researchers. PMID:29552552
On Improving Efficiency of Differential Evolution for Aerodynamic Shape Optimization Applications
NASA Technical Reports Server (NTRS)
Madavan, Nateri K.
2004-01-01
Differential Evolution (DE) is a simple and robust evolutionary strategy that has proven effective in determining the global optimum for several difficult optimization problems. Although DE offers several advantages over traditional optimization approaches, its use in applications such as aerodynamic shape optimization, where the objective function evaluations are computationally expensive, is limited by the large number of function evaluations often required. In this paper various approaches for improving the efficiency of DE are reviewed and discussed. Several approaches that have proven effective for other evolutionary algorithms are modified and implemented in a DE-based aerodynamic shape optimization method that uses a Navier-Stokes solver for the objective function evaluations. Parallelization techniques on distributed computers are used to reduce turnaround times. Results are presented for standard test optimization problems and for the inverse design of a turbine airfoil. The efficiency improvements achieved by the different approaches are evaluated and compared.
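For context, a generic DE/rand/1/bin loop is sketched below with a toy objective standing in for the expensive Navier-Stokes evaluation; the parameter values are typical defaults, not those used in the paper.

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimal DE/rand/1/bin sketch for minimizing f over box bounds."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = len(lo)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    cost = np.array([f(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # mutation: three distinct individuals other than i
            idx = rng.choice([j for j in range(pop_size) if j != i],
                             3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # binomial crossover, forcing at least one mutant gene
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            # greedy selection
            trial_cost = f(trial)
            if trial_cost <= cost[i]:
                pop[i], cost[i] = trial, trial_cost
    best = int(np.argmin(cost))
    return pop[best], cost[best]

def sphere(x):
    """Toy objective standing in for an expensive CFD evaluation."""
    return float(np.sum(x ** 2))

x_best, f_best = differential_evolution(sphere, [(-5, 5)] * 4)
print(x_best, f_best)
```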
Remote Attitude Measurement Sensor (RAMS)
NASA Technical Reports Server (NTRS)
Davis, H. W.
1989-01-01
Remote attitude measurement sensor (RAMS) offers a low-cost, low-risk, proven design concept that is based on mature, demonstrated space sensor technology. The electronic design concepts and interpolation algorithms were tested and proven in space hardware such as the Retroreflector Field Tracker and various star trackers. The RAMS concept is versatile and has broad applicability to both ground testing and spacecraft needs. It is ideal for use as a precision laboratory sensor for structural dynamics testing. It requires very little set-up or preparation time, and the output data are immediately usable without integration or extensive analysis efforts. For on-orbit use, RAMS rivals any other type of dynamic structural sensor (accelerometer, lidar, photogrammetric techniques, etc.) for overall performance, reliability, suitability, and cost. Widespread acceptance and extensive usage of RAMS will occur only after some interested agency, such as OAST, adopts the RAMS concept and provides the funding support necessary for further development and implementation of RAMS for a specific program.
Ultrasonic flowmeters offer oil line leak-detection potential
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hettrich, U.
1995-04-01
Ultrasonic flowmeters (USFM) installed on Transalpine Pipeline Co.'s (TAL) crude-oil system have proven to be a cost-effective flow measurement technique and beneficial in batch identification and leak detection. Through close examination, TAL has determined that clamp-on USFMs offer cost-saving advantages in installation, maintenance and operation. USFMs do not disturb pig passage. The technique also provides sound velocity capabilities, which can be used for liquid identification and batch tracking. The instruments have a repeatability of better than 0.25% and achieve an accuracy of better than 1%, depending on the flow profile's predictability. Using USFMs with multiple beams probably will improve accuracy further, and it should be possible to find leaks even smaller than 1% of flow.
Chepelev, Leonid L; Dumontier, Michel
2011-05-19
Over the past several centuries, chemistry has permeated virtually every facet of human lifestyle, enriching fields as diverse as medicine, agriculture, manufacturing, warfare, and electronics, among numerous others. Unfortunately, application-specific, incompatible chemical information formats and representation strategies have emerged as a result of such diverse adoption of chemistry. Although a number of efforts have been dedicated to unifying the computational representation of chemical information, disparities between the various chemical databases still persist and stand in the way of cross-domain, interdisciplinary investigations. Through a common syntax and formal semantics, Semantic Web technology offers the ability to accurately represent, integrate, reason about and query across diverse chemical information. Here we specify and implement the Chemical Entity Semantic Specification (CHESS) for the representation of polyatomic chemical entities, their substructures, bonds, atoms, and reactions using Semantic Web technologies. CHESS provides means to capture aspects of their corresponding chemical descriptors, connectivity, functional composition, and geometric structure while specifying mechanisms for data provenance. We demonstrate that using our readily extensible specification, it is possible to efficiently integrate multiple disparate chemical data sources, while retaining appropriate correspondence of chemical descriptors, with very little additional effort. We demonstrate the impact of some of our representational decisions on the performance of chemically-aware knowledgebase searching and rudimentary reaction candidate selection. Finally, we provide access to the tools necessary to carry out chemical entity encoding in CHESS, along with a sample knowledgebase. By harnessing the power of Semantic Web technologies with CHESS, it is possible to provide a means of facile cross-domain chemical knowledge integration with full preservation of data correspondence and provenance. Our representation builds on existing cheminformatics technologies and, by the virtue of RDF specification, remains flexible and amenable to application- and domain-specific annotations without compromising chemical data integration. We conclude that the adoption of a consistent and semantically-enabled chemical specification is imperative for surviving the coming chemical data deluge and supporting systems science research.
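A rough illustration of the idea, independent of the published CHESS vocabulary, is to encode a chemical entity and its provenance as RDF triples and query them with SPARQL; the namespace URI and property names below are placeholders, not the actual specification.

```python
from rdflib import Graph, Namespace, Literal, URIRef
from rdflib.namespace import RDF

# Placeholder namespace standing in for a chemical-entity vocabulary.
CHESS = Namespace("http://example.org/chess#")

g = Graph()
ethanol = URIRef("http://example.org/entity/ethanol")
g.add((ethanol, RDF.type, CHESS.ChemicalEntity))
g.add((ethanol, CHESS.inchi,
       Literal("InChI=1S/C2H6O/c1-2-3/h3H,2H2,1H3")))
# Provenance: record which upstream source this description came from.
g.add((ethanol, CHESS.derivedFrom, URIRef("http://example.org/source/pubchem")))

# SPARQL query over the (toy) integrated graph
query = "SELECT ?e WHERE { ?e a <http://example.org/chess#ChemicalEntity> }"
for row in g.query(query):
    print(row.e)
```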
Novel MixSIAR fingerprint model implementation in a Mediterranean mountain catchment
NASA Astrophysics Data System (ADS)
Lizaga, Ivan; Gaspar, Leticia; Blake, William; Palazón, Leticia; Quijano, Laura; Navas, Ana
2017-04-01
Increased sediment erosion can degrade water and food quality, reduce aquatic biodiversity, decrease reservoir capacity and restrict recreational usage, but determining soil redistribution and sediment budgets in watersheds is often challenging. One method for making such determinations is sediment fingerprinting, which uses sediment properties as tracers. The fingerprinting procedure tests a range of source material tracer properties to select a subset that can discriminate between the different potential sediment sources. The present study aims to test the feasibility of geochemical and radioisotopic fingerprint properties for apportioning sediment sources within the Barués catchment. For this purpose, the new MixSIAR unmixing model was implemented as the statistical tool. A total of 98 soil samples from different land cover sources (Mediterranean forest, pine forest, scrubland, agricultural land and subsoil) were collected in the Barués catchment (23 km2). This approach divides the catchment into six sub-catchments to evaluate how sediment provenance and the percentage contributions of its sources vary along the river, rather than only the contribution at the outlet. For this purpose, target sediments were collected at the end of each sub-catchment to capture the variation along the entire catchment. Geochemistry and radioisotopic activity were analyzed for each sample and introduced as input parameters to the model. Percentage contributions from the five sources differed among the sub-catchments, and their variations are integrated in the final target sample located at the end of the catchment. This work represents a good approximation to fine sediment provenance in Mediterranean agricultural catchments and has the potential to be used for water resource control and future soil management. Identifying sediment contributions from different land uses offers considerable potential to prevent environmental degradation and the decline in food production and quality.
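MixSIAR is a Bayesian model, but the core unmixing idea can be sketched with a crude frequentist analogue: non-negative least squares over source tracer means followed by renormalisation. The tracer values below are invented, and no source or residual uncertainty is propagated, so this is only an illustration of the concept.

```python
import numpy as np
from scipy.optimize import nnls

def unmix(source_means, mixture):
    """Crude tracer unmixing: non-negative least squares, then renormalise
    so source proportions sum to 1.

    `source_means` is (n_tracers x n_sources); `mixture` is (n_tracers,).
    A Bayesian mixing model would additionally carry source variability
    and residual error, which this sketch ignores.
    """
    props, _ = nnls(np.asarray(source_means, float),
                    np.asarray(mixture, float))
    total = props.sum()
    return props / total if total > 0 else props

# Invented tracer concentrations: 3 tracers x 4 sources, one target sediment
sources = np.array([[12.0,  5.0,  8.0,  3.0],
                    [ 1.2,  0.4,  0.9,  0.2],
                    [30.0, 55.0, 40.0, 70.0]])
target = np.array([8.5, 0.8, 45.0])
print(np.round(unmix(sources, target), 3))
```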
NASA Astrophysics Data System (ADS)
Radhakrishnan, A.; Balaji, V.; Schweitzer, R.; Nikonov, S.; O'Brien, K.; Vahlenkamp, H.; Burger, E. F.
2016-12-01
There are distinct phases in the development cycle of an Earth system model. During the model development phase, scientists make changes to code and parameters and require rapid access to results for evaluation. During the production phase, scientists may make an ensemble of runs with different settings, and produce large quantities of output, that must be further analyzed and quality controlled for scientific papers and submission to international projects such as the Climate Model Intercomparison Project (CMIP). During this phase, provenance is a key concern: being able to track back from outputs to inputs. We will discuss one of the paths taken at GFDL in delivering tools across this lifecycle, offering on-demand analysis of data by integrating the use of GFDL's in-house FRE-Curator, Unidata's THREDDS and NOAA PMEL's Live Access Servers (LAS). Experience over this lifecycle suggests that a major difficulty in developing analysis capabilities is only partly the scientific content; much of the effort is devoted to answering the questions "where is the data?" and "how do I get to it?". "FRE-Curator" is the name of a database-centric paradigm used at NOAA GFDL to ingest information about the model runs into an RDBMS (Curator database). The components of FRE-Curator are integrated into the Flexible Runtime Environment workflow and can be invoked during climate model simulation. The front end to FRE-Curator, known as the Model Development Database Interface (MDBI), provides in-house web-based access to GFDL experiments: metadata, analysis output and more. In order to provide on-demand visualization, MDBI uses Live Access Servers, a highly configurable web server designed to provide flexible access to geo-referenced scientific data that makes use of OPeNDAP. Model output saved in GFDL's tape archive, the size of the database and experiments, and continuous model development initiatives with more dynamic configurations add complexity and challenges in providing an on-demand visualization experience to our GFDL users.
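On the client side, data served through THREDDS/LAS endpoints is typically reachable over OPeNDAP; a minimal sketch using xarray follows, with a placeholder URL and variable name rather than an actual GFDL service.

```python
import xarray as xr

# Hypothetical OPeNDAP endpoint exposed by a THREDDS/LAS server; the URL
# and the variable name "tas" are placeholders for this illustration.
URL = "http://example.org/thredds/dodsC/experiment42/atmos_monthly"

ds = xr.open_dataset(URL)  # lazily opens the remote dataset over OPeNDAP
# Decadal time-mean of a (hypothetical) surface air temperature field
tas_mean = ds["tas"].sel(time=slice("2000-01", "2009-12")).mean("time")
print(tas_mean)
```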
2009-04-01
Geisinger's system of care can be seen as a microcosm of the national delivery of healthcare, with implications for decision makers in other health plans. In this interview, Dr Ronald A. Paulus focuses on Geisinger's unique approach to patient care. At its core, this approach represents a system of quality and value initiatives based on 3 major programs: Proven Health Navigation (medical home), the ProvenCare model, and transitions of care. The goal of such an approach is to optimize disease management by using a rational reimbursement paradigm for appropriate interventions, providing innovative incentives, and engaging patients in their own care as part of any intervention. Dr Paulus explains the reasons why, unlike Geisinger, other stakeholders, including payers, providers, patients, and employers, have no intrinsic reason to be concerned with quality and value initiatives. In addition, he says, an electronic infrastructure that can be modified as management paradigms evolve is a necessary tool to ensure the healthcare delivery system's ability to adapt to new clinical realities quickly and to continue delivering best value for all stakeholders.
Signatures of mountain building: Detrital zircon U/Pb ages from northeast Tibet
Lease, Richard O.; Burbank, Douglas W.; Gehrels, George E.; Wang, Zhicai; Yuan, Daoyang
2007-01-01
Although detrital zircon has proven to be a powerful tool for determining provenance, past work has focused primarily on delimiting regional source terranes. Here we explore the limits of spatial resolution and stratigraphic sensitivity of detrital zircon in ascertaining provenance, and we demonstrate its ability to detect source changes for terranes separated by only a few tens of kilometers. For such an analysis to succeed for a given mountain, discrete intrarange source terranes must have unique U/Pb zircon age signatures and sediments eroded from the range must have well-defined depositional ages. Here we use ∼1400 single-grain U/Pb zircon ages from northeastern Tibet to identify and analyze an area that satisfies these conditions. This analysis shows that the edges of intermontane basins are stratigraphically sensitive to discrete, punctuated changes in local source terranes. By tracking eroding rock units chronologically through the stratigraphic record, this sensitivity permits the detection of the differential rock uplift and progressive erosion that began ca. 8 Ma in the Laji Shan, a 10-25-km-wide range in northeastern Tibet with a unique U/Pb age signature.
Matabosch, Xavier; Ying, Lee; Serra, Montserrat; Wassif, Christopher A.; Porter, Forbes D.; Shackleton, Cedric; Watson, Gordon
2010-01-01
Smith-Lemli-Opitz syndrome (SLOS) is caused by deficiency in the terminal step of cholesterol biosynthesis: the conversion of 7-dehydrocholesterol (7DHC) to cholesterol (C), catalyzed by 7-dehydrocholesterol reductase (DHCR7). This disorder exhibits several phenotypic traits including dysmorphia and mental retardation with a broad range of severity. There are few proven treatment options. That most commonly used is a high cholesterol diet that seems to enhance the quality of life and improve behavioral characteristics of patients, although these positive effects are controversial. The goal of our study was to investigate the possibility of restoring DHCR7 activity by gene transfer. We constructed an adeno-associated virus (AAV) vector containing the DHCR7 gene. After we infused this vector into affected mice, the introduced DHCR7 gene could be identified in liver, mRNA was expressed and a functional enzyme was produced. Evidence of functionality came from the ability to partially normalize the serum ratio of 7DHC/C in treated animals, apparently by increasing cholesterol production with concomitant decrease in 7DHC precursor. By five weeks after treatment the mean ratio (for 7 animals) had fallen to 0.05 while the ratio for untreated littermate controls had risen to 0.14. This provides proof of principle that gene transfer can ameliorate the genetic defect causing SLOS and provides a new experimental tool for studying the pathogenesis of this disease. If effective in humans, it might also offer a possible alternative to exogenous cholesterol therapy. However, it would not offer a complete cure for the disorder as many of the negative implications of defective synthesis are already established during prenatal development. PMID:20800683
Technology Advances Enabling a New Class of Hybrid Underwater Vehicles
NASA Astrophysics Data System (ADS)
Bowen, A.
2016-02-01
Both tethered (ROV) and untethered (AUV) systems have proven to be highly valuable tools for a range of applications undersea. Certain enabling technologies, coupled with recent advances in robotic systems, make it possible to consider supplementing many of the functions performed by these platforms with appropriately designed semi-autonomous vehicles that may be less expensive to operate than traditional deep-water ROVs. Such vehicles can be deployed from smaller ships and may lead to sea-floor resident systems able to perform a range of interventions under direct human control when required. These systems are effectively a hybrid cross between ROV and AUV vehicles and are poised to enable an important new class of undersea vehicle. It is now possible to radically redefine the meaning of the words "tethered vehicle" to include virtual tethering via acoustic and optical means or through the use of small-diameter re-usable tethers that provide not power but only high-bandwidth communications. The recent developments at Woods Hole Oceanographic Institution (WHOI) pave the way for a derivative vehicle type able to perform a range of interventions in deep water. Such battery-powered, hybrid-tethered vehicles will be able to perform tasks that might otherwise require a conventional ROV. These functions will be possible from less complex ships because of a greatly reduced dependence on large, heavy tethers and associated vehicle handling equipment. In certain applications, such vehicles can be resident within subsea facilities, able to provide operators with near-instant access when required. Several key emerging technologies and capabilities make such a vehicle possible. Advances in both acoustic and optical "wireless" underwater communications and micro-tethers, as pioneered by the HROV Nereus, offer the potential to transform ROV-type operations and thus offer planners and designers an important new dimension to subsea robotic intervention.
Parikh, Priti P; Minning, Todd A; Nguyen, Vinh; Lalithsena, Sarasi; Asiaee, Amir H; Sahoo, Satya S; Doshi, Prashant; Tarleton, Rick; Sheth, Amit P
2012-01-01
Research on the biology of parasites requires a sophisticated and integrated computational platform to query and analyze large volumes of data, representing both unpublished (internal) and public (external) data sources. Effective analysis of an integrated data resource using knowledge discovery tools would significantly aid biologists in conducting their research, for example, through identifying various intervention targets in parasites and in deciding the future direction of ongoing as well as planned projects. A key challenge in achieving this objective is the heterogeneity between the internal lab data, usually stored as flat files, Excel spreadsheets or custom-built databases, and the external databases. Reconciling the different forms of heterogeneity and effectively integrating data from disparate sources is a nontrivial task for biologists and requires a dedicated informatics infrastructure. Thus, we developed an integrated environment using Semantic Web technologies that may provide biologists the tools for managing and analyzing their data, without the need for acquiring in-depth computer science knowledge. We developed a semantic problem-solving environment (SPSE) that uses ontologies to integrate internal lab data with external resources in a Parasite Knowledge Base (PKB), which has the ability to query across these resources in a unified manner. The SPSE includes Web Ontology Language (OWL)-based ontologies, experimental data with its provenance information represented using the Resource Description Format (RDF), and a visual querying tool, Cuebee, that features integrated use of Web services. We demonstrate the use and benefit of SPSE using example queries for identifying gene knockout targets of Trypanosoma cruzi for vaccine development. Answers to these queries involve looking up multiple sources of data, linking them together and presenting the results. The SPSE facilitates parasitologists in leveraging the growing, but disparate, parasite data resources by offering an integrative platform that utilizes Semantic Web techniques, while keeping their workload increase minimal.
QUADrATiC: scalable gene expression connectivity mapping for repurposing FDA-approved therapeutics.
O'Reilly, Paul G; Wen, Qing; Bankhead, Peter; Dunne, Philip D; McArt, Darragh G; McPherson, Suzanne; Hamilton, Peter W; Mills, Ken I; Zhang, Shu-Dong
2016-05-04
Gene expression connectivity mapping has proven to be a powerful and flexible tool for research. Its application has been shown in a broad range of research topics, most commonly as a means of identifying potential small molecule compounds, which may be further investigated as candidates for repurposing to treat diseases. The public release of voluminous data from the Library of Integrated Cellular Signatures (LINCS) programme further enhanced the utilities and potentials of gene expression connectivity mapping in biomedicine. We describe QUADrATiC ( http://go.qub.ac.uk/QUADrATiC ), a user-friendly tool for the exploration of gene expression connectivity on the subset of the LINCS data set corresponding to FDA-approved small molecule compounds. It enables the identification of compounds for repurposing therapeutic potentials. The software is designed to cope with the increased volume of data over existing tools, by taking advantage of multicore computing architectures to provide a scalable solution, which may be installed and operated on a range of computers, from laptops to servers. This scalability is provided by the use of the modern concurrent programming paradigm provided by the Akka framework. The QUADrATiC Graphical User Interface (GUI) has been developed using advanced Javascript frameworks, providing novel visualization capabilities for further analysis of connections. There is also a web services interface, allowing integration with other programs or scripts. QUADrATiC has been shown to provide an improvement over existing connectivity map software, in terms of scope (based on the LINCS data set), applicability (using FDA-approved compounds), usability and speed. It offers potential to biological researchers to analyze transcriptional data and generate potential therapeutics for focussed study in the lab. QUADrATiC represents a step change in the process of investigating gene expression connectivity and provides more biologically-relevant results than previous alternative solutions.
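As a toy illustration of connectivity scoring in general (not the scoring scheme implemented in QUADrATiC), the sketch below matches an up/down query gene signature against a signed-rank reference signature; gene names and rank values are invented.

```python
def connection_score(reference_ranks, up_genes, down_genes):
    """Toy signed-rank connectivity score in [-1, 1].

    `reference_ranks` maps gene -> signed rank in a reference (compound)
    signature (positive = up-regulated by the compound; larger magnitude =
    stronger). Query up-genes matching compound up-regulation, and query
    down-genes matching compound down-regulation, push the score positive.
    """
    raw = sum(reference_ranks.get(g, 0.0) for g in up_genes) \
        - sum(reference_ranks.get(g, 0.0) for g in down_genes)
    # Normalise by the largest score achievable with this many query genes.
    n_query = len(up_genes) + len(down_genes)
    max_abs = sum(sorted((abs(v) for v in reference_ranks.values()),
                         reverse=True)[:n_query])
    return raw / max_abs if max_abs else 0.0

ref = {"TP53": 3.0, "MYC": -2.0, "EGFR": 1.0, "CDK1": -4.0}
print(round(connection_score(ref, up_genes=["TP53"], down_genes=["CDK1"]), 3))
```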
Let your fingers do the walking: The project's most invaluable tool
NASA Technical Reports Server (NTRS)
Zirk, Deborah A.
1993-01-01
The barrage of information pertaining to the software being developed for a project can be overwhelming. Current status information, as well as the statistics and history of software releases, should be 'at the fingertips' of project management and key technical personnel. This paper discusses the development, configuration, capabilities, and operation of a relational database, the System Engineering Database (SEDB), which was designed to assist management in monitoring the tasks performed by the Network Control Center (NCC) Project. This database has proven to be an invaluable project tool and is utilized daily to support all project personnel.
Exploring biology with small organic molecules
Stockwell, Brent R.
2011-01-01
Small organic molecules have proven to be invaluable tools for investigating biological systems, but there is still much to learn from their use. To discover and to use more effectively new chemical tools to understand biology, strategies are needed that allow us to systematically explore ‘biological-activity space’. Such strategies involve analysing both protein binding of, and phenotypic responses to, small organic molecules. The mapping of biological-activity space using small molecules is akin to mapping the stars — uncharted territory is explored using a system of coordinates that describes where each new feature lies. PMID:15602550
The Principles of Engineering Immune Cells to Treat Cancer
Lim, Wendell A.; June, Carl H.
2017-01-01
Chimeric antigen receptor (CAR) T cells have proven that engineered immune cells can serve as a powerful new class of cancer therapeutics. Clinical experience has helped to define the major challenges that must be met to make engineered T cells a reliable, safe, and effective platform that can be deployed against a broad range of tumors. The emergence of synthetic biology approaches for cellular engineering is providing us with a broadly expanded set of tools for programming immune cells. We discuss how these tools could be used to design the next generation of smart T cell precision therapeutics. PMID:28187291
MSX-3D: a tool to validate 3D protein models using mass spectrometry.
Heymann, Michaël; Paramelle, David; Subra, Gilles; Forest, Eric; Martinez, Jean; Geourjon, Christophe; Deléage, Gilbert
2008-12-01
The technique of chemical cross-linking followed by mass spectrometry has proven to provide valuable information about protein structure and interactions between protein subunits. It is an effective and efficient way to experimentally investigate some aspects of a protein structure when NMR and X-ray crystallography data are lacking. We introduce MSX-3D, a tool specifically geared to validating protein models using mass spectrometry. In addition to classical peptide identification, it allows an interactive 3D visualization of the distance constraints derived from a cross-linking experiment. Freely available at http://proteomics-pbil.ibcp.fr
On the Razor’s Edge: Establishing Indistinct Thresholds for Military Power in Cyberspace
2012-04-23
attribution of the cyber threat, offer mitigation techniques, and perform network intrusion diagnosis.”21 However, to date these efforts have proven... that use radio signal to insert coding into networks remotely.”22 Additionally, USCYBERCOM intends to deploy Cyber Support Elements (CSEs) to each... a state actor, DoD could conduct cruise missile strikes, deploy special operating forces, or use unmanned drones against the adversary's cyber
NASA Astrophysics Data System (ADS)
Adesta, Erry Yulian T.; Riza, Muhammad; Avicena
2018-03-01
Tool wear prediction plays a significant role in the machining industry for proper planning and control of machining parameters and optimization of cutting conditions. This paper investigates the effect of two tool path strategies, contour-in and zigzag, on tool wear during the pocket milling process. The experiments were carried out on a CNC vertical machining centre using PVD coated carbide inserts. Cutting speed, feed rate and depth of cut were varied. For this experiment with three factors at three levels, a Response Surface Method (RSM) design of experiment with a standard Central Composite Design (CCD) was employed. The results indicate that tool wear increases significantly over the higher range of feed per tooth compared with cutting speed and depth of cut. This experimental result is then confirmed statistically by developing an empirical model. The prediction model for tool wear with the contour-in strategy developed in this research shows good agreement with the experimental work.
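As an illustration of the kind of empirical (response-surface) model described, the sketch below fits a polynomial model of flank wear against cutting speed, feed per tooth and depth of cut by ordinary least squares. The design points and wear values are invented placeholders, not the paper's measurements, and the reduced model (main effects plus two-factor interactions) is an arbitrary choice.

```python
# Sketch: fitting a response-surface-style empirical model
# VB = b0 + b1*v + b2*f + b3*d + b4*v*f + b5*v*d + b6*f*d
# to tool-wear data. All numbers below are invented placeholders.
import numpy as np

# Columns: cutting speed (m/min), feed per tooth (mm), depth of cut (mm)
X = np.array([
    [150, 0.05, 0.2], [200, 0.05, 0.4], [150, 0.10, 0.4], [200, 0.10, 0.2],
    [175, 0.075, 0.3], [175, 0.075, 0.3], [150, 0.075, 0.3], [200, 0.075, 0.3],
    [175, 0.05, 0.3], [175, 0.10, 0.3],
])
vb = np.array([0.08, 0.10, 0.15, 0.14, 0.11, 0.12, 0.09, 0.13, 0.10, 0.16])  # flank wear (mm)

def design_matrix(X):
    v, f, d = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones(len(X)), v, f, d, v * f, v * d, f * d])

coeffs, *_ = np.linalg.lstsq(design_matrix(X), vb, rcond=None)
predicted = design_matrix(X) @ coeffs
print("fitted coefficients:", coeffs)
print("max absolute residual:", np.max(np.abs(predicted - vb)))
```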
Raman Spectroscopy: an essential tool for future IODP expeditions
NASA Astrophysics Data System (ADS)
Andò, Sergio; Garzanti, Eduardo; Kulhanek, Denise K.
2016-04-01
The scientific drilling of oceanic sedimentary sequences plays a fundamental part in provenance studies, paleoclimate reconstructions, and source-to-sink investigations (e.g., France-Lanord et al., 2015; Pandey et al., 2015). When studying oceanic deposits, Raman spectroscopy represents an essential and flexible tool for the multidisciplinary approach necessary to integrate the insight provided by different disciplines. This new user-friendly technique opens up an innovative avenue to study in real time the composition of detrital mineral grains of any origin, complementing traditional methods of provenance analysis (e.g., sedimentary petrography, heavy minerals; Andò and Garzanti, 2014). Raman spectra can readily reveal the chemistry of foraminiferal tests, nannofossils and other biogenic debris for the study of ecosystem evolution and paleoclimate, or the Ca/Mg ratio in biogenic or terrigenous carbonates for geological or marine biological applications and oil exploration (Borromeo et al., 2015). For the study of pelagic or turbiditic muds, which represent the bulk of the deep-marine sedimentary record, Raman spectroscopy allows us to identify silt-sized grains down to the size of a few microns with the same precision level required in quantitative provenance analysis of sand-sized sediments (Andò et al., 2011). Silt and siltstone also represent a very conspicuous part of the stratigraphic record onshore and usually preserve original mineralogical assemblages better than more permeable interbedded sand and sandstone (Blatt, 1985). Raman spectra can be obtained on sample volumes of only a few cubic microns by a confocal micro-Raman system coupled with a standard polarizing light microscope using a 50× objective. This compact apparatus can easily be placed onboard an IODP vessel to provide crucial information and quickly solve identification problems for the benefit of a wide range of scientists during future expeditions. Cited references: Andò, S., Vignola, P., Garzanti, E., 2011. Raman counting: a new method to determine provenance of silt. Rend. Fis. Acc. Lincei, 22: 327-347. Andò, S., Garzanti, E., 2014. Raman spectroscopy in heavy-mineral studies. Geological Society, London, Special Publications, 386 (1), 395-412. Blatt, H., 1985. Provenance studies and mudrocks. Journal of Sedimentary Research, 55 (1), 69-75. Borromeo, L., Zimmermann, U., Andò, S., Coletti, G., Bersani, D., Basso, D., Gentile, P., Garzanti, E., 2015. Raman Spectroscopy as a tool for magnesium estimation in Mg-calcite. Periodico di Mineralogia, ECMS, 35-36. France-Lanord, C., Spiess, V., Klaus, A., and the Expedition 354 Scientists, 2015. IODP, Exp. 354, Preliminary Report: Bengal Fan, Neogene and late Paleogene record of Himalayan orogeny and climate: a transect across the Middle Bengal Fan. Pandey, D.K., Clift, P.D., Kulhanek, D.K. and the Expedition 355 Scientists, 2015. IODP, Exp. 355, Preliminary Report: Arabian Sea Monsoon, Deep sea drilling in the Arabian Sea: constraining tectonic-monsoon interactions in South Asia.
Metabolic network flux analysis for engineering plant systems.
Shachar-Hill, Yair
2013-04-01
Metabolic network flux analysis (NFA) tools have proven themselves to be powerful aids to metabolic engineering of microbes by providing quantitative insights into the flows of material and energy through cellular systems. The development and application of NFA tools to plant systems have advanced in recent years and are yielding significant insights and testable predictions. Plants present substantial opportunities for the practical application of NFA, but they also pose serious challenges related to the complexity of plant metabolic networks and to deficiencies in our knowledge of their structure and regulation. By considering the tools available and selected examples, this article attempts to assess where and how NFA is most likely to have a real impact on plant biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.
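For readers new to flux analysis, the sketch below solves a toy flux balance problem: maximising one flux subject to the steady-state mass balance S·v = 0 and capacity bounds, using SciPy's linear-programming routine. The three-metabolite, four-reaction network is invented for illustration and is not drawn from the review.

```python
# Toy flux balance analysis sketch: maximise a "biomass" flux subject to
# steady-state mass balance S.v = 0 and capacity bounds on each reaction.
import numpy as np
from scipy.optimize import linprog

# Stoichiometric matrix S (rows: metabolites A, B, C; columns: reactions)
# r0: -> A, r1: A -> B, r2: A -> C, r3: B + C -> biomass
S = np.array([
    [ 1, -1, -1,  0],   # A
    [ 0,  1,  0, -1],   # B
    [ 0,  0,  1, -1],   # C
])

bounds = [(0, 10), (0, 10), (0, 10), (0, None)]  # flux capacity per reaction
c = np.array([0, 0, 0, -1])                      # maximise r3 => minimise -r3

res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=bounds, method="highs")
print("optimal fluxes:", res.x, "biomass flux:", -res.fun)
```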
The Integrated Waste Tracking System - A Flexible Waste Management Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Robert Stephen
2001-02-01
The US Department of Energy (DOE) Idaho National Engineering and Environmental Laboratory (INEEL) has fully embraced a flexible, computer-based tool to help increase waste management efficiency and integrate multiple operational functions from waste generation through waste disposition while reducing cost. The Integrated Waste Tracking System (IWTS) provides comprehensive information management for containerized waste during generation, storage, treatment, transport, and disposal. The IWTS provides all information necessary for facilities to properly manage and demonstrate regulatory compliance. As a platform-independent, client-server and Web-based inventory and compliance system, the IWTS has proven to be a successful tracking, characterization, compliance, and reporting tool that meets the needs of both operations and management while providing a high level of management flexibility.
Hypothermic machine perfusion in kidney transplantation.
De Deken, Julie; Kocabayoglu, Peri; Moers, Cyril
2016-06-01
This article summarizes novel developments in hypothermic machine perfusion (HMP) as an organ preservation modality for kidneys recovered from deceased donors. HMP has undergone a renaissance in recent years. This renewed interest has arisen in parallel with a shift in paradigms: not only is optimal preservation of an often marginal-quality graft required, but improved graft function, and tools to predict it, are also expected from HMP. The focus of attention in this field is currently drawn to the protection of endothelial integrity by means of additives to the perfusion solution, improvement of the HMP solution, choice of temperature, duration of perfusion, and machine settings. HMP may offer the opportunity to assess aspects of graft viability before transplantation, which can potentially aid preselection of grafts based on characteristics such as perfusate biomarkers, as well as measurement of machine perfusion dynamics parameters. HMP has proven to be beneficial as a kidney preservation method for all types of renal grafts, most notably those retrieved from extended criteria donors. Large numbers of variables during HMP, such as duration, machine settings and additives to the perfusion solution, are currently being investigated to improve renal function and graft survival. In addition, the search for biomarkers has become a focus of attention to predict graft function posttransplant.
Evaluation of self-combustion risk in tire derived aggregate fills.
Arroyo, Marcos; San Martin, Ignacio; Olivella, Sebastian; Saaltink, Maarten W
2011-01-01
Lightweight tire derived aggregate (TDA) fills are a proven recycling outlet for waste tires, requiring relatively low-cost waste processing and being competitively priced against other lightweight fill alternatives. However, their value has been marred as several TDA fills self-combusted during the early applications of this technique. An empirical review of these cases led to prescriptive guidelines from the ASTM aimed at avoiding this problem. This approach has been successful in avoiding further incidents of self-combustion. However, at present there remains no rational method available to quantify self-combustion risk in TDA fills. This means that it is not clear which aspects of the ASTM guidelines are essential and which are accessory. This hinders the practical use of TDA fills despite their inherent advantages as lightweight fill. Here a quantitative approach to self-combustion risk evaluation is developed and illustrated with a parametric analysis of an embankment case. This is later particularized to model a reported field self-combustion case. The approach is based on the available experimental observations and incorporates well-tested methodological (ISO corrosion evaluation) and theoretical tools (finite element analysis of coupled heat and mass flow). The results obtained offer clear insights into the critical aspects of the problem, already allowing some meaningful recommendations for guideline revision. Copyright © 2011 Elsevier Ltd. All rights reserved.
Bergholz, W
2008-11-01
In many high-tech industries, quality management (QM) has enabled improvements of quality by a factor of 100 or more, in combination with significant cost reductions. Compared to this, the application of QM methods in health care is in its initial stages. It is anticipated that stringent process management, embedded in an effective QM system, will lead to significant improvements in health care in general and in the German public health service in particular. Process management is an ideal platform for controlling in the health care sector, and it will significantly improve the leverage of controlling to bring down costs. Best practice sharing in industry has led to quantum leap improvements. Process management will enable best practice sharing also in the public health service, in spite of the highly diverse portfolio of services that the public health service offers in different German regions. Finally, it is emphasised that "technical" QM, e.g., on the basis of the ISO 9001 standard, is not sufficient to reach excellence. It is necessary to integrate soft factors, such as patient or employee satisfaction, and leadership quality into the system. The EFQM model for excellence can serve as a proven tool to reach this goal.
Analyzing a 35-Year Hourly Data Record: Why So Difficult?
NASA Technical Reports Server (NTRS)
Lynnes, Chris
2014-01-01
At the Goddard Distributed Active Archive Center, we have recently added a 35-year record of output data from the North American Land Data Assimilation System (NLDAS) to the Giovanni web-based analysis and visualization tool. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure) offers users a variety of data summarization and visualization operations that run at the data center, obviating the need for users to download and read the data themselves for exploratory data analysis. However, the NLDAS data has proven surprisingly resistant to application of the summarization algorithms. Algorithms that were perfectly happy analyzing 15 years of daily satellite data encountered limitations both at the algorithm and system level for 35 years of hourly data. Failures arose, sometimes unexpectedly, from command line overflows, memory overflows, internal buffer overflows, and time-outs, among others. These failures serve as an early warning of the problems likely to be encountered by the general user community as they try to scale up to Big Data analytics. Indeed, it is likely that more users will seek to perform remote web-based analysis precisely to avoid these issues, or the need to reprogram around them. We will discuss approaches to mitigating the limitations and the implications for data systems serving the user communities that try to scale up their current techniques to analyze Big Data.
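One generic mitigation for the memory and buffer limits described is incremental (streaming) aggregation, sketched below for a long-term mean computed one hourly granule at a time. The file pattern and pre-extracted array format are hypothetical; this is not Giovanni's implementation.

```python
# Sketch: computing a long-term mean over many hourly granules incrementally,
# so the full 35-year record never has to reside in memory at once.
import glob
import numpy as np

running_sum = None
count = 0

for path in sorted(glob.glob("NLDAS_hourly_*.npy")):  # hypothetical pre-extracted arrays
    field = np.load(path)                              # one hourly 2-D grid
    running_sum = field if running_sum is None else running_sum + field
    count += 1

if count:
    long_term_mean = running_sum / count
    print("averaged", count, "hourly grids; mean shape:", long_term_mean.shape)
```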
Detecting weak position fluctuations from encoder signal using singular spectrum analysis.
Xu, Xiaoqiang; Zhao, Ming; Lin, Jing
2017-11-01
Mechanical fault or defect will cause some weak fluctuations in the position signal. Detection of such fluctuations via encoders can help determine the health condition and performance of the machine, and offers a promising alternative to the vibration-based monitoring scheme. However, besides the fluctuations of interest, the encoder signal also contains a large trend and some measurement noise. In applications, the trend is normally several orders of magnitude larger than the fluctuations of interest, which makes it difficult to detect the weak fluctuations without signal distortion. In addition, the fluctuations can be complicated and amplitude modulated under non-stationary working conditions. To overcome this issue, singular spectrum analysis (SSA) is proposed in this paper for detecting weak position fluctuations from the encoder signal. It enables a complicated encoder signal to be reduced into several interpretable components including a trend, a set of periodic fluctuations and noise. A numerical simulation is given to demonstrate the performance of the method; it shows that SSA outperforms empirical mode decomposition (EMD) in terms of capability and accuracy. Moreover, linear encoder signals from a CNC machine tool are analyzed to determine the magnitudes and sources of fluctuations during feed motion. The proposed method is proven to be feasible and reliable for machinery condition monitoring. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
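A minimal sketch of the basic SSA decomposition described (embedding into a Hankel trajectory matrix, SVD, grouping, and diagonal averaging) is given below. The window length, the component grouping and the synthetic signal are arbitrary illustrative choices, not the paper's settings.

```python
# Basic singular spectrum analysis (SSA) sketch: embed the signal in a Hankel
# trajectory matrix, take an SVD, and reconstruct a chosen group of components
# by anti-diagonal averaging.
import numpy as np

def ssa_reconstruct(x, window, components):
    n = len(x)
    k = n - window + 1
    # Trajectory (Hankel) matrix: columns are lagged views of the signal
    X = np.column_stack([x[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Sum the selected elementary matrices, then average the anti-diagonals
    Xr = sum(s[i] * np.outer(U[:, i], Vt[i]) for i in components)
    recon = np.zeros(n)
    counts = np.zeros(n)
    for col in range(k):
        recon[col:col + window] += Xr[:, col]
        counts[col:col + window] += 1
    return recon / counts

# Example: a large slow trend plus a weak periodic fluctuation plus noise
t = np.linspace(0, 10, 1000)
signal = 100 * t + 0.5 * np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(1000)
trend = ssa_reconstruct(signal, window=200, components=[0, 1])
fluctuation = signal - trend   # residual retains the weak periodic part plus noise
```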
Validity of the AusTOM scales: A comparison of the AusTOMs and EuroQol-5D
Unsworth, Carolyn A; Duckett, Stephen J; Duncombe, Dianne; Perry, Alison; Skeat, Jemma; Taylor, Nicholas
2004-01-01
Background Clinicians require brief outcome measures in their busy daily practice to document global client outcomes. Based on the UK Therapy Outcome Measure, the Australian Therapy Outcome Measures were designed to capture global therapy outcomes of occupational therapy, physiotherapy and speech pathology in the Australian clinical context. The aim of this study was to investigate the construct (convergent) validity of the Australian Therapy Outcome Measures (AusTOMs) by comparing them with the EuroQol-5D (EQ-5D). Methods The research was a prospective, longitudinal cohort study, with data collected over a seven-month time period. The study was conducted at a total of 13 metropolitan and rural health-care sites including acute, sub-acute and community facilities. Two hundred and five clients were asked to score themselves on the EQ-5D, and the same clients were scored by approximately 115 therapists (physiotherapists, speech pathologists and occupational therapists) using the AusTOMs at admission and discharge. Clients were consecutive admissions who agreed to participate in the study. Clients of all diagnoses, aged 18 years and over (a criterion of the EQ-5D), and able to give informed consent were scored on the measures. Spearman rank order correlation coefficients were used to analyze the relationships between scores from the two tools. Results There were many health care areas where correlations were expected and found between scores on the AusTOMs and the EQ-5D. Conclusion In the quest to measure the effectiveness of therapy services, managers, health care funders and clinicians are urgently seeking to undertake the first step by identifying tools that can measure therapy outcome. AusTOMs is one tool that can measure global client outcomes following therapy. In this study, it was found that on the whole, the AusTOMs and the EQ-5D measure similar constructs. Hence, although the validity of a tool is never 'proven', this study offers preliminary support for the construct validity of AusTOMs. PMID:15541181
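The convergent-validity check described amounts to rank correlations between scores on the two instruments; a minimal sketch using SciPy's Spearman correlation is shown below with invented change scores, purely to illustrate the computation.

```python
# Sketch of a convergent-validity check: Spearman rank correlation between
# change scores on two instruments. The numbers below are invented, not study data.
from scipy.stats import spearmanr

austoms_change = [2, 1, 0, 3, 2, 1, 4, 0, 2, 3]   # admission-to-discharge change
eq5d_change = [0.15, 0.05, 0.00, 0.30, 0.10, 0.05, 0.35, -0.02, 0.12, 0.25]

rho, p_value = spearmanr(austoms_change, eq5d_change)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```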
NASA Astrophysics Data System (ADS)
Narock, T.; Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Cheatham, M.; Finin, T.; Hitzler, P.; Krisnadhi, A.; Raymond, L. M.; Shepherd, A.; Wiebe, P. H.
2014-12-01
A wide spectrum of maturing methods and tools, collectively characterized as the Semantic Web, is helping to vastly improve the dissemination of scientific research. Creating semantic integration requires input from both domain and cyberinfrastructure scientists. OceanLink, an NSF EarthCube Building Block, is demonstrating semantic technologies through the integration of geoscience data repositories, library holdings, conference abstracts, and funded research awards. Meeting project objectives involves applying semantic technologies to support data representation, discovery, sharing and integration. Our semantic cyberinfrastructure components include ontology design patterns, Linked Data collections, semantic provenance, and associated services to enhance data and knowledge discovery, interoperation, and integration. We discuss how these components are integrated, the continued automated and semi-automated creation of semantic metadata, and techniques we have developed to integrate ontologies, link resources, and preserve provenance and attribution.
Monitor weather conditions for cloud seeding control. [Colorado River Basin
NASA Technical Reports Server (NTRS)
Kahan, A. M. (Principal Investigator)
1973-01-01
The author has identified the following significant results. The near real-time DCS platform data transfer to the time-share computer is a working reality. Six stations are now being automatically monitored and displayed with a system delay of 3 to 8 hours from time of data transmission to time of data accessibility on the computer. The DCS platform system has proven itself a valuable tool for near real-time monitoring of mountain precipitation. Data from Wolf Creek Pass were an important input in making the decision when to suspend seeding operations to avoid exceeding suspension criteria in that area. The DCS platforms, as deployed in this investigation, have proven themselves to be reliable weather-resistant systems for winter mountain environments in the southern Colorado mountains.
[Rehabilitation in rheumatology].
Luttosch, F; Baerwald, C
2010-10-01
Rehabilitation in rheumatology focuses on prevention of functional disorders of the musculoskeletal system, maintenance of working ability and prevention of care dependency. Drug treatment alone rarely results in long-term remission, therefore rehabilitative measures must be integrated into rheumatic care. Rehabilitative therapy in rheumatology includes physiotherapy, patient education and occupational therapy. Positive effects of physical therapy methods have been proven by various studies. Patient education and occupational therapy are important tools for stabilizing the course of the disease. To maintain positive rehabilitative results patients have to be involved in the selection of treatment measures and should take an active part in the long-term treatment process. Despite proven efficacy of physical measures there is evidence for a lack of utilization of rehabilitative therapy due to increasing cost pressure in the health care system which will further increase over time.
Cavicchioli, M G S; Guerbali, C C L; Ochiai, C; Silva, R M; Camara, G; Petry, T B Z
2016-07-01
Diabetes has caused 5.1 million deaths, primarily from cardiovascular disease. Large clinical studies have proven the importance of intensive control of diabetes from diagnosis to prevent microvascular and macrovascular complications of the disease in the long term. Diabetes education conducted by an interdisciplinary team of doctors, nurses, nutritionists, psychologists, and others is a necessary tool to ensure effective behavioral change and help overcome the obstacles that may hinder self care. Several studies have been analyzed in this review, in which we find a variety of results. Diabetes education has proven to be essential to patient compliance with their T2DM treatment; the main objective is to prevent acute and chronic complications, especially cardiovascular ones, which are the main causes of mortality.
NASA Astrophysics Data System (ADS)
Jue, Brian J.; Bice, Michael D.
2013-07-01
As students explore the technological tools available to them for learning mathematics, some will eventually discover what happens when a function button is repeatedly pressed on a calculator. We explore several examples of this, presenting tabular and graphical results for the square root, natural logarithm and sine and cosine functions. Observed behaviour is proven and then discussed in the context of fixed points.
Soil vapor extraction (SVE) and bioventing (BV) are proven strategies for remediation of unsaturated zone soils. Mathematical models are powerful tools that can be used to integrate and quantify the interaction of physical, chemical, and biological processes occurring in field sc...
Podcasting: A Preliminary Classroom Study
ERIC Educational Resources Information Center
Aristizabal, Alexander
2009-01-01
Podcasting is a term introduced through the use of Apple Computer, Inc.'s iPod, a term which denotes how a portable audio player can be used to download audio files, mostly MP3s, and be heard at the user's convenience. Initially such an operation was intended for entertainment; however, it has proven itself to be an important tool in the field of…
ERIC Educational Resources Information Center
Gable, Robert A.; Park, Kristy Lee; Scott, Terrance M.
2014-01-01
The use of functional behavioral assessment (FBA) is an effective tool to address a wide range of severe behavior problems of students at risk for or with emotional disabilities (ED). However, the transformation of a procedure proven effective under highly-controlled clinical conditions to a practical and effective strategy for use in applied…
Professional Development of, by, and for the Practitioners of the Washington Teachers Inquiry Group
ERIC Educational Resources Information Center
Gasoi, Emily; Hare, Abby; Mallaney, Norah; Stevens-Morin, Hanna
2016-01-01
Descriptive Review, as developed by Patricia Carini and colleagues at the Prospect Center, is a process of deep observation and documentation that has proven to be an invaluable tool for teachers interested in gaining a more holistic view of their students and their work. In this article, a teacher educator and three classroom teachers share their…
Building Networks of Leaders through the Internet.
ERIC Educational Resources Information Center
Gabbard, Glenn
2001-01-01
This bulletin brings together the concepts of parent networking and the Internet. The document highlights key free or low cost features of the Internet which have proven to be useful tools in linking together networks of parents. It addresses the following six questions: (1) What if I don't have a computer? (2) How can I get Web access? (3) How do…
NASA Technical Reports Server (NTRS)
1991-01-01
The Computer Graphics Center of North Carolina State University uses LAS, a COSMIC program, to analyze and manipulate data from Landsat and SPOT providing information for government and commercial land resource application projects. LAS is used to interpret aircraft/satellite data and enables researchers to improve image-based classification accuracies. The system is easy to use and has proven to be a valuable remote sensing training tool.
ERIC Educational Resources Information Center
Al Sadi, Fatma H.; Basit, Tehmina N.
2017-01-01
The vignettes approach has emerged as a popular tool in quantitative and qualitative research. It has proven to be particularly effective in measuring sensitive topics. This paper focuses on the construction and validation process of questionnaire-based vignettes, which were used as an instrument to examine Omani secondary school girls' cultural…
Scientists and artists: "Hey! You got art in my science! You got science on my art"
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elfman, Mary E; Hayes, Birchard P; Michel, Kelly D
The pairing of science and art has proven to be a powerful combination since the Renaissance. The combination of these two seemingly disparate disciplines ensured that even complex scientific theories could be explored and effectively communicated to both the subject matter expert and the layman. In modern times, science and art have frequently been considered disjoint, with objectives, philosophies, and perspectives often in direct opposition to each other. However, given the technological advances in computer science and high fidelity 3-D graphics development tools, this marriage of art and science is once again logically complementary. Art, in the form of computer graphics and animation created on supercomputers, has already proven to be a powerful tool for improving scientific research and providing insight into nuclear phenomena. This paper discusses the power of pairing artists with scientists and engineers in order to pursue the possibilities of a widely accessible, lightweight, interactive approach. We will use a discussion of photo-realism versus stylization to illuminate the expected beneficial outcome of such collaborations and the societal advantages gained by a non-traditional partnering of these two fields.
Active flow control insight gained from a modified integral boundary layer equation
NASA Astrophysics Data System (ADS)
Seifert, Avraham
2016-11-01
Active Flow Control (AFC) can alter the development of boundary layers with applications (e.g., reducing drag by separation delay or separating the boundary layers and enhancing vortex shedding to increase drag). Historically, significant effects of steady AFC methods were observed. Unsteady actuation is significantly more efficient than steady. Full-scale AFC tests were conducted with varying levels of success. While clearly relevant to industry, AFC implementation relies on expert knowledge with proven intuition and/or costly and lengthy computational efforts. This situation hinders the use of AFC while a simple, quick and reliable design method is absent. An updated form of the unsteady integral boundary layer (UIBL) equations that includes AFC terms (unsteady wall transpiration and body forces) can be used to assist in AFC analysis and design. With these equations and given a family of suitable velocity profiles, the momentum thickness can be calculated and matched with an outer, potential flow solution in a 2D and 3D manner to create an AFC design tool, parallel to proven tools for airfoil design. Limiting cases of the UIBL equation can be used to analyze candidate AFC concepts in terms of their capability to modify the boundary layer's development and system performance.
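For orientation, a minimal sketch of the classical steady limiting form of the momentum-integral (von Kármán) equation with wall transpiration is given below; here θ is the momentum thickness, δ* the displacement thickness, H = δ*/θ the shape factor, U_e the edge velocity, C_f the skin-friction coefficient, and v_w the wall-normal transpiration velocity (positive for blowing). The full UIBL formulation referred to in the abstract additionally carries unsteady and body-force terms that are not shown here.

```latex
% Classical steady momentum-integral equation with wall transpiration
% (a limiting case only; the unsteady and body-force terms of the UIBL
% formulation are omitted).
\begin{equation}
  \frac{d\theta}{dx}
  + \left(2 + H\right)\,\frac{\theta}{U_e}\,\frac{dU_e}{dx}
  = \frac{C_f}{2} + \frac{v_w}{U_e},
  \qquad H \equiv \frac{\delta^{*}}{\theta}.
\end{equation}
```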
Berry, Scott A; Laam, Leslie A; Wary, Andrea A; Mateer, Harry O; Cassagnol, Hans P; McKinley, Karen E; Nolan, Ruth A
2011-05-01
Geisinger Health System (GHS) has applied its ProvenCare model to demonstrate that a large integrated health care delivery system, enabled by an electronic health record (EHR), could reengineer a complicated clinical process, reduce unwarranted variation, and provide evidence-based care for patients with a specified clinical condition. In 2007 GHS began to apply the model to a more complicated, longer-term condition of "wellness": perinatal care. ADAPTING PROVENCARE TO PERINATAL CARE: The ProvenCare Perinatal initiative was more complex than the five previous ProvenCare endeavors in terms of breadth, scope, and duration. Each of the 22 sites created a process flow map to depict the current, real-time process at each location. The local practice site providers (physicians and mid-level practitioners) reached consensus on 103 unique best practice measures (BPMs), which would be tracked for every patient. These maps were then used to create a single standardized pathway that included the BPMs but also preserved some unique care offerings that reflected the needs of the local context. A nine-phase methodology, expanded from the previous six-phase model, was implemented on schedule. Pre- to postimplementation improvement occurred for all seven BPMs or BPM bundles that were considered the most clinically relevant, five of which were statistically significant. In addition, the rate of primary cesarean sections decreased by 32%, and birth trauma remained unchanged as the number of vaginal births increased. Preliminary experience suggests that integrating evidence/guideline-based best practices into work flows in inpatient and outpatient settings can achieve improvements in daily patient care processes and outcomes.
Nanotechnology Cancer Therapy and Treatment
Nanotechnology offers the means to target therapies directly and selectively to cancerous cells and neoplasms. With these tools, clinicians can safely and effectively deliver chemotherapy, radiotherapy, and the next generation of immuno- and gene therapies to the tumor. Furthermore, surgical resection of tumors can be guided and enhanced by way of nanotechnology tools. Find out how nanotechnology will offer the next generation of our therapeutic arsenal to the patient.
ERIC Educational Resources Information Center
Cummins, Caroline
2006-01-01
In this article, the author discusses the benefits offered by integrated library systems (ILS) for making more informed decisions. Library software vendors, realizing ILS products can reveal business intelligence, have begun to offer tools like Director's Station to help library managers get more out of their data, and librarians are taking…
Johnson, Karin E; Kamineni, Aruna; Fuller, Sharon; Olmstead, Danielle; Wernli, Karen J
2014-01-01
The use of electronic health records (EHRs) for research is proceeding rapidly, driven by computational power, analytical techniques, and policy. However, EHR-based research is limited by the complexity of EHR data and a lack of understanding about data provenance, meaning the context under which the data were collected. This paper presents system flow mapping as a method to help researchers more fully understand the provenance of their EHR data as it relates to local workflow. We provide two specific examples of how this method can improve data identification, documentation, and processing. EHRs store clinical and administrative data, often in unstructured fields. Each clinical system has a unique and dynamic workflow, as well as an EHR customized for local use. The EHR customization may be influenced by a broader context such as documentation required for billing. We present a case study with two examples of using system flow mapping to characterize EHR data for a local colorectal cancer screening process. System flow mapping demonstrated that information entered into the EHR during clinical practice required interpretation and transformation before it could be accurately applied to research. We illustrate how system flow mapping shaped our knowledge of the quality and completeness of data in two examples: (1) determining colonoscopy indication as recorded in the EHR, and (2) discovering a specific EHR form that captured family history. Researchers who do not consider data provenance risk compiling data that are systematically incomplete or incorrect. For example, researchers who are not familiar with the clinical workflow under which data were entered might miss or misunderstand patient information or procedure and diagnostic codes. Data provenance is a fundamental characteristic of research data from EHRs. Given the diversity of EHR platforms and system workflows, researchers need tools for evaluating and reporting data availability, quality, and transformations. Our case study illustrates how system mapping can inform researchers about the provenance of their data as it pertains to local workflows.
Methods Beyond Methods: A Model for Africana Graduate Methods Training
Best, Latrica E.; Byrd, W. Carson
2018-01-01
A holistic graduate education can impart not just tools and knowledge, but critical positioning to fulfill many of the original missions of Africana Studies programs set forth in the 1960s and 1970s. As an interdisciplinary field with many approaches to examining the African Diaspora, the methodological training of graduate students can vary across graduate programs. Although taking qualitative methods courses is often required of graduate students in Africana Studies programs, and these programs offer such courses, rarely if ever are graduate students in these programs required to take quantitative methods courses, let alone have these courses offered in-house. These courses can offer Africana Studies graduate students new tools for their own research, but more importantly, improve their knowledge of quantitative research of diasporic communities. These tools and knowledge can assist with identifying flawed arguments about African-descended communities and their members. This article explores the importance of requiring and offering critical quantitative methods courses in graduate programs in Africana Studies, and discusses the methods requirements of one graduate program in the field as an example of more rigorous training that other programs could offer graduate students. PMID:29710883
Tool-use: An open window into body representation and its plasticity
Martel, Marie; Cardinali, Lucilla; Roy, Alice C.; Farnè, Alessandro
2016-01-01
Over the last decades, scientists have questioned the origin of the exquisite human mastery of tools. Seminal studies in monkeys, healthy participants and brain-damaged patients have primarily focused on the plastic changes that tool-use induces on spatial representations. More recently, we focused on the modifications tool-use must exert on the sensorimotor system and highlighted plastic changes at the level of the body representation used by the brain to control our movements, i.e., the Body Schema. Evidence is emerging for tool-use to affect also more visually and conceptually based representations of the body, such as the Body Image. Here we offer a critical review of the way different tool-use paradigms have been, and should be, used to try disentangling the critical features that are responsible for tool incorporation into different body representations. We will conclude that tool-use may offer a very valuable means to investigate high-order body representations and their plasticity. PMID:27315277
Leonardi, Matilde; Chatterji, Somnath; Koskinen, Seppo; Ayuso-Mateos, Jose Luis; Haro, Josep Maria; Frisoni, Giovanni; Frattura, Lucilla; Martinuzzi, Andrea; Tobiasz-Adamczyk, Beata; Gmurek, Michal; Serrano, Ramon; Finocchiaro, Carla
2014-01-01
COURAGE in Europe was a 3-year project involving 12 partners from four European countries and the World Health Organization. It was inspired by the pressing need to integrate international studies on disability and ageing in light of an innovative perspective based on a validated data-collection protocol. The COURAGE in Europe Project collected data on the determinants of health and disability in an ageing population, with specific tools for the evaluation of the role of the built environment and social networks on health, disability, quality of life and well-being. The main survey was conducted by partners in Finland, Poland and Spain, where it was administered to a sample of 10,800 persons and completed in March 2012. The newly developed and validated COURAGE Protocol for Ageing Studies has proven to be a valid tool for collecting comparable data in an ageing population, and the COURAGE in Europe Project has created valid and reliable scientific evidence, demonstrating cross-country comparability, for disability and ageing research and policy development. The COURAGE in Europe Consortium therefore recommends that future studies exploring determinants of health and disability in ageing use the COURAGE-derived methodology. Copyright © 2013 John Wiley & Sons, Ltd.
Addressing and Presenting Quality of Satellite Data via Web-Based Services
NASA Technical Reports Server (NTRS)
Leptoukh, Gregory; Lynnes, C.; Ahmad, S.; Fox, P.; Zednik, S.; West, P.
2011-01-01
With the recent attention to climate change and the proliferation of remote-sensing data utilization, climate modeling and various environmental monitoring and protection applications have begun to rely increasingly on satellite measurements. Research application users seek good quality satellite data, with uncertainties and biases provided for each data point. However, different communities address remote sensing quality issues rather inconsistently and differently. We describe our attempt to systematically characterize, capture, and provision quality and uncertainty information as it applies to the NASA MODIS Aerosol Optical Depth data product. In particular, we note the semantic differences in quality/bias/uncertainty at the pixel, granule, product, and record levels. We outline various factors contributing to the uncertainty or error budget. Web-based science analysis and processing tools allow users to access, analyze, and generate visualizations of data while relieving them of directly managing complex data processing operations. These tools provide value by streamlining the data analysis process, but usually shield users from details of the data processing steps, algorithm assumptions, caveats, etc. Correct interpretation of the final analysis requires user understanding of how data has been generated and processed and what potential biases, anomalies, or errors may have been introduced. By providing services that leverage data lineage provenance and domain expertise, expert systems can be built to aid the user in understanding data sources, processing, and the suitability for use of products generated by the tools. We describe our experiences developing a semantic, provenance-aware, expert-knowledge advisory system applied to the NASA Giovanni web-based Earth science data analysis tool as part of the ESTO AIST-funded Multi-sensor Data Synergy Advisor project.
NASA Astrophysics Data System (ADS)
Eivind Augland, Lars; Jones, Morgan; Planke, Sverre; Svensen, Henrik; Tegner, Christian
2016-04-01
Zircons are a powerful tool in geochronology and isotope geochemistry, as their affinity for U and Hf in the crystal structure and the low initial Pb and Lu allow for precise and accurate dating by U-Pb ID-TIMS and precise and accurate determination of initial Hf isotopic composition by solution MC-ICP-MS analysis. The U-Pb analyses provide accurate chronostratigraphic controls on the sedimentary successions and absolute age frames for the biotic evolution across geological boundaries. Moreover, the analyses of Lu-Hf by solution MC-ICP-MS after Hf-purification column chemistry provide a powerful and robust fingerprinting tool to test the provenance of individual ash beds. Here we focus on ash beds from Permian-Triassic and Palaeocene successions in Svalbard and from the Palaeocene-Eocene Thermal Maximum (PETM) in Fur, Denmark. Used in combination with whole rock geochemistry from the ash layers and the available geochemical and isotopic data from potential source volcanoes, these data are used to evaluate the provenance of the Permian-Triassic and Palaeocene ashes preserved in Svalbard and PETM ashes in Denmark. If explosive eruptions from volcanic centres such as the Siberian Traps and the North Atlantic Igneous Province (NAIP) can be traced to distal basins as ash layers, they provide robust tests of hypotheses of global synchronicity of environmental changes and biotic crises. In addition, the potential correlation of ash layers with source volcanoes will aid in constraining the extent of explosive volcanism in the respective volcanic centres. The new integrated data sets will also contribute to establish new reference sections for the study of these boundary events when combined with stable isotope data and biostratigraphy.
The use of acuity and frailty measures for district nursing workforce plans.
David, Ami; Saunders, Mary
2018-02-02
This article discusses the use of Quest acuity and frailty measures for community nursing interventions to quantify and qualify the contributions of district nursing teams. It describes the use of a suite of acuity and frailty tools tested in 8 UK community service trusts over the past 5 years. In addition, a competency assessment tool was used to gauge both the capacity and capability of individual nurses. The consistency of the results obtained from the Quest audits offers significant evidence and potential for realigning community nursing services to deliver improvements in efficiency and cost-effectiveness. The National Quality Board (NQB) improvement resource for district nursing services (NQB, 2017) recommends a robust method for classifying patient acuity/frailty/dependency. It is contended that the Quest tools and their usage articulated here offer a suitable methodology.
A comprehensive review on cold work of AISI D2 tool steel
NASA Astrophysics Data System (ADS)
Abdul Rahim, Mohd Aidil Shah bin; Minhat, Mohamad bin; Hussein, Nur Izan Syahriah Binti; Salleh, Mohd Shukor bin
2017-11-01
As a common material in mould and die applications, AISI D2 cold work tool steel has proven to be a promising material in industry. However, challenges remain in using AISI D2, although considerable progress on modified versions has been made in recent years. This paper provides a critical review of AISI D2 cold work tool steel, from the original as-cast grade up to modified versions. The main purpose is to develop an understanding of current modified tool steel trends; the machinability of AISI D2 (drilling, milling, turning, grinding and EDM/WEDM); and the microstructure evolution and mechanical properties of these cold work tool steels due to the presence of alloying elements in the steel matrix. The work on doping with rare earth alloying elements, new steel fabrication processes, significant process parameters in machinability, and surface treatment shows that there have been few empirical investigations into these cold work tool steel alloys. This review finds that cold work tool steels will continue to be explored in order to remain competitive in the steel industry.
2016-02-16
additional duty. Proven advisors can be assigned to training or policy billets where advisor advocacy is desired. Additionally, they serve as advisors... USMC SFA success has history from the Philippines through the Banana Wars, WWII, Vietnam, and most recently Iraq and Afghanistan. The current de-emphasis... values. An individual who is too quick (Sonny), or too slow (Fredo), to make a decision will fail as an advisor. The Marine Corps must identify and
Hypoallergenic molecules for subcutaneous immunotherapy.
Jongejan, Laurian; van Ree, Ronald; Poulsen, Lars K
2016-01-01
Although a large part of the population suffers from allergies, a cure is not yet available. Allergen-specific immunotherapy (AIT) offers promise for these patients. AIT has proven successful in insect and venom allergies; however, for food allergy this is still unclear. In this editorial we focus on the recent advances in a proof of concept study in food allergy, FAST (Food allergy specific immunotherapy), which may increase interest within the biomolecular and pharmaceutical industry to embark on similar projects of immunology driven precision medicine within the allergy field.
[Immunotherapy: a revolution in the management of urothelial bladder cancer?]
Adam, Sophie Mc; Derré, Laurent; Jichlinski, Patrice; Lucca, Ilaria
2017-11-29
The treatment of urothelial bladder cancer has changed very little in recent years, with high rates of disease recurrence and progression, even in less aggressive urothelial bladder cancer. Immunotherapy has already proven its effectiveness as a treatment for several types of cancer and has been used in high-grade non-muscle-invasive bladder cancer for decades. Recent findings on immune checkpoint inhibitors have opened up a new chapter in the treatment of bladder cancer, offering interesting therapeutic perspectives that could revolutionize its management.
Empathy deficit in antisocial personality disorder: a psychodynamic formulation.
Malancharuvil, Joseph M
2012-09-01
Empathic difficulty is a highly consequential characteristic of antisocial personality structure. The origin, maintenance, and possible resolution of this profound deficit are not very clear. While reconstructing empathic ability is of primary importance in the treatment of antisocial personality, not many proven procedures are in evidence. In this article, the author offers a psychodynamic formulation of the origin, character, and maintenance of the empathic deficiency in antisocial personality. The author discusses some of the treatment implications from this dynamic formulation.
Bipolar lead acid battery development
NASA Technical Reports Server (NTRS)
Eskra, Michael; Vidas, Robin; Miles, Ronald; Halpert, Gerald; Attia, Alan; Perrone, David
1991-01-01
A modular bipolar battery configuration is under development at Johnson Controls, Inc. (JCI) and the Jet Propulsion Laboratory (JPL). The battery design, incorporating proven lead acid electrochemistry, yields a rechargeable, high-power source that is lightweight and compact. This configuration offers advantages in power capability, weight, and volume over conventional monopolar batteries and other battery chemistries. The lead acid bipolar battery operates in a sealed, maintenance-free mode allowing for maximum application flexibility. It is ideal for high-voltage and high-power applications.
"PULS." – a Blog-based Online-Magazine for Students of Medicine of the Goethe University Frankfurt
Wurche, Bettina; Klauer, Gertrud; Nürnberger, Frank
2013-01-01
In the context of the nationwide protests of 2009, students of the faculty of medicine/dentistry at Goethe University in Frankfurt also demanded more transparency and communication. To satisfy these demands, a web 2.0 tool offered an innovative solution: a blog-based online-magazine for students and other faculty members. The online-magazine "PULS." is realized with the shareware blog software WordPress (version 3.1.3) and is conceived and written by an online journalist. "PULS." is available from https://newsmagazin.puls.med.uni-frankfurt.de/wp/. The articles are generated from its own investigations and from ideas of different groups of the faculty: deanship, students and lecturers. A user analysis is conducted with the open-source software Piwik and takes data security into account. Additionally, an anonymous online user survey (Survey Monkey) is conducted every year. "PULS." has been continuously online since 14.02.2010, has published 806 articles (state: 27.11.2012) and has about 2400 readers monthly. The content focuses on the needs of Frankfurt medical students. The close cooperation with different groups of the faculty (deanship, students and lecturers) furthermore guarantees themes relevant to the academic faculty. "PULS." flanks complex projects and decisions with background information and communicates them understandably. The user evaluation shows a growing number of readers and a high acceptance of the online-magazine, its themes and its style. The web 2.0 tool "blog" and the web-specific language comply with the media habits of the main target group, the students of the faculty of medicine/dentistry. Thus, "PULS." has proven to be a suitable and strategic instrument. It pushes towards higher transparency, more communication and a stronger identification of the students with their faculty. PMID:23467571
"PULS." - a blog-based online-magazine for students of medicine of the Goethe University Frankfurt.
Wurche, Bettina; Klauer, Gertrud; Nürnberger, Frank
2013-01-01
In the context of nationwide protests 2009 also students of the faculty of medicine/dentistry at Goethe-University in Frankfurt demanded more transparency and communication. To satisfy these demands, a web 2.0-tool offered an innovative solution: A blog-based online-magazine for students and other faculty-members. The online-magazine "PULS." is realized with the share-ware blog-software (wordpress version 3.1.3) and is conceived and written by an online-journalist. "PULS." is available from https://newsmagazin.puls.med.uni-frankfurt.de/wp/. The articles are generated from own investigations and from ideas of different groups of the faculty- deanship, students and lecturers. A user-analysis is conducted with the open-source software Piwik and considers the data security. Additionally, every year an anonymous online-user-survey (Survey Monkey) is conducted. "PULS." is continuously online since 14.02.2010 and has published 806 articles (state: 27.11.2012) and has about 2400 readers monthly. The content focuses on the needs of Frankfurt medical students. The close cooperation with different groups of the faculty - deanship, students and lecturers - furthermore guarantees themes relevant to the academic faculty. "PULS." flanks complex projects and decisions with background-information and communicates them understandable. The user-evaluation shows a growing number of readers and a high acceptance for the online-magazine, its themes and its style. The web 2.0-tool "Blog" and the web-specific language comply with media habits of the main target group, the students of the faculty medicine/dentistry. Thus, "PULS." has proven as a suitable and strategic instrument. It pushes towards a higher transparency, more communication and a stronger identification of the students with their faculty.
The relationship between advertising, price, and nursing home quality.
Kash, Bita A; Miller, Thomas R
2009-01-01
Theoretically, nursing homes should engage in advertising for the following two reasons: (a) to improve awareness of the services offered in a particular market and (b) to signal high-quality services. In this study, we build upon results from prior studies of nursing home advertising activity, market competition, and quality. The purpose of this study was to examine the association between advertising expenses, price, and quality. We focused on answering the question: Do nursing homes use advertising and price to signal superior quality? The Texas Nursing Facilities Medicaid Cost Report, the Texas Quality Reporting System, and the Area Resource File were merged for the year 2003. We used three alternative measures of quality to improve the robustness of this exploratory analysis. Quality measures were examined using Bonferroni correlation coefficient analysis. Associations between advertising expenses and quality were evaluated using three regression models predicting quality. We also examined the association of the price of a private bed per day with quality. Advertising expenses were not associated with better nursing home quality as measured by three quality scales. The average price customers pay for one private bed per day was associated with better quality in only one of the three quality regression models. The price of nursing home care might be a better indicator of quality, and it may need to increase as quality of care is improved in the nursing home sector. Because higher advertising expenditures are not necessarily associated with better quality, consumers could be misled by advertisements and choose poor-quality nursing homes. Nursing home administrators should focus on customer relationship management tools instead of expensive advertising. Relationship management tools are proven marketing techniques for the health services sector, are usually less expensive than advertising, and help with staff retention and quality outcomes.
NASA Astrophysics Data System (ADS)
Vermeire, B. C.; Witherden, F. D.; Vincent, P. E.
2017-04-01
First- and second-order accurate numerical methods, implemented for CPUs, underpin the majority of industrial CFD solvers. Whilst this technology has proven very successful at solving steady-state problems via a Reynolds Averaged Navier-Stokes approach, its utility for undertaking scale-resolving simulations of unsteady flows is less clear. High-order methods for unstructured grids and GPU accelerators have been proposed as an enabling technology for unsteady scale-resolving simulations of flow over complex geometries. In this study we systematically compare accuracy and cost of the high-order Flux Reconstruction solver PyFR running on GPUs and the industry-standard solver STAR-CCM+ running on CPUs when applied to a range of unsteady flow problems. Specifically, we perform comparisons of accuracy and cost for isentropic vortex advection (EV), decay of the Taylor-Green vortex (TGV), turbulent flow over a circular cylinder, and turbulent flow over an SD7003 aerofoil. We consider two configurations of STAR-CCM+: a second-order configuration, and a third-order configuration, where the latter was recommended by CD-adapco for more effective computation of unsteady flow problems. Results from both PyFR and STAR-CCM+ demonstrate that third-order schemes can be more accurate than second-order schemes for a given cost e.g. going from second- to third-order, the PyFR simulations of the EV and TGV achieve 75× and 3× error reduction respectively for the same or reduced cost, and STAR-CCM+ simulations of the cylinder recovered wake statistics significantly more accurately for only twice the cost. Moreover, advancing to higher-order schemes on GPUs with PyFR was found to offer even further accuracy vs. cost benefits relative to industry-standard tools.
Toward an Ethical Framework for Climate Services
NASA Astrophysics Data System (ADS)
Wilby, R.; Adams, P.; Eitland, E.; Hewitson, B.; Shumake, J.; Vaughan, C.; Zebiak, S. E.
2015-12-01
Climate services offer information and tools to help stakeholders anticipate and/or manage risks posed by climate change. However, climate services lack a cohesive ethical framework to govern their development and application. This paper describes a prototype, open-ended process to form a set of ethical principles to ensure that climate services are effectively deployed to manage climate risks, realize opportunities, and advance human security. We begin by acknowledging the multiplicity of competing interests and motivations across individuals and institutions. Growing awareness of potential climate impacts has raised interest and investments in climate services and led to the entrance of new providers. User demand for climate services is also rising, as are calls for new types of services. Meanwhile, there is growing pressure from funders to operationalize climate research. Our proposed ethical framework applies reference points founded on diverse experiences in western and developing countries, fundamental and applied climate research, different sectors, gender, and professional practice (academia, private sector, government). We assert that climate service providers should be accountable for both their practices and products by upholding values of integrity, transparency, humility, and collaboration. Principles of practice include: communicating all value judgements; eschewing climate change as a singular threat; engaging in the co-exploration of knowledge; establishing mechanisms for monitoring/evaluating procedures and products; declaring any conflicts of interest. Examples of principles of products include: clear and defensible provenance of information; descriptions of the extent and character of uncertainties using terms that are meaningful to intended users; tools and information that are tailored to the context of the user; and thorough documentation of methods and meta-data. We invite the community to test and refine these points.
Hernandez-Valladares, Maria; Rihet, Pascal; Iraqi, Fuad A
2014-01-01
There is growing evidence for human genetic factors controlling the outcome of malaria infection, while molecular basis of this genetic control is still poorly understood. Case-control and family-based studies have been carried out to identify genes underlying host susceptibility to malarial infection. Parasitemia and mild malaria have been genetically linked to human chromosomes 5q31-q33 and 6p21.3, and several immune genes located within those regions have been associated with malaria-related phenotypes. Association and linkage studies of resistance to malaria are not easy to carry out in human populations, because of the difficulty in surveying a significant number of families. Murine models have proven to be an excellent genetic tool for studying host response to malaria; their use allowed mapping 14 resistance loci, eight of them controlling parasitic levels and six controlling cerebral malaria. Once quantitative trait loci or genes have been identified, the human ortholog may then be identified. Comparative mapping studies showed that a couple of human and mouse might share similar genetically controlled mechanisms of resistance. In this way, char8, which controls parasitemia, was mapped on chromosome 11; char8 corresponds to human chromosome 5q31-q33 and contains immune genes, such as Il3, Il4, Il5, Il12b, Il13, Irf1, and Csf2. Nevertheless, part of the genetic factors controlling malaria traits might differ in both hosts because of specific host-pathogen interactions. Finally, novel genetic tools including animal models were recently developed and will offer new opportunities for identifying genetic factors underlying host phenotypic response to malaria, which will help in better therapeutic strategies including vaccine and drug development.
A method for probing the mutational landscape of amyloid structure.
O'Donnell, Charles W; Waldispühl, Jérôme; Lis, Mieszko; Halfmann, Randal; Devadas, Srinivas; Lindquist, Susan; Berger, Bonnie
2011-07-01
Proteins of all kinds can self-assemble into highly ordered β-sheet aggregates known as amyloid fibrils, important both biologically and clinically. However, the specific molecular structure of a fibril can vary dramatically depending on sequence and environmental conditions, and mutations can drastically alter amyloid function and pathogenicity. Experimental structure determination has proven extremely difficult with only a handful of NMR-based models proposed, suggesting a need for computational methods. We present AmyloidMutants, a statistical mechanics approach for de novo prediction and analysis of wild-type and mutant amyloid structures. Based on the premise of protein mutational landscapes, AmyloidMutants energetically quantifies the effects of sequence mutation on fibril conformation and stability. Tested on non-mutant, full-length amyloid structures with known chemical shift data, AmyloidMutants offers roughly 2-fold improvement in prediction accuracy over existing tools. Moreover, AmyloidMutants is the only method to predict complete super-secondary structures, enabling accurate discrimination of topologically dissimilar amyloid conformations that correspond to the same sequence locations. Applied to mutant prediction, AmyloidMutants identifies a global conformational switch between Aβ and its highly-toxic 'Iowa' mutant in agreement with a recent experimental model based on partial chemical shift data. Predictions on mutant, yeast-toxic strains of HET-s suggest similar alternate folds. When applied to HET-s and a HET-s mutant with core asparagines replaced by glutamines (both highly amyloidogenic chemically similar residues abundant in many amyloids), AmyloidMutants surprisingly predicts a greatly reduced capacity of the glutamine mutant to form amyloid. We confirm this finding by conducting mutagenesis experiments. Our tool is publically available on the web at http://amyloid.csail.mit.edu/. lindquist_admin@wi.mit.edu; bab@csail.mit.edu.
Halpern, Diane F
2017-07-01
Contemporary psychology is experiencing tremendous growth in neuroscience, and there is every indication that it will continue to gain in popularity notwithstanding the scarcity of academic positions for newly minted Ph.Ds. Despite the general perception that brain correlates "explain" or "cause" the mind and behavior, these correlates have not yet proven useful in understanding psychological processes, although they offer the possibility of early identification of some disorders. Other recent developments in psychology include increased emphasis on applications and more global representation among researchers and participants. In thinking about the way we want psychology to evolve, psychologists need to pay more than lip service to the idea that complex questions in psychology require multiple levels of analysis with contributions from biological (brain, hormones, and genetics), individual differences and social and cultural perspectives. Early career psychologists who can attain a breadth of knowledge will be well-positioned for a team approach to psychological inquiry. Finally, I offer the belief that an emphasis on enhancing critical thinking skills at all levels of education offers the best hope for the future.
[E-Learning--an important contribution to general medical training and continuing education?].
Ruf, D; Berner, M M; Kriston, L; Härter, M
2008-09-01
There is increasing activity in the development of e-learning modules for general medical training and continuing education. One of the central advantages of e-learning is flexibility regarding the time and place of its use. The quality of the available e-learning opportunities varies quite considerably, and for users it is often not easy to assess the quality of e-learning modules or to find offers of high quality. This could be one reason why, despite the huge number of e-learning modules, still only few students and physicians are using them, even though e-learning has proven to be as effective as, and even more efficient than, learning in the classroom or with paper-based materials. This article summarizes the different models of e-learning, how and where to find offers of high quality, the advantages of using e-learning, and the effectiveness and efficiency of such offers. In addition, problems of e-learning and possibilities for overcoming these problems are shown.
Navigation Constellation Design Using a Multi-Objective Genetic Algorithm
2015-03-26
... programs. This specific tool not only offers high fidelity simulations, but it also offers the visual aid provided by STK. The ability to ... MATLAB and STK. STK is a program that allows users to model, analyze, and visualize space systems. Users can create objects such as satellites and ... position dilution of precision (PDOP) and system cost. This thesis utilized Satellite Tool Kit (STK) to calculate PDOP values of navigation ...
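The record above is truncated in the source, but the PDOP figure of merit it refers to is a standard GNSS quantity; as a reference point, here is a minimal sketch of the textbook computation from satellite line-of-sight geometry (this is not code from the thesis and does not use STK).

# Hedged sketch: position dilution of precision (PDOP) from receiver-to-satellite
# unit line-of-sight vectors (textbook formula, illustrative values only).
import numpy as np

def pdop(unit_los_vectors):
    # Geometry matrix: each row is [ux, uy, uz, 1], the 1 being the clock term.
    G = np.hstack([unit_los_vectors, np.ones((unit_los_vectors.shape[0], 1))])
    Q = np.linalg.inv(G.T @ G)                   # geometry (cofactor) matrix
    return float(np.sqrt(np.trace(Q[:3, :3])))   # position components only

los = np.array([[0.3, 0.4, 0.866], [-0.5, 0.2, 0.843],
                [0.1, -0.7, 0.707], [-0.2, -0.3, 0.933]])
los /= np.linalg.norm(los, axis=1, keepdims=True)
print(f"PDOP = {pdop(los):.2f}")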
Klamath Falls: High-Power Acoustic Well Stimulation Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Black, Brian
Acoustic well stimulation (AWS) technology uses high-power sonic waves from specific frequency spectra in an attempt to stimulate production in a damaged or low-production wellbore. AWS technology is one of the most promising technologies in the oil and gas industry, but it has proven difficult for the industry to develop an effective downhole prototype. This collaboration between Klamath Falls Inc. and the Rocky Mountain Oilfield Testing Center (RMOTC) included a series of tests using high-power ultrasonic tools to stimulate oil and gas production. Phase I testing was designed and implemented to verify tool functionality, power requirements, and capacity of high-power AWS tools. The purpose of Phase II testing was to validate the production response of wells with marginal production rates to AWS stimulation and to capture and identify any changes in the downhole environment after tool deployment. This final report presents methodology and results.
Faust, Kyle; Faust, David
2015-08-12
Problematic or addictive digital gaming (including all types of electronic devices) can have, and has had, extremely adverse impacts on the lives of many individuals across the world. The understanding of this phenomenon, and the effectiveness of treatment design and monitoring, can be improved considerably by continuing refinement of assessment tools. The present article briefly overviews tools designed to measure problematic or addictive use of digital gaming, the vast majority of which are founded on the Diagnostic and Statistical Manual of Mental Disorders (DSM) criteria for other addictive disorders, such as pathological gambling. Although adapting DSM content and strategies for measuring problematic digital gaming has proven valuable, there are some potential issues with this approach. We discuss the strengths and limitations of current methods for measuring problematic or addictive gaming and provide various recommendations that might help in enhancing or supplementing existing tools, or in developing new and even more effective tools.
The influence of machining condition and cutting tool wear on surface roughness of AISI 4340 steel
NASA Astrophysics Data System (ADS)
Natasha, A. R.; Ghani, J. A.; Che Haron, C. H.; Syarif, J.
2018-01-01
Sustainable machining using cryogenic coolant as the cutting fluid has been proven to enhance some machining outputs. The main objective of the current work was to investigate the influence of the machining conditions (dry and cryogenic), as well as cutting tool wear, on the machined surface roughness of AISI 4340 steel. The experimental tests were performed using chemical vapor deposition (CVD) coated carbide inserts. The values of machined surface roughness were measured at three cutting intervals (beginning, middle, and end of the cut), based on readings of the tool flank wear. The results revealed that cryogenic turning had the greatest influence on surface roughness when machining at lower cutting speed and higher feed rate. Meanwhile, the cutting tool wear was also found to influence the surface roughness, either improving or deteriorating it, based on the severity and mechanism of the flank wear.
NASA Astrophysics Data System (ADS)
Biermann, D.; Kahleyss, F.; Krebs, E.; Upmeier, T.
2011-07-01
Micro-sized applications are gaining more and more relevance for NiTi-based shape memory alloys (SMA). Different types of micro-machining offer unique possibilities for the manufacturing of NiTi components. The advantage of machining is the low thermal influence on the workpiece. This is important, because the phase transformation temperatures of NiTi SMAs can be changed and the components may need extensive post manufacturing. The article offers a simulation-based approach to optimize five-axis micro-milling processes with respect to the special material properties of NiTi SMA. Especially, the influence of the various tool inclination angles is considered for introducing an intelligent tool inclination optimization algorithm. Furthermore, aspects of micro deep-hole drilling of SMAs are discussed. Tools with diameters as small as 0.5 mm are used. The possible length-to-diameter ratio reaches up to 50. This process offers new possibilities in the manufacturing of microstents. The study concentrates on the influence of the cutting speed, the feed and the tool design on the tool wear and the quality of the drilled holes.
Acoustic testing to enhance western forest values and meet customer wood quality needs
Peter Carter; David Briggs; Robert J. Ross; Xiping Wang
2005-01-01
Nondestructive testing (NDT) of wood products, such as lumber and veneer, for stiffness and strength evaluation has been proven and commercialized for many years. The NDT concept has been extended and commercialized in the Director HM-200™ tool for testing logs in advance of processing so manufacturers can make more informed log purchases and better match logs to...
ERIC Educational Resources Information Center
Gilfoil, David M.; Aukers, Steven M.; Jobs, Charles G.
2015-01-01
Over the past decade, Web 2.0 has brought a wealth of opportunities for improving marketing effectiveness; social media platforms, in particular, have proven to be exceptional tools for realizing growth potential. The big question for businesses used to be how to measure and report financial return on investment (ROI) for social media ad spend to…
Your Vote, Your Voice. National Campus Voter Registration Project Organizing Handbook, 2004
ERIC Educational Resources Information Center
National Association of Independent Colleges and Universities, 2004
2004-01-01
Campus voter registration and education programs are powerful and proven tools for building voter participation. Young people who go to college are far more likely to vote than their peers who do not. U.S. Census Bureau data on the 2000 general election show that those with bachelor's degrees were twice as likely to vote (75 percent) as were those…
ERIC Educational Resources Information Center
Sanchez-Garcia, Juan; Hamann, Edmund T.; Zuniga, Victor
2012-01-01
For 5 years, this research team has sought to learn from more than 700 students encountered in Mexican schools, who had previous experience attending schools in the United States. Although this study has used mixed methods, 1 tool--the written survey--has proven particularly valuable as a means to build profiles of such transnational students.…
Beccalli, Egle M; Broggini, Gianluigi; Gazzola, Silvia; Mazza, Alberto
2014-09-21
The double functionalization of carbon-carbon multiple bonds in one-pot processes has emerged in recent years as a fruitful tool for the rapid synthesis of complex molecular scaffolds. This review covers the advances in domino reactions promoted by the palladium(II)/copper(II) couple, which has proven to be an excellent catalytic system for the functionalization of substrates.
Stand-Alone Measurements and Characterization | Photovoltaic Research
The Science and Technology Facility cluster tools offer powerful capabilities for measuring and characterizing photovoltaic materials and devices. The stand-alone Measurements and Characterization tool suite is supplemented by the Integrated Measurements and Characterization (M&C) cluster tool; samples can reach the Integrated M&C cluster tool via a mobile transport pod, which can keep samples under vacuum.
NASA Technical Reports Server (NTRS)
Marsell, Brandon; Griffin, David; Schallhorn, Dr. Paul; Roth, Jacob
2012-01-01
Coupling computational fluid dynamics (CFD) with a controls analysis tool elegantly allows for high accuracy predictions of the interaction between sloshing liquid propellants and the control system of a launch vehicle. Instead of relying on mechanical analogs which are not valid during all stages of flight, this method allows for a direct link between the vehicle dynamic environments calculated by the solver in the controls analysis tool to the fluid flow equations solved by the CFD code. This paper describes such a coupling methodology, presents the results of a series of test cases, and compares said results against equivalent results from extensively validated tools. The coupling methodology, described herein, has proven to be highly accurate in a variety of different cases.
Integrated CFD and Controls Analysis Interface for High Accuracy Liquid Propellant Slosh Predictions
NASA Technical Reports Server (NTRS)
Marsell, Brandon; Griffin, David; Schallhorn, Paul; Roth, Jacob
2012-01-01
Coupling computational fluid dynamics (CFD) with a controls analysis tool elegantly allows for high accuracy predictions of the interaction between sloshing liquid propellants and the control system of a launch vehicle. Instead of relying on mechanical analogs which are not valid during all stages of flight, this method allows for a direct link between the vehicle dynamic environments calculated by the solver in the controls analysis tool to the fluid flow equations solved by the CFD code. This paper describes such a coupling methodology, presents the results of a series of test cases, and compares said results against equivalent results from extensively validated tools. The coupling methodology, described herein, has proven to be highly accurate in a variety of different cases.
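To make the coupling idea concrete, here is a deliberately toy sketch of such a co-simulation loop: a stand-in "controls" step advances the vehicle using the latest fluid load, and a stand-in "CFD" step advances the propellant under the new acceleration. Neither stand-in is real controls or CFD code and nothing here is taken from the paper; it only illustrates the per-step data exchange.

# Hedged sketch of a coupled time-stepping loop (toy 1-DOF models, not real solvers).
def controls_step(state, slosh_force, dt):
    # Toy vehicle: acceleration responds to a simple control law plus slosh load.
    accel = -2.0 * state["x"] - 0.5 * state["v"] + slosh_force
    return {"x": state["x"] + state["v"] * dt,
            "v": state["v"] + accel * dt,
            "accel": accel}

def cfd_step(slosh_state, vehicle_accel, dt):
    # Toy slosh mode: a damped oscillator forced by the vehicle acceleration.
    force = -slosh_state["q"]
    q_dot = slosh_state["p"]
    p_dot = -4.0 * slosh_state["q"] - 0.1 * slosh_state["p"] - vehicle_accel
    return ({"q": slosh_state["q"] + q_dot * dt,
             "p": slosh_state["p"] + p_dot * dt},
            force)

vehicle = {"x": 0.1, "v": 0.0, "accel": 0.0}
slosh, force = {"q": 0.0, "p": 0.0}, 0.0
for _ in range(1000):                       # fixed-step coupled time integration
    vehicle = controls_step(vehicle, force, dt=0.01)
    slosh, force = cfd_step(slosh, vehicle["accel"], dt=0.01)
print(vehicle["x"], slosh["q"])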
NASA Technical Reports Server (NTRS)
Modesitt, Kenneth L.
1990-01-01
Since 1984, an effort has been underway at Rocketdyne, manufacturer of the Space Shuttle Main Engine (SSME), to automate much of the analysis procedure conducted after engine test firings. Previously published articles at national and international conferences have contained the context of and justification for this effort. Here, progress is reported in building the full system, including the extensions of integrating large databases with the system, known as Scotty. Inductive knowledge acquisition has proven itself to be a key factor in the success of Scotty. The combination of a powerful inductive expert system building tool (ExTran), a relational data base management system (Reliance), and software engineering principles and Computer-Assisted Software Engineering (CASE) tools makes for a practical, useful and state-of-the-art application of an expert system.
NASA Astrophysics Data System (ADS)
Krause, Lee S.; Burns, Carla L.
2000-06-01
This paper discusses the research currently in progress to develop the Conceptual Federation Object Model Design Tool. The objective of the Conceptual FOM (C-FOM) Design Tool effort is to provide domain and subject matter experts, such as scenario developers, with automated support for understanding and utilizing available HLA simulation and other simulation assets during HLA Federation development. The C-FOM Design Tool will import Simulation Object Models from HLA reuse repositories, such as the MSSR, to populate the domain space that will contain all the objects and their supported interactions. In addition, the C-FOM tool will support the conversion of non-HLA legacy models into HLA- compliant models by applying proven abstraction techniques against the legacy models. Domain experts will be able to build scenarios based on the domain objects and interactions in both a text and graphical form and export a minimal FOM. The ability for domain and subject matter experts to effectively access HLA and non-HLA assets is critical to the long-term acceptance of the HLA initiative.
PolNet: A Tool to Quantify Network-Level Cell Polarity and Blood Flow in Vascular Remodeling.
Bernabeu, Miguel O; Jones, Martin L; Nash, Rupert W; Pezzarossa, Anna; Coveney, Peter V; Gerhardt, Holger; Franco, Claudio A
2018-05-08
In this article, we present PolNet, an open-source software tool for the study of blood flow and cell-level biological activity during vessel morphogenesis. We provide an image acquisition, segmentation, and analysis protocol to quantify endothelial cell polarity in entire in vivo vascular networks. In combination, we use computational fluid dynamics to characterize the hemodynamics of the vascular networks under study. The tool enables, to our knowledge for the first time, a network-level analysis of polarity and flow for individual endothelial cells. To date, PolNet has proven invaluable for the study of endothelial cell polarization and migration during vascular patterning, as demonstrated by two recent publications. Additionally, the tool can be easily extended to correlate blood flow with other experimental observations at the cellular/molecular level. We release the source code of our tool under the Lesser General Public License. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
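As a hedged illustration of what a network-level polarity measurement can look like numerically (this is not PolNet's implementation), one common convention is the angle between each cell's nucleus-to-Golgi axis and the local flow direction:

# Hedged sketch: per-cell polarity angle relative to the local flow direction.
import numpy as np

def polarity_angles(nucleus_xy, golgi_xy, flow_xy):
    # All inputs are (n, 2) arrays: nucleus and Golgi positions per cell, and
    # the local flow direction sampled at each cell.
    polarity = golgi_xy - nucleus_xy
    polarity /= np.linalg.norm(polarity, axis=1, keepdims=True)
    flow = flow_xy / np.linalg.norm(flow_xy, axis=1, keepdims=True)
    cosines = np.clip(np.sum(polarity * flow, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cosines))  # 0 deg = with flow, 180 deg = against

# Toy example: two cells with flow pointing along +x
print(polarity_angles(np.array([[0., 0.], [1., 1.]]),
                      np.array([[1., 0.], [1., 2.]]),
                      np.array([[1., 0.], [1., 0.]])))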
Screening and assessment tools for pediatric malnutrition.
Huysentruyt, Koen; Vandenplas, Yvan; De Schepper, Jean
2016-06-18
The ideal measures for screening and assessing undernutrition in children remain a point of discussion in literature. This review aims to provide an overview of recent advances in the nutritional screening and assessment methods in children. This review focuses on two major topics that emerged in literature since 2015: the practical endorsement of the new definition for pediatric undernutrition, with a focus on anthropometric measurements and the search for a consensus on pediatric nutritional screening tools in different settings. Few analytical tools exist for the assessment of the nutritional status in children. The subjective global nutritional assessment has been validated by anthropometric as well as clinical outcome parameters. Nutritional screening can help in selecting patients that benefit the most from a full nutritional assessment. Two new screening tools have been developed for use in a general (mixed) hospital population, and one for a population of children with cancer. The value of screening tools in different disease-specific and outpatient pediatric populations remains to be proven.
Lindsey, Cary R.; Neupane, Ghanashym; Spycher, Nicolas; ...
2018-01-03
Although many Known Geothermal Resource Areas in Oregon and Idaho were identified during the 1970s and 1980s, few were subsequently developed commercially. Because of advances in power plant design and energy conversion efficiency since the 1980s, some previously identified KGRAs may now be economically viable prospects. Unfortunately, available characterization data vary widely in accuracy, precision, and granularity, making assessments problematic. In this paper, we suggest a procedure for comparing test areas against proven resources using Principal Component Analysis and cluster identification. The result is a low-cost tool for evaluating potential exploration targets using uncertain or incomplete data.
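A minimal sketch of the kind of workflow the abstract describes, comparing candidate areas against proven resources with principal component analysis followed by clustering; the feature names, input file, cluster count, and the "is_proven_resource" flag below are assumptions for illustration, not taken from the paper.

# Hedged sketch: standardize features, reduce with PCA, cluster, then check
# which clusters the proven (developed) resources fall into.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

sites = pd.read_csv("kgra_features.csv")                 # hypothetical table
features = ["reservoir_temp", "silica_geothermometer", "heat_flow", "tds"]

sites = sites.dropna(subset=features)
X = StandardScaler().fit_transform(sites[features])
scores = PCA(n_components=2).fit_transform(X)            # compress correlated data
sites["cluster"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)

# The share of proven resources per cluster hints at which clusters are prospective.
print(sites.groupby("cluster")["is_proven_resource"].mean())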
Percussive Excavation of Lunar Soil
NASA Technical Reports Server (NTRS)
Whittaker, Matthew P.
2008-01-01
It has been suggested that using a percussive motion could improve the efficiency of excavation by up to 90%. If this is proven to be true, it would be very beneficial to excavation projects on the Moon and Mars. The purpose of this study is to design, build, and test a percussive tool which could dig a trench and then compare its data against that of a non-percussive tool of the same shape and size. The results of this test thus far have been inconclusive due to malfunctions in the testbed and percussive bucket; however, results from small-scale experiments confirm this higher efficiency and support further testing.
Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)
NASA Technical Reports Server (NTRS)
Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.
2003-01-01
A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.
Lean management: innovative tools for engaging teams in continuous quality improvement.
Perreault, Lucille; Vaillancourt, Lise; Filion, Catherine; Hadj, Camélia
2014-01-01
Lean management has proven to be a sustainable method to ensure a high level of patient care through innovation and teamwork. It involves a set of six tools that allow for visual management shared among team members. The team focuses their efforts on the improvement of organizational indicators in a standardized and engaging way, resulting in the sustainability of improvements. This article outlines the program's rollout at Montfort Hospital (l'Hôpital Montfort). In only a few months, two pilot units accomplished close to 50 improvements each. In addition, the organizational employee satisfaction questionnaire showed very positive results. Copyright © 2014 Longwoods Publishing.
Problem solving with genetic algorithms and Splicer
NASA Technical Reports Server (NTRS)
Bayer, Steven E.; Wang, Lui
1991-01-01
Genetic algorithms are highly parallel, adaptive search procedures (i.e., problem-solving methods) loosely based on the processes of population genetics and Darwinian survival of the fittest. Genetic algorithms have proven useful in domains where other optimization techniques perform poorly. The main purpose of the paper is to discuss a NASA-sponsored software development project to develop a general-purpose tool for using genetic algorithms. The tool, called Splicer, can be used to solve a wide variety of optimization problems and is currently available from NASA and COSMIC. This discussion is preceded by an introduction to basic genetic algorithm concepts and a discussion of genetic algorithm applications.
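For readers unfamiliar with the technique, the core loop of a genetic algorithm is compact; the following is a generic sketch of the concepts the paper introduces (selection, crossover, mutation), not Splicer's API.

# Hedged sketch: a bare-bones genetic algorithm maximizing a fitness function.
import random

def evolve(fitness, n_bits=20, pop_size=50, generations=100,
           crossover_rate=0.7, mutation_rate=0.01):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():  # tournament selection: fitter of two random individuals
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            if random.random() < crossover_rate:          # one-point crossover
                cut = random.randrange(1, n_bits)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            nxt += [[1 - g if random.random() < mutation_rate else g for g in child]
                    for child in (p1, p2)]                # bit-flip mutation
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

best = evolve(fitness=sum)   # toy problem: maximize the number of 1s
print(best, sum(best))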
NASA Astrophysics Data System (ADS)
Shrivastava, Prakash K.; Asthana, Rajesh; Roy, Sandip K.; Swain, Ashit K.; Dharwadkar, Amit
2012-07-01
The scientific study of quartz grains is a powerful tool in deciphering the depositional environment and mode of transportation of sediments, and ultimately the origin and classification of sediments. Surface microfeatures, angularity, chemical features, and grain-size analysis of quartz grains, collectively reveal the sedimentary and physicochemical processes that acted on the grains during different stages of their geological history. Here, we apply scanning electron microscopic (SEM) analysis to evaluating the sedimentary provenance, modes of transport, weathering characteristics, alteration, and sedimentary environment of selected detrital quartz grains from the peripheral part of two epi-shelf lakes (ESL-1 and ESL-2) of the Schirmacher Oasis of East Antarctica. Our study reveals that different styles of physical weathering, erosive signatures, and chemical precipitation variably affected these quartz grains before final deposition as lake sediments. Statistical analysis (central tendencies, sorting, skewness, and kurtosis) indicates that these quartz-bearing sediments are poorly sorted glaciofluvial sediments. Saltation and suspension seem to have been the two dominant modes of transportation, and chemical analysis of these sediments indicates a gneissic provenance.
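As a pointer to how the grain-size statistics mentioned above (central tendency, sorting, skewness, kurtosis) are conventionally computed, here is a small sketch of the Folk and Ward graphic parameters from phi-scale percentiles; this is a standard formulation, not code or data from the study.

# Hedged sketch: Folk & Ward graphic grain-size statistics from phi percentiles.
import numpy as np

def folk_ward_stats(phi_sizes, weights):
    # phi_sizes: grain sizes in phi units; weights: corresponding mass fractions.
    order = np.argsort(phi_sizes)
    phi, w = np.asarray(phi_sizes, float)[order], np.asarray(weights, float)[order]
    cum = np.cumsum(w) / np.sum(w) * 100.0
    p = lambda q: np.interp(q, cum, phi)        # phi value at a given percentile
    p5, p16, p25, p50, p75, p84, p95 = map(p, (5, 16, 25, 50, 75, 84, 95))
    mean = (p16 + p50 + p84) / 3.0
    sorting = (p84 - p16) / 4.0 + (p95 - p5) / 6.6
    skewness = ((p16 + p84 - 2 * p50) / (2 * (p84 - p16))
                + (p5 + p95 - 2 * p50) / (2 * (p95 - p5)))
    kurtosis = (p95 - p5) / (2.44 * (p75 - p25))
    return mean, sorting, skewness, kurtosis

# Toy example: five phi classes with illustrative mass fractions
print(folk_ward_stats([0.0, 1.0, 2.0, 3.0, 4.0], [5, 20, 40, 25, 10]))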
Provenance information as a tool for addressing engineered nanoparticle reproducibility challenges
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baer, Donald R.; Munusamy, Prabhakaran; Thrall, Brian D.
Nanoparticles of various types are of increasing research and technological importance in biological and other applications. Difficulties in the production and delivery of nanoparticles with consistent and well-defined properties appear in many forms and have a variety of causes. Among several issues are those associated with incomplete information about the history of particles involved in research studies, including the synthesis method, the sample history after synthesis (time and nature of storage), and the detailed nature of any sample processing or modification. In addition, the tendency of particles to change with time or environmental condition suggests that the time between analysis and application is important and that some type of consistency or verification process can be valuable. The essential history of a set of particles can be captured as provenance information, which tells the origin or source of a batch of nano-objects along with information related to handling and any changes that may have taken place since the batch originated. A record of sample provenance information for a set of particles can play a useful role in identifying some of the sources of particle variability, and in decreasing its extent and the lack of reproducibility observed by many researchers.
NASA Technical Reports Server (NTRS)
Smith, Matthew R.; Molthan, Andrew L.; Fuell, Kevin K.; Jedlovec, Gary J.
2012-01-01
SPoRT is a team of NASA/NOAA scientists focused on demonstrating the utility of NASA and future NOAA data and derived products for improving short-term weather forecasts. The team works collaboratively with a suite of unique products and selected WFOs in an end-to-end transition activity, with stable funding from NASA and NOAA. It is recognized by the science community as the "go to" place for transitioning experimental and research data to the operational weather community, is endorsed by NWS ESSD/SSD chiefs, and follows a proven paradigm for transitioning satellite observations and modeling capabilities to operations (R2O). SPoRT's transition of NASA satellite instruments provides unique or higher-resolution data products to complement the baseline suite of geostationary data available to forecasters, and its partnership with NWS WFOs provides them with unique imagery to support disaster response and local forecast challenges. SPoRT has years of proven experience in developing and transitioning research products to the operational weather community and has begun work with CONUS and OCONUS WFOs to determine the best products for maximum benefit to forecasters. VIIRS has already proven to be another extremely powerful tool, enhancing forecasters' ability to handle difficult forecasting situations.
Quantum network with trusted and untrusted relays
NASA Astrophysics Data System (ADS)
Ma, Xiongfeng; Annabestani, Razieh; Fung, Chi-Hang Fred; Lo, Hoi-Kwong; Lütkenhaus, Norbert; PitkäNen, David; Razavi, Mohsen
2012-02-01
Quantum key distribution allows two distant users to establish a random secure key by exploiting properties of quantum mechanics, and its security has been proven in theory. In practice, many lab and field demonstrations have been performed in the last 20 years. Nowadays, quantum networks with quantum key distribution systems are being tested around the world, for example in China, Europe, Japan, and the US. In this talk, I will give a brief introduction to recent developments in quantum networks. For the untrusted-relay part, I will introduce the measurement-device-independent quantum key distribution scheme and a quantum relay with linear optics. The security of such a scheme is proven without assumptions about the detection devices, which is where most quantum hacking strategies are launched. This scheme can be realized with current technology. For the trusted-relay part, I will introduce so-called delayed privacy amplification, with which no error correction or privacy amplification needs to be performed between the users and the relay. In this way, the classical communication and computational power requirements at the relay site are reduced.
Poland, Michael P; Nugent, Chris D; Wang, Hui; Chen, Liming
2009-01-01
Smart Homes offer potential solutions for various forms of independent living for the elderly. The assistive and protective environment afforded by smart homes offers a safe, relatively inexpensive, dependable, and viable alternative for vulnerable inhabitants. Nevertheless, the success of a smart home rests upon the quality of information its decision support system receives, and this in turn places great importance on the issue of correct sensor deployment. In this article we present a software tool that has been developed to address the elusive issue of sensor distribution within smart homes. Details of the tool will be presented and it will be shown how it can be used to emulate any real-world environment, whereby virtual sensor distributions can be rapidly implemented and assessed without the requirement for physical deployment for evaluation. As such, this approach offers the potential of tailoring sensor distributions to the specific needs of a patient in a non-invasive manner. The heuristics-based tool presented here has been developed as the first part of a three-stage project.
Long life, low cost, rechargeable AgZn battery for non-military applications
NASA Astrophysics Data System (ADS)
Brown, Curtis C.
1996-03-01
Of the rechargeable (secondary) battery systems with mature technology, the silver oxide-zinc system (AgZn) safely offers the highest power and energy (watts and watt hours) per unit of volume and mass. As a result, these batteries have long been used for aerospace and defense applications, where they have also proven their high reliability. In the past, the expense associated with the cost of silver and the resulting low production volume limited their commercial application. However, the relatively low cost of silver now makes this system feasible in many applications where high energy and reliability are required. One area of commercial potential is power for a new generation of sophisticated, portable medical equipment. AgZn batteries have recently proven to be "enabling technology" for power-critical, advanced medical devices. By extending the cycle and calendar life of the system (offering both improved performance and lower operating cost), a combination is achieved which may enable a wide range of future electrical devices. Other nonmilitary areas where AgZn batteries have been used to provide power and aid in the development of commercial equipment include: (a) electrically powered vehicles; (b) remote sensing in nuclear facilities; (c) special effects equipment for movies; (d) remote sensing in petroleum pipelines; (e) portable computers; (f) fly-by-wire systems for commercial aircraft; and (g) robotics. However, none of these applications has progressed to the level where the volume required will significantly lower cost.
ERIC Educational Resources Information Center
DuBois, Bryce; Allred, Shorna; Bunting-Howarth, Katherine; Sanderson, Eric W.; Giampieri, Mario
2017-01-01
The Welikeia project and the corresponding free online tool Visionmaker.NYC focus on the historical landscape ecologies of New York City. This article provides a brief introduction to online participatory tools, describes the Visionmaker tool in detail, and offers suggested ways to use the tool for Extension professionals based in and outside New…
ERIC Educational Resources Information Center
Hartsell, Taralynn S.; Yuen, Steve Chi-Yin
2003-01-01
Discusses advantages and limitations of online exams, describes available software tools for creating computer-based tests (CGI, JavaScript, commercial programs, course authoring tools), and offers suggestions for implementation. (JOW)
Huebner, D M; Binson, D; Pollack, L M; Woods, W J
2012-03-01
Implementing HIV voluntary counselling and testing (VCT) in bathhouses is a proven public health strategy for reaching high-risk men who have sex with men (MSM) and efficiently identifying new HIV cases. However, some bathhouse managers are concerned that VCT programmes could adversely affect business. This study examined whether offering VCT on the premises of a bathhouse changed patterns of patron visits. A collaborating bathhouse provided electronic anonymized patron data from their entire population of attendees. VCT was offered on premises with varying frequencies over the course of three years. Club entrances and exits were modelled as a function of intensity of VCT programming. Club entrances did not differ as a function of how many days per week testing was being offered in a given month. Additionally, club entrances did not decrease, nor did club exits increase, during specific half-hour time periods when testing was offered. Implementing bathhouse-based VCT did not have any demonstrable impact on patronage. Public health officials can leverage these results to help alleviate club managers' concerns about patron reactions to providing testing on site, and to support expanding sexual health programmes for MSM in these venues.
Choosing the Right Tool for the Job: RNAi, TALEN or CRISPR
Boettcher, Michael; McManus, Michael T.
2015-01-01
The most widely used approach for defining a gene's function is to reduce or completely disrupt its normal expression. For over a decade, RNAi has ruled the lab, offering a magic bullet to disrupt gene expression in many organisms. However, new biotechnological tools - specifically CRISPR-based technologies - have become available and are squeezing out RNAi's dominance in mammalian cell studies. These seemingly competing technologies leave research investigators with the question: 'Which technology should I use in my experiment?' This review offers a practical resource to compare and contrast these technologies, guiding the investigator on when and where to use this fantastic array of powerful tools. PMID:26000843
NASA Technical Reports Server (NTRS)
Moghaddam, M.; Saatchi, S.
1996-01-01
To understand and predict the functioning of forest biomes, their interaction with the atmosphere, and their growth rates, knowledge of the moisture content of their canopy and floor soil is essential. Synthetic aperture radar on airborne and spaceborne platforms has proven to be a flexible tool for measuring electromagnetic backscattering properties of vegetation related to its moisture content.
ERIC Educational Resources Information Center
Thoron, Andrew C.; Myers, Brian E.
2011-01-01
The National Research Council has recognized the challenge of assessing laboratory investigation and called for the investigation of assessments that are proven through sound research-based studies. The Vee map provides a framework that allows the learners to conceptualize their previous knowledge as they develop success in meaningful learning…
Meteor Beliefs Project: Spears of God
NASA Astrophysics Data System (ADS)
Hendrix, Howard V.; McBeath, Alastair; Gheorghe, Andrei Dorian
2012-04-01
A selection of genuine or supposedly sky-fallen objects from real-world sources, a mixture of weapons, tools, and "magical" objects of heavenly provenance, is drawn from their re-use in the near-future science-fiction novel Spears of God by author Howard V. Hendrix, with additional discussion. The book includes other meteoric and meteoritic items too, some of which have been the subject of previous Meteor Beliefs Project examinations.
NASA Technical Reports Server (NTRS)
Uber, James G.
1988-01-01
Software itself is not hazardous, but since software and hardware share common interfaces there is an opportunity for software to create hazards. Further, these software systems are complex, and proven methods for the design, analysis, and measurement of software safety are not yet available. Some past software failures, future NASA software trends, software engineering methods, and tools and techniques for various software safety analyses are reviewed. Recommendations to NASA are made based on this review.
ERIC Educational Resources Information Center
Almendarez Barron, Maria
2012-01-01
The National Council for Accreditation of Teacher Education has called for strengthening teacher preparation by incorporating more fieldwork. Supervision with effective instructional feedback is an essential component of meaningful fieldwork, and immediate feedback has proven more efficacious than delayed feedback. Rock and her colleagues have…
Development of a Communications Front End Processor (FEP) for the VAX-11/780 Using an LSI-11/23.
1983-12-01
... proven to be useful [25] during the Software Development Life Cycle of a project. Development tools and documentation aids used throughout this effort include "Structure Charts" (ref Appendix B), a "Data Dictionary" (ref Appendix C), and a Program Design Language (PDL).
ERIC Educational Resources Information Center
Howe, Erica M.
2007-01-01
The history of science (HOS) has proven to be a useful pedagogical tool to help students learn about what has come to be regarded as an agreed upon set of core nature of science (NOS) tenets. The following article illustrates an example of how teachers can instrumentally use the history of research on heterozygote protection in sickle-cell anemia…
ERIC Educational Resources Information Center
Mowling, Claire M.; Menear, Kristi; Dennen, Ayla; Fittipaldi-Wert, Jeanine
2018-01-01
The use of technology has proven to be a successful tool for enhancing the learning of children with disabilities. One example is the use of video-recorded social story movies as interventions for children with an autism spectrum disorder (ASD). Through the use of electronic devices such as iPads, iPods and tablets, social stories are brought to…
Relevance of ERTS-1 to the state of Ohio
NASA Technical Reports Server (NTRS)
Sweet, D. C. (Principal Investigator); Wells, T. L.; Wukelic, G. E.
1973-01-01
The author has identified the following significant results. To date, only one significant result has been reported for the Ohio ERTS program. This result relates to the proven usefulness of ERTS-1 imagery for mapping and inventorying strip-mined areas in southeastern Ohio. ERTS provides a tool for rapidly and economically acquiring an up-to-date inventory of strip-mined lands for state planning purposes which was not previously possible.
AtomPy: an open atomic-data curation environment
NASA Astrophysics Data System (ADS)
Bautista, Manuel; Mendoza, Claudio; Boswell, Josiah S; Ajoku, Chukwuemeka
2014-06-01
We present a cloud-computing environment for atomic data curation, networking among atomic data providers and users, teaching-and-learning, and interfacing with spectral modeling software. The system is based on Google-Drive Sheets, Pandas (Python Data Analysis Library) DataFrames, and IPython Notebooks for open community-driven curation of atomic data for scientific and technological applications. The atomic model for each ionic species is contained in a multi-sheet Google-Drive workbook, where the atomic parameters from all known public sources are progressively stored. Metadata (provenance, community discussion, etc.) accompanying every entry in the database are stored through Notebooks. Education tools on the physics of atomic processes as well as their relevance to plasma and spectral modeling are based on IPython Notebooks that integrate written material, images, videos, and active computer-tool workflows. Data processing workflows and collaborative software developments are encouraged and managed through the GitHub social network. Relevant issues this platform intends to address are: (i) data quality by allowing open access to both data producers and users in order to attain completeness, accuracy, consistency, provenance and currentness; (ii) comparisons of different datasets to facilitate accuracy assessment; (iii) downloading to local data structures (i.e. Pandas DataFrames) for further manipulation and analysis by prospective users; and (iv) data preservation by avoiding the discard of outdated sets.
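A minimal sketch of the data-access pattern such a setup implies: pulling one worksheet of a shared Google-Drive workbook into a Pandas DataFrame. The sheet ID, tab, and column names below are placeholders, not AtomPy's actual workbooks.

# Hedged sketch: load a publicly shared Google Sheet of atomic data into Pandas.
import pandas as pd

SHEET_ID = "YOUR_SHEET_ID"          # placeholder, not a real AtomPy workbook
GID = "0"                           # worksheet (tab) id within the workbook
url = (f"https://docs.google.com/spreadsheets/d/{SHEET_ID}"
       f"/export?format=csv&gid={GID}")

levels = pd.read_csv(url)           # e.g. energy levels for one ionic species
# Compare entries across data sources, in the spirit of open curation
# (column names are hypothetical):
print(levels.groupby("source")["energy_cm1"].describe())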
Development of a DNA Microarray-Based Assay for the Detection of Sugar Beet Root Rot Pathogens.
Liebe, Sebastian; Christ, Daniela S; Ehricht, Ralf; Varrelmann, Mark
2016-01-01
Sugar beet root rot diseases that occur during the cropping season or in storage are accompanied by high yield losses and a severe reduction of processing quality. The vast diversity of microorganism species involved in rot development requires molecular tools allowing simultaneous identification of many different targets. Therefore, a new microarray technology (ArrayTube) was applied in this study to improve diagnosis of sugar beet root rot diseases. Based on three marker genes (internal transcribed spacer, translation elongation factor 1 alpha, and 16S ribosomal DNA), 42 well-performing probes enabled the identification of prevalent field pathogens (e.g., Aphanomyces cochlioides), storage pathogens (e.g., Botrytis cinerea), and ubiquitous spoilage fungi (e.g., Penicillium expansum). All probes were validated for specificity with pure cultures from 73 microorganism species, as well as for in planta detection of their target species using inoculated sugar beet tissue. Microarray-based identification of root rot pathogens in diseased field beets was successfully confirmed by classical detection methods. The high discriminatory potential was proven by Fusarium species differentiation based on a single nucleotide polymorphism. The results demonstrate that the ArrayTube constitutes an innovative tool allowing rapid and reliable detection of plant pathogens, particularly when multiple microorganism species are present.
Development of a virtual lab for practical eLearning in eHealth.
Herzog, Juliane; Forjan, Mathias; Sauermann, Stefan; Mense, Alexander; Urbauer, Philipp
2015-01-01
In recent years, an ongoing development of educational offerings for professionals working in the field of eHealth has been observed. This education is increasingly offered in the form of eLearning courses. Furthermore, it can be seen that simulations are a valuable component in supporting knowledge transfer. Based on the knowledge profiles defined for eHealth courses, a virtual lab was to be developed. For this purpose, a subset of skills and a use case were determined. After searching for and evaluating appropriate simulation and testing tools, six tools were chosen to implement the use case practically. A UML use case diagram represents the interaction between the tools and the user. Initial tests have shown good results regarding the tools' feasibility. After an extensive testing phase, the tools should be integrated into the eHealth eLearning courses.
Human eye haptics-based multimedia.
Velandia, David; Uribe-Quevedo, Alvaro; Perez-Gutierrez, Byron
2014-01-01
Immersive and interactive multimedia applications offer complementary study tools in anatomy, as users can explore 3D models while obtaining information about the organ, tissue, or part being explored. Haptics increases the sense of interaction with virtual objects, improving the user experience in a more realistic manner. Common tools for studying the eye are books, illustrations, and assembly models; more recently, these are being complemented with mobile apps whose 3D capabilities, computing power, and user base are increasing. The goal of this project is to develop a complementary eye anatomy and pathology study tool using deformable models within a multimedia application, offering students the opportunity to explore the eye up close and from within, together with relevant information. Validation of the tool provided feedback on the potential of the development, along with suggestions for improving haptic feedback and navigation.
NASA Astrophysics Data System (ADS)
Ma, X.; Zheng, J. G.; Goldstein, J.; Duggan, B.; Xu, J.; Du, C.; Akkiraju, A.; Aulenbach, S.; Tilmes, C.; Fox, P. A.
2013-12-01
The periodic National Climate Assessment (NCA) of the US Global Change Research Program (USGCRP) [1] produces reports about findings on global climate change and the impacts of climate change on the United States. Those findings are of great public and academic concern and are used in policy and management decisions, which makes the provenance information of findings in those reports especially important. The USGCRP is developing a Global Change Information System (GCIS), in which the NCA reports and associated provenance information are the primary records. We modeled and developed Semantic Web applications for the GCIS. By applying a use case-driven iterative methodology [2], we developed an ontology [3] to represent the content structure of a report and the associated provenance information. We also mapped the classes and properties in our ontology to the W3C PROV-O ontology [4] to realize the formal representation of provenance. We successfully implemented the ontology in several pilot systems for a recent National Climate Assessment report (i.e., the NCA3). These systems provide users with the ability to browse and search provenance information by topic of interest. Provenance information of the NCA3 has been made structured and interoperable by applying the developed ontology. Besides the pilot systems we developed, other tools and services are also able to interact with the data in the context of the 'Web of data' and thus create added value. Our research shows that the use case-driven iterative method bridges the gap between Semantic Web researchers and earth and environmental scientists and can be deployed rapidly for developing Semantic Web applications. Our work also provides first-hand experience of re-using the W3C PROV-O ontology in the field of earth and environmental sciences, as the PROV-O ontology was only recently ratified (on 04/30/2013) by the W3C as a recommendation and relevant applications are still rare. [1] http://www.globalchange.gov [2] Fox, P., McGuinness, D.L., 2008. TWC Semantic Web Methodology. Accessible at: http://tw.rpi.edu/web/doc/TWC_SemanticWebMethodology [3] https://scm.escience.rpi.edu/svn/public/projects/gcis/trunk/rdf/schema/GCISOntology.ttl [4] http://www.w3.org/TR/prov-o/
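As an illustration of what expressing such provenance with the W3C PROV-O vocabulary can look like in practice, here is a small rdflib sketch; the entity names and namespace are invented for the example and are not identifiers from the GCIS ontology.

# Hedged sketch: linking a report finding to a source dataset with PROV-O terms.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/gcis/")        # placeholder namespace

g = Graph()
g.bind("prov", PROV)

finding = EX["finding/sea-level-rise"]            # invented identifiers
dataset = EX["dataset/tide-gauge-records"]
activity = EX["activity/trend-analysis"]

g.add((finding, RDF.type, PROV.Entity))
g.add((dataset, RDF.type, PROV.Entity))
g.add((activity, RDF.type, PROV.Activity))
g.add((finding, PROV.wasDerivedFrom, dataset))    # core provenance link
g.add((finding, PROV.wasGeneratedBy, activity))
g.add((activity, PROV.used, dataset))
g.add((finding, RDFS.label, Literal("Example finding (illustrative only)")))

print(g.serialize(format="turtle"))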
Don't leave data unattended at any time!
NASA Astrophysics Data System (ADS)
Fleischer, D.; Czerniak, A.; Schirnick, C.
2013-12-01
The architecture of the Kiel Data Management Infrastructure (KDMI) is set up to serve the full data life cycle, from the data creation process all the way to the data publication procedure. Accordingly, the KDMI manages data from the very beginning of the data life cycle and does not leave data unattended at this crucial time. Starting from the chosen working procedure down to handwritten protocols or lab notes, the provenance of the resulting research data is captured within the KDMI. The provenance definition system is the fundamental capturing tool for working procedures. A provenance definition is used to enable data input by file import, web client or handwriting recognition. The provenance system for data holds the unpublished in-house research data created directly on site. This system serves as a master for research data systems with more degrees of freedom with regard to technology, design or performance (e.g. GraphDB). Such research systems can be regarded as compilations of unpublished data and public-domain data, e.g. from World Data Centers or archives, and these compilations can be used to run statistical data mining and pattern-finding algorithms on specially designed platforms. The architecture of the KDMI ensures that a technical solution for data correction from the slave systems back to the master system is possible, which improves the quality of the data stored in the provenance system. After the research phase is over and the interpretation is finished, the provenance system is used by a workflow-based publication system called PubFlow. Within PubFlow it is possible to create repeatable workflows to publish data into various external long-term archives or World Data Centers. The KDMI relies on persistent identifiers for samples and person identities to support this automated publication process. The publication process is the final step of the KDMI, and management responsibility for the long-term part of the data life cycle is handed over to the chosen archive. Nevertheless, the provenance information remains at the KDMI, and the definitions may serve future datasets again.
SensePath: Understanding the Sensemaking Process Through Analytic Provenance.
Nguyen, Phong H; Xu, Kai; Wheat, Ashley; Wong, B L William; Attfield, Simon; Fields, Bob
2016-01-01
Sensemaking is described as the process of comprehension, finding meaning and gaining insight from information, producing new knowledge and informing further action. Understanding the sensemaking process allows building effective visual analytics tools to make sense of large and complex datasets. Currently, understanding this process is often a manual and time-consuming undertaking: researchers collect observation data, transcribe screen capture videos and think-aloud recordings, identify recurring patterns, and eventually abstract the sensemaking process into a general model. In this paper, we propose a general approach to facilitate such a qualitative analysis process, and introduce a prototype, SensePath, to demonstrate the application of this approach with a focus on browser-based online sensemaking. The approach is based on a study of a number of qualitative research sessions, including observations of users performing sensemaking tasks and post hoc analyses to uncover their sensemaking processes. Based on the study results and a follow-up participatory design session with HCI researchers, we decided to focus on the transcription and coding stages of thematic analysis. SensePath automatically captures the user's sensemaking actions, i.e., analytic provenance, and provides multi-linked views to support further analysis. A number of other requirements elicited from the design session are also implemented in SensePath, such as easy integration with existing qualitative analysis workflows and non-intrusiveness for participants. The tool was used by an experienced HCI researcher to analyze two sensemaking sessions. The researcher found the tool intuitive and reported that it considerably reduced analysis time, allowing a better understanding of the sensemaking process.
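To make the idea of automatic analytic-provenance capture concrete, here is a minimal sketch of the kind of timestamped action log such a tool might accumulate; the action names, fields and JSON export are hypothetical and are not SensePath's actual data model.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class ProvenanceAction:
    """One captured sensemaking action (hypothetical schema)."""
    timestamp: float   # seconds since the epoch
    action: str        # e.g. "search", "filter", "read", "annotate"
    target: str        # page URL or UI element the action applied to
    detail: str = ""   # free-text payload such as a query string

log = []

def record(action, target, detail=""):
    log.append(ProvenanceAction(time.time(), action, target, detail))

record("search", "https://example.org", "air quality statistics")
record("read", "https://example.org/report")

# Export the captured provenance for transcription and coding in a qualitative tool.
print(json.dumps([asdict(a) for a in log], indent=2))
```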
ERIC Educational Resources Information Center
Tooley, Melinda
2005-01-01
The different tools that are helpful during the Synthesis stage, their role in boosting students' abilities in Synthesis, and the way in which they can be customized to meet the needs of each group of students are discussed. Big6 TurboTools offers several tools to help complete the task. In the Synthesis stage, these same tools, along with Turbo Report and…
An HTML Tool for Production of Interactive Stereoscopic Compositions.
Chistyakov, Alexey; Soto, Maria Teresa; Martí, Enric; Carrabina, Jordi
2016-12-01
The benefits of stereoscopic vision in medical applications have been appreciated and thoroughly studied for more than a century. The use of stereoscopic displays has a proven positive impact on performance in various medical tasks. At the same time, the market of 3D-enabled technologies is blooming: new high-resolution stereo cameras, TVs, projectors, monitors, and head-mounted displays are becoming available. This equipment, complemented with a corresponding application program interface (API), can be integrated into a system relatively easily. Such systems could open new possibilities for medical applications exploiting stereoscopic depth. This work proposes a tool for the production of interactive stereoscopic graphical user interfaces, which could serve as a software layer for web-based medical systems facilitating the stereoscopic effect. Finally, the tool's operation mode and the results of the subjective and objective performance tests conducted are presented.
Laing, Karen; Baumgartner, Katherine
2005-01-01
Many endoscopy units are looking for ways to improve their efficiency without increasing the number of staff, purchasing additional equipment, or making patients feel as if they have been rushed through the care process. To accomplish this, a few hospitals have looked to other industries for help. Recently, "lean" methods and tools from the manufacturing industry have been applied successfully in health care systems and have proven to be an effective way to eliminate waste and redundancy in workplace processes. "Lean" methods and tools in service organizations focus on providing the most efficient and effective flow of services and products. This article describes the journey of one endoscopy department within a community hospital to illustrate the application of "lean" methods and tools and the results.
Wang, Chao; Yang, Xinzhou; Mellick, George D; Feng, Yunjiang
2016-12-21
Parkinson's disease (PD) is an incurable neurodegenerative disorder with a high prevalence rate worldwide. The fact that there are currently no proven disease-modifying treatments for PD underscores the urgency for a more comprehensive understanding of the underlying disease mechanism. Chemical probes have been proven to be powerful tools for studying biological processes. Traditional Chinese medicine (TCM) contains a huge reservoir of bioactive small molecules as potential chemical probes that may hold the key to unlocking the mystery of PD biology. The TCM-sourced chemical approach to PD biology can be advanced through the use of an emerging cytological profiling (CP) technique that allows unbiased characterization of small molecules and their cellular responses. This comprehensive technique, applied to chemical probe identification from TCM and used for studying the molecular mechanisms underlying PD, may inform future therapeutic target selection and provide a new perspective to PD drug discovery.
Making Sense of 'Big Data' in Provenance Studies
NASA Astrophysics Data System (ADS)
Vermeesch, P.
2014-12-01
Huge online databases can be 'mined' to reveal previously hidden trends and relationships in society. One could argue that sedimentary geology has entered a similar era of 'Big Data', as modern provenance studies routinely apply multiple proxies to dozens of samples. Just like the Internet, sedimentary geology now requires specialised statistical tools to interpret such large datasets. These can be organised on three levels of progressively higher order. A single sample: the most effective way to reveal the provenance information contained in a representative sample of detrital zircon U-Pb ages is probability density estimators such as histograms and kernel density estimates. The widely popular 'probability density plots' implemented in IsoPlot and AgeDisplay compound analytical uncertainty with geological scatter and are therefore invalid. Several samples: multi-panel diagrams comprising many detrital age distributions or compositional pie charts quickly become unwieldy and uninterpretable. For example, if there are N samples in a study, then the number of pairwise comparisons between samples increases quadratically as N(N-1)/2. This is simply too much information for the human eye to process. To solve this problem, it is necessary to (a) express the 'distance' between two samples as a simple scalar and (b) combine all N(N-1)/2 such values in a single two-dimensional 'map', grouping similar and pulling apart dissimilar samples. This can easily be achieved using simple statistics-based dissimilarity measures and a standard statistical method called Multidimensional Scaling (MDS). Several methods: suppose that we use four provenance proxies: bulk petrography, chemistry, heavy minerals and detrital geochronology. This will result in four MDS maps, each of which will likely show slightly different trends and patterns. To deal with such cases, it may be useful to apply a related technique called 'three-way multidimensional scaling'. This results in two graphical outputs: an MDS map, and a map of 'weights' showing to what extent the different provenance proxies influence the horizontal and vertical axes of the MDS map. Thus, detrital data can inform the user not only about the provenance of sediments, but also about the causal relationships between mineralogy, geochronology and chemistry.
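The pairwise-dissimilarity-plus-MDS workflow described above can be sketched in a few lines of Python. Here the Kolmogorov-Smirnov statistic is used as the scalar 'distance' between two detrital age distributions, and the synthetic ages are placeholders for real U-Pb data; the abstract does not prescribe a particular dissimilarity measure, so this choice is an assumption.

```python
import numpy as np
from scipy.stats import ks_2samp
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
# Synthetic detrital zircon U-Pb age distributions (Ma), one array per sample.
ages = {f"sample{i}": rng.normal(loc=1000 + 100 * i, scale=150, size=120) for i in range(6)}

names = list(ages)
n = len(names)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        # KS statistic as a scalar dissimilarity between two age distributions.
        d = ks_2samp(ages[names[i]], ages[names[j]]).statistic
        D[i, j] = D[j, i] = d

# Metric MDS on the precomputed N x N dissimilarity matrix -> two-dimensional "map".
coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(D)
for name, (x, y) in zip(names, coords):
    print(f"{name}: ({x:.3f}, {y:.3f})")
```

Samples with similar age spectra plot close together on the map, which is exactly the grouping behaviour the abstract describes.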
NASA Technical Reports Server (NTRS)
Gangal, M. D.; Isenberg, L.; Lewis, E. V.
1985-01-01
Proposed system offers safety and large return on investment. System, operating by year 2000, employs machines and processes based on proven principles. According to concept, line of parallel machines, connected in groups of four to service modules, attacks face of coal seam. High-pressure water jets and central auger on each machine break face. Jaws scoop up coal chunks, and auger grinds them and forces fragments into slurry-transport system. Slurry pumped through pipeline to point of use. Concept for highly automated coal-mining system increases productivity, makes mining safer, and protects health of mine workers.
Therapeutic options for lymphangioleiomyomatosis (LAM): where we are and where we are going
Steagall, Wendy K; Moss, Joel
2009-01-01
Lymphangioleiomyomatosis (LAM), a multisystem disease affecting predominantly premenopausal and middle-aged women, causes progressive respiratory failure due to cystic lung destruction and is associated with lymphatic and kidney tumors. In the past, the treatment of LAM comprised exclusively anti-estrogen and related hormonal therapies. These treatments, however, have not been proven effective. In this article, we discuss new findings regarding the molecular mechanisms involved in the regulation of LAM cell growth, which may offer opportunities to develop effective and targeted therapeutic agents. PMID:20948684
Berrue, Fabrice; Withers, Sydnor T; Haltli, Brad; Withers, Jo; Kerr, Russell G
2011-03-21
Marine invertebrates have proven to be a rich source of secondary metabolites. The growing recognition that marine microorganisms associated with invertebrate hosts are involved in the biosynthesis of secondary metabolites offers new alternatives for the discovery and development of marine natural products. However, the discovery of microorganisms producing secondary metabolites previously attributed to an invertebrate host poses a significant challenge. This study describes an efficient chemical screening method utilizing a 96-well plate-based bacterial cultivation strategy to identify and isolate microbial producers of marine invertebrate-associated metabolites.
FACTS controllers and HVDC enhance power transmission (A manufacturer's perspective)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Juette, G.; Renz, K.
1995-12-31
Various types of FACTS as well as HVDC have been available for some time. New ones have been developed recently. Their respective benefits are well proven and have been made known. System studies have to be done to make full use of FACTS and HVDC problem solving capabilities. Siemens is offering digital models for correct representation of several FACTS devices and HVDC in widely used time-based simulation study programs. The manufacturers are doing their homework. It is up to the utility industry to make use of it now!
Aurora kinase A interacts with H-Ras and potentiates Ras-MAPK signaling | Office of Cancer Genomics
In cancer, upregulated Ras promotes cellular transformation and proliferation in part through activation of oncogenic Ras-MAPK signaling. While directly inhibiting Ras has proven challenging, new insights into Ras regulation through protein-protein interactions may offer unique opportunities for therapeutic intervention. Here we report the identification and validation of Aurora kinase A (Aurora A) as a novel Ras binding protein. We demonstrate that the kinase domain of Aurora A mediates the interaction with the N-terminal domain of H-Ras.
Fu, Qi; Yang, Lei; Wang, Wenhui; Han, Ali; Huang, Jian; Du, Pingwu; Fan, Zhiyong; Zhang, Jingyu; Xiang, Bin
2015-08-26
The first realization of a tunable bandgap in monolayer WS2(1-x)Se2x is demonstrated. The tuning of the bandgap exhibits a strong dependence on the S and Se content, as proven by PL spectroscopy. Because of its remarkable electronic structure, monolayer WS2(1-x)Se2x exhibits novel electrochemical catalytic activity and offers long-term electrocatalytic stability for the hydrogen evolution reaction. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Management reporting on the Web.
Narayanan, G.; McHolm, G.; Jones, D. T.
2000-01-01
Driven by easy-to-use World Wide Web technology and new information integration concepts that have proven their worth in business and industry, online management reporting is now becoming an important strategy for improving operational performance in health care organizations. In this article, we provide an overview of these new information management concepts and describe our experience in planning and executing an enterprise-wide Web-enabled management reporting initiative. We also offer an inventory of the key organizational capacities that we found essential for developing and sustaining Web-enabled reporting services for health care managers. PMID:11079955
Back, D A; Haberstroh, N; Hoff, E; Plener, J; Haas, N P; Perka, C; Schmidmaier, G
2012-01-01
Modern internet-based information technologies offer great possibilities for creating and improving teaching methods for students. The eLearning tool NESTOR (Network for Students in Traumatology and Orthopedics) presented here was designed to complement the existing clinical teaching in orthopedics and traumatology at the Charité, University Medicine Berlin. Using a learning management system, videos, podcasts, X-ray diagnosis, virtual patients, tests and further tools for learning and study information were combined. After implementation, the eLearning project was evaluated by students. The NESTOR project offers various possibilities for knowledge acquisition. Students using the program voluntarily showed high acceptance: 82.4% were very satisfied with the contents offered and 95.3% supported the idea of a future use of NESTOR in teaching. The blended learning approach was positively evaluated by 93.5% of the students. The project received the eLearning seal of quality of the Charité University Medicine Berlin. The use of complex eLearning tools such as NESTOR represents a contemporary approach to teaching traumatology and orthopedics; such tools should be offered in a blended learning context, as they are well accepted by students.
Increasingly mobile: How new technologies can enhance qualitative research
Moylan, Carrie Ann; Derr, Amelia Seraphia; Lindhorst, Taryn
2015-01-01
Advances in technology, such as the growth of smart phones, tablet computing, and improved access to the internet have resulted in many new tools and applications designed to increase efficiency and improve workflow. Some of these tools will assist scholars using qualitative methods with their research processes. We describe emerging technologies for use in data collection, analysis, and dissemination that each offer enhancements to existing research processes. Suggestions for keeping pace with the ever-evolving technological landscape are also offered. PMID:25798072
A rationale and model for addressing tobacco dependence in substance abuse treatment.
Richter, Kimber P; Arnsten, Julia H
2006-08-14
Most persons in drug treatment smoke cigarettes. Until drug treatment facilities systematically treat their patients' tobacco use, millions will flow through the drug treatment system, overcome their primary drug of abuse, but die prematurely from tobacco-related illnesses. This paper reviews the literature on the health benefits of quitting smoking for drug treatment patients, whether smoking causes relapse to other drug or alcohol abuse, the treatment of tobacco dependence, and good and bad times for quitting smoking among drug treatment patients. It also presents a conceptual model and recommendations for treating tobacco in substance abuse treatment, and provides references to internet and paper-copy tools and information for treating tobacco dependence. At present, research on tobacco treatment in drug treatment is in its infancy. Although few drug treatment programs currently offer formal services, many more will likely begin to treat nicotine dependence as external forces and patient demand for these services increases. In the absence of clear guidelines and attention to quality of care, drug treatment programs may adopt smoking cessation services based on cost, convenience, or selection criteria other than efficacy. Because research in this field is relatively new, substance abuse treatment professionals should adhere to the standards of care for the general population, but be prepared to update their practices with emerging interventions that have proven to be effective for patients in drug treatment.
NMRNet: A deep learning approach to automated peak picking of protein NMR spectra.
Klukowski, Piotr; Augoff, Michal; Zieba, Maciej; Drwal, Maciej; Gonczarek, Adam; Walczak, Michal J
2018-03-14
Automated selection of signals in protein NMR spectra, known as peak picking, has been studied for over 20 years; nevertheless, existing peak picking methods are still largely deficient. Accurate and precise automated peak picking would accelerate structure calculation and the analysis of dynamics and interactions of macromolecules. Recent advances in handling big data, together with an outburst of machine learning techniques, offer an opportunity to tackle the peak picking problem substantially faster than manual picking and on par with human accuracy. In particular, deep learning has proven to systematically achieve human-level performance in various recognition tasks, and thus emerges as an ideal tool to address automated identification of NMR signals. We have applied a convolutional neural network for visual analysis of multidimensional NMR spectra. A comprehensive test on 31 manually annotated spectra has demonstrated top-tier average precision (AP) of 0.9596, 0.9058 and 0.8271 for backbone, side-chain and NOESY spectra, respectively. Furthermore, a combination of extracted peak lists with the automated assignment routine FLYA outperformed other methods, including the manual one, and led to correct resonance assignment at levels of 90.40%, 89.90% and 90.20% for three benchmark proteins. The proposed model is part of the Dumpling software (a platform for protein NMR data analysis) and is available at https://dumpling.bio/. michaljerzywalczak@gmail.com, piotr.klukowski@pwr.edu.pl. Supplementary data are available at Bioinformatics online.
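As an illustration of the general approach (not the authors' actual NMRNet architecture), a small convolutional network that classifies two-dimensional spectral patches as peak versus non-peak might look like the following Keras sketch; the patch size, layer sizes and dummy training data are assumptions made for the example.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Classify 32x32 patches of a 2-D NMR spectrum as "peak" (1) or "no peak" (0).
model = keras.Sequential([
    layers.Input(shape=(32, 32, 1)),
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # probability that the patch centre is a peak
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Random placeholder data only to show the training call; real training uses
# patches cut from manually annotated spectra.
x = np.random.rand(256, 32, 32, 1).astype("float32")
y = np.random.randint(0, 2, size=(256, 1))
model.fit(x, y, epochs=1, batch_size=32, verbose=0)
```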
Optical characterization and polarization calibration for rigid endoscopes
NASA Astrophysics Data System (ADS)
Garcia, Missael; Gruev, Viktor
2017-02-01
Polarization measurements provide information orthogonal to spectral images, making them a great tool for characterizing environmental parameters in nature. Thus, polarization imagery has proven to be remarkably useful in a vast range of biomedical applications. One such application is the early diagnosis of flat cancerous lesions in murine colorectal tumor models, where polarization data complement NIR fluorescence analysis. Advances in nanotechnology have led to compact and precise bio-inspired imaging sensors capable of accurately co-registering multidimensional spectral and polarization information. As more applications emerge for these imagers, the optics used in these instruments become very complex and can potentially compromise the original polarization state of the incident light. Here we present a complete optical and polarization characterization of three rigid endoscopes of size 1.9 mm x 10 cm (Karl Storz, Germany), 5 mm x 30 cm, and 10 mm x 33 cm (Olympus, Germany), used in colonoscopy for the prevention of colitis-associated cancer. The characterization results show that the telescope optics act as retarders and effectively depolarize the linear component. The resulting incorrect readings can cause false positives or false negatives, leading to improper diagnosis. In this paper, we offer a polarization calibration scheme for these endoscopes based on Mueller calculus. By modeling the optical properties from training data as real-valued Mueller matrices, we are able to successfully reconstruct the initial polarization state acquired by the imaging system.
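A minimal numerical sketch of the Mueller-calculus calibration idea follows: the endoscope is modelled as a 4x4 Mueller matrix estimated by least squares from known input Stokes states, and measurements are then corrected by inverting that matrix. The "true" retarder matrix, the training states and the noise level are illustrative assumptions, not the paper's measured values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "true" Mueller matrix of the scope optics: an idealized retarder.
M_true = np.array([[1.0, 0.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0, -1.0],
                   [0.0, 0.0, 1.0, 0.0]])

# Known input Stokes vectors from a polarization state generator (one per column).
S_in = np.array([[1, 1, 1, 1, 1, 1],
                 [1, -1, 0, 0, 0, 0.3],
                 [0, 0, 1, -1, 0, 0.2],
                 [0, 0, 0, 0, 1, 0.5]], dtype=float)

# Simulated measurements through the scope, with a little detector noise.
S_out = M_true @ S_in + 0.01 * rng.standard_normal(S_in.shape)

# Least-squares estimate of the Mueller matrix from the training pairs: S_out ≈ M @ S_in.
M_est = S_out @ np.linalg.pinv(S_in)

# Calibration step: recover the polarization state in front of the scope.
S_corrected = np.linalg.pinv(M_est) @ S_out
print(np.round(M_est, 3))
print(np.round(S_corrected - S_in, 3))  # residuals should be small
```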
Systematic review of recent dementia practice guidelines.
Ngo, Jennifer; Holroyd-Leduc, Jayna M
2015-01-01
Dementia is a highly prevalent acquired cognitive disorder that interferes with activities of daily living, relationships and quality of life. Recognition and effective management strategies are necessary to provide comprehensive care for these patients and their families. High-quality clinical practice guidelines can improve the quality and consistency of care in all aspects of dementia diagnosis and management by clarifying interventions supported by sound evidence and by alerting clinicians to interventions without proven benefit. We aimed to offer a synthesis of existing practice recommendations for the diagnosis and management of dementia, based upon moderate-to-high quality dementia guidelines. We performed a systematic search in EMBASE and MEDLINE as well as the grey literature for guidelines produced between 2008 and 2013. Thirty-nine retrieved practice guidelines were included for quality appraisal by the Appraisal of Guidelines Research and Evaluation II (AGREE-II) tool, performed by two independent reviewers. From the 12 moderate-to-high quality guidelines included, specific practice recommendations for the diagnosis and/or management of any aspect of dementia were extracted for comparison based upon the level of evidence and strength of recommendation. There was general agreement between guidelines for many practice recommendations. However, direct comparisons between guidelines were challenging due to variations in grading schemes. © The Author 2014. Published by Oxford University Press on behalf of the British Geriatrics Society. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Erbium laser resurfacing for actinic cheilitis.
Cohen, Joel L
2013-11-01
Actinic cheilitis is a precancerous condition characterized by grayish-whitish area(s) of discoloration on the mucosal lip, often blunting the demarcation between mucosa and cutaneous lip. Actinic cheilitis is considered to be an early part of the spectrum of squamous cell carcinoma. Squamous cell carcinoma specifically of the lip has a high rate of recurrence and metastasis through the oral cavity leading to a poor overall survival. Risk factors for the development of actinic cheilitis include chronic solar irradiation, increasing age, male gender, light skin complexion, immunosuppression, and possibly tobacco and alcohol consumption. Treatment options include topical pharmacotherapy (eg, fluorouracil, imiquimod) or procedural interventions (eg, cryotherapy, electrosurgery, surgical vermillionectomy, laser resurfacing), each with their known advantages and disadvantages. There is little consensus as to which treatment options offer the most clinical utility given the paucity of comparative clinical data. In my practice, laser resurfacing has become an important tool for the treatment of actinic cheilitis owing to its ease of use and overall safety, tolerability, and cosmetic acceptability. Herein the use of erbium laser resurfacing is described for three actinic cheilitis presentations for which I find it particularly useful: clinically prominent actinic cheilitis, biopsy-proven actinic cheilitis, and treatment of the entire lip following complete tumor excision of squamous cell carcinoma. All patients were treated with a 2940-nm erbium laser (Sciton Profile Contour Tunable Resurfacing Laser [TRL], Sciton, Inc., Palo Alto, CA).
Integrated Genomic and Network-Based Analyses of Complex Diseases and Human Disease Network.
Al-Harazi, Olfat; Al Insaif, Sadiq; Al-Ajlan, Monirah A; Kaya, Namik; Dzimiri, Nduna; Colak, Dilek
2016-06-20
A disease phenotype generally reflects various pathobiological processes that interact in a complex network. The highly interconnected nature of the human protein interaction network (interactome) indicates that, at the molecular level, it is difficult to consider diseases as being independent of one another. Recently, genome-wide molecular measurements, data mining and bioinformatics approaches have provided the means to explore human diseases from a molecular basis. The exploration of diseases and a system of disease relationships based on the integration of genome-wide molecular data with the human interactome could offer a powerful perspective for understanding the molecular architecture of diseases. Recently, subnetwork markers have proven to be more robust and reliable than individual biomarker genes selected based on gene expression profiles alone, and achieve higher accuracy in disease classification. We have applied one of these methodologies to idiopathic dilated cardiomyopathy (IDCM) data that we have generated using a microarray and identified significant subnetworks associated with the disease. In this paper, we review the recent endeavours in this direction, and summarize the existing methodologies and computational tools for network-based analysis of complex diseases and molecular relationships among apparently different disorders and human disease network. We also discuss the future research trends and topics of this promising field. Copyright © 2015 Institute of Genetics and Developmental Biology, Chinese Academy of Sciences, and Genetics Society of China. Published by Elsevier Ltd. All rights reserved.
Empathic design: Research strategies.
Thomas, Joyce; McDonagh, Deana
2013-01-01
This paper explores the role of empathy within new product development from the perspective of human-centred design. The authors have developed a range of empathic design tools and strategies that help to identify authentic human needs. For products and services to be effective, they need to satisfy both functional and emotional needs of individuals. In addition, the individual user needs to feel that the product and/or service has been designed 'just for them', otherwise they may misuse, underuse or abandon the product/service. This becomes critical with a product such as a Zimmer frame (walker), when it fails to resonate with the patient due to any stigma the patient may perceive, and thus remains unused. When training young designers to consider the wider community (people unlike themselves) during the design process, it has proven extremely valuable to take them outside their comfort zones, by seeking to develop empathy with the end user for whom they are designing. Empathic modelling offers designers the opportunity to develop greater insight and understanding, in order to support more effective design outcomes. Sensitising designers to the different ways that individuals complete daily tasks has helped to diminish the gap between themselves and others (e.g. people with disabilities). The authors intend for this paper to resonate with health care providers. Human-centred design can help to refocus the designer, by placing the individual end user's needs at the heart of their decision-making.
Computer-aided sperm analysis: a useful tool to evaluate patient's response to varicocelectomy.
Ariagno, Julia I; Mendeluk, Gabriela R; Furlan, María J; Sardi, M; Chenlo, P; Curi, Susana M; Pugliese, Mercedes N; Repetto, Herberto E; Cohen, Mariano
2017-01-01
Preoperative and postoperative sperm parameter values from infertile men with varicocele were analyzed by computer-aided sperm analysis (CASA) to assess whether sperm characteristics improved after varicocelectomy. Semen samples of men with proven fertility (n = 38) and men with varicocele-related infertility (n = 61) were also analyzed. Conventional semen analysis was performed according to WHO (2010) criteria, and a CASA system was employed to assess kinetic parameters and sperm concentration. Seminal parameter values in the fertile group were far higher than those of the patients, either before or after surgery. No significant improvement in the percentage of normal sperm morphology (P = 0.10), sperm concentration (P = 0.52), total sperm count (P = 0.76), subjective motility (%) (P = 0.97) or kinematics (P = 0.30) was observed after varicocelectomy when all groups were compared. Nor was significant improvement found in the percentage of normal sperm morphology (P = 0.91), sperm concentration (P = 0.10), total sperm count (P = 0.89) or percentage motility (P = 0.77) after varicocelectomy in paired comparisons of preoperative and postoperative data. Analysis of paired samples revealed that the total sperm count (P = 0.01) and most sperm kinetic parameters: curvilinear velocity (P = 0.002), straight-line velocity (P = 0.0004), average path velocity (P = 0.0005), linearity (P = 0.02), and wobble (P = 0.006) improved after surgery. CASA offers the potential for accurate quantitative assessment of each patient's response to varicocelectomy.
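For readers unfamiliar with the kinematic parameters reported above, the sketch below computes them from a tracked sperm-head trajectory using the standard definitions (VCL from the frame-to-frame path, VSL from the straight line between first and last points, VAP from a smoothed path, LIN = VSL/VCL, STR = VSL/VAP, WOB = VAP/VCL). The simple moving-average smoothing is an assumption; commercial CASA systems use their own path-averaging algorithms.

```python
import numpy as np

def casa_kinematics(xy, dt, smooth=5):
    """Kinematic parameters from a tracked sperm-head trajectory.

    xy     : (N, 2) array of head positions in micrometres, one row per frame
    dt     : time between frames in seconds
    smooth : window length of the moving average used for the average path
    """
    xy = np.asarray(xy, dtype=float)
    total_time = dt * (len(xy) - 1)

    step_lengths = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    vcl = step_lengths.sum() / total_time                    # curvilinear velocity
    vsl = np.linalg.norm(xy[-1] - xy[0]) / total_time        # straight-line velocity

    kernel = np.ones(smooth) / smooth
    avg_path = np.column_stack([np.convolve(xy[:, k], kernel, mode="valid")
                                for k in range(2)])
    vap = (np.linalg.norm(np.diff(avg_path, axis=0), axis=1).sum()
           / (dt * (len(avg_path) - 1)))                      # average-path velocity

    return {"VCL": vcl, "VSL": vsl, "VAP": vap,
            "LIN": vsl / vcl, "STR": vsl / vap, "WOB": vap / vcl}

# Toy trajectory: 30 frames at 50 Hz with a wiggly forward track.
t = np.arange(30)
track = np.column_stack([2.0 * t, 3.0 * np.sin(t / 2.0)])
print(casa_kinematics(track, dt=1 / 50))
```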
Freezable Radiator Coupon Testing and Full Scale Radiator Design
NASA Technical Reports Server (NTRS)
Lillibridge, Sean T.; Guinn, John; Cognata, Thomas; Navarro, Moses
2009-01-01
Freezable radiators offer an attractive solution to the issue of thermal control system scalability. As thermal environments change, a freezable radiator will effectively scale the total heat rejection it is capable of as a function of the thermal environment and flow rate through the radiator. Scalable thermal control systems are a critical technology for spacecraft that will endure missions with widely varying thermal requirements. These changing requirements are a result of the spacecraft's surroundings and of different thermal loads during different mission phases. However, freezing and thawing (recovering) a radiator is a process that has historically proven very difficult to predict through modeling, resulting in highly inaccurate predictions of recovery time. This paper summarizes tests on three test articles that were performed to further empirically quantify the behavior of a simple freezable radiator, and the culmination of those tests in a full-scale design. Each test article explored the bounds of freezing and recovery behavior, as well as providing thermo-physical data for the working fluid, a 50-50 mixture of DowFrost HD and water. These results were then used to develop a correlated thermal model in Thermal Desktop, which could be used for modeling the behavior of a full-scale thermal control system for a lunar mission. The final design of a thermal control system for a lunar mission is also documented in this paper.
New Techniques for Radar Altimetry of Sea Ice and the Polar Oceans
NASA Astrophysics Data System (ADS)
Armitage, T. W. K.; Kwok, R.; Egido, A.; Smith, W. H. F.; Cullen, R.
2017-12-01
Satellite radar altimetry has proven to be a valuable tool for remote sensing of the polar oceans, with techniques for estimating sea ice thickness and sea surface height in the ice-covered ocean advancing to the point of becoming routine, if not operational, products. Here, we explore new techniques in radar altimetry of the polar oceans and the sea ice cover. First, we present results from fully-focused SAR (FFSAR) altimetry; by accounting for the phase evolution of scatterers in the scene, the FFSAR technique applies an inter-burst coherent integration, potentially over the entire duration that a scatterer remains in the altimeter footprint, which can narrow the effective along-track resolution to just 0.5 m. We discuss the improvement of using interleaved operation over burst-mode operation when applying FFSAR processing to data acquired by future missions, such as a potential CryoSat follow-on. Second, we present simulated sea ice retrievals from the Ka-band Radar Interferometer (KaRIn), the instrument that will be launched on the Surface Water and Ocean Topography (SWOT) mission in 2021 and is capable of producing swath images of surface elevation. These techniques offer the opportunity to advance our understanding of the physics of the ice-covered oceans, plus new insight into how we interpret more conventional radar altimetry data in these regions.
AppEEARS: A Simple Tool that Eases Complex Data Integration and Visualization Challenges for Users
NASA Astrophysics Data System (ADS)
Maiersperger, T.
2017-12-01
The Application for Extracting and Exploring Analysis-Ready Samples (AppEEARS) offers a simple and efficient way to perform discovery, processing, visualization, and acquisition across large quantities and varieties of Earth science data. AppEEARS brings significant value to a very broad array of user communities by 1) significantly reducing data volumes, at-archive, based on user-defined space-time-variable subsets, 2) promoting interoperability across a wide variety of datasets via format and coordinate reference system harmonization, 3) increasing the velocity of both data analysis and insight by providing analysis-ready data packages and by allowing interactive visual exploration of those packages, and 4) ensuring veracity by making data quality measures more apparent and usable and by providing standards-based metadata and processing provenance. Development and operation of AppEEARS is led by the National Aeronautics and Space Administration (NASA) Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC also partners with several other archives to extend the capability across a larger federation of geospatial data providers. Over one hundred datasets are currently available, covering a diversity of variables including land cover, population, elevation, vegetation indices, and land surface temperature. Many hundreds of users have already used this new web-based capability to make the complex tasks of data integration and visualization much simpler and more efficient.
Sentinel Lymph Node Biopsy in Breast Cancer: A Clinical Review and Update
Haji, Altaf; Battoo, Azhar; Qurieshi, Mariya; Mir, Wahid; Shah, Mudasir
2017-01-01
Sentinel lymph node biopsy has become a standard staging tool in the surgical management of breast cancer. The positive impact of sentinel lymph node biopsy on postoperative negative outcomes in breast cancer patients, without compromising the oncological outcomes, is its major advantage. It has evolved over the last few decades and has proven its utility beyond early breast cancer. Its applicability and efficacy in patients with clinically positive axilla who have had a complete clinical response after neoadjuvant chemotherapy is being aggressively evaluated at present. This article discusses how sentinel lymph node biopsy has evolved and is becoming a useful tool in new clinical scenarios of breast cancer management. PMID:28970846
Pickering, Janessa; Richmond, Peter C.; Kirkham, Lea-Ann S.
2014-01-01
Non-typeable Haemophilus influenzae (NTHi) and Haemophilus haemolyticus are closely related bacteria that reside in the upper respiratory tract. NTHi is associated with respiratory tract infections that frequently result in antibiotic prescription whilst H. haemolyticus is rarely associated with disease. NTHi and H. haemolyticus can be indistinguishable by traditional culture methods and molecular differentiation has proven difficult. This current review chronologically summarizes the molecular approaches that have been developed for differentiation of NTHi from H. haemolyticus, highlighting the advantages and disadvantages of each target and/or technique. We also provide suggestions for the development of new tools that would be suitable for clinical and research laboratories. PMID:25520712
Tools to Use in an Information Technology Class--and Best of All They Are FREE!
ERIC Educational Resources Information Center
Swanson, Dewey; Gusev, Dmitri A.
2016-01-01
Purdue Polytechnic has several locations in the state of Indiana offering students a chance to get a Purdue degree. The Computer and Information Technology (CIT) department offers the CIT degree at three sites in Indiana: Anderson, Columbus and Kokomo. CIT offers several potential majors including Cybersecurity, Network Engineering, Systems…
Wiazowski, Jaroslaw
2014-01-01
With a decline in use of Braille, very few attractive technological options can be offered to young learners. Various research data confirm that teachers of the visually impaired do not have sufficient skills to introduce their students to modern devices. The Mountbatten Brailler can be considered as a tool that combines Braille technology with mainstream tools commonly used by students and teachers. This combination of devices opens new possibilities for the teachers and their students to reverse the trend in the use of Braille. Thanks to features offered by the Brailler and iOS devices, sighted and blind users receive a tool for unimpaired written communication.
ERIC Educational Resources Information Center
Hardy, Lawrence
2003-01-01
Requirements of the No Child Left Behind Act present school districts with a massive lesson in data-driven decision-making. Technology companies offer data-management tools that organize student information from state tests. Offers districts advice in choosing a technology provider. (MLF)
Bolstering Teaching through Online Tools
ERIC Educational Resources Information Center
Singh, Anil; Mangalaraj, George; Taneja, Aakash
2010-01-01
This paper offers a compilation of technologies that provides either free or low-cost solutions to the challenges of teaching online courses. It presents various teaching methods the outlined tools and technologies can support, with emphasis on fit between these tools and the tasks they are meant to serve. In addition, it highlights various…
The School Personnel Management System. Manual 1--Tools. Manual 2--Models. Manual 3--Results.
ERIC Educational Resources Information Center
National School Boards Association, Washington, DC.
The School Personnel Management System offers a correlated set of job descriptions, evaluative instruments, policies, tools, forms, and publications intended to aid local school officials in enhancing their personnel management programs. The materials are contained in two looseleaf binders entitled "Manual 1--Tools," and "Manual…
CrossTalk. The Journal of Defense Software Engineering. Volume 13, Number 6, June 2000
2000-06-01
Contents include: "Techniques for Efficiently Generating and Testing Software" by Keith R. Wegner, which presents a proven process that uses advanced tools to design, develop and test optimal software; and "Large Software Systems—Back to Basics", which observes that development methods that work on small problems seem not to scale well to large systems.
6th Annual CMMI Technology Conference and User Group
2006-11-17
Slide excerpts describe an operationally oriented, customer-focused, proven approach, written to reflect the experience of the author. A Decision Table (DT) is presented as a tabular representation with tailoring options. Software engineering led the process charge in the '80s, using flowcharts and CASE tools. Verification steps include EPG configuration audits and EPG configuration status reports, documented with flowcharts and Entry, Task, Verification and eXit (ETVX) descriptions.
Advances in Time-Distance Helioseismology
NASA Technical Reports Server (NTRS)
Duvall, Thomas L., Jr.; Beck, John G.; Gizon, Laurent; Kosovichev, Alexander F.; Oegerle, William (Technical Monitor)
2002-01-01
Time-distance helioseismology is a way to measure travel times between surface locations for waves traversing the solar interior. Coupling the travel times with an extensive modeling effort has proven to be a powerful tool for measuring flows and other wave-speed inhomogeneities in the solar interior. Problems receiving current attention include the time variation of the meridional circulation and the torsional oscillation, and active region emergence and evolution; current results on these topics will be presented.
Software Tools for Shipbuilding Productivity
1984-12-01
Excerpts note that design, manufacturing and robotic technology applications to shipbuilding have been proven, and that handling technical information about the processes of Computer Aided Design (CAD) and Computer Aided Manufacturing (CAM) effectively has been a serious problem. The report's contents cover CAD system components, CAD system benefits, new and future CAD technologies, and Computer Aided Manufacturing (CAM).
Architectural evaluation of dynamic and partial reconfigurable systems designed with DREAMS tool
NASA Astrophysics Data System (ADS)
Otero, Andrés.; Gallego, Ángel; de la Torre, Eduardo; Riesgo, Teresa
2013-05-01
The benefits of dynamically and partially reconfigurable systems are increasingly accepted by industry. For this reason, SRAM-based FPGA manufacturers have improved, or even included for the first time, the support they offer for the design of this kind of system. However, commercial tools still offer poor flexibility, which leads to limited efficiency, as witnessed by the overhead introduced by communication primitives and by the inability to relocate reconfigurable modules, among other limitations. For this reason, the authors have proposed an academic design tool called DREAMS, which targets the design of dynamically reconfigurable systems. In this paper, the main features offered by DREAMS are described and compared with existing commercial and academic tools. Moreover, a graphical user interface (GUI) is described here for the first time, with the aim of simplifying the design process and of hiding the low-level, device-dependent details from the system designer. The overall goal is to increase designer productivity. Using the graphical interface, different reconfigurable architectures are provided as design examples. Among them, both conventional slot-based architectures and mesh-type designs have been included.
User Driven Development of Software Tools for Open Data Discovery and Exploration
NASA Astrophysics Data System (ADS)
Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa
2016-04-01
The use of open data in research faces challenges that are not restricted to inherent properties such as the quality and resolution of open data sets. Often, open data are insufficiently catalogued or fragmented. Software tools that support effective discovery, including assessment of the data's appropriateness for research, have shortcomings such as the lack of essential functionalities like support for data provenance. We believe that one of the reasons is the neglect of real end users' requirements in the development process of such software tools. In the context of the FP7 Switch-On project, we have pro-actively engaged the relevant user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration, we have used state-of-the-art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aragon, Kathryn M.; Eaton, Shelley M.; McCornack, Marjorie Turner
When a requirements engineering effort fails to meet expectations, the requirements management tool is often blamed. Working with numerous project teams at Sandia National Laboratories over the last fifteen years has shown us that the tool is rarely the culprit; usually it is the lack of a viable information architecture with well-designed processes to support requirements engineering. This document illustrates design concepts with rationale, as well as a proven information architecture to structure and manage information in support of requirements engineering activities for any size or type of project. This generalized information architecture is specific to IBM's Rational DOORS (Dynamic Object Oriented Requirements System) software application, which is the requirements management tool in Sandia's CEE (Common Engineering Environment). This generalized information architecture can be used as presented or as a foundation for designing a tailored information architecture for project-specific needs. It may also be tailored for another software tool. Version 1.0, 4 November 201
Using a formal requirements management tool for system engineering: first results at ESO
NASA Astrophysics Data System (ADS)
Zamparelli, Michele
2006-06-01
The attention to proper requirements analysis and maintenance is growing in modern astronomical undertakings. The increasing degree of complexity that current and future generations of projects have reached requires substantial system engineering effort and the use of all available technology to keep project development under control. One such technology is a tool that helps manage relationships between deliverables at various development stages, and across functional subsystems and disciplines as different as software, mechanics, optics and electronics. The immediate benefits are traceability and the ability to perform impact analysis. An industrially proven tool for requirements management is presented, together with the first results across some projects at ESO and a cost/benefit analysis of its usage. Experience gathered so far shows that the extensibility and configurability of the tool on the one hand, and its integration with common documentation formats and standards on the other, make it appear a promising solution even for small-scale system development.
Tools for Implementing an Evidence-Based Approach in Public Health Practice
Jacobs, Julie A.; Jones, Ellen; Gabella, Barbara A.; Spring, Bonnie
2012-01-01
Increasing disease rates, limited funding, and the ever-growing scientific basis for intervention demand the use of proven strategies to improve population health. Public health practitioners must be ready to implement an evidence-based approach in their work to meet health goals and sustain necessary resources. We researched easily accessible and time-efficient tools for implementing an evidence-based public health (EBPH) approach to improve population health. Several tools have been developed to meet EBPH needs, including free online resources in the following topic areas: training and planning tools, US health surveillance, policy tracking and surveillance, systematic reviews and evidence-based guidelines, economic evaluation, and gray literature. Key elements of EBPH are engaging the community in assessment and decision making; using data and information systems systematically; making decisions on the basis of the best available peer-reviewed evidence (both quantitative and qualitative); applying program-planning frameworks (often based in health-behavior theory); conducting sound evaluation; and disseminating what is learned. PMID:22721501
NASA Astrophysics Data System (ADS)
Young, J. C.; Boronska, K.; Martin, C. J.; Rickard, A. R.; Vázquez Moreno, M.; Pilling, M. J.; Haji, M. H.; Dew, P. M.; Lau, L. M.; Jimack, P. K.
2010-12-01
AtChem On-line is a simple-to-use zero-dimensional box modelling toolkit, developed for use by laboratory, field and chamber scientists. Any set of chemical reactions can be simulated, in particular the whole Master Chemical Mechanism (MCM) or any subset of it. Parameters and initial data can be provided through a self-explanatory web form, and the resulting model is compiled and run on a dedicated server. The core part of the toolkit, providing a robust solver for thousands of chemical reactions, is written in Fortran and uses the SUNDIALS CVODE libraries. Chemical systems can be constrained at multiple, user-determined timescales; this has enabled studies of radical chemistry at one-minute timescales. AtChem On-line is free to use and requires no installation: a web browser, a text editor and any compressing software are all the user needs. CPU and storage are provided by the server (input and output data are saved indefinitely). An off-line version is also being developed, which will provide batch processing, an advanced graphical user interface and post-processing tools, for example Rate of Production Analysis (ROPA) and chain-length analysis. The source code is freely available for advanced users wishing to adapt and run the program locally. Data management, dissemination and archiving are essential in all areas of science. In order to do this in an efficient and transparent way, there is a critical need to capture high-quality metadata/provenance for modelling activities. An Electronic Laboratory Notebook (ELN) has been developed in parallel with AtChem On-line as part of the EC EUROCHAMP2 project. In order to use controlled chamber experiments to evaluate the MCM, we need to be able to archive, track and search information on all associated chamber model runs, so that they can be used in subsequent mechanism development. It would therefore be extremely useful if experiment and model metadata/provenance could be easily and automatically stored electronically. Archiving metadata/provenance via an ELN makes it easier to write a paper or thesis, and easier for mechanism developers, evaluators and peer reviewers to search for appropriate experimental and modelling results and conclusions. The development of an ELN in the context of mechanism evaluation/development using large experimental chamber datasets is presented.
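For readers unfamiliar with zero-dimensional box models, the following Python sketch integrates a deliberately tiny NOx/O3 mechanism with a stiff ODE solver, which is the same kind of computation AtChem delegates to the Fortran SUNDIALS CVODE solver for the full MCM. The two-reaction mechanism, rate constants and initial concentrations are illustrative assumptions, not MCM values.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy mechanism (illustration only, not the MCM):
#   R1: O3 + NO  -> NO2 + O2        rate k1 * [O3] * [NO]
#   R2: NO2 + hv -> NO + O3 (net)   rate j * [NO2]  (O(3P) assumed to re-form O3 instantly)
k1 = 1.8e-14   # cm3 molecule-1 s-1, approximate room-temperature value
j = 8.0e-3     # s-1, assumed NO2 photolysis frequency

def rhs(t, c):
    o3, no, no2 = c
    r1 = k1 * o3 * no
    r2 = j * no2
    return [-r1 + r2, -r1 + r2, r1 - r2]

# Initial concentrations in molecule cm-3 (roughly 30 ppb O3, 1 ppb NO, 1 ppb NO2).
c0 = [7.4e11, 2.5e10, 2.5e10]
sol = solve_ivp(rhs, (0.0, 3600.0), c0, method="BDF", rtol=1e-8, atol=1.0)
print(sol.y[:, -1])   # concentrations after one hour of simulated time
```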
Application of sepsis definitions to pediatric patients admitted with suspected infections in Uganda
Wiens, Matthew O.; Larson, Charles P.; Kumbakumba, Elias; Kissoon, Niranjan; Ansermino, J. Mark; Singer, Joel; Wong, Hubert; Ndamira, Andrew; Kabakyenga, Jerome; Moschovis, Peter; Kiwanuka, Julius
2017-01-01
Objectives: Acute infectious diseases are the most common cause of under-5 mortality. However, the hospital burden of non-neonatal pediatric sepsis has not previously been described in the resource-poor setting. The objective of this study was to determine the prevalence of sepsis among children 6 months to 5 years of age admitted with proven or suspected infection and to evaluate the presence of sepsis as a predictive tool for mortality during admission. Design: In this prospective cohort study we used the pediatric International Consensus Conference definition of sepsis to determine the prevalence of sepsis among children admitted to the pediatric ward with a proven or suspected infection. The diagnosis of sepsis, as well as each individual component of the sepsis definition, was evaluated for capturing in-hospital mortality. Setting: The pediatric wards of two hospitals in Mbarara, Uganda. Patients: Admitted children between 6 months and 5 years with a confirmed or suspected infection. Interventions: None. Measurements and Main Results: One thousand three hundred and seven (1,307) subjects with a confirmed or suspected infection were enrolled and 65 children died (5.0%) during their admission. One thousand one hundred and twenty-one (85.9%) met the systemic inflammatory response syndrome criteria and therefore were defined as having sepsis. The sepsis criteria captured 61 deaths, demonstrating a sensitivity and specificity of 95% (95% CI 90-100%) and 15% (95% CI 13-17%), respectively. The most discriminatory individual component of the SIRS criteria was the leukocyte count, which alone had a sensitivity of 72% and a specificity of 56% for the identification of mortality in hospital. Conclusions: This study is among the first to quantify the burden of non-neonatal pediatric sepsis in children with suspected infection, using the international consensus sepsis definition, in a typical resource-constrained setting in Africa. This definition was found to be highly sensitive in identifying those who died, but had very low specificity, as most children who were admitted with infections had sepsis. The SIRS-based sepsis definition offers little value in identification of children at high risk of in-hospital mortality in this setting. PMID:27043996
Wiens, Matthew O; Larson, Charles P; Kumbakumba, Elias; Kissoon, Niranjan; Ansermino, J Mark; Singer, Joel; Wong, Hubert; Ndamira, Andrew; Kabakyenga, Jerome; Moschovis, Peter; Kiwanuka, Julius
2016-05-01
Acute infectious diseases are the most common cause of under-5 mortality. However, the hospital burden of nonneonatal pediatric sepsis has not previously been described in the resource poor setting. The objective of this study was to determine the prevalence of sepsis among children 6 months to 5 years old admitted with proven or suspected infection and to evaluate the presence of sepsis as a predictive tool for mortality during admission. In this prospective cohort study, we used the pediatric International Consensus Conference definition of sepsis to determine the prevalence of sepsis among children admitted to the pediatric ward with a proven or suspected infection. The diagnosis of sepsis, as well as each individual component of the sepsis definition, was evaluated for capturing in-hospital mortality. The pediatric ward of two hospitals in Mbarara, Uganda. Admitted children between 6 months and 5 years with a confirmed or suspected infection. None. One thousand three hundred seven (1,307) subjects with a confirmed or suspected infection were enrolled, and 65 children died (5.0%) during their admission. One thousand one hundred twenty-one (85.9%) met the systemic inflammatory response syndrome criteria, and therefore, they were defined as having sepsis. The sepsis criteria captured 61 deaths, demonstrating a sensitivity and a specificity of 95% (95% CI, 90-100%) and 15% (95% CI, 13-17%), respectively. The most discriminatory individual component of the systemic inflammatory response syndrome criteria was the leukocyte count, which alone had a sensitivity of 72% and a specificity of 56% for the identification of mortality in hospital. This study is among the first to quantify the burden of nonneonatal pediatric sepsis in children with suspected infection, using the international consensus sepsis definition, in a typical resource-constrained setting in Africa. This definition was found to be highly sensitive in identifying those who died but had very low specificity as most children who were admitted with infections had sepsis. The systemic inflammatory response syndrome-based sepsis definition offers little value in identification of children at high risk of in-hospital mortality in this setting.
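The sensitivity and specificity quoted in both versions of this abstract can be reproduced (up to rounding) from the reported counts, as the short calculation below shows.

```python
# Counts reported in the abstract: 1,307 admissions, 65 deaths,
# 1,121 children meeting SIRS criteria, 61 deaths captured by the definition.
total, deaths, sirs_positive, deaths_captured = 1307, 65, 1121, 61

tp = deaths_captured                   # died and met SIRS criteria
fn = deaths - deaths_captured          # died but did not meet SIRS criteria
fp = sirs_positive - deaths_captured   # survived but met SIRS criteria
tn = total - deaths - fp               # survived and did not meet SIRS criteria

sensitivity = tp / (tp + fn)           # ~0.94, reported as 95%
specificity = tn / (tn + fp)           # ~0.15, reported as 15%
print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
```

The small difference from the reported 95% sensitivity presumably reflects rounding in the abstract.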
Role of laser therapy in bladder carcinoma
NASA Astrophysics Data System (ADS)
Sharpe, Brent A.; de Riese, Werner T.
2001-05-01
Transitional cell carcinoma (TCC) of the bladder is the most common genitourinary tract cancer, and its treatment accounts for a large number of surgical procedures in urological oncology. Seventy-five percent (75%) of cases recur within two years, and the recurrence rate correlates with the grade of the initial tumor. While transurethral resection of the bladder (TURB) is the current standard of care, laser treatment offers a proven alternative. Sufficient evidence is available that laser treatment of superficial bladder cancer is as effective as TURB. Laser treatment offers several advantages, including a decreased incidence of bladder perforation, a nearly bloodless and catheter-free procedure, and the possibility of outpatient therapy. It has been reported that laser treatment may reduce the recurrence rate of TCC compared with electrocautery resection. Furthermore, some studies suggest that tumor seeding can be avoided with laser resection; however, both claims remain highly controversial.
Effects of Website Interactivity on Online Retail Shopping Behavior
NASA Astrophysics Data System (ADS)
Islam, Hafizul
Motivations to engage in online retail shopping can include both utilitarian and hedonic dimensions. To cater to these consumers, online retailers can create a cognitively and esthetically rich shopping environment through sophisticated interactive web utilities and features, offering not only utilitarian benefits and attributes but also the hedonic benefit of enjoyment. Since website interactivity has been shown to stimulate online consumers' perceptions, this study presumes that websites with rich multimedia and interactive utilities can influence online consumers' shopping motivations and entice them to modify, or even transform, their original shopping predispositions by providing attractive and enhanced interactive features and controls, thus generating a positive attitude toward the products and services offered by the retailer. This study explores the effects of web interactivity on online consumer behavior through an attitudinal model of technology acceptance.
Ethics and forensic psychiatry: translating principles into practice.
Appelbaum, Paul S
2008-01-01
Twenty-five years ago, Alan Stone expressed his skepticism that forensic psychiatry could be practiced ethically. His remarks have proven a useful goad to the field, focusing attention on the importance of an ethics framework for forensic practice. But Stone remains dubious that any system of ethics--including the "Standard Position" on which he focuses his critique--could be of much value in practice. In contrast, I suggest that Stone's pessimism is not well founded. Immanent in forensic practice itself is a reasonable set of ethics principles, based on truth-telling and respect for persons. Psychiatrists can offer reliable and valid testimony, while resisting seduction into an advocacy role. Indeed, with new structured approaches to assessment, the potential utility of forensic testimony is probably greater than ever. Though problematic behavior still exists, forensic psychiatry offers the factual background and interpretive context to allow legal decision-makers to make better choices than they otherwise would.
Nanophotonic applications for silicon-on-insulator (SOI)
NASA Astrophysics Data System (ADS)
de la Houssaye, Paul R.; Russell, Stephen D.; Shimabukuro, Randy L.
2004-07-01
Silicon-on-insulator is a proven technology for very large scale integration of microelectronic devices. The technology also offers the potential for development of nanophotonic devices and the ability to interface such devices to the macroscopic world. This paper will report on fabrication techniques used to form nano-structured silicon wires on an insulating structure that is amenable to interfacing nanostructured sensors with high-performance microelectronic circuitry for practical implementation. Nanostructures formed on silicon-on-sapphire can also exploit the transparent substrate for novel device geometries. This research harnesses the unique properties of a high-quality single crystal film of silicon on sapphire and uses the film thickness as one of the confinement dimensions. Lateral arrays of silicon nanowires were fabricated in the thin (5 to 20 nm) silicon layer and studied. This technique offers simplified contact to individual wires and provides wire surfaces that are more readily accessible for controlled alteration and device designs.
Comparing 2 National Organization-Level Workplace Health Promotion and Improvement Tools, 2013–2015
Lang, Jason E.; Davis, Whitney D.; Jones-Jack, Nkenge H.; Mukhtar, Qaiser; Lu, Hua; Acharya, Sushama D.; Molloy, Meg E.
2016-01-01
Creating healthy workplaces is becoming more common. Half of employers that have more than 50 employees offer some type of workplace health promotion program. Few employers implement comprehensive evidence-based interventions that reach all employees and achieve desired health and cost outcomes. A few organization-level assessment and benchmarking tools have emerged to help employers evaluate the comprehensiveness and rigor of their health promotion offerings. Even fewer tools exist that combine assessment with technical assistance and guidance to implement evidence-based practices. Our descriptive analysis compares 2 such tools, the Centers for Disease Control and Prevention’s Worksite Health ScoreCard and Prevention Partners’ WorkHealthy America, and presents data from both to describe workplace health promotion practices across the United States. These tools are reaching employers of all types (N = 1,797), and many employers are using a comprehensive approach (85% of those using WorkHealthy America and 45% of those using the ScoreCard), increasing program effectiveness and impact. PMID:27685429
Raman counting of heavy minerals in turbidites: Indus Fan, IODP Expedition 355
NASA Astrophysics Data System (ADS)
Andò, Sergio
2017-04-01
Raman spectroscopy is an innovative tool with tremendous potential. Thorny long-standing problems that cannot be solved confidently with a polarizing microscope alone, such as the determination of opaque heavy minerals or of detrital grains as small as a few microns, can finally be addressed. Heavy-mineral species commonly found in sediments convey specific information on the genesis of their source rocks and are therefore crucial in provenance diagnoses and palaeotectonic reconstructions. A high-resolution mineralogical study of Indus Fan turbiditic sediments cored during IODP Expedition 355 (Arabian Sea Monsoon) in the Laxmi Basin was carried out to investigate and quantify the different compositional signatures of sand and silt fractions. Silt and sand in turbidite deposits recovered at IODP Sites U1456 and U1457 were chosen as the best natural archive for this source-to-sink study. An integrated mineralogical dataset was obtained by coupling traditional and innovative single-grain heavy-mineral analyses. Reliable quantitative results, even in the medium to fine silt classes that represent the dominant sediment sizes in the recovered cores, were obtained by point-counting of single grains under the microscope assisted by micro-Raman spectroscopy. Preliminary data from the studied turbidites document rich and diverse heavy-mineral assemblages in both sand and silty-sand fractions. Multiple varietal studies of amphibole, epidote and garnet, representing the dominant heavy-mineral triad in orogenic detritus derived from collided ranges such as the Himalaya, were performed to highlight the wide unexplored potential of Raman spectroscopy when applied to provenance studies. Discriminating within the isomorphous series of garnets is possible: pyralspite and ugrandite garnets are distinguished by the position of characteristic high-frequency peaks caused by Si-O stretching modes (873-880 cm-1 in ugrandites, 907-926 cm-1 in pyralspites; Bersani et al., 2009; Andò et al., 2009). Raman discrimination of amphibole varieties is also possible, and the diagnostic position and shape of the more intense OH stretching bands (frequencies between 3600 and 3700 cm-1) are particularly helpful (Vezzoli et al., 2016). Raman discrimination of epidote-group minerals was tackled by using a new dataset of the characteristic vibrational modes in the high-frequency region to facilitate distinction from other silicates and to distinguish different varieties. A protocol to separate heavy minerals from the silt fraction, starting from only a few grams of sediment, was developed at the Laboratory for Provenance Studies of Milano-Bicocca. An appropriate database of Raman spectra of detrital minerals is essential to apply this method routinely in future provenance studies of deep-sea turbidites. Such a new methodological approach plays a potentially key role in differentiating among the diverse Himalayan versus Indian Peninsular sources of detritus and opens up a new frontier for future studies of the largely unexplored deep-marine sedimentary record.
Cited references: S. Andò, D. Bersani, P. Vignola, E. Garzanti, 2009. Raman spectroscopy as an effective tool for high-resolution heavy-mineral analysis: examples from major Himalayan and Alpine fluvio-deltaic systems. Spectrochimica Acta Part A 73(3), 450-455. D. Bersani, S. Andò, P. Vignola, G. Moltifiori, I.G. Marino, P.P. Lottici, V. Diella, 2009. Micro-Raman spectroscopy as a routine tool for garnet analysis. Spectrochimica Acta Part A 73(3), 484-491. G. Vezzoli, E. Garzanti, M. Limonta, S. Andò, S. Yang, 2016. Erosion patterns in the Changjiang (Yangtze River) catchment revealed by bulk-sample versus single-mineral provenance budgets. Geomorphology 261, 177-192.
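To make the garnet-discrimination step concrete, the Python sketch below bins a measured Si-O stretching peak into the ugrandite or pyralspite group using the wavenumber windows quoted in the abstract (873-880 cm-1 and 907-926 cm-1). The function name and the tolerance handling are illustrative assumptions for this sketch, not part of the published protocol.

```python
# Illustrative classifier for garnet group based on the main Si-O stretching peak position.
# Wavenumber windows come from the abstract (Bersani et al., 2009; Andò et al., 2009);
# the tolerance value and function name are assumptions made for this example.

def classify_garnet(peak_cm1: float, tolerance: float = 3.0) -> str:
    """Assign a garnet group from the position (cm^-1) of its strongest Si-O stretching band."""
    if 873 - tolerance <= peak_cm1 <= 880 + tolerance:
        return "ugrandite (Ca-rich: uvarovite, grossular, andradite)"
    if 907 - tolerance <= peak_cm1 <= 926 + tolerance:
        return "pyralspite (pyrope, almandine, spessartine)"
    return "unassigned: peak outside the diagnostic windows"

if __name__ == "__main__":
    # Example: a detrital grain with its strongest band at 916 cm^-1 falls in the pyralspite window.
    for peak in (875.5, 916.0, 950.0):
        print(f"{peak:6.1f} cm^-1 -> {classify_garnet(peak)}")
```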
The nanomaterial toolkit for neuroengineering
NASA Astrophysics Data System (ADS)
Shah, Shreyas
2016-10-01
There is a growing interest in developing effective tools to better probe the central nervous system (CNS), to understand how it works and to treat neural diseases, injuries and cancer. The intrinsic complexity of the CNS has made this a challenging task for decades. Yet, with the extraordinary recent advances in nanotechnology and nanoscience, there is a general consensus on the immense value and potential of nanoscale tools for engineering neural systems. In this review, an overview of specialized nanomaterials which have proven to be the most effective tools in neuroscience is provided. After a brief background on the prominent challenges in the field, a variety of organic and inorganic-based nanomaterials are described, with particular emphasis on the distinctive properties that make them versatile and highly suitable in the context of the CNS. Building on this robust nano-inspired foundation, the rational design and application of nanomaterials can enable the generation of new methodologies to greatly advance the neuroscience frontier.
Alkylation of Staurosporine to Derive a Kinase Probe for Fluorescence Applications.
Disney, Alexander J M; Kellam, Barrie; Dekker, Lodewijk V
2016-05-06
The natural product staurosporine is a high-affinity inhibitor of nearly all mammalian protein kinases. The labelling of staurosporine has proven effective as a means of generating protein kinase research tools. Most tools have been generated by acylation of the 4'-methylamine of the sugar moiety of staurosporine. Herein we describe the alkylation of this group as a first step to generate a fluorescently labelled staurosporine. Following alkylation, a polyethylene glycol linker was installed, allowing subsequent attachment of fluorescein. We report that this fluorescein-staurosporine conjugate binds to cAMP-dependent protein kinase in the nanomolar range. Furthermore, its binding can be antagonised with unmodified staurosporine as well as ATP, indicating it targets the ATP binding site in a similar fashion to native staurosporine. This reagent has potential application as a screening tool for protein kinases of interest. © 2015 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.