Sample records for provide proven methods

  1. Analysis of Antarctic glacigenic sediment provenance through geochemical and petrologic applications

    NASA Astrophysics Data System (ADS)

    Licht, Kathy J.; Hemming, Sidney R.

    2017-05-01

    The number of provenance studies of glacigenic sediments in Antarctica has increased dramatically over the past decade, providing an enhanced understanding of ice sheet history and dynamics, along with the broader geologic history. Such data have been used to assess glacial erosion patterns at the catchment scale, flow path reconstructions over a wide range of scales, and ice sheet fluctuations indicated by iceberg rafted debris in circumantarctic glacial marine sediments. It is notable that even though most of the bedrock of the continent is ice covered and inaccessible, provenance data can provide such valuable information about Antarctic ice and can even be used to infer buried rock types along with their geo- and thermochronologic history. Glacigenic sediments provide a broader array of provenance analysis opportunities than any other sediment type because of their wide range of grain sizes, and in this paper we review methods and examples from all size fractions that have been applied to the Antarctic glacigenic sedimentary record. Interpretations of these records must carefully consider the choice of analytical methods, uneven patterns of erosion, and spatial variability in sediment transport and rock types, all of which may lead to preferential identification of different source components in the provenance analyses. Because of this, we advocate a multi-proxy approach and highlight studies that demonstrate the value of selecting complementary provenance methods.

  2. A Scientific Data Provenance API for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raju, Bibi; Elsethagen, Todd O.; Stephan, Eric G.

    Data provenance has been an active area of research as a means to standardize how the origin of data, the history of process events, and what or who was responsible for influencing results are explained. There are two approaches to capturing provenance information. The first approach is to collect observed evidence produced by an executing application using log files, event listeners, and temporary files that are used by the application or application developer. The provenance translated from these observations is an interpretation of the provided evidence. The second approach is called disclosed because the application provides a firsthand account of the provenance based on the anticipated questions on data flow, process flow, and responsible agents. Most observed provenance collection systems collect a large amount of provenance information during an application run or workflow execution. The common trend in capturing provenance is to collect all possible information, then attempt to find relevant information, which is not efficient. Existing disclosed provenance system APIs do not work well in distributed environments and have trouble finding where to fit the individual pieces of provenance information. This work focuses on determining more reliable solutions for provenance capture. As part of the Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project, an API was developed, called Producer API (PAPI), which can disclose application-targeted provenance and is designed to work in distributed environments by means of unique object identification methods. The provenance disclosure approach used adds additional metadata to the provenance information to uniquely identify the pieces and connect them together. PAPI uses a common provenance model to support this provenance integration across disclosure sources. The API also provides the flexibility to let the user decide what to do with the collected provenance. The collected provenance can be sent to a triple store using REST services or it can be logged to a file.
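
    The disclosure pattern described above can be sketched in a few lines. This is not PAPI's actual interface; the ProvenanceClient class, its disclose method, and the endpoint URL below are hypothetical stand-ins showing how an application might emit uniquely identified provenance records to a REST service fronting a triple store.

    ```python
    # Hypothetical sketch of application-disclosed provenance; PAPI's real
    # interface is not reproduced here. Requires a listening REST service.
    import json
    import urllib.request
    import uuid

    class ProvenanceClient:
        """Stand-in client that discloses provenance records over REST."""

        def __init__(self, endpoint):
            self.endpoint = endpoint

        def disclose(self, activity, used, generated):
            record = {
                # A globally unique id lets records disclosed on different
                # nodes of a distributed run be connected together later.
                "id": str(uuid.uuid4()),
                "activity": activity,
                "used": used,
                "generated": generated,
            }
            req = urllib.request.Request(
                self.endpoint,
                data=json.dumps(record).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req)

    # Example: a workflow step discloses what it read and what it wrote.
    client = ProvenanceClient("http://localhost:8080/prov")  # hypothetical endpoint
    client.disclose("preprocess-run-1", used=["raw/input.nc"], generated=["clean/input.nc"])
    ```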

  3. Fourth order exponential time differencing method with local discontinuous Galerkin approximation for coupled nonlinear Schrödinger equations

    DOE PAGES

    Liang, Xiao; Khaliq, Abdul Q. M.; Xing, Yulong

    2015-01-23

    In this paper, we study a local discontinuous Galerkin method combined with a fourth order exponential time differencing Runge-Kutta time discretization and a fourth order conservative method for solving the nonlinear Schrödinger equations. Based on different choices of numerical fluxes, we propose both energy-conserving and energy-dissipative local discontinuous Galerkin methods, and we prove error estimates for the semi-discrete methods applied to the linear Schrödinger equation. The numerical methods are shown to be highly efficient and stable for long-range soliton computations. Finally, extensive numerical examples are provided to illustrate the accuracy, efficiency and reliability of the proposed methods.
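
    For orientation, here is a minimal sketch of the fourth order ETDRK4 time discretization in the style of Kassam and Trefethen (2005), applied to the cubic Schrödinger equation i u_t + u_xx + 2|u|^2 u = 0 with a Fourier pseudospectral (not discontinuous Galerkin) space discretization; the paper's LDG fluxes and conservative scheme are not reproduced, and all parameters are illustrative.

    ```python
    import numpy as np

    # Minimal ETDRK4 + Fourier pseudospectral sketch for the cubic NLS
    # i u_t + u_xx + 2|u|^2 u = 0, following the Kassam-Trefethen (2005) recipe.
    # Illustrates only the time discretization, not the paper's LDG scheme.

    N, h, steps = 256, 1e-3, 1000
    x = np.linspace(-np.pi, np.pi, N, endpoint=False)
    k = np.fft.fftfreq(N, d=1.0 / N)          # integer wavenumbers on a 2*pi domain
    u = 1.0 / np.cosh(x)                      # illustrative initial datum
    v = np.fft.fft(u)

    L = -1j * k**2                            # Fourier symbol of the linear part i*u_xx
    E, E2 = np.exp(h * L), np.exp(h * L / 2)

    # phi-function coefficients via contour-integral averaging (avoids cancellation)
    M = 32
    r = np.exp(2j * np.pi * (np.arange(1, M + 1) - 0.5) / M)
    LR = h * L[:, None] + r[None, :]
    Q  = h * np.mean((np.exp(LR / 2) - 1) / LR, axis=1)
    f1 = h * np.mean((-4 - LR + np.exp(LR) * (4 - 3 * LR + LR**2)) / LR**3, axis=1)
    f2 = h * np.mean((2 + LR + np.exp(LR) * (-2 + LR)) / LR**3, axis=1)
    f3 = h * np.mean((-4 - 3 * LR - LR**2 + np.exp(LR) * (4 - LR)) / LR**3, axis=1)

    def nonlinear(v):
        """Nonlinear term 2i|u|^2 u, evaluated pseudospectrally."""
        u = np.fft.ifft(v)
        return 2j * np.fft.fft(np.abs(u)**2 * u)

    for _ in range(steps):                    # classical ETDRK4 stage structure
        Nv = nonlinear(v)
        a = E2 * v + Q * Nv
        Na = nonlinear(a)
        b = E2 * v + Q * Na
        Nb = nonlinear(b)
        c = E2 * a + Q * (2 * Nb - Nv)
        Nc = nonlinear(c)
        v = E * v + Nv * f1 + 2 * (Na + Nb) * f2 + Nc * f3

    u_final = np.fft.ifft(v)
    print("max |u| at t =", h * steps, ":", np.abs(u_final).max())
    ```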

  4. Evaluation methodology for comparing memory and communication of analytic processes in visual analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric D; Goodall, John R

    2014-01-01

    Provenance tools can help capture and represent the history of analytic processes. In addition to supporting analytic performance, provenance tools can be used to support memory of the process and communication of the steps to others. Objective evaluation methods are needed to evaluate how well provenance tools support analysts' memory and communication of analytic processes. In this paper, we present several methods for the evaluation of process memory, and we discuss the advantages and limitations of each. We discuss methods for determining a baseline process for comparison, and we describe various methods that can be used to elicit process recall, step ordering, and time estimations. Additionally, we discuss methods for conducting quantitative and qualitative analyses of process memory. By organizing possible memory evaluation methods and providing a meta-analysis of the potential benefits and drawbacks of different approaches, this paper can inform study design and encourage objective evaluation of process memory and communication.

  5. Geoscience Australia's enterprise application of provenance standards and systems for physical and digital objects

    NASA Astrophysics Data System (ADS)

    Kemp, C.; Car, N. J.

    2016-12-01

    Geoscience Australia (GA) is a government agency that provides advice on the geology and geography of Australia. It is the custodian of many digital and physical datasets of national significance. For several years GA has been implementing an enterprise approach to provenance management. The goal is transparency and reproducibility for all of GA's information products, an objective supported at the highest levels and explicitly listed in its Science Principles. Currently GA is finalising a set of enterprise tools to assist with provenance management and rolling out provenance reporting to different science areas. GA has adopted or developed: provenance storage systems; provenance collection code libraries (for use within automated systems); reporting interfaces (for manual use); and provenance representation capability within legacy catalogues. Using these tools within GA's science areas involves modelling the scenario first and then assessing whether the area has its data managed in such a way that links to data within provenance remain resolvable in perpetuity. We don't just want to represent provenance (demonstrating transparency), we want to access data via provenance (allowing for reproducibility). A subtask of GA's current work is to link physical samples to information products (datasets, reports, papers) by uniquely and persistently identifying samples using International GeoSample Numbers and then modelling automated and manual laboratory workflows and associated tasks, such as data delivery to corporate databases, using the W3C's PROV Data Model. We use PROV-DM throughout our modelling and systems. We are also moving to deliver all sample and digital dataset metadata across the agency in the Web Ontology Language (OWL) and to expose it via Linked Data methods in order to allow Semantic Web querying of multiple systems, allowing provenance to be leveraged through a single method and query point. Through the Science First Transformation Program, GA is undergoing a significant rethinking of its data architecture, curation and access to support the Digital Science capability, of which provenance management is an output.
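
    A flavour of the PROV-DM modelling described above can be given with the open-source Python prov package; the namespaces, IGSN value, and identifiers below are invented for illustration and are not GA's actual records.

    ```python
    # Minimal PROV-DM sketch with the Python 'prov' package: a physical sample,
    # identified by a (fictitious) IGSN, is used by a lab activity that
    # generates a digital dataset. Identifiers and namespaces are invented.
    from prov.model import ProvDocument

    doc = ProvDocument()
    doc.add_namespace("igsn", "http://igsn.org/")        # IGSN resolver namespace
    doc.add_namespace("ex", "http://example.org/ga/")    # hypothetical agency namespace

    sample = doc.entity("igsn:XX0000001")                # fictitious sample IGSN
    run = doc.activity("ex:geochem-analysis-42")         # hypothetical lab workflow run
    dataset = doc.entity("ex:geochem-dataset-7")         # hypothetical derived dataset

    doc.used(run, sample)                                # the run consumed the sample
    doc.wasGeneratedBy(dataset, run)                     # and produced the dataset
    doc.wasDerivedFrom(dataset, sample)                  # direct lineage link

    print(doc.get_provn())                               # PROV-N serialization
    ```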

  6. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    NASA Astrophysics Data System (ADS)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the 'matlab-dataone' library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. Finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software used to execute analyses and models, and the derived data and products that arise from these computations. This provenance is vital to the interpretation and understanding of science, and provides an audit trail that researchers can use to understand and replicate computational workflows in the geosciences.

  7. Stable Isotope Ratio and Elemental Profile Combined with Support Vector Machine for Provenance Discrimination of Oolong Tea (Wuyi-Rock Tea)

    PubMed Central

    Lou, Yun-xiao; Fu, Xian-shu; Yu, Xiao-ping; Zhang, Ya-fen

    2017-01-01

    This paper focused on an effective method to discriminate the geographical origin of Wuyi-Rock tea by the stable isotope ratio (SIR) and metallic element profiling (MEP) combined with support vector machine (SVM) analysis. Wuyi-Rock tea (n = 99) collected from nine producing areas and non-Wuyi-Rock tea (n = 33) from eleven nonproducing areas were analysed for SIR and MEP by established methods. The SVM model based on coupled data produced the best prediction accuracy (0.9773). This prediction shows that instrumental methods combined with a classification model can provide an effective and stable tool for provenance discrimination. Moreover, every feature variable in stable isotope and metallic element data was ranked by its contribution to the model. The results show that δ2H, δ18O, Cs, Cu, Ca, and Rb contents are significant indications for provenance discrimination and not all of the metallic elements improve the prediction accuracy of the SVM model. PMID:28473941
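
    The classification step can be illustrated with a minimal scikit-learn sketch; the synthetic feature table below stands in for the paper's stable-isotope and element measurements (delta-2H, delta-18O, Cs, Cu, Ca, Rb), and the sample counts mirror the abstract.

    ```python
    # Hedged sketch of SVM-based provenance discrimination on synthetic data;
    # the paper's real measurements and tuning are not reproduced here.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(132, 6))            # 99 + 33 samples, 6 features
    y = np.r_[np.ones(99), np.zeros(33)]     # 1 = Wuyi-Rock, 0 = other origin

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```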

  8. Visualisation methods for large provenance collections in data-intensive collaborative platforms

    NASA Astrophysics Data System (ADS)

    Spinuso, Alessandro; Filgueira, Rosa; Atkinson, Malcolm; Gemuend, Andre

    2016-04-01

    This work investigates improving the methods of visually representing provenance information in the context of modern data-driven scientific research. It explores scenarios where data-intensive workflow systems are serving communities of researchers within collaborative environments, supporting the sharing of data and methods, and offering a variety of computation facilities, including HPC, HTC and Cloud. It focuses on the exploration of big-data visualisation techniques aiming at producing comprehensive and interactive views on top of large and heterogeneous provenance data. The same approach is applicable to control-flow and data-flow workflows or to combinations of the two. This flexibility is achieved using the W3C PROV recommendation as a reference model, especially its workflow-oriented profiles such as D-PROV (Missier et al. 2013). Our implementation is based on the provenance records produced by the dispel4py data-intensive processing library (Filgueira et al. 2015). dispel4py is an open-source Python framework for describing abstract stream-based workflows for distributed data-intensive applications, developed during the VERCE project. dispel4py enables scientists to develop their scientific methods and applications on their laptop and then run them at scale on a wide range of e-Infrastructures (Cloud, Cluster, etc.) without making changes. Users can therefore focus on designing their workflows at an abstract level, describing actions, input and output streams, and how they are connected. The dispel4py system then maps these descriptions to the enactment platforms, such as MPI, Storm, and multiprocessing. It provides a mechanism which allows users to determine the provenance information to be collected and to analyse it at runtime. For this work we consider alternative visualisation methods for provenance data, from infinite lists and localised interactive graphs to radial views. The latter technique has been positively explored in many fields, from text data visualisation to genomics and social networking analysis. Its adoption for provenance has been presented in the literature (Borkin et al. 2013) in the context of parent-child relationships across processes, constructed from control-flow information. Computer graphics research has focused on the advantage of this radial distribution of interlinked information and on ways to improve the visual efficiency and tunability of such representations, like the Hierarchical Edge Bundles visualisation method (Holten 2006), which aims at reducing the visual clutter of highly connected structures via the generation of bundles. Our approach explores the potential of the combination of these methods. It serves environments where the size of the provenance collection, coupled with the diversity of the infrastructures and the domain metadata, makes the extrapolation of usage trends extremely challenging. Applications of such visualisation systems can engage groups of scientists, data providers and computational engineers, by serving visual snapshots that highlight relationships between an item and its connected processes. We will present examples of comprehensive views on the distribution of processing and data transfers during a workflow's execution in HPC, as well as cross-workflow interactions and internal dynamics, the latter in the context of faceted searches on domain metadata value ranges. These are obtained from the analysis of real provenance data generated by the processing of seismic traces performed through the VERCE platform.
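
    As a toy illustration of a radial provenance view, the sketch below lays out a small lineage graph in concentric shells; networkx and matplotlib stand in for the visual analytics stack described above, and the node names are invented.

    ```python
    # Hedged sketch: a radial (shell) layout of a tiny data-flow lineage graph.
    import matplotlib.pyplot as plt
    import networkx as nx

    g = nx.DiGraph()
    g.add_edges_from([
        ("raw-trace", "filter"), ("filter", "filtered-trace"),
        ("filtered-trace", "correlate"), ("correlate", "correlation-product"),
    ])  # toy lineage: data entities and process nodes interleaved

    # Shell layout: inputs at the centre, derived products in outer rings
    pos = nx.shell_layout(g, nlist=[["raw-trace"],
                                    ["filter", "filtered-trace"],
                                    ["correlate", "correlation-product"]])
    nx.draw_networkx(g, pos, node_color="lightsteelblue", font_size=8)
    plt.axis("off")
    plt.show()
    ```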

  9. Proven Weight Loss Methods

    MedlinePlus

    Fact Sheet: Proven Weight Loss Methods. What can weight loss do for you? Losing weight can improve your health in a number of ways. It can lower ... at www.hormone.org/Spanish.

  10. LA-ICP-MS as Tool for Provenance Analyses in Arctic Marine Sediments

    NASA Astrophysics Data System (ADS)

    Wildau, Antje; Garbe-Schönberg, Dieter

    2015-04-01

    The hydraulic transport of sediments is a major geological process in terrestrial and marine systems and is responsible for the loss, redistribution and accumulation of minerals. Provenance analyses are a powerful tool for assessing the origin and dispersion of material in ancient and modern fluvial and marine sediments. Provenance-specific heavy minerals (e.g., zircon, rutile, tourmaline) can therefore be used to provide valuable information on the formation of ore deposits (placer deposits) and the reconstruction of paleogeography, hydrology, and climate conditions and developments. The application of provenance analyses for the latter reason is of specific interest, since there is a need for research on progressing climate change, and heavy minerals represent good proxies for the evaluation of recent and past changes in the climate. The study of these fine particles provides information about potential regional or long-distance transport paths, glacial/ice-drift and current flows, freezing and melting events, as well as depositional centers for the released sediments. Classic methods applied for provenance analyses are mapping of the presence/absence of diagnostic minerals, their grain size distribution, modal mineralogy, and the analysis of variations in the ratio of two or more heavy minerals. Electron microprobe analysis has been established to discover changes in the mineral chemistry of individual mineral phases, which can indicate fluctuations or differences in the provenance. All of these methods bear the potential for large errors that lower the validity of the provenance analyses, for example the misclassification of mineral species due to indistinguishable optical properties, or limitations in detecting variations of trace elements using the electron microprobe. For this case study, marine sediments from the Arctic Ocean have been selected to test whether LA-ICP-MS can be established as a key technique for precise and reliable provenance analyses. The Laptev Sea is known to be a "sea ice formation factory" and represents a perfect source area, with numerous sediment-loaded rivers draining into the Arctic Ocean. Mineral grains become trapped in the sea ice, which is transported to the Fram Strait, the outflow area of the Transpolar Drift System. Thus, minerals in the Fram Strait and in the Laptev Sea should have the same provenance. In both areas zircon, garnet, ilmenite, magnetite, tourmaline, pyroxene and amphibole were identified (amongst others). The large number of potential source areas and the widespread occurrence of these accessory and rock-forming minerals result in the absolute need for a highly sensitive and precise method such as LA-ICP-MS. We report new data on the suitability of selected heavy minerals for provenance analyses in the Arctic Ocean. Based on the individual trace element compositions, REE patterns and isotopic ratios, reflecting the conditions during formation, we report individual fingerprints for single mineral species. This enables us to allocate specific minerals from the Fram Strait and from the Laptev Sea to one provenance. Furthermore, we evaluate the suitability of different heavy minerals as geochemical proxies in Arctic sediments for provenance analyses using LA-ICP-MS.
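
    The fingerprint-matching idea can be sketched numerically: normalize each grain's REE pattern and correlate the pattern shapes. All concentrations below are invented (ppm) and the normalizing values are placeholders, not published chondrite reference values.

    ```python
    import numpy as np

    # Hedged sketch: compare single-grain REE "fingerprints" by normalizing
    # and correlating their patterns. All numbers are illustrative.
    ree = ["La", "Ce", "Nd", "Sm", "Eu", "Gd", "Dy", "Er", "Yb"]
    normalizer = np.array([0.24, 0.61, 0.46, 0.15, 0.056, 0.20, 0.25, 0.17, 0.17])

    grain_fram = np.array([10.0, 30.0, 20.0, 5.0, 1.2, 6.0, 7.0, 4.0, 4.0])
    grain_laptev = np.array([11.0, 33.0, 21.0, 5.5, 1.3, 6.2, 7.5, 4.1, 4.2])

    def fingerprint(conc):
        # Log of the normalized pattern captures its shape, not its absolute level
        return np.log(conc / normalizer)

    r = np.corrcoef(fingerprint(grain_fram), fingerprint(grain_laptev))[0, 1]
    print(f"pattern correlation: {r:.3f}")  # near 1 suggests a shared provenance
    ```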

  11. 40 CFR 63.90 - Program overview.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... “proven technology” (generally accepted by the scientific community as equivalent or better) that is... enforceable test method involving “proven technology” (generally accepted by the scientific community as... interest; and (3) “Combining” a federally required method with another proven method for application to...

  12. 40 CFR 63.90 - Program overview.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... “proven technology” (generally accepted by the scientific community as equivalent or better) that is... enforceable test method involving “proven technology” (generally accepted by the scientific community as... interest; and (3) “Combining” a federally required method with another proven method for application to...

  13. Querying Provenance Information: Basic Notions and an Example from Paleoclimate Reconstruction

    NASA Astrophysics Data System (ADS)

    Stodden, V.; Ludaescher, B.; Bocinsky, K.; Kintigh, K.; Kohler, T.; McPhillips, T.; Rush, J.

    2016-12-01

    Computational models are used to reconstruct and explain past environments and to predict likely future environments. For example, Bocinsky and Kohler have performed a 2,000-year reconstruction of the rain-fed maize agricultural niche in the US Southwest. The resulting academic publications not only contain traditional method descriptions, figures, etc. but also links to code and data for basic transparency and reproducibility. Examples include ResearchCompendia.org and the new project "Merging Science and Cyberinfrastructure Pathways: The Whole Tale." Provenance information provides a further critical element to understand a published study and to possibly extend or challenge the findings of the original authors. We present different notions and uses of provenance information using a computational archaeology example, e.g., the common use of "provenance for others" (for transparency and reproducibility), but also the more elusive but equally important use of "provenance for self". To this end, we distinguish prospective provenance (a.k.a. workflow) from retrospective provenance (a.k.a. data lineage) and show how combinations of both forms of provenance can be used to answer different kinds of important questions about a workflow and its execution. Since many workflows are developed using scripting or special purpose languages such as Python and R, we employ an approach and toolkit called YesWorkflow that brings provenance modeling, capture, and querying into the realm of scripting. YesWorkflow employs the basic W3C PROV standard, as well as the ProvONE extension, for sharing and exchanging retrospective and prospective provenance information, respectively. Finally, we argue that the utility of provenance information should be maximized by developing different kinds of provenance questions and queries during the early phases of computational workflow design and implementation.
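
    The YesWorkflow approach can be illustrated with a short annotated script: prospective provenance is declared in ordinary comments that the YesWorkflow tool extracts and queries. The tag names below follow the published YW convention (@begin/@end, @in/@out, @desc); the script content itself is invented.

    ```python
    # Hedged sketch of YesWorkflow-style annotations; the computation is a
    # placeholder, only the comment markup carries the prospective provenance.

    # @begin reconstruct_niche
    # @in precip_grid @desc gridded precipitation input
    # @out niche_map @desc reconstructed maize niche

    # @begin calibrate
    # @in precip_grid
    # @out model_params
    def calibrate(precip_grid):
        return {"threshold": 300}      # placeholder computation
    # @end calibrate

    # @begin predict
    # @in model_params
    # @out niche_map
    def predict(model_params):
        return [[1, 0], [0, 1]]        # placeholder niche grid
    # @end predict

    # @end reconstruct_niche

    niche_map = predict(calibrate(precip_grid=None))
    ```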

  14. Scientific Workflows + Provenance = Better (Meta-)Data Management

    NASA Astrophysics Data System (ADS)

    Ludaescher, B.; Cuevas-Vicenttín, V.; Missier, P.; Dey, S.; Kianmajd, P.; Wei, Y.; Koop, D.; Chirigati, F.; Altintas, I.; Belhajjame, K.; Bowers, S.

    2013-12-01

    The origin and processing history of an artifact is known as its provenance. Data provenance is an important form of metadata that explains how a particular data product came about, e.g., how and when it was derived in a computational process, which parameter settings and input data were used, etc. Provenance information provides transparency and helps to explain and interpret data products. Other common uses and applications of provenance include quality control, data curation, result debugging, and more generally, 'reproducible science'. Scientific workflow systems (e.g. Kepler, Taverna, VisTrails, and others) provide controlled environments for developing computational pipelines with built-in provenance support. Workflow results can then be explained in terms of workflow steps, parameter settings, input data, etc. using provenance that is automatically captured by the system. Scientific workflows themselves provide a user-friendly abstraction of the computational process and are thus a form of ('prospective') provenance in their own right. The full potential of provenance information is realized when combining workflow-level information (prospective provenance) with trace-level information (retrospective provenance). To this end, the DataONE Provenance Working Group (ProvWG) has developed an extension of the W3C PROV standard, called D-PROV. Whereas PROV provides a 'least common denominator' for exchanging and integrating provenance information, D-PROV adds new 'observables' that describe workflow-level information (e.g., the functional steps in a pipeline), as well as workflow-specific trace-level information (timestamps for each workflow step executed, the inputs and outputs used, etc.). Using examples, we will demonstrate how the combination of prospective and retrospective provenance provides added value in managing scientific data. The DataONE ProvWG is also developing tools based on D-PROV that allow scientists to get more mileage from provenance metadata. DataONE is a federation of member nodes that store data and metadata for discovery and access. By enriching metadata with provenance information, search and reuse of data is enhanced, and the 'social life' of data (being the product of many workflow runs, different people, etc.) is revealed. We are currently prototyping a provenance repository (PBase) to demonstrate what can be achieved with advanced provenance queries. The ProvExplorer and ProPub tools support advanced ad-hoc querying and visualization of provenance as well as customized provenance publications (e.g., to address privacy issues, or to focus provenance on relevant details). In a parallel line of work, we are exploring ways to add provenance support to widely-used scripting platforms (e.g. R and Python) and then expose that information via D-PROV.

  15. Tele-Intervention: The Wave of the Future Fits Families' Lives Today

    ERIC Educational Resources Information Center

    Behl, Diane D.; Houston, K. Todd; Guthrie, W. Spencer; Guthrie, Nancy K.

    2010-01-01

    This article describes the virtual delivery of early intervention services using distance communication technologies. It describes "tele-intervention," a new method of providing services to children and their families, and how it is used in a family with a deaf child. Tele-intervention has proven to be a viable service delivery model for…

  16. Advance in diagnosis of female genital tract tumor with laser fluorescence

    NASA Astrophysics Data System (ADS)

    Ding, Ai-Hua; Tseng, Quen; Lian, Shao-Hui

    1998-11-01

    In order to improve the diagnostic accuracy of malignant tumors with laser fluorescence, in 1996 our group successfully created the computerized laser fluorescence spectrograph type II (LFS II), which yields more reliable images than the naked-eye method. Seventy-four cases of female genital tract disease were examined by the LFS II, resulting in 10 positive cases, all of which were proven pathologically to be malignant tumors, with no false negatives; 3 cases presented as suspicious positives but all were proven pathologically to be non-tumorous lesions, giving a false positive rate of 4 percent. Our work showed that the LFS II method can provide a more rapid and accurate diagnosis of clinical malignant tumors.

  17. A Scientific Data Provenance Harvester for Distributed Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Raju, Bibi; Elsethagen, Todd O.

    Data provenance provides a way for scientists to observe how experimental data originates, conveys process history, and explains influential factors such as experimental rationale and associated environmental factors from system metrics measured at runtime. The US Department of Energy Office of Science Integrated End-to-end Performance Prediction and Diagnosis for Extreme Scientific Workflows (IPPD) project has developed a provenance harvester that is capable of collecting observations from file-based evidence typically produced by distributed applications. To achieve this, file-based evidence is extracted and transformed into an intermediate data format inspired in part by the W3C CSV on the Web recommendations, called the Harvester Provenance Application Interface (HAPI) syntax. This syntax provides a general means to pre-stage provenance into messages that are both human readable and capable of being written to a provenance store, Provenance Environment (ProvEn). HAPI is being applied to harvest provenance from climate ensemble runs for the Accelerated Climate Modeling for Energy (ACME) project funded under the U.S. Department of Energy's Office of Biological and Environmental Research (BER) Earth System Modeling (ESM) program. ACME informally provides provenance in a native form through configuration files, directory structures, and log files that contain success/failure indicators, code traces, and performance measurements. Because of its generic format, HAPI is also being applied to harvest tabular job management provenance from Belle II DIRAC scheduler relational database tables as well as from other scientific applications that log provenance-related information.
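
    A toy illustration of the observed-harvesting idea follows: scan a run log for success/failure evidence and stage it as tabular records. The log format and staging fields are invented, and HAPI's real syntax is not reproduced.

    ```python
    import re

    # Hedged sketch of harvesting provenance observations from file-based
    # evidence (log lines) into human-readable tabular records.
    log_lines = [
        "2024-01-01 00:00:01 step=preprocess status=SUCCESS",
        "2024-01-01 00:05:42 step=simulate status=FAILURE",
    ]

    pattern = re.compile(r"(\S+ \S+) step=(\S+) status=(\S+)")
    staged = []
    for line in log_lines:
        m = pattern.search(line)
        if m:
            timestamp, step, status = m.groups()
            staged.append({"timestamp": timestamp, "step": step, "status": status})

    for row in staged:  # pre-staged provenance messages, ready for a store
        print(row)
    ```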

  18. Elucidating tectonic events and processes from variably tectonized conglomerate clast detrital geochronology: examples from the Hongliuhe Formation in the southern Central Asian Orogenic Belt, NW China

    NASA Astrophysics Data System (ADS)

    Cleven, Nathan; Lin, Shoufa; Davis, Donald; Xiao, Wenjiao; Guilmette, Carl

    2017-04-01

    This work expands upon detrital zircon geochronology with a sampling and analysis strategy dating granitoid conglomerate clasts that exhibit differing degrees of internal ductile deformation. As deformation textures within clastic material reflect the variation and history of tectonization in the source region of a deposit, we outline a dating methodology that can provide details of the provenance's tectonomagmatic history from deformation-relative age distributions. The method involves bulk samples of solely granitoid clasts, as they are representative of the magmatic framework within the provenance. The clasts are classified and sorted into three subsets: undeformed, slightly deformed, and deformed. LA-ICPMS U-Pb geochronology is performed on zircon separates of each subset. Our case study, involving the Permian Hongliuhe Formation in the southern Central Asian Orogenic Belt, analyzes each of the three clast subsets, as well as sandstone detrital samples, at three stratigraphic levels to yield a profile of the unroofed provenance. The age spectra of the clast samples exhibit different, wider distributions than sandstone samples, considered an effect of proximity to the respective provenance. Comparisons of clast data to sandstone data, as well as comparisons between stratigraphic levels, yield indications of key tectonic processes, in addition to the typical characteristics provided by detrital geochronology. The clast data indicates a minimal lag time, implying rapid exhumation rates, whereas sandstone data alone would indicate a 90 m.y. lag time. Early Paleozoic arc building episodes appear as Ordovician peaks in sandstone data, and Silurian-Devonian peaks in clast data, indicating a younging of magmatism towards the proximal provenance. A magmatic hiatus starts in the Devonian, correlating with the latest age of deformed clasts, interpreted as timing of collisional tectonics. Provenance interpretation using the correlations seen between the clast and sandstone data proves to be more detailed and more robust than that determined from sandstone samples alone. The variably tectonized clast detrital geochronology method offers a regional reconnaissance tool that can address the practical limits of studying regional granitoid distributions.
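
    The core comparison of the method, contrasting age spectra between clast subsets and sandstone samples, can be sketched with kernel density estimates. All ages below are invented (in Ma); the paper's actual U-Pb data are not reproduced.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    # Hedged sketch: compare detrital age spectra via KDEs of synthetic ages.
    rng = np.random.default_rng(2)
    clast_ages = rng.normal(420.0, 15.0, 60)    # e.g. a Silurian-Devonian clast peak
    sand_ages = rng.normal(465.0, 20.0, 120)    # e.g. an Ordovician sandstone peak

    grid = np.linspace(300.0, 600.0, 301)
    clast_kde = gaussian_kde(clast_ages)(grid)
    sand_kde = gaussian_kde(sand_ages)(grid)

    print("clast mode (Ma):", grid[clast_kde.argmax()])
    print("sandstone mode (Ma):", grid[sand_kde.argmax()])
    ```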

  19. FHIR Healthcare Directories: Adopting Shared Interfaces to Achieve Interoperable Medical Device Data Integration.

    PubMed

    Tyndall, Timothy; Tyndall, Ayami

    2018-01-01

    Healthcare directories are vital for interoperability among healthcare providers, researchers and patients. Past efforts at directory services have not provided the tools to allow integration of the diverse data sources. Many are overly strict, incompatible with legacy databases, and do not provide Data Provenance. A more architecture-independent system is needed to enable secure, GDPR-compatible (8) service discovery across organizational boundaries. We review our development of a portable Data Provenance Toolkit supporting provenance within Health Information Exchange (HIE) systems. The Toolkit has been integrated with client software and successfully leveraged in clinical data integration. The Toolkit validates provenance stored in a Blockchain or Directory record and creates provenance signatures, providing standardized provenance that moves with the data. This healthcare directory suite implements discovery of healthcare data by HIE and EHR systems via FHIR. Shortcomings of past directory efforts include the inability to map complex datasets and to enable interoperability via exchange endpoint discovery. By delivering data without dictating how it is stored, we improve exchange and facilitate discovery on a multi-national level through open source, fully interoperable tools. With the development of Data Provenance resources we enhance exchange and improve security and usability throughout the health data continuum.
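
    For readers unfamiliar with FHIR provenance, the sketch below shows a minimal FHIR R4 Provenance resource with its required target, recorded, and agent elements; the references are invented, and the paper's Data Provenance Toolkit API is not reproduced.

    ```python
    import json

    # Hedged minimal FHIR R4 "Provenance" resource; all references invented.
    provenance = {
        "resourceType": "Provenance",
        "target": [{"reference": "Observation/example-bp"}],    # data being described
        "recorded": "2024-01-01T12:00:00Z",                     # when it was recorded
        "agent": [
            {"who": {"reference": "Organization/example-hie"}}  # responsible party
        ],
    }
    print(json.dumps(provenance, indent=2))
    ```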

  20. Successful Instructional Strategies for the Community College Student.

    ERIC Educational Resources Information Center

    Franse, Stephen R.

    1992-01-01

    Describes methods which have proven effective in motivating largely nontraditional and bilingual community college students to succeed. Suggests that teachers require writing, use graphs/illustrations creatively, use primary sources as required readings, provide supplementary readings, use brainstorming and quality circle techniques, and prepare…

  1. The enhancement of friction ridge detail on brass ammunition casings using cold patination fluid.

    PubMed

    James, Richard Michael; Altamimi, Mohamad Jamal

    2015-12-01

    Brass ammunition is commonly found at firearms related crime scenes. For this reason, many studies have focused on evidence that can be obtained from brass ammunition such as DNA, gunshot residue and fingerprints. Latent fingerprints on ammunition can provide good forensic evidence; however, fingerprint development on ammunition casings has proven to be difficult. A method using cold patination fluid is described as a potential tool to enhance friction ridge detail on brass ammunition casings. Current latent fingerprint development methods for brass ammunition have either failed to provide the necessary quality of friction ridge detail or can be very time consuming and require expensive equipment. In this study, the enhancement of fingerprints on live ammunition has been achieved with a good level of detail, whilst development on spent casings has to an extent also been possible. Development with cold patination fluid has proven to be a quick, simple and cost-effective method for fingerprint development on brass ammunition that can be easily implemented for routine police work. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.

  2. Geotechnical Descriptions of Rock and Rock Masses.

    DTIC Science & Technology

    1985-04-01

    determined in the field on core specimens by the standard Rock Testing Handbook Methods. ... to provide rock strength descriptions from the field. The point-load test has proven to be a reliable method of determining rock strength properties... report should qualify the reported spacing values by stating the methods used to determine spacing. Preferably the report should make the determination

  3. Active Provenance in Data-intensive Research

    NASA Astrophysics Data System (ADS)

    Spinuso, Alessandro; Mihajlovski, Andrej; Filgueira, Rosa; Atkinson, Malcolm

    2017-04-01

    Scientific communities are building platforms where the usage of data-intensive workflows is crucial to conduct their research campaigns. However, managing and effectively supporting the understanding of the 'live' processes, fostering computational steering, and sharing and re-using data and methods present several bottlenecks. These are often caused by the poor level of documentation of the methods and the data, and of how users interact with them. This work explores how, in such systems, flexibility in the management of provenance and its adaptation to different users and application contexts can lead to new opportunities for its exploitation, improving productivity. In particular, this work illustrates a conceptual and technical framework enabling tunable and actionable provenance in data-intensive workflow systems in support of reproducible science. It introduces the concept of Agile data-intensive systems to define the characteristics of our target platform. It shows a novel approach to the integration of provenance mechanisms, offering flexibility in the scale and in the precision of the provenance data collected, ensuring its relevance to the domain of the data-intensive task and fostering its rapid exploitation. The contributions address aspects of the scale of the provenance records, their usability and their active role in the research life-cycle. We will discuss the use of dynamically generated provenance types as the approach for the integration of provenance mechanisms into a data-intensive workflow system. Enabling provenance can be transparent to the workflow user and developer, as well as fully controllable and customisable, depending on their expertise and the application's reproducibility, monitoring and validation requirements. The API that allows the realisation and adoption of a provenance type is presented, especially for what concerns the support of provenance profiling, contextualisation and precision. An actionable approach to provenance management will also be discussed, enabling provenance-driven operations at runtime, regardless of the enactment technologies and connectivity impediments. We propose a framework based on concepts such as provenance clusters and provenance sensors, envisaging new potential for exploiting large quantities of provenance traces at runtime. Finally, the work will also introduce how the underlying provenance model can be explored with big-data visualisation techniques, aiming at producing comprehensive and interactive views on top of large and heterogeneous provenance data. We will demonstrate the adoption of alternative visualisation methods, from detailed and localised interactive graphs to radial views, serving different purposes and levels of expertise. Combining provenance types, selective rules and extensible metadata with reactive clustering opens a new and more versatile role for lineage information in the research life-cycle, thanks to its improved usability. The flexible profiling of the proposed framework offers aid to the human analysis of the process, with the support of advanced and intuitive interactive graphical tools. The Active Provenance methods are discussed in the context of a real implementation for a data-intensive library (dispel4py) and its adoption within use cases for computational seismology, climate studies and generic correlation analysis.
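
    The idea of tunable provenance precision can be caricatured in a few lines: a configurable decorator records coarser or finer lineage depending on a requested level. This is only a conceptual stand-in; dispel4py's actual provenance-type API is not reproduced here.

    ```python
    import functools
    import time

    # Hedged sketch: a decorator whose 'level' tunes provenance precision.
    def provenance(level="coarse"):
        def wrap(fn):
            @functools.wraps(fn)
            def inner(*args, **kwargs):
                t0 = time.time()
                out = fn(*args, **kwargs)
                record = {"activity": fn.__name__, "seconds": time.time() - t0}
                if level == "fine":                # more precision when requested
                    record["inputs"] = repr(args)[:80]
                    record["output"] = repr(out)[:80]
                print("PROV", record)              # stand-in for a provenance store
                return out
            return inner
        return wrap

    @provenance(level="fine")
    def correlate(a, b):
        return sum(x * y for x, y in zip(a, b))

    correlate([1, 2, 3], [4, 5, 6])
    ```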

  4. Understanding the public's health problems: applications of symbolic interaction to public health.

    PubMed

    Maycock, Bruce

    2015-01-01

    Public health has typically investigated health issues using methods from the positivistic paradigm. Yet these approaches, although able to quantify the problem, may not be able to explain the social reasons why the problem exists or its impact on those affected. This article will provide a brief overview of a sociological theory that provides methods and a theoretical framework that have proven useful in understanding public health problems and developing interventions. © 2014 APJPH.

  5. A GIHS-based spectral preservation fusion method for remote sensing images using edge restored spectral modulation

    NASA Astrophysics Data System (ADS)

    Zhou, Xiran; Liu, Jun; Liu, Shuguang; Cao, Lei; Zhou, Qiming; Huang, Huawen

    2014-02-01

    High spatial resolution and spectral fidelity are basic standards for evaluating an image fusion algorithm. Numerous fusion methods for remote sensing images have been developed. Some of these methods are based on the intensity-hue-saturation (IHS) transform and the generalized IHS (GIHS), which may cause serious spectral distortion. Spectral distortion in the GIHS is proven to result from changes in saturation during fusion. Therefore, reducing such changes can achieve high spectral fidelity. A GIHS-based spectral preservation fusion method that can theoretically reduce spectral distortion is proposed in this study. The proposed algorithm consists of two steps. The first step is spectral modulation (SM), which uses a Gaussian function to extract spatial details and conduct SM of the multispectral (MS) images. This method yields a desirable visual effect without requiring histogram matching between the panchromatic image and the intensity of the MS image. The second step uses a Gaussian convolution function to restore edge details lost during SM. The proposed method is shown to be effective and to provide better results than other GIHS-based methods.
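
    The GIHS detail-injection idea the paper builds on can be sketched in numpy: compute an intensity component as the band mean and add the panchromatic-minus-intensity detail to every band. The paper's Gaussian spectral modulation and edge-restoration steps are not reproduced, and the data are synthetic.

    ```python
    import numpy as np

    # Hedged sketch of plain GIHS detail injection (not the paper's full method).
    def gihs_fuse(ms, pan):
        """ms: (bands, H, W) multispectral image; pan: (H, W) panchromatic image."""
        intensity = ms.mean(axis=0)          # simple GIHS intensity component
        detail = pan - intensity             # spatial detail to inject
        return ms + detail[None, :, :]       # add the same detail to each band

    rng = np.random.default_rng(1)
    ms = rng.random((4, 64, 64))             # synthetic 4-band MS image
    pan = rng.random((64, 64))               # synthetic panchromatic image
    fused = gihs_fuse(ms, pan)
    print(fused.shape)
    ```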

  6. Phenolic profile, antioxidant capacity of five Ziziphus spina-christi (L.) Willd provenances and their allelopathic effects on Trigonella foenum-graecum L. and Lens culinaris L. seeds.

    PubMed

    Elaloui, M; Ghazghazi, H; Ennajah, A; Manaa, S; Guezmir, W; Karray, N B; Laamouri, A

    2017-05-01

    The aim of this work was to evaluate some secondary metabolites and the antioxidant activity of methanolic leaf extracts of five Ziziphus spina-christi provenances (INRGREF, Tozeur, Degueche, Nafta and Kebelli) and their allelopathic effects on Trigonella foenum-graecum and Lens culinaris. Leaves were collected during 2013 and 2014. Total phenols, flavonoids, tannins and antioxidant activity were evaluated using the Folin-Ciocalteu, aluminium trichloride, vanillin and 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical scavenging methods, respectively. Total phenols, tannins and flavonoids were present at levels of 57.41 mg GAE/g DW, 31.98 mg RE/g DW and 14.68 μg CE/g DW, respectively. The highest antioxidant activity (0.086 μg/mL) was noted in the Kebelli provenance (2013). The highest germination, plumule and radicle lengths of the tested species were observed with the INRGREF provenance. Z. spina-christi leaf extracts may be suggested for use in the food and pharmaceutical industries. Leaf extracts could also provide a natural herbicide with a positive impact on the environment.

  7. The Symbiotic Relationship between Scientific Workflow and Provenance (Invited)

    NASA Astrophysics Data System (ADS)

    Stephan, E.

    2010-12-01

    The purpose of this presentation is to describe the symbiotic nature of scientific workflows and provenance. We will also discuss the current trends and real world challenges facing these two distinct research areas. Although motivated differently, the needs of the international science communities are the glue that binds this relationship together. Understanding and articulating the science drivers to these communities is paramount as these technologies evolve and mature. Originally conceived for managing business processes, workflows are now becoming invaluable assets in both computational and experimental sciences. These reconfigurable, automated systems provide essential technology to perform complex analyses by coupling together geographically distributed disparate data sources and applications. As a result, workflows are capable of higher throughput in a shorter amount of time than performing the steps manually. Today many different workflow products exist; these include Kepler and Taverna, or similar products like MeDICI, developed at PNNL, that are standardized on the Business Process Execution Language (BPEL). Provenance, originating from the French term "provenir" ("to come from"), is used to describe the curation process of artwork as art is passed from owner to owner. The concept of provenance was adopted by digital libraries as a means to track the lineage of documents while standards such as Dublin Core began to emerge. In recent years the systems science community has increasingly expressed the need to expand the concept of provenance to formally articulate the history of scientific data. Communities such as the International Provenance and Annotation Workshop (IPAW) have formalized a provenance data model, the Open Provenance Model, and the W3C is hosting a provenance incubator group featuring the Proof Markup Language. Although both workflows and provenance have arisen from different communities and operate independently, their mutual success is tied together, forming a symbiotic relationship where research and development advances in one effort can provide tremendous benefits to the other. For example, automating provenance extraction within scientific applications is still a relatively new concept; the workflow engine provides the framework to capture application specific operations, inputs, and resulting data. It provides a description of the process history and data flow by wrapping workflow components around the applications and data sources. On the other hand, a lack of cooperation between workflows and provenance can inhibit the usefulness of both to science. Blindly tracking the execution history without having a true understanding of what kinds of questions end users may have makes the provenance indecipherable to the target users. Over the past nine years PNNL has been actively involved in provenance research in support of computational chemistry, molecular dynamics, biology, hydrology, and climate. PNNL has also been actively involved in efforts by the international community to develop open standards for provenance and the development of architectures to support provenance capture, storage, and querying. This presentation will provide real world use cases of how provenance and workflow can be leveraged and implemented to meet different needs, and the challenges that lie ahead.

  8. Using Qualitative Methods to Assess Student Trajectories and College Impact

    ERIC Educational Resources Information Center

    Harper, Shaun R.

    2007-01-01

    Researchers have called attention to the racism and stereotypes experienced by black undergraduates on predominantly white campuses; provided evidence of how race-specific organizations and programs help neutralize the oppressive ethos of these institutions; and proven empirically that historically black colleges and universities foster more…

  9. Optically Based Rapid Screening Method for Proven Optimal Treatment Strategies Before Treatment Begins

    DTIC Science & Technology

    to rapidly test/screen breast cancer therapeutics as a strategy to streamline drug development and provide individualized treatment. The results...system can therefore be used to streamline pre-clinical drug development, by reducing the number of animals, cost, and time required to screen new drugs

  10. Crossing Organizations for Professional Training.

    ERIC Educational Resources Information Center

    Gans, Cheryl

    2002-01-01

    Cross training from a variety of organizations can provide camp professionals with new ways of doing things and with proven methods used by others in related fields. The training needs of camp professionals can be met by organizations in the areas of outdoor education, environmental education, experiential learning, and recreation. Certification…

  11. Development and Evaluation of the School Cafeteria Nutrition Assessment Measures

    ERIC Educational Resources Information Center

    Krukowski, Rebecca A.; Philyaw Perez, Amanda G.; Bursac, Zoran; Goodell, Melanie; Raczynski, James M.; Smith West, Delia; Phillips, Martha M.

    2011-01-01

    Background: Foods provided in schools represent a substantial portion of US children's dietary intake; however, the school food environment has proven difficult to describe due to the lack of comprehensive, standardized, and validated measures. Methods: As part of the Arkansas Act 1220 evaluation project, we developed the School Cafeteria…

  12. Knowledge Provenance in Semantic Wikis

    NASA Astrophysics Data System (ADS)

    Ding, L.; Bao, J.; McGuinness, D. L.

    2008-12-01

    Collaborative online environments with a technical Wiki infrastructure are becoming more widespread. One of the strengths of a Wiki environment is that it is relatively easy for numerous users to contribute original content and modify existing content (potentially originally generated by others). As more users begin to depend on informational content that is evolved by Wiki communities, it becomes more important to track the provenance of the information. Semantic Wikis expand upon traditional Wiki environments by adding computationally understandable encodings of some of the terms and relationships in Wikis. We have developed a semantic Wiki environment extended with provenance markup. Provenance of original contributions as well as modifications is encoded using the provenance markup component of the Proof Markup Language. The Wiki environment provides the provenance markup automatically, so users are not required to make specific encodings of author, contribution date, and modification trail. Further, our Wiki environment includes a search component that understands the provenance primitives and thus can be used to provide a provenance-aware search facility. We will describe the knowledge provenance infrastructure of our Semantic Wiki and show how it is being used as the foundation of our group web site as well as a number of project web sites.

  13. The ethical leadership challenge for effective resolution of patient and family complaints and grievances: proven methods and models.

    PubMed

    Piper, Llewellyn E; Tallman, Erin

    2015-01-01

    Health care leaders and managers face the ethical leadership challenge in ensuring effective resolution of patient and family complaints and grievances. In today's society of increasing discontent about safety, quality, cost, and satisfaction, patient complaints and grievances are becoming more prevalent. Under the mandates of the Patient Protection and Affordable Care Act for transparency of quality and patient satisfaction scores and to be compliant with the standards from the Centers for Medicare & Medicaid Services and The Joint Commission, it is imperative that leadership ensure an ethical culture for effective resolution of patient and family complaints and grievances. This article addresses this ethical leadership challenge by providing a systematic approach with proven methods and models for effective resolution of complaints and grievances and thereby improving satisfaction, quality, safety, and cost.

  14. How the provenance of electronic health record data matters for research: a case example using system mapping.

    PubMed

    Johnson, Karin E; Kamineni, Aruna; Fuller, Sharon; Olmstead, Danielle; Wernli, Karen J

    2014-01-01

    The use of electronic health records (EHRs) for research is proceeding rapidly, driven by computational power, analytical techniques, and policy. However, EHR-based research is limited by the complexity of EHR data and a lack of understanding about data provenance, meaning the context under which the data were collected. This paper presents system flow mapping as a method to help researchers more fully understand the provenance of their EHR data as it relates to local workflow. We provide two specific examples of how this method can improve data identification, documentation, and processing. EHRs store clinical and administrative data, often in unstructured fields. Each clinical system has a unique and dynamic workflow, as well as an EHR customized for local use. The EHR customization may be influenced by a broader context such as documentation required for billing. We present a case study with two examples of using system flow mapping to characterize EHR data for a local colorectal cancer screening process. System flow mapping demonstrated that information entered into the EHR during clinical practice required interpretation and transformation before it could be accurately applied to research. We illustrate how system flow mapping shaped our knowledge of the quality and completeness of data in two examples: (1) determining colonoscopy indication as recorded in the EHR, and (2) discovering a specific EHR form that captured family history. Researchers who do not consider data provenance risk compiling data that are systematically incomplete or incorrect. For example, researchers who are not familiar with the clinical workflow under which data were entered might miss or misunderstand patient information or procedure and diagnostic codes. Data provenance is a fundamental characteristic of research data from EHRs. Given the diversity of EHR platforms and system workflows, researchers need tools for evaluating and reporting data availability, quality, and transformations. Our case study illustrates how system mapping can inform researchers about the provenance of their data as it pertains to local workflows.

  15. Undoing Feudalism: A New Look at Communal Conflict Mediation

    DTIC Science & Technology

    1994-03-24

    social constructionists assert that each individual human plays a significant role in creating and influencing reality as it is perceived by his...concepts which, over time, have proven their worth by providing meaning to new, ever-emerging social realities. Levi’s second method for belief system...competing social groups is a gradual one, though varying in speed and method according to circumstance. This realization suggests the existence of a

  16. Preliminary Climate Uncertainty Quantification Study on Model-Observation Test Beds at Earth Systems Grid Federation Repository

    NASA Astrophysics Data System (ADS)

    Lin, G.; Stephan, E.; Elsethagen, T.; Meng, D.; Riihimaki, L. D.; McFarlane, S. A.

    2012-12-01

    Uncertainty quantification (UQ) is the science of quantitative characterization and reduction of uncertainties in applications. It determines how likely certain outcomes are if some aspects of the system are not exactly known. UQ studies such as those on atmosphere datasets have greatly increased in size and complexity because they now comprise additional complex iterative steps, involve numerous simulation runs, and can include additional analytical products such as charts, reports, and visualizations to explain levels of uncertainty. These new requirements greatly expand the need for metadata support beyond the NetCDF convention and vocabulary, and as a result an additional formal data provenance ontology is required to provide a historical explanation of the origin of the dataset, including references between the explanations and components within the dataset. This work shares a climate observation data UQ science use case and illustrates how to reduce climate observation data uncertainty and use a linked-science application called Provenance Environment (ProvEn) to enable and facilitate scientific teams to publish, share, link, and discover knowledge about the UQ research results. UQ results include terascale datasets that are published to an Earth System Grid Federation (ESGF) repository. Uncertainty exists in observation datasets due to sensor data processing (such as time averaging), sensor failure in extreme weather conditions, sensor manufacturing error, etc. To reduce the uncertainty in the observation datasets, a method based on Principal Component Analysis (PCA) was proposed to recover the missing values in observation data. Several large principal components (PCs) of data with missing values are computed based on the available values using an iterative method. The computed PCs can approximate the true PCs with high accuracy provided a condition on the missing values is met; the iterative method greatly improves the computational efficiency of computing the PCs. Moreover, noise removal is done at the same time as the missing values are computed, by using only several large PCs. The uncertainty quantification is done through statistical analysis of the distribution of the different PCs. To record the above UQ process, and to provide an explanation of the uncertainty before and after the UQ process on the observation datasets, an additional data provenance ontology, such as ProvEn, is necessary. In this study, we demonstrate how to reduce observation data uncertainty on climate model-observation test beds, using ProvEn to record the UQ process on ESGF. ProvEn demonstrates how a scientific team conducting UQ studies can discover dataset links using its domain knowledgebase, allowing them to better understand and convey the UQ study research objectives, the experimental protocol used, the resulting dataset lineage, related analytical findings, and ancillary literature citations, along with the social network of scientists associated with the study. Climate scientists will not only benefit from understanding a particular dataset within a knowledge context, but also from the cross-referencing of knowledge among the numerous UQ studies stored in ESGF.
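
    The iterative PCA recovery of missing values described above can be sketched as follows, assuming a simple mean-initialized, low-rank refinement loop; this is a generic textbook variant, not the authors' exact algorithm.

    ```python
    import numpy as np

    # Hedged sketch of iterative PCA imputation: fill gaps with column means,
    # then repeatedly replace only the missing entries with a low-rank
    # (leading-PC) reconstruction. Retaining only the largest PCs also
    # suppresses noise, as the abstract notes.
    def pca_impute(X, n_pc=3, iters=50):
        X = X.copy()
        mask = np.isnan(X)
        col_means = np.nanmean(X, axis=0)
        X[mask] = np.take(col_means, np.where(mask)[1])      # initial guess
        for _ in range(iters):
            mu = X.mean(axis=0)
            U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
            approx = mu + (U[:, :n_pc] * s[:n_pc]) @ Vt[:n_pc]  # rank-n_pc reconstruction
            X[mask] = approx[mask]                           # overwrite missing cells only
        return X

    rng = np.random.default_rng(3)
    X = rng.normal(size=(50, 8))
    X[rng.random(X.shape) < 0.1] = np.nan                    # knock out ~10% of entries
    filled = pca_impute(X)
    print("remaining NaNs:", np.isnan(filled).sum())
    ```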

  17. DNA barcode and identification of the varieties and provenances of Taiwan's domestic and imported made teas using ribosomal internal transcribed spacer 2 sequences.

    PubMed

    Lee, Shih-Chieh; Wang, Chia-Hsiang; Yen, Cheng-En; Chang, Chieh

    2017-04-01

    The major aim of made tea identification is to identify the variety and provenance of the tea plant. The present experiment used 113 tea plants [Camellia sinensis (L.) O. Kuntze] housed at the Tea Research and Extension Substation, from which 113 internal transcribed spacer 2 (ITS2) fragments, 104 trnL intron sequences, and 98 trnL-trnF intergenic region sequences were successfully sequenced. The similarity of the ITS2 nucleotide sequences between tea plants housed at the Tea Research and Extension Substation was 0.379-0.994. In this polymerase chain reaction-amplified noncoding region, no varieties possessed identical sequences. Compared with the trnL intron and trnL-trnF intergenic sequence fragments of chloroplast DNA (cpDNA), the proportion of ITS2 nucleotide sequence variation was larger, making it more suitable for establishing a DNA barcode database to identify tea plant varieties. After establishing the database, 30 imported teas and 35 domestic made teas were used in this model system to explore the feasibility of using ITS2 sequences to identify the varieties and provenances of made teas. A phylogenetic tree was constructed from the ITS2 sequences using the unweighted pair group method with arithmetic mean (UPGMA), which indicated that the same variety of tea plant is likely to be successfully categorized into one cluster, although contamination from other tea plants was also detected. This result provides molecular evidence that the similarity between important tea varieties in Taiwan remains high. We suggest a direct, wide collection of made teas and original samples of tea plants to establish an ITS2 sequence molecular barcode identification database to identify the varieties and provenances of tea plants. The DNA barcode comparison method can satisfy the need for rapid, low-cost, frontline differentiation of the large number of made teas from Taiwan and abroad, and can provide molecular evidence of their varieties and provenances. Copyright © 2016. Published by Elsevier B.V.
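
    As an illustration of the clustering step, the sketch below builds a UPGMA tree from a hypothetical matrix of pairwise ITS2 distances; SciPy's 'average' linkage criterion is UPGMA (all labels and distances here are invented for the example):

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import dendrogram, linkage
    from scipy.spatial.distance import squareform

    # Hypothetical pairwise ITS2 distances (1 - similarity) for five accessions.
    labels = ["TTES-1", "TTES-2", "Import-A", "Import-B", "Import-C"]
    D = np.array([
        [0.000, 0.020, 0.150, 0.160, 0.610],
        [0.020, 0.000, 0.140, 0.150, 0.600],
        [0.150, 0.140, 0.000, 0.030, 0.580],
        [0.160, 0.150, 0.030, 0.000, 0.590],
        [0.610, 0.600, 0.580, 0.590, 0.000],
    ])

    # 'average' linkage on a condensed distance matrix is exactly UPGMA.
    tree = linkage(squareform(D), method="average")
    dendrogram(tree, labels=labels, no_plot=True)  # set no_plot=False to draw
    ```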

  18. Coupling online effects-based monitoring with physicochemical, optical, and spectroscopy methods to assess quality at a surface water intake

    EPA Science Inventory

    Effects-based monitoring of water quality is a proven approach to monitoring the status of a water source. Only biological material can integrate factors which dictate toxicity. Online Toxicity Monitors (OTMs) provide a means to digitize sentinel organism responses to dynamic wa...

  19. Parallel Worlds of Education and Medicine: Art, Science, and Evidence

    ERIC Educational Resources Information Center

    Johnson, Eileen

    2008-01-01

    The No Child Left Behind Act is comprised of four pillars, one of which is "proven education methods." This paper attempts to provide a historical context for the development of evidence-based education by examining its foundation in medical practice. Next, the rationale for evidence of educational effectiveness based on a scientific…

  20. Teaching and Learning with Computers! A Method for American Indian Bilingual Classrooms.

    ERIC Educational Resources Information Center

    Bennett, Ruth

    Computer instruction can offer particular benefits to the Indian child. Computer use emphasizes the visual facets of learning, teaches language based skills needed for higher education and careers, and provides types of instruction proven effective with Indian children, such as private self-testing and cooperative learning. The Hupa, Yurok, Karuk,…

  1. A convergent diffusion and social marketing approach for disseminating proven approaches to physical activity promotion.

    PubMed

    Dearing, James W; Maibach, Edward W; Buller, David B

    2006-10-01

    Approaches from diffusion of innovations and social marketing are used here to propose efficient means to promote and enhance the dissemination of evidence-based physical activity programs. While both approaches have traditionally been conceptualized as top-down, center-to-periphery, centralized efforts at social change, their operational methods have usually differed. The operational methods of diffusion theory have a strong relational emphasis, while the operational methods of social marketing have a strong transactional emphasis. Here, we argue for a convergence of diffusion of innovation and social marketing principles to stimulate the efficient dissemination of proven-effective programs. In general terms, we are encouraging a focus on societal sectors as a logical and efficient means for enhancing the impact of dissemination efforts. This requires an understanding of complex organizations and the functional roles played by different individuals in such organizations. In specific terms, ten principles are provided for working effectively within societal sectors and enhancing user involvement in the processes of adoption and implementation.

  2. On the efficacy of stochastic collocation, stochastic Galerkin, and stochastic reduced order models for solving stochastic problems

    DOE PAGES

    Richard V. Field, Jr.; Emery, John M.; Grigoriu, Mircea Dan

    2015-05-19

    The stochastic collocation (SC) and stochastic Galerkin (SG) methods are two well-established and successful approaches for solving general stochastic problems. A recently developed method based on stochastic reduced order models (SROMs) can also be used. Herein we provide a comparison of the three methods for some numerical examples; our evaluation only holds for the examples considered in the paper. The purpose of the comparisons is not to criticize the SC or SG methods, which have proven very useful for a broad range of applications, nor is it to provide overall ratings of these methods as compared to the SROM method. Furthermore, our objectives are to present the SROM method as an alternative approach to solving stochastic problems and provide information on the computational effort required by the implementation of each method, while simultaneously assessing their performance for a collection of specific problems.
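
    The paper itself does not include code, but the stochastic collocation idea can be sketched in a few lines: evaluate the model only at quadrature nodes and combine the samples with the quadrature weights. A minimal example for a scalar Gaussian input using Gauss-Hermite nodes:

    ```python
    import numpy as np

    def collocation_mean(f, n_nodes=8):
        """Stochastic collocation estimate of E[f(X)] for X ~ N(0, 1).

        Evaluates the model only at the Gauss-Hermite nodes and combines
        the samples with the corresponding quadrature weights.
        """
        nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
        # Change of variables x = sqrt(2) * node for the standard normal weight.
        return np.sum(weights * f(np.sqrt(2.0) * nodes)) / np.sqrt(np.pi)

    # Toy "model": E[exp(X)] = exp(1/2) for standard normal X.
    estimate = collocation_mean(np.exp)
    print(estimate, np.exp(0.5))  # the two agree to many digits with 8 nodes
    ```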

  3. Membrane proteins structures: A review on computational modeling tools.

    PubMed

    Almeida, Jose G; Preto, Antonio J; Koukos, Panagiotis I; Bonvin, Alexandre M J J; Moreira, Irina S

    2017-10-01

    Membrane proteins (MPs) play diverse and important functions in living organisms. They constitute 20% to 30% of the known bacterial, archaean and eukaryotic organisms' genomes. In humans, their importance is emphasized as they represent 50% of all known drug targets. Nevertheless, experimental determination of their three-dimensional (3D) structure has proven to be both time consuming and rather expensive, which has led to the development of computational algorithms to complement the available experimental methods and provide valuable insights. This review highlights the importance of membrane proteins and how computational methods are capable of overcoming challenges associated with their experimental characterization. It covers various MP structural aspects, such as lipid interactions, allostery, and structure prediction, based on methods such as Molecular Dynamics (MD) and Machine-Learning (ML). Recent developments in algorithms, tools and hybrid approaches, together with the increase in both computational resources and the amount of available data have resulted in increasingly powerful and trustworthy approaches to model MPs. Even though MPs are elementary and important in nature, the determination of their 3D structure has proven to be a challenging endeavor. Computational methods provide a reliable alternative to experimental methods. In this review, we focus on computational techniques to determine the 3D structure of MP and characterize their binding interfaces. We also summarize the most relevant databases and software programs available for the study of MPs. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Preliminary Structural Design Using Topology Optimization with a Comparison of Results from Gradient and Genetic Algorithm Methods

    NASA Technical Reports Server (NTRS)

    Burt, Adam O.; Tinker, Michael L.

    2014-01-01

    In this paper, genetic algorithm-based and gradient-based topology optimization are presented in application to a real hardware design problem. Preliminary design of a planetary lander mockup structure is accomplished using these methods, which provide major weight savings by addressing structural efficiency during the design cycle. This paper presents two alternative formulations of the topology optimization problem. The first is the widely used gradient-based implementation using commercially available algorithms. The second is formulated using genetic algorithms and internally developed capabilities. These two approaches are applied to a practical design problem for hardware that has been built, tested, and proven to be functional. Both formulations converged on similar solutions, and therefore both were shown to be valid implementations of the process. This paper discusses both of these formulations at a high level.
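
    The abstract gives no implementation details; as a rough illustration of the genetic-algorithm formulation, the sketch below evolves a binary material layout against a toy penalized objective (the fitness function and all parameters are invented stand-ins, not the paper's structural objective):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def fitness(x):
        # Toy stand-in for a structural objective: prefer light designs
        # (few 1s) that keep a minimum amount of "material" in the first half.
        mass = x.sum()
        feasible = x[: len(x) // 2].sum() >= 4
        return -mass if feasible else -mass - 100  # penalize infeasible layouts

    def evolve(n_bits=20, pop_size=40, n_gen=100, p_mut=0.05):
        pop = rng.integers(0, 2, size=(pop_size, n_bits))
        for _ in range(n_gen):
            scores = np.array([fitness(ind) for ind in pop])
            # Tournament selection: keep the better of two random individuals.
            a, b = rng.integers(0, pop_size, (2, pop_size))
            parents = np.where((scores[a] > scores[b])[:, None], pop[a], pop[b])
            # One-point crossover between consecutive parents.
            cut = rng.integers(1, n_bits, pop_size)
            children = parents.copy()
            for i in range(0, pop_size - 1, 2):
                c = cut[i]
                children[i, c:], children[i + 1, c:] = (
                    parents[i + 1, c:].copy(), parents[i, c:].copy())
            # Bit-flip mutation.
            flips = rng.random(children.shape) < p_mut
            pop = np.where(flips, 1 - children, children)
        best = max(pop, key=fitness)
        return best, fitness(best)

    best_layout, best_score = evolve()
    ```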

  5. The Correlation between Radon Emission Concentration and Subsurface Geological Condition

    NASA Astrophysics Data System (ADS)

    Kuntoro, Yudi; Setiawan, Herru L.; Wijayanti, Teni; Haerudin, Nandi

    2018-03-01

    Exploration activities with standard methods have encountered many obstacles in the field. Geological surveys often struggle to find outcrops because they are covered by vegetation, alluvial layers, or urban development and housing. The seismic method requires large expense, and licensing for the use of dynamite is complicated. The gravity method requires the operator to loop back to the starting point. Given these constraints, a new method is needed that can work more efficiently at lower cost. Several studies in various countries have shown a correlation between the presence of hydrocarbons and the Radon gas concentration at the earth's surface. By exploiting the ability of Radon to migrate to the surface, the surface Radon concentration can provide information about subsurface structural conditions. Radon is the only radioactive substance that is gas-phase at atmospheric temperature, and it is very abundant in the earth's mantle. The vast differences in temperature and pressure between the mantle and the crust cause a convection flow toward the surface, which carries gas-phase Radon with it. The magnitude of the convection current depends on the porosity and permeability of the rocks through which the Radon travels, so the surface Radon concentration delineates the porosity and permeability of subsurface rock layers. Measurements were carried out at several locations with various subsurface geological conditions, including proven oil fields, a proven geothermal field, and a frontier area as a comparison. These measurements show that the average and background concentration thresholds in the proven oil field (11,200 Bq/m3) and the proven geothermal field (7,820 Bq/m3) are much higher than in the frontier area (329 and 1,620 Bq/m3). The surface Radon concentration also correlates with the presence of geological faults: peak Radon concentrations occur along faults.

  6. Empowering Provenance in Data Integration

    NASA Astrophysics Data System (ADS)

    Kondylakis, Haridimos; Doerr, Martin; Plexousakis, Dimitris

    The provenance of data has recently been recognized as central to the trust one places in data. This paper presents a novel framework to empower provenance in a mediator-based data integration system. We use a simple mapping language for mapping schema constructs between an ontology and relational sources that is capable of carrying provenance information. This language extends the traditional data exchange setting by translating our mapping specifications into source-to-target tuple-generating dependencies (s-t tgds). We then formally define the provenance information we want to retrieve, i.e., annotation, source, and tuple provenance. We provide three algorithms to retrieve provenance information using information stored in the mappings and the sources. We show the feasibility of our solution and the advantages of our framework.

  7. S-ProvFlow: provenance model and tools for scalable and adaptive analysis pipelines in geoscience.

    NASA Astrophysics Data System (ADS)

    Spinuso, A.; Mihajlovski, A.; Atkinson, M.; Filgueira, R.; Klampanos, I.; Sanchez, S.

    2017-12-01

    The reproducibility of scientific findings is essential to improve the quality and application of modern data-driven research. Delivering such reproducibility is challenging in the context of systems handling large data-streams with sophisticated computational methods. Similarly, the SKA (Square Kilometer Array) will collect an unprecedented volume of radio-wave signals that will have to be reduced and transformed into derived products, with impact on space-weather research. This highlights the importance of having cross-disciplinary mechanisms on the producer's side that rely on usable lineage data to support validation and traceability of the new artifacts. To be informative, provenance has to describe each method's abstractions and their implementation as mappings onto distributed platforms and their concurrent execution, capturing relevant internal dependencies at runtime. Producers and intelligent toolsets should be able to exploit the produced provenance, steering real-time monitoring activities and inferring adaptations of methods at runtime. We present a model of provenance (S-PROV) that extends W3C PROV and ProvONE, broadening the coverage of provenance to aspects related to distribution, scale-up, and steering of stateful streaming operators in analytic pipelines. This is supported by a technical framework for tuneable and actionable lineage, ensuring its relevance to the users' interests and fostering its rapid exploitation to facilitate research practices. By applying concepts such as provenance typing and profiling, users define rules to capture common provenance patterns and activate selective controls based on domain metadata. The traces are recorded in a document store with index optimisation, and a web API serves advanced interactive tools (S-ProvFlow, https://github.com/KNMI/s-provenance) that allow different classes of consumers to rapidly explore the provenance data. The system, which contributes to the SKA-Link initiative through technology and knowledge transfer events, will be discussed in the context of an existing data-intensive service for seismology (VERCE) and the newly funded project DARE (Delivering Agile Research Excellence): a generic solution for extreme data and methods in geosciences that domain experts can understand, change, and use effectively.
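
    S-PROV itself is not reproduced here, but since it extends W3C PROV, a minimal retrospective trace can be sketched with the Python `prov` package (all identifiers below are invented for the example):

    ```python
    from prov.model import ProvDocument

    doc = ProvDocument()
    doc.add_namespace("ex", "http://example.org/")

    raw = doc.entity("ex:raw-stream")
    pipeline = doc.activity("ex:seismic-pipeline-run-42")
    scientist = doc.agent("ex:alice")
    product = doc.entity("ex:derived-product")

    # Minimal retrospective trace: who ran what, on which inputs,
    # and which outputs were generated and derived.
    doc.wasAssociatedWith(pipeline, scientist)
    doc.used(pipeline, raw)
    doc.wasGeneratedBy(product, pipeline)
    doc.wasDerivedFrom(product, raw)

    print(doc.get_provn())  # human-readable PROV-N serialization
    ```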

  8. Bridging the Gap between Scientific Data Producers and Consumers: A Provenance Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stephan, Eric G.; Pinheiro da Silva, Paulo; Kleese van Dam, Kerstin

    2013-06-03

    Despite the methodical and painstaking efforts made by scientists to record their scientific findings and protocols, a knowledge gap problem continues to persist today between producers and consumers of scientific results, because technology performs the exchange of data as opposed to scientists making direct contact. Provenance is a means to formalize how this knowledge is transferred. However, for it to be meaningful to scientists, the provenance research community needs continued contributions from the scientific community to extend and leverage provenance-based vocabularies and technology. Going forward, the provenance community must also be vigilant to meet the scalability needs of data-intensive science.

  9. Qualitative Versus Quantitative Mammographic Breast Density Assessment: Applications for the US and Abroad

    PubMed Central

    Destounis, Stamatia; Arieno, Andrea; Morgan, Renee; Roberts, Christina; Chan, Ariane

    2017-01-01

    Mammographic breast density (MBD) has been proven to be an important risk factor for breast cancer and an important determinant of mammographic screening performance. The measurement of density has changed dramatically since its inception. Initial qualitative measurement methods have been found to have limited consistency, both between readers and with regard to breast cancer risk. Following the introduction of full-field digital mammography, more sophisticated measurement methodology is now possible. Automated computer-based density measurements can provide consistent, reproducible, and objective results. In this review paper, we describe the various methods currently available to assess MBD and provide a discussion of the clinical utility of such methods for breast cancer screening. PMID:28561776
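
    As a sketch of what an automated, area-based density measure computes, the following assumes a segmented breast mask and a dense-tissue intensity threshold (in real systems both come from trained models or calibrated physics, not fixed values):

    ```python
    import numpy as np

    def percent_density(image, breast_mask, dense_threshold):
        """Area-based breast density: dense pixels / breast pixels * 100.

        image: 2-D array of mammogram intensities.
        breast_mask: boolean array marking the segmented breast region.
        dense_threshold: intensity above which tissue counts as dense.
        """
        breast = image[breast_mask]
        return 100.0 * np.count_nonzero(breast > dense_threshold) / breast.size

    # Toy example on synthetic data.
    rng = np.random.default_rng(7)
    img = rng.normal(100, 20, size=(256, 256))
    mask = np.ones_like(img, dtype=bool)
    print(f"{percent_density(img, mask, dense_threshold=120):.1f}% dense")
    ```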

  10. Ontology development for provenance tracing in National Climate Assessment of the US Global Change Research Program

    NASA Astrophysics Data System (ADS)

    Ma, X.; Zheng, J. G.; Goldstein, J.; Duggan, B.; Xu, J.; Du, C.; Akkiraju, A.; Aulenbach, S.; Tilmes, C.; Fox, P. A.

    2013-12-01

    The periodic National Climate Assessment (NCA) of the US Global Change Research Program (USGCRP) [1] produces reports about findings of global climate change and the impacts of climate change on the United States. Those findings are of great public and academic concern and are used in policy and management decisions, which makes the provenance information of findings in those reports especially important. The USGCRP is developing a Global Change Information System (GCIS), in which the NCA reports and associated provenance information are the primary records. We modeled and developed Semantic Web applications for the GCIS. By applying a use case-driven iterative methodology [2], we developed an ontology [3] to represent the content structure of a report and the associated provenance information. We also mapped the classes and properties in our ontology onto the W3C PROV-O ontology [4] to realize a formal presentation of provenance. We successfully implemented the ontology in several pilot systems for a recent National Climate Assessment report (the NCA3). They provide users with the functionality to browse and search provenance information by topics of interest. Provenance information for the NCA3 has been made structured and interoperable by applying the developed ontology. Besides the pilot systems we developed, other tools and services are also able to interact with the data in the context of the 'Web of data' and thus create added value. Our research shows that the use case-driven iterative method bridges the gap between Semantic Web researchers and earth and environmental scientists and can be rapidly deployed for developing Semantic Web applications. Our work also provides first-hand experience in re-using the W3C PROV-O ontology in the field of earth and environmental sciences, as the PROV-O ontology was only recently ratified (on 04/30/2013) by the W3C as a recommendation and relevant applications are still rare. [1] http://www.globalchange.gov [2] Fox, P., McGuinness, D.L., 2008. TWC Semantic Web Methodology. Accessible at: http://tw.rpi.edu/web/doc/TWC_SemanticWebMethodology [3] https://scm.escience.rpi.edu/svn/public/projects/gcis/trunk/rdf/schema/GCISOntology.ttl [4] http://www.w3.org/TR/prov-o/

  11. From Provenance Standards and Tools to Queries and Actionable Provenance

    NASA Astrophysics Data System (ADS)

    Ludaescher, B.

    2017-12-01

    The W3C PROV standard provides a minimal core for sharing retrospective provenance information for scientific workflows and scripts. PROV extensions such as DataONE's ProvONE model are necessary for linking runtime observables in retrospective provenance records with conceptual-level prospective provenance information, i.e., workflow (or dataflow) graphs. Runtime provenance recorders, such as DataONE's RunManager for R, or noWorkflow for Python capture retrospective provenance automatically. YesWorkflow (YW) is a toolkit that allows researchers to declare high-level prospective provenance models of scripts via simple inline comments (YW-annotations), revealing the computational modules and dataflow dependencies in the script. By combining and linking both forms of provenance, important queries and use cases can be supported that neither provenance model can afford on its own. We present existing and emerging provenance tools developed for the DataONE and SKOPE (Synthesizing Knowledge of Past Environments) projects. We show how the different tools can be used individually and in combination to model, capture, share, query, and visualize provenance information. We also present challenges and opportunities for making provenance information more immediately actionable for the researchers who create it in the first place. We argue that such a shift towards "provenance-for-self" is necessary to accelerate the creation, sharing, and use of provenance in support of transparent, reproducible computational and data science.
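
    For readers unfamiliar with YW annotations, the toy script below shows the style of inline comments that YesWorkflow parses into a prospective dataflow graph (file names and block names are illustrative):

    ```python
    import csv

    # Setup for the demo only: create a tiny input file (not part of the YW graph).
    with open("stations.csv", "w", newline="") as f:
        f.write("id,name\n1,upstream\n2,downstream\n")

    # @begin fetch_and_summarize
    # @in station_file @uri file:stations.csv
    # @out summary_file @uri file:summary.csv

    # @begin read_stations
    # @in station_file
    # @out records
    with open("stations.csv") as f:
        records = list(csv.DictReader(f))
    # @end read_stations

    # @begin summarize
    # @in records
    # @out summary_file
    with open("summary.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["n_stations"])
        writer.writerow([len(records)])
    # @end summarize

    # @end fetch_and_summarize
    ```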

  12. Adaptation of eastern white pine provenances to planting sites

    Treesearch

    Maurice E., Jr. Demeritt; Peter W. Garrett

    1996-01-01

    Eastern white pine provenances from the extreme limits of the species' natural range are changing from above- and below-average stability to average stability for height growth with increasing age. The regression method is useful for evaluating the stability of provenances across planting sites. The same general conclusions are reached for the performance at...

  13. Developing Patient-Centered Communication Skills Training for Oncologists: Describing the Content and Efficacy of Training

    ERIC Educational Resources Information Center

    Brown, Richard F.; Bylund, Carma L.; Gueguen, Jennifer A.; Diamond, Catherine; Eddington, Julia; Kissane, David

    2010-01-01

    Communication Skills Training (CST) is a proven aid to help oncologists achieve high quality patient-centered communication. No research studies have provided clear guidelines for developing the content of CST. The aim of this work is to describe a method of developing such content and evaluation of effectiveness of CST training workshops (based…

  14. School Accounting, Budgeting and Finance Challenges. Programs to Help Your School District Improve Its Accounting Procedures, Budgeting Methods and Financial Reporting.

    ERIC Educational Resources Information Center

    Stolberg, Charles G., Ed.

    To help improve school district financial management, the Association of School Business Officials at its 1980 annual meeting held a special session consisting of 20 "mini-workshops" about successful, field-proven practices in school budgeting, accounting, auditing, and other financial tasks. This document provides summaries of the…

  15. Assessing non-uniqueness: An algebraic approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasco, Don W.

    Geophysical inverse problems are endowed with a rich mathematical structure. When discretized, most differential and integral equations of interest are algebraic (polynomial) in form. Techniques from algebraic geometry and computational algebra provide a means to address questions of existence and uniqueness for both linear and non-linear inverse problems. In a sense, the methods extend ideas that have proven fruitful in treating linear inverse problems.
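
    As a small illustration of the algebraic approach, a Groebner basis computed with SymPy reduces a toy two-equation polynomial "inverse problem" to a triangular form from which existence and non-uniqueness can be read off:

    ```python
    from sympy import groebner, solve, symbols

    x, y = symbols("x y", real=True)

    # A toy discretized "inverse problem": two polynomial constraints on (x, y).
    equations = [x**2 + y**2 - 2, x - y]

    # The lex-order Groebner basis contains a univariate polynomial in y,
    # so existence and the number of solutions can be read off directly.
    basis = groebner(equations, x, y, order="lex")
    print(basis)

    solutions = solve(equations, [x, y])
    print(solutions)  # [(-1, -1), (1, 1)] -> the model is non-unique
    ```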

  16. Advantages and Challenges in using Multi-Sensor Data for Studying Aerosols from Space

    NASA Astrophysics Data System (ADS)

    Leptoukh, Gregory

    We are living in a golden era of sensors measuring aerosols from space, e.g., MODIS, MISR, MERIS, OMI, POLDER, etc. Data from multiple sensors provide more complete coverage of physical phenomena than data from a single sensor. These sensors differ from one another, are sensitive to different parts of the atmosphere, use different aerosol models, and treat the surface differently when retrieving aerosols. However, they complement each other, providing more information about the spatial, vertical, and temporal distribution of aerosols. In addition to differences in instrumentation, retrieval algorithms, and calibration, there are quite substantial differences in the processing algorithms from Level 0 up to Levels 3 and 4. Some of these differences in processing steps, at times not well documented and not widely known by users, can lead to quite significant differences in the final products. Without documentation of all the steps leading to the final product, data users will not trust the data and/or may use the data incorrectly. Data by themselves, without quality assessment and provenance, are not sufficient to make accurate scientific conclusions. In this paper we provide examples of striking differences between aerosol optical depth data from MODIS, MISR, and MERIS that can be attributed to differences in a certain threshold, in aggregation methods, and in the data-day definition. We discuss challenges in developing processing provenance, and we address the issues of harmonization of data, quality, and provenance that must be resolved to guide multi-sensor data usage and avoid apples-to-oranges comparison and fusion.

  17. Building America House Simulation Protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, E.; Engebrecht-Metzger, C.; Horowitz, S.

    As Building America (BA) has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.

  18. Building America House Simulation Protocols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilson, E.; Engebrecht, C. Metzger; Horowitz, S.

    As Building America has grown to include a large and diverse cross-section of the home building and retrofit industries, it has become more important to develop accurate, consistent analysis techniques to measure progress towards the program's goals. The House Simulation Protocol (HSP) document provides guidance to program partners and managers so they can compare energy savings for new construction and retrofit projects. The HSP provides the program with analysis methods that are proven to be effective and reliable in investigating the energy use of advanced energy systems and of entire houses.

  19. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes.

    PubMed

    Ragan, Eric D; Endert, Alex; Sanyal, Jibonananda; Chen, Jian

    2016-01-01

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  20. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ragan, Eric; Alex, Endert; Sanyal, Jibonananda

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  1. Characterizing Provenance in Visualization and Data Analysis: An Organizational Framework of Provenance Types and Purposes

    DOE PAGES

    Ragan, Eric; Alex, Endert; Sanyal, Jibonananda; ...

    2016-01-01

    While the primary goal of visual analytics research is to improve the quality of insights and findings, a substantial amount of research in provenance has focused on the history of changes and advances throughout the analysis process. The term, provenance, has been used in a variety of ways to describe different types of records and histories related to visualization. The existing body of provenance research has grown to a point where the consolidation of design knowledge requires cross-referencing a variety of projects and studies spanning multiple domain areas. We present an organizational framework of the different types of provenance information and purposes for why they are desired in the field of visual analytics. Our organization is intended to serve as a framework to help researchers specify types of provenance and coordinate design knowledge across projects. We also discuss the relationships between these factors and the methods used to capture provenance information. In addition, our organization can be used to guide the selection of evaluation methodology and the comparison of study outcomes in provenance research.

  2. The CELSS Antarctic Analog Project: An Advanced Life Support Testbed at the Amundsen-Scott South Pole Station, Antarctica

    NASA Technical Reports Server (NTRS)

    Straight, Christian L.; Bubenheim, David L.; Bates, Maynard E.; Flynn, Michael T.

    1994-01-01

    The CELSS Antarctic Analog Project (CAAP) represents a logical solution to the multiple objectives of both NASA and the National Science Foundation (NSF). CAAP will result in the direct transfer of technologies and systems, proven under the most rigorous of conditions, to the NSF and to society at large. This project goes beyond, as it must, the generally accepted scope of CELSS and life support systems, to include the issues of power generation, human dynamics, community systems, and training. CAAP provides a vivid and starkly realistic testbed of Controlled Ecological Life Support System (CELSS) and life support systems and methods. CAAP will also be critical in the development and validation of performance parameters for future advanced life support systems.

  3. Data provenance assurance in the cloud using blockchain

    NASA Astrophysics Data System (ADS)

    Shetty, Sachin; Red, Val; Kamhoua, Charles; Kwiat, Kevin; Njilla, Laurent

    2017-05-01

    The ever-increasing adoption of cloud technology scales up activities such as the creation, exchange, and alteration of cloud data objects, which creates challenges in tracking malicious activities and security violations. Addressing this issue requires implementing a data provenance framework so that each data object in the federated cloud environment can be tracked and recorded but cannot be modified. Blockchain technology provides a promising decentralized platform for building tamper-proof systems; its incorruptible distributed ledger complements the need to maintain cloud data provenance. In this paper, we present a cloud-based data provenance framework using blockchain, which traces data record operations and generates provenance data. We anchor provenance data records into blockchain transactions, which provides validation of the provenance data while preserving user privacy. Once the provenance data is uploaded to the global blockchain network, it is extremely challenging to tamper with. Moreover, the provenance data uses hashed user identifiers prior to uploading, so blockchain nodes cannot link operations to a particular user; the framework thus ensures that privacy is preserved. We implemented the architecture on ownCloud, uploaded records to a blockchain network, stored records in a provenance database, and developed a prototype in the form of a web service.
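
    A minimal sketch of the anchoring idea, assuming SHA-256 digests stand in for the anchored blockchain transactions and a salted hash pseudonymizes the user (the paper's actual record format is not reproduced here):

    ```python
    import hashlib
    import json
    import time

    def hashed_user(user_id, salt):
        # Pseudonymize the operator so blockchain nodes cannot link
        # operations back to a particular user.
        return hashlib.sha256((salt + user_id).encode()).hexdigest()

    def provenance_record(user_id, salt, path, operation):
        record = {
            "user": hashed_user(user_id, salt),
            "path": path,
            "operation": operation,   # e.g. create / modify / delete
            "timestamp": time.time(),
        }
        # The digest is what would be anchored in a blockchain transaction;
        # re-hashing the stored record later proves it was not tampered with.
        digest = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        return record, digest

    record, anchor = provenance_record("alice", "per-deployment-salt",
                                       "/docs/report.txt", "modify")
    ```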

  4. Tracing catchment fine sediment sources using the new SIFT (SedIment Fingerprinting Tool) open source software.

    PubMed

    Pulley, S; Collins, A L

    2018-09-01

    The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods, rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine whether the approach can provide a reduction in uncertainty and an increase in precision. Five source group classifications were used: three formed using a k-means cluster analysis containing 2, 3 and 4 clusters, and two a priori groups based upon catchment geology. Three different composite fingerprints were used for each classification, and bi-plots, range tests, tracer variability ratios and virtual mixtures were used to test the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance. This new software, by integrating recent methodological developments in tracer data processing, guides users through the key steps. Critically, by applying multiple model configurations and uncertainty assessment, it delivers more robust solutions for informing catchment management of the sediment problem than many previously used approaches. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
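
    As a sketch of the k-means source-grouping step, the snippet below clusters hypothetical standardized tracer data into the 2-, 3- and 4-cluster configurations the paper describes (the tracer values are synthetic):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Hypothetical tracer concentrations (rows: source soil samples,
    # columns: geochemical tracers), standardized before clustering.
    rng = np.random.default_rng(3)
    tracers = np.vstack([
        rng.normal([10, 5, 1], 0.5, (20, 3)),  # e.g. topsoil-like signature
        rng.normal([4, 9, 2], 0.5, (20, 3)),   # e.g. channel-bank-like signature
        rng.normal([7, 2, 6], 0.5, (20, 3)),   # e.g. subsoil-like signature
    ])
    X = StandardScaler().fit_transform(tracers)

    for k in (2, 3, 4):  # the 2-, 3- and 4-cluster configurations
        groups = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
        print(k, np.bincount(groups))  # cluster sizes per configuration
    ```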

  5. Providing Global Change Information for Decision-Making: Capturing and Presenting Provenance

    NASA Technical Reports Server (NTRS)

    Ma, Xiaogang; Fox, Peter; Tilmes, Curt; Jacobs, Katherine; Waple, Anne

    2014-01-01

    Global change information demands access to data sources and well-documented provenance to provide evidence needed to build confidence in scientific conclusions and, in specific applications, to ensure the information's suitability for use in decision-making. A new generation of Web technology, the Semantic Web, provides tools for that purpose. The topic of global change covers changes in the global environment (including alterations in climate, land productivity, oceans or other water resources, atmospheric composition and or chemistry, and ecological systems) that may alter the capacity of the Earth to sustain life and support human systems. Data and findings associated with global change research are of great public, government, and academic concern and are used in policy and decision-making, which makes the provenance of global change information especially important. In addition, since different types of decisions benefit from different types of information, understanding how to capture and present the provenance of global change information is becoming more of an imperative in adaptive planning.

  6. Provenance Storage, Querying, and Visualization in PBase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kianmajd, Parisa; Ludascher, Bertram; Missier, Paolo

    2015-01-01

    We present PBase, a repository for scientific workflows and their corresponding provenance information that facilitates the sharing of experiments among the scientific community. PBase is interoperable since it uses ProvONE, a standard provenance model for scientific workflows. Workflows and traces are stored in RDF, and with the support of SPARQL and the tree cover encoding, the repository provides a scalable infrastructure for querying the provenance data. Furthermore, through its user interface, it is possible to: visualize workflows and execution traces; visualize reachability relations within these traces; issue SPARQL queries; and visualize query results.
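
    The repository's query layer can be illustrated with a generic example: a small PROV graph in RDF queried with SPARQL via rdflib (this is not PBase's actual schema, just the general pattern):

    ```python
    from rdflib import Graph

    turtle = """
    @prefix prov: <http://www.w3.org/ns/prov#> .
    @prefix ex:   <http://example.org/> .

    ex:trace1  a prov:Entity ;   prov:wasGeneratedBy ex:run1 .
    ex:run1    a prov:Activity ; prov:used ex:input1 .
    ex:input1  a prov:Entity .
    """

    g = Graph()
    g.parse(data=turtle, format="turtle")

    # Which inputs does each generated entity depend on?
    q = """
    PREFIX prov: <http://www.w3.org/ns/prov#>
    SELECT ?output ?input WHERE {
      ?output prov:wasGeneratedBy ?run .
      ?run    prov:used ?input .
    }
    """
    for output, inp in g.query(q):
        print(output, "<-", inp)
    ```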

  7. Proven Alternatives for Aboveground Treatment of Arsenic in Groundwater

    EPA Pesticide Factsheets

    This issue paper, developed for EPA's Engineering Forum, identifies and summarizes experiences with proven aboveground treatment alternatives for arsenic in groundwater, and provides information on their relative effectiveness and cost.

  8. Quantitative and Qualitative Analysis of Flavonoids and Phenolic Acids in Snow Chrysanthemum (Coreopsis tinctoria Nutt.) by HPLC-DAD and UPLC-ESI-QTOF-MS.

    PubMed

    Yang, Yinjun; Sun, Xinguang; Liu, Jinjun; Kang, Liping; Chen, Sibao; Ma, Baiping; Guo, Baolin

    2016-09-30

    A simple, accurate, and reliable high-performance liquid chromatography with photodiode array detection (HPLC-DAD) method was developed and successfully applied for the simultaneous quantitative analysis of eight compounds, including chlorogenic acid (1), (R/S)-flavanomarein (2), butin-7-O-β-d-glucopyranoside (3), isookanin (4), taxifolin (5), 5,7,3',5'-tetrahydroxyflavanone-7-O-β-d-glucopyranoside (6), marein (7) and okanin (8), in 23 batches of snow chrysanthemum of different seed provenances and from various habitats. The results showed that the total contents of the eight compounds in the samples with seed provenance from Keliyang (Xinjiang, China) are higher than in samples from the other five provenances by 52.47%, 15.53%, 19.78%, 21.17% and 5.06%, respectively, which demonstrates that provenance has a great influence on the constituents of snow chrysanthemum. Meanwhile, ultra-performance liquid chromatography coupled with electrospray ionization quadrupole time-of-flight mass spectrometry (UPLC-ESI-QTOF-MS) was also employed to rapidly separate and identify flavonoids and phenolic acids in snow chrysanthemum from Keliyang. As a result, a total of 30 constituents, including 26 flavonoids and four phenolic acids, were identified or tentatively identified based on exact mass information, fragmentation characteristics, and the retention times of eight reference standards. This work may provide an efficient approach for comprehensively evaluating the quality of snow chrysanthemum.

  9. Automated Generation of Technical Documentation and Provenance for Reproducible Research

    NASA Astrophysics Data System (ADS)

    Jolly, B.; Medyckyj-Scott, D.; Spiekermann, R.; Ausseil, A. G.

    2017-12-01

    Data provenance and detailed technical documentation are essential components of high-quality reproducible research, but they are often only partially addressed during a research project. Recording and maintaining this information during the course of a project can be difficult to get right, as it is a time-consuming and often tedious process for the researchers involved. As a result, provenance records and technical documentation provided alongside research results can be incomplete or may not be completely consistent with the actual processes followed. While providing access to the data and code used by the original researchers goes some way toward enabling reproducibility, this does not count as, or replace, data provenance. Additionally, it can be a poor substitute for good technical documentation and is often more difficult for a third party to understand, particularly if they do not understand the programming language(s) used. We present and discuss a tool built from the ground up for the production of well-documented and reproducible spatial datasets that are created by applying a series of classification rules to a number of input layers. The internal model of the classification rules that the tool requires to process the input data is exploited to also produce technical documentation and provenance records, with minimal additional user input. Available provenance records that accompany input datasets are incorporated into those that describe the current process. As a result, each time a new iteration of the analysis is performed, the documentation and provenance records are re-generated to provide an accurate description of the exact process followed. The generic nature of this tool, and the lessons learned during its creation, have wider application to other fields where the production of derivative datasets must be done in an open, defensible, and reproducible way.

  10. Data Provenance Hybridization Supporting Extreme-Scale Scientific WorkflowApplications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Elsethagen, Todd O.; Stephan, Eric G.; Raju, Bibi

    As high performance computing (HPC) infrastructures continue to grow in capability and complexity, so do the applications that they serve. HPC and distributed-area computing (DAC) (e.g., grid and cloud) users are looking increasingly toward workflow solutions to orchestrate their complex application coupling and pre- and post-processing needs. To gain insight and a more quantitative understanding of a workflow's performance, our method includes not only the capture of traditional provenance information, but also the capture and integration of system environment metrics, helping to give context and explanation for a workflow's execution. In this paper, we describe IPPD's provenance management solution (ProvEn) and its hybrid data store combining both of these data provenance perspectives.

  11. QEMSCAN+LA-ICP-MS: a 'big data' generator for sedimentary provenance analysis

    NASA Astrophysics Data System (ADS)

    Vermeesch, Pieter; Rittner, Martin; Garzanti, Eduardo

    2017-04-01

    Sedimentary provenance may be traced by 'fingerprinting' sediments with chemical, mineralogical or isotopic means. Normally, each of these provenance proxies is characterised on a separate aliquot of the same sample. For example, the chemical composition of the bulk sample may be analysed by X-ray fluorescence (XRF) on one aliquot, framework petrography on another, heavy mineral analysis on a density separate of a third split, and zircon U-Pb dating on a further density separate of the heavy mineral fraction. The labour intensity of this procedure holds back the widespread application of multi-method provenance studies. We here present a new method that solves this problem and avoids mineral separation by coupling a QEMSCAN electron microscope to an LA-ICP-MS instrument, thereby generating all four aforementioned provenance datasets as part of the same workflow. Given a polished hand specimen, a petrographic thin section, or a grain mount, the QEMSCAN+LA-ICP-MS method produces chemical and mineralogical maps from which the X-Y coordinates of the datable minerals are extracted. These coordinates are subsequently passed on to the laser ablation system for isotopic and, hence, geochronological analysis. In the process of finding all the zircons in a sediment grain mount, the QEMSCAN yields the chemical and mineralogical compositions as byproducts. We have applied the new QEMSCAN+LA-ICP-MS instrument suite to over 100 samples from three large sediment routing systems: (1) the Tigris-Euphrates river catchments and Rub' Al Khali desert in Arabia; (2) the Nile catchment in northeast Africa; and (3) desert and beach sands between the Orange and Congo rivers in southwest Africa. These studies reveal (1) that Rub' Al Khali sand is predominantly derived from the Arabian Shield and not from Mesopotamia; (2) that the Blue Nile is the principal source of Nile sand; and (3) that Orange River sand is carried northward by longshore drift nearly 1,800 km from South Africa to southern Angola. In addition to these geological findings, the first applications of QEMSCAN+LA-ICP-MS highlight some key advantages of the new workflow over traditional provenance analysis: (a) the new method not only increases sample throughput but also improves data quality by reducing significant biases associated with mineral separation and grain selection; (b) the three case studies highlight the importance of zircon 'fertility' for interpreting detrital zircon U-Pb datasets, and the ability of QEMSCAN to quantify this crucial parameter semi-automatically; and (c) QEMSCAN+LA-ICP-MS provides an opportunity to add textural information to detrital geochronology and, for example, quantify the possible grain-size dependence of U-Pb age distributions. Besides these advantages, the three case studies also reveal a number of limitations: (a) mineral identification by QEMSCAN is not as reliable as that commonly achieved by human observers; (b) heavy mineral compositions obtained by QEMSCAN cannot easily be compared with conventional point-counting data; and (c) apparent grain sizes can be greatly affected by polishing artefacts. In conclusion, QEMSCAN+LA-ICP-MS is a transformational new technique for provenance analysis, but it should be used with caution, in combination with conventional petrographic and heavy mineral techniques.
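
    The hand-off from mineral mapping to laser ablation can be sketched generically: locate target-phase pixels in a phase map and convert them to stage coordinates (the phase codes and calibration below are invented for illustration):

    ```python
    import numpy as np

    # Hypothetical QEMSCAN-style mineral map: integer phase codes on a pixel grid.
    ZIRCON = 7
    rng = np.random.default_rng(5)
    mineral_map = rng.integers(0, 10, size=(512, 512))

    # Pixel coordinates of every zircon hit, to be handed to the laser system
    # (a real workflow would also merge touching pixels into single grains).
    yx = np.argwhere(mineral_map == ZIRCON)

    # Convert (row, col) pixels to stage coordinates with an assumed
    # affine calibration: x = col * pixel size + origin, y likewise.
    pixel_size_um = 2.5
    stage_origin_um = np.array([10_000.0, 20_000.0])
    stage_targets_um = stage_origin_um + yx[:, ::-1] * pixel_size_um
    print(stage_targets_um[:3])
    ```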

  12. A unified framework for managing provenance information in translational research

    PubMed Central

    2011-01-01

    Background: A critical aspect of the NIH Translational Research roadmap, which seeks to accelerate the delivery of "bench-side" discoveries to the patient's "bedside," is the management of the provenance metadata that keeps track of the origin and history of data resources as they traverse the path from the bench to the bedside and back. A comprehensive provenance framework is essential for researchers to verify the quality of data, reproduce scientific results published in peer-reviewed literature, validate the scientific process, and associate trust values with data and results. Traditional approaches to provenance management have focused on only partial sections of the translational research life cycle, and they do not incorporate "domain semantics", which is essential to support domain-specific querying and analysis by scientists.

    Results: We identify a common set of challenges in managing provenance information across the pre-publication and post-publication phases of data in the translational research lifecycle. We define the semantic provenance framework (SPF), underpinned by the Provenir upper-level provenance ontology, to address these challenges in the four stages of provenance metadata: (a) provenance collection, during data generation; (b) provenance representation, to support interoperability and reasoning and to incorporate domain semantics; (c) provenance storage and propagation, to allow efficient storage and seamless propagation of provenance as the data is transferred across applications; and (d) provenance query, to support queries of increasing complexity over large data and to support knowledge discovery applications. We apply the SPF to two exemplar translational research projects, namely the Semantic Problem Solving Environment for Trypanosoma cruzi (T.cruzi SPSE) and the Biomedical Knowledge Repository (BKR) project, to demonstrate its effectiveness.

    Conclusions: The SPF provides a unified framework to effectively manage the provenance of translational research data during the pre- and post-publication phases. This framework is underpinned by an upper-level provenance ontology called Provenir, which is extended to create domain-specific provenance ontologies to facilitate provenance interoperability, seamless propagation of provenance, automated querying, and analysis. PMID:22126369

  13. K/Ar Dating of Fine Grained Sediments Near Prydz Bay, Antarctica: East Antarctic Ice Sheet Behavior During the Middle-Miocene Climate Transition

    NASA Astrophysics Data System (ADS)

    Duchesne, A. E.; Pierce, E. L.; Williams, T.; Hemming, S. R.; Johnson, D. L.; May, T.; Gombiner, J.; Torfstein, A.

    2012-12-01

    The Middle Miocene Climate Transition (MMCT) (~14 Ma) represents a time of major East Antarctic Ice Sheet (EAIS) expansion, with research suggesting a major global sea level fall on the order of ~60 meters (John et al., 2011, EPSL). Ocean Drilling Program (ODP) core data from Site 1165B near Prydz Bay show an influx of cobbles deposited ~13.8-13.5 Ma, representing a sudden burst of ice-rafted detritus (IRD) during the MMCT. Based on 40Ar/39Ar dating of hornblende and/or biotite grains, 5 of 6 dated pebbles from a companion study show Wilkes Land origins, indicating transport from over 1500 kilometers away. However, samples throughout this time interval have an anomalously low abundance of sand; we therefore seek to understand the sedimentary processes that led to the deposition of these isolated dropstones in a fine matrix through provenance studies of the core's terrigenous fine fraction. Geochemical provenance studies of the terrigenous fraction of marine sediments can aid in identifying past dynamic EAIS behavior; the few outcrops available on the continent provide specific rock characterizations and age constraints to which cored marine sediments can then be matched using established radiogenic isotope techniques. Here we apply the K/Ar dating method as a provenance tool for identifying the source area(s) of fine-grained terrigenous sediments (<63 μm) deposited during the MMCT. After source area characterization, we find that the fine-grained sediments from the mid-Miocene show a mixture of both local Prydz Bay sourcing (~400 Ma signature) and Wilkes Land provenance (~900 Ma signature). While locally derived Prydz Bay sediments are likely to have been delivered via meltwater from ice and deposited as hemipelagic sediments (with some possible bottom current modification, as this is a drift site), sediments sourced from Wilkes Land required transport via large icebergs. Future work will involve further provenance determination on both the fine-grained sediments and the abundant dropstones deposited at ODP Site 1165B during the MMCT. We anticipate that the use of the K/Ar radiometric dating technique as a proxy for the study of glacially transported fine-grained terrigenous materials will enable future Antarctic provenance research and further aid in providing insight into past EAIS behavior.

  14. Understanding and Managing Causality of Change in Socio-Technical Systems 3

    DTIC Science & Technology

    2012-01-06

    influence, and (4) management and control. The questions are listed below. Dynamics and Context: What can be learned from patterns of causal... has proven to be an insufficient method to determine the existence of behavioral and performance patterns. Cognitive work analysis, on the other hand... to provide a point of comparison, including Victorian bushfires, Queensland and Victorian floods, and the mine collapse in Chile. Privacy, Threats

  15. Improved numerical solutions for chaotic-cancer-model

    NASA Astrophysics Data System (ADS)

    Yasir, Muhammad; Ahmad, Salman; Ahmed, Faizan; Aqeel, Muhammad; Akbar, Muhammad Zubair

    2017-01-01

    In the biological sciences, the dynamical system of the cancer model is well known for its sensitivity and chaotic behavior. The present work provides a detailed computational study of the cancer model that counterbalances its sensitive dependence on initial conditions and parameter values. The chaotic cancer model is discretized into a system of nonlinear equations that are solved using the well-known Successive Over-Relaxation (SOR) method, which has proven convergence. This technique makes it possible to solve large systems and provides a more accurate approximation, which is illustrated through tables, time-history maps, and phase portraits with detailed analysis.
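
    As a reference for the solver named in the abstract, a minimal SOR iteration for a linear system looks like this (the cancer model's actual discretized equations are nonlinear and are not reproduced here):

    ```python
    import numpy as np

    def sor(A, b, omega=1.25, tol=1e-10, max_iter=10_000):
        """Successive Over-Relaxation for A x = b (A with nonzero diagonal).

        Converges for symmetric positive definite A when 0 < omega < 2.
        """
        n = len(b)
        x = np.zeros(n)
        for _ in range(max_iter):
            x_old = x.copy()
            for i in range(n):
                # Use already-updated entries (j < i) and old entries (j > i).
                sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
                x[i] = (1 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
            if np.linalg.norm(x - x_old, np.inf) < tol:
                break
        return x

    # Toy diagonally dominant system; compare with the direct solve.
    A = np.array([[4.0, -1.0, 0.0], [-1.0, 4.0, -1.0], [0.0, -1.0, 4.0]])
    b = np.array([15.0, 10.0, 10.0])
    print(sor(A, b), np.linalg.solve(A, b))
    ```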

  16. Monitoring surface water quality using social media in the context of citizen science

    NASA Astrophysics Data System (ADS)

    Zheng, Hang; Hong, Yang; Long, Di; Jing, Hua

    2017-02-01

    Surface water quality monitoring (SWQM) provides essential information for water environmental protection. However, SWQM is costly and limited in terms of equipment and sites. The global popularity of social media and intelligent mobile devices with GPS and photography functions allows citizens to monitor surface water quality. This study aims to propose a method for SWQM using social media platforms. Specifically, a WeChat-based application platform is built to collect water quality reports from volunteers, which have been proven valuable for water quality monitoring. The methods for data screening and volunteer recruitment are discussed based on the collected reports. The proposed methods provide a framework for collecting water quality data from citizens and offer a primary foundation for big data analysis in future research.

  17. PAV ontology: provenance, authoring and versioning

    PubMed Central

    2013-01-01

    Background: Provenance is a critical ingredient for establishing trust in published scientific content. This is true whether we are considering a data set, a computational workflow, a peer-reviewed publication or a simple scientific claim with supportive evidence. Existing vocabularies such as Dublin Core Terms (DC Terms) and the W3C Provenance Ontology (PROV-O) are domain-independent and general-purpose, and they allow and encourage extensions to cover more specific needs. In particular, to track authoring and versioning information of web resources, PROV-O provides a basic methodology but not any specific classes and properties for identifying or distinguishing between the various roles assumed by agents manipulating digital artifacts, such as author, contributor and curator.

    Results: We present the Provenance, Authoring and Versioning ontology (PAV, namespace http://purl.org/pav/): a lightweight ontology for capturing “just enough” descriptions essential for tracking the provenance, authoring and versioning of web resources. We argue that such descriptions are essential for digital scientific content. PAV distinguishes between contributors, authors and curators of content and creators of representations, in addition to the provenance of originating resources that have been accessed, transformed and consumed. We explore five projects (and communities) that have adopted PAV, illustrating their usage through concrete examples. Moreover, we present mappings that show how PAV extends the W3C PROV-O ontology to support broader interoperability.

    Method: The initial design of the PAV ontology was driven by requirements from the AlzSWAN project, with further requirements incorporated later from other projects detailed in this paper. The authors strived to keep PAV lightweight and compact by including only those terms that have been demonstrated to be pragmatically useful in existing applications, and by recommending terms from existing ontologies when plausible.

    Discussion: We analyze and compare PAV with related approaches, namely the Provenance Vocabulary (PRV), DC Terms and BIBFRAME. We identify similarities and analyze differences between those vocabularies and PAV, outlining strengths and weaknesses of our proposed model. We specify SKOS mappings that align PAV with DC Terms. We conclude the paper with general remarks on the applicability of PAV. PMID:24267948
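
    A small example of PAV in use, built with rdflib against the published namespace (the resources and dates below are invented):

    ```python
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import XSD

    PAV = Namespace("http://purl.org/pav/")
    EX = Namespace("http://example.org/")

    g = Graph()
    g.bind("pav", PAV)

    # Describe who authored and curated a versioned dataset, and its lineage.
    dataset = EX["dataset/v2"]
    g.add((dataset, PAV.authoredBy, EX["people/alice"]))
    g.add((dataset, PAV.curatedBy, EX["people/bob"]))
    g.add((dataset, PAV.derivedFrom, EX["dataset/v1"]))
    g.add((dataset, PAV.version, Literal("2.0")))
    g.add((dataset, PAV.createdOn,
           Literal("2013-11-01T12:00:00Z", datatype=XSD.dateTime)))

    print(g.serialize(format="turtle"))
    ```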

  18. Antimicrobial-Coated Granules for Disinfecting Water

    NASA Technical Reports Server (NTRS)

    Akse, James R.; Holtsnider, John T.; Kliestik, Helen

    2011-01-01

    Methods of preparing antimicrobial-coated granules for disinfecting flowing potable water have been developed. Like the methods reported in the immediately preceding article, these methods involve chemical preparation of substrate surfaces (in this case, the surfaces of granules) to enable attachment of antimicrobial molecules to the surfaces via covalent bonds. A variety of granular materials have been coated with a variety of antimicrobial agents that include antibiotics, bacteriocins, enzymes, bactericides, and fungicides. When employed in packed beds in flowing water, these antimicrobial-coated granules have been proven effective against gram-positive bacteria, gram-negative bacteria, fungi, and viruses. Composite beds, consisting of multiple layers containing different granular antimicrobial media, have proven particularly effective against a broad spectrum of microorganisms. These media have also proven effective in enhancing or potentiating the biocidal effects of in-line iodinated resins and of very low levels of dissolved elemental iodine.

  19. Sulfur isotope analysis of cinnabar from Roman wall paintings by elemental analysis/isotope ratio mass spectrometry--tracking the origin of archaeological red pigments and their authenticity.

    PubMed

    Spangenberg, Jorge E; Lavric, Jost V; Meisser, Nicolas; Serneels, Vincent

    2010-10-15

    The most valuable pigment of the Roman wall paintings was the red color obtained from powdered cinnabar (Minium Cinnabaris pigment), the red mercury sulfide (HgS), which was brought from mercury (Hg) deposits in the Roman Empire. To address the question of whether sulfur isotope signatures can serve as a rapid method to establish the provenance of the red pigment in Roman frescoes, we have measured the sulfur isotope composition (δ(34)S value in ‰ VCDT) in samples of wall painting from the Roman city Aventicum (Avenches, Vaud, Switzerland) and compared them with values from cinnabar from European mercury deposits (Almadén in Spain, Idria in Slovenia, Monte Amiata in Italy, Moschellandsberg in Germany, and Genepy in France). Our study shows that the δ(34)S values of cinnabar from the studied Roman wall paintings fall within or near the composition of Almadén cinnabar; thus, the provenance of the raw material may be deduced. This approach may provide information on provenance and authenticity in archaeological, restoration and forensic studies of Roman and Greek frescoes. Copyright © 2010 John Wiley & Sons, Ltd.

  20. Securing Provenance of Distributed Processes in an Untrusted Environment

    NASA Astrophysics Data System (ADS)

    Syalim, Amril; Nishide, Takashi; Sakurai, Kouichi

    Recently, there has been much concern about the provenance of distributed processes, that is, the documentation of the origin of an object in a distributed system and the processes that produced it. Provenance has many applications in the forms of medical records, documentation of processes in computer systems, recording the origin of data in the cloud, and documentation of human-executed processes. The provenance of distributed processes can be modeled by a directed acyclic graph (DAG) where each node represents an entity, and an edge represents the origin and causal relationship between entities. Without sufficient security mechanisms, the provenance graph suffers from integrity and confidentiality problems, for example changes or deletions of correct nodes, additions of fake nodes and edges, and unauthorized access to sensitive nodes and edges. In this paper, we propose an integrity mechanism for the provenance graph using digital signatures involving three parties: the process executors, who are responsible for creating the nodes; a provenance owner, who records the nodes in the provenance store; and a trusted party, which we call the Trusted Counter Server (TCS), which records the number of nodes stored by the provenance owner. We show that the mechanism can detect integrity problems in the provenance graph, namely unauthorized and malicious “authorized” updates, even if all the parties, except the TCS, collude to update the provenance. In this scheme, the TCS needs only minimal storage (linear in the number of provenance owners). To protect confidentiality and to allow efficient access control administration, we propose a method to encrypt the provenance graph that allows access by paths and compartments in the provenance graph. We argue that encryption is important as a mechanism to protect provenance data stored in an untrusted environment. We analyze the security of the integrity mechanism, and perform experiments to measure the performance of both mechanisms.
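
    The following is a minimal sketch of the chaining idea behind such integrity mechanisms, not the authors' exact protocol: each node's identifier binds its content to its parent edges, signatures (HMACs here stand in for the paper's digital signatures) cover the identifiers, and a trusted counter attests to the number of stored nodes so that deletions become detectable.

      import hashlib, hmac

      EXECUTOR_KEY = b"executor-secret"   # stand-in for an executor's signing key
      TCS_KEY = b"tcs-secret"             # stand-in for the counter server's key

      def node_digest(content: bytes, parent_ids: list) -> str:
          # The digest binds the node content to its incoming edges, so
          # tampering with either content or graph structure is detectable.
          h = hashlib.sha256(content)
          for pid in sorted(parent_ids):
              h.update(pid.encode())
          return h.hexdigest()

      def sign_node(content, parent_ids):
          d = node_digest(content, parent_ids)
          sig = hmac.new(EXECUTOR_KEY, d.encode(), hashlib.sha256).hexdigest()
          return {"id": d, "parents": parent_ids, "sig": sig}

      # The trusted counter attests to how many nodes have been stored,
      # so silently dropping a node changes the attested count.
      store = []
      def tcs_receipt():
          return hmac.new(TCS_KEY, str(len(store)).encode(),
                          hashlib.sha256).hexdigest()

      a = sign_node(b"raw data", []);        store.append(a)
      b = sign_node(b"derived", [a["id"]]);  store.append(b)
      print(len(store), tcs_receipt())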

  1. Component-specific modeling

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.

    1985-01-01

    A series of interdisciplinary modeling and analysis techniques that were specialized to address three specific hot section components are presented. These techniques will incorporate data as well as theoretical methods from many diverse areas, including cycle and performance analysis, heat transfer analysis, linear and nonlinear stress analysis, and mission analysis. Building on the proven techniques already available in these fields, the new methods developed will be integrated into computer codes to provide an accurate and unified approach to analyzing combustor burner liners, hollow air-cooled turbine blades, and air-cooled turbine vanes. For these components, the methods developed will predict temperature, deformation, stress and strain histories throughout a complete flight mission.

  2. Trace analysis of energetic materials via direct analyte-probed nanoextraction coupled to direct analysis in real time mass spectrometry.

    PubMed

    Clemons, Kristina; Dake, Jeffrey; Sisco, Edward; Verbeck, Guido F

    2013-09-10

    Direct analysis in real time mass spectrometry (DART-MS) has proven to be a useful forensic tool for the trace analysis of energetic materials. While other techniques for detecting trace amounts of explosives involve extraction, derivatization, solvent exchange, or sample clean-up, DART-MS requires none of these. Typical DART-MS analyses directly from a solid sample or from a swab have been quite successful; however, these methods may not always be an optimal sampling technique in a forensic setting. For example, if the sample were only located in an area which included a latent fingerprint of interest, direct DART-MS analysis or the use of a swab would almost certainly destroy the print. To avoid ruining such potentially invaluable evidence, another method has been developed which will leave the fingerprint virtually untouched. Direct analyte-probed nanoextraction coupled to nanospray ionization-mass spectrometry (DAPNe-NSI-MS) has demonstrated excellent sensitivity and repeatability in forensic analyses of trace amounts of illicit drugs from various types of surfaces. This technique employs a nanomanipulator in conjunction with bright-field microscopy to extract single particles from a surface of interest and has provided a limit of detection of 300 attograms for caffeine. Combining DAPNe with DART-MS provides another level of flexibility in forensic analysis, and has proven to be a sufficient detection method for trinitrotoluene (TNT), RDX, and 1-methylaminoanthraquinone (MAAQ). Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  3. The Ontario Cancer Research Ethics Board: a central REB that works

    PubMed Central

    Chaddah, M.R.

    2008-01-01

    The Ontario Cancer Research Ethics Board (ocreb) has made its mark within Ontario as a successful, centralized, oncology-specific research ethics board. As such, ocreb has proven invaluable to principal investigators, sponsors, and study participants given its ability to reduce duplication during the submission process, to provide the highest quality of review, to shorten study start-up time, and to implement more efficient methods of reporting serious adverse events. PMID:18317585

  4. Single-grid spectral collocation for the Navier-Stokes equations

    NASA Technical Reports Server (NTRS)

    Bernardi, Christine; Canuto, Claudio; Maday, Yvon; Metivet, Brigitte

    1988-01-01

    The aim of the paper is to study a collocation spectral method to approximate the Navier-Stokes equations: only one grid is used, which is built from the nodes of a Gauss-Lobatto quadrature formula, either of Legendre or of Chebyshev type. The convergence is proven for the Stokes problem with inhomogeneous Dirichlet boundary conditions, then thoroughly analyzed for the Navier-Stokes equations. The practical implementation algorithm is presented, together with numerical results.

  5. Methamphetamine Vaccines: Improvement through Hapten Design.

    PubMed

    Collins, Karen C; Schlosburg, Joel E; Bremer, Paul T; Janda, Kim D

    2016-04-28

    Methamphetamine (MA) addiction is a serious public health problem, and current methods to abate addiction and relapse are ineffective at mitigating this growing global epidemic. Development of a vaccine targeting MA would provide a complementary strategy to existing behavioral therapies, but this has proven challenging. Herein, we describe optimization of both hapten design and formulation, identifying a vaccine that elicited a robust anti-MA immune response in mice, decreasing methamphetamine-induced locomotor activity.

  6. 40 CFR 63.90 - Program overview.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... calibration gases or test cells; (4) Use of an analytical technology that differs from that specified by a... “proven technology” (generally accepted by the scientific community as equivalent or better) that is... enforceable test method involving “proven technology” (generally accepted by the scientific community as...

  7. 40 CFR 63.90 - Program overview.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... calibration gases or test cells; (4) Use of an analytical technology that differs from that specified by a... “proven technology” (generally accepted by the scientific community as equivalent or better) that is... enforceable test method involving “proven technology” (generally accepted by the scientific community as...

  8. 40 CFR 63.90 - Program overview.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... calibration gases or test cells; (4) Use of an analytical technology that differs from that specified by a... “proven technology” (generally accepted by the scientific community as equivalent or better) that is... enforceable test method involving “proven technology” (generally accepted by the scientific community as...

  9. Provenance Usage in the OceanLink Project

    NASA Astrophysics Data System (ADS)

    Narock, T.; Arko, R. A.; Carbotte, S. M.; Chandler, C. L.; Cheatham, M.; Fils, D.; Finin, T.; Hitzler, P.; Janowicz, K.; Jones, M.; Krisnadhi, A.; Lehnert, K. A.; Mickle, A.; Raymond, L. M.; Schildhauer, M.; Shepherd, A.; Wiebe, P. H.

    2014-12-01

    A wide spectrum of maturing methods and tools, collectively characterized as the Semantic Web, is helping to vastly improve the dissemination of scientific research. The OceanLink project, an NSF EarthCube Building Block, is utilizing semantic technologies to integrate geoscience data repositories, library holdings, conference abstracts, and funded research awards. Provenance is a vital component in meeting both the scientific and engineering requirements of OceanLink. Provenance plays a key role in justification and understanding when presenting users with results aggregated from multiple sources. In the engineering sense, provenance enables the identification of new data and the ability to determine which data sources to query. Additionally, OceanLink will leverage human and machine computation for crowdsourcing, text mining, and co-reference resolution. The results of these computations, and their associated provenance, will be folded back into the constituent systems to continually enhance precision and utility. We will touch on the various roles provenance is playing in OceanLink as well as present our use of the PROV Ontology and associated Ontology Design Patterns.

  10. Archaeology Through Computational Linguistics: Inscription Statistics Predict Excavation Sites of Indus Valley Artifacts.

    PubMed

    Recchia, Gabriel L; Louwerse, Max M

    2016-11-01

    Computational techniques comparing co-occurrences of city names in texts allow the relative longitudes and latitudes of cities to be estimated algorithmically. However, these techniques have not been applied to estimate the provenance of artifacts with unknown origins. Here, we estimate the geographic origin of artifacts from the Indus Valley Civilization, applying methods commonly used in cognitive science to the Indus script. We show that these methods can accurately predict the relative locations of archeological sites on the basis of artifacts of known provenance, and we further apply these techniques to determine the most probable excavation sites of four sealings of unknown provenance. These findings suggest that inscription statistics reflect historical interactions among locations in the Indus Valley region, and they illustrate how computational methods can help localize inscribed archeological artifacts of unknown origin. The success of this method offers opportunities for the cognitive sciences in general and for computational anthropology specifically. Copyright © 2015 Cognitive Science Society, Inc.
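
    A minimal sketch of the general approach, under the assumption that co-occurrence statistics are converted to dissimilarities and embedded with multidimensional scaling; the place names and counts below are toy values, not the authors' data or exact pipeline.

      import numpy as np
      from sklearn.manifold import MDS

      # Toy co-occurrence counts between four place names in a corpus
      # (hypothetical numbers; higher = mentioned together more often).
      names = ["Harappa", "Mohenjo-daro", "Dholavira", "Lothal"]
      C = np.array([[0, 9, 3, 2],
                    [9, 0, 4, 3],
                    [3, 4, 0, 8],
                    [2, 3, 8, 0]], dtype=float)

      # Convert similarity to dissimilarity: frequent co-mention ~ nearby.
      D = 1.0 / (1.0 + C)
      np.fill_diagonal(D, 0.0)

      # Embed in two dimensions to recover relative coordinates.
      mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
      coords = mds.fit_transform(D)
      for name, (x, y) in zip(names, coords):
          print(f"{name}: ({x:+.2f}, {y:+.2f})")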

  11. A comprehensive screen for volatile organic compounds in biological fluids.

    PubMed

    Sharp, M E

    2001-10-01

    A headspace gas chromatographic (GC) screen for common volatile organic compounds in biological fluids is reported. Common GC phases, DB-1 and DB-WAX, with split injection provide separation and identification of more than 40 compounds in a single 20-min run. In addition, this method easily accommodates quantitation. The screen detects commonly encountered volatile compounds at levels below 4 mg%. A control mixture, providing qualitative and semiquantitative information, is described. For comparison, elution of the volatiles on a specialty phase, DB-624, is reported. This method is an expansion and modification of a screen that had been used for more than 20 years. During its first year of use, the expanded screen has proven to be advantageous in routine forensic casework.

  12. The visible touch: in planta visualization of protein-protein interactions by fluorophore-based methods

    PubMed Central

    Bhat, Riyaz A; Lahaye, Thomas; Panstruga, Ralph

    2006-01-01

    Non-invasive fluorophore-based protein interaction assays like fluorescence resonance energy transfer (FRET) and bimolecular fluorescence complementation (BiFC, also referred to as "split YFP") have proven to be invaluable tools for studying protein-protein interactions in living cells. Both methods are now frequently used in the plant sciences and are likely to develop into standard techniques for the identification, verification and in-depth analysis of polypeptide interactions. In this review, we address the individual strengths and weaknesses of both approaches and provide an outlook on new directions and possible future developments for both techniques. PMID:16800872

  13. Combination film/splash fill for overcoming film fouling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phelps, P.M.; Minett, T.O.

    1995-02-01

    In summary, this large cooling tower user has found that the Phelps film/splash Stack-Pack fill design attains a substantial improvement in the capability of their existing crossflow cooling towers, without increasing fan power or tower size. The lack of fouling in the film-fill component of this design is due to the use of film fill with large (1 inch) spacing between sheets, coupled with effective water treatment as provided by Nalco. This combination of factors provides a proven method for significantly increasing crossflow or counterflow cooling tower capability while minimizing the chance of serious fill fouling.

  14. How to Get Information on Several Proven Programs for Accelerating the Progress of Low-Achieving Children (Literacy for All Children).

    ERIC Educational Resources Information Center

    Allington, Richard L.

    1992-01-01

    Offers summaries of three proven programs (Reading Recovery, Success for All, and Accelerated Schools) for accelerating the reading and writing progress of low-achieving, low-income children. Provides addresses for more information. (SR)

  15. How do you assign persistent identifiers to extracts from large, complex, dynamic data sets that underpin scholarly publications?

    NASA Astrophysics Data System (ADS)

    Wyborn, Lesley; Car, Nicholas; Evans, Benjamin; Klump, Jens

    2016-04-01

    Persistent identifiers in the form of a Digital Object Identifier (DOI) are becoming more mainstream, assigned at both the collection and dataset level. For static datasets, this is relatively straightforward. However, many new data collections are dynamic, with new data being appended, models and derivative products being revised with new data, or the data itself revised as processing methods are improved. Further, because data collections are becoming accessible as services, researchers can log in and dynamically create user-defined subsets for specific research projects; they can also easily mix and match data from multiple collections, each of which can have a complex history. Inevitably, extracts from such dynamic data sets underpin scholarly publications, and this presents new challenges. The National Computational Infrastructure (NCI) has been experiencing these issues and making progress towards addressing them. The NCI is a large node of the Research Data Services (RDS) initiative of the Australian Government's research infrastructure, which currently makes available over 10 PBytes of priority research collections, ranging from geosciences, geophysics, environment, and climate, through to astronomy, bioinformatics, and social sciences. Data are replicated to, or are produced at, NCI and then processed there to higher-level data products or directly analysed. Individual datasets range from multi-petabyte computational models and large-volume raster arrays, down to gigabyte-size, ultra-high-resolution datasets. To facilitate access, maximise reuse and enable integration across the disciplines, datasets have been organized on a platform called the National Environmental Research Data Interoperability Platform (NERDIP). Combined, the NERDIP data collections form a rich and diverse asset for researchers: their co-location and standardization optimises the value of existing data and forms a new resource to underpin data-intensive science. New publication procedures require that a persistent identifier (DOI) be provided for the dataset that underpins a publication. Producing these for data extracts from the NCI data node using only DOIs is proving difficult: preserving a copy of each data extract is not possible because of data scale. One proposal is for researchers to use workflows that capture the provenance of each data extraction, including metadata (e.g., the version of the dataset used, the query and the time of extraction). In parallel, NCI is now working with the NERDIP dataset providers to ensure that the provenance of data publication is also captured in provenance systems, including references to previous versions and a history of data appended or modified. This proposed solution would require an enhancement to scholarly publication procedures whereby the reference to the dataset underlying a publication would be the persistent identifier of the provenance workflow that created the data extract. In turn, the provenance workflow would itself link to a series of persistent identifiers that, at a minimum, provide complete dataset production transparency and, if required, would facilitate reconstruction of the dataset. Such a solution will require strict adherence to design patterns for provenance representation, to ensure that the provenance representation of the workflow does indeed contain the information required to deliver dataset generation transparency and a pathway to reconstruction.
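
    A minimal sketch of what capturing such extraction provenance could look like, using the Python prov package (an implementation of W3C PROV); the dataset, query, and identifiers are hypothetical, and this does not reproduce NCI's actual provenance system.

      from prov.model import ProvDocument
      import datetime

      doc = ProvDocument()
      doc.add_namespace("ex", "http://example.org/")   # hypothetical namespace

      # The dynamic collection (with its version) and the extract made from it.
      dataset = doc.entity("ex:climate-collection", {"ex:version": "4.2"})
      extract = doc.entity("ex:extract-2016-001")

      # The extraction activity records the query and the time of extraction.
      activity = doc.activity(
          "ex:subset-query",
          datetime.datetime(2016, 4, 1, 12, 0),
          other_attributes={"ex:query": "time>=2000 AND region=AUS"},
      )

      doc.used(activity, dataset)
      doc.wasGeneratedBy(extract, activity)
      doc.wasDerivedFrom(extract, dataset)

      print(doc.get_provn())   # PROV-N serialization of the extraction record

    A persistent identifier minted for this provenance record, rather than for the (unpreserved) extract itself, is the kind of citation target the abstract proposes.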

  16. Detection of a Serum Siderophore by LC-MS/MS as a Potential Biomarker of Invasive Aspergillosis

    PubMed Central

    Carroll, Cassandra S.; Amankwa, Lawrence N.; Pinto, Linda J.; Fuller, Jeffrey D.; Moore, Margo M.

    2016-01-01

    Invasive aspergillosis (IA) is a life-threatening systemic mycosis caused primarily by Aspergillus fumigatus. Early diagnosis of IA is based, in part, on an immunoassay for circulating fungal cell wall carbohydrate, galactomannan (GM). However, a wide range of sensitivity and specificity rates have been reported for the GM test across various patient populations. To obtain iron in vivo, A. fumigatus secretes the siderophore, N,N',N"-triacetylfusarinine C (TAFC) and we hypothesize that TAFC may represent a possible biomarker for early detection of IA. We developed an ultra performance liquid chromatography tandem mass spectrometry (UPLC-MS/MS) method for TAFC analysis from serum, and measured TAFC in serum samples collected from patients at risk for IA. The method showed lower and upper limits of quantitation (LOQ) of 5 ng/ml and 750 ng/ml, respectively, and complete TAFC recovery from spiked serum. As proof of concept, we evaluated 76 serum samples from 58 patients with suspected IA that were investigated for the presence of GM. Fourteen serum samples obtained from 11 patients diagnosed with probable or proven IA were also analyzed for the presence of TAFC. Control sera (n = 16) were analyzed to establish a TAFC cut-off value (≥6 ng/ml). Of the 36 GM-positive samples (≥0.5 GM index) from suspected IA patients, TAFC was considered positive in 25 (69%). TAFC was also found in 28 additional GM-negative samples. TAFC was detected in 4 of the 14 samples (28%) from patients with proven/probable aspergillosis. Log-transformed TAFC and GM values from patients with proven/probable IA, healthy individuals and SLE patients showed a significant correlation with a Pearson r value of 0.77. In summary, we have developed a method for the detection of TAFC in serum that revealed this fungal product in the sera of patients at risk for invasive aspergillosis. A prospective study is warranted to determine whether this method provides improved early detection of IA. PMID:26974544

  17. Identification of provenance rocks based on EPMA analyses of heavy minerals

    NASA Astrophysics Data System (ADS)

    Shimizu, M.; Sano, N.; Ueki, T.; Yonaga, Y.; Yasue, K. I.; Masakazu, N.

    2017-12-01

    Information on mountain building is significant in the field of geological disposal of high-level radioactive waste, because it affects the long-term stability of the groundwater flow system. Provenance analysis is one of the effective approaches for understanding the building processes of mountains. The chemical compositions of heavy minerals, as well as their chronological data, can serve as an index for the identification of provenance rocks. Accurate identification requires the measurement of as many grains as possible. In order to achieve an efficient provenance analysis, we developed a method for quick identification of heavy minerals using an Electron Probe Micro Analyzer (EPMA). In this method, heavy mineral grains extracted from a sample were aligned on a glass slide and mounted in resin. The concentrations of 28 elements were measured for 300-500 grains per sample using EPMA. To measure as many grains as possible, we prioritized swiftness of measurement over precision, configuring a measurement time of about 3.5 minutes per grain. Identification of heavy minerals was based on their chemical composition. We developed a Microsoft® Excel® spreadsheet that applies mineral-identification criteria based on the typical range of chemical compositions of each mineral. Grains with totals of <80 wt.% or >110 wt.% were rejected. The criteria for mineral identification were revised through comparison between mineral identification by optical microscopy and the chemical compositions of grains classified as "unknown minerals". Provenance rocks can then be identified based on the abundance ratios of the identified minerals. If no significant difference in the abundance ratios was found among source rocks, the chemical composition of specific minerals was used as another index. This method was applied to the sediments of some regions in Japan where the provenance rocks had lithological variations but similar formation ages. Consequently, the provenance rocks were identified based on the chemical compositions of heavy minerals resistant to weathering, such as zircon and ilmenite. This study was carried out under a contract with the Ministry of Economy, Trade and Industry of Japan as part of its R&D supporting program for developing geological disposal technology.
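
    A minimal sketch of the rule-based identification step: the 80-110 wt.% total rejection rule comes from the abstract, while the oxide windows below are illustrative values, not the authors' tuned criteria.

      # Illustrative composition windows (wt.%) -- not the authors' actual
      # criteria, which were revised against optical identification.
      CRITERIA = {
          "zircon":   {"ZrO2": (55, 70), "SiO2": (28, 38)},
          "ilmenite": {"TiO2": (45, 55), "FeO": (40, 50)},
      }

      def identify(grain: dict) -> str:
          total = sum(grain.values())
          if total < 80 or total > 110:          # rejection rule from the abstract
              return "rejected"
          for mineral, windows in CRITERIA.items():
              if all(lo <= grain.get(oxide, 0.0) <= hi
                     for oxide, (lo, hi) in windows.items()):
                  return mineral
          return "unknown"

      print(identify({"ZrO2": 64.1, "SiO2": 32.5, "HfO2": 1.3}))  # -> zircon
      print(identify({"TiO2": 48.9, "FeO": 44.7, "MgO": 2.1}))    # -> ilmenite
      print(identify({"SiO2": 40.0}))                             # -> rejected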

  18. Lightweight Provenance Service for High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Dong; Chen, Yong; Carns, Philip

    Provenance describes detailed information about the history of a piece of data, containing the relationships among elements such as users, processes, jobs, and workflows that contribute to the existence of data. Provenance is key to supporting many data management functionalities that are increasingly important in operations such as identifying data sources, parameters, or assumptions behind a given result; auditing data usage; or understanding details about how inputs are transformed into outputs. Despite its importance, however, provenance support is largely underdeveloped in highly parallel architectures and systems. One major challenge is the demanding requirements of providing provenance service in situ. The need to remain lightweight and to be always on often conflicts with the need to be transparent and offer an accurate catalog of details regarding the applications and systems. To tackle this challenge, we introduce a lightweight provenance service, called LPS, for high-performance computing (HPC) systems. LPS leverages a kernel instrument mechanism to achieve transparency and introduces representative execution and flexible granularity to capture comprehensive provenance with controllable overhead. Extensive evaluations and use cases have confirmed its efficiency and usability. We believe that LPS can be integrated into current and future HPC systems to support a variety of data management needs.

  19. Preparing All Teachers to Use Proven, Effective Instructional Methods across the Curriculum.

    ERIC Educational Resources Information Center

    Southern Regional Education Board (SREB), 2012

    2012-01-01

    Research has shown that certain ways of teaching can make a difference in whether students learn standards-based content. Many strategies have proven to be effective in teaching literacy, mathematics, science and social studies. These strategies have facilitated blending academic and career/technical subjects to make learning more meaningful for…

  20. Optical hair removal.

    PubMed

    Ort, R J; Anderson, R R

    1999-06-01

    Traditional methods of hair removal have proven unsatisfactory for many individuals with excessive or unwanted hair. In the last few years, several lasers and xenon flashlamps have been developed that promise to fulfill the need for a practical, safe, and long-lasting method of hair removal. Aggressive marketing of these devices has contributed to their popularity among patients and physicians. However, significant controversy and confusion surround this field. This article provides a detailed explanation of the scientific underpinnings of optical hair removal and explores the advantages and disadvantages of the various devices currently available (Nd:YAG, ruby, alexandrite, and diode lasers, and the xenon flashlamp). Treatment and safety guidelines are provided to assist the practitioner in the use of these devices. Although the field of optical hair removal is still in its infancy, initial reports of long-term efficacy are encouraging.

  1. A Web-based interface to calculate phonotactic probability for words and nonwords in Modern Standard Arabic.

    PubMed

    Aljasser, Faisal; Vitevitch, Michael S

    2018-02-01

    A number of databases (Storkel, Behavior Research Methods, 45, 1159-1167, 2013) and online calculators (Vitevitch & Luce, Behavior Research Methods, Instruments, and Computers, 36, 481-487, 2004) have been developed to provide statistical information about various aspects of language, and these have proven to be invaluable assets to researchers, clinicians, and instructors in the language sciences. The number of such resources for English is quite large and continues to grow, whereas the number of such resources for other languages is much smaller. This article describes the development of a Web-based interface to calculate phonotactic probability in Modern Standard Arabic (MSA). A full description of how the calculator can be used is provided. It can be freely accessed at http://phonotactic.drupal.ku.edu/.

  2. Image reconstruction for PET/CT scanners: past achievements and future challenges

    PubMed Central

    Tong, Shan; Alessio, Adam M; Kinahan, Paul E

    2011-01-01

    PET is a medical imaging modality with proven clinical value for disease diagnosis and treatment monitoring. The integration of PET and CT on modern scanners provides a synergy of the two imaging modalities. Through different mathematical algorithms, PET data can be reconstructed into the spatial distribution of the injected radiotracer. With dynamic imaging, kinetic parameters of specific biological processes can also be determined. Numerous efforts have been devoted to the development of PET image reconstruction methods over the last four decades, encompassing analytic and iterative reconstruction methods. This article provides an overview of the commonly used methods. Current challenges in PET image reconstruction include more accurate quantitation, TOF imaging, system modeling, motion correction and dynamic reconstruction. Advances in these aspects could enhance the use of PET/CT imaging in patient care and in clinical research studies of pathophysiology and therapeutic interventions. PMID:21339831

  3. Double exposure using 193nm negative tone photoresist

    NASA Astrophysics Data System (ADS)

    Kim, Ryoung-han; Wallow, Tom; Kye, Jongwook; Levinson, Harry J.; White, Dave

    2007-03-01

    Double exposure is one of the promising methods for extending lithographic patterning into the low-k1 regime. In this paper, we demonstrate double patterning at an effective k1 of 0.25, with an improved process window, using a negative resist. A negative resist (TOK N-series) in combination with a bright-field mask is proven to provide a large process window in generating 1:3 trench:line resist features. By incorporating two etch transfer steps into the hard mask material, frequency-doubled patterns could be obtained.

  4. Optically Based Rapid Screening Method for Proven Optimal Treatment Strategies Before Treatment Begins

    DTIC Science & Technology

    2016-08-01

    are currently re-evaluating our IHC analysis to provide more refined response data (i.e. proliferation, percent tumor, percent fibrosis, etc.) to test...vitro results (in Fig. 4). We attempted to include only tumor cells in our image analysis, by evaluation of the cell morphology with respect to...tomography for evaluation of the activity of lapatinib, a dual inhibitor of the ErbB1 and ErbB2 tyrosine kinases, in patients with advanced tumors. Jpn J

  5. Design manual: Oxygen Thermal Test Article (OTTA)

    NASA Technical Reports Server (NTRS)

    Chronic, W. L.; Baese, C. L.; Conder, R. L.

    1974-01-01

    The characteristics of a cryogenic tank for storing liquid hydrogen, nitrogen, oxygen, methane, or helium for an extended period of time with minimum losses are discussed. A description of the tank and control module, assembly drawings and details of major subassemblies, specific requirements controlling development of the system, thermal concept considerations, thermal analysis methods, and a record of test results are provided. The Oxygen Thermal Test Article's thermal protection system has demonstrated that this insulation approach for cryogenic vessels is effective.

  6. Linked data and provenance in biological data webs.

    PubMed

    Zhao, Jun; Miles, Alistair; Klyne, Graham; Shotton, David

    2009-03-01

    The Web is now being used as a platform for publishing and linking life science data. The Web's linking architecture can be exploited to join heterogeneous data from multiple sources. However, as data are frequently being updated in a decentralized environment, provenance information becomes critical to providing reliable and trustworthy services to scientists. This article presents design patterns for representing and querying provenance information relating to mapping links between heterogeneous data from sources in the domain of functional genomics. We illustrate the use of named resource description framework (RDF) graphs at different levels of granularity to make provenance assertions about linked data, and demonstrate that these assertions are sufficient to support requirements including data currency, integrity, evidential support and historical queries.
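
    A minimal sketch of the named-graph pattern using rdflib: the mapping link is placed in its own named graph so that provenance assertions can be attached to the graph identifier; the namespaces and URIs are hypothetical.

      from rdflib import Dataset, Namespace, URIRef, Literal

      EX = Namespace("http://example.org/")            # hypothetical namespace
      SKOS = Namespace("http://www.w3.org/2004/02/skos/core#")
      DCT = Namespace("http://purl.org/dc/terms/")

      ds = Dataset()

      # The mapping link itself lives in a named graph...
      link_graph = ds.graph(URIRef("http://example.org/graphs/mapping-42"))
      link_graph.add((EX.geneA, SKOS.exactMatch, EX.recordB))

      # ...so provenance assertions can be made about the graph as a whole,
      # supporting currency, evidential-support and historical queries.
      ds.add((link_graph.identifier, DCT.creator, EX.mappingPipeline))
      ds.add((link_graph.identifier, DCT.created, Literal("2009-01-15")))

      print(ds.serialize(format="trig"))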

  7. Tracing dust provenance in paleoclimate records using mineralogical and isotopic fingerprints: additional clues from present-day studies

    NASA Astrophysics Data System (ADS)

    Bory, A. J.; Skonieczny, C.; Bout-Roumazeilles, V.; Grousset, F. E.; Biscaye, P. E.

    2011-12-01

    Dust records retrieved from ice and sediment cores represent some of our most valuable evidence for modifications of atmospheric circulation on various time scales over the last few Pleistocene glacial and interglacial climate cycles. These data also contribute to the documentation of changes in continental paleo-environments (e.g., changes in aridity), changes in iron inputs to the ocean, as well as changes in the hydrological cycle. Interpreting ice- and sediment-core dust records, and using them for modelling purposes, requires first a good understanding of the dust provenance and its possible temporal variability. Specific intrinsic tracers such as clay mineralogy, major and trace elements, and radiogenic isotopes (strontium, neodymium, lead) have been used for this purpose, with variable effectiveness. One difficulty lies in the fact that these measurements require a significant amount of mineral particles and can thus only be obtained at low temporal resolution, either because of the low dust concentration in ice cores or because of the low mass accumulation rates and bioturbation in marine sediments. As a result, dust samples extracted from ice and sediment cores for provenance investigation average long periods of time and may reflect mixtures from various source areas, complicating the interpretation of the data. Still, provenance tracers (clay mineralogy and Sr-Nd isotopes in particular) made it possible, for instance, to discriminate which continents provided most of the dust deposited in remote locations such as Greenland and Antarctica during the dusty glacial stages. The locations of the contributing source areas, however, were not precisely identified. During the low-dust, interglacial periods, provenance has proven more difficult to establish unambiguously, even at broad (i.e., continental) geographic scales. In other aeolian deposits, such as Asian loess or marine sediments off West Africa, the provenance of the dust is still poorly constrained, despite the fact that these archives are located close to the highest dust-emission areas in the world. Characterization of dust provenance (using mineralogical and isotopic fingerprints) at present, which can be achieved at much higher resolution and benefit from remote sensing data and well-constrained GCM outputs, may provide valuable clues for our understanding of dust provenance in paleoclimate records. We review some investigations carried out in Greenland and Antarctica over the last decade, and present new results from the West African margin. We discuss the extent to which these present-day time series may help us calibrate our paleo-dust provenance proxies and improve our understanding of dust provenance in paleoclimate records.

  8. Providing traceability for neuroimaging analyses.

    PubMed

    McClatchey, Richard; Branson, Andrew; Anjum, Ashiq; Bloodsworth, Peter; Habib, Irfan; Munir, Kamran; Shamdasani, Jetendr; Soomro, Kamran

    2013-09-01

    With the increasingly digital nature of biomedical data and as the complexity of analyses in medical research increases, the need for accurate information capture, traceability and accessibility has become crucial to medical researchers in the pursuance of their research goals. Grid- or Cloud-based technologies, often based on so-called Service Oriented Architectures (SOA), are increasingly being seen as viable solutions for managing distributed data and algorithms in the bio-medical domain. For neuroscientific analyses, especially those centred on complex image analysis, traceability of processes and datasets is essential, but up to now this has not been captured in a manner that facilitates collaborative study. Few examples exist of deployed medical systems based on Grids that provide the traceability of research data needed to facilitate complex analyses, and none have been evaluated in practice. Over the past decade, we have been working with mammographers, paediatricians and neuroscientists in three generations of projects to provide the data management and provenance services now required for 21st century medical research. This paper outlines the findings of a requirements study and a resulting system architecture for the production of services to support neuroscientific studies of biomarkers for Alzheimer's disease. The paper proposes a software infrastructure and services that provide the foundation for such support. It introduces the use of the CRISTAL software to provide provenance management as one of a number of services delivered on a SOA, deployed to manage neuroimaging projects that have been studying biomarkers for Alzheimer's disease. In the neuGRID and N4U projects, a Provenance Service has been delivered that captures and reconstructs the workflow information needed to facilitate researchers in conducting neuroimaging analyses. The software enables neuroscientists to track the evolution of workflows and datasets. It also tracks the outcomes of various analyses and provides provenance traceability throughout the lifecycle of their studies. As the Provenance Service has been designed to be generic, it can be applied across the medical domain as a reusable tool for supporting medical researchers, thus providing communities of researchers for the first time with the necessary tools to conduct widely distributed collaborative programmes of medical analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  9. Quantifying the provenance of aeolian sediments using multiple composite fingerprints

    NASA Astrophysics Data System (ADS)

    Liu, Benli; Niu, Qinghe; Qu, Jianjun; Zu, Ruiping

    2016-09-01

    We introduce a new fingerprinting method that uses multiple composite fingerprints for studies of aeolian sediment provenance. We used this method to quantify the provenance of sediments on both sides of the Qinghai-Tibetan Railway (QTR) in the Cuona Lake section of the Tibetan Plateau (TP), in an environment characterized by aeolian and fluvial interactions. The method involves repeatedly solving a linear mixing model based on mass conservation; the model is not limited to spatial scale or transport types and uses all the tracer groups that passed the range check, Kruskal-Wallis H-test, and a strict analytical solution screening. The proportional estimates that result from using different composite fingerprints are highly variable; however, the average of these fingerprints has a greater accuracy and certainty than any single fingerprint. The results show that sand from the lake beach, hilly surface, and gullies contribute, respectively, 48%, 31% and 21% to the western railway sediments and 43%, 33% and 24% to the eastern railway sediments. The difference between contributions from various sources on either side of the railway, which may increase in the future, was clearly related to variations in local transport characteristics, a conclusion that is supported by grain size analysis. The construction of the QTR changed the local cycling of materials, and the difference in provenance between the sediments that are separated by the railway reflects the changed sedimentary conditions on either side of the railway. The effectiveness of this method suggests that it will be useful in other studies of aeolian sediments.
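
    A minimal sketch of the mass-conservation mixing model under standard assumptions (non-negative source proportions summing to one), solved here as a weighted non-negative least-squares problem and averaged over composite fingerprints; the tracer values for the three sources (beach, hillslope, gully) are toy numbers, not the Cuona Lake data.

      import numpy as np
      from scipy.optimize import nnls

      def unmix(S, m):
          """Proportions p >= 0 with sum(p) = 1, minimizing ||S p - m||.

          S: (tracers x sources) source tracer means; m: mixture tracer values.
          The sum-to-one (mass conservation) constraint is imposed as a
          heavily weighted extra row.
          """
          w = 1e3
          A = np.vstack([S, w * np.ones(S.shape[1])])
          b = np.append(m, w)
          p, _ = nnls(A, b)
          return p

      # Two composite fingerprints; the toy mixtures were generated from
      # proportions of roughly (0.45, 0.30, 0.25).
      fingerprints = [
          (np.array([[10.0, 40.0, 25.0],
                     [ 2.0,  8.0,  5.0]]), np.array([22.8, 4.6])),
          (np.array([[ 1.0,  5.0,  3.0],
                     [30.0, 12.0, 20.0]]), np.array([2.7, 22.1])),
      ]

      estimates = np.array([unmix(S, m) for S, m in fingerprints])
      print("mean proportions:", estimates.mean(axis=0).round(2))

    Averaging the estimates across composite fingerprints, as the abstract describes, reduces the variability seen in any single fingerprint's solution.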

  10. Fluvial sediment fingerprinting: literature review and annotated bibliography

    USGS Publications Warehouse

    Williamson, Joyce E.; Haj, Adel E.; Stamm, John F.; Valder, Joshua F.; Prautzch, Vicki L.

    2014-01-01

    The U.S. Geological Survey has evaluated and adopted various field methods for collecting real-time sediment and nutrient data. These methods have proven to provide valuable representations of sediment and nutrient concentrations and loads but are not able to accurately identify specific source areas. Recently, more advanced data collection and analysis techniques have been evaluated that show promise in identifying specific source areas. Application of field methods could include studies of the sources of fluvial sediment, otherwise referred to as sediment “fingerprinting.” The identification of sediment sources is important, in part, because knowing the primary sediment source areas in watersheds ensures that best management practices are applied in areas that maximize reductions in sediment loadings. This report provides a literature review and annotated bibliography of existing methodologies applied in the field of fluvial sediment fingerprinting. The literature review provides a bibliography of publications where sediment fingerprinting methods have been used; however, this report is not intended to provide an exhaustive listing. Selected publications were categorized by methodology with some additional summary information. The information contained in the summary may help researchers select methods better suited to their particular study or study area, and identify methods in need of more testing and application.

  11. Diverging drought resistance of Scots pine provenances revealed by infrared thermography and mortality

    NASA Astrophysics Data System (ADS)

    Seidel, Hannes; Schunk, Christian; Matiu, Michael; Menzel, Annette

    2016-04-01

    Climate warming and more frequent and severe drought events will alter the adaptedness and fitness of tree species. Scots pine forests in particular have been disproportionately affected by die-off events during recent decades. Assisted migration of adapted provenances might help alleviate the impacts of recent climate change and successfully regenerate forests. However, the identification of suitable provenances based on established ecophysiological methods is time consuming and sometimes invasive, and data on provenance-specific mortality are lacking. We studied the performance, stress and survival of potted Scots pine seedlings from 12 European provenances grown in a greenhouse experiment with multiple drought and warming treatments. In this paper, we will present results of drought stress impacts monitored with four different thermal indices derived from infrared thermography imaging, as well as a comprehensive mortality study. Percent soil water deficit (PSWD) was shown to be the main driver of the drought stress response in all thermal indices. In spite of wet and dry reference surfaces, however, fluctuating environmental conditions, mainly in terms of air temperature and humidity, altered the measured stress response. In linear mixed-effects models, besides PSWD and meteorological covariates, the factors provenance and provenance-PSWD interactions were included. The explanatory power of the models (R2) ranged between 0.51 and 0.83, and thus provenance-specific responses to strong and moderate drought and subsequent recovery were revealed. However, the observed differences in the response magnitude of provenances to drought were difficult to link explicitly to general features such as Mediterranean or continental type or the climate at the provenances' origin. We conclude that seedlings' drought resistance may be linked to summer precipitation, and that their experienced stress levels depend, among other factors, on their above-ground dimensions under a given water supply. With respect to mortality, previous drought stress experience lowered the current risk, and obvious provenance effects were largely related to different growth traits (dimensions). Our experimental results suggest, besides evidence for abiotic stress hardening, provenance-specific variation in drought resilience. Thus, there is room for provenance-based assisted migration as a tool for climate change adaptation in forestry.

  12. Applied evolutionary theories for engineering of secondary metabolic pathways.

    PubMed

    Bachmann, Brian O

    2016-12-01

    An expanded definition of 'secondary metabolism' is emerging. Once the exclusive provenance of naturally occurring organisms, evolved over geological time scales, secondary metabolism increasingly encompasses molecules generated via human engineered biocatalysts and biosynthetic pathways. Many of the tools and strategies for enzyme and pathway engineering can find origins in evolutionary theories. This perspective presents an overview of selected proposed evolutionary strategies in the context of engineering secondary metabolism. In addition to the wealth of biocatalysts provided via secondary metabolic pathways, improving the understanding of biosynthetic pathway evolution will provide rich resources for methods to adapt to applied laboratory evolution. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Evaluation of an Impedance Threshold Device as a VIIP Countermeasure

    NASA Technical Reports Server (NTRS)

    Ebert, Douglas; Macias, Brandon; Sargsyan, Ashot; Garcia, Kathleen; Stenger, Michael; Hargens, Alan; Johnston, Smith; Kemp, David; Danielson, Richard

    2017-01-01

    Visual Impairment/Intracranial Pressure (VIIP) is a top human spaceflight risk for which NASA does not currently have a proven mitigation strategy. Thigh cuffs and lower body negative pressure (LBNP) devices have been or are currently being evaluated as a means to reduce VIIP signs and symptoms, but these methods alone may not provide sufficient relief of cephalic venous congestion and VIIP symptoms. Additionally, current LBNP devices are too large and cumbersome for their systematic use as a countermeasure. Therefore, a novel approach is needed that is easy to implement and provides specific relief of symptoms. This investigation will evaluate an impedance threshold device (ITD) as a VIIP countermeasure.

  14. Applications of FT-IR spectrophotometry in cancer diagnostics.

    PubMed

    Bunaciu, Andrei A; Hoang, Vu Dang; Aboul-Enein, Hassan Y

    2015-01-01

    This review provides a brief background to the application of infrared spectroscopy, including Fourier transform-infrared spectroscopy, in biological fluids. It is not meant to be complete or exhaustive but to provide the reader with sufficient background for selected applications in cancer diagnostics. Fourier transform-infrared spectroscopy (FT-IR) is a fast and nondestructive analytical method. The infrared spectrum of a mixture serves as the basis to quantitate its constituents, and a number of common clinical chemistry tests have proven to be feasible using this approach. This review focuses on biomedical FT-IR applications, published in the period 2009-2013, used for early detection of cancer through qualitative and quantitative analysis.

  15. Defining the "proven technology" technical criterion in the reactor technology assessment for Malaysia's nuclear power program

    NASA Astrophysics Data System (ADS)

    Anuar, Nuraslinda; Kahar, Wan Shakirah Wan Abdul; Manan, Jamal Abdul Nasir Abd

    2015-04-01

    Developing countries that are considering the deployment of nuclear power plants (NPPs) in the near future need to perform reactor technology assessment (RTA) in order to select the most suitable reactor design. The International Atomic Energy Agency (IAEA) reported in the Common User Considerations (CUC) document that "proven technology" is one of the most important technical criteria for newcomer countries performing RTA. The qualitative description of the five desired features of "proven technology" is relatively broad and only provides a general guideline for its characterization. This paper proposes a methodology to define the term "proven technology" according to a specific country's requirements using a three-stage evaluation process. The first evaluation stage screens the available technologies in the market against a predefined minimum Technology Readiness Level (TRL), derived as a condition from national needs and policy objectives. The result is a list of technology options, which are then assessed in the second evaluation stage against quantitative definitions of the CUC desired features for proven technology. The list of potential technology candidates produced by this evaluation is further narrowed down to a list of proven technology candidates by assessing them against selected risk criteria and the established maximum allowable total score using a scoring matrix. The outcome of this methodology is a set of proven technology candidates selected using an accurate definition of "proven technology" that fulfills the policy objectives, national needs and risk criteria, and country-specific CUC desired features of the country performing the assessment. A simplified assessment for Malaysia is carried out to demonstrate the use of the proposed methodology. In this exercise, the ABWR, AP1000, APR1400 and EPR designs emerged as the top-ranked proven technology candidates according to Malaysia's definition of "proven technology".
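
    A minimal sketch of the three-stage screening logic described above; the TRL floor, scores, and risk threshold are hypothetical placeholders, not values from the Malaysian assessment.

      # Hypothetical candidate designs with TRL, CUC feature scores, and risk.
      designs = {
          "DesignA": {"trl": 9, "cuc_scores": [4, 5, 4], "risk": 7},
          "DesignB": {"trl": 6, "cuc_scores": [5, 5, 5], "risk": 3},
          "DesignC": {"trl": 9, "cuc_scores": [3, 4, 4], "risk": 12},
      }

      MIN_TRL = 8     # stage 1: TRL floor from national needs and policy
      MAX_RISK = 10   # stage 3: maximum allowable total risk score

      # Stage 1: screen out designs below the TRL floor.
      candidates = {k: v for k, v in designs.items() if v["trl"] >= MIN_TRL}

      # Stage 2 scores the CUC desired features; stage 3 applies the
      # risk threshold from the scoring matrix.
      proven = {
          k: sum(v["cuc_scores"])
          for k, v in candidates.items()
          if v["risk"] <= MAX_RISK
      }
      print(sorted(proven.items(), key=lambda kv: -kv[1]))  # ranked candidates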

  16. Enabling Linked Science in Global Climate Uncertainty Quantification (UQ) Research

    NASA Astrophysics Data System (ADS)

    Elsethagen, T.; Stephan, E.; Lin, G.; Williams, D.; Banks, E.

    2012-12-01

    This paper shares a real-world global climate UQ science use case and illustrates how a linked science application called Provenance Environment (ProvEn), currently under development, enables scientific teams to publish, share, link, and discover new links across their UQ research results. UQ results include terascale datasets that are published to an Earth Systems Grid Federation (ESGF) repository. ProvEn demonstrates how a scientific team conducting UQ studies can discover dataset links using its domain knowledgebase, allowing the team to better understand the UQ study's research objectives, the experimental protocol used, the resulting dataset lineage, related analytical findings, and ancillary literature citations, along with the social network of scientists associated with the study. This research claims that such a linked science approach not only lets scientists benefit greatly from understanding a particular dataset within a knowledge context, but also benefits them through the cross-referencing of knowledge among the numerous UQ studies stored in ESGF. ProvEn collects native forms of data provenance resources as the UQ study is carried out. The native provenance resources can be collected from a variety of sources such as scripts, a workflow engine log, simulation log files, and scientific team members. Schema alignment is used to translate the native forms of provenance into a set of W3C PROV-O semantic statements used as a common interchange format, which also contains URI references back to resources in the UQ study dataset for querying and cross-referencing. ProvEn leverages the Fedora Commons digital object model in a Resource Oriented Architecture (ROA) (i.e., a RESTful framework) to logically organize and partition native and translated provenance resources by UQ study. The ROA also provides scientists the means to search both native and translated forms of provenance.

  17. Citation and Recognition of contributions using Semantic Provenance Knowledge Captured in the OPeNDAP Software Framework

    NASA Astrophysics Data System (ADS)

    West, P.; Michaelis, J.; Lebot, T.; McGuinness, D. L.; Fox, P. A.

    2014-12-01

    Providing proper citation and attribution for published data, derived data products, and the software tools used to generate them has always been an important aspect of scientific research. However, it is often the case that this type of detailed citation and attribution is lacking. This is in part because it often requires manual markup, since dynamic generation of this type of provenance information is not typically done by the tools used to access, manipulate, transform, and visualize data. In addition, the tools lack the information needed for them to be properly cited. The OPeNDAP Hyrax Software Framework is a tool that provides access to, and the ability to constrain, manipulate, and transform, different types of data from different data formats into a common format, the DAP (Data Access Protocol), in order to derive new data products. A user, or another software client, specifies an HTTP URL in order to access a particular piece of data and transform it appropriately to suit a specific purpose. The resulting data products, however, do not contain any information about what data were used to create them, or the software process used to generate them, let alone information that would allow proper citation and attribution by downstream researchers and tool developers. We will present our approach to provenance capture in Hyrax, including a mechanism that can be used to report back to the hosting site any derived products, such as publications and reports, using the W3C PROV recommendation pingback service. We will demonstrate our utilization of Semantic Web and Web standards, the development of an information model that extends the PROV model for provenance capture, and the development of the pingback service. We will present our findings, as well as our practices for providing provenance information, visualizing the provenance information, and developing pingback services, to better enable scientists and tool developers to be recognized and properly cited for their contributions.
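
    A minimal client-side sketch of the pingback mechanism as specified in the W3C PROV-AQ note, which is our reading of the pingback service referenced in the abstract: the pingback URI is discovered from an HTTP Link header with the prov#pingback relation, and the client POSTs a text/uri-list of provenance URIs describing the derived product. All URLs here are hypothetical.

      import requests

      product_url = "http://example.org/data/product-17"   # hypothetical

      # PROV-AQ advertises a pingback endpoint via an HTTP Link header with
      # rel="http://www.w3.org/ns/prov#pingback" (assumed present here).
      resp = requests.head(product_url)
      pingback_uri = resp.links.get(
          "http://www.w3.org/ns/prov#pingback", {}).get("url")

      if pingback_uri:
          # The pingback body is a text/uri-list of provenance records
          # describing how the derived product (e.g., a figure) was made.
          body = "http://example.org/prov/figure-3-workflow\n"
          requests.post(pingback_uri,
                        data=body,
                        headers={"Content-Type": "text/uri-list"})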

  18. Restful Implementation of Catalogue Service for Geospatial Data Provenance

    NASA Astrophysics Data System (ADS)

    Jiang, L. C.; Yue, P.; Lu, X. C.

    2013-10-01

    Provenance, also known as lineage, is important in understanding the derivation history of data products. Geospatial data provenance helps data consumers to evaluate the quality and reliability of geospatial data. In a service-oriented environment, where data are often consumed or produced by distributed services, provenance can be managed by following the same service-oriented paradigm. The Open Geospatial Consortium (OGC) Catalogue Service for the Web (CSW) is used for the registration and query of geospatial data provenance by extending the ebXML Registry Information Model (ebRIM). Recent advances in the REpresentational State Transfer (REST) paradigm have shown great promise for the easy integration of distributed resources. RESTful Web services aim to provide a standard way for Web clients to communicate with servers based on REST principles. The existing approach to a provenance catalogue service can be improved by adopting a RESTful design. This paper presents the design and implementation of a catalogue service for geospatial data provenance following the RESTful architectural style. A middleware component named the REST Converter is added on top of the legacy catalogue service to support a RESTful-style interface. The REST Converter is composed of a resource request dispatcher and six resource handlers. A prototype service has been developed to demonstrate the applicability of the approach.
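
    A minimal sketch of the dispatcher idea using Flask: a RESTful resource GET is translated into a legacy CSW GetRecordById request. The endpoint and route are hypothetical, and this is not the paper's actual REST Converter middleware.

      from flask import Flask, Response
      import requests

      app = Flask(__name__)
      CSW_ENDPOINT = "http://example.org/csw"   # hypothetical legacy service

      @app.route("/provenance/<record_id>", methods=["GET"])
      def get_provenance(record_id):
          """Translate a RESTful resource GET into a CSW GetRecordById call."""
          params = {
              "service": "CSW",
              "version": "2.0.2",
              "request": "GetRecordById",
              "id": record_id,
          }
          csw = requests.get(CSW_ENDPOINT, params=params)
          # Relay the legacy service's XML response to the REST client.
          return Response(csw.content, mimetype="application/xml")

      if __name__ == "__main__":
          app.run(port=8080)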

  19. A method for the direct measurement of electronic site populations in a molecular aggregate using two-dimensional electronic-vibrational spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, Nicholas H. C.; Dong, Hui; Oliver, Thomas A. A.

    2015-09-28

    Two-dimensional electronic spectroscopy has proven to be a valuable experimental technique to reveal electronic excitation dynamics in photosynthetic pigment-protein complexes, nanoscale semiconductors, organic photovoltaic materials, and many other types of systems. It does not, however, provide direct information concerning the spatial structure and dynamics of excitons. 2D infrared spectroscopy has become a widely used tool for studying structural dynamics but is incapable of directly providing information concerning electronic excited states. 2D electronic-vibrational (2DEV) spectroscopy provides a link between these domains, directly connecting the electronic excitation with the vibrational structure of the system under study. In this work, we derive response functions for the 2DEV spectrum of a molecular dimer and propose a method by which 2DEV spectra could be used to directly measure the electronic site populations as a function of time following the initial electronic excitation. We present results from the response function simulations which show that our proposed approach is substantially valid. This method provides, to our knowledge, the first direct experimental method for measuring the electronic excited state dynamics in the spatial domain, on the molecular scale.

  20. Use of genomic recursions and algorithm for proven and young animals for single-step genomic BLUP analyses--a simulation study.

    PubMed

    Fragomeni, B O; Lourenco, D A L; Tsuruta, S; Masuda, Y; Aguilar, I; Misztal, I

    2015-10-01

    The purpose of this study was to examine the accuracy of genomic selection via single-step genomic BLUP (ssGBLUP) when the direct inverse of the genomic relationship matrix (G) is replaced by an approximation of G(-1) based on recursions for young genotyped animals conditioned on a subset of proven animals, termed the algorithm for proven and young animals (APY). With an efficient implementation, this algorithm has a cost that is cubic in the number of proven animals and linear in the number of young animals. Ten duplicate data sets mimicking a dairy cattle population were simulated. In a first scenario, genomic information for 20k genotyped bulls, divided into 7k proven and 13k young bulls, was generated for each replicate. In a second scenario, 5k genotyped cows with phenotypes were included in the analysis as young animals. Accuracies (averaged over the 10 replicates) of regular EBV were 0.72 and 0.34 for proven and young animals, respectively. When genomic information was included, they increased to 0.75 and 0.50. No differences were observed between genomic EBV (GEBV) obtained with the regular G(-1) and the approximated G(-1) via the recursive method. In the second scenario, accuracies in GEBV (0.76, 0.51 and 0.59 for proven bulls, young males and young females, respectively) were also higher than those in EBV (0.72, 0.35 and 0.49). Again, no differences between GEBV with regular G(-1) and with recursions were observed. With the recursive algorithm, the number of iterations to achieve convergence was reduced from 227 to 206 in the first scenario and from 232 to 209 in the second scenario. Cows can be treated as young animals in APY without reducing the accuracy. The proposed algorithm can be implemented to reduce computing costs and to overcome current limitations on the number of genotyped animals in the ssGBLUP method. © 2015 Blackwell Verlag GmbH.
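
    A minimal NumPy sketch of the APY inverse, assuming a precomputed G whose first n_p rows/columns correspond to the proven (core) animals; this follows the published block formula for the APY G-inverse and is not the authors' code:

        import numpy as np

        def apy_inverse(G, n_p):
            """Approximate G^-1 via APY recursions on the first n_p (proven) animals."""
            Gcc = G[:n_p, :n_p]                    # proven (core) block
            Gnc = G[n_p:, :n_p]                    # young x proven block
            Gcc_inv = np.linalg.inv(Gcc)           # cubic cost in proven animals only
            P = Gnc @ Gcc_inv                      # recursion coefficients
            # diagonal conditional variances; linear cost in young animals
            m = np.diag(G)[n_p:] - np.einsum("ij,ij->i", P, Gnc)
            n = G.shape[0]
            Ginv = np.zeros((n, n))
            Ginv[:n_p, :n_p] = Gcc_inv + P.T @ (P / m[:, None])
            Ginv[:n_p, n_p:] = -(P / m[:, None]).T
            Ginv[n_p:, :n_p] = Ginv[:n_p, n_p:].T
            Ginv[n_p:, n_p:] = np.diag(1.0 / m)
            return Ginv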

  1. Development of a statistically proven injection molding method for reaction bonded silicon nitride, sintering reaction bonded silicon nitride, and sintered silicon nitride

    NASA Astrophysics Data System (ADS)

    Steiner, Matthias

    A statistically proven, series-production injection molding technique for ceramic components was developed for the construction of engines and gas turbines. The flow behavior of silicon injection-molding materials was characterized and improved. Hot-isostatically-pressed reaction bonded silicon nitride (HIPRBSN) was developed, along with a nondestructive component evaluation method. An injection molding line for HIPRBSN engine components (precombustion chamber, flame spreader, and valve guide) was developed; this line allows the production of small series for engine tests.

  2. On the Daubechies-based wavelet differentiation matrix

    NASA Technical Reports Server (NTRS)

    Jameson, Leland

    1993-01-01

    The differentiation matrix for a Daubechies-based wavelet basis is constructed and superconvergence is proven. That is, it is proven that, under the assumption of periodic boundary conditions, the differentiation matrix is accurate of order 2M, even though the approximation subspace can exactly represent only polynomials up to degree M-1, where M is the number of vanishing moments of the associated wavelet. It is illustrated that Daubechies-based wavelet methods are equivalent to finite difference methods with grid refinement in regions of the domain where small-scale structure is present.
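
    Stated compactly (our notation, paraphrasing the abstract's claim: D_h is the wavelet differentiation matrix on a grid of spacing h, applied to a smooth periodic function f):

        \[
          \lVert D_h f - f' \rVert = O\!\left(h^{2M}\right)
        \]
        % Superconvergence: exact reproduction of polynomials up to degree M-1
        % would by itself suggest only O(h^M).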

  3. A parametric method for determining the number of signals in narrow-band direction finding

    NASA Astrophysics Data System (ADS)

    Wu, Qiang; Fuhrmann, Daniel R.

    1991-08-01

    A novel and more accurate method to determine the number of signals in the multisource direction finding problem is developed. The information-theoretic criteria of Yin and Krishnaiah (1988) are applied to a set of quantities which are evaluated from the log-likelihood function. Based on proven asymptotic properties of the maximum likelihood estimation, these quantities have the properties required by the criteria. Since the information-theoretic criteria use these quantities instead of the eigenvalues of the estimated correlation matrix, this approach possesses the advantage of not requiring a subjective threshold, and also provides higher performance than when eigenvalues are used. Simulation results are presented and compared to those obtained from the nonparametric method given by Wax and Kailath (1985).
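
    For context, the eigenvalue-based baseline the authors compare against, the MDL criterion of Wax and Kailath (1985), can be sketched as follows (illustrative; the paper's contribution replaces the eigenvalues with log-likelihood-derived quantities):

        import numpy as np

        def mdl_num_signals(R, n_snapshots):
            """Estimate the number of sources from a sample correlation matrix R."""
            lam = np.sort(np.linalg.eigvalsh(R))[::-1]    # descending eigenvalues
            p = len(lam)
            mdl = np.empty(p)
            for k in range(p):
                noise = lam[k:]
                # log ratio of geometric to arithmetic mean of noise eigenvalues
                log_ratio = np.sum(np.log(noise)) - (p - k) * np.log(noise.mean())
                penalty = 0.5 * k * (2 * p - k) * np.log(n_snapshots)
                mdl[k] = -n_snapshots * log_ratio + penalty
            return int(np.argmin(mdl))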

  4. Acoustic levitation and manipulation for space applications

    NASA Technical Reports Server (NTRS)

    Wang, T. G.

    1979-01-01

    A wide spectrum of experiments to be performed in space in a microgravity environment require levitation and manipulation of liquid or molten samples. A novel acoustic method has been developed at JPL for controlling liquid samples without physical contact. This method utilizes the static pressure generated by three orthogonal acoustic standing waves excited within an enclosure. Furthermore, this method allows the sample to be rotated and/or oscillated by modifying the phase angles and/or the amplitude of the acoustic field. The technique has been proven both in our laboratory and in the microgravity environment provided by KC-135 flights. Samples placed within our chamber, driven in the (1,0,0), (0,1,0), and (0,0,1) modes, were indeed levitated, rotated, and oscillated.

  5. Moiré deflectometry-based position detection for optical tweezers.

    PubMed

    Khorshad, Ali Akbar; Reihani, S Nader S; Tavassoly, Mohammad Taghi

    2017-09-01

    Optical tweezers have proven to be indispensable tools for pico-Newton range force spectroscopy. A quadrant photodiode (QPD) positioned at the back focal plane of an optical tweezers' condenser is commonly used for locating the trapped object. In this Letter, for the first time, to the best of our knowledge, we introduce a moiré pattern-based detection method for optical tweezers. We show, both theoretically and experimentally, that this detection method could provide considerably better position sensitivity compared to the commonly used detection systems. For instance, position sensitivity for a trapped 2.17 μm polystyrene bead is shown to be 71% better than the commonly used QPD-based detection method. Our theoretical and experimental results are in good agreement.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen, Benjamin; Ruebel, Oliver; Fischer, Curt R.

    BASTet is an advanced software library written in Python that serves as the analysis and storage library for the OpenMSI project. BASTet is an integrated framework for: i) storage of spectral imaging data; ii) storage of derived analysis data; iii) provenance of analyses; and iv) integration and execution of analyses via complex workflows. BASTet implements the API for the HDF5 storage format used by OpenMSI. Analyses developed using BASTet benefit from direct integration with the storage format, automatic tracking of provenance, and direct integration with command-line and workflow execution tools. BASTet also defines interfaces that enable developers to directly integrate their analyses with OpenMSI's web-based viewing infrastructure without having to know OpenMSI. BASTet also provides numerous helper classes and tools to assist with the conversion of data files, ease parallel implementation of analysis algorithms, ease interaction with web-based functions, and describe methods for data reduction. BASTet also includes detailed developer documentation, user tutorials, iPython notebooks, and other supporting documents.
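
    The storage pattern described, derived analysis data stored alongside the raw data with provenance attached, can be sketched with plain h5py; the group names and attributes below are hypothetical, not BASTet's actual schema:

        import h5py
        import numpy as np

        with h5py.File("experiment.h5", "w") as f:
            raw = f.create_dataset("msi/raw", data=np.random.rand(10, 10, 100))
            reduced = f.create_dataset("analysis/0/peaks",
                                       data=np.random.rand(10, 10, 5))
            # provenance recorded alongside the derived data
            reduced.attrs["analysis_type"] = "peak_finding"
            reduced.attrs["parameters"] = '{"snr_threshold": 3.0}'
            reduced.attrs["depends_on"] = raw.name     # -> "/msi/raw"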

  7. Applying Content Management to Automated Provenance Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schuchardt, Karen L.; Gibson, Tara D.; Stephan, Eric G.

    2008-04-10

    Workflows and data pipelines are becoming increasingly valuable in both computational and experimental sciences. These automated systems are capable of generating significantly more data within the same amount of time than their manual counterparts. Automatically capturing and recording data provenance and annotation as part of these workflows is critical for data management, verification, and dissemination. Our goal in addressing the provenance challenge was to develop an end-to-end system that demonstrates real-time capture, persistent content management, and ad hoc searches of both provenance and metadata using open source software and standard protocols. We describe our prototype, which extends the Kepler workflow tools for the execution environment, the Scientific Annotation Middleware (SAM) content management software for data services, and an existing HTTP-based query protocol. Our implementation offers several unique capabilities, and through the use of standards, is able to provide access to the provenance record to a variety of commonly available client tools.

  8. Provenance testing at Michigan Technological University

    Treesearch

    Robert L. Sajdak

    1970-01-01

    The location of M.T.U. in the Keweenaw Peninsula of Upper Michigan provides some unique advantages and disadvantages in provenance testing and tree improvement research. Extremes in summer and winter temperatures are uncommon because of the moderating effect of Lake Superior. Near the Lake we have about 140 frost-free days while inland the frost-free season is only 80...

  9. Multi-camera digital image correlation method with distributed fields of view

    NASA Astrophysics Data System (ADS)

    Malowany, Krzysztof; Malesa, Marcin; Kowaluk, Tomasz; Kujawinska, Malgorzata

    2017-11-01

    A multi-camera digital image correlation (DIC) method and system for measurements of large engineering objects with distributed, non-overlapping areas of interest are described. The data obtained with individual 3D DIC systems are stitched by an algorithm which utilizes the positions of fiducial markers determined simultaneously by the Stereo-DIC units and a laser tracker. The proposed calibration method enables reliable determination of the transformations between the local (3D DIC) and global coordinate systems. The applicability of the method was proven during in situ measurements of a hall made of arch-shaped (18 m span) self-supporting metal plates. The proposed method is highly recommended for 3D measurements of the shape and displacements of large and complex engineering objects observed from multiple directions, and it provides data of suitable accuracy for further advanced structural integrity analysis of such objects.
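
    The stitching step reduces to estimating, per DIC unit, the rigid transform that maps fiducial-marker coordinates from the local frame to the laser-tracker (global) frame. A sketch via the Kabsch/Procrustes algorithm (illustrative, not the authors' implementation):

        import numpy as np

        def rigid_transform(local_pts, global_pts):
            """Least-squares R, t with global ~ R @ local + t; points are (N, 3)."""
            cl, cg = local_pts.mean(0), global_pts.mean(0)
            H = (local_pts - cl).T @ (global_pts - cg)
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
            R = Vt.T @ D @ U.T                     # proper rotation (det = +1)
            t = cg - R @ cl
            return R, t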

  10. Use of an OSSE to Evaluate Background Error Covariances Estimated by the 'NMC Method'

    NASA Technical Reports Server (NTRS)

    Errico, Ronald M.; Prive, Nikki C.; Gu, Wei

    2014-01-01

    The NMC method has proven utility for prescribing the approximate background-error covariances required by variational data assimilation systems. Here, untuned NMC-method estimates are compared with explicitly determined error covariances produced within an OSSE context by exploiting the availability of the true simulated states. Such a comparison provides insights into what kind of rescaling is required to render the NMC-method estimates usable. It is shown that the rescaling of variances and directional correlation lengths depends greatly on both pressure and latitude. In particular, some scaling coefficients appropriate in the Tropics are the reciprocal of those in the Extratropics. Also, the degree of dynamic balance is grossly overestimated by the NMC method. These results agree with previous examinations of the NMC method which used ensembles as an alternative for estimating background-error statistics.
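
    The NMC method itself reduces to forming a sample covariance of paired forecast differences valid at the same time but issued with different lead times. A sketch under assumed array shapes:

        import numpy as np

        def nmc_covariance(f48, f24):
            """f48, f24: (n_cases, n_state) arrays of 48 h and 24 h forecasts
            valid at the same times."""
            d = f48 - f24                        # forecast-difference proxies
            d = d - d.mean(axis=0)               # remove the mean difference
            B = d.T @ d / (d.shape[0] - 1)       # sample covariance
            return B                             # typically rescaled before use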

  11. Geographic variation of stable isotopes in African elephant ivory

    NASA Astrophysics Data System (ADS)

    Ziegler, S.; Merker, S.; Jacob, D.

    2012-04-01

    In 1989, the international community listed the African elephant in Appendix I of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES), thus prohibiting commercial ivory trade. Recent surveillance data show that the illegal trade in ivory has been growing worldwide. Long-term preservation of many of the African elephant populations can be supported with a control mechanism that helps with the implementation of remedial conservation action. Therefore, setting up a reference database that predicts the origin of ivory specimens can assist in determining smuggling routes and the provenance of illegal ivory. Our research builds on earlier work to seek an appropriate method for determining the area of origin of individual tusks. Several researchers have shown that the provenance of elephant ivory can be traced by its isotopic composition, but this is the first attempt to produce an integrated isotopic reference database of elephant ivory provenance. We applied a combination of routine geochemical analyses to measure the stable isotope ratios of hydrogen, carbon, nitrogen, oxygen, and sulphur. To date, we have analysed 606 ivory samples of known geographical origin from African range states, museums, and private collections, covering 22 African elephant range states. The isotopic measurements were superimposed with data layers from vegetation, geology, and climate. A regression function relating the isotope composition of water in precipitation to that of collagen in ivory was developed to overcome the problem of the imprecise origin of some of the sampled material. Multivariate statistics, such as nearest-neighbor and discriminant analysis, were applied to eventually allow a statistical determination of the provenance of ivory of unknown origin. Our results suggest that this combination of isotopic parameters has the potential to provide predictable and complementary markers for estimating the origin of seized elephant ivory.
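
    As an illustration of the final step, a discriminant analysis assigning an unknown tusk to a range state from its isotope ratios might look like the following (scikit-learn; the data arrays are synthetic placeholders, not the study's reference database):

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        X_ref = np.random.rand(606, 5)           # d2H, d13C, d15N, d18O, d34S
        y_ref = np.random.randint(0, 22, 606)    # 22 range states (labels)
        clf = LinearDiscriminantAnalysis().fit(X_ref, y_ref)
        region = clf.predict(np.random.rand(1, 5))   # unknown tusk -> region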

  12. Evaluation and Quality Control for the Copernicus Seasonal Forecast Systems

    NASA Astrophysics Data System (ADS)

    Manubens, N.; Hunter, A.; Bedia, J.; Bretonnière, P. A.; Bhend, J.; Doblas-Reyes, F. J.

    2017-12-01

    The EU-funded Copernicus Climate Change Service (C3S) will provide authoritative information about past, current and future climate for a wide range of users, from climate scientists to stakeholders from sectors including insurance, energy and transport. It has been recognized that providing information about the products' quality and provenance is paramount to establishing trust in the service and allowing users to make the best use of the available information. This presentation outlines the work being conducted within the Quality Assurance for Multi-model Seasonal Forecast Products project (QA4Seas), whose aim is to develop a strategy for the evaluation and quality control (EQC) of the multi-model seasonal forecasts provided by C3S. First, we present the set of guidelines that data providers must comply with, ensuring the data are fully traceable and harmonized across data sets. Second, we discuss the ongoing work on defining a provenance and metadata model that is able to encode such information, and that can be extended to describe the steps followed to obtain the final verification products, such as maps and time series of forecast quality measures. The metadata model is based on the Resource Description Framework (RDF) W3C standard, and is thus extensible and reusable. It benefits from widely adopted vocabularies for describing data provenance and workflows, as well as from expert consensus and community support for the development of the verification- and downscaling-specific ontologies. Third, we describe the open source software being developed to generate fully reproducible and certifiable seasonal forecast products, which also attaches provenance and metadata information to the verification measures and enables the user to visually inspect the quality of the C3S products. QA4Seas is seeking collaboration with similar initiatives, as well as extending the discussion to interested parties outside the C3S community, to share experiences and establish global common guidelines and best practices regarding data provenance.
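
    A provenance trace of a verification step in the RDF/PROV style described above can be sketched with rdflib; the entities, activity, and URIs are invented for illustration:

        from rdflib import Graph, Namespace
        from rdflib.namespace import RDF

        PROV = Namespace("http://www.w3.org/ns/prov#")
        EX = Namespace("https://example.org/qa4seas/")   # hypothetical base URI

        g = Graph()
        g.bind("prov", PROV)
        skill_map = EX["product/crps-skill-map"]
        verification = EX["activity/verification-run-001"]
        hindcast = EX["data/multimodel-hindcast"]

        g.add((skill_map, RDF.type, PROV.Entity))
        g.add((verification, RDF.type, PROV.Activity))
        g.add((skill_map, PROV.wasGeneratedBy, verification))
        g.add((verification, PROV.used, hindcast))
        print(g.serialize(format="turtle"))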

  13. Morphological and physiological divergences within Quercus ilex support the existence of different ecotypes depending on climatic dryness

    PubMed Central

    Peguero-Pina, José Javier; Sancho-Knapik, Domingo; Barrón, Eduardo; Camarero, Julio Jesús; Vilagrosa, Alberto; Gil-Pelegrín, Eustaquio

    2014-01-01

    Background and Aims Several studies show apparently contradictory findings about the functional convergence within the Mediterranean woody flora. In this context, this study evaluates the variability of functional traits within holm oak (Quercus ilex) to elucidate whether provenances corresponding to different morphotypes represent different ecotypes locally adapted to the prevailing stress levels. Methods Several morphological and physiological traits were measured at leaf and shoot levels in 9-year-old seedlings of seven Q. ilex provenances including all recognized morphotypes. Plants were grown in a common garden for 9 years under the same environmental conditions to avoid possible biases due to site-specific characteristics. Key Results Leaf morphometry clearly separates holm oak provenances into ‘ilex’ (more elongated leaves with low vein density) and ‘rotundifolia’ (short and rounded leaves with high vein density) morphotypes. Moreover, these morphotypes represent two consistent and very contrasting functional types in response to dry climates, mainly in terms of leaf area, major vein density, leaf specific conductivity, resistance to drought-induced cavitation and turgor loss point. Conclusions The ‘ilex’ and ‘rotundifolia’ morphotypes correspond to different ecotypes as inferred from their contrasting functional traits. To the best of our knowledge, this is the first time that the combined use of morphological and physiological traits has provided support for the concept of these two holm oak morphotypes being regarded as two different species. PMID:24941998

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anuar, Nuraslinda; Kahar, Wan Shakirah Wan Abdul; Manan, Jamal Abdul Nasir Abd

    Developing countries that are considering the deployment of nuclear power plants (NPPs) in the near future need to perform a reactor technology assessment (RTA) in order to select the most suitable reactor design. The International Atomic Energy Agency (IAEA) reported in the Common User Considerations (CUC) document that “proven technology” is one of the most important technical criteria for newcomer countries in performing the RTA. The qualitative description of five desired features for “proven technology” is relatively broad and only provides a general guideline to its characterization. This paper proposes a methodology to define the “proven technology” term according to a specific country's requirements using a three-stage evaluation process. The first evaluation stage screens the available technologies in the market against a predefined minimum Technology Readiness Level (TRL) derived as a condition based on national needs and policy objectives. The result is a list of technology options, which are then assessed in the second evaluation stage against quantitative definitions of the CUC desired features for proven technology. The potential technology candidates produced from this evaluation are further narrowed down to obtain a list of proven technology candidates by assessing them against selected risk criteria and the established maximum allowable total score using a scoring matrix. The outcome of this methodology is a set of proven technology candidates selected using an accurate definition of “proven technology” that fulfills the policy objectives, national needs, risk tolerance, and country-specific CUC desired features of the country performing the assessment. A simplified assessment for Malaysia is carried out to demonstrate the use of the proposed methodology. In this exercise, the ABWR, AP1000, APR1400 and EPR designs assumed the top ranks of proven technology candidates according to Malaysia's definition of “proven technology”.

  15. Raman Spectroscopy: an essential tool for future IODP expeditions

    NASA Astrophysics Data System (ADS)

    Andò, Sergio; Garzanti, Eduardo; Kulhanek, Denise K.

    2016-04-01

    The scientific drilling of oceanic sedimentary sequences plays a fundamental part in provenance studies, paleoclimate reconstructions, and source-to-sink investigations (e.g., France-Lanord et al., 2015; Pandey et al., 2015). When studying oceanic deposits, Raman spectroscopy represents an essential, flexible tool for the multidisciplinary approach necessary to integrate the insight provided by different disciplines. This user-friendly technique opens up an innovative avenue to study in real time the composition of detrital mineral grains of any origin, complementing traditional methods of provenance analysis (e.g., sedimentary petrography, heavy minerals; Andò and Garzanti, 2014). Raman spectra can readily reveal the chemistry of foraminiferal tests, nannofossils and other biogenic debris for the study of ecosystem evolution and paleoclimate, or the Ca/Mg ratio in biogenic or terrigenous carbonates for geological or marine biological applications and oil exploration (Borromeo et al., 2015). For the study of pelagic or turbiditic muds, which represent the bulk of the deep-marine sedimentary record, Raman spectroscopy allows us to identify silt-sized grains down to the size of a few microns with the same precision level required in quantitative provenance analysis of sand-sized sediments (Andò et al., 2011). Silt and siltstone also represent a very conspicuous part of the stratigraphic record onshore and usually preserve original mineralogical assemblages better than more permeable interbedded sand and sandstone (Blatt, 1985). Raman spectra can be obtained on sample volumes of only a few cubic microns by a confocal micro-Raman system coupled with a standard polarizing light microscope using a 50× objective. An apparatus of this size can easily be placed onboard an IODP vessel to provide crucial information and quickly solve identification problems for the benefit of a wide range of scientists during future expeditions. Cited references: Andò, S., Vignola, P., Garzanti, E., 2011. Raman counting: a new method to determine provenance of silt. Rend. Fis. Acc. Lincei, 22: 327-347. Andò, S., Garzanti, E., 2014. Raman spectroscopy in heavy-mineral studies. Geological Society, London, Special Publications, 386 (1), 395-412. Blatt, H., 1985. Provenance studies and mudrocks. Journal of Sedimentary Research, 55 (1), 69-75. Borromeo, L., Zimmermann, U., Andò, S., Coletti, G., Bersani, D., Basso, D., Gentile, P., Garzanti, E., 2015. Raman spectroscopy as a tool for magnesium estimation in Mg-calcite. Periodico di Mineralogia, ECMS, 35-36. France-Lanord, C., Spiess, V., Klaus, A., and the Expedition 354 Scientists, 2015. IODP, Exp. 354, Preliminary Report: Bengal Fan: Neogene and late Paleogene record of Himalayan orogeny and climate: a transect across the Middle Bengal Fan. Pandey, D.K., Clift, P.D., Kulhanek, D.K., and the Expedition 355 Scientists, 2015. IODP, Exp. 355, Preliminary Report: Arabian Sea Monsoon: deep sea drilling in the Arabian Sea: constraining tectonic-monsoon interactions in South Asia.

  16. Quantitative Metrics for Provenance in the Global Change Information System

    NASA Astrophysics Data System (ADS)

    Sherman, R. A.; Tipton, K.; Elamparuthy, A.

    2017-12-01

    The Global Change Information System (GCIS) is an open-source web-based resource to provide traceable provenance for government climate information, particularly the National Climate Assessment and other climate science reports from the U.S. Global Change Research Program. Since 2014, GCIS has been adding and updating information and linking records to make the system as complete as possible for the key reports. Our total count of records has grown to well over 20,000, but until recently there hasn't been an easy way to measure how well all those records were serving the mission of providing provenance. The GCIS team has recently established quantitative measures of whether each record has sufficient metadata and linkages to be useful for users of our featured climate reports. We will describe our metrics and show how they can be used to guide future development of GCIS and aid users of government climate data.
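
    A per-record completeness metric of the kind described can be as simple as a weighted fraction of required fields present; the fields and weights below are invented for illustration, not GCIS's actual rubric:

        REQUIRED = {"title": 3, "doi": 2, "contributors": 2, "parent_report": 3}

        def completeness(record: dict) -> float:
            """Score in [0, 1]: weighted fraction of required metadata present."""
            got = sum(w for f, w in REQUIRED.items() if record.get(f))
            return got / sum(REQUIRED.values())

        print(completeness({"title": "Ch. 2", "doi": "10.7930/xyz"}))  # 0.5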

  17. Chemical Characterization of Bed Material Coatings by LA-ICP-MS and SEM-EDS

    NASA Astrophysics Data System (ADS)

    Piispanen, M. H.; Mustonen, A. J.; Tiainen, M. S.; Laitinen, R. S.

    Bed material coatings and the consequent agglomeration of bed material are the main ash-related problems in FB-boilers. Bed agglomeration is a particular problem when combusting biofuels and waste materials. Whereas SEM-EDS together with automated image processing has proven to be a convenient method to study the compositional distribution in coating layers and agglomerates, it is a relatively expensive technique and is not necessarily widely available. In this contribution, we explore the suitability of LA-ICP-MS to provide analogous information about the bed.

  18. Psychosexual responses to infertility.

    PubMed

    Keye, W R

    1984-09-01

    Clearly, infertility is one of several gynecologic conditions that may have a profound effect on the psychological and sexual status of its victim. It is obvious from in-depth discussions with many infertile couples that they want their physician to help them recognize and deal with these problems. Unfortunately, we are often ill-prepared to meet such a request. Hopefully, however, the next several years will see an increase in the systematic and scientific study of such problems and their solutions, thus providing us with a rational and proven method of dealing with them.

  19. Highly Enantioselective Rhodium-Catalyzed Addition of Arylboroxines to Simple Aryl Ketones: Efficient Synthesis of Escitalopram.

    PubMed

    Huang, Linwei; Zhu, Jinbin; Jiao, Guangjun; Wang, Zheng; Yu, Xingxin; Deng, Wei-Ping; Tang, Wenjun

    2016-03-24

    Highly enantioselective additions of arylboroxines to simple aryl ketones have been achieved for the first time with a Rh/(R,R,R,R)-WingPhos catalyst, thus providing a range of chiral diaryl alkyl carbinols with excellent ee values and yields. (R,R,R,R)-WingPhos has been proven to be crucial for the high reactivity and enantioselectivity. The method has enabled a new, concise, and enantioselective synthesis of the antidepressant drug escitalopram. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Transposon facilitated DNA sequencing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berg, D.E.; Berg, C.M.; Huang, H.V.

    1990-01-01

    The purpose of this research is to investigate and develop methods that exploit the power of bacterial transposable elements for large-scale DNA sequencing. Our premise is that the use of transposons to put primer binding sites randomly in target DNAs should provide access to all portions of large DNA fragments, without the inefficiencies of methods involving random subcloning and attendant repetitive sequencing, or of sequential synthesis of many oligonucleotide primers that are used to march systematically along a DNA molecule. Two unrelated bacterial transposons, Tn5 and γδ, are being used because they have both proven useful for molecular analyses, and because they differ sufficiently in mechanism and specificity of transposition to merit parallel development.

  1. EMT-defibrillation: a recipe for saving lives.

    PubMed

    Paris, P M

    1988-05-01

    Sudden cardiac death is the number-one cause of death in this country. It has long been known that most of these deaths occur outside of the hospital, therefore necessitating an approach to the problem involving prehospital care. The development of advanced life support emergency medical systems has had a dramatic impact on improving survival in selected communities. Most of the country continues to see little result because of our inability to provide timely defibrillation. Automatic external defibrillators now provide a safe, reliable, proven method to increase the number of "saves" in rural, urban, and suburban communities. This new tool, if widely used, will allow us to save scores of "hearts too good to die."

  2. Evaluation of an Impedance Threshold Device as a VIIP Countermeasure

    NASA Technical Reports Server (NTRS)

    Ebert, D.; Macias, B.; Sargsyan, A.; Garcia, K.; Stenger, M.; Kemp, D.; Hargens, A.; Johnston, S.

    2017-01-01

    Visual Impairment/Intracranial Pressure (VIIP) is a top human spaceflight risk for which NASA does not currently have a proven mitigation strategy. Thigh cuffs (Braslets) and lower body negative pressure (LBNP; Chibis) devices have been or are currently being evaluated as a means to reduce VIIP signs and symptoms, but these methods alone may not provide sufficient relief of cephalic venous congestion and VIIP symptoms. Additionally, current LBNP devices are too large and cumbersome for their systematic use as a countermeasure. Therefore, a novel approach is needed that is easy to implement and provides specific relief of symptoms. This investigation will evaluate an impedance threshold device (ITD) as a VIIP countermeasure.

  3. Comparison of methods for localizing the source position of deauthentication attacks on WAP 802.11n using Chanalyzer and Wi-Spy 2.4x

    NASA Astrophysics Data System (ADS)

    Bahaweres, R. B.; Mokoginta, S.; Alaydrus, M.

    2017-01-01

    This paper describes a comparison of three methods used to locate the position of the source of deauthentication attacks on Wi-Fi using Chanalyzer and a Wi-Spy 2.4x adapter. The three methods are wardriving, absorption, and trilateration. The position of constant deauthentication attacks is more easily analyzed than that of random attacks. Signal propagation provides a relationship between signal strength and distance, which makes the position of attackers easier to locate. The results are shown in the chart patterns generated from the Received Signal Strength Indicator (RSSI). It is proven that these three methods can be used to localize the position of attackers, and they can be recommended for use in the environment of organizations using Wi-Fi.
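
    Trilateration from RSSI typically inverts a log-distance path-loss model to obtain ranges and then solves a linearized least-squares problem for position. A sketch under assumed propagation parameters (not the paper's measured values):

        import numpy as np

        def rssi_to_distance(rssi_dbm, tx_at_1m=-40.0, n=2.5):
            """Invert the log-distance path-loss model (parameters assumed)."""
            return 10 ** ((tx_at_1m - rssi_dbm) / (10 * n))

        sensors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
        d = np.array([rssi_to_distance(r) for r in (-52.0, -61.0, -58.0)])

        # Linearize |x - s_i|^2 = d_i^2 against the first sensor and solve.
        A = 2 * (sensors[1:] - sensors[0])
        b = (d[0] ** 2 - d[1:] ** 2
             + np.sum(sensors[1:] ** 2, axis=1) - np.sum(sensors[0] ** 2))
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        print("estimated attacker position:", pos)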

  4. Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo

    NASA Astrophysics Data System (ADS)

    Schön, Thomas B.; Svensson, Andreas; Murray, Lawrence; Lindsten, Fredrik

    2018-05-01

    Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we provide a self-contained introduction to one of the state-of-the-art methods, the particle Metropolis-Hastings algorithm, which has proven to offer a practical approximation. This is a Monte Carlo-based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis-Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We also provide a motivating numerical example illustrating the method using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods, including particle Metropolis-Hastings, to a large group of users without requiring them to know all the underlying mathematical details.
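
    A compact sketch of particle Metropolis-Hastings for a scalar nonlinear state-space model, with a bootstrap particle filter supplying the (noisy but unbiased) likelihood estimate; the model, flat prior, and tuning constants are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(0)

        def log_lik_pf(theta, y, N=200):
            """Bootstrap particle filter estimate of log p(y | theta)."""
            x = rng.normal(0.0, 1.0, N)                      # initial particles
            ll = 0.0
            for yt in y:
                x = theta * np.tanh(x) + rng.normal(0.0, 0.5, N)  # propagate
                logw = -0.5 * (yt - x) ** 2                  # N(yt; x, 1), up to a constant
                m = logw.max()
                w = np.exp(logw - m)
                ll += m + np.log(w.mean())                   # log-likelihood increment
                x = x[rng.choice(N, N, p=w / w.sum())]       # multinomial resampling
            return ll

        # data simulated from the model with theta = 0.9
        T, x_true, y = 100, 0.0, []
        for _ in range(T):
            x_true = 0.9 * np.tanh(x_true) + rng.normal(0.0, 0.5)
            y.append(x_true + rng.normal())
        y = np.asarray(y)

        theta, ll = 0.5, log_lik_pf(0.5, y)
        chain = []
        for _ in range(500):                                 # MH over theta (flat prior)
            prop = theta + 0.1 * rng.normal()                # random-walk proposal
            ll_prop = log_lik_pf(prop, y)                    # unbiased likelihood estimate
            if np.log(rng.uniform()) < ll_prop - ll:
                theta, ll = prop, ll_prop
            chain.append(theta)
        print("posterior mean of theta ~", np.mean(chain[100:]))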

  5. Evaluation of Contrast Extravasation as a Diagnostic Criterion in the Evaluation of Arthroscopically Proven HAGL/pHAGL Lesions

    PubMed Central

    Maldjian, Catherine; Khanna, Vineet; Bradley, James; Adam, Richard

    2014-01-01

    Purpose. The validity of preoperative MRI in diagnosing HAGL lesions is debated. Various investigations have produced mixed results with regard to the utility of MRI. The purpose of this investigation is to apply a novel method of diagnosing HAGL/pHAGL lesions by looking at contrast extravasation and to evaluate the reliability of such extravasation of contrast into an extra-articular space as a sign of HAGL/pHAGL lesion. Methods. We utilized specific criteria to define contrast extravasation. We evaluated these criteria in 12 patients with arthroscopically proven HAGL/pHAGL lesion. We also evaluated these criteria in a control group. Results. Contrast extravasation occurred in over 83% of arthroscopically positive cases. Contrast extravasation as a diagnostic criterion in the evaluation of HAGL/pHAGL lesions demonstrated a high interobserver degree of agreement. Conclusions. In conclusion, extra-articular contrast extravasation may serve as a valid and reliable sign of HAGL and pHAGL lesions, provided stringent criteria are maintained to assure that the contrast lies in an extra-articular location. In cases where extravasation is not present, the “J” sign, though nonspecific, may be the only evidence of subtle HAGL and pHAGL lesions. Level of Evidence. Level IV, Retrospective Case-Control series. PMID:25530880

  6. Climate Data Provenance Tracking for Just-In-Time Computation

    NASA Astrophysics Data System (ADS)

    Fries, S.; Nadeau, D.; Doutriaux, C.; Williams, D. N.

    2016-12-01

    The "Climate Data Management System" (CDMS) was created in 1996 as part of the Climate Data Analysis Tools suite of software. It provides a simple interface into a wide variety of climate data formats, and creates NetCDF CF-Compliant files. It leverages the NumPy framework for high performance computation, and is an all-in-one IO and computation package. CDMS has been extended to track manipulations of data, and trace that data all the way to the original raw data. This extension tracks provenance about data, and enables just-in-time (JIT) computation. The provenance for each variable is packaged as part of the variable's metadata, and can be used to validate data processing and computations (by repeating the analysis on the original data). It also allows for an alternate solution for sharing analyzed data; if the bandwidth for a transfer is prohibitively expensive, the provenance serialization can be passed in a much more compact format and the analysis rerun on the input data. Data provenance tracking in CDMS enables far-reaching and impactful functionalities, permitting implementation of many analytical paradigms.

  7. Stability theory applications to laminar-flow control

    NASA Technical Reports Server (NTRS)

    Malik, Mujeeb R.

    1987-01-01

    In order to design Laminar Flow Control (LFC) configurations, reliable methods are needed for boundary-layer transition prediction. Among the available methods are correlations based upon the Reynolds number Re, shape factors, the Goertler number, and the crossflow Reynolds number. The most advanced transition prediction method is based upon linear stability theory in the form of the e^N method, which has proven successful in predicting transition in two- and three-dimensional boundary layers. When transition occurs in a low-disturbance environment, the e^N method provides a viable design tool for transition prediction and LFC in both 2-D and 3-D subsonic/supersonic flows. This is true for transition dominated by either TS (Tollmien-Schlichting), crossflow, or Goertler instability. If Goertler/TS or crossflow/TS interaction is present, the e^N method will fail to predict transition. However, there is no evidence of such interaction at low amplitudes of Goertler and crossflow vortices.
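
    The e^N method integrates the local spatial growth rate of the most unstable disturbance downstream and flags transition where the amplification factor N first reaches a calibrated threshold. A sketch with a synthetic growth-rate curve (the threshold value of 9 is a commonly quoted figure, not taken from this abstract):

        import numpy as np

        x = np.linspace(0.0, 1.0, 500)                       # streamwise coordinate
        neg_alpha_i = np.clip(np.sin(3 * x) * 40, 0, None)   # synthetic growth rate (1/m)

        # N(x) = integral of the growth rate from the neutral point (trapezoid rule)
        N = np.concatenate(([0.0], np.cumsum(
            0.5 * (neg_alpha_i[1:] + neg_alpha_i[:-1]) * np.diff(x))))
        N_CRIT = 9.0                                         # typical calibrated threshold
        idx = np.argmax(N >= N_CRIT)
        print("transition at x =", x[idx] if N[idx] >= N_CRIT else "none")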

  8. A New Enzyme-linked Sorbent Assay (ELSA) to Quantify Syncytiotrophoblast Extracellular Vesicles in Biological Fluids.

    PubMed

    Göhner, Claudia; Weber, Maja; Tannetta, Dionne S; Groten, Tanja; Plösch, Torsten; Faas, Marijke M; Scherjon, Sicco A; Schleußner, Ekkehard; Markert, Udo R; Fitzgerald, Justine S

    2015-06-01

    The pregnancy-associated disease preeclampsia is related to the release of syncytiotrophoblast extracellular vesicles (STBEV) by the placenta. To improve functional research on STBEV, reliable and specific methods are needed to quantify them. However, only a few quantification methods are available and accepted, though imperfect. For this purpose, we aimed to provide an enzyme-linked sorbent assay (ELSA) to quantify STBEV in fluid samples based on their microvesicle characteristics and placental origin. Ex vivo placenta perfusion provided standards and samples for the STBEV quantification. STBEV were captured by binding of extracellular phosphatidylserine to immobilized annexin V. The membranous human placental alkaline phosphatase on the STBEV surface catalyzed a colorimetric detection reaction. The described ELSA is a rapid and simple method to quantify STBEV in diverse liquid samples, such as blood or perfusion suspension. The reliability of the ELSA was proven by comparison with nanoparticle tracking analysis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  9. Comparative genetic responses to climate in the varieties of Pinus ponderosa and Pseudotsuga menziesii: clines in growth potential

    Treesearch

    Gerald E. Rehfeldt; Laura P. Leites; J. Bradley St Clair; Barry C. Jaquish; Cuauhtemoc Saenz-Romero; Javier Lopez-Upton; Dennis G. Joyce

    2014-01-01

    Height growth data were assembled from 10 Pinus ponderosa and 17 Pseudotsuga menziesii provenance tests. Data from the disparate studies were scaled according to climate similarities of the provenances to provide single datasets for 781 P. ponderosa and 1193 P. menziesii populations. Mixed effects models were used for two sub-specific varieties of each species to...

  10. Associating uncertainty with datasets using Linked Data and allowing propagation via provenance chains

    NASA Astrophysics Data System (ADS)

    Car, Nicholas; Cox, Simon; Fitch, Peter

    2015-04-01

    With earth-science datasets increasingly being published to enable re-use in projects disassociated from the original data acquisition or generation, there is an urgent need for associated metadata to be connected, in order to guide their application. In particular, provenance traces should support the evaluation of data quality and reliability. However, while standards for describing provenance are emerging (e.g. PROV-O), these do not include the necessary statistical descriptors and confidence assessments. UncertML has a mature conceptual model that may be used to record uncertainty metadata. However, by itself UncertML does not support the representation of uncertainty of multi-part datasets, and provides no direct way of associating the uncertainty information (metadata in relation to a dataset) with dataset objects. We present a method to address both these issues by combining UncertML with PROV-O, and delivering the resulting uncertainty-enriched provenance traces through the Linked Data API. UncertProv extends the PROV-O provenance ontology with an RDF formulation of the UncertML conceptual model elements, adds further elements to support uncertainty representation without a conceptual model, and supports the integration of UncertML through links to documents. The Linked Data API provides a systematic way of navigating from dataset objects to their UncertProv metadata and back again. The Linked Data API's 'views' capability enables access to UncertML and non-UncertML uncertainty metadata representations for a dataset. With this approach, it is possible to access and navigate the uncertainty metadata associated with a published dataset using standard semantic web tools, such as SPARQL queries. Where the uncertainty data follow the UncertML model, they can be automatically interpreted and may also support automatic uncertainty propagation. Repositories wishing to enable uncertainty propagation for all datasets must ensure that all elements that are associated with uncertainty (PROV-O Entity and Activity classes) have UncertML elements recorded. This methodology is intentionally flexible, allowing uncertainty metadata in many forms, not limited to UncertML. While a more formal representation of uncertainty metadata is desirable (using UncertProv elements to implement the UncertML conceptual model), this will not always be possible, and any uncertainty data stored will be better than none. Since the UncertProv ontology contains a superset of UncertML elements to facilitate the representation of non-UncertML uncertainty data, it could easily be extended to include other formal uncertainty conceptual models, thus allowing non-UncertML propagation calculations.

  11. PyMVPA: A python toolbox for multivariate pattern analysis of fMRI data.

    PubMed

    Hanke, Michael; Halchenko, Yaroslav O; Sederberg, Per B; Hanson, Stephen José; Haxby, James V; Pollmann, Stefan

    2009-01-01

    Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability.
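
    A classifier-based decoding analysis of the kind PyMVPA enables can be sketched with scikit-learn on synthetic data (PyMVPA's own API differs; this only illustrates the multivariate pattern-classification idea):

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        n_trials, n_voxels = 80, 500
        X = np.random.randn(n_trials, n_voxels)       # trial-wise BOLD patterns
        y = np.repeat([0, 1], n_trials // 2)          # two cognitive states
        X[y == 1, :20] += 0.5                         # weak multivariate signal

        acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
        print("decoding accuracy: %.2f +/- %.2f" % (acc.mean(), acc.std()))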

  12. PyMVPA: A Python toolbox for multivariate pattern analysis of fMRI data

    PubMed Central

    Hanke, Michael; Halchenko, Yaroslav O.; Sederberg, Per B.; Hanson, Stephen José; Haxby, James V.; Pollmann, Stefan

    2009-01-01

    Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine-learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability. PMID:19184561

  13. Enabling data-driven provenance in NetCDF, via OGC WPS operations. Climate Analysis services use case.

    NASA Astrophysics Data System (ADS)

    Mihajlovski, A.; Spinuso, A.; Plieger, M.; Som de Cerff, W.

    2016-12-01

    Modern climate analysis platforms provide generic and standardized ways of accessing data and processing services, typically supported by a wide range of OGC formats and interfaces. However, the problem of instrumentally tracing the lineage of the transformations applied to a dataset, and thus its provenance, remains an open challenge. It requires standards-driven and interoperable solutions to facilitate understanding and the sharing of self-describing data products, fostering collaboration among peers. The CLIPC portal provided us with a real use case in which the need for instrumented provenance management is fundamental. CLIPC provides a single point of access for scientific information on climate change. The data about the physical environment which are used to inform climate change policy and adaptation measures come from several categories: satellite measurements, terrestrial observing systems, model projections and simulations, and re-analyses. This is made possible through the Copernicus Earth Observation Programme for Europe. With a backbone combining WPS and OPeNDAP services, CLIPC has two themes: 1. Harmonized access to climate datasets derived from models, observations and re-analyses; 2. A climate impact tool kit to evaluate, rank and aggregate indicators. The climate impact tool kit is realised by orchestrating a number of WPS processes that ingest, normalize and combine NetCDF files. The WPS processes enabling this computation are hosted by the climate4impact portal, which is a more generic climate data-access and processing service. In this context, guaranteeing the validation and reproducibility of results is a clearly stated requirement for improving the quality of the results obtained by the combined analysis. Two core contributions are the enabling of a provenance wrapper around WPS services and the enabling of provenance tracing within the NetCDF format, adopting and extending the W3C PROV model. To disseminate indicator data and create transformed data products, a standardized provenance, metadata and processing infrastructure is being researched for CLIPC. These efforts will lead towards the provision of tools for further web service processing development and optimisation, opening up possibilities to scale and to administer abstract user- and data-driven workflows.
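
    Embedding a provenance trace in a NetCDF product can be sketched with the netCDF4 library; the attribute name and trace fields below are illustrative assumptions, not the CLIPC/PROV encoding:

        import json
        from netCDF4 import Dataset

        # create a toy product file; in practice the WPS output would be opened
        # in append ("a") mode instead
        with Dataset("indicator.nc", "w") as nc:
            trace = {
                "activity": "wps:ClimateIndicatorCombine",   # hypothetical process id
                "used": ["tasmax_EUR-11_day.nc", "mask_EUR-11.nc"],
                "agent": "clipc-portal",
            }
            nc.setncattr("prov_trace", json.dumps(trace))    # PROV-style lineage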

  14. Establishing a range-wide provenance test in valley oak (Quercus lobata Née) at two California sites

    Treesearch

    Annette Delfino-Mix; Jessica W. Wright; Paul F. Gugger; Christina Liang; Victoria L. Sork

    2015-01-01

    We present the methods used to establish a provenance test in valley oak, Quercus lobata. Nearly 11,000 acorns were planted and 88 percent of those germinated. The resulting seedlings were measured after 1 and 2 years of growth, and were outplanted in the field in the winter of 2014-2015. This test represents a long-term resource for both research...

  15. Targeted profiling of hydrophilic constituents of royal jelly by hydrophilic interaction liquid chromatography-tandem mass spectrometry.

    PubMed

    Pina, Athanasia; Begou, Olga; Kanelis, Dimitris; Gika, Helen; Kalogiannis, Stavros; Tananaki, Chrysoula; Theodoridis, Georgios; Zotou, Anastasia

    2018-01-05

    In the present work a Hydrophilic Interaction Liquid Chromatography-tandem Mass Spectrometry (HILIC-MS/MS) method was developed for the efficient separation and quantification of a large number of small polar bioactive molecules in Royal Jelly. The method was validated and provided satisfactory detection sensitivity for 88 components. Quantification was proven to be precise for 64 components exhibiting good linearity, recoveries R% >90% for the majority of analytes and intra- and inter-day precision from 0.14 to 20% RSD. Analysis of 125 fresh royal jelly samples of Greek origin provided useful information on royal jelly's hydrophilic bioactive components revealing lysine, ribose, proline, melezitose and glutamic acid to be in high abundance. In addition the occurrence of 18 hydrophilic nutrients which have not been reported previously as royal jelly constituents is shown. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Cancer Detection and Diagnosis Methods - Annual Plan

    Cancer.gov

    Early cancer detection is a proven life-saving strategy. Learn about the research opportunities NCI supports, including liquid biopsies and other less-invasive methods, for detecting early cancers and precancerous growths.

  17. ProphTools: general prioritization tools for heterogeneous biological networks.

    PubMed

    Navarro, Carmen; Martínez, Victor; Blanco, Armando; Cano, Carlos

    2017-12-01

    Networks have been proven effective representations for the analysis of biological data. As such, there exist multiple methods to extract knowledge from biological networks. However, these approaches usually limit their scope to a single biological entity type of interest or they lack the flexibility to analyze user-defined data. We developed ProphTools, a flexible open-source command-line tool that performs prioritization on a heterogeneous network. ProphTools prioritization combines a Flow Propagation algorithm similar to a Random Walk with Restarts and a weighted propagation method. A flexible model for the representation of a heterogeneous network allows the user to define a prioritization problem involving an arbitrary number of entity types and their interconnections. Furthermore, ProphTools provides functionality to perform cross-validation tests, allowing users to select the best network configuration for a given problem. ProphTools core prioritization methodology has already been proven effective in gene-disease prioritization and drug repositioning. Here we make ProphTools available to the scientific community as flexible, open-source software and perform a new proof-of-concept case study on long noncoding RNAs (lncRNAs) to disease prioritization. ProphTools is robust prioritization software that provides the flexibility not present in other state-of-the-art network analysis approaches, enabling researchers to perform prioritization tasks on any user-defined heterogeneous network. Furthermore, the application to lncRNA-disease prioritization shows that ProphTools can reach the performance levels of ad hoc prioritization tools without losing its generality. © The Authors 2017. Published by Oxford University Press.
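
    The flow-propagation core, similar to a random walk with restarts, can be sketched in a few lines of NumPy (the normalization, restart probability, and toy graph are illustrative choices, not ProphTools' defaults):

        import numpy as np

        def rwr(adjacency, seed_idx, restart=0.3, tol=1e-8):
            """Steady-state visiting probabilities from a set of seed nodes."""
            # column-stochastic transition matrix (assumes no isolated nodes)
            A = adjacency / adjacency.sum(axis=0, keepdims=True)
            p0 = np.zeros(A.shape[0])
            p0[list(seed_idx)] = 1.0 / len(seed_idx)
            p = p0.copy()
            while True:
                p_next = (1 - restart) * A @ p + restart * p0
                if np.abs(p_next - p).sum() < tol:
                    return p_next
                p = p_next

        A = np.array([[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]], float)
        print(rwr(A, seed_idx=[0]))   # ranks nodes by proximity to node 0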

  18. Survival and growth patterns of white spruce (Picea glauca [Moench] Voss) rangewide provenances and their implications for climate change adaptation

    PubMed Central

    Lu, Pengxin; Parker, William H; Cherry, Marilyn; Colombo, Steve; Parker, William C; Man, Rongzhou; Roubal, Ngaire

    2014-01-01

    Intraspecific assisted migration (ISAM) through seed transfer during artificial forest regeneration has been suggested as an adaptation strategy to enhance forest resilience and productivity under future climate. In this study, we assessed the risks and benefits of ISAM in white spruce based on long-term and multilocation, rangewide provenance test data. Our results indicate that the adaptive capacity and growth potential of white spruce varied considerably among 245 range-wide provenances sampled across North America; however, the results revealed that local populations could be outperformed by nonlocal ones. Provenances originating from south-central Ontario and southwestern Québec, Canada, close to the southern edge of the species' natural distribution, demonstrated superior growth in more northerly environments compared with local populations and performed much better than populations from western Canada and Alaska, United States. During the 19–28 years between planting and measurement, the southern provenances have not been more susceptible to freezing damage compared with local populations, indicating they have the potential to be used now for the reforestation of more northerly planting sites; based on changing temperature, these seed sources potentially could maintain or increase white spruce productivity at or above historical levels at northern sites. A universal response function (URF), which uses climatic variables to predict provenance performance across field trials, indicated a relatively weak relationship between provenance performance and the climate at provenance origin. Consequently, the URF from this study did not provide information useful to ISAM. The ecological and economic importance of conserving white spruce genetic resources in south-central Ontario and southwestern Québec for use in ISAM is discussed. PMID:25360273

  19. Survival and growth patterns of white spruce (Picea glauca [Moench] Voss) rangewide provenances and their implications for climate change adaptation.

    PubMed

    Lu, Pengxin; Parker, William H; Cherry, Marilyn; Colombo, Steve; Parker, William C; Man, Rongzhou; Roubal, Ngaire

    2014-06-01

    Intraspecific assisted migration (ISAM) through seed transfer during artificial forest regeneration has been suggested as an adaptation strategy to enhance forest resilience and productivity under future climate. In this study, we assessed the risks and benefits of ISAM in white spruce based on long-term and multilocation, rangewide provenance test data. Our results indicate that the adaptive capacity and growth potential of white spruce varied considerably among 245 range-wide provenances sampled across North America; however, the results revealed that local populations could be outperformed by nonlocal ones. Provenances originating from south-central Ontario and southwestern Québec, Canada, close to the southern edge of the species' natural distribution, demonstrated superior growth in more northerly environments compared with local populations and performed much better than populations from western Canada and Alaska, United States. During the 19-28 years between planting and measurement, the southern provenances have not been more susceptible to freezing damage compared with local populations, indicating they have the potential to be used now for the reforestation of more northerly planting sites; based on changing temperature, these seed sources potentially could maintain or increase white spruce productivity at or above historical levels at northern sites. A universal response function (URF), which uses climatic variables to predict provenance performance across field trials, indicated a relatively weak relationship between provenance performance and the climate at provenance origin. Consequently, the URF from this study did not provide information useful to ISAM. The ecological and economic importance of conserving white spruce genetic resources in south-central Ontario and southwestern Québec for use in ISAM is discussed.

  20. Community Landscapes: An Integrative Approach to Determine Overlapping Network Module Hierarchy, Identify Key Nodes and Predict Network Dynamics

    PubMed Central

    Kovács, István A.; Palotai, Robin; Szalay, Máté S.; Csermely, Peter

    2010-01-01

    Background Network communities help the functional organization and evolution of complex networks. However, the development of a method that is both fast and accurate, and that provides modular overlaps and partitions of a heterogeneous network, has proven rather difficult. Methodology/Principal Findings Here we introduce the novel concept of ModuLand, an integrative method family determining overlapping network modules as hills of an influence function-based, centrality-type community landscape, and including several widely used modularization methods as special cases. As various adaptations of the method family, we developed several algorithms, which provide an efficient analysis of weighted and directed networks, and (1) determine pervasively overlapping modules with high resolution; (2) uncover a detailed hierarchical network structure allowing an efficient, zoom-in analysis of large networks; (3) allow the determination of key network nodes and (4) help to predict network dynamics. Conclusions/Significance The concept opens a wide range of possibilities to develop new approaches and applications including network routing, classification, comparison and prediction. PMID:20824084

  1. Comparative analysis of methods for extracting vessel network on breast MRI images

    NASA Astrophysics Data System (ADS)

    Gaizer, Bence T.; Vassiou, Katerina G.; Lavdas, Eleftherios; Arvanitis, Dimitrios L.; Fezoulidis, Ioannis V.; Glotsos, Dimitris T.

    2017-11-01

    Digital processing of MRI images aims to provide an automated diagnostic evaluation of regular health screenings. Cancerous lesions have been shown to alter the vessel structure of the diseased organ. Currently there are several methods used for extraction of the vessel network in order to quantify its properties. In this work MRI images (Signa HDx 3.0T, GE Healthcare, courtesy of University Hospital of Larissa) of 30 female breasts were subjected to three different vessel extraction algorithms to determine the location of their vascular network. The first method is an experiment to build a graph over known points of the vessel network; the second algorithm aims to determine the direction and diameter of vessels at these points; the third approach is a seed growing algorithm, spreading selection to neighbors of the known vessel pixels. The possibilities shown by the different methods were analyzed, and quantitative measurements were performed. The data provided by these measurements showed no clear correlation with the presence or malignancy of tumors, based on the radiological diagnosis of skilled physicians.
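
    The third, seed-growing approach lends itself to a compact sketch: starting from pixels known to lie on a vessel, the selection spreads to neighboring pixels whose intensity stays close to the seed region. The code below is an illustrative reconstruction under that generic description, not the authors' implementation; the image, seed list, and tolerance are hypothetical.

        from collections import deque
        import numpy as np

        def grow_vessels(image, seeds, tol=0.05):
            """Spread the selection from seed pixels to 4-connected
            neighbors whose intensity is within `tol` of the seed mean."""
            mask = np.zeros(image.shape, dtype=bool)
            queue = deque(seeds)
            for s in seeds:
                mask[s] = True
            ref = np.mean([image[s] for s in seeds])  # reference intensity
            while queue:
                r, c = queue.popleft()
                for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < image.shape[0] and 0 <= nc < image.shape[1]
                            and not mask[nr, nc]
                            and abs(image[nr, nc] - ref) <= tol):
                        mask[nr, nc] = True
                        queue.append((nr, nc))
            return mask

        # hypothetical usage on a synthetic intensity image
        img = np.random.default_rng(0).random((64, 64))
        vessel_mask = grow_vessels(img, seeds=[(32, 32)])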

  2. Translational Neuromodulation: Approximating Human Transcranial Magnetic Stimulation Protocols In Rats

    PubMed Central

    Vahabzadeh-Hagh, Andrew M.; Muller, Paul A.; Gersner, Roman; Zangen, Abraham; Rotenberg, Alexander

    2015-01-01

    Objective Transcranial magnetic stimulation (TMS) is a well-established clinical protocol with numerous potential therapeutic and diagnostic applications. Yet, much work remains in the elucidation of TMS mechanisms, optimization of protocols, and in development of novel therapeutic applications. As with many technologies, the key to these issues lies in the proper experimentation and translation of TMS methods to animal models, among which rat models have proven popular. A significant increase in the number of rat TMS publications has necessitated analysis of their relevance to human work. We therefore review the essential principles necessary for the approximation of human TMS protocols in rats as well as specific methods that addressed these issues in published studies. Materials and Methods We performed an English language literature search combined with our own experience and data. We address issues that we see as important in the translation of human TMS methods to rat models and provide a summary of key accomplishments in these areas. Results An extensive literature review illustrated the growth of rodent TMS studies in recent years. Current advances in the translation of single, paired-pulse, and repetitive stimulation paradigms to rodent models are presented. The importance of TMS in the generation of data for preclinical trials is also highlighted. Conclusions Rat TMS has several limitations when considering parallels between animal and human stimulation. However, it has proven to be a useful tool in the field of translational brain stimulation and will likely continue to aid in the design and implementation of stimulation protocols for therapeutic and diagnostic applications. PMID:22780329

  3. Application of ITS2 metabarcoding to determine the provenance of pollen collected by honey bees in an agroecosystem1

    PubMed Central

    Richardson, Rodney T.; Lin, Chia-Hua; Sponsler, Douglas B.; Quijia, Juan O.; Goodell, Karen; Johnson, Reed M.

    2015-01-01

    • Premise of the study: Melissopalynology, the identification of bee-collected pollen, provides insight into the flowers exploited by foraging bees. Information provided by melissopalynology could guide floral enrichment efforts aimed at supporting pollinators, but it has rarely been used because traditional methods of pollen identification are laborious and require expert knowledge. We approach melissopalynology in a novel way, employing a molecular method to study the pollen foraging of honey bees (Apis mellifera) in a landscape dominated by field crops, and compare these results to those obtained by microscopic melissopalynology. • Methods: Pollen was collected from honey bee colonies in Madison County, Ohio, USA, during a two-week period in midspring and identified using microscopic methods and ITS2 metabarcoding. • Results: Metabarcoding identified 19 plant families and exhibited sensitivity for identifying the taxa present in large and diverse pollen samples relative to microscopy, which identified eight families. The bulk of pollen collected by honey bees was from trees (Sapindaceae, Oleaceae, and Rosaceae), although dandelion (Taraxacum officinale) and mustard (Brassicaceae) pollen were also abundant. • Discussion: For quantitative analysis of pollen, using both metabarcoding and microscopic identification is superior to either individual method. For qualitative analysis, ITS2 metabarcoding is superior, providing heightened sensitivity and genus-level resolution. PMID:25606352

  4. DESIGN MANUAL: PHOSPHORUS REMOVAL

    EPA Science Inventory

    This manual summarizes process design information for the best developed methods for removing phosphorus from wastewater. It discusses several proven phosphorus removal methods, including phosphorus removal obtainable through biological activity as well as chemical precip...

  5. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  6. Isotope ratio mass spectrometry in combination with chemometrics for characterization of geographical origin and agronomic practices of table grape.

    PubMed

    Longobardi, Francesco; Casiello, Grazia; Centonze, Valentina; Catucci, Lucia; Agostiano, Angela

    2017-08-01

    Although table grape is one of the most cultivated and consumed fruits worldwide, no study has been reported on its geographical origin or agronomic practice based on stable isotope ratios. This study aimed to evaluate the usefulness of isotopic ratios (i.e. ²H/¹H, ¹³C/¹²C, ¹⁵N/¹⁴N and ¹⁸O/¹⁶O) as possible markers to discriminate the agronomic practice (conventional versus organic farming) and provenance of table grape. In order to quantitatively evaluate which of the isotopic variables were more discriminating, a t test was carried out, in light of which only δ¹³C and δ¹⁸O provided statistically significant differences (P ≤ 0.05) for the discrimination of geographical origin and farming method. Principal component analysis (PCA) showed no good separation of samples differing in geographical area and agronomic practice; thus, for classification purposes, supervised approaches were carried out. In particular, general discriminant analysis (GDA) was used, resulting in prediction abilities of 75.0 and 92.2% for the discrimination of farming method and origin respectively. The present findings suggest that stable isotopes (i.e. δ¹⁸O, δ²H and δ¹³C) combined with chemometrics can be successfully applied to discriminate the provenance of table grape. However, the use of bulk nitrogen isotopes was not effective for farming method discrimination. © 2016 Society of Chemical Industry.
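
    The screening-then-classification workflow described above can be sketched in a few lines: test each isotopic variable for group differences, then train a supervised classifier on the variables that pass. The sketch below uses synthetic delta values, and scikit-learn's LinearDiscriminantAnalysis stands in for the GDA used in the study; all numbers are invented for illustration.

        import numpy as np
        from scipy.stats import ttest_ind
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # synthetic delta values (d2H, d13C, d15N, d18O) for two origins
        X_a = rng.normal([-60.0, -26.0, 3.0, 4.0], 1.0, size=(30, 4))
        X_b = rng.normal([-55.0, -24.5, 3.2, 6.0], 1.0, size=(30, 4))
        X, y = np.vstack([X_a, X_b]), np.repeat([0, 1], 30)

        # keep only isotopic variables with P <= 0.05 between groups
        keep = [j for j in range(X.shape[1])
                if ttest_ind(X_a[:, j], X_b[:, j]).pvalue <= 0.05]

        # supervised classification on the retained variables
        acc = cross_val_score(LinearDiscriminantAnalysis(),
                              X[:, keep], y, cv=5).mean()
        print(f"retained variables: {keep}, prediction ability: {acc:.1%}")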

  7. Bridging the provenance gap: opportunities and challenges tracking in and ex silico provenance in sUAS workflows

    NASA Astrophysics Data System (ADS)

    Thomer, A.

    2017-12-01

    Data provenance - the record of the varied processes that went into the creation of a dataset, as well as the relationships between resulting data objects - is necessary to support the reusability, reproducibility and reliability of earth science data. In sUAS-based research, capturing provenance can be particularly challenging because of the breadth and distributed nature of the many platforms used to collect, process and analyze data. In any given project, multiple drones, controllers, computers, software systems, sensors, cameras, image processing algorithms and data processing workflows are used over sometimes long periods of time. These platforms and processing result in dozens - if not hundreds - of data products in varying stages of readiness-for-analysis and sharing. Provenance tracking mechanisms are needed to make the relationships between these many data products explicit, and therefore more reusable and shareable. In this talk, I discuss opportunities and challenges in tracking provenance in sUAS-based research, and identify gaps in current workflow-capture technologies. I draw on prior work conducted as part of the IMLS-funded Site-Based Data Curation project, in which we developed methods of documenting in and ex silico (that is, computational and non-computational) workflows, and demonstrate this approach's applicability to research with sUASes. I conclude with a discussion of ontologies and other semantic technologies that have potential application in sUAS research.

  8. Connecting Provenance with Semantic Descriptions in the NASA Earth Exchange (NEX)

    NASA Astrophysics Data System (ADS)

    Votava, P.; Michaelis, A.; Nemani, R. R.

    2012-12-01

    NASA Earth Exchange (NEX) is a data, modeling and knowledge collaboratory that houses NASA satellite data, climate data and ancillary data where a focused community may come together to share modeling and analysis codes, scientific results, knowledge and expertise on a centralized platform. Some of the main goals of NEX are transparency and repeatability, and to that extent we have been adding components that enable tracking of provenance of both scientific processes and datasets produced by these processes. As scientific processes become more complex, they are often developed collaboratively and it becomes increasingly important for the research team to be able to track the development of the process and the datasets that are produced along the way. Additionally, we want to be able to link the processes and the datasets developed on NEX to existing information and knowledge, so that the users can query and compare the provenance of any dataset or process with regard to the component-specific attributes such as data quality, geographic location, related publications, user comments and annotations etc. We have developed several ontologies that describe datasets and workflow components available on NEX using the OWL ontology language, as well as a simple ontology that provides a linking mechanism to the collected provenance information. The provenance is captured in two ways - we utilize the existing provenance infrastructure of VisTrails, which is used as a workflow engine on NEX, and we extend the captured provenance using the PROV data model expressed through the PROV-O ontology. We do this in order to more easily link and query the provenance in the context of the existing NEX information and knowledge. The captured provenance graph is processed and stored using RDFlib with a MySQL backend that can be queried using either RDFLib or SPARQL. As a concrete example, we show how this information is captured during an anomaly detection process in large satellite datasets.
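
    As a hedged illustration of the PROV-O plus RDFLib approach the abstract describes, the sketch below builds a tiny provenance graph for a hypothetical anomaly-detection run and queries it with SPARQL. The NEX-style URIs are invented, and an in-memory RDFLib graph is used here in place of the MySQL-backed store.

        from rdflib import Graph, Namespace

        PROV = Namespace("http://www.w3.org/ns/prov#")
        EX = Namespace("http://example.org/nex/")  # hypothetical namespace

        g = Graph()
        g.bind("prov", PROV)

        # a product generated by an anomaly-detection run (illustrative URIs)
        g.add((EX.anomaly_map_2012, PROV.wasGeneratedBy, EX.detection_run_42))
        g.add((EX.detection_run_42, PROV.used, EX.modis_lst_2000_2012))
        g.add((EX.detection_run_42, PROV.wasAssociatedWith, EX.vistrails_workflow))

        # SPARQL query: which source datasets does a product derive from?
        q = """
        SELECT ?src WHERE {
            ?product prov:wasGeneratedBy ?activity .
            ?activity prov:used ?src .
        }
        """
        for row in g.query(q, initNs={"prov": PROV}):
            print(row.src)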

  9. Provenance analysis of the Voirons Flysch (Gurnigel nappe, Haute-Savoie, France): stratigraphic and palaeogeographic implications

    NASA Astrophysics Data System (ADS)

    Ragusa, Jérémy; Kindler, Pascal; Segvic, Branimir; Ospina-Ostios, Lina Maria

    2017-04-01

    The Chablais Prealps (Haute-Savoie, France) represent a well-preserved accretionary wedge of the Western Alpine Tethys. They comprise a stack of sedimentary nappes related to palaeogeographic realms ranging from the Ultrahelvetic to the Southern Penninic. The provenance analysis is based on the Gazzi-Dickinson method and on QEMSCAN® for heavy minerals. The Quartzose petrofacies is the more important of the two sources, and supplied three of the four formations of the Voirons Flysch. It is similar to the sources that fed the other flyschs from the Gurnigel nappe. It is characterised by a mature, quartz-rich assemblage and a heavy-mineral population dominated by apatite and the zircon-tourmaline-rutile mineral group. These observations suggest a clastic wedge provenance. The Feldspathic petrofacies is derived from a feldspar-rich source associated with metamorphic clasts and a heavy-mineral population dominated by garnet. This provenance characterises only one formation of the Voirons Flysch, and is related to the axial belt provenance. This provenance analysis shows that the Middle Eocene to Early Oligocene Voirons Flysch was fed by two sources, in contrast to the other flyschs of the Gurnigel nappe, and further suggests that this flysch was not deposited in the Piemont Ocean but in the Valais domain. Based on the results and comparative provenance analysis with the other flyschs of the Gurnigel nappe, we propose a generic feeding model which involves the Sesia-Dent Blanche nappe, the sedimentary nappes incorporated in the accretionary prism, and probably the Briançonnais basement.

  10. Application of photo-detection to art and archaeology at the C2RMF

    NASA Astrophysics Data System (ADS)

    Calligaro, T.; Dran, J.-C.; Klein, M.

    2003-05-01

    The Centre for research and restoration of the museums of France (C2RMF), located in the Louvre palace in Paris, routinely uses photodetector-based techniques for the study of objects of cultural heritage. Among these methods, the ion beam analysis techniques (IBA) provided by the 2-MV electrostatic accelerator "AGLAE" installed in the C2RMF have the specific qualities required for the study of these valuable objects. Indeed, PIXE and PIGE are non-destructive, non-invasive, rapid and sensitive tools for the determination of the chemical composition. Their use makes it possible to answer three major questions in the field of Art and Archaeology: (1) identification of the material, (2) determination of the provenance, and (3) study of surface modification (ageing, alteration). Applications of radiation detectors are exemplified through case studies performed at the Centre: the identification of the pigments used on an Egyptian papyrus, the provenance of gemstones set on ancient jewels and the indirect dating of archaeological flints. New trends in the use of photo-detectors in Art and Archaeology are presented.

  11. Effects of obesity surgery on non-insulin-dependent diabetes mellitus.

    PubMed

    Greenway, Scott E; Greenway, Frank L; Klein, Stanley

    2002-10-01

    Most individuals who have non-insulin-dependent diabetes mellitus are obese. The obese population has proved a frustrating entity regarding weight loss and diabetes control. Results of medical weight loss programs, medications, and behavior therapy have proved disappointing. Bariatric surgery is the most effective method of diabetes management and cure in the morbidly obese population. Surgical procedures to cause malabsorption provide a more dramatic effect on diabetes owing to the imparted bypass of the hormonally active foregut. The evidence reviewed comprises pertinent journal articles spanning the last 40 years, as well as textbooks. Bariatric surgical procedures have proven a much more successful method of weight loss and diabetes control in the obese population than conservative methods. These surgical procedures have proven safe with reported mortality rates of 0% to 1.5%. Bariatric operations may be divided based on the method of weight loss and effect on diabetes. The first category is restrictive and includes vertical banded gastroplasty and adjustable silicone gastric banding. These operations improve diabetes by decreasing food intake and body weight with a slowing of gastric emptying. The second category not only contains restrictive components but also elements of malabsorption. This category includes the Roux-en-Y gastric bypass and biliary-pancreatic diversion, which bypass the foregut. Although all of the surgical procedures for obesity offer improved weight loss and diabetes control compared with conservative methods, the Roux-en-Y gastric bypass and biliary-pancreatic diversion offer superior weight loss and resolution of diabetes. The more dramatic effect seen in the surgical procedures to cause malabsorption is likely secondary to the bypass of the foregut resulting in increased weight loss and elevation of the enteroglucagon level.

  12. Reporting on the Strategies Needed to Implement Proven Interventions: An Example From a "Real-World" Cross-Setting Implementation Study.

    PubMed

    Gold, Rachel; Bunce, Arwen E; Cohen, Deborah J; Hollombe, Celine; Nelson, Christine A; Proctor, Enola K; Pope, Jill A; DeVoe, Jennifer E

    2016-08-01

    The objective of this study was to empirically demonstrate the use of a new framework for describing the strategies used to implement quality improvement interventions and provide an example that others may follow. Implementation strategies are the specific approaches, methods, structures, and resources used to introduce and encourage uptake of a given intervention's components. Such strategies have not been regularly reported in descriptions of interventions' effectiveness, or in assessments of how proven interventions are implemented in new settings. This lack of reporting may hinder efforts to successfully translate effective interventions into "real-world" practice. A recently published framework was designed to standardize reporting on implementation strategies in the implementation science literature. We applied this framework to describe the strategies used to implement a single intervention in its original commercial care setting, and when implemented in community health centers from September 2010 through May 2015. Per this framework, the target (clinic staff) and outcome (prescribing rates) remained the same across settings; the actor, action, temporality, and dose were adapted to fit local context. The framework proved helpful in articulating which of the implementation strategies were kept constant and which were tailored to fit diverse settings, and simplified our reporting of their effects. Researchers should consider consistently reporting this information, which could be crucial to the success or failure of implementing proven interventions effectively across diverse care settings. clinicaltrials.gov Identifier: NCT02299791. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  13. Currently available methodologies for the processing of intravascular ultrasound and optical coherence tomography images.

    PubMed

    Athanasiou, Lambros; Sakellarios, Antonis I; Bourantas, Christos V; Tsirka, Georgia; Siogkas, Panagiotis; Exarchos, Themis P; Naka, Katerina K; Michalis, Lampros K; Fotiadis, Dimitrios I

    2014-07-01

    Optical coherence tomography and intravascular ultrasound are the intravascular imaging methodologies most widely used in clinical practice, as they provide high resolution cross-sectional images that allow comprehensive visualization of the lumen and plaque morphology. Several methods have been developed in recent years to process the output of these imaging modalities, which allow fast, reliable and reproducible detection of the luminal borders and characterization of plaque composition. These methods have proven useful in the study of the atherosclerotic process as they have facilitated analysis of a vast amount of data. This review presents currently available intravascular ultrasound and optical coherence tomography processing methodologies for segmenting and characterizing the plaque area, highlighting their advantages and disadvantages, and discusses the future trends in intravascular imaging.

  14. Innovative use of technologies and methods to redesign care: the problem of care transitions.

    PubMed

    Richman, Mark; Sklaroff, Laura Myerchin; Hoang, Khathy; Wasson, Elijah; Gross-Schulman, Sandra

    2014-01-01

    Organizations are redesigning models of care in today's rapidly changing health care environment. Using proven innovation techniques maximizes the likelihood of effective change. Our safety-net hospital aims to reduce high emergency department visit, admission, and readmission rates, key components of health care cost control. Twenty-five clinical stakeholders participated in mixed-methods innovation exercises to understand stakeholders, frame problems, and explore solutions. We identified existing barriers and means to improve post-emergency department/post-inpatient discharge care coordination/communication among patient-centered medical home care team members, including patients. Physicians and staff preferred automated e-mail notifications, including patient identifiers, medical home/primary care provider information, and relevant clinical documentation, to improve communication efficiency/efficacy.

  15. Aerodynamic Shape Optimization Using Hybridized Differential Evolution

    NASA Technical Reports Server (NTRS)

    Madavan, Nateri K.

    2003-01-01

    An aerodynamic shape optimization method that uses an evolutionary algorithm known as Differential Evolution (DE) in conjunction with various hybridization strategies is described. DE is a simple and robust evolutionary strategy that has been proven effective in determining the global optimum for several difficult optimization problems. Various hybridization strategies for DE are explored, including the use of neural networks as well as traditional local search methods. A Navier-Stokes solver is used to evaluate the various intermediate designs and provide inputs to the hybrid DE optimizer. The method is implemented on distributed parallel computers so that new designs can be obtained within reasonable turnaround times. Results are presented for the inverse design of a turbine airfoil from a modern jet engine. (The final paper will include at least one other aerodynamic design application). The capability of the method to search large design spaces and obtain the optimal airfoils in an automatic fashion is demonstrated.
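
    Differential Evolution itself is available off the shelf, so the outer loop of such a design study can be sketched directly. A minimal sketch, using SciPy's differential_evolution with the Rosenbrock test function as a stand-in for the flow-solver objective (in the paper, each candidate design is evaluated by a Navier-Stokes solver); polish=True appends a local search step, loosely analogous to the hybridization discussed above.

        import numpy as np
        from scipy.optimize import differential_evolution

        def objective(x):
            """Rosenbrock test function standing in for the aerodynamic
            cost a Navier-Stokes evaluation would assign to design x."""
            return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2
                                + (1.0 - x[:-1]) ** 2))

        bounds = [(-2.0, 2.0)] * 4  # hypothetical design-variable bounds
        result = differential_evolution(objective, bounds, seed=1,
                                        polish=True)  # final local search
        print(result.x, result.fun)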

  16. Innovative Trajectory Designs to meet Exploration Challenges

    NASA Technical Reports Server (NTRS)

    Folta, David C.

    2006-01-01

    This document is a viewgraph presentation of the conference paper. Missions incorporated into NASA's Vision for Space Exploration include many different destinations and regions; are challenging to plan; and need new and innovative trajectory design methods to enable them. By combining proven methods with chaos dynamics, exploration goals that require maximum payload mass or minimum duration can be achieved. The implementation of these innovative methods, such as weak stability boundaries, has altered NASA's approach to meet exploration challenges and is described to show how exploration goals may be met in the next decade. With knowledge that various perturbations play a significant role, the mission designer must rely on both traditional design strategies as well as these innovative methods. Over the past decades, improvements have been made that would at first glance seem dramatic. This paper provides a brief narrative on how a fundamental shift has occurred and how chaos dynamics improve the design of exploration missions with complex constraints.

  17. Determination of Electron Optical Properties for Aperture Zoom Lenses Using an Artificial Neural Network Method.

    PubMed

    Isik, Nimet

    2016-04-01

    Multi-element electrostatic aperture lens systems are widely used to control electron or charged particle beams in many scientific instruments. By means of applied voltages, these lens systems can be operated for different purposes. In this context, numerous methods have been developed to calculate the focal properties of these lenses. In this study, an artificial neural network (ANN) classification method is utilized to determine whether the charged particle beam is focused or unfocused at the image point as a function of lens voltages for multi-element electrostatic aperture lenses. A data set for training and testing of the ANN is taken from the SIMION 8.1 simulation program, a well-known program of proven accuracy in charged particle optics. Mean squared error results of this study indicate that the ANN classification method provides notable performance characteristics for electrostatic aperture zoom lenses.
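
    A minimal sketch of such an ANN classifier, assuming synthetic (lens voltage, focused/unfocused) training pairs in place of the SIMION 8.1 output used in the study; scikit-learn's MLPClassifier stands in for the authors' network, and the focusing rule below is invented for illustration.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(3)
        # synthetic stand-in for SIMION output: two lens voltages per
        # sample, labeled 1 if the beam focuses at the image point
        V = rng.uniform(0.0, 10.0, size=(500, 2))
        focused = ((V[:, 1] - 0.7 * V[:, 0] - 1.5) ** 2 < 1.0).astype(int)

        V_tr, V_te, y_tr, y_te = train_test_split(V, focused, random_state=0)
        ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                            random_state=0).fit(V_tr, y_tr)
        print("test accuracy:", ann.score(V_te, y_te))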

  18. Development of monitoring and diagnostic methods for robots used in remediation of waste sites. 1997 annual progress report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tecza, J.

    1998-06-01

    'Safe and efficient clean up of hazardous and radioactive waste sites throughout the DOE complex will require extensive use of robots. This research effort focuses on developing Monitoring and Diagnostic (M and D) methods for robots that will provide early detection, isolation, and tracking of impending faults before they result in serious failure. The utility and effectiveness of applying M and D methods to hydraulic robots has never been proven. The present research program is utilizing seeded faults in a laboratory test rig that is representative of an existing hydraulically-powered remediation robot. This report summarizes activity conducted in the first 9 months of the project. The research team has analyzed the Rosie Mobile Worksystem as a representative hydraulic robot, developed a test rig for implanted fault testing, developed a test plan and agenda, and established methods for acquiring and analyzing the test data.'

  19. Fault diagnosis of motor bearing with speed fluctuation via angular resampling of transient sound signals

    NASA Astrophysics Data System (ADS)

    Lu, Siliang; Wang, Xiaoxian; He, Qingbo; Liu, Fang; Liu, Yongbin

    2016-12-01

    Transient signal analysis (TSA) has been proven an effective tool for motor bearing fault diagnosis, but has yet to be applied in processing bearing fault signals with variable rotating speed. In this study, a new TSA-based angular resampling (TSAAR) method is proposed for fault diagnosis under speed fluctuation condition via sound signal analysis. By applying the TSAAR method, the frequency smearing phenomenon is eliminated and the fault characteristic frequency is exposed in the envelope spectrum for bearing fault recognition. The TSAAR method can accurately estimate the phase information of the fault-induced impulses using neither complicated time-frequency analysis techniques nor external speed sensors, and hence it provides a simple, flexible, and data-driven approach that realizes variable-speed motor bearing fault diagnosis. The effectiveness and efficiency of the proposed TSAAR method are verified through a series of simulated and experimental case studies.
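
    Two generic building blocks of this kind of analysis, angular resampling onto a uniform shaft-angle grid and an envelope spectrum in the order domain, can be sketched on a simulated signal as below. This is an illustrative reconstruction, not the authors' TSAAR phase estimator; the sampling rate, speed profile, and fault order are invented.

        import numpy as np
        from scipy.signal import hilbert

        fs = 20_000                                   # Hz (hypothetical)
        t = np.arange(0, 2.0, 1 / fs)
        speed = 20 + 5 * np.sin(2 * np.pi * 0.5 * t)  # fluctuating rev/s
        angle = np.cumsum(speed) / fs                 # shaft angle, revs

        # fault impulses locked to shaft angle (3.2 impacts per rev)
        impulses = (0.5 + 0.5 * np.cos(2 * np.pi * 3.2 * angle)) ** 20
        signal = impulses + 0.1 * np.random.default_rng(0).normal(size=t.size)

        # angular resampling: interpolate onto a uniform angle grid
        ua = np.linspace(angle[0], angle[-1], t.size)
        resampled = np.interp(ua, angle, signal)

        # envelope spectrum in the order (per-revolution) domain
        env = np.abs(hilbert(resampled))
        spec = np.abs(np.fft.rfft(env - env.mean()))
        orders = np.fft.rfftfreq(resampled.size, d=ua[1] - ua[0])
        print("dominant order:", orders[spec.argmax()])  # ~3.2 expected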

  20. Introducing a novel gravitation-based high-velocity compaction analysis method for pharmaceutical powders.

    PubMed

    Tanner, Timo; Antikainen, Osmo; Ehlers, Henrik; Yliruusi, Jouko

    2017-06-30

    Modern tableting machines produce large numbers of tablets at high output. Consequently, methods to examine powder compression in a high-velocity setting are in demand. In the present study, a novel gravitation-based method was developed to examine powder compression. A steel bar is dropped on a punch to compress microcrystalline cellulose and starch samples inside the die. The distance of the bar is read by a high-accuracy laser displacement sensor, which provides a reliable distance-time plot for the bar movement. In-die height and density of the compact can be seen directly from these data, which can be examined further to obtain information on velocity, acceleration and energy distribution during compression. The energy consumed in compact formation could also be determined. Despite the high vertical compression speed, the method proved to be cost-efficient, accurate and reproducible. Copyright © 2017 Elsevier B.V. All rights reserved.
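
    The distance-time record from the laser sensor is all that is needed to recover the quantities mentioned above: numerical differentiation gives velocity and acceleration, and the impact velocity gives the kinetic energy available for compact formation. A minimal sketch on a synthetic drop record, assuming an invented bar mass, drop height, and compaction duration:

        import numpy as np

        m, g = 0.75, 9.81                  # kg (hypothetical bar), m/s^2
        t = np.linspace(0.0, 0.325, 6500)  # s, laser sample times
        t0 = np.sqrt(2 * 0.5 / g)          # impact after a 0.5 m free fall
        v0 = g * t0                        # impact velocity, ~3.1 m/s
        tau = 2e-3                         # s, compaction time (invented)

        dt = np.clip(t - t0, 0.0, tau)     # time spent compressing
        h = (0.5 * g * np.minimum(t, t0) ** 2
             + v0 * dt - v0 * dt ** 2 / (2 * tau))

        v = np.gradient(h, t)              # velocity from distance-time plot
        a = np.gradient(v, t)              # deceleration during compression
        energy = 0.5 * m * v.max() ** 2    # kinetic energy lost to compact
        print(f"impact {v.max():.2f} m/s, peak decel {a.min():.0f} m/s^2, "
              f"energy ~ {energy:.2f} J")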

  1. Bond additivity corrections for quantum chemistry methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. F. Melius; M. D. Allendorf

    1999-04-01

    In the 1980's, the authors developed a bond-additivity correction procedure for quantum chemical calculations called BAC-MP4, which has proven reliable in calculating the thermochemical properties of molecular species, including radicals as well as stable closed-shell species. New Bond Additivity Correction (BAC) methods have been developed for the G2 method, BAC-G2, as well as for a hybrid DFT/MP2 method, BAC-Hybrid. These BAC methods use a new form of BAC corrections, involving atomic, molecular, and bond-wise additive terms. These terms enable one to treat positive and negative ions as well as neutrals. The BAC-G2 method reduces errors in the G2 method due to nearest-neighbor bonds. The parameters within the BAC-G2 method only depend on atom types. Thus the BAC-G2 method can be used to determine the parameters needed by BAC methods involving lower levels of theory, such as BAC-Hybrid and BAC-MP4. The BAC-Hybrid method should scale well for large molecules. The BAC-Hybrid method uses the differences between the DFT and MP2 as an indicator of the method's accuracy, while the BAC-G2 method uses its internal methods (G1 and G2MP2) to provide an indicator of its accuracy. Indications of the average error as well as worst cases are provided for each of the BAC methods.
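
    The additive structure of such corrections can be illustrated with a toy sketch: a raw computed heat of formation is adjusted by per-atom and per-bond terms summed over the molecule's inventory. All correction values below are invented placeholders, not the published BAC parameters.

        # toy bond-additivity correction; values are invented, NOT the
        # published BAC-MP4/BAC-G2 parameters
        atom_corr = {"C": 0.4, "H": 0.1, "O": 0.6}        # kcal/mol
        bond_corr = {("C", "H"): -0.3, ("C", "O"): -0.8}  # kcal/mol

        def bac_energy(e_raw, atoms, bonds):
            """Add atomic and bond-wise corrections to a raw
            quantum-chemistry heat of formation (kcal/mol)."""
            corr = sum(atom_corr[a] for a in atoms)
            corr += sum(bond_corr[tuple(sorted(b))] for b in bonds)
            return e_raw + corr

        # formaldehyde (H2CO) as a hypothetical example molecule
        print(bac_energy(-27.7, ["C", "H", "H", "O"],
                         [("C", "H"), ("C", "H"), ("C", "O")]))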

  2. Targeted versus statistical approaches to selecting parameters for modelling sediment provenance

    NASA Astrophysics Data System (ADS)

    Laceby, J. Patrick

    2017-04-01

    One effective field-based approach to modelling sediment provenance is the source fingerprinting technique. Arguably, one of the most important steps for this approach is selecting the appropriate suite of parameters or fingerprints used to model source contributions. Accordingly, approaches to selecting parameters for sediment source fingerprinting will be reviewed. Thereafter, opportunities and limitations of these approaches and some future research directions will be presented. For properties to be effective tracers of sediment, they must discriminate between sources whilst behaving conservatively. Conservative behavior is characterized by constancy in sediment properties, where the properties of sediment sources remain constant, or at the very least, any variation in these properties should occur in a predictable and measurable way. Therefore, properties selected for sediment source fingerprinting should remain constant through sediment detachment, transportation and deposition processes, or vary in a predictable and measurable way. One approach to select conservative properties for sediment source fingerprinting is to identify targeted tracers, such as caesium-137, that provide specific source information (e.g. surface versus subsurface origins). A second approach is to use statistical tests to select an optimal suite of conservative properties capable of modelling sediment provenance. In general, statistical approaches use a combination of a discrimination (e.g. Kruskal-Wallis H-test, Mann-Whitney U-test) and parameter selection statistics (e.g. Discriminant Function Analysis or Principal Component Analysis). The challenge is that modelling sediment provenance is often not straightforward and there is increasing debate in the literature surrounding the most appropriate approach to selecting elements for modelling. Moving forward, it would be beneficial if researchers test their results with multiple modelling approaches, artificial mixtures, and multiple lines of evidence to provide secondary support to their initial modelling results. Indeed, element selection can greatly impact modelling results and having multiple lines of evidence will help provide confidence when modelling sediment provenance.
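
    The two-step statistical selection described above translates directly into code: screen each property with a Kruskal-Wallis H-test across the candidate sources, then fit a discriminant analysis on the survivors. A minimal sketch on synthetic data, with scikit-learn's LinearDiscriminantAnalysis standing in for Discriminant Function Analysis:

        import numpy as np
        from scipy.stats import kruskal
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(7)
        # synthetic geochemical properties for three sediment sources
        means = ([0, 0, 0, 0, 5, 3], [0, 1, 0, 0, 6, 3], [0, 0, 0, 2, 5, 4])
        groups = [rng.normal(m, 1.0, size=(25, 6)) for m in means]
        X, y = np.vstack(groups), np.repeat([0, 1, 2], 25)

        # step 1: keep properties that discriminate sources (P <= 0.05)
        keep = [j for j in range(X.shape[1])
                if kruskal(*[grp[:, j] for grp in groups]).pvalue <= 0.05]

        # step 2: discriminant analysis on the retained properties
        dfa = LinearDiscriminantAnalysis().fit(X[:, keep], y)
        print("retained:", keep, "reclassification:", dfa.score(X[:, keep], y))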

  3. Remedy for Repeated Implant Retained Denture Fracture-A Challenging Case Report

    PubMed Central

    Reddy M, Ramu; Metta, Kiran Kumar; Charry N, Sudheer; B, Chittaranjan

    2014-01-01

    The most common site of fracture in a maxillary or a mandibular complete denture is along an anteroposterior line that coincides with the labial notch in the denture used to provide the frenum relief. Osseointegrated implants have been a boon to patients who are completely edentulous and are not satisfied with the conventional removable complete denture approach. Implant-supported dentures have proven to provide superior retention and support for removable complete dentures. Nevertheless, fracture of the denture bases is a common complication of implant-supported mandibular overlay dentures, especially when the artificial denture opposes natural dentition. This article describes and illustrates a method of reinforcing implant-supported mandibular overdentures to overcome this problem. PMID:25584333

  4. Intraspecies variation in a widely distributed tree species regulates the responses of soil microbiome to different temperature regimes.

    PubMed

    Zhang, Cui-Jing; Delgado-Baquerizo, Manuel; Drake, John E; Reich, Peter B; Tjoelker, Mark G; Tissue, David T; Wang, Jun-Tao; He, Ji-Zheng; Singh, Brajesh K

    2018-04-01

    Plant characteristics in different provenances within a single species may vary in response to climate change, which might alter soil microbial communities and ecosystem functions. We conducted a glasshouse experiment and grew seedlings of three provenances (temperate, subtropical and tropical origins) of a tree species (i.e., Eucalyptus tereticornis) at different growth temperatures (18, 21.5, 25, 28.5, 32 and 35.5°C) for 54 days. At the end of the experiment, bacterial and fungal community composition, diversity and abundance were characterized. Measured soil functions included surrogates of microbial respiration, enzyme activities and nutrient cycling. Using permutational multivariate analysis of variance (PERMANOVA) and network analysis, we found that the identity of tree provenances regulated both structure and function of soil microbiomes. In some cases, tree provenances substantially affected the response of microbial communities to the temperature treatments. For example, we found significant interactions of temperature and tree provenance on bacterial community and relative abundances of Chloroflexi and Zygomycota, and inorganic nitrogen. Microbial abundance was altered in response to increasing temperature, but was not affected by tree provenances. Our study provides novel evidence that even a small variation in biotic components (i.e., intraspecies tree variation) can significantly influence the response of soil microbial community composition and specific soil functions to global warming. © 2018 Society for Applied Microbiology and John Wiley & Sons Ltd.

  5. The NASA welding assessment program

    NASA Technical Reports Server (NTRS)

    Scott-Monck, J.; Bozek, J.

    1984-01-01

    The potential cost and performance advantages of welding were understood but ignored by solar panel manufacturers in the U.S. Although NASA, DOD and COMSAT have supported welding development efforts, soldering remains the only U.S. space-qualified method for interconnecting solar cells. The reason is that no U.S. satellite prime contractor found it necessary, due to mission requirements, to abandon the space-proven soldering process. It appears that the proposed NASA space station program will provide an array requirement, a 10-year operation in a low Earth orbital environment, that mandates welding. The status of welding technology in the U.S. is assessed.

  6. Disease management positively affects patient quality of life.

    PubMed

    Walker, David R; Landis, Darryl L; Stern, Patricia M; Vance, Richard P

    2003-04-01

    Health care costs are spiraling upward. The population of the United States is aging, and many baby boomers will develop multiple chronic health conditions. Disease management is one method for reducing costs associated with chronic health conditions. Although these programs have been proven effective in improving patient health, detailed information about their effect on patient quality of life has been scarce. This article provides preliminary evidence that disease management programs for coronary artery disease, chronic obstructive pulmonary disease, diabetes, and heart failure lead to improved quality of life, which correlates with a healthier, more satisfied, and less costly patient.

  7. Compact high reliability fiber coupled laser diodes for avionics and related applications

    NASA Astrophysics Data System (ADS)

    Daniel, David R.; Richards, Gordon S.; Janssen, Adrian P.; Turley, Stephen E. H.; Stockton, Thomas E.

    1993-04-01

    This paper describes a newly developed compact high reliability fiber coupled laser diode which is capable of providing enhanced performance under extreme environmental conditions including a very wide operating temperature range. Careful choice of package materials to minimize thermal and mechanical stress, used with proven manufacturing methods, has resulted in highly stable coupling of the optical fiber pigtail to a high performance MOCVD-grown Multi-Quantum Well laser chip. Electro-optical characteristics over temperature are described together with a demonstration of device stability over a range of environmental conditions. Real time device lifetime data is also presented.

  8. Generative Models in Deep Learning: Constraints for Galaxy Evolution

    NASA Astrophysics Data System (ADS)

    Turp, Maximilian Dennis; Schawinski, Kevin; Zhang, Ce; Weigel, Anna K.

    2018-01-01

    New techniques are essential to make advances in the field of galaxy evolution. Recent developments in the field of artificial intelligence and machine learning have proven that these tools can be applied to problems far more complex than simple image recognition. We use these purely data-driven approaches to investigate the process of star formation quenching. We show that Variational Autoencoders provide a powerful method to forward model the process of galaxy quenching. Our results imply that simple changes in specific star formation rate and bulge-to-disk ratio cannot fully describe the properties of the quenched population.

  9. Challenges and opportunities of open data in ecology.

    PubMed

    Reichman, O J; Jones, Matthew B; Schildhauer, Mark P

    2011-02-11

    Ecology is a synthetic discipline benefiting from open access to data from the earth, life, and social sciences. Technological challenges exist, however, due to the dispersed and heterogeneous nature of these data. Standardization of methods and development of robust metadata can increase data access but are not sufficient. Reproducibility of analyses is also important, and executable workflows are addressing this issue by capturing data provenance. Sociological challenges, including inadequate rewards for sharing data, must also be resolved. The establishment of well-curated, federated data repositories will provide a means to preserve data while promoting attribution and acknowledgement of its use.

  10. Recent Development of Dual-Dictionary Learning Approach in Medical Image Analysis and Reconstruction.

    PubMed

    Wang, Bigong; Li, Liang

    2015-01-01

    As an implementation of compressive sensing (CS), the dual-dictionary learning (DDL) method provides an effective means of restoring signals from two related dictionaries and a sparse representation. This method has been shown to perform well in medical image reconstruction with highly undersampled data, especially for multimodality imaging like CT-MRI hybrid reconstruction. Because of its outstanding strengths, short signal acquisition time, and low radiation dose, DDL has attracted broad interest in both academic and industrial fields. In this review article, we summarize DDL's development history, survey the latest advances, and discuss its future directions and potential applications in medical imaging. We also point out that DDL is still at an early stage, and further studies are needed to improve the method, especially in dictionary training.

  11. Implementing "lean" principles to improve the efficiency of the endoscopy department of a community hospital: a case study.

    PubMed

    Laing, Karen; Baumgartner, Katherine

    2005-01-01

    Many endoscopy units are looking for ways to improve their efficiency without increasing the number of staff, purchasing additional equipment, or making the patients feel as if they have been rushed through the care process. To accomplish this, a few hospitals have looked to other industries for help. Recently, "lean" methods and tools from the manufacturing industry have been applied successfully in health care systems, and have proven to be an effective way to eliminate waste and redundancy in workplace processes. The "lean" methods and tools in service organizations focus on providing the most efficient and effective flow of services and products. This article will describe the journey of one endoscopy department within a community hospital to illustrate the application of "lean" methods and tools and its results.

  12. Recent Development of Dual-Dictionary Learning Approach in Medical Image Analysis and Reconstruction

    PubMed Central

    Wang, Bigong; Li, Liang

    2015-01-01

    As an implementation of compressive sensing (CS), the dual-dictionary learning (DDL) method provides an effective means of restoring signals from two related dictionaries and a sparse representation. This method has been shown to perform well in medical image reconstruction with highly undersampled data, especially for multimodality imaging like CT-MRI hybrid reconstruction. Because of its outstanding strengths, short signal acquisition time, and low radiation dose, DDL has attracted broad interest in both academic and industrial fields. In this review article, we summarize DDL's development history, survey the latest advances, and discuss its future directions and potential applications in medical imaging. We also point out that DDL is still at an early stage, and further studies are needed to improve the method, especially in dictionary training. PMID:26089956

  13. Integrating prediction, provenance, and optimization into high energy workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schram, M.; Bansal, V.; Friese, R. D.

    We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.

  14. Structural Elements in a Persistent Identifier Infrastructure and Resulting Benefits for the Earth Science Community

    NASA Astrophysics Data System (ADS)

    Weigel, T.; Toussaint, F.; Stockhause, M.; Höck, H.; Kindermann, S.; Lautenschlager, M.; Ludwig, T.

    2012-12-01

    We propose wide adoption of structural elements (typed links, collections, trees) in the Handle System to improve identification and access of scientific data, metadata and software, as well as traceability of data provenance. Typed links target the issue of data provenance as a means to assess the quality of scientific data. Data provenance is seen here as a directed acyclic graph with nodes representing data and edges representing derivative operations (Moreau 2010). Landing pages can allow a human user to explore the provenance graph back to the primary unprocessed data, thereby also giving credit to the original data producer. As in Earth System Modeling no single infrastructure with complete data lifecycle coverage exists, we propose to split the problem domain into two parts. Project-specific infrastructures such as the German project C3-Grid or the Earth System Grid Federation (ESGF) for CMIP5 data are aware of data and data operations (Toussaint et al. 2012) and can thus detect and accumulate single nodes and edges in the provenance graph, assigning Handles to data, metadata and software. With a common schema for typed links, the provenance graph is established as downstream infrastructures reference incoming Handles. Data in this context is for example hierarchically structured Earth System model output data, which receives DataCite DOIs only for the most coarse-granular elements. Using Handle tree structures, the lower levels of the hierarchy can also receive Handles, allowing authors to more precisely identify the data they used (Lawrence et al. 2011). We can e.g. define a DOI for just the 2m-temperature variable of CMIP5 data across many CMIP5 experiments, or a DOI for model and observational data coming from different sources. The structural elements should be implemented through Handle values at the Handle infrastructure level for two reasons. First, Handle values are more durable than downstream websites or databases, and thus the provenance chain does not break if individual links become unavailable. Secondly, a single service cannot interpret links if downstream solutions differ in their implementation schemas. Emerging efforts driven by the European Persistent Identifier Consortium (EPIC) aim to establish a default mechanism for structural elements at the Handle level. We encourage making applications that take part in the data lifecycle aware of data derivation provenance and letting them contribute additional elements to the provenance graph. Since they are also Handles, DataCite DOIs can act as a cornerstone and provide an entry point to discover the provenance graph. References: B. Lawrence, C. Jones, B. Matthews, S. Pepler, and S. Callaghan, "Citation and peer review of data: Moving towards formal data publication," Int. J. of Digital Curation, vol. 6, no. 2, 2011. L. Moreau, "The foundations for provenance on the web," Foundations and Trends® in Web Science, vol. 2, no. 2-3, pp. 99-241, 2010. F. Toussaint, T. Weigel, H. Thiemann, H. Höck, M. Stockhause: "Application Examples for Handle System Usage", submitted to AGU 2012 session IN009.

  15. Integrating artificial and human intelligence into tablet production process.

    PubMed

    Gams, Matjaž; Horvat, Matej; Ožek, Matej; Luštrek, Mitja; Gradišek, Anton

    2014-12-01

    We developed a new machine learning-based method to facilitate the manufacturing of pharmaceutical products, such as tablets, in accordance with the Process Analytical Technology (PAT) and Quality by Design (QbD) initiatives. Our approach combines data available from prior production runs with machine learning algorithms that are assisted by a human operator with expert knowledge of the production process. The process parameters encompass those that relate to the attributes of the precursor raw materials and those that relate to the manufacturing process itself. During manufacturing, our method allows the production operator to inspect the impacts of various process parameter settings within their proven acceptable range, with the purpose of choosing the most promising values in advance of the actual batch manufacture. The interaction between the human operator and the artificial intelligence system provides improved performance and quality. We successfully implemented the method on data provided by a pharmaceutical company for a particular product, a tablet, under development. We tested the accuracy of the method in comparison with some other machine learning approaches. The method is especially suitable for analyzing manufacturing processes characterized by a limited amount of data.

  16. Biomechanical differences in the stem straightening process among Pinus pinaster provenances. A new approach for early selection of stem straightness.

    PubMed

    Sierra-de-Grado, Rosario; Pando, Valentín; Martínez-Zurimendi, Pablo; Peñalvo, Alejandro; Báscones, Esther; Moulia, Bruno

    2008-06-01

    Stem straightness is an important selection trait in Pinus pinaster Ait. breeding programs. Despite the stability of stem straightness rankings in provenance trials, the efficiency of breeding programs based on a quantitative index of stem straightness remains low. An alternative approach is to analyze biomechanical processes that underlie stem form. The rationale for this selection method is that genetic differences in the biomechanical processes that maintain stem straightness in young plants will continue to control stem form throughout the life of the tree. We analyzed the components contributing most to genetic differences among provenances in stem straightening processes by kinetic analysis and with a biomechanical model defining the interactions between the variables involved (Fournier's model). This framework was tested on three P. pinaster provenances differing in adult stem straightness and growth. One-year-old plants were tilted at 45 degrees, and individual stem positions and sizes were recorded weekly for 5 months. We measured the radial extension of reaction wood and the anatomical features of wood cells in serial stem cross sections. The integral effect of reaction wood on stem leaning was computed with Fournier's model. Responses driven by both primary and secondary growth were involved in the stem straightening process, but secondary-growth-driven responses accounted for most differences among provenances. Plants from the straight-stemmed provenance showed a greater capacity for stem straightening than plants from the sinuous provenances mainly because of (1) more efficient reaction wood (higher maturation strains) and (2) more pronounced secondary-growth-driven autotropic decurving. These two process-based traits are thus good candidates for early selection of stem straightness, but additional tests on a greater number of genotypes over a longer period are required.

  17. Application of ITS2 metabarcoding to determine the provenance of pollen collected by honey bees in an agroecosystem.

    PubMed

    Richardson, Rodney T; Lin, Chia-Hua; Sponsler, Douglas B; Quijia, Juan O; Goodell, Karen; Johnson, Reed M

    2015-01-01

    Melissopalynology, the identification of bee-collected pollen, provides insight into the flowers exploited by foraging bees. Information provided by melissopalynology could guide floral enrichment efforts aimed at supporting pollinators, but it has rarely been used because traditional methods of pollen identification are laborious and require expert knowledge. We approach melissopalynology in a novel way, employing a molecular method to study the pollen foraging of honey bees (Apis mellifera) in a landscape dominated by field crops, and compare these results to those obtained by microscopic melissopalynology. • Pollen was collected from honey bee colonies in Madison County, Ohio, USA, during a two-week period in midspring and identified using microscopic methods and ITS2 metabarcoding. • Metabarcoding identified 19 plant families and exhibited sensitivity for identifying the taxa present in large and diverse pollen samples relative to microscopy, which identified eight families. The bulk of pollen collected by honey bees was from trees (Sapindaceae, Oleaceae, and Rosaceae), although dandelion (Taraxacum officinale) and mustard (Brassicaceae) pollen were also abundant. • For quantitative analysis of pollen, using both metabarcoding and microscopic identification is superior to either individual method. For qualitative analysis, ITS2 metabarcoding is superior, providing heightened sensitivity and genus-level resolution.

  18. Bayes factors for the linear ballistic accumulator model of decision-making.

    PubMed

    Evans, Nathan J; Brown, Scott D

    2018-04-01

    Evidence accumulation models of decision-making have led to advances in several different areas of psychology. These models provide a way to integrate response time and accuracy data, and to describe performance in terms of latent cognitive processes. Testing important psychological hypotheses using cognitive models requires a method to make inferences about different versions of the models which assume different parameters to cause observed effects. The task of model-based inference using noisy data is difficult, and has proven especially problematic with current model selection methods based on parameter estimation. We provide a method for computing Bayes factors through Monte-Carlo integration for the linear ballistic accumulator (LBA; Brown and Heathcote, 2008), a widely used evidence accumulation model. Bayes factors are used frequently for inference with simpler statistical models, and they do not require parameter estimation. In order to overcome the computational burden of estimating Bayes factors via brute force integration, we exploit general purpose graphical processing units; we provide free code for this. This approach allows estimation of Bayes factors via Monte-Carlo integration within a practical time frame. We demonstrate the method using both simulated and real data. We investigate the stability of the Monte-Carlo approximation, and the LBA's inferential properties, in simulation studies.
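
    The brute-force integration at the heart of this approach is easy to state: the marginal likelihood of a model is the average of its likelihood over draws from the prior, and the Bayes factor is the ratio of two such estimates. The sketch below demonstrates the estimator on a simple Gaussian-mean model rather than the LBA, whose likelihood is considerably more involved; the data and priors are synthetic.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(11)
        data = rng.normal(0.3, 1.0, size=50)       # synthetic observations

        # M1: mu ~ Normal(0, 1); estimate p(data|M1) by prior Monte Carlo
        mus = rng.normal(0.0, 1.0, size=100_000)   # draws from the prior
        logl = norm.logpdf(data[:, None], loc=mus[None, :]).sum(axis=0)
        mx = logl.max()
        log_ml1 = mx + np.log(np.exp(logl - mx).mean())  # log-mean-exp

        # M0: mu fixed at 0, so the marginal likelihood is the likelihood
        log_ml0 = norm.logpdf(data, loc=0.0).sum()

        print("Bayes factor BF10 ~", np.exp(log_ml1 - log_ml0))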

  19. Adding Semantics and OPM Ontology for the Provenance of Multi-sensor Merged Climate Data Records. Now What About Reproducibility?

    NASA Astrophysics Data System (ADS)

    Hua, H.; Wilson, B. D.; Manipon, G.; Pan, L.; Fetzer, E.

    2011-12-01

    Multi-decadal climate data records are critical to studying climate variability and change. These often also require merging data from multiple instruments such as those from NASA's A-Train that contain measurements covering a wide range of atmospheric conditions and phenomena. Multi-decadal climate data record of water vapor measurements from sensors on A-Train, operational weather, and other satellites are being assembled from existing data sources, or produced from well-established methods published in peer-reviewed literature. However, the immense volume and inhomogeneity of data often requires an "exploratory computing" approach to product generation where data is processed in a variety of different ways with varying algorithms, parameters, and code changes until an acceptable intermediate product is generated. This process is repeated until a desirable final merged product can be generated. Typically the production legacy is often lost due to the complexity of processing steps that were tried along the way. The data product information associated with source data, processing methods, parameters used, intermediate product outputs, and associated materials are often hidden in each of the trials and scattered throughout the processing system(s). We will discuss methods to help users better capture and explore the production legacy of the data, metadata, ancillary files, code, and computing environment changes used during the production of these merged and multi-sensor data products. By leveraging existing semantic and provenance tools, we can capture sufficient information to enable users to track, perform faceted searches, and visualize the provenance of the products and processing lineage. We will explore if sufficient provenance information can be captured to enable science reproducibility of these climate data records.

  20. Ozone Generators That Are Sold as Air Cleaners

    MedlinePlus

    ... U.S. EPA, 1996a). What Other Methods Can Be Used to Control Indoor Air Pollution? ... Air Cleaning: Remove pollutants through proven air cleaning methods. Of the three, the first approach — source control — ...

  1. A comparative study of three tissue-cultured Dendrobium species and their wild correspondences by headspace gas chromatography-mass spectrometry combined with chemometric methods.

    PubMed

    Chen, Nai-Dong; You, Tao; Li, Jun; Bai, Li-Tao; Hao, Jing-Wen; Xu, Xiao-Yuan

    2016-10-01

    Plant tissue culture is widely used in the conservation and utilization of rare and endangered medicinal plants, and it is crucial for tissue culture stocks to retain the ability to produce bioactive components similar to those of their wild correspondences. In this paper, a headspace gas chromatography-mass spectrometry method combined with chemometric methods was applied to analyze and evaluate the volatile compounds in tissue-cultured and wild Dendrobium huoshanense Cheng and Tang, Dendrobium officinale Kimura et Migo and Dendrobium moniliforme (Linn.) Sw. In total, 63 volatile compounds were separated, with 53 identified from the three Dendrobium spp. Dendrobiums of different provenance contained characteristic compounds and showed marked quantitative differences in the constituents they shared. The similarity evaluation disclosed that the accumulation of volatile compounds in Dendrobium samples might be affected by their provenance. Principal component analysis showed that the first three components explained 85.9% of the data variance, demonstrating a good discrimination between samples. Gas chromatography-mass spectrometry, combined with chemometrics, might be an effective strategy for identifying the species and their provenance, especially in the assessment of tissue-cultured Dendrobium quality for use in raw herbal medicines. Copyright © 2016. Published by Elsevier B.V.
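
    The chemometric step described here follows a standard pattern: autoscale the peak-area table, then project the samples onto a few principal components. A minimal sketch with a synthetic stand-in for the GC-MS peak table (the real study used 63 compounds across tissue-cultured and wild samples):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# Synthetic stand-in for a GC-MS peak table: rows = samples
# (10 tissue-cultured, 10 wild), columns = 63 volatile-compound peak areas.
X = np.vstack([rng.normal(0.0, 1.0, (10, 63)),
               rng.normal(0.8, 1.0, (10, 63))])

Xs = StandardScaler().fit_transform(X)   # autoscale each compound
pca = PCA(n_components=3).fit(Xs)
scores = pca.transform(Xs)               # PC1-PC3 scores per sample

print("variance explained by 3 PCs:", pca.explained_variance_ratio_.sum())
```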

  2. A bioavailable strontium isoscape for Western Europe: A machine learning approach

    PubMed Central

    von Holstein, Isabella C. C.; Laffoon, Jason E.; Willmes, Malte; Liu, Xiao-Ming; Davies, Gareth R.

    2018-01-01

    Strontium isotope ratios (87Sr/86Sr) are gaining considerable interest as a geolocation tool and are now widely applied in archaeology, ecology, and forensic research. However, their application for provenance requires the development of baseline models predicting surficial 87Sr/86Sr variations (“isoscapes”). A variety of empirically-based and process-based models have been proposed to build terrestrial 87Sr/86Sr isoscapes but, in their current forms, those models are not mature enough to be integrated with continuous-probability surface models used in geographic assignment. In this study, we aim to overcome those limitations and to predict 87Sr/86Sr variations across Western Europe by combining process-based models and a series of remote-sensing geospatial products into a regression framework. We find that random forest regression significantly outperforms other commonly used regression and interpolation methods, and efficiently predicts the multi-scale patterning of 87Sr/86Sr variations by accounting for geological, geomorphological and atmospheric controls. Random forest regression also provides an easily interpretable and flexible framework to integrate different types of environmental auxiliary variables required to model the multi-scale patterning of 87Sr/86Sr variability. The method is transferable to different scales and resolutions and can be applied to the large collection of geospatial data available at local and global levels. The isoscape generated in this study provides the most accurate 87Sr/86Sr predictions in bioavailable strontium for Western Europe (R2 = 0.58 and RMSE = 0.0023) to date, as well as a conservative estimate of spatial uncertainty by applying quantile regression forest. We anticipate that the method presented in this study combined with the growing numbers of bioavailable 87Sr/86Sr data and satellite geospatial products will extend the applicability of the 87Sr/86Sr geo-profiling tool in provenance applications. PMID:29847595
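
    The core regression step is standard; below is a minimal sketch with synthetic covariates standing in for the remote-sensing and process-model predictors (the quantile-regression-forest uncertainty estimate is not shown, and all values are illustrative):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)

# Synthetic stand-in: environmental covariates at n sites (e.g. bedrock age
# class, elevation, dust deposition) and measured bioavailable 87Sr/86Sr.
n = 500
X = rng.normal(size=(n, 3))
y = 0.709 + 0.003 * X[:, 0] + 0.001 * X[:, 1] ** 2 + rng.normal(0, 0.0005, n)

rf = RandomForestRegressor(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X, y)
print("out-of-bag R^2:", rf.oob_score_)

# Predict the isoscape on new covariate locations (here: random sites).
grid = rng.normal(size=(10, 3))
print(rf.predict(grid))
```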

  3. Nitrogen partitioning in oak leaves depends on species, provenance, climate conditions and soil type.

    PubMed

    Hu, B; Simon, J; Kuster, T M; Arend, M; Siegwolf, R; Rennenberg, H

    2013-01-01

    Climate-tolerant tree species and/or provenances have to be selected to ensure the high productivity of managed forests in Central Europe under the prognosticated climate changes. For this purpose, we studied the responses of saplings from three oak species (i.e. Quercus robur, Q. petraea and Q. pubescens) and provenances of different climatic origin (i.e. low or high rainfall, low or high temperature habitats) with regard to leaf nitrogen (N) composition as a measure of N nutrition. Saplings were grown in model ecosystems on either calcareous or acidic soil and subjected to one of four treatments (control, drought, air warming or a combination of drought and air warming). Across species, oak N metabolism responded to the influence of drought and/or air warming with an increase in leaf amino acid N concentration at the expense of structural N. Moreover, provenances or species from drier habitats were more tolerant to the climate conditions applied, as indicated by an increase in amino acid N (comparing species) or soluble protein N (comparing provenances within a species). Furthermore, amino acid N concentrations of oak leaves were significantly higher on calcareous compared to acidic soil. From these results, it can be concluded that seeds from provenances or species originating from drier habitats and - if available - from calcareous soil types may provide a superior seed source for future forest establishment. © 2012 German Botanical Society and The Royal Botanical Society of the Netherlands.

  4. Interacting multiple model forward filtering and backward smoothing for maneuvering target tracking

    NASA Astrophysics Data System (ADS)

    Nandakumaran, N.; Sutharsan, S.; Tharmarasa, R.; Lang, Tom; McDonald, Mike; Kirubarajan, T.

    2009-08-01

    The Interacting Multiple Model (IMM) estimator has been proven to be effective in tracking agile targets. Smoothing or retrodiction, which uses measurements beyond the current estimation time, provides better estimates of target states. Various methods have been proposed for multiple model smoothing in the literature. In this paper, a new smoothing method, which involves forward filtering followed by backward smoothing while maintaining the fundamental spirit of the IMM, is proposed. The forward filtering is performed using the standard IMM recursion, while the backward smoothing is performed using a novel interacting smoothing recursion. This backward recursion mimics the IMM estimator in the backward direction, where each mode-conditioned smoother uses a standard Kalman smoothing recursion. The resulting algorithm provides improved, but delayed, estimates of target states. Simulation studies are performed to demonstrate the improved performance with a maneuvering target scenario. The comparison with existing methods confirms the improved smoothing accuracy. This improvement results from avoiding the augmented state vector used by other algorithms. In addition, the new technique to account for model switching in smoothing is key to improving the performance.
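
    The per-mode building block of such a smoother is a standard forward Kalman filter followed by a backward Rauch-Tung-Striebel recursion; the IMM-specific mixing of mode probabilities is omitted here. A minimal single-model sketch for a one-dimensional constant-velocity target, with illustrative noise values:

```python
import numpy as np

def kalman_rts(zs, F, H, Q, R, x0, P0):
    """Forward Kalman filter, then backward Rauch-Tung-Striebel smoothing.
    This is the per-mode recursion only; the full IMM smoother mixes
    several such models using mode probabilities (not shown)."""
    xf, Pf, xp, Pp = [], [], [], []      # filtered and predicted moments
    x, P = x0, P0
    for z in zs:                          # forward pass
        x, P = F @ x, F @ P @ F.T + Q     # predict
        xp.append(x); Pp.append(P)
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x, P = x + K @ (z - H @ x), P - K @ H @ P   # update
        xf.append(x); Pf.append(P)
    xs, Ps = [xf[-1]], [Pf[-1]]
    for k in range(len(zs) - 2, -1, -1):  # backward pass
        G = Pf[k] @ F.T @ np.linalg.inv(Pp[k + 1])
        xs.insert(0, xf[k] + G @ (xs[0] - xp[k + 1]))
        Ps.insert(0, Pf[k] + G @ (Ps[0] - Pp[k + 1]) @ G.T)
    return np.array(xs), np.array(Ps)

rng = np.random.default_rng(0)
F = np.array([[1.0, 1.0], [0.0, 1.0]])    # constant-velocity motion model
H = np.array([[1.0, 0.0]])                # position-only measurements
Q, R = 0.01 * np.eye(2), np.array([[1.0]])
zs = [np.array([k + rng.normal()]) for k in range(20)]
xs, Ps = kalman_rts(zs, F, H, Q, R, np.zeros(2), 10.0 * np.eye(2))
print(xs[:, 0])   # smoothed position estimates
```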

  5. Development of dynamic Bayesian models for web application test management

    NASA Astrophysics Data System (ADS)

    Azarnova, T. V.; Polukhin, P. V.; Bondarenko, Yu V.; Kashirina, I. L.

    2018-03-01

    The mathematical apparatus of dynamic Bayesian networks is an effective and technically proven tool for modeling complex stochastic dynamic processes. Our results show that dynamic Bayesian network models and methods provide high coverage of the stochastic tasks associated with error testing in multiuser software products operated in a dynamically changing environment. Formalized representation of the discrete test process as a dynamic Bayesian model allows us to organize the logical connection between individual test assets across multiple time slices. This approach makes it possible to present testing as a discrete process with defined structural components responsible for the generation of test assets. Dynamic Bayesian network-based models allow us to combine in one management area individual units and testing components with different functionalities that directly influence each other in the process of comprehensive testing of various groups of computer bugs. The proposed models provide a consistent approach to formalizing test principles and procedures, methods used to treat situational error signs, and methods used to produce analytical conclusions based on test results.
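
    The simplest two-slice dynamic Bayesian network is a hidden Markov model; the forward (filtering) recursion below sketches how belief about a hidden defect state could be updated from successive test outcomes. The structure and all probabilities are invented for illustration and are not taken from the article:

```python
import numpy as np

# Minimal two-slice dynamic Bayesian network (here a hidden Markov model):
# hidden state = "module contains an undetected defect" (0/1), observation
# = outcome of a test run (0 = pass, 1 = fail). All numbers illustrative.
T = np.array([[0.9, 0.1],    # P(state_t | state_{t-1}); code churn can
              [0.2, 0.8]])   # introduce or fix defects between slices
E = np.array([[0.95, 0.05],  # P(outcome | state): defect-free modules
              [0.30, 0.70]]) # mostly pass, defective ones mostly fail

def filter_defect_belief(outcomes, prior=np.array([0.8, 0.2])):
    """Forward recursion: belief over the hidden defect state after
    each observed test outcome."""
    beliefs, b = [], prior
    for o in outcomes:
        b = (T.T @ b) * E[:, o]   # predict across the slice, then condition
        b = b / b.sum()
        beliefs.append(b)
    return np.array(beliefs)

print(filter_defect_belief([0, 1, 1, 0, 1]))  # P(defect) rises after fails
```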

  6. Critically Reflective Leadership

    ERIC Educational Resources Information Center

    Cunningham, Christine L.

    2012-01-01

    Critical Reflective Practice (CRP) has a proven reputation as a method for teacher-researchers in K-12 classrooms, but there have been few published examples of this method being used to document school leaders' work-based practice. This paper outlines adaptations made by the author from an original CRP method to a Critically Reflective Leadership…

  7. [Introduction of active learning and student readership in teaching by the pharmaceutical faculty].

    PubMed

    Sekiguchi, Masaki; Yamato, Ippei; Kato, Tetsuta; Torigoe, Kojyun

    2005-07-01

    We have introduced improvements and new approaches into our teaching by adopting 4 active learning methods for first-year pharmacy students. The 4 teaching methods, applied in lessons or take-home assignments, are as follows: 1) problem-based learning (clinical case), including a student presentation of the clinical case; 2) schematic drawings of the human organs, one drawing done in 15-20 min during the week following a lecture and a second drawing done with reference to a professional textbook; 3) learning of professional themes in take-home assignments; and 4) short paper- or computer-based tests to confirm the understanding of technical terms. These improvements and new methods provide active approaches for pharmacy students (as opposed to passive memorization of words and image study). In combination, they have proven useful as a learning method for acquiring expert knowledge and for shifting pharmacy students from a passive to an active learning approach in the classroom.

  8. Environmental Response Laboratory Network (ERLN) Basic Ordering Agreement Fact Sheet

    EPA Pesticide Factsheets

    Having a standing agreement means that a laboratory has been vetted and has proven capable of providing certain services that meet ERLN standards, and it provides a mechanism for EPA to quickly task a lab to provide supplies/services during an incident.

  9. Microbial degradation of an organophosphate pesticide, malathion.

    PubMed

    Singh, Baljinder; Kaur, Jagdeep; Singh, Kashmir

    2014-05-01

    The organophosphorus pesticide malathion is used in public health, residential, and agricultural settings worldwide to control pest populations. Exposure to malathion is proven to produce toxic effects in humans and other mammals. Due to this high toxicity, studies are ongoing to design effective methods for the removal of malathion and its associated compounds from the environment. Among the various techniques available, degradation of malathion by microbes has proven to be an effective and environmentally friendly method. Recent research in this area has shown that a diverse range of microorganisms is capable of degrading malathion. Therefore, we aim to provide an overview of research accomplishments on this subject and discuss the toxicity of malathion and its metabolites, the various microorganisms involved in its biodegradation, and the effect of various environmental parameters on its degradation.

  10. Modelling of high-frequency structure-borne sound transmission on FEM grids using the Discrete Flow Mapping technique

    NASA Astrophysics Data System (ADS)

    Hartmann, Timo; Tanner, Gregor; Xie, Gang; Chappell, David; Bajars, Janis

    2016-09-01

    Dynamical Energy Analysis (DEA) combined with the Discrete Flow Mapping technique (DFM) has recently been introduced as a mesh-based high-frequency method for modelling structure-borne sound in complex built-up structures. This has proven to enhance vibro-acoustic simulations considerably by making it possible to work directly on existing finite element meshes, circumventing time-consuming and costly re-modelling strategies. In addition, DFM provides detailed spatial information about the vibrational energy distribution within a complex structure in the mid-to-high frequency range. We will present here progress in the development of the DEA method towards handling complex FEM meshes including Rigid Body Elements. In addition, structure-borne transmission paths due to spot welds are considered. We will present applications for a car floor structure.

  11. Enumerating Hematopoietic Stem and Progenitor Cells in Zebrafish Embryos.

    PubMed

    Esain, Virginie; Cortes, Mauricio; North, Trista E

    2016-01-01

    Over the past 20 years, zebrafish have proven to be a valuable model to dissect the signaling pathways involved in hematopoiesis, including Hematopoietic Stem and Progenitor Cell (HSPC) formation and homeostasis. Despite tremendous efforts to generate the tools necessary to characterize HSPCs in vitro and in vivo, the zebrafish community still lacks standardized methods to quantify HSPCs across laboratories. Here, we describe three methods used routinely in our lab, and in others, to reliably enumerate HSPCs in zebrafish embryos: large-scale live imaging of transgenic reporter lines, Fluorescence-Activated Cell Sorting (FACS), and in vitro cell culture. While live imaging and FACS analysis allow enumeration of total or site-specific HSPCs, the cell culture assay provides the unique opportunity to test the functional potential of isolated HSPCs, similar to assays employed in mammals.

  12. Stem cell secretome-rich nanoclay hydrogel: a dual action therapy for cardiovascular regeneration

    NASA Astrophysics Data System (ADS)

    Waters, Renae; Pacelli, Settimio; Maloney, Ryan; Medhi, Indrani; Ahmed, Rafeeq P. H.; Paul, Arghya

    2016-03-01

    A nanocomposite hydrogel with photocrosslinkable micro-porous networks and a nanoclay component was successfully prepared to control the release of growth factor-rich stem cell secretome. The proven pro-angiogenic and cardioprotective potential of this new bioactive system provides a valuable therapeutic platform for cardiac tissue repair and regeneration. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr07806g

  13. Increasing Diversity in the Geosciences at the City University of New York

    NASA Astrophysics Data System (ADS)

    Damas, C.; Johnson, L.; McHugh, C.; Marchese, P. J.

    2007-12-01

    The City University of New York (CUNY) is the nation's largest urban university, with 23 institutions serving a large number of underrepresented minority (URM) and women students at all levels of the pipeline - community college to graduate school. CUNY has a strong record of recruiting, enrolling, retaining and graduating URMs in science, technology, engineering and mathematics (STEM) fields. Current efforts are underway to increase the number of URMs in the geosciences. These efforts include: 1) involving students in research at all levels of the pipeline; 2) incorporating innovative and proven pedagogical methods into the classroom; and 3) mentoring of students by research scientists from CUNY and other participating institutions. At all levels of the pipeline, students are actively engaged in Space and Earth Science research. At the community college level, students are introduced to the scientific research process through familiar software such as MS Excel to analyze simple time series. At the senior colleges, students progress to multi-variate data analysis, and they also have the opportunity to go into the field to collect data. As graduate students, they are involved as mentors and supervise undergraduate student research. Program initiatives such as the CUNY pipeline provide stipends and academic enrichment activities (i.e., GRE training, applying to graduate school, etc.) throughout the summer and academic year. During the summer, students also have the opportunity to work with and be mentored by research scientists at a CUNY campus, at a NASA center or a national laboratory. Mentors advise students about graduate school and careers, serve as role models, and perhaps more importantly, provide encouragement to students who lack confidence in their ability to do scientific research. Students are also expected to present their research findings at meetings and conferences, both locally and nationally. In addition to their research experiences, students benefit from classroom instruction that emphasizes active learning and the integration of research-related activities. Educational materials and pedagogical methods developed at Medgar Evers College and Queensborough Community College have proven quite effective at engaging and assisting students who have conceptual difficulties in their science and mathematics courses. Overall, students demonstrate an increase in their conceptual understanding of the subject matter, as well as an increase in their confidence to solve scientific problems and to become scientists.

  14. Interoperability Using Lightweight Metadata Standards: Service & Data Casting, OpenSearch, OPM Provenance, and Shared SciFlo Workflows

    NASA Astrophysics Data System (ADS)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E.

    2011-12-01

    Under several NASA grants, we are generating multi-sensor merged atmospheric datasets to enable the detection of instrument biases and studies of climate trends over decades of data. For example, under a NASA MEASURES grant we are producing a water vapor climatology from the A-Train instruments, stratified by the Cloudsat cloud classification for each geophysical scene. The generation and proper use of such multi-sensor climate data records (CDR's) requires a high level of openness, transparency, and traceability. To make the datasets self-documenting and provide access to full metadata and traceability, we have implemented a set of capabilities and services using known, interoperable protocols. These protocols include OpenSearch, OPeNDAP, Open Provenance Model, service & data casting technologies using Atom feeds, and REST-callable analysis workflows implemented as SciFlo (XML) documents. We advocate that our approach can serve as a blueprint for how to openly "document and serve" complex, multi-sensor CDR's with full traceability. The capabilities and services provided include:
    - Discovery of the collections by keyword search, exposed using the OpenSearch protocol;
    - Space/time query across the CDR's granules and all of the input datasets via OpenSearch (sketched below);
    - User-level configuration of the production workflows so that scientists can select additional physical variables from the A-Train to add to the next iteration of the merged datasets;
    - Efficient data merging using on-the-fly OPeNDAP variable slicing & spatial subsetting of data out of input netCDF and HDF files (without moving the entire files);
    - Self-documenting CDR's published in a highly usable netCDF4 format with groups used to organize the variables, CF-style attributes for each variable, numeric array compression, & links to OPM provenance;
    - Recording of processing provenance and data lineage into a query-able provenance trail in Open Provenance Model (OPM) format, auto-captured by the workflow engine;
    - Open publishing of all of the workflows used to generate products as machine-callable REST web services, using the capabilities of the SciFlo workflow engine;
    - Advertising of the metadata (e.g. physical variables provided, space/time bounding box, etc.) for our prepared datasets as "datacasts" using the Atom feed format;
    - Publishing of all datasets via our "DataDrop" service, which exploits the WebDAV protocol to enable scientists to access remote data directories as local files on their laptops;
    - Rich "web browse" of the CDR's with full metadata and the provenance trail one click away;
    - Advertising of all services as Google-discoverable "service casts" using the Atom format.
    The presentation will describe our use of the interoperable protocols and demonstrate the capabilities and service GUI's.
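
    As a sketch of the space/time granule query capability, an OpenSearch request is simply an HTTP GET with keyword, bounding-box and time parameters, returning an Atom feed. The endpoint and parameter names below are hypothetical, not the project's actual service; real deployments advertise theirs in an OpenSearch description document (OSDD):

```python
import requests

# Hypothetical OpenSearch endpoint and parameter names, for illustration only.
url = "https://example.org/opensearch/granules"
params = {
    "q": "AIRS water vapor",          # free-text keyword search
    "bbox": "-180,-60,180,60",        # west,south,east,north
    "time_start": "2006-01-01T00:00:00Z",
    "time_end": "2006-12-31T23:59:59Z",
    "format": "atom",                 # results returned as an Atom feed
}
resp = requests.get(url, params=params, timeout=30)
resp.raise_for_status()
print(resp.text[:500])   # Atom entries link to granules and OPM provenance
```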

  15. Making Sense of 'Big Data' in Provenance Studies

    NASA Astrophysics Data System (ADS)

    Vermeesch, P.

    2014-12-01

    Huge online databases can be 'mined' to reveal previously hidden trends and relationships in society. One could argue that sedimentary geology has entered a similar era of 'Big Data', as modern provenance studies routinely apply multiple proxies to dozens of samples. Just like the Internet, sedimentary geology now requires specialised statistical tools to interpret such large datasets. These can be organised on three levels of progressively higher order:
    - A single sample: The most effective way to reveal the provenance information contained in a representative sample of detrital zircon U-Pb ages is probability density estimators such as histograms and kernel density estimates. The widely popular 'probability density plots' implemented in IsoPlot and AgeDisplay compound analytical uncertainty with geological scatter and are therefore invalid.
    - Several samples: Multi-panel diagrams comprising many detrital age distributions or compositional pie charts quickly become unwieldy and uninterpretable. For example, if there are N samples in a study, then the number of pairwise comparisons between samples increases quadratically as N(N-1)/2. This is simply too much information for the human eye to process. To solve this problem, it is necessary to (a) express the 'distance' between two samples as a simple scalar and (b) combine all N(N-1)/2 such values in a single two-dimensional 'map', grouping similar and pulling apart dissimilar samples. This can be easily achieved using simple statistics-based dissimilarity measures and a standard statistical method called Multidimensional Scaling (MDS), as sketched below.
    - Several methods: Suppose that we use four provenance proxies: bulk petrography, chemistry, heavy minerals and detrital geochronology. This will result in four MDS maps, each of which likely shows slightly different trends and patterns. To deal with such cases, it may be useful to use a related technique called 'three-way multidimensional scaling'. This results in two graphical outputs: an MDS map, and a map with 'weights' showing to what extent the different provenance proxies influence the horizontal and vertical axes of the MDS map. Thus, detrital data can not only inform the user about the provenance of sediments, but also about the causal relationships between the mineralogy, geochronology and chemistry.
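
    The several-samples workflow in the second bullet can be sketched in a few lines: compute a Kolmogorov-Smirnov distance for every sample pair, then feed the dissimilarity matrix to MDS. Synthetic age spectra stand in for real U-Pb data:

```python
import numpy as np
from scipy.stats import ks_2samp
from sklearn.manifold import MDS

rng = np.random.default_rng(3)

# Synthetic detrital zircon U-Pb age spectra (Ma): eight samples drawn from
# two sources whose age peaks (250 Ma, 1050 Ma) are mixed in different ratios.
mixes = [(0.8, 0.2)] * 4 + [(0.3, 0.7)] * 4
samples = [rng.choice([250.0, 1050.0], size=100, p=m) + rng.normal(0, 30, 100)
           for m in mixes]

# One scalar dissimilarity per sample pair (N(N-1)/2 of them): the
# Kolmogorov-Smirnov distance between the two age distributions.
N = len(samples)
D = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1, N):
        D[i, j] = D[j, i] = ks_2samp(samples[i], samples[j]).statistic

# MDS turns the dissimilarity matrix into a 2-D 'map' that groups
# similar samples and pulls dissimilar ones apart.
coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(D)
print(coords)
```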

  16. [Surgical management of breast cancer].

    PubMed

    Bussmann, J F; Trede, M

    1975-12-18

    A survey of common operative methods in carcinoma of the breast is given. Our own procedure in localized and generalized stages of the disease is presented. In our experience, simple mastectomy plus axillary dissection has proven to be the method of choice.

  17. Convergence and attractivity of memristor-based cellular neural networks with time delays.

    PubMed

    Qin, Sitian; Wang, Jun; Xue, Xiaoping

    2015-03-01

    This paper presents theoretical results on the convergence and attractivity of memristor-based cellular neural networks (MCNNs) with time delays. Based on a realistic memristor model, an MCNN is modeled using a differential inclusion. The essential boundedness of its global solutions is proven. The state of MCNNs is further proven to converge to a critical-point set located in a saturated region of the activation function when the initial state lies in a saturated region. It is shown that the state convergence time period is finite and can be quantitatively estimated using given parameters. Furthermore, the positive invariance and attractivity of the state in non-saturated regions are also proven. The simulation results of several numerical examples are provided to substantiate the results. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Developing students' qualitative muscles in an introductory methods course.

    PubMed

    SmithBattle, Lee

    2014-08-30

    The exponential growth of qualitative research (QR) has coincided with methodological innovations, the proliferation of qualitative textbooks and journals, and the greater availability of qualitative methods courses. In spite of these advances, the pedagogy for teaching qualitative methods has received little attention. This paper provides a philosophical foundation for teaching QR with active learning strategies and shows how active learning is fully integrated into a one-semester course. The course initiates students into qualitative dispositions and skills as students develop study aims and procedures; enter the field to gather data; analyze the full set of student-generated data; and write results in a final report. Conducting a study in one semester is challenging but has proven feasible and disabuses students of the view that QR is simple, unscientific, or non-rigorous. Student reflections on course assignments are integrated into the paper. The strengths and limitations of this pedagogical approach are also described.

  19. The use of earthquake rate changes as a stress meter at Kilauea volcano.

    PubMed

    Dieterich, J; Cayol, V; Okubo, P

    2000-11-23

    Stress changes in the Earth's crust are generally estimated from model calculations that use near-surface deformation as an observational constraint. But the widespread correlation of changes of earthquake activity with stress has led to suggestions that stress changes might be calculated from earthquake occurrence rates obtained from seismicity catalogues. Although this possibility has considerable appeal, because seismicity data are routinely collected and have good spatial and temporal resolution, the method has not yet proven successful, owing to the non-linearity of earthquake rate changes with respect to both stress and time. Here, however, we present two methods for inverting earthquake rate data to infer stress changes, using a formulation for the stress- and time-dependence of earthquake rates. Application of these methods at Kilauea volcano, in Hawaii, yields good agreement with independent estimates, indicating that earthquake rates can provide a practical remote-sensing stress meter.
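
    One way to see the 'stress meter' idea: in Dieterich's rate- and state-dependent formulation, a sudden stress step multiplies the background seismicity rate r by exp(dtau/(A*sigma)), so the instantaneous rate change can be inverted for stress. A minimal sketch with illustrative parameter values (the paper's full inversions also handle the time dependence, which is omitted here):

```python
import numpy as np

A_SIGMA = 0.25  # A*sigma in MPa; illustrative value of the constitutive
                # parameter A times the effective normal stress

def stress_change_from_rates(rate, background_rate, a_sigma=A_SIGMA):
    """Invert the instantaneous rate jump of Dieterich's rate-state
    seismicity formulation: R = r * exp(dtau / (A*sigma)), so
    dtau = A*sigma * ln(R / r). Positive values indicate loading."""
    return a_sigma * np.log(np.asarray(rate) / background_rate)

# Observed earthquake rates in three crustal volumes, relative to a
# long-term background of 2 events/day (numbers are illustrative).
rates = [6.0, 2.0, 0.5]
print(stress_change_from_rates(rates, background_rate=2.0))  # MPa
```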

  20. Recommendations for a step‐wise comparative approach to the evaluation of new screening tests for colorectal cancer

    PubMed Central

    Senore, Carlo; Mandel, Jack S.; Allison, James E.; Atkin, Wendy S.; Benamouzig, Robert; Bossuyt, Patrick M. M.; Silva, Mahinda De; Guittet, Lydia; Halloran, Stephen P.; Haug, Ulrike; Hoff, Geir; Itzkowitz, Steven H.; Leja, Marcis; Levin, Bernard; Meijer, Gerrit A.; O'Morain, Colm A.; Parry, Susan; Rabeneck, Linda; Rozen, Paul; Saito, Hiroshi; Schoen, Robert E.; Seaman, Helen E.; Steele, Robert J. C.; Sung, Joseph J. Y.; Winawer, Sidney J.

    2016-01-01

    BACKGROUND New screening tests for colorectal cancer continue to emerge, but the evidence needed to justify their adoption in screening programs remains uncertain. METHODS A review of the literature and a consensus approach by experts was undertaken to provide practical guidance on how to compare new screening tests with proven screening tests. RESULTS Findings and recommendations from the review included the following: Adoption of a new screening test requires evidence of effectiveness relative to a proven comparator test. Clinical accuracy supported by programmatic population evaluation in the screening context on an intention‐to‐screen basis, including acceptability, is essential. Cancer‐specific mortality is not essential as an endpoint provided that the mortality benefit of the comparator has been demonstrated and that the biologic basis of detection is similar. Effectiveness of the guaiac‐based fecal occult blood test provides the minimum standard to be achieved by a new test. A 4‐phase evaluation is recommended. An initial retrospective evaluation in cancer cases and controls (Phase 1) is followed by a prospective evaluation of performance across the continuum of neoplastic lesions (Phase 2). Phase 3 follows the demonstration of adequate accuracy in these 2 prescreening phases and addresses programmatic outcomes at 1 screening round on an intention‐to‐screen basis. Phase 4 involves more comprehensive evaluation of ongoing screening over multiple rounds. Key information is provided from the following parameters: the test positivity rate in a screening population, the true‐positive and false‐positive rates, and the number needed to colonoscope to detect a target lesion. CONCLUSIONS New screening tests can be evaluated efficiently by this stepwise comparative approach. Cancer 2016;122:826–39. © 2016 The Authors. Cancer published by Wiley Periodicals, Inc. on behalf of American Cancer Society. PMID:26828588

  1. ProvenCare: Geisinger's Model for Care Transformation through Innovative Clinical Initiatives and Value Creation.

    PubMed

    2009-04-01

    Geisinger's system of care can be seen as a microcosm of the national delivery of healthcare, with implications for decision makers in other health plans. In this interview, Dr Ronald A. Paulus focuses on Geisinger's unique approach to patient care. At its core, this approach represents a system of quality and value initiatives based on 3 major programs: Proven Health Navigation (medical home), the ProvenCare model, and transitions of care. The goal of such an approach is to optimize disease management by using a rational reimbursement paradigm for appropriate interventions, providing innovative incentives, and engaging patients in their own care as part of any intervention. Dr Paulus explains why, unlike Geisinger, other stakeholders, including payers, providers, patients, and employers, have no intrinsic incentive to pursue quality and value initiatives. In addition, he says, an electronic infrastructure that can be modified as management paradigms evolve is a necessary tool for ensuring that the healthcare delivery system adapts quickly to new clinical realities and continues to deliver the best value for all stakeholders.

  2. Linking research to practice: the organisation and implementation of The Netherlands health and social care improvement programmes.

    PubMed

    Ovretveit, John; Klazinga, Niek

    2013-02-01

    Both public and private health and social care services are facing increased and changing demands to improve quality and reduce costs. To enable local services to respond to these demands, governments and other organisations have established large-scale improvement programmes. These usually seek to enable many services to apply proven improvements and to make use of quality improvement methods. The purpose of this paper is to provide an empirical description of how one organisation coordinated ten national improvement programmes between 2004 and 2010. It provides details which may be useful to others seeking to plan and implement such programmes, and also contributes to the understanding of knowledge translation and of network governance. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  3. Bioforensics: Characterization of biological weapons agents by NanoSIMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, P K; Ghosal, S; Leighton, T J

    2007-02-26

    The anthrax attacks of Fall 2001 highlight the need to develop forensic methods based on multiple identifiers to determine the origin of biological weapons agents. Genetic typing methods (i.e., DNA- and RNA-based) provide one attribution technology, but genetic information alone is not usually sufficient to determine the provenance of the material. Non-genetic identifiers, including elemental and isotopic signatures, provide complementary information that can be used to identify the means, geographic location and date of production. Under LDRD funding, we have successfully developed the techniques necessary to perform bioforensic characterization with the NanoSIMS at the individual spore level. We have developed methods for elemental and isotopic characterization at the single spore scale. We have developed methods for analyzing spore sections to map elemental abundance within spores. We have developed rapid focused ion beam (FIB) sectioning techniques for spores to preserve elemental and structural integrity. And we have developed a high-resolution depth profiling method to characterize the elemental distribution in individual spores without sectioning. We used these newly developed methods to study the controls on elemental abundances in spores, characterize the elemental distribution in spores, and study elemental uptake by spores. Our work under this LDRD project attracted FBI and DHS funding for applied purposes.

  4. A positive and entropy-satisfying finite volume scheme for the Baer-Nunziato model

    NASA Astrophysics Data System (ADS)

    Coquel, Frédéric; Hérard, Jean-Marc; Saleh, Khaled

    2017-02-01

    We present a relaxation scheme for approximating the entropy dissipating weak solutions of the Baer-Nunziato two-phase flow model. This relaxation scheme is straightforwardly obtained as an extension of the relaxation scheme designed in [16] for the isentropic Baer-Nunziato model and consequently inherits its main properties. To our knowledge, this is the only existing scheme for which the approximated phase fractions, phase densities and phase internal energies are proven to remain positive without any restrictive condition other than a classical fully computable CFL condition. For ideal gas and stiffened gas equations of state, real values of the phasic speeds of sound are also proven to be maintained by the numerical scheme. It is also the only scheme for which a discrete entropy inequality is proven, under a CFL condition derived from the natural sub-characteristic condition associated with the relaxation approximation. This last property, which ensures the non-linear stability of the numerical method, is satisfied for any admissible equation of state. We provide a numerical study for the convergence of the approximate solutions towards some exact Riemann solutions. The numerical simulations show that the relaxation scheme compares well with two of the most popular existing schemes available for the Baer-Nunziato model, namely Schwendeman-Wahle-Kapila's Godunov-type scheme [39] and Tokareva-Toro's HLLC scheme [44]. The relaxation scheme also shows a higher precision and a lower computational cost (for comparable accuracy) than a standard numerical scheme used in the nuclear industry, namely Rusanov's scheme. Finally, we assess the good behavior of the scheme when approximating vanishing phase solutions.

  5. A positive and entropy-satisfying finite volume scheme for the Baer–Nunziato model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coquel, Frédéric, E-mail: frederic.coquel@cmap.polytechnique.fr; Hérard, Jean-Marc, E-mail: jean-marc.herard@edf.fr; Saleh, Khaled, E-mail: saleh@math.univ-lyon1.fr

    We present a relaxation scheme for approximating the entropy dissipating weak solutions of the Baer–Nunziato two-phase flow model. This relaxation scheme is straightforwardly obtained as an extension of the relaxation scheme designed in [16] for the isentropic Baer–Nunziato model and consequently inherits its main properties. To our knowledge, this is the only existing scheme for which the approximated phase fractions, phase densities and phase internal energies are proven to remain positive without any restrictive condition other than a classical fully computable CFL condition. For ideal gas and stiffened gas equations of state, real values of the phasic speeds of sound are also proven to be maintained by the numerical scheme. It is also the only scheme for which a discrete entropy inequality is proven, under a CFL condition derived from the natural sub-characteristic condition associated with the relaxation approximation. This last property, which ensures the non-linear stability of the numerical method, is satisfied for any admissible equation of state. We provide a numerical study for the convergence of the approximate solutions towards some exact Riemann solutions. The numerical simulations show that the relaxation scheme compares well with two of the most popular existing schemes available for the Baer–Nunziato model, namely Schwendeman–Wahle–Kapila's Godunov-type scheme [39] and Tokareva–Toro's HLLC scheme [44]. The relaxation scheme also shows a higher precision and a lower computational cost (for comparable accuracy) than a standard numerical scheme used in the nuclear industry, namely Rusanov's scheme. Finally, we assess the good behavior of the scheme when approximating vanishing phase solutions.

  6. Parent-Led Activity and Nutrition (PLAN) for Healthy Living: Design and Methods

    PubMed Central

    Dalton, William T.; Schetzina, Karen E.; Holt, Nicole; Fulton-Robinson, Hazel; Ho, Ai-Leng; Tudiver, Fred; McBee, Mathew T.; Wu, Tiejian

    2011-01-01

    Child obesity has become an important public health concern, especially in rural areas. Primary care providers are well positioned to intervene with children and their parents, but encounter many barriers to addressing child overweight and obesity. This paper describes the design and methods of a cluster-randomized controlled trial to evaluate a parent-mediated approach utilizing physicians' brief motivational interviewing and parent group sessions to treat child (ages 5–11 years) overweight and obesity in the primary care setting in Southern Appalachia. The specific aims of this pilot project will be 1) to establish a primary care based and parent-mediated childhood overweight intervention program in the primary care setting, 2) to explore the efficacy of this intervention in promoting healthier weight status and health behaviors of children, and 3) to examine the acceptability and feasibility of the approach among parents and primary care providers. If proven to be effective, this approach may be an exportable model for other primary care practices. PMID:21777701

  7. Intracochlear pressure measurements to study bone conduction transmission: State-of-the-art and proof of concept of the experimental procedure

    NASA Astrophysics Data System (ADS)

    Borgers, Charlotte; van Wieringen, Astrid; D'hondt, Christiane; Verhaert, Nicolas

    2018-05-01

    The cochlea is the main contributor in bone conduction perception. Measurements of differential pressure in the cochlea give a good estimation of the cochlear input provided by bone conduction stimulation. Recent studies have proven the feasibility of intracochlear pressure measurements in chinchillas and in human temporal bones to study bone conduction. However, similar measurements in fresh-frozen whole human cadaveric heads could give a more realistic representation of the five different transmission pathways of bone conduction to the cochlea compared to human temporal bones. The aim of our study is to develop and validate a framework for intracochlear pressure measurements to evaluate different aspects of bone conduction in whole human cadaveric heads. A proof of concept describing our experimental setup is provided together with the procedure. Additionally, we also present a method to fix the stapes footplate in order to simulate otosclerosis in human temporal bones. The effectiveness of this method is verified by some preliminary results.

  8. Controlling protected designation of origin of wine by Raman spectroscopy.

    PubMed

    Mandrile, Luisa; Zeppa, Giuseppe; Giovannozzi, Andrea Mario; Rossi, Andrea Mario

    2016-11-15

    In this paper, a Fourier Transform Raman spectroscopy method to authenticate the provenance of wine for food traceability applications was developed. In particular, due to the specific chemical fingerprint of the Raman spectrum, it was possible to discriminate different wines produced in the Piedmont area (North West Italy) according to i) grape variety, ii) production area and iii) ageing time. In order to create a consistent training set, more than 300 samples from tens of different producers were analyzed, and a chemometric treatment of the raw spectra was applied. A discriminant analysis method was employed in the classification procedures, providing a classification capability (percentage of correct answers) of 90% in validation for grape variety and geographical provenance, and a classification capability of 84% for ageing time. The methodology was applied successfully to raw materials without any preliminary treatment of the sample, providing a response in a very short time. Copyright © 2016 Elsevier Ltd. All rights reserved.
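
    A minimal sketch of the classification step, with synthetic spectra standing in for preprocessed FT-Raman data and cross-validated accuracy standing in for the paper's validation protocol (all sizes and values are illustrative):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)

# Synthetic stand-in for preprocessed FT-Raman spectra: rows = wines,
# columns = intensities on a common wavenumber grid; three grape varieties.
n_per_class, n_points = 40, 50
X = np.vstack([rng.normal(mu, 1.0, (n_per_class, n_points))
               for mu in (0.0, 0.3, 0.6)])
y = np.repeat([0, 1, 2], n_per_class)

lda = LinearDiscriminantAnalysis()
# Cross-validated classification capability (fraction of correct answers).
print("classification capability:", cross_val_score(lda, X, y, cv=5).mean())
```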

  9. SEPT9: A Specific Circulating Biomarker for Colorectal Cancer.

    PubMed

    Song, Lele; Li, Yuemin

    2015-01-01

    SEPT9 gene methylation has been implicated as a biomarker for colorectal cancer (CRC) for more than 10 years and has been used clinically for more than 6 years. Studies have proven it to be an accurate, reliable, fast, and convenient method for CRC detection. In this chapter, we will first provide the background on the role of the septin9 protein and the theoretical basis of the SEPT9 gene methylation assay. We will then focus on the performance of the SEPT9 gene methylation assay for CRC early detection and screening by analyzing the data obtained in clinical trials and comparing its performance with other methods or markers. Finally, we will discuss the future applications of the assay in monitoring cancer recurrence, evaluating surgery and chemotherapy, and predicting long-term survival. We hope this chapter can provide a full overview of the theoretical basis, development, validation, and clinical applications of the SEPT9 assay for both basic science researchers and clinical practitioners. © 2015 Elsevier Inc. All rights reserved.

  10. Concepts for a geostationary-like polar mission

    NASA Astrophysics Data System (ADS)

    Macdonald, Malcolm; Anderson, Pamela; Carrea, Laura; Dobke, Benjamin; Embury, Owen; Merchant, Chris; Bensi, Paolo

    2014-10-01

    An evidence-led scientific case for development of a space-based polar remote sensing platform at geostationary-like (GEO-like) altitudes is developed through methods including a data user survey. Whilst a GEO platform provides a near-static perspective, multiple platforms are required to provide circumferential coverage. Systems for achieving GEO-like polar observation likewise require multiple platforms; however, the perspective is non-stationary. A key choice is between designs that provide a complete polar view from a single platform at any given instant, and designs where this is obtained by compositing partial views from multiple sensors. Users foresee an increased challenge in extracting geophysical information from composite images and consider the use of non-composited images advantageous. Users also find the placement of apogee over the pole preferable to the alternative scenarios. Thus, a clear majority of data users find the "Taranis" orbit concept to be better than a critical inclination orbit, due to the improved perspective offered. The geophysical products that would benefit from a GEO-like polar platform are mainly estimated from radiances in the visible/near-infrared and thermal parts of the electromagnetic spectrum, which is consistent with currently proven technologies from GEO. Based on the survey results, needs analysis, and current technology proven from GEO, scientific and observation requirements are developed along with two instrument concepts with eight and four channels, based on Flexible Combined Imager heritage. It is found that an operational system could, most likely, be deployed from an Ariane 5 ES to a 16-hour orbit, while a proof-of-concept system could be deployed from a Soyuz launch to the same orbit.

  11. A Novel Space Partitioning Algorithm to Improve Current Practices in Facility Placement

    PubMed Central

    Jimenez, Tamara; Mikler, Armin R; Tiwari, Chetan

    2012-01-01

    In the presence of naturally occurring and man-made public health threats, the feasibility of regional bio-emergency contingency plans plays a crucial role in the mitigation of such emergencies. While the analysis of in-place response scenarios provides a measure of quality for a given plan, it takes human judgment to identify improvements in plans that are otherwise likely to fail. Since resource constraints and government mandates limit the services that can be provided in an emergency, computational techniques can determine optimal locations for providing emergency response, assuming that a uniform distribution of demand across homogeneous resources will yield an optimal service outcome. This paper presents an algorithm that recursively partitions the geographic space into sub-regions while equally distributing the population across the partitions. For this method, we have proven the existence of an upper bound on the deviation from the optimal population size for sub-regions. PMID:23853502
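
    The core idea, recursive bisection of space at the population median so that each sub-region carries a near-equal share of demand, can be sketched as a k-d-tree-style split. This is an illustrative reconstruction, not the authors' exact algorithm:

```python
import numpy as np

def partition(points, depth, axis=0):
    """Recursively bisect a point set at the population median along
    alternating axes, yielding 2**depth sub-regions of near-equal
    population (a k-d-tree-style sketch of equal-demand partitioning)."""
    if depth == 0:
        return [points]
    cut = np.median(points[:, axis])
    left = points[points[:, axis] <= cut]
    right = points[points[:, axis] > cut]
    nxt = (axis + 1) % points.shape[1]
    return partition(left, depth - 1, nxt) + partition(right, depth - 1, nxt)

rng = np.random.default_rng(5)
pop = rng.normal(size=(10_000, 2))   # one point per resident
regions = partition(pop, depth=3)    # 8 sub-regions
print([len(r) for r in regions])     # near-equal population counts
```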

  12. Modeling individual effects in the Cormack-Jolly-Seber Model: A state-space formulation

    USGS Publications Warehouse

    Royle, J. Andrew

    2008-01-01

    In population and evolutionary biology, there exists considerable interest in individual heterogeneity in parameters of demographic models for open populations. However, flexible and practical solutions to the development of such models have proven to be elusive. In this article, I provide a state-space formulation of open population capture-recapture models with individual effects. The state-space formulation provides a generic and flexible framework for modeling and inference in models with individual effects, and it yields a practical means of estimation in these complex problems via contemporary methods of Markov chain Monte Carlo. A straightforward implementation can be achieved in the software package WinBUGS. I provide an analysis of a simple model with constant detection and survival probability parameters. A second example is based on data from a 7-year study of European dippers, in which a model with year and individual effects is fitted.
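
    The state-space separation of a latent survival process from a detection process is easy to make concrete with a simulator; model fitting, done in WinBUGS in the article, is not shown, and the parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)

def simulate_cjs(n_ind=100, n_occ=7, phi=0.7, p=0.5):
    """State-space Cormack-Jolly-Seber: latent alive state z and
    observed capture history y, conditioned on first release."""
    z = np.ones((n_ind, n_occ), dtype=int)   # state: alive (1) / dead (0)
    y = np.zeros((n_ind, n_occ), dtype=int)  # observation: detected or not
    y[:, 0] = 1                              # all marked on occasion 1
    for t in range(1, n_occ):
        z[:, t] = rng.binomial(1, phi * z[:, t - 1])   # survival process
        y[:, t] = rng.binomial(1, p * z[:, t])         # detection process
    return y

y = simulate_cjs()
print("detections per occasion:", y.sum(axis=0))
```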

  13. Reducing Communication in Algebraic Multigrid Using Additive Variants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vassilevski, Panayot S.; Yang, Ulrike Meier

    Algebraic multigrid (AMG) has proven to be an effective scalable solver on many high performance computers. However, its increasing communication complexity on coarser levels has been shown to seriously impact its performance on computers with high communication cost. Moreover, while additive AMG variants provide increased parallelism and decreased numbers of messages per cycle, they generally exhibit slower convergence. Here we present various new additive variants with convergence rates that are significantly improved compared to the classical additive algebraic multigrid method, and we investigate their potential for decreased communication and improved communication-computation overlap, features that are essential for good performance on future exascale architectures.

  14. Reducing Communication in Algebraic Multigrid Using Additive Variants

    DOE PAGES

    Vassilevski, Panayot S.; Yang, Ulrike Meier

    2014-02-12

    Algebraic multigrid (AMG) has proven to be an effective scalable solver on many high performance computers. However, its increasing communication complexity on coarser levels has been shown to seriously impact its performance on computers with high communication cost. Moreover, while additive AMG variants provide increased parallelism and decreased numbers of messages per cycle, they generally exhibit slower convergence. Here we present various new additive variants with convergence rates that are significantly improved compared to the classical additive algebraic multigrid method, and we investigate their potential for decreased communication and improved communication-computation overlap, features that are essential for good performance on future exascale architectures.
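
    The multiplicative/additive contrast can be seen in a two-level sketch: the additive variant applies the smoother and the coarse-grid correction to the same residual and sums them, so the two steps can proceed concurrently, whereas a multiplicative cycle would recompute the residual in between. A minimal two-level additive preconditioner for a 1-D Poisson matrix follows; it is illustrative only, not the authors' AMG variants:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import LinearOperator, cg, splu

n = 256
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csc")  # 1-D Poisson

# Piecewise-constant aggregation: a coarse grid of n//2 aggregates.
P = sp.kron(sp.eye(n // 2), np.ones((2, 1)), format="csc")  # prolongation
Ac = (P.T @ A @ P).tocsc()
coarse = splu(Ac)
d_inv = 1.0 / A.diagonal()

def additive_two_level(r):
    # Additive: the damped-Jacobi smoother and the coarse correction act
    # on the SAME residual and are summed (a multiplicative cycle would
    # recompute the residual between the two steps, serializing them).
    return 0.5 * d_inv * r + P @ coarse.solve(P.T @ r)

M = LinearOperator((n, n), matvec=additive_two_level)
b = np.ones(n)
x, info = cg(A, b, M=M)
print("CG converged:", info == 0)
```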

  15. Verification of an analytic modeler for capillary pump loop thermal control systems

    NASA Technical Reports Server (NTRS)

    Schweickart, R. B.; Neiswanger, L.; Ku, J.

    1987-01-01

    A number of computer programs have been written to model two-phase heat transfer systems for space use. These programs support the design of thermal control systems and provide a method of predicting their performance in the wide range of thermal environments of space. Predicting the performance of one such system known as the capillary pump loop (CPL) is the intent of the CPL Modeler. By modeling two developed CPL systems and comparing the results with actual test data, the CPL Modeler has proven useful in simulating CPL operation. Results of the modeling effort are discussed, together with plans for refinements to the modeler.

  16. Bathymetric Terrain Model of the Puerto Rico Trench and the Northeastern Caribbean Region for Marine Geological Investigations

    USGS Publications Warehouse

    Andrews, Brian D.; ten Brink, Uri S.; Danforth, William W.; Chaytor, Jason D.; Granja-Bruna, J; Carbo-Gorosabel, A

    2014-01-01

    Multibeam bathymetry data collected in the Puerto Rico Trench and Northeast Caribbean region are compiled into a seamless bathymetric terrain model for broad-scale geological investigations of the trench system. These data, collected during eight separate surveys between 2002 and 2013 and covering almost 180,000 square kilometers, are published here as a large-format map sheet and digital spatial data. This report describes the common multibeam data collection and processing methods used to produce the bathymetric terrain model and the corresponding data source polygon. Details documenting the complete provenance of the data are also provided in the metadata in the Data Catalog section.

  17. Use of a wave reverberation technique to infer the density compression of shocked liquid deuterium to 75 GPa.

    PubMed

    Knudson, M D; Hanson, D L; Bailey, J E; Hall, C A; Asay, J R

    2003-01-24

    A novel approach was developed to probe density compression of liquid deuterium (L-D2) along the principal Hugoniot. Relative transit times of shock waves reverberating within the sample are shown to be sensitive to the compression due to the first shock. This technique has proven to be more sensitive than the conventional method of inferring density from the shock and mass velocity, at least in this high-pressure regime. Results in the range of 22-75 GPa indicate an approximately fourfold density compression, and provide data to differentiate between proposed theories for hydrogen and its isotopes.

  18. Swarm intelligence metaheuristics for enhanced data analysis and optimization.

    PubMed

    Hanrahan, Grady

    2011-09-21

    The swarm intelligence (SI) computing paradigm has proven itself as a comprehensive means of solving complicated analytical chemistry problems by emulating biologically inspired processes. As global-optimum search metaheuristics, SI algorithms have been widely used in training neural networks, function optimization, prediction and classification, and in a variety of process-based analytical applications. The goal of this review is to provide readers with critical insight into the utility of swarm intelligence tools as methods for solving complex chemical problems. Consideration is given to algorithm development, ease of implementation and model performance, detailing subsequent influences on a number of application areas in the analytical, bioanalytical and detection sciences.
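
    Particle swarm optimization is the canonical SI metaheuristic in such applications; a minimal sketch with illustrative hyperparameters, standing in for, e.g., calibrating two model parameters against a response surface:

```python
import numpy as np

rng = np.random.default_rng(7)

def pso(f, dim=2, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization: particles move under inertia,
    attraction to their personal best, and attraction to the swarm best."""
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.apply_along_axis(f, 1, x)
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = np.apply_along_axis(f, 1, x)
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, f(g)

# Example: minimize a simple two-parameter response surface.
print(pso(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2))
```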

  19. Identification and root cause analysis of cell culture media precipitates in the viral deactivation treatment with high-temperature/short-time method.

    PubMed

    Cao, Xiaolin; Stimpfl, Gregory; Wen, Zai-Qing; Frank, Gregory; Hunter, Glenn

    2013-01-01

    High-temperature/short-time (HTST) treatment of cell culture media is one of the proven techniques used in the biopharmaceutical manufacturing industry for the prevention and mitigation of media viral contamination. With the HTST method, the formulated media is pasteurized (virus-deactivated) by heating and pumping the media continuously through preset high-temperature holding tubes to achieve a specified period of time at a specific temperature. Recently, during the evaluation and implementation of the HTST method in multiple Amgen, Inc. manufacturing facilities, media precipitates were observed in the tests of HTST treatments. The media precipitates may have adverse consequences such as clogging the HTST system, altering operating conditions, compromising the efficacy of viral deactivation, and ultimately affecting the media composition and cell growth. In this study, we report the identification of the composition of media precipitates from multiple media HTST runs using combined microspectroscopic methods including Raman, Fourier transform infrared spectroscopy, and scanning electron microscopy with energy-dispersive X-ray spectroscopy. The major components of the precipitates were determined to be metal phosphates, including calcium phosphate, magnesium phosphate, and iron (III) phosphate. Based on the composition, stoichiometry, and root-cause study of the media precipitations, methods were implemented for the mitigation and prevention of the occurrence of media precipitation. Viral contamination in cell culture media may have serious consequences on product quality, efficacy, and safety; this paper provides the identification and root-cause study of the media precipitates that adversely affected the HTST process and discusses possible solutions to mitigate the precipitation problem.

  20. Optimized mtDNA Control Region Primer Extension Capture Analysis for Forensically Relevant Samples and Highly Compromised mtDNA of Different Age and Origin

    PubMed Central

    Eduardoff, Mayra; Xavier, Catarina; Strobl, Christina; Casas-Vargas, Andrea; Parson, Walther

    2017-01-01

    The analysis of mitochondrial DNA (mtDNA) has proven useful in forensic genetics and ancient DNA (aDNA) studies, where specimens are often highly compromised and DNA quality and quantity are low. In forensic genetics, the mtDNA control region (CR) is commonly sequenced using established Sanger-type Sequencing (STS) protocols involving fragment sizes down to approximately 150 base pairs (bp). Recent developments include Massively Parallel Sequencing (MPS) of (multiplex) PCR-generated libraries using the same amplicon sizes. Molecular genetic studies on archaeological remains that harbor more degraded aDNA have pioneered alternative approaches to target mtDNA, such as capture hybridization and primer extension capture (PEC) methods followed by MPS. These assays target smaller mtDNA fragment sizes (down to 50 bp or less), and have proven to be substantially more successful in obtaining useful mtDNA sequences from these samples compared to electrophoretic methods. Here, we present the modification and optimization of a PEC method, earlier developed for sequencing the Neanderthal mitochondrial genome, with forensic applications in mind. Our approach was designed for a more sensitive enrichment of the mtDNA CR in a single tube assay and short laboratory turnaround times, thus complying with forensic practices. We characterized the method using sheared, high quantity mtDNA (six samples), and tested challenging forensic samples (n = 2) as well as compromised solid tissue samples (n = 15) up to 8 kyrs of age. The PEC MPS method produced reliable and plausible mtDNA haplotypes that were useful in the forensic context. It yielded plausible data in samples that did not provide results with STS and other MPS techniques. We addressed the issue of contamination by including four generations of negative controls, and discuss the results in the forensic context. We finally offer perspectives for future research to enable the validation and accreditation of the PEC MPS method for final implementation in forensic genetic laboratories. PMID:28934125

  1. Making a Literature Methods Course "Realistic."

    ERIC Educational Resources Information Center

    Lewis, William J.

    Recognizing that it can be a challenge to make an undergraduate literature methods course realistic, a methods instructor at a Michigan university has developed three major and several minor activities that have proven effective in preparing pre-student teachers for the "real world" of teaching and, at the same time, have been challenging and…

  2. Australian Curriculum Linked Lessons

    ERIC Educational Resources Information Center

    Hurrell, Derek

    2014-01-01

    In providing a continued focus on tasks and activities that help to illustrate key ideas embedded in the new Australian Curriculum, this issue will focus on Number in the Number and Algebra strand. In this article Derek Hurrell provides a few tried and proven activities to develop place value understanding. These activities are provided for…

  3. Representing annotation compositionality and provenance for the Semantic Web

    PubMed Central

    2013-01-01

    Background Though the annotation of digital artifacts with metadata has a long history, the bulk of that work focuses on the association of single terms or concepts to single targets. As annotation efforts expand to capture more complex information, annotations will need to be able to refer to knowledge structures formally defined in terms of more atomic knowledge structures. Existing provenance efforts in the Semantic Web domain primarily focus on tracking provenance at the level of whole triples and do not provide enough detail to track how individual triple elements of annotations were derived from triple elements of other annotations. Results We present a task- and domain-independent ontological model for capturing annotations and their linkage to their denoted knowledge representations, which can be singular concepts or more complex sets of assertions. We have implemented this model as an extension of the Information Artifact Ontology in OWL and made it freely available, and we show how it can be integrated with several prominent annotation and provenance models. We present several application areas for the model, ranging from linguistic annotation of text to the annotation of disease-associations in genome sequences. Conclusions With this model, progressively more complex annotations can be composed from other annotations, and the provenance of compositional annotations can be represented at the annotation level or at the level of individual elements of the RDF triples composing the annotations. This in turn allows for progressively richer annotations to be constructed from previous annotation efforts, the precise provenance recording of which facilitates evidence-based inference and error tracking. PMID:24268021

  4. Using Detrital Zircon Geochronology to Constrain Paleogene Provenance and Its Relationship to Rifting in the Zhu 1 Depression, Pearl River Mouth Basin, South China Sea

    NASA Astrophysics Data System (ADS)

    Wang, Wei; Ye, Jiaren; Bidgoli, Tandis; Yang, Xianghua; Shi, Hesheng; Shu, Yu

    2017-11-01

    Paleogene syn-rift successions in the South China Sea are poorly understood, and systematic provenance analysis, which could provide clues to their history, is lacking. Here we report 409 new concordant U-Pb ages from detrital zircons separated from the Paleogene Wenchang, Enping, and Zhuhai formations in the Zhu 1 depression, Pearl River Mouth Basin (PRMB). The new data, combined with published age data from the region, document changes in the provenance of syn-rift successions. Detrital zircons from the Eocene Wenchang Formation are unimodal, with Jurassic-Cretaceous (180-80 Ma) ages making up >80% of grains. The ages are consistent with the geochronology of intrabasinal highs, dominated by igneous rocks emplaced during the Yanshanian orogeny, and suggest local provenance. By contrast, detrital zircons from the upper Eocene to lower Oligocene Enping Formation form three well-recognized age clusters, with peaks at 150, 254, and 438 Ma that match documented tectonomagmatism in the South China Block (SCB). Combined with increasing numbers of Precambrian zircons, the data suggest an increasing influence of regional provenance from the SCB. Similar age peaks are also recognized from the limited number of zircons analyzed from the upper Oligocene Zhuhai Formation, and comparability with modern shelf and river sediment indicates the unit was mainly sourced from the SCB and likely transported by a paleo-Pearl River. We infer that the change in provenance, from local uplifts within the Zhu 1 to the SCB, is related to distinct phases of PRMB rift development; the later changes, however, are best explained by SCB drainage evolution.

  5. An approximate method for determining of investment risk

    NASA Astrophysics Data System (ADS)

    Slavkova, Maria; Tzenova, Zlatina

    2016-12-01

    In this work, a method for determining investment risk across all economic states is considered. It is connected to matrix games with two players. A definition of risk in a matrix game is introduced, and three properties are proven. An appropriate example is considered.
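
    The abstract does not spell out its definition of risk, so the sketch below substitutes the classical Savage regret construction: for each economic state (a column of the payoff matrix), risk is the shortfall relative to the best payoff attainable in that state, and a minimax-regret rule then picks an investment. All numbers are hypothetical.

```python
import numpy as np

def regret_matrix(payoff: np.ndarray) -> np.ndarray:
    # For each state (column), regret = best payoff in that state - payoff received.
    return payoff.max(axis=0, keepdims=True) - payoff

def minimax_regret_choice(payoff: np.ndarray) -> int:
    # Pick the row (investment) whose worst-case regret across states is smallest.
    return int(np.argmin(regret_matrix(payoff).max(axis=1)))

# Rows: investments; columns: economic states (boom, stagnation, recession).
payoff = np.array([[9.0, 4.0, -2.0],
                   [6.0, 5.0,  1.0],
                   [3.0, 3.0,  3.0]])
print(regret_matrix(payoff))
print("choose investment", minimax_regret_choice(payoff))  # -> 1
```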

  6. Computer-aided target tracking in motion analysis studies

    NASA Astrophysics Data System (ADS)

    Burdick, Dominic C.; Marcuse, M. L.; Mislan, J. D.

    1990-08-01

    Motion analysis studies require the precise tracking of reference objects in sequential scenes. In a typical situation, events of interest are captured at high frame rates using special cameras, and selected objects or targets are tracked on a frame by frame basis to provide necessary data for motion reconstruction. Tracking is usually done using manual methods which are slow and prone to error. A computer based image analysis system has been developed that performs tracking automatically. The objective of this work was to eliminate the bottleneck due to manual methods in high volume tracking applications such as the analysis of crash test films for the automotive industry. The system has proven to be successful in tracking standard fiducial targets and other objects in crash test scenes. Over 95 percent of target positions which could be located using manual methods can be tracked by the system, with a significant improvement in throughput over manual methods. Future work will focus on the tracking of clusters of targets and on tracking deformable objects such as airbags.

  7. Research on the Construction of Remote Sensing Automatic Interpretation Symbol Big Data

    NASA Astrophysics Data System (ADS)

    Gao, Y.; Liu, R.; Liu, J.; Cheng, T.

    2018-04-01

    Remote sensing automatic interpretation symbol (RSAIS) data offer an inexpensive and fast method of providing precise in-situ information for image interpretation and accuracy assessment. This study designed a scientific and precise RSAIS data characterization method, as well as a distributed, cloud-architecture massive data storage method. Additionally, it introduced an offline and online data update mode and a dynamic data evaluation mechanism, with the aim of creating an efficient approach for RSAIS big data construction. Finally, a national RSAIS database with more than 3 million samples covering 86 land types was constructed during 2013-2015 based on the National Geographic Conditions Monitoring Project of China, and it has been updated annually since 2016. RSAIS big data has proven to be a good method for large-scale image interpretation and field validation. Notably, it also has the potential to support automatic image interpretation with the assistance of deep learning technology in the remote sensing big data era.

  8. Study of Commercially Available Lobelia chinensis Products Using Bar-HRM Technology.

    PubMed

    Sun, Wei; Yan, Song; Li, Jingjian; Xiong, Chao; Shi, Yuhua; Wu, Lan; Xiang, Li; Deng, Bo; Ma, Wei; Chen, Shilin

    2017-01-01

    There is an unmet need for herbal medicine identification using a fast, sensitive, and easy-to-use method that does not require complex infrastructure and well-trained technicians. For instance, the detection of adulterants in Lobelia chinensis herbal products has been challenging, since current detection technologies are not effective due to their own limitations. High Resolution Melting (HRM) has emerged as a powerful new technology for clinical diagnosis, research in the food industry, and plant molecular biology, and this method has already highlighted the complexity of species identification. In this study, we developed a method of species-specific detection of L. chinensis using HRM analysis combined with the internal transcribed spacer 2 region. We then applied this method to commercial products purporting to contain L. chinensis. Our results demonstrated that HRM can differentiate L. chinensis from six common adulterants. HRM was proven to be a fast and accurate technique for testing the authenticity of L. chinensis in herbal products. Based on these results, a HRM approach for herbal authentication is provided.
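
    As a rough illustration of the HRM principle, the sketch below builds synthetic sigmoidal melting curves for two templates and recovers each melting temperature (Tm) as the peak of the negative derivative of fluorescence with respect to temperature. The Tm values and curve shapes are invented for illustration and are not taken from the study.

```python
import numpy as np

temps = np.linspace(75.0, 90.0, 301)  # temperature ramp, 0.05 degC steps

def melt_curve(tm: float, width: float = 0.7) -> np.ndarray:
    # Idealized sigmoidal loss of fluorescence as the duplex melts around Tm.
    return 1.0 / (1.0 + np.exp((temps - tm) / width))

for name, tm in [("L. chinensis", 83.2), ("adulterant A", 84.6)]:
    fluor = melt_curve(tm)
    melt_peak = -np.gradient(fluor, temps)       # -dF/dT melt peak
    print(name, "Tm =", round(temps[np.argmax(melt_peak)], 2), "degC")
```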

  9. Stability and stabilization of the lattice Boltzmann method

    NASA Astrophysics Data System (ADS)

    Brownlee, R. A.; Gorban, A. N.; Levesley, J.

    2007-03-01

    We revisit the classical stability versus accuracy dilemma for the lattice Boltzmann methods (LBM). Our goal is a stable method of second-order accuracy for fluid dynamics based on the lattice Bhatnagar-Gross-Krook method (LBGK). The LBGK scheme can be recognized as a discrete dynamical system generated by free flight and entropic involution. In this framework the stability and accuracy analysis are more natural. We find the necessary and sufficient conditions for second-order accurate fluid dynamics modeling. In particular, it is proven that in order to guarantee second-order accuracy the distribution should belong to a distinguished surface, the invariant film (up to second order in the time step). This surface is the trajectory of the (quasi)equilibrium distribution surface under free flight. The main instability mechanisms are identified. The simplest recipes for stabilization add no artificial dissipation (up to second order) and provide second-order accuracy of the method. Two other prescriptions add some artificial dissipation locally and prevent the system from loss of positivity and local blowup. Demonstrations of the proposed stable LBGK schemes are provided by the numerical simulation of a one-dimensional (1D) shock tube and the unsteady 2D flow around a square cylinder up to Reynolds number Re ≈ 20000.

  10. The Safe Drinking Water Act of 1974 and Its Role in Providing Access to Safe Drinking Water in the United States.

    PubMed

    Weinmeyer, Richard; Norling, Annalise; Kawarski, Margaret; Higgins, Estelle

    2017-10-01

    In 1974, President Gerald Ford signed into law the Safe Drinking Water Act, the first piece of legislation of its kind to provide a comprehensive regulatory framework for overseeing the nation's drinking water supply. The law has proven instrumental in setting standards for ensuring that the US population can access drinking water that is safe. However, the law delegates much of its monitoring requirements to states, creating, at times, a confusing and complicated system of standards that must be adhered to and enforced. Although the law has proven valuable in the safety standards it specifies, its administration and enforcement pose tremendous challenges. © 2017 American Medical Association. All Rights Reserved.

  11. Open access chemical probes for epigenetic targets

    PubMed Central

    Brown, Peter J; Müller, Susanne

    2015-01-01

    Background High attrition rates in drug discovery call for new approaches to improve target validation. Academia is filling gaps, but often lacks the experience and resources of the pharmaceutical industry, resulting in poorly characterized tool compounds. Discussion The SGC has established an open access chemical probe consortium, currently encompassing ten pharmaceutical companies. One of its mandates is to create well-characterized inhibitors (chemical probes) for epigenetic targets to enable new biology and target validation for drug development. Conclusion Epigenetic probe compounds have proven to be very valuable and have not only spurred a plethora of novel biological findings, but also provided starting points for clinical trials. These probes have proven to be a critical complement to traditional genetic targeting strategies and have provided some surprising results. PMID:26397018

  12. Space exploration initiative (SEI) logistics support lessons from the DoD

    NASA Astrophysics Data System (ADS)

    Cox, John R.; McCoy, Walbert G.; Jenkins, Terence

    Proven and innovative logistics management approaches and techniques used for developing and supporting DoD and Strategic Defense Initiative Office (SDIO) systems are described on the basis of input from DoD to the SEI Synthesis Group; SDIO-developed logistics initiatives, innovative tools, and methodologies; and logistics planning support provided to the NASA/Johnson Planet Surface System Office. The approach is tailored for lunar/Martian surface operations, and provides guidelines for the development and management of a crucial element of the SEI logistics support program. A case study is presented which shows how incorporation of DoD's proven and innovative logistics management approach, tools, and techniques can substantially benefit early logistics planning for SEI, while also implementing many of DoD's recommendations for SEI.

  13. Toward a systematic exploration of nano-bio interactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bai, Xue; Liu, Fang; Liu, Yin

    Many studies of nanomaterials make non-systematic alterations of nanoparticle physicochemical properties. Given the immense size of the property space for nanomaterials, such approaches are not very useful in elucidating fundamental relationships between inherent physicochemical properties of these materials and their interactions with, and effects on, biological systems. Data-driven artificial intelligence methods such as machine learning algorithms have proven highly effective in generating models with good predictivity and some degree of interpretability. They can provide a viable method of reducing or eliminating animal testing. However, careful experimental design with the modelling of the results in mind is a proven and efficient way of exploring large materials spaces. This approach, coupled with the high-speed automated experimental synthesis and characterization technologies now appearing, is the fastest route to developing models that regulatory bodies may find useful. We advocate a greatly increased focus on systematic modification of the physicochemical properties of nanoparticles, combined with comprehensive biological evaluation and computational analysis. This is essential to obtain a better mechanistic understanding of nano-bio interactions, and to derive quantitatively predictive and robust models for the properties of nanomaterials that have useful domains of applicability. Highlights: • Nanomaterials studies make non-systematic alterations to nanoparticle properties. • Vast nanomaterials property spaces require systematic studies of nano-bio interactions. • Experimental design and modelling are efficient ways of exploring materials spaces. • We advocate systematic modification and computational analysis to probe nano-bio interactions.

  14. Acid-fast Smear and Histopathology Results Provide Guidance for the Appropriate Use of Broad-Range Polymerase Chain Reaction and Sequencing for Mycobacteria.

    PubMed

    Miller, Kennon; Harrington, Susan M; Procop, Gary W

    2015-08-01

    New molecular diagnostic tests are attractive because of the potential they hold for improving diagnostics in microbiology. The value of these tests, which is often assumed, should be investigated to determine the best use of these potentially powerful tools. Our objective was to investigate the usefulness of broad-range polymerase chain reaction (PCR), followed by sequencing, in mycobacterial infections. We reviewed the test performance of acid-fast bacilli (AFB) PCR and traditional diagnostic methods (histopathology, AFB smear, and culture). We assessed the diagnostic effect and cost of the unrestricted ordering of broad-range PCR for the detection and identification of mycobacteria in clinical specimens. The AFB PCR was less sensitive than culture and histopathology and was less specific than culture, AFB smear, and histopathology. During 18 months, $93,063 was spent on 183 patient specimens for broad-range PCR and DNA sequencing for mycobacteria to confirm one culture-proven Mycobacterium tuberculosis infection that was also known to be positive by AFB smear and histopathology. In this cohort, there was a false-negative AFB PCR for M. tuberculosis and a false-positive AFB PCR for Mycobacterium lentiflavum. Testing of AFB smear-negative specimens from patients without an inflammatory response supportive of a mycobacterial infection is costly and has not been proven to improve patient care. Traditional diagnostics (histopathology, AFB smear, and culture) should remain the primary methods for the detection of mycobacteria in clinical specimens.

  15. Exploring natural variation of Pinus pinaster Aiton using metabolomics: Is it possible to identify the region of origin of a pine from its metabolites?

    PubMed

    Meijón, Mónica; Feito, Isabel; Oravec, Michal; Delatorre, Carolina; Weckwerth, Wolfram; Majada, Juan; Valledor, Luis

    2016-02-01

    Natural variation of the metabolome of Pinus pinaster was studied to improve understanding of its role in the adaptation process and phenotypic diversity. The metabolomes of needles and the apical and basal section of buds were analysed in ten provenances of P. pinaster, selected from France, Spain and Morocco, grown in a common garden for 5 years. The employment of complementary mass spectrometry techniques (GC-MS and LC-Orbitrap-MS) together with bioinformatics tools allowed the reliable quantification of 2403 molecular masses. The analysis of the metabolome showed that differences were maintained across provenances and that the metabolites characteristic of each organ are mainly related to amino acid metabolism, while provenances were distinguishable essentially through secondary metabolism when organs were analysed independently. Integrative analyses of metabolome, environmental and growth data provided a comprehensive picture of adaptation plasticity in conifers. These analyses defined two major groups of plants, distinguished by secondary metabolism: that is, either Atlantic or Mediterranean provenance. Needles were the most sensitive organ, where strong correlations were found between flavonoids and the water regime of the geographic origin of the provenance. The data obtained point to genome specialization aimed at maximizing the drought stress resistance of trees depending on their origin. © 2016 John Wiley & Sons Ltd.

  16. Assessing uncertainty in sighting records: an example of the Barbary lion.

    PubMed

    Lee, Tamsin E; Black, Simon A; Fellous, Amina; Yamaguchi, Nobuyuki; Angelici, Francesco M; Al Hikmani, Hadi; Reed, J Michael; Elphick, Chris S; Roberts, David L

    2015-01-01

    As species become rare and approach extinction, purported sightings can be controversial, especially when scarce management resources are at stake. We consider the probability that each individual sighting of a series is valid. Obtaining these probabilities requires a strict framework to ensure that they are as accurately representative as possible. We used a process, which has proven to provide accurate estimates from a group of experts, to obtain probabilities for the validation of 32 sightings of the Barbary lion. We consider the scenario where experts are simply asked whether a sighting was valid, as well as asking them to score the sighting based on distinguishability, observer competence, and verifiability. We find that asking experts to provide scores for these three aspects resulted in each sighting being considered more individually, meaning that this new questioning method provides very different estimated probabilities that a sighting is valid, which greatly affects the outcome from an extinction model. We consider linear opinion pooling and logarithmic opinion pooling to combine the three scores, and also to combine opinions on each sighting. We find the two methods produce similar outcomes, allowing the user to focus on chosen features of each method, such as satisfying the marginalisation property or being externally Bayesian.
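
    A minimal sketch of the two pooling rules named in the abstract, assuming each expert reports a probability that a sighting is valid: the linear pool is a weighted arithmetic mean, while the logarithmic pool renormalizes a weighted geometric mean against its complement. The expert probabilities are hypothetical.

```python
import numpy as np

def linear_pool(probs, weights=None):
    # Linear opinion pool: weighted arithmetic mean of expert probabilities.
    p = np.asarray(probs, dtype=float)
    w = np.full(len(p), 1.0 / len(p)) if weights is None else np.asarray(weights)
    return float(np.sum(w * p))

def log_pool(probs, weights=None):
    # Logarithmic opinion pool: weighted geometric mean, renormalized
    # against the pooled probability of the complementary event.
    p = np.asarray(probs, dtype=float)
    w = np.full(len(p), 1.0 / len(p)) if weights is None else np.asarray(weights)
    valid = np.prod(p ** w)
    not_valid = np.prod((1.0 - p) ** w)
    return float(valid / (valid + not_valid))

experts = [0.9, 0.6, 0.2]        # three experts' P(sighting is valid)
print(linear_pool(experts))      # ~0.57
print(log_pool(experts))         # ~0.60
```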

  17. Damage Detection Based on Static Strain Responses Using FBG in a Wind Turbine Blade.

    PubMed

    Tian, Shaohua; Yang, Zhibo; Chen, Xuefeng; Xie, Yong

    2015-08-14

    Damage detection in a wind turbine blade enables better operation of the turbine and provides an early alert to damage events in the blade, helping to avoid catastrophic losses. A new non-baseline damage detection method based on fiber Bragg gratings (FBG) in a wind turbine blade is developed in this paper. Firstly, the Chi-square distribution is proven to be an effective damage-sensitive feature, which is adopted as the individual information source for the local decision. In order to obtain a global, optimal decision for damage detection, the feature information fusion (FIF) method is proposed to fuse and optimize the information in the above individual information sources, and the damage is detected accurately through the global decision. A 13.2 m wind turbine blade with a distributed strain sensor system is then adopted to demonstrate the feasibility of the proposed method, and the strain energy method (SEM) is used as a comparison to illustrate its advantage. Finally, the results show that the proposed method can deliver encouraging damage detection performance in the wind turbine blade.

  18. Turbomachinery Airfoil Design Optimization Using Differential Evolution

    NASA Technical Reports Server (NTRS)

    Madavan, Nateri K.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    An aerodynamic design optimization procedure that is based on an evolutionary algorithm known as Differential Evolution is described. Differential Evolution is a simple, fast, and robust evolutionary strategy that has proven effective in determining the global optimum for several difficult optimization problems, including highly nonlinear systems with discontinuities and multiple local optima. The method is combined with a Navier-Stokes solver that evaluates the various intermediate designs and provides inputs to the optimization procedure. An efficient constraint handling mechanism is also incorporated. Results are presented for the inverse design of a turbine airfoil from a modern jet engine. The capability of the method to search large design spaces and obtain the optimal airfoils in an automatic fashion is demonstrated. Substantial reductions in the overall computing time requirements are achieved by using the algorithm in conjunction with neural networks.
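
    For readers unfamiliar with the optimizer, here is a minimal sketch of the canonical DE/rand/1/bin scheme (mutation by a scaled difference vector, binomial crossover, greedy selection) on a standard test function. It is a generic implementation, not the authors' Navier-Stokes-coupled design procedure.

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9,
                           generations=200, seed=None):
    """Minimal DE/rand/1/bin minimizer."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    dim = len(lo)
    pop = lo + rng.random((pop_size, dim)) * (hi - lo)
    fit = np.apply_along_axis(f, 1, pop)
    for _ in range(generations):
        for i in range(pop_size):
            # Mutate with three distinct members other than i.
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, size=3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            # Binomial crossover, forcing at least one mutant gene.
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            f_trial = f(trial)
            if f_trial <= fit[i]:            # greedy selection
                pop[i], fit[i] = trial, f_trial
    best = int(np.argmin(fit))
    return pop[best], fit[best]

# Example: minimize the Rosenbrock function on [-2, 2]^2.
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
x_best, f_best = differential_evolution(rosen, [(-2, 2), (-2, 2)], seed=0)
print(x_best, f_best)
```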

  19. Optical image encryption using multilevel Arnold transform and noninterferometric imaging

    NASA Astrophysics Data System (ADS)

    Chen, Wen; Chen, Xudong

    2011-11-01

    Information security has attracted much attention recently due to the rapid development of modern technologies such as computers and the internet. We propose a novel method for optical image encryption using a multilevel Arnold transform and rotatable-phase-mask noninterferometric imaging. An optical image encryption scheme is developed in the gyrator transform domain, and one phase-only mask (i.e., phase grating) is rotated and updated during image encryption. For the decryption, an iterative retrieval algorithm is proposed to extract high-quality plaintexts. Conventional encoding methods (such as digital holography) have been proven vulnerable to attacks, and the proposed optical encoding scheme can effectively eliminate security deficiencies and significantly enhance cryptosystem security. The proposed strategy based on the rotatable phase-only mask can provide a new alternative for data/image encryption in noninterferometric imaging.

  20. How to Measure the Intervention Process? An Assessment of Qualitative and Quantitative Approaches to Data Collection in the Process Evaluation of Organizational Interventions

    PubMed Central

    Abildgaard, Johan S.; Saksvik, Per Ø.; Nielsen, Karina

    2016-01-01

    Organizational interventions aiming at improving employee health and wellbeing have proven to be challenging to evaluate. To analyze intervention processes two methodological approaches have widely been used: quantitative (often questionnaire data), or qualitative (often interviews). Both methods are established tools, but their distinct epistemological properties enable them to illuminate different aspects of organizational interventions. In this paper, we use the quantitative and qualitative process data from an organizational intervention conducted in a national postal service, where the Intervention Process Measure questionnaire (N = 285) as well as an extensive interview study (N = 50) were used. We analyze what type of knowledge about intervention processes these two methodologies provide and discuss strengths and weaknesses as well as potentials for mixed methods evaluation methodologies. PMID:27713707

  2. Application of Lightweight Formal Methods to Software Security

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt

    2005-01-01

    Formal specification and verification of security has proven a challenging task. There is no single method that has proven feasible. Instead, an integrated approach which combines several formal techniques can increase the confidence in the verification of software security properties. Such an approach specifies security properties in a library that can be reused; two instruments and their methodologies, developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), are described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Text Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.

  3. Off-Policy Actor-Critic Structure for Optimal Control of Unknown Systems With Disturbances.

    PubMed

    Song, Ruizhuo; Lewis, Frank L; Wei, Qinglai; Zhang, Huaguang

    2016-05-01

    An optimal control method is developed for unknown continuous-time systems with unknown disturbances in this paper. The integral reinforcement learning (IRL) algorithm is presented to obtain the iterative control. Off-policy learning is used to allow the dynamics to be completely unknown. Neural networks are used to construct the critic and action networks. It is shown that if there are unknown disturbances, off-policy IRL may not converge or may be biased. To reduce the influence of unknown disturbances, a disturbance compensation controller is added. It is proven, based on Lyapunov techniques, that the weight errors are uniformly ultimately bounded. Convergence of the Hamiltonian function is also proven. The simulation study demonstrates the effectiveness of the proposed optimal control method for unknown systems with disturbances.

  4. Provenance establishment of coffee using solution ICP-MS and ICP-AES.

    PubMed

    Valentin, Jenna L; Watling, R John

    2013-11-01

    Statistical interpretation of the concentrations of 59 elements, determined using solution-based inductively coupled plasma mass spectrometry (ICP-MS) and inductively coupled plasma atomic emission spectroscopy (ICP-AES), was used to establish the provenance of coffee samples from 15 countries across five continents. The data confirmed that the harvest year, degree of ripeness, and whether the coffees were green or roasted had little effect on the elemental composition of the coffees. The application of linear discriminant analysis and principal component analysis to the elemental concentrations permitted up to 96.9% correct classification of the coffee samples according to their continent of origin. When samples from each continent were considered separately, up to 100% correct classification of coffee samples into their countries and plantations of origin was achieved. This research demonstrates the potential of using elemental composition, in combination with statistical classification methods, for accurate provenance establishment of coffee. Copyright © 2013 Elsevier Ltd. All rights reserved.
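
    A minimal sketch of the statistical step, assuming a samples-by-elements concentration matrix: standardize the 59 element concentrations and cross-validate a linear discriminant classifier of continent of origin. The data below are synthetic stand-ins, not the study's ICP-MS/ICP-AES measurements, so the printed accuracy is meaningful only as a scaffold.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# X: samples x 59 element concentrations; y: continent-of-origin labels.
rng = np.random.default_rng(1)
X = rng.lognormal(mean=0.0, sigma=1.0, size=(150, 59))
y = rng.integers(0, 5, size=150)   # five continents

clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```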

  5. Systematic reviews and knowledge translation.

    PubMed Central

    Tugwell, Peter; Robinson, Vivian; Grimshaw, Jeremy; Santesso, Nancy

    2006-01-01

    Proven effective interventions exist that would enable all countries to meet the Millennium Development Goals. However, uptake and use of these interventions in the poorest populations is at least 50% less than in the richest populations within each country. Also, we have recently shown that community effectiveness of interventions is lower for the poorest populations due to a "staircase" effect of lower coverage/access, worse diagnostic accuracy, less provider compliance and less consumer adherence. We propose an evidence-based framework for equity-oriented knowledge translation to enhance community effectiveness and health equity. This framework is represented as a cascade of steps to assess and prioritize barriers and thus choose effective knowledge translation interventions that are tailored for relevant audiences (public, patient, practitioner, policy-maker, press and private sector), as well as the evaluation, monitoring and sharing of these strategies. We have used two examples of effective interventions (insecticide-treated bednets to prevent malaria and childhood immunization) to illustrate how this framework can provide a systematic method for decision-makers to ensure the application of evidence-based knowledge in disadvantaged populations. Future work to empirically validate and evaluate the usefulness of this framework is needed. We invite researchers and implementers to use the cascade for equity-oriented knowledge translation as a guide when planning implementation strategies for proven effective interventions. We also encourage policy-makers and health-care managers to use this framework when deciding how effective interventions can be implemented in their own settings. PMID:16917652

  7. Determination of vibration-rotation lines intensities from absorption Fourier spectra

    NASA Technical Reports Server (NTRS)

    Mandin, J. Y.

    1979-01-01

    The method presented allows line intensities to be calculated from their equivalent widths, their heights, or quantities deduced from spectra obtained by Fourier spectrometry. This method has proven its effectiveness in measuring the intensities of 60 lines of the H2O molecule with a precision of 10%. However, this method cannot be applied to isolated lines.

  8. Evaluation of simple geochemical indicators of aeolian sand provenance: Late Quaternary dune fields of North America revisited

    USGS Publications Warehouse

    Muhs, Daniel

    2017-01-01

    Dune fields of Quaternary age occupy large areas of the world's arid and semiarid regions. Despite this, there has been surprisingly little work done on understanding dune sediment provenance, in part because many techniques are time-consuming, prone to operator error, experimental, highly specialized, expensive, or require sophisticated instrumentation. The use of K/Rb and K/Ba values in K-feldspar as a provenance indicator is tested here on aeolian sands of the arid and semiarid regions of North America. Results indicate that K/Rb and K/Ba can distinguish different river sands that are sediment sources for dunes, and that dune fields themselves have distinctive K/Rb and K/Ba compositions. Over the Basin and Range and Great Plains regions of North America, the hypothesized sediment sources of dune fields are reviewed and assessed using K/Rb and K/Ba values in dune sands and in hypothesized source sediments. In some cases, the origins of dunes assessed in this manner are consistent with previous studies; in others, dune fields are found to have a more complex origin than previously thought. Use of K/Rb and K/Ba for provenance studies is a robust method that is inexpensive, rapid, and highly reproducible. It exploits one of the most common minerals found in dune sand, K-feldspar, and avoids the problem of using simple concentrations of key elements, which may be subject to interpretative bias due to changes in the mineralogical maturity of Quaternary dune fields over time.

  9. Parameterizing Coefficients of a POD-Based Dynamical System

    NASA Technical Reports Server (NTRS)

    Kalb, Virginia L.

    2010-01-01

    A method of parameterizing the coefficients of a dynamical system based on a proper orthogonal decomposition (POD) representing the flow dynamics of a viscous fluid has been introduced. (A brief description of POD is presented in the immediately preceding article.) The present parameterization method is intended to enable construction of the dynamical system to accurately represent the temporal evolution of the flow dynamics over a range of Reynolds numbers. The need for this or a similar method arises as follows: a procedure that includes direct numerical simulation followed by POD, followed by Galerkin projection to a dynamical system, has been proven to enable representation of flow dynamics by a low-dimensional model at the Reynolds number of the simulation. A more difficult task, however, is to obtain models that are valid over a range of Reynolds numbers. Extrapolation of low-dimensional models by use of straightforward Reynolds-number-based parameter continuation has proven inadequate for successful prediction of flows. A key part of constructing a dynamical system that accurately represents the temporal evolution of the flow dynamics over a range of Reynolds numbers is understanding and providing for the variation of the coefficients of the dynamical system with the Reynolds number. Prior methods do not enable capture of temporal dynamics over ranges of Reynolds numbers in low-dimensional models, and are not satisfactory even when large numbers of modes are used. The basic idea of the present method is to solve the problem through a suitable parameterization of the coefficients of the dynamical system. The parameterization computations utilize the transfer of kinetic energy between modes as a function of Reynolds number. The thus-parameterized dynamical system accurately predicts the flow dynamics and is applicable to a range of flow problems in the dynamical regime around the Hopf bifurcation. Parameter-continuation software can be used on the parameterized dynamical system to derive a bifurcation diagram that accurately predicts the temporal flow behavior.
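
    As background for the approach described above, the sketch below extracts POD modes from a snapshot matrix via the singular value decomposition, returning the leading spatial modes and the fraction of fluctuation energy each captures. The snapshots are random placeholders; the article's actual contribution, parameterizing the Galerkin coefficients across Reynolds numbers by modal energy transfer, is not reproduced here.

```python
import numpy as np

def pod_modes(snapshots: np.ndarray, n_modes: int):
    """POD of a snapshot matrix (each column is one flow-field snapshot)."""
    fluctuations = snapshots - snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(fluctuations, full_matrices=False)
    energy = s**2 / np.sum(s**2)        # fractional energy per mode
    return U[:, :n_modes], energy[:n_modes]

# 1000 spatial points, 200 snapshots of a hypothetical velocity field.
X = np.random.default_rng(0).standard_normal((1000, 200))
modes, energy = pod_modes(X, n_modes=5)
print(energy)
```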

  10. Localizing chronic Q fever: a challenging query

    PubMed Central

    2013-01-01

    Background Chronic Q fever usually presents as endocarditis or endovascular infection. We investigated whether 18F-FDG PET/CT and echocardiography were able to detect the localization of infection. Also, the utility of the modified Duke criteria was assessed. Methods Fifty-two patients, who had an IgG titre of ≥ 1024 against C. burnetii phase I ≥ 3 months after primary infection or a positive PCR ≥ 1 month after primary infection, were retrospectively included. Data on serology, the results of all imaging studies, possible risk factors for developing proven chronic Q fever and clinical outcome were recorded. Results According to the Dutch consensus on Q fever diagnostics, 18 patients had proven chronic Q fever, 14 probable chronic Q fever, and 20 possible chronic Q fever. Of the patients with proven chronic Q fever, 22% were diagnosed with endocarditis, 17% with an infected vascular prosthesis, and 39% with a mycotic aneurysm. 56% of patients with proven chronic Q fever did not recall an episode of acute Q fever. Ten out of 13 18F-FDG PET/CT-scans in patients with proven chronic Q fever localized the infection. TTE and TEE were helpful in only 6% and 50% of patients, respectively. Conclusions If chronic Q fever is diagnosed, 18F-FDG PET/CT is a helpful imaging technique for localization of vascular infections due to chronic Q fever. Patients with proven chronic Q fever were diagnosed significantly more often with mycotic aneurysms than in previous case series. Definite endocarditis due to chronic Q fever was less frequently diagnosed in the current study. Chronic Q fever often occurs in patients without a known episode of acute Q fever, so clinical suspicion should remain high, especially in endemic regions. PMID:24004470

  11. Somatic Embryogenesis: Still a Relevant Technique in Citrus Improvement.

    PubMed

    Omar, Ahmad A; Dutt, Manjul; Gmitter, Frederick G; Grosser, Jude W

    2016-01-01

    The genus Citrus contains numerous fresh and processed fruit cultivars that are economically important worldwide. New cultivars are needed to battle industry threatening diseases and to create new marketing opportunities. Citrus improvement by conventional methods alone has many limitations that can be overcome by applications of emerging biotechnologies, generally requiring cell to plant regeneration. Many citrus genotypes are amenable to somatic embryogenesis, which became a key regeneration pathway in many experimental approaches to cultivar improvement. This chapter provides a brief history of plant somatic embryogenesis with focus on citrus, followed by a discussion of proven applications in biotechnology-facilitated citrus improvement techniques, such as somatic hybridization, somatic cybridization, genetic transformation, and the exploitation of somaclonal variation. Finally, two important new protocols that feature plant regeneration via somatic embryogenesis are provided: protoplast transformation and Agrobacterium-mediated transformation of embryogenic cell suspension cultures.

  12. Orientation decoding depends on maps, not columns

    PubMed Central

    Freeman, Jeremy; Brouwer, Gijs Joost; Heeger, David J.; Merriam, Elisha P.

    2011-01-01

    The representation of orientation in primary visual cortex (V1) has been examined at a fine spatial scale corresponding to the columnar architecture. We present functional magnetic resonance imaging (fMRI) measurements providing evidence for a topographic map of orientation preference in human V1 at a much coarser scale, in register with the angular-position component of the retinotopic map of V1. This coarse-scale orientation map provides a parsimonious explanation for why multivariate pattern analysis methods succeed in decoding stimulus orientation from fMRI measurements, challenging the widely-held assumption that decoding results reflect sampling of spatial irregularities in the fine-scale columnar architecture. Decoding stimulus attributes and cognitive states from fMRI measurements has proven useful for a number of applications, but our results demonstrate that the interpretation cannot assume decoding reflects or exploits columnar organization. PMID:21451017

  13. Provenance for Runtime Workflow Steering and Validation in Computational Seismology

    NASA Astrophysics Data System (ADS)

    Spinuso, A.; Krischer, L.; Krause, A.; Filgueira, R.; Magnoni, F.; Muraleedharan, V.; David, M.

    2014-12-01

    Provenance systems may be offered by modern workflow engines to collect metadata about data transformations at runtime. If combined with effective visualisation and monitoring interfaces, these provenance recordings can speed up the validation process of an experiment, suggesting interactive or automated interventions with immediate effects on the lifecycle of a workflow run. For instance, in the field of computational seismology, if we consider research applications performing long-lasting cross-correlation analysis and high-resolution simulations, the immediate notification of logical errors and rapid access to intermediate results can produce reactions which foster more efficient progress of the research. These applications are often executed in secured and sophisticated HPC and HTC infrastructures, highlighting the need for a comprehensive framework that facilitates the extraction of fine-grained provenance and the development of provenance-aware components, leveraging the scalability characteristics of the adopted workflow engines, whose enactment can be mapped to different technologies (MPI, Storm clusters, etc.). This work looks at the adoption of W3C-PROV concepts and data model within a user-driven processing and validation framework for seismic data, supporting also computational and data management steering. Validation needs to balance automation with user intervention, considering the scientist as part of the archiving process. Therefore, the provenance data is enriched with community-specific metadata vocabularies and control messages, making an experiment reproducible and its description consistent with the community understandings. Moreover, it can contain user-defined terms and annotations. The current implementation of the system is supported by the EU-funded VERCE project (http://verce.eu). It provides, as well as the provenance generation mechanisms, a prototypal browser-based user interface and a web API built on top of a NoSQL storage technology, experimenting with ways to ensure rapid and flexible access to the lineage traces. It supports users with the visualisation of graphical products and offers combined operations to access and download the data, which may be selectively stored at runtime into dedicated data archives.
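
    As an illustration of W3C-PROV-style lineage capture, the sketch below records one workflow step with the open-source Python `prov` package (entities, an activity, an agent, and the used/generation/derivation/association relations) and serializes it to PROV-JSON, as might feed a NoSQL document store. The identifiers are invented, and this is a generic sketch, not the VERCE framework's own API.

```python
from prov.model import ProvDocument

doc = ProvDocument()
doc.add_namespace('ex', 'http://example.org/seismo/')

raw = doc.entity('ex:waveform-raw')                 # input trace
out = doc.entity('ex:waveform-correlated')          # derived product
run = doc.activity('ex:cross-correlation-run-42')   # one workflow enactment
user = doc.agent('ex:scientist-alice')

doc.used(run, raw)                  # the run consumed the raw trace
doc.wasGeneratedBy(out, run)        # ...and produced the correlated trace
doc.wasDerivedFrom(out, raw)        # output lineage back to the input
doc.wasAssociatedWith(run, user)    # who steered the run

print(doc.serialize(indent=2))      # PROV-JSON string
```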

  14. Professional Nursing Series by Videoconferencing.

    ERIC Educational Resources Information Center

    Weber, Jeanne Rodier; Lawlor, Andrew C.

    1998-01-01

    A university consortium developed a continuing-education series for rural Pennsylvania nurses using videoconferencing. The program was well received by participants and proven a viable method of delivery to isolated areas. (SK)

  15. Transition of Suomi National Polar-Orbiting Partnership (S-NPP) Data Products for Operational Weather Forecasting Applications

    NASA Technical Reports Server (NTRS)

    Smith, Matthew R.; Molthan, Andrew L.; Fuell, Kevin K.; Jedlovec, Gary J.

    2012-01-01

    SPoRT is a team of NASA/NOAA scientists focused on demonstrating the utility of NASA and future NOAA data and derived products in improving short-term weather forecasts. The team works collaboratively with a suite of unique products and selected WFOs in an end-to-end transition activity, with stable funding from NASA and NOAA. It is recognized by the science community as the "go to" place for transitioning experimental and research data to the operational weather community, is endorsed by NWS ESSD/SSD chiefs, and follows a proven paradigm for transitioning satellite observations and modeling capabilities to operations (R2O). SPoRT's transition of NASA satellite instruments provides unique or higher-resolution data products to complement the baseline suite of geostationary data available to forecasters. SPoRT's partnership with NWS WFOs provides them with unique imagery to support disaster response and local forecast challenges. SPoRT has years of proven experience in developing and transitioning research products to the operational weather community and has begun work with CONUS and OCONUS WFOs to determine the best products for maximum benefit to forecasters. VIIRS has already proven to be another extremely powerful tool, enhancing forecasters' ability to handle difficult forecasting situations.

  16. A new method of imposing boundary conditions for hyperbolic equations

    NASA Technical Reports Server (NTRS)

    Funaro, D.

    1987-01-01

    A new method to impose boundary conditions for pseudospectral approximations to hyperbolic equations is suggested. This method involves the collocation of the equation at the boundary nodes as well as satisfying boundary conditions. Stability and convergence results are proven for the Chebyshev approximation of linear scalar hyperbolic equations. The eigenvalues of this method applied to parabolic equations are shown to be real and negative.

  17. The efficacy of wire and glue hair snares in identifying mesocarnivores

    Treesearch

    William J. Zielinski; Fredrick V. Schlexer; Kristine L. Pilgrim; Michael K. Schwartz

    2006-01-01

    Track plates and cameras are proven methods for detecting and identifying fishers (Martes pennanti) and other mesocarnivores. But these methods are inadequate to achieve demographic and population-monitoring objectives that require identifying sex and individuals. Although noninvasive collection of biological material for genetic analysis (i.e.,...

  18. Upgrading in an Industrial Setting. Final Report.

    ERIC Educational Resources Information Center

    Russell, Wendell

    The project objectives were: (1) to assess existing industrial upgrading practices in an Atomic Energy Commission contractor organization, (2) to design new alternative upgrading methods, (3) to experiment with new upgrading methods, (4) to plan for utilization of proven upgrading programs, and (5) to document and disseminate activities. A twelve…

  19. Silvicultural systems for southern bottomland hardwood forests

    Treesearch

    James S. Meadows; John A. Stanturf

    1997-01-01

    Silvicultural systems integrate both regeneration and intermediate operations in an orderly process for managing forest stands. The clearcutting method of regeneration favors the development of species that are moderately intolerant to intolerant of shade. In fact, clearcutting is the most proven and widely used method of successfully regenerating bottomland oak...

  20. Nonlinear inversion of resistivity sounding data for 1-D earth models using the Neighbourhood Algorithm

    NASA Astrophysics Data System (ADS)

    Ojo, A. O.; Xie, Jun; Olorunfemi, M. O.

    2018-01-01

    To reduce ambiguity related to nonlinearities in the resistivity model-data relationships, an efficient direct-search scheme employing the Neighbourhood Algorithm (NA) was implemented to solve the 1-D resistivity problem. In addition to finding a range of best-fit models that are more likely to be global minima, this method investigates the entire multi-dimensional model space and provides additional information about the posterior model covariance matrix, the marginal probability density functions, and an ensemble of acceptable models. This provides new insights into how well the model parameters are constrained and makes assessing trade-offs between them possible, thus avoiding some common interpretation pitfalls. The efficacy of the newly developed program is tested by inverting both synthetic (noisy and noise-free) data and field data from other authors employing different inversion methods, so as to provide a good basis for performance comparison. In all cases, the inverted model parameters were in good agreement with the true and recovered model parameters from other methods, and for the field dataset they correlated remarkably well with the available borehole litho-log and known geology. The NA method has proven useful when a good starting model is not available, and the reduced number of unknowns in the 1-D resistivity inverse problem makes it an attractive alternative to the linearized methods. Hence, it is concluded that the newly developed program offers an excellent complementary tool for the global inversion of layered resistivity structure.
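
    A simplified sketch of the direct-search idea, with one stated deviation: the true Neighbourhood Algorithm resamples uniformly within the Voronoi cells of the current best models, whereas here a Gaussian perturbation around those models stands in for that step. The five-parameter model space (three resistivities, two thicknesses) and the misfit function are toy stand-ins, not the paper's forward-modelled apparent-resistivity misfit.

```python
import numpy as np

def na_like_search(misfit, bounds, n_init=100, n_resample=50, n_best=10,
                   iterations=20, scale=0.1, seed=None):
    """Ensemble direct search that concentrates sampling near best-fit models."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    models = lo + rng.random((n_init, len(lo))) * (hi - lo)
    for _ in range(iterations):
        fits = np.apply_along_axis(misfit, 1, models)
        best = models[np.argsort(fits)[:n_best]]       # keep best models
        centers = best[rng.integers(n_best, size=n_resample)]
        new = np.clip(centers + rng.normal(0.0, scale * (hi - lo),
                                           centers.shape), lo, hi)
        models = np.vstack([best, new])
    fits = np.apply_along_axis(misfit, 1, models)
    order = np.argsort(fits)
    return models[order], fits[order]                  # ranked ensemble

# Toy 1-D model space: [rho1, rho2, rho3, h1, h2] against a known target.
target = np.array([50.0, 200.0, 20.0, 5.0, 15.0])
misfit = lambda m: float(np.sum(((m - target) / target) ** 2))
ensemble, fit = na_like_search(misfit, [(1, 500)] * 3 + [(1, 50)] * 2, seed=0)
print(ensemble[0], fit[0])
```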

  1. Local lymph node assay (LLNA) for detection of sensitization capacity of chemicals.

    PubMed

    Gerberick, G Frank; Ryan, Cindy A; Dearman, Rebecca J; Kimber, Ian

    2007-01-01

    The local lymph node assay (LLNA) is a murine model developed to evaluate the skin sensitization potential of chemicals. The LLNA is an alternative approach to traditional guinea pig methods and, in comparison, provides important animal welfare benefits. The assay relies on measurement of events induced during the induction phase of skin sensitization, specifically lymphocyte proliferation in the draining lymph nodes, which is a hallmark of a skin sensitization response. Since its introduction the LLNA has been the subject of extensive evaluation on a national and international scale, and has been successfully validated and incorporated worldwide into regulatory guidelines. Experience gained in recent years has demonstrated that adherence to published procedures and guidelines for the LLNA (e.g., with respect to dose and vehicle selection) is critical for the successful conduct and eventual interpretation of the data. In addition to providing a robust method for skin sensitization hazard identification, the LLNA has proven very useful in assessing the skin sensitizing potency of test chemicals, and this has provided invaluable information to risk assessors. The primary method for comparing the relative potency of chemical sensitizers is to use linear interpolation to estimate the concentration of chemical required to induce a stimulation index of three relative to concurrent vehicle-treated controls (the EC3). In certain situations where less-than-optimal dose-response data are available, a log-linear extrapolation method can be used to estimate an EC3 value, which can significantly reduce the need for repeat testing of chemicals. The LLNA, when conducted according to published guidelines, provides a robust method for skin sensitization testing that yields not only reliable hazard identification information but also the data necessary for effective risk assessment and risk management.
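
    A minimal sketch of the EC3 interpolation described above, assuming ascending test concentrations and their measured stimulation indices: find the pair of doses bracketing SI = 3 and interpolate linearly between them. The dose-response values are hypothetical.

```python
def ec3(doses, si_values):
    """EC3 by linear interpolation between the doses bracketing SI = 3.

    doses: ascending test concentrations (%); si_values: matching SIs.
    Returns None if the SI never crosses 3 within the tested range.
    """
    points = list(zip(doses, si_values))
    for (c_lo, si_lo), (c_hi, si_hi) in zip(points, points[1:]):
        if si_lo < 3.0 <= si_hi:   # bracketing pair found
            return c_lo + (3.0 - si_lo) / (si_hi - si_lo) * (c_hi - c_lo)
    return None

# Hypothetical dose-response: SI crosses 3 between 5% and 10%.
print(ec3([1.0, 5.0, 10.0, 25.0], [1.2, 2.1, 4.5, 7.9]))  # -> 6.875
```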

  2. Implementation of a Flipped Classroom for Nuclear Medicine Physician CME.

    PubMed

    Komarraju, Aparna; Bartel, Twyla B; Dickinson, Lisa A; Grant, Frederick D; Yarbrough, Tracy L

    2018-06-21

    Increasingly, emerging technologies are expanding instructional possibilities, with new methods being adopted to improve knowledge acquisition and retention. Within medical education, many new techniques have been employed in the undergraduate setting, with less utilization thus far in the continuing medical education (CME) sphere. This paper discusses the use of a new method for CME: the "flipped classroom," widely used in undergraduate medical education. This method engages learners by providing content before the live ("in class") session that aids in preparation and fosters in-class engagement. A flipped classroom method was employed using an online image-rich case-based module and quiz prior to a live CME session at a national nuclear medicine meeting. The preparatory material provided a springboard for in-depth discussion at the live session, a case-based activity utilizing audience response technology. Study participants completed a survey regarding their initial experience with this new instructional method. In addition, focus group interviews were conducted with session attendees who had or had not completed the presession material; transcripts were qualitatively analyzed. Quantitative survey data (completed by two-thirds of the session attendees) suggested that the flipped method was highly valuable and met attendee educational objectives. Analysis of focus group data yielded six themes broadly related to two categories: benefits of the flipped method for CME, and programmatic considerations for successfully implementing the flipped method in CME. Data from this study have proven encouraging and support further investigations around the incorporation of this innovative teaching method into CME for nuclear imaging specialists.

  3. Prediction of physical protein protein interactions

    NASA Astrophysics Data System (ADS)

    Szilágyi, András; Grimm, Vera; Arakaki, Adrián K.; Skolnick, Jeffrey

    2005-06-01

    Many essential cellular processes such as signal transduction, transport, cellular motion and most regulatory mechanisms are mediated by protein-protein interactions. In recent years, new experimental techniques have been developed to discover the protein-protein interaction networks of several organisms. However, the accuracy and coverage of these techniques have proven to be limited, and computational approaches remain essential both to assist in the design and validation of experimental studies and for the prediction of interaction partners and detailed structures of protein complexes. Here, we provide a critical overview of existing structure-independent and structure-based computational methods. Although these techniques have significantly advanced in the past few years, we find that most of them are still in their infancy. We also provide an overview of experimental techniques for the detection of protein-protein interactions. Although the developments are promising, false positive and false negative results are common, and reliable detection is possible only by taking a consensus of different experimental approaches. The shortcomings of experimental techniques affect both the further development and the fair evaluation of computational prediction methods. For an adequate comparative evaluation of prediction and high-throughput experimental methods, an appropriately large benchmark set of biophysically characterized protein complexes would be needed, but is sorely lacking.

  4. Towards structured sharing of raw and derived neuroimaging data across existing resources

    PubMed Central

    Keator, D.B.; Helmer, K.; Steffener, J.; Turner, J.A.; Van Erp, T.G.M.; Gadde, S.; Ashish, N.; Burns, G.A.; Nichols, B.N.

    2013-01-01

    Data sharing efforts increasingly contribute to the acceleration of scientific discovery. Neuroimaging data are accumulating in distributed domain-specific databases, and there is currently neither an integrated access mechanism nor an accepted format for the critically important meta-data needed to make use of the combined, available neuroimaging data. In this manuscript, we present work from the Derived Data Working Group, an open-access group sponsored by the Biomedical Informatics Research Network (BIRN) and the International Neuroinformatics Coordinating Facility (INCF) focused on practical tools for distributed access to neuroimaging data. The working group develops models and tools facilitating the structured interchange of neuroimaging meta-data and is making progress towards a unified set of tools for such data and meta-data exchange. We report on the key components required for integrated access to raw and derived neuroimaging data as well as associated meta-data and provenance across neuroimaging resources. The components include (1) a structured terminology that provides semantic context to data, (2) a formal data model for neuroimaging with robust tracking of data provenance, (3) a web service-based application programming interface (API) that provides a consistent mechanism to access and query the data model, and (4) a provenance library that can be used for the extraction of provenance data by image analysts and imaging software developers. We believe that the framework and set of tools outlined in this manuscript have great potential for solving many of the issues the neuroimaging community faces when sharing raw and derived neuroimaging data across the various existing database systems for the purpose of accelerating scientific discovery. PMID:23727024

  5. ProvenCare perinatal: a model for delivering evidence/guideline-based care for perinatal populations.

    PubMed

    Berry, Scott A; Laam, Leslie A; Wary, Andrea A; Mateer, Harry O; Cassagnol, Hans P; McKinley, Karen E; Nolan, Ruth A

    2011-05-01

    Geisinger Health System (GHS) has applied its ProvenCare model to demonstrate that a large integrated health care delivery system, enabled by an electronic health record (EHR), could reengineer a complicated clinical process, reduce unwarranted variation, and provide evidence-based care for patients with a specified clinical condition. In 2007 GHS began to apply the model to a more complicated, longer-term condition of "wellness": perinatal care. ADAPTING PROVENCARE TO PERINATAL CARE: The ProvenCare Perinatal initiative was more complex than the five previous ProvenCare endeavors in terms of breadth, scope, and duration. Each of the 22 sites created a process flow map to depict the current, real-time process at each location. The local practice site providers (physicians and mid-level practitioners) reached consensus on 103 unique best practice measures (BPMs), which would be tracked for every patient. These maps were then used to create a single standardized pathway that included the BPMs but also preserved some unique care offerings that reflected the needs of the local context. A nine-phase methodology, expanded from the previous six-phase model, was implemented on schedule. Pre- to postimplementation improvement occurred for all seven BPMs or BPM bundles that were considered the most clinically relevant, five of them statistically significant. In addition, the rate of primary cesarean sections decreased by 32%, and birth trauma remained unchanged as the number of vaginal births increased. Preliminary experience suggests that integrating evidence/guideline-based best practices into work flows in inpatient and outpatient settings can achieve improvements in daily patient care processes and outcomes.

  6. A quasi-likelihood approach to non-negative matrix factorization

    PubMed Central

    Devarajan, Karthik; Cheung, Vincent C.K.

    2017-01-01

    A unified approach to non-negative matrix factorization based on the theory of generalized linear models is proposed. This approach embeds a variety of statistical models, including the exponential family, within a single theoretical framework and provides a unified view of such factorizations from the perspective of quasi-likelihood. Using this framework, a family of algorithms for handling signal-dependent noise is developed and its convergence proven using the Expectation-Maximization algorithm. In addition, a measure to evaluate the goodness-of-fit of the resulting factorization is described. The proposed methods allow modeling of non-linear effects via appropriate link functions and are illustrated using an application in biomedical signal processing. PMID:27348511
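
    A concrete reference point may help here: the sketch below implements the classic Lee-Seung multiplicative updates for NMF under the generalized Kullback-Leibler divergence, one member of the family that the quasi-likelihood framework unifies. It is an illustrative sketch, not the authors' algorithm, and the toy matrix V is invented.

    ```python
    import numpy as np

    def nmf_kl(V, rank, n_iter=200, eps=1e-9, seed=0):
        """Multiplicative-update NMF minimizing generalized KL divergence.

        Signal-dependent noise models in the quasi-likelihood framework
        swap in different variance/link functions; the KL case shown here
        is the standard Lee-Seung baseline."""
        rng = np.random.default_rng(seed)
        n, m = V.shape
        W = rng.random((n, rank)) + eps
        H = rng.random((rank, m)) + eps
        for _ in range(n_iter):
            WH = W @ H + eps
            H *= (W.T @ (V / WH)) / (W.T @ np.ones_like(V) + eps)
            WH = W @ H + eps
            W *= ((V / WH) @ H.T) / (np.ones_like(V) @ H.T + eps)
        return W, H

    # Toy usage: factor a small non-negative matrix at rank 3
    V = np.abs(np.random.default_rng(1).random((20, 15)))
    W, H = nmf_kl(V, rank=3)
    print(np.linalg.norm(V - W @ H))
    ```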

  7. Optimizing Global Coronal Magnetic Field Models Using Image-Based Constraints

    NASA Technical Reports Server (NTRS)

    Jones-Mecholsky, Shaela I.; Davila, Joseph M.; Uritskiy, Vadim

    2016-01-01

    The coronal magnetic field directly or indirectly affects a majority of the phenomena studied in the heliosphere. It provides energy for coronal heating, controls the release of coronal mass ejections, and drives heliospheric and magnetospheric activity, yet the coronal magnetic field itself has proven difficult to measure. This difficulty has prompted a decades-long effort to develop accurate, timely models of the field, an effort that continues today. We have developed a method for improving global coronal magnetic field models by incorporating the type of morphological constraints that could be derived from coronal images. Here we report promising initial tests of this approach on two theoretical problems, and discuss opportunities for application.

  8. A Planetarium Inside Your Office: Virtual Reality in the Dome Production Pipeline

    NASA Astrophysics Data System (ADS)

    Summers, Frank

    2018-01-01

    Producing astronomy visualization sequences for a planetarium without ready access to a dome is a distorted geometric challenge. Fortunately, one can now use virtual reality (VR) to simulate a dome environment without ever leaving one's office chair. The VR dome experience has proven to be a more than suitable pre-visualization method that requires only modest amounts of processing beyond the standard production pipeline. It also provides a crucial testbed for identifying, testing, and fixing the visual constraints and artifacts that arise in a spherical presentation environment. Topics addressed here include rendering, geometric projection, movie encoding, software playback, and hardware setup for a virtual dome using VR headsets.

  9. Angle Control on the Optima HE/XE Ion Implanter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bell, Edward; Satoh, Shu

    2008-11-03

    The Optima HE/XE is the latest generation of high energy ion implanter from Axcelis, combining proven RF linear accelerator technology with new single wafer processing. The architecture of the implanter is designed to provide a parallel beam at the wafer plane over the full range of implant energies and beam currents. One of the advantages of this system is the ability to control both the horizontal and vertical implant angles for each implant. Included in the design is the ability to perform in situ measurements of the horizontal and vertical angles of the beam in real time. The method of the horizontal and vertical angle measurements is described in this paper.

  10. Uplink Array Calibration via Far-Field Power Maximization

    NASA Technical Reports Server (NTRS)

    Vilnrotter, V.; Mukai, R.; Lee, D.

    2006-01-01

    Uplink antenna arrays have the potential to greatly increase the Deep Space Network's high-data-rate uplink capabilities as well as its useful range, and to provide additional uplink signal power during critical spacecraft emergencies. While techniques for calibrating an array of receive antennas have been addressed previously, proven concepts for uplink array calibration have yet to be demonstrated. This article describes a method of utilizing the Moon as a natural far-field reflector for calibrating a phased array of uplink antennas. Using this calibration technique, the radio frequency carriers transmitted by each antenna of the array are optimally phased to ensure that the uplink power received by the spacecraft is maximized.
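
    To make the calibration idea concrete, here is a minimal simulation sketch, not the actual DSN procedure: the per-antenna phase errors are unknown, the only observable is total far-field power (as a reflector such as the Moon would return), and each antenna's phase trim is swept in turn to maximize that power.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_ant = 4
    # Unknown per-antenna phase errors the calibration must remove (simulated)
    true_err = rng.uniform(-np.pi, np.pi, n_ant)

    def far_field_power(trim):
        """Power a far-field observer would measure for a given trim vector."""
        return np.abs(np.exp(1j * (true_err + trim)).sum()) ** 2

    # Sweep one element at a time, keeping the trim that maximizes power
    trim = np.zeros(n_ant)
    sweep = np.linspace(-np.pi, np.pi, 361)
    for _ in range(3):                      # a few passes refine the solution
        for k in range(n_ant):
            powers = [far_field_power(np.where(np.arange(n_ant) == k, s, trim))
                      for s in sweep]
            trim[k] = sweep[int(np.argmax(powers))]

    print(far_field_power(trim) / n_ant**2)   # approaches 1.0 when aligned
    ```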

  11. Molecular tools for differentiation of non-typeable Haemophilus influenzae from Haemophilus haemolyticus

    PubMed Central

    Pickering, Janessa; Richmond, Peter C.; Kirkham, Lea-Ann S.

    2014-01-01

    Non-typeable Haemophilus influenzae (NTHi) and Haemophilus haemolyticus are closely related bacteria that reside in the upper respiratory tract. NTHi is associated with respiratory tract infections that frequently result in antibiotic prescription, whilst H. haemolyticus is rarely associated with disease. NTHi and H. haemolyticus can be indistinguishable by traditional culture methods, and molecular differentiation has proven difficult. This review chronologically summarizes the molecular approaches that have been developed for differentiation of NTHi from H. haemolyticus, highlighting the advantages and disadvantages of each target and/or technique. We also provide suggestions for the development of new tools that would be suitable for clinical and research laboratories. PMID:25520712

  12. OPTIMIZING GLOBAL CORONAL MAGNETIC FIELD MODELS USING IMAGE-BASED CONSTRAINTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, Shaela I.; Davila, Joseph M.; Uritsky, Vadim, E-mail: shaela.i.jonesmecholsky@nasa.gov

    The coronal magnetic field directly or indirectly affects a majority of the phenomena studied in the heliosphere. It provides energy for coronal heating, controls the release of coronal mass ejections, and drives heliospheric and magnetospheric activity, yet the coronal magnetic field itself has proven difficult to measure. This difficulty has prompted a decades-long effort to develop accurate, timely models of the field, an effort that continues today. We have developed a method for improving global coronal magnetic field models by incorporating the type of morphological constraints that could be derived from coronal images. Here we report promising initial tests of this approach on two theoretical problems, and discuss opportunities for application.

  13. An Overview of Practice Facilitation Programs in Canada: Current Perspectives and Future Directions

    PubMed Central

    Liddy, Clare; Laferriere, Dianne; Baskerville, Bruce; Dahrouge, Simone; Knox, Lyndee; Hogg, William

    2013-01-01

    Practice facilitation has proven to be effective in improving the quality of primary care. A practice facilitator is a health professional, usually external to the practice, who regularly visits the practice to provide support in change management that targets improvements in the delivery of care. Our environmental scan shows that several initiatives across Canada utilize practice facilitation as a quality improvement method; however, many are conducted in isolation as there is a lack of coordinated effort, knowledge translation and dissemination in this field across the country. We recommend that investments be made in capacity building, knowledge exchange and facilitator training, and that partnership building be considered a priority in this field. PMID:23968627

  14. Portable Instrument to Measure CDOM Light Absorption in Aquatic Systems: WPI Success Story

    NASA Technical Reports Server (NTRS)

    2001-01-01

    World Precision Instruments, Inc. (WPI), of Sarasota, FL, in collaboration with NASA's John C. Stennis Space Center, has developed an innovative instrument to accurately measure Colored Dissolved Organic Matter (CDOM) absorption in the field. This successful collaboration has culminated in an exciting new device, called the UltraPath, now commercially available through WPI. Traditional methods of measuring absorption of dissolved materials require special handling and storage prior to measurement. Use of laboratory spectrophotometers as the measuring devices has proven time-consuming and cumbersome, and the instruments are delicate to handle. The UltraPath provides a low-cost, highly sensitive, rugged, portable system that is capable of high sensitivity measurements in widely divergent waters.

  15. High Field Small Animal Magnetic Resonance Oncology Studies

    PubMed Central

    Bokacheva, Louisa; Ackerstaff, Ellen; LeKaye, H. Carl; Zakian, Kristen; Koutcher, Jason A.

    2014-01-01

    This review focuses on the applications of high magnetic field magnetic resonance imaging (MRI) and spectroscopy (MRS) to cancer studies in small animals. High-field MRI can provide information about tumor physiology, the microenvironment, metabolism, vascularity, and cellularity. Such studies are invaluable for understanding tumor growth and proliferation, response to treatment, and drug development. The MR techniques reviewed here include ¹H and ³¹P MRS, Chemical Exchange Saturation Transfer (CEST) imaging, and hyperpolarized ¹³C MR spectroscopy, as well as diffusion-weighted, Blood Oxygen Level Dependent (BOLD) contrast, and dynamic contrast-enhanced MR imaging. These methods have been proven effective in animal studies and are highly relevant to human clinical studies. PMID:24374985

  16. Language Integrated Technology Project Final Evaluation Report.

    ERIC Educational Resources Information Center

    Stiegemeier, Lois

    The goal of the Language Integrated Technology Grant Project (LIT) consortium was to help provide critical components of successful reading programs through a combination of proven computer/print programs and teacher training. Through leadership provided by the Educational Service District 113 (Olympia, Washington), the LIT consortium of schools…

  17. Autumn frost hardiness in Norway spruce plus tree progeny and trees of the local and transferred provenances in central Sweden.

    PubMed

    Hannerz, Mats; Westin, Johan

    2005-09-01

    Reforestation with provenances from locations remote from the planting site (transferred provenances) or the progeny of trees of local provenances selected for superior form and vigor (plus trees) offer alternative means to increase yield over that obtained by the use of seed from unselected trees of the local provenance. Under Swedish conditions, Norway spruce (Picea abies (L.) Karst.) of certain transferred provenances generally has an advantage in productivity relative to the local provenance comparable to that of progeny of plus trees. The aim of this study was to explore the extent to which productivity gains achieved by provenance transfer or the use of plus tree progeny are associated with reductions in autumn frost hardiness, relative to that of trees of the local provenance. In a field trial with 19-year-old trees in central Sweden, bud hardiness was tested on four occasions during the autumn of 2002. Trees of the local provenance were compared with trees of a south Swedish provenance originating 3 degrees of latitude to the south, a Belarusian provenance and the progeny of plus trees of local origin. The Belarusian provenance was the least hardy and the local provenance the most hardy, with plus tree progeny and the south Swedish provenance being intermediate in hardiness. Both the Belarusian provenance and the plus tree progeny were significantly taller than trees of the other populations. Within provenances, tree height was negatively correlated with autumn frost hardiness. Among the plus tree progeny, however, no such correlation between tree height and autumn frost hardiness was found. It is concluded that although the gain in productivity achieved by provenance transfer from Belarus was comparable to that achieved by using the progeny of plus trees of the local provenance, the use of trees of the Belarus provenance involved an increased risk of autumn frost damage because of later hardening.

  18. Examining structural and clinical factors associated with implementation of standing orders for adult immunization.

    PubMed

    Yonas, Michael A; Nowalk, Mary Patricia; Zimmerman, Richard K; Ahmed, Faruque; Albert, Steven M

    2012-01-01

    A proven method to increase vaccination rates in primary care is a standing orders program (SOP) for nonphysician staff to assess and vaccinate eligible individuals without a specific written physician order. This study describes a mixed methods approach to examining physicians' beliefs and attitudes about and adoption of SOPs for adult immunizations, specifically, influenza and pneumococcal polysaccharide vaccine. Focus groups and in-depth interviews of physicians, nurses, practice managers, and the medical director of a managed care health plan were conducted. Results were used to enrich a concise survey based on the Awareness-to-Adherence model of physician behavior and previous research, which was mailed to 1,640 general internists and family physicians nationwide. Barriers to SOPs identified through qualitative methods were lack of interest in changing the status quo, a physician-dominated hierarchy, and fear of malpractice. Facilitators included having an electronic medical record and a practice culture that was open to change. The survey (response rate 67%) confirmed the facilitators and further identified patient, physician, and practice factors that served as barriers to establishing and maintaining SOPs. This mixed methods approach provided the opportunity to develop a tailored and practice-oriented survey for examining the contextual factors influencing clinical providers' decisions to implement SOPs for adult immunization. © 2011 National Association for Healthcare Quality.

  19. Development and validation of a novel, simple, and accurate spectrophotometric method for the determination of lead in human serum.

    PubMed

    Shayesteh, Tavakol Heidari; Khajavi, Farzad; Khosroshahi, Abolfazl Ghafuri; Mahjub, Reza

    2016-01-01

    The determination of blood lead levels is the most useful indicator of the amount of lead absorbed by the human body. Various methods, such as atomic absorption spectroscopy (AAS), have already been used for the detection of lead in biological fluids, but most of them depend on complicated, expensive, and highly specialized instruments. In this study, a simple and accurate spectroscopic method for the determination of lead was developed and applied to the investigation of lead concentrations in biological samples. A silica gel column was used to extract lead and eliminate interfering agents in human serum samples. The column was washed with deionized water, the pH was adjusted to 8.2 using phosphate buffer, and tartrate and cyanide solutions were then added as masking agents. The lead content was extracted into an organic phase containing dithizone as a complexing reagent; the resulting dithizone-Pb(II) complex was confirmed by visible spectrophotometry at 538 nm. The recovery was found to be 84.6 %. To validate the method, a calibration curve spanning several concentration levels was constructed and proven linear over the range 0.01-1.5 μg/ml, with an R² regression coefficient of 0.9968. The largest error values were -5.80 % and +11.6 % for intra-day and inter-day measurements, respectively, and the largest RSD values were 6.54 % and 12.32 %. The limit of detection (LOD) was calculated to be 0.002 μg/ml. The developed method was applied to determine the lead content in serum from volunteer miners, and no statistically significant difference was found between the data provided by this novel method and data obtained previously by AAS.
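
    The validation arithmetic reported above can be reproduced in a few lines. The sketch below fits a linear calibration and computes R² and a limit of detection; the absorbance data are invented, and the LOD convention 3.3·σ/slope is one common choice (an assumption here, not necessarily the authors').

    ```python
    import numpy as np

    # Hypothetical calibration standards (ug/ml) and absorbances at 538 nm
    conc = np.array([0.01, 0.10, 0.25, 0.50, 0.75, 1.00, 1.50])
    absb = np.array([0.012, 0.071, 0.168, 0.329, 0.488, 0.652, 0.973])

    slope, intercept = np.polyfit(conc, absb, 1)   # linear calibration
    pred = slope * conc + intercept
    ss_res = np.sum((absb - pred) ** 2)
    r2 = 1 - ss_res / np.sum((absb - absb.mean()) ** 2)

    # LOD via the 3.3 * (residual SD) / slope convention (assumed)
    sd_resid = np.sqrt(ss_res / (len(conc) - 2))
    lod = 3.3 * sd_resid / slope

    unknown_abs = 0.250                            # a measured serum extract
    print((unknown_abs - intercept) / slope, r2, lod)
    ```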

  20. Toward the Geoscience Paper of the Future: Best practices for documenting and sharing research from data to software to provenance

    NASA Astrophysics Data System (ADS)

    Gil, Yolanda; David, Cédric H.; Demir, Ibrahim; Essawy, Bakinam T.; Fulweiler, Robinson W.; Goodall, Jonathan L.; Karlstrom, Leif; Lee, Huikyo; Mills, Heath J.; Oh, Ji-Hyun; Pierce, Suzanne A.; Pope, Allen; Tzeng, Mimi W.; Villamizar, Sandra R.; Yu, Xuan

    2016-10-01

    Geoscientists now live in a world rich with digital data and methods, and their computational research cannot be fully captured in traditional publications. The Geoscience Paper of the Future (GPF) presents an approach to fully document, share, and cite all their research products including data, software, and computational provenance. This article proposes best practices for GPF authors to make data, software, and methods openly accessible, citable, and well documented. The publication of digital objects empowers scientists to manage their research products as valuable scientific assets in an open and transparent way that enables broader access by other scientists, students, decision makers, and the public. Improving documentation and dissemination of research will accelerate the pace of scientific discovery by improving the ability of others to build upon published work.

  1. Ontology-Driven Provenance Management in eScience: An Application in Parasite Research

    NASA Astrophysics Data System (ADS)

    Sahoo, Satya S.; Weatherly, D. Brent; Mutharaju, Raghava; Anantharam, Pramod; Sheth, Amit; Tarleton, Rick L.

    Provenance, from the French word "provenir", describes the lineage or history of a data entity. Provenance is critical information in scientific applications to verify experiment process, validate data quality and associate trust values with scientific results. Current industrial scale eScience projects require an end-to-end provenance management infrastructure. This infrastructure needs to be underpinned by formal semantics to enable analysis of large scale provenance information by software applications. Further, effective analysis of provenance information requires well-defined query mechanisms to support complex queries over large datasets. This paper introduces an ontology-driven provenance management infrastructure for biology experiment data, as part of the Semantic Problem Solving Environment (SPSE) for Trypanosoma cruzi (T.cruzi). This provenance infrastructure, called T.cruzi Provenance Management System (PMS), is underpinned by (a) a domain-specific provenance ontology called Parasite Experiment ontology, (b) specialized query operators for provenance analysis, and (c) a provenance query engine. The query engine uses a novel optimization technique based on materialized views called materialized provenance views (MPV) to scale with increasing data size and query complexity. This comprehensive ontology-driven provenance infrastructure not only allows effective tracking and management of ongoing experiments in the Tarleton Research Group at the Center for Tropical and Emerging Global Diseases (CTEGD), but also enables researchers to retrieve the complete provenance information of scientific results for publication in literature.
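
    The materialized-provenance-view idea can be illustrated with a toy sketch: precompute each entity's full lineage once, so a provenance query becomes a single lookup instead of a graph traversal per query. All names are hypothetical, the graph is assumed acyclic, and the RDF/ontology machinery of the actual system is ignored.

    ```python
    # Toy "derived_from" edges: result -> immediate inputs (hypothetical)
    edges = {
        "figure3": ["dataset_v2", "analysis_run7"],
        "dataset_v2": ["raw_scan1", "raw_scan2"],
        "analysis_run7": ["dataset_v2"],
    }

    def materialize_lineage(edges):
        """Precompute the transitive closure of derived_from (the MPV idea)."""
        memo = {}
        def lineage(node):
            if node not in memo:
                acc = set()
                for parent in edges.get(node, []):
                    acc.add(parent)
                    acc |= lineage(parent)
                memo[node] = acc
            return memo[node]
        return {n: lineage(n) for n in edges}

    mpv = materialize_lineage(edges)
    print(mpv["figure3"])   # complete provenance answered by one lookup
    ```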

  2. Anti-inflammatory drugs and prediction of new structures by comparative analysis.

    PubMed

    Bartzatt, Ronald

    2012-01-01

    Nonsteroidal anti-inflammatory drugs (NSAIDs) are a group of agents important for their analgesic, anti-inflammatory, and antipyretic properties. This study presents several approaches to predicting and elucidating new molecular structures of NSAIDs based on 36 known and proven anti-inflammatory compounds. For these 36 NSAIDs, the mean Log P is 3.338 (standard deviation = 1.237), the mean polar surface area is 63.176 Å² (standard deviation = 20.951 Å²), and the mean molecular weight is 292.665 (standard deviation = 55.627). Nine molecular properties are determined for the 36 agents, including Log P, number of -OH and -NHn groups, violations of the Rule of 5, number of rotatable bonds, and numbers of oxygens and nitrogens. Statistical analysis of these nine properties provides numerical parameters to conform to in the design of novel NSAID drug candidates. Multiple regression analysis is performed using these properties, followed by examples of predicted molecular weight based on minimum and maximum property values. Hierarchical cluster analysis indicated that licofelone, tolfenamic acid, meclofenamic acid, droxicam, and aspirin are substantially distinct from all remaining NSAIDs. Analysis of similarity (ANOSIM) produced R = 0.4947, indicating a low to moderate level of dissimilarity among the 36 NSAIDs. Non-hierarchical K-means cluster analysis separated the 36 NSAIDs into four groups whose members are most similar. Likewise, discriminant analysis divided the 36 agents into the two most distinct groups based on the nine properties. Together, these two multivariate methods give investigators a means to compare novel drug designs with 36 proven compounds and ascertain which are most analogous in pharmacodynamics. In addition, artificial neural network modeling is demonstrated as an approach to predicting numerous molecular properties of new drug designs, trained on the 36 proven NSAIDs. Comprehensive and effective approaches are presented for the design of new NSAID-type agents, which are important for inhibition of the COX-2 and COX-1 isoenzymes.
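
    A sketch of the descriptive-statistics-plus-clustering workflow reported above, using simulated descriptor values drawn to match the published means and standard deviations (the study itself used measured properties of the 36 compounds):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Simulated descriptor table: 36 compounds x 3 of the nine properties
    rng = np.random.default_rng(0)
    X = np.column_stack([
        rng.normal(3.338, 1.237, 36),     # Log P
        rng.normal(63.176, 20.951, 36),   # polar surface area (A^2)
        rng.normal(292.665, 55.627, 36),  # molecular weight
    ])
    print(X.mean(axis=0).round(3), X.std(axis=0, ddof=1).round(3))

    # Standardize, then partition into four groups as the abstract describes
    Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(Z)
    print(np.bincount(labels))            # cluster sizes
    ```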

  3. Generation of Multiple Metadata Formats from a Geospatial Data Repository

    NASA Astrophysics Data System (ADS)

    Hudspeth, W. B.; Benedict, K. K.; Scott, S.

    2012-12-01

    The Earth Data Analysis Center (EDAC) at the University of New Mexico is partnering with the CYBERShARE and Environmental Health Group from the Center for Environmental Resource Management (CERM), located at the University of Texas, El Paso (UTEP), the Biodiversity Institute at the University of Kansas (KU), and the New Mexico Geo-Epidemiology Research Network (GERN) to provide a technical infrastructure that enables investigation of a variety of climate-driven human/environmental systems. Two significant goals of this NASA-funded project are: a) to increase the use of NASA Earth observational data at EDAC by various modeling communities through enabling better discovery, access, and use of relevant information, and b) to expose these communities to the benefits of provenance for improving understanding and usability of heterogeneous data sources and derived model products. To realize these goals, EDAC has leveraged the core capabilities of its Geographic Storage, Transformation, and Retrieval Engine (Gstore) platform, developed with support of the NSF EPSCoR Program. The Gstore geospatial services platform provides general-purpose web services based upon the REST service model; it supports data discovery, access, and publication functions, metadata delivery, data transformation, and auto-generated OGC services for those data products that can support them. Central to the NASA ACCESS project is the delivery of geospatial metadata in a variety of formats, including ISO 19115-2/19139, FGDC CSDGM, and the Proof Markup Language (PML). This presentation details the extraction and persistence of relevant metadata in the Gstore data store, and their transformation into multiple metadata formats that are increasingly utilized by the geospatial community to document not only core library catalog elements (e.g., title, abstract, publication date, geographic extent, projection information, and database elements), but also the processing steps used to generate derived modeling products. In particular, we discuss the generation and service delivery of provenance, the trace of data sources and analytical methods used in a scientific analysis, for archived data. We discuss the workflows developed by EDAC to capture end-to-end provenance, the storage model for those data in a delivery-format-independent data structure, and the delivery of PML, ISO, and FGDC documents to clients requesting those products.

  4. Deficit irrigation: Arriving at the crop water stress index via gas exchange measurements

    USDA-ARS?s Scientific Manuscript database

    Plant gas exchange provides a highly sensitive measure of the degree of drought stress. Canopy temperature (Tc) provides a much easier to acquire indication of crop water deficit that has been used in irrigation scheduling systems, but interpretation of this measurement has proven difficult. Our goa...

  5. Culture-independent discovery of natural products from soil metagenomes.

    PubMed

    Katz, Micah; Hover, Bradley M; Brady, Sean F

    2016-03-01

    Bacterial natural products have proven to be invaluable starting points in the development of many currently used therapeutic agents. Unfortunately, traditional culture-based methods for natural product discovery have been deemphasized by pharmaceutical companies due in large part to high rediscovery rates. Culture-independent, or "metagenomic," methods, which rely on the heterologous expression of DNA extracted directly from environmental samples (eDNA), have the potential to provide access to metabolites encoded by a large fraction of the earth's microbial biosynthetic diversity. As soil is both ubiquitous and rich in bacterial diversity, it is an appealing starting point for culture-independent natural product discovery efforts. This review provides an overview of the history of soil metagenome-driven natural product discovery studies and elaborates on the recent development of new tools for sequence-based, high-throughput profiling of environmental samples used in discovering novel natural product biosynthetic gene clusters. We conclude with several examples of these new tools being employed to facilitate the recovery of novel secondary metabolite encoding gene clusters from soil metagenomes and the subsequent heterologous expression of these clusters to produce bioactive small molecules.

  6. Deflectometry challenges interferometry: the competition gets tougher!

    NASA Astrophysics Data System (ADS)

    Faber, Christian; Olesch, Evelyn; Krobot, Roman; Häusler, Gerd

    2012-09-01

    Deflectometric methods that are capable of providing full-field topography data for specular freeform surfaces have been around for more than a decade. They have proven successful in various fields of application, such as the measurement of progressive power eyeglasses, painted car body panels, or windshields. However, up to now deflectometry has not been considered a viable competitor to interferometry, especially for the qualification of optical components. The reason is that, despite the unparalleled local sensitivity provided by deflectometric methods, the global height accuracy attainable with this measurement technique used to be limited to several microns over a field of 100 mm. Moreover, spurious reflections at the rear surface of transparent objects could easily mess up the measured signal completely. Due to new calibration and evaluation procedures, this situation has changed lately. We will give a comparative assessment of the strengths and (now partly revised) weaknesses of both measurement principles from the current perspective. By presenting recent developments and measurement examples from different applications, we will show that deflectometry is now heading to become a serious competitor to interferometry.

  7. Control system design and analysis using the INteractive Controls Analysis (INCA) program

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Downing, John P.

    1987-01-01

    The INteractive Controls Analysis (INCA) program was developed at the Goddard Space Flight Center to provide a user-friendly, efficient environment for the design and analysis of linear control systems. Since its inception, INCA has found extensive use in the design, development, and analysis of control systems for spacecraft, instruments, robotics, and pointing systems. Moreover, the results of the analytic tools embedded in INCA have been flight-proven with at least three currently orbiting spacecraft. This paper describes the INCA program and illustrates, using a flight-proven example, how the package can perform complex design analyses with relative ease.

  8. A Multilevel Comprehensive Assessment of International Accreditation for Business Programmes-Based on AMBA Accreditation of GDUFS

    ERIC Educational Resources Information Center

    Jiang, Yong

    2017-01-01

    Traditional mathematical methods built around exactitude have limitations when applied to the processing of educational information, due to the uncertainty and imperfection of such information. Alternative mathematical methods, such as grey system theory, have been widely applied in processing incomplete information systems and have proven effective in a number of…

  9. Stickie removal using neutral enzymatic repulping pressure screening

    Treesearch

    Marguerite Sykes; John Klungness; Roland Gleisner; Said Abubakr

    1998-01-01

    Removal of stickie contaminants is currently a major focus of paper recycling research. Medium consistency alkaline repulping followed by pressure screening has proven to be effective for stickie removal. There is, however, an alternate method that is equally effective and more environmentally benign. This study compares the effectiveness of this alternative method,...

  10. Student Diversity Requires Different Approaches to College Teaching, Even in Math and Science.

    ERIC Educational Resources Information Center

    Nelson, Craig E.

    1996-01-01

    Asserts that traditional teaching methods are unintentionally biased towards the elite and against many non-traditional students. Outlines several easily accessible changes in teaching methods that have fostered dramatic changes in student performance with no change in standards. These approaches have proven effective even in the fields of…

  11. Motivating People To Be Physically Active. Physical Activity Intervention Series.

    ERIC Educational Resources Information Center

    Marcus, Bess H.; Forsyth, LeighAnn H.

    This book describes proven methods for helping people change from inactive to active living. The behavior change methods are useful for healthy adults as well as individuals with chronic physical and psychological conditions. The book describes intervention programs for individuals and groups and for workplace and community settings. Part 1,…

  12. Strange quark contribution to the nucleon

    NASA Astrophysics Data System (ADS)

    Darnell, Dean F.

    The strangeness contribution to the electric and magnetic properties of the nucleon has been under experimental investigation for many years. Lattice Quantum Chromodynamics (LQCD) gives theoretical predictions of these measurements by implementing the continuum gauge theory on a discrete, mathematical Euclidean space-time lattice, which provides a cutoff that removes the ultraviolet divergences. In this dissertation we discuss effective methods using LQCD that lead to a better determination of the strangeness contribution to the nucleon properties. Strangeness calculations are demanding technically and computationally, and sophisticated techniques are required to carry them to completion. In this thesis, new theoretical and computational methods for this calculation, such as twisted mass fermions, perturbative subtraction, and Generalized Minimal Residual (GMRES) techniques, which have proven useful in the determination of these form factors, are investigated. Numerical results of the scalar form factor using these techniques are presented. These results validate these methods for future calculations of the strange quark contribution to the electric and magnetic form factors.
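
    As a minimal illustration of the GMRES technique named above, the sketch below solves a small sparse linear system with SciPy. The shifted 1-D Laplacian is a stand-in: a real lattice Dirac operator is vastly larger and non-Hermitian, but the solver interface is the same.

    ```python
    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import gmres

    # Stand-in for a Dirac-type system M x = b on a 1-D lattice of 200 sites
    n = 200
    M = diags([-1.0, 2.5, -1.0], [-1, 0, 1], shape=(n, n)).tocsr()
    b = np.random.default_rng(0).normal(size=n)

    # Restarted GMRES, a workhorse for large non-symmetric lattice systems
    x, info = gmres(M, b, restart=30, maxiter=1000)
    print(info, np.linalg.norm(M @ x - b))   # info == 0 signals convergence
    ```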

  13. Gradient Dynamics and Entropy Production Maximization

    NASA Astrophysics Data System (ADS)

    Janečka, Adam; Pavelka, Michal

    2018-01-01

    We compare two methods for modeling dissipative processes, namely gradient dynamics and entropy production maximization. Both methods require similar physical inputs: how energy (or entropy) is stored and how it is dissipated. Gradient dynamics describes irreversible evolution by means of a dissipation potential and entropy; it automatically satisfies Onsager reciprocal relations as well as their nonlinear generalization (Maxwell-Onsager relations), and it has a statistical interpretation. Entropy production maximization is based on knowledge of free energy (or another thermodynamic potential) and entropy production. It also leads to the linear Onsager reciprocal relations, and it has proven successful in the thermodynamics of complex materials. Both methods are thermodynamically sound, as they ensure approach to equilibrium; we compare them and discuss their advantages and shortcomings. In particular, conditions under which the two approaches coincide and are capable of providing the same constitutive relations are identified. Finally, a commonly used but seldom mentioned step in entropy production maximization is pinpointed, and the condition of incompressibility is incorporated into gradient dynamics.
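
    For readers wanting the shape of the comparison, here is a compact statement in assumed (not the paper's verbatim) notation: gradient dynamics generates the irreversible evolution of a state x from a dissipation potential Ξ evaluated at the entropy gradient, and the quadratic case recovers the linear Onsager form.

    ```latex
    % Gradient dynamics (assumed notation), with entropy S(x) and
    % dissipation potential \Xi(x, x^{*}):
    \dot{x} \;=\; \left.\frac{\partial \Xi(x,x^{*})}{\partial x^{*}}
                  \right|_{x^{*}=\frac{\partial S}{\partial x}}
    % The quadratic case \Xi = \tfrac{1}{2}\, x^{*}\!\cdot M(x)\, x^{*}
    % recovers the linear Onsager relations:
    \dot{x} \;=\; M(x)\,\frac{\partial S}{\partial x}
    ```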

  14. Speckle noise attenuation in optical coherence tomography by compounding images acquired at different positions of the sample

    NASA Astrophysics Data System (ADS)

    Popescu, Dan P.; Hewko, Mark D.; Sowa, Michael G.

    2007-01-01

    This study demonstrates a simple method for attenuating the speckle noise generated by coherent multiple-scattered photons in optical coherence tomography images. The method could be included among the space-diversity techniques used for speckle reduction. It relies on displacing the sample along a weakly focused beam in the sample arm of the interferometer, acquiring a coherent image for each sample position, and adding the individual images to form a compounded image. It is shown that the compounded image displays a reduction in the speckle noise generated by multiple-scattered photons and an enhancement in the intensity signal produced by single-backscattered photons. To evaluate its potential biomedical applications, the method is used to investigate in vitro a caries lesion affecting the enamel layer of a wisdom tooth. Because of the uncorrelated nature of the speckle noise, the compounded image provides a better mapping of the lesion compared to a single (coherent) image.
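
    The benefit of compounding follows the standard speckle-averaging argument. In assumed notation: if the N images have mutually uncorrelated speckle patterns, the speckle contrast of their average falls as the square root of N, while the single-backscattered signal adds coherently.

    ```latex
    % Speckle contrast C = \sigma_I / \langle I \rangle; averaging N
    % mutually uncorrelated speckle realizations gives
    C_N \;=\; \frac{\sigma_{I,N}}{\langle I \rangle} \;=\; \frac{C_1}{\sqrt{N}}
    ```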

  15. Timeliness “at a glance”: assessing the turnaround time through the six sigma metrics.

    PubMed

    Ialongo, Cristiano; Bernardini, Sergio

    2016-01-01

    Almost thirty years of systematic analysis have proven the turnaround time to be a fundamental dimension for the clinical laboratory. Several indicators are available to assess and report quality with respect to timeliness, but they sometimes lack communicative immediacy and accuracy. Six Sigma is a paradigm developed within the industrial domain for assessing quality and addressing goals and issues. The sigma level computed through the Z-score method is a simple and straightforward tool that reports quality on a universal dimensionless scale and can handle non-normal data. Herein we report our preliminary experience in using the sigma level to assess the change in urgent (STAT) test turnaround time due to the implementation of total automation. We found the Z-score method to be a valuable and easy-to-use tool for assessing and communicating the quality level of laboratory timeliness, providing good correspondence with the actual change in efficiency that was retrospectively observed.
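
    A minimal sketch of the Z-score sigma-level computation on invented turnaround times: the specification limit and the before/after values are assumptions for illustration, and any transformation of non-normal TAT data is omitted for brevity.

    ```python
    import numpy as np

    # Hypothetical STAT turnaround times (minutes) before/after automation
    tat_before = np.array([52, 61, 47, 70, 58, 66, 49, 75, 63, 55], float)
    tat_after  = np.array([38, 41, 35, 44, 39, 47, 36, 42, 40, 37], float)

    USL = 60.0   # upper specification limit agreed with clinicians (assumed)

    def sigma_level(tat, usl):
        """Z-score sigma level: distance of the mean from the limit in SDs."""
        return (usl - tat.mean()) / tat.std(ddof=1)

    print(sigma_level(tat_before, USL), sigma_level(tat_after, USL))
    ```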

  16. Forensic Discrimination of Latent Fingerprints Using Laser-Induced Breakdown Spectroscopy (LIBS) and Chemometric Approaches.

    PubMed

    Yang, Jun-Ho; Yoh, Jack J

    2018-01-01

    A novel technique is reported for separating overlapping latent fingerprints using chemometric approaches that combine laser-induced breakdown spectroscopy (LIBS) and multivariate analysis. The LIBS technique provides the capability of real-time analysis and high-frequency scanning, as well as data on the chemical composition of overlapping latent fingerprints. These spectra offer valuable information for the classification and reconstruction of overlapping latent fingerprints when appropriate statistical multivariate analysis is applied. The current study employs principal component analysis and partial least squares methods for the classification of latent fingerprints from the LIBS spectra. This technique was successfully demonstrated through a classification study of four distinct latent fingerprints using classification methods such as soft independent modeling of class analogy (SIMCA) and partial least squares discriminant analysis (PLS-DA). The novel method yielded an accuracy of more than 85% and was proven to be sufficiently robust. Furthermore, through laser scanning analysis at a spatial interval of 125 µm, the overlapping fingerprints were reconstructed as separate two-dimensional forms.
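
    A toy sketch of the chemometric pipeline named above, PCA followed by PLS-DA, on simulated "spectra". Fitting sklearn's PLSRegression against one-hot class labels is a common way to realize PLS-DA; all data here are invented.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cross_decomposition import PLSRegression

    # Simulated LIBS spectra: rows = laser shots, columns = spectral channels
    rng = np.random.default_rng(0)
    n_per_class, n_channels = 30, 200
    X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_class, n_channels))
                   for i in range(4)])          # four donors' fingerprints
    y = np.repeat(np.arange(4), n_per_class)

    scores = PCA(n_components=2).fit_transform(X)   # exploratory projection

    # PLS-DA: PLS regression against one-hot class labels, argmax to classify
    pls = PLSRegression(n_components=3).fit(X, np.eye(4)[y])
    pred = pls.predict(X).argmax(axis=1)
    print((pred == y).mean())                   # training accuracy (toy data)
    ```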

  17. Control of Origin of Sesame Oil from Various Countries by Stable Isotope Analysis and DNA Based Markers—A Pilot Study

    PubMed Central

    Horacek, Micha; Hansel-Hohl, Karin; Burg, Kornel; Soja, Gerhard; Okello-Anyanga, Walter; Fluch, Silvia

    2015-01-01

    The indication of origin of sesame seeds and sesame oil is one of the important factors influencing their price, as sesame is produced in many regions worldwide and certain provenances are especially sought after. We joined stable carbon and hydrogen isotope analysis with DNA-based molecular marker analysis to study their combined potential for discriminating among different origins of sesame seeds. For the stable carbon and hydrogen isotope data, a positive correlation between the two isotope parameters was observed, indicating a dominant combined influence of climate and water availability. This enabled discrimination between sesame samples from tropical and subtropical/moderate climatic provenances. Carbon isotope values also showed differences between oil from black and white sesame seeds from identical locations, indicating higher water-use efficiency of plants producing black seeds. DNA-based markers gave independent evidence for geographic variation as well as provided information on the genetic relatedness of the investigated samples. Depending on the differences in ambient environmental conditions and in the genotypic fingerprint, a combination of both analytical methods is a very powerful tool for assessing the declared geographic origin. To our knowledge this is the first paper on food authenticity combining the stable isotope analysis of bio-elements with DNA-based markers and their combined statistical analysis. PMID:25831054

  18. Control of origin of sesame oil from various countries by stable isotope analysis and DNA based markers--a pilot study.

    PubMed

    Horacek, Micha; Hansel-Hohl, Karin; Burg, Kornel; Soja, Gerhard; Okello-Anyanga, Walter; Fluch, Silvia

    2015-01-01

    The indication of origin of sesame seeds and sesame oil is one of the important factors influencing their price, as sesame is produced in many regions worldwide and certain provenances are especially sought after. We joined stable carbon and hydrogen isotope analysis with DNA-based molecular marker analysis to study their combined potential for discriminating among different origins of sesame seeds. For the stable carbon and hydrogen isotope data, a positive correlation between the two isotope parameters was observed, indicating a dominant combined influence of climate and water availability. This enabled discrimination between sesame samples from tropical and subtropical/moderate climatic provenances. Carbon isotope values also showed differences between oil from black and white sesame seeds from identical locations, indicating higher water-use efficiency of plants producing black seeds. DNA-based markers gave independent evidence for geographic variation as well as provided information on the genetic relatedness of the investigated samples. Depending on the differences in ambient environmental conditions and in the genotypic fingerprint, a combination of both analytical methods is a very powerful tool for assessing the declared geographic origin. To our knowledge this is the first paper on food authenticity combining the stable isotope analysis of bio-elements with DNA-based markers and their combined statistical analysis.

  19. Informativeness of Diagnostic Marker Values and the Impact of Data Grouping.

    PubMed

    Ma, Hua; Bandos, Andriy I; Gur, David

    2018-01-01

    Assessing the performance of diagnostic markers is a necessary step for their use in decision making regarding various conditions of interest in diagnostic medicine and other fields. Globally useful markers could, however, have ranges of values that are "diagnostically non-informative". This paper demonstrates that the presence of marker values from diagnostically non-informative ranges could lead to a loss in statistical efficiency during nonparametric evaluation and shows that grouping non-informative values provides a natural resolution to this problem. These points are proven theoretically, and an extensive simulation study is conducted to illustrate the possible benefits of using grouped marker values in a number of practically reasonable scenarios. The results contradict the common conjecture regarding the detrimental effect of grouped marker values during performance assessments. Specifically, contrary to the common assumption that grouped marker values lead to bias, grouping non-informative values does not introduce bias and could substantially reduce sampling variability. The proven concept that grouped marker values could be statistically beneficial without detrimental consequences implies that, in practice, tied values do not always require resolution, whereas the use of continuous diagnostic results without addressing diagnostically non-informative ranges could be statistically detrimental. Based on these findings, more efficient methods for evaluating diagnostic markers could be developed.
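
    To see why grouped (tied) values pose no problem for nonparametric evaluation, here is a sketch of the Mann-Whitney AUC computed via midranks, with a toy "non-informative" low range collapsed to one tied value; the data and the 0.5 cutoff are invented.

    ```python
    import numpy as np
    from scipy.stats import rankdata

    def auc_mann_whitney(neg, pos):
        """Nonparametric AUC via midranks; ties contribute 1/2 automatically."""
        ranks = rankdata(np.concatenate([neg, pos]))   # midranks resolve ties
        n_neg, n_pos = len(neg), len(pos)
        return (ranks[n_neg:].sum() - n_pos * (n_pos + 1) / 2) / (n_neg * n_pos)

    neg = np.array([0.10, 0.20, 0.30, 0.80, 1.10])   # condition-negative
    pos = np.array([0.15, 0.90, 1.40, 1.70, 2.00])   # condition-positive

    group = lambda x: np.where(x < 0.5, 0.0, x)      # collapse the flat range
    print(auc_mann_whitney(neg, pos),
          auc_mann_whitney(group(neg), group(pos)))
    ```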

  20. A nuclear method to authenticate Buddha images

    NASA Astrophysics Data System (ADS)

    Khaweerat, S.; Ratanatongchai, W.; Channuie, J.; Wonglee, S.; Picha, R.; Promping, J.; Silva, K.; Liamsuwan, T.

    2015-05-01

    The value of Buddha images in Thailand varies dramatically depending on authentication and provenance. In general, people use their individual skills to make the justification, which frequently leads to obscurity, deception, and illegal activities. Here, we propose two non-destructive techniques, neutron radiography (NR) and neutron activation autoradiography (NAAR), to reveal the structural and elemental profiles, respectively, of small Buddha images. For NR, a thermal neutron flux of 10⁵ n cm⁻² s⁻¹ was applied. NAAR needed a higher neutron flux of 10¹² n cm⁻² s⁻¹ to activate the samples. Results from NR and NAAR revealed unique characteristics of the samples. Similarity of the profiles played a key role in the classification of the samples. The results provided visual evidence to enhance the reliability of authentication. The method can be further developed for routine practice, which would impact thousands of customers in Thailand.

  1. Novel Intrinsic Ignition Method Measuring Local-Global Integration Characterizes Wakefulness and Deep Sleep

    PubMed Central

    Tagliazucchi, Enzo; Sanjuán, Ana

    2017-01-01

    A precise definition of a brain state has proven elusive. Here, we introduce the novel local-global concept of intrinsic ignition, characterizing the dynamical complexity of different brain states. Naturally occurring intrinsic ignition events reflect the capability of a given brain area to propagate neuronal activity to other regions, giving rise to different levels of integration. The ignitory capability of brain regions is computed as the level of integration elicited by each intrinsic ignition event in each brain region, averaged over all events. This intrinsic ignition method is shown to clearly distinguish human neuroimaging data of two fundamental brain states (wakefulness and deep sleep). Importantly, whole-brain computational modelling of these data shows that the optimal working point is found where there is maximal variability of the intrinsic ignition across brain regions. Thus, combining whole-brain models with intrinsic ignition can provide novel insights into the mechanisms underlying brain states. PMID:28966977
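
    A heavily simplified sketch of the intrinsic-ignition computation as described above. The event definition (upward crossings of z-scored signals) and the integration proxy (fraction of co-active regions at the event frame) are assumptions of this toy version; the published method measures integration via the largest component of a thresholded connectivity matrix.

    ```python
    import numpy as np

    def intrinsic_ignition(ts, z_thr=1.0):
        """Per-region ignition: mean breadth of co-activation elicited
        across that region's threshold-crossing events (toy proxy)."""
        z = (ts - ts.mean(axis=0)) / ts.std(axis=0)
        active = z > z_thr                         # time x regions, boolean
        events = active[1:] & ~active[:-1]         # upward crossings
        ignition = np.zeros(ts.shape[1])
        counts = np.zeros(ts.shape[1])
        for t, r in zip(*np.nonzero(events)):
            ignition[r] += active[t + 1].mean()    # co-active fraction
            counts[r] += 1
        return ignition / np.maximum(counts, 1)

    # Toy usage: 500 time points, 20 regions of surrogate data
    ts = np.random.default_rng(0).normal(size=(500, 20))
    print(intrinsic_ignition(ts).round(2))
    ```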

  2. [Applications of MALDI-TOF-MS in clinical microbiology laboratory].

    PubMed

    Carbonnelle, Etienne; Nassif, Xavier

    2011-10-01

    Over the past twenty years, mass spectrometry (MS) has emerged as a particularly powerful tool for the analysis and characterization of proteins in research. Only recently has this technology, especially MALDI-TOF-MS (Matrix-Assisted Laser Desorption Ionization Time-Of-Flight), entered the field of routine microbiology. The method has proven reliable and safe for the identification of bacteria, yeasts, filamentous fungi, and dermatophytes. MALDI-TOF-MS is a rapid, precise, and cost-effective identification method compared with conventional phenotypic techniques or molecular biology. Its ability to analyse whole microorganisms with minimal sample preparation has greatly reduced the time to identification (1-2 min). Furthermore, this technology can be used to identify bacteria directly from clinical samples such as blood culture bottles or urine. Future applications will be developed to provide direct information concerning virulence or resistance protein markers. © 2011 médecine/sciences – Inserm / SRMS.

  3. A user-defined data type for the storage of time series data allowing efficient similarity screening.

    PubMed

    Sorokin, Anatoly; Selkov, Gene; Goryanin, Igor

    2012-07-16

    The volume of the experimentally measured time series data is rapidly growing, while storage solutions offering better data types than simple arrays of numbers or opaque blobs for keeping series data are sorely lacking. A number of indexing methods have been proposed to provide efficient access to time series data, but none has so far been integrated into a tried-and-proven database system. To explore the possibility of such integration, we have developed a data type for time series storage in PostgreSQL, an object-relational database system, and equipped it with an access method based on SAX (Symbolic Aggregate approXimation). This new data type has been successfully tested in a database supporting a large-scale plant gene expression experiment, and it was additionally tested on a very large set of simulated time series data. Copyright © 2011 Elsevier B.V. All rights reserved.
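
    A compact sketch of the SAX transform behind the access method: z-normalize, reduce by piecewise aggregate approximation, then map segment means to letters via equiprobable Gaussian breakpoints. This illustrates the indexing idea only, not the authors' PostgreSQL data type.

    ```python
    import numpy as np
    from scipy.stats import norm

    def sax(series, n_segments=8, alphabet_size=4):
        """Symbolic Aggregate approXimation of a numeric time series."""
        x = np.asarray(series, dtype=float)
        x = (x - x.mean()) / x.std()                       # z-normalize
        # Piecewise Aggregate Approximation: mean of equal-width segments
        x = x[: len(x) // n_segments * n_segments]
        paa = x.reshape(n_segments, -1).mean(axis=1)
        # Breakpoints cutting N(0,1) into equiprobable regions
        cuts = norm.ppf(np.linspace(0, 1, alphabet_size + 1)[1:-1])
        return "".join(chr(ord("a") + s) for s in np.searchsorted(cuts, paa))

    # Similar series map to similar words, enabling cheap pre-screening
    print(sax(np.sin(np.linspace(0, 4 * np.pi, 128))))
    ```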

  4. Fragment Screening and HIV Therapeutics

    PubMed Central

    Bauman, Joseph D.; Patel, Disha; Arnold, Eddy

    2013-01-01

    Fragment screening has proven to be a powerful alternative to traditional methods for drug discovery. Biophysical methods, such as X-ray crystallography, NMR spectroscopy, and surface plasmon resonance, are used to screen a diverse library of small molecule compounds. Although compounds identified via this approach have relatively weak affinity, they provide a good platform for lead development and are highly efficient binders with respect to their size. Fragment screening has been utilized for a wide range of targets, including HIV-1 proteins. Here, we review the fragment screening studies targeting HIV-1 proteins using X-ray crystallography or surface plasmon resonance. These studies have successfully detected binding of novel fragments to either previously established or new sites on HIV-1 protease and reverse transcriptase. In addition, fragment screening against HIV-1 reverse transcriptase has been used as a tool to better understand the complex nature of ligand binding to a flexible target. PMID:21972022

  5. Boron Nitride Nanotube: Synthesis and Applications

    NASA Technical Reports Server (NTRS)

    Tiano, Amanda L.; Park, Cheol; Lee, Joseph W.; Luong, Hoa H.; Gibbons, Luke J.; Chu, Sang-Hyon; Applin, Samantha I.; Gnoffo, Peter; Lowther, Sharon; Kim, Hyun Jung

    2014-01-01

    Since the advent of carbon nanotubes (CNTs) in 1991, scientists have predicted that carbon's immediate neighbors on the periodic chart, boron and nitrogen, may also form perfect nanotubes. First proposed and then synthesized by researchers at UC Berkeley in the mid-1990s, the boron nitride nanotube (BNNT) has proven very difficult to make until now. Herein we provide an update on a catalyst-free method for synthesizing highly crystalline, small-diameter BNNTs with a high aspect ratio using a high-power laser under a high-pressure, high-temperature environment, first discovered jointly by NASA/NIA/JSA. Progress in purification methods, dispersion studies, BNNT mat and composite formation, and modeling and diagnostics will also be presented. The white BNNTs offer extraordinary properties including neutron radiation shielding, piezoelectricity, thermal oxidative stability (> 800 °C in air), mechanical strength, and toughness. The characteristics of the novel BNNTs and BNNT polymer composites and their potential applications are discussed.

  6. Investigating Endogenous Peptides and Peptidases using Peptidomics

    PubMed Central

    Tinoco, Arthur D.; Saghatelian, Alan

    2012-01-01

    Rather than simply being protein degradation products, peptides have proven to be important bioactive molecules. Bioactive peptides act as hormones, neurotransmitters, and antimicrobial agents in vivo. The dysregulation of bioactive peptide signaling is also known to be involved in disease, and targeting peptide hormone pathways has been a successful strategy in the development of novel therapeutics. The importance of bioactive peptides in biology has spurred research to elucidate the function and regulation of these molecules. Classical methods for peptide analysis have relied on targeted immunoassays, but certain scientific questions necessitated a broader and more detailed view of the peptidome: all the peptides in a cell, tissue, or organism. In this review we discuss how peptidomics has emerged to fill this need through the application of advanced liquid chromatography-tandem mass spectrometry (LC-MS/MS) methods that provide unique insights into peptide activity and regulation. PMID:21786763

  7. Integrating heterogeneous databases in clustered medic care environments using object-oriented technology

    NASA Astrophysics Data System (ADS)

    Thakore, Arun K.; Sauer, Frank

    1994-05-01

    The organization of modern medical care environments into disease-related clusters, such as a cancer center, a diabetes clinic, etc., has the side effect of introducing multiple heterogeneous databases, often containing similar information, within the same organization. This heterogeneity fosters incompatibility and prevents the effective sharing of data amongst applications at different sites. Although integration of heterogeneous databases is now feasible, in the medical arena this is often an ad hoc process, not founded on proven database technology or formal methods. In this paper we illustrate the use of a high-level object-oriented semantic association method to model information found in different databases into an integrated conceptual global model. We provide examples from the medical domain to illustrate an integration approach resulting in a consistent global view, without attacking the autonomy of the underlying databases.
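
    A toy sketch of the semantic-association idea: two hypothetical local schemas are mapped onto one integrated conceptual class, so applications at different sites can query a single global view. All names are invented, and the authors' formal method is considerably richer than this illustration.

    ```python
    from dataclasses import dataclass

    @dataclass
    class OncologyRecord:        # local schema of a cancer-center database
        patient_no: str
        tumor_stage: str

    @dataclass
    class DiabetesRecord:        # local schema of a diabetes-clinic database
        pid: str
        hba1c: float

    @dataclass
    class GlobalPatient:         # integrated conceptual global model
        patient_id: str
        conditions: dict

    def to_global(rec) -> GlobalPatient:
        """Semantic association: map heterogeneous local attributes onto
        shared global concepts (patient identity, condition data)."""
        if isinstance(rec, OncologyRecord):
            return GlobalPatient(rec.patient_no, {"tumor_stage": rec.tumor_stage})
        if isinstance(rec, DiabetesRecord):
            return GlobalPatient(rec.pid, {"hba1c": rec.hba1c})
        raise TypeError("unmapped local schema")

    print(to_global(OncologyRecord("P-17", "II")))
    ```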

  8. Boron nitride nanotube: synthesis and applications

    NASA Astrophysics Data System (ADS)

    Tiano, Amanda L.; Park, Cheol; Lee, Joseph W.; Luong, Hoa H.; Gibbons, Luke J.; Chu, Sang-Hyon; Applin, Samantha; Gnoffo, Peter; Lowther, Sharon; Kim, Hyun Jung; Danehy, Paul M.; Inman, Jennifer A.; Jones, Stephen B.; Kang, Jin Ho; Sauti, Godfrey; Thibeault, Sheila A.; Yamakov, Vesselin; Wise, Kristopher E.; Su, Ji; Fay, Catharine C.

    2014-04-01

    Since the advent of carbon nanotubes (CNTs) in 1991, scientists have predicted that carbon's immediate neighbors on the periodic chart, boron and nitrogen, may also form perfect nanotubes. First proposed and then synthesized by researchers at UC Berkeley in the mid-1990s, the boron nitride nanotube (BNNT) has proven very difficult to make until now. Herein we provide an update on a catalyst-free method for synthesizing highly crystalline, small-diameter BNNTs with a high aspect ratio using a high-power laser under a high-pressure, high-temperature environment, first discovered jointly by NASA/NIA/JSA. Progress in purification methods, dispersion studies, BNNT mat and composite formation, and modeling and diagnostics will also be presented. The white BNNTs offer extraordinary properties including neutron radiation shielding, piezoelectricity, thermal oxidative stability (> 800 °C in air), mechanical strength, and toughness. The characteristics of the novel BNNTs and BNNT polymer composites and their potential applications are discussed.

  9. Novel Intrinsic Ignition Method Measuring Local-Global Integration Characterizes Wakefulness and Deep Sleep.

    PubMed

    Deco, Gustavo; Tagliazucchi, Enzo; Laufs, Helmut; Sanjuán, Ana; Kringelbach, Morten L

    2017-01-01

    A precise definition of a brain state has proven elusive. Here, we introduce the novel local-global concept of intrinsic ignition, characterizing the dynamical complexity of different brain states. Naturally occurring intrinsic ignition events reflect the capability of a given brain area to propagate neuronal activity to other regions, giving rise to different levels of integration. The ignitory capability of a brain region is computed as the level of integration elicited by each intrinsic ignition event in that region, averaged over all events. This intrinsic ignition method is shown to clearly distinguish human neuroimaging data of two fundamental brain states (wakefulness and deep sleep). Importantly, whole-brain computational modelling of these data shows that the optimal working point is found where there is maximal variability of the intrinsic ignition across brain regions. Thus, combining whole-brain models with intrinsic ignition can provide novel insights into the underlying mechanisms of brain states.
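
    For intuition, here is a toy sketch of the ignition computation under simplifying assumptions of our own (events as upward threshold crossings of z-scored signals, and the integration elicited by an event proxied by the fraction of regions co-active in a short following window; the threshold and window length are illustrative, not the paper's):

```python
import numpy as np

def intrinsic_ignition(ts: np.ndarray, thr: float = 1.0, window: int = 4) -> np.ndarray:
    """ts: (time, regions) array. Returns one mean ignition value per region."""
    z = (ts - ts.mean(0)) / ts.std(0)            # z-score each region
    events = (z[1:] > thr) & (z[:-1] <= thr)     # upward threshold crossings
    n_t, n_r = events.shape
    ignition = np.zeros(n_r)
    for r in range(n_r):
        levels = []
        for t in np.flatnonzero(events[:, r]):
            win = z[t + 1:t + 1 + window]              # activity after the event
            levels.append((win > thr).any(0).mean())   # fraction of co-active regions
        ignition[r] = np.mean(levels) if levels else 0.0
    return ignition

rng = np.random.default_rng(0)
print(intrinsic_ignition(rng.standard_normal((500, 10))))
```

    The modelling observation quoted above would then correspond to the variability of the returned vector across regions (e.g. its standard deviation).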

  10. Between land and sea: divergent data stewardship practices in deep-sea biosphere research

    NASA Astrophysics Data System (ADS)

    Cummings, R.; Darch, P.

    2013-12-01

    Data in deep-sea biosphere research often live a double life. While the original data generated on IODP expeditions are highly structured, professionally curated, and widely shared, the downstream data practices of deep-sea biosphere laboratories are far more localized and ad hoc. These divergent data practices make it difficult to track the provenance of datasets from the cruise ships to the laboratory or to integrate IODP data with laboratory data. An in-depth study of the divergent data practices in deep-sea biosphere research allows us to: - Better understand the social and technical forces that shape data stewardship throughout the data lifecycle; - Develop policy, infrastructure, and best practices to improve data stewardship in small labs; - Track provenance of datasets from IODP cruises to labs and publications; - Create linkages between laboratory findings, cruise data, and IODP samples. In this paper, we present findings from the first year of a case study of the Center for Dark Energy Biosphere Investigations (C-DEBI), an NSF Science and Technology Center that studies life beneath the seafloor. Our methods include observation in laboratories, interviews, document analysis, and participation in scientific meetings. Our research uncovers the data stewardship norms of geologists, biologists, chemists, and hydrologists conducting multi-disciplinary research. Our research team found that data stewardship on cruises is a clearly defined task performed by an IODP curator, while downstream it is a distributed task that develops in response to local need and to the extent necessary for the immediate research team. IODP data are expensive to collect and challenging to obtain, often costing $50,000/day and requiring researchers to work twelve hours a day onboard the ships. To maximize this research investment, a highly trained IODP data curator controls data stewardship on the cruise and applies best practices such as standardized formats, proper labeling, and centralized storage. In the laboratory, a scientist is his or her own curator. In contrast to the IODP research parties, laboratory research teams analyze diverse datasets, share them internally, implement ad hoc data management practices, optimize methods for their specific research questions, and release data on request through personal transactions. We discovered that while these workflows help small research teams retain flexibility and local control - crucial in exploratory deep-sea biosphere research - they also hinder data interoperability, discoverability, and consistency of methods from one research team to the next. Additional consequences of this contrast between IODP and lab practices are that it is difficult to track the provenance of data and to create linkages between laboratory findings, cruise data, and archived IODP samples. The ability to track provenance would add value to datasets and provide a clearer picture of the decisions made throughout the data lifecycle. Better linkages between the original data, laboratory data, and samples would allow secondary researchers to locate IODP data that may be useful to their research after laboratory findings are published. Our case study is funded by the Sloan Foundation and NSF.

  11. MEMBRANES FOR DRINKING WATER TREATMENT

    EPA Science Inventory

    Various treatment technologies, such as precursor removal and the use of alternative disinfectants, have proven effective in controlling halogenated disinfection by-products. One of the most promising methods for halogenated by-product control includes removal of precursors before ...

  12. The contribution of cluster and discriminant analysis to the classification of complex aquifer systems.

    PubMed

    Panagopoulos, G P; Angelopoulou, D; Tzirtzilakis, E E; Giannoulopoulos, P

    2016-10-01

    This paper presents an innovative method for the discrimination of groundwater samples into common groups representing the hydrogeological units from which they have been pumped. The method proved very efficient even in areas with complex hydrogeological regimes. The proposed method requires chemical analyses of water samples only for major ions, meaning that it is applicable to most cases worldwide. Another benefit of the method is that it gives further insight into the aquifer's hydrogeochemistry, as it identifies the ions that are responsible for the discrimination of each group. The procedure begins with cluster analysis of the dataset in order to classify the samples into the corresponding hydrogeological units. The feasibility of the method is demonstrated by the fact that the samples of volcanic origin were separated into two different clusters, namely the lava units and the pyroclastic-ignimbritic aquifer. The second step is discriminant analysis of the data, which provides the functions that distinguish the groups from each other and the most significant variables that define the hydrochemical composition of the aquifer. The whole procedure was highly successful, as 94.7 % of the samples were classified to the correct aquifer system. Finally, the resulting functions can safely be used to categorize samples of either unknown or doubtful origin, thus improving the quality and the size of existing hydrochemical databases.
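
    As a hedged sketch of this two-step recipe (hierarchical clustering, then discriminant analysis) using standard scientific-Python tools on a fabricated major-ion matrix:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Fabricated data: rows = water samples, columns = major ions
# (e.g. Ca, Mg, Na, K, HCO3, SO4, Cl)
X = StandardScaler().fit_transform(rng.gamma(2.0, 1.0, size=(60, 7)))

# Step 1: hierarchical cluster analysis assigns samples to aquifer groups
groups = fcluster(linkage(X, method="ward"), t=3, criterion="maxclust")

# Step 2: discriminant analysis yields functions separating the groups
lda = LinearDiscriminantAnalysis().fit(X, groups)
print("resubstitution accuracy:", lda.score(X, groups))  # cf. the 94.7 % reported
print("discriminant coefficients:\n", lda.coef_)         # ions driving separation
```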

  13. Tempered fractional calculus

    NASA Astrophysics Data System (ADS)

    Sabzikar, Farzad; Meerschaert, Mark M.; Chen, Jinghua

    2015-07-01

    Fractional derivatives and integrals are convolutions with a power law. Multiplying by an exponential factor leads to tempered fractional derivatives and integrals. Tempered fractional diffusion equations, where the usual second derivative in space is replaced by a tempered fractional derivative, govern the limits of random walk models with an exponentially tempered power law jump distribution. The limiting tempered stable probability densities exhibit semi-heavy tails, which are commonly observed in finance. Tempered power law waiting times lead to tempered fractional time derivatives, which have proven useful in geophysics. The tempered fractional derivative or integral of a Brownian motion, called a tempered fractional Brownian motion, can exhibit semi-long range dependence. The increments of this process, called tempered fractional Gaussian noise, provide a useful new stochastic model for wind speed data. A tempered fractional difference forms the basis for numerical methods to solve tempered fractional diffusion equations, and it also provides a useful new correlation model in time series.
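
    For readers who want the central object in symbols, the tempered fractional integral underlying these constructions can be written as follows (our notation; the papers' sign and normalization conventions may differ):

```latex
% Tempered fractional integral of order \alpha > 0 with tempering parameter \lambda > 0:
\mathbb{I}^{\alpha,\lambda} f(t)
  = \frac{1}{\Gamma(\alpha)} \int_{-\infty}^{t} (t-u)^{\alpha-1}\, e^{-\lambda (t-u)}\, f(u)\,\mathrm{d}u .
% Setting \lambda = 0 recovers the classical fractional integral; the associated
% tempered fractional derivative replaces the second space derivative in the
% tempered fractional diffusion equations described in the abstract.
```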

  14. TEMPERED FRACTIONAL CALCULUS.

    PubMed

    Meerschaert, Mark M; Sabzikar, Farzad; Chen, Jinghua

    2015-07-15

    Fractional derivatives and integrals are convolutions with a power law. Multiplying by an exponential factor leads to tempered fractional derivatives and integrals. Tempered fractional diffusion equations, where the usual second derivative in space is replaced by a tempered fractional derivative, govern the limits of random walk models with an exponentially tempered power law jump distribution. The limiting tempered stable probability densities exhibit semi-heavy tails, which are commonly observed in finance. Tempered power law waiting times lead to tempered fractional time derivatives, which have proven useful in geophysics. The tempered fractional derivative or integral of a Brownian motion, called a tempered fractional Brownian motion, can exhibit semi-long range dependence. The increments of this process, called tempered fractional Gaussian noise, provide a useful new stochastic model for wind speed data. A tempered difference forms the basis for numerical methods to solve tempered fractional diffusion equations, and it also provides a useful new correlation model in time series.

  15. TEMPERED FRACTIONAL CALCULUS

    PubMed Central

    MEERSCHAERT, MARK M.; SABZIKAR, FARZAD; CHEN, JINGHUA

    2014-01-01

    Fractional derivatives and integrals are convolutions with a power law. Multiplying by an exponential factor leads to tempered fractional derivatives and integrals. Tempered fractional diffusion equations, where the usual second derivative in space is replaced by a tempered fractional derivative, govern the limits of random walk models with an exponentially tempered power law jump distribution. The limiting tempered stable probability densities exhibit semi-heavy tails, which are commonly observed in finance. Tempered power law waiting times lead to tempered fractional time derivatives, which have proven useful in geophysics. The tempered fractional derivative or integral of a Brownian motion, called a tempered fractional Brownian motion, can exhibit semi-long range dependence. The increments of this process, called tempered fractional Gaussian noise, provide a useful new stochastic model for wind speed data. A tempered difference forms the basis for numerical methods to solve tempered fractional diffusion equations, and it also provides a useful new correlation model in time series. PMID:26085690

  16. Tempered fractional calculus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sabzikar, Farzad, E-mail: sabzika2@stt.msu.edu; Meerschaert, Mark M., E-mail: mcubed@stt.msu.edu; Chen, Jinghua, E-mail: cjhdzdz@163.com

    2015-07-15

    Fractional derivatives and integrals are convolutions with a power law. Multiplying by an exponential factor leads to tempered fractional derivatives and integrals. Tempered fractional diffusion equations, where the usual second derivative in space is replaced by a tempered fractional derivative, govern the limits of random walk models with an exponentially tempered power law jump distribution. The limiting tempered stable probability densities exhibit semi-heavy tails, which are commonly observed in finance. Tempered power law waiting times lead to tempered fractional time derivatives, which have proven useful in geophysics. The tempered fractional derivative or integral of a Brownian motion, called a tempered fractional Brownian motion, can exhibit semi-long range dependence. The increments of this process, called tempered fractional Gaussian noise, provide a useful new stochastic model for wind speed data. A tempered fractional difference forms the basis for numerical methods to solve tempered fractional diffusion equations, and it also provides a useful new correlation model in time series.

  17. Relatedness in spatially structured populations with empty sites: An approach based on spatial moment equations.

    PubMed

    Lion, Sébastien

    2009-09-07

    Taking into account the interplay between spatial ecological dynamics and selection is a major challenge in evolutionary ecology. Although inclusive fitness theory has proven to be a very useful tool to unravel the interactions between spatial genetic structuring and selection, applications of the theory usually rely on simplifying demographic assumptions. In this paper, I attempt to bridge the gap between spatial demographic models and kin selection models by providing a method to compute approximations for relatedness coefficients in a spatial model with empty sites. Using spatial moment equations, I provide an approximation of nearest-neighbour relatedness on random regular networks, and show that this approximation performs much better than the ordinary pair approximation. I discuss the connection between the relatedness coefficients I define and those used in population genetics, and sketch some potential extensions of the theory.

  18. Visual affective classification by combining visual and text features.

    PubMed

    Liu, Ningning; Wang, Kai; Jin, Xin; Gao, Boyang; Dellandréa, Emmanuel; Chen, Liming

    2017-01-01

    Affective analysis of images in social networks has drawn much attention, and the texts surrounding images have proven to provide valuable semantic meanings about image content, which can hardly be represented by low-level visual features. In this paper, we propose a novel approach for the visual affective classification (VAC) task. This approach combines visual representations with novel text features through a fusion scheme based on Dempster-Shafer (D-S) Evidence Theory. Specifically, we not only investigate different types of visual features and fusion methods for VAC, but also propose textual features to effectively capture emotional semantics from the short text associated with images based on word similarity. Experiments are conducted on three publicly available databases: the International Affective Picture System (IAPS), the Artistic Photos and the MirFlickr Affect set. The results demonstrate that the proposed approach combining visual and textual features provides promising results for the VAC task.
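
    The fusion step rests on Dempster's rule of combination. A minimal self-contained sketch (the frame of discernment and the mass values below are our own toy example, not the paper's learned features):

```python
from itertools import product

def dempster(m1: dict, m2: dict) -> dict:
    """Combine two mass functions whose keys are frozensets of hypotheses."""
    combined, conflict = {}, 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y            # mass falling on the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

POS, NEG = frozenset({"positive"}), frozenset({"negative"})
BOTH = POS | NEG                          # total ignorance
visual = {POS: 0.6, NEG: 0.1, BOTH: 0.3}  # evidence from visual features
text = {POS: 0.5, NEG: 0.2, BOTH: 0.3}    # evidence from text features
print(dempster(visual, text))
```

    Mass assigned to conflicting hypotheses is discarded and the remainder renormalized, which is what lets the visual and textual sources reinforce or temper each other.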

  19. Visual affective classification by combining visual and text features

    PubMed Central

    Liu, Ningning; Wang, Kai; Jin, Xin; Gao, Boyang; Dellandréa, Emmanuel; Chen, Liming

    2017-01-01

    Affective analysis of images in social networks has drawn much attention, and the texts surrounding images have proven to provide valuable semantic meanings about image content, which can hardly be represented by low-level visual features. In this paper, we propose a novel approach for the visual affective classification (VAC) task. This approach combines visual representations with novel text features through a fusion scheme based on Dempster-Shafer (D-S) Evidence Theory. Specifically, we not only investigate different types of visual features and fusion methods for VAC, but also propose textual features to effectively capture emotional semantics from the short text associated with images based on word similarity. Experiments are conducted on three publicly available databases: the International Affective Picture System (IAPS), the Artistic Photos and the MirFlickr Affect set. The results demonstrate that the proposed approach combining visual and textual features provides promising results for the VAC task. PMID:28850566

  20. Hybrid DFP-CG method for solving unconstrained optimization problems

    NASA Astrophysics Data System (ADS)

    Osman, Wan Farah Hanan Wan; Asrul Hery Ibrahim, Mohd; Mamat, Mustafa

    2017-09-01

    The conjugate gradient (CG) method and the quasi-Newton method are both well-known methods for solving unconstrained optimization problems. In this paper, we propose a new method that combines the search directions of the conjugate gradient method and the quasi-Newton method, based on the BFGS-CG method developed by Ibrahim et al. The Davidon-Fletcher-Powell (DFP) update formula is used as the Hessian approximation for this new hybrid algorithm. Numerical results showed that the new algorithm performs better than the ordinary DFP method and is proven to possess both sufficient descent and global convergence properties.
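
    For context, the DFP formula named here is standardly stated as an update of an approximation H_k to the inverse Hessian; the hybrid CG coupling itself is the authors' contribution and is not reproduced here. With s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, the update reads:

```latex
% DFP update of the inverse-Hessian approximation H_k:
H_{k+1} = H_k
        + \frac{s_k s_k^{\mathsf{T}}}{s_k^{\mathsf{T}} y_k}
        - \frac{H_k y_k y_k^{\mathsf{T}} H_k}{y_k^{\mathsf{T}} H_k y_k},
\qquad s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k .
```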

  1. The long and winding road

    NASA Astrophysics Data System (ADS)

    Pomeau, Yves

    2016-03-01

    For a problem as complex as turbulence, combining universal concepts from statistical physics with ideas from fluid mechanics has proven indispensable. Three decades since this link was formed, it is still providing food for new thought.

  2. Intelligent transportation systems benefits, costs, and lessons learned : 2014 update report.

    DOT National Transportation Integrated Search

    2014-06-01

    Intelligent transportation systems (ITS) provide a proven set of strategies for advancing transportation safety, mobility, and environmental sustainability by integrating communication and information technology applications into the management and o...

  3. Kernel Partial Least Squares for Nonlinear Regression and Discrimination

    NASA Technical Reports Server (NTRS)

    Rosipal, Roman; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper summarizes recent results on applying the method of partial least squares (PLS) in a reproducing kernel Hilbert space (RKHS). A previously proposed kernel PLS regression model was proven to be competitive with other regularized regression methods in RKHS. The family of nonlinear kernel-based PLS models is extended by considering the kernel PLS method for discrimination. Theoretical and experimental results on a two-class discrimination problem indicate the usefulness of the method.
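
    A hedged sketch of the kernel-PLS idea: one common variant performs PLS regression on the centered Gram matrix, shown below with scikit-learn. This illustrates the construction rather than reproducing the paper's exact algorithm.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.preprocessing import KernelCenterer

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)   # nonlinear target

# Centered RBF Gram matrix plays the role of features in feature space.
K = KernelCenterer().fit_transform(rbf_kernel(X, gamma=0.5))
pls = PLSRegression(n_components=4).fit(K, y)
print("training R^2:", pls.score(K, y))
```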

  4. Dynamic Data Citation through Provenance - new approach for reproducible science in Geoscience Australia.

    NASA Astrophysics Data System (ADS)

    Bastrakova, I.; Car, N.

    2017-12-01

    Geoscience Australia (GA) is recognised and respected as the National Repository and steward of multiple nationally significant data collections, providing geoscience information, services and capability to the Australian Government, industry and stakeholders. Internally, this brings the challenge of managing a large volume (11 PB) of diverse and highly complex data distributed through a significant number of catalogues, applications, portals, virtual laboratories, and direct downloads from multiple locations. Externally, GA is facing constant change in government regulations (e.g. open data and archival laws), growing stakeholder demands for high-quality and near real-time delivery of data and products, and rapid technological advances enabling dynamic data access. The traditional approach to citing static data and products cannot satisfy increasing demands for the results of scientific workflows, or items within those workflows, to be open, discoverable, trusted and reproducible. Thus, citation of data, products, code and applications is being implemented through provenance records. This approach involves capturing the provenance of many GA processes according to a standardised data model and storing it, as well as metadata for the elements it references, in a searchable set of systems. This provides GA with the ability to cite workflows unambiguously, as well as each item within each workflow, including inputs, outputs and many other registered components. Dynamic objects can therefore be referenced flexibly in relation to their generation process - a dataset's metadata indicates where to obtain its provenance - meaning the relevant facts of its dynamism need not be crammed into a single citation object with a single set of attributes. This allows simple citations, similar to traditional static document citations such as references in journals, to be used for complex dynamic data and other objects such as software code.

  5. Entity Linking Leveraging the GeoDeepDive Cyberinfrastructure and Managing Uncertainty with Provenance.

    NASA Astrophysics Data System (ADS)

    Maio, R.; Arko, R. A.; Lehnert, K.; Ji, P.

    2017-12-01

    Unlocking the full, rich network of links between the scientific literature and the real-world entities to which data correspond - such as field expeditions (cruises) on oceanographic research vessels and physical samples collected during those expeditions - remains a challenge for the geoscience community. Doing so would enable data reuse and integration on a broad scale, making it possible to inspect the network and discover, for example, all rock samples reported in the scientific literature found within 10 kilometers of an undersea volcano, and associated geochemical analyses. Such a capability could facilitate new scientific discoveries. The GeoDeepDive project provides negotiated access to 4.2+ million documents from scientific publishers, enabling text and document mining via a public API and cyberinfrastructure. We mined this corpus using entity linking techniques, which are inherently uncertain, and recorded provenance information about each link. This opens the entity linking methodology to scrutiny, and enables downstream applications to make informed assessments about the suitability of an entity link for consumption. A major challenge is how to model and disseminate the provenance information. We present results from entity linking between journal articles, research vessels and cruises, and physical samples from the Petrological Database (PetDB), and incorporate Linked Data resources such as cruises in the Rolling Deck to Repository (R2R) catalog where possible. Our work demonstrates the value and potential of the GeoDeepDive cyberinfrastructure in combination with Linked Data infrastructure provided by the EarthCube GeoLink project. We present a research workflow to capture provenance information that leverages the World Wide Web Consortium (W3C) recommendation PROV Ontology.

  6. Data Quality Assurance and Provenance Tracking in ICOADS Release 3.0

    NASA Astrophysics Data System (ADS)

    Cram, T.; Worley, S. J.; Ji, Z.; Schuster, D.

    2017-12-01

    The International Comprehensive Ocean-Atmosphere Data Set (ICOADS) Release 3.0 (R3.0) is the world's most extensive collection of global surface marine meteorological in situ observational data. Managed under an international partnership, it contains over 455 million unique multi-parameter records, dates back to 1662, and is updated monthly in near real-time. It is a foundational dataset for weather and climate research that has been used by thousands of users. By using rigorous data preparation methods, new IT infrastructure, and International Maritime Meteorological Archive (IMMA) format enhancements, ICOADS R3.0 is exemplary in data quality assurance, provenance tracking, and capturing user feedback. The features in this data lifecycle management will be presented and include, but are not limited to, written data translation specification for each data source being added to ICOADS, assignment of data source identification parameters, attachment of the original data in the IMMA format to support future re-evaluation if necessary, permanently assigned unique identification on every record making data development and community collaborations easily possible using a relational database infrastructure, and extensible capacity of the IMMA format to augment the data richness beyond the primary scope of marine surface data. Some recent augmentations are more completely specified ocean observations from profiling observing systems, feedback data submitted by the atmospheric and oceanographic reanalysis providers, higher quality edited cloud reports, and community provided data value adjustments with uncertainty estimates. Highlights covering these ICOADS value-added features will be explained and the open free access from NCAR will be briefly described.

  7. A fast ergodic algorithm for generating ensembles of equilateral random polygons

    NASA Astrophysics Data System (ADS)

    Varela, R.; Hinson, K.; Arsuaga, J.; Diao, Y.

    2009-03-01

    Knotted structures are commonly found in circular DNA and along the backbone of certain proteins. In order to properly estimate properties of these three-dimensional structures it is often necessary to generate large ensembles of simulated closed chains (i.e. polygons) of equal edge lengths (such polygons are called equilateral random polygons). However, finding efficient algorithms that properly sample the space of equilateral random polygons is a difficult problem. Currently there are no proven algorithms that generate equilateral random polygons according to their theoretical distribution. In this paper we propose a method that generates equilateral random polygons in a 'step-wise uniform' way. We prove that this method is ergodic in the sense that any given equilateral random polygon can be generated by this method, and we show that the time needed to generate an equilateral random polygon of length n is linear in n. These two properties make this algorithm a significant improvement over existing generation methods. Detailed numerical comparisons of our algorithm with other widely used algorithms are provided.
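
    For orientation, a classical ergodic move used in this literature is the 'crankshaft' rotation, which preserves both closure and all edge lengths. The sketch below implements that generic move on a closed equilateral polygon; it is not the authors' step-wise uniform generator.

```python
import numpy as np

def crankshaft(poly: np.ndarray, rng: np.random.Generator) -> np.ndarray:
    """One crankshaft move on a closed polygon given as an (n, 3) vertex array."""
    n = len(poly)
    i, j = sorted(rng.choice(n, size=2, replace=False))
    if j - i < 2:                       # need at least one vertex between pivots
        return poly
    axis = poly[j] - poly[i]
    k = axis / np.linalg.norm(axis)     # unit rotation axis through the pivots
    theta = rng.uniform(0.0, 2.0 * np.pi)
    c, s = np.cos(theta), np.sin(theta)
    new = poly.copy()
    for m in range(i + 1, j):           # rotate the arc strictly between pivots
        v = poly[m] - poly[i]           # Rodrigues rotation about the axis
        new[m] = poly[i] + v * c + np.cross(k, v) * s + k * np.dot(k, v) * (1 - c)
    return new

# Start from a regular planar n-gon (closed, equilateral) and apply moves.
n = 20
t = 2 * np.pi * np.arange(n) / n
poly = np.column_stack([np.cos(t), np.sin(t), np.zeros(n)]) / (2 * np.sin(np.pi / n))
rng = np.random.default_rng(3)
for _ in range(1000):
    poly = crankshaft(poly, rng)
edge = np.linalg.norm(np.roll(poly, -1, 0) - poly, axis=1)
print("edge lengths stay 1:", np.allclose(edge, 1.0))
```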

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yashchuk, V.V.; Takacs, P.; Anderson, E.H.

    A modulation transfer function (MTF) calibration method based on binary pseudorandom (BPR) gratings and arrays has been proven effective for interferometric microscopes and a scatterometer. Here we report on a further expansion of the application range of the method. We describe the MTF calibration of a 6 in. phase shifting Fizeau interferometer. Beyond providing a direct measurement of the interferometer's MTF, tests with a BPR array surface have revealed an asymmetry in the instrument's data processing algorithm that fundamentally limits its bandwidth. Moreover, the tests have illustrated the effects of the instrument's detrending and filtering procedures on power spectral density measurements. The details of the development of a BPR test sample suitable for calibration of scanning and transmission electron microscopes are also presented. Such a test sample is realized as a multilayer structure with the layer thicknesses of two materials corresponding to the BPR sequence. The investigations confirm the universal character of the method that makes it applicable to a large variety of metrology instrumentation with spatial wavelength bandwidths from a few nanometers to hundreds of millimeters.

  9. An efficient user-oriented method for calculating compressible flow about three-dimensional inlets [panel method]

    NASA Technical Reports Server (NTRS)

    Hess, J. L.; Mack, D. P.; Stockman, N. O.

    1979-01-01

    A panel method is used to calculate incompressible flow about arbitrary three-dimensional inlets, with or without centerbodies, for four fundamental flow conditions: unit onset flows parallel to each of the coordinate axes, plus static operation. The computing time is scarcely longer than for a single solution. A linear superposition of these solutions quite rigorously gives incompressible flow about the inlet for any angle of attack, angle of yaw, and mass flow rate. Compressibility is accounted for by applying a well-proven correction to the incompressible flow. Since the computing times for the combination and the compressibility correction are small, flows at a large number of inlet operating conditions are obtained rather cheaply. Geometric input is aided by an automatic generating program. A number of graphical output features are provided to aid the user, including surface streamline tracing and automatic generation of curves of constant pressure, Mach number, and flow inclination at selected inlet cross sections. The inlet method and use of the program are described. Illustrative results are presented.

  10. Dictionaries and distributions: Combining expert knowledge and large scale textual data content analysis: Distributed dictionary representation.

    PubMed

    Garten, Justin; Hoover, Joe; Johnson, Kate M; Boghrati, Reihane; Iskiwitch, Carol; Dehghani, Morteza

    2018-02-01

    Theory-driven text analysis has made extensive use of psychological concept dictionaries, leading to a wide range of important results. These dictionaries have generally been applied through word count methods which have proven to be both simple and effective. In this paper, we introduce Distributed Dictionary Representations (DDR), a method that applies psychological dictionaries using semantic similarity rather than word counts. This allows for the measurement of the similarity between dictionaries and spans of text ranging from complete documents to individual words. We show how DDR enables dictionary authors to place greater emphasis on construct validity without sacrificing linguistic coverage. We further demonstrate the benefits of DDR on two real-world tasks and finally conduct an extensive study of the interaction between dictionary size and task performance. These studies allow us to examine how DDR and word count methods complement one another as tools for applying concept dictionaries and where each is best applied. Finally, we provide references to tools and resources to make this method both available and accessible to a broad psychological audience.
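
    A minimal sketch of the DDR scoring step, assuming pre-trained word vectors are already loaded into a plain dict (the tiny vectors below are fabricated for illustration; real use would load e.g. word2vec or GloVe embeddings):

```python
import numpy as np

def centroid(words, vectors):
    """Mean vector of the in-vocabulary words (dictionary or document)."""
    vs = [vectors[w] for w in words if w in vectors]
    return np.mean(vs, axis=0)

def ddr_score(document, dictionary, vectors):
    """Cosine similarity between document and concept-dictionary centroids."""
    d, c = centroid(document, vectors), centroid(dictionary, vectors)
    return float(d @ c / (np.linalg.norm(d) * np.linalg.norm(c)))

# Fabricated 3-d embeddings, purely for illustration.
vectors = {"kind": np.array([0.9, 0.1, 0.0]), "cruel": np.array([-0.8, 0.2, 0.1]),
           "helped": np.array([0.7, 0.3, 0.1]), "stranger": np.array([0.1, 0.9, 0.2])}
care_dictionary = ["kind", "helped"]
print(ddr_score(["helped", "stranger"], care_dictionary, vectors))
```

    Because the score is a similarity rather than a count, a document can register against a dictionary even when no dictionary word appears verbatim, which is the coverage benefit the abstract describes.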

  11. Validation of an HPLC-DAD Method for the Classification of Green Teas

    NASA Astrophysics Data System (ADS)

    Yu, Jingbo; Ye, Nengsheng; Gu, Xuexin; Liu, Ni

    A reversed-phase high performance liquid chromatography (RP-HPLC) separation coupled with diode array detection (DAD) and electrospray ionization mass spectrometry (ESI/MS) was developed and optimized for the classification of green teas. Five catechins [epigallocatechin (EGC), epigallocatechin gallate (EGCG), epicatechin (EC), gallocatechin gallate (GCG), epicatechin gallate (ECG)] were identified and quantified by the HPLC-DAD-ESI/MS/MS method. The limit of detection (LOD) of the five catechins was within the range of 1.25-15 ng. All the analytes exhibited good linearity up to 2500 ng. These compounds were used as chemical descriptors to define groups of green teas. Chemometric methods including principal component analysis (PCA) and hierarchical cluster analysis (HCA) were applied for this purpose. Twelve green tea samples originating from different regions were analyzed to reveal the natural groups. The results showed that the analyzed green teas were differentiated mainly by provenance; HCA afforded an excellent performance in terms of recognition and prediction abilities. This method was accurate and reproducible, providing a potential approach for authentication of green teas.

  12. PAV ontology: provenance, authoring and versioning.

    PubMed

    Ciccarese, Paolo; Soiland-Reyes, Stian; Belhajjame, Khalid; Gray, Alasdair Jg; Goble, Carole; Clark, Tim

    2013-11-22

    Provenance is a critical ingredient for establishing trust of published scientific content. This is true whether we are considering a data set, a computational workflow, a peer-reviewed publication or a simple scientific claim with supportive evidence. Existing vocabularies such as Dublin Core Terms (DC Terms) and the W3C Provenance Ontology (PROV-O) are domain-independent and general-purpose, and they allow and encourage extensions to cover more specific needs. In particular, to track authoring and versioning information of web resources, PROV-O provides a basic methodology but not any specific classes and properties for identifying or distinguishing between the various roles assumed by agents manipulating digital artifacts, such as author, contributor and curator. We present the Provenance, Authoring and Versioning ontology (PAV, namespace http://purl.org/pav/): a lightweight ontology for capturing "just enough" descriptions essential for tracking the provenance, authoring and versioning of web resources. We argue that such descriptions are essential for digital scientific content. PAV distinguishes between contributors, authors and curators of content, and creators of representations, in addition to the provenance of originating resources that have been accessed, transformed and consumed. We explore five projects (and communities) that have adopted PAV, illustrating their usage through concrete examples. Moreover, we present mappings that show how PAV extends the W3C PROV-O ontology to support broader interoperability. The initial design of the PAV ontology was driven by requirements from the AlzSWAN project, with further requirements incorporated later from other projects detailed in this paper. The authors strove to keep PAV lightweight and compact by including only those terms that have proven pragmatically useful in existing applications, and by recommending terms from existing ontologies when plausible. We analyze and compare PAV with related approaches, namely the Provenance Vocabulary (PRV), DC Terms and BIBFRAME. We identify similarities and analyze differences between those vocabularies and PAV, outlining strengths and weaknesses of our proposed model. We specify SKOS mappings that align PAV with DC Terms. We conclude the paper with general remarks on the applicability of PAV.
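
    As a small usage sketch (the resource and agent IRIs are invented; only the pav: namespace comes from the abstract), annotating a web resource with PAV terms via rdflib might look like:

```python
from rdflib import Graph, Literal, Namespace, URIRef

PAV = Namespace("http://purl.org/pav/")  # namespace given in the abstract
g = Graph()
g.bind("pav", PAV)

doc = URIRef("http://example.org/dataset/42")  # hypothetical resource
g.add((doc, PAV.authoredBy, URIRef("http://example.org/people/alice")))
g.add((doc, PAV.curatedBy, URIRef("http://example.org/people/bob")))
g.add((doc, PAV.version, Literal("2.1")))
print(g.serialize(format="turtle"))
```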

  13. Asbestos: The Case for Encapsulation.

    ERIC Educational Resources Information Center

    Russek, William F.

    1980-01-01

    Encapsulation has proven to be the safest, surest, and most permanent method of treating sprayed asbestos on ceilings and walls. Federal aid is available to help pay for inspection of school buildings for asbestos and for asbestos removal. (Author/MLF)

  14. Demonstrations To Save the World.

    ERIC Educational Resources Information Center

    Brown, Tom; Dias, Michael

    2003-01-01

    Presents two environmental modeling activities, biosphere bubbles and a crystal ball of population growth, along with a related online exercise and explains proven teaching methods that make demonstrations less teacher-centered and more engaging for groups of collaborating students. (KHR)

  15. Development of a speech autocuer

    NASA Astrophysics Data System (ADS)

    Bedles, R. L.; Kizakvich, P. N.; Lawson, D. T.; McCartney, M. L.

    1980-12-01

    A wearable, visually based prosthesis for the deaf based upon the proven method for removing lipreading ambiguity known as cued speech was fabricated and tested. Both software and hardware developments are described, including a microcomputer, display, and speech preprocessor.

  16. Development of a speech autocuer

    NASA Technical Reports Server (NTRS)

    Bedles, R. L.; Kizakvich, P. N.; Lawson, D. T.; Mccartney, M. L.

    1980-01-01

    A wearable, visually based prosthesis for the deaf based upon the proven method for removing lipreading ambiguity known as cued speech was fabricated and tested. Both software and hardware developments are described, including a microcomputer, display, and speech preprocessor.

  17. The historical biogeography of Mammalia

    PubMed Central

    Springer, Mark S.; Meredith, Robert W.; Janecka, Jan E.; Murphy, William J.

    2011-01-01

    Palaeobiogeographic reconstructions are underpinned by phylogenies, divergence times and ancestral area reconstructions, which together yield ancestral area chronograms that provide a basis for proposing and testing hypotheses of dispersal and vicariance. Methods for area coding include multi-state coding with a single character, binary coding with multiple characters and string coding. Ancestral reconstruction methods are divided into parsimony versus Bayesian/likelihood approaches. We compared nine methods for reconstructing ancestral areas for placental mammals. Ambiguous reconstructions were a problem for all methods. Important differences resulted from coding areas based on the geographical ranges of extant species versus the geographical provenance of the oldest fossil for each lineage. Africa and South America were reconstructed as the ancestral areas for Afrotheria and Xenarthra, respectively. Most methods reconstructed Eurasia as the ancestral area for Boreoeutheria, Euarchontoglires and Laurasiatheria. The coincidence of molecular dates for the separation of Afrotheria and Xenarthra at approximately 100 Ma with the plate tectonic sundering of Africa and South America hints at the importance of vicariance in the early history of Placentalia. Dispersal has also been important including the origins of Madagascar's endemic mammal fauna. Further studies will benefit from increased taxon sampling and the application of new ancestral area reconstruction methods. PMID:21807730

  18. Damage Detection Based on Static Strain Responses Using FBG in a Wind Turbine Blade

    PubMed Central

    Tian, Shaohua; Yang, Zhibo; Chen, Xuefeng; Xie, Yong

    2015-01-01

    The damage detection of a wind turbine blade enables better operation of the turbines and provides an early warning of destructive events in the blade in order to avoid catastrophic losses. A new non-baseline damage detection method based on Fiber Bragg gratings (FBG) in a wind turbine blade is developed in this paper. Firstly, the Chi-square distribution is proven to be an effective damage-sensitive feature, which is adopted as the individual information source for the local decision. In order to obtain the global and optimal decision for the damage detection, the feature information fusion (FIF) method is proposed to fuse and optimize information from the above individual information sources, and the damage is detected accurately through the global decision. Then a 13.2 m wind turbine blade with a distributed strain sensor system is adopted to demonstrate the feasibility of the proposed method, and the strain energy method (SEM) is used to describe the advantage of the proposed method. Finally, results show that the proposed method can deliver encouraging results for damage detection in the wind turbine blade. PMID:26287200

  19. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  20. The undercooling of liquids

    NASA Technical Reports Server (NTRS)

    Turnbull, D.

    1984-01-01

    The formation by melt quenching of such metastable structures as glassy or microcrystalline solids and highly supersaturated solutions is made possible by the extreme resistance of most melts to homophase crystal nucleation at deep undercooling. This nucleation resistance contrasts sharply with the very low kinetic resistance to the movement of crystal-melt interfaces, once formed, in metals and other fluid systems at even minute undercooling. The methods of nucleation study which have proven especially effective in bypassing nucleation by heterophase impurities, thereby exposing the high resistance of melts to homophase nucleation, may be summarized as follows: observation of the crystallization behavior of dispersed small droplets; drop tube experiments in which liquid drops solidify, under containerless conditions, during their fall in the tube; and observation of the crystallization of bulk specimens immersed in fluxes chosen to dissolve or otherwise deactivate (e.g., by wetting) heterophase nucleants. The flux method has proven to be remarkably effective in deactivating such nucleants in certain pure metals.

  1. Integrating addiction treatment into primary care using mobile health technology: protocol for an implementation research study

    PubMed Central

    2014-01-01

    Background Healthcare reform in the United States is encouraging Federally Qualified Health Centers and other primary-care practices to integrate treatment for addiction and other behavioral health conditions into their practices. The potential of mobile health technologies to manage addiction and comorbidities such as HIV in these settings is substantial but largely untested. This paper describes a protocol to evaluate the implementation of an E-Health integrated communication technology delivered via mobile phones, called Seva, into primary-care settings. Seva is an evidence-based system of addiction treatment and recovery support for patients and real-time caseload monitoring for clinicians. Methods/Design Our implementation strategy uses three models of organizational change: the Program Planning Model to promote acceptance and sustainability, the NIATx quality improvement model to create a welcoming environment for change, and Rogers’s diffusion of innovations research, which facilitates adaptations of innovations to maximize their adoption potential. We will implement Seva and conduct an intensive, mixed-methods assessment at three diverse Federally Qualified Healthcare Centers in the United States. Our non-concurrent multiple-baseline design includes three periods — pretest (ending in four months of implementation preparation), active Seva implementation, and maintenance — with implementation staggered at six-month intervals across sites. The first site will serve as a pilot clinic. We will track the timing of intervention elements and assess study outcomes within each dimension of the Reach, Effectiveness, Adoption, Implementation, and Maintenance framework, including effects on clinicians, patients, and practices. Our mixed-methods approach will include quantitative (e.g., interrupted time-series analysis of treatment attendance, with clinics as the unit of analysis) and qualitative (e.g., staff interviews regarding adaptations to implementation protocol) methods, and assessment of implementation costs. Discussion If implementation is successful, the field will have a proven technology that helps Federally Qualified Health Centers and affiliated organizations provide addiction treatment and recovery support, as well as a proven strategy for implementing the technology. Seva also has the potential to improve core elements of addiction treatment, such as referral and treatment processes. A mobile technology for addiction treatment and accompanying implementation model could provide a cost-effective means to improve the lives of patients with drug and alcohol problems. Trial registration ClinicalTrials.gov (NCT01963234). PMID:24884976

  2. Strontium isotopes in otoliths of a non-migratory fish (slimy sculpin): Implications for provenance studies

    USGS Publications Warehouse

    Brennan, Sean R.; Fernandez, Diego P.; Zimmerman, Christian E.; Cerling, Thure E.; Brown, Randy J.; Wooller, Matthew J.

    2015-01-01

    Heterogeneity in 87Sr/86Sr ratios of river-dissolved strontium (Sr) across geologically diverse environments provides a useful tool for investigating provenance, connectivity and movement patterns of various organisms and materials. Evaluation of site-specific 87Sr/86Sr temporal variability throughout study regions is a prerequisite for provenance research, but the dynamics driving temporal variability are generally system-dependent and not accurately predictable. We used the time-keeping properties of otoliths from non-migratory slimy sculpin (Cottus cognatus) to evaluate multi-scale 87Sr/86Sr temporal variability of river waters throughout the Nushagak River, a large (34,700 km2) remote watershed in Alaska, USA. Slimy sculpin otoliths incorporated site-specific temporal variation at sub-annual resolution and were able to record changes on the order of 0.0001 in the 87Sr/86Sr ratio. 87Sr/86Sr profiles of slimy sculpin collected in tributaries and main-stem channels of the upper watershed indicated that these regions were temporally stable, whereas the Lower Nushagak River exhibited some spatio-temporal variability. This study illustrates how the behavioral ecology of a non-migratory organism can be used to evaluate sub-annual 87Sr/86Sr temporal variability and has broad implications for provenance studies employing this tracer.

  3. TEACH: An Ethogram-Based Method to Observe and Record Teaching Behavior

    ERIC Educational Resources Information Center

    Kline, Michelle Ann

    2017-01-01

    Teaching has attracted growing research attention in studies of human and animal behavior as a crucial behavior that coevolved with human cultural capacities. However, the synthesis of data on teaching across species and across human populations has proven elusive because researchers use a variety of definitions and methods to approach the topic.…

  4. A Transfer Learning Approach for Applying Matrix Factorization to Small ITS Datasets

    ERIC Educational Resources Information Center

    Voß, Lydia; Schatten, Carlotta; Mazziotti, Claudia; Schmidt-Thieme, Lars

    2015-01-01

    Machine Learning methods for Performance Prediction in Intelligent Tutoring Systems (ITS) have proven their efficacy; specific methods, e.g. Matrix Factorization (MF), however suffer from the lack of available information about new tasks or new students. In this paper we show how this problem could be solved by applying Transfer Learning (TL),…

  5. Conservative discretization of the Landau collision integral

    DOE PAGES

    Hirvijoki, E.; Adams, M. F.

    2017-03-28

    Here we describe a density-, momentum-, and energy-conserving discretization of the nonlinear Landau collision integral. The method is suitable for both the finite-element and discontinuous Galerkin methods and does not require structured meshes. The conservation laws for the discretization are proven algebraically and demonstrated numerically for an axially symmetric nonlinear relaxation problem using a finite-element implementation.

  6. Using Touchscreens as Position Detectors in Physics Experiments

    ERIC Educational Resources Information Center

    Dilek, Ufuk; Sengören, Serap Kaya

    2017-01-01

    The position of a ball was measured by using the touchscreen of a mobile phone during its rolling motion. The translational speed of the ball was determined using the recorded position and time data. The speed was also calculated by a conventional method. The speed values determined by the two methods were consistent, thus it was proven that a…

  7. Paired comparison estimates of willingness to accept versus contingent valuation estimates of willingness to pay

    Treesearch

    John B. Loomis; George Peterson; Patricia A. Champ; Thomas C. Brown; Beatrice Lucero

    1998-01-01

    Estimating empirical measures of an individual's willingness to accept that are consistent with conventional economic theory has proven difficult. The method of paired comparison offers a promising approach to estimating willingness to accept. This method involves having individuals make binary choices between receiving a particular good or a sum of money....

  8. Insights into the Geographic Sequence of Deglaciation in the Weddell Sea Embayment by Provenance of Ice-Rafted Debris

    NASA Astrophysics Data System (ADS)

    Williams, T.; Hemming, S. R.; Licht, K.; Agrios, L.; Brachfeld, S. A.; van de Flierdt, T.; Hillenbrand, C. D.; Ehrmann, W. U.; Zhai, X.; Cai, Y.; Corley, A. D.; Kuhn, G.

    2017-12-01

    The geochemical and geochronological fingerprint of rock debris eroded and carried by ice streams may be used to identify the provenance of iceberg-rafted debris (IRD) in the marine sediment record. During ice retreat following glacial maxima, it has been shown that there is an increase in IRD accumulation in marine sediments underlying the western limb of the Weddell Gyre. Here we present IRD provenance records from sediment core PS1571-1 in the NW Weddell Sea, and interpret these records in terms of the geographic sequence of ice sheet retreat in the Weddell Sea embayment during the most recent deglaciation. We first characterize the source areas of eroded debris around the Weddell Sea Embayment, using published mapping of the embayment and new material from: 1. Till in modern moraines at the edges of ice streams, including the Foundation Ice Stream, the Academy Glacier, and the Recovery Glacier; and 2. Subglacial till and proximal glaciomarine sediment from existing cores located along the front of the Filchner and Ronne Ice Shelves, collected on past expeditions of the RV Polarstern. The analyses on these samples include 40Ar/39Ar hornblende and biotite thermochronology and U-Pb zircon geochronology on individual mineral grains, and K-Ar thermochronology, Nd isotopes, and clay mineralogy on the clay grain size fraction. Results so far indicate that samples along the front of the Filchner and Ronne Ice Shelves record the geochemical and geochronological fingerprint that would be expected from tracing ice flow lines back to the bedrock terranes. The Ronne (west), Hughes (central), and Filchner (east) sectors have distinguishable provenance source signatures, and further subdivision is possible. In core PS1571-1, downcore IRD provenance changes reflect iceberg output and ice sheet retreat from the different sectors of the embayment through the last deglaciation. The detrital provenance method of interpreting the geographic sequence of ice retreat can equally be applied to previous deglaciations of the Weddell Sea Embayment.

  9. Provenance of sandstones in the Golconda terrane, north central Nevada

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, E.A.

    1991-02-01

    The upper Paleozoic Golconda terrane of north-central Nevada is a composite of several structurally bounded subterranes made of clastic, volcanic, and carbonate rocks. The clastic rocks provide important clues for the interpretation of the provenance and paleogeographic settings of the different lithologic assemblages found in these subterranes. Two petrographically distinct sandstones are identified in the Golconda terrane in the Osgood Mountains and the Hot Springs Range of north-central Nevada. The sandstone of the Mississippian Farrel Canyon Formation, part of the Dry Hills subterrane, is characterized by quartzose, sedimentary, and lithic-rich clasts with a small feldspar component. In contrast, the sandstone of the Permian Poverty Peak (II) subterrane is a silty quartzarenite with no lithic component and a very limited feldspar component. The sandstone of the Farrel Canyon Formation is similar to nonvolcanic sandstones reported from elsewhere in the Golconda terrane. Modal data reflect a provenance of a recycled orogen and permit the interpretation that it could have been derived from the Antler orogen, as has been proposed for other sandstones of the Golconda terrane. The sandstone of the Poverty Peak (II) subterrane is more mature than any of the other sandstones in the Golconda terrane, the Antler overlap sequence, or the Antler foreland basin sequence. Modal data put the Poverty Peak (II) sandstone in the continental block provenance category. The distinct extrabasinal provenances represented in these different sandstones support the idea that the Golconda basin was made up of complex paleogeographic settings, which included multiple sources of extrabasinal sediment.

  10. Local Technical Assistance Program Field Manual

    DOT National Transportation Integrated Search

    1997-02-01

    The FHWA Local Technical Assistance Program (LTAP) provides technology transfer products to local highway departments. One of the most valuable and well-received resources of the program, the LTAP (or T 2 ) centers, have proven themselves invaluable ...

  11. Facilitating Stewardship of scientific data through standards based workflows

    NASA Astrophysics Data System (ADS)

    Bastrakova, I.; Kemp, C.; Potter, A. K.

    2013-12-01

    There are three main suites of standards that can be used to define the fundamental scientific methodology of data, methods and results. These are, firstly, metadata standards to enable discovery of the data (ISO 19115); secondly, the Sensor Web Enablement (SWE) suite of standards, which includes the O&M and SensorML standards; and thirdly, ontologies that provide vocabularies to define scientific concepts and the relationships between these concepts. All three types of standards have to be utilised by the practicing scientist so that those who ultimately have to steward the data can ensure that the data are preserved, curated, reused and repurposed. Additional benefits of this approach include transparency of scientific processes from data acquisition to the creation of scientific concepts and models, and provision of context to inform data use. Collecting and recording metadata is the first step in the scientific data flow. The primary role of metadata is to provide details of geographic extent, availability and a high-level description of data suitable for its initial discovery through common search engines. The SWE suite provides standardised patterns to describe observations and measurements taken for these data, capture detailed information about observational or analytical methods and the instruments used, and define quality determinations. This information standardises browsing capability over discrete data types. The standardised patterns of the SWE standards simplify aggregation of observation and measurement data, enabling scientists to relate disaggregated data to scientific concepts. The first two steps provide a necessary basis for reasoning about concepts of 'pure' science, building relationships between concepts of different domains (linked data), and identifying domain classifications and vocabularies. Geoscience Australia is re-examining its marine data flows, including metadata requirements and business processes, to achieve a clearer link between scientific data acquisition and analysis requirements and effective interoperable data management and delivery. This includes participating in national and international dialogue on the development of standards, embedding data management activities in business processes, and developing scientific staff as effective data stewards. A similar approach is applied to the geophysical data. By ensuring the geophysical datasets at GA strictly follow metadata and industry standards, we are able to implement a provenance-based workflow where the data are easily discoverable, geophysical processing can be applied, and results can be stored. The provenance-based workflow enables metadata records for the results to be produced automatically from the input dataset metadata.

  12. Note: A simple sample transfer alignment for ultra-high vacuum systems.

    PubMed

    Tamtögl, A; Carter, E A; Ward, D J; Avidor, N; Kole, P R; Jardine, A P; Allison, W

    2016-06-01

    The alignment of ultra-high-vacuum sample transfer systems can be problematic when there is no direct line of sight to assist the user. We present the design of a simple and cheap system which greatly simplifies the alignment of sample transfer devices. Our method is based on the adaptation of a commercial digital camera which provides live views from within the vacuum chamber. The images from the camera are further processed by an image recognition and processing code which determines any misalignment and reports it to the user. The installation has proven extremely useful for aligning the sample with respect to the transfer mechanism. Furthermore, the alignment software can be easily adapted to other systems.

  13. Peptide Probe for Crystalline Hydroxyapatite: In Situ Detection of Biomineralization

    NASA Astrophysics Data System (ADS)

    Cicerone, Marcus; Becker, Matthew; Simon, Carl; Chatterjee, Kaushik

    2009-03-01

    While cells template mineralization in vitro and in vivo, specific detection strategies that impart chemical and structural information on this process have proven elusive. Recently we have developed, via phage-display methods, a peptide probe for in situ detection that is specific to crystalline hydroxyapatite (HA). We are using this in fluorescence-based assays to characterize mineralization. One application being explored is the screening of tissue engineering scaffolds for their ability to support osteogenesis. Specifically, osteoblasts are being cultured in hydrogel scaffolds possessing property gradients to provide a test bed for the HA peptide probe. Hydrogel properties that support osteogenesis and HA deposition will be identified using the probe to demonstrate its utility in optimizing the design of tissue scaffolds.

  14. South African Research Ethics Committee Review of Standards of Prevention in HIV Vaccine Trial Protocols.

    PubMed

    Essack, Zaynab; Wassenaar, Douglas R

    2018-04-01

    HIV prevention trials provide a prevention package to participants to help prevent HIV acquisition. As new prevention methods are proven effective, this raises ethical and scientific design complexities regarding the prevention package or standard of prevention. Given its high HIV incidence and prevalence, South Africa has become a hub for HIV prevention research. For this reason, it is critical to study the implementation of relevant ethical-legal frameworks for such research in South Africa. This qualitative study used in-depth interviews to explore the practices and perspectives of eight members of South African research ethics committees (RECs) who have reviewed protocols for HIV vaccine trials. Their practices and perspectives are compared with ethics guideline requirements for standards of prevention.

  15. Lessons from single-cell transcriptome analysis of oxygen-sensing cells.

    PubMed

    Zhou, Ting; Matsunami, Hiroaki

    2018-05-01

    The advent of single-cell RNA-sequencing (RNA-Seq) technology has enabled transcriptome profiling of individual cells. Comprehensive gene expression analysis at the single-cell level has proven effective in characterizing the most fundamental aspects of cellular function and identity. This unbiased approach is revolutionary for identifying key molecules in small and/or heterogeneous tissues like oxygen-sensing cells. Here, we review the major methods of current single-cell RNA-Seq technology. We discuss how this technology has advanced the understanding of oxygen-sensing glomus cells in the carotid body and helped uncover novel oxygen-sensing cells and mechanisms in the mouse olfactory system. We conclude by providing our perspective on future single-cell RNA-Seq research directed at oxygen-sensing cells.

  16. Enhancement and character recognition of the erased colophon of a 15th-century Hebrew prayer book

    NASA Astrophysics Data System (ADS)

    Walvoord, Derek J.; Easton, Roger L., Jr.; Knox, Keith T.; Heimbueger, Matthew

    2005-01-01

    A handwritten codex often included an inscription that listed facts about its publication, such as the names of the scribe and patron, date of publication, the city where the book was copied, etc. These facts obviously provide essential information to a historian studying the provenance of the codex. Unfortunately, this page was sometimes erased after the sale of the book to a new owner, often by scraping off the original ink. The importance of recovering this information would be difficult to overstate. This paper reports on the methods of imaging, image enhancement, and character recognition that were applied to this page in a Hebrew prayer book copied in Florence in the 15th century.

  17. Enhancement and character recognition of the erased colophon of a 15th-century Hebrew prayer book

    NASA Astrophysics Data System (ADS)

    Walvoord, Derek J.; Easton, Roger L., Jr.; Knox, Keith T.; Heimbueger, Matthew

    2004-12-01

    A handwritten codex often included an inscription that listed facts about its publication, such as the names of the scribe and patron, date of publication, the city where the book was copied, etc. These facts obviously provide essential information to a historian studying the provenance of the codex. Unfortunately, this page was sometimes erased after the sale of the book to a new owner, often by scraping off the original ink. The importance of recovering this information would be difficult to overstate. This paper reports on the methods of imaging, image enhancement, and character recognition that were applied to this page in a Hebrew prayer book copied in Florence in the 15th century.

  18. Creating Effective Dialogue Around Climate Change

    NASA Astrophysics Data System (ADS)

    Kiehl, J. T.

    2015-12-01

    Communicating climate change to people from diverse sectors of society has proven to be difficult in the United States. It is widely recognized that the difficulties arise from a number of sources, including basic science understanding, the psychologically affect-laden content surrounding climate change, and the diversity of value systems in our society. I explore ways of working with the affect that arises around climate change and describe specific methods for working with the resistance often encountered when communicating this important issue. The techniques I describe are rooted in psychology and group process, and provide means for creating more effective narratives to break through the barriers to communicating climate change science. Examples are given from personal experiences in presenting climate change to diverse groups.

  19. Newton-like methods for Navier-Stokes solution

    NASA Astrophysics Data System (ADS)

    Qin, N.; Xu, X.; Richards, B. E.

    1992-12-01

    The paper reports on Newton-like methods called SFDN-alpha-GMRES and SQN-alpha-GMRES methods that have been devised and proven as powerful schemes for large nonlinear problems typical of viscous compressible Navier-Stokes solutions. They can be applied using a partially converged solution from a conventional explicit or approximate implicit method. Developments have included the efficient parallelization of the schemes on a distributed memory parallel computer. The methods are illustrated using a RISC workstation and a transputer parallel system respectively to solve a hypersonic vortical flow.

  20. Reconstructing Spectral Scenes Using Statistical Estimation to Enhance Space Situational Awareness

    DTIC Science & Technology

    2006-12-01

    simultaneously spatially and spectrally deblur the images collected from ASIS. The algorithms are based on proven estimation theories and do not...collected with any system using a filtering technology known as Electronic Tunable Filters (ETFs). Previous methods to deblur spectral images collected...spectrally deblurring than the previously investigated methods. This algorithm expands on a method used for increasing the spectral resolution in gamma-ray

  1. Remote sensing of vegetation fires and its contribution to a fire management information system

    Treesearch

    Stephane P. Flasse; Simon N. Trigg; Pietro N. Ceccato; Anita H. Perryman; Andrew T. Hudak; Mark W. Thompson; Bruce H. Brockett; Moussa Drame; Tim Ntabeni; Philip E. Frost; Tobias Landmann; Johan L. le Roux

    2004-01-01

    In the last decade, research has proven that remote sensing can provide very useful support to fire managers. This chapter provides an overview of the types of information remote sensing can provide to the fire community. First, it considers fire management information needs in the context of a fire management information system. An introduction to remote sensing then...

  2. Just-in-time and stockless programs for hospitals: fad or trend?

    PubMed

    Lynch, D

    1991-05-01

    The JIT and stockless approach to provider-supplier relationships has proven to be a win-win proposition for the partners that have implemented it, in many manufacturing industries and health care organizations alike. This strategy will fundamentally impact the entire cost structure within the hospital supply distribution chain. Rewards have proven attainable and more comprehensive than had been hoped in the health care applications. The sweeping changes the health care industry experienced during the 1980s are leading creative materiel managers to seize the initiative to improve the current operating costs of their hospitals. They do not want to be left behind "holding the inventory."

  3. Space Flight Operations Center local area network

    NASA Technical Reports Server (NTRS)

    Goodman, Ross V.

    1988-01-01

    The existing Mission Control and Computer Center at JPL will be replaced by the Space Flight Operations Center (SFOC). One part of the SFOC is the LAN-based distribution system. The purpose of the LAN is to distribute the processed data among the various elements of the SFOC. The SFOC LAN will provide a robust subsystem that will support the Magellan launch configuration and future project adaptation. Its capabilities include (1) a proven cable medium as the backbone for the entire network; (2) hardware components that are reliable, varied, and follow OSI standards; (3) accurate and detailed documentation for fault isolation and future expansion; and (4) proven monitoring and maintenance tools.

  4. The Schwinger Variational Method

    NASA Technical Reports Server (NTRS)

    Huo, Winifred M.

    1995-01-01

    Variational methods have proven invaluable in theoretical physics and chemistry, both for bound state problems and for the study of collision phenomena. For collisional problems they can be grouped into two types: those based on the Schroedinger equation and those based on the Lippmann-Schwinger equation. The application of the Schwinger variational (SV) method to e-molecule collisions and photoionization has been reviewed previously. The present chapter discusses the implementation of the SV method as applied to e-molecule collisions.

  5. 4 Other Ways to Quit Smoking Besides Using Medication

    Cancer.gov

    There are other ways to quit smoking besides cold turkey and medication. Medications are a good tool, but they’re not a magic bullet. Boost your chances of quitting and staying quit by using these other proven methods.

  6. 3 CFR 8424 - Proclamation 8424 of September 28, 2009. Family Day, 2009

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... have taught us time and again that children raised in loving, caring homes have the ability to reject..., have proven themselves to be the most effective preventative method for keeping our children drug-free...

  7. How Asian Teachers Polish Each Lesson to Perfection.

    ERIC Educational Resources Information Center

    Stigler, James W.; Stevenson, Harold W.

    1991-01-01

    Compares elementary mathematics instruction in Taiwan, Japan, Chicago, and Minneapolis. Finds that American teachers are overworked and devote less time to conducting lessons than Asian teachers, who employ proven inductive methods within the framework of standardized curricula. (DM)

  8. Big Data Provenance: Challenges, State of the Art and Opportunities.

    PubMed

    Wang, Jianwu; Crawl, Daniel; Purawat, Shweta; Nguyen, Mai; Altintas, Ilkay

    2015-01-01

    Ability to track provenance is a key feature of scientific workflows to support data lineage and reproducibility. The challenges that are introduced by the volume, variety and velocity of Big Data, also pose related challenges for provenance and quality of Big Data, defined as veracity. The increasing size and variety of distributed Big Data provenance information bring new technical challenges and opportunities throughout the provenance lifecycle including recording, querying, sharing and utilization. This paper discusses the challenges and opportunities of Big Data provenance related to the veracity of the datasets themselves and the provenance of the analytical processes that analyze these datasets. It also explains our current efforts towards tracking and utilizing Big Data provenance using workflows as a programming model to analyze Big Data.

  9. Soil magnetic susceptibility mapping as a pollution and provenance tool: an example from southern New Zealand

    NASA Astrophysics Data System (ADS)

    Martin, A. P.; Ohneiser, C.; Turnbull, R. E.; Strong, D. T.; Demler, S.

    2018-02-01

    The presence or absence, degree and variation of heavy metal contamination in New Zealand soils is a matter of ongoing debate, as it affects soil quality, agriculture and human health. In many instances, however, the soil heavy metal concentration data needed to answer these questions do not exist and the debate continues. To address this, magnetic susceptibility (a common proxy for heavy metal contamination) was measured in topsoil (0-30 cm) and subsoil (50-70 cm) at grid sites spaced at 8 km intervals across ca. 20 000 km2 of southern New Zealand. Samples were measured for both mass- and volume-specific magnetic susceptibility, with the results strongly, positively correlated. Three different methods of determining anomalies were applied to the data: the topsoil-subsoil difference method, the Tukey boxplot method and the geoaccumulation index method, with each method filtering out progressively more anomalies. Additional soil magnetic measurements (hysteresis, isothermal remanence and thermomagnetic) were made on a select subset of samples from anomalous sites. Magnetite is the dominant remanence-carrying mineral, and magnetic susceptibility is governed by that mineral's concentration in soils rather than by mineral type. All except two anomalous sites have a dominant geogenic (rather than anthropogenic) source. By proxy, heavy metal contamination in southern New Zealand soils is minimal, making them relatively pristine. The provenance of the magnetic minerals at the anomalous sites can be traced back to likely sources in outcrops of igneous rocks within the same catchment, terrane or rock type: a distance of <100 km but frequently <1 km. Soil provenance is a key step when mapping element or isotopic distributions, vectoring to mineralization, or studying soil for agricultural suitability, water quality or environmental regulation. Measuring soil magnetic susceptibility is a quick and inexpensive tool that usefully supplements soil geochemical data.
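
    Two of the three screening methods named above are simple enough to sketch. The Python below is illustrative only (the study's background values and exact fences may differ): it applies the conventional Tukey upper fence and the Müller geoaccumulation index Igeo = log2(C / (1.5 B)) to magnetic susceptibility values.

    ```python
    # Illustrative anomaly screening for magnetic susceptibility values.
    import numpy as np

    def tukey_anomalies(values):
        """Flag values above the upper Tukey fence, Q3 + 1.5 * IQR."""
        q1, q3 = np.percentile(values, [25, 75])
        return values > q3 + 1.5 * (q3 - q1)

    def geoaccumulation_index(values, background):
        """Igeo = log2(C / (1.5 * B)); Igeo > 0 is conventionally
        read as above-background."""
        return np.log2(values / (1.5 * background))

    chi = np.array([12.0, 15.0, 11.0, 14.0, 95.0, 13.0])  # illustrative data
    print(tukey_anomalies(chi))                            # only 95.0 is flagged
    print(geoaccumulation_index(chi, background=np.median(chi)))
    ```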

  10. 90 Seconds of Discovery: Fast Pyrolysis

    ScienceCinema

    Weber, Robert; Elliot, Douglas

    2018-01-16

    Fossil fuels have provided a time-proven, energy-dense fuel for more than a century. The challenge facing America today is developing alternatives that work within our existing infrastructure, decrease environmental impact, and increase energy security.

  11. Mainstream Deammonification (WERF Report INFR6R11)

    EPA Science Inventory

    The objective of this research was to investigate the feasibility of applying the deammonification concept, which is already highly successful and proven in sidestream configurations, in the mainstream treatment process. The deammonification process for nitrogen removal provides ...

  12. Research on non-destructive evaluation : workshop.

    DOT National Transportation Integrated Search

    2013-09-01

    The workshop held on March 28 at the MDOT Aeronautics Auditorium in Lansing, Michigan, was organized with the goal of providing an overview of readily available and proven NDE technologies and the process of integrating these technologies into th...

  13. Improving the performance of roadside vegetation.

    DOT National Transportation Integrated Search

    2011-02-01

    Vegetation along roadways can be aesthetically pleasing and helps to stabilize the soil, which reduces wind-blown soil and soil erosion. While products containing chloride salts have proven to be very effective in helping to provide safe road sur...

  14. The Materials Commons: A Collaboration Platform and Information Repository for the Global Materials Community

    NASA Astrophysics Data System (ADS)

    Puchala, Brian; Tarcea, Glenn; Marquis, Emmanuelle A.; Hedstrom, Margaret; Jagadish, H. V.; Allison, John E.

    2016-08-01

    Accelerating the pace of materials discovery and development requires new approaches and means of collaborating and sharing information. To address this need, we are developing the Materials Commons, a collaboration platform and information repository for use by the structural materials community. The Materials Commons has been designed to be a continuous, seamless part of the scientific workflow process. Researchers upload the results of experiments and computations as they are performed, automatically where possible, along with the provenance information describing the experimental and computational processes. The Materials Commons website provides an easy-to-use interface for uploading and downloading data and data provenance, as well as for searching and sharing data. This paper provides an overview of the Materials Commons. Concepts are also outlined for integrating the Materials Commons with the broader Materials Information Infrastructure that is evolving to support the Materials Genome Initiative.

  15. Measuring the volume of brain tumour and determining its location in T2-weighted MRI images using hidden Markov random field: expectation maximization algorithm

    NASA Astrophysics Data System (ADS)

    Mat Jafri, Mohd. Zubir; Abdulbaqi, Hayder Saad; Mutter, Kussay N.; Mustapha, Iskandar Shahrim; Omar, Ahmad Fairuz

    2017-06-01

    A brain tumour is an abnormal growth of tissue in the brain. Most tumour volume measurements are carried out manually by the radiographer and radiologist, without relying on any automated program. This manual method is a time-consuming task and may give inaccurate results. Treatment, diagnosis, and the signs and symptoms of brain tumours depend mainly on the tumour volume and its location. In this paper, an approach is proposed to improve volume measurement of brain tumours, along with a new method to determine brain tumour location. The current study presents a hybrid of two methods. One is hidden Markov random field expectation maximization (HMRF-EM), which provides an initial classification of the image. The other applies a threshold, which yields the final segmentation. The tumour volume is then calculated from voxel dimension measurements. The brain tumour location was determined accurately in T2-weighted MRI images using a new algorithm. According to the results, this process proved more useful than the manual method, providing the means to calculate the volume and determine the location of a brain tumour.
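
    The voxel-counting volume step admits a short sketch. The code below assumes a binary tumour mask has already been produced by the segmentation and thresholding described above; the mask and voxel dimensions are illustrative.

    ```python
    # Volume from a binary segmentation mask: count tumour voxels and
    # multiply by the volume of a single voxel.
    import numpy as np

    def tumour_volume_mm3(mask, voxel_dims_mm):
        dx, dy, dz = voxel_dims_mm
        return float(np.count_nonzero(mask)) * dx * dy * dz

    def tumour_centroid(mask):
        """A simple location estimate: centroid of the segmented voxels."""
        return tuple(float(c.mean()) for c in np.nonzero(mask))

    mask = np.zeros((10, 256, 256), dtype=bool)  # illustrative 10-slice mask
    mask[4:6, 100:120, 100:130] = True
    print(tumour_volume_mm3(mask, voxel_dims_mm=(5.0, 0.9, 0.9)), "mm^3")
    print(tumour_centroid(mask))                 # (slice, row, col) indices
    ```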

  16. Portable FAIMS: Applications and Future Perspectives.

    PubMed

    Costanzo, Michael T; Boock, Jared J; Kemperman, Robin H J; Wei, Michael S; Beekman, Christopher R; Yost, Richard A

    2017-11-01

    Miniaturized mass spectrometry (MMS) is optimal for a wide variety of applications that benefit from field-portable instrumentation. Like MMS, field asymmetric ion mobility spectrometry (FAIMS) has proven capable of providing in situ analysis, allowing researchers to bring the lab to the sample. FAIMS complements MMS very well, but has the added benefit of operating at atmospheric pressure, unlike MS. This distinct advantage makes FAIMS uniquely suited for portability. Since its inception, FAIMS has been envisioned as a field-portable device, as it affords less expense and greater simplicity than many similar methods. Ideally, these are simple, robust devices that may be operated by non-professional personnel, yet still provide adequate data in the field. While reducing size and complexity tends to bring a loss of performance and accuracy, this is made up for by the incredibly high throughput and overall convenience of the instrument. Moreover, the FAIMS device used in the field can be brought back to the lab and coupled to a conventional mass spectrometer to provide any necessary method development and compound validation. This work discusses the various considerations, uses, and applications for portable FAIMS instrumentation, and how the future of each applicable field may benefit from the development and acceptance of such a device.

  17. Nondestructive Evaluation of Metal Fatigue Using Nonlinear Acoustics

    NASA Technical Reports Server (NTRS)

    Cantrell, John H., Jr.

    2008-01-01

    Safe-life and damage-tolerant design philosophies of high performance structures have driven the development of various methods to evaluate nondestructively the accumulation of damage in such structures resulting from cyclic loading. Although many techniques have proven useful, none has been able to provide an unambiguous, quantitative assessment of damage accumulation at each stage of fatigue from the virgin state to fracture. A method based on nonlinear acoustics is shown to provide such a means to assess the state of metal fatigue. The salient features are presented of an analytical model of the microelastic-plastic nonlinearities resulting from the interaction of an acoustic wave with fatigue-generated dislocation substructures and cracks that predictably evolve during the metal fatigue process. The interaction is quantified by the material (acoustic) nonlinearity parameter extracted from acoustic harmonic generation measurements. The parameter typically increases monotonically by several hundred percent over the fatigue life of the metal, thus providing a unique measure of the state of fatigue. Application of the model to aluminum alloy 2024-T4, 410Cb stainless steel, and IN100 nickel-base superalloy specimens fatigued using different loading conditions yields good agreement between theory and experiment. Application of the model and measurement technique to the on-site inspection of steam turbine blades is discussed.

  18. A Framework for the Comparative Assessment of Neuronal Spike Sorting Algorithms towards More Accurate Off-Line and On-Line Microelectrode Arrays Data Analysis.

    PubMed

    Regalia, Giulia; Coelli, Stefania; Biffi, Emilia; Ferrigno, Giancarlo; Pedrocchi, Alessandra

    2016-01-01

    Neuronal spike sorting algorithms are designed to retrieve neuronal network activity on a single-cell level from extracellular multiunit recordings with Microelectrode Arrays (MEAs). In typical analysis of MEA data, one spike sorting algorithm is applied indiscriminately to all electrode signals. However, this approach neglects the dependency of algorithms' performances on the neuronal signals properties at each channel, which require data-centric methods. Moreover, sorting is commonly performed off-line, which is time and memory consuming and prevents researchers from having an immediate glance at ongoing experiments. The aim of this work is to provide a versatile framework to support the evaluation and comparison of different spike classification algorithms suitable for both off-line and on-line analysis. We incorporated different spike sorting "building blocks" into a Matlab-based software, including 4 feature extraction methods, 3 feature clustering methods, and 1 template matching classifier. The framework was validated by applying different algorithms on simulated and real signals from neuronal cultures coupled to MEAs. Moreover, the system has been proven effective in running on-line analysis on a standard desktop computer, after the selection of the most suitable sorting methods. This work provides a useful and versatile instrument for a supported comparison of different options for spike sorting towards more accurate off-line and on-line MEA data analysis.
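
    The "building block" idea admits a compact sketch. Below, scikit-learn's PCA and k-means stand in for the paper's Matlab feature-extraction and clustering blocks, applied per channel to detected spike snippets; the synthetic waveforms are purely illustrative.

    ```python
    # Interchangeable spike-sorting stages: feature extraction, then clustering.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    def sort_spikes(waveforms, n_features=3, n_units=2):
        """waveforms: (n_spikes, n_samples) snippets from one electrode;
        returns a putative unit label for each spike."""
        features = PCA(n_components=n_features).fit_transform(waveforms)
        return KMeans(n_clusters=n_units, n_init=10).fit_predict(features)

    rng = np.random.default_rng(0)               # two synthetic spike shapes
    t = np.linspace(0.0, 1.0, 48)
    unit_a = np.sin(2.0 * np.pi * 3.0 * t)
    unit_b = -np.exp(-((t - 0.3) ** 2) / 0.01)
    waveforms = np.vstack([unit_a + 0.1 * rng.standard_normal(48) for _ in range(50)]
                          + [unit_b + 0.1 * rng.standard_normal(48) for _ in range(50)])
    print(sort_spikes(waveforms))
    ```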

  19. A Framework for the Comparative Assessment of Neuronal Spike Sorting Algorithms towards More Accurate Off-Line and On-Line Microelectrode Arrays Data Analysis

    PubMed Central

    Pedrocchi, Alessandra

    2016-01-01

    Neuronal spike sorting algorithms are designed to retrieve neuronal network activity on a single-cell level from extracellular multiunit recordings with Microelectrode Arrays (MEAs). In typical analysis of MEA data, one spike sorting algorithm is applied indiscriminately to all electrode signals. However, this approach neglects the dependency of algorithms' performances on the neuronal signals properties at each channel, which require data-centric methods. Moreover, sorting is commonly performed off-line, which is time and memory consuming and prevents researchers from having an immediate glance at ongoing experiments. The aim of this work is to provide a versatile framework to support the evaluation and comparison of different spike classification algorithms suitable for both off-line and on-line analysis. We incorporated different spike sorting “building blocks” into a Matlab-based software, including 4 feature extraction methods, 3 feature clustering methods, and 1 template matching classifier. The framework was validated by applying different algorithms on simulated and real signals from neuronal cultures coupled to MEAs. Moreover, the system has been proven effective in running on-line analysis on a standard desktop computer, after the selection of the most suitable sorting methods. This work provides a useful and versatile instrument for a supported comparison of different options for spike sorting towards more accurate off-line and on-line MEA data analysis. PMID:27239191

  20. Fiber-Optic Strain-Gage Tank Level Measurement System for Cryogenic Propellants

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando; Mitchell, Mark; Langford, Lester

    2004-01-01

    Measurement of tank level, particularly for cryogenic propellants, has proven to be a difficult problem. Current methods based on differential pressure, capacitance sensors, temperature sensors, etc., do not provide sufficiently accurate or robust measurements, especially at run time. These methods are designed to measure tank level, but when the fluids are in a supercritical state, the liquid-gas interface disappears. Furthermore, there is a need for a non-intrusive measurement system; that is, the sensors should not require tank modifications or disturb the fluids. This paper describes a simple but effective method to determine propellant mass by measuring very small deformations of the structure supporting the tank. Results of a laboratory study to validate the method, and experimental data from a deployed system, are presented. A comparison with an existing differential pressure sensor shows that the strain gage system provides a much better quality signal across all regimes during an engine test. Experimental results also show that the use of fiber optic strain gages (FOSG) over classic foil strain gages extends the operation time (before the system becomes uncalibrated) and increases accuracy. Finally, a procedure is defined whereby measurements from the FOSG mounted on the tank supporting structure are compensated using measurements from a FOSG mounted on a reference plate and temperature measurements of the structure. Results describing the performance of a deployed system that measures tank level during propulsion tests are included.

  1. Basis and methods of NASA airborne topographic mapper lidar surveys for coastal studies

    USGS Publications Warehouse

    Brock, John C.; Wright, C. Wayne; Sallenger, Asbury H.; Krabill, William B.; Swift, Robert N.

    2002-01-01

    This paper provides an overview of the basic principles of airborne laser altimetry for surveys of coastal topography, and describes the methods used in the acquisition and processing of NASA Airborne Topographic Mapper (ATM) surveys that cover much of the conterminous US coastline. This form of remote sensing, also known as "topographic lidar", has undergone extremely rapid development during the last two decades, and has the potential to contribute within a wide range of coastal scientific investigations. Various airborne laser surveying (ALS) applications that are relevant to coastal studies are being pursued by researchers in a range of Earth science disciplines. Examples include the mapping of "bald earth" land surfaces below even moderately dense vegetation in studies of geologic framework and hydrology, and determination of the vegetation canopy structure, a key variable in mapping wildlife habitats. ALS has also proven to be an excellent method for the regional mapping of geomorphic change along barrier island beaches and other sandy coasts due to storms or long-term sedimentary processes. Coastal scientists are adopting ALS as a basic method in the study of an array of additional coastal topics. ALS can provide useful information in the analysis of shoreline change, the prediction and assessment of landslides along seacliffs and headlands, examination of subsidence causing coastal land loss, and in predicting storm surge and tsunami inundation.

  2. Geographic Variation of Eastern White Pine in the Northeast

    Treesearch

    Peter W. Garrett; Ernst J. Schreiner; Harry Kettlewood

    1973-01-01

    Eastern white pine is the most valuable conifer in the Northeast, and its large botanical range has provided ample opportunity for the development of ecotypes. Provenance plantings in nine states provided information on variability within the species and recommendations for moving seed from one region to another. Good growth was obtained on southern Appalachian sources...

  3. Refutational Text and Multiple External Representations as a Method to Remediate the Misinterpretation of Box Plots

    ERIC Educational Resources Information Center

    Lem, Stephanie; Baert, Kathy; Ceulemans, Eva; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim

    2017-01-01

    The ability to interpret graphs is highly important in modern society, but has proven to be a challenge for many people. In this paper, two teaching methods were used to remediate one specific misinterpretation: the area misinterpretation of box plots. First, we used refutational text to explicitly state and invalidate the area misinterpretation…

  4. A tabular format of Pyle's ageing and sexing methods for landbirds

    Treesearch

    W.H. Sakai; C.J. Ralph

    2002-01-01

    We present a new method of summarizing Peter Pyle's (1997) ageing and sexing guide into a tabular format that has proven to be extremely useful for both novice and experienced banders. This format allows quick and accurate assessment by a bander to distinguish species, age, and sex criteria. Rapid and accurate processing are essential to the health of birds...

  5. Pop-Up Retailing: The Design, Implementation, and Five-Year Evolution of an Experiential Learning Project

    ERIC Educational Resources Information Center

    Burgess, Brigitte

    2012-01-01

    Educators continually seek innovative methods by which to engage students. Kolb's experiential learning theory was a catalyst for designing and incorporating a pop-up retail consignment store into a junior level retail promotion course. After five years of use and refinement, the project has proven to be a powerful method to engage students in the…

  6. An empirical evaluation of two-stage species tree inference strategies using a multilocus dataset from North American pines

    Treesearch

    Michael DeGiorgio; John Syring; Andrew J. Eckert; Aaron Liston; Richard Cronn; David B. Neale; Noah A. Rosenberg

    2014-01-01

    Background: As it becomes increasingly possible to obtain DNA sequences of orthologous genes from diverse sets of taxa, species trees are frequently being inferred from multilocus data. However, the behavior of many methods for performing this inference has remained largely unexplored. Some methods have been proven to be consistent given certain evolutionary models,...

  7. Securing Digital Audio using Complex Quadratic Map

    NASA Astrophysics Data System (ADS)

    Suryadi, MT; Satria Gunawan, Tjandra; Satria, Yudi

    2018-03-01

    In this digital era, exchanging data is common and easy, which leaves it vulnerable to attack and manipulation by unauthorized parties. One data type that is vulnerable to attack is digital audio. We therefore need a data-securing method that is both robust and fast. One method that matches all of these criteria is securing the data using a chaos function. The chaos function used in this research is the complex quadratic map (CQM). Certain parameter values cause the key stream generated by the CQM function to pass all 15 NIST tests, which means the key stream generated by this CQM is proven to be random. In addition, samples of encrypted digital sound are proven to be uniform when tested with a goodness-of-fit test, so securing digital audio using this method is not vulnerable to frequency-analysis attack. The key space is very large, about 8.1×10^31 possible keys, and the key sensitivity is very small, about 10^-10, so this method is also not vulnerable to brute-force attack. Finally, the processing speed for both encryption and decryption is on average about 450 times faster than real-time playback of the audio.
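
    A minimal sketch of the general scheme follows; it is not the paper's implementation, since the published parameter choices and bit-extraction rule are not reproduced here, and an illustrative keystream like this would still need to pass the statistical tests described above before any real use.

    ```python
    # Keystream from the complex quadratic map z -> z**2 + c, then XOR cipher.
    def cqm_keystream(z0, c, n_bytes):
        """c = -2 keeps real orbits bounded in [-2, 2] and chaotic; each
        state is quantized to one byte (illustrative extraction rule)."""
        z, out = z0, bytearray()
        for _ in range(n_bytes):
            z = z * z + c
            out.append(int((z.real + 2.0) * 1e6) % 256)
        return bytes(out)

    def xor_cipher(data, key):
        """Stream encryption and decryption are the same XOR operation."""
        return bytes(d ^ k for d, k in zip(data, key))

    audio = b"PCM sample bytes..."               # stands in for digital audio
    ks = cqm_keystream(z0=0.3 + 0.0j, c=-2.0 + 0.0j, n_bytes=len(audio))
    assert xor_cipher(xor_cipher(audio, ks), ks) == audio
    ```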

  8. Big Data Provenance: Challenges, State of the Art and Opportunities

    PubMed Central

    Wang, Jianwu; Crawl, Daniel; Purawat, Shweta; Nguyen, Mai; Altintas, Ilkay

    2017-01-01

    Ability to track provenance is a key feature of scientific workflows to support data lineage and reproducibility. The challenges that are introduced by the volume, variety and velocity of Big Data, also pose related challenges for provenance and quality of Big Data, defined as veracity. The increasing size and variety of distributed Big Data provenance information bring new technical challenges and opportunities throughout the provenance lifecycle including recording, querying, sharing and utilization. This paper discusses the challenges and opportunities of Big Data provenance related to the veracity of the datasets themselves and the provenance of the analytical processes that analyze these datasets. It also explains our current efforts towards tracking and utilizing Big Data provenance using workflows as a programming model to analyze Big Data. PMID:29399671

  9. Shape sensing methods: Review and experimental comparison on a wing-shaped plate

    NASA Astrophysics Data System (ADS)

    Gherlone, Marco; Cerracchio, Priscilla; Mattone, Massimiliano

    2018-05-01

    Shape sensing, i.e., the reconstruction of the displacement field of a structure from discrete surface strain measurements, is a fundamental capability for the structural health management of critical components. In this paper, a review of the shape sensing methodologies available in the open literature and of their different applications is provided. Then, for the first time, an experimental comparative study is presented among the main approaches in order to highlight their relative merits in the presence of the uncertainties affecting real applications. These approaches are the inverse Finite Element Method, the Modal Method and Ko's Displacement Theory. A brief description of these methods is followed by the presentation of the experimental test results. A cantilevered, wing-shaped aluminum plate is left to deform under its own weight, leading to bending and twisting. Using the experimental strain measurements as input data, the deflection field of the plate is reconstructed using the three aforementioned approaches and compared with the actual measured deflection. The inverse Finite Element Method proves slightly more accurate and particularly attractive because it is versatile with respect to boundary conditions and does not require any information about material properties or loading conditions.
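
    Ko's Displacement Theory admits a short one-dimensional sketch: for a cantilever idealised as a beam, the surface bending strain equals c times the curvature w''(x), where c is the distance from the neutral axis to the instrumented surface, so two numerical integrations of the measured strain recover the deflection. The strain field below is illustrative.

    ```python
    # Strain-to-deflection for a cantilever: integrate w'' = strain / c twice
    # with clamped-root conditions w(0) = w'(0) = 0 (trapezoidal rule).
    import numpy as np

    def deflection_from_strain(x, strain, c):
        curvature = strain / c
        slope = np.concatenate(([0.0],
            np.cumsum(0.5 * (curvature[1:] + curvature[:-1]) * np.diff(x))))
        return np.concatenate(([0.0],
            np.cumsum(0.5 * (slope[1:] + slope[:-1]) * np.diff(x))))

    x = np.linspace(0.0, 1.0, 21)          # strain stations along the span, m
    strain = 1e-3 * (1.0 - x)              # illustrative linear strain field
    print(deflection_from_strain(x, strain, c=0.005)[-1])  # tip deflection, m
    ```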

  10. Big data and ergonomics methods: A new paradigm for tackling strategic transport safety risks.

    PubMed

    Walker, Guy; Strathie, Ailsa

    2016-03-01

    Big data collected from On-Train Data Recorders (OTDR) has the potential to address the most important strategic risks currently faced by rail operators and authorities worldwide. These risk issues are increasingly orientated around human performance and have proven resistant to existing approaches. This paper presents a number of proof-of-concept demonstrations to show that long-standing ergonomics methods can be driven from big data and succeed in providing insight into human performance in a novel way. Over 300 ergonomics methods were reviewed and a smaller sub-set selected for proof-of-concept development using real on-train recorder data. From this are derived nine candidate Human Factors Leading Indicators, which map onto all of the psychological precursors of the identified risks. This approach has the potential to make use of a significantly underused source of data, and to enable rail industry stakeholders to intervene sooner to address human performance issues that, via the methods presented in this paper, are clearly manifest in on-train data recordings. The intersection of psychological knowledge, ergonomics methods and big data creates an important new framework for driving new insights. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  11. Localization of rainfall and determination its intensity in the lower layers of the troposphere from the measurements of local RF transmitter characteristics

    NASA Astrophysics Data System (ADS)

    Podhorský, Dušan; Fabo, Peter

    2016-12-01

    The article deals with a method of acquiring the temporal and spatial distribution of local precipitation from measurements of the performance characteristics of local sources of high-frequency electromagnetic radiation in the 1-3 GHz frequency range, in the lower layers of the troposphere up to 100 m. The method was experimentally proven by monitoring the GSM G2 base stations of cell phone providers in the frequency range of 920-960 MHz, using methods of frequency and spatial diversity reception. A modification of the SART method for localization of precipitation was also proposed. The achieved results allow us to obtain the time course of the intensity of local precipitation in the observed area with a temporal resolution of 10 s. A spatial accuracy of 100 m in the localization of precipitation is expected once a network of receivers is built. The acquired data can be used as one of the inputs for meteorological forecasting models; in agriculture; in hydrology, as a supplementary method to ombrograph stations and measurements from the weather radar network; in transportation, as part of a warning system; and in many other areas.

  12. High Sensitive Methods for Health Monitoring of Compressor Blades and Fatigue Detection

    PubMed Central

    Witoś, Mirosław

    2013-01-01

    The diagnostic and research aspects of compressor blade fatigue detection are elaborated in the paper. Real maintenance and overhaul problems and the characteristics of different modes of metal blade fatigue (LCF, HCF, and VHCF) are presented. The polycrystalline defects and impurities influencing fatigue, along with the related surface finish techniques, are taken into account. Three experimental methods of structural health assessment are considered. The metal magnetic memory (MMM), experimental modal analysis (EMA) and tip timing (TTM) methods provide information on the damage of diagnosed objects, for example compressor blades. Early damage symptoms, that is, the magnetic and modal properties of material strengthening and weakening phases (change of local dislocation density and grain diameter, increase of structural and magnetic anisotropy), are described. It is shown that the shape of the resonance characteristic makes it possible to determine whether fatigue or a blade crack is involved. The capabilities of the methods for steel and titanium alloy blades are illustrated with examples from active and passive experiments. In conclusion, the MMM, EMA, and TTM methods are verified, and their potential for reliable diagnosis of compressor blades is confirmed. PMID:24191135

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Millis, Andrew

    Understanding the behavior of interacting electrons in molecules and solids, so that one can predict new superconductors, catalysts, light harvesters, and energy and battery materials and optimize existing ones, is the "quantum many-body problem". This is one of the scientific grand challenges of the 21st century. A complete solution to the problem has been proven to be exponentially hard, meaning that straightforward numerical approaches fail. New insights and new methods are needed to provide accurate yet feasible approximate solutions. This CMSCN project brought together chemists and physicists to combine insights from the two disciplines to develop innovative new approaches. Outcomes included the Density Matrix Embedding method, a new, computationally inexpensive and extremely accurate approach that may enable first-principles treatment of superconducting and magnetic properties of strongly correlated materials; new techniques for existing methods, including an Adaptively Truncated Hilbert Space approach that will vastly expand the capabilities of the dynamical mean field method; a self-energy embedding theory; and a new memory-function-based approach to calculations of the behavior of driven systems. The methods developed under this project are now being applied to improve our understanding of superconductivity, to calculate novel topological properties of materials, and to characterize and improve the properties of nanoscale devices.

  14. Preservative-free triamcinolone acetonide suspension developed for intravitreal injection.

    PubMed

    Bitter, Christoph; Suter, Katja; Figueiredo, Verena; Pruente, Christian; Hatz, Katja; Surber, Christian

    2008-02-01

    All commercially available triamcinolone acetonide (TACA) suspensions used for intravitreal treatment contain vehicles that are toxic to the retina (e.g., benzyl alcohol, solubilizer). Our aim was to find a convenient and reproducible method to compound a completely preservative-free TACA suspension, adapted to the intraocular physiology, with consistent quality (i.e., proven sterility and stability, constant content and dose uniformity, defined particle size, and a 1-year shelf life). We evaluated two published methods (membrane filter, centrifugation) and a newly developed method (direct suspending) for compounding TACA suspensions for intravitreal injection. Parameters such as TACA content (HPLC), particle size (microscopy and laser spectrometry), sterility, and bacterial endotoxins were assessed. Stability testing (at room temperature and 40 degrees C) was performed: color and homogeneity (visually), particle size (microscopically), and TACA content and dose uniformity (HPLC) were analyzed according to International Conference on Harmonisation guidelines. In contrast to the known methods, the direct suspending method is convenient, provides a TACA suspension that fulfills all compendial requirements, and has a 2-year shelf life. We developed a simple, reproducible method to compound stable, completely preservative-free TACA suspensions with a reasonable shelf life, which makes it possible to study the effect of intravitreal TACA unbiased by varying doses and toxic compounds or their residues.

  15. The discrimination of geoforensic trace material from close proximity locations by organic profiling using HPLC and plant wax marker analysis by GC.

    PubMed

    McCulloch, G; Dawson, L A; Ross, J M; Morgan, R M

    2018-07-01

    There is a need to develop a wider empirical research base to expand the scope for utilising the organic fraction of soil in forensic geoscience, and to demonstrate the capability of the analytical techniques used in forensic geoscience to discriminate samples from close-proximity locations. The determination of wax markers from soil samples by GC analysis has been used extensively in court and is known to be effective in discriminating samples from different land use types. A new HPLC method for the analysis of the organic fraction of forensic sediment samples has also recently been shown to add value, in conjunction with existing inorganic techniques, for the discrimination of samples derived from close-proximity locations. This study compares the ability of these two organic techniques to discriminate samples derived from close-proximity locations and finds that the GC technique provides good discrimination at this scale, with quantification of known compounds, whilst the HPLC technique offers a shorter and simpler sample preparation method and provides very good discrimination between groups of samples of different provenance in most cases. The use of both data sets together gave further improved accuracy rates in some cases, suggesting that a combined organic approach can provide added benefits in certain case scenarios and crime reconstruction contexts. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  16. 2017 safety belt usage survey in Kentucky.

    DOT National Transportation Integrated Search

    2017-08-01

    The use of safety belts and child safety seats is a proven means of reducing injuries to motor vehicle occupants involved in traffic crashes. There have been various methods used in efforts to increase safety belt and safety seat usage. Past efforts ...

  17. 2016 safety belt usage survey in Kentucky.

    DOT National Transportation Integrated Search

    2016-08-01

    The use of safety belts and child safety seats is a proven means of reducing injuries to motor vehicle occupants involved in traffic crashes. There have been various methods used in efforts to increase safety belt and safety seat usage. Past efforts ...

  18. CHEMICAL SYNTHESES IN AQUEOUS MEDIA USING MICROWAVES

    EPA Science Inventory

    The development of efficient, selective and eco-friendly synthetic methods has remained a major focus of our research group. Microwave (MW) irradiation as alternative energy source in conjunction with water as reaction media has proven to be a successful 'greener' chemical appro...

  19. Provenance and recycling of Arabian desert sand

    NASA Astrophysics Data System (ADS)

    Garzanti, Eduardo; Vermeesch, Pieter; Andò, Sergio; Vezzoli, Giovanni; Valagussa, Manuel; Allen, Kate; Kadi, Khalid; Al-Juboury, Ali

    2013-04-01

    This study seeks to determine the ultimate origin of aeolian sand in Arabian deserts by high-resolution petrographic and heavy-mineral techniques combined with zircon U-Pb geochronology. Point-counting is used here as the sole method by which unbiased volume percentages of heavy minerals can be obtained. A comprehensive analysis of river and wadi sands from the Red Sea to the Bitlis-Zagros orogen allowed us to characterize all potential sediment sources, and thus to quantitatively constrain provenance of Arabian dune fields. Two main types of aeolian sand can be distinguished. Quartzose sands with very poor heavy-mineral suites including zircon occupy most of the region comprising the Great Nafud and Rub' al-Khali Sand Seas, and are largely recycled from thick Lower Palaeozoic quartzarenites with very minor first-cycle contributions from Precambrian basement, Mesozoic carbonate rocks, or Neogene basalts. Instead, carbonaticlastic sands with richer lithic and heavy-mineral populations characterize coastal dunes bordering the Arabian Gulf from the Jafurah Sand Sea of Saudi Arabia to the United Arab Emirates. The similarity with detritus carried by the axial Tigris-Euphrates system and by transverse rivers draining carbonate rocks of the Zagros indicates that Arabian coastal dunes largely consist of far-travelled sand, deposited on the exposed floor of the Gulf during Pleistocene lowstands and blown inland by dominant Shamal northerly winds. A dataset of detrital zircon U-Pb ages measured on twelve dune samples and two Lower Palaeozoic sandstones yielded fourteen identical age spectra. The age distributions all show a major Neoproterozoic peak corresponding to the Pan-African magmatic and tectonic events by which the Arabian Shield was assembled, with minor late Palaeoproterozoic and Neoarchean peaks. A similar U-Pb signature characterizes also Jafurah dune sands, suggesting that zircons are dominantly derived from interior Arabia, possibly deflated from the Wadi al-Batin fossil alluvial fan or even from Mesozoic sandstones of the Arabian margin accreted to the Cenozoic Zagros orogen. Due to extensive recycling and the fact that zircon is so resistant to weathering and erosion, the U-Pb age signatures are much less powerful a tracer of sedimentary provenance than framework petrography and heavy minerals. Actualistic provenance studies of dune fields at subcontinental scale shed light on the generation and homogenization of aeolian sand, and allow us to trace complex pathways of multistep sediment transport, thus providing crucial independent information for accurate palaeogeographic and palaeoclimatic reconstructions.

  20. Provenance and recycling of Arabian desert sand

    NASA Astrophysics Data System (ADS)

    Garzanti, Eduardo; Vermeesch, Pieter; Andò, Sergio; Vezzoli, Giovanni; Valagussa, Manuel; Allen, Kate; Kadi, Khalid A.; Al-Juboury, Ali I. A.

    2013-05-01

    This study seeks to determine the ultimate origin of aeolian sand in Arabian deserts by high-resolution petrographic and heavy-mineral techniques combined with zircon U-Pb geochronology. Point-counting is used here as the sole method by which unbiased volume percentages of heavy minerals can be obtained. A comprehensive analysis of river and wadi sands from the Red Sea to the Bitlis-Zagros orogen allowed us to characterize all potential sediment sources, and thus to quantitatively constrain provenance of Arabian dune fields. Two main types of aeolian sand can be distinguished. Quartzose sands with very poor heavy-mineral suites including zircon occupy most of the region comprising the Great Nafud and Rub' al-Khali Sand Seas, and are largely recycled from thick Lower Palaeozoic quartzarenites with very minor first-cycle contributions from Precambrian basement, Mesozoic carbonate rocks, or Neogene basalts. Instead, carbonaticlastic sands with richer lithic and heavy-mineral populations characterize coastal dunes bordering the Arabian Gulf from the Jafurah Sand Sea of Saudi Arabia to the United Arab Emirates. The similarity with detritus carried by the axial Tigris-Euphrates system and by transverse rivers draining carbonate rocks of the Zagros indicates that Arabian coastal dunes largely consist of far-travelled sand, deposited on the exposed floor of the Gulf during Pleistocene lowstands and blown inland by dominant Shamal northerly winds. A dataset of detrital zircon U-Pb ages measured on twelve dune samples and two Lower Palaeozoic sandstones yielded fourteen identical age spectra. The age distributions all show a major Neoproterozoic peak corresponding to the Pan-African magmatic and tectonic events by which the Arabian Shield was assembled, with minor late Palaeoproterozoic and Neoarchean peaks. A similar U-Pb signature characterizes also Jafurah dune sands, suggesting that zircons are dominantly derived from interior Arabia, possibly deflated from the Wadi al-Batin fossil alluvial fan or even from Mesozoic sandstones of the Arabian margin accreted to the Cenozoic Zagros orogen. Due to extensive recycling and the fact that zircon is so resistant to weathering and erosion, the U-Pb age signatures are much less powerful a tracer of sedimentary provenance than framework petrography and heavy minerals. Actualistic provenance studies of dune fields at subcontinental scale shed light on the generation and homogenization of aeolian sand, and allow us to trace complex pathways of multistep sediment transport, thus providing crucial independent information for accurate palaeogeographic and palaeoclimatic reconstructions.

  1. Uncertainty in determining extreme precipitation thresholds

    NASA Astrophysics Data System (ADS)

    Liu, Bingjun; Chen, Junfan; Chen, Xiaohong; Lian, Yanqing; Wu, Lili

    2013-10-01

    Extreme precipitation events are rare and occur mostly on a relatively small and local scale, which makes it difficult to set thresholds for extreme precipitation in a large basin. Based on long-term daily precipitation data from 62 observation stations in the Pearl River Basin, this study has assessed the applicability of non-parametric, parametric, and detrended fluctuation analysis (DFA) methods in determining extreme precipitation thresholds (EPTs) and the certainty of the EPTs from each method. Analyses from this study show that the non-parametric absolute critical value method is easy to use but unable to reflect differences in the spatial distribution of rainfall. The non-parametric percentile method can account for the spatial distribution of precipitation, but its threshold value is sensitive to the size of the rainfall data series and depends on the selection of a percentile, which makes it difficult to determine reasonable threshold values for a large basin. The parametric method can provide the most apt description of extreme precipitation by fitting extreme precipitation distributions with probability distribution functions; however, the selection of probability distribution functions, the goodness-of-fit tests, and the size of the rainfall data series can greatly affect the fitting accuracy. In contrast to the non-parametric and parametric methods, which are unable to provide EPTs with certainty, the DFA method, although computationally involved, has proven to be the most appropriate method, able to provide a unique set of EPTs for a large basin with uneven spatio-temporal precipitation distribution. The consistency of the spatial distribution of the DFA-based thresholds with the annual average precipitation, the coefficient of variation (CV), and the coefficient of skewness (CS) of daily precipitation further shows that EPTs determined by the DFA method are reasonable and applicable for the Pearl River Basin.
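
    A minimal sketch of first-order DFA on a daily precipitation series follows (synthetic data; the study's series lengths and window choices are not reproduced). The scaling of the RMS fluctuation with window size is the quantity that DFA-based threshold determination builds on.

    ```python
    # DFA-1: integrate the series, detrend linearly in windows of size n,
    # and measure how the RMS fluctuation F(n) scales with n.
    import numpy as np

    def dfa_fluctuations(series, window_sizes):
        profile = np.cumsum(series - series.mean())
        out = []
        for n in window_sizes:
            f2 = []
            for i in range(len(profile) // n):
                seg = profile[i * n:(i + 1) * n]
                t = np.arange(n)
                trend = np.polyval(np.polyfit(t, seg, 1), t)  # local trend
                f2.append(np.mean((seg - trend) ** 2))
            out.append(np.sqrt(np.mean(f2)))
        return np.array(out)

    rain = np.random.default_rng(1).gamma(0.3, 8.0, size=4000)  # synthetic
    ns = [8, 16, 32, 64, 128]
    F = dfa_fluctuations(rain, ns)
    alpha = np.polyfit(np.log(ns), np.log(F), 1)[0]  # scaling exponent
    print(alpha)   # ~0.5 for uncorrelated data
    ```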

  2. Moving Toward Sustainability: Sustainable and Effective Practices for Creating Your Own Water Utility Roadmap

    EPA Pesticide Factsheets

    The document builds on the Effective Utility Management framework. It provides utilities of various sizes with a series of proven and effective practices to help achieve the outcomes in Effective Utility Management.

  3. Freeway travel-time estimation and forecasting.

    DOT National Transportation Integrated Search

    2013-03-01

    Real-time traffic information provided by GDOT has proven invaluable for commuters in the Georgia freeway network. The increasing number of Variable Message Signs, addition of services such as My-NaviGAtor, NaviGAtor-to-go etc. and the advancemen...

  4. The Extravehicular Mobility Unit (EMU): Proven hardware for Satellite Servicing

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A general technical description of the extravehicular mobility unit (EMU) is given. The description provides a basis for understanding EMU mobility capabilities and the environments a payload is exposed to in the vicinity of an EMU.

  5. Freight optimization and development in Missouri : ports and waterways module

    DOT National Transportation Integrated Search

    2008-04-01

    Missouri's ports and waterways have proven to be important to the region's economic growth and significant to the state's role in transporting waterborne freight. The ultimate objectives of this analysis are to provide an inventory of Missouri...

  6. The AtChem On-line model and Electronic Laboratory Notebook (ELN): A free community modelling tool with provenance capture

    NASA Astrophysics Data System (ADS)

    Young, J. C.; Boronska, K.; Martin, C. J.; Rickard, A. R.; Vázquez Moreno, M.; Pilling, M. J.; Haji, M. H.; Dew, P. M.; Lau, L. M.; Jimack, P. K.

    2010-12-01

    AtChem On-line is a simple-to-use zero-dimensional box modelling toolkit, developed for use by laboratory, field and chamber scientists. Any set of chemical reactions can be simulated, in particular the whole Master Chemical Mechanism (MCM) or any subset of it. Parameters and initial data can be provided through a self-explanatory web form, and the resulting model is compiled and run on a dedicated server. The core part of the toolkit, providing a robust solver for thousands of chemical reactions, is written in Fortran and uses the SUNDIALS CVODE libraries. Chemical systems can be constrained at multiple, user-determined timescales; this has enabled studies of radical chemistry at one-minute timescales. AtChem On-line is free to use and requires no installation: a web browser, a text editor and any compression software are all the user needs. CPU and storage are provided by the server (input and output data are saved indefinitely). An off-line version is also being developed, which will provide batch processing, an advanced graphical user interface and post-processing tools, for example Rate of Production Analysis (ROPA) and chain-length analysis. The source code is freely available for advanced users wishing to adapt and run the program locally. Data management, dissemination and archiving are essential in all areas of science. In order to do this in an efficient and transparent way, there is a critical need to capture high-quality metadata/provenance for modelling activities. An Electronic Laboratory Notebook (ELN) has been developed in parallel with AtChem On-line as part of the EC EUROCHAMP2 project. In order to use controlled chamber experiments to evaluate the MCM, we need to be able to archive, track and search information on all associated chamber model runs, so that they can be used in subsequent mechanism development. It would therefore be extremely useful if experiment and model metadata/provenance could be easily and automatically stored electronically. Archiving metadata/provenance via an ELN makes it easier to write a paper or thesis, and for mechanism developers, evaluators and peer reviewers to search for appropriate experimental and modelling results and conclusions. The development of an ELN in the context of mechanism evaluation/development using large experimental chamber datasets is presented.
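
    A minimal zero-dimensional box-model sketch follows, with a toy two-reaction system rather than an MCM subset; AtChem itself integrates the kinetics with the Fortran SUNDIALS CVODE solver, for which scipy's stiff BDF method stands in here.

    ```python
    # Toy box model: A -> B (k1), B + B -> C (k2), integrated for one hour.
    import numpy as np
    from scipy.integrate import solve_ivp

    k1, k2 = 1.0e-2, 5.0e-4            # illustrative rate coefficients

    def rhs(t, y):
        a, b, c = y
        r1 = k1 * a                     # A -> B
        r2 = k2 * b * b                 # B + B -> C
        return [-r1, r1 - 2.0 * r2, r2]

    sol = solve_ivp(rhs, (0.0, 3600.0), y0=[10.0, 0.0, 0.0], method="BDF",
                    t_eval=np.linspace(0.0, 3600.0, 7))
    print(sol.y[:, -1])                 # concentrations after one hour
    ```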

  7. Provenance tracking for scientific software toolchains through on-demand release and archiving

    NASA Astrophysics Data System (ADS)

    Ham, David

    2017-04-01

    There is an emerging consensus that published computational science results must be backed by a provenance chain tying results to the exact versions of input data and the code which generated them. There is also now an impressive range of web services devoted to revision control of software, and the archiving in citeable form of both software and input data. However, much scientific software itself builds on libraries and toolkits, and these themselves have dependencies. Further, it is common for cutting-edge research to depend on the latest version of software in online repositories, rather than the official release version. This creates a situation in which an author who wishes to follow best practice in recording the provenance chain of their results must archive and cite unreleased versions of a series of dependencies. Here, we present an alternative which toolkit authors can easily implement to provide a semi-automatic mechanism for creating and archiving custom software releases of the precise version of a package used in a particular simulation. This approach leverages the excellent services provided by GitHub and Zenodo to generate a connected set of citeable DOIs for the archived software. We present the integration of this workflow into the Firedrake automated finite element framework as a practical example of this approach in use on a complex geoscientific tool chain.
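
    The GitHub-plus-Zenodo workflow can be scripted against Zenodo's public REST deposit API. The sketch below illustrates that general pattern only; it is not Firedrake's actual release tooling, and the access token, archive name, and metadata are placeholders (endpoints follow Zenodo's published API documentation):

    ```python
    # Archive a tagged source snapshot on Zenodo to mint a citeable DOI.
    # Endpoints follow Zenodo's public REST API docs; the token, file name,
    # and metadata below are placeholders.
    import requests

    API = "https://zenodo.org/api/deposit/depositions"
    params = {"access_token": "ZENODO_TOKEN"}                # placeholder

    dep = requests.post(API, params=params, json={}).json()  # new deposition

    with open("mytool-v1.2.3.tar.gz", "rb") as fp:           # snapshot to archive
        requests.put(dep["links"]["bucket"] + "/mytool-v1.2.3.tar.gz",
                     data=fp, params=params)

    meta = {"metadata": {"title": "mytool v1.2.3 (simulation snapshot)",
                         "upload_type": "software",
                         "description": "Exact version used for this run.",
                         "creators": [{"name": "Doe, Jane"}]}}
    requests.put(f"{API}/{dep['id']}", params=params, json=meta)

    r = requests.post(f"{API}/{dep['id']}/actions/publish", params=params)
    print(r.json()["doi"])                                   # DOI to cite
    ```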

  8. Current controlled vocabularies are insufficient to uniquely map molecular entities to mass spectrometry signal.

    PubMed

    Smith, Rob; Taylor, Ryan M; Prince, John T

    2015-01-01

    The comparison of analyte mass spectrometry precursor (MS1) signal is central to many proteomic (and other -omic) workflows. Standard vocabularies for mass spectrometry exist and provide good coverage for most experimental applications yet are insufficient for concise and unambiguous description of data concepts spanning the range of signal provenance from a molecular perspective (e.g. from charged peptides down to fine isotopes). Without a standard unambiguous nomenclature, literature searches, algorithm reproducibility and algorithm evaluation for MS-omics data processing are nearly impossible. We show how terms from current official ontologies are too vague or ambiguous to explicitly map molecular entities to MS signals and we illustrate the inconsistency and ambiguity of current colloquially used terms. We also propose a set of terms for MS1 signal that uniquely, succinctly and intuitively describe data concepts spanning the range of signal provenance from the full molecule down to fine isotopes. We propose a novel nomenclature that spans the range of the required granularity to describe MS data processing from the perspective of the molecular provenance of the MS signal. The proposed nomenclature provides a chain of succinct and unique terms spanning the signal created by a charged molecule down through each of its constituent subsignals. We suggest that additional community discussion of these terms should precede any further standardization efforts.

  9. Climatic control of bud burst in young seedlings of nine provenances of Norway spruce.

    PubMed

    Søgaard, Gunnhild; Johnsen, Oystein; Nilsen, Jarle; Junttila, Olavi

    2008-02-01

    Detailed knowledge of temperature effects on the timing of dormancy development and bud burst will help evaluate the impacts of climate change on forest trees. We tested the effects of temperature applied during short-day treatment, duration of short-day treatment, duration of chilling and light regime applied during forcing on the timing of bud burst in 1- and 2-year-old seedlings of nine provenances of Norway spruce (Picea abies (L.) Karst.). High temperature during dormancy induction, little or no chilling and low temperature during forcing all delayed dormancy release but did not prevent bud burst or growth onset provided the seedlings were forced under long-day conditions. Without chilling, bud burst occurred in about 20% of seedlings kept in short days at 12 degrees C, indicating that young Norway spruce seedlings do not exhibit true bud dormancy. Chilling hastened bud burst and removed the long photoperiod requirement, but the effect of high temperature applied during dormancy induction was observed even after prolonged chilling. Extension of the short-day treatment from 4 to 8 or 12 weeks hastened bud burst. The effect of treatments applied during dormancy development was larger than that of provenance; in some cases no provenance effect was detected, but in 1-year-old seedlings, time to bud burst decreased linearly with increasing latitude of origin. Differences among provenances were complicated by different responses of some origins to light conditions under long-day forcing. In conclusion, timing of bud burst in Norway spruce seedlings is significantly affected by temperature during bud set, and these effects are modified by chilling and environmental conditions during forcing.

  10. Methods for describing the electromagnetic properties of silver and gold nanoparticles.

    PubMed

    Zhao, Jing; Pinchuk, Anatoliy O; McMahon, Jeffrey M; Li, Shuzhou; Ausman, Logan K; Atkinson, Ariel L; Schatz, George C

    2008-12-01

    This Account provides an overview of the methods that are currently being used to study the electromagnetics of silver and gold nanoparticles, with an emphasis on the determination of extinction and surface-enhanced Raman scattering (SERS) spectra. These methods have proven to be immensely useful in recent years for interpreting a wide range of nanoscience experiments and providing the capability to describe optical properties of particles up to several hundred nanometers in dimension, including arbitrary particle structures and complex dielectric environments (adsorbed layers of molecules, nearby metal films, and other particles). While some of the methods date back to Mie's celebrated work a century ago, others are still at the forefront of algorithm development in computational electromagnetics. This Account gives a qualitative description of the physical and mathematical basis behind the most commonly used methods, including both analytical and numerical methods, as well as representative results of applications that are relevant to current experiments. The analytical methods that we discuss are either derived from Mie theory for spheres or from the quasistatic (Gans) model as applied to spheres and spheroids. In this discussion, we describe the use of Mie theory to determine electromagnetic contributions to SERS enhancements that account for retarded dipole emission effects, and the use of the quasistatic approximation for spheroidal particles interacting with dye adsorbate layers. The numerical methods include the discrete dipole approximation (DDA), the finite difference time domain (FDTD) method, and the finite element method (FEM) based on Whitney forms. We discuss applications such as using DDA to describe the interaction of two gold disks to define electromagnetic hot spots, FDTD for light interacting with metal wires that go from particle-like plasmonic response to film-like transmission as wire dimension is varied, and FEM studies of electromagnetic fields near cubic particles.
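
    For the Mie branch of these methods, the extinction efficiency follows from the standard series over the Mie coefficients. The sketch below computes Q_ext for a non-absorbing sphere with a real refractive index, following the Bohren & Huffman conventions; it is a minimal illustration only, since silver and gold require a complex refractive index (and hence a dedicated Mie code), which scipy's spherical Bessel routines do not accept:

    ```python
    # Minimal Mie extinction efficiency Q_ext for a non-absorbing sphere
    # (real refractive index only). Formulas follow Bohren & Huffman;
    # metals like Ag/Au need complex m and a dedicated Mie library.
    import numpy as np
    from scipy.special import spherical_jn, spherical_yn

    def mie_qext(m, x):
        """m: relative refractive index (real); x = 2*pi*a/lambda."""
        nmax = int(x + 4.0 * x ** (1.0 / 3.0) + 2)        # series cutoff
        n = np.arange(1, nmax + 1)
        jx, jxp = spherical_jn(n, x), spherical_jn(n, x, derivative=True)
        yx, yxp = spherical_yn(n, x), spherical_yn(n, x, derivative=True)
        jm, jmp = spherical_jn(n, m * x), spherical_jn(n, m * x, derivative=True)
        psi, psip = x * jx, jx + x * jxp                  # Riccati-Bessel psi
        xi = x * (jx + 1j * yx)                           # Riccati-Bessel xi
        xip = (jx + 1j * yx) + x * (jxp + 1j * yxp)
        psim, psimp = m * x * jm, jm + m * x * jmp        # psi at argument m*x
        a = (m * psim * psip - psi * psimp) / (m * psim * xip - xi * psimp)
        b = (psim * psip - m * psi * psimp) / (psim * xip - m * xi * psimp)
        return (2.0 / x ** 2) * np.sum((2 * n + 1) * (a + b).real)

    print(mie_qext(1.5, 3.0))    # dielectric sphere, illustrative parameters
    ```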

  11. Spatially Resolved Chemical Imaging for Biosignature Analysis: Terrestrial and Extraterrestrial Examples

    NASA Astrophysics Data System (ADS)

    Bhartia, R.; Wanger, G.; Orphan, V. J.; Fries, M.; Rowe, A. R.; Nealson, K. H.; Abbey, W. J.; DeFlores, L. P.; Beegle, L. W.

    2014-12-01

    Detection of in situ biosignatures on terrestrial and planetary missions is becoming increasingly important. Missions that target the Earth's deep biosphere, Mars, moons of Jupiter (including Europa), moons of Saturn (Titan and Enceladus), and small bodies such as asteroids or comets require methods that enable detection of materials both for in-situ analysis that preserves context and as a means to select high-priority samples for return to Earth. In situ instrumentation for biosignature detection spans a wide range of analytical and spectroscopic methods that capitalize on amino acid distribution, chirality, lipid composition, isotopic fractionation, or textures that persist in the environment. Many of the existing analytical instruments are bulk analysis methods and, while highly sensitive, these require sample acquisition and sample processing. However, by combining them with triaging spectroscopic methods, biosignatures can be targeted on a surface while preserving spatial context (including mineralogy, textures, and organic distribution). To provide spatially correlated chemical analysis at multiple spatial scales (meters to microns) we have employed a dual spectroscopic approach that capitalizes on high-sensitivity deep UV native fluorescence detection and high-specificity deep UV Raman analysis. Recently selected as a payload on the Mars 2020 mission, SHERLOC incorporates these optical methods for potential biosignature detection on Mars. We present data both from Earth analogs, which serve as our only known examples of biosignatures, and from meteorite samples, which provide an example of abiotic organic formation, and demonstrate how provenance affects the spatial distribution and composition of organics.

  12. Pattern of biopsy proven renal diseases at PNS SHIFA, Karachi: A cross-sectional survey

    PubMed Central

    Sabir, Sohail; Mubarak, Muhammed; Ul-Haq, Irfan; Bibi, Aisha

    2013-01-01

    Introduction: Percutaneous renal biopsy (RB) is an invaluable diagnostic procedure in patients with medical renal diseases. Objectives: To determine the pattern of biopsy proven renal disease (BPRD) from a tertiary care naval hospital in Karachi, Pakistan. Methods and Materials: All the renal biopsies in adult patients (≥18 years) performed at our hospital from 2008 to 2012 were retrospectively reviewed. The biopsies were evaluated by light microscopy and immunofluorescence. Results: A total of 60 cases were analyzed. The mean age was 33.3±12.9 years (range: 18 to 72 years). The male to female ratio was 3:1. The most common indication for renal biopsy was nephrotic syndrome (43.3%), followed by renal failure (26.6%) and non-nephrotic proteinuria (23.3%). Primary glomerulonephritides (PGN) were the predominant lesions overall, found in 46 (76.6%) of the total biopsies. Among PGN, the most common lesion was focal segmental glomerulosclerosis (FSGS), followed by membranous glomerulonephritis (MGN), IgA nephropathy (IgAN), chronic sclerosing glomerulonephritis (CSGN) and a variety of rare lesions. Secondary glomerulonephritides (SGN) were found in only three (5%) cases. There were two cases of amyloidosis and one of lupus nephritis (LN). Tubulointerstitial disease (TID) and vascular disease were rare. Conclusion: This study provides information about the epidemiology of BPRD in a large tertiary care naval center in Southern Pakistan. PMID:25340152

  13. Development and evaluation of an ultrasonic personal aerosol sampler.

    PubMed

    Volckens, J; Quinn, C; Leith, D; Mehaffy, J; Henry, C S; Miller-Lionberg, D

    2017-03-01

    Assessing personal exposure to air pollution has long proven challenging due to technological limitations posed by the samplers themselves. Historically, wearable aerosol monitors have proven to be expensive, noisy, and burdensome. The objective of this work was to develop a new type of wearable monitor, an ultrasonic personal aerosol sampler (UPAS), to overcome many of the technological limitations in personal exposure assessment. The UPAS is a time-integrated monitor that features a novel micropump that is virtually silent during operation. A suite of onboard environmental sensors integrated with this pump measure and record mass airflow (0.5-3.0 L/min, accurate within 5%), temperature, pressure, relative humidity, light intensity, and acceleration. Rapid development of the UPAS was made possible through recent advances in low-cost electronics, open-source programming platforms, and additive manufacturing for rapid prototyping. Interchangeable cyclone inlets provided a close match to the EPA PM 2.5 mass criterion (within 5%) for device flows at either 1.0 or 2.0 L/min. Battery life varied from 23 to 45 hours depending on sample flow rate and selected filter media. Laboratory tests of the UPAS prototype demonstrate excellent agreement with equivalent federal reference method samplers for gravimetric analysis of PM 2.5 across a broad range of concentrations. © 2016 The Authors. Indoor Air published by John Wiley & Sons Ltd.

  14. Models for randomly distributed nanoscopic domains on spherical vesicles

    NASA Astrophysics Data System (ADS)

    Anghel, Vinicius N. P.; Bolmatov, Dima; Katsaras, John

    2018-06-01

    The existence of lipid domains in the plasma membrane of biological systems has proven controversial, primarily due to their nanoscopic size—a length scale difficult to interrogate with most commonly used experimental techniques. Scattering techniques have recently proven capable of studying nanoscopic lipid domains populating spherical vesicles. However, the development of analytical methods capable of predicting and analyzing domain pair correlations from such experiments has not kept pace. Here, we developed models for the random distribution of monodisperse, circular nanoscopic domains averaged on the surface of a spherical vesicle. Specifically, the models take into account (i) intradomain correlations corresponding to form factors and interdomain correlations corresponding to pair distribution functions, and (ii) the analytical computation of interdomain correlations for cases of two and three domains on a spherical vesicle. In the case of more than three domains, these correlations are treated either by Monte Carlo simulations or by spherical analogs of the Ornstein-Zernike and Percus-Yevick (PY) equations. Importantly, the spherical analog of the PY equation works best in the case of nanoscopic size domains, a length scale that is mostly inaccessible by experimental approaches such as, for example, fluorescent techniques and optical microscopies. The analytical form factors and structure factors of nanoscopic domains populating a spherical vesicle provide a new and important framework for the quantitative analysis of experimental data from commonly studied phase-separated vesicles used in a wide range of biophysical studies.
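
    The Monte Carlo route mentioned for more than three domains can be sketched directly: place monodisperse circular domains (spherical caps) at random on a unit sphere, reject overlapping configurations, and accumulate pairwise angular separations as a proxy for the interdomain correlations. A minimal sketch with illustrative parameters:

    ```python
    # Monte Carlo sketch: random non-overlapping circular domains (caps) on
    # a unit sphere; the histogram of pairwise angular separations stands in
    # for the interdomain pair correlations. Parameters are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    n_domains, cap_radius = 10, 0.2        # count and angular radius (rad)

    def random_unit_vector():
        v = rng.normal(size=3)
        return v / np.linalg.norm(v)

    def place_domains():
        centers = []
        while len(centers) < n_domains:
            c = random_unit_vector()
            # reject if the new cap overlaps an existing one (separation
            # smaller than two cap radii)
            if all(np.arccos(np.clip(c @ d, -1, 1)) >= 2 * cap_radius
                   for d in centers):
                centers.append(c)
        return np.array(centers)

    seps = []
    for _ in range(500):                   # average over configurations
        C = place_domains()
        dots = np.clip(C @ C.T, -1, 1)
        iu = np.triu_indices(n_domains, k=1)
        seps.extend(np.arccos(dots[iu]))

    hist, edges = np.histogram(seps, bins=60, range=(0, np.pi), density=True)
    print(edges[np.argmax(hist)])          # modal interdomain separation (rad)
    ```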

  15. A Novel Multi-Receiver Signcryption Scheme with Complete Anonymity.

    PubMed

    Pang, Liaojun; Yan, Xuxia; Zhao, Huiyang; Hu, Yufei; Li, Huixian

    2016-01-01

    Anonymity, which is more and more important to multi-receiver schemes, has been taken into consideration by many researchers recently. To protect receiver anonymity, the first multi-receiver scheme based on the Lagrange interpolating polynomial was proposed in 2010. To ensure sender anonymity, the concept of the ring signature was proposed in 2005, but this scheme was later proven to have weaknesses, and a completely anonymous multi-receiver signcryption scheme was subsequently proposed. In that completely anonymous scheme, sender anonymity is achieved by improving the ring signature, and receiver anonymity is achieved by also using the Lagrange interpolating polynomial. Unfortunately, the Lagrange interpolation method was shown to fail to protect the anonymity of receivers, because each authorized receiver can judge whether anyone else is authorized or not. Therefore, the completely anonymous multi-receiver signcryption scheme mentioned above protects only sender anonymity. In this paper, we propose a new completely anonymous multi-receiver signcryption scheme with a new polynomial technique used to replace the Lagrange interpolating polynomial, which can mix the identity information of receivers into a ciphertext element and prevent authorized receivers from verifying others. Along with receiver anonymity, the proposed scheme also provides sender anonymity. Meanwhile, decryption fairness and public verification are also provided.
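
    Since the anonymity argument turns on the Lagrange interpolating polynomial, it helps to see the primitive itself. The sketch below reconstructs a polynomial's value at zero from point samples over a prime field, the kind of binding such schemes apply to receiver identities; the parameters are toy values, not those of any scheme discussed:

    ```python
    # Lagrange interpolation over a prime field: reconstruct f(0) from
    # samples (x_i, y_i) of a polynomial mod P. Toy parameters; a real
    # scheme works over a large cryptographic field. Requires Python 3.8+
    # for the three-argument pow() modular inverse.
    P = 2**31 - 1                          # a Mersenne prime (illustrative)

    def lagrange_at_zero(points):
        total = 0
        for i, (xi, yi) in enumerate(points):
            num, den = 1, 1
            for j, (xj, _) in enumerate(points):
                if i != j:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            total = (total + yi * num * pow(den, -1, P)) % P
        return total

    # f(x) = 7 + 3x + 5x^2 (mod P); any three samples recover f(0) = 7
    f = lambda x: (7 + 3 * x + 5 * x * x) % P
    print(lagrange_at_zero([(1, f(1)), (2, f(2)), (5, f(5))]))   # -> 7
    ```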

  16. Key Provenance of Earth Science Observational Data Products

    NASA Astrophysics Data System (ADS)

    Conover, H.; Plale, B.; Aktas, M.; Ramachandran, R.; Purohit, P.; Jensen, S.; Graves, S. J.

    2011-12-01

    As the sheer volume of data increases, particularly evidenced in the earth and environmental sciences, local arrangements for sharing data need to be replaced with reliable records about the what, who, how, and where of a data set or collection. This is frequently called the provenance of a data set. While observational data processing systems in the earth sciences have a long history of capturing metadata about the processing pipeline, current processes are limited in both what is captured and how it is disseminated to the science community. Provenance capture plays a role in scientific data preservation and stewardship precisely because it can automatically capture and represent a coherent picture of the what, how and who of a particular scientific collection. It reflects the transformations that a data collection underwent prior to its current form and the sequence of tasks that were executed and data products applied to generate a new product. In the NASA-funded Instant Karma project, we examine provenance capture in earth science applications, specifically the Advanced Microwave Scanning Radiometer - Earth Observing System (AMSR-E) Science Investigator-led Processing System (SIPS). The project is integrating the Karma provenance collection and representation tool into the AMSR-E SIPS production environment, with an initial focus on sea ice. This presentation will describe capture and representation of provenance that is guided by the Open Provenance Model (OPM). Several things have become clear during the course of the project to date. One is that core OPM entities and relationships are not adequate for expressing the kinds of provenance that are of interest in the science domain. OPM supports name-value pair annotations that can be used to augment what is known about the provenance entities and relationships, but in Karma, annotations cannot be added during capture, only after the fact. This limits the capture system's ability to record something it learned about an entity after the event of its creation in the provenance record. We will discuss extensions to OPM and modifications to the Karma tool suite to address this issue, more efficient representations of the kinds of provenance found in earth science, and the definition of metadata structures for capturing related knowledge about the data products and the science algorithms used to generate them. Use scenarios for provenance information are an active topic of investigation. It has additionally become clear through the project that not all provenance is created equal. In processing pipelines, some provenance is repetitive and uninteresting; because of the sheer volume of provenance, this obscures the interesting pieces. Methodologies to reveal science-relevant provenance will be presented, along with a preview of the AMSR-E Provenance Browser.

  17. Provenance based data integrity checking and verification in cloud environments.

    PubMed

    Imran, Muhammad; Hlavacs, Helmut; Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais

    2017-01-01

    Cloud computing is a recent tendency in IT that moves computing and data away from desktop and hand-held devices into large-scale processing hubs and data centers. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms users' data is moved into remotely located storage, such that users lose control over their data. This unique feature of the Cloud raises many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns to be addressed is providing proof of data integrity, i.e., the correctness of the user's data stored in Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required by which users can check whether the integrity of their valuable data is maintained or compromised. For this purpose some methods have been proposed, such as mirroring, checksumming and the use of third-party auditors, among others. However, these methods either use extra storage space by maintaining multiple copies of data or require the presence of a third-party verifier. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds, and can track any violation of data integrity that has occurred. For this purpose, we utilize a relatively new concept in Cloud computing called "Data Provenance". Our scheme reduces the need for third-party services, additional hardware support and the replication of data items on the client side for integrity checking.
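
    The general idea, stripped of the paper's specific protocol, is to record a cryptographic digest of each object alongside a provenance entry at write time, then verify later by recomputing. A minimal sketch of that principle only (the log, names, and actors are illustrative):

    ```python
    # Sketch of provenance-backed integrity checking: log a digest with a
    # provenance record at write time, verify by recomputing the digest.
    # Illustrates the principle only, not the paper's actual scheme.
    import hashlib, time

    provenance_log = []                    # in practice: tamper-evident store

    def record(obj_id, data: bytes, actor: str):
        provenance_log.append({
            "object": obj_id,
            "actor": actor,
            "time": time.time(),
            "sha256": hashlib.sha256(data).hexdigest(),
        })

    def verify(obj_id, data: bytes) -> bool:
        last = next(e for e in reversed(provenance_log)
                    if e["object"] == obj_id)
        return hashlib.sha256(data).hexdigest() == last["sha256"]

    record("report.csv", b"a,b\n1,2\n", actor="alice")
    print(verify("report.csv", b"a,b\n1,2\n"))   # True
    print(verify("report.csv", b"a,b\n1,3\n"))   # False -> integrity violated
    ```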

  18. Analysis of S-box in Image Encryption Using Root Mean Square Error Method

    NASA Astrophysics Data System (ADS)

    Hussain, Iqtadar; Shah, Tariq; Gondal, Muhammad Asif; Mahmood, Hasan

    2012-07-01

    The use of substitution boxes (S-boxes) in encryption applications has proven to be an effective nonlinear component in creating confusion and randomness. The S-box is evolving and many variants appear in the literature, including the advanced encryption standard (AES) S-box, affine power affine (APA) S-box, Skipjack S-box, Gray S-box, Lui J S-box, residue prime number S-box, Xyi S-box, and S8 S-box. These S-boxes have algebraic and statistical properties which distinguish them from each other in terms of encryption strength. In some circumstances, the parameters from algebraic and statistical analysis yield results which do not provide clear evidence for distinguishing an S-box for an application to a particular set of data. In image encryption applications, the use of S-boxes needs special care because the visual analysis and perception of a viewer can sometimes identify artifacts embedded in the image. In addition to the existing algebraic and statistical analyses already used for image encryption applications, we propose an application of the root mean square error technique, which further elaborates the results and enables the analyst to vividly distinguish between the performances of various S-boxes. While the use of root mean square error analysis in statistics has proven effective in determining the difference between original and processed data, its use in image encryption has shown promising results in estimating the strength of the encryption method. In this paper, we show the application of root mean square error analysis to S-box image encryption. The parameters from this analysis are used in determining the strength of S-boxes.
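
    The root mean square error measure itself is elementary to compute: the larger the error between plaintext and ciphertext images, the more thoroughly the plaintext structure has been destroyed. A minimal sketch with stand-in arrays in place of real images:

    ```python
    # RMSE between a plaintext image and its encrypted version, the metric
    # used above to compare S-box strength. Random arrays stand in for a
    # real image and its cipher output.
    import numpy as np

    def rmse(original, encrypted):
        o = original.astype(np.float64)
        e = encrypted.astype(np.float64)
        return np.sqrt(np.mean((o - e) ** 2))

    rng = np.random.default_rng(1)
    img = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # stand-in
    enc = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # stand-in
    print(rmse(img, enc))   # compare this value across candidate S-boxes
    ```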

  19. Validation of a Projection-domain Insertion of Liver Lesions into CT Images

    PubMed Central

    Chen, Baiyu; Ma, Chi; Leng, Shuai; Fidler, Jeff L.; Sheedy, Shannon P.; McCollough, Cynthia H.; Fletcher, Joel G.; Yu, Lifeng

    2016-01-01

    Rationale and Objectives: The aim of this study was to validate a projection-domain lesion-insertion method with observer studies. Materials and Methods: A total of 51 proven liver lesions were segmented from computed tomography images, forward projected, and inserted into patient projection data. The images containing inserted and real lesions were then reconstructed and examined in consensus by two radiologists. First, 102 lesions (51 original, 51 inserted) were viewed in a randomized, blinded fashion and scored from 1 (absolutely inserted) to 10 (absolutely real). Statistical tests were performed to compare the scores for inserted and real lesions. Subsequently, a two-alternative-forced-choice test was conducted, with lesions viewed in pairs (real vs. inserted) in a blinded fashion. The radiologists selected the inserted lesion and provided a confidence level of 1 (no confidence) to 5 (completely certain). The number of lesion pairs that were incorrectly classified was calculated. Results: The scores for inserted and proven lesions had the same median (8) and similar interquartile ranges (inserted, 5.5–8; real, 6.5–8). The mean scores were not significantly different between real and inserted lesions (P value = 0.17). The receiver operating characteristic curve was nearly diagonal, with an area under the curve of 0.58 ± 0.06. For the two-alternative-forced-choice study, the inserted lesions were incorrectly identified in 49% (25 out of 51) of pairs; radiologists were incorrect in 38% (3 out of 8) of pairs even when they felt very confident in identifying the inserted lesion (confidence level ≥4). Conclusions: Radiologists could not distinguish between inserted and real lesions, thereby validating the lesion-insertion technique, which may be useful for conducting virtual clinical trials to optimize image quality and radiation dose. PMID:27432267

  20. ENHANCED SOURCE REMOVAL USING IN-SITU CHEMICAL FLUSHING

    EPA Science Inventory

    Dense non-aqueous phase liquids (DNAPL) have been identified as a major impediment to the cleanup of many contaminated sites. Conventional ground water remediation methods such as pump-and-treat have proven ineffective at these sites. As a result, alternative remediation approach...

  1. A SIMPLE MULTIPLEX POLYMERASE CHAIN REACTION ASSAY FOR THE IDENTIFICATION OF FOUR ENVIRONMENTALLY RELEVANT FUNGAL CONTAMINANTS

    EPA Science Inventory

    Historically, identification of filamentous fungal (mold) species has been based on morphological characteristics, both macroscopic and microscopic. These methods have proven to be time consuming and inaccurate, necessitating the development of identification protocols that are ...

  2. Abandoned rail corridors in Texas : a policy and infrastructure evaluation.

    DOT National Transportation Integrated Search

    2011-03-01

    The use of existing and abandoned railroad rights-of-way has been a proven method of acquiring linear corridors for the construction of roadways since the formation of the Texas Highway Department. Either paralleling existing rail lines or re-usi...

  3. It's Deja Vu All Over Again.

    ERIC Educational Resources Information Center

    Starnes, Bobby Ann

    1998-01-01

    Describes nontraditional, creative teaching methods that celebrate differences among students. Contrasts these classrooms with programs that force teachers and learners into mediocre sameness using teacher-proof approaches that have been proven wrong for both teachers and learners. Uses personal examples from kindergarten and first-grade…

  4. Marketing Your College Music Program to Students.

    ERIC Educational Resources Information Center

    Kelly, Steven N.

    1988-01-01

    Suggests the use of time-proven marketing methods to attract high school students to college music programs and keep them interested in the music program. Explores facets of the college and the program that draw students, including reputation, location, costs, and program content. (LS)

  5. A Novel Sensor System for Measuring Wheel Loads of Vehicles on Highways

    PubMed Central

    Zhang, Wenbin; Suo, Chunguang; Wang, Qi

    2008-01-01

    With the development of highway transportation and business trade, vehicle Weigh-In-Motion (WIM) technology has become a key technology for measuring traffic loads. In this paper a novel WIM system based on monitoring of pavement strain responses in rigid pavement was investigated. In this WIM system, multiple low-cost, lightweight, small-volume and high-accuracy embedded concrete strain sensors were used as WIM sensors to measure rigid pavement strain responses. In order to verify the feasibility of the method, a system prototype based on multiple sensors was designed and deployed on a relatively busy freeway. Field calibration and tests were performed with known two-axle truck wheel loads, and the measurement errors were calculated based on the static weights measured with a static weighbridge. This enables the weights of other vehicles to be calculated from the calibration constant. Calibration and test results for individual sensors and three-sensor fusions are both provided. Repeatability, sources of error, and weight accuracy are discussed. The results showed that the proposed method was feasible and achieved high accuracy. Furthermore, a sample-mean approach fusing multiple individual sensors could provide better performance than individual sensors. PMID:27873952
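
    The calibration step described above, where known static weights yield a calibration constant, reduces to a one-parameter least-squares fit of sensor response against weighbridge loads. A sketch with illustrative numbers, not the paper's field data:

    ```python
    # One-parameter least-squares calibration: fit a constant k relating
    # integrated strain response to known static axle loads, then use k to
    # estimate unknown vehicle weights. Numbers are illustrative.
    import numpy as np

    strain_peaks = np.array([0.82, 1.61, 2.45, 3.20])   # sensor response (a.u.)
    static_kN    = np.array([20.0, 40.0, 60.0, 80.0])   # weighbridge loads

    # minimize sum (static - k*response)^2  ->  closed-form k
    k = (strain_peaks @ static_kN) / (strain_peaks @ strain_peaks)

    estimated = k * strain_peaks
    errors = 100.0 * (estimated - static_kN) / static_kN
    print(k, errors)          # calibration constant and per-run % error
    ```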

  6. Submicrometer Particle Sizing by Multiangle Light Scattering following Fractionation

    PubMed

    Wyatt

    1998-01-01

    The acid test for any particle sizing technique is its ability to determine the differential number fraction size distribution of a simple, well-defined sample. The very best characterized polystyrene latex sphere standards have been measured extensively using transmission electron microscope (TEM) images of a large subpopulation of such samples or by means of the electrostatic classification method as refined at the National Institute of Standards and Technology. The great success, in the past decade, of on-line multiangle light scattering (MALS) detection combined with size exclusion chromatography for the measurement of polymer mass and size distributions suggested, in the early 1990s, that a similar attack on particle characterization might prove useful as well. At that time, fractionation of particles was achievable by capillary hydrodynamic chromatography (CHDF) and field flow fractionation (FFF) methods. The latter has proven most useful when combined with MALS to provide accurate differential number fraction size distributions for a broad range of particle classes. The MALS/FFF combination provides unique advantages and precision relative to FFF, photon correlation spectroscopy, and CHDF techniques used alone. For many classes of particles, resolution of the MALS/FFF combination far exceeds that of TEM measurements. Copyright 1998 Academic Press.

  7. Trace element chemistry of zircons from oceanic crust: A method for distinguishing detrital zircon provenance

    USGS Publications Warehouse

    Grimes, Craig B.; John, Barbara E.; Kelemen, P.B.; Mazdab, F.K.; Wooden, J.L.; Cheadle, Michael J.; Hanghoj, K.; Schwartz, J.J.

    2007-01-01

    We present newly acquired trace element compositions for more than 300 zircon grains in 36 gabbros formed at the slow-spreading Mid-Atlantic and Southwest Indian Ridges. Rare earth element patterns for zircon from modern oceanic crust completely overlap with those for zircon crystallized in continental granitoids. However, plots of U versus Yb and U/Yb versus Hf or Y discriminate zircons crystallized in oceanic crust from continental zircon, and provide a relatively robust method for distinguishing zircons from these environments. Approximately 80% of the modern ocean crust zircons are distinct from the field defined by more than 1700 continental zircons from Archean and Phanerozoic samples. These discrimination diagrams provide a new tool for fingerprinting ocean crust zircons derived from reservoirs like that of modern mid-ocean ridge basalt (MORB) in both modern and ancient detrital zircon populations. Hadean detrital zircons previously reported from the Acasta Gneiss, Canada, and the Narryer Gneiss terrane, Western Australia, plot in the continental granitoid field, supporting hypotheses that at least some Hadean detrital zircons crystallized in continental crust-forming magmas and not from a reservoir like modern MORB. © 2007 The Geological Society of America.

  8. Through Life Costing

    NASA Astrophysics Data System (ADS)

    Newnes, Linda; Mileham, A. R.; Cheung, W. M.; Goh, Y. M.

    When an innovation is launched in such a market, reliable information about the life cost of the novel product is naturally lacking. This has proven to be a key obstacle to venture capital funded cleantech companies with innovations that are conceptually proven and that deliver significant improvements to conventional alternatives, but that lack enough reference installations to provide reliable data on life costs. One way out of this dilemma that is increasingly discussed among practitioners is servitization, i.e., the notion that the owner of the innovation should be an agency that is specialised in using and maintaining the product, letting the end customer become a buyer of the product's service (such as heat) rather than the product itself.

  9. Graphical representations of the chemistry of garnets in a three-dimensional MATLAB based provenance plot

    NASA Astrophysics Data System (ADS)

    Knierzinger, Wolfgang; Palzer, Markus; Wagreich, Michael; Meszar, Maria; Gier, Susanne

    2016-04-01

    A newly developed, MATLAB-based garnet provenance plot allows a three-dimensional tetrahedral representation of the chemistry of garnets for the endmembers almandine, pyrope, spessartine and grossular. Based on a freely accessible database of Suggate & Hall (2013) and additional EPMA data from the internet, the chemistry of more than 2500 garnets was evaluated and used to create various subfields that correspond to different facies conditions of metapelitic, metasomatic and metaigneous rocks as well as granitic rocks. These triangulated subfields act as reference structures within the tetrahedron, facilitating assignments of garnet chemistries to different lithologies. In comparison with conventional ternary garnet discrimination diagrams by Mange & Morton (2007), Wright/Preston et al. (1938/2002) and Aubrecht et al. (2009), this tetrahedral provenance plot enables a better assessment of the conditions of formation of garnets by reducing the overlap of certain subfields. In particular, a clearer distinction between greenschist facies rocks, amphibolite facies rocks and granitic rocks can be achieved. First applications of the tetrahedral garnet plot provided new insights into sedimentary processes during the Lower Miocene in the pre-Alpine Molasse basin. Bibliography: Aubrecht, R., Meres, S., Sykora, M., Mikus, T. (2009). Provenance of the detrital garnets and spinels from the Albian sediments of the Czorsztyn Unit (Pieniny Klippen Belt, Western Carpathians, Slovakia). Geologica Carpathica, 60, 6, pp. 463-483. Mange, M.A., Morton, A.C. (2007). Geochemistry of Heavy Minerals. In: Mange, M.A. & Wright, D.T. (2007). Heavy Minerals in Use, Amsterdam, pp. 345-391. Preston, J., Hartley, A., Mange-Rajetzky, M., Hole, M., May, G., Buck, S., Vaughan, L. (2002). The provenance of Triassic continental sandstones from the Beryl Field, northern North Sea: Mineralogical, geochemical and sedimentological constraints. Journal of Sedimentary Research, 72, pp. 18-29. Suggate, S.M., Hall, R. (2013). Using detrital garnet compositions to determine provenance: a new compositional database and procedure. In: Scott, R.A., Smyth, H.R., Morton, A.C., Richardson, N. (Eds.), Sediment Provenance Studies in Hydrocarbon Exploration and Production. Geological Society of London, Special Publication, 386. http://dx.doi.org/10.1144/SP386.8 Wright, W.I. (1938). The composition and occurrence of garnets. American Mineralogist, 23, pp. 436-449.
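
    The plotting idea itself is a barycentric mapping: each four-endmember composition (summing to one) is projected onto the vertices of a tetrahedron. The sketch below is a Python analog of that mapping (the paper's implementation is MATLAB); the compositions are illustrative, not taken from the database cited:

    ```python
    # Barycentric mapping of four garnet endmember fractions (almandine,
    # pyrope, spessartine, grossular) onto tetrahedron vertices, then a 3-D
    # scatter. Python analog of the MATLAB plot; data are illustrative.
    import numpy as np
    import matplotlib.pyplot as plt

    # regular tetrahedron vertices, one per endmember
    V = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]], float)

    def to_xyz(comp):
        """comp: (n, 4) endmember fractions -> (n, 3) plot coordinates."""
        comp = np.asarray(comp, float)
        comp = comp / comp.sum(axis=1, keepdims=True)   # normalize to 1
        return comp @ V

    garnets = [[0.70, 0.15, 0.05, 0.10],     # hypothetical compositions
               [0.45, 0.40, 0.02, 0.13],
               [0.55, 0.05, 0.30, 0.10]]
    xyz = to_xyz(garnets)

    ax = plt.figure().add_subplot(projection="3d")
    ax.scatter(*xyz.T)
    for a, b in [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]:
        ax.plot(*V[[a, b]].T, color="gray", lw=0.8)     # tetrahedron edges
    plt.show()
    ```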

  10. Probabilistic seismic loss estimation via endurance time method

    NASA Astrophysics Data System (ADS)

    Tafakori, Ehsan; Pourzeynali, Saeid; Estekanchi, Homayoon E.

    2017-01-01

    Probabilistic Seismic Loss Estimation is a methodology used as a quantitative and explicit expression of the performance of buildings using terms that address the interests of both owners and insurance companies. Applying the ATC 58 approach for seismic loss assessment of buildings requires using Incremental Dynamic Analysis (IDA), which needs hundreds of time-consuming analyses, which in turn hinders its wide application. The Endurance Time Method (ETM) is proposed herein as part of a demand propagation prediction procedure and is shown to be an economical alternative to IDA. Various scenarios were considered to achieve this purpose and their appropriateness has been evaluated using statistical methods. The most precise and efficient scenario was validated through comparison against IDA driven response predictions of 34 code conforming benchmark structures and was proven to be sufficiently precise while offering a great deal of efficiency. The loss values were estimated by replacing IDA with the proposed ETM-based procedure in the ATC 58 procedure and it was found that these values suffer from varying inaccuracies, which were attributed to the discretized nature of damage and loss prediction functions provided by ATC 58.

  11. Φ-score: A cell-to-cell phenotypic scoring method for sensitive and selective hit discovery in cell-based assays.

    PubMed

    Guyon, Laurent; Lajaunie, Christian; Fer, Frédéric; Bhajun, Ricky; Sulpice, Eric; Pinna, Guillaume; Campalans, Anna; Radicella, J Pablo; Rouillier, Philippe; Mary, Mélissa; Combe, Stéphanie; Obeid, Patricia; Vert, Jean-Philippe; Gidrol, Xavier

    2015-09-18

    Phenotypic screening monitors phenotypic changes induced by perturbations, including those generated by drugs or RNA interference. Currently-used methods for scoring screen hits have proven to be problematic, particularly when applied to physiologically relevant conditions such as low cell numbers or inefficient transfection. Here, we describe the Φ-score, which is a novel scoring method for the identification of phenotypic modifiers or hits in cell-based screens. Φ-score performance was assessed with simulations, a validation experiment and its application to gene identification in a large-scale RNAi screen. Using robust statistics and a variance model, we demonstrated that the Φ-score showed better sensitivity, selectivity and reproducibility compared to classical approaches. The improved performance of the Φ-score paves the way for cell-based screening of primary cells, which are often difficult to obtain from patients in sufficient numbers. We also describe a dedicated merging procedure to pool scores from small interfering RNAs targeting the same gene so as to provide improved visualization and hit selection.

  12. Φ-score: A cell-to-cell phenotypic scoring method for sensitive and selective hit discovery in cell-based assays

    PubMed Central

    Guyon, Laurent; Lajaunie, Christian; Fer, Frédéric; Bhajun, Ricky; Sulpice, Eric; Pinna, Guillaume; Campalans, Anna; Radicella, J. Pablo; Rouillier, Philippe; Mary, Mélissa; Combe, Stéphanie; Obeid, Patricia; Vert, Jean-Philippe; Gidrol, Xavier

    2015-01-01

    Phenotypic screening monitors phenotypic changes induced by perturbations, including those generated by drugs or RNA interference. Currently-used methods for scoring screen hits have proven to be problematic, particularly when applied to physiologically relevant conditions such as low cell numbers or inefficient transfection. Here, we describe the Φ-score, which is a novel scoring method for the identification of phenotypic modifiers or hits in cell-based screens. Φ-score performance was assessed with simulations, a validation experiment and its application to gene identification in a large-scale RNAi screen. Using robust statistics and a variance model, we demonstrated that the Φ-score showed better sensitivity, selectivity and reproducibility compared to classical approaches. The improved performance of the Φ-score paves the way for cell-based screening of primary cells, which are often difficult to obtain from patients in sufficient numbers. We also describe a dedicated merging procedure to pool scores from small interfering RNAs targeting the same gene so as to provide improved visualization and hit selection. PMID:26382112

  13. Data Reduction Algorithm Using Nonnegative Matrix Factorization with Nonlinear Constraints

    NASA Astrophysics Data System (ADS)

    Sembiring, Pasukat

    2017-12-01

    Processing of data with very large dimensions has been a hot topic in recent decades. Various techniques have been proposed in order to extract the desired information or structure. Non-Negative Matrix Factorization (NMF), based on non-negative data, has become one of the popular methods for reducing dimensions. The main strength of this method is non-negativity: the object is modeled as a combination of non-negative basic parts, so as to provide a physical interpretation of the object's construction. NMF is a dimension reduction method that has been used widely for numerous applications including computer vision, text mining, pattern recognition, and bioinformatics. The mathematical formulation of NMF is not a convex optimization problem, and various types of algorithms have been proposed to solve it. The Alternating Nonnegative Least Squares (ANLS) framework is a block coordinate descent approach that has been proven theoretically reliable and empirically efficient. This paper proposes a new algorithm to solve the NMF problem based on the ANLS framework. This algorithm inherits the convergence property of the ANLS framework to nonlinear-constrained NMF formulations.
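
    The ANLS framework alternates two convex nonnegative least-squares subproblems: fix H and solve for W under nonnegativity, then fix W and solve for H. A minimal sketch of plain ANLS (the baseline the paper builds on, not its new algorithm):

    ```python
    # Minimal NMF by alternating nonnegative least squares (ANLS): each
    # half-step is a convex NNLS problem solved column by column.
    # Illustrative baseline, not the paper's proposed algorithm.
    import numpy as np
    from scipy.optimize import nnls

    def nmf_anls(X, r, iters=50, seed=0):
        m, n = X.shape
        rng = np.random.default_rng(seed)
        W = rng.random((m, r))
        H = rng.random((r, n))
        for _ in range(iters):
            # min ||X - WH||_F over H >= 0, one column of H at a time
            for j in range(n):
                H[:, j], _ = nnls(W, X[:, j])
            # min ||X - WH||_F over W >= 0, one row of W at a time
            for i in range(m):
                W[i, :], _ = nnls(H.T, X[i, :])
        return W, H

    X = np.abs(np.random.default_rng(1).random((20, 12)))
    W, H = nmf_anls(X, r=3)
    print(np.linalg.norm(X - W @ H))       # reconstruction error
    ```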

  14. Theoretical Sum Frequency Generation Spectroscopy of Peptides

    PubMed Central

    2015-01-01

    Vibrational sum frequency generation (SFG) has become a very promising technique for the study of proteins at interfaces, and it has been applied to important systems such as anti-microbial peptides, ion channel proteins, and human islet amyloid polypeptide. Moreover, so-called “chiral” SFG techniques, which rely on polarization combinations that generate strong signals primarily for chiral molecules, have proven to be particularly discriminatory of protein secondary structure. In this work, we present a theoretical strategy for calculating protein amide I SFG spectra by combining line-shape theory with molecular dynamics simulations. We then apply this method to three model peptides, demonstrating the existence of a significant chiral SFG signal for peptides with chiral centers, and providing a framework for interpreting the results on the basis of the dependence of the SFG signal on the peptide orientation. We also examine the importance of dynamical and coupling effects. Finally, we suggest a simple method for determining a chromophore’s orientation relative to the surface using ratios of experimental heterodyne-detected signals with different polarizations, and test this method using theoretical spectra. PMID:25203677

  15. Three-dimensional optical reconstruction of vocal fold kinematics using high-speed video with a laser projection system

    PubMed Central

    Luegmair, Georg; Mehta, Daryush D.; Kobler, James B.; Döllinger, Michael

    2015-01-01

    Vocal fold kinematics and its interaction with aerodynamic characteristics play a primary role in acoustic sound production of the human voice. Investigating the temporal details of these kinematics using high-speed videoendoscopic imaging techniques has proven challenging in part due to the limitations of quantifying complex vocal fold vibratory behavior using only two spatial dimensions. Thus, we propose an optical method of reconstructing the superior vocal fold surface in three spatial dimensions using a high-speed video camera and laser projection system. Using stereo-triangulation principles, we extend the camera-laser projector method and present an efficient image processing workflow to generate the three-dimensional vocal fold surfaces during phonation captured at 4000 frames per second. Initial results are provided for airflow-driven vibration of an ex vivo vocal fold model in which at least 75% of visible laser points contributed to the reconstructed surface. The method captures the vertical motion of the vocal folds at a high accuracy to allow for the computation of three-dimensional mucosal wave features such as vibratory amplitude, velocity, and asymmetry. PMID:26087485
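
    The stereo-triangulation core of such a system can be sketched with linear (DLT) triangulation of a point from two calibrated views; the projection matrices below are illustrative placeholders, not the paper's camera-laser geometry:

    ```python
    # Linear (DLT) triangulation: recover a 3-D point from its projections
    # in two calibrated views. Projection matrices here are illustrative.
    import numpy as np

    def triangulate(P1, P2, x1, x2):
        """P1, P2: 3x4 projection matrices; x1, x2: (u, v) image points."""
        A = np.stack([
            x1[0] * P1[2] - P1[0],
            x1[1] * P1[2] - P1[1],
            x2[0] * P2[2] - P2[0],
            x2[1] * P2[2] - P2[1],
        ])
        _, _, Vt = np.linalg.svd(A)        # null vector of A
        X = Vt[-1]
        return X[:3] / X[3]                # dehomogenize

    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])          # reference view
    R, t = np.eye(3), np.array([[-1.0], [0.0], [0.0]])     # second view
    P2 = np.hstack([R, t])

    X_true = np.array([0.2, -0.1, 4.0, 1.0])               # test point
    x1 = P1 @ X_true; x1 = x1[:2] / x1[2]
    x2 = P2 @ X_true; x2 = x2[:2] / x2[2]
    print(triangulate(P1, P2, x1, x2))     # ~ [0.2, -0.1, 4.0]
    ```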

  16. A binary linear programming formulation of the graph edit distance.

    PubMed

    Justice, Derek; Hero, Alfred

    2006-08-01

    A binary linear programming formulation of the graph edit distance for unweighted, undirected graphs with vertex attributes is derived and applied to a graph recognition problem. A general formulation for editing graphs is used to derive a graph edit distance that is proven to be a metric, provided the cost function for individual edit operations is a metric. Then, a binary linear program is developed for computing this graph edit distance, and polynomial time methods for determining upper and lower bounds on the solution of the binary program are derived by applying solution methods for standard linear programming and the assignment problem. A recognition problem of comparing a sample input graph to a database of known prototype graphs in the context of a chemical information system is presented as an application of the new method. The costs associated with various edit operations are chosen by using a minimum normalized variance criterion applied to pairwise distances between nearest neighbors in the database of prototypes. The new metric is shown to perform quite well in comparison to existing metrics when applied to a database of chemical graphs.
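
    The assignment-problem reduction used for the bounds can be sketched with a standard vertex-edit cost matrix solved by the Hungarian algorithm. This yields an assignment-based approximation of graph edit distance over vertex substitutions, insertions, and deletions, not the exact binary program, and the label costs are illustrative:

    ```python
    # Assignment-based approximation of graph edit distance over vertex
    # labels: cost matrix with substitution, deletion, and insertion blocks,
    # solved by the Hungarian algorithm. Illustrative costs, not the exact
    # binary linear program of the paper.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def assignment_ged(labels1, labels2, c_sub=1.0, c_indel=1.0):
        FORBID = 1e9                     # effectively disallows these cells
        n, m = len(labels1), len(labels2)
        C = np.zeros((n + m, n + m))
        C[:n, :m] = [[0.0 if a == b else c_sub for b in labels2]
                     for a in labels1]                       # substitutions
        C[:n, m:] = np.where(np.eye(n) > 0, c_indel, FORBID)  # deletions
        C[n:, :m] = np.where(np.eye(m) > 0, c_indel, FORBID)  # insertions
        rows, cols = linear_sum_assignment(C)
        return C[rows, cols].sum()

    print(assignment_ged(list("ABCD"), list("ABD")))   # cost 1: delete 'C'
    ```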

  17. Translocations of amphibians: Proven management method or experimental technique

    USGS Publications Warehouse

    Seigel, Richard A.; Dodd, C. Kenneth

    2002-01-01

    In an otherwise excellent review of metapopulation dynamics in amphibians, Marsh and Trenham (2001) make the following provocative statements (emphasis added): If isolation effects occur primarily in highly disturbed habitats, species translocations may be necessary to promote local and regional population persistence. Because most amphibians lack parental care, they are prime candidates for egg and larval translocations. Indeed, translocations have already proven successful for several species of amphibians. Where populations are severely isolated, translocations into extinct subpopulations may be the best strategy to promote regional population persistence. We take issue with these statements for a number of reasons. First, the authors fail to cite much of the relevant literature on species translocations in general and for amphibians in particular. Second, to those unfamiliar with current research in amphibian conservation biology, these comments might suggest that translocations are a proven management method. This is not the case, at least in most instances where translocations have been evaluated for an appropriate period of time. Finally, the authors fail to point out some of the negative aspects of species translocation as a management method. We realize that Marsh and Trenham's paper was not concerned primarily with translocations. However, because Marsh and Trenham (2001) made specific recommendations for conservation planners and managers (many of whom are not herpetologists or may not be familiar with the pertinent literature on amphibians), we believe that it is essential to point out that not all amphibian biologists are as comfortable with translocations as these authors appear to be. We especially urge caution about advocating potentially unproven techniques without a thorough review of available options.

  18. Development of a novel method for unraveling the origin of natron flux used in Roman glass production based on B isotopic analysis via multicollector inductively coupled plasma mass spectrometry.

    PubMed

    Devulder, Veerle; Degryse, Patrick; Vanhaecke, Frank

    2013-12-17

    The provenance of the flux raw material used in the manufacturing of Roman glass is an understudied topic in archaeology. Whether one or multiple sources of natron mineral salts were exploited during this period is still open for debate, largely because of the lack of a good provenance indicator. The flux is the major source of B in Roman glass. Therefore, B isotopic analysis of a sufficiently large collection and variety (origin and age) of such glass samples might give an indication of the number of flux sources used. For this purpose, a method based on acid digestion, chromatographic B isolation and B isotopic analysis using multicollector inductively coupled plasma mass spectrometry was developed. B isolation was accomplished using a combination of strong cation exchange and strong anion exchange chromatography. Although the B fraction was not completely matrix-free, the remaining Sb was shown not to affect the δ(11)B result. The method was validated using obsidian and archaeological glass samples that were stripped of their B content, after which an isotopic reference material with known B isotopic composition was added. Absence of artificial B isotope fractionation was demonstrated, and the total uncertainty was shown to be <2‰. A proof-of-concept application to natron glass samples showed a narrow range of δ(11)B, whereas first results for natron salt samples do show a larger difference in δ(11)B. These results suggest the use of only one natron source or of several sources with similar δ(11)B. This indicates that B isotopic analysis is a promising tool for the provenance determination of this flux raw material.

  19. Imaging Breast Density: Established and Emerging Modalities

    PubMed Central

    Chen, Jeon-Hor; Gulsen, Gultekin; Su, Min-Ying

    2015-01-01

    Mammographic density has been proven as an independent risk factor for breast cancer. Women with dense breast tissue visible on a mammogram have a much higher cancer risk than women with little density. A great research effort has been devoted to incorporate breast density into risk prediction models to better estimate each individual’s cancer risk. In recent years, the passage of breast density notification legislation in many states in USA requires that every mammography report should provide information regarding the patient’s breast density. Accurate definition and measurement of breast density are thus important, which may allow all the potential clinical applications of breast density to be implemented. Because the two-dimensional mammography-based measurement is subject to tissue overlapping and thus not able to provide volumetric information, there is an urgent need to develop reliable quantitative measurements of breast density. Various new imaging technologies are being developed. Among these new modalities, volumetric mammographic density methods and three-dimensional magnetic resonance imaging are the most well studied. Besides, emerging modalities, including different x-ray–based, optical imaging, and ultrasound-based methods, have also been investigated. All these modalities may either overcome some fundamental problems related to mammographic density or provide additional density and/or compositional information. The present review article aimed to summarize the current established and emerging imaging techniques for the measurement of breast density and the evidence of the clinical use of these density methods from the literature. PMID:26692524

  20. 7 CFR 1944.510 - Applicant eligibility.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...-income rural families; (e) Have the ability and willingness to work within established guidelines; and (f... capacity, it must either: (1) Have necessary background and experience with proven ability to perform... or administrative experience which indicates an ability to provide responsible technical and...

  1. 7 CFR 1944.510 - Applicant eligibility.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...-income rural families; (e) Have the ability and willingness to work within established guidelines; and (f... capacity, it must either: (1) Have necessary background and experience with proven ability to perform... or administrative experience which indicates an ability to provide responsible technical and...

  2. 7 CFR 4274.343 - Application.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... charges it will assess the ultimate recipients; (v) Demonstrate to Agency satisfaction that the... organizations; (vi) Provide evidence to Agency satisfaction that the intermediary has a proven record of... intermediary's program. Outcomes should be expressed in quantitative or observable terms such as jobs created...

  3. 7 CFR 4274.343 - Application.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... charges it will assess the ultimate recipients; (v) Demonstrate to Agency satisfaction that the... organizations; (vi) Provide evidence to Agency satisfaction that the intermediary has a proven record of... intermediary's program. Outcomes should be expressed in quantitative or observable terms such as jobs created...

  4. Active Noise and Vibration Control Literature Survey: Sensors and Actuators

    DTIC Science & Technology

    1999-08-01

    energy from being coupled into the structure of the surface ship or submarine. While these methods have proven to be effective in general, there are... [Table-of-contents excerpts: 3.5.3 Sensors Based on the Photo-elastic Effect; 3.6 Electro-rheological Fluids; 4.2.3 Control Methods for Vibration Isolation; 4.2.4 Effect of...]

  5. Lean Manufacturing Principles Improving the Targeting Process

    DTIC Science & Technology

    2012-06-08

    author has familiarity with Lean manufacturing principles. Third, Lean methods have been used in different industries and have proven adaptable to the... The case study also demonstrates the multi-organizational application of VSM, JIT and the 5S method... new members not knowing the process, this will serve as a starting point for developing understanding. Within the food industry we observed "the...

  6. Rates of biotite weathering, and clay mineral transformation and neoformation, determined from watershed geochemical mass-balance methods for the Coweeta Hydrologic Laboratory, Southern Blue Ridge Mountains, North Carolina, USA

    Treesearch

    Jason R. Price; Michael A. Velbel

    2013-01-01

    Biotite is a common constituent of silicate bedrock. Its weathering releases plant nutrients and consumes atmospheric CO2. Because of its stoichiometric relationship with its transformational weathering product and sensitivity to botanical activity, calculating biotite weathering rates using watershed mass-balance methods has proven challenging....

  7. Integrated Processing in Planning and Understanding.

    DTIC Science & Technology

    1986-12-01

    to language analysis seemed necessary. The second observation was the rather commonsense one that it is easier to understand a foreign language... syntactic analysis. Probably the most widely employed method for natural language analysis is augmented transition network parsing, or ATNs (Thorne, Bratley... accomplished. It is for this reason that the programming language Prolog, which implements that general method, has proven so well-suited to writing ATN...

  8. Communication, Collaboration and Cooperation: An Evaluation of Nova Scotia's Borrow Anywhere, Return Anywhere (BARA) Multi-Type Library Initiative

    ERIC Educational Resources Information Center

    van den Hoogen, Suzanne; Parrott, Denise

    2012-01-01

    Partnerships and collaborations among libraries are proven to enhance collective resources. The collaboration of multi-type libraries offers a unique opportunity to explore the potential of different libraries working together to provide the best possible service to their community members. This article provides a detailed report of a multi-type…

  9. Wood structural differences between northern and southern beech provenances growing at a moderate site.

    PubMed

    Eilmann, B; Sterck, F; Wegner, L; de Vries, S M G; von Arx, G; Mohren, G M J; den Ouden, J; Sass-Klaassen, U

    2014-08-01

    Planting provenances originating from southern to northern locations has been discussed as a strategy to speed up species migration and mitigate negative effects of climate change on forest stability and productivity. Especially for drought-susceptible species such as European beech (Fagus sylvatica L.), the introduction of drought-tolerant provenances from the south could be an option. Yet, beech has been found to respond plastically to environmental conditions, suggesting that the climate on the plantation site might be more important for tree growth than the genetic predisposition of potentially drought-adapted provenances. In this study, we compared the radial growth, wood-anatomical traits and leaf phenology of four beech provenances originating from southern (Bulgaria, France) and northern locations (Sweden, the Netherlands) and planted in a provenance trial in the Netherlands. The distribution of precipitation largely differs between the sites of origin. The northern provenances experience a maximum and the southern provenances experience a minimum of rainfall in summer. We compared tree productivity and the anatomy of the water-conducting system for the period from 2000 to 2010, including the drought year 2003. In addition, tree mortality and the timing of leaf unfolding in spring were analysed for the years 2001, 2007 and 2012. Comparison of these traits in the four beech provenances indicates the influence of genetic predisposition and local environmental factors on the performance of these provenances under moderate site conditions. Variation in radial growth was controlled by environment, although the growth level slightly differed due to genetic background. The Bulgarian provenance had an efficient water-conducting system which was moreover unaffected by the drought in 2003, pointing to a high ability of this provenance to cope well with dry conditions. In addition, the Bulgarian provenance showed up as most productive in terms of height and radial growth. Altogether, we conclude that the similarity in ring-width variation among provenances points to environmental control of this trait, whereas the differences encountered in wood-anatomical traits between the well-performing Bulgarian provenance and the other three provenances, as well as the consistent differences in flushing pattern over 3 years under various environmental conditions, support the hypothesis of genetic control of these features. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. A novel approach to identifying regulatory motifs in distantly related genomes

    PubMed Central

    Van Hellemont, Ruth; Monsieurs, Pieter; Thijs, Gert; De Moor, Bart; Van de Peer, Yves; Marchal, Kathleen

    2005-01-01

    Although proven successful in the identification of regulatory motifs, phylogenetic footprinting methods still show some shortcomings. To address these difficulties, which are most apparent when applying phylogenetic footprinting to distantly related organisms, we developed a two-step procedure that combines the advantages of sequence alignment and motif detection approaches. The results on well-studied benchmark datasets indicate that the presented method outperforms other methods when the sequences become either too long or too heterogeneous in size. PMID:16420672

  11. A collocation-shooting method for solving fractional boundary value problems

    NASA Astrophysics Data System (ADS)

    Al-Mdallal, Qasem M.; Syam, Muhammed I.; Anwar, M. N.

    2010-12-01

    In this paper, we discuss the numerical solution of a special class of fractional boundary value problems of order 2. The method of solution is based on conjugating collocation and spline analysis combined with the shooting method. Existence and uniqueness of the exact solution for this class are proven. Two examples involving the Bagley-Torvik equation subject to boundary conditions are also presented; numerical results illustrate the accuracy of the present scheme.
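
    The shooting idea at the core of such schemes can be illustrated on a classical (integer-order) two-point problem: guess the unknown initial slope, integrate the resulting initial value problem, and root-find on the boundary miss. A minimal sketch, assuming the toy equation u'' = -u with u(0) = 0, u(1) = 1 (not the paper's fractional collocation-spline scheme):

```python
# Minimal shooting-method sketch for u'' = -u, u(0) = 0, u(1) = 1.
# Illustrates the generic technique only; the paper's fractional
# collocation/spline machinery is not reproduced here.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

a, b, alpha, beta = 0.0, 1.0, 0.0, 1.0   # interval and boundary values (toy choice)

def miss(s):
    # Integrate the IVP with guessed initial slope s; return the error at x = b.
    rhs = lambda x, y: [y[1], -y[0]]      # y = (u, u'), so u'' = -u
    sol = solve_ivp(rhs, (a, b), [alpha, s], rtol=1e-9, atol=1e-12)
    return sol.y[0, -1] - beta

# Root-find on the slope so the trajectory hits u(b) = beta.
s_star = brentq(miss, -10.0, 10.0)
print("shooting slope:", s_star, "(exact: 1/sin(1) =", 1 / np.sin(1.0), ")")
```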

  12. The National Shipbuilding Research Program. Development of a Quick TBT Analytical Method

    DTIC Science & Technology

    2000-08-16

    Executive Summary: Concerns about the toxic effects of tributyltin have caused the...paints, developed in the 1960s, contain the organotin tributyltin (TBT), which has been proven to cause deformations in oysters and sex changes in...measured response (area counts) for tributyltin in deionized distilled water.

  13. Multispectral code excited linear prediction coding and its application in magnetic resonance images.

    PubMed

    Hu, J H; Wang, Y; Cahill, P T

    1997-01-01

    This paper reports a multispectral code excited linear prediction (MCELP) method for the compression of multispectral images. Different linear prediction models and adaptation schemes have been compared. The method that uses a forward adaptive autoregressive (AR) model has been proven to achieve a good compromise between performance, complexity, and robustness. This approach is referred to as the MFCELP method. Given a set of multispectral images, the linear predictive coefficients are updated over nonoverlapping three-dimensional (3-D) macroblocks. Each macroblock is further divided into several 3-D microblocks, and the best excitation signal for each microblock is determined through an analysis-by-synthesis procedure. The MFCELP method has been applied to multispectral magnetic resonance (MR) images. To satisfy the high quality requirement for medical images, the error between the original image set and the synthesized one is further encoded using a vector quantizer. This method has been applied to images from 26 clinical MR neuro studies (20 slices/study, three spectral bands/slice, 256x256 pixels/band, 12 bits/pixel). The MFCELP method provides a significant visual improvement over the discrete cosine transform (DCT) based Joint Photographic Experts Group (JPEG) method, the wavelet transform based embedded zerotree wavelet (EZW) coding method, and the vector tree (VT) coding method, as well as the multispectral segmented autoregressive moving average (MSARMA) method we developed previously.
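
    As a rough 1-D illustration of the coder's core loop, the sketch below fits a forward-adaptive AR model per block and selects the excitation by analysis-by-synthesis from a random codebook. The signal, AR order, block size and codebook are all illustrative assumptions; the actual MFCELP coder operates on 3-D multispectral macroblocks with vector-quantized residuals.

```python
# Toy block-adaptive linear prediction with analysis-by-synthesis excitation
# selection, the generic idea behind CELP-style coders such as MFCELP.
import numpy as np

rng = np.random.default_rng(0)
signal = np.cumsum(rng.normal(size=256))   # hypothetical 1-D "image row"
block, order = 32, 2

def fit_ar(x, p):
    # Least-squares AR(p) fit on one block (forward adaptation).
    cols = [x[p - 1 - k: len(x) - 1 - k] for k in range(p)]
    return np.linalg.lstsq(np.column_stack(cols), x[p:], rcond=None)[0]

codebook = rng.normal(size=(64, block))    # random excitation codebook (assumption)

coded = []
for start in range(0, len(signal), block):
    blk = signal[start:start + block]
    a = fit_ar(blk, order)
    best, best_err = None, np.inf
    for idx, exc in enumerate(codebook):
        # Synthesize the block by running the excitation through the AR filter.
        synth = np.zeros(block)
        synth[:order] = blk[:order]
        for n in range(order, block):
            synth[n] = a @ synth[n - order:n][::-1] + exc[n]
        err = np.sum((blk - synth) ** 2)
        if err < best_err:
            best, best_err = idx, err
    coded.append((a, best))                # transmit AR coefficients + codebook index
print("coded", len(coded), "blocks")
```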

  14. Parental age and birth order in Chinese children with congenital heart disease.

    PubMed Central

    Tay, J S; Yip, W C; Joseph, R

    1982-01-01

    Parental age and birth order were studied in 100 Chinese children with congenital heart disease (proven by cardiac catheterisation) and in 100 controls. A higher incidence of congenital heart disease was present in the children with higher birth orders. No relationship was found between the incidence and the paternal or maternal ages. Using the method of multiple regression analysis, this birth-order effect was significant (p < 0.01) and independent of parental age. This finding provides indirect evidence of environmental influence in the causation of congenital heart disease, which is known to be inherited in a multifactorial manner. Family planning to limit the size of the family may possibly contribute to the reduction of the incidence of congenital heart disease. PMID:7154041
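
    The analysis described can be sketched as a multiple regression of disease status on birth order and parental ages; case-control data of this kind is commonly fitted with a logistic model. The snippet below does this on synthetic data (all numbers invented; the study's exact specification is not reproduced):

```python
# Logistic multiple regression: is birth order associated with disease
# independently of maternal and paternal age? Synthetic illustration only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
birth_order = rng.integers(1, 6, n)
maternal_age = rng.normal(28, 5, n)
paternal_age = maternal_age + rng.normal(3, 2, n)
# Synthetic outcome: a true birth-order effect, no parental-age effect.
logit = -1.5 + 0.4 * birth_order
disease = rng.random(n) < 1 / (1 + np.exp(-logit))

X = sm.add_constant(np.column_stack([birth_order, maternal_age, paternal_age]))
fit = sm.Logit(disease.astype(float), X).fit(disp=0)
print(fit.summary(xname=["const", "birth_order", "maternal_age", "paternal_age"]))
```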

  15. Real time aircraft fly-over noise discrimination

    NASA Astrophysics Data System (ADS)

    Genescà, M.; Romeu, J.; Pàmies, T.; Sánchez, A.

    2009-06-01

    A method for measuring aircraft noise time history with automatic elimination of simultaneous urban noise is presented in this paper. A 3 m-long, 12-microphone sparse array has been proven to give good performance in a wide range of urban placements. At present, urban placements have to be avoided because their background noise strongly influences the measurements made by sound level meters or single microphones. Because of the small device size and low number of microphones (which make it easy to set up), the resolution of the device is not high enough to provide a clean aircraft noise time history by applying frequency-domain beamforming alone to the spatial cross-correlations of the microphones' signals. Therefore, a new step has been added to the processing algorithm to overcome this limitation.
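
    The generic step the array relies on, frequency-domain beamforming over the microphones' cross-spectral matrix, looks roughly like the sketch below. The linear geometry, source angle and narrowband frequency are illustrative assumptions, not the authors' configuration.

```python
# Narrowband delay-and-sum beamforming on a simulated 12-microphone linear array.
import numpy as np

c, f, M = 343.0, 1000.0, 12            # speed of sound (m/s), frequency, mic count
x = np.linspace(0.0, 3.0, M)           # 3 m-long linear array positions
theta_src = np.deg2rad(30)             # hypothetical source direction

k = 2 * np.pi * f / c
rng = np.random.default_rng(2)
# Simulated snapshots: one plane-wave source plus sensor noise.
steer_src = np.exp(1j * k * x * np.sin(theta_src))
snaps = (steer_src[:, None] * rng.normal(size=(1, 200))
         + 0.1 * (rng.normal(size=(M, 200)) + 1j * rng.normal(size=(M, 200))))
R = snaps @ snaps.conj().T / snaps.shape[1]   # cross-spectral matrix

angles = np.deg2rad(np.linspace(-90, 90, 361))
power = [np.real(v.conj() @ R @ v) / M**2     # beamformer output per look angle
         for v in (np.exp(1j * k * x * np.sin(a)) for a in angles)]
print("estimated DOA:", np.rad2deg(angles[int(np.argmax(power))]), "deg")
```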

  16. Databases and Associated Tools for Glycomics and Glycoproteomics.

    PubMed

    Lisacek, Frederique; Mariethoz, Julien; Alocci, Davide; Rudd, Pauline M; Abrahams, Jodie L; Campbell, Matthew P; Packer, Nicolle H; Ståhle, Jonas; Widmalm, Göran; Mullen, Elaine; Adamczyk, Barbara; Rojas-Macias, Miguel A; Jin, Chunsheng; Karlsson, Niclas G

    2017-01-01

    Access to biodatabases for glycomics and glycoproteomics has proven essential for current glycobiological research. This chapter presents available databases devoted to different aspects of glycobioinformatics, including oligosaccharide sequence databases, experimental databases, 3D structure databases (of both glycans and glyco-related proteins) and associations of glycans with tissue, disease, and proteins. Specific search protocols are also provided using tools associated with experimental databases for converting primary glycoanalytical data to glycan structural information. In particular, researchers using glycoanalysis methods based on U/HPLC (GlycoBase), MS (GlycoWorkbench, UniCarb-DB, GlycoDigest), and NMR (CASPER) will benefit from this chapter. In addition, we include information on how to utilize glycan structural information to query databases that associate glycans with proteins (UniCarbKB) and with interactions with pathogens (SugarBind).

  17. Real Time Optimal Control of Supercapacitor Operation for Frequency Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luo, Yusheng; Panwar, Mayank; Mohanpurkar, Manish

    2016-07-01

    Supercapacitors are gaining wider application in power systems due to their fast dynamic response. Utilizing supercapacitors through power electronics interfaces for power compensation is a proven, effective technique. For applications such as frequency restoration, however, the cost of supercapacitor maintenance as well as the energy loss in the power electronics interfaces must be addressed, and it is infeasible to use traditional optimization control methods to mitigate the impacts of frequent cycling. This paper proposes a Front End Controller (FEC) using Generalized Predictive Control featuring real-time receding optimization. The optimization constraints are based on cost and thermal management to enhance the utilization efficiency of supercapacitors. A rigorous mathematical derivation is conducted, and test results acquired from a Digital Real Time Simulator are provided to demonstrate effectiveness.
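
    The receding-horizon optimization at the heart of Generalized Predictive Control can be sketched on a toy first-order plant: at each step, predict over a horizon, minimize a quadratic tracking-plus-effort cost in closed form, apply only the first move, and repeat. The model, horizon and weights below are assumptions, not the paper's supercapacitor formulation.

```python
# Receding-horizon (GPC-style) control of a scalar plant x+ = a*x + b*u.
import numpy as np

a_m, b_m = 0.95, 0.1          # hypothetical discrete plant parameters
N, lam = 10, 0.01             # prediction horizon and control-effort weight
x, ref = 0.0, 1.0

for t in range(50):
    # Prediction matrices: x_future = F*x + G @ u_seq over the horizon.
    F = np.array([a_m ** (i + 1) for i in range(N)])
    G = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1):
            G[i, j] = a_m ** (i - j) * b_m
    # Minimize ||x_future - ref||^2 + lam*||u||^2 in closed form.
    u_seq = np.linalg.solve(G.T @ G + lam * np.eye(N), G.T @ (ref - F * x))
    u = u_seq[0]              # apply only the first move, then re-optimize
    x = a_m * x + b_m * u
print("state after 50 steps:", x)
```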

  18. Comprehensible Presentation of Topological Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, Gunther H.; Beketayev, Kenes; Bremer, Peer-Timo

    2012-03-05

    Topological information has proven very valuable in the analysis of scientific data. An important challenge that remains is presenting this highly abstract information in a way that is comprehensible even to those without an in-depth background in topology. Furthermore, it is often desirable to combine the structural insight gained by topological analysis with complementary information, such as geometric information. We present an overview of methods that use metaphors to make topological information more accessible to non-expert users, and we demonstrate their applicability to a range of scientific data sets. With the increasingly complex output of exascale simulations, the importance of having effective means of providing a comprehensible, abstract overview of data will grow. The techniques that we present will serve as an important foundation for this purpose.

  19. Plasmonic gold nanostars as optical nano-additives for injection molded polymer composites

    NASA Astrophysics Data System (ADS)

    Boyne, Devon A.; Orlicki, Joshua A.; Walck, Scott D.; Savage, Alice M.; Li, Thomas; Griep, Mark H.

    2017-10-01

    Nanoscale engineering of noble metal particles has provided numerous material configurations to selectively confine and manipulate light across the electromagnetic spectrum. Transitioning these materials to a composite form while maintaining the desired resonance properties has proven challenging. In this work, the successful integration of plasmon-focusing gold nanostars (GNSs) into polymer nanocomposites (PNCs) is demonstrated. Tailored GNSs are produced with over 90% yield, and methods to control the branching structures are shown. A protective silica capping shell is employed on the nanomaterials to facilitate survivability under the high-temperature/high-shear processing conditions used to create optically tuned injection-molded PNCs. The developed GNS PNCs possess dichroic scattering and absorption behavior, opening up potential applications in the fields of holographic imaging, optical filtering and photovoltaics.

  20. Numerical Simulation of Metallic Uranium Sintering

    NASA Astrophysics Data System (ADS)

    Berry, Bruce

    Conventional ceramic oxide nuclear fuels are limited in their thermal and life-cycle properties. The desire to operate at higher burnups, as required by current utility economics, has proven a formidable challenge for oxide fuel designs. Metallic formulations have superior thermal performance but are plagued by volumetric swelling due to fission gas buildup. In this study, we consider a number of specific microstructure configurations that have been experimentally shown to exhibit considerable resistance to porosity loss. Specifically, a bimodal void-size distribution was shown to resist early pore loss and could provide collection sites for fission gas buildup. We employ the phase field model of Cahn and Hilliard, solved via the finite element method using the open-source Multiphysics Object-Oriented Simulation Environment (MOOSE) developed by INL.
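
    A minimal sketch of the Cahn-Hilliard phase-field dynamics the study solves, here in 1-D with explicit finite differences rather than MOOSE's finite elements; the grid, time step, and gradient-energy coefficient are illustrative assumptions.

```python
# 1-D Cahn-Hilliard coarsening: dc/dt = laplacian(f'(c) - kappa*laplacian(c)),
# with double-well free energy f(c) = (c^2 - 1)^2 / 4, so f'(c) = c^3 - c.
import numpy as np

n, dx, dt, kappa = 128, 1.0, 0.01, 1.0
rng = np.random.default_rng(3)
c = 0.1 * rng.standard_normal(n)         # near-critical composition field

def lap(u):
    # Periodic second difference.
    return (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2

for step in range(20000):
    mu = c**3 - c - kappa * lap(c)       # chemical potential
    c += dt * lap(mu)                    # conserved (Cahn-Hilliard) dynamics
print("composition range after coarsening:", c.min(), c.max())
```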

  1. Tip-enhanced Raman scattering (TERS) and high-resolution bio nano-analysis--a comparison.

    PubMed

    Deckert-Gaudig, Tanja; Deckert, Volker

    2010-10-14

    This perspective presents and assesses the development and capabilities of tip-enhanced Raman scattering (TERS) since its discovery in 2000. So far, this technique has proven valuable for studies of a variety of inorganic, organic and biochemical specimens. Due to its ability to provide chemical and topographic characterization in a single experiment at sub-100 nm resolution, TERS has gained importance in super-resolution structural analysis. In this contribution the focus is on applications relevant to the biological and medical fields. The potential and challenges of this near-field technique are discussed with respect to state-of-the-art microscopic and spectroscopic imaging methods. Furthermore, possible ways to surpass current boundaries and an outlook on future projects are presented.

  2. SAPNEW: Parallel finite element code for thin shell structures on the Alliant FX-80

    NASA Astrophysics Data System (ADS)

    Kamat, Manohar P.; Watson, Brian C.

    1992-11-01

    The finite element method has proven to be an invaluable tool for analysis and design of complex, high performance systems, such as bladed-disk assemblies in aircraft turbofan engines. However, as the problem size increases, the computation time required by conventional computers can be prohibitively high. Parallel processing computers provide the means to overcome these computation time limits. This report summarizes the results of a research activity aimed at providing a finite element capability for analyzing turbomachinery bladed-disk assemblies in a vector/parallel processing environment. A special purpose code, named with the acronym SAPNEW, has been developed to perform static and eigen analysis of multi-degree-of-freedom blade models built up from flat thin shell elements. SAPNEW provides a stand-alone capability for static and eigen analysis on the Alliant FX/80, a parallel processing computer. A preprocessor, named with the acronym NTOS, has been developed to accept NASTRAN input decks and convert them to the SAPNEW format to make SAPNEW more readily usable by researchers at NASA Lewis Research Center.

  3. Ceramic nanocarriers: versatile nanosystem for protein and peptide delivery.

    PubMed

    Singh, Deependra; Dubey, Pooja; Pradhan, Madhulika; Singh, Manju Rawat

    2013-02-01

    Proteins and peptides have been established as potential drug candidates for various human diseases, but delivery of these therapeutic proteins and peptides remains a challenge because of their several unfavorable properties. Nanotechnology is expanding as a promising tool for the efficient delivery of proteins and peptides. Among numerous nano-based carriers, ceramic nanoparticles have proven themselves a unique carrier for protein and peptide delivery, as they provide more stable, bioavailable, readily manufacturable, and acceptable protein and polypeptide formulations. This article provides an overview of the various aspects of ceramic nanoparticles, including their classification, methods of preparation, latest advances, and applications as protein and peptide delivery carriers. Ceramic nanocarriers seem to have potential for preserving the structural integrity of proteins and peptides, thereby promoting a better therapeutic effect. This approach thus provides pharmaceutical scientists with new hope for the delivery of proteins and peptides. Still, considerable study of ceramic nanocarriers is necessary with respect to pharmacokinetics, toxicology, and animal studies to confirm their efficiency as well as safety, and to establish their clinical usefulness and scale-up to the industrial level.

  4. Development of acoustic emission evaluation method for repaired prestressed concrete bridge girders.

    DOT National Transportation Integrated Search

    2011-06-01

    Acoustic emission (AE) monitoring has proven to be a useful nondestructive testing tool in ordinary reinforced concrete beams. Over the past decade, however, the technique has also been used to test other concrete structures. It has been seen that ac...

  5. Time-lapse monitoring of soil water content using electromagnetic conductivity imaging

    USDA-ARS?s Scientific Manuscript database

    The volumetric soil water content (VWC) is fundamental to agriculture. Unfortunately, the universally accepted thermogravimetric method is labour intensive and time-consuming to use for field-scale monitoring. Electromagnetic (EM) induction instruments have proven to be useful in mapping the spatio-...

  6. Getting Students To Read Actively.

    ERIC Educational Resources Information Center

    Kitao, Kenji

    1994-01-01

    This article discusses Japanese students' difficulties in reading English, overviews some of the problems of college English textbooks, presents the results of research on the subject, and discusses characteristics of measures of readability. Teaching methods that have proven effective with Japanese students and activities for engaging students in…

  7. Best Practices of Literacy Leaders: Keys to School Improvement

    ERIC Educational Resources Information Center

    Bean, Rita M., Ed.; Dagen, Allison Swan, Ed.

    2011-01-01

    Bringing together leading experts, this book presents the principles of effective literacy leadership and describes proven methods for improving instruction, assessment, and schoolwide professional development. The book shows how all school staff--including reading specialists and coaches, administrators, teachers, and special educators--can play…

  8. IN-SITU THERMAL REMEDIATION: MECHANISMS, PRINCIPLES, AND CASE STUDIES

    EPA Science Inventory

    Since the early 1990's, thermal methods of enhanced oil recovery have been adapted for the remediation of soils and groundwater. Steam injection and electrical resistance heating have proven to be robust and aggressive techniques for the enhanced recovery of volatile and semivol...

  9. Multivariate analysis in provenance studies: Cerrillos obsidians case, Peru

    NASA Astrophysics Data System (ADS)

    Bustamante, A.; Delgado, M.; Latini, R. M.; Bellido, A. V. B.

    2007-02-01

    We present the preliminary results of a provenance study of obsidian samples from Cerrillos (ca. 800-100 B.C.) using Mössbauer spectroscopy. The Cerrillos archaeological site, located in the Upper Ica Valley, Peru, is the only Paracas ceremonial center excavated so far. The archaeological data collected suggest the existence of a complex social and economic organization on the south coast of Peru. Provenance research on obsidian provides valuable information about the selection of lithic resources by our ancestors and, eventually, about the existence of communication routes and exchange networks. We characterized 18 obsidian artifact samples from Cerrillos by Mössbauer spectroscopy. The spectra, recorded at room temperature using different velocities, are mainly composed of broad asymmetric doublets due to the superposition of at least two quadrupole doublets corresponding to Fe2+ in two different sites (species A and B), one weak Fe3+ doublet (species C) and magnetic components associated with the presence of small particles of magnetite. Multivariate statistical analysis of the Mössbauer data (hyperfine parameters) allows two main groups of obsidians to be defined, reflecting different geographical origins.
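
    A generic multivariate workflow consistent with the abstract, PCA followed by k-means clustering on the hyperfine parameters, might look like the sketch below; the feature table is synthetic, standing in for isomer shifts, quadrupole splittings and relative areas per sample.

```python
# PCA + k-means grouping of (synthetic) Mössbauer hyperfine parameters.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
# 18 samples x 4 hyperfine features, drawn from two hypothetical sources.
source_a = rng.normal([1.05, 2.10, 0.45, 20.0], 0.05, size=(9, 4))
source_b = rng.normal([1.15, 2.40, 0.55, 35.0], 0.05, size=(9, 4))
X = StandardScaler().fit_transform(np.vstack([source_a, source_b]))

scores = PCA(n_components=2).fit_transform(X)          # reduce to 2 components
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print("cluster assignments:", labels)
```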

  10. Automated structure solution, density modification and model building.

    PubMed

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful, and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  11. Detection of underground voids in Tahura Japan Cave Bandung using ground penetrating radar

    NASA Astrophysics Data System (ADS)

    Azimmah, Azizatun; Widodo

    2017-07-01

    The detection of underground voids is important because of their effect on subsidence risk. Ground penetrating radar (GPR) is a geophysical electromagnetic method that has been proven able to detect and locate voids beneath the surface effectively at shallow depths. The method uses contrasts in dielectric properties, resistivity and magnetic permeability to investigate and map what lies beneath the surface. This research focused on how GPR could be applied to detect underground voids at the site of investigation, the Japan Cave in Taman Hutan Raya, Dago, Bandung, Indonesia. A 100 MHz shielded GPR antenna was used to measure three lines, each more than 80 meters long, positioned on the surface above the Japan Cave. The radargrams showed regions of contrasting amplitude, proven to be air-filled cavities, at depths of less than 10 meters, together with interfaces between the underlying layers.
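
    The depth estimate behind such radargram interpretations follows from the dielectric contrast: the wave speed in the ground is v = c/sqrt(eps_r), and the reflector depth is v*t/2 for a two-way travel time t. A minimal sketch with assumed values, which lands in the same sub-10 m range the survey reports:

```python
# GPR two-way travel time to depth; permittivity and travel time are assumptions.
c0 = 0.3        # speed of light, m/ns
eps_r = 9.0     # hypothetical relative permittivity of the host rock
t_ns = 150.0    # hypothetical two-way travel time to a reflector, ns

v = c0 / eps_r ** 0.5          # wave speed in the ground, m/ns
depth = v * t_ns / 2.0         # one-way depth to the reflector
print(f"wave speed {v:.3f} m/ns -> reflector depth {depth:.1f} m")
```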

  12. Production of fumaric acid by immobilized Rhizopus arrhizus RH 7-13-9# on loofah fiber in a stirred-tank reactor.

    PubMed

    Liu, Huan; Zhao, Shijie; Jin, Yuhan; Yue, Xuemin; Deng, Li; Wang, Fang; Tan, Tianwei

    2017-11-01

    Fumaric acid is an important building-block chemical, and its production by fermentation is possible. Loofah fiber is a natural, biodegradable, renewable polymer material with a highly sophisticated pore structure. This work investigated a new immobilization method using loofah fiber as a carrier to produce fumaric acid in a stirred-tank reactor. Compared with other carriers, loofah fiber was proven to be used efficiently and successfully in the reactor. After the optimization process, a 20 g loofah fiber loading and a 400 rpm agitation speed were chosen as the most suitable process conditions. 30.3 g/L fumaric acid in the broth as well as 19.16 g fumaric acid in the solid precipitate was achieved, while the yield from glucose reached 0.211 g/g. Three batches of fermentation using the same loofah fiber carrier were conducted successfully, demonstrating a new method to produce fumaric acid in a stirred-tank reactor.

  13. A draft map of the mouse pluripotent stem cell spatial proteome

    PubMed Central

    Christoforou, Andy; Mulvey, Claire M.; Breckels, Lisa M.; Geladaki, Aikaterini; Hurrell, Tracey; Hayward, Penelope C.; Naake, Thomas; Gatto, Laurent; Viner, Rosa; Arias, Alfonso Martinez; Lilley, Kathryn S.

    2016-01-01

    Knowledge of the subcellular distribution of proteins is vital for understanding cellular mechanisms. Capturing the subcellular proteome in a single experiment has proven challenging, with studies focusing on specific compartments or assigning proteins to subcellular niches with low resolution and/or accuracy. Here we introduce hyperLOPIT, a method that couples extensive fractionation and quantitative high-resolution accurate-mass spectrometry with multivariate data analysis. We apply hyperLOPIT to a pluripotent stem cell population whose subcellular proteome has not been extensively studied. We provide localization data on over 5,000 proteins with unprecedented spatial resolution, revealing the organization of organelles, sub-organellar compartments, protein complexes, functional networks, the steady-state dynamics of proteins, and unexpected subcellular locations. The method paves the way for characterizing the impact of post-transcriptional and post-translational modification on protein location and for studies of proteome-level locational changes upon cellular perturbation. An interactive open-source resource is presented that enables exploration of these data. PMID:26754106
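
    The multivariate step, assigning proteins to organelles from their fractionation profiles with a supervised classifier, can be sketched as below. The profiles are synthetic stand-ins for quantitative MS channel intensities, and the classifier choice is an assumption (the published hyperLOPIT analyses use R-based machine-learning tooling).

```python
# Nearest-neighbour assignment of proteins to organelles from (synthetic)
# fractionation profiles, illustrating the LOPIT-style classification idea.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(5)
# 10 fractionation channels; two hypothetical organelle signatures.
mito = rng.normal([5, 4, 3, 2, 1, 1, 1, 1, 1, 1], 0.3, size=(40, 10))
er = rng.normal([1, 1, 1, 1, 1, 2, 3, 4, 5, 4], 0.3, size=(40, 10))
X = np.vstack([mito, er])
y = ["mitochondrion"] * 40 + ["ER"] * 40

clf = KNeighborsClassifier(n_neighbors=5).fit(X, y)
unknown = rng.normal([5, 4, 3, 2, 1, 1, 1, 1, 1, 1], 0.3, size=(1, 10))
print("predicted location:", clf.predict(unknown)[0])
```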

  14. Object-Oriented/Data-Oriented Design of a Direct Simulation Monte Carlo Algorithm

    NASA Technical Reports Server (NTRS)

    Liechty, Derek S.

    2014-01-01

    Over the past decade, there has been much progress towards improved phenomenological modeling and algorithmic updates for the direct simulation Monte Carlo (DSMC) method, which provides a probabilistic physical simulation of gas flows. These improvements have largely been based on the work of the originator of the DSMC method, Graeme Bird. Of primary importance are improved chemistry, internal energy, and physics modeling and a reduction in time to solution. These allow for an expanded range of possible solutions in altitude and velocity space. NASA's current production code, the DSMC Analysis Code (DAC), is well established, is based on Bird's 1994 algorithms written in Fortran 77, and has proven difficult to upgrade. A new DSMC code is being developed in the C++ programming language using object-oriented and data-oriented design paradigms to facilitate the inclusion of the recent improvements and future development activities. The development efforts on the new code, the Multiphysics Algorithm with Particles (MAP), are described, and performance comparisons are made with DAC.

  15. Determination of toxigenic fungi and aflatoxins in nuts and dried fruits using imaging and spectroscopic techniques.

    PubMed

    Wu, Qifang; Xie, Lijuan; Xu, Huirong

    2018-06-30

    Nuts and dried fruits contain rich nutrients and are thus highly vulnerable to contamination with toxigenic fungi and aflatoxins because of poor weather, processing and storage conditions. Imaging and spectroscopic techniques have proven to be potential alternative tools to wet chemistry methods for efficient and non-destructive determination of contamination with fungi and toxins. Thus, this review provides an overview of the current developments and applications in frequently used food safety testing techniques, including near infrared spectroscopy (NIRS), mid-infrared spectroscopy (MIRS), conventional imaging techniques (colour imaging (CI) and hyperspectral imaging (HSI)), and fluorescence spectroscopy and imaging (FS/FI). Interesting classification and determination results can be found in both static and on/in-line real-time detection for contaminated nuts and dried fruits. Although these techniques offer many benefits over conventional methods, challenges remain in terms of heterogeneous distribution of toxins, background constituent interference, model robustness, detection limits, sorting efficiency, as well as instrument development.

  16. Use of indicator chemicals to characterize the plastic fragments ingested by Laysan albatross.

    PubMed

    Nilsen, Frances; David Hyrenbach, K; Fang, Jiasong; Jensen, Brenda

    2014-10-15

    Laysan albatross (Phoebastria immutabilis) ingest plastic marine debris of a wide range of shapes, sizes and sources. To better characterize this plastic and provide insights regarding its provenance and persistence in the environment, we developed a simple method to classify plastic fragments of unknown origin according to the resin codes used by the Society of the Plastics Industry. Known plastics were analyzed by gas chromatography-mass spectrometry (GC-MS) to identify indicator chemicals characteristic of each plastic resin. Application of this method to fragments of ingested plastic debris from boluses of Laysan albatross from Kure Atoll, Hawai'i, yielded proportions of 0.8% High Density Polyethylene, 6.8% Polystyrene, 8.5% Polyethylene Terephthalate, 20.5% Polyvinyl Chloride and 68.4% Polypropylene; some fragments were composed of multiple resin types. These results suggest that infrequently recycled plastics are the dominant fragments ingested by albatross, and that these are the most prevalent and persistent resin types in the marine environment.
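
    The classification logic reduces to matching detected indicator chemicals against a marker-to-resin table, as sketched below; the marker assignments shown are invented placeholders, not the published indicator set.

```python
# Classify a fragment by its GC-MS indicator chemicals; a fragment can match
# several resins, consistent with the composite fragments the study observed.
INDICATORS = {
    "2,4-di-tert-butylphenol": "PP",   # hypothetical marker assignments
    "styrene trimer": "PS",
    "dioctyl phthalate": "PVC",
    "terephthalic acid": "PET",
}

def classify_fragment(detected_compounds):
    # Return the sorted set of resin codes whose markers were detected.
    return sorted({INDICATORS[c] for c in detected_compounds if c in INDICATORS})

print(classify_fragment(["styrene trimer", "dioctyl phthalate"]))  # ['PS', 'PVC']
```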

  17. A facile strategy to decorate Cu₉S₅ nanocrystals on polyaniline nanowires and their synergetic catalytic properties.

    PubMed

    Lu, Xiao-feng; Bian, Xiu-jie; Li, Zhi-cheng; Chao, Dan-ming; Wang, Ce

    2013-10-16

    Here, we demonstrate a novel method to decorate Cu₉S₅ nanocrystals on polyaniline (PANI) nanowires, using the mercaptoacetic acid (MAA) dopant in the PANI matrix as the sulfur source under a hydrothermal reaction. TEM images showed that Cu₉S₅ nanocrystals 5-20 nm in size formed uniformly on the surface of the PANI nanowires. Significantly, the as-prepared PANI/Cu₉S₅ composite nanowires have been proven to be novel peroxidase mimics for the oxidation of the peroxidase substrate 3,3',5,5'-tetramethylbenzidine (TMB) in the presence of H₂O₂. Due to the synergetic effects between the polyaniline nanowires and the Cu₉S₅ nanocrystals, the composite nanowires exhibit catalytic activity superior to that of either component alone. This work not only presents a simple and versatile method to decorate semiconductor nanocrystals on the surface of conducting polymer nanostructures, but also provides fundamental guidelines for further investigations into the synergetic effect between conducting polymers and other materials.

  18. "SmartMonitor"--an intelligent security system for the protection of individuals and small properties with the possibility of home automation.

    PubMed

    Frejlichowski, Dariusz; Gościewska, Katarzyna; Forczmański, Paweł; Hofman, Radosław

    2014-06-05

    "SmartMonitor" is an intelligent security system based on image analysis that combines the advantages of alarm, video surveillance and home automation systems. The system is a complete solution that automatically reacts to every learned situation in a pre-specified way and has various applications, e.g., home and surrounding protection against unauthorized intrusion, crime detection or supervision over ill persons. The software is based on well-known and proven methods and algorithms for visual content analysis (VCA) that were appropriately modified and adopted to fit specific needs and create a video processing model which consists of foreground region detection and localization, candidate object extraction, object classification and tracking. In this paper, the "SmartMonitor" system is presented along with its architecture, employed methods and algorithms, and object analysis approach. Some experimental results on system operation are also provided. In the paper, focus is put on one of the aforementioned functionalities of the system, namely supervision over ill persons.

  19. New immobilisation protocol for the template used in solid-phase synthesis of MIP nanoparticles

    NASA Astrophysics Data System (ADS)

    Chen, Lu; Muhammad, Turghun; Yakup, Burabiye; Piletsky, Sergey A.

    2017-06-01

    As a novel imprinting method, solid-phase synthesis has proven to be a promising approach for preparing polymer nanoparticles with specific recognition sites for a template molecule. In this method, imprinted polymer nanoparticles are synthesized using a template immobilized on a solid support. Herein, the preparation of immobilized templates on quartz chips through a homogeneous route is reported as an efficient alternative to the heterogeneous strategy. The template molecule indole-3-butyric acid (IBA) was reacted with 3-aminopropyltriethoxysilane (APTES) to produce a silylated template (IBA-APTES), which was characterized by IR, 1H NMR and GC-MS. The silylated template molecule was then grafted onto the activated surface of a quartz chip to prepare the immobilized template (SiO2@IBA-APTES). The immobilization was confirmed by contact angle, XPS, UV and fluorescence measurements. The immobilization protocol showed good reproducibility and stability of the immobilized template. MIP nanoparticles were prepared with high selectivity toward the molecule immobilized on the solid surface. This provides a new approach for the development of molecularly imprinted nanoparticles.

  20. [Comparative study on alkaloids of tissue-culture seedlings and wild plants of Dendrobium huoshanense].

    PubMed

    Chen, Nai-dong; Gao, Feng; Lin, Xin; Jin, Hui

    2014-06-01

    To compare the composition and content of alkaloids of Dendrobium huoshanense tissue-culture seedlings and wild plants, a comparative evaluation of quality, covering both the composition and the content of alkaloids, was carried out by HPLC and TLC. Remarkable variation existed between the two kinds of Dendrobium huoshanense. For the tissue-culture plant, only two alkaloids were detected by both HPLC and TLC, while four alkaloids were observed in the wild plant. The alkaloid contents of tissue-culture seedlings and wild plants were (0.29 ± 0.11)‰ and (0.43 ± 0.15)‰, respectively. Distinct differences were observed in both the composition and content of alkaloids from the annual shoots of different provenances of Dendrobium huoshanense. This suggests that the quality of tissue-culture seedlings of Dendrobium huoshanense may be inconsistent with that of the wild plant. Furthermore, the established alkaloid-knock-out HPLC method provides a new research tool for quality control of Chinese medicinal materials that contain unknown alkaloids.
