Online Learner Engagement: Opportunities and Challenges with Using Data Analytics
ERIC Educational Resources Information Center
Bodily, Robert; Graham, Charles R.; Bush, Michael D.
2017-01-01
This article describes the crossroads between learning analytics and learner engagement. The authors do this by describing specific challenges of using analytics to support student engagement from three distinct perspectives: pedagogical considerations, technological issues, and interface design concerns. While engaging online learners presents a…
Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A
2017-05-10
Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges, including dealing with a class of complex molecules that can exhibit multiple isomers and variable charge states, and the absence of established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources. Copyright © 2017. Published by Elsevier B.V.
Zakaria, Rosita; Allen, Katrina J; Koplin, Jennifer J; Roche, Peter; Greaves, Ronda F
2016-12-01
Through the introduction of advanced analytical techniques and improved throughput, the scope of dried blood spot testing utilising mass spectrometric methods has broadly expanded. Clinicians and researchers have become very enthusiastic about the potential of dried blood spot based mass spectrometric applications. Analysts, on the other hand, face challenges of sensitivity, reproducibility and overall accuracy of dried blood spot quantification. In this review, we aim to bring together these two facets to discuss the advantages and current challenges of non-newborn screening applications of dried blood spot quantification by mass spectrometry. To address these aims we performed a keyword search of the PubMed and MEDLINE online databases in conjunction with individual manual searches to gather information. Keywords for the initial search were "blood spot" and "mass spectrometry", excluding "newborn" and "neonate". In addition, searches were restricted to English-language, human-specific studies, with no time period limit applied. As a result of these selection criteria, 194 references were identified for review. For presentation, this information is divided into: 1) clinical applications; and 2) analytical considerations across the total testing process, comprising pre-analytical, analytical and post-analytical considerations. DBS analysis using MS applications is now broadly applied, with drug monitoring for both therapeutic and toxicological analysis being the most extensively reported. Several parameters can affect the accuracy of DBS measurement, and further bridging experiments are required to develop adjustment rules for comparability between dried blood spot measures and the equivalent serum/plasma values. Likewise, the establishment of independent reference intervals for the dried blood spot sample matrix is required.
Biosensor technology: technology push versus market pull.
Luong, John H T; Male, Keith B; Glennon, Jeremy D
2008-01-01
Biosensor technology is based on a specific biological recognition element in combination with a transducer for signal processing. Since their inception, biosensors have been expected to play a significant analytical role in medicine, agriculture, food safety, homeland security, and environmental and industrial monitoring. However, the commercialization of biosensor technology has significantly lagged behind the research output, as reflected by a plethora of publications and patenting activities. The rationale behind the slow and limited technology transfer could be attributed to cost considerations and some key technical barriers. Analytical chemistry has changed considerably, driven by automation, miniaturization, and system integration with high throughput for multiple tasks. Such requirements pose a great challenge for biosensor technology, which is often designed to detect a single or a few target analytes. Successful biosensors must be versatile enough to support interchangeable biorecognition elements, and miniaturization must be feasible to allow automation for parallel sensing with ease of operation at a competitive cost. A significant upfront investment in research and development is a prerequisite for the commercialization of biosensors. Progress in such endeavors has been incremental, with limited success; thus, market entry for a new venture is very difficult unless a niche product can be developed with a considerable market volume.
Siegel, David; Permentier, Hjalmar; Reijngoud, Dirk-Jan; Bischoff, Rainer
2014-09-01
This review deals with chemical and technical challenges in the analysis of small-molecule metabolites involved in central carbon and energy metabolism via liquid-chromatography mass-spectrometry (LC-MS). The covered analytes belong to the prominent pathways in biochemical carbon oxidation such as glycolysis or the tricarboxylic acid cycle and, for the most part, share unfavorable properties such as a high polarity, chemical instability or metal affinity. The topic is introduced by selected examples on successful applications of metabolomics in the clinic. In the core part of the paper, the structural features of important analyte classes such as nucleotides, coenzyme A thioesters or carboxylic acids are linked to "problematic hotspots" along the analytical chain (sample preparation and storage, separation and detection). We discuss these hotspots from a chemical point of view, covering issues such as analyte degradation or interactions with metals and other matrix components. Based on this understanding we propose solutions wherever available. A major notion derived from these considerations is that comprehensive carbon metabolomics inevitably requires multiple, complementary analytical approaches covering different chemical classes of metabolites. Copyright © 2013 Elsevier B.V. All rights reserved.
Cismesia, Adam P.; Bailey, Laura S.; Bell, Matthew R.; Tesler, Larry F.; Polfer, Nicolas C.
2016-01-01
The detailed chemical information contained in the vibrational spectrum of a cryogenically cooled analyte would, in principle, make infrared (IR) ion spectroscopy a gold standard technique for molecular identification in mass spectrometry. Despite this immense potential, there are considerable challenges in both instrumentation and methodology to overcome before the technique is analytically useful. Here, we discuss the promise of IR ion spectroscopy for small molecule analysis in the context of metabolite identification. Experimental strategies to address sensitivity constraints, poor overall duty cycle, and speed of the experiment are intimately tied to the development of a mass-selective cryogenic trap. Therefore, the most likely avenues for success, in the authors' opinion, are presented here, alongside alternative approaches and some thoughts on data interpretation. PMID:26975370
ERIC Educational Resources Information Center
Duncan, Garrett Albert
2000-01-01
States that moral educators can learn from North Americans who have challenged U.S. human rights violations, especially violations within the United States. Uses race as an analytical tool to illustrate human rights abuses. Concludes by discussing the implications for crossing boundaries between human rights work and moral education. (CMK)
Hess, Jeremy J.; Ebi, Kristie L.; Markandya, Anil; Balbus, John M.; Wilkinson, Paul; Haines, Andy; Chalabi, Zaid
2014-01-01
Background: Policy decisions regarding climate change mitigation are increasingly incorporating the beneficial and adverse health impacts of greenhouse gas emission reduction strategies. Studies of such co-benefits and co-harms involve modeling approaches requiring a range of analytic decisions that affect the model output. Objective: Our objective was to assess analytic decisions regarding model framework, structure, choice of parameters, and handling of uncertainty when modeling health co-benefits, and to make recommendations for improvements that could increase policy uptake. Methods: We describe the assumptions and analytic decisions underlying models of mitigation co-benefits, examining their effects on modeling outputs, and consider tools for quantifying uncertainty. Discussion: There is considerable variation in approaches to valuation metrics, discounting methods, uncertainty characterization and propagation, and assessment of low-probability/high-impact events. There is also variable inclusion of adverse impacts of mitigation policies, and limited extension of modeling domains to include implementation considerations. Going forward, co-benefits modeling efforts should be carried out in collaboration with policy makers; these efforts should include the full range of positive and negative impacts and critical uncertainties, as well as a range of discount rates, and should explicitly characterize uncertainty. We make recommendations to improve the rigor and consistency of modeling of health co-benefits. Conclusion: Modeling health co-benefits requires systematic consideration of the suitability of model assumptions, of what should be included and excluded from the model framework, and how uncertainty should be treated. Increased attention to these and other analytic decisions has the potential to increase the policy relevance and application of co-benefits modeling studies, potentially helping policy makers to maximize mitigation potential while simultaneously improving health. Citation: Remais JV, Hess JJ, Ebi KL, Markandya A, Balbus JM, Wilkinson P, Haines A, Chalabi Z. 2014. Estimating the health effects of greenhouse gas mitigation strategies: addressing parametric, model, and valuation challenges. Environ Health Perspect 122:447–455; http://dx.doi.org/10.1289/ehp.1306744 PMID:24583270
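Since the choice of discount rate is one of the analytic decisions highlighted above, the standard exponential discounting of a benefit stream is worth stating explicitly (a general formula, not one specific to this paper):

```latex
\[
\mathrm{PV} = \sum_{t=0}^{T} \frac{B_t}{(1 + r)^{t}}
\]
% B_t: monetized health co-benefit accruing in year t; r: annual discount rate;
% PV: present value of the benefit stream over the modeling horizon T.
```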
Analytic Considerations and Design Basis for the IEEE Distribution Test Feeders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schneider, K. P.; Mather, B. A.; Pal, B. C.
For nearly 20 years the Test Feeder Working Group of the Distribution System Analysis Subcommittee has been developing openly available distribution test feeders for use by researchers. The purpose of these test feeders is to provide models of distribution systems that reflect the wide diversity in design and their various analytic challenges. Because of their utility and accessibility, the test feeders have been used for a wide range of research, some of which has been outside the original scope of intended uses. This paper provides an overview of the existing distribution feeder models and clarifies the specific analytic challenges that they were originally designed to examine. Additionally, the paper will provide guidance on which feeders are best suited for various types of analysis. The purpose of this paper is to provide the original intent of the Working Group and to provide the information necessary so that researchers may make an informed decision on which of the test feeders are most appropriate for their work.
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in providing the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology.
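The abstract does not include code; as a rough illustration of the rule-based flavor of semantics-assisted model selection (dataset properties, rule conditions, and technique names below are all hypothetical, not the paper's Analytics Ontology), a minimal sketch:

```python
# Toy sketch: map dataset characteristics to candidate modeling techniques via
# declarative applicability rules, returning each choice with its rationale.

DATASET = {"response": "continuous", "predictors": "mixed", "n_rows": 5_000_000}

# Each rule states the condition under which a technique is applicable.
TECHNIQUE_RULES = {
    "multiple_linear_regression": lambda d: d["response"] == "continuous",
    "logistic_regression":        lambda d: d["response"] == "binary",
    "random_forest":              lambda d: True,  # broadly applicable
    "ridge_regression":           lambda d: d["response"] == "continuous",
}

def select_techniques(dataset):
    """Return applicable techniques together with a short rationale string."""
    selected = []
    for name, rule in TECHNIQUE_RULES.items():
        if rule(dataset):
            selected.append((name, f"applicability conditions satisfied for {name}"))
    return selected

for technique, rationale in select_techniques(DATASET):
    print(technique, "-", rationale)
```

A real ontology-backed system would replace the lambda rules with inferencing over formally described technique classes, but the selection-plus-rationale output shape is the same.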
On the intersection of phonetic detail and the organization of interaction: clinical connections.
Walker, Gareth; Local, John
2013-01-01
The analysis of language use in real-world contexts poses particular methodological challenges. We codify responses to these challenges as a series of methodological imperatives. To demonstrate the relevance of these imperatives to clinical investigation, we present analyses of single episodes of interaction where one participant has a speech and/or language impairment: atypical prosody, echolalia and dysarthria. We demonstrate there is considerable heuristic and analytic value in taking this approach to analysing the organization of interaction involving individuals with a speech and/or language impairment.
An update on pharmaceutical film coating for drug delivery.
Felton, Linda A; Porter, Stuart C
2013-04-01
Pharmaceutical coating processes have generally been transformed from what was essentially an art form in the mid-twentieth century to a much more technology-driven process. This review article provides a basic overview of current film coating processes, including a discussion of polymer selection, coating formulation additives and processing equipment. Substrate considerations for pharmaceutical coating processes are also presented. While polymeric coating operations are commonplace in the pharmaceutical industry, film coating processes are still not fully understood, which presents serious challenges with current regulatory requirements. Novel analytical technologies and various modeling techniques that are being used to better understand film coating processes are discussed. This review article also examines the challenges of implementing process analytical technologies in coating operations, the incorporation of active pharmaceutical ingredients in polymer film coatings, the use of high-solids coating systems, and continuous coating and other novel coating application methods.
Improving early cycle economic evaluation of diagnostic technologies.
Steuten, Lotte M G; Ramsey, Scott D
2014-08-01
The rapidly increasing range and expense of new diagnostics compel consideration of a different, more proactive approach to health economic evaluation of diagnostic technologies. Early cycle economic evaluation is a decision analytic approach to evaluating technologies in development so as to increase the return on investment as well as the patient and societal impact. This paper describes examples of 'early cycle economic evaluations' as applied to diagnostic technologies and highlights challenges in its real-time application. It shows that, especially in the field of diagnostics, with rapid technological developments and a changing regulatory climate, early cycle economic evaluation can play a guiding role in improving the efficiency of the diagnostics innovation process. In the next five years, attention will move beyond the methodological and analytic challenges of early cycle economic evaluation towards the challenge of effectively applying it to improve diagnostic research and development and patient value. Future work in this area should therefore be 'strong on principles and soft on metrics', that is, use the metrics that resonate most clearly with the various decision makers in this field.
The Occurrence of Veterinary Pharmaceuticals in the Environment: A Review
Kaczala, Fabio; Blum, Shlomo E.
2016-01-01
It is well known that there is widespread use of veterinary pharmaceuticals and consequent release into different ecosystems such as freshwater bodies and groundwater systems. Furthermore, the use of organic fertilizers produced from animal waste manure has also been responsible for the occurrence of veterinary pharmaceuticals in agricultural soils. This article is a review of different studies focused on the detection and quantification of such compounds in environmental compartments using different analytical techniques. Furthermore, this paper reports the main challenges regarding veterinary pharmaceuticals in terms of analytical methods, detection/quantification of parent compounds and metabolites, and risks/toxicity to human health and aquatic ecosystems. Based on the existing literature, it is clear that only limited data are available regarding veterinary compounds and there are still considerable gaps to be bridged in order to remediate existing problems and prevent future ones. In terms of analytical methods, there are still considerable challenges to overcome considering the large number of existing compounds and their respective metabolites. A number of studies highlight the lack of attention given to the detection and quantification of transformation products and metabolites. Furthermore, more attention needs to be given to the toxic effects and potential risks that veterinary compounds pose to environmental and human health. To conclude, the more research is focused on these subjects in the near future, the more rapidly we will gain a better understanding of the behavior of these compounds, the real risks they pose to aquatic and terrestrial environments, and how to properly tackle them. PMID:28579931
Translation of proteomic biomarkers into FDA approved cancer diagnostics: issues and challenges
2013-01-01
Tremendous efforts have been made over the past few decades to discover novel cancer biomarkers for use in clinical practice. However, a striking discrepancy exists between the effort directed toward biomarker discovery and the number of markers that make it into clinical practice. One of the confounding issues in translating a novel discovery into clinical practice is that quite often the scientists working on biomarker discovery have limited knowledge of the analytical, diagnostic, and regulatory requirements for a clinical assay. This review provides an introduction to such considerations with the aim of generating more extensive discussion for study design, assay performance, and regulatory approval in the process of translating new proteomic biomarkers from discovery into cancer diagnostics. We first describe the analytical requirements for a robust clinical biomarker assay, including concepts of precision, trueness, specificity and analytical interference, and carryover. We next introduce the clinical considerations of diagnostic accuracy, receiver operating characteristic analysis, positive and negative predictive values, and clinical utility. We finish the review by describing components of the FDA approval process for protein-based biomarkers, including classification of biomarker assays as medical devices, analytical and clinical performance requirements, and the approval process workflow. While we recognize that the road from biomarker discovery, validation, and regulatory approval to the translation into the clinical setting could be long and difficult, the reward for patients, clinicians and scientists could be rather significant. PMID:24088261
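As a concrete anchor for the predictive-value concepts mentioned above, the textbook formulas relating PPV and NPV to sensitivity (Se), specificity (Sp), and disease prevalence (p) are (standard definitions, not equations taken from the review):

```latex
\[
\mathrm{PPV} = \frac{Se \cdot p}{Se \cdot p + (1 - Sp)(1 - p)},
\qquad
\mathrm{NPV} = \frac{Sp \, (1 - p)}{Sp \, (1 - p) + (1 - Se) \, p}
\]
```

These make explicit why a biomarker assay with excellent analytical performance can still yield a poor PPV when the target condition is rare.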
Polarized Heliospheric Imaging: Lessons, Benefits, Challenges, and Status (Invited)
NASA Astrophysics Data System (ADS)
DeForest, C. E.; Howard, T. A.
2013-12-01
STEREO has delivered on the promise of continuous, photometric imaging of coronal and heliospheric transients from Sun to Earth. It is time to explore polarized heliospheric imaging. Applications include 3-D location of individual features and improved separation of signal from background. These scientific applications have different advantages and challenges in the heliosphere than the corona. We present analytical and numerical results on 3-D location of features both large and small with polarized heliospheric imaging; describe advantages to polarimetry for both in-ecliptic and out-of-ecliptic missions; and discuss some of the design considerations for PHI-C, our proposed mission to prototype this technology from LEO.
Bridge, Julia A
2017-01-01
The introduction of molecular testing into cytopathology laboratory practice has expanded the types of samples considered feasible for identifying genetic alterations that play an essential role in cancer diagnosis and treatment. Reverse transcription-polymerase chain reaction (RT-PCR), a sensitive and specific technical approach for amplifying a defined segment of RNA after it has been reverse-transcribed into its DNA complement, is commonly used in clinical practice for the identification of recurrent or tumor-specific fusion gene events. Real-time RT-PCR (quantitative RT-PCR), a technical variation, also permits the quantitation of products generated during each cycle of the polymerase chain reaction process. This review addresses qualitative and quantitative pre-analytic and analytic considerations of RT-PCR as they relate to various cytologic specimens. An understanding of these aspects of genetic testing is central to attaining optimal results in the face of the challenges that cytology specimens may present. Cancer Cytopathol 2017;125:11-19. © 2016 American Cancer Society.
VAST Challenge 2016: Streaming Visual Analytics
2016-10-25
understand rapidly evolving situations. To support such tasks, visual analytics solutions must move well beyond systems that simply provide real-time… received. Mini-Challenge 1: Design Challenge. Mini-Challenge 1 focused on systems to support security and operational analytics at the Euybia… Challenge 1 was to solicit novel approaches for streaming visual analytics that push the boundaries for what constitutes a visual analytics system, and to…
Fu, Wei; Shi, Qiyuan; Prosperi, Christine; Wu, Zhenke; Hammitt, Laura L.; Feikin, Daniel R.; Baggett, Henry C.; Howie, Stephen R.C.; Scott, J. Anthony G.; Murdoch, David R.; Madhi, Shabir A.; Thea, Donald M.; Brooks, W. Abdullah; Kotloff, Karen L.; Li, Mengying; Park, Daniel E.; Lin, Wenyi; Levine, Orin S.; O’Brien, Katherine L.; Zeger, Scott L.
2017-01-01
In pneumonia, specimens are rarely obtained directly from the infection site, the lung, so the pathogen causing infection is determined indirectly from multiple tests on peripheral clinical specimens, which may have imperfect and uncertain sensitivity and specificity; inference about the cause is therefore complex. Analytic approaches have included expert review of case-only results, case–control logistic regression, latent class analysis, and attributable fraction, but each has serious limitations and none naturally integrates multiple test results. The Pneumonia Etiology Research for Child Health (PERCH) study required an analytic solution appropriate for a case–control design that could incorporate evidence from multiple specimens from cases and controls and that accounted for measurement error. We describe a Bayesian integrated approach we developed that combined and extended elements of attributable fraction and latent class analyses to meet some of these challenges, and illustrate the advantage it confers regarding the challenges identified for other methods. PMID:28575370
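To make the measurement-error point concrete, here is a minimal single-test sketch (illustrative numbers only; this is just Bayes' theorem for one imperfect test, not the PERCH integrated model, which combines many tests across cases and controls):

```python
# Minimal sketch: posterior probability that pathogen A caused the case,
# given one imperfect peripheral test with known sensitivity/specificity.

def posterior_cause(prior, sensitivity, specificity, test_positive=True):
    """P(cause = A | test result) for one binary test with known error rates."""
    if test_positive:
        p_given_cause = sensitivity          # true positive rate
        p_given_other = 1.0 - specificity    # false positive rate
    else:
        p_given_cause = 1.0 - sensitivity    # false negative rate
        p_given_other = specificity          # true negative rate
    num = p_given_cause * prior
    return num / (num + p_given_other * (1.0 - prior))

# Illustrative numbers: 30% prior etiologic fraction for pathogen A, and a
# peripheral-specimen test that is 80% sensitive and 95% specific.
print(posterior_cause(0.30, 0.80, 0.95))   # ~0.87
```

Even this one-test case shows how uncertain sensitivity and specificity propagate directly into the etiologic inference, which is the problem the integrated approach addresses at scale.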
Irrgeher, Johanna; Prohaska, Thomas
2016-01-01
Analytical ecogeochemistry is an evolving scientific field dedicated to the development of analytical methods and tools and their application to ecological questions. Traditional stable isotopic systems have been widely explored and have undergone continuous development during the last century. The variations of the isotopic composition of light elements (H, O, N, C, and S) have provided the foundation of stable isotope analysis followed by the analysis of traditional geochemical isotope tracers (e.g., Pb, Sr, Nd, Hf). Questions in a considerable diversity of scientific fields have been addressed, many of which can be assigned to the field of ecogeochemistry. Over the past 15 years, other stable isotopes (e.g., Li, Zn, Cu, Cl) have emerged gradually as novel tools for the investigation of scientific topics that arise in ecosystem research and have enabled novel discoveries and explorations. These systems are often referred to as non-traditional isotopes. The small isotopic differences of interest that are increasingly being addressed for a growing number of isotopic systems represent a challenge to the analytical scientist and push the limits of today's instruments constantly. This underlines the importance of a metrologically sound concept of analytical protocols and procedures and a solid foundation of data processing strategies and uncertainty considerations before these small isotopic variations can be interpreted in the context of applied ecosystem research. This review focuses on the development of isotope research in ecogeochemistry, the requirements for successful detection of small isotopic shifts, and highlights the most recent and innovative applications in the field.
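For orientation, the small isotopic shifts discussed above are conventionally reported in delta notation relative to a standard (a general convention in stable isotope analysis, not a formula specific to this review):

```latex
\[
\delta^{i}E \;(\text{in ‰}) = \left( \frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1 \right) \times 1000
\]
% R: abundance ratio of the heavier isotope i of element E to a lighter
% reference isotope, measured in the sample and in the reference standard.
```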
Laboratory approach for diagnosis of toluene-based inhalant abuse in a clinical setting
Jain, Raka; Verma, Arpita
2016-01-01
The steady increase of inhalant abuse is a great challenge for analytical toxicologists. This review describes an overview of inhalant abuse including the extent of the problem, types of products abused, modes of administration, pharmacology and effects of inhalants, the role of laboratory, interpretation of laboratory results and clinical considerations. Regular laboratory screening for inhalant abuse as well as other substance abuse and health risk behaviors must be a part of standard clinical care. PMID:26957863
Big Data: transforming drug development and health policy decision making.
Alemayehu, Demissie; Berger, Marc L
The explosion of data sources, accompanied by the evolution of technology and analytical techniques, has created considerable challenges and opportunities for drug development and healthcare resource utilization. We present a systematic overview of these phenomena, and suggest measures to be taken for effective integration of the new developments into the traditional medical research paradigm and health policy decision making. Special attention is paid to pertinent issues in emerging areas, including rare disease drug development, personalized medicine, Comparative Effectiveness Research, and privacy and confidentiality concerns.
Challenges for mapping cyanotoxin patterns from remote sensing of cyanobacteria
Stumpf, Rick P; Davis, Timothy W.; Wynne, Timothy T.; Graham, Jennifer L.; Loftin, Keith A.; Johengen, T.H.; Gossiaux, D.; Palladino, D.; Burtner, A.
2016-01-01
Using satellite imagery to quantify the spatial patterns of cyanobacterial toxins presents several challenges. These include the need for surrogate pigments (since cyanotoxins cannot be directly detected by remote sensing), the variability in the relationship between the pigments and cyanotoxins (especially microcystins, MC), and the lack of standardization of the various measurement methods. A dual-model strategy can provide an approach to address these challenges. One model uses either chlorophyll-a (Chl-a) or phycocyanin (PC) collected in situ as a surrogate to estimate the MC concentration. The other uses a remote sensing algorithm to estimate the concentration of the surrogate pigment. Where blooms are mixtures of cyanobacteria and eukaryotic algae, PC should be preferred to Chl-a as the surrogate. Where cyanobacteria dominate, Chl-a is a better surrogate than PC for remote sensing: phycocyanin is less sensitive to detection by optical remote sensing, is less frequently measured, lacks standardized laboratory methods, and shows greater intracellular variability. Neither pigment should be presumed to have a fixed relationship with MC for any water body. The MC-pigment relationship can be valid over weeks, but shows considerable intra- and inter-annual variability due to changes in the amount of MC produced relative to cyanobacterial biomass. To detect pigments by satellite, three classes of algorithms (analytic, semi-analytic, and derivative) have been used. Analytic and semi-analytic algorithms are more sensitive but less robust than derivatives because they depend on accurate atmospheric correction; as a result, derivatives are more commonly used. Derivatives can estimate Chl-a concentration, and research suggests they can detect and possibly quantify PC. Derivative algorithms, however, need to be standardized in order to evaluate the reproducibility of parameterizations between lakes. A strategy for producing useful estimates of microcystins from cyanobacterial biomass is described, provided cyanotoxin variability is addressed.
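A minimal numerical sketch of the dual-model strategy (all concentrations and the fitted coefficients below are invented for illustration; the actual MC-pigment relationship is lake- and time-specific, as the abstract stresses):

```python
# Dual-model sketch: (1) regress microcystin (MC) on an in situ surrogate
# pigment, (2) apply that relation to pigment concentrations retrieved by a
# remote sensing algorithm.
import numpy as np

# Step 1: in situ pairs of phycocyanin (ug/L) and microcystin (ug/L).
pc_insitu = np.array([5.0, 12.0, 30.0, 55.0, 80.0])
mc_insitu = np.array([0.4, 1.1, 2.9, 5.2, 7.6])
slope, intercept = np.polyfit(pc_insitu, mc_insitu, 1)  # MC ~ slope*PC + intercept

# Step 2: pigment concentrations estimated from satellite imagery.
pc_satellite = np.array([10.0, 40.0, 70.0])
mc_estimated = slope * pc_satellite + intercept
print(np.round(mc_estimated, 2))
```

The key caveat carries over from the abstract: the step-1 regression must be refit as the MC-to-biomass ratio drifts within and between seasons.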
SVM-Based System for Prediction of Epileptic Seizures from iEEG Signal
Cherkassky, Vladimir; Lee, Jieun; Veber, Brandon; Patterson, Edward E.; Brinkmann, Benjamin H.; Worrell, Gregory A.
2017-01-01
Objective This paper describes a data-analytic modeling approach for prediction of epileptic seizures from intracranial electroencephalogram (iEEG) recording of brain activity. Even though it is widely accepted that statistical characteristics of iEEG signal change prior to seizures, robust seizure prediction remains a challenging problem due to subject-specific nature of data-analytic modeling. Methods Our work emphasizes understanding of clinical considerations important for iEEG-based seizure prediction, and proper translation of these clinical considerations into data-analytic modeling assumptions. Several design choices during pre-processing and post-processing are considered and investigated for their effect on seizure prediction accuracy. Results Our empirical results show that the proposed SVM-based seizure prediction system can achieve robust prediction of preictal and interictal iEEG segments from dogs with epilepsy. The sensitivity is about 90–100%, and the false-positive rate is about 0–0.3 times per day. The results also suggest good prediction is subject-specific (dog or human), in agreement with earlier studies. Conclusion Good prediction performance is possible only if the training data contain sufficiently many seizure episodes, i.e., at least 5–7 seizures. Significance The proposed system uses subject-specific modeling and unbalanced training data. This system also utilizes three different time scales during training and testing stages. PMID:27362758
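As a schematic of this kind of pipeline (synthetic features stand in for iEEG-derived ones; the kernel, features, and post-processing here are placeholders, not the authors' choices):

```python
# Sketch of a subject-specific SVM classifier for preictal vs. interictal
# feature vectors, with unbalanced training data as described above.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Unbalanced training set: many interictal (label 0), few preictal (label 1).
X = np.vstack([rng.normal(0.0, 1.0, (900, 8)), rng.normal(0.8, 1.0, (100, 8))])
y = np.array([0] * 900 + [1] * 100)

# class_weight="balanced" compensates for the unbalanced classes.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", class_weight="balanced"))
clf.fit(X, y)

segment = rng.normal(0.8, 1.0, (1, 8))   # a new iEEG-derived feature vector
print("preictal" if clf.predict(segment)[0] == 1 else "interictal")
```

In practice the paper's point stands: such a model must be trained per subject, with enough seizure episodes represented in the training data.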
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-23
NUCLEAR REGULATORY COMMISSION [Docket No. 030-05154; NRC-2010-0056] Notice of Consideration of Amendment Request for Decommissioning of Analytical Bio-Chemistry Laboratories, Inc. Sanitary Lagoon… license amendment to Byproduct Material License No. 24-13365-01 issued to Analytical Bio-Chemistry…
Schnohr, Christina W; Molcho, Michal; Rasmussen, Mette; Samdal, Oddrun; de Looze, Margreet; Levin, Kate; Roberts, Chris J; Ehlinger, Virginie; Krølner, Rikke; Dalmasso, Paola; Torsheim, Torbjørn
2015-04-01
This article presents the scope and development of the Health Behaviour in School-aged Children (HBSC) study, reviews trend papers published on international HBSC data up to 2012, and discusses the efforts made to produce reliable trend analyses. The major goal of this article is to present the statistical procedures and analytical strategies for upholding high data quality, as well as reflections from the authors on how to produce reliable trends based on an international study of the magnitude of HBSC. HBSC is an international cross-sectional study collecting data from adolescents aged 11-15 years on a broad variety of health determinants and health behaviours. A number of methodological challenges have stemmed from the growth of the HBSC study, in particular given its focus on monitoring trends; some of those challenges are considered here. When analysing trends, researchers must be able to assess whether a change in prevalence is an expression of an actual change in the observed outcome, whether it is a result of methodological artefacts, or whether it is due to changes in the conceptualization of the outcome by the respondents. The article presents recommendations that take a number of these considerations into account; they represent methodological challenges that are core issues in undertaking trend analyses. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
New roles and challenges within the healthcare workforce: a Heideggerian perspective.
Wilson, Anthea
2015-01-01
The purpose of this paper is to explore insights, based on the phenomenology of Martin Heidegger, into the dynamic relationships between human experience and work roles. Drawing on the findings of a hermeneutic phenomenological study of nurse mentors, the topics of new roles and role challenges are explored, along with a consideration of their relevance to wider issues of workforce redesign. Heidegger's philosophy of Dasein, in particular his concepts of inauthentic and authentic self, provided an interpretational lens. This paper applies these philosophical concepts to challenges associated with a changing workforce. Concepts elaborating human existence as proposed by Heidegger may offer analytic structures for understanding shifts in the lived experience of a changing workplace. In particular, the concepts could help managers to explore the implications of introducing novel work roles or extending existing ones. The understanding gained can also extend to situations where work practices may need to be challenged. As work roles and skill mix undergo rapid shifts, this paper offers an original way of understanding the experience of work roles.
NASA Astrophysics Data System (ADS)
Monti, Alessio; Toscano, Alessandro; Bilotti, Filiberto
2017-06-01
The introduction of nanoparticle-based screens [C. W. Hsu, Nat. Commun. 5, 3152 (2014)] has paved the way to the realization of low-cost transparent displays with a wide viewing angle and scalability to large size. Despite the huge potential of this approach, the design of a nanoparticle array exhibiting a sharp scattering response in the optical spectrum is still a challenging task. In this manuscript, we investigate the suitability of ellipsoidal plasmonic nanoparticles for this purpose. First, we show that trade-offs between the sharpness of the scattering response of the array and its absorption level apply. Starting from these considerations, we prove that prolate nanoparticles may be a plausible candidate for achieving the peculiar features required in transparent screen applications. An example of a full-color and almost-isotropic transparent screen is finally proposed, and its robustness towards the geometrical inaccuracies that may arise during the fabrication process is assessed. All the analytical considerations, carried out through an analytical model taking into account the surface dispersion effect affecting the nanoparticles, are supported by a proper set of full-wave simulations.
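For context, the quasi-static (electrostatic-approximation) polarizability of an ellipsoidal particle along principal axis i is the standard starting point for such designs (a textbook result stated here for orientation; the authors' model additionally includes surface-dispersion corrections):

```latex
\[
\alpha_i = 4\pi a_1 a_2 a_3 \,
\frac{\varepsilon - \varepsilon_m}{3\varepsilon_m + 3 L_i \left( \varepsilon - \varepsilon_m \right)},
\qquad \sum_{i=1}^{3} L_i = 1
\]
% a_1, a_2, a_3: semi-axes of the ellipsoid; L_i: geometrical depolarization
% factors; epsilon, epsilon_m: particle and host permittivities. The axis
% dependence of L_i is what splits the resonances of prolate particles.
```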
Konstantinidis, Spyridon; Heldin, Eva; Chhatre, Sunil; Velayudhan, Ajoy; Titchener-Hooker, Nigel
2012-01-01
High throughput approaches to facilitate the development of chromatographic separations have now been adopted widely in the biopharmaceutical industry, but issues of how to reduce the associated analytical burden remain. For example, acquiring experimental data by high level factorial designs in 96-well plates can place considerable strain upon assay capabilities, generating a bottleneck that significantly limits the speed of process characterization. This article proposes an approach designed to counter this challenge: Strategic Assay Deployment (SAD). In SAD, a set of available analytical methods is investigated to determine which set of techniques is the most appropriate to use and how best to deploy these to reduce the consumption of analytical resources while still enabling accurate and complete process characterization. The approach is demonstrated by investigating how salt concentration and pH affect the binding of green fluorescent protein from Escherichia coli homogenate to an anion exchange resin presented in a 96-well filter plate format. Compared with the deployment of routinely used analytical methods alone, the application of SAD reduced the total assay time and total assay material consumption by at least 40% and 5%, respectively. SAD has significant utility in accelerating bioprocess development activities. Copyright © 2012 American Institute of Chemical Engineers (AIChE).
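A toy sketch of the deployment decision at the heart of SAD (the assay names, times, and coverage sets below are invented, and the paper's selection also weighs accuracy, not just cost):

```python
# Pick the cheapest subset of assays that still measures every required
# quantity -- brute force is fine at this scale.
from itertools import combinations

# assay -> (hours per plate, set of quantities it reports)
ASSAYS = {
    "fluorescence":  (1.0, {"product"}),
    "total_protein": (0.5, {"total_protein"}),
    "HPLC":          (6.0, {"product", "impurities"}),
    "SDS-PAGE":      (4.0, {"impurities", "total_protein"}),
}
REQUIRED = {"product", "impurities", "total_protein"}

best = None
for r in range(1, len(ASSAYS) + 1):
    for combo in combinations(ASSAYS, r):
        covered = set().union(*(ASSAYS[a][1] for a in combo))
        if covered >= REQUIRED:
            cost = sum(ASSAYS[a][0] for a in combo)
            if best is None or cost < best[0]:
                best = (cost, combo)

print(best)  # minimal total assay time and the assays to deploy
```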
Silvestri, Erin E; Yund, Cynthia; Taft, Sarah; Bowling, Charlena Yoder; Chappie, Daniel; Garrahan, Kevin; Brady-Roberts, Eletha; Stone, Harry; Nichols, Tonya L
2017-01-01
In the event of an indoor release of an environmentally persistent microbial pathogen such as Bacillus anthracis, the potential for human exposure will be considered when remedial decisions are made. Microbial site characterization and clearance sampling data collected in the field might be used to estimate exposure. However, there are many challenges associated with estimating environmental concentrations of B. anthracis or other spore-forming organisms after such an event before being able to estimate exposure. These challenges include: (1) collecting environmental field samples that are adequate for the intended purpose, (2) conducting laboratory analyses and selecting the reporting format needed for the laboratory data, and (3) analyzing and interpreting the data using appropriate statistical techniques. This paper summarizes some key challenges faced in collecting, analyzing, and interpreting microbial field data from a contaminated site. Although the paper was written with considerations for B. anthracis contamination, it may also be applicable to other bacterial agents. It explores the implications and limitations of using field data for determining environmental concentrations both before and after decontamination. Several findings were of interest. First, to date, the only validated surface/sampling device combinations are swabs and sponge-sticks on stainless steel surfaces, thus limiting the availability of quantitative analytical results that could be used for statistical analysis. Second, agreement needs to be reached with the analytical laboratory on the definition of the countable range and on the reporting of data below the limit of quantitation. Finally, the distribution of the microbial field data and the statistical methods needed for a particular data set could vary depending on the data collected, and guidance is needed on appropriate statistical software for handling microbial data. Further research is needed to develop better methods to estimate human exposure to pathogens using environmental data collected from a field setting. PMID:26883476
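To illustrate the below-LOQ reporting issue raised above, a small sketch (the counts and LOQ are invented) showing how summary statistics shift with the substitution rule; likelihood-based censored-data estimators would be the more rigorous route:

```python
# Sensitivity of a summary statistic to how "<LOQ" results are substituted.
import numpy as np

LOQ = 10.0                                   # assumed limit of quantitation
measured = np.array([25.0, 40.0, 12.0])      # quantifiable results
n_below = 3                                  # samples reported "<LOQ"

for label, substitute in [("zero", 0.0), ("LOQ/2", LOQ / 2), ("LOQ", LOQ)]:
    data = np.concatenate([measured, np.full(n_below, substitute)])
    print(f"{label:>6}: mean = {data.mean():.1f}")
# The spread across substitution rules bounds the effect of the censoring on
# the estimate -- a large spread signals that the reporting convention matters.
```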
Manier, M. Lisa; Spraggins, Jeffrey M.; Reyzer, Michelle L.; Norris, Jeremy L.; Caprioli, Richard M.
2014-01-01
Imaging mass spectrometry (IMS) studies increasingly focus on endogenous small molecular weight metabolites and consequently bring special analytical challenges. Since analytical tissue blanks do not exist for endogenous metabolites, careful consideration must be given to confirm molecular identity. Here we present approaches for the improvement in detection of endogenous amine metabolites such as amino acids and neurotransmitters in tissues through chemical derivatization and matrix-assisted laser desorption/ionization (MALDI) IMS. Chemical derivatization with 4-hydroxy-3-methoxycinnamaldehyde (CA) was used to improve sensitivity and specificity. CA was applied to the tissue via MALDI sample targets precoated with a mixture of derivatization reagent and ferulic acid (FA) as a MALDI matrix. Spatial distributions of chemically derivatized endogenous metabolites in tissue were determined by high-mass resolution and MSn imaging mass spectrometry. We highlight an analytical strategy for metabolite validation whereby tissue extracts are analyzed by high-performance liquid chromatography (HPLC)-MS/MS to unambiguously identify metabolites and distinguish them from isobaric compounds. PMID:25044893
NASA Astrophysics Data System (ADS)
Khademian, Amir; Abdollahipour, Hamed; Bagherpour, Raheb; Faramarzi, Lohrasb
2017-10-01
In addition to the numerous planning and executive challenges, underground excavation in urban areas is always followed by certain destructive effects, especially on the ground surface; ground settlement is the most important of these effects, and different empirical, analytical and numerical methods exist for its estimation. Since geotechnical models are associated with considerable model uncertainty, this study characterized the model uncertainty of settlement estimation models through a systematic comparison between model predictions and past performance data derived from instrumentation. To do so, the amount of surface settlement induced by excavation of the Qom subway tunnel was estimated via empirical (Peck), analytical (Loganathan and Poulos) and numerical (FDM) methods; the resulting maximum settlement values of these models were 1.86, 2.02 and 1.52 cm, respectively. The comparison of these predicted amounts with the actual data from instrumentation was employed to specify the uncertainty of each model. The numerical model outcomes, with a relative error of 3.8%, best matched reality, and the analytical method, with a relative error of 27.8%, yielded the highest level of model uncertainty.
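Assuming the usual definition, the relative error quoted for each model compares predicted and instrumented maximum settlement:

```latex
\[
e_r = \frac{\left| s_{\mathrm{pred}} - s_{\mathrm{obs}} \right|}{s_{\mathrm{obs}}} \times 100\%
\]
% s_pred: a model's maximum predicted settlement (1.86, 2.02 or 1.52 cm above);
% s_obs: the maximum settlement measured by instrumentation.
```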
Critical Factors in Data Governance for Learning Analytics
ERIC Educational Resources Information Center
Elouazizi, Noureddine
2014-01-01
This paper identifies some of the main challenges of data governance modelling in the context of learning analytics for higher education institutions, and discusses the critical factors for designing data governance models for learning analytics. It identifies three fundamental common challenges that cut across any learning analytics data…
Analytical difficulties facing today's regulatory laboratories: issues in method validation.
MacNeil, James D
2012-08-01
The challenges facing analytical laboratories today are not unlike those faced in the past, although both the degree of complexity and the rate of change have increased. Challenges such as the development and maintenance of expertise, the maintenance and updating of equipment, and the introduction of new test methods have always been familiar themes for analytical laboratories, but international guidelines for laboratories involved in the import and export testing of food require management of such changes in a context that includes quality assurance, accreditation, and method validation considerations. Decisions as to when a change in a method requires re-validation of the method, or on the design of a validation scheme for a complex multi-residue method, require a well-considered strategy based on current knowledge of international guidance documents and regulatory requirements, as well as the laboratory's quality system requirements. Validation demonstrates that a method is 'fit for purpose', so the requirement for validation should be assessed in terms of the intended use of a method and, in the case of change or modification of a method, whether that change or modification may affect a previously validated performance characteristic. In general, method validation involves method scope, calibration-related parameters, method precision, and recovery. Any method change which may affect method scope or any performance parameters will require re-validation. Some typical situations involving changes in methods are discussed, and a decision process is proposed for the selection of appropriate validation measures. © 2012 John Wiley & Sons, Ltd.
The Development of Modal Testing Technology for Wind Turbines: A Historical Perspective
NASA Technical Reports Server (NTRS)
James, George H., III; Carne, Thomas G.
2007-01-01
Wind turbines are very large, flexible structures, with aerodynamic forces on the rotating blades producing periodic forces with frequencies at the harmonics of the rotation frequency. Due to design considerations, these rotational frequencies are comparable to the modal frequencies; thus avoiding resonant conditions is a critical consideration. Consequently, predicting and experimentally validating the modal frequencies of wind turbines has been important to their successful design and operation. Performing modal tests on flexible structures over 120 meters tall is a substantial challenge, which has inspired innovative developments in modal test technology. A further challenge for the analyst and experimentalist is that the modal frequencies are dependent on the turbine rotation speed, so testing a parked turbine does not fully validate the analytical predictions. The history and development of this modal testing technology will be reviewed, showing historical tests and techniques, ranging from two-meter to 100-meter turbines, for both parked and rotating tests. The NExT (Natural Excitation Technique) was developed in the 1990s as a predecessor to OMA to overcome these challenges. We will trace the difficulties and successes of wind turbine modal testing over the past twenty-five years, from 1982 to the present.
Kpaibe, André P S; Ben-Ameur, Randa; Coussot, Gaëlle; Ladner, Yoann; Montels, Jérôme; Ake, Michèle; Perrin, Catherine
2017-08-01
Snake venoms constitute a very promising resource for the development of new medicines. They are mainly composed of very complex peptide and protein mixtures, whose composition may vary significantly from batch to batch. This latter consideration is a challenge for routine quality control (QC) in the pharmaceutical industry. In this paper, we report the use of capillary zone electrophoresis for the development of an analytical fingerprint methodology to assess the quality of snake venoms. The analytical fingerprint concept is widely used for the QC of herbal drugs but has so far rarely been applied to venom QC. CZE was chosen for its intrinsic efficiency in the separation of protein and peptide mixtures. The analytical fingerprint methodology was first developed and evaluated for a particular snake venom, Lachesis muta. Optimal analysis conditions required the use of a PDADMAC capillary coating to avoid protein and peptide adsorption. The same analytical conditions were then applied to other snake venom species, and a distinct electrophoretic profile was obtained for each venom. Excellent repeatability and intermediate precision were observed for each batch. Analysis of different batches of the same species revealed inherent qualitative and quantitative composition variations of the venoms between individuals. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Human Centred Design Considerations for Connected Health Devices for the Older Adult
Harte, Richard P.; Glynn, Liam G.; Broderick, Barry J.; Rodriguez-Molinero, Alejandro; Baker, Paul M. A.; McGuiness, Bernadette; O’Sullivan, Leonard; Diaz, Marta; Quinlan, Leo R.; ÓLaighin, Gearóid
2014-01-01
Connected health devices are generally designed for unsupervised use by non-healthcare professionals, facilitating independent control of the individual's own healthcare. Older adults are major users of such devices and are a population significantly increasing in size. This group presents challenges due to the wide spectrum of capabilities and attitudes towards technology. The fit between the capabilities of the user and the demands of the device can be optimised in a process called Human Centred Design. Here we review examples of connected health devices chosen by random selection, assess older adults' known capabilities and attitudes, and finally make analytical recommendations for design approaches and design specifications. PMID:25563225
Leclercq, Amélie; Nonell, Anthony; Todolí Torró, José Luis; Bresson, Carole; Vio, Laurent; Vercouter, Thomas; Chartier, Frédéric
2015-07-23
Inductively coupled plasma optical emission spectrometry (ICP-OES) and mass spectrometry (ICP-MS) are increasingly used to carry out analyses in organic/hydro-organic matrices. The introduction of such matrices into ICP sources is particularly challenging and can be the cause of numerous drawbacks. This tutorial review, divided into two parts, explores the rich literature related to the introduction of organic/hydro-organic matrices into ICP sources. Part I provided theoretical considerations associated with the physico-chemical properties of such matrices, in an attempt to understand the induced phenomena. Part II of this tutorial review is dedicated to more practical considerations on instrumentation, instrumental and operating parameters, as well as analytical strategies for elemental quantification in such matrices. Two important issues are addressed in this part. The first concerns the instrumentation and the optimization of instrumental and operating parameters, pointing out (i) the description, benefits and drawbacks of different kinds of nebulization and desolvation devices, and the impact of more specific instrumental parameters such as the injector characteristics and the cone material; and (ii) the optimization of operating parameters for both ICP-OES and ICP-MS. Although at the margin of this tutorial review, electrothermal vaporization and laser ablation will also be described briefly. The second issue is devoted to analytical strategies for elemental quantification in such matrices, with particular insight into the isotope dilution technique, which is notably used in speciation analysis by ICP-coupled separation techniques. Copyright © 2015 Elsevier B.V. All rights reserved.
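For orientation, the core isotope dilution relation takes a compact exact form when amounts are expressed in moles of the reference isotope (a standard IDMS result included here for context, not an equation taken from the review):

```latex
\[
n_x = n_y \, \frac{R_y - R_b}{R_b - R_x}
\]
% n_x, n_y: amounts of the reference isotope in the sample and in the added
% spike; R_x, R_y, R_b: isotope amount ratios (spike isotope / reference
% isotope) of the sample, the spike, and the measured blend, respectively.
```

Because the result depends only on measured ratios rather than absolute signal intensities, the technique is largely insensitive to matrix-induced signal drift, which is precisely why it is attractive for the difficult organic/hydro-organic matrices discussed above.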
Big Data Analytics Solutions: The Implementation Challenges in the Financial Services Industry
ERIC Educational Resources Information Center
Ojo, Michael O.
2016-01-01
The challenges of Big Data (BD) and Big Data Analytics (BDA) have attracted disproportionately less attention than the overwhelmingly espoused benefits and game-changing promises. While many studies have examined BD challenges across multiple industry verticals, very few have focused on the challenges of implementing BDA solutions. Fewer of these…
Are We on Our Way to Becoming a "Helicopter University"? Academics' Views on Learning Analytics
ERIC Educational Resources Information Center
Howell, Joel A.; Roberts, Lynne D.; Seaman, Kristen; Gibson, David C.
2018-01-01
Higher education institutions are developing the capacity for learning analytics. However, the technical development of learning analytics has far outpaced the consideration of ethical issues around it. We examined higher education academics' knowledge, attitudes, and concerns about the use of learning analytics through four focus…
Enabling fluorescent biosensors for the forensic identification of body fluids.
Frascione, Nunzianda; Gooch, James; Daniel, Barbara
2013-11-12
The search for body fluids often forms a crucial element of many forensic investigations. Confirming fluid presence at a scene can not only support or refute the circumstantial claims of a victim, suspect or witness, but may additionally provide a valuable source of DNA for further identification purposes. However, current biological fluid testing techniques are impaired by a number of well-characterised limitations: they often give false positives, cannot be used simultaneously, are sample-destructive, and lack the ability to visually locate fluid depositions. These disadvantages can negatively affect the outcome of a case through missed or misinterpreted evidence. Biosensors are devices able to transduce a biological recognition event into a measurable signal, resulting in real-time analyte detection. The use of innovative optical sensing technology may enable the highly specific and non-destructive detection of biological fluid depositions through interaction with several fluid-endogenous biomarkers. Despite considerable impact in a variety of analytical disciplines, biosensor application within forensic analyses remains extremely limited. This article explores a number of prospective biosensing mechanisms and outlines the challenges associated with their adaptation towards the detection of fluid-specific analytes.
Transitioning from Targeted to Comprehensive Mass Spectrometry Using Genetic Algorithms.
Jaffe, Jacob D; Feeney, Caitlin M; Patel, Jinal; Lu, Xiaodong; Mani, D R
2016-11-01
Targeted proteomic assays are becoming increasingly popular because of their robust quantitative applications enabled by internal standardization, and they can be routinely executed on high-performance mass spectrometry instrumentation. However, these assays are typically limited to hundreds of analytes per experiment. Considerable time and effort are often expended in obtaining and preparing samples prior to targeted analyses. It would be highly desirable to detect and quantify thousands of analytes in such samples using comprehensive mass spectrometry techniques (e.g., SWATH and DIA) while retaining a high degree of quantitative rigor for analytes with matched internal standards. Experimentally, it is facile to port a targeted assay to a comprehensive data acquisition technique. However, this strategy raises data analysis challenges concerning the agreement of results between the targeted and comprehensive approaches. Here, we present the use of genetic algorithms to overcome these challenges and configure hybrid targeted/comprehensive MS assays. The genetic algorithms are used to select precursor-to-fragment transitions that maximize the agreement in quantification between the targeted and the comprehensive methods. We find that the algorithm provided across-the-board improvement in the quantitative agreement between the targeted assay data and the hybrid comprehensive/targeted assay that we developed, as measured by parameters of linear models fitted to the results. We also found that the algorithm could perform at least as well as an independently trained mass spectrometrist in accomplishing this task. We hope that this approach will be a useful tool in the development of quantitative approaches for comprehensive proteomics techniques.
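The transition-selection step described above lends itself to a compact illustration. The sketch below is not the authors' published implementation; it is a minimal genetic algorithm, on simulated data, that evolves a boolean mask over candidate transitions so that the summed DIA signal best correlates with the targeted reference values (all parameters and the interference pattern are invented for illustration):

```python
# Minimal GA sketch: pick the subset of fragment transitions whose summed DIA
# signal best agrees with targeted-assay reference values. Simulated data.
import numpy as np

rng = np.random.default_rng(0)
n_transitions, n_samples = 12, 24

true_abundance = rng.lognormal(2.0, 0.5, n_samples)          # targeted assay values
clean = np.outer(rng.uniform(0.2, 1.0, n_transitions), true_abundance)
dia = clean + rng.normal(0.0, clean * 0.05)                  # comprehensive (DIA) signals
dia[:4] += rng.lognormal(2.5, 0.7, (4, n_samples))           # four interfered transitions

def fitness(mask):
    """Squared Pearson correlation between summed DIA signal and targeted values."""
    if not mask.any():
        return 0.0
    r = np.corrcoef(dia[mask].sum(axis=0), true_abundance)[0, 1]
    return r * r

pop = rng.random((40, n_transitions)) < 0.5                  # random initial masks
for _ in range(60):
    scores = np.array([fitness(m) for m in pop])
    # tournament selection: each child slot goes to the fitter of two candidates
    idx = rng.integers(0, len(pop), (len(pop), 2))
    parents = pop[np.where(scores[idx[:, 0]] > scores[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # uniform crossover against a reversed copy of the parent pool, then mutation
    cross = rng.random(pop.shape) < 0.5
    children = np.where(cross, parents, parents[::-1])
    children ^= rng.random(pop.shape) < 0.02                 # bit-flip mutation
    pop = children

best = max(pop, key=fitness)
print("selected transitions:", np.flatnonzero(best), "r^2:", round(fitness(best), 4))
```

In the paper's setting the fitness function would instead score the slope and fit of linear models between the two assays across many peptides, as the abstract describes.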
Arreaza, Gladys; Qiu, Ping; Pang, Ling; Albright, Andrew; Hong, Lewis Z.; Marton, Matthew J.; Levitan, Diane
2016-01-01
In cancer drug discovery, it is important to investigate the genetic determinants of response or resistance to cancer therapy as well as factors that contribute to adverse events in the course of clinical trials. Despite the emergence of new technologies and the ability to measure more diverse analytes (e.g., circulating tumor cells (CTCs), circulating tumor DNA (ctDNA), etc.), tumor tissue is still the most common and reliable source for biomarker investigation. Because of its worldwide use and ability to preserve samples for many decades at ambient temperature, formalin-fixed, paraffin-embedded (FFPE) tumor tissue is likely to be the preferred choice for tissue preservation in clinical practice for the foreseeable future. Multiple analyses are routinely performed on the same FFPE samples (such as immunohistochemistry (IHC), in situ hybridization, RNAseq, DNAseq, TILseq, Methyl-Seq, etc.). Thus, specimen prioritization and optimization of analyte isolation are critical to ensure the successful completion of each assay. FFPE is notorious for producing suboptimal DNA quality and low DNA yield, yet commercial vendors tend to request a higher DNA sample mass than is actually required for downstream assays, which restricts the breadth of biomarker work that can be performed. We evaluated multiple genomics service laboratories to assess the current state of NGS pre-analytical processing of FFPE. Significant differences in pre-analytical capabilities were observed. Key aspects are highlighted and recommendations are made to improve current practice in translational research. PMID:27657050
Lutz, Philipp
2017-01-01
The effectiveness of immigrant integration policies has gained considerable attention across Western democracies dealing with ethnically and culturally diverse societies. However, the findings on what type of policy produces more favourable integration outcomes remain inconclusive. The conflation of normative and analytical assumptions about integration is a major challenge for causal analysis of integration policies. This article applies actor-centered institutionalism as a new framework for the analysis of immigrant integration outcomes in order to separate two different mechanisms of policy intervention. Conceptualising integration outcomes as a function of capabilities and aspirations allows the assumptions underlying policy intervention to be separated for assimilation and multiculturalism, the two main types of policy approach. The article illustrates that assimilation is an incentive-based policy, primarily designed to increase immigrants' aspirations, whereas multiculturalism is an opportunity-based policy, primarily designed to increase immigrants' capabilities. Conceptualising the causal mechanisms of policy intervention clarifies the link between normative concepts of immigrant integration and analytical concepts of policy effectiveness.
Parr, Maria Kristina; Wuest, Bernhard; Naegele, Edgar; Joseph, Jan F; Wenzel, Maxi; Schmidt, Alexander H; Stanic, Mijo; de la Torre, Xavier; Botrè, Francesco
2016-09-01
HPLC is considered the method of choice for the separation of various classes of drugs. However, some analytes remain challenging: HPLC offers limited resolution for highly polar analytes, which interact insufficiently with conventional reversed-phase (RP) columns. Especially in combination with mass spectrometric detection, the options for altering stationary phases are limited. Some highly polar sympathomimetic drugs and their metabolites showed almost no retention on different RP columns, and their retention remained poor even on phenylhexyl phases, which offer a different selectivity due to π-π interactions. Supercritical fluid chromatography (SFC), as a separation technique orthogonal to HPLC, may help to overcome these issues. Selected polar drugs and metabolites were analyzed using SFC separation. All compounds showed sharp peaks and good retention, even the very polar analytes such as sulfoconjugates. As a result of this orthogonality, retention times and elution orders in SFC differ from both RP and HILIC separations. Short cycle times could be realized. Because temperature and pressure strongly influence the polarity of supercritical fluids, precise regulation of temperature and backpressure is required for stable retention times. As CO2 is the main constituent of the mobile phase in SFC, solvent consumption and solvent waste are considerably reduced. Graphical Abstract: SFC-MS/MS vs. LC-MS/MS.
Leclercq, Amélie; Nonell, Anthony; Todolí Torró, José Luis; Bresson, Carole; Vio, Laurent; Vercouter, Thomas; Chartier, Frédéric
2015-07-23
Due to their outstanding analytical performance, inductively coupled plasma optical emission spectrometry (ICP-OES) and mass spectrometry (ICP-MS) are widely used for multi-elemental measurements and, in the case of ICP-MS, for isotopic characterization. While most studies are carried out in aqueous matrices, applications involving organic/hydro-organic matrices are becoming increasingly widespread. Such matrices are introduced into ICP-based instruments when classical "matrix removal" approaches, such as acid digestion or extraction procedures, cannot be implemented. Due to the physico-chemical properties of organic/hydro-organic matrices and their associated effects on instrumentation and analytical performance, their introduction into ICP sources is particularly challenging and has become a research topic in its own right. In this framework, numerous theoretical and phenomenological studies of these effects have been performed in the past, mainly by ICP-OES, while recent literature is more focused on applications and associated instrumental developments. This tutorial review, divided into two parts, explores the rich literature related to the introduction of organic/hydro-organic matrices in ICP-OES and ICP-MS. The present Part I provides theoretical considerations in connection with the physico-chemical properties of organic/hydro-organic matrices, in order to better understand the induced phenomena. This focal point is divided into four chapters highlighting: (i) the impact of organic/hydro-organic matrices from aerosol generation to atomization/excitation/ionization processes; (ii) the production of carbon molecular constituents and their spatial distribution in the plasma with respect to analyte distribution; (iii) the subsequent modifications of fundamental plasma properties; and (iv) the resulting spectroscopic and non-spectroscopic interferences. This first part of the tutorial review is addressed to both beginners and more experienced scientists who are interested in the analysis of organic/hydro-organic matrices by ICP sources and would like to consider the theoretical background of the effects induced by such matrices. The second part will be dedicated to more practical considerations on instrumentation, such as adapted introduction devices, as well as the optimization of instrumental and operating parameters. The analytical strategies for elemental quantification in such matrices will also be addressed. Copyright © 2015 Elsevier B.V. All rights reserved.
Full-Range Public Health Leadership, Part 2: Qualitative Analysis and Synthesis.
Carlton, Erik L; Holsinger, James W; Riddell, Martha C; Bush, Heather
2015-01-01
Public health leadership is an important topic in the era of U.S. health reform, population health innovation, and health system transformation. This study utilized the full-range leadership model to examine public health leadership. We sought to understand local public health leadership from the perspective of local health department leaders and those who work with and for them. Public health leadership was explored through interviews and focus groups with directors (n = 4) and staff (n = 33) from local health departments. Qualitative analytic methods included reflexive journals, code-recode procedures, and member checking, with analysis facilitated by Atlas.ti v.6.0. Qualitative results supported and expanded upon previously reported quantitative findings. Leading by example and providing individual consideration to followers were found to be more important than other leader factors, such as intellectual stimulation, inspirational motivation, or idealized attributes of leaders. Having a clear and competent vision of public health, being able to work collaboratively with other community agencies, and addressing the current challenges to public health with creativity and innovation were also important findings. Idealized leadership behaviors and individual consideration should be the focus of student and professional development. Models that incorporate contextual considerations, such as the situational leadership model, could be utilized to ensure that optimal individual consideration is given to followers.
Lover, Andrew A; Coker, Richard J
2014-05-01
Infections with the malaria parasite Plasmodium vivax are noteworthy for potentially very long incubation periods (6-9 months), which present a major barrier to disease elimination. Increased sporozoite challenge has been reported to be associated with both shorter incubation and shorter pre-patent periods in a range of human challenge studies. However, this evidence base has a scant empirical foundation, as the historical analyses were limited by the analytic methods then available, and it provides no quantitative estimates of effect size. Following a comprehensive literature search, we re-analysed all identified studies using survival and/or logistic models plus contingency tables. We found very weak evidence for dose-dependence at entomologically plausible inoculum levels. These results strongly suggest that sporozoite dosage is not an important driver of long latency. The evidence presented suggests that parasite strain and vector species have quantitatively greater impacts, and points to the possible existence of a dose threshold in the human response to sporozoites. Greater consideration of the complex interplay between these aspects of vectors and parasites is important for human challenge experiments, vaccine trials, and epidemiology towards global malaria elimination.
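For readers unfamiliar with the modelling approach mentioned above, the following toy example shows the logistic form of such a dose-response analysis. The data are simulated with no true dose effect, purely to illustrate the model; they do not reproduce the study's data or results:

```python
# Toy dose-response analysis: logistic regression of a long-latency outcome on
# log10 sporozoite dose, using simulated data with no true dose effect.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
log_dose = rng.uniform(0.0, 4.0, 120)                   # log10 sporozoites inoculated
long_latency = (rng.random(120) < 0.35).astype(float)   # dose-independent outcome

X = sm.add_constant(log_dose)
fit = sm.Logit(long_latency, X).fit(disp=False)

slope = fit.params[1]
lo, hi = fit.conf_int()[1]
print(f"log-odds per log10 dose: {slope:.3f} (95% CI {lo:.3f} to {hi:.3f})")
# A confidence interval spanning zero is consistent with 'no dose-dependence'.
```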
Supply chain optimization for pediatric perioperative departments.
Davis, Janice L; Doyle, Robert
2011-09-01
Economic challenges compel pediatric perioperative departments to reduce nonlabor supply costs while maintaining the quality of patient care. Optimization of the supply chain introduces a framework for decision making that drives fiscally responsible decisions. The cost-effective supply chain is driven by implementing a value analysis process for product selection, being mindful of product sourcing decisions to reduce supply expense, creating logistical efficiency that will eliminate redundant processes, and managing inventory to ensure product availability. The value analysis approach is an analytical methodology for product selection that involves product evaluation and recommendation based on consideration of clinical benefit, overall financial impact, and revenue implications. Copyright © 2011 AORN, Inc. Published by Elsevier Inc. All rights reserved.
Silicon Nanowire-Based Devices for Gas-Phase Sensing
Cao, Anping; Sudhölter, Ernst J.R.; de Smet, Louis C.P.M.
2014-01-01
Since their introduction in 2001, SiNW-based sensor devices have attracted considerable interest as a general platform for ultra-sensitive, electrical detection of biological and chemical species. Most studies focus on detecting, sensing and monitoring analytes in aqueous solution, but the number of studies on sensing gases and vapors using SiNW-based devices is increasing. This review gives an overview of selected research papers related to the application of electrical SiNW-based devices in the gas phase that have been reported over the past 10 years. Special attention is given to surface modification strategies and the sensing principles involved. In addition, future steps and technological challenges in this field are addressed. PMID:24368699
Geospatial Data as a Service: Towards planetary scale real-time analytics
NASA Astrophysics Data System (ADS)
Evans, B. J. K.; Larraondo, P. R.; Antony, J.; Richards, C. J.
2017-12-01
The rapid growth of earth systems, environmental and geophysical datasets poses a challenge to both end-users and infrastructure providers. For infrastructure and data providers, tasks like managing, indexing and storing large collections of geospatial data need to take into consideration the various use cases by which consumers will want to access and use the data. Considerable investment has been made by the Earth Science community to produce suitable real-time analytics platforms for geospatial data, and various interfaces have been defined to provide data services. Unfortunately, the standards, protocols and data models designed to target specific communities or working groups differ considerably. The Australian National University's National Computational Infrastructure (NCI) is used for a wide range of activities in the geospatial community. Earth observation, climate and weather forecasting are examples of communities that generate large amounts of geospatial data. The NCI has made a significant effort to develop a data and services model that enables the cross-disciplinary use of data. Recent developments in cloud and distributed computing provide a publicly accessible platform on which new infrastructures can be built. One of the key capabilities these technologies offer is the possibility of having "limitless" compute power next to where the data is stored. This model is rapidly transforming data delivery from centralised monolithic services towards ubiquitous distributed services that scale up and down, adapting to fluctuations in demand. NCI has developed GSKY, a scalable, distributed server that presents a new approach for geospatial data discovery and delivery based on OGC standards. We present the architecture and motivating use cases that drove GSKY's collaborative design, development and production deployment, and show that our approach offers the community valuable exploratory analysis capabilities for dealing with petabyte-scale geospatial data collections.
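Since GSKY's interfaces follow OGC standards, a client interaction can be sketched with a generic OGC Web Map Service request. The base URL, layer name and time value below are placeholders, not real NCI service addresses:

```python
# Sketch of a client request against an OGC-compliant WMS endpoint of the kind
# GSKY exposes. Endpoint, layer and timestamp are hypothetical.
import requests

WMS_URL = "https://example.org/ows"                 # hypothetical endpoint
params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "landsat8_nbar",                      # hypothetical layer
    "crs": "EPSG:4326",
    "bbox": "-44.0,112.0,-10.0,154.0",              # lat/lon axis order in WMS 1.3.0
    "width": "512",
    "height": "512",
    "format": "image/png",
    "time": "2017-01-01T00:00:00Z",
}

resp = requests.get(WMS_URL, params=params, timeout=60)
resp.raise_for_status()
with open("tile.png", "wb") as fh:
    fh.write(resp.content)                          # rendered map tile
```

Because the server does the computational heavy lifting next to the data, the client stays this thin regardless of the size of the underlying collection.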
Quantification of HCV RNA in Liver Tissue by bDNA Assay.
Dailey, P J; Collins, M L; Urdea, M S; Wilber, J C
1999-01-01
With this statement, Sherlock and Dooley described two of the three major challenges involved in quantitatively measuring any analyte in tissue samples: the distribution of the analyte in the tissue, and the standard of reference, or denominator, with which to make comparisons between tissue samples. The third challenge for quantitative measurement of an analyte in tissue is to ensure reproducible and quantitative recovery of the analyte on extraction from tissue samples. This chapter describes a method that can be used to measure HCV RNA quantitatively in liver biopsy and tissue samples using the bDNA assay. All three of these challenges (distribution, denominator, and recovery) apply to the measurement of HCV RNA in liver biopsies.
The VAST Challenge: History, Scope, and Outcomes: An introduction to the Special Issue
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kristin A.; Grinstein, Georges; Whiting, Mark A.
2014-10-01
Visual analytics aims to facilitate human insight from complex data via a combination of visual representations, interaction techniques, and supporting algorithms. To create new tools and techniques that achieve this goal, researchers need an understanding of the analytical questions to be addressed, data that illustrate the complexities and ambiguities found in realistic analytic settings, and methods for evaluating whether plausible insights are gained through use of the new methods. However, researchers do not, generally speaking, have access to analysts who can articulate their problems or to the operational data that is used for analysis. To fill this gap, the Visual Analytics Science and Technology (VAST) Challenge has been held annually since 2006. The VAST Challenge provides an opportunity for researchers to experiment with realistic but not real problems, using realistic synthetic data with known events embedded. Since its inception, the VAST Challenge has evolved along with the visual analytics research community to pose more complex challenges, ranging from text analysis to video analysis to large-scale network log analysis. The seven years of the VAST Challenge have seen advancements in research and development, education, evaluation, and in the challenge process itself. This special issue of Information Visualization highlights some of the noteworthy advancements in each of these areas. Some of these papers focus on important research questions related to the challenge itself, and other papers focus on innovative research that has been shaped by participation in the challenge. This paper describes the VAST Challenge process and benefits in detail. It also provides an introduction to and context for the remaining papers in the issue.
NASA Astrophysics Data System (ADS)
Tan, Yanglan; Polfer, Nicolas C.
2015-02-01
Carbohydrates and their derivatives play important roles in biological systems, but their isomeric heterogeneity also presents a considerable challenge for analytical techniques. Here, a stepwise approach using infrared multiple-photon dissociation (IRMPD) via a tunable CO2 laser (9.2-10.7 μm) was employed to characterize isomeric variants of glucose-based trisaccharides. After the deprotonated trisaccharides were trapped and fragmented to disaccharide C2 fragments in a Fourier transform ion cyclotron resonance (FTICR) cell, a further variable-wavelength infrared irradiation of the C2 ion produced wavelength-dependent dissociation patterns that are represented as heat maps. The photodissociation patterns of these C2 fragments are shown to be strikingly similar to the photodissociation patterns of disaccharides with identical glycosidic bonds. Conversely, the photodissociation patterns of different glycosidic linkages exhibit considerable differences. On the basis of these results, the linkage position and anomericity of glycosidic bonds of disaccharide units in trisaccharides can be systematically differentiated and identified, providing a promising approach to characterize the structures of isomeric oligosaccharides.
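The heat-map representation used in this work is easy to picture with a toy example. The sketch below simulates wavelength-dependent dissociation yields for four hypothetical glycosidic linkages and plots them in the style described; all values are invented, and only the plotting idiom is the point:

```python
# Toy rendering of an IRMPD heat map: relative photofragment yield versus
# laser wavelength, one row per glycosidic linkage. Simulated data only.
import numpy as np
import matplotlib.pyplot as plt

wavelengths = np.linspace(9.2, 10.7, 40)            # tunable CO2 laser range, um
linkages = ["1,2", "1,3", "1,4", "1,6"]
rng = np.random.default_rng(2)
centres = rng.uniform(9.4, 10.5, len(linkages))     # invented absorption maxima
yields = np.exp(-((wavelengths - centres[:, None]) / 0.15) ** 2)
yields += rng.random(yields.shape) * 0.1            # baseline noise

fig, ax = plt.subplots()
im = ax.imshow(yields, aspect="auto", origin="lower",
               extent=(wavelengths[0], wavelengths[-1], -0.5, len(linkages) - 0.5))
ax.set_yticks(range(len(linkages)))
ax.set_yticklabels(linkages)
ax.set_xlabel("wavelength (μm)")
ax.set_ylabel("glycosidic linkage")
fig.colorbar(im, ax=ax, label="relative dissociation yield")
fig.savefig("irmpd_heatmap.png", dpi=150)
```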
Big data analytics in healthcare: promise and potential.
Raghupathi, Wullianallur; Raghupathi, Viju
2014-01-01
To describe the promise and potential of big data analytics in healthcare. The paper describes the nascent field of big data analytics in healthcare, discusses the benefits, outlines an architectural framework and methodology, describes examples reported in the literature, briefly discusses the challenges, and offers conclusions. The paper provides a broad overview of big data analytics for healthcare researchers and practitioners. Big data analytics in healthcare is evolving into a promising field for providing insight from very large data sets and improving outcomes while reducing costs. Its potential is great; however, there remain challenges to overcome.
The Challenges of Teaching Business Analytics: Finding Real Big Data for Business Students
ERIC Educational Resources Information Center
Yap, Alexander Y.; Drye, Sherrie L.
2018-01-01
This research shares the challenges of bringing real-world big business data into the classroom so that students can experience how today's business decisions can improve with the strategic use of data analytics. Finding a true big data set that provides real-world business transactions and operational data has been a challenge for academics…
Concurrence of big data analytics and healthcare: A systematic review.
Mehta, Nishita; Pandit, Anil
2018-06-01
The application of Big Data analytics in healthcare has immense potential for improving the quality of care, reducing waste and error, and reducing the cost of care. This systematic review of the literature aims to determine the scope of Big Data analytics in healthcare, including its applications and the challenges in its adoption, and to identify strategies to overcome those challenges. A systematic search of articles was carried out on five major scientific databases: ScienceDirect, PubMed, Emerald, IEEE Xplore and Taylor & Francis. Articles on Big Data analytics in healthcare published in the English-language literature from January 2013 to January 2018 were considered. Descriptive articles and usability studies of Big Data analytics in healthcare and medicine were selected. Two reviewers independently extracted information on definitions of Big Data analytics; sources and applications of Big Data analytics in healthcare; and challenges and strategies to overcome them. A total of 58 articles meeting the inclusion criteria were analyzed. The analyses found that: (1) researchers lack consensus about the operational definition of Big Data in healthcare; (2) Big Data in healthcare comes from sources internal to hospitals or clinics as well as external sources, including government, laboratories, pharmaceutical companies, data aggregators and medical journals; (3) natural language processing (NLP) is the most widely used Big Data analytical technique in healthcare, and most of the processing tools used for analytics are based on Hadoop; (4) Big Data analytics finds application in clinical decision support, optimization of clinical operations and reduction of the cost of care; and (5) the major challenge in the adoption of Big Data analytics is the lack of evidence of its practical benefits in healthcare. This review unveils a paucity of information on real-world use of Big Data analytics in healthcare, because the usability studies have taken only a qualitative approach that describes potential benefits without quantitative evaluation. Also, the majority of the studies were from developed countries, which highlights the need to promote research on healthcare Big Data analytics in developing countries. Copyright © 2018 Elsevier B.V. All rights reserved.
Recent Discoveries and Future Challenges in Atmospheric Organic Chemistry.
Glasius, Marianne; Goldstein, Allen H
2016-03-15
Earth's atmosphere contains a multitude of organic compounds, which differ by orders of magnitude regarding fundamental properties such as volatility, reactivity, and propensity to form cloud droplets, affecting their impact on global climate and human health. Despite recent major research efforts and advances, there are still substantial gaps in understanding of atmospheric organic chemistry, hampering efforts to understand, model, and mitigate environmental problems such as aerosol formation in both polluted urban and more pristine regions. The analytical toolbox available for chemists to study atmospheric organic components has expanded considerably during the past decade, opening new windows into speciation, time resolution and detection of reactive and semivolatile compounds at low concentrations. This has provided unprecedented opportunities, but also unveiled new scientific challenges. Specific groundbreaking examples include the role of epoxides in aerosol formation especially from isoprene, the importance of highly oxidized, reactive organics in air-surface processes (whether atmosphere-biosphere exchange or aerosols), as well as the extent of interactions of anthropogenic and biogenic emissions and the resulting impact on atmospheric organic chemistry.
Advanced Engineering Environments: Implications for Aerospace Manufacturing
NASA Technical Reports Server (NTRS)
Thomas, D.
2001-01-01
There are significant challenges facing today's aerospace industry. Global competition, more complex products, geographically distributed design teams, demands for lower cost, higher reliability and safer vehicles, and the need to incorporate the latest technologies more quickly all confront the developer of aerospace systems. New information technologies offer promising opportunities to develop advanced engineering environments (AEEs) to meet these challenges. Significant advances in the state of the art of aerospace engineering practice are envisioned in the areas of engineering design and analytical tools, cost and risk tools, collaborative engineering, and high-fidelity simulations early in the development cycle. These advances will enable modeling and simulation of manufacturing methods, which will in turn allow manufacturing considerations to be included much earlier in the system development cycle. Significant cost savings, increased quality, and decreased manufacturing cycle time are expected to result. This paper gives an overview of NASA's Intelligent Synthesis Environment, the agency initiative to develop an AEE, with a focus on the anticipated benefits in aerospace manufacturing.
Saint-Maurice, Pedro F; Welk, Gregory J
2014-12-01
This paper describes the design and methods involved in calibrating a Web-based self-report instrument to estimate physical activity behavior. The limitations of self-report measures are well known, but calibration methods enable the reported information to be equated to estimates obtained from objective data. This paper summarizes design considerations for effective development and calibration of physical activity self-report measures. Each of the design considerations is put into context and followed by a practical application based on our ongoing calibration research with a promising online self-report tool called the Youth Activity Profile (YAP). We first describe the overall concept of calibration and how this influences the selection of appropriate self-report tools for this population. We point out the advantages and disadvantages of different monitoring devices since the choice of the criterion measure and the strategies used to minimize error in the measure can dramatically improve the quality of the data. We summarize strategies to ensure quality control in data collection and discuss analytical considerations involved in group- vs individual-level inference. For cross-validation procedures, we describe the advantages of equivalence testing procedures that directly test and quantify agreement. Lastly, we introduce the unique challenges encountered when transitioning from paper to a Web-based tool. The Web offers considerable potential for broad adoption but an iterative calibration approach focused on continued refinement is needed to ensure that estimates are generalizable across individuals, regions, seasons and countries.
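The equivalence-testing idea advocated above can be made concrete with a paired two one-sided tests (TOST) procedure. The sketch below uses simulated self-report and accelerometer values and an assumed ±10% equivalence margin; neither the data nor the margin come from the YAP calibration studies:

```python
# Paired TOST sketch: declare the self-report equivalent to the accelerometer
# criterion if the mean difference lies within +/-10% of the criterion mean.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
accel = rng.normal(55.0, 12.0, 80)            # criterion MVPA, min/day
yap = accel + rng.normal(1.5, 8.0, 80)        # calibrated self-report estimates

diff = yap - accel
margin = 0.10 * accel.mean()                  # assumed equivalence bound
se, df = stats.sem(diff), len(diff) - 1

p_low = 1 - stats.t.cdf((diff.mean() + margin) / se, df)   # H0: mean diff <= -margin
p_high = stats.t.cdf((diff.mean() - margin) / se, df)      # H0: mean diff >= +margin
p_tost = max(p_low, p_high)

print(f"mean difference {diff.mean():.2f} min/day, TOST p = {p_tost:.4f}")
# Rejecting both one-sided nulls (p < 0.05) supports equivalence within the margin.
```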
Nightingale, Tom E; Rouse, Peter C; Thompson, Dylan; Bilzon, James L J
2017-12-01
Accurately measuring physical activity and energy expenditure in persons with chronic physical disabilities who use wheelchairs is a considerable and ongoing challenge. Quantifying various free-living lifestyle behaviours in this group is at present restricted by our understanding of appropriate measurement tools and analytical techniques. This review provides a detailed evaluation of the currently available measurement tools used to predict physical activity and energy expenditure in persons who use wheelchairs. It also outlines numerous considerations specific to this population and suggests suitable future directions for the field. Of the existing three self-report methods utilised in this population, the 3-day Physical Activity Recall Assessment for People with Spinal Cord Injury (PARA-SCI) telephone interview demonstrates the best reliability and validity. However, the complexity of interview administration and potential for recall bias are notable limitations. Objective measurement tools, which overcome such considerations, have been validated using controlled laboratory protocols. These have consistently demonstrated the arm or wrist as the most suitable anatomical location to wear accelerometers. Yet, more complex data analysis methodologies may be necessary to further improve energy expenditure prediction for more intricate movements or behaviours. Multi-sensor devices that incorporate physiological signals and acceleration have recently been adapted for persons who use wheelchairs. Population specific algorithms offer considerable improvements in energy expenditure prediction accuracy. This review highlights the progress in the field and aims to encourage the wider scientific community to develop innovative solutions to accurately quantify physical activity in this population.
Architectural Considerations for Highly Scalable Computing to Support On-demand Video Analytics
2017-04-19
The results of this research were used to implement a distributed on-demand video analytics system that was prototyped for the use of forensics investigators in law enforcement. The system was tested in the wild using video files as well as a commercial Video Management System supporting more than 100 surveillance cameras as video sources. The architectural considerations of this system are presented, along with the issues to be reckoned with in implementing a scalable system.
Andra, Syam S; Austin, Christine; Yang, Juan; Patel, Dhavalkumar; Arora, Manish
2016-12-01
Human exposure to bisphenol A (BPA) has attracted considerable global health attention; BPA is one of the leading environmental contaminants with potential adverse health effects, including endocrine disruption. Current practice for measuring exposure to BPA includes the measurement of unconjugated BPA (aglycone) and total (both conjugated and unconjugated) BPA; the difference between the two measurements yields an estimate of the conjugated forms. However, measuring BPA as the end analyte can lead to inaccurate estimates owing to potential interference from background sources during sample collection and analysis. BPA glucuronides (BPAG) and sulfates (BPAS) are better candidates for biomarkers of BPA exposure, since they require in vivo metabolism and are not prone to external contamination. In this work, the primary focus was to review the current state of the art in analytical methods available to quantitate BPA conjugates. The entire analytical procedure for the simultaneous extraction and detection of aglycone BPA and its conjugates is covered, from sample pre-treatment and extraction to separation, ionization, and detection. Solid-phase extraction coupled with liquid chromatography-tandem mass spectrometry provides the most sensitive detection and quantification of BPA conjugates. Applications of BPA conjugate analysis in human exposure assessment studies are discussed. Measuring these potential biomarkers of BPA exposure has only recently become analytically feasible, and there are limitations and challenges to overcome in biomonitoring studies. Copyright © 2016 Elsevier B.V. All rights reserved.
Chilled to the bone: embodied countertransference and unspoken traumatic memories.
Zoppi, Luisa
2017-11-01
Starting from a deeply challenging experience of early embodied countertransference in a first encounter with a new patient, the author explores the issues it raised. Such moments highlight projective identification as well as what Stone (2006) has described as 'embodied resonance in the countertransference'. In these powerful experiences linear time and subject boundaries are altered, and this leads to central questions about analytic work. As well as discussing the uncanny experience at the very beginning of an analytic encounter and its challenges for the analytic field, the author considers 'the time horizon of analytic process' (Hogenson), the relationship between 'moments of complexity and analytic boundaries' (Cambray) and the role of mirror neurons in intersubjective experience. © 2017, The Society of Analytical Psychology.
NASA Technical Reports Server (NTRS)
Anderson, Leif; Carter-Journet, Katrina; Box, Neil; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael
2012-01-01
This paper introduces an analytical approach, Probability and Confidence Trade-space (PACT), which can be used to assess uncertainty in the International Space Station (ISS) hardware sparing necessary to extend the life of the vehicle. Several key areas are under consideration in this research. We investigate what sparing confidence targets may be reasonable to ensure vehicle survivability and the completion of science on the ISS. The results of the analysis will provide a methodological basis for reassessing vehicle subsystem confidence targets. An ongoing annual analysis compares the probability that existing spares will meet the total expected unit demand for each Orbital Replacement Unit (ORU), grouped in functional hierarchies approximating the vehicle subsystems. In cases where a functional hierarchy's availability does not meet subsystem confidence targets, the sparing analysis further identifies which ORUs may require additional spares to extend the life of the ISS. The resulting probability depends on hardware reliability estimates. However, the ISS hardware fleet carries considerable epistemic uncertainty (uncertainty in the knowledge of the true hardware failure rate), which does not currently factor into the annual sparing analysis, so the existing confidence targets may be conservative. This paper also discusses how confidence targets may be relaxed based on the inclusion of epistemic uncertainty for each ORU, and concludes with the strengths and limitations of implementing the analytical approach in sustaining the ISS through end of life, 2020 and beyond.
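The kind of sparing computation described above can be sketched for a single hypothetical ORU, assuming Poisson demand. The second calculation folds in epistemic uncertainty by drawing the failure rate from a gamma distribution; all numbers are invented illustrations, not ISS data or the PACT method itself:

```python
# Illustrative sparing confidence: probability that on-hand spares cover
# Poisson demand over the remaining life, with and without rate uncertainty.
import numpy as np
from scipy import stats

spares = 3
rate = 0.25                    # failures per year, point estimate
years = 8.0                    # remaining life being assessed

# Aleatory-only: demand ~ Poisson(rate * years) with a known rate
p_point = stats.poisson.cdf(spares, rate * years)

# Epistemic layer: failure rate ~ Gamma with the same mean but broad spread
k = 4.0
rates = stats.gamma.rvs(k, scale=rate / k, size=100_000, random_state=0)
p_epistemic = stats.poisson.cdf(spares, rates * years).mean()

print(f"P(spares cover demand), known rate:     {p_point:.3f}")
print(f"P(spares cover demand), uncertain rate: {p_epistemic:.3f}")
```

Marginalizing over the uncertain rate often lowers the computed confidence, which is one way of seeing why folding epistemic uncertainty into the analysis can justify revisiting fixed confidence targets.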
Albalat, Amaya; Husi, Holger; Siwy, Justyna; Nally, Jarlath E; McLauglin, Mark; Eckersall, Peter D; Mullen, William
2014-02-01
Proteomics is a growing field with the potential to be applied to many biology-related disciplines. However, the study of the proteome has proven very challenging due to its high level of complexity compared with genome and transcriptome data. To analyse this level of complexity, high-resolution separation of peptides/proteins is needed together with high-resolution analysers. Currently, liquid chromatography and capillary electrophoresis (CE) are the two most widely used separation techniques that can be coupled on-line with a mass spectrometer (MS). In CE, proteins/peptides are separated according to their size, charge and shape, leading to high resolving power. Although further progress in sensitivity, throughput and proteome coverage is expected, MS-based proteomics has developed to a level at which it is routinely applied to study a wide range of biological questions. The aim of this review is to present CE-MS as a proteomic analytical platform for biomarker research that could be used in farm animal and veterinary studies. This analytical platform has been widely used for biomarker research in the biomedical field, but its application in animal proteomic studies is relatively novel. The review will focus on introducing the CE-MS platform and the primary considerations for its application to biomarker research. Furthermore, current applications and, more importantly, potential applications in farm animal and veterinary science will be presented and discussed.
Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra
2018-02-01
The requirement for the correct evaluation of forensic toxicological results, in daily routine work and in scientific studies, is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate the efficacy and reliability of the analytical method. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. Until now, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in forensic toxicology (alcohol consumption markers, congener alcohols, gamma-hydroxybutyric acid, human insulin and C-peptide, creatinine, postmortem clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances are discussed, which may be used as guidance by scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. In particular, the validation parameters calibration model, analytical limits, accuracy (bias and precision), matrix effects, and recovery have to be approached differently, and the highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.
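One commonly used tactic for analytes with no analyte-free matrix, closely related to the calibration-model question raised above, is standard-addition calibration. The sketch below (with invented numbers, and not necessarily the approach these authors recommend) extrapolates the spiked calibration line back to estimate the endogenous concentration:

```python
# Standard-addition sketch for an endogenous analyte: spike the sample at
# several levels, fit a line, and read the endogenous level off the x-intercept.
import numpy as np

spike = np.array([0.0, 5.0, 10.0, 20.0])        # added analyte, ng/mL
signal = np.array([4.1, 6.0, 8.2, 12.1])        # instrument response

slope, intercept = np.polyfit(spike, signal, 1)
endogenous = intercept / slope                  # magnitude of the x-intercept
print(f"estimated endogenous concentration: {endogenous:.1f} ng/mL")
```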
Hand, Rosa K; Perzynski, Adam T
2016-09-01
Retrospective self-reported data have limitations, making it important to evaluate alternative forms of measurement for nutrition behaviors. Ecological momentary assessment (EMA) attempts to overcome the challenges of recalled data with real-time data collection in a subject's natural environment, often leveraging technology. This perspective piece 1) introduces the concepts and terminology of EMA, 2) provides an overview of the methodological and analytical considerations, 3) gives examples of past research using EMA, and 4) suggests new opportunities (including combining assessment and intervention) and limitations (including the need for technology) for the application of EMA to research and practice regarding nutrition behaviors. Copyright © 2016 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.
Bana, Péter; Örkényi, Róbert; Lövei, Klára; Lakó, Ágnes; Túrós, György István; Éles, János; Faigl, Ferenc; Greiner, István
2017-12-01
Recent advances in the field of continuous-flow chemistry allow the multistep preparation of complex molecules such as APIs (Active Pharmaceutical Ingredients) in a telescoped manner. Numerous laboratory-scale applications have been described, pointing towards novel manufacturing processes for pharmaceutical compounds in accordance with recent regulatory, economic and quality guidance. The chemical and technical knowledge gained during these studies is considerable; nevertheless, connecting several individual chemical transformations with the attached analytics and purification holds hidden traps. In this review, we summarize innovative solutions to these challenges for the benefit of chemists aiming to exploit flow chemistry systems for the synthesis of biologically active molecules. Copyright © 2016 Elsevier Ltd. All rights reserved.
Kristensen, Anne F; Kristensen, Søren R; Falkmer, Ursula; Münster, Anna-Marie B; Pedersen, Shona
2018-05-01
The Calibrated Automated Thrombography (CAT) is an in vitro thrombin generation (TG) assay that holds promise as a valuable tool within clinical diagnostics. However, the technique has considerable analytical variation; we therefore investigated the analytical and between-subject variation of CAT systematically, and assessed the use of an internal standard for normalization to diminish this variation. Twenty healthy volunteers each donated one blood sample, which was centrifuged, aliquoted and stored at -80 °C prior to analysis. The analytical variation was determined over eight runs, in which plasma from the same seven volunteers was processed in triplicate; for the between-subject variation, TG analysis was performed on plasma from all 20 volunteers. The trigger reagents used for the TG assays included both PPP reagent containing 5 pM tissue factor (TF) and PPPlow with 1 pM TF. Plasma drawn from a single donor was applied to all plates as an internal standard for each TG analysis and was subsequently used for normalization. The total analytical variation for TG analysis is 3-14% with PPPlow reagent and 9-13% with PPP reagent. This variation can be modestly reduced by using an internal standard, mainly for the endogenous thrombin potential (ETP). The between-subject variation is higher with PPPlow than with PPP, and it is considerably higher than the analytical variation. TG has a rather high inherent analytical variation, but it is considerably lower than the between-subject variation when PPPlow is used as the trigger reagent.
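The normalization step evaluated in this study can be sketched numerically. Below, a simulated plate-to-plate drift affects the subject sample and the single-donor standard equally, so dividing by the standard and rescaling shrinks the analytical CV; the data and effect sizes are invented, not the study's measurements:

```python
# Numerical sketch of internal-standard normalization for a plate-based assay.
import numpy as np

rng = np.random.default_rng(4)
n_plates = 8
plate_drift = rng.normal(1.0, 0.08, n_plates)        # shared multiplicative run effect

subject = 1500.0 * plate_drift * rng.normal(1.0, 0.03, n_plates)    # ETP, nM*min
standard = 1400.0 * plate_drift * rng.normal(1.0, 0.03, n_plates)   # internal standard

def cv(x):
    """Coefficient of variation in percent."""
    return x.std(ddof=1) / x.mean() * 100

normalized = subject / standard * standard.mean()    # rescale to the standard's mean
print(f"raw CV:        {cv(subject):.1f}%")
print(f"normalized CV: {cv(normalized):.1f}%")
```

Normalization can only remove the variation shared with the standard; pipetting noise specific to each well remains, which matches the modest improvement reported above.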
Engineered Transport in Microporous Materials and Membranes for Clean Energy Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Changyi; Meckler, Stephen M.; Smith, Zachary P.
Many forward-looking clean-energy technologies hinge on the development of scalable and efficient membrane-based separations. Ongoing investment in the basic research of microporous materials is beginning to pay dividends in membrane technology maturation. Specifically, improvements in membrane selectivity, permeability, and durability are being leveraged for more efficient carbon capture, desalination, and energy storage, and the market adoption of membranes in those areas appears to be on the horizon. Herein, an overview of the microporous materials chemistry driving advanced membrane development, the clean-energy separations employing them, and the theoretical underpinnings tying membrane performance to membrane structure across multiple length scales is provided. The interplay of pore architecture and chemistry for a given set of analytes emerges as a critical design consideration dictating mass transport outcomes. Also discussed are opportunities and outstanding challenges in the field, including high-flux 2D molecular-sieving membranes, phase-change adsorbents as performance-enhancing components in composite membranes, and the need for quantitative metrologies for understanding mass transport in heterophasic materials and in micropores with unusual chemical interactions with analytes of interest.
NASA Astrophysics Data System (ADS)
Potyrailo, Radislav A.; Hassib, Lamyaa
2005-06-01
Multicomponent polymer-based formulations of optical sensor materials are difficult and time-consuming to optimize using conventional approaches. To address these challenges, our long-term goal is to determine relationships between sensor formulation and sensor response parameters using new scientific methodologies. As a first step, we have designed and implemented an automated analytical instrumentation infrastructure for the combinatorial and high-throughput development of polymeric sensor materials for optical sensors. Our approach is based on the fabrication and performance screening of discrete and gradient sensor arrays. Simultaneous formation of multiple sensor coatings into discrete 4×6, 6×8, and 8×12 element arrays (3-15 μL volume per element) and their screening provides not only a well-recognized acceleration in screening rate but also considerably reduces or even eliminates sources of variability that randomly affect sensor response during conventional one-at-a-time coating evaluation. The application of gradient sensor arrays provides additional capabilities for rapidly finding optimal formulation parameters.
Use of allogeneic apheresis stem cell products as an interlaboratory proficiency challenge.
Cooling, Laura; Roxbury, Kelly; Hoffmann, Sandra; DeBusscher, Joan; Kota, Usha; Goldstein, Steven; Davenport, Robertson
2017-06-01
AABB Standards require that laboratories participate in a proficiency test (PT) program for critical analytes. Institutions can purchase commercial PT materials; however, PT can also be performed through interlaboratory exchange. We investigated the utility of allogeneic hematopoietic progenitor cell apheresis (HPC-A) products as an interlaboratory PT challenge for total nucleated cell count (TNC) and CD34 assessment. We performed a three-year retrospective, comparative review of unrelated allogeneic HPC-A products received by the University of Michigan between January 2011 and December 2013. Internal TNC and CD34 counts were compared with those of the external collecting facility by paired t test and linear regression. The absolute and percent differences between external and internal counts and the 95% limits of agreement (95% LA) were determined. Results were analyzed relative to donor center location (international, domestic), time zone (domestic), and calendar year. There was a strong correlation between internal and external TNC, regardless of donor center location or year. For CD34, there was a good correlation between centers (R = 0.88-0.91; slope = 0.95-0.98x) with a median difference of -1% (95% LA, -50%, +47%). This was considerably better than commercial PT challenges, which showed a persistent negative bias for absolute CD34 and CD3 counts. Allogeneic HPC-A products can thus serve as an interlaboratory PT exchange for all critical analytes, including TNC and CD34 count, cell viability, and sterility. Allogeneic HPC-A products, which are fresh and transported under validated conditions, are less subject to the preanalytical variables that may affect commercial PT samples, such as aliquoting and sample homogeneity, commercial additives, and sample stability during manufacturing and transport. © 2017 AABB.
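The 95% limits of agreement quoted above follow from a Bland-Altman-style calculation on paired counts. A minimal sketch on simulated paired CD34 values (not the study's data, which reported a median percent difference):

```python
# Bland-Altman-style limits of agreement for paired counts from two sites.
import numpy as np

rng = np.random.default_rng(5)
external = rng.lognormal(5.0, 0.6, 60)                  # collecting-site counts
internal = external * rng.normal(0.99, 0.25, 60)        # receiving-site recounts

pct_diff = (external - internal) / ((external + internal) / 2) * 100
bias = pct_diff.mean()
half_width = 1.96 * pct_diff.std(ddof=1)
print(f"bias {bias:+.1f}%, 95% LA [{bias - half_width:.1f}%, {bias + half_width:+.1f}%]")
```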
Elemental mapping with energy-dispersive X-ray spectroscopy (EDX) associated with scanning electron microscopy is highly useful for studying internally mixed atmospheric particles. Presented is a study of individual particles from urban airsheds and the analytical challenges in q...
Measuring energy expenditure in clinical populations: rewards and challenges
Psota, T; Chen, KY
2013-01-01
The measurement of energy expenditure (EE) is recommended as an important component of comprehensive clinical nutrition assessment in patients with altered metabolic states, in those who have failed to respond to nutrition support, and in the critically ill who require individualized nutrition support. There is evidence that EE is variable in patients with metabolic diseases such as chronic renal disease, cirrhosis, HIV, cancer cachexia and cystic fibrosis, and in patients under intensive care. By using appropriate techniques and interpretations of basal or resting EE, clinicians can provide adequate nutrition support with minimal negative impact from under- or overfeeding in these patients. This review is based on our current understanding of the different components of EE and the techniques to measure them, and it re-examines advances and challenges in determining energy needs in clinical populations, with particular focus on obese, pediatric and elderly patients. In addition, technological advances have expanded the choice of commercially available equipment for assessing EE, which brings specific challenges and rewards in selecting the right equipment against specific performance criteria. Lastly, analytical considerations for interpreting the results of EE in the context of changing body composition are presented and discussed. PMID:23443826
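As a concrete anchor for the indirect-calorimetry techniques such reviews cover, the abbreviated Weir equation converts measured gas exchange into resting EE. The formula is standard textbook material rather than something quoted from this review:

```python
# Abbreviated Weir equation: resting energy expenditure from gas exchange.
def weir_ree(vo2_ml_min: float, vco2_ml_min: float) -> float:
    """Resting energy expenditure (kcal/day) from VO2 and VCO2 in mL/min."""
    return (3.941 * vo2_ml_min + 1.106 * vco2_ml_min) * 1.44

print(f"{weir_ree(250.0, 200.0):.0f} kcal/day")   # ~1737 for a typical adult
```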
The Purpose of Analytical Models from the Perspective of a Data Provider.
ERIC Educational Resources Information Center
Sheehan, Bernard S.
The purpose of analytical models is to reduce complex institutional management problems and situations to simpler proportions and compressed time frames so that human skills of decision makers can be brought to bear most effectively. Also, modeling cultivates the art of management by forcing explicit and analytical consideration of important…
Considerations in detecting CDC select agents under field conditions
NASA Astrophysics Data System (ADS)
Spinelli, Charles; Soelberg, Scott; Swanson, Nathaneal; Furlong, Clement; Baker, Paul
2008-04-01
Surface Plasmon Resonance (SPR) has become a widely accepted technique for real-time detection of interactions between receptor molecules and ligands. Antibody may serve as receptor and can be attached to the gold surface of the SPR device, while candidate analyte fluids contact the detecting antibody. Minute, but detectable, changes in refractive indices (RI) indicate that analyte has bound to the antibody. A decade ago, an inexpensive, robust, miniature and fully integrated SPR chip, called SPREETA, was developed. University of Washington (UW) researchers subsequently developed a portable, temperature-regulated instrument, called SPIRIT, to simultaneously use eight of these three-channel SPREETA chips. A SPIRIT prototype instrument was tested in the field, coupled to a remote reporting system on a surrogate unmanned aerial vehicle (UAV). Two target protein analytes were released sequentially as aerosols with low analyte concentration during each of three flights and were successfully detected and verified. Laboratory experimentation with a more advanced SPIRIT instrument demonstrated detection of very low levels of several select biological agents that might be employed by bioterrorists. Agent detection under field-like conditions is more challenging, especially as analyte concentrations are reduced and complex matrices are introduced. Two different sample preconditioning protocols have been developed for select agents in complex matrices. Use of these preconditioning techniques has allowed laboratory detection in spiked heavy mud of Francisella tularensis at 10³ CFU/ml, Bacillus anthracis spores at 10³ CFU/ml, Staphylococcal enterotoxin B (SEB) at 1 ng/ml, and Vaccinia virus (a smallpox simulant) at 10⁵ PFU/ml. Ongoing experiments are aimed at simultaneous detection of multiple agents in spiked heavy mud, using a multiplex preconditioning protocol.
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2014 CFR
2014-10-01
... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2012 CFR
2012-10-01
... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...
42 CFR 493.959 - Immunohematology.
Code of Federal Regulations, 2011 CFR
2011-10-01
... challenges per testing event a program must provide for each analyte or test procedure is five. Analyte or... Compatibility testing Antibody identification (d) Evaluation of a laboratory's analyte or test performance. HHS... program must compare the laboratory's response for each analyte with the response that reflects agreement...
Analytical Challenges in Biotechnology.
ERIC Educational Resources Information Center
Glajch, Joseph L.
1986-01-01
Highlights five major analytical areas (electrophoresis, immunoassay, chromatographic separations, protein and DNA sequencing, and molecular structures determination) and discusses how analytical chemistry could further improve these techniques and thereby have a major impact on biotechnology. (JN)
-Omic and Electronic Health Record Big Data Analytics for Precision Medicine.
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D; Venugopalan, Janani; Hoffman, Ryan; Wang, May D
2017-02-01
Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of healthcare. In this paper, we present -omic and EHR data characteristics, associated challenges, and data analytics including data preprocessing, mining, and modeling. To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Big data analytics is able to address -omic and EHR data challenges for a paradigm shift toward precision medicine. Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes. It has a long-lasting societal impact.
Causon, Tim J; Hann, Stephan
2016-09-28
Fermentation and cell culture biotechnology in the form of so-called "cell factories" now plays an increasingly significant role in the production of both large (e.g. proteins, biopharmaceuticals) and small organic molecules for a wide variety of applications. However, associated metabolic engineering optimisation processes relying on genetic modification of organisms used in cell factories, or alteration of production conditions, remain a challenging undertaking for improving the final yield and quality of cell factory products. In addition to genomic, transcriptomic and proteomic workflows, analytical metabolomics continues to play a critical role in studying detailed aspects of critical pathways (e.g. via targeted quantification of metabolites), in identifying biosynthetic intermediates, and in phenotype differentiation and the elucidation of previously unknown pathways (e.g. via non-targeted strategies). However, the diversity of primary and secondary metabolites and the broad concentration ranges encompassed during typical biotechnological processes mean that simultaneous extraction and robust analytical determination of all parts of the metabolome of interest is effectively impossible. As the integration of metabolome data with transcriptome and proteome data is an essential goal of both targeted and non-targeted methods addressing production optimisation goals, additional sample preparation steps beyond the necessary sampling, quenching and extraction protocols, including clean-up, analyte enrichment, and derivatisation, are important considerations for some classes of metabolites, especially those present in low concentrations or exhibiting poor stability. This contribution critically assesses the potential of current sample preparation strategies applied in metabolomic studies of industrially-relevant cell factory organisms using mass spectrometry-based platforms primarily coupled to liquid-phase sample introduction (i.e. flow injection, liquid chromatography, or capillary electrophoresis). Particular focus is placed on the selectivity and degree of enrichment attainable, as well as demands of speed, absolute quantification, robustness and, ultimately, consideration of fully-integrated bioanalytical solutions to optimise sample handling and throughput. Copyright © 2016 Elsevier B.V. All rights reserved.
Engendering development theory from the standpoint of women.
Currie, D H; Wickramasinghe, A
1997-01-01
Although the field of "women and development" emerged as an aftermath of the UN Decade for Women, development planners have treated gender and development as interrelated but analytically distinct by simply tacking the category "women" onto established frameworks or considering women the social "contexts" of development projects. This paper challenges this tendency with a consideration of how the global process of development is conditioned by and constitutive of gender roles and relations in specific cultural contexts. The paper presents a framework for a distinctly feminist political economy of development that moves development theory from its present impasse caused by challenges to the Marxism that has dominated critical development theory. This post-impasse framework poses Marx's theory of exploitation against the experiences of women garment workers in Free Trade Zones in Sri Lanka to illustrate how industrial development through free market channels is necessarily, not merely coincidentally, gendered. Therefore, the framework reveals the importance of engendering development theory itself. The paper opens with an introduction and continues with an exploration of the current theoretical impasse and post-impasse theory. The paper continues with a discussion of standpoint epistemology as the basis for women-centered research, a description of the research on the impact of factory employment on women from rural villages, a consideration of women's proletarianization in terms of the rise of the "new world order," a feminist reading of Marx's theory of exploitation from the standpoint of the garment workers, and an acknowledgement of the challenge posed by this application of standpoint methodology to the study of development to the current rejection by some Western feminists of universalizing categories such as "gender" and "women."
ERIC Educational Resources Information Center
Jang, Eunice E.; McDougall, Douglas E.; Pollon, Dawn; Herbert, Monique; Russell, Pia
2008-01-01
There are both conceptual and practical challenges in dealing with data from mixed methods research studies. There is a need for discussion about various integrative strategies for mixed methods data analyses. This article illustrates integrative analytic strategies for a mixed methods study focusing on improving urban schools facing challenging…
Challenges and Opportunities in Analysing Students Modelling
ERIC Educational Resources Information Center
Blanco-Anaya, Paloma; Justi, Rosária; Díaz de Bustamante, Joaquín
2017-01-01
Modelling-based teaching activities have been designed and analysed from distinct theoretical perspectives. In this paper, we use one of them--the model of modelling diagram (MMD)--as an analytical tool in a regular classroom context. This paper examines the challenges that arise when the MMD is used as an analytical tool to characterise the…
Developing Guidelines for Assessing Visual Analytics Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean
2011-07-01
In this paper, we develop guidelines for evaluating visual analytic environments based on a synthesis of reviews for the entries to the 2009 Visual Analytics Science and Technology (VAST) Symposium Challenge and from a user study with professional intelligence analysts. By analyzing the 2009 VAST Challenge reviews we gained a better understanding of what is important to our reviewers, both visualization researchers and professional analysts. We also report on a small user study with professional analysts to determine the important factors that they use in evaluating visual analysis systems. We then looked at guidelines developed by researchers in various domains and synthesized these into an initial set for use by others in the community. In a second part of the user study, we looked at guidelines for a new aspect of visual analytic systems – the generation of reports. Future visual analytic systems have been challenged to help analysts generate their reports. In our study we worked with analysts to understand the criteria they used to evaluate the quality of analytic reports. We propose that this knowledge will be useful as researchers look at systems to automate some of the report generation. Based on these efforts, we produced initial guidelines for evaluating visual analytic environments and for the evaluation of analytic reports. It is important to understand that these guidelines are initial drafts and are limited in scope because of the type of tasks for which the visual analytic systems used in the studies in this paper were designed. More research and refinement is needed by the Visual Analytics Community to provide additional evaluation guidelines for different types of visual analytic environments.
Parallel Aircraft Trajectory Optimization with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Falck, Robert D.; Gray, Justin S.; Naylor, Bret
2016-01-01
Trajectory optimization is an integral component for the design of aerospace vehicles, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient-based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto based collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the non-linear analysis evaluations and the derivative computations themselves. The constraint aggregation results showed a significant numerical challenge due to difficulty in achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation-based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.
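As a small, generic illustration of the collocation machinery involved (a building block of Legendre-Gauss-Lobatto schemes in general, not the authors' OpenMDAO implementation), the LGL nodes are the interval endpoints plus the roots of the derivative of a Legendre polynomial:

```python
import numpy as np
from numpy.polynomial import legendre as leg

def lgl_nodes(n: int) -> np.ndarray:
    """Return n Legendre-Gauss-Lobatto nodes on [-1, 1] (n >= 2):
    the endpoints plus the interior roots of P'_{n-1}."""
    c = np.zeros(n)
    c[-1] = 1.0                          # coefficients of P_{n-1} in the Legendre basis
    interior = leg.legroots(leg.legder(c))
    return np.concatenate(([-1.0], np.sort(interior), [1.0]))

print(lgl_nodes(5))  # [-1, -0.6547, 0, 0.6547, 1] to four decimal places
```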
Gamma-ray burst jet dynamics and their interaction with the progenitor star.
Lazzati, Davide; Morsony, Brian J; Begelman, Mitchell C
2007-05-15
The association of at least some long gamma-ray bursts with type Ic supernova explosions has been established beyond reasonable doubt. Theoretically, the challenge is to explain the presence of a light hyper-relativistic flow propagating through a massive stellar core without losing those properties. We discuss the role of the jet-star interaction in shaping the properties of the outflow emerging on the surface of the star. We show that the nature of the inner engine is hidden from the observer for most of the evolution, well beyond the time of the jet breakout on the stellar surface. The discussion is based on analytical considerations as well as high resolution numerical simulations. Finally, the observational consequences of the scenario are addressed in light of the present capabilities.
Laser-beam scintillations for weak and moderate turbulence
NASA Astrophysics Data System (ADS)
Baskov, R. A.; Chumak, O. O.
2018-04-01
The scintillation index is obtained for the practically important range of weak and moderate atmospheric turbulence. To study this challenging range, the Boltzmann-Langevin kinetic equation, describing light propagation, is derived from first principles of quantum optics based on the technique of the photon distribution function (PDF) [Berman et al., Phys. Rev. A 74, 013805 (2006), 10.1103/PhysRevA.74.013805]. The paraxial approximation for laser beams reduces the collision integral for the PDF to a two-dimensional operator in the momentum space. Analytical solutions for the average value of the PDF as well as for its fluctuating constituent are obtained using an iterative procedure. The calculated scintillation index is considerably greater than that obtained within the Rytov approximation even at moderate turbulence strength. The relevant explanation is proposed.
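For reference, the weak-turbulence baseline that the paper's kinetic-equation result is compared against is the Rytov approximation; a sketch of the standard plane-wave Rytov variance (a textbook formula, not the paper's derivation), with illustrative parameter values:

```python
import math

def rytov_variance(cn2: float, wavelength_m: float, path_m: float) -> float:
    """Plane-wave Rytov variance: sigma_R^2 = 1.23 * Cn^2 * k^(7/6) * L^(11/6)."""
    k = 2.0 * math.pi / wavelength_m          # optical wavenumber
    return 1.23 * cn2 * k ** (7.0 / 6.0) * path_m ** (11.0 / 6.0)

# e.g. Cn2 = 1e-14 m^(-2/3), lambda = 1.55 um, L = 2 km
print(rytov_variance(1e-14, 1.55e-6, 2000.0))
```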
NM-Scale Anatomy of an Entire Stardust Carrot Track
NASA Technical Reports Server (NTRS)
Nakamura-Messenger, K.; Keller, L. P.; Clemett, S. J.; Messenger, S.
2009-01-01
Comet Wild-2 samples collected by NASA's Stardust mission are extremely complex, heterogeneous, and have experienced wide ranges of alteration during the capture process. There are two major types of track morphologies: "carrot" and "bulbous," that reflect different structural/compositional properties of the impactors. Carrot type tracks are typically produced by compact or single mineral grains which survive essentially intact as a single large terminal particle. Bulbous tracks are likely produced by fine-grained or organic-rich impactors [1]. Owing to the challenging nature and especially high value of Stardust samples, we have invested considerable effort in developing both sample preparation and analytical techniques tailored for Stardust sample analyses. Our report focuses on our systematic disassembly and coordinated analysis of Stardust carrot track #112 from the mm to nm-scale.
Mulder, Leontine; van der Molen, Renate; Koelman, Carin; van Leeuwen, Ester; Roos, Anja; Damoiseaux, Jan
2018-05-01
ISO 15189:2012 requires validation of methods used in the medical laboratory, and lists a series of performance parameters to consider for inclusion. Although these performance parameters are feasible for clinical chemistry analytes, their application in the validation of autoimmunity tests is a challenge. The lack of gold standards or reference methods, in combination with the scarcity of well-defined diagnostic samples from patients with rare diseases, makes validation of new assays difficult. The present manuscript describes the initiative of Dutch medical immunology laboratory specialists to combine efforts and perform multi-center validation studies of new assays in the field of autoimmunity. Validation data and reports are made available to interested Dutch laboratory specialists. Copyright © 2018 Elsevier B.V. All rights reserved.
The Top 10 Challenges in Extreme-Scale Visual Analytics
Wong, Pak Chung; Shen, Han-Wei; Johnson, Christopher R.; Chen, Chaomei; Ross, Robert B.
2013-01-01
In this issue of CG&A, researchers share their R&D findings and results on applying visual analytics (VA) to extreme-scale data. Having surveyed these articles and other R&D in this field, we’ve identified what we consider the top challenges of extreme-scale VA. To cater to the magazine’s diverse readership, our discussion evaluates challenges in all areas of the field, including algorithms, hardware, software, engineering, and social issues. PMID:24489426
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kristin A.; Scholtz, Jean; Whiting, Mark A.
The VAST Challenge has been a popular venue for academic and industry participants for over ten years. Many participants comment that the majority of their time in preparing VAST Challenge entries is spent discovering elements in their software environments that need to be redesigned in order to solve the given task. Fortunately, there is no need to wait until the VAST Challenge is announced to test out software systems. The Visual Analytics Benchmark Repository contains all past VAST Challenge tasks, data, solutions and submissions. This paper details the various types of evaluations that may be conducted using the Repository information. In this paper we describe how developers can do informal evaluations of various aspects of their visual analytics environments using VAST Challenge information. Aspects that can be evaluated include the appropriateness of the software for various tasks, the various data types and formats that can be accommodated, the effectiveness and efficiency of the process supported by the software, and the intuitiveness of the visualizations and interactions. Researchers can compare their visualizations and interactions to those submitted to determine novelty. In addition, the paper provides pointers to various guidelines that software teams can use to evaluate the usability of their software. While these evaluations are not a replacement for formal evaluation methods, this information can be extremely useful during the development of visual analytics environments.
NASA Astrophysics Data System (ADS)
Laminack, William; Gole, James
2015-12-01
A unique MEMS/NEMS approach is presented for the modeling of a detection platform for mixed gas interactions. Mixed gas analytes interact with nanostructured decorating metal oxide island sites supported on a microporous silicon substrate. The Inverse Hard/Soft acid/base (IHSAB) concept is used to assess a diversity of conductometric responses for mixed gas interactions as a function of these nanostructured metal oxides. The analyte conductometric responses are well represented using a combination diffusion/absorption-based model for multi-gas interactions, in which a newly developed response absorption isotherm, based on the Fermi distribution function, is applied. A further coupling of this model with the IHSAB concept describes the considerations in modeling multi-gas mixed analyte-interface and analyte-analyte interactions. Taking into account the molecular electronic interaction of the analytes both with each other and with an extrinsic semiconductor interface, we demonstrate how the presence of one gas can enhance or diminish the reversible interaction of a second gas with that interface. These concepts demonstrate important considerations for array-based formats in multi-gas sensing and its applications.
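The abstract does not give the isotherm's exact functional form, so the following is only a hedged sketch of what a Fermi-distribution-shaped response curve can look like; the parameters conc0, width and r_max are invented for illustration:

```python
import numpy as np

def fermi_response(conc: np.ndarray, conc0: float, width: float, r_max: float) -> np.ndarray:
    """Illustrative Fermi-function-shaped absorption isotherm: the response
    rises around a characteristic concentration conc0 and saturates at r_max."""
    return r_max / (1.0 + np.exp(-(conc - conc0) / width))

conc = np.linspace(0.0, 10.0, 6)   # hypothetical analyte concentrations
print(fermi_response(conc, conc0=5.0, width=1.0, r_max=1.0))
```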
Miller, Brian W.; Morisette, Jeffrey T.
2014-01-01
Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.
Fan Du; Shneiderman, Ben; Plaisant, Catherine; Malik, Sana; Perer, Adam
2017-06-01
The growing volume and variety of data present both opportunities and challenges for visual analytics. Addressing these challenges is needed for big data to provide valuable insights and novel solutions for business, security, social media, and healthcare. In the case of temporal event sequence analytics, it is the number of events in the data and the variety of temporal sequence patterns that challenge users of visual analytic tools. This paper describes 15 strategies for sharpening analytic focus that analysts can use to reduce the data volume and pattern variety. Four groups of strategies are proposed: (1) extraction strategies, (2) temporal folding, (3) pattern simplification strategies, and (4) iterative strategies. For each strategy, we provide examples of the use and impact of this strategy on volume and/or variety. Examples are selected from 20 case studies gathered from either our own work, the literature, or based on email interviews with individuals who conducted the analyses and developers who observed analysts using the tools. Finally, we discuss how these strategies might be combined and report on the feedback from 10 senior event sequence analysts.
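Two of the named strategy groups lend themselves to tiny illustrations: extraction (dropping event types not under study) and temporal folding (collapsing dates onto a recurring cycle). The records below are invented for the sketch:

```python
from datetime import datetime

events = [  # (timestamp, event type) records; illustrative only
    (datetime(2017, 3, 1, 9, 15), "admit"),
    (datetime(2017, 3, 1, 14, 2), "lab"),
    (datetime(2017, 3, 2, 9, 40), "discharge"),
]

KEEP = {"admit", "discharge"}                      # extraction strategy: keep only
extracted = [e for e in events if e[1] in KEEP]    # the event types under study

folded = [(ts.hour, kind) for ts, kind in events]  # temporal folding: collapse dates
print(extracted, folded)                           # onto an hour-of-day cycle
```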
-Omic and Electronic Health Records Big Data Analytics for Precision Medicine
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.
2017-01-01
Objective Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes. It has a long-lasting societal impact. PMID:27740470
Dobbin, Kevin K; Cesano, Alessandra; Alvarez, John; Hawtin, Rachael; Janetzki, Sylvia; Kirsch, Ilan; Masucci, Giuseppe V; Robbins, Paul B; Selvan, Senthamil R; Streicher, Howard Z; Zhang, Jenny; Butterfield, Lisa H; Thurin, Magdalena
2016-01-01
There is growing recognition that immunotherapy is likely to significantly improve health outcomes for cancer patients in the coming years. Currently, while a subset of patients experience substantial clinical benefit in response to different immunotherapeutic approaches, the majority of patients do not, yet are still exposed to significant drug toxicities. Therefore, a growing need for the development and clinical use of predictive biomarkers exists in the field of cancer immunotherapy. Predictive cancer biomarkers can be used to identify the patients who are or who are not likely to derive benefit from specific therapeutic approaches. In order to be applicable in a clinical setting, predictive biomarkers must be carefully shepherded through a step-wise, highly regulated developmental process. Volume I of this two-volume document focused on the pre-analytical and analytical phases of the biomarker development process, by providing background, examples and "good practice" recommendations. In the current Volume II, the focus is on the clinical validation, validation of clinical utility and regulatory considerations for biomarker development. Together, this two-volume series is meant to provide guidance on the entire biomarker development process, with a particular focus on the unique aspects of developing immune-based biomarkers. Specifically, knowledge about the challenges to clinical validation of predictive biomarkers, which has been gained from numerous successes and failures in other contexts, will be reviewed together with statistical methodological issues related to bias and overfitting. The different trial designs used for the clinical validation of biomarkers will also be discussed, as the selection of clinical metrics and endpoints becomes critical to establish the clinical utility of the biomarker during the clinical validation phase of the biomarker development. Finally, the regulatory aspects of submission of biomarker assays to the U.S. Food and Drug Administration as well as regulatory considerations in the European Union will be covered.
O'Brien, Kelly K; Colquhoun, Heather; Levac, Danielle; Baxter, Larry; Tricco, Andrea C; Straus, Sharon; Wickerson, Lisa; Nayar, Ayesha; Moher, David; O'Malley, Lisa
2016-07-26
Scoping studies (or reviews) are a method used to comprehensively map evidence across a range of study designs in an area, with the aim of informing future research practice, programs and policy. However, no universal agreement exists on terminology, definition or methodological steps. Our aim was to understand the experiences of, and considerations for conducting scoping studies from the perspective of academic and community partners. Primary objectives were to 1) describe experiences conducting scoping studies including strengths and challenges; and 2) describe perspectives on terminology, definition, and methodological steps. We conducted a cross-sectional web-based survey with clinicians, educators, researchers, knowledge users, representatives from community-based organizations, graduate students, and policy stakeholders with experience and/or interest in conducting scoping studies to gain an understanding of experiences and perspectives on the conduct and reporting of scoping studies. We administered an electronic self-reported questionnaire comprised of 22 items related to experiences with scoping studies, strengths and challenges, opinions on terminology, and methodological steps. We analyzed questionnaire data using descriptive statistics and content analytical techniques. Survey results were discussed during a multi-stakeholder consultation to identify key considerations in the conduct and reporting of scoping studies. Of the 83 invitations, 54 individuals (65 %) completed the scoping questionnaire, and 48 (58 %) attended the scoping study meeting from Canada, the United Kingdom and United States. Many scoping study strengths were dually identified as challenges including breadth of scope, and iterative process. No consensus on terminology emerged, however key defining features that comprised a working definition of scoping studies included the exploratory mapping of literature in a field; iterative process, inclusion of grey literature; no quality assessment of included studies, and an optional consultation phase. We offer considerations for the conduct and reporting of scoping studies for researchers, clinicians and knowledge users engaging in this methodology. Lack of consensus on scoping terminology, definition and methodological steps persists. Reasons for this may be attributed to diversity of disciplines adopting this methodology for differing purposes. Further work is needed to establish guidelines on the reporting and methodological quality assessment of scoping studies.
Vialaret, Jérôme; Picas, Alexia; Delaby, Constance; Bros, Pauline; Lehmann, Sylvain; Hirtz, Christophe
2018-06-01
Hepcidin-25 peptide is a biomarker which is known to have considerable clinical potential for diagnosing iron-related diseases. Developing analytical methods for the absolute quantification of hepcidin is still a real challenge, however, due to the sensitivity, specificity and reproducibility issues involved. In this study, we compare and discuss two MS-based assays for quantifying hepcidin, which differ only in terms of the type of liquid chromatography (nano LC/MS versus standard LC/MS) involved. The same sample preparation, the same internal standards and the same MS analyzer were used with both approaches. In the field of proteomics, nano LC is generally known to be more sensitive and less robust than standard LC methods. In this study, we established that the performance of the standard LC method is equivalent to that of our previously developed nano LC method. As the analytical performances were very similar in both cases, the standard-flow platform provides the more suitable alternative for accurately determining hepcidin in clinical settings. Copyright © 2018 Elsevier B.V. All rights reserved.
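A minimal sketch of the stable-isotope-dilution calibration that typically underlies absolute LC/MS quantification of peptides such as hepcidin (a generic scheme assumed here, not taken from this paper); the calibrator concentrations and peak-area ratios are invented:

```python
import numpy as np

cal_conc = np.array([1.0, 5.0, 10.0, 25.0, 50.0])     # calibrators, ng/mL
cal_ratio = np.array([0.11, 0.52, 1.05, 2.48, 5.10])  # area(analyte) / area(internal standard)

slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)  # linear calibration curve

unknown_ratio = 1.80                                   # measured in a patient sample
conc = (unknown_ratio - intercept) / slope             # back-calculated concentration, ng/mL
print(f"{conc:.1f} ng/mL")
```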
The "hospital central laboratory": automation, integration and clinical usefulness.
Zaninotto, Martina; Plebani, Mario
2010-07-01
Recent technological developments in laboratory medicine have led to a major challenge: maintaining a close connection between the search for efficiency through automation and consolidation and the assurance of effectiveness. The adoption of systems that automate most of the manual tasks characterizing routine activities has significantly improved the quality of laboratory performance; total laboratory automation is the paradigm of the idea that "human-less" robotic laboratories may allow for better operation and ensure fewer human errors. Furthermore, even if ongoing technological developments have considerably improved the productivity of clinical laboratories as well as reducing the turnaround time of the entire process, the value of qualified personnel remains a significant issue. Recent evidence confirms that automation allows clinical laboratories to improve analytical performance only if trained staff operate in accordance with well-defined standard operative procedures, thus assuring continuous monitoring of analytical quality. In addition, laboratory automation may improve the appropriateness of test requests through the use of algorithms and reflex testing. This should allow the adoption of clinical and biochemical guidelines. In conclusion, in laboratory medicine, technology represents a tool for improving clinical effectiveness and patient outcomes, but it has to be managed by qualified laboratory professionals.
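To make the reflex-testing idea concrete, here is a hedged sketch of the kind of rule table such algorithms encode; the TSH-to-free-T4 cascade and its reference interval are a textbook example, not drawn from this paper:

```python
def reflex_orders(test: str, value: float) -> list:
    """Return follow-up tests to add automatically, given a result."""
    rules = {
        # If TSH falls outside an assumed 0.4-4.0 mIU/L reference interval,
        # reflexively order free T4.
        "TSH": lambda v: ["FT4"] if not (0.4 <= v <= 4.0) else [],
    }
    return rules.get(test, lambda v: [])(value)

print(reflex_orders("TSH", 7.2))  # ['FT4']
```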
The challenges of editorship: a reflection on editing the Jung-Neumann correspondence.
Liebscher, Martin
2016-04-01
The complete correspondence between C.G. Jung and Erich Neumann was published in 2015. This article attempts to provide insight into the practical task, as well as the theoretical background, of the editing process. The advantages and possibilities of an unabridged edition with an extensive historical contextualization are demonstrated, and compared to the approach of the editors of the Jung Letters and their selection therein of Jung's letters to Neumann. The practical points under consideration include the establishment of the letter corpus, the ascertainment of dates and the chronological arrangement of the letter exchange, as well as the deciphering of handwritten letters. Theoretical aspects under discussion involve the question of the merits of a critical contextualisation and the position of the editor vis-à-vis the research object. The example of the selecting and editing of Jung's letters to Neumann by Aniela Jaffé and Gerhard Adler reveals how drastically the close ties of those editors with Jung, Neumann, and members of the Zurich analytical circles compromised their editorial work at times. The advantage for an editor being able to work from an historical distance is appreciated. © 2016, The Society of Analytical Psychology.
ERIC Educational Resources Information Center
Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting
2016-01-01
This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…
Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data.
Hammitt, Laura L; Feikin, Daniel R; Scott, J Anthony G; Zeger, Scott L; Murdoch, David R; O'Brien, Katherine L; Deloria Knoll, Maria
2017-06-15
Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
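One of the adjustments mentioned above has a classical closed form: the Rogan-Gladen estimator corrects an observed positivity rate for imperfect sensitivity and specificity (a standard epidemiological formula, shown here for context rather than as this paper's integrated method):

```python
def adjusted_prevalence(p_obs: float, sens: float, spec: float) -> float:
    """Rogan-Gladen estimate: (observed rate + specificity - 1) / (sensitivity + specificity - 1)."""
    return (p_obs + spec - 1.0) / (sens + spec - 1.0)

# e.g. 30% of specimens test positive on an assay with 80% sensitivity
# and 95% specificity; the corrected prevalence is about 33%.
print(adjusted_prevalence(0.30, 0.80, 0.95))
```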
Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data
Feikin, Daniel R.; Scott, J. Anthony G.; Zeger, Scott L.; Murdoch, David R.; O’Brien, Katherine L.; Deloria Knoll, Maria
2017-01-01
Abstract Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. PMID:28575372
Application of GIS technology in public health: successes and challenges.
Fletcher-Lartey, Stephanie M; Caprarelli, Graziella
2016-04-01
The uptake and acceptance of Geographic Information Systems (GIS) technology has increased since the early 1990s and public health applications are rapidly expanding. In this paper, we summarize the common uses of GIS technology in the public health sector, emphasizing applications related to mapping and understanding of parasitic diseases. We also present some of the success stories, and discuss the challenges that still prevent a full scope application of GIS technology in the public health context. Geographical analysis has allowed researchers to interlink health, population and environmental data, thus enabling them to evaluate and quantify relationships between health-related variables and environmental risk factors at different geographical scales. The ability to access, share and utilize satellite and remote-sensing data has made possible even wider understanding of disease processes and of their links to the environment, an important consideration in the study of parasitic diseases. For example, disease prevention and control strategies resulting from investigations conducted in a GIS environment have been applied in many areas, particularly in Africa. However, there remain several challenges to a more widespread use of GIS technology, such as: limited access to GIS infrastructure, inadequate technical and analytical skills, and uneven data availability. Opportunities exist for international collaboration to address these limitations through knowledge sharing and governance.
ERIC Educational Resources Information Center
Muganda, Cornelia K.; Samzugi, Athuman S.; Mallinson, Brenda J.
2016-01-01
This paper shares analytical insights on the position, challenges and potential for promoting Open Educational Resources (OER) in African Open Distance and eLearning (ODeL) institutions. The researchers sought to use a participatory research approach as described by Krishnaswamy (2004), in convening a sequence of two workshops at the Open…
Analytical Sociology: A Bungean Appreciation
ERIC Educational Resources Information Center
Wan, Poe Yu-ze
2012-01-01
Analytical sociology, an intellectual project that has garnered considerable attention across a variety of disciplines in recent years, aims to explain complex social processes by dissecting them, accentuating their most important constituent parts, and constructing appropriate models to understand the emergence of what is observed. To achieve…
EXAMPLES OF THE ROLE OF ANALYTICAL CHEMISTRY IN ENVIRONMENTAL RISK MANAGEMENT RESEARCH
Analytical chemistry is an important tier of environmental protection and has been traditionally linked to compliance and/or exposure monitoring activities for environmental contaminants. The adoption of the risk management paradigm has led to special challenges for analytical ch...
Clarke, Robin; Hackbarth, Andrew S; Saigal, Christopher; Skootsky, Samuel A
2015-10-01
Evolving payer and patient expectations have challenged academic health centers (AHCs) to improve the value of clinical care. Traditional quality approaches may be unable to meet this challenge. One AHC, UCLA Health, has implemented a systematic approach to delivery system redesign that emphasizes clinician engagement, a patient-centric scope, and condition-specific, clinician-guided measurement. A physician champion serves as quality officer (QO) for each clinical department/division. Each QO, with support from a central measurement team, has developed customized analytics that use clinical data to define targeted populations and measure care across the full treatment episode. From October 2012 through June 2015, the approach developed rapidly. Forty-three QOs are actively redesigning care delivery protocols within their specialties, and 95% of the departments/divisions have received a customized measure report for at least one patient population. As an example of how these analytics promote systematic redesign, the authors discuss how Department of Urology physicians have used these new measures, first, to better understand the relationship between clinical practice and outcomes for patients with benign prostatic hyperplasia and, then, to work toward reducing unwarranted variation. Physicians have received these efforts positively. Early outcome data are encouraging. This infrastructure of engaged physicians and targeted measurement is being used to implement systematic care redesign that reliably achieves outcomes that are meaningful to patients and clinicians, incorporating both clinical and cost considerations. QOs are using an approach, for multiple newly launched projects, to identify, test, and implement value-oriented interventions tailored to specific patient populations.
Plenis, Alina; Oledzka, Ilona; Kowalski, Piotr; Baczek, Tomasz
2016-01-01
During the last few years there has been growing interest in research focused on the metabolism of steroid hormones, despite the fact that the study of metabolic hormone pathways remains a difficult and demanding task because of low steroid concentrations and the complexity of the analysed matrices. Thus, there has been increasing interest in the development of new, more selective and sensitive methods for monitoring these compounds in biological samples. Multiple bibliographic databases of the world research literature were systematically searched using a selected review question and inclusion/exclusion criteria. Next, the reports of the highest quality (181) were selected using standard tools and described to evaluate the advantages and limitations of different approaches to the measurement of steroids and their metabolites. The overview covers the analytical challenges, the development of methods used in the assessment of the metabolic pathways of steroid hormones, and the priorities for future research, with special consideration given to liquid chromatography (LC) and capillary electrophoresis (CE) techniques. Moreover, many LC and CE applications in pharmacological and psychological studies as well as endocrinology and sports medicine, taking into account the recent progress in the area of the metabolic profiling of steroids, have been critically discussed. The latest reports show that LC systems coupled with mass spectrometry hold the predominant position in the research of steroid profiles. Moreover, CE techniques are likely to gain a prominent position in the diagnosis of hormone levels in the near future.
Finley, Anna J; Tang, David; Schmeichel, Brandon J
2015-01-01
Prior research has found that persons who favor more analytic modes of thought are less religious. We propose that individual differences in analytic thought are associated with reduced religious beliefs particularly when analytic thought is measured (hence, primed) first. The current study provides a direct replication of prior evidence that individual differences in analytic thinking are negatively related to religious beliefs when analytic thought is measured before religious beliefs. When religious belief is measured before analytic thinking, however, the negative relationship is reduced to non-significance, suggesting that the link between analytic thought and religious belief is more tenuous than previously reported. The current study suggests that whereas inducing analytic processing may reduce religious belief, more analytic thinkers are not necessarily less religious. The potential for measurement order to inflate the inverse correlation between analytic thinking and religious beliefs deserves additional consideration.
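A sketch of the order-moderation check the finding implies: estimate the analytic-thinking/religiosity correlation separately within each measurement-order condition. The data below are simulated so that the association exists only when analytic thought comes first; nothing here reproduces the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200
order = rng.integers(0, 2, n)          # 0 = religiosity first, 1 = analytic thinking first
analytic = rng.normal(size=n)
religiosity = -0.4 * analytic * order + rng.normal(size=n)  # simulated order-dependent link

r_primed = stats.pearsonr(analytic[order == 1], religiosity[order == 1])
r_unprimed = stats.pearsonr(analytic[order == 0], religiosity[order == 0])
print(r_primed, r_unprimed)            # negative correlation only in the primed condition
```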
Finley, Anna J.; Tang, David; Schmeichel, Brandon J.
2015-01-01
Prior research has found that persons who favor more analytic modes of thought are less religious. We propose that individual differences in analytic thought are associated with reduced religious beliefs particularly when analytic thought is measured (hence, primed) first. The current study provides a direct replication of prior evidence that individual differences in analytic thinking are negatively related to religious beliefs when analytic thought is measured before religious beliefs. When religious belief is measured before analytic thinking, however, the negative relationship is reduced to non-significance, suggesting that the link between analytic thought and religious belief is more tenuous than previously reported. The current study suggests that whereas inducing analytic processing may reduce religious belief, more analytic thinkers are not necessarily less religious. The potential for measurement order to inflate the inverse correlation between analytic thinking and religious beliefs deserves additional consideration. PMID:26402334
The Analytic Hierarchy Process and Participatory Decisionmaking
Daniel L. Schmoldt; Daniel L. Peterson; Robert L. Smith
1995-01-01
Managing natural resource lands requires social, as well as biophysical, considerations. Unfortunately, it is extremely difficult to accurately assess and quantify changing social preferences, and to aggregate conflicting opinions held by diverse social groups. The Analytic Hierarchy Process (AHP) provides a systematic, explicit, rigorous, and robust mechanism for...
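The core AHP computation is standard and compact: derive priority weights as the principal eigenvector of a pairwise comparison matrix, then check the consistency of the judgments. The three-criteria matrix below is invented for illustration:

```python
import numpy as np

A = np.array([[1.0, 3.0, 5.0],        # pairwise judgments among three criteria
              [1/3., 1.0, 2.0],       # (Saaty 1-9 scale, reciprocal matrix)
              [1/5., 1/2., 1.0]])

vals, vecs = np.linalg.eig(A)
i = int(np.argmax(vals.real))
weights = np.abs(vecs[:, i].real)
weights /= weights.sum()              # priority weights, summing to 1

n = A.shape[0]
ci = (vals.real[i] - n) / (n - 1)     # consistency index
cr = ci / 0.58                        # random index ~0.58 for n = 3; CR < 0.1 is acceptable
print(weights, cr)
```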
Fundamentals of Sports Analytics.
Wasserman, Erin B; Herzog, Mackenzie M; Collins, Christy L; Morris, Sarah N; Marshall, Stephen W
2018-07-01
Recently, the importance of statistics and analytics in sports has increased. This review describes measures of sports injury and fundamentals of sports injury research with a brief overview of some of the emerging measures of sports performance. We describe research study designs that can be used to identify risk factors for injury, injury surveillance programs, and common measures of injury risk and association. Finally, we describe measures of physical performance and training and considerations for using these measures. This review provides sports medicine clinicians with an understanding of current research measures and considerations for designing sports injury research studies. Copyright © 2018 Elsevier Inc. All rights reserved.
Ethics and Justice in Learning Analytics
ERIC Educational Resources Information Center
Johnson, Jeffrey Alan
2017-01-01
The many complex challenges posed by learning analytics can best be understood within a framework of structural justice, which focuses on the ways in which the informational, operational, and organizational structures of learning analytics influence students' capacities for self-development and self-determination. This places primary…
Learning Analytics: Challenges and Limitations
ERIC Educational Resources Information Center
Wilson, Anna; Watson, Cate; Thompson, Terrie Lynn; Drew, Valerie; Doyle, Sarah
2017-01-01
Learning analytic implementations are increasingly being included in learning management systems in higher education. We lay out some concerns with the way learning analytics--both data and algorithms--are often presented within an unproblematized Big Data discourse. We describe some potential problems with the often implicit assumptions about…
Dolbin-MacNab, Megan L; Yancura, Loriena A
2018-01-01
Globally, it is common for grandparents to serve as surrogate parents to their grandchildren, often in response to family crises and other challenges such as poverty, disease epidemics, and migration. Despite the global nature of this intergenerational caregiving arrangement, there have been few contextually focused examinations of how grandparents' surrogate parenting roles are enacted across countries and cultures. This analytic review addresses this issue by exploring demographic and cultural contexts, needs and experiences, and formal and informal supports for grandparents raising grandchildren in four diverse countries: China, New Zealand, Romania, and South Africa. We conclude our analysis by discussing key contextual factors, and their associated interrelationships, from which future research may elucidate how cultural, historical, and sociopolitical factors uniquely shape grandparents' experiences. We also make recommendations for contextually informed policies and practice.
A reference web architecture and patterns for real-time visual analytics on large streaming data
NASA Astrophysics Data System (ADS)
Kandogan, Eser; Soroker, Danny; Rohall, Steven; Bak, Peter; van Ham, Frank; Lu, Jie; Ship, Harold-Jeffrey; Wang, Chun-Fu; Lai, Jennifer
2013-12-01
Monitoring and analysis of streaming data, such as social media, sensors, and news feeds, has become increasingly important for business and government. The volume and velocity of incoming data are key challenges. To effectively support monitoring and analysis, statistical and visual analytics techniques need to be seamlessly integrated; analytic techniques for a variety of data types (e.g., text, numerical) and scope (e.g., incremental, rolling-window, global) must be properly accommodated; interaction, collaboration, and coordination among several visualizations must be supported in an efficient manner; and the system should support the use of different analytics techniques in a pluggable manner. Especially in web-based environments, these requirements pose restrictions on the basic visual analytics architecture for streaming data. In this paper we report on our experience of building a reference web architecture for real-time visual analytics of streaming data, identify and discuss architectural patterns that address these challenges, and report on applying the reference architecture for real-time Twitter monitoring and analysis.
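One server-side pattern implied above, maintaining a rolling-window aggregate over the stream so that web clients can poll compact summaries, can be sketched as follows; this is an assumed minimal design, not the paper's reference architecture:

```python
import time
from collections import deque

class RollingCounter:
    """Count stream events that arrived within the last window_s seconds."""

    def __init__(self, window_s: float):
        self.window_s = window_s
        self.stamps = deque()

    def add(self, t: float = None) -> None:
        self.stamps.append(time.time() if t is None else t)

    def count(self, now: float = None) -> int:
        now = time.time() if now is None else now
        while self.stamps and self.stamps[0] < now - self.window_s:
            self.stamps.popleft()      # evict events outside the window
        return len(self.stamps)

posts_last_minute = RollingCounter(60.0)   # e.g. tweets mentioning a keyword
```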
Leveraging multidisciplinarity in a visual analytics graduate course.
Elmqvist, Niklas; Ebert, David S
2012-01-01
Demand is growing in engineering, business, science, research, and industry for students with visual analytics expertise. However, teaching VA is challenging owing to the multidisciplinary nature of the topic, students' diverse backgrounds, and the corresponding requirements for instructors. This article reports best practices from a VA graduate course at Purdue University, where instructors leveraged these challenges to their advantage instead of trying to mitigate them.
Proxy-SU(3) symmetry in heavy deformed nuclei
NASA Astrophysics Data System (ADS)
Bonatsos, Dennis; Assimakis, I. E.; Minkov, N.; Martinou, Andriana; Cakirli, R. B.; Casten, R. F.; Blaum, K.
2017-06-01
Background: Microscopic calculations of heavy nuclei face considerable difficulties due to the sizes of the matrices that need to be solved. Various approximation schemes have been invoked, for example by truncating the spaces, imposing seniority limits, or appealing to various symmetry schemes such as pseudo-SU(3). This paper proposes a new symmetry scheme also based on SU(3). This proxy-SU(3) can be applied to well-deformed nuclei, is simple to use, and can yield analytic predictions. Purpose: To present the new scheme and its microscopic motivation, and to test it using a Nilsson model calculation with the original shell model orbits and with the new proxy set. Method: We invoke an approximate, analytic, treatment of the Nilsson model, that allows the above vetting and yet is also transparent in understanding the approximations involved in the new proxy-SU(3). Results: It is found that the new scheme yields a Nilsson diagram for well-deformed nuclei that is very close to the original Nilsson diagram. The specific levels of approximation in the new scheme are also shown, for each major shell. Conclusions: The new proxy-SU(3) scheme is a good approximation to the full set of orbits in a major shell. Being able to replace a complex shell model calculation with a symmetry-based description now opens up the possibility to predict many properties of nuclei analytically and often in a parameter-free way. The new scheme works best for heavier nuclei, precisely where full microscopic calculations are most challenged. Some cases in which the new scheme can be used, often analytically, to make specific predictions, are shown in a subsequent paper.
SociAL Sensor Analytics: Measuring Phenomenology at Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Corley, Courtney D.; Dowling, Chase P.; Rose, Stuart J.
The objective of this paper is to present a system for interrogating immense social media streams through analytical methodologies that characterize topics and events critical to tactical and strategic planning. First, we propose a conceptual framework for interpreting social media as a sensor network. Time-series models and topic clustering algorithms are used to implement this concept into a functioning analytical system. Next, we address two scientific challenges: 1) to understand, quantify, and baseline the phenomenology of social media at scale, and 2) to develop analytical methodologies to detect and investigate events of interest. This paper then documents computational methods and reports experimental findings that address these challenges. Ultimately, the ability to process billions of social media posts per week over a period of years enables the identification of patterns and predictors of tactical and strategic concerns at an unprecedented rate through SociAL Sensor Analytics (SALSA).
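A toy illustration of the baselining idea: flag hours whose post volume deviates from the historical mean by more than three standard deviations. The data are synthetic and the threshold is an assumption, not SALSA's actual detector:

```python
import numpy as np

rng = np.random.default_rng(1)
volume = rng.poisson(lam=500, size=24 * 7).astype(float)  # hourly post counts, one week
volume[100] = 2000.0                                      # injected burst of activity

z = (volume - volume.mean()) / volume.std()               # z-score against the baseline
events = np.flatnonzero(z > 3.0)                          # candidate hours of interest
print(events)
```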
Lindgren, Annie R; Anderson, Frank E
2018-01-01
Historically, deep-level relationships within the molluscan class Cephalopoda (squids, cuttlefishes, octopods and their relatives) have remained elusive due in part to the considerable morphological diversity of extant taxa, a limited fossil record for species that lack a calcareous shell and difficulties in sampling open ocean taxa. Many conflicts identified by morphologists in the early 1900s remain unresolved today in spite of advances in morphological, molecular and analytical methods. In this study we assess the utility of transcriptome data for resolving cephalopod phylogeny, with special focus on the orders of Decapodiformes (open-eye squids, bobtail squids, cuttlefishes and relatives). To do so, we took new and previously published transcriptome data and used a unique cephalopod core ortholog set to generate a dataset that was subjected to an array of filtering and analytical methods to assess the impacts of: taxon sampling, ortholog number, compositional and rate heterogeneity and incongruence across loci. Analyses indicated that datasets that maximized taxonomic coverage but included fewer orthologs were less stable than datasets that sacrificed taxon sampling to increase the number of orthologs. Clades recovered irrespective of dataset, filtering or analytical method included Octopodiformes (Vampyroteuthis infernalis + octopods), Decapodiformes (squids, cuttlefishes and their relatives), and orders Oegopsida (open-eyed squids) and Myopsida (e.g., loliginid squids). Ordinal-level relationships within Decapodiformes were the most susceptible to dataset perturbation, further emphasizing the challenges associated with uncovering relationships at deep nodes in the cephalopod tree of life. Copyright © 2017 Elsevier Inc. All rights reserved.
Human performance modeling for system of systems analytics: combat performance-shaping factors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawton, Craig R.; Miller, Dwight Peter
The US military has identified Human Performance Modeling (HPM) as a significant requirement and challenge of future systems modeling and analysis initiatives. To support this goal, Sandia National Laboratories (SNL) has undertaken a program of HPM as an integral augmentation to its system-of-system (SoS) analytics capabilities. The previous effort, reported in SAND2005-6569, evaluated the effects of soldier cognitive fatigue on SoS performance. The current effort began with a very broad survey of any performance-shaping factors (PSFs) that might also affect soldiers' performance in combat situations. The work included consideration of three different approaches to cognition modeling and how appropriate they would be for application to SoS analytics. The bulk of this report categorizes 47 PSFs into three groups (internal, external, and task-related) and provides brief descriptions of how each affects combat performance, according to the literature. The PSFs were then assembled into a matrix with 22 representative military tasks and assigned one of four levels of estimated negative impact on task performance, based on the literature. Blank versions of the matrix were then sent to two ex-military subject-matter experts to be filled out based on their personal experiences. Data analysis was performed to identify the consensus most influential PSFs. Results indicate that combat-related injury, cognitive fatigue, inadequate training, physical fatigue, thirst, stress, poor perceptual processing, and presence of chemical agents are among the PSFs with the most negative impact on combat performance.
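A toy illustration of the PSF-by-task consensus analysis described above (hypothetical numbers; the real matrix has 47 PSFs by 22 tasks rated on a four-level scale):

```python
# Average two subject-matter experts' impact ratings and rank the PSFs.
import numpy as np

psfs = ["combat injury", "cognitive fatigue", "thirst", "training deficit"]
# rows: PSFs; columns: representative tasks; entries: impact level 0-3
expert_a = np.array([[3, 3, 2], [2, 3, 2], [1, 2, 1], [2, 2, 3]])
expert_b = np.array([[3, 2, 3], [3, 2, 2], [1, 1, 2], [2, 3, 2]])

consensus = (expert_a + expert_b) / 2.0   # average the SME ratings
mean_impact = consensus.mean(axis=1)      # average across tasks
for psf, score in sorted(zip(psfs, mean_impact), key=lambda t: -t[1]):
    print(f"{psf}: {score:.2f}")          # highest-impact PSFs first
```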
NASA Astrophysics Data System (ADS)
Omenetto, N.; Smith, B. W.; Winefordner, J. D.
1989-01-01
Several theoretical considerations are given on the potential and practical capabilities of a detector of fluorescence radiation whose operating principle is based on a multi-step excitation-ionization scheme involving the fluorescence photons as the first excitation step. This detection technique, first proposed by Matveev et al. [Zh. Anal. Khim. 34, 846 (1979)], combines two independent atomizers: one analytical cell for the excitation of the sample fluorescence and one cell, filled with pure analyte atomic vapor, acting as the ionization detector. One laser beam excites the analyte fluorescence in the analytical cell and one (or two) laser beams are used to ionize the excited atoms in the detector. Several different sources of signal and noise are evaluated, together with a discussion of possible analytical atom reservoirs (flames, furnaces) and laser sources that could be used with this approach. Under properly devised conditions, i.e. optical saturation of the fluorescence and unity ionization efficiency, detection limits well below pg/ml in solution, and well below femtograms as absolute amounts in furnaces, can be predicted. However, scattering problems, which are absent in a conventional laser-enhanced ionization set-up, may be important in this approach.
Challenge Activities for the Physical Education Classroom: Considerations
ERIC Educational Resources Information Center
McKenzie, Emily; Tapps, Tyler; Fink, Kevin; Symonds, Matthew L.
2018-01-01
The purpose of this article is to provide physical education teachers with the tools to develop and implement challenge course-like activities in their physical education classes. The article also covers environmental considerations for teachers who have the desire to create a challenge-based classroom setting in order to reach a wider and more…
Putting an Ethical Lens on Learning Analytics
ERIC Educational Resources Information Center
West, Deborah; Huijser, Henk; Heath, David
2016-01-01
As learning analytics activity has increased, a variety of ethical implications and considerations have emerged, though a significant research gap remains in explicitly investigating the views of key stakeholders, such as academic staff. This paper draws on ethics-related findings from an Australian study featuring two surveys, one of…
Risk management of drinking water relies on quality analytical data. Analytical methodology can often be adapted from environmental monitoring sources. However, risk management sometimes presents special analytical challenges because data may be needed from a source for which n...
Pfeiffer, Christine M; Looker, Anne C
2017-12-01
Biochemical assessment of iron status relies on serum-based indicators, such as serum ferritin (SF), transferrin saturation, and soluble transferrin receptor (sTfR), as well as erythrocyte protoporphyrin. These indicators present challenges for clinical practice and national nutrition surveys, and often iron status interpretation is based on the combination of several indicators. The diagnosis of iron deficiency (ID) through SF concentration, the most commonly used indicator, is complicated by concomitant inflammation. sTfR concentration is an indicator of functional ID that is not an acute-phase reactant, but challenges in its interpretation arise because of the lack of assay standardization, common reference ranges, and common cutoffs. It is unclear which indicators are best suited to assess excess iron status. The value of hepcidin, non-transferrin-bound iron, and reticulocyte indexes is being explored in research settings. Serum-based indicators are generally measured on fully automated clinical analyzers available in most hospitals. Although international reference materials have been available for years, the standardization of immunoassays is complicated by the heterogeneity of antibodies used and the absence of physicochemical reference methods to establish "true" concentrations. From 1988 to 2006, the assessment of iron status in NHANES was based on the multi-indicator ferritin model. However, the model did not indicate the severity of ID and produced categorical estimates. More recently, iron status assessment in NHANES has used the total body iron stores (TBI) model, in which the log ratio of sTfR to SF is assessed. Together, sTfR and SF concentrations cover the full range of iron status. The TBI model better predicts the absence of bone marrow iron than SF concentration alone, and TBI can be analyzed as a continuous variable. Additional consideration of methodologies, interpretation of indicators, and analytic standardization is important for further improvements in iron status assessment. © 2017 American Society for Nutrition.
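The TBI model referenced above is commonly implemented with the log-ratio equation of Cook et al. (Blood, 2003); a minimal sketch, assuming sTfR reported in mg/L and SF in µg/L as in NHANES practice:

```python
import math

def total_body_iron(stfr_mg_per_l: float, ferritin_ug_per_l: float) -> float:
    """Cook's total body iron model: log10 ratio of sTfR to serum ferritin.

    Returns mg of storage iron per kg body weight; negative values
    indicate a tissue iron deficit. sTfR is converted from mg/L to
    ug/L so both indicators share units before taking the ratio.
    """
    ratio = (stfr_mg_per_l * 1000.0) / ferritin_ug_per_l
    return -(math.log10(ratio) - 2.8229) / 0.1207

# Example (illustrative values): ~5 mg/kg of storage iron
print(total_body_iron(5.0, 30.0))
```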
Modulation aware cluster size optimisation in wireless sensor networks
NASA Astrophysics Data System (ADS)
Sriram Naik, M.; Kumar, Vinay
2017-07-01
Wireless sensor networks (WSNs) play a great role because of their numerous advantages to mankind. The main challenge with WSNs is energy efficiency. In this paper, we have focused on energy minimisation with the help of cluster size optimisation, along with consideration of modulation effects when the nodes are not able to communicate using a baseband communication technique. Cluster size optimisation is an important technique to improve the performance of WSNs: it provides improvements in energy efficiency, network scalability, network lifetime and latency. We have proposed an analytical expression for cluster size optimisation using the traditional sensing model of nodes for a square sensing field, with consideration of modulation effects. Energy minimisation can be achieved by changing the modulation scheme (BPSK, QPSK, 16-QAM, 64-QAM, etc.), so we consider the effect of different modulation techniques on cluster formation. The nodes in the sensing field are randomly and uniformly deployed. It is also observed that placing the base station at the centre of the scenario enables only a small number of modulation schemes to work in an energy-efficient manner, whereas placing it at the corner of the sensing field enables a larger number of modulation schemes to do so.
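A toy sketch of the kind of trade-off the paper analyses, using a Heinzelman-style first-order radio model with a crude, invented modulation penalty; the constants and the penalty term are illustrative assumptions, not the authors' expressions:

```python
import math

E_ELEC = 50e-9      # J/bit, electronics energy (typical textbook value)
EPS_AMP = 100e-12   # J/bit/m^2, amplifier energy (typical textbook value)

def tx_energy(bits: int, d: float, m_ary: int) -> float:
    """Transmit energy for one packet over distance d with M-ary modulation."""
    b = math.log2(m_ary)                   # bits per symbol
    # Crude stand-in for the SNR cost of denser constellations: higher-order
    # modulation sends fewer symbols but each symbol needs more energy.
    per_symbol_penalty = (2 ** b - 1) / b
    return bits * (E_ELEC + EPS_AMP * d ** 2) * per_symbol_penalty

for m in (2, 4, 16, 64):                   # BPSK, QPSK, 16-QAM, 64-QAM
    print(m, tx_energy(1024, d=80.0, m_ary=m))
```

In the paper's setting, an expression like this feeds the cluster-size optimisation: the optimal number of cluster heads shifts as the modulation-dependent energy per bit changes.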
Considerations for observational research using large data sets in radiation oncology.
Jagsi, Reshma; Bekelman, Justin E; Chen, Aileen; Chen, Ronald C; Hoffman, Karen; Shih, Ya-Chen Tina; Smith, Benjamin D; Yu, James B
2014-09-01
The radiation oncology community has witnessed growing interest in observational research conducted using large-scale data sources such as registries and claims-based data sets. With the growing emphasis on observational analyses in health care, the radiation oncology community must possess a sophisticated understanding of the methodological considerations of such studies in order to evaluate evidence appropriately to guide practice and policy. Because observational research has unique features that distinguish it from clinical trials and other forms of traditional radiation oncology research, the International Journal of Radiation Oncology, Biology, Physics assembled a panel of experts in health services research to provide a concise and well-referenced review, intended to be informative for the lay reader, as well as for scholars who wish to embark on such research without prior experience. This review begins by discussing the types of research questions relevant to radiation oncology that large-scale databases may help illuminate. It then describes major potential data sources for such endeavors, including information regarding access and insights regarding the strengths and limitations of each. Finally, it provides guidance regarding the analytical challenges that observational studies must confront, along with discussion of the techniques that have been developed to help minimize the impact of certain common analytical issues in observational analysis. Features characterizing a well-designed observational study include clearly defined research questions, careful selection of an appropriate data source, consultation with investigators with relevant methodological expertise, inclusion of sensitivity analyses, caution not to overinterpret small but significant differences, and recognition of limitations when trying to evaluate causality. This review concludes that carefully designed and executed studies using observational data that possess these qualities hold substantial promise for advancing our understanding of many unanswered questions of importance to the field of radiation oncology. Copyright © 2014 Elsevier Inc. All rights reserved.
Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.
Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y
2017-01-01
Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in the glycoprotein analysis. To overcome the challenges associated with glycoprotein analysis, many analytical techniques were developed in recent years. Enrichment methods were used to improve the sensitivity of detection, while HPLC and mass spectrometry methods were developed to facilitate the separation of glycopeptides/proteins and enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis were discussed. Multiple analytical techniques are compared, and advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.
Understanding Education Involving Geovisual Analytics
ERIC Educational Resources Information Center
Stenliden, Linnea
2013-01-01
Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…
Technology Enhanced Analytics (TEA) in Higher Education
ERIC Educational Resources Information Center
Daniel, Ben Kei; Butson, Russell
2013-01-01
This paper examines the role of Big Data Analytics in addressing contemporary challenges associated with current changes in institutions of higher education. The paper first explores the potential of Big Data Analytics to support instructors, students and policy analysts to make better evidence based decisions. Secondly, the paper presents an…
Be the Data: Embodied Visual Analytics
ERIC Educational Resources Information Center
Chen, Xin; Self, Jessica Zeitz; House, Leanna; Wenskovitch, John; Sun, Maoyuan; Wycoff, Nathan; Evia, Jane Robertson; Leman, Scotland; North, Chris
2018-01-01
With the rise of big data, it is becoming increasingly important to educate groups of students at many educational levels about data analytics. In particular, students without a strong mathematical background may have an unenthusiastic attitude towards high-dimensional data and find it challenging to understand relevant complex analytical methods,…
Divulging Personal Information within Learning Analytics Systems
ERIC Educational Resources Information Center
Ifenthaler, Dirk; Schumacher, Clara
2015-01-01
The purpose of this study was to investigate if students are prepared to release any personal data in order to inform learning analytics systems. Besides the well-documented benefits of learning analytics, serious concerns and challenges are associated with the application of these data driven systems. Most notably, empirical evidence regarding…
TOPICAL REVIEW: Biological and chemical sensors for cancer diagnosis
NASA Astrophysics Data System (ADS)
Simon, Elfriede
2010-11-01
The great challenge for sensor systems to be accepted as a relevant diagnostic and therapeutic tool for cancer detection is the ability to determine the presence of relevant biomarkers or biomarker patterns comparably to or even better than the traditional analytical systems. Biosensor and chemical sensor technologies are already used for several clinical applications such as blood glucose or blood gas measurements. However, up to now not many sensors have been developed for cancer-related tests, because only a few of the biomarkers have shown clinical relevance and the performance of the sensor systems is not always satisfactory. New genomic and proteomic tools are used to detect new molecular signatures and identify which combinations of biomarkers may best detect the presence or risk of cancer or monitor cancer therapies. These molecular signatures include genetic and epigenetic signatures, changes in gene expression, protein biomarker profiles and other metabolite profile changes. They provide new opportunities for using different sensor technologies for cancer detection, especially when complex biomarker patterns have to be analyzed. To address the requirements of this complex analysis, there have been recent efforts to develop sensor arrays and new solutions (e.g. lab-on-a-chip) in which sampling, preparation, high-throughput analysis and reporting are integrated. The ability to parallelize, miniaturize and automate is the focus of new developments and will be supported by nanotechnology approaches. This review recaps some scientific considerations about cancer diagnosis and cancer-related biomarkers, relevant biosensor and chemical sensor technologies, their application as cancer sensors and considerations about future challenges.
The legal and ethical concerns that arise from using complex predictive analytics in health care.
Cohen, I Glenn; Amarasingham, Ruben; Shah, Anand; Xie, Bin; Lo, Bernard
2014-07-01
Predictive analytics, or the use of electronic algorithms to forecast future events in real time, makes it possible to harness the power of big data to improve the health of patients and lower the cost of health care. However, this opportunity raises policy, ethical, and legal challenges. In this article we analyze the major challenges to implementing predictive analytics in health care settings and make broad recommendations for overcoming challenges raised in the four phases of the life cycle of a predictive analytics model: acquiring data to build the model, building and validating it, testing it in real-world settings, and disseminating and using it more broadly. For instance, we recommend that model developers implement governance structures that include patients and other stakeholders starting in the earliest phases of development. In addition, developers should be allowed to use already collected patient data without explicit consent, provided that they comply with federal regulations regarding research on human subjects and the privacy of health information. Project HOPE—The People-to-People Health Foundation, Inc.
Weng, Naidong; Needham, Shane; Lee, Mike
2015-01-01
The 17th Annual Symposium on Clinical and Pharmaceutical Solutions through Analysis (CPSA), held 29 September-2 October 2014 at the Sheraton Bucks County Hotel, Langhorne, PA, USA, brought together the various analytical fields defining the challenges of the modern analytical laboratory. Ongoing discussions focused on the future application of bioanalysis and other disciplines to support investigational new drug (IND) and new drug application (NDA) submissions, clinical diagnostics and pathology laboratory personnel that support patient sample analysis, and the clinical researchers that provide insights into new biomarkers within the context of the modern laboratory and personalized medicine.
Big Data Analytics in Chemical Engineering.
Chiang, Leo; Lu, Bo; Castillo, Ivan
2017-06-07
Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.
Analytical challenges: bridging the gap from regulation to enforcement.
Van den Eede, Guy; Kay, Simon; Anklam, Elke; Schimmel, Heinz
2002-01-01
An overview is presented of the analytical steps that may be needed to determine the presence of genetically modified organisms (GMOs) or for analysis of GMO-derived produce. The analytical aspects necessary for compliance with labeling regulations are discussed along with bottlenecks that may develop when a plant product or a food sample is analyzed for conformity with current European Union GMO legislation. In addition to sampling and testing, other topics deal with complications that arise from biological and agricultural realities that may influence testing capabilities. The issues presented are intended to serve as elements to examine the different challenges that enforcement laboratories might face.
Geochemical and analytical implications of extensive sulfur retention in ash from Indonesian peats
Kane, Jean S.; Neuzil, Sandra G.
1993-01-01
Sulfur is an analyte of considerable importance to the complete major element analysis of ash from low-sulfur, low-ash Indonesian peats. Most analytical schemes for major element peat- and coal-ash analyses, including the inductively coupled plasma atomic emission spectrometry method used in this work, do not permit measurement of sulfur in the ash. As a result, oxide totals cannot be used as a check on accuracy of analysis. Alternative quality control checks verify the accuracy of the cation analyses. Cation and sulfur correlations with percent ash yield suggest that silicon and titanium, and to a lesser extent, aluminum, generally originate as minerals, whereas magnesium and sulfur generally originate from organic matter. Cation correlations with oxide totals indicate that, for these Indonesian peats, magnesium dominates sulfur fixation during ashing because it is considerably more abundant in the ash than calcium, the next most important cation in sulfur fixation.
Considerations in the statistical analysis of clinical trials in periodontitis.
Imrey, P B
1986-05-01
Adult periodontitis has been described as a chronic infectious process exhibiting sporadic, acute exacerbations which cause quantal, localized losses of dental attachment. Many analytic problems of periodontal trials are similar to those of other chronic diseases. However, the episodic, localized, infrequent, and relatively unpredictable behavior of exacerbations, coupled with measurement error difficulties, causes some specific problems. Considerable controversy exists as to the proper selection and treatment of multiple site data from the same patient for group comparisons for epidemiologic or therapeutic evaluative purposes. This paper comments, with varying degrees of emphasis, on several issues pertinent to the analysis of periodontal trials. Considerable attention is given to the ways in which measurement variability may distort analytic results. Statistical treatments of multiple site data for descriptive summaries are distinguished from treatments for formal statistical inference to validate therapeutic effects. Evidence suggesting that sites behave independently is contested. For inferential analyses directed at therapeutic or preventive effects, analytic models based on site independence are deemed unsatisfactory. Methods of summarization that may yield more powerful analyses than all-site mean scores, while retaining appropriate treatment of inter-site associations, are suggested. Brief comments and opinions on an assortment of other issues in clinical trial analysis are proffered.
ERIC Educational Resources Information Center
Gaševic, Dragan; Jovanovic, Jelena; Pardo, Abelardo; Dawson, Shane
2017-01-01
The use of analytic methods for extracting learning strategies from trace data has attracted considerable attention in the literature. However, there is a paucity of research examining any association between learning strategies extracted from trace data and responses to well-established self-report instruments and performance scores. This paper…
Geometrical enhancement of the electric field: Application of fractional calculus in nanoplasmonics
NASA Astrophysics Data System (ADS)
Baskin, E.; Iomin, A.
2011-12-01
We developed an analytical approach for wave propagation in metal-dielectric nanostructures in the quasi-static limit. This consideration establishes a link between the fractional geometry of the nanostructure and fractional integro-differentiation. The method is based on fractional calculus and permits us to obtain analytical expressions for the electric-field enhancement.
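The fractional integro-differentiation invoked here is conventionally the Riemann-Liouville operator; the standard definition is recalled below (the paper may use a variant):

```latex
\left(I^{\alpha} f\right)(x) \;=\; \frac{1}{\Gamma(\alpha)} \int_{0}^{x} (x-t)^{\alpha-1} f(t)\,\mathrm{d}t ,
\qquad \alpha > 0 ,
```

with the fractional derivative of order α obtained by ordinary differentiation of \(I^{\,n-\alpha}f\), where \(n=\lceil\alpha\rceil\).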
Code of Federal Regulations, 2013 CFR
2013-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
Code of Federal Regulations, 2012 CFR
2012-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
Code of Federal Regulations, 2014 CFR
2014-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
ERIC Educational Resources Information Center
Tlili, Ahmed; Essalmi, Fathi; Jemni, Mohamed; Kinshuk; Chen, Nian-Shing
2018-01-01
With the rapid growth of online education in recent years, Learning Analytics (LA) has gained increasing attention from researchers and educational institutions as an area which can improve the overall effectiveness of learning experiences. However, the lack of guidelines on what should be taken into consideration during application of LA hinders…
Using Learning Analytics to Predict (and Improve) Student Success: A Faculty Perspective
ERIC Educational Resources Information Center
Dietz-Uhler, Beth; Hurn, Janet E.
2013-01-01
Learning analytics is receiving increased attention, in part because it offers to assist educational institutions in increasing student retention, improving student success, and easing the burden of accountability. Although these large-scale issues are worthy of consideration, faculty might also be interested in how they can use learning analytics…
Gonzalez-Dominguez, Alvaro; Duran-Guerrero, Enrique; Fernandez-Recamales, Angeles; Lechuga-Sancho, Alfonso Maria; Sayago, Ana; Schwarz, Monica; Segundo, Carmen; Gonzalez-Dominguez, Raul
2017-01-01
The analytical bias introduced by most of the commonly used techniques in metabolomics considerably hinders the simultaneous detection of all metabolites present in complex biological samples. To overcome this limitation, the combination of complementary approaches has emerged in recent years as the most suitable strategy for maximizing metabolite coverage. This review article presents a general overview of the most important analytical techniques usually employed in metabolomics: nuclear magnetic resonance, mass spectrometry and hybrid approaches. Furthermore, we emphasize the potential of integrating various tools in the form of metabolomic multi-platforms in order to achieve a deeper metabolome characterization, for which a revision of the existing literature in this field is provided. This review is not intended to be exhaustive but, rather, to give readers not familiar with analytical chemistry a practical and concise guide to the considerations that govern the proper selection of the technique to be used in a metabolomic experiment in biomedical research. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.
Thermal management of liquid direct cooled split disk laser
NASA Astrophysics Data System (ADS)
Yang, Huomu; Feng, Guoying; Zhou, Shouhuan
2015-02-01
The thermal effects of a liquid direct cooled split disk laser are modeled and analytically solved. Analytical solutions that take into account the longitudinal temperature rise of the cooling liquid are given to describe the temperature distribution in the split disk and the cooling liquid, based on hydrodynamics and heat transfer. The influence of the cooling liquid, its flow velocity, the thickness of the cooling channel and the thickness of the disk gain medium can also be obtained from the analytical solutions.
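The longitudinal coolant temperature rise these solutions account for follows, to first order, from a steady-state energy balance; the expression below is a sketch of the underlying physics, not the paper's full model:

```latex
T_{c}(z) \;=\; T_{\mathrm{in}} \;+\; \frac{1}{\dot{m}\,c_{p}} \int_{0}^{z} q'(z')\,\mathrm{d}z' ,
```

where \(\dot{m}\) is the coolant mass flow rate, \(c_{p}\) its specific heat, and \(q'(z)\) the heat load per unit length deposited by the pumped disks.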
The Human is the Loop: New Directions for Visual Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; Hossain, Shahriar H.; Ramakrishnan, Naren
2014-01-28
Visual analytics is the science of marrying interactive visualizations and analytic algorithms to support exploratory knowledge discovery in large datasets. We argue for a shift from a ‘human in the loop’ philosophy for visual analytics to a ‘human is the loop’ viewpoint, where the focus is on recognizing analysts’ work processes, and seamlessly fitting analytics into that existing interactive process. We survey a range of projects that provide visual analytic support contextually in the sensemaking loop, and outline a research agenda along with future challenges.
Structural Benchmark Creep Testing for the Advanced Stirling Convertor Heater Head
NASA Technical Reports Server (NTRS)
Krause, David L.; Kalluri, Sreeramesh; Bowman, Randy R.; Shah, Ashwin R.
2008-01-01
The National Aeronautics and Space Administration (NASA) has identified the high efficiency Advanced Stirling Radioisotope Generator (ASRG) as a candidate power source for use on long duration Science missions such as lunar applications, Mars rovers, and deep space missions. For the inherent long life times required, a structurally significant design limit for the heater head component of the ASRG Advanced Stirling Convertor (ASC) is creep deformation induced at low stress levels and high temperatures. Demonstrating proof of adequate margins on creep deformation and rupture for the operating conditions and the MarM-247 material of construction is a challenge that the NASA Glenn Research Center is addressing. The combined analytical and experimental program ensures integrity and high reliability of the heater head for its 17-year design life. The life assessment approach starts with an extensive series of uniaxial creep tests on thin MarM-247 specimens that comprise the same chemistry, microstructure, and heat treatment processing as the heater head itself. This effort addresses a scarcity of openly available creep properties for the material as well as the virtual absence of understanding of the effect on creep properties due to very thin walls, fine grains, low stress levels, and high-temperature fabrication steps. The approach continues with a considerable analytical effort, both deterministically to evaluate the median creep life using nonlinear finite element analysis, and probabilistically to calculate the heater head's reliability to a higher degree. Finally, the approach includes a substantial structural benchmark creep testing activity to calibrate and validate the analytical work. This last element provides high fidelity testing of prototypical heater head test articles; the testing includes the relevant material issues and the essential multiaxial stress state, and applies prototypical and accelerated temperature profiles for timely results in a highly controlled laboratory environment. This paper focuses on the last element and presents a preliminary methodology for creep rate prediction, the experimental methods, test challenges, and results from benchmark testing of a trial MarM-247 heater head test article. The results compare favorably with the analytical strain predictions. A description of other test findings is provided, and recommendations for future test procedures are suggested. The manuscript concludes by describing the potential impact of the heater head creep life assessment and benchmark testing effort on the ASC program.
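The accelerated-temperature testing strategy mentioned above rests on time-temperature equivalence; a common creep-life correlation of that kind is the Larson-Miller parameter, sketched below purely for illustration (the NASA effort develops its own creep-rate methodology, which is not reproduced here):

```python
import math

def larson_miller(temp_k: float, time_to_rupture_h: float, c: float = 20.0) -> float:
    """Larson-Miller parameter LMP = T * (C + log10(t_r)).

    Equal LMP values are taken to indicate equivalent creep damage,
    so a short hot test can stand in for a long design life.
    C ~ 20 is a common default; real programs fit it to data.
    """
    return temp_k * (c + math.log10(time_to_rupture_h))

# Compare a notional 17-year design life with an accelerated 1000 h test
design = larson_miller(1120.0, 17 * 365 * 24)   # temperatures are illustrative
accelerated = larson_miller(1220.0, 1000.0)
print(design, accelerated)
```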
Experimental identification and analytical modelling of human walking forces: Literature review
NASA Astrophysics Data System (ADS)
Racic, V.; Pavic, A.; Brownjohn, J. M. W.
2009-09-01
Dynamic forces induced by humans walking change simultaneously in time and space, being random in nature and varying considerably not only between different people but also for a single individual who cannot repeat two identical steps. Since these important aspects of walking forces have not been adequately researched in the past, the corresponding lack of knowledge has reflected badly on the quality of their mathematical models used in vibration assessments of pedestrian structures such as footbridges, staircases and floors. To develop better force models which can be used with more confidence in structural design, an adequate experimental and analytical approach must be taken to account for their complexity. This paper is the most comprehensive review published to date, covering 270 references dealing with different experimental and analytical characterizations of human walking loading. The source of dynamic human-induced forces is in fact the body motion. To date, human motion has attracted a lot of interest in many scientific branches, particularly in medical and sports science, bioengineering, robotics, and space flight programs. Other fields involved include biology, physiology, anthropology, computer science (graphics and animation), and human factors and ergonomics. This has resulted in technologically advanced tools that can help in understanding human movement in more detail. Therefore, in addition to traditional direct force measurements utilizing a force plate and an instrumented treadmill, this review also introduces methods for indirect measurement of time-varying records of walking forces via a combination of visual motion tracking (imaging) data and known body mass distribution. The review is therefore an interdisciplinary article that bridges the gaps between the biomechanics of human gait and civil engineering dynamics. Finally, the key reason for undertaking this review is the fact that human-structure dynamic interaction and pedestrian synchronization when walking on more or less perceptibly moving structures are increasingly giving serious cause for concern in vibration serviceability design. There is considerable uncertainty about how excessive structural vibrations modify walking and hence affect pedestrian-induced forces, significantly in many cases. Modelling of this delicate mechanism is one of the challenges that the international civil structural engineering community faces nowadays, and this review thus provides a step toward a better understanding of the problem.
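The deterministic force models the review evaluates are classically written as a Fourier series of the pacing rate; the standard form is recalled below (the review's point is precisely that real forces deviate from it):

```latex
F_{v}(t) \;=\; G\left[\,1 + \sum_{i=1}^{n} \alpha_{i}\,\sin\!\left(2\pi i f_{p} t + \varphi_{i}\right)\right],
```

where \(G\) is the pedestrian's weight, \(f_{p}\) the pacing frequency, \(\alpha_{i}\) the dynamic load factors and \(\varphi_{i}\) the phase lags of the harmonics.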
Challenges in Modern Anti-Doping Analytical Science.
Ayotte, Christiane; Miller, John; Thevis, Mario
2017-01-01
The challenges facing modern anti-doping analytical science are increasingly complex given the expansion of target drug substances, as the pharmaceutical industry introduces more novel therapeutic compounds and the internet offers designer drugs to improve performance. The technical challenges are manifold, including, for example, the need for advanced instrumentation for greater speed of analyses and increased sensitivity, specific techniques capable of distinguishing between endogenous and exogenous metabolites, or biological assays for the detection of peptide hormones or their markers, all of which require an important investment from the laboratories and recruitment of highly specialized scientific personnel. The consequences of introducing sophisticated and complex analytical procedures may result in the future in a change in the strategy applied by the World Anti-Doping Agency in relation to the introduction and performance of new techniques by the network of accredited anti-doping laboratories. © 2017 S. Karger AG, Basel.
Challenges and perspectives in quantitative NMR.
Giraudeau, Patrick
2017-01-01
This perspective article summarizes, from the author's point of view at the beginning of 2016, the major challenges and perspectives in the field of quantitative NMR. The key concepts in quantitative NMR are first summarized; then, the most recent evolutions in terms of resolution and sensitivity are discussed, as well as some potential future research directions in this field. A particular focus is made on methodologies capable of boosting the resolution and sensitivity of quantitative NMR, which could open application perspectives in fields where the sample complexity and the analyte concentrations are particularly challenging. These include multi-dimensional quantitative NMR and hyperpolarization techniques such as para-hydrogen-induced polarization or dynamic nuclear polarization. Because quantitative NMR cannot be dissociated from the key concepts of analytical chemistry, i.e. trueness and precision, the methodological developments are systematically described together with their level of analytical performance. Copyright © 2016 John Wiley & Sons, Ltd.
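The quantitative relation underlying all of these methods is the standard qNMR proportionality (not specific to this article):

```latex
\frac{c_{x}}{c_{\mathrm{ref}}} \;=\; \frac{I_{x}/N_{x}}{I_{\mathrm{ref}}/N_{\mathrm{ref}}} ,
```

where \(I\) is the integrated peak area and \(N\) the number of equivalent nuclei producing the signal. Trueness therefore hinges on unbiased integrals, which is what the resolution- and sensitivity-enhancing methods surveyed above aim to secure.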
Setting Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption
ERIC Educational Resources Information Center
Ferguson, Rebecca; Macfadyen, Leah P.; Clow, Doug; Tynan, Belinda; Alexander, Shirley; Dawson, Shane
2014-01-01
A core goal for most learning analytic projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires explicit and…
ERIC Educational Resources Information Center
Follette, William C.; Bonow, Jordan T.
2009-01-01
Whether explicitly acknowledged or not, behavior-analytic principles are at the heart of most, if not all, empirically supported therapies. However, the change process in psychotherapy is only now being rigorously studied. Functional analytic psychotherapy (FAP; Kohlenberg & Tsai, 1991; Tsai et al., 2009) explicitly identifies behavioral-change…
Big data analytics as a service infrastructure: challenges, desired properties and solutions
NASA Astrophysics Data System (ADS)
Martín-Márquez, Manuel
2015-12-01
CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has minimised the necessary collaboration and, more relevantly, the cross-domain data analytics. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient daily accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs for CERN's research and engineering community; (3) deliver real-time and batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main technological, collaborative and educational challenges; and potential solutions.
Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew
2015-01-01
Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data are essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework techniques are proposed by leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. And service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.
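A minimal MapReduce-style aggregation in plain Python, illustrating the parallel pattern the framework above applies to HBase-resident geoscience data (hypothetical field names; not the authors' actual implementation):

```python
# Map: emit (grid cell, value); shuffle: group by cell; reduce: aggregate.
from collections import defaultdict

records = [
    {"cell": (10, 42), "temp": 280.1},
    {"cell": (10, 42), "temp": 281.3},
    {"cell": (11, 42), "temp": 279.8},
]

def mapper(rec):
    yield rec["cell"], rec["temp"]        # key-value pairs per record

shuffled = defaultdict(list)              # shuffle phase: group values by key
for rec in records:
    for key, value in mapper(rec):
        shuffled[key].append(value)

def reducer(key, values):                 # reduce phase: mean per grid cell
    return key, sum(values) / len(values)

print([reducer(k, v) for k, v in shuffled.items()])
```

In the actual framework, the map and reduce phases run in parallel across the distributed cluster, which is where the reported speed-up comes from.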
Lismont, Jasmien; Janssens, Anne-Sophie; Odnoletkova, Irina; Vanden Broucke, Seppe; Caron, Filip; Vanthienen, Jan
2016-10-01
The aim of this study is to guide healthcare instances in applying process analytics on healthcare processes. Process analytics techniques can offer new insights into patient pathways, workflow processes, adherence to medical guidelines and compliance with clinical pathways, but also bring along specific challenges which will be examined and addressed in this paper. The following methodology is proposed: log preparation, log inspection, abstraction and selection, clustering, process mining, and validation. It was applied to a case study in the type 2 diabetes mellitus domain. Several data pre-processing steps are applied and clarify the usefulness of process analytics in a healthcare setting. Healthcare utilization, such as diabetes education, is analyzed and compared with diabetes guidelines. Furthermore, we take a look at the organizational perspective and the central role of the GP. This research addresses four challenges: healthcare processes are often patient and hospital specific, which leads to unique traces and unstructured processes; data is not recorded in the right format, with the right level of abstraction and time granularity; an overflow of medical activities may cloud the analysis; and analysts need to deal with data not recorded for this purpose. These challenges complicate the application of process analytics. It is explained how our methodology takes them into account. Process analytics offers new insights into the medical services patients follow, how medical resources relate to each other and whether patients and healthcare processes comply with guidelines and regulations. Copyright © 2016 Elsevier Ltd. All rights reserved.
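A compressed sketch of the proposed pipeline using the open-source pm4py library; the authors do not name their tooling, so the library choice and the event-log file are assumptions:

```python
# Log preparation -> inspection -> process mining -> visual validation.
import pm4py

log = pm4py.read_xes("visits.xes")            # hypothetical patient event log
print(pm4py.get_start_activities(log))        # log inspection: entry points

# Process mining: discover a Petri net model of the care process
net, initial, final = pm4py.discover_petri_net_inductive(log)
pm4py.view_petri_net(net, initial, final)     # validation against guidelines
```

The abstraction, selection and clustering steps in between would filter and group the raw events (e.g. collapsing repeated diabetes-education contacts) before discovery, which is where most of the paper's four challenges bite.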
NASA Astrophysics Data System (ADS)
Yager, Kevin; Albert, Thomas; Brower, Bernard V.; Pellechia, Matthew F.
2015-06-01
The domain of Geospatial Intelligence Analysis is rapidly shifting toward a new paradigm of Activity Based Intelligence (ABI) and information-based Tipping and Cueing. General requirements for an advanced ABIAA system present significant challenges in architectural design, computing resources, data volumes, workflow efficiency, data mining and analysis algorithms, and database structures. These sophisticated ABI software systems must include advanced algorithms that automatically flag activities of interest in less time and within larger data volumes than can be processed by human analysts. In doing this, they must also maintain the geospatial accuracy necessary for cross-correlation of multi-intelligence data sources. Historically, serial architectural workflows have been employed in ABIAA system design for tasking, collection, processing, exploitation, and dissemination. These simpler architectures may produce implementations that solve short term requirements; however, they have serious limitations that preclude them from being used effectively in an automated ABIAA system with multiple data sources. This paper discusses modern ABIAA architectural considerations providing an overview of an advanced ABIAA system and comparisons to legacy systems. It concludes with a recommended strategy and incremental approach to the research, development, and construction of a fully automated ABIAA system.
Play-Personas: Behaviours and Belief Systems in User-Centred Game Design
NASA Astrophysics Data System (ADS)
Canossa, Alessandro; Drachen, Anders
Game designers attempt to ignite affective, emotional responses from players by engineering game designs to incite definite user experiences. Theories of emotion state that definite emotional responses are individual, and caused by the individual interaction sequence or history. Engendering desired emotions in the audience of traditional audiovisual media is a considerable challenge; however, it is potentially even more difficult to achieve the same goal for the audience of interactive entertainment, because a substantial degree of control rests in the hands of the end user rather than the designer. This paper presents a possible solution to the challenge of integrating the user in the design of interactive entertainment such as computer games by employing the "persona" framework introduced by Alan Cooper. This approach is already in use in interaction design. The method can be improved by complementing the traditional narrative description of personas with quantitative, data-oriented models of predicted patterns of user behaviour for a specific computer game. Additionally, persona constructs can be applied both as design-oriented metaphors during the development of games, and as analytical lenses on existing games, e.g. for the evaluation of patterns of player behaviour.
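One way to build the quantitative, data-oriented persona models described above is to cluster player telemetry; a minimal sketch with hypothetical metrics (Canossa and Drachen describe the concept, not this code):

```python
# Cluster players on behavioural metrics; each cluster = candidate persona.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# columns: deaths per hour, secrets found, average session minutes
players = np.array([
    [2.0, 9, 95], [1.5, 8, 110],   # explorer-like behaviour
    [9.0, 1, 25], [8.5, 2, 30],    # rusher-like behaviour
])
X = StandardScaler().fit_transform(players)   # put metrics on a common scale
personas = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(personas)   # each label is one data-driven play-persona
```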
Integrated Aeroservoelastic Optimization: Status and Direction
NASA Technical Reports Server (NTRS)
Livne, Eli
1999-01-01
The interactions of lightweight flexible airframe structures, steady and unsteady aerodynamics, and wide-bandwidth active controls on modern airplanes lead to considerable multidisciplinary design challenges. More than 25 years of mathematical and numerical methods development, numerous basic research studies, simulations and wind-tunnel tests of simple models, wind-tunnel tests of complex models of real airplanes, as well as flight tests of actively controlled airplanes, have all contributed to the accumulation of a substantial body of knowledge in the area of aeroservoelasticity. A number of analysis codes, with the capabilities to model real airplane systems under the assumptions of linearity, have been developed. Many tests have been conducted, and results were correlated with analytical predictions. A selective sample of references covering aeroservoelastic testing programs from the 1960s to the early 1980s, as well as more recent wind-tunnel test programs of real or realistic configurations, is included in the References section of this paper. An examination of references 20-29 will reveal that in the course of development (or later modification) of almost every modern airplane with a high-authority active control system, there arose a need to face aeroservoelastic problems and aeroservoelastic design challenges.
Baxendale, Ian R; Braatz, Richard D; Hodnett, Benjamin K; Jensen, Klavs F; Johnson, Martin D; Sharratt, Paul; Sherlock, Jon-Paul; Florence, Alastair J
2015-03-01
This whitepaper highlights current challenges and opportunities associated with continuous synthesis, workup, and crystallization of active pharmaceutical ingredients (drug substances). We describe the technologies and requirements at each stage and emphasize the different considerations for developing continuous processes compared with batch. In addition to the specific sequence of operations required to deliver the necessary chemical and physical transformations for continuous drug substance manufacture, consideration is also given to how adoption of continuous technologies may impact different manufacturing stages in development from discovery, process development, through scale-up and into full scale production. The impact of continuous manufacture on drug substance quality and the associated challenges for control and for process safety are also emphasized. In addition to the technology and operational considerations necessary for the adoption of continuous manufacturing (CM), this whitepaper also addresses the cultural, as well as skills and training, challenges that will need to be met by support from organizations in order to accommodate the new work flows. Specific action items for industry leaders are:
- Develop flow chemistry toolboxes, exploiting the advantages of flow processing and including highly selective chemistries that allow use of simple and effective continuous workup technologies.
- Make modular or plug-and-play type equipment available, especially for workup, to assist in straightforward deployment in the laboratory; as with learning from other industries, standardization is highly desirable and will require cooperation across industry and academia to develop and implement.
- Implement and exploit process analytical technologies (PAT) for real-time dynamic control of continuous processes.
- Develop modeling and simulation techniques to support continuous process development and control; progress is required in multiphase systems such as crystallization.
- Involve all parts of the organization, from discovery, research and development, and manufacturing, in the implementation of CM.
- Engage with academia to develop the training provision to support the skills base for CM, particularly in flow chemistry, physical chemistry, and chemical engineering skills at the chemistry-process interface.
- Promote and encourage publication and dissemination of examples of CM across the sector to demonstrate capability, engage with regulatory comment, establish benchmarks for performance and highlight challenges.
- Develop the economic case for CM of drug substance; this will involve various stakeholders at project and business level, but establishing the critical economic drivers is essential to driving the transformation in manufacturing.
© 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
Patel, Chirag J
2017-01-01
Mixtures, or combinations and interactions between multiple environmental exposures, are hypothesized to be causally linked with disease and health-related phenotypes. Established and emerging molecular measurement technologies to assay the exposome, the comprehensive battery of exposures encountered from birth to death, promise a new way of identifying mixtures in disease in the epidemiological setting. In this opinion, we describe the analytic complexity and challenges in identifying mixtures associated with phenotype and disease. Existing and emerging machine-learning methods and data analytic approaches (e.g., "environment-wide association studies" [EWASs]), as well as large cohorts, may enhance possibilities to identify mixtures of correlated exposures associated with phenotypes; however, the analytic complexity of identifying mixtures is immense. If the exposome concept is realized, new analytical methods and large sample sizes will be required to ascertain how mixtures are associated with disease. The author recommends documenting prevalent correlated exposures and replicated main effects prior to identifying mixtures.
In-vitro nanodiagnostic platform through nanoparticles and DNA-RNA nanotechnology.
Chan, Ki; Ng, Tzi Bun
2015-04-01
Nanocomposites containing nanoparticles or nanostructured domains exhibit an even higher degree of material complexity that leads to an extremely high variability of nanostructured materials. This review introduces analytical concepts and techniques for nanomaterials, derives recommendations for a qualified selection of characterization techniques for specific types of samples, and focuses on the characterization of nanoparticles and their agglomerates or aggregates. In addition, DNA nanotechnology and the more recent newcomer RNA nanotechnology have achieved an advanced status among nanotechnology researchers; therefore, the core features, potential, and significant challenges of DNA nanotechnology are also highlighted as a new discipline. Moreover, nanobiochips made of nanomaterials are rapidly emerging as a new paradigm in the area of large-scale biochemical analysis. The use of nanoscale components enables higher precision in diagnostics while considerably reducing the cost of the platform, which leads this review to explore the use of nanoparticles, nanomaterials, and other bionanotechnologies for application to in-vitro nanodiagnostics.
Multiaxial and thermomechanical fatigue considerations in damage tolerant design
NASA Technical Reports Server (NTRS)
Leese, G. E.; Bill, R. C.
1985-01-01
In considering damage tolerant design concepts for gas turbine hot section components, several challenging concerns arise: complex multiaxial loading situations are encountered; thermomechanical fatigue loading involving very wide temperature ranges is imposed on components; some hot section materials are extremely anisotropic; and coatings and environmental interactions play an important role in crack propagation. The effects of multiaxiality and thermomechanical fatigue are considered from the standpoint of their impact on damage tolerant design concepts. Recently obtained research results as well as results from the open literature are examined and their implications for damage tolerant design are discussed. Three important needs required to advance analytical capabilities in support of damage tolerant design become readily apparent: (1) a theoretical basis to account for the effect of nonproportional loading (mechanical and mechanical/thermal); (2) the development of practical crack growth parameters that are applicable to thermomechanical fatigue situations; and (3) the development of crack growth models that address multiple crack failures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chassin, David P.; Posse, Christian; Malard, Joel M.
2004-08-01
Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today's most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically-based control of large-scale systems is founded on the conjecture that price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This paper explores the state of the art in the use of physical analogs for understanding the behavior of some econophysical systems and for deriving stable and robust control strategies for them. In particular, we review and discuss applications of some analytic methods based on the thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot be otherwise understood.
Oxide nanomaterials: synthetic developments, mechanistic studies, and technological innovations.
Patzke, Greta R; Zhou, Ying; Kontic, Roman; Conrad, Franziska
2011-01-24
Oxide nanomaterials are indispensable for nanotechnological innovations, because they combine an infinite variety of structural motifs and properties with manifold morphological features. Given that new oxide materials are reported almost on a daily basis, considerable synthetic and technological work remains to be done to fully exploit this ever increasing family of compounds for innovative nano-applications. This calls for reliable and scalable preparative approaches to oxide nanomaterials, and their development remains a challenge for many complex nanostructured oxides. Oxide nanomaterials with special physicochemical features and unusual morphologies are still difficult to access by classic synthetic pathways. The limitless options for creating nano-oxide building blocks open up new technological perspectives with the potential to revolutionize areas ranging from data processing to biocatalysis. Oxide nanotechnology of the 21st century thus needs a strong interplay of preparative creativity, analytical skills, and new ideas for synergistic implementations. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Coming In: Queer Narratives of Sexual Self-Discovery.
Rosenberg, Shoshana
2017-10-09
Many models of queer sexuality continue to depict a linear narrative of sexual development, beginning in repression/concealment and eventuating in coming out. The present study sought to challenge this by engaging in a hermeneutically informed thematic analysis of interviews with eight queer people living in Western Australia. Four themes were identified: "searching for identity," "society, stigma, and self," "sexual self-discovery," and "coming in." Interviewees discussed internalized homophobia and its impact on their life; experiences and implications of finding a community and achieving a sense of belonging; the concept of sexual self-discovery being a lifelong process; and sexuality as fluid, dynamic, and situational rather than static. The article concludes by suggesting that the idea of "coming in," arriving at a place of acceptance of one's sexuality, regardless of its fluidity or how it is viewed by society, offers considerable analytic leverage for understanding the journeys of sexual self-discovery of queer-identified people.
The changing demographic, legal, and technological contexts of political representation.
Forest, Benjamin
2005-10-25
Three developments have created challenges for political representation in the U.S. and particularly for the use of territorially based representation (election by district). First, the demographic complexity of the U.S. population has grown both in absolute terms and in terms of residential patterns. Second, legal developments since the 1960s have recognized an increasing number of groups as eligible for voting rights protection. Third, the growing technical capacities of computer technology, particularly Geographic Information Systems, have allowed political parties and other organizations to create election districts with increasingly precise political and demographic characteristics. Scholars have made considerable progress in measuring and evaluating the racial and partisan biases of districting plans, and some states have tried to use Geographic Information Systems technology to produce more representative districts. However, case studies of Texas and Arizona illustrate that such analytic and technical advances have not overcome the basic contradictions that underlie the American system of territorial political representation.
Nursing education: contradictions and challenges of pedagogical practice.
Pinto, Joelma Batista Tebaldi; Pepe, Alda Muniz
2007-01-01
This study deals with the nursing curriculum, pedagogical practice and education. Nowadays, this theme has taken up considerable space in academic debates. Thus, this study aimed to gain empirical knowledge and provide an analytical description of the academic reality of nursing education in the undergraduate nursing course at Santa Cruz State University. This is a descriptive study, which may provide a new view of the problem, with careful observation, description, and exploration of aspects of the situation, interpreting the reality without interfering in it and, consequently, remaining open to new studies. Descriptive statistics with simple frequency and percentage calculations were applied. In summary, results indicate that professors and students have difficulties evaluating the curriculum. In addition, the curriculum under study is characterized as a collection curriculum, with a pedagogical practice predominantly directed at the traditional model. Hence, nursing education still shows features of the biomedical-technical model.
The agency problem and medical acting: an example of applying economic theory to medical ethics.
Langer, Andreas; Schröder-Bäck, Peter; Brink, Alexander; Eurich, Johannes
2009-03-01
In this article, the authors attempt to build a bridge between economic theory and medical ethics to offer a new perspective to tackle ethical challenges in the physician-patient encounter. They apply elements of new institutional economics to the ethically relevant dimensions of the physician-patient relationship in a descriptive heuristic sense. The principal-agent theory can be used to analytically grasp existing action problems in the physician-patient relationship and as a basis for shaping recommendations at the institutional level. Furthermore, the patients' increased self-determination and modern opportunities for the medical laity to inform themselves lead to a less asymmetrical distribution of information between physician and patient and therefore require new interaction models. Based on the analysis presented here, the authors recommend that, apart from the physician's necessary individual ethics, greater consideration should be given to approaches of institutional ethics and hence to incentive systems within medical ethics.
Shaw, Bronwen E; Hahn, Theresa; Martin, Paul J; Mitchell, Sandra A; Petersdorf, Effie W; Armstrong, Gregory T; Shelburne, Nonniekaye; Storer, Barry E; Bhatia, Smita
2017-01-01
The increasing numbers of hematopoietic cell transplantations (HCTs) performed each year, the changing demographics of HCT recipients, the introduction of new transplantation strategies, incremental improvement in survival, and the growing population of HCT survivors demand a comprehensive approach to examining the health and well-being of patients throughout life after HCT. This report summarizes strategies for the conduct of research on late effects after transplantation, including consideration of the study design and analytic approaches; methodologic challenges in handling complex phenotype data; an appreciation of the changing trends in the practice of transplantation; and the availability of biospecimens to support laboratory-based research. It is hoped that these concepts will promote continued research and facilitate the development of new approaches to address fundamental questions in transplantation outcomes. Copyright © 2017 The American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.
An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study
NASA Technical Reports Server (NTRS)
Ray, Paul S.
1996-01-01
The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970s and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2). Since the Challenger accident in 1986, the process has been expanded considerably and has resulted in the introduction of similar and/or duplicated activities in safety/reliability risk analysis. A study initiated in 1995 to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.
Ryan, Tony; Amen, Karwan M; McKeown, Jane
2017-10-01
There exists compelling evidence that advance care planning (ACP) remains a key factor in the delivery of appropriate end of life care and facilitates the timely transition to palliative care for people with dementia. Take up of ACP within the dementia population is low, especially when compared with other conditions. Quantitative research has helped in identifying some of the key factors in enabling or inhibiting the use of ACP within the dementia population. Qualitative research can, however, shed further light upon the experiences of all involved. We carried out a search of the qualitative literature addressing the ACP experiences of people with dementia, family caregivers and professionals. An approach to qualitative synthesis involving coding of original text, developing descriptive themes and generating analytical themes was utilized. We identified five papers and subsequently five analytical themes: breadth and scope of future planning; challenges to ACP; postponing ACP; confidence in systems; and making ACP happen for people with dementia. The synthesized findings shed light on the ongoing challenges of the use and further development of ACP in the population of people with dementia. In particular, attention is drawn to the difficulties in the timing of ACP and the preference for informal approaches to planning within the families of people affected by dementia. The ACP capacity of the workforce is also addressed. The paper reveals considerable complexity in undertaking ACP in a context of dementia. It is suggested that the preference for informal approaches and the timing of initial conversations be considered and that the skills of those involved in initiating discussions should be given primacy.
Big Data and Analytics in Higher Education: Opportunities and Challenges
ERIC Educational Resources Information Center
Daniel, Ben
2015-01-01
Institutions of higher education are operating in an increasingly complex and competitive environment. This paper identifies contemporary challenges facing institutions of higher education worldwide and explores the potential of Big Data in addressing these challenges. The paper then outlines a number of opportunities and challenges associated…
Elliott, D.G.; Applegate, L.J.; Murray, A.L.; Purcell, M.K.; McKibben, C.L.
2013-01-01
No gold standard assay exhibiting error-free classification of results has been identified for detection of Renibacterium salmoninarum, the causative agent of salmonid bacterial kidney disease. Validation of diagnostic assays for R. salmoninarum has been hindered by its unique characteristics and biology, and difficulties in locating suitable populations of reference test animals. Infection status of fish in test populations is often unknown, and it is commonly assumed that the assay yielding the most positive results has the highest diagnostic accuracy, without consideration of misclassification of results. In this research, quantification of R. salmoninarum in samples by bacteriological culture provided a standardized measure of viable bacteria to evaluate analytical performance characteristics (sensitivity, specificity and repeatability) of non-culture assays in three matrices (phosphate-buffered saline, ovarian fluid and kidney tissue). Non-culture assays included polyclonal enzyme-linked immunosorbent assay (ELISA), direct smear fluorescent antibody technique (FAT), membrane-filtration FAT, nested polymerase chain reaction (nested PCR) and three real-time quantitative PCR assays. Injection challenge of specific pathogen-free Chinook salmon, Oncorhynchus tshawytscha (Walbaum), with R. salmoninarum was used to estimate diagnostic sensitivity and specificity. Results did not identify a single assay demonstrating the highest analytical and diagnostic performance characteristics, but revealed strengths and weaknesses of each test.
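The analytical and diagnostic performance figures referred to above reduce to simple proportions once each assay result is scored against the reference standard (here, quantified bacteriological culture). Below is a minimal sketch with illustrative counts, not data from the study:

```python
# Minimal sketch: computing diagnostic sensitivity and specificity for an
# assay scored against a reference standard (here, bacteriological culture).
# The counts below are illustrative placeholders, not data from the study.

def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Return (sensitivity, specificity) from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # proportion of infected fish detected
    specificity = tn / (tn + fp)  # proportion of uninfected fish cleared
    return sensitivity, specificity

# Hypothetical counts for one non-culture assay (e.g., nested PCR):
sens, spec = sensitivity_specificity(tp=46, fn=4, tn=38, fp=2)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```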
Net, Sopheak; Delmont, Anne; Sempéré, Richard; Paluselli, Andrea; Ouddane, Baghdad
2015-05-15
Because of their widespread application, phthalates or phthalic acid esters (PAEs) are ubiquitous in the environment. Their presence has attracted considerable attention due to their potential impacts on ecosystem functioning and on public health, so their quantification has become a necessity. Various extraction procedures as well as gas/liquid chromatography and mass spectrometry detection techniques have been found suitable for reliable detection of such compounds. However, PAEs are also ubiquitous in the laboratory environment, including ambient air, reagents, sampling equipment, and various analytical devices, which makes the analysis of real samples with a low PAE background difficult. Therefore, accurate PAE analysis in environmental matrices is a challenging task. This paper reviews the extensive literature data on the techniques for PAE quantification in natural media. Sampling, sample extraction/pretreatment and detection for quantifying PAEs in different environmental matrices (air, water, sludge, sediment and soil) have been reviewed and compared. The concept of "green analytical chemistry" for PAE determination is also discussed. Moreover, useful information about material preparation and the procedures of quality control and quality assurance is presented to overcome the problem of sample contamination and problems encountered due to matrix effects, in order to avoid overestimating PAE concentrations in the environment. Copyright © 2015 Elsevier B.V. All rights reserved.
Sample Return Missions Where Contamination Issues are Critical: Genesis Mission Approach
NASA Technical Reports Server (NTRS)
Allton, Judith H.; Stansbery E. K.
2011-01-01
The Genesis Mission sought the challenging analytical goals of accurately and precisely measuring the elemental and isotopic composition of the Sun to levels useful for planetary science, requiring sensitivities of ppm to ppt in the outer 100 nm of collector materials. Analytical capabilities were further challenged when the hard landing in 2004 broke open the canister containing the super-clean collectors. Genesis illustrates that returned samples allow flexibility and creativity to recover from setbacks.
ERIC Educational Resources Information Center
McCoy, Chase; Shih, Patrick C.
2016-01-01
Educational data science (EDS) is an emerging, interdisciplinary research domain that seeks to improve educational assessment, teaching, and student learning through data analytics. Teachers have been portrayed in the EDS literature as users of pre-constructed data dashboards in educational technologies, with little consideration given to them as…
Human Factors in Streaming Data Analysis: Challenges and Opportunities for Information Visualization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dasgupta, Aritra; Arendt, Dustin L.; Franklin, Lyndsey
State-of-the-art visual analytics models and frameworks mostly assume a static snapshot of the data, while in many cases it is a stream with constant updates and changes. Exploration of streaming data poses unique challenges as machine-level computations and abstractions need to be synchronized with the visual representation of the data and the temporally evolving human insights. In the visual analytics literature, we lack a thorough characterization of streaming data and analysis of the challenges associated with task abstraction, visualization design, and adaptation of the role of human-in-the-loop for exploration of data streams. We aim to fill this gap by conducting a survey of the state-of-the-art in visual analytics of streaming data for systematically describing the contributions and shortcomings of current techniques and analyzing the research gaps that need to be addressed in the future. Our contributions are: i) problem characterization for identifying challenges that are unique to streaming data analysis tasks, ii) a survey and analysis of the state-of-the-art in streaming data visualization research with a focus on the visualization design space for dynamic data and the role of the human-in-the-loop, and iii) reflections on the design trade-offs for streaming visual analytics techniques and their practical applicability in real-world application scenarios.
Hill, Claire; Martin, Jennifer L; Thomson, Simon; Scott-Ram, Nick; Penfold, Hugh; Creswell, Cathy
2017-08-01
This article presents an analysis of challenges and considerations when developing digital mental health innovations. Recommendations include collaborative working between clinicians, researchers, industry and service users in order to successfully navigate challenges and to ensure e-therapies are engaging, acceptable, evidence based, scalable and sustainable. © The Royal College of Psychiatrists 2017.
Early Alert of Academically At-Risk Students: An Open Source Analytics Initiative
ERIC Educational Resources Information Center
Jayaprakash, Sandeep M.; Moody, Erik W.; Lauría, Eitel J. M.; Regan, James R.; Baron, Joshua D.
2014-01-01
The Open Academic Analytics Initiative (OAAI) is a collaborative, multi-year grant program aimed at researching issues related to the scaling up of learning analytics technologies and solutions across all of higher education. The paper describes the goals and objectives of the OAAI, depicts the process and challenges of collecting, organizing and…
Medical Students' Understanding of Directed Questioning by Their Clinical Preceptors.
Lo, Lawrence; Regehr, Glenn
2017-01-01
Phenomenon: Throughout clerkship, preceptors ask medical students questions for both assessment and teaching purposes. However, the cognitive and strategic aspects of students' approaches to managing this situation have not been explored. Without an understanding of how students approach the question and answer activity, medical educators are unable to appreciate how effectively this activity fulfills their purposes of assessment or determine the activity's associated educational effects. A convenience sample of nine 4th-year medical students participated in semistructured one-on-one interviews exploring their approaches to managing situations in which they have been challenged with questions from preceptors to which they do not know the answer. Through an iterative and recursive analytic reading of the interview transcripts, data were coded and organized to identify themes relevant to the students' considerations in answering such questions. Students articulated deliberate strategies for managing the directed questioning activity, which at times focused on the optimization of their learning but always included considerations of image management. Managing image involved projecting not only being knowledgeable but also being teachable. The students indicated that their considerations in selecting an appropriate strategy in a given situation involved their perceptions of their preceptors' intentions and preferences as well as several contextual factors. Insights: The medical students we interviewed were quite sophisticated in their understanding of the social nuances of the directed questioning process and described a variety of contextually invoked strategies to manage the situation and maintain a positive image.
The potential influence of rain on airfoil performance
NASA Technical Reports Server (NTRS)
Dunham, R. Earl, Jr.
1987-01-01
The potential influence of heavy rain on airfoil performance is discussed. Experimental methods for evaluating rain effects are reviewed. Important scaling considerations for extrapolating model data are presented. It is shown that considerable additional effort, both analytical and experimental, is necessary to understand the degree of hazard associated with flight operations in rain.
Some technical implications of Klein's concept of 'premature ego development'.
Mitrani, Judith L
2007-08-01
In this paper, the author revisits the problem of 'premature ego development' first introduced by Melanie Klein in 1930. She also highlights several developments in post-Kleinian thinking since the publication of that paper, which can be seen as offshoots of or complements to Klein's work. The author proposes a link between this category of precocious development and the absence of the experience of what Bion termed the 'containing object.' She puts forward several technical considerations relevant to analytic work with patients who suffer as a result of early developmental failures and presents various clinical vignettes in order to demonstrate the ways in which these considerations take shape in the analytic setting.
NASA Astrophysics Data System (ADS)
Morton, A.; Stewart, R.; Held, E.; Piburn, J.; Allen, M. R.; McManamay, R.; Sanyal, J.; Sorokine, A.; Bhaduri, B. L.
2017-12-01
Spatiotemporal (ST) analytics applied to major spatio-temporal data sources from providers such as USGS, NOAA, the World Bank and the World Health Organization have tremendous value in shedding light on the evolution of physical, cultural, and geopolitical landscapes on a local and global level. Especially powerful is the integration of these physical and cultural datasets across multiple and disparate formats, facilitating new interdisciplinary analytics and insights. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, changing attributes, and content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at the Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 16000+ attributes covering 200+ countries for over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We report on these advances, provide an illustrative case study, and describe how others may freely access the tool.
Current Status of Mycotoxin Analysis: A Critical Review.
Shephard, Gordon S
2016-07-01
It is over 50 years since the discovery of aflatoxins focused the attention of food safety specialists on fungal toxins in the feed and food supply. Since then, analysis of this important group of natural contaminants has advanced in parallel with general developments in analytical science, and current MS methods are capable of simultaneously analyzing hundreds of compounds, including mycotoxins, pesticides, and drugs. This profusion of data may advance our understanding of human exposure, yet constitutes an interpretive challenge to toxicologists and food safety regulators. Despite these advances in analytical science, the basic problem of the extreme heterogeneity of mycotoxin contamination, although now well understood, cannot be circumvented. The real health challenges posed by mycotoxin exposure occur in the developing world, especially among small-scale and subsistence farmers. Addressing these problems requires innovative approaches in which analytical science must also play a role in providing suitable out-of-laboratory analytical techniques.
Xu, Shen; Rogers, Toby; Fairweather, Elliot; Glenn, Anthony; Curran, James; Curcin, Vasa
2018-01-01
Data provenance is a technique that describes the history of digital objects. In health data settings, it can be used to deliver auditability and transparency, and to achieve trust in a software system. However, implementing data provenance in analytics software at an enterprise level presents a different set of challenges from the research environments where data provenance was originally devised. In this paper, the challenges of reporting provenance information to the user are presented. Provenance captured from analytics software can be large and complex, and visualizing a series of tasks over a long period can be overwhelming even for a domain expert, requiring visual aggregation mechanisms that fit with the complex human cognitive activities involved in the process. This research studied how provenance-based reporting can be integrated into health data analytics software, using the example of the Atmolytics visual reporting tool.
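To illustrate the kind of visual aggregation the reporting problem calls for, the sketch below collapses a stream of task-level provenance records into per-day activity counts. The record layout is an assumption made for illustration; it is not the Atmolytics format or a W3C PROV serialization.

```python
# Minimal sketch of the aggregation step behind a provenance report:
# collapsing a stream of task-level provenance records into per-activity
# counts per day. The record layout is assumed for illustration only.
from collections import Counter
from datetime import date

records = [  # (activity type, date performed) -- illustrative only
    ("cohort-built", date(2018, 1, 3)),
    ("report-run", date(2018, 1, 3)),
    ("report-run", date(2018, 1, 4)),
]

daily_counts = Counter((activity, when) for activity, when in records)
for (activity, when), n in sorted(daily_counts.items(), key=lambda kv: kv[0][1]):
    print(f"{when}: {activity} x{n}")
```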
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kris A.; Scholtz, Jean; Whiting, Mark A.
The VAST Challenge has been a popular venue for academic and industry participants for over ten years. Many participants comment that the majority of their time in preparing VAST Challenge entries is spent discovering elements in their software environments that need to be redesigned in order to solve the given task. Fortunately, there is no need to wait until the VAST Challenge is announced to test out software systems. The Visual Analytics Benchmark Repository contains all past VAST Challenge tasks, data, solutions and submissions. This paper details the various types of evaluations that may be conducted using the Repository information. In this paper we describe how developers can do informal evaluations of various aspects of their visual analytics environments using VAST Challenge information. Aspects that can be evaluated include the appropriateness of the software for various tasks, the various data types and formats that can be accommodated, the effectiveness and efficiency of the process supported by the software, and the intuitiveness of the visualizations and interactions. Researchers can compare their visualizations and interactions to those submitted to determine novelty. In addition, the paper provides pointers to various guidelines that software teams can use to evaluate the usability of their software. While these evaluations are not a replacement for formal evaluation methods, this information can be extremely useful during the development of visual analytics environments.
Casteleyn, L; Dumez, B; Becker, K; Kolossa-Gehring, M; Den Hond, E; Schoeters, G; Castaño, A; Koch, H M; Angerer, J; Esteban, M; Exley, K; Sepai, O; Bloemen, L; Horvat, M; Knudsen, L E; Joas, A; Joas, R; Biot, P; Koppen, G; Dewolf, M-C; Katsonouri, A; Hadjipanayis, A; Cerná, M; Krsková, A; Schwedler, G; Fiddicke, U; Nielsen, J K S; Jensen, J F; Rudnai, P; Közepésy, S; Mulcahy, M; Mannion, R; Gutleb, A C; Fischer, M E; Ligocka, D; Jakubowski, M; Reis, M F; Namorado, S; Lupsa, I-R; Gurzau, A E; Halzlova, K; Jajcaj, M; Mazej, D; Tratnik Snoj, J; Posada, M; López, E; Berglund, M; Larsson, K; Lehmann, A; Crettaz, P; Aerts, D
2015-08-01
In 2004 the European Commission and Member States initiated activities towards a harmonized approach for Human Biomonitoring surveys throughout Europe. The main objective was to sustain environmental health policy by building a coherent and sustainable framework and by increasing the comparability of data across countries. A pilot study to test common guidelines for setting up surveys was considered a key step in this process. Through a bottom-up approach that included all stakeholders, a joint study protocol was elaborated. From September 2011 until February 2012, 17 European countries collected data from 1844 mother-child pairs in the frame of DEMOnstration of a study to COordinate and Perform Human Biomonitoring on a European Scale (DEMOCOPHES).(1) Mercury in hair and urinary cadmium and cotinine were selected as biomarkers of exposure covered by sufficient analytical experience. Phthalate metabolites and Bisphenol A in urine were added to take into account increasing public and political awareness for emerging types of contaminants and to test less advanced markers/markers covered by less analytical experience. Extensive efforts towards chemo-analytical comparability were included. The pilot study showed that common approaches can be found in a context of considerable differences with respect to experience and expertise, socio-cultural background, economic situation and national priorities. It also evidenced that comparable Human Biomonitoring results can be obtained in such a context. A European network was built, exchanging information, expertise and experiences, and providing training on all aspects of a survey. A key challenge was finding the right balance between a rigid structure allowing maximal comparability and a flexible approach increasing feasibility and capacity building. Next steps in European harmonization in Human Biomonitoring surveys include the establishment of a joint process for prioritization of substances to cover and biomarkers to develop, linking biomonitoring surveys with health examination surveys and with research, and coping with the diverse implementations of EU regulations and international guidelines with respect to ethics and privacy. Copyright © 2014 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Lutomski, Michael G.; Carter-Journet, Katrina; Anderson, Leif; Box, Neil; Harrington, Sean; Jackson, David; DiFilippo, Denise
2012-01-01
The International Space Station (ISS) was originally designed to operate until 2015 with a plan for deorbiting the ISS in 2016. Currently, the international partnership has agreed to extend operations until 2020 and discussions are underway to extend the life even further to 2028. Each partner is responsible for the sustaining engineering, sparing, and maintenance of their own segments. National Aeronautics and Space Administration's (NASA's) challenge is to purchase the needed number of spares to maintain the functional availability of the ISS systems necessary for the United States On-Orbit Segment's contribution. This presentation introduces an analytical approach to assessing uncertainty in ISS hardware necessary to extend the life of the vehicle. Some key areas for consideration are: establishing what confidence targets are required to ensure science can be continuously carried out on the ISS, defining what confidence targets are reasonable to ensure vehicle survivability, considering what is required to determine if the confidence targets are too high, and determining whether a sufficient number of spares has been purchased. The results of the analysis will provide a methodological basis for reassessing vehicle subsystem confidence targets. This analysis compares the probability of existing spares exceeding the total expected unit demand of the Orbital Replacement Unit (ORU) in functional hierarchies approximating the vehicle subsystems. In cases where the functional hierarchies' availability does not meet subsystem confidence targets, the analysis will further identify which ORUs may require additional spares to extend the life of the ISS. The resulting probability is dependent upon hardware reliability estimates. However, the ISS hardware fleet carries considerable epistemic uncertainty, which must be factored into the development and execution of sparing risk postures. In addition, it is also recognized that uncertainty in the assessment is due to disconnects between modeled functions and actual subsystem operations. Perhaps most importantly, it is acknowledged that conservative confidence targets per subsystem are currently accepted. This presentation will also discuss how subsystem confidence targets may be relaxed based on calculating the level of uncertainty for each corresponding ORU-function. The presentation will conclude with the various strengths and limitations of implementing the analytical approach in sustaining the ISS through end of life, 2020 and beyond.
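As a rough illustration of the spares-versus-demand comparison described above, the sketch below treats ORU failures as a Poisson process, so the probability that k on-hand spares cover demand over a horizon is the Poisson CDF at k. This is a simplified stand-in, not the actual ISS sparing methodology, and the rates are placeholders:

```python
# Minimal sketch of a spares-sufficiency check: if ORU failures arrive as a
# Poisson process, the probability that k on-hand spares cover the demand over
# the remaining life is the Poisson CDF at k. Illustrative model only; the
# numbers are placeholders, not ISS reliability data.
import math

def prob_spares_sufficient(k_spares: int, failure_rate_per_yr: float, years: float) -> float:
    lam = failure_rate_per_yr * years  # expected failures over the horizon
    return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(k_spares + 1))

# e.g., 3 spares of an ORU failing ~0.4 times/year, 8 years of remaining life:
print(f"P(spares cover demand) = {prob_spares_sufficient(3, 0.4, 8.0):.3f}")
```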
Analytical challenges in sports drug testing.
Thevis, Mario; Krug, Oliver; Geyer, Hans; Walpurgis, Katja; Baume, Norbert; Thomas, Andreas
2018-03-01
Analytical chemistry represents a central aspect of doping controls. Routine sports drug testing approaches are primarily designed to address the question whether a prohibited substance is present in a doping control sample and whether prohibited methods (for example, blood transfusion or sample manipulation) have been conducted by an athlete. As some athletes have availed themselves of the substantial breadth of research and development in the pharmaceutical arena, proactive and preventive measures are required such as the early implementation of new drug candidates and corresponding metabolites into routine doping control assays, even though these drug candidates are to date not approved for human use. Beyond this, analytical data are also cornerstones of investigations into atypical or adverse analytical findings, where the overall picture provides ample reason for follow-up studies. Such studies have been of most diverse nature, and tailored approaches have been required to probe hypotheses and scenarios reported by the involved parties concerning the plausibility and consistency of statements and (analytical) facts. In order to outline the variety of challenges that doping control laboratories are facing besides providing optimal detection capabilities and analytical comprehensiveness, selected case vignettes involving the follow-up of unconventional adverse analytical findings, urine sample manipulation, drug/food contamination issues, and unexpected biotransformation reactions are thematized.
ERIC Educational Resources Information Center
Khan, Osama
2017-01-01
This paper depicts a perceptual picture of learning analytics based on the understanding of learners and teachers at the SSU as a case study. The existing literature covers technical challenges of learning analytics (LA) and how it creates better social construct for enhanced learning support, however, there has not been adequate research on…
ERIC Educational Resources Information Center
Drachsler, H.; Kalz, M.
2016-01-01
The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework to situate ongoing research in the MOOC and learning analytics innovation cycle (MOLAC framework). The MOLAC framework is organized on three levels: On the micro-level, the data collection and analytics…
Lim, Wei Yin; Goh, Boon Tong; Khor, Sook Mei
2017-08-15
Clinicians, working in the health-care diagnostic systems of developing countries, currently face the challenges of rising costs, increased number of patient visits, and limited resources. A significant trend is using low-cost substrates to develop microfluidic devices for diagnostic purposes. Various fabrication techniques, materials, and detection methods have been explored to develop these devices. Microfluidic paper-based analytical devices (μPADs) have gained attention for sensing multiplex analytes, confirming diagnostic test results, rapid sample analysis, and reducing the volume of samples and analytical reagents. μPADs, which can provide accurate and reliable direct measurement without sample pretreatment, can reduce patient medical burden and yield rapid test results, aiding physicians in choosing appropriate treatment. The objectives of this review are to provide an overview of the strategies used for developing paper-based sensors with enhanced analytical performances and to discuss the current challenges, limitations, advantages, disadvantages, and future prospects of paper-based microfluidic platforms in clinical diagnostics. μPADs, with validated and justified analytical performances, can potentially improve the quality of life by providing inexpensive, rapid, portable, biodegradable, and reliable diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.
Big data analytics to improve cardiovascular care: promise and challenges.
Rumsfeld, John S; Joynt, Karen E; Maddox, Thomas M
2016-06-01
The potential for big data analytics to improve cardiovascular quality of care and patient outcomes is tremendous. However, the application of big data in health care is at a nascent stage, and the evidence to date demonstrating that big data analytics will improve care and outcomes is scant. This Review provides an overview of the data sources and methods that comprise big data analytics, and describes eight areas of application of big data analytics to improve cardiovascular care, including predictive modelling for risk and resource use, population management, drug and medical device safety surveillance, disease and treatment heterogeneity, precision medicine and clinical decision support, quality of care and performance measurement, and public health and research applications. We also delineate the important challenges for big data applications in cardiovascular care, including the need for evidence of effectiveness and safety, the methodological issues such as data quality and validation, and the critical importance of clinical integration and proof of clinical utility. If big data analytics are shown to improve quality of care and patient outcomes, and can be successfully implemented in cardiovascular practice, big data will fulfil its potential as an important component of a learning health-care system.
ERIC Educational Resources Information Center
Stauber, Barbara; Parreira do Amaral, Marcelo
2015-01-01
This article presents analytical considerations for the discussion of issues of access to education and inequality. It first sharpens the concept of access and inequality by pointing to the interplay of structure and agency as well as to processes of social differentiation in which differences are constructed. This implies a critical view on…
A Requirements-Driven Optimization Method for Acoustic Liners Using Analytic Derivatives
NASA Technical Reports Server (NTRS)
Berton, Jeffrey J.; Lopes, Leonard V.
2017-01-01
More than ever, there is flexibility and freedom in acoustic liner design. Subject to practical considerations, liner design variables may be manipulated to achieve a target attenuation spectrum. But characteristics of the ideal attenuation spectrum can be difficult to know. Many multidisciplinary system effects govern how engine noise sources contribute to community noise. Given a hardwall fan noise source to be suppressed, and using an analytical certification noise model to compute a community noise measure of merit, the optimal attenuation spectrum can be derived using multidisciplinary systems analysis methods. In a previous paper on this subject, a method deriving the ideal target attenuation spectrum that minimizes noise perceived by observers on the ground was described. A simple code-wrapping approach was used to evaluate a community noise objective function for an external optimizer. Gradients were evaluated using a finite difference formula. The subject of this paper is an application of analytic derivatives that supply precise gradients to an optimization process. Analytic derivatives improve the efficiency and accuracy of gradient-based optimization methods and allow consideration of more design variables. In addition, the benefit of variable impedance liners is explored using a multi-objective optimization.
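The efficiency argument for analytic derivatives can be seen on a toy problem: a forward finite-difference gradient costs one extra objective evaluation per design variable and carries truncation error, while an analytic gradient is exact. In the sketch below, a simple quadratic stands in for the (far more complex) community-noise objective; everything here is illustrative, not the paper's model:

```python
# Minimal sketch of why analytic derivatives help gradient-based optimization:
# compare a forward finite-difference gradient with the exact analytic gradient
# on a toy objective standing in for the community-noise metric.
import numpy as np

def f(x):                      # toy objective over "liner design variables"
    return np.sum((x - 1.0) ** 2)

def grad_analytic(x):          # exact gradient, cheap and noise-free
    return 2.0 * (x - 1.0)

def grad_fd(x, h=1e-6):        # one extra f-evaluation per design variable
    g = np.zeros_like(x)
    fx = f(x)
    for i in range(x.size):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - fx) / h
    return g

x0 = np.array([0.3, -0.7, 2.1])
print("max abs error of FD gradient:", np.max(np.abs(grad_fd(x0) - grad_analytic(x0))))
```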
Empirical and semi-analytical models for predicting peak outflows caused by embankment dam failures
NASA Astrophysics Data System (ADS)
Wang, Bo; Chen, Yunliang; Wu, Chao; Peng, Yong; Song, Jiajun; Liu, Wenjun; Liu, Xin
2018-07-01
Prediction of peak discharge of floods has attracted great attention from researchers and engineers. In the present study, nine typical nonlinear mathematical models are established based on a database of 40 historical dam failures. The first eight models, developed with a series of regression analyses, are purely empirical, while the last one is a semi-analytical approach derived from an analytical solution of dam-break floods in a trapezoidal channel. Water depth above breach invert (Hw), volume of water stored above breach invert (Vw), embankment length (El), and average embankment width (Ew) are used as independent variables to develop empirical formulas for estimating the peak outflow from breached embankment dams. The multiple regression analysis indicates that a function using the former two variables (i.e., Hw and Vw) produces considerably more accurate results than one using the latter two variables (i.e., El and Ew). It is shown that the semi-analytical approach works best in terms of both prediction accuracy and uncertainty, and that the established empirical models produce reasonable results, except for the model using only El. Moreover, the present models have been compared with other models available in the literature for estimating peak discharge.
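As a hedged sketch of how such empirical formulas are fitted, the snippet below estimates one power-law model, Qp = a·Hw^b·Vw^c, by log-linear least squares. The data are placeholders, not the 40-failure database, and the paper's nine fitted models are not reproduced:

```python
# Minimal sketch of fitting an empirical peak-outflow model of the form
# Qp = a * Hw^b * Vw^c by log-linear least squares. Data are placeholders.
import numpy as np

Hw = np.array([10.0, 25.0, 8.0, 40.0])      # water depth above breach invert (m)
Vw = np.array([1e6, 5e7, 3e5, 2e8])         # storage above breach invert (m^3)
Qp = np.array([800.0, 9000.0, 250.0, 6e4])  # observed peak outflow (m^3/s)

# log Qp = log a + b*log Hw + c*log Vw  ->  ordinary least squares
A = np.column_stack([np.ones_like(Hw), np.log(Hw), np.log(Vw)])
coef, *_ = np.linalg.lstsq(A, np.log(Qp), rcond=None)
a, b, c = np.exp(coef[0]), coef[1], coef[2]
print(f"Qp = {a:.3g} * Hw^{b:.2f} * Vw^{c:.2f}")
```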
Non-traditional applications of laser desorption/ionization mass spectrometry
NASA Astrophysics Data System (ADS)
McAlpin, Casey R.
Seven studies were carried out using laser desorption/ionization mass spectrometry (LDI MS) to develop enhanced methodologies for a variety of analyte systems by investigating analyte chemistries, ionization processes, and elimination of spectral interferences. Applications of LDI and matrix-assisted laser desorption/ionization (MALDI) have previously been limited by poorly understood ionization phenomena and spectral interferences from matrices. Matrix-assisted laser desorption/ionization MS is well suited to the analysis of proteins. However, the proteins associated with bacteriophages often form complexes which are too massive for detection with a standard MALDI mass spectrometer. As such, methodologies for pretreatment of these samples are discussed in detail in the first chapter. Pretreatment of bacteriophage samples with reducing agents disrupted disulfide linkages and allowed enhanced detection of bacteriophage proteins. The second chapter focuses on the use of MALDI MS for lipid compounds whose molecular mass is significantly less than the proteins for which MALDI is most often applied. The use of MALDI MS for lipid analysis presented unique challenges such as matrix interference and differential ionization efficiencies. It was observed that optimization of the matrix system and addition of cationization reagents mitigated these challenges and resulted in an enhanced methodology for MALDI MS of lipids. One of the challenges commonly encountered in efforts to expand MALDI MS applications is, as previously mentioned, interference introduced by organic matrix molecules. The third chapter focuses on the development of a novel inorganic matrix replacement system called metal oxide laser ionization mass spectrometry (MOLI MS). In contrast to other matrix replacements, considerable effort was devoted to elucidating the ionization mechanism. It was shown that chemisorption of analytes to the metal oxide surface produced acidic adsorbed species which then protonated free analyte molecules. Expanded applications of MOLI MS were developed following description of the ionization mechanism. A series of experiments were carried out involving treatment of metal oxide surfaces with reagent molecules to expand MOLI MS and develop enhanced MOLI MS methodologies. It was found that treatment of the metal oxide surface with a small molecule to act as a proton source expanded MOLI MS to analytes which did not form acidic adsorbed species. Proton-source pretreated MOLI MS was then used for the analysis of oils obtained from the fast, anoxic pyrolysis of biomass (py-oil). These samples are complex and produce MOLI mass spectra with many peaks. In this experiment, methods of data reduction including Kendrick mass defects and nominal mass z*-scores, which are commonly used for the study of petroleum fractions, were used to interpret these spectra and identify the major constituencies of py-oils. Through data reduction and collision induced dissociation (CID), homologous series of compounds were rapidly identified. The final chapter involves using metal oxides to catalytically cleave the ester linkage on lipids containing fatty acids in addition to ionization. The cleavage process results in the production of spectra similar to those observed with saponification/methylation. Fatty acid profiles were generated for a variety of micro-organisms to differentiate between bacterial species. (Abstract shortened by UMI.)
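For readers unfamiliar with the data-reduction step mentioned above, the sketch below computes the CH2-based Kendrick mass defect, under which members of a homologous series share nearly the same defect value. The peak list is hypothetical:

```python
# Minimal sketch of the CH2-based Kendrick mass defect used to pick out
# homologous series in complex spectra such as py-oils. Peak masses below are
# placeholders; members of one CH2 series share (nearly) the same defect.
CH2_EXACT = 14.01565    # exact mass of a CH2 repeat unit
CH2_NOMINAL = 14.0

def kendrick_mass_defect(m_obs: float) -> float:
    km = m_obs * (CH2_NOMINAL / CH2_EXACT)  # rescale so CH2 weighs exactly 14
    return round(km) - km                    # defect relative to nominal mass

for m in (198.1045, 212.1201, 226.1358):     # a hypothetical CH2 series
    print(f"m/z {m:9.4f} -> KMD {kendrick_mass_defect(m):+.4f}")
```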
Lakbub, Jude C; Shipman, Joshua T; Desaire, Heather
2018-04-01
Disulfide bonds are important structural moieties of proteins: they ensure proper folding, provide stability, and ensure proper function. With the increasing use of proteins for biotherapeutics, particularly monoclonal antibodies, which are highly disulfide bonded, it is now important to confirm the correct disulfide bond connectivity and to verify the presence, or absence, of disulfide bond variants in the protein therapeutics. These studies help to ensure safety and efficacy. Hence, disulfide bonds are among the critical quality attributes of proteins that have to be monitored closely during the development of biotherapeutics. However, disulfide bond analysis is challenging because of the complexity of the biomolecules. Mass spectrometry (MS) has been the go-to analytical tool for the characterization of such complex biomolecules, and several methods have been reported to meet the challenging task of mapping disulfide bonds in proteins. In this review, we describe the relevant, recent MS-based techniques and provide important considerations needed for efficient disulfide bond analysis in proteins. The review focuses on methods for proper sample preparation, fragmentation techniques for disulfide bond analysis, recent disulfide bond mapping methods based on the fragmentation techniques, and automated algorithms designed for rapid analysis of disulfide bonds from liquid chromatography-MS/MS data. Researchers involved in method development for protein characterization can use the information herein to facilitate development of new MS-based methods for protein disulfide bond analysis. In addition, individuals characterizing biotherapeutics, especially by disulfide bond mapping in antibodies, can use this review to choose the best strategies for disulfide bond assignment of their biologic products. Graphical abstract: This review, describing characterization methods for disulfide bonds in proteins, focuses on three critical components: sample preparation, mass spectrometry data, and software tools.
Microplastics in the environment: Challenges in analytical chemistry - A review.
Silva, Ana B; Bastos, Ana S; Justino, Celine I L; da Costa, João P; Duarte, Armando C; Rocha-Santos, Teresa A P
2018-08-09
Microplastics can be present in the environment as manufactured microplastics (known as primary microplastics) or resulting from the continuous weathering of plastic litter, which yields progressively smaller plastic fragments (known as secondary microplastics). Herein, we discuss the numerous issues associated with the analysis of microplastics, and to a less extent of nanoplastics, in environmental samples (water, sediments, and biological tissues), from their sampling and sample handling to their identification and quantification. The analytical quality control and quality assurance associated with the validation of analytical methods and use of reference materials for the quantification of microplastics are also discussed, as well as the current challenges within this field of research and possible routes to overcome such limitations. Copyright © 2018 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Abes, Elisa S.
2009-01-01
This article is an exploration of possibilities and methodological considerations for using multiple theoretical perspectives in research that challenges inequitable power structures in student development theory. Specifically, I explore methodological considerations when partnering queer theory and constructivism in research on lesbian identity…
An approximate analytical solution for interlaminar stresses in angle-ply laminates
NASA Technical Reports Server (NTRS)
Rose, Cheryl A.; Herakovich, Carl T.
1991-01-01
An improved approximate analytical solution for interlaminar stresses in finite width, symmetric, angle-ply laminated coupons subjected to axial loading is presented. The solution is based upon statically admissible stress fields which take into consideration local property mismatch effects and global equilibrium requirements. Unknown constants in the admissible stress states are determined through minimization of the complementary energy. Typical results are presented for through-the-thickness and interlaminar stress distributions for angle-ply laminates. It is shown that the results represent an improved approximate analytical solution for interlaminar stresses.
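A minimal sketch of the minimization step, with notation assumed rather than taken from the paper: the admissible stress field carries unknown constants, and stationarity of the complementary energy yields a linear system for them.

```latex
% Notation assumed for illustration: S_{ijkl} is the compliance tensor and
% c_1,...,c_n are the unknown constants in the statically admissible field.
\[
  \sigma_{ij} = \sigma_{ij}(x;\, c_1, \dots, c_n), \qquad
  \Pi_c(c_1, \dots, c_n) = \tfrac{1}{2} \int_V \sigma_{ij}\, S_{ijkl}\, \sigma_{kl} \, dV ,
\]
\[
  \frac{\partial \Pi_c}{\partial c_k} = 0 , \qquad k = 1, \dots, n ,
\]
% a linear system whose solution fixes the constants and hence the
% interlaminar stress distributions.
```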
Trends in access of plant biodiversity data revealed by Google Analytics.
Jones, Timothy Mark; Baxter, David G; Hagedorn, Gregor; Legler, Ben; Gilbert, Edward; Thiele, Kevin; Vargas-Rodriguez, Yalma; Urbatsch, Lowell E
2014-01-01
The amount of plant biodiversity data available via the web has exploded in the last decade, but making these data available requires a considerable investment of time and work, both vital considerations for organizations and institutions looking to validate the impact factors of these online works. Here we used Google Analytics (GA) to measure the value of this digital presence. In this paper we examine usage trends using 15 different GA accounts, spread across 451 institutions or botanical projects that comprise over five percent of the world's herbaria. They were studied both at one year and across all years. User data from the sample reveal: 1) over 17 million web sessions; 2) use across five primary operating systems; 3) dominance of search and direct traffic, with minimal impact from social media; 4) a doubling of mobile and new device types each year for the past three years; and 5) changes in web browsers, the tools we use to interact with the web. Server-side analytics differ from site to site, making the comparison of their data sets difficult. However, use of Google Analytics erases the reporting heterogeneity of unique server-side analytics, as they can now be examined with a standard that provides clarity for data-driven decisions. The knowledge gained here empowers any collection-based environment, regardless of size, with metrics about usability, design, and possible directions for future development.
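As an illustration of how one of the reported trends could be checked from aggregated GA data, the sketch below computes year-over-year ratios of mobile sessions. The inline numbers are placeholders standing in for an export, and no GA API call is made:

```python
# Minimal sketch of checking a "mobile sessions doubled each year" trend on
# aggregated session counts. The inline data are placeholders standing in for
# a Google Analytics export (year, deviceCategory, sessions).
import pandas as pd

df = pd.DataFrame({
    "year":           [2011, 2012, 2013, 2011, 2012, 2013],
    "deviceCategory": ["mobile"] * 3 + ["desktop"] * 3,
    "sessions":       [40_000, 82_000, 165_000, 900_000, 950_000, 1_000_000],
})

mobile = df[df["deviceCategory"] == "mobile"].groupby("year")["sessions"].sum()
ratio = mobile / mobile.shift(1)   # year-over-year ratio; ~2.0 indicates doubling
print(ratio.dropna())
```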
Analytical advances in pharmaceutical impurity profiling.
Holm, René; Elder, David P
2016-05-25
Impurities will be present in all drug substances and drug products; i.e., nothing is 100% pure if one looks in enough depth. The current regulatory guidance on impurities accepts this, and for drug products with a dose of less than 2 g/day, identification of impurities is required at levels of 0.1% and above (ICH Q3B(R2), 2006). For some impurities this is a simple undertaking, as generally available analytical techniques can address the prevailing analytical challenges; for others it may be much more demanding, requiring sophisticated analytical approaches. The present review provides insight into the current development of analytical techniques to investigate and quantify impurities in drug substances and drug products, with discussion of progress particularly within the field of chromatography to ensure separation and quantification of related impurities. Further, a section is devoted to the identification of classical impurities, and inorganic (metal residue) and solid-state impurities are also discussed. Risk control strategies for pharmaceutical impurities, aligned with several of the ICH guidelines, are also discussed. Copyright © 2015 Elsevier B.V. All rights reserved.
Presidential Green Chemistry Challenge: 2009 Greener Reaction Conditions Award
Presidential Green Chemistry Challenge 2009 award winner, CEM Corporation, developed a fast, automated analytical process using less toxic reagents and less energy to distinguish protein from the food adulterant, melamine.
Białk-Bielińska, Anna; Kumirska, Jolanta; Borecka, Marta; Caban, Magda; Paszkiewicz, Monika; Pazdro, Ksenia; Stepnowski, Piotr
2016-03-20
Recent developments and improvements in advanced instruments and analytical methodologies have made the detection of pharmaceuticals at low concentration levels in different environmental matrices possible. As a result of these advances, over the last 15 years residues of these compounds and their metabolites have been detected in different environmental compartments, and pharmaceuticals have become recognized as so-called 'emerging' contaminants. To date, many papers have been published presenting the development of analytical methodologies for the determination of pharmaceuticals in aqueous and solid environmental samples. Many papers have also been published on the application of these new methodologies, mainly to the assessment of the environmental fate of pharmaceuticals. Although impressive improvements have undoubtedly been made, numerous methodological challenges must still be overcome in order to fully understand the behavior of these chemicals in the environment. The aim of this paper, therefore, is to present a review of selected recent improvements and challenges in the determination of pharmaceuticals in environmental samples. Special attention has been paid to the strategies used and the current challenges (also in terms of Green Analytical Chemistry) in the analysis of these chemicals in soils, marine environments and drinking waters. There is a particular focus on the applicability of modern sorbents such as carbon nanotubes (CNTs) in sample preparation techniques to overcome some of the problems in the analysis of pharmaceuticals in different environmental samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Diamond, Dermot; Lau, King Tong; Brady, Sarah; Cleary, John
2008-05-15
Rapid developments in wireless communications are opening up opportunities for new ways to perform many types of analytical measurements that up to now have been restricted in scope due to the need to have access to centralised facilities. This paper will address both the potential for new applications and the challenges that currently inhibit more widespread integration of wireless communications with autonomous sensors and analytical devices. Key issues are identified and strategies for closer integration of analytical information and wireless communications systems discussed.
Assessment of Critical-Analytic Thinking
ERIC Educational Resources Information Center
Brown, Nathaniel J.; Afflerbach, Peter P.; Croninger, Robert G.
2014-01-01
National policy and standards documents, including the National Assessment of Educational Progress frameworks, the "Common Core State Standards" and the "Next Generation Science Standards," assert the need to assess critical-analytic thinking (CAT) across subject areas. However, assessment of CAT poses several challenges for…
ERIC Educational Resources Information Center
Thaneerananon, Taveep; Triampo, Wannapong; Nokkaew, Artorn
2016-01-01
Nowadays, one of the biggest challenges of education in Thailand is the development and promotion of the students' thinking skills. The main purposes of this research were to develop an analytical thinking test for 6th grade students and evaluate the students' analytical thinking. The sample was composed of 3,567 6th grade students in 2014…
Enabling Analytics on Sensitive Medical Data with Secure Multi-Party Computation.
Veeningen, Meilof; Chatterjea, Supriyo; Horváth, Anna Zsófia; Spindler, Gerald; Boersma, Eric; van der Spek, Peter; van der Galiën, Onno; Gutteling, Job; Kraaij, Wessel; Veugen, Thijs
2018-01-01
While there is a clear need to apply data analytics in the healthcare sector, this is often difficult because it requires combining sensitive data from multiple data sources. In this paper, we show how the cryptographic technique of secure multi-party computation can enable such data analytics by performing analytics without the need to share the underlying data. We discuss the issue of compliance with European privacy legislation; report on three pilots bringing these techniques closer to practice; and discuss the main challenges ahead to make fully privacy-preserving data analytics in the medical sector commonplace.
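A minimal sketch of one building block behind such analytics, additive secret sharing (this illustrates the principle only; the pilots' actual protocol stack is not described here, and all values are invented):

```python
# Each hospital splits its private value into random shares; only the
# sum of all shares reveals the aggregate, never an individual input.
import secrets

P = 2**61 - 1  # public prime modulus (illustrative choice)

def share(value, n_parties):
    """Split `value` into n additive shares modulo P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

# Three hospitals, each with a private patient count.
private_inputs = [120, 87, 342]
all_shares = [share(v, 3) for v in private_inputs]

# Party j only ever sees the j-th share of every input...
partial_sums = [sum(s[j] for s in all_shares) % P for j in range(3)]
# ...and the aggregate is reconstructed from the partial sums alone.
total = sum(partial_sums) % P
assert total == sum(private_inputs)  # 549, with no input disclosed
```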
Reflection as Creative Process: Perspectives, Challenges and Practice
ERIC Educational Resources Information Center
Guillaumier, Christina
2016-01-01
This paper explores the challenges and opportunities for embedding reflection in practice-based curricula in the arts. Following the root and branch curriculum reform project recently completed at the Royal Conservatoire of Scotland, the paper presents a hermeneutic and analytical narrative of the challenges emerging from presenting reflection as…
ERIC Educational Resources Information Center
Holden, Borge; Gitlesen, Jens Petter
2008-01-01
In addition to explaining challenging behaviour by way of behaviour analytic, functional analyses, challenging behaviour is increasingly explained by way of psychiatric symptomatology. According to some researchers, the two approaches complement each other, as psychiatric symptomatology may form a motivational basis for the individual's response…
Nonlinear estimation for arrays of chemical sensors
NASA Astrophysics Data System (ADS)
Yosinski, Jason; Paffenroth, Randy
2010-04-01
Reliable detection of hazardous materials is a fundamental requirement of any national security program. Such materials can take a wide range of forms including metals, radioisotopes, volatile organic compounds, and biological contaminants. In particular, detection of hazardous materials in highly challenging conditions - such as in cluttered ambient environments, where complex collections of analytes are present, and with sensors lacking specificity for the analytes of interest - is an important part of a robust security infrastructure. Sophisticated single sensor systems provide good specificity for a limited set of analytes but often have cumbersome hardware and environmental requirements. On the other hand, simple, broadly responsive sensors are easily fabricated and efficiently deployed, but such sensors individually have neither the specificity nor the selectivity to address analyte differentiation in challenging environments. However, arrays of broadly responsive sensors can provide much of the sensitivity and selectivity of sophisticated sensors but without the substantial hardware overhead. Unfortunately, arrays of simple sensors are not without their challenges - the selectivity of such arrays can only be realized if the data is first distilled using highly advanced signal processing algorithms. In this paper we will demonstrate how the use of powerful estimation algorithms, based on those commonly used within the target tracking community, can be extended to the chemical detection arena. Herein our focus is on algorithms that not only provide accurate estimates of the mixture of analytes in a sample, but also provide robust measures of ambiguity, such as covariances.
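As a hedged illustration of the estimation idea (a generic linear response model with a Gauss-Markov estimator, not the authors' tracking-based algorithm; the response matrix and noise level are invented):

```python
# A linear response model y = A @ x + noise for an array of broadly
# responsive sensors, solved by least squares, which also yields a
# covariance matrix quantifying ambiguity in the mixture estimate.
import numpy as np

rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, size=(8, 3))    # 8 sensors x 3 analytes (assumed known responses)
x_true = np.array([0.5, 0.0, 1.2])        # hidden analyte concentrations
sigma = 0.05
y = A @ x_true + rng.normal(0, sigma, 8)  # noisy array readout

AtA_inv = np.linalg.inv(A.T @ A)
x_hat = AtA_inv @ A.T @ y                 # concentration estimate
cov_x = sigma**2 * AtA_inv                # "robust measure of ambiguity"
print(x_hat, np.sqrt(np.diag(cov_x)))
```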
ERIC Educational Resources Information Center
Buckingham Shum, Simon; Ferguson, Rebecca
2012-01-01
We propose that the design and implementation of effective "Social Learning Analytics (SLA)" present significant challenges and opportunities for both research and enterprise, in three important respects. The first is that the learning landscape is extraordinarily turbulent at present, in no small part due to technological drivers.…
Language Teacher Identities in the Southern United States: Transforming Rural Schools
ERIC Educational Resources Information Center
Fogle, Lyn Wright; Moser, Kelly
2017-01-01
Foreign language (FL) and English as a Second Language (ESL) teaching present considerable challenges in the rural U.S. South. Local language ideologies, budgetary considerations, and challenges in other curricular areas (e.g., math and science) lead to marginalizing both FL and ESL in schools. This article examines the personal and professional…
Overcoming the glass ceiling: views from the cellar and the roof.
McCrady, Barbara S
2012-12-01
Women's experiences as professionals and behavior therapists have changed considerably in the past 40 years. The author describes early challenges and experiences of discrimination as a young female professional. Although women's opportunities have improved considerably, women still experience unique career challenges and choices. The author provides some suggestions for women's career development.
NASA Technical Reports Server (NTRS)
Dubinskiy, Mark A.; Kamal, Mohammed M.; Misra, Prabhaker
1995-01-01
The availability of manned laboratory facilities in space offers wonderful opportunities and challenges in microgravity combustion science and technology. In turn, the fundamentals of microgravity combustion science can be studied via spectroscopic characterization of free radicals generated in flames. The laser-induced fluorescence (LIF) technique is a noninvasive method of considerable utility in combustion physics and chemistry, suitable not only for monitoring specific species and their kinetics but also for imaging flames. This makes LIF one of the most important tools for microgravity combustion science. Flame characterization under microgravity conditions using LIF is expected to be more informative than other methods aimed at searching for effects, such as the pumping phenomenon, that can be modeled via ground-level experiments. A primary goal of our work was to devise an innovative LIF-based analytical unit suitable for in-space flame characterization. We followed two approaches in tandem: (1) use existing laboratory (non-portable) equipment to determine the optimal set of flame parameters that can serve as analytical criteria for flame characterization under microgravity conditions; and (2) use state-of-the-art developments in laser technology to devise a layout for the portable analytical equipment. This paper presents an up-to-date summary of the results of our experiments aimed at creating a portable device for combustion studies in a microgravity environment, based on a portable UV tunable solid-state laser for excitation of free radicals normally present in flames in detectable amounts. A systematic approach allowed us to make a convenient choice of species under investigation and of the proper tunable laser system, and enabled us to carry out LIF experiments on free radicals using a solid-state laser tunable in the UV.
Luo, Wei; Yin, Peifeng; Di, Qian; Hardisty, Frank; MacEachren, Alan M
2014-01-01
The world has become a complex set of geo-social systems interconnected by networks, including transportation networks, telecommunications, and the internet. Understanding the interactions between spatial and social relationships within such geo-social systems is a challenge. This research aims to address this challenge through the framework of geovisual analytics. We present the GeoSocialApp which implements traditional network analysis methods in the context of explicitly spatial and social representations. We then apply it to an exploration of international trade networks in terms of the complex interactions between spatial and social relationships. This exploration using the GeoSocialApp helps us develop a two-part hypothesis: international trade network clusters with structural equivalence are strongly 'balkanized' (fragmented) according to the geography of trading partners, and the geographical distance weighted by population within each network cluster has a positive relationship with the development level of countries. In addition to demonstrating the potential of visual analytics to provide insight concerning complex geo-social relationships at a global scale, the research also addresses the challenge of validating insights derived through interactive geovisual analytics. We develop two indicators to quantify the observed patterns, and then use a Monte-Carlo approach to support the hypothesis developed above.
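A sketch of the Monte-Carlo validation step described above (the indicator and all data are placeholders, not from the paper): compare an observed pattern statistic against its null distribution under random reassignment of countries to network clusters.

```python
# Permutation-style Monte Carlo test: is the observed indicator value
# extreme relative to random cluster assignments?
import numpy as np

rng = np.random.default_rng(42)

def indicator(values, labels):
    # Placeholder statistic: mean within-cluster spread of `values`.
    return np.mean([values[labels == k].std() for k in np.unique(labels)])

values = rng.normal(size=200)             # e.g., weighted distances per country
labels = rng.integers(0, 5, size=200)     # observed cluster memberships
observed = indicator(values, labels)

null = np.array([indicator(values, rng.permutation(labels))
                 for _ in range(9999)])
p_value = (1 + np.sum(null <= observed)) / (1 + len(null))
print(f"observed={observed:.3f}, p={p_value:.4f}")
```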
NASA Astrophysics Data System (ADS)
Stewart, R.; Piburn, J.; Sorokine, A.; Myers, A.; Moehl, J.; White, D.
2015-07-01
The application of spatiotemporal (ST) analytics to integrated data from major sources such as the World Bank, United Nations, and dozens of others holds tremendous potential for shedding new light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. Realizing this potential first requires an ST data model that addresses challenges in properly merging data from multiple authors, with evolving ontological perspectives, semantical differences, and changing attributes, as well as content that is textual, numeric, categorical, and hierarchical. Equally challenging is the development of analytical and visualization approaches that provide a serious exploration of this integrated data while remaining accessible to practitioners with varied backgrounds. The WSTAMP project at Oak Ridge National Laboratory has yielded two major results in addressing these challenges: 1) development of the WSTAMP database, a significant advance in ST data modeling that integrates 10,000+ attributes covering over 200 nation states spanning over 50 years from over 30 major sources and 2) a novel online ST exploratory and analysis tool providing an array of modern statistical and visualization techniques for analyzing these data temporally, spatially, and spatiotemporally under a standard analytic workflow. We discuss the status of this work and report on major findings.
Using predictive analytics and big data to optimize pharmaceutical outcomes.
Hernandez, Inmaculada; Zhang, Yuting
2017-09-15
The steps involved, the resources needed, and the challenges associated with applying predictive analytics in healthcare are described, with a review of successful applications of predictive analytics in implementing population health management interventions that target medication-related patient outcomes. In healthcare, the term big data typically refers to large quantities of electronic health record, administrative claims, and clinical trial data as well as data collected from smartphone applications, wearable devices, social media, and personal genomics services; predictive analytics refers to innovative methods of analysis developed to overcome challenges associated with big data, including a variety of statistical techniques ranging from predictive modeling to machine learning to data mining. Predictive analytics using big data have been applied successfully in several areas of medication management, such as in the identification of complex patients or those at highest risk for medication noncompliance or adverse effects. Because predictive analytics can be used in predicting different outcomes, they can provide pharmacists with a better understanding of the risks for specific medication-related problems that each patient faces. This information will enable pharmacists to deliver interventions tailored to patients' needs. In order to take full advantage of these benefits, however, clinicians will have to understand the basics of big data and predictive analytics. Predictive analytics that leverage big data will become an indispensable tool for clinicians in mapping interventions and improving patient outcomes. Copyright © 2017 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
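As a purely illustrative sketch of the kind of workflow described (synthetic data and invented coefficients, not a validated clinical model), the following fits a logistic-regression risk score for medication noncompliance:

```python
# Score patients for noncompliance risk from claims-style features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 5000
X = np.column_stack([
    rng.integers(0, 15, n),      # number of active prescriptions
    rng.integers(18, 90, n),     # age
    rng.integers(0, 2, n),       # prior gap in refills (0/1)
])
logit = -4 + 0.15 * X[:, 0] + 0.01 * X[:, 1] + 1.2 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))  # simulated noncompliance

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
risk = model.predict_proba(X_te)[:, 1]        # per-patient risk score
print("AUC:", round(roc_auc_score(y_te, risk), 3))
```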
Extending Climate Analytics-as-a-Service to the Earth System Grid Federation
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.
2015-12-01
We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.
Studabaker, William B; Puckett, Keith J; Percy, Kevin E; Landis, Matthew S
2017-04-07
Development of the Athabasca Oil Sands Region in northeastern Alberta, Canada has contributed polycyclic aromatic hydrocarbons (PAHs) and polycyclic aromatic compounds (PACs), which include alkyl PAHs and dibenzothiophenes, to the regional environment. A new analytical method was developed for quantification of PAHs and PACs in the epiphytic lichen bioindicator species Hypogymnia physodes for use in the development of receptor models for attribution of PAH and PAC concentrations to anthropogenic and natural emission sources. Milled lichens were extracted with cyclohexane, and extracts were cleaned on silica gel using automated solid phase extraction techniques. Quantitative analysis was performed by gas chromatography with selected ion monitoring (GC-SIM-MS) for PAHs, and by GC with time-of-flight mass spectrometry (GC-TOF-MS) for PACs. PACs were quantitated in groups using representative reference compounds as calibration standards. Analytical detection limits were ≤2.5 ng g⁻¹ for all individual compounds. Precision as measured by laboratory duplicates was variable; for individual analytes above 5 ng g⁻¹ the mean absolute difference between duplicates was typically <20%. Selection of single-analyte markers for source attribution should include consideration of data quality indicators. Use of TOF-MS to spectrally characterize PAC group constituents identified significant challenges for the accurate quantitation of PACs with more than two carbons in their side chain(s). Total PAH concentrations in lichen samples ranged from 12 to 482 ng g⁻¹. Total PACs in each sample varied from a fraction of total PAHs to more than four times total PAHs. Results of our analyses of H. physodes are compared with other studies using other species of lichens as PAH receptors and with passive monitoring data using polyurethane foam (PUF) samplers in the Athabasca Oil Sands Region (AOSR). This study presents the first analytical methodology developed for the determination of PACs in an epiphytic lichen bioindicator species. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
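A hedged sketch of the quantitation arithmetic behind such calibration-standard methods (all values invented; the authors' actual calibration design is not reproduced here):

```python
# Linear calibration from reference-standard responses, then back-
# calculation of a lichen sample concentration in ng per g.
import numpy as np

cal_conc = np.array([5, 25, 50, 100, 250])        # standard conc., ng/mL
cal_resp = np.array([0.9, 4.8, 9.6, 19.5, 49.0])  # peak area ratios vs internal standard

slope, intercept = np.polyfit(cal_conc, cal_resp, 1)

sample_resp = 7.2                    # measured analyte/ISTD area ratio
extract_ml, mass_g = 1.0, 0.25       # extract volume and lichen mass
conc_ng_ml = (sample_resp - intercept) / slope
print(f"{conc_ng_ml * extract_ml / mass_g:.1f} ng/g")  # falls within the 12-482 ng/g range
```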
Machine learning challenges in Mars rover traverse science
NASA Technical Reports Server (NTRS)
Castano, R.; Judd, M.; Anderson, R. C.; Estlin, T.
2003-01-01
The successful implementation of machine learning in autonomous rover traverse science requires addressing challenges that range from the analytical and technical realm to the fuzzy, philosophical domain of entrenched belief systems among scientists and mission managers.
Tulabandhula, Theja; Rudin, Cynthia
2014-06-01
Our goal is to design a prediction and decision system for real-time use during a professional car race. In designing a knowledge discovery process for racing, we faced several challenges that were overcome only when domain knowledge of racing was carefully infused within statistical modeling techniques. In this article, we describe how we leveraged expert knowledge of the domain to produce a real-time decision system for tire changes within a race. Our forecasts have the potential to impact how racing teams can optimize strategy by making tire-change decisions to benefit their rank position. Our work significantly expands previous research on sports analytics, as it is the only work on analytical methods for within-race prediction and decision making for professional car racing.
76 FR 55804 - Dicamba; Pesticide Tolerances
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-09
... Considerations A. Analytical Enforcement Methodology Adequate enforcement methodologies, Methods I and II--gas chromatography with electron capture detection (GC/ECD), are available to enforce the tolerance expression. The...
Technosocial Predictive Analytics in Support of Naturalistic Decision Making
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sanfilippo, Antonio P.; Cowell, Andrew J.; Malone, Elizabeth L.
2009-06-23
A main challenge we face in fostering sustainable growth is to anticipate outcomes through predictive and proactive analysis across domains as diverse as energy, security, the environment, health and finance in order to maximize opportunities, influence outcomes and counter adversities. The goal of this paper is to present new methods for anticipatory analytical thinking which address this challenge through the development of a multi-perspective approach to predictive modeling as the core of a creative decision-making process. This approach is uniquely multidisciplinary in that it strives to create decision advantage through the integration of human and physical models, and leverages knowledge management and visual analytics to support creative thinking by facilitating interoperable knowledge inputs and enhancing the user's cognitive access. We describe a prototype system which implements this approach and exemplify its functionality with reference to a use case in which predictive modeling is paired with analytic gaming to support collaborative decision-making in the domain of agricultural land management.
Electrochemical Enzyme Biosensors Revisited: Old Solutions for New Problems.
Monteiro, Tiago; Almeida, Maria Gabriela
2018-05-14
Worldwide legislation is driving the development of novel and highly efficient analytical tools for assessing the composition of every material that interacts with consumers or nature. Biosensor technology is one of the most active R&D domains of the analytical sciences, focused on the challenge of taking analytical chemistry to the field. Electrochemical biosensors based on redox enzymes, in particular, are highly appealing due to their typically quick response, high selectivity and sensitivity, low cost and portable dimensions. This review paper aims to provide an overview of the most important advances made in the field since the proposal of the first biosensor, the well-known hand-held glucose meter. The first section addresses the current needs and challenges for novel analytical tools, followed by a brief description of the different components and configurations of biosensing devices and the fundamentals of enzyme kinetics and amperometry. The following sections focus on enzyme-based amperometric biosensors and the different stages of their development.
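As a small worked illustration of the enzyme kinetics underlying amperometric biosensors (parameter values are illustrative, not from the review): the steady-state current follows a Michaelis-Menten form, I = I_max * C / (K_M + C).

```python
# Steady-state amperometric response of an enzyme electrode.
import numpy as np

def biosensor_current(conc_mM, i_max_uA=12.0, k_m_mM=8.0):
    """Michaelis-Menten current vs substrate concentration (illustrative)."""
    return i_max_uA * conc_mM / (k_m_mM + conc_mM)

glucose = np.array([1, 2, 5, 10, 20])   # mM, roughly physiological range
currents = biosensor_current(glucose)
# Response is near-linear well below K_M and saturates above it:
print(np.round(currents, 2))
```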
IBM's Health Analytics and Clinical Decision Support.
Kohn, M S; Sun, J; Knoop, S; Shabo, A; Carmeli, B; Sow, D; Syed-Mahmood, T; Rapp, W
2014-08-15
This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described, and examples of the role of each resource are given. There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources themselves do not drive, but rather support, healthcare transformation.
NASA Astrophysics Data System (ADS)
Winter, S.; Schmitz, F.; Clausmeyer, T.; Tekkaya, A. E.; F-X Wagner, M.
2017-03-01
In the automotive industry, advanced high strength steels (AHSS) are widely used for sheet components to reduce weight, although this leads to several challenges. The demand for high-quality shear-cutting surfaces that do not require reworking can be fulfilled by adiabatic shear cutting: high strain rates and local temperatures lead to the formation of adiabatic shear bands (ASB). While this process is well suited to produce AHSS parts with excellent cutting surface quality, a fundamental understanding of the process is still missing today. In this study, compression tests in a Split-Hopkinson pressure bar with an initial strain rate of 1000 s⁻¹ were performed in a temperature range between 200 °C and 1000 °C. The experimental results show that high strength steels with nearly the same mechanical properties at room temperature may behave considerably differently at higher temperatures. The resulting microstructures after testing at different temperatures were analyzed by optical microscopy. The thermo-mechanical material behavior was then considered in an analytical model: to predict the local temperature increase that occurs during the adiabatic blanking process, experimentally determined flow curves were used. Furthermore, the influence of temperature evolution with respect to phase transformation is discussed. This study contributes to a more complete understanding of the relevant microstructural and thermo-mechanical mechanisms leading to the evolution of ASBs during cutting of AHSS.
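A hedged sketch of the standard adiabatic-heating estimate that such models build on (the coefficient, flow curve, and material constants below are generic assumptions, not the authors' data): the temperature rise follows from the plastic work via the Taylor-Quinney coefficient, dT = (beta / (rho * c_p)) * integral of sigma d(epsilon).

```python
# Adiabatic temperature rise from an experimentally determined flow curve.
import numpy as np

beta = 0.9        # Taylor-Quinney coefficient (assumed)
rho = 7850.0      # steel density, kg/m^3
c_p = 460.0       # specific heat, J/(kg K)

eps = np.linspace(0.0, 0.5, 200)          # plastic strain
sigma = 800e6 + 400e6 * eps**0.2          # illustrative flow curve, Pa

delta_T = beta * np.trapz(sigma, eps) / (rho * c_p)
print(f"adiabatic temperature rise: {delta_T:.0f} K")
```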
Dynamic Power Distribution System Management With a Locally Connected Communication Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dall-Anese, Emiliano; Zhang, Kaiqing; Basar, Tamer
Coordinated optimization and control of distribution-level assets can enable reliable and optimal integration of massive amounts of distributed energy resources (DERs) and facilitate distribution system management (DSM). Accordingly, the objective is to coordinate the power injection at the DERs to maintain certain quantities across the network, e.g., voltage magnitude, line flows, or line losses, close to a desired profile. By and large, the performance of DSM algorithms has been challenged by two factors: i) the possibly non-strongly connected communication network over DERs that hinders coordination; and ii) the dynamics of the real system caused by DERs with heterogeneous capabilities, time-varying operating conditions, and real-time measurement mismatches. In this paper, we investigate the modeling, algorithm design, and analysis with consideration of these two factors. In particular, a game-theoretic characterization is first proposed to account for a locally connected communication network over DERs, along with analysis of the existence and uniqueness of the Nash equilibrium (NE) therein. To achieve the equilibrium in a distributed fashion, a projected-gradient-based asynchronous DSM algorithm is then advocated. The algorithm's performance, including convergence speed and tracking error, is analytically guaranteed under the dynamic setting. Extensive numerical tests on both synthetic and realistic cases corroborate the analytical results.
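A conceptual sketch of a projected-gradient update of the kind advocated (the paper's game-theoretic model, asynchrony, and constraints are richer; the objective and names here are ours): each DER nudges its injection down the local gradient and then projects onto its capability box.

```python
# Synchronous projected-gradient iteration on a toy tracking objective.
import numpy as np

def projected_gradient_step(p, grad, p_min, p_max, step=0.05):
    """One update; the paper analyzes an asynchronous variant."""
    return np.clip(p - step * grad(p), p_min, p_max)

# Toy objective: track a desired aggregate injection of 1.0 p.u.
grad = lambda p: 2 * (p.sum() - 1.0) * np.ones_like(p)
p = np.zeros(4)                          # four DERs
p_min, p_max = np.zeros(4), 0.4 * np.ones(4)
for _ in range(100):
    p = projected_gradient_step(p, grad, p_min, p_max)
print(p, p.sum())   # converges toward a feasible profile summing to 1.0
```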
Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.
Liao, Wen-Hwa; Qiu, Wan-Li
2016-01-01
Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
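To make the method concrete, here is a minimal, hedged illustration of the AHP computation (the criteria and pairwise judgments are invented for the example): priority weights are the normalized principal eigenvector of the pairwise-comparison matrix, checked with Saaty's consistency ratio.

```python
# AHP priority weights from a pairwise-comparison matrix.
import numpy as np

# Judgments on Saaty's 1-9 scale for three example criteria:
# cost effectiveness, software design, system architecture.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                              # priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)      # consistency index
cr = ci / 0.58                            # random index RI = 0.58 for n = 3
print(np.round(w, 3), f"CR={cr:.3f}")     # CR < 0.1 => judgments acceptable
```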
Podevin, Michael; Fotidis, Ioannis A; Angelidaki, Irini
2018-08-01
Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and to mitigate CO2 emissions. However, due to economic challenges, microalgae bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will be more critical to maintain product quality and consistency for downstream processing in a biorefinery and to maintain and valorize these markets. The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT) with high selectivity - the capability of monitoring several analytes simultaneously - in the interest of improving product quality, productivity, and process automation of a microalgal biorefinery. The high-selectivity PAT under consideration are mid-infrared (MIR), near-infrared (NIR), and Raman vibrational spectroscopies. The current review contains a critical assessment of these technologies in the context of recent advances in software and hardware in order to move microalgae production towards process automation through multivariate process control (MVPC) and software sensors trained on "big data". The paper also includes a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation to aid and motivate development.
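A sketch of the chemometric core behind such PAT soft sensors (synthetic spectra and an invented lipid band, not real microalgae data): partial least squares regression maps vibrational spectra to an analyte concentration.

```python
# PLS soft sensor: predict lipid content from simulated IR spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(7)
wavenumbers = np.linspace(800, 1800, 300)
lipid = rng.uniform(5, 40, 60)                       # % dry weight, 60 samples
peak = np.exp(-((wavenumbers - 1740) / 15) ** 2)     # ester carbonyl band (assumed)
X = lipid[:, None] * peak[None, :] + rng.normal(0, 0.3, (60, 300))

pls = PLSRegression(n_components=3).fit(X[:40], lipid[:40])
pred = pls.predict(X[40:]).ravel()
rmse = float(np.sqrt(np.mean((pred - lipid[40:]) ** 2)))
print(f"RMSEP ~ {rmse:.2f} % dry weight")
```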
Pansare, Swapnil K; Patel, Sajal Manubhai
2016-08-01
Glass transition temperature is a unique thermal characteristic of amorphous systems and is associated with changes in physical properties such as heat capacity, viscosity, electrical resistance, and molecular mobility. The glass transition temperature of amorphous solids is referred to as Tg, whereas for a maximally freeze-concentrated solution the notation is Tg′. This article is focused on the factors affecting determination of Tg′ for application to lyophilization process design and frozen storage stability. This review also provides a perspective on the use of various types of solutes in protein formulations and their effect on Tg′. Although various analytical techniques are used for determination of Tg′ based on the changes in physical properties associated with the glass transition, differential scanning calorimetry (DSC) is the most commonly used technique. In this article, an overview of the DSC technique is provided along with a brief discussion of alternative analytical techniques for Tg′ determination. Additionally, challenges associated with Tg′ determination using DSC for protein formulations are discussed. The purpose of this review is to provide a practical industry perspective on determination of Tg′ for protein formulations as it relates to the design and development of lyophilization processes and/or frozen storage; a comprehensive review of glass transition temperatures (Tg, Tg′) in general is outside the scope of this work.
Collection of biological samples in forensic toxicology.
Dinis-Oliveira, R J; Carvalho, F; Duarte, J A; Remião, F; Marques, A; Santos, A; Magalhães, T
2010-09-01
Forensic toxicology is the study and practice of the application of toxicology to the purposes of the law. The relevance of any finding is determined, in the first instance, by the nature and integrity of the specimen(s) submitted for analysis. This means that there are several specific challenges in selecting and collecting specimens for ante-mortem and post-mortem toxicology investigation. Post-mortem specimens may be numerous and can present special difficulties compared to clinical specimens, namely those resulting from autolytic and putrefactive changes. Storage stability is also an important issue to be considered during the pre-analytic phase, since its consideration should facilitate the assessment of sample quality and the analytical result obtained from that sample. Knowledge of degradation mechanisms and methods to increase storage stability may enable the forensic toxicologist to circumvent possible difficulties. Therefore, advantages and limitations of specimen preservation procedures are thoroughly discussed in this review. Presently, harmonized protocols for sampling in suspected intoxications would have obvious utility. In the present article an overview is given of sampling procedures for routinely collected specimens as well as of alternative specimens that may provide additional information on the route and timing of exposure to a specific xenobiotic. Last, but not least, a discussion of possible bias that can influence the interpretation of toxicological results is provided. This comprehensive review article is intended as a significant help for forensic toxicologists to accomplish their frequently overwhelming mission.
Perspective: Randomized Controlled Trials Are Not a Panacea for Diet-Related Research
Hébert, James R; Frongillo, Edward A; Adams, Swann A; Turner-McGrievy, Gabrielle M; Hurley, Thomas G; Miller, Donald R; Ockene, Ira S
2016-01-01
Research into the role of diet in health faces a number of methodologic challenges in the choice of study design, measurement methods, and analytic options. Heavier reliance on randomized controlled trial (RCT) designs is suggested as a way to solve these challenges. We present and discuss 7 inherent and practical considerations with special relevance to RCTs designed to study diet: 1) the need for narrow focus; 2) the choice of subjects and exposures; 3) blinding of the intervention; 4) perceived asymmetry of treatment in relation to need; 5) temporal relations between dietary exposures and putative outcomes; 6) strict adherence to the intervention protocol, despite potential clinical counter-indications; and 7) the need to maintain methodologic rigor, including measuring diet carefully and frequently. Alternatives, including observational studies and adaptive intervention designs, are presented and discussed. Given high noise-to-signal ratios interjected by using inaccurate assessment methods in studies with weak or inappropriate study designs (including RCTs), it is conceivable and indeed likely that effects of diet are underestimated. No matter which designs are used, studies will require continued improvement in the assessment of dietary intake. As technology continues to improve, there is potential for enhanced accuracy and reduced user burden of dietary assessments that are applicable to a wide variety of study designs, including RCTs. PMID:27184269
Morota, Gota; Ventura, Ricardo V; Silva, Fabyano F; Koyama, Masanori; Fernando, Samodha C
2018-04-14
Precision animal agriculture is poised to rise to prominence in the livestock enterprise in the domains of management, production, welfare, sustainability, health surveillance, and environmental footprint. Considerable progress has been made in the use of tools to routinely monitor and collect information from animals and farms in a less laborious manner than before. These efforts have enabled the animal sciences to embark on information technology-driven discoveries to improve animal agriculture. However, the growing amount and complexity of data generated by fully automated, high-throughput data recording or phenotyping platforms, including digital images, sensor and sound data, unmanned systems, and information obtained from real-time noninvasive computer vision, pose challenges to the successful implementation of precision animal agriculture. The emerging fields of machine learning and data mining are expected to be instrumental in helping meet the daunting challenges facing global agriculture. Yet, their impact and potential in "big data" analysis have not been adequately appreciated in the animal science community, where this recognition has remained only fragmentary. To address such knowledge gaps, this article outlines a framework for machine learning and data mining and offers a glimpse into how they can be applied to solve pressing problems in animal sciences.
Electronic nose for detecting multiple targets
NASA Astrophysics Data System (ADS)
Chakraborty, Anirban; Parthasarathi, Ganga; Poddar, Rakesh; Zhao, Weiqiang; Luo, Cheng
2006-05-01
The discovery of high conductivity in doped polyacetylene in 1977 (garnering the 2000 Nobel Prize in Chemistry for the three discovering scientists) has attracted considerable interest in the application of polymers as the semiconducting and conducting materials due to their promising potential to replace silicon and metals in building devices. Previous and current efforts in developing conducting polymer microsystems mainly focus on generating a device of a single function. When multiple micropatterns made of different conducting polymers are produced on the same substrate, many microsystems of multiple functions can be envisioned. For example, analogous to the mammalian olfactory system which includes over 1,000 receptor genes in detecting various odors (e.g., beer, soda etc.), a sensor consisting of multiple distinct conducting polymer sensing elements will be capable of detecting a number of analytes simultaneously. However, existing techniques present significant technical challenges of degradation, low throughput, low resolution, depth of field, and/or residual layer in producing conducting polymer microstructures. To circumvent these challenges, an intermediate-layer lithography method developed in our group is used to generate multiple micropatterns made of different, commonly used conducting polymers, Polypyrrole (PPy), Poly(3,4-ethylenedioxy)thiophene (PEDOT) and Polyaniline (PANI). The generated multiple micropatterns are further used in an "electronic nose" to detect water vapor, glucose, toluene and acetone.
Alternative approaches to analytical designs in occupational injury epidemiology.
Mittleman, M A; Maldonado, G; Gerberich, S G; Smith, G S; Sorock, G S
1997-08-01
In this paper, we discuss the theoretical framework upon which observational studies of occupational injuries are based. Following a general description of how causal effects are estimated, the challenges faced by researchers working in this area are outlined, with an emphasis on case-control studies. These challenges include defining the at-risk period for workers whose tasks change over time and whose hazard period may be very brief, evaluating the underreporting of both exposures and injuries, and considering the effects of multiple injuries per individual on study design and data analysis. We review both the theoretical and practical considerations in the design and conduct of traditional case-control studies, based on the collection of individual level data, as well as other approaches, such as using information culled from administrative and descriptive databases, and case-control studies in which the plant or work site is the unit of analysis. The case-crossover design is also reviewed and its utility for reducing confounding due to differences between individuals by self-matching is highlighted. While this design has not yet been applied to the work setting, its potential for increasing our understanding of the causes of acute-onset occupational injuries seems promising. Finally, a variety of hybrid designs are discussed, including combinations of case-control, case-crossover, and cohort designs.
Perrault, Katelynn A; Stefanuto, Pierre-Hugues; Stuart, Barbara H; Rai, Tapan; Focant, Jean-François; Forbes, Shari L
2015-01-01
Challenges in decomposition odour profiling have led to variation in the documented odour profile by different research groups worldwide. Background subtraction and use of controls are important considerations given the variation introduced by decomposition studies conducted in different geographical environments. The collection of volatile organic compounds (VOCs) from soil beneath decomposing remains is challenging due to the high levels of inherent soil VOCs, further confounded by the use of highly sensitive instrumentation. This study presents a method that provides suitable chromatographic resolution for profiling decomposition odour in soil by comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry using appropriate controls and field blanks. Logarithmic transformation and t-testing of compounds permitted the generation of a compound list of decomposition VOCs in soil. Principal component analysis demonstrated the improved discrimination between experimental and control soil, verifying the value of the data handling method. Data handling procedures have not been well documented in this field and standardisation would thereby reduce misidentification of VOCs present in the surrounding environment as decomposition byproducts. Uniformity of data handling and instrumental procedures will reduce analytical variation, increasing confidence in the future when investigating the effect of taphonomic variables on the decomposition VOC profile. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
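A hedged sketch of the data-handling steps named above, on a synthetic peak table rather than the authors' data: log-transform peak areas, t-test experimental versus control soil per compound, keep significant VOCs, then inspect group separation by PCA.

```python
# Log transform + per-compound t-tests + PCA on a synthetic VOC table.
import numpy as np
from scipy import stats
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n_comp = 50
grave = rng.lognormal(3.0, 0.5, (10, n_comp))     # experimental soil samples
grave[:, :8] *= 6.0                               # first 8 VOCs elevated by decomposition
control = rng.lognormal(3.0, 0.5, (10, n_comp))   # control soil samples

logg, logc = np.log10(grave), np.log10(control)
t, p = stats.ttest_ind(logg, logc, axis=0)
decomp_vocs = np.where((p < 0.05) & (t > 0))[0]   # candidate decomposition VOC list

scores = PCA(n_components=2).fit_transform(np.vstack([logg, logc]))
print(decomp_vocs, scores.shape)                  # experimental vs control separate on PC1
```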
Wu, Chunsheng; Liu, Gaohuan; Huang, Chong; Liu, Qingsheng; Guan, Xudong
2018-04-25
The Yellow River Delta (YRD), located in the Yellow River estuary, is characterized by rich ecological system types and provides habitats or migration stations for wild birds, all of which makes the delta an ecological barrier or ecotone for inland areas. Nevertheless, the abundant natural resources of the YRD have brought huge challenges to the area: frequent human activities and natural disasters have seriously damaged the ecological systems, and certain ecological functions have been threatened. Therefore, it is necessary to determine the status of the ecological environment based on scientific methods, which can provide scientifically robust data for managers or stakeholders to adopt timely ecological protection measures. The aim of this study was to obtain the spatial distribution of ecological vulnerability (EV) in the YRD based on 21 indicators selected from groundwater status, soil condition, land use, landform, vegetation cover, meteorological conditions, ocean influence, and social economy. The fuzzy analytic hierarchy process (FAHP) method was used to obtain the weights of the selected indicators, and a fuzzy logic model was constructed to obtain the result. The result showed that the spatial distribution of the EV grades was regular: the fuzzy membership of EV decreased gradually from the coastline to the inland area, especially around the river crossing, which had the lowest EV. Along the coastline, the dikes had an obvious protective effect on the inner area, while the EV was higher where no dikes had been built. The result also showed that soil condition and groundwater status were highly related to the EV spatially, with correlation coefficients of −0.55 and −0.74 respectively, and that human activities had exerted considerable pressure on the ecological environment.
Le, Laetitia Minh Maï; Kégl, Balázs; Gramfort, Alexandre; Marini, Camille; Nguyen, David; Cherti, Mehdi; Tfaili, Sana; Tfayli, Ali; Baillet-Guffroy, Arlette; Prognon, Patrice; Chaminade, Pierre; Caudron, Eric
2018-07-01
The use of monoclonal antibodies (mAbs) constitutes one of the most important strategies to treat patients suffering from cancers such as hematological malignancies and solid tumors. These antibodies are prescribed by the physician and prepared by hospital pharmacists. Analytical control enables the quality of the preparations to be ensured. The aim of this study was to explore the development of a rapid analytical method for quality control. The method used four mAbs (Infliximab, Bevacizumab, Rituximab and Ramucirumab) at various concentrations and was based on recording Raman data and coupling them to a traditional chemometric and machine learning approach for data analysis. Compared to a conventional linear approach, prediction errors are reduced with a data-driven approach using statistical machine learning methods, in which preprocessing and predictive models are jointly optimized. An additional original aspect of the work involved submitting the problem to a collaborative data challenge platform called Rapid Analytics and Model Prototyping (RAMP), which allowed solutions from about 300 data scientists to be used in collaborative work. Using machine learning, the prediction of the four mAb samples was considerably improved. The best predictive model showed a combined error of 2.4% versus 14.6% using the linear approach. The concentration and classification errors were 5.8% and 0.7%; only three spectra were misclassified out of the 429 spectra of the test set. This large improvement obtained with machine learning techniques was uniform across all molecules but maximal for Bevacizumab, with an 88.3% reduction in combined error (2.1% versus 17.9%). Copyright © 2018 Elsevier B.V. All rights reserved.
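A rough sketch of the kind of jointly optimized pipeline such a challenge explores (synthetic spectra and generic models; the RAMP solutions themselves were far more elaborate): classify which mAb is present, then regress its concentration from the Raman spectrum.

```python
# Two-stage spectral pipeline: identity classification + concentration regression.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(5)
n, n_channels = 400, 200
mab_id = rng.integers(0, 4, n)                    # four antibodies
conc = rng.uniform(1, 25, n)                      # mg/mL
signatures = rng.normal(size=(4, n_channels))     # one spectral signature per mAb (assumed)
X = conc[:, None] * signatures[mab_id] + rng.normal(0, 2.0, (n, n_channels))

clf = make_pipeline(StandardScaler(), RandomForestClassifier(random_state=0))
reg = make_pipeline(StandardScaler(), RandomForestRegressor(random_state=0))
clf.fit(X[:300], mab_id[:300])
reg.fit(X[:300], conc[:300])
print("classification acc:", clf.score(X[300:], mab_id[300:]))
print("concentration R^2 :", reg.score(X[300:], conc[300:]))
```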
Watts, R R; Langone, J J; Knight, G J; Lewtas, J
1990-01-01
A two-day technical workshop was convened November 10-11, 1986, to discuss analytical approaches for determining trace amounts of cotinine in human body fluids resulting from passive exposure to environmental tobacco smoke (ETS). The workshop, jointly sponsored by the U.S. Environmental Protection Agency and Centers for Disease Control, was attended by scientists with expertise in cotinine analytical methodology and/or conduct of human monitoring studies related to ETS. The workshop format included technical presentations, separate panel discussions on chromatography and immunoassay analytical approaches, and group discussions related to the quality assurance/quality control aspects of future monitoring programs. This report presents a consensus of opinion on general issues before the workshop panel participants and also a detailed comparison of several analytical approaches being used by the various represented laboratories. The salient features of the chromatography and immunoassay analytical methods are discussed separately. PMID:2190812
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scholtz, Jean
A new field of research, visual analytics, has recently been introduced. It has been defined as "the science of analytical reasoning facilitated by visual interfaces." Visual analytic environments, therefore, support analytical reasoning using visual representations and interactions, with data representations and transformation capabilities, to support production, presentation and dissemination. As researchers begin to develop visual analytic environments, it will be advantageous to develop metrics and methodologies to help researchers measure the progress of their work and understand the impact their work will have on the users who will work in such environments. This paper presents five areas or aspects of visual analytic environments that should be considered as metrics and methodologies for evaluation are developed. Evaluation aspects need to include usability, but it is necessary to go beyond basic usability. The areas of situation awareness, collaboration, interaction, creativity, and utility are proposed as areas for initial consideration. The steps that need to be undertaken to develop systematic evaluation methodologies and metrics for visual analytic environments are outlined.
Progress and challenges associated with halal authentication of consumer packaged goods.
Premanandh, Jagadeesan; Bin Salem, Samara
2017-11-01
Abusive business practices are increasingly evident in consumer packaged goods. Although consumers have the right to protect themselves against such practices, rapid urbanization and industrialization result in greater distances between producers and consumers, raising serious concerns about the supply chain. The operational complexities surrounding halal authentication pose serious challenges to the integrity of consumer packaged goods. This article attempts to address the progress and challenges associated with halal authentication. Advances in, and concerns about, the application of new, rapid analytical methods for halal authentication are discussed. The significance of a zero-tolerance policy in consumer packaged foods and its impact on analytical testing are presented. The role of halal assurance systems and their challenges are also considered. In conclusion, consensus on the establishment of one standard approach, coupled with a sound traceability system and constant monitoring, would certainly improve and ensure the halal status of consumer packaged goods. © 2017 Society of Chemical Industry.
ERIC Educational Resources Information Center
Papay, John P.; Bacher-Hicks, Andrew; Page, Lindsay C.; Marinell, William H.
2017-01-01
Substantial teacher turnover poses a challenge to staffing public schools with effective teachers. The scope of the teacher retention challenge across school districts, however, remains poorly defined. Applying consistent data practices and analytical techniques to administrative data sets from 16 urban districts, we document substantial…
A big data approach for climate change indicators processing in the CLIP-C project
NASA Astrophysics Data System (ADS)
D'Anca, Alessandro; Conte, Laura; Palazzo, Cosimo; Fiore, Sandro; Aloisio, Giovanni
2016-04-01
Defining and implementing processing chains with multiple (e.g. tens or hundreds of) data analytics operators can be a real challenge in many practical scientific use cases, such as climate change indicators. This is usually done via scripts (e.g. bash) on the client side and requires climate scientists to take care of, implement, and replicate workflow-like control logic aspects (which may be error-prone too) in their scripts, along with the expected application-level part. Moreover, the big amount of data and the strong I/O demand pose additional performance challenges. In this regard, production-level tools for climate data analysis are mostly sequential, and there is a lack of big data analytics solutions implementing fine-grain data parallelism or adopting stronger parallel I/O strategies, data locality, workflow optimization, etc. High-level solutions leveraging workflow-enabled big data analytics frameworks for eScience could help scientists define and implement the workflows related to their experiments by exploiting a more declarative, efficient and powerful approach. This talk will start by introducing the main needs and challenges regarding big data analytics workflow management for eScience, and will then provide some insights about the implementation of some real use cases related to climate change indicators on large datasets produced in the context of the CLIP-C project, an EU FP7 project aiming at providing access to climate information of direct relevance to a wide variety of users, from scientists to policy makers and private sector decision makers. All the proposed use cases have been implemented exploiting the Ophidia big data analytics framework. The software stack includes an internal workflow management system, which coordinates, orchestrates, and optimises the execution of multiple scientific data analytics and visualization tasks. Real-time workflow execution monitoring is also supported through a graphical user interface. In order to address the challenges of the use cases, the implemented data analytics workflows include parallel data analysis, metadata management, virtual file system tasks, map generation, rolling of datasets, and import/export of datasets in NetCDF format. The use cases have been implemented on 8 nodes (16 cores/node) of the Athena cluster available at the CMCC Supercomputing Centre. Benchmark results will also be presented during the talk.
On the theory of evolution of particulate systems
NASA Astrophysics Data System (ADS)
Buyevich, Yuri A.; Alexandrov, Dmitri V.
2017-04-01
An analytical method for the description of particulate systems at sufficiently long times is developed. This method allows us to obtain very simple analytical expressions for the particle distribution function. The method under consideration can be applied to a number of practically important problems including evaporation of a polydisperse mist, dissolution of dispersed solids, combustion of dispersed propellants, physical and chemical transformation of powders and phase transitions in metastable materials.
How do challenges increase customer loyalty to online games?
Teng, Ching-I
2013-12-01
Despite the design of various challenge levels in online games, exactly how these challenges increase customer loyalty to online games has seldom been examined. This study investigates how such challenges increase customer loyalty to online games. The study sample comprises 2,861 online gamers. Structural equation modeling is performed. Analytical results indicate that the relationship between challenge and loyalty intensifies when customers perceive that overcoming challenges takes a long time. Results of this study contribute to efforts to determine how challenges and challenge-related perceptions impact customer loyalty to online games.
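The reported moderation (the challenge-loyalty relationship strengthening with the perceived time needed to overcome challenges) corresponds statistically to a positive interaction term. The study itself uses structural equation modeling; the sketch below shows only the simpler regression analogue on synthetic data, and all variable names are assumptions.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 500
    challenge = rng.normal(size=n)
    time_perceived = rng.normal(size=n)
    # Loyalty rises with challenge, more steeply when overcoming challenges is
    # perceived to take longer (positive interaction), mirroring the finding.
    loyalty = (0.3 * challenge + 0.1 * time_perceived
               + 0.2 * challenge * time_perceived
               + rng.normal(scale=0.5, size=n))
    df = pd.DataFrame({"loyalty": loyalty, "challenge": challenge,
                       "time_perceived": time_perceived})

    fit = smf.ols("loyalty ~ challenge * time_perceived", data=df).fit()
    print(fit.params)  # a positive challenge:time_perceived term = moderation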
Vandekerckhove, Kristof; Seidl, Andreas; Gutka, Hiten; Kumar, Manish; Gratzl, Gyöngyi; Keire, David; Coffey, Todd; Kuehne, Henriette
2018-05-10
Leading regulatory agencies recommend biosimilar assessment to proceed in a stepwise fashion, starting with a detailed analytical comparison of the structural and functional properties of the proposed biosimilar and reference product. The degree of analytical similarity determines the degree of residual uncertainty that must be addressed through downstream in vivo studies. Substantive evidence of similarity from comprehensive analytical testing may justify a targeted clinical development plan, and thus enable a shorter path to licensing. The importance of a careful design of the analytical similarity study program therefore should not be underestimated. Designing a state-of-the-art analytical similarity study meeting current regulatory requirements in regions such as the USA and EU requires a methodical approach, consisting of specific steps that far precede the work on the actual analytical study protocol. This white paper discusses scientific and methodological considerations on the process of attribute and test method selection, criticality assessment, and subsequent assignment of analytical measures to US FDA's three tiers of analytical similarity assessment. Case examples of selection of critical quality attributes and analytical methods for similarity exercises are provided to illustrate the practical implementation of the principles discussed.
NASA Astrophysics Data System (ADS)
Krasichkov, Alexander S.; Grigoriev, Eugene B.; Bogachev, Mikhail I.; Nifontov, Eugene M.
2015-10-01
We suggest an analytical approach to adaptive thresholding in the shape anomaly detection problem. We find an analytical expression for the distribution of the cosine similarity score between a reference shape and an observed shape hindered by strong measurement noise; the distribution depends solely on the noise level and is independent of the particular shape analyzed. The analytical treatment is confirmed by computer simulations, which show nearly perfect agreement. Using this analytical solution, we suggest an improved shape anomaly detection approach based on adaptive thresholding. We validate the noise robustness of our approach using typical shapes of normal and pathological electrocardiogram cycles hindered by additive white noise. We show explicitly that under high noise levels our approach considerably outperforms the conventional tactic that does not take variations in the noise level into account.
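A minimal numerical sketch of the approach follows: the detection threshold for the cosine similarity score is set from the noise level alone. The quantile-by-simulation rule below is a stand-in for the paper's closed-form distribution, which is not reproduced here.

    import numpy as np

    def cosine_similarity(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    def adaptive_threshold(reference, noise_sigma, alpha=0.01, n_sim=10_000):
        """alpha-quantile of the similarity expected for the true shape under
        additive white noise of the given level (simulation stand-in for the
        analytical distribution)."""
        rng = np.random.default_rng(1)
        sims = [cosine_similarity(reference,
                                  reference + rng.normal(scale=noise_sigma,
                                                         size=reference.size))
                for _ in range(n_sim)]
        return np.quantile(sims, alpha)

    ref = np.sin(np.linspace(0, 2 * np.pi, 200))      # reference cycle shape
    thr = adaptive_threshold(ref, noise_sigma=0.3)
    obs = -ref + np.random.default_rng(2).normal(scale=0.3, size=ref.size)
    print(cosine_similarity(ref, obs) < thr)          # True: flagged anomalous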
Johnson, Sara B; Little, Todd D; Masyn, Katherine; Mehta, Paras D; Ghazarian, Sharon R
2017-06-01
Characterizing the determinants of child health and development over time, and identifying the mechanisms by which these determinants operate, is a research priority. The growth of precision medicine has increased awareness and refinement of conceptual frameworks, data management systems, and analytic methods for multilevel data. This article reviews key methodological challenges in cohort studies designed to investigate multilevel influences on child health and strategies to address them. We review and summarize methodological challenges that could undermine prospective studies of the multilevel determinants of child health and ways to address them, borrowing approaches from the social and behavioral sciences. Nested data, variation in intervals of data collection and assessment, missing data, construct measurement across development and reporters, and unobserved population heterogeneity pose challenges in prospective multilevel cohort studies with children. We discuss innovations in missing data, innovations in person-oriented analyses, and innovations in multilevel modeling to address these challenges. Study design and analytic approaches that facilitate the integration across multiple levels, and that account for changes in people and the multiple, dynamic, nested systems in which they participate over time, are crucial to fully realize the promise of precision medicine for children and adolescents. Copyright © 2017 Elsevier Inc. All rights reserved.
Challenges and Opportunities of Big Data in Health Care: A Systematic Review
Goswamy, Rishi; Raval, Yesha; Marawi, Sarah
2016-01-01
Background Big data analytics offers promise in many business sectors, and health care is looking at big data to provide answers to many age-related issues, particularly dementia and chronic disease management. Objective The purpose of this review was to summarize the challenges faced by big data analytics and the opportunities that big data opens in health care. Methods A total of 3 searches were performed for publications between January 1, 2010 and January 1, 2016 (PubMed/MEDLINE, CINAHL, and Google Scholar), and an assessment was made of content germane to big data in health care. From the results of the searches in research databases and Google Scholar (N=28), the authors summarized content and identified 9 and 14 themes under the categories Challenges and Opportunities, respectively. We rank-ordered and analyzed the themes based on the frequency of occurrence. Results The top challenges were issues of data structure, security, data standardization, storage and transfers, and managerial skills such as data governance. The top opportunities revealed were quality improvement, population management and health, early detection of disease, data quality, structure, and accessibility, improved decision making, and cost reduction. Conclusions Big data analytics has the potential for positive impact and global implications; however, it must overcome some legitimate obstacles. PMID:27872036
Buckling Imperfection Sensitivity of Axially Compressed Orthotropic Cylinders
NASA Technical Reports Server (NTRS)
Schultz, Marc R.; Nemeth, Michael P.
2010-01-01
Structural stability is a major consideration in the design of lightweight shell structures. However, theoretical predictions for geometrically perfect structures often considerably overpredict the buckling loads of inherently imperfect real structures. It is reasonably well understood how the shell geometry affects the imperfection sensitivity of axially compressed cylindrical shells; however, the effects of shell anisotropy on imperfection sensitivity are less well understood. In the present paper, the development of an analytical model for assessing the imperfection sensitivity of axially compressed orthotropic cylinders is discussed. Results from the analytical model for four shell designs are compared with those from a general-purpose finite-element code, and good qualitative agreement is found. Reasons for discrepancies are discussed, along with potential design implications of this line of research.
NASA Astrophysics Data System (ADS)
Devrient, M.; Da, X.; Frick, T.; Schmidt, M.
Laser transmission welding is a well-known joining technology for thermoplastics. Because of the need for lightweight, cost-effective and green production, thermoplastics are usually filled with glass fibers. These lead to higher absorption and more scattering within the upper joining partner, with a negative influence on the welding process. Here, an experimental method for characterizing the scattering behavior of semi-crystalline thermoplastics filled with short glass fibers is introduced, together with a finite element model of the welding process capable of considering scattering, as well as an analytical model. The experimental data are used for the numerical and analytical investigation of laser transmission welding under consideration of scattering. The scattering effects of several thermoplastics on the calculated temperature fields as well as weld seam geometries are quantified.
Posch, Tjorben Nils; Pütz, Michael; Martin, Nathalie; Huhn, Carolin
2015-01-01
In this review we introduce the advantages and limitations of electromigrative separation techniques in forensic toxicology. We thus present a summary of illustrative studies and our own experience in the field, together with established methods from the German Federal Criminal Police Office, rather than a complete survey. We focus on the analytical aspects of analytes' physicochemical characteristics (e.g. polarity, stereoisomers) and analytical challenges including matrix tolerance, separation from compounds present in large excess, sample volumes, and orthogonality. For these aspects we aim to reveal the specific advantages over more traditional methods. Both detailed studies and profiling and screening studies are taken into account. Care was taken to document almost exclusively well-validated methods that stand out with respect to the analytical challenge discussed. Special attention was paid to aspects exclusive to electromigrative separation techniques, including the use of the mobility axis, the potential for on-site instrumentation, and the capillary format for immunoassays. The review concludes with an introductory guide to method development for different separation modes, presenting typical buffer systems as starting points for different analyte classes. The objective of this review is to provide an orientation for users in separation science considering using capillary electrophoresis in their laboratory in the future.
Kramberger, Petra; Urbas, Lidija; Štrancar, Aleš
2015-01-01
Downstream processing of nanoplexes (viruses, virus-like particles, bacteriophages) is characterized by complexity of the starting material, number of purification methods to choose from, regulations that are setting the frame for the final product and analytical methods for upstream and downstream monitoring. This review gives an overview on the nanoplex downstream challenges and chromatography based analytical methods for efficient monitoring of the nanoplex production.
Du, Lihong; White, Robert L
2009-02-01
A previously proposed partition equilibrium model for quantitative prediction of analyte response in electrospray ionization mass spectrometry is modified to yield an improved linear relationship. Analyte mass spectrometer response is modeled by a competition mechanism between analyte and background electrolytes that is based on partition equilibrium considerations. The correlation between analyte response and solution composition is described by the linear model over a wide concentration range and the improved model is shown to be valid for a wide range of experimental conditions. The behavior of an analyte in a salt solution, which could not be explained by the original model, is correctly predicted. The ion suppression effects of 16:0 lysophosphatidylcholine (LPC) on analyte signals are attributed to a combination of competition for excess charge and reduction of total charge due to surface tension effects. In contrast to the complicated mathematical forms that comprise the original model, the simplified model described here can more easily be employed to predict analyte mass spectrometer responses for solutions containing multiple components. Copyright (c) 2008 John Wiley & Sons, Ltd.
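The competition mechanism lends itself to a compact illustration: in charge-partitioning models of this family, an analyte's signal is proportional to its share of the available excess charge. The function below is a generic sketch in that spirit, with invented coefficients; it is not the authors' fitted model.

    def esi_response(conc_analyte, k_analyte, competitors, total_signal=1.0):
        """competitors: (concentration, partition coefficient) pairs for
        background electrolytes competing for the excess charge."""
        share = k_analyte * conc_analyte
        denom = share + sum(c * k for c, k in competitors)
        return total_signal * share / denom

    # Ion suppression: the same analyte signal drops when a strongly
    # surface-active competitor is added to the background electrolyte.
    print(esi_response(1e-6, 50.0, [(1e-3, 1.0)]))
    print(esi_response(1e-6, 50.0, [(1e-3, 1.0), (1e-5, 500.0)]))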
Trosman, Julia R; Weldon, Christine B; Douglas, Michael P; Deverka, Patricia A; Watkins, John B; Phillips, Kathryn A
2017-01-01
New payment and care organization approaches, such as those of accountable care organizations (ACOs), are reshaping accountability and shifting risk, as well as decision making, from payers to providers, within the Triple Aim context of health reform. The Triple Aim calls for improving experience of care, improving health of populations, and reducing health care costs. To examine how the transition to the ACO model impacts decision making on adoption and use of innovative technologies in the era of accelerating scientific advancement of personalized medicine and other innovations, we interviewed representatives from 10 private payers and 6 provider institutions involved in implementing the ACO model (i.e., ACOs) to understand changes, challenges, and facilitators of decision making on medical innovations, including personalized medicine. We used the framework approach of qualitative research for study design and thematic analysis. We found that representatives from the participating payer companies and ACOs perceive similar challenges to ACOs' decision making in terms of achieving a balance between the components of the Triple Aim: improving care experience, improving population health, and reducing costs. The challenges include the prevalence of cost over care quality considerations in ACOs' decisions and ACOs' insufficient analytical and technology assessment capacity to evaluate complex innovations such as personalized medicine. Decision-making facilitators included increased competition across ACOs and patients' interest in personalized medicine. As new payment models evolve, payers, ACOs, and other stakeholders should address challenges and leverage opportunities to arm ACOs with robust, consistent, rigorous, and transparent approaches to decision making on medical innovations. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Distributed Revisiting: An Analytic for Retention of Coherent Science Learning
ERIC Educational Resources Information Center
Svihla, Vanessa; Wester, Michael J.; Linn, Marcia C.
2015-01-01
Designing learning experiences that support the development of coherent understanding of complex scientific phenomena is challenging. We sought to identify analytics that can also guide such designs to support retention of coherent understanding. Based on prior research that distributing study of material over time supports retention, we explored…
ERIC Educational Resources Information Center
Mavroudi, Anna; Giannakos, Michail; Krogstie, John
2018-01-01
Learning Analytics (LA) and adaptive learning are inextricably linked since they both foster technology-supported learner-centred education. This study identifies developments focusing on their interplay and emphasises insufficiently investigated directions which display a higher innovation potential. Twenty-one peer-reviewed studies are…
Social Learning Analytics: Navigating the Changing Settings of Higher Education
ERIC Educational Resources Information Center
de Laat, Maarten; Prinsen, Fleur R.
2014-01-01
Current trends and challenges in higher education (HE) require a reorientation towards openness, technology use and active student participation. In this article we will introduce Social Learning Analytics (SLA) as instrumental in formative assessment practices, aimed at supporting and strengthening students as active learners in increasingly open…
Exploratory Analysis in Learning Analytics
ERIC Educational Resources Information Center
Gibson, David; de Freitas, Sara
2016-01-01
This article summarizes the methods, observations, challenges and implications for exploratory analysis drawn from two learning analytics research projects. The cases include an analysis of a games-based virtual performance assessment and an analysis of data from 52,000 students over a 5-year period at a large Australian university. The complex…
Reasoning across Ontologically Distinct Levels: Students' Understandings of Molecular Genetics
ERIC Educational Resources Information Center
Duncan, Ravit Golan; Reiser, Brian J.
2007-01-01
In this article we apply a novel analytical framework to explore students' difficulties in understanding molecular genetics--a domain that is particularly challenging to learn. Our analytical framework posits that reasoning in molecular genetics entails mapping across ontologically distinct levels--an information level containing the genetic…
Analytical evaluation of current starch methods used in the international sugar industry: Part I
USDA-ARS?s Scientific Manuscript database
Several analytical starch methods currently exist in the international sugar industry that are used to prevent or mitigate starch-related processing challenges as well as assess the quality of traded end-products. These methods use simple iodometric chemistry, mostly potato starch standards, and uti...
ERIC Educational Resources Information Center
Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew
2015-01-01
Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…
Tian, Jingqi; Liu, Qian; Shi, Jinle; Hu, Jianming; Asiri, Abdullah M; Sun, Xuping; He, Yuquan
2015-09-15
Considerable recent attention has been paid to homogeneous fluorescent DNA detection using nanostructures as a universal "quencher", but it remains a great challenge to develop such a nanosensor with the benefits of low cost, high speed, sensitivity, and selectivity. In this work, we report the use of iron-based metal-organic framework nanorods as a highly efficient sensing platform for fluorescent DNA detection. It takes only about 4 min to complete the whole "mix-and-detect" process, with a low detection limit of 10 pM and strong discrimination of single-point mutation. Control experiments reveal that the remarkable sensing behavior is a consequence of the synergies of the metal center and organic linker. This work elucidates how composition control of nanostructures can significantly impact their sensing properties, enabling new opportunities for the rational design of functional materials for analytical applications. Copyright © 2015 Elsevier B.V. All rights reserved.
Jiang, Guoying; Yu, Christopher; Yadav, Daniela B; Hu, Zhilan; Amurao, Annamarie; Duenas, Eileen; Wong, Marc; Iverson, Mark; Zheng, Kai; Lam, Xanthe; Chen, Jia; Vega, Roxanne; Ulufatu, Sheila; Leddy, Cecilia; Davis, Helen; Shen, Amy; Wong, Pin Y; Harris, Reed; Wang, Y John; Li, Dongwei
2016-07-01
Due to their potential influence on stability, pharmacokinetics, and product consistency, antibody charge variants have attracted considerable attention in the biotechnology industry. Subtle to significant differences in the level of charge variants and new charge variants under various cell culture conditions are often observed during routine manufacturing or process changes and pose a challenge when demonstrating product comparability. To explore potential solutions to control charge heterogeneity, monoclonal antibodies (mAbs) with native, wild-type C-termini, and mutants with C-terminal deletions of either lysine or lysine and glycine were constructed, expressed, purified, and characterized in vitro and in vivo. Analytical and physiological characterization demonstrated that the mAb mutants had greatly reduced levels of basic variants without decreasing antibody biologic activity, structural stability, pharmacokinetics, or subcutaneous bioavailability in rats. This study provides a possible solution to mitigate mAb heterogeneity in C-terminal processing, improve batch-to-batch consistency, and facilitate the comparability study during process changes. Published by Elsevier Inc.
Modeling Power Systems as Complex Adaptive Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chassin, David P.; Malard, Joel M.; Posse, Christian
2004-12-30
Physical analogs have shown considerable promise for understanding the behavior of complex adaptive systems, including macroeconomics, biological systems, social networks, and electric power markets. Many of today's most challenging technical and policy questions can be reduced to a distributed economic control problem. Indeed, economically based control of large-scale systems is founded on the conjecture that the price-based regulation (e.g., auctions, markets) results in an optimal allocation of resources and emergent optimal system control. This report explores the state-of-the-art physical analogs for understanding the behavior of some econophysical systems and deriving stable and robust control strategies for using them. We review and discuss applications of some analytic methods based on a thermodynamic metaphor, according to which the interplay between system entropy and conservation laws gives rise to intuitive and governing global properties of complex systems that cannot be otherwise understood. We apply these methods to the question of how power markets can be expected to behave under a variety of conditions.
Clinical challenges in the molecular characterization of circulating tumour cells in breast cancer.
Lianidou, E S; Mavroudis, D; Georgoulias, V
2013-06-25
Blood testing for circulating tumour cells (CTC) has emerged as one of the hottest fields in cancer research. CTC detection and enumeration can serve as a 'liquid biopsy' and an early marker of response to systemic therapy, whereas their molecular characterisation has a strong potential to be translated to individualised targeted treatments and spare breast cancer (BC) patients unnecessary and ineffective therapies. Different analytical systems for CTC detection and isolation have been developed and new areas of research are directed towards developing novel assays for CTC molecular characterisation. Molecular characterisation of single CTC holds considerable promise for predictive biomarker assessment and to explore CTC heterogeneity. The application of extremely powerful next-generation sequencing technologies in the area of CTC molecular characterisation in combination with reliable single CTC isolation opens new frontiers for the management of patients in the near future. This review is mainly focused on the clinical potential of the molecular characterisation of CTC in BC.
Catalytic nanomotors for environmental monitoring and water remediation.
Soler, Lluís; Sánchez, Samuel
2014-07-07
Self-propelled nanomotors hold considerable promise for developing innovative environmental applications. This review highlights the recent progress in the use of self-propelled nanomotors for water remediation and environmental monitoring applications, as well as the effect of environmental conditions on the dynamics of nanomotors. Artificial nanomotors can sense different analytes (and therefore pollutants, or "chemical threats"), can be used for testing the quality of water and for the selective removal of oil, and can alter their speeds depending on the presence of certain substances in the solution in which they swim. Newly introduced micromotors with the double functionality of mixing liquids at the microscale and enhancing chemical reactions for the degradation of organic pollutants greatly broaden the range of applications in the environmental domain. These "self-powered remediation systems" could be seen as a new generation of "smart devices" for cleaning water in small pipes or cavities difficult to reach with traditional methods. With constant improvement and consideration of the key challenges, we expect that artificial nanomachines could play an important role in environmental applications in the near future.
Transmission of Insult in Out-of-Position Subjects: I. Shoulder Injury
NASA Astrophysics Data System (ADS)
Shaibani, Saami J.
2002-03-01
The dynamic response of vehicle occupants in impact events is quite well understood when initial boundary conditions have simple values. However, departure from regular seated postures can present considerable analytical challenges in identifying possible causes of injury. Research grounded in physics[1] can facilitate that identification, promoting diagnosis and enhancing treatment of injuries that might not otherwise be recognized. This is especially true in some low-severity impacts, three of which are explored as separate studies because they are so different. The first involves the upper extremity of a passenger who was bent fully forward at the time of impact, and correct use of physics established an unusual injury mechanism when lesser approaches failed. Subsequent papers deal with other parts of the body.[2-3] 1. Proper Treatment of Complex Human Structures, Announcer 27 (4), 100 (1997); 2. Transmission of Insult in Out-of-Position Subjects: II. Lumbosacral Injury, Bull. Am. Phys. Soc. in press (2002); 3. ibid: III. Thoracic Spine Injury.
Zuin, Vânia G; Budarin, Vitaliy L; De Bruyn, Mario; Shuttleworth, Peter S; Hunt, Andrew J; Pluciennik, Camille; Borisova, Aleksandra; Dodson, Jennifer; Parker, Helen L; Clark, James H
2017-09-21
The recovery and separation of high value and low volume extractives are a considerable challenge for the commercial realisation of zero-waste biorefineries. Using solid-phase extractions (SPE) based on sustainable sorbents is a promising method to enable efficient, green and selective separation of these complex extractive mixtures. Mesoporous carbonaceous solids derived from renewable polysaccharides are ideal stationary phases due to their tuneable functionality and surface structure. In this study, the structure-separation relationships of thirteen polysaccharide-derived mesoporous materials and two modified types as sorbents for ten naturally-occurring bioactive phenolic compounds were investigated. For the first time, a comprehensive statistical analysis of the key molecular and surface properties influencing the recovery of these species was carried out. The obtained results show the possibility of developing tailored materials for purification, separation or extraction, depending on the molecular composition of the analyte. The wide versatility and application span of these polysaccharide-derived mesoporous materials offer new sustainable and inexpensive alternatives to traditional silica-based stationary phases.
The changing demographic, legal, and technological contexts of political representation
Forest, Benjamin
2005-01-01
Three developments have created challenges for political representation in the U.S. and particularly for the use of territorially based representation (election by district). First, the demographic complexity of the U.S. population has grown both in absolute terms and in terms of residential patterns. Second, legal developments since the 1960s have recognized an increasing number of groups as eligible for voting rights protection. Third, the growing technical capacities of computer technology, particularly Geographic Information Systems, have allowed political parties and other organizations to create election districts with increasingly precise political and demographic characteristics. Scholars have made considerable progress in measuring and evaluating the racial and partisan biases of districting plans, and some states have tried to use Geographic Information Systems technology to produce more representative districts. However, case studies of Texas and Arizona illustrate that such analytic and technical advances have not overcome the basic contradictions that underlie the American system of territorial political representation. PMID:16230615
Use of Social Media to Target Information-Driven Arms Control and Nonproliferation Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kreyling, Sean J.; Williams, Laura S.; Gastelum, Zoe N.
There has been considerable discussion within the national security community, including a recent workshop sponsored by the U.S. State Department, about the use of social media for extracting patterns of collective behavior and influencing public perception in areas relevant to arms control and nonproliferation. This paper seeks to explore if, and how, social media can be used to supplement nonproliferation and arms control inspection and monitoring activities on states and sites of greatest proliferation relevance. In this paper, we set the stage for how social media can be applied in this problem space and describe some of the foreseen challenges, including data validation, sources and attributes, verification, and security. Using information analytics and data visualization capabilities available at Pacific Northwest National Laboratory (PNNL), we provide graphical examples of some social media "signatures" of potential relevance for nonproliferation and arms control purposes. We conclude by describing a proposed case study and offering recommendations both for further research and next steps by the policy community.
Telecom Big Data for Urban Transport Analysis - a Case Study of Split-Dalmatia County in Croatia
NASA Astrophysics Data System (ADS)
Baučić, M.; Jajac, N.; Bućan, M.
2017-09-01
Today, big data has become widely available and the new technologies are being developed for big data storage architecture and big data analytics. An ongoing challenge is how to incorporate big data into GIS applications supporting the various domains. International Transport Forum explains how the arrival of big data and real-time data, together with new data processing algorithms lead to new insights and operational improvements of transport. Based on the telecom customer data, the Study of Tourist Movement and Traffic in Split-Dalmatia County in Croatia is carried out as a part of the "IPA Adriatic CBC//N.0086/INTERMODAL" project. This paper briefly explains the big data used in the study and the results of the study. Furthermore, this paper investigates the main considerations when using telecom customer big data: data privacy and data quality. The paper concludes with GIS visualisation and proposes the further use of big data used in the study.
Shell-NASA Vibration-Based Damage Characterization
NASA Technical Reports Server (NTRS)
Rollins, John M.
2014-01-01
This article describes collaborative research between Shell International Exploration and Production (IE&P) scientists and ISAG personnel to investigate the feasibility of ultrasonic-based characterization of spacecraft tile damage for in-space inspection applications. The approach was proposed by Shell personnel in a Shell-NASA "speed-matching" session in early 2011 after ISAG personnel described challenges inherent in the inspection of MMOD damage deep within spacecraft thermal protection system (TPS) tiles. The approach leveraged Shell's relevant sensor and analytical expertise. The research addressed the difficulties associated with producing 3D models of MMOD damage cavities under the surface of a TPS tile, given that simple image-based sensing is constrained by line of sight through entry holes that have diameters considerably smaller than the underlying damage cavities. Damage cavity characterization is needed as part of a vehicle inspection and risk reduction capability for long-duration, human-flown space missions. It was hoped that cavity characterization could be accomplished through the use of ultrasonic techniques that allow for signal penetration through solid material.
Strategies to address participant misrepresentation for eligibility in Web-based research.
Kramer, Jessica; Rubin, Amy; Coster, Wendy; Helmuth, Eric; Hermos, John; Rosenbloom, David; Moed, Rich; Dooley, Meghan; Kao, Ying-Chia; Liljenquist, Kendra; Brief, Deborah; Enggasser, Justin; Keane, Terence; Roy, Monica; Lachowicz, Mark
2014-03-01
Emerging methodological research suggests that the World Wide Web ("Web") is an appropriate venue for survey data collection, and a promising area for delivering behavioral intervention. However, the use of the Web for research raises concerns regarding sample validity, particularly when the Web is used for recruitment and enrollment. The purpose of this paper is to describe the challenges experienced in two different Web-based studies in which participant misrepresentation threatened sample validity: a survey study and an online intervention study. The lessons learned from these experiences generated three types of strategies researchers can use to reduce the likelihood of participant misrepresentation for eligibility in Web-based research. Examples of procedural/design strategies, technical/software strategies and data analytic strategies are provided along with the methodological strengths and limitations of specific strategies. The discussion includes a series of considerations to guide researchers in the selection of strategies that may be most appropriate given the aims, resources and target population of their studies. Copyright © 2014 John Wiley & Sons, Ltd.
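As a concrete (hypothetical) instance of the technical/software strategies mentioned, enrollments can be screened for repeated device fingerprints and implausibly fast completion times. Field names and thresholds below are invented for illustration and are not taken from the paper.

    import hashlib
    from collections import Counter

    def fingerprint(ip, user_agent):
        return hashlib.sha256(f"{ip}|{user_agent}".encode()).hexdigest()

    def flag_suspicious(enrollments, min_seconds=120):
        """enrollments: dicts with 'ip', 'user_agent', 'duration_s' keys."""
        counts = Counter(fingerprint(e["ip"], e["user_agent"])
                         for e in enrollments)
        flagged = []
        for e in enrollments:
            reasons = []
            if counts[fingerprint(e["ip"], e["user_agent"])] > 1:
                reasons.append("repeat_fingerprint")
            if e["duration_s"] < min_seconds:
                reasons.append("too_fast")
            if reasons:
                flagged.append({**e, "reasons": reasons})
        return flagged

Flags like these are screening aids to be combined with procedural and data-analytic checks, not stand-alone eligibility decisions.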
NASA Astrophysics Data System (ADS)
Drewes, Andrea; Henderson, Joseph; Mouza, Chrystalla
2018-01-01
Climate change is one of the most pressing challenges facing society, and climate change educational models are emerging in response. This study investigates the implementation and enactment of a climate change professional development (PD) model for science educators and its impact on student learning. Using an intrinsic case study methodology, we focused analytic attention on how one teacher made particular pedagogical and content decisions, and the implications for student's conceptual learning. Using anthropological theories of conceptual travel, we traced salient ideas through instructional delivery and into student reasoning. Analysis showed that students gained an increased understanding of the enhanced greenhouse effect and the implications of human activity on this enhanced effect at statistically significant levels and with moderate effect sizes. However, students demonstrated a limited, though non-significant gain on the likely effects of climate change. Student reasoning on the tangible actions to deal with these problems also remained underdeveloped, reflecting omissions in both PD and teacher enactment. We discuss implications for the emerging field of climate change education.
Medicines, shaken and stirred: a critical review on the ecotoxicology of pharmaceutical mixtures
Backhaus, Thomas
2014-01-01
Analytical monitoring surveys routinely confirm that organisms in the environment are exposed to complex multi-component pharmaceutical mixtures. We are hence tasked with the challenge to take this into consideration when investigating the ecotoxicology of pharmaceuticals. This review first provides a brief overview of the fundamental approaches for mixture toxicity assessment, which is then followed by a critical review on the empirical evidence that is currently at hand on the ecotoxicology of pharmaceutical mixtures. It is concluded that, while the classical concepts of concentration addition and independent action (response addition) provide a robust scientific footing, several knowledge gaps remain. This includes, in particular, the need for more and better empirical data on the effects of pharmaceutical mixtures on soil organisms as well as marine flora and fauna, and exploring the quantitative consequences of toxicokinetic, toxicodynamic and ecological interactions. Increased focus should be put on investigating the ecotoxicology of pharmaceutical mixtures in environmentally realistic settings. PMID:25405972
Evaluation of new laser spectrometer techniques for in-situ carbon monoxide measurements
NASA Astrophysics Data System (ADS)
Zellweger, C.; Steinbacher, M.; Buchmann, B.
2012-10-01
Long-term time series of the atmospheric composition are essential for environmental research and thus require compatible, multi-decadal monitoring activities. The current data quality objectives of the World Meteorological Organization (WMO) for carbon monoxide (CO) in the atmosphere are very challenging to meet with the measurement techniques that have been used until recently. During the past few years, new spectroscopic techniques came to market with promising properties for trace gas analytics. The current study compares three instruments that have recently become commercially available (since 2011) with the best currently available technique (Vacuum UV Fluorescence) and provides a link to previous comparison studies. The instruments were investigated for their performance regarding repeatability, reproducibility, drift, temperature dependence, water vapour interference and linearity. Finally, all instruments were examined during a short measurement campaign to assess their applicability for long-term field measurements. It could be shown that the new techniques perform considerably better compared to previous techniques, although some issues, such as temperature influence and cross sensitivities, need further attention.
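One standard way such instrument comparisons quantify repeatability and drift as a function of averaging time is the Allan deviation; whether this exact estimator was used in the study is an assumption here. A minimal non-overlapping version:

    import numpy as np

    def allan_deviation(x, m):
        """Non-overlapping Allan deviation of a regularly sampled series x
        at an averaging window of m samples."""
        n = len(x) // m
        means = x[: n * m].reshape(n, m).mean(axis=1)
        return np.sqrt(0.5 * np.mean(np.diff(means) ** 2))

    rng = np.random.default_rng(0)
    co_ppb = 100 + rng.normal(scale=1.0, size=86_400)   # synthetic 1 Hz record
    for m in (1, 10, 100, 1000):
        print(m, allan_deviation(co_ppb, m))  # white noise: falls as 1/sqrt(m)

A minimum in the Allan deviation curve indicates the optimal integration time before instrument drift begins to dominate.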
Probabilistic risk analysis and terrorism risk.
Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J
2010-04-01
Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.
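Much of this work builds on the familiar threat-vulnerability-consequence decomposition. The sketch below shows that arithmetic on invented numbers purely for illustration; real assessments involve far richer event trees and uncertainty treatment.

    def expected_loss(p_attack, p_success_given_attack, consequence):
        """Annualized risk = P(attack) * P(success | attack) * consequence."""
        return p_attack * p_success_given_attack * consequence

    scenarios = {                     # (threat, vulnerability, $ consequence)
        "scenario_a": (0.05, 0.30, 2.0e9),
        "scenario_b": (0.20, 0.05, 5.0e8),
    }
    for name, (t, v, c) in scenarios.items():
        print(name, expected_loss(t, v, c))   # ranks candidate investments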
Summers, Richard L; Pipke, Matt; Wegerich, Stephan; Conkright, Gary; Isom, Kristen C
2014-01-01
Background. Monitoring cardiovascular hemodynamics in the modern clinical setting is a major challenge. Increasing amounts of physiologic data must be analyzed and interpreted in the context of the individual patient's pathology and inherent biologic variability. Certain data-driven analytical methods are currently being explored for smart monitoring of data streams from patients as a first-tier automated detection system for clinical deterioration. As a prelude to human clinical trials, an empirical multivariate machine learning method called Similarity-Based Modeling (SBM) was tested in an in silico experiment using data generated with the aid of a detailed computer simulator of human physiology (Quantitative Circulatory Physiology, or QCP), which contains complex control systems with realistic integrated feedback loops. Methods. SBM is a kernel-based, multivariate machine learning method that uses monitored clinical information to generate an empirical model of a patient's physiologic state. This platform allows for the use of predictive analytic techniques to identify early changes in a patient's condition that are indicative of a state of deterioration or instability. The integrity of the technique was tested through an in silico experiment using QCP, in which computer simulations of a slowly evolving cardiac tamponade produced a progressive state of cardiovascular decompensation. Simulator outputs for the variables under consideration were generated at a 2-min data rate (0.083 Hz), with the tamponade introduced 420 minutes into the simulation sequence. The ability of the SBM predictive analytics methodology to identify clinical deterioration was compared with the thresholds used by conventional monitoring methods. Results. The SBM modeling method was found to closely track the normal physiologic variation as simulated by QCP. As the tamponade slowly developed, the SBM model and the simulated biosignals were seen to diverge in the early stages of physiologic deterioration, while the variables were still within normal ranges. Thus, the SBM system identified pathophysiologic conditions in a timeframe that would not have been detected in a usual clinical monitoring scenario. Conclusion. In this study the functionality of a multivariate machine learning predictive methodology that incorporates commonly monitored clinical information was tested using a computer model of human physiology. SBM and predictive analytics were able to differentiate a state of decompensation while the monitored variables were still within normal clinical ranges. This finding suggests that SBM could provide early identification of clinical deterioration using predictive analytic techniques. Keywords: predictive analytics, hemodynamics, monitoring.
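Generic auto-associative similarity modeling of the SBM type can be sketched in a few lines: the expected "healthy" state is a similarity-weighted blend of stored reference observations, and a growing residual between observed and expected states flags deterioration before fixed thresholds fire. This is a textbook-style formulation, not the proprietary implementation tested in the study.

    import numpy as np

    def sbm_estimate(x, memory, bandwidth=1.0):
        """x: current observation vector (n_vars,); memory: (n_vars, n_exemplars)
        reference states recorded during normal operation."""
        d2 = ((memory - x[:, None]) ** 2).sum(axis=0)   # distance to exemplars
        w = np.exp(-d2 / (2 * bandwidth ** 2))          # Gaussian kernel weights
        w /= w.sum()
        return memory @ w                               # expected (modeled) state

    rng = np.random.default_rng(3)
    memory = rng.normal(size=(4, 50))                   # 4 signals, 50 exemplars
    x = memory[:, 0] + rng.normal(scale=0.05, size=4)   # near-normal observation
    residual = np.linalg.norm(x - sbm_estimate(x, memory))
    print(residual)   # trending residuals precede out-of-range alarms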
Bandura, D R; Baranov, V I; Tanner, S D
2001-07-01
A low-level review of the fundamentals of ion-molecule interactions is presented. These interactions are used to predict the efficiencies of collisional fragmentation, energy damping and reaction for a variety of neutral gases as a function of pressure in a rf-driven collision/reaction cell. It is shown that the number of collisions increases dramatically when the ion energies are reduced to near-thermal (< 0.1 eV), because of the ion-induced dipole and ion-dipole interaction. These considerations suggest that chemical reaction can be orders of magnitude more efficient at improving the analyte signal/background ratio than can collisional fragmentation. Considerations that lead to an appropriate selection of type of gas, operating pressure, and ion energies for efficient operation of the cell for the alleviation of spectral interferences are discussed. High efficiency (large differences between reaction efficiencies of the analyte and interference ions, and concomitant suppression of secondary chemistry) might be required to optimize the chemical resolution (determination of an analyte in the presence of an isobaric interference) when using ion-molecule chemistry to suppress the interfering ion. In many instances atom transfer to the analyte, which shifts the analytical m/z by the mass of the atom transferred, provides high chemical resolution, even when the efficiency of reaction is relatively low. Examples are given of oxidation, hydroxylation, and chlorination of analyte ions (V+, Fe+, As+, Se+, Sr+, Y+, and Zr+) to improve the capability of determination of complex samples. Preliminary results are given showing O-atom abstraction by CO from CaO+ to enable the determination of Fe in high-Ca samples.
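The quoted orders-of-magnitude advantage of reactive chemistry follows from simple collision arithmetic: if the interference reacts with probability eps_int per collision and the analyte with probability eps_analyte, then after n collisions the signal/background ratio improves by ((1 - eps_analyte)/(1 - eps_int))**n. The efficiencies below are illustrative, not measured values.

    def ratio_gain(eps_analyte, eps_int, n_collisions):
        """Multiplicative improvement in analyte/interference ratio after
        n reactive collisions."""
        return ((1 - eps_analyte) / (1 - eps_int)) ** n_collisions

    # A gas reacting rapidly with the interfering ion but slowly with the
    # analyte: large gains after a modest number of near-thermal collisions.
    print(ratio_gain(eps_analyte=1e-4, eps_int=0.3, n_collisions=20))  # ~1e3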
The Challenge of Separating Effects of Simultaneous Education Projects on Student Achievement
ERIC Educational Resources Information Center
Ma, Xin; Ma, Lingling
2009-01-01
When multiple education projects operate in an overlapping or rear-ended manner, it is always a challenge to separate unique project effects on schooling outcomes. Our analysis represents a first attempt to address this challenge. A three-level hierarchical linear model (HLM) was presented as a general analytical framework to separate program…
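As a sketch of the kind of model involved, a generic three-level formulation (notation assumed here, not taken from the article) nests test occasions within students within schools, with indicators for the overlapping programs entering at the student level:

    \begin{align*}
    \text{Level 1 (occasions):} \quad & Y_{ijk} = \pi_{0jk} + \pi_{1jk}\,\mathrm{Time}_{ijk} + e_{ijk} \\
    \text{Level 2 (students):} \quad & \pi_{0jk} = \beta_{00k} + \sum_{p} \beta_{0pk}\,\mathrm{Program}_{pjk} + r_{0jk} \\
    \text{Level 3 (schools):} \quad & \beta_{00k} = \gamma_{000} + u_{00k}
    \end{align*}

Separating unique program effects then amounts to estimating the \beta_{0pk} coefficients while the nested random terms e, r, and u absorb occasion-, student-, and school-level variation shared across programs.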
The challenge of big data in public health: an opportunity for visual analytics.
Ola, Oluwakemi; Sedig, Kamran
2014-01-01
Public health (PH) data can generally be characterized as big data. The efficient and effective use of this data determines the extent to which PH stakeholders can sufficiently address societal health concerns as they engage in a variety of work activities. As stakeholders interact with data, they engage in various cognitive activities such as analytical reasoning, decision-making, interpreting, and problem solving. Performing these activities with big data is a challenge for the unaided mind as stakeholders encounter obstacles relating to the data's volume, variety, velocity, and veracity. Such being the case, computer-based information tools are needed to support PH stakeholders. Unfortunately, while existing computational tools are beneficial in addressing certain work activities, they fall short in supporting cognitive activities that involve working with large, heterogeneous, and complex bodies of data. This paper presents visual analytics (VA) tools, a nascent category of computational tools that integrate data analytics with interactive visualizations, to facilitate the performance of cognitive activities involving big data. Historically, PH has lagged behind other sectors in embracing new computational technology. In this paper, we discuss the role that VA tools can play in addressing the challenges presented by big data. In doing so, we demonstrate the potential benefit of incorporating VA tools into PH practice, in addition to highlighting the need for further systematic and focused research.
Analytical dose modeling for preclinical proton irradiation of millimetric targets.
Vanstalle, Marie; Constanzo, Julie; Karakaya, Yusuf; Finck, Christian; Rousseau, Marc; Brasse, David
2018-01-01
Due to the considerable development of proton radiotherapy, several proton platforms have emerged to irradiate small animals in order to study the biological effectiveness of proton radiation. A dedicated analytical treatment planning tool was developed in this study to accurately calculate the delivered dose given the specific constraints imposed by the small dimensions of the irradiated areas. The treatment planning system (TPS) developed in this study is based on an analytical formulation of the Bragg peak and uses experimental range values of protons. The method was validated after comparison with experimental data from the literature and then compared to Monte Carlo simulations conducted using Geant4. Three examples of treatment planning, performed with phantoms made of water targets and bone-slab insert, were generated with the analytical formulation and Geant4. Each treatment planning was evaluated using dose-volume histograms and gamma index maps. We demonstrate the value of the analytical function for mouse irradiation, which requires a targeting accuracy of 0.1 mm. Using the appropriate database, the analytical modeling limits the errors caused by misestimating the stopping power. For example, 99% of a 1-mm tumor irradiated with a 24-MeV beam receives the prescribed dose. The analytical dose deviations from the prescribed dose remain within the dose tolerances stated by report 62 of the International Commission on Radiation Units and Measurements for all tested configurations. In addition, the gamma index maps show that the highly constrained targeting accuracy of 0.1 mm for mouse irradiation leads to a significant disagreement between Geant4 and the reference. This simulated treatment planning is nevertheless compatible with a targeting accuracy exceeding 0.2 mm, corresponding to rat and rabbit irradiations. Good dose accuracy for millimetric tumors is achieved with the analytical calculation used in this work. These volume sizes are typical in mouse models for radiation studies. Our results demonstrate that the choice of analytical rather than simulated treatment planning depends on the animal model under consideration. © 2017 American Association of Physicists in Medicine.
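The analytical starting point for such models can be illustrated with the Bragg-Kleeman range-energy rule, R = alpha * E0**p; without range straggling this yields a depth-dose proportional to (R - z)**(1/p - 1), which diverges toward the Bragg peak. The constants below are textbook approximations for water, not the experimental range database used by the treatment planning system in the paper.

    import numpy as np

    ALPHA, P = 2.2e-3, 1.77              # water; R in cm, E0 in MeV (approx.)

    def csda_range(e0_mev):
        return ALPHA * e0_mev ** P       # Bragg-Kleeman rule

    def bragg_dose(z_cm, e0_mev):
        """Normalized depth-dose of a monoenergetic beam, straggling ignored."""
        r = csda_range(e0_mev)
        z = np.asarray(z_cm, dtype=float)
        d = np.zeros_like(z)
        mask = z < r
        d[mask] = (r - z[mask]) ** (1.0 / P - 1.0)
        return d / d.max()

    print(round(csda_range(24.0), 3))    # ~0.62 cm for the 24-MeV case above
    print(np.round(bragg_dose(np.linspace(0.0, 0.7, 8), 24.0), 3))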
Hanning, Sara M; Orlu Gul, Mine; Winslade, Jackie; Baarslag, Manuel A; Neubert, Antje; Tuleu, Catherine
2016-09-25
A Paediatric Investigation Plan (PIP) is a development plan that aims to ensure that sufficient data are obtained through studies in paediatrics to support the generation of marketing authorisation of medicines for children. This paper highlights some practical considerations and challenges with respect to PIP submissions and paediatric clinical trials during the pharmaceutical development phase, using the FP7-funded Clonidine for Sedation of Paediatric Patients in the Intensive Care Unit (CloSed) project as a case study. Examples discussed include challenges and considerations regarding formulation development, blinding and randomisation, product labelling and shipment and clinical trial requirements versus requirements for marketing authorisation. A significant quantity of information is required for PIP submissions and it is hoped that future applicants may benefit from an insight into some critical considerations and challenges faced in the CloSed project. Copyright © 2016 Elsevier B.V. All rights reserved.
IBM’s Health Analytics and Clinical Decision Support
Sun, J.; Knoop, S.; Shabo, A.; Carmeli, B.; Sow, D.; Syed-Mahmood, T.; Rapp, W.
2014-01-01
Objectives This survey explores the role of big data and health analytics developed by IBM in supporting the transformation of healthcare by augmenting evidence-based decision-making. Methods Some problems in healthcare and strategies for change are described. It is argued that change requires better decisions, which, in turn, require better use of the many kinds of healthcare information. Analytic resources that address each of the information challenges are described. Examples of the role of each of the resources are given. Results There are powerful analytic tools that utilize the various kinds of big data in healthcare to help clinicians make more personalized, evidence-based decisions. Such resources can extract relevant information and provide insights that clinicians can use to make evidence-supported decisions. There are early suggestions that these resources have clinical value. As with all analytic tools, they are limited by the amount and quality of data. Conclusion Big data is an inevitable part of the future of healthcare. There is a compelling need to manage and use big data to make better decisions to support the transformation of healthcare to the personalized, evidence-supported model of the future. Cognitive computing resources are necessary to manage the challenges in employing big data in healthcare. Such tools have been and are being developed. The analytic resources themselves do not drive, but support, healthcare transformation. PMID:25123736
Transportation systems health : concepts, applications & significance.
DOT National Transportation Integrated Search
2015-12-01
This report offers conceptual and analytical frameworks and application examples to address the question: how can broader statewide (or national) objectives be achieved while formally taking into consideration different regional priorities and constr...
Design Evaluation of Wind Turbine Spline Couplings Using an Analytical Model: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Y.; Keller, J.; Wallen, R.
2015-02-01
Articulated splines are commonly used in the planetary stage of wind turbine gearboxes for transmitting the driving torque and improving load sharing. Direct measurement of spline loads and performance is extremely challenging because of limited accessibility. This paper presents an analytical model for the analysis of articulated spline coupling designs. For a given torque and shaft misalignment, this analytical model quickly yields insights into relationships between the spline design parameters and resulting loads; bending, contact, and shear stresses; and safety factors considering various heat treatment methods. Comparisons of this analytical model against previously published computational approaches are also presented.
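The torque-sharing step of such a model can be illustrated compactly. A minimal sketch assuming perfectly equal load sharing among the teeth (the published model additionally resolves misalignment effects and bending and contact stresses; all numbers below are illustrative):

```python
def spline_tooth_load(torque, pitch_radius, n_teeth, face_width, tooth_thickness):
    """Tangential force per tooth and mean tooth shear stress for a spline
    coupling, assuming perfectly equal load sharing among the teeth."""
    f_tooth = torque / (pitch_radius * n_teeth)     # N per tooth
    tau = f_tooth / (face_width * tooth_thickness)  # mean shear stress, Pa
    return f_tooth, tau

# Illustrative numbers: 1 MN*m driving torque, 40 teeth, 0.25 m pitch radius
f, tau = spline_tooth_load(1.0e6, 0.25, 40, face_width=0.12, tooth_thickness=0.02)
print(f"{f/1e3:.0f} kN per tooth, tau = {tau/1e6:.1f} MPa")
```

Shaft misalignment concentrates load on a subset of teeth, which is precisely the effect the analytical model quantifies beyond this equal-sharing baseline.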
Decreased pain sensitivity due to trimethylbenzene exposure ...
Traditionally, human health risk assessments have relied on qualitative approaches for hazard identification, often using the Hill criteria and weight of evidence determinations to integrate data from multiple studies. Recently, the National Research Council has recommended the development of quantitative approaches for evidence integration, including the application of meta-analyses. The following hazard identification case study applies qualitative as well as meta-analytic approaches to trimethylbenzene (TMB) isomers exposure and the potential neurotoxic effects on pain sensitivity. In the meta-analytic approach, a pooled effect size is calculated, after consideration of multiple confounding factors, in order to determine whether the entire database under consideration indicates that TMBs are likely to be a neurotoxic hazard. The pain sensitivity studies included in the present analyses initially seem discordant in their results: effects on pain sensitivity are seen immediately after termination of exposure, appear to resolve 24 hours after exposure, and then reappear 50 days later following foot-shock. Qualitative consideration of toxicological and toxicokinetic characteristics of the TMB isomers suggests that the observed differences between studies are due to testing time and can be explained through a complete consideration of the underlying biology of the effect and the nervous system as a whole. Meta-analyses and meta-regressions support this conclusion.
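The pooling step described above follows standard inverse-variance weighting. A minimal fixed-effect sketch (the effect sizes and variances below are placeholders, not the TMB data):

```python
import numpy as np

def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling of standardized effect sizes."""
    w = 1.0 / np.asarray(variances)
    d = np.asarray(effects)
    pooled = np.sum(w * d) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Placeholder effect sizes (Hedges' g) and variances from three hypothetical studies
g, ci = pooled_effect([0.42, 0.55, 0.31], [0.04, 0.06, 0.05])
print(f"pooled g = {g:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

A meta-regression extends this by modeling the effect sizes against covariates such as testing time, which is how the apparent discordance between studies can be tested formally.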
Envisioning the future of polymer therapeutics for brain disorders.
Rodriguez-Otormin, Fernanda; Duro-Castano, Aroa; Conejos-Sánchez, Inmaculada; Vicent, María J
2018-06-14
The growing incidence of brain-related pathologies and the problems that undermine the development of efficient and effective treatments have prompted both researchers and the pharmaceutical industry to search for novel therapeutic alternatives. Polymer therapeutics (PT) display properties well suited to the treatment of neuro-related disorders, which help to overcome the many hidden obstacles on the journey to the central nervous system (CNS). The inherent features of PT, derived from drug(s) conjugation, in parallel with the progress in synthesis and analytical methods, the increasing knowledge in molecular basis of diseases, and collected clinical data through the last four decades, have driven the translation from "bench to bedside" for various biomedical applications. However, since the approval of Gliadel® wafers, little progress has been made in the CNS field, even though brain targeting represents an ever-growing challenge. A thorough assessment of the steps required for successful brain delivery via different administration routes and the consideration of the disease-specific hallmarks are essential to progress in the field. Within this review, we hope to summarize the latest developments, successes, and failures and discuss considerations on designs and strategies for PT in the treatment of CNS disorders. This article is categorized under: Therapeutic Approaches and Drug Discovery > Nanomedicine for Neurological Disease Nanotechnology Approaches to Biology > Nanoscale Systems in Biology Therapeutic Approaches and Drug Discovery > Nanomedicine for Oncologic Disease. © 2018 Wiley Periodicals, Inc.
CHALLENGES IN BIODEGRADATION OF TRACE ORGANIC CONTAMINANTS-GASOLINE OXYGENATES AND SEX HORMONES
Advances in analytical methods have led to the identification of several classes of organic chemicals that are associated with adverse environmental impacts. Two such classes of organic chemicals, gasoline oxygenates and sex hormones, are used to illustrate challenges associated ...
Assessing EDCs in the Field: Challenges and New Approaches
Assessing the occurrence and effects of EDCs in the environment can be challenging from a number of perspectives. For example, conventional analytical approaches and/or toxicity tests may not be appropriate to detecting very potent chemicals that impact specific pathways, and oft...
Effects of hot and cold temperature exposure on performance : a meta-analytic review
DOT National Transportation Integrated Search
2002-01-01
Adjusting to and working under hot or cold temperatures has long been a challenge for people living under immoderate weather conditions. In spite of the ability in industrialized societies to control indoor temperatures, a similar challenge continues...
Swarm intelligence metaheuristics for enhanced data analysis and optimization.
Hanrahan, Grady
2011-09-21
The swarm intelligence (SI) computing paradigm has proven itself as a comprehensive means of solving complicated analytical chemistry problems by emulating biologically-inspired processes. As global optimum search metaheuristics, associated algorithms have been widely used in training neural networks, function optimization, prediction and classification, and in a variety of process-based analytical applications. The goal of this review is to provide readers with critical insight into the utility of swarm intelligence tools as methods for solving complex chemical problems. Consideration will be given to algorithm development, ease of implementation and model performance, detailing subsequent influences on a number of application areas in the analytical, bioanalytical and detection sciences.
Cullinan, David B; Hondrogiannis, George; Henderson, Terry J
2008-04-15
Two-dimensional 1H-13C HSQC (heteronuclear single quantum correlation) and fast-HMQC (heteronuclear multiple quantum correlation) pulse sequences were implemented using a sensitivity-enhanced, cryogenic probehead for detecting compounds relevant to the Chemical Weapons Convention present in complex mixtures. The resulting methods demonstrated exceptional sensitivity for detecting the analytes at trace level concentrations. 1H-13C correlations of target analytes at ≤25 µg/mL were easily detected in a sample where the 1H solvent signal was approximately 58,000-fold more intense than the analyte 1H signals. The problem of overlapping signals typically observed in conventional 1H spectroscopy was essentially eliminated, while 1H and 13C chemical shift information could be derived quickly and simultaneously from the resulting spectra. The fast-HMQC pulse sequences generated magnitude mode spectra suitable for detailed analysis in approximately 4.5 h and can be used in experiments to efficiently screen a large number of samples. The HSQC pulse sequences, on the other hand, required roughly twice the data acquisition time to produce suitable spectra. These spectra, however, were phase-sensitive, contained considerably more resolution in both dimensions, and proved to be superior for detecting analyte 1H-13C correlations. Furthermore, a HSQC spectrum collected with a multiplicity-edited pulse sequence provided additional structural information valuable for identifying target analytes. The HSQC pulse sequences are ideal for collecting high-quality data sets with overnight acquisitions and logically follow the use of fast-HMQC pulse sequences to rapidly screen samples for potential target analytes. Use of the pulse sequences considerably improves the performance of NMR spectroscopy as a complementary technique for the screening, identification, and validation of chemical warfare agents and other small-molecule analytes present in complex mixtures and environmental samples.
ERIC Educational Resources Information Center
Lee, Alwyn Vwen Yen; Tan, Seng Chee
2017-01-01
Understanding ideas in a discourse is challenging, especially in textual discourse analysis. We propose using temporal analytics with unsupervised machine learning techniques to investigate promising ideas for the collective advancement of communal knowledge in an online knowledge building discourse. A discourse unit network was constructed and…
The NMR analysis of frying oil: a very reliable method for assessment of lipid oxidation
USDA-ARS?s Scientific Manuscript database
There are many analytical methods developed for the assessment of lipid oxidation. However, one of the most challenging issues in analyzing oil oxidation is that there is lack of consistency in results obtained from different analytical methods. The major reason for the inconsistency is that most me...
ERIC Educational Resources Information Center
Šulíková, Jana
2016-01-01
Purpose: This article proposes an analytical framework that helps to identify and challenge misconceptions of ethnocentrism found in pre-tertiary teaching resources for history and the social sciences in numerous countries. Design: Drawing on nationalism studies, the analytical framework employs ideas known under the umbrella terms of…
Challenges of Using Learning Analytics Techniques to Support Mobile Learning
ERIC Educational Resources Information Center
Arrigo, Marco; Fulantelli, Giovanni; Taibi, Davide
2015-01-01
Evaluation of Mobile Learning remains an open research issue, especially as regards the activities that take place outside the classroom. In this context, Learning Analytics can provide answers, and offer the appropriate tools to enhance Mobile Learning experiences. In this poster we introduce a task-interaction framework, using learning analytics…
ERIC Educational Resources Information Center
Tomasik, Janice Hall; LeCaptain, Dale; Murphy, Sarah; Martin, Mary; Knight, Rachel M.; Harke, Maureen A.; Burke, Ryan; Beck, Kara; Acevedo-Polakovich, I. David
2014-01-01
Motivating students in analytical chemistry can be challenging, in part because of the complexity and breadth of topics involved. Some methods that help encourage students and convey real-world relevancy of the material include incorporating environmental issues, research-based lab experiments, and service learning projects. In this paper, we…
Business Analytics in the Marketing Curriculum: A Call for Integration
ERIC Educational Resources Information Center
Mintu-Wimsatt, Alma; Lozada, Héctor R.
2018-01-01
Marketing education has responded, to some extent, to the academic challenges emanating from the Big Data revolution. To provide a forum to specifically discuss how business analytics has been integrated into the marketing curriculum, we developed a Special Issue for "Marketing Education Review." We start with a call to action that…
How Predictive Analytics and Choice Architecture Can Improve Student Success
ERIC Educational Resources Information Center
Denley, Tristan
2014-01-01
This article explores the challenges that students face in navigating the curricular structure of post-secondary degree programs, and how predictive analytics and choice architecture can play a role. It examines Degree Compass, a course recommendation system that successfully pairs current students with the courses that best fit their talents and…
Among the challenges of characterizing emerging contaminants in complex environmental matrices (e.g., biosolids, sewage, or wastewater) are the co-eluting interferences. For example, surfactants, fats, and humic acids, can be preferentially ionized instead of the analyte(s) of in...
ERIC Educational Resources Information Center
Reinholz, Daniel L.; Shah, Niral
2018-01-01
Equity in mathematics classroom discourse is a pressing concern, but analyzing issues of equity using observational tools remains a challenge. In this article, we propose equity analytics as a quantitative approach to analyzing aspects of equity and inequity in classrooms. We introduce a classroom observation tool that focuses on relatively…
The Skinny on Big Data in Education: Learning Analytics Simplified
ERIC Educational Resources Information Center
Reyes, Jacqueleen A.
2015-01-01
This paper examines the current state of learning analytics (LA), its stakeholders and the benefits and challenges these stakeholders face. LA is a field of research that involves the gathering, analyzing and reporting of data related to learners and their environments with the purpose of optimizing the learning experience. Stakeholders in LA are…
Special Issue: Learning Analytics in Higher Education
ERIC Educational Resources Information Center
Lester, Jaime; Klein, Carrie; Rangwala, Huzefa; Johri, Aditya
2017-01-01
The purpose of this monograph is to give readers a practical and theoretical foundation in learning analytics in higher education, including an understanding of the challenges and incentives that are present in the institution, in the individual, and in the technologies themselves. Among questions that are explored and answered are: (1) What are…
The Challenge of Developing a Universal Case Conceptualization for Functional Analytic Psychotherapy
ERIC Educational Resources Information Center
Bonow, Jordan T.; Maragakis, Alexandros; Follette, William C.
2012-01-01
Functional Analytic Psychotherapy (FAP) targets a client's interpersonal behavior for change with the goal of improving his or her quality of life. One question guiding FAP case conceptualization is, "What interpersonal behavioral repertoires will allow a specific client to function optimally?" Previous FAP writings have suggested that a therapist…
78 FR 46325 - Pacific Fishery Management Council (Pacific Council); Public Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-31
... Groundfish Subcommittee teleconference is to discuss analytical approaches for a meta-analysis of... development of analyses used to inform proxy F_MSY harvest rates for consideration by the Pacific Council's...
Mullins, C Daniel; Sanchez, Robert J
2011-01-01
The Patient Protection and Affordable Care Act brought considerable attention to comparative effectiveness research (CER). To (a) suggest best practices for conducting and reporting CER using "real-world data" (RWD), (b) describe some of the data and infrastructure requirements for conducting CER using RWD, (c) identify statistical challenges with the analysis of nonrandomized studies and suggest appropriate techniques to address those challenges, (d) recognize the value of patient-reported outcomes in CER, (e) encourage the incorporation of observational data into randomized controlled studies, and (f) highlight the importance of incorporating payers in industry-sponsored research. The first article in this supplement, "Something old, something new…" provides a policy perspective on the recent evolution of CER. It reviews the historical context, discusses the "promise and fear" of CER, and then describes the new role of the Patient-Centered Outcomes Research Institute (PCORI) in defining and sponsoring CER. The second paper, "Ten Commandments," proposes a series of tenets for planning, conducting, and reporting CER done with RWD. Oriented for basic-to-intermediate researchers, it combines standard scientific research principles with considerations specific to nonrandomized, RWD studies. The third article, "Infrastructure Requirements," points out that effective use of secondary data requires addressing major methodological and infrastructural issues, including development of analytical tools to readily access and analyze data, formulation of guidelines to enhance quality and transparency, establishment of data standards, and creation of data warehouses that respect the privacy and confidentiality of patients. It identifies gaps that must be filled to address the underlying issues, with emphasis on data standards, data quality assurance, data warehouses, computing environment, and protection of privacy and confidentiality. The fourth paper, "Statistical Issues," discusses how the validity of analytic results from observational studies is adversely impacted by biases that may be introduced due to lack of randomization. It reviews some of the methodological challenges that arise in the analysis of data from nonrandomized studies, with particular emphasis on the limitations of traditional approaches and potential solutions from recent methodological developments. The fifth paper, "Considerations on the Use of Patient Reported Outcomes (PROs)," describes how PRO data can play a critical role in guiding patients, health care providers, payers, and policy makers in making informed decisions regarding patient-centered treatment from among alternative options and technologies and have been noted as such by PCORI. However, collection and interpretation of such data within the context of CER have not yet been fully established. It discusses some challenges with including PROs in CER initiatives, provides a framework for their effective use, and proposes several areas for future research. Lastly, "Developing a Collaborative Study Protocol…" indicates that there is the potential, the desire, and the capability for payers to be involved in CER studies, combining elements of their own observational data with prospective studies. It describes a case example of a payer, a pharmaceutical company, and a research organization collaborating on a prospective study to examine the effect of prior authorization for pregabalin on health care costs to the payer. 
Researchers at Pfizer routinely conduct CER-type studies. In this supplement, we have proposed some approaches that we believe are useful in developing certain kinds of evidence and have described some of our experiences. Our experiences also make us acutely aware of the limitations of approaches and data sources that have been used for CER studies and suggest that there is a need to further develop methods that are most useful for answering CER questions.
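As a concrete illustration of the methods alluded to in the "Statistical Issues" paper, confounding in nonrandomized RWD studies is commonly addressed with propensity scores. A minimal inverse-probability-of-treatment-weighting sketch using scikit-learn; the variable names and estimator choice are illustrative, not taken from the supplement:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def iptw_effect(X, treated, outcome):
    """Estimate an average treatment effect with stabilized IPT weights.

    X       : (n, p) array of baseline confounders
    treated : (n,) 0/1 treatment indicator
    outcome : (n,) observed outcome
    """
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    p_treat = treated.mean()
    # Stabilized weights reduce variance relative to plain 1/ps weights
    w = np.where(treated == 1, p_treat / ps, (1 - p_treat) / (1 - ps))
    mu1 = np.average(outcome[treated == 1], weights=w[treated == 1])
    mu0 = np.average(outcome[treated == 0], weights=w[treated == 0])
    return mu1 - mu0
```

Weighting balances measured confounders across treatment groups, but, as the supplement stresses, it cannot correct for unmeasured confounding; sensitivity analyses remain necessary.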
TENTACLE Multi-Camera Immersive Surveillance System Phase 2
2015-04-16
successful in solving the most challenging video analytics problems and taking the advanced research concepts into working systems for end-users in both... commercial, space and military applications. Notable successes include winning the DARPA Urban Challenge, software autonomy to guide the NASA robots (Spirit... challenging urban environments. CMU is developing a scalable and extensible architecture, improving search/pursuit/tracking capabilities, and addressing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Beaver, Justin M; BogenII, Paul L.
In this paper, we introduce a new visual analytics system, called Matisse, that allows exploration of global trends in textual information streams with specific application to social media platforms. Despite the potential for real-time situational awareness using these services, interactive analysis of such semi-structured textual information is a challenge due to the high-throughput and high-velocity properties. Matisse addresses these challenges through the following contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) inferential temporal, geospatial, and term-frequency visualizations, and (4) a flexible drill-down interaction scheme that progresses from macroscale to microscale views. In addition to describing these contributions, our work-in-progress paper concludes with a practical case study focused on the analysis of Twitter 1% sample stream information captured during the week of the Boston Marathon bombings.
NASA Astrophysics Data System (ADS)
Altun, F.; Birdal, F.
2012-12-01
In this study, a 1:3 scaled, three-storey, FRP (Fiber Reinforced Polymer) retrofitted reinforced concrete model structure, whose behaviour and crack development had been identified experimentally in the laboratory, was investigated analytically. Experimental determination of structural behaviour under earthquake load is only possible at reduced scale in a laboratory environment, since structural experiments are difficult to carry out given the large number of parameters to be evaluated and the expensive laboratory setup required. In the analytical study, the structure was modelled using the ANSYS Finite Element Package Program (2007), and its behaviour and crack development were characterized. When the experimental difficulties are taken into consideration, analytical investigation of structural behaviour is more economical and much faster. At the end of the study, experimental results for structural behaviour and crack development were compared with the analytical data. It was concluded that, for a model structure retrofitted with FRP, the behaviour and cracking pattern can be determined without testing, provided that the points where analytical results do not converge with experimental data are identified and explained. The study thus enables a better analytical understanding of structural behaviour.
Mathematical model of polyethylene pipe bending stress state
NASA Astrophysics Data System (ADS)
Serebrennikov, Anatoly; Serebrennikov, Daniil
2018-03-01
Introduction of new machines and new technologies of polyethylene pipeline installation is usually based on the polyethylene pipe flexibility. It is necessary that the existing bending stresses do not lead to irreversible deformation of the polyethylene pipe or to violation of its strength characteristics. The derivation of a mathematical model that allows the bending stress level of polyethylene pipes to be calculated analytically, with nonlinear characteristics taken into account, is presented below. All analytical calculations made with the mathematical model were confirmed experimentally.
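The linear-elastic starting point of such a model is the kinematic relation between bend radius and extreme-fibre strain. A minimal sketch (the nonlinear material characteristics of the published model are not reproduced, and the example values are illustrative):

```python
def pipe_bending_stress(outer_diameter, bend_radius, e_modulus):
    """Linear-elastic estimate of the maximum bending stress in a bent pipe.

    The extreme fibre strain of a pipe of outer diameter D bent to radius R
    is eps = D / (2 R); Hooke's law then gives sigma = E * D / (2 R).
    A nonlinear material model replaces Hooke's law in the full treatment.
    """
    strain = outer_diameter / (2.0 * bend_radius)
    return e_modulus * strain

# 110 mm PE pipe bent to R = 2.75 m (25 D), E ~ 0.9 GPa (illustrative short-term modulus)
sigma = pipe_bending_stress(0.110, 2.75, 900e6)
print(f"max bending stress ~ {sigma/1e6:.1f} MPa")
```

Because polyethylene creeps and its modulus is strongly time and temperature dependent, the linear estimate mainly serves to bound the nonlinear calculation.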
NASA Astrophysics Data System (ADS)
Jivkov, Venelin S.; Zahariev, Evtim V.
2016-12-01
The paper presents a geometrical approach to the dynamics simulation of a rigid and flexible system, composed of a high-speed rotating machine with eccentricity and considerable inertia and mass. The machine is mounted on a vertical flexible pillar of considerable height. The stiffness and damping of the column, as well as of the rotor bearings and the shaft, are taken into account. Non-stationary vibrations and transitional processes are analyzed. The major frequency and mode shape of the flexible column are used for analytical reduction of its mass, stiffness and damping properties. The rotor and the foundation are modelled as rigid bodies, while the flexibility of the bearings is estimated from experiments and the manufacturer's specifications. The transition effects resulting from limited power are analyzed by asymptotic averaging methods. Analytical expressions for the amplitudes and unstable vibrations through resonance are derived by a quasi-static approach for increasing and decreasing exciting frequency. The analytical functions make it possible to analyze the influence of the design parameters of many structural applications such as wind power generators, gas turbines, and turbo-generators. A numerical procedure is applied to verify the effectiveness and precision of the simulation process. Nonlinear and transitional effects are analyzed and compared to the analytical results. External excitations, such as wave propagation and earthquakes, are discussed. Finite elements in relative and absolute coordinates are applied to model the flexible column and the high-speed rotating machine. Generalized Newton-Euler dynamics equations are used to derive the precise dynamics equations. Examples of simulation of the system vibrations and non-stationary behaviour are presented.
Xiao, Cao; Choi, Edward; Sun, Jimeng
2018-06-08
To conduct a systematic review of deep learning models for electronic health record (EHR) data, and illustrate various deep learning architectures for analyzing different data sources and their target applications. We also highlight ongoing research and identify open challenges in building deep learning models of EHRs. We searched PubMed and Google Scholar for papers on deep learning studies using EHR data published between January 1, 2010, and January 31, 2018. We summarize them according to these axes: types of analytics tasks, types of deep learning model architectures, special challenges arising from health data and tasks and their potential solutions, as well as evaluation strategies. We surveyed and analyzed multiple aspects of the 98 articles we found and identified the following analytics tasks: disease detection/classification, sequential prediction of clinical events, concept embedding, data augmentation, and EHR data privacy. We then studied how deep architectures were applied to these tasks. We also discussed some special challenges arising from modeling EHR data and reviewed a few popular approaches. Finally, we summarized how performance evaluations were conducted for each task. Despite the early success in using deep learning for health analytics applications, there still exist a number of issues to be addressed. We discuss them in detail including data and label availability, the interpretability and transparency of the model, and ease of deployment.
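One of the analytics tasks surveyed, sequential prediction of clinical events, is commonly cast as a recurrent model over sequences of coded visits. A minimal PyTorch sketch of that pattern (architecture, names, and dimensions are illustrative, not taken from any reviewed model):

```python
import torch
import torch.nn as nn

class VisitGRU(nn.Module):
    """Sequential prediction of a clinical event from a coded visit history."""

    def __init__(self, n_codes, embed_dim=128, hidden_dim=128):
        super().__init__()
        # Each visit is a bag of medical codes, averaged into one vector
        self.embed = nn.EmbeddingBag(n_codes, embed_dim, mode="mean")
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, visits):
        # visits: list of LongTensors, one tensor of code indices per visit
        visit_vecs = torch.stack([self.embed(v.unsqueeze(0)) for v in visits], dim=1)
        _, h = self.gru(visit_vecs)              # h: (1, 1, hidden_dim)
        return torch.sigmoid(self.head(h[-1]))   # probability of the event

# Toy usage: vocabulary of 500 codes, a patient with three visits
model = VisitGRU(n_codes=500)
history = [torch.tensor([5, 42, 7]), torch.tensor([42]), torch.tensor([13, 99])]
print(model(history))
```

The interpretability and deployment issues raised in the review apply directly to models of this shape: the learned visit embeddings are not self-explanatory, which motivates the attention-based variants discussed in the surveyed literature.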
Challenges and Opportunities of Big Data in Health Care: A Systematic Review.
Kruse, Clemens Scott; Goswamy, Rishi; Raval, Yesha; Marawi, Sarah
2016-11-21
Big data analytics offers promise in many business sectors, and health care is looking at big data to provide answers to many age-related issues, particularly dementia and chronic disease management. The purpose of this review was to summarize the challenges faced by big data analytics and the opportunities that big data opens in health care. A total of 3 searches were performed for publications between January 1, 2010 and January 1, 2016 (PubMed/MEDLINE, CINAHL, and Google Scholar), and an assessment was made on content germane to big data in health care. From the results of the searches in research databases and Google Scholar (N=28), the authors summarized content and identified 9 and 14 themes under the categories Challenges and Opportunities, respectively. We rank-ordered and analyzed the themes based on the frequency of occurrence. The top challenges were issues of data structure, security, data standardization, storage and transfers, and managerial skills such as data governance. The top opportunities revealed were quality improvement, population management and health, early detection of disease, data quality, structure, and accessibility, improved decision making, and cost reduction. Big data analytics has the potential for positive impact and global implications; however, it must overcome some legitimate obstacles. ©Clemens Scott Kruse, Rishi Goswamy, Yesha Raval, Sarah Marawi. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 21.11.2016.
Matisse: A Visual Analytics System for Exploring Emotion Trends in Social Media Text Streams
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steed, Chad A; Drouhard, Margaret MEG G; Beaver, Justin M
Dynamically mining textual information streams to gain real-time situational awareness is especially challenging with social media systems where throughput and velocity properties push the limits of a static analytical approach. In this paper, we describe an interactive visual analytics system, called Matisse, that aids with the discovery and investigation of trends in streaming text. Matisse addresses the challenges inherent to text stream mining through the following technical contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) interactive coordinated visualizations, and (4) a flexible drill-down interaction scheme that accesses multiple levels of detail. In addition to positive/negative sentiment prediction, Matisse provides fine-grained emotion classification based on Valence, Arousal, and Dominance dimensions and a novel machine learning process. Information from the sentiment/emotion analytics is fused with raw data and summary information to feed temporal, geospatial, term frequency, and scatterplot visualizations using a multi-scale, coordinated interaction model. After describing these techniques, we conclude with a practical case study focused on analyzing the Twitter sample stream during the week of the 2013 Boston Marathon bombings. The case study demonstrates the effectiveness of Matisse at providing guided situational awareness of significant trends in social media streams by orchestrating computational power and human cognition.
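The Valence-Arousal-Dominance scoring idea can be conveyed with a lexicon-based stand-in. A minimal sketch (the mini-lexicon below is invented for illustration; Matisse itself uses a trained machine learning process, not a lookup table):

```python
# Hypothetical mini-lexicon: word -> (valence, arousal, dominance), each on [1, 9]
VAD_LEXICON = {
    "happy":     (8.2, 6.5, 7.0),
    "afraid":    (2.0, 6.9, 3.1),
    "calm":      (6.9, 2.4, 6.0),
    "explosion": (2.1, 7.9, 3.5),
}

def score_message(text):
    """Average VAD coordinates over the words found in the lexicon."""
    hits = [VAD_LEXICON[w] for w in text.lower().split() if w in VAD_LEXICON]
    if not hits:
        return None  # no emotional signal detected
    n = len(hits)
    return tuple(sum(dim) / n for dim in zip(*hits))

print(score_message("So afraid after the explosion"))  # -> (2.05, 7.4, 3.3)
```

Per-message VAD coordinates of this kind are what feed the temporal and geospatial views: low-valence, high-arousal clusters are exactly the signature visible during an event like the Marathon bombings.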
Peek, N; Holmes, J H; Sun, J
2014-08-15
To review technical and methodological challenges for big data research in biomedicine and health. We discuss sources of big datasets, survey infrastructures for big data storage and big data processing, and describe the main challenges that arise when analyzing big data. The life and biomedical sciences are massively contributing to the big data revolution through secondary use of data that were collected during routine care and through new data sources such as social media. Efficient processing of big datasets is typically achieved by distributing computation over a cluster of computers. Data analysts should be aware of pitfalls related to big data such as bias in routine care data and the risk of false-positive findings in high-dimensional datasets. The major challenge for the near future is to transform analytical methods that are used in the biomedical and health domain, to fit the distributed storage and processing model that is required to handle big data, while ensuring confidentiality of the data being analyzed.
Exascale computing and big data
Reed, Daniel A.; Dongarra, Jack
2015-06-25
Scientific discovery and engineering innovation require unifying traditionally separated high-performance computing and big data analytics. The tools and cultures of high-performance computing and big data analytics have diverged, to the detriment of both; unification is essential to address a spectrum of major research domains. The challenges of scale tax our ability to transmit data, compute complicated functions on that data, or store a substantial part of it; new approaches are required to meet these challenges. Finally, the international nature of science demands further development of advanced computer architectures and global standards for processing data, even as international competition complicates the openness of the scientific process.
Dinov, Ivo D
2016-01-01
Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and their hallmark will be 'team science'.
Evolution of accelerometer methods for physical activity research.
Troiano, Richard P; McClain, James J; Brychta, Robert J; Chen, Kong Y
2014-07-01
The technology and application of current accelerometer-based devices in physical activity (PA) research allow the capture and storage or transmission of large volumes of raw acceleration signal data. These rich data not only provide opportunities to improve PA characterisation, but also bring logistical and analytic challenges. We discuss how researchers and developers from multiple disciplines are responding to the analytic challenges and how advances in data storage, transmission and big data computing will minimise logistical challenges. These new approaches also bring the need for several paradigm shifts for PA researchers, including a shift from count-based approaches and regression calibrations for PA energy expenditure (PAEE) estimation to activity characterisation and EE estimation based on features extracted from raw acceleration signals. Furthermore, a collaborative approach towards analytic methods is proposed to facilitate PA research, which requires a shift away from multiple independent calibration studies. Finally, we make the case for a distinction between PA represented by accelerometer-based devices and PA assessed by self-report. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
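One widely used feature extracted from raw acceleration signals, ENMO (Euclidean norm minus one), illustrates the shift away from count-based processing. A minimal sketch, assuming a calibrated tri-axial signal in units of g (sampling rate and epoch length are illustrative choices):

```python
import numpy as np

def enmo_features(acc, fs=100, epoch_s=5):
    """Per-epoch ENMO from a raw tri-axial acceleration signal.

    acc : (n, 3) array in units of g; fs : sampling rate in Hz.
    ENMO = max(|a| - 1, 0), i.e. the vector magnitude with the static
    1 g gravity component removed and negative values clipped.
    """
    vm = np.linalg.norm(acc, axis=1)        # vector magnitude, in g
    enmo = np.maximum(vm - 1.0, 0.0)        # subtract gravity, clip at zero
    n_epochs = len(enmo) // (fs * epoch_s)
    trimmed = enmo[: n_epochs * fs * epoch_s]
    return trimmed.reshape(n_epochs, -1).mean(axis=1)  # mean ENMO per epoch
```

Feature sets built this way, rather than proprietary counts, are what make the collaborative, device-independent calibration approach proposed above feasible.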
Theoretical Principles of Distance Education.
ERIC Educational Resources Information Center
Keegan, Desmond, Ed.
This book contains the following papers examining the didactic, academic, analytic, philosophical, and technological underpinnings of distance education: "Introduction"; "Quality and Access in Distance Education: Theoretical Considerations" (D. Randy Garrison); "Theory of Transactional Distance" (Michael G. Moore);…
Srinivas, Nuggehally R
2006-05-01
The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches including various techniques, detection systems, automation tools that are available for an effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select disease areas and/or in clinically important drug-drug interaction studies. A tabular representation of select examples of analysis is provided covering areas of separation conditions, validation aspects and applicable conclusion. A limited discussion is provided on relevant aspects of the need for developing bioanalytical procedures for speedy drug discovery and development. Additionally, some key elements such as internal standard selection, likely issues of mass detection, matrix effect, chiral aspects etc. are provided for consideration during method development.
An interlaboratory transfer of a multi-analyte assay between continents.
Georgiou, Alexandra; Dong, Kelly; Hughes, Stephen; Barfield, Matthew
2015-01-01
Alex has worked at GlaxoSmithKline for the past 15 years and currently works within the bioanalytical and toxicokinetic group in the United Kingdom. Alex's role in previous years has been the in-house support of preclinical and clinical bioanalysis, from method development through to sample analysis activities, as well as acting as PI for GLP bioanalysis and toxicokinetics. For the past two years, Alex has applied this analytical and regulatory experience to focus on the outsourcing of preclinical bioanalysis, toxicokinetics and clinical bioanalysis, working closely with multiple bioanalytical and in-life CRO partners worldwide. Alex works to support DMPK and Safety Assessment outsourcing activities for GSK across multiple therapeutic areas, from the first GLP study through to late stage clinical PK studies. Transfer and cross-validation of an existing analytical assay between a laboratory providing current analytical support and a laboratory needed for new or additional support can present the bioanalyst with numerous challenges. These challenges can be technical or logistical in nature and may prove to be significant when transferring an assay between laboratories in different continents. Part of GlaxoSmithKline's strategy to improve confidence in providing quality data is to cross-validate between laboratories. If the cross-validation fails predefined acceptance criteria, then a subsequent investigation would follow. This may also prove to be challenging. The importance of thorough planning and good communication throughout assay transfer, cross-validation and any subsequent investigations is illustrated in this case study.
ERIC Educational Resources Information Center
Graudins, Maija M.; Rehfeldt, Ruth Anne; DeMattei, Ronda; Baker, Jonathan C.; Scaglia, Fiorella
2012-01-01
Performing oral care procedures with children with autism who exhibit noncompliance can be challenging for oral care professionals. Previous research has elucidated a number of effective behavior analytic procedures for increasing compliance, but some procedures are likely to be too time consuming and expensive for community-based oral care…
ERIC Educational Resources Information Center
Macfadyen, Leah P.; Dawson, Shane; Pardo, Abelardo; Gaševic, Dragan
2014-01-01
In the new era of big educational data, learning analytics (LA) offer the possibility of implementing real-time assessment and feedback systems and processes at scale that are focused on improvement of learning, development of self-regulated learning skills, and student success. However, to realize this promise, the necessary shifts in the…
Integrating Bio-Inorganic and Analytical Chemistry into an Undergraduate Biochemistry Laboratory
ERIC Educational Resources Information Center
Erasmus, Daniel J.; Brewer, Sharon E.; Cinel, Bruno
2015-01-01
Undergraduate laboratories expose students to a wide variety of topics and techniques in a limited amount of time. This can be a challenge and lead to less exposure to concepts and activities in bio-inorganic chemistry and analytical chemistry that are closely-related to biochemistry. To address this, we incorporated a new iron determination by…
Dialogue as Data in Learning Analytics for Productive Educational Dialogue
ERIC Educational Resources Information Center
Knight, Simon; Littleton, Karen
2015-01-01
This paper provides a novel, conceptually driven stance on the state of the contemporary analytic challenges faced in the treatment of dialogue as a form of data across on- and offline sites of learning. In prior research, preliminary steps have been taken to detect occurrences of such dialogue using automated analysis techniques. Such advances…
ERIC Educational Resources Information Center
Avella, John T.; Kebritchi, Mansureh; Nunn, Sandra G.; Kanai, Therese
2016-01-01
Higher education for the 21st century continues to promote discoveries in the field through learning analytics (LA). The problem is that the rapid embrace of LA diverts educators' attention from clearly identifying requirements and implications of using LA in higher education. LA is a promising emerging field, yet higher education stakeholders…
Statistical Challenges in "Big Data" Human Neuroimaging.
Smith, Stephen M; Nichols, Thomas E
2018-01-17
Smith and Nichols discuss "big data" human neuroimaging studies, with very large subject numbers and amounts of data. These studies provide great opportunities for making new discoveries about the brain but raise many new analytical challenges and interpretational risks. Copyright © 2017 Elsevier Inc. All rights reserved.
USDA-ARS?s Scientific Manuscript database
The correspondence titled “Analytical challenges in the assessment of NO synthesis from L-arginine in the MELAS syndrome” suggested challenges that can limit the utility of stable isotope infusion methodology in assessing NO production....
VAST Challenge 2015: Mayhem at Dinofun World
2015-10-25
adult, teen, child, and infant. Accordingly, teens "raced" through the park, children stayed close to their adults, and so on. The challenge... advance visual analytics through a series of competitions. In the VAST Challenges, researchers and software developers put themselves in the role... developers specified several group types based on their personal experiences with groups in amusement parks. As each person travelled through the park
What values in design? The challenge of incorporating moral values into design.
Manders-Huits, Noëmi
2011-06-01
Recently, there has been increased attention to the integration of moral values into the conception, design, and development of emerging IT. The most reviewed approach for this purpose in ethics and technology so far is Value-Sensitive Design (VSD). This article considers VSD as the prime candidate for implementing normative considerations into design. Its methodology is considered from a conceptual, analytical, normative perspective. The focus here is on the suitability of VSD for integrating moral values into the design of technologies in a way that joins in with an analytical perspective on ethics of technology. Despite its promising character, it turns out that VSD falls short in several respects: (1) VSD does not have a clear methodology for identifying stakeholders, (2) the integration of empirical methods with conceptual research within the methodology of VSD is obscure, (3) VSD runs the risk of committing the naturalistic fallacy when using empirical knowledge for implementing values in design, (4) the concept of values, as well as their realization, is left undetermined and (5) VSD lacks a complementary or explicit ethical theory for dealing with value trade-offs. For the normative evaluation of a technology, I claim that an explicit and justified ethical starting point or principle is required. Moreover, explicit attention should be given to the value aims and assumptions of a particular design. The criteria of adequacy for such an approach or methodology follow from the evaluation of VSD as the prime candidate for implementing moral values in design.
Analytical model for force prediction when machining metal matrix composites
NASA Astrophysics Data System (ADS)
Sikder, Snahungshu
Metal Matrix Composites (MMC) offer several thermo-mechanical advantages over standard materials and alloys, which make them better candidates in different applications. Their light weight, high stiffness, and strength have attracted several industries, such as automotive, aerospace, and defence, for a wide range of products. However, the widespread application of Metal Matrix Composites is still a challenge for industry. The hard and abrasive nature of the reinforcement particles is responsible for rapid tool wear and high machining costs. Fracture and debonding of the abrasive reinforcement particles are the principal damage modes that directly influence tool performance. It is therefore important to find highly effective ways to machine MMCs, and to predict the forces generated during machining, because this helps in choosing suitable tools and ultimately saves both money and time. This research presents an analytical force model for predicting the forces generated during machining of Metal Matrix Composites. In estimating the generated forces, several aspects of cutting mechanics were considered, including shearing force, ploughing force, and particle fracture force. The chip formation force was obtained from classical orthogonal metal cutting mechanics and the Johnson-Cook equation. The ploughing force was formulated analytically, while the fracture force was calculated from slip line field theory and the Griffith theory of failure. The predicted results were compared with previously measured data and showed very good agreement between the theoretically predicted and experimentally measured cutting forces.
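The additive structure of such a model, a chip-formation (shearing) term driven by Johnson-Cook flow stress plus ploughing and particle-fracture terms, can be sketched as follows. All material coefficients and the placeholder force terms are illustrative, not the values fitted in this work:

```python
import numpy as np

def johnson_cook_stress(strain, strain_rate, temp, A, B, n, C, m,
                        t_room=293.0, t_melt=858.0, rate_ref=1.0):
    """Johnson-Cook flow stress of the matrix material, in Pa."""
    t_star = (temp - t_room) / (t_melt - t_room)
    return ((A + B * strain ** n)
            * (1.0 + C * np.log(strain_rate / rate_ref))
            * (1.0 - t_star ** m))

def shearing_force(flow_stress, feed, width, shear_angle):
    """Chip-formation force from classical orthogonal cutting mechanics:
    shear-plane area = (uncut chip thickness * width) / sin(phi)."""
    tau_s = flow_stress / np.sqrt(3.0)   # von Mises shear flow stress
    return tau_s * feed * width / np.sin(shear_angle)

# Illustrative numbers for an Al-matrix MMC (all coefficients are placeholders)
sigma = johnson_cook_stress(1.2, 1e4, 450.0, A=3.24e8, B=1.14e8,
                            n=0.42, C=0.002, m=1.34)
f_total = (shearing_force(sigma, feed=0.1e-3, width=2e-3,
                          shear_angle=np.radians(30))
           + 15.0    # ploughing force, N (placeholder)
           + 8.0)    # particle fracture force, N (placeholder)
print(f"predicted cutting force ~ {f_total:.0f} N")
```

The ploughing and fracture terms are constants here only for brevity; in the model they are functions of edge radius, particle size, and volume fraction.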
Kim, Sang-Bog; Roche, Jennifer
2013-08-01
Organically bound tritium (OBT) is an important tritium species that can be measured in most environmental samples, but has only recently been recognized as a species of tritium in these samples. Currently, OBT is not routinely measured by environmental monitoring laboratories around the world. There are no certified reference materials (CRMs) for environmental samples. Thus, quality assurance (QA), or verification of the accuracy of the OBT measurement, is not possible. Alternatively, quality control (QC), or verification of the precision of the OBT measurement, can be achieved. In the past, there have been differences in OBT analysis results between environmental laboratories. A possible reason for the discrepancies may be differences in analytical methods. Therefore, inter-laboratory OBT comparisons among the environmental laboratories are important and would provide a good opportunity for adopting a reference OBT analytical procedure. Due to the analytical issues, only limited information is available on OBT measurement. Previously conducted OBT inter-laboratory practices are reviewed and the findings are described. Based on our experiences, a few considerations were suggested for the international OBT inter-laboratory comparison exercise to be completed in the near future. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.
Farzanehfar, Vahid; Faizi, Mehrdad; Naderi, Nima; Kobarfard, Farzad
2017-01-01
Dibutyl phthalate (DBP) is a phthalic acid ester and is widely used in polymeric products to make them more flexible. DBP is found in almost every plastic material and is believed to be persistent in the environment. Various analytical methods have been used to measure DBP in different matrices. Considering the ubiquitous nature of DBP, the most important challenge in DBP analyses is the contamination of even analytical grade organic solvents with this compound and the lack of a true blank matrix with which to construct the calibration line. Using the standard addition method or artificial matrices reduces the precision and accuracy of the results. In this study, a surrogate analyte approach based on using a deuterium labeled analyte (DBP-d4) to construct the calibration line was applied to determine DBP in hexane samples. PMID:28496469
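In code, the surrogate analyte approach amounts to calibrating on the labeled compound, which is guaranteed absent from real samples, and converting native-analyte peak areas through a response factor. A minimal sketch with illustrative numbers:

```python
import numpy as np

# Calibration: peak areas of spiked DBP-d4 at known concentrations (ng/mL)
conc_d4 = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
area_d4 = np.array([1.1e4, 5.4e4, 1.09e5, 5.5e5, 1.08e6])
slope, intercept = np.polyfit(conc_d4, area_d4, 1)

# Response factor: native-DBP vs DBP-d4 sensitivity, measured once from
# authentic standards in neat solution (illustrative value)
rf = 0.97

def quantify_dbp(area_native):
    """Concentration of native DBP from its peak area via the d4 calibration."""
    return (area_native / rf - intercept) / slope

print(f"{quantify_dbp(2.2e5):.1f} ng/mL")
```

Because the calibrant is chemically near-identical to the analyte but mass-resolved from it, the ubiquitous background DBP in solvents no longer biases the calibration line itself.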
Maier, Barbara; Vogeser, Michael
2013-04-01
Isotope dilution LC-MS/MS methods used in the clinical laboratory typically involve multi-point external calibration in each analytical series. Our aim was to test the hypothesis that determination of target analyte concentrations directly derived from the relation of the target analyte peak area to the peak area of a corresponding stable isotope labelled internal standard compound [direct isotope dilution analysis (DIDA)] may be not inferior to conventional external calibration with respect to accuracy and reproducibility. Quality control samples and human serum pools were analysed in a comparative validation protocol for cortisol as an exemplary analyte by LC-MS/MS. Accuracy and reproducibility were compared between quantification either involving a six-point external calibration function, or a result calculation merely based on peak area ratios of unlabelled and labelled analyte. Both quantification approaches resulted in similar accuracy and reproducibility. For specified analytes, reliable analyte quantification directly derived from the ratio of peak areas of labelled and unlabelled analyte without the need for a time consuming multi-point calibration series is possible. This DIDA approach is of considerable practical importance for the application of LC-MS/MS in the clinical laboratory where short turnaround times often have high priority.
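The DIDA calculation itself reduces to a ratio. A minimal worked sketch (values are illustrative):

```python
def dida_concentration(area_analyte, area_is, conc_is, response_ratio=1.0):
    """Direct isotope dilution: c_analyte = (A_analyte / A_IS) * c_IS / RR.

    conc_is        : concentration of the stable isotope labelled internal
                     standard in the prepared sample
    response_ratio : relative response of unlabelled vs labelled compound
                     (1.0 if they ionize identically)
    """
    return (area_analyte / area_is) * conc_is / response_ratio

# Cortisol example: IS spiked at 100 nmol/L, analyte/IS area ratio of 1.43
print(dida_concentration(area_analyte=7.15e5, area_is=5.0e5, conc_is=100.0))
# -> 143.0 nmol/L
```

The practical gain reported above follows directly: once the response ratio is established during validation, each run needs only the spiked internal standard, not a fresh multi-point calibration series.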
Flexural waves induced by electro-impulse deicing forces
NASA Technical Reports Server (NTRS)
Gien, P. H.
1990-01-01
The generation, reflection and propagation of flexural waves created by electroimpulsive deicing forces are demonstrated both experimentally and analytically in a thin circular plate and a thin semicylindrical shell. Analytical prediction of these waves with finite element models shows good correlation with acceleration and displacement measurements at discrete points on the structures studied. However, sensitivity to spurious flexural waves resulting from the spatial discretization of the structures is shown to be significant. Consideration is also given to composite structures as an extension of these studies.
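The mesh sensitivity noted above is tied to the dispersive nature of flexural waves: phase speed grows as the square root of frequency, so the short, high-frequency wavelengths are the first to be corrupted by spatial discretization. A minimal thin-plate (Kirchhoff) dispersion sketch with generic aluminum properties:

```python
import numpy as np

def flexural_phase_speed(freq_hz, thickness, e_mod, density, nu):
    """Thin (Kirchhoff) plate flexural phase speed:
    c_p = (D * omega^2 / (rho * h))**0.25, D = E h^3 / (12 (1 - nu^2))."""
    omega = 2.0 * np.pi * freq_hz
    d_stiff = e_mod * thickness**3 / (12.0 * (1.0 - nu**2))
    return (d_stiff * omega**2 / (density * thickness)) ** 0.25

# 2 mm aluminum plate: c_p scales as sqrt(f), so it doubles per 4x in frequency
for f in (1e3, 4e3, 16e3):
    c = flexural_phase_speed(f, 2e-3, 70e9, 2700.0, 0.33)
    print(f"{f/1e3:5.0f} kHz: c_p = {c:6.1f} m/s")
```

A finite element mesh must resolve the shortest flexural wavelength of interest with several elements, which is why spurious waves appear once the impulsive loading injects frequency content above the mesh's resolving limit.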
Considerations for monitoring raptor population trends based on counts of migrants
Titus, K.; Fuller, M.R.; Ruos, J.L.; Meyburg, B-U.; Chancellor, R.D.
1989-01-01
Various problems were identified with standardized hawk count data as annually collected at six sites. Some of the hawk lookouts increased their hours of observation from 1979-1985, thereby confounding the total counts. Data recording practices and missing data hamper the coding of the data and their use with modern analytical techniques. Coefficients of variation among years in counts averaged about 40%. The advantages and disadvantages of various analytical techniques are discussed, including regression, non-parametric rank correlation trend analysis, and moving averages.
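The non-parametric rank-correlation option weighed above can be made concrete with a Mann-Kendall test and a Theil-Sen slope. A minimal sketch (the annual counts are placeholders, not the six-site data):

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(counts):
    """Mann-Kendall S statistic, normal-approximation p-value, Theil-Sen slope."""
    x = np.asarray(counts, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0         # variance assuming no ties
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))                   # two-sided p-value
    slopes = [(x[j] - x[i]) / (j - i)
              for i in range(n - 1) for j in range(i + 1, n)]
    return s, p, np.median(slopes)                   # median pairwise slope

# Placeholder annual migration counts over ten seasons
print(mann_kendall([612, 580, 655, 540, 500, 528, 471, 490, 445, 430]))
```

Such rank-based tests tolerate the high year-to-year variability (CV near 40%) better than ordinary regression, though neither corrects for the changing observation effort noted above.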
Krauser, Joel A
2013-01-01
Tritium (3H) and carbon-14 (14C) labels applied in pharmaceutical research and development each offer their own distinctive advantages and disadvantages coupled with benefits and risks. The advantages of 3H include a higher specific activity, a shorter half-life that allows more manageable waste remediation, lower material costs, and often more direct synthetic routes. The advantages of 14C include certain analytical benefits and less potential for label loss. Although 3H labels offer several advantages, they might be overlooked as a viable option because of concerns about their drawbacks. A main drawback often raised is metabolic liability. These drawbacks, in some cases, might be overstated, leading to underutilization of a perfectly viable option. As a consequence, label selection may automatically default to 14C, which is the more conservative approach. To challenge this '14C-by-default' approach, pharmaceutical agents with strategically selected 3H-labeling positions based on non-labeled metabolism data have been successfully implemented and evaluated for 3H loss. From in-house results, the long term success of projects clearly would benefit from a thorough, objective, and balanced assessment regarding label selection (3H or 14C). This assessment should be based on available project information and scientific knowledge. Important considerations are project applicability (preclinical and clinical phases), synthetic feasibility, costs, and timelines. Copyright © 2013 John Wiley & Sons, Ltd.
Aeronautical record : no. 1 (to June, 1923)
NASA Technical Reports Server (NTRS)
1923-01-01
"...considerations have prompted us to pay special attention to the development of aeronautical industries and aerial navigation as a commercial enterprise and to publish an analytical review of events in the aeronautical world and of the attendant problems."
Peristalsis of nonconstant viscosity Jeffrey fluid with nanoparticles
NASA Astrophysics Data System (ADS)
Alvi, N.; Latif, T.; Hussain, Q.; Asghar, S.
Mixed convective peristaltic activity of variable viscosity nanofluids is addressed. Unlike the conventional assumption of constant viscosity, the viscosity is taken as temperature dependent. Constitutive relations for a linear viscoelastic Jeffrey fluid are employed, and a uniform magnetic field is applied in the transverse direction. For nanofluids, the formulation accounts for Brownian motion, thermophoresis, viscous dissipation and Joule heating. Treating the viscosity as temperature dependent is not merely a modeling choice but a realistic requirement imposed by the wall temperature and the heat generated by viscous dissipation. Well-established large-wavelength and small-Reynolds-number approximations are invoked. The non-linear coupled system is solved analytically for convergent series solutions, with the interval of convergence identified explicitly. A comparative study between the analytical and numerical solutions is made for verification. The influence of the parameters used to describe the problem is pointed out and the underlying physics explained.
Design of a high speed business transport
NASA Technical Reports Server (NTRS)
1990-01-01
The design of a High Speed Business Transport (HSBT) was considered by the Aeronautical Design Class during the academic year 1989 to 1990. The project was chosen to offer an opportunity to develop user friendliness for some computer codes such as WAVE DRAG, supplied by NASA/Langley, and to experiment with several design lessons developed by Dr. John McMasters and his colleagues at Boeing. Central to these design lessons was an appeal to marketing and feasibility considerations. There was an emphasis upon simplified analytical techniques to study trades and to stimulate creative thinking before committing to extensive analytical activity. Two designs stood out among all the rest because of the depth of thought and consideration of alternatives. One design, the Aurora, used a fixed wing design to satisfy the design mission; the Viero used a swept wing configuration to overcome problems related to supersonic flight. A summary of each of these two designs is given.
Afterword: Considerations for Future Practice of Assessment and Accountability
ERIC Educational Resources Information Center
Bresciani, Marilee J.
2013-01-01
This afterword offers challenges and considerations as the assessment movement continues to develop. The author offers some simple considerations for readers to ponder as they advance their evidence-based decision making processes, and encourages others to use these methods within the context of recent neuroscientific evidence that learning and…
Applying Pragmatics Principles for Interaction with Visual Analytics.
Hoque, Enamul; Setlur, Vidya; Tory, Melanie; Dykeman, Isaac
2018-01-01
Interactive visual data analysis is most productive when users can focus on answering the questions they have about their data, rather than focusing on how to operate the interface to the analysis tool. One viable approach to engaging users in interactive conversations with their data is a natural language interface to visualizations. These interfaces have the potential to be both more expressive and more accessible than other interaction paradigms. We explore how principles from language pragmatics can be applied to the flow of visual analytical conversations, using natural language as an input modality. We evaluate the effectiveness of pragmatics support in our system Evizeon, and present design considerations for conversation interfaces to visual analytics tools.
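As a minimal sketch of pragmatic context carry-over in a visual-analytics conversation, in the spirit of the principles the paper explores; all names, utterances, and rules here are hypothetical illustrations, not the Evizeon implementation:

```python
# Minimal sketch: elliptical follow-up utterances inherit context from
# earlier utterances, a core pragmatics principle in conversation.
from dataclasses import dataclass, field, replace
from typing import Optional

@dataclass(frozen=True)
class VizState:
    """Current visualization state: active data filters and encodings."""
    filters: dict = field(default_factory=dict)
    color_by: Optional[str] = None

def interpret(utterance: str, state: VizState) -> VizState:
    """Resolve an utterance against the conversational context.

    A full command establishes state; an elliptical follow-up is merged
    into the existing state rather than replacing it.
    """
    text = utterance.lower()
    new = state
    if "california" in text:
        new = replace(new, filters={**new.filters, "region": "California"})
    if "magnitude over 5" in text:
        new = replace(new, filters={**new.filters, "magnitude": "> 5"})
    if "color by depth" in text:
        new = replace(new, color_by="depth")
    return new

state = VizState()
for utt in ["show earthquakes in California",
            "just magnitude over 5",   # ellipsis: region filter carries over
            "color by depth"]:         # deixis: modifies the current view
    state = interpret(utt, state)
    print(f"{utt!r} -> {state}")
```

Each follow-up modifies, rather than resets, the analytic state, which is what lets the user stay focused on questions instead of interface mechanics.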
Application of Hamilton's law of varying action
NASA Technical Reports Server (NTRS)
Bailey, C. D.
1975-01-01
The law of varying action enunciated by Hamilton in 1834-1835 permits the direct analytical solution of the problems of mechanics, both stationary and nonstationary, without consideration of force equilibrium and the theory of differential equations associated therewith. It has not been possible to obtain direct analytical solutions to nonstationary systems through the use of energy theory, which has been limited for 140 years to the principle of least action and to Hamilton's principle. It is shown here that Hamilton's law permits the direct analytical solution to nonstationary, initial value systems in the mechanics of solids without any knowledge or use of the theory of differential equations. Solutions are demonstrated for nonconservative, nonstationary particle motion, both linear and nonlinear.
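For context, the standard statement of Hamilton's law of varying action (a generic form, not quoted from the paper) for generalized coordinates q_i reads

\[
\int_{t_1}^{t_2} \left( \delta T + \delta W \right) dt \;-\; \left[ \frac{\partial T}{\partial \dot{q}_i}\, \delta q_i \right]_{t_1}^{t_2} = 0,
\]

where T is the kinetic energy and δW is the virtual work of all conservative and nonconservative forces. The direct analytical solutions referred to in the abstract follow from substituting a trial expansion such as q_i(t) = Σ_k a_{ik} t^k and varying the coefficients a_{ik}, which converts the law into a set of algebraic equations for the a_{ik} with no differential equations solved along the way.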
Analytical treatment of particle motion in circularly polarized slab-mode wave fields
NASA Astrophysics Data System (ADS)
Schreiner, Cedric; Vainio, Rami; Spanier, Felix
2018-02-01
Wave-particle interaction is a key process in particle diffusion in collisionless plasmas. We look into the interaction of single plasma waves with individual particles and discuss under which circumstances this is a chaotic process, leading to diffusion. We derive the equations of motion for a particle in the fields of a magnetostatic, circularly polarized, monochromatic wave and show that no chaotic particle motion can arise under such circumstances. A novel and exact analytic solution for the equations is presented. Additional plasma waves lead to a breakdown of the analytic solution and chaotic particle trajectories become possible. We demonstrate this effect by considering a linearly polarized, monochromatic wave, which can be seen as the superposition of two circularly polarized waves. Test particle simulations are provided to illustrate and expand our analytical considerations.
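As a hedged sketch of the setup (the generic magnetostatic model, not reproduced from the paper): for a circularly polarized wave of amplitude δB and wavenumber k superposed on a uniform background field B₀, the field and the equation of motion are

\[
\mathbf{B} = B_0\, \hat{\mathbf{e}}_z + \delta B \left[ \cos(kz)\, \hat{\mathbf{e}}_x \pm \sin(kz)\, \hat{\mathbf{e}}_y \right], \qquad m\, \frac{d\mathbf{v}}{dt} = q\, \mathbf{v} \times \mathbf{B},
\]

with no wave electric field in the magnetostatic limit. Because the magnetic force does no work, |v| is a constant of the motion, consistent with the regular (non-chaotic) trajectories found for a single wave.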
Coorssen, Jens R; Yergey, Alfred L
2015-12-03
Molecular mechanisms underlying health and disease function at least in part based on the flexibility and fine-tuning afforded by protein isoforms and post-translational modifications. The ability to effectively and consistently resolve these protein species or proteoforms, as well as assess quantitative changes is therefore central to proteomic analyses. Here we discuss the pros and cons of currently available and developing analytical techniques from the perspective of the full spectrum of available tools and their current applications, emphasizing the concept of fitness-for-purpose in experimental design based on consideration of sample size and complexity; this necessarily also addresses analytical reproducibility and its variance. Data quality is considered the primary criterion, and we thus emphasize that the standards of Analytical Chemistry must apply throughout any proteomic analysis.
NASA Astrophysics Data System (ADS)
Schnase, J. L.; Duffy, D.; Tamkin, G. S.; Nadeau, D.; Thompson, J. H.; Grieg, C. M.; McInerney, M.; Webster, W. P.
2013-12-01
Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
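As an illustrative sketch only (invented data layout and function names; this is not the MERRA/AS API), the data-proximal MapReduce pattern the abstract describes reduces to a map step that emits partial sums per (variable, month) key and a reduce step that combines them:

```python
# Illustrative MapReduce-style sketch of "data-proximal" climate analytics.
# The block layout and values are hypothetical, not MERRA data.
blocks = [
    {"var": "T2M", "month": "2010-01", "values": [250.1, 251.3, 249.8]},
    {"var": "T2M", "month": "2010-01", "values": [252.0, 250.5]},
    {"var": "T2M", "month": "2010-02", "values": [253.2, 254.0, 252.9]},
]

def map_block(block):
    """Map: emit (key, (sum, count)) so blocks reduce independently."""
    key = (block["var"], block["month"])
    return key, (sum(block["values"]), len(block["values"]))

def reduce_pair(a, b):
    """Reduce: combine partial (sum, count) pairs."""
    return a[0] + b[0], a[1] + b[1]

partials = {}
for key, sc in map(map_block, blocks):
    partials[key] = reduce_pair(partials[key], sc) if key in partials else sc

averages = {key: s / n for key, (s, n) in partials.items()}
print(averages)  # {('T2M', '2010-01'): 250.74, ('T2M', '2010-02'): 253.366...}
```

Because each map step touches only its own block, computation can be pushed to wherever the data live, which is the point of data-proximal analytics.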
NASA Technical Reports Server (NTRS)
Schnase, John L.; Duffy, Daniel Quinn; Tamkin, Glenn S.; Nadeau, Denis; Thompson, John H.; Grieg, Christina M.; McInerney, Mark A.; Webster, William P.
2014-01-01
Climate science is a Big Data domain that is experiencing unprecedented growth. In our efforts to address the Big Data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with Big Data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by Cloud Computing. Within this framework, Cloud Computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics as a service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the Big Data challenges in this domain. MERRA Analytic Services (MERRA/AS) is an example of cloud-enabled CAaaS built on this principle. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of 26 key climate variables. It represents a type of data product that is of growing importance to scientists doing climate change research and a wide range of decision support applications. MERRA/AS brings together the following generative elements in a full, end-to-end demonstration of CAaaS capabilities: (1) high-performance, data proximal analytics, (2) scalable data management, (3) software appliance virtualization, (4) adaptive analytics, and (5) a domain-harmonized API. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, Cloud Computing lowers the barriers and risk to organizational change, fosters innovation and experimentation, facilitates technology transfer, and provides the agility required to meet our customers' increasing and changing needs. Cloud Computing is providing a new tier in the data services stack that helps connect earthbound, enterprise-level data and computational resources to new customers and new mobility-driven applications and modes of work. For climate science, Cloud Computing's capacity to engage communities in the construction of new capabilities is perhaps the most important link between Cloud Computing and Big Data.
Big Data Goes Personal: Privacy and Social Challenges
ERIC Educational Resources Information Center
Bonomi, Luca
2015-01-01
The Big Data phenomenon is posing new challenges in our modern society. In addition to requiring information systems to effectively manage high-dimensional and complex data, the privacy and social implications associated with the data collection, data analytics, and service requirements create new important research problems. First, the high…
NASA Technical Reports Server (NTRS)
Kohl, F. J.; Leisz, D. M.; Fryburg, G. C.; Stearns, C. A.
1977-01-01
Equilibrium thermochemical analyses are employed to describe the vaporization processes of metals and metal oxides upon exposure to molecular and atomic oxygen. Specific analytic results for the chromium-, platinum-, aluminum-, and silicon-oxygen systems are presented. Maximum rates of oxidative vaporization predicted from the thermochemical considerations are compared with experimental results for chromium and platinum. The oxidative vaporization rates of chromium and platinum are considerably enhanced by oxygen atoms.
Big Data Analytics for Genomic Medicine
He, Karen Y.; Ge, Dongliang; He, Max M.
2017-01-01
Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients’ genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights through examining large-scale various data sets. While integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure exhibit challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we also present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs. PMID:28212287
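A hedged sketch of one elementary step such a toolset might include, filtering NGS variant records against an actionability list; the gene lists, thresholds, and field names here are invented for illustration:

```python
# Hedged sketch: keep rare, adequately covered, potentially damaging
# variants in genes on an actionable list. Data and lists are invented.
variants = [
    {"gene": "EGFR",  "effect": "missense",   "af": 0.0001, "depth": 140},
    {"gene": "BRCA1", "effect": "frameshift", "af": 0.0,    "depth": 95},
    {"gene": "TTN",   "effect": "synonymous", "af": 0.12,   "depth": 60},
]

ACTIONABLE_GENES = {"EGFR", "BRCA1", "BRCA2", "TP53"}  # illustrative list
DAMAGING = {"missense", "frameshift", "nonsense", "splice_site"}

def is_candidate(v, max_af=0.01, min_depth=30):
    """Rare (population allele frequency), well covered, damaging, actionable."""
    return (v["gene"] in ACTIONABLE_GENES
            and v["effect"] in DAMAGING
            and v["af"] <= max_af
            and v["depth"] >= min_depth)

print([v["gene"] for v in variants if is_candidate(v)])  # ['EGFR', 'BRCA1']
```

Real pipelines layer many more annotations (clinical databases, EHR phenotypes) on top of this basic filter, but the pattern of successive, criterion-based narrowing is the same.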
Big Data Analytics for Genomic Medicine.
He, Karen Y; Ge, Dongliang; He, Max M
2017-02-15
Genomic medicine attempts to build individualized strategies for diagnostic or therapeutic decision-making by utilizing patients' genomic information. Big Data analytics uncovers hidden patterns, unknown correlations, and other insights through examining large-scale various data sets. While integration and manipulation of diverse genomic data and comprehensive electronic health records (EHRs) on a Big Data infrastructure exhibit challenges, they also provide a feasible opportunity to develop an efficient and effective approach to identify clinically actionable genetic variants for individualized diagnosis and therapy. In this paper, we review the challenges of manipulating large-scale next-generation sequencing (NGS) data and diverse clinical data derived from the EHRs for genomic medicine. We introduce possible solutions for different challenges in manipulating, managing, and analyzing genomic and clinical data to implement genomic medicine. Additionally, we also present a practical Big Data toolset for identifying clinically actionable genetic variants using high-throughput NGS data and EHRs.
Assessing organizational change in multisector community health alliances.
Alexander, Jeffrey A; Hearld, Larry R; Shi, Yunfeng
2015-02-01
The purpose of this article was to identify some common organizational features of multisector health care alliances (MHCAs) and the analytic challenges presented by those characteristics in assessing organizational change. Two rounds of an Internet-based survey of participants in 14 MHCAs. We highlight three analytic challenges that can arise when quantitatively studying the organizational characteristics of MHCAs-assessing change in MHCA organization, assessment of construct reliability, and aggregation of individual responses to reflect organizational characteristics. We illustrate these issues using a leadership effectiveness scale (12 items) validated in previous research and data from 14 MHCAs participating in the Robert Wood Johnson Foundation's Aligning Forces for Quality (AF4Q) program. High levels of instability and turnover in MHCA membership create challenges in using survey data to study changes in key organizational characteristics of MHCAs. We offer several recommendations to diagnose the source and extent of these problems. © Health Research and Educational Trust.
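As a hedged illustration of two of the analytic steps named above, construct reliability and aggregation of individual responses to the organizational level, using invented data rather than the AF4Q leadership scale:

```python
# Hedged illustration: Cronbach's alpha for a multi-item scale, then
# aggregation of individual responses to the alliance (MHCA) level.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x scale-items matrix of ratings."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
base = rng.normal(4.0, 1.0, size=(30, 1))                     # 30 respondents
ratings = np.clip(base + rng.normal(0, 0.5, (30, 12)), 1, 5)  # 12-item scale

alpha = cronbach_alpha(ratings)          # construct reliability
org_score = ratings.mean(axis=1).mean()  # aggregate to the MHCA level
print(f"alpha = {alpha:.2f}, alliance-level score = {org_score:.2f}")
```

The article's point is that steps like these become fragile when membership turns over between survey rounds, since the respondents being aggregated are no longer the same people.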
Challenges Facing Teachers New to Working in Schools Overseas
ERIC Educational Resources Information Center
Halicioglu, Margaret L.
2015-01-01
This article considers the potential challenges facing teachers moving abroad for the first time, both professional challenges in their school and personal challenges in their private life. It suggests that such teachers embarking on a professional adventure overseas would benefit from careful consideration of the kind of school they will thrive…
Warren, Alexander D; Conway, Ulric; Arthur, Christopher J; Gates, Paul J
2016-07-01
The analysis of low molecular weight compounds by matrix-assisted laser desorption/ionisation mass spectrometry is problematic due to the interference and suppression of analyte ionisation by the matrices typically employed, which are themselves low molecular weight compounds. The application of colloidal graphite is demonstrated here as an easy-to-use matrix that can promote the ionisation of a wide range of analytes, including low molecular weight organic compounds, complex natural products, and inorganic complexes. Analyte ionisation with colloidal graphite is compared with traditional organic matrices along with various other sources of graphite (e.g. graphite rods and charcoal pencils). Factors such as ease of application, spectral reproducibility, spot longevity, spot-to-spot reproducibility, and spot homogeneity (through single-spot imaging) are explored. For some analytes, considerable matrix suppression effects are observed, resulting in spectra completely devoid of matrix ions. We also report the observation of radical molecular ions [M]⁻• in the negative ion mode, particularly with some aromatic analytes. Copyright © 2016 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Mirel, Barbara; Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan
2016-02-01
Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students' visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students' successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules.
Kumar, Anuj; Nong, Paige; Su, Gang; Meng, Fan
2016-01-01
Life scientists increasingly use visual analytics to explore large data sets and generate hypotheses. Undergraduate biology majors should be learning these same methods. Yet visual analytics is one of the most underdeveloped areas of undergraduate biology education. This study sought to determine the feasibility of undergraduate biology majors conducting exploratory analysis using the same interactive data visualizations as practicing scientists. We examined 22 upper level undergraduates in a genomics course as they engaged in a case-based inquiry with an interactive heat map. We qualitatively and quantitatively analyzed students’ visual analytic behaviors, reasoning and outcomes to identify student performance patterns, commonly shared efficiencies and task completion. We analyzed students’ successes and difficulties in applying knowledge and skills relevant to the visual analytics case and related gaps in knowledge and skill to associated tool designs. Findings show that undergraduate engagement in visual analytics is feasible and could be further strengthened through tool usability improvements. We identify these improvements. We speculate, as well, on instructional considerations that our findings suggested may also enhance visual analytics in case-based modules. PMID:26877625
ERIC Educational Resources Information Center
Villavicencio, Adriana; Klevan, Sarah; Guidry, Brandon; Wulach, Suzanne
2014-01-01
This appendix describes the data collection and analytic processes used to develop the findings in the report "Promising Opportunities for Black and Latino Young Men." A central challenge was creating an analytic framework that could be uniformly applied to all schools, despite the individualized nature of their Expanded Success…
ERIC Educational Resources Information Center
Wymbs, Cliff
2016-01-01
The design of a new, potentially disruptive curricular program is not without challenges; however, it can be rewarding for students, faculty, and employers and serve as a template for other academics to follow. To be effective, the new data analytics program should be driven by business input and academic leadership that incorporates…
Analytical challenges for conducting rapid metabolism characterization for QIVIVE.
Tolonen, Ari; Pelkonen, Olavi
2015-06-05
For quantitative in vitro-in vivo extrapolation (QIVIVE) of metabolism for the purposes of toxicokinetics prediction, a precise and robust analytical technique for identifying and measuring a chemical and its metabolites is an absolute prerequisite. Currently, high-resolution mass spectrometry (HR-MS) is the tool of choice for the majority of relatively lipophilic organic molecules, linked with an LC separation tool and simultaneous UV detection. However, additional techniques, such as gas chromatography, radiometric measurements, and NMR, are required to cover the whole spectrum of chemical structures. To accumulate enough reliable and robust data for the validation of QIVIVE, two partially opposing needs must be met: a detailed delineation of the in vitro test system to produce a reliable toxicokinetic measure for a studied chemical, and the highest possible throughput capacity of the in vitro set-up and the analytical tool. We discuss current analytical challenges for the identification and quantification of chemicals and their metabolites, both stable and reactive, focusing especially on LC-MS techniques, while attempting to pinpoint factors associated with sample preparation, testing conditions, and the strengths and weaknesses of a particular technique available for a particular task. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
McInerney, M.; Schnase, J. L.; Duffy, D.; Tamkin, G.; Nadeau, D.; Strong, S.; Thompson, J. H.; Sinno, S.; Lazar, D.
2014-12-01
The climate sciences represent a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). We focus on analytics, because it is the knowledge gained from our interactions with big data that ultimately produces societal benefits. We focus on CAaaS because we believe it provides a useful way of thinking about the problem: a specialization of the concept of business process-as-a-service, which is an evolving extension of IaaS, PaaS, and SaaS enabled by cloud computing. Within this framework, cloud computing plays an important role; however, we see it as only one element in a constellation of capabilities that are essential to delivering climate analytics-as-a-service. These elements are essential because in the aggregate they lead to generativity, a capacity for self-assembly that we feel is the key to solving many of the big data challenges in this domain. This poster will highlight specific examples of CAaaS using climate reanalysis data, high-performance cloud computing, MapReduce, and the Climate Data Services API.
Modern data science for analytical chemical data - A comprehensive review.
Szymańska, Ewa
2018-10-22
Efficient and reliable analysis of chemical analytical data is a great challenge due to the increase in data size, variety, and velocity. New methodologies, approaches, and methods are being proposed not only by chemometrics but also by other data science communities to extract relevant information from big datasets and deliver value to different applications. Besides the common goal of big data analysis, different perspectives on and terms for big data are being discussed in the scientific literature and public media. The aim of this comprehensive review is to present common trends in the analysis of chemical analytical data across different data science fields, together with their data type-specific and generic challenges. Firstly, common data science terms used in different data science fields are summarized and discussed. Secondly, systematic methodologies to plan and run big data analysis projects are presented together with their steps. Moreover, different analysis aspects such as assessing data quality, selecting data pre-processing strategies, data visualization, and model validation are considered in more detail. Finally, an overview of standard and new data analysis methods is provided and their suitability for big analytical chemical datasets briefly discussed. Copyright © 2018 Elsevier B.V. All rights reserved.
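As a minimal, generic sketch of the workflow stages the review covers (pre-processing, modeling, and model validation), assuming scikit-learn and synthetic spectra rather than any dataset from the paper:

```python
# Minimal generic sketch of an analytical-chemistry data pipeline:
# pre-processing, modeling, and cross-validated evaluation (scikit-learn).
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 200))                      # 80 spectra x 200 channels
y = X[:, :5].sum(axis=1) + rng.normal(0, 0.1, 80)   # synthetic analyte level

pipe = Pipeline([
    ("scale", StandardScaler()),             # per-variable autoscaling
    ("pls", PLSRegression(n_components=5)),  # a chemometrics workhorse
])

# Model validation: cross-validated R^2 rather than a single train fit
scores = cross_val_score(pipe, X, y, cv=5, scoring="r2")
print(f"R2 = {scores.mean():.2f} +/- {scores.std():.2f}")
```

Wrapping scaling and regression in one pipeline keeps the pre-processing inside each cross-validation fold, which is exactly the kind of validation discipline the review emphasizes.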
A Holistic Management Architecture for Large-Scale Adaptive Networks
2007-09-01
transmission and processing overhead required for management. The challenges of building models to describe dynamic systems are well-known to the field of... increases the challenge of finding a simple approach to assessing the state of the network. Moreover, the performance state of one network link may be... challenging. These obstacles indicate the need for a less comprehensive-analytical, more systemic-holistic approach to managing networks. This approach might
Big Data Analytics in Medicine and Healthcare.
Ristevski, Blagoj; Chen, Ming
2018-05-10
This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics (value, volume, velocity, variety, veracity, and variability) are described. Big data analytics in medicine and healthcare covers the integration and analysis of large amounts of complex heterogeneous data, such as various omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data, and electronic health record data. We underline the challenging issues of big data privacy and security. Regarding the big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.
Pandemic Influenza and Pregnancy: An Opportunity to Reassess Maternal Bioethics
Beigi, Richard H.
2009-01-01
Large-scale infectious epidemics present the medical community with numerous medical and ethical challenges. Recent attention has focused on the likelihood of an impending influenza pandemic caused by the H5N1 virus. Pregnant women in particular present policymakers with great challenges to planning for such a public health emergency. By recognizing the specific considerations needed for this population, we can preemptively address the issues presented by infectious disease outbreaks. We reviewed the important ethical challenges presented by pregnant women and highlighted the considerations for all vulnerable groups when planning for a pandemic at both the local and the national level. PMID:19461111
DIVE: A Graph-based Visual Analytics Framework for Big Data
Rysavy, Steven J.; Bromley, Dennis; Daggett, Valerie
2014-01-01
The need for data-centric scientific tools is growing; domains like biology, chemistry, and physics are increasingly adopting computational approaches. As a result, scientists must now deal with the challenges of big data. To address these challenges, we built a visual analytics platform named DIVE: Data Intensive Visualization Engine. DIVE is a data-agnostic, ontologically-expressive software framework capable of streaming large datasets at interactive speeds. Here we present the technical details of the DIVE platform, multiple usage examples, and a case study from the Dynameomics molecular dynamics project. We specifically highlight our novel contributions to structured data model manipulation and high-throughput streaming of large, structured datasets. PMID:24808197
Big Data and Analytics in Healthcare.
Tan, S S-L; Gao, G; Koch, S
2015-01-01
This editorial is part of the Focus Theme of Methods of Information in Medicine on "Big Data and Analytics in Healthcare". The amount of data being generated in the healthcare industry is growing at a rapid rate. This has generated immense interest in leveraging the availability of healthcare data (and "big data") to improve health outcomes and reduce costs. However, the nature of healthcare data, and especially big data, presents unique challenges in processing and analyzing big data in healthcare. This Focus Theme aims to disseminate some novel approaches to address these challenges. More specifically, approaches ranging from efficient methods of processing large clinical data to predictive models that could generate better predictions from healthcare data are presented.
Sampling for Chemical Analysis.
ERIC Educational Resources Information Center
Kratochvil, Byron; And Others
1984-01-01
This review, designed to make analysts aware of uncertainties introduced into analytical measurements during sampling, is organized under these headings: general considerations; theory; standards; and applications related to mineralogy, soils, sediments, metallurgy, atmosphere, water, biology, agriculture and food, medical and clinical areas, oil…
Development of finite element models to predict dynamic bridge response.
DOT National Transportation Integrated Search
1997-10-01
Dynamic response has long been recognized as one of the significant factors affecting the service life and safety of bridge structures. Even though considerable research, both analytical and experimental, has been devoted to dynamic bridge behavior, ...
FHWA statistical program : a customer's guide to using highway statistics
DOT National Transportation Integrated Search
1995-08-01
The appropriate level of spatial and temporal data aggregation for highway vehicle emissions analyses is one of several important analytical questions that has received considerable interest following passage of the Clean Air Act Amendments (CAAA) of...
An Analytical-Numerical Model for Two-Phase Slug Flow through a Sudden Area Change in Microchannels
Momen, A. Mehdizadeh; Sherif, S. A.; Lear, W. E.
2016-01-01
In this article, two new analytical models have been developed to calculate two-phase slug flow pressure drop in microchannels through a sudden contraction. Even though many studies have been reported on two-phase flow in microchannels, considerable discrepancies still exist, mainly due to the difficulties in experimental setup and measurements. Numerical simulations were performed to support the new analytical models and to explore in more detail the physics of the flow in microchannels with a sudden contraction. Both analytical and numerical results were compared to the available experimental data and other empirical correlations. Results show that the models, which were developed based on the slug and semi-slug assumptions, agree well with experiments in microchannels. Moreover, in contrast to previous empirical correlations, which were tuned for a specific geometry, the new analytical models are capable of taking geometrical parameters as well as flow conditions into account.
De Neys, Wim
2006-06-01
Human reasoning has been shown to overly rely on intuitive, heuristic processing instead of a more demanding analytic inference process. Four experiments tested the central claim of current dual-process theories that analytic operations involve time-consuming executive processing whereas the heuristic system would operate automatically. Participants solved conjunction fallacy problems and indicative and deontic selection tasks. Experiment 1 established that making correct analytic inferences demanded more processing time than did making heuristic inferences. Experiment 2 showed that burdening the executive resources with an attention-demanding secondary task decreased correct, analytic responding and boosted the rate of conjunction fallacies and indicative matching card selections. Results were replicated in Experiments 3 and 4 with a different secondary-task procedure. Involvement of executive resources for the deontic selection task was less clear. Findings validate basic processing assumptions of the dual-process framework and complete the correlational research programme of K. E. Stanovich and R. F. West (2000).
The “2T” ion-electron semi-analytic shock solution for code-comparison with xRAGE: A report for FY16
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferguson, Jim Michael
2016-10-05
This report documents an effort to generate the semi-analytic "2T" ion-electron shock solution developed in the paper by Masser, Wohlbier, and Lowrie, and the initial attempts to understand how to use this solution as a code-verification tool for one of LANL's ASC codes, xRAGE. Most of the work so far has gone into generating the semi-analytic solution. Considerable effort will go into understanding how to write the xRAGE input deck that matches the boundary conditions imposed by the solution, and what physics models must be implemented within the semi-analytic solution itself to match the model assumptions inherent within xRAGE. Therefore, most of this report focuses on deriving the equations for the semi-analytic 1D-planar time-independent "2T" ion-electron shock solution, and is written in a style that is intended to provide clear guidance for anyone writing their own solver.
Kang, Youn-Ah; Stasko, J
2012-12-01
While the formal evaluation of systems in visual analytics is still relatively uncommon, particularly rare are case studies of prolonged system use by domain analysts working with their own data. Conducting case studies can be challenging, but it can be a particularly effective way to examine whether visual analytics systems are truly helping expert users to accomplish their goals. We studied the use of a visual analytics system for sensemaking tasks on documents by six analysts from a variety of domains. We describe their application of the system along with the benefits, issues, and problems that we uncovered. Findings from the studies identify features that visual analytics systems should emphasize as well as missing capabilities that should be addressed. These findings inform design implications for future systems.
Quo vadis, analytical chemistry?
Valcárcel, Miguel
2016-01-01
This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not entirely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, as well as more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.
Clinical laboratory analytics: Challenges and promise for an emerging discipline.
Shirts, Brian H; Jackson, Brian R; Baird, Geoffrey S; Baron, Jason M; Clements, Bryan; Grisson, Ricky; Hauser, Ronald George; Taylor, Julie R; Terrazas, Enrique; Brimhall, Brad
2015-01-01
The clinical laboratory is a major source of health care data. Increasingly these data are being integrated with other data to inform health system-wide actions meant to improve diagnostic test utilization, service efficiency, and "meaningful use." The Academy of Clinical Laboratory Physicians and Scientists hosted a satellite meeting on clinical laboratory analytics in conjunction with their annual meeting on May 29, 2014 in San Francisco. There were 80 registrants for the clinical laboratory analytics meeting. The meeting featured short presentations on current trends in clinical laboratory analytics and several panel discussions on data science in laboratory medicine, laboratory data and its role in the larger healthcare system, integrating laboratory analytics, and data sharing for collaborative analytics. One main goal of the meeting was to provide an open forum for leaders who work with the "big data" that clinical laboratories produce. This article summarizes the proceedings of the meeting and the content discussed.
Clinical laboratory analytics: Challenges and promise for an emerging discipline
Shirts, Brian H.; Jackson, Brian R.; Baird, Geoffrey S.; Baron, Jason M.; Clements, Bryan; Grisson, Ricky; Hauser, Ronald George; Taylor, Julie R.; Terrazas, Enrique; Brimhall, Brad
2015-01-01
The clinical laboratory is a major source of health care data. Increasingly these data are being integrated with other data to inform health system-wide actions meant to improve diagnostic test utilization, service efficiency, and “meaningful use.” The Academy of Clinical Laboratory Physicians and Scientists hosted a satellite meeting on clinical laboratory analytics in conjunction with their annual meeting on May 29, 2014 in San Francisco. There were 80 registrants for the clinical laboratory analytics meeting. The meeting featured short presentations on current trends in clinical laboratory analytics and several panel discussions on data science in laboratory medicine, laboratory data and its role in the larger healthcare system, integrating laboratory analytics, and data sharing for collaborative analytics. One main goal of the meeting was to provide an open forum for leaders who work with the “big data” that clinical laboratories produce. This article summarizes the proceedings of the meeting and the content discussed. PMID:25774320
Managing knowledge business intelligence: A cognitive analytic approach
NASA Astrophysics Data System (ADS)
Surbakti, Herison; Ta'a, Azman
2017-10-01
The purpose of this paper is to identify and analyze the integration of Knowledge Management (KM) and Business Intelligence (BI) for achieving competitive advantage in the context of intellectual capital. The methodology includes a review of the literature and an analysis of interview data from managers in the corporate sector, together with models established by different authors. BI technologies have a strong association with KM processes for attaining competitive advantage. KM is strongly influenced by human and social factors, which it turns into an organization's most valuable assets when run efficiently under BI tactics and technologies. Predictive analytics, in turn, is rooted in the field of BI. Extracting tacit knowledge for use as a new source for BI analysis is a major challenge. Advanced analytic methods that address the diversity of the data corpus - structured and unstructured - require a cognitive approach to provide estimative results and to yield actionable descriptive, predictive, and prescriptive results. This is a major current challenge, and this paper elaborates on it in detail as initial work.
Iliuk, Anton B.; Arrington, Justine V.; Tao, Weiguo Andy
2014-01-01
Phosphoproteomics is the systematic study of one of the most common protein modifications in high throughput, with the aim of providing detailed information on the control, response, and communication of biological systems in health and disease. Advances in analytical technologies and strategies, in particular the contributions of high-resolution mass spectrometers, efficient enrichment of phosphopeptides, and fast data acquisition and annotation, have catalyzed a dramatic expansion of signaling landscapes in multiple systems during the past decade. While phosphoproteomics is an essential inquiry for mapping high-resolution signaling networks and finding relevant events among the apparently ubiquitous and widespread modifications of the proteome, it presents tremendous challenges in the separation sciences for translation from discovery to clinical practice. In this mini-review, we summarize the analytical tools currently utilized for phosphoproteomic analysis (with a focus on MS), progress made in deciphering clinically relevant kinase-substrate networks, uses of MS for biomarker discovery and validation, and the potential of phosphoproteomics for disease diagnostics and personalized medicine. PMID:24890697
Using i2b2 to Bootstrap Rural Health Analytics and Learning Networks
Harris, Daniel R.; Baus, Adam D.; Harper, Tamela J.; Jarrett, Traci D.; Pollard, Cecil R.; Talbert, Jeffery C.
2017-01-01
We demonstrate that the open-source i2b2 (Informatics for Integrating Biology and the Bedside) data model can be used to bootstrap rural health analytics and learning networks. These networks promote communication and research initiatives by providing the infrastructure necessary for sharing data and insights across a group of healthcare and research partners. Data integration remains a crucial challenge in connecting rural healthcare sites with a common data sharing and learning network due to the lack of interoperability and standards within electronic health records. The i2b2 data model acts as a point of convergence for disparate data from multiple healthcare sites. A consistent and natural data model for healthcare data is essential for overcoming integration issues, but challenges such as those caused by weak data standardization must still be addressed. We describe our experience in the context of building the West Virginia/Kentucky Health Analytics and Learning Network, a collaborative, multi-state effort connecting rural healthcare sites. PMID:28261006
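A hedged sketch of why a shared data model acts as a point of convergence; the table and column names below follow the commonly documented i2b2 star-schema core (observation_fact keyed by patient and concept) but should be treated as illustrative:

```python
# Hedged sketch: heterogeneous site data converge on an i2b2-style
# observation_fact table, after which one query serves the whole network.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE observation_fact (
        patient_num  INTEGER,
        concept_cd   TEXT,    -- coded concept, e.g. LOINC-style lab code
        start_date   TEXT,
        nval_num     REAL     -- numeric value, if any
    )
""")

# Two "sites" with different source systems map into the same fact table
site_a = [(1, "LOINC:4548-4", "2016-01-10", 7.9)]   # HbA1c result
site_b = [(2, "LOINC:4548-4", "2016-02-03", 6.1)]
con.executemany("INSERT INTO observation_fact VALUES (?,?,?,?)",
                site_a + site_b)

# A single network-wide query over the converged data
for row in con.execute("""
        SELECT concept_cd, COUNT(DISTINCT patient_num), AVG(nval_num)
        FROM observation_fact GROUP BY concept_cd"""):
    print(row)  # ('LOINC:4548-4', 2, 7.0)
```

The hard part, as the abstract notes, is not the query but the mapping: weakly standardized source codes must be reconciled into concept_cd values before convergence pays off.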
Using i2b2 to Bootstrap Rural Health Analytics and Learning Networks.
Harris, Daniel R; Baus, Adam D; Harper, Tamela J; Jarrett, Traci D; Pollard, Cecil R; Talbert, Jeffery C
2016-08-01
We demonstrate that the open-source i2b2 (Informatics for Integrating Biology and the Bedside) data model can be used to bootstrap rural health analytics and learning networks. These networks promote communication and research initiatives by providing the infrastructure necessary for sharing data and insights across a group of healthcare and research partners. Data integration remains a crucial challenge in connecting rural healthcare sites with a common data sharing and learning network due to the lack of interoperability and standards within electronic health records. The i2b2 data model acts as a point of convergence for disparate data from multiple healthcare sites. A consistent and natural data model for healthcare data is essential for overcoming integration issues, but challenges such as those caused by weak data standardization must still be addressed. We describe our experience in the context of building the West Virginia/Kentucky Health Analytics and Learning Network, a collaborative, multi-state effort connecting rural healthcare sites.
Stamm, H; Gibson, N; Anklam, E
2012-08-01
This paper describes the requirements and resulting challenges for the implementation of current and upcoming European Union legislation referring to the use of nanomaterials in food, cosmetics and other consumer products. The European Commission has recently adopted a recommendation for the definition of nanomaterials. There is now an urgent need for appropriate and fit-for-purpose analytical methods in order to identify nanomaterials properly according to this definition and to assess whether or not a product contains nanomaterials. Considering the lack of such methods to date, this paper elaborates on the challenges of the legislative framework and the type of methods needed, not only to facilitate implementation of labelling requirements, but also to ensure the safety of products coming to the market. Considering the many challenges in the analytical process itself, such as interaction of nanoparticles with matrix constituents, potential agglomeration and aggregation due to matrix environment, broad variety of matrices, etc., there is a need for integrated analytical approaches, not only for sample preparation (e.g. separation from matrix), but also for the actual characterisation. Furthermore, there is an urgent need for quality assurance tools such as validated methods and (certified) reference materials, including materials containing nanoparticles in a realistic matrix (food products, cosmetics, etc.).
ERIC Educational Resources Information Center
Tisdall, E. Kay M.
2012-01-01
Childhood studies have argued for the social construction of childhood, respecting children and childhood in the present, and recognising children's agency and rights. Such perspectives have parallels to, and challenges for, disability studies. This article considers such parallels and challenges, leading to a (re)consideration of research claims…
Wilson, Michael G; Lavis, John N; Gauvin, Francois-Pierre
2015-03-11
There is currently no mechanism in place outside of government to provide rapid syntheses of the best available research evidence about problems, options and/or implementation considerations related to a specific health system challenge that Canadian health system decision-makers need to address in a timely manner. A 'rapid-response' program could address this gap by providing access to optimally packaged, relevant and high-quality research evidence over short periods of time (i.e., days or weeks). We prepared an issue brief that describes the best available research evidence related to the problem, three broad features of a program that addresses the problem, and implementation considerations. We identified systematic reviews by searching for organization-targeted implementation strategies in Health Systems Evidence (www.healthsystemsevidence.org) and drew on an existing analytical framework for how knowledge-brokering organizations can organize themselves to operationalize the program features. The issue brief was then used to inform a half-day stakeholder dialogue about whether and how to develop a rapid-response program for health system decision-makers in Canada. We thematically synthesized the deliberations. We found very few relevant systematic reviews but used frameworks and examples from existing programs to 1) outline key considerations for organizing a rapid-response program, 2) determine what can be done in timelines ranging from 3 to 10 and 30 business days, and 3) define success and measure it. The 11 dialogue participants from across Canada largely agreed with the content presented in the brief, but noted two key challenges to consider: securing stable, long-term funding and finding a way to effectively and equitably manage the expected demand. Recommendations and suggestions for next steps from dialogue participants included taking an 'organic' approach to developing a pan-Canadian network and including jurisdictional scans as a type of product to deliver through the program (rather than only syntheses of research evidence). Dialogue participants clearly signalled that there is an appetite for a rapid-response program for health system decision-makers in Canada. To 'organically' build such a program, we are currently engaging in efforts to build partnerships and secure funding to support the creation of a pan-Canadian network for conducting rapid syntheses for health system decision-makers in Canada.
"This Group of Difficult Kids": The Discourse Preservice English Teachers Use to Label Students
ERIC Educational Resources Information Center
Salerno, April S.; Kibler, Amanda K.
2016-01-01
This study attempts to understand how "achievement gap Discourse" might be present in preservice teachers' (PSTs) Discourse about students they found challenging to teach. Using a Discourse analytic approach, the project considers: How do PSTs describe challenging students in their written reflections? Do PSTs draw on students' multiple…
Of Radicals and DREAMers: Harnessing Exceptionality to Challenge Immigration Control
ERIC Educational Resources Information Center
Heredia, Luisa Laura
2015-01-01
This article contributes to the literature on undocumented youth activism and citizenship by assessing undocumented youth's challenges to a growing regime of migration control in the US. It uses Doug McAdam's tactical interaction as an analytical lens to explore two consecutive high-risk campaigns, ICE infiltrations and expulsion/re-entry. In this…
Work-Based Learning as a Field of Study
ERIC Educational Resources Information Center
Gibbs, Paul; Garnett, Jonathan
2007-01-01
This article addresses the challenges that Garnett suggests face higher education through the lens of Bourdieu. In taking up the challenge set by Garnett for higher education in the knowledge economy and responding to its powerful and primary artefact--intellectual capital--the article reviews and uses the analytical tool of Bourdieu's practice in…
Aligning Postsecondary Education with Regional Workforce Needs: A Tale of Two States
ERIC Educational Resources Information Center
Barkanic, Stephen
2016-01-01
The United States faces a pressing national security and competitiveness challenge rooted in a shortage of a diverse, highly skilled workforce, particularly in vital cross-disciplinary fields such as data science and analytics, cybersecurity, and information technology. To address this challenge, Business-Higher Education Forum (BHEF) launched the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cutler, Dylan; Frank, Stephen; Slovensky, Michelle
Rich, well-organized building performance and energy consumption data enable a host of analytic capabilities for building owners and operators, from basic energy benchmarking to detailed fault detection and system optimization. Unfortunately, data integration for building control systems is challenging and costly in any setting. Large portfolios of buildings--campuses, cities, and corporate portfolios--experience these integration challenges most acutely. These large portfolios often have a wide array of control systems, including multiple vendors and nonstandard communication protocols. They typically have complex information technology (IT) networks and cybersecurity requirements and may integrate distributed energy resources into their infrastructure. Although the challenges are significant, the integration of control system data has the potential to provide proportionally greater value for these organizations through portfolio-scale analytics, comprehensive demand management, and asset performance visibility. As a large research campus, the National Renewable Energy Laboratory (NREL) experiences significant data integration challenges. To meet them, NREL has developed an architecture for effective data collection, integration, and analysis, providing a comprehensive view of data integration based on functional layers. The architecture is being evaluated on the NREL campus through deployment of three pilot implementations.
NASA Astrophysics Data System (ADS)
Sajjadi, Mohammadreza; Pishkenari, Hossein Nejat; Vossoughi, Gholamreza
2018-06-01
Trolling mode atomic force microscopy (TR-AFM) has resolved many imaging problems through a considerable reduction of the liquid-resonator interaction forces in liquid environments. The present study develops a nonlinear model of the meniscus force exerted on the nanoneedle of the TR-AFM and presents an analytical solution to the distributed-parameter model of the TR-AFM resonator using the multiple time scales (MTS) method. Based on the developed analytical solution, the frequency-response curves of the resonator operating in air and in liquid (for different penetration lengths of the nanoneedle) are obtained. The closed-form analytical solution and the frequency-response curves are validated by comparison with both the finite element solution of the main partial differential equations and experimental observations. The effect of the excitation angle of the resonator on the horizontal oscillation of the probe tip and the effect of different parameters on the frequency response of the system are investigated.
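For orientation, the generic shape of the MTS expansion (the standard method, not the paper's specific resonator equations): the response is expanded over slow and fast time scales T_n = ε^n t,

\[
u(t;\varepsilon) = u_0(T_0, T_1) + \varepsilon\, u_1(T_0, T_1) + \mathcal{O}(\varepsilon^2), \qquad \frac{d}{dt} = \frac{\partial}{\partial T_0} + \varepsilon\, \frac{\partial}{\partial T_1} + \cdots,
\]

and secular terms are eliminated at each order. This yields slow-flow equations for the amplitude and phase of the oscillation, from which frequency-response curves such as those reported here follow directly.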
NASA Astrophysics Data System (ADS)
Khan, Mohammad S.; Abdullah, Mohamed H.; Ali, Gamal B.
2014-05-01
We derive an analytical expression for the velocity dispersion of galaxy clusters using a statistical mechanical approach. We compare the observed velocity dispersion profiles for 20 nearby (z ≤ 0.1) galaxy clusters with the analytical ones. Interestingly, the analytical results closely match the observed velocity dispersion profiles only if the presence of diffuse matter in clusters is taken into consideration. This leads us to introduce a new approach to estimating the ratio of diffuse mass, M_diff, within a galaxy cluster. For the present sample, the ratio f = M_diff/M, where M is the cluster's total mass, is found to have an average value of 45±12%. This implies that nearly 45% of the cluster mass resides outside the galaxies, while around 55% of the cluster mass is settled in the galaxies.
Technology Assessment and Policy Analysis
ERIC Educational Resources Information Center
Majone, Giandomenico
1977-01-01
Argues that the application of policy analysis to technology assessment requires the abandonment of stereotyped approaches and a reformulation of analytical paradigms to include consideration of institutional constraints. Available from: Elsevier Scientific Publishing Company, Box 211, Amsterdam, the Netherlands, single copies available.…
Recommendations on integrating children's health considerations into EPA's Action Development Process (ADP), including how to identify economically significant actions and disproportionate risk, and how to develop the Analytical Blueprint.
Dynamic field testing of the Route 58 Meherrin River bridge.
DOT National Transportation Integrated Search
1996-01-01
Dynamic response has long been recognized as one of the significant factors affecting the service life and safety of bridge structures, and considerable research, both analytical and experimental, has been devoted to this area of behavior. In the des...
Distribution factors for construction loads and girder capacity equations, final report.
DOT National Transportation Integrated Search
2017-03-01
During the process of constructing a highway bridge, there are several construction stages that warrant consideration from a structural safety and design perspective. The first objective of the present study was to use analytical models of prestr...
Cordero, Chiara; Kiefl, Johannes; Schieberle, Peter; Reichenbach, Stephen E; Bicchi, Carlo
2015-01-01
Modern omics disciplines dealing with food flavor focus their analytical efforts on the elucidation of sensory-active compounds, including all possible stimuli of multimodal perception (aroma, taste, texture, etc.), by means of a comprehensive, integrated treatment of sample constituents in terms of their physicochemical properties, concentration in the matrix, and sensory properties (odor/taste quality, perception threshold). Such analyses require detailed profiling of known bioactive components as well as advanced fingerprinting techniques to catalog sample constituents comprehensively, quantitatively, and comparably across samples. Multidimensional analytical platforms support the comprehensive investigations required for flavor analysis by combining information on analytes' identities, physicochemical behaviors (volatility, polarity, partition coefficient, and solubility), concentration, and odor quality. Unlike other omics, flavor metabolomics and sensomics include the final output of the biological phenomenon (i.e., sensory perceptions) as an additional analytical dimension, which is specifically and exclusively triggered by the chemicals analyzed. However, advanced omics platforms, which are multidimensional by definition, pose challenging issues not only in terms of coupling with detection systems and sample preparation, but also in terms of data elaboration and processing. The large number of variables collected during each analytical run provides a high level of information, but requires appropriate strategies to exploit this potential fully. This review focuses on advances in comprehensive two-dimensional gas chromatography and on analytical platforms combining two-dimensional gas chromatography with olfactometry, chemometrics, and quantitative assays for food sensory analysis to assess the quality of a given product. We review instrumental advances and couplings, automation in sample preparation, data elaboration, and a selection of applications.
Guidance for laboratories performing molecular pathology for cancer patients
Cree, Ian A; Deans, Zandra; Ligtenberg, Marjolijn J L; Normanno, Nicola; Edsjö, Anders; Rouleau, Etienne; Solé, Francesc; Thunnissen, Erik; Timens, Wim; Schuuring, Ed; Dequeker, Elisabeth; Murray, Samuel; Dietel, Manfred; Groenen, Patricia; Van Krieken, J Han
2014-01-01
Molecular testing is becoming an important part of the diagnosis of any patient with cancer. The challenge to laboratories is to meet this need, using reliable methods and processes to ensure that patients receive a timely and accurate report on which their treatment will be based. The aim of this paper is to provide minimum requirements for the management of molecular pathology laboratories. This general guidance should be augmented by the specific guidance available for different tumour types and tests. Preanalytical considerations are important, and careful consideration of the way in which specimens are obtained and reach the laboratory is necessary. Sample receipt and handling follow standard operating procedures, but some alterations may be necessary if molecular testing is to be performed, for instance to control tissue fixation. DNA and RNA extraction can be standardised and should be checked for quality and quantity of output on a regular basis. The choice of analytical method(s) depends on clinical requirements, desired turnaround time, and expertise available. Internal quality control, regular internal audit of the whole testing process, laboratory accreditation, and continual participation in external quality assessment schemes are prerequisites for delivery of a reliable service. A molecular pathology report should accurately convey the information the clinician needs to treat the patient with sufficient information to allow for correct interpretation of the result. Molecular pathology is developing rapidly, and further detailed evidence-based recommendations are required for many of the topics covered here. PMID:25012948
Review of Pre-Analytical Errors in Oral Glucose Tolerance Testing in a Tertiary Care Hospital.
Nanda, Rachita; Patel, Suprava; Sahoo, Sibashish; Mohapatra, Eli
2018-03-13
The pre-pre-analytical and pre-analytical phases account for a major share of laboratory errors. This study used a very common procedure, the oral glucose tolerance test, to identify pre-pre-analytical errors. Quality indicators provide evidence of quality, support accountability and help in the decision making of laboratory personnel. The aim of this research is to evaluate the pre-analytical performance of the oral glucose tolerance test procedure. An observational study was conducted over a period of three months in the phlebotomy and accessioning unit of our laboratory, using a questionnaire that examined pre-pre-analytical errors through a scoring system. The pre-analytical phase was analyzed for each sample collected against seven quality indicators. About 25% of the population gave a wrong answer to the question that tested knowledge of patient preparation. The appropriateness of the test result (QI-1) showed the most errors. Although QI-5, covering sample collection, had a low error rate, it is a very important indicator because any wrongly collected sample can alter the test result. Evaluating the pre-analytical and pre-pre-analytical phases is essential and should be conducted routinely, at least yearly, to identify errors, take corrective action and facilitate the gradual introduction of such evaluations into routine practice.
Roberts, Lynne D; Howell, Joel A; Seaman, Kristen; Gibson, David C
2016-01-01
Increasingly, higher education institutions are exploring the potential of learning analytics to predict student retention, understand learning behaviors, and improve student learning through providing personalized feedback and support. The technical development of learning analytics has outpaced consideration of ethical issues surrounding their use. Of particular concern is the absence of the student voice in decision-making about learning analytics. We explored higher education students' knowledge, attitudes, and concerns about big data and learning analytics through four focus groups (N = 41). Thematic analysis of the focus group transcripts identified six key themes. The first theme, "Uninformed and Uncertain," represents students' lack of knowledge about learning analytics prior to the focus groups. Following the provision of information, viewing of videos, and discussion of learning analytics scenarios, three further themes ("Help or Hindrance to Learning," "More than a Number," and "Impeding Independence") represented students' perceptions of the likely impact of learning analytics on their learning. "Driving Inequality" and "Where Will it Stop?" represent ethical concerns raised by the students about the potential for inequity, bias and invasion of privacy and the need for informed consent. A key tension to emerge was how "personal" vs. "collective" purposes or principles can intersect with "uniform" vs. "autonomous" activity. The findings highlight the need to engage students in the decision-making process about learning analytics.
The analyst's participation in the analytic process.
Levine, H B
1994-08-01
The analyst's moment-to-moment participation in the analytic process is inevitably and simultaneously determined by at least three sets of considerations. These are: (1) the application of proper analytic technique; (2) the analyst's personally-motivated responses to the patient and/or the analysis; (3) the analyst's use of him or herself to actualise, via fantasy, feeling or action, some aspect of the patient's conflicts, fantasies or internal object relationships. This formulation has relevance to our view of actualisation and enactment in the analytic process and to our understanding of a series of related issues that are fundamental to our theory of technique. These include the dialectical relationships that exist between insight and action, interpretation and suggestion, empathy and countertransference, and abstinence and gratification. In raising these issues, I do not seek to encourage or endorse wild analysis, the attempt to supply patients with 'corrective emotional experiences' or a rationalisation for acting out one's countertransferences. Rather, it is my hope that if we can better appreciate and describe these important dimensions of the analytic encounter, we can be better prepared to recognise, understand and interpret the continual streams of actualisation and enactment that are embedded in the analytic process. A deeper appreciation of the nature of the analyst's participation in the analytic process and the dimensions of the analytic process to which that participation gives rise may offer us a limited, although important, safeguard against analytic impasse.
ERIC Educational Resources Information Center
Monroy, Carlos; Rangel, Virginia Snodgrass; Whitaker, Reid
2014-01-01
In this paper, we discuss a scalable approach for integrating learning analytics into an online K-12 science curriculum. A description of the curriculum and the underlying pedagogical framework is followed by a discussion of the challenges to be tackled as part of this integration. We include examples of data visualization based on teacher usage…
Sampling Large Graphs for Anticipatory Analytics
Edwards, Lauren; Johnson, Luke; Milosavljevic, Maja; Gadepally, Vijay; Miller, Benjamin A.
2015-05-15
[Fragmentary abstract] Random area sampling [8] is a "snowball" sampling method in which a set of random seed vertices are selected and areas... ...systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges.
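To make the snowball idea in the fragment concrete, the following is a minimal sketch of seed-and-expand sampling on an adjacency-list graph. It is a generic reading of the method described, not the authors' implementation; the toy graph, function name, and parameters are invented for illustration.

import random
from collections import deque

def snowball_sample(adj, num_seeds, max_nodes, seed=0):
    """Grow sampled 'areas' outward from randomly chosen seed vertices."""
    rng = random.Random(seed)
    seeds = rng.sample(list(adj), num_seeds)
    sampled, frontier = set(seeds), deque(seeds)
    while frontier and len(sampled) < max_nodes:
        v = frontier.popleft()
        for u in adj[v]:  # expand the area around each seed, breadth-first
            if u not in sampled:
                sampled.add(u)
                frontier.append(u)
                if len(sampled) >= max_nodes:
                    break
    return sampled

# Toy graph as adjacency lists.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(snowball_sample(adj, num_seeds=1, max_nodes=3))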
Big data analytics workflow management for eScience
NASA Astrophysics Data System (ADS)
Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni
2015-04-01
In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these software packages fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks, and (iv) do not provide newer or more scalable storage models to better support the data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at addressing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens or hundreds of data analytics operators is the real challenge in many practical scientific use cases. This talk will specifically address the main needs, requirements and challenges regarding data analytics workflow management applied to large scientific datasets. Three real use cases concerning analytics workflows for sea situational awareness, fire danger prevention, climate change and biodiversity will be discussed in detail.
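As a rough illustration of the sub-setting and aggregation primitives listed above, here is a numpy analogy (not the Ophidia operator API; the temperature cube and axis layout are invented):

import numpy as np

# Hypothetical stand-in for a datacube: time x lat x lon temperature field.
cube = np.random.default_rng(0).normal(15.0, 5.0, size=(365, 90, 180))

# (i) sub-setting: "slicing" fixes one coordinate, "dicing" takes ranges
one_day = cube[0]                # a single time step
region = cube[:, 30:60, 45:90]   # a spatial sub-box over all times

# (ii) aggregation along the time axis (cf. the max/min/avg primitives)
annual_mean = cube.mean(axis=0)
annual_max = cube.max(axis=0)

print(one_day.shape, region.shape, annual_mean.shape, annual_max.shape)

In Ophidia itself these operations run as parallel server-side operators over partitioned data cubes rather than on an in-memory array.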
Pfenniger, Alois; Obrist, Dominik; Stahel, Andreas; Koch, Volker M; Vogel, Rolf
2013-07-01
As the complexity of active medical implants increases, the task of embedding a life-long power supply at the time of implantation becomes more challenging. A periodic renewal of the energy source is often required. Human energy harvesting is, therefore, seen as a possible remedy. In this paper, we present a novel idea to harvest energy from the pressure-driven deformation of an artery by the principle of magneto-hydrodynamics. The generator relies on a highly electrically conductive fluid accelerated perpendicularly to a magnetic field by means of an efficient lever arm mechanism. An artery with 10 mm inner diameter is chosen as a potential implantation site and its ability to drive the generator is established. Three analytical models are proposed to investigate the relevant design parameters and to determine the existence of an optimal configuration. The predicted output power reaches 65 μW according to the first two models and 135 μW according to the third model. It is found that the generator, designed as a circular structure encompassing the artery, should not exceed a total volume of 3 cm³.
Resistant starch: promise for improving human health.
Birt, Diane F; Boylston, Terri; Hendrich, Suzanne; Jane, Jay-Lin; Hollis, James; Li, Li; McClelland, John; Moore, Samuel; Phillips, Gregory J; Rowling, Matthew; Schalinske, Kevin; Scott, M Paul; Whitley, Elizabeth M
2013-11-01
Ongoing research to develop digestion-resistant starch for human health promotion integrates the disciplines of starch chemistry, agronomy, analytical chemistry, food science, nutrition, pathology, and microbiology. The objectives of this research include identifying components of starch structure that confer digestion resistance, developing novel plants and starches, and modifying foods to incorporate these starches. Furthermore, recent and ongoing studies address the impact of digestion-resistant starches on the prevention and control of chronic human diseases, including diabetes, colon cancer, and obesity. This review provides a transdisciplinary overview of this field, including a description of types of resistant starches; factors in plants that affect digestion resistance; methods for starch analysis; challenges in developing food products with resistant starches; mammalian intestinal and gut bacterial metabolism; potential effects on gut microbiota; and impacts and mechanisms for the prevention and control of colon cancer, diabetes, and obesity. Although this has been an active area of research and considerable progress has been made, many questions regarding how to best use digestion-resistant starches in human diets for disease prevention must be answered before the full potential of resistant starches can be realized.
Thevis, Mario; Hemmersbach, Peter; Geyer, Hans; Schänzer, Wilhelm
2009-12-15
Antidoping activities for the Paralympic Games were initiated in 1984, when the first doping controls were conducted. The foundation of the International Paralympic Committee exactly 20 years ago (1989) considerably supported systematic sports drug-testing programs specifically designed to meet the particular challenges of disabled sports, which yielded a variety of adverse analytical findings (e.g., with anabolic steroids, diuretics, corticosteroids, and stimulants), especially at Paralympic Summer Games. In Germany, doping controls for handicapped athletes were established in 1992 and have since been conducted by the National Paralympic Committee Germany and the National Anti-Doping Agency. Here too, the pattern of antidoping rule violations largely paralleled that seen in doping controls of nondisabled athletes. In the present article, the available numbers of samples analyzed at Paralympic Summer and Winter Games as well as within the doping control program for disabled sports in Germany are summarized, and particularities concerning sample collection and the doping method termed boosting are presented.
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1998-05-01
Increased demands on the performance and efficiency of mechanical components impose challenges on their engineering design and optimization, especially when new and more demanding applications must be developed in relatively short periods of time while satisfying design objectives, as well as cost and manufacturability. In addition, reliability and durability must be taken into consideration. As a consequence, effective quantitative methodologies, computational and experimental, should be applied in the study and optimization of mechanical components. Computational investigations enable parametric studies and the determination of critical engineering design conditions, while experimental investigations, especially those using optical techniques, provide qualitative and quantitative information on the actual response of the structure of interest to the applied load and boundary conditions. We discuss a hybrid experimental and computational approach for the investigation and optimization of mechanical components. The approach is based on analytical, computational, and experimental resolution methodologies in the form of computational models, noninvasive optical techniques, and fringe prediction analysis tools. Practical application of the hybrid approach is illustrated with representative examples that demonstrate the viability of the approach as an effective engineering tool for analysis and optimization.
Considerations on Geospatial Big Data
NASA Astrophysics Data System (ADS)
LIU, Zhen; GUO, Huadong; WANG, Changlin
2016-11-01
Geospatial data, as a significant portion of big data, has recently gained the full attention of researchers. However, few researchers focus on the evolution of geospatial data and its scientific research methodologies. When entering into the big data era, fully understanding the changing research paradigm associated with geospatial data will definitely benefit future research on big data. In this paper, we look deeply into these issues by examining the components and features of geospatial big data, reviewing relevant scientific research methodologies, and examining the evolving pattern of geospatial data in the scope of the four ‘science paradigms’. This paper proposes that geospatial big data has significantly shifted the scientific research methodology from ‘hypothesis to data’ to ‘data to questions’ and that it is important to explore the generality of growing geospatial data ‘from bottom to top’. In particular, four research areas that mostly reflect data-driven geospatial research are proposed: spatial correlation, spatial analytics, spatial visualization, and scientific knowledge discovery. It is also pointed out that privacy and quality issues of geospatial data may require more attention in the future. Finally, some challenges and thoughts are raised for future discussion.
Chemical analysis of Panax quinquefolius (North American ginseng): A review.
Wang, Yaping; Choi, Hyung-Kyoon; Brinckmann, Josef A; Jiang, Xue; Huang, Linfang
2015-12-24
Panax quinquefolius (PQ) is one of the best-selling natural health products due to its proposed beneficial anti-aging, anti-cancer, anti-stress, anti-fatigue, and anxiolytic effects. In recent years, the quality of PQ has received considerable attention. Sensitive and accurate methods for qualitative and quantitative analyses of chemical constituents are necessary for the comprehensive quality control to ensure the safety and efficacy of PQ. This article reviews recent progress in the chemical analysis of PQ and its preparations. Numerous analytical techniques, including spectroscopy, thin-layer chromatography (TLC), gas chromatography (GC), high-performance liquid chromatography (HPLC), liquid chromatography/mass spectrometry (LC/MS), high-speed centrifugal partition chromatography (HSCPC), high-performance counter-current chromatography (HPCCC), nuclear magnetic resonance spectroscopy (NMR), and immunoassay, are described. Among these techniques, HPLC coupled with mass spectrometry (MS) is the most promising method for quality control. The challenges encountered in the chemical analysis of PQ are also briefly discussed, and the remaining questions regarding the quality control of PQ that require further investigation are highlighted. Copyright © 2015 Elsevier B.V. All rights reserved.
"This strange disease": adolescent transference and the analyst's sexual orientation.
Burton, John K; Gilmore, Karen
2010-08-01
The treatment of adolescents by gay analysts is uncharted territory regarding the impact of the analyst's sexuality on the analytic process. Since a core challenge of adolescence involves the integration of the adult sexual body, gender role, and reproductive capacities into evolving identity, and since adolescents seek objects in their environment to facilitate both identity formation and the establishment of autonomy from primary objects, the analyst's sexual orientation is arguably a potent influence on the outcome of adolescent development. However, because sexual orientation is a less visible characteristic of the analyst than gender, race, or age, for example, the line between reality and fantasy is less clearly demarcated. This brings up special considerations regarding discovery and disclosure in the treatment. To explore these issues, the case of a late adolescent girl in treatment with a gay male analyst is presented. In this treatment, the question of the analyst's sexual orientation, and the demand by the patient for the analyst's self-disclosure, became a transference nucleus around which the patient's individual dynamics and adolescent dilemmas could be explored and clarified.
Nayar, Monisha C
2008-03-01
This paper addresses two specific aspects of clinical technique in the treatment of traumatized individuals. The first aspect involves the creation of a safe holding environment as an essential step for the emergence of trauma-related memories and the containment of the affects accompanying them. Such scenarios may appear in the clinical material only through the workings of "procedural memory." It is therefore important to contain and gradually decipher repetitive patterns of behavior and feelings of shame, guilt and rage that go with them. The second aspect examines the challenges such work poses to the analyst's containing capacities, credulousness and even his or her reality testing within a clinical situation. The resulting instability of the analyst's work ego can make it hard for him or her to remain vigilant yet empathic, and emotionally attuned but analytically skeptical. The analyst's flexibility to be utilized as a transference object, developmental object and self-object remains a critical determinant of the treatment outcome under such circumstances. The paper provides clinical material to illustrate these two aspects of clinical technique.
[Algorithms, machine intelligence, big data : general considerations].
Radermacher, F J
2015-08-01
We are experiencing astonishing developments in the areas of big data and artificial intelligence. They follow a pattern that we have now been observing for decades: according to Moore's Law, the performance and efficiency in the area of elementary arithmetic operations increases a thousand-fold every 20 years. Although we have not reached the point, in the sense of the singularity, where machines have become as "intelligent" as people, machines are becoming increasingly better. The Internet of Things has again helped to massively increase the efficiency of machines. Big data and suitable analytics do the same. If we let these processes simply continue, our civilization may be endangered in many instances. If the "containment" of these processes succeeds in the context of a reasonable political global governance, a worldwide eco-social market economy, and an economy of green and inclusive markets, many desirable developments that are advantageous for our future may result. Then, at some point in time, the constant need for more and faster innovation may even stop. However, this is anything but certain. We are facing huge challenges.
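For orientation, the quoted rate matches the familiar doubling formulation of Moore's Law: a thousand-fold gain every 20 years implies a doubling time of

$$T_2 = \frac{20\,\ln 2}{\ln 1000} \approx 2.0\ \text{years},$$

since ten doublings ($2^{10} = 1024$) amount to roughly a factor of one thousand.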
Evaluation of three new laser spectrometer techniques for in-situ carbon monoxide measurements
NASA Astrophysics Data System (ADS)
Zellweger, C.; Steinbacher, M.; Buchmann, B.
2012-07-01
Long-term time series of the atmospheric composition are essential for environmental research and thus require compatible, multi-decadal monitoring activities. However, the current data quality objectives of the World Meteorological Organization (WMO) for carbon monoxide (CO) in the atmosphere are very challenging to meet with the measurement techniques that have been used until recently. During the past few years, new spectroscopic techniques with promising properties for trace gas analytics came on the market. The current study compares three instruments that have recently become commercially available (since 2011) with the best technique available until now (vacuum UV fluorescence) and provides a link to previous comparison studies. The instruments were investigated for their performance regarding repeatability, reproducibility, drift, temperature dependence, water vapour interference and linearity. Finally, all instruments were examined during a short measurement campaign to assess their applicability for long-term field measurements. The new techniques were shown to provide considerably better performance than previous techniques, although some issues such as temperature influence and cross sensitivities need further attention.
NASA Astrophysics Data System (ADS)
Irving, D. H.; Rasheed, M.; Hillman, C.; O'Doherty, N.
2012-12-01
Oilfield management is moving to a more operational footing with near-realtime seismic and sensor monitoring governing drilling, fluid injection and hydrocarbon extraction workflows within safety, productivity and profitability constraints. To date, the geoscientific analytical architectures employed are configured for large volumes of data, computational power or analytical latency and compromises in system design must be made to achieve all three aspects. These challenges are encapsulated by the phrase 'Big Data' which has been employed for over a decade in the IT industry to describe the challenges presented by data sets that are too large, volatile and diverse for existing computational architectures and paradigms. We present a data-centric architecture developed to support a geoscientific and geotechnical workflow whereby:
● scientific insight is continuously applied to fresh data;
● insights and derived information are incorporated into engineering and operational decisions;
● data governance and provenance are routine within a broader data management framework.
Strategic decision support systems in large infrastructure projects such as oilfields are typically relational data environments; data modelling is pervasive across analytical functions. However, subsurface data and models are typically non-relational (i.e. file-based) in the form of large volumes of seismic imaging data or rapid streams of sensor feeds and are analysed and interpreted using niche applications. The key architectural challenge is to move data and insight from a non-relational to a relational, or structured, data environment for faster and more integrated analytics. We describe how a blend of MapReduce and relational database technologies can be applied in geoscientific decision support, and the strengths and weaknesses of each in such an analytical ecosystem. In addition we discuss hybrid technologies that use aspects of both and translational technologies for moving data and analytics across these platforms. Moving to a data-centric architecture requires data management methodologies to be overhauled by default and we show how end-to-end data provenancing and dependency management is implicit in such an environment and how it benefits system administration as well as the user community. Whilst the architectural experiences are drawn from the oil industry, we believe that they are more broadly applicable in academic and government settings where large volumes of data are added to incrementally and require revisiting with low analytical latency and we suggest application to earthquake monitoring and remote sensing networks.
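A toy sketch of the map-reduce-then-load pattern such an architecture relies on: high-volume, non-relational readings are reduced to compact per-key summaries that fit naturally into relational rows. The well identifiers and pressure values below are invented for illustration.

from collections import defaultdict

# Hypothetical sensor feed: (well_id, pressure) pairs from a non-relational source.
readings = [("W1", 101.2), ("W2", 98.7), ("W1", 103.9), ("W2", 97.1), ("W1", 102.5)]

# Map: emit key/value pairs; Reduce: aggregate per key.
groups = defaultdict(list)
for well, pressure in readings:
    groups[well].append(pressure)
summary = {well: sum(vals) / len(vals) for well, vals in groups.items()}

# The reduced summaries are small, structured rows ready for a relational store.
rows = [(well, round(avg, 2)) for well, avg in sorted(summary.items())]
print(rows)  # [('W1', 102.53), ('W2', 97.9)]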
The Theory and Practice of Estimating the Accuracy of Dynamic Flight-Determined Coefficients
NASA Technical Reports Server (NTRS)
Maine, R. E.; Iliff, K. W.
1981-01-01
Means of assessing the accuracy of maximum likelihood parameter estimates obtained from dynamic flight data are discussed. The most commonly used analytical predictors of accuracy are derived and compared from both statistical and simplified geometric standpoints. The accuracy predictions are evaluated with real and simulated data, with an emphasis on practical considerations, such as modeling error. Improved computations of the Cramer-Rao bound to correct large discrepancies due to colored noise and modeling error are presented. The corrected Cramer-Rao bound is shown to be the best available analytical predictor of accuracy, and several practical examples of the use of the Cramer-Rao bound are given. Engineering judgement, aided by such analytical tools, is the final arbiter of accuracy estimation.
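For reference (this is the standard form, not the paper's corrected version), the Cramer-Rao inequality bounds the covariance of any unbiased estimator $\hat{\theta}$ by the inverse Fisher information:

$$\operatorname{Cov}(\hat{\theta}) \succeq \mathcal{I}(\theta)^{-1}, \qquad \mathcal{I}_{ij}(\theta) = E\!\left[\frac{\partial \ln L}{\partial \theta_i}\,\frac{\partial \ln L}{\partial \theta_j}\right],$$

where $L$ is the likelihood of the flight data given the parameters $\theta$. The corrected bound discussed above adjusts this ideal-case expression for colored noise and modeling error.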
NASA Astrophysics Data System (ADS)
Makoveeva, Eugenya V.; Alexandrov, Dmitri V.
2018-01-01
This article is concerned with a new analytical description of nucleation and growth of crystals in a metastable mushy layer (supercooled liquid or supersaturated solution) at the intermediate stage of phase transition. The model under consideration, consisting of the non-stationary integro-differential system of governing equations for the distribution function and metastability level, is analytically solved by means of the saddle-point technique for the Laplace-type integral in the case of arbitrary nucleation kinetics and time-dependent heat or mass sources in the balance equation. We demonstrate that the time-dependent distribution function approaches the stationary profile in the course of time. This article is part of the theme issue 'From atomistic interfaces to dendritic patterns'.
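For context, the saddle-point (Laplace) technique mentioned above evaluates integrals of the form

$$\int_a^b g(t)\,e^{M f(t)}\,dt \;\approx\; g(t_0)\,e^{M f(t_0)}\sqrt{\frac{2\pi}{M\,\lvert f''(t_0)\rvert}}, \qquad M \to \infty,$$

where $t_0$ is the interior maximum of $f$. This is the textbook form; in the paper's application, $f$ and $g$ are set by the nucleation kinetics and the balance-equation sources.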
Page, Trevor; Dubina, Henry; Fillipi, Gabriele; Guidat, Roland; Patnaik, Saroj; Poechlauer, Peter; Shering, Phil; Guinn, Martin; Mcdonnell, Peter; Johnston, Craig
2015-03-01
This white paper focuses on equipment and analytical manufacturers' perspectives regarding the challenges of continuous pharmaceutical manufacturing across five prompt questions. In addition to valued input from several vendors, commentary was provided from experienced pharmaceutical representatives, who have installed various continuous platforms. Additionally, a small and medium-sized enterprise (SME) perspective was obtained through interviews. A range of technical challenges is outlined, including: the presence of particles, equipment scalability, fouling (and cleaning), technology derisking, specific analytical challenges, and the general requirement of improved technical training. Equipment and analytical companies can make a significant contribution to help the introduction of continuous technology. A key point is that many of these challenges exist in batch processing and are not specific to continuous processing. Backward compatibility of software is not a continuous issue per se. In many cases, there is available learning from other industries. Business models and opportunities through outsourced development partners are also highlighted. Agile smaller companies and academic groups have a key role to play in developing skills, working collaboratively in partnerships, and focusing on solving relevant industry challenges. The precompetitive space differs for vendor companies compared with large pharmaceuticals. Currently, there is no strong consensus around a dominant continuous design, partly because of business dynamics and commercial interests. A more structured common approach to process design and hardware and software standardization would be beneficial, with initial practical steps in modeling. Conclusions include a digestible systems approach, accessible and published business cases, and increased user, academic, and supplier collaboration. This mirrors US FDA direction. The concept of silos in pharmaceutical companies is a common theme throughout the white papers. In the equipment domain, this is equally prevalent among a broad range of companies, mainly focusing on discrete areas. As an example, the flow chemistry and secondary drug product communities are almost entirely disconnected. Control and Process Analytical Technologies (PAT) companies are active in both domains. The equipment actors are a very diverse group, with a few major Original Equipment Manufacturer (OEM) players and a variety of SMEs, project providers, integrators, upstream and downstream providers, and specialist PAT firms. In some cases, partnerships or alliances are formed to increase critical mass. This white paper has focused on small molecules; equipment associated with biopharmaceuticals is covered in a separate white paper. More specifics on equipment detail are provided in the final dosage form and drug substance white papers. The equipment and analytical development from laboratory to pilot to production is important, with the variety of sensors and complexity reducing with scale. Robust processing, rather than overcomplex control strategy mitigation, is important. A search of nonacademic literature highlights, with a few notable exceptions, a relative paucity of material. Much focuses on the economics and benefits of continuous processing, rather than specifics of equipment issues. The disruptive nature of continuous manufacturing represents either an opportunity or a threat for many companies, so the incentive to change equipment varies.
Also, for many companies, the pharmaceutical sector is not actually the dominant sector in terms of sales. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
Katz, Deirdre A; Peckins, Melissa K
2017-12-01
Intraindividual variability in stress responsivity and the interrelationship of multiple neuroendocrine systems make a multisystem analytic approach to examining the human stress response challenging. The present study makes use of an efficient social-evaluative stress paradigm - the Group Public Speaking Task for Adolescents (GPST-A) - to examine the hypothalamic-pituitary-adrenocortical (HPA)-axis and Autonomic Nervous System (ANS) reactivity profiles of 54 adolescents with salivary cortisol and salivary alpha-amylase (sAA). First, we account for differences between individuals in the time latency of hormone concentrations. Second, we use a two-piece multilevel growth curve model with landmark registration to examine the reactivity and recovery periods of the stress response separately. This analytic approach increases the models' sensitivity to detecting trajectory differences in the reactivity and recovery phases of the stress response and allows for interindividual variation in the timing of participants' peak response following a social-evaluative stressor. The GPST-A evoked typical cortisol and sAA responses in both males and females. Males' cortisol concentrations were significantly higher than females' during each phase of the response. We found no gender difference in the sAA response. However, the rate of increase in sAA as well as overall sAA secretion across the study were associated with steeper rates of cortisol reactivity and recovery. This study demonstrates a way to model the response trajectories of salivary biomarkers of the HPA-axis and ANS when taking a multisystem approach to neuroendocrine research that enables researchers to make conclusions about the reactivity and recovery phases of the HPA-axis and ANS responses. As the study of the human stress response progresses toward a multisystem analytic approach, it is critical that individual variability in peak latency be taken into consideration and that accurate modeling techniques capture individual variability in the stress response so that accurate conclusions can be made about separate phases of the response. Copyright © 2017 Elsevier Ltd. All rights reserved.
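For orientation, a generic two-piece multilevel growth curve with a person-specific landmark (peak time $t_i^{*}$) can be written as

$$y_{ti} = \beta_{0i} + \beta_{1i}\,(t - t_i^{*})\,[t \le t_i^{*}] + \beta_{2i}\,(t - t_i^{*})\,[t > t_i^{*}] + \varepsilon_{ti},$$

where $[\cdot]$ is an indicator, $\beta_{1i}$ is the individual reactivity slope, $\beta_{2i}$ the recovery slope, and the $\beta$ terms carry person-level random effects. This is the standard form of such models, given here for illustration; the study's exact specification may differ.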
Alemayehu, Demissie; Cappelleri, Joseph C
2012-07-01
Patient-reported outcomes (PROs) can play an important role in personalized medicine. PROs can be viewed as an important fundamental tool to measure the extent of disease and the effect of treatment at the individual level, because they reflect the self-reported health state of the patient directly. However, their effective integration in personalized medicine requires addressing certain conceptual and methodological challenges, including instrument development and analytical issues. To evaluate methodological issues, such as multiple comparisons, missing data, and modeling approaches, associated with the analysis of data related to PROs and personalized medicine, to further our understanding of the role of PRO data in personalized medicine. There is a growing recognition of the role of PROs in medical research, but their potential use in customizing healthcare is not widely appreciated. Emerging insights into the genetic basis of PROs could potentially lead to new pathways that may improve patient care. Knowledge of the biologic pathways through which the various genetic predispositions propel people toward negative or away from positive health experiences may ultimately transform healthcare. Understanding and addressing the conceptual and methodological issues in PROs and personalized medicine are expected to enhance the emerging area of personalized medicine and to improve patient care. This article addresses relevant concerns that need to be considered for effective integration of PROs in personalized medicine, with particular reference to conceptual and analytical issues that routinely arise with personalized medicine and PRO data. Some of these issues, including multiplicity problems, handling of missing values, and modeling approaches, are common to both areas. It is hoped that this article will help to stimulate further research to advance our understanding of the role of PRO data in personalized medicine. A robust conceptual framework to incorporate PROs into personalized medicine can provide fertile opportunity to bring these two areas even closer and to enhance the way a specific treatment is attuned and delivered to address patient care and patient needs.
Scheidweiler, Karl B.; Himes, Sarah K.; Chen, Xiaohong; Liu, Hua-Fen
2013-01-01
Currently, Δ9-tetrahydrocannabinol (THC) is the analyte quantified for oral fluid cannabinoid monitoring. The potential for false-positive oral fluid cannabinoid results from passive exposure to THC-laden cannabis smoke raises concerns for this promising new monitoring technology. Oral fluid 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THCCOOH) is proposed as a marker of cannabis intake since it is not present in cannabis smoke and was not measurable in oral fluid collected from subjects passively exposed to cannabis. THCCOOH concentrations are in the picogram per milliliter range in oral fluid and pose considerable analytical challenges. A liquid chromatography–tandem mass spectrometry (LCMSMS) method was developed and validated for quantifying THCCOOH in 1 mL Quantisal-collected oral fluid. After solid phase extraction, chromatography was performed on a Kinetex C18 column with a gradient of 0.01 % acetic acid in water and 0.01 % acetic acid in methanol with a 0.5-mL/min flow rate. THCCOOH was monitored in negative mode electrospray ionization and multiple reaction monitoring mass spectrometry. The THCCOOH linear range was 12–1,020 pg/mL (R² > 0.995). Mean extraction efficiencies and matrix effects evaluated at low and high quality control (QC) concentrations were 40.8–65.1 and −2.4–11.5 %, respectively (n=10). Analytical recoveries (bias) and total imprecision at low, mid, and high QCs were 85.0–113.3 and 6.6–8.4 % coefficient of variation, respectively (n=20). This is the first oral fluid THCCOOH LCMSMS triple quadrupole method not requiring derivatization to achieve a <15 pg/mL limit of quantification. The assay is applicable for the workplace, driving under the influence of drugs, drug treatment, and pain management testing. PMID:23681203
Parametric study of minimum converter loss in an energy-storage dc-to-dc converter
NASA Technical Reports Server (NTRS)
Wong, R. C.; Owen, H. A., Jr.; Wilson, T. G.
1982-01-01
Through a combination of analytical and numerical minimization procedures, a converter design that results in the minimum total converter loss (including core loss, winding loss, capacitor and energy-storage-reactor loss, and various losses in the semiconductor switches) is obtained. Because the initial phase involves analytical minimization, the computation time required by the subsequent phase of numerical minimization is considerably reduced in this combination approach. The effects of various loss parameters on the optimum values of the design variables are also examined.
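The combined analytical/numerical idea can be sketched as minimizing a total-loss objective over the design variables, seeding the numerical search near an analytically identified region. All coefficients below are invented placeholders, not the paper's converter model.

from scipy.optimize import minimize

def total_loss(x):
    """Illustrative total converter loss vs. switching frequency f (kHz)
    and peak flux density B (T); coefficients are arbitrary."""
    f, B = x
    core = 0.8 * f**1.3 * B**2.5   # Steinmetz-like core loss term
    winding = 12.0 / (f * B)       # copper loss falls as f and B rise
    switching = 0.05 * f           # switch loss grows with frequency
    return core + winding + switching

# Numerical refinement, started near an analytically estimated optimum.
res = minimize(total_loss, x0=[50.0, 0.2], bounds=[(10, 200), (0.05, 0.4)])
print(res.x, res.fun)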
Healthcare predictive analytics: An overview with a focus on Saudi Arabia.
Alharthi, Hana
2018-03-08
Despite a newfound wealth of data and information, the healthcare sector is lacking in actionable knowledge. This is largely because healthcare data, though plentiful, tends to be inherently complex and fragmented. Health data analytics, with an emphasis on predictive analytics, is emerging as a transformative tool that can enable more proactive and preventative treatment options. This review considers the ways in which predictive analytics has been applied in the for-profit business sector to generate well-timed and accurate predictions of key outcomes, with a focus on key features that may be applicable to healthcare-specific applications. Published medical research presenting assessments of predictive analytics technology in medical applications is reviewed, with particular emphasis on how hospitals have integrated predictive analytics into their day-to-day healthcare services to improve quality of care. This review also highlights the numerous challenges of implementing predictive analytics in healthcare settings and concludes with a discussion of current efforts to implement healthcare data analytics in Saudi Arabia, a developing country. Copyright © 2018 The Author. Published by Elsevier Ltd. All rights reserved.
Analytical Approaches to Verify Food Integrity: Needs and Challenges.
Stadler, Richard H; Tran, Lien-Anh; Cavin, Christophe; Zbinden, Pascal; Konings, Erik J M
2016-09-01
A brief overview of the main analytical approaches and practices to determine food authenticity is presented, addressing as well food supply chain considerations and future requirements for mitigating food fraud more effectively. Food companies are introducing procedures and mechanisms that allow them to identify vulnerabilities in their food supply chain under the umbrella of a food fraud prevention management system. A key step and first line of defense is thorough supply chain mapping and full transparency, assessing the likelihood that fraudsters could penetrate the chain at any point. More vulnerable chains, such as those where ingredients and/or raw materials are purchased through traders or auctions, may require a higher degree of sampling, testing, and surveillance. Access to analytical tools is therefore pivotal, requiring continuous development and, possibly, greater sophistication in identifying chemical markers, data acquisition, and modeling. Significant progress in portable technologies is evident already today, for instance in the rapid testing now available at the agricultural level. In the near future, consumers may also have the ability to scan products in stores or at home to authenticate labels and food content. For food manufacturers, targeted analytical methods complemented by untargeted approaches are end control measures at the factory gate when the material is delivered. In essence, testing for food adulterants is an integral part of routine QC, ideally tailored to the risks in the individual markets and/or geographies or supply chains. The development of analytical methods is a first step in verifying the compliance and authenticity of food materials. A next, more challenging step is the successful establishment of global consensus reference methods, as exemplified by the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals initiative, which can serve as an approach that could also be applied to methods for contaminants and adulterants in food. The food industry has taken these many challenges aboard, working closely with all stakeholders and continuously communicating on progress in a fully transparent manner.
ECS Law & Education Center Footnotes. No. 3.
ERIC Educational Resources Information Center
Education Commission of the States, Denver, CO. Law and Education Center.
Commentary and advice in four legal areas are offered in this newsletter on educational law. First, the document outlines preventive legal review for public educators in four basic steps, including anticipation of legal challenges, evaluation of the challenges' legal merits, consideration of the policy issues raised by potential challenges, and…
Bridging the Gap: The Challenges of Employing Entrepreneurial Processes within University Settings
ERIC Educational Resources Information Center
Wardale, Dorothy; Lord, Linley
2016-01-01
In Australia and elsewhere, universities face increasing pressure to improve research output and quality, particularly through partnerships with industry. This raises interesting challenges for academic staff with considerable industry experience who are "new" to academe. Some of these challenges were faced by the authors who have been…
NASA Astrophysics Data System (ADS)
Mohan, C.
In this paper, I survey briefly some of the recent and emerging trends in hardware and software features which impact high performance transaction processing and data analytics applications. These features include multicore processor chips, ultra large main memories, flash storage, storage class memories, database appliances, field programmable gate arrays, transactional memory, key-value stores, and cloud computing. While some applications, e.g., Web 2.0 ones, were initially built without traditional transaction processing functionality in mind, slowly system architects and designers are beginning to address such previously ignored issues. The availability, analytics and response time requirements of these applications were initially given more importance than ACID transaction semantics and resource consumption characteristics. A project at IBM Almaden is studying the implications of phase change memory on transaction processing, in the context of a key-value store. Bitemporal data management has also become an important requirement, especially for financial applications. Power consumption and heat dissipation properties are also major considerations in the emergence of modern software and hardware architectural features. Considerations relating to ease of configuration, installation, maintenance and monitoring, and improvement of total cost of ownership have resulted in database appliances becoming very popular. The MapReduce paradigm is now quite popular for large scale data analysis, in spite of the major inefficiencies associated with it.
International Space Station Model Correlation Analysis
NASA Technical Reports Server (NTRS)
Laible, Michael R.; Fitzpatrick, Kristin; Hodge, Jennifer; Grygier, Michael
2018-01-01
This paper summarizes the on-orbit structural dynamic data and the related modal analysis, model validation and correlation performed for the International Space Station (ISS) configuration ISS Stage ULF7, 2015 Dedicated Thruster Firing (DTF). The objective of this analysis is to validate and correlate the analytical models used to calculate the ISS internal dynamic loads and to compare the 2015 DTF with previous tests. For the ISS configurations under consideration, on-orbit dynamic measurements were collected using the three main ISS instrumentation systems: the Internal Wireless Instrumentation System (IWIS), the External Wireless Instrumentation System (EWIS) and the Structural Dynamic Measurement System (SDMS). The measurements were recorded during several nominal on-orbit DTF tests on August 18, 2015. Experimental modal analyses were performed on the measured data to extract modal parameters including frequency, damping, and mode shape information. Correlation and comparisons between test and analytical frequencies and mode shapes were performed to assess the accuracy of the analytical models for the configurations under consideration. These mode shapes were also compared to earlier tests. Based on the frequency comparisons, the accuracy of the mathematical models is assessed and model refinement recommendations are given. In particular, results for the first fundamental mode will be discussed, nonlinear results will be shown, and accelerometer placement will be assessed.
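The abstract does not name the correlation metric, but test-analysis mode shape comparisons of this kind are commonly quantified with the Modal Assurance Criterion,

$$\mathrm{MAC}(\phi_a, \phi_t) = \frac{\lvert \phi_a^{T}\phi_t \rvert^{2}}{(\phi_a^{T}\phi_a)(\phi_t^{T}\phi_t)},$$

which approaches 1 when the analytical mode shape $\phi_a$ and the test mode shape $\phi_t$ are perfectly correlated and 0 when they are orthogonal.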
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, Weiju
2010-01-01
Alloy 617 is currently considered a leading candidate material for high temperature components in the Gen IV Nuclear Reactor Systems. Because the Gen IV systems demand unprecedented, severe working conditions beyond its commercial service experience, the alloy faces various challenges in both mechanical and metallurgical properties. Following a previous paper discussing the mechanical property challenges, this paper focuses on the challenges and issues in metallurgical properties of the alloy for the intended nuclear application. Detailed consideration is given to its metallurgical stability and aging evolution, aging effects on mechanical properties, the potential Co hazard, and internal oxidation. Some research and development activities are suggested, with discussions of their viability to satisfy the Gen IV Nuclear Reactor System needs.
ERIC Educational Resources Information Center
Felder, Franziska
2018-01-01
In recent years inclusion has become one of the most dominant values and objectives in education. However, there is still considerable disagreement concerning the theoretical concept of inclusion and its normative implications. This article suggests an understanding of inclusion that first differentiates analytically between societal and communal…
Using Large Data Sets to Study College Education Trajectories
ERIC Educational Resources Information Center
Oseguera, Leticia; Hwang, Jihee
2014-01-01
This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…
Huang, T; Li, L M
2018-05-10
The era of medical big data, translational medicine and precision medicine brings new opportunities for the study of the etiology of chronic complex diseases. How to implement evidence-based medicine, translational medicine and precision medicine is the challenge we face. Systems epidemiology, a new field of epidemiology, combines medical big data with systems biology and examines statistical models of disease risk, as well as future risk simulation and prediction, using data at the molecular, cellular, population, social and ecological levels. Due to the diversity and complexity of big data sources, the development of study designs and analytic methods for systems epidemiology faces new challenges and opportunities. This paper summarizes the theoretical basis, concept, objectives, significance, research design and analytic methods of systems epidemiology and its application in the field of public health.
Lubin, Arnaud; Sheng, Sheng; Cabooter, Deirdre; Augustijns, Patrick; Cuyckens, Filip
2017-11-17
Lack of knowledge of the expected concentration range, or an insufficient linear dynamic range of the analytical method applied, is a common challenge for the analytical scientist. Samples above the upper limit of quantification are typically diluted and reanalyzed. The analysis of undiluted, highly concentrated samples can cause contamination of the system, while the dilution step is time consuming and, as is the case for any sample preparation step, also potentially leads to precipitation, adsorption or degradation of the analytes. Copyright © 2017 Elsevier B.V. All rights reserved.
Analytic Strategies of Streaming Data for eHealth.
Yoon, Sunmoo
2016-01-01
New analytic strategies for streaming big data from wearable devices and social media are emerging in eHealth. We face challenges in finding meaningful patterns in big data because researchers have difficulty processing large volumes of streaming data with traditional processing applications. This introductory 180-minute tutorial offers hands-on instruction on analytics (e.g., topic modeling, social network analysis) of streaming data. The tutorial aims to provide practical strategies for reducing dimensionality, using examples from big data. It will highlight strategies for incorporating domain experts and a comprehensive approach to streaming social media data.
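As a minimal sketch of one analytic named above, topic modeling reduces a high-dimensional term space to a few interpretable topics. The posts below are invented stand-ins for a social media stream; a real streaming pipeline would use an online/incremental variant.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Toy stand-in for a stream of health-related social media posts.
posts = [
    "flu vaccine clinic open today",
    "tracking my steps with a wearable device",
    "wearable heart rate data looks noisy",
    "flu season vaccine reminder",
]

X = CountVectorizer(stop_words="english").fit_transform(posts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
print(lda.transform(X).round(2))  # per-post topic proportions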
ERIC Educational Resources Information Center
Weeden, Marc; Ehrhardt, Kristal; Poling, Alan
2009-01-01
Both risperidone, an atypical antipsychotic drug, and function-based behavior-analytic interventions are popular and empirically validated treatments for reducing challenging behavior in children with autism. The kind of research that supports their effectiveness differs, however, and no published study has directly compared their effects or…
ERIC Educational Resources Information Center
Nyhan, Barry; Cressey, Peter; Tomassini, Massimo; Kelleher, Michael; Poell, Rob
This first volume of a two-volume publication provides an analytical overview of main questions emerging from recent European research and development projects related to the learning organization. Chapter 1 provides context for the European learning organization challenge and presents four main messages arising from the learning organization…
ERIC Educational Resources Information Center
Obrusnikova, Iva; Dillon, Suzanna R.
2011-01-01
As the first step of an instrument development, teaching challenges that occur when students with autism spectrum disorders are educated in general physical education were elicited using Goldfried and D'Zurilla's (1969) behavioral-analytic model. Data were collected from a convenience sample of 43 certified physical educators (29 women and 14 men)…
Ability of institutions to address new challenges
B. Cashore; G. Galloway; F. Cubbage; D. Humphreys; P. Katila; K. Levin; A. Maryudi; C. McDermott; Kathleen McGinley
2010-01-01
What types of institutional configurations hold the most promise in fostering efforts for long-term amelioration of enduring environmental, social, and economic challenges facing the world's forests? This chapter presents and applies an analytical framework with which to review research findings and analyses that shed light on what appear to be the most promising...
ERIC Educational Resources Information Center
Dukas, Georg
2009-01-01
Though research in emerging technologies is vital to fulfilling their incredible potential for educational applications, it is often fraught with analytic challenges related to large datasets. This thesis explores these challenges in researching multiuser virtual environments (MUVEs). In a MUVE, users assume a persona and traverse a virtual space…
2007-10-24
This excellent book provides a comprehensive and analytical overview of the socio-political and economic factors that contribute to an understanding of global health governance. The opening chapter outlines compelling arguments for why such an understanding should be everyone's business.
Ekwunife, Obinna I; Grote, Andreas Gerber; Mosch, Christoph; O'Mahony, James F; Lhachimi, Stefan K
2015-05-12
Cervical cancer poses a huge health burden to both developed and developing nations, making prevention and control strategies necessary. However, the challenges of designing and implementing prevention strategies differ for low- and middle-income countries (LMICs) as compared to countries with fully developed health care systems. Moreover, for many LMICs, much of the data needed for decision analytic modelling, such as prevalence, will most likely be only partly available or measured with much larger uncertainty. Lastly, imperfect implementation of human papillomavirus (HPV) vaccination may influence the effectiveness of cervical cancer prevention in unpredictable ways. This systematic review aims to assess how decision analytic modelling studies of HPV cost-effectiveness in LMICs accounted for the particular challenges faced in such countries. Specifically, the study will assess the following: (1) whether the existing literature on cost-effectiveness modelling of HPV vaccines acknowledges the distinct challenges of LMICs, (2) how these challenges were accommodated in the models, (3) whether certain parameters systematically exhibited large degrees of uncertainty due to lack of data and how influential these parameters were on model-based recommendations, and (4) whether the choice of modelling herd immunity influences model-based recommendations, especially when coverage of an HPV vaccination program is not optimal. We will conduct a systematic review to identify suitable studies from MEDLINE (via PubMed), EMBASE, NHS Economic Evaluation Database (NHS EED), EconLit, Web of Science, and CEA Registry. Searches will be conducted for studies of interest published since 2006. The searches will be supplemented by hand searching of the most relevant papers found in the search. Studies will be critically appraised using the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement checklist. We will undertake a descriptive, narrative, and interpretative synthesis of data to address the study objectives. The proposed systematic review will assess how cost-effectiveness studies of HPV vaccines accounted for the distinct challenges of LMICs. The gaps identified will expose areas for additional research as well as challenges that need to be accounted for in future modelling studies. PROSPERO CRD42015017870.
NASA Technical Reports Server (NTRS)
Graf, John
2015-01-01
NASA has been developing and testing two different types of oxygen separation systems. One type uses pressure swing technology; the other uses a solid electrolyte electrochemical oxygen separation cell. Both development systems have been subjected to long term testing and to performance testing under a variety of environmental and operational conditions. Testing these two systems revealed that measuring the product purity of oxygen, and determining whether an oxygen separation device meets Aviator's Breathing Oxygen (ABO) specifications, is a subtle and sometimes difficult analytical chemistry job. Verifying the product purity of cryogenically produced oxygen presents a different set of analytical chemistry challenges. This presentation will describe some of the sample acquisition and analytical chemistry challenges involved in verifying oxygen produced by an oxygen separator, and in verifying oxygen produced by cryogenic separation processes. The primary contaminant that causes gas samples to fail to meet ABO requirements is water; the maximum amount of water vapor allowed is 7 ppmv. The principal challenge of verifying oxygen produced by an oxygen separator is that it is produced relatively slowly, and at comparatively low temperatures. A short term failure lasting just a few minutes in the course of a 1 week run could cause an entire tank to be rejected. Continuous monitoring of oxygen purity and water vapor can identify problems as soon as they occur. Long term oxygen separator tests were instrumented with an oxygen analyzer and a hygrometer, a GE Moisture Monitor Series 35. This hygrometer uses an aluminum oxide sensor. The user's manual does not report this, but long term exposure to pure oxygen causes the aluminum oxide sensor head to bias dry. Oxygen product that exceeded the 7 ppmv specification was improperly accepted because the sensor had biased. The bias is permanent (exposure to air does not cause the sensor to return to its original response), but it can be accounted for by recalibrating the sensor. After this issue was found, continuous measurements of water vapor in the oxygen product were made using an FTIR. The FTIR cell is relatively large, so response time is slow, but moisture measurements were repeatable and accurate. Verifying ABO compliance for oxygen produced by commercial cryogenic processes has a different set of sample acquisition and analytical chemistry challenges. Customers want analytical chemists to conserve as much gas as possible. Hygrometers are not exposed to hours of continuous oxygen flow, so they do not bias, but small amounts of contamination in valves can cause a "fail". K-bottles are periodically cleaned and recertified; after cleaning, residual moisture can cause a "fail". Operators let bottle pressure drop to room pressure, introducing outside air into the bottle, and the subsequent fill will "fail". Outside storage of K-bottles has allowed enough in-leakage that the contents will "fail".
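For context, hygrometers of this kind report a dew/frost point, which must be converted to a mixing ratio to compare against the 7 ppmv limit. Below is a hedged sketch of that arithmetic using a Magnus-type approximation over ice; the constants vary slightly between references, and this is an illustration, not the NASA procedure.

```python
import math

def water_ppmv(frost_point_c: float, pressure_hpa: float = 1013.25) -> float:
    """Approximate water content (ppmv) from a frost point in deg C (below 0)."""
    # Magnus-type saturation vapor pressure over ice, in hPa
    e_hpa = 6.112 * math.exp(22.46 * frost_point_c / (272.62 + frost_point_c))
    return e_hpa / pressure_hpa * 1e6

# Under this approximation, the 7 ppmv limit corresponds to a frost point
# of roughly -63 deg C at 1 atm
for fp in (-60.0, -63.0, -66.0):
    print(f"{fp:6.1f} C -> {water_ppmv(fp):5.2f} ppmv")
```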
Analytical Sociology: A Bungean Appreciation
NASA Astrophysics Data System (ADS)
Wan, Poe Yu-ze
2012-10-01
Analytical sociology, an intellectual project that has garnered considerable attention across a variety of disciplines in recent years, aims to explain complex social processes by dissecting them, accentuating their most important constituent parts, and constructing appropriate models to understand the emergence of what is observed. To achieve this goal, analytical sociologists demonstrate an unequivocal focus on mechanism-based explanation grounded in action theory. In this article I attempt a critical appreciation of analytical sociology from the perspective of Mario Bunge's philosophical system, which I characterize as emergentist systemism. I submit that while the principles of analytical sociology and those of Bunge's approach have much in common, the latter brings to the fore the ontological status and explanatory importance of supra-individual actors (as concrete systems endowed with emergent causal powers) and macro-social mechanisms (as processes unfolding in and among social systems), and therefore does not stipulate that every causal explanation of social facts must include explicit references to individual-level actors and mechanisms. In this sense, Bunge's approach provides a reasonable middle course between the Scylla of sociological reification and the Charybdis of ontological individualism, and thus serves as an antidote to the untenable "strong program of microfoundations" to which some analytical sociologists are committed.
Assessment of Montana road weather information system : final report
DOT National Transportation Integrated Search
2017-01-01
Weather presents considerable challenges to highway agencies both in terms of safety and operations. State transportation agencies have developed road weather information systems (RWIS) to address such challenges. Road weather information has been us...
Planning for planetary protection : challenges beyond Mars
NASA Technical Reports Server (NTRS)
Belz, Andrea P.; Cutts, James A.
2006-01-01
This document summarizes the technical challenges to planetary protection for these targets of interest and outlines some of the considerations, particularly at the system level, in designing an appropriate technology investment strategy for targets beyond Mars.
Characterization of interfacial socket pressure in transhumeral prostheses: A case series.
Schofield, Jonathon S; Schoepp, Katherine R; Williams, Heather E; Carey, Jason P; Marasco, Paul D; Hebert, Jacqueline S
2017-01-01
One of the most important factors in successful upper limb prostheses is the socket design. Sockets must be individually fabricated to arrive at a geometry that suits the user's morphology and appropriately distributes the pressures associated with prosthetic use across the residual limb. At higher levels of amputation, such as transhumeral, this challenge is amplified as prosthetic weight and the physical demands placed on the residual limb are heightened. Yet, in the upper limb, socket fabrication is largely driven by heuristic practices. An analytical understanding of the interactions between the socket and residual limb is absent in the literature. This work describes techniques, adapted from lower limb prosthetic research, to empirically characterize the pressure distribution occurring between the residual limb and well-fit transhumeral prosthetic sockets. A case series analyzing the results from four participants with transhumeral amputation is presented. A Tekscan VersaTek pressure measurement system and FaroArm Edge coordinate measurement machine were employed to capture socket-residual limb interface pressures and geometrically register these values to the anatomy of participants. Participants performed two static poses with their prosthesis under two separate loading conditions. Surface pressure maps were constructed from the data, highlighting pressure distribution patterns, anatomical locations bearing maximum pressure, and the relative pressure magnitudes. Pressure distribution patterns demonstrated unique characteristics across the four participants that could be traced to individual socket design considerations. This work presents a technique that implements commercially available tools to quantitatively characterize upper limb socket-residual limb interactions. This is a fundamental first step toward improved socket designs developed through informed, analytically-based design tools.
Husain, Syed S; Kalinin, Alexandr; Truong, Anh; Dinov, Ivo D
Intuitive formulation of informative and computationally efficient queries on big and complex datasets presents a number of challenges. As data collection becomes increasingly streamlined and ubiquitous, data exploration, discovery and analytics get considerably harder. Exploratory querying of heterogeneous and multi-source information is both difficult and necessary to advance our knowledge about the world around us. We developed a mechanism to integrate dispersed multi-source data and serve the mashed information via human and machine interfaces in a secure, scalable manner. This process facilitates the exploration of subtle associations between variables, population strata, or clusters of data elements, which may be opaque to standard independent inspection of the individual sources. This new platform includes a device-agnostic tool (Dashboard webapp, http://socr.umich.edu/HTML5/Dashboard/) for graphically querying, navigating and exploring the multivariate associations in complex heterogeneous datasets. The paper illustrates this core functionality and service-oriented infrastructure using healthcare data (e.g., US data from the 2010 Census, Demographic and Economic surveys, Bureau of Labor Statistics, and Center for Medicare Services) as well as Parkinson's disease neuroimaging data. Both the back-end data archive and the front-end dashboard interfaces are continuously expanded to include additional data elements and new ways to customize the human and machine interactions. A client-side data import utility allows easy and intuitive integration of user-supplied datasets. This completely open-science framework may be used for exploratory analytics, confirmatory analyses, meta-analyses, and education and training purposes in a wide variety of fields.
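As an offline illustration of the kind of cross-source exploration such a dashboard supports, the sketch below aligns two heterogeneous tables on a shared geographic key and probes an association neither source reveals alone. The file names and columns (census_2010.csv, bls_employment.csv, county_fips, median_age, unemployment_rate) are hypothetical stand-ins, not the platform's actual schema.

```python
import pandas as pd

# Hypothetical extracts from two of the sources the paper names
census = pd.read_csv("census_2010.csv")
labor = pd.read_csv("bls_employment.csv")

# Align on a shared geographic key (here, a county FIPS code)
merged = census.merge(labor, on="county_fips", how="inner")

# Cross-source association invisible to either table alone
print(merged[["median_age", "unemployment_rate"]].corr(method="spearman"))

# Stratified view: the "subtle associations between population strata"
# the paper describes, as quartile-wise group means
strata = pd.qcut(merged["median_age"], 4)
print(merged.groupby(strata, observed=True)["unemployment_rate"].mean())
```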
BIRCH: a user-oriented, locally-customizable, bioinformatics system.
Fristensky, Brian
2007-02-09
Molecular biologists need sophisticated analytical tools which often demand extensive computational resources. While finding, installing, and using these tools can be challenging, pipelining data from one program to the next is particularly awkward, especially when using web-based programs. At the same time, system administrators tasked with maintaining these tools do not always appreciate the needs of research biologists. BIRCH (Biological Research Computing Hierarchy) is an organizational framework for delivering bioinformatics resources to a user group, scaling from a single lab to a large institution. The BIRCH core distribution includes many popular bioinformatics programs, unified within the GDE (Genetic Data Environment) graphic interface. Of equal importance, BIRCH provides the system administrator with tools that simplify the job of managing a multiuser bioinformatics system across different platforms and operating systems. These include tools for integrating locally-installed programs and databases into BIRCH, and for customizing the local BIRCH system to meet the needs of the user base. BIRCH can also act as a front end to provide a unified view of already-existing collections of bioinformatics software. Documentation for the BIRCH and locally-added programs is merged in a hierarchical set of web pages. In addition to manual pages for individual programs, BIRCH tutorials employ step by step examples, with screen shots and sample files, to illustrate both the important theoretical and practical considerations behind complex analytical tasks. BIRCH provides a versatile organizational framework for managing software and databases, and making these accessible to a user base. Because of its network-centric design, BIRCH makes it possible for any user to do any task from anywhere.
NASA Astrophysics Data System (ADS)
Kong, Xianming; Squire, Kenneth; Wang, Alan X.
2018-02-01
Surface-enhanced Raman scattering (SERS) spectroscopy has attracted considerable attention recently as a powerful detection platform in biosensing because of the wealth of inherent information it provides about the chemical and molecular composition of a sample. However, real-world samples are often composed of many components, which makes the detection of constituents of mixed samples very challenging for SERS sensing. Accordingly, separation techniques are needed before SERS measurements. Thin layer chromatography (TLC) is a simple, fast and cost-effective technique for analyte separation and can play a pivotal role in on-site sensing. However, combining TLC with SERS has succeeded only for a limited number of analytes with large Raman scattering cross sections. Histamine (2-(4-imidazolyl)-ethylamine), a biogenic amine, is linked to many health problems resulting from seafood consumption worldwide. Diatomaceous earth consists of the fossilized remains of diatoms, a type of hard-shelled algae. As a natural photonic biosilica from geological deposits, it has a variety of unique properties including a highly porous structure, excellent adsorption capacity, and low cost. In addition, the two-dimensional periodic pores of diatomite, with their hierarchical nanoscale photonic crystal features, can enhance the localized optical field. Herein, we fabricate TLC plates from diatomite as the stationary phase and combine them with SERS to separate and detect histamine from seafood samples. We show that the diatomite on the TLC plate not only functions as the stationary phase but also provides additional Raman enhancement, and a detection limit of 2 ppm was achieved for pyrene in a mixture.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Casteleyn, L., E-mail: Ludwine.Casteleyn@med.kuleuven.be; Dumez, B.; Becker, K.
In 2004 the European Commission and Member States initiated activities towards a harmonized approach for Human Biomonitoring surveys throughout Europe. The main objective was to sustain environmental health policy by building a coherent and sustainable framework and by increasing the comparability of data across countries. A pilot study to test common guidelines for setting up surveys was considered a key step in this process. Through a bottom-up approach that included all stakeholders, a joint study protocol was elaborated. From September 2011 to February 2012, 17 European countries collected data from 1844 mother–child pairs in the frame of the DEMOnstration of a study to COordinate and Perform Human Biomonitoring on a European Scale (DEMOCOPHES). Mercury in hair and urinary cadmium and cotinine were selected as biomarkers of exposure covered by sufficient analytical experience. Phthalate metabolites and bisphenol A in urine were added to take into account increasing public and political awareness of emerging types of contaminants and to test less advanced markers, i.e., markers covered by less analytical experience. Extensive efforts towards chemo-analytical comparability were included. The pilot study showed that common approaches can be found in a context of considerable differences with respect to experience and expertise, socio-cultural background, economic situation and national priorities. It also showed that comparable Human Biomonitoring results can be obtained in such a context. A European network was built, exchanging information, expertise and experiences, and providing training on all aspects of a survey. A key challenge was finding the right balance between a rigid structure allowing maximal comparability and a flexible approach increasing feasibility and capacity building. Next steps in European harmonization of Human Biomonitoring surveys include the establishment of a joint process for prioritizing the substances to cover and the biomarkers to develop, linking biomonitoring surveys with health examination surveys and with research, and coping with the diverse implementations of EU regulations and international guidelines with respect to ethics and privacy. Highlights: • A common European Human Biomonitoring (HBM) survey protocol was developed through a bottom-up approach. • A joint process for prioritization was established to select a limited set of biomarkers, some covered by experience and others for emerging substances. • The protocol was tested in a pilot study, resulting in HBM results comparable on a European scale which sustained environmental health policy. • Ethics and privacy regulations were not an obstacle for transnational harmonization.
2014-11-14
responses from any analyte under consideration. Figure 1 illustrates this behavior. [Figure 1: LIBS spectra from OVA (ricin simulant) on several different substrates: steel, aluminum, and polycarbonate.]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ivanov, A. P.
2009-06-15
In the referenced paper, an analytical approach was introduced that allows one to demonstrate instability in linearly stable systems, specifically in a classical three-body problem. These considerations are disproved here.
Surface-bonded ionic liquid stationary phases in high-performance liquid chromatography--a review.
Pino, Verónica; Afonso, Ana M
2012-02-10
Ionic liquids (ILs) are a class of ionic, nonmolecular solvents that remain in the liquid state at temperatures below 100°C. ILs possess a variety of properties including low to negligible vapor pressure, high thermal stability, miscibility with water or a variety of organic solvents, and variable viscosity. IL-modified silica as a novel class of high-performance liquid chromatography (HPLC) stationary phases has attracted considerable attention for its differential behavior and low free-silanol activity. Indeed, around 21 surface-confined ionic liquid (SCIL) stationary phases have been developed in the last six years. Their chromatographic behavior has been studied and, despite the presence of a positive charge on the stationary phase, they show considerable promise for the separation of neutral solutes (not only basic analytes) when operated in reversed-phase mode. This aspect points to their potential as truly multimodal stationary phases. This review summarizes the state of the art of SCIL phases, including their preparation, chromatographic behavior, and analytical performance. Copyright © 2011 Elsevier B.V. All rights reserved.
Schroder, Kerstin E. E.; Carey, Michael P.; Vanable, Peter A.
2008-01-01
Investigation of sexual behavior involves many challenges, including how to assess sexual behavior and how to analyze the resulting data. Sexual behavior can be assessed using absolute frequency measures (also known as “counts”) or with relative frequency measures (e.g., rating scales ranging from “never” to “always”). We discuss these two assessment approaches in the context of research on HIV risk behavior. We conclude that these two approaches yield non-redundant information and, more importantly, that only data yielding information about the absolute frequency of risk behavior have the potential to serve as valid indicators of HIV contraction risk. However, analyses of count data may be challenging due to non-normal distributions with many outliers. Therefore, we identify new and powerful data analytical solutions that have been developed recently to analyze count data, and discuss limitations of a commonly applied method (viz., ANCOVA using baseline scores as covariates). PMID:14534027
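As a hedged illustration of the count-appropriate alternatives the authors point to, the sketch below fits a negative binomial regression (which tolerates the skew and outliers that break a normal-theory ANCOVA), adjusting for baseline counts. The data are simulated; this is not the authors' analysis, and fixing the dispersion alpha in a statsmodels GLM is just one of several reasonable choices.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
group = rng.integers(0, 2, n)            # 0 = control, 1 = intervention
baseline = rng.poisson(5, n)             # baseline risk-act counts
mu = np.exp(1.2 + 0.08 * baseline - 0.5 * group)
# Overdispersed outcome counts with mean mu (NB parametrized via n, p)
counts = rng.negative_binomial(n=2, p=2 / (2 + mu))

X = sm.add_constant(np.column_stack([baseline, group]))
model = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.5))
result = model.fit()
print(result.summary())  # exp(coef on group) = rate ratio for intervention
```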
Identifying and Coordinating Care for Complex Patients
Rudin, Robert S.; Gidengil, Courtney A.; Predmore, Zachary; Schneider, Eric C.; Sorace, James; Hornstein, Rachel
2017-01-01
In the United States, a relatively small proportion of complex patients (defined as having multiple comorbidities, high risk for poor outcomes, and high cost) incur most of the nation's health care costs. Improved care coordination and management of complex patients could reduce costs while increasing quality of care. However, care coordination efforts face multiple challenges, such as segmenting populations of complex patients to better match their needs with the design of specific interventions, understanding how to reduce spending, and integrating care coordination programs into providers' care delivery processes. Innovative uses of analytics and health information technology (HIT) may address these challenges. Rudin and colleagues at RAND completed a literature review and held discussions with subject matter experts, reaching the conclusion that analytics and HIT are being used in innovative ways to coordinate care for complex patients, but that the capabilities are limited, evidence of their effectiveness is lacking, challenges are substantial, and important foundational work is still needed. PMID:28845354
Pleil, Joachim D; Angrish, Michelle M; Madden, Michael C
2015-12-11
Immunochemistry is an important clinical tool for indicating biological pathways leading towards disease. Standard enzyme-linked immunosorbent assays (ELISA) are labor intensive and lack sensitivity at low concentrations. Here we report on emerging technology implementing fully-automated ELISA capable of molecular-level detection and describe its application to exhaled breath condensate (EBC) samples. The Quanterix SIMOA HD-1 analyzer was evaluated for analytical performance on inflammatory cytokines (IL-6, TNF-α, IL-1β and IL-8). The system was challenged with human EBC, the most dilute and analytically difficult of the biological media. Calibrations from synthetic samples and spiked EBC showed excellent linearity at trace levels (r² > 0.99). Sensitivities varied by analyte but were robust, from ~0.006 (IL-6) to ~0.01 (TNF-α) pg/mL. All analytes demonstrated response suppression when diluted with deionized water, so assay buffer diluent was found to be a better choice. Analytical runs required ~45 min setup time for loading samples, reagents, calibrants, etc., after which the instrument performs without further intervention for up to 288 separate samples. Currently available kits are limited to single-plex analyses, so sample volumes require adjustments. Sample dilutions should be made with assay diluent to avoid response suppression. Automation performs seamlessly, and data are automatically analyzed and reported in spreadsheet format. The internal 5-parameter logistic (5PL) calibration model should be supplemented with a linear regression spline at the very lowest analyte levels (<1.3 pg/mL).
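For readers unfamiliar with the calibration model mentioned above, here is a minimal sketch (synthetic data, not Quanterix's implementation) of fitting a 5PL curve with scipy; the low-end spline refinement the paper recommends is left out.

```python
import numpy as np
from scipy.optimize import curve_fit

def five_pl(x, a, d, c, b, g):
    """5PL curve: a = zero-dose response, d = plateau, c = mid-point
    concentration, b = slope, g = asymmetry factor."""
    return d + (a - d) / (1.0 + (x / c) ** b) ** g

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])  # pg/mL
resp = np.array([0.9, 1.4, 3.1, 8.0, 21.0, 42.0, 58.0])  # synthetic signal

p0 = [resp.min(), resp.max(), 1.0, 1.0, 1.0]  # crude starting values
params, _ = curve_fit(five_pl, conc, resp, p0=p0, maxfev=20000)
print(dict(zip("a d c b g".split(), np.round(params, 3))))
```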
Hogan, Dianna; Arthaud, Greg; Pattison, Malka; Sayre, Roger G.; Shapiro, Carl
2010-01-01
The analytical framework for understanding ecosystem services in conservation, resource management, and development decisions is multidisciplinary, encompassing a combination of the natural and social sciences. This report summarizes a workshop on 'Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making,' which focused on the analytical process and on identifying research priorities for assessing ecosystem services, their production and use, their spatial and temporal characteristics, their relationship with natural systems, and their interdependencies. Attendees discussed research directions and solutions to key challenges in developing the analytical framework. The discussion was divided into two sessions: (1) the measurement framework: quantities and values, and (2) the spatial framework: mapping and spatial relationships. This workshop was the second of three preconference workshops associated with ACES 2008 (A Conference on Ecosystem Services): Using Science for Decision Making in Dynamic Systems. These three workshops were designed to explore the ACES 2008 theme on decision making and how the concept of ecosystem services can be more effectively incorporated into conservation, restoration, resource management, and development decisions. Preconference workshop 1, 'Developing a Vision: Incorporating Ecosystem Services into Decision Making,' was held on April 15, 2008, in Cambridge, MA. In preconference workshop 1, participants addressed what would have to happen to make ecosystem services be used more routinely and effectively in conservation, restoration, resource management, and development decisions, and they identified some key challenges in developing the analytical framework. Preconference workshop 3, 'Developing an Institutional Framework: Incorporating Ecosystem Services into Decision Making,' was held on October 30, 2008, in Albuquerque, NM; participants examined the relationship between the institutional framework and the use of ecosystem services in decision making.
The seventh penis: towards effective psychoanalytic work with pre-surgical transsexuals.
Withers, Robert
2015-06-01
The author reflects on his contrasting analytic work with two transsexual patients. He uses three previous psychoanalytic studies (Stoller, Morel and Lemma) to explore whether effective analytic work with the issues driving a person's determined wish for sex reassignment surgery (SRS) is possible. Particular consideration is given to how such work might navigate a path between traumatizing and pathologizing the patient on the one hand and avoiding important analytic material out of fear of so doing on the other. The author proceeds to ask whether it is possible to tell in advance, with any degree of reliability, who is and who is not likely to benefit from surgery. He considers certain diagnostic issues in relation to these questions. Illustrations are given of how, in practice, countertransference anxieties about psychopathologizing transsexual patients can contribute to significant difficulties in working clinically with them. It is argued that the understanding and containment of such anxieties could eventually lead to more effective analytic work, and that such work might be further facilitated by considering the contribution of mind-body dissociation to transsexualism. © 2015, The Society of Analytical Psychology.
Useful measures and models for analytical quality management in medical laboratories.
Westgard, James O
2016-02-01
The 2014 Milan Conference "Defining analytical performance goals 15 years after the Stockholm Conference" initiated a new discussion of issues concerning goals for precision, trueness or bias, total analytical error (TAE), and measurement uncertainty (MU). Goal-setting models are critical for analytical quality management, along with error models, quality-assessment models, quality-planning models, as well as comprehensive models for quality management systems. There are also critical underlying issues, such as an emphasis on MU to the possible exclusion of TAE and a corresponding preference for separate precision and bias goals instead of a combined total error goal. This opinion recommends careful consideration of the differences in the concepts of accuracy and traceability and the appropriateness of different measures, particularly TAE as a measure of accuracy and MU as a measure of traceability. TAE is essential to manage quality within a medical laboratory and MU and trueness are essential to achieve comparability of results across laboratories. With this perspective, laboratory scientists can better understand the many measures and models needed for analytical quality management and assess their usefulness for practical applications in medical laboratories.
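As a worked illustration of the TAE measure discussed above, the sketch below applies the conventional linear error model and the related sigma metric; the z value of 1.65 and the example numbers are common illustrative conventions, not prescriptions from the paper.

```python
def total_analytical_error(bias_pct: float, cv_pct: float,
                           z: float = 1.65) -> float:
    """Linear TAE model: TAE = |bias| + z * CV (all in percent)."""
    return abs(bias_pct) + z * cv_pct

def sigma_metric(tea_pct: float, bias_pct: float, cv_pct: float) -> float:
    """Sigma = (allowable total error - |bias|) / CV."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Illustrative assay: allowable total error TEa = 6%, bias = 1%, CV = 1.5%
print(total_analytical_error(1.0, 1.5))  # 3.475% -> within the 6% goal
print(sigma_metric(6.0, 1.0, 1.5))       # ~3.3 sigma
```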
Universal electronics for miniature and automated chemical assays.
Urban, Pawel L
2015-02-21
This minireview discusses universal electronic modules (generic programmable units) and their use by analytical chemists to construct inexpensive, miniature or automated devices. Recently, open-source platforms have gained considerable popularity among tech-savvy chemists because their implementation often does not require expert knowledge and investment of funds. Thus, chemistry students and researchers can easily start implementing them after a few hours of reading tutorials and trial-and-error. Single-board microcontrollers and micro-computers such as Arduino, Teensy, Raspberry Pi or BeagleBone enable collecting experimental data with high precision as well as efficient control of electric potentials and actuation of mechanical systems. They are readily programmed using high-level languages, such as C, C++, JavaScript or Python. They can also be coupled with mobile consumer electronics, including smartphones as well as teleinformatic networks. More demanding analytical tasks require fast signal processing. Field-programmable gate arrays enable efficient and inexpensive prototyping of high-performance analytical platforms, thus becoming increasingly popular among analytical chemists. This minireview discusses the advantages and drawbacks of universal electronic modules, considering their application in prototyping and manufacture of intelligent analytical instrumentation.
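As a concrete, hedged example of the approach described (not taken from the review itself), a single-board microcontroller can stream sensor readings over USB serial while a few lines of Python log them with timestamps; the device path and the one-number-per-line message format are assumptions.

```python
import csv
import time

import serial  # pyserial

PORT = "/dev/ttyACM0"  # typical Arduino/Teensy device path on Linux

with serial.Serial(PORT, baudrate=9600, timeout=2) as dev, \
        open("readings.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["unix_time", "raw_value"])
    for _ in range(100):  # log 100 samples, then stop
        line = dev.readline().decode("ascii", errors="ignore").strip()
        if line:          # e.g. "512" printed by the board's analog read
            writer.writerow([time.time(), line])
```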
Clinical and diagnostic utility of saliva as a non-invasive diagnostic fluid: a systematic review
Nunes, Lazaro Alessandro Soares; Mussavira, Sayeeda
2015-01-01
This systematic review presents the latest trends in salivary research and its applications in health and disease. Among the large number of analytes present in saliva, many are affected by diverse physiological and pathological conditions. Further, the non-invasive, easy and cost-effective collection methods prompt interest in evaluating its diagnostic or prognostic utility. Data accumulated over the past two decades point towards the possible utility of saliva for monitoring overall health, diagnosing and treating various oral or systemic disorders, and monitoring drugs. Advances in saliva-based systems biology have also contributed to the identification of several biomarkers and the development of diverse salivary diagnostic kits and other sensitive analytical techniques. However, its utilization should be carefully evaluated in relation to the standardization of pre-analytical and analytical variables, such as collection and storage methods, circadian variation of analytes, sample recovery, prevention of sample contamination and analytical procedures. In spite of all these challenges, knowledge built on this biological matrix continues to grow. PMID:26110030
Rocchitta, Gaia; Spanu, Angela; Babudieri, Sergio; Latte, Gavinella; Madeddu, Giordano; Galleri, Grazia; Nuvoli, Susanna; Bagella, Paola; Demartis, Maria Ilaria; Fiore, Vito; Manetti, Roberto; Serra, Pier Andrea
2016-01-01
Enzyme-based chemical biosensors rely on biological recognition. In order to operate, the enzymes must be available to catalyze a specific biochemical reaction and be stable under the normal operating conditions of the biosensor. Biosensor design is based on knowledge of the target analyte, as well as the complexity of the matrix in which the analyte has to be quantified. This article reviews the problems resulting from the interaction of enzyme-based amperometric biosensors with complex biological matrices containing the target analyte(s). One of the most challenging disadvantages of amperometric enzyme-based biosensor detection is signal reduction from fouling agents and interference from chemicals present in the sample matrix. This article therefore examines the operating principles of enzymatic biosensors, their analytical performance over time, and the strategies used to optimize their performance. Moreover, the composition of biological fluids is discussed in relation to its interaction with biosensing. PMID:27249001
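To illustrate the operating principle named above, here is a hedged sketch (synthetic data) of the Michaelis-Menten response that an idealized enzyme-based amperometric biosensor follows; real sensors deviate from it under fouling and interference, which is the review's point.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, i_max, km):
    """Idealized amperometric response: I = Imax * [S] / (Km + [S])."""
    return i_max * s / (km + s)

s = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 5.0, 10.0])     # substrate, mM
i = np.array([9.0, 20.0, 33.0, 50.0, 66.0, 83.0, 91.0])  # current, nA

(i_max, km), _ = curve_fit(michaelis_menten, s, i, p0=[100.0, 1.0])
print(f"Imax ~ {i_max:.1f} nA, Km ~ {km:.2f} mM")
# The quasi-linear range useful for quantification lies well below Km.
```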
Climate Analytics as a Service
NASA Technical Reports Server (NTRS)
Schnase, John L.; Duffy, Daniel Q.; McInerney, Mark A.; Webster, W. Phillip; Lee, Tsengdar J.
2014-01-01
Climate science is a big data domain that is experiencing unprecedented growth. In our efforts to address the big data challenges of climate science, we are moving toward a notion of Climate Analytics-as-a-Service (CAaaS). CAaaS combines high-performance computing and data-proximal analytics with scalable data management, cloud computing virtualization, the notion of adaptive analytics, and a domain-harmonized API to improve the accessibility and usability of large collections of climate data. MERRA Analytic Services (MERRA/AS) provides an example of CAaaS. MERRA/AS enables MapReduce analytics over NASA's Modern-Era Retrospective Analysis for Research and Applications (MERRA) data collection. The MERRA reanalysis integrates observational data with numerical models to produce a global temporally and spatially consistent synthesis of key climate variables. The effectiveness of MERRA/AS has been demonstrated in several applications. In our experience, CAaaS is providing the agility required to meet our customers' increasing and changing data management and data analysis needs.
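As a toy illustration of the MapReduce pattern underpinning MERRA/AS (not the service's actual API), the sketch below maps synthetic reanalysis-style records to grid-cell keys and reduces them to spatial means.

```python
from collections import defaultdict

# Stand-ins for reanalysis temperature records: (lat, lon, value in K)
records = [
    (40.0, -75.0, 288.1), (40.0, -75.0, 289.4), (35.0, -100.0, 295.0),
    (35.0, -100.0, 296.2), (40.0, -75.0, 287.5),
]

def map_phase(rec):
    lat, lon, value = rec
    return (lat, lon), value  # key = grid cell, value = observation

def reduce_phase(pairs):
    sums, counts = defaultdict(float), defaultdict(int)
    for key, value in pairs:
        sums[key] += value
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

print(reduce_phase(map(map_phase, records)))
# {(40.0, -75.0): 288.33..., (35.0, -100.0): 295.6}
```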
Situational awareness in public health preparedness settings
NASA Astrophysics Data System (ADS)
Mirhaji, Parsa; Michea, Yanko F.; Zhang, Jiajie; Casscells, Samuel W.
2005-05-01
The September 11, 2001 attacks and the anthrax mailings that followed introduced an urgent need to develop technologies that can distinguish between man-made and natural incidents at the public health level. With this objective in mind, government agencies started a funding effort to foster the design, development and implementation of such systems on a wide scale. But the outcomes have not met the expectations set by the resources invested. Multiple elements explain this phenomenon: as has often been the case with technology, new surveillance systems were introduced into the workflow without considering the need to understand and include deeper personal, psychosocial, organizational and methodological factors. The environment in which these systems operate is complex, highly dynamic, uncertain, risky, and subject to intense time pressures. Such 'difficult' environments are very challenging for humans as decision makers. In this paper we challenge these systems from the perspective of human factors design. We propose employing systematic situational awareness research in the design and implementation of the next generation of public health preparedness infrastructures. We believe that systems designed on the basis of such an analytical definition of the domain will enable public health practitioners to effectively collect the most important cues from the environment; process, interpret and understand the information in the context of organizational objectives and immediate tasks at hand; and use that understanding to forecast the short term and long term impact of events on the safety and well-being of the community.
NASA Astrophysics Data System (ADS)
Perdigão, R. A. P.
2017-12-01
Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecast, by analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics along time, without requiring the actual model to be run. In doing so, computational resources are freed and a quick and effective a-priori systematic dynamic evaluation is made of predictability evolution and its challenges, including aspects in the model architecture and intervening variables that may require optimization ahead of initiating any model runs. It further brings out universal dynamic features in the error dynamics elusive to any case specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems with various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.
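To make the contrast concrete, here is a hedged sketch of the traditional ensemble practice the abstract describes (and seeks to replace), using the Lorenz-63 system as a stand-in model: initial conditions are randomly perturbed, the model is run many times, and uncertainty is summarized as ensemble spread, at real computational cost. The integrator and parameters are illustrative choices.

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz-63 equations."""
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def integrate(state, dt=0.005, steps=3000):
    # Simple forward-Euler stepping, adequate for a demonstration
    for _ in range(steps):
        state = state + dt * lorenz63(state)
    return state

rng = np.random.default_rng(1)
base = np.array([1.0, 1.0, 1.0])
ensemble = base + 1e-6 * rng.standard_normal((50, 3))  # tiny perturbations

finals = np.array([integrate(member) for member in ensemble])
# Lumped statistical view of uncertainty after 15 time units: the spread
# has grown to the scale of the attractor despite near-identical starts
print("ensemble std:", finals.std(axis=0))
```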