7 CFR 94.103 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Title 7, Agriculture; POULTRY AND EGG PRODUCTS; Voluntary Analyses of Egg Products; § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg...
FIELD ANALYTICAL SCREENING PROGRAM: PCB METHOD - INNOVATIVE TECHNOLOGY REPORT
This innovative technology evaluation report (ITER) presents information on the demonstration of the U.S. Environmental Protection Agency (EPA) Region 7 Superfund Field Analytical Screening Program (FASP) method for determining polychlorinated biphenyl (PCB) contamination in soil...
Rehabilitation Risk Management: Enabling Data Analytics with Quantified Self and Smart Home Data.
Hamper, Andreas; Eigner, Isabella; Wickramasinghe, Nilmini; Bodendorf, Freimut
2017-01-01
A variety of acute and chronic diseases require rehabilitation at home after treatment. Outpatient rehabilitation is crucial for the quality of the medical outcome but is mainly performed without medical supervision. Non-compliance can lead to severe health risks and readmission to the hospital. While the patient is closely monitored in the hospital, methods and technologies to identify risks at home still have to be developed. We analyze state-of-the-art monitoring systems and technologies and show possibilities for transferring them into rehabilitation monitoring. For this purpose, we analyze sensor technology from the fields of Quantified Self and Smart Homes. The available sensor data from this consumer-grade technology is summarized to give an overview of the possibilities for medical data analytics. Subsequently, we show a conceptual roadmap for transferring data analytics methods to sensor-based rehabilitation risk management.
FIELD ANALYTICAL SCREENING PROGRAM: PCP METHOD - INNOVATIVE TECHNOLOGY EVALUATION REPORT
The Field Analytical Screening Program (FASP) pentachlorophenol (PCP) method uses a gas chromatograph (GC) equipped with a megabore capillary column and flame ionization detector (FID) and electron capture detector (ECD) to identify and quantify PCP. The FASP PCP method is design...
Performance evaluation soil samples utilizing encapsulation technology
Dahlgran, J.R.
1999-08-17
Performance evaluation soil samples and method of their preparation uses encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration. 1 fig.
The Relationship between SW-846, PBMS, and Innovative Analytical Technologies
This paper explains EPA's position regarding testing methods used within waste programs, documentation of EPA's position, the reasoning behind EPA's position, and the relationship between analytical method regulatory flexibility and the use of on-site...
Analytical quality by design: a tool for regulatory flexibility and robust analytics.
Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy
2015-01-01
Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality-by-design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as analytical quality by design (AQbD). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results, owing to the robustness of the method within that region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and process analytical technology (PAT).
Modern Instrumental Methods in Forensic Toxicology*
Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.
2009-01-01
This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968
Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang
2016-10-01
Currently, near infrared spectroscopy (NIRS) has been considered an efficient tool for achieving process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection is introduced. The design of the process analytical system is described in detail, including the selection of monitored processes and testing modes, and the potential risks that should be avoided. Moreover, the development of the related technologies is also presented, including the establishment of monitoring methods for the elution steps of the polyamide resin and macroporous resin chromatography processes, as well as a rapid analysis method for finished products. Based on the authors' research and working experience, several issues in applying NIRS to process monitoring and control in TCM production are then raised, and some potential solutions are discussed. The issues include building a technical team for the process analytical system, designing the process analytical system in the manufacture of TCM products, standardizing the NIRS-based analytical methods, and improving the management of the process analytical system. Finally, prospects for the application of NIRS in the TCM industry are put forward.
Cho, Il-Hoon; Ku, Seockmo
2017-09-30
The development of novel and high-tech solutions for rapid, accurate, and non-laborious microbial detection methods is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed via qualitative analysis methods. The inherent disadvantage of these analytical methods is the necessity for specimen preparation, which is a time-consuming process. While some literature describes the challenges and opportunities to overcome the technical issues related to food industry legal guidelines, there is a lack of reviews of the current trials to overcome technological limitations related to sample preparation and microbial detection via nano and micro technologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. In order to introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.
Mirski, Tomasz; Bartoszcze, Michał; Bielawska-Drózd, Agata; Cieślik, Piotr; Michalski, Aleksander J; Niemcewicz, Marcin; Kocik, Janusz; Chomiczewski, Krzysztof
2014-01-01
Modern threats of bioterrorism force the need to develop methods for rapid and accurate identification of dangerous biological agents. Currently, there are many types of methods used in this field of studies that are based on immunological or genetic techniques, or constitute a combination of both methods (immuno-genetic). There are also methods that have been developed on the basis of physical and chemical properties of the analytes. Each group of these analytical assays can be further divided into conventional methods (e.g. simple antigen-antibody reactions, classical PCR, real-time PCR), and modern technologies (e.g. microarray technology, aptamers, phosphors, etc.). Nanodiagnostics constitute another group of methods that utilize the objects at a nanoscale (below 100 nm). There are also integrated and automated diagnostic systems, which combine different methods and allow simultaneous sampling, extraction of genetic material and detection and identification of the analyte using genetic, as well as immunological techniques.
Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.; Dalton, Angela C.; Dale, Crystal
2014-06-01
Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
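The coupled risk modeling described in this abstract can be illustrated with a toy Monte Carlo sketch. Everything below (the two candidate systems' parameters, the profit model, the distributions) is hypothetical and for illustration only, not taken from the report; the point is simply that uncertainty in maturity and capture performance propagates into a distribution over a profitability metric that can then be compared across technologies.

```python
import random

random.seed(42)

def profit_samples(maturity_mean, perf_mean, n=10_000):
    """Draw Monte Carlo samples of a toy profitability metric.

    Maturity and capture performance are modelled as independent normal
    deviates; profitability inherits their uncertainty. All parameters
    are hypothetical.
    """
    out = []
    for _ in range(n):
        maturity = random.gauss(maturity_mean, 0.1)   # technology readiness factor
        capture = random.gauss(perf_mean, 0.05)       # capture efficiency
        # Toy profit model: revenue scales with capture, cost falls with maturity.
        out.append(100 * capture - 40 / max(maturity, 0.1))
    return out

solid_sorbent = profit_samples(maturity_mean=0.6, perf_mean=0.90)
liquid_solvent = profit_samples(maturity_mean=0.8, perf_mean=0.85)

mean = lambda xs: sum(xs) / len(xs)
print(f"solid sorbent  mean profit = {mean(solid_sorbent):.1f}")
print(f"liquid solvent mean profit = {mean(liquid_solvent):.1f}")
```

Comparing whole sample distributions (not just means) is what allows the risk-informed comparison the abstract describes; risk measures such as the probability of negative profit fall out of the same samples.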
Research and development of LANDSAT-based crop inventory techniques
NASA Technical Reports Server (NTRS)
Horvath, R.; Cicone, R. C.; Malila, W. A. (Principal Investigator)
1982-01-01
A wide spectrum of technology pertaining to the inventory of crops using LANDSAT without in situ training data is addressed. Methods considered include Bayesian based through-the-season methods, estimation technology based on analytical profile fitting methods, and expert-based computer aided methods. Although the research was conducted using U.S. data, the adaptation of the technology to the Southern Hemisphere, especially Argentina was considered.
The overall goal of this task is to help reduce the uncertainties in the assessment of environmental health and human exposure by better characterizing hazardous wastes through cost-effective analytical methods. Research projects are directed towards the applied development and ...
Assessment of technological level of stem cell research using principal component analysis.
Do Cho, Sung; Hwan Hyun, Byung; Kim, Jae Kyeom
2016-01-01
In general, technological levels have been assessed based on specialists' opinions through methods such as Delphi. In such cases, however, results can be significantly biased by the study design and the individual experts. In this study, therefore, scientific literature and patents were selected by means of analytic indexes for a statistical approach to the technical assessment of stem cell fields. The analytic indexes, the numbers and impact indexes of scientific publications and patents, were weighted based on principal component analysis and then summed into a single value. Technological obsolescence was calculated through the cited half-life of patents issued by the United States Patent and Trademark Office and was reflected in the technological level assessment. As a result, each nation's technology level was ranked by the proposed method, and we were able to evaluate strengths and weaknesses thereof. Although our empirical research presents faithful results, further studies should compare the existing methods with the suggested method.
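The weighting scheme this abstract describes (principal component loadings used to fold several literature and patent indicators into one technology-level score per nation) can be sketched roughly as follows. The indicator values are invented for illustration; the paper's actual indexes and data are not reproduced here.

```python
import numpy as np

# Hypothetical per-nation indicators (rows = nations, cols = indicators):
# paper count, paper impact, patent count, patent impact.
X = np.array([
    [120, 3.1, 80, 2.0],
    [300, 4.5, 210, 3.2],
    [60, 2.2, 30, 1.1],
    [180, 3.8, 150, 2.7],
], dtype=float)

# Standardize, then weight each indicator by its loading on the first
# principal component and sum into a single technology-level score.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
cov = np.cov(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
pc1 = eigvecs[:, -1]              # loading vector of the largest eigenvalue
pc1 = pc1 * np.sign(pc1.sum())    # fix sign so higher indicators -> higher score
scores = Z @ pc1

ranking = np.argsort(-scores)     # nation indices, best first
print(ranking)
```

With correlated indicators like these, the first component acts as a data-driven weighted average, which is the rationale for using it instead of subjectively chosen weights.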
Multidisciplinary optimization in aircraft design using analytic technology models
NASA Technical Reports Server (NTRS)
Malone, Brett; Mason, W. H.
1991-01-01
An approach to multidisciplinary optimization is presented which combines the Global Sensitivity Equation method, parametric optimization, and analytic technology models. The result is a powerful yet simple procedure for identifying key design issues. It can be used both to investigate technology integration issues very early in the design cycle, and to establish the information flow framework between disciplines for use in multidisciplinary optimization projects using much more computationally intense representations of each technology. To illustrate the approach, an examination of the optimization of a short takeoff heavy transport aircraft is presented for numerous combinations of performance and technology constraints.
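The Global Sensitivity Equation method mentioned above assembles the local partial sensitivities of coupled discipline analyses into one linear system for the total derivatives. A minimal sketch with two hypothetical linear disciplines (the coefficients are invented for illustration):

```python
import numpy as np

# Two coupled "disciplines" (hypothetical linear models):
#   y1 = f1(x, y2) = 2.0*x + 0.3*y2
#   y2 = f2(x, y1) = -1.0*x + 0.5*y1
# The Global Sensitivity Equation collects the local partials into one
# linear system for the total derivatives dy/dx:
#   (I - dF/dy) (dy/dx) = dF/dx
dF_dy = np.array([[0.0, 0.3],
                  [0.5, 0.0]])   # partials of each output w.r.t. the other outputs
dF_dx = np.array([2.0, -1.0])    # partials of each output w.r.t. the design variable

dy_dx = np.linalg.solve(np.eye(2) - dF_dy, dF_dx)
print(dy_dx)
```

Solving one small linear system replaces repeated re-converging of the coupled analyses, which is what makes the method attractive early in the design cycle.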
NASA Astrophysics Data System (ADS)
Dirpan, Andi
2018-05-01
This research was intended to select the best handling method or postharvest technology for maintaining the quality of citrus fruit in Selayar, South Sulawesi, Indonesia, from among (1) modified atmosphere packaging (MAP), (2) controlled atmosphere storage (CAS), (3) coatings, (4) hot water treatment (HWT), and (5) hot calcium dip (HCD), by using a combination of the analytic hierarchy process (AHP) and TOPSIS. Improving quality, applicability, increasing shelf life, and reducing cost were used as the criteria for determining the best postharvest technology. The results show that the most important criterion for selecting a postharvest technology is improving quality, followed by increasing shelf life, reducing cost, and applicability. Furthermore, by using TOPSIS, the top-ranked postharvest technology is modified atmosphere packaging (MAP), followed by controlled atmosphere storage (CAS), coatings, hot calcium dip (HCD), and hot water treatment (HWT). Therefore, it can be concluded that the best postharvest technology for Selayar citrus is modified atmosphere packaging (MAP).
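The AHP-plus-TOPSIS procedure can be sketched as follows. The decision matrix and weights are illustrative stand-ins, not the study's data; with these invented numbers the closeness ranking happens to reproduce the ordering reported in the abstract.

```python
import numpy as np

# Rows = alternatives (MAP, CAS, coatings, HCD, HWT); columns = criteria
# (quality, shelf life, cheapness, applicability), all oriented so that
# higher is better. Scores and AHP-style weights are hypothetical.
A = np.array([
    [9.0, 9.0, 6.0, 7.0],   # MAP
    [8.0, 9.0, 4.0, 5.0],   # CAS
    [7.0, 6.0, 8.0, 8.0],   # coatings
    [6.0, 5.0, 7.0, 7.0],   # HCD
    [5.0, 4.0, 9.0, 8.0],   # HWT
])
w = np.array([0.40, 0.30, 0.15, 0.15])  # quality weighted highest, as in the abstract

# TOPSIS: vector-normalize each criterion, apply the weights, then measure
# Euclidean distance to the ideal and anti-ideal alternatives.
V = A / np.sqrt((A ** 2).sum(axis=0)) * w
ideal, anti = V.max(axis=0), V.min(axis=0)
d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
closeness = d_neg / (d_pos + d_neg)   # 1 = ideal, 0 = anti-ideal

order = np.argsort(-closeness)        # alternative indices, best first
print(order)
```

In the combined method, AHP supplies the criterion weights `w` from pairwise comparisons, and TOPSIS turns the weighted matrix into a full ranking.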
The National Shipbuilding Research Program. Environmental Studies and Testing (Phase V)
2000-11-20
... development of an analytical procedure for toxic organic compounds, including TBT (tributyltin), whose turnaround time would be on the order of minutes... Cost of the Subtask was $20,000. Subtask #33 - Turnaround Analytical Method for TBT: This Subtask performed a preliminary investigation leading to a "Quick TBT Analytical Method" that will yield reliable results in 15 minutes, a veritable breakthrough in sampling technology. The Subtask was managed by...
Evaporative concentration on a paper-based device to concentrate analytes in a biological fluid.
Wong, Sharon Y; Cabodi, Mario; Rolland, Jason; Klapperich, Catherine M
2014-12-16
We report the first demonstration of using heat on a paper device to rapidly concentrate a clinically relevant analyte of interest from a biological fluid. Our technology relies on the application of localized heat to a paper strip to evaporate off hundreds of microliters of liquid to concentrate the target analyte. This method can be used to enrich for a target analyte that is present at low concentrations within a biological fluid to enhance the sensitivity of downstream detection methods. We demonstrate our method by concentrating the tuberculosis-specific glycolipid, lipoarabinomannan (LAM), a promising urinary biomarker for the detection and diagnosis of tuberculosis. We show that the heat does not compromise the subsequent immunodetectability of LAM, and in 20 min, the tuberculosis biomarker was concentrated by nearly 20-fold in simulated urine. Our method requires only 500 mW of power, and sample flow is self-driven via capillary action. As such, our technology can be readily integrated into portable, battery-powered, instrument-free diagnostic devices intended for use in low-resource settings.
Technology advancement for integrative stem cell analyses.
Jeong, Yoon; Choi, Jonghoon; Lee, Kwan Hyi
2014-12-01
Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterizations of such stem cells, while providing an important foundation for further development, often disregard the heterogeneity inherent among the individual constituents of a given population. The population-based analysis and characterization of stem cells, and the problems associated with such a blanket approach, only underscore the need for the development of new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Furthermore, while recent advances in micro/nano technology have led to growth in the stem cell analytical field, the underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments, and many technical challenges limit vertically integrated analytical tools. Therefore, by introducing the concept of vertical and horizontal approaches, we propose that adequate methods are needed for integrating information, such that multiple descriptive parameters can be obtained from a stem cell in a single experiment.
The science of visual analysis at extreme scale
NASA Astrophysics Data System (ADS)
Nowell, Lucy T.
2011-01-01
Driven by market forces and spanning the full spectrum of computational devices, computer architectures are changing in ways that present tremendous opportunities and challenges for data analysis and visual analytic technologies. Leadership-class high performance computing systems will have as many as a million cores by 2020 and support 10 billion-way concurrency, while laptop computers are expected to have as many as 1,000 cores by 2015. At the same time, data of all types are increasing exponentially, and automated analytic methods are essential for all disciplines. Many existing analytic technologies do not scale to make full use of current platforms, and fewer still are likely to scale to the systems that will be operational by the end of this decade. Furthermore, on the new architectures and for data at extreme scales, validating the accuracy and effectiveness of analytic methods, including visual analysis, will be increasingly important.
Genetics-based methods for detection of Salmonella spp. in foods.
Mozola, Mark A
2006-01-01
Genetic methods are now at the forefront of foodborne pathogen testing. The sensitivity, specificity, and inclusivity advantages offered by deoxyribonucleic acid (DNA) probe technology have driven an intense effort in methods development over the past 20 years. DNA probe-based methods for Salmonella spp. and other pathogens have progressed from time-consuming procedures involving the use of radioisotopes to simple, high throughput, automated assays. The analytical sensitivity of nucleic acid amplification technology has facilitated a reduction in analysis time by allowing enriched samples to be tested for previously undetectable quantities of analyte. This article will trace the evolution of the development of genetic methods for detection of Salmonella in foods, review the basic assay formats and their advantages and limitations, and discuss method performance characteristics and considerations for selection of methods.
Knowledge, Skills, and Abilities for Entry-Level Business Analytics Positions: A Multi-Method Study
ERIC Educational Resources Information Center
Cegielski, Casey G.; Jones-Farmer, L. Allison
2016-01-01
It is impossible to deny the significant impact from the emergence of big data and business analytics on the fields of Information Technology, Quantitative Methods, and the Decision Sciences. Both industry and academia seek to hire talent in these areas with the hope of developing organizational competencies. This article describes a multi-method…
AmO2 Analysis for Analytical Method Testing and Assessment: Analysis Support for AmO2 Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhn, Kevin John; Bland, Galey Jean; Fulwyler, James Brent
Americium oxide samples will be measured for various analytes to support AmO2 production. The key analytes currently requested by the Am production customer at LANL include total Am content, Am isotopics, Pu assay, Pu isotopics, and trace element content, including 237Np content. Multiple analytical methods will be utilized depending on the sensitivity, accuracy, and precision needs of the Am matrix. Traceability to the National Institute of Standards and Technology (NIST) will be achieved, where applicable, by running NIST-traceable quality control materials, given that there are no suitable AmO2 reference materials currently available for the requested analytes. The primary objective is to demonstrate the suitability of actinide analytical chemistry methods to support AmO2 production operations.
Recent advances in immunosensor for narcotic drug detection
Gandhi, Sonu; Suman, Pankaj; Kumar, Ashok; Sharma, Prince; Capalash, Neena; Suri, C. Raman
2015-01-01
Introduction: Immunosensors for illicit drugs have gained immense interest and have found several applications in drug abuse monitoring. This technology offers low-cost detection of narcotics, thereby providing a confirmatory platform to complement existing analytical methods. Methods: In this minireview, we define the basic concept of a transducer for immunosensor development that utilizes antibodies and low molecular mass hapten (opiate) molecules. Results: This article emphasizes recent advances in immunoanalytical techniques for the monitoring of opiate drugs. Our results demonstrate that high-quality antibodies can be used for immunosensor development against a target analyte with greater sensitivity, specificity, and precision than other available analytical methods. Conclusion: In this review we highlight the fundamentals of different transducer technologies and their applications for immunosensor development currently being pursued in our laboratory, including rapid screening via an immunochromatographic kit and label-free optical detection via enzyme-, fluorescence-, gold nanoparticle-, and carbon nanotube-based immunosensing for sensitive and specific monitoring of opiates. PMID:26929925
Chemical Detection and Identification Techniques for Exobiology Flight Experiments
NASA Technical Reports Server (NTRS)
Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.
2002-01-01
Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and on the surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capability with minimal requirements for volume, weight, and consumables. Advances may be achieved by increasing the amount of information acquired by a given technique and by miniaturization of proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using gas chromatography (GC). These efforts to expand the analytical capabilities of GC technology are focused on the development of GC detectors that provide sample identification independent of GC retention time data. A novel new approach employs Penning Ionization Electron Spectroscopy (PIES).
2011-02-01
Process Architecture Technology Analysis (report fragment; only table-of-contents entries were recovered, covering an executive component and flow code implemented in UIMA).
Triangulation and Mixed Methods Designs: Data Integration with New Research Technologies
ERIC Educational Resources Information Center
Fielding, Nigel G.
2012-01-01
Data integration is a crucial element in mixed methods analysis and conceptualization. It has three principal purposes: illustration, convergent validation (triangulation), and the development of analytic density or "richness." This article discusses such applications in relation to new technologies for social research, looking at three…
Innovations in coating technology.
Behzadi, Sharareh S; Toegel, Stefan; Viernstein, Helmut
2008-01-01
Despite representing one of the oldest pharmaceutical techniques, coating of dosage forms is still frequently used in pharmaceutical manufacturing. The aims of coating range from simply masking the taste or odour of drugs to the sophisticated controlling of site and rate of drug release. The high expectations for different coating technologies have required great efforts regarding the development of reproducible and controllable production processes. Basically, improvements in coating methods have focused on particle movement, spraying systems, and air and energy transport. Thereby, homogeneous distribution of coating material and increased drying efficiency should be accomplished in order to achieve high end product quality. Moreover, given the claim of the FDA to design the end product quality already during the manufacturing process (Quality by Design), the development of analytical methods for the analysis, management and control of coating processes has attracted special attention during recent years. The present review focuses on recent patents claiming improvements in pharmaceutical coating technology and intends to first familiarize the reader with the available procedures and to subsequently explain the application of different analytical tools. Aiming to structure this comprehensive field, coating technologies are primarily divided into pan and fluidized bed coating methods. Regarding pan coating procedures, pans rotating around inclined, horizontal and vertical axes are reviewed separately. On the other hand, fluidized bed technologies are subdivided into those involving fluidized and spouted beds. Then, continuous processing techniques and improvements in spraying systems are discussed in dedicated chapters. Finally, currently used analytical methods for the understanding and management of coating processes are reviewed in detail in the last section of the review.
Control research in the NASA high-alpha technology program
NASA Technical Reports Server (NTRS)
Gilbert, William P.; Nguyen, Luat T.; Gera, Joseph
1990-01-01
NASA is conducting a focused technology program, known as the High-Angle-of-Attack Technology Program, to accelerate the development of flight-validated technology applicable to the design of fighters with superior stall and post-stall characteristics and agility. A carefully integrated effort is underway combining wind tunnel testing, analytical predictions, piloted simulation, and full-scale flight research. A modified F-18 aircraft has been extensively instrumented for use as the NASA High-Angle-of-Attack Research Vehicle used for flight verification of new methods and concepts. This program stresses the importance of providing improved aircraft control capabilities both by powered control (such as thrust-vectoring) and by innovative aerodynamic control concepts. The program is accomplishing extensive coordinated ground and flight testing to assess and improve available experimental and analytical methods and to develop new concepts for enhanced aerodynamics and for effective control, guidance, and cockpit displays essential for effective pilot utilization of the increased agility provided.
Challa, Shruthi; Potumarthi, Ravichandra
2013-01-01
Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specifications and improve the productivity. PAT implementation eliminates the drawbacks of traditional methods which involves excessive sampling and facilitates rapid testing through direct sampling without any destruction of sample. However, to successfully adapt PAT tools into pharmaceutical and biopharmaceutical environment, thorough understanding of the process is needed along with mathematical and statistical tools to analyze large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric method along with their advantages and working principle are discussed. Finally, systematic application of PAT tools in biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools
2014-01-14
Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools. Final Technical Report SERC-2014-TR-041-1, January 14... by the U.S. Department of Defense through the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171 (Task Order 0026, RT 51... SERC is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology. Any opinions, findings and
Churchwell, Mona I; Twaddle, Nathan C; Meeker, Larry R; Doerge, Daniel R
2005-10-25
Recent technological advances have made available reverse phase chromatographic media with a 1.7 μm particle size along with a liquid handling system that can operate such columns at much higher pressures. This technology, termed ultra performance liquid chromatography (UPLC), offers significant theoretical advantages in resolution, speed, and sensitivity for analytical determinations, particularly when coupled with mass spectrometers capable of high-speed acquisitions. This paper explores the differences in LC-MS performance by conducting a side-by-side comparison of UPLC for several methods previously optimized for HPLC-based separation and quantification of multiple analytes with maximum throughput. In general, UPLC produced significant improvements in method sensitivity, speed, and resolution. Sensitivity increases with UPLC, which were found to be analyte-dependent, were as large as 10-fold and improvements in method speed were as large as 5-fold under conditions of comparable peak separations. Improvements in chromatographic resolution with UPLC were apparent from generally narrower peak widths and from a separation of diastereomers not possible using HPLC. Overall, the improvements in LC-MS method sensitivity, speed, and resolution provided by UPLC show that further advances can be made in analytical methodology to add significant value to hypothesis-driven research.
Wirges, M; Funke, A; Serno, P; Knop, K; Kleinebudde, P
2013-05-05
Incorporation of an active pharmaceutical ingredient (API) into the coating layer of film-coated tablets is a method mainly used to formulate fixed-dose combinations. Uniform and precise spray-coating of an API represents a substantial challenge, which could be overcome by applying Raman spectroscopy as process analytical tool. In pharmaceutical industry, Raman spectroscopy is still mainly used as a bench top laboratory analytical method and usually not implemented in the production process. Concerning the application in the production process, a lot of scientific approaches stop at the level of feasibility studies and do not manage the step to production scale and process applications. The present work puts the scale up of an active coating process into focus, which is a step of highest importance during the pharmaceutical development. Active coating experiments were performed at lab and production scale. Using partial least squares (PLS), a multivariate model was constructed by correlating in-line measured Raman spectral data with the coated amount of API. By transferring this model, being implemented for a lab scale process, to a production scale process, the robustness of this analytical method and thus its applicability as a Process Analytical Technology (PAT) tool for the correct endpoint determination in pharmaceutical manufacturing could be shown. Finally, this method was validated according to the European Medicine Agency (EMA) guideline with respect to the special requirements of the applied in-line model development strategy. Copyright © 2013 Elsevier B.V. All rights reserved.
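The in-line calibration idea described above, correlating measured spectra with the coated amount of API via partial least squares, can be sketched in miniature. The following is a one-component PLS (NIPALS-style) fit on synthetic stand-in "spectra", not the paper's Raman data or its multivariate model:

```python
import numpy as np

# Synthetic data: each "spectrum" is a Gaussian band whose intensity scales
# with the coated API amount, plus noise. Units and shapes are assumptions.
rng = np.random.default_rng(0)
n, p = 40, 200
api = rng.uniform(0.0, 10.0, n)                        # coated API amount (assumed mg)
band = np.exp(-0.5 * ((np.arange(p) - 100) / 5.0) ** 2)
X = np.outer(api, band) + rng.normal(0, 0.05, (n, p))  # spectra with noise

# Center the calibration set (first 30 samples), then fit one PLS component.
x_mean, y_mean = X[:30].mean(0), api[:30].mean()
Xc, yc = X[:30] - x_mean, api[:30] - y_mean
w = Xc.T @ yc
w /= np.linalg.norm(w)          # PLS weight vector (direction of max covariance)
t = Xc @ w                      # scores
b = (t @ yc) / (t @ t)          # regression coefficient on the scores

# Predict the held-out "production" samples and compute RMSEP.
pred = (X[30:] - x_mean) @ w * b + y_mean
rmsep = np.sqrt(np.mean((pred - api[30:]) ** 2))
print(f"RMSEP: {rmsep:.3f}")
```

Transferring such a model between scales, as the paper does, would additionally require checking that the spectral response per unit of API is preserved on the larger equipment.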
Evolution of microbiological analytical methods for dairy industry needs
Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence
2014-01-01
Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus, offer new perspectives to integration of microbial physiology monitoring to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry’s needs. Recent studies show that Polymerase chain reaction-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potentialities. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675
Review of spectral imaging technology in biomedical engineering: achievements and challenges.
Li, Qingli; He, Xiaofu; Wang, Yiting; Liu, Hongying; Xu, Dongrong; Guo, Fangmin
2013-10-01
Spectral imaging is a technology that integrates conventional imaging and spectroscopy to get both spatial and spectral information from an object. Although this technology was originally developed for remote sensing, it has been extended to the biomedical engineering field as a powerful analytical tool for biological and biomedical research. This review introduces the basics of spectral imaging, imaging methods, current equipment, and recent advances in biomedical applications. The performance and analytical capabilities of spectral imaging systems for biological and biomedical imaging are discussed. In particular, the current achievements and limitations of this technology in biomedical engineering are presented. The benefits and development trends of biomedical spectral imaging are highlighted to provide the reader with an insight into the current technological advances and its potential for biomedical research.
Complex Investigations of Sapphire Crystals Production
NASA Astrophysics Data System (ADS)
Malyukov, S. P.; Klunnikova, Yu V.
The problem of choosing optimum conditions for processing sapphire substrates was solved using optimization methods combined with analytical simulation, experiment, and expert system technology. The experimental results and software provide fairly complete information on the real structure of the sapphire crystal substrates and can be used effectively to optimize substrate preparation technology for electronic devices.
Learning Analytics in Higher Education Development: A Roadmap
ERIC Educational Resources Information Center
Adejo, Olugbenga; Connolly, Thomas
2017-01-01
The increase in education data and advance in technology are bringing about enhanced teaching and learning methodology. The emerging field of Learning Analytics (LA) continues to seek ways to improve the different methods of gathering, analysing, managing and presenting learners' data with the sole aim of using it to improve the student learning…
2013-01-01
Influenza virus-like particle (VLP) vaccines are one of the most promising ways to respond to the threat of future influenza pandemics. VLPs are composed of viral antigens but lack nucleic acids, making them non-infectious, which limits the risk of recombination with wild-type strains. By taking advantage of advancements in cell culture technologies, the process from strain identification to manufacturing can potentially be completed rapidly and easily at large scale. After closely reviewing current research on influenza VLPs, it is evident that the development of quantification methods has been consistently overlooked. VLP quantification at all stages of the production process has been left to rely on current influenza quantification methods (i.e., hemagglutination assay (HA), single radial immunodiffusion assay (SRID), NA enzymatic activity assays, Western blot, electron microscopy). These are analytical methods developed decades ago for influenza virions and final bulk influenza vaccines. Although these methods are time-consuming and cumbersome, they have been sufficient for the characterization of final purified material. Nevertheless, they are impractical for in-line process monitoring because the VLP concentration in crude samples generally falls outside their range of detection. This consequently impedes the development of robust influenza VLP production and purification processes. Thus, functional process analytical techniques, applicable at every stage of production and compatible with different production platforms, are greatly needed to assess, optimize and exploit the full potential of novel manufacturing platforms. PMID:23642219
SOIL AND SEDIMENT SAMPLING METHODS
The EPA Office of Solid Waste and Emergency Response's (OSWER) Office of Superfund Remediation and Technology Innovation (OSRTI) needs innovative methods and techniques to solve new and difficult sampling and analytical problems found at the numerous Superfund sites throughout th...
Vernon, John A; Hughen, W Keener; Johnson, Scott J
2005-05-01
In the face of significant real healthcare cost inflation, pressured budgets, and ongoing launches of myriad technology of uncertain value, payers have formalized new valuation techniques that represent a barrier to entry for drugs. Cost-effectiveness analysis predominates among these methods, which involves differencing a new technological intervention's marginal costs and benefits with a comparator's, and comparing the resulting ratio to a payer's willingness-to-pay threshold. In this paper we describe how firms are able to model the feasible range of future product prices when making in-licensing and developmental Go/No-Go decisions by considering payers' use of the cost-effectiveness method. We illustrate this analytic method with a simple deterministic example and then incorporate stochastic assumptions using both analytic and simulation methods. Using this strategic approach, firms may reduce product development and in-licensing risk.
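The pricing logic described above admits a simple deterministic sketch, in the spirit of the paper's own deterministic example: compute the incremental cost-effectiveness ratio (ICER) and invert it against a payer's willingness-to-pay (WTP) threshold to bound the feasible drug price. All numbers below are hypothetical:

```python
def icer(cost_new, effect_new, cost_cmp, effect_cmp):
    """Incremental cost per unit of incremental benefit (e.g. per QALY)."""
    return (cost_new - cost_cmp) / (effect_new - effect_cmp)

def max_price(effect_new, cost_cmp, effect_cmp, wtp, other_costs_new=0.0):
    """Highest drug price at which the ICER still meets the WTP threshold,
    obtained by solving ICER = WTP for the new intervention's drug cost."""
    return wtp * (effect_new - effect_cmp) + cost_cmp - other_costs_new

wtp = 50_000.0                                   # $ per QALY (assumed threshold)
comparator = {"cost": 10_000.0, "qalys": 1.0}    # hypothetical comparator
new = {"qalys": 1.5, "other_costs": 2_000.0}     # hypothetical new intervention

p_max = max_price(new["qalys"], comparator["cost"], comparator["qalys"],
                  wtp, new["other_costs"])
print(f"Maximum feasible price: ${p_max:,.0f}")
assert icer(p_max + new["other_costs"], new["qalys"],
            comparator["cost"], comparator["qalys"]) <= wtp
```

The paper's stochastic extension would replace the point estimates above with distributions and evaluate the feasible price range by analytic propagation or simulation.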
Adamec, Jiri; Yang, Wen-Chu; Regnier, Fred E
2014-01-14
Reagents and methods are provided that permit simultaneous analysis of multiple diverse small molecule analytes present in a complex mixture. Samples are labeled with chemically identical but isotopically distinct forms of the labeling reagent, and analyzed using mass spectrometry. A single reagent simultaneously derivatizes multiple small molecule analytes having different reactive functional groups.
Analyzing the Heterogeneous Hierarchy of Cultural Heritage Materials: Analytical Imaging.
Trentelman, Karen
2017-06-12
Objects of cultural heritage significance are created using a wide variety of materials, or mixtures of materials, and often exhibit heterogeneity on multiple length scales. The effective study of these complex constructions thus requires the use of a suite of complementary analytical technologies. Moreover, because of the importance and irreplaceability of most cultural heritage objects, researchers favor analytical techniques that can be employed noninvasively, i.e., without having to remove any material for analysis. As such, analytical imaging has emerged as an important approach for the study of cultural heritage. Imaging technologies commonly employed, from the macroscale through the micro- to nanoscale, are discussed with respect to how the information obtained helps us understand artists' materials and methods, the cultures in which the objects were created, how the objects may have changed over time, and importantly, how we may develop strategies for their preservation.
Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong
2015-01-01
Objectives: This study presents indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. Methods: The collected results were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. Results: These weights may prove useful in avoiding having to resort to qualitative means in the absence of weights between indicators when integrating the results of quantitative assessment by indicator. Conclusions: This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for quantitative assessment of green chemistry technologies. PMID:26206364
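The analytic hierarchy process step used above can be sketched as follows: weights are taken from the principal eigenvector of a pairwise comparison matrix, with a consistency check against Saaty's random index. The 3x3 matrix below is purely illustrative, not the study's expert data:

```python
import numpy as np

# Pairwise comparisons on Saaty's 1-9 scale: A[i, j] = importance of
# indicator i relative to indicator j (hypothetical values).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalized indicator weights

# Consistency index/ratio; 0.58 is Saaty's random index for n = 3.
ci = (eigvals[k].real - len(A)) / (len(A) - 1)
cr = ci / 0.58                         # CR < 0.1 is conventionally acceptable
print("weights:", np.round(w, 3), "CR:", round(cr, 3))
```

In a full AHP study, each expert's matrix would be checked for consistency before the individual weight vectors are aggregated.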
Liquid-cooling technology for gas turbines - Review and status
NASA Technical Reports Server (NTRS)
Van Fossen, G. J., Jr.; Stepka, F. S.
1978-01-01
After a brief review of past efforts involving the forced-convection cooling of gas turbines, the paper surveys the state of the art of the liquid cooling of gas turbines. Emphasis is placed on thermosyphon methods of cooling, including those utilizing closed, open, and closed-loop thermosyphons; other methods, including sweat, spray and stator cooling, are also discussed. The more significant research efforts, design data, correlations, and analytical methods are mentioned and voids in technology are summarized.
ERIC Educational Resources Information Center
Lee, Hyeon Woo
2011-01-01
As the technology-enriched learning environments and theoretical constructs involved in instructional design become more sophisticated and complex, a need arises for equally sophisticated analytic methods to research these environments, theories, and models. Thus, this paper illustrates a comprehensive approach for analyzing data arising from…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bevolo, A.J.; Kjartanson, B.H.; Wonder, J.D.
1996-03-01
The goal of the Ames Expedited Site Characterization (ESC) project is to evaluate and promote both innovative technologies (IT) and state-of-the-practice technologies (SOPT) for site characterization and monitoring. In April and May 1994, the ESC project conducted site characterization, technology comparison, and stakeholder demonstration activities at a former manufactured gas plant (FMGP) owned by Iowa Electric Services (IES) Utilities, Inc., in Marshalltown, Iowa. Three areas of technology were fielded at the Marshalltown FMGP site: geophysical, analytical and data integration. The geophysical technologies are designed to assess the subsurface geological conditions so that the location, fate and transport of the target contaminants may be assessed and forecasted. The analytical technologies/methods are designed to detect and quantify the target contaminants. The data integration technology area consists of hardware and software systems designed to integrate all the site information compiled and collected into a conceptual site model on a daily basis at the site; this conceptual model then becomes the decision-support tool. Simultaneous fielding of different methods within each of the three areas of technology provided data for direct comparison of the technologies fielded, both SOPT and IT. This document reports the results of the site characterization, technology comparison, and ESC demonstration activities associated with the Marshalltown FMGP site. 124 figs., 27 tabs.
Helios: Understanding Solar Evolution Through Text Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Randazzese, Lucien
This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or "breakthroughs," in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand-curated set of full-text documents from Thomson Scientific and other sources.
2005-07-01
approach for measuring the return on Information Technology (IT) investments. A review of existing methods suggests the difficulty in adequately... measuring the returns of IT at various levels of analysis (e.g., firm or process level). To address this issue, this study aims to develop a method for... view (KBV), this paper proposes an analytic method for measuring the historical revenue and cost of IT investments by estimating the amount of
NASA Technical Reports Server (NTRS)
Mudgett, Paul D.; Schultz, John R.; Sauer, Richard L.
1992-01-01
Until 1989, ion chromatography (IC) was the baseline technology selected for the Specific Ion Analyzer, an in-flight inorganic water quality monitor being designed for Space Station Freedom. Recent developments in capillary electrophoresis (CE) may offer significant savings of consumables, power consumption, and weight/volume allocation, relative to IC technology. A thorough evaluation of CE's analytical capability, however, is necessary before one of the two techniques is chosen. Unfortunately, analytical methods currently available for inorganic CE are unproven for NASA's target list of anions and cations. Thus, CE electrolyte chemistry and methods to measure the target contaminants must be first identified and optimized. This paper reports the status of a study to evaluate CE's capability with regard to inorganic and carboxylate anions, alkali and alkaline earth cations, and transition metal cations. Preliminary results indicate that CE has an impressive selectivity and trace sensitivity, although considerable methods development remains to be performed.
Stillhart, Cordula; Kuentz, Martin
2012-02-05
Self-emulsifying drug delivery systems (SEDDS) are complex mixtures in which drug quantification can become a challenging task. Thus, a general need exists for novel analytical methods, and a particular interest lies in techniques with the potential for process monitoring. This article compares Raman spectroscopy with high-resolution ultrasonic resonator technology (URT) for drug quantification in SEDDS. The model drugs fenofibrate, indomethacin, and probucol were quantitatively assayed in different self-emulsifying formulations. We measured ultrasound velocity and attenuation in the bulk formulation containing drug at different concentrations. The formulations were also studied by Raman spectroscopy. We used both an in-line immersion probe for the bulk formulation and a multi-fiber sensor for measuring through hard-gelatin capsules filled with SEDDS. Each method was assessed by calculating the relative standard error of prediction (RSEP) as well as the limit of quantification (LOQ) and the mean recovery. Raman spectroscopy led to excellent calibration models for the bulk formulation as well as the capsules. The RSEP depended on the SEDDS type, with values of 1.5-3.8%, while LOQ was between 0.04 and 0.35% (w/w) for drug quantification in the bulk. Similarly, the analysis of the capsules led to RSEP of 1.9-6.5% and LOQ of 0.01-0.41% (w/w). On the other hand, ultrasound attenuation resulted in RSEP of 2.3-4.4% and LOQ of 0.1-0.6% (w/w). Moreover, ultrasound velocity provided an interesting analytical response in cases where the drug strongly affected the density or compressibility of the SEDDS. We conclude that ultrasonic resonator technology and Raman spectroscopy constitute suitable methods for drug quantification in SEDDS, which is promising for their use as process analytical technologies. Copyright © 2011 Elsevier B.V. All rights reserved.
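The figures of merit reported above can be illustrated with a small sketch. The formulas are common conventions assumed here (RSEP as a percentage relative to the reference values; LOQ as 10 times the residual standard deviation over the slope of a linear calibration), and the data are synthetic:

```python
import numpy as np

def rsep_percent(y_ref, y_pred):
    """RSEP (%) = 100 * sqrt(sum((pred - ref)^2) / sum(ref^2))."""
    y_ref, y_pred = np.asarray(y_ref), np.asarray(y_pred)
    return 100.0 * np.sqrt(np.sum((y_pred - y_ref) ** 2) / np.sum(y_ref ** 2))

def loq(conc, signal):
    """LOQ = 10 * (residual std) / slope, from a linear calibration fit."""
    slope, intercept = np.polyfit(conc, signal, 1)
    resid = signal - (slope * conc + intercept)
    return 10.0 * resid.std(ddof=2) / slope

# Hypothetical calibration: drug concentration (% w/w) vs. instrument response.
conc = np.array([0.1, 0.2, 0.4, 0.8, 1.6])
signal = 2.0 * conc + 0.05 + np.array([0.01, -0.02, 0.015, -0.01, 0.005])
pred = signal / 2.0   # naive back-prediction, for illustration only

print(f"RSEP: {rsep_percent(conc, pred):.1f}%  LOQ: {loq(conc, signal):.3f}% w/w")
```

In practice the predictions would come from the multivariate (e.g. PLS) model rather than a single-channel back-calculation as above.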
Emerging Materials Technologies That Matter to Manufacturers
NASA Technical Reports Server (NTRS)
Misra, Ajay K.
2015-01-01
A brief overview of emerging materials technologies. Exploring the weight reduction benefit of replacing Carbon Fiber with Carbon Nanotube (CNT) in Polymer Composites. Review of the benign purification method developed for CNT sheets. The future of manufacturing will include the integration of computational material design and big data analytics, along with Nanomaterials as building blocks.
Li, Yubo; Zhang, Zhenzhu; Liu, Xinyu; Li, Aizhu; Hou, Zhiguo; Wang, Yuming; Zhang, Yanjun
2015-08-28
This study combines solid phase extraction (SPE) using 96-well plates with column-switching technology to construct a rapid, high-throughput method for the simultaneous extraction and non-targeted analysis of the small-molecule metabolome and lipidome, based on ultra-performance liquid chromatography quadrupole time-of-flight mass spectrometry. The study first investigated the columns and analytical conditions for the small-molecule metabolome and lipidome, separated on HSS T3 and BEH C18 columns, respectively. Next, the loading capacity and actuation duration of SPE were further optimized. Subsequently, SPE and column switching were used together to rapidly and comprehensively analyze the biological samples. The experimental results showed that the new analytical procedure had good precision and maintained sample stability (RSD<15%). The method was then satisfactorily applied to a wider analysis of the small-molecule metabolome and lipidome to test the throughput. The resulting method represents a new analytical approach for biological samples and a highly useful tool for research in metabolomics and lipidomics. Copyright © 2015 Elsevier B.V. All rights reserved.
Quality of Big Data in Healthcare
Sukumar, Sreenivas R.; Ramachandran, Natarajan; Ferrell, Regina Kay
2015-01-01
The current trend in Big Data Analytics and in particular Health information technology is towards building sophisticated models, methods and tools for business, operational and clinical intelligence, but the critical issue of data quality required for these models is not getting the attention it deserves. The objective of the paper is to highlight the issues of data quality in the context of Big Data Healthcare Analytics.
Shaaban, Heba; Górecki, Tadeusz
2015-01-01
Green analytical chemistry is an aspect of green chemistry which was introduced in the late 1990s. The main objectives of green analytical chemistry are to develop new analytical technologies or to modify old methods to incorporate procedures that use less hazardous chemicals. There are several approaches to achieving this goal, such as using environmentally benign solvents and reagents, reducing chromatographic separation times, and miniaturizing analytical devices. Traditional methods used for the analysis of pharmaceutically active compounds require large volumes of organic solvents and generate large amounts of waste. Most of these solvents are volatile and harmful to the environment. With growing environmental awareness, the development of green technologies has been receiving increasing attention, aiming to eliminate or reduce the amount of organic solvents consumed every day worldwide without loss in chromatographic performance. This review provides the state of the art of green analytical methodologies for environmental analysis of pharmaceutically active compounds in the aquatic environment, with special emphasis on strategies for greening liquid chromatography (LC). Current trends in fast LC applied to environmental analysis, including elevated mobile phase temperature as well as different column technologies such as monolithic columns, fully porous sub-2 μm and superficially porous particles, are presented. In addition, green aspects of gas chromatography (GC) and supercritical fluid chromatography (SFC) are discussed. We pay special attention to new green approaches such as automation, miniaturization, direct analysis and the possibility of locating the chromatograph on-line or at-line as a step forward in reducing the environmental impact of chromatographic analyses. Copyright © 2014 Elsevier B.V. All rights reserved.
An Analytical Method for Measuring Competence in Project Management
ERIC Educational Resources Information Center
González-Marcos, Ana; Alba-Elías, Fernando; Ordieres-Meré, Joaquín
2016-01-01
The goal of this paper is to present a competence assessment method in project management that is based on participants' performance and value creation. It seeks to close an existing gap in competence assessment in higher education. The proposed method relies on information and communication technology (ICT) tools and combines Project Management…
Formic Acid: Development of an Analytical Method and Use as Process Indicator in Anaerobic Systems
1992-03-01
distilled to remove compounds such as cinnamic, glycolic and levulinic acids, which can be oxidized to formic acid by ceric sulfate, thus interfering... (Report AD-A250 668) FORMIC ACID: DEVELOPMENT OF AN ANALYTICAL METHOD AND USE AS A PROCESS INDICATOR... of Technology, a unit of the University System of Georgia, School of Civil Engineering, Atlanta, Georgia 30332.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koester, C J; Moulik, A
This article discusses developments in environmental analytical chemistry in 2003 and 2004. References were found by searching the Science Citation Index and Current Contents. As in our review of two years ago (A1), techniques are highlighted that represent current trends and state-of-the-art technologies in the sampling, extraction, separation, and detection of trace concentrations (low parts-per-billion and less) of organic, inorganic, and organometallic contaminants in environmental samples. New analytes of interest are also reviewed, the detection of which is made possible by recently developed analytical instruments and methods.
State-of-the-Art of (Bio)Chemical Sensor Developments in Analytical Spanish Groups
Plata, María Reyes; Contento, Ana María; Ríos, Angel
2010-01-01
(Bio)chemical sensors are one of the most exciting fields in analytical chemistry today. The development of these analytical devices simplifies and miniaturizes the whole analytical process. Although the initial expectation of the massive incorporation of sensors in routine analytical work has been truncated to some extent, in many other cases analytical methods based on sensor technology have solved important analytical problems. Many research groups are working in this field world-wide, reporting interesting results so far. Modestly, Spanish researchers have contributed to these recent developments. In this review, we summarize the more representative achievements carried out for these groups. They cover a wide variety of sensors, including optical, electrochemical, piezoelectric or electro-mechanical devices, used for laboratory or field analyses. The capabilities to be used in different applied areas are also critically discussed. PMID:22319260
The study on the near infrared spectrum technology of sauce component analysis
NASA Astrophysics Data System (ADS)
Li, Shangyu; Zhang, Jun; Chen, Xingdan; Liang, Jingqiu; Wang, Ce
2006-01-01
The author, Shangyu Li, is engaged in supervising and inspecting product quality. In soy sauce manufacturing, quality control of intermediate and final products requires measuring components such as total nitrogen, saltless soluble solids, nitrogen of amino acids and total acid. Wet chemistry analytical methods demand much labor and time for these analyses. To address this problem, we used near infrared spectroscopy to measure the chemical composition of soy sauce. A set of soy sauce samples was collected and analyzed by wet chemistry methods. The samples were then scanned on two kinds of spectrometers: a Fourier transform near infrared (FT-NIR) spectrometer and a filter-type near infrared analyzer. The near infrared spectra of soy sauce were calibrated against the wet chemistry values by partial least squares regression and stepwise multiple linear regression. The contents of saltless soluble solids, total nitrogen, total acid and nitrogen of amino acids were predicted by cross validation, and the results were compared with those of the wet chemistry methods. The correlation coefficient and root-mean-square error of prediction (RMSEP) in the better prediction run were 0.961 and 0.206 for total nitrogen, 0.913 and 1.215 for saltless soluble solids, 0.855 and 0.199 for nitrogen of amino acids, and 0.966 and 0.231 for total acid, respectively. These results demonstrate that NIR spectroscopy is promising for fast and reliable determination of the major components of soy sauce.
Fishman, M. J.
1993-01-01
Methods to be used to analyze samples of water, suspended sediment and bottom material for their content of inorganic and organic constituents are presented. Technology continually changes, and so this laboratory manual includes new and revised methods for determining the concentration of dissolved constituents in water, whole water recoverable constituents in water-suspended sediment samples, and recoverable concentration of constituents in bottom material. For each method, the general topics covered are the application, the principle of the method, interferences, the apparatus and reagents required, a detailed description of the analytical procedure, reporting results, units and significant figures, and analytical precision data. Included in this manual are 30 methods.
DOT National Transportation Integrated Search
2010-10-01
The Volvo-Ford-UMTRI project: Safety Impact Methodology (SIM) for Lane Departure Warning is part of the U.S. Department of Transportation's Advanced Crash Avoidance Technologies (ACAT) program. The project developed a basic analytical framework for e...
Polymerase chain reaction technology as analytical tool in agricultural biotechnology.
Lipp, Markus; Shillito, Raymond; Giroux, Randal; Spiegelhalter, Frank; Charlton, Stacy; Pinero, David; Song, Ping
2005-01-01
The agricultural biotechnology industry applies polymerase chain reaction (PCR) technology at numerous points in product development. Commodity and food companies as well as third-party diagnostic testing companies also rely on PCR technology for a number of purposes. The primary use of the technology is to verify the presence or absence of genetically modified (GM) material in a product or to quantify the amount of GM material present in a product. This article describes the fundamental elements of PCR analysis and its application to the testing of grains. The document highlights the many areas to which attention must be paid in order to produce reliable test results. These include sample preparation, method validation, choice of appropriate reference materials, and biological and instrumental sources of error. The article also discusses issues related to the analysis of different matrixes and the effect they may have on the accuracy of the PCR analytical results.
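One concrete piece of arithmetic behind quantitative PCR is the standard curve: Ct values of serial dilutions are regressed against log10 copy number, and the slope gives the amplification efficiency. The Ct values below are invented for illustration; the article itself does not give numbers:

```python
# Standard-curve regression for quantitative PCR: slope -> efficiency,
# and interpolation of an unknown sample's copy number from its Ct.
import numpy as np

log_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0])   # log10 of template copies
ct = np.array([15.1, 18.4, 21.8, 25.1, 28.4])      # hypothetical threshold cycles

slope, intercept = np.polyfit(log_copies, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0            # 1.0 == 100% (perfect doubling)

def copies_from_ct(ct_value):
    """Interpolate an unknown sample's copy number from the standard curve."""
    return 10 ** ((ct_value - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
```

A slope near -3.32 corresponds to ~100% efficiency; deviations flag the matrix and instrumental error sources the article discusses.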
A Review of Current Methods for Analysis of Mycotoxins in Herbal Medicines
Zhang, Lei; Dou, Xiao-Wen; Zhang, Cheng; Logrieco, Antonio F.; Yang, Mei-Hua
2018-01-01
The presence of mycotoxins in herbal medicines is an established problem throughout the entire world. The sensitive and accurate analysis of mycotoxins in complicated matrices (e.g., herbs) typically involves challenging sample pretreatment procedures and an efficient detection instrument. However, although numerous reviews have been published regarding the occurrence of mycotoxins in herbal medicines, few of them provide a detailed summary of the related analytical methods for mycotoxin determination. This review focuses on analytical techniques, including sampling, extraction, cleanup, and detection, for mycotoxin determination in herbal medicines established within the past ten years. Dedicated sections of this article address the significant developments in sample preparation and highlight the importance of this procedure in the analytical technology. This review also summarizes conventional chromatographic techniques for mycotoxin qualification or quantitation, as well as recent studies regarding the development and application of screening assays such as enzyme-linked immunosorbent assays, lateral flow immunoassays, aptamer-based lateral flow assays, and cytometric bead arrays. The present work provides good insight into the advanced research that has been done and closes with an indication of future demand for the emerging technologies. PMID:29393905
Zhdanov, Michael S. [Salt Lake City, UT]
2008-01-29
Mineral exploration needs a reliable method to distinguish between uneconomic mineral deposits and economic mineralization. A method and system includes a geophysical technique for subsurface material characterization, mineral exploration and mineral discrimination. The technique introduced in this invention detects induced polarization effects in electromagnetic data and uses remote geophysical observations to determine the parameters of an effective conductivity relaxation model using a composite analytical multi-phase model of the rock formations. The conductivity relaxation model and analytical model can be used to determine parameters related by analytical expressions to the physical characteristics of the microstructure of the rocks and minerals. These parameters are ultimately used for the discrimination of different components in underground formations, and in this way provide an ability to distinguish between uneconomic mineral deposits and zones of economic mineralization using geophysical remote sensing technology.
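Conductivity relaxation models of this kind are commonly parameterized in a Cole-Cole form. The sketch below assumes that parameterization purely as an illustration; it is not taken from the patent itself, and the parameter values are hypothetical:

```python
# Cole-Cole complex resistivity: the low-frequency limit approaches rho0
# and the high-frequency limit approaches rho0 * (1 - eta), where eta is
# the chargeability, tau the relaxation time, and c the exponent.
import numpy as np

def cole_cole_resistivity(omega, rho0, eta, tau, c):
    """rho(omega) = rho0 * (1 - eta * (1 - 1 / (1 + (i*omega*tau)**c)))."""
    return rho0 * (1.0 - eta * (1.0 - 1.0 / (1.0 + (1j * omega * tau) ** c)))

rho0, eta, tau, c = 100.0, 0.3, 0.1, 0.5           # hypothetical rock parameters
low = cole_cole_resistivity(1e-6, rho0, eta, tau, c)    # ~ rho0
high = cole_cole_resistivity(1e9, rho0, eta, tau, c)    # ~ rho0 * (1 - eta)
print(abs(low), abs(high))
```

Inverting field data for parameters such as eta and tau is what allows polarizable economic mineralization to be discriminated from uneconomic deposits.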
An applied study using systems engineering methods to prioritize green systems options
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Sonya M; Macdonald, John M
2009-01-01
For many years, there have been questions about the effectiveness of applying different green solutions. If you're building a home and wish to use green technologies, where do you start? While all technologies sound promising, which will perform the best over time? All this has to be considered within the cost and schedule of the project. The amount of information available on the topic can be overwhelming. We seek to examine if Systems Engineering methods can be used to help people choose and prioritize technologies that fit within their project and budget. Several methods are used to gain perspective into how to select the green technologies, such as the Analytic Hierarchy Process (AHP) and Kepner-Tregoe. In our study, subjects applied these methods to analyze cost, schedule, and trade-offs. Results will document whether the experimental approach is applicable to defining system priorities for green technologies.
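The core AHP step can be sketched as follows: priority weights are the principal eigenvector of a pairwise comparison matrix, followed by a consistency check. The comparison matrix below is invented, not taken from the study:

```python
# AHP priority vector via the principal eigenvector, plus the
# consistency ratio (CR < 0.1 is conventionally acceptable).
import numpy as np

# Pairwise comparisons of three hypothetical green technologies
# (entry [i, j] = how strongly option i is preferred over option j).
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 0.5, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                 # normalized priority vector

n = A.shape[0]
lam_max = eigvals[k].real
ci = (lam_max - n) / (n - 1)             # consistency index
cr = ci / 0.58                           # Saaty's random index for n = 3
print(weights, f"CR = {cr:.3f}")
```

Here the first option dominates; in a real study the matrix entries come from stakeholder judgments on Saaty's 1-9 scale.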
Patel, Chirag J
2017-01-01
Mixtures, or combinations and interactions between multiple environmental exposures, are hypothesized to be causally linked with disease and health-related phenotypes. Established and emerging molecular measurement technologies to assay the exposome, the comprehensive battery of exposures encountered from birth to death, promise a new way of identifying mixtures in disease in the epidemiological setting. In this opinion, we describe the analytic complexity and challenges in identifying mixtures associated with phenotype and disease. Existing and emerging machine-learning methods and data analytic approaches (e.g., "environment-wide association studies" [EWASs]), as well as large cohorts may enhance possibilities to identify mixtures of correlated exposures associated with phenotypes; however, the analytic complexity of identifying mixtures is immense. If the exposome concept is realized, new analytical methods and large sample sizes will be required to ascertain how mixtures are associated with disease. The author recommends documenting prevalent correlated exposures and replicated main effects prior to identifying mixtures.
Xiao, Fengjun; Li, Chengzhi; Sun, Jiangman; Zhang, Lianjie
2017-01-01
To study the rapid growth of research on organic photovoltaic (OPV) technology, development trends in the relevant research are analyzed based on CiteSpace software of text mining and visualization in scientific literature. By this analytical method, the outputs and cooperation of authors, the hot research topics, the vital references and the development trend of OPV are identified and visualized. Different from the traditional review articles by the experts on OPV, this work provides a new method of visualizing information about the development of the OPV technology research over the past decade quantitatively.
Bicubic uniform B-spline wavefront fitting technology applied in computer-generated holograms
NASA Astrophysics Data System (ADS)
Cao, Hui; Sun, Jun-qiang; Chen, Guo-jie
2006-02-01
This paper presents a bicubic uniform B-spline wavefront fitting technique for deriving the analytical expression of the object wavefront used in computer-generated holograms (CGHs). In many cases, to reduce the difficulty of optical processing, off-axis CGHs rather than complex aspherical surface elements are used in modern advanced military optical systems. To design and fabricate an off-axis CGH, the analytical expression for the object wavefront must be fitted. Zernike polynomials are well suited to fitting the wavefronts of centrosymmetric optical systems, but not of axisymmetric ones. Although a high-degree polynomial fit can achieve high precision at all fitting nodes, its great shortcoming is that any departure from the nodes results in large fitting error, the so-called pulsation phenomenon. Furthermore, high-degree polynomial fitting increases the calculation time when coding the computer-generated hologram and solving the basic equation. Based on the basis functions of the cubic uniform B-spline and the character mesh of the bicubic uniform B-spline wavefront, the bicubic uniform B-spline wavefront is described as the product of a series of matrices. Employing standard MATLAB routines, four different analytical expressions for object wavefronts were fitted by bicubic uniform B-splines as well as by high-degree polynomials. The calculation results indicate that, compared with high-degree polynomials, the bicubic uniform B-spline is a more competitive method for fitting the analytical expression of the object wavefront used in off-axis CGHs, owing to its higher fitting precision and C2 continuity.
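To illustrate the fitting idea, the sketch below fits a smooth wavefront with SciPy's bicubic spline, which is analogous to (but not identical with) the paper's B-spline matrix formulation, and checks the residual away from the fitting nodes, where high-degree polynomial fits would show pulsation:

```python
# Fit a wavefront sampled on a grid with a bicubic spline and evaluate
# it off the nodes; a well-behaved fit keeps the residual small there.
import numpy as np
from scipy.interpolate import RectBivariateSpline

x = np.linspace(-1, 1, 21)
y = np.linspace(-1, 1, 21)
X, Y = np.meshgrid(x, y, indexing="ij")
wavefront = 0.5 * (X**2 + Y**2) + 0.1 * X * Y   # hypothetical smooth wavefront

spline = RectBivariateSpline(x, y, wavefront, kx=3, ky=3)  # bicubic fit

# Evaluate between the fitting nodes and compare with the true surface.
xm, ym = 0.123, -0.456
exact = 0.5 * (xm**2 + ym**2) + 0.1 * xm * ym
fitted = spline(xm, ym)[0, 0]
print(abs(fitted - exact))
```

Because the spline is piecewise cubic with C2 continuity, the off-node residual stays at the level of the fit itself rather than oscillating between nodes.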
Advances in Adaptive Control Methods
NASA Technical Reports Server (NTRS)
Nguyen, Nhan
2009-01-01
This poster presentation describes recent advances in adaptive control technology developed by NASA. Optimal Control Modification is a novel adaptive law that can improve performance and robustness of adaptive control systems. A new technique has been developed to provide an analytical method for computing time delay stability margin for adaptive control systems.
Parallel Aircraft Trajectory Optimization with Analytic Derivatives
NASA Technical Reports Server (NTRS)
Falck, Robert D.; Gray, Justin S.; Naylor, Bret
2016-01-01
Trajectory optimization is an integral component for the design of aerospace vehicles, but emerging aircraft technologies have introduced new demands on trajectory analysis that current tools are not well suited to address. Designing aircraft with technologies such as hybrid electric propulsion and morphing wings requires consideration of the operational behavior as well as the physical design characteristics of the aircraft. The addition of operational variables can dramatically increase the number of design variables, which motivates the use of gradient based optimization with analytic derivatives to solve the larger optimization problems. In this work we develop an aircraft trajectory analysis tool using a Legendre-Gauss-Lobatto based collocation scheme, providing analytic derivatives via the OpenMDAO multidisciplinary optimization framework. This collocation method uses an implicit time integration scheme that provides a high degree of sparsity and thus several potential options for parallelization. The performance of the new implementation was investigated via a series of single and multi-trajectory optimizations using a combination of parallel computing and constraint aggregation. The computational performance results show that in order to take full advantage of the sparsity in the problem it is vital to parallelize both the non-linear analysis evaluations and the derivative computations themselves. The constraint aggregation results showed a significant numerical challenge due to difficulty in achieving tight convergence tolerances. Overall, the results demonstrate the value of applying analytic derivatives to trajectory optimization problems and lay the foundation for future application of this collocation based method to the design of aircraft where operational scheduling of technologies is key to achieving good performance.
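The defect-constraint idea behind collocation can be sketched with simple trapezoidal collocation (not the Legendre-Gauss-Lobatto scheme or the OpenMDAO implementation used in the work). Each defect couples only neighboring nodes, which is the source of the sparsity the paper exploits:

```python
# Trapezoidal collocation defects: an optimizer drives these residuals
# to zero so that the discretized states satisfy the dynamics x' = f(t, x).
import numpy as np

def trapezoidal_defects(t, x, f):
    """d_k = x[k+1] - x[k] - h/2 * (f(t[k], x[k]) + f(t[k+1], x[k+1]))."""
    h = np.diff(t)
    fx = np.array([f(tk, xk) for tk, xk in zip(t, x)])
    return x[1:] - x[:-1] - 0.5 * h * (fx[:-1] + fx[1:])

# Dynamics x' = -x; the exact trajectory exp(-t) gives near-zero defects.
t = np.linspace(0.0, 1.0, 200)
x = np.exp(-t)
d = trapezoidal_defects(t, x, lambda tk, xk: -xk)
print(np.max(np.abs(d)))
```

In a full trajectory optimizer these defects become equality constraints, and their banded Jacobian is what makes analytic derivatives and parallel evaluation pay off.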
Remote Sensing is a scientific discipline of non-contact monitoring. It includes a range of technologies that span from aerial photography to advanced spectral imaging and analytical methods. This Session is designed to demonstrate contemporary practical applications of remote se...
Understanding Procurement for Sampling and Analytical Services Under a Triad Approach
The EPA Brownfields and Land Revitalization Technology Support Center (BTSC) has prepared this document to highlight methods and strategies that have been successfully used to procure services under a Triad framework.
Zhang, Fen-Fen; Jiang, Meng-Hong; Sun, Lin-Lin; Zheng, Feng; Dong, Lei; Shah, Vishva; Shen, Wen-Bin; Ding, Ya
2015-01-07
To expand the application scope of nuclear magnetic resonance (NMR) technology in the quantitative analysis of pharmaceutical ingredients, (19)F nuclear magnetic resonance ((19)F-NMR) spectroscopy has been employed as a simple, rapid, and reproducible approach for the detection of a fluorine-containing model drug, sitagliptin phosphate monohydrate (STG), with ciprofloxacin (Cipro) as the internal standard (IS). Influential factors, including the relaxation delay time (d1) and pulse angle, that impact the accuracy and precision of the spectral data are systematically optimized. Method validation has been carried out in terms of precision and intermediate precision, linearity, limit of detection (LOD) and limit of quantification (LOQ), robustness, and stability. To validate the reliability and feasibility of (19)F-NMR for the quantitative analysis of pharmaceutical analytes, the assay result has been compared with that of (1)H-NMR. The statistical F-test and Student's t-test at the 95% confidence level indicate no significant difference between the two methods. Owing to its advantages, such as higher resolution and suitability for biological samples, (19)F-NMR can be used as a universal technology for the quantitative analysis of other fluorine-containing pharmaceuticals and analytes.
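The internal-standard arithmetic underlying such qNMR assays can be sketched as below. The integrals and the IS mass are placeholders, and the molar masses are approximate literature values, not data from the study:

```python
# qNMR with an internal standard: the analyte mass follows from the
# integral ratio, the number of contributing (19)F nuclei per signal,
# the molar masses, and the weighed mass of the internal standard.
def qnmr_analyte_mass(I_a, I_s, N_a, N_s, M_a, M_s, m_s):
    """m_a = (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * m_s."""
    return (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * m_s

# Hypothetical run: equal integrals, one F nucleus per integrated signal,
# approximate molar masses for STG phosphate monohydrate and ciprofloxacin.
mass = qnmr_analyte_mass(I_a=1.0, I_s=1.0, N_a=1, N_s=1,
                         M_a=523.3, M_s=331.3, m_s=10.0)
print(f"{mass:.2f} mg")
```

The same relation holds for (1)H-NMR with the proton counts substituted, which is what makes the two assays directly comparable.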
Bringing Business Intelligence to Health Information Technology Curriculum
ERIC Educational Resources Information Center
Zheng, Guangzhi; Zhang, Chi; Li, Lei
2015-01-01
Business intelligence (BI) and healthcare analytics are the emerging technologies that provide analytical capability to help healthcare industry improve service quality, reduce cost, and manage risks. However, such component on analytical healthcare data processing is largely missed from current healthcare information technology (HIT) or health…
Business Intelligence in Process Control
NASA Astrophysics Data System (ADS)
Kopčeková, Alena; Kopček, Michal; Tanuška, Pavol
2013-12-01
Business Intelligence technology, which is a strong tool not only for decision-making support but also has great potential in other fields of application, is discussed in this paper. Necessary fundamental definitions are offered and explained to better understand the basic principles and the role of this technology in company management. The article is logically divided into five main parts. In the first part, the technology is defined and its main advantages are listed. In the second part, an overview of the system architecture with a brief description of the separate building blocks is presented, and the hierarchical nature of the system architecture is shown. The technology life cycle, consisting of four steps that are mutually interconnected into a ring, is described in the third part. In the fourth part, the analytical methods incorporated in online analytical processing and data mining used within business intelligence, as well as the related data mining methodologies, are summarised, and some typical applications of these methods are introduced. In the final part, a proposal of a knowledge discovery system for hierarchical process control is outlined. The focus of this paper is to provide a comprehensive view and to familiarize the reader with Business Intelligence technology and its utilisation.
Nagy, Brigitta; Farkas, Attila; Gyürkés, Martin; Komaromy-Hiller, Szofia; Démuth, Balázs; Szabó, Bence; Nusser, Dávid; Borbás, Enikő; Marosi, György; Nagy, Zsombor Kristóf
2017-09-15
The integration of the Process Analytical Technology (PAT) initiative into the continuous production of pharmaceuticals is indispensable for reliable production. The present paper reports the implementation of in-line Raman spectroscopy in a continuous blending and tableting process of a three-component model pharmaceutical system, containing caffeine as model active pharmaceutical ingredient (API), glucose as model excipient and magnesium stearate as lubricant. The real-time analysis of API content, blend homogeneity, and tablet content uniformity was performed using a Partial Least Squares (PLS) quantitative method. The in-line Raman spectroscopic monitoring showed that the continuous blender was capable of producing blends with high homogeneity, and that technological malfunctions can be detected by the proposed PAT method. Raman spectroscopy-based feedback control of the API feeder was also established, creating a 'Process Analytically Controlled Technology' (PACT), which guarantees the required API content in the produced blend. This is, to the best of the authors' knowledge, the first application of Raman spectroscopy in continuous blending and the first Raman-based feedback control in the formulation technology of solid pharmaceuticals. Copyright © 2017 Elsevier B.V. All rights reserved.
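A feedback loop of this kind can be illustrated with a toy proportional controller; this is a hypothetical sketch, not the controller reported in the paper, and the plant model is a one-line stand-in for the blender:

```python
# Proportional feedback: each cycle, the spectroscopically predicted API
# content adjusts the API feeder rate toward the target content.
def feeder_update(rate, predicted_api, target_api, gain=0.5):
    """Proportional correction of the feed rate from the measured API content."""
    return rate * (1.0 + gain * (target_api - predicted_api) / target_api)

rate, api = 10.0, 8.0            # feed rate (g/min); measured API % (target 10%)
for _ in range(20):
    rate = feeder_update(rate, api, target_api=10.0)
    api = 0.8 * api + 0.2 * rate  # toy plant: blend content drifts toward feed rate

print(round(api, 2))
```

In the reported system the "measurement" step would be the PLS prediction from each in-line Raman spectrum, and the gain would be tuned against the blender's residence time.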
NASA Astrophysics Data System (ADS)
Al-Qudaimi, Abdullah; Kumar, Amit
2018-05-01
Recently, Abdullah and Najib (International Journal of Sustainable Energy 35(4): 360-377, 2016) proposed an intuitionistic fuzzy analytic hierarchy process to deal with uncertainty in decision-making and applied it to establish preferences in the sustainable energy planning of Malaysia. This work may attract researchers of other countries to choose energy technologies for their countries. However, a close study of the published paper shows that the expression used by Abdullah and Najib in Step 6 of their method, for evaluating the intuitionistic fuzzy entropy of the aggregate of each row of an intuitionistic fuzzy matrix, is not valid. Therefore, the method of Abdullah and Najib should not be used as published for solving real-life problems. The aim of this paper is to suggest the modifications required to resolve the flaws of the Abdullah and Najib method.
Magnetic ionic liquids in analytical chemistry: A review.
Clark, Kevin D; Nacham, Omprakash; Purslow, Jeffrey A; Pierson, Stephen A; Anderson, Jared L
2016-08-31
Magnetic ionic liquids (MILs) have recently generated a cascade of innovative applications in numerous areas of analytical chemistry. By incorporating a paramagnetic component within the cation or anion, MILs exhibit a strong response toward external magnetic fields. Careful design of the MIL structure has yielded magnetoactive compounds with unique physicochemical properties including high magnetic moments, enhanced hydrophobicity, and the ability to solvate a broad range of molecules. The structural tunability and paramagnetic properties of MILs have enabled magnet-based technologies that can easily be added to the analytical method workflow, complement needed extraction requirements, or target specific analytes. This review highlights the application of MILs in analytical chemistry and examines the important structural features of MILs that largely influence their physicochemical and magnetic properties. Copyright © 2016 Elsevier B.V. All rights reserved.
Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F
2016-01-01
Background: The complexity of current injury-related health issues demands the use of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity to shape the next generation of injury surveillance. Objective: To introduce the conceptual bases of data visualisation, and to propose a visual analytic and visualisation platform for public health surveillance in injury prevention and control. Methods: The paper introduces the conceptual bases of data visualisation, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. Results: The visual analytic and visualisation platform is presented as a solution for improved access to heterogeneous data sources; it enhances data exploration and analysis, communicates data effectively, and supports decision-making. Conclusions: Data visualisation concepts and a visual analytic platform could play a key role in shaping the next generation of injury surveillance, improving data use, analytic capacity, and the ability to communicate findings and key messages effectively. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. PMID:26728006
Role of chromatography in the development of Standard Reference Materials for organic analysis.
Wise, Stephen A; Phinney, Karen W; Sander, Lane C; Schantz, Michele M
2012-10-26
The certification of chemical constituents in natural-matrix Standard Reference Materials (SRMs) at the National Institute of Standards and Technology (NIST) can require the use of two or more independent analytical methods. The independence among the methods is generally achieved by taking advantage of differences in extraction, separation, and detection selectivity. This review describes the development of the independent analytical methods approach at NIST, and its implementation in the measurement of organic constituents such as contaminants in environmental materials, nutrients and marker compounds in food and dietary supplement matrices, and health diagnostic and nutritional assessment markers in human serum. The focus of this review is the important and critical role that separation science techniques play in achieving the necessary independence of the analytical steps in the measurement of trace-level organic constituents in natural matrix SRMs. Published by Elsevier B.V.
Tanaka, Ryoma; Takahashi, Naoyuki; Nakamura, Yasuaki; Hattori, Yusuke; Ashizawa, Kazuhide; Otsuka, Makoto
2017-01-01
Resonant acoustic® mixing (RAM) technology is a system that performs high-speed mixing by vibration through control of acceleration and frequency. In recent years, real-time process monitoring and prediction have become of increasing interest, and process analytical technology (PAT) systems are increasingly being introduced into actual manufacturing processes. This study examined the application of PAT combining RAM, near-infrared spectroscopy, and chemometric technology as a set of PAT tools for introduction into actual pharmaceutical powder blending processes. Content uniformity was assessed with a robust partial least squares regression (PLSR) model constructed to account for the RAM configuration parameters and the changing concentrations of the components. Real-time, in-line prediction of the active pharmaceutical ingredient and other additives was successfully demonstrated using chemometric technology. This system is expected to be applicable to the RAM method for the risk management of quality.
An update on pharmaceutical film coating for drug delivery.
Felton, Linda A; Porter, Stuart C
2013-04-01
Pharmaceutical coating processes have generally been transformed from what was essentially an art form in the mid-twentieth century to a much more technology-driven process. This review article provides a basic overview of current film coating processes, including a discussion on polymer selection, coating formulation additives and processing equipment. Substrate considerations for pharmaceutical coating processes are also presented. While polymeric coating operations are commonplace in the pharmaceutical industry, film coating processes are still not fully understood, which presents serious challenges with current regulatory requirements. Novel analytical technologies and various modeling techniques that are being used to better understand film coating processes are discussed. This review article also examines the challenges of implementing process analytical technologies in coating operations, active pharmaceutical ingredients in polymer film coatings, the use of high-solids coating systems and continuous coating and other novel coating application methods.
Review and assessment of the database and numerical modeling for turbine heat transfer
NASA Technical Reports Server (NTRS)
Gladden, H. J.; Simoneau, R. J.
1989-01-01
The objectives of the NASA Hot Section Technology (HOST) Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high-temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding while the analytical efforts were approximately 25 percent. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.
Micro-separation toward systems biology.
Liu, Bi-Feng; Xu, Bo; Zhang, Guisen; Du, Wei; Luo, Qingming
2006-02-17
Current biology is experiencing a transformation in logic, or philosophy, that forces us to reevaluate the concept of the cell, tissue or entire organism as a collection of individual components. Systems biology, which aims at understanding biological systems at the systems level, is an emerging research area involving interdisciplinary collaborations among the life sciences, computational and mathematical sciences, systems engineering, and analytical technology. For analytical chemistry, developing innovative methods to meet the requirements of systems biology represents new challenges as well as opportunities and responsibilities. In this review, systems biology-oriented micro-separation technologies are introduced for comprehensive profiling of the genome, proteome and metabolome, characterization of biomolecular interactions, and single cell analysis, such as capillary electrophoresis, ultra-thin layer gel electrophoresis, micro-column liquid chromatography, and their multidimensional combinations, parallel integrations, microfabricated formats, and nanotechnology involvement. Future challenges and directions are also suggested.
NASA Astrophysics Data System (ADS)
Qu, Jianan Y.; Suria, David; Wilson, Brian C.
1998-05-01
The primary goal of these studies was to demonstrate that NIR Raman spectroscopy is feasible as a rapid and reagentless analytical method for clinical diagnostics. Raman spectra were collected on human serum and urine samples using a 785 nm excitation laser and a single-stage holographic spectrometer. A partial least squares method was used to predict the analyte concentrations of interest; the actual concentrations were determined by standard clinical chemistry. The prediction accuracy for total protein, albumin, triglyceride and glucose in human sera ranged from 1.5 percent to 5 percent, which is acceptable for clinical diagnostics. The concentration measurements of acetaminophen, ethanol and codeine in human urine demonstrate the potential of NIR Raman technology for screening of therapeutic drugs and substances of abuse.
JPRS Report, Science & Technology, Japan
1988-10-05
...collagen, we are conducting research on the immobilization, through chemical bond rather than physical absorption, of collagen on synthetic material... of a large number of samples are conducted by using automated apparatus and enzymatic reagents, it is natural to devise a method to use natural... improvement of enzymatic analytical methods; 3) development of reaction system and instrumentation system; 4) research on sample treatment methods; and
Rosas, Juan G; Blanco, Marcel; González, Josep M; Alcalà, Manel
2012-08-15
Process Analytical Technology (PAT) is playing a central role in current regulations on pharmaceutical production processes. Proper understanding of all operations and variables connecting the raw materials to end products is one of the keys to ensuring product quality and continuous improvement in production. Near infrared spectroscopy (NIRS) has been successfully used to develop faster, non-invasive quantitative methods for real-time prediction of critical quality attributes (CQAs) of pharmaceutical granulates (API content, pH, moisture, flowability, angle of repose and particle size). NIR spectra were acquired from the bin blender after the granulation process in a non-classified area, without the need for sample withdrawal. The methodology used for data acquisition, calibration modelling and method application in this context is relatively inexpensive and can easily be implemented by most pharmaceutical laboratories. The Partial Least-Squares (PLS) algorithm was used to calculate multivariate calibration models, which provided acceptable Root Mean Square Error of Prediction (RMSEP) values (RMSEP(API)=1.0 mg/g; RMSEP(pH)=0.1; RMSEP(Moisture)=0.1%; RMSEP(Flowability)=0.6 g/s; RMSEP(Angle of repose)=1.7° and RMSEP(Particle size)=2.5%), allowing application to routine analyses of production batches. The proposed method affords quality assessment of end products and determination of important parameters with a view to understanding the production processes used by the pharmaceutical industry. As shown here, the NIRS technique is a highly suitable tool for Process Analytical Technology. Copyright © 2012 Elsevier B.V. All rights reserved.
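The RMSEP figure of merit quoted above is simply the root-mean-square difference between NIR predictions and reference values on test samples. The numbers below are hypothetical, not taken from the study:

```python
# Root Mean Square Error of Prediction for a set of test samples.
import numpy as np

def rmsep(y_pred, y_ref):
    """RMSEP = sqrt(mean((prediction - reference)^2))."""
    y_pred, y_ref = np.asarray(y_pred), np.asarray(y_ref)
    return np.sqrt(np.mean((y_pred - y_ref) ** 2))

# Hypothetical API content predictions vs. reference assay (mg/g).
print(rmsep([101.2, 98.7, 100.5], [100.0, 100.0, 100.0]))
```

Comparing RMSEP against the attribute's specification range is what decides whether a calibration such as RMSEP(API) = 1.0 mg/g is fit for routine release testing.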
-Omic and Electronic Health Records Big Data Analytics for Precision Medicine
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.
2017-01-01
Objective Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous, complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results To demonstrate how big data analytics enables precision medicine, we provide two case studies: identifying disease biomarkers from multi-omic data and incorporating -omic information into EHRs. Conclusion Big data analytics is able to address -omic and EHR data challenges for a paradigm shift toward precision medicine. Significance Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes, with long-lasting societal impact. PMID:27740470
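As a toy illustration of the biomarker-identification case study (this is not the authors' pipeline; the gene names and expression values are invented), candidate features can be ranked by a two-group Welch t-statistic between cases and controls:

```python
import math

def welch_t(a, b):
    """Welch's t-statistic for two independent samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical expression values for two genes in cases vs. controls
cases    = {"geneA": [5.1, 5.3, 5.2], "geneB": [2.0, 2.2, 1.9]}
controls = {"geneA": [3.0, 3.1, 2.9], "geneB": [2.1, 2.0, 2.2]}
ranked = sorted(cases, key=lambda g: abs(welch_t(cases[g], controls[g])),
                reverse=True)
print(ranked[0])  # geneA separates the groups most strongly
```

Real multi-omic screens add multiple-testing correction and validation cohorts, but the ranking step has this shape.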
1992-12-01
from several intellectual subfields, various researchers have produced lists of intersegment relations - from philosophers (e.g., [Toulmin 58]) to...Approach Integrating Case-Based and Analytical Methods. Ph.D. dissertation, Georgia Institute of Technology. [Toulmin 58] Toulmin, S. 1958. The Uses of
BIOLOGICALLY-BASED RAPID SCREENING METHODS FOR DIOXINS IN THE UNITED STATES
Because of the extensive cost, in personnel, time, equipment, and money to measure dioxin-like chemicals analytically, alternative technologies have been developed to measure the integrated sum of the activity of the dioxin-like chemicals.
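The "integrated sum of activity" that these bioassays approximate is conventionally expressed as a toxic equivalents (TEQ) value: each congener's concentration weighted by its toxic equivalency factor (TEF) and summed. A minimal sketch follows; the few TEF entries are illustrative (the WHO publishes the authoritative values):

```python
# Illustrative TEFs for a few dioxin-like congeners (not a complete table)
TEF = {
    "2,3,7,8-TCDD": 1.0,      # reference congener
    "1,2,3,7,8-PeCDD": 1.0,
    "OCDD": 0.0003,
}

def teq(concentrations_pg_g):
    """Sum of concentration * TEF over the measured congeners (pg/g)."""
    return sum(c * TEF[name] for name, c in concentrations_pg_g.items())

sample = {"2,3,7,8-TCDD": 2.0, "OCDD": 1000.0}
print(teq(sample))  # 2.0*1.0 + 1000.0*0.0003, i.e. approximately 2.3 pg TEQ/g
```

A cell-based screen reports roughly this weighted sum in one measurement, rather than quantifying each congener analytically.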
NASA Astrophysics Data System (ADS)
Lawton, Zachary E.; Traub, Angelica; Fatigante, William L.; Mancias, Jose; O'Leary, Adam E.; Hall, Seth E.; Wieland, Jamie R.; Oberacher, Herbert; Gizzi, Michael C.; Mulligan, Christopher C.
2017-06-01
Forensic evidentiary backlogs are indicative of the growing need for cost-effective, high-throughput instrumental methods. One such emerging technology that shows high promise in meeting this demand while also allowing on-site forensic investigation is portable mass spectrometric (MS) instrumentation, particularly instrumentation that enables coupling to ambient ionization techniques. While the benefits of rapid, on-site screening of contraband can be anticipated, the inherent legal implications of field-collected data necessitate that the analytical performance of the technology employed be commensurate with accepted techniques. To this end, comprehensive analytical validation studies are required before broad incorporation by forensic practitioners can be considered, and they are the focus of this work. Pertinent performance characteristics such as throughput, selectivity, accuracy/precision, method robustness, and ruggedness have been investigated. Reliability in the form of false positive/negative response rates is also assessed, examining the effect of variables such as user training and experience level. To provide flexibility toward broad chemical evidence analysis, a suite of rapidly interchangeable ion sources has been developed and characterized through the analysis of common illicit chemicals and emerging threats like substituted phenethylamines.
HOST turbine heat transfer program summary
NASA Technical Reports Server (NTRS)
Gladden, Herbert J.; Simoneau, Robert J.
1988-01-01
The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding with the remainder going to analytical efforts. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.
Turk, Gregory C; Sharpless, Katherine E; Cleveland, Danielle; Jongsma, Candice; Mackey, Elizabeth A; Marlow, Anthony F; Oflaz, Rabia; Paul, Rick L; Sieber, John R; Thompson, Robert Q; Wood, Laura J; Yu, Lee L; Zeisler, Rolf; Wise, Stephen A; Yen, James H; Christopher, Steven J; Day, Russell D; Long, Stephen E; Greene, Ella; Harnly, James; Ho, I-Pin; Betz, Joseph M
2013-01-01
Standard Reference Material 3280 Multivitamin/Multielement Tablets was issued by the National Institute of Standards and Technology in 2009, and has certified and reference mass fraction values for 13 vitamins, 26 elements, and two carotenoids. Elements were measured using two or more analytical methods at NIST, with additional data contributed by collaborating laboratories. This reference material is expected to serve a dual purpose: to provide quality assurance in support of a database of dietary supplement products and to provide a means for analysts, dietary supplement manufacturers, and researchers to assess the appropriateness and validity of their analytical methods and the accuracy of their results.
Propulsion System Modeling and Simulation
NASA Technical Reports Server (NTRS)
Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile
2002-01-01
The Aerospace Systems Design Laboratory at the School of Aerospace Engineering at the Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for the presence of risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. This propulsion system design environment is created with commercially available software called iSIGHT, which is a generic computational framework, and with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and facilitate information transfer amongst the various codes. The resulting modeling and simulation (M&S) environment, in conjunction with the response surface method, provides the designer/decision-maker an analytical means to examine the entire design space from either a subsystem and/or system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.
Jessen, Torben E; Höskuldsson, Agnar T; Bjerrum, Poul J; Verder, Henrik; Sørensen, Lars; Bratholm, Palle S; Christensen, Bo; Jensen, Lene S; Jensen, Maria A B
2014-09-01
Direct measurement of chemical constituents in complex biologic matrices without the use of analyte-specific reagents could be a step toward the simplification of clinical biochemistry. Problems related to reagents such as production errors, improper handling, and lot-to-lot variations would be eliminated, as would errors occurring during assay execution. We describe and validate a reagent-free method for direct measurement of six analytes in human plasma based on Fourier-transform infrared spectroscopy (FTIR). Blood plasma is analyzed without any sample preparation. The FTIR spectrum of the raw plasma is recorded in a sampling cuvette specially designed for measurement of aqueous solutions. For each analyte, a mathematical calibration process is performed by a stepwise selection of wavelengths giving the optimal least-squares correlation between the measured FTIR signal and the analyte concentration measured by conventional clinical reference methods. The developed calibration algorithms are subsequently evaluated for their capability to predict the concentration of the six analytes in blinded patient samples. The correlations between the six FTIR methods and the corresponding reference methods were 0.87
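A minimal sketch of the first step of such a stepwise wavelength selection (the toy spectra below are invented, not the study's plasma data): pick the wavelength whose absorbance correlates best with the reference concentrations, then repeat on the residuals for subsequent wavelengths.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def best_wavelength(spectra, conc):
    """Index of the wavelength whose absorbance correlates best (|r|)
    with the reference concentrations -- the first stepwise pick."""
    n_wl = len(spectra[0])
    scores = [abs(pearson([s[j] for s in spectra], conc)) for j in range(n_wl)]
    return max(range(n_wl), key=scores.__getitem__)

# Toy data: 4 samples x 3 wavelengths; wavelength 1 tracks concentration
spectra = [[0.2, 0.10, 0.5],
           [0.3, 0.20, 0.4],
           [0.1, 0.30, 0.6],
           [0.4, 0.40, 0.5]]
conc = [1.0, 2.0, 3.0, 4.0]
print(best_wavelength(spectra, conc))  # index 1
```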
Rediscovering the ritual technology of the placebo effect in analytical psychology.
Goodwyn, Erik
2017-06-01
Technology, viewed more generally, is a collection of skills and methods that are used to accomplish an objective of some kind. Modernity has produced many kinds of ever-expanding new technologies, but it is also evident that technologies can be lost or fall out of use. A cross-cultural survey of ritual reveals a rather startling observation: while developed nations often exceed other cultures in terms of material technology, they often pale by comparison in their use of ritual technology. In this essay we will see how ritual is a powerful sort of technology that developed nations have mostly allowed to drift out of regular, vigorous use, despite its numerous psychological and biological effects. This tendency has left one of the rituals we still have - psychotherapy itself - bereft of some of the typical tools for concretizing the symbolic in recurrent patterns around the world. Jung himself could be accused of being somewhat anti-ritual, enmeshed as he was in the post-Protestant, post-Enlightenment cultural environment that defines the West in many ways. But these under-utilized elements of ritual technology may be a natural fit for Jungian therapy due to its use of symbols. © 2017, The Society of Analytical Psychology.
GENETIC-BASED ANALYTICAL METHODS FOR BACTERIA AND FUNGI
In the past two decades, advances in high-throughput sequencing technologies have led to a veritable explosion in the generation of nucleic acid sequence information (1). While these advances are illustrated most prominently by the successful sequencing of the human genome, they...
LOCATING BURIED WORLD WAR 1 MUNITIONS WITH REMOTE SENSING AND GIS
Remote Sensing is a scientific discipline of non-contact monitoring. It includes a range of technologies that span from aerial photography to advanced spectral imaging and analytical methods. This Session is designed to demonstrate contemporary practical applications of remote ...
Selected Analytical Methods for Environmental Remediation and Recovery (SAM) Presentation for APHL
The US Environmental Protection Agency’s Office of Research and Development (ORD) conducts cutting-edge research that provides the underpinning of science and technology for public health and environmental policies and decisions made by federal, state and other governmental...
METHOD 8261: USING SURROGATES TO MEASURE MATRIX EFFECTS AND CORRECT ANALYTICAL RESULTS
Vacuum distillation uses a specialized apparatus. This apparatus has been developed and patented by
the EPA. Through the Federal Technology Transfer Act this invention has been made available for commercialization. Available vendors for this instrumentation are being evaluat...
Aspects concerning verification methods and rigidity increment of complex technological systems
NASA Astrophysics Data System (ADS)
Casian, M.
2016-11-01
Any technological process aims at a high-quality, precise product, something almost impossible without high-rigidity machine tools, equipment, and components. Therefore, from the design phase, it is very important to create structures and machines with high stiffness characteristics. At the same time, increasing the stiffness should not raise the material costs. Finding this midpoint between high rigidity and minimum expense leads to investigations and checks of structural components through various methods and techniques, sometimes quite advanced ones. In order to highlight some aspects concerning the significance of mechanical equipment rigidity, the finite element method and an analytical method based on the use of Mathcad software were applied to a subassembly of a grinding machine. Graphical representations were elaborated, offering a more complete picture of the stresses and deformations that can affect the considered mechanical subassembly.
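As a minimal example of the kind of closed-form stiffness check that can cross-validate a finite element model (the geometry and values are hypothetical, and a simple cantilever stands in for the grinding machine subassembly), the Euler-Bernoulli tip deflection of an end-loaded cantilever gives the stiffness directly:

```python
def cantilever_tip_deflection(F, L, E, I):
    """Euler-Bernoulli tip deflection (m) of a cantilever of length L (m)
    under end load F (N), with Young's modulus E (Pa) and second
    moment of area I (m^4): delta = F*L^3 / (3*E*I)."""
    return F * L ** 3 / (3 * E * I)

F = 1000.0   # N, applied end load (hypothetical)
L = 0.5      # m, beam length
E = 210e9    # Pa, steel
I = 8.0e-9   # m^4, section second moment of area
delta = cantilever_tip_deflection(F, L, E, I)
k = F / delta          # static stiffness, N/m
print(round(delta * 1e3, 3), "mm")
```

Agreement between such an analytic value and the FEM prediction at the load point is a quick sanity check before trusting the full stress/deformation plots.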
Applications of Raman Spectroscopy in Biopharmaceutical Manufacturing: A Short Review.
Buckley, Kevin; Ryder, Alan G
2017-06-01
The production of active pharmaceutical ingredients (APIs) is currently undergoing its biggest transformation in a century. The changes are based on the rapid and dramatic introduction of protein- and macromolecule-based drugs (collectively known as biopharmaceuticals) and can be traced back to the huge investment in biomedical science (in particular in genomics and proteomics) that has been ongoing since the 1970s. Biopharmaceuticals (or biologics) are manufactured using biological-expression systems (such as mammalian, bacterial, insect cells, etc.) and have spawned a large (>€35 billion sales annually in Europe) and growing biopharmaceutical industry (BioPharma). The structural and chemical complexity of biologics, combined with the intricacy of cell-based manufacturing, imposes a huge analytical burden to correctly characterize and quantify both processes (upstream) and products (downstream). In small molecule manufacturing, advances in analytical and computational methods have been extensively exploited to generate process analytical technologies (PAT) that are now used for routine process control, leading to more efficient processes and safer medicines. In the analytical domain, biologic manufacturing is considerably behind and there is both a huge scope and need to produce relevant PAT tools with which to better control processes, and better characterize product macromolecules. Raman spectroscopy, a vibrational spectroscopy with a number of useful properties (nondestructive, non-contact, robustness) has significant potential advantages in BioPharma. Key among them are intrinsically high molecular specificity, the ability to measure in water, the requirement for minimal (or no) sample pre-treatment, the flexibility of sampling configurations, and suitability for automation. Here, we review and discuss a representative selection of the more important Raman applications in BioPharma (with particular emphasis on mammalian cell culture). 
The review shows that the properties of Raman have been successfully exploited to deliver unique and useful analytical solutions, particularly for online process monitoring. However, it also shows that its inherent susceptibility to fluorescence interference and the weakness of the Raman effect mean that it can never be a panacea. In particular, Raman-based methods are intrinsically limited by the chemical complexity and wide analyte concentration profiles of cell culture media and bioprocessing broths, which limit their use for quantitative analysis. Nevertheless, with appropriate foreknowledge of these limitations and good experimental design, robust analytical methods can be produced. In addition, new technological developments such as time-resolved detectors, advanced lasers, and plasmonics offer the potential for new Raman-based methods that resolve existing limitations and/or provide new analytical insights.
Managing knowledge business intelligence: A cognitive analytic approach
NASA Astrophysics Data System (ADS)
Surbakti, Herison; Ta'a, Azman
2017-10-01
The purpose of this paper is to identify and analyze the integration of Knowledge Management (KM) and Business Intelligence (BI) in order to achieve a competitive edge in the context of intellectual capital. The methodology includes a review of the literature and analysis of interview data from managers in the corporate sector, together with models established by different authors. BI technologies are strongly associated with the process of KM for attaining competitive advantage. KM is strongly influenced by human and social factors, which an efficient system run under BI tactics and technologies can turn into the most valuable assets. The term predictive analytics, however, is rooted in the field of BI, and extracting tacit knowledge as a new source for BI analysis remains a major challenge. Analytic methods that address the diversity of the data corpus - structured and unstructured - require a cognitive approach to provide estimative results and to yield actionable descriptive, predictive, and prescriptive outputs. This paper elaborates on this challenge in detail as initial work.
2017-01-01
In this work, the use of fused deposition modeling (FDM) in a (bio)analytical/lab-on-a-chip research laboratory is described. First, the specifications of this 3D printing method that are important for the fabrication of (micro)devices were characterized for a benchtop FDM 3D printer. These include resolution, surface roughness, leakage, transparency, material deformation, and the possibilities for integration of other materials. Next, the autofluorescence, solvent compatibility, and biocompatibility of 12 representative FDM materials were tested and evaluated. Finally, we demonstrate the feasibility of FDM in a number of important applications. In particular, we consider the fabrication of fluidic channels, masters for polymer replication, and tools for the production of paper microfluidic devices. This work thus provides a guideline for (i) the use of FDM technology by addressing its possibilities and current limitations, (ii) material selection for FDM, based on solvent compatibility and biocompatibility, and (iii) application of FDM technology to (bio)analytical research by demonstrating a broad range of illustrative examples. PMID:28628294
Workshop on Emerging Technology and Data Analytics for Behavioral Health.
Kotz, David; Lord, Sarah E; O'Malley, A James; Stark, Luke; Marsch, Lisa A
2018-06-20
Wearable and portable digital devices can support self-monitoring for patients with chronic medical conditions, individuals seeking to reduce stress, and people seeking to modify health-related behaviors such as substance use or overeating. The resulting data may be used directly by a consumer, or shared with a clinician for treatment, a caregiver for assistance, or a health coach for support. The data can also be used by researchers to develop and evaluate just-in-time interventions that leverage mobile technology to help individuals manage their symptoms and behavior in real time and as needed. Such wearable systems have huge potential for promoting delivery of anywhere-anytime health care, improving public health, and enhancing the quality of life for many people. The Center for Technology and Behavioral Health at Dartmouth College, a P30 "Center of Excellence" supported by the National Institute on Drug Abuse at the National Institutes of Health, conducted a workshop in February 2017 on innovations in emerging technology, user-centered design, and data analytics for behavioral health, with presentations by a diverse range of experts in the field. The workshop focused on wearable and mobile technologies being used in clinical and research contexts, with an emphasis on applications in mental health, addiction, and health behavior change. In this paper, we summarize the workshop panels on mobile sensing, user experience design, statistics and machine learning, and privacy and security, and conclude with suggested research directions for this important and emerging field of applying digital approaches to behavioral health. 
Workshop insights yielded four key directions for future research: (1) a need for behavioral health researchers to work iteratively with experts in emerging technology and data analytics, (2) a need for research into optimal user-interface design for behavioral health technologies, (3) a need for privacy-oriented design from the beginning of a novel technology, and (4) the need to develop new analytical methods that can scale to thousands of individuals and billions of data points. ©David Kotz, Sarah E Lord, A James O'Malley, Luke Stark, Lisa A. Marsch. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 20.06.2018.
Development of Process Analytical Technology (PAT) methods for controlled release pellet coating.
Avalle, P; Pollitt, M J; Bradley, K; Cooper, B; Pearce, G; Djemai, A; Fitzpatrick, S
2014-07-01
This work focused on the control of the manufacturing process for a controlled release (CR) pellet product, within a Quality by Design (QbD) framework. The manufacturing process was Wurster coating: first layering active pharmaceutical ingredient (API) onto sugar pellet cores, and then applying a controlled release (CR) coating. For each of these two steps, the development of a Process Analytical Technology (PAT) method is discussed, along with a novel application of automated microscopy as the reference method. Ultimately, PAT methods should link to product performance, and the two key Critical Quality Attributes (CQAs) for this CR product are assay and release rate, linked to the API and CR coating steps respectively. In this work, the link between near infra-red (NIR) spectra and those attributes was explored by chemometrics over the course of the coating process in a pilot-scale industrial environment. Correlations were built between the NIR spectra and coating weight (for API amount), CR coating thickness, and dissolution performance. These correlations allow the coating process to be monitored at-line and so give better control of the product performance in line with QbD requirements. Copyright © 2014 Elsevier B.V. All rights reserved.
Characterization of spacecraft humidity condensate
NASA Technical Reports Server (NTRS)
Muckle, Susan; Schultz, John R.; Sauer, Richard L.
1994-01-01
When construction of Space Station Freedom reaches the Permanent Manned Capability (PMC) stage, the Water Recovery and Management Subsystem will be fully operational such that (distilled) urine, spent hygiene water, and humidity condensate will be reclaimed to provide water of potable quality. The reclamation technologies currently baselined to process these waste waters include adsorption, ion exchange, catalytic oxidation, and disinfection. To ensure that the baseline technologies will be able to effectively remove those compounds presenting a health risk to the crew, the National Research Council has recommended that additional information be gathered on specific contaminants in waste waters representative of those to be encountered on the Space Station. With the application of new analytical methods and the analysis of waste water samples more representative of the Space Station environment, advances in the identification of the specific contaminants continue to be made. Efforts by the Water and Food Analytical Laboratory at JSC were successful in enlarging the database of contaminants in humidity condensate. These efforts have not only included the chemical characterization of condensate generated during ground-based studies, but most significantly the characterization of cabin and Spacelab condensate generated during Shuttle missions. The analytical results presented in this paper will be used to show how the composition of condensate varies amongst enclosed environments and thus the importance of collecting condensate from an environment close to that of the proposed Space Station. Although advances were made in the characterization of space condensate, complete characterization, particularly of the organics, requires further development of analytical methods.
7 CFR 2902.5 - Item designation.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., USDA will use life cycle cost information only from tests using the BEES analytical method. (c... availability of such items and the economic and technological feasibility of using such items, including life cycle costs. USDA will gather information on individual products within an item and extrapolate that...
40 CFR 63.90 - Program overview.
Code of Federal Regulations, 2012 CFR
2012-07-01
... calibration gases or test cells; (4) Use of an analytical technology that differs from that specified by a... “proven technology” (generally accepted by the scientific community as equivalent or better) that is... enforceable test method involving “proven technology” (generally accepted by the scientific community as...
40 CFR 63.90 - Program overview.
Code of Federal Regulations, 2013 CFR
2013-07-01
... calibration gases or test cells; (4) Use of an analytical technology that differs from that specified by a... “proven technology” (generally accepted by the scientific community as equivalent or better) that is... enforceable test method involving “proven technology” (generally accepted by the scientific community as...
40 CFR 63.90 - Program overview.
Code of Federal Regulations, 2014 CFR
2014-07-01
... calibration gases or test cells; (4) Use of an analytical technology that differs from that specified by a... “proven technology” (generally accepted by the scientific community as equivalent or better) that is... enforceable test method involving “proven technology” (generally accepted by the scientific community as...
Microfluidic devices to enrich and isolate circulating tumor cells
Myung, J. H.; Hong, S.
2015-01-01
Given the potential clinical impact of circulating tumor cells (CTCs) in blood as a clinical biomarker for diagnosis and prognosis of various cancers, a myriad of detection methods for CTCs have been recently introduced. Among those, a series of microfluidic devices are particularly promising as these uniquely offer micro-scale analytical systems that are highlighted by low consumption of samples and reagents, high flexibility to accommodate other cutting-edge technologies, precise and well-defined flow behaviors, and automation capability, presenting significant advantages over the conventional larger scale systems. In this review, we highlight the advantages of microfluidic devices and their translational potential into CTC detection methods, categorized by miniaturization of bench-top analytical instruments, integration capability with nanotechnologies, and in situ or sequential analysis of captured CTCs. This review provides a comprehensive overview of recent advances in the CTC detection achieved through application of microfluidic devices and their challenges that these promising technologies must overcome to be clinically impactful. PMID:26549749
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1998-12-01
Accurate measurements of radioactivity in soils contaminated with Strontium-90 (Sr-90) or Uranium-238 (U-238) are essential for many DOE site remediation programs. These crucial measurements determine if excavation and soil removal is necessary, where remediation efforts should be focused, and/or if a site has reached closure. Measuring soil contamination by standard EPA laboratory methods typically takes a week (accelerated analytical test turnaround) or a month (standard analytical test turnaround). The time delay extends to operations involving heavy excavation equipment and associated personnel, which are the main costs of remediation. This report describes an application of the BetaScint™ fiber-optic sensor that measures Sr-90 or U-238 contamination in soil samples on site in about 20 minutes, at a much lower cost than time-consuming laboratory methods, to greatly facilitate remediation. The report also describes the technology, its performance, its uses, cost, regulatory and policy issues, and lessons learned.
Advanced rotorcraft technology: Task force report
NASA Technical Reports Server (NTRS)
1978-01-01
The technological needs and opportunities related to future civil and military rotorcraft were determined and a program plan for NASA research which was responsive to the needs and opportunities was prepared. In general, the program plan places the primary emphasis on design methodology where the development and verification of analytical methods is built upon a sound data base. The four advanced rotorcraft technology elements identified are aerodynamics and structures, flight control and avionic systems, propulsion, and vehicle configurations. Estimates of the total funding levels that would be required to support the proposed program plan are included.
NASA Technical Reports Server (NTRS)
Keiter, I. D.
1982-01-01
Studies of several General Aviation aircraft indicated that the application of advanced technologies to General Aviation propellers can reduce fuel consumption in future aircraft by a significant amount. Propeller blade weight reductions achieved through the use of composites, together with propeller efficiency and noise improvements achieved through advanced concepts and improved propeller analytical design methods, result in aircraft with lower operating cost, acquisition cost, and gross weight.
Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations
NASA Technical Reports Server (NTRS)
Lynnes, Chris; Little, Mike; Huang, Thomas; Jacob, Joseph; Yang, Phil; Kuo, Kwo-Sen
2016-01-01
Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based file systems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.
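A minimal sketch (illustrative only, not NASA's benchmark code) of why sharding suits long-time-series analysis: each shard holds a contiguous run of time steps, partial statistics are computed per shard, independently and potentially on different nodes, and then merged into the global answer.

```python
def shard_stats(shard):
    """Partial (sum, count) for one shard -- computable on any node."""
    return sum(shard), len(shard)

def time_mean(shards):
    """Merge per-shard partials into a global time-series mean."""
    total, count = 0.0, 0
    for s, c in map(shard_stats, shards):  # map() could be a cluster map
        total += s
        count += c
    return total / count

# 6 time steps spread across 3 hypothetical nodes
shards = [[1.0, 2.0, 3.0], [4.0, 5.0], [6.0]]
print(time_mean(shards))  # 3.5
```

The same merge pattern works for any associative statistic (min, max, sums of squares), which is what makes fine-grained parallelism across shards attractive compared with one-file-per-time-step layouts.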
Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations
NASA Astrophysics Data System (ADS)
Lynnes, C.; Little, M. M.; Huang, T.; Jacob, J. C.; Yang, C. P.; Kuo, K. S.
2016-12-01
Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based filesystems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.
Visual analytics as a translational cognitive science.
Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard
2011-07-01
Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.
Warth, Arne; Muley, Thomas; Meister, Michael; Weichert, Wilko
2015-01-01
Preanalytic sampling techniques and the preparation of tissue specimens strongly influence analytical results in lung tissue diagnostics, at both the morphological and the molecular level. However, in contrast to analytics, where tremendous achievements over the last decade have produced a whole new portfolio of test methods, developments in preanalytics have been minimal. This is especially unfortunate in lung cancer, where usually only small amounts of tissue are at hand and optimization of all processing steps is mandatory in order to increase the diagnostic yield. In the following, we provide a comprehensive overview of preanalytics in lung cancer, from the method of sampling through tissue processing to its impact on analytical test results. We specifically discuss the role of preanalytics in novel technologies such as next-generation sequencing and in state-of-the-art cytology preparations. In addition, we point out specific problems in preanalytics that hamper further developments in the field of lung tissue diagnostics.
A combined approach of AHP and TOPSIS methods applied in the field of integrated software systems
NASA Astrophysics Data System (ADS)
Berdie, A. D.; Osaci, M.; Muscalagiu, I.; Barz, C.
2017-05-01
Adopting the most appropriate technology for developing applications on an integrated software system for enterprises may result in great savings in both cost and hours of work. This paper proposes a research study to determine a hierarchy among three SAP (System Applications and Products in Data Processing) technologies. The technologies Web Dynpro (WD), Floorplan Manager (FPM) and CRM WebClient UI (CRM WCUI) are evaluated against multiple criteria in terms of the performance obtained by implementing the same web business application. To establish the hierarchy, a multi-criteria analysis model combining the AHP (Analytic Hierarchy Process) and TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) methods was proposed. This model was built with the help of the SuperDecision software, which is based on the AHP method and determines the weights for the selected sets of criteria. The TOPSIS method was then used to obtain the final ranking and the technology hierarchy.
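A minimal sketch of how AHP-derived criteria weights can feed a TOPSIS ranking, in the spirit of the combined model described above. The pairwise-comparison values and the decision matrix are invented placeholders, not the paper's data, and all criteria are treated as benefit criteria.

```python
import numpy as np

# --- AHP: derive criteria weights from a pairwise comparison matrix ---
# (illustrative judgments, not the study's data)
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 3.0],
                     [1/5, 1/3, 1.0]])
eigvals, eigvecs = np.linalg.eig(pairwise)
principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = principal / principal.sum()          # criteria weights, sum to 1

# --- TOPSIS: rank three alternatives (rows) under those weights ---
decision = np.array([[7.0, 9.0, 6.0],          # hypothetical scores for WD
                     [8.0, 7.0, 7.0],          # ... for FPM
                     [9.0, 6.0, 8.0]])         # ... for CRM WCUI
norm = decision / np.linalg.norm(decision, axis=0)   # vector normalization
v = norm * weights                                   # weighted normalized matrix
ideal, anti = v.max(axis=0), v.min(axis=0)           # ideal / anti-ideal points
d_pos = np.linalg.norm(v - ideal, axis=1)
d_neg = np.linalg.norm(v - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)            # higher = closer to ideal
ranking = np.argsort(-closeness)               # best alternative first
```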
Danezis, G P; Anagnostopoulos, C J; Liapis, K; Koupparis, M A
2016-10-26
One of the recent trends in analytical chemistry is the development of economical, quick and easy hyphenated methods whose scope includes analytes of different classes and physicochemical properties. In this work a multi-residue method was developed for the simultaneous determination of 28 xenobiotics (polar and hydrophilic) using hydrophilic interaction liquid chromatography (HILIC) coupled with triple quadrupole mass spectrometry (LC-MS/MS). The scope of the method includes plant growth regulators (chlormequat, daminozide, diquat, maleic hydrazide, mepiquat, paraquat), pesticides (cyromazine, amitrole, and PTU (propylenethiourea), the metabolite of the fungicide propineb), various multiclass antibiotics (tetracyclines, sulfonamides, quinolones, kasugamycin) and mycotoxins (aflatoxin B1, B2, fumonisin B1 and ochratoxin A). Isolation of the analytes from the matrix was achieved with a fast and effective technique. The validation of the multi-residue method was performed at two levels, 10 μg/kg and 100 μg/kg, in the following representative substrates: fruits and vegetables (apples, apricots, lettuce and onions), cereals and pulses (flour and chickpeas), animal products (milk and meat) and cereal-based baby foods. The method was validated taking into consideration EU guidelines and showed acceptable linearity (r ≥ 0.99), accuracy with recoveries between 70 and 120%, and precision with RSD ≤ 20% for the majority of the analytes studied. For the analytes whose accuracy and precision values fell outside the acceptable limits, the method can still serve as a semi-quantitative method. The matrix effect and the limits of detection and quantification were also estimated and compared with the current EU MRLs (Maximum Residue Levels) and FAO/WHO MLs (Maximum Levels) or CXLs (Codex Maximum Residue Limits). The combined and expanded uncertainty of the method for each analyte per substrate was also estimated. Copyright © 2016 Elsevier B.V. All rights reserved.
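The quoted acceptance criteria (recoveries between 70 and 120%, RSD ≤ 20%) can be checked mechanically from replicate measurements of a spiked sample. The replicate values below are invented for illustration, not the study's data.

```python
import statistics

# Hypothetical replicate results for one analyte spiked at 100 ug/kg.
spike_level = 100.0                                 # spiked concentration, ug/kg
replicates = [92.1, 88.4, 95.0, 90.7, 86.9, 93.5]   # measured values, ug/kg

# Mean recovery as a percentage of the spiked amount.
recovery = 100.0 * statistics.mean(replicates) / spike_level

# Relative standard deviation (precision) of the replicates.
rsd = 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# EU-style acceptance window quoted in the abstract.
acceptable = 70.0 <= recovery <= 120.0 and rsd <= 20.0
```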
Carbon dioxide gas purification and analytical measurement for leading edge 193nm lithography
NASA Astrophysics Data System (ADS)
Riddle Vogt, Sarah; Landoni, Cristian; Applegarth, Chuck; Browning, Matt; Succi, Marco; Pirola, Simona; Macchi, Giorgio
2015-03-01
The use of purified carbon dioxide (CO2) has become a reality for leading edge 193 nm immersion lithography scanners. Traditionally, both dry and immersion 193 nm lithographic processes have constantly purged the optics stack with ultrahigh purity compressed dry air (UHPCDA). CO2 has been utilized for a similar purpose as UHPCDA. Airborne molecular contamination (AMC) purification technologies and analytical measurement methods have been extensively developed to support the lithography tool manufacturers' purity requirements. This paper covers the analytical tests and characterizations carried out to assess impurity removal from 3.0 N CO2 (beverage grade) for its final utilization in 193 nm and EUV scanners.
Review and assessment of the database and numerical modeling for turbine heat transfer
NASA Technical Reports Server (NTRS)
Gladden, H. J.; Simoneau, R. J.
1988-01-01
The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena and to assess and improve the analytical methods used to predict the flow and heat transfer in high-temperature gas turbines. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. A building-block approach was utilized and the research ranged from the study of fundamental phenomena and modeling to experiments in simulated real engine environments. Experimental research accounted for approximately 75 percent of the funding while the analytical efforts were approximately 25 percent. A healthy government/industry/university partnership, with industry providing almost half of the research, was created to advance the turbine heat transfer design technology base.
Analytical method for thermal stress analysis of plasma facing materials
NASA Astrophysics Data System (ADS)
You, J. H.; Bolt, H.
2001-10-01
The thermo-mechanical response of plasma facing materials (PFMs) to heat loads from the fusion plasma is one of the crucial issues in fusion technology. In this work, a fully analytical description of the thermal stress distribution in armour tiles of plasma facing components is presented which is expected to occur under typical high heat flux (HHF) loads. The method of stress superposition is applied considering the temperature gradient and thermal expansion mismatch. Several combinations of PFMs and heat sink metals are analysed and compared. In the framework of the present theoretical model, plastic flow and the effect of residual stress can be quantitatively assessed. Possible failure features are discussed.
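One term of the superposition described above, the thermal-expansion-mismatch stress in a thin armour tile bonded to a heat-sink metal, can be estimated with the standard biaxial formula sigma = E * d_alpha * dT / (1 - nu). The material values below are rounded textbook numbers for a tungsten-on-copper pair, not the paper's inputs.

```python
# Back-of-envelope sketch of the expansion-mismatch term only; the
# temperature-gradient term of the superposition is omitted here.
E_armour = 400e9        # Young's modulus of tungsten armour, Pa (rounded)
nu = 0.28               # Poisson's ratio of tungsten (rounded)
alpha_armour = 4.5e-6   # CTE of tungsten, 1/K (rounded)
alpha_sink = 16.5e-6    # CTE of copper heat sink, 1/K (rounded)
dT = 400.0              # swing from the stress-free joining temperature, K

# Biaxial mismatch stress in the constrained armour layer.
sigma = E_armour * (alpha_sink - alpha_armour) * dT / (1 - nu)
print(f"mismatch stress ~ {sigma / 1e6:.0f} MPa")
```

The multi-GPa result illustrates why plastic flow and residual stress, both discussed in the paper, matter for such joints.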
Analytical Ferrography Standardization.
1982-01-01
AD-A116 508. Mechanical Technology Incorporated, Research and Development Division, Latham, NY. Analytical Ferrography Standardization (U). Final Report, MTI Technical Report No. 82TRS6, January 1982. P. B. Senholzi and A. S. Maciejewski, Applications Engineering.
The Use of Life Cycle Tools to Support Decision Making for Sustainable Nanotechnologies
Nanotechnology is a broad-impact technology with applications ranging from materials and electronics to analytical methods and metrology. The many benefits that can be realized through the utilization of nanotechnology are intended to lead to an improved quality of life. However,...
On-Site Detection as a Countermeasure to Chemical Warfare/Terrorism.
Seto, Y
2014-01-01
On-site monitoring and detection are necessary in the crisis and consequence management of wars and terrorism involving chemical warfare agents (CWAs) such as sarin. The analytical performance required for on-site detection is mainly determined by the fatal vapor concentration and volatility of the CWAs involved. The presently available on-site technologies and commercially available on-site equipment for detecting CWAs that are interpreted and compared in this review include: classical manual methods, photometric methods, ion mobility spectrometry, vibrational spectrometry, gas chromatography, mass spectrometry, sensors, and other methods. Some of the data evaluated were obtained from our experiments using authentic CWAs. We concluded that (a) no technologies perfectly fulfill all of the on-site detection requirements and (b) adequate on-site detection requires (i) a combination of the monitoring-tape method and ion mobility spectrometry for point detection and (ii) a combination of the monitoring-tape method, atmospheric pressure chemical ionization mass spectrometry with counterflow introduction, and gas chromatography with a trap and special detectors for continuous monitoring. The basic properties of CWAs, the concept of on-site detection, and the sarin gas attacks in Japan, as well as the forensic investigations thereof, are also explicated in this article. Copyright © 2014 Central Police University.
Mattarozzi, Monica; Suman, Michele; Cascio, Claudia; Calestani, Davide; Weigel, Stefan; Undas, Anna; Peters, Ruud
2017-01-01
Estimating consumer exposure to nanomaterials (NMs) in food products and predicting their toxicological properties are necessary steps in the assessment of the risks of this technology. To this end, analytical methods have to be available to detect, characterize and quantify NMs in food and materials related to food, e.g. food packaging and biological samples following metabolization of food. The challenge for the analytical sciences is that the characterization of NMs requires chemical as well as physical information. This article offers a comprehensive analysis of methods available for the detection and characterization of NMs in food and related products. Special attention was paid to the crucial role of sample preparation methods since these have been partially neglected in the scientific literature so far. The currently available instrumental methods are grouped as fractionation, counting and ensemble methods, and their advantages and limitations are discussed. We conclude that much progress has been made over the last 5 years but that many challenges still exist. Future perspectives and priority research needs are pointed out. Graphical abstract: two possible analytical strategies for the sizing and quantification of nanoparticles: Asymmetric Flow Field-Flow Fractionation with multiple detectors (allows the determination of true size and a mass-based particle size distribution), and Single Particle Inductively Coupled Plasma Mass Spectrometry (allows the determination of a spherical equivalent diameter of the particle and a number-based particle size distribution).
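The spherical equivalent diameter reported by single-particle ICP-MS follows from the measured per-particle mass and an assumed bulk density, d = (6m / (pi * rho))**(1/3). A minimal sketch; the particle mass and density below are illustrative values for a silver nanoparticle, not data from the article.

```python
import math

def spherical_equivalent_diameter_nm(mass_g, density_g_cm3):
    """Convert a per-particle mass to a spherical equivalent diameter (nm)."""
    volume_cm3 = mass_g / density_g_cm3
    d_cm = (6.0 * volume_cm3 / math.pi) ** (1.0 / 3.0)
    return d_cm * 1e7          # cm -> nm

# e.g., a ~1.1 fg particle of silver (bulk density ~10.49 g/cm^3)
d = spherical_equivalent_diameter_nm(1.1e-15, 10.49)   # roughly 58-59 nm
```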
Alharthi, Hana; Sultana, Nahid; Al-Amoudi, Amjaad; Basudan, Afrah
2015-01-01
Pharmacy barcode scanning is used to reduce errors during the medication dispensing process. However, this technology has rarely been used in hospital pharmacies in Saudi Arabia. This article describes the barriers to successful implementation of a barcode scanning system in Saudi Arabia. A literature review was conducted to identify the relevant critical success factors (CSFs) for a successful dispensing barcode system implementation. Twenty-eight pharmacists from a local hospital in Saudi Arabia were interviewed to obtain their perception of these CSFs. In this study, planning (process flow issues and training requirements), resistance (fear of change, communication issues, and negative perceptions about technology), and technology (software, hardware, and vendor support) were identified as the main barriers. The analytic hierarchy process (AHP), one of the most widely used tools for decision making in the presence of multiple criteria, was used to compare and rank these identified CSFs. The results of this study suggest that resistance barriers have a greater impact than planning and technology barriers. In particular, fear of change is the most critical factor, and training is the least critical factor.
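The AHP comparison-and-ranking step described above can be sketched as follows. The pairwise judgments for the three barrier groups (resistance, planning, technology) are invented for illustration, not the study's survey data; the consistency check uses Saaty's random index for a 3x3 matrix.

```python
import numpy as np

# Hypothetical pairwise comparisons: resistance vs planning vs technology.
A = np.array([[1.0, 2.0, 3.0],
              [1/2, 1.0, 2.0],
              [1/3, 1/2, 1.0]])
n = A.shape[0]

# Priority weights = normalized principal eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w = w / w.sum()

# Saaty consistency check: CR < 0.10 means the judgments are acceptably
# consistent.
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)          # consistency index
ri = 0.58                             # random index for n = 3 (Saaty)
cr = ci / ri                          # consistency ratio
```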
Biosensors for hepatitis B virus detection.
Yao, Chun-Yan; Fu, Wei-Ling
2014-09-21
A biosensor is an analytical device used for the detection of analytes, which combines a biological component with a physicochemical detector. Recently, an increasing number of biosensors have been used in clinical research, for example, the blood glucose biosensor. This review focuses on the current state of biosensor research with respect to efficient, specific and rapid detection of hepatitis B virus (HBV). The biosensors developed based on different techniques, including optical methods (e.g., surface plasmon resonance), acoustic wave technologies (e.g., quartz crystal microbalance), electrochemistry (amperometry, voltammetry and impedance) and novel nanotechnology, are also discussed.
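For the quartz crystal microbalance mentioned above, the mass-to-frequency relation is the Sauerbrey equation: bound mass on the crystal surface lowers the resonance frequency in proportion. A sketch with standard quartz constants; the captured mass is an invented example, not a value from the review.

```python
import math

RHO_Q = 2.648          # density of quartz, g/cm^3
MU_Q = 2.947e11        # shear modulus of AT-cut quartz, g/(cm*s^2)

def sauerbrey_shift_hz(f0_hz, delta_mass_g, area_cm2):
    """Frequency shift for a rigid, thin, evenly spread added mass."""
    return -2.0 * f0_hz**2 * delta_mass_g / (area_cm2 * math.sqrt(RHO_Q * MU_Q))

# e.g., ~100 ng of captured target on a 5 MHz, 1 cm^2 crystal
df = sauerbrey_shift_hz(5e6, 100e-9, 1.0)   # a few Hz of negative shift
```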
Model and Analytic Processes for Export License Assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.
2011-09-29
This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent; a complex, multi-step and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided by whether they could help micro-level assessments (e.g., help improve individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level.
An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.
Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A
2017-05-10
Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges, including dealing with a class of complex molecules with the potential for multiple isomers and variable charge states, and the absence of established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources. Copyright © 2017. Published by Elsevier B.V.
Total analysis systems with Thermochromic Etching Discs technology.
Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel
2014-12-16
A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features, such as track independency, selective irradiation, a high-power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities for designing new compact disc-based total analysis systems applicable in chemistry and the life sciences. In this paper, the analytical implementation of TED is described and discussed, and its analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are addressed in order to demonstrate the practical capacity of this system. The results demonstrate the analytical usefulness of TED technology, describing how to exploit this tool to develop truly integrated analytical systems that provide solutions within the point-of-care framework.
[Progress in the application of laser ablation ICP-MS to surface microanalysis in material science].
Zhang, Yong; Jia, Yun-hai; Chen, Ji-wen; Shen, Xue-jing; Liu, Ying; Zhao, Leiz; Li, Dong-ling; Hang, Peng-cheng; Zhao, Zhen; Fan, Wan-lun; Wang, Hai-zhou
2014-08-01
In the present paper, the apparatus and theory of surface analysis are introduced, and progress in the application of laser ablation ICP-MS to microanalysis in the ferrous, nonferrous and semiconductor fields is reviewed in detail. Compared with traditional surface analytical tools, such as SEM/EDS (scanning electron microscopy/energy-dispersive spectroscopy), EPMA (electron probe microanalysis) and AES (Auger electron spectroscopy), its advantages are little or no sample preparation, spatial resolution adjustable to the analytical demand, multi-element analysis and high sensitivity. It is now a powerful complementary method to traditional surface analytical tools. As LA-ICP-MS technology matures, more and more analytical workers will use this powerful tool, and LA-ICP-MS may become as prominent in the elemental analysis field as LIBS (laser-induced breakdown spectroscopy).
Schneid, Stefan C; Johnson, Robert E; Lewis, Lavinia M; Stärtzel, Peter; Gieseler, Henning
2015-05-01
Process analytical technology (PAT) and quality by design have gained importance in all areas of pharmaceutical development and manufacturing. One important method for monitoring critical product attributes and for process optimization in laboratory-scale freeze-drying is manometric temperature measurement (MTM). A drawback of this innovative technology is that problems are encountered when processing highly concentrated amorphous materials, particularly protein formulations. In this study, a model solution of bovine serum albumin and sucrose was lyophilized at both conservative and aggressive primary drying conditions. Different temperature sensors were employed to monitor product temperatures. The residual moisture content at the primary drying endpoints indicated by the temperature sensors and by batch PAT methods was quantified from extracted sample vials. The data from the temperature probes were then used to recalculate critical product parameters, and the results were compared with MTM data. The drying endpoints indicated by the temperature sensors were not suitable for endpoint indication, in contrast to the batch method endpoints. The accuracy of the MTM ice vapor pressure (Pice) data was found to be influenced by water reabsorption. Recalculation of product resistance (Rp) and Pice values based on data from temperature sensors and weighed vials was possible. Overall, extensive information about critical product parameters could be obtained using data from complementary PAT tools. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
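The recalculation of Pice from a measured product temperature can be sketched with a vapor-pressure correlation for ice commonly used in the freeze-drying literature (pressure in Torr, temperature in kelvin). The correlation constants are standard literature values; the example temperature is invented, not the study's data.

```python
import math

def p_ice_torr(temp_k):
    """Equilibrium vapor pressure of ice (Torr) at the given temperature (K),
    using a correlation widely quoted in lyophilization literature."""
    return 2.698e10 * math.exp(-6144.96 / temp_k)

# e.g., a product temperature of 250 K (-23.15 C) at the sublimation front
p = p_ice_torr(250.0)   # on the order of half a Torr
```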
NASA Astrophysics Data System (ADS)
Crockett, Denise King
The purpose of the study is to show how science is defined and technology is selected in an Amish Mennonite (fundamentalist Christian) community and its school. Additionally, by examining this community, information is collected on how a fundamentalist school's treatment of and experience with science and technology compare to what has occurred over time in public schools in the United States. An ethnographic approach was used to recreate the shared beliefs, practices, artifacts, folk knowledge, and behaviors of this community. The ethnographic methodology allowed analytical descriptions and reconstructions of whole cultural scenes and groups of the community. Analysis of data followed an analytic induction method. The data collected included participant observation, documentation, photographs, formal interviews, informal interviews, audiotaping, journal entries, and artifacts. Findings indicate that science is wholly subsumed by Amish Mennonite religion. Using the transmission model, the Amish Mennonites teach science as a list of facts from the King James version of the Holy Bible. This method of teaching promotes community values and beliefs. Technology is seen as a tool for making the community prosper. For this community to sustain itself, economic stability must be maintained. Their economic stability is dependent on the outside community purchasing their goods and services; producing these goods and services requires the use of appropriate technologies. In United States public schools, science is taught as a way of knowing that implies a critical view of how the world works. In addition, public schools promote new and innovative technologies. Thus, they become fertile soil for developing new concepts about implementing scientific ideas and using technology. This encouragement stands in sharp contrast to the Amish Mennonite school.
For the Amish Mennonites, rigorous standards such as the scientific method, as addressed in the public school, do not exist. In contrast, critical analysis of any new technology is always applied in this community.
Colorimetric Solid Phase Extraction (CSPE): Using Color to Monitor Spacecraft Water Quality
NASA Technical Reports Server (NTRS)
Gazda, Daniel B.; Nolan, Daniel J.; Rutz, Jeffrey A.; Schultz, John R.; Siperko, Lorraine M.; Porter, Marc D.; Lipert, Robert J.; Flint, Stephanie M.; McCoy, J. Torin
2010-01-01
In August 2009, an experimental water quality monitoring kit based on Colorimetric Solid Phase Extraction (CSPE) technology was delivered to the International Space Station (ISS). The kit, called the Colorimetric Water Quality Monitoring Kit (CWQMK), was launched as a Station Development Test Objective (SDTO) experiment to evaluate the suitability of CSPE technology for routine water quality monitoring on the ISS. CSPE is a sorption-spectrophotometric technique that combines colorimetric reagents, solid-phase extraction, and diffuse reflectance spectroscopy to quantify trace analytes in water samples. In CSPE, a known volume of sample is metered through a membrane disk that has been impregnated with an analyte-specific colorimetric reagent and any additives required to optimize the formation of the analyte-reagent complex. As the sample flows through the membrane disk, the target analyte is selectively extracted, concentrated, and complexed. Formation of the analyte-reagent complex causes a detectable change in the color of the membrane disk that is proportional to the amount of analyte present in the sample. The analyte is then quantified by measuring the color of the membrane disk surface using a hand-held diffuse reflectance spectrophotometer (DRS). The CWQMK provides the capability to measure ionic silver (Ag+) and molecular iodine (I2) in water samples on-orbit. These analytes were selected for the evaluation of CSPE technology because they are the biocides used in the potable water storage and distribution systems on the ISS. Biocides are added to the potable water systems on spacecraft to inhibit microbial growth. On the United States (US) segment of the ISS, molecular iodine serves as the biocide, while the Russian space agency utilizes silver as the biocide in its systems. In both cases, the biocides must be maintained at a level sufficient to control bacterial growth, but low enough to avoid any negative effects on crew health.
For example, the presence of high levels of iodine in water can cause taste and odor issues that result in decreased water consumption by the crew. There are also concerns about potential impacts on thyroid function following exposure to high levels of iodine. With silver, there is a risk of developing argyria, an irreversible blue-gray discoloration of the skin, associated with long term consumption of water containing high concentrations of silver. The need to ensure that safe, effective levels of biocide are maintained in the potable water systems on the ISS provides a perfect platform for evaluating the suitability of CSPE technology for in-flight water quality monitoring. This paper provides an overview of CSPE technology and details on the silver and iodine methods used in the CWQMK. It also reports results obtained during in-flight analyses performed with the CWQMK and briefly discusses other potential applications for CSPE technology in both the spacecraft and terrestrial environments.
Chebrolu, Kranthi K; Yousef, Gad G; Park, Ryan; Tanimura, Yoshinori; Brown, Allan F
2015-09-15
A high-throughput, robust and reliable method for the simultaneous analysis of five carotenoids, four chlorophylls and one tocopherol was developed for rapid screening of large sample populations to facilitate molecular biology and plant breeding. Separation was achieved for 10 known analytes and four unknown carotenoids in a significantly reduced run time of 10 min. The identity of the 10 analytes was confirmed by their UV-Vis absorption spectra. Quantification of tocopherol, carotenoids and chlorophylls was performed at 290 nm, 460 nm and 650 nm, respectively. In this report, two sub-2-μm particle core-shell columns, Kinetex from Phenomenex (1.7 μm particle size, 12% carbon load) and Cortecs from Waters (1.6 μm particle size, 6.6% carbon load), were investigated and their separation efficiencies were evaluated. Peak resolutions were >1.5 for all analytes except chlorophyll-a' with the Cortecs column. The ruggedness of the method was evaluated on two identical but separate instruments, which produced CVs < 2 in peak retention for nine out of 10 analytes separated. Copyright © 2015 Elsevier B.V. All rights reserved.
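The resolution criterion quoted above (Rs > 1.5 for baseline separation) is conventionally computed from retention times and baseline peak widths. A sketch with invented peak values, not the paper's chromatographic data.

```python
def resolution(t1, w1, t2, w2):
    """Chromatographic resolution from retention times (t) and baseline
    peak widths (w), Rs = 2 * (t2 - t1) / (w1 + w2)."""
    return 2.0 * (t2 - t1) / (w1 + w2)

# Hypothetical adjacent peaks (minutes): retention 4.80 and 5.10,
# baseline widths 0.20 and 0.18.
rs = resolution(t1=4.80, w1=0.20, t2=5.10, w2=0.18)
baseline_separated = rs > 1.5
```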
Patel, Bhumit A; Pinto, Nuno D S; Gospodarek, Adrian; Kilgore, Bruce; Goswami, Kudrat; Napoli, William N; Desai, Jayesh; Heo, Jun H; Panzera, Dominick; Pollard, David; Richardson, Daisy; Brower, Mark; Richardson, Douglas D
2017-11-07
Combining process analytical technology (PAT) with continuous production provides a powerful tool to observe and control monoclonal antibody (mAb) fermentation and purification processes. This work demonstrates on-line liquid chromatography (on-line LC) as a PAT tool for monitoring a continuous biologics process and forced degradation studies. Specifically, this work focused on ion exchange chromatography (IEX), which is a critical separation technique to detect charge variants. Product-related impurities, including charge variants, that impact function are classified as critical quality attributes (CQAs). First, we confirmed no significant differences were observed in the charge heterogeneity profile of a mAb through both at-line and on-line sampling and that the on-line method has the ability to rapidly detect changes in protein quality over time. The robustness and versatility of the PAT methods were tested by sampling from two purification locations in a continuous mAb process. The PAT IEX methods used with on-line LC were a weak cation exchange (WCX) separation and a newly developed shorter strong cation exchange (SCX) assay. Both methods provided similar results with the distribution of percent acidic, main, and basic species remaining unchanged over a 2 week period. Second, a forced degradation study showed an increase in acidic species and a decrease in basic species when sampled on-line over 7 days. These applications further strengthen the use of on-line LC to monitor CQAs of a mAb continuously with various PAT IEX analytical methods. Implementation of on-line IEX will enable faster decision making during process development and could potentially be applied to control in biomanufacturing.
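The percent acidic/main/basic distribution reported above is derived from integrated IEX peak areas grouped into the three species classes and expressed as area percentages. A minimal sketch with invented peak areas, not data from the study.

```python
# Hypothetical integrated peak areas from one IEX injection (arbitrary units),
# already grouped into acidic, main, and basic species.
areas = {"acidic": 18.2, "main": 130.5, "basic": 11.3}

# Area-percent distribution of the charge variants.
total = sum(areas.values())
percent = {k: 100.0 * v / total for k, v in areas.items()}
```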
Second International Conference on Accelerating Biopharmaceutical Development
2009-01-01
The Second International Conference on Accelerating Biopharmaceutical Development was held in Coronado, California. The meeting was organized by the Society for Biological Engineering (SBE) and the American Institute of Chemical Engineers (AIChE); SBE is a technological community of the AIChE. Bob Adamson (Wyeth) and Chuck Goochee (Centocor) were co-chairs of the event, which had the theme “Delivering cost-effective, robust processes and methods quickly and efficiently.” The first day focused on emerging disruptive technologies and cutting-edge analytical techniques. Day two featured presentations on accelerated cell culture process development, critical quality attributes, specifications and comparability, and high throughput protein formulation development. The final day was dedicated to discussion of technology options and new analysis methods provided by emerging disruptive technologies; functional interaction, integration and synergy in platform development; and rapid and economic purification process development. PMID:20065637
NMR and MS Methods for Metabolomics.
Amberg, Alexander; Riefke, Björn; Schlotterbeck, Götz; Ross, Alfred; Senn, Hans; Dieterle, Frank; Keck, Matthias
2017-01-01
Metabolomics, also often referred to as "metabolic profiling," is the systematic profiling of metabolites in biofluids or tissues of organisms and their temporal changes. In the last decade, metabolomics has become more and more popular in drug development, molecular medicine, and other biotechnology fields, since it profiles directly the phenotype and changes thereof in contrast to other "-omics" technologies. The increasing popularity of metabolomics has been possible only due to the enormous development in the technology and bioinformatics fields. In particular, the analytical technologies supporting metabolomics, i.e., NMR, UPLC-MS, and GC-MS, have evolved into sensitive and highly reproducible platforms allowing the determination of hundreds of metabolites in parallel. This chapter describes the best practices of metabolomics as seen today. All important steps of metabolic profiling in drug development and molecular medicine are described in great detail, starting from sample preparation to determining the measurement details of all analytical platforms, and finally to discussing the corresponding specific steps of data analysis.
NMR and MS methods for metabonomics.
Dieterle, Frank; Riefke, Björn; Schlotterbeck, Götz; Ross, Alfred; Senn, Hans; Amberg, Alexander
2011-01-01
Metabonomics, also often referred to as "metabolomics" or "metabolic profiling," is the systematic profiling of metabolites in bio-fluids or tissues of organisms and their temporal changes. In the last decade, metabonomics has become increasingly popular in drug development, molecular medicine, and other biotechnology fields, since it profiles directly the phenotype and changes thereof in contrast to other "-omics" technologies. The increasing popularity of metabonomics has been possible only due to the enormous development in the technology and bioinformatics fields. In particular, the analytical technologies supporting metabonomics, i.e., NMR, LC-MS, UPLC-MS, and GC-MS have evolved into sensitive and highly reproducible platforms allowing the determination of hundreds of metabolites in parallel. This chapter describes the best practices of metabonomics as seen today. All important steps of metabolic profiling in drug development and molecular medicine are described in great detail, starting from sample preparation, to determining the measurement details of all analytical platforms, and finally, to discussing the corresponding specific steps of data analysis.
NASA Technical Reports Server (NTRS)
Bunin, Bruce L.
1985-01-01
A program was conducted to develop the technology for critical structural joints in composite wing structure that meets all the design requirements of a 1990 commercial transport aircraft. The results of four large composite multirow bolted joint tests are presented. The tests were conducted to demonstrate the technology for critical joints in highly loaded composite structure and to verify the analytical methods that were developed throughout the program. The test consisted of a wing skin-stringer transition specimen representing a stringer runout and skin splice on the wing lower surface at the side of the fuselage attachment. All tests were static tension tests. The composite material was Toray T-300 fiber with Ciba-Geigy 914 resin in 10 mil tape form. The splice members were metallic, using combinations of aluminum and titanium. Discussions are given of the test article, instrumentation, test setup, test procedures, and test results for each of the four specimens. Some of the analytical predictions are also included.
Recent advances in analytical methods, biomarker discovery, cell-based assay development, computational tools, sensor/monitor, and omics technology have enabled new streams of exposure and toxicity data to be generated at higher volumes and speed. These new data offer the opport...
Quantitative DNA fiber mapping
Gray, Joe W.; Weier, Heinz-Ulrich G.
1998-01-01
The present invention relates generally to the DNA mapping and sequencing technologies. In particular, the present invention provides enhanced methods and compositions for the physical mapping and positional cloning of genomic DNA. The present invention also provides a useful analytical technique to directly map cloned DNA sequences onto individual stretched DNA molecules.
This project will demonstrate ways to detect contaminants by LC/MS technologies in order to protect water systems and environments. Contaminants can affect drinking water usage and limit acceptable sources of ground and reservoir supplies. The analytical method to enhance the s...
Ultra-sensitive detection using integrated waveguide technologies
USDA-ARS?s Scientific Manuscript database
There is a pressing need to detect analytes at very low concentrations, such as food- and water-borne pathogens (e.g. E. coli O157:H7) and biothreat agents (e.g., anthrax, toxins). Common fluorescence detection methods, such as 96 well plate readers, are not sufficiently sensitive for low concentra...
A Graphical Approach to Teaching Amplifier Design at the Undergraduate Level
ERIC Educational Resources Information Center
Assaad, R. S.; Silva-Martinez, J.
2009-01-01
Current methods of teaching basic amplifier design at the undergraduate level need further development to match today's technological advances. The general class approach to amplifier design is analytical and heavily based on mathematical manipulations. However, the students' mathematical abilities are generally modest, creating a void in which…
Homeland Security Research Improves the Nation's Ability to ...
Technical Brief Homeland Security (HS) Research develops data, tools, and technologies to minimize the impact of accidents, natural disasters, terrorist attacks, and other incidents that can result in toxic chemical, biological or radiological (CBR) contamination. HS Research develops ways to detect contamination, sampling strategies, sampling and analytical methods, cleanup methods, waste management approaches, exposure assessment methods, and decision support tools (including water system models). These contributions improve EPA’s response to a broad range of environmental disasters.
Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric
2014-03-01
Pharmaceutical companies are progressively adopting and introducing the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, which aim to build quality directly into the product by combining thorough scientific understanding with quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fitted for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were established as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for residual methanol. Since the variability of the sampling method and of the reference method is by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and elimination of a rather difficult sampling step and of tedious off-line analyses. © 2013 Published by Elsevier B.V.
Hou, Xiang-Mei; Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang
2016-07-01
To study and establish a monitoring method for the macroporous resin column chromatography process of salvianolic acids by using near infrared spectroscopy (NIR) as a process analytical technology (PAT). The multivariate statistical process control (MSPC) model was developed based on 7 normal operation batches, and 2 test batches (one normal operation batch and one abnormal operation batch) were used to verify the monitoring performance of this model. The results showed that the MSPC model had good monitoring ability for the column chromatography process. Meanwhile, a NIR quantitative calibration model was established for three key quality indexes (rosmarinic acid, lithospermic acid and salvianolic acid B) by using the partial least squares (PLS) algorithm. The verification results demonstrated that this model had satisfactory prediction performance. The combined application of the above two models can effectively achieve real-time monitoring of the macroporous resin column chromatography process of salvianolic acids and can be used for on-line analysis of key quality indexes. This established process monitoring method could provide a reference for the development of process analytical technology for traditional Chinese medicine manufacturing. Copyright© by the Chinese Pharmaceutical Association.
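One statistic commonly used in MSPC charts of this kind is Hotelling's T². A minimal sketch, assuming roughly multivariate-normal normal-operation data (the batches below are simulated, not the paper's):

```python
import numpy as np

def hotelling_t2(X_train, x_new):
    """T² statistic of a new observation against normal-operation data."""
    mu = X_train.mean(axis=0)
    cov = np.cov(X_train, rowvar=False)
    d = x_new - mu
    return float(d @ np.linalg.inv(cov) @ d)

rng = np.random.default_rng(1)
normal_batches = rng.normal(size=(70, 3))        # e.g. 3 process variables
in_control = rng.normal(size=3)                  # behaves like training data
fault = in_control + np.array([0.0, 6.0, 0.0])   # simulated abnormal batch
print(hotelling_t2(normal_batches, in_control), hotelling_t2(normal_batches, fault))
```

In a chart, T² values are compared against a control limit derived from an F-distribution; the abnormal batch's large shift in one variable pushes its T² well above the in-control value.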
Application of ionic liquid in liquid phase microextraction technology.
Han, Dandan; Tang, Baokun; Lee, Yu Ri; Row, Kyung Ho
2012-11-01
Ionic liquids (ILs) are novel nonmolecular solvents. Their unique properties, such as high thermal stability, tunable viscosity, negligible vapor pressure, nonflammability, and good solubility for inorganic and organic compounds, make them excellent candidates as extraction media for a range of microextraction techniques. Many physical properties of ILs can be varied, and the structural design can be tuned to impart the desired functionality and enhance the analyte extraction selectivity, efficiency, and sensitivity. This paper provides an overview of the applications of ILs in liquid phase microextraction technology, such as single-drop microextraction, hollow fiber based liquid phase microextraction, and dispersive liquid-liquid microextraction. The sensitivity, linear calibration range, and detection limits for a range of target analytes in the methods were analyzed to determine the advantages of ILs in liquid phase microextraction. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Lin, Xiangyue; Peng, Minli; Lei, Fengming; Tan, Jiangxian; Shi, Huacheng
2017-12-01
Based on the assumptions of uniform corrosion and linear elastic expansion, an analytical model of cracking due to rebar corrosion expansion in concrete was established that is able to account for internal structural forces. Then, by means of the complex variable function theory and series expansion techniques established by Muskhelishvili, the corresponding stress component functions of the concrete around the reinforcement were obtained. A comparative analysis was also conducted between a numerical simulation model and the present model. The results show that the calculations of the two methods were consistent with each other, with a numerical deviation of less than 10%, proving that the analytical model established in this paper is reliable.
Going local: technologies for exploring bacterial microenvironments
Wessel, Aimee K.; Hmelo, Laura; Parsek, Matthew R.; Whiteley, Marvin
2014-01-01
Microorganisms lead social lives and use coordinated chemical and physical interactions to establish complex communities. Mechanistic insights into these interactions have revealed that there are remarkably intricate systems for coordinating microbial behaviour, but little is known about how these interactions proceed in the spatially organized communities that are found in nature. This Review describes the technologies available for spatially organizing small microbial communities and the analytical methods for characterizing the chemical environment surrounding these communities. Together, these complementary technologies have provided novel insights into the impact of spatial organization on both microbial behaviour and the development of phenotypic heterogeneity within microbial communities. PMID:23588251
On Establishing Big Data Wave Breakwaters with Analytics (Invited)
NASA Astrophysics Data System (ADS)
Riedel, M.
2013-12-01
The Research Data Alliance Big Data Analytics (RDA-BDA) Interest Group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs of utilizing large quantities of data. RDA-BDA seeks to analyze different scientific domain applications and their potential use of various big data analytics techniques. A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. These combinations are complex since a wide variety of different data analysis algorithms exist (e.g. specific algorithms using GPUs for analyzing brain images) that need to work together with multiple analytical tools, ranging from simple (iterative) map-reduce methods (e.g. with Apache Hadoop or Twister) to sophisticated higher-level frameworks that leverage machine learning algorithms (e.g. Apache Mahout). These computational analysis techniques are often augmented with visual analytics techniques (e.g. computational steering on large-scale high performance computing platforms) to put the human judgement into the analysis loop, or with new approaches to databases that are designed to support new forms of unstructured or semi-structured data as opposed to the rather traditional structured databases (e.g. relational databases). More recently, data analysis and underpinning analytics frameworks also have to consider the energy footprints of the underlying resources. To sum up, the aim of this talk is to provide pieces of information to understand big data analytics in the context of science and engineering, using the aforementioned classification as the lighthouse and as the frame of reference for a systematic approach.
This talk will provide insights about big data analytics methods in the context of science within various communities, and offer different views of how approaches based on correlation and causality provide complementary methods to advance science and engineering today. The RDA Big Data Analytics Group seeks to understand which approaches are not only technically feasible, but also scientifically feasible. The lighthouse goal of the RDA Big Data Analytics Group is a classification of clever combinations of various technologies and scientific applications, in order to provide clear recommendations to the scientific community on which approaches are technically and scientifically feasible.
Development of airframe design technology for crashworthiness.
NASA Technical Reports Server (NTRS)
Kruszewski, E. T.; Thomson, R. G.
1973-01-01
This paper describes the NASA portion of a joint FAA-NASA General Aviation Crashworthiness Program leading to the development of improved crashworthiness design technology. The objectives of the program are to develop analytical technology for predicting crashworthiness of structures, provide design improvements, and perform full-scale crash tests. The analytical techniques which are being developed both in-house and under contract are described, and typical results from these analytical programs are shown. In addition, the full-scale testing facility and test program are discussed.
MSFC Advanced Concepts Office and the Iterative Launch Vehicle Concept Method
NASA Technical Reports Server (NTRS)
Creech, Dennis
2011-01-01
This slide presentation reviews the work of the Advanced Concepts Office (ACO) at Marshall Space Flight Center (MSFC), with particular emphasis on the method used to model launch vehicles using INTegrated ROcket Sizing (INTROS), a modeling system that assists in establishing the launch concept design and stage sizing, and facilitates the integration of external analytic efforts, vehicle architecture studies, and technology and system trades and parameter sensitivities.
SDF technology in location and navigation procedures: a survey of applications
NASA Astrophysics Data System (ADS)
Kelner, Jan M.; Ziółkowski, Cezary
2017-04-01
The basis for the development of the Doppler location method, also called the signal Doppler frequency (SDF) method or technology, is the analytical solution of the wave equation for a mobile source. This paper presents an overview of simulations, numerical analyses and empirical studies of the possibilities and the range of applications of the SDF method. In the paper, the various applications from numerous publications are collected and described. They mainly focus on the use of the SDF method in emitter positioning, electronic warfare, crisis management, search and rescue, and navigation. The developed method is characterized by an innovative property that is unique among location methods, because it allows the simultaneous location of many radio emitters. Moreover, this is the first method based on the Doppler effect that allows positioning of transmitters using a single mobile platform. Results obtained with the SDF method by other teams are also presented.
How Should Blood Glucose Meter System Analytical Performance Be Assessed?
Simmons, David A
2015-08-31
Blood glucose meter system analytical performance is assessed by comparing pairs of meter system and reference instrument blood glucose measurements measured over time and across a broad array of glucose values. Consequently, no single, complete, and ideal parameter can fully describe the difference between meter system and reference results. Instead, a number of assessment tools, both graphical (eg, regression plots, modified Bland-Altman plots, and error grid analysis) and tabular (eg, International Organization for Standardization guidelines, mean absolute difference, and mean absolute relative difference) have been developed to evaluate meter system performance. The strengths and weaknesses of these methods of presenting meter system performance data, including a new method known as Radar Plots, are described here. © 2015 Diabetes Technology Society.
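Two of the tabular metrics named above, mean absolute difference (MAD) and mean absolute relative difference (MARD), can be computed directly from paired meter/reference readings. A minimal sketch with hypothetical values, not data from any actual meter evaluation:

```python
def mad_mard(meter, reference):
    """Mean absolute difference (same units) and mean absolute relative difference (%)."""
    diffs = [abs(m - r) for m, r in zip(meter, reference)]
    rel = [100.0 * d / r for d, r in zip(diffs, reference)]
    return sum(diffs) / len(diffs), sum(rel) / len(rel)

meter = [100, 95, 210, 58]       # mg/dL, hypothetical paired readings
reference = [104, 90, 200, 60]
mad, mard = mad_mard(meter, reference)
print(f"MAD = {mad:.2f} mg/dL, MARD = {mard:.1f} %")
```

As the abstract notes, no single number is complete: MARD weights low-glucose errors heavily, which is why graphical tools such as error grids are used alongside it.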
Abrevaya, Ximena C; Sacco, Natalia J; Bonetto, Maria C; Hilding-Ohlsson, Astrid; Cortón, Eduardo
2015-01-15
Microbial fuel cells were rediscovered twenty years ago and are now a very active research area. The reasons behind this new activity are the relatively recent discovery of electrogenic or electroactive bacteria and the vision of two important practical applications: wastewater treatment coupled with clean energy production, and power supply systems for isolated low-power sensor devices. Although some analytical applications of MFCs were proposed earlier (such as biochemical oxygen demand sensing), only lately have a myriad of new uses of this technology been presented by research groups around the world, combining both biological-microbiological and electroanalytical expertise. This is the second part of a review of MFC applications in the area of the analytical sciences. In Part I, a general introduction to biologically based analytical methods, including bioassays, biosensors, MFC designs and operating principles, as well as perhaps the main and earliest presented application, the use as a BOD sensor, was reviewed. In Part II, other proposed uses are presented and discussed. As with other microbially based analytical systems, MFCs are satisfactory systems to measure and integrate complex parameters that are difficult or impossible to measure otherwise, such as water toxicity (where the toxic effect on aquatic organisms needs to be integrated). We explore here the methods proposed to measure toxicity and microbial metabolism and, being of special interest to space exploration, life sensors. Also, some methods with higher specificity, proposed to detect a single analyte, are presented. Different possibilities to increase selectivity and sensitivity by using molecular biology or other modern techniques are also discussed. Copyright © 2014 Elsevier B.V. All rights reserved.
Muehlwald, S; Buchner, N; Kroh, L W
2018-03-23
Because of the high number of possible pesticide residues and their chemical complexity, it is necessary to develop methods which cover a broad range of pesticides. In this work, a qualitative multi-screening method for pesticides was developed by use of HPLC-ESI-Q-TOF. 110 pesticides were chosen for the creation of a personal compound database and library (PCDL). The MassHunter Qualitative Analysis software from Agilent Technologies was used to identify the analytes. The software parameter settings were optimised to produce a low number of false positive as well as false negative results. The method was validated for 78 selected pesticides. However, the validation criteria were not fulfilled for 45 analytes. Due to this result, investigations were started to elucidate reasons for the low detectability. It could be demonstrated that the three main causes of the signal suppression were the co-eluting matrix (matrix effect), the low sensitivity of the analyte in standard solution and the fragmentation of the analyte in the ion source (in-source collision-induced dissociation). In this paper different examples are discussed showing that the impact of these three causes is different for each analyte. For example, it is possible that an analyte with low signal intensity and an intense fragmentation in the ion source is detectable in a difficult matrix, whereas an analyte with a high sensitivity and a low fragmentation is not detectable in a simple matrix. Additionally, it could be shown that in-source fragments are a helpful tool for an unambiguous identification. Copyright © 2018 Elsevier B.V. All rights reserved.
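Accurate-mass screening of the kind described rests on matching each observed m/z to library values within a ppm tolerance. A minimal sketch with hypothetical [M+H]+ masses, not taken from the authors' PCDL:

```python
def ppm_error(observed_mz, theoretical_mz):
    """Mass accuracy of a match, in parts per million."""
    return 1e6 * (observed_mz - theoretical_mz) / theoretical_mz

def match_in_library(observed_mz, library, tol_ppm=5.0):
    """Return library entries whose theoretical m/z matches within tol_ppm."""
    return [name for name, mz in library.items()
            if abs(ppm_error(observed_mz, mz)) <= tol_ppm]

# hypothetical [M+H]+ values; a real PCDL holds exact monoisotopic masses,
# retention times and, often, in-source fragment masses for confirmation
library = {"atrazine": 216.1011, "carbendazim": 192.0768}
print(match_in_library(216.1006, library))  # → ['atrazine']
```

The paper's point about in-source fragments fits naturally here: requiring a second matched mass (a known fragment) at the same retention time makes identification far less ambiguous than a single accurate-mass hit.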
Rogue athletes and recombinant DNA technology: challenges for doping control.
Azzazy, Hassan M E; Mansour, Mai M H
2007-10-01
The quest for athletic excellence holds no limit for some athletes, and the advances in recombinant DNA technology have handed these athletes the ultimate doping weapons: recombinant proteins and gene doping. Some detection methods are now available for several recombinant proteins that are commercially available as pharmaceuticals and being abused by dopers. However, researchers are struggling to come up with efficient detection methods in preparation for the imminent threat of gene doping, expected in the 2008 Olympics. This Forum article presents the main detection strategies for recombinant proteins and the forthcoming detection strategies for gene doping as well as the prime analytical challenges facing them.
Shiokawa, Yuka; Date, Yasuhiro; Kikuchi, Jun
2018-02-21
Computer-based technological innovation provides advancements in sophisticated and diverse analytical instruments, enabling massive amounts of data collection with relative ease. This is accompanied by a fast-growing demand for technological progress in data mining methods for analysis of big data derived from chemical and biological systems. From this perspective, use of a general "linear" multivariate analysis alone limits interpretations due to "non-linear" variations in metabolic data from living organisms. Here we describe a kernel principal component analysis (KPCA)-incorporated analytical approach for extracting useful information from metabolic profiling data. To overcome the limitation of important variable (metabolite) determinations, we incorporated a random forest conditional variable importance measure into our KPCA-based analytical approach to demonstrate the relative importance of metabolites. Using a market basket analysis, hippurate, the most important variable detected in the importance measure, was associated with high levels of some vitamins and minerals present in foods eaten the previous day, suggesting a relationship between increased hippurate and intake of a wide variety of vegetables and fruits. Therefore, the KPCA-incorporated analytical approach described herein enabled us to capture input-output responses, and should be useful not only for metabolic profiling but also for profiling in other areas of biological and environmental systems.
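A rough sketch of the KPCA-plus-importance idea on synthetic data. Note that scikit-learn's impurity-based feature importance is used here only as a stand-in for the conditional variable importance measure of the paper, and all data are simulated:

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
# synthetic metabolite table: 100 samples x 5 metabolites; column 0 drives a
# non-linear response (a hypothetical stand-in for e.g. hippurate)
X = rng.normal(size=(100, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=100)

# non-linear projection of the profiles, as in the KPCA step
scores = KernelPCA(n_components=2, kernel="rbf").fit_transform(X)
print("KPCA scores shape:", scores.shape)

# relative variable (metabolite) importance for the response via random forest
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print("most important variable:", rf.feature_importances_.argmax())
```

The forest correctly ranks the non-linear driver (column 0) highest, which a purely linear loading analysis could understate; that is the gap the authors' approach targets.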
Passive Magnetic Bearing With Ferrofluid Stabilization
NASA Technical Reports Server (NTRS)
Jansen, Ralph; DiRusso, Eliseo
1996-01-01
A new class of magnetic bearings is shown to exist analytically and is demonstrated experimentally. This class of magnetic bearings utilizes a ferrofluid/solid magnet interaction to stabilize the axial degree of freedom of a permanent magnet radial bearing. Twenty-six permanent magnet bearing designs and twenty-two ferrofluid stabilizer designs are evaluated. Two types of radial bearing designs are tested to determine their force and stiffness using two methods. The first method uses frequency measurements together with an analytical model to determine stiffness. The second method consists of loading the system and measuring displacement in order to measure stiffness. Two ferrofluid stabilizers are tested and force-displacement curves are measured. Two experimental test fixtures are designed and constructed in order to conduct the stiffness testing. Polynomial models of the data are generated and used to design the bearing prototype. The prototype was constructed, tested, and shown to be stable. Further testing shows the possibility of using this technology for vibration isolation. The project successfully demonstrated the viability of the passive magnetic bearing with ferrofluid stabilization both experimentally and analytically.
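The first stiffness method rests on the single-degree-of-freedom relation between natural frequency and stiffness, k = m(2πf)². A minimal sketch with hypothetical mass and frequency values (not the paper's measurements):

```python
import math

def stiffness_from_frequency(mass_kg, freq_hz):
    """Stiffness inferred from a measured natural frequency: k = m * (2*pi*f)**2."""
    return mass_kg * (2.0 * math.pi * freq_hz) ** 2

# hypothetical supported mass and measured axial bounce frequency
print(f"k = {stiffness_from_frequency(0.5, 20.0):.0f} N/m")
```

The second method (load vs. displacement) measures the same quantity directly as the slope of the force-displacement curve, which is why the two approaches can cross-check each other.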
Study on application of aerospace technology to improve surgical implants
NASA Technical Reports Server (NTRS)
Johnson, R. E.; Youngblood, J. L.
1982-01-01
The areas where aerospace technology could be used to improve the reliability and performance of metallic orthopedic implants were assessed. Specifically, comparisons were made of the material controls, design approaches, analytical methods and inspection approaches used in the implant industry with those used for aerospace hardware. Several areas for possible improvement were noted, such as increased use of finite element stress analysis and of fracture control programs on devices where the need exists for maximum reliability and high structural performance.
Slewing control experiment for a flexible panel
NASA Technical Reports Server (NTRS)
Juang, Jer-Nan
1987-01-01
Technology areas are identified in which better analytical and/or experimental methods are needed to adequately and accurately control the dynamic responses of multibody space platforms such as the space station. A generic space station solar panel is used to experimentally evaluate current control technologies. Active suppression of solar panel vibrations induced by large angle maneuvers is studied with a torque actuator at the root of the solar panel. These active suppression tests will identify the hardware requirements and adequacy of various controller designs.
Impact of active controls technology on structural integrity
NASA Technical Reports Server (NTRS)
Noll, Thomas; Austin, Edward; Donley, Shawn; Graham, George; Harris, Terry
1991-01-01
This paper summarizes the findings of The Technical Cooperation Program to assess the impact of active controls technology on the structural integrity of aeronautical vehicles and to evaluate the present state-of-the-art for predicting the loads caused by a flight-control system modification and the resulting change in the fatigue life of the flight vehicle. The potential for active controls to adversely affect structural integrity is described, and load predictions obtained using two state-of-the-art analytical methods are given.
Chemmalil, Letha; Suravajjala, Sreekanth; See, Kate; Jordan, Eric; Furtado, Marsha; Sun, Chong; Hosselet, Stephen
2015-01-01
This paper describes a novel approach for the quantitation of nonderivatized sialic acid in glycoproteins, separated by hydrophilic interaction chromatography, with detection by a Nano Quantity Analyte Detector (NQAD). The detection technique of NQAD is based on measuring the change in the size of a dry aerosol and converting the particle count rate into a chromatographic output signal. The NQAD detector is suitable for the detection of sialic acid, which lacks a sufficiently active chromophore or fluorophore. The water condensation particle counting technology allows the analyte to be enlarged using water vapor to provide the highest sensitivity. Derivatization-free analysis of glycoproteins using the HPLC/NQAD method with a PolyGLYCOPLEX™ amide column correlates well with the HPLC method with precolumn derivatization using 1,2-diamino-4,5-methylenedioxybenzene (DMB), as well as with the Dionex-based high-pH anion-exchange chromatography (or ion chromatography) with pulsed amperometric detection (HPAEC-PAD). With the elimination of the derivatization step, the HPLC/NQAD method is more efficient than the HPLC/DMB method. The HPLC/NQAD method is also more reproducible than the HPAEC-PAD method, as the latter suffers high variability because of electrode fouling during analysis. Overall, the HPLC/NQAD method offers a broad linear dynamic range as well as excellent precision, accuracy, repeatability, reliability, and ease of use, with acceptable comparability to the commonly used HPAEC-PAD and HPLC/DMB methods. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
The purpose of this report is to summarize the activities of the Analytical Chemistry Laboratory (ACL) at Argonne National Laboratory (ANL) for Fiscal Year (FY) 1993 (October 1992 through September 1993). This annual report is the tenth for the ACL and describes continuing effort on projects, work on new projects, and contributions of the ACL staff to various programs at ANL. The Analytical Chemistry Laboratory is a full-cost-recovery service center, with the primary mission of providing a broad range of analytical chemistry support services to the scientific and engineering programs at ANL. The ACL also has research programs in analytical chemistry, conducts instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL handles a wide range of analytical problems. Some routine or standard analyses are done, but it is common for the Argonne programs to generate unique problems that require development or modification of methods and adaption of techniques to obtain useful analytical data. The ACL is administratively within the Chemical Technology Division (CMT), its principal ANL client, but provides technical support for many of the technical divisions and programs at ANL. The ACL has four technical groups--Chemical Analysis, Instrumental Analysis, Organic Analysis, and Environmental Analysis--which together include about 45 technical staff members. Talents and interests of staff members cross the group lines, as do many projects within the ACL.
Martinez, Ramon; Ordunez, Pedro; Soliz, Patricia N; Ballesteros, Michael F
2016-04-01
The complexity of current injury-related health issues demands the use of diverse and massive data sets for comprehensive analyses, and the application of novel methods to communicate data effectively to the public health community, decision-makers and the public. Recent advances in information visualisation, the availability of new visual analytic methods and tools, and progress in information technology provide an opportunity for shaping the next generation of injury surveillance. To introduce the conceptual bases of data visualisation, and to propose a visual analytic and visualisation platform in public health surveillance for injury prevention and control. The paper introduces the conceptual bases of data visualisation, describes a visual analytic and visualisation platform, and presents two real-world case studies illustrating their application in public health surveillance for injury prevention and control. The application of a visual analytic and visualisation platform is presented as a solution for improved access to heterogeneous data sources, enhanced data exploration and analysis, effective data communication, and decision support. Applications of data visualisation concepts and a visual analytic platform could play a key role in shaping the next generation of injury surveillance. A visual analytic and visualisation platform could improve data use, analytic capacity, and the ability to effectively communicate findings and key messages. The public health surveillance community is encouraged to identify opportunities to develop and expand its use in injury prevention and control. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
Absorption into fluorescence. A method to sense biologically relevant gas molecules
NASA Astrophysics Data System (ADS)
Strianese, Maria; Varriale, Antonio; Staiano, Maria; Pellecchia, Claudio; D'Auria, Sabato
2011-01-01
In this work we present an innovative optical sensing methodology based on the use of biomolecules as molecular gating nano-systems. Here, as an example, we report on the detection of analytes related to climate change. In particular, we focused our attention on the detection of nitric oxide (NO) and oxygen (O2). Our methodology builds on the possibility of modulating the excitation intensity of a fluorescent probe, used as a transducer, by means of a sensor molecule, used as a filter, whose absorption is strongly affected by the binding of an analyte of interest. The two simple conditions that have to be fulfilled for the method to work are: (a) the absorption spectrum of the sensor placed inside the cuvette, and acting as the recognition element for the analyte of interest, should change strongly upon binding of the analyte, and (b) the fluorescent dye transducer should exhibit an excitation band that overlaps with one or more absorption bands of the sensor. The absorption band of the sensor affected by the binding of the specific analyte should overlap with the excitation band of the transducer. The high sensitivity of fluorescence detection combined with the use of proteins as highly selective sensors makes this method a powerful basis for the development of a new generation of analytical assays. Proof-of-principle results showing that cytochrome c peroxidase (CcP) for NO detection and myoglobin (Mb) for O2 detection can be successfully used with our new methodology are reported. The proposed technology can be easily expanded to the determination of different target analytes.
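The gating mechanism described in this abstract can be sketched numerically. In the sketch below (a minimal illustration, not the authors' code), the sensor protein acts as an absorptive filter in the excitation path, so its analyte-dependent absorbance A modulates the excitation reaching the fluorescent transducer through the Beer-Lambert transmission T = 10^(-A); all absorbance and intensity values are assumed for illustration.

```python
# Sketch of the inner-filter sensing principle: the sensor's absorbance A
# gates the excitation light reaching the fluorescent transducer.
# All numbers are hypothetical, not taken from the paper.

def transmitted_fraction(absorbance: float) -> float:
    """Fraction of excitation light passing the sensor filter (Beer-Lambert)."""
    return 10.0 ** (-absorbance)

def fluorescence_signal(i0: float, absorbance: float, quantum_yield: float = 0.9) -> float:
    """Transducer emission, proportional to the excitation it receives."""
    return i0 * transmitted_fraction(absorbance) * quantum_yield

I0 = 100.0       # arbitrary excitation intensity units
A_free = 1.2     # assumed sensor absorbance with no analyte bound
A_bound = 0.3    # assumed absorbance after analyte binding

signal_free = fluorescence_signal(I0, A_free)
signal_bound = fluorescence_signal(I0, A_bound)
# Binding lowers the filter absorbance, so more excitation reaches the
# dye and the fluorescence read-out increases.
```

The read-out is thus an absorbance change converted into a fluorescence change, which is why condition (b) above, spectral overlap between the sensor's absorption band and the transducer's excitation band, is essential.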
Big Data in Science and Healthcare: A Review of Recent Literature and Perspectives
Miron-Shatz, T.; Lau, A. Y. S.; Paton, C.
2014-01-01
Summary Objectives As technology continues to evolve and rise in various industries, such as healthcare, science, education, and gaming, a sophisticated concept known as Big Data is surfacing. The concept of analytics aims to understand data. We set out to portray and discuss perspectives of the evolving use of Big Data in science and healthcare, and to examine some of the opportunities and challenges. Methods A literature review was conducted to highlight the implications associated with the use of Big Data in scientific research and healthcare innovations, both on a large and small scale. Results Scientists and healthcare providers may learn from one another when it comes to understanding the value of Big Data and analytics. Small data, derived by patients and consumers, also requires analytics to become actionable. Connectivism provides a framework for the use of Big Data and analytics in the areas of science and healthcare. This theory assists individuals in recognizing and synthesizing how human connections are driving the increase in data. Despite the volume and velocity of Big Data, it is truly about technology connecting humans and assisting them to construct knowledge in new ways. Concluding Thoughts The concept of Big Data and its associated analytics is to be taken seriously when approaching the use of vast volumes of both structured and unstructured data in science and healthcare. Future exploration of issues surrounding data privacy, confidentiality, and education is needed. A greater focus on data from social media, the quantified-self movement, and the application of analytics to “small data” would also be useful. PMID:25123717
Quality of Big Data in health care.
Sukumar, Sreenivas R; Natarajan, Ramachandran; Ferrell, Regina K
2015-01-01
The current trend in Big Data analytics, and in particular health information technology, is toward building sophisticated models, methods and tools for business, operational and clinical intelligence. However, the critical issue of the data quality required for these models is not getting the attention it deserves. The purpose of this paper is to highlight the issues of data quality in the context of Big Data health care analytics. The insights presented in this paper are the results of analytics work that was done in different organizations on a variety of health data sets. The data sets include Medicare and Medicaid claims, provider enrollment data sets from both public and private sources, and electronic health records from regional health centers accessed through partnerships with health care claims processing entities under health-privacy-protected guidelines. Assessment of data quality in health care has to consider: first, the entire lifecycle of health data; second, problems arising from errors and inaccuracies in the data itself; third, the source(s) and the pedigree of the data; and fourth, how the underlying purpose of data collection impacts the analytic processing and the knowledge expected to be derived. Automation in the form of data handling, storage, entry and processing technologies is to be viewed as a double-edged sword. At one level, automation can be a good solution, while at another level it can create a different set of data quality issues. Implementation of health care analytics with Big Data is enabled by a road map that addresses the organizational and technological aspects of data quality assurance. The value derived from the use of analytics should be the primary determinant of data quality. Based on this premise, health care enterprises embracing Big Data should have a road map for a systematic approach to data quality.
Health care data quality problems can be so specific that organizations might have to build their own custom software or data quality rule engines. Today, data quality issues are diagnosed and addressed in a piecemeal fashion. The authors recommend a data lifecycle approach and provide a road map that is appropriate to the dimensions of Big Data and fits the different stages in the analytical workflow.
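The custom data-quality rule engines mentioned in this abstract can be sketched as declarative per-record checks whose failures are collected for triage. The record fields, rule names, and sample claims below are all hypothetical, invented only to illustrate the pattern; a real engine would draw its rules from the data lifecycle the authors describe.

```python
# Illustrative sketch of a tiny data-quality rule engine for claims records.
# Fields, rules, and records are hypothetical.
from datetime import date

rules = [
    ("non_negative_charge", lambda r: r["claim_amount"] >= 0),
    ("valid_provider_id",   lambda r: bool(r["provider_id"])),
    ("service_not_future",  lambda r: r["service_date"] <= date.today()),
]

def audit(records):
    """Return (record index, rule name) for every failed check."""
    failures = []
    for i, rec in enumerate(records):
        for name, check in rules:
            if not check(rec):
                failures.append((i, name))
    return failures

claims = [
    {"claim_amount": 120.0, "provider_id": "P01", "service_date": date(2014, 5, 1)},
    {"claim_amount": -5.0,  "provider_id": "",    "service_date": date(2014, 6, 2)},
]
failures = audit(claims)  # only the second record trips rules
```

Keeping the rules as data rather than scattered ad hoc fixes is one way to move from the piecemeal diagnosis the authors criticize toward a systematic, lifecycle-oriented approach.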
Verification of Decision-Analytic Models for Health Economic Evaluations: An Overview.
Dasbach, Erik J; Elbasha, Elamin H
2017-07-01
Decision-analytic models for cost-effectiveness analysis are developed in a variety of software packages where the accuracy of the computer code is seldom verified. Although modeling guidelines recommend using state-of-the-art quality assurance and control methods for software engineering to verify models, the fields of pharmacoeconomics and health technology assessment (HTA) have yet to establish and adopt guidance on how to verify health and economic models. The objective of this paper is to introduce to our field the variety of methods the software engineering field uses to verify that software performs as expected. We identify how many of these methods can be incorporated in the development process of decision-analytic models in order to reduce errors and increase transparency. Given the breadth of methods used in software engineering, we recommend a more in-depth initiative to be undertaken (e.g., by an ISPOR-SMDM Task Force) to define the best practices for model verification in our field and to accelerate adoption. Establishing a general guidance for verifying models will benefit the pharmacoeconomics and HTA communities by increasing accuracy of computer programming, transparency, accessibility, sharing, understandability, and trust of models.
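One software-engineering practice this paper points to, automated verification of model code, can be illustrated on a hypothetical three-state Markov cohort model (Healthy / Sick / Dead). The transition probabilities below are invented, and the checks are generic invariants rather than the authors' recommended protocol: probabilities in range, rows summing to one, and cohort conservation across cycles.

```python
# Hedged sketch: automated invariant checks for a hypothetical 3-state
# Markov cohort model. Numbers are illustrative only.
import numpy as np

P = np.array([            # hypothetical annual transition probabilities
    [0.90, 0.08, 0.02],   # Healthy -> Healthy / Sick / Dead
    [0.10, 0.70, 0.20],   # Sick    -> Healthy / Sick / Dead
    [0.00, 0.00, 1.00],   # Dead is absorbing
])

def verify_transition_matrix(P):
    """Basic invariants every cohort model's matrix must satisfy."""
    assert (P >= 0).all() and (P <= 1).all(), "probabilities out of range"
    assert np.allclose(P.sum(axis=1), 1.0), "rows must sum to 1"

def run_cohort(P, start, cycles):
    """Propagate a cohort, checking conservation at every cycle."""
    state = np.asarray(start, dtype=float)
    for _ in range(cycles):
        state = state @ P
        assert np.isclose(state.sum(), 1.0), "cohort not conserved"
    return state

verify_transition_matrix(P)
final = run_cohort(P, [1.0, 0.0, 0.0], cycles=20)
```

Embedding such assertions in the model's own code base is a lightweight instance of the verification methods the paper recommends importing from software engineering.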
Harries, Megan; Bukovsky-Reyes, Santiago; Bruno, Thomas J
2016-01-15
This paper details the sampling methods used with the field portable porous layer open tubular cryoadsorption (PLOT-cryo) approach, described in Part I of this two-part series, applied to several analytes of interest. We conducted tests with coumarin and 2,4,6-trinitrotoluene (two solutes that were used in initial development of PLOT-cryo technology), naphthalene, aviation turbine kerosene, and diesel fuel, on a variety of matrices and test beds. We demonstrated that these analytes can be easily detected and reliably identified using the portable unit for analyte collection. By leveraging efficiency-boosting temperature control and the high flow rate multiple capillary wafer, very short collection times (as low as 3s) yielded accurate detection. For diesel fuel spiked on glass beads, we determined a method detection limit below 1 ppm. We observed greater variability among separate samples analyzed with the portable unit than previously documented in work using the laboratory-based PLOT-cryo technology. We identify three likely sources that may help explain the additional variation: the use of a compressed air source to generate suction, matrix geometry, and variability in the local vapor concentration around the sampling probe as solute depletion occurs both locally around the probe and in the test bed as a whole. This field-portable adaptation of the PLOT-cryo approach has numerous and diverse potential applications. Published by Elsevier B.V.
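The sub-ppm method detection limit quoted above is conventionally computed from replicate spiked-sample measurements. The sketch below follows the standard EPA-style MDL formula (MDL = t * s, with the one-tailed 99% Student's t for n-1 degrees of freedom); the replicate values are invented for illustration and are not the paper's data.

```python
# Sketch of an EPA-style method detection limit (MDL) calculation.
# Replicate measurements are hypothetical.
import statistics

# Seven replicate spiked-sample measurements (ppm) - invented values
replicates = [0.82, 0.79, 0.85, 0.80, 0.78, 0.84, 0.81]

n = len(replicates)
s = statistics.stdev(replicates)   # sample standard deviation
t_99 = 3.143                       # one-tailed Student's t, 99%, df = n - 1 = 6

mdl = t_99 * s
# With these numbers the MDL lands well below 1 ppm, consistent with the
# kind of figure reported above.
```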
NASA Technical Reports Server (NTRS)
Gafka, Tammy; Terrier, Doug; Smith, James
2011-01-01
This slide presentation is a review of the work of Johnson Space Center. It includes a section on technology development areas (i.e., composite structures, non-destructive evaluation, applied nanotechnology, additive manufacturing, and fracture and fatigue analytical methods), a section on structural analysis capabilities within NASA/JSC, and a section on friction stir welding and laser peening.
Assessing Collaborative Learning: Big Data, Analytics and University Futures
ERIC Educational Resources Information Center
Williams, Peter
2017-01-01
Assessment in higher education has focused on the performance of individual students. This focus has been a practical as well as an epistemic one: methods of assessment are constrained by the technology of the day, and in the past they required the completion by individuals under controlled conditions of set-piece academic exercises. Recent…
Bayes Nets in Educational Assessment: Where Do the Numbers Come from? CSE Technical Report.
ERIC Educational Resources Information Center
Mislevy, Robert J.; Almond, Russell G.; Yan, Duanli; Steinberg, Linda S.
Educational assessments that exploit advances in technology and cognitive psychology can produce observations and pose student models that outstrip familiar test-theoretic models and analytic methods. Bayesian inference networks (BINs), which include familiar models and techniques as special cases, can be used to manage belief about students'…
Reichert, Janice M; Jacob, Nitya; Amanullah, Ashraf
2009-01-01
The Second International Conference on Accelerating Biopharmaceutical Development was held in Coronado, California. The meeting was organized by the Society for Biological Engineering (SBE) and the American Institute of Chemical Engineers (AIChE); SBE is a technological community of the AIChE. Bob Adamson (Wyeth) and Chuck Goochee (Centocor) were co-chairs of the event, which had the theme "Delivering cost-effective, robust processes and methods quickly and efficiently." The first day focused on emerging disruptive technologies and cutting-edge analytical techniques. Day two featured presentations on accelerated cell culture process development, critical quality attributes, specifications and comparability, and high throughput protein formulation development. The final day was dedicated to discussion of technology options and new analysis methods provided by emerging disruptive technologies; functional interaction, integration and synergy in platform development; and rapid and economic purification process development.
Holland, Tanja; Blessing, Daniel; Hellwig, Stephan; Sack, Markus
2013-10-01
Radio frequency impedance spectroscopy (RFIS) is a robust method for the determination of cell biomass during fermentation. RFIS allows non-invasive in-line monitoring of the passive electrical properties of cells in suspension and can distinguish between living and dead cells based on their distinct behavior in an applied radio frequency field. We used continuous in situ RFIS to monitor batch-cultivated plant suspension cell cultures in stirred-tank bioreactors and compared the in-line data to conventional off-line measurements. RFIS-based analysis was more rapid and more accurate than conventional biomass determination, and was sensitive to changes in cell viability. The higher resolution of the in-line measurement revealed subtle changes in cell growth which were not accessible using conventional methods. Thus, RFIS is well suited for correlating such changes with intracellular states and product accumulation, providing unique opportunities for employing systems biotechnology and process analytical technology approaches to increase product yield and quality. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Development of the Diabetes Technology Society Blood Glucose Monitor System Surveillance Protocol
Klonoff, David C.; Lias, Courtney; Beck, Stayce; Parkes, Joan Lee; Kovatchev, Boris; Vigersky, Robert A.; Arreaza-Rubin, Guillermo; Burk, Robert D.; Kowalski, Aaron; Little, Randie; Nichols, James; Petersen, Matt; Rawlings, Kelly; Sacks, David B.; Sampson, Eric; Scott, Steve; Seley, Jane Jeffrie; Slingerland, Robbert; Vesper, Hubert W.
2015-01-01
Background: Inaccurate blood glucose monitoring systems (BGMSs) can lead to adverse health effects. The Diabetes Technology Society (DTS) Surveillance Program for cleared BGMSs is intended to protect people with diabetes from inaccurate, unreliable BGMS products that are currently on the market in the United States. The Surveillance Program will provide an independent assessment of the analytical performance of cleared BGMSs. Methods: The DTS BGMS Surveillance Program Steering Committee included experts in glucose monitoring, surveillance testing, and regulatory science. Over one year, the committee engaged in meetings and teleconferences aiming to describe how to conduct BGMS surveillance studies in a scientifically sound manner that is in compliance with good clinical practice and all relevant regulations. Results: A clinical surveillance protocol was created that contains performance targets and analytical accuracy-testing studies with marketed BGMS products conducted by qualified clinical and laboratory sites. This protocol, entitled “Protocol for the Diabetes Technology Society Blood Glucose Monitor System Surveillance Program,” is attached as supplementary material. Conclusion: This program is needed because currently, once a BGMS product has been cleared for use by the FDA, no systematic postmarket surveillance program exists that can monitor analytical performance and detect potential problems. This protocol will allow identification of inaccurate and unreliable BGMSs currently available on the US market. The DTS Surveillance Program will provide BGMS manufacturers a benchmark to understand the postmarket analytical performance of their products. Furthermore, patients, health care professionals, payers, and regulatory agencies will be able to use the results of the study to make informed decisions to, respectively, select, prescribe, finance, and regulate BGMSs on the market. PMID:26481642
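The analytical accuracy testing such a surveillance study applies can be sketched as a per-sample acceptance rule. The DTS protocol's own performance targets are in its supplementary material and may differ; the rule below instead follows the widely used ISO 15197:2013 criterion (within ±15 mg/dL of the laboratory reference below 100 mg/dL, otherwise within ±15%) as an assumed stand-in, and the paired readings are invented.

```python
# Hedged sketch of a BGMS accuracy check using an ISO 15197:2013-style
# criterion (assumed stand-in, not the DTS protocol's own targets).

def within_accuracy(meter: float, reference: float) -> bool:
    """ISO 15197:2013-style acceptance rule for one meter/reference pair."""
    if reference < 100.0:
        return abs(meter - reference) <= 15.0
    return abs(meter - reference) <= 0.15 * reference

pairs = [  # (meter reading, laboratory reference), mg/dL - hypothetical
    (92, 88), (110, 118), (250, 230), (70, 95), (180, 178), (60, 62),
]
hits = sum(within_accuracy(m, r) for m, r in pairs)
fraction_within = hits / len(pairs)
# A surveillance study would compare fraction_within against the
# protocol's acceptance threshold across a large, structured sample.
```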
NASA Astrophysics Data System (ADS)
Mølgaard, Lasse L.; Buus, Ole T.; Larsen, Jan; Babamoradi, Hamid; Thygesen, Ida L.; Laustsen, Milan; Munk, Jens Kristian; Dossi, Eleftheria; O'Keeffe, Caroline; Lässig, Lina; Tatlow, Sol; Sandström, Lars; Jakobsen, Mogens H.
2017-05-01
We present a data-driven machine learning approach to detect drug and explosives precursors using colorimetric sensor technology for air sampling. The sensing technology has been developed in the context of the CRIM-TRACK project. At present a fully integrated portable prototype for air sampling with disposable sensing chips and automated data acquisition has been developed. The prototype allows for fast, user-friendly sampling, which has made it possible to produce large datasets of colorimetric data for different target analytes in laboratory and simulated real-world application scenarios. To make use of the highly multivariate data produced from the colorimetric chip, a number of machine learning techniques are employed to provide reliable classification of target analytes against confounders found in the air streams. We demonstrate that a data-driven machine learning method using dimensionality reduction in combination with a probabilistic classifier makes it possible to produce informative features and a high detection rate of analytes. Furthermore, the probabilistic machine learning approach provides a means of automatically identifying unreliable measurements that could produce false predictions. The robustness of the colorimetric sensor has been evaluated in a series of experiments focusing on the amphetamine precursor phenylacetone as well as the improvised-explosives precursor hydrogen peroxide. The analysis demonstrates that the system is able to detect analytes in clean air and mixed with substances that occur naturally in real-world sampling scenarios. The technology under development in CRIM-TRACK has the potential to be an effective tool for controlling trafficking of illegal drugs, for explosives detection, and for other law enforcement applications.
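The pipeline this abstract describes, dimensionality reduction followed by a probabilistic classifier, can be sketched in a few lines of NumPy. The synthetic feature vectors below merely stand in for real colorimetric chip responses, and PCA plus a Gaussian class-conditional model is one plausible instantiation, not necessarily the project's exact choice.

```python
# Minimal sketch: PCA (via SVD) + Gaussian class-conditional classifier
# on synthetic "colorimetric" data. Data and dimensions are invented.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic high-dimensional responses: analyte vs. confounder class
n, d = 200, 50
analyte = rng.normal(loc=1.0, scale=1.0, size=(n, d))
confound = rng.normal(loc=-1.0, scale=1.0, size=(n, d))
X = np.vstack([analyte, confound])
y = np.array([1] * n + [0] * n)

# Dimensionality reduction: project onto the top-2 principal components
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T

# Probabilistic classifier: diagonal Gaussian per class in reduced space
def fit_gaussian(Zc):
    return Zc.mean(axis=0), Zc.var(axis=0) + 1e-9

def log_likelihood(z, mu, var):
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (z - mu) ** 2 / var, axis=-1)

mu1, var1 = fit_gaussian(Z[y == 1])
mu0, var0 = fit_gaussian(Z[y == 0])
pred = (log_likelihood(Z, mu1, var1) > log_likelihood(Z, mu0, var0)).astype(int)
accuracy = (pred == y).mean()
```

Because the classifier is probabilistic, samples whose likelihood is low under every class can also be flagged as unreliable, mirroring the false-prediction screening mentioned above.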
Hypersonic airframe structures: Technology needs and flight test requirements
NASA Technical Reports Server (NTRS)
Stone, J. E.; Koch, L. C.
1979-01-01
Hypersonic vehicles that may be produced by the year 2000 were identified. Candidate thermal/structural concepts that merit consideration for these vehicles were described. The current status of analytical methods, materials, manufacturing techniques, and conceptual developments pertaining to these concepts was reviewed. Guidelines establishing meaningful technology goals were defined, and twenty-eight specific technology needs were identified. The extent to which these technology needs can be satisfied, using existing capabilities and facilities without the benefit of a hypersonic research aircraft, was assessed. The role that a research aircraft can fill in advancing this technology was discussed, and a flight test program was outlined. Research aircraft thermal/structural design philosophy was also discussed. Programs integrating technology advancements with the projected vehicle needs were presented. Program options were provided to reflect various scheduling and cost possibilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1960-01-01
Thirty-one papers and 10 summaries of papers presented at the Third Conference on Analytical Chemistry in Nuclear Reactor Technology held at Gatlinburg, Tennessee, October 26 to 29, 1959, are given. The papers are grouped into four sections: general, analytical chemistry of fuels, analytical chemistry of plutonium and the transplutonic elements, and the analysis of fission-product mixtures. Twenty-seven of the papers are covered by separate abstracts. Four were previously abstracted for NSA. (M.C.G.)
Technology-assisted psychoanalysis.
Scharff, Jill Savege
2013-06-01
Teleanalysis, that is, remote psychoanalysis by telephone, voice over internet protocol (VoIP), or videoteleconference (VTC), has been thought of as a distortion of the frame that cannot support authentic analytic process. Yet it can augment continuity, permit optimum frequency of analytic sessions for in-depth analytic work, and enable outreach to analysands in areas far from specialized psychoanalytic centers. Theoretical arguments against teleanalysis are presented and countered, and its advantages and disadvantages discussed. Vignettes of analytic process from teleanalytic sessions are presented, and indications, contraindications, and ethical concerns are addressed. The aim is to provide material from which to judge the authenticity of analytic process supported by technology.
Historical review of missile aerodynamic developments
NASA Technical Reports Server (NTRS)
Spearman, M. Leroy
1989-01-01
A comprehensive development history to about 1970 is presented for missile technologies and their associated capabilities and difficulties. Attention is given to the growth of an experimental data base for missile design, as well as to the critical early efforts to develop analytical methods applicable to missiles. Most of the important missile development efforts made during the period from the end of the Second World War to the early 1960s were based primarily on experiences gained through wind tunnel and flight testing; analytical techniques began to demonstrate their usefulness in the design process only in the late 1960s.
PFOA and PFOS: Treatment and Analytics | Science Inventory ...
PFOA and PFOS are not regulated by the USEPA. However, in 2016, USEPA established a Lifetime Drinking Water Health Advisory limit of 70 ng/L for the combined concentration of PFOA and PFOS. This presentation will cover the available technologies that can treat PFOA and PFOS and discuss the costs of those treatments. It will also cover the implementation of EPA's Method 537, which can be used to analyze for PFOA and PFOS. The objective is to present the available treatments a community could use for PFOA or PFOS, and the analytical technique used to analyze for them.
A mass spectrometry primer for mass spectrometry imaging
Rubakhin, Stanislav S.; Sweedler, Jonathan V.
2011-01-01
Mass spectrometry imaging (MSI), a rapidly growing subfield of chemical imaging, employs mass spectrometry (MS) technologies to create single- and multi-dimensional localization maps for a variety of atoms and molecules. Complementary to other imaging approaches, MSI provides high chemical specificity and broad analyte coverage. This powerful analytical toolset is capable of measuring the distribution of many classes of inorganics, metabolites, proteins and pharmaceuticals in chemically and structurally complex biological specimens in vivo, in vitro, and in situ. The MSI approaches highlighted in this Methods in Molecular Biology volume provide flexibility of detection, characterization, and identification of multiple known and unknown analytes. The goal of this chapter is to introduce investigators who may be unfamiliar with MS to the basic principles of the mass spectrometric approaches as used in MSI. In addition to guidelines for choosing the most suitable MSI method for specific investigations, cross-references are provided to the chapters in this volume that describe the appropriate experimental protocols. PMID:20680583
Feigenbaum, A; Scholler, D; Bouquant, J; Brigot, G; Ferrier, D; Franzl, R; Lillemarktt, L; Riquet, A M; Petersen, J H; van Lierop, B; Yagoubi, N
2002-02-01
The results of a research project (EU AIR Research Programme CT94-1025) aimed at introducing control of migration into good manufacturing practice and into enforcement work are reported. Representative polymer classes were defined on the basis of chemical structure, technological function, migration behaviour and market share. These classes were characterized by analytical methods. Analytical techniques were investigated for the identification of potential migrants. High-temperature gas chromatography was shown to be a powerful method, and 1H-magnetic resonance provided a convenient fingerprint of plastic materials. Volatile compounds were characterized by headspace techniques, where it was shown to be essential to differentiate volatile compounds desorbed from those generated during the thermal desorption itself. For trace metal analysis, microwave mineralization followed by atomic absorption was employed. These different techniques were introduced into a systematic testing scheme that is envisaged as being suitable both for industrial control and for enforcement laboratories. Guidelines will be proposed in the second part of this paper.
Sensor failure detection for jet engines
NASA Technical Reports Server (NTRS)
Merrill, Walter C.
1988-01-01
The use of analytical redundancy to improve gas turbine engine control system reliability through sensor failure detection, isolation, and accommodation is surveyed. Both the theoretical and application papers that form the technology base of turbine engine analytical redundancy research are discussed. Also, several important application efforts are reviewed. An assessment of the state-of-the-art in analytical redundancy technology is given.
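The core idea surveyed here, replacing a duplicate hardware sensor with an analytic model of the engine, reduces to comparing each measurement against a model estimate and flagging large residuals. The sketch below is a generic residual-threshold detector; the model, signal values, and threshold are all invented for illustration and stand in for an actual engine model.

```python
# Sketch of residual-based sensor failure detection (analytical
# redundancy). Model, readings, and threshold are hypothetical.
import math

def model_estimate(t: float) -> float:
    """Analytic (model-based) prediction of the sensor value at time t."""
    return 500.0 + 20.0 * math.sin(0.1 * t)

def detect_failures(measurements, threshold=15.0):
    """Flag samples whose residual against the model exceeds the threshold."""
    flags = []
    for t, measured in measurements:
        residual = abs(measured - model_estimate(t))
        flags.append(residual > threshold)
    return flags

# Healthy readings, then a simulated hard-over and a dead-sensor failure
data = [(0, 501.0), (1, 503.5), (2, 502.0), (3, 580.0), (4, 0.0)]
flags = detect_failures(data)  # last two samples are flagged
```

In a full scheme, a flagged sensor would then be isolated and its value accommodated, e.g. by substituting the model estimate, which is the detection-isolation-accommodation sequence the survey describes.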
Esmonde-White, Karen A; Cuellar, Maryann; Uerpmann, Carsten; Lenain, Bruno; Lewis, Ian R
2017-01-01
Adoption of Quality by Design (QbD) principles, regulatory support of QbD, process analytical technology (PAT), and continuous manufacturing are major factors affecting new approaches to pharmaceutical manufacturing and bioprocessing. In this review, we highlight new technology developments, data analysis models, and applications of Raman spectroscopy that have expanded the scope of Raman spectroscopy as a process analytical technology. Emerging technologies such as transmission and enhanced reflection Raman, and new approaches to using available technologies, expand the scope of Raman spectroscopy in pharmaceutical manufacturing, and Raman spectroscopy is now successfully integrated into real-time release testing, continuous manufacturing, and statistical process control. Since the last major review of Raman as a pharmaceutical PAT in 2010, many new Raman applications in bioprocessing have emerged. Exciting reports of in situ Raman spectroscopy in bioprocesses complement a growing scientific field of biological and biomedical Raman spectroscopy. Raman spectroscopy has made a positive impact as a process analytical and control tool for pharmaceutical manufacturing and bioprocessing, with demonstrated scientific and financial benefits throughout a product's lifecycle.
Chemical Technology Division, Annual technical report, 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-03-01
Highlights of the Chemical Technology (CMT) Division's activities during 1991 are presented. In this period, CMT conducted research and development in the following areas: (1) electrochemical technology, including advanced batteries and fuel cells; (2) technology for fluidized-bed combustion and coal-fired magnetohydrodynamics; (3) methods for treatment of hazardous and mixed hazardous/radioactive waste; (4) the reaction of nuclear waste glass and spent fuel under conditions expected for an unsaturated repository; (5) processes for separating and recovering transuranic elements from nuclear waste streams; (6) recovery processes for discharged fuel and the uranium blanket in the Integral Fast Reactor (IFR); (7) processes for removal of actinides in spent fuel from commercial water-cooled nuclear reactors and burnup in IFRs; and (8) physical chemistry of selected materials in environments simulating those of fission and fusion energy systems. The Division also conducts basic research in catalytic chemistry associated with molecular energy resources; chemistry of superconducting oxides and other materials of interest with technological application; interfacial processes of importance to corrosion science, catalysis, and high-temperature superconductivity; and the geochemical processes involved in water-rock interactions occurring in active hydrothermal systems. In addition, the Analytical Chemistry Laboratory in CMT provides a broad range of analytical chemistry support services to the technical programs at Argonne National Laboratory (ANL).
Metabolomic Technologies for Improving the Quality of Food: Practice and Promise.
Johanningsmeier, Suzanne D; Harris, G Keith; Klevorn, Claire M
2016-01-01
It is now well documented that the diet has a significant impact on human health and well-being. However, the complete set of small molecule metabolites present in foods that make up the human diet and the role of food production systems in altering this food metabolome are still largely unknown. Metabolomic platforms that rely on nuclear magnetic resonance (NMR) and mass spectrometry (MS) analytical technologies are being employed to study the impact of agricultural practices, processing, and storage on the global chemical composition of food; to identify novel bioactive compounds; and for authentication and region-of-origin classifications. This review provides an overview of the current terminology, analytical methods, and compounds associated with metabolomic studies, and provides insight into the application of metabolomics to generate new knowledge that enables us to produce, preserve, and distribute high-quality foods for health promotion.
Paterson, Helen; Carpenter, Christine
2015-01-01
This study aimed to explore how adults with severe acquired communication difficulties experience and make decisions about the communication methods they use. The primary objectives were to explore their perceptions of different communication methods, how they choose communication methods to use in different situations and with different communication partners, and what facilitates their decision-making. A qualitative phenomenological approach was used. Data collection methods were face-to-face video-recorded interviews using each participant's choice of communication method and e-mail interviews. The methodological challenges of involving participants with severe acquired communication disorders in research were addressed in the study design. Seven participants, all men, were recruited from a long-term care setting in a rehabilitation hospital. The data analysis process was guided by Colaizzi's (1978) analytic framework. Four main themes were identified: communicating in the digital age – e-mail and social media, encountering frustrations in using communication technologies, role and identity changes and the influences of communication technology and seeking a functional interaction using communication technologies. Adults with acquired communication difficulties find digital communication, such as e-mail and social media, and mainstream technologies, such as iPads, beneficial in communicating with others. Current communication technologies present a number of challenges for adults with disabilities and are limited in their communicative functions to support desired interactions. The implications for AAC technology development and speech and language therapy service delivery are addressed.
Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.
Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís
2016-01-08
In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Tulebekova, S.; Saliyev, D.; Zhang, D.; Kim, J. R.; Karabay, A.; Turlybek, A.; Kazybayeva, L.
2017-11-01
Compressed air energy storage technology is one of the promising methods that offer high reliability, economic feasibility, and low environmental impact. Current applications of the technology are mainly limited to energy storage for power plants using large-scale underground caverns. This paper explores the possibility of using reinforced concrete pile foundations to store renewable energy generated from solar panels or windmills attached to building structures. The energy will be stored via compressed air inside pile foundations with hollow sections. Given the relatively small storage volume provided by the foundation, the required storage pressure is expected to be higher than that in a large-scale underground cavern. The high air pressure, typically accompanied by a large temperature increase, combined with structural loads, will put the pile foundation in a complicated loading condition, which might compromise structural and geotechnical safety. This paper presents a preliminary analytical study on the performance of the pile foundation subjected to high pressure, a large temperature increase, and structural loads. Finite element analyses on pile foundation models, built from selected prototype structures, have been conducted. The analytical study identifies maximum stresses in the concrete of the pile foundation under combined pressure, temperature change, and structural loads. Recommendations are made for the use of reinforced concrete pile foundations for renewable energy storage.
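As a rough illustration of the combined loading the abstract describes, a thin-wall hoop-stress estimate for internal pressure can be set beside a fully restrained thermal-stress term. This is a textbook back-of-envelope sketch, not the study's finite element model; all numerical values below are hypothetical.

```python
# Back-of-envelope check of the stress components acting on a hollow
# pile used for compressed-air storage. Values are illustrative only.

def hoop_stress(p, r, t):
    """Thin-wall hoop stress (Pa) for internal pressure p (Pa),
    inner radius r (m), and wall thickness t (m)."""
    return p * r / t

def restrained_thermal_stress(E, alpha, dT):
    """Stress (Pa) in a fully restrained member heated by dT (K),
    given elastic modulus E (Pa) and expansion coefficient alpha (1/K)."""
    return E * alpha * dT

# Hypothetical pile: 10 MPa storage pressure, 0.3 m inner radius,
# 0.1 m wall; concrete with E = 30 GPa, alpha = 1e-5 /K, +40 K rise.
sigma_p = hoop_stress(10e6, 0.3, 0.1)                 # ~30 MPa
sigma_T = restrained_thermal_stress(30e9, 1e-5, 40)   # ~12 MPa

print(sigma_p, sigma_T)
```

Even this crude sketch shows why the paper treats pressure and temperature together: the two contributions are of comparable magnitude.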
NASA Astrophysics Data System (ADS)
Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.
2016-09-01
Operational analytics when combined with Big Data technologies and predictive techniques have been shown to be valuable in detecting mission critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by being able to alert and predict when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and provide insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor is simulated and we use big data technologies, predictive algorithms and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site. This study builds on a big data architecture that has previously been proven valuable in detecting anomalies. This paper outlines our methodology of implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determine practical analytic, visualization, and predictive technologies.
Krapfenbauer, Kurt
2017-12-01
Diabetes mellitus develops and progresses as a consequence of complex and gradual processes in which a variety of alterations of the endocrine pancreas are involved, mainly resulting in beta cell failure. These molecular alterations can be found in the bloodstream, which suggests that specific biomarkers could be quantified in plasma or serum by very sensitive methods before the onset of diabetes mellitus is diagnosed. However, classical methods of protein analysis such as electrophoresis, Western blot, ELISA, and liquid chromatography are generally time-consuming, labor-intensive, and not sensitive enough to detect such alterations in the pre-symptomatic state of the disease. A novel, very sensitive analytical detection conjugate system was developed by combining polyfluorophor technology with a protein microchip method. This innovative system enables very sensitive microchip assays that measure selected biomarkers in a small sample volume (10 μL) with much higher sensitivity (92%) compared to common immunoassay systems. Further advantages of this technology combine the power of miniaturization with faster quantification (around 10 min). This technology offers great promise for point-of-care clinical testing and monitoring of specific biomarkers for diabetes at the femtogram level in serum or plasma. In conclusion, the results indicate that the technical performance of this new technology is valid and that the assay is able to quantify PPY-specific antigens in plasma at femtogram levels, which can be used to identify beta cell dysfunction at the pre-symptomatic stage of diabetes mellitus.
A study on building data warehouse of hospital information system.
Li, Ping; Wu, Tao; Chen, Mu; Zhou, Bin; Xu, Wei-guo
2011-08-01
Existing hospital information systems with simple statistical functions cannot meet current management needs. Hospital resources are distributed with private property rights among hospitals, as in the case of the regional coordination of medical services. In this study, to integrate and make full use of medical data effectively, we propose a data warehouse modeling method for the hospital information system. The method can also be employed for a distributed-hospital medical service system. To ensure that hospital information supports the diverse needs of health care, the framework of the hospital information system has three layers: a data-center layer, a system-function layer, and a user-interface layer. This paper discusses the role of a data warehouse management system in handling hospital information, from the establishment of the data theme, to the design of a data model, to the establishment of a data warehouse. Online analytical processing tools support user-friendly multidimensional analysis from a number of different angles to extract the required data and information. Use of the data warehouse improves online analytical processing and mitigates deficiencies in the decision support system. The hospital information system based on a data warehouse effectively employs statistical analysis and data mining technology to handle massive quantities of historical data and summarizes clinical and hospital information for decision making. This paper proposes the use of a data warehouse for a hospital information system, specifically a data warehouse organized around hospital information themes, covering dimension determination, modeling, and so on. The processing of patient information is given as an example that demonstrates the usefulness of this method for hospital information management.
Data warehouse technology is still evolving, and further research is required on decision-support information extracted through data mining and decision-making technology.
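The multidimensional analysis that the abstract attributes to online analytical processing can be sketched as a roll-up over a fact table: events are aggregated along chosen dimensions. The field names and figures below are invented for illustration and are not from the study.

```python
# Minimal sketch of an OLAP-style roll-up over a hypothetical
# hospital fact table: sum a measure over chosen dimensions.
from collections import defaultdict

fact_visits = [
    {"dept": "cardiology", "year": 2010, "cost": 120.0},
    {"dept": "cardiology", "year": 2011, "cost": 150.0},
    {"dept": "urology",    "year": 2010, "cost": 90.0},
    {"dept": "urology",    "year": 2011, "cost": 110.0},
]

def rollup(facts, dims, measure):
    """Aggregate `measure` over every observed combination of `dims`."""
    cube = defaultdict(float)
    for row in facts:
        key = tuple(row[d] for d in dims)
        cube[key] += row[measure]
    return dict(cube)

by_dept = rollup(fact_visits, ["dept"], "cost")
print(by_dept)  # {('cardiology',): 270.0, ('urology',): 200.0}
```

Calling `rollup` with `["dept", "year"]` instead would drill down one level, which is the "analysis from a number of different angles" the abstract refers to.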
Opportunity and Challenges for Migrating Big Data Analytics in Cloud
NASA Astrophysics Data System (ADS)
Amitkumar Manekar, S.; Pradeepini, G., Dr.
2017-08-01
Big Data analytics is a major topic today. As data generation grows in scale and demand, data acquisition and storage have become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. The trend toward "big data-as-a-service" is now discussed everywhere. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost; on the other, researchers are still working to solve security and other real-time problems of big data migration to cloud-based platforms. This article is focused on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration and makes big data analytics possible on cloud platforms is in demand for a new era of growth. This article also provides information about available technologies and techniques for migrating big data to the cloud.
Templeton, David W.; Sluiter, Justin B.; Sluiter, Amie; ...
2016-10-18
In an effort to find economical, carbon-neutral transportation fuels, biomass feedstock compositional analysis methods are used to monitor, compare, and improve biofuel conversion processes. These methods are empirical, and the analytical variability seen in the feedstock compositional data propagates into variability in the conversion yields, component balances, mass balances, and ultimately the minimum ethanol selling price (MESP). We report the average composition and standard deviations of 119 individually extracted National Institute of Standards and Technology (NIST) bagasse [Reference Material (RM) 8491] run by seven analysts over 7 years. Two additional datasets, using bulk-extracted bagasse (containing 58 and 291 replicates each), were examined to separate out the effects of batch, analyst, sugar recovery standard calculation method, and extractions from the total analytical variability seen in the individually extracted dataset. We believe this is the world's largest NIST bagasse compositional analysis dataset and it provides unique insight into the long-term analytical variability. Understanding the long-term variability of the feedstock analysis will help determine the minimum difference that can be detected in yield, mass balance, and efficiency calculations. The long-term data show consistent bagasse component values through time and by different analysts. This suggests that the standard compositional analysis methods were performed consistently and that the bagasse RM itself remained unchanged during this time period. The long-term variability seen here is generally higher than short-term variabilities. It is worth noting that the effect of short-term or long-term feedstock compositional variability on MESP is small, about $0.03 per gallon. The long-term analysis variabilities reported here are plausible minimum values for these methods, though not necessarily average or expected variabilities.
We must emphasize the importance of training and good analytical procedures needed to generate this data. As a result, when combined with a robust QA/QC oversight protocol, these empirical methods can be relied upon to generate high-quality data over a long period of time.« less
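The long-term summary statistics this abstract describes (mean composition with a standard deviation across replicate analyses) reduce to a short calculation. The replicate values below are invented for illustration; they are not the NIST RM 8491 data.

```python
# Illustrative long-term variability summary for one hypothetical
# component (e.g., glucan, % dry weight) across replicate analyses.
import statistics

glucan_pct = [41.2, 40.8, 41.5, 40.9, 41.1, 41.3, 40.7]  # invented replicates

mean = statistics.mean(glucan_pct)
stdev = statistics.stdev(glucan_pct)   # sample standard deviation
rsd = 100 * stdev / mean               # relative standard deviation, %

print(round(mean, 2), round(stdev, 2), round(rsd, 1))
```

The relative standard deviation is the figure of merit here: it sets the minimum detectable difference in downstream yield and mass-balance calculations that the abstract mentions.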
Ślączka-Wilk, Magdalena M; Włodarczyk, Elżbieta; Kaleniecka, Aleksandra; Zarzycki, Paweł K
2017-07-01
There is increasing interest in the development of simple analytical systems enabling the fast screening of target components in complex samples. A number of newly invented protocols are based on quasi separation techniques involving microfluidic paper-based analytical devices and/or micro total analysis systems. Under such conditions, the quantification of target components can be performed mainly due to selective detection. The main goal of this paper is to demonstrate that miniaturized planar chromatography has the capability to work as an efficient separation and quantification tool for the analysis of multiple targets within complex environmental samples isolated and concentrated using an optimized SPE method. In particular, we analyzed various samples collected from surface water ecosystems (lakes, rivers, and the Baltic Sea of Middle Pomerania in the northern part of Poland) in different seasons, as well as samples collected during key wastewater technological processes (originating from the "Jamno" wastewater treatment plant in Koszalin, Poland). We documented that the multiple detection of chromatographic spots on RP-18W microplates-under visible light, fluorescence, and fluorescence quenching conditions, and using the visualization reagent phosphomolybdic acid-enables fast and robust sample classification. The presented data reveal that the proposed micro-TLC system is useful, inexpensive, and can be considered as a complementary method for the fast control of treated sewage water discharged by a municipal wastewater treatment plant, particularly for the detection of low-molecular mass micropollutants with polarity ranging from estetrol to progesterone, as well as chlorophyll-related dyes. Due to the low consumption of mobile phases composed of water-alcohol binary mixtures (less than 1 mL/run for the simultaneous separation of up to nine samples), this method can be considered an environmentally friendly and green chemistry analytical tool. 
The described analytical protocol can be complementary to those involving classical column chromatography (HPLC) or various planar microfluidic devices.
Application of Interface Technology in Progressive Failure Analysis of Composite Panels
NASA Technical Reports Server (NTRS)
Sleight, D. W.; Lotts, C. G.
2002-01-01
A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The composite panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis capability that includes both geometric and material nonlinearities. Progressive failure analyses are performed on conventional models and interface technology models of the composite panels. Analytical results and the computational effort of the analyses are compared for the conventional models and interface technology models. The analytical results predicted with the interface technology models are in good correlation with the analytical results using the conventional models, while significantly reducing the computational effort.
Relationship between Social Media and Academic Performance in Distance Education
ERIC Educational Resources Information Center
Gupta, C. A. Pallavi; Singh, Bharti; Marwaha, Tushar
2013-01-01
The scope and method of imparting distance education to the learner has evolved over a period of time. Various models of distance education have been introduced over the years; the latest introduction is the use of Web 2.0 technologies to make distance learning more analytical, flexible, interactive, and collaborative for both the teacher and the…
Teaching Data Analysis with Interactive Visual Narratives
ERIC Educational Resources Information Center
Saundage, Dilal; Cybulski, Jacob L.; Keller, Susan; Dharmasena, Lasitha
2016-01-01
Data analysis is a major part of business analytics (BA), which refers to the skills, methods, and technologies that enable managers to make swift, quality decisions based on large amounts of data. BA has become a major component of Information Systems (IS) courses all over the world. The challenge for IS educators is to teach data analysis--the…
Logic of Sherlock Holmes in Technology Enhanced Learning
ERIC Educational Resources Information Center
Patokorpi, Erkki
2007-01-01
Abduction is a method of reasoning that people use under uncertainty in a context in order to come up with new ideas. The use of abduction in this exploratory study is twofold: (i) abduction is a cross-disciplinary analytic tool that can be used to explain certain key aspects of human-computer interaction in advanced Information Society Technology…
Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)
2013-01-01
Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.
An NCI-FDA Interagency Oncology Task Force (IOTF) Molecular Diagnostics Workshop was held on October 30, 2008, in Cambridge, MA, to discuss requirements for analytical validation of protein-based multiplex technologies in the context of their intended use. This workshop, developed through NCI's Clinical Proteomic Technologies for Cancer initiative and the FDA, focused on technology-specific analytical validation processes to be addressed prior to use in clinical settings. To make the workshop unique, a case-study approach was used to discuss issues related to
Watson, Douglas S; Kerchner, Kristi R; Gant, Sean S; Pedersen, Joseph W; Hamburger, James B; Ortigosa, Allison D; Potgieter, Thomas I
2016-01-01
Tangential flow microfiltration (MF) is a cost-effective and robust bioprocess separation technique, but successful full scale implementation is hindered by the empirical, trial-and-error nature of scale-up. We present an integrated approach leveraging at-line process analytical technology (PAT) and mass balance based modeling to de-risk MF scale-up. Chromatography-based PAT was employed to improve the consistency of an MF step that had been a bottleneck in the process used to manufacture a therapeutic protein. A 10-min reverse phase ultra high performance liquid chromatography (RP-UPLC) assay was developed to provide at-line monitoring of protein concentration. The method was successfully validated and method performance was comparable to previously validated methods. The PAT tool revealed areas of divergence from a mass balance-based model, highlighting specific opportunities for process improvement. Adjustment of appropriate process controls led to improved operability and significantly increased yield, providing a successful example of PAT deployment in the downstream purification of a therapeutic protein. The general approach presented here should be broadly applicable to reduce risk during scale-up of filtration processes and should be suitable for feed-forward and feed-back process control. © 2015 American Institute of Chemical Engineers.
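The mass-balance modeling this abstract pairs with at-line PAT can be sketched with a standard filtration relation: for a batch concentration step with a constant observed sieving coefficient S, retentate concentration scales with the volume concentration factor. This is a textbook relation offered as an illustration, not the authors' specific model; all numbers are hypothetical.

```python
# Minimal mass-balance sketch for a tangential-flow filtration
# concentration step with constant observed sieving coefficient S.

def retentate_conc(c0, vcf, sieving):
    """Retentate concentration after concentrating by volume
    concentration factor `vcf` (V0/V), starting from c0, assuming
    a constant sieving coefficient `sieving` (0 = fully retained,
    1 = freely passing)."""
    return c0 * vcf ** (1.0 - sieving)

# Fully retained protein (S = 0) concentrated 5x: concentration rises 5x.
print(retentate_conc(2.0, 5.0, 0.0))  # 10.0
# Freely passing solute (S = 1) is unchanged by concentration.
print(retentate_conc(2.0, 5.0, 1.0))  # 2.0
```

Comparing at-line concentration measurements against such a model is what lets divergences flag process problems, as the abstract describes.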
Interpretation and classification of microvolt T wave alternans tests
NASA Technical Reports Server (NTRS)
Bloomfield, Daniel M.; Hohnloser, Stefan H.; Cohen, Richard J.
2002-01-01
Measurement of microvolt-level T wave alternans (TWA) during routine exercise stress testing now is possible as a result of sophisticated noise reduction techniques and analytic methods that have become commercially available. Even though this technology is new, the available data suggest that microvolt TWA is a potent predictor of arrhythmia risk in diverse disease states. As this technology becomes more widely available, physicians will be called upon to interpret microvolt TWA tracings. This review seeks to establish uniform standards for the clinical interpretation of microvolt TWA tracings.
Next-Generation Technologies for Multiomics Approaches Including Interactome Sequencing
Ohashi, Hiroyuki; Miyamoto-Sato, Etsuko
2015-01-01
The development of high-speed analytical techniques such as next-generation sequencing and microarrays allows high-throughput analysis of biological information at a low cost. These techniques contribute to medical and bioscience advancements and provide new avenues for scientific research. Here, we outline a variety of new innovative techniques and discuss their use in omics research (e.g., genomics, transcriptomics, metabolomics, proteomics, and interactomics). We also discuss the possible applications of these methods, including an interactome sequencing technology that we developed, in future medical and life science research. PMID:25649523
2012-10-01
...education of a new generation of data fusion analysts. Jacob L. Graham (jgraham@ist.psu.edu) and David L. Hall, College of Information Sciences & Technology, Pennsylvania State University, University Park, PA, U.S.A.
A novel compact model for on-chip stacked transformers in RF-CMOS technology
NASA Astrophysics Data System (ADS)
Jun, Liu; Jincai, Wen; Qian, Zhao; Lingling, Sun
2013-08-01
A novel compact model for on-chip stacked transformers is presented. The proposed model topology clearly distinguishes the eddy-current, resistive, and capacitive losses of the primary and secondary coils in the substrate. A method to analytically determine the non-ideal parasitics between the primary coil and the substrate is provided. The model is further verified by the excellent match between measured and simulated S-parameters, using parameters extracted for a 1:1 stacked transformer manufactured in a commercial RF-CMOS technology.
Zhou, Weiqiang; Sherwood, Ben; Ji, Hongkai
2017-01-01
Technological advances have led to an explosive growth of high-throughput functional genomic data. Exploiting the correlation among different data types, it is possible to predict one functional genomic data type from other data types. Prediction tools are valuable in understanding the relationship among different functional genomic signals. They also provide a cost-efficient solution to inferring the unknown functional genomic profiles when experimental data are unavailable due to resource or technological constraints. The predicted data may be used for generating hypotheses, prioritizing targets, interpreting disease variants, facilitating data integration, quality control, and many other purposes. This article reviews various applications of prediction methods in functional genomics, discusses analytical challenges, and highlights some common and effective strategies used to develop prediction methods for functional genomic data. PMID:28076869
The siteLAB Analytical Test Kit UVF-3100A (UVF-3100A), developed by siteLAB Corporation (siteLAB), was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in ...
Langemann, Timo; Mayr, Ulrike Beate; Meitz, Andrea; Lubitz, Werner; Herwig, Christoph
2016-01-01
Flow cytometry (FCM) is a tool for the analysis of single-cell properties in a cell suspension. In this contribution, we present an improved FCM method for the assessment of E-lysis in Enterobacteriaceae. The result of the E-lysis process is empty bacterial envelopes, called bacterial ghosts (BGs), that constitute potential products in the pharmaceutical field. BGs have reduced light scattering properties when compared with intact cells. In combination with viability information obtained from staining samples with the membrane potential-sensitive fluorescent dye bis-(1,3-dibutylbarbituric acid) trimethine oxonol (DiBAC4(3)), the presented method allows differentiation between populations of viable cells, dead cells, and BGs. Using a second fluorescent dye, RH414, as a membrane marker, non-cellular background was excluded from the data, which greatly improved the quality of the results. Using true volumetric absolute counting, the FCM data correlated well with cell count data obtained from colony-forming units (CFU) for viable populations. Applicability of the method to several Enterobacteriaceae (different Escherichia coli strains, Salmonella typhimurium, Shigella flexneri 2a) could be shown. The method was validated as a resilient process analytical technology (PAT) tool for the assessment of E-lysis and for particle counting during 20-l batch processes for the production of Escherichia coli Nissle 1917 BGs.
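The gating logic the abstract describes (reduced scatter marks ghosts, DiBAC4(3) marks dead cells, and volumetric counting converts gated events to concentrations) can be sketched as a simple three-way classifier. The cutoffs, event values, and analyzed volume below are invented for illustration and do not reflect the published assay settings.

```python
# Toy sketch of three-way FCM gating (viable / dead / ghost) followed
# by volumetric absolute counting. All thresholds and data are invented.

def classify(scatter, dibac, scatter_cutoff=30.0, dibac_cutoff=100.0):
    """Bacterial ghosts show reduced light scatter; DiBAC4(3) stains
    depolarized (dead) cells. Cutoffs are hypothetical placeholders."""
    if scatter < scatter_cutoff:
        return "ghost"
    return "dead" if dibac >= dibac_cutoff else "viable"

# (scatter, DiBAC) pairs for four hypothetical events.
events = [(80.0, 20.0), (10.0, 15.0), (90.0, 150.0), (85.0, 30.0)]
labels = [classify(s, d) for s, d in events]

analyzed_volume_ml = 1e-4  # volume measured by the instrument
viable_per_ml = labels.count("viable") / analyzed_volume_ml
print(labels, viable_per_ml)
```

The last two lines mirror true volumetric absolute counting: a concentration follows directly from gated event counts and the analyzed volume, with no counting beads required.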
Hahn, Barbara
The essays in this forum brace this meditation on the historiography of technology. Understanding devices incorporates the context of any particular hardware, as John Staudenmaier showed by quantifying the contents of the first decades of Technology and Culture. As contextualist approaches have widened from systems theory through social construction and into the assemblages of actor-network theory, the discipline has kept artifacts at the analytical center: it is the history of technology that scholars seek to understand. Even recognizing that the machine only embodies the technology, the discipline has long sought to explain the machine. These essays invite consideration of how the history of technology might apply to non-corporeal things-methods as well as machines, and all the worldly phenomena that function in technological ways even without physicality. Materiality is financial as well as corporeal, the history of capitalism reminds us, and this essay urges scholars to apply history-of-technology approaches more broadly.
[Automation and organization of technological process of urinalysis].
Kolenkin, S M; Kishkun, A A; Kol'chenko, O L
2000-12-01
Results of introduction into practice of a working model of industrial technology of laboratory studies and KONE Specific Supra and Miditron M devices are shown as exemplified by clinical analysis of the urine. This technology helps standardize all stages and operations, improves the efficiency of quality control of laboratory studies, rationally organizes the work at all stages of the process, creates a system for permanent improvement of the efficiency of investigations at the preanalytical, analytical, and postanalytical stages of technological process of laboratory studies. As a result of introduction of this technology into laboratory practice, violations of quality criteria of clinical urinalysis decreased from 15 to 8% at the preanalytical stage and from 6 to 3% at the analytical stage. Automation of the analysis decreased the need in reagents 3-fold and improved the productivity at the analytical stage 4-fold.
Surface Plasmon Resonance: New Biointerface Designs and High-Throughput Affinity Screening
NASA Astrophysics Data System (ADS)
Linman, Matthew J.; Cheng, Quan Jason
Surface plasmon resonance (SPR) is a surface optical technique that measures minute changes in refractive index at a metal-coated surface. It has become increasingly popular in the study of biological and chemical analytes because of its label-free measurement feature. In addition, SPR allows for both quantitative and qualitative assessment of binding interactions in real time, making it ideally suited for probing weak interactions that are often difficult to study with other methods. This chapter surveys biosensor development over roughly the last three years utilizing SPR as the principal analytical technique, along with a concise background of the technique itself. While SPR has demonstrated many advantages, it is a nonselective method, so building reproducible and functional interfaces is vital to sensing applications. This chapter, therefore, focuses mainly on unique surface chemistries and assay approaches to examine biological interactions with SPR. In addition, SPR imaging for high-throughput screening based on microarrays and novel hyphenated techniques involving the coupling of SPR to other analytical methods is discussed. The chapter concludes with a commentary on the current state of SPR biosensing technology and the general direction of future biosensor research.
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - L2000 PCB/CHLORIDE ANALYZER - DEXSIL CORPORATION
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - ENVIROGARD PCB TEST KIT - STRATEGIC DIAGNOSTICS INC
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...
Thomas, Jeanice B; Sharpless, Katherine E; Yen, James H; Rimmer, Catherine A
2011-01-01
The concentrations of selected fat-soluble vitamins and carotenoids in Standard Reference Material (SRM) 3280 Multivitamin/Multielement Tablets have been determined by two independent LC methods, with measurements performed by the National Institute of Standards and Technology (NIST). This SRM has been prepared as part of a collaborative effort between NIST and the National Institutes of Health Office of Dietary Supplements. The SRM is also intended to support the Dietary Supplement Ingredient Database that is being established by the U.S. Department of Agriculture. The methods used at NIST to determine the concentration levels of vitamins A and E, and beta-carotene in the SRM used RPLC with absorbance detection. The relative precision of these methods ranged from 2 to 8% for the analytes measured. SRM 3280 is primarily intended for use in validating analytical methods for the determination of selected vitamins, carotenoids, and elements in multivitamin/multielement tablets and similar matrixes.
We Can Watch It For You Wholesale
NASA Astrophysics Data System (ADS)
Lipton, Alan J.
This chapter provides an introduction to video analytics—a branch of computer vision technology that deals with automatic detection of activities and events in surveillance video feeds. Initial applications focused on the security and surveillance space, but as the technology improves it is rapidly finding a home in many other application areas. This chapter looks at some of those spaces, the requirements they impose on video analytics systems, and provides an example architecture and set of technology components to meet those requirements. This exemplary system is put through its paces to see how it stacks up in an embedded environment. Finally, we explore the future of video analytics and examine some of the market requirements that are driving breakthroughs in both video analytics and processor platform technology alike.
Reagen, William K; Lindstrom, Kent R; Thompson, Kathy L; Flaherty, John M
2004-09-01
The widespread use of semi- and nonvolatile organofluorochemicals in industrial facilities, concern about their persistence, and relatively recent advancements in liquid chromatography/mass spectrometry (LC/MS) technology have led to the development of new analytical methods to assess potential worker exposure to airborne organofluorochemicals. Techniques were evaluated for the determination of 19 organofluorochemicals and for total fluorine in ambient air samples. Due to the potential biphasic nature of most of these fluorochemicals when airborne, Occupational Safety and Health Administration (OSHA) versatile sampler (OVS) tubes were used to simultaneously trap fluorochemical particulates and vapors from workplace air. Analytical methods were developed for OVS air samples to quantitatively analyze for total fluorine using oxygen bomb combustion/ion selective electrode and for 17 organofluorochemicals using LC/MS and gas chromatography/mass spectrometry (GC/MS). The validation was based on the National Institute for Occupational Safety and Health (NIOSH) Guidelines for Air Sampling and Analytical Method Development and Evaluation, with some revisions of the experimental design. The study design incorporated experiments to determine analytical recovery and stability, sampler capacity, the effect of some environmental parameters on recoveries, storage stability, limits of detection, precision, and accuracy. Fluorochemical mixtures were spiked onto each OVS tube over a range of 0.06-6 microg for each of 12 compounds analyzed by LC/MS and 0.3-30 microg for 5 compounds analyzed by GC/MS. These ranges allowed reliable quantitation at 0.001-0.1 mg/m3 in general for LC/MS analytes and 0.005-0.5 mg/m3 for GC/MS analytes when 60 L of air are sampled. The organofluorochemical exposure guideline (EG) is currently 0.1 mg/m3 for many analytes, with one exception being ammonium perfluorooctanoate (EG is 0.01 mg/m3).
Total fluorine results may be used to determine if the individual compounds quantified provide a suitable mass balance of total airborne organofluorochemicals based on known fluorine content. Improvements in precision and/or recovery as well as some additional testing would be needed to meet all NIOSH validation criteria. This study provided valuable information about the accuracy of this method for organofluorochemical exposure assessment.
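The relationship between the spiked masses and the reportable air concentrations above is simple unit arithmetic; a quick sketch, using the values stated in the abstract:

```python
def air_concentration_mg_m3(mass_ug: float, volume_l: float) -> float:
    """Convert a mass trapped on the sampler (micrograms) to an airborne
    concentration (mg/m^3) for a given sampled air volume (liters)."""
    return (mass_ug / 1000.0) / (volume_l / 1000.0)

# The 0.06-6 microgram LC/MS spiking range over a 60 L air sample maps to
# roughly 0.001-0.1 mg/m^3, matching the quantitation range stated above.
low = air_concentration_mg_m3(0.06, 60)
high = air_concentration_mg_m3(6.0, 60)
print(low, high)
```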
Numerical modelling and experimental analysis of acoustic emission
NASA Astrophysics Data System (ADS)
Gerasimov, S. I.; Sych, T. V.
2018-05-01
In the present paper, the authors report on the application of non-destructive acoustic wave technologies to determine the structural integrity of engineering components. In particular, the finite element (FE) system COSMOS/M is used to investigate propagation characteristics of ultrasonic waves in linear, plane, and three-dimensional structures with and without geometric concentrators. In addition, the FE results obtained are compared to analytical and experimental ones. The study illustrates the efficient use of the FE method to model guided wave propagation problems and demonstrates the FE method's potential to solve problems when an analytical solution is not possible due to "complicated" geometry.
NASA Technical Reports Server (NTRS)
Housner, J. M.; Anderson, M.; Belvin, W.; Horner, G.
1985-01-01
Dynamic analysis of large space antenna systems must treat the deployment as well as vibration and control of the deployed antenna. Candidate computer programs for deployment dynamics, and issues and needs for future program developments are reviewed. Some results for mast and hoop deployment are also presented. Modeling of complex antenna geometry with conventional finite element methods and with repetitive exact elements is considered. Analytical comparisons with experimental results for a 15 meter hoop/column antenna revealed the importance of accurate structural properties including nonlinear joints. Slackening of cables in this antenna is also a consideration. The technology of designing actively damped structures through analytical optimization is discussed and results are presented.
Frequency Response Function Based Damage Identification for Aerospace Structures
NASA Astrophysics Data System (ADS)
Oliver, Joseph Acton
Structural health monitoring technologies continue to be pursued for aerospace structures in the interests of increased safety and, when combined with health prognosis, efficiency in life-cycle management. The current dissertation develops and validates damage identification technology as a critical component for structural health monitoring of aerospace structures and, in particular, composite unmanned aerial vehicles. The primary innovation is a statistical least-squares damage identification algorithm based in concepts of parameter estimation and model update. The algorithm uses frequency response function based residual force vectors derived from distributed vibration measurements to update a structural finite element model through statistically weighted least-squares minimization producing location and quantification of the damage, estimation uncertainty, and an updated model. Advantages compared to other approaches include robust applicability to systems which are heavily damped, large, and noisy, with a relatively low number of distributed measurement points compared to the number of analytical degrees-of-freedom of an associated analytical structural model (e.g., modal finite element model). Motivation, research objectives, and a dissertation summary are discussed in Chapter 1 followed by a literature review in Chapter 2. Chapter 3 gives background theory and the damage identification algorithm derivation followed by a study of fundamental algorithm behavior on a two degree-of-freedom mass-spring system with generalized damping. Chapter 4 investigates the impact of noise then successfully proves the algorithm against competing methods using an analytical eight degree-of-freedom mass-spring system with non-proportional structural damping. 
Chapter 5 extends use of the algorithm to finite element models, including solutions for numerical issues, approaches for modeling damping approximately in reduced coordinates, and analytical validation using a composite sandwich plate model. Chapter 6 presents the final extension to experimental systems, including methods for initial baseline correlation and data reduction, and validates the algorithm on an experimental composite plate with impact damage. The final chapter deviates from development and validation of the primary algorithm to discuss development of an experimental scaled-wing test bed as part of a collaborative effort for developing structural health monitoring and prognosis technology. The dissertation concludes with an overview of technical conclusions and recommendations for future work.
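The statistically weighted least-squares step at the heart of such model-update algorithms can be sketched in a few lines. This is a generic illustration with synthetic data, not the dissertation's implementation: the residual vector is assumed (approximately) linear in the damage parameters, and the weight matrix is the inverse of the measurement-noise covariance.

```python
import numpy as np

# Illustrative weighted least-squares update: residual r ~ S @ p, where S
# is the sensitivity of the FRF-based residual forces to damage parameters p.
rng = np.random.default_rng(0)
S = rng.normal(size=(40, 3))            # sensitivity matrix (measurements x parameters)
p_true = np.array([0.2, 0.0, -0.1])     # "true" stiffness reductions (synthetic)
r = S @ p_true + 0.01 * rng.normal(size=40)   # noisy residual force vector

W = np.eye(40) / 0.01**2                # weight = inverse noise covariance
A = S.T @ W @ S
p_hat = np.linalg.solve(A, S.T @ W @ r) # weighted LS estimate (location/quantification)
cov_p = np.linalg.inv(A)                # estimation uncertainty

print(p_hat)   # close to p_true
```

The diagonal of `cov_p` plays the role of the estimation uncertainty the abstract mentions alongside the damage estimate itself.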
Process-Hardened, Multi-Analyte Sensor for Characterizing Rocket Plume Constituents
NASA Technical Reports Server (NTRS)
Goswami, Kisholoy
2011-01-01
A multi-analyte sensor was developed that enables simultaneous detection of rocket engine combustion-product molecules in a launch-vehicle ground test stand. The sensor was developed using a pin-printing method by incorporating multiple sensor elements on a single chip. It demonstrated accurate and sensitive detection of analytes such as carbon dioxide, carbon monoxide, kerosene, isopropanol, and ethylene from a single measurement. The use of pin-printing technology enables high-volume fabrication of the sensor chip, which will ultimately eliminate the need for individual sensor calibration since many identical sensors are made in one batch. Tests were performed using a single-sensor chip attached to a fiber-optic bundle. The use of a fiber bundle allows placement of the opto-electronic readout device at a place remote from the test stand. The sensors are rugged for operation in harsh environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, D.W.; Heinrich, R.R.; Jensen, K.J.
Technical and administrative activities of the Analytical Chemistry Laboratory (ACL) are reported for fiscal year 1984. The ACL is a full-cost-recovery service center, with the primary mission of providing a broad range of technical support services to the scientific and engineering programs at ANL. In addition, ACL conducts a research program in analytical chemistry, works on instrumental and methods development, and provides analytical services for governmental, educational, and industrial organizations. The ACL is administratively within the Chemical Technology Division, the principal user, but provides technical support for all of the technical divisions and programs at ANL. The ACL has three technical groups - Chemical Analysis, Instrumental Analysis, and Organic Analysis. Under technical activities, 26 projects are briefly described. Under professional activities, a list is presented of publications and reports, oral presentations, awards, and meetings attended. 6 figs., 2 tabs.
Applications of Optical Microcavity Resonators in Analytical Chemistry
Wade, James H.; Bailey, Ryan C.
2018-01-01
Optical resonator sensors are an emerging class of analytical technologies that use recirculating light confined within a microcavity to sensitively measure the surrounding environment. Bolstered by advances in microfabrication, these devices can be configured for a wide variety of chemical or biomolecular sensing applications. The review begins with a brief description of optical resonator sensor operation followed by discussions regarding sensor design, including different geometries, choices of material systems, methods of sensor interrogation, and new approaches to sensor operation. Throughout, key recent developments are highlighted, including advancements in biosensing and other applications of optical sensors. Alternative sensing mechanisms and hybrid sensing devices are then discussed in terms of their potential for more sensitive and rapid analyses. Brief concluding statements offer our perspective on the future of optical microcavity sensors and their promise as versatile detection elements within analytical chemistry. PMID:27049629
Lowe, Aaron M.; Bertics, Paul J.; Abbott, Nicholas L.
2009-01-01
We report methods for the acquisition and analysis of optical images formed by thin films of twisted nematic liquid crystals (LCs) placed into contact with surfaces patterned with bio/chemical functionality relevant to surface-based assays. The methods are simple to implement and are shown to provide easily interpreted maps of chemical transformations on surfaces that are widely exploited in the preparation of analytic devices. The methods involve acquisition of multiple images of the LC as a function of the orientation of a polarizer; data analysis condenses the information present in the stack of images into a spatial map of the twist angle of the LC on the analytic surface. The potential utility of the methods is illustrated by mapping (i) the displacement of a monolayer formed from one alkanethiol on a gold film by a second thiol in solution, (ii) coadsorption of mixtures of amine-terminated and ethyleneglycol-terminated alkanethiols on gold films, which leads to a type of mixed monolayer that is widely exploited for immobilization of proteins on analytic surfaces, and (iii) patterns of antibodies printed onto surfaces. These results show that maps of the twist angle of the LC constructed from families of optical images can be used to reveal surface features that are not apparent in a single image of the LC film. Furthermore, the twist angles of the LC can be used to quantify the energy of interaction of the LC with the surface with a spatial resolution of <10 µm. When combined, the results described in this paper suggest non-destructive methods to monitor and validate chemical transformations on surfaces of the type that are routinely employed in the preparation of surface-based analytic technologies. PMID:18355089
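The stack-to-map reduction described above can be illustrated with a toy per-pixel fit. Note this uses a simple Malus-law intensity model rather than the full twisted-nematic optics of the paper, purely to show how a family of polarizer-angle images condenses to a single orientation angle per pixel:

```python
import numpy as np

# Simplified model (assumption, not the paper's optics): transmitted
# intensity I(theta) = 0.5 + 0.5*cos(2*(theta - phi)) for polarizer angle
# theta and per-pixel axis angle phi.
thetas = np.deg2rad(np.arange(0, 180, 15))           # polarizer angles
phi_true = np.deg2rad(37.0)                          # one pixel's axis angle
stack = 0.5 + 0.5 * np.cos(2 * (thetas - phi_true))  # that pixel across the image stack

# Project the intensity series onto cos(2*theta) and sin(2*theta); the
# phase of the second harmonic recovers the orientation angle.
c = np.mean(stack * np.cos(2 * thetas))
s = np.mean(stack * np.sin(2 * thetas))
phi_est = 0.5 * np.arctan2(s, c)

print(np.rad2deg(phi_est))   # ~= 37
```

Applied pixel-by-pixel, this projection condenses the whole image stack into one angle map, which is the shape of the analysis the abstract describes for twist angles.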
Sensor failure detection for jet engines using analytical redundancy
NASA Technical Reports Server (NTRS)
Merrill, W. C.
1984-01-01
Analytical redundant sensor failure detection, isolation and accommodation techniques for gas turbine engines are surveyed. Both the theoretical technology base and demonstrated concepts are discussed. Also included is a discussion of current technology needs and ongoing Government sponsored programs to meet those needs.
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCB's in soi...
Morbioli, Giorgio Gianini; Mazzu-Nascimento, Thiago; Milan, Luis Aparecido; Stockton, Amanda M; Carrilho, Emanuel
2017-05-02
Paper-based devices are a portable, user-friendly, and affordable technology, making them among the best analytical tools for inexpensive diagnostics. Three-dimensional microfluidic paper-based analytical devices (3D-μPADs) are an evolution of single-layer devices, permitting effective sample dispersion, individual layer treatment, and multiplexed analytical assays. Here, we present the rational design of a wax-printed 3D-μPAD that enables more homogeneous permeation of fluids along the cellulose matrix than other existing designs in the literature. Moreover, we show the importance of the rational design of channels on these devices using glucose oxidase, peroxidase, and 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulfonic acid) (ABTS) reactions. We present an alternative method for layer stacking using a magnetic apparatus, which facilitates fluidic dispersion and improves the reproducibility of tests performed on 3D-μPADs. We also provide the optimized designs for printing, facilitating further studies using 3D-μPADs.
Nine-analyte detection using an array-based biosensor
NASA Technical Reports Server (NTRS)
Taitt, Chris Rowe; Anderson, George P.; Lingerfelt, Brian M.; Feldstein, Mark J.; Ligler, Frances S.
2002-01-01
A fluorescence-based multianalyte immunosensor has been developed for simultaneous analysis of multiple samples. While the standard 6 x 6 format of the array sensor has been used to analyze six samples for six different analytes, this same format has the potential to allow a single sample to be tested for 36 different agents. The method described herein demonstrates proof of principle that the number of analytes detectable using a single array can be increased simply by using complementary mixtures of capture and tracer antibodies. Mixtures were optimized to allow detection of closely related analytes without significant cross-reactivity. Following this facile modification of patterning and assay procedures, the following nine targets could be detected in a single 3 x 3 array: Staphylococcal enterotoxin B, ricin, cholera toxin, Bacillus anthracis Sterne, Bacillus globigii, Francisella tularensis LVS, Yersinia pestis F1 antigen, MS2 coliphage, and Salmonella typhimurium. This work maximizes the efficiency and utility of the described array technology, increasing only reagent usage and cost; production and fabrication costs are not affected.
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm
2014-01-01
Monitoring the misuse of drugs and the abuse of substances and methods that potentially or evidently improve athletic performance by means of analytical chemistry strategies is one of the main pillars of modern anti-doping efforts. Owing to continuously growing knowledge in medicine, pharmacology, and (bio)chemistry, new chemical entities are frequently established and developed, many of which present a temptation for sportsmen and women due to the assumed or attributed beneficial effects of such substances and preparations on, for example, endurance, strength, and regeneration. By means of new technologies, expanded existing test protocols, and new insights into the metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA), analytical assays have been further improved in agreement with the content of the 2013 Prohibited List. In this annual banned-substance review, literature concerning human sports drug testing that was published between October 2012 and September 2013 is summarized and reviewed, with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2013 John Wiley & Sons, Ltd.
Review and assessment of the HOST turbine heat transfer program
NASA Technical Reports Server (NTRS)
Gladden, Herbert J.
1988-01-01
The objectives of the HOST Turbine Heat Transfer subproject were to obtain a better understanding of the physics of the aerothermodynamic phenomena occurring in high-performance gas turbine engines and to assess and improve the analytical methods used to predict the fluid dynamics and heat transfer phenomena. At the time the HOST project was initiated, an across-the-board improvement in turbine design technology was needed. Therefore, a building-block approach was utilized, with research ranging from the study of fundamental phenomena and analytical modeling to experiments in simulated real-engine environments. Experimental research accounted for 75 percent of the project, and analytical efforts accounted for approximately 25 percent. Extensive experimental datasets were created depicting the three-dimensional flow field, high free-stream turbulence, boundary-layer transition, blade tip region heat transfer, film cooling effects in a simulated engine environment, rough-wall cooling enhancement in a rotating passage, and rotor-stator interaction effects. In addition, analytical modeling of these phenomena was initiated using boundary-layer assumptions as well as Navier-Stokes solutions.
Process analytical technology in continuous manufacturing of a commercial pharmaceutical product.
Vargas, Jenny M; Nielsen, Sarah; Cárdenas, Vanessa; Gonzalez, Anthony; Aymat, Efrain Y; Almodovar, Elvin; Classe, Gustavo; Colón, Yleana; Sanchez, Eric; Romañach, Rodolfo J
2018-03-01
The implementation of process analytical technology and continuous manufacturing at an FDA-approved commercial manufacturing site is described. In this direct compaction process the blends produced were monitored with a Near Infrared (NIR) spectroscopic calibration model developed with partial least squares (PLS) regression. The authors understand that this is the first study where the continuous manufacturing (CM) equipment was used as a gravimetric reference method for the calibration model. A principal component analysis (PCA) model was also developed to identify the powder blend and determine whether it was similar to the calibration blends. An air diagnostic test was developed to assure that powder was present within the interface when the NIR spectra were obtained. The air diagnostic test as well as the PCA and PLS calibration models were integrated into an industrial software platform that collects the real-time NIR spectra and applies the calibration models. The PCA test successfully detected an equipment malfunction. Variographic analysis was also performed to estimate the sampling and analytical errors that affect the results from the NIR spectroscopic method during commercial production. The system was used to monitor and control a 28 h continuous manufacturing run, where the average drug concentration determined by the NIR method was 101.17% of label claim with a standard deviation of 2.17%, based on 12,633 spectra collected. The average drug concentration for the tablets produced from these blends was 100.86% of label claim with a standard deviation of 0.4%, for 500 tablets analyzed by Fourier Transform Near Infrared (FT-NIR) transmission spectroscopy. The excellent agreement between the mean drug concentration values in the blends and tablets produced provides further evidence of the suitability of the validation strategy that was followed. Copyright © 2018 Elsevier B.V. All rights reserved.
Cespi, Marco; Perinelli, Diego R; Casettari, Luca; Bonacucina, Giulia; Caporicci, Giuseppe; Rendina, Filippo; Palmieri, Giovanni F
2014-12-30
The use of process analytical technologies (PAT) to ensure final product quality is by now a well-established practice in the pharmaceutical industry. To date, most of the efforts in this field have focused on the development of analytical methods using spectroscopic techniques (e.g., NIR, Raman). This work evaluated the possibility of using parameters derived from the processing of in-line raw compaction data (the forces and displacements of the punches) as a PAT tool for controlling the tableting process. To reach this goal, two commercially available formulations were used, changing their quantitative composition and compressing them on a fully instrumented rotary pressing machine. The Heckel yield pressure and the compaction energies, together with the tablet hardness and compaction pressure, were selected and evaluated as discriminating parameters in all the prepared formulations. The results show that the apparent yield pressure has the necessary sensitivity to be effectively included in a PAT strategy for monitoring the tableting process. Additional investigations were performed to understand the critical factors and mechanisms behind this parameter's performance and the associated implications. Specifically, it was found that the effectiveness of the apparent yield pressure depends on the nominal drug content, the drug densification mechanism, and the error in pycnometric density. In this study, the potential of using parameters derived from raw compaction data has been demonstrated to be an attractive alternative and complementary method to the well-established spectroscopic techniques for monitoring and controlling the tableting process. The compaction data monitoring method is also easy to set up and very cost-effective. Copyright © 2014 Elsevier B.V. All rights reserved.
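The Heckel analysis mentioned above reduces to fitting ln(1/(1-D)) against compaction pressure over the (approximately) linear region, where D is the relative density of the compact; the apparent yield pressure is the reciprocal of the slope. A sketch with invented but plausible data:

```python
import numpy as np

# Illustrative pressure/relative-density data (not from the cited study).
P = np.array([50.0, 75, 100, 125, 150, 175, 200])            # MPa
D = np.array([0.80, 0.84, 0.87, 0.895, 0.915, 0.93, 0.94])   # relative density

y = np.log(1.0 / (1.0 - D))      # Heckel transform: ln(1/(1-D)) = K*P + A
K, A = np.polyfit(P, y, 1)       # slope K and intercept A by linear fit
Py = 1.0 / K                     # apparent yield pressure, MPa

print(round(Py, 1))
```

In a PAT setting, the same fit would run on each station's in-line force/displacement trace, and drift in `Py` would flag a change in the material's densification behavior.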
Embedded Piezoresistive Microcantilever Sensors for Chemical and Biological Sensing
NASA Astrophysics Data System (ADS)
Porter, Timothy; Eastman, Michael; Kooser, Ara; Manygoats, Kevin; Zhine, Rosalie
2003-03-01
Microcantilever sensors based on embedded piezoresistive technology offer a promising, low-cost method of sensing chemical and biological species. Here, we present data on the detection of various gaseous analytes, including volatile organic compounds (VOCs) and carbon monoxide. We have also used these sensors to detect bovine serum albumin (BSA), a protein important in the study of human childhood diabetes.
David C. Calkin; Mark A. Finney; Alan A. Ager; Matthew P. Thompson; Krista M. Gebert
2011-01-01
In this paper we review progress towards the implementation of a risk management framework for US federal wildland fire policy and operations. We first describe new developments in wildfire simulation technology that catalyzed the development of risk-based decision support systems for strategic wildfire management. These systems include new analytical methods to measure...
Analytics-Driven Lossless Data Compression for Rapid In-situ Indexing, Storing, and Querying
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenkins, John; Arkatkar, Isha; Lakshminarasimhan, Sriram
2013-01-01
The analysis of scientific simulations is highly data-intensive and is becoming an increasingly important challenge. Peta-scale data sets require the use of light-weight query-driven analysis methods, as opposed to heavy-weight schemes that optimize for speed at the expense of size. This paper is an attempt in the direction of query processing over losslessly compressed scientific data. We propose a co-designed double-precision compression and indexing methodology for range queries by performing unique-value-based binning on the most significant bytes of double-precision data (sign, exponent, and most significant mantissa bits), and inverting the resulting metadata to produce an inverted index over a reduced data representation. Without the inverted index, our method matches or improves compression ratios over both general-purpose and floating-point compression utilities. The inverted index is light-weight, and the overall storage requirement for both reduced column and index is less than 135%, whereas existing DBMS technologies can require 200-400%. As a proof of concept, we evaluate univariate range queries that additionally return column values, a critical component of data analytics, against state-of-the-art bitmap indexing technology, showing multi-fold query performance improvements.
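The binning-plus-inverted-index idea can be caricatured in a few lines: bin each double by its most significant two bytes (sign, exponent, and top mantissa bits) and invert the binning into row-id lists, so a range query touches only candidate bins before an exact refinement pass. This sketch handles positive-valued queries only; negative IEEE-754 values need a key remapping that it omits, and it is not the paper's implementation.

```python
import numpy as np
from collections import defaultdict

data = np.array([0.12, 3.5, 3.7, -1.0, 250.0, 3.55, 0.13])

def key(v):
    """Top 16 bits of the IEEE-754 representation of a double."""
    return int(np.array(v, dtype=np.float64).view(np.uint64)) >> 48

index = defaultdict(list)            # bin key -> row ids (the inverted index)
for row, value in enumerate(data):
    index[key(value)].append(row)

def range_query(lo, hi):
    # Candidate bins whose key falls in the query's key range, then an
    # exact refinement against the (reduced) column values.
    candidates = [r for k, rows in index.items()
                  if key(lo) <= k <= key(hi)
                  for r in rows]
    return sorted(r for r in candidates if lo <= data[r] <= hi)

print(range_query(3.0, 4.0))   # rows holding 3.5, 3.7, 3.55
```

The index itself stores only 16-bit keys and row lists, which is the sense in which the metadata stays light-weight relative to the full column.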
Sonner, Zachary; Wilder, Eliza; Gaillard, Trudy; Kasting, Gerald; Heikenfeld, Jason
2017-07-25
Eccrine sweat has rapidly emerged as a non-invasive, ergonomic, and rich source of chemical analytes with numerous technological demonstrations now showing the ability for continuous electrochemical sensing. However, beyond active perspirers (athletes, workers, etc.), continuous sweat access in individuals at rest has hindered the advancement of both sweat sensing science and technology. Reported here is integration of sudomotor axon reflex sweat stimulation for continuous wearable sweat analyte analysis, including the ability for side-by-side integration of chemical stimulants & sensors without cross-contamination. This integration approach is uniquely compatible with sensors which consume the analyte (enzymatic) or sensors which equilibrate with analyte concentrations. In vivo validation is performed using iontophoretic delivery of carbachol with ion-selective and impedance sensors for sweat analysis. Carbachol has shown prolonged sweat stimulation in directly stimulated regions for five hours or longer. This work represents a significant leap forward in sweat sensing technology, and may be of broader interest to those interested in on-skin sensing integrated with drug-delivery.
Mass spectrometry-based proteomics: basic principles and emerging technologies and directions.
Van Riper, Susan K; de Jong, Ebbing P; Carlis, John V; Griffin, Timothy J
2013-01-01
As the main catalytic and structural molecules within living systems, proteins are the most likely biomolecules to be affected by radiation exposure. Proteomics, the comprehensive characterization of proteins within complex biological samples, is therefore a research approach ideally suited to assess the effects of radiation exposure on cells and tissues. For comprehensive characterization of proteomes, an analytical platform capable of quantifying protein abundance, identifying post-translational modifications, and revealing members of protein complexes on a system-wide level is necessary. Mass spectrometry (MS), coupled with technologies for sample fractionation and automated data analysis, provides such a versatile and powerful platform. In this chapter we offer a view on the current state of MS-based proteomics and focus on emerging technologies within three areas: (1) new instrumental methods; (2) new computational methods for peptide identification; and (3) label-free quantification. These emerging technologies should be valuable for researchers seeking to better understand the biological effects of radiation on living systems.
NASA Astrophysics Data System (ADS)
Doko, Tomoko; Chen, Wenbo; Higuchi, Hiroyoshi
2016-06-01
Satellite tracking technology has been used to reveal the migration patterns and flyways of migratory birds. In general, bird migration can be classified according to migration status: the wintering period, spring migration, breeding period, and autumn migration. To determine the migration status, the periods of these statuses should be individually determined, but there has been no objective method to define a threshold date for when an individual bird changes its status. The research objective is to develop an effective and objective method to determine threshold dates of migration status based on satellite-tracked data. The developed method was named the MATCHED (Migratory Analytical Time Change Easy Detection) method. In order to demonstrate the method, data acquired from satellite-tracked Tundra Swans were used. The MATCHED method comprises six steps: 1) dataset preparation, 2) time frame creation, 3) automatic identification, 4) visualization of change points, 5) interpretation, and 6) manual correction. Accuracy was tested. In general, the MATCHED method proved powerful for identifying the change points between migration statuses as well as stopovers. Nevertheless, identifying "exact" threshold dates remains challenging. Limitations and applications of this method are discussed.
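The automatic-identification step (step 3) can be caricatured with a displacement-threshold rule on the tracking series. This toy sketch is not the MATCHED implementation, and the 50 km/day cutoff is an invented example value:

```python
import numpy as np

# Toy daily-displacement series for one bird (km/day, invented values):
# low values = stationary (wintering/breeding/stopover), high = migrating.
disp_km = np.array([2, 3, 1, 2, 180, 220, 150, 4, 2, 3, 200, 190, 2, 1], dtype=float)

THRESH = 50.0                        # km/day; assumed cutoff, for illustration
status = disp_km > THRESH            # True = migrating on that day
change_points = [i for i in range(1, len(status)) if status[i] != status[i - 1]]

print(change_points)   # day indices where status flips: [4, 7, 10, 12]
```

Each flip index is a candidate threshold date; the method's later steps (visualization, interpretation, manual correction) would then refine these candidates.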
NASA Astrophysics Data System (ADS)
Molina-Perez, Edmundo
It is widely recognized that international environmental technological change is key to reducing the rapidly rising greenhouse gas emissions of emerging nations. In 2010, the United Nations Framework Convention on Climate Change (UNFCCC) Conference of the Parties (COP) agreed to the creation of the Green Climate Fund (GCF). This new multilateral organization has been created with the collective contributions of COP members and has been tasked with directing over USD 100 billion per year towards investments that can enhance the development and diffusion of clean energy technologies in both advanced and emerging nations (Helm and Pichler, 2015). The landmark agreement arrived at during COP 21 has reaffirmed the key role the GCF plays in enabling climate mitigation, as it is now necessary to align large-scale climate financing efforts with the long-term goals agreed at Paris 2015. This study argues that because of the incomplete understanding of the mechanics of international technological change, the multiplicity of policy options, and ultimately the presence of deep uncertainty about climate and technological change, climate financing institutions such as the GCF require new analytical methods for designing long-term robust investment plans. Motivated by these challenges, this dissertation shows that the application of new analytical methods, such as Robust Decision Making (RDM) and Exploratory Modeling (Lempert, Popper and Bankes, 2003), to the study of international technological change and climate policy provides useful insights that can be used for designing a robust architecture of international technological cooperation for climate change mitigation. For this study I developed an exploratory dynamic integrated assessment model (EDIAM), which is used as the scenario generator in a large computational experiment. The scope of the experimental design considers an ample set of climate and technological scenarios. 
These scenarios combine five sources of uncertainty: climate change, the elasticity of substitution between renewable and fossil energy, and three different sources of technological uncertainty (i.e. R&D returns, innovation propensity and technological transferability). The performance of eight different GCF and non-GCF based policy regimes is evaluated in light of various end-of-century climate policy targets. Then I combine traditional scenario discovery data mining methods (Bryant and Lempert, 2010) with high-dimensional stacking methods (Suzuki, Stem and Manzocchi, 2015; Taylor et al., 2006; LeBlanc, Ward and Wittels, 1990) to quantitatively characterize the conditions under which it is possible to stabilize greenhouse gas emissions and keep temperature rise below 2°C before the end of the century. Finally, I describe a method by which the results of scenario discovery can be combined with high-dimensional stacking to construct a dynamic architecture of low-cost technological cooperation. This dynamic architecture consists of adaptive pathways (Kwakkel, Haasnoot and Walker, 2014; Haasnoot et al., 2013) which begin with carbon taxation across both regions as a critical near-term action. Then, in subsequent phases, different forms of cooperation are triggered depending on the unfolding climate and technological conditions. I show that no single policy regime dominates over the entire uncertainty space. Instead, I find that these architectures can be combined into a dynamic framework for technological cooperation across regions, adaptable to unfolding climate and technological conditions, which leads to a greater rate of success and lower costs in meeting the end-of-century climate change objectives agreed at the 2015 Paris Conference of the Parties. Keywords: international technological change, emerging nations, climate change, technological uncertainties, Green Climate Fund.
PCR technology for screening and quantification of genetically modified organisms (GMOs).
Holst-Jensen, Arne; Rønning, Sissel B; Løvseth, Astrid; Berdal, Knut G
2003-04-01
Although PCR technology has obvious limitations, the potentially high degree of sensitivity and specificity explains why it has been the first choice of most analytical laboratories interested in detection of genetically modified (GM) organisms (GMOs) and derived materials. Because the products that laboratories receive for analysis are often processed and refined, the quality and quantity of the target analyte (e.g. protein or DNA) frequently challenge the sensitivity of any detection method. Among the currently available methods, PCR methods are generally accepted as the most sensitive and reliable for detection of GM-derived material in routine applications. The choice of target sequence motif is the single most important factor controlling the specificity of a PCR method. The target sequence is normally a part of the modified gene construct, for example a promoter, a terminator, a gene, or a junction between two of these elements. However, the elements may originate from wild-type organisms, they may be present in more than one GMO, and their copy number may also vary from one GMO to another. They may even be combined in a similar way in more than one GMO. Thus, the choice of method should fit the purpose. Recent developments include event-specific methods, particularly useful for identification and quantification of GM content. Thresholds for labelling are now in place in many countries including those in the European Union. The success of the labelling schemes is dependent upon the efficiency with which GM-derived material can be detected. We present an overview of currently available PCR methods for screening and quantification of GM-derived DNA, and discuss their applicability and limitations. In addition, we discuss some of the major challenges related to determination of the limits of detection (LOD) and quantification (LOQ), and to validation of methods.
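Event-specific quantification of GM content is commonly expressed as the ratio of event-specific copy number to taxon-specific reference-gene copy number, each read off a real-time PCR standard curve. A minimal sketch of that arithmetic (the slope, intercept, and function names are illustrative assumptions, not a validated method; real laboratories calibrate against certified reference materials):

```python
def copies_from_cq(cq, slope, intercept):
    """Invert the standard curve Cq = slope*log10(N) + intercept
    to recover the starting copy number N."""
    return 10 ** ((cq - intercept) / slope)

def gm_percent(cq_event, cq_reference, slope=-3.32, intercept=40.0):
    """GM content (%) as event-specific copies over reference-gene
    copies. slope = -3.32 corresponds to ~100% PCR efficiency
    (an idealized assumption)."""
    n_event = copies_from_cq(cq_event, slope, intercept)
    n_ref = copies_from_cq(cq_reference, slope, intercept)
    return 100.0 * n_event / n_ref
```

With these illustrative curve parameters, a Cq difference of 3.32 cycles between event and reference assays corresponds to a tenfold difference in copy number.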
Some problems in mechanics of growing solids with applications to AM technologies
NASA Astrophysics Data System (ADS)
Manzhirov, A. V.
2018-04-01
Additive Manufacturing (AM) technologies are an exciting area of the modern industrial revolution and have applications in engineering, medicine, electronics, the aerospace industry, etc. AM enables cost-effective production of customized geometry and parts by direct fabrication from 3D data and mathematical models. Despite much progress in AM technologies, problems of mechanical analysis for AM-fabricated parts remain to be solved. This paper deals with three main mechanical problems: the onset of residual stresses, which occur in the AM process and can lead to failure of the parts; the distortion of the final shape of AM-fabricated parts; and the development of technological solutions aimed at improving existing AM technologies and creating new ones. The proposed approach deals with the construction of adequate analytical models and effective methods for simulating AM processes for fabricated solid parts.
Long, Stephen E; Catron, Brittany L; Boggs, Ashley Sp; Tai, Susan Sc; Wise, Stephen A
2016-09-01
The use of urinary iodine as an indicator of iodine status relies in part on the accuracy of the analytical measurement of iodine in urine. Likewise, the use of dietary iodine intake as an indicator of iodine status relies in part on the accuracy of the analytical measurement of iodine in dietary sources, including foods and dietary supplements. Similarly, the use of specific serum biomarkers of thyroid function to screen for both iodine deficiency and iodine excess relies in part on the accuracy of the analytical measurement of those biomarkers. The National Institute of Standards and Technology has been working with the NIH Office of Dietary Supplements for several years to develop higher-order reference measurement procedures and Standard Reference Materials to support the validation of new routine analytical methods for iodine in foods and dietary supplements, for urinary iodine, and for several serum biomarkers of thyroid function including thyroid-stimulating hormone, thyroglobulin, total and free thyroxine, and total and free triiodothyronine. These materials and methods have the potential to improve the assessment of iodine status and thyroid function in observational studies and clinical trials, thereby promoting public health efforts related to iodine nutrition. © 2016 American Society for Nutrition.
Ates, Ebru; Mittendorf, Klaus; Senyuva, Hamide
2013-01-01
An automated sample preparation technique involving cleanup and analytical separation in a single operation using an online coupled TurboFlow (RP-LC) system is reported. This method eliminates time-consuming sample preparation steps that can be potential sources of cross-contamination in the analysis of plasticizers. Using TurboFlow chromatography, liquid samples were injected directly into the automated system without prior extraction or cleanup. Special cleanup columns enabled specific binding of target compounds; higher molecular weight compounds, i.e., fats and proteins, and other matrix interferences with different chemical properties were removed to waste prior to LC/MS/MS. Systematic stepwise method development using this new technology in the food safety area is described. Selection of optimum columns and mobile phases for loading onto the cleanup column, followed by transfer onto the analytical column and MS detection, are the critical method parameters. The method was optimized for the assay of 10 phthalates (dimethyl, diethyl, dipropyl, butyl benzyl, diisobutyl, dicyclohexyl, dihexyl, diethylhexyl, diisononyl, and diisododecyl) and one adipate (diethylhexyl) in beverages and milk.
Lee, Yu; Yu, Chanki; Lee, Sang Wook
2018-01-10
We present a sequential fitting-and-separating algorithm for surface reflectance components that separates individual dominant reflectance components and simultaneously estimates the corresponding bidirectional reflectance distribution function (BRDF) parameters from the separated reflectance values. We tackle the estimation of a Lafortune BRDF model, which combines a non-Lambertian diffuse reflection with multiple specular reflectance components, each with a different specular lobe. Our proposed method infers the appropriate number of BRDF lobes and their parameters by separating and estimating each of the reflectance components using an interval analysis-based branch-and-bound method in conjunction with iterative K-ordered scale estimation. The focus of this paper is the estimation of the Lafortune BRDF model; nevertheless, our proposed method can be applied to other analytical BRDF models such as the Cook-Torrance and Ward models. Experiments were carried out to validate the proposed method using isotropic materials from the Mitsubishi Electric Research Laboratories-Massachusetts Institute of Technology (MERL-MIT) BRDF database, and the results show that our method is superior to a conventional minimization algorithm.
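The Lafortune model referred to above combines a diffuse term with generalized cosine lobes. A minimal evaluation sketch, under the common parameterization with incoming and outgoing unit directions u and v in the local surface frame (z = normal); the parameter values in the test are illustrative, not fitted MERL-MIT data:

```python
import math

def lafortune_brdf(u, v, rho_d, lobes):
    """Evaluate a Lafortune-style BRDF.
    u, v: unit direction 3-tuples in the local frame (z = surface normal).
    rho_d: diffuse albedo (Lambertian base term rho_d / pi).
    lobes: list of (Cx, Cy, Cz, n) generalized cosine lobes."""
    val = rho_d / math.pi
    for cx, cy, cz, n in lobes:
        # weighted dot product; clamped so negative lobes contribute nothing
        dot = cx * u[0] * v[0] + cy * u[1] * v[1] + cz * u[2] * v[2]
        if dot > 0.0:
            val += dot ** n
    return val
```

A lobe of the form (-1, -1, 1, n) reproduces the classic mirror-direction cosine lobe; fitting, as in the paper, amounts to estimating (Cx, Cy, Cz, n) per lobe from measured reflectance samples.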
Wu, Cao; Chen, Zhou; Hu, Ya; Rao, Zhiyuan; Wu, Wangping; Yang, Zhaogang
2018-05-15
Crystallization is a significant process employed to produce a wide variety of materials in the pharmaceutical and food areas. Control of crystal dimension, crystallinity, and shape is very important because they affect the subsequent filtration, drying, and grinding performance as well as the physical and chemical properties of the material. This review summarizes the special features of crystallization technology and the preparation methods of nanocrystals, and discusses the analytical technologies used to control crystal quality and performance. Crystallization technology applications in pharmaceutics and foods are also outlined. These illustrated examples further help us gain a better understanding of crystallization technology for pharmaceutics and foods. Copyright© Bentham Science Publishers.
Schmidt, Kathrin S; Mankertz, Joachim
2018-06-01
A sensitive and robust LC-MS/MS method allowing the rapid screening and confirmation of selective androgen receptor modulators in bovine urine was developed and successfully validated according to Commission Decision 2002/657/EC, chapter 3.1.3 'alternative validation', by applying a matrix-comprehensive in-house validation concept. The confirmation of the analytes in the validation samples was achieved both on the basis of the MRM ion ratios as laid down in Commission Decision 2002/657/EC and by comparison of their enhanced product ion (EPI) spectra with a reference mass spectral library by making use of the QTRAP technology. Here, in addition to the MRM survey scan, EPI spectra were generated in a data-dependent way according to an information-dependent acquisition criterion. Moreover, stability studies of the analytes in solution and in matrix according to an isochronous approach proved the stability of the analytes in solution and in matrix for at least the duration of the validation study. To identify factors that have a significant influence on the test method in routine analysis, a factorial effect analysis was performed. To this end, factors considered to be relevant for the method in routine analysis (e.g. operator, storage duration of the extracts before measurement, different cartridge lots and different hydrolysis conditions) were systematically varied on two levels. The examination of the extent to which these factors influence the measurement results of the individual analytes showed that none of the validation factors exerts a significant influence on the measurement results.
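The MRM ion-ratio criterion of Commission Decision 2002/657/EC sets maximum permitted tolerances that widen as the qualifier ion's relative intensity falls. A sketch of that check (function names are illustrative; the tolerance table is the LC-MS one from the Decision):

```python
def permitted_tolerance(ref_rel_intensity_pct):
    """Maximum permitted relative deviation (as a fraction) of an ion
    ratio for LC-MS, per Commission Decision 2002/657/EC: +/-20% when the
    reference relative intensity is >50%, +/-25% for >20-50%,
    +/-30% for >10-20%, and +/-50% at <=10%."""
    if ref_rel_intensity_pct > 50:
        return 0.20
    if ref_rel_intensity_pct > 20:
        return 0.25
    if ref_rel_intensity_pct > 10:
        return 0.30
    return 0.50

def ion_ratio_confirmed(sample_pct, reference_pct):
    """True if the sample's qualifier/quantifier ratio lies within the
    permitted window around the reference standard's ratio."""
    tol = permitted_tolerance(reference_pct)
    return abs(sample_pct - reference_pct) <= tol * reference_pct
```

In the study, this ratio check is complemented by library matching of the enhanced product ion spectra, which provides an independent confirmation route.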
2013-01-01
Background Nowadays long-lasting insecticidal mosquito nets (LNs) are frequently used around the world to protect people against malaria vectors. As they contain insecticide, laboratory control is needed to check whether the content of the active ingredient follows the manufacturer's specifications and whether the active ingredient is still present after some time of use. For this purpose, an analytical method had to be developed. Because LNs include a range of polymers for the yarn and use coated or incorporated technologies for the active ingredient, it is a challenge to find a single analytical method for determining the active ingredient in LNs that takes both impregnation technologies into account. Some methods are provided by international organizations but are limited to the determination of only one pesticide per method. The aim of this study was to optimize a short-time extraction method for deltamethrin and alpha-cypermethrin from coated and incorporated mosquito nets, and to detect both insecticides in one analytical run, using gas chromatography with electron capture detection (GC-μECD). Methods Based on the literature, the most suitable solvent and the adequate extraction process for the insecticides used in net making were identified and adapted for the new multi-residue method. Results The validation data of the multi-residue method to determine deltamethrin and alpha-cypermethrin in mosquito nets by GC-μECD are given. Depending on the concentration of the active ingredient spiked on the nets, the mean recovery for alpha-cypermethrin ranged between 86% and 107% with a relative standard deviation below 3.5%. For deltamethrin it ranged between 90% and 108%, with a relative standard deviation also below 3.5%. The limit of detection is 0.009 g a.i./kg of net (0.3 mg a.i./m2 of net) for both alpha-cypermethrin and deltamethrin. Conclusions The data obtained are excellent. 
A 30-minute reflux extraction method with xylene was developed to determine alpha-cypermethrin and deltamethrin in long-lasting insecticidal mosquito nets (LNs) by gas chromatography with electron capture detection (GC-μECD). The method can be easily extended to other pyrethroids used for mosquito net treatment. This paper also presents an overview of studies dealing with pesticide determination in mosquito nets. PMID:23514225
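The mean recovery and relative standard deviation figures quoted above come from replicate analyses of spiked nets. The underlying calculation can be sketched as follows (replicate values below are illustrative, not the study's raw data):

```python
import statistics

def recovery_stats(measured, spiked):
    """measured: replicate results; spiked: nominal spike level
    (same units, e.g. g a.i./kg of net).
    Returns (mean recovery in %, relative standard deviation in %)."""
    recoveries = [100.0 * m / spiked for m in measured]
    mean_rec = statistics.mean(recoveries)
    # sample standard deviation relative to the mean recovery
    rsd = 100.0 * statistics.stdev(recoveries) / mean_rec
    return mean_rec, rsd
```

A result such as (100.0, 5.0) would be read as 100% mean recovery with a 5% RSD; the study reports RSDs below 3.5% for both pyrethroids.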
The technological raw material heating furnaces operation efficiency improving issue
NASA Astrophysics Data System (ADS)
Paramonov, A. M.
2017-08-01
The paper considers improving the efficiency of fuel oil use in technological raw-material heating furnaces by intensifying its combustion. The technical and economic problem of optimizing fuel oil preheating before combustion is solved. A method and algorithm were developed for analytically determining the optimal fuel oil preheating temperature, taking into account the correlation of the thermal and operating parameters with the discounted costs of the heating furnace. The obtained optimization functional ensures that the heating furnace achieves the appropriate thermal indices at minimum discounted costs. The research results prove the expediency of the proposed solutions.
NASA Astrophysics Data System (ADS)
Abdullah, Lazim; Najib, Liana
2016-04-01
Energy consumption in developing countries is increasing sharply owing to higher economic growth driven by industrialisation, along with population growth and urbanisation. The increasing demand for energy leads to a global energy crisis. Selecting the best energy technology and conservation strategy requires both quantitative and qualitative evaluation criteria. The fuzzy set-based approach is one of the well-known theories for handling fuzziness, uncertainty in decision-making, and vagueness of information. This paper proposes a new method of intuitionistic fuzzy analytic hierarchy process (IF-AHP) to deal with the uncertainty in decision-making. The new IF-AHP is applied to establish a preference in a sustainable energy planning decision-making problem. Three decision-makers attached to Malaysian government agencies were interviewed to provide linguistic judgements prior to analysis with the new IF-AHP. Nuclear energy was decided as the best alternative in energy planning, receiving the highest weight among all seven alternatives.
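IF-AHP builds on the classical AHP aggregation step. As a simplified crisp illustration (not the authors' intuitionistic fuzzy formulation), priority weights can be derived from a pairwise comparison matrix by the row geometric-mean method:

```python
import math

def ahp_weights(M):
    """M: square reciprocal pairwise comparison matrix (list of lists),
    where M[i][j] is how strongly criterion i is preferred over j.
    Returns priority weights normalized to sum to 1, computed as the
    normalized geometric means of the rows."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]
```

For example, if one alternative is judged three times as preferable as another, the geometric-mean method assigns it three quarters of the weight; the intuitionistic fuzzy variant replaces the crisp judgements with membership/non-membership pairs before a comparable aggregation.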
Frost, Megan C; Meyerhoff, Mark E
2015-01-01
We review approaches and challenges in developing chemical sensor-based methods to accurately and continuously monitor levels of key analytes in blood related directly to the status of critically ill hospitalized patients. Electrochemical and optical sensor-based technologies have been pursued to measure important critical care species in blood [i.e., oxygen, carbon dioxide, pH, electrolytes (K(+), Na(+), Cl(-), etc.), glucose, and lactate] in real-time or near real-time. The two main configurations examined to date for achieving this goal have been intravascular catheter sensors and patient attached ex vivo sensors with intermittent blood sampling via an attached indwelling catheter. We discuss the status of these configurations and the main issues affecting the accuracy of the measurements, including cell adhesion and thrombus formation on the surface of the sensors, sensor drift, sensor selectivity, etc. Recent approaches to mitigate these nagging performance issues that have prevented these technologies from clinical use are also discussed.
Josefsberg, Jessica O; Buckland, Barry
2012-06-01
The evolution of vaccines (e.g., live attenuated, recombinant) and vaccine production methods (e.g., in ovo, cell culture) are intimately tied to each other. As vaccine technology has advanced, the methods to produce the vaccine have advanced and new vaccine opportunities have been created. These technologies will continue to evolve as we strive for safer and more immunogenic vaccines and as our understanding of biology improves. The evolution of vaccine process technology has occurred in parallel to the remarkable growth in the development of therapeutic proteins as products; therefore, recent vaccine innovations can leverage the progress made in the broader biotechnology industry. Numerous important legacy vaccines are still in use today despite their traditional manufacturing processes, with further development focusing on improving stability (e.g., novel excipients) and updating formulation (e.g., combination vaccines) and delivery methods (e.g., skin patches). Modern vaccine development is currently exploiting a wide array of novel technologies to create safer and more efficacious vaccines including: viral vectors produced in animal cells, virus-like particles produced in yeast or insect cells, polysaccharide conjugation to carrier proteins, DNA plasmids produced in E. coli, and therapeutic cancer vaccines created by in vitro activation of patient leukocytes. Purification advances (e.g., membrane adsorption, precipitation) are increasing efficiency, while innovative analytical methods (e.g., microsphere-based multiplex assays, RNA microarrays) are improving process understanding. Novel adjuvants such as monophosphoryl lipid A, which acts on antigen presenting cell toll-like receptors, are expanding the previously conservative list of widely accepted vaccine adjuvants. As in other areas of biotechnology, process characterization by sophisticated analysis is critical not only to improve yields, but also to determine the final product quality. 
From a regulatory perspective, Quality by Design (QbD) and Process Analytical Technology (PAT) are important initiatives that can be applied effectively to many types of vaccine processes. Universal demand for vaccines requires that a manufacturer plan to supply tens and sometimes hundreds of millions of doses per year at low cost. To enable broader use, there is intense interest in improving temperature stability to allow for excursions from a rigid cold chain supply, especially at the point of vaccination. Finally, there is progress in novel routes of delivery to move away from the traditional intramuscular injection by syringe approach. Copyright © 2012 Wiley Periodicals, Inc.
Trends in Process Analytical Technology: Present State in Bioprocessing.
Jenzsch, Marco; Bell, Christian; Buziol, Stefan; Kepert, Felix; Wegele, Harald; Hakemeyer, Christian
2017-08-04
Process analytical technology (PAT), the regulatory initiative for incorporating quality in pharmaceutical manufacturing, is an area of intense research and interest. If PAT is effectively applied to bioprocesses, this can increase process understanding and control, and mitigate the risk from substandard drug products to both manufacturer and patient. To optimize the benefits of PAT, the entire PAT framework must be considered and each element of PAT must be carefully selected, including sensor and analytical technology, data analysis techniques, control strategies and algorithms, and process optimization routines. This chapter discusses the current state of PAT in the biopharmaceutical industry, including several case studies demonstrating the degree of maturity of various PAT tools. Graphical Abstract Hierarchy of QbD components.
Rapid detection of transition metals in welding fumes using paper-based analytical devices.
Cate, David M; Nanthasurasak, Pavisara; Riwkulkajorn, Pornpak; L'Orange, Christian; Henry, Charles S; Volckens, John
2014-05-01
Metals in particulate matter (PM) are considered a driving factor for many pathologies. Despite the hazards associated with particulate metals, personal exposures for at-risk workers are rarely assessed due to the cost and effort associated with monitoring. As a result, routine exposure assessments are performed for only a small fraction of the exposed workforce. The objective of this research was to evaluate a relatively new technology, microfluidic paper-based analytical devices (µPADs), for measuring the metals content in welding fumes. Fumes from three common welding techniques (shielded metal arc, metal inert gas, and tungsten inert gas welding) were sampled in two welding shops. Concentrations of acid-extractable Fe, Cu, Ni, and Cr were measured and independently verified using inductively coupled plasma-optical emission spectroscopy (ICP-OES). Results from the µPAD sensors agreed well with ICP-OES analysis; the two methods gave statistically similar results in >80% of the samples analyzed. Analytical costs for the µPAD technique were ~50 times lower than market-rate costs with ICP-OES. Further, the µPAD method was capable of providing same-day results (as opposed to several weeks for ICP laboratory analysis). Results of this work suggest that µPAD sensors are a viable yet inexpensive alternative to traditional analytical methods for transition metals in welding fume PM. These sensors have the potential to enable substantially higher levels of hazard surveillance for a given resource cost, especially in resource-limited environments.
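The per-sample comparison of µPAD and ICP-OES results can be illustrated with a simple agreement criterion: two results agree when their difference is within the combined expanded uncertainty (coverage factor k = 2). This E_n-style criterion is a hedged stand-in, not necessarily the statistical test used in the study:

```python
def methods_agree(x1, u1, x2, u2, k=2.0):
    """x1, x2: results from the two methods for the same sample;
    u1, u2: their standard uncertainties; k: coverage factor.
    True if |x1 - x2| is within the combined expanded uncertainty."""
    return abs(x1 - x2) <= k * (u1 ** 2 + u2 ** 2) ** 0.5
```

Applied sample by sample, the fraction of True results corresponds to the ">80% statistically similar" figure reported above.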
Tao, Li; Zhu, Kun; Zhu, Jungao; Xu, Xiaohan; Lin, Chen; Ma, Wenjun; Lu, Haiyang; Zhao, Yanying; Lu, Yuanrong; Chen, Jia-Er; Yan, Xueqing
2017-07-07
With the development of laser technology, laser-driven proton acceleration provides a new method for proton tumor therapy. However, it has not been applied in practice because of the wide and decreasing energy spectrum of laser-accelerated proton beams. In this paper, we propose an analytical model to reconstruct the spread-out Bragg peak (SOBP) using laser-accelerated proton beams. Firstly, we present a modified weighting formula for protons of different energies. Secondly, a theoretical model for the reconstruction of SOBPs with laser-accelerated proton beams has been built. It can quickly calculate the number of laser shots needed for each energy interval of the laser-accelerated protons. Finally, we show the 2D reconstruction results of SOBPs for laser-accelerated proton beams and the ideal situation. The final results show that our analytical model can give an SOBP reconstruction scheme that can be used for actual tumor therapy.
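The paper's modified weighting formula is not reproduced here, but the underlying idea of weighting pristine Bragg peaks so that their sum is flat across the target can be sketched with a standard back-to-front scheme (synthetic depth-dose profiles; all names and numbers are illustrative):

```python
def sobp_weights(peaks, depths, target=1.0):
    """peaks: list of pristine depth-dose profiles (each a list over
    `depths`), ordered shallow to deep and normalized to their own maxima.
    Weights are assigned back to front so the summed dose reaches
    `target` at each constituent peak position -- a common textbook
    scheme standing in for the paper's modified weighting formula."""
    n = len(peaks)
    # index of each beam's own Bragg peak
    peak_idx = [max(range(len(depths)), key=lambda i: p[i]) for p in peaks]
    w = [0.0] * n
    for j in reversed(range(n)):
        # dose already delivered at this peak depth by deeper beams
        already = sum(w[k] * peaks[k][peak_idx[j]] for k in range(j + 1, n))
        w[j] = max(0.0, (target - already) / peaks[j][peak_idx[j]])
    return w
```

For laser-accelerated beams, each weight would then be translated into a number of laser shots per energy interval, which is the quantity the paper's analytical model computes.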
Manufacturing Methods and Technology for Microwave Stripline Circuits
1982-02-26
...to the dielectric material so it does not peel during the etching and subsequent processing. The copper cladding requirements were defined by MIL-F...the B-stage, giving acceptable peel strengths per the military requirements. For PTFE substrata printed wiring boards that are laminated using a...examining multilayers for measles and delaminations, and analytically by performing peel tests and glass transition temperatures. ...STRIPLINE
1984-03-01
...Thin Layer Chromatographic (TLC) Analyses... RESULTS AND DISCUSSION... A Beckman 5230 UV/visible spectrophotometer was used for colorimetric determinations of urea and cyanamide. Urea was hydrolyzed by urease and...correlation coefficients were 0.9999. THIN LAYER CHROMATOGRAPHIC (TLC) ANALYSES: Cellulose plates were used and were developed in the following systems: 3N
Joshi, Varsha; Kumar, Vijesh; Rathore, Anurag S
2015-08-07
A method is proposed for rapid development of a short analytical cation exchange high performance liquid chromatography method for analysis of charge heterogeneity in monoclonal antibody products. The parameters investigated and optimized include pH, shape of the elution gradient, and length of the column. The most important parameter for development of a shorter method is found to be the shape of the elution gradient. In this paper, we propose a step-by-step approach to developing a non-linear sigmoidal gradient for analysis of charge heterogeneity in two different monoclonal antibody products. This gradient not only decreases the run time of the method to 4 min, against more than 40 min for the conventional method, but also retains resolution. Superiority of the phosphate gradient over the sodium chloride gradient for elution of mAbs is also observed. The method has been successfully evaluated for specificity, sensitivity, linearity, limit of detection, and limit of quantification. Application of this method as a potential at-line process analytical technology tool is suggested. Copyright © 2015 Elsevier B.V. All rights reserved.
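A non-linear sigmoidal gradient of the kind described can be sketched as a logistic ramp of %B versus time. Parameter names and values below are illustrative, not the published method:

```python
import math

def sigmoidal_gradient(t, b_start=0.0, b_end=100.0, t_mid=2.0, steepness=3.0):
    """%B in the mobile phase at time t (min), following a logistic
    transition from b_start to b_end centred at t_mid. Larger
    `steepness` gives a sharper step; a linear gradient is the limit
    of very small steepness over the run."""
    frac = 1.0 / (1.0 + math.exp(-steepness * (t - t_mid)))
    return b_start + (b_end - b_start) * frac
```

Sampling this curve at the instrument's time points yields a gradient table: shallow at the start and end, steep through the middle, which is what compresses the elution window without merging adjacent charge variants.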
Rahman, Ziyaur; Siddiqui, Akhtar; Khan, Mansoor A
2013-12-01
The focus of the present investigation was to characterize and evaluate the variability of solid dispersions (SDs) of amorphous vancomycin (VCM), utilizing crystalline polyethylene glycol (PEG-6000) as a carrier, and subsequently to determine their percentage composition by nondestructive process analytical technology (PAT) sensors. The SDs were prepared by the heat fusion method and characterized for physicochemical and spectral properties. The SD formulations showed enhanced dissolution. Decreased crystallinity of PEG-6000 was observed, indicating that the drug was present in solution and dispersed form within the polymer. The SD formulations were homogeneous, as shown by near infrared (NIR) chemical imaging data. Principal component analysis (PCA) and partial least squares (PLS) methods were applied to NIR and PXRD (powder X-ray diffraction) data to develop models for quantification of drug and carrier. PLS of both data sets showed correlation coefficients >0.9934 with good prediction capability, as revealed by the small root mean square and standard errors. The models based on NIR and PXRD were twofold more accurate in estimating PEG-6000 than VCM. In conclusion, drug dissolution from the SDs increased with decreasing crystallinity of PEG-6000, and the chemometric models showed the usefulness of PAT sensors in estimating the percentages of both VCM and PEG-6000 simultaneously. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
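The PLS calibration step can be sketched with a minimal NIPALS PLS1 implementation, an illustrative stand-in for the chemometric software used in the study, assuming mean-centered spectra and a single response (e.g. %VCM):

```python
import numpy as np

def pls1_fit(X, y, n_components=2):
    """Minimal NIPALS PLS1. X: (samples x variables) spectra,
    y: (samples,) response. Data are mean-centered internally.
    Returns a predict(Xnew) closure."""
    X = np.asarray(X, float)
    y = np.asarray(y, float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc                    # weight vector from covariance
        w /= np.linalg.norm(w)
        t = Xc @ w                       # scores
        tt = t @ t
        p = Xc.T @ t / tt                # X loadings
        q = (yc @ t) / tt                # y loading
        Xc = Xc - np.outer(t, p)         # deflate X
        yc = yc - q * t                  # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)  # regression vector
    def predict(Xnew):
        return (np.asarray(Xnew, float) - x_mean) @ B + y_mean
    return predict
```

In practice the number of latent variables would be chosen by cross-validation, and model quality reported as correlation coefficient and root-mean-square error, as in the abstract.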
Designing Technology-Enabled Instruction to Utilize Learning Analytics
ERIC Educational Resources Information Center
Davies, Randall; Nyland, Robert; Bodily, Robert; Chapman, John; Jones, Brian; Young, Jay
2017-01-01
A key notion conveyed by those who advocate for the use of data to enhance instruction is an awareness that learning analytics has the potential to improve instruction and learning but is not currently reaching that potential. Gibbons (2014) suggested that a lack of learning facilitated by current technology-enabled instructional systems may be…
Quo vadis, analytical chemistry?
Valcárcel, Miguel
2016-01-01
This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, that should be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific ones that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.
NASA Astrophysics Data System (ADS)
Sarni, W.
2017-12-01
Water scarcity and poor water quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. The public and private sectors need vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3) global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation and to collaborate with other US and global water technology hubs. ROADMAP: * Portal (www.wetdata.org) to provide stakeholders with tools and resources to understand related water risks. * Initial activities will provide education, awareness, and tools to stakeholders to support the implementation of the Colorado State Water Plan. * Leverage the Western States Water Council Water Data Exchange database. * Develop visualization, predictive analytics, and AI tools to engage with stakeholders and provide actionable data and information. TOOLS: Education: provide information on water issues and risks at the local, state, national, and global scale. Visualizations: develop data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan. Predictive Analytics: access publicly available water databases and use machine learning to develop water availability forecasting tools and time-lapse images to support city and urban planning.
Micro- and nanofluidic systems in devices for biological, medical and environmental research
NASA Astrophysics Data System (ADS)
Evstrapov, A. A.
2017-11-01
The use of micro- and nanofluidic systems in modern analytical instruments makes it possible to implement a number of unique capabilities and achieve ultra-high measurement sensitivity. The possibility of manipulating individual biological objects (cells, bacteria, viruses, proteins, nucleic acids) in a liquid medium has driven the development of microchip-platform devices for chromatographic and electrophoretic analyses, the polymerase chain reaction, nucleic acid sequencing, immunoassays, and cytometric studies. Developments in micro- and nanofabrication technologies, materials science, surface chemistry, analytical chemistry, and cell engineering have led to the creation of unique systems such as “lab-on-a-chip”, “human-on-a-chip” and others. This article discusses materials commonly used in microfluidics and methods of making functional structures. Examples are shown of integrating nanoscale structures into microfluidic devices to implement new features and improve the technical characteristics of devices and systems.
Aptamer-Based Analysis: A Promising Alternative for Food Safety Control
Amaya-González, Sonia; de-los-Santos-Álvarez, Noemí; Miranda-Ordieres, Arturo J.; Lobo-Castañón, Maria Jesús
2013-01-01
Ensuring food safety is nowadays a top priority of authorities and professional players in the food supply chain. One of the key challenges in determining the safety of food and guaranteeing a high level of consumer protection is the availability of fast, sensitive and reliable analytical methods to identify specific hazards associated with food before they become a health problem. The limitations of existing methods have encouraged the development of new technologies, among them biosensors. Success in biosensor design depends largely on the development of novel receptors with enhanced affinity for the target, while being stable and economical. Aptamers fulfill these characteristics, and thus have surfaced as promising alternatives to natural receptors. This review describes analytical strategies developed so far using aptamers for the control of pathogens, allergens, adulterants, toxins and other forbidden contaminants to ensure food safety. The main progress to date is presented, highlighting potential prospects for the future. PMID:24287543
Stabilization of glucose-oxidase in the graphene paste for screen-printed glucose biosensor
NASA Astrophysics Data System (ADS)
Pepłowski, Andrzej; Janczak, Daniel; Jakubowska, Małgorzata
2015-09-01
Various methods and materials for enzyme stabilization within a screen-printed graphene sensor were analyzed. The main goal was to develop a technology allowing immediate printing of the biosensors in a single printing process. The factors considered were toxicity of the materials used, ability of the material to be screen-printed (squeezed through the printing mesh), and the temperatures required in the fabrication process. Performance of the examined sensors was measured using an amperometric method, and the measurements were then analyzed and compared with medical requirements. The parameters calculated were the correlation coefficient between the concentration of the analyte and the measured electrical current (0.986) and the coefficient of variation for the particular analyte concentrations used as calibration points. Variation of the measured values was significant only in ranges close to zero, decreasing for the concentrations of clinical importance. These outcomes justify further development of graphene-based biosensors fabricated through printing techniques.
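The calibration figures of merit named in the abstract above — the correlation coefficient between analyte concentration and measured current, and the per-point coefficient of variation — are straightforward to compute. The sketch below uses invented replicate readings (not the paper's data) purely to illustrate the calculation:

```python
import numpy as np

# Hypothetical amperometric calibration: replicate current readings (uA)
# at each glucose concentration (mM). All numbers are invented.
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
replicates = np.array([
    [0.9, 1.2, 1.1],
    [2.1, 2.3, 1.9],
    [5.2, 5.0, 5.4],
    [10.1, 10.4, 9.8],
    [19.5, 20.2, 20.1],
])

mean_i = replicates.mean(axis=1)

# Correlation coefficient between concentration and mean current.
r = np.corrcoef(conc, mean_i)[0, 1]

# Coefficient of variation (%) at each calibration point; as in the study,
# relative scatter is largest near zero and shrinks at clinical levels.
cv = 100 * replicates.std(axis=1, ddof=1) / mean_i
```

With data of this shape the correlation coefficient is near unity while the coefficient of variation falls monotonically from the lowest calibration point to the highest, mirroring the behavior the abstract reports.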
From pixel to voxel: a deeper view of biological tissue by 3D mass spectral imaging
Ye, Hui; Greer, Tyler; Li, Lingjun
2011-01-01
Three-dimensional mass spectral imaging (3D MSI) is an exciting field that grants the ability to study a broad mass range of molecular species, from small molecules to large proteins, by creating lateral and vertical distribution maps of select compounds. Although the general premise behind 3D MSI is simple, factors such as the choice of ionization method, sample handling, and software considerations, among many others, must be taken into account for the successful design of a 3D MSI experiment. This review provides a brief overview of ionization methods, sample preparation, software types and technological advancements driving 3D MSI research across a wide range of low- to high-mass analytes. Future perspectives in this field are also provided, concluding that this powerful analytical tool promises ever-growing applications in the biomedical field as its development continues. PMID:21320052
Food Adulteration in Switzerland: From 'Ravioli' over 'Springbok' to 'Disco Sushi'.
Hubner, Philipp
2016-01-01
The driving force behind food adulteration is monetary profit, and this has remained unchanged for at least the last hundred years. Food adulterations were and still are difficult to uncover because they occur mostly in unpredictable and unexpected ways. Very often food falsifiers take advantage of modern technology in such a way that food adulterations are difficult or sometimes even impossible to detect. Targets for food adulteration were and still are highly priced food items such as spirits, meat, seafood and olive oil. Although difficult to detect, food adulterations were in the past strong driving forces for the development of adequate detection methods in the official food control laboratories and for the enforcement of the food law. A very prominent example in this context is the 'Ravioli scandal' in Switzerland in the late 1970s, which showed that cheap second-class meat could be processed into products without being discovered for a long time. As a consequence, the official food control laboratories in Switzerland were reinforced with more laboratory equipment and technical staff. With the introduction of new detection principles such as DNA-based analytical methods, new kinds of food adulteration could and can be uncovered. Analytical methods have their limits, and in some cases of food fraud there are no analytical means to detect them. In such cases the examination of trade by checking of accounts is the method of choice.
NASA Astrophysics Data System (ADS)
Pletikapić, Galja; Ivošević DeNardis, Nadica
2017-01-01
Surface analytical methods are applied to examine the environmental status of seawaters. The present overview emphasizes the advantages of combining surface analytical methods applied to hazardous situations in the Adriatic Sea, such as monitoring the first aggregation phases of dissolved organic matter in order to potentially predict massive mucilage formation, and testing oil spill cleanup. Such an approach, based on fast and direct characterization of organic matter and its high-resolution visualization, provides a continuous-scale description of organic matter from micro- to nanometre scales. The electrochemical method of chronoamperometry at the dropping mercury electrode meets the requirements for monitoring purposes: it offers simple and fast analysis of a large number of natural seawater samples and enables simultaneous differentiation of organic constituents. In contrast, atomic force microscopy allows direct visualization of biotic and abiotic particles and provides insight into the structural organization of marine organic matter at micro- and nanometre scales. In the future, merging data at different spatial scales, taking into account experimental input on the micrometre scale, observations on the metre scale and modelling on the kilometre scale, will be important for developing sophisticated technological platforms for knowledge transfer, reports and maps applicable to marine environmental protection and management of the coastal area, especially for tourism, fishery and cruiser trafficking.
Ryals, John; Lawton, Kay; Stevens, Daniel; Milburn, Michael
2007-07-01
Metabolon is an emerging technology company developing proprietary analytical methods and software for biomarker discovery using metabolomics. The company's aim is to measure all small molecules (<1500 Da) in a biological sample. These small-molecule compounds include biochemicals of cellular metabolism and xenobiotics from diet and environment. Our proprietary mLIMS™ system contains advanced metabolomic software and automated data-processing tools that use a variety of data-analysis and quality-control algorithms to convert raw mass-spectrometry data to identified, quantitated compounds. Metabolon's primary focus is a fee-for-service business that exploits this technology for pharmaceutical and biotechnology companies, with additional clients in the consumer goods, cosmetics and agricultural industries. Fee-for-service studies are often collaborations with groups that employ a variety of technologies for biomarker discovery. Metabolon's goal is to develop technology that will automatically analyze any sample for the small-molecule components present and become a standard technology for applications in health and related sciences.
Printed organo-functionalized graphene for biosensing applications.
Wisitsoraat, A; Mensing, J Ph; Karuwan, C; Sriprachuabwong, C; Jaruwongrungsee, K; Phokharatkul, D; Daniels, T M; Liewhiran, C; Tuantranont, A
2017-01-15
Graphene is a highly promising material for biosensors due to its excellent physical and chemical properties, which facilitate electron transfer between the active sites of enzymes or other biomaterials and a transducer surface. Printing technology has recently emerged as a low-cost and practical method for fabricating flexible and disposable electronic devices. The combination of these technologies is promising for the production and commercialization of low-cost sensors. In this review, recent developments in organo-functionalized graphene and printed biosensor technologies are comprehensively covered. Firstly, various methods for printing graphene-based fluids on different substrates are discussed. Secondly, different graphene-based ink materials and preparation methods are described. Lastly, the biosensing performance of printed or printable graphene-based electrochemical and field-effect transistor sensors for some important analytes is elaborated. The reported printed graphene-based sensors exhibit promising properties with good reliability suitable for commercial applications. Among most reports, only a few printed graphene-based biosensors, including a screen-printed oxidase-functionalized graphene biosensor, have been demonstrated. The technology is still at an early stage but is rapidly growing and will attract great attention in the near future due to the increasing demand for low-cost and disposable biosensors. Copyright © 2016 Elsevier B.V. All rights reserved.
Seamless Digital Environment – Plan for Data Analytics Use Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oxstrand, Johanna; Bly, Aaron
The U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program initiated research into what is needed to provide a roadmap or model for nuclear power plants to reference when building an architecture that can support the growing data supply and demand flowing through their networks. The Digital Architecture project's published report, Digital Architecture Planning Model (Oxstrand et al., 2016), discusses things to consider when building an architecture to support the increasing needs and demands of data throughout the plant. Once the plant is able to support the data demands, it still needs to be able to provide the data in an easy, quick and reliable manner. A common method is to create a “one stop shop” application that a user can go to for all the data they need. This leads to the need to create a Seamless Digital Environment (SDE) to integrate all the “siloed” data. An SDE is the desired perception that should be presented to users by gathering the data from any data source (e.g., legacy applications and work management systems) without effort by the user. The goal for FY16 was to complete a feasibility study for data mining and analytics employing information from computer-based-procedure-enabled technologies for use in developing improved business analytics. The research team collaborated with multiple organizations to identify use cases or scenarios that could be beneficial to investigate in a feasibility study. Many interesting potential use cases were identified throughout the FY16 activity. Unfortunately, due to factors outside the research team's control, none of the studies were initiated this year. However, the insights gained and the relationships built with both PVNGS and NextAxiom will be valuable when moving forward with future research.
During the 2016 annual Nuclear Information Technology Strategic Leadership (NITSL) group meeting it was identified that it would be very beneficial to the industry to support a research effort focused on data analytics. It was suggested that the effort would develop and evaluate use cases for data mining and analytics employing information from plant sensors and databases for use in developing improved business analytics.
Considerations in the development of circulating tumor cell technology for clinical use
2012-01-01
This manuscript summarizes current thinking on the value and promise of evolving circulating tumor cell (CTC) technologies for cancer patient diagnosis, prognosis, and response to therapy, as well as for accelerating oncologic drug development. Moving forward requires the application of the classic steps in biomarker development: analytical and clinical validation, and clinical qualification for specific contexts of use. To that end, this review describes methods for interactive comparisons of proprietary new technologies, clinical trial designs, a clinical validation qualification strategy, and an approach for effectively carrying out this work through a public-private partnership that includes test developers, drug developers, clinical trialists, the US Food & Drug Administration (FDA) and the US National Cancer Institute (NCI). PMID:22747748
Packer, Nicolle H.; Schulz, Benjamin L.
2016-01-01
The glycoproteome remains severely understudied because of significant analytical challenges associated with glycoproteomics, the system-wide analysis of intact glycopeptides. This review introduces important structural aspects of protein N-glycosylation and summarizes the latest technological developments and applications in LC-MS/MS-based qualitative and quantitative N-glycoproteomics. These maturing technologies provide unique structural insights into the N-glycoproteome and its synthesis and regulation by complementing existing methods in glycoscience. Modern glycoproteomics is now sufficiently mature to initiate efforts to capture the molecular complexity displayed by the N-glycoproteome, opening exciting opportunities to increase our understanding of the functional roles of protein N-glycosylation in human health and disease. PMID:26929216
Microscale technology and biocatalytic processes: opportunities and challenges for synthesis.
Wohlgemuth, Roland; Plazl, Igor; Žnidaršič-Plazl, Polona; Gernaey, Krist V; Woodley, John M
2015-05-01
Despite the expanding presence of microscale technology in chemical synthesis and energy production as well as in biomedical devices and analytical and diagnostic tools, its potential in biocatalytic processes for pharmaceutical and fine chemicals, as well as related industries, has not yet been fully exploited. The aim of this review is to shed light on the strategic advantages of this promising technology for the development and realization of biocatalytic processes and subsequent product recovery steps, demonstrated with examples from the literature. Constraints, opportunities, and the future outlook for the implementation of these key green engineering methods and the role of supporting tools such as mathematical models to establish sustainable production processes are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.
Ion beams provided by small accelerators for material synthesis and characterization
NASA Astrophysics Data System (ADS)
Mackova, Anna; Havranek, Vladimir
2017-06-01
Compact, multipurpose electrostatic tandem accelerators are extensively used to produce ion beams of almost all elements of the periodic table, with energies in the range from 400 keV to 24 MeV, for trace element analysis by nuclear analytical methods. The ion beams produced by small accelerators have broad applications, mainly for material characterization (Rutherford Backscattering Spectrometry, Particle-Induced X-ray Emission analysis, Nuclear Reaction Analysis and ion microprobes with 1 μm lateral resolution, among others) and for high-energy implantation. Material research is a traditionally progressive field of technology. Due to continuous miniaturization, the underlying structures are far beyond the analytical limits of most conventional methods. Ion Beam Analysis (IBA) techniques provide this possibility, as they use probes of similar or much smaller dimensions (particles, radiation). Ion beams can be used for the synthesis of new progressive functional nanomaterials for optics, electronics and other applications. Ion beams are extensively used in studies of fundamental energetic-ion interactions with matter, as well as in novel nanostructure synthesis using ion beam irradiation of various amorphous and crystalline materials in order to obtain structures with extraordinary functional properties. IBA methods serve the investigation of materials from material research, industry, micro- and nanotechnology, electronics, optics and laser technology, and chemical, biological and environmental investigations in general. Main research directions in laboratories employing small accelerators also include the preparation and characterization of micro- and nanostructured materials of interest for basic and oriented research in material science, and various studies of biological, geological, environmental and cultural heritage artefacts are provided as well.
Addressing trend-related changes within cumulative effects studies in water resources planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Canter, L.W., E-mail: envimptr@aol.com; Chawla, M.K.; Swor, C.T.
2014-01-15
Summarized herein are 28 case studies wherein trend-related causative physical, social, or institutional changes were connected to consequential changes in runoff, water quality, and riparian and aquatic ecological features. The reviewed cases were systematically evaluated relative to their identified environmental effects; their usage of analytical frameworks and appropriate models, methods, and technologies; and the attention given to mitigation and/or management of the resultant causative and consequential changes. These changes also represent important considerations in project design and operation, and in cumulative effects studies associated therewith. The cases were grouped into five categories: institutional changes associated with legislation and policies (seven cases); physical changes from land use changes in urbanizing watersheds (eight cases); physical changes from land use changes and development projects in watersheds (four cases); physical, institutional, and social changes from land use and related policy changes in river basins (three cases); and multiple changes within a comprehensive study of land use and policy changes in the Willamette River Basin in Oregon (six cases). A tabulation of 110 models, methods and technologies used in the studies is also presented. General observations from this review were that the features were unique for each case; the consequential changes were logically based on the causative changes; the analytical frameworks provided relevant structures for the studies; and the identified methods and technologies were pertinent for addressing both the causative and consequential changes. One key lesson was that the cases provide useful, “real-world” illustrations of the importance of addressing trend-related changes in cumulative effects studies within water resources planning. Accordingly, they could be used as an “initial tool kit” for addressing trend-related changes.
Yassine, Mahmoud M; Dabek-Zlotorzynska, Ewa; Celo, Valbona
2012-03-16
The use of urea-based selective catalytic reduction (SCR) technology for the reduction of NOx from the exhaust of diesel-powered vehicles has the potential to emit at least six thermal decomposition by-products, ammonia, and unreacted urea from the tailpipe. These compounds may include biuret, dicyandiamide, cyanuric acid, ammelide, ammeline and melamine. In the present study, a simple, sensitive and reliable hydrophilic interaction liquid chromatography (HILIC)-electrospray ionization (ESI)/mass spectrometry (MS) method without complex sample pre-treatment was developed for the identification and determination of urea decomposition by-products in diesel exhaust. Gradient separation was performed on a SeQuant ZIC-HILIC column with a highly polar zwitterionic stationary phase, using a mobile phase consisting of acetonitrile (eluent A) and 15 mM ammonium formate (pH 6; eluent B). Detection and quantification were performed using a quadrupole ESI/MS operated simultaneously in negative and positive mode. With a 10 μL injection volume, LODs for all target analytes were in the range of 0.2-3 μg/L. The method showed good inter-day precision of retention time (RSD<0.5%) and peak area (RSD<3%). Satisfactory extraction recoveries from spiked blanks ranged between 96 and 98%. Analyses of samples collected during transient chassis dynamometer tests of a bus engine equipped with a diesel particulate filter (DPF) and urea-based SCR technology showed the presence of five target analytes, with cyanuric acid and ammelide the most abundant compounds in the exhaust. Crown Copyright © 2012. Published by Elsevier B.V. All rights reserved.
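The abstract above reports LODs of 0.2-3 μg/L but not how they were derived. A common calibration-curve approach (an assumption here, not necessarily the authors' procedure) estimates LOD as 3.3·σ/slope and LOQ as 10·σ/slope, where σ is the residual standard deviation of a low-level calibration fit; the numbers below are invented for illustration:

```python
import numpy as np

# Hypothetical low-level calibration for one target analyte:
# concentration (ug/L) vs. chromatographic peak area. Values are invented.
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
area = np.array([52.0, 101.0, 205.0, 498.0, 1003.0])

# Linear fit: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)

# Residual standard deviation of the fit (n - 2 degrees of freedom).
resid = area - (slope * conc + intercept)
sigma = np.sqrt(np.sum(resid ** 2) / (len(conc) - 2))

# ICH-style estimates of the detection and quantification limits.
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
```

For this invented calibration the estimated LOD falls below 1 μg/L, the same order as the values the method reports; in practice the fit would use calibration standards bracketing the expected detection limit.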
Analytical methods for determination of mycotoxins: An update (2009-2014).
Turner, Nicholas W; Bramhmbhatt, Heli; Szabo-Vezse, Monika; Poma, Alessandro; Coker, Raymond; Piletsky, Sergey A
2015-12-11
Mycotoxins are a problematic and toxic group of small organic molecules that are produced as secondary metabolites by several fungal species that colonise crops. They lead to contamination at both the field and postharvest stages of food production, with a considerable range of foodstuffs affected, from coffee and cereals to dried fruit and spices. With the wide-ranging structural diversity of mycotoxins, the severe toxic effects caused by these molecules, and their high chemical stability, the requirement for robust and effective detection methods is clear. This paper builds on our previous review and summarises the most recent advances in this field, in the years 2009-2014 inclusive. This review summarises traditional methods such as chromatographic and immunochemical techniques, as well as newer approaches such as biosensors and optical techniques, which are becoming more prevalent. A section on sampling and sample treatment has been prepared to highlight the importance of this step in the analytical methods. We close with a look at emerging technologies that will bring effective and rapid analysis out of the laboratory and into the field. Copyright © 2015 Elsevier B.V. All rights reserved.
Hill, Ryan C; Oman, Trent J; Shan, Guomin; Schafer, Barry; Eble, Julie; Chen, Cynthia
2015-08-26
Currently, traditional immunochemistry technologies such as enzyme-linked immunosorbent assays (ELISA) are the predominant analytical tool used to measure levels of recombinant proteins expressed in genetically engineered (GE) plants. Recent advances in agricultural biotechnology have created a need to develop methods capable of selectively detecting and quantifying multiple proteins in complex matrices because of increasing numbers of transgenic proteins being coexpressed or "stacked" to achieve tolerance to multiple herbicides or to provide multiple modes of action for insect control. A multiplexing analytical method utilizing liquid chromatography with tandem mass spectrometry (LC-MS/MS) has been developed and validated to quantify three herbicide-tolerant proteins in soybean tissues: aryloxyalkanoate dioxygenase (AAD-12), 5-enol-pyruvylshikimate-3-phosphate synthase (2mEPSPS), and phosphinothricin acetyltransferase (PAT). Results from the validation showed high recovery and precision over multiple analysts and laboratories. Results from this method were comparable to those obtained with ELISA with respect to protein quantitation, and the described method was demonstrated to be suitable for multiplex quantitation of transgenic proteins in GE crops.
Training the next generation analyst using red cell analytics
NASA Astrophysics Data System (ADS)
Graham, Meghan N.; Graham, Jacob L.
2016-05-01
We have seen significant change in the study and practice of human reasoning in recent years, from both a theoretical and a methodological perspective. Ubiquitous communication coupled with advances in computing and a plethora of analytic support tools have created a push for instantaneous reporting and analysis. This notion is particularly prevalent in law enforcement, emergency services and the intelligence community (IC), where commanders (and their civilian leadership) expect not only a bird's-eye view of operations as they occur, but a play-by-play analysis of operational effectiveness. This paper explores the use of Red Cell Analytics (RCA) as pedagogy to train the next-gen analyst. A group of Penn State students in the College of Information Sciences and Technology at the University Park campus of The Pennsylvania State University have been practicing Red Team Analysis since 2008. RCA draws heavily from the military application of the same concept, except that student RCA problems are typically non-military in nature. RCA students utilize a suite of analytic tools and methods to explore and develop red-cell tactics, techniques and procedures (TTPs), and apply their tradecraft across a broad threat spectrum, from student-life issues to threats to national security. The strength of RCA is not always realized in the solution but in the exploration of the analytic pathway. This paper describes the concept and use of red cell analytics to teach and promote the use of structured analytic techniques, analytic writing and critical thinking in the areas of security, risk and intelligence training.
Analysis of Low-Biomass Microbial Communities in the Deep Biosphere.
Morono, Y; Inagaki, F
2016-01-01
Over the past few decades, the subseafloor biosphere has been explored by scientific ocean drilling to depths of about 2.5 km below the seafloor. Although organic-rich anaerobic sedimentary habitats in the ocean margins harbor large numbers of microbial cells, microbial populations in ultraoligotrophic aerobic sedimentary habitats in the open ocean gyres are several orders of magnitude less abundant. Despite advances in cultivation-independent molecular ecological techniques, exploring low-biomass environments remains technologically challenging, especially in the deep subseafloor biosphere. After reviewing the historical background of deep-biosphere analytical methods, the importance of obtaining clean samples and tracing contamination, methods for detecting microbial life, technological aspects of molecular microbiology, and the detection of subseafloor metabolic activity will be discussed. Copyright © 2016 Elsevier Inc. All rights reserved.
Comprehensive rotorcraft analysis methods
NASA Technical Reports Server (NTRS)
Stephens, Wendell B.; Austin, Edward E.
1988-01-01
The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large-scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustics, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications, including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years, and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems, are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).
Guzman, Norberto A.; Blanc, Timothy; Phillips, Terry M.
2009-01-01
In the last few years, there has been a greater appreciation by the scientific community of how separation science has contributed to the advancement of biomedical research. Despite past contributions in facilitating several biomedical breakthroughs, separation sciences still urgently need the development of improved methods for the separation and detection of biological and chemical substances. In particular, the challenging task of quantifying small molecules and biomolecules found in low abundance in complex matrices (e.g., serum) is an area in need of new high-efficiency techniques. The tandem, or on-line, coupling of highly selective antibody capture agents with the high resolving power of CE is being recognized as a powerful analytical tool for the enrichment and quantification of ultra-low-abundance analytes in complex matrices. This development will have a significant impact on the identification and characterization of many putative biomarkers and on biomedical research in general. Immunoaffinity CE (IACE) technology is rapidly emerging as the most promising method for the analysis of low-abundance biomarkers; its power comes from a three-step procedure: (i) bioselective adsorption and (ii) subsequent recovery of compounds from an immobilized affinity ligand, followed by (iii) separation of the enriched compounds. This technology is highly suited to automation and can be engineered as a multiplex instrument capable of routinely performing hundreds of assays per day. Furthermore, a significant enhancement in sensitivity can be achieved for the purified and enriched affinity-targeted analytes. Thus, a compound that exists in a complex biological matrix at a concentration far below its LOD is easily brought to well within its range of quantification.
The present review summarizes several applications of IACE, as well as a chronological description of the improvements made in the fabrication of the analyte concentrator-microreactor device leading to the development of a multidimensional biomarker analyzer. PMID:18646282
NASA Technical Reports Server (NTRS)
Dubinskiy, Mark A.; Kamal, Mohammed M.; Misra, Prabhaker
1995-01-01
The availability of manned laboratory facilities in space offers wonderful opportunities and challenges in microgravity combustion science and technology. In turn, the fundamentals of microgravity combustion science can be studied via spectroscopic characterization of free radicals generated in flames. The laser-induced fluorescence (LIF) technique is a noninvasive method of considerable utility in combustion physics and chemistry, suitable not only for monitoring specific species and their kinetics but also for imaging of flames. This makes LIF one of the most important tools for microgravity combustion science. Flame characterization under microgravity conditions using LIF is expected to be more informative than other methods aimed at searching for effects, such as the pumping phenomenon, that can be modeled via ground-level experiments. A primary goal of our work was to devise an innovative approach to an LIF-based analytical unit suitable for in-space flame characterization. It was decided to follow two approaches in tandem: (1) use the existing laboratory (non-portable) equipment and determine the optimal set of flame parameters that can be used as analytical criteria for flame characterization under microgravity conditions; and (2) use state-of-the-art developments in laser technology and concentrate some effort on devising a layout for the portable analytical equipment. This paper presents an up-to-date summary of the results of our experiments aimed at the creation of a portable device for combustion studies in a microgravity environment, based on a portable UV tunable solid-state laser for excitation of free radicals normally present in flames in detectable amounts. A systematic approach has allowed us to make a convenient choice of species under investigation, as well as of the proper tunable laser system, and has also enabled us to carry out LIF experiments on free radicals using a solid-state laser tunable in the UV.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Feng; Liu, Yijin; Yu, Xiqian
Rechargeable battery technologies have ignited major breakthroughs in contemporary society, including but not limited to revolutions in transportation, electronics, and grid energy storage. The remarkable development of rechargeable batteries is largely attributed to in-depth efforts to improve battery electrode and electrolyte materials. There are, however, still intimidating challenges of lower cost, longer cycle and calendar life, higher energy density, and better safety for large-scale energy storage and vehicular applications. Further progress with rechargeable batteries may require new chemistries (lithium-ion batteries and beyond) and better understanding of materials electrochemistry in the various battery technologies. In the past decade, advancement of battery materials has been complemented by new analytical techniques that are capable of probing battery chemistries at various length and time scales. Synchrotron X-ray techniques stand out as among the most effective methods, allowing nearly nondestructive probing of materials characteristics such as electronic and geometric structures with various depth sensitivities through spectroscopy, scattering, and imaging capabilities. This article begins with a discussion of various rechargeable batteries and associated important scientific questions in the field, followed by a review of synchrotron X-ray based analytical tools (scattering, spectroscopy and imaging) and their successful applications (ex situ, in situ, and in operando) in gaining fundamental insights into these scientific questions. Furthermore, electron microscopy and spectroscopy complement the detection length scales of synchrotron X-ray tools, and are also discussed towards the end.
We highlight the importance of studying battery materials by combining analytical techniques with complementary length sensitivities, such as the combination of X-ray absorption spectroscopy and electron spectroscopy with spatial resolution, because a sole technique may lead to biased and inaccurate conclusions. We then discuss the current progress of experimental design for synchrotron experiments and methods to mitigate beam effects. Finally, a perspective is provided to elaborate how synchrotron techniques can impact the development of next-generation battery chemistries.
Lin, Feng; Liu, Yijin; Yu, Xiqian; ...
2017-08-30
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nimbalkar, Sachin U.; Guo, Wei; Wenning, Thomas J.
Smart manufacturing and advanced data analytics can help the manufacturing sector unlock energy efficiency from the equipment level to the entire manufacturing facility and the whole supply chain. These technologies can make manufacturing industries more competitive, with intelligent communication systems, real-time energy savings, and increased energy productivity. Smart manufacturing can give all employees in an organization the actionable information they need, when they need it, so that each person can contribute to the optimal operation of the corporation through informed, data-driven decision making. This paper examines smart technologies and data analytics approaches for improving energy efficiency and reducing energy costs in process-supporting energy systems. It dives into energy-saving improvement opportunities through smart manufacturing technologies and sophisticated data collection and analysis. The energy systems covered in this paper include those with motors and drives, fans, pumps, air compressors, steam, and process heating.
Solid-phase microextraction technology for in vitro and in vivo metabolite analysis
Zhang, Qihui; Zhou, Liandi; Chen, Hua; Wang, Chong-Zhi; Xia, Zhining; Yuan, Chun-Su
2016-01-01
Analysis of endogenous metabolites in biological samples may lead to the identification of biomarkers in metabolomics studies. To achieve accurate sample analysis, a combined method of continuous quick sampling and extraction is required for online compound detection. Solid-phase microextraction (SPME) integrates sampling, extraction and concentration into a single solvent-free step for chemical analysis. SPME has a number of advantages, including simplicity, high sensitivity and a relatively non-invasive nature. In this article, we reviewed SPME technology in in vitro and in vivo analyses of metabolites after the ingestion of herbal medicines, foods and pharmaceutical agents. The metabolites of microorganisms in dietary supplements and in the gastrointestinal tract will also be examined. As a promising technology in biomedical and pharmaceutical research, SPME and its future applications will depend on advances in analytical technologies and material science. PMID:27695152
Immunosensors using a quartz crystal microbalance
NASA Astrophysics Data System (ADS)
Kurosawa, Shigeru; Aizawa, Hidenobu; Tozuka, Mitsuhiro; Nakamura, Miki; Park, Jong-Won
2003-11-01
Better analytical technology has been demanded for accurate and rapid determination of trace amounts of chemical compounds, such as marker proteins for disease or endocrine disrupters like dioxin, which might be contained in blood, food and the environment. The study of immunosensors using a quartz crystal microbalance (QCM) has recently focused on conventional detection methods for the determination of chemical compounds together with the development of reagents and processes. This paper introduces the principle of the detection method of QCM immunosensors developed at AIST and its application to the detection of trace amounts of chemical compounds.
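The QCM transduction principle described above converts adsorbed mass into a resonance-frequency shift through the Sauerbrey relation. A minimal sketch of that conversion; the crystal frequency, electrode area, and frequency shift below are illustrative values, not taken from the paper:

```python
import math

def sauerbrey_mass_change(delta_f_hz, f0_hz, area_cm2):
    """Mass change (g) from a QCM frequency shift via the Sauerbrey equation:
    delta_m = -delta_f * A * sqrt(rho_q * mu_q) / (2 * f0**2).
    Valid only for thin, rigid films."""
    rho_q = 2.648    # quartz density, g/cm^3
    mu_q = 2.947e11  # quartz shear modulus, g/(cm*s^2)
    return -delta_f_hz * area_cm2 * math.sqrt(rho_q * mu_q) / (2.0 * f0_hz ** 2)

# Hypothetical example: a -50 Hz shift on a 9 MHz crystal, 0.2 cm^2 electrode
dm = sauerbrey_mass_change(-50.0, 9.0e6, 0.2)
print(f"adsorbed mass: {dm * 1e9:.1f} ng")
```

A negative frequency shift corresponds to a positive adsorbed mass, which is the signal an antigen-antibody binding event produces on the immunosensor surface.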
Covaci, Adrian; Voorspoels, Stefan; Abdallah, Mohamed Abou-Elwafa; Geens, Tinne; Harrad, Stuart; Law, Robin J
2009-01-16
The present article reviews the available literature on the analytical and environmental aspects of tetrabromobisphenol-A (TBBP-A), a currently intensively used brominated flame retardant (BFR). Analytical methods, including sample preparation, chromatographic separation, detection techniques, and quality control, are discussed. An important recent development in the analysis of TBBP-A is the growing tendency toward liquid chromatographic techniques. At the detection stage, mass spectrometry is a well-established and reliable technology for the identification and quantification of TBBP-A. Although interlaboratory exercises for BFRs have grown in popularity in the last 10 years, only a few participating laboratories report concentrations for TBBP-A. Environmental levels of TBBP-A in abiotic and biotic matrices are low, probably due to the major use of TBBP-A as a reactive FR. As a consequence, the expected human exposure is low. This is in agreement with the EU risk assessment, which concluded that there is no risk for humans concerning TBBP-A exposure. Much less analytical and environmental information exists for the various groups of TBBP-A derivatives, which are largely used as additive flame retardants.
NASA Astrophysics Data System (ADS)
Ranamukhaarachchi, Sahan A.; Padeste, Celestino; Häfeli, Urs O.; Stoeber, Boris; Cadarso, Victor J.
2018-02-01
A hollow metallic microneedle is integrated with microfluidics and photonic components to form a microneedle-optofluidic biosensor suitable for therapeutic drug monitoring (TDM) in biological fluids, like interstitial fluid, that can be collected in a painless and minimally-invasive manner. The microneedle inner lumen surface is bio-functionalized to trap and bind target analytes on-site in a sample volume as small as 0.6 nl, and houses an enzyme-linked assay on its 0.06 mm2 wall. The optofluidic components are designed to rapidly quantify target analytes present in the sample and collected in the microneedle using a simple and sensitive absorbance scheme. This contribution describes how the biosensor components were optimized to detect in vitro streptavidin-horseradish peroxidase (Sav-HRP) as a model analyte over a large detection range (0-7.21 µM) and a very low limit of detection (60.2 nM). This biosensor utilizes the lowest analyte volume reported for TDM with microneedle technology, and presents significant avenues to improve current TDM methods for patients, by potentially eliminating blood draws for several drug candidates.
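A limit of detection such as the 60.2 nM figure is conventionally estimated from blank variability and the calibration slope (LOD = 3.3·σ_blank / slope). A minimal sketch of that calculation; the blank absorbance readings and slope below are hypothetical, not the authors' data:

```python
import statistics

def limit_of_detection(blank_signals, slope):
    """IUPAC-style estimate: LOD = 3.3 * sigma_blank / calibration slope."""
    return 3.3 * statistics.stdev(blank_signals) / slope

# Hypothetical blank absorbance readings (AU) and calibration slope (AU/uM)
blanks = [0.012, 0.010, 0.013, 0.011, 0.012]
slope = 0.062
lod_uM = limit_of_detection(blanks, slope)
print(f"LOD ~ {lod_uM * 1000:.0f} nM")
```

A steeper calibration slope or quieter blanks lowers the LOD, which is why the absorbance optics of the optofluidic readout matter as much as the assay chemistry.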
Analytical Approaches to Verify Food Integrity: Needs and Challenges.
Stadler, Richard H; Tran, Lien-Anh; Cavin, Christophe; Zbinden, Pascal; Konings, Erik J M
2016-09-01
A brief overview of the main analytical approaches and practices to determine food authenticity is presented, addressing, as well, food supply chain and future requirements to more effectively mitigate food fraud. Food companies are introducing procedures and mechanisms that allow them to identify vulnerabilities in their food supply chain under the umbrella of a food fraud prevention management system. A key step and first line of defense is thorough supply chain mapping and full transparency, assessing the likelihood of fraudsters to penetrate the chain at any point. More vulnerable chains, such as those where ingredients and/or raw materials are purchased through traders or auctions, may require a higher degree of sampling, testing, and surveillance. Access to analytical tools is therefore pivotal, requiring continuous development and possibly sophistication in identifying chemical markers, data acquisition, and modeling. Significant progress in portable technologies is evident already today, for instance, as in the rapid testing now available at the agricultural level. In the near future, consumers may also have the ability to scan products in stores or at home to authenticate labels and food content. For food manufacturers, targeted analytical methods complemented by untargeted approaches are end control measures at the factory gate when the material is delivered. In essence, testing for food adulterants is an integral part of routine QC, ideally tailored to the risks in the individual markets and/or geographies or supply chains. The development of analytical methods is a first step in verifying the compliance and authenticity of food materials. A next, more challenging step is the successful establishment of global consensus reference methods as exemplified by the AOAC Stakeholder Panel on Infant Formula and Adult Nutritionals initiative, which can serve as an approach that could also be applied to methods for contaminants and adulterants in food. 
The food industry has taken these many challenges on board, working closely with all stakeholders and continuously communicating on progress in a fully transparent manner.
Evaluation of two methods to determine glyphosate and AMPA in soils of Argentina
NASA Astrophysics Data System (ADS)
De Geronimo, Eduardo; Lorenzon, Claudio; Iwasita, Barbara; Faggioli, Valeria; Aparicio, Virginia; Costa, Jose Luis
2017-04-01
Argentine agricultural production is fundamentally based on a technological package combining no-tillage and the dependence on glyphosate applications to control weeds in transgenic crops (soybean, maize and cotton). Therefore, glyphosate is the most widely employed herbicide in the country, where 180 to 200 million liters are applied every year. Due to its widespread use, it is important to assess its impact on the environment, and reliable analytical methods are therefore mandatory. The glyphosate molecule exhibits unique physical and chemical characteristics that make its quantification difficult, especially in soils with high organic matter content, such as the central eastern Argentine soils, where strong interferences are normally observed. The objective of this work was to compare two methods for the extraction and quantification of glyphosate and AMPA in samples of 8 representative soils of Argentina. The first analytical method (method 1) was based on the use of phosphate buffer as extracting solution and dichloromethane to minimize matrix organic content. In the second method (method 2), potassium hydroxide was used to extract the analytes, followed by a clean-up step using solid phase extraction (SPE) to minimize strong interferences. Sensitivity, recoveries, matrix effects and robustness were evaluated. Both methodologies involved derivatization with 9-fluorenyl-methyl-chloroformate (FMOC) in borate buffer and detection based on ultra-high-pressure liquid chromatography coupled to tandem mass spectrometry (UHPLC-MS/MS). Recoveries obtained from soil samples spiked at 0.1 and 1 mg kg⁻¹ were satisfactory in both methods (70%-120%). However, there was a remarkable difference regarding the matrix effect: the SPE clean-up step (method 2) was insufficient to remove the interferences, whereas the dilution and dichloromethane clean-up (method 1) were more effective at minimizing ionic suppression.
Moreover, method 1 had fewer steps in the sample-processing protocol than method 2. This can be highly valuable in routine lab work because it reduces potential undesired errors such as loss of analyte or sample contamination. In addition, replacing SPE with the dichloromethane clean-up brought a considerable reduction in the analytical costs of method 1. We conclude that method 1 is simpler and cheaper than method 2, as well as reliable for quantifying glyphosate in Argentinean soils. We hope that this experience can be useful in simplifying protocols for glyphosate quantification and can contribute to the understanding of the fate of this herbicide in the environment.
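The recovery and matrix-effect criteria used to compare the two methods follow standard definitions: recovery relates the measured spike to the added amount, and matrix effect compares calibration slopes in matrix extract versus pure solvent. A minimal sketch with hypothetical numbers (not the study's measurements):

```python
def recovery_percent(measured, spiked, unspiked=0.0):
    """Recovery (%) of a spike: (measured - background) / added * 100."""
    return (measured - unspiked) / spiked * 100.0

def matrix_effect_percent(slope_matrix, slope_solvent):
    """Matrix effect (%): negative values indicate ion suppression,
    positive values indicate enhancement."""
    return (slope_matrix / slope_solvent - 1.0) * 100.0

# Hypothetical spike at 0.1 mg/kg measured back at 0.085 mg/kg
rec = recovery_percent(0.085, 0.1)
print(f"recovery: {rec:.0f}%  within 70-120%: {70 <= rec <= 120}")

# Hypothetical calibration slopes in soil extract vs. pure solvent
me = matrix_effect_percent(0.72, 1.0)
print(f"matrix effect: {me:.0f}% (ion suppression)")
```

Under these definitions, both methods could show acceptable recovery while still differing sharply in matrix effect, which is exactly the distinction the abstract draws.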
Meder, Roger; Stahl, Wolfgang; Warburton, Paul; Woolley, Sam; Earnshaw, Scott; Haselhofer, Klaus; van Langenberg, Ken; Ebdon, Nick; Mulder, Roger
2017-01-01
The reactivity of melamine-urea-formaldehyde resins is of key importance in the manufacture of engineered wood products such as medium-density fibreboard (MDF) and other wood composites. Often the MDF manufacturing plant has little available information on resin reactivity other than details of the resin specification at the time of batch manufacture, which frequently occurs off-site at a third-party resin plant. Moreover, fresh resin on delivery at the MDF plant is mixed with a variable volume of aged resin in storage tanks, thereby rendering any specification of the fresh resin batch obsolete. It is therefore highly desirable to develop a real-time, at-line or on-line process analytical technology to monitor the quality of the resin prior to MDF panel manufacture. Near-infrared (NIR) spectroscopy has been calibrated against standard quality methods and against ¹³C nuclear magnetic resonance (NMR) measures of molecular composition in order to provide at-line process analytical technology (PAT) to monitor resin quality, particularly the formaldehyde content of the resin. At-line determination of formaldehyde content in the resin was made possible using a six-factor calibration with an R²(cal) of 0.973, an R²(CV) of 0.929, and a root-mean-square error of cross-validation (RMSECV) of 0.01. This calibration was then used to generate control charts of formaldehyde content at regular four-hourly intervals during MDF panel manufacture in a commercial plant.
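The quoted calibration statistics (R² of the fit and the cross-validated RMSECV) can be illustrated on a toy problem. The paper used a six-factor PLS model on NIR spectra; the sketch below substitutes a simple least-squares line with leave-one-out cross-validation, and all data points are hypothetical:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def r_squared(ys, preds):
    """Coefficient of determination of predictions against observations."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - p) ** 2 for y, p in zip(ys, preds))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

def loo_rmsecv(xs, ys):
    """Leave-one-out root-mean-square error of cross-validation."""
    errs = []
    for i in range(len(xs)):
        xt, yt = xs[:i] + xs[i + 1:], ys[:i] + ys[i + 1:]
        a, b = fit_line(xt, yt)
        errs.append((ys[i] - (a + b * xs[i])) ** 2)
    return (sum(errs) / len(errs)) ** 0.5

# Hypothetical NIR-derived predictor vs. formaldehyde content (wt%)
x = [0.10, 0.15, 0.20, 0.25, 0.30, 0.35]
y = [0.61, 0.72, 0.83, 0.95, 1.04, 1.16]
a, b = fit_line(x, y)
preds = [a + b * xi for xi in x]
print(f"R2(cal) = {r_squared(y, preds):.3f}, RMSECV = {loo_rmsecv(x, y):.3f}")
```

RMSECV is deliberately computed on samples held out of the fit, which is why it is the more honest figure for at-line prediction performance than R²(cal).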
Electric vehicle propulsion alternatives
NASA Technical Reports Server (NTRS)
Secunde, R. R.; Schuh, R. M.; Beach, R. F.
1983-01-01
Propulsion technology development for electric vehicles is summarized. Analytical studies, technology evaluation, and the development of technology for motors, controllers, transmissions, and complete propulsion systems are included.
Spacecraft drag-free technology development: On-board estimation and control synthesis
NASA Technical Reports Server (NTRS)
Key, R. W.; Mettler, E.; Milman, M. H.; Schaechter, D. B.
1982-01-01
Estimation and control methods for a drag-free spacecraft are discussed. The functional and analytical synthesis of on-board estimators and controllers for an integrated attitude and translation control system is presented. The framework for detailed definition and design of the baseline drag-free system is created. The techniques for the solution of self-gravity and electrostatic-charging problems are generally applicable, as is the control system development.
2011-09-01
project research addresses our long-term goal to develop an analytical suite of the Advanced Laser Fluorescence (ALF) methods and instruments to improve...demonstrated ALF utility as an integrated tool for aquatic research and observations. The ALF integration into the major oceanographic programs is...currently in progress, including the California Current Ecosystem Long Term Ecological Research (CCE LTER, NSF) and California Cooperative Oceanic
Electron Beam Melting and Refining of Metals: Computational Modeling and Optimization
Vutova, Katia; Donchev, Veliko
2013-01-01
Computational modeling offers an opportunity for a better understanding and investigation of thermal transfer mechanisms. It can be used for the optimization of the electron beam melting process and for obtaining new materials with improved characteristics that have many applications in the power industry, medicine, instrument engineering, electronics, etc. A time-dependent 3D axisymmetric heat model for simulation of thermal transfer in metal ingots solidified in a water-cooled crucible during electron beam melting and refining (EBMR) is developed. The model predicts the change in the temperature field in the casting ingot during the interaction of the beam with the material. A modified Pismen-Rekford numerical scheme is developed to discretize the analytical model. The equation systems describing the thermal processes, together with the main characteristics of the developed numerical method, are presented. In order to optimize the technological regimes, different criteria for better refinement and for obtaining dendrite crystal structures are proposed. Analytical problems of mathematical optimization are formulated, discretized and heuristically solved by cluster methods. Using simulation results that are important for practice, suggestions can be made for optimizing EBMR technology. The proposed tool is useful for studying, controlling, and optimizing EBMR process parameters and for improving the quality of the newly produced materials. PMID:28788351
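The paper's model is 3D axisymmetric and uses an implicit discretization scheme; purely as a schematic of the finite-difference idea behind such heat-transfer simulations, here is a 1D explicit (FTCS) update with water-cooled Dirichlet boundaries. This is a toy sketch, not the authors' model, and all values are illustrative:

```python
def heat_step(T, alpha, dx, dt):
    """One explicit FTCS update of the 1D heat equation
    dT/dt = alpha * d2T/dx2, with fixed (Dirichlet) boundary values.
    Stable only when r = alpha*dt/dx**2 <= 0.5."""
    r = alpha * dt / dx ** 2
    assert r <= 0.5, "explicit step would be unstable"
    new = T[:]
    for i in range(1, len(T) - 1):
        new[i] = T[i] + r * (T[i + 1] - 2 * T[i] + T[i - 1])
    return new

# Hot ingot interior cooling toward water-cooled walls held at 300 K
T = [300.0] + [1800.0] * 8 + [300.0]
for _ in range(200):
    T = heat_step(T, alpha=1e-5, dx=0.01, dt=2.0)
print([round(v) for v in T])
```

Implicit schemes of the kind the paper uses remove the r ≤ 0.5 step-size restriction, which is why they are preferred for fine 3D grids.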
Nano-biosensors to detect beta-amyloid for Alzheimer's disease management.
Kaushik, Ajeet; Jayant, Rahul Dev; Tiwari, Sneham; Vashist, Arti; Nair, Madhavan
2016-06-15
Beta-amyloid (β-A) peptides are potential biomarkers for monitoring Alzheimer's disease (AD) for diagnostic purposes. An increased β-A level is neurotoxic and induces oxidative stress in the brain, resulting in neurodegeneration and causing dementia. As of now, no sensitive and inexpensive method is available for β-A detection under physiological and pathological conditions. Although available methods such as neuroimaging, enzyme-linked immunosorbent assay (ELISA), and polymerase chain reaction (PCR) can detect β-A, they have not yet been extended to point-of-care (POC) settings due to sophisticated equipment, the need for high expertise, complicated operation, and the challenge of achieving low detection limits. Recently, β-A antibody-based electrochemical immunosensing approaches have been explored to detect β-A at pM levels within 30-40 min, compared with the 6-8 h of an ELISA test. The introduction of nano-enabled electrochemical sensing technology could enable rapid detection of β-A at the POC and may facilitate fast, personalized health care delivery. This review explores recent advancements in nano-enabled electrochemical β-A sensing technologies toward POC application to AD management. These analytical tools can support an AD management program by providing the bioinformatics needed to optimize therapeutics for the diagnosis and management of neurodegenerative diseases. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Wescoat, James L.; Siddiqi, Afreen; Muhammad, Abubakr
2018-01-01
This paper presents a socio-hydrologic analysis of channel flows in Punjab province of the Indus River basin in Pakistan. The Indus has undergone profound transformations, from large-scale canal irrigation in the mid-nineteenth century to partition and development of the international river basin in the mid-twentieth century, systems modeling in the late-twentieth century, and new technologies for discharge measurement and data analytics in the early twenty-first century. We address these processes through a socio-hydrologic framework that couples historical geographic and analytical methods at three levels of flow in the Punjab. The first level assesses Indus River inflows analysis from its origins in 1922 to the present. The second level shows how river inflows translate into 10-daily canal command deliveries that vary widely in their conformity with canal entitlements. The third level of analysis shows how new flow measurement technologies raise questions about the performance of established methods of water scheduling (warabandi) on local distributaries. We show how near real-time measurement sheds light on the efficiency and transparency of surface water management. These local socio-hydrologic changes have implications in turn for the larger scales of canal and river inflow management in complex river basins.
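Conformity of 10-daily canal deliveries with entitlements, as assessed in the second level of analysis, can be expressed as a delivery performance ratio per period. A minimal sketch with hypothetical values (the abstract gives no numerical data):

```python
def delivery_performance(actual, entitlement):
    """Delivery performance ratio (DPR) per 10-daily period:
    1.0 means delivery equals entitlement; <1 shortfall, >1 excess."""
    return [a / e for a, e in zip(actual, entitlement)]

# Hypothetical 10-daily deliveries vs. entitlements (flow units)
entitled = [55.0, 55.0, 55.0, 60.0, 60.0, 60.0]
delivered = [48.0, 52.0, 61.0, 41.0, 66.0, 58.0]
dpr = delivery_performance(delivered, entitled)
print([round(r, 2) for r in dpr])
```

The spread of this ratio across periods is one simple way to quantify the "wide variation in conformity" the paper reports from near real-time discharge measurement.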
Fozooni, Tahereh; Ravan, Hadi; Sasan, Hosseinali
2017-12-01
Due to their unique properties, such as programmability, ligand-binding capability, and flexibility, nucleic acids can serve as analytes and/or recognition elements for biosensing. To improve the sensitivity of nucleic acid-based biosensing, and hence the detection of a few copies of the target molecule, different modern amplification methodologies, namely target-based and signal-based amplification strategies, have already been developed. These recent signal amplification technologies, which are capable of amplifying the signal intensity without changing the target's copy number, have resulted in fast, reliable, and sensitive methods for nucleic acid detection. Working in cell-free settings, researchers have been able to optimize a variety of complex and quantitative methods suitable for deployment in live-cell conditions. In this study, a comprehensive review of signal amplification technologies for the detection of nucleic acids is provided. We classify the signal amplification methodologies into enzymatic and non-enzymatic strategies, with a primary focus on the methods that enable us to shift away from in vitro detection to in vivo imaging. Finally, the future challenges and limitations of detection under cellular conditions are discussed.
Della Pelle, Flavio; Compagnone, Dario
2018-02-04
Polyphenolic compounds (PCs) have received exceptional attention at the end of the past millennium and as much at the beginning of the new one. Undoubtedly, these compounds in foodstuffs provide added value for their well-known health benefits, for their technological role, and also for marketing. Many efforts have been made to provide simple, effective and user-friendly analytical methods for the determination and antioxidant capacity (AOC) evaluation of food polyphenols. In a parallel track, over the last twenty years, nanomaterials (NMs) have made their entry into the analytical chemistry domain; NMs have, in fact, opened new paths for the development of analytical methods with the common aim of improving analytical performance and sustainability, becoming new tools in the quality assurance of food and beverages. The aim of this review is to provide information on the most recent developments of new NM-based tools and strategies for total polyphenols (TP) determination and AOC evaluation in food. Optical, electrochemical and bioelectrochemical approaches are covered. The use of nanoparticles, quantum dots, carbon nanomaterials and hybrid materials for the detection of polyphenols is the main subject of the works reported. Particular attention has been paid to the success of the application in real samples, beyond the NMs themselves. In particular, the discussion focuses on methods and devices presenting, in the opinion of the authors, clear advancement in the field in terms of simplicity, rapidity and usability. This review aims to demonstrate how NM-based approaches represent valid alternatives to classical methods for polyphenol analysis and are mature enough to be integrated for rapid assessment of food quality in the lab or directly in the field.
2018-01-01
Polyphenolic compounds (PCs) have received exceptional attention at the end of the past millennium and as much at the beginning of the new one. Undoubtedly, these compounds in foodstuffs provide added value for their well-known health benefits, their technological role, and their marketing appeal. Many efforts have been made to provide simple, effective and user-friendly analytical methods for the determination and antioxidant capacity (AOC) evaluation of food polyphenols. In a parallel track, over the last twenty years, nanomaterials (NMs) have made their entry into the analytical chemistry domain; NMs have, in fact, opened new paths for the development of analytical methods with the common aim of improving analytical performance and sustainability, becoming new tools in the quality assurance of food and beverages. The aim of this review is to provide information on the most recent developments of new NM-based tools and strategies for total polyphenols (TP) determination and AOC evaluation in food. Optical, electrochemical and bioelectrochemical approaches have been reviewed. The use of nanoparticles, quantum dots, carbon nanomaterials and hybrid materials for the detection of polyphenols is the main subject of the works reported. Particular attention has been paid to the success of the application in real samples, in addition to the NMs themselves. In particular, the discussion has been focused on methods/devices presenting, in the opinion of the authors, clear advancement in the field in terms of simplicity, rapidity and usability. This review aims to demonstrate how NM-based approaches represent valid alternatives to classical methods for polyphenol analysis, and are mature enough to be integrated into the rapid assessment of food quality in the lab or directly in the field. PMID:29401719
Akbani, Rehan; Becker, Karl-Friedrich; Carragher, Neil; Goldstein, Ted; de Koning, Leanne; Korf, Ulrike; Liotta, Lance; Mills, Gordon B; Nishizuka, Satoshi S; Pawlak, Michael; Petricoin, Emanuel F; Pollard, Harvey B; Serrels, Bryan; Zhu, Jingchun
2014-07-01
Reverse phase protein array (RPPA) technology introduced a miniaturized "antigen-down" or "dot-blot" immunoassay suitable for quantifying the relative, semi-quantitative or quantitative (if a well-accepted reference standard exists) abundance of total protein levels and post-translational modifications across a variety of biological samples including cultured cells, tissues, and body fluids. The recent evolution of RPPA, combined with more sophisticated sample handling, optical detection, quality control, and better-quality affinity reagents, provides exquisite sensitivity and high sample throughput at a reasonable cost per sample. This facilitates large-scale multiplex analysis of multiple post-translational markers across in vitro, preclinical, or clinical samples. The technical power of RPPA is stimulating the application and widespread adoption of RPPA methods within academic, clinical, and industrial research laboratories. Advances in RPPA technology now offer scientists the opportunity to quantify protein analytes with high precision, sensitivity, throughput, and robustness. As a result, adopters of RPPA technology have recognized critical success factors for useful and maximum exploitation of RPPA technologies, including the following: preservation and optimization of pre-analytical sample quality, application of validated high-affinity and specific antibody (or other protein affinity) detection reagents, dedicated informatics solutions to ensure accurate and robust quantification of protein analytes, and quality-assured procedures and data analysis workflows compatible with application within regulated clinical environments. In 2011, 2012, and 2013, the first three Global RPPA workshops were held in the United States, Europe, and Japan, respectively.
These workshops provided an opportunity for RPPA laboratories, vendors, and users to share and discuss results, the latest technology platforms, best practices, and future challenges and opportunities. The outcomes of the workshops included a number of key opportunities to advance the RPPA field and provide added benefit to existing and future participants in the RPPA research community. The purpose of this report is to share and disseminate, as a community, current knowledge and future directions of the RPPA technology. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.
Recent Advances in Paper-Based Sensors
Liana, Devi D.; Raguse, Burkhard; Gooding, J. Justin; Chow, Edith
2012-01-01
Paper-based sensors are a new alternative technology for fabricating simple, low-cost, portable and disposable analytical devices for many application areas including clinical diagnosis, food quality control and environmental monitoring. The unique properties of paper which allow passive liquid transport and compatibility with chemicals/biochemicals are the main advantages of using paper as a sensing platform. Depending on the main goal to be achieved in paper-based sensors, the fabrication methods and the analysis techniques can be tuned to fulfill the needs of the end-user. Current paper-based sensors are focused on microfluidic delivery of solution to the detection site, whereas more advanced designs involve complex 3-D geometries based on the same microfluidic principles. Although paper-based sensors are very promising, they still suffer from certain limitations such as accuracy and sensitivity. However, it is anticipated that in the future, with advances in fabrication and analytical techniques, there will be more new and innovative developments in paper-based sensors. These sensors could better meet the current objectives of a viable low-cost and portable device, in addition to offering high sensitivity and selectivity, and multiple-analyte discrimination. This paper is a review of recent advances in paper-based sensors and covers the following topics: existing fabrication techniques, analytical methods and application areas. Finally, the present challenges and future outlooks are discussed. PMID:23112667
A review of fracture mechanics life technology
NASA Technical Reports Server (NTRS)
Besuner, P. M.; Harris, D. O.; Thomas, J. M.
1986-01-01
Lifetime prediction technology for structural components subjected to cyclic loads is examined. The central objectives of the project are: (1) to report the current state of the art, and (2) recommend future development of fracture mechanics-based analytical tools for modeling subcritical fatigue crack growth in structures. Of special interest is the ability to apply these tools to practical engineering problems and the developmental steps necessary to bring vital technologies to this stage. The authors conducted a survey of published literature and numerous discussions with experts in the field of fracture mechanics life technology. One of the key points made is that fracture mechanics analyses of crack growth often involve consideration of fatigue and fracture under extreme conditions. Therefore, inaccuracies in predicting component lifetime will be dominated by inaccuracies in environment and fatigue crack growth relations, stress intensity factor solutions, and methods used to model given loads and stresses. Suggestions made for reducing these inaccuracies include development of improved models of subcritical crack growth, research efforts aimed at better characterizing residual and assembly stresses that can be introduced during fabrication, and more widespread and uniform use of the best existing methods.
A review of fracture mechanics life technology
NASA Technical Reports Server (NTRS)
Thomas, J. M.; Besuner, P. M.; Harris, D. O.
1985-01-01
Current lifetime prediction technology for structural components subjected to cyclic loads was reviewed. The central objectives of the project were to report the current state of the art and to recommend future development of fracture mechanics-based analytical tools for modeling and forecasting subcritical fatigue crack growth in structures. Of special interest to NASA was the ability to apply these tools to practical engineering problems and the developmental steps necessary to bring vital technologies to this stage. A survey of published literature and numerous discussions with experts in the field of fracture mechanics life technology were conducted. One of the key points made is that fracture mechanics analyses of crack growth often involve consideration of fatigue and fracture under extreme conditions. Therefore, inaccuracies in predicting component lifetime will be dominated by inaccuracies in environment and fatigue crack growth relations, stress intensity factor solutions, and methods used to model given loads and stresses. Suggestions made for reducing these inaccuracies include: development of improved models of subcritical crack growth, research efforts aimed at better characterizing residual and assembly stresses that can be introduced during fabrication, and more widespread and uniform use of the best existing methods.
Wu, Suo-Wei; Chen, Tong; Pan, Qi; Wei, Liang-Yu; Wang, Qin; Li, Chao; Song, Jing-Chen; Luo, Ji
2018-06-05
The development and application of medical technologies reflect the medical quality and clinical capacity of a hospital. They are also an effective approach to upgrading medical services and core competitiveness among medical institutions. This study aimed to build a quantitative medical technology evaluation system, through a questionnaire survey within medical institutions, to assess medical technologies more objectively and accurately, promote the management of medical technology quality, and ensure the safety of various operations in hospitals. A two-level quantitative medical technology evaluation system was built through a two-round questionnaire survey of chosen experts. The Delphi method was applied in identifying the structure of the evaluation system and its indicators. The experts' judgments on the indicators were used to build pairwise comparison matrices, from which the weight coefficients, maximum eigenvalue (λmax), consistency index (CI), and random consistency ratio (CR) were obtained. The results were verified through consistency tests, and the index weight coefficient of each indicator was calculated through the analytic hierarchy process. Twenty-six experts from different medical fields were involved in the questionnaire survey, 25 of whom responded to both rounds. Altogether, 4 primary indicators (safety, effectiveness, innovativeness, and benefits), as well as 13 secondary indicators, were included in the evaluation system. Matrices were built to compute the λmax, CI, and CR for each expert in the survey; the index weight coefficients of the primary indicators were 0.33, 0.28, 0.27, and 0.12, respectively, and the index weight coefficients of the secondary indicators were calculated accordingly.
As the two-round questionnaire survey of experts and the statistical analysis were performed, and the credibility of the results was verified through a consistency evaluation test, the study established a quantitative medical technology evaluation system model and assessment indicators within medical institutions based on the Delphi method and the analytic hierarchy process. Further verification, adjustment, and optimization of the system and indicators will be performed in follow-up studies.
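The weighting step this abstract describes (pairwise comparison matrices, principal eigenvalue λmax, CI, and CR) follows the standard analytic hierarchy process. The sketch below illustrates that computation in Python with a hypothetical 4x4 pairwise matrix; it does not reproduce the study's actual expert judgments or weights.

```python
# Illustrative analytic hierarchy process (AHP) weighting step.
# The pairwise comparison matrix A is hypothetical, not from the study.

RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random index

def ahp_weights(A, iters=200):
    """Principal eigenvector (weights) and lambda_max via power iteration."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # lambda_max estimated as the mean Rayleigh ratio (A w)_i / w_i
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n
    return w, lam

# Hypothetical pairwise judgments for safety, effectiveness,
# innovativeness, benefits (Saaty 1-9 scale, reciprocal matrix).
A = [[1,   2,   2,   3],
     [1/2, 1,   1,   3],
     [1/2, 1,   1,   3],
     [1/3, 1/3, 1/3, 1]]

w, lam = ahp_weights(A)
n = len(A)
CI = (lam - n) / (n - 1)   # consistency index
CR = CI / RI[n]            # random consistency ratio; accept if CR < 0.1
```

With this matrix the weights sum to one, safety receives the largest weight, and CR falls below the conventional 0.1 acceptance threshold, mirroring the consistency test the study applies.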
Evolution of accelerometer methods for physical activity research.
Troiano, Richard P; McClain, James J; Brychta, Robert J; Chen, Kong Y
2014-07-01
The technology and application of current accelerometer-based devices in physical activity (PA) research allow the capture and storage or transmission of large volumes of raw acceleration signal data. These rich data not only provide opportunities to improve PA characterisation, but also bring logistical and analytic challenges. We discuss how researchers and developers from multiple disciplines are responding to the analytic challenges and how advances in data storage, transmission and big data computing will minimise logistical challenges. These new approaches also bring the need for several paradigm shifts for PA researchers, including a shift from count-based approaches and regression calibrations for PA energy expenditure (PAEE) estimation to activity characterisation and EE estimation based on features extracted from raw acceleration signals. Furthermore, a collaborative approach towards analytic methods is proposed to facilitate PA research, which requires a shift away from multiple independent calibration studies. Finally, we make the case for a distinction between PA represented by accelerometer-based devices and PA assessed by self-report. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
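The paradigm shift described above, from proprietary activity counts to features extracted from raw acceleration signals, can be illustrated with a minimal sketch. The feature set below (mean ENMO and magnitude standard deviation per epoch) is a common choice in the accelerometry literature, not a method from this paper, and the sample data are synthetic.

```python
import math

# Extract simple per-epoch features from raw triaxial acceleration (in g),
# instead of device-specific activity counts. ENMO = Euclidean norm minus
# one g, with negative values truncated to zero.

def epoch_features(samples):
    """samples: list of (ax, ay, az) tuples for one epoch."""
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
    enmo = sum(max(m - 1.0, 0.0) for m in mags) / len(mags)
    mean = sum(mags) / len(mags)
    sd = math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))
    return {"enmo": enmo, "sd": sd}

rest = [(0.0, 0.0, 1.0)] * 100                      # at rest: gravity only
active = [(0.0, 0.0, 1.0), (0.6, 0.0, 1.2)] * 50    # crude movement pattern
f_rest, f_active = epoch_features(rest), epoch_features(active)
```

A device lying still yields zero ENMO and zero magnitude variability, while the synthetic movement epoch produces positive values for both features.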
Experiments with Analytic Centers: A confluence of data, tools and help in using them.
NASA Astrophysics Data System (ADS)
Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.
2017-12-01
Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged: different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of a particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this poster will summarize the results and indicate a direction for future infusion attempts.
ERIC Educational Resources Information Center
Polito, Vincent A., Jr.
2010-01-01
The objective of this research was to explore the possibilities of identifying knowledge style factors that could be used as central elements of a professional business analyst's (PBA) performance attributes at work, for decision makers who use advanced analytical technologies on decision-making tasks. Indicators of knowledge style were…
Exline, David L; Wallace, Christie; Roux, Claude; Lennard, Chris; Nelson, Matthew P; Treado, Patrick J
2003-09-01
Chemical imaging technology is a rapid examination technique that combines molecular spectroscopy and digital imaging, providing information on the morphology, composition, structure, and concentration of a material. Among many other applications, chemical imaging offers an array of novel analytical testing methods that limit sample preparation and provide high-quality imaging data essential to the detection of latent fingerprints. Luminescence chemical imaging and visible absorbance chemical imaging have been successfully applied to ninhydrin-, DFO-, cyanoacrylate-, and luminescent dye-treated latent fingerprints, demonstrating the potential of this technology to aid forensic investigations. In addition, visible absorption chemical imaging has been applied successfully to visualize untreated latent fingerprints.
Using data warehousing and OLAP in public health care.
Hristovski, D; Rogac, M; Markota, M
2000-01-01
The paper describes the possibilities of using data warehousing and OLAP technologies in public health care in general, and then our own experience with these technologies gained during the implementation of a national-level data warehouse of outpatient data. Such a data warehouse serves as a basis for advanced decision support systems based on statistical, OLAP or data mining methods. We used OLAP to enable interactive exploration and analysis of the data. We found that data warehousing and OLAP are suitable for the domain of public health and that they enable new analytical possibilities in addition to the traditional statistical approaches.
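The interactive OLAP exploration described above amounts to rolling up a fact table of outpatient visits along chosen dimensions. The sketch below illustrates that operation in plain Python; the records and dimension names are hypothetical, not the actual national data warehouse schema.

```python
from collections import Counter

# Minimal OLAP-style roll-up: aggregate a fact table of outpatient visits
# over a chosen subset of dimensions. All data below are hypothetical.

visits = [
    {"region": "north", "year": 2000, "diagnosis": "J06", "count": 12},
    {"region": "north", "year": 2000, "diagnosis": "I10", "count": 7},
    {"region": "south", "year": 2000, "diagnosis": "J06", "count": 9},
    {"region": "south", "year": 1999, "diagnosis": "J06", "count": 4},
]

def rollup(facts, dims):
    """Sum visit counts over the chosen dimensions (an OLAP roll-up)."""
    totals = Counter()
    for row in facts:
        key = tuple(row[d] for d in dims)
        totals[key] += row["count"]
    return dict(totals)

by_region = rollup(visits, ["region"])               # slice by region only
by_region_year = rollup(visits, ["region", "year"])  # drill down into years
```

Dropping a dimension from `dims` rolls the cube up; adding one drills down, which is exactly the interactive exploration pattern OLAP front-ends expose.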
Using data warehousing and OLAP in public health care.
Hristovski, D.; Rogac, M.; Markota, M.
2000-01-01
The paper describes the possibilities of using data warehousing and OLAP technologies in public health care in general, and then our own experience with these technologies gained during the implementation of a national-level data warehouse of outpatient data. Such a data warehouse serves as a basis for advanced decision support systems based on statistical, OLAP or data mining methods. We used OLAP to enable interactive exploration and analysis of the data. We found that data warehousing and OLAP are suitable for the domain of public health and that they enable new analytical possibilities in addition to the traditional statistical approaches. PMID:11079907
NASA Technical Reports Server (NTRS)
Marsik, S. J.; Morea, S. F.
1985-01-01
A research and technology program for advanced high pressure, oxygen-hydrogen rocket propulsion technology is presently being pursued by the National Aeronautics and Space Administration (NASA) to establish the basic discipline technologies, develop the analytical tools, and establish the data base necessary for an orderly evolution of the staged combustion reusable rocket engine. The need for the program is based on the premise that the USA will depend on the Shuttle and its derivative versions as its principal Earth-to-orbit transportation system for the next 20 to 30 yr. The program is focused in three principal areas of enhancement: (1) life extension, (2) performance, and (3) operations and diagnosis. Within the technological disciplines the efforts include: rotordynamics, structural dynamics, fluid and gas dynamics, materials fatigue/fracture/life, turbomachinery fluid mechanics, ignition/combustion processes, manufacturing/producibility/nondestructive evaluation methods and materials development/evaluation. An overview of the Advanced High Pressure Oxygen-Hydrogen Rocket Propulsion Technology Program Structure and Working Groups objectives are presented with highlights of several significant achievements.
NASA Astrophysics Data System (ADS)
Marsik, S. J.; Morea, S. F.
1985-03-01
A research and technology program for advanced high pressure, oxygen-hydrogen rocket propulsion technology is presently being pursued by the National Aeronautics and Space Administration (NASA) to establish the basic discipline technologies, develop the analytical tools, and establish the data base necessary for an orderly evolution of the staged combustion reusable rocket engine. The need for the program is based on the premise that the USA will depend on the Shuttle and its derivative versions as its principal Earth-to-orbit transportation system for the next 20 to 30 yr. The program is focused in three principal areas of enhancement: (1) life extension, (2) performance, and (3) operations and diagnosis. Within the technological disciplines the efforts include: rotordynamics, structural dynamics, fluid and gas dynamics, materials fatigue/fracture/life, turbomachinery fluid mechanics, ignition/combustion processes, manufacturing/producibility/nondestructive evaluation methods and materials development/evaluation. An overview of the Advanced High Pressure Oxygen-Hydrogen Rocket Propulsion Technology Program Structure and Working Groups objectives are presented with highlights of several significant achievements.
Modeling timelines for translational science in cancer; the impact of technological maturation
McNamee, Laura M.; Ledley, Fred D.
2017-01-01
This work examines translational science in cancer based on theories of innovation that posit a relationship between the maturation of technologies and their capacity to generate successful products. We examined the growth of technologies associated with 138 anticancer drugs using an analytical model that identifies the point of initiation of exponential growth and the point at which growth slows as the technology becomes established. Approval of targeted and biological products corresponded with technological maturation, with first approval averaging 14 years after the established point and 44 years after initiation of associated technologies. The lag in cancer drug approvals after the increases in cancer funding and dramatic scientific advances of the 1970s thus reflects predictable timelines of technology maturation. Analytical models of technological maturation may be used for technological forecasting to guide more efficient translation of scientific discoveries into cures. PMID:28346525
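The analytical model described above locates a point of initiation of exponential growth and an "established" point where growth slows on a technology growth curve. The sketch below illustrates one simple way to define such markers for a single logistic curve, as the points of extreme curvature; the authors' actual fitting procedure may differ, and the parameter values are hypothetical.

```python
import math

# For a logistic growth curve L(t) = K / (1 + exp(-r (t - t0))), the second
# derivative of L is most extreme at t0 +/- ln(2 + sqrt(3)) / r. These two
# points serve as natural "initiation" and "established" markers.

def logistic(t, K, r, t0):
    return K / (1.0 + math.exp(-r * (t - t0)))

def growth_markers(r, t0):
    offset = math.log(2.0 + math.sqrt(3.0)) / r
    return t0 - offset, t0 + offset   # (initiation, established)

K, r, t0 = 100.0, 0.25, 1975.0        # hypothetical fitted parameters
t_init, t_est = growth_markers(r, t0)

# At these markers the curve sits at fixed fractions of its plateau K:
frac_init = logistic(t_init, K, r, t0) / K   # (3 - sqrt(3)) / 6, about 0.21
frac_est = logistic(t_est, K, r, t0) / K     # (3 + sqrt(3)) / 6, about 0.79
```

Because the fractions at the two markers are fixed (about 21% and 79% of the plateau) regardless of K, r, and t0, the markers can be compared across technologies once each curve is fitted, which is the kind of comparison the timeline analysis above relies on.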
An Analytical Assessment of NASA's N+1 Subsonic Fixed Wing Project Noise Goal
NASA Technical Reports Server (NTRS)
Berton, Jeffrey J.; Envia, Edmane; Burley, Casey L.
2009-01-01
The Subsonic Fixed Wing Project of NASA's Fundamental Aeronautics Program has adopted a noise reduction goal for new, subsonic, single-aisle, civil aircraft expected to replace current 737 and A320 airplanes. These so-called 'N+1' aircraft - designated in NASA vernacular as such since they will follow the current, in-service, 'N' airplanes - are hoped to achieve certification noise goal levels of 32 cumulative EPNdB under current Stage 4 noise regulations. A notional, N+1, single-aisle, twinjet transport with ultrahigh bypass ratio turbofan engines is analyzed in this study using NASA software and methods. Several advanced noise-reduction technologies are analytically applied to the propulsion system and airframe. Certification noise levels are predicted and compared with the NASA goal.
NASA Technical Reports Server (NTRS)
Lyle, Karen H.
2015-01-01
Acceptance of new spacecraft structural architectures and concepts requires validated design methods to minimize the expense involved with technology demonstration via flight testing. Hypersonic Inflatable Aerodynamic Decelerator (HIAD) architectures are attractive for spacecraft deceleration because they are lightweight, store compactly, and utilize the atmosphere to decelerate a spacecraft during entry. However, designers are hesitant to include these inflatable approaches for large payloads or spacecraft because of the lack of flight validation. This publication summarizes results comparing analytical predictions with test data for two concepts subjected to static loading representative of entry conditions. The level of agreement and the ability to predict the load distribution are considered sufficient to enable analytical predictions to be used in the design process.
Tassi, Marco; De Vos, Jelle; Chatterjee, Sneha; Sobott, Frank; Bones, Jonathan; Eeltink, Sebastiaan
2018-01-01
The characterization of biotherapeutics represents a major analytical challenge. This review discusses the current state of the art in analytical technologies for profiling biopharmaceutical products under native conditions, i.e., conditions under which the protein's three-dimensional conformation is maintained during liquid-chromatographic analysis. Native liquid-chromatographic modes that are discussed include aqueous size-exclusion chromatography, hydrophobic interaction chromatography, and ion-exchange chromatography. Infusion conditions and the possibilities and limitations of hyphenating native liquid chromatography to mass spectrometry are discussed. Furthermore, the applicability of native liquid-chromatography methods and intact mass spectrometry analysis for the characterization of monoclonal antibodies and antibody-drug conjugates is discussed. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Muthu, Pravin; Lutz, Stefan
2016-04-05
Fast, simple and cost-effective methods for detecting and quantifying pharmaceutical agents in patients are highly sought after to replace equipment and labor-intensive analytical procedures. The development of new diagnostic technology including portable detection devices also enables point-of-care by non-specialists in resource-limited environments. We have focused on the detection and dose monitoring of nucleoside analogues used in viral and cancer therapies. Using deoxyribonucleoside kinases (dNKs) as biosensors, our chemometric model compares observed time-resolved kinetics of unknown analytes to known substrate interactions across multiple enzymes. The resulting dataset can simultaneously identify and quantify multiple nucleosides and nucleoside analogues in complex sample mixtures. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A
2017-12-19
As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
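The comparison LipidQC performs, checking experimental SRM 1950 lipid concentrations against benchmark consensus means, can be sketched as a simple per-lipid z-score test. The flagging rule (|z| ≤ 2 acceptable) and all numerical values below are hypothetical illustrations, not NIST's actual consensus values or acceptance criteria.

```python
# Sketch of comparing experimental lipid concentrations against consensus
# benchmark values. All numbers and the |z| <= 2 rule are hypothetical.

consensus = {                      # lipid: (consensus mean, standard deviation)
    "PC 34:1": (100.0, 10.0),
    "TG 52:2": (250.0, 30.0),
}

def flag(measured):
    """Return {lipid: (z_score, within_range)} against consensus values."""
    out = {}
    for lipid, value in measured.items():
        mean, sd = consensus[lipid]
        z = (value - mean) / sd
        out[lipid] = (z, abs(z) <= 2.0)
    return out

report = flag({"PC 34:1": 112.0, "TG 52:2": 340.0})
```

In this example the first lipid falls within two standard deviations of the consensus mean while the second is flagged, the kind of visual pass/fail judgment the harmonization exercise supports.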
Kuscu, Murat; Akan, Ozgur B.
2018-01-01
We consider a microfluidic molecular communication (MC) system, where concentration-encoded molecular messages are transported via fluid flow-induced convection and diffusion, and detected by a surface-based MC receiver with ligand receptors placed at the bottom of the microfluidic channel. The overall system is a convection-diffusion-reaction system that can only be solved by numerical methods, e.g., finite element analysis (FEA). However, analytical models are key for information and communication technology (ICT), as they enable an optimisation framework for developing advanced communication techniques, such as optimum detection methods and reliable transmission schemes. In this direction, we develop an analytical model to approximate the expected time course of bound receptor concentration, i.e., the received signal used to decode the transmitted messages. The model obviates the need for computationally expensive numerical methods by capturing the nonlinearities caused by laminar flow, which results in a parabolic velocity profile, and by the finite number of ligand receptors, which leads to receiver saturation. The model also captures the effects of the reactive-surface depletion layer resulting from mass transport limitations, and of the moving reaction boundary originating from the passage of a finite-duration molecular concentration pulse over the receiver surface. Based on the proposed model, we derive closed-form analytical expressions that approximate the received pulse width, pulse delay and pulse amplitude, which can be used to optimize the system from an ICT perspective. We evaluate the accuracy of the proposed model by comparing model-based analytical results to numerical results obtained by solving the exact system model with COMSOL Multiphysics. PMID:29415019
Kuscu, Murat; Akan, Ozgur B
2018-01-01
We consider a microfluidic molecular communication (MC) system, where concentration-encoded molecular messages are transported via fluid flow-induced convection and diffusion, and detected by a surface-based MC receiver with ligand receptors placed at the bottom of the microfluidic channel. The overall system is a convection-diffusion-reaction system that can only be solved by numerical methods, e.g., finite element analysis (FEA). However, analytical models are key for information and communication technology (ICT), as they enable an optimisation framework for developing advanced communication techniques, such as optimum detection methods and reliable transmission schemes. In this direction, we develop an analytical model to approximate the expected time course of bound receptor concentration, i.e., the received signal used to decode the transmitted messages. The model obviates the need for computationally expensive numerical methods by capturing the nonlinearities caused by laminar flow, which results in a parabolic velocity profile, and by the finite number of ligand receptors, which leads to receiver saturation. The model also captures the effects of the reactive-surface depletion layer resulting from mass transport limitations, and of the moving reaction boundary originating from the passage of a finite-duration molecular concentration pulse over the receiver surface. Based on the proposed model, we derive closed-form analytical expressions that approximate the received pulse width, pulse delay and pulse amplitude, which can be used to optimize the system from an ICT perspective. We evaluate the accuracy of the proposed model by comparing model-based analytical results to numerical results obtained by solving the exact system model with COMSOL Multiphysics.
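The receiver-saturation nonlinearity the abstract highlights arises from a finite pool of surface receptors. The sketch below is not the authors' closed-form convection-diffusion-reaction model; it only illustrates the underlying reaction term, the standard ligand-receptor binding ODE dB/dt = k_on·c·(R_tot − B) − k_off·B, with hypothetical rate constants.

```python
# Forward-Euler time course of bound receptor concentration B(t) for a
# constant ligand concentration c, with a finite receptor pool R_tot.
# Rate constants and concentrations below are hypothetical (arbitrary units).

def bound_receptors(c, r_tot=1.0, k_on=1.0, k_off=1.0, dt=1e-3, steps=20000):
    B = 0.0
    for _ in range(steps):
        B += dt * (k_on * c * (r_tot - B) - k_off * B)
    return B

# With c equal to the dissociation constant KD = k_off / k_on, equilibrium
# occupancy is half the receptors; at c >> KD the receiver saturates and
# further concentration increases barely change the received signal.
B_eq = bound_receptors(c=1.0)      # approaches r_tot * c / (c + KD) = 0.5
B_sat = bound_receptors(c=100.0)   # near saturation, close to r_tot
```

The saturation visible at high ligand concentration is exactly the nonlinearity that makes a linear receiver model inadequate and motivates the analytical treatment described above.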
Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data.
Hammitt, Laura L; Feikin, Daniel R; Scott, J Anthony G; Zeger, Scott L; Murdoch, David R; O'Brien, Katherine L; Deloria Knoll, Maria
2017-06-15
Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. © The Author 2017. Published by Oxford University Press for the Infectious Diseases Society of America.
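One classical form of the adjustment the abstract describes, correcting an observed positivity rate for imperfect test sensitivity and specificity, is the Rogan-Gladen estimator. The sketch below uses hypothetical numbers and is offered only to make the adjustment concrete; it is not a method specific to this paper.

```python
# Rogan-Gladen correction of an observed positivity rate for imperfect test
# sensitivity and specificity. All numeric values below are hypothetical.

def rogan_gladen(p_observed, sensitivity, specificity):
    """True prevalence estimate; requires sensitivity + specificity > 1."""
    p = (p_observed + specificity - 1.0) / (sensitivity + specificity - 1.0)
    return min(max(p, 0.0), 1.0)   # clamp to the valid probability range

# Forward check: true prevalence 0.20 with sensitivity 0.90 and
# specificity 0.95 produces an apparent positivity rate of
# 0.20 * 0.90 + 0.80 * 0.05 = 0.22 ...
p_apparent = 0.20 * 0.90 + 0.80 * 0.05
# ... and the estimator recovers the true 0.20 from that apparent rate.
p_true = rogan_gladen(p_apparent, 0.90, 0.95)
```

As the abstract notes, this kind of single-test correction does not extend naturally to multiple specimen types and multiple pathogens at once, which is what motivates the integrated analytical approach the authors call for.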
Addressing the Analytic Challenges of Cross-Sectional Pediatric Pneumonia Etiology Data
Feikin, Daniel R.; Scott, J. Anthony G.; Zeger, Scott L.; Murdoch, David R.; O’Brien, Katherine L.; Deloria Knoll, Maria
2017-01-01
Abstract Despite tremendous advances in diagnostic laboratory technology, identifying the pathogen(s) causing pneumonia remains challenging because the infected lung tissue cannot usually be sampled for testing. Consequently, to obtain information about pneumonia etiology, clinicians and researchers test specimens distant to the site of infection. These tests may lack sensitivity (eg, blood culture, which is only positive in a small proportion of children with pneumonia) and/or specificity (eg, detection of pathogens in upper respiratory tract specimens, which may indicate asymptomatic carriage or a less severe syndrome, such as upper respiratory infection). While highly sensitive nucleic acid detection methods and testing of multiple specimens improve sensitivity, multiple pathogens are often detected and this adds complexity to the interpretation as the etiologic significance of results may be unclear (ie, the pneumonia may be caused by none, one, some, or all of the pathogens detected). Some of these challenges can be addressed by adjusting positivity rates to account for poor sensitivity or incorporating test results from controls without pneumonia to account for poor specificity. However, no classical analytic methods can account for measurement error (ie, sensitivity and specificity) for multiple specimen types and integrate the results of measurements for multiple pathogens to produce an accurate understanding of etiology. We describe the major analytic challenges in determining pneumonia etiology and review how the common analytical approaches (eg, descriptive, case-control, attributable fraction, latent class analysis) address some but not all challenges. We demonstrate how these limitations necessitate a new, integrated analytical approach to pneumonia etiology data. PMID:28575372
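The abstract above mentions adjusting positivity rates to account for imperfect test sensitivity and specificity. A minimal sketch of the classical Rogan-Gladen correction illustrates that single-test adjustment (it is not the integrated multi-pathogen approach the authors call for, and the test characteristics used below are invented for illustration):

```python
def rogan_gladen(apparent_prevalence, sensitivity, specificity):
    """Correct an observed positivity rate for imperfect test
    sensitivity and specificity (Rogan-Gladen estimator)."""
    denom = sensitivity + specificity - 1.0
    if denom <= 0:
        raise ValueError("test must be informative (sens + spec > 1)")
    true_prev = (apparent_prevalence + specificity - 1.0) / denom
    return min(max(true_prev, 0.0), 1.0)  # clamp to [0, 1]

# Illustrative: 15% of cases test positive by a method assumed to have
# 30% sensitivity and 99% specificity (numbers are hypothetical).
print(rogan_gladen(0.15, 0.30, 0.99))
```

With low sensitivity, the corrected estimate is substantially higher than the raw positivity rate, which is exactly the phenomenon the abstract describes for blood culture.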
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is taking on an increasingly important role. Issues related to the choice of modeling technique, estimation procedure (or algorithm), and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise, which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models, as well as in documenting the rationale for the techniques and models selected. To formally describe the modeling techniques, models, and results, we developed the Analytics Ontology, which supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics, is used as a testbed for evaluating the use of semantic technology. PMID:29657954
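The idea of ontology-driven model selection can be sketched with a toy rule base: dataset properties are matched against restrictions, and the techniques linked to matching classes are suggested. This is an illustration of the concept only, not the actual Analytics Ontology or the SCALATION API:

```python
# Toy ontology-style model selection: each dataset "class" is linked to
# suitable modeling techniques; selection matches dataset properties
# against class restrictions. (Hypothetical rules, for illustration.)
RULES = [
    ({"response": "continuous", "predictors": "numeric"},
     ["multiple linear regression", "ridge regression"]),
    ({"response": "binary"},
     ["logistic regression", "decision tree"]),
    ({"response": "count"},
     ["Poisson regression", "negative binomial regression"]),
]

def suggest_models(dataset_properties):
    """Return techniques whose restrictions the dataset satisfies."""
    suggestions = []
    for restrictions, techniques in RULES:
        if all(dataset_properties.get(k) == v for k, v in restrictions.items()):
            suggestions.extend(techniques)
    return suggestions

print(suggest_models({"response": "binary", "predictors": "numeric"}))
```

A real ontology would additionally support inheritance and inference over technique hierarchies; the flat rule list above only captures the matching step.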
Tang, Xiaolin Charlie; Nail, Steven L; Pikal, Michael J
2006-01-01
The purpose of this work was to study the factors that may cause systematic errors in the manometric temperature measurement (MTM) procedure used to determine product dry-layer resistance to vapor flow. Product temperature and dry-layer resistance were obtained using MTM software installed on a laboratory freeze-dryer. The MTM resistance values were compared with the resistance values obtained using the "vial method." The product dry-layer resistances obtained by MTM, assuming fixed temperature difference (DeltaT; 2 degrees C), were lower than the actual values, especially when the product temperatures and sublimation rates were low, but with DeltaT determined from the pressure rise data, more accurate results were obtained. MTM resistance values were generally lower than the values obtained with the vial method, particularly whenever freeze-drying was conducted under conditions that produced large variations in product temperature (ie, low shelf temperature, low chamber pressure, and without thermal shields). In an experiment designed to magnify temperature heterogeneity, MTM resistance values were much lower than the simple average of the product resistances. However, in experiments where product temperatures were homogenous, good agreement between MTM and "vial-method" resistances was obtained. The reason for the low MTM resistance problem is the fast vapor pressure rise from a few "warm" edge vials or vials with low resistance. With proper use of thermal shields, and the evaluation of DeltaT from the data, MTM resistance data are accurate. Thus, the MTM method for determining dry-layer resistance is a useful tool for freeze-drying process analytical technology.
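The dry-layer resistance discussed above follows the standard freeze-drying relation Rp = Ap(P_ice − P_chamber)/(dm/dt). A minimal sketch, using a vapor-pressure-of-ice correlation commonly quoted in the freeze-drying literature (both the correlation constants and the example numbers should be treated as assumptions to check against your own source):

```python
import math

def p_ice_torr(temp_k):
    """Vapor pressure of ice (Torr); correlation widely used in the
    freeze-drying literature (assumed here, verify before relying on it)."""
    return 2.698e10 * math.exp(-6144.96 / temp_k)

def dry_layer_resistance(area_cm2, product_temp_k, chamber_pressure_torr,
                         sublimation_rate_g_per_h):
    """Rp = Ap * (P_ice - P_chamber) / (dm/dt), in cm^2*Torr*h/g."""
    return (area_cm2 * (p_ice_torr(product_temp_k) - chamber_pressure_torr)
            / sublimation_rate_g_per_h)

# Illustrative numbers only: 3.8 cm^2 vial, product at -30 C (243.15 K),
# 100 mTorr chamber pressure, 0.05 g/h sublimation rate.
print(dry_layer_resistance(3.8, 243.15, 0.100, 0.05))
```

The sensitivity of Rp to product temperature (P_ice is roughly exponential in T) is why the warm edge vials described in the abstract dominate the early pressure rise and bias the MTM estimate low.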
Dave, Vivek S; Shahin, Hend I; Youngren-Ortiz, Susanne R; Chougule, Mahavir B; Haware, Rahul V
2017-10-30
The density, porosity, breaking force, viscoelastic properties, and the presence or absence of structural defects or irregularities are important physical-mechanical quality attributes of popular solid dosage forms like tablets. Irregularities in these attributes may influence drug product functionality. Thus, accurate and efficient characterization of these properties is critical for the successful development and manufacturing of robust tablets. These properties are mainly analyzed and monitored with traditional pharmacopeial and non-pharmacopeial methods. Such methods are associated with several challenges, such as a lack of spatial resolution, efficiency, or sample-sparing attributes. Recent advances in technology, design, instrumentation, and software have led to the emergence of newer techniques for non-invasive characterization of the physical-mechanical properties of tablets. These techniques include near-infrared spectroscopy, Raman spectroscopy, X-ray microtomography, nuclear magnetic resonance (NMR) imaging, terahertz pulsed imaging, laser-induced breakdown spectroscopy, and various acoustic- and thermal-based techniques. Such state-of-the-art techniques are currently applied at various stages of tablet development and manufacturing at industrial scale. Each technique has specific advantages or challenges with respect to operational efficiency and cost compared to traditional analytical methods. Currently, most of these techniques are used as secondary analytical tools to support the traditional methods in characterizing or monitoring tablet quality attributes. Therefore, further development of the instrumentation and software, and studies of the applications, are necessary for their adoption in routine analysis and monitoring of tablet physical-mechanical properties. Copyright © 2017 Elsevier B.V. All rights reserved.
ELISA-type assays of trace biomarkers using microfluidic methods.
Dong, Jinhua; Ueda, Hiroshi
2017-09-01
Recently, great progress has been achieved in analytical technologies for biological substances. Traditionally, detection methods for analytes rely mainly on large instrumental analyses. These methods require costly equipment, skilled operators, and long measurement times, despite their generally low sensitivity. In contrast, immunoassays are becoming more and more popular owing to their power, low cost, and convenience. Immunoassays have a wide range of applications because they employ antibodies, proteins produced by plasma cells of the acquired immune system to identify and neutralize diverse pathogens and other exogenous substances. However, the sensitivity of conventional immunoassays is limited by their reaction principles and detection methods. Microfluidics technology manipulates small volumes of fluid and flow, and has the potential to miniaturize many laboratory procedures. Immunoassays on microfluidic devices have been studied extensively and have gained significant attention owing to the intrinsic advantages offered by these assay platforms. The techniques have allowed the miniaturization of conventional immunoassays and bring advantages such as small volumes of samples and reagents, as well as decreased contamination, which results in fewer false-positive results. Ultimately, the combination of immunoassays with microfluidics affords a promising platform for multiplexed, sensitive, and automatic point-of-care diagnostics. Recent achievements in microfluidic devices and immunoassay detection systems, including digital assays employing single molecules, are introduced in detail, and strategies for faster and more sensitive configurations in microfluidic immunosensors are highlighted. WIREs Nanomed Nanobiotechnol 2017, 9:e1457. doi: 10.1002/wnan.1457. © 2017 Wiley Periodicals, Inc.
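The digital (single-molecule) assays mentioned above count "on"/"off" partitions rather than measuring an ensemble signal; since molecules distribute over partitions approximately as a Poisson process, the fraction of positive partitions recovers the mean occupancy. A minimal sketch of that standard calculation (partition volume and positive fraction below are invented examples):

```python
import math

def mean_molecules_per_partition(fraction_positive):
    """Poisson statistics for digital assays: the fraction of 'on'
    partitions f gives the mean occupancy lambda = -ln(1 - f)."""
    if not 0 <= fraction_positive < 1:
        raise ValueError("fraction must be in [0, 1)")
    return -math.log(1.0 - fraction_positive)

def concentration_molar(fraction_positive, partition_volume_l):
    """Convert mean occupancy to a molar concentration."""
    AVOGADRO = 6.02214076e23
    lam = mean_molecules_per_partition(fraction_positive)
    return lam / (partition_volume_l * AVOGADRO)

# Illustrative: 10% positive microwells of 50 fL each.
print(concentration_molar(0.10, 50e-15))
```

The logarithmic correction matters at high occupancy: at 10% positive wells the mean occupancy (0.105) is already noticeably above the naive count of 0.10 molecules per well.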
NASA Technical Reports Server (NTRS)
Wing, L. D.
1979-01-01
Simplified analytical techniques of sounding rocket programs are suggested as a means of bringing the cost of thermal analysis of the Get Away Special (GAS) payloads within acceptable bounds. Particular attention is given to two methods adapted from sounding rocket technology - a method in which the container and payload are assumed to be divided in half vertically by a thermal plane of symmetry, and a method which considers the container and its payload to be an analogous one-dimensional unit having the real or correct container top surface area for radiative heat transfer and a fictitious mass and geometry which model the average thermal effects.
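The "analogous one-dimensional unit" described above reduces to a single lumped node with a fictitious mass and the real top surface area radiating to space. A minimal lumped-capacitance sketch of that idea follows; every parameter value is an illustrative assumption, not GAS payload data:

```python
# Lumped-capacitance sketch of the one-dimensional analogous unit:
# one node with (fictitious) mass and heat capacity, radiating through
# the real top surface area, advanced by explicit Euler integration.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def integrate_node_temp(t_init_k, mass_kg, cp_j_per_kgk, area_m2,
                        emissivity, q_in_w, dt_s, steps):
    temp = t_init_k
    for _ in range(steps):
        q_rad = emissivity * SIGMA * area_m2 * temp**4  # radiated away
        temp += (q_in_w - q_rad) * dt_s / (mass_kg * cp_j_per_kgk)
    return temp

# Hypothetical payload: 50 kg, cp ~900 J/(kg K), 0.25 m^2 top surface,
# emissivity 0.8, 20 W internal dissipation, 10 h at 10 s steps.
print(integrate_node_temp(290.0, 50.0, 900.0, 0.25, 0.8, 20.0, 10.0, 3600))
```

With these numbers the node cools from 290 K toward its radiative equilibrium (about 205 K, where dissipation balances emission), which is the kind of average-trend answer the simplified model is meant to provide cheaply.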
NASA Technical Reports Server (NTRS)
Griswold, M.; Roskam, J.
1980-01-01
An analytical method is presented for predicting lateral-directional aerodynamic characteristics of light twin engine propeller-driven airplanes. This method is applied to the Advanced Technology Light Twin Engine airplane. The calculated characteristics are correlated against full-scale wind tunnel data. The method predicts the sideslip derivatives fairly well, although angle of attack variations are not well predicted. Spoiler performance was predicted somewhat high but was still reasonable. The rudder derivatives were not well predicted, in particular the effect of angle of attack. The predicted dynamic derivatives could not be correlated due to lack of experimental data.
Immunochemical analytical methods for the determination of peanut proteins in foods.
Whitaker, Thomas B; Williams, Kristina M; Trucksess, Mary W; Slate, Andrew B
2005-01-01
Peanut proteins can cause allergenic reactions that can result in respiratory and circulatory effects in the body sometimes leading to shock and death. The determination of peanut proteins in foods by analytical methods can reduce the risk of serious reactions in the highly sensitized individual by allowing for the detection of these proteins in a food at various stages of the manufacturing process. The method performance of 4 commercially available enzyme-linked immunosorbent assay (ELISA) kits was evaluated for the detection of peanut proteins in milk chocolate, ice cream, cookies, and breakfast cereals: ELISA-TEK Peanut Protein Assay, now known as "Bio-Kit" for peanut proteins, from ELISA Technologies Inc.; Veratox for Peanut Allergens from Neogen Corp.; RIDASCREEN Peanut Kit from R-Biopharm GmbH; and ProLisa from Canadian Food Technology Ltd. The 4 test kits were evaluated for accuracy (recovery) and precision using known concentrations of peanut or peanut proteins in the 4 food matrixes. Two different techniques, incurred and spiked, were used to prepare samples with 4 known concentrations of peanut protein. Defatted peanut flour was added in the incurred samples, and water-soluble peanut proteins were added in the spiked samples. The incurred levels were 0.0, 10, 20, and 100 microg whole peanut per g food; the spiked levels were 0.0, 5, 10, and 20 microg peanut protein per g food. Performance varied by test kit, protein concentration, and food matrix. The Veratox kit had the best accuracy or lowest percent difference between measured and incurred levels of 15.7% when averaged across all incurred levels and food matrixes. Recoveries associated with the Veratox kit varied from 93 to 115% for all food matrixes except cookies. Recoveries for all kits were about 50% for cookies. The analytical precision, as measured by the variance, increased with an increase in protein concentration. 
However, the coefficient of variation (CV) was stable across the 4 incurred protein levels and was 7.0% when averaged across the 4 food matrixes and analytical kits. The R-Biopharm test kit had the best precision, with a CV of 4.2% when averaged across all incurred levels and food matrixes. Because measured protein values varied by test kit and food matrix, a method was developed to normalize or transform measured protein concentrations to an adjusted protein value equal to the known protein concentration. The normalization method adjusts measured protein values to equal the true protein value regardless of the type of test kit or food matrix.
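The accuracy and precision figures quoted above are simple statistics: recovery is the measured level as a percentage of the known (incurred or spiked) level, and the CV is the relative standard deviation of replicates. A short sketch with made-up replicate readings (not the study's data):

```python
import statistics

def percent_recovery(measured, known):
    """Recovery (%) of a spiked or incurred analyte level."""
    return 100.0 * measured / known

def coefficient_of_variation(replicates):
    """CV (%) = sample standard deviation / mean * 100."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate readings (ug peanut protein / g food) for a
# sample incurred at 20 ug/g -- invented numbers for illustration.
replicates = [18.9, 20.4, 19.7, 21.1, 20.2]
print(percent_recovery(statistics.mean(replicates), 20.0))
print(coefficient_of_variation(replicates))
```

Note the abstract's observation that the variance grew with concentration while the CV stayed stable: that is the usual signature of multiplicative (proportional) measurement error, which is why CV is the preferred precision summary here.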
Data Analytics and Visualization for Large Army Testing Data
2013-09-01
Online Learner Engagement: Opportunities and Challenges with Using Data Analytics
ERIC Educational Resources Information Center
Bodily, Robert; Graham, Charles R.; Bush, Michael D.
2017-01-01
This article describes the crossroads between learning analytics and learner engagement. The authors do this by describing specific challenges of using analytics to support student engagement from three distinct perspectives: pedagogical considerations, technological issues, and interface design concerns. While engaging online learners presents a…
Information security of power enterprises of North-Arctic region
NASA Astrophysics Data System (ADS)
Sushko, O. P.
2018-05-01
The role of information technologies in providing technological security for energy enterprises is a component of the economic security of the northern Arctic region in general. Applying instruments and methods of information-protection modelling to the business processes of energy enterprises in the northern Arctic region (such as Arkhenergo and Komienergo), the authors analysed and identified the most frequent information-security risks. Using the analytic hierarchy process with weighting-factor estimations, the information risks of the energy enterprises' technological processes were ranked. The economic estimation of information security within an energy enterprise considers weighting-factor-adjusted variables (risks). Investments in the information security systems of energy enterprises in the northern Arctic region are related to the installation of necessary security elements, while current operating expenses on business-process protection systems correspond to materialized economic damage.
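The analytic hierarchy process used above derives priority weights for risks from a matrix of pairwise comparisons. A minimal sketch of the standard weighting step (column-normalize and average rows to approximate the principal eigenvector); the comparison values are hypothetical, not the study's data:

```python
# AHP weighting step: column-normalize a pairwise comparison matrix and
# average each row to approximate the priority (weight) vector.
def ahp_weights(pairwise):
    n = len(pairwise)
    col_sums = [sum(pairwise[r][c] for r in range(n)) for c in range(n)]
    return [sum(pairwise[r][c] / col_sums[c] for c in range(n)) / n
            for r in range(n)]

# Three hypothetical information-security risks compared on the 1-9 scale.
pairwise = [
    [1.0, 3.0, 5.0],    # risk A vs A, B, C
    [1/3, 1.0, 2.0],    # risk B
    [1/5, 1/2, 1.0],    # risk C
]
weights = ahp_weights(pairwise)
print(weights)  # weights sum to 1; risk A ranks highest
```

A full AHP analysis would also check the consistency ratio of the comparison matrix before trusting the ranking; that step is omitted here for brevity.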
A novel upper limb rehabilitation system with self-driven virtual arm illusion.
Aung, Yee Mon; Al-Jumaily, Adel; Anam, Khairul
2014-01-01
This paper proposes a novel upper-extremity rehabilitation system with a virtual arm illusion. It aims to provide paralyzed patients with a novel rehabilitation system for faster recovery of upper-limb functions lost as a result of stroke. The system integrates a number of technologies: Augmented Reality (AR) to develop game-like exercises, computer vision to create the illusion scene, 3D modeling and model simulation, and signal processing to detect user intention via EMG signals. The effectiveness of the developed system has been evaluated via a usability study and questionnaires, with results presented by graphical and analytical methods. The evaluation produced positive results, indicating that the developed system has potential as an effective rehabilitation system for upper-limb impairment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
P Yu
Unlike traditional 'wet' analytical methods, which often destroy or alter intrinsic protein structures during processing for analysis, advanced synchrotron radiation-based Fourier transform infrared microspectroscopy has been developed as a rapid, nondestructive bioanalytical technique. This cutting-edge synchrotron-based technology, taking advantage of the brightness of synchrotron light (millions of times brighter than the sun), is capable of exploring the molecular chemistry or structure of a biological tissue at ultra-high spatial resolution without destroying its inherent structures. In this article, a novel approach is introduced to show the potential of this advanced synchrotron-based analytical technology, which can be used to study plant-based food or feed protein molecular structure in relation to nutrient utilization and availability. Recent progress is reported on using synchrotron radiation-based Fourier transform infrared microspectroscopy and diffuse reflectance infrared Fourier transform spectroscopy to detect the effects of gene transformation (Application 1), autoclaving (Application 2), and bio-ethanol processing (Application 3) on plant-based food and feed protein structure changes at a molecular level. The synchrotron-based technology provides a new approach for plant-based protein structure research at ultra-high spatial resolution at the cellular and molecular levels.
Warfighter decision making performance analysis as an investment priority driver
NASA Astrophysics Data System (ADS)
Thornley, David J.; Dean, David F.; Kirk, James C.
2010-04-01
Estimating the relative value of alternative tactics, techniques, and procedures (TTPs) and information systems requires measures of the costs and benefits of each, and methods for combining and comparing those measures. The NATO Code of Best Practice for Command and Control Assessment explains that decision-making quality would ideally be assessed on outcomes. Lessons learned in practice can be assessed statistically to support this, but experimentation with alternate measures in live conflict is undesirable. To this end, the development of practical experimentation to parameterize effective constructive simulation and analytic modelling for system utility prediction is desirable. The Land Battlespace Systems Department of Dstl has modeled human development of situational awareness to support constructive simulation by empirically discovering how evidence is weighed according to circumstance, personality, training, and briefing. The human decision maker (DM) provides the backbone of the information-processing activity associated with military engagements because of the inherent uncertainty of combat operations. To represent this process and assess equipment as well as non-technological interventions such as training and TTPs, we are developing modular, timed, analytic stochastic model components and instruments as part of a framework to support quantitative assessment of intelligence production and consumption methods in a mission space centered on the human decision maker. In this paper, we formulate an abstraction of the human intelligence fusion process from the Defence Science and Technology Laboratory's (Dstl's) INCIDER model to include in our framework, and synthesize relevant cost and benefit characteristics.
An integrative framework for sensor-based measurement of teamwork in healthcare
Rosen, Michael A; Dietz, Aaron S; Yang, Ting; Priebe, Carey E; Pronovost, Peter J
2015-01-01
There is a strong link between teamwork and patient safety. Emerging evidence supports the efficacy of teamwork improvement interventions. However, the availability of reliable, valid, and practical measurement tools and strategies is commonly cited as a barrier to long-term sustainment and spread of these teamwork interventions. This article describes the potential value of sensor-based technology as a methodology to measure and evaluate teamwork in healthcare. The article summarizes the teamwork literature within healthcare, including team improvement interventions and measurement. Current applications of sensor-based measurement of teamwork are reviewed to assess the feasibility of employing this approach in healthcare. The article concludes with a discussion highlighting current application needs and gaps and relevant analytical techniques to overcome the challenges to implementation. Compelling studies exist documenting the feasibility of capturing a broad array of team input, process, and output variables with sensor-based methods. Implications of this research are summarized in a framework for development of multi-method team performance measurement systems. Sensor-based measurement within healthcare can unobtrusively capture information related to social networks, conversational patterns, physical activity, and an array of other meaningful information without having to directly observe or periodically survey clinicians. However, trust and privacy concerns present challenges that need to be overcome through engagement of end users in healthcare. Initial evidence exists to support the feasibility of sensor-based measurement to drive feedback and learning across individual, team, unit, and organizational levels. Future research is needed to refine methods, technologies, theory, and analytical strategies. PMID:25053579
Research implications of science-informed, value-based decision making.
Dowie, Jack
2004-01-01
In 'Hard' science, scientists correctly operate as the 'guardians of certainty', using hypothesis testing formulations and value judgements about error rates and time discounting that make classical inferential methods appropriate. But these methods can neither generate most of the inputs needed by decision makers in their time frame, nor generate them in a form that allows them to be integrated into the decision in an analytically coherent and transparent way. The need for transparent accountability in public decision making under uncertainty and value conflict means the analytical coherence provided by the stochastic Bayesian decision analytic approach, drawing on the outputs of Bayesian science, is needed. If scientific researchers are to play the role they should be playing in informing value-based decision making, they need to see themselves also as 'guardians of uncertainty', ensuring that the best possible current posterior distributions on relevant parameters are made available for decision making, irrespective of the state of the certainty-seeking research. The paper distinguishes the actors employing different technologies in terms of the focus of the technology (knowledge, values, choice); the 'home base' mode of their activity on the cognitive continuum of varying analysis-to-intuition ratios; and the underlying value judgements of the activity (especially error loss functions and time discount rates). Those who propose any principle of decision making other than the banal 'Best Principle', including the 'Precautionary Principle', are properly interpreted as advocates seeking to have their own value judgements and preferences regarding mode location apply. The task for accountable decision makers, and their supporting technologists, is to determine the best course of action under the universal conditions of uncertainty and value difference/conflict.
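The "Best Principle" invoked above is expected-utility maximization over the current posterior. A minimal sketch of that decision-analytic step (the posterior, action set, and utility values are all invented for illustration):

```python
import random

# Sketch of expected-utility decision making: choose the action with the
# highest expected utility over posterior samples of an uncertain
# parameter. All numbers below are hypothetical.
random.seed(7)
posterior_samples = [random.gauss(0.3, 0.1) for _ in range(10_000)]  # P(harm)

def utility(action, p_harm):
    # Hypothetical utilities: intervening costs 2; unmitigated harm costs 10.
    if action == "intervene":
        return -2.0
    return -10.0 * p_harm  # "wait" risks the harm

def best_action(actions, samples):
    def expected(a):
        return sum(utility(a, p) for p in samples) / len(samples)
    return max(actions, key=expected)

print(best_action(["intervene", "wait"], posterior_samples))
```

The point of the paper's argument is that the posterior feeding this calculation should be the best currently available distribution, whether or not the underlying hypothesis-testing research has reached "certainty".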
Biosensor technology: technology push versus market pull.
Luong, John H T; Male, Keith B; Glennon, Jeremy D
2008-01-01
Biosensor technology is based on a specific biological recognition element in combination with a transducer for signal processing. Since its inception, biosensors have been expected to play a significant analytical role in medicine, agriculture, food safety, homeland security, and environmental and industrial monitoring. However, the commercialization of biosensor technology has significantly lagged behind the research output, as reflected by a plethora of publications and patenting activity. This slow and limited technology transfer can be attributed to cost considerations and some key technical barriers. Analytical chemistry has changed considerably, driven by automation, miniaturization, and system integration with high throughput for multiple tasks. Such requirements pose a great challenge for biosensor technology, which is often designed to detect a single target analyte or only a few. Successful biosensors must be versatile enough to support interchangeable biorecognition elements; in addition, miniaturization must be feasible to allow automation and parallel sensing with ease of operation at a competitive cost. A significant upfront investment in research and development is a prerequisite for the commercialization of biosensors. Progress in such endeavors has been incremental, with limited success; thus, market entry for a new venture is very difficult unless a niche product can be developed with a considerable market volume.
Using PAT to accelerate the transition to continuous API manufacturing.
Gouveia, Francisca F; Rahbek, Jesper P; Mortensen, Asmus R; Pedersen, Mette T; Felizardo, Pedro M; Bro, Rasmus; Mealy, Michael J
2017-01-01
Significant improvements can be realized by converting conventional batch processes into continuous ones. The main drivers include reduction of cost and waste, increased safety, and simpler scale-up and tech-transfer activities. Re-designing the process layout offers the opportunity to incorporate a set of process analytical technologies (PAT) embraced in the Quality-by-Design (QbD) framework. These tools are used for process state estimation, providing enhanced understanding of the underlying process variability impacting quality and yield. This work describes a road map for identifying the best technology to speed up the development of continuous processes while providing the basis for developing analytical methods for monitoring and controlling the continuous full-scale reaction. The suitability of in-line Raman, FT-infrared (FT-IR), and near-infrared (NIR) spectroscopy for real-time process monitoring was investigated in the production of 1-bromo-2-iodobenzene. The synthesis consists of three consecutive reaction steps, including the formation of an unstable diazonium salt intermediate that is critical to securing high yield and avoiding by-product formation. All spectroscopic methods captured critical information related to the accumulation of the intermediate with very similar accuracy. NIR spectroscopy proved satisfactory in terms of performance, ease of installation, full-scale transferability, and stability under very adverse process conditions. As such, in-line NIR was selected to monitor the continuous full-scale production. The quantitative method was developed against theoretical concentration values of the intermediate, since representative sampling for off-line reference analysis cannot be achieved. The rapid and reliable analytical system sped up the design of the continuous process and improved understanding of the manufacturing requirements for ensuring optimal yield and avoiding unreacted raw materials and by-products in the continuous reactor effluent.
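Building a quantitative method "against theoretical concentration values" means regressing a spectral signal on the concentrations expected from reaction stoichiometry rather than on off-line reference measurements. A minimal univariate sketch of that calibration step (real NIR calibrations are typically multivariate, e.g. PLS, and all numbers below are invented):

```python
# Univariate calibration sketch: ordinary least squares of one spectral
# feature against theoretical intermediate concentrations, then
# prediction of concentration from a new signal. Invented data.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

theoretical_conc = [0.0, 0.1, 0.2, 0.3, 0.4]    # mol/L (assumed)
nir_signal = [0.02, 0.21, 0.39, 0.62, 0.80]     # a.u. (invented)
slope, intercept = fit_line(theoretical_conc, nir_signal)

def predict_conc(signal):
    """Invert the calibration line to estimate concentration."""
    return (signal - intercept) / slope

print(predict_conc(0.50))  # concentration estimated from a new spectrum
```

The trade-off the abstract hints at is that calibrating against theoretical values transfers any stoichiometric model error into the method, which is acceptable here because representative off-line sampling of the unstable intermediate is impossible.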
The Dairy Technology System in Venezuela. Summary of Research 79.
ERIC Educational Resources Information Center
Nieto, Ruben D.; Henderson, Janet L.
A study examined the agricultural technology system in Venezuela with emphasis on the dairy industry. An analytical framework was used to identify the strengths and weaknesses of the following components of Venezuela's agricultural technology system: policy, technology development, technology transfer, and technology use. Selected government…
Fluorescence metrology used for analytics of high-quality optical materials
NASA Astrophysics Data System (ADS)
Engel, Axel; Haspel, Rainer; Rupertus, Volker
2004-09-01
Optical glasses, glass ceramics, and crystals are used for various specialized applications in telecommunication, biomedical, optical, and microlithography technology. To qualify and control material quality during research and production, Schott Glas applies several specialized ultra-trace analysis methods. One focus of the activities is the determination of impurities in the sub-ppb regime, since such impurity levels are required, for example, for pure materials used in microlithography. Such impurities are determined using analytical methods like LA-ICP-MS and neutron activation analysis. In addition, direct, non-destructive optical analysis is attractive because it directly reflects the requirements of the optical application. Typical examples are absorption and laser resistivity measurements of optical material using precision spectral photometers, or in-situ transmission measurements using lamps or UV lasers. Chemical analytical methods have the drawback of being time consuming and rather expensive, whereas the sensitivity of the absorption method will not be sufficient to characterize future needs (absorption coefficients well below 10-3 cm-1). For non-destructive qualification against current and future quality requirements, a Jobin Yvon FLUOROLOG 3.22 fluorescence spectrometer is employed to enable fast and precise qualification and analysis. The main advantage of this setup is the combination of highest sensitivity (more than one order of magnitude higher than state-of-the-art UV absorption spectroscopy) with fast measurement and evaluation cycles (several minutes, compared to the several hours necessary for chemical analysis). An overview is given of the spectral characteristics of specified standards, which are necessary to establish the analytical system. 
The elementary fluorescence and absorption of rare-earth-element impurities, as well as impurity-induced luminescence from crystal defects, were investigated. Quantitative numbers are given for the relative quantum yield and for the excitation cross section for doped glass and calcium fluoride.
Planetary quarantine: Principles, methods, and problems.
NASA Technical Reports Server (NTRS)
Hall, L. B.
1971-01-01
Microbial survival in deep space environment, contamination of planets by nonsterile flight hardware, and hazards of back contamination are among the topics covered in papers concerned with the analytical basis for planetary quarantine. The development of the technology and policies of planetary quarantine is covered in contributions on microbiologic assay and sterilization of space flight hardware and control of microbial contamination. A comprehensive subject index is included. Individual items are abstracted in this issue.
ERIC Educational Resources Information Center
Ahmed, Abdelrahman M.; AbdelAlmuniem, Arwa; Almabhouh, Ahmed A.
2016-01-01
This study aimed to identify the current status of using Web 2.0 tools in university teaching by the faculty members of the College of Education at Sudan University of Science and Technology. The study used a descriptive analytical method based on the use of questionnaires and interviews. The questionnaire was administered to a sample of 40…
Computational toxicity in 21st century safety sciences (China ...
presentation at the Joint Meeting of the Analytical Toxicology and Computational Toxicology Committee (Chinese Society of Toxicology) International Workshop on Advanced Chemical Safety Assessment Technologies on 11 May 2016, Fuzhou University, Fuzhou, China
DNA biosensing with 3D printing technology.
Loo, Adeline Huiling; Chua, Chun Kiang; Pumera, Martin
2017-01-16
3D printing, an upcoming technology, has vast potential to transform conventional fabrication processes due to the numerous improvements it can offer to the current methods. To date, the employment of 3D printing technology has been examined for applications in the fields of engineering, manufacturing and biological sciences. In this study, we examined the potential of adopting 3D printing technology for a novel application, electrochemical DNA biosensing. Metal 3D printing was utilized to construct helical-shaped stainless steel electrodes which functioned as a transducing platform for the detection of DNA hybridization. The ability of electroactive methylene blue to intercalate into the double helix structure of double-stranded DNA was then exploited to monitor the DNA hybridization process, with its inherent reduction peak serving as an analytical signal. The designed biosensing approach was found to demonstrate superior selectivity against a non-complementary DNA target, with a detection range of 1-1000 nM.
Uy, Raymonde Charles Y; Kury, Fabricio P; Fontelo, Paul A
2015-01-01
The standard of safe medication practice requires strict observance of the five rights of medication administration: the right patient, drug, time, dose, and route. Despite adherence to these guidelines, medication errors remain a public health concern that has generated health policies and hospital processes that leverage automation and computerization to reduce these errors. Bar code, RFID, biometrics and pharmacy automation technologies have been demonstrated in the literature to decrease the incidence of medication errors by minimizing the human factors involved in the process. Despite evidence suggesting the effectiveness of these technologies, adoption rates and trends vary across hospital systems. The objective of this study is to examine the state and adoption trends of automatic identification and data capture (AIDC) methods and pharmacy automation technologies in U.S. hospitals. A retrospective descriptive analysis of survey data from the HIMSS Analytics® Database was done, demonstrating optimistic growth in the adoption of these patient safety solutions.
Biosensing Technologies for Mycobacterium tuberculosis Detection: Status and New Developments
Zhou, Lixia; He, Xiaoxiao; He, Dinggeng; Wang, Kemin; Qin, Dilan
2011-01-01
Biosensing technologies promise to improve Mycobacterium tuberculosis (M. tuberculosis) detection and management in clinical diagnosis, food analysis, bioprocess, and environmental monitoring. A variety of portable, rapid, and sensitive biosensors with immediate “on-the-spot” interpretation have been developed for M. tuberculosis detection based on different biological elements recognition systems and basic signal transducer principles. Here, we present a synopsis of current developments of biosensing technologies for M. tuberculosis detection, which are classified on the basis of basic signal transducer principles, including piezoelectric quartz crystal biosensors, electrochemical biosensors, and magnetoelastic biosensors. Special attention is paid to the methods for improving the framework and analytical parameters of the biosensors, including sensitivity and analysis time as well as automation of analysis procedures. Challenges and perspectives of biosensing technologies development for M. tuberculosis detection are also discussed in the final part of this paper. PMID:21437177
Vehicle concepts and technology requirements for buoyant heavy-lift systems
NASA Technical Reports Server (NTRS)
Ardema, M. D.
1981-01-01
Several buoyant-vehicle (airship) concepts proposed for short hauls of heavy payloads are described. Numerous studies identified operating cost and payload capacity advantages relative to existing or proposed heavy-lift helicopters for such vehicles. Applications involving payloads of from 15 tons up to 800 tons were identified. The buoyant quad-rotor concept is discussed in detail, including the history of its development, current estimates of performance and economics, currently perceived technology requirements, and recent research and technology development. It is concluded that the buoyant quad-rotor, and possibly other buoyant vehicle concepts, has the potential of satisfying the market for very heavy vertical lift but that additional research and technology development are necessary. Because of uncertainties in analytical prediction methods and small-scale experimental measurements, there is a strong need for large or full-scale experiments in ground test facilities and, ultimately, with a flight research vehicle.
Field Performance of ISFET based Deep Ocean pH Sensors
NASA Astrophysics Data System (ADS)
Branham, C. W.; Murphy, D. J.
2017-12-01
Historically, ocean pH time series data were acquired from infrequent shipboard grab samples and measured using labor-intensive spectrophotometry methods. However, with the introduction of robust and stable ISFET pH sensors for use in ocean applications, a paradigm shift has occurred in the methods used to acquire long-term pH time series data. Sea-Bird Scientific played a critical role in the adoption of this new technology by commercializing the SeaFET pH sensor and float pH sensor developed by the MBARI chemical sensor group. Sea-Bird Scientific continues to advance this technology through a concerted effort to improve pH sensor accuracy and reliability by characterizing sensor performance in the laboratory and field. This presentation will focus on calibration of the ISFET pH sensor, evaluation of its analytical performance, and validation of that performance using recent field data.
Single-Cell Detection of Secreted Aβ and sAPPα from Human IPSC-Derived Neurons and Astrocytes.
Liao, Mei-Chen; Muratore, Christina R; Gierahn, Todd M; Sullivan, Sarah E; Srikanth, Priya; De Jager, Philip L; Love, J Christopher; Young-Pearse, Tracy L
2016-02-03
Secreted factors play a central role in normal and pathological processes in every tissue in the body. The brain is composed of a highly complex milieu of different cell types and few methods exist that can identify which individual cells in a complex mixture are secreting specific analytes. By identifying which cells are responsible, we can better understand neural physiology and pathophysiology, more readily identify the underlying pathways responsible for analyte production, and ultimately use this information to guide the development of novel therapeutic strategies that target the cell types of relevance. We present here a method for detecting analytes secreted from single human induced pluripotent stem cell (iPSC)-derived neural cells and have applied the method to measure amyloid β (Aβ) and soluble amyloid precursor protein-alpha (sAPPα), analytes central to Alzheimer's disease pathogenesis. Through these studies, we have uncovered the dynamic range of secretion profiles of these analytes from single iPSC-derived neuronal and glial cells and have molecularly characterized subpopulations of these cells through immunostaining and gene expression analyses. In examining Aβ and sAPPα secretion from single cells, we were able to identify previously unappreciated complexities in the biology of APP cleavage that could not otherwise have been found by studying averaged responses over pools of cells. This technique can be readily adapted to the detection of other analytes secreted by neural cells, which would have the potential to open new perspectives into human CNS development and dysfunction. We have established a technology that, for the first time, detects secreted analytes from single human neurons and astrocytes. We examine secretion of the Alzheimer's disease-relevant factors amyloid β (Aβ) and soluble amyloid precursor protein-alpha (sAPPα) and present novel findings that could not have been observed without a single-cell analytical platform. 
First, we identify a previously unappreciated subpopulation that secretes high levels of Aβ in the absence of detectable sAPPα. Further, we show that multiple cell types secrete high levels of Aβ and sAPPα, but cells expressing GABAergic neuronal markers are overrepresented. Finally, we show that astrocytes are competent to secrete high levels of Aβ and therefore may be a significant contributor to Aβ accumulation in the brain. Copyright © 2016 the authors 0270-6474/16/361730-17$15.00/0.
Self-Assembly of Large Gold Nanoparticles for Surface-Enhanced Raman Spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Guang; Nanda, Jagjit; Wang, Boya
Performance of portable technologies from mobile phones to electric vehicles is currently limited by the energy density and lifetime of lithium batteries. Expanding the limits of battery technology requires in situ detection of trace components at electrode–electrolyte interphases. Surface-enhanced Raman spectroscopy could satisfy this need if a robust and reproducible substrate were available. Gold nanoparticles (Au NPs) larger than 20 nm in diameter are expected to greatly enhance Raman intensity if they can be assembled into ordered monolayers. A three-phase self-assembly method is presented that successfully produces ordered Au NP monolayers for particle diameters ranging from 13 to 90 nm. The monolayer structure and Raman enhancement factors (EFs) are reported for a model analyte, rhodamine, as well as the best performing polymer electrolyte salt, lithium bis(trifluoromethane)sulfonimide. Experimental EFs for the most part correlate with predictions based on monolayer geometry and with numerical simulations that identify local electromagnetic field enhancements. Lastly, the EFs for the best performing Au NP monolayer are between 10^6 and 10^8 and give a quantitative signal response when the analyte concentration is changed.
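The enhancement factors quoted in this abstract follow the standard per-molecule definition used throughout the SERS literature. A minimal sketch of that textbook calculation follows; the intensities and molecule counts are purely hypothetical, chosen only to land in the reported 10^6 to 10^8 range.

```python
def enhancement_factor(i_sers, n_sers, i_raman, n_raman):
    """Standard SERS enhancement factor:
    EF = (I_SERS / N_SERS) / (I_Raman / N_Raman),
    comparing per-molecule signal on the substrate against
    the per-molecule signal of a normal Raman reference."""
    return (i_sers / n_sers) / (i_raman / n_raman)

# Hypothetical numbers: the monolayer substrate probes far fewer
# molecules than the bulk reference yet gives a comparable signal.
ef = enhancement_factor(i_sers=5.0e4, n_sers=1.0e6,
                        i_raman=1.0e3, n_raman=2.0e11)
print(f"{ef:.1e}")  # → 1.0e+07
```

Estimating N_SERS is usually the hard part in practice; it is typically derived from the monolayer geometry (particle density times molecules per particle), which is why the abstract's geometric predictions matter for the comparison.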
Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin
2014-06-01
The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance to QbD implementation. To show a possible solution, this work proposed a rapid process development method that used direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic process of Ginkgo biloba L. as an example. The breakthrough curves were rapidly determined by DART-MS at-line. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly, showing a decreased adsorption capacity with increasing flow rate. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.
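The correlation coefficient of 0.9520 reported between the two instruments is an ordinary Pearson r. A minimal pure-Python sketch of that calculation on hypothetical paired concentration readings (the actual paired data are not given in the abstract):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired ginkgolide A readings (DART-MS vs HPLC, arbitrary units).
dart = [0.12, 0.25, 0.40, 0.55, 0.71]
hplc = [0.10, 0.27, 0.38, 0.58, 0.69]
print(f"{pearson_r(dart, hplc):.4f}")
```

A high r like this supports using the fast at-line DART-MS readings as a proxy for the slower HPLC reference method, which is the core of the PAT argument made here.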
Peres, Daniela D'Almeida; Ariede, Maira Bueno; Candido, Thalita Marcilio; de Almeida, Tania Santos; Lourenço, Felipe Rebello; Consiglieri, Vladi Olga; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Baby, André Rolim
2017-02-01
Multifunctional formulations are of great importance to ensure better skin protection from harm caused by ultraviolet (UV) radiation. Despite the advantages of Quality by Design (QbD) and Process Analytical Technology (PAT) approaches to the development and optimization of new products, we found in the literature only a few studies concerning their applications in the cosmetic product industry. Thus, in this research work, we applied the QbD and PAT approaches to the development of multifunctional sunscreens containing bemotrizinol, ethylhexyl triazone, and ferulic acid. In addition, a UV transmittance method was applied to assess qualitative and quantitative critical quality attributes of sunscreens using chemometric analyses. Linear discriminant analysis allowed classifying unknown formulations, which is useful for the investigation of counterfeiting and adulteration. Simultaneous quantification of ethylhexyl triazone, bemotrizinol, and ferulic acid present in the formulations was performed using PLS regression. This design allowed us to verify the compounds in isolation and in combination and to demonstrate the antioxidant action of ferulic acid in addition to the sunscreen actions, since the presence of this component increased the in vitro antioxidant activity by 90%.
Understanding Education Involving Geovisual Analytics
ERIC Educational Resources Information Center
Stenliden, Linnea
2013-01-01
Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…
Technology Enhanced Analytics (TEA) in Higher Education
ERIC Educational Resources Information Center
Daniel, Ben Kei; Butson, Russell
2013-01-01
This paper examines the role of Big Data Analytics in addressing contemporary challenges associated with current changes in institutions of higher education. The paper first explores the potential of Big Data Analytics to support instructors, students and policy analysts to make better evidence based decisions. Secondly, the paper presents an…
Penetrating the Fog: Analytics in Learning and Education
ERIC Educational Resources Information Center
Siemens, George; Long, Phil
2011-01-01
Attempts to imagine the future of education often emphasize new technologies--ubiquitous computing devices, flexible classroom designs, and innovative visual displays. But the most dramatic factor shaping the future of higher education is something that people cannot actually touch or see: "big data and analytics." Learning analytics is still in…
Developing a Code of Practice for Learning Analytics
ERIC Educational Resources Information Center
Sclater, Niall
2016-01-01
Ethical and legal objections to learning analytics are barriers to development of the field, thus potentially denying students the benefits of predictive analytics and adaptive learning. Jisc, a charitable organization that champions the use of digital technologies in UK education and research, has attempted to address this with the development of…
Overcoming Barriers to Educational Analytics: How Systems Thinking and Pragmatism Can Help
ERIC Educational Resources Information Center
Macfadyen, Leah P.
2017-01-01
Learning technologies are now commonplace in education, and generate large volumes of educational data. Scholars have argued that analytics can and should be employed to optimize learning and learning environments. This article explores what is really meant by "analytics", describes the current best-known examples of institutional…
Investigation of Using Analytics in Promoting Mobile Learning Support
ERIC Educational Resources Information Center
Visali, Videhi; Swami, Niraj
2013-01-01
Learning analytics can promote pedagogically informed use of learner data, which can steer the progress of technology mediated learning across several learning contexts. This paper presents the application of analytics to a mobile learning solution and demonstrates how a pedagogical sense was inferred from the data. Further, this inference was…
NASA Astrophysics Data System (ADS)
Qin, Ting; Liao, Congwei; Huang, Shengxiang; Yu, Tianbao; Deng, Lianwen
2018-01-01
An analytical drain current model based on the surface potential is proposed for amorphous indium gallium zinc oxide (a-InGaZnO) thin-film transistors (TFTs) with a synchronized symmetric dual-gate (DG) structure. Solving the electric field, surface potential (φS), and central potential (φ0) of the InGaZnO film using the Poisson equation with the Gaussian method and Lambert function is demonstrated in detail. The compact analytical model of current-voltage behavior, which consists of drift and diffusion components, is investigated by regional integration, and voltage-dependent effective mobility is taken into account. Comparison results demonstrate that the calculation results obtained using the derived models match well with the simulation results obtained using a technology computer-aided design (TCAD) tool. Furthermore, the proposed model is incorporated into SPICE simulations using Verilog-A to verify the feasibility of using DG InGaZnO TFTs for high-performance circuit designs.
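The Lambert function mentioned above is the transcendental inverse satisfying w·e^w = a, which commonly appears when surface-potential equations with exponential charge terms are cast in closed form. Below is a generic pure-Python Newton iteration for its principal branch; this illustrates only the mathematical tool, not the paper's device model.

```python
import math

def lambert_w(a, tol=1e-12):
    """Principal branch of the Lambert W function for a >= 0:
    solves w * exp(w) = a by Newton's iteration."""
    w = math.log1p(a)  # reasonable starting guess on [0, inf)
    for _ in range(100):
        ew = math.exp(w)
        w_next = w - (w * ew - a) / (ew * (w + 1.0))
        if abs(w_next - w) < tol:
            return w_next
        w = w_next
    return w

# W(1) is the Omega constant; the defining identity W(a)*exp(W(a)) = a holds.
w = lambert_w(1.0)
print(f"{w:.6f}")  # → 0.567143
```

In compact TFT models this kind of solve lets the surface potential be written explicitly in the gate voltage, avoiding an inner numerical loop at every bias point in SPICE.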
Annual banned-substance review: analytical approaches in human sports drug testing.
Thevis, Mario; Kuuranne, Tiia; Geyer, Hans; Schänzer, Wilhelm
2015-01-01
Within the mosaic display of international anti-doping efforts, analytical strategies based on up-to-date instrumentation as well as the most recent information about the physiology, pharmacology, metabolism, etc., of prohibited substances and methods of doping are indispensable. The continuous emergence of new chemical entities and the identification of arguably beneficial effects of established or even obsolete drugs on endurance, strength, and regeneration necessitate frequent and adequate adaptations of sports drug testing procedures. These largely rely on exploiting new technologies, extending the substance coverage of existing test protocols, and generating new insights into the metabolism, distribution, and elimination of compounds prohibited by the World Anti-Doping Agency (WADA). With reference to the content of the 2014 Prohibited List, literature concerning human sports drug testing that was published between October 2013 and September 2014 is summarized and reviewed in this annual banned-substance review, with particular emphasis on analytical approaches and their contribution to enhanced doping controls. Copyright © 2014 John Wiley & Sons, Ltd.
Impact of Profiling Technologies in the Understanding of Recombinant Protein Production
NASA Astrophysics Data System (ADS)
Vijayendran, Chandran; Flaschel, Erwin
Since expression profiling methods have been available in a high-throughput fashion, the implication of these technologies in the field of biotechnology has increased dramatically. Microarray technology is one such unique and efficient methodology for the simultaneous exploration of the expression levels of numerous genes. Likewise, two-dimensional gel electrophoresis or multidimensional liquid chromatography coupled with mass spectrometry are extensively utilised for studying the expression levels of numerous proteins. In the field of biotechnology these highly parallel analytical methods have paved the way to study and understand various biological phenomena based on expression patterns. The next phenomenological level is represented by the metabolome and the (metabolic) fluxome. However, this chapter reviews gene and protein profiling and their impact on understanding recombinant protein production. We focus on the computational methods utilised for the analyses of data obtained from these profiling technologies, as well as prominent results focusing on recombinant protein expression with Escherichia coli. Owing to the knowledge accumulated with respect to cellular signals triggered during recombinant protein production, the field is on the way to designing strategies for developing improved processes. Both gene and protein profiling have revealed a handful of functional categories on which to concentrate in order to identify target genes and proteins, respectively, involved in the signalling network with a major impact on recombinant protein production.
Selected Analytical Methods for Environmental Remediation ...
The US Environmental Protection Agency’s Office of Research and Development (ORD) conducts cutting-edge research that provides the underpinning of science and technology for public health and environmental policies and decisions made by federal, state and other governmental organizations. ORD’s six research programs identify the pressing research needs with input from EPA offices and stakeholders. Research is conducted by ORD’s 3 labs, 4 centers, and 2 offices located in 14 facilities. The EPA booth at APHL will have several resources available to attendees, mostly in the form of print materials, that showcase our research labs, case studies of research activities, and descriptions of specific research projects. The Selected Analytical Methods for Environmental Remediation and Recovery (SAM), a library of selected methods that are helping to increase the nation's laboratory capacity to support large-scale emergency response operations, will be demoed by EPA scientists at the APHL Experience booth in the Exhibit Hall on Tuesday during the morning break. Please come to the EPA booth #309 for more information!
Criado-García, Laura; Garrido-Delgado, Rocío; Arce, Lourdes; Valcárcel, Miguel
2013-07-15
A UV-Ion Mobility Spectrometer is a simple, rapid, inexpensive instrument widely used in environmental analysis, among other fields. The advantageous features of its underlying technology can be of great help towards developing reliable, economical methods for determining gaseous compounds from gaseous, liquid and solid samples. Developing an effective method using UV-Ion Mobility Spectrometry (UV-IMS) to determine volatile analytes entails using appropriate gaseous standards for calibrating the spectrometer. In this work, two home-made sample introduction systems (SISs) and a commercial gas generator were used to obtain such gaseous standards. The first home-made SIS was a static head-space system for measuring compounds present in liquid samples, and the other was an exponential dilution set-up for measuring compounds present in gaseous samples. Gaseous compounds generated by each method were determined on-line by UV-IMS. The target analytes chosen for this comparative study were ethanol, acetone, benzene, toluene, ethylbenzene and the xylene isomers. All the alternatives were acceptable in terms of sensitivity, precision and selectivity. Copyright © 2013 Elsevier B.V. All rights reserved.
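The exponential dilution set-up mentioned above relies on the standard ideally-mixed-flask relation C(t) = C0·exp(−F·t/V). A minimal sketch of that relation follows; the flow, volume and analyte values are hypothetical, not the paper's conditions.

```python
import math

def exponential_dilution(c0, flow, volume, t):
    """Concentration in an ideally mixed exponential-dilution flask:
    C(t) = C0 * exp(-F * t / V),
    with flow F, flask volume V and time t in consistent units."""
    return c0 * math.exp(-flow * t / volume)

# Hypothetical toluene standard: 100 ppm start, 0.5 L/min carrier
# flow through a 1 L flask.
for minute in (0, 1, 2, 5):
    print(minute, round(exponential_dilution(100.0, 0.5, 1.0, minute), 2))
# → 0 100.0 / 1 60.65 / 2 36.79 / 5 8.21
```

Sampling the flask outlet at known times therefore yields a whole calibration series of gaseous standards from a single starting concentration, which is what makes the set-up attractive for calibrating the spectrometer.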
Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika
2017-02-01
Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.
Medical student use of digital learning resources.
Scott, Karen; Morris, Anne; Marais, Ben
2018-02-01
University students expect to use technology as part of their studies, yet health professional teachers can struggle with the change in student learning habits fuelled by technology. Our research aimed to document the learning habits of contemporary medical students during a clinical rotation by exploring the use of locally and externally developed digital and print self-directed learning resources, and study groups. We investigated the learning habits of final-stage medical students during their clinical paediatric rotation using mixed methods, involving learning analytics and a student questionnaire. Learning analytics tracked aggregate student usage statistics of locally produced e-learning resources on two learning management systems and mobile learning resources. The questionnaire recorded student-reported use of digital and print learning resources and study groups. The students made extensive use of digital self-directed learning resources, especially in the 2 weeks before the examination, which peaked the day before the written examination. All students used locally produced digital formative assessment, and most (74/98; 76%) also used digital resources developed by other institutions. Most reported finding locally produced e-learning resources beneficial for learning. In terms of traditional forms of self-directed learning, one-third (28/94; 30%) indicated that they never read the course textbook, and few students used face-to-face 39/98 (40%) or online 6/98 (6%) study groups. Learning analytics and student questionnaire data confirmed the extensive use of digital resources for self-directed learning. Through clarification of learning habits and experiences, we think teachers can help students to optimise effective learning strategies; however, the impact of contemporary learning habits on learning efficacy requires further evaluation.
© 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.
Zhang, Chunhui; Ning, Ke; Zhang, Wenwen; Guo, Yuanjie; Chen, Jun; Liang, Chen
2013-04-01
Increased attention is currently being directed towards the potential negative effects of antibiotics and other PPCPs discharged into the aquatic environment via municipal WWTP secondary effluents. A number of analytical methods, such as high performance liquid chromatography technologies, including high performance liquid chromatography with fluorescence detection (HPLC-FLD), with UV detection (HPLC-UV) and with mass spectrometry (HPLC-MS), have been suggested as determination technologies for antibiotic residues in water. In this study, we implement a combined HPLC-MS/MS method to detect and analyze antibiotics in WWTP secondary effluent, and apply a horizontal subsurface flow constructed wetland (CW) as an advanced wastewater treatment for removing antibiotics from the WWTP secondary effluent. The results show that, among the 22 antibiotics considered, 2 macrolides, 2 quinolones and 5 sulfonamides were present in the WWTP secondary effluent. After the CW advanced treatment, the concentration removal efficiencies and removal loads of the 9 antibiotics were 53-100% and 0.004-0.7307 μg m⁻² per day, respectively.
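The two performance figures quoted above follow directly from influent and effluent concentrations: removal efficiency is the fractional concentration drop, and the areal removal load normalizes the removed mass flux by wetland area. A minimal sketch with hypothetical readings (not the study's data):

```python
def removal_efficiency(c_in, c_out):
    """Concentration removal efficiency across the wetland, in percent."""
    return (c_in - c_out) / c_in * 100.0

def areal_removal_load(c_in, c_out, flow_l_per_day, area_m2):
    """Mass removed per unit wetland area, in ug m^-2 per day,
    for concentrations given in ug/L."""
    return (c_in - c_out) * flow_l_per_day / area_m2

# Hypothetical readings for one antibiotic: 0.50 ug/L in, 0.10 ug/L out,
# 200 L/day through a 150 m^2 wetland cell.
print(round(removal_efficiency(0.50, 0.10), 1))                # → 80.0
print(round(areal_removal_load(0.50, 0.10, 200.0, 150.0), 3))  # → 0.533
```

Both example results fall inside the ranges reported in the abstract (53-100% and 0.004-0.7307 μg m⁻² per day), illustrating how the paired metrics relate concentration data to wetland sizing.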
A methodology for designing aircraft to low sonic boom constraints
NASA Technical Reports Server (NTRS)
Mack, Robert J.; Needleman, Kathy E.
1991-01-01
A method for designing conceptual supersonic cruise aircraft to meet low sonic boom requirements is outlined and described. The aircraft design is guided through a systematic evolution from initial three view drawing to a final numerical model description, while the designer using the method controls the integration of low sonic boom, high supersonic aerodynamic efficiency, adequate low speed handling, and reasonable structure and materials technologies. Some experience in preliminary aircraft design and in the use of various analytical and numerical codes is required for integrating the volume and lift requirements throughout the design process.
Interactive program for analysis and design problems in advanced composites technology
NASA Technical Reports Server (NTRS)
Cruse, T. A.; Swedlow, J. L.
1971-01-01
During the past year an experimental program in the fracture of advanced fiber composites has been completed. The experimental program has given direction to additional experimental and theoretical work. A synthesis program for designing low weight multifastener joints in composites is proposed, based on extensive analytical background. A number of failed joints have been thoroughly analyzed to evaluate the failure hypothesis used in the synthesis procedure. Finally, a new solution is reported for isotropic and anisotropic laminates using the boundary-integral method. The solution method offers significant savings of computer core and time for important problems.
Assessing Technological Change in Cardiothoracic Surgery
Iribarne, Alexander; Russo, Mark J.; Moskowitz, Alan J.; Ascheim, Deborah D.; Brown, Lawrence D.; Gelijns, Annetine C.
2010-01-01
Technological innovation— broadly defined as the development and introduction of new drugs, devices, and procedures— has played a major role in advancing the field of cardiothoracic surgery. It has generated new forms of care for patients and improved treatment options. Innovation, however, comes at a price. Total national health care expenditures now exceed $2 trillion per year in the United States and all current estimates indicate that this number will continue to rise. As we continue to seek the most innovative medical treatments for cardiovascular disease, the spiraling cost of these technologies comes to the forefront. In this article, we address 3 challenges in managing the health and economic impact of new and emerging technologies in cardiothoracic surgery: (1) challenges associated with the dynamics of technological growth itself; (2) challenges associated with methods of analysis; and (3) the ways in which value judgments and political factors shape the translation of evidence into policy. We conclude by discussing changes in the analytical, financial, and institutional realms that can improve evidence-based decision-making in cardiac surgery. PMID:19632560
ERIC Educational Resources Information Center
Chonkaew, Patcharee; Sukhummek, Boonnak; Faikhamta, Chatree
2016-01-01
The purpose of this study was to investigate the analytical thinking abilities and attitudes towards science learning of grade-11 students through science, technology, engineering, and mathematics (STEM) education integrated with a problem-based learning in the study of stoichiometry. The research tools consisted of a pre- and post-analytical…
CAA Annual Report Fiscal Year 1998.
1998-12-01
Excerpt from the report's table of contents: Chapter 3, Studies and Quick Reaction Analyses & Projects; Chapter 4, Technology Research and Analysis Support (Technology Research; Methodology Research); Chapter 5, Publications, Graphics, and Reproduction; Chapter 6, Analytical Efforts Completed Between FY90 and FY98; Appendix A, Annual Study and Work Evaluation. Chapter 2 highlights major studies and analysis activities which occurred in FY98, and Chapter 3 is the total package of analytical summaries.
What values in design? The challenge of incorporating moral values into design.
Manders-Huits, Noëmi
2011-06-01
Recently, there is increased attention to the integration of moral values into the conception, design, and development of emerging IT. The most reviewed approach for this purpose in ethics and technology so far is Value-Sensitive Design (VSD). This article considers VSD as the prime candidate for implementing normative considerations into design. Its methodology is considered from a conceptual, analytical, normative perspective. The focus here is on the suitability of VSD for integrating moral values into the design of technologies in a way that aligns with an analytical perspective on ethics of technology. Despite its promising character, it turns out that VSD falls short in several respects: (1) VSD does not have a clear methodology for identifying stakeholders, (2) the integration of empirical methods with conceptual research within the methodology of VSD is obscure, (3) VSD runs the risk of committing the naturalistic fallacy when using empirical knowledge for implementing values in design, (4) the concept of values, as well as their realization, is left undetermined and (5) VSD lacks a complementary or explicit ethical theory for dealing with value trade-offs. For the normative evaluation of a technology, I claim that an explicit and justified ethical starting point or principle is required. Moreover, explicit attention should be given to the value aims and assumptions of a particular design. The criteria of adequacy for such an approach or methodology follow from the evaluation of VSD as the prime candidate for implementing moral values in design.
Mojsiewicz-Pieńkowska, Krystyna; Jamrógiewicz, Marzena; Zebrowska, Maria; Sznitowska, Małgorzata; Centkowska, Katarzyna
2011-08-25
Silicone polymers possess unique properties, which make them suitable for many different applications, for example in the pharmaceutical and medical industry. To create an adhesive silicone film, the appropriate silicone components have to be chosen first. From these components two layers were made: an adhesive elastomer applied on the skin, and a non-adhesive elastomer on the other side of the film. The aim of this study was to identify a set of analytical methods that can be used for detailed characterization of the elastomer layers, as needed when designing new silicone films. More specifically, the following methods were combined for detailed identification of the silicone components: Fourier transform infrared spectroscopy (FTIR), proton nuclear magnetic resonance (¹H NMR) and size exclusion chromatography with evaporative light scattering detection (SEC-ELSD). It was demonstrated that these methods, together with a rheological analysis, are suitable for controlling the cross-linking reaction and thus obtaining the desired properties of the silicone film. Adhesive silicone films can be used as universal materials for medical use, particularly for effective treatment of scars and keloids or as drug carriers in transdermal therapy.
Cottenet, Geoffrey; Blancpain, Carine; Sonnard, Véronique; Chuah, Poh Fong
2013-08-01
Considering the increase of the total cultivated land area dedicated to genetically modified organisms (GMO), the consumers' perception toward GMO and the need to comply with various local GMO legislations, efficient and accurate analytical methods are needed for their detection and identification. Considered as the gold standard for GMO analysis, the real-time polymerase chain reaction (RTi-PCR) technology was optimised to produce a high-throughput GMO screening method. Based on simultaneous 24 multiplex RTi-PCR running on a ready-to-use 384-well plate, this new procedure allows the detection and identification of 47 targets on seven samples in duplicate. To comply with GMO analytical quality requirements, a negative and a positive control were analysed in parallel. In addition, an internal positive control was also included in each reaction well for the detection of potential PCR inhibition. Tested on non-GM materials, on different GM events and on proficiency test samples, the method offered high specificity and sensitivity with an absolute limit of detection between 1 and 16 copies depending on the target. Easy to use, fast and cost efficient, this multiplex approach fits the purpose of GMO testing laboratories.
Big data, big knowledge: big data for personalized healthcare.
Viceconti, Marco; Hunter, Peter; Hose, Rod
2015-07-01
The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine become the research priority.
Comparison of Gluten Extraction Protocols Assessed by LC-MS/MS Analysis.
Fallahbaghery, Azadeh; Zou, Wei; Byrne, Keren; Howitt, Crispin A; Colgrave, Michelle L
2017-04-05
The efficiency of gluten extraction is of critical importance to the results derived from any analytical method for gluten detection and quantitation, whether it employs reagent-based technology (antibodies) or analytical instrumentation (mass spectrometry). If the target proteins are not efficiently extracted, the end result will be an underestimation of the gluten content, posing a health risk to people affected by conditions such as celiac disease (CD) and nonceliac gluten sensitivity (NCGS). Five different extraction protocols were investigated using LC-MRM-MS for their ability to efficiently and reproducibly extract gluten. The rapid and simple "IPA/DTT" protocol and the related "two-step" protocol were enriched for gluten proteins, at 55/86% (trypsin/chymotrypsin) and 41/68% of all protein identifications, respectively, with both methods showing high reproducibility (CV < 15%). When using multistep protocols, it was critical to examine all fractions, as coextraction of proteins occurred across fractions, with significant levels of proteins existing in unexpected fractions and not all proteins within a particular gluten class behaving the same.
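The reproducibility criterion quoted above (CV < 15%) is the coefficient of variation, i.e. the sample standard deviation relative to the mean. A minimal sketch with hypothetical replicate peak areas (the abstract reports no raw values):

```python
import statistics

def cv_percent(values):
    """Coefficient of variation (relative standard deviation), in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical triplicate peak areas for one peptide across extraction replicates.
areas = [1.02e6, 1.08e6, 0.98e6]
print(cv_percent(areas) < 15.0)  # True: meets the CV < 15% criterion
```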
Graphing trillions of triangles.
Burkhardt, Paul
2017-07-01
The increasing size of Big Data is often heralded but how data are transformed and represented is also profoundly important to knowledge discovery, and this is exemplified in Big Graph analytics. Much attention has been placed on the scale of the input graph but the product of a graph algorithm can be many times larger than the input. This is true for many graph problems, such as listing all triangles in a graph. Enabling scalable graph exploration for Big Graphs requires new approaches to algorithms, architectures, and visual analytics. A brief tutorial is given to aid the argument for thoughtful representation of data in the context of graph analysis. Then a new algebraic method to reduce the arithmetic operations in counting and listing triangles in graphs is introduced. Additionally, a scalable triangle listing algorithm in the MapReduce model will be presented followed by a description of the experiments with that algorithm that led to the current largest and fastest triangle listing benchmarks to date. Finally, a method for identifying triangles in new visual graph exploration technologies is proposed.
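The abstract above notes that the output of triangle listing can dwarf the input graph. For intuition only, here is a minimal single-machine triangle *counting* sketch using per-edge neighbour-set intersection; this is a textbook baseline, not the paper's algebraic method or its MapReduce listing algorithm:

```python
def count_triangles(edges):
    """Count triangles in an undirected graph given as an edge list.
    For each edge (u, v), triangles through that edge correspond to
    common neighbours of u and v; each triangle is seen once per edge,
    so the total is divided by 3."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    per_edge = sum(len(adj[u] & adj[v]) for u, v in edges)
    return per_edge // 3

print(count_triangles([(0, 1), (1, 2), (0, 2), (2, 3)]))  # 1
```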
Nitrate biosensors and biological methods for nitrate determination.
Sohail, Manzar; Adeloju, Samuel B
2016-06-01
The inorganic nitrate (NO3‾) anion is present under a variety of both natural and artificial environmental conditions. Nitrate is ubiquitous within the environment, food, industrial and physiological systems and is mostly present as a hydrated anion of a corresponding dissolved salt. Due to the significant environmental and toxicological effects of nitrate, its determination and monitoring in environmental and industrial waters are often necessary. A wide range of analytical techniques are available for nitrate determination in various sample matrices. This review discusses biosensors available for nitrate determination using the enzyme nitrate reductase (NaR). We conclude that nitrate determination using biosensors is an excellent non-toxic alternative to all other available analytical methods. Over the last fifteen years, biosensing technology for nitrate analysis has progressed very well; however, there is a need to expedite the development of nitrate biosensors as a suitable alternative to non-enzymatic techniques through the use of different polymers, nanostructures, mediators and strategies to overcome oxygen interference.
Characterization of NIST food-matrix Standard Reference Materials for their vitamin C content.
Thomas, Jeanice B; Yen, James H; Sharpless, Katherine E
2013-05-01
The vitamin C concentrations in three food-matrix Standard Reference Materials (SRMs) from the National Institute of Standards and Technology (NIST) have been determined by liquid chromatography (LC) with absorbance detection. These materials (SRM 1549a Whole Milk Powder, SRM 1849a Infant/Adult Nutritional Formula, and SRM 3233 Fortified Breakfast Cereal) have been characterized to support analytical measurements made by food processors that are required to provide information about their products' vitamin C content on the labels of products distributed in the United States. The SRMs are primarily intended for use in validating analytical methods for the determination of selected vitamins, elements, fatty acids, and other nutrients in these materials and in similar matrixes. They can also be used for quality assurance in the characterization of test samples or in-house control materials, and for establishing measurement traceability. Within-day precision of the LC method used to measure vitamin C in the food-matrix SRMs characterized in this study ranged from 2.7% to 6.5%.
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
7 CFR 94.303 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.303 Section 94.303 Agriculture... POULTRY AND EGG PRODUCTS Processed Poultry Products § 94.303 Analytical methods. The analytical methods... latest edition of the Official Methods of Analysis of AOAC INTERNATIONAL, Suite 500, 481 North Frederick...
SAM Radiochemical Methods Query
Laboratories measuring target radiochemical analytes in environmental samples can use this online query tool to identify analytical methods in EPA's Selected Analytical Methods for Environmental Remediation and Recovery for select radiochemical analytes.
Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin
2011-03-16
The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.
Designing a Marketing Analytics Course for the Digital Age
ERIC Educational Resources Information Center
Liu, Xia; Burns, Alvin C.
2018-01-01
Marketing analytics is receiving great attention because of evolving technology and the radical changes in the marketing environment. This study aims to assist the design and implementation of a marketing analytics course. We assembled a rich data set from four sources: business executives, 400 employers' job postings, one million tweets about…
Big data analytics as a service infrastructure: challenges, desired properties and solutions
NASA Astrophysics Data System (ADS)
Martín-Márquez, Manuel
2015-12-01
CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents. These data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has minimised the necessary collaboration and, more relevantly, the cross data analytics over different domains. These two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient daily-based accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs for CERN's research and engineering community; (3) deliver real-time and batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various and mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.
Nanoscale Surface Plasmonics Sensor With Nanofluidic Control
NASA Technical Reports Server (NTRS)
Wei, Jianjun; Singhal, Sameer; Waldeck, David H.; Kofke, Matthew
2013-01-01
Conventional quantitative protein assays of bodily fluids typically involve multiple steps to obtain desired measurements. Such methods are not well suited for fast and accurate assay measurements in austere environments such as spaceflight and in the aftermath of disasters. Consequently, there is a need for a protein assay technology capable of routinely monitoring proteins in austere environments. For example, there is an immediate need for a urine protein assay to assess astronaut renal health during spaceflight. The disclosed nanoscale surface plasmonics sensor provides a core detection method that can be integrated to a lab-on-chip device that satisfies the unmet need for such a protein assay technology. Assays based upon combinations of nanoholes, nanorings, and nanoslits with transmission surface plasmon resonance (SPR) are used for assays requiring extreme sensitivity, and are capable of detecting specific analytes at concentrations as low as picomole to femtomole level in well-controlled environments. The device operates in a transmission mode configuration in which light is directed at one planar surface of the array, which functions as an optical aperture. The incident light induces surface plasmon light transmission from the opposite surface of the array. The presence of a target analyte is detected by changes in the spectrum of light transmitted by the array when a target analyte induces a change in the refractive index of the fluid within the nanochannels. This occurs, for example, when a target analyte binds to a receptor fixed to the walls of the nanochannels in the array. Independent fluid handling capability for individual nanoarrays on a nanofluidic chip containing a plurality of nanochannel arrays allows each array to be used to sense a different target analyte and/or for paired arrays to analyze control and test samples simultaneously in parallel. 
The present invention incorporates transmission mode nanoplasmonics and nanofluidics into a single, microfluidically controlled device. The device comprises one or more arrays of aligned nanochannels that are in fluid communication with inflowing and outflowing fluid handling manifolds that control the flow of fluid through the arrays. The array acts as an aperture in a plasmonic sensor. Fluid, in the form of a liquid or a gas and comprising a sample for analysis, is moved from an inlet manifold through the nanochannel array, and out through an exit manifold. The fluid may also contain a reagent used to modify the interior surfaces of the nanochannels, and/or a reagent required for the detection of an analyte.
Ferguson, Alicia; Boomer, Ryan M.; Kurz, Markus; Keene, Sara C.; Diener, John L.; Keefe, Anthony D.; Wilson, Charles; Cload, Sharon T.
2004-01-01
We have utilized in vitro selection technology to develop allosteric ribozyme sensors that are specific for the small molecule analytes caffeine or aspartame. Caffeine- or aspartame-responsive ribozymes were converted into fluorescence-based RiboReporter™ sensor systems that were able to detect caffeine or aspartame in solution over a concentration range from 0.5 to 5 mM. With read-times as short as 5 min, these caffeine- or aspartame-dependent ribozymes function as highly specific and facile molecular sensors. Interestingly, successful isolation of allosteric ribozymes for the analytes described here was enabled by a novel selection strategy that incorporated elements of both modular design and activity-based selection methods typically used for generation of catalytic nucleic acids. PMID:15026535
NASA Astrophysics Data System (ADS)
Devrient, M.; Da, X.; Frick, T.; Schmidt, M.
Laser transmission welding is a well known joining technology for thermoplastics. Because of the need for lightweight, cost-effective and green production, thermoplastics are usually filled with glass fibers. These lead to higher absorption and more scattering within the upper joining partner, with a negative influence on the welding process. Here an experimental method for characterizing the scattering behavior of semi-crystalline thermoplastics filled with short glass fibers is introduced, along with a finite element model of the welding process capable of considering scattering as well as an analytical model. The experimental data is used for the numerical and analytical investigation of laser transmission welding under consideration of scattering. The scattering effects of several thermoplastics on the calculated temperature fields as well as weld seam geometries are quantified.
NASA Astrophysics Data System (ADS)
Mishima, Kenji; Yamashita, Koichi
2011-03-01
We theoretically and numerically investigated a new type of analytically solvable laser-driven system inspired by electron-injection dynamics in dye-sensitized solar cells. The simple analytical expressions were found to be useful for understanding the difference between dye excitation and direct photo-injection occurring between dye molecules and semiconductor nanoparticles. More importantly, we propose a method for experimentally discriminating between dye excitation and direct photo-injection by using time-dependent fluorescence. We found that dye excitation shows no significant quantum beat, whereas direct photo-injection shows a significant quantum beat. This work was supported by Funding Program for World-Leading Innovative R&D on Science and Technology (FIRST) ``Development of Organic Photovoltaics toward a Low-Carbon Society,'' Cabinet Office, Japan.
Bourget, P; Amin, A; Vidal, F; Merlette, C; Troude, P; Corriol, O
2013-09-01
In France, central IV admixture of chemotherapy (CT) treatments at the hospital is now required by law. We have previously shown that the shaping of Therapeutic Objects (TOs) could profit from an Analytical Quality Assurance (AQA), closely linked to the batch release, for the three key parameters: identity, purity, and initial concentration of the compound of interest. In the course of recent and diversified works, we showed the technical superiority of non-intrusive Raman Spectroscopy (RS) over other analytical options, notably HPLC and a vibrational method using UV/visible-FTIR coupling. An interconnected qualitative and economic assessment strongly helps to enrich these works. The study compares, in an operational situation, the performance of three analytical methods used for the AQC of TOs. We used: a) a set of evaluation criteria, b) the depreciation tables of the machinery, c) the cost of disposables, d) the weight of equipment and technical installations, e) the basic accounting unit (unit of work) and its composite costs (Euros), which vary according to the technical options and the weight of both human resources and disposables; finally, different combinations are described. The unit of work can take 12 different values between 1 and 5.5 Euros, and we provide various recommendations. A qualitative evaluation grid consistently places the RS technology as superior or equal to the 2 other techniques currently available. Our results demonstrated: a) the major interest of the non-intrusive AQC performed by RS, especially when it is not possible to analyze a TO with existing methods, e.g. elastomeric portable pumps, and b) the high potential for this technique to be a strong contributor to the security of the medication circuit and to fight the iatrogenic effects of drugs, especially in the hospital. It also contributes to the protection of all actors in healthcare and of their working environment.
Incentives for knowledge sharing: impact of organisational culture and information technology
NASA Astrophysics Data System (ADS)
Lyu, Hongbo; Zhang, Zuopeng Justin
2017-10-01
This research presents and examines an analytical model of knowledge management in which organisational culture dynamically improves with knowledge-sharing and learning activities within organisations. We investigate the effects of organisational incentives and the level of information technology on the motivation of knowledge sharing. We derive a linear incentive reward structure for knowledge sharing under both homogeneous and heterogeneous conditions. In addition, we show how the organisational culture and the optimum linear sharing reward change with several crucial factors, and summarise three sets of methods (strong IT support, congruent organisational culture, and effective employee assessment) to complement the best linear incentive. Our research provides valuable insights for practitioners in terms of implementing knowledge-management initiatives.
Software Analytical Instrument for Assessment of the Process of Casting Slabs
NASA Astrophysics Data System (ADS)
Franěk, Zdeněk; Kavička, František; Štětina, Josef; Masarik, Miloš
2010-06-01
The paper describes the design and function of software for assessing the process of casting slabs. The program system LITIOS was developed and implemented at EVRAZ Vitkovice Steel Ostrava on the equipment for continuous casting of steel (hereafter ECC). This program system operates on a data warehouse of technological casting parameters and slab quality parameters. It enables an ECC technologist to analyze the course of casting a melt and, using statistical methods, to determine the influence of individual technological parameters on the quality of the final slabs. The system also enables long-term monitoring and optimization of the production.
7 CFR 98.4 - Analytical methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...
7 CFR 93.4 - Analytical methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...
7 CFR 98.4 - Analytical methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...
7 CFR 93.4 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...
7 CFR 93.4 - Analytical methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 3 2010-01-01 2010-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...
7 CFR 98.4 - Analytical methods.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...
7 CFR 98.4 - Analytical methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 3 2013-01-01 2013-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture....4 Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MRE's are listed as follows: (1) Official Methods of...
7 CFR 98.4 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 98.4 Section 98.4 Agriculture... Analytical methods. (a) The majority of analytical methods used by the USDA laboratories to perform analyses of meat, meat food products and MREs are listed as follows: (1) Official Methods of Analysis of AOAC...
7 CFR 93.4 - Analytical methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 3 2011-01-01 2011-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...
7 CFR 93.4 - Analytical methods.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 7 Agriculture 3 2012-01-01 2012-01-01 false Analytical methods. 93.4 Section 93.4 Agriculture... PROCESSED FRUITS AND VEGETABLES Citrus Juices and Certain Citrus Products § 93.4 Analytical methods. (a) The majority of analytical methods for citrus products are found in the Official Methods of Analysis of AOAC...
1990-06-01
on simple railgun accelerators and homopolar generators. Complex rotating flux compressors would drastically improve the performance of EM launchers...velocities. If this is the direction of improvement, then energies stored in the electric trains built with linear electric motors in Japan and Western ...laboratories which had power supplies already built for other programs (homopolar generators in conjunction with an inductor and an opening switch
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melaina, Marc; Saur, Genevieve; Ramsden, Todd
2015-05-28
This presentation summarizes NREL's hydrogen and fuel cell analysis work in three areas: resource potential, greenhouse gas emissions and cost of delivered energy, and influence of auxiliary revenue streams. NREL's hydrogen and fuel cell analysis projects focus on low-carbon and economic transportation and stationary fuel cell applications. Analysis tools developed by the lab provide insight into the degree to which bridging markets can strengthen the business case for fuel cell applications.
1998 Technology Showcase. JOAP International Condition Monitoring Conference.
1998-04-01
Systems using Automated SEM/EDX and New Diagnostic Routines 276 N. W. Farrant & T. Luckhurst ADVANCED DIAGNOSTIC SYSTEMS Model-Based Diagnostics of Gas...Microscopy with Energy Dispersive X-Ray (SEM/EDX) microanalysis packages and Energy Dispersive X-Ray Fluorescence (EDXRF) analytical equipment. Therefore...wear particles separated by the ferrogram method. [Figure residue: wear-particle and nonferrous-material micrographs; Fig. 1 home page]
Experimental demonstration of the control of flexible structures
NASA Technical Reports Server (NTRS)
Schaechter, D. B.; Eldred, D. B.
1984-01-01
The Large Space Structure Technology Flexible Beam Experiment employs a pinned-free flexible beam to demonstrate such required methods as dynamic and adaptive control, as well as various control law design approaches and hardware requirements. An attempt is made to define the mechanization difficulties that may be inherent in flexible structures. Attention is given to analytical work performed in support of the test facility's development, the final design's specifications, the synthesis of the control laws, and experimental results obtained.
Material design and structural color inspired by biomimetic approach
Saito, Akira
2011-01-01
Generation of structural color is one of the essential functions realized by living organisms, and its industrial reproduction can result in numerous applications. From this viewpoint, the mechanisms, materials, analytical methods and fabrication technologies of the structural color are reviewed in this paper. In particular, the basic principles of natural photonic materials, the ideas developed from these principles, the directions of applications and practical industrial realizations are presented by summarizing the recent research results. PMID:27877459
Proactive human-computer collaboration for information discovery
NASA Astrophysics Data System (ADS)
DiBona, Phil; Shilliday, Andrew; Barry, Kevin
2016-05-01
Lockheed Martin Advanced Technology Laboratories (LM ATL) is researching methods, representations, and processes for human/autonomy collaboration to scale analysis and hypothesis substantiation for intelligence analysts. This research establishes a machine-readable hypothesis representation that is commonsensical to the human analyst. The representation unifies context between the human and the computer, enabling autonomy, in the form of analytic software, to support the analyst by proactively acquiring, assessing, and organizing high-value information needed to inform and substantiate hypotheses.
Comparison of analytical methods for profiling N- and O-linked glycans from cultured cell lines
Togayachi, Akira; Azadi, Parastoo; Ishihara, Mayumi; Geyer, Rudolf; Galuska, Christina; Geyer, Hildegard; Kakehi, Kazuaki; Kinoshita, Mitsuhiro; Karlsson, Niclas G.; Jin, Chunsheng; Kato, Koichi; Yagi, Hirokazu; Kondo, Sachiko; Kawasaki, Nana; Hashii, Noritaka; Kolarich, Daniel; Stavenhagen, Kathrin; Packer, Nicolle H.; Thaysen-Andersen, Morten; Nakano, Miyako; Taniguchi, Naoyuki; Kurimoto, Ayako; Wada, Yoshinao; Tajiri, Michiko; Yang, Pengyuan; Cao, Weiqian; Li, Hong; Rudd, Pauline M.; Narimatsu, Hisashi
2016-01-01
The Human Disease Glycomics/Proteome Initiative (HGPI) is an activity in the Human Proteome Organization (HUPO) supported by leading researchers from international institutes and aims at development of disease-related glycomics/glycoproteomics analysis techniques. Since 2004, the initiative has conducted three pilot studies. The first two were N- and O-glycan analyses of purified transferrin and immunoglobulin-G and assessed the most appropriate analytical approach employed at the time. This paper describes the third study, which was conducted to compare different approaches for quantitation of N- and O-linked glycans attached to proteins in crude biological samples. The preliminary analysis on cell pellets resulted in wildly varied glycan profiles, which was probably the consequence of variations in the pre-processing sample preparation methodologies. However, the reproducibility of the data was not improved dramatically in the subsequent analysis on cell lysate fractions prepared in a specified method by one lab. The study demonstrated the difficulty of carrying out a complete analysis of the glycome in crude samples by any single technology and the importance of rigorous optimization of the course of analysis from preprocessing to data interpretation. It suggests that another collaborative study employing the latest technologies in this rapidly evolving field will help to realize the requirements of carrying out the large-scale analysis of glycoproteins in complex cell samples. PMID:26511985
Water flow in fractured rock masses: numerical modeling for tunnel inflow assessment
NASA Astrophysics Data System (ADS)
Gattinoni, P.; Scesi, L.; Terrana, S.
2009-04-01
Water circulation in rocks is a very important element in many problems of civil, environmental and mining engineering. In particular, the interaction of tunnelling with groundwater has become a very relevant problem, not only due to the need to safeguard water resources from impoverishment and pollution risk, but also to guarantee the safety of workers and to assure the efficiency of tunnel drainage systems. The evaluation of the hydrogeological risk linked to underground excavation is very complex, both because of the large number of variables involved and because of the lack of data available during the planning stage. The study aims to quantify the influence of some geo-structural parameters (i.e. discontinuity dip and dip direction) on the tunnel drainage process, comparing the traditional analytical method to the modeling approach, with specific reference to the case of anisotropic rock masses. To forecast tunnel inflows, several authors propose analytic formulations (Goodman et al., 1965; Knutsson et al., 1996; Ribacchi et al., 2002; Park et al., 2008; Perrochet et al., 2007; Cesano et al., 2003; Hwang et al., 2007), valid for an infinite, homogeneous and isotropic aquifer, in which the permeability value is given as a modulus of equivalent hydraulic conductivity Keq. By contrast, in discontinuous rock masses the water flow is strongly controlled by joint orientation, by the joints' hydraulic characteristics and by rock fracturing conditions. The analytic equations found in the technical literature can be very useful, but often they do not reflect the real phenomena of tunnel inflow in rock masses. Indeed, these equations are based on the hypothesis of a homogeneous aquifer, and therefore they do not give good agreement for a heterogeneous fractured medium. 
In this latter case, numerical modelling can provide the best results, but only with a detailed conceptual model of the water circulation, high costs and long simulation times. Therefore, the integration of the analytic method and numerical modeling is very important to adapt the analytic formula to the specific hydrogeological structure. The study was carried out through parametrical modeling: groundwater flow was simulated with the DEM model UDEC 2D, considering different geometrical settings (tunnel depth and radius) and hydrogeological settings (piezometric). The influence of the geo-structural setting (dip and dip direction of discontinuities, with reference to their permeability) on the tunnel drainage process was quantified. The simulations aim to create a sufficient data set of tunnel inflows, in different geological-structural settings, enabling a quantitative comparison between numerical results and the well-known analytic formulas (i.e. the Goodman and El Tani equations). Results of this comparison point out the following aspects: - the geological-structural setting critical for hydrogeological risk in tunnels corresponds to joints having low dip (close to 0°), which favour the drainage process and increase the tunnel inflow; - the rock mass anisotropy strongly influences both the tunnel inflow and the water table drawdown; - the reliability of analytic formulas for tunnel inflow assessment in discontinuous rock masses depends on the geostructural setting; in fact, the analytic formulas overestimate the tunnel inflow, and this overestimation is bigger for geostructural settings having discontinuities with higher dips. 
Finally, using the results of the parametrical modeling, the previously cited analytic formulas were corrected to derive an empirical equation that gives the tunnel inflow as a function of the different geological-structural settings, with particular regard to: - the horizontal component of the discontinuities, - the hydraulic conductivity anisotropy ratio, - the orientation of the hydraulic conductivity tensor. The obtained empirical equation allows a first evaluation of the tunnel inflow, in which joint characteristics are taken into account, and is very useful for identifying the areas where in-depth studies are required. References Cesano D., Bagtzoglou A.C., Olofsson B. (2003). Quantifying fractured rock hydraulic heterogeneity and groundwater inflow prediction in underground excavations: the heterogeneity index. Tunnelling and Underground Space Technology, 18, pp. 19-34. El Tani M. (2003). Circular tunnel in a semi-infinite aquifer. Tunnelling and Underground Space Technology, 18, pp. 49-55. Goodman R.E., Moye D.G., Van Schalkwyk A., Javandel I. (1965). Ground water inflow during tunnel driving. Eng. Geol., 2, pp. 39-56. Hwang J-H., Lu C-C. (2007). A semi-analytical method for analyzing the tunnel water inflow. Tunnelling and Underground Space Technology, 22, pp. 39-46. Itasca (2001). UDEC, User's guide. Itasca Consulting Group Inc., Minneapolis, Minnesota. Knutsson G., Olofsson B., Cesano D. (1996). Prognosis of groundwater inflows and drawdown due to the construction of rock tunnels in heterogeneous media. Res. Proj. Rep. Kungl Tekniska, Stockholm. Park K-H., Owatsiriwong A., Lee G-G. (2008). Analytical solution for steady-state groundwater inflow into a drained circular tunnel in a semi-infinite aquifer: a revisit. Tunnelling and Underground Space Technology, 23, pp. 206-209. Perrochet P., Dematteis A. (2007). Modelling Transient Discharge into a Tunnel Drilled in Heterogeneous Formation. Ground Water, 45(6), pp. 786-790.
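As a point of reference for the comparisons above, the Goodman et al. (1965) steady-state approximation can be evaluated directly. The sketch below assumes the commonly quoted form Q = 2πKh / ln(2h/r) per unit tunnel length for a homogeneous, isotropic aquifer; it is an illustrative simplification with made-up parameter values, not the corrected empirical equation derived in this study.

```python
import math

def goodman_inflow(K, h, r):
    """Steady-state tunnel inflow per unit length (Goodman et al., 1965).

    K : equivalent hydraulic conductivity Keq (m/s)
    h : water table height above the tunnel axis (m)
    r : tunnel radius (m)
    Returns Q in m^3/s per metre of tunnel.
    """
    return 2.0 * math.pi * K * h / math.log(2.0 * h / r)

# Illustrative values: K = 1e-6 m/s, water table 50 m above a 5 m diameter tunnel
Q = goodman_inflow(K=1e-6, h=50.0, r=2.5)
```

In a fractured, anisotropic rock mass such a value would, as the study argues, tend to overestimate the actual inflow, particularly for steeply dipping discontinuities.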
Prieto-Ballesteros, Olga; Martínez-Frías, Jesús; Schutt, John; Sutter, Brad; Heldmann, Jennifer L; Bell, Mary Sue; Battler, Melissa; Cannon, Howard; Gómez-Elvira, Javier; Stoker, Carol R
2008-10-01
The 2005 Mars Astrobiology Research and Technology Experiment (MARTE) project conducted a simulated 1-month Mars drilling mission in the Río Tinto district, Spain. Dry robotic drilling, core sampling, and biological and geological analytical technologies were collectively tested for the first time for potential use on Mars. Drilling and subsurface sampling and analytical technologies are being explored for Mars because the subsurface is the most likely place to find life on Mars. The objectives of this work are to describe drilling, sampling, and analytical procedures; present the geological analysis of core and borehole material; and examine lessons learned from the drilling simulation. Drilling occurred at an undisclosed location, causing the science team to rely only on mission data for geological and biological interpretations. Core and borehole imaging was used for micromorphological analysis of rock, targeting rock for biological analysis, and making decisions regarding the next day's drilling operations. Drilling reached 606 cm depth into poorly consolidated gossan that allowed only 35% of core recovery and contributed to borehole wall failure during drilling. Core material containing any indication of biology was sampled and analyzed in more detail for its confirmation. Despite the poorly consolidated nature of the subsurface gossan, dry drilling was able to retrieve useful core material for geological and biological analysis. Lessons learned from this drilling simulation can guide the development of dry drilling and subsurface geological and biological analytical technologies for future Mars drilling missions.
NASA Technical Reports Server (NTRS)
Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James;
2016-01-01
Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework allows batches of analytics, scanning for new data, to be applied to data as it flows in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources over alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The resulting products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. 
The Matsu Wheel allows many shared data services to be performed together, making efficient use of resources for processing hyperspectral satellite image data and other large datasets, e.g., environmental datasets that may be analyzed for many purposes.
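The "Wheel" idea described above — access and preprocess each data batch once, then slot in any number of analytics — can be sketched in a few lines. The class and the toy analytics below are hypothetical illustrations of the pattern, not Project Matsu's actual interfaces.

```python
from typing import Callable, Dict, List

class Wheel:
    """Run many analytics over each batch after one shared preprocessing
    pass (a sketch of the pattern, not the Matsu implementation)."""

    def __init__(self, preprocess: Callable):
        self.preprocess = preprocess
        self.analytics: List[Callable] = []

    def register(self, analytic: Callable) -> None:
        self.analytics.append(analytic)

    def run(self, raw_batch) -> Dict[str, object]:
        data = self.preprocess(raw_batch)  # data accessed and prepared only once
        return {a.__name__: a(data) for a in self.analytics}

# Hypothetical analytics slotted into the framework
def anomaly_score(data):
    return max(data)                       # e.g. flag rare bright pixels

def mean_reflectance(data):
    return sum(data) / len(data)

wheel = Wheel(preprocess=lambda pixels: [p / 255.0 for p in pixels])
wheel.register(anomaly_score)
wheel.register(mean_reflectance)
reports = wheel.run([0, 51, 255])
```

New analytics plug in via `register` without touching the ingest or preprocessing path, which is the efficiency argument the abstract makes for heavy, high-throughput data.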
Pacheco, Bruno D; Valério, Jaqueline; Angnes, Lúcio; Pedrotti, Jairo J
2011-06-24
A fast and robust analytical method for amperometric determination of hydrogen peroxide (H2O2) based on batch injection analysis (BIA) on an array of gold microelectrodes modified with platinum is proposed. The gold microelectrode array (n=14) was obtained from electronic chips developed for surface-mount device (SMD) technology, whose size offers advantages for adapting them in batch cells. The effects of the dispensing rate, volume injected, distance between the platinum microelectrodes and the pipette tip, as well as the volume of solution in the cell, on the analytical response were evaluated. The method allows amperometric H2O2 determination in the concentration range from 0.8 μmol/L to 100 μmol/L. The analytical frequency can reach 300 determinations per hour, and the detection limit was estimated at 0.34 μmol/L (3σ). The anodic current peaks obtained after a series of 23 successive injections of 50 μL of 25 μmol/L H2O2 showed an RSD < 0.9%. To ensure good selectivity for H2O2 detection, its determination was performed in a differential mode, with selective destruction of the H2O2 by catalase in 10 mmol/L phosphate buffer solution. Practical application of the analytical procedure involved H2O2 determination in rainwater of São Paulo City. A comparison of the results obtained by the proposed amperometric method with another that combines flow injection analysis (FIA) with spectrophotometric detection showed good agreement. Copyright © 2011 Elsevier B.V. All rights reserved.
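The differential mode described above — subtracting the current measured after catalase has selectively destroyed the H2O2 — amounts to a baseline correction that removes interferent contributions from the signal. The numbers and the calibration slope in this sketch are purely illustrative, not values from the paper.

```python
def h2o2_current(i_sample_nA, i_catalase_nA):
    """Differential amperometric signal: the catalase-treated aliquot
    retains all interferent contributions, so the difference isolates
    the current attributable to H2O2 oxidation (values in nA)."""
    return i_sample_nA - i_catalase_nA

def concentration_umol_L(i_net_nA, sensitivity_nA_per_umol=2.0):
    # Hypothetical calibration slope; real slopes come from standard injections.
    return i_net_nA / sensitivity_nA_per_umol

i_net = h2o2_current(58.0, 8.0)   # 50 nA attributable to H2O2
c = concentration_umol_L(i_net)   # 25 umol/L with the assumed slope
```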
The evolution of analytical chemistry methods in foodomics.
Gallo, Monica; Ferranti, Pasquale
2016-01-08
The methodologies of food analysis have greatly evolved over the past 100 years, from basic assays based on solution chemistry to those relying on modern instrumental platforms. Today, the development and optimization of integrated analytical approaches, based on different techniques to study the chemical composition of a food at the molecular level, make it possible to define a 'food fingerprint', valuable for assessing the nutritional value, safety and quality, authenticity and security of foods. This comprehensive strategy, termed foodomics, includes emerging work areas such as food chemistry, phytochemistry, advanced analytical techniques, biosensors and bioinformatics. Integrated approaches can help to elucidate some critical issues in food analysis, but also to face the new challenges of a globalized world: security, sustainability and food production in response to world-wide environmental changes. They include the development of powerful analytical methods to ensure the origin and quality of food, as well as the discovery of biomarkers to identify potential food safety problems. In the area of nutrition, the future challenge is to identify, through specific biomarkers, individual peculiarities that allow early diagnosis and then a personalized prognosis and diet for patients with food-related disorders. Far from aiming at an exhaustive review of the abundant literature dedicated to the applications of omic sciences in food analysis, we will explore how classical approaches, such as those used in chemistry and biochemistry, have evolved to intersect with the new omics technologies to produce progress in our understanding of the complexity of foods. Perhaps most importantly, a key objective of the review will be to explore the development of simple and robust methods for a fully applied use of omics data in food science. Copyright © 2015 Elsevier B.V. All rights reserved.
Results of the International Energy Agency Round Robin on Fast Pyrolysis Bio-oil Production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elliott, Douglas C.; Meier, Dietrich; Oasmaa, Anja
An international round robin study of the production of fast pyrolysis bio-oil was undertaken. Fifteen institutions in six countries contributed. Three biomass samples were distributed to the laboratories for processing in fast pyrolysis reactors. Samples of the bio-oil produced were transported to a central analytical laboratory for analysis. The round robin was focused on validating the pyrolysis community's understanding of the production of fast pyrolysis bio-oil by providing a common feedstock for bio-oil preparation. The round robin included: • distribution of 3 feedstock samples from a common source to each participating laboratory; • preparation of fast pyrolysis bio-oil in each laboratory with the 3 feedstocks provided; • return of the 3 bio-oil products (minimum 500 ml) with an operational description to a central analytical laboratory for bio-oil property determination. The analyses of interest were: density, viscosity, dissolved water, filterable solids, CHN, S, trace element analysis, ash, total acid number, pyrolytic lignin, and accelerated aging of bio-oil. In addition, an effort was made to compare the bio-oil components to the products of analytical pyrolysis through GC/MS analysis. The results showed that clear differences can occur in fast pyrolysis bio-oil properties by applying different reactor technologies or configurations. The comparison to the analytical pyrolysis method suggested that Py-GC/MS could serve as a rapid screening method for bio-oil composition when produced in fluid-bed reactors. Furthermore, hot vapor filtration generally resulted in the most favorable bio-oil product with respect to water, solids, viscosity, and total acid number. These results can be helpful in understanding the variation in bio-oil production methods and their effects on bio-oil product composition.
Cogeneration Technology Alternatives Study (CTAS). Volume 2: Analytical approach
NASA Technical Reports Server (NTRS)
Gerlaugh, H. E.; Hall, E. W.; Brown, D. H.; Priestley, R. R.; Knightly, W. F.
1980-01-01
Various advanced energy conversion systems were compared with each other and with current-technology systems in terms of their savings in fuel energy, costs, and emissions, both in individual plants and on a national level. The ground rules established by NASA and the assumptions made by the General Electric Company in performing this cogeneration technology alternatives study are presented. The analytical methodology employed is described in detail and illustrated with numerical examples, together with a description of the computer program used in calculating over 7000 energy conversion system-industrial process applications. For Vol. 1, see 80N24797.
40 CFR 161.180 - Enforcement analytical method.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 25 2012-07-01 2012-07-01 false Enforcement analytical method. 161.180... DATA REQUIREMENTS FOR REGISTRATION OF ANTIMICROBIAL PESTICIDES Product Chemistry Data Requirements § 161.180 Enforcement analytical method. An analytical method suitable for enforcement purposes must be...
A Novel Calibration-Minimum Method for Prediction of Mole Fraction in Non-Ideal Mixture.
Shibayama, Shojiro; Kaneko, Hiromasa; Funatsu, Kimito
2017-04-01
This article proposes a novel concentration prediction model that requires little training data and is useful for rapid process understanding. Process analytical technology is currently popular, especially in the pharmaceutical industry, for enhancement of process understanding and process control. A calibration-free method, iterative optimization technology (IOT), was proposed to predict pure component concentrations, because calibration methods, such as partial least squares, require a large number of training samples, leading to high costs. However, IOT cannot be applied to concentration prediction in non-ideal mixtures because its basic equation is derived from the Beer-Lambert law, which does not hold for non-ideal mixtures. We propose a novel method that realizes prediction of pure component concentrations in mixtures from a small number of training samples, assuming that spectral changes arising from molecular interactions can be expressed as a function of concentration. The proposed method is named IOT with virtual molecular interaction spectra (IOT-VIS) because it takes the spectral change into account as a virtual spectrum x_nonlin,i. Two case studies confirmed that the predictive accuracy of IOT-VIS was the highest among existing IOT methods.
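For the ideal-mixture case that IOT starts from, the Beer-Lambert law makes a mixture spectrum a concentration-weighted sum of pure-component spectra, so mole fractions can be recovered from the spectra alone by least squares. The synthetic three-wavelength spectra below are illustrative, and this sketch deliberately omits the virtual interaction spectrum that IOT-VIS adds for non-ideal mixtures.

```python
import numpy as np

# Synthetic pure-component spectra (rows: components, columns: wavelengths)
pure = np.array([[1.0, 0.2, 0.0],
                 [0.1, 0.8, 0.3]])

true_fractions = np.array([0.3, 0.7])
mixture = true_fractions @ pure            # ideal Beer-Lambert mixing

# Recover the fractions by least squares and renormalize so they sum to 1
est, *_ = np.linalg.lstsq(pure.T, mixture, rcond=None)
est = est / est.sum()
```

In a non-ideal mixture the measured spectrum deviates from this linear combination, which is exactly the residual that a concentration-dependent virtual spectrum is meant to absorb.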
NASA Astrophysics Data System (ADS)
Nayak, Aditya B.; Price, James M.; Dai, Bin; Perkins, David; Chen, Ding Ding; Jones, Christopher M.
2015-06-01
Multivariate optical computing (MOC), an optical sensing technique for analog calculation, allows direct and robust measurement of chemical and physical properties of complex fluid samples in high-pressure/high-temperature (HP/HT) downhole environments. The core of this MOC technology is the integrated computational element (ICE), an optical element with a wavelength-dependent transmission spectrum designed to allow the detector to respond sensitively and specifically to the analytes of interest. A key differentiator of this technology is that it uses all of the information present in the broadband optical spectrum to determine the proportion of the analyte present in a complex fluid mixture. The detection methodology is photometric in nature; therefore, this technology does not require a spectrometer to measure and record a spectrum or a computer to perform calculations on the recorded optical spectrum. The integrated computational element is a thin-film optical element with a specific optical response function designed for each analyte. The optical response function is achieved by fabricating alternating layers of high-index (a-Si) and low-index (SiO2) thin films onto a transparent substrate (BK7 glass) using traditional thin-film manufacturing processes (e.g., ion-assisted e-beam vacuum deposition). Proprietary software and processes are used to control the thickness and material properties, including the optical constants of the materials during deposition, to achieve the desired optical response function. Ion-assisted deposition is useful for controlling the densification of the film, its stoichiometry, and the material optical constants, as well as for achieving high deposition growth rates and moisture-stable films. However, the ion source can induce undesirable absorption in the film and subsequently modify the optical constants of the material during the ramp-up and stabilization periods of the e-gun and ion source, respectively. 
This paper characterizes the unwanted absorption in the a-Si thin film using advanced thin-film metrology methods, including spectroscopic ellipsometry and Fourier transform infrared (FTIR) spectroscopy. The resulting analysis identifies a fundamental mechanism contributing to this absorption and a method for minimizing and accounting for the unwanted absorption in the thin film so that the exact optical response function can be achieved.
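The wavelength-dependent transmission of an alternating high-/low-index stack like the one described above can be modelled with the standard characteristic-matrix (transfer-matrix) method at normal incidence. The indices, thicknesses, and lossless-dielectric assumption below are illustrative simplifications: real a-Si layers are dispersive and, as the paper discusses, can be absorbing, and actual ICE designs target an analyte-specific response rather than this generic quarter-wave mirror.

```python
import cmath
import math

def layer_matrix(n, d_nm, wavelength_nm):
    """Characteristic matrix of one lossless dielectric layer at normal incidence."""
    delta = 2.0 * math.pi * n * d_nm / wavelength_nm  # phase thickness
    return [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
            [1j * n * cmath.sin(delta), cmath.cos(delta)]]

def matmul(A, B):
    return [[A[0][0] * B[0][0] + A[0][1] * B[1][0], A[0][0] * B[0][1] + A[0][1] * B[1][1]],
            [A[1][0] * B[0][0] + A[1][1] * B[1][0], A[1][0] * B[0][1] + A[1][1] * B[1][1]]]

def transmittance(layers, wavelength_nm, n_in=1.0, n_sub=1.52):
    """layers: (refractive index, thickness in nm) pairs, incidence side first.
    n_sub ~ 1.52 approximates a BK7 glass substrate."""
    M = [[1.0, 0.0], [0.0, 1.0]]
    for n, d in layers:
        M = matmul(M, layer_matrix(n, d, wavelength_nm))
    B = M[0][0] + M[0][1] * n_sub  # [B; C] = M [1; n_sub]
    C = M[1][0] + M[1][1] * n_sub
    return 4.0 * n_in * n_sub / abs(n_in * B + C) ** 2

# Quarter-wave stack at 550 nm: alternating a-Si-like (n ~ 3.6) and SiO2-like (n ~ 1.46)
stack = [(3.6, 550.0 / (4 * 3.6)), (1.46, 550.0 / (4 * 1.46))] * 3
T = transmittance(stack, 550.0)  # strongly reflective at the design wavelength
```

Sweeping `wavelength_nm` over the band of interest traces out the stack's transmission spectrum, which is the quantity an ICE design shapes layer by layer.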
Review on the conversion of thermoacoustic power into electricity.
Timmer, Michael A G; de Blok, Kees; van der Meer, Theo H
2018-02-01
Thermoacoustic engines convert heat energy into high amplitude acoustic waves and subsequently into electric power. This article provides a review of the four main methods to convert the (thermo)acoustic power into electricity. First, loudspeakers and linear alternators are discussed in a section on electromagnetic devices. This is followed by sections on piezoelectric transducers, magnetohydrodynamic generators, and bidirectional turbines. Each segment provides a literature review of the given technology for the field of thermoacoustics, focusing on possible configurations, operating characteristics, output performance, and analytical and numerical methods to study the devices. This information is used as an input to discuss the performance and feasibility of each method, and to identify challenges that should be overcome for a more successful implementation in thermoacoustic engines. The work is concluded by a comparison of the four technologies, concentrating on the possible areas of application, the conversion efficiency, maximum electrical power output and more generally the suggested focus for future work in the field.
Some Aspects in Photogrammetry Education at the Department of Geodesy and Cadastre of the VGTU
NASA Astrophysics Data System (ADS)
Ruzgienė, Birutė
2008-03-01
The education in photogrammetry is very important when applying photogrammetric methods for terrain mapping, spatial data modelling, solving engineering tasks, measuring architectural monuments, etc. Over time, traditional photogrammetric technologies have given way to a fully digital photogrammetric workflow. The number of potential users of photogrammetric methods tends to increase because of the high degree of automation in the processing of photographs (images). The main subjects in the photogrammetry (particularly digital photogrammetry) educational process are discussed. Different methods and digital systems are demonstrated with examples of aerial photogrammetry products. The main objective is to explore the possibilities for training in photogrammetric measurements. Special attention is paid to stereo plotting from aerial photography, applying analytical technology modified for teaching. The integration of the functionality of digital photogrammetric systems and digital image processing is analysed as well, with the intention of extending the application areas and possibilities for the use of modern technologies in urban mapping and land cadastre. The practical presentation of photo geometry restitution is implemented as a significant part of the studies. Interactive teaching of the main photogrammetric procedures and controlling systems is highly desirable and would without doubt improve the quality of the educational process.
Bernard, Elyse D; Nguyen, Kathy C; DeRosa, Maria C; Tayabali, Azam F; Aranda-Rodriguez, Rocio
2017-01-01
Aptamers are short oligonucleotide sequences used in detection systems because of their high-affinity binding to a variety of macromolecules. With the introduction of aptamers over 25 years ago came the exploration of their use in many different applications as a substitute for antibodies. Aptamers have several advantages: they are easy to synthesize, can bind to analytes for which it is difficult to obtain antibodies, and in some cases bind better than antibodies. As such, aptamer applications have significantly expanded as an adjunct to a variety of different immunoassay designs. The Multiple-Analyte Profiling (xMAP) technology developed by Luminex Corporation commonly uses antibodies for the detection of analytes in small sample volumes through the use of fluorescently coded microbeads. This technology permits the simultaneous detection of multiple analytes in each sample tested and hence could be applied in many research fields. Although little work has been performed adapting this technology for use with aptamers, optimizing aptamer-based xMAP assays would dramatically increase the versatility of analyte detection. We report herein on the development of an xMAP bead-based aptamer/antibody sandwich assay for a biomarker of inflammation (C-reactive protein or CRP). Protocols for the coupling of aptamers to xMAP beads, validation of coupling, and for an aptamer/antibody sandwich-type assay for CRP are detailed. The optimized conditions, protocols and findings described in this research could serve as a starting point for the development of new aptamer-based xMAP assays.
The VAST Challenge: History, Scope, and Outcomes: An introduction to the Special Issue
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kristin A.; Grinstein, Georges; Whiting, Mark A.
2014-10-01
Visual analytics aims to facilitate human insight from complex data via a combination of visual representations, interaction techniques, and supporting algorithms. To create new tools and techniques that achieve this goal requires that researchers have an understanding of analytical questions to be addressed, data that illustrates the complexities and ambiguities found in realistic analytic settings, and methods for evaluating whether plausible insights are gained through use of the new methods. However, researchers do not, generally speaking, have access to analysts who can articulate their problems or operational data that is used for analysis. To fill this gap, the Visual Analytics Science and Technology (VAST) Challenge has been held annually since 2006. The VAST Challenge provides an opportunity for researchers to experiment with realistic but not real problems, using realistic synthetic data with known events embedded. Since its inception, the VAST Challenge has evolved along with the visual analytics research community to pose more complex challenges, ranging from text analysis to video analysis to large-scale network log analysis. The seven years of the VAST Challenge have seen advancements in research and development, education, evaluation, and in the challenge process itself. This special issue of Information Visualization highlights some of the noteworthy advancements in each of these areas. Some of these papers focus on important research questions related to the challenge itself, and other papers focus on innovative research that has been shaped by participation in the challenge. This paper describes the VAST Challenge process and benefits in detail. It also provides an introduction to and context for the remaining papers in the issue.
The NASA Lewis large wind turbine program
NASA Technical Reports Server (NTRS)
Thomas, R. L.; Baldwin, D. H.
1981-01-01
The program is directed toward development of the technology for safe, reliable, environmentally acceptable large wind turbines that have the potential to generate a significant amount of electricity at costs competitive with conventional electric generation systems. In addition, these large wind turbines must be fully compatible with electric utility operations and interface requirements. Advances are made by gaining a better understanding of the system design drivers, improvements in the analytical design tools, verification of design methods with operating field data, and the incorporation of new technology and innovative designs. An overview of the program activities is presented and includes results from the first and second generation field machines (Mod-OA, -1, and -2), the design phase of the third generation wind turbine (Mod-5) and the advanced technology projects. Also included is the status of the Department of Interior WTS-4 machine.
Coggins, Christopher R E; Merski, Jerome A; Oldham, Michael J
2013-01-01
Recent technological advances allow ventilation holes in (or adjacent to) cigarette filters to be produced using lasers instead of using the mechanical procedures of earlier techniques. Analytical chemistry can be used to compare the composition of mainstream smoke from experimental cigarettes having filters with mechanically produced ventilation holes to that of cigarettes with ventilation holes that were produced using laser technology. Established procedures were used to analyze the smoke composition of 38 constituents of mainstream smoke generated using standard conditions. There were no differences between the smoke composition of cigarettes with filter ventilation holes that were produced mechanically or through use of laser technology. The two methods for producing ventilation holes in cigarette filters are equivalent in terms of resulting mainstream smoke chemistry, at two quite different filter ventilation percentages.
Kouri, T T; Gant, V A; Fogazzi, G B; Hofmann, W; Hallander, H O; Guder, W G
2000-07-01
Improved standardized performance is needed because urinalysis continues to be one of the most frequently requested laboratory tests. Since 1997, the European Confederation of Laboratory Medicine (ECLM) has been supporting an interdisciplinary project aiming to produce European urinalysis guidelines. More than seventy clinical chemists, microbiologists and ward-based clinicians, as well as representatives of manufacturers are taking part. These guidelines aim to improve the quality and consistency of chemical urinalysis, particle counting and bacterial culture by suggesting optimal investigative processes that could be applied in Europe. The approach is based on medical needs for urinalysis. The importance of the pre-analytical stage for total quality is stressed by detailed illustrative advice for specimen collection. Attention is also given to emerging automated technology. For cost containment reasons, both optimum (ideal) procedures and minimum analytical approaches are suggested. Since urinalysis mostly lacks genuine reference methods (primary reference measurement procedures; Level 4), a novel classification of the methods is proposed: comparison measurement procedures (Level 3), quantitative routine procedures (Level 2), and ordinal scale examinations (Level 1). Stepwise strategies are suggested to save costs, applying different rules for general and specific patient populations. New analytical quality specifications have been created. After a consultation period, the final written text will be published in full as a separate document.
Fibrinolysis standards: a review of the current status.
Thelwell, C
2010-07-01
Biological standards are used to calibrate measurements of components of the fibrinolytic system, either for assigning potency values to therapeutic products, or to determine levels in human plasma as an indicator of thrombotic risk. Traditionally WHO International Standards are calibrated in International Units based on consensus values from collaborative studies. The International Unit is defined by the response activity of a given amount of the standard in a bioassay, independent of the method used. Assay validity is based on the assumption that both standard and test preparation contain the same analyte, and the response in an assay is a true function of this analyte. This principle is reflected in the diversity of source materials used to prepare fibrinolysis standards, which has depended on the contemporary preparations they were employed to measure. With advancing recombinant technology, and improved analytical techniques, a reference system based on reference materials and associated reference methods has been recommended for future fibrinolysis standards. Careful consideration and scientific judgement must however be applied when deciding on an approach to develop a new standard, with decisions based on the suitability of a standard to serve its purpose, and not just to satisfy a metrological ideal. 2010 The International Association for Biologicals. Published by Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Santavenere, Alex
An action research study was undertaken to examine the effects of educational technology resources on critical thinking and analytical skills. The researcher observed 3 different 11th grade classes, a total of 75 students, over a week as they worked in the school's computer lab. Each class was composed of 25 to 30 students, all of whom were…
Archaic man meets a marvellous automaton: posthumanism, social robots, archetypes.
Jones, Raya
2017-06-01
Posthumanism is associated with critical explorations of how new technologies are rewriting our understanding of what it means to be human and how they might alter human existence itself. Intersections with analytical psychology vary depending on which technologies are held in focus. Social robotics promises to populate everyday settings with entities that have populated the imagination for millennia. A legend of A Marvellous Automaton appears as early as 350 B.C. in a book of Taoist teachings, and is joined by ancient and medieval legends of manmade humanoids coming to life, as well as the familiar robots of modern science fiction. However, while the robotics industry seems to be realizing an archetypal fantasy, the technology creates new social realities that generate distinctive issues of potential relevance for the theory and practice of analytical psychology. © 2017, The Society of Analytical Psychology.
Decision Support Model for Selection Technologies in Processing of Palm Oil Industrial Liquid Waste
NASA Astrophysics Data System (ADS)
Ishak, Aulia; Ali, Amir Yazid bin
2017-12-01
The palm oil industry continues to grow from year to year. The industry processes palm fruit into crude palm oil (CPO) and palm kernel oil (PKO). The oil produced by both products amounts to 30% of the raw material, which means that 70% becomes palm oil waste. The amount of palm oil waste will increase in line with the development of the palm oil industry. If it is not handled properly and effectively, the waste generated by the palm oil industry will contribute significantly to environmental damage, and industrial activities, from raw materials through to finished products, will disrupt the lives of people around the factory. Many alternative technologies for processing this waste are available, but a recurring problem is the difficulty of selecting and implementing the most appropriate one. The purpose of this research is to develop a database of waste processing technologies, to identify qualitative and quantitative criteria for selecting a technology, and to develop a Decision Support System (DSS) that can help make the decision. To achieve this objective, questionnaires were developed to identify waste processing technologies and to populate an appropriate technology database. Data analysis in the system is performed using the Analytic Hierarchy Process (AHP), and the model is built using MySQL software, which serves as a tool in the evaluation and selection of palm oil mill waste processing technology.
Simultaneous Multiparameter Cellular Energy Metabolism Profiling of Small Populations of Cells.
Kelbauskas, Laimonas; Ashili, Shashaanka P; Lee, Kristen B; Zhu, Haixin; Tian, Yanqing; Meldrum, Deirdre R
2018-03-12
Functional and genomic heterogeneity of individual cells are central players in a broad spectrum of normal and disease states. Our knowledge about the role of cellular heterogeneity in tissue and organism function remains limited due to analytical challenges one encounters when performing single cell studies in the context of cell-cell interactions. Information based on bulk samples represents ensemble averages over populations of cells, while data generated from isolated single cells do not account for intercellular interactions. We describe a new technology and demonstrate two important advantages over existing technologies: first, it enables multiparameter energy metabolism profiling of small cell populations (<100 cells)-a sample size that is at least an order of magnitude smaller than other, commercially available technologies; second, it can perform simultaneous real-time measurements of oxygen consumption rate (OCR), extracellular acidification rate (ECAR), and mitochondrial membrane potential (MMP)-a capability not offered by any other commercially available technology. Our results revealed substantial diversity in response kinetics of the three analytes in dysplastic human epithelial esophageal cells and suggest the existence of varying cellular energy metabolism profiles and their kinetics among small populations of cells. The technology represents a powerful analytical tool for multiparameter studies of cellular function.
40 CFR 158.355 - Enforcement analytical method.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 24 2014-07-01 2014-07-01 false Enforcement analytical method. 158.355... DATA REQUIREMENTS FOR PESTICIDES Product Chemistry § 158.355 Enforcement analytical method. An analytical method suitable for enforcement purposes must be provided for each active ingredient in the...
Incoherent beam combining based on the momentum SPGD algorithm
NASA Astrophysics Data System (ADS)
Yang, Guoqing; Liu, Lisheng; Jiang, Zhenhua; Guo, Jin; Wang, Tingfeng
2018-05-01
Incoherent beam combining (ICBC) technology is one of the most promising ways to achieve high-energy, near-diffraction-limited laser output. In this paper, the momentum method is proposed as a modification of the stochastic parallel gradient descent (SPGD) algorithm. The momentum method can efficiently improve the convergence speed of the combining system. An analytical method is employed to interpret the principle of the momentum method. Furthermore, the proposed algorithm is verified through simulations as well as experiments. The results of the simulations and the experiments show that the proposed algorithm not only accelerates the iteration but also keeps the combining process stable, demonstrating the feasibility of the proposed algorithm in the beam combining system.
Technology Utilization House Study Report. [For Energy Conservation
NASA Technical Reports Server (NTRS)
1974-01-01
The objectives of Project TECH are: (1) to construct a single-family detached dwelling for demonstrating the application of advanced technology and minimizing the requirement for energy and utility services, and (2) to help influence future development in home construction by defining the interaction of integrated energy and water management systems with building configuration and construction materials. Components and methods expected to be cost effective over a 20-year span were studied. Emphasis was placed on the utilization of natural heating and cooling characteristics. Orientation and location of windows, landscaping, natural ventilation, and the characteristics of the local climate and microclimate were intended to be used to best advantage. Energy-conserving homes are most efficient when designed for specific sites; therefore, Project TECH should not be considered a prototype design suitable for all locations. However, it does provide ideas and analytical methods which can be applied to some degree in all housing.
Magnetic biosensors: Modelling and simulation.
Nabaei, Vahid; Chandrawati, Rona; Heidari, Hadi
2018-04-30
In the past few years, magnetoelectronics has emerged as a promising new platform technology for biosensors aimed at the detection, identification, localisation and manipulation of a wide spectrum of biological, physical and chemical agents. The methods are based on detecting the magnetic field of a magnetically labelled biomolecule interacting with a complementary biomolecule bound to a magnetic field sensor. This review presents various magnetic biosensor techniques from the points of view of both simulation and modelling and of analytical and numerical analysis, and examines their performance under magnetic fields in steady and non-stationary states. This is followed by magnetic sensor modelling and simulations using advanced multiphysics modelling software (e.g. the finite element method (FEM)) and home-made tools. Furthermore, the outlook and future directions of modelling and simulation of magnetic biosensors across different technologies and materials are critically discussed. Crown Copyright © 2017. Published by Elsevier B.V. All rights reserved.
Application of the near-infrared spectroscopy in the pharmaceutical technology.
Jamrógiewicz, Marzena
2012-07-01
Near-infrared (NIR) spectroscopy is currently the fastest-growing and most versatile analytical method, not only in the pharmaceutical sciences but also in industry. This review focuses on recent NIR applications in pharmaceutical technology. The article covers the monitoring by NIR of many manufacturing processes, such as granulation, mixing or drying, in order to determine the end-point of these processes. Apart from basic theoretical information concerning NIR spectra, determinations of the quality and quantity of pharmaceutical compounds are included, together with examples of the measurement and control of physicochemical parameters of final medicinal products, such as hardness, porosity, thickness, compression strength, disintegration time and the detection of potential counterfeits. Biotechnology and plant drug analysis using NIR are also described. Moreover, some disadvantages of the method are noted and future perspectives anticipated. Copyright © 2012 Elsevier B.V. All rights reserved.
Arduini, Fabiana; Cinti, Stefano; Scognamiglio, Viviana; Moscone, Danila; Palleschi, Giuseppe
2017-03-22
Through the years, scientists have developed cutting-edge technologies to make (bio)sensors more convenient for environmental analytical purposes. Technological advancements in the fields of material science, rational design, microfluidics, and sensor printing have radically shaped biosensor technology, which is even more evident in the continuous development of sensing systems for the monitoring of hazardous chemicals. These efforts will be crucial in solving some of the problems that constrain biosensors from reaching real environmental applications, such as continuous analyses in the field by means of multi-analyte portable devices. This review (with 203 refs.) covers the progress between 2010 and 2015 in the field of technologies enabling biosensor applications in environmental analysis, including (i) printing technology, (ii) nanomaterial technology, (iii) nanomotors, (iv) biomimetic design, and (v) microfluidics. The next section describes futuristic cutting-edge technologies that have gained momentum in recent years and furnish highly innovative aspects to biosensing devices. Copyright © 2016 Elsevier B.V. All rights reserved.
Lindahl, Sofia; Gundersen, Cathrine Brecke; Lundanes, Elsa
2014-08-01
This review aims to summarize the analytical methods available in the open literature for the determination of some aliphatic and cyclic nitramines. The nitramines covered are those that can be formed from the use of amines in post-combustion CO2 capture (PCC) plants and end up in the environment. Because the literature on the determination of nitramines in aqueous and soil samples is quite scarce, methods for their determination in other matrices have also been included. Since the nitramines are found in complex matrices and/or at very low concentrations, an extraction step is often necessary before their determination. Liquid-liquid extraction (LLE) using dichloromethane and solid-phase extraction (SPE) with an activated-carbon-based material have been the two most common extraction methods. Gas chromatography (GC) or reversed-phase liquid chromatography (RPLC), often combined with mass spectrometry (MS), has been used in the final determination step. Presently there is no comprehensive method available for the determination of all the nitramines included in this review. The lowest concentration limit of quantification (cLOQ) reported is in the ng L(-1) range; however, most methods appear to have a cLOQ in the μg L(-1) range, where a cLOQ has been given.
Woolfenden, Elizabeth
2010-04-16
Sorbent tubes/traps are widely used in combination with gas chromatographic (GC) analytical methods to monitor the vapour-phase fraction of organic compounds in air. Applications range from atmospheric research and ambient air monitoring (indoor and outdoor) to occupational hygiene (personal exposure assessment) and measuring chemical emission levels. Part 1 of this paper reviewed the main sorbent-based air sampling strategies including active (pumped) tube monitoring, diffusive (passive) sampling onto sorbent tubes/cartridges plus sorbent trapping/focusing of whole air samples that are either collected in containers (such as canisters or bags) or monitored online. Options for subsequent extraction and transfer to GC(MS) analysis were also summarised and the trend to thermal desorption (TD)-based methods and away from solvent extraction was explained. As a result of this trend, demand for TD-compatible sorbents (alternatives to traditional charcoal) is growing. Part 2 of this paper therefore continues with a summary of TD-compatible sorbents, their respective advantages and limitations and considerations for sorbent selection. Other analytical considerations for optimizing sorbent-based air monitoring methods are also discussed together with recent technical developments and sampling accessories which have extended the application range of sorbent trapping technology generally. Copyright 2010 Elsevier B.V. All rights reserved.
Early Alert of Academically At-Risk Students: An Open Source Analytics Initiative
ERIC Educational Resources Information Center
Jayaprakash, Sandeep M.; Moody, Erik W.; Lauría, Eitel J. M.; Regan, James R.; Baron, Joshua D.
2014-01-01
The Open Academic Analytics Initiative (OAAI) is a collaborative, multi-year grant program aimed at researching issues related to the scaling up of learning analytics technologies and solutions across all of higher education. The paper describes the goals and objectives of the OAAI, depicts the process and challenges of collecting, organizing and…
The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate
ERIC Educational Resources Information Center
Cárdenas-Navia, Isabel; Fitzgerald, Brian K.
2015-01-01
New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…
Visual Analytics in Public Safety: Example Capabilities for Example Government Agencies
2011-10-01
is not limited to: the Police Records Information Management Environment for British Columbia (PRIME-BC), the Police Reporting and Occurrence System...and filtering for rapid identification of relevant documents - Graphical environment for visual evidence marshaling - Interactive linking and...analytical reasoning facilitated by interactive visual interfaces and integration with computational analytics. Indeed, a wide variety of technologies
Web Analytics: A Picture of the Academic Library Web Site User
ERIC Educational Resources Information Center
Black, Elizabeth L.
2009-01-01
This article describes the usefulness of Web analytics for understanding the users of an academic library Web site. Using a case study, the analysis describes how Web analytics can answer questions about Web site user behavior, including when visitors come, the duration of the visit, how they get there, the technology they use, and the most…
Report: Analytical Chemistry in a Changing World.
ERIC Educational Resources Information Center
Laitinen, H. A.
1980-01-01
Examines some of the changes that have occurred in the field of analytic chemistry, with emphasis on how the field has adapted to changes in science and technology. Current trends also are identified and discussed. (CS)
7 CFR 94.4 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 3 2014-01-01 2014-01-01 false Analytical methods. 94.4 Section 94.4 Agriculture... POULTRY AND EGG PRODUCTS Mandatory Analyses of Egg Products § 94.4 Analytical methods. The majority of analytical methods used by the USDA laboratories to perform mandatory analyses for egg products are listed as...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this Part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
40 CFR 136.6 - Method modifications and analytical requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PROGRAMS (CONTINUED) GUIDELINES ESTABLISHING TEST PROCEDURES FOR THE ANALYSIS OF POLLUTANTS § 136.6 Method... person or laboratory using a test procedure (analytical method) in this part. (2) Chemistry of the method means the reagents and reactions used in a test procedure that allow determination of the analyte(s) of...
Kulle, A; Krone, N; Holterhus, P M; Schuler, G; Greaves, R F; Juul, A; de Rijke, Y B; Hartmann, M F; Saba, A; Hiort, O; Wudy, S A
2017-05-01
Disorders or differences in sex development (DSD) comprise a heterogeneous group of conditions with atypical sex development. For optimal diagnosis, highly specialised laboratory analyses are required across European countries. Working group 3 of EU COST (European Cooperation in Science and Technology) Action BM 1303 'DSDnet', 'Harmonisation of Laboratory Assessment', has developed recommendations on laboratory assessment for DSD regarding the technologies to be used and the analytes to be investigated. This position paper on steroid hormone analysis in the diagnosis and treatment of DSD was compiled by a group of specialists in DSD and/or hormonal analysis, either from participating European countries or from international partner countries. The topics discussed comprised analytical methods (immunoassay/mass spectrometry-based methods), matrices (urine/serum/saliva) and harmonisation of laboratory tests. The following positions were agreed upon: support for the appropriate use of immunoassay- and mass spectrometry-based methods for the diagnosis and monitoring of DSD; serum/plasma and urine are established matrices for analysis; laboratories performing analyses for DSD need to operate within a quality framework and actively engage in harmonisation processes so that results and their interpretation are the same irrespective of the laboratory in which they are performed; and laboratories should participate in peer-comparison activities such as sample exchanges or, when available, subscribe to a relevant external quality assurance program. The ultimate aim of the guidelines is the implementation of clinical standards for the diagnosis and appropriate treatment of DSD to achieve the best outcome for patients, no matter where patients are investigated or managed. © 2017 The authors.
Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra
2018-02-01
The requirement for correct evaluation of forensic toxicological results in daily routine work and scientific studies is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate the efficacy and reliability of the analytical method. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. Until now, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in forensic toxicology (alcohol consumption markers, congener alcohols, gamma-hydroxybutyric acid, human insulin and C-peptide, creatinine, postmortem clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances are discussed, which may be used as guidance for scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. In particular, the validation parameters calibration model, analytical limits, accuracy (bias and precision), and matrix effects and recovery have to be approached differently, and the highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.
Earthdata Cloud Analytics Project
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Lynnes, Chris
2018-01-01
This presentation describes a nascent NASA project to develop a framework supporting end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observation System Data and Information Systems) data to the cloud is to position the data next to enormous computing capacity, allowing end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.
2015-01-01
Background The assessment of a new health technology is a multidisciplinary and multidimensional process, which requires a complex analysis and the convergence of different stakeholders into a common decision. This task is even more delicate when the assessment is carried out at an early stage of development, when the immaturity of the technology prevents conducting large-scale trials to evaluate cost-effectiveness through classic health economics methods. This lack of information may limit future development and deployment in clinical practice. This work aims to 1) identify the most relevant user needs of a new medical technology for managing and monitoring Parkinson's Disease (PD) patients and 2) use these user needs for a preliminary assessment of a specific system called PERFORM, as a case study. Methods The Analytic Hierarchy Process (AHP) was used to design a hierarchy of 17 needs, grouped into 5 categories. A total of 16 experts, 6 with a clinical background and the remaining 10 with a technical background, were asked to rank these needs and categories. Results On/Off fluctuations detection, Increase wearability acceptance, and Increase self-management support were identified as the most relevant user needs. No significant differences were found between the clinical and technical groups. These results have been used to evaluate the PERFORM system and to identify future areas of improvement. Conclusions First, the AHP contributed to the elaboration of a unified hierarchy, integrating the needs of a variety of stakeholders and promoting discussion and agreement within a common evaluation framework. Moreover, the AHP effectively supported user need elicitation as well as the assignment of different weights and priorities to each need and, consequently, helped define a framework for the assessment of telehealth systems for PD management and monitoring. This framework can be used to support the decision-making process for the adoption of new technologies in PD. PMID:26391847
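The AHP step described above reduces expert pairwise comparisons to a weight vector, conventionally via the principal eigenvector of the comparison matrix (Saaty's method). The 3×3 matrix below is purely illustrative, comparing three of the paper's user needs; it is not the study's actual data.

```python
# Sketch: AHP weights from a reciprocal pairwise-comparison matrix via the
# principal eigenvector. Matrix entries are hypothetical expert judgments.
import numpy as np

def ahp_weights(pairwise):
    """Return the normalized principal eigenvector as priority weights."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, float))
    principal = vecs[:, np.argmax(vals.real)].real
    w = np.abs(principal)          # eigenvector sign is arbitrary
    return w / w.sum()             # normalize so weights sum to 1

# On/Off detection vs. wearability vs. self-management (illustrative only)
A = [[1,   2,   3],
     [1/2, 1,   2],
     [1/3, 1/2, 1]]
w = ahp_weights(A)
print(np.round(w, 3))  # highest weight falls on the first need
```

In a full AHP, one would also check the consistency ratio of each matrix before accepting its weights.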
Advanced industrial fluorescence metrology used for qualification of high quality optical materials
NASA Astrophysics Data System (ADS)
Engel, Axel; Becker, Hans-Juergen; Sohr, Oliver; Haspel, Rainer; Rupertus, Volker
2003-11-01
Schott Glas develops and produces optical materials for various specialized applications in telecommunication, biomedical, optical, and microlithography technology. The quality requirements for optical materials are extremely high and still increasing; in microlithography applications, for example, the impurities of the material are specified to be in the low ppb range. Usually, impurities in the lower ppb range are determined using analytical methods such as LA-ICP-MS and neutron activation analysis, while the absorption and laser resistivity of optical materials are qualified with optical methods such as precision spectral photometers and in-situ transmission measurements with UV lasers. The analytical methods have the drawback of being time consuming and rather expensive, whereas the sensitivity of the absorption methods will not be sufficient to characterize future needs (coefficients well below 10⁻³ cm⁻¹). In order to meet current and future quality requirements, a Jobin Yvon FLUOROLOG 3.22 fluorescence spectrometer is employed to enable fast and precise qualification and analysis. The main advantage of this setup is the combination of highest sensitivity (more than one order of magnitude more sensitive than state-of-the-art UV absorption spectroscopy) with fast measurement and evaluation cycles (several minutes, compared to the several hours necessary for chemical analytics). An overview of the spectral characteristics is given, using specified standards, and correlations to material quality are shown. In particular, we have investigated the elementary fluorescence and absorption of rare earth element impurities as well as defect-induced luminescence originating from impurities.
Angioni, Alberto; Porcu, Luciano; Pirisi, Filippo
2011-10-26
The behavior in the field of imidacloprid, thiacloprid, and spinosad, and their transfer from olives to olive oil during the technological process, were studied. The extraction method used was effective in extracting the analytes of interest, and no interfering peaks were detected in the chromatogram. The residue levels found in olives after treatment were 0.14, 0.04, and 0.30 mg/kg for imidacloprid, thiacloprid, and spinosad, respectively, far below the maximum residue levels (MRLs) set for these insecticides in the EU. At the preharvest interval (PHI), no residue was detected for imidacloprid and thiacloprid, while spinosad showed a residue level of 0.04 mg/kg. The study of the effect of the technological process on pesticide transfer to olive oil showed that these insecticides tend to remain in the olive cake. The LC/DAD/ESI/MS method showed good performance, with adequate recoveries ranging from 80 to 119% and good method limits of quantitation (LOQs) and detection (LODs). No matrix effect was detected.
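The recovery and LOD/LOQ figures reported above are standard method-performance metrics. The sketch below shows how they are typically computed, using the common 3·SD and 10·SD conventions on blank replicates; all numbers are hypothetical, not the study's data.

```python
# Sketch: percent recovery and 3*SD / 10*SD estimates of LOD and LOQ
# from blank replicates (illustrative numbers only).
import statistics

def recovery_pct(measured, spiked):
    """Measured concentration as a percentage of the spiked amount."""
    return 100.0 * measured / spiked

def lod_loq(blank_signals, slope):
    """LOD = 3*SD(blank)/slope, LOQ = 10*SD(blank)/slope."""
    sd = statistics.stdev(blank_signals)
    return 3 * sd / slope, 10 * sd / slope

print(round(recovery_pct(0.095, 0.10), 1))   # -> 95.0 (% recovery)
lod, loq = lod_loq([0.8, 1.1, 0.9, 1.0, 1.2], slope=50.0)
```

Recoveries within roughly 70-120%, as in the study, are generally considered acceptable for pesticide residue methods.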
Varmazyar, Mohsen; Dehghanbaghi, Maryam; Afkhami, Mehdi
2016-10-01
The Balanced Scorecard (BSC) is a strategic evaluation tool that uses both financial and non-financial indicators to determine the business performance of organizations or companies. In this paper, a new integrated approach based on the BSC and multi-criteria decision making (MCDM) methods is proposed to evaluate the performance of the research centers of a research and technology organization (RTO) in Iran. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method is employed to reflect the interdependencies among BSC perspectives. Then, the Analytic Network Process (ANP) is utilized to weight the indices influencing the considered problem. In the next step, we apply four MCDM methods, including Additive Ratio Assessment (ARAS), Complex Proportional Assessment (COPRAS), Multi-Objective Optimization by Ratio Analysis (MOORA), and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), for ranking the alternatives. Finally, the utility interval technique is applied to combine the ranking results of the MCDM methods. Weighted utility intervals are computed by constructing a correlation matrix between the ranking methods. A real case is presented to show the efficacy of the proposed approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
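Of the four MCDM methods combined above, TOPSIS is the most widely sketched: alternatives are scored by their closeness to an ideal solution and distance from an anti-ideal one. The decision matrix and weights below are hypothetical, not the paper's case data.

```python
# Sketch: a minimal TOPSIS ranking, one of the four MCDM methods the
# paper aggregates. Rows are alternatives, columns are criteria.
import numpy as np

def topsis(matrix, weights, benefit):
    X = np.asarray(matrix, float)
    X = X / np.linalg.norm(X, axis=0)           # vector-normalize columns
    V = X * np.asarray(weights, float)          # apply criterion weights
    ideal = np.where(benefit, V.max(0), V.min(0))   # best value per criterion
    anti  = np.where(benefit, V.min(0), V.max(0))   # worst value per criterion
    d_pos = np.linalg.norm(V - ideal, axis=1)
    d_neg = np.linalg.norm(V - anti, axis=1)
    return d_neg / (d_pos + d_neg)              # closeness coefficient in [0, 1]

scores = topsis([[7, 9, 9], [8, 7, 8], [9, 6, 8]],
                weights=[0.5, 0.3, 0.2],
                benefit=[True, True, True])
ranking = np.argsort(-scores)                   # best alternative first
```

The paper's contribution lies not in any single method but in reconciling the four rankings through weighted utility intervals; this block only shows one input to that aggregation.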
NASA Technical Reports Server (NTRS)
Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola
2005-01-01
Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.
Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong
2015-01-01
This study presents indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. The results collected were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. These weights may prove useful in avoiding having to resort to qualitative means, in the absence of weights between indicators, when integrating the results of quantitative assessment by indicator. This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for their quantitative assessment.
SmartAQnet: remote and in-situ sensing of urban air quality
NASA Astrophysics Data System (ADS)
Budde, Matthias; Riedel, Till; Beigl, Michael; Schäfer, Klaus; Emeis, Stefan; Cyrys, Josef; Schnelle-Kreis, Jürgen; Philipp, Andreas; Ziegler, Volker; Grimm, Hans; Gratza, Thomas
2017-10-01
Air quality and the associated subjective and health-related quality of life are among the important topics of urban life in our time. However, it is very difficult for many cities to take measures to accommodate today's needs concerning e.g. mobility, housing and work, because consistent fine-granular data and information on causal chains are largely missing. This has the potential to change, as both large-scale basic data and promising new measuring approaches are becoming available. The project "SmartAQnet", funded by the German Federal Ministry of Transport and Digital Infrastructure (BMVI), is based on a pragmatic, data-driven approach, which for the first time combines existing data sets with a networked mobile measurement strategy in urban space. By connecting open data, such as weather data or development plans; remote sensing of influencing factors; and new mobile measurement approaches, such as participatory sensing with low-cost sensor technology, "scientific scouts" (autonomous, mobile smart-dust measurement devices that are auto-calibrated to a high-quality reference instrument within an intelligent monitoring network), and demand-oriented measurements by light-weight UAVs, a novel measuring and analysis concept is created within the model region of Augsburg, Germany. In addition to novel analytics, a prototypical technology stack is planned which, through modern analytics methods and Big Data and IoT technologies, enables application in a scalable way.
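The auto-calibration of low-cost sensors against a high-quality reference instrument, as in the "scientific scout" concept above, is often approximated in its simplest form by a linear gain/offset correction fitted from co-located measurements. The readings below are hypothetical, and a real deployment would use a richer model (humidity terms, drift tracking).

```python
# Sketch: gain/offset calibration of a low-cost particulate sensor against
# a co-located reference instrument (hypothetical readings, linear model).
import numpy as np

def fit_calibration(raw, reference):
    """Least-squares fit so that reference ~= gain*raw + offset."""
    gain, offset = np.polyfit(np.asarray(raw, float),
                              np.asarray(reference, float), 1)
    return gain, offset

raw = [12.0, 20.0, 31.0, 44.0]   # low-cost sensor readings (ug/m3)
ref = [10.0, 17.5, 27.0, 38.5]   # reference instrument readings (ug/m3)
gain, offset = fit_calibration(raw, ref)
corrected = [gain * r + offset for r in raw]  # calibrated sensor values
```

In a network like SmartAQnet, such corrections would be refreshed whenever a mobile sensor passes a reference station.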
Chemical Technology Division annual technical report, 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Battles, J.E.; Myles, K.M.; Laidler, J.J.
1993-06-01
In this period, CMT conducted research and development in the following areas: (1) electrochemical technology, including advanced batteries and fuel cells; (2) technology for fluidized-bed combustion and coal-fired magnetohydrodynamics; (3) methods for treatment of hazardous waste, mixed hazardous/radioactive waste, and municipal solid waste; (4) the reaction of nuclear waste glass and spent fuel under conditions expected for an unsaturated repository; (5) processes for separating and recovering transuranic elements from nuclear waste streams, treating water contaminated with volatile organics, and concentrating radioactive waste streams; (6) recovery processes for discharged fuel and the uranium blanket in the Integral Fast Reactor (IFR); (7) processes for removal of actinides in spent fuel from commercial water-cooled nuclear reactors and their burnup in IFRs; and (8) the physical chemistry of selected materials (corium; Fe-U-Zr; tritium in LiAlO₂) in environments simulating those of fission and fusion energy systems. The Division also conducts basic research in catalytic chemistry associated with molecular energy resources and novel ceramic precursors; the materials chemistry of superconducting oxides, electrified metal/solution interfaces, and molecular sieve structures; and the geochemical processes involved in water-rock interactions occurring in active hydrothermal systems. In addition, the Analytical Chemistry Laboratory in CMT provides a broad range of analytical chemistry support services to the technical programs at Argonne National Laboratory (ANL).
MS-based analytical methodologies to characterize genetically modified crops.
García-Cañas, Virginia; Simó, Carolina; León, Carlos; Ibáñez, Elena; Cifuentes, Alejandro
2011-01-01
The development of genetically modified crops has had a great impact on the agriculture and food industries. However, the development of any genetically modified organism (GMO) requires the application of analytical procedures to confirm the equivalence of the GMO compared to its isogenic non-transgenic counterpart. Moreover, the use of GMOs in foods and agriculture faces numerous criticisms from consumers and ecological organizations, which have led some countries to regulate their production, growth, and commercialization. These regulations have brought about the need for new and more powerful analytical methods to face the complexity of this topic. In this regard, MS-based technologies are increasingly used for GMO analysis to provide very useful information on GMO composition (e.g., metabolites, proteins). This review focuses on the MS-based analytical methodologies used to characterize genetically modified crops (also called transgenic crops). First, an overview of genetically modified crop development is provided, together with the main difficulties of their analysis. Next, the different MS-based analytical approaches applied to characterize GM crops, including "-omics" approaches and target-based approaches, are critically discussed. These methodologies allow the study of intended and unintended effects that result from the genetic transformation. This information is considered essential to corroborate (or not) the equivalence of a GM crop with its isogenic non-transgenic counterpart. Copyright © 2010 Wiley Periodicals, Inc.
Biosensors for GMO Testing: Nearly 25 Years of Research.
Sánchez-Paniagua López, Marta; Manzanares-Palenzuela, Carmen Lorena; López-Ruiz, Beatriz
2018-09-03
In the nearly two decades since genetically modified organisms (GMOs) were first commercialized, genetically engineered crops have gained ground on their conventional counterparts, reaching 185 million hectares worldwide in 2016. The technology has bestowed most of its benefits on enhancing crop productivity, with two main traits currently dominating the market: insect-resistant and herbicide-tolerant crops. Despite their rapid and vast adoption by farmers worldwide, GMOs have generated heated debates, especially in the European Union (EU), driven mostly by consumers concerned about the safety of transgenic foods and their potential impact on the environment. The need to monitor and verify the presence and amount of GMOs in agricultural crops and in food products has generated interest in analytical methods for sensitive, accurate, rapid, and cheap detection of these products. DNA biosensors have been envisioned as a novel DNA-detection technology that would one day substitute current amplification-based methods, providing hand-held, quick, and ultrasensitive gene-level detection. This review summarizes the contributions made in nearly 20 years of research on the application of genosensing technology for the qualitative and quantitative determination of transgenic traits.