Esmonde-White, Karen A; Cuellar, Maryann; Uerpmann, Carsten; Lenain, Bruno; Lewis, Ian R
2017-01-01
Adoption of Quality by Design (QbD) principles, regulatory support of QbD, process analytical technology (PAT), and continuous manufacturing are major factors affecting new approaches to pharmaceutical manufacturing and bioprocessing. In this review, we highlight new technology developments, data analysis models, and applications of Raman spectroscopy that have expanded its scope as a process analytical technology. Emerging technologies such as transmission and enhanced reflection Raman, along with new approaches to using available technologies, have broadened the role of Raman spectroscopy in pharmaceutical manufacturing, and it is now successfully integrated into real-time release testing, continuous manufacturing, and statistical process control. Since the last major review of Raman as a pharmaceutical PAT in 2010, many new Raman applications in bioprocessing have emerged. Exciting reports of in situ Raman spectroscopy in bioprocesses complement a growing scientific field of biological and biomedical Raman spectroscopy. Raman spectroscopy has made a positive impact as a process analytical and control tool for pharmaceutical manufacturing and bioprocessing, with demonstrated scientific and financial benefits throughout a product's lifecycle.
Bringing Business Intelligence to Health Information Technology Curriculum
ERIC Educational Resources Information Center
Zheng, Guangzhi; Zhang, Chi; Li, Lei
2015-01-01
Business intelligence (BI) and healthcare analytics are emerging technologies that provide the analytical capability to help the healthcare industry improve service quality, reduce costs, and manage risks. However, such a component for analytical healthcare data processing is largely missing from current healthcare information technology (HIT) or health…
Technology-assisted psychoanalysis.
Scharff, Jill Savege
2013-06-01
Teleanalysis (remote psychoanalysis by telephone, voice over internet protocol [VoIP], or videoteleconference [VTC]) has been thought of as a distortion of the frame that cannot support authentic analytic process. Yet it can augment continuity, permit optimum frequency of analytic sessions for in-depth analytic work, and enable outreach to analysands in areas far from specialized psychoanalytic centers. Theoretical arguments against teleanalysis are presented and countered, and its advantages and disadvantages discussed. Vignettes of analytic process from teleanalytic sessions are presented, and indications, contraindications, and ethical concerns are addressed. The aim is to provide material from which to judge the authenticity of analytic process supported by technology.
Trends in Process Analytical Technology: Present State in Bioprocessing.
Jenzsch, Marco; Bell, Christian; Buziol, Stefan; Kepert, Felix; Wegele, Harald; Hakemeyer, Christian
2017-08-04
Process analytical technology (PAT), the regulatory initiative for incorporating quality into pharmaceutical manufacturing, is an area of intense research and interest. If PAT is effectively applied to bioprocesses, it can increase process understanding and control and mitigate the risk of substandard drug products to both manufacturer and patient. To optimize the benefits of PAT, the entire PAT framework must be considered and each element of PAT must be carefully selected, including sensor and analytical technology, data analysis techniques, control strategies and algorithms, and process optimization routines. This chapter discusses the current state of PAT in the biopharmaceutical industry, including several case studies demonstrating the degree of maturity of various PAT tools. Graphical Abstract: Hierarchy of QbD components.
[Automation and organization of technological process of urinalysis].
Kolenkin, S M; Kishkun, A A; Kol'chenko, O L
2000-12-01
The results of introducing a working model of industrial laboratory technology, together with KONE Specific Supra and Miditron M analyzers, are presented using clinical urinalysis as an example. This technology helps standardize all stages and operations, improves the efficiency of quality control of laboratory studies, rationally organizes the work at all stages of the process, and creates a system for continuous improvement of investigations at the preanalytical, analytical, and postanalytical stages of laboratory testing. After this technology was introduced into laboratory practice, violations of the quality criteria for clinical urinalysis decreased from 15 to 8% at the preanalytical stage and from 6 to 3% at the analytical stage. Automation of the analysis reduced reagent consumption 3-fold and improved productivity at the analytical stage 4-fold.
Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang
2016-10-01
Currently, near infrared spectroscopy (NIRS) is considered an efficient tool for implementing process analytical technology (PAT) in the manufacture of traditional Chinese medicine (TCM) products. In this article, the NIRS-based process analytical system for the production of salvianolic acid for injection is introduced. The design of the process analytical system is described in detail, including the selection of monitored processes and testing mode, and potential risks that should be avoided. The development of the related technologies is also presented, covering the establishment of monitoring methods for the elution steps of the polyamide resin and macroporous resin chromatography processes, as well as a rapid analysis method for finished products. Based on the authors' research and working experience, several issues in applying NIRS to process monitoring and control in TCM production are then raised, and some potential solutions are discussed. The issues include building a technical team for the process analytical system, designing the process analytical system in the manufacture of TCM products, standardizing the NIRS-based analytical methods, and improving the management of the process analytical system. Finally, the prospects for the application of NIRS in the TCM industry are set out. Copyright © by the Chinese Pharmaceutical Association.
An update on pharmaceutical film coating for drug delivery.
Felton, Linda A; Porter, Stuart C
2013-04-01
Pharmaceutical coating processes have generally been transformed from what was essentially an art form in the mid-twentieth century to a much more technology-driven process. This review article provides a basic overview of current film coating processes, including a discussion of polymer selection, coating formulation additives, and processing equipment. Substrate considerations for pharmaceutical coating processes are also presented. While polymeric coating operations are commonplace in the pharmaceutical industry, film coating processes are still not fully understood, which presents serious challenges under current regulatory requirements. Novel analytical technologies and various modeling techniques that are being used to better understand film coating processes are discussed. This review article also examines the challenges of implementing process analytical technologies in coating operations, the incorporation of active pharmaceutical ingredients into polymer film coatings, the use of high-solids coating systems, and continuous coating and other novel coating application methods.
Process analytical technologies (PAT) in freeze-drying of parenteral products.
Patel, Sajal Manubhai; Pikal, Michael
2009-01-01
Quality by Design (QbD) aims at assuring quality through proper design and control, utilizing appropriate Process Analytical Technologies (PAT) to monitor critical process parameters during processing and ensure that the product meets the desired quality attributes. This review provides a comprehensive list of process monitoring devices that can be used to monitor critical process parameters, and focuses on a critical assessment of the viability of the PAT schemes proposed. R&D needs in PAT for freeze-drying are also addressed, with particular emphasis on batch techniques that can be used on all dryers independent of dryer scale.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nimbalkar, Sachin U.; Guo, Wei; Wenning, Thomas J.
Smart manufacturing and advanced data analytics can help the manufacturing sector unlock energy efficiency from the equipment level to the entire manufacturing facility and the whole supply chain. These technologies can make manufacturing industries more competitive, with intelligent communication systems, real-time energy savings, and increased energy productivity. Smart manufacturing can give all employees in an organization the actionable information they need, when they need it, so that each person can contribute to the optimal operation of the corporation through informed, data-driven decision making. This paper examines smart technologies and data analytics approaches for improving energy efficiency and reducing energy costs in process-supporting energy systems. It dives into energy-saving improvement opportunities through smart manufacturing technologies and sophisticated data collection and analysis. The energy systems covered in this paper include those with motors and drives, fans, pumps, air compressors, steam, and process heating.
NASA Astrophysics Data System (ADS)
Coughlin, J.; Mital, R.; Nittur, S.; SanNicolas, B.; Wolf, C.; Jusufi, R.
2016-09-01
Operational analytics, when combined with Big Data technologies and predictive techniques, has been shown to be valuable in detecting mission-critical sensor anomalies that might be missed by conventional analytical techniques. Our approach helps analysts and leaders make informed and rapid decisions by analyzing large volumes of complex data in near real-time and presenting it in a manner that facilitates decision making. It provides cost savings by alerting and predicting when sensor degradations pass a critical threshold and impact mission operations. Operational analytics, which uses Big Data tools and technologies, can process very large data sets containing a variety of data types to uncover hidden patterns, unknown correlations, and other relevant information. When combined with predictive techniques, it provides a mechanism to monitor and visualize these data sets and gain insight into degradations encountered in large sensor systems such as the space surveillance network. In this study, data from a notional sensor are simulated, and we use big data technologies, predictive algorithms, and operational analytics to process the data and predict sensor degradations. This study uses data products that would commonly be analyzed at a site, and builds on a big data architecture that has previously proven valuable in detecting anomalies. This paper outlines our methodology for implementing an operational analytic solution through data discovery, learning and training of data modeling and predictive techniques, and deployment. Through this methodology, we implement a functional architecture focused on exploring available big data sets and determining practical analytic, visualization, and predictive technologies.
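To make the degradation-prediction idea concrete, here is a minimal sketch assuming a simple linear degradation trend; the health metric, threshold, trend window, and sampling interval are illustrative inventions, not parameters from the study.

```python
import numpy as np

def time_to_threshold(values, threshold, dt=1.0, window=30):
    """Fit a linear trend to the last `window` samples of a sensor-health
    metric and extrapolate when it crosses `threshold`. Returns the
    remaining time in the units of `dt`, or None if not degrading."""
    y = np.asarray(values[-window:], dtype=float)
    t = np.arange(len(y)) * dt
    slope, intercept = np.polyfit(t, y, 1)
    if slope <= 0:
        return None                      # metric is stable or improving
    remaining = (threshold - intercept) / slope - t[-1]
    return max(remaining, 0.0)

# Illustrative history: a noise-floor metric drifting ~0.02 units/hour.
rng = np.random.default_rng(2)
history = 1.0 + 0.02 * np.arange(60) + 0.05 * rng.standard_normal(60)
print(time_to_threshold(history, threshold=2.5), "hours until alert")
```

A production system would replace the linear fit with the trained predictive models the paper alludes to, but the alert logic (extrapolate, compare against a mission threshold) stays the same.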
Challa, Shruthi; Potumarthi, Ravichandra
2013-01-01
Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products in order to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adapt PAT tools to pharmaceutical and biopharmaceutical environments, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
Tanaka, Ryoma; Takahashi, Naoyuki; Nakamura, Yasuaki; Hattori, Yusuke; Ashizawa, Kazuhide; Otsuka, Makoto
2017-01-01
Resonant acoustic® mixing (RAM) technology is a system that performs high-speed mixing by vibration through the control of acceleration and frequency. In recent years, real-time process monitoring and prediction have become of increasing interest, and process analytical technology (PAT) systems are increasingly being introduced into actual manufacturing processes. This study examined the application of PAT combining RAM, near-infrared spectroscopy, and chemometric technology as a set of PAT tools for introduction into actual pharmaceutical powder blending processes. Content uniformity was assessed with a robust partial least squares regression (PLSR) model constructed to account for the RAM configuration parameters and the changing concentrations of the components. As a result, in-line real-time prediction of the active pharmaceutical ingredient and other additives using chemometric technology was successfully demonstrated. This system is expected to be applicable to the RAM method for the risk management of quality.
An NCI-FDA Interagency Oncology Task Force (IOTF) Molecular Diagnostics Workshop was held on October 30, 2008 in Cambridge, MA, to discuss requirements for analytical validation of protein-based multiplex technologies in the context of their intended use. This workshop, developed through NCI's Clinical Proteomic Technologies for Cancer initiative and the FDA, focused on technology-specific analytical validation processes to be addressed prior to use in clinical settings. To make the workshop unique, a case study approach was used to discuss issues related to
Experiments with Analytic Centers: A confluence of data, tools and help in using them.
NASA Astrophysics Data System (ADS)
Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.
2017-12-01
Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged indicating that different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of that particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.
Rüdt, Matthias; Briskot, Till; Hubbuch, Jürgen
2017-03-24
Process analytical technologies (PAT) for the manufacturing of biologics have drawn increased interest in the last decade. Besides being encouraged by the Food and Drug Administration's (FDA's) PAT initiative, PAT promises to improve process understanding, reduce overall production costs, and help to implement continuous manufacturing. This article focuses on spectroscopic tools for PAT in downstream processing (DSP); recent advances and future perspectives are reviewed. In order to exploit the full potential of the gathered data, chemometric tools are widely used for the evaluation of complex spectroscopic information; thus, an introduction to the field is given. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
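As one concrete example of the chemometric pretreatment such spectra usually receive before model building, the sketch below applies a Savitzky-Golay derivative followed by standard normal variate (SNV) scaling; the filter settings and synthetic data are assumptions for illustration, not recommendations from the article.

```python
import numpy as np
from scipy.signal import savgol_filter

def preprocess(spectra, window=15, poly=2, deriv=1):
    """Savitzky-Golay derivative (suppresses baseline drift) followed by
    SNV scaling (removes multiplicative scatter), row-wise per spectrum."""
    d = savgol_filter(spectra, window, poly, deriv=deriv, axis=1)
    return (d - d.mean(axis=1, keepdims=True)) / d.std(axis=1, keepdims=True)

# Illustrative input: 10 spectra x 200 channels with random baseline offsets.
rng = np.random.default_rng(1)
raw = rng.random((10, 200)) + 5.0 * rng.random((10, 1))
print(preprocess(raw).shape)   # (10, 200), ready for PLS or PCA
```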
Wirges, M; Funke, A; Serno, P; Knop, K; Kleinebudde, P
2013-05-05
Incorporation of an active pharmaceutical ingredient (API) into the coating layer of film-coated tablets is a method mainly used to formulate fixed-dose combinations. Uniform and precise spray-coating of an API represents a substantial challenge, which can be overcome by applying Raman spectroscopy as a process analytical tool. In the pharmaceutical industry, Raman spectroscopy is still mainly used as a benchtop laboratory analytical method and is usually not implemented in the production process. Many scientific approaches stop at the level of feasibility studies and never make the step to production-scale process applications. The present work focuses on the scale-up of an active coating process, a step of the highest importance during pharmaceutical development. Active coating experiments were performed at lab and production scale. Using partial least squares (PLS) regression, a multivariate model was constructed by correlating in-line measured Raman spectral data with the coated amount of API. By transferring this model, originally implemented for a lab-scale process, to a production-scale process, the robustness of this analytical method, and thus its applicability as a Process Analytical Technology (PAT) tool for correct endpoint determination in pharmaceutical manufacturing, could be shown. Finally, the method was validated according to the European Medicines Agency (EMA) guideline with respect to the special requirements of the applied in-line model development strategy. Copyright © 2013 Elsevier B.V. All rights reserved.
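A minimal sketch of the PLS calibration workflow described above, using synthetic spectra in place of in-line Raman data; the band shape, noise level, coated-amount range, and number of latent variables are illustrative assumptions, not the authors' values.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Stand-in for in-line spectra: 40 coating time points, 500 channels;
# a Gaussian "API band" grows in proportion to the coated amount y.
y = np.linspace(0.0, 10.0, 40)            # mg API per tablet (assumed units)
band = np.exp(-0.5 * ((np.arange(500) - 250) / 12) ** 2)
X = np.outer(y, band) + 0.05 * rng.standard_normal((40, 500))

pls = PLSRegression(n_components=3)
print(cross_val_score(pls, X, y, cv=5, scoring="r2").mean())
pls.fit(X, y)
print(pls.predict(X[:3]).ravel())         # predicted coated amount
```

Model transfer between scales then amounts to verifying that production-scale spectra, after the same pretreatment, fall within the calibration space of this model.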
Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong
2015-01-01
This study presents indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. The results were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. These weights may prove useful in avoiding recourse to qualitative means in the absence of weights between indicators when integrating the results of quantitative assessment by indicator. This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present a future direction for their quantitative assessment.
Process analytical technology in the pharmaceutical industry: a toolkit for continuous improvement.
Scott, Bradley; Wilcock, Anne
2006-01-01
Process analytical technology (PAT) refers to a series of tools used to ensure that quality is built into products while at the same time improving the understanding of processes, increasing efficiency, and decreasing costs. It has not been widely adopted by the pharmaceutical industry. As the setting for this paper, the current pharmaceutical manufacturing paradigm and PAT guidance to date are discussed prior to the review of PAT principles and tools, benefits, and challenges. The PAT toolkit contains process analyzers, multivariate analysis tools, process control tools, and continuous improvement/knowledge management/information technology systems. The integration and implementation of these tools is complex, and has resulted in uncertainty with respect to both regulation and validation. The paucity of staff knowledgeable in this area may complicate adoption. Studies to quantitate the benefits resulting from the adoption of PAT within the pharmaceutical industry would be a valuable addition to the qualitative studies that are currently available.
Opportunity and Challenges for Migrating Big Data Analytics in Cloud
NASA Astrophysics Data System (ADS)
Amitkumar Manekar, S.; Pradeepini, G.
2017-08-01
Big Data analytics is a big term nowadays. With increasingly demanding and scalable data generation, data acquisition and storage have become crucial issues. Cloud storage is now a major platform, and the technology is becoming crucial to executives handling data powered by analytics. The trend toward "big data-as-a-service" is discussed everywhere. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost; on the other, researchers are still working to solve security and other real-time problems of big data migration to cloud platforms. This article focuses on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration, and the possibility of performing big data analytics on a cloud platform, are in demand for a new era of growth. The article also surveys the available technologies and techniques for migrating big data to the cloud.
Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A
2007-10-31
The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between what is currently possible in the supervision and control of pharmaceutical production processes and what is actually applied in industrial manufacturing. With rigid approval practices based on standard operating procedures, adaptation of production reactors to the state of the art was more or less inhibited for many years. Now PAT paves the way for continuous process and product improvement through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing: the cultivation of genetically modified Escherichia coli bacteria.
Fold-Back: Using Emerging Technologies to Move from Quality Assurance to Quality Enhancement
ERIC Educational Resources Information Center
Leonard, Simon N.; Fitzgerald, Robert N.; Bacon, Matt
2016-01-01
Emerging technologies offer an opportunity for the development, at the institutional level, of quality processes with greater capacity to enhance learning in higher education than available through current quality processes. These systems offer the potential to extend use of learning analytics in institutional-level quality processes in addition…
NASA Astrophysics Data System (ADS)
Wu, Huiquan; Khan, Mansoor
2012-08-01
As an emerging technology, THz spectroscopy has gained increasing attention in the pharmaceutical area during the last decade. This attention is due to the facts that (1) it provides a promising alternative approach for in-depth understanding of both intermolecular interactions among pharmaceutical molecules and pharmaceutical product quality attributes; (2) it provides a promising alternative approach for enhanced process understanding of certain pharmaceutical manufacturing processes; and (3) it is encouraged by the FDA's pharmaceutical quality initiatives, most notably the Process Analytical Technology (PAT) initiative. In this work, the current status of and progress made so far in using THz spectroscopy for pharmaceutical development and pharmaceutical PAT applications are reviewed. In the spirit of demonstrating the utility of a first-principles modeling approach for addressing the model validation challenge and reducing unnecessary model validation "burden" in THz pharmaceutical PAT applications, two scientific case studies based on published THz spectroscopy measurement results are created and discussed. Furthermore, other technical challenges and opportunities associated with adopting THz spectroscopy as a pharmaceutical PAT tool are highlighted.
Group decision making with the analytic hierarchy process in benefit-risk assessment: a tutorial.
Hummel, J Marjan; Bridges, John F P; IJzerman, Maarten J
2014-01-01
The analytic hierarchy process (AHP) has been increasingly applied as a technique for multi-criteria decision analysis in healthcare. The AHP can aid decision makers in selecting the most valuable technology for patients, while taking into account multiple, and even conflicting, decision criteria. This tutorial illustrates the procedural steps of the AHP in supporting group decision making about new healthcare technology, including (1) identifying the decision goal, decision criteria, and alternative healthcare technologies to compare, (2) structuring the decision criteria, (3) judging the value of the alternative technologies on each decision criterion, (4) judging the importance of the decision criteria, (5) calculating group judgments, (6) analyzing the inconsistency in judgments, (7) calculating the overall value of the technologies, and (8) conducting sensitivity analyses. The AHP is illustrated via a hypothetical example, adapted from an empirical AHP analysis on the benefits and risks of tissue regeneration to repair small cartilage lesions in the knee.
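A minimal sketch of steps (3) through (6) under assumed judgments: two hypothetical experts' pairwise matrices on the 1-9 Saaty scale are aggregated by element-wise geometric mean, weights are read from the principal eigenvector, and inconsistency is checked via the consistency ratio.

```python
import numpy as np

RI = {3: 0.58, 4: 0.90, 5: 1.12}   # Saaty's random consistency indices

def ahp_weights(A):
    """Weights from the principal eigenvector of a pairwise comparison
    matrix, plus the consistency ratio (CR < 0.10 is conventionally OK)."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)
    return w, ci / RI[n]

# Two hypothetical experts judge three decision criteria.
A1 = np.array([[1, 3, 5], [1/3, 1, 3], [1/5, 1/3, 1]])
A2 = np.array([[1, 2, 4], [1/2, 1, 2], [1/4, 1/2, 1]])
group = np.sqrt(A1 * A2)            # element-wise geometric mean (step 5)
weights, cr = ahp_weights(group)
print(weights, cr)
```

Step (8), the sensitivity analysis, amounts to perturbing individual judgments and re-running the same computation to see whether the ranking holds.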
Innovations in coating technology.
Behzadi, Sharareh S; Toegel, Stefan; Viernstein, Helmut
2008-01-01
Despite representing one of the oldest pharmaceutical techniques, coating of dosage forms is still frequently used in pharmaceutical manufacturing. The aims of coating range from simply masking the taste or odour of drugs to sophisticated control of the site and rate of drug release. The high expectations placed on different coating technologies have required great efforts in the development of reproducible and controllable production processes. Basically, improvements in coating methods have focused on particle movement, spraying systems, and air and energy transport, whereby homogeneous distribution of coating material and increased drying efficiency should be accomplished in order to achieve high end-product quality. Moreover, given the FDA's expectation that end-product quality be designed in during the manufacturing process (Quality by Design), the development of analytical methods for the analysis, management, and control of coating processes has attracted special attention in recent years. The present review focuses on recent patents claiming improvements in pharmaceutical coating technology and intends first to familiarize the reader with the available procedures and subsequently to explain the application of different analytical tools. To structure this comprehensive field, coating technologies are primarily divided into pan and fluidized-bed coating methods. Regarding pan coating procedures, pans rotating around inclined, horizontal, and vertical axes are reviewed separately. Fluidized-bed technologies, in turn, are subdivided into those involving fluidized and spouted beds. Continuous processing techniques and improvements in spraying systems are then discussed in dedicated chapters. Finally, currently used analytical methods for the understanding and management of coating processes are reviewed in detail in the last section.
Early Alert of Academically At-Risk Students: An Open Source Analytics Initiative
ERIC Educational Resources Information Center
Jayaprakash, Sandeep M.; Moody, Erik W.; Lauría, Eitel J. M.; Regan, James R.; Baron, Joshua D.
2014-01-01
The Open Academic Analytics Initiative (OAAI) is a collaborative, multi-year grant program aimed at researching issues related to the scaling up of learning analytics technologies and solutions across all of higher education. The paper describes the goals and objectives of the OAAI, depicts the process and challenges of collecting, organizing and…
Technology to improve quality and accountability.
Kay, Jonathan
2006-01-01
A body of evidence has been accumulated to demonstrate that current practice is not sufficiently safe for several stages of central laboratory testing. In particular, while analytical and perianalytical steps that take place within the laboratory are subjected to quality control procedures, this is not the case for several pre- and post-analytical steps. The ubiquitous application of auto-identification technology seems to represent a valuable tool for reducing error rates. A series of projects in Oxford has attempted to improve processes which support several areas of laboratory medicine, including point-of-care testing, blood transfusion, delivery and interpretation of reports, and support of decision-making by clinicians. The key tools are auto-identification, Internet communication technology, process re-engineering, and knowledge management.
NASA Astrophysics Data System (ADS)
Dirpan, Andi
2018-05-01
This research was intended to select the best handling method or postharvest technology for maintaining the quality of citrus fruit in Selayar, South Sulawesi, Indonesia, from among (1) modified atmosphere packaging (MAP), (2) controlled atmosphere storage (CAS), (3) coatings, (4) hot water treatment (HWT), and (5) hot calcium dip (HCD), using a combination of the analytic hierarchy process (AHP) and TOPSIS. Improving quality, applicability, increasing shelf life, and reducing cost were used as the criteria to determine the best postharvest technology. The results show that the most important criterion for selecting postharvest technology is improving quality, followed by increasing shelf life, reducing cost, and applicability. Furthermore, the TOPSIS ranking places modified atmosphere packaging (MAP) first, followed by controlled atmosphere storage (CAS), coatings, hot calcium dip (HCD), and hot water treatment (HWT). It can therefore be concluded that the best postharvest technology for Selayar citrus is modified atmosphere packaging (MAP).
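A hedged sketch of the TOPSIS step: the decision matrix, weights, and benefit/cost labels below are invented for illustration and do not come from the study.

```python
import numpy as np

def topsis(X, w, benefit):
    """Closeness of each alternative (row) to the ideal solution."""
    V = X / np.linalg.norm(X, axis=0) * w      # normalise, then weight
    best = np.where(benefit, V.max(0), V.min(0))
    worst = np.where(benefit, V.min(0), V.max(0))
    d_best = np.linalg.norm(V - best, axis=1)
    d_worst = np.linalg.norm(V - worst, axis=1)
    return d_worst / (d_best + d_worst)        # higher = closer to ideal

# Hypothetical scores on (quality, shelf life, cost, applicability);
# cost is a "lower is better" criterion.
X = np.array([[9, 8, 6, 7],    # MAP
              [8, 8, 4, 5],    # CAS
              [7, 6, 7, 8],    # coatings
              [6, 5, 8, 7],    # HCD
              [5, 5, 9, 8]])   # HWT
w = np.array([0.45, 0.30, 0.15, 0.10])         # e.g., AHP-derived weights
print(topsis(X, w, np.array([True, True, False, True])).argsort()[::-1])
```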
Alharthi, Hana; Sultana, Nahid; Al-Amoudi, Amjaad; Basudan, Afrah
2015-01-01
Pharmacy barcode scanning is used to reduce errors during the medication dispensing process. However, this technology has rarely been used in hospital pharmacies in Saudi Arabia. This article describes the barriers to successful implementation of a barcode scanning system in Saudi Arabia. A literature review was conducted to identify the relevant critical success factors (CSFs) for a successful dispensing barcode system implementation. Twenty-eight pharmacists from a local hospital in Saudi Arabia were interviewed to obtain their perception of these CSFs. In this study, planning (process flow issues and training requirements), resistance (fear of change, communication issues, and negative perceptions about technology), and technology (software, hardware, and vendor support) were identified as the main barriers. The analytic hierarchy process (AHP), one of the most widely used tools for decision making in the presence of multiple criteria, was used to compare and rank these identified CSFs. The results of this study suggest that resistance barriers have a greater impact than planning and technology barriers. In particular, fear of change is the most critical factor, and training is the least critical factor.
Chemical Technology Division, Annual technical report, 1991
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-03-01
Highlights of the Chemical Technology (CMT) Division's activities during 1991 are presented. In this period, CMT conducted research and development in the following areas: (1) electrochemical technology, including advanced batteries and fuel cells; (2) technology for fluidized-bed combustion and coal-fired magnetohydrodynamics; (3) methods for treatment of hazardous and mixed hazardous/radioactive waste; (4) the reaction of nuclear waste glass and spent fuel under conditions expected for an unsaturated repository; (5) processes for separating and recovering transuranic elements from nuclear waste streams; (6) recovery processes for discharged fuel and the uranium blanket in the Integral Fast Reactor (IFR); (7) processes for removal of actinides in spent fuel from commercial water-cooled nuclear reactors and burnup in IFRs; and (8) physical chemistry of selected materials in environments simulating those of fission and fusion energy systems. The Division also conducts basic research in catalytic chemistry associated with molecular energy resources; chemistry of superconducting oxides and other materials of interest with technological application; interfacial processes of importance to corrosion science, catalysis, and high-temperature superconductivity; and the geochemical processes involved in water-rock interactions occurring in active hydrothermal systems. In addition, the Analytical Chemistry Laboratory in CMT provides a broad range of analytical chemistry support services to the technical programs at Argonne National Laboratory (ANL).
Towards an Analytical Framework for Evaluating the Impact of Technology on Future Contexts
2004-02-01
truly revolutionary (disruptive) technologies that have the potential to substantially impact future warfighting operations. It also discusses the roles of, and relationships between, the various participants in such a process.
Yu, Zhou; Reid, Jennifer C; Yang, Yan-Ping
2013-12-01
Protein aggregation is a common challenge in the manufacturing of biological products. It is possible to minimize the extent of aggregation through timely measurement and in-depth characterization of aggregation. In this study, we demonstrated the use of dynamic light scattering (DLS) to monitor inclusion body (IB) solubilization, protein refolding, and aggregation near the production line of a recombinant protein-based vaccine candidate. Our results were in good agreement with those measured by size-exclusion chromatography. DLS was also used to characterize the mechanism of aggregation. As DLS is a quick, nonperturbing technology, it can potentially be used as an at-line process analytical technology to ensure complete IB solubilization and aggregate-free refolding. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
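The size readout underlying such DLS monitoring is the Stokes-Einstein relation, R_h = k_B T / (6 pi eta D); a small worked example with an illustrative diffusion coefficient (not a value from the study):

```python
import math

def hydrodynamic_radius(D, T=298.15, eta=8.9e-4):
    """Stokes-Einstein: R_h = kB*T / (6*pi*eta*D).
    D in m^2/s, T in kelvin, eta in Pa*s (water at 25 C by default)."""
    kB = 1.380649e-23   # Boltzmann constant, J/K
    return kB * T / (6 * math.pi * eta * D)

# An assumed diffusion coefficient of 4.0e-11 m^2/s gives R_h of roughly
# 6 nm, plausible for a small protein; aggregation lowers D and raises R_h.
print(hydrodynamic_radius(4.0e-11) * 1e9, "nm")
```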
Ritrovato, Matteo; Faggiano, Francesco C; Tedesco, Giorgia; Derrico, Pietro
2015-06-01
This article outlines Decision-Oriented Health Technology Assessment: a new implementation of the European network for Health Technology Assessment Core Model that integrates multicriteria decision analysis via the analytic hierarchy process, introducing a standardized methodological approach as a valued and shared tool to support health care decision making within a hospital. Following the Core Model as guidance (European network for Health Technology Assessment. HTA core model for medical and surgical interventions. Available from: http://www.eunethta.eu/outputs/hta-core-model-medical-and-surgical-interventions-10r [Accessed May 27, 2014]), it is possible to apply the analytic hierarchy process to break down a problem into its constituent parts and identify priorities (i.e., assign a weight to each part) in a hierarchical structure. It thus quantitatively compares the importance of multiple criteria in assessing health technologies and how well the alternative technologies perform in satisfying these criteria. Verbal ratings are translated into quantitative form using the Saaty scale (Saaty TL. Decision making with the analytic hierarchy process. Int J Serv Sci 2008;1:83-98). An eigenvector analysis is used to derive the weight systems (i.e., local and global weights) that reflect the importance assigned to the criteria and the priorities related to the performance of the alternative technologies. Compared with the Core Model alone, this methodological approach supplies more timely and better contextualized evidence for a specific technology, making it possible to obtain data that are more relevant and easier to interpret, and therefore more useful for decision makers making investment choices. We conclude that, although there may be scope for improvement, this implementation is a step forward toward the goal of building a "solid bridge" between the scientific evidence and the final decision maker's choice. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
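A minimal sketch of the local-to-global weight synthesis mentioned above, with invented numbers: global priorities are obtained by propagating the criteria weights through each alternative's local weights.

```python
import numpy as np

# Hypothetical local weights from eigenvector analyses: one weight per
# criterion, and each alternative's local weight under each criterion
# (columns sum to 1).
criteria_w = np.array([0.54, 0.30, 0.16])
alt_local = np.array([[0.60, 0.20, 0.50],    # technology A
                      [0.25, 0.50, 0.30],    # technology B
                      [0.15, 0.30, 0.20]])   # technology C

global_w = alt_local @ criteria_w            # weighted synthesis
print(global_w, global_w.sum())              # global priorities, sum to 1
```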
Cogeneration Technology Alternatives Study (CTAS). Volume 2: Analytical approach
NASA Technical Reports Server (NTRS)
Gerlaugh, H. E.; Hall, E. W.; Brown, D. H.; Priestley, R. R.; Knightly, W. F.
1980-01-01
Various advanced energy conversion systems were compared with each other and with current-technology systems with respect to their savings in fuel energy, costs, and emissions, both in individual plants and at a national level. The ground rules established by NASA and the assumptions made by the General Electric Company in performing this cogeneration technology alternatives study are presented. The analytical methodology employed is described in detail and illustrated with numerical examples, together with a description of the computer program used to calculate over 7000 energy conversion system-industrial process applications. For Vol. 1, see 80N24797.
Earthdata Cloud Analytics Project
NASA Technical Reports Server (NTRS)
Ramachandran, Rahul; Lynnes, Chris
2018-01-01
This presentation describes a nascent NASA project to develop a framework supporting end-user analytics of NASA's Earth science data in the cloud. The chief benefit of migrating EOSDIS (Earth Observation System Data and Information Systems) data to the cloud is to position the data next to enormous computing capacity, allowing end users to process data at scale. The Earthdata Cloud Analytics project will use a service-based approach to facilitate the infusion of evolving analytics technology and integration with non-NASA analytics or other complementary functionality at other agencies and in other nations.
Enterprise Systems Value-Based R&D Portfolio Analytics: Methods, Processes, and Tools
2014-01-14
Final Technical Report SERC-2014-TR-041-1, January 14, 2014. Supported by the U.S. Department of Defense through the Systems Engineering Research Center (SERC) under Contract H98230-08-D-0171 (Task Order 0026, RT 51). SERC is a federally funded University Affiliated Research Center managed by Stevens Institute of Technology.
Let's Not Forget: Learning Analytics Are about Learning
ERIC Educational Resources Information Center
Gaševic, Dragan; Dawson, Shane; Siemens, George
2015-01-01
The analysis of data collected from the interaction of users with educational and information technology has attracted much attention as a promising approach for advancing our understanding of the learning process. This promise motivated the emergence of the new research field, learning analytics, and its closely related discipline, educational…
Learning Analytics to Understand Cultural Impacts on Technology Enhanced Learning
ERIC Educational Resources Information Center
Mittelmeier, Jenna; Tempelaar, Dirk; Rienties, Bart; Nguyen, Quan
2016-01-01
In this empirical study, we investigate the role of national cultural dimensions as distal antecedents of the use intensity of e-tutorials, which constitute the digital component within a blended learning course. Profiting from the context of a dispositional learning analytics application, we investigate cognitive processing strategies and…
Narang, Ajit S; Sheverev, Valery; Freeman, Tim; Both, Douglas; Stepaniuk, Vadim; Delancy, Michael; Millington-Smith, Doug; Macias, Kevin; Subramanian, Ganeshkumar
2016-01-01
A drag flow force (DFF) sensor, which measures the force exerted by the wet mass in a granulator on a thin cylindrical probe, was shown to be a promising process analytical technology for real-time, in-line, high-resolution monitoring of wet mass consistency during high-shear wet granulation. Our previous studies indicated that this process analytical technology tool could be correlated to a granulation end point established independently through drug product critical quality attributes. In this study, flow force measurements from a DFF sensor, taken during wet granulation of three placebo formulations with different binder contents, are compared with concurrent at-line FT4 Powder Rheometer characterization of wet granules collected at different time points of processing. The wet mass consistency measured by the DFF sensor correlated well with the granulation's resistance to flow and interparticulate interactions as measured by the FT4 Powder Rheometer. This indicates that the force pulse magnitude measured by the DFF sensor reflects fundamental material properties (e.g., shear viscosity and granule size/density) as they change during the granulation process. These studies indicate that the DFF sensor can be a valuable tool for wet granulation formulation and process development and scale-up, as well as for routine monitoring and control during manufacturing. Copyright © 2016. Published by Elsevier Inc.
Predictive Analytics to Support Real-Time Management in Pathology Facilities.
Lessard, Lysanne; Michalowski, Wojtek; Chen Li, Wei; Amyot, Daniel; Halwani, Fawaz; Banerjee, Diponkar
2016-01-01
Predictive analytics can provide valuable support to the effective management of pathology facilities. The introduction of new tests and technologies in anatomical pathology will increase the volume of specimens to be processed, as well as the complexity of pathology processes. In order for predictive analytics to address managerial challenges associated with the volume and complexity increases, it is important to pinpoint the areas where pathology managers would most benefit from predictive capabilities. We illustrate common issues in managing pathology facilities with an analysis of the surgical specimen process at the Department of Pathology and Laboratory Medicine (DPLM) at The Ottawa Hospital, which processes all surgical specimens for the Eastern Ontario Regional Laboratory Association. We then show how predictive analytics could be used to support management. Our proposed approach can be generalized beyond the DPLM, contributing to a more effective management of pathology facilities and in turn to quicker clinical diagnoses.
Processing plutonium-contaminated soil on Johnston Atoll
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moroney, K.; Moroney, J. III; Turney, J.
1994-07-01
This article describes a cleanup project to process plutonium- and americium-contaminated soil on Johnston Atoll for volume reduction. Thermo Analytical's (TMA's) segmented gate system (SGS) for this remedial operation has been in successful on-site operation since 1992. Topics covered include the basis for development; a description of Johnston Atoll; the significance of results; the benefits of the technology; and applicability to other radiologically contaminated sites. 7 figs., 1 tab.
Nagy, Brigitta; Farkas, Attila; Gyürkés, Martin; Komaromy-Hiller, Szofia; Démuth, Balázs; Szabó, Bence; Nusser, Dávid; Borbás, Enikő; Marosi, György; Nagy, Zsombor Kristóf
2017-09-15
The integration of the Process Analytical Technology (PAT) initiative into continuous pharmaceutical production is indispensable for reliable manufacturing. The present paper reports the implementation of in-line Raman spectroscopy in a continuous blending and tableting process for a three-component model pharmaceutical system containing caffeine as the model active pharmaceutical ingredient (API), glucose as the model excipient, and magnesium stearate as the lubricant. Real-time analysis of API content, blend homogeneity, and tablet content uniformity was performed using a partial least squares (PLS) quantitative method. In-line Raman spectroscopic monitoring showed that the continuous blender was capable of producing blends with high homogeneity and that technological malfunctions could be detected by the proposed PAT method. Raman spectroscopy-based feedback control of the API feeder was also established, creating a 'Process Analytically Controlled Technology' (PACT) that guarantees the required API content in the produced blend. This is, to the best of the authors' knowledge, the first application of Raman spectroscopy to continuous blending and the first Raman-based feedback control in the formulation technology of solid pharmaceuticals. Copyright © 2017 Elsevier B.V. All rights reserved.
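A hedged sketch of what such spectroscopy-driven feedback control of an API feeder could look like, reduced to a proportional controller; the setpoint, gain, and rate limits are assumptions for illustration, not the authors' control law.

```python
def feeder_control(api_pred, rate, setpoint=0.30, kp=50.0, lo=50.0, hi=150.0):
    """Proportional correction of the API feeder rate (g/h) from the
    Raman/PLS-predicted API mass fraction; output clamped to [lo, hi]."""
    return min(hi, max(lo, rate + kp * (setpoint - api_pred)))

# Simulated loop: the blend drifts low in API; the controller compensates.
rate = 100.0
for api_pred in (0.30, 0.27, 0.25, 0.28, 0.30):
    rate = feeder_control(api_pred, rate)
    print(f"predicted API {api_pred:.2f} -> feeder set to {rate:.1f} g/h")
```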
Model and Analytic Processes for Export License Assessments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thompson, Sandra E.; Whitney, Paul D.; Weimar, Mark R.
2011-09-29
This paper represents the Department of Energy Office of Nonproliferation Research and Development (NA-22) Simulations, Algorithms and Modeling (SAM) Program's first effort to identify and frame analytical methods and tools to aid export control professionals in effectively predicting proliferation intent: a complex, multi-step, and multi-agency process. The report focuses on analytical modeling methodologies that alone, or combined, may improve the proliferation export control license approval process. It is a follow-up to an earlier paper describing information sources and environments related to international nuclear technology transfer. This report describes the decision criteria used to evaluate modeling techniques and tools to determine which approaches will be investigated during the final 2 years of the project. The report also details the motivation for why new modeling techniques and tools are needed. The analytical modeling methodologies will enable analysts to evaluate the information environment for relevance to detecting proliferation intent, with specific focus on assessing risks associated with transferring dual-use technologies. Dual-use technologies can be used in both weapons and commercial enterprises. A decision framework was developed to evaluate which of the different analytical modeling methodologies would be most appropriate, conditional on the uniqueness of the approach, data availability, laboratory capabilities, relevance to NA-22 and Office of Arms Control and Nonproliferation (NA-24) research needs, and the impact if successful. Modeling methodologies were divided according to whether they could help micro-level assessments (e.g., improve individual license assessments) or macro-level assessments. Macro-level assessment focuses on suppliers, technology, consumers, economies, and proliferation context. Macro-level assessment technologies scored higher in the area of uniqueness because less work has been done at the macro level. An approach to developing testable hypotheses for the macro-level assessment methodologies is provided. The outcome of this work suggests that we should develop a Bayes Net for micro-level analysis and continue to focus on Bayes Net, System Dynamics, and Economic Input/Output models for assessing macro-level problems. Simultaneously, we need to develop metrics for assessing intent in export control, including the risks and consequences associated with all aspects of export control.
Motion/imagery secure cloud enterprise architecture analysis
NASA Astrophysics Data System (ADS)
DeLay, John L.
2012-06-01
Cloud computing, with storage virtualization and new service-oriented architectures, brings a new perspective to the aspect of a distributed motion imagery and persistent surveillance enterprise. Our existing research is focused mainly on content management, distributed analytics, and WAN distributed cloud networking performance issues of cloud-based technologies. The potential of leveraging cloud-based technologies for hosting motion imagery, imagery, and analytics workflows for DOD and security applications is relatively unexplored. This paper examines technologies for managing, storing, processing, and disseminating motion imagery and imagery within a distributed network environment. Finally, we propose areas for future research in the area of distributed cloud content management enterprises.
ERIC Educational Resources Information Center
Williamson, Ben
2015-01-01
The emergence of digitized health and physical education, or "eHPE", embeds software algorithms in the organization of health and physical education pedagogies. Particularly with the emergence of wearable and mobile activity trackers, biosensors and personal analytics apps, algorithmic processes have an increasingly powerful part to play…
Ethical and Privacy Principles for Learning Analytics
ERIC Educational Resources Information Center
Pardo, Abelardo; Siemens, George
2014-01-01
The massive adoption of technology in learning processes comes with an equally large capacity to track learners. Learning analytics aims at using the collected information to understand and improve the quality of a learning experience. The privacy and ethical issues that emerge in this context are tightly interconnected with other aspects such as…
We've Got Plenty of Data, Now How Can We Use It?
ERIC Educational Resources Information Center
Weiler, Jeffrey K.; Mears, Robert L.
1999-01-01
To mine a large store of school data, a new technology (variously termed data warehousing, data marts, online analytical processing, and executive information systems) is emerging. Data warehousing helps school districts extract and restructure desired data from automated systems and create new databases designed to enhance analytical and…
Usefulness of Analytical Research: Rethinking Analytical R&D&T Strategies.
Valcárcel, Miguel
2017-11-07
This Perspective is intended to help foster true innovation in Research & Development & Transfer (R&D&T) in Analytical Chemistry in the form of advances that are primarily useful for analytical purposes rather than solely for publishing. Devising effective means to strengthen the crucial contribution of Analytical Chemistry to progress in Chemistry, Science & Technology, and Society requires carefully examining the present status of our discipline and also identifying internal and external driving forces with a potential adverse impact on its development. The diagnostic process should be followed by administration of an effective therapy and supported by adoption of a theragnostic strategy if Analytical Chemistry is to enjoy a better future.
Molinos-Senante, María; Gómez, Trinidad; Caballero, Rafael; Hernández-Sancho, Francesc; Sala-Garrido, Ramón
2015-11-01
The selection of the most appropriate wastewater treatment (WWT) technology is a complex problem, since many alternatives are available and many criteria are involved in the decision-making process. To deal with this challenge, the analytic network process (ANP) is applied for the first time to rank a set of seven WWT technology set-ups for secondary treatment in small communities. A major advantage of ANP is that it incorporates interdependent relationships between elements. The results illustrate that extensive technologies (constructed wetlands and pond systems) are the alternatives most preferred by WWT experts. A sensitivity analysis verified that the ranking of WWT alternatives is very stable, with constructed wetlands almost always placed in first position. This paper shows that ANP analysis is well suited to complex decision-making problems, such as the selection of the most appropriate WWT system, and contributes to a better understanding of the multiple interdependences among the elements involved in the assessment. Copyright © 2015 Elsevier B.V. All rights reserved.
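To illustrate how ANP captures interdependence, here is a toy sketch: global priorities are read from the limit of a column-stochastic supermatrix whose entries encode how strongly each element influences the others; the numbers are invented, not taken from the study.

```python
import numpy as np

def limit_priorities(W, tol=1e-9, max_iter=10_000):
    """Raise a column-stochastic supermatrix to successive powers until it
    converges; any column of the limit matrix gives the global priorities."""
    M = W.copy()
    for _ in range(max_iter):
        N = M @ W
        if np.abs(N - M).max() < tol:
            return N[:, 0]
        M = N
    raise RuntimeError("supermatrix did not converge")

# Toy three-element network; each column sums to 1.
W = np.array([[0.0, 0.6, 0.3],
              [0.5, 0.0, 0.7],
              [0.5, 0.4, 0.0]])
print(limit_priorities(W))   # steady-state priorities, sum to 1
```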
Cho, Il-Hoon; Ku, Seockmo
2017-09-30
The development of novel, high-tech solutions for rapid, accurate, and non-laborious microbial detection is imperative to improve the global food supply. Such solutions have begun to address the need for microbial detection that is faster and more sensitive than existing methodologies (e.g., classic culture enrichment methods). Multiple reviews report the technical functions and structures of conventional microbial detection tools. These tools, used to detect pathogens in food and food homogenates, were designed around qualitative analysis methods whose inherent disadvantage is the need for time-consuming specimen preparation. While some literature describes the challenges and opportunities in overcoming the technical issues related to food industry legal guidelines, there is a lack of reviews of current attempts to overcome the technological limitations of sample preparation and microbial detection via nano- and microtechnologies. In this review, we primarily explore current analytical technologies, including metallic and magnetic nanomaterials, optics, electrochemistry, and spectroscopy. These techniques rely on the early detection of pathogens via enhanced analytical sensitivity and specificity. To introduce the potential combination and comparative analysis of various advanced methods, we also reference a novel sample preparation protocol that uses microbial concentration and recovery technologies. This technology has the potential to expedite the pre-enrichment step that precedes the detection process.
Lending Officers' Decisions to Recommend Innovative Agricultural Technology.
ERIC Educational Resources Information Center
McIntosh, Wm. Alex; Zey-Ferrell, Mary
1986-01-01
Path analysis examines an analytical model of decision making by lending officers of 211 Texas banks when recommending agricultural technology to farmer-clients. Model analyzes effects of loan officers' ascribed/achieved personal characteristics and perceptions of organizational constraints during three stages of decision process: using…
Application of analytic hierarchy process in a waste treatment technology assessment in Mexico.
Taboada-González, Paul; Aguilar-Virgen, Quetzalli; Ojeda-Benítez, Sara; Cruz-Sotelo, Samantha
2014-09-01
The high per capita generation of solid waste and the environmental problems in major rural communities of Ensenada, Baja California, have prompted authorities to seek alternatives for waste treatment. In the absence of a selection methodology, three waste treatment technologies with energy recovery (an anaerobic digester, a downdraft gasifier, and a plasma gasifier) were evaluated, taking the broader social, political, economic, and environmental issues into consideration. Using the scientific literature as a baseline, interviews with experts, decision makers and the community, together with waste stream studies, were used to construct a hierarchy that was evaluated by the analytic hierarchy process. In terms of the criteria, judgments, and assumptions made in the model, the anaerobic digester was found to have the highest rating and should consequently be selected as the waste treatment technology for this area. The study results showed low sensitivity, so alternative scenarios were not considered. The methodology developed in this study may be useful for other governments that wish to assess and select waste treatment technologies.
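For readers unfamiliar with the mechanics, the core AHP computation is a principal-eigenvector extraction from a pairwise comparison matrix, followed by Saaty's consistency check. The sketch below uses invented judgments on the 1-9 scale, not the study's data.

    import numpy as np

    # Invented pairwise comparisons of three alternatives (Saaty 1-9 scale):
    # A[i, j] says how much alternative i is preferred over alternative j.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)              # principal eigenvalue lambda_max
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                             # normalized priority vector

    n = A.shape[0]
    CI = (eigvals.real[k] - n) / (n - 1)     # consistency index
    RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index
    print("priorities:", w, "consistency ratio:", CI / RI)  # CR < 0.1 is acceptable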
Technology and public policy: The process of technology assessment in the federal government
NASA Technical Reports Server (NTRS)
Coates, V. T.
1975-01-01
A study was conducted to provide a descriptive and analytical review of the concept of technology assessment and the current status of its applications in the work of the federal executive agencies. The origin of the term technology assessment was examined along with a brief history of its discussion and development since 1966 and some of the factors influencing that development.
Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean
2014-01-01
Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.
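The predictive step described above can be pictured as a calibration from attribute values measured after the single-step affinity capture to the values expected after the full purification train. The sketch below is a stand-in using invented paired data and ordinary least squares; the study does not specify its model form.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Invented paired lots: x = attribute measured after affinity capture,
    # y = same attribute measured after the full multi-step train.
    x = np.array([[4.1], [5.0], [6.2], [7.4], [8.1], [9.3]])
    y = np.array([3.8, 4.6, 5.9, 7.0, 7.7, 8.9])

    model = LinearRegression().fit(x, y)
    # Predict the drug-substance-equivalent value for a new affinity sample.
    print(model.predict(np.array([[6.8]])))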
Rosas, Juan G; Blanco, Marcel; González, Josep M; Alcalà, Manel
2012-08-15
Process Analytical Technology (PAT) is playing a central role in current regulations on pharmaceutical production processes. Proper understanding of all operations and variables connecting the raw materials to end products is one of the keys to ensuring product quality and continuous improvement in production. Near infrared spectroscopy (NIRS) has been successfully used to develop faster, non-invasive quantitative methods for real-time prediction of critical quality attributes (CQAs) of pharmaceutical granulates (API content, pH, moisture, flowability, angle of repose and particle size). NIR spectra were acquired from the bin blender after the granulation process in a non-classified area, without the need for sample withdrawal. The methodology used for data acquisition, calibration modelling and method application in this context is relatively inexpensive and can be easily implemented by most pharmaceutical laboratories. For this purpose, the Partial Least-Squares (PLS) algorithm was used to calculate multivariate calibration models, which provided acceptable Root Mean Square Error of Prediction (RMSEP) values (RMSEP(API)=1.0 mg/g; RMSEP(pH)=0.1; RMSEP(Moisture)=0.1%; RMSEP(Flowability)=0.6 g/s; RMSEP(Angle of repose)=1.7° and RMSEP(Particle size)=2.5%), enabling application to routine analyses of production batches. The proposed method affords quality assessment of end products and determination of important parameters with a view to understanding production processes used by the pharmaceutical industry. As shown here, the NIRS technique is a highly suitable tool for Process Analytical Technology. Copyright © 2012 Elsevier B.V. All rights reserved.
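As a rough illustration of the chemometric workflow, the sketch below fits a PLS calibration on synthetic stand-in spectra and computes RMSEP on held-out samples; the data, component count, and split are assumptions, not the study's method settings.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for an NIR data set: rows are granulate samples,
    # columns are absorbances; y is the reference value (e.g., API content).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 700))
    y = 2.0 * X[:, 100] + X[:, 350] + rng.normal(scale=0.1, size=60)

    X_cal, X_val, y_cal, y_val = train_test_split(X, y, random_state=0)
    pls = PLSRegression(n_components=5).fit(X_cal, y_cal)

    y_hat = pls.predict(X_val).ravel()
    rmsep = np.sqrt(np.mean((y_val - y_hat) ** 2))  # root mean square error of prediction
    print(f"RMSEP = {rmsep:.3f}")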
2013-01-01
Influenza virus-like particle (VLP) vaccines are one of the most promising ways to respond to the threat of future influenza pandemics. VLPs are composed of viral antigens but lack nucleic acids, making them non-infectious, which limits the risk of recombination with wild-type strains. By taking advantage of advancements in cell culture technologies, the process from strain identification to manufacturing has the potential to be completed rapidly and easily at large scales. A close review of current research on influenza VLPs makes it evident that the development of quantification methods has been consistently overlooked. VLP quantification at all stages of the production process has been left to rely on current influenza quantification methods (i.e. hemagglutination assay (HA), single radial immunodiffusion assay (SRID), NA enzymatic activity assays, Western blot, electron microscopy). These are analytical methods developed decades ago for influenza virions and final bulk influenza vaccines. Although these methods are time-consuming and cumbersome, they have been sufficient for the characterization of final purified material. Nevertheless, these analytical methods are impractical for in-line process monitoring because VLP concentration in crude samples generally falls outside their range of detection. This consequently impedes the development of robust influenza-VLP production and purification processes. Thus, development of functional process analytical techniques, applicable at every stage of production and compatible with different production platforms, is greatly needed to assess, optimize and exploit the full potential of novel manufacturing platforms. PMID:23642219
Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin
2014-06-01
The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time have caused some resistance to QbD implementation. To show a possible solution, this work proposed a rapid process development method that used direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic process of Ginkgo biloba L. as an example. Breakthrough curves were rapidly determined at-line by DART-MS. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and by HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were rapidly identified; adsorption capacity decreased as the flow rate increased. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.
Perspectives on bioanalytical mass spectrometry and automation in drug discovery.
Janiszewski, John S; Liston, Theodore E; Cole, Mark J
2008-11-01
The use of high speed synthesis technologies has resulted in a steady increase in the number of new chemical entities active in the drug discovery research stream. Large organizations can have thousands of chemical entities in various stages of testing and evaluation across numerous projects on a weekly basis. Qualitative and quantitative measurements made using LC/MS are integrated throughout this process from early stage lead generation through candidate nomination. Nearly all analytical processes and procedures in modern research organizations are automated to some degree. This includes both hardware and software automation. In this review we discuss bioanalytical mass spectrometry and automation as components of the analytical chemistry infrastructure in pharma. Analytical chemists are presented as members of distinct groups with similar skillsets that build automated systems, manage test compounds, assays and reagents, and deliver data to project teams. The ADME-screening process in drug discovery is used as a model to highlight the relationships between analytical tasks in drug discovery. Emerging software and process automation tools are described that can potentially address gaps and link analytical chemistry related tasks. The role of analytical chemists and groups in modern 'industrialized' drug discovery is also discussed.
Data-driven ranch management: A vision for sustainable ranching
USDA-ARS?s Scientific Manuscript database
Introduction The 21st century has ushered in an era of tiny, inexpensive electronics with impressive capabilities for sensing the environment. Also emerging are new technologies for communicating data to computer systems where new analytical tools can process the data. Many of these technologies w...
THE BERKELEY DATA ANALYSIS SYSTEM (BDAS): AN OPEN SOURCE PLATFORM FOR BIG DATA ANALYTICS
2017-09-01
Three Perspectives on Innovation--the Technological, the Political, and the Cultural. Draft.
ERIC Educational Resources Information Center
House, Ernest R.
Awareness of the three analytical perspectives on educational innovation leads to better understanding of educational change processes and better innovation strategies and policies. The three perspectives--technological, political, and cultural--are "screens" of facts, values, and presuppositions through which analysts view innovation.…
Barbin, Douglas Fernandes; Valous, Nektarios A; Dias, Adriana Passos; Camisa, Jaqueline; Hirooka, Elisa Yoko; Yamashita, Fabio
2015-11-01
There is an increasing interest in the use of polysaccharides and proteins for the production of biodegradable films. Visible and near-infrared (VIS-NIR) spectroscopy is a reliable analytical tool for objective analyses of biological sample attributes. The objective is to investigate the potential of VIS-NIR spectroscopy as a process analytical technology for compositional characterization of biodegradable materials and correlation to their mechanical properties. Biofilms were produced by single-screw extrusion with different combinations of polybutylene adipate-co-terephthalate, whole oat flour, glycerol, magnesium stearate, and citric acid. Spectral data were recorded in the range of 400-2498 nm at 2 nm intervals. Partial least squares regression was used to investigate the correlation between spectral information and mechanical properties. Results show that spectral information is influenced by the major constituent components, as samples are clustered according to polybutylene adipate-co-terephthalate content. Regression models using the spectral information as predictors of tensile properties achieved satisfactory results, with calibration coefficients of determination (R²C) of 0.83, 0.88 and 0.92 for elongation, tensile strength, and Young's modulus, respectively. These results corroborate the correlation of NIR spectra with tensile properties, showing that NIR spectroscopy has potential as a rapid analytical technology for non-destructive assessment of the mechanical properties of the films. Copyright © 2015 Elsevier B.V. All rights reserved.
Quality of Big Data in health care.
Sukumar, Sreenivas R; Natarajan, Ramachandran; Ferrell, Regina K
2015-01-01
The current trend in Big Data analytics, and in health information technology in particular, is toward building sophisticated models, methods and tools for business, operational and clinical intelligence. However, the critical issue of the data quality required for these models is not getting the attention it deserves. The purpose of this paper is to highlight the issues of data quality in the context of Big Data health care analytics. The insights presented in this paper are the results of analytics work done in different organizations on a variety of health data sets. The data sets include Medicare and Medicaid claims, provider enrollment data sets from both public and private sources, and electronic health records from regional health centers accessed through partnerships with health care claims processing entities under health-privacy-protected guidelines. Assessment of data quality in health care has to consider: first, the entire lifecycle of health data; second, problems arising from errors and inaccuracies in the data itself; third, the source(s) and the pedigree of the data; and fourth, how the underlying purpose of data collection impacts the analytic processing and the knowledge expected to be derived. Automation in the form of data handling, storage, entry and processing technologies is to be viewed as a double-edged sword: at one level automation can be a good solution, while at another it can create a different set of data quality issues. Implementation of health care analytics with Big Data is enabled by a road map that addresses the organizational and technological aspects of data quality assurance. The value derived from the use of analytics should be the primary determinant of data quality. Based on this premise, health care enterprises embracing Big Data should have a road map for a systematic approach to data quality. Health care data quality problems can be so specific that organizations might have to build their own custom software or data quality rule engines. Today, data quality issues are diagnosed and addressed in a piecemeal fashion. The authors recommend a data lifecycle approach and provide a road map better suited to the dimensions of Big Data and the different stages of the analytical workflow.
Biosensor technology: technology push versus market pull.
Luong, John H T; Male, Keith B; Glennon, Jeremy D
2008-01-01
Biosensor technology is based on a specific biological recognition element in combination with a transducer for signal processing. Since the field's inception, biosensors have been expected to play a significant analytical role in medicine, agriculture, food safety, homeland security, and environmental and industrial monitoring. However, the commercialization of biosensor technology has significantly lagged behind the research output, as reflected by a plethora of publications and patenting activities. The rationale behind the slow and limited technology transfer can be attributed to cost considerations and some key technical barriers. Analytical chemistry has changed considerably, driven by automation, miniaturization, and system integration with high throughput for multiple tasks. Such requirements pose a great challenge to biosensor technology, which is often designed to detect a single or a few target analytes. Successful biosensors must be versatile enough to support interchangeable biorecognition elements, and miniaturization must be feasible to allow automation for parallel sensing with ease of operation at a competitive cost. A significant upfront investment in research and development is a prerequisite for the commercialization of biosensors. Progress in such endeavors has been incremental with limited success; thus, market entry for a new venture is very difficult unless a niche product can be developed with a considerable market volume.
Exploring the Analytical Processes of Intelligence Analysts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, George; Kuchar, Olga A.; Wolf, Katherine E.
We present an observational case study in which we investigate and analyze the analytical processes of intelligence analysts. Participating analysts in the study carry out two scenarios where they organize and triage information, conduct intelligence analysis, report results, and collaborate with one another. Through a combination of artifact analyses, group interviews, and participant observations, we explore the space and boundaries in which intelligence analysts work and operate. We also assess the implications of our findings on the use and application of relevant information technologies.
Business Intelligence in Process Control
NASA Astrophysics Data System (ADS)
Kopčeková, Alena; Kopček, Michal; Tanuška, Pavol
2013-12-01
The Business Intelligence technology, which represents a strong tool not only for decision-making support but also has big potential in other fields of application, is discussed in this paper. Necessary fundamental definitions are offered and explained to better understand the basic principles and the role of this technology in company management. The article is logically divided into five main parts. The first part defines the technology and lists its main advantages. The second part presents an overview of the system architecture with a brief description of its separate building blocks and shows the hierarchical nature of the architecture. The technology life cycle, consisting of four steps that are mutually interconnected into a ring, is described in the third part. The fourth part summarises the analytical methods of online analytical processing and data mining used within business intelligence, as well as the related data mining methodologies, and introduces some typical applications of the particular methods mentioned. In the final part, a proposal of a knowledge discovery system for hierarchical process control is outlined. The focus of this paper is to provide a comprehensive view and to familiarize the reader with the Business Intelligence technology and its utilisation.
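As a small, concrete taste of the online analytical processing summarised in the fourth part, the sketch below rolls a toy process-control fact table up along two dimensions with pandas; the table and column names are invented.

    import pandas as pd

    # Toy fact table: one row per batch measurement.
    facts = pd.DataFrame({
        "line":      ["A", "A", "B", "B", "A", "B"],
        "shift":     ["day", "night", "day", "night", "day", "night"],
        "product":   ["P1", "P1", "P2", "P2", "P2", "P1"],
        "yield_pct": [96.2, 94.8, 91.5, 90.9, 92.7, 95.1],
    })

    # Roll-up: aggregate the measure over the line and shift dimensions.
    print(facts.pivot_table(values="yield_pct", index="line",
                            columns="shift", aggfunc="mean"))

    # Drill-down: bring the product dimension back in.
    print(facts.groupby(["line", "product"])["yield_pct"].mean())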
Use of Audiovisual Texts in University Education Process
ERIC Educational Resources Information Center
Aleksandrov, Evgeniy P.
2014-01-01
Audio-visual learning technologies offer great opportunities in the development of students' analytical and projective abilities. These technologies can be used in classroom activities and for homework. This article discusses the features of audiovisual media texts use in a series of social sciences and humanities in the University curriculum.
Leveraging Big-Data for Business Process Analytics
ERIC Educational Resources Information Center
Vera-Baquero, Alejandro; Colomo Palacios, Ricardo; Stantchev, Vladimir; Molloy, Owen
2015-01-01
Purpose: This paper aims to present a solution that enables organizations to monitor and analyse the performance of their business processes by means of Big Data technology. Business process improvement can drastically influence in the profit of corporations and helps them to remain viable. However, the use of traditional Business Intelligence…
Big data analytics as a service infrastructure: challenges, desired properties and solutions
NASA Astrophysics Data System (ADS)
Martín-Márquez, Manuel
2015-12-01
CERN's accelerator complex generates a very large amount of data. A large volume of heterogeneous data is constantly generated from control equipment and monitoring agents, and these data must be stored and analysed. Over the decades, CERN's research and engineering teams have applied different approaches, techniques and technologies for this purpose. This situation has minimised the necessary collaboration and, more relevantly, the cross-domain data analytics; these two factors are essential to unlock hidden insights and correlations between the underlying processes, which enable better and more efficient daily accelerator operations and more informed decisions. The proposed Big Data Analytics as a Service Infrastructure aims to: (1) integrate the existing developments; (2) centralise and standardise the complex data analytics needs of CERN's research and engineering community; (3) deliver real-time and batch data analytics and information discovery capabilities; and (4) provide transparent access and Extract, Transform and Load (ETL) mechanisms to the various mission-critical existing data repositories. This paper presents the desired objectives and properties resulting from the analysis of CERN's data analytics requirements; the main challenges (technological, collaborative and educational); and potential solutions.
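The batch-analytics layer of such an infrastructure might look, in miniature, like the PySpark sketch below; the file names, schema, and aggregates are illustrative assumptions, not CERN's actual pipeline.

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("batch-analytics").getOrCreate()

    # Hypothetical monitoring log: device id, timestamp, sensor reading.
    df = spark.read.csv("monitoring_logs.csv", header=True, inferSchema=True)

    # Per-device aggregates over the whole stored history (batch analytics).
    summary = (df.groupBy("device")
                 .agg(F.avg("reading").alias("mean_reading"),
                      F.max("reading").alias("peak_reading")))
    summary.write.mode("overwrite").parquet("device_summary.parquet")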
Nakano, Yoshio; Katakuse, Yoshimitsu; Azechi, Yasutaka
2018-06-01
An attempt was made to apply X-ray fluorescence (XRF) analysis to evaluate a small-particle coating process as a process analytical technology (PAT). XRF analysis was used to monitor the coating level in the small-particle coating process in an at-line manner. A small-particle coating process usually consists of multiple coating steps. This study used simple coated particles prepared by a first coating (layering) of a model compound (DL-methionine) and a second coating of talc on spherical microcrystalline cellulose cores; particles with a two-layer coating are sufficient to demonstrate the process. The results showed that the XRF signals played different roles: the signals from the first coating (layering) and the second coating (mask coating) tracked the extent of coating through different mechanisms. Furthermore, the coating of particles of different sizes was also investigated to evaluate the size effect on these coating processes. From these results, it was concluded that XRF can be used as a PAT tool for monitoring particle coating processes and can become a powerful tool in pharmaceutical manufacturing.
ERIC Educational Resources Information Center
Yen, Cheng-Huang; Chen, I-Chuan; Lai, Su-Chun; Chuang, Yea-Ru
2015-01-01
Traces of learning behaviors generally provide insights into learners and the learning processes that they employ. In this article, a learning-analytics-based approach is proposed for managing cognitive load by adjusting the instructional strategies used in online courses. The technology-based learning environment examined in this study involved a…
Troy, Declan J; Ojha, Kumari Shikha; Kerry, Joseph P; Tiwari, Brijesh K
2016-10-01
New and emerging robust technologies can play an important role in ensuring a more resilient meat value chain and satisfying consumer demands and needs. This paper outlines various novel thermal and non-thermal technologies which have shown potential for meat processing applications. A number of process analytical techniques which have shown potential for rapid, real-time assessment of meat quality are also discussed. The commercial uptake and consumer acceptance of novel technologies in meat processing have been subjects of great interest over the past decade. Consumer focus group studies have shown that consumer expectations and liking for novel technologies, applicable to meat processing applications, vary significantly. This overview also highlights the necessity for meat processors to address consumer risk-benefit perceptions, knowledge and trust in order to be commercially successful in the application of novel technologies within the meat sector. Copyright © 2016. Published by Elsevier Ltd.
Novel approach to investigation of semiconductor MOCVD by microreactor technology
NASA Astrophysics Data System (ADS)
Konakov, S. A.; Krzhizhanovskaya, V. V.
2017-11-01
Metal-organic chemical vapour deposition (MOCVD) is a very complex technology that requires further investigation and optimization. We propose to apply microreactors (1) to replace multiple expensive, time-consuming macroscale experiments with just one microreactor deposition yielding many measurement points on one substrate, and (2) to derive chemical reaction rates from individual deposition profiles using a theoretical analytical solution. In this paper we also present the analytical solution of a simplified equation describing the dependence of deposition rate on temperature. It makes it possible to solve an inverse problem and to obtain detailed information about the chemical reaction mechanism of the MOCVD process.
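The inverse problem mentioned above can be illustrated by the standard linearized Arrhenius fit: plotting the logarithm of the deposition rate against reciprocal temperature yields the apparent activation energy from the slope. The rates below are invented for illustration.

    import numpy as np

    R = 8.314  # gas constant, J/(mol K)

    # Hypothetical growth rates (nm/min) read off one deposition profile
    # at several substrate temperatures (K).
    T = np.array([750.0, 800.0, 850.0, 900.0, 950.0])
    r = np.array([2.1, 5.8, 14.0, 30.5, 60.2])

    # Linearized Arrhenius form: ln r = ln A - Ea / (R T)
    slope, intercept = np.polyfit(1.0 / T, np.log(r), 1)
    Ea, A = -slope * R, np.exp(intercept)
    print(f"Ea ~ {Ea / 1000:.0f} kJ/mol, pre-exponential A ~ {A:.2e}")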
Process for Selecting System Level Assessments for Human System Technologies
NASA Technical Reports Server (NTRS)
Watts, James; Park, John
2006-01-01
The integration of the many life support systems necessary to construct a stable habitat is difficult. The correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessments and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues on topics at the system or component level. Assessments are used to identify risks, encourage future actions to mitigate risks, or spur further studies.
Stillhart, Cordula; Kuentz, Martin
2012-02-05
Self-emulsifying drug delivery systems (SEDDS) are complex mixtures in which drug quantification can become a challenging task. Thus, a general need exists for novel analytical methods, and particular interest lies in techniques with potential for process monitoring. This article compares Raman spectroscopy with high-resolution ultrasonic resonator technology (URT) for drug quantification in SEDDS. The model drugs fenofibrate, indomethacin, and probucol were quantitatively assayed in different self-emulsifying formulations. We measured ultrasound velocity and attenuation in the bulk formulation containing drug at different concentrations. The formulations were also studied by Raman spectroscopy, using both an in-line immersion probe for the bulk formulation and a multi-fiber sensor for measuring through hard-gelatin capsules filled with SEDDS. Each method was assessed by calculating the relative standard error of prediction (RSEP) as well as the limit of quantification (LOQ) and the mean recovery. Raman spectroscopy led to excellent calibration models for the bulk formulation as well as the capsules. The RSEP depended on the SEDDS type, with values of 1.5-3.8%, while LOQ was between 0.04 and 0.35% (w/w) for drug quantification in the bulk. Similarly, the analysis of the capsules led to RSEP of 1.9-6.5% and LOQ of 0.01-0.41% (w/w). On the other hand, ultrasound attenuation resulted in RSEP of 2.3-4.4% and LOQ of 0.1-0.6% (w/w). Moreover, ultrasound velocity provided an interesting analytical response in cases where the drug strongly affected the density or compressibility of the SEDDS. We conclude that ultrasonic resonator technology and Raman spectroscopy constitute suitable methods for drug quantification in SEDDS, which is promising for their use as process analytical technologies. Copyright © 2011 Elsevier B.V. All rights reserved.
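RSEP, the figure of merit used in this comparison, is commonly computed as the RMSEP expressed relative to the mean of the reference values; the sketch below assumes that definition and uses invented data.

    import numpy as np

    def rsep(y_true, y_pred):
        # Relative standard error of prediction (%), assuming the common
        # definition: RMSEP divided by the mean reference value.
        y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
        rmsep = np.sqrt(np.mean((y_true - y_pred) ** 2))
        return 100.0 * rmsep / y_true.mean()

    # Invented reference vs. predicted drug loads (% w/w) for one formulation.
    y_ref  = [0.5, 1.0, 2.0, 4.0, 6.0]
    y_pred = [0.52, 0.97, 2.06, 3.91, 6.10]
    print(f"RSEP = {rsep(y_ref, y_pred):.1f}%")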
Formic Acid: Development of an Analytical Method and Use as Process Indicator in Anaerobic Systems
1992-03-01
... distilled to remove compounds such as cinnamic, glycolic and levulinic acids, which can be oxidized to formic acid by ceric sulfate, thus interfering ... (Georgia Institute of Technology, School of Civil Engineering, Atlanta, Georgia 30332.)
Big-BOE: Fusing Spanish Official Gazette with Big Data Technology.
Basanta-Val, Pablo; Sánchez-Fernández, Luis
2018-06-01
The proliferation of new data sources, stemming from the adoption of open-data schemes, in combination with increasing computing capacity, has given rise to a new type of analytics that processes Internet-of-Things data with low-cost engines, speeding up data processing by means of parallel computing. In this context, the article presents an initiative, called Big-BOE, designed to process the Spanish official government gazette (Boletín Oficial del Estado, BOE) with state-of-the-art processing engines, to reduce computation time and to offer additional speed-up for big data analysts. The goal of including a big data infrastructure is to be able to process different BOE documents in parallel with specific analytics, to search for several issues in different documents. The application's infrastructure and processing engine are described from an architectural perspective and from a performance perspective, showing evidence of how this type of infrastructure improves the performance of different types of simple analytics as several machines cooperate.
Barriers to Achieving Economies of Scale in Analysis of EHR Data. A Cautionary Tale.
Sendak, Mark P; Balu, Suresh; Schulman, Kevin A
2017-08-09
Signed in 2009, the Health Information Technology for Economic and Clinical Health Act infused $28 billion of federal funds to accelerate adoption of electronic health records (EHRs). Yet, EHRs have produced mixed results and have even raised concern that the current technology ecosystem stifles innovation. We describe the development process and report initial outcomes of a chronic kidney disease analytics application that identifies high-risk patients for nephrology referral. The cost to validate and integrate the analytics application into clinical workflow was $217,138. Despite the success of the program, redundant development and validation efforts will require $38.8 million to scale the application across all multihospital systems in the nation. We address the shortcomings of current technology investments and distill insights from the technology industry. To yield a return on technology investments, we propose policy changes that address the underlying issues now being imposed on the system by an ineffective technology business model.
Hou, Xiang-Mei; Zhang, Lei; Yue, Hong-Shui; Ju, Ai-Chun; Ye, Zheng-Liang
2016-07-01
The aim was to study and establish a monitoring method for the macroporous resin column chromatography process of salvianolic acids, using near infrared spectroscopy (NIR) as a process analytical technology (PAT). A multivariate statistical process control (MSPC) model was developed based on 7 normal operation batches, and 2 test batches (one normal operation batch and one abnormal operation batch) were used to verify the monitoring performance of this model. The results showed that the MSPC model had good monitoring ability for the column chromatography process. Meanwhile, an NIR quantitative calibration model was established for three key quality indexes (rosmarinic acid, lithospermic acid and salvianolic acid B) using the partial least squares (PLS) algorithm. The verification results demonstrated that this model had satisfactory prediction performance. The combined application of the two models can effectively achieve real-time monitoring of the macroporous resin column chromatography process of salvianolic acids and can be used for on-line analysis of key quality indexes. The established process monitoring method can serve as a reference for the development of process analytical technology for traditional Chinese medicine manufacturing. Copyright© by the Chinese Pharmaceutical Association.
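MSPC models of the kind described here are often built on a latent-variable summary of normal batches, with Hotelling's T² tracking variation inside the model and Q (squared prediction error) tracking variation outside it. A generic sketch with simulated data and empirical control limits, not the study's model:

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    X_ref = rng.normal(size=(70, 10))        # stand-in for normal-batch data
    pca = PCA(n_components=3).fit(X_ref)

    def t2_and_q(pca, x):
        # T2: score distance weighted by component variances; Q: residual.
        t = pca.transform(x.reshape(1, -1))[0]
        resid = x - pca.inverse_transform(t.reshape(1, -1))[0]
        return np.sum(t ** 2 / pca.explained_variance_), np.sum(resid ** 2)

    # Empirical 99th-percentile control limits from the reference data.
    stats = np.array([t2_and_q(pca, x) for x in X_ref])
    t2_lim, q_lim = np.percentile(stats, 99, axis=0)

    x_new = rng.normal(size=10) + 2.0        # deliberately shifted sample
    t2, q = t2_and_q(pca, x_new)
    print("fault" if t2 > t2_lim or q > q_lim else "in control")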
Awotwe Otoo, David; Agarabi, Cyrus; Khan, Mansoor A
2014-07-01
The aim of the present study was to apply an integrated process analytical technology (PAT) approach to control and monitor the effect of the degree of supercooling on critical process and product parameters of a lyophilization cycle. Two concentrations of a mAb formulation were used as models for lyophilization. ControLyo™ technology was applied to control the onset of ice nucleation, whereas tunable diode laser absorption spectroscopy (TDLAS) was utilized as a noninvasive tool for the inline monitoring of the water vapor concentration and vapor flow velocity in the spool during primary drying. The instantaneous measurements were then used to determine the effect of the degree of supercooling on critical process and product parameters. Controlled nucleation resulted in uniform nucleation at lower degrees of supercooling for both formulations, higher sublimation rates, lower mass transfer resistance, lower product temperatures at the sublimation interface, and shorter primary drying times compared with the conventional shelf-ramped freezing. Controlled nucleation also resulted in lyophilized cakes with more elegant and porous structure with no visible collapse or shrinkage, lower specific surface area, and shorter reconstitution times compared with the uncontrolled nucleation. Uncontrolled nucleation however resulted in lyophilized cakes with relatively lower residual moisture contents compared with controlled nucleation. TDLAS proved to be an efficient tool to determine the endpoint of primary drying. There was good agreement between data obtained from TDLAS-based measurements and SMART™ technology. ControLyo™ technology and TDLAS showed great potential as PAT tools to achieve enhanced process monitoring and control during lyophilization cycles. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
State-of-the-Art of (Bio)Chemical Sensor Developments in Analytical Spanish Groups
Plata, María Reyes; Contento, Ana María; Ríos, Angel
2010-01-01
(Bio)chemical sensors are one of the most exciting fields in analytical chemistry today. The development of these analytical devices simplifies and miniaturizes the whole analytical process. Although the initial expectation of the massive incorporation of sensors into routine analytical work has not been fully met, in many other cases analytical methods based on sensor technology have solved important analytical problems. Many research groups are working in this field world-wide, reporting interesting results so far. Modestly, Spanish researchers have contributed to these recent developments. In this review, we summarize the more representative achievements by these groups. They cover a wide variety of sensors, including optical, electrochemical, piezoelectric and electro-mechanical devices, used for laboratory or field analyses. Their capabilities for use in different applied areas are also critically discussed. PMID:22319260
Keystroke Logging in Writing Research: Using Inputlog to Analyze and Visualize Writing Processes
ERIC Educational Resources Information Center
Leijten, Marielle; Van Waes, Luuk
2013-01-01
Keystroke logging has become instrumental in identifying writing strategies and understanding cognitive processes. Recent technological advances have refined logging efficiency and analytical outputs. While keystroke logging allows for ecological data collection, it is often difficult to connect the fine grain of logging data to the underlying…
ERIC Educational Resources Information Center
Burton, Hilary D.
TIS (Technology Information System) is an intelligent gateway system capable of performing quantitative evaluation and analysis of bibliographic citations using a set of Process functions. Originally developed by Lawrence Livermore National Laboratory (LLNL) to analyze information retrieved from three major federal databases, DOE/RECON,…
NASA Astrophysics Data System (ADS)
Vondran, Gary; Chao, Hui; Lin, Xiaofan; Beyer, Dirk; Joshi, Parag; Atkins, Brian; Obrador, Pere
2006-02-01
Running a targeted campaign involves coordination and management across numerous organizations and complex process flows. Everything from market analytics on customer databases, acquiring content and images, composing the materials, meeting the sponsoring enterprise's brand standards, driving through production and fulfillment, and evaluating results is currently performed by experienced, highly trained staff. Presented is a developed solution that not only brings together technologies that automate each process, but also automates the entire flow so that a novice user could easily run a successful campaign from the desktop. This paper presents the technologies, structure, and process flows used to bring this system together. Highlighted is how the complexity of running a targeted campaign is hidden from the user through technologies, all while providing the benefits of a professionally managed campaign.
Microscale technology and biocatalytic processes: opportunities and challenges for synthesis.
Wohlgemuth, Roland; Plazl, Igor; Žnidaršič-Plazl, Polona; Gernaey, Krist V; Woodley, John M
2015-05-01
Despite the expanding presence of microscale technology in chemical synthesis and energy production as well as in biomedical devices and analytical and diagnostic tools, its potential in biocatalytic processes for pharmaceutical and fine chemicals, as well as related industries, has not yet been fully exploited. The aim of this review is to shed light on the strategic advantages of this promising technology for the development and realization of biocatalytic processes and subsequent product recovery steps, demonstrated with examples from the literature. Constraints, opportunities, and the future outlook for the implementation of these key green engineering methods and the role of supporting tools such as mathematical models to establish sustainable production processes are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.
End-User Evaluations of Semantic Web Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCool, Rob; Cowell, Andrew J.; Thurman, David A.
Stanford University's Knowledge Systems Laboratory (KSL) is working in partnership with Battelle Memorial Institute and IBM Watson Research Center to develop a suite of technologies for information extraction, knowledge representation & reasoning, and human-information interaction, collectively entitled 'Knowledge Associates for Novel Intelligence' (KANI). We have developed an integrated analytic environment composed of a collection of analyst associates, software components that aid the user at different stages of the information analysis process. An important part of our participatory design process has been to ensure that our technologies and designs are tightly integrated with the needs and requirements of our end users. To this end, we perform a sequence of evaluations towards the end of the development process to ensure the technologies are both functional and usable. This paper reports on that process.
Applications of Raman Spectroscopy in Biopharmaceutical Manufacturing: A Short Review.
Buckley, Kevin; Ryder, Alan G
2017-06-01
The production of active pharmaceutical ingredients (APIs) is currently undergoing its biggest transformation in a century. The changes are based on the rapid and dramatic introduction of protein- and macromolecule-based drugs (collectively known as biopharmaceuticals) and can be traced back to the huge investment in biomedical science (in particular in genomics and proteomics) that has been ongoing since the 1970s. Biopharmaceuticals (or biologics) are manufactured using biological-expression systems (such as mammalian, bacterial, insect cells, etc.) and have spawned a large (>€35 billion sales annually in Europe) and growing biopharmaceutical industry (BioPharma). The structural and chemical complexity of biologics, combined with the intricacy of cell-based manufacturing, imposes a huge analytical burden to correctly characterize and quantify both processes (upstream) and products (downstream). In small molecule manufacturing, advances in analytical and computational methods have been extensively exploited to generate process analytical technologies (PAT) that are now used for routine process control, leading to more efficient processes and safer medicines. In the analytical domain, biologic manufacturing is considerably behind and there is both a huge scope and need to produce relevant PAT tools with which to better control processes, and better characterize product macromolecules. Raman spectroscopy, a vibrational spectroscopy with a number of useful properties (nondestructive, non-contact, robustness) has significant potential advantages in BioPharma. Key among them are intrinsically high molecular specificity, the ability to measure in water, the requirement for minimal (or no) sample pre-treatment, the flexibility of sampling configurations, and suitability for automation. Here, we review and discuss a representative selection of the more important Raman applications in BioPharma (with particular emphasis on mammalian cell culture). The review shows that the properties of Raman have been successfully exploited to deliver unique and useful analytical solutions, particularly for online process monitoring. However, it also shows that its inherent susceptibility to fluorescence interference and the weakness of the Raman effect mean that it can never be a panacea. In particular, Raman-based methods are intrinsically limited by the chemical complexity and wide analyte-concentration-profiles of cell culture media/bioprocessing broths which limit their use for quantitative analysis. Nevertheless, with appropriate foreknowledge of these limitations and good experimental design, robust analytical methods can be produced. In addition, new technological developments such as time-resolved detectors, advanced lasers, and plasmonics offer potential of new Raman-based methods to resolve existing limitations and/or provide new analytical insights.
NASA Technical Reports Server (NTRS)
Tavana, Madjid
1995-01-01
The evaluation and prioritization of Engineering Support Requests (ESRs) is a particularly difficult task at the Kennedy Space Center (KSC) Shuttle Project Engineering Office. This difficulty is due to the complexities inherent in the evaluation process and the lack of structured information. The evaluation process must consider a multitude of relevant pieces of information concerning Safety, Supportability, O&M Cost Savings, Process Enhancement, Reliability, and Implementation. Various analytical and normative models developed in the past have helped decision makers at KSC utilize large volumes of information in the evaluation of ESRs. The purpose of this project is to build on the existing methodologies and develop a multiple criteria decision support system that captures the decision maker's beliefs through a series of sequential, rational, and analytical processes. The model utilizes the Analytic Hierarchy Process (AHP), subjective probabilities, the entropy concept, and the Maximize Agreement Heuristic (MAH) to enhance the decision maker's intuition in evaluating a set of ESRs.
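Of the techniques listed, the entropy concept is the easiest to make concrete: criteria whose scores vary more across the ESRs carry more information and so receive larger weights. A toy sketch with an invented decision matrix, not KSC data:

    import numpy as np

    # Invented decision matrix: rows = ESRs, columns = criteria
    # (e.g., Safety, Supportability, Cost Savings); higher is better.
    X = np.array([
        [7.0, 5.0, 9.0],
        [6.0, 8.0, 4.0],
        [9.0, 6.0, 5.0],
        [5.0, 7.0, 8.0],
    ])

    P = X / X.sum(axis=0)                  # column-wise normalization
    k = 1.0 / np.log(X.shape[0])
    E = -k * (P * np.log(P)).sum(axis=0)   # entropy of each criterion
    w = (1 - E) / (1 - E).sum()            # more dispersion -> more weight

    scores = (X / X.max(axis=0)) @ w       # simple weighted rating of ESRs
    print("weights:", w, "ranking:", np.argsort(scores)[::-1])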
Chemical Technology Division annual technical report, 1990
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1991-05-01
Highlights of the Chemical Technology (CMT) Division's activities during 1990 are presented. In this period, CMT conducted research and development in the following areas: (1) electrochemical technology, including advanced batteries and fuel cells; (2) technology for coal-fired magnetohydrodynamics and fluidized-bed combustion; (3) methods for recovery of energy from municipal waste and techniques for treatment of hazardous organic waste; (4) the reaction of nuclear waste glass and spent fuel under conditions expected for a high-level waste repository; (5) processes for separating and recovering transuranic elements from nuclear waste streams, concentrating plutonium solids in pyrochemical residues by aqueous biphase extraction, and treating natural and process waters contaminated by volatile organic compounds; (6) recovery processes for discharged fuel and the uranium blanket in the Integral Fast Reactor (IFR); (7) processes for removal of actinides in spent fuel from commercial water-cooled nuclear reactors and burnup in IFRs; and (8) physical chemistry of selected materials in environments simulating those of fission and fusion energy systems. The Division also has a program in basic chemistry research in the areas of fluid catalysis for converting small molecules to desired products; materials chemistry for superconducting oxides and associated and ordered solutions at high temperatures; interfacial processes of importance to corrosion science, high-temperature superconductivity, and catalysis; and the geochemical processes responsible for trace-element migration within the earth's crust. The Analytical Chemistry Laboratory in CMT provides a broad range of analytical chemistry support services to the scientific and engineering programs at Argonne National Laboratory (ANL). 66 refs., 69 figs., 6 tabs.
2005-07-01
approach for measuring the return on Information Technology (IT) investments. A review of existing methods suggests the difficulty in adequately ... measuring the returns of IT at various levels of analysis (e.g., firm or process level). To address this issue, this study aims to develop a method for ... view (KBV), this paper proposes an analytic method for measuring the historical revenue and cost of IT investments by estimating the amount of
ERIC Educational Resources Information Center
Bergeron, Pierrette; Hiller, Christine A.
2002-01-01
Reviews the evolution of competitive intelligence since 1994, including terminology and definitions and analytical techniques. Addresses the issue of ethics; explores how information technology supports the competitive intelligence process; and discusses education and training opportunities for competitive intelligence, including core competencies…
Multidimensional Data Modeling for Business Process Analysis
NASA Astrophysics Data System (ADS)
Mansmann, Svetlana; Neumuth, Thomas; Scholl, Marc H.
The emerging area of business process intelligence attempts to enhance the analytical capabilities of business process management systems by employing data warehousing and mining technologies. This paper presents an approach to re-engineering business process modeling in conformity with the multidimensional data model. Since the business process model and the multidimensional model are driven by rather different objectives and assumptions, there is no straightforward solution to converging these models.
Propellant injection systems and processes
NASA Technical Reports Server (NTRS)
Ito, Jackson I.
1995-01-01
The previous 'Art of Injector Design' is maturing and merging with the more systematic 'Science of Combustion Device Analysis.' This technology can be based upon observation, correlation, experimentation and, ultimately, analytical modeling grounded in basic engineering principles. This methodology is more systematic and far superior to the historical injector design process of 'Trial and Error' or blindly 'Copying Past Successes.' The benefit of such an approach is the ability to rank candidate design concepts for relative probability of success or technical risk across all the important combustion device design requirements and combustion process development risk categories before committing to an engine development program. Even if a single analytical design concept cannot be developed that is predicted to satisfy all requirements simultaneously, a series of risk-mitigating key enabling technologies can be identified for early resolution. Lower-cost subscale or laboratory experimentation to demonstrate proof of principle, critical instrumentation requirements, and design-discriminating test plans can be developed based on the physical insight provided by these analyses.
Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.
Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald
2017-07-01
The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
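The receding-horizon idea behind model predictive control can be shown in a few lines: at every step an input sequence is optimized over a finite horizon against a process model, and only the first move is applied before the optimization is repeated. The scalar linear model and tuning below are illustrative stand-ins for a real (typically nonlinear) cell culture model.

    import numpy as np
    from scipy.optimize import minimize

    # Toy process model: attribute x responds to feed rate u as
    # x[k+1] = a*x[k] + b*u[k].
    a, b = 0.9, 0.5
    setpoint, horizon = 1.0, 10

    def cost(u, x0):
        # Penalize tracking error plus a small control-effort term.
        x, J = x0, 0.0
        for uk in u:
            x = a * x + b * uk
            J += (x - setpoint) ** 2 + 0.01 * uk ** 2
        return J

    def mpc_step(x0):
        # Optimize over the horizon, apply only the first move.
        res = minimize(cost, np.zeros(horizon), args=(x0,),
                       bounds=[(0.0, 2.0)] * horizon)  # actuator limits
        return res.x[0]

    x = 0.0
    for k in range(5):
        u = mpc_step(x)
        x = a * x + b * u   # plant update (here identical to the model)
        print(f"k={k}: u={u:.3f}, x={x:.3f}")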
Visual analytics as a translational cognitive science.
Fisher, Brian; Green, Tera Marie; Arias-Hernández, Richard
2011-07-01
Visual analytics is a new interdisciplinary field of study that calls for a more structured scientific approach to understanding the effects of interaction with complex graphical displays on human cognitive processes. Its primary goal is to support the design and evaluation of graphical information systems that better support cognitive processes in areas as diverse as scientific research and emergency management. The methodologies that make up this new field are as yet ill defined. This paper proposes a pathway for development of visual analytics as a translational cognitive science that bridges fundamental research in human/computer cognitive systems and design and evaluation of information systems in situ. Achieving this goal will require the development of enhanced field methods for conceptual decomposition of human/computer cognitive systems that maps onto laboratory studies, and improved methods for conducting laboratory investigations that might better map onto real-world cognitive processes in technology-rich environments. Copyright © 2011 Cognitive Science Society, Inc.
Development of Process Analytical Technology (PAT) methods for controlled release pellet coating.
Avalle, P; Pollitt, M J; Bradley, K; Cooper, B; Pearce, G; Djemai, A; Fitzpatrick, S
2014-07-01
This work focused on control of the manufacturing process for a controlled release (CR) pellet product within a Quality by Design (QbD) framework. The manufacturing process was Wurster coating: first layering active pharmaceutical ingredient (API) onto sugar pellet cores, then applying a controlled release (CR) coating. For each of these two steps, development of a Process Analytical Technology (PAT) method is discussed, along with a novel application of automated microscopy as the reference method. Ultimately, PAT methods should link to product performance; the two key Critical Quality Attributes (CQAs) for this CR product are assay and release rate, linked to the API and CR coating steps, respectively. In this work, the link between near infra-red (NIR) spectra and those attributes was explored by chemometrics over the course of the coating process in a pilot-scale industrial environment. Correlations were built between the NIR spectra and coating weight (for API amount), CR coating thickness, and dissolution performance. These correlations allow the coating process to be monitored at-line, enabling better control of product performance in line with QbD requirements. Copyright © 2014 Elsevier B.V. All rights reserved.
Complex Investigations of Sapphire Crystals Production
NASA Astrophysics Data System (ADS)
Malyukov, S. P.; Klunnikova, Yu V.
The problem of choosing optimum conditions for processing sapphire substrates was solved with optimization methods combined with analytical simulation, experiment, and expert system technology. The experimental results and software give fairly complete information on the features of the real structure of sapphire crystal substrates and can be used effectively to optimize the technology of substrate preparation for electronic devices.
Improving Logistics Processes in Industry Using Web Technologies
NASA Astrophysics Data System (ADS)
Jánošík, Ján; Tanuška, Pavol; Václavová, Andrea
2016-12-01
The aim of this paper is to propose the concept of a system that takes advantage of web technologies and integrates them into the management of internal stocks, interfaces with external applications, and creates the conditions for introducing Computerized Control of Warehouse Stock (CCWS) in the company. The importance of implementing CCWS lies in eliminating errors and claims caused by the human factor, as well as in allowing the processing of information for analytical purposes and its subsequent use to improve internal processes. Using CCWS in the company would also facilitate better use of the potential of Business Intelligence and Data Mining tools.
Information security of Smart Factories
NASA Astrophysics Data System (ADS)
Iureva, R. A.; Andreev, Y. S.; Iuvshin, A. M.; Timko, A. S.
2018-05-01
Within several years, technologies and systems based on the Internet of Things (IoT) will be widely used in all smart factories. When a huge array of unstructured data is processed, its filtration and adequate interpretation are a priority for enterprises. In this context, the correct representation of information in a user-friendly form acquires special importance; for this purpose the market today offers advanced analytical platforms designed to collect, store and analyze data on technological processes and events in real time. The main idea of the paper is a statement of the information security problem in the IoT and of the integrity of processed information.
Use telecommunications for real-time process control
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zilberman, I.; Bigman, J.; Sela, I.
1996-05-01
Process operators desire real-time, accurate information to monitor and control product streams and to optimize unit operations. The challenge is how to cost-effectively install sophisticated analytical equipment in harsh environments such as process areas and maintain system reliability. Incorporating telecommunications technology with near infrared (NIR) spectroscopy may be the bridge that helps operations achieve their online control goals. Coupling communications fiber optics with NIR analyzers enables the probe and sampling system to remain in the field while crucial analytical equipment is remotely located in a general purpose area without specialized protection provisions. The case histories show how two refineries used NIR spectroscopy online to track octane levels for reformate streams.
NASA Astrophysics Data System (ADS)
Kaur, Jagreet; Singh Mann, Kulwinder, Dr.
2018-01-01
AI in healthcare is needed to bring real, actionable, individualized insights in real time to patients and doctors to support treatment decisions. We need a patient-centred platform for integrating EHR data, patient data, prescriptions, monitoring, and clinical research data. This paper proposes a generic architecture for an AI-based healthcare analytics platform built on open source technologies: Apache Beam, Apache Flink, Apache Spark, Apache NiFi, Kafka, Tachyon, GlusterFS, and NoSQL stores (Elasticsearch, Cassandra). The paper shows the importance of applying AI-based predictive and prescriptive analytics techniques in the health sector. The system will be able to extract useful knowledge that helps decision making and medical monitoring in real time through intelligent process analysis and big data processing.
Information security of power enterprises of North-Arctic region
NASA Astrophysics Data System (ADS)
Sushko, O. P.
2018-05-01
The role of information technologies in providing technological security for energy enterprises is a component of economic security for the northern Arctic region in general. Applying instruments and methods of information protection to the modelling of the business processes of energy enterprises in the northern Arctic region (such as Arkhenergo and Komienergo), the authors analysed and identified the most frequent information security risks. Using the analytic hierarchy process based on weighting factor estimations, the information risks of the energy enterprises' technological processes were ranked. The economic estimation of information security within an energy enterprise considers weighting factor-adjusted variables (risks). Investments in the information security systems of energy enterprises in the northern Arctic region relate to the installation of necessary security elements; current operating expenses on business process protection systems become materialized economic damage.
Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin
2014-03-01
A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed by a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R² and Q². The accuracy and diagnostic capability of the batch model were then validated with the remaining batches. Assisted by high performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess the model performance. The present study has demonstrated that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a particular account of effective compositions and can potentially be used to improve batch quality and process consistency for samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.
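A hedged sketch of the multi-way modeling step described above: the three-way batch data (batch × time × variable) are batch-wise unfolded into a two-dimensional matrix before fitting a PLS model, which is the core of MPLS. The shapes, the maturity variable, and the component count are illustrative assumptions rather than the authors' settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

n_batches, n_times, n_vars = 10, 120, 50                 # assumed dimensions
X3 = np.random.default_rng(1).normal(size=(n_batches, n_times, n_vars))
X = X3.reshape(n_batches, n_times * n_vars)              # batch-wise unfolding
y = np.linspace(0.0, 1.0, n_batches)                     # placeholder quality index

mpls = PLSRegression(n_components=3)
mpls.fit(X, y)
print(f"R2 (fit) = {mpls.score(X, y):.2f}")              # Q2 would come from cross-validation
```

Control trajectories are then obtained from the scores of the normal batches, and a new batch is flagged when its scores leave the trajectory envelope.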
NASA Technical Reports Server (NTRS)
Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James;
2016-01-01
Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery, with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework allows batches of analytics, scanning for new data, to be applied to data as it flows in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources than alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The result products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. The Matsu Wheel allows many shared data services to be performed together, making efficient use of resources for processing hyperspectral satellite image data and other large datasets, e.g., environmental datasets that may be analyzed for many purposes.
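A minimal sketch of the "Wheel" idea described above: each new scene is accessed and preprocessed once, and every registered analytic is then applied to the same preprocessed result. Function and analytic names are hypothetical, not Project Matsu's actual API.

```python
from typing import Callable, Dict, List

def preprocess(raw: bytes) -> dict:
    # placeholder for calibration, georeferencing, band extraction, etc.
    return {"bands": raw}

def anomaly_detector(scene: dict) -> str:
    return "no rare spectral signatures found"

def land_cover_classifier(scene: dict) -> str:
    return "water/flood mask generated"

ANALYTICS: List[Callable[[dict], str]] = [anomaly_detector, land_cover_classifier]

def wheel_pass(new_scenes: List[bytes]) -> Dict[int, List[str]]:
    reports = {}
    for i, raw in enumerate(new_scenes):
        scene = preprocess(raw)                      # accessed and preprocessed once
        reports[i] = [a(scene) for a in ANALYTICS]   # all analytics share that pass
    return reports

print(wheel_pass([b"scene-0", b"scene-1"]))
```

New analytics slot into the `ANALYTICS` list without touching the ingest or preprocessing path, which is what makes the single-pass design economical.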
Meder, Roger; Stahl, Wolfgang; Warburton, Paul; Woolley, Sam; Earnshaw, Scott; Haselhofer, Klaus; van Langenberg, Ken; Ebdon, Nick; Mulder, Roger
2017-01-01
The reactivity of melamine-urea-formaldehyde resins is of key importance in the manufacture of engineered wood products such as medium density fibreboard (MDF) and other wood composite products. Often the MDF manufacturing plant has little available information on resin reactivity other than details of the resin specification at the time of batch manufacture, which often occurs off-site at a third-party resin plant. Often, too, fresh resin on delivery at the MDF plant is mixed with variable volumes of aged resin in storage tanks, thereby rendering any specification of the fresh resin batch obsolete. It is therefore highly desirable to develop a real-time, at-line or on-line process analytical technology to monitor the quality of the resin prior to MDF panel manufacture. Near infrared (NIR) spectroscopy has been calibrated against standard quality methods and against 13C nuclear magnetic resonance (NMR) measures of molecular composition in order to provide an at-line process analytical technology (PAT) to monitor resin quality, particularly the formaldehyde content of the resin. At-line determination of formaldehyde content in the resin was made possible using a six-factor calibration with an R²(cal) value of 0.973, an R²(CV) value of 0.929 and a root-mean-square error of cross-validation of 0.01. This calibration was then used to generate control charts of formaldehyde content at regular four-hourly periods during MDF panel manufacture in a commercial MDF manufacturing plant.
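For reference, the figures quoted above follow the standard chemometric cross-validation definitions, where $\hat{y}_{i,-i}$ denotes the prediction for sample $i$ from a model built without it (a sketch of the conventional form, not this study's exact computation):

$$\mathrm{RMSECV}=\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i-\hat{y}_{i,-i}\right)^{2}},\qquad R^{2}(\mathrm{CV})=1-\frac{\sum_{i=1}^{n}\left(y_i-\hat{y}_{i,-i}\right)^{2}}{\sum_{i=1}^{n}\left(y_i-\bar{y}\right)^{2}}$$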
Instrumentation for optimizing an underground coal-gasification process
NASA Astrophysics Data System (ADS)
Seabaugh, W.; Zielinski, R. E.
1982-06-01
While the United States has a coal resource base of 6.4 trillion tons, only seven percent is presently recoverable by mining. In-situ gasification can recover another twenty-eight percent of this vast resource; however, viable technology must be developed for effective in-situ recovery. The key to this technology is a system that can optimize and control the process in real time. An instrumentation system is described that optimizes the composition of the injection gas, controls the in-situ process, and conditions the product gas for maximum utilization. The key elements of this system are Monsanto PRISM systems, a real-time analytical system, and a real-time data acquisition and control system. The system provides for complete automation of the process but can easily be overridden by manual control. The use of this cost-effective system can provide process optimization and is an effective element in developing a viable in-situ technology.
Analytical quality by design: a tool for regulatory flexibility and robust analytics.
Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy
2015-01-01
Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD)-based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT).
ERIC Educational Resources Information Center
Lucidi, Louis; Mecca, Peter M.
2001-01-01
Introduces a project in which students examined the physics, chemistry, and geology of radon and used available technology to measure radon concentrations in their homes. Uses the inquiry process, analytical skills, communication skills, content knowledge, and production of authentic products for student assessment. (YDS)
Second International Conference on Accelerating Biopharmaceutical Development
Reichert, Janice M; Jacob, Nitya M; Amanullah, Ashraf
2009-01-01
The Second International Conference on Accelerating Biopharmaceutical Development was held in Coronado, California. The meeting was organized by the Society for Biological Engineering (SBE) and the American Institute of Chemical Engineers (AIChE); SBE is a technological community of the AIChE. Bob Adamson (Wyeth) and Chuck Goochee (Centocor) were co-chairs of the event, which had the theme “Delivering cost-effective, robust processes and methods quickly and efficiently.” The first day focused on emerging disruptive technologies and cutting-edge analytical techniques. Day two featured presentations on accelerated cell culture process development, critical quality attributes, specifications and comparability, and high throughput protein formulation development. The final day was dedicated to discussion of technology options and new analysis methods provided by emerging disruptive technologies; functional interaction, integration and synergy in platform development; and rapid and economic purification process development. PMID:20065637
Canis, Laure; Linkov, Igor; Seager, Thomas P
2010-11-15
The unprecedented uncertainty associated with engineered nanomaterials greatly expands the need for research regarding their potential environmental consequences. However, decision-makers such as regulatory agencies, product developers, or other nanotechnology stakeholders may not find the results of such research directly informative of decisions intended to mitigate environmental risks. To help interpret research findings and prioritize new research needs, there is an acute need for structured decision-analytic aids that are operable in a context of extraordinary uncertainty. Whereas existing stochastic decision-analytic techniques explore uncertainty only in decision-maker preference information, this paper extends model uncertainty to technology performance. As an illustrative example, the framework is applied to the case of single-wall carbon nanotubes. Four different synthesis processes (arc, high pressure carbon monoxide, chemical vapor deposition, and laser) are compared based on five salient performance criteria. A probabilistic rank ordering of preferred processes is determined using outranking normalization and a linear-weighted sum for different weighting scenarios including completely unknown weights and four fixed-weight sets representing hypothetical stakeholder views. No single process pathway dominates under all weight scenarios, but it is likely that some inferior process technologies could be identified as low priorities for further research.
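A hedged sketch of the stochastic comparison described above: normalized criterion scores are combined by a linear-weighted sum under randomly sampled weights (the "completely unknown weights" scenario), yielding a probabilistic rank ordering. The performance matrix below is made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
processes = ["arc", "HiPco", "CVD", "laser"]
scores = rng.random((4, 5))                    # 4 synthesis routes x 5 criteria, normalized to [0, 1]

n_trials = 10_000
wins = np.zeros(len(processes))
for _ in range(n_trials):
    w = rng.dirichlet(np.ones(5))              # random weight vector summing to 1
    totals = scores @ w                        # linear-weighted sum per process
    wins[np.argmax(totals)] += 1

for name, p in zip(processes, wins / n_trials):
    print(f"P({name} ranked first) = {p:.2f}")
```

Fixed-weight stakeholder scenarios correspond to replacing the Dirichlet draw with a given weight vector; a process that rarely ranks first under any draw is a natural low priority for further research.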
Watson, Douglas S; Kerchner, Kristi R; Gant, Sean S; Pedersen, Joseph W; Hamburger, James B; Ortigosa, Allison D; Potgieter, Thomas I
2016-01-01
Tangential flow microfiltration (MF) is a cost-effective and robust bioprocess separation technique, but successful full scale implementation is hindered by the empirical, trial-and-error nature of scale-up. We present an integrated approach leveraging at-line process analytical technology (PAT) and mass balance based modeling to de-risk MF scale-up. Chromatography-based PAT was employed to improve the consistency of an MF step that had been a bottleneck in the process used to manufacture a therapeutic protein. A 10-min reverse phase ultra high performance liquid chromatography (RP-UPLC) assay was developed to provide at-line monitoring of protein concentration. The method was successfully validated and method performance was comparable to previously validated methods. The PAT tool revealed areas of divergence from a mass balance-based model, highlighting specific opportunities for process improvement. Adjustment of appropriate process controls led to improved operability and significantly increased yield, providing a successful example of PAT deployment in the downstream purification of a therapeutic protein. The general approach presented here should be broadly applicable to reduce risk during scale-up of filtration processes and should be suitable for feed-forward and feed-back process control. © 2015 American Institute of Chemical Engineers.
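As a rough sketch of the mass balance framing used for the model above (standard filtration bookkeeping, not the authors' actual model), the observed sieving coefficient $S$ and step yield $Y$ for a tangential flow filtration step can be written as

$$S=\frac{C_p}{C_f},\qquad C_0V_0=C_rV_r+C_pV_p,\qquad Y=\frac{C_rV_r}{C_0V_0}$$

where $C_f$, $C_p$, $C_r$ are feed, permeate, and retentate concentrations and $V_0$, $V_p$, $V_r$ the corresponding volumes; at-line concentration measurements allow divergence from this balance to be flagged in real time.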
-Omic and Electronic Health Records Big Data Analytics for Precision Medicine
Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.
2017-01-01
Objective: Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods: In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results: To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHR. Conclusion: Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance: Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes. It has a long-lasting societal impact. PMID:27740470
Shaw, P E; Burn, P L
2017-11-15
The detection of explosives continues to be a pressing global challenge, with many potential technologies being pursued by the scientific research community. Luminescence-based detection of explosive vapours with an organic semiconductor has attracted much interest because of its potential for detectors that have high sensitivity, compact form factor, simple operation and low cost. Despite the abundance of literature on novel sensor material systems, there are relatively few mechanistic studies targeted towards vapour-based sensing. In this Perspective, we review the progress that has been made in understanding the processes that control the real-time luminescence quenching of thin films by analyte vapours. These are the non-radiative quenching process by which the sensor exciton decays, the analyte-sensor intermolecular binding interaction, and the diffusion process for the analyte vapours in the film. We comment on the contributions of each of these processes towards the sensing response and, in particular, the relative roles of analyte diffusion and exciton diffusion. While the latter has historically been judged to be one of, if not the, primary causes of the high sensitivity of many conjugated polymers to nitrated vapours, recent evidence suggests that long exciton diffusion lengths are unnecessary. The implications of these results for the development of sensor materials for real-time detection are discussed.
NASA Astrophysics Data System (ADS)
Poggio, Andrew J.; Mayall, Brian H.
1989-04-01
The Lawrence Livermore National Laboratory (LLNL) is an acknowledged world center for analytical cytology. This leadership was recognized by the Regents of the University of California (UC), who in 1982 established and funded the Program for Analytical Cytology to facilitate the transfer of this technology from scientists at LLNL to their University colleagues, primarily through innovative collaborative research. This issue of Energy and Technology Review describes three of the forty projects that have been funded in this way, chosen to illustrate the potential medical application of the research. Analytical cytology is a relatively new field of biomedical research that is increasingly being applied in clinical medicine. It has been particularly important in unraveling the complexities of the human immune system and in quantifying the pathobiology of malignancy. Defined as the characterization and measurement of cells and cellular constituents for biological and medical purposes, analytical cytology bridges the gap between the quantitative discipline of molecular biology and the more qualitative disciplines of anatomy and pathology. It is itself multidisciplinary in nature. Two major approaches to analytical cytology are flow cytometry and image cytometry. In each of these research techniques, cells are measured one at a time in an automated device. In flow instruments, the cells are dispersed in fluid suspension and pass in single file through a beam of laser light to generate optical signals that are measured. In image cytometry, cells are dispersed on a slide and are imaged through a microscope onto an electronic imaging and analysis system that processes the cell image to extract measurements of interest.
Analysis of THG modes for femtosecond laser pulse
NASA Astrophysics Data System (ADS)
Trofimov, Vyacheslav A.; Sidorov, Pavel S.
2017-05-01
THG is used nowadays in many practical applications, such as substance diagnostics and the imaging of biological objects. With the development of new materials and technologies (for example, photonic crystals), attention to the analysis of the THG process has grown; understanding the features of THG is therefore a problem of current interest. Earlier, we developed a new analytical approach based on using a problem invariant to construct an analytical solution for the THG process. It should be stressed that we did not use the basic non-depleted-pump approximation; nevertheless, the long pulse duration and plane wave approximations were applied. The analytical solution demonstrates, in particular, an optical bistability property (and many other regimes of frequency tripling) for the third harmonic generation process. Obviously, however, this approach does not reflect the influence of medium dispersion on frequency tripling. Therefore, in this paper we analyze the THG efficiency of a femtosecond laser pulse taking into account the effect of second-order dispersion, as well as the effects of self- and cross-modulation of the interacting waves, on the frequency conversion process. The analysis is based on computer simulation of the Schrödinger equations describing the process under consideration.
Service Bundle Recommendation for Person-Centered Care Planning in Cities.
Kotoulas, Spyros; Daly, Elizabeth; Tommasi, Pierpaolo; Kishimoto, Akihiro; Lopez, Vanessa; Stephenson, Martin; Botea, Adi; Sbodio, Marco; Marinescu, Radu; Rooney, Ronan
2016-01-01
Providing appropriate support for the most vulnerable individuals carries enormous societal significance and economic burden. Yet finding the right balance between costs, estimated effectiveness and the experience of the care recipient is a daunting task that requires considering a vast amount of information. We present a system that helps care teams choose the optimal combination of providers for a set of services. We draw on techniques in Open Data processing, semantic processing, faceted exploration, visual analytics, transportation analytics and multi-objective optimization. We present an implementation of the system using data from New York City and illustrate the feasibility of using these technologies to guide care workers in care planning.
AUVA - Augmented Reality Empowers Visual Analytics to explore Medical Curriculum Data.
Nifakos, Sokratis; Vaitsis, Christos; Zary, Nabil
2015-01-01
Medical curriculum data play a key role in the structure and organization of medical programs in universities around the world. The effective processing and use of these data may improve the educational environment of medical students. As a consequence, the new generation of health professionals would have improved skills compared with previous ones. This study introduces a process for enhancing curriculum data through the use of augmented reality technology as a management and presentation tool. The final goal is to enrich the information presented by a visual analytics approach applied to medical curriculum data and to keep the complexity of understanding these data low.
Pizarro, Shelly A; Dinges, Rachel; Adams, Rachel; Sanchez, Ailen; Winter, Charles
2009-10-01
Process analytical technology (PAT) is an initiative from the US FDA combining analytical and statistical tools to improve manufacturing operations and ensure regulatory compliance. This work describes the use of a continuous monitoring system for a protein refolding reaction to provide consistency in product quality and process performance across batches. A small-scale bioreactor (3 L) is used to understand the impact of aeration on refolding recombinant human vascular endothelial growth factor (rhVEGF) in a reducing environment. A reverse-phase HPLC (RP-HPLC) assay is used to assess product quality. The goal in understanding the oxygen needs of the reaction and their impact on quality is to make a product that is efficiently refolded to its native and active form with minimal oxidative degradation from batch to batch. Because this refolding process is heavily dependent on oxygen, the percent dissolved oxygen (DO) profile is explored as a PAT tool to regulate process performance at commercial manufacturing scale. A dynamic gassing-out approach using constant mass transfer (kLa) is used for scale-up of the aeration parameters to manufacturing-scale tanks (2,000 L, 15,000 L). The resulting DO profiles of the refolding reaction show similar trends across scales, and these are analyzed using RP-HPLC. The desired product quality attributes are then achieved through alternating air and nitrogen sparging triggered by changes in the monitored DO profile. This approach mitigates the impact of differences in equipment or feedstock components between runs, and is directly in line with the key goal of PAT to "actively manage process variability using a knowledge-based approach." (c) 2009 Wiley Periodicals, Inc.
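A minimal sketch of the dynamic gassing-out relationship behind the scale-up described above (the textbook form, not the authors' specific correlation): during reaeration the dissolved oxygen balance is

$$\frac{dC}{dt}=k_{L}a\,(C^{*}-C)\quad\Rightarrow\quad\ln\frac{C^{*}-C_{0}}{C^{*}-C(t)}=k_{L}a\,t$$

so $k_{L}a$ is read from the slope of the logarithmic term against time, and the aeration settings at the 2,000 L and 15,000 L scales are chosen to reproduce the $k_{L}a$ measured at 3 L.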
Chemical Technology Division annual technical report, 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Battles, J.E.; Myles, K.M.; Laidler, J.J.
1993-06-01
In this period, CMT conducted research and development in the following areas: (1) electrochemical technology, including advanced batteries and fuel cells; (2) technology for fluidized-bed combustion and coal-fired magnetohydrodynamics; (3) methods for treatment of hazardous waste, mixed hazardous/radioactive waste, and municipal solid waste; (4) the reaction of nuclear waste glass and spent fuel under conditions expected for an unsaturated repository; (5) processes for separating and recovering transuranic elements from nuclear waste streams, treating water contaminated with volatile organics, and concentrating radioactive waste streams; (6) recovery processes for discharged fuel and the uranium blanket in the Integral Fast Reactor (IFR); (7) processes for removal of actinides in spent fuel from commercial water-cooled nuclear reactors and burnup in IFRs; and (8) physical chemistry of selected materials (corium; Fe-U-Zr; tritium in LiAlO2) in environments simulating those of fission and fusion energy systems. The Division also conducts basic research in catalytic chemistry associated with molecular energy resources and novel ceramic precursors; materials chemistry of superconducting oxides, electrified metal/solution interfaces, and molecular sieve structures; and the geochemical processes involved in water-rock interactions occurring in active hydrothermal systems. In addition, the Analytical Chemistry Laboratory in CMT provides a broad range of analytical chemistry support services to the technical programs at Argonne National Laboratory (ANL).
Schneid, Stefan C; Johnson, Robert E; Lewis, Lavinia M; Stärtzel, Peter; Gieseler, Henning
2015-05-01
Process analytical technology (PAT) and quality by design have gained importance in all areas of pharmaceutical development and manufacturing. One important method for monitoring critical product attributes and optimizing the process in laboratory-scale freeze-drying is manometric temperature measurement (MTM). A drawback of this innovative technology is that problems are encountered when processing highly concentrated amorphous materials, particularly protein formulations. In this study, a model solution of bovine serum albumin and sucrose was lyophilized at both conservative and aggressive primary drying conditions. Different temperature sensors were employed to monitor product temperatures. The residual moisture content at the primary drying endpoints indicated by the temperature sensors and by batch PAT methods was quantified from extracted sample vials. The data from the temperature probes were then used to recalculate critical product parameters, and the results were compared with MTM data. The endpoints indicated by the temperature sensors were not suitable for endpoint determination, in contrast to those from the batch methods. The accuracy of the MTM ice vapor pressure (Pice) data was found to be influenced by water reabsorption. Recalculation of Rp and Pice values based on data from temperature sensors and weighed vials was possible. Overall, extensive information about critical product parameters could be obtained using data from complementary PAT tools. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
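For orientation, the recalculation of Rp and Pice mentioned above rests on the standard steady-state sublimation relation of laboratory freeze-drying (a sketch of the textbook form, not this study's exact equations):

$$\frac{dm}{dt}=\frac{A_p\left(P_{ice}-P_c\right)}{R_p+R_s}$$

where $dm/dt$ is the sublimation rate, $A_p$ the product area, $P_{ice}$ the vapor pressure of ice at the sublimation interface, $P_c$ the chamber pressure, and $R_p$ and $R_s$ the dry product layer and stopper resistances; given measured product temperatures (which fix $P_{ice}$) and gravimetric sublimation rates from weighed vials, the resistance can be back-calculated.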
DOE Office of Scientific and Technical Information (OSTI.GOV)
P Yu
Unlike traditional 'wet' analytical methods, which often destroy or alter intrinsic protein structures during sample processing, advanced synchrotron radiation-based Fourier transform infrared microspectroscopy has been developed as a rapid, nondestructive bioanalytical technique. This cutting-edge synchrotron-based bioanalytical technology, which takes advantage of the brightness of synchrotron light (millions of times brighter than the sun), is capable of exploring the molecular chemistry or structure of a biological tissue at ultra-spatial resolutions without destroying its inherent structures. In this article, a novel approach is introduced to show the potential of this advanced synchrotron-based analytical technology, which can be used to study plant-based food or feed protein molecular structure in relation to nutrient utilization and availability. Recent progress is reported on using synchrotron radiation-based Fourier transform infrared microspectroscopy and diffuse reflectance infrared Fourier transform spectroscopy to detect the effects of gene transformation (Application 1), autoclaving (Application 2), and bio-ethanol processing (Application 3) on plant-based food and feed protein structure changes on a molecular basis. The synchrotron-based technology provides a new approach for plant-based protein structure research at ultra-spatial resolutions at cellular and molecular levels.
Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas
2018-04-03
Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times for inducing changes in water content. Analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is inherently a most promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.
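A minimal sketch of the per-pixel quantification step described above: a PLS model built from reference spectra is applied to every pixel of an NIR chemical image to map water content. The image size, band count, calibration set, and water range are all illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

h, w, bands = 64, 64, 200
cube = np.random.default_rng(3).normal(size=(h, w, bands))    # one chemical image (assumed)

X_ref = np.random.default_rng(4).normal(size=(30, bands))     # calibration spectra (assumed)
y_ref = np.random.default_rng(5).uniform(0.5, 3.0, size=30)   # % water, placeholder range

pls = PLSRegression(n_components=4).fit(X_ref, y_ref)
water_map = np.asarray(pls.predict(cube.reshape(-1, bands))).reshape(h, w)
print(water_map.shape)   # (64, 64): predicted water content per pixel
```

Within-vial inhomogeneity then shows up directly as spatial structure in `water_map`, which is the advantage of imaging over single-point NIR.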
NASA Technical Reports Server (NTRS)
Defelice, David M.; Aydelott, John C.
1987-01-01
The resupply of cryogenic propellants is an enabling technology for space-based orbit transfer vehicles. As part of the NASA Lewis ongoing efforts in microgravity fluid management, thermodynamic analysis and subscale modeling techniques were developed to support an on-orbit test bed for cryogenic fluid management technologies. Analytical results have shown that subscale experimental modeling of liquid resupply can be used to validate analytical models when the appropriate target temperature is selected to relate the model to its prototype system. Further analyses were used to develop a thermodynamic model of the tank chilldown process, which is required prior to the no-vent fill operation. These efforts were incorporated into two FORTRAN programs, which were used to present preliminary analytical results.
Wang, Xiao; Esquerre, Carlos; Downey, Gerard; Henihan, Lisa; O'Callaghan, Donal; O'Donnell, Colm
2018-06-01
In this study, visible and near-infrared (Vis-NIR), mid-infrared (MIR) and Raman process analytical technologies were investigated for assessment of infant formula quality and compositional parameters, namely preheat temperature, storage temperature, storage time, fluorescence of advanced Maillard products and soluble tryptophan (FAST) index, soluble protein, fat and surface free fat (SFF) content. PLS-DA models developed using spectral data with appropriate pre-treatment and significant variables selected using Martens' uncertainty test had good accuracy for the discrimination of preheat temperature (92.3-100%) and storage temperature (91.7-100%). The best PLS regression models developed yielded ratios of prediction error to deviation (RPD) of 3.6-6.1, 2.1-2.7, 1.7-2.9, 1.6-2.6 and 2.5-3.0 for storage time, FAST index, soluble protein, fat and SFF content prediction, respectively. Vis-NIR, MIR and Raman were demonstrated to be potential PAT tools for process control and quality assurance applications in infant formula and dairy ingredient manufacture. Copyright © 2018 Elsevier B.V. All rights reserved.
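The RPD values reported above follow the usual chemometric convention:

$$\mathrm{RPD}=\frac{\mathrm{SD}_{ref}}{\mathrm{SEP}}$$

where $\mathrm{SD}_{ref}$ is the standard deviation of the reference values and $\mathrm{SEP}$ the standard error of prediction; as a common rule of thumb, RPD values around 2 permit rough screening while values above 3 are generally considered adequate for quantitative quality control.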
Improving early cycle economic evaluation of diagnostic technologies.
Steuten, Lotte M G; Ramsey, Scott D
2014-08-01
The rapidly increasing range and expense of new diagnostics, compels consideration of a different, more proactive approach to health economic evaluation of diagnostic technologies. Early cycle economic evaluation is a decision analytic approach to evaluate technologies in development so as to increase the return on investment as well as patient and societal impact. This paper describes examples of 'early cycle economic evaluations' as applied to diagnostic technologies and highlights challenges in its real-time application. It shows that especially in the field of diagnostics, with rapid technological developments and a changing regulatory climate, early cycle economic evaluation can have a guiding role to improve the efficiency of the diagnostics innovation process. In the next five years the attention will move beyond the methodological and analytic challenges of early cycle economic evaluation towards the challenge of effectively applying it to improve diagnostic research and development and patient value. Future work in this area should therefore be 'strong on principles and soft on metrics', that is, the metrics that resonate most clearly with the various decision makers in this field.
75 FR 11896 - National Cancer Institute; Notice of Closed Meetings
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-12
... DEPARTMENT OF HEALTH AND HUMAN SERVICES National Institutes of Health National Cancer Institute... Cancer Institute Special Emphasis Panel; Process Analytic Technologies, Date: April 6, 2010, Time: 1 p.m... of Extramural Activities, National Cancer Institute, 6116 Executive Boulevard, Room 7142, Bethesda...
Some problems in mechanics of growing solids with applications to AM technologies
NASA Astrophysics Data System (ADS)
Manzhirov, A. V.
2018-04-01
Additive Manufacturing (AM) technologies are an exciting area of the modern industrial revolution, with applications in engineering, medicine, electronics, the aerospace industry, etc. AM enables cost-effective production of customized geometry and parts by direct fabrication from 3D data and mathematical models. Despite much progress in AM technologies, problems of mechanical analysis for AM-fabricated parts remain to be solved. This paper deals with three main mechanical problems: the onset of residual stresses, which occur in the AM process and can lead to failure of the parts; the distortion of the final shape of AM-fabricated parts; and the development of technological solutions aimed at improving existing AM technologies and creating new ones. The proposed approach deals with the construction of adequate analytical models and effective methods for simulating AM processes for fabricated solid parts.
Real-Time Process Analytics in Emergency Healthcare.
Koufi, Vassiliki; Malamateniou, Flora; Prentza, Adrianna; Vassilacopoulos, George
2017-01-01
Emergency medical systems (EMS) are considered to be amongst the most crucial systems, as they involve a variety of activities performed from the time of a call to an ambulance service until the patient's discharge from the emergency department of a hospital. These activities are closely interrelated, so collaboration and coordination become vital issues for patients and for emergency healthcare service performance. The utilization of standard workflow technology in the context of a Service Oriented Architecture can provide an appropriate technological infrastructure for defining and automating EMS processes that span organizational boundaries, so as to create and empower collaboration and coordination among the participating organizations. In such systems, leading-edge analytics tools can prove important, as they facilitate real-time extraction and visualization of useful insights from the mountains of data generated during emergency case management. This paper presents a framework which provides healthcare professionals with just-in-time insight within and across emergency healthcare processes by performing real-time analysis of process-related data, in order to better support decision making and identify potential critical risks that may affect the provision of emergency care to patients.
NASA Astrophysics Data System (ADS)
Dolotovskii, I. V.; Dolotovskaya, N. V.; Larin, E. A.
2018-05-01
The article presents the architecture and content of a specialized analytical system for monitoring operational conditions, planning the consumption and generation of energy resources, long-term planning of production activities, and developing a strategy for the development of the energy complex of gas processing enterprises. A compositional model of structured data on the equipment of the main systems of the power complex is proposed. The correctness of the software modules and the database of the analytical system is confirmed by comparing simulation results with measurements taken on the equipment of the electric power system at an operating gas processing plant. High accuracy in planning the consumption of fuel and energy resources has been achieved (the error does not exceed 1%). The information and program modules of the analytical system allow a strategy to be developed for improving the energy complex in the face of changing technological topology and partial uncertainty of economic factors.
The I/O transform of a chemical sensor
Katta, Nalin; Meier, Douglas C.; Benkstein, Kurt D.; Semancik, Steve; Raman, Baranidharan
2016-01-01
A number of sensing technologies, using a variety of transduction principles, have been proposed for non-invasive chemical sensing. A fundamental problem common to all these sensing technologies is determining what features of the transducer's signal constitute a chemical fingerprint that allows for precise analyte recognition. Of particular importance is the need to extract features that are robust with respect to the sensor's age or stimulus intensity. Here, using pulsed stimulus delivery, we show that a sensor's operation can be modeled as a linear input-output (I/O) transform. The I/O transform is unique for each analyte and can be used to precisely predict a temperature-programmed chemiresistor's response to the analyte given the recent stimulus history (i.e. state of an analyte delivery valve being open or closed). We show that the analyte specific I/O transforms are to a certain degree stimulus intensity invariant and can remain consistent even when the sensor has undergone considerable aging. Significantly, the I/O transforms for a given analyte are highly conserved across sensors of equal manufacture, thereby allowing training data obtained from one sensor to be used for recognition of the same set of chemical species with another sensor. Hence, this proposed approach facilitates decoupling of the signal processing algorithms from the chemical transducer, a key advance necessary for achieving long-term, non-invasive chemical sensing. PMID:27932855
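A hedged sketch of the linear input-output idea described above: the sensor response is modeled as a convolution of the recent stimulus history (valve open/closed) with an analyte-specific kernel, which can be recovered by least squares. The signals, kernel shape, and noise level are illustrative assumptions, not the authors' data.

```python
import numpy as np

rng = np.random.default_rng(7)
T, k = 500, 25
u = (rng.random(T) > 0.7).astype(float)        # valve open/closed pulse train (stimulus)
h_true = np.exp(-np.arange(k) / 5.0)           # hypothetical analyte-specific kernel
y = np.convolve(u, h_true)[:T] + 0.01 * rng.normal(size=T)   # observed sensor response

# Regressor of recent stimulus history: column i holds u delayed by i samples
U = np.column_stack([np.r_[np.zeros(i), u[: T - i]] for i in range(k)])
h_est, *_ = np.linalg.lstsq(U, y, rcond=None)  # estimated I/O transform
print(np.allclose(h_est, h_true, atol=0.1))    # kernel recovered from pulsed delivery
```

Because the kernel is analyte-specific but (to a degree) intensity- and age-invariant, a kernel estimated on one sensor can serve as the recognition template for another sensor of equal manufacture.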
Wu, Huiquan; White, Maury; Khan, Mansoor A
2011-02-28
The aim of this work was to develop an integrated process analytical technology (PAT) approach to characterizing a dynamic pharmaceutical co-precipitation process and developing its design space. A dynamic co-precipitation process, in which water is gradually introduced to the ternary system of naproxen-Eudragit L100-alcohol, was monitored in real time in situ via Lasentec FBRM and PVM. A 3D map of count-time-chord length revealed three distinguishable process stages: incubation, transition, and steady state. The effects of high-risk process variables (slurry temperature, stirring rate, and water addition rate) on both the derived co-precipitation process rates and the final chord length distribution were evaluated systematically using a 3³ full factorial design. Critical process variables were identified via ANOVA for both the transition and steady states. General linear models (GLM) were then used for parameter estimation for each critical variable. Clear trends in the effects of each critical variable during transition and steady state were found by GLM and were interpreted using fundamental process principles and Nyvlt's transfer model. Neural network models were able to link process variables with response variables at transition and steady state with R² of 0.88-0.98. PVM images evidenced nucleation and crystal growth. Contour plots illustrated the design space via the ranges of the critical process variables. The work demonstrated the utility of an integrated PAT approach for QbD development. Published by Elsevier B.V.
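A hedged sketch of the design and modeling steps described above: a 3³ full factorial over the three high-risk variables in coded levels, followed by an ordinary least-squares fit of a main-effects general linear model. The simulated response and coefficients are made up for illustration.

```python
import itertools
import numpy as np

levels = [-1, 0, 1]                        # coded low/centre/high levels
design = np.array(list(itertools.product(levels, repeat=3)))   # 27 runs
temp, stir, water = design.T               # slurry temp, stirring rate, water addition rate

rng = np.random.default_rng(11)
response = 2.0 + 0.8 * temp - 0.5 * stir + 0.1 * water + rng.normal(0, 0.1, 27)

X = np.column_stack([np.ones(27), temp, stir, water])          # main-effects GLM
beta, *_ = np.linalg.lstsq(X, response, rcond=None)
print("intercept and main effects:", np.round(beta, 2))
```

ANOVA on such a design separates the significant main effects from noise, and the fitted coefficient ranges are what the contour plots of a design space are drawn from.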
Lyophilization process design space.
Patel, Sajal Manubhai; Pikal, Michael J
2013-11-01
The application of key elements of quality by design (QbD), such as risk assessment, process analytical technology, and design space, is discussed widely as it relates to freeze-drying process design and development. However, this commentary focuses on constructing the Design and Control Space, particularly for the primary drying step of the freeze-drying process. Also, practical applications and considerations of claiming a process Design Space under the QbD paradigm have been discussed. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
Patel, Bhumit A; Pinto, Nuno D S; Gospodarek, Adrian; Kilgore, Bruce; Goswami, Kudrat; Napoli, William N; Desai, Jayesh; Heo, Jun H; Panzera, Dominick; Pollard, David; Richardson, Daisy; Brower, Mark; Richardson, Douglas D
2017-11-07
Combining process analytical technology (PAT) with continuous production provides a powerful tool to observe and control monoclonal antibody (mAb) fermentation and purification processes. This work demonstrates on-line liquid chromatography (on-line LC) as a PAT tool for monitoring a continuous biologics process and forced degradation studies. Specifically, this work focused on ion exchange chromatography (IEX), which is a critical separation technique to detect charge variants. Product-related impurities, including charge variants, that impact function are classified as critical quality attributes (CQAs). First, we confirmed that no significant differences were observed in the charge heterogeneity profile of a mAb through both at-line and on-line sampling and that the on-line method has the ability to rapidly detect changes in protein quality over time. The robustness and versatility of the PAT methods were tested by sampling from two purification locations in a continuous mAb process. The PAT IEX methods used with on-line LC were a weak cation exchange (WCX) separation and a newly developed shorter strong cation exchange (SCX) assay. Both methods provided similar results, with the distribution of percent acidic, main, and basic species remaining unchanged over a 2 week period. Second, a forced degradation study showed an increase in acidic species and a decrease in basic species when sampled on-line over 7 days. These applications further strengthen the use of on-line LC to monitor CQAs of a mAb continuously with various PAT IEX analytical methods. Implementation of on-line IEX will enable faster decision making during process development and could potentially be applied to process control in biomanufacturing.
Cespi, Marco; Perinelli, Diego R; Casettari, Luca; Bonacucina, Giulia; Caporicci, Giuseppe; Rendina, Filippo; Palmieri, Giovanni F
2014-12-30
The use of process analytical technologies (PAT) to ensure final product quality is by now a well-established practice in the pharmaceutical industry. To date, most of the efforts in this field have focused on the development of analytical methods using spectroscopic techniques (i.e., NIR, Raman, etc.). This work evaluated the possibility of using parameters derived from the processing of in-line raw compaction data (the forces on and displacements of the punches) as a PAT tool for controlling the tableting process. To reach this goal, two commercially available formulations were used, changing their quantitative composition and compressing them on a fully instrumented rotary pressing machine. The Heckel yield pressure and the compaction energies, together with tablet hardness and compaction pressure, were selected and evaluated as discriminating parameters in all the prepared formulations. The apparent yield pressure, as shown in the obtained results, has the necessary sensitivity to be effectively included in a PAT strategy to monitor the tableting process. Additional investigations were performed to understand the critical aspects and the mechanisms behind this performance parameter and the associated implications. Specifically, it was discovered that the effectiveness of the apparent yield pressure depends on the nominal drug content, the drug densification mechanism and the error in pycnometric density. In this study, the potential of using parameters derived from the raw compaction data has been demonstrated to be an attractive alternative and complementary method to the well-established spectroscopic techniques for monitoring and controlling the tableting process. The compaction data monitoring method is also easy to set up and very cost-effective. Copyright © 2014 Elsevier B.V. All rights reserved.
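For context, the apparent yield pressure used above comes from the Heckel analysis of in-die compaction data (standard form):

$$\ln\frac{1}{1-D}=kP+A,\qquad P_y=\frac{1}{k}$$

where $D$ is the relative density of the compact at compaction pressure $P$, $k$ the slope of the linear region of the plot, $A$ its intercept, and $P_y$ the (apparent) yield pressure; a material-dependent $P_y$ is what gives the parameter its discriminating power between formulations, and it also explains the sensitivity to errors in pycnometric density, since $D$ is computed from it.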
Druzinec, Damir; Weiss, Katja; Elseberg, Christiane; Salzig, Denise; Kraume, Matthias; Pörtner, Ralf; Czermak, Peter
2014-01-01
Modern bioprocesses demand a careful definition of the critical process parameters (CPPs) already during the early stages of process development in order to ensure high-quality products and satisfactory yields. In this context, online monitoring tools can be applied to recognize unfavorable changes in CPPs during production processes and to allow early interventions to prevent losses of production batches due to quality issues. Process analytical technologies such as dielectric spectroscopy or focused beam reflectance measurement (FBRM) are possible online monitoring tools which can be applied to monitor cell growth as well as morphological changes. Since dielectric spectroscopy only captures cells with intact cell membranes, even information about dead cells with ruptured or leaking cell membranes can be derived. The following chapter describes the application of dielectric spectroscopy to various virus-infected and non-infected cell lines, covering adherent as well as suspension cultures in common stirred tank reactors. The adherent mammalian cell lines Vero (African green monkey kidney cells) and hMSC-TERT (telomerase-immortalized human mesenchymal stem cells) are cultured on microcarriers, which provide the required growth surface and allow the cultivation of these cells even in dynamic culture systems. In turn, the insect-derived cell lines S2 and Sf21 are used as examples of cells typically cultured in suspension. Moreover, the FBRM technology, as a further monitoring tool for cell culture applications, is included in this chapter using the example of Drosophila S2 insect cells.
NASA's Climate Data Services Initiative
NASA Astrophysics Data System (ADS)
McInerney, M.; Duffy, D.; Schnase, J. L.; Webster, W. P.
2013-12-01
Our understanding of the Earth's processes is based on a combination of observational data records and mathematical models. The size of NASA's space-based observational data sets is growing dramatically as new missions come online. However, a potentially bigger data challenge is posed by the work of climate scientists, whose models are regularly producing data sets of hundreds of terabytes or more. It is important to understand that the 'Big Data' challenge of climate science cannot be solved with a single technological approach or an ad hoc assemblage of technologies. It will require a multi-faceted, well-integrated suite of capabilities that includes cloud computing, large-scale compute-storage systems, high-performance analytics, scalable data management, and advanced deployment mechanisms, in addition to the existing, well-established array of mature information technologies. It will also require a coherent organizational effort that is able to focus on the specific and sometimes unique requirements of climate science. Given that it is the knowledge gained from data that is of ultimate benefit to society, data publication and data analytics will play a particularly important role. In an effort to accelerate scientific discovery and innovation through broader use of climate data, NASA Goddard Space Flight Center's Office of Computational and Information Sciences and Technology has embarked on a determined effort to build a comprehensive, integrated data publication and analysis capability for climate science. The Climate Data Services (CDS) Initiative integrates people, expertise, and technology into a highly focused, next-generation, one-stop climate science information service. The CDS Initiative is providing the organizational framework, processes, and protocols needed to deploy existing information technologies quickly, using a combination of enterprise-level services and an expanding array of cloud services. Crucial to its effectiveness, the CDS Initiative is developing the technical expertise to move new information technologies from R&D into operational use. This combination enables full, end-to-end support for climate data publishing and data analytics, and affords the flexibility required to meet future and unanticipated needs. Current science efforts supported by the CDS Initiative include IPCC, OBS4MIP, ANA4MIPS, MERRA II, the National Climate Assessment, the Ocean Data Assimilation project, the NASA Earth Exchange (NEX), and the RECOVER Burned Area Emergency Response decision support system. Service offerings include an integrated suite of classic technologies (FTP, LAS, THREDDS, ESGF, GRaD-DODS, OPeNDAP, WMS, ArcGIS Server), emerging technologies (iRODS, UVCDAT), and advanced technologies (MERRA Analytic Services, MapReduce, Ontology Services, and the CDS API). This poster describes the CDS Initiative, provides details about the Initiative's advanced offerings, and lays out the CDS Initiative's deployment roadmap.
Luka, George; Ahmadi, Ali; Najjaran, Homayoun; Alocilja, Evangelyn; DeRosa, Maria; Wolthers, Kirsten; Malki, Ahmed; Aziz, Hassan; Althani, Asmaa; Hoorfar, Mina
2015-01-01
A biosensor can be defined as a compact analytical device or unit incorporating a biological or biologically derived sensitive recognition element immobilized on a physicochemical transducer to measure one or more analytes. Microfluidic systems, on the other hand, provide high-throughput processing, enhance transport for controlling the flow conditions, increase the mixing rate of different reagents, reduce sample and reagent volumes (down to nanoliters), increase the sensitivity of detection, and allow the same platform to be used for both sample preparation and detection. In view of these advantages, the integration of microfluidic and biosensor technologies provides the ability to merge chemical and biological components into a single platform and offers new opportunities for future biosensing applications, including portability, disposability, real-time detection, unprecedented accuracies, and simultaneous analysis of different analytes in a single device. This review aims to present advances and achievements in the field of microfluidic-based biosensing. The review also presents examples extracted from the literature to demonstrate the advantages of merging microfluidic and biosensing technologies and to illustrate the versatility that such integration promises for future biosensing in emerging areas of biological engineering, biomedical studies, point-of-care diagnostics, environmental monitoring, and precision agriculture. PMID:26633409
Managing knowledge business intelligence: A cognitive analytic approach
NASA Astrophysics Data System (ADS)
Surbakti, Herison; Ta'a, Azman
2017-10-01
The purpose of this paper is to identify and analyze the integration of Knowledge Management (KM) and Business Intelligence (BI) in order to achieve a competitive edge in the context of intellectual capital. The methodology includes a review of the literature and an analysis of interview data from managers in the corporate sector, as well as models established by different authors. BI technologies are strongly associated with KM processes for attaining competitive advantage. KM is strongly influenced by human and social factors, which it can turn into the most valuable assets when an efficient system is run under BI tactics and technologies. The term predictive analytics, however, is rooted in the field of BI. Extracting tacit knowledge is a big challenge, as it is a new source for BI to use in analysis. Advanced analytic methods that address the diversity of the data corpus - structured and unstructured - require a cognitive approach to provide estimative results and to yield actionable descriptive, predictive and prescriptive results. This is a big challenge nowadays, and this paper aims to elaborate on it in detail in this initial work.
Department of Energy Technology Readiness Assessments - Process Guide and Training Plan
2008-09-12
Hanford Waste Treatment and Immobilization Plant (WTP) Analytical Laboratory, Low Activity Waste (LAW) Facility, and Balance of Facilities (3 TRAs)... WTP High-Level Waste (HLW) Facility; WTP Pre-Treatment (PT) Facility; Hanford River Protection Project Low Activity Waste Treatment Alternatives
Software Development Management: Empirical and Analytical Perspectives
ERIC Educational Resources Information Center
Kang, Keumseok
2011-01-01
Managing software development is a very complex activity because it must deal with people, organizations, technologies, and business processes. My dissertation consists of three studies that examine software development management from various perspectives. The first study empirically investigates the impacts of prior experience with similar…
Bulbul, Gonca; Chaves, Gepoliano; Olivier, Joseph; Ozel, Rifat Emrah; Pourmand, Nader
2018-06-06
Examining the behavior of a single cell within its natural environment is valuable for understanding both the biological processes that control the function of cells and how injury or disease leads to pathological changes in their function. Single-cell analysis can reveal information regarding the causes of genetic changes, and it can contribute to studies on the molecular basis of cell transformation and proliferation. By contrast, whole-tissue biopsies can only yield information on a statistical average of several processes occurring in a population of different cells. Electrowetting within a nanopipette provides a nanobiopsy platform for the extraction of cellular material from single living cells. Additionally, functionalized nanopipette sensing probes can differentiate analytes based on their size, shape, or charge density, making the technology uniquely suited to sensing changes in single-cell dynamics. In this review, we highlight the potential of nanopipette technology as a non-destructive analytical tool to monitor single living cells, with particular attention to integration into applications in molecular biology.
NASA Astrophysics Data System (ADS)
Al-Qudaimi, Abdullah; Kumar, Amit
2018-05-01
Recently, Abdullah and Najib (International Journal of Sustainable Energy 35(4): 360-377, 2016) proposed an intuitionistic fuzzy analytic hierarchy process to deal with uncertainty in decision-making and applied it to establish preference in the sustainable energy planning decision-making of Malaysia. This work may encourage researchers in other countries to apply the method when choosing energy technologies for their countries. However, after a deep study of the published paper, it was noticed that the expression used by Abdullah and Najib in Step 6 of their proposed method, for evaluating the intuitionistic fuzzy entropy of the aggregate of each row of an intuitionistic fuzzy matrix, is not valid. Therefore, it is not appropriate to use the method proposed by Abdullah and Najib as is for solving real-life problems. The aim of this paper is to suggest the modifications required to resolve the flaws of the Abdullah and Najib method.
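For context, one widely cited intuitionistic fuzzy entropy measure (the ratio form due to Szmidt and Kacprzyk) is shown below; it is given only to illustrate the kind of Step-6 expression under discussion, not as the disputed formula itself, which the abstract does not reproduce:

\[
E(A) = \frac{1}{n}\sum_{i=1}^{n}\frac{\min\{\mu_A(x_i),\,\nu_A(x_i)\} + \pi_A(x_i)}{\max\{\mu_A(x_i),\,\nu_A(x_i)\} + \pi_A(x_i)},
\]

where \(\mu_A\), \(\nu_A\), and \(\pi_A = 1 - \mu_A - \nu_A\) denote the membership, non-membership, and hesitancy degrees of element \(x_i\).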
NASA Astrophysics Data System (ADS)
Abdullah, Lazim; Najib, Liana
2016-04-01
Energy consumption in developing countries is increasing sharply owing to higher economic growth driven by industrialisation, along with population growth and urbanisation. The increasing demand for energy leads to a global energy crisis. Selecting the best energy technology and conservation strategy requires both quantitative and qualitative evaluation criteria. The fuzzy set-based approach is one of the well-known theories for handling fuzziness, uncertainty in decision-making, and vagueness of information. This paper proposes a new method of intuitionistic fuzzy analytic hierarchy process (IF-AHP) to deal with the uncertainty in decision-making. The new IF-AHP is applied to establish a preference in the sustainable energy planning decision-making problem. Three decision-makers attached to Malaysian government agencies were interviewed to provide linguistic judgements prior to analysis with the new IF-AHP. Nuclear energy was identified as the best alternative in energy planning, receiving the highest weight among the seven alternatives.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poggio, A.J.; Mayall, B.H.
1989-04-01
The Lawrence Livermore National Laboratory (LLNL) is an acknowledged world center for analytical cytology. This leadership was recognized by the Regents of the University of California (UC), who in 1982 established and funded the Program for Analytical Cytology to facilitate the transfer of this technology from scientists at LLNL to their University colleagues, primarily through innovative collaborative research. This issue of Energy and Technology Review describes three of the forty projects that have been funded in this way, chosen to illustrate the potential medical application of the research. Analytical cytology is a relatively new field of biomedical research that is increasingly being applied in clinical medicine. It has been particularly important in unraveling the complexities of the human immune system and in quantifying the pathobiology of malignancy. Defined as the characterization and measurement of cells and cellular constituents for biological and medical purposes, analytical cytology bridges the gap between the quantitative discipline of molecular biology and the more qualitative disciplines of anatomy and pathology. It is itself multidisciplinary in nature. Two major approaches to analytical cytology are flow cytometry and image cytometry. In each of these research techniques, cells are measured one at a time in an automated device. In flow instruments, the cells are dispersed in fluid suspension and pass in single file through a beam of laser light to generate optical signals that are measured. In image cytometry, cells are dispersed on a slide and are imaged through a microscope onto an electronic imaging and analysis system that processes the cell image to extract measurements of interest.
Dangerous Waste Characteristics of Waste from Hanford Tank 241-S-109
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tingey, Joel M.; Bryan, Garry H.; Deschane, Jaquetta R.
2004-11-05
Existing analytical data from samples taken from Hanford Tank 241-S-109, along with process knowledge of the wastes transferred to this tank, are reviewed to determine whether dangerous waste characteristics currently assigned to all waste in Hanford underground storage tanks are applicable to this tank waste. Supplemental technologies are examined to accelerate the Hanford tank waste cleanup mission and to accomplish the waste treatment in a safer and more efficient manner. The goals of supplemental technologies are to reduce costs, conserve double-shell tank space, and meet the scheduled tank waste processing completion date of 2028.
Warth, Arne; Muley, Thomas; Meister, Michael; Weichert, Wilko
2015-01-01
Preanalytic sampling techniques and the preparation of tissue specimens strongly influence analytical results in lung tissue diagnostics, both on the morphological and on the molecular level. However, in contrast to analytics, where tremendous achievements in the last decade have led to a whole new portfolio of test methods, developments in preanalytics have been minimal. This is particularly unfortunate in lung cancer, where usually only small amounts of tissue are at hand and optimization of all processing steps is mandatory in order to increase the diagnostic yield. In the following, we provide a comprehensive overview of some aspects of preanalytics in lung cancer, from the sampling method through tissue processing to its impact on analytical test results. We specifically discuss the role of preanalytics in novel technologies like next-generation sequencing and in state-of-the-art cytology preparations. In addition, we point out specific problems in preanalytics which hamper further developments in the field of lung tissue diagnostics.
Podevin, Michael; Fotidis, Ioannis A; Angelidaki, Irini
2018-08-01
Microalgae are well known for their ability to accumulate lipids intracellularly, which can be used for biofuels and to mitigate CO2 emissions. However, due to economic challenges, microalgae bioprocesses have maneuvered towards the simultaneous production of food, feed, fuel, and various high-value chemicals in a biorefinery concept. On-line and in-line monitoring of macromolecules such as lipids, proteins, carbohydrates, and high-value pigments will be more critical to maintain product quality and consistency for downstream processing in a biorefinery and to maintain and valorize these markets. The main contribution of this review is to present current and prospective advances of on-line and in-line process analytical technology (PAT) with high selectivity - the capability of monitoring several analytes simultaneously - in the interest of improving product quality, productivity, and process automation of a microalgal biorefinery. The high-selectivity PAT under consideration are mid-infrared (MIR), near-infrared (NIR), and Raman vibrational spectroscopies. The current review contains a critical assessment of these technologies in the context of recent advances in software and hardware in order to move microalgae production towards process automation through multivariate process control (MVPC) and software sensors trained on "big data". The paper also includes a comprehensive overview of off-line implementations of vibrational spectroscopy in microalgal research as it pertains to spectral interpretation and process automation, to aid and motivate development.
Decision Support Model for Selection Technologies in Processing of Palm Oil Industrial Liquid Waste
NASA Astrophysics Data System (ADS)
Ishak, Aulia; Ali, Amir Yazid bin
2017-12-01
The palm oil industry continues to grow from year to year, processing raw material into crude palm oil (CPO) and palm kernel oil (PKO). Together, these two products amount to 30% of the raw material, which means that 70% becomes palm oil waste. The amount of palm oil waste will increase in line with the development of the palm oil industry, and if it is not handled properly and effectively it will contribute significantly to environmental damage. Industrial activities, from raw materials through to finished products, can disrupt the lives of people around the factory. Many alternative processing technologies are available, but a frequent problem is selecting and implementing the most appropriate one. The purpose of this research is to develop a database of waste processing technologies, identify qualitative and quantitative criteria for selecting a technology, and develop a Decision Support System (DSS) that can help make such decisions. To achieve these objectives, a questionnaire was developed to identify waste processing technologies and to populate the technology database. Data analysis in the system is performed using the Analytic Hierarchy Process (AHP), and the model is built using MySQL software, so that the system can serve as a tool for evaluating and selecting palm oil mill waste processing technologies.
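For illustration, a minimal sketch of the core AHP computation such a DSS would perform on each pairwise comparison matrix; the matrix values below are hypothetical, not taken from the study:

```python
import numpy as np

# Pairwise comparison matrix on Saaty's 1-9 scale (hypothetical, 3 criteria)
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalized priority weights

n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)           # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]    # Saaty's random index
cr = ci / ri                           # consistency ratio; < 0.1 is acceptable
print(w, cr)
```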
Using PAT to accelerate the transition to continuous API manufacturing.
Gouveia, Francisca F; Rahbek, Jesper P; Mortensen, Asmus R; Pedersen, Mette T; Felizardo, Pedro M; Bro, Rasmus; Mealy, Michael J
2017-01-01
Significant improvements can be realized by converting conventional batch processes into continuous ones. The main drivers include reduction of cost and waste, increased safety, and simpler scale-up and tech transfer activities. Re-designing the process layout offers the opportunity to incorporate a set of process analytical technologies (PAT) embraced in the Quality-by-Design (QbD) framework. These tools are used for process state estimation, providing enhanced understanding of the underlying variability in the process impacting quality and yield. This work describes a road map for identifying the best technology to speed up the development of continuous processes while providing the basis for developing analytical methods for monitoring and controlling the continuous full-scale reaction. The suitability of in-line Raman, FT-infrared (FT-IR), and near-infrared (NIR) spectroscopy for real-time process monitoring was investigated in the production of 1-bromo-2-iodobenzene. The synthesis consists of three consecutive reaction steps, including the formation of an unstable diazonium salt intermediate, which is critical to secure high yield and avoid the formation of by-products. All spectroscopic methods were able to capture critical information related to the accumulation of the intermediate with very similar accuracy. NIR spectroscopy proved satisfactory in terms of performance, ease of installation, full-scale transferability, and stability under very adverse process conditions. As such, in-line NIR was selected to monitor the continuous full-scale production. The quantitative method was developed against theoretical concentration values of the intermediate, since representative sampling for off-line reference analysis cannot be achieved. The rapid and reliable analytical system allowed speeding up the design of the continuous process and a better understanding of the manufacturing requirements needed to ensure optimal yield and avoid unreacted raw materials and by-products in the continuous reactor effluent. Graphical Abstract Using PAT to accelerate the transition to continuous API manufacturing.
Carbon dioxide gas purification and analytical measurement for leading edge 193nm lithography
NASA Astrophysics Data System (ADS)
Riddle Vogt, Sarah; Landoni, Cristian; Applegarth, Chuck; Browning, Matt; Succi, Marco; Pirola, Simona; Macchi, Giorgio
2015-03-01
The use of purified carbon dioxide (CO2) has become a reality for leading-edge 193 nm immersion lithography scanners. Traditionally, both dry and immersion 193 nm lithographic processes have constantly purged the optics stack with ultrahigh purity compressed dry air (UHPCDA). CO2 has been utilized for a similar purpose as UHPCDA. Airborne molecular contamination (AMC) purification technologies and analytical measurement methods have been extensively developed to support the lithography tool manufacturers' purity requirements. This paper covers the analytical tests and characterizations carried out to assess impurity removal from 3.0 N CO2 (beverage grade) for its final utilization in 193 nm and EUV scanners.
Transforming Undergraduate Education Through the use of Analytical Reasoning (TUETAR)
NASA Astrophysics Data System (ADS)
Bishop, M. P.; Houser, C.; Lemmons, K.
2015-12-01
Traditional learning limits the potential for self-discovery, and the use of data and knowledge to understand Earth system relationships, processes, feedback mechanisms and system coupling. It is extremely difficult for undergraduate students to analyze, synthesize, and integrate quantitative information related to complex systems, as many concepts may not be mathematically tractable or yet to be formalized. Conceptual models have long served as a means for Earth scientists to organize their understanding of Earth's dynamics, and have served as a basis for human analytical reasoning and landscape interpretation. Consequently, we evaluated the use of conceptual modeling, knowledge representation and analytical reasoning to provide undergraduate students with an opportunity to develop and test geocomputational conceptual models based upon their understanding of Earth science concepts. This study describes the use of geospatial technologies and fuzzy cognitive maps to predict desertification across the South-Texas Sandsheet in an upper-level geomorphology course. Students developed conceptual models based on their understanding of aeolian processes from lectures, and then compared and evaluated their modeling results against an expert conceptual model and spatial predictions, and the observed distribution of dune activity in 2010. Students perceived that the analytical reasoning approach was significantly better for understanding desertification compared to traditional lecture, and promoted reflective learning, working with data, teamwork, student interaction, innovation, and creative thinking. Student evaluations support the notion that the adoption of knowledge representation and analytical reasoning in the classroom has the potential to transform undergraduate education by enabling students to formalize and test their conceptual understanding of Earth science. A model for developing and utilizing this geospatial technology approach in Earth science is presented.
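As a minimal illustration of the fuzzy-cognitive-map mechanics such student models rest on; the concepts, weights, and update rule below are hypothetical stand-ins, not the course's actual desertification model:

```python
import numpy as np

def sigmoid(x, lam=2.0):
    """Squashing function keeping concept activations in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-lam * x))

# Hypothetical concepts: 0 wind strength, 1 vegetation cover,
# 2 sand supply, 3 dune activity. W[i, j] = influence of concept j on i.
W = np.array([
    [0.0,  0.0, 0.0,  0.0],
    [0.0,  0.0, 0.0, -0.4],   # dune activity suppresses vegetation
    [0.5, -0.3, 0.0,  0.0],   # wind raises, vegetation lowers sand supply
    [0.6, -0.7, 0.5,  0.0],   # drivers of dune activity
])

x = np.array([0.8, 0.5, 0.3, 0.2])   # initial activations
for _ in range(30):                  # iterate toward a steady state
    x = sigmoid(W @ x)
    x[0] = 0.8                       # clamp the external driver (wind)
print(np.round(x, 3))                # converged concept levels, incl. dune activity
```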
Davenport, Thomas H
2006-01-01
We all know the power of the killer app. It's not just a support tool; it's a strategic weapon. Companies questing for killer apps generally focus all their firepower on the one area that promises to create the greatest competitive advantage. But a new breed of organization has upped the stakes: Amazon, Harrah's, Capital One, and the Boston Red Sox have all dominated their fields by deploying industrial-strength analytics across a wide variety of activities. At a time when firms in many industries offer similar products and use comparable technologies, business processes are among the few remaining points of differentiation--and analytics competitors wring every last drop of value from those processes. Employees hired for their expertise with numbers or trained to recognize their importance are armed with the best evidence and the best quantitative tools. As a result, they make the best decisions. In companies that compete on analytics, senior executives make it clear--from the top down--that analytics is central to strategy. Such organizations launch multiple initiatives involving complex data and statistical analysis, and quantitative activity is managed at the enterprise (not departmental) level. In this article, professor Thomas H. Davenport lays out the characteristics and practices of these statistical masters and describes some of the very substantial changes other companies must undergo in order to compete on quantitative turf. As one would expect, the transformation requires a significant investment in technology, the accumulation of massive stores of data, and the formulation of company-wide strategies for managing the data. But, at least as important, it also requires executives' vocal, unswerving commitment and willingness to change the way employees think, work, and are treated.
DOT National Transportation Integrated Search
1995-08-01
Bridge design engineers and local highway officials make bridge replacement decisions across the United States. The Analytical Hierarchy Process was used to characterize the bridge material selection decision of these individuals. State Departmen...
Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M
2015-10-10
The pharmaceutical industry is under stringent regulations on quality control of its products because quality is critical for both the production process and consumer safety. According to the framework of "process analytical technology" (PAT), a complete understanding of the process and a stepwise monitoring of manufacturing are required. Near infrared spectroscopy (NIRS) combined with chemometrics has lately proven efficient, useful, and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction, and coating. A novel methodology, the "process spectrum", is proposed for selecting the calibration set, into which physical changes in the samples at each stage are algebraically incorporated. We also established a "model space" defined by Hotelling's T² and Q-residual statistics for outlier identification - inside/outside the defined space - in order to select objectively the factors to be used in calibration set construction. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for the implementation of this easy and fast methodology in the pharma industry. Copyright © 2015 Elsevier B.V. All rights reserved.
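As a sketch of the outlier statistics that define such a "model space"; random data stands in for the spectra, and simple empirical percentiles stand in for the F- and chi-squared-based control limits commonly used in practice:

```python
import numpy as np
from sklearn.decomposition import PCA

# X: calibration "spectra" (synthetic), rows = samples, columns = wavelengths
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 200))

pca = PCA(n_components=3).fit(X)
T = pca.transform(X)                    # scores in the model space
X_hat = pca.inverse_transform(T)        # reconstruction from the model

# Hotelling's T2: score distance weighted by per-component variance
t2 = np.sum(T**2 / pca.explained_variance_, axis=1)

# Q residuals (SPE): squared reconstruction error outside the model space
q = np.sum((X - X_hat)**2, axis=1)

# Samples beyond, e.g., the 95th percentile on either statistic fall
# outside the model space and are flagged as outliers
t2_lim, q_lim = np.percentile(t2, 95), np.percentile(q, 95)
outliers = (t2 > t2_lim) | (q > q_lim)
print(np.flatnonzero(outliers))
```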
Gurtner, Sebastian
2014-01-01
Decision makers in hospitals are regularly faced with choices about the adoption of new technologies. Wrong decisions lead to a waste of resources and can have serious effects on the patients' and hospital's well-being. The goal of this research was to contribute to the understanding of decision making in hospitals. This study produced insights regarding relevant decision criteria and explored their specific relevance. An initial empirical survey was used to collect the relevant criteria for technological decision making in hospitals. In total, 220 experts in the field of health technology assessment from 34 countries participated in the survey. As a second step, the abovementioned criteria were used to form the basis of an analytic hierarchy process model. A group of 115 physicians, medical technical assistants, and other staff, all of whom worked in the field of radiooncology, prioritized the criteria. An analysis of variance was performed to explore differences among groups in terms of institutional and personal categorization variables. The first part of the research revealed seven key criteria for technological decision making in hospitals. The analytic hierarchy process model revealed that organizational impact was the most important criterion, followed by budget impact. The analysis of variance indicated that there were differences in the perceptions of the importance of the identified criteria. This exploration of the criteria for technological decision making in hospitals will help decision makers consider all of the relevant aspects, leading to more structured and rational decisions. For the optimal resource allocation, all of the relevant stakeholder perspectives and local issues must be considered appropriately.
Smart Partnerships to Increase Equity in Education
ERIC Educational Resources Information Center
Leahy, Margaret; Davis, Niki; Lewin, Cathy; Charania, Amina; Nordin, Hasniza; Orlic, Davor; Butler, Deirdre; Lopez-Fernadez, Olatz
2016-01-01
This exploratory analysis of smart partnerships identifies the risk of increasing the digital divide with the deployment of data analytics. Smart partnerships in education appear to include a process of evolution into a synergy of strategic and holistic approaches that enhance the quality of education with digital technologies, harnessing ICT…
Implementation of quality by design toward processing of food products.
Rathore, Anurag S; Kapoor, Gautam
2017-05-28
Quality by design (QbD) is a systematic approach that begins with predefined objectives and emphasizes product and process understanding and process control. It is an approach based on principles of sound science and quality risk management. As the food processing industry continues to embrace the idea of in-line, online, and/or at-line sensors and real-time characterization for process monitoring and control, the existing gaps in our ability to monitor the multiple parameters and variables associated with the manufacturing process will be alleviated over time. Investments made in tools and approaches that facilitate high-throughput analytical and process development, process analytical technology, design of experiments, risk analysis, knowledge management, and enhancement of process/product understanding would pave the way for operational and economic benefits later in the commercialization process and across other product pipelines. This article aims to achieve two major objectives: first, to review the progress that has been made in recent years on the topic of QbD implementation in the processing of food products, and second, to present a case study that illustrates the benefits of such QbD implementation.
Uzoka, Faith-Michael Emeka; Obot, Okure; Barker, Ken; Osuji, J
2011-07-01
The task of medical diagnosis is a complex one, considering the level of vagueness and uncertainty management involved, especially when the disease has multiple symptoms. A number of researchers have utilized the fuzzy analytic hierarchy process (fuzzy-AHP) methodology to handle imprecise data in medical diagnosis and therapy. Fuzzy logic is able to handle vagueness and unstructuredness in decision-making, while the AHP has the ability to carry out pairwise comparison of decision elements in order to determine their importance in the decision process. This study presents a case comparison of the fuzzy and AHP methods in the development of a medical diagnosis system involving basic symptom elicitation and analysis. The results of the study indicate a non-statistically-significant relative superiority of the fuzzy technology over the AHP technology. Data collected from 30 malaria patients were used for diagnosis using AHP and fuzzy logic independently of one another. The results were compared and found to covary strongly. It was also discovered that the fuzzy logic diagnosis results covary somewhat more strongly with the conventional diagnosis results than those of the AHP. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Josefsberg, Jessica O; Buckland, Barry
2012-06-01
The evolution of vaccines (e.g., live attenuated, recombinant) and vaccine production methods (e.g., in ovo, cell culture) are intimately tied to each other. As vaccine technology has advanced, the methods to produce the vaccine have advanced and new vaccine opportunities have been created. These technologies will continue to evolve as we strive for safer and more immunogenic vaccines and as our understanding of biology improves. The evolution of vaccine process technology has occurred in parallel to the remarkable growth in the development of therapeutic proteins as products; therefore, recent vaccine innovations can leverage the progress made in the broader biotechnology industry. Numerous important legacy vaccines are still in use today despite their traditional manufacturing processes, with further development focusing on improving stability (e.g., novel excipients) and updating formulation (e.g., combination vaccines) and delivery methods (e.g., skin patches). Modern vaccine development is currently exploiting a wide array of novel technologies to create safer and more efficacious vaccines including: viral vectors produced in animal cells, virus-like particles produced in yeast or insect cells, polysaccharide conjugation to carrier proteins, DNA plasmids produced in E. coli, and therapeutic cancer vaccines created by in vitro activation of patient leukocytes. Purification advances (e.g., membrane adsorption, precipitation) are increasing efficiency, while innovative analytical methods (e.g., microsphere-based multiplex assays, RNA microarrays) are improving process understanding. Novel adjuvants such as monophosphoryl lipid A, which acts on antigen presenting cell toll-like receptors, are expanding the previously conservative list of widely accepted vaccine adjuvants. As in other areas of biotechnology, process characterization by sophisticated analysis is critical not only to improve yields, but also to determine the final product quality. From a regulatory perspective, Quality by Design (QbD) and Process Analytical Technology (PAT) are important initiatives that can be applied effectively to many types of vaccine processes. Universal demand for vaccines requires that a manufacturer plan to supply tens and sometimes hundreds of millions of doses per year at low cost. To enable broader use, there is intense interest in improving temperature stability to allow for excursions from a rigid cold chain supply, especially at the point of vaccination. Finally, there is progress in novel routes of delivery to move away from the traditional intramuscular injection by syringe approach. Copyright © 2012 Wiley Periodicals, Inc.
Analytical and numerical study of New field emitter processing for superconducting cavities
NASA Astrophysics Data System (ADS)
Volkov, Vladimir; Petrov, Victor
2018-02-01
In this article, a scientific proof for a new technology to maximize the accelerating gradient in superconducting cavities by processing at higher-order-mode frequencies is presented. The heating of field emitters by an induced rf current (rf-heating) is considered as the dominant energy source. The field emitter structure is assumed to be a chain of conductive particles formed by attractive forces.
NASA Astrophysics Data System (ADS)
Johnson, S. P.; Rohrer, M. E.
2017-12-01
The application of scientific research pertaining to satellite imaging and data processing has facilitated the development of dynamic methodologies and tools that utilize nanosatellites and analytical platforms to address the increasing scope, scale, and intensity of emerging environmental threats to national security. While the use of remotely sensed data to monitor the environment at local and global scales is not a novel proposition, the application of advances in nanosatellites and analytical platforms are capable of overcoming the data availability and accessibility barriers that have historically impeded the timely detection, identification, and monitoring of these stressors. Commercial and university-based applications of these technologies were used to identify and evaluate their capacity as security-motivated environmental monitoring tools. Presently, nanosatellites can provide consumers with 1-meter resolution imaging, frequent revisits, and customizable tasking, allowing users to define an appropriate temporal scale for high resolution data collection that meets their operational needs. Analytical platforms are capable of ingesting increasingly large and diverse volumes of data, delivering complex analyses in the form of interpretation-ready data products and solutions. The synchronous advancement of these technologies creates the capability of analytical platforms to deliver interpretable products from persistently collected high-resolution data that meet varying temporal and geographic scale requirements. In terms of emerging environmental threats, these advances translate into customizable and flexible tools that can respond to and accommodate the evolving nature of environmental stressors. This presentation will demonstrate the capability of nanosatellites and analytical platforms to provide timely, relevant, and actionable information that enables environmental analysts and stakeholders to make informed decisions regarding the prevention, intervention, and prediction of emerging environmental threats.
Information literacy of U.S. and Indian engineering undergraduates.
Taraban, Roman; Suar, Damodar; Oliver, Kristin
2013-12-01
To be competitive, contemporary engineers must be capable of both processing and communicating information effectively. Available research suggests that Indian students would be disadvantaged in information literacy in their language of instruction (English) compared to U.S. students because English is not Indian students' native language. Compared to U.S. students, Indian students (a) were predicted to apply practical text processing strategies to a greater extent than analytic strategies and (b) endorse the direct transmission of information over critical, interpretive analysis of information. Two validated scales measuring self-reported use of reading strategies and beliefs about interpreting and critiquing written information were administered to engineering students at an Indian Institute of Technology in their freshman to senior years. Neither prediction was supported: Indian students reported applying analytic strategies over pragmatic strategies and were more disposed to critically analyze information rather than accept it passively. Further, Indian students reported being more analytic and more reflective in their reading behaviors than U.S. engineering students. Additional data indicated that U.S. and Indian students' text-processing strategies and beliefs are associated with the texts that they read and their academic behaviors.
Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A
2017-12-19
As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
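A minimal sketch of the kind of comparison LipidQC automates, assuming illustrative (not actual) SRM 1950 consensus values; lipid names, means, and uncertainties below are placeholders:

```python
import numpy as np

# Hypothetical consensus benchmarks: mean and standard uncertainty (nmol/mL)
consensus = {"PC 16:0_18:1": (205.0, 18.0),
             "SM d18:1/16:0": (98.0, 9.5)}
# One lab's experimental results for the same lipids (also hypothetical)
measured = {"PC 16:0_18:1": 231.0,
            "SM d18:1/16:0": 101.0}

for lipid, (mu, u) in consensus.items():
    z = (measured[lipid] - mu) / u            # z-score vs consensus mean
    flag = "ok" if abs(z) <= 2 else "check"   # +/-2u as a simple acceptance band
    print(f"{lipid}: z = {z:+.1f} ({flag})")
```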
Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric
2014-03-01
Pharmaceutical companies are progressively adopting and introducing the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding and quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were established as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. As, by nature, the variability of the sampling method and the reference method are included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant, from the launch of the new API synthesis process, will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and suppression of a rather difficult sampling step and tedious off-line analyses. © 2013 Published by Elsevier B.V.
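For context, a minimal sketch of a PLS calibration with cross-validated error estimation; the spectra, reference values, and component count are synthetic stand-ins, not the study's data or model:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic NIR calibration data: absorbance spectra and reference values
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 500))          # 40 spectra x 500 wavelengths
y = rng.uniform(9.0, 12.0, size=40)     # e.g., API content, % w/w

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()  # leave-group-out predictions

rmsecv = np.sqrt(np.mean((y - y_cv)**2))  # cross-validated prediction error
print(f"RMSECV = {rmsecv:.3f} % w/w")
pls.fit(X, y)                             # final model for on-line prediction
```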
Mass Spectrometry Based Lipidomics: An Overview of Technological Platforms
Köfeler, Harald C.; Fauland, Alexander; Rechberger, Gerald N.; Trötzmüller, Martin
2012-01-01
One decade after the genomic and the proteomic life science revolution, new 'omics' fields are emerging. The metabolome encompasses the entity of small molecules - most often end products of a catalytic process regulated by genes and proteins - with the lipidome being its fat-soluble subdivision. Within recent years, lipids are more and more regarded not only as energy storage compounds but also as interactive players in various cellular regulation cycles, and thus attract rising interest in the biomedical community. The field of lipidomics is, on one hand, fuelled by analytical technology advances, particularly mass spectrometry and chromatography, but on the other hand new biological questions also drive analytical technology developments. Compared to fairly standardized genomic or proteomic high-throughput protocols, the high degree of molecular heterogeneity adds a special analytical challenge to lipidomic analysis. In this review, we will take a closer look at various mass spectrometric platforms for lipidomic analysis. We will focus on the advantages and limitations of various experimental setups like 'shotgun lipidomics', liquid chromatography-mass spectrometry (LC-MS) and matrix-assisted laser desorption ionization-time of flight (MALDI-TOF) based approaches. We will also examine available software packages for data analysis, which nowadays is in fact the rate-limiting step for most 'omics' workflows. PMID:24957366
Big data and high-performance analytics in structural health monitoring for bridge management
NASA Astrophysics Data System (ADS)
Alampalli, Sharada; Alampalli, Sandeep; Ettouney, Mohammed
2016-04-01
Structural Health Monitoring (SHM) can be a vital tool for effective bridge management. Combining large data sets from multiple sources to create a data-driven decision-making framework is crucial for the success of SHM. This paper presents a big data analytics framework that combines multiple data sets correlated with functional relatedness to convert data into actionable information that empowers risk-based decision-making. The integrated data environment incorporates near-real-time streams of semi-structured data from remote sensors, historical visual inspection data, and observations from structural analysis models to monitor, assess, and manage risks associated with aging bridge inventories. Accelerated processing of the data sets is made possible by four technologies: cloud computing, relational database processing, support from NoSQL databases, and in-memory analytics. The framework is being validated on a railroad corridor that can be subjected to multiple hazards. The framework enables the computation of reliability indices for critical bridge components and individual bridge spans. In addition, the framework includes a risk-based decision-making process that enumerates the costs and consequences of poor bridge performance at span and network levels when rail networks are exposed to natural hazard events such as floods and earthquakes. Big data and high-performance analytics enable insights that assist bridge owners in addressing problems faster.
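As a sketch of the component-level reliability-index calculation mentioned above, assuming independent, normally distributed resistance and load effect with hypothetical moments (a real framework would estimate these from sensor streams, inspection records, and structural models):

```python
import numpy as np
from scipy.stats import norm

# Hypothetical first-order moments for one bridge component
mu_R, sigma_R = 1200.0, 120.0   # resistance (e.g., kN)
mu_S, sigma_S = 800.0, 150.0    # load effect (e.g., kN)

# Reliability index for the limit state g = R - S under normality
beta = (mu_R - mu_S) / np.sqrt(sigma_R**2 + sigma_S**2)
pf = norm.cdf(-beta)            # corresponding probability of failure
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")
```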
From Streaming Data to Streaming Insights: The Impact of Data Velocities on Mental Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Endert, Alexander; Pike, William A.; Cook, Kristin A.
The rise of Big Data has influenced the design and technical implementation of visual analytic tools required to handle the increased volumes, velocities, and varieties of data. This has required a set of data management and computational advancements to allow us to store and compute on such datasets. However, as the ultimate goal of visual analytic technology is to enable the discovery and creation of insights by users, an under-explored area is understanding how these datasets impact their mental models. That is, how have the analytic processes and strategies of users changed? How have users changed their perception of how to leverage, and ask questions of, these datasets?
PAT: From Western solid dosage forms to Chinese materia medica preparations using NIR-CI.
Zhou, Luwei; Xu, Manfei; Wu, Zhisheng; Shi, Xinyuan; Qiao, Yanjiang
2016-01-01
Near-infrared chemical imaging (NIR-CI) is an emerging technology that combines traditional near-infrared spectroscopy with chemical imaging. Therefore, NIR-CI can extract spectral information from pharmaceutical products and simultaneously visualize the spatial distribution of chemical components. The rapid and non-destructive features of NIR-CI make it an attractive process analytical technology (PAT) for identifying and monitoring critical control parameters during the pharmaceutical manufacturing process. This review mainly focuses on the pharmaceutical applications of NIR-CI in each unit operation during the manufacturing processes, from the Western solid dosage forms to the Chinese materia medica preparations. Finally, future applications of chemical imaging in the pharmaceutical industry are discussed. Copyright © 2015 John Wiley & Sons, Ltd.
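To make the component-distribution idea concrete, a minimal classical-least-squares mapping sketch on a synthetic hyperspectral cube; the cube, reference spectra, dimensions, and component roles are all placeholders, and real NIR-CI software applies far more robust chemometrics:

```python
import numpy as np

# Synthetic hyperspectral cube: 64 x 64 pixels, 100 wavelength channels
rng = np.random.default_rng(4)
cube = rng.random((64, 64, 100))

# Assumed-known pure-component reference spectra (e.g., API and excipient)
S = rng.random((2, 100))

# Classical least squares per pixel: concentrations C minimizing ||x - C S||
pixels = cube.reshape(-1, 100)                      # (4096, 100)
C, *_ = np.linalg.lstsq(S.T, pixels.T, rcond=None)  # (2, 4096)
api_map = C[0].reshape(64, 64)                      # spatial distribution map
print(api_map.shape)
```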
Microelectromechanical Systems
NASA Technical Reports Server (NTRS)
Gabriel, Kaigham J.
1995-01-01
Micro-electromechanical systems (MEMS) is an enabling technology that merges computation and communication with sensing and actuation to change the way people and machines interact with the physical world. MEMS is a manufacturing technology that will impact widespread applications including: miniature inertial measurement units for competent munitions and personal navigation; distributed unattended sensors; mass data storage devices; miniature analytical instruments; embedded pressure sensors; non-invasive biomedical sensors; fiber-optic components and networks; distributed aerodynamic control; and on-demand structural strength. The long-term goal of ARPA's MEMS program is to merge information processing with sensing and actuation to realize new systems and strategies for both perceiving and controlling systems, processes, and the environment. The MEMS program has three major thrusts: advanced devices and processes, system design, and infrastructure.
Naidu, Venkata Ramana; Deshpande, Rucha S; Syed, Moinuddin R; Wakte, Pravin S
2018-07-01
A direct imaging system (Eyecon™) was used as a Process Analytical Technology (PAT) tool to monitor a fluid bed coating process. Eyecon™ generated real-time on-screen images and particle size and shape information for two identically manufactured laboratory-scale batches. Eyecon™ can measure particle size increases with an accuracy of ±1 μm on particles in the size range of 50-3000 μm, and it captured data every 2 s during the entire process. The moving average of the D90 particle size values recorded by Eyecon™ was calculated over every 30 min in order to calculate the radial coating thickness of the coated particles. After the completion of the coating process, the radial coating thickness was found to be 11.3 and 9.11 μm, with standard deviations of ±0.68 and 1.8 μm for Batch 1 and Batch 2, respectively. The coating thickness was also correlated with percent weight build-up by gel permeation chromatography (GPC) and dissolution. GPC indicated weight build-ups of 10.6% and 9.27% for Batch 1 and Batch 2, respectively. In conclusion, a weight build-up of 10% can be correlated with a 10 ± 2 μm increase in the coating thickness of the pellets, indicating the potential applicability of real-time imaging as an endpoint determination tool for fluid bed coating processes.
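A minimal sketch of the end-point arithmetic described above, with simulated D90 readings standing in for the imaging data stream (all values are illustrative):

```python
import numpy as np

# Simulated D90 readings logged every 2 s over a ~5 h coating run (in um)
rng = np.random.default_rng(2)
d90 = 500 + np.cumsum(rng.normal(0.0022, 0.05, size=9000))

window = 900                                   # 30 min / one reading per 2 s
d90_ma = np.convolve(d90, np.ones(window) / window, mode="valid")

# Radial coating thickness = half the net diameter growth over the run
thickness = (d90_ma[-1] - d90_ma[0]) / 2.0
print(f"radial coating thickness ~ {thickness:.1f} um")
```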
Military engine computational structures technology
NASA Technical Reports Server (NTRS)
Thomson, Daniel E.
1992-01-01
Integrated High Performance Turbine Engine Technology Initiative (IHPTET) goals require a strong analytical base. Effective analysis of composite materials is critical to life analysis and structural optimization. Accurate life prediction for all material systems is critical. User-friendly systems are also desirable. Post-processing of results is very important. The IHPTET goal is to double turbine engine propulsion capability by the year 2003. Fifty percent of the goal will come from advanced materials and structures; the other 50 percent will come from increased performance. Computer programs are listed.
Analytical Ferrography Standardization.
1982-01-01
Final Report, MTI Technical Report No. 82TRS6: Analytical Ferrography Standardization. P. B. Senholzi and A. S. Maciejewski, Applications Engineering, Mechanical Technology Incorporated, Research and Development Division, Latham, NY, January 1982.
DNA biosensing with 3D printing technology.
Loo, Adeline Huiling; Chua, Chun Kiang; Pumera, Martin
2017-01-16
3D printing, an upcoming technology, has vast potential to transform conventional fabrication processes due to the numerous improvements it can offer to the current methods. To date, the employment of 3D printing technology has been examined for applications in the fields of engineering, manufacturing and biological sciences. In this study, we examined the potential of adopting 3D printing technology for a novel application, electrochemical DNA biosensing. Metal 3D printing was utilized to construct helical-shaped stainless steel electrodes which functioned as a transducing platform for the detection of DNA hybridization. The ability of electroactive methylene blue to intercalate into the double helix structure of double-stranded DNA was then exploited to monitor the DNA hybridization process, with its inherent reduction peak serving as an analytical signal. The designed biosensing approach was found to demonstrate superior selectivity against a non-complementary DNA target, with a detection range of 1-1000 nM.
Uy, Raymonde Charles Y; Kury, Fabricio P; Fontelo, Paul A
2015-01-01
The standard of safe medication practice requires strict observance of the five rights of medication administration: the right patient, drug, time, dose, and route. Despite adherence to these guidelines, medication errors remain a public health concern that has generated health policies and hospital processes that leverage automation and computerization to reduce these errors. Bar code, RFID, biometrics, and pharmacy automation technologies have been demonstrated in the literature to decrease the incidence of medication errors by minimizing the human factors involved in the process. Despite evidence suggesting the effectiveness of these technologies, adoption rates and trends vary across hospital systems. The objective of this study is to examine the state and adoption trends of automatic identification and data capture (AIDC) methods and pharmacy automation technologies in U.S. hospitals. A retrospective descriptive analysis of survey data from the HIMSS Analytics® Database was performed, demonstrating optimistic growth in the adoption of these patient safety solutions.
Kornecki, Martin; Strube, Jochen
2018-03-16
Productivity improvements of mammalian cell culture in the production of recombinant proteins have been made by optimizing cell lines, media, and process operation. This led to enhanced titers and process robustness without increasing the cost of the upstream processing (USP); however, a downstream bottleneck remains. In terms of process control improvement, the process analytical technology (PAT) initiative, initiated by the American Food and Drug Administration (FDA), aims to measure, analyze, monitor, and ultimately control all important attributes of a bioprocess. In particular, spectroscopic methods such as Raman or near-infrared spectroscopy make it possible to meet these analytical requirements, preferably in-situ. In combination with chemometric techniques like partial least squares (PLS) regression or principal component analysis (PCA), it is possible to generate soft sensors, which estimate process variables based on process and measurement models for the enhanced control of bioprocesses. Macroscopic kinetic models can be used to simulate cell metabolism. These models are able to enhance process understanding by predicting the dynamics of cells during cultivation. In this article, in-situ turbidity (transmission, 880 nm) and ex-situ Raman spectroscopy (785 nm) measurements are combined with an offline macroscopic Monod kinetic model in order to predict substrate concentrations. Experimental data from Chinese hamster ovary cultivations in bioreactors show a sufficiently linear correlation (R² ≥ 0.97) between turbidity and total cell concentration. PLS regression of Raman spectra generates a prediction model, which was validated via offline viable cell concentration measurement (RMSE ≤ 13.82, R² ≥ 0.92). Based on these measurements, the macroscopic Monod model can be used to determine different process attributes, e.g., glucose concentration. In consequence, it is possible to approximately calculate (R² ≥ 0.96) the glucose concentration based on online cell concentration measurements using turbidity or Raman spectroscopy. Future approaches will use these online substrate concentration measurements with turbidity and Raman measurements, in combination with the kinetic model, in order to control the bioprocess in terms of feeding strategies, by employing an Open Platform Communications (OPC) network - either in fed-batch or perfusion mode, integrated into a continuous operation of upstream and downstream. PMID:29547557
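A minimal sketch of a macroscopic Monod batch model and the glucose back-calculation from an on-line biomass estimate; all kinetic parameters, units, and values below are hypothetical, not those fitted in the study:

```python
import numpy as np
from scipy.integrate import solve_ivp

# X: viable cells [1e6 cells/mL], S: glucose [g/L] (hypothetical CHO batch)
mu_max, K_S, Y_XS = 0.04, 0.5, 40.0   # 1/h, g/L, (1e6 cells/mL) per (g/L)

def monod(t, y):
    """Monod growth with yield-coupled substrate consumption."""
    X, S = y
    mu = mu_max * S / (K_S + S)
    return [mu * X, -mu * X / Y_XS]

sol = solve_ivp(monod, (0, 120), [2.0, 6.0], max_step=0.5)  # 120 h batch
X_t, S_t = sol.y[0], sol.y[1]

# Given an on-line biomass estimate X_online (from turbidity or a Raman PLS
# model), the mass balance yields an approximate glucose concentration:
X_online = 150.0                       # hypothetical soft-sensor reading
S_est = 6.0 - (X_online - 2.0) / Y_XS  # S0 - dX / Y_XS
print(f"estimated glucose ~ {S_est:.2f} g/L")
```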
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-23
DEPARTMENT OF HEALTH AND HUMAN SERVICES, Food and Drug Administration [Docket No. FDA-2010-N-0001...]: ...Administration--Partnering With Industry; Public Conference. AGENCY: Food and Drug Administration, HHS. ACTION: Notice of public conference. The Food and Drug Administration (FDA) is announcing a joint conference with...
DIY Analytics for Postsecondary Students
ERIC Educational Resources Information Center
Arndt, Timothy; Guercio, Angela
2014-01-01
Recently organizations have begun to realize the potential value in the huge amounts of raw, constantly fluctuating data sets that they generate and, with the help of advances in storage and processing technologies, collect. This leads to the phenomenon of big data. This data may be stored in structured format in relational database systems, but…
Data Entry: Towards the Critical Study of Digital Data and Education
ERIC Educational Resources Information Center
Selwyn, Neil
2015-01-01
The generation and processing of data through digital technologies is an integral element of contemporary society, as reflected in recent debates over online data privacy, "Big Data" and the rise of data mining and analytics in business, science and government. This paper outlines the significance of digital data within education,…
Holland, Tanja; Blessing, Daniel; Hellwig, Stephan; Sack, Markus
2013-10-01
Radio frequency impedance spectroscopy (RFIS) is a robust method for the determination of cell biomass during fermentation. RFIS allows non-invasive in-line monitoring of the passive electrical properties of cells in suspension and can distinguish between living and dead cells based on their distinct behavior in an applied radio frequency field. We used continuous in situ RFIS to monitor batch-cultivated plant suspension cell cultures in stirred-tank bioreactors and compared the in-line data to conventional off-line measurements. RFIS-based analysis was more rapid and more accurate than conventional biomass determination, and was sensitive to changes in cell viability. The higher resolution of the in-line measurement revealed subtle changes in cell growth which were not accessible using conventional methods. Thus, RFIS is well suited for correlating such changes with intracellular states and product accumulation, providing unique opportunities for employing systems biotechnology and process analytical technology approaches to increase product yield and quality. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Peres, Daniela D'Almeida; Ariede, Maira Bueno; Candido, Thalita Marcilio; de Almeida, Tania Santos; Lourenço, Felipe Rebello; Consiglieri, Vladi Olga; Kaneko, Telma Mary; Velasco, Maria Valéria Robles; Baby, André Rolim
2017-02-01
Multifunctional formulations are of great importance to ensure better skin protection from the harm caused by ultraviolet (UV) radiation. Despite the advantages of the Quality by Design and Process Analytical Technology approaches for the development and optimization of new products, we found only a few studies in the literature concerning their application in the cosmetic industry. Thus, in this research work, we applied the QbD and PAT approaches to the development of multifunctional sunscreens containing bemotrizinol, ethylhexyl triazone, and ferulic acid. In addition, a UV transmittance method was applied to assess qualitative and quantitative critical quality attributes of the sunscreens using chemometric analyses. Linear discriminant analysis allowed unknown formulations to be classified, which is useful for the investigation of counterfeiting and adulteration. Simultaneous quantification of the ethylhexyl triazone, bemotrizinol, and ferulic acid present in the formulations was performed using PLS regression. This design allowed us to verify the compounds in isolation and in combination, and to demonstrate the antioxidant action of ferulic acid in addition to its sunscreen action, since the presence of this component increased in vitro antioxidant activity by 90%.
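For illustration, a minimal LDA classification sketch on synthetic stand-ins for UV transmittance spectra; the class labels, dimensions, and data are hypothetical, not the study's:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Synthetic UV transmittance "spectra" for three known formulation classes
rng = np.random.default_rng(3)
X = rng.normal(size=(60, 120))            # 60 spectra x 120 wavelengths
y = np.repeat(["F1", "F2", "F3"], 20)     # formulation labels

lda = LinearDiscriminantAnalysis().fit(X, y)

# Classify an unknown sample, e.g., to screen suspected counterfeits whose
# spectrum does not match any authentic formulation class with confidence
unknown = rng.normal(size=(1, 120))
print(lda.predict(unknown), lda.predict_proba(unknown).round(2))
```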
[Big data in medicine and healthcare].
Rüping, Stefan
2015-08-01
Healthcare is one of the business fields with the highest Big Data potential. According to the prevailing definition, Big Data refers to the fact that data today are often too large and heterogeneous, and change too quickly, to be stored, processed, and transformed into value by previous technologies. Several technological trends drive Big Data: business processes are increasingly executed electronically, consumers produce more and more data themselves (e.g., in social networks), and digitalization continues to advance. Currently, several new trends towards new data sources and innovative data analysis are appearing in medicine and healthcare. From the research perspective, omics research is one clear Big Data topic. In practice, electronic health records, free open data, and the "quantified self" offer new perspectives for data analytics. Regarding analytics, significant advances have been made in information extraction from text data, which unlocks a lot of data from clinical documentation for analytics purposes. At the same time, medicine and healthcare are lagging behind in the adoption of Big Data approaches. This can be traced to particular problems regarding data complexity and to organizational, legal, and ethical challenges. The growing uptake of Big Data in general, and first best-practice examples in medicine and healthcare in particular, indicate that innovative solutions will be coming. This paper gives an overview of the potentials of Big Data in medicine and healthcare.
Cancela, Jorge; Fico, Giuseppe; Arredondo Waldmeyer, Maria T
2015-01-01
The assessment of a new health technology is a multidisciplinary and multidimensional process, which requires a complex analysis and the convergence of different stakeholders into a common decision. This task is even more delicate when the assessment is carried out at an early stage of the development process, when the immaturity of the technology prevents conducting large-scale trials to evaluate cost effectiveness through classic health economics methods. This lack of information may limit future development and deployment in clinical practice. This work aims to 1) identify the most relevant user needs of a new medical technology for managing and monitoring Parkinson's Disease (PD) patients and 2) use these user needs for a preliminary assessment of a specific system called PERFORM, as a case study. The Analytic Hierarchy Process (AHP) was used to design a hierarchy of 17 needs, grouped into 5 categories. A total of 16 experts, 6 of them with a clinical background and the remaining 10 with a technical background, were asked to rank these needs and categories. On/Off fluctuations detection, Increase wearability acceptance, and Increase self-management support were identified as the most relevant user needs. No significant differences were found between the clinical and technical groups. These results have been used to evaluate the PERFORM system and to identify future areas of improvement. First of all, the AHP contributed to the elaboration of a unified hierarchy, integrating the needs of a variety of stakeholders and promoting discussion and agreement within a common framework of evaluation. Moreover, the AHP effectively supported user-need elicitation as well as the assignment of different weights and priorities to each need and, consequently, helped to define a framework for the assessment of telehealth systems for PD management and monitoring. This framework can be used to support the decision-making process for the adoption of new technologies in PD.
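For readers unfamiliar with the AHP step used here, the following minimal sketch shows how priority weights are derived from a Saaty-scale pairwise comparison matrix via its principal eigenvector, together with the usual consistency check; the 3x3 judgment matrix is illustrative, not the PERFORM data.

```python
# Minimal AHP sketch: priority weights from a pairwise comparison matrix.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],      # Saaty-scale judgments among three needs
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                  # priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
print("weights:", w.round(3), "CR:", round(ci / ri, 3))  # CR < 0.1 is acceptable
```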
Total analysis systems with Thermochromic Etching Discs technology.
Avella-Oliver, Miquel; Morais, Sergi; Carrascosa, Javier; Puchades, Rosa; Maquieira, Ángel
2014-12-16
A new analytical system based on Thermochromic Etching Discs (TED) technology is presented. TED comprises a number of attractive features, such as track independence, selective irradiation, a high-power laser, and the capability to create useful assay platforms. The analytical versatility of this tool opens up a wide range of possibilities for designing new compact disc-based total analysis systems applicable in chemistry and the life sciences. In this paper, the analytical implementation of TED is described and discussed, and its analytical potential is supported by several applications. Microarray immunoassay, immunofiltration assay, solution measurement, and cell culture approaches are addressed in order to demonstrate the practical capacity of this system. The analytical usefulness of TED technology is demonstrated by describing how to exploit this tool for developing truly integrated analytical systems that provide solutions within the point-of-care framework.
[Laser Raman Spectroscopy and Its Application in Gas Hydrate Studies].
Fu, Juan; Wu, Neng-you; Lu, Hai-long; Wu, Dai-dai; Su, Qiu-cheng
2015-11-01
Gas hydrates are important potential energy resources. Microstructural characterization of gas hydrates can provide information for studying the mechanism of gas hydrate formation and for supporting the exploitation and application of gas hydrate technology. This article systematically introduces the basic principles of laser Raman spectroscopy and summarizes its application in gas hydrate studies. From Raman results, not only can information about gas composition and structural type be deduced, but the occupancies of large and small cages, and even the hydration number, can be calculated from the relative intensities of Raman peaks. Using in situ analytical technology, laser Raman spectroscopy can be applied to characterize the formation and decomposition processes of gas hydrates at the microscale, for example the enclathration of gas molecules into the cages and their release from them, to monitor the changes in gas concentration and gas solubility during hydrate formation and decomposition, and to identify phase changes in the system under study. Laser Raman in situ analytical technology has also been used in the determination of hydrate structure and in understanding its structural changes under ultra-high pressure. Deep-sea in situ Raman spectrometers can be employed for the in situ analysis of the structures of natural gas hydrates and their formation environment. Raman imaging technology can be applied to map the characteristics of crystallization and gas distribution over the hydrate surface. With the development of laser Raman technology and its combination with other instruments, it will become more powerful and play a more significant role in the microscopic study of gas hydrates.
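The cage-occupancy calculation mentioned above can be made concrete for a structure I methane hydrate: the Raman intensity ratio of the large- and small-cage peaks fixes theta_L/theta_S (sI has 6 large and 2 small cages per unit cell, so I_L/I_S = 3*theta_L/theta_S), and the van der Waals-Platteeuw relation then pins down the absolute occupancies and the hydration number. The sketch below assumes this standard treatment; the chemical-potential difference and the measured ratio are illustrative values.

```python
# Minimal sketch: sI cage occupancies and hydration number from a Raman ratio.
import math
from scipy.optimize import brentq

R, T = 8.314, 273.15          # gas constant (J/mol/K) and temperature (K)
d_mu0 = 1297.0                # empty-lattice chemical potential difference, J/mol (assumed)
ratio_LS = 1.10               # theta_L / theta_S = I_L / (3 * I_S), assumed measurement

def vdwp_residual(theta_S):
    """van der Waals-Platteeuw closure: d_mu0 = -RT*((3/23)ln(1-tL) + (1/23)ln(1-tS))."""
    theta_L = min(ratio_LS * theta_S, 1 - 1e-9)
    return d_mu0 + R * T * ((3 / 23) * math.log(1 - theta_L)
                            + (1 / 23) * math.log(1 - theta_S))

theta_S = brentq(vdwp_residual, 1e-6, 1 - 1e-6)
theta_L = min(ratio_LS * theta_S, 1 - 1e-9)
n_hyd = 46.0 / (6 * theta_L + 2 * theta_S)    # 46 H2O molecules per sI unit cell
print(f"theta_L = {theta_L:.3f}, theta_S = {theta_S:.3f}, hydration number = {n_hyd:.2f}")
```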
Page, Trevor; Dubina, Henry; Fillipi, Gabriele; Guidat, Roland; Patnaik, Saroj; Poechlauer, Peter; Shering, Phil; Guinn, Martin; Mcdonnell, Peter; Johnston, Craig
2015-03-01
This white paper focuses on equipment and analytical manufacturers' perspectives regarding the challenges of continuous pharmaceutical manufacturing, organized around five prompt questions. In addition to valued input from several vendors, commentary was provided by experienced pharmaceutical representatives who have installed various continuous platforms. Additionally, a small and medium enterprise (SME) perspective was obtained through interviews. A range of technical challenges is outlined, including: the presence of particles, equipment scalability, fouling (and cleaning), technology derisking, specific analytical challenges, and the general requirement for improved technical training. Equipment and analytical companies can make a significant contribution to the introduction of continuous technology. A key point is that many of these challenges exist in batch processing and are not specific to continuous processing. Backward compatibility of software is not a continuous-specific issue per se. In many cases, learning is available from other industries. Business models and opportunities through outsourced development partners are also highlighted. Agile smaller companies and academic groups have a key role to play in developing skills, working collaboratively in partnerships, and focusing on solving relevant industry challenges. The precompetitive space differs for vendor companies compared with large pharmaceutical companies. Currently, there is no strong consensus around a dominant continuous design, partly because of business dynamics and commercial interests. A more structured common approach to process design and to hardware and software standardization would be beneficial, with initial practical steps in modeling. Conclusions include a digestible systems approach, accessible and published business cases, and increased user, academic, and supplier collaboration. This mirrors US FDA direction. The concept of silos in pharmaceutical companies is a common theme throughout the white papers. In the equipment domain, this is equally prevalent among a broad range of companies, which mainly focus on discrete areas. As an example, the flow chemistry and secondary drug product communities are almost entirely disconnected. Control and Process Analytical Technologies (PAT) companies are active in both domains. The equipment actors are a very diverse group, with a few major Original Equipment Manufacturer (OEM) players and a variety of SMEs, project providers, integrators, upstream/downstream providers, and PAT specialists. In some cases, partnerships or alliances are formed to increase critical mass. This white paper focuses on small molecules; equipment associated with biopharmaceuticals is covered in a separate white paper, and more equipment detail is provided in the final dosage form and drug substance white papers. Equipment and analytical development from laboratory to pilot to production scale is important, with the variety of sensors and the complexity reducing with scale. Robust processing matters more than mitigation through an overcomplex control strategy. A search of the nonacademic literature highlights, with a few notable exceptions, a relative paucity of material; much of it focuses on the economics and benefits of continuous manufacturing rather than the specifics of equipment issues. The disruptive nature of continuous manufacturing represents either an opportunity or a threat for many companies, so the incentive to change equipment varies. Also, for many companies, the pharmaceutical sector is not actually their dominant sector in terms of sales. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
Emergent FDA biodefense issues for microarray technology: process analytical technology.
Weinberg, Sandy
2004-11-01
A successful biodefense strategy relies upon some combination of four approaches. A nation can protect its troops and citizenry, first, by advanced mass vaccination; second, by responsive ring vaccination; and third, by post-exposure therapeutic treatment (including vaccine therapies). Finally, protection can be achieved by rapid detection followed by exposure limitation (suits and air filters) or immediate treatment (e.g., antibiotics, rapid vaccines, and iodine pills). All of these strategies rely upon or are enhanced by microarray technologies. Microarrays can be used to screen, engineer, and test vaccines. They are also used to construct early detection tools. While effective biodefense utilizes a variety of tactical tools, microarray technology is a valuable arrow in that quiver.
Historical review of missile aerodynamic developments
NASA Technical Reports Server (NTRS)
Spearman, M. Leroy
1989-01-01
A comprehensive development history to about 1970 is presented for missile technologies and their associated capabilities and difficulties. Attention is given to the growth of an experimental data base for missile design, as well as to the critical early efforts to develop analytical methods applicable to missiles. Most of the important missile development efforts made during the period from the end of the Second World War to the early 1960s were based primarily on experiences gained through wind tunnel and flight testing; analytical techniques began to demonstrate their usefulness in the design process only in the late 1960s.
Aziz, Nazneen; Zhao, Qin; Bry, Lynn; Driscoll, Denise K; Funke, Birgit; Gibson, Jane S; Grody, Wayne W; Hegde, Madhuri R; Hoeltge, Gerald A; Leonard, Debra G B; Merker, Jason D; Nagarajan, Rakesh; Palicki, Linda A; Robetorye, Ryan S; Schrijver, Iris; Weck, Karen E; Voelkerding, Karl V
2015-04-01
The higher throughput and lower per-base cost of next-generation sequencing (NGS) as compared to Sanger sequencing has led to its rapid adoption in clinical testing. The number of laboratories offering NGS-based tests has also grown considerably in the past few years, despite the fact that specific Clinical Laboratory Improvement Amendments of 1988/College of American Pathologists (CAP) laboratory standards had not yet been developed to regulate this technology. The objective was therefore to develop a checklist for clinical testing using NGS technology that sets standards for the analytic wet-bench process and for bioinformatics, or "dry bench," analyses. As NGS-based clinical tests are new to diagnostic testing and are of much greater complexity than traditional Sanger sequencing-based tests, there is an urgent need to develop new regulatory standards for laboratories offering these tests. To develop the necessary regulatory framework for NGS and to facilitate appropriate adoption of this technology for clinical testing, CAP formed a committee in 2011, the NGS Work Group, to deliberate upon the contents to be included in the checklist. A total of 18 laboratory accreditation checklist requirements for the analytic wet-bench process and bioinformatics analysis processes have been included within CAP's molecular pathology checklist (MOL). This report describes the important issues considered by the CAP committee during the development of the new checklist requirements, which address documentation, validation, quality assurance, confirmatory testing, exception logs, monitoring of upgrades, variant interpretation and reporting, incidental findings, data storage, version traceability, and data transfer confidentiality.
Biosensors for Sustainable Food Engineering: Challenges and Perspectives.
Neethirajan, Suresh; Ragavan, Vasanth; Weng, Xuan; Chand, Rohit
2018-03-12
Current food production faces tremendous challenges from a growing human population, the maintenance of clean resources and food quality, and the protection of climate and environment. Food sustainability is largely a cooperative effort resulting in technology development supported by both governments and enterprises. Multiple attempts have been promoted to tackle challenges and enhance the drivers of food production. Biosensors and biosensing technologies are being widely applied to the top challenges in food production and its sustainability, and demand for them in this field is consequently growing. Microfluidics represents a technological system integrating multiple technologies. Nanomaterials, with their biosensing applications, are thought to be the most promising tools for dealing with the health, energy, and environmental issues closely tied to world populations. The demand for point-of-care (POC) technologies in this area focuses on rapid, simple, accurate, portable, and low-cost analytical instruments. This review provides current viewpoints from the literature on biosensing in food production, food processing, safety and security, food packaging and supply chain, food waste processing, food quality assurance, and food engineering. The current understanding of progress, solutions, and future challenges, as well as the commercialization of biosensors, is summarized.
Fuzzy control of burnout of multilayer ceramic actuators
NASA Astrophysics Data System (ADS)
Ling, Alice V.; Voss, David; Christodoulou, Leo
1996-08-01
To improve the yield and repeatability of the burnout process of multilayer ceramic actuators (MCAs), an intelligent processing of materials (IPM)-based control system has been developed for the manufacture of MCAs. IPM involves the active (ultimately adaptive) control of a material process, using empirical or analytical models and in situ sensing of critical process states (part features and process parameters) to modify the processing conditions in real time and achieve predefined product goals. Thus, the three enabling technologies for the IPM burnout control system are process modeling, in situ sensing, and intelligent control. This paper presents the design of an IPM-based control strategy for the burnout process of MCAs.
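As a hedged illustration of the kind of rule-based adjustment such a controller might make (the paper's actual models and rule base are not given in the abstract), here is a minimal fuzzy sketch that maps the deviation of the observed binder-removal rate from its setpoint to a heating-rate correction; the membership functions and actions are assumptions.

```python
# Minimal fuzzy-control sketch: heating-rate correction from a rate error.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def heating_rate_adjustment(rate_error):
    """rate_error > 0: binder leaving too fast (cracking risk) -> slow down."""
    mu = {                                   # fuzzify the error (arbitrary units)
        "too_slow": tri(rate_error, -2.0, -1.0, 0.0),
        "ok":       tri(rate_error, -1.0,  0.0, 1.0),
        "too_fast": tri(rate_error,  0.0,  1.0, 2.0),
    }
    actions = {"too_slow": +0.5, "ok": 0.0, "too_fast": -0.5}  # deg C/min deltas
    num = sum(mu[k] * actions[k] for k in mu)
    den = sum(mu.values()) or 1.0
    return num / den                         # centroid-style defuzzification

for e in (-1.5, -0.3, 0.0, 0.8):
    print(f"error={e:+.1f} -> heating-rate change {heating_rate_adjustment(e):+.2f} C/min")
```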
Physical Processes and Real-Time Chemical Measurement of the Insect Olfactory Environment
Abrell, Leif; Hildebrand, John G.
2009-01-01
Odor-mediated insect navigation in airborne chemical plumes is vital to many ecological interactions, including mate finding, flower nectaring, and host locating (where disease transmission or herbivory may begin). After emission, volatile chemicals become rapidly mixed and diluted through physical processes that create a dynamic olfactory environment. This review examines those physical processes and some of the analytical technologies available to characterize those behavior-inducing chemical signals at temporal scales equivalent to the olfactory processing in insects. In particular, we focus on two areas of research that together may further our understanding of olfactory signal dynamics and its processing and perception by insects. First, measurement of physical atmospheric processes in the field can provide insight into the spatiotemporal dynamics of the odor signal available to insects. Field measurements in turn permit aspects of the physical environment to be simulated in the laboratory, thereby allowing careful investigation into the links between odor signal dynamics and insect behavior. Second, emerging analytical technologies with high recording frequencies and field-friendly inlet systems may offer new opportunities to characterize natural odors at spatiotemporal scales relevant to insect perception and behavior. Characterization of the chemical signal environment allows the determination of when and where olfactory-mediated behaviors may control ecological interactions. Finally, we argue that coupling of these two research areas will foster increased understanding of the physicochemical environment and enable researchers to determine how olfactory environments shape insect behaviors and sensory systems. PMID:18548311
Lambertus, Gordon; Shi, Zhenqi; Forbes, Robert; Kramer, Timothy T; Doherty, Steven; Hermiller, James; Scully, Norma; Wong, Sze Wing; LaPack, Mark
2014-01-01
An on-line analytical method based on transmission near-infrared spectroscopy (NIRS) for the quantitative determination of water concentrations (in parts per million) was developed and applied to the manufacture of a pharmaceutical intermediate. Calibration models for water analysis, built at the development site and applied at the manufacturing site, were successfully demonstrated during six manufacturing runs at 250-gallon scale. The water measurements will be used as a forward-processing control point following distillation of a toluene product solution prior to its use in a Grignard reaction. The most significant impact of using this NIRS-based process analytical technology (PAT) to replace off-line measurements is the substantial reduction in the risk of operator exposure through the elimination of sampling of a severely lachrymatory and mutagenic compound. The work described in this report illustrates the development effort from the proof-of-concept phase to manufacturing implementation.
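A minimal sketch of how such a measurement might gate forward processing: a previously built calibration (reduced here to a linear model on a water-band area) predicts parts-per-million water, and the batch is released to the Grignard step only below a limit. The band position, coefficients, and the 500 ppm limit are assumptions, not values from the report.

```python
# Minimal sketch: applying a water calibration as a forward-processing gate.
import numpy as np

WL = np.linspace(1850.0, 2050.0, 201)        # wavelength grid, nm (assumed band region)
SLOPE, INTERCEPT = 820.0, 15.0               # ppm per band-area unit, assumed calibration
WATER_LIMIT_PPM = 500.0                      # assumed release limit

def predict_water_ppm(absorbance):
    band = (WL > 1900.0) & (WL < 1950.0)     # integrate over the water band
    area = absorbance[band].sum() * (WL[1] - WL[0])
    return SLOPE * area + INTERCEPT

spectrum = 0.002 * np.exp(-(((WL - 1925.0) / 12.0) ** 2))   # synthetic water band
ppm = predict_water_ppm(spectrum)
verdict = "release for the Grignard step" if ppm < WATER_LIMIT_PPM else "continue distillation"
print(f"water = {ppm:.0f} ppm -> {verdict}")
```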
A study on building data warehouse of hospital information system.
Li, Ping; Wu, Tao; Chen, Mu; Zhou, Bin; Xu, Wei-guo
2011-08-01
Existing hospital information systems with simple statistical functions cannot meet current management needs. Hospital resources are distributed among separately owned hospitals, as in the case of the regional coordination of medical services. In this study, to integrate and make full use of medical data effectively, we propose a data warehouse modeling method for the hospital information system. The method can also be employed for a distributed-hospital medical service system. To ensure that hospital information supports the diverse needs of health care, the framework of the hospital information system has three layers: a datacenter layer, a system-function layer, and a user-interface layer. This paper discusses the role of a data warehouse management system in handling hospital information, from the establishment of the data themes, to the design of a data model, to the establishment of the data warehouse. Online analytical processing tools support user-friendly multidimensional analysis from a number of different angles to extract the required data and information. Use of the data warehouse improves online analytical processing and mitigates deficiencies in the decision support system. A hospital information system based on a data warehouse effectively employs statistical analysis and data mining technology to handle massive quantities of historical data, and summarizes clinical and hospital information for decision making. This paper proposes the use of a data warehouse for a hospital information system, specifically a warehouse built around hospital information themes, covering the determination of dimensions, data modeling, and so on. The processing of patient information is given as an example that demonstrates the usefulness of this method for hospital information management. Data warehouse technology is evolving, and more and more decision support information extracted by data mining and decision-making technology will be required for further research.
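A tiny illustration of the online analytical processing idea: a star-schema-like fact table of encounters, rolled up along two dimensions with a pivot table. The table contents are invented; the paper's warehouse schema is far richer.

```python
# Minimal OLAP-style roll-up over a toy fact table of hospital encounters.
import pandas as pd

facts = pd.DataFrame({
    "department": ["cardiology", "cardiology", "oncology", "oncology", "oncology"],
    "quarter":    ["2011Q1", "2011Q2", "2011Q1", "2011Q1", "2011Q2"],
    "visits":     [120, 135, 80, 95, 110],
    "cost":       [54000, 61000, 99000, 112000, 125000],
})

cube = facts.pivot_table(index="department", columns="quarter",
                         values=["visits", "cost"], aggfunc="sum", margins=True)
print(cube)        # slice/dice along department x quarter, with totals
```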
Liu, Ronghua; Li, Lian; Yin, Wenping; Xu, Dongbo; Zang, Hengchang
2017-09-15
The fluidized bed granulation and pellet coating technologies are widely used in the pharmaceutical industry because particles made in a fluidized bed have good flowability and compressibility, and the coating thickness of pellets is homogeneous. With the popularization of process analytical technology (PAT), real-time analysis of critical quality attributes (CQAs) has been receiving more attention. Near-infrared (NIR) spectroscopy, as a PAT tool, can realize real-time monitoring and control during the granulating and coating processes, which can optimize the manufacturing processes. This article reviews the application of NIR spectroscopy to CQA monitoring (moisture content, particle size, and tablet/pellet thickness) during fluidized bed granulation and coating processes. Through this review, we would like to provide references for realizing automated control and intelligent production in fluidized bed granulation and pellet coating in the pharmaceutical industry. Copyright © 2017 Elsevier B.V. All rights reserved.
Santos, R S; Malheiros, S M F; Cavalheiro, S; de Oliveira, J M Parente
2013-03-01
Cancer is the leading cause of death in economically developed countries and the second leading cause of death in developing countries. Malignant brain neoplasms are among the most devastating and incurable forms of cancer, and their treatment may be excessively complex and costly. Public health decision makers require significant amounts of analytical information to manage public treatment programs for these patients. Data mining, a technology that is used to produce analytically useful information, has been employed successfully with medical data. However, the large-scale adoption of this technique has been limited thus far because it is difficult to use, especially for non-expert users. One way to facilitate data mining by non-expert users is to automate the process. Our aim is to present an automated data mining system that allows public health decision makers to access analytical information regarding brain tumors. The emphasis in this study is the use of ontology in an automated data mining process. The non-experts who tried the system obtained useful information about the treatment of brain tumors. These results suggest that future work should be conducted in this area. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Petchkovsky, Leon
2017-06-01
Analytical psychology shares with many other psychotherapies the important task of repairing the consequences of developmental trauma. The majority of analytic patients come from compromised early developmental backgrounds: they may have experienced neglect, abuse, or failures of empathic resonance from their carers. Functional brain imagery techniques including Quantitative Electroencephalogram (QEEG), and functional Magnetic Resonance Imagery (fMRI), allow us to track mental processes in ways beyond verbal reportage and introspection. This independent perspective is useful for developing new psychodynamic hypotheses, testing current ones, providing diagnostic markers, and monitoring treatment progress. Jung, with the Word Association Test, grasped these principles 100 years ago. Brain imaging techniques have contributed to powerful recent advances in our understanding of neurodevelopmental processes in the first three years of life. If adequate nurturance is compromised, a range of difficulties may emerge. This has important implications for how we understand and treat our psychotherapy clients. The paper provides an overview of functional brain imaging and advances in developmental neuropsychology, and looks at applications of some of these findings (including neurofeedback) in the Jungian psychotherapy domain. © 2017, The Society of Analytical Psychology.
NASA Astrophysics Data System (ADS)
Devrient, M.; Da, X.; Frick, T.; Schmidt, M.
Laser transmission welding is a well-known joining technology for thermoplastics. Because of the need for lightweight, cost-effective, and green production, thermoplastics are usually filled with glass fibers. These fibers lead to higher absorption and more scattering within the upper joining partner, with a negative influence on the welding process. Here, an experimental method for characterizing the scattering behavior of semi-crystalline thermoplastics filled with short glass fibers is introduced, together with a finite element model of the welding process capable of considering scattering, as well as an analytical model. The experimental data are used for the numerical and analytical investigation of laser transmission welding under consideration of scattering. The scattering effects of several thermoplastics on the calculated temperature fields as well as the weld seam geometries are quantified.
Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data
NASA Astrophysics Data System (ADS)
Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.
2014-12-01
Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). In order for ASDC data scientists to effectively implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with the data contained in the datasets with which they are working. Key technologies that are critical components of the background of ASDC data scientists include: large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; and processing algorithms. However, as the prices of data storage and processing decrease, sources of data increase, and technologies advance, giving more people access to data in real or near-real time, data scientists are being pressed to accelerate their ability to identify and analyze vast amounts of data. With existing tools this is becoming exceedingly more challenging to accomplish. For example, the NASA Earth Science Data and Information System (ESDIS) alone grew from just over 4 PB of data in 2009 to nearly 6 PB in 2011, and then to roughly 10 PB in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater to business data, which is predominantly unstructured. Unfortunately, there are very few known analytics tools that interface well to archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify specific tools that may be able to address those challenges.
NASA Astrophysics Data System (ADS)
Hoefer, Christoph; Santner, Jakob; Borisov, Sergey; Kreuzeder, Andreas; Wenzel, Walter; Puschenreiter, Markus
2015-04-01
Two dimensional chemical imaging of root processes refers to novel in situ methods to investigate and map solutes at high spatial resolution (sub-mm). The visualization of these solutes reveals new insights into soil biogeochemistry and root processes. We derive chemical images using data from DGT-LA-ICP-MS (Diffusive Gradients in Thin Films and Laser Ablation Inductively Coupled Plasma Mass Spectrometry) and POS (Planar Optode Sensors). Both technologies have shown promising results when applied in aqueous environments but need to be refined and improved for imaging at the soil-plant interface. Co-localized mapping using combined DGT and POS technologies and the development of new gel combinations are our focus. DGT samplers are thin (<0.4 mm) hydrogels containing a binding resin for the targeted analytes (e.g. trace metals, phosphate, sulphide or radionuclides). The measurement principle is passive and diffusion based: the analytes diffuse into the gel and are bound by the resin, which thereby acts as a zero sink. After application, DGTs are retrieved, dried, and analysed using LA-ICP-MS. The data are then normalized by an internal standard (e.g. 13C) and calibrated using in-house standards, and chemical images of the target area are plotted using imaging software. POS are, similar to DGT, thin sensor foils containing a fluorophore coating chosen for the target analyte. The measurement principle is based on excitation of the fluorophore at a specific wavelength and emission by the fluorophore depending on the presence of the analyte. The emitted signal is captured using optical filters and a DSLR camera. While DGT analysis is destructive, POS measurements can be performed continuously during the application. Both semi-quantitative techniques allow in situ application to visualize chemical processes directly at the soil-plant interface. Here, we present a summary of results from rhizotron experiments with different plants in metal-contaminated and agricultural soils.
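A minimal sketch of the DGT-LA-ICP-MS data pipeline described above: a raster of analyte counts is normalized by the 13C internal standard, then converted to a concentration image with a linear calibration from in-house gel standards. Array sizes, count rates, and calibration constants are invented.

```python
# Minimal sketch: internal-standard normalization and calibration of an
# LA-ICP-MS raster into a chemical image.
import numpy as np

rng = np.random.default_rng(1)
zn66 = rng.poisson(5000, size=(40, 60)).astype(float)   # analyte counts per pixel
c13 = rng.normal(2.0e5, 1.0e4, size=(40, 60))           # internal-standard counts

norm = zn66 / c13                          # corrects for ablation-yield drift
slope, intercept = 3.2e4, 0.0              # (ug/cm^2) per normalized unit, assumed
image = slope * norm + intercept           # accumulated mass per area, per pixel

print(f"mean loading: {image.mean():.2f} ug/cm^2, hotspot max: {image.max():.2f}")
```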
Sample and data processing considerations for the NIST quantitative infrared database
NASA Astrophysics Data System (ADS)
Chu, Pamela M.; Guenther, Franklin R.; Rhoderick, George C.; Lafferty, Walter J.; Phillips, William
1999-02-01
Fourier-transform infrared (FT-IR) spectrometry has become a useful real-time in situ analytical technique for quantitative gas phase measurements. In fact, the U.S. Environmental Protection Agency (EPA) has recently approved open-path FT-IR monitoring for the determination of hazardous air pollutants (HAP) identified in EPA's Clean Air Act of 1990. To support infrared based sensing technologies, the National Institute of Standards and Technology (NIST) is currently developing a standard quantitative spectral database of the HAPs based on gravimetrically prepared standard samples. The procedures developed to ensure the quantitative accuracy of the reference data are discussed, including sample preparation, residual sample contaminants, data processing considerations, and estimates of error.
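The quantitation underlying such a reference database ultimately rests on the Beer-Lambert law, A = epsilon * c * l. A minimal worked inversion, with an assumed absorptivity and path length rather than NIST values:

```python
# Minimal Beer-Lambert inversion: concentration from a measured band absorbance.
eps = 2.0e-4                   # base-10 absorptivity, (ppm*cm)^-1, assumed
path = 10.0                    # absorption path length, cm
absorbance = 0.18              # measured band absorbance

conc_ppm = absorbance / (eps * path)   # A = eps * c * l  ->  c = A / (eps * l)
print(f"estimated concentration: {conc_ppm:.0f} ppm")
```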
Software Analytical Instrument for Assessment of the Process of Casting Slabs
NASA Astrophysics Data System (ADS)
Franěk, Zdeněk; Kavička, František; Štětina, Josef; Masarik, Miloš
2010-06-01
The paper describes an original approach to the design and function of software for assessing the process of casting slabs. The program system LITIOS was developed and implemented at EVRAZ Vitkovice Steel Ostrava on the equipment for continuous casting of steel (ECC). This program system works on a data warehouse of technological casting parameters and slab quality parameters. It enables an ECC technologist to analyze the course of casting a melt and, using statistical methods, to determine the influence of individual technological parameters on the quality of the final slabs. The system also enables long-term monitoring and optimization of production.
Biosensors-on-chip: a topical review
NASA Astrophysics Data System (ADS)
Chen, Sensen; Shamsi, Mohtashim H.
2017-08-01
This review will examine the integration of two fields that are currently at the forefront of science, i.e. biosensors and microfluidics. As a lab-on-a-chip (LOC) technology, microfluidics has been enriched by the integration of various detection tools for analyte detection and quantitation. The application of such microfluidic platforms is greatly increased in the area of biosensors geared towards point-of-care diagnostics. Together, the merger of microfluidics and biosensors has generated miniaturized devices for sample processing and sensitive detection with quantitation. We believe that microfluidic biosensors (biosensors-on-chip) are essential for developing robust and cost effective point-of-care diagnostics. This review is relevant to a variety of disciplines, such as medical science, clinical diagnostics, LOC technologies including MEMs/NEMs, and analytical science. Specifically, this review will appeal to scientists working in the two overlapping fields of biosensors and microfluidics, and will also help new scientists to find their directions in developing point-of-care devices.
Evolution of microbiological analytical methods for dairy industry needs
Sohier, Danièle; Pavan, Sonia; Riou, Armelle; Combrisson, Jérôme; Postollec, Florence
2014-01-01
Traditionally, culture-based methods have been used to enumerate microbial populations in dairy products. Recent developments in molecular methods now enable faster and more sensitive analyses than classical microbiology procedures. These molecular tools allow a detailed characterization of cell physiological states and bacterial fitness and thus, offer new perspectives to integration of microbial physiology monitoring to improve industrial processes. This review summarizes the methods described to enumerate and characterize physiological states of technological microbiota in dairy products, and discusses the current deficiencies in relation to the industry’s needs. Recent studies show that Polymerase chain reaction-based methods can successfully be applied to quantify fermenting microbes and probiotics in dairy products. Flow cytometry and omics technologies also show interesting analytical potentialities. However, they still suffer from a lack of validation and standardization for quality control analyses, as reflected by the absence of performance studies and official international standards. PMID:24570675
NASA Technical Reports Server (NTRS)
Seshan, P. K.; Ferrall, Joseph F.; Rohatgi, Naresh K.
1991-01-01
Several alternative configurations of life-support systems (LSSs) for a Mars mission are compared analytically on a quantitative basis in terms of weight, volume, and power. A baseline technology set is utilized to illustrate systems including totally open-loop, carbon dioxide removal only, partially closed-loop, and totally closed-loop configurations. The analytical model takes advantage of a modular, top-down hierarchical breakdown of LSS subsystems into functional elements that represent individual processing technologies. The open-loop systems are not competitive in terms of weight for either long-duration orbiters or short-duration lander vehicles, and power demands are lowest with the open loop and highest with the closed loop. The closed-loop system can reduce vehicle weight by over 70,000 lb and thereby overcome the power penalty of 1600 W; the closed-loop variant is championed as the preferred system for a Mars expedition.
Metabolomic Technologies for Improving the Quality of Food: Practice and Promise.
Johanningsmeier, Suzanne D; Harris, G Keith; Klevorn, Claire M
2016-01-01
It is now well documented that the diet has a significant impact on human health and well-being. However, the complete set of small molecule metabolites present in foods that make up the human diet and the role of food production systems in altering this food metabolome are still largely unknown. Metabolomic platforms that rely on nuclear magnetic resonance (NMR) and mass spectrometry (MS) analytical technologies are being employed to study the impact of agricultural practices, processing, and storage on the global chemical composition of food; to identify novel bioactive compounds; and for authentication and region-of-origin classifications. This review provides an overview of the current terminology, analytical methods, and compounds associated with metabolomic studies, and provides insight into the application of metabolomics to generate new knowledge that enables us to produce, preserve, and distribute high-quality foods for health promotion.
Moghimi, Fatemeh Hoda; Cheung, Michael; Wickramasinghe, Nilmini
2013-01-01
Healthcare is an information rich industry where successful outcomes require the processing of multi-spectral data and sound decision making. The exponential growth of data and big data issues coupled with a rapid increase of service demands in healthcare contexts today, requires a robust framework enabled by IT (information technology) solutions as well as real-time service handling in order to ensure superior decision making and successful healthcare outcomes. Such a context is appropriate for the application of real time intelligent risk detection decision support systems using predictive analytic techniques such as data mining. To illustrate the power and potential of data science technologies in healthcare decision making scenarios, the use of an intelligent risk detection (IRD) model is proffered for the context of Congenital Heart Disease (CHD) in children, an area which requires complex high risk decisions that need to be made expeditiously and accurately in order to ensure successful healthcare outcomes.
Identification of Technologies for Provision of Future Aeronautical Communications
NASA Technical Reports Server (NTRS)
Gilbert, Tricia; Dyer, Glen; Henriksen, Steve; Berger, Jason; Jin, Jenny; Boci, Tony
2006-01-01
This report describes the process, findings, and recommendations of the second of three phases of the Future Communications Study (FCS) technology investigation conducted by NASA Glenn Research Center and ITT Advanced Engineering & Sciences Division for the Federal Aviation Administration (FAA). The FCS is a collaborative research effort between the FAA and Eurocontrol to address frequency congestion and spectrum depletion for safety-critical air-ground communications. The goal of the technology investigation is to identify technologies that can support the long-term aeronautical mobile communication operating concept. A derived set of evaluation criteria traceable to the operating concept document is presented. An adaptation of the analytical hierarchy process is described and recommended for selecting candidates for detailed evaluation. Evaluations of a subset of technologies brought forward from the prescreening process are provided. Five of those are identified as candidates with the highest potential for continental airspace solutions in L-band (P-34, W-CDMA, LDL, B-VHF, and E-TDMA). Additional technologies are identified as best performers in the unique environments of remote/oceanic airspace in the satellite bands (Inmarsat SBB and a custom satellite solution) and the airport flight domain in C-band (802.16e). Details of the evaluation criteria, channel models, and the technology evaluations are provided in appendixes.
ERIC Educational Resources Information Center
Sadiig, I. Ahmed M. J.
2005-01-01
The traditional learning paradigm involving face-to-face interaction with students is shifting to highly data-intensive electronic learning with the advances in Information and Communication Technology. An important component of the e-learning process is the delivery of the learning contents to their intended audience over a network. A distributed…
Robert L. Smith; Robert J. Bush; Daniel L. Schmoldt
1995-01-01
Bridge design engineers and local highway officials make bridge replacement decisions across the United States. The Analytical Hierarchy Process was used to characterize the bridge material selection decision of these individuals. State Department of Transportation engineers, private consulting engineers, and local highway officials were personally interviewed in...
ERIC Educational Resources Information Center
Lu, Owen H. T.; Huang, Jeff C. H.; Huang, Anna Y. Q.; Yang, Stephen J. H.
2017-01-01
As information technology continues to evolve rapidly, programming skills become increasingly crucial. To build superb programming skills, training must begin before college or even senior high school. However, when developing comprehensive training programs, the learning and teaching processes must be considered. In order to…
Development of airframe design technology for crashworthiness.
NASA Technical Reports Server (NTRS)
Kruszewski, E. T.; Thomson, R. G.
1973-01-01
This paper describes the NASA portion of a joint FAA-NASA General Aviation Crashworthiness Program leading to the development of improved crashworthiness design technology. The objectives of the program are to develop analytical technology for predicting crashworthiness of structures, provide design improvements, and perform full-scale crash tests. The analytical techniques which are being developed both in-house and under contract are described, and typical results from these analytical programs are shown. In addition, the full-scale testing facility and test program are discussed.
Langemann, Timo; Mayr, Ulrike Beate; Meitz, Andrea; Lubitz, Werner; Herwig, Christoph
2016-01-01
Flow cytometry (FCM) is a tool for the analysis of single-cell properties in a cell suspension. In this contribution, we present an improved FCM method for the assessment of E-lysis in Enterobacteriaceae. The result of the E-lysis process is empty bacterial envelopes, called bacterial ghosts (BGs), that constitute potential products in the pharmaceutical field. BGs have reduced light-scattering properties when compared with intact cells. In combination with viability information obtained by staining samples with the membrane potential-sensitive fluorescent dye bis-(1,3-dibutylbarbituric acid) trimethine oxonol (DiBAC4(3)), the presented method allows differentiation between populations of viable cells, dead cells, and BGs. Using a second fluorescent dye, RH414, as a membrane marker, non-cellular background was excluded from the data, which greatly improved the quality of the results. Using true volumetric absolute counting, the FCM data correlated well with cell count data obtained from colony-forming units (CFU) for viable populations. Applicability of the method to several Enterobacteriaceae (different Escherichia coli strains, Salmonella typhimurium, Shigella flexneri 2a) was demonstrated. The method was validated as a resilient process analytical technology (PAT) tool for the assessment of E-lysis and for particle counting during 20-l batch processes for the production of Escherichia coli Nissle 1917 BGs.
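A minimal sketch of the gating logic described: events are first screened for membrane material with RH414 to exclude non-cellular background, then split into ghosts (reduced light scatter) and dead versus viable cells (DiBAC4(3) uptake), with counts reported per microliter via volumetric counting. The synthetic event cloud and all thresholds are assumptions.

```python
# Minimal sketch: classifying synthetic FCM events into viable/dead/ghost
# populations and reporting volumetric counts.
import numpy as np

rng = np.random.default_rng(7)
n = 30_000
pop = rng.choice(["viable", "dead", "ghost"], n, p=[0.5, 0.2, 0.3])

fsc_mu = {"viable": 4.0, "dead": 4.0, "ghost": 3.2}     # ghosts scatter less
dibac_mu = {"viable": 2.0, "dead": 4.5, "ghost": 2.0}   # dead cells take up dye
fsc = rng.lognormal([fsc_mu[p] for p in pop], 0.25)
dibac = rng.lognormal([dibac_mu[p] for p in pop], 0.30)
rh414 = rng.lognormal(4.0, 0.30, n)                     # membrane stain

cellular = rh414 > 20.0                   # RH414+ -> exclude non-cellular debris
is_ghost = cellular & (fsc < 40.0)        # reduced light scatter
is_dead = cellular & ~is_ghost & (dibac > 60.0)
is_viable = cellular & ~is_ghost & (dibac <= 60.0)

sample_volume_ul = 50.0                   # true volumetric absolute counting
for name, mask in [("viable", is_viable), ("dead", is_dead), ("ghosts", is_ghost)]:
    print(f"{name}: {mask.sum() / sample_volume_ul:.0f} events/uL")
```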
Electron Beam Melting and Refining of Metals: Computational Modeling and Optimization
Vutova, Katia; Donchev, Veliko
2013-01-01
Computational modeling offers an opportunity for better understanding and investigation of thermal transfer mechanisms. It can be used to optimize the electron beam melting process and to obtain new materials with improved characteristics that have many applications in the power industry, medicine, instrument engineering, electronics, etc. A time-dependent 3D axisymmetric heat model for the simulation of thermal transfer in metal ingots solidified in a water-cooled crucible during electron beam melting and refining (EBMR) is developed. The model predicts the change in the temperature field in the cast ingot during the interaction of the beam with the material. A modified Peaceman-Rachford numerical scheme to discretize the analytical model is developed. The equation systems describing the thermal processes and the main characteristics of the developed numerical method are presented. In order to optimize the technological regimes, different criteria for better refinement and for obtaining dendritic crystal structures are proposed. Analytical problems of mathematical optimization are formulated, discretized, and heuristically solved by cluster methods. Using simulation results that are important for practice, suggestions can be made for the optimization of EBMR technology. The proposed tool is important and useful for the study, control, and optimization of EBMR process parameters and for improving the quality of the newly produced materials. PMID:28788351
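For concreteness, here is a minimal Peaceman-Rachford ADI step for the 2D heat equation u_t = alpha (u_xx + u_yy) with fixed-temperature boundaries, the kind of scheme the abstract builds on; the grid size, time step, and hot-spot initial condition are illustrative.

```python
# Minimal ADI sketch: alternating implicit x- and y-sweeps for 2D diffusion.
import numpy as np
from scipy.linalg import solve_banded

n, alpha, dt = 21, 1.0, 1e-3             # grid points per side, diffusivity, time step
h = 1.0 / (n - 1)
r = alpha * dt / (2 * h * h)

def tridiag_solve(rhs):
    """Solve (I - r*D2) x = rhs along axis 0, one system per column."""
    m = rhs.shape[0]
    ab = np.zeros((3, m))
    ab[0, 1:] = -r                        # superdiagonal
    ab[1, :] = 1 + 2 * r                  # diagonal
    ab[2, :-1] = -r                       # subdiagonal
    return solve_banded((1, 1), ab, rhs)

def adi_step(u):
    v = u.copy()
    # half-step 1: implicit in x, explicit in y (boundaries stay at 0)
    rhs = u[1:-1, 1:-1] + r * (u[1:-1, 2:] - 2 * u[1:-1, 1:-1] + u[1:-1, :-2])
    v[1:-1, 1:-1] = tridiag_solve(rhs)
    # half-step 2: implicit in y, explicit in x
    w = v.copy()
    rhs = v[1:-1, 1:-1] + r * (v[2:, 1:-1] - 2 * v[1:-1, 1:-1] + v[:-2, 1:-1])
    w[1:-1, 1:-1] = tridiag_solve(rhs.T).T
    return w

u = np.zeros((n, n))
u[n // 2, n // 2] = 100.0                 # hot spot, e.g. the beam impact point
for _ in range(50):
    u = adi_step(u)
print(f"peak temperature after 50 steps: {u.max():.3f}")
```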
Exploration Laboratory Analysis
NASA Technical Reports Server (NTRS)
Krihak, M.; Ronzano, K.; Shaw, T.
2016-01-01
The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) element in reducing the risk of adverse health outcomes and decrements in performance due to limited in-flight medical capabilities on human exploration missions. To mitigate this risk, the availability of in-flight laboratory analysis instrumentation has been identified as an essential capability for manned exploration missions. Since a single, compact, space-ready laboratory analysis capability able to perform all exploration clinical measurements is not commercially available, the ELA project objective is to demonstrate the feasibility of emerging operational and analytical capabilities as a biomedical diagnostics precursor to long-duration manned exploration missions. The initial step towards ground and flight demonstrations in fiscal year (FY) 2015 was the downselection of platform technologies for demonstration in the space environment. The technologies selected included two Small Business Innovation Research (SBIR) performers: DNA Medicine Institute's rHEALTH X, and Intelligent Optical System's lateral flow assays combined with Holomic's smartphone analyzer. The selection of these technologies was based on their compact size, breadth of analytical capability, and favorable ability to process fluids in a space environment, among several factors. These two technologies will be advanced to meet ground and flight demonstration success criteria and requirements. The technology demonstrations and metrics for success will be finalized in FY16. The downselected performers will also continue the technology development phase towards meeting prototype deliverables in late 2016 or 2017.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szoka de Valladares, M.R.; Mack, S.
The DOE Hydrogen Program needs to develop criteria as part of a systematic evaluation process for proposal identification, evaluation, and selection. The H Scan component of this process provides a framework in which a project proposer can fully describe their candidate technology system and its components. The H Scan complements traditional methods of capturing cost and technical information. It consists of a special set of survey forms designed to elicit information so expert reviewers can assess the proposal relative to DOE-specified selection criteria. The Analytic Hierarchy Process (AHP) component of the decision process assembles the management-defined evaluation and selection criteria into a coherent multi-level decision construct by which projects can be evaluated in pair-wise comparisons. The AHP model will reflect management's objectives, and it will assist in the ranking of individual projects based on the extent to which each contributes to management's objectives. This paper contains a detailed description of the products and activities associated with the planning and evaluation process: the objectives or criteria; the H Scan; and the Analytic Hierarchy Process (AHP).
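A minimal sketch of the hierarchy-composition step in such an AHP model: local project priorities under each criterion are combined with the criterion weights to give global rankings. All weights and scores below are invented for illustration.

```python
# Minimal AHP hierarchy-composition sketch: global priorities from local ones.
import numpy as np

criteria = ["cost", "technical merit", "program fit"]
w_criteria = np.array([0.25, 0.45, 0.30])        # e.g. from pairwise comparisons

# local priorities of three candidate projects under each criterion
# (each column sums to 1)
local = np.array([[0.50, 0.20, 0.40],
                  [0.30, 0.50, 0.35],
                  [0.20, 0.30, 0.25]])

global_priority = local @ w_criteria             # weight and sum across criteria
for name, p in zip(["project A", "project B", "project C"], global_priority):
    print(f"{name}: {p:.3f}")
```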
Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk
2011-08-01
A set of 192 fluid bed granulation batches at industrial scale was in-line monitored using microwave resonance technology (MRT) to determine moisture, temperature, and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA), and multivariate batch control charts were applied to the collected batch data sets. The combination of all these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonality effect could be put into evidence that impacted further processing through its influence on the final granule size. Moreover, it was demonstrated by means of a PLS model that a quantitative relation between the particle size and the MRT measurements can be defined, highlighting the potential ability of the MRT sensor to predict information about the final granule size. This study has contributed to improving a fluid bed granulation process, and the process knowledge obtained shows that product quality can be built in during process design, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles. Copyright © 2011. Published by Elsevier B.V.
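A minimal sketch of multivariate batch monitoring of this kind: batch trajectories are unfolded batch-wise, a PCA model is fitted, and batches whose Hotelling T-squared exceeds an F-distribution control limit are flagged. The data are synthetic random walks, not MRT measurements.

```python
# Minimal sketch: multiway-PCA batch control chart via Hotelling T^2.
import numpy as np
from scipy.stats import f as f_dist
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
n_batches, n_vars, n_time = 60, 3, 100
X = rng.standard_normal((n_batches, n_vars, n_time)).cumsum(axis=2)
X = X.reshape(n_batches, -1)                 # batch-wise unfolding

Z = (X - X.mean(0)) / (X.std(0) + 1e-9)      # autoscale each variable/time point
a = 3                                        # retained principal components
pca = PCA(n_components=a).fit(Z)
scores = pca.transform(Z)
t2 = ((scores ** 2) / pca.explained_variance_).sum(axis=1)   # Hotelling T^2

n = n_batches                                # 95% control limit (F-distribution form)
limit = a * (n - 1) * (n + 1) / (n * (n - a)) * f_dist.ppf(0.95, a, n - a)
print("batches above the T^2 limit:", np.where(t2 > limit)[0])
```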
Internet-based data warehousing
NASA Astrophysics Data System (ADS)
Boreisha, Yurii
2001-10-01
In this paper, we consider the process of data warehouse creation and population using the latest Internet and database access technologies. The logical three-tier model is applied. This approach allows the development of an enterprise schema by analyzing the various processes in the organization and extracting the relevant entities and relationships from them. Integration with local schemas and population of the data warehouse are done through the corresponding user, business, and data services components. The hierarchy of these components is used to hide the complexity of the online analytical processing functionality from the data warehouse users.
NASA Astrophysics Data System (ADS)
Bellini, Anna
Customer-driven product customization and continued demand for cost and time savings have generated a renewed interest in agile manufacturing based on improvements in Rapid Prototyping (RP) technologies. The advantages of RP technologies are: (1) ability to shorten the product design and development time, (2) suitability for automation and decrease in the level of human intervention, (3) ability to build many geometrically complex shapes. A shift from "prototyping" to "manufacturing" necessitates the following improvements: (1) flexibility in choice of materials; (2) part integrity and built-in characteristics to meet performance requirements; (3) dimensional stability and tolerances; (4) improved surface finish. A project funded by ONR has been undertaken to develop an agile manufacturing technology for fabrication of ceramic and multi-component parts to meet various needs of the Navy, such as transducers. The project is based on adaptation of a layered manufacturing concept, since the program required that the new technology be developed based on a commercially available RP technology. Among the various RP technologies available today, Fused Deposition Modeling (FDM) has been identified as the focus of this research because of its potential versatility in the choice of materials and deposition configuration. This innovative approach allows for designing and implementing highly complex internal architectures into parts through deposition of different materials in a variety of configurations, in such a way that the finished product exhibits characteristics that meet the performance requirements. This implies that, in principle, one can tailor-make the assembly of materials and structures per the specifications of an optimum design. The program objectives can be achieved only through accurate process modeling and modeling of material behavior. Oftentimes, process modeling is based on some type of computational approach, whereas modeling of material behavior is based on extensive experimental investigations. Studies are conducted in the following categories: (1) flow modeling during extrusion and deposition; (2) thermal modeling; (3) flow control during deposition; (4) product characterization and property determination for dimensional analysis; (5) development of a novel technology based on a mini-extrusion system. Studies in each of these stages have involved experimental as well as analytical approaches to develop a comprehensive model.
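As one hedged illustration of the thermal modeling category above (not the project's actual model), the following sketch uses a lumped-capacitance estimate of how a freshly deposited filament road cools toward the build-envelope temperature; all material properties and temperatures are hypothetical.

```python
import numpy as np

# Illustrative lumped-capacitance cooling model for a freshly deposited
# filament "road" (all property values hypothetical): T(t) decays
# exponentially toward the envelope temperature with time constant
# tau = rho * c_p * (V/A) / h, where V/A = d/4 for a long cylinder.
rho, cp = 1050.0, 2000.0        # density (kg/m^3), specific heat (J/kg-K)
h = 60.0                        # convective coefficient (W/m^2-K)
d = 0.3e-3                      # road diameter (m)
tau = rho * cp * (d / 4.0) / h

T_dep, T_env = 270.0, 70.0      # deposition and envelope temperatures (C)
t = np.linspace(0.0, 2.0, 5)    # seconds after deposition
T = T_env + (T_dep - T_env) * np.exp(-t / tau)
print("tau (s):", round(tau, 3), "T(t) (C):", T.round(1))
```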
Cost analysis of advanced turbine blade manufacturing processes
NASA Technical Reports Server (NTRS)
Barth, C. F.; Blake, D. E.; Stelson, T. S.
1977-01-01
A rigorous analysis was conducted to estimate relative manufacturing costs for high technology gas turbine blades prepared by three candidate materials process systems. The manufacturing costs for the same turbine blade configuration of directionally solidified eutectic alloy, an oxide dispersion strengthened superalloy, and a fiber reinforced superalloy were compared on a relative basis to the costs of the same blade currently in production utilizing the directional solidification process. An analytical process cost model was developed to perform the cost comparisons quantitatively. The impact of individual process yield factors on costs was also assessed, as were the effects of process parameters, raw materials, labor rates, and consumable items.
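The paper's cost model is not reproduced here, but the way yield factors inflate cost can be sketched: each step's cost must be amortized over the parts that survive it and every downstream step. The step costs and yields below are invented for illustration.

```python
# Illustrative yield-adjusted cost model (not the authors' actual model):
# cost per good part = sum over steps of (step cost) divided by the
# cumulative yield of that step and all downstream steps, because
# scrapped parts absorb the spending already invested in them.
def cost_per_good_part(step_costs, step_yields):
    total = 0.0
    downstream_yield = 1.0
    # Walk the process backwards, accumulating downstream yield.
    for c, y in zip(reversed(step_costs), reversed(step_yields)):
        downstream_yield *= y
        total += c / downstream_yield
    return total

# Hypothetical numbers: baseline directionally solidified blade vs. a
# candidate process with a costlier, lower-yield first step.
baseline = cost_per_good_part([100.0, 40.0, 25.0], [0.85, 0.95, 0.98])
candidate = cost_per_good_part([160.0, 40.0, 25.0], [0.70, 0.95, 0.98])
print("relative cost of candidate:", round(candidate / baseline, 2))
```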
Selecting appropriate wastewater treatment technologies using a choosing-by-advantages approach.
Arroyo, Paz; Molinos-Senante, María
2018-06-01
Selecting the most sustainable wastewater treatment (WWT) technology among possible alternatives is a very complex task because the choice must integrate economic, environmental, and social criteria. Traditionally, several multi-criteria decision-making approaches have been applied, the most often used being the analytic hierarchy process (AHP). However, AHP allows users to offset poor environmental and/or social performance with low cost. To overcome this limitation, our study examines a choosing-by-advantages (CBA) approach to rank seven WWT technologies for secondary WWT. CBA results were compared with results obtained by using the AHP approach. The rankings of WWT alternatives differed depending on whether the CBA or AHP approach was used, which highlights the importance of the method used to support decision-making processes, particularly ones that rely on subjective interpretations by experts. This paper uses a holistic perspective to demonstrate the benefits of using the CBA approach to support a decision-making process when a group of experts must come to a consensus in selecting the most suitable WWT technology among several available. Copyright © 2017 Elsevier B.V. All rights reserved.
A visual analytic framework for data fusion in investigative intelligence
NASA Astrophysics Data System (ADS)
Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David
2014-05-01
Intelligence analysis depends on data fusion systems to provide capabilities for detecting and tracking important objects, events, and their relationships in connection with an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend of increasing sophistication of data fusion algorithms and the loss of transparency in the data fusion process, analysts are left out of the data fusion process cycle with little to no control over, or confidence in, the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework towards developing an alternative data fusion architecture. This idea is inspired by the recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, where the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or knowledge of the analyst's mental state in an investigation. The success of this effort will result in next generation data fusion systems that can be better trusted while maintaining high throughput.
Dinov, Ivo D
2016-01-01
Managing, processing and understanding big healthcare data is challenging, costly and demanding. Without a robust fundamental theory for representation, analysis and inference, a roadmap for uniform handling and analyzing of such complex data remains elusive. In this article, we outline various big data challenges, opportunities, modeling methods and software techniques for blending complex healthcare data, advanced analytic tools, and distributed scientific computing. Using imaging, genetic and healthcare data we provide examples of processing heterogeneous datasets using distributed cloud services, automated and semi-automated classification techniques, and open-science protocols. Despite substantial advances, new innovative technologies need to be developed that enhance, scale and optimize the management and processing of large, complex and heterogeneous data. Stakeholder investments in data acquisition, research and development, computational infrastructure and education will be critical to realize the huge potential of big data, to reap the expected information benefits and to build lasting knowledge assets. Multi-faceted proprietary, open-source, and community developments will be essential to enable broad, reliable, sustainable and efficient data-driven discovery and analytics. Big data will affect every sector of the economy and its hallmark will be 'team science'.
Piva, Elisa; Tosato, Francesca; Plebani, Mario
2015-12-07
Most errors in laboratory medicine occur in the pre-analytical phase of the total testing process. Phlebotomy, a crucial step in the pre-analytical phase influencing laboratory results and patient outcome, calls for quality assurance procedures and automation in order to prevent errors and ensure patient safety. We compared the performance of a new small, automated device, the ProTube Inpeco, designed for use in phlebotomy with complete traceability of the process, with a centralized automated system, BC ROBO. ProTube was used for 15,010 patients undergoing phlebotomy, with 48,776 tubes being labeled. The mean time and standard deviation (SD) for blood sampling was 3:03 (min:sec; SD ± 1:24) when using ProTube, against 5:40 (min:sec; SD ± 1:57) when using BC ROBO. The mean number of patients per hour managed at each phlebotomy point was 16 ± 3 with ProTube, and 10 ± 2 with BC ROBO. No tubes were labeled incorrectly, even though process failure occurred in 2.8% of cases when ProTube was used. Thanks to its cutting-edge technology, the ProTube has many advantages over BC ROBO, above all in verifying patient identity and in allowing a reduction in both identification errors and tube mislabeling.
Enhancing mHealth Technology in the PCMH Environment to Activate Chronic Care Patients
2016-09-01
Appendices include abstracts for AMSUS Posters #1 and #2 and sample PowerPoint slides from the mCare product. Phase II research will consider design and process requirements (e.g., interface with wireless communication providers, visualization capabilities and options, data analytic structure).
Marshall Space Flight Center's Virtual Reality Applications Program 1993
NASA Technical Reports Server (NTRS)
Hale, Joseph P., II
1993-01-01
A Virtual Reality (VR) applications program has been under development at the Marshall Space Flight Center (MSFC) since 1989. Other NASA Centers, most notably Ames Research Center (ARC), have contributed to the development of the VR enabling technologies and VR systems. This VR technology development has now reached a level of maturity where specific applications of VR as a tool can be considered. The objectives of the MSFC VR Applications Program are to develop, validate, and utilize VR as a Human Factors design and operations analysis tool and to assess and evaluate VR as a tool in other applications (e.g., training, operations development, mission support, teleoperations planning, etc.). The long-term goals of this technology program are to enable specialized Human Factors analyses earlier in the hardware and operations development process and to develop more effective training and mission support systems. The capability to perform specialized Human Factors analyses earlier in the hardware and operations development process is required to better refine and validate requirements during the requirements definition phase. This leads to a more efficient design process where perturbations caused by late-occurring requirements changes are minimized. A validated set of VR analytical tools must be developed to enable a more efficient process for the design and development of space systems and operations. Similarly, training and mission support systems must exploit state-of-the-art computer-based technologies to maximize training effectiveness and enhance mission support. The approach of the VR Applications Program is to develop and validate appropriate virtual environments and associated object kinematic and behavior attributes for specific classes of applications. These application-specific environments and associated simulations will be validated, where possible, through empirical comparisons with existing, accepted tools and methodologies. These validated VR analytical tools will then be available for use in the design and development of space systems and operations and in training and mission support systems.
Frisse, Mark E
2016-04-01
New mobile devices, social networks, analytics, and communications technologies are emerging at an unparalleled rate. As a result, academic health centers will face both new opportunities and formidable challenges. Unlike previous transitions from paper-based systems to networked computer systems, these new technologies are the product of new entrepreneurial and commercial interests driven by consumers. As these new commercial products and services are more widely adopted, the likelihood grows that data will be used in unanticipated ways inconsistent with societal norms. Academic health centers will have to understand the implications of these technologies and engage more actively in processes governing the collection, aggregation, and use of health data produced in a new era of consumer-driven health care technology. Maintaining public trust should be a paramount concern.
An Overview of Aerospace Propulsion Research at NASA Glenn Research Center
NASA Technical Reports Server (NTRS)
Reddy, D. R.
2007-01-01
NASA Glenn Research Center is the recognized leader in aerospace propulsion research, advanced technology development, and revolutionary system concepts, committed to meeting the increasing demand for low-noise, low-emission, high-performance, and lightweight propulsion systems for affordable and safe aviation and space transportation needs. The technologies span a broad range of areas, including air-breathing as well as rocket propulsion systems, for commercial and military aerospace applications and for space launch as well as in-space propulsion applications. The scope of work includes fundamentals, components, processes, and system interactions. Technologies are developed using both experimental and analytical approaches. The presentation provides an overview of the current research and technology development activities at NASA Glenn Research Center.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1960-01-01
Thirty-one papers and 10 summaries of papers presented at the Third Conference on Analytical Chemistry in Nuclear Reactor Technology held at Gatlinburg, Tennessee, October 26 to 29, 1959, are given. The papers are grouped into four sections: general, analytical chemistry of fuels, analytical chemistry of plutonium and the transplutonic elements, and the analysis of fission-product mixtures. Twenty-seven of the papers are covered by separate abstracts. Four were previously abstracted for NSA. (M.C.G.)
xQuake: A Modern Approach to Seismic Network Analytics
NASA Astrophysics Data System (ADS)
Johnson, C. E.; Aikin, K. E.
2017-12-01
While seismic networks have expanded over the past few decades, and social needs for accurate and timely information have increased dramatically, approaches to the operational needs of both global and regional seismic observatories have been slow to adopt new technologies. This presentation introduces the xQuake system, which provides a fresh approach to seismic network analytics based on complexity theory and an adaptive architecture of streaming connected microservices as diverse data (picks, beams, and other data) flow into a final, curated catalog of events. The foundation for xQuake is the xGraph (executable graph) framework, which is essentially a self-organizing graph database. An xGraph instance provides both the analytics and the data storage capabilities at the same time. Much of the analytics, such as synthetic annealing in the detection process and an evolutionary programming approach to event evolution, draws from the recent GLASS 3.0 seismic associator developed by and for the USGS National Earthquake Information Center (NEIC). In some respects xQuake is reminiscent of the Earthworm system, in that it comprises processes interacting through store-and-forward rings; not surprising, as the first author was the lead architect of the original Earthworm project when it was known as "Rings and Things". While Earthworm components can easily be integrated into the xGraph processing framework, the architecture and analytics are more current (e.g., using a Kafka broker for store-and-forward rings). The xQuake system is being released under an unrestricted open source license to encourage and enable the seismic community's support in further development of its capabilities.
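To illustrate the store-and-forward pattern mentioned above (not xQuake's actual code), here is a minimal kafka-python sketch in which one process publishes phase picks to a broker and another consumes them; the broker address and the topic name "picks" are assumptions made for the example.

```python
import json
from kafka import KafkaProducer, KafkaConsumer

# Minimal sketch of a store-and-forward "ring" on a Kafka broker:
# one process publishes phase picks, another consumes them.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda m: json.dumps(m).encode("utf-8"),
)
producer.send("picks", {"station": "ANMO", "phase": "P", "time": 1502123456.7})
producer.flush()

consumer = KafkaConsumer(
    "picks",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating if no messages arrive
    value_deserializer=lambda m: json.loads(m.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # a downstream associator would process picks here
    break
```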
Performance evaluation soil samples utilizing encapsulation technology
Dahlgran, J.R.
1999-08-17
Performance evaluation soil samples and method of their preparation uses encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration. 1 fig.
Performance evaluation soil samples utilizing encapsulation technology
Dahlgran, James R.
1999-01-01
Performance evaluation soil samples and method of their preparation using encapsulation technology to encapsulate analytes which are introduced into a soil matrix for analysis and evaluation by analytical laboratories. Target analytes are mixed in an appropriate solvent at predetermined concentrations. The mixture is emulsified in a solution of polymeric film forming material. The emulsified solution is polymerized to form microcapsules. The microcapsules are recovered, quantitated and introduced into a soil matrix in a predetermined ratio to form soil samples with the desired analyte concentration.
Technology advancement for integrative stem cell analyses.
Jeong, Yoon; Choi, Jonghoon; Lee, Kwan Hyi
2014-12-01
Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterization of such stem cells, while providing an important foundation for further development, often disregards the heterogeneity inherent among individual constituents within a given population. The population-based analysis and characterization of stem cells, and the problems associated with such a blanket approach, only underscore the need for the development of new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Furthermore, while recent advances in micro/nano technology have led to growth in the stem cell analytical field, underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments, and many technical challenges limit vertically integrated analytical tools. Therefore, by introducing the concept of vertical and horizontal approaches, we propose that adequate methods are needed for the integration of information, such that multiple descriptive parameters from a stem cell can be obtained from a single experiment.
Technology Advancement for Integrative Stem Cell Analyses
Jeong, Yoon
2014-01-01
Scientists have endeavored to use stem cells for a variety of applications ranging from basic science research to translational medicine. Population-based characterization of such stem cells, while providing an important foundation for further development, often disregards the heterogeneity inherent among individual constituents within a given population. The population-based analysis and characterization of stem cells, and the problems associated with such a blanket approach, only underscore the need for the development of new analytical technology. In this article, we review current stem cell analytical technologies, along with the advantages and disadvantages of each, followed by applications of these technologies in the field of stem cells. Furthermore, while recent advances in micro/nano technology have led to growth in the stem cell analytical field, underlying architectural concepts allow only for a vertical analytical approach, in which different desirable parameters are obtained from multiple individual experiments, and many technical challenges limit vertically integrated analytical tools. Therefore, by introducing the concept of vertical and horizontal approaches, we propose that adequate methods are needed for the integration of information, such that multiple descriptive parameters from a stem cell can be obtained from a single experiment. PMID:24874188
Sensor failure detection for jet engines
NASA Technical Reports Server (NTRS)
Merrill, Walter C.
1988-01-01
The use of analytical redundancy to improve gas turbine engine control system reliability through sensor failure detection, isolation, and accommodation is surveyed. Both the theoretical and application papers that form the technology base of turbine engine analytical redundancy research are discussed. Also, several important application efforts are reviewed. An assessment of the state-of-the-art in analytical redundancy technology is given.
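A minimal sketch of the analytical-redundancy idea (illustrative only, not one of the surveyed algorithms): a sensor is flagged as failed when the residual between its reading and a model-based estimate exceeds a threshold; all numbers below are hypothetical.

```python
# Illustrative analytical-redundancy check: compare a sensor reading
# against an estimate reconstructed from a model of related measurements,
# and flag the sensor when the residual exceeds a threshold.
def detect_sensor_failure(measured, estimated, sigma, k=3.0):
    """Flag a failure when |residual| exceeds k standard deviations."""
    residual = measured - estimated
    return abs(residual) > k * sigma, residual

# Hypothetical turbine-speed example: the estimate would come from an
# engine model driven by fuel flow and pressure-ratio measurements.
failed, r = detect_sensor_failure(measured=10450.0, estimated=10010.0, sigma=120.0)
print("sensor failed:", failed, "residual:", r)
# On failure, the control system would isolate the sensor and
# accommodate by substituting the model estimate for its reading.
```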
Integrative workflows for metagenomic analysis
Ladoukakis, Efthymios; Kolisis, Fragiskos N.; Chatziioannou, Aristotelis A.
2014-01-01
The rapid evolution of sequencing technologies, described by the term Next Generation Sequencing (NGS), has revolutionized metagenomic analysis. These technologies constitute a combination of high-throughput analytical protocols coupled to delicate measuring techniques, in order to potentially discover, properly assemble and map allelic sequences to the correct genomes, achieving particularly high yields for only a fraction of the cost of traditional processes (i.e., Sanger). From a bioinformatic perspective, this boils down to many GB of data being generated from each single sequencing experiment, rendering the management, or even the storage, critical bottlenecks with respect to the overall analytical endeavor. The enormous complexity is further aggravated by the versatility of the processing steps available, represented by the numerous bioinformatic tools that are essential for each analytical task in order to fully unveil the genetic content of a metagenomic dataset. These disparate tasks range from simple, nonetheless non-trivial, quality control of raw data to exceptionally complex protein annotation procedures, requiring a high level of expertise for their proper application or the neat implementation of the whole workflow. Furthermore, a bioinformatic analysis of such scale requires grand computational resources, imposing cloud computing infrastructures as the sole realistic solution. In this review article we discuss different integrative bioinformatic solutions available, which address the aforementioned issues, by performing a critical assessment of the available automated pipelines for data management, quality control, and annotation of metagenomic data, embracing various major sequencing technologies and applications. PMID:25478562
Peters, Johanna; Bartscher, Kathrin; Döscher, Claas; Taute, Wolfgang; Höft, Michael; Knöchel, Reinhard; Breitkreutz, Jörg
2017-08-01
Microwave resonance technology (MRT) is known as a process analytical technology (PAT) tool for moisture measurement in fluid-bed granulation. It offers great potential for wet granulation processes even where the suitability of near-infrared (NIR) spectroscopy is limited, e.g., for colored granules or large variations in bulk density. However, previous sensor systems operating around a single resonance frequency showed limitations above approx. 7.5% granule moisture. This paper describes the application of a novel sensor working with four resonance frequencies. In-line data of all four resonance frequencies were collected and further processed. Based on the calculation of density-independent microwave moisture values, multiple linear regression (MLR) models were built using Karl Fischer titration (KF) as well as loss on drying (LOD) as reference methods. Rapid, reliable in-process moisture control (RMSEP ≤ 0.5%) even at higher moisture contents was achieved. Copyright © 2017 Elsevier B.V. All rights reserved.
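A hedged sketch of such an MLR calibration, with synthetic stand-in data (the paper's actual calibration data are not reproduced): regress the reference moisture on the four density-independent microwave moisture values.

```python
import numpy as np

# Sketch of an MLR moisture calibration on invented data: regress a
# reference moisture (e.g., Karl Fischer) on four density-independent
# microwave moisture values, one per resonance frequency.
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(40, 4))                     # 4 microwave moisture values
true_w = np.array([4.0, 3.0, 2.5, 1.5])
y = 1.0 + X @ true_w + rng.normal(scale=0.2, size=40)   # reference moisture (%)

X1 = np.column_stack([np.ones(len(X)), X])              # add intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
pred = X1 @ coef
rmse = np.sqrt(np.mean((pred - y) ** 2))                # cf. the reported RMSEP <= 0.5%
print("coefficients:", coef.round(2), "RMSE:", round(rmse, 3))
```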
CERT: Center of Excellence in Rotorcraft Technology
NASA Technical Reports Server (NTRS)
2002-01-01
The research objectives of this effort are to understand the physical processes that influence the formation of the tip vortex of a rotor in advancing flight, and to develop active and passive means of weakening the tip vortex during conditions when strong blade-vortex-interaction effects are expected. A combined experimental, analytical, and computational effort is being employed. Specifically, the following efforts are being pursued: 1. Analytical evaluation and design of combined elastic tailoring and active material actuators applicable to rotor blade tips. 2. Numerical simulations of active and passive tip devices. 3. LDV Measurement of the near and far wake behind rotors in forward flight.
Thermal luminescence spectroscopy chemical imaging sensor.
Carrieri, Arthur H; Buican, Tudor N; Roese, Erik S; Sutter, James; Samuels, Alan C
2012-10-01
The authors present a pseudo-active chemical imaging sensor model embodying irradiative transient heating, temperature nonequilibrium thermal luminescence spectroscopy, differential hyperspectral imaging, and artificial neural network technologies integrated into one system. We elaborate on various optimizations, simulations, and animations of the integrated sensor design and apply it to the terrestrial chemical contamination problem, where the interstitial contaminant compounds of detection interest (analytes) comprise liquid chemical warfare agents, their various derivative condensed phase compounds, and other material of a life-threatening nature. The sensor must measure and process a dynamic pattern of absorptive-emissive middle infrared molecular signature spectra of subject analytes to perform its chemical imaging and standoff detection functions successfully.
Fiber optic evanescent wave biosensor
NASA Astrophysics Data System (ADS)
Duveneck, Gert L.; Ehrat, Markus; Widmer, H. M.
1991-09-01
The role of modern analytical chemistry is not restricted to quality control and environmental surveillance, but has been extended to process control using on-line analytical techniques. Besides industrial applications, highly specific, ultra-sensitive biochemical analysis becomes increasingly important as a diagnostic tool, both in central clinical laboratories and in the doctor's office. Fiber optic sensor technology can fulfill many of the requirements for both types of applications. As an example, the experimental arrangement of a fiber optic sensor for biochemical affinity assays is presented. The evanescent electromagnetic field, associated with a light ray guided in an optical fiber, is used for the excitation of luminescence labels attached to the biomolecules in solution to be analyzed. Due to the small penetration depth of the evanescent field into the medium, the generation of luminescence is restricted to the close proximity of the fiber, where, e.g., the luminescent analyte molecules combine with their affinity partners, which are immobilized on the fiber. Both cw- and pulsed light excitation can be used in evanescent wave sensor technology, enabling the on-line observation of an affinity assay on a macroscopic time scale (seconds and minutes), as well as on a microscopic, molecular time scale (nanoseconds or microseconds).
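For reference, the 1/e penetration depth of the evanescent field follows the standard textbook expression; the symbols below are generic conventions, not the paper's own notation.

```latex
% 1/e penetration depth of the evanescent field at a fiber/medium interface:
%   \lambda   - excitation wavelength
%   n_1, n_2  - refractive indices of fiber core and surrounding medium
%   \theta    - angle of incidence at the interface (above the critical angle)
d_p = \frac{\lambda}{4\pi\sqrt{n_1^{2}\sin^{2}\theta - n_2^{2}}}
```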
Enhanced heat transfer combustor technology, subtasks 1 and 2, task C.1
NASA Technical Reports Server (NTRS)
Baily, R. D.
1986-01-01
Analytical and experimental studies are being conducted for NASA to evaluate means of increasing the heat extraction capability and service life of a liquid rocket combustor. This effort is being conducted in conjunction with other tasks to develop technologies for an advanced, expander cycle, oxygen/hydrogen engine planned for upper stage propulsion applications. Increased heat extraction, needed to raise available turbine drive energy for higher chamber pressure, is derived from combustion chamber hot gas wall ribs that increase the heat transfer surface area. Life improvement is obtained through channel designs that enhance cooling and maintain the wall temperature at an acceptable level. Laboratory test programs were conducted to evaluate the heat transfer characteristics of hot gas rib and coolant channel geometries selected through an analytical screening process. Detailed velocity profile maps, previously unavailable for rib and channel geometries, were obtained for the candidate designs using a cold flow laser velocimeter facility. Boundary layer behavior and heat transfer characteristics were determined from the velocity maps. Rib results were substantiated by hot air calorimeter testing. The flow data were analytically scaled to hot fire conditions and the results used to select two rib and three enhanced coolant channel configurations for further evaluation.
Process analytical technology in continuous manufacturing of a commercial pharmaceutical product.
Vargas, Jenny M; Nielsen, Sarah; Cárdenas, Vanessa; Gonzalez, Anthony; Aymat, Efrain Y; Almodovar, Elvin; Classe, Gustavo; Colón, Yleana; Sanchez, Eric; Romañach, Rodolfo J
2018-03-01
The implementation of process analytical technology and continuous manufacturing at an FDA approved commercial manufacturing site is described. In this direct compaction process the blends produced were monitored with a Near Infrared (NIR) spectroscopic calibration model developed with partial least squares (PLS) regression. The authors understand that this is the first study where the continuous manufacturing (CM) equipment was used as a gravimetric reference method for the calibration model. A principal component analysis (PCA) model was also developed to identify the powder blend and determine whether it was similar to the calibration blends. An air diagnostic test was developed to assure that powder was present within the interface when the NIR spectra were obtained. The air diagnostic test as well as the PCA and PLS calibration models were integrated into an industrial software platform that collects the real-time NIR spectra and applies the calibration models. The PCA test successfully detected an equipment malfunction. Variographic analysis was also performed to estimate the sampling and analytical errors that affect the results from the NIR spectroscopic method during commercial production. The system was used to monitor and control a 28 h continuous manufacturing run, where the average drug concentration determined by the NIR method was 101.17% of label claim with a standard deviation of 2.17%, based on 12,633 spectra collected. The average drug concentration for the tablets produced from these blends was 100.86% of label claim with a standard deviation of 0.4%, for 500 tablets analyzed by Fourier Transform Near Infrared (FT-NIR) transmission spectroscopy. The excellent agreement between the mean drug concentration values in the blends and tablets produced provides further evidence of the suitability of the validation strategy that was followed. Copyright © 2018 Elsevier B.V. All rights reserved.
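To make the variographic step concrete, here is a minimal sketch on synthetic data (not the study's): compute the empirical variogram of a process data series; back-extrapolating to lag zero gives the nugget, an estimate of the combined sampling and analytical error.

```python
import numpy as np

# Empirical variogram of a 1-D process data series:
# V(j) = mean((x[i+j] - x[i])^2) / 2 over all pairs at lag j.
def variogram(x, max_lag):
    x = np.asarray(x, dtype=float)
    return np.array([
        np.mean((x[j:] - x[:-j]) ** 2) / 2.0
        for j in range(1, max_lag + 1)
    ])

# Invented drifting drug-content series standing in for NIR predictions.
rng = np.random.default_rng(2)
conc = 101.2 + np.cumsum(rng.normal(scale=0.05, size=500)) \
       + rng.normal(scale=0.3, size=500)
v = variogram(conc, max_lag=20)
nugget = 2 * v[0] - v[1]   # crude linear back-extrapolation to lag 0
print("V(1):", round(v[0], 4), "estimated nugget:", round(nugget, 4))
```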
Ortega-Rivas, Enrique; Salmerón-Ochoa, Iván
2014-01-01
Food drinks are normally processed to increase their shelf-life and facilitate distribution before consumption. Thermal pasteurization is quite efficient in preventing microbial spoilage of many types of beverages, but the applied heat may also cause undesirable biochemical and nutritional changes that may affect sensory attributes of the final product. Alternative methods of pasteurization that do not include direct heat have been investigated in order to obtain products safe for consumption, but with sensory attributes maintained as unchanged as possible. Food scientists interested in nonthermal food preservation technologies have claimed that such methods of preserving foods are equally efficient in microbial inactivation as compared with conventional thermal means of food processing. Researchers in the nonthermal food preservation area also affirm that alternative preservation technologies will not affect nutritional and sensory attributes of processed foods as much as thermal processes do. This article reviews research in nonthermal food preservation, focusing on effects of processing of food drinks such as fruit juices and dairy products. Analytical techniques used to identify volatile flavor-aroma compounds will be reviewed, and comparative effects of both thermal and nonthermal preservation technologies will be discussed.
The Prospect of Internet of Things and Big Data Analytics in Transportation System
NASA Astrophysics Data System (ADS)
Noori Hussein, Waleed; Kamarudin, L. M.; Hussain, Haider N.; Zakaria, A.; Badlishah Ahmed, R.; Zahri, N. A. H.
2018-05-01
The Internet of Things (IoT), an emerging technology that describes how data, people and interconnected physical objects act based on communicated information, and big data analytics have been adopted by diverse domains for varying purposes. Manufacturing, agriculture, banks, oil and gas, healthcare, retail, hospitality, and food services are a few of the sectors that have adopted and massively utilized IoT and big data analytics. The transportation industry is also an early adopter, with significant attendant effects on its processes of tracking shipments, freight monitoring, and transparent warehousing. This is recorded in countries like England, Singapore, Portugal, and Germany, while Malaysia is currently assessing the potential and researching a purpose-driven adoption and implementation. This paper, based on a review of related literature, presents a summary of the inherent prospects of adopting IoT and big data analytics in the Malaysian transportation system. An efficient and safe port environment, predictive maintenance and remote management, and a boundary-less software platform and connected ecosystem, among others, are the inherent benefits of IoT and big data analytics for the Malaysian transportation system.
Evaluating supplier quality performance using fuzzy analytical hierarchy process
NASA Astrophysics Data System (ADS)
Ahmad, Nazihah; Kasim, Maznah Mat; Rajoo, Shanmugam Sundram Kalimuthu
2014-12-01
Evaluating supplier quality performance is vital in ensuring continuous supply chain improvement, reducing operational costs and risks, and meeting customer expectations. This paper aims to illustrate an application of the Fuzzy Analytical Hierarchy Process to prioritize the evaluation criteria in the context of automotive manufacturing in Malaysia. Five main criteria were identified: quality, cost, delivery, customer service, and technology support. These criteria were arranged into a hierarchical structure and evaluated by an expert. The relative importance of each criterion was determined by using linguistic variables, which were represented as triangular fuzzy numbers. The Center of Gravity defuzzification method was used to convert the fuzzy evaluations into their corresponding crisp values. Such fuzzy evaluation can be used as a systematic tool to overcome the uncertainty in evaluating suppliers' performance, which is usually associated with subjective human judgments.
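A minimal illustration of the Center of Gravity step (with an invented linguistic scale, not the paper's): for a triangular fuzzy number, the centroid reduces to the mean of its three vertices.

```python
# Center-of-gravity defuzzification of a triangular fuzzy number (a, b, c):
# the centroid of a triangular membership function is (a + b + c) / 3.
def cog_triangular(a, b, c):
    return (a + b + c) / 3.0

# Hypothetical linguistic scale for criterion importance:
scale = {
    "moderately important": (1.0, 3.0, 5.0),
    "very important": (5.0, 7.0, 9.0),
}
crisp = {term: cog_triangular(*tfn) for term, tfn in scale.items()}
print(crisp)  # {'moderately important': 3.0, 'very important': 7.0}
```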
Computer-Based Mathematics Instructions for Engineering Students
NASA Technical Reports Server (NTRS)
Khan, Mustaq A.; Wall, Curtiss E.
1996-01-01
Almost every engineering course involves mathematics in one form or another. The analytical process of developing mathematical models is very important for engineering students. However, the computational process involved in the solution of some mathematical problems may be very tedious and time consuming. There is a significant amount of mathematical software such as Mathematica, Mathcad, and Maple designed to aid in the solution of these instructional problems. The use of these packages in classroom teaching can greatly enhance understanding, and save time. Integration of computer technology in mathematics classes, without de-emphasizing the traditional analytical aspects of teaching, has proven very successful and is becoming almost essential. Sample computer laboratory modules are developed for presentation in the classroom setting. This is accomplished through the use of overhead projectors linked to graphing calculators and computers. Model problems are carefully selected from different areas.
Examining the Impact of Culture and Human Elements on OLAP Tools Usefulness
ERIC Educational Resources Information Center
Sharoupim, Magdy S.
2010-01-01
The purpose of the present study was to examine the impact of culture and human-related elements on the On-line Analytical Processing (OLAP) usability in generating decision-making information. The use of OLAP technology has evolved rapidly and gained momentum, mainly due to the ability of OLAP tools to examine and query large amounts of data sets…
Treatment of RDX & HMX Plumes Using Mulch Biowalls
2008-08-01
Abbreviations (excerpt): TAL, Target Analyte List; TCLP, Toxicity Characteristic Leachate Procedure; TNB, 1,3,5-Trinitrobenzene; TNT, 2,4,6-Trinitrotoluene; TNX. The report notes an active phytoremediation process in the source area (i.e., the former Pink Water pond area) that might already be contributing dissolved TOC, and records the presence of another remediation technology in the immediate vicinity (active phytoremediation in the Pink Water area).
NASA Astrophysics Data System (ADS)
Bianchetti, Raechel Anne
Remotely sensed images have become a ubiquitous part of our daily lives. From novice users aiding in search and rescue missions using tools such as TomNod, to trained analysts synthesizing disparate data to address complex problems like climate change, imagery has become central to geospatial problem solving. Expert image analysts are continually faced with rapidly developing sensor technologies and software systems. In response to these cognitively demanding environments, expert analysts develop specialized knowledge and analytic skills to address increasingly complex problems. This study identifies the knowledge, skills, and analytic goals of expert image analysts tasked with identification of land cover and land use change. Analysts participating in this research are currently working as part of a national-level analysis of land use change, and are well versed in the use of TimeSync, forest science, and image analysis. The results of this study benefit current analysts by improving their awareness of the mental processes they use during image interpretation. The study can also be generalized to understand the types of knowledge and visual cues that analysts use when reasoning with imagery for purposes beyond land use change studies. Here a Cognitive Task Analysis framework is used to organize evidence from qualitative knowledge elicitation methods for characterizing the cognitive aspects of the TimeSync image analysis process. Using a combination of content analysis, diagramming, semi-structured interviews, and observation, the study highlights the perceptual and cognitive elements of expert remote sensing interpretation. Results show that image analysts perform several standard cognitive processes, but flexibly employ these processes in response to various contextual cues. Expert image analysts' ability to think flexibly during their analysis process was directly related to their amount of image analysis experience. Additionally, results show that the basic Image Interpretation Elements continue to be important despite technological augmentation of the interpretation process. These results are used to derive a set of design guidelines for developing geovisual analytic tools and training to support image analysis.
[Big data, medical language and biomedical terminology systems].
Schulz, Stefan; López-García, Pablo
2015-08-01
A variety of rich terminology systems, such as thesauri, classifications, nomenclatures and ontologies support information and knowledge processing in health care and biomedical research. Nevertheless, human language, manifested as individually written texts, persists as the primary carrier of information, in the description of disease courses or treatment episodes in electronic medical records, and in the description of biomedical research in scientific publications. In the context of the discussion about big data in biomedicine, we hypothesize that the abstraction of the individuality of natural language utterances into structured and semantically normalized information facilitates the use of statistical data analytics to distil new knowledge out of textual data from biomedical research and clinical routine. Computerized human language technologies are constantly evolving and are increasingly ready to annotate narratives with codes from biomedical terminology. However, this depends heavily on linguistic and terminological resources. The creation and maintenance of such resources is labor-intensive. Nevertheless, it is sensible to assume that big data methods can be used to support this process. Examples include the learning of hierarchical relationships, the grouping of synonymous terms into concepts and the disambiguation of homonyms. Although clear evidence is still lacking, the combination of natural language technologies, semantic resources, and big data analytics is promising.
A combined approach of AHP and TOPSIS methods applied in the field of integrated software systems
NASA Astrophysics Data System (ADS)
Berdie, A. D.; Osaci, M.; Muscalagiu, I.; Barz, C.
2017-05-01
Adopting the most appropriate technology for developing applications on an integrated software system for enterprises may result in great savings in both cost and hours of work. This paper proposes a research study for the determination of a hierarchy among three SAP (System Applications and Products in Data Processing) technologies. The technologies Web Dynpro (WD), Floorplan Manager (FPM) and CRM WebClient UI (CRM WCUI) are evaluated against multiple criteria in terms of the performance obtained through the implementation of the same web business application. To establish the hierarchy, a multi-criteria analysis model that combines the AHP (Analytic Hierarchy Process) and the TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) methods was proposed. This model was built with the help of the SuperDecision software. This software is based on the AHP method and determines the weights for the selected sets of criteria. The TOPSIS method was used to obtain the final ranking and the hierarchy of technologies.
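A minimal TOPSIS sketch with invented scores (the paper's data are not reproduced): normalize the decision matrix, weight it, and rank alternatives by relative closeness to the ideal solution.

```python
import numpy as np

# Minimal TOPSIS: 3 hypothetical technologies x 2 benefit criteria
# (e.g., performance, maintainability), equal weights.
X = np.array([[7.0, 9.0],
              [8.0, 6.0],
              [5.0, 8.0]])
w = np.array([0.5, 0.5])

R = X / np.linalg.norm(X, axis=0)          # vector-normalize each criterion
V = R * w                                  # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0) # both criteria treated as benefits
d_pos = np.linalg.norm(V - ideal, axis=1)  # distance to ideal solution
d_neg = np.linalg.norm(V - anti, axis=1)   # distance to anti-ideal solution
closeness = d_neg / (d_pos + d_neg)        # 1 = best, 0 = worst
print("ranking (best first):", np.argsort(-closeness))
```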
Uy, Raymonde Charles Y.; Kury, Fabricio P.; Fontelo, Paul A.
2015-01-01
The standard of safe medication practice requires strict observance of the five rights of medication administration: the right patient, drug, time, dose, and route. Despite adherence to these guidelines, medication errors remain a public health concern that has generated health policies and hospital processes that leverage automation and computerization to reduce these errors. Bar code, RFID, biometrics and pharmacy automation technologies have been demonstrated in the literature to decrease the incidence of medication errors by minimizing the human factors involved in the process. Despite evidence suggesting the effectiveness of these technologies, adoption rates and trends vary across hospital systems. The objective of this study is to examine the state and adoption trends of automatic identification and data capture (AIDC) methods and pharmacy automation technologies in U.S. hospitals. A retrospective descriptive analysis of survey data from the HIMSS Analytics® Database was done, demonstrating encouraging growth in the adoption of these patient safety solutions. PMID:26958264
Process-Hardened, Multi-Analyte Sensor for Characterizing Rocket Plume Constituents
NASA Technical Reports Server (NTRS)
Goswami, Kisholoy
2011-01-01
A multi-analyte sensor was developed that enables simultaneous detection of rocket engine combustion-product molecules in a launch-vehicle ground test stand. The sensor was developed using a pin-printing method by incorporating multiple sensor elements on a single chip. It demonstrated accurate and sensitive detection of analytes such as carbon dioxide, carbon monoxide, kerosene, isopropanol, and ethylene from a single measurement. The use of pin-printing technology enables high-volume fabrication of the sensor chip, which will ultimately eliminate the need for individual sensor calibration since many identical sensors are made in one batch. Tests were performed using a single-sensor chip attached to a fiber-optic bundle. The use of a fiber bundle allows placement of the opto-electronic readout device at a place remote from the test stand. The sensors are rugged for operation in harsh environments.
FIELD ANALYTICAL SCREENING PROGRAM: PCB METHOD - INNOVATIVE TECHNOLOGY REPORT
This innovative technology evaluation report (ITER) presents information on the demonstration of the U.S. Environmental Protection Agency (EPA) Region 7 Superfund Field Analytical Screening Program (FASP) method for determining polychlorinated biphenyl (PCB) contamination in soil...
Biosensors for Sustainable Food Engineering: Challenges and Perspectives
Ragavan, Vasanth; Weng, Xuan; Chand, Rohit
2018-01-01
Current food production faces tremendous challenges from a growing human population, maintaining clean resources and food quality, and protecting climate and environment. Food sustainability is mostly a cooperative effort resulting in technology development supported by both governments and enterprises. Multiple attempts have been promoted in tackling challenges and enhancing drivers in food production. Biosensors and biosensing technologies, with their applications, are being widely applied to tackle top challenges in food production and its sustainability. Consequently, a growing demand for biosensing technologies exists in food sustainability. Microfluidics represents a technological system integrating multiple technologies. Nanomaterials, with their application in biosensing, are thought to be the most promising tools in dealing with health, energy, and environmental issues closely related to world populations. The demand for point-of-care (POC) technologies in this area focuses on rapid, simple, accurate, portable, and low-cost analytical instruments. This review provides current viewpoints from the literature on biosensing in food production, food processing, safety and security, food packaging and supply chain, food waste processing, food quality assurance, and food engineering. The current understanding of progress, solutions, and future challenges, as well as the commercialization of biosensors, is summarized. PMID:29534552
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hess, W.P.; Bushaw, B.A.; McCarthy, M.I.
1996-10-01
The Department of Energy is undertaking the enormous task of remediating defense wastes and environmental insults which have occurred over 50 years of nuclear weapons production. It is abundantly clear that significant technology advances are needed to characterize, process, and store highly radioactive waste and to remediate contaminated zones. In addition to the processing and waste form issues, analytical technologies needed for the characterization of solids, and for monitoring storage tanks and contaminated sites, do not exist or are currently expensive, labor-intensive tasks. This report describes progress in developing sensitive, rapid, and widely applicable laser-based mass spectrometry techniques for analysis of mixed chemical wastes and contaminated soils.
A review on recent technologies for the manufacture of pulmonary drugs.
Hadiwinoto, Gabriela Daisy; Lip Kwok, Philip Chi; Lakerveld, Richard
2018-01-01
This review discusses recent developments in the manufacture of inhalable dry powder formulations. Pulmonary drug delivery has distinct advantages compared with other drug administration routes. However, the requirements on drug particle properties complicate manufacture. Control over crystallization to make particles with the desired properties in a single step is often infeasible, which calls for micronization techniques. Although spray drying produces particles in the desired size range, a stable solid state may not be attainable. Supercritical fluids may be used as a solvent or antisolvent, which significantly reduces solvent waste. Future directions include application areas such as biopharmaceuticals for dry powder inhalers and new processing strategies, such as continuous manufacturing with in-line process analytical technologies, to improve control over particle formation.
Public and stakeholder participation for managing and reducing the risks of shale gas development.
North, D Warner; Stern, Paul C; Webler, Thomas; Field, Patrick
2014-01-01
Emerging technologies pose particularly strong challenges for risk governance when they have multidimensional and inequitable impacts, when there is scientific uncertainty about the technology and its risks, when there are strong value conflicts over the perceived benefits and risks, when decisions must be made urgently, and when the decision making environment is rife with mistrust. Shale gas development is one such emerging technology. Drawing on previous U.S. National Research Council committee reports that examined risk decision making for complex issues like these, we point to the benefits and challenges of applying the analytic-deliberative process recommended in those reports for stakeholder and public engagement in risk decision making about shale gas development in the United States. We discuss the different phases of such a process and conclude by noting the dangers of allowing controversy to ossify and the benefits of sound dialogue and learning among publics, stakeholders, industry, and regulatory decision makers.
Application of Interface Technology in Progressive Failure Analysis of Composite Panels
NASA Technical Reports Server (NTRS)
Sleight, D. W.; Lotts, C. G.
2002-01-01
A progressive failure analysis capability using interface technology is presented. The capability has been implemented in the COMET-AR finite element analysis code developed at the NASA Langley Research Center and is demonstrated on composite panels. The composite panels are analyzed for damage initiation and propagation from initial loading to final failure using a progressive failure analysis capability that includes both geometric and material nonlinearities. Progressive failure analyses are performed on conventional models and interface technology models of the composite panels. Analytical results and the computational effort of the analyses are compared for the conventional and interface technology models. The analytical results predicted with the interface technology models correlate well with those of the conventional models while requiring significantly less computational effort.
NASA Astrophysics Data System (ADS)
Rana, Narender; Zhang, Yunlin; Wall, Donald; Dirahoui, Bachir; Bailey, Todd C.
2015-03-01
Integrated circuit (IC) technology is going through multiple changes in terms of patterning techniques (multiple patterning, EUV and DSA), device architectures (FinFET, nanowire, graphene) and patterning scale (a few nanometers). These changes require tight controls on processes and measurements to achieve the required device performance, and challenge metrology and process control in terms of capability and quality. Multivariate data with complex nonlinear trends and correlations generally cannot be described well by mathematical or parametric models but can be relatively easily learned by computing machines and used to predict or extrapolate. This paper introduces the predictive metrology approach, which has been applied to three different applications. Machine learning and predictive analytics have been used to accurately predict dimensions of EUV resist patterns down to 18 nm half pitch by leveraging resist shrinkage patterns. These patterns could not be directly and accurately measured due to metrology tool limitations. Machine learning has also been applied to predict electrical performance early in the process pipeline for deep trench capacitance and metal line resistance. As the wafer goes through various processes, its associated cost multiplies. It may take days to weeks to get the electrical performance readout. Predicting the electrical performance early on can be very valuable in enabling timely actionable decisions such as rework, scrap, or feeding forward or back the predicted information or information derived from prediction to improve or monitor processes. This paper provides a general overview of machine learning and advanced analytics applications in advanced semiconductor development and manufacturing.
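A hedged sketch of the early-prediction idea on synthetic data (not the paper's model or data): fit a regression from in-line measurements to a downstream electrical readout and score it on held-out wafers.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Illustrative predictive-metrology sketch: learn a mapping from
# in-line features (e.g., CD, film thickness) to a downstream
# electrical readout such as metal line resistance.
rng = np.random.default_rng(3)
X = rng.normal(size=(400, 5))            # in-line measurements per wafer
y = 50 + 8 * X[:, 0] - 3 * X[:, 1] + rng.normal(scale=1.0, size=400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = Ridge(alpha=1.0).fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
# Early predictions like these could trigger rework/scrap/feed-forward
# decisions days before the actual electrical test data arrive.
```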
Analysis of pultrusion processing for long fiber reinforced thermoplastic composite system
NASA Technical Reports Server (NTRS)
Tso, W.; Hou, T. H.; Tiwari, S. N.
1993-01-01
Pultrusion is one of the composite processing technologies commonly recognized as a simple and cost-effective means for the manufacturing of fiber-reinforced, resin matrix composite parts with different regular geometries. Previously, because the majority of pultruded composite parts were made with thermosetting resin matrices, emphasis of the analysis of the process had been on the conservation of energy from various sources, such as heat conduction and the curing kinetics of the resin system. Analysis of the flow aspect of the process was almost absent in the literature for thermosetting processes. With the increasing use of thermoplastic materials, it is desirable to obtain the detailed velocity and pressure profiles inside the pultrusion die. Using a modified Darcy's law for flow through porous media, closed-form analytical solutions for the velocity and pressure distributions inside the pultrusion die are obtained for the first time. This enables estimation of the magnitude of viscous dissipation and its effects on the pultruded parts. Pulling forces required in pultrusion processing are also analyzed. The analytical model derived in this study can be used to advance our knowledge and control of the pultrusion process for fiber reinforced thermoplastic composite parts.
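For reference, the generic form of Darcy's law underlying such an analysis (the paper's modified version and boundary conditions are not reproduced):

```latex
% Darcy's law for resin flow through the fiber bed (generic form):
%   u       - superficial resin velocity
%   \kappa  - permeability of the fiber bed
%   \mu     - resin viscosity
%   dp/dx   - axial pressure gradient along the die
u = -\frac{\kappa}{\mu}\,\frac{dp}{dx}
```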
Advancement of CMOS Doping Technology in an External Development Framework
NASA Astrophysics Data System (ADS)
Jain, Amitabh; Chambers, James J.; Shaw, Judy B.
2011-01-01
The consumer appetite for a rich multimedia experience drives technology development for mobile hand-held devices and the infrastructure to support them. Enhancements in functionality, speed, and user experience are derived from advancements in CMOS technology. The technical challenges in developing each successive CMOS technology node to support these enhancements have become increasingly difficult. These trends have motivated the CMOS business towards a collaborative approach based on strategic partnerships. This paper describes our model and experience of CMOS development, based on multi-dimensional industrial and academic partnerships. We provide to our process equipment, materials, and simulation partners, as well as to our silicon foundry partners, the detailed requirements for future integrated circuit products. This is done very early in the development cycle to ensure that these requirements can be met. In order to determine these fundamental requirements, we rely on a strategy that requires strong interaction between process and device simulation, physical and chemical analytical methods, and research at academic institutions. This learning is shared with each project partner to address integration and manufacturing issues encountered during CMOS technology development from its inception through product ramp. We utilize TI's core strengths in physical analysis, unit processes and integration, yield ramp, reliability, and product engineering to support this technological development. Finally, this paper presents examples of the advancement of CMOS doping technology for the 28 nm node and beyond through this development model.
Exploration Laboratory Analysis
NASA Technical Reports Server (NTRS)
Krihak, M.; Ronzano, K.; Shaw, T.
2016-01-01
The Exploration Laboratory Analysis (ELA) project supports the Exploration Medical Capability (ExMC) element in minimizing or reducing the risk of adverse health outcomes and decrements in performance due to limitations of in-flight medical capabilities on human exploration missions. To mitigate this risk, the availability of in-flight laboratory analysis instrumentation has been identified as an essential capability for manned exploration missions. Since a single, compact, space-ready laboratory analysis capability to perform all exploration clinical measurements is not commercially available, the ELA project objective is to demonstrate the feasibility of emerging operational and analytical capability as a biomedical diagnostics precursor to long-duration manned exploration missions. The initial step towards ground and flight demonstrations in fiscal year (FY) 2015 was the down-selection of platform technologies for demonstrations in the space environment. The technologies selected included two Small Business Innovation Research (SBIR) performers: DNA Medicine Institute's rHEALTH X and Intelligent Optical Systems' lateral flow assays combined with Holomic's smartphone analyzer. The selection of these technologies was based on their compact size, breadth of analytical capability, and favorable ability to process fluids in a space environment, among several factors. These two technologies will be advanced to meet ground and flight demonstration success criteria and requirements that will be finalized in FY16. Also, the down-selected performers will continue the technology development phase towards meeting prototype deliverables in either late 2016 or 2017.
Vrancken, C; Longhurst, P J; Wagland, S T
2017-03-01
Waste management processes generally represent a significant loss of material, energy and economic resources, so legislation and financial incentives are being implemented to improve the recovery of these valuable resources whilst reducing contamination levels. Material recovery and waste-derived fuels are potentially valuable options being pursued by industry, using mechanical and biological processes incorporating sensor and sorting technologies developed and optimised for recycling plants. In its current state, waste management presents similarities to other industries that could improve their efficiencies using process analytical technology tools. Existing sensor technologies could be used to measure critical waste characteristics, providing data required by existing legislation, potentially aiding waste treatment processes and assisting stakeholders in decision making. Optical technologies offer the most flexible solution for gathering real-time information applicable to each of the mechanical and biological waste treatment processes used by industry. In particular, combinations of optical sensors in the visible and near-infrared range (800 nm to 2500 nm) of the spectrum, together with different mathematical techniques, are able to provide material information and fuel properties with typical performance levels between 80% and 90%. These sensors could be used not only to aid waste processes, but also to provide most waste quality indicators required by existing legislation, whilst offering better tools to stakeholders.
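As a sketch of the kind of sensor-plus-mathematics combination described above (the authors' specific models are not given in the abstract), the following assumes NIR spectra and a fuel property and fits a partial least squares regression; all data are synthetic.

```python
# Hypothetical sketch: predict a fuel property (e.g., calorific value) from
# NIR reflectance spectra (800-2500 nm) with PLS regression. Synthetic data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
wavelengths = np.arange(800, 2500, 10)              # nm, illustrative grid
spectra = rng.normal(size=(120, wavelengths.size))  # 120 waste samples
# Synthetic property depending on two absorption bands plus noise:
prop = 2.0 * spectra[:, 40] + spectra[:, 100] + rng.normal(0, 0.1, 120)

pls = PLSRegression(n_components=5)
scores = cross_val_score(pls, spectra, prop, cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean())  # cf. the 80-90% levels cited
```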
Kumar, Vijay; Taylor, Michael K; Mehrotra, Amit; Stagner, William C
2013-06-01
Focused beam reflectance measurement (FBRM) was used as a process analytical technology tool to perform inline real-time particle size analysis of a proprietary granulation manufactured using a continuous twin-screw granulation-drying-milling process. A significant relationship between the D20, D50, and D80 length-weighted chord lengths and sieve particle size was observed, with a p value of <0.0001 and an R² of 0.886. A central composite response surface statistical design was used to evaluate the effect of granulator screw speed and Comil® impeller speed on the length-weighted chord length distribution (CLD) and particle size distribution (PSD) determined by FBRM and nested sieve analysis, respectively. The effects of granulator speed and mill speed on bulk density, tapped density, Compressibility Index, and Flowability Index were also investigated. An inline FBRM probe placed below the Comil® generated chord lengths and CLD data at designated times. The collection of the milled samples for sieve analysis and PSD evaluation was coordinated with the timing of the FBRM determinations. Both FBRM and sieve analysis resulted in similar bimodal distributions for all ten manufactured batches studied. Within the experimental space studied, the granulator screw speed (650-850 rpm) and Comil® impeller speed (1,000-2,000 rpm) did not have a significant effect on CLD, PSD, bulk density, tapped density, Compressibility Index, or Flowability Index (p value > 0.05).
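A minimal sketch of the reported chord-length-to-sieve-size relationship follows; the numbers below are made up for illustration (the study reports p < 0.0001 and R² = 0.886 on its own data).

```python
# Illustrative only: regress sieve particle size on FBRM length-weighted
# chord length, as in the reported significant linear relationship.
import numpy as np
from scipy.stats import linregress

chord_um = np.array([105., 130., 150., 180., 210., 240.])  # illustrative
sieve_um = np.array([120., 145., 170., 195., 230., 255.])  # illustrative

fit = linregress(chord_um, sieve_um)
print(f"R^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.2e}")
```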
NASA Astrophysics Data System (ADS)
Phipps, Marja; Lewis, Gina
2012-06-01
Over the last decade, intelligence capabilities within the Department of Defense/Intelligence Community (DoD/IC) have evolved from ad hoc, single source, just-in-time, analog processing; to multi source, digitally integrated, real-time analytics; to multi-INT, predictive Processing, Exploitation and Dissemination (PED). Full Motion Video (FMV) technology and motion imagery tradecraft advancements have greatly contributed to Intelligence, Surveillance and Reconnaissance (ISR) capabilities during this timeframe. Imagery analysts have exploited events, missions and high value targets, generating and disseminating critical intelligence reports within seconds of occurrence across operationally significant PED cells. Now, we go beyond FMV, enabling All-Source Analysts to effectively deliver ISR information in a multi-INT sensor rich environment. In this paper, we explore the operational benefits and technical challenges of an Activity Based Intelligence (ABI) approach to FMV PED. Existing and emerging ABI features within FMV PED frameworks are discussed, to include refined motion imagery tools, additional intelligence sources, activity relevant content management techniques and automated analytics.
Volume and Value of Big Healthcare Data
Dinov, Ivo D.
2016-01-01
Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. Moore's and Kryder's laws of exponential increase in computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions such as: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309
Phase transformations in steels: Processing, microstructure, and performance
Gibbs, Paul J.
2014-04-03
Contemporary steel research is revealing new processing avenues to tailor microstructures and properties that, until recently, were only imaginable. Much of the technological versatility facilitating this development is provided by the understanding and utilization of the complex phase transformation sequences available in ferrous alloys. Today we have the opportunity to explore the diverse phenomena displayed by steels with specialized analytical and experimental tools. Advances in multi-scale characterization techniques provide a fresh perspective on microstructural relationships at the macro- and micro-scale, enabling a fundamental understanding of the role of phase transformations during processing and subsequent deformation.
NASA Astrophysics Data System (ADS)
Fan, Guofang; Li, Yuan; Hu, Chunguang; Lei, Lihua; Guo, Yanchuan
2016-08-01
A novel process to control light through coupling modulation by a surface acoustic wave (SAW) is presented for an optical micro-resonator. An optical waveguide modulator based on a racetrack resonator in silicon-on-insulator (SOI) technology is taken as an example to explore the mechanism. A finite-difference time-domain (FDTD) model is developed to simulate the acousto-optical (AO) modulator based on this mechanism, and an analytical method is presented to verify the proposal. The results show that the process can work well as a SAW-driven optical modulator.
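For readers unfamiliar with the numerical method named above, a minimal one-dimensional FDTD kernel is sketched below; the paper's simulation of the SAW-modulated resonator is of course higher-dimensional, and this toy, in normalized units, only illustrates the update scheme.

```python
# Toy 1D FDTD update loop in normalized units (c = 1). A SAW-driven AO
# modulator would perturb the refractive index in space and time; here the
# index is static and the geometry is 1D, purely to show the scheme.
import numpy as np

nz, nt = 400, 1000
dz, dt = 1.0, 0.5            # dt <= dz satisfies the Courant stability limit
n_index = np.ones(nz)        # refractive index (a SAW would modulate this)
ez = np.zeros(nz)            # electric field
hy = np.zeros(nz)            # magnetic field

for t in range(nt):
    hy[:-1] += (dt / dz) * (ez[1:] - ez[:-1])
    ez[1:] += (dt / (dz * n_index[1:] ** 2)) * (hy[1:] - hy[:-1])
    ez[50] += np.exp(-((t - 30.0) / 10.0) ** 2)   # soft Gaussian source
print("total field energy:", float(np.sum(ez**2 + hy**2)))
```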
NASA Astrophysics Data System (ADS)
Urakaev, Farit Kh.; Akmalaev, Kenzhebek A.; Orynbekov, Eljan S.; Balgysheva, Beykut D.; Zharlykasimova, Dinar N.
2016-02-01
The use of metallothermy (MT) and self-propagating high-temperature synthesis (SHS) is considered for processing different geological and technogenic materials. Traditional MT and SHS processes for the production of various metallic and nonmetallic materials are widely known. Another rapidly developing direction is connected with the use of ores, concentrates, minerals, and technogenic waste products as one of the components of a thermite mixture, both for the treatment of mineral raw materials by means of MT or SHS, resulting in semi-products, and for technological, analytical, and ecological purposes.
Wu, Cao; Chen, Zhou; Hu, Ya; Rao, Zhiyuan; Wu, Wangping; Yang, Zhaogang
2018-05-15
Crystallization is a significant process employed to produce a wide variety of materials in the pharmaceutical and food areas. The control of crystal dimension, crystallinity, and shape is very important because they affect the subsequent filtration, drying and grinding performance as well as the physical and chemical properties of the material. This review summarizes the special features of crystallization technology and the preparation methods of nanocrystals, and discusses the analytical technology used to control crystal quality and performance. Applications of crystallization technology in pharmaceutics and foods are also outlined. These illustrated examples further help us gain a better understanding of crystallization technology for pharmaceutics and foods.
The siteLAB® Analytical Test Kit UVF-3100A (UVF-3100A), developed by siteLAB® Corporation (siteLAB®), was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in ...
Imaging and Analytics: The changing face of Medical Imaging
NASA Astrophysics Data System (ADS)
Foo, Thomas
There have been significant technological advances in imaging capability over the past 40 years. Medical imaging capabilities have developed rapidly, along with technology development in computational processing speed and miniaturization. With the move to all-digital imaging, the number of images acquired in a routine clinical examination has increased dramatically, from under 50 images in the early days of CT and MRI to more than 500-1000 images today. The staggering number of images that are routinely acquired poses significant challenges for clinicians to interpret the data and to correctly identify the clinical problem. Although the time provided to render a clinical finding has not substantially changed, the amount of data available for interpretation has grown exponentially. In addition, image quality (spatial resolution) and information content (physiologically-dependent image contrast) have also increased significantly with advances in medical imaging technology. On its current trajectory, medical imaging in the traditional sense is unsustainable. To assist in filtering and extracting the most relevant data elements from medical imaging, image analytics will have a much larger role. Automated image segmentation, generation of parametric image maps, and clinical decision support tools will be needed and developed apace to allow the clinician to manage, extract and utilize only the information that will help improve diagnostic accuracy and sensitivity. As medical imaging devices continue to improve in spatial resolution and in functional and anatomical information content, image/data analytics will become more ubiquitous and integral to medical imaging capability.
Manufacturing Methods and Technology for Microwave Stripline Circuits
1982-02-26
to the dielectric material so it does not peel during the etching and subsequent processing. The copper cladding requirements were defined by MIL-F... the B-stage, giving acceptable peel strengths per the military requirements. For PTFE substrate printed wiring boards that are laminated using a... examining multilayers for measles and delaminations, and analytically by performing peel tests and glass transition temperature measurements.
De Beer, T R M; Allesø, M; Goethals, F; Coppens, A; Heyden, Y Vander; De Diego, H Lopez; Rantanen, J; Verpoort, F; Vervaet, C; Remon, J P; Baeyens, W R G
2007-11-01
The aim of the present study was to propose a strategy for the implementation of a Process Analytical Technology system in freeze-drying processes. Mannitol solutions, some of them supplied with NaCl, were used as model systems to freeze-dry. Noninvasive, in-line Raman measurements were performed continuously during lyophilization of the solutions to monitor in real time the mannitol solid state, the end points of the different process steps (freezing, primary drying, secondary drying), and physical phenomena occurring during the process. At-line near-infrared (NIR) and X-ray powder diffractometry (XRPD) measurements were done to confirm the Raman conclusions and to obtain additional information. The spectra collected during the processes were analyzed using principal component analysis and multivariate curve resolution. A two-level full factorial design was used to study the influence of process (freezing rate) and formulation variables (concentration of mannitol, concentration of NaCl, volume of freeze-dried sample) upon freeze-drying. Raman spectroscopy was able to monitor (i) the mannitol solid state (amorphous, alpha, beta, delta, and hemihydrate), (ii) several process step end points (end of mannitol crystallization during freezing, primary drying), and (iii) physical phenomena occurring during freeze-drying (onset of ice nucleation, onset of mannitol crystallization during the freezing step, onset of ice sublimation). NIR proved to be a more sensitive tool than Raman spectroscopy for monitoring sublimation, while XRPD helped to identify the mannitol hemihydrate in the samples. The experimental design results showed that several process and formulation variables significantly influence different aspects of lyophilization and that the two are interrelated. Raman spectroscopy (in-line) together with NIR spectroscopy and XRPD (at-line) not only allowed the real-time monitoring of mannitol freeze-drying processes but also helped us, in combination with experimental design, to understand the process.
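As an illustration of the chemometric step named above (not the authors' code), the sketch below applies principal component analysis to a matrix of synthetic in-line spectra collected over process time; in practice, a plateau or break in the score trajectory marks a process end point such as the end of mannitol crystallization.

```python
# Hypothetical sketch: PCA on time-ordered process spectra. The spectra are
# synthetic; a real analysis would use the in-line Raman data (and the study
# additionally used multivariate curve resolution).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n_spectra, n_channels = 200, 1024
t = np.linspace(0.0, 1.0, n_spectra)                 # normalized process time
band = np.exp(-((np.arange(n_channels) - 400) / 20.0) ** 2)
spectra = np.outer(t, band) + rng.normal(0, 0.01, (n_spectra, n_channels))

scores = PCA(n_components=2).fit_transform(spectra)
print(scores[:5, 0])  # PC1 trajectory tracks the growing crystalline band
```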
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - L2000 PCB/CHLORIDE ANALYZER - DEXSIL CORPORATION
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of Polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - ENVIROGARD PCB TEST KIT - STRATEGIC DIAGNOSTICS INC
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of Polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...
We Can Watch It For You Wholesale
NASA Astrophysics Data System (ADS)
Lipton, Alan J.
This chapter provides an introduction to video analytics—a branch of computer vision technology that deals with automatic detection of activities and events in surveillance video feeds. Initial applications focused on the security and surveillance space, but as the technology improves it is rapidly finding a home in many other application areas. This chapter looks at some of those spaces, the requirements they impose on video analytics systems, and provides an example architecture and set of technology components to meet those requirements. This exemplary system is put through its paces to see how it stacks up in an embedded environment. Finally, we explore the future of video analytics and examine some of the market requirements that are driving breakthroughs in both video analytics and processor platform technology alike.
RFID in healthcare: a Six Sigma DMAIC and simulation case study.
Southard, Peter B; Chandra, Charu; Kumar, Sameer
2012-01-01
The purpose of this paper is to develop a business model to generate quantitative evidence of the benefits of implementing radio frequency identification (RFID) technology, limiting the scope to outpatient surgical processes in hospitals. The study primarily uses the define-measure-analyze-improve-control (DMAIC) approach, and draws on various analytical tools such as work flow diagrams, value stream mapping, and discrete event simulation to examine the effect of implementing RFID technology on improving the effectiveness (quality and timeliness) and efficiency (cost reduction) of outpatient surgical processes. The analysis showed significant estimated annual cost and time savings in carrying out patients' surgical procedures with RFID technology implementation for the outpatient surgery processes in a hospital. This is largely due to the elimination of the non-value-added activity of locating supplies and equipment and of the "return" loop created by preventable postoperative infections. Several poka-yokes developed using RFID technology were identified to eliminate these two issues, improving patient safety and the cost effectiveness of the operation and helping to ensure the success of the outpatient surgical process. Many stakeholders in the hospital environment will be impacted, including patients, physicians, nurses, technicians, administrators and other hospital personnel. Different levels of training of hospital personnel will be required, based on the degree of interaction with the RFID system. Computations of costs and savings will help decision makers understand the benefits and implications of the technology in the hospital environment.
Performance specifications for the extra-analytical phases of laboratory testing: Why and how.
Plebani, Mario
2017-07-01
An important priority in the current healthcare scenario should be to address errors in laboratory testing, which account for a significant proportion of diagnostic errors. Efforts made in laboratory medicine to enhance the diagnostic process have been directed toward improving technology, achieving greater volumes and more accurate laboratory tests, but data collected in the last few years highlight the need to re-evaluate the total testing process (TTP) as the unique framework for improving quality and patient safety. Valuable quality indicators (QIs) and extra-analytical performance specifications are required for guidance in improving all TTP steps. Yet in the literature no data are available on extra-analytical performance specifications based on outcomes, nor is it possible to set any specification using calculations involving biological variability. The collection of data representing the state of the art based on quality indicators is, therefore, underway. The adoption of a harmonized set of QIs, a common data collection and a standardised reporting method is mandatory, as it will not only allow the accreditation of clinical laboratories according to the International Standard, but also provide guidance for promoting improvement processes and guaranteeing quality care to patients.
Cobelo-García, A; Filella, M; Croot, P; Frazzoli, C; Du Laing, G; Ospina-Alvarez, N; Rauch, S; Salaun, P; Schäfer, J; Zimmermann, S
2015-10-01
The current socio-economic, environmental and public health challenges that countries are facing clearly need commonly defined strategies to inform and support our transition to a sustainable economy. Here, the technology-critical elements (which include Ga, Ge, In, Te, Nb, Ta, Tl, the Platinum Group Elements and most of the rare-earth elements) are of great relevance in the development of emerging key technologies, including renewable energy, energy efficiency, electronics and the aerospace industry. In this context, the increasing use of technology-critical elements (TCEs) and the associated environmental impacts (from mining to end-of-life waste products) are not restricted to the national level but most likely extend to a global scale. Accordingly, the European COST Action TD1407: Network on Technology-Critical Elements (NOTICE), from environmental processes to human health threats, has the overall objective of creating a network of scientists and practitioners interested in TCEs, from the evaluation of their environmental processes to understanding potential human health threats, with the aim of defining the current state of knowledge and gaps, proposing priority research lines/activities and acting as a platform for new collaborations and joint research projects. The Action is focused on three major scientific areas: (i) analytical chemistry, (ii) environmental biogeochemistry and (iii) human exposure and (eco)toxicology.
Stabilization of glucose-oxidase in the graphene paste for screen-printed glucose biosensor
NASA Astrophysics Data System (ADS)
Pepłowski, Andrzej; Janczak, Daniel; Jakubowska, Małgorzata
2015-09-01
Various methods and materials for enzyme stabilization within a screen-printed graphene sensor were analyzed. The main goal was to develop a technology allowing immediate printing of the biosensors in a single printing process. Factors considered were the toxicity of the materials used, the ability of the material to be screen-printed (squeezed through the printing mesh) and the temperatures required in the fabrication process. The performance of the examined sensors was measured using amperometry, and appropriate analysis of the measurements was then conducted. The analysis results were compared with the medical requirements. Parameters calculated were the correlation coefficient between the concentration of the analyte and the measured electrical current (0.986) and the coefficient of variation for the particular analyte concentrations used as calibration points. Variation of the measured values was significant only in ranges close to 0, decreasing for the concentrations of clinical importance. These outcomes justify further development of graphene-based biosensors fabricated through printing techniques.
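A minimal sketch of the two calibration statistics reported above, computed on made-up replicate sweeps (the study's actual currents and concentrations are not given in the abstract):

```python
# Illustrative: correlation between glucose concentration and measured
# current (cf. the reported 0.986) and per-point coefficients of variation.
import numpy as np

conc = np.array([2.0, 4.0, 6.0, 8.0, 10.0])            # mmol/L, made up
current = np.array([[0.21, 0.40, 0.62, 0.79, 1.01],
                    [0.20, 0.41, 0.60, 0.81, 0.99],
                    [0.22, 0.39, 0.61, 0.80, 1.00]])   # uA, replicate sweeps

mean_i = current.mean(axis=0)
r = np.corrcoef(conc, mean_i)[0, 1]
cv = current.std(axis=0, ddof=1) / mean_i * 100.0      # variation per point
print(f"r = {r:.3f}, CV% per calibration point = {np.round(cv, 1)}")
```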
Modern Instrumental Methods in Forensic Toxicology*
Smith, Michael L.; Vorce, Shawn P.; Holler, Justin M.; Shimomura, Eric; Magluilo, Joe; Jacobs, Aaron J.; Huestis, Marilyn A.
2009-01-01
This article reviews modern analytical instrumentation in forensic toxicology for identification and quantification of drugs and toxins in biological fluids and tissues. A brief description of the theory and inherent strengths and limitations of each methodology is included. The focus is on new technologies that address current analytical limitations. A goal of this review is to encourage innovations to improve our technological capabilities and to encourage use of these analytical techniques in forensic toxicology practice. PMID:17579968
NASA Astrophysics Data System (ADS)
Phipps, Marja; Capel, David; Srinivasan, James
2014-06-01
Motion imagery capabilities within the Department of Defense/Intelligence Community (DoD/IC) have advanced significantly over the last decade, attempting to meet continuously growing data collection, video processing and analytical demands in operationally challenging environments. The motion imagery tradecraft has evolved accordingly, enabling teams of analysts to effectively exploit data and generate intelligence reports across multiple phases in structured Full Motion Video (FMV) Processing, Exploitation and Dissemination (PED) cells. Yet now the operational requirements are drastically changing. The exponential growth in motion imagery data continues, but to this the community adds multi-INT data, interoperability with existing and emerging systems, expanded data access, nontraditional users, collaboration, automation, and support for ad hoc configurations beyond the current FMV PED cells. To break from the legacy system lifecycle, we look toward a technology application and commercial adoption model that will meet these future Intelligence, Surveillance and Reconnaissance (ISR) challenges. In this paper, we explore the application of cutting-edge computer vision technology to meet existing FMV PED shortfalls and address future capability gaps. For example, real-time georegistration services developed from computer-vision-based feature tracking, multiple-view geometry, and statistical methods allow the fusion of motion imagery with other georeferenced information sources, providing unparalleled situational awareness. We then describe how these motion imagery capabilities may be readily deployed in a dynamically integrated analytical environment, employing an extensible framework, leveraging scalable enterprise-wide infrastructure and following commercial best practices.
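One building block of the georegistration service mentioned above can be sketched with standard computer vision tools: estimate a robust homography between a video frame and a georeferenced base image from matched local features. This is a generic illustration with hypothetical file names, not the paper's implementation.

```python
# Generic feature-based registration sketch using OpenCV (opencv-python):
# detect ORB features, match them, and fit a homography with RANSAC.
import cv2
import numpy as np

frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)       # hypothetical inputs
basemap = cv2.imread("basemap.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(2000)
kp1, des1 = orb.detectAndCompute(frame, None)
kp2, des2 = orb.detectAndCompute(basemap, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)  # robust statistical fit
print(H)  # maps frame pixels into the georeferenced image
```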
The "hospital central laboratory": automation, integration and clinical usefulness.
Zaninotto, Martina; Plebani, Mario
2010-07-01
Recent technological developments in laboratory medicine have led to a major challenge: maintaining a close connection between the search for efficiency through automation and consolidation and the assurance of effectiveness. The adoption of systems that automate most of the manual tasks characterizing routine activities has significantly improved the quality of laboratory performance, total laboratory automation being the paradigm of the idea that "human-less" robotic laboratories may allow for better operation and ensure fewer human errors. Furthermore, even if ongoing technological developments have considerably improved the productivity of clinical laboratories as well as reducing the turnaround time of the entire process, the value of qualified personnel remains a significant issue. Recent evidence confirms that automation allows clinical laboratories to improve analytical performance only if trained staff operate in accordance with well-defined standard operating procedures, thus assuring continuous monitoring of analytical quality. In addition, laboratory automation may improve the appropriateness of test requests through the use of algorithms and reflex testing, which should allow the adoption of clinical and biochemical guidelines. In conclusion, in laboratory medicine, technology represents a tool for improving clinical effectiveness and patient outcomes, but it has to be managed by qualified laboratory professionals.
Sensor failure detection for jet engines using analytical redundancy
NASA Technical Reports Server (NTRS)
Merrill, W. C.
1984-01-01
Analytical redundant sensor failure detection, isolation and accommodation techniques for gas turbine engines are surveyed. Both the theoretical technology base and demonstrated concepts are discussed. Also included is a discussion of current technology needs and ongoing Government sponsored programs to meet those needs.
In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soi...
Characterization of spacecraft humidity condensate
NASA Technical Reports Server (NTRS)
Muckle, Susan; Schultz, John R.; Sauer, Richard L.
1994-01-01
When construction of Space Station Freedom reaches the Permanent Manned Capability (PMC) stage, the Water Recovery and Management Subsystem will be fully operational such that (distilled) urine, spent hygiene water, and humidity condensate will be reclaimed to provide water of potable quality. The reclamation technologies currently baselined to process these waste waters include adsorption, ion exchange, catalytic oxidation, and disinfection. To ensure that the baseline technologies will be able to effectively remove those compounds presenting a health risk to the crew, the National Research Council has recommended that additional information be gathered on specific contaminants in waste waters representative of those to be encountered on the Space Station. With the application of new analytical methods and the analysis of waste water samples more representative of the Space Station environment, advances in the identification of the specific contaminants continue to be made. Efforts by the Water and Food Analytical Laboratory at JSC were successful in enlarging the database of contaminants in humidity condensate. These efforts have not only included the chemical characterization of condensate generated during ground-based studies, but most significantly the characterization of cabin and Spacelab condensate generated during Shuttle missions. The analytical results presented in this paper will be used to show how the composition of condensate varies amongst enclosed environments and thus the importance of collecting condensate from an environment close to that of the proposed Space Station. Although advances were made in the characterization of space condensate, complete characterization, particularly of the organics, requires further development of analytical methods.
A Big Data-driven Model for the Optimization of Healthcare Processes.
Koufi, Vassiliki; Malamateniou, Flora; Vassilacopoulos, George
2015-01-01
Healthcare organizations increasingly navigate a highly volatile, complex environment in which technological advancements and new healthcare delivery business models are the only constants. In their effort to out-perform in this environment, healthcare organizations need to be agile enough to respond to these constantly changing conditions. To act with agility, healthcare organizations need to discover new ways to optimize their operations. To this end, they focus on the healthcare processes that guide healthcare delivery and on the technologies that support them. Business process management (BPM) and Service-Oriented Architecture (SOA) can provide a flexible, dynamic, cloud-ready infrastructure where business process analytics can be utilized to extract useful insights from mountains of raw data, and make them work in ways beyond the abilities of human brains, or of IT systems from just a year ago. This paper presents a framework which helps healthcare professionals gain better insight within and across their business processes. In particular, it performs real-time analysis on process-related data in order to reveal areas of potential process improvement.
Comparison of pre-processing methods for multiplex bead-based immunoassays.
Rausch, Tanja K; Schillert, Arne; Ziegler, Andreas; Lüking, Angelika; Zucht, Hans-Dieter; Schulz-Knappe, Peter
2016-08-11
High-throughput protein expression studies can be performed using bead-based protein immunoassays, such as the Luminex® xMAP® technology. Technical variability is inherent to these experiments and may lead to systematic bias and reduced power. To reduce technical variability, data pre-processing is performed. However, no recommendations exist for the pre-processing of Luminex® xMAP® data. We compared 37 different data pre-processing combinations of transformation and normalization methods in 42 samples on 384 analytes obtained from a multiplex immunoassay based on the Luminex® xMAP® technology. We evaluated the performance of each pre-processing approach with 6 different performance criteria. Three of the performance criteria were plots; all plots were evaluated by 15 independent and blinded readers. Four combinations of transformation and normalization methods performed well as pre-processing procedures for this bead-based protein immunoassay. The following combinations of transformation and normalization were suitable for pre-processing Luminex® xMAP® data in this study: weighted Box-Cox followed by quantile or robust spline normalization (rsn), asinh transformation followed by loess normalization, and Box-Cox followed by rsn.
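One of the well-performing combinations named above, asinh transformation followed by quantile normalization, can be sketched as follows (an illustrative implementation, not the authors' code):

```python
# Illustrative pre-processing for a samples x analytes matrix of median
# fluorescence intensities: asinh transform, then quantile normalization.
import numpy as np

def quantile_normalize(x):
    """Force every column (analyte) to share one reference distribution."""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)   # rank within column
    ref = np.sort(x, axis=0).mean(axis=1)               # mean of sorted columns
    return ref[ranks]

rng = np.random.default_rng(3)
mfi = rng.lognormal(mean=5.0, sigma=1.0, size=(42, 384))  # cf. the study's design
pre = quantile_normalize(np.arcsinh(mfi))
print(pre.shape)
```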
NASA Astrophysics Data System (ADS)
Shtyn, S. U.; Lebedev, V. A.; Gorlenko, A. O.
2017-02-01
On the basis of thermodynamic concepts of the process, we propose an energy model that reflects the mechanochemical essence of coating formation in vibration technology systems. The model takes into account the contributions to coating formation of the increase in unavailable energy due to the growth of entropy, the increase in the energy of elastic-plastic crystal lattice distortion resulting from the mechanical influence of working-environment indenters, and the change in the internal energy of the surface layer resulting from chemical interaction with the contacting media. We propose the adhesion strength of the local volume modified by processing as a criterion of the energy condition of the formed coating, and we establish an analytical dependence that yields the coating strength required by the operating conditions of the material.
Owen, Jesse; Imel, Zac E
2016-04-01
This article introduces the special section on utilizing large data sets to explore psychotherapy processes and outcomes. The increased use of technology has provided new opportunities for psychotherapy researchers; in particular, there is a rise in large databases of tens of thousands of clients. Additionally, there are new ways to pool valuable resources for meta-analytic processes. At the same time, these tools also come with limitations. These issues are introduced, along with a brief overview of the articles in the section.
Next generation data harmonization
NASA Astrophysics Data System (ADS)
Armstrong, Chandler; Brown, Ryan M.; Chaves, Jillian; Czerniejewski, Adam; Del Vecchio, Justin; Perkins, Timothy K.; Rudnicki, Ron; Tauer, Greg
2015-05-01
Analysts are presented with a never ending stream of data sources. Often, subsets of data sources to solve problems are easily identified but the process to align data sets is time consuming. However, many semantic technologies do allow for fast harmonization of data to overcome these problems. These include ontologies that serve as alignment targets, visual tools and natural language processing that generate semantic graphs in terms of the ontologies, and analytics that leverage these graphs. This research reviews a developed prototype that employs all these approaches to perform analysis across disparate data sources documenting violent, extremist events.
Baumann, Pascal; Bluthardt, Nicolai; Renner, Sarah; Burghardt, Hannah; Osberghaus, Anna; Hubbuch, Jürgen
2015-04-20
Product analytics is the bottleneck of most processes in bioprocess engineering, as it is rather time-consuming. Real-time, in-line product tracing without sample pre-treatment is only possible for a few products. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for straightforward product analytics by VIS absorption measurements. When the fused protein becomes unstable or insoluble, the chromophore function of the group is lost, which makes this technology an ideal screening tool for solubility and stability in up- and downstream process development. The Cherry-Tag™ technology is presented for the tagged enzyme glutathione-S-transferase (GST) from Escherichia coli in a combined up- and downstream process development study. High-throughput cultivations were carried out in a 48-well format in a BioLector system (m2p-Labs, Germany). The cultivation setup giving the highest product titer was scaled up to a 2.5 L shake flask culture, followed by a selective affinity chromatography product capture step. In upstream applications the tag was capable of identifying conditions under which insoluble and non-native inclusion bodies were formed. In downstream applications the red-colored product was found to bind effectively to a GST affinity column and was thereby identified as native, active protein, since the binding mechanism relies on the catalytic activity of the enzyme. The Cherry-Tag™ was found to be a reliable and quantitative tool for real-time tracking of stable and soluble proteins in up- and downstream processing applications. Denaturation and aggregation of the product can be detected in-line at any stage of the process, so critical stages can be identified and subsequently changed or replaced.
Ghader, Masoud; Shokoufi, Nader; Es-Haghi, Ali; Kargosha, Kazem
2018-04-15
Vaccine production is a biological process in which variation in time and output is inevitable, so the application of Process Analytical Technologies (PAT) is important. Headspace solid-phase microextraction (HS-SPME) coupled with GC-MS can be used as a PAT for process monitoring; the method is suitable for chemical profiling of volatile organic compounds (VOCs) emitted by microorganisms. Tetanus is a lethal disease caused by the bacterium Clostridium tetani (C. tetani), and vaccination is the ultimate way to prevent this disease. In this paper, an SPME fiber was used to investigate VOCs emerging from C. tetani during cultivation. Different types of VOCs, such as sulfur-containing compounds, were identified, and some of them were selected as biomarkers for bioreactor monitoring during vaccine production. In the second step, a portable dynamic air sampling (PDAS) device was used as an interface for sampling VOCs with SPME fibers. The sampling procedure was optimized by a face-centered central composite design (FC-CCD); the optimized sampling time and inlet gas flow rate were 10 min and 2 mL s⁻¹, respectively. The PDAS was mounted in the exhaust gas line of the bioreactor, and 42 VOC samples were collected on SPME fibers over 7 days of incubation. Simultaneously, pH and optical density (OD) were measured during the cultivation process and showed good correlations with the identified VOCs (>80%). This method could be used for VOC sampling from the off-gas of a bioreactor to monitor the cultivation process.
Sonner, Zachary; Wilder, Eliza; Gaillard, Trudy; Kasting, Gerald; Heikenfeld, Jason
2017-07-25
Eccrine sweat has rapidly emerged as a non-invasive, ergonomic, and rich source of chemical analytes, with numerous technological demonstrations now showing the ability for continuous electrochemical sensing. However, beyond active perspirers (athletes, workers, etc.), the lack of continuous sweat access in individuals at rest has hindered the advancement of both sweat sensing science and technology. Reported here is the integration of sudomotor axon reflex sweat stimulation for continuous wearable sweat analyte analysis, including the ability for side-by-side integration of chemical stimulants and sensors without cross-contamination. This integration approach is uniquely compatible with sensors which consume the analyte (enzymatic) or sensors which equilibrate with analyte concentrations. In vivo validation is performed using iontophoretic delivery of carbachol with ion-selective and impedance sensors for sweat analysis. Carbachol has shown prolonged sweat stimulation in directly stimulated regions for five hours or longer. This work represents a significant leap forward in sweat sensing technology, and may be of broader interest to those interested in on-skin sensing integrated with drug delivery.
Chemical Detection and Identification Techniques for Exobiology Flight Experiments
NASA Technical Reports Server (NTRS)
Kojiro, Daniel R.; Sheverev, Valery A.; Khromov, Nikolai A.
2002-01-01
Exobiology flight experiments require highly sensitive instrumentation for in situ analysis of the volatile chemical species that occur in the atmospheres and surfaces of various bodies within the solar system. The complex mixtures encountered place a heavy burden on the analytical instrumentation to detect and identify all species present. The minimal resources available onboard for such missions mandate that the instruments provide maximum analytical capability with minimal requirements for volume, weight and consumables. Advances in technology may be achieved by increasing the amount of information acquired by a given technique with greater analytical capability, and by miniaturization of proven terrestrial technology. We describe here methods to develop analytical instruments for the detection and identification of a wide range of chemical species using gas chromatography (GC). These efforts to expand the analytical capabilities of GC technology are focused on the development of GC detectors which provide sample identification independent of GC retention time data. A novel approach employs Penning Ionization Electron Spectroscopy (PIES).
Roßteuscher-Carl, Katrin; Fricke, Sabine; Hacker, Michael C; Schulz-Siegmund, Michaela
2015-12-30
Ethinylestradiol (EE), a highly active, low-dosed compound, is prone to oxidative degradation. The stability of the drug substance is therefore a critical parameter that has to be considered during drug formulation. Besides the stability of the drug substance, granule particle size and moisture are critical quality attributes (CQAs) of the fluid bed granulation process which influence the tableting ability of the resulting granules. Both CQAs should therefore be monitored during the production process by process analytical technology (PAT) according to ICH Q8. This work focuses on the effects of drying conditions on the stability of EE in a fluid-bed granulation process. We quantified the EE degradation products 6-alpha-hydroxy-EE, 6-beta-hydroxy-EE, 9(11)-dehydro-EE and 6-oxo-EE during long-term storage and under accelerated conditions. PAT tools that monitor granule particle size (spatial filtering technology) and granule moisture (microwave resonance technology) were applied and compared with off-line methods. We found a relevant influence of residual granule moisture and of the thermal stress applied during granulation on the storage stability of EE, whereas no degradation was found immediately after processing. Hence we conclude that drying parameters have a relevant influence on long-term EE stability.
NASA Astrophysics Data System (ADS)
Chen, Yiming; Zhou, Beihai; Yuan, Rongfang; Bao, Xiangming; Li, Dongwei
2018-02-01
In recent years, water contamination has become more and more serious due to increasing wastewater discharge, so China has accelerated the pace of constructing sewage treatment plants in small towns. However, no corresponding technical specifications for the choice of treatment technology have yet been issued in China. This article is therefore based on the Duliujian River basin: through field research, data collection and analysis of relevant documentation, seven technologies were preliminarily selected as alternatives under the Tianjin local sewage discharge standard: improved A2/O, integrated oxidation ditch, Orbal oxidation ditch, CASS, A/O + refined diatomite, BIOLAK and UNITANK. The analytic hierarchy process (AHP) was then used to evaluate the seven alternatives, leading to the conclusion that CASS is the most suitable main technology for new sewage treatment plants in small towns along the Duliujian River basin.
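The core AHP computation behind such a ranking can be sketched as follows; the pairwise comparison matrix below is a made-up three-criterion example, not the paper's data.

```python
# Hedged AHP sketch: derive priority weights from a pairwise comparison
# matrix via its principal eigenvector, plus Saaty's consistency index.
import numpy as np

A = np.array([[1.0,   3.0,   5.0],
              [1/3.0, 1.0,   2.0],
              [1/5.0, 1/2.0, 1.0]])   # judgments on Saaty's 1-9 scale (made up)

vals, vecs = np.linalg.eig(A)
k = int(np.argmax(vals.real))
w = np.abs(vecs[:, k].real)
w /= w.sum()                          # normalized priority weights

ci = (vals.real[k] - A.shape[0]) / (A.shape[0] - 1)   # consistency index
print("weights:", np.round(w, 3), " CI:", round(ci, 3))
```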
Ryals, John; Lawton, Kay; Stevens, Daniel; Milburn, Michael
2007-07-01
Metabolon is an emerging technology company developing proprietary analytical methods and software for biomarker discovery using metabolomics. The company's aim is to measure all small molecules (<1500 Da) in a biological sample. These small-molecule compounds include biochemicals of cellular metabolism and xenobiotics from diet and environment. Our proprietary mLIMS™ system contains advanced metabolomic software and automated data-processing tools that use a variety of data-analysis and quality-control algorithms to convert raw mass-spectrometry data into identified, quantitated compounds. Metabolon's primary focus is a fee-for-service business that exploits this technology for pharmaceutical and biotechnology companies, with additional clients in the consumer goods, cosmetics and agricultural industries. Fee-for-service studies are often collaborations with groups that employ a variety of technologies for biomarker discovery. Metabolon's goal is to develop technology that will automatically analyze any sample for the small-molecule components present and become a standard technology for applications in health and related sciences.
An applied study using systems engineering methods to prioritize green systems options
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Sonya M; Macdonald, John M
2009-01-01
For many years, there have been questions about the effectiveness of applying different green solutions. If you're building a home and wish to use green technologies, where do you start? While all technologies sound promising, which will perform the best over time? All this has to be considered within the cost and schedule of the project. The amount of information available on the topic can be overwhelming. We seek to examine whether Systems Engineering methods can be used to help people choose and prioritize technologies that fit within their project and budget. Several methods are used to gain perspective into how to select the green technologies, such as the Analytic Hierarchy Process (AHP) and Kepner-Tregoe. In our study, subjects applied these methods to analyze cost, schedule, and trade-offs. Results will document whether the experimental approach is applicable to defining system priorities for green technologies.
NASA Astrophysics Data System (ADS)
Latinović, T. S.; Preradović, D. M.; Barz, C. R.; Latinović, M. T.; Petrica, P. P.; Pop-Vadean, A.
2016-08-01
The amount of data at the global level has grown exponentially. Along with this phenomenon, we have a need for new units of measure, such as the exabyte, zettabyte, and yottabyte, to express the amount of data. This growth creates a situation in which the classic systems for the collection, storage, processing, and visualization of data are losing the battle with the large volume, speed, and variety of data that is generated continuously. Much of this data is created by the Internet of Things, IoT (cameras, satellites, cars, GPS navigation, etc.). The challenge is to come up with new technologies and tools for the management and exploitation of these large amounts of data. Big Data has been a hot topic in IT circles in recent years, and it is increasingly recognized in the business world and in public administration. This paper proposes an ontology of big data analytics and examines how to enhance business intelligence through big data analytics as a service, by presenting a big data analytics service-oriented architecture. The paper also discusses the interrelationship between business intelligence and big data analytics. The approach proposed in this paper might facilitate the research and development of business analytics, big data analytics, and business intelligence, as well as intelligent agents.
Rehabilitation Risk Management: Enabling Data Analytics with Quantified Self and Smart Home Data.
Hamper, Andreas; Eigner, Isabella; Wickramasinghe, Nilmini; Bodendorf, Freimut
2017-01-01
A variety of acute and chronic diseases require rehabilitation at home after treatment. Outpatient rehabilitation is crucial for the quality of the medical outcome but is mainly performed without medical supervision. Non-compliance can lead to severe health risks and readmission to the hospital. While the patient is closely monitored in the hospital, methods and technologies to identify risks at home have yet to be developed. We analyze state-of-the-art monitoring systems and technologies and show possibilities for transferring these technologies into rehabilitation monitoring. For this purpose, we analyze sensor technology from the fields of Quantified Self and Smart Homes. The available sensor data from this consumer-grade technology is summarized to give an overview of the possibilities for medical data analytics. Subsequently, we present a conceptual roadmap for transferring data analytics methods to sensor-based rehabilitation risk management.
On the Modeling and Management of Cloud Data Analytics
NASA Astrophysics Data System (ADS)
Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni
A new era is dawning where vast amounts of data are subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged where seemingly limitless compute and storage resources are being provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good run-time performance for data analytics workloads is an important issue for cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their patterns of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. All along, we use the Map-Reduce paradigm as an illustration of data analytics.
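Since the paper uses Map-Reduce as its running illustration, a toy version of the paradigm in pure Python may help fix ideas (word counts over a few records; real deployments distribute the map, shuffle, and reduce phases across a cluster):

```python
# Toy Map-Reduce: map each record to (key, 1) pairs, shuffle by key,
# then reduce each group by summation. Single-process, for illustration.
from functools import reduce
from itertools import groupby

records = ["user click ad", "user click page", "ad click"]

mapped = [(w, 1) for rec in records for w in rec.split()]                    # map
grouped = groupby(sorted(mapped), key=lambda kv: kv[0])                      # shuffle
counts = {k: reduce(lambda acc, kv: acc + kv[1], g, 0) for k, g in grouped}  # reduce
print(counts)  # {'ad': 2, 'click': 3, 'page': 1, 'user': 2}
```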
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gałuszka, Agnieszka, E-mail: Agnieszka.Galuszka@ujk.edu.pl; Migaszewski, Zdzisław M.; Namieśnik, Jacek
The recent rapid progress in the technology of field portable instruments has increased their applications in environmental sample analysis. These instruments offer a possibility of cost-effective, non-destructive, real-time, direct, on-site measurements of a wide range of both inorganic and organic analytes in gaseous, liquid and solid samples. Some of them do not require the use of reagents and do not produce any analytical waste. All these features contribute to the greenness of field portable techniques. Several stationary analytical instruments have their portable versions. The most popular ones include: gas chromatographs with different detectors (mass spectrometer (MS), flame ionization detector, photoionization detector), ultraviolet–visible and near-infrared spectrophotometers, X-ray fluorescence spectrometers, ion mobility spectrometers, electronic noses and electronic tongues. The use of portable instruments in environmental sample analysis gives a possibility of on-site screening and a subsequent selection of samples for routine laboratory analyses. They are also very useful in situations that require an emergency response and for process monitoring applications. However, quantification of results is still problematic in many cases. The other disadvantages include: higher detection limits and lower sensitivity than those obtained in laboratory conditions, a strong influence of environmental factors on instrument performance and a high possibility of sample contamination in the field. This paper reviews recent applications of field portable instruments in environmental sample analysis and discusses their analytical capabilities. Highlights:
- Field portable instruments are widely used in environmental sample analysis.
- Field portable instruments are indispensable for analysis in emergency response.
- Miniaturization of field portable instruments reduces resource consumption.
- In situ analysis is in agreement with green analytical chemistry principles.
- Performance requirements in field analysis stimulate technological progress.
Analyte stability during the total testing process: studies of vitamins A, D and E by LC-MS/MS.
Albahrani, Ali A; Rotarou, Victor; Roche, Peter J; Greaves, Ronda F
2016-10-01
There are limited evidence-based studies demonstrating the stability of fat-soluble vitamins (FSV) measured in blood. This study aimed to examine the effects of light, temperature and time on vitamins A, D and E throughout the total testing process. Four experiments were conducted. Three investigated the sample matrices (whole blood, serum and the extracted sample) against the variables of temperature and light, and the fourth investigated the sample during the extraction process against the variable of light. All samples were analysed via our simultaneous FSV method using liquid chromatography-tandem mass spectrometry technology. The allowable clinical percentage change was calculated based on biological variation and desirable method imprecision for each analyte. The total change limit was ±7.3% for 25-OH-vitamin D3, ±11.8% for retinol and ±10.8% for α-tocopherol. Vitamins D and E were stable in the investigated conditions (concentration changes <4%) in the pre-analytical and analytical stages. Vitamin A showed photosensitivity at times >48 h, with concentration changes of -6.8% (blood) and -6.5% (serum), both within the allowable clinical percentage change. By contrast, the extracted retinol sample demonstrated a concentration change of -18.4% after 48 h of light exposure. However, vitamin A in the serum and extracted solution was stable for one month when stored at -20°C. Blood samples for vitamins D and E analyses can be processed in normal laboratory conditions of lighting and temperature. The required conditions for vitamin A analysis are similar when performed within 48 h. For longer-term storage, serum and vitamin A extracts should be stored at -20°C.
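The abstract does not spell out how the allowable change limits were computed; one standard construction from biological variation (an assumption here, not a statement of the authors' method) is the reference change value

```latex
RCV \;=\; \sqrt{2}\; Z \,\sqrt{CV_A^{2} + CV_I^{2}}, \qquad CV_A = 0.5\,CV_I
```

where CV_I is the within-subject biological variation, CV_A the desirable analytical imprecision derived from it, and Z = 1.96 for 95% probability.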
NASA Astrophysics Data System (ADS)
Tamkin, G.; Schnase, J. L.; Duffy, D.; Li, J.; Strong, S.; Thompson, J. H.
2017-12-01
NASA's efforts to advance climate analytics-as-a-service are making new capabilities available to the research community: (1) a full-featured Reanalysis Ensemble Service (RES) comprising monthly means data from multiple reanalysis data sets, accessible through an enhanced set of extraction, analytic, arithmetic, and intercomparison operations; the operations are made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services Python library, CDSlib; (2) a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables; this near real-time capability enables advanced technologies like Spark and Hadoop-based MapReduce analytics over native NetCDF files; and (3) a WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation systems such as ESGF. The Reanalysis Ensemble Service includes the following:
- A new API that supports full temporal, spatial, and grid-based resolution services with sample queries
- A Docker-ready RES application to deploy across platforms
- Extended capabilities that enable single- and multiple-reanalysis area average, vertical average, re-gridding, standard deviation, and ensemble averages
- Convenient, one-stop shopping for commonly used data products from multiple reanalyses, including basic sub-setting and arithmetic operations (e.g., avg, sum, max, min, var, count, anomaly)
- Full support for the MERRA-2 reanalysis dataset in addition to ECMWF ERA-Interim, NCEP CFSR, JMA JRA-55 and NOAA/ESRL 20CR…
- A Jupyter notebook-based distribution mechanism designed for client use cases that combines CDSlib documentation with interactive scenarios and personalized project management
- Supporting analytic services for NASA GMAO Forward Processing datasets
- Basic uncertainty quantification services that combine heterogeneous ensemble products with comparative observational products (e.g., reanalysis, observational, visualization)
- The ability to compute and visualize multiple reanalyses for ease of intercomparison
- Automated tools to retrieve and prepare data collections for analytic processing
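The RES arithmetic operations listed above (area averages, ensemble means, anomalies) are exposed through NASA's Web services and CDSlib; that API is not reproduced here. The sketch below performs the same arithmetic locally on NetCDF files with xarray, under the assumption that the files share a common grid; the file paths and the variable name "T2M" are hypothetical.

```python
import xarray as xr

# One monthly-means file per reanalysis (hypothetical paths and variable).
paths = ["merra2.nc", "era_interim.nc", "jra55.nc"]
members = [xr.open_dataset(p)["T2M"] for p in paths]

# Ensemble mean across reanalyses (assumes a shared lat/lon/time grid).
ensemble = xr.concat(members, dim="reanalysis")
ens_mean = ensemble.mean(dim="reanalysis")

# Area average over a lat/lon box, then the anomaly against the
# monthly climatology of the averaged series.
box = ens_mean.sel(lat=slice(30, 60), lon=slice(-10, 40)).mean(["lat", "lon"])
clim = box.groupby("time.month").mean("time")
anomaly = box.groupby("time.month") - clim
print(anomaly)
```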
NASA Astrophysics Data System (ADS)
Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin
2015-05-01
The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. A historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage users' natural attentional competencies, unique skills, and training. One particular reason emerging support tools often fail is the gap between military applications and their functions, and the functions and capabilities afforded by the cutting-edge technology employed daily by modern knowledge workers, who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and disseminate insightful analytic findings. We describe our Cognitive Systems Engineering approach to developing a novel collaborative enterprise IA system that combines modern collaboration tools with familiar contemporary social technologies. Our current findings detail specific cognitive and collaborative work support functions that defined the design requirements for a prototype analyst collaborative support environment.
Lermen, Dominik; Schmitt, Daniel; Bartel-Steinbach, Martina; Schröter-Kermani, Christa; Kolossa-Gehring, Marike; von Briesen, Hagen; Zimmermann, Heiko
2014-01-01
Technical progress has simplified tasks in laboratory diagnostics and improved the quality of test results. Errors occurring during the pre-analytical phase have a more negative impact on the quality of test results than errors encountered during the total analytical process. Different infrastructures at sampling sites can strongly influence the quality of samples and thereby of analytical results. Annually, the German Environmental Specimen Bank (ESB) collects, characterizes, and stores blood, plasma, and urine samples of 120–150 volunteers at each of four different sampling sites in Germany. The overarching goal is to investigate the exposure of non-occupationally exposed young adults to environmental pollutants, combining human biomonitoring with questionnaire data. We investigated the requirements of the study and the possibility of realizing a highly standardized sampling procedure on a mobile platform in order to increase the required quality of the pre-analytical phase. The results led to the development of a mobile epidemiologic laboratory (epiLab) in the project "Labor der Zukunft" (laboratory of the future). This laboratory includes a 14.7 m2 reception area to record medical history and exposure-relevant behavior, a 21.1 m2 examination room to record dental fillings and for blood withdrawal, a 15.5 m2 biological safety level 2 laboratory to process and analyze samples on site, including a 2.8 m2 personnel lock, and a 3.6 m2 cryofacility to immediately freeze samples. Frozen samples can be transferred to their final destination within the vehicle without breaking the cold chain. To our knowledge, we herewith describe for the first time the implementation of a biological safety level (BSL) 2 lab and an epidemiologic unit on a single mobile platform. Since 2013 we have been collecting up to 15,000 individual human samples annually under highly standardized conditions using the mobile laboratory. Characterized and free of alterations, they are kept ready for retrospective analyses in their final archive, the German ESB. PMID:25141120
Ślączka-Wilk, Magdalena M; Włodarczyk, Elżbieta; Kaleniecka, Aleksandra; Zarzycki, Paweł K
2017-07-01
There is increasing interest in the development of simple analytical systems enabling fast screening of target components in complex samples. A number of newly invented protocols are based on quasi-separation techniques involving microfluidic paper-based analytical devices and/or micro total analysis systems. Under such conditions, quantification of target components can be performed mainly thanks to selective detection. The main goal of this paper is to demonstrate that miniaturized planar chromatography can work as an efficient separation and quantification tool for the analysis of multiple targets within complex environmental samples isolated and concentrated using an optimized SPE method. In particular, we analyzed various samples collected from surface water ecosystems (lakes, rivers, and the Baltic Sea off Middle Pomerania in the northern part of Poland) in different seasons, as well as samples collected during key wastewater technological processes (originating from the "Jamno" wastewater treatment plant in Koszalin, Poland). We documented that the multiple detection of chromatographic spots on RP-18W microplates, under visible light, fluorescence, and fluorescence quenching conditions, and using the visualization reagent phosphomolybdic acid, enables fast and robust sample classification. The presented data reveal that the proposed micro-TLC system is useful, inexpensive, and can be considered a complementary method for the fast control of treated sewage water discharged by a municipal wastewater treatment plant, particularly for the detection of low-molecular-mass micropollutants with polarity ranging from estetrol to progesterone, as well as chlorophyll-related dyes. Due to the low consumption of mobile phases composed of water-alcohol binary mixtures (less than 1 mL/run for the simultaneous separation of up to nine samples), this method can be considered an environmentally friendly, green chemistry analytical tool. The described analytical protocol can be complementary to those involving classical column chromatography (HPLC) or various planar microfluidic devices.
Dropwise additive manufacturing of pharmaceutical products for melt-based dosage forms.
Içten, Elçin; Giridhar, Arun; Taylor, Lynne S; Nagy, Zoltan K; Reklaitis, Gintaras V
2015-05-01
The US Food and Drug Administration introduced the quality by design approach and process analytical technology guidance to encourage innovation and efficiency in pharmaceutical development, manufacturing, and quality assurance. As part of this renewed emphasis on the improvement of manufacturing, the pharmaceutical industry has begun to develop more efficient production processes with more intensive use of online measurement and sensing, real-time quality control, and process control tools. Here, we present dropwise additive manufacturing of pharmaceutical products (DAMPP) as an alternative to conventional pharmaceutical manufacturing methods. This mini-manufacturing process for the production of pharmaceuticals utilizes drop-on-demand printing technology for automated and controlled deposition of melt-based formulations onto edible substrates. The advantages of drop-on-demand technology, including reproducible production of small droplets, adjustable drop sizing, high placement accuracy, and flexible use of different formulations, enable production of individualized dosing even for low-dose and high-potency drugs. In this work, DAMPP is used to produce solid oral dosage forms from hot melts of an active pharmaceutical ingredient and a polymer. The dosage forms are analyzed to show the reproducibility of dosing and the dissolution behavior of different formulations. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
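For illustration, a back-of-the-envelope sketch of why reproducible drop volume translates directly into dose reproducibility in melt-based drop-on-demand printing. All numbers (melt density, drug loading, drop volumes) are assumed, not taken from the paper.

```python
import statistics

MELT_DENSITY = 1.1e3   # kg/m^3, assumed density of the molten formulation
DRUG_LOADING = 0.10    # assumed mass fraction of API in the melt

def dose_mg(drop_volume_nl: float, drops: int) -> float:
    """API mass (mg) deposited for a given droplet volume and drop count."""
    melt_mass_kg = drop_volume_nl * 1e-12 * MELT_DENSITY * drops  # 1 nL = 1e-12 m^3
    return melt_mass_kg * DRUG_LOADING * 1e6  # kg -> mg

# Ten hypothetical dispenses of 100 drops at ~50 nL with small volume jitter.
volumes_nl = [50.0, 50.4, 49.7, 50.1, 49.9, 50.2, 49.8, 50.0, 50.3, 49.6]
doses = [dose_mg(v, 100) for v in volumes_nl]
rsd = statistics.stdev(doses) / statistics.mean(doses) * 100
print(f"mean dose {statistics.mean(doses):.3f} mg, RSD {rsd:.2f}%")
```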
Some Aspects in Photogrammetry Education at the Department of Geodesy and Cadastre of the VGTU
NASA Astrophysics Data System (ADS)
Ruzgienė, Birutė
2008-03-01
Education in photogrammetry is very important when applying photogrammetric methods for terrain mapping, spatial data modelling, solving engineering tasks, measuring architectural monuments, etc. Over time, traditional photogrammetric technologies have given way to a modern, fully digital photogrammetric workflow. The number of potential users of photogrammetric methods tends to increase because of the high degree of automation in photograph (image) processing. The main subjects in the photogrammetry (particularly digital photogrammetry) educational process are discussed. Different methods and digital systems are demonstrated with examples of aerial photogrammetry products. The main objective is to explore the possibilities for training in photogrammetric measurements. Special attention is paid to stereo plotting from aerial photography, applying analytical technology modified for teaching. The integration of the functionality of digital photogrammetric systems and digital image processing is analysed as well, with the intention of extending the application areas and possibilities for the use of modern technologies in urban mapping and land cadastre. The practical presentation of photo geometry restitution is implemented as a significant part of the studies. Interactive teaching and control systems for the main photogrammetric procedures are highly desirable and would undoubtedly improve the quality of the educational process.
Applied analytical combustion/emissions research at the NASA Lewis Research Center
NASA Technical Reports Server (NTRS)
Deur, J. M.; Kundu, K. P.; Nguyen, H. L.
1992-01-01
Emissions of pollutants from future commercial transports are a significant concern. As a result, the Lewis Research Center (LeRC) is investigating various low emission combustor technologies. As part of this effort, a combustor analysis code development program was pursued to guide the combustor design process, to identify concepts having the greatest promise, and to optimize them at the lowest cost in the minimum time.
High performance cryogenic turboexpanders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Agahi, R.R.; Ershaghi, B.; Lin, M.C.
1996-12-31
The use of turboexpanders at deep cryogenic temperatures has been constrained by thermal efficiency limitations, which were mostly due to mechanical constraints. Recent improvements in analytical techniques, bearing technology, and design features have made it possible to design and operate turboexpanders under more favorable conditions, such as higher rotational speeds. Several turboexpander installations in helium and hydrogen processes have shown a significant improvement in plant performance over non-turboexpander options.
Real-time assessment of critical quality attributes of a continuous granulation process.
Fonteyne, Margot; Vercruysse, Jurgen; Díaz, Damián Córdoba; Gildemyn, Delphine; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas
2013-02-01
There is an intention to shift pharmaceutical manufacturing of solid dosage forms from traditional batch production towards continuous production. The currently applied conventional quality control systems, based on sampling and time-consuming off-line analyses in analytical laboratories, would annul the advantages of continuous processing. It is clear that real-time quality assessment and control is indispensable for continuous production. This manuscript evaluates the strengths and weaknesses of several complementary Process Analytical Technology (PAT) tools implemented in a continuous wet granulation process, which is part of a fully continuous powder-to-tablet production line. The use of Raman and NIR spectroscopy and a particle size distribution analyzer is evaluated for the real-time monitoring of critical parameters during the continuous wet agglomeration of an anhydrous theophylline–lactose blend. The solid-state characteristics and particle size of the granules were analyzed in real time and the critical process parameters influencing these granule characteristics were identified. The temperature of the granulator barrel, the amount of granulation liquid added and, to a lesser extent, the powder feed rate were the parameters influencing the solid state of the active pharmaceutical ingredient (API). A higher barrel temperature and a higher powder feed rate resulted in larger granules.
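As an illustration of the real-time quality assessment the authors argue for, here is a generic Shewhart-style control check on an in-line granule size signal. The limits, the readings, and the choice of d50 as the monitored attribute are hypothetical; this is not the authors' control strategy.

```python
import statistics

def control_limits(reference: list[float]) -> tuple[float, float]:
    """Mean +/- 3 sigma limits from an in-control reference run."""
    mu = statistics.mean(reference)
    sigma = statistics.stdev(reference)
    return mu - 3 * sigma, mu + 3 * sigma

reference_d50 = [412, 405, 398, 420, 410, 403, 415, 408]  # um, hypothetical
lo, hi = control_limits(reference_d50)

for d50 in [409, 417, 455]:  # incoming in-line readings, hypothetical
    status = "OK" if lo <= d50 <= hi else "OUT OF CONTROL"
    print(f"d50 = {d50} um -> {status}")
```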
Health-Enabled Smart Sensor Fusion Technology
NASA Technical Reports Server (NTRS)
Wang, Ray
2012-01-01
A process was designed to fuse data from multiple sensors in order to make a more accurate estimation of the environment and overall health in an intelligent rocket test facility (IRTF), to provide reliable, high-confidence measurements for a variety of propulsion test articles. The objective of the technology is to provide sensor fusion based on a distributed architecture. Specifically, the fusion technology is intended to provide health condition monitoring capability at the intelligent transceiver, such as RF signal strength, battery reading, computing resource monitoring, and sensor data reading. The technology also provides analytic and diagnostic intelligence at the intelligent transceiver, enhancing the IEEE 1451.x-based standard for sensor data management and distribution, as well as providing appropriate communications protocols to enable complex interactions and to support a timely, high-quality flow of information among the system elements.
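A minimal sketch of one textbook fusion rule, inverse-variance weighting, of the kind a distributed architecture might apply to redundant transceiver readings. This is a generic illustration, not the NASA implementation, and all values are assumed.

```python
def fuse(readings: list[tuple[float, float]]) -> tuple[float, float]:
    """Combine (value, variance) pairs into a (fused value, fused variance).

    Each reading is weighted by the inverse of its noise variance, so the
    least noisy sensors dominate the estimate.
    """
    weights = [1.0 / var for _, var in readings]
    fused = sum(w * v for w, (v, _) in zip(weights, readings)) / sum(weights)
    return fused, 1.0 / sum(weights)

# Three redundant pressure transducers on one line (hypothetical psi values).
value, variance = fuse([(101.2, 0.25), (100.8, 0.16), (101.5, 0.49)])
print(f"fused estimate {value:.2f} psi (variance {variance:.3f})")
```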
System Architecture Modeling for Technology Portfolio Management using ATLAS
NASA Technical Reports Server (NTRS)
Thompson, Robert W.; O'Neil, Daniel A.
2006-01-01
Strategic planners and technology portfolio managers have traditionally relied on consensus-based tools, such as the Analytical Hierarchy Process (AHP) and Quality Function Deployment (QFD), in planning the funding of technology development. While useful to a certain extent, these tools are limited in their ability to fully quantify the impact of a technology choice on system mass, system reliability, project schedule, and lifecycle cost. The Advanced Technology Lifecycle Analysis System (ATLAS) aims to provide strategic planners a decision support tool for analyzing technology selections within a Space Exploration Architecture (SEA). Using ATLAS, strategic planners can select physics-based system models from a library, configure the systems with technologies and performance parameters, and plan the deployment of a SEA. Key parameters for current and future technologies have been collected from subject-matter experts and other documented sources in the Technology Tool Box (TTB). ATLAS can be used to compare the technical feasibility and economic viability of a set of technology choices for one SEA, and compare it against another set of technology choices or another SEA. System architecture modeling in ATLAS is a multi-step process. First, the modeler defines the system-level requirements. Second, the modeler identifies the technologies of interest whose impact on the SEA is to be assessed. Third, the system modeling team creates models of architecture elements (e.g. launch vehicles, in-space transfer vehicles, crew vehicles) if they are not already in the model library. Finally, the architecture modeler develops a script for the ATLAS tool to run, and the results for comparison are generated.
7 CFR 94.103 - Analytical methods.
Code of Federal Regulations, 2014 CFR
2014-01-01
Title 7, Agriculture, Volume 3 (2014-01-01). POULTRY AND EGG PRODUCTS; Voluntary Analyses of Egg Products; § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg…
7 CFR 94.103 - Analytical methods.
Code of Federal Regulations, 2010 CFR
2010-01-01
Title 7, Agriculture, Volume 3 (2010-01-01). POULTRY AND EGG PRODUCTS; Voluntary Analyses of Egg Products; § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg…
7 CFR 94.103 - Analytical methods.
Code of Federal Regulations, 2011 CFR
2011-01-01
Title 7, Agriculture, Volume 3 (2011-01-01). POULTRY AND EGG PRODUCTS; Voluntary Analyses of Egg Products; § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg…
7 CFR 94.103 - Analytical methods.
Code of Federal Regulations, 2013 CFR
2013-01-01
Title 7, Agriculture, Volume 3 (2013-01-01). POULTRY AND EGG PRODUCTS; Voluntary Analyses of Egg Products; § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg…
7 CFR 94.103 - Analytical methods.
Code of Federal Regulations, 2012 CFR
2012-01-01
Title 7, Agriculture, Volume 3 (2012-01-01). POULTRY AND EGG PRODUCTS; Voluntary Analyses of Egg Products; § 94.103 Analytical methods. The analytical methods used by the Science and Technology Division laboratories to perform voluntary analyses for egg…
Efficient data management tools for the heterogeneous big data warehouse
NASA Astrophysics Data System (ADS)
Alekseev, A. A.; Osipova, V. V.; Ivanov, M. A.; Klimentov, A.; Grigorieva, N. V.; Nalamwar, H. S.
2016-09-01
The traditional RDBMS, built around normalized data structures, has served well for decades, but the technology is not optimal for data processing and analysis in data-intensive fields like social networks, the oil and gas industry, experiments at the Large Hadron Collider, etc. Several challenges have recently been raised concerning the scalability of data-warehouse-like workloads against the transactional schema, in particular for the analysis of archived data or the aggregation of data for summary and accounting purposes. The paper evaluates new database technologies like HBase, Cassandra, and MongoDB, commonly referred to as NoSQL databases, for handling messy, varied and large amounts of data. The evaluation considers the performance, throughput and scalability of the above technologies for several scientific and industrial use cases. This paper outlines the technologies and architectures needed for processing Big Data, and describes the back-end application that implements data migration from an RDBMS to a NoSQL data warehouse, the NoSQL database organization, and how it could be useful for further data analytics.
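An illustrative sketch of the RDBMS-to-NoSQL migration step described above, streaming rows from a relational source into MongoDB as documents. The schema, connection strings, and batch size are hypothetical, not the paper's back-end.

```python
import sqlite3
from pymongo import MongoClient

rdbms = sqlite3.connect("warehouse.db")           # hypothetical source DB
rdbms.row_factory = sqlite3.Row                   # rows behave like dicts
mongo = MongoClient("mongodb://localhost:27017")  # hypothetical target
target = mongo["warehouse"]["transfers"]

batch = []
for row in rdbms.execute("SELECT * FROM transfers"):
    batch.append(dict(row))          # relational row -> JSON-like document
    if len(batch) == 1000:           # insert in bulk to limit round trips
        target.insert_many(batch)
        batch.clear()
if batch:
    target.insert_many(batch)        # flush the final partial batch
```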
Multidisciplinary optimization in aircraft design using analytic technology models
NASA Technical Reports Server (NTRS)
Malone, Brett; Mason, W. H.
1991-01-01
An approach to multidisciplinary optimization is presented which combines the Global Sensitivity Equation method, parametric optimization, and analytic technology models. The result is a powerful yet simple procedure for identifying key design issues. It can be used both to investigate technology integration issues very early in the design cycle, and to establish the information flow framework between disciplines for use in multidisciplinary optimization projects using much more computationally intense representations of each technology. To illustrate the approach, an examination of the optimization of a short-takeoff heavy transport aircraft is presented for numerous combinations of performance and technology constraints.
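For readers unfamiliar with the Global Sensitivity Equation method named above, a scalar sketch: local partial sensitivities of two coupled disciplines are assembled into a linear system whose solution gives the total derivatives a design optimizer needs. The partial-derivative values below are assumed for illustration.

```python
import numpy as np

# Two coupled disciplines: Y1 = f1(X, Y2) and Y2 = f2(X, Y1).
# Local partials, e.g. from finite differences (assumed values).
df1_dY2, df1_dX = 0.3, 1.2
df2_dY1, df2_dX = -0.5, 0.4

# Global Sensitivity Equations: (I - coupling) * dY/dX = dF/dX.
A = np.array([[1.0,      -df1_dY2],
              [-df2_dY1,  1.0]])
b = np.array([df1_dX, df2_dX])

dY_dX = np.linalg.solve(A, b)  # total derivatives [dY1/dX, dY2/dX]
print(dY_dX)
```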
Improving Adolescent Judgment and Decision Making
Dansereau, Donald F.; Knight, Danica K.; Flynn, Patrick M.
2013-01-01
Human judgment and decision making (JDM) has substantial room for improvement, especially among adolescents. Increased technological and social complexity "ups the ante" for developing impactful JDM interventions and aids. Current explanatory advances in this field emphasize dual processing models that incorporate both experiential and analytic processing systems. According to these models, judgments and decisions based on the experiential system are rapid and stem from automatic reference to previously stored episodes. Those based on the analytic system are viewed as slower and consciously developed. These models also hypothesize that metacognitive (self-monitoring) activities embedded in the analytic system influence how and when the two systems are used. What is not included in these models is the development of an intersection between the two systems. Because such an intersection is strongly suggested by memory and educational research as the basis of wisdom/expertise, the present paper describes an Integrated Judgment and Decision-Making Model (IJDM) that incorporates this component. Wisdom/expertise is hypothesized to contain a collection of schematic structures that can emerge from the accumulation of similar episodes or repeated analytic practice. As will be argued, in comparison to dual-system models, the addition of this component provides a broader basis for selecting and designing interventions to improve adolescent JDM. Its development also has implications for generally enhancing cognitive interventions by adopting principles from athletic training to create automated, expert behaviors. PMID:24391350
Novel immunoassay formats for integrated microfluidic circuits: diffusion immunoassays (DIA)
NASA Astrophysics Data System (ADS)
Weigl, Bernhard H.; Hatch, Anson; Kamholz, Andrew E.; Yager, Paul
2000-03-01
Novel designs of integrated fluidic microchips allow separations, chemical reactions, and calibration-free analytical measurements to be performed directly in very small quantities of complex samples such as whole blood and contaminated environmental samples. This technology lends itself to applications such as clinical diagnostics, including tumor marker screening, and environmental sensing in remote locations. Lab-on-a-Chip based systems offer many advantages over traditional analytical devices: They consume extremely low volumes of both samples and reagents. Each chip is inexpensive and small. The sampling-to-result time is extremely short. They perform all analytical functions, including sampling, sample pretreatment, separation, dilution, and mixing steps, chemical reactions, and detection in an integrated microfluidic circuit. Lab-on-a-Chip systems enable the design of small, portable, rugged, low-cost, easy to use, yet extremely versatile and capable diagnostic instruments. In addition, fluids flowing in microchannels exhibit unique characteristics ('microfluidics'), which allow the design of analytical devices and assay formats that would not function on a macroscale. Existing Lab-on-a-chip technologies work very well for highly predictable and homogeneous samples common in genetic testing and drug discovery processes. One of the biggest challenges for current Labs-on-a-chip, however, is to perform analysis in the presence of the complexity and heterogeneity of actual samples such as whole blood or contaminated environmental samples. Micronics has developed a variety of Lab-on-a-Chip assays that can overcome those shortcomings. We will now present various types of novel Lab-on-a-Chip-based immunoassays, including the so-called Diffusion Immunoassays (DIA) that are based on the competitive laminar diffusion of analyte molecules and tracer molecules into a region of the chip containing antibodies that target the analyte molecules. Advantages of this technique are a reduction in reagents, higher sensitivity, minimal preparation of complex samples such as blood, real-time calibration, and extremely rapid analysis.
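A simplified sketch of the physics behind the DIA: in laminar flow, interdiffusion across the channel is effectively one-dimensional, and a small analyte penetrates the neighboring stream faster than a large labeled tracer. The explicit finite-difference scheme and all parameter values below are illustrative, not Micronics' device model.

```python
import numpy as np

def diffuse(D, c0, dx=1e-6, dt=1e-4, steps=2000):
    """Evolve a concentration profile under Fick's second law (explicit FD)."""
    c = c0.copy()
    alpha = D * dt / dx**2   # stability requires alpha < 0.5
    for _ in range(steps):
        c[1:-1] += alpha * (c[2:] - 2 * c[1:-1] + c[:-2])
    return c

# Step profile: analyte/tracer confined to one half of the channel width.
n = 200
step = np.zeros(n)
step[: n // 2] = 1.0

analyte = diffuse(D=4e-10, c0=step)  # small molecule, faster diffusion
tracer = diffuse(D=4e-11, c0=step)   # large labeled tracer, slower

# Past the midline, the faster-diffusing analyte has penetrated further,
# which is the contrast the diffusion immunoassay reads out optically.
print(analyte[120] > tracer[120])  # True
```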
Insight and analysis problem solving in microbes to machines.
Clark, Kevin B
2015-11-01
A key feature for obtaining solutions to difficult problems, insight is oftentimes vaguely regarded as a special discontinuous intellectual process and/or a cognitive restructuring of problem representation or goal approach. However, this nearly century-old state of the art, devised by the Gestalt tradition to explain the non-analytical or non-trial-and-error, goal-seeking aptitude of primate mentality, tends to neglect problem-solving capabilities of lower animal phyla, Kingdoms other than Animalia, and advancing smart computational technologies built from biological, artificial, and composite media. Attempting to provide an inclusive, precise definition of insight, two major criteria of insight, discontinuous processing and problem restructuring, are here reframed using terminology and statistical mechanical properties of computational complexity classes. Discontinuous processing becomes abrupt state transitions in algorithmic/heuristic outcomes or in types of algorithms/heuristics executed by agents using classical and/or quantum computational models. And problem restructuring becomes combinatorial reorganization of resources, problem-type substitution, and/or exchange of computational models. With insight bounded by computational complexity, humans, ciliated protozoa, and complex technological networks, for example, show insight when restructuring time requirements, combinatorial complexity, and problem type to solve polynomial and nondeterministic polynomial decision problems. Similar effects are expected from other problem types, supporting the idea that insight might be an epiphenomenon of analytical problem solving and consequently part of a larger information processing framework. Thus, this computational complexity definition of insight improves the power, external and internal validity, and reliability of operational parameters with which to classify, investigate, and produce the phenomenon for computational agents ranging from microbes to man-made devices. Copyright © 2015 Elsevier Ltd. All rights reserved.
Wu, Suo-Wei; Chen, Tong; Pan, Qi; Wei, Liang-Yu; Wang, Qin; Li, Chao; Song, Jing-Chen; Luo, Ji
2018-06-05
The development and application of medical technologies reflect the medical quality and clinical capacity of a hospital. They are also an effective approach to upgrading medical services and core competitiveness among medical institutions. This study aimed to build a quantitative medical technology evaluation system through a questionnaire survey within medical institutions, to assess medical technologies more objectively and accurately, and to promote the management of medical technology quality and ensure the safety of operations in hospitals. A two-level quantitative medical technology evaluation system was built through a two-round questionnaire survey of chosen experts. The Delphi method was applied in identifying the structure of the evaluation system and its indicators. The experts' judgments on the indicators were adopted in building the pairwise comparison matrices so that the weight coefficients, maximum eigenvalue (λmax), consistency index (CI), and random consistency ratio (CR) could be obtained. The results were verified through consistency tests, and the index weight coefficient of each indicator was calculated through the analytic hierarchy process. Twenty-six experts from different medical fields were involved in the questionnaire survey, 25 of whom successfully responded to both rounds. Altogether, 4 primary indicators (safety, effectiveness, innovativeness, and benefits), as well as 13 secondary indicators, were included in the evaluation system. The matrices were built to compute the λmax, CI, and CR for each expert in the survey; the index weight coefficients of the primary indicators were 0.33, 0.28, 0.27, and 0.12, respectively, and the index weight coefficients of the secondary indicators were calculated accordingly. As the two-round expert questionnaire survey and statistical analysis were performed and the credibility of the results was verified through consistency tests, the study established a quantitative medical technology evaluation system model and assessment indicators within medical institutions based on the Delphi method and the analytic hierarchy process. Further verification, adjustment, and optimization of the system and indicators will be performed in follow-up studies.
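A minimal sketch of the AHP consistency computation the study reports (the method is standard; the pairwise comparison matrix below is hypothetical): priority weights come from the principal eigenvector, and λmax yields CI and CR.

```python
import numpy as np

# Hypothetical 4x4 pairwise comparisons of safety, effectiveness,
# innovativeness, benefits (Saaty scale, reciprocal matrix).
A = np.array([[1,    2,    1,    3],
              [1/2,  1,    1,    3],
              [1,    1,    1,    2],
              [1/3,  1/3,  1/2,  1]], dtype=float)

# Priority weights: normalized principal eigenvector.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
lam_max = eigvals.real[k]
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

n = A.shape[0]
CI = (lam_max - n) / (n - 1)  # consistency index
RI = 0.90                     # Saaty's random index for n = 4
CR = CI / RI                  # judgments acceptable if CR < 0.10
print(w.round(3), round(lam_max, 3), round(CI, 3), round(CR, 3))
```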
Federal Ocean Energy Technology
NASA Astrophysics Data System (ADS)
1987-10-01
The Department of Energy's (DOE) Ocean Energy Technology (OET) Program is looking for cost-effective ways to harness ocean energy to help power tomorrow's world. Federally sponsored researchers are studying methods to transform the solar heat stored in the ocean's surface waters into electricity, as well as new ways to convert wave energy into mechanical energy or electricity. This report provides a summary of research completed during FY86. Four major research areas are addressed in the work covered by this report: Thermodynamic Research and Analysis, which addresses the process and system analyses that provide the underlying understanding of the physical effects constituting the energy conversion processes; Experimental Verification and Testing, which provides confirmation of the analytical projections and empirical relationships; Materials and Structural Research, which addresses special materials compatibility issues related to operation in the sea, with much of its focus on concepts for the system CWP, a major technology cost driver; and Oceanographic, Environmental, and Geotechnical Research, which addresses the unique design requirements imposed by construction in steep-slope coastal areas.
Study on internal flow and surface deformation of large droplet levitated by ultrasonic wave.
Abe, Yutaka; Hyuga, Daisuke; Yamada, Shogo; Aoki, Kazuyoshi
2006-09-01
It is expected that new materials will be manufactured by containerless processing in the microgravity environment of space. Under microgravity, handling technology for molten metal is important for such processes. There are many previous studies of droplet levitation technologies, including the use of acoustic waves, as holding technologies. However, experimental and analytical information about the relationship between the surface deformation and the internal flow of a large levitated droplet is still lacking. The purpose of this study is to experimentally investigate the behavior of a large droplet levitated in an acoustic wave field and its internal flow. To achieve this, first, numerical simulation is conducted to clarify the characteristics of the acoustic wave field. Second, the levitation characteristics and the internal flow of the droplet levitated by the ultrasonic standing wave are investigated under a normal gravity environment. Finally, the levitation characteristics and internal flow of the levitated droplet are observed under microgravity in an aircraft, to compare the results with the experiment performed under normal gravity.
Commercial Aircraft Protection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ehst, David A.
This report summarizes the results of theoretical research performed during 3 years of P371 Project implementation. As a result of this research, a new conceptual technology for quasi-passive individual infrared protection of heat-generating objects, Spatial Displacement of Thermal Image (SDTI) technology, was developed. Theoretical substantiation and a description of the working processes of a civil aircraft individual IR-protection system were provided. The mathematical models and methodology were presented, and analytical dependencies were obtained which allow theoretical study of the effect of an intentionally arranged dynamic field of artificial thermal interference with variable contrast on the main parameters of optic-electronic tracking and homing systems.
Peak distortion effects in analytical ion chromatography.
Wahab, M Farooq; Anderson, Jordan K; Abdelrady, Mohamed; Lucy, Charles A
2014-01-07
The elution profile of chromatographic peaks provides fundamental understanding of the processes that occur in the mobile phase and the stationary phase. Major advances have been made in the column chemistry and suppressor technology in ion chromatography (IC) to handle a variety of sample matrices and ions. However, if the samples contain high concentrations of matrix ions, the overloaded peak elution profile is distorted. Consequently, the trace peaks shift their positions in the chromatogram in a manner that depends on the peak shape of the overloading analyte. In this work, the peak shapes in IC are examined from a fundamental perspective. Three commercial IC columns AS16, AS18, and AS23 were studied with borate, hydroxide and carbonate as suppressible eluents. Monovalent ions (chloride, bromide, and nitrate) are used as model analytes under analytical (0.1 mM) to overload conditions (10-500 mM). Both peak fronting and tailing are observed. On the basis of competitive Langmuir isotherms, if the eluent anion is more strongly retained than the analyte ion on an ion exchanger, the analyte peak is fronting. If the eluent is more weakly retained on the stationary phase, the analyte peak always tails under overload conditions regardless of the stationary phase capacity. If the charge of the analyte and eluent anions are different (e.g., Br(-) vs CO3(2-)), the analyte peak shapes depend on the eluent concentration in a more complex pattern. It was shown that there are interesting similarities with peak distortions due to strongly retained mobile phase components in other modes of liquid chromatography.
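A small numeric sketch of the competitive Langmuir reasoning above (the isotherm form is standard; all parameter values are assumed): as analyte and eluent compete for ion-exchange sites, the analyte's effective distribution ratio q/c falls with loading. That convex (Langmuirian) curvature corresponds to the self-sharpening front and tailing rear seen when the eluent is the more weakly retained ion.

```python
import numpy as np

def q_analyte(cA, cE, aA=10.0, bA=0.05, bE=0.20):
    """Stationary-phase analyte concentration, competitive Langmuir form."""
    return aA * cA / (1.0 + bA * cA + bE * cE)

cA = np.linspace(0.1, 500.0, 5)   # mM, analytical -> overload range
for eluent_mM in (10.0, 100.0):   # weaker vs stronger eluent competition
    ratio = q_analyte(cA, eluent_mM) / cA
    print(f"eluent {eluent_mM:5.0f} mM: q/c falls from "
          f"{ratio[0]:.2f} to {ratio[-1]:.2f} across the overload range")
```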
Großhans, Steffen; Rüdt, Matthias; Sanden, Adrian; Brestrich, Nina; Morgenstern, Josefine; Heissler, Stefan; Hubbuch, Jürgen
2018-04-27
Fourier-transform infrared spectroscopy (FTIR) is a well-established spectroscopic method in the analysis of small molecules and protein secondary structure. However, FTIR is not commonly applied for in-line monitoring of protein chromatography. Here, the potential of in-line FTIR as a process analytical technology (PAT) in downstream processing was investigated in three case studies addressing the limits of currently applied spectroscopic PAT methods. A first case study exploited the secondary structural differences of monoclonal antibodies (mAbs) and lysozyme to selectively quantify the two proteins with partial least squares regression (PLS), giving root mean square errors of cross-validation (RMSECV) of 2.42 g/l and 1.67 g/l, respectively. The corresponding Q2 values are 0.92 and 0.99, respectively, indicating robust models in the calibration range. Second, a process separating lysozyme and PEGylated lysozyme species was monitored, giving an estimate of the PEGylation degree of currently eluting species with RMSECV of 2.35 g/l for lysozyme and 1.24 g/l for PEG, with Q2 of 0.96 and 0.94, respectively. Finally, Triton X-100 was added to a feed of lysozyme as a typical process-related impurity. It was shown that this species could be selectively quantified from the FTIR 3D field without PLS calibration. In summary, the proposed PAT tool has the potential to be used as a versatile option for monitoring protein chromatography. It may help to achieve a more complete implementation of the PAT initiative by mitigating limitations of currently used techniques. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
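A sketch of the PLS-with-cross-validation workflow behind the RMSECV and Q2 figures quoted above, using scikit-learn on synthetic single-component spectra. The data, component count, and noise level are invented for illustration, not the study's calibration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_channels = 60, 300

# Synthetic spectra: one Gaussian band scaled by concentration, plus noise.
conc = rng.uniform(0, 10, n_samples)  # g/L
band = np.exp(-0.5 * ((np.arange(n_channels) - 150) / 20) ** 2)
X = np.outer(conc, band) + rng.normal(0, 0.02, (n_samples, n_channels))

# Cross-validated predictions -> RMSECV, as in spectroscopic calibration.
pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, X, conc, cv=10).ravel()
rmsecv = np.sqrt(np.mean((pred - conc) ** 2))
print(f"RMSECV = {rmsecv:.3f} g/L")
```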
NASA Astrophysics Data System (ADS)
Magnoni, L.; Suthakar, U.; Cordeiro, C.; Georgiou, M.; Andreeva, J.; Khan, A.; Smith, D. R.
2015-12-01
Monitoring the WLCG infrastructure requires the gathering and analysis of a high volume of heterogeneous data (e.g. data transfers, job monitoring, site tests) coming from different services and experiment-specific frameworks to provide a uniform and flexible interface for scientists and sites. The current architecture, where relational database systems are used to store, to process and to serve monitoring data, has limitations in coping with the foreseen increase in the volume (e.g. higher LHC luminosity) and the variety (e.g. new data-transfer protocols and new resource types, such as cloud computing) of WLCG monitoring events. This paper presents a new scalable data store and analytics platform designed by the Support for Distributed Computing (SDC) group, at the CERN IT department, which uses a variety of technologies each targeting specific aspects of big-scale distributed data processing (commonly referred to as the lambda-architecture approach). Results of data processing on Hadoop for WLCG data activities monitoring are presented, showing how the new architecture can easily analyze hundreds of millions of transfer logs in a few minutes. Moreover, a comparison of data partitioning, compression and file formats (e.g. CSV, Avro) is presented, with particular attention given to how the file structure impacts the overall MapReduce performance. In conclusion, the evolution of the current implementation, which focuses on data storage and batch processing, towards a complete lambda architecture is discussed, with consideration of candidate technology for the serving layer (e.g. Elasticsearch) and a description of a proof-of-concept implementation, based on Apache Spark and Esper, for the real-time part, which compensates for batch-processing latency and automates problem detection and failures.
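An illustrative batch-layer aggregation in PySpark of the kind described above, summarising transfer logs by site pair. The column names, HDFS path, and CSV input are hypothetical, not the CERN schema.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("transfer-summary").getOrCreate()

# Hypothetical log schema: src_site, dst_site, bytes, status.
logs = (spark.read
             .option("header", "true")
             .option("inferSchema", "true")
             .csv("hdfs:///wlcg/transfers/*.csv"))

# Per site-pair totals and failure rate over the whole log set.
summary = (logs.groupBy("src_site", "dst_site")
               .agg(F.count("*").alias("transfers"),
                    F.sum("bytes").alias("bytes_moved"),
                    F.avg((F.col("status") == "FAILED").cast("int"))
                     .alias("failure_rate")))

summary.orderBy(F.desc("bytes_moved")).show(10)
```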
Environmental management strategy: four forces analysis.
Doyle, Martin W; Von Windheim, Jesko
2015-01-01
We develop an analytical approach for more systematically analyzing environmental management problems in order to develop strategic plans. This approach can be deployed by agencies, non-profit organizations, corporations, or other organizations and institutions tasked with improving environmental quality. The analysis relies on assessing the underlying natural processes followed by articulation of the relevant societal forces causing environmental change: (1) science and technology, (2) governance, (3) markets and the economy, and (4) public behavior. The four forces analysis is then used to strategize which types of actions might be most effective at influencing environmental quality. Such strategy has been under-used and under-valued in environmental management outside of the corporate sector, and we suggest that this four forces analysis is a useful analytic to begin developing such strategy.
New Tools and New Biology: Recent Miniaturized Systems for Molecular and Cellular Biology
Hamon, Morgan; Hong, Jong Wook
2013-01-01
Recent advances in applied physics and chemistry have led to the development of novel microfluidic systems. Microfluidic systems allow minute amounts of reagents to be processed using μm-scale channels and offer several advantages over conventional analytical devices for use in biological sciences: faster, more accurate and more reproducible analytical performance, reduced cell and reagent consumption, portability, and integration of functional components in a single chip. In this review, we introduce how microfluidics has been applied to biological sciences. We first present an overview of the fabrication of microfluidic systems and describe the distinct technologies available for biological research. We then present examples of microsystems used in biological sciences, focusing on applications in molecular and cellular biology. PMID:24305843
Design of High Field Solenoids made of High Temperature Superconductors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bartalesi, Antonio; /Pisa U.
2010-12-01
This thesis starts from the analytical mechanical analysis of a superconducting solenoid loaded by self-generated Lorentz forces. A finite element model is also proposed and verified against the analytical results. To study the anisotropic behavior of a coil made of layers of superconductor and insulation, a finite element meso-mechanical model is proposed and designed. The resulting material properties are then used in the main solenoid analysis. In parallel, design work is performed as well: an existing Insert Test Facility (ITF) is adapted and structurally verified to support a coil made of YBa2Cu3O7, a High Temperature Superconductor (HTS). Finally, a technological winding process is proposed and the required tooling is designed.
NASA Technical Reports Server (NTRS)
Lyle, Karen H.
2015-01-01
Acceptance of new spacecraft structural architectures and concepts requires validated design methods to minimize the expense involved with technology demonstration via flight-testing. Hypersonic Inflatable Aerodynamic Decelerator (HIAD) architectures are attractive for spacecraft deceleration because they are lightweight, store compactly, and utilize the atmosphere to decelerate a spacecraft during entry. However, designers are hesitant to include these inflatable approaches for large payloads or spacecraft because of the lack of flight validation. This publication summarizes results comparing analytical results with test data for two concepts subjected to static loading representative of entry. The level of agreement and the ability to predict the load distribution are considered sufficient to enable analytical predictions to be used in the design process.
Designing Technology-Enabled Instruction to Utilize Learning Analytics
ERIC Educational Resources Information Center
Davies, Randall; Nyland, Robert; Bodily, Robert; Chapman, John; Jones, Brian; Young, Jay
2017-01-01
A key notion conveyed by those who advocate for the use of data to enhance instruction is an awareness that learning analytics has the potential to improve instruction and learning but is not currently reaching that potential. Gibbons (2014) suggested that a lack of learning facilitated by current technology-enabled instructional systems may be…
Quo vadis, analytical chemistry?
Valcárcel, Miguel
2016-01-01
This paper presents an open, personal, fresh approach to the future of Analytical Chemistry in the context of the deep changes Science and Technology are anticipated to experience. Its main aim is to challenge young analytical chemists, because the future of our scientific discipline is in their hands. A description of not completely accurate overall conceptions of our discipline, both past and present, to be avoided is followed by a flexible, integral definition of Analytical Chemistry and its cornerstones (viz., aims and objectives, quality trade-offs, the third basic analytical reference, the information hierarchy, social responsibility, independent research, transfer of knowledge and technology, interfaces to other scientific-technical disciplines, and well-oriented education). Obsolete paradigms, and the more accurate general and specific paradigms that can be expected to provide the framework for our discipline in the coming years, are described. Finally, the three possible responses of analytical chemists to the proposed changes in our discipline are discussed.
NASA Astrophysics Data System (ADS)
Sarni, W.
2017-12-01
Water scarcity and poor water quality impact economic development, business growth, and social well-being. Water has become, in our generation, the foremost critical local, regional, and global issue of our time. Despite these needs, there is no water hub or water technology accelerator solely dedicated to water data and tools. There is a need by the public and private sectors for vastly improved data management and visualization tools. This is the WetDATA opportunity: to develop a water data tech hub dedicated to water data acquisition, analytics, and visualization tools for informed policy and business decisions. WetDATA's tools will help incubate disruptive water data technologies and accelerate adoption of current water data solutions. WetDATA is a Colorado-based (501c3) global hub for water data analytics and technology innovation. WetDATA's vision is to be a global leader in water information and data technology innovation, and to collaborate with other US and global water technology hubs. Roadmap:
- A portal (www.wetdata.org) to provide stakeholders with tools/resources to understand related water risks.
- Initial activities will provide education, awareness and tools to stakeholders to support the implementation of the Colorado State Water Plan.
- Leverage the Western States Water Council Water Data Exchange database.
- Development of visualization, predictive analytics and AI tools to engage with stakeholders and provide actionable data and information.
Tools:
- Education: information on water issues and risks at the local, state, national and global scale.
- Visualizations: data analytics and visualization tools based upon the 2030 Water Resources Group methodology to support the implementation of the Colorado State Water Plan.
- Predictive analytics: accessing publicly available water databases and using machine learning to develop water availability forecasting tools, and time-lapse images to support city/urban planning.
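A toy sketch of the forecasting idea in the roadmap: a regression on lagged monthly values of a flow series. This illustrates the approach only; it is not WetDATA code, and the seasonal series is synthetic, generated for the example.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic 20-year monthly flow series with a seasonal cycle plus noise.
rng = np.random.default_rng(1)
months = np.arange(240)
flow = 100 + 30 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, 240)

# Predict next month's flow from the previous three months.
lags = 3
X = np.column_stack([flow[i : len(flow) - lags + i] for i in range(lags)])
y = flow[lags:]

model = LinearRegression().fit(X[:-12], y[:-12])  # hold out the last year
print("R^2 on held-out year:", model.score(X[-12:], y[-12:]))
```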
Li, Hui; Sheeran, Jillian W; Clausen, Andrew M; Fang, Yuan-Qing; Bio, Matthew M; Bader, Scott
2017-08-01
The development of a flow chemistry process for asymmetric propargylation using allene gas as a reagent is reported. The connected continuous process of allene dissolution, lithiation, Li-Zn transmetallation, and asymmetric propargylation provides homopropargyl β-amino alcohol 1 with high regio- and diastereoselectivity in high yield. This flow process enables practical use of an unstable allenyllithium intermediate. The process uses the commercially available and recyclable (1S,2R)-N-pyrrolidinyl norephedrine as a ligand to promote the highly diastereoselective (32:1) propargylation. Judicious selection of mixers based on the chemistry requirement and real-time monitoring of the process using process analytical technology (PAT) enabled stable and scalable flow chemistry runs. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Garnett, Kenisha; Cooper, Tim
2014-12-01
The complexity of municipal waste management decision-making has increased in recent years, accompanied by growing scrutiny from stakeholders, including local communities. This complexity reflects a socio-technical framing of the risks and social impacts associated with selecting technologies and sites for waste treatment and disposal facilities. Consequently there is growing pressure on local authorities for stakeholders (including communities) to be given an early opportunity to shape local waste policy in order to encourage swift planning, development and acceptance of the technologies needed to meet statutory targets to divert waste from landfill. This paper presents findings from a research project that explored the use of analytical-deliberative processes as a legitimising tool for waste management decision-making. Adopting a mixed methods approach, the study revealed that communicating the practical benefits of more inclusive forms of engagement is proving difficult even though planning and policy delays are hindering development and implementation of waste management infrastructure. Adopting analytical-deliberative processes at a more strategic level will require local authorities and practitioners to demonstrate how expert-citizen deliberations may foster progress in resolving controversial issues, through change in individuals, communities and institutions. The findings suggest that a significant shift in culture will be necessary for local authorities to realise the potential of more inclusive decision processes. This calls for political actors and civic society to collaborate in institutionalising public involvement in both strategic and local planning structures. Copyright © 2014 Elsevier Ltd. All rights reserved.
Warfighter decision making performance analysis as an investment priority driver
NASA Astrophysics Data System (ADS)
Thornley, David J.; Dean, David F.; Kirk, James C.
2010-04-01
Estimating the relative value of alternative tactics, techniques and procedures (TTP) and information systems requires measures of the costs and benefits of each, and methods for combining and comparing those measures. The NATO Code of Best Practice for Command and Control Assessment explains that decision making quality would ideally be assessed on outcomes. Lessons learned in practice can be assessed statistically to support this, but experimentation with alternate measures in live conflict is undesirable. To this end, the development of practical experimentation to parameterize effective constructive simulation and analytic modelling for system utility prediction is desirable. The Land Battlespace Systems Department of Dstl has modeled human development of situational awareness to support constructive simulation by empirically discovering how evidence is weighed according to circumstance, personality, training and briefing. The human decision maker (DM) provides the backbone of the information processing activity associated with military engagements because of the inherent uncertainty associated with combat operations. To develop methods for representing this process, in order to assess equipment and non-technological interventions such as training and TTPs, we are developing componentized, timed, analytic stochastic model components and instruments as part of a framework to support quantitative assessment of intelligence production and consumption methods in a mission space centred on the human decision maker. In this paper, we formulate an abstraction of the human intelligence fusion process from the Defence Science and Technology Laboratory's (Dstl's) INCIDER model to include in our framework, and synthesize relevant cost and benefit characteristics.
NASA Technical Reports Server (NTRS)
Marsik, S. J.; Morea, S. F.
1985-01-01
A research and technology program for advanced high pressure, oxygen-hydrogen rocket propulsion technology is presently being pursued by the National Aeronautics and Space Administration (NASA) to establish the basic discipline technologies, develop the analytical tools, and establish the data base necessary for an orderly evolution of the staged combustion reusable rocket engine. The need for the program is based on the premise that the USA will depend on the Shuttle and its derivative versions as its principal Earth-to-orbit transportation system for the next 20 to 30 yr. The program is focused in three principal areas of enhancement: (1) life extension, (2) performance, and (3) operations and diagnosis. Within the technological disciplines the efforts include: rotordynamics, structural dynamics, fluid and gas dynamics, materials fatigue/fracture/life, turbomachinery fluid mechanics, ignition/combustion processes, manufacturing/producibility/nondestructive evaluation methods and materials development/evaluation. An overview of the Advanced High Pressure Oxygen-Hydrogen Rocket Propulsion Technology Program Structure and Working Groups objectives are presented with highlights of several significant achievements.
NASA Astrophysics Data System (ADS)
Alawneh, Firas Mohamad
This thesis investigates continuity and change in ceramics from Jordan during the Late Byzantine-Early Islamic transition period. The transition period has been characterized largely by an overlap of two ceramic traditions. The material culture of this period has been primarily viewed through formal and stylistic changes. However, ceramic technology and distribution have never been subjected to rigorous analytical study. To explain continuity and change in the ceramic tradition, this study focused on provenance and technology, using a multifaceted analytical approach. The study of the transition period pottery focused on the classification and technological features of potsherds from selected sites in Jordan (Amman, Aqaba, Beit Ras, Khirbet el-Nawafleh, Jarash, Jericho, Pella, Madaba, Gharndal, Humaimah, Um er-Rassas and Um el-Waleed). Samples were studied by particle-induced X-ray emission spectroscopy, X-ray powder diffraction, and optical microscopy to analyze their chemical, mineralogical and textural features, with the aim of determining their possible provenance and production technology. Compositional data were statistically processed with multivariate analysis using SYSTAT II software 2006. To obtain further information about possible source areas of raw materials used in ceramic production, clays were also sampled in the studied areas. Firing experiments were conducted for clays with compositions comparable to those of the ceramic sherds, to better understand the firing technology of the pottery. The multifaceted analytical approach has revealed important information on ceramic production in Transjordan. Khirbet el-Nawafleh and Aqaba in the south, Jarash and Pella in the north, and Amman and Madaba in the centre were likely among the important production centers of this period. The study shows multidirectional socio-cultural exchange and economic trade patterns within each region and between adjacent regions as well. Also, importation from adjacent provinces cannot be excluded for certain samples. Despite the different archaeological levels to which these samples belong, this study illustrates some similarity in technological features and chemical composition. This in turn suggests that continuity was the dominant trend in the society's ceramic tradition during the transition period. However, further work on clays, kilns, and pottery from other sites discovered in Jordan is necessary to confirm this conclusion.
Data Discovery with IBM Watson
NASA Astrophysics Data System (ADS)
Fessler, J.
2016-12-01
IBM Watson is a cognitive computing system that uses machine learning, statistical analysis, and natural language processing to find and understand the clues in questions posed to it. Watson was made famous when it bested two champions on TV's Jeopardy! show. Since then, Watson has evolved into a platform of cognitive services that can be trained on very granular fields of study. Watson is being used to support a number of subject domains, such as cancer research, public safety, engineering, and the intelligence community. IBM will be providing a presentation and demonstration of the Watson technology and will discuss its capabilities, including natural language processing, text analytics and enterprise search, as well as cognitive computing with deep Q&A. The team will also give examples of how IBM Watson technology is being used to support real-world problems across a number of public sector agencies.
Mohammed, Emad A; Far, Behrouz H; Naugler, Christopher
2014-01-01
The emergence of massive datasets in a clinical setting presents both challenges and opportunities in data storage and analysis. This so called "big data" challenges traditional analytic tools and will increasingly require novel solutions adapted from other fields. Advances in information and communication technology present the most viable solutions to big data analysis in terms of efficiency and scalability. It is vital that big data solutions be multithreaded and that data access approaches be precisely tailored to large volumes of semi-structured/unstructured data. The MapReduce programming framework uses two tasks common in functional programming: Map and Reduce. MapReduce is a new parallel processing framework and Hadoop is its open-source implementation on a single computing node or on clusters. Compared with existing parallel processing paradigms (e.g. grid computing and graphical processing unit (GPU)), MapReduce and Hadoop have two advantages: 1) fault-tolerant storage resulting in reliable data processing by replicating the computing tasks, and cloning the data chunks on different computing nodes across the computing cluster; 2) high-throughput data processing via a batch processing framework and the Hadoop distributed file system (HDFS). Data are stored in the HDFS and made available to the slave nodes for computation. In this paper, we review the existing applications of the MapReduce programming framework and its implementation platform Hadoop in clinical big data and related medical health informatics fields. The usage of MapReduce and Hadoop on a distributed system represents a significant advance in clinical big data processing and utilization, and opens up new opportunities in the emerging era of big data analytics. The objective of this paper is to summarize the state-of-the-art efforts in clinical big data analytics and highlight what might be needed to enhance the outcomes of clinical big data analytics tools. This paper is concluded by summarizing the potential usage of the MapReduce programming framework and Hadoop platform to process huge volumes of clinical data in medical health informatics related fields.
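As a concrete illustration of the Map and Reduce tasks described above, the following minimal, single-machine Python sketch counts diagnosis codes across clinical records. The records and codes are hypothetical; a real Hadoop deployment distributes the same two phases across HDFS blocks rather than running them in memory.

```python
# A minimal, single-machine sketch of the Map/Reduce pattern: counting
# (hypothetical) diagnosis codes across clinical records. Hadoop distributes
# exactly this computation over HDFS blocks; the shuffle is simulated here.
from collections import defaultdict
from itertools import chain

records = [
    "E11.9 I10 J45",   # hypothetical whitespace-delimited diagnosis codes
    "I10 E11.9",
    "J45 I10",
]

def mapper(record):
    # Map task: emit a (key, 1) pair for every diagnosis code in one record.
    for code in record.split():
        yield (code, 1)

def shuffle(pairs):
    # Group values by key, as the MapReduce framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups.items()

def reducer(key, values):
    # Reduce task: aggregate all values emitted for one key.
    return (key, sum(values))

mapped = chain.from_iterable(mapper(r) for r in records)
counts = dict(reducer(k, v) for k, v in shuffle(mapped))
print(counts)   # {'E11.9': 2, 'I10': 3, 'J45': 2}
```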
NASA Astrophysics Data System (ADS)
Song, Y.; Gui, Z.; Wu, H.; Wei, Y.
2017-09-01
Analysing the spatiotemporal distribution patterns and dynamics of different industries can help us learn the macro-level developing trends of those industries, and in turn provides references for industrial spatial planning. However, the analysis process is a challenging task that requires an easy-to-understand information presentation mechanism and a powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to enable such visual analytics. The framework uses the standard deviational ellipse (SDE) and shifting routes of gravity centers to show the spatial distribution and yearly developing trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset in Mainland China from year 1960 to 2015, which contains fine-grained location information (i.e., coordinates of each individual enterprise), to demonstrate the feasibility of this framework. The experiment result shows that the developed visual analytics method is helpful for understanding the multi-level patterns and developing trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with a large data volume, such as crime and disease.
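The gravity-center computation described above parallelizes naturally, since a mean center only needs per-year coordinate sums and counts. The following PySpark sketch (coordinates are invented placeholders, not the enterprise registry itself) shows one way such a pass could look; the SDE would additionally require second moments about each center.

```python
# Sketch of a parallel gravity-center computation, assuming PySpark is
# available; the (year, x, y) tuples stand in for the enterprise registry.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("gravity-centers").getOrCreate()
sc = spark.sparkContext

# (registration_year, x_coordinate, y_coordinate) - placeholder records
enterprises = sc.parallelize([
    (1995, 114.3, 30.6), (1995, 116.4, 39.9),
    (2010, 121.5, 31.2), (2010, 113.3, 23.1), (2010, 120.2, 30.3),
])

centers = (enterprises
           .map(lambda r: (r[0], (r[1], r[2], 1)))           # key by year
           .reduceByKey(lambda a, b: (a[0] + b[0], a[1] + b[1], a[2] + b[2]))
           .mapValues(lambda s: (s[0] / s[2], s[1] / s[2]))  # mean center
           .collect())

for year, (cx, cy) in sorted(centers):
    print(year, cx, cy)
# The SDE additionally needs second moments about (cx, cy), which can be
# accumulated in the same reduceByKey pass.
```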
Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.
2016-01-01
Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933
Big, Deep, and Smart Data in Scanning Probe Microscopy
Kalinin, Sergei V.; Strelcov, Evgheni; Belianinov, Alex; ...
2016-09-27
Scanning probe microscopy techniques open the door to nanoscience and nanotechnology by enabling imaging and manipulation of the structure and functionality of matter on nanometer and atomic scales. We analyze the discovery process by SPM in terms of information flow from the tip-surface junction to knowledge adoption by the scientific community. Furthermore, we discuss the challenges and opportunities offered by merging SPM with advanced data mining, visual analytics, and knowledge discovery technologies.
The production, properties, and applications of thermostable steryl glucosidases.
Aguirre, Andres; Eberhardt, Florencia; Hails, Guillermo; Cerminati, Sebastian; Castelli, María Eugenia; Rasia, Rodolfo M; Paoletti, Luciana; Menzella, Hugo G; Peiru, Salvador
2018-02-21
Extremophilic microorganisms are a rich source of enzymes that can serve as industrial catalysts able to withstand harsh processing conditions. One example is the thermostable β-glucosidases addressing a challenging problem in the biodiesel industry: removing steryl glucosides (SGs) from biodiesel. Steryl glucosidases (SGases) must be tolerant to heat and solvents in order to function efficiently in biodiesel. The amphipathic nature of SGs also requires enzymes with an affinity for water/solvent interfaces in order to achieve efficient hydrolysis. Additionally, the development of an enzymatic process involving a commodity such as soybean biodiesel must be cost-effective, necessitating an efficient manufacturing process for SGases. This review summarizes the identification of microbial SGases and their applications, discusses biodiesel refining processes and the development of analytical methods for identifying and quantifying SGs in foods and biodiesel, and considers technologies for strain engineering and process optimization for the heterologous production of an SGase from Thermococcus litoralis. All of these technologies might be used for the production of other thermostable enzymes. Structural features of SGases and the feasibility of protein engineering for novel applications are also explored.
Smartphone Analytics: Mobilizing the Lab into the Cloud for Omic-Scale Analyses.
Montenegro-Burke, J Rafael; Phommavongsay, Thiery; Aisporna, Aries E; Huan, Tao; Rinehart, Duane; Forsberg, Erica; Poole, Farris L; Thorgersen, Michael P; Adams, Michael W W; Krantz, Gregory; Fields, Matthew W; Northen, Trent R; Robbins, Paul D; Niedernhofer, Laura J; Lairson, Luke; Benton, H Paul; Siuzdak, Gary
2016-10-04
Active data screening is an integral part of many scientific activities, and mobile technologies have greatly facilitated this process by minimizing the reliance on large hardware instrumentation. To keep pace with the rapidly growing field of metabolomics and its heavy data-processing workload, we designed the first remote metabolomic data screening platform for mobile devices. Two mobile applications (apps), XCMS Mobile and METLIN Mobile, facilitate access to XCMS and METLIN, which are the most important components in the computer-based XCMS Online platforms. These mobile apps allow for the visualization and analysis of metabolic data throughout the entire analytical process. Specifically, XCMS Mobile and METLIN Mobile provide the capabilities for remote monitoring of data processing, real-time notifications for the data processing, visualization and interactive analysis of processed data (e.g., cloud plots, principal component analysis, box plots, extracted ion chromatograms, and hierarchical cluster analysis), and database searching for metabolite identification. These apps, available on Apple iOS and Google Android operating systems, allow for the migration of metabolomic research onto mobile devices for better accessibility beyond direct instrument operation. The utility of XCMS Mobile and METLIN Mobile functionalities was developed and is demonstrated here through the metabolomic LC-MS analyses of stem cells, colon cancer, aging, and bacterial metabolism.
Deployment simulation of a deployable reflector for earth science application
NASA Astrophysics Data System (ADS)
Wang, Xiaokai; Fang, Houfei; Cai, Bei; Ma, Xiaofei
2015-10-01
A novel mission concept, NEXRAD-In-Space (NIS), has been developed for monitoring hurricanes, cyclones and other severe storms from a geostationary orbit. It requires a space-deployable 35-meter diameter Ka-band (35 GHz) reflector. NIS can measure hurricane precipitation intensity, dynamics and life cycle. This information is necessary for predicting the track, intensity, rain rate and hurricane-induced floods. To meet the requirements of the radar system, a Membrane Shell Reflector Segment (MSRS) reflector technology has been developed and several technologies have been evaluated. However, the deployment behavior of this large, high-precision reflector had not been investigated. As a preliminary study, a scaled tetrahedral truss reflector with a spring-driven deployment system was built and tested, and a deployment dynamics analysis of this scaled reflector was performed using ADAMS to understand its deployment dynamic behavior. Eliminating the redundant constraints in a reflector system with a large number of moving parts is a challenging issue. A primitive joint and flexible struts were introduced into the analytical model; they effectively eliminate the over-constraints of the model. Using a high-speed camera and a force transducer, a deployment experiment on a single-bay tetrahedral module was conducted. With the test results, an optimization was performed using the parameter optimization module of ADAMS to obtain the parameters of the analytical model. These parameters were incorporated into the analytical model of the whole reflector. The analysis results show that the deployment of the reflector with a fixed boundary proceeds in three stages: a rapid deployment stage, a slow deployment stage and an impact stage. Insight into the force peak distributions of the reflector can guide optimization of the structural design.
Investigation of the laser engineered net shaping process for nanostructured cermets
NASA Astrophysics Data System (ADS)
Xiong, Yuhong
Laser Engineered Net Shaping (LENS™) is a solid freeform fabrication (SFF) technology that combines high power laser deposition and powder metallurgy technologies. The LENS™ technology has been used to fabricate a number of metallic alloys with improved physical and mechanical material properties. This successful application provides a motivation to also apply the method to fabricate non-metallic materials, such as tungsten carbide-cobalt (WC-Co) cermets, in a timely and easy way. However, reports on this topic are very limited. In this work, the LENS™ technology was used to investigate its application to nanostructured WC-Co cermets, including processing conditions, microstructural evolution, thermal behavior, mechanical properties, and environmental and economic benefits. Details of the approaches are described as follows. A comprehensive analysis of the relationships between process parameters, microstructural evolution and mechanical properties was conducted through various analytical techniques. Effects of process parameters on sample profiles and microstructures were analyzed. Dissolution, shape change and coarsening of WC particles were investigated to study the mechanisms of microstructural evolution. The thermal features were correlated with the microstructure and mechanical properties. The special thermal behavior during this process and its relevant effects on the microstructure have been experimentally studied and numerically simulated. A high-speed digital camera was applied to study the temperature profile, temperature gradient and cooling rate in and near the molten pool. Numerical modeling was employed for 3D samples using the finite element method with ADINA software for the first time. The validated modeling results were used to interpret microstructural evolution and thermal history. In order to fully evaluate the capability of the LENS™ technology for the fabrication of cermets, material properties of WC-Co cermets produced by different powder metallurgy technologies were compared. In addition, another cermet system, nanostructured titanium/tungsten carbide-nickel ((Ti,W)C-Ni) powder, prepared using a high-energy ball milling process, was also deposited by the LENS™ technology. Because of the near net shape feature of the LENS™ process, special emphasis was also placed on its potential environmental and economic benefits by applying life cycle assessment (LCA) and technical cost modeling (TCM). Comparisons were conducted between the conventional powder metallurgy processes and the LENS™ process.
Bicubic uniform B-spline wavefront fitting technology applied in computer-generated holograms
NASA Astrophysics Data System (ADS)
Cao, Hui; Sun, Jun-qiang; Chen, Guo-jie
2006-02-01
This paper presents a bicubic uniform B-spline wavefront fitting technique for deriving the analytical expression of the object wavefront used in computer-generated holograms (CGHs). In many cases, to reduce the difficulty of optical processing, off-axis CGHs rather than complex aspherical surface elements are used in modern advanced military optical systems. In order to design and fabricate an off-axis CGH, we have to fit an analytical expression to the object wavefront. Zernike polynomials are competent for fitting the wavefronts of centrosymmetric optical systems, but not of axisymmetrical optical systems. Although a high-degree polynomial fitting method can achieve high fitting precision at all fitting nodes, its greatest shortcoming is that any departure from the fitting nodes results in large fitting error, the so-called pulsation (Runge) phenomenon. Furthermore, a high-degree polynomial fitting method increases the calculation time when coding the computer-generated hologram and solving the basic equation. Based on the basis functions of the cubic uniform B-spline and the character mesh of the bicubic uniform B-spline wavefront, the bicubic uniform B-spline wavefront is described as the product of a series of matrices. Employing standard MATLAB routines, four different analytical expressions for the object wavefront are fitted by bicubic uniform B-splines as well as by high-degree polynomials. Calculation results indicate that, compared with high-degree polynomials, the bicubic uniform B-spline is a more competitive method for fitting the analytical expression of the object wavefront used in an off-axis CGH, owing to its higher fitting precision and C2 continuity.
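The matrix formulation mentioned above can be made concrete. Assuming the standard uniform cubic B-spline basis matrix, a single bicubic patch is evaluated as S(u, v) = U M G Mᵀ Vᵀ; the numpy sketch below uses a hypothetical 4 × 4 control net of wavefront heights, not data from the paper.

```python
# Minimal numpy sketch of the matrix form: one bicubic uniform B-spline patch
# evaluated as S(u, v) = U M G M^T V^T, with M the standard uniform cubic
# B-spline basis matrix and G a 4x4 grid of (placeholder) wavefront heights.
import numpy as np

M = (1.0 / 6.0) * np.array([[-1,  3, -3, 1],
                            [ 3, -6,  3, 0],
                            [-3,  0,  3, 0],
                            [ 1,  4,  1, 0]])

G = np.array([[0.00, 0.01, 0.02, 0.01],
              [0.01, 0.03, 0.04, 0.02],
              [0.02, 0.04, 0.05, 0.03],
              [0.01, 0.02, 0.03, 0.01]])   # control net of wavefront heights

def patch_point(u, v):
    """Evaluate the patch at local parameters u, v in [0, 1]."""
    U = np.array([u**3, u**2, u, 1.0])
    V = np.array([v**3, v**2, v, 1.0])
    return U @ M @ G @ M.T @ V

print(patch_point(0.5, 0.5))
# C2 continuity across adjacent patches follows from sharing control points,
# which is the property the paper contrasts with high-degree polynomials.
```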
Marek, Lukáš; Tuček, Pavel; Pászto, Vít
2015-01-28
Visual analytics aims to connect the processing power of information technologies and the user's capacity for logical thinking and reasoning through complex visual interaction. Moreover, most data contain a spatial component. Therefore, the need for geovisual tools and methods arises. One can either develop one's own system, where dissemination of findings and usability might be problematic, or utilize a widespread and well-known platform. The aim of this paper is to prove the applicability of Google Earth™ software as a tool for geovisual analytics that helps to understand the spatio-temporal patterns of disease distribution. We combined complex joint spatio-temporal analysis with comprehensive visualisation. We analysed the spatio-temporal distribution of campylobacteriosis in the Czech Republic between 2008 and 2012. We applied three main approaches in the study: (1) geovisual analytics of the surveillance data, visualised in the form of a bubble chart; (2) geovisual analytics of the disease's weekly incidence surfaces computed by spatio-temporal kriging; and (3) spatio-temporal scan statistics, employed to identify clusters of affected municipalities with high or low rates. The final data are stored in Keyhole Markup Language (KML) files and visualised in Google Earth™ in order to apply geovisual analytics. Using geovisual analytics we were able to display and retrieve information from a complex dataset efficiently. Instead of searching for patterns in a series of static maps or using numerical statistics, we created a set of interactive visualisations in order to explore and communicate the results of the analyses to a wider audience. The geovisual analytics identified periodic patterns in the behaviour of the disease as well as fourteen spatio-temporal clusters of increased relative risk. We show that Google Earth™ software is a usable tool for geovisual analysis of disease distribution. Google Earth™ has many indisputable advantages (it is widespread, freely available, and offers an intuitive interface, space-time visualisation capabilities, animations, and communication of results); nevertheless, it still needs to be combined with pre-processing tools that prepare the data into a form suitable for the geovisual analytics itself.
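To make the KML workflow tangible, the sketch below writes a minimal, time-stamped KML file of the kind Google Earth™ can animate with its time slider. The municipalities, dates and case counts are placeholders, not the study's surveillance data.

```python
# A minimal sketch of producing time-stamped KML placemarks; names, dates and
# values are invented. Google Earth animates <TimeStamp> elements over time.
cases = [
    ("Olomouc", 17.25, 49.59, "2008-01-07", 12),
    ("Brno",    16.61, 49.20, "2008-01-07",  9),
]

placemarks = "".join(f"""
  <Placemark>
    <name>{name}: {count} cases</name>
    <TimeStamp><when>{week}</when></TimeStamp>
    <Point><coordinates>{lon},{lat},0</coordinates></Point>
  </Placemark>""" for name, lon, lat, week, count in cases)

kml = f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>{placemarks}
  </Document>
</kml>"""

with open("incidence.kml", "w", encoding="utf-8") as f:
    f.write(kml)   # open this file in Google Earth to see the placemarks
```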
A novel upper limb rehabilitation system with self-driven virtual arm illusion.
Aung, Yee Mon; Al-Jumaily, Adel; Anam, Khairul
2014-01-01
This paper proposes a novel upper-extremity rehabilitation system with a virtual arm illusion. It aims to speed recovery of upper-limb function lost as a result of stroke and to provide a novel rehabilitation system for paralyzed patients. The system integrates a number of technologies, including Augmented Reality (AR) to develop game-like exercises, computer vision to create the illusion scene, 3D modeling and model simulation, and signal processing to detect user intention via EMG signals. The effectiveness of the developed system was evaluated via a usability study and questionnaires, with results presented graphically and analytically. The evaluation produced positive results, indicating that the developed system has potential as an effective rehabilitation system for upper-limb impairment.
NASA Technical Reports Server (NTRS)
1981-01-01
The Space Transportation System (STS) is discussed, including the launch processing system, the thermal protection subsystem, meteorological research, the sound suppression water system, the rotating service structure, improved hypergol removal systems, fiber optics research, precision positioning, remote-controlled solid rocket booster nozzle plugs, ground operations for the Centaur orbital transfer vehicle, parachute drying, STS hazardous waste disposal and recycling, toxic waste technology and control concepts, a fast analytical densitometry study, the shuttle inventory management system, operational intercommunications system improvement, and the protective garment ensemble. Terrestrial applications are also covered, including LANDSAT applications to water resources, the satellite freeze forecast system, application of ground-penetrating radar to soil survey, turtle tracking, evaluating computer-drawn ground cover maps, a sparkless load pulsar, and coupling a microcomputer and computing integrator with a gas chromatograph.
CMOS Time-Resolved, Contact, and Multispectral Fluorescence Imaging for DNA Molecular Diagnostics
Guo, Nan; Cheung, Ka Wai; Wong, Hiu Tung; Ho, Derek
2014-01-01
Instrumental limitations such as bulkiness and high cost prevent the fluorescence technique from becoming ubiquitous for point-of-care deoxyribonucleic acid (DNA) detection and other in-field molecular diagnostics applications. The complementary metal-oxide-semiconductor (CMOS) technology, benefiting from process scaling, provides several advanced capabilities such as high integration density, high-resolution signal processing, and low power consumption, enabling sensitive, integrated, and low-cost fluorescence analytical platforms. In this paper, CMOS time-resolved, contact, and multispectral imaging are reviewed. Recently reported CMOS fluorescence analysis microsystem prototypes are surveyed to highlight the present state of the art. PMID:25365460
Collaborative Web-Enabled GeoAnalytics Applied to OECD Regional Data
NASA Astrophysics Data System (ADS)
Jern, Mikael
Recent advances in web-enabled graphics technologies have the potential to make a dramatic impact on developing collaborative geovisual analytics (GeoAnalytics). In this paper, tools are introduced that help establish progress initiatives at international and sub-national levels aimed at measuring and collaborating, through statistical indicators, on economic, social and environmental developments, and at engaging both statisticians and the public in such activities. Given the global dimension of such a task, the “dream” of building a repository of progress indicators, where experts and public users can use GeoAnalytics collaborative tools to compare situations for two or more countries, regions or local communities, could be accomplished. While the benefits of GeoAnalytics tools are many, it remains a challenge to adapt these dynamic visual tools to the Internet. For example, dynamic web-enabled animation enables statisticians to explore temporal, spatial and multivariate demographics data from multiple perspectives, discover interesting relationships, share their incremental discoveries with colleagues and finally communicate selected relevant knowledge to the public. These discoveries often emerge through the diverse backgrounds and experiences of expert domains and are precious in a creative analytics reasoning process. In this context, we introduce a demonstrator, “OECD eXplorer”, a customized tool for interactively analyzing, and collaborating on, gained insights and discoveries, based on a novel story mechanism that captures, re-uses and shares task-related explorative events.
Current trends in nanobiosensor technology
Wu, Diana; Langer, Robert S
2014-01-01
The development of tools and processes used to fabricate, measure, and image nanoscale objects has led to a wide range of work devoted to producing sensors that interact with extremely small numbers (or an extremely small concentration) of analyte molecules. These advances are particularly exciting in the context of biosensing, where the demands for low-concentration detection and high specificity are great. Nanoscale biosensors, or nanobiosensors, provide researchers with an unprecedented level of sensitivity, often to the single-molecule level. The use of biomolecule-functionalized surfaces can dramatically boost the specificity of the detection system, but can also yield reproducibility problems and increased complexity. Several nanobiosensor architectures based on mechanical devices, optical resonators, functionalized nanoparticles, nanowires, nanotubes, and nanofibers have been demonstrated in the lab. As nanobiosensor technology becomes more refined and reliable, it is likely it will eventually make its way from the lab to the clinic, where future lab-on-a-chip devices incorporating an array of nanobiosensors could be used for rapid screening of a wide variety of analytes at low cost using small samples of patient material. PMID:21391305
Electric vehicle propulsion alternatives
NASA Technical Reports Server (NTRS)
Secunde, R. R.; Schuh, R. M.; Beach, R. F.
1983-01-01
Propulsion technology development for electric vehicles is summarized. Analytical studies, technology evaluation, and the development of technology for motors, controllers, transmissions, and complete propulsion systems are included.
IMPROVING THE EFFECTIVENESS AND EFFICIENCY OF EVIDENCE PRODUCTION FOR HEALTH TECHNOLOGY ASSESSMENT.
Facey, Karen; Henshall, Chris; Sampietro-Colom, Laura; Thomas, Sarah
2015-01-01
Health Technology Assessment (HTA) needs to address the challenges posed by high-cost, effective technologies and expedited regulatory approaches, and the opportunities provided by collaborative real-world evaluation of technologies. The Health Technology Assessment International (HTAi) Policy Forum met to consider these issues and the implications for evidence production to inform HTA. This paper shares their discussion to stimulate further debate. A background paper, presentations, group discussions, and stakeholder role play at the 2015 HTAi Policy Forum meeting informed this paper. HTA has an important role to play in helping improve evidence production and ensuring that the health service is ready to adopt effective technologies. It needs to move from simply informing health system decisions to also working actively to align stakeholder expectations about realistic evidence requirements. Processes to support dialogue over the health technology life cycle need to be developed that are mindful of limited resources, operate across jurisdictions and learn from past processes. Collaborations between health technology developers and health systems in different countries should be encouraged to develop evidence that will inform decision making. New analytical techniques emerging for real-world data should be harnessed to support modeling for HTA. A paradigm shift (to "Health Innovation System 2.0") is suggested where HTA adopts a more central, proactive role to support alignment within and amongst stakeholders over the whole life cycle of the technology. This could help ensure that evidence production is better aligned with patient and health system needs and so is more effective and efficient.
Negative dielectrophoresis spectroscopy for rare analyte quantification in biological samples
NASA Astrophysics Data System (ADS)
Kirmani, Syed Abdul Mannan; Gudagunti, Fleming Dackson; Velmanickam, Logeeshan; Nawarathna, Dharmakeerthi; Lima, Ivan T., Jr.
2017-03-01
We propose the use of negative dielectrophoresis (DEP) spectroscopy as a technique to improve the detection limit of rare analytes in biological samples. We observe a significant dependence of the negative DEP force on functionalized polystyrene beads at the edges of interdigitated electrodes with respect to the frequency of the electric field. We measured the velocity of repulsion for 0% and 0.8% conjugation of avidin with biotin-functionalized polystyrene beads using our automated software, which monitors the Rayleigh scattering from the beads through real-time image processing. A significant difference in the velocity of the beads was observed in the presence of as little as 80 molecules of avidin per biotin-functionalized bead. This technology can be applied to the detection and quantification of rare analytes, which can be useful in the diagnosis and treatment of diseases such as cancer and myocardial infarction, with the use of polystyrene beads functionalized with antibodies for the target biomarkers.
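For context, the frequency dependence exploited here is conventionally attributed to the Clausius-Mossotti factor in the standard time-averaged DEP force on a spherical particle (a textbook expression, not taken from the paper):

```latex
\mathbf{F}_{\mathrm{DEP}} = 2\pi \varepsilon_m r^{3}\,
  \operatorname{Re}\!\left[K(\omega)\right]\,
  \nabla \lvert \mathbf{E}_{\mathrm{rms}} \rvert^{2},
\qquad
K(\omega) = \frac{\varepsilon_p^{*} - \varepsilon_m^{*}}
                 {\varepsilon_p^{*} + 2\varepsilon_m^{*}},
\qquad
\varepsilon^{*} = \varepsilon - i\,\frac{\sigma}{\omega}
```

Negative DEP, i.e. repulsion from the high-field electrode edges, occurs at frequencies where Re[K(ω)] < 0, which is why the measured bead velocity varies with the applied frequency.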
NASA Astrophysics Data System (ADS)
Farsoiya, Palas Kumar; Dasgupta, Ratul
2017-11-01
When the interface between two radially unbounded, viscous fluids lying vertically in a stable configuration (denser fluid below) at rest is perturbed, radially propagating capillary-gravity waves are formed which damp out with time. We study this process analytically using a recently developed linearised theory. For small-amplitude initial perturbations, the analytical solution to the initial value problem, represented as a linear superposition of Bessel modes at time t = 0, is found to agree very well with results obtained from direct numerical simulations of the Navier-Stokes equations for a range of initial conditions. Our study extends earlier work by John W. Miles, who studied this initial value problem analytically taking into account a single viscous fluid only. Implications of this study for the mechanistic understanding of droplet impact into a deep pool will be discussed. Some preliminary, qualitative comparisons with experiments will also be presented. We thank SERB, Dept. of Science & Technology, Govt. of India, Grant No. EMR/2016/000830 for financial support.
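For readers unfamiliar with the setup, the axisymmetric initial value problem has the generic Hankel-transform form below (a standard representation, not the paper's exact solution); ω(k) is the inviscid capillary-gravity dispersion relation for two semi-infinite fluids, and viscosity turns each mode's oscillation into a damped one:

```latex
\eta(r,t) = \int_{0}^{\infty} \hat{\eta}_0(k)\, J_0(kr)\, e^{s(k)\,t}\, k\, \mathrm{d}k,
\qquad
\omega^{2}(k) = \frac{(\rho_l - \rho_u)\, g\, k + \sigma k^{3}}{\rho_l + \rho_u}
```

In the weakly damped, single-fluid limit, s(k) ≈ −2νk² ± iω(k) (Lamb's classical result for a free surface of a fluid with kinematic viscosity ν).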
Knowledge management in a waste based biorefinery in the QbD paradigm.
Rathore, Anurag S; Chopda, Viki R; Gomes, James
2016-09-01
Shifting the resource base from fossil feedstocks to renewable raw materials for the production of chemical products has opened up an area of novel applications for industrial biotechnology-based process tools. This review aims to provide a concise and focused discussion on recent advances in knowledge management to facilitate efficient and optimal operation of a biorefinery. Application of quality by design (QbD) and process analytical technology (PAT) as tools for knowledge creation and management at different levels has been highlighted. The roles of process integration, government policies, knowledge exchange through collaboration, and the use of databases and computational tools have also been touched upon. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Nagaraja, K. S.; Kraft, R. H.
1999-01-01
The HSCT Flight Controls Group has developed longitudinal control laws, utilizing PTC aeroelastic flexible models, to minimize aeroservoelastic interaction effects for a number of flight conditions. The control law design process resulted in a higher-order controller and utilized a large number of sensors distributed along the body to minimize the flexibility effects. Processes were developed to implement these higher-order control laws for performing the dynamic gust loads and flutter analyses. The processes and their validation were documented in Reference 2 for a selected flight condition. The analytical results for additional flight conditions are presented in this document for further validation.
Intelligent system of coordination and control for manufacturing
NASA Astrophysics Data System (ADS)
Ciortea, E. M.
2016-08-01
This paper aims to shape an intelligent monitoring and control system that optimizes the material and information flows of a company. The paper presents a model for an intelligent real-time tracking and control system. The production system proposed for simulation analysis provides the ability to track and control the process in real time. Simulation models help in understanding the influence of changes in system structure, the influence of commands on the general condition of the manufacturing process, and the influence of process conditions on the behavior of some system parameters. The practical character of the work consists in tracking and real-time control of the technological process. It is based on modular systems analyzed using mathematical models, graphic-analytical sizing, configuration, optimization and simulation.
High-end clinical domain information systems for effective healthcare delivery.
Mangalampalli, Ashish; Rama, Chakravarthy; Muthiyalian, Raja; Jain, Ajeet K
2007-01-01
The Electronic Health Record (EHR) provides doctors with a quick, reliable, secure, real-time and user-friendly source of all relevant patient data. The latest information system technologies, such as Clinical Data Warehouses (CDW), Clinical Decision-Support (CDS) systems and data processing techniques (Online Analytical Processing (OLAP) and Online Transactional Processing (OLTP)), are used to maintain and utilise patient data intelligently, based on the users' requirements. Moreover, clinical trial reports for new drug approvals are now being submitted electronically for faster and easier processing. Also, information systems are used in educating patients about the latest developments in medical science through the internet and specially configured kiosks in hospitals and clinics.
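The OLAP/OLTP distinction mentioned above is easy to demonstrate: OLTP issues many small write transactions at the point of care, while OLAP runs read-mostly aggregations for decision support. A hypothetical sqlite3 sketch, with an invented schema standing in for the CDW:

```python
# Hypothetical illustration of the two access patterns: OLTP writes individual
# patient transactions; OLAP-style queries aggregate them for analysis.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE lab_results
               (patient_id TEXT, test TEXT, value REAL, taken_on TEXT)""")

# OLTP: small, frequent, row-level transactions at the point of care.
con.execute("INSERT INTO lab_results VALUES ('p001', 'HbA1c', 7.1, '2007-03-02')")
con.execute("INSERT INTO lab_results VALUES ('p002', 'HbA1c', 6.4, '2007-03-02')")
con.commit()

# OLAP: read-mostly aggregation across many rows for decision support.
for row in con.execute("""SELECT test, COUNT(*) AS n, AVG(value) AS mean_value
                          FROM lab_results GROUP BY test"""):
    print(row)   # ('HbA1c', 2, 6.75)
```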
NASA Astrophysics Data System (ADS)
Fauzi, Ilham; Muharram Hasby, Fariz; Irianto, Dradjad
2018-03-01
Although governments are able to set mandatory standards that must be obeyed by industry, the respective industries themselves often have difficulty fulfilling the requirements described in those standards. This is especially true in many small and medium-sized enterprises that lack the capital required to invest in standard-compliant equipment and machinery. This study aims to develop a set of measurement tools for evaluating the readiness of production technology with respect to the requirements of a product standard, based on the quality function deployment (QFD) method. By combining the QFD methodology, the UNESCAP Technometric model [9] and the Analytic Hierarchy Process (AHP), this model is used to measure a firm’s capability to fulfill a government standard in the toy-making industry. Expert opinions from both the governmental officers responsible for setting and implementing standards and the industry practitioners responsible for managing manufacturing processes were collected and processed to find the technological capabilities that the firm should improve to fulfill the existing standard. The study shows that the proposed model can be used successfully to measure the gap between the requirements of the standard and the readiness of the technoware technological component in a particular firm.
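A central step in any AHP-based model such as this one is turning expert pairwise comparisons into priority weights via the principal eigenvector, together with Saaty's consistency check. The sketch below uses an invented 3 × 3 judgment matrix, not the study's actual data:

```python
# Sketch of the standard AHP step: derive priority weights from a pairwise
# comparison matrix via its principal eigenvector and check consistency.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],      # hypothetical expert judgments comparing
              [1/3, 1.0, 2.0],      # three technoware readiness criteria
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # principal eigenpair
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                     # normalized priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)             # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]              # Saaty's random index
print("weights:", w.round(3), "CR:", round(ci / ri, 3))  # CR < 0.1 acceptable
```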
NASA Technical Reports Server (NTRS)
Stephenson, Frank W., Jr.
1988-01-01
The NASA Earth-to-Orbit (ETO) Propulsion Technology Program is dedicated to advancing rocket engine technologies for the development of fully reusable engine systems that will enable space transportation systems to achieve low cost, routine access to space. The program addresses technology advancements in the areas of engine life extension/prediction, performance enhancements, reduced ground operations costs, and in-flight fault tolerant engine operations. The primary objective is to acquire increased knowledge and understanding of rocket engine chemical and physical processes in order to evolve more realistic analytical simulations of engine internal environments, to derive more accurate predictions of steady and unsteady loads, and using improved structural analyses, to more accurately predict component life and performance, and finally to identify and verify more durable advanced design concepts. In addition, efforts were focused on engine diagnostic needs and advances that would allow integrated health monitoring systems to be developed for enhanced maintainability, automated servicing, inspection, and checkout, and ultimately, in-flight fault tolerant engine operations.
ERIC Educational Resources Information Center
Polito, Vincent A., Jr.
2010-01-01
The objective of this research was to explore the possibilities of identifying knowledge style factors that could be used as central elements of a professional business analyst's (PBA) performance attributes at work for those decision makers that use advanced analytical technologies on decision making tasks. Indicators of knowledge style were…
Badawy, Sherif I F; Narang, Ajit S; LaMarche, Keirnan R; Subramanian, Ganeshkumar A; Varia, Sailesh A; Lin, Judy; Stevens, Tim; Shah, Pankaj A
2016-01-01
Modern drug product development is expected to follow the quality-by-design (QbD) paradigm. At the same time, although there are several issue-specific examples in the literature that demonstrate the application of QbD principles, a holistic demonstration of their application to drug product development and control strategy is lacking. This article provides an integrated case study on the systematic application of QbD to product development and demonstrates the implementation of QbD concepts in the different aspects of product and process design for brivanib alaninate film-coated tablets. Using a risk-based approach, the strategy for development entailed identification of product critical quality attributes (CQAs), assessment of risks to the CQAs, and performing experiments to understand and mitigate identified risks. Quality risk assessments and design of experiments were performed to understand the quality of the input raw materials required for a robust formulation and the impact of manufacturing process parameters on CQAs. In addition to the material property and process parameter controls, the proposed control strategy includes the use of process analytical technology and conventional analytical tests to control in-process material attributes and ensure quality of the final product. Copyright © 2016. Published by Elsevier Inc.
Big, Deep, and Smart Data in Scanning Probe Microscopy.
Kalinin, Sergei V; Strelcov, Evgheni; Belianinov, Alex; Somnath, Suhas; Vasudevan, Rama K; Lingerfelt, Eric J; Archibald, Richard K; Chen, Chaomei; Proksch, Roger; Laanait, Nouamane; Jesse, Stephen
2016-09-27
Scanning probe microscopy (SPM) techniques have opened the door to nanoscience and nanotechnology by enabling imaging and manipulation of the structure and functionality of matter at nanometer and atomic scales. Here, we analyze the scientific discovery process in SPM by following the information flow from the tip-surface junction, to knowledge adoption by the wider scientific community. We further discuss the challenges and opportunities offered by merging SPM with advanced data mining, visual analytics, and knowledge discovery technologies.
Proactive human-computer collaboration for information discovery
NASA Astrophysics Data System (ADS)
DiBona, Phil; Shilliday, Andrew; Barry, Kevin
2016-05-01
Lockheed Martin Advanced Technology Laboratories (LM ATL) is researching methods, representations, and processes for human/autonomy collaboration to scale analysis and hypothesis substantiation for intelligence analysts. This research establishes a machine-readable hypothesis representation that is commonsensical to the human analyst. The representation unifies context between the human and computer, enabling autonomy, in the form of analytic software, to support the analyst by proactively acquiring, assessing, and organizing the high-value information needed to inform and substantiate hypotheses.
Rahman, Ziyaur; Siddiqui, Akhtar; Khan, Mansoor A
2013-12-01
The focus of the present investigation was to characterize and evaluate the variability of solid dispersions (SD) of amorphous vancomycin (VCM), utilizing crystalline polyethylene glycol (PEG-6000) as a carrier, and subsequently to determine their percentage composition by the nondestructive methods of process analytical technology (PAT) sensors. The SD were prepared by the heat fusion method and characterized for physicochemical and spectral properties. The SD formulations showed enhanced dissolution. Decreased crystallinity of PEG-6000 was observed, indicating that the drug was present in solution and dispersed form within the polymer. The SD formulations were homogeneous, as shown by near-infrared (NIR) chemical imaging data. Principal component analysis (PCA) and partial least squares (PLS) methods were applied to the NIR and PXRD (powder X-ray diffraction) data to develop models for quantification of drug and carrier. PLS models of both data types showed correlation coefficients >0.9934 with good prediction capability, as revealed by the small values of the root mean square and standard errors. The models based on NIR and PXRD were twofold more accurate in estimating PEG-6000 than VCM. In conclusion, drug dissolution from the SD increased with decreasing crystallinity of PEG-6000, and the chemometric models demonstrated the usefulness of PAT sensors in estimating the percentages of both VCM and PEG-6000 simultaneously. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
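As an illustration of the PLS calibration step reported above, the following scikit-learn sketch fits a PLS model to synthetic spectra standing in for the NIR/PXRD measurements; the band shape, noise level, and component count are assumptions for demonstration only:

```python
# A minimal sketch of a PLS calibration, assuming scikit-learn; the synthetic
# "spectra" stand in for the NIR/PXRD data of the study.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
drug_pct = rng.uniform(10, 50, 60)                    # % drug in each blend
wavelengths = np.linspace(0, 1, 200)
# Fake spectra: a concentration-dependent band plus baseline noise.
X = (drug_pct[:, None] * np.exp(-((wavelengths - 0.4) ** 2) / 0.01)
     + rng.normal(0, 0.5, (60, 200)))

X_tr, X_te, y_tr, y_te = train_test_split(X, drug_pct, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)

pred = pls.predict(X_te).ravel()
rmsep = np.sqrt(np.mean((pred - y_te) ** 2))          # prediction error
print(f"R^2 = {pls.score(X_te, y_te):.4f}, RMSEP = {rmsep:.2f} %")
```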
Biosimilars advancements: Moving on to the future.
Tsuruta, Lilian Rumi; Lopes dos Santos, Mariana; Moro, Ana Maria
2015-01-01
Many patents for the first biologicals derived from recombinant technology and, more recently, for monoclonal antibodies (mAbs) are expiring. Naturally, biosimilars are becoming an increasingly important area of interest for the pharmaceutical industry worldwide, not only for emerging countries that need to import biologic products. This review shows the evolution of biosimilar development regarding regulation, manufacturing bioprocesses, comparability, and marketing. The regulatory landscape is evolving globally, whereas analytical structural and functional analyses provide the foundation of a biosimilar development program. The challenges of developing and demonstrating biosimilarity must overcome the inherent differences in bioprocess manufacturing and in the physicochemical and biological characterization of a biosimilar compared to several lots of the reference product. The implementation of approaches such as Quality by Design (QbD) will provide products with defined specifications in relation to quality, purity, safety, and efficacy that were not possible when the reference product was developed. Indeed, the biosimilar industry's need to prove comparability to the reference product has increased knowledge about the product and the associated production process through the use of powerful analytical tools. The technological challenges of making copies of biologic products while meeting regulatory and market demands are expected to drive innovation toward more productive manufacturing processes. © 2015 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers.
Modeling timelines for translational science in cancer; the impact of technological maturation
McNamee, Laura M.; Ledley, Fred D.
2017-01-01
This work examines translational science in cancer based on theories of innovation that posit a relationship between the maturation of technologies and their capacity to generate successful products. We examined the growth of technologies associated with 138 anticancer drugs using an analytical model that identifies the point of initiation of exponential growth and the point at which growth slows as the technology becomes established. Approval of targeted and biological products corresponded with technological maturation, with first approval averaging 14 years after the established point and 44 years after initiation of associated technologies. The lag in cancer drug approvals after the increases in cancer funding and dramatic scientific advances of the 1970s thus reflects predictable timelines of technology maturation. Analytical models of technological maturation may be used for technological forecasting to guide more efficient translation of scientific discoveries into cures. PMID:28346525
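The paper's analytical model is described only qualitatively here; one common functional form for locating such initiation and establishment points is a logistic fit to the cumulative activity (e.g. publication count) associated with a technology:

```latex
N(t) = \frac{L}{1 + e^{-r\,(t - t_m)}}
```

where L is the saturation level, r the growth rate, and t_m the midpoint; the second derivative of N(t) is extremal at t_m ± ln(2+√3)/r, giving natural candidates for the initiation and established points. This is a generic sketch, not necessarily the authors' exact formulation.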
EarthServer: Cross-Disciplinary Earth Science Through Data Cube Analytics
NASA Astrophysics Data System (ADS)
Baumann, P.; Rossi, A. P.
2016-12-01
The unprecedented increase of imagery, in-situ measurements, and simulation data produced by Earth (and Planetary) Science observation missions bears a rich, yet unleveraged, potential for gaining insights by integrating such diverse datasets and for transforming scientific questions into actual queries on data, formulated in a standardized way. The intercontinental EarthServer [1] initiative is demonstrating new directions for flexible, scalable Earth Science services based on innovative NoSQL technology. Researchers from Europe, the US, and Australia have teamed up to rigorously implement the concept of the datacube. Such a datacube may have spatial and temporal dimensions (such as a satellite image time series) and may unite an unlimited number of scenes. Independently of whatever efficient data structuring a server network may perform internally, users (scientists, planners, decision makers) will always see just a few datacubes they can slice and dice. EarthServer has established client [2] and server technology for such spatio-temporal datacubes. The underlying scalable array engine, rasdaman [3,4], enables direct interaction, including 3-D visualization, common EO data processing, and general analytics. Services exclusively rely on the open OGC "Big Geo Data" standards suite, the Web Coverage Service (WCS). Conversely, EarthServer has shaped and advanced WCS based on the experience gained. The first phase of EarthServer has advanced scalable array database technology into 150+ TB services. Currently, Petabyte datacubes are being built for ad-hoc and cross-disciplinary querying, e.g. using climate, Earth observation, and ocean data. We will present the EarthServer approach, its impact on OGC / ISO / INSPIRE standardization, and its platform technology, rasdaman. References: [1] Baumann, et al. (2015) DOI: 10.1080/17538947.2014.1003106 [2] Hogan, P. (2011) NASA World Wind, Proceedings of the 2nd International Conference on Computing for Geospatial Research & Applications, ACM. [3] Baumann, P., et al. (2014) In Proc. 10th ICDM, 194-201. [4] Dumitru, A., et al. (2014) In Proc. ACM SIGMOD Workshop on Data Analytics in the Cloud (DanaC'2014), 1-4.
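To make the "slice and dice" idea concrete, here is a hedged sketch of a WCPS query sent over HTTP to a rasdaman endpoint; the server URL and coverage name (AvgTemperature) are hypothetical placeholders, and the request pattern follows the OGC WCS ProcessCoverages convention commonly exposed by rasdaman deployments.

```python
# Sketch: slicing a spatio-temporal datacube with a WCPS query via HTTP.
# Endpoint and coverage name are hypothetical; adapt to a real deployment.
import requests

endpoint = "https://example.org/rasdaman/ows"  # hypothetical server URL
coverage = "AvgTemperature"                    # hypothetical 3-D datacube

# WCPS: slice the time axis of the datacube, encode the result as TIFF.
wcps = (f'for $c in ({coverage}) '
        f'return encode($c[ansi("2014-07-01")], "image/tiff")')

resp = requests.get(endpoint, params={
    "service": "WCS", "version": "2.0.1",
    "request": "ProcessCoverages", "query": wcps,
}, timeout=60)
resp.raise_for_status()
with open("slice_2014-07.tif", "wb") as f:
    f.write(resp.content)  # a 2-D map extracted server-side from the cube
```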
[Real-time detection of quality of Chinese materia medica: strategy of NIR model evaluation].
Wu, Zhi-sheng; Shi, Xin-yuan; Xu, Bing; Dai, Xing-xing; Qiao, Yan-jiang
2015-07-01
The concept of critical quality attributes of Chinese materia medica (CMM) is put forward based on a top-level design concept. Coupled with developments in rapid analytical science, rapid assessment of the critical quality attributes of CMM has now been carried out, forming a secondary discipline branch of CMM. Taking near-infrared (NIR) spectroscopy as an example, a rapid analytical technology applied in pharmaceutical processes over the past decade, the chemometric parameters used in NIR model evaluation are systematically reviewed. In view of the complexity of CMM and the demands of trace-component analysis, a multi-source information fusion strategy for NIR models was developed for assessing the critical quality attributes of CMM. The strategy provides a guideline for reliable NIR analysis of the critical quality attributes of CMM.
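As a minimal illustration of the kind of chemometric parameters such model evaluation covers, the sketch below computes the root mean square errors of calibration and prediction (RMSEC/RMSEP) and a validation R² for a fitted NIR model; the reference values and predictions are synthetic stand-ins.

```python
# Sketch: standard figures of merit for evaluating an NIR calibration model.
import numpy as np

def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

rng = np.random.default_rng(7)
y_cal = rng.uniform(0, 10, 50)                 # calibration reference values
y_cal_hat = y_cal + rng.normal(0, 0.2, 50)     # model predictions (synthetic)
y_val = rng.uniform(0, 10, 20)                 # independent validation set
y_val_hat = y_val + rng.normal(0, 0.3, 20)

rmsec, rmsep = rmse(y_cal, y_cal_hat), rmse(y_val, y_val_hat)
r2 = 1 - np.sum((y_val - y_val_hat) ** 2) / np.sum((y_val - y_val.mean()) ** 2)
print(f"RMSEC={rmsec:.3f}  RMSEP={rmsep:.3f}  R2(val)={r2:.3f}")
# RMSEP close to RMSEC indicates the model generalizes beyond calibration data.
```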
Development and application of dynamic simulations of a subsonic wind tunnel
NASA Technical Reports Server (NTRS)
Szuch, J. R.; Cole, G. L.; Seidel, R. C.; Arpasi, D. J.
1986-01-01
Efforts are currently underway at NASA Lewis to improve and expand ground test facilities and to develop supporting technologies to meet anticipated aeropropulsion research needs. Many of these efforts have been focused on a proposed rehabilitation of the Altitude Wind Tunnel (AWT). In order to ensure a technically sound design, an AWT modeling program (both analytical and physical) was initiated to provide input to the AWT final design process. This paper describes the approach taken to develop analytical, dynamic computer simulations of the AWT and the use of these simulations as test beds for (1) predicting the dynamic response characteristics of the AWT and (2) evaluating proposed AWT control concepts. Plans for developing a portable, real-time simulator for the AWT facility are also described.
Wash water reclamation technology for advanced manned spacecraft
NASA Technical Reports Server (NTRS)
Putnam, D. F.
1977-01-01
The results of an analytical study and assessment of state-of-the-art wash water reclamation technology for advanced manned spacecraft are presented. All non-phase-change unit operations, unit processes, and subsystems currently under development by NASA are considered. Included among these are: filtration, ultrafiltration, carbon adsorption, ion exchange, chemical pretreatment, reverse osmosis, hyperfiltration, and certain urea removal techniques. Performance data are given together with the projected weights and sizes of key components and subsystems. In the final assessment, a simple multifiltration approach consisting of surface-type cartridge filters, carbon adsorption, and ion exchange resins receives the highest rating for six-man orbital missions of up to 10 years in duration.
“Playing around” with Field-Effect Sensors on the Basis of EIS Structures, LAPS and ISFETs
Schöning, Michael J.
2005-01-01
Microfabricated semiconductor devices are becoming increasingly relevant, including for the detection of biological and chemical quantities. In particular, the “marriage” of biomolecules and silicon technology often yields successful new sensor concepts. The fabrication techniques of such silicon-based chemical sensors and biosensors will have a distinct impact on different fields of application, such as medicine, food technology, the environment, chemistry and biotechnology, as well as information processing. Moreover, scientists and engineers are interested in the analytical benefits of miniaturised and microfabricated sensor devices. This paper gives a survey of different types of semiconductor-based field-effect structures that have recently been developed in our laboratory.
New tactics and technologies to meet the competitive utility environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Audin, L.
A new age is dawning for lower-cost energy use and supply. The deregulation of the electric industry is creating new pricing options that will change how one evaluates cost-cutting energy alternatives. As competition begins, smart users will grasp these opportunities and press for greater innovation on the part of marketers. Energy users can best navigate these choices by: understanding the concepts inherent in deregulation (such as transmission constraints); influencing the deregulation process (which does not end when markets first open); learning to use new analytical tools (such as load profile analysis); applying new technologies (e.g., wireless automatic metering); and being as creative as possible (because marketers won't be).
Mayer, Horst; Brümmer, Jens; Brinkmann, Thomas
2011-01-01
To implement Lean Six Sigma in our central laboratory, we conducted a project to measure single pre-analytical steps influencing the turnaround time (TAT) of emergency department (ED) serum samples. The traditional approach of extracting data from the Laboratory Information System (LIS) for a retrospective calculation of a mean TAT is not suitable. Therefore, we used radiofrequency identification (RFID) chips for real-time tracking of individual samples at every pre-analytical step. 1,200 serum tubes were labelled with RFID chips and provided to the emergency department. Three RFID receivers were installed in the laboratory: at the outlet of the pneumatic tube system, at the centrifuge, and in the analyser area. In addition, time stamps of sample entry at the automated sample distributor and of result communication from the analyser were collected from the LIS. 1,023 labelled serum tubes arrived at our laboratory, and 899 RFID tags were used for TAT calculation. The following transfer times were determined (median/95th percentile in min:sec): pneumatic tube system --> centrifuge (01:25/04:48), centrifuge --> sample distributor (14:06/5:33), sample distributor --> analysis system zone (02:39/15:07), analysis system zone --> result communication (12:42/22:21). Total TAT was calculated at 33:19/57:40 min:sec. Manual processes around centrifugation were identified as a major component of TAT, at 44%/60% (median/95th percentile). RFID is a robust, easy-to-use, and error-free technology that is not susceptible to interferences in the laboratory environment. With this study design we were able to measure significant variations in a single manual sample transfer process. We showed that TAT is mainly influenced by manual steps around the centrifugation process, and we concluded that centrifugation should be integrated into solutions for total laboratory automation.
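A hedged sketch of the underlying computation: given per-sample timestamps at consecutive tracking points, the per-step transfer times and their median/95th percentiles can be derived as below. Column names and data are invented for illustration.

```python
# Sketch: median / 95th-percentile transfer times between RFID checkpoints.
import numpy as np
import pandas as pd

rng = np.random.default_rng(3)
n = 899  # tags usable for TAT calculation, as in the study
t0 = pd.Timestamp("2011-01-01 08:00:00")
df = pd.DataFrame({
    "tube_arrival": [t0] * n,
    "centrifuge":   [t0 + pd.Timedelta(seconds=s) for s in rng.gamma(4, 25, n)],
})
df["step_sec"] = (df["centrifuge"] - df["tube_arrival"]).dt.total_seconds()

med, p95 = df["step_sec"].median(), df["step_sec"].quantile(0.95)
fmt = lambda s: f"{int(s // 60):02d}:{int(s % 60):02d}"
print(f"tube system -> centrifuge: {fmt(med)}/{fmt(p95)} (median/95th, mm:ss)")
```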
Vann, Lucas; Sheppard, John
2017-12-01
Control of biopharmaceutical processes is critical to achieving consistent product quality. The most challenging unit operation to control is cell growth in bioreactors, owing to the exquisitely sensitive and complex nature of the cells that convert raw materials into new cells and products. Monitoring capabilities are increasing; however, the main challenge is now the ability to use the data generated in an effective manner. Contributors to this challenge include the integration of different monitoring systems as well as the functionality to perform data analytics in real time to generate process knowledge and understanding. In addition, there is a lack of ability to easily generate strategies and close the loop to feed back into the process for advanced process control (APC). The current research aims to demonstrate the use of advanced monitoring tools along with data analytics to generate process understanding in an Escherichia coli fermentation process. NIR spectroscopy was used to measure glucose and critical amino acids in real time to help determine the root cause of failures associated with different lots of yeast extract. First, scale-down of the process was required to execute a simple design of experiments, followed by scale-up to build NIR models as well as soft sensors for advanced process control. In addition, the research demonstrates the potential for a novel platform technology that enables manufacturers to consistently achieve "golden batch" performance through monitoring, integration, data analytics, understanding, strategy design, and control (MIDUS control). MIDUS control was employed to increase batch-to-batch consistency in final product titers, decrease the coefficient of variability from 8.49% to 1.16%, predict possible exhaust filter failures, and close the loop to prevent their occurrence and avoid lost batches.
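The following is a minimal sketch of the closed-loop idea only: an NIR-based soft sensor estimates glucose and a simple proportional controller adjusts the feed toward a setpoint. The calibration vector, setpoint, gain, and mass balance are invented placeholders, not values or methods from the study.

```python
# Sketch: NIR soft sensor + proportional feed control for a fed-batch process.
import numpy as np

rng = np.random.default_rng(5)
b = rng.normal(0, 0.1, 64)        # placeholder regression vector (trained offline)
setpoint, gain = 5.0, 0.8         # g/L glucose target and controller gain (assumed)

def soft_sensor(spectrum: np.ndarray) -> float:
    """Estimate glucose (g/L) from an NIR spectrum via a linear PLS-type model."""
    return float(spectrum @ b)

glucose = 8.0
for hour in range(10):
    spectrum = rng.normal(glucose / (b @ b) * b, 0.01)  # synthetic spectrum
    estimate = soft_sensor(spectrum)
    feed = max(0.0, gain * (setpoint - estimate))       # proportional control action
    glucose += feed - 1.2            # toy mass balance: consumption 1.2 g/L per h
    print(f"h{hour}: est={estimate:.2f} g/L, feed={feed:.2f}")
```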
Application of the near-infrared spectroscopy in the pharmaceutical technology.
Jamrógiewicz, Marzena
2012-07-01
Near-infrared (NIR) spectroscopy is currently among the fastest-growing and most versatile analytical methods, not only in the pharmaceutical sciences but also in industry. This review focuses on recent NIR applications in pharmaceutical technology. It covers NIR monitoring of many manufacturing processes, such as granulation, mixing, or drying, in order to determine the end-point of these processes. Apart from basic theoretical information concerning NIR spectra, determinations of the quality and quantity of pharmaceutical compounds are included. Examples are given of measurement and control of physicochemical parameters of final medicinal products, such as hardness, porosity, thickness, size, compression strength, disintegration time, and potential counterfeiting. Biotechnology and plant drug analysis using NIR is also described. Moreover, some disadvantages of the method are stressed and future perspectives are anticipated. Copyright © 2012 Elsevier B.V. All rights reserved.
Data warehousing leads to improved business performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, R.
1995-09-01
Data warehousing is emerging as one of the most significant trends in information technology (IT) during the 1990s. According to William H. Inmon, sometimes referred to as the father of data warehousing, a data warehouse is a subject-oriented, integrated, nonvolatile, time-variant collection of data organized to support management needs. Data warehousing can: provide integrated, historical and operational data; integrate disparate application systems; and organize and store data for informational, analytical processing. Data warehousing offers an opportunity to address today's problems of realizing a return on the massive investments being made in acquiring and managing E and P data. Effective implementations require an understanding of the business benefits being sought and an adaptive, flexible IT architecture for supporting the processes and technologies involved. As national E and P data archives continue to emerge and complement existing data reserves within E and P companies, expect to see increased use of data warehousing to merge these two environments.
NASA Astrophysics Data System (ADS)
Hsu, Pi-Sui
The purpose of this qualitative case study was to provide a detailed description of the change process of technology integration into a science methods course, SCIED 458, as well as to interpret and analyze essential issues involved in the change process and examine how these factors influenced the change process. This study undertook qualitative research that employed case study research design. In-depth interviewing and review of the documents were two major data collection methods in this study. Participants included the three key faculty members in the science education program, a former graduate student who participated in writing the Link-to-Learn grant proposal, a former graduate student who taught SCIED 458, and two current graduate students who were teaching SCIED 458. A number of data analysis strategies were used in this study; these strategies included (1) coding for different periods of time and project categories and roles of people, (2) identifying themes, trends and coding for patterns, (3) reducing the data for analysis of trends and synthesizing and summarizing the data, and (4) integrating the data into one analytical framework. The findings indicated that this change process had evolved through the stages of adoption and diffusion, implementation, and institutionalization and a number of strategies facilitated the changes in individual stages, including the formation of a leadership team in the early stages, gradual adoption of technology tools, use of powerful pedagogy and methodology, the formation of a research community, and separation of technology training and subject teaching. The findings also indicated the essential factors and systems that interacted with each other and sustained the change process; these included a transformational shared leadership team, the formation of a learning and research community, reduced resistance of the elementary prospective teachers to technology, availability of university resources, involvement of the local school districts, support of the state department of education, recognition of the professional organizations, creation of partnerships with software companies, and technology advancements in society. A framework for integrating technology was presented to assist school reformers and instructional designers in initiating, implementing, and sustaining the changes due to technology integration in a systemic manner.
Masucci, Giuseppe V; Cesano, Alessandra; Hawtin, Rachael; Janetzki, Sylvia; Zhang, Jenny; Kirsch, Ilan; Dobbin, Kevin K; Alvarez, John; Robbins, Paul B; Selvan, Senthamil R; Streicher, Howard Z; Butterfield, Lisa H; Thurin, Magdalena
2016-01-01
Immunotherapies have emerged as one of the most promising approaches to treat patients with cancer. Recently, there have been many clinical successes using checkpoint receptor blockade, including T cell inhibitory receptors such as cytotoxic T-lymphocyte-associated antigen 4 (CTLA-4) and programmed cell death-1 (PD-1). Despite demonstrated successes in a variety of malignancies, responses typically occur in only a minority of patients in any given histology. Additionally, treatment is associated with inflammatory toxicity and high cost. Therefore, determining which patients would derive clinical benefit from immunotherapy is a compelling clinical question. Although numerous candidate biomarkers have been described, there are currently three FDA-approved assays based on PD-1 ligand expression (PD-L1) that have been clinically validated to identify patients who are more likely to benefit from single-agent anti-PD-1/PD-L1 therapy. Because of the complexity of the immune response and tumor biology, it is unlikely that a single biomarker will be sufficient to predict clinical outcomes in response to immune-targeted therapy. Rather, the integration of multiple tumor and immune response parameters, such as protein expression, genomics, and transcriptomics, may be necessary for accurate prediction of clinical benefit. Before a candidate biomarker and/or new technology can be used in a clinical setting, several steps are necessary to demonstrate its clinical validity. Although regulatory guidelines provide general roadmaps for the validation process, their applicability to biomarkers in the cancer immunotherapy field is somewhat limited. Thus, Working Group 1 (WG1) of the Society for Immunotherapy of Cancer (SITC) Immune Biomarkers Task Force convened to address this need. In this two-volume series, we discuss pre-analytical and analytical (Volume I) as well as clinical and regulatory (Volume II) aspects of the validation process as applied to predictive biomarkers for cancer immunotherapy. To illustrate the requirements for validation, we discuss examples of biomarker assays that have shown preliminary evidence of an association with clinical benefit from immunotherapeutic interventions. The scope includes only those assays and technologies that have established a certain level of validation for clinical use (fit-for-purpose). Recommendations to meet challenges and strategies to guide the choice of analytical and clinical validation design for specific assays are also provided.
Analytical Nanoscience and Nanotechnology: Where we are and where we are heading.
Laura Soriano, María; Zougagh, Mohammed; Valcárcel, Miguel; Ríos, Ángel
2018-01-15
The main aim of this paper is to offer an objective and critical overview of the situation and trends in Analytical Nanoscience and Nanotechnology (AN&N), an important break point in the evolution of Analytical Chemistry in the XXI century, much as computers and instruments were in the second half of the XX century. The first part of this overview provides a general approach to AN&N by describing the state of the art of this recent topic and emphasizing its importance. Secondly, particularly relevant trends in the topic are outlined: the analysis of the nanoworld, the so-called "third way" in AN&N, the growing importance of bioanalysis, the evaluation of both nanosensors and nanosorbents, the impact of AN&N on bioimaging and nanotoxicological studies, as well as the crucial importance of the reliability of nanotechnological processes and results for solving real analytical problems within the frame of the Social Responsibility (SR) of science and technology. Several reflections are included at the end of this overview, which is written as a bird's-eye view, not an easy task even for experts in AN&N. Copyright © 2017 Elsevier B.V. All rights reserved.
Automated Predictive Big Data Analytics Using Ontology Based Semantics.
Nural, Mustafa V; Cotterell, Michael E; Peng, Hao; Xie, Rui; Ma, Ping; Miller, John A
2015-10-01
Predictive analytics in the big data era is taking on an ever increasingly important role. Issues related to choice on modeling technique, estimation procedure (or algorithm) and efficient execution can present significant challenges. For example, selection of appropriate and optimal models for big data analytics often requires careful investigation and considerable expertise which might not always be readily available. In this paper, we propose to use semantic technology to assist data analysts and data scientists in selecting appropriate modeling techniques and building specific models as well as the rationale for the techniques and models selected. To formally describe the modeling techniques, models and results, we developed the Analytics Ontology that supports inferencing for semi-automated model selection. The SCALATION framework, which currently supports over thirty modeling techniques for predictive big data analytics is used as a testbed for evaluating the use of semantic technology.
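As a loose illustration of semantics-assisted model selection (not the actual Analytics Ontology or the SCALATION API), the sketch below encodes a few selection rules over dataset properties and returns candidate techniques together with a rationale; the rule vocabulary is an invented stand-in for an ontology plus reasoner.

```python
# Sketch: rule-based recommendation of modeling techniques from data traits.
from dataclasses import dataclass

@dataclass
class Dataset:
    n_rows: int
    n_features: int
    response: str          # "continuous" | "binary" | "count"
    collinear: bool

def recommend(d: Dataset) -> list:
    out = []
    if d.response == "continuous":
        if d.collinear:
            out.append(("ridge/PLS regression", "continuous response, collinear features"))
        else:
            out.append(("ordinary least squares", "continuous response, well-conditioned X"))
    elif d.response == "binary":
        out.append(("logistic regression", "binary response"))
    if d.n_rows > 1_000_000:
        out.append(("distributed/mini-batch variant", "row count exceeds single-node comfort"))
    return out

print(recommend(Dataset(n_rows=5_000_000, n_features=120,
                        response="continuous", collinear=True)))
```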
The VAST Challenge: History, Scope, and Outcomes: An introduction to the Special Issue
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cook, Kristin A.; Grinstein, Georges; Whiting, Mark A.
2014-10-01
Visual analytics aims to facilitate human insight from complex data via a combination of visual representations, interaction techniques, and supporting algorithms. To create new tools and techniques that achieve this goal requires that researchers have an understanding of the analytical questions to be addressed, data that illustrate the complexities and ambiguities found in realistic analytic settings, and methods for evaluating whether plausible insights are gained through use of the new methods. However, researchers do not, generally speaking, have access to analysts who can articulate their problems, or to the operational data that is used for analysis. To fill this gap, the Visual Analytics Science and Technology (VAST) Challenge has been held annually since 2006. The VAST Challenge provides an opportunity for researchers to experiment with realistic but not real problems, using realistic synthetic data with known events embedded. Since its inception, the VAST Challenge has evolved along with the visual analytics research community to pose more complex challenges, ranging from text analysis to video analysis to large-scale network log analysis. The seven years of the VAST Challenge have seen advancements in research and development, education, evaluation, and in the challenge process itself. This special issue of Information Visualization highlights some of the noteworthy advancements in each of these areas. Some of these papers focus on important research questions related to the challenge itself, and other papers focus on innovative research that has been shaped by participation in the challenge. This paper describes the VAST Challenge process and benefits in detail. It also provides an introduction to and context for the remaining papers in the issue.
Improving preanalytic processes using the principles of lean production (Toyota Production System).
Persoon, Thomas J; Zaleski, Sue; Frerichs, Janice
2006-01-01
The basic technologies used in preanalytic processes for chemistry tests have been mature for a long time, and improvements in preanalytic processes have lagged behind improvements in analytic and postanalytic processes. We describe our successful efforts to improve chemistry test turnaround time from a central laboratory by improving preanalytic processes, using existing resources and the principles of lean production. Our goal is to report 80% of chemistry tests in less than 1 hour and to no longer recognize a distinction between expedited and routine testing. We used principles of lean production (the Toyota Production System) to redesign preanalytic processes. The redesigned preanalytic process has fewer steps and uses 1-piece flow to move blood samples through the accessioning, centrifugation, and aliquoting processes. Median preanalytic processing time was reduced from 29 to 19 minutes, and the laboratory met the goal of reporting 80% of chemistry results in less than 1 hour for 11 consecutive months.
NASA Astrophysics Data System (ADS)
Nayak, Aditya B.; Price, James M.; Dai, Bin; Perkins, David; Chen, Ding Ding; Jones, Christopher M.
2015-06-01
Multivariate optical computing (MOC), an optical sensing technique for analog calculation, allows direct and robust measurement of chemical and physical properties of complex fluid samples in high-pressure/high-temperature (HP/HT) downhole environments. The core of this MOC technology is the integrated computational element (ICE), an optical element with a wavelength-dependent transmission spectrum designed to allow the detector to respond sensitively and specifically to the analytes of interest. A key differentiator of this technology is that it uses all of the information present in the broadband optical spectrum to determine the proportion of the analyte present in a complex fluid mixture. The detection methodology is photometric in nature; therefore, this technology requires neither a spectrometer to measure and record a spectrum nor a computer to perform calculations on the recorded optical spectrum. The integrated computational element is a thin-film optical element with a specific optical response function designed for each analyte. The optical response function is achieved by fabricating alternating layers of high-index (a-Si) and low-index (SiO2) thin films onto a transparent substrate (BK7 glass) using traditional thin-film manufacturing processes (e.g., ion-assisted e-beam vacuum deposition). A proprietary software package and process are used to control the thickness and material properties, including the optical constants of the materials during deposition, to achieve the desired optical response function. Ion-assisted deposition is useful for controlling the densification, stoichiometry, and material optical constants of the film, as well as for achieving high deposition growth rates and moisture-stable films. However, the ion source can induce undesirable absorption in the film and subsequently modify the optical constants of the material during the ramp-up and stabilization periods of the e-gun and ion source, respectively. This paper characterizes the unwanted absorption in the a-Si thin film using advanced thin-film metrology methods, including spectroscopic ellipsometry and Fourier transform infrared (FTIR) spectroscopy. The resulting analysis identifies a fundamental mechanism contributing to this absorption and a method for minimizing and accounting for the unwanted absorption in the thin film such that the exact optical response function can be achieved.
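A hedged numerical sketch of the MOC principle described above: the detector signal is, in effect, the inner product of the sample's spectrum with the ICE filter's wavelength-dependent transmission, so an ICE design can be emulated as a regression vector applied optically. Everything below is synthetic, and a real ICE is additionally constrained to physically realizable (non-negative) transmission.

```python
# Sketch: emulate an integrated computational element (ICE) as an optical
# dot product between a sample spectrum and a designed filter function.
import numpy as np

rng = np.random.default_rng(11)
wl = np.linspace(1000, 2500, 300)                         # wavelength grid (nm), assumed
pure = np.exp(-((wl[:, None] - [1400, 1900]) / 80) ** 2)  # two synthetic bands

conc = rng.uniform(0, 1, (500, 2))                        # analyte + interferent levels
spectra = conc @ pure.T + rng.normal(0, 0.005, (500, 300))

# "Design" the filter by least squares so that spectra @ filt ~ analyte level.
filt, *_ = np.linalg.lstsq(spectra, conc[:, 0], rcond=None)

test = np.array([0.3, 0.8]) @ pure.T                      # unseen mixture spectrum
print("predicted analyte level:", float(test @ filt))     # approximately 0.3
```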
Scalable Predictive Analysis in Critically Ill Patients Using a Visual Open Data Analysis Platform
Poucke, Sven Van; Zhang, Zhongheng; Schmitz, Martin; Vukicevic, Milan; Laenen, Margot Vander; Celi, Leo Anthony; Deyne, Cathy De
2016-01-01
With the accumulation of large amounts of health related data, predictive analytics could stimulate the transformation of reactive medicine towards Predictive, Preventive and Personalized (PPPM) Medicine, ultimately affecting both cost and quality of care. However, high-dimensionality and high-complexity of the data involved, prevents data-driven methods from easy translation into clinically relevant models. Additionally, the application of cutting edge predictive methods and data manipulation require substantial programming skills, limiting its direct exploitation by medical domain experts. This leaves a gap between potential and actual data usage. In this study, the authors address this problem by focusing on open, visual environments, suited to be applied by the medical community. Moreover, we review code free applications of big data technologies. As a showcase, a framework was developed for the meaningful use of data from critical care patients by integrating the MIMIC-II database in a data mining environment (RapidMiner) supporting scalable predictive analytics using visual tools (RapidMiner’s Radoop extension). Guided by the CRoss-Industry Standard Process for Data Mining (CRISP-DM), the ETL process (Extract, Transform, Load) was initiated by retrieving data from the MIMIC-II tables of interest. As use case, correlation of platelet count and ICU survival was quantitatively assessed. Using visual tools for ETL on Hadoop and predictive modeling in RapidMiner, we developed robust processes for automatic building, parameter optimization and evaluation of various predictive models, under different feature selection schemes. Because these processes can be easily adopted in other projects, this environment is attractive for scalable predictive analytics in health research. PMID:26731286
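As a small, hedged companion to the platelet-count use case, here is how the association with ICU survival might be quantified outside the visual RapidMiner pipeline; the data below are simulated, not drawn from MIMIC-II, and the effect size is invented.

```python
# Sketch: quantifying a platelet-count / ICU-survival association with a
# logistic model, mirroring the showcase analysis on simulated data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
platelets = rng.normal(220, 80, 2000).clip(5, None)   # x10^9/L, synthetic
logit = -2.2 + 0.012 * (platelets - 150)              # assumed true effect
survived = rng.random(2000) < 1 / (1 + np.exp(-logit))

X = platelets.reshape(-1, 1)
model = LogisticRegression().fit(X, survived)
auc = roc_auc_score(survived, model.predict_proba(X)[:, 1])
print(f"odds ratio per 10 units: {np.exp(model.coef_[0][0] * 10):.2f}, AUC={auc:.2f}")
```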
Data Analytics and Visualization for Large Army Testing Data
2013-09-01
Online Learner Engagement: Opportunities and Challenges with Using Data Analytics
ERIC Educational Resources Information Center
Bodily, Robert; Graham, Charles R.; Bush, Michael D.
2017-01-01
This article describes the crossroads between learning analytics and learner engagement. The authors do this by describing specific challenges of using analytics to support student engagement from three distinct perspectives: pedagogical considerations, technological issues, and interface design concerns. While engaging online learners presents a…
Water flow in fractured rock masses: numerical modeling for tunnel inflow assessment
NASA Astrophysics Data System (ADS)
Gattinoni, P.; Scesi, L.; Terrana, S.
2009-04-01
Water circulation in rocks is a very important element in many problems in civil, environmental, and mining engineering. In particular, the interaction of tunnelling with groundwater has become a very relevant problem, not only because of the need to safeguard water resources from impoverishment and pollution risk, but also to guarantee the safety of workers and to assure the efficiency of tunnel drainage systems. The evaluation of the hydrogeological risk linked to underground excavation is very complex, owing both to the large number of variables involved and to the lack of data available during the planning stage. This study aims to quantify the influence of some geo-structural parameters (i.e., discontinuity dip and dip direction) on the tunnel drainage process, comparing the traditional analytical method with the modeling approach, with specific reference to the case of anisotropic rock masses. To forecast tunnel inflows, several authors suggest analytic formulations (Goodman et al., 1965; Knutsson et al., 1996; Ribacchi et al., 2002; Park et al., 2008; Perrochet et al., 2007; Cesano et al., 2003; Hwang et al., 2007), valid for an infinite, homogeneous, and isotropic aquifer, in which the permeability is given as a modulus of equivalent hydraulic conductivity Keq. By contrast, in discontinuous rock masses the water flow is strongly controlled by joint orientation, by joint hydraulic characteristics, and by rock fracturing conditions. The analytic equations found in the technical literature can be very useful, but often they do not reflect the real phenomena of tunnel inflow in rock masses: being based on the hypothesis of a homogeneous aquifer, they do not give good agreement for a heterogeneous fractured medium. In this latter case, numerical modelling can provide the best results, but only with a detailed conceptual model of the water circulation, high costs, and long simulation times. Therefore, the integration of the analytic method and numerical modeling is very important to adapt the analytic formulas to the specific hydrogeological structure. The study was carried out through parametrical modeling, in which groundwater flow was simulated with the DEM model UDEC 2D, considering different geometrical (tunnel depth and radius) and hydrogeological (piezometric) settings. The influence of the geo-structural setting (dip and dip direction of discontinuities, with reference to their permeability) on the tunnel drainage process was quantified. The simulations are aimed at creating a sufficient data set of tunnel inflows, in different geological-structural settings, enabling a quantitative comparison between numerical results and the well-known analytic formulas (i.e., the Goodman and El Tani equations). Results of this comparison point out the following aspects: - the geological-structural setting most critical for hydrogeological risk in tunnels corresponds to joints having low dip (close to 0°), which favour the drainage process and increase the tunnel inflow; - the rock mass anisotropy strongly influences both the tunnel inflow and the water table drawdown; - the reliability of analytic formulas for tunnel inflow assessment in discontinuous rock masses depends on the geo-structural setting; in fact, the analytic formulas overestimate the tunnel inflow, and this overestimation is greater for geo-structural settings whose discontinuities have higher dips.
Finally, using the results of the parametrical modeling, the previously cited analytic formulas were corrected to obtain an empirical equation that gives the tunnel inflow as a function of the geological-structural setting, with particular regard to: - the horizontal component of the discontinuities, - the hydraulic conductivity anisotropy ratio, - the orientation of the hydraulic conductivity tensor. The obtained empirical equation allows a first evaluation of the tunnel inflow that takes joint characteristics into account, which is very useful to identify the areas where in-depth studies are required. References: Cesano D., Bagtzoglou A.C., Olofsson B. (2003). Quantifying fractured rock hydraulic heterogeneity and groundwater inflow prediction in underground excavations: the heterogeneity index. Tunnelling and Underground Space Technology, 18, pp. 19-34. El Tani M. (2003). Circular tunnel in a semi-infinite aquifer. Tunnelling and Underground Space Technology, 18, pp. 49-55. Goodman R.E., Moye D.G., Van Schalkwyk A., Javandel I. (1965). Ground water inflow during tunnel driving. Eng. Geol., 2, pp. 39-56. Hwang J-H., Lu C-C. (2007). A semi-analytical method for analyzing the tunnel water inflow. Tunnelling and Underground Space Technology, 22, pp. 39-46. Itasca (2001). UDEC, User's guide. Itasca Consulting Group Inc., Minneapolis, Minnesota. Knutsson G., Olofsson B., Cesano D. (1996). Prognosis of groundwater inflows and drawdown due to the construction of rock tunnels in heterogeneous media. Res. Proj. Rep., Kungl Tekniska, Stockholm. Park K-H., Owatsiriwong A., Lee G-G. (2008). Analytical solution for steady-state groundwater inflow into a drained circular tunnel in a semi-infinite aquifer: a revisit. Tunnelling and Underground Space Technology, 23, pp. 206-209. Perrochet P., Dematteis A. (2007). Modelling transient discharge into a tunnel drilled in a heterogeneous formation. Ground Water, 45(6), pp. 786-790.
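For orientation, here is a hedged worked example of the kind of analytic estimate being benchmarked: Goodman's steady-state formula for inflow per unit tunnel length, Q = 2*pi*K*h / ln(2h/r), coded below with illustrative parameter values. It assumes an isotropic, homogeneous aquifer, which is exactly the limitation the abstract discusses.

```python
# Sketch: Goodman et al. (1965) steady-state inflow per unit tunnel length,
# for a homogeneous, isotropic aquifer: water table height h above the
# tunnel axis, tunnel radius r, hydraulic conductivity K.
import math

def goodman_inflow(K: float, h: float, r: float) -> float:
    """Inflow Q (m^3/s per metre of tunnel) = 2*pi*K*h / ln(2h/r)."""
    if 2 * h <= r:
        raise ValueError("formula requires 2h > r (deep-tunnel assumption)")
    return 2 * math.pi * K * h / math.log(2 * h / r)

# Illustrative values: K = 1e-6 m/s, water table 50 m above axis, r = 3 m.
Q = goodman_inflow(K=1e-6, h=50.0, r=3.0)
print(f"Q = {Q:.2e} m^3/s per m (~{Q * 1000 * 86400:.0f} L/day per m)")
```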
The Dairy Technology System in Venezuela. Summary of Research 79.
ERIC Educational Resources Information Center
Nieto, Ruben D.; Henderson, Janet L.
A study examined the agricultural technology system in Venezuela with emphasis on the dairy industry. An analytical framework was used to identify the strengths and weaknesses of the following components of Venezuela's agricultural technology system: policy, technology development, technology transfer, and technology use. Selected government…
Statistically Qualified Neuro-Analytic system and Method for Process Monitoring
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
1998-11-04
An apparatus and method for monitoring a process involve the development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two steps: deterministic model adaptation and stochastic model adaptation. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation error minimization technique. Stochastic model adaptation involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system.
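A rough sketch of the hybrid "analytic model plus neural network" idea (not the patented SQNA procedure itself): a first-principles model explains what it can, and a small network is trained on the residuals to capture the unknown characteristics.

```python
# Sketch: hybrid neuro-analytic model = analytic prior + NN on residuals.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, (800, 1))

def analytic(x):                              # known first-principles part (assumed)
    return 2.0 * x[:, 0]

truth = analytic(x) + 0.5 * np.sin(x[:, 0])   # plus unknown dynamics
y = truth + rng.normal(0, 0.05, 800)

residual = y - analytic(x)                    # what the analytic model cannot explain
nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000,
                  random_state=0).fit(x, residual)

x_new = np.array([[4.2]])
y_hat = analytic(x_new) + nn.predict(x_new)   # combined neuro-analytic estimate
print(float(y_hat[0]), 2.0 * 4.2 + 0.5 * np.sin(4.2))
```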
Computational toxicity in 21st century safety sciences (China ...
Presentation at the Joint Meeting of the Analytical Toxicology and Computational Toxicology Committee (Chinese Society of Toxicology) International Workshop on Advanced Chemical Safety Assessment Technologies on 11 May 2016, Fuzhou University, Fuzhou, China.
The Convergence of High Performance Computing and Large Scale Data Analytics
NASA Astrophysics Data System (ADS)
Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.
2015-12-01
As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.
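A toy sketch of the spatiotemporal-index idea described above: a small relational table maps (variable, time, bounding box) to data locations so a query can be routed straight to the right blocks. The schema, paths, and values are invented.

```python
# Sketch: relational spatiotemporal index over datacube blocks, so queries
# map directly to storage locations instead of scanning files.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE block_index (
    variable TEXT, t_start TEXT, t_end TEXT,
    lon_min REAL, lon_max REAL, lat_min REAL, lat_max REAL,
    location TEXT)""")                        # e.g. an HDFS path

con.execute("INSERT INTO block_index VALUES "
            "('T2M','2015-06-01','2015-06-30',-10,40,30,70,"
            "'hdfs:///merra/T2M/2015/06/block_0007')")

# Query: which blocks cover Paris in mid-June 2015?
rows = con.execute("""SELECT location FROM block_index
    WHERE variable='T2M'
      AND t_start<='2015-06-15' AND t_end>='2015-06-15'
      AND lon_min<=2.3 AND lon_max>=2.3
      AND lat_min<=48.9 AND lat_max>=48.9""").fetchall()
print(rows)   # -> [('hdfs:///merra/T2M/2015/06/block_0007',)]
```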
Understanding Education Involving Geovisual Analytics
ERIC Educational Resources Information Center
Stenliden, Linnea
2013-01-01
Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…
Technology Enhanced Analytics (TEA) in Higher Education
ERIC Educational Resources Information Center
Daniel, Ben Kei; Butson, Russell
2013-01-01
This paper examines the role of Big Data Analytics in addressing contemporary challenges associated with current changes in institutions of higher education. The paper first explores the potential of Big Data Analytics to support instructors, students and policy analysts to make better evidence based decisions. Secondly, the paper presents an…
Penetrating the Fog: Analytics in Learning and Education
ERIC Educational Resources Information Center
Siemens, George; Long, Phil
2011-01-01
Attempts to imagine the future of education often emphasize new technologies--ubiquitous computing devices, flexible classroom designs, and innovative visual displays. But the most dramatic factor shaping the future of higher education is something that people cannot actually touch or see: "big data and analytics." Learning analytics is still in…
Developing a Code of Practice for Learning Analytics
ERIC Educational Resources Information Center
Sclater, Niall
2016-01-01
Ethical and legal objections to learning analytics are barriers to development of the field, thus potentially denying students the benefits of predictive analytics and adaptive learning. Jisc, a charitable organization that champions the use of digital technologies in UK education and research, has attempted to address this with the development of…
Overcoming Barriers to Educational Analytics: How Systems Thinking and Pragmatism Can Help
ERIC Educational Resources Information Center
Macfadyen, Leah P.
2017-01-01
Learning technologies are now commonplace in education, and generate large volumes of educational data. Scholars have argued that analytics can and should be employed to optimize learning and learning environments. This article explores what is really meant by "analytics", describes the current best-known examples of institutional…
Investigation of Using Analytics in Promoting Mobile Learning Support
ERIC Educational Resources Information Center
Visali, Videhi; Swami, Niraj
2013-01-01
Learning analytics can promote pedagogically informed use of learner data, which can steer the progress of technology mediated learning across several learning contexts. This paper presents the application of analytics to a mobile learning solution and demonstrates how a pedagogical sense was inferred from the data. Further, this inference was…
Improta, Giovanni; Russo, Mario Alessandro; Triassi, Maria; Converso, Giuseppe; Murino, Teresa; Santillo, Liberatina Carmela
2018-05-01
Health technology assessments (HTAs) are often difficult to conduct because the decision procedures of the HTA algorithm are often complex and not easy to apply. Thus, their use is not always convenient or possible for the assessment of technical requests requiring a multidisciplinary approach. This paper aims to address this issue through a multi-criteria analysis focusing on the analytic hierarchy process (AHP). This methodology allows the decision maker to analyse and evaluate different alternatives and to monitor their impact on different actors during the decision-making process. The multi-criteria analysis is then implemented through a simulation model to overcome the limitations of the AHP methodology. Simulations help decision makers to make an appropriate decision and to avoid unnecessary and costly attempts. Finally, a decision problem regarding the evaluation of two health technologies, namely two biological prostheses for incisional infected hernias, is analysed to assess the effectiveness of the model. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
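For readers unfamiliar with AHP mechanics, below is a minimal sketch of the core step: deriving priority weights from a pairwise comparison matrix via its principal eigenvector, with a consistency check. The comparison values and criteria are illustrative, not from the paper.

```python
# Sketch: AHP priorities from a pairwise comparison matrix (Saaty scale).
import numpy as np

# Criteria compared pairwise, e.g. efficacy vs safety vs cost (illustrative).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                    # principal eigenvalue index
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                   # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)           # consistency index
cr = ci / 0.58                                 # random index RI = 0.58 for n = 3
print("weights:", np.round(w, 3), " CR:", round(cr, 3))  # CR < 0.1 is acceptable
```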
Korasa, Klemen; Vrečer, Franc
2018-01-01
Over the last two decades, regulatory agencies have demanded better understanding of pharmaceutical products and processes by implementing new technological approaches, such as process analytical technology (PAT). Process analysers present a key PAT tool, which enables effective process monitoring, and thus improved process control, of medicinal product manufacturing. Process analysers applicable in pharmaceutical coating unit operations are comprehensively described in the present article. The review is focused on monitoring of solid oral dosage forms during film coating in the two most commonly used coating systems, i.e. pan and fluid bed coaters. A brief theoretical background and a critical overview of process analysers used for real-time or near real-time (in-, on-, at-line) monitoring of critical quality attributes of film coated dosage forms are presented. Besides the well recognized spectroscopic methods (NIR and Raman spectroscopy), other techniques which have made a significant breakthrough in recent years are discussed (terahertz pulsed imaging (TPI), chord length distribution (CLD) analysis, and image analysis). The last part of the review is dedicated to novel techniques with high potential to become valuable PAT tools in the future (optical coherence tomography (OCT), acoustic emission (AE), microwave resonance (MR), and laser induced breakdown spectroscopy (LIBS)). Copyright © 2017 Elsevier B.V. All rights reserved.
Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engel, David W.; Dalton, Angela C.; Dale, Crystal
2014-06-01
Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.
Perez-Diaz de Cerio, David; Hernández, Ángela; Valenzuela, Jose Luis; Valdovinos, Antonio
2017-01-01
The purpose of this paper is to evaluate, from a real-world perspective, the performance of Bluetooth Low Energy (BLE) as a technology that enables fast and reliable discovery of a large number of users/devices in a short period of time. The BLE standard specifies a wide range of configurable parameter values that determine the discovery process and need to be set according to the particular application requirements. Many previous works have investigated the discovery process through analytical and simulation models based on the ideal specification of the standard. However, measurements show that additional scanning gaps appear in the scanning process, which reduce the discovery capabilities. These gaps have been identified in all of the analyzed devices and correspond to both regular patterns and variable events associated with the decoding process. We have demonstrated that these non-idealities, which are not taken into account in other studies, have a severe impact on the performance of the discovery process. Extensive performance evaluation for a varying number of devices and feasible parameter combinations has been carried out by comparing simulations and experimental measurements. This work also includes a simple mathematical model that closely matches both the standard implementation and the different chipset peculiarities for any possible parameter value specified in the standard and for any number of simultaneous advertising devices under scanner coverage. PMID:28273801
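A hedged toy model of the process the authors measure: a Monte Carlo estimate of discovery latency where an advertiser transmits at a fixed interval plus the standard 0-10 ms random delay, and a scanner listens for scan_window out of every scan_interval. Real BLE adds three advertising channels and the decode-related scanner gaps the paper quantifies, which this sketch ignores; all parameter values are illustrative.

```python
# Sketch: single-channel Monte Carlo of BLE neighbor-discovery latency.
# Ignores the 3-channel rotation and the scanner gaps measured in the paper.
import random

def discovery_latency(adv_interval=0.100, scan_interval=0.300,
                      scan_window=0.030, horizon=30.0):
    """Seconds until an advertising event falls inside a scan window."""
    t, phase = 0.0, random.uniform(0, scan_interval)   # random scanner phase
    while t < horizon:
        t += adv_interval + random.uniform(0, 0.010)   # advDelay: 0-10 ms
        if (t + phase) % scan_interval < scan_window:  # packet inside window?
            return t
    return float("inf")                                # not discovered in time

random.seed(0)
trials = [discovery_latency() for _ in range(10_000)]
print("mean latency:", sum(trials) / len(trials), "s")
```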
Perestrelo, Rosa; Albuquerque, Francisco; Rocha, Sílvia M; Câmara, José S
2011-01-01
Madeira wine, a fortified wine produced in Madeira Island, is a special wine among all types of wine due its specific winemaking process. The aim of this chapter is to describe important aspects of Madeira winemaking and some scientific research currently carried out in these particular kinds of wines. The first part of the chapter concerns the most important aspects of winemaking technology used in Madeira wine production. The second part, the more extensive, deals with the different groups of compounds and how these are modified during the various steps of the production process, namely the aging period. Copyright © 2011 Elsevier Inc. All rights reserved.
Intelligent manipulation technique for multi-branch robotic systems
NASA Technical Reports Server (NTRS)
Chen, Alexander Y. K.; Chen, Eugene Y. S.
1990-01-01
New analytical developments in kinematics planning are reported. The INtelligent KInematics Planner (INKIP) consists of the kinematics spline theory and an adaptive logic annealing process. Also, a novel framework for a robot learning mechanism is introduced. The FUzzy LOgic Self Organized Neural Networks (FULOSONN) framework integrates fuzzy logic for commands, control, searching, and reasoning; an embedded expert system for implementing nominal robotics knowledge; and self-organized neural networks for the dynamic evolution of knowledge. Progress on the mechanical construction of the SRA Advanced Robotic System (SRAARS) and the real-time robot vision system is also reported. A decision was made to incorporate Local Area Network (LAN) technology in the overall communication system.
Systems Analyze Water Quality in Real Time
NASA Technical Reports Server (NTRS)
2010-01-01
A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.
Metabolomics: beyond biomarkers and towards mechanisms
Johnson, Caroline H.; Ivanisevic, Julijana; Siuzdak, Gary
2017-01-01
Metabolomics, which is the profiling of metabolites in biofluids, cells and tissues, is routinely applied as a tool for biomarker discovery. Owing to innovative developments in informatics and analytical technologies, and the integration of orthogonal biological approaches, it is now possible to expand metabolomic analyses to understand the systems-level effects of metabolites. Moreover, because of the inherent sensitivity of metabolomics, subtle alterations in biological pathways can be detected to provide insight into the mechanisms that underlie various physiological conditions and aberrant processes, including diseases. PMID:26979502
Rembovskiĭ, V R; Mogilenkova, L A; Savel'eva, E I
2005-01-01
The major unit monitoring chemical weapons destruction facilities is a system of chemical analytic control over the technological process procedures and the possibility of environment and workplace pollution with toxic chemicals and their destruction products. At the same time, physical and chemical control means only incompletely meet sanitary and hygienic requirements. To provide efficient control, internationally recognized approaches should be adapted to the features of the Russian system for monitoring pollution of chemical weapons destruction facilities with toxic chemicals.
Is Analytic Information Processing a Feature of Expertise in Medicine?
ERIC Educational Resources Information Center
McLaughlin, Kevin; Rikers, Remy M.; Schmidt, Henk G.
2008-01-01
Diagnosing begins by generating an initial diagnostic hypothesis by automatic information processing. Information processing may stop here if the hypothesis is accepted, or analytical processing may be used to refine the hypothesis. This description portrays analytic processing as an optional extra in information processing, leading us to…
Selection of Sustainable Technology for VOC Abatement in an Industry: An Integrated AHP-QFD Approach
NASA Astrophysics Data System (ADS)
Gupta, Alok Kumar; Modi, Bharat A.
2018-04-01
Volatile organic compounds (VOCs) are ubiquitous atmospheric pollutants. VOCs drive photochemical reactions in the atmosphere that lead to serious harmful effects on human health and the environment. They are produced from both natural and man-made sources and may have good commercial value if they can be utilized as an alternate fuel. According to US EPA data, 15% of total VOC emissions are generated by the surface coating industry, but VOC concentration and exhaust air volume vary to a great extent depending on the processes used. Various technologies are available for VOC abatement: physical, chemical, and biological technologies can remove VOCs by either recovery or destruction, each with its own advantages and limitations. With growing environmental awareness, and considering the resource limitations of medium and small scale industries, a tool is needed for selecting a techno-economically viable solution for removing VOCs from industrial process exhaust. The aim of the present study is to give management a tool to determine the overall effect of implementing a VOC abatement technology on business performance and VOC emissions. The primary purpose of this work is to outline a methodology for rating various VOC abatement technologies against the constraints of meeting current and foreseeable regulatory requirements, operational flexibility, and overall economic parameters, including conservation of energy. In this paper an integrated approach is proposed to strategically select the most appropriate abatement technology: the analytic hierarchy process (AHP) and quality function deployment (QFD) are integrated for techno-commercial evaluation. A case study on the selection of a VOC abatement technology for a leading aluminium foil surface coating, lamination, and printing facility is presented.
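A minimal sketch of the AHP step used in such techno-commercial evaluations, assuming Saaty's standard principal-eigenvector method; the three criteria and the pairwise judgments below are hypothetical illustrations, not values from the study:

```python
import numpy as np

def ahp_priorities(pairwise: np.ndarray):
    """Return the AHP priority vector (normalized principal eigenvector)
    and the consistency ratio of a pairwise comparison matrix."""
    n = pairwise.shape[0]
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)                  # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                 # weights sum to 1
    ci = (eigvals[k].real - n) / (n - 1)         # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index
    return w, ci / ri                            # CR < 0.1 is acceptable

# Hypothetical judgments: regulatory compliance vs. flexibility vs. economics
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])
weights, cr = ahp_priorities(A)
print(weights.round(3), round(cr, 3))
```

In the full AHP-QFD scheme, weights like these would then feed the QFD relationship matrix to score each candidate abatement technology.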
Analysis of latency performance of bluetooth low energy (BLE) networks.
Cho, Keuchul; Park, Woojin; Hong, Moonki; Park, Gisu; Cho, Wooseong; Seo, Jihoon; Han, Kijun
2014-12-23
Bluetooth Low Energy (BLE) is a short-range wireless communication technology aimed at low-cost and low-power communication. The performance of classical Bluetooth device discovery has been intensively studied using analytical modeling and simulative methods, but these techniques are not applicable to BLE, since BLE fundamentally changes the design of the discovery mechanism, including the use of three advertising channels. Several recent works have analyzed BLE device discovery, but these studies are still far from thorough. It is thus necessary to develop a new, accurate model for the BLE discovery process. In particular, the wide range of parameter settings gives BLE devices great latitude to customize their discovery performance. This motivates our study: modeling the BLE discovery process and performing intensive simulation. This paper focuses on building an analytical model to investigate the discovery probability, as well as the expected discovery latency, which are then validated via extensive experiments. Our analysis considers both continuous and discontinuous scanning modes. We analyze the sensitivity of these performance metrics to parameter settings to quantitatively examine to what extent the parameters influence the performance of the discovery process.
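A minimal Monte Carlo sketch of the discovery process that such models formalize, assuming one advertiser cycling channels 37-39 and one scanner listening on a rotating channel; every timing value below (intervals, windows, PDU duration) is an illustrative assumption, not a parameter set from the paper:

```python
import random

def discovery_latency(adv_interval=1.28, scan_interval=2.56, scan_window=1.28,
                      adv_delay_max=0.010, pdu=0.0004, horizon=100.0):
    """Time (s) until the scanner first receives an advertising PDU,
    or None if the advertiser is not discovered within the horizon."""
    t = 0.0
    while t < horizon:
        for ch_off, ch in enumerate((37, 38, 39)):       # one PDU per channel
            t_pdu = t + ch_off * 0.0005                  # small inter-PDU gap
            k = int(t_pdu // scan_interval)              # current scan event
            in_window = (t_pdu % scan_interval) + pdu <= scan_window
            if in_window and 37 + k % 3 == ch:           # channels must match
                return t_pdu
        t += adv_interval + random.uniform(0.0, adv_delay_max)  # advDelay
    return None

lats = [discovery_latency() for _ in range(1000)]
found = [x for x in lats if x is not None]
print(f"mean latency {sum(found)/len(found):.2f} s, "
      f"discovery ratio {len(found)/len(lats):.1%}")
```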
ERIC Educational Resources Information Center
Chonkaew, Patcharee; Sukhummek, Boonnak; Faikhamta, Chatree
2016-01-01
The purpose of this study was to investigate the analytical thinking abilities and attitudes towards science learning of grade-11 students through science, technology, engineering, and mathematics (STEM) education integrated with a problem-based learning in the study of stoichiometry. The research tools consisted of a pre- and post-analytical…
CAA Annual Report Fiscal Year 1998.
1998-12-01
Contents include: Studies; Quick Reaction Analyses and Projects; Technology Research and Analysis Support (Technology Research; Methodology Research); Publications, Graphics, and Reproduction; Analytical Efforts Completed Between FY90 and FY98; and Appendix A, Annual Study and Work Evaluation. Chapter 2 highlights the major studies and analysis activities that occurred in FY98. Chapter 3 is the total package of analytical summaries.
Wu, Ching-Sung; Hu, Kuang-Hua; Chen, Fu-Hsiang
2016-01-01
The development of the high-tech industry has prospered around the world in past decades, and technology and finance have become the most significant issues of the information era. While high-tech firms are a major force behind a country's economic development, their development process requires a great deal of money and faces potential financing difficulties; how to evaluate and establish an appropriate innovation strategy for technology and financial services platforms has therefore become one of the most critical and difficult issues. Moreover, how the intertwined financial environment can be optimized so that the financing problems of high-tech firms can be resolved has seldom been addressed. Thus, this research aims to establish an improvement model for technology and financial services platform innovation strategy, based on a hybrid MADM model that identifies the main causal factors and amends priorities in order to strengthen ongoing planning. A DEMATEL technique based on the Analytic Network Process, together with a modified VIKOR method, is proposed for selecting and re-configuring the aspired technology and financial services platform. An empirical study based on China's technology and financial services platform innovation strategy is provided to verify the effectiveness of the proposed methodology. Based on expert interviews, technology and financial services platform innovation strategy improvement should be made in the following order: credit guarantee platform (C), then credit rating platform (B), then investment and finance platform (A).
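A minimal sketch of the DEMATEL step named in the abstract above, using the standard normalization and total-relation formula T = N(I - N)^(-1); the 3x3 direct-influence scores are hypothetical, not the study's expert ratings:

```python
import numpy as np

# Hypothetical direct-influence matrix among the three platforms
D = np.array([[0, 3, 2],      # A: investment and finance platform
              [1, 0, 3],      # B: credit rating platform
              [4, 1, 0]])     # C: credit guarantee platform

N = D / max(D.sum(axis=1).max(), D.sum(axis=0).max())   # normalized matrix
T = N @ np.linalg.inv(np.eye(3) - N)                    # total-relation matrix
r, c = T.sum(axis=1), T.sum(axis=0)                     # dispatch / receive
for name, prom, rel in zip(("A", "B", "C"), r + c, r - c):
    print(f"{name}: prominence={prom:.2f}, relation={rel:+.2f}")  # >0: net cause
```

Prominence (r + c) and relation (r - c) values like these are what feed the influential network relation map and, combined with modified VIKOR, the improvement ordering reported in the abstract.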
Batch Statistical Process Monitoring Approach to a Cocrystallization Process.
Sarraguça, Mafalda C; Ribeiro, Paulo R S; Dos Santos, Adenilson O; Lopes, João A
2015-12-01
Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature and held together by noncovalent bonds. Their main advantages are increased solubility, bioavailability, permeability, and stability, while the bioactivity of the active pharmaceutical ingredient is retained. The cocrystallization of furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts that follow the process trajectory and define control limits. Batches under normal and non-normal operating conditions were performed and monitored with NIRS. The use of NIRS together with batch statistical process models allowed the detection of abnormal variations in critical process parameters, such as the amount of solvent or the amount of the initial components present in the cocrystallization. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.
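A minimal sketch of this kind of monitoring scheme, assuming PCA on mean-centered spectra from normal-operating-condition (NOC) batches and an empirical Hotelling T^2 control limit; the "spectra" below are synthetic stand-ins, not the study's NIR data:

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.linalg.qr(rng.normal(size=(200, 3)))[0]          # 3 latent loadings
noc = (rng.normal(size=(50, 3)) * [5.0, 3.0, 2.0]) @ P.T \
      + rng.normal(scale=0.1, size=(50, 200))           # 50 NOC "spectra"

mu = noc.mean(axis=0)
U, S, Vt = np.linalg.svd(noc - mu, full_matrices=False)  # PCA via SVD
k = 3                                                    # retained components
var = S[:k] ** 2 / (len(noc) - 1)                        # score variances

def hotelling_t2(spectrum):
    scores = (spectrum - mu) @ Vt[:k].T
    return float(np.sum(scores ** 2 / var))

limit = np.percentile([hotelling_t2(s) for s in noc], 99)  # empirical limit
fault = 40.0 * P[:, 0] + rng.normal(scale=0.1, size=200)   # abnormal "batch"
print(hotelling_t2(fault) > limit)                       # True -> out of control
```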
The European Water Framework Directive: Challenges For A New Type of Social and Policy Analysis
NASA Astrophysics Data System (ADS)
Pahl-Wostl, C.
Water resources management is facing increasing uncertainties in all areas. Socio-economic boundary conditions change quickly and require more flexible management strategies. Climate change, for example, results in an increase in uncertainties, in particular regarding extreme events. Given that current management practices deal with extreme events by designing technical systems to manage the most extreme of all cases (e.g., higher dams for protection against extreme floods, larger water reservoirs for droughts and to meet daily peak demand), a serious problem is posed for long-term planning and risk management. Engineering planning has perceived the human dimension as an exogenous boundary condition. Legislation focused largely on the environmental and technological dimensions that set limits and prescribe new technologies, without taking the importance of institutional change into account. However, technology is only the "hardware"; it is becoming increasingly obvious that the "software", the social dimension, has to become part of planning and management processes. Hence, the inclusion of the human dimension in integrated models and processes will be valuable in supporting the introduction of new elements into planning processes in water resources management. With the European Water Framework Directive, environmental policy enters a new era. The traditional approach of solving isolated environmental problems with technological fixes and end-of-pipe solutions has started to shift towards a more thoughtful attitude that involves the development of integrated approaches to problem solving. The WFD introduces the river basin as the management unit, following the experience of some European countries (e.g., France) and the example of the management of some international rivers (e.g., the Rhine). Overall, the WFD represents a general shift towards a polycentric understanding of policy making that requires the involvement of stakeholders as active participants in the policy process at different levels of societal organization. The WFD requires the inclusion of stakeholders in the process of developing and adopting a river basin management plan. In order to improve stakeholder-based policy design and modeling processes, innovation and research are required in linking analytical methods and participatory approaches. Factual knowledge and analytical techniques have to be combined with local knowledge and the subjective perceptions of the various stakeholder groups. The talk will summarize current approaches and point out research needs.
Afonso-Olivares, Cristina; Montesdeoca-Esponda, Sarah; Sosa-Ferrera, Zoraida; Santana-Rodríguez, José Juan
2016-12-01
Today, the presence of contaminants in the environment is a topic of interest for society in general and for the scientific community in particular. A very large amount of different chemical substances reaches the environment after passing through wastewater treatment plants without being eliminated. This is due to the inefficiency of conventional removal processes and the lack of government regulations. The list of compounds entering treatment plants is gradually becoming longer and more varied because most of these compounds come from pharmaceuticals, hormones or personal care products, which are increasingly used by modern society. As a result of this increase in compound variety, to address these emerging pollutants, the development of new and more efficient removal technologies is needed. Different advanced oxidation processes (AOPs), especially photochemical AOPs, have been proposed as supplements to traditional treatments for the elimination of pollutants, showing significant advantages over the use of conventional methods alone. This work aims to review the analytical methodologies employed for the analysis of pharmaceutical compounds from wastewater in studies in which advanced oxidation processes are applied. Due to the low concentrations of these substances in wastewater, mass spectrometry detectors are usually chosen to meet the low detection limits and identification power required. Specifically, time-of-flight detectors are required to analyse the by-products.
Big data, big knowledge: big data for personalized healthcare.
Viceconti, Marco; Hunter, Peter; Hose, Rod
2015-07-01
The idea that the purely phenomenological knowledge that we can extract by analyzing large amounts of data can be useful in healthcare seems to contradict the desire of VPH researchers to build detailed mechanistic models for individual patients. But in practice no model is ever entirely phenomenological or entirely mechanistic. We propose in this position paper that big data analytics can be successfully combined with VPH technologies to produce robust and effective in silico medicine solutions. In order to do this, big data technologies must be further developed to cope with some specific requirements that emerge from this application. Such requirements are: working with sensitive data; analytics of complex and heterogeneous data spaces, including nontextual information; distributed data management under security and performance constraints; specialized analytics to integrate bioinformatics and systems biology information with clinical observations at tissue, organ and organism scales; and specialized analytics to define the "physiological envelope" during the daily life of each patient. These domain-specific requirements suggest a need for targeted funding, in which big data technologies for in silico medicine become the research priority.
Model-Based Extracted Water Desalination System for Carbon Sequestration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dees, Elizabeth M.; Moore, David Roger; Li, Li
Over the last 1.5 years, GE Global Research and Pennsylvania State University defined a model-based, scalable, and multi-stage extracted water desalination system that yields clean water, concentrated brine, and, optionally, salt. The team explored saline brines that ranged across the expected range for extracted water from carbon sequestration reservoirs (40,000 up to 220,000 ppm total dissolved solids, TDS). In addition, the team validated the system performance at pilot scale with field-sourced water using GE's pre-pilot and lab facilities. This project encompassed four principal tasks, in addition to project management and planning: 1) identify a deep saline formation carbon sequestration site and a partner that are suitable for supplying extracted water; 2) conduct a techno-economic assessment and down-selection of pre-treatment and desalination technologies to identify a cost-effective system for extracted water recovery; 3) validate the down-selected processes at the lab/pre-pilot scale; and 4) define the scope of the pilot desalination project. Highlights from each task are described below. Deep saline formation characterization: The deep saline formations associated with the five DOE NETL 1260 Phase 1 projects were characterized with respect to their mineralogy and formation water composition. Sources of high-TDS feed water other than extracted water were explored for high-TDS desalination applications, including unconventional oil and gas and seawater reverse osmosis concentrate. Techno-economic analysis of desalination technologies: Techno-economic evaluations of alternate brine concentration technologies, including humidification-dehumidification (HDH), membrane distillation (MD), forward osmosis (FO), turboexpander-freeze, solvent extraction, and high-pressure reverse osmosis (HPRO), were conducted. These technologies were evaluated against conventional falling film-mechanical vapor recompression (FF-MVR) as a baseline desalination process. Furthermore, a quality function deployment (QFD) method was used to compare alternate high-TDS desalination technologies to FF-MVR. High-pressure reverse osmosis was found to be a promising alternative desalination technology. A deep-dive techno-economic analysis of HPRO was performed, including Capex and Opex estimates, for seawater RO (SWRO). Two additional cases were explored: 1) a comparison of an SWRO plus HPRO system to the option of doubling the size of a standard seawater RO system to achieve the same total pure water recovery rate; and 2) a flue gas desulfurization wastewater treatment zero-liquid discharge (ZLD) application, where preconcentration with RO (SWRO or SWRO + HPRO) before evaporation and crystallization was compared to FF-MVR and crystallization technologies without RO preconcentration. Pre-pilot process validation: Pre-pilot-scale tests were conducted using field production water to validate key process steps for extracted water pretreatment. Approximately 5,000 gallons of field-produced water was processed through microfiltration, ultrafiltration, and steam-regenerable sorbent operations. Smaller quantities were processed through microclarification. In addition, analytical methods (purge-and-trap gas chromatography and Hach TOC analytical methods) were validated. Lab-scale HPRO elements were constructed and tested at high pressures to identify and mitigate technical risks of the technology.
Lastly, improvements in RO membrane materials were identified as the necessary next step to achieve further improvement in element performance at high pressure. Scope of field pilot: A field pilot for extracted water pretreatment was designed.
Bolam, Bruce; McLean, Carl; Pennington, Andrew; Gillies, Pamela
2006-03-01
The present article presents an exploratory qualitative process evaluation study of 'Ambassador' participation in CityNet, an innovative information and communication technology-based (ICT) project that aims to build aspects of social capital and improve access to information and services among disadvantaged groups in Nottingham, UK. A purposive sample of 40 'Ambassador' interviewees was gathered in three waves of data collection. The two emergent analytic themes highlighted how improvements in confidence, self-esteem and social networks produced via participation were mitigated by structural problems in devolving power within the project. This illustrates how concepts of power are important for understanding the process of health promotion interventions using new media.
The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies
NASA Technical Reports Server (NTRS)
Mulqueen, Jack; Jones, David; Hopkins, Randy
2011-01-01
This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies, including human space exploration missions, space transportation system studies, and in-space science missions. The paper describes the design team structure and the specialized analytical tools that have been developed to enable a unique rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition, and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process is demonstrated. The study definition process flow for each study discipline is outlined, beginning with the study planning process, followed by the definition of ground rules and assumptions, the definition of study trades, mission analysis, and subsystem analyses, leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives, from technology definition and requirements definition to preliminary design studies, is addressed. The paper also describes the applicability of the collaborative engineering process to an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.
Kamel Boulos, Maged N; Viangteeravat, Teeradache; Anyanwu, Matthew N; Ra Nagisetty, Venkateswara; Kuscu, Emin
2011-03-16
The goal of visual analytics is to facilitate the discourse between the user and the data by providing dynamic displays and versatile visual interaction opportunities with the data that can support analytical reasoning and the exploration of data from multiple user-customisable aspects. This paper introduces geospatial visual analytics, a specialised subtype of visual analytics, and provides pointers to a number of learning resources about the subject, as well as some examples of human health, surveillance, emergency management and epidemiology-related geospatial visual analytics applications and examples of free software tools that readers can experiment with, such as Google Public Data Explorer. The authors also present a practical demonstration of geospatial visual analytics using partial data for 35 countries from a publicly available World Health Organization (WHO) mortality dataset and Microsoft Live Labs Pivot technology, a free, general purpose visual analytics tool that offers a fresh way to visually browse and arrange massive amounts of data and images online and also supports geographic and temporal classifications of datasets featuring geospatial and temporal components. Interested readers can download a Zip archive (included with the manuscript as an additional file) containing all files, modules and library functions used to deploy the WHO mortality data Pivot collection described in this paper.
Designing a Marketing Analytics Course for the Digital Age
ERIC Educational Resources Information Center
Liu, Xia; Burns, Alvin C.
2018-01-01
Marketing analytics is receiving great attention because of evolving technology and the radical changes in the marketing environment. This study aims to assist the design and implementation of a marketing analytics course. We assembled a rich data set from four sources: business executives, 400 employers' job postings, one million tweets about…
Sensors for spacecraft cabin environment monitoring
NASA Astrophysics Data System (ADS)
Ramsden, J. J.; Sharkan, Y. P.; Zhitov, N. B.; Korposh, S. O.
2007-10-01
In manned spaceflight it is essential to ensure that key variables, including the concentrations of oxygen, carbon dioxide, water vapour and volatile organic contaminants, are maintained within acceptable limits. Furthermore, the purity of drinking water, etc. must be assured at all times, and for lengthy voyages the proliferation of bacteria and other microorganisms may need to be monitored. Here we present a platform approach to these problems based on multiplexed optical fibres sensitized to the different analytes by coating them with thin-film capture layers of bionanomaterial composites. Both amplitude and interference measurement modes are described, as well as a photoactivated amplitude measurement mode offering further sensitivity enhancement. A great and novel advantage is that the same technology, and hence the same data processing and diagnostics procedures, can be used over a vast range of analytes in both gaseous and liquid media.
Laser ablation in analytical chemistry - A review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Russo, Richard E.; Mao, Xianglei; Liu, Haichen
Laser ablation is becoming a dominant technology for direct solid sampling in analytical chemistry. Laser ablation refers to the process in which an intense burst of energy delivered by a short laser pulse is used to sample (remove a portion of) a material. The advantages of laser ablation chemical analysis include direct characterization of solids, no chemical procedures for dissolution, reduced risk of contamination or sample loss, analysis of very small samples not separable for solution analysis, and determination of spatial distributions of elemental composition. This review describes recent research to understand and utilize laser ablation for direct solid sampling, with emphasis on sample introduction to an inductively coupled plasma (ICP). Current research related to contemporary experimental systems, calibration and optimization, and fractionation is discussed, with a summary of applications in several areas.
Prieto-Ballesteros, Olga; Martínez-Frías, Jesús; Schutt, John; Sutter, Brad; Heldmann, Jennifer L; Bell, Mary Sue; Battler, Melissa; Cannon, Howard; Gómez-Elvira, Javier; Stoker, Carol R
2008-10-01
The 2005 Mars Astrobiology Research and Technology Experiment (MARTE) project conducted a simulated 1-month Mars drilling mission in the Río Tinto district, Spain. Dry robotic drilling, core sampling, and biological and geological analytical technologies were collectively tested for the first time for potential use on Mars. Drilling and subsurface sampling and analytical technologies are being explored for Mars because the subsurface is the most likely place to find life on Mars. The objectives of this work are to describe drilling, sampling, and analytical procedures; present the geological analysis of core and borehole material; and examine lessons learned from the drilling simulation. Drilling occurred at an undisclosed location, causing the science team to rely only on mission data for geological and biological interpretations. Core and borehole imaging was used for micromorphological analysis of rock, targeting rock for biological analysis, and making decisions regarding the next day's drilling operations. Drilling reached 606 cm depth into poorly consolidated gossan that allowed only 35% of core recovery and contributed to borehole wall failure during drilling. Core material containing any indication of biology was sampled and analyzed in more detail for its confirmation. Despite the poorly consolidated nature of the subsurface gossan, dry drilling was able to retrieve useful core material for geological and biological analysis. Lessons learned from this drilling simulation can guide the development of dry drilling and subsurface geological and biological analytical technologies for future Mars drilling missions.
Kitsikopoulos, Harry
2013-09-01
This essay provides an analytical account of the history of various steam devices by tracing the key technological and scientific developments culminating in the Savery and Newcomen models. It begins in antiquity with the writings of Hero of Alexandria, which were rediscovered and translated in Italy fourteen centuries later, followed by the construction of simple steam devices. The most decisive development comes in the middle of the seventeenth century with the overturning, through the experimental work of Torricelli, Pascal, and Guericke, of the Aristotelian dogma that no vacuum exists. The final stretch of this discovery process amounted to an Anglo-French race, with English inventors being more successful in the end.
Hu, Ruofei; Cancela, Jorge; Arredondo Waldmeyer, Maria Teresa; Cea, Gloria; Vlachopapadopoulou, Elpis-Athina; Fotiadis, Dimitrios I; Fico, Giuseppe
2016-01-01
Childhood obesity is becoming one of the 21st century's most important public health problems. Nowadays, the main treatment of childhood obesity is behavioral intervention that aims to improve children's lifestyles in order to arrest the disease. Information and communication technologies (ICTs) have not been widely employed in this intervention, and most existing ICT systems do not achieve a long-term effect. The purpose of this paper is to define a system to support family-based intervention: first through a state-of-the-art analysis of family-based interventions and related technological solutions, then by using the analytic hierarchy process to derive a childhood obesity family-based behavior intervention model, and finally by providing a prototype of a system called OB CITY. The system makes use of applied behavior analysis, affective computing technologies, as well as serious game and gamification techniques, to offer long-term services in all care dimensions of the family-based behavioral intervention, aiming to provide positive effects in the treatment of childhood obesity.
Technological advances in real-time tracking of cell death
Skommer, Joanna; Darzynkiewicz, Zbigniew; Wlodkowic, Donald
2010-01-01
A cell population can be viewed as a quantum system which, like Schrödinger's cat, exists as a combination of survival-allowing and death-allowing states. Tracking and understanding cell-to-cell variability in processes of high spatio-temporal complexity, such as cell death, is at the core of current systems biology approaches. As probabilistic modeling tools attempt to impute information inaccessible to current experimental approaches, advances in technologies for single-cell imaging and omics (proteomics, genomics, metabolomics) should go hand in hand with the computational efforts. Over the last few years, exciting technological advances have been made that allow cell death to be studied dynamically, in real time and with unprecedented accuracy. These approaches are based on innovative fluorescent assays and recombinant proteins, on the bioelectrical properties of cells, and more recently also on state-of-the-art optical spectroscopy. Here, we review the current status of the most innovative analytical technologies for dynamic tracking of cell death, and address the interdisciplinary promises and future challenges of these methods. PMID:20519963
Review of spectral imaging technology in biomedical engineering: achievements and challenges.
Li, Qingli; He, Xiaofu; Wang, Yiting; Liu, Hongying; Xu, Dongrong; Guo, Fangmin
2013-10-01
Spectral imaging is a technology that integrates conventional imaging and spectroscopy to get both spatial and spectral information from an object. Although this technology was originally developed for remote sensing, it has been extended to the biomedical engineering field as a powerful analytical tool for biological and biomedical research. This review introduces the basics of spectral imaging, imaging methods, current equipment, and recent advances in biomedical applications. The performance and analytical capabilities of spectral imaging systems for biological and biomedical imaging are discussed. In particular, the current achievements and limitations of this technology in biomedical engineering are presented. The benefits and development trends of biomedical spectral imaging are highlighted to provide the reader with an insight into the current technological advances and its potential for biomedical research.
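The review above does not name specific algorithms; as one representative analytical step such systems rely on, here is a minimal linear spectral unmixing sketch, assuming known endmember spectra and ordinary least squares. The 64-band endmembers and the pixel spectrum are synthetic placeholders:

```python
import numpy as np

rng = np.random.default_rng(1)
E = rng.random((64, 3))                          # 64 bands, 3 endmember spectra
abund_true = np.array([0.6, 0.3, 0.1])           # true mixing fractions
pixel = E @ abund_true + rng.normal(scale=0.01, size=64)  # noisy mixed pixel

# Least-squares unmixing: recover abundances from the measured spectrum
abund, *_ = np.linalg.lstsq(E, pixel, rcond=None)
print(abund.round(2))                            # ~[0.6, 0.3, 0.1]
```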
Miller, Tyler M; Geraci, Lisa
2016-05-01
People may change their memory predictions after retrieval practice using naïve theories of memory and/or using subjective experience, i.e., analytic and non-analytic processes respectively. The current studies disentangled the contributions of each process. In one condition, learners studied paired associates, made a memory prediction, completed a short run of retrieval practice, and made a second prediction. In another condition, judges read about a yoked learner's retrieval practice performance but did not participate in retrieval practice and therefore could not use non-analytic processes for the second prediction. In Study 1, learners reduced their predictions following moderately difficult retrieval practice, whereas judges increased their predictions. In Study 2, learners made lower adjusted predictions than judges following both easy and difficult retrieval practice. In Study 3, judge-like participants used analytic processes to report adjusted predictions. Overall, the results suggest that non-analytic processes play a key role in participants reducing their predictions after retrieval practice. Copyright © 2016 Elsevier Inc. All rights reserved.
Evangelopoulos, Angelos A; Dalamaga, Maria; Panoutsopoulos, Konstantinos; Dima, Kleanthi
2013-01-01
In the early 80s, the word automation was used in the clinical laboratory setting to refer only to analyzers. But from the late 80s onwards, automation found its way into all aspects of the diagnostic process, embracing not only the analytical but also the pre- and post-analytical phases. While laboratories in the eastern world, mainly Japan, paved the way for laboratory automation, US and European laboratories soon realized the benefits and were quick to follow. Clearly, automation and robotics will be a key survival tool in a very competitive and cost-conscious healthcare market. What sets automation technology apart from so many other efficiency solutions is the dramatic savings that it brings to the clinical laboratory. Further standardization will assure the success of this revolutionary new technology. One of the main difficulties laboratory managers and personnel must deal with when studying solutions for reengineering a laboratory is familiarizing themselves with the multidisciplinary and technical terminology of this new and exciting field. The present review/glossary aims to give an overview of the most frequently used terms within the scope of laboratory automation and to put laboratory automation on a sounder linguistic basis.
Analytics-Driven Lossless Data Compression for Rapid In-situ Indexing, Storing, and Querying
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenkins, John; Arkatkar, Isha; Lakshminarasimhan, Sriram
2013-01-01
The analysis of scientific simulations is highly data-intensive and is becoming an increasingly important challenge. Peta-scale data sets require the use of light-weight query-driven analysis methods, as opposed to heavy-weight schemes that optimize for speed at the expense of size. This paper is an attempt in the direction of query processing over losslessly compressed scientific data. We propose a co-designed double-precision compression and indexing methodology for range queries by performing unique-value-based binning on the most significant bytes of double precision data (sign, exponent, and most significant mantissa bits), and inverting the resulting metadata to produce an inverted index over a reduced data representation. Without the inverted index, our method matches or improves compression ratios over both general-purpose and floating-point compression utilities. The inverted index is light-weight, and the overall storage requirement for both reduced column and index is less than 135%, whereas existing DBMS technologies can require 200-400%. As a proof-of-concept, we evaluate univariate range queries that additionally return column values, a critical component of data analytics, against state-of-the-art bitmap indexing technology, showing multi-fold query performance improvements.
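A minimal sketch of the binning-plus-inverted-index idea described above, assuming a 16-bit key taken from the high bytes of each double and restricting to non-negative values (for which the IEEE 754 bit pattern is monotone in the value); the key width and the data are illustrative choices, not the paper's exact layout:

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(2)
data = rng.random(10_000)                       # non-negative float64 column
keys = data.view(np.uint64) >> 48               # sign + exponent + top mantissa bits

index = defaultdict(list)                       # inverted index: bin key -> rows
for row, k in enumerate(keys):
    index[int(k)].append(row)

def bin_key(x):
    return int(np.array([x], dtype=np.float64).view(np.uint64)[0] >> 48)

def range_query(lo, hi):
    """For non-negative doubles the key is monotone in the value, so only bins
    between bin_key(lo) and bin_key(hi) can hold matches; candidate rows are
    then verified against the raw column values."""
    hits = []
    for k in range(bin_key(lo), bin_key(hi) + 1):
        hits.extend(r for r in index.get(k, ()) if lo <= data[r] <= hi)
    return hits

print(len(range_query(0.25, 0.5)))              # ~2,500 of 10,000 rows
```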
NASA Astrophysics Data System (ADS)
Hong, ZHAO; Chengwu, YI; Rongjie, YI; Huijuan, WANG; Lanlan, YIN; I, N. MUHAMMAD; Zhongfei, MA
2018-03-01
The degradation mechanism of dimethyl phthalate (DMP) in drinking water was investigated using strong ionization discharge technology in this study. Under the optimized conditions, the degradation efficiency of DMP in drinking water reached 93% in 60 min. A series of analytical techniques, including high-performance liquid chromatography, liquid chromatography mass spectrometry, a total organic carbon analyzer and ultraviolet-visible spectroscopy, were used in the study. It was found that the concentration of ozone (O3) produced by the dielectric barrier discharge reactor reached up to 74.4 mg l-1 within 60 min. Tert-butanol, isopropyl alcohol, carbonate ions (CO3^2-) and bicarbonate ions (HCO3^-) were added to the sample solution to indirectly prove the presence and effect of hydroxyl radicals (·OH). The analytical findings indicate that mono-methyl phthalate, phthalic acid (PA) and methyl ester PA were detected as the major intermediates in the process of DMP degradation. Finally, DMP and all products were ultimately mineralized into carbon dioxide (CO2) and water (H2O). Based on these results, a degradation pathway of DMP by strong ionization discharge technology was proposed.
NASA Astrophysics Data System (ADS)
Bogdanov, Valery L.; Boyce-Jacino, Michael
1999-05-01
Confined arrays of biochemical probes deposited on a solid support surface (an analytical microarray or 'chip') provide an opportunity to analyze multiple reactions simultaneously. Microarrays are increasingly used in genetics, medicine and environmental scanning as research and analytical instruments. The power of microarray technology comes from its parallelism, which grows with array miniaturization, minimization of the reagent volume per reaction site, and reaction multiplexing. An optical detector of microarray signals should combine high sensitivity with spatial and spectral resolution; additionally, low cost and a high processing rate are needed to transfer microarray technology into biomedical practice. We designed an imager that provides confocal, complete-spectrum detection of an entire fluorescently labeled microarray in parallel. The imager uses a microlens array, a non-slit spectral decomposer, and a high-sensitivity detector (cooled CCD). Two imaging channels provide simultaneous detection of the localization, integrated intensity and spectral intensity for each reaction site in the microarray. Dimensional matching between the microarray and the imager's optics eliminates all moving parts from the instrumentation, enabling highly informative, fast and low-cost microarray detection. We report the theory of confocal hyperspectral imaging with a microlens array and experimental data for the implementation of the developed imager to detect fluorescently labeled microarrays with a density of approximately 10^3 sites per cm^2.
Workplace Skills Taught in a Simulated Analytical Department
NASA Astrophysics Data System (ADS)
Sonchik Marine, Susan
2001-11-01
Integration of workplace skills into the academic setting is paramount for any chemical technology program. In addition to the expected chemistry content, courses must build proficiency in oral and written communication skills, computer skills, laboratory safety, and logical troubleshooting. Miami University's Chemical Technology II course is set up as a contract analytical laboratory. Students apply the advanced sampling techniques, quality assurance, standard methods, and statistical analyses they have studied. For further integration of workplace skills, weekly "department meetings" are held where the students, as members of the department, report on their work in process, present completed projects, and share what they have learned and what problems they have encountered. Information is shared between the experienced members of the department and those encountering problems or starting a new project. The instructor, as department manager, makes announcements, reviews company and department status, and assigns work for the coming week. The department members report results to clients in formal reports or in short memos. Factors affecting the success of the "department meeting" approach include the formality of the meeting room, the use of an official agenda, the frequency, time, and duration of the meeting, and the accountability of the students.
Bernard, Elyse D; Nguyen, Kathy C; DeRosa, Maria C; Tayabali, Azam F; Aranda-Rodriguez, Rocio
2017-01-01
Aptamers are short oligonucleotide sequences used in detection systems because of their high-affinity binding to a variety of macromolecules. With the introduction of aptamers over 25 years ago came the exploration of their use in many different applications as a substitute for antibodies. Aptamers have several advantages; they are easy to synthesize, can bind to analytes for which it is difficult to obtain antibodies, and in some cases bind better than antibodies. As such, aptamer applications have significantly expanded as an adjunct to a variety of different immunoassay designs. The Multiple-Analyte Profiling (xMAP) technology developed by Luminex Corporation commonly uses antibodies for the detection of analytes in small sample volumes through the use of fluorescently coded microbeads. This technology permits the simultaneous detection of multiple analytes in each sample tested and hence could be applied in many research fields. Although little work has been performed adapting this technology for use with aptamers, optimizing aptamer-based xMAP assays would dramatically increase the versatility of analyte detection. We report herein on the development of an xMAP bead-based aptamer/antibody sandwich assay for a biomarker of inflammation (C-reactive protein, or CRP). Protocols for the coupling of aptamers to xMAP beads, validation of the coupling, and an aptamer/antibody sandwich-type assay for CRP are detailed. The optimized conditions, protocols and findings described in this research could serve as a starting point for the development of new aptamer-based xMAP assays.
Oxide nanomaterials: synthetic developments, mechanistic studies, and technological innovations.
Patzke, Greta R; Zhou, Ying; Kontic, Roman; Conrad, Franziska
2011-01-24
Oxide nanomaterials are indispensable for nanotechnological innovations because they combine an infinite variety of structural motifs and properties with manifold morphological features. Given that new oxide materials are reported almost daily, considerable synthetic and technological work remains to be done to fully exploit this ever-increasing family of compounds for innovative nano-applications. This calls for reliable and scalable preparative approaches to oxide nanomaterials, whose development remains a challenge for many complex nanostructured oxides. Oxide nanomaterials with special physicochemical features and unusual morphologies are still difficult to access by classic synthetic pathways. The limitless options for creating nano-oxide building blocks open up new technological perspectives with the potential to revolutionize areas ranging from data processing to biocatalysis. Oxide nanotechnology of the 21st century thus needs a strong interplay of preparative creativity, analytical skills, and new ideas for synergistic implementations. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Krujatz, Felix; Lode, Anja; Seidel, Julia; Bley, Thomas; Gelinsky, Michael; Steingroewer, Juliane
2017-10-25
The diversity and complexity of biotechnological applications are constantly increasing, with ever expanding ranges of production hosts, cultivation conditions and measurement tasks. Consequently, many analytical and cultivation systems for biotechnology and bioprocess engineering, such as microfluidic devices or bioreactors, are tailor-made to precisely satisfy the requirements of specific measurements or cultivation tasks. Additive manufacturing (AM) technologies offer the possibility of fabricating tailor-made 3D laboratory equipment directly from CAD designs with previously inaccessible levels of freedom in terms of structural complexity. This review discusses the historical background of these technologies, their most promising current implementations and the associated workflows, fabrication processes and material specifications, together with some of the major challenges associated with using AM in biotechnology/bioprocess engineering. To illustrate the great potential of AM, selected examples in microfluidic devices, 3D-bioprinting/biofabrication and bioprocess engineering are highlighted. Copyright © 2017 Elsevier B.V. All rights reserved.
Aptamer-based technology for food analysis.
Liu, Xiaofei; Zhang, Xuewu
2015-01-01
Aptamers are short, functional single-stranded oligonucleotide sequences selected through the systematic evolution of ligands by exponential enrichment (SELEX) process, which have the capacity to recognize various classes of target molecules with high affinity and specificity. Analytical aptamers acquired by SELEX are widely used in many research fields, such as medicine, biology, and chemistry. However, the application of this innovative and emerging technology to food safety is still in its infancy. Food safety plays a very important role in our daily lives because a variety of poisonous and harmful substances in food affect human health. The aptamer technique is promising because it can overcome many disadvantages of existing detection methods in food safety, such as long detection times, low sensitivity, and difficult, expensive antibody preparation. This review provides an overview of various aptamer screening technologies, summarizes recent applications of aptamers in food safety, and discusses future prospects.
Tenório, Marge; Mello, Guilherme Arantes; Viana, Ana Luiza D'Ávila
2017-05-01
The purpose of this article is to highlight a number of underlying issues that may be useful for a comprehensive review of the management of Health-Related Science, Technology and Innovation policies (ST&I/H), and of their strategies and priorities. It is an analytical study supported by an extensive review of the technical and journalistic literature, clippings, legislation and federal government directives. The results show that the Healthcare Productive Complex undeniably and increasingly needs science to maintain itself. One may infer that a framework of institutional milestones is being built in Brazil to strengthen, guide and encourage research and development, and that clinical research creates scientific knowledge to address public healthcare issues by generating new inputs or enhancing existing techniques, processes and technologies that will be produced, marketed and used in the different segments, thus feeding the Healthcare Productive Complex.
Aspects concerning verification methods and rigidity increment of complex technological systems
NASA Astrophysics Data System (ADS)
Casian, M.
2016-11-01
Any technological process aims at a high-quality, precise product, which is almost impossible to achieve without high-rigidity machine tools, equipment and components. Therefore, from the design phase, it is very important to create structures and machines with high stiffness characteristics. At the same time, increasing the stiffness should not raise the material costs. Searching for this midpoint between high rigidity and minimum expense leads to investigating and checking structural components through various, sometimes quite advanced, methods and techniques. In order to highlight some aspects concerning the significance of mechanical equipment rigidity, the finite element method and an analytical method based on the use of Mathcad software were applied to a subassembly of a grinding machine. Graphical representations were elaborated, offering a more complete picture of the stresses and deformations able to affect the considered mechanical subassembly.
ERIC Educational Resources Information Center
Santavenere, Alex
An action research study was undertaken to examine the effects of educational technology resources on critical thinking and analytical skills. The researcher observed 3 different 11th grade classes, a total of 75 students, over a week as they worked in the school's computer lab. Each class was composed of 25 to 30 students, all of whom were…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wantuck, P. J.; Hollen, R. M.
2002-01-01
This paper provides an overview of some design and automation-related projects ongoing within the Applied Engineering Technologies (AET) Group at Los Alamos National Laboratory. AET uses a diverse set of technical capabilities to develop and apply processes and technologies to applications for a variety of customers both internal and external to the Laboratory. The Advanced Recovery and Integrated Extraction System (ARIES) represents a new paradigm for the processing of nuclear material from retired weapon systems in an environment that seeks to minimize the radiation dose to workers. To achieve this goal, ARIES relies upon automation-based features to handle and process the nuclear material. Our Chemical Process Development Team specializes in fuzzy logic and intelligent control systems. Neural network technology has been utilized in some advanced control systems developed by team members. Genetic algorithms and neural networks have often been applied for data analysis. Enterprise modeling, or discrete event simulation, as well as chemical process simulation has been employed for chemical process plant design. Fuel cell research and development has historically been an active effort within the AET organization. Under the principal sponsorship of the Department of Energy, the Fuel Cell Team is now focusing on technologies required to produce fuel cell compatible feed gas from reformation of a variety of conventional fuels (e.g., gasoline, natural gas), principally for automotive applications. This effort involves chemical reactor design and analysis, process modeling, catalyst analysis, as well as full scale system characterization and testing. The group's Automation and Robotics team has at its foundation many years of experience delivering automated and robotic systems for nuclear, analytical chemistry, and bioengineering applications. As an integrator of commercial systems and a developer of unique custom-made systems, the team currently supports the automation needs of many Laboratory programs.
Archaic man meets a marvellous automaton: posthumanism, social robots, archetypes.
Jones, Raya
2017-06-01
Posthumanism is associated with critical explorations of how new technologies are rewriting our understanding of what it means to be human and how they might alter human existence itself. Intersections with analytical psychology vary depending on which technologies are held in focus. Social robotics promises to populate everyday settings with entities that have populated the imagination for millennia. A legend of A Marvellous Automaton appears as early as 350 B.C. in a book of Taoist teachings, and is joined by ancient and medieval legends of manmade humanoids coming to life, as well as the familiar robots of modern science fiction. However, while the robotics industry seems to be realizing an archetypal fantasy, the technology creates new social realities that generate distinctive issues of potential relevance for the theory and practice of analytical psychology. © 2017, The Society of Analytical Psychology.
Simultaneous Multiparameter Cellular Energy Metabolism Profiling of Small Populations of Cells.
Kelbauskas, Laimonas; Ashili, Shashaanka P; Lee, Kristen B; Zhu, Haixin; Tian, Yanqing; Meldrum, Deirdre R
2018-03-12
Functional and genomic heterogeneity of individual cells are central players in a broad spectrum of normal and disease states. Our knowledge about the role of cellular heterogeneity in tissue and organism function remains limited due to analytical challenges one encounters when performing single cell studies in the context of cell-cell interactions. Information based on bulk samples represents ensemble averages over populations of cells, while data generated from isolated single cells do not account for intercellular interactions. We describe a new technology and demonstrate two important advantages over existing technologies: first, it enables multiparameter energy metabolism profiling of small cell populations (<100 cells)-a sample size that is at least an order of magnitude smaller than other, commercially available technologies; second, it can perform simultaneous real-time measurements of oxygen consumption rate (OCR), extracellular acidification rate (ECAR), and mitochondrial membrane potential (MMP)-a capability not offered by any other commercially available technology. Our results revealed substantial diversity in response kinetics of the three analytes in dysplastic human epithelial esophageal cells and suggest the existence of varying cellular energy metabolism profiles and their kinetics among small populations of cells. The technology represents a powerful analytical tool for multiparameter studies of cellular function.
Reaction-diffusion systems in natural sciences and new technology transfer
NASA Astrophysics Data System (ADS)
Keller, André A.
2012-12-01
Diffusion mechanisms in natural sciences and innovation management involve partial differential equations (PDEs) because of their spatio-temporal dimensions. Functional semi-discretized PDEs (with lattice spatial structures or time delays) may be even better adapted to real-world problems. In the modeling process, PDEs can also formalize behaviors, such as the logistic growth of populations with migration, and the adopters' dynamics of new products in innovation models. In biology, these events are related to variations in the environment, population densities and overcrowding, migration and spreading of humans, animals, plants and other cells and organisms. In chemical reactions, molecules of different species interact locally and diffuse. In the management of new technologies, the diffusion of innovations in the marketplace (e.g., the mobile phone) is a major subject; these innovation diffusion models draw mainly on epidemic models. This contribution introduces the modeling process using PDEs and reviews the essential features of dynamics and control in biological systems, chemical systems, and new technology transfer. The paper is essentially user-oriented, presenting basic nonlinear evolution equations, delay PDEs, several analytical and numerical solution methods, different classes of solutions, and the use of mathematical packages, notebooks, and codes. The computations are carried out using the software Wolfram Mathematica® 7 and C++ codes.
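As a concrete instance of the logistic-growth-with-migration models mentioned above (an illustration added for reference, not an equation quoted from the paper), such dynamics are commonly written as the Fisher-KPP reaction-diffusion equation for a population density u(x,t) with diffusion coefficient D, growth rate r, and carrying capacity K:

\frac{\partial u}{\partial t} = D\,\frac{\partial^{2} u}{\partial x^{2}} + r\,u\left(1 - \frac{u}{K}\right)

Innovation-diffusion models of the epidemic type follow the same template, with u interpreted as the density of adopters.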
Statistically qualified neuro-analytic failure detection method and system
Vilim, Richard B.; Garcia, Humberto E.; Chen, Frederick W.
2002-03-02
An apparatus and method for monitoring a process involve development and application of a statistically qualified neuro-analytic (SQNA) model to accurately and reliably identify process change. The development of the SQNA model is accomplished in two stages: deterministic model adaptation and stochastic modification of the deterministic model. Deterministic model adaptation involves formulating an analytic model of the process representing known process characteristics, augmenting the analytic model with a neural network that captures unknown process characteristics, and training the resulting neuro-analytic model by adjusting the neural network weights according to a unique scaled equation-error minimization technique. Stochastic modification involves qualifying any remaining uncertainty in the trained neuro-analytic model by formulating a likelihood function, given an error propagation equation, for computing the probability that the neuro-analytic model generates the measured process output. Preferably, the developed SQNA model is validated using known sequential probability ratio tests and applied to the process as an on-line monitoring system. As an illustration of the method and apparatus, it is applied to a peristaltic pump system.
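A minimal sketch of the hybrid structure described above, assuming a toy one-dimensional process; this illustrates only the analytic-model-plus-neural-network idea, not the patented SQNA procedure, and all names and data below are hypothetical.

# Hybrid "neuro-analytic" sketch: a fixed analytic model captures known
# process behavior, and a small network learns the residual by
# least-squares error minimization. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def analytic_model(x):
    # Known process characteristics: assume a first-principles linear term.
    return 2.0 * x

# Toy "measured" output: analytic part plus an unknown nonlinearity.
x = rng.uniform(-1, 1, size=(200, 1))
y = analytic_model(x) + 0.5 * np.sin(3 * x) + 0.01 * rng.standard_normal((200, 1))

# One-hidden-layer network that learns the residual y - analytic_model(x).
W1 = rng.standard_normal((1, 16)) * 0.5
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 1)) * 0.5
b2 = np.zeros(1)

lr = 0.05
for step in range(2000):
    h = np.tanh(x @ W1 + b1)
    pred = analytic_model(x) + h @ W2 + b2
    err = pred - y                       # equation error
    # Backpropagate the squared error through the network only;
    # the analytic part is held fixed.
    gW2 = h.T @ err / len(x)
    gb2 = err.mean(axis=0)
    gh = err @ W2.T * (1 - h**2)
    gW1 = x.T @ gh / len(x)
    gb1 = gh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

h = np.tanh(x @ W1 + b1)
mse = np.mean((analytic_model(x) + h @ W2 + b2 - y) ** 2)
print(f"trained MSE: {mse:.5f}")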
GESearch: An Interactive GUI Tool for Identifying Gene Expression Signature.
Ye, Ning; Yin, Hengfu; Liu, Jingjing; Dai, Xiaogang; Yin, Tongming
2015-01-01
The huge amount of gene expression data generated by microarray and next-generation sequencing technologies presents challenges for extracting biological meaning. When searching for coexpressed genes, the data mining process is strongly affected by the choice of algorithm, so it is highly desirable to provide multiple algorithm options in a user-friendly analytical toolkit for exploring gene expression signatures. For this purpose, we developed GESearch, an interactive graphical user interface (GUI) toolkit, which is written in MATLAB and supports a variety of gene expression data files. This analytical toolkit provides four models, including the mean, the regression, the delegate, and the ensemble models, to identify coexpressed genes, and enables users to filter data and to select gene expression patterns by browsing the display window or by importing knowledge-based genes. The utility of this analytical toolkit is then demonstrated by analyzing two sets of real-life microarray datasets from cell-cycle experiments. Overall, we have developed an interactive GUI toolkit that allows for choosing among multiple algorithms for analyzing gene expression signatures.
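GESearch's four models are only named, not specified, in the abstract, so the sketch below shows merely the generic correlation-based idea behind coexpressed-gene search, on synthetic data; it is not GESearch code.

# Generic coexpression search: rank genes by Pearson correlation of their
# expression profiles with a query gene. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_genes, n_samples = 1000, 12
expr = rng.standard_normal((n_genes, n_samples))            # rows = genes
expr[1] = expr[0] + 0.1 * rng.standard_normal(n_samples)    # plant a coexpressed pair

query = expr[0]
# Pearson correlation of the query profile against every gene.
centered = expr - expr.mean(axis=1, keepdims=True)
q = query - query.mean()
r = centered @ q / (np.linalg.norm(centered, axis=1) * np.linalg.norm(q))

top = np.argsort(-r)[:5]
print("genes most coexpressed with gene 0:", top, r[top].round(3))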
Dangerous Waste Characteristics of Contact-Handled Transuranic Mixed Wastes from Hanford Tanks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tingey, Joel M.; Bryan, Garry H.; Deschane, Jaquetta R.
2004-10-05
This report summarizes existing analytical data gleaned from samples taken from the Hanford tanks designated as potentially containing transuranic mixed process wastes. Process knowledge of the wastes transferred to these tanks has been reviewed to determine whether the dangerous waste characteristics now assigned to all Hanford underground storage tanks are applicable to these particular wastes. Supplemental technologies are being examined to accelerate the Hanford tank waste cleanup mission and accomplish waste treatment safely and efficiently. To date, 11 Hanford waste tanks have been designated as potentially containing contact-handled (CH) transuranic mixed (TRUM) wastes. The CH-TRUM wastes are found in single-shell tanks B-201 through B-204, T-201 through T-204, T-104, T-110, and T-111. Methods and equipment to solidify and package the CH-TRUM wastes are part of the supplemental technologies being evaluated. The resulting packages and wastes must be acceptable for disposal at the Waste Isolation Pilot Plant (WIPP). The dangerous waste characteristics being considered include ignitability, corrosivity, reactivity, and toxicity arising from the presence of 2,4,5-trichlorophenol at levels above the dangerous waste threshold. The analytical data reviewed include concentrations of sulfur, sulfate, cyanide, 2,4,5-trichlorophenol, total organic carbon, and oxalate; the composition of the tank headspace; pH; and mercury. Differential scanning calorimetry results were used to determine the energetics of the wastes as a function of temperature. This report supersedes and replaces PNNL-14832.
Kristó, Katalin; Kovács, Orsolya; Kelemen, András; Lajkó, Ferenc; Klivényi, Gábor; Jancsik, Béla; Pintye-Hódi, Klára; Regdon, Géza
2016-12-01
Several publications in the literature have examined the effect of impeller and chopper speeds on product parameters, but there is no information about the effect of temperature. Therefore, our main aim was to investigate elevated temperature and temperature distribution during pelletization in a high-shear granulator, in line with process analytical technology. During our experimental work, pellets containing pepsin were formulated with a high-shear granulator. A specially designed chamber (Opulus Ltd.) was used for pelletization. This chamber contained four PyroButton-TH® sensors built into the wall and three PyroDiff® sensors at 1, 2, and 3 cm from the wall; the sensors were located at three different heights. The impeller and chopper speeds were set on the basis of a 3² factorial design (an illustrative enumeration follows below). The temperature was measured continuously at 7 different points during pelletization, and the results were compared with the temperature values measured by the thermal sensor of the high-shear granulator. The optimization parameters were enzyme activity, average size, breaking hardness, surface free energy, and aspect ratio. One novelty was the application of the specially designed chamber (Opulus Ltd.) to monitor the temperature continuously at 7 different points during high-shear granulation. The other novelty of this study was the evaluation of the effect of temperature on the properties of protein-containing pellets during high-shear pelletization. Copyright © 2016 Elsevier B.V. All rights reserved.
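For illustration, a 3² full factorial design such as the one described enumerates every combination of three impeller levels and three chopper levels; the speed values below are hypothetical placeholders, not the authors' settings.

# 3^2 full factorial design: all combinations of two factors at three levels.
from itertools import product

impeller_rpm = [500, 1000, 1500]   # three hypothetical impeller levels
chopper_rpm = [1000, 2000, 3000]   # three hypothetical chopper levels

runs = list(product(impeller_rpm, chopper_rpm))
for i, (imp, chop) in enumerate(runs, 1):
    print(f"run {i}: impeller={imp} rpm, chopper={chop} rpm")
# 3 x 3 = 9 experimental runs in total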
NASA Astrophysics Data System (ADS)
Miedzińska, D.; Gieleta, R.; Osiński, J.
2015-02-01
A vibratory pile hammer (VPH) is a mechanical device used to drive steel piles as well as tube piles into soil to provide foundation support for buildings or other structures. In order to increase the stability and efficiency of VPH operation at over-resonance frequencies, a new VPH construction was developed at the Military University of Technology. The new VPH contains a system of counter-rotating eccentric weights, powered by hydraulic motors, designed in such a way that horizontal vibrations cancel out while vertical vibrations are transmitted into the pile (see the standard relation sketched below). This system is suspended in the static parts by adaptive variable-stiffness pillows based on a smart material, magnetorheological elastomer (MRE), whose rheological and mechanical properties can be reversibly and rapidly controlled by an external magnetic field. The work presented in the paper is part of the modified VPH design process. It concerns experimental research on vibrations during the piling process and analytical analysis of the acquired signal. The results will be applied in the VPH control system.
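For orientation, the cancellation described above follows a textbook result for two synchronized counter-rotating eccentric masses m with eccentricity e at angular speed ω (a standard relation, not a formula quoted from the paper): the horizontal force components cancel while the vertical components add,

F_{h}(t) = 0, \qquad F_{v}(t) = 2\,m\,e\,\omega^{2}\sin(\omega t)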
Evaluating supplier quality performance using analytical hierarchy process
NASA Astrophysics Data System (ADS)
Kalimuthu Rajoo, Shanmugam Sundram; Kasim, Maznah Mat; Ahmad, Nazihah
2013-09-01
This paper elaborates the importance of evaluating supplier quality performance to an organization. Supplier quality performance evaluation reflects the actual performance the supplier exhibits at the customer's end, and it is critical in enabling the organization to determine areas of improvement and then work with the supplier to close the gaps. The customer's success partly depends on the supplier's quality performance. Key criteria such as quality, cost, delivery, technology support, and customer service are categorized as the main factors contributing to a supplier's quality performance. Eighteen suppliers manufacturing automotive application parts were evaluated in 2010 using a weighted-point system. Several suppliers received identical ratings, which led to tied rankings. The Analytical Hierarchy Process (AHP), a user-friendly decision-making tool for complex, multi-criteria problems, was then used to evaluate the suppliers' quality performance against the weighted-point system. The consistency ratio was checked for the criteria and sub-criteria. The final AHP results contained no overlapping ratings and therefore yielded a better decision-making methodology than the weighted-point rating system. The core AHP calculation is sketched below.
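A minimal sketch of the standard AHP calculation (not the paper's specific data): priority weights are derived from the principal eigenvector of a pairwise comparison matrix, and the consistency ratio checks the coherence of the judgments. The comparison matrix below is hypothetical.

# AHP priority weights and consistency ratio from a pairwise matrix.
import numpy as np

# Hypothetical pairwise comparisons of three criteria on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 3.0],
              [1/5, 1/3, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                      # priority weights

n = A.shape[0]
lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)      # consistency index
RI = 0.58                         # Saaty's random index for n = 3
CR = CI / RI                      # consistency ratio; <= 0.10 is acceptable
print("weights:", w.round(3), "CR:", round(CR, 3))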
Conversion of an atomic to a molecular argon ion and low pressure argon relaxation
NASA Astrophysics Data System (ADS)
M, N. Stankov; A, P. Jovanović; V, Lj Marković; S, N. Stamenković
2016-01-01
The dominant process in the relaxation of a DC glow discharge between two plane-parallel electrodes in argon at a pressure of 200 Pa is analyzed by measuring the breakdown time delay and by analytical and numerical models. Using an approximate analytical model, it is found that the relaxation in the range from 20 to 60 ms into the afterglow is dominated by ions produced by atomic-to-molecular conversion of Ar+ ions in the first several milliseconds after the cessation of the discharge. This conversion is confirmed by the presence of a double-Gaussian distribution of the formative time delay, as well as by conversion maxima in a set of memory curves measured under different conditions. Finally, a one-dimensional (1D) numerical model for determining the number densities of the dominant particles in a stationary DC glow discharge and a two-dimensional (2D) model for the relaxation are used to confirm the previous assumptions and to determine the corresponding collision and transport coefficients of the dominant species and processes. Project supported by the Ministry of Education, Science and Technological Development of the Republic of Serbia (Grant No. ON171025).
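For reference (standard plasma-chemistry notation, not quoted from the paper), the atomic-to-molecular conversion discussed above is usually attributed to the three-body association channel

\mathrm{Ar^{+}} + 2\,\mathrm{Ar} \longrightarrow \mathrm{Ar_{2}^{+}} + \mathrm{Ar}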
Arduini, Fabiana; Cinti, Stefano; Scognamiglio, Viviana; Moscone, Danila; Palleschi, Giuseppe
2017-03-22
Through the years, scientists have developed cutting-edge technologies to make (bio)sensors more suitable for environmental analytical purposes. Technological advancements in the fields of material science, rational design, microfluidics, and sensor printing have radically shaped biosensor technology, which is even more evident in the continuous development of sensing systems for the monitoring of hazardous chemicals. These efforts will be crucial in solving some of the problems that keep biosensors from reaching real environmental applications, such as continuous analyses in the field by means of multi-analyte portable devices. This review (with 203 refs.) covers the progress between 2010 and 2015 in the field of technologies enabling biosensor applications in environmental analysis, including (i) printing technology, (ii) nanomaterial technology, (iii) nanomotors, (iv) biomimetic design, and (v) microfluidics. The next section describes futuristic cutting-edge technologies that have gained momentum in recent years and bring highly innovative aspects to biosensing devices. Copyright © 2016 Elsevier B.V. All rights reserved.
The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate
ERIC Educational Resources Information Center
Cárdenas-Navia, Isabel; Fitzgerald, Brian K.
2015-01-01
New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…
Visual Analytics in Public Safety: Example Capabilities for Example Government Agencies
2011-10-01
…is not limited to: the Police Records Information Management Environment for British Columbia (PRIME-BC), the Police Reporting and Occurrence System… …and filtering for rapid identification of relevant documents; a graphical environment for visual evidence marshaling; interactive linking and… …analytical reasoning facilitated by interactive visual interfaces and integration with computational analytics. Indeed, a wide variety of technologies…
Web Analytics: A Picture of the Academic Library Web Site User
ERIC Educational Resources Information Center
Black, Elizabeth L.
2009-01-01
This article describes the usefulness of Web analytics for understanding the users of an academic library Web site. Using a case study, the analysis describes how Web analytics can answer questions about Web site user behavior, including when visitors come, the duration of the visit, how they get there, the technology they use, and the most…
Reproducible Computing: a new Technology for Statistics Education and Educational Research
NASA Astrophysics Data System (ADS)
Wessa, Patrick
2009-05-01
This paper explains how the R Framework (http://www.wessa.net) and a newly developed Compendium Platform (http://www.freestatistics.org) allow us to create, use, and maintain documents that contain empirical research results which can be recomputed and reused in derived work. It is illustrated that this technological innovation can be used to create educational applications that demonstrably support effective learning of statistics and associated analytical skills. It is explained how a Compendium can be created by anyone, without the need to understand the technicalities of scientific word processing (LaTeX) or statistical computing (R code). The proposed Reproducible Computing system allows educational researchers to objectively measure key aspects of the actual learning process based on individual and constructivist activities such as peer review, collaboration in research, and computational experimentation. The system was implemented and tested in three statistics courses, in which Compendia were used to create an interactive e-learning environment that simulated the real-world process of empirical scientific research.
NASA Astrophysics Data System (ADS)
Kouziokas, Georgios N.
2016-01-01
The adoption of Information and Communication Technologies (ICT) in environmental management has become a pressing need as environmental information grows rapidly. This paper presents a prototype Environmental Management Information System (EMIS) that was developed to provide a systematic way of managing the environmental data and human resources of an environmental organization. The system was built with standard programming languages, a Database Management System (DBMS), and other programming tools, and it combines information from the relational database to achieve the principal goals of the environmental organization. The developed application can be used to store and process information regarding human resources data, environmental projects, observations, reports, data about protected species, environmental measurements of pollutant factors or other kinds of analytical measurements, and the financial data of the organization. Furthermore, the system supports the visualization of spatial data structures by using geographic information systems (GIS) and web mapping technologies. This paper describes this prototype software application, its structure, its functions, and how the system can be utilized to facilitate technology-based environmental management and decision-making.
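A minimal sketch of the relational-storage idea described above, assuming a hypothetical schema; the table and column names are illustrative, not the paper's actual design.

# Storing and querying environmental measurements in a relational database.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE measurements (
        id INTEGER PRIMARY KEY,
        station TEXT,            -- monitoring station identifier
        pollutant TEXT,          -- e.g. 'NO2', 'PM10'
        value REAL,              -- measured concentration
        unit TEXT,
        measured_at TEXT         -- ISO-8601 timestamp
    )
""")
cur.execute(
    "INSERT INTO measurements (station, pollutant, value, unit, measured_at) "
    "VALUES (?, ?, ?, ?, ?)",
    ("ST-01", "NO2", 41.7, "ug/m3", "2016-01-15T10:00:00"),
)
conn.commit()
for row in cur.execute(
    "SELECT station, pollutant, value, unit FROM measurements "
    "WHERE pollutant = 'NO2'"
):
    print(row)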