Sample records for integrated analysis process

  1. Abhijit Dutta | NREL

    Science.gov Websites

    Techno-economic analysis; process model development for existing and conceptual processes; detailed heat integration; economic analysis of integrated processes; integration of process simulation learnings into control. Related publication: Conceptual Process Design and Techno-Economic Assessment of Ex Situ Catalytic Fast Pyrolysis of Biomass: A…

  2. The Vehicle Integrated Performance Analysis Experience: Reconnecting With Technical Integration

    NASA Technical Reports Server (NTRS)

    McGhee, D. S.

    2006-01-01

    Very early in the Space Launch Initiative program, a small team of engineers at MSFC proposed a process for performing system-level assessments of a launch vehicle. Aimed primarily at providing insight and making NASA a smart buyer, the Vehicle Integrated Performance Analysis (VIPA) team was created. The difference between the VIPA effort and previous integration attempts is that VIPA is a process using experienced people from various disciplines, which focuses them on a technically integrated assessment. The foundations of VIPA's process are described. The VIPA team also recognized the need to target early detailed analysis toward identifying significant systems issues. This process is driven by the T-model for technical integration. VIPA's approach to performing system-level technical integration is discussed in detail. The VIPA process significantly enhances the development and monitoring of realizable project requirements. VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then reintegrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful.

  3. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    ERIC Educational Resources Information Center

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…

  4. A Comparative Analysis of Extract, Transformation and Loading (ETL) Process

    NASA Astrophysics Data System (ADS)

    Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.

    2018-02-01

    Data and information are currently growing rapidly, in varying amounts and across varying media. This growth will eventually produce very large collections of data, better known as Big Data. Business Intelligence (BI) utilizes large amounts of data and information for analysis so that important information can be obtained and used to support decision-making processes. In practice, a process that integrates existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). Many applications have been developed to carry out the ETL process, but selecting the application that is most effective and efficient in terms of time, cost and computing power can be a challenge. Therefore, the objective of the study was to provide a comparative analysis between the ETL process using Microsoft SQL Server Integration Services (SSIS) and one using Pentaho Data Integration (PDI).
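
    As a concrete illustration of the ETL pattern these tools automate, here is a minimal Python sketch that extracts rows from a CSV file, applies a small transformation, and loads the result into a SQLite staging table. The file name, column names, and table name are illustrative assumptions, not artifacts of the SSIS/PDI study.

    ```python
    import csv
    import sqlite3

    # Extract: read raw rows from a hypothetical source file.
    with open("sales_raw.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Transform: normalize a text field and derive a numeric total per row.
    for r in rows:
        r["region"] = r["region"].strip().upper()
        r["total"] = float(r["quantity"]) * float(r["unit_price"])

    # Load: write the cleaned rows into a data-warehouse staging table.
    con = sqlite3.connect("warehouse.db")
    con.execute("CREATE TABLE IF NOT EXISTS sales_stage (region TEXT, total REAL)")
    con.executemany(
        "INSERT INTO sales_stage (region, total) VALUES (:region, :total)", rows
    )
    con.commit()
    con.close()
    ```

    Tools like SSIS and PDI package exactly these three stages as configurable graphical workflows rather than hand-written scripts.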

  5. Integration of design, structural, thermal and optical analysis: And user's guide for structural-to-optical translator (PATCOD)

    NASA Technical Reports Server (NTRS)

    Amundsen, R. M.; Feldhaus, W. S.; Little, A. D.; Mitchum, M. V.

    1995-01-01

    Electronic integration of design and analysis processes was achieved and refined at Langley Research Center (LaRC) during the development of an optical bench for a laser-based aerospace experiment. Mechanical design has been integrated with thermal, structural and optical analyses. Electronic import of the model geometry eliminates the repetitive steps of geometry input to develop each analysis model, leading to faster and more accurate analyses. Guidelines for integrated model development are given. This integrated analysis process has been built around software that was already in use by designers and analysts at LaRC. The process as currently implemented uses Pro/Engineer for design, Pro/Manufacturing for fabrication, PATRAN for solid modeling, NASTRAN for structural analysis, SINDA-85 and P/Thermal for thermal analysis, and Code V for optical analysis. Currently, the only analysis model that must be built manually is the Code V model; all others can be imported from the Pro/E geometry. The translator from PATRAN results to Code V optical analysis (PATCOD) was developed and tested at LaRC. Directions for use of the translator and the other models are given.

  6. Integrated Multi-process Microfluidic Systems for Automating Analysis

    PubMed Central

    Yang, Weichun; Woolley, Adam T.

    2010-01-01

    Microfluidic technologies have been applied extensively in rapid sample analysis. Some current challenges for standard microfluidic systems are relatively high detection limits, and reduced resolving power and peak capacity compared to conventional approaches. The integration of multiple functions and components onto a single platform can overcome these separation and detection limitations of microfluidics. Multiplexed systems can greatly increase peak capacity in multidimensional separations and can increase sample throughput by analyzing many samples simultaneously. On-chip sample preparation, including labeling, preconcentration, cleanup and amplification, can all serve to speed up and automate processes in integrated microfluidic systems. This paper summarizes advances in integrated multi-process microfluidic systems for automated analysis, their benefits and areas for needed improvement. PMID:20514343

  7. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  8. Integrated Structural Analysis and Test Program

    NASA Technical Reports Server (NTRS)

    Kaufman, Daniel

    2005-01-01

    An integrated structural-analysis and structure-testing computer program is being developed in order to: automate repetitive processes in testing and analysis; accelerate pre-test analysis; accelerate reporting of tests; facilitate planning of tests; improve execution of tests; create a vibration, acoustics, and shock test database; and integrate analysis and test data. The software package includes modules pertaining to sinusoidal and random vibration, shock and time replication, acoustics, base-driven modal survey, and mass properties and static/dynamic balance. The program is commanded by use of ActiveX controls; there is minimal need to generate command lines. Analysis or test files are selected by opening a Windows Explorer display. After selecting the desired input file, the program goes to a so-called analysis data process or test data process, depending on the type of input data. The status of the process is given by a Windows status bar, and when processing is complete, the data are reported in graphical, tabular, and matrix form.

  9. Integration of Design, Thermal, Structural, and Optical Analysis, Including Thermal Animation

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.

    1993-01-01

    In many industries there has recently been a concerted movement toward 'quality management' and the issue of how to accomplish work more efficiently. Part of this effort is focused on concurrent engineering; the idea of integrating the design and analysis processes so that they are not separate, sequential processes (often involving design rework due to analytical findings) but instead form an integrated system with smooth transfers of information. Presented herein are several specific examples of concurrent engineering methods being carried out at Langley Research Center (LaRC): integration of thermal, structural and optical analyses to predict changes in optical performance based on thermal and structural effects; integration of the CAD design process with thermal and structural analyses; and integration of analysis and presentation by animating the thermal response of a system as an active color map -- a highly effective visual indication of heat flow.

  10. Global sensitivity analysis of DRAINMOD-FOREST, an integrated forest ecosystem model

    Treesearch

    Shiying Tian; Mohamed A. Youssef; Devendra M. Amatya; Eric D. Vance

    2014-01-01

    Global sensitivity analysis is a useful tool to understand process-based ecosystem models by identifying key parameters and processes controlling model predictions. This study reported a comprehensive global sensitivity analysis for DRAINMOD-FOREST, an integrated model for simulating water, carbon (C), and nitrogen (N) cycles and plant growth in lowland forests. The...
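
    A common way to perform the kind of variance-based global sensitivity analysis described here is to estimate Sobol' first-order indices. The sketch below uses the standard pick-freeze estimator on the Ishigami test function as a stand-in; DRAINMOD-FOREST itself is not callable in a few lines, so the model function, parameter ranges, and sample size are purely illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, k = 100_000, 3  # Monte Carlo samples, number of parameters (illustrative)

    def model(X):
        # Ishigami test function, a common stand-in for a process-based model.
        return (np.sin(X[:, 0]) + 7.0 * np.sin(X[:, 1]) ** 2
                + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0]))

    # Two independent parameter samples on [-pi, pi]^k.
    A = rng.uniform(-np.pi, np.pi, (N, k))
    B = rng.uniform(-np.pi, np.pi, (N, k))
    fA, fB = model(A), model(B)

    # First-order index: freeze parameter i from A, resample the rest from B.
    for i in range(k):
        C = B.copy()
        C[:, i] = A[:, i]
        S_i = (np.mean(fA * model(C)) - np.mean(fA) * np.mean(fB)) / np.var(fA)
        print(f"S_{i + 1} ~ {S_i:.3f}")
    ```

    Parameters with large indices dominate the output variance, which is exactly how such a study identifies the key parameters and processes controlling model predictions.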

  11. The use of artificial intelligence techniques to improve the multiple payload integration process

    NASA Technical Reports Server (NTRS)

    Cutts, Dannie E.; Widgren, Brian K.

    1992-01-01

    A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.

  12. Cost-effectiveness of integrated analysis/design systems (IPAD): An executive summary. II [for aerospace vehicles]

    NASA Technical Reports Server (NTRS)

    Miller, R. E., Jr.; Hansen, S. D.; Redhed, D. D.; Southall, J. W.; Kawaguchi, A. S.

    1974-01-01

    Evaluation of the cost-effectiveness of integrated analysis/design systems, with particular attention to the Integrated Program for Aerospace-Vehicle Design (IPAD) project. An analysis of all the ingredients of IPAD indicates the feasibility of a significant cost and flowtime reduction in the product design process involved. It is also concluded that an IPAD-supported design process will provide a framework for configuration control, whereby the engineering costs for design, analysis and testing can be controlled during the air vehicle development cycle.

  13. INTEGRATED ENVIRONMENTAL ASSESSMENT OF THE MID-ATLANTIC REGION WITH ANALYTICAL NETWORK PROCESS

    EPA Science Inventory

    A decision analysis method for integrating environmental indicators was developed. This was a combination of Principal Component Analysis (PCA) and the Analytic Network Process (ANP). Being able to take into account interdependency among variables, the method was capable of ran...

  14. Performance analysis of different tuning rules for an isothermal CSTR using integrated EPC and SPC

    NASA Astrophysics Data System (ADS)

    Roslan, A. H.; Karim, S. F. Abd; Hamzah, N.

    2018-03-01

    This paper demonstrates the integration of Engineering Process Control (EPC) and Statistical Process Control (SPC) for the control of product concentration in an isothermal CSTR. The objectives of this study are to evaluate the performance of the Ziegler-Nichols (Z-N), Direct Synthesis (DS) and Internal Model Control (IMC) tuning methods and to determine the most effective method for this process. The simulation model was obtained from past literature and reconstructed in MATLAB/SIMULINK to evaluate the process response. Additionally, the process stability, capability and normality were analyzed using Process Capability Sixpack reports in Minitab. Based on the results, DS displays the best response, having the smallest rise time, settling time, overshoot, undershoot, Integral Time Absolute Error (ITAE) and Integral Square Error (ISE). Based on the statistical analysis, DS also emerges as the best tuning method, exhibiting the highest process stability and capability.
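
    To make the error criteria concrete: a closed-loop response can be scored by integrating the squared error (ISE) and the time-weighted absolute error (ITAE) over the simulation horizon. The Python sketch below does this for a PI controller on a generic first-order process using forward-Euler integration; the gain, time constant, and controller settings are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    K, tau = 2.0, 5.0        # first-order process gain and time constant (assumed)
    dt, T = 0.01, 60.0       # integration step and simulation horizon

    def score_pi(Kc, Ti, setpoint=1.0):
        """Simulate y' = (-y + K*u)/tau under PI control; return (ISE, ITAE)."""
        y, e_int, ise, itae = 0.0, 0.0, 0.0, 0.0
        for k in range(int(T / dt)):
            t = k * dt
            e = setpoint - y
            e_int += e * dt
            u = Kc * (e + e_int / Ti)          # PI control law
            y += dt * (-y + K * u) / tau       # forward-Euler process update
            ise += e * e * dt                  # Integral Square Error
            itae += t * abs(e) * dt            # Integral Time Absolute Error
        return ise, itae

    # Compare two hypothetical tuning choices on the same process.
    for name, (Kc, Ti) in {"tuning A": (1.5, 4.0), "tuning B": (0.8, 8.0)}.items():
        ise, itae = score_pi(Kc, Ti)
        print(f"{name}: ISE={ise:.3f}, ITAE={itae:.3f}")
    ```

    Ranking candidate tunings by these integrals is the quantitative core of the comparison the abstract describes.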

  15. Integrating Individual Learning Processes and Organizational Knowledge Formation: Foundational Determinants for Organizational Performance

    ERIC Educational Resources Information Center

    Song, Ji Hoon; Chermack, Thomas J.; Kim, Hong Min

    2008-01-01

    This research examined the link between learning processes and knowledge formation through an integrated literature review from both academic and practical viewpoints. Individuals' learning processes and organizational knowledge creation were reviewed by means of theoretical and integrative analysis based on a lack of empirical research on the…

  16. Train integrity detection risk analysis based on PRISM

    NASA Astrophysics Data System (ADS)

    Wen, Yuan

    2018-04-01

    GNSS-based Train Integrity Monitoring Systems (TIMS) are an effective and low-cost scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as the uncertainty of wireless communication channels, which may lead to communication and positioning failures. In order to guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze and model the risk factors in the GNSS communication process and the on-board communication process. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.

  17. Integration of rocket turbine design and analysis through computer graphics

    NASA Technical Reports Server (NTRS)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  18. Work Integration of People with Disabilities in the Regular Labour Market: What Can We Do to Improve These Processes?

    ERIC Educational Resources Information Center

    Vila, Montserrat; Pallisera, Maria; Fullana, Judit

    2007-01-01

    Background: It is important to ensure that regular processes of labour market integration are available for all citizens. Method: Thematic content analysis techniques, using semi-structured group interviews, were used to identify the principal elements contributing to the processes of integrating people with disabilities into the regular labour…

  19. Integrating fire management analysis into land management planning

    Treesearch

    Thomas J. Mills

    1983-01-01

    The analysis of alternative fire management programs should be integrated into the land and resource management planning process, but a single fire management analysis model cannot meet all planning needs. Therefore, a set of simulation models that are analytically separate from integrated land management planning models are required. The design of four levels of fire...

  20. Six sigma tools in integrating internal operations of a retail pharmacy: a case study.

    PubMed

    Kumar, Sameer; Kwong, Anthony M

    2011-01-01

    This study was initiated to integrate information and enterprise-wide healthcare delivery system issues, specifically within an inpatient retail pharmacy operation in a U.S. community hospital. Six Sigma tools were used to examine the effects on an inpatient retail pharmacy service process. Tools used include service blueprints, a cause-effect diagram, gap analysis derived from customer and employee surveys, and mistake proofing; these were applied in various business situations, and the results were analyzed to identify and propose process improvements and integration. The research indicates that the Six Sigma tools in this discussion are very applicable and quite effective in helping to streamline and integrate the pharmacy process flow. Additionally, gap analysis derived from two different surveys was used to estimate the primary areas of focus for increasing customer and employee satisfaction. The results of this analysis were useful in initiating discussions of how to effectively narrow these service gaps. This retail pharmaceutical service study serves as a framework for the process that should occur for successful process-improvement tool evaluation and implementation. Pharmaceutical service operations in the U.S. that use this integration framework must tailor it to their individual situations to maximize their chances for success.
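
    Gap analysis of the kind described reduces to simple arithmetic on paired survey scores: for each service dimension, the gap is the mean perception rating minus the mean expectation rating, with large negative gaps marking priority areas. A minimal sketch with made-up ratings follows; the dimension names and numbers are illustrative, not the study's data.

    ```python
    # Hypothetical 1-5 survey ratings per service dimension.
    expectations = {"wait time": [5, 5, 4, 5], "accuracy": [5, 4, 5, 5]}
    perceptions  = {"wait time": [3, 2, 3, 4], "accuracy": [4, 5, 4, 4]}

    for dim in expectations:
        gap = (sum(perceptions[dim]) / len(perceptions[dim])
               - sum(expectations[dim]) / len(expectations[dim]))
        print(f"{dim}: gap = {gap:+.2f}")  # negative => service falls short
    ```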

  1. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    NASA Technical Reports Server (NTRS)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  2. Developing a comprehensive framework of community integration for people with acquired brain injury: a conceptual analysis.

    PubMed

    Shaikh, Nusratnaaz M; Kersten, Paula; Siegert, Richard J; Theadom, Alice

    2018-03-06

    Despite increasing emphasis on the importance of community integration as an outcome for acquired brain injury (ABI), there is still no consensus on the definition of community integration. The aim of this study was to complete a concept analysis of community integration in people with ABI. The method of concept clarification was used to guide the concept analysis of community integration based on a literature review. Articles were included if they explored community integration in people with ABI. Data extraction was performed by the initial coding of (1) the definition of community integration used in the articles, (2) attributes of community integration recognized in the articles' findings, and (3) the process of community integration. This information was synthesized to develop a model of community integration. Thirty-three articles were identified that met the inclusion criteria. The construct of community integration was found to be a non-linear process reflecting recovery over time, sequential goals, and transitions. Community integration was found to encompass six components: independence, a sense of belonging, adjustment, having a place to live, involvement in a meaningful occupational activity, and being socially connected to the community. Antecedents to community integration included individual, injury-related, environmental, and societal factors. The findings of this concept analysis suggest that the concept of community integration is more diverse than previously recognized. New measures and rehabilitation plans capturing all attributes of community integration are needed in clinical practice. Implications for rehabilitation: Understanding the perceptions and lived experiences of people with acquired brain injury through this analysis provides a basis for ensuring that rehabilitation meets patients' needs. This model highlights the need for clinicians to be aware of and assess the role of antecedents as well as the attributes of community integration itself, to ensure all aspects are addressed in a manner that will enhance recovery and improve the level of integration into the community. The finding that community integration is a non-linear process also highlights the need for rehabilitation professionals to review and revise plans over time in response to a person's changing circumstances and recovery journey. This analysis provides the groundwork for an operational model of community integration and for the development of a measure of community integration that assesses all six attributes revealed in this review, not recognized in previous frameworks.

  3. Quality by design case study: an integrated multivariate approach to drug product and process development.

    PubMed

    Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder

    2009-12-01

    To facilitate an in-depth process understanding, and to offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate effects of the design factors on manufacturability and final product CQAs, and to establish a design space that ensures the desired CQAs. Two types of analyses were performed to extract maximal information: DOE effect and response surface analysis, and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time) on response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis to ensure the desired CQAs. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes/process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies the application of QbD principles and tools to drug product and process development.
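
    Of the multivariate techniques named here, PCA is the most compact to illustrate: center the batch-by-variable data matrix and take its SVD; scores, loadings, and explained variance follow directly. A minimal numpy sketch with a made-up DOE batch matrix follows; the batch count and values are assumptions, not the study's data.

    ```python
    import numpy as np

    # Rows = DOE batches, columns = process/response variables (invented data).
    X = np.array([
        [32.0, 4.5, 78.2],
        [35.5, 5.0, 81.0],
        [30.1, 3.9, 75.4],
        [33.7, 4.8, 79.9],
        [36.2, 5.4, 83.1],
    ])

    Xc = X - X.mean(axis=0)            # mean-center each variable
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    scores = U * s                     # batch coordinates in principal-component space
    loadings = Vt.T                    # variable contributions per component
    explained = s**2 / np.sum(s**2)    # variance fraction per component
    print("explained variance ratios:", np.round(explained, 3))
    ```

    In a QbD setting, batches that cluster together in score space behave alike, and the loadings point to which material attributes or process parameters drive that behavior.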

  4. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Volume 1; Appendices

    NASA Technical Reports Server (NTRS)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g., missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  5. Review of Exploration Systems Development (ESD) Integrated Hazard Development Process. Appendices; Volume 2

    NASA Technical Reports Server (NTRS)

    Smiles, Michael D.; Blythe, Michael P.; Bejmuk, Bohdan; Currie, Nancy J.; Doremus, Robert C.; Franzo, Jennifer C.; Gordon, Mark W.; Johnson, Tracy D.; Kowaleski, Mark M.; Laube, Jeffrey R.

    2015-01-01

    The Chief Engineer of the Exploration Systems Development (ESD) Office requested that the NASA Engineering and Safety Center (NESC) perform an independent assessment of the ESD's integrated hazard development process. The focus of the assessment was to review the integrated hazard analysis (IHA) process and identify any gaps/improvements in the process (e.g. missed causes, cause tree completeness, missed hazards). This document contains the outcome of the NESC assessment.

  6. Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.

    PubMed

    Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N

    2009-10-27

    The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or standardized batch processing to incorporate all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided heavily depend on the programming skills of the user, whereas GUI-embedded solutions do not provide direct support for various raw image analysis formats or a versatile and simultaneously flexible combination of signal-processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports 2-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, or tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools, plus customizable export data formats for seamless integration with other analysis tools or MATLAB for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with the MATLAB Component Runtime. Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.

  7. Subsurface Hydrology: Data Integration for Properties and Processes

    NASA Astrophysics Data System (ADS)

    Hyndman, David W.; Day-Lewis, Frederick D.; Singha, Kamini

    Groundwater is a critical resource and the principal source of drinking water for over 1.5 billion people. In 2001, the National Research Council cited as a "grand challenge" our need to understand the processes that control water movement in the subsurface. This volume faces that challenge in terms of data integration between complex, multi-scale hydrologic processes, and their links to other physical, chemical, and biological processes at multiple scales. Subsurface Hydrology: Data Integration for Properties and Processes presents the current state of the science in four aspects: • Approaches to hydrologic data integration • Data integration for characterization of hydrologic properties • Data integration for understanding hydrologic processes • Meta-analysis of current interpretations Scientists and researchers in the field, the laboratory, and the classroom will find this work an important resource in advancing our understanding of subsurface water movement.

  8. Universal microfluidic automaton for autonomous sample processing: application to the Mars Organic Analyzer.

    PubMed

    Kim, Jungkyu; Jensen, Erik C; Stockton, Amanda M; Mathies, Richard A

    2013-08-20

    A fully integrated multilayer microfluidic chemical analyzer for automated sample processing and labeling, as well as analysis using capillary zone electrophoresis is developed and characterized. Using lifting gate microfluidic control valve technology, a microfluidic automaton consisting of a two-dimensional microvalve cellular array is fabricated with soft lithography in a format that enables facile integration with a microfluidic capillary electrophoresis device. The programmable sample processor performs precise mixing, metering, and routing operations that can be combined to achieve automation of complex and diverse assay protocols. Sample labeling protocols for amino acid, aldehyde/ketone and carboxylic acid analysis are performed automatically followed by automated transfer and analysis by the integrated microfluidic capillary electrophoresis chip. Equivalent performance to off-chip sample processing is demonstrated for each compound class; the automated analysis resulted in a limit of detection of ~16 nM for amino acids. Our microfluidic automaton provides a fully automated, portable microfluidic analysis system capable of autonomous analysis of diverse compound classes in challenging environments.

  9. [Problems of world outlook and methodology of science integration in biological studies].

    PubMed

    Khododova, Iu D

    1981-01-01

    Problems of world outlook and the methodology of natural-science knowledge are considered based on an analysis of tendencies in the development of the membrane theory of cell processes and the use of principles of biological membrane functioning in solving scientific and applied problems pertaining to different branches of chemistry and biology. The notion of scientific knowledge integration is defined as the interpenetration of approaches, methods and ideas of different branches of knowledge, and the enrichment of their content on this basis, resulting in the augmentation of knowledge in each field taken separately. These processes are accompanied by the appearance of new branches of knowledge - sciences "at the junction" - and their subsequent differentiation. The analysis of some gnoseological situations shows that the integration of sciences contributes to the coordination and agreement of the thinking styles of different specialists, and places demands on the personality of the scientist, in particular requiring high professional mobility. Problems of the organization of scientific activity are considered, which draw the social sciences into the integration processes. The role of philosophy in the integration processes is emphasized.

  10. Marcus canonical integral for non-Gaussian processes and its computation: pathwise simulation and tau-leaping algorithm.

    PubMed

    Li, Tiejun; Min, Bin; Wang, Zhiming

    2013-03-14

    The stochastic integral ensuring the Newton-Leibnitz chain rule is essential in stochastic energetics. The Marcus canonical integral has this property and can be understood as the Wong-Zakai type smoothing limit when the driving process is non-Gaussian. However, this important concept seems not to be well known among physicists. In this paper, we discuss the Marcus integral for non-Gaussian processes and its computation in the context of stochastic energetics. We give a comprehensive introduction to the Marcus integral and compare three equivalent definitions in the literature. We introduce the exact pathwise simulation algorithm and give its error analysis. We show how to compute thermodynamic quantities based on the pathwise simulation algorithm, and highlight the information hidden in the Marcus mapping, which plays the key role in determining thermodynamic quantities. We further propose the tau-leaping algorithm, which advances the process with deterministic time steps when the tau-leaping condition is satisfied. The numerical experiments and efficiency analysis show that it is very promising.
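
    The Marcus mapping mentioned here can be made concrete: at a jump ΔL of the driving process, the state does not move by σ(x)·ΔL directly but flows along the auxiliary ODE φ'(s) = σ(φ(s))·ΔL from φ(0) = x to φ(1), and it is this flow that preserves the ordinary chain rule. Below is a minimal pathwise sketch for a compound-Poisson-driven scalar SDE; the drift, noise coefficient, jump rate, and jump-size law are all illustrative assumptions, not the paper's examples.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def drift(x):
        return -x                       # relaxation drift (assumed)

    def sigma(x):
        return 1.0 + 0.3 * np.sin(x)    # state-dependent noise coefficient (assumed)

    def marcus_map(x, dL, substeps=50):
        """Marcus jump: flow along phi'(s) = sigma(phi) * dL for s in [0, 1]."""
        h = 1.0 / substeps
        for _ in range(substeps):
            x += h * sigma(x) * dL      # Euler substeps through the jump
        return x

    def simulate(x0=1.0, T=10.0, dt=1e-3, rate=2.0):
        """One path of dX = drift dt + sigma (Marcus) dL, with L compound Poisson."""
        x, t = x0, 0.0
        t_jump = rng.exponential(1.0 / rate)    # next arrival of the Poisson clock
        while t < T:
            x += drift(x) * dt                  # deterministic motion between jumps
            t += dt
            if t >= t_jump:                     # a jump of the driving process arrives
                x = marcus_map(x, rng.normal())
                t_jump += rng.exponential(1.0 / rate)
        return x

    print("X_T ~", simulate())
    ```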

  11. Development of Spreadsheet-Based Integrated Transaction Processing Systems and Financial Reporting Systems

    NASA Astrophysics Data System (ADS)

    Ariana, I. M.; Bagiada, I. M.

    2018-01-01

    Development of spreadsheet-based integrated transaction processing systems and financial reporting systems is intended to optimize the capabilities of spreadsheets in accounting data processing. The purposes of this study are: 1) to describe the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 2) to test their technical and operational feasibility. This study is of the research-and-development type. The main steps of the study are: 1) needs analysis (needs assessment); 2) developing the spreadsheet-based integrated transaction processing systems and financial reporting systems; and 3) testing the feasibility of the spreadsheet-based integrated transaction processing systems and financial reporting systems. Technical feasibility covers the ability of the hardware and operating systems to run the accounting application, and its simplicity and ease of use. Operational feasibility covers the ability of users to use the accounting application, the ability of the accounting application to produce information, and the control features of the accounting application. The instrument used to assess the technical and operational feasibility of the systems is an expert perception questionnaire. The instrument uses a 4-point Likert scale, from 1 (strongly disagree) to 4 (strongly agree). Data were analyzed using percentage analysis, comparing the score obtained for an item against the ideal score for that item. The spreadsheet-based integrated transaction processing systems and financial reporting systems integrate sales, purchases, and cash transaction processing systems to produce financial reports (statement of profit or loss and other comprehensive income, statement of changes in equity, statement of financial position, and statement of cash flows) and other reports. The systems are feasible from the technical aspect (87.50%) and the operational aspect (84.17%).
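
    The feasibility percentages reported (87.50% technical, 84.17% operational) follow from simple percentage analysis: the sum of the Likert ratings actually given for an item, divided by the ideal (maximum) sum, times 100. A tiny illustration with invented ratings; the numbers below are not the study's data.

    ```python
    # Hypothetical expert ratings for one questionnaire item on a 1-4 Likert scale.
    ratings = [4, 3, 4, 3, 4, 3]
    ideal = 4 * len(ratings)                 # every expert answering "strongly agree"
    feasibility = 100.0 * sum(ratings) / ideal
    print(f"item feasibility = {feasibility:.2f}%")   # -> 87.50%
    ```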

  12. [Challenges in geriatric rehabilitation: the development of an integrated care pathway].

    PubMed

    Everink, Irma Helga Johanna; van Haastregt, Jolanda C M; Kempen, Gertrudis I J M; Dielis, Leen M J; Maessen, José M C; Schols, Jos M G A

    2015-04-01

    Coordination and continuity of care within geriatric rehabilitation are challenging. To tackle these challenges, an integrated care pathway within geriatric rehabilitation care (hospital, geriatric rehabilitation and follow-up care in the home situation) has been developed. The aim of this article is to expound the process of developing the integrated care pathway, and to describe and discuss the result of this process, the integrated care pathway itself. Development of the integrated care pathway was guided by the first four steps of the theoretical framework for implementation of change of Grol and Wensing: (1) development of a specific proposal for change in practice; (2) analysis of current care practice; (3) analysis of the target group and setting; and (4) development and selection of interventions/strategies for change. The organizations involved in geriatric rehabilitation argued that the integrated care pathway should focus on improving the process of care, including transfer of patients, handovers and communication between care organizations. Current practice, barriers and incentives for change were analyzed through literature research, expert consultation, interviews with the caregivers involved, and the establishment of working groups of health care professionals, patients and informal caregivers. This resulted in valuable proposals for improvement of the care process, which were gathered and combined in the integrated care pathway. The integrated care pathway entails agreements on (a) the triage process in the hospital; (b) active engagement of patients and informal caregivers in the care process; (c) timely and high-quality handovers; and (d) improved communication between caregivers.

  13. Integrated electrocoagulation-electrooxidation process for the treatment of soluble coffee effluent: Optimization of COD degradation and operation time analysis.

    PubMed

    Ibarra-Taquez, Harold N; GilPavas, Edison; Blatchley, Ernest R; Gómez-García, Miguel-Ángel; Dobrosz-Gómez, Izabela

    2017-09-15

    Soluble coffee production generates wastewater containing complex mixtures of organic macromolecules. In this work, a sequential Electrocoagulation-Electrooxidation (EC-EO) process, using aluminum and graphite electrodes, was proposed as an alternative way for the treatment of soluble coffee effluent. Process operational parameters were optimized, achieving total decolorization, as well as 74% and 63.5% of COD and TOC removal, respectively. The integrated EC-EO process yielded a highly oxidized (AOS = 1.629) and biocompatible (BOD5/COD ≈ 0.6) effluent. The Molecular Weight Distribution (MWD) analysis showed that during the EC-EO process, EC effectively decomposed contaminants with molecular weight in the range of 10-30 kDa. In contrast, EO was quite efficient in mineralization of contaminants with molecular weight higher than 30 kDa. A kinetic analysis allowed determination of the time required to meet Colombian permissible discharge limits. Finally, a comprehensive operational cost analysis was performed. The integrated EC-EO process was demonstrated as an efficient alternative for the treatment of industrial effluents resulting from soluble coffee production. Copyright © 2017 Elsevier Ltd. All rights reserved.
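
    The average oxidation state (AOS) quoted above is commonly computed from COD and TOC as AOS = 4 − 1.5·COD/TOC, with COD in mg O2/L and TOC in mg C/L. This is the standard definition rather than anything specific to this paper, and the effluent values below are invented purely to show the arithmetic.

    ```python
    def aos(cod_mg_l, toc_mg_l):
        """Average oxidation state of carbon: 4 - 1.5 * COD/TOC (mass basis)."""
        return 4.0 - 1.5 * cod_mg_l / toc_mg_l

    # Invented post-treatment effluent values chosen to land near the reported AOS.
    print(f"AOS = {aos(790.0, 500.0):.3f}")   # -> 1.630
    ```

    Rising AOS during treatment indicates progressive oxidation of the organic load, which is why it is reported alongside the BOD5/COD biodegradability ratio.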

  14. Performance analysis of Integrated Communication and Control System networks

    NASA Technical Reports Server (NTRS)

    Halevi, Y.; Ray, A.

    1990-01-01

    This paper presents statistical analysis of delays in Integrated Communication and Control System (ICCS) networks that are based on asynchronous time-division multiplexing. The models are obtained in closed form for analyzing control systems with randomly varying delays. The results of this research are applicable to ICCS design for complex dynamical processes like advanced aircraft and spacecraft, autonomous manufacturing plants, and chemical and processing plants.

  15. TRAPR: R Package for Statistical Analysis and Visualization of RNA-Seq Data.

    PubMed

    Lim, Jae Hyun; Lee, Soo Youn; Kim, Ju Han

    2017-03-01

    High-throughput transcriptome sequencing, also known as RNA sequencing (RNA-Seq), is a standard technology for measuring gene expression with unprecedented accuracy. Numerous Bioconductor packages have been developed for the statistical analysis of RNA-Seq data. However, these tools focus on specific aspects of the data analysis pipeline, and are difficult to integrate appropriately with one another due to their disparate data structures and processing methods. They also lack visualization methods to confirm the integrity of the data and the process. In this paper, we propose an R-based RNA-Seq analysis pipeline called TRAPR, an integrated tool that facilitates the statistical analysis and visualization of RNA-Seq expression data. TRAPR provides various functions for data management, the filtering of low-quality data, normalization, transformation, statistical analysis, data visualization, and result visualization that allow researchers to build customized analysis pipelines.

  16. Integrating multiple immunogenetic data sources for feature extraction and mining somatic hypermutation patterns: the case of "towards analysis" in chronic lymphocytic leukaemia.

    PubMed

    Kavakiotis, Ioannis; Xochelli, Aliki; Agathangelidis, Andreas; Tsoumakas, Grigorios; Maglaveras, Nicos; Stamatopoulos, Kostas; Hadzidimitriou, Anastasia; Vlahavas, Ioannis; Chouvarda, Ioanna

    2016-06-06

    Somatic Hypermutation (SHM) refers to the introduction of mutations within rearranged V(D)J genes, a process that increases the diversity of Immunoglobulins (IGs). The analysis of SHM has offered critical insight into the physiology and pathology of B cells, leading to strong prognostication markers for clinical outcome in chronic lymphocytic leukaemia (CLL), the most frequent adult B-cell malignancy. In this paper we present a methodology for integrating multiple immunogenetic and clinicobiological data sources in order to extract features and create high-quality datasets for SHM analysis in IG receptors of CLL patients. This dataset is used as the basis for a higher-level integration procedure, inspired by social choice theory. This is applied in the Towards Analysis, our attempt to investigate the potential ontogenetic transformation of genes belonging to specific stereotyped CLL subsets towards other genes or gene families, through SHM. The data integration process, followed by feature extraction, resulted in the generation of a dataset containing information about mutations occurring through SHM. The Towards analysis, performed on the integrated dataset using voting techniques, revealed the distinct behaviour of subset #201 compared to other subsets, as regards SHM-related movements among gene clans, both in allele-conserved and non-conserved gene areas. With respect to movement between genes, a high percentage of movement towards pseudogenes was found in all CLL subsets. This data integration and feature extraction process can set the basis for exploratory analysis or a fully automated computational data mining approach to many as yet unanswered, clinically relevant biological questions.

  17. Environmental analysis using integrated GIS and remotely sensed data - Some research needs and priorities

    NASA Technical Reports Server (NTRS)

    Davis, Frank W.; Quattrochi, Dale A.; Ridd, Merrill K.; Lam, Nina S.-N.; Walsh, Stephen J.

    1991-01-01

    This paper discusses some basic scientific issues and research needs in the joint processing of remotely sensed and GIS data for environmental analysis. Two general topics are treated in detail: (1) scale dependence of geographic data and the analysis of multiscale remotely sensed and GIS data, and (2) data transformations and information flow during data processing. The discussion of scale dependence focuses on the theory and applications of spatial autocorrelation, geostatistics, and fractals for characterizing and modeling spatial variation. Data transformations during processing are described within the larger framework of geographical analysis, encompassing sampling, cartography, remote sensing, and GIS. Development of better user interfaces between image processing, GIS, database management, and statistical software is needed to expedite research on these and other impediments to integrated analysis of remotely sensed and GIS data.
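
    Spatial autocorrelation, the first scale-dependence tool named here, is typically quantified with Moran's I: a cross-product statistic of deviations from the mean, weighted by spatial adjacency. A compact numpy sketch on a hypothetical one-dimensional transect with nearest-neighbour weights follows; the data values and weighting scheme are illustrative assumptions.

    ```python
    import numpy as np

    x = np.array([3.1, 3.4, 3.3, 5.8, 6.1, 6.0, 2.9, 3.2])  # values along a transect
    n = len(x)

    # Binary nearest-neighbour spatial weights along the transect.
    W = np.zeros((n, n))
    for i in range(n - 1):
        W[i, i + 1] = W[i + 1, i] = 1.0

    z = x - x.mean()
    I = (n / W.sum()) * (z @ W @ z) / (z @ z)    # Moran's I
    print(f"Moran's I = {I:.3f}")                # > 0: neighbours tend to be similar
    ```

    Values near +1 indicate strong clustering of similar values, near 0 spatial randomness, and negative values a checkerboard-like pattern, which is why the statistic is a natural first probe of scale dependence in joint remote-sensing/GIS data.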

  18. Expert system for web based collaborative CAE

    NASA Astrophysics Data System (ADS)

    Hou, Liang; Lin, Zusheng

    2006-11-01

    An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experiences, theories, typical examples and other related knowledge to be used in the pre-processing stage of FEA were categorized into analysis process knowledge and object knowledge. The integrated knowledge model, based on object-oriented and rule-based methods, is then described, along with the integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning. Finally, the analysis process of this expert system in a web-based CAE application is illustrated, and an analysis example of a machine tool column is presented to demonstrate the validity of the system.

  19. MeDICi Software Superglue for Data Analysis Pipelines

    ScienceCinema

    Ian Gorton

    2017-12-09

    The Middleware for Data-Intensive Computing (MeDICi) Integration Framework is an integrated middleware platform developed to solve the data analysis and processing needs of scientists across many domains. MeDICi is scalable, easily modified, and robust across multiple languages, protocols, and hardware platforms, and it is in use today by PNNL scientists for bioinformatics, power grid failure analysis, and text analysis.

  20. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine.

    PubMed

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-02-06

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human-machine-environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines.

  1. A 3D Human-Machine Integrated Design and Analysis Framework for Squat Exercises with a Smith Machine

    PubMed Central

    Lee, Haerin; Jung, Moonki; Lee, Ki-Kwang; Lee, Sang Hun

    2017-01-01

    In this paper, we propose a three-dimensional design and evaluation framework and process based on a probabilistic-based motion synthesis algorithm and biomechanical analysis system for the design of the Smith machine and squat training programs. Moreover, we implemented a prototype system to validate the proposed framework. The framework consists of an integrated human–machine–environment model as well as a squat motion synthesis system and biomechanical analysis system. In the design and evaluation process, we created an integrated model in which interactions between a human body and machine or the ground are modeled as joints with constraints at contact points. Next, we generated Smith squat motion using the motion synthesis program based on a Gaussian process regression algorithm with a set of given values for independent variables. Then, using the biomechanical analysis system, we simulated joint moments and muscle activities from the input of the integrated model and squat motion. We validated the model and algorithm through physical experiments measuring the electromyography (EMG) signals, ground forces, and squat motions as well as through a biomechanical simulation of muscle forces. The proposed approach enables the incorporation of biomechanics in the design process and reduces the need for physical experiments and prototypes in the development of training programs and new Smith machines. PMID:28178184
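
    A minimal version of the Gaussian process regression step used here for motion synthesis can be written in a few lines of numpy: fit an RBF-kernel GP to training pairs and read off the posterior mean at new inputs. The kernel hyperparameters, noise level, and scalar training data below are illustrative assumptions; the actual system regresses full squat postures on subject and equipment variables.

    ```python
    import numpy as np

    def rbf(A, B, length=1.0, var=1.0):
        """Squared-exponential kernel matrix between input sets A and B."""
        d2 = (A[:, None, :] - B[None, :, :]) ** 2
        return var * np.exp(-0.5 * d2.sum(-1) / length**2)

    # Training data: independent variables -> observed motion feature (invented).
    X = np.array([[0.0], [0.5], [1.0], [1.5], [2.0]])
    y = np.sin(X[:, 0])                      # stand-in for a joint-angle feature
    noise = 1e-4                             # observation noise variance (assumed)

    K = rbf(X, X) + noise * np.eye(len(X))
    alpha = np.linalg.solve(K, y)            # weights of the GP posterior mean

    # Posterior mean prediction at new inputs (GP regression).
    X_new = np.array([[0.25], [1.75]])
    mean = rbf(X_new, X) @ alpha
    print(mean)
    ```

    Synthesizing a motion then amounts to evaluating such posterior means, one per posture degree of freedom, at the desired values of the independent variables.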

  2. Dictionary-based image reconstruction for superresolution in integrated circuit imaging.

    PubMed

    Cilingiroglu, T Berkin; Uyar, Aydan; Tuysuzoglu, Ahmet; Karl, W Clem; Konrad, Janusz; Goldberg, Bennett B; Ünlü, M Selim

    2015-06-01

    Resolution improvement through signal processing techniques for integrated circuit imaging is becoming more crucial as the rapid decrease in integrated circuit dimensions continues. Although there is a significant effort to push the limits of optical resolution for backside fault analysis through the use of solid immersion lenses, higher order laser beams, and beam apodization, signal processing techniques are required for additional improvement. In this work, we propose a sparse image reconstruction framework which couples overcomplete dictionary-based representation with a physics-based forward model to improve resolution and localization accuracy in high numerical aperture confocal microscopy systems for backside optical integrated circuit analysis. The effectiveness of the framework is demonstrated on experimental data.

  3. Wake acoustic analysis and image decomposition via beamforming of microphone signal projections on wavelet subspaces

    DOT National Transportation Integrated Search

    2006-05-08

    This paper describes the integration of wavelet analysis and time-domain beamforming : of microphone array output signals for analyzing the acoustic emissions from airplane : generated wake vortices. This integrated process provides visual and quanti...

  4. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and to leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization problem have been successfully integrated into the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulations for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  5. Information Flow in an Atmospheric Model and Data Assimilation

    ERIC Educational Resources Information Center

    Yoon, Young-noh

    2011-01-01

    Weather forecasting consists of two processes, model integration and analysis (data assimilation). During the model integration, the state estimate produced by the analysis evolves to the next cycle time according to the atmospheric model to become the background estimate. The analysis then produces a new state estimate by combining the background…

  6. Development of a Cost Estimation Process for Human Systems Integration Practitioners During the Analysis of Alternatives

    DTIC Science & Technology

    2010-12-01

    processes. Novice estimators must often use these complicated cost estimation tools (e.g., ACEIT, SEER-H, SEER-S, PRICE-H, PRICE-S, etc.) until... However, the thesis will leverage the processes embedded in cost estimation tools such as the Automated Cost Estimating Integration Tool (ACEIT) and the

  7. The effects of computer-assisted instruction and locus of control upon preservice elementary teachers' acquisition of the integrated science process skills

    NASA Astrophysics Data System (ADS)

    Wesley, Beth Eddinger; Krockover, Gerald H.; Devito, Alfred

    The purpose of this study was to determine the effects of computer-assisted instruction (CAI) versus a text mode of programmed instruction (PI), and the cognitive style of locus of control, on preservice elementary teachers' achievement of the integrated science process skills. Eighty-one preservice elementary teachers in six sections of a science methods class were classified as internally or externally controlled. The sections were randomly assigned to receive instruction in the integrated science process skills via a microcomputer or printed text. The study used a pretest-posttest control group design. Before assessing main and interaction effects, analysis of covariance was used to adjust posttest scores using the pretest scores. Statistical analysis revealed that main effects were not significant. Additionally, no interaction effects between treatments and loci of control were demonstrated. The results suggest that printed PI and tutorial CAI are equally effective modes of instruction for teaching internally and externally oriented preservice elementary teachers the integrated science process skills.
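
    The covariance adjustment described, adjusting posttest scores for pretest scores before testing group effects, is standard ANCOVA. A hedged sketch using statsmodels' formula interface on invented scores follows; the column names and data are illustrative, not the study's.

    ```python
    import pandas as pd
    import statsmodels.formula.api as smf

    # Invented pretest/posttest scores for two instruction modes (CAI vs text PI).
    df = pd.DataFrame({
        "pre":   [12, 15, 11, 14, 13, 16, 10, 15],
        "post":  [18, 22, 17, 21, 19, 23, 16, 22],
        "group": ["CAI", "CAI", "CAI", "CAI", "text", "text", "text", "text"],
    })

    # ANCOVA: posttest modeled on group with pretest as covariate.
    model = smf.ols("post ~ pre + C(group)", data=df).fit()
    print(model.summary().tables[1])   # group coefficient = adjusted treatment effect
    ```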

  8. Thermal Model Development for Ares I-X

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; DelCorso, Joe

    2008-01-01

    Thermal analysis for the Ares I-X vehicle has involved extensive thermal model integration, since thermal models of vehicle elements came from several different NASA and industry organizations. Many valuable lessons were learned in terms of model integration and validation. Modeling practices such as submodel, analysis group and symbol naming were standardized to facilitate the later model integration. Upfront coordination of coordinate systems, timelines, units, symbols and case scenarios was very helpful in minimizing integration rework. A process for model integration was developed that included pre-integration runs and basic checks of both models, and a step-by-step process to efficiently integrate one model into another. Extensive use of model logic was used to create scenarios and timelines for avionics and air flow activation. Efficient methods of model restart between case scenarios were developed. Standardization of software version and even compiler version between organizations was found to be essential. An automated method for applying aeroheating to the full integrated vehicle model, including submodels developed by other organizations, was developed.

  9. KA-SB: from data integration to large scale reasoning

    PubMed Central

    Roldán-García, María del Mar; Navas-Delgado, Ismael; Kerzazi, Amine; Chniber, Othmane; Molina-Castro, Joaquín; Aldana-Montes, José F

    2009-01-01

    Background The analysis of information in the biological domain is usually focused on the analysis of data from single on-line data sources. Unfortunately, studying a biological process requires having access to disperse, heterogeneous, autonomous data sources. In this context, an analysis of the information is not possible without the integration of such data. Methods KA-SB is a querying and analysis system for final users based on combining a data integration solution with a reasoner. Thus, the tool has been created with a process divided into two steps: 1) KOMF, the Khaos Ontology-based Mediator Framework, is used to retrieve information from heterogeneous and distributed databases; 2) the integrated information is crystallized in a (persistent and high-performance) reasoner (DBOWL). This information can then be further analyzed (by means of querying and reasoning). Results In this paper we present a novel system that combines the use of a mediation system with the reasoning capabilities of a large-scale reasoner to provide a way of finding new knowledge and of analyzing the integrated information from different databases, which is retrieved as a set of ontology instances. This tool uses a graphical query interface to build user queries easily, showing a graphical representation of the ontology and allowing users to build queries by clicking on the ontology concepts. Conclusion These kinds of systems (based on KOMF) will provide users with very large amounts of information (interpreted as ontology instances once retrieved), which cannot be managed using traditional main-memory-based reasoners. We propose a process for creating persistent and scalable knowledge bases from sets of OWL instances obtained by integrating heterogeneous data sources with KOMF. This process has been applied to develop a demo tool, which uses the BioPax Level 3 ontology as the integration schema, and integrates the UNIPROT, KEGG, CHEBI, BRENDA and SABIORK databases. PMID:19796402

  10. CONFIG: Integrated engineering of systems and their operation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Ryan, Dan; Fleming, Land

    1994-01-01

    This article discusses CONFIG 3, a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle by enabling isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures, and operations of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. CONFIG supports integration among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. CONFIG is designed to support integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems.

  11. Process Integration and Optimization of ICME Carbon Fiber Composites for Vehicle Lightweighting: A Preliminary Development

    DOE PAGES

    Xu, Hongyi; Li, Yang; Zeng, Danielle

    2017-01-02

    Process integration and optimization is the key enabler of the Integrated Computational Materials Engineering (ICME) of carbon fiber composites. In this paper, automated workflows are developed for two types of composites: Sheet Molding Compound (SMC) short-fiber composites, and multi-layer unidirectional (UD) composites. For SMC, the proposed workflow integrates material processing simulation, microstructure representative volume element (RVE) models, material property prediction, and structural performance simulation to enable multiscale, multidisciplinary analysis and design. Processing parameters, microstructure parameters, and vehicle subframe geometry parameters are defined as the design variables; the stiffness and weight of the structure are defined as the responses. For the multi-layer UD structure, this work focuses on the discussion of different design representation methods and their impacts on optimization performance. Challenges in ICME process integration and optimization are also summarized and highlighted. Two case studies are conducted to demonstrate the integrated process and its application in optimization.
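
    As a hedged sketch of the kind of automated workflow-plus-optimization loop this record describes, the snippet below wires toy surrogate models (standing in for the processing, RVE, and structural simulations) into a constrained optimization over processing, microstructure, and geometry variables. All functions and constants are illustrative assumptions, not the paper's models.

```python
# Minimal sketch of an ICME-style optimization loop: design variables in,
# stiffness/weight responses out. The surrogates are toy stand-ins for the
# processing, RVE, and structural simulations named in the abstract.
import numpy as np
from scipy.optimize import minimize

def surrogate_responses(x):
    """x = [fiber_volume_fraction, thickness_mm, rib_spacing_mm] -> (stiffness, weight)."""
    vf, t, s = x
    stiffness = 40.0 * vf * t / (1.0 + 0.002 * s)       # toy stiffness response
    weight = 1.5 * t * (1.0 + 0.5 * vf) * (100.0 / s)   # toy weight response
    return stiffness, weight

def objective(x):
    return surrogate_responses(x)[1]            # minimize weight ...

def stiffness_margin(x):
    return surrogate_responses(x)[0] - 25.0     # ... subject to stiffness >= 25

result = minimize(objective, x0=[0.3, 2.0, 50.0], method="SLSQP",
                  bounds=[(0.1, 0.6), (1.0, 5.0), (20.0, 100.0)],
                  constraints=[{"type": "ineq", "fun": stiffness_margin}])
print("design variables:", np.round(result.x, 3))
print("stiffness, weight:", surrogate_responses(result.x))
```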

  12. Fuel ethanol production: process design trends and integration opportunities.

    PubMed

    Cardona, Carlos A; Sánchez, Oscar J

    2007-09-01

    Current fuel ethanol research and development deals with process engineering trends for improving biotechnological production of ethanol. In this work, the key role that process design plays during the development of cost-effective technologies is recognized through the analysis of major trends in process synthesis, modeling, simulation, and optimization related to ethanol production. Main directions in the techno-economic evaluation of fuel ethanol processes are described, as well as some prospective configurations. The most promising alternatives for compensating ethanol production costs by the generation of valuable co-products are analyzed. Opportunities for integration of fuel ethanol production processes and their implications are underlined. Main ways of process intensification through reaction-reaction, reaction-separation, and separation-separation processes are analyzed in the case of bioethanol production. Some examples of energy integration during ethanol production are also highlighted. Finally, some concluding considerations on current and future research tendencies in fuel ethanol production regarding process design and integration are presented.

  13. Integrative omics analysis. A study based on Plasmodium falciparum mRNA and protein data.

    PubMed

    Tomescu, Oana A; Mattanovich, Diethard; Thallinger, Gerhard G

    2014-01-01

    Technological improvements have shifted the focus from data generation to data analysis. The availability of large amounts of data from transcriptomics, proteomics and metabolomics experiments raises new questions concerning suitable integrative analysis methods. We compare three integrative analysis techniques (co-inertia analysis, generalized singular value decomposition and integrative biclustering) by applying them to gene and protein abundance data from the six life cycle stages of Plasmodium falciparum. Co-inertia analysis is an analysis method used to visualize and explore gene and protein data. The generalized singular value decomposition has shown its potential in the analysis of two transcriptome data sets. Integrative Biclustering applies biclustering to gene and protein data. Using CIA, we visualize the six life cycle stages of Plasmodium falciparum, as well as GO terms, in a 2D plane and interpret the spatial configuration. With GSVD, we decompose the transcriptomic and proteomic data sets into matrices with biologically meaningful interpretations and explore the processes captured by the data sets. IBC identifies groups of genes, proteins, GO terms and life cycle stages of Plasmodium falciparum. We show method-specific results as well as a network view of the life cycle stages based on the results common to all three methods. Additionally, by combining the results of the three methods, we create a three-fold validated network of life cycle stage specific GO terms: Sporozoites are associated with transcription and transport; merozoites with entry into host cell as well as biosynthetic and metabolic processes; rings with oxidation-reduction processes; trophozoites with glycolysis and energy production; schizonts with antigenic variation and immune response; gametocytes with DNA packaging and mitochondrial transport. Furthermore, the network connectivity underlines the separation of the intraerythrocytic cycle from the gametocyte and sporozoite stages. Using integrative analysis techniques, we can integrate knowledge from different levels and obtain a wider view of the system under study. The overlap between method-specific and common results is considerable, even if the basic mathematical assumptions are very different. The three-fold validated network of life cycle stage characteristics of Plasmodium falciparum could identify a large amount of the known associations from literature in only one study.

  14. Integrative omics analysis. A study based on Plasmodium falciparum mRNA and protein data

    PubMed Central

    2014-01-01

    Background Technological improvements have shifted the focus from data generation to data analysis. The availability of large amounts of data from transcriptomics, proteomics and metabolomics experiments raises new questions concerning suitable integrative analysis methods. We compare three integrative analysis techniques (co-inertia analysis, generalized singular value decomposition and integrative biclustering) by applying them to gene and protein abundance data from the six life cycle stages of Plasmodium falciparum. Co-inertia analysis is an analysis method used to visualize and explore gene and protein data. The generalized singular value decomposition has shown its potential in the analysis of two transcriptome data sets. Integrative Biclustering applies biclustering to gene and protein data. Results Using CIA, we visualize the six life cycle stages of Plasmodium falciparum, as well as GO terms, in a 2D plane and interpret the spatial configuration. With GSVD, we decompose the transcriptomic and proteomic data sets into matrices with biologically meaningful interpretations and explore the processes captured by the data sets. IBC identifies groups of genes, proteins, GO terms and life cycle stages of Plasmodium falciparum. We show method-specific results as well as a network view of the life cycle stages based on the results common to all three methods. Additionally, by combining the results of the three methods, we create a three-fold validated network of life cycle stage specific GO terms: Sporozoites are associated with transcription and transport; merozoites with entry into host cell as well as biosynthetic and metabolic processes; rings with oxidation-reduction processes; trophozoites with glycolysis and energy production; schizonts with antigenic variation and immune response; gametocytes with DNA packaging and mitochondrial transport. Furthermore, the network connectivity underlines the separation of the intraerythrocytic cycle from the gametocyte and sporozoite stages. Conclusion Using integrative analysis techniques, we can integrate knowledge from different levels and obtain a wider view of the system under study. The overlap between method-specific and common results is considerable, even if the basic mathematical assumptions are very different. The three-fold validated network of life cycle stage characteristics of Plasmodium falciparum could identify a large amount of the known associations from literature in only one study. PMID:25033389

  15. AIRSAR Automated Web-based Data Processing and Distribution System

    NASA Technical Reports Server (NTRS)

    Chu, Anhua; vanZyl, Jakob; Kim, Yunjin; Lou, Yunling; Imel, David; Tung, Wayne; Chapman, Bruce; Durden, Stephen

    2005-01-01

    In this paper, we present an integrated, end-to-end synthetic aperture radar (SAR) processing system that accepts data processing requests, submits processing jobs, performs quality analysis, and delivers and archives processed data. This fully automated SAR processing system utilizes database and internet/intranet web technologies to allow external users to browse and submit data processing requests and receive processed data. It is a cost-effective way to manage a robust SAR processing and archival system. The integration of these functions has reduced operator errors and increased processor throughput dramatically.

  16. Integrating the Medical Home into the EHDI Process

    ERIC Educational Resources Information Center

    Munoz, Karen F.; Nelson, Lauri; Bradham, Tamala S.; Hoffman, Jeff; Houston, K. Todd

    2011-01-01

    State coordinators of early hearing detection and intervention (EHDI) programs completed a strengths, weaknesses, opportunities, and threats, or SWOT, analysis that examined 12 areas within state EHDI programs. Related to how the medical home is integrated into the EHDI process, 273 items were listed by 48 coordinators, and themes were identified…

  17. Factors in Teacher Adherence to Treatment.

    ERIC Educational Resources Information Center

    Gum, Louann

    Treatment integrity, a measure of how accurately a treatment is carried out, is integral to the concept of effective behavioral analysis and intervention. This study sought to correlate teachers' perceptions of the functional behavior assessment and behavior intervention process (FBA/BIP) with their confidence that the process is an effective and…

  18. Functional Integration

    NASA Astrophysics Data System (ADS)

    Cartier, Pierre; DeWitt-Morette, Cecile

    2006-11-01

    Acknowledgements; List of symbols, conventions, and formulary; Part I. The Physical and Mathematical Environment: 1. The physical and mathematical environment; Part II. Quantum Mechanics: 2. First lesson: gaussian integrals; 3. Selected examples; 4. Semiclassical expansion: WKB; 5. Semiclassical expansion: beyond WKB; 6. Quantum dynamics: path integrals and operator formalism; Part III. Methods from Differential Geometry: 7. Symmetries; 8. Homotopy; 9. Grassmann analysis: basics; 10. Grassmann analysis: applications; 11. Volume elements, divergences, gradients; Part IV. Non-Gaussian Applications: 12. Poisson processes in physics; 13. A mathematical theory of Poisson processes; 14. First exit time: energy problems; Part V. Problems in Quantum Field Theory: 15. Renormalization 1: an introduction; 16. Renormalization 2: scaling; 17. Renormalization 3: combinatorics; 18. Volume elements in quantum field theory Bryce DeWitt; Part VI. Projects: 19. Projects; Appendix A. Forward and backward integrals: spaces of pointed paths; Appendix B. Product integrals; Appendix C. A compendium of gaussian integrals; Appendix D. Wick calculus Alexander Wurm; Appendix E. The Jacobi operator; Appendix F. Change of variables of integration; Appendix G. Analytic properties of covariances; Appendix H. Feynman's checkerboard; Bibliography; Index.

  19. Functional Integration

    NASA Astrophysics Data System (ADS)

    Cartier, Pierre; DeWitt-Morette, Cecile

    2010-06-01

    Acknowledgements; List of symbols, conventions, and formulary; Part I. The Physical and Mathematical Environment: 1. The physical and mathematical environment; Part II. Quantum Mechanics: 2. First lesson: gaussian integrals; 3. Selected examples; 4. Semiclassical expansion: WKB; 5. Semiclassical expansion: beyond WKB; 6. Quantum dynamics: path integrals and operator formalism; Part III. Methods from Differential Geometry: 7. Symmetries; 8. Homotopy; 9. Grassmann analysis: basics; 10. Grassmann analysis: applications; 11. Volume elements, divergences, gradients; Part IV. Non-Gaussian Applications: 12. Poisson processes in physics; 13. A mathematical theory of Poisson processes; 14. First exit time: energy problems; Part V. Problems in Quantum Field Theory: 15. Renormalization 1: an introduction; 16. Renormalization 2: scaling; 17. Renormalization 3: combinatorics; 18. Volume elements in quantum field theory Bryce DeWitt; Part VI. Projects: 19. Projects; Appendix A. Forward and backward integrals: spaces of pointed paths; Appendix B. Product integrals; Appendix C. A compendium of gaussian integrals; Appendix D. Wick calculus Alexander Wurm; Appendix E. The Jacobi operator; Appendix F. Change of variables of integration; Appendix G. Analytic properties of covariances; Appendix H. Feynman's checkerboard; Bibliography; Index.

  20. FUZZY DECISION ANALYSIS FOR INTEGRATED ENVIRONMENTAL VULNERABILITY ASSESSMENT OF THE MID-ATLANTIC REGION

    EPA Science Inventory


    A fuzzy decision analysis method for integrating ecological indicators is developed. This is a combination of a fuzzy ranking method and the Analytic Hierarchy Process (AHP). The method is capable of ranking ecosystems in terms of environmental conditions and suggesting cumula...

  1. Varied Human Tolerance to the Combined Conditions of Low Contrast and Diminished Luminance: A Quasi-Meta Analysis

    DTIC Science & Technology

    2017-08-30

    as being three-fold: 1) a measurement of the integrity of both the central and peripheral visual processing centers; 2) an indicator of detail...visual assessment task 12 integral to the Army’s Class 1 Flight Physical (Ginsburg, 1981 and 1984; Bachman & Behar, 1986). During a Class 1 flight...systems. Meta-analysis has been defined as the statistical analysis of a collection of analytical results for the purpose of integrating the findings

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Hongyi; Li, Yang; Zeng, Danielle

    Process integration and optimization is the key enabler of the Integrated Computational Materials Engineering (ICME) of carbon fiber composites. In this paper, automated workflows are developed for two types of composites: Sheet Molding Compound (SMC) short-fiber composites, and multi-layer unidirectional (UD) composites. For SMC, the proposed workflow integrates material processing simulation, microstructure representative volume element (RVE) models, material property prediction, and structural performance simulation to enable multiscale, multidisciplinary analysis and design. Processing parameters, microstructure parameters, and vehicle subframe geometry parameters are defined as the design variables; the stiffness and weight of the structure are defined as the responses. For the multi-layer UD structure, this work focuses on the discussion of different design representation methods and their impacts on optimization performance. Challenges in ICME process integration and optimization are also summarized and highlighted. Two case studies are conducted to demonstrate the integrated process and its application in optimization.

  3. Fractionation of bamboo culms by autohydrolysis, organosolv delignification and extended delignification: understanding the fundamental chemistry of the lignin during the integrated process.

    PubMed

    Wen, Jia-Long; Sun, Shao-Ni; Yuan, Tong-Qi; Xu, Feng; Sun, Run-Cang

    2013-12-01

    Bamboo (Phyllostachys pubescens) was successfully fractionated using a three-step integrated process: (1) autohydrolysis pretreatment facilitating xylooligosaccharide (XOS) production, (2) organosolv delignification with organic acids to obtain high-purity lignin, and (3) extended delignification with alkaline hydrogen peroxide (AHP) to produce purified pulp. The integrated process was comprehensively evaluated by component analysis, SEM, XRD, and CP-MAS NMR techniques. Notably, the fundamental chemistry of the lignin fragments obtained from the integrated process was thoroughly investigated by gel permeation chromatography and solution-state NMR techniques (quantitative (13)C, 2D-HSQC, and (31)P-NMR spectroscopies). It is believed that the integrated process facilitates the production of XOS, high-purity lignin, and purified pulp. Moreover, the enhanced understanding of the structural features and chemical reactivity of lignin polymers will maximize their utilization in a future biorefinery industry. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. The Integration Process of Very Thin Mirror Shells with a Particular Regard to Simbol-X

    NASA Astrophysics Data System (ADS)

    Basso, S.; Pareschi, G.; Tagliaferri, G.; Mazzoleni, F.; Valtolina, R.; Citterio, O.; Conconi, P.

    2009-05-01

    The optics of Simbol-X are very thin compared to those of previous X-ray missions (like XMM). Their shells are therefore floppy and unable to maintain the correct shape. To avoid deformation of these very thin X-ray optics during the integration process, two stiffening rings with good roundness are adopted. In this article the procedure used for the first three prototypes of the Simbol-X optics is presented, with a description of the problems involved and an analysis of the performance degradation during integration. This analysis has been performed with the UV vertical bench measurements at INAF-OAB.

  5. Model reduction in integrated controls-structures design

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.

    1993-01-01

    It is the objective of this paper to present a model reduction technique developed for the integrated controls-structures design of flexible structures. Integrated controls-structures design problems are typically posed as nonlinear mathematical programming problems, where the design variables consist of both structural and control parameters. In the solution process, both structural and control design variables are constantly changing; therefore, the dynamic characteristics of the structure are also changing. This presents a problem in obtaining a reduced-order model for active control design and analysis which will be valid for all design points within the design space. In other words, the frequency and number of the significant modes of the structure (modes that should be included) may vary considerably throughout the design process. This is also true as the locations and/or masses of the sensors and actuators change. Moreover, since the number of design evaluations in the integrated design process could easily run into thousands, any feasible order-reduction method should not require model reduction analysis at every design iteration. In this paper a novel and efficient technique for model reduction in the integrated controls-structures design process, which addresses these issues, is presented.
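
    A minimal sketch of one common order-reduction idea, modal truncation, is given below; it is a generic illustration (a toy spring-mass chain with a fixed frequency cutoff), not the adaptive technique this paper develops for changing design points.

```python
# Modal truncation sketch: keep only modes below a frequency cutoff.
# The spring-mass chain is a toy stand-in for a flexible-structure model.
import numpy as np
from scipy.linalg import eigh

n = 10
M = np.eye(n)                                                     # lumped masses
K = 100.0 * (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))  # chain stiffness

w2, Phi = eigh(K, M)              # generalized eigenproblem K phi = w^2 M phi
freqs_hz = np.sqrt(w2) / (2.0 * np.pi)

keep = freqs_hz <= 2.0            # retain low-frequency modes for control design
Phi_r = Phi[:, keep]
M_r = Phi_r.T @ M @ Phi_r         # reduced mass matrix
K_r = Phi_r.T @ K @ Phi_r         # reduced stiffness matrix
print(f"retained {keep.sum()} of {n} modes at", np.round(freqs_hz[keep], 2), "Hz")
```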

  6. Integration of Off-Track Sonic Boom Analysis in Conceptual Design of Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Ordaz, Irian; Li, Wu

    2011-01-01

    A highly desired capability for the conceptual design of aircraft is the ability to rapidly and accurately evaluate new concepts to avoid adverse trade decisions that may hinder the development process in the later stages of design. Evaluating the robustness of new low-boom concepts is important for the conceptual design of supersonic aircraft. Here, robustness means that the aircraft configuration has a low-boom ground signature at both under- and off-track locations. An integrated process for off-track boom analysis is developed to facilitate the design of robust low-boom supersonic aircraft. The integrated off-track analysis can also be used to study the sonic boom impact and to plan future flight trajectories where flight conditions and ground elevation might have a significant effect on ground signatures. The key enabler for off-track sonic boom analysis is accurate computational fluid dynamics (CFD) solutions for off-body pressure distributions. To ensure the numerical accuracy of the off-body pressure distributions, a mesh study is performed with Cart3D to determine the mesh requirements for off-body CFD analysis, and comparisons are made between the Cart3D and USM3D results. The variations in ground signatures that result from changes in the initial location of the near-field waveform are also examined. Finally, a complete under- and off-track sonic boom analysis is presented for two distinct supersonic concepts to demonstrate the capability of the integrated analysis process.

  7. Development of economic consequence methodology for process risk analysis.

    PubMed

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
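
    The loss-function step lends itself to a small worked example. The sketch below maps a temperature deviation to two of the loss shapes named in the abstract, a quadratic (Taguchi-style) loss and an inverted normal loss, and sums them per scenario; all constants are illustrative, not from the case studies.

```python
# Sketch of mapping a process deviation to economic loss and integrating
# loss types; constants are illustrative, not from the paper's case studies.
import numpy as np

def taguchi_loss(y, target, k):
    """Quadratic loss: grows without bound as the deviation grows."""
    return k * (y - target) ** 2

def inverted_normal_loss(y, target, max_loss, gamma):
    """Inverted normal loss: saturates at max_loss for large deviations."""
    return max_loss * (1.0 - np.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

temps = np.array([350.0, 355.0, 365.0, 380.0])   # reactor temperature scenarios (K)
production_loss = taguchi_loss(temps, 350.0, k=120.0)
asset_loss = inverted_normal_loss(temps, 350.0, max_loss=5e5, gamma=15.0)
total = production_loss + asset_loss             # integrate the loss types

for T, loss in zip(temps, total):
    print(f"T = {T:5.1f} K -> estimated loss ${loss:,.0f}")
```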

  8. An Integrated Tool for System Analysis of Sample Return Vehicles

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.; Maddock, Robert W.; Winski, Richard G.

    2012-01-01

    The next important step in space exploration is the return of sample materials from extraterrestrial locations to Earth for analysis. Most mission concepts that return sample material to Earth share one common element: an Earth entry vehicle. The analysis and design of entry vehicles is multidisciplinary in nature, requiring the application of mass sizing, flight mechanics, aerodynamics, aerothermodynamics, thermal analysis, structural analysis, and impact analysis tools. Integration of a multidisciplinary problem is a challenging task; the execution process and data transfer among disciplines should be automated and consistent. This paper describes an integrated analysis tool for the design and sizing of an Earth entry vehicle. The current tool includes the following disciplines: mass sizing, flight mechanics, aerodynamics, aerothermodynamics, and impact analysis tools. Python and Java languages are used for integration. Results are presented and compared with the results from previous studies.
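
    A hedged sketch of the automated, consistent hand-off the record emphasizes: each discipline reads from and writes to a shared state, so downstream tools always see upstream results. The discipline functions are toy index models, not the tool's sizing, flight mechanics, or aerothermodynamics codes.

```python
# Sketch of chaining discipline analyses through a shared state dictionary.
# The discipline models are toy stand-ins, not the tool's actual codes.

def mass_sizing(state):
    state["entry_mass_kg"] = 40.0 + 1.2 * state["payload_kg"]
    return state

def flight_mechanics(state):
    # Toy ballistic coefficient and a unitless deceleration index.
    state["beta"] = state["entry_mass_kg"] / (state["drag_coeff"] * state["area_m2"])
    state["decel_index"] = state["entry_speed_kms"] ** 2 / state["beta"] ** 0.5
    return state

def aerothermodynamics(state):
    # Toy heating index: scales with speed cubed and ballistic coefficient.
    state["heating_index"] = (state["entry_speed_kms"] ** 3
                              * (state["beta"] / state["nose_radius_m"]) ** 0.5)
    return state

state = {"payload_kg": 5.0, "drag_coeff": 1.05, "area_m2": 0.8,
         "entry_speed_kms": 12.0, "nose_radius_m": 0.3}
for discipline in (mass_sizing, flight_mechanics, aerothermodynamics):
    state = discipline(state)       # outputs automatically feed the next step
print({k: round(v, 2) for k, v in state.items()})
```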

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zitney, S.E.; McCorkle, D.; Yang, C.

    Process modeling and simulation tools are widely used for the design and operation of advanced power generation systems. These tools enable engineers to solve the critical process systems engineering problems that arise throughout the lifecycle of a power plant, such as designing a new process, troubleshooting a process unit or optimizing operations of the full process. To analyze the impact of complex thermal and fluid flow phenomena on overall power plant performance, the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL) has developed the Advanced Process Engineering Co-Simulator (APECS). The APECS system is an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulations such as those based on computational fluid dynamics (CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper we discuss the initial phases of the integration of the APECS system with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite uses the ActiveX (OLE Automation) controls in the Aspen Plus process simulator wrapped by the CASI library developed by Reaction Engineering International to run process/CFD co-simulations and query for results. This integration represents a necessary step in the development of virtual power plant co-simulations that will ultimately reduce the time, cost, and technical risk of developing advanced power generation systems.

  10. Analysis of metabolomic data: tools, current strategies and future challenges for omics data integration.

    PubMed

    Cambiaghi, Alice; Ferrario, Manuela; Masseroli, Marco

    2017-05-01

    Metabolomics is a rapidly growing field consisting of the analysis of a large number of metabolites at a system scale. The two major goals of metabolomics are the identification of the metabolites characterizing each organism state and the measurement of their dynamics under different situations (e.g. pathological conditions, environmental factors). Knowledge about metabolites is crucial for the understanding of most cellular phenomena, but this information alone is not sufficient to gain a comprehensive view of all the biological processes involved. Integrated approaches combining metabolomics with transcriptomics and proteomics are thus required to obtain much deeper insights than any of these techniques alone. Although this information is available, multilevel integration of different 'omics' data is still a challenge. The handling, processing, analysis and integration of these data require specialized mathematical, statistical and bioinformatics tools, and several technical problems hampering a rapid progress in the field exist. Here, we review four of the several available tools for metabolomic data analysis and integration with other 'omics' data, selected on the basis of number of users or provided features (MetaCore™, MetaboAnalyst, InCroMAP and 3Omics), highlighting their strong and weak aspects; a number of related issues affecting data analysis and integration are also identified and discussed. Overall, we provide an objective description of how some of the main currently available software packages work, which may help the experimental practitioner in the choice of a robust pipeline for metabolomic data analysis and integration. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. The Role of Occupational Identification During Post-Merger Integration

    PubMed Central

    Kroon, David P.; Noorderhaven, Niels G.

    2016-01-01

    Integration processes after mergers are fraught with difficulties, and constitute a main cause of merger failure. This study focuses on the human aspect of post-merger integration, and in particular, on the role of occupational identification. We theorize and empirically demonstrate by means of a survey design that employees’ identification with their occupation is positively related to their willingness to cooperate in the post-merger integration process, over and above the effect of organization members’ organizational identification. This positive effect of occupational identification is stronger for uniformed personnel but attenuates in the course of the integration process. Qualitative interviews further explore and interpret the results from our statistical analysis. Together, these findings have important practical implications and suggest future research directions. PMID:29568214

  12. Model-based analysis of pattern motion processing in mouse primary visual cortex

    PubMed Central

    Muir, Dylan R.; Roth, Morgane M.; Helmchen, Fritjof; Kampa, Björn M.

    2015-01-01

    Neurons in sensory areas of neocortex exhibit responses tuned to specific features of the environment. In visual cortex, information about features such as edges or textures with particular orientations must be integrated to recognize a visual scene or object. Connectivity studies in rodent cortex have revealed that neurons make specific connections within sub-networks sharing common input tuning. In principle, this sub-network architecture enables local cortical circuits to integrate sensory information. However, whether feature integration indeed occurs locally in rodent primary sensory areas has not been examined directly. We studied local integration of sensory features in primary visual cortex (V1) of the mouse by presenting drifting grating and plaid stimuli, while recording the activity of neuronal populations with two-photon calcium imaging. Using a Bayesian model-based analysis framework, we classified single-cell responses as being selective for either individual grating components or for moving plaid patterns. Rather than relying on trial-averaged responses, our model-based framework takes into account single-trial responses and can easily be extended to consider any number of arbitrary predictive models. Our analysis method was able to successfully classify significantly more responses than traditional partial correlation (PC) analysis, and provides a rigorous statistical framework to rank any number of models and reject poorly performing models. We also found a large proportion of cells that respond strongly to only one stimulus class. In addition, a quarter of selectively responding neurons had more complex responses that could not be explained by any simple integration model. Our results show that a broad range of pattern integration processes already take place at the level of V1. This diversity of integration is consistent with processing of visual inputs by local sub-networks within V1 that are tuned to combinations of sensory features. PMID:26300738
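
    The model-ranking idea can be sketched compactly: score each candidate tuning model by its likelihood on single-trial responses and keep the best. The tuning curves, noise level, and trial counts below are invented for illustration and are far simpler than the paper's Bayesian framework.

```python
# Sketch of likelihood-based ranking of component vs pattern tuning models
# on single-trial responses; tuning curves and noise are illustrative.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
directions = np.arange(0, 360, 30)          # stimulus directions (degrees)

def component_pred(d):   # lobes at the two plaid component directions
    return np.exp(2 * np.cos(np.deg2rad(d - 60))) + np.exp(2 * np.cos(np.deg2rad(d + 60)))

def pattern_pred(d):     # single lobe at the pattern direction
    return 2 * np.exp(2 * np.cos(np.deg2rad(d)))

# Simulate 10 trials per direction from a pattern-selective cell.
trials = pattern_pred(directions) + rng.normal(0.0, 1.0, (10, directions.size))

def log_likelihood(pred):
    return norm.logpdf(trials, loc=pred, scale=1.0).sum()

scores = {"component": log_likelihood(component_pred(directions)),
          "pattern": log_likelihood(pattern_pred(directions))}
print(scores, "-> best model:", max(scores, key=scores.get))
```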

  13. Non-Integrated Information and Communication Technologies in the Kidney Transplantation Process in Brazil.

    PubMed

    Peres Penteado, Alissa; Fábio Maciel, Rafael; Erbs, João; Feijó Ortolani, Cristina Lucia; Aguiar Roza, Bartira; Torres Pisa, Ivan

    2015-01-01

    The entire kidney transplantation process in Brazil is defined through laws, decrees, ordinances, and resolutions, but there is no defined theoretical map describing this process. From such a representation it is possible to perform analyses, such as the identification of bottlenecks and of the information and communication technologies (ICTs) that support the process. The aim of this study was to analyze and represent the kidney transplantation workflow using business process modeling notation (BPMN) and then to identify the ICTs involved in the process. This study was conducted in eight steps, including document analysis and professional evaluation. The results include the BPMN model of the kidney transplantation process in Brazil and the identification of ICTs. We found substantial delays in the process, owing to the many different non-integrated ICTs involved, which can cause information to be poorly integrated.

  14. Preliminary results from the High Speed Airframe Integration Research project

    NASA Technical Reports Server (NTRS)

    Coen, Peter G.; Sobieszczanski-Sobieski, Jaroslaw; Dollyhigh, Samuel M.

    1992-01-01

    A review is presented of the accomplishment of the near term objectives of developing an analysis system and optimization methods during the first year of the NASA Langley High Speed Airframe Integration Research (HiSAIR) project. The characteristics of a Mach 3 HSCT transport have been analyzed utilizing the newly developed process. In addition to showing more detailed information about the aerodynamic and structural coupling for this type of vehicle, this exercise aided in further refining the data requirements for the analysis process.

  15. A computational modeling of semantic knowledge in reading comprehension: Integrating the landscape model with latent semantic analysis.

    PubMed

    Yeari, Menahem; van den Broek, Paul

    2016-09-01

    It is a well-accepted view that the prior semantic (general) knowledge that readers possess plays a central role in reading comprehension. Nevertheless, computational models of reading comprehension have not integrated the simulation of semantic knowledge and online comprehension processes under a unified mathematical algorithm. The present article introduces a computational model that integrates the landscape model of comprehension processes with latent semantic analysis representation of semantic knowledge. In three sets of simulations of previous behavioral findings, the integrated model successfully simulated the activation and attenuation of predictive and bridging inferences during reading, as well as centrality estimations and recall of textual information after reading. Analyses of the computational results revealed new theoretical insights regarding the underlying mechanisms of the various comprehension phenomena.
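
    A toy sketch of the coupling: concept vectors (standing in for a trained LSA space) provide semantic input via cosine similarity, and a landscape-style update carries activation across reading cycles with decay. The vectors, concepts, and decay constant are invented for illustration.

```python
# Toy coupling of a semantic space to a landscape-style activation update;
# the concept vectors stand in for a trained LSA space.
import numpy as np

concepts = ["storm", "ship", "sink", "rescue"]
vectors = np.array([[0.9, 0.1, 0.0],
                    [0.7, 0.6, 0.1],
                    [0.6, 0.3, 0.7],
                    [0.1, 0.2, 0.9]])

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

decay = 0.5                          # carry-over of activation between cycles
activation = np.zeros(len(concepts))
sentences = [vectors[0], vectors[1], vectors[2]]   # one toy vector per sentence

for cycle, sent_vec in enumerate(sentences, start=1):
    semantic_input = np.array([cosine(v, sent_vec) for v in vectors])
    activation = decay * activation + semantic_input   # landscape-style update
    print(f"cycle {cycle}: " + ", ".join(
        f"{c}={a:.2f}" for c, a in zip(concepts, activation)))
```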

  16. Depth Cue Integration in an Active Control Paradigm

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K.; Sweet, Barabara T.; Shafto, Meredith; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    Numerous models of depth cue integration have been proposed. Of particular interest is how the visual system processes discrepant cues, as might arise when viewing synthetic displays. A powerful paradigm for examining this integration process can be adapted from manual control research. This methodology introduces independent disturbances in the candidate cues, then performs spectral analysis of subjects' resulting motoric responses (e.g., depth matching). We will describe this technique and present initial findings.
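
    The sketch below illustrates the paradigm under stated assumptions: two depth cues are disturbed at disjoint frequencies, a simulated response mixes them with noise, and the FFT of the response reveals how strongly each cue drives behavior. Sampling rate, frequencies, gains, and noise are illustrative choices.

```python
# Sketch of the disturbance-and-spectral-analysis paradigm; the simulated
# response and all gains are illustrative, not experimental data.
import numpy as np

fs, T = 60.0, 60.0                       # 60 Hz sampling, 60 s trial
t = np.arange(0.0, T, 1.0 / fs)
f_disparity, f_size = 0.35, 0.55         # disjoint disturbance frequencies (Hz)

disp_dist = np.sin(2 * np.pi * f_disparity * t)
size_dist = np.sin(2 * np.pi * f_size * t)

# Simulated response: the weights play the role of cue weighting in integration.
rng = np.random.default_rng(1)
response = 0.8 * disp_dist + 0.3 * size_dist + 0.1 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(response)) / t.size
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
for name, f in [("disparity", f_disparity), ("size", f_size)]:
    idx = np.argmin(np.abs(freqs - f))   # disturbance frequencies sit on FFT bins
    print(f"response power at {name} frequency ({f} Hz): {spectrum[idx]:.3f}")
```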

  17. Low-cost solar array project and Proceedings of the 14th Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    Mcdonald, R. R.

    1980-01-01

    Activities are reported in the following areas: project analysis and integration; technology development in silicon material, large-area sheet silicon, and encapsulation; production process and equipment development; and engineering and operations, along with the steps taken to integrate these efforts. Visual materials presented at the Project Integration Meeting are included.

  18. Targeted and untargeted-metabolite profiling to track the compositional integrity of ginger during processing using digitally-enhanced HPTLC pattern recognition analysis.

    PubMed

    Ibrahim, Reham S; Fathy, Hoda

    2018-03-30

    The impact of commonly applied post-harvest and industrial processing practices on the compositional integrity of ginger rhizome was tracked in this work. Untargeted metabolite profiling was performed using a digitally-enhanced HPTLC method, in which the chromatographic fingerprints were extracted using ImageJ software and then analysed with multivariate Principal Component Analysis (PCA) for pattern recognition. A targeted approach was applied using a new, validated, simple and fast HPTLC image analysis method for simultaneous quantification of the officially recognized markers 6-, 8-, 10-gingerol and 6-shogaol, in conjunction with chemometric Hierarchical Clustering Analysis (HCA). The results of both targeted and untargeted metabolite profiling revealed that the peeling, drying, and storage employed during processing have a great influence on the ginger chemo-profile; the different forms of processed ginger should not be used interchangeably. Moreover, it is deemed necessary to consider the holistic metabolic profile for comprehensive evaluation of ginger during processing. Copyright © 2018. Published by Elsevier B.V.
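
    As a hedged sketch of the untargeted branch of this workflow, the snippet below builds toy densitometric fingerprints (standing in for ImageJ lane profiles from HPTLC plates) and projects them with PCA so that processing groups separate along the leading components. Peak positions, heights, and group labels are invented.

```python
# Sketch of PCA pattern recognition on HPTLC-style fingerprints; the
# simulated densitograms stand in for ImageJ-extracted lane profiles.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
rf_axis = np.linspace(0.0, 1.0, 200)

def densitogram(bands):
    """Toy lane profile: Gaussian bands at (Rf, height) positions."""
    return sum(h * np.exp(-((rf_axis - rf) ** 2) / 2e-4) for rf, h in bands)

fresh = [densitogram([(0.3, 1.0), (0.5, 0.8)]) + 0.05 * rng.standard_normal(200)
         for _ in range(5)]
dried = [densitogram([(0.3, 0.4), (0.5, 0.8), (0.7, 0.6)]) + 0.05 * rng.standard_normal(200)
         for _ in range(5)]

X = np.vstack(fresh + dried)
scores = PCA(n_components=2).fit_transform(X)
print("PC1 scores (5 fresh, then 5 dried):", np.round(scores[:, 0], 2))
```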

  19. Integration of Engine, Plume, and CFD Analyses in Conceptual Design of Low-Boom Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Li, Wu; Campbell, Richard; Geiselhart, Karl; Shields, Elwood; Nayani, Sudheer; Shenoy, Rajiv

    2009-01-01

    This paper documents an integration of engine, plume, and computational fluid dynamics (CFD) analyses in the conceptual design of low-boom supersonic aircraft, using a variable fidelity approach. In particular, the Numerical Propulsion Simulation System (NPSS) is used for propulsion system cycle analysis and nacelle outer mold line definition, and a low-fidelity plume model is developed for plume shape prediction based on NPSS engine data and nacelle geometry. This model provides a capability for the conceptual design of low-boom supersonic aircraft that accounts for plume effects. Then a newly developed process for automated CFD analysis is presented for CFD-based plume and boom analyses of the conceptual geometry. Five test cases are used to demonstrate the integrated engine, plume, and CFD analysis process based on a variable fidelity approach, as well as the feasibility of the automated CFD plume and boom analysis capability.

  20. Crystallographic data processing for free-electron laser sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Thomas A., E-mail: taw@physics.org; Barty, Anton; Stellato, Francesco

    2013-07-01

    A processing pipeline for diffraction data acquired using the ‘serial crystallography’ methodology with a free-electron laser source is described with reference to the crystallographic analysis suite CrystFEL and the pre-processing program Cheetah. A detailed analysis of the nature and impact of indexing ambiguities is presented. Simulations of the Monte Carlo integration scheme, which accounts for the partially recorded nature of the diffraction intensities, are presented and show that the integration of partial reflections could be made to converge more quickly if the bandwidth of the X-rays were to be increased by a small amount or if a slight convergence angle were introduced into the incident beam.
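
    A toy simulation of the Monte Carlo integration idea follows: each snapshot records a random fraction (partiality) of a reflection's true intensity, and averaging over many snapshots converges on the right value. The uniform partiality model, noise level, and normalization are simplifying assumptions for illustration only.

```python
# Toy Monte Carlo merging: average many partially recorded observations of
# one reflection; the partiality model and noise are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
TRUE_INTENSITY = 1000.0

def merged_estimate(n_snapshots):
    partialities = rng.uniform(0.0, 1.0, n_snapshots)   # fraction recorded per shot
    noise = rng.normal(0.0, 20.0, n_snapshots)
    observations = TRUE_INTENSITY * partialities + noise
    # Normalize by the mean partiality (known here by construction).
    return observations.mean() / partialities.mean()

for n in (10, 100, 1000, 10000):
    print(f"{n:6d} snapshots -> merged intensity ~ {merged_estimate(n):7.1f}")
```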

  1. Integration and segregation in auditory scene analysis

    NASA Astrophysics Data System (ADS)

    Sussman, Elyse S.

    2005-03-01

    Assessment of the neural correlates of auditory scene analysis, using an index of sound change detection that does not require the listener to attend to the sounds [a component of event-related brain potentials called the mismatch negativity (MMN)], has previously demonstrated that segregation processes can occur without attention focused on the sounds and that within-stream contextual factors influence how sound elements are integrated and represented in auditory memory. The current study investigated the relationship between the segregation and integration processes when they were called upon to function together. The pattern of MMN results showed that the integration of sound elements within a sound stream occurred after the segregation of sounds into independent streams and, further, that the individual streams were subject to contextual effects. These results are consistent with a view of auditory processing that suggests that the auditory scene is rapidly organized into distinct streams and the integration of sequential elements into perceptual units takes place on the already formed streams. This would allow for the flexibility required to identify changing within-stream sound patterns, needed to appreciate music or comprehend speech.

  2. Fully integrated wearable sensor arrays for multiplexed in situ perspiration analysis

    DOE PAGES

    Gao, Wei; Emaminejad, Sam; Nyein, Hnin Yin Yin; ...

    2016-01-27

    We report that wearable sensor technologies are essential to the realization of personalized medicine through continuously monitoring an individual’s state of health. Sampling human sweat, which is rich in physiological information, could enable non-invasive monitoring. Previously reported sweat-based and other non-invasive biosensors either can only monitor a single analyte at a time or lack on-site signal processing circuitry and sensor calibration mechanisms for accurate analysis of the physiological state. Given the complexity of sweat secretion, simultaneous and multiplexed screening of target biomarkers is critical and requires full system integration to ensure the accuracy of measurements. Here we present a mechanically flexible and fully integrated (that is, no external analysis is needed) sensor array for multiplexed in situ perspiration analysis, which simultaneously and selectively measures sweat metabolites (such as glucose and lactate) and electrolytes (such as sodium and potassium ions), as well as the skin temperature (to calibrate the response of the sensors). Lastly, our work bridges the technological gap between signal transduction, conditioning (amplification and filtering), processing and wireless transmission in wearable biosensors by merging plastic-based sensors that interface with the skin with silicon integrated circuits consolidated on a flexible circuit board for complex signal processing.

  3. Fully integrated wearable sensor arrays for multiplexed in situ perspiration analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, Wei; Emaminejad, Sam; Nyein, Hnin Yin Yin

    We report that wearable sensor technologies are essential to the realization of personalized medicine through continuously monitoring an individual’s state of health. Sampling human sweat, which is rich in physiological information, could enable non-invasive monitoring. Previously reported sweat-based and other non-invasive biosensors either can only monitor a single analyte at a time or lack on-site signal processing circuitry and sensor calibration mechanisms for accurate analysis of the physiological state. Given the complexity of sweat secretion, simultaneous and multiplexed screening of target biomarkers is critical and requires full system integration to ensure the accuracy of measurements. Here we present a mechanically flexible and fully integrated (that is, no external analysis is needed) sensor array for multiplexed in situ perspiration analysis, which simultaneously and selectively measures sweat metabolites (such as glucose and lactate) and electrolytes (such as sodium and potassium ions), as well as the skin temperature (to calibrate the response of the sensors). Lastly, our work bridges the technological gap between signal transduction, conditioning (amplification and filtering), processing and wireless transmission in wearable biosensors by merging plastic-based sensors that interface with the skin with silicon integrated circuits consolidated on a flexible circuit board for complex signal processing.

  4. Towards the Integration of APECS with VE-Suite to Create a Comprehensive Virtual Engineering Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCorkle, D.; Yang, C.; Jordan, T.

    2007-06-01

    Modeling and simulation tools are becoming pervasive in the process engineering practice of designing advanced power generation facilities. These tools enable engineers to explore many what-if scenarios before cutting metal or constructing a pilot scale facility. While such tools enable investigation of crucial plant design aspects, typical commercial process simulation tools such as Aspen Plus®, gPROMS®, and HYSYS® still do not explore some plant design information, including computational fluid dynamics (CFD) models for complex thermal and fluid flow phenomena, economics models for policy decisions, operational data after the plant is constructed, and as-built information for use in as-designed models. Software tools must be created that allow disparate sources of information to be integrated if environments are to be constructed where process simulation information can be accessed. At the Department of Energy’s (DOE) National Energy Technology Laboratory (NETL), the Advanced Process Engineering Co-Simulator (APECS) has been developed as an integrated software suite that combines process simulation (e.g., Aspen Plus) and high-fidelity equipment simulation (e.g., Fluent® CFD), together with advanced analysis capabilities including case studies, sensitivity analysis, stochastic simulation for risk/uncertainty analysis, and multi-objective optimization. In this paper, we discuss the initial phases of integrating APECS with the immersive and interactive virtual engineering software, VE-Suite, developed at Iowa State University and Ames Laboratory. VE-Suite utilizes the ActiveX (OLE Automation) controls in Aspen Plus wrapped by the CASI library developed by Reaction Engineering International to run the process simulation and query for unit operation results. This integration permits any application that uses the VE-Open interface to integrate with APECS co-simulations, enabling construction of the comprehensive virtual engineering environment needed for the rapid engineering of advanced power generation facilities.

  5. Lifestyle change in type 2 diabetes: a process model.

    PubMed

    Whittemore, Robin; Chase, Susan K; Mandle, Carol Lynn; Roy, Callista

    2002-01-01

    Integration is an emerging concept in the study of self-management and chronic illness, yet this process and how it occurs is not well understood. This investigation, part of a triangulated study, focused on the experience of integrating type 2 diabetes treatment recommendations into an existing lifestyle while participating in a nurse-coaching intervention. An interpretive method elicited data from nurse-coaching sessions (4), field notes, and an interview in 9 women with type 2 diabetes. The process of data reduction and analysis (Miles & Huberman, 1994) was used to interpret data. The core process of integrating lifestyle change in type 2 diabetes was multifaceted and complex. Challenges to the process of integrating lifestyle change included reconciling emotions, composing a structure, striving for satisfaction, exploring self and conflicts, discovering balance, and developing a new cadence to life. These challenges required acknowledgment in order for participants to progress toward integration. Balance was an integral component to the experience of integration, between structure and flexibility, fear and hope, conflict and acceptance, diabetes and life. Conceptualizations identified with this investigation extend understanding of theories of integration and lifestyle change and invite the development and testing of nursing interventions.

  6. Biodiesel production process from microalgae oil by waste heat recovery and process integration.

    PubMed

    Song, Chunfeng; Chen, Guanyi; Ji, Na; Liu, Qingling; Kansha, Yasuki; Tsutsumi, Atsushi

    2015-10-01

    In this work, the optimization of a microalgae oil (MO) based biodiesel production process is carried out by waste heat recovery and process integration. The exergy analysis of each heat exchanger indicated efficient heat coupling between hot and cold streams, minimizing the total exergy destruction. Simulation results showed that the unit production cost of the optimized process is $0.592/L of biodiesel, and approximately $0.172/L can be avoided by heat integration. Although the capital cost of the optimized biodiesel production process increased by 32.5% and 23.5% compared to the reference cases, the operational cost can be reduced by approximately 22.5% and 41.6%. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. Integrated photonics for infrared spectroscopic sensing

    NASA Astrophysics Data System (ADS)

    Lin, Hongtao; Kita, Derek; Han, Zhaohong; Su, Peter; Agarwal, Anu; Yadav, Anupama; Richardson, Kathleen; Gu, Tian; Hu, Juejun

    2017-05-01

    Infrared (IR) spectroscopy is widely recognized as a gold-standard technique for chemical analysis. Traditional IR spectroscopy relies on fragile bench-top instruments located in dedicated laboratory settings, and is thus not suitable for emerging field-deployed applications such as in-line industrial process control, environmental monitoring, and point-of-care diagnosis. Recent strides in photonic integration technologies provide a promising route towards enabling miniaturized, rugged platforms for IR spectroscopic analysis. Chalcogenide glasses, amorphous compounds containing S, Se or Te, stand out as promising materials for infrared photonic integration given their broadband infrared transparency and compatibility with silicon photonic integration. In this paper, we discuss our recent work exploring integrated chalcogenide glass based photonic devices for IR spectroscopic chemical analysis, including on-chip cavity-enhanced chemical sensing and monolithic integration of mid-IR waveguides with photodetectors.

  8. Vehicle Infrastructure Integration (VII) data use analysis and processing : project summary report.

    DOT National Transportation Integrated Search

    2012-03-01

    The purpose of the Data Use Analysis and Processing (DUAP) project is to support the Michigan Department of Transportation (MDOT) and its partners in evaluating uses and benefits of connected vehicle data in transportation agency management and ...

  9. Development of Integrated Programs for Aerospace-vehicle design (IPAD): Reference design process

    NASA Technical Reports Server (NTRS)

    Meyer, D. D.

    1979-01-01

    The airplane design process and its interfaces with manufacturing and customer operations are documented to be used as criteria for the development of integrated programs for the analysis, design, and testing of aerospace vehicles. Topics cover: design process management, general purpose support requirements, design networks, and technical program elements. Design activity sequences are given for both supersonic and subsonic commercial transports, naval hydrofoils, and military aircraft.

  10. Alternative Procedure of Heat Integration Technique Selection between Two Unit Processes to Improve Energy Saving

    NASA Astrophysics Data System (ADS)

    Santi, S. S.; Renanto; Altway, A.

    2018-01-01

    The energy use system in a production process, in this case heat exchanger networks (HENs), is one element that plays a role in the smoothness and sustainability of the industry itself. Optimizing heat exchanger networks built from process streams can have a major effect on the economic value of an industry as a whole, so solving design problems with heat integration becomes an important requirement. In a plant, heat integration can be carried out internally or in combination between process units; however, the steps for determining a suitable heat integration technique involve long calculations and require much time. In this paper, we propose an alternative procedure for determining the heat integration technique by investigating six hypothetical units using a Pinch Analysis approach, with the energy target and the total annual cost target as objective functions. The six hypothetical units, A through F, each have a different location of the process streams relative to the pinch temperature. The result is a potential heat integration (ΔH’) formula that trims the conventional procedure from seven steps to three. The preferred heat integration technique is then determined by calculating the heat integration potential (ΔH’) between the hypothetical process units. The calculations were implemented in the MATLAB programming language.
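
    For concreteness, the sketch below computes pinch-analysis energy targets with the standard problem-table (heat cascade) algorithm, the kind of calculation that underlies the energy-target objective above; the four streams and ΔTmin are an invented example, not the paper's hypothetical units.

```python
# Pinch-analysis energy targeting via the problem-table (heat cascade)
# algorithm; the stream data and dTmin below are an invented example.
DT_MIN = 10.0
streams = [  # (kind, supply T in C, target T in C, CP in kW/K)
    ("hot", 180.0, 60.0, 3.0),
    ("hot", 150.0, 30.0, 1.5),
    ("cold", 20.0, 135.0, 2.0),
    ("cold", 80.0, 160.0, 4.0),
]

def shifted(kind, T):
    # Shift hot streams down and cold streams up by dTmin/2.
    return T - DT_MIN / 2 if kind == "hot" else T + DT_MIN / 2

bounds = sorted({shifted(k, T) for k, Ts, Tt, _ in streams for T in (Ts, Tt)},
                reverse=True)

interval_surplus = []
for hi, lo in zip(bounds, bounds[1:]):
    net_cp = 0.0
    for kind, Ts, Tt, cp in streams:
        top = max(shifted(kind, Ts), shifted(kind, Tt))
        bot = min(shifted(kind, Ts), shifted(kind, Tt))
        if bot <= lo and top >= hi:          # stream spans this interval
            net_cp += cp if kind == "hot" else -cp
    interval_surplus.append(net_cp * (hi - lo))

# Cascade heat downward; the largest deficit sets the minimum hot utility.
running, cascade = 0.0, []
for q in interval_surplus:
    running += q
    cascade.append(running)
hot_utility = max(0.0, -min(cascade))
cold_utility = hot_utility + sum(interval_surplus)
print(f"minimum hot utility:  {hot_utility:.1f} kW")
print(f"minimum cold utility: {cold_utility:.1f} kW")
```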

  11. Mess management in microbial ecology: Rhetorical processes of disciplinary integration

    NASA Astrophysics Data System (ADS)

    McCracken, Christopher W.

    As interdisciplinary work becomes more common in the sciences, research into the rhetorical processes mediating disciplinary integration becomes more vital. This dissertation, which takes as its subject the integration of microbiology and ecology, combines a postplural approach to rhetoric of science research with Victor Turner's "social drama" analysis and a third-generation activity theory methodological framework to identify conceptual and practical conflicts in interdisciplinary work and describe how, through visual and verbal communication, scientists negotiate these conflicts. First, to understand the conflicting disciplinary principles that might impede integration, the author conducts a Turnerian analysis of a disciplinary conflict that took place in the 1960s and 70s, during which American ecologists and biologists debated whether they should participate in the International Biological Program (IBP). Participation in the IBP ultimately contributed to the emergence of ecology as a discipline distinct from biology, and Turnerian social drama analysis of the debate surrounding participation lays bare the conflicting principles separating biology and ecology. Second, to answer the question of how these conflicting principles are negotiated in practice, the author reports on a yearlong qualitative study of scientists working in a microbial ecology laboratory. Focusing specifically on two case studies from this fieldwork that illustrate the key concept of textually mediated disciplinary integration, the author's analysis demonstrates how scientific objects emerge in differently situated practices, and how these objects manage to cohere despite their multiplicity through textually mediated rhetorical processes of calibration and alignment.

  12. Integrating Six Sigma with total quality management: a case example for measuring medication errors.

    PubMed

    Revere, Lee; Black, Ken

    2003-01-01

    Six Sigma is a new management philosophy that seeks a nonexistent error rate. It is ripe for healthcare because many healthcare processes require a near-zero tolerance for mistakes. For most organizations, establishing a Six Sigma program requires significant resources and produces considerable stress. However, in healthcare, management can piggyback Six Sigma onto current total quality management (TQM) efforts so that minimal disruption occurs in the organization. Six Sigma is an extension of the Failure Mode and Effects Analysis that is required by JCAHO; it can easily be integrated into existing quality management efforts. Integrating Six Sigma into the existing TQM program facilitates process improvement through detailed data analysis. A drilled-down approach to root-cause analysis greatly enhances the existing TQM approach. Using the Six Sigma metrics, internal project comparisons facilitate resource allocation while external project comparisons allow for benchmarking. Thus, the application of Six Sigma makes TQM efforts more successful. This article presents a framework for including Six Sigma in an organization's TQM plan while providing a concrete example using medication errors. Using the process defined in this article, healthcare executives can integrate Six Sigma into all of their TQM projects.
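
    The Six Sigma metric itself reduces to a short calculation: defects per million opportunities (DPMO) and the corresponding sigma level with the conventional 1.5-sigma shift. The medication-error counts below are invented for illustration, not taken from the article.

```python
# DPMO and sigma level for a medication-error example; counts are invented.
from scipy.stats import norm

errors = 12                    # medication errors observed
doses = 15_000                 # doses administered
opportunities_per_dose = 5     # e.g. right drug, dose, route, time, patient

dpmo = errors / (doses * opportunities_per_dose) * 1_000_000
sigma_level = norm.ppf(1.0 - dpmo / 1_000_000) + 1.5   # 1.5-sigma shift convention
print(f"DPMO = {dpmo:.0f}, sigma level = {sigma_level:.2f}")
```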

  13. Image processing and classification procedures for analysis of sub-decimeter imagery acquired with an unmanned aircraft over arid rangelands

    USDA-ARS?s Scientific Manuscript database

    Using five centimeter resolution images acquired with an unmanned aircraft system (UAS), we developed and evaluated an image processing workflow that included the integration of resolution-appropriate field sampling, feature selection, object-based image analysis, and processing approaches for UAS i...

  14. The TAME Project: Towards improvement-oriented software environments

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Rombach, H. Dieter

    1988-01-01

    Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.

  15. From Physical Process to Economic Cost - Integrated Approaches of Landslide Risk Assessment

    NASA Astrophysics Data System (ADS)

    Klose, M.; Damm, B.

    2014-12-01

    The nature of landslides is complex in many respects, with landslide hazard and impact depending on a variety of factors. This obviously requires an integrated assessment for a fundamental understanding of landslide risk. Integrated risk assessment, according to the approach presented in this contribution, implies combining prediction of future landslide occurrence with analysis of past landslide impact. A critical step in assessing landslide risk from an integrated perspective is to analyze what types of landslide damage affected people and property, in which ways, and how people contributed and responded to these damage types. In integrated risk assessment, the focus is on systematic identification and monetization of landslide damage, and analytical tools that allow deriving economic costs from physical landslide processes are at the heart of this approach. The broad spectrum of landslide types and process mechanisms, as well as the nonlinearity between landslide magnitude, damage intensity, and direct costs, are among the main factors explaining recent challenges in risk assessment. The two prevailing approaches for assessing the impact of landslides in economic terms are cost survey (ex-post) and risk analysis (ex-ante). The two approaches can complement each other, yet a combination of them has not been realized so far. It is common practice today to derive landslide risk without considering landslide process-based cause-effect relationships, since integrated concepts and new modeling tools expanding conventional methods are still largely missing. The approach introduced in this contribution is based on a systematic framework that combines cost survey and GIS-based tools for hazard and cost modeling with methods to assess interactions between land use practices and landslides in historical perspective. Fundamental understanding of landslide risk also requires knowledge of the economic and fiscal relevance of landslide losses, which is why analysis of their impact on public budgets is a further component of this approach. In integrated risk assessment, the combination of methods plays an important role, with the objective of collecting and integrating complex data sets on landslide risk.

  16. Analysis of the possibility of a PGA309 integrated circuit application in pressure sensors

    NASA Astrophysics Data System (ADS)

    Walendziuk, Wojciech; Baczewski, Michal; Idzkowski, Adam

    2016-09-01

    This article presents the results of research on the applicability of a PGA309 integrated circuit in transducers used for pressure measurement. The experiments were done with a PGA309EVM-USB evaluation circuit and a BD|SENSORS pressure sensor. A specially prepared MATLAB script was used to select the calibration settings and to analyze the results. The article discusses the developed algorithm that processes the measurement results, i.e., the algorithm that calculates the desired gain and the offset-adjustment voltage of the transducer measurement bridge in relation to the input signal range of the integrated circuit and the ambient temperature (temperature compensation). The checking procedure was conducted in a measurement laboratory, and the obtained results were analyzed and discussed.
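
    As a rough illustration of the gain-and-offset step, the sketch below computes the amplifier gain and offset-adjustment voltage needed to map a bridge's output span onto a conditioner's input range. The numbers are hypothetical, and this is a simplification of the script's algorithm: it omits the temperature-compensation loop, which would repeat the calculation at several ambient temperatures.

        def bridge_gain_offset(v_bridge_min, v_bridge_max, v_out_min, v_out_max):
            """Map a bridge sensor's output span onto a conditioner's input range.

            Returns the gain and offset-adjustment voltage so that
            v_out = gain * v_bridge + offset spans [v_out_min, v_out_max].
            Values below are hypothetical, not PGA309 register settings.
            """
            gain = (v_out_max - v_out_min) / (v_bridge_max - v_bridge_min)
            offset = v_out_min - gain * v_bridge_min
            return gain, offset

        # Hypothetical bridge: 2 mV to 20 mV over the full pressure range,
        # target output range 0.5 V to 4.5 V (ratiometric to a 5 V supply).
        gain, offset = bridge_gain_offset(0.002, 0.020, 0.5, 4.5)
        print(f"gain = {gain:.1f} V/V, offset = {offset * 1000:.1f} mV")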

  17. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  18. MANTECH project book

    NASA Astrophysics Data System (ADS)

    The effective integration of processes, systems, and procedures used in the production of aerospace systems using computer technology is managed by the Integration Technology Division (MTI). Under its auspices are the Information Management Branch, which is actively involved with information management, information sciences and integration, and the Implementation Branch, whose technology areas include computer integrated manufacturing, engineering design, operations research, and material handling and assembly. The Integration Technology Division combines design, manufacturing, and supportability functions within the same organization. The Processing and Fabrication Division manages programs to improve structural and nonstructural materials processing and fabrication. Within this division, the Metals Branch directs the manufacturing methods program for metals and metal matrix composites processing and fabrication. The Nonmetals Branch directs the manufacturing methods programs, which include all manufacturing processes for producing and utilizing propellants, plastics, resins, fibers, composites, fluid elastomers, ceramics, glasses, and coatings. The objective of the Industrial Base Analysis Division is to act as focal point for the USAF industrial base program for productivity, responsiveness, and preparedness planning.

  19. Vehicle Integrated Performance Analysis, the VIPA Experience: Reconnecting with Technical Integration

    NASA Technical Reports Server (NTRS)

    McGhee, David S.

    2005-01-01

    Today's NASA is facing significant challenges and changes. The Exploration initiative indicates a large increase in projects with a limited increase in budget. The Columbia report criticized NASA for a lack of insight and technical integration impacting its ability to provide safety. The Aldridge report advocates that NASA find new ways of doing business. Very early in the Space Launch Initiative (SLI) program, a small team of engineers at MSFC was asked to propose a process for performing a system-level assessment of a launch vehicle. The request was aimed primarily at providing insight and making NASA a "smart buyer." Out of this effort the VIPA team was created. The difference between the VIPA effort and many integration attempts is that VIPA focuses on using experienced people from various disciplines and a process that focuses them on a technically integrated assessment. Most previous attempts have focused on developing an all-encompassing software tool. In addition, VIPA anchored its process formulation in the experience of its members and in early developmental Space Shuttle experience. The primary reference for this is NASA-TP-2001-210092, "Launch Vehicle Design Process: Characterization, Technical Integration, and Lessons Learned," and discussions with its authors. The foundations of VIPA's process are described. The VIPA team also recognized the need to drive detailed analysis earlier in the design process. Analyses and techniques typically done in later design phases are brought forward using improved computing technology. The intent is to allow the identification of significant sensitivities, trades, and design issues much earlier in the program. This process is driven by the T-model for Technical Integration described in the aforementioned reference. VIPA's approach to performing system-level technical integration is discussed in detail. Proposed definitions are offered to clarify this discussion and the general systems integration dialog. VIPA's capabilities and process can now be used to significantly enhance the development and monitoring of realizable project requirements. This is done through the use of VIPA's V-model. Starting with a given concept, VIPA's assessment validates the concept's stated performance, identifies significant issues either with the concept or the requirements, and then re-integrates these issues to determine impacts. This process is discussed along with a description of how it may be integrated into a program's insight and review process. The VIPA process has been employed successfully on several projects including SLI, Orbital Space Plane (OSP), and several heavy-lift concepts for Exploration. It has also been proposed for use on the Jupiter Icy Moons Orbiter (JIMO) spacecraft. The VIPA process has gained favor with both engineering and project organizations for being responsive and insightful. Selected results from these assessments will be presented.

  20. Integrating enzyme fermentation in lignocellulosic ethanol production: life-cycle assessment and techno-economic analysis.

    PubMed

    Olofsson, Johanna; Barta, Zsolt; Börjesson, Pål; Wallberg, Ola

    2017-01-01

    Cellulase enzymes have been reported to contribute a significant share of the total costs and greenhouse gas emissions of lignocellulosic ethanol production today. A potential future alternative to purchasing enzymes from an off-site manufacturer is to integrate enzyme and ethanol production, using microorganisms and part of the lignocellulosic material as feedstock for enzymes. This study modelled two such integrated process designs for ethanol from spruce logging residues, and compared them to an off-site case based on existing data regarding purchased enzymes. Greenhouse gas emissions and primary energy balances were studied in a life-cycle assessment, and cost performance in a techno-economic analysis. The base case scenario suggests that greenhouse gas emissions per MJ of ethanol could be significantly lower in the integrated cases than in the off-site case. However, the difference between the integrated and off-site cases is reduced with alternative assumptions regarding enzyme dosage and the environmental impact of the purchased enzymes. The comparison of primary energy balances did not show any significant difference between the cases. The minimum ethanol selling price, to reach break-even costs, was from 0.568 to 0.622 EUR L⁻¹ for the integrated cases, as compared to 0.581 EUR L⁻¹ for the off-site case. An integrated process design could reduce greenhouse gas emissions from lignocellulose-based ethanol production, and the cost of an integrated process could be comparable to purchasing enzymes produced off-site. This study focused on the environmental and economic assessment of an integrated process, and in order to strengthen the comparison to the off-site case, more detailed and updated data regarding industrial off-site enzyme production are especially important.
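
    The minimum ethanol selling price in such studies is the break-even point of a discounted cash flow. The sketch below is a deliberately generic stand-in for the article's detailed techno-economic model, with all parameters hypothetical; it finds the price at which the net present value reaches zero.

        from scipy.optimize import brentq

        def npv(price, capex, opex, output_l, years=20, rate=0.08):
            """Net present value of an ethanol plant at a given selling price.

            A generic cash-flow model; all parameters are hypothetical and
            stand in for the article's detailed techno-economic analysis.
            """
            annual_cash_flow = (price - opex / output_l) * output_l
            discount = sum(1 / (1 + rate) ** t for t in range(1, years + 1))
            return -capex + annual_cash_flow * discount

        # Minimum ethanol selling price: the price at which NPV = 0.
        capex, opex, output_l = 250e6, 40e6, 100e6  # EUR, EUR/yr, L/yr
        mesp = brentq(lambda p: npv(p, capex, opex, output_l), 0.0, 5.0)
        print(f"MESP = {mesp:.3f} EUR/L")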

  1. Rapid analysis of protein backbone resonance assignments using cryogenic probes, a distributed Linux-based computing architecture, and an integrated set of spectral analysis tools.

    PubMed

    Monleón, Daniel; Colson, Kimberly; Moseley, Hunter N B; Anklin, Clemens; Oswald, Robert; Szyperski, Thomas; Montelione, Gaetano T

    2002-01-01

    Rapid data collection, spectral referencing, processing by time domain deconvolution, peak picking and editing, and assignment of NMR spectra are necessary components of any efficient integrated system for protein NMR structure analysis. We have developed a set of software tools designated AutoProc, AutoPeak, and AutoAssign, which function together with the data processing and peak-picking programs NMRPipe and Sparky, to provide an integrated software system for rapid analysis of protein backbone resonance assignments. In this paper we demonstrate that these tools, together with high-sensitivity triple resonance NMR cryoprobes for data collection and a Linux-based computer cluster architecture, can be combined to provide nearly complete backbone resonance assignments and secondary structures (based on chemical shift data) for a 59-residue protein in less than 30 hours of data collection and processing time. In this optimum case of a small protein providing excellent spectra, extensive backbone resonance assignments could also be obtained using less than 6 hours of data collection and processing time. These results demonstrate the feasibility of high throughput triple resonance NMR for determining resonance assignments and secondary structures of small proteins, and the potential for applying NMR in large scale structural proteomics projects.

  2. Romanian Higher Education as a Facilitator of Romania's Continued Formal and Informal Integration in the European Union

    ERIC Educational Resources Information Center

    Salajan, Florin D.; Chiper, Sorina

    2013-01-01

    This article conducts an exploration of Romania's European integration process through higher education. It contends that integration occurs at "formal" and "informal levels" through institutional norms and human agency, respectively. Through theoretical and empirical analysis, the authors discuss the modalities through which…

  3. Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)

    NASA Astrophysics Data System (ADS)

    Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.

    2017-12-01

    We have developed a system that supports the full life cycle of a data analysis process, from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system, called Climate Model Diagnostic Analyzer (CMDA), is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle. Data System manages datasets used by CMDA analysis tools, Analysis System manages CMDA analysis tools which are all web services, Provenance System manages the metadata of CMDA datasets and the provenance of CMDA analysis history, and Recommendation System extracts knowledge from CMDA usage history and recommends datasets/analysis tools to users. These four subsystems are not only highly integrated but also easily expandable. New datasets can be easily added to Data System and scanned to be visible to the other subsystems. New analysis tools can be easily registered to be available in the Analysis System and Provenance System. With CMDA, a user can start a data analysis process by discovering datasets of relevance to their research topic using the Recommendation System. Next, the user can customize the discovered datasets for their scientific use (e.g. anomaly calculation, regridding, etc.) with tools in the Analysis System. Next, the user can do their analysis with the tools (e.g. conditional sampling, time averaging, spatial averaging) in the Analysis System. Next, the user can reanalyze the datasets based on the previously stored analysis provenance in the Provenance System. Further, they can publish their analysis process and result to the Provenance System to share with other users. Finally, any user can reproduce the published analysis process and results. By supporting the full life cycle of climate data analysis, CMDA improves the research productivity and collaboration level of its users.

  4. An optimal policy for a single-vendor and a single-buyer integrated system with setup cost reduction and process-quality improvement

    NASA Astrophysics Data System (ADS)

    Shu, Hui; Zhou, Xideng

    2014-05-01

    The single-vendor single-buyer integrated production inventory system has long been an object of study, but little is known about the effect of investing in setup cost reduction and process-quality improvement for an integrated inventory system in which the products are sold with a free minimal-repair warranty. The purpose of this article is to minimise the integrated cost by simultaneously optimising the number of shipments, the shipment quantity, the setup cost, and the process quality. An efficient algorithm is proposed for determining the optimal decision variables. A numerical example is presented to illustrate the results of the proposed models graphically. Sensitivity analysis of the model with respect to key parameters of the system is carried out. The paper shows that the proposed integrated model can result in significant savings in the integrated cost.
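
    For orientation, the classic joint vendor-buyer cost structure that underlies such models can be minimized by a simple search over the two shipment decision variables. The sketch below uses only the textbook joint cost function and omits the article's setup-cost investment, quality-improvement and warranty terms; all parameter values are hypothetical.

        def joint_cost(n, q, demand, setup_vendor, order_buyer,
                       h_vendor, h_buyer, prod_rate):
            """Classic single-vendor single-buyer joint annual cost, with Q units
            per shipment and n shipments per production batch. A textbook
            baseline only; the article's model adds setup-cost investment,
            quality improvement and free minimal-repair warranty terms on top
            of a structure like this."""
            setup_and_order = (demand / (n * q)) * setup_vendor + (demand / q) * order_buyer
            holding_buyer = h_buyer * q / 2
            holding_vendor = h_vendor * q / 2 * (
                n * (1 - demand / prod_rate) - 1 + 2 * demand / prod_rate
            )
            return setup_and_order + holding_buyer + holding_vendor

        # Hypothetical parameters; search jointly over n and Q.
        best = min(
            ((joint_cost(n, q, 1000, 400, 25, 4, 5, 3200), n, q)
             for n in range(1, 15) for q in range(10, 400)),
            key=lambda t: t[0],
        )
        print(f"min cost = {best[0]:.1f} at n = {best[1]}, Q = {best[2]}")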

  5. Fuzzy Decision Analysis for Integrated Environmental Vulnerability Assessment of the Mid-Atlantic Region

    Treesearch

    Liem T. Tran; C. Gregory Knight; Robert V. O' Neill; Elizabeth R. Smith; Kurt H. Riitters; James D. Wickham

    2002-01-01

    A fuzzy decision analysis method for integrating ecological indicators was developed. This was a combination of a fuzzy ranking method and the analytic hierarchy process (AHP). The method was capable of ranking ecosystems in terms of environmental conditions and suggesting cumulative impacts across a large region. Using data on land cover, population, roads, streams,...
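
    The AHP step of such a method is easy to make concrete: indicator weights come from the principal eigenvector of a pairwise-comparison matrix, with a consistency ratio as a sanity check. A minimal sketch with a hypothetical 3x3 matrix (not the values used in the study):

        import numpy as np

        # Hypothetical pairwise-comparison matrix for three ecological
        # indicators on Saaty's 1-9 scale.
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        # AHP weights: principal right eigenvector, normalized to sum to 1.
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = eigvecs[:, k].real
        weights /= weights.sum()

        # Consistency ratio: CI = (lambda_max - n) / (n - 1), divided by the
        # random index RI (0.58 for n = 3); CR < 0.1 is conventionally acceptable.
        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)
        cr = ci / 0.58
        print(f"weights = {np.round(weights, 3)}, CR = {cr:.3f}")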

  6. PNNL Data-Intensive Computing for a Smarter Energy Grid

    ScienceCinema

    Carol Imhoff; Zhenyu (Henry) Huang; Daniel Chavarria

    2017-12-09

    The Middleware for Data-Intensive Computing (MeDICi) Integration Framework, an integrated platform for solving data analysis and processing needs, supports PNNL research on the U.S. electric power grid. MeDICi is enabling the development of visualizations of grid operations and vulnerabilities, with the goal of near real-time analysis to aid operators in preventing and mitigating grid failures.

  7. Database integration for investigative data visualization with the Temporal Analysis System

    NASA Astrophysics Data System (ADS)

    Barth, Stephen W.

    1997-02-01

    This paper describes an effort to provide mechanisms for integrating existing law enforcement databases with the Temporal Analysis System (TAS), an application for analysis and visualization of military intelligence data. Such integration mechanisms are essential for bringing advanced military intelligence data handling applications to bear on the analysis of data used in criminal investigations. Our approach involved applying a software application for intelligence message handling to the problem of database conversion. This application provides mechanisms for distributed processing and delivery of converted data records to an end-user application. It also provides a flexible graphical user interface for development and customization in the field.

  8. Sensemaking: a driving force behind the integration of professional practices.

    PubMed

    Sylvain, Chantal; Lamothe, Lise

    2012-01-01

    There has been considerable effort in recent years to link and integrate professional services more closely for patients with comorbidities. However, difficulties persist, especially at the clinical level. This study aims to shed light on these difficulties by examining the process of sensemaking in professionals directly involved in this integration. The authors conducted an eight-year longitudinal case study of an organization specializing in mental health and substance abuse. Different data collection methods were used, including 34 interviews conducted between 2003 and 2009, observations and document analysis. The authors performed a qualitative analysis of the data using a processual perspective. This paper provides empirical insights about the nature of the sensemaking process in which professionals collectively participate and the effects of this process on the evolution of integrated services. It suggests that the development of integrated practices results from an evolutional and collective process of constructing meanings that is rooted in the work activities of the professionals involved. By drawing attention to the capacity of professionals to shape the projects they are implementing, this study questions the capacity of managers to actually manage such a process. In order to obtain the expected benefits of integration projects, such emergent dynamics must first be recognized and then supported. Only then can thought be given to mastering them. The fact that this is a single case study is not a limitation per se, although it does raise the issue of the transferability of results. Replicating the study in other contexts would verify the applicability of the authors' conclusions. This study offers a fresh perspective on the difficulties generally encountered at the clinical level when trying to integrate services. It makes a significant contribution to work on the dynamics of sensemaking in organizational life.

  9. Information Acquisition, Analysis and Integration

    DTIC Science & Technology

    2016-08-03

    Only report-form fragments of this abstract survive. The recoverable subject terms are: theory and applications of sensing and processing; signal processing; image and video processing; machine learning; and technology transfer. A listed accomplishment is the introduction of new approaches to long-standing problems such as image and video deblurring, and the fragments cite Polatkan, G., Sapiro, G., Blei, D., Dunson, D. B., and Carin, L., "Deep learning with hierarchical convolution factor analysis," IEEE.

  10. Intershot Analysis of Flows in DIII-D

    NASA Astrophysics Data System (ADS)

    Meyer, W. H.; Allen, S. L.; Samuell, C. M.; Howard, J.

    2016-10-01

    Analysis of the DIII-D flow diagnostic data requires demodulation of interference images and inversion of the resultant line-integrated emissivity and flow (phase) images. Four response matrices are pre-calculated: the emissivity line integral and the line integrals of the scalar product of the lines-of-sight with the orthogonal unit vectors of parallel flow. Equilibrium data determines the relative weight of the component matrices used in the final flow inversion matrix. Serial processing has been used for the lower-divertor-viewing flow camera's 800x600 pixel image. The full cross-section viewing camera will require parallel processing of its 2160x2560 pixel image. We will discuss using a POSIX thread pool and a Tesla K40c GPU in the processing of these data. Prepared by LLNL under Contract DE-AC52-07NA27344. This material is based upon work supported by the U.S. DOE, Office of Science, Fusion Energy Sciences.
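
    A schematic of the matrix-weighting-and-inversion step might look as follows. The array sizes, weights and the plain least-squares solve are illustrative stand-ins only, not the diagnostic's actual code or geometry matrices.

        import numpy as np

        rng = np.random.default_rng(0)
        n_pix, n_vox = 100, 200  # toy sizes, far smaller than the camera images
        # Stand-ins for the pre-computed line-integral response matrices:
        # emissivity, and the two flow components projected onto each line of sight.
        A_emis = rng.random((n_pix, n_vox))
        A_par = rng.random((n_pix, n_vox))
        A_perp = rng.random((n_pix, n_vox))

        # Equilibrium-dependent weights set the mix of flow components.
        w_par, w_perp = 0.9, 0.1          # hypothetical values from the equilibrium
        A_flow = w_par * A_par + w_perp * A_perp

        phase = rng.random(n_pix)          # demodulated phase image, flattened
        # Least-squares inversion of the line-integrated flow.
        flow, *_ = np.linalg.lstsq(A_flow, phase, rcond=None)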

  11. Optical integrator for optical dark-soliton detection and pulse shaping.

    PubMed

    Ngo, Nam Quoc

    2006-09-10

    The design and analysis of an Nth-order optical integrator using the digital filter technique are presented. The optical integrator is synthesized using planar-waveguide technology. It is shown that a first-order optical integrator can be used as an optical dark-soliton detector by converting an optical dark-soliton pulse into an optical bell-shaped pulse for ease of detection. The optical integrators can generate an optical step function, a staircase function, and parabolic-like functions from input optical Gaussian pulses. The optical integrators may potentially be used as basic building blocks of all-optical signal processing systems, because the time integrals of signals may be required for further use or analysis. Furthermore, an optical integrator may be used for the shaping of optical pulses or in an optical feedback control system.
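
    The discrete-time analogue of a first-order integrator makes the claimed behavior easy to verify numerically. The sketch below is a DSP stand-in under the digital-filter view the paper takes, not the planar-waveguide design itself: a Gaussian in yields a step-like (error-function) pulse out, and a pulse train yields a staircase.

        import numpy as np
        from scipy.signal import lfilter

        # Discrete-time first-order integrator H(z) = T / (1 - z^-1):
        # an accumulator scaled by the sample period.
        T = 0.01
        t = np.arange(0, 10, T)

        gaussian = np.exp(-((t - 2.0) ** 2) / 0.1)
        step = lfilter([T], [1.0, -1.0], gaussian)     # Gaussian in -> step out

        train = sum(np.exp(-((t - c) ** 2) / 0.1) for c in (2.0, 4.0, 6.0))
        staircase = lfilter([T], [1.0, -1.0], train)   # pulse train -> staircase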

  12. Students' Academic Performance and Various Cognitive Processes of Learning: An Integrative Framework and Empirical Analysis

    ERIC Educational Resources Information Center

    Phan, Huy Phuong

    2010-01-01

    The main aim of this study is to test a conceptualised framework that involved the integration of achievement goals, self-efficacy and self-esteem beliefs, and study-processing strategies. Two hundred and ninety (178 females, 112 males) first-year university students were administered a number of Likert-scale inventories in tutorial classes. Data…

  13. SIGMA Release v1.2 - Capabilities, Enhancements and Fixes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Vijay; Grindeanu, Iulian R.; Ray, Navamita

    In this report, we present details of the SIGMA toolkit, including its component structure, capabilities, feature additions in FY15, release cycles, and continuous integration process. These software processes, along with updated documentation, are imperative for successful integration and use in several applications, including the SHARP coupled analysis toolkit for reactor core systems funded under the NEAMS DOE-NE program.

  14. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  15. Multisensory integration processing during olfactory-visual stimulation-An fMRI graph theoretical network analysis.

    PubMed

    Ripp, Isabelle; Zur Nieden, Anna-Nora; Blankenagel, Sonja; Franzmeier, Nicolai; Lundström, Johan N; Freiherr, Jessica

    2018-05-07

    In this study, we aimed to understand how whole-brain neural networks compute sensory information integration, based on the olfactory and visual systems. Task-related functional magnetic resonance imaging (fMRI) data were obtained during unimodal and bimodal sensory stimulation. Based on the identification of multisensory integration processing (MIP) specific hub-like network nodes, analyzed with network-based statistics using region-of-interest based connectivity matrices, we conclude that the following brain areas are important for processing the presented bimodal sensory information: the right precuneus, connected contralaterally to the supramarginal gyrus, for memory-related imagery and phonology retrieval; and the left middle occipital gyrus, connected ipsilaterally to the inferior frontal gyrus via the inferior fronto-occipital fasciculus, including functional aspects of working memory. Graph theory applied to quantify the resulting complex network topologies indicates a significantly increased global efficiency and clustering coefficient in networks including aspects of MIP, reflecting simultaneously better integration and segregation. Graph theoretical analysis of positive and negative network correlations, allowing for inferences about excitatory and inhibitory network architectures, revealed (not significantly, but very consistently) that MIP-specific neural networks are dominated by inhibitory relationships between brain regions involved in stimulus processing. © 2018 Wiley Periodicals, Inc.
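
    The two graph metrics named above are standard and can be computed with networkx. The toy network below is hypothetical and merely shows the calls, not the study's connectivity matrices or thresholds.

        import networkx as nx

        # Toy functional network: nodes are brain regions, edges are
        # suprathreshold correlations (hypothetical; the study derives these
        # from region-of-interest fMRI connectivity matrices).
        G = nx.Graph()
        G.add_edges_from([
            ("precuneus_R", "supramarginal_L"),
            ("occipital_mid_L", "frontal_inf_L"),
            ("precuneus_R", "occipital_mid_L"),
            ("supramarginal_L", "frontal_inf_L"),
        ])

        # The two metrics compared between unimodal and bimodal conditions:
        # integration (global efficiency) vs. segregation (clustering).
        print("global efficiency :", nx.global_efficiency(G))
        print("clustering coeff. :", nx.average_clustering(G))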

  16. Design techniques for low-voltage analog integrated circuits

    NASA Astrophysics Data System (ADS)

    Rakús, Matej; Stopjaková, Viera; Arbet, Daniel

    2017-08-01

    In this paper, a review and analysis of different design techniques for (ultra) low-voltage integrated circuits (IC) are performed. This analysis shows that the most suitable design methods for low-voltage analog IC design in a standard CMOS process include techniques using bulk-driven MOS transistors, dynamic threshold MOS transistors and MOS transistors operating in weak or moderate inversion regions. The main advantage of such techniques is that there is no need for any modification of standard CMOS structure or process. Basic circuit building blocks like differential amplifiers or current mirrors designed using these approaches are able to operate with the power supply voltage of 600 mV (or even lower), which is the key feature towards integrated systems for modern portable applications.

  17. [A SAS macro program for batch processing of univariate Cox regression analysis for large databases].

    PubMed

    Yang, Rendong; Xiong, Jie; Peng, Yangqin; Peng, Xiaoning; Zeng, Xiaomin

    2015-02-01

    To realize batch processing of univariate Cox regression analyses for large databases with a SAS macro program. We wrote a SAS macro program in SAS 9.2 that can filter and integrate results and export P values to Excel. The program was used to screen survival-correlated RNA molecules of ovarian cancer. The SAS macro program could complete the batch processing of univariate Cox regression analyses and the selection and export of the results. The SAS macro program has potential applications in reducing the workload of statistical analysis and providing a basis for batch processing of univariate Cox regression analysis.
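
    The same batch idea translates directly to other environments. A sketch in Python, assuming the lifelines package and hypothetical column names (not the authors' data), fits one univariate Cox model per covariate and exports the p-values.

        import pandas as pd
        from lifelines import CoxPHFitter

        def batch_univariate_cox(df, covariates, duration_col="time", event_col="event"):
            """Fit one univariate Cox model per covariate and collect p-values.

            A Python analogue of the SAS macro's batch idea (filter,
            integrate, export); column names here are assumptions.
            """
            rows = []
            for cov in covariates:
                cph = CoxPHFitter()
                cph.fit(df[[cov, duration_col, event_col]], duration_col, event_col)
                rows.append({"covariate": cov, "p": cph.summary.loc[cov, "p"]})
            out = pd.DataFrame(rows).sort_values("p")
            # Export the screened results, as the macro does to Excel.
            out.to_csv("univariate_cox_pvalues.csv", index=False)
            return out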

  18. Demonstration and Methodology of Structural Monitoring of Stringer Runs out Composite Areas by Embedded Optical Fiber Sensors and Connectors Integrated during Production in a Composite Plant.

    PubMed

    Miguel Giraldo, Carlos; Zúñiga Sagredo, Juan; Sánchez Gómez, José; Corredera, Pedro

    2017-07-21

    Embedding optical fiber sensors into composite structures for Structural Health Monitoring purposes is not just one of the most attractive solutions contributing to smart structures, but also the optimum integration approach that ensures maximum protection and integrity of the fibers. Nevertheless, this intended integration level remains an industrial challenge, since today there is no mature integration process in composite plants matching all necessary requirements. This article describes the process developed to integrate optical fiber sensors into the production cycle of a test specimen. The sensors, Bragg gratings, were integrated into the laminate during automatic tape lay-up and also by a secondary bonding process, both in the Airbus Composite Plant. The test specimen, fully representative of the root joint of the lower wing cover of a real aircraft, comprises a structural skin panel with the associated stringer run-out. Ingress-egress was achieved through the precise design and integration of miniaturized optical connectors compatible with the manufacturing conditions and operational test requirements. After production, the specimen was trimmed, assembled and bolted to metallic plates representing the real triform and buttstrap, and eventually installed into the structural test rig. The interrogation of the sensors proves the effectiveness of the integration process; the analysis of the strain results demonstrates the good correlation between fiber sensors and electrical gauges in those locations where they are installed nearby; and the curvature and load-transfer analysis in the bolted stringer run-out area demonstrates the consistency of the fiber sensor measurements. In conclusion, this work presents strong evidence of the performance of embedded optical sensors for structural health monitoring purposes, where, in addition and most importantly, the fibers were integrated in a real production environment and the ingress-egress issue was solved by the design and integration of miniaturized connectors compatible with the manufacturing and structural test phases.

  20. Integrated system for automated financial document processing

    NASA Astrophysics Data System (ADS)

    Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai

    1997-02-01

    A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
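
    A blackboard of this kind can be sketched in a few lines. The engine names, values and confidence scores below are hypothetical stand-ins for the system's recognition knowledge sources, showing only the pattern of incremental, opportunistic contribution.

        from dataclasses import dataclass, field

        @dataclass
        class Blackboard:
            zones: dict                      # e.g. {"courtesy_amount": <image>}
            hypotheses: dict = field(default_factory=dict)

            def post(self, zone, value, confidence):
                # Keep only the highest-confidence hypothesis per zone.
                best = self.hypotheses.get(zone)
                if best is None or confidence > best[1]:
                    self.hypotheses[zone] = (value, confidence)

        def courtesy_engine_a(board):
            board.post("courtesy_amount", "142.50", 0.82)

        def courtesy_engine_b(board):
            board.post("courtesy_amount", "142.58", 0.64)

        board = Blackboard(zones={"courtesy_amount": None})
        for source in (courtesy_engine_a, courtesy_engine_b):
            source(board)            # each source contributes opportunistically
        print(board.hypotheses)      # {'courtesy_amount': ('142.50', 0.82)}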

  1. Analysis of peptides using an integrated microchip HPLC-MS/MS system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kirby, Brian J.; Chirica, Gabriela S.; Reichmuth, David S.

    Hyphenated LC-MS techniques are quickly becoming the standard tool for proteomic analyses. For large homogeneous samples, bulk processing methods and capillary injection and separation techniques are suitable. However, analysis of small or heterogeneous samples requires techniques that can manipulate picoliter samples without dilution, or samples will be lost or corrupted; further, static nanospray-type flow rates are required to maximize SNR. Microchip-level integration of sample injection with separation and mass spectrometry allows small-volume analytes to be processed on chip and immediately injected without dilution for analysis. An on-chip HPLC was fabricated using in situ polymerization of both fixed and mobile polymer monoliths. Integration of the chip with a nanospray MS emitter enables identification of peptides by the use of tandem MS. The chip is capable of analyzing very small sample volumes (< 200 pl) in short times (< 3 min).

  2. System enhancements of Mesoscale Analysis and Space Sensor (MASS) computer system

    NASA Technical Reports Server (NTRS)

    Hickey, J. S.; Karitani, S.

    1985-01-01

    The interactive information processing for the Mesoscale Analysis and Space Sensor (MASS) program is reported. The development and implementation of new spaceborne remote sensing technology to observe and measure atmospheric processes is described. The space measurements and conventional observational data are processed together to gain an improved understanding of the mesoscale structure and dynamical evolution of the atmosphere relative to cloud development and precipitation processes. A Research Computer System consisting of three primary computers (HP-1000F, Perkin-Elmer 3250, and Harris/6) was developed, which provides a wide range of capabilities for interactively processing and displaying large volumes of remote sensing data. The development of a MASS database management and analysis system on the HP-1000F computer, and the extension of these capabilities through integration with the Perkin-Elmer and Harris/6 computers using MSFC's Apple III microcomputer workstations, is described. The objectives are to design hardware enhancements for computer integration and to provide data conversion and transfer between machines.

  3. Proceedings of the 22nd Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    1983-01-01

    This report describes progress made by the Flat-Plate Solar Array Project during the period January to September 1983. It includes reports on silicon sheet growth and characterization, module technology, silicon material, cell processing and high-efficiency cells, environmental isolation, engineering sciences, module performance and failure analysis and project analysis and integration. It includes a report on, and copies of visual presentations made at the 22nd Project Integration Meeting held at Pasadena, California, on September 28 and 29, 1983.

  4. NASA Supportability Engineering Implementation Utilizing DoD Practices and Processes

    NASA Technical Reports Server (NTRS)

    Smith, David A.; Smith, John V.

    2010-01-01

    The Ares I design and development program determined early in the System Design Review phase to utilize the DoD ILS and LSA approach for supportability engineering as an integral part of the systems engineering process. This paper provides a review of the overall approach to designing Ares I with an emphasis on a more affordable, supportable, and sustainable launch vehicle. Discussions include requirements development, design influence, support concept alternatives, ILS and LSA planning, logistics support analyses and trades performed, LSA tailoring for the NASA Ares program, support system infrastructure identification, ILS design review documentation, working group coordination, and overall ILS implementation. At the outset, the Ares I Project initiated the development of the Integrated Logistics Support Plan (ILSP) and a Logistics Support Analysis process to provide a path forward for the management of the Ares I ILS program and supportability analysis activities. The ILSP provided the initial planning and coordination between the Ares I Project Elements and the Ground Operations Project. The LSA process provided a systems engineering approach to the development of the Ares I supportability requirements, influencing the design for supportability and the development of alternative support concepts that satisfy the program operability requirements. The LSA planning and analysis results are documented in the Logistics Support Analysis Report. This document was required during the Ares I System Design Review (SDR) and Preliminary Design Review (PDR) cycles. To help coordinate the LSA process across the Ares I project and between programs, the LSA Report is updated and released quarterly. A system requirements analysis was performed to determine the supportability requirements and technical performance measurements (TPMs). Two working groups were established to support the management and implementation of the Ares I ILS program: the Integrated Logistics Support Working Group (ILSWG) and the Logistics Support Analysis Record Working Group (LSARWG). The Ares I ILSWG was established to assess requirements, to conduct and evaluate analyses and trade studies associated with acquisition logistics and supportability processes, and to resolve Ares I integrated logistics and supportability issues. It established a strategic collaborative alliance for coordination of Logistics Support Analysis activities in support of the integrated Ares I vehicle design and the development of the logistics support infrastructure. A joint Ares I-Orion LSAR Working Group was established to: 1) guide the development of Ares I and Orion LSAR data and serve as a model for future Constellation programs, 2) develop rules and assumptions that will apply across the Constellation program with regard to the program's LSAR development, and 3) maintain the Constellation LSAR Style Guide.

  5. Systems integration of marketable subsystems: A collection of progress reports

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Monthly progress reports are given in the areas of marketable subsystems integration; development, design, and building of site data acquisition subsystems and data processing systems; operation of the solar test facility and a systems analysis.

  6. Computational knowledge integration in biopharmaceutical research.

    PubMed

    Ficenec, David; Osborne, Mark; Pradines, Joel; Richards, Dan; Felciano, Ramon; Cho, Raymond J; Chen, Richard O; Liefeld, Ted; Owen, James; Ruttenberg, Alan; Reich, Christian; Horvath, Joseph; Clark, Tim

    2003-09-01

    An initiative to increase biopharmaceutical research productivity by capturing, sharing and computationally integrating proprietary scientific discoveries with public knowledge is described. This initiative involves both organisational process change and multiple interoperating software systems. The software components rely on mutually supporting integration techniques. These include a richly structured ontology, statistical analysis of experimental data against stored conclusions, natural language processing of public literature, secure document repositories with lightweight metadata, web services integration, enterprise web portals and relational databases. This approach has already begun to increase scientific productivity in our enterprise by creating an organisational memory (OM) of internal research findings, accessible on the web. Through bringing together these components it has also been possible to construct a very large and expanding repository of biological pathway information linked to this repository of findings which is extremely useful in analysis of DNA microarray data. This repository, in turn, enables our research paradigm to be shifted towards more comprehensive systems-based understandings of drug action.

  7. Collaboration processes and perceived effectiveness of integrated care projects in primary care: a longitudinal mixed-methods study.

    PubMed

    Valentijn, Pim P; Ruwaard, Dirk; Vrijhoef, Hubertus J M; de Bont, Antoinette; Arends, Rosa Y; Bruijnzeels, Marc A

    2015-10-09

    Collaborative partnerships are considered an essential strategy for integrating local disjointed health and social services. Currently, little evidence is available on how integrated care arrangements between professionals and organisations are achieved through the evolution of collaboration processes over time. The first aim was to develop a typology of integrated care projects (ICPs) based on the final degree of integration as perceived by multiple stakeholders. The second aim was to study how the types of integration differ in changes of collaboration processes over time and in final perceived effectiveness. A longitudinal mixed-methods study design based on two data sources (surveys and interviews) was used to identify the perceived degree of integration and patterns in collaboration among 42 ICPs in primary care in The Netherlands. We used cluster analysis to identify distinct subgroups of ICPs based on the final perceived degree of integration from a professional, organisational and system perspective. With the use of ANOVAs, the subgroups were contrasted based on: 1) changes in collaboration processes over time (shared ambition, interests and mutual gains, relationship dynamics, organisational dynamics and process management) and 2) final perceived effectiveness (i.e. rated success) at the professional, organisational and system levels. The ICPs were classified into three subgroups: 'United Integration Perspectives (UIP)', 'Disunited Integration Perspectives (DIP)' and 'Professional-oriented Integration Perspectives (PIP)'. ICPs within the UIP subgroup showed the strongest increase in trust-based (mutual gains and relationship dynamics) as well as control-based (organisational dynamics and process management) collaboration processes and had the highest overall effectiveness rates. ICPs within the DIP subgroup, by contrast, decreased on collaboration processes and had the lowest overall effectiveness rates. ICPs within the PIP subgroup increased in control-based collaboration processes (organisational dynamics and process management) and had the highest effectiveness rates at the professional level. The differences across the three subgroups in terms of the development of collaboration processes and final perceived effectiveness provide evidence that united stakeholder perspectives are achieved through a constructive collaboration process over time. Disunited perspectives at the professional, organisational and system levels can be aligned by both trust-based and control-based collaboration processes.

  8. VISPA2: a scalable pipeline for high-throughput identification and annotation of vector integration sites.

    PubMed

    Spinozzi, Giulio; Calabria, Andrea; Brasca, Stefano; Beretta, Stefano; Merelli, Ivan; Milanesi, Luciano; Montini, Eugenio

    2017-11-25

    Bioinformatics tools designed to identify lentiviral or retroviral vector insertion sites in the genome of host cells are used to address the safety and long-term efficacy of hematopoietic stem cell gene therapy applications and to study the clonal dynamics of hematopoietic reconstitution. The increasing number of gene therapy clinical trials, combined with the increasing amount of Next Generation Sequencing data aimed at identifying integration sites, requires highly accurate and efficient computational software able to correctly process "big data" in a reasonable computational time. Here we present VISPA2 (Vector Integration Site Parallel Analysis, version 2), the latest optimized computational pipeline for integration site identification and analysis, with the following features: (1) sequence analysis for integration site processing that is fully compliant with paired-end reads and includes a sequence quality filter before and after alignment on the target genome; (2) a heuristic algorithm that reduces false positive integration sites at the nucleotide level, lessening the impact of Polymerase Chain Reaction or trimming/alignment artifacts; (3) a classification and annotation module for integration sites; (4) a user-friendly web interface as the researcher front-end for performing integration site analyses without computational skills; (5) speedup of all steps through parallelization (Hadoop free). We tested VISPA2 performance using simulated and real datasets of lentiviral vector integration sites, previously obtained from patients enrolled in a hematopoietic stem cell gene therapy clinical trial, and compared the results with other preexisting tools for integration site analysis. On the computational side, VISPA2 showed a > 6-fold speedup and improved precision and recall metrics (1 and 0.97, respectively) compared to previously developed computational pipelines. These performances indicate that VISPA2 is a fast, reliable and user-friendly tool for integration site analysis, which allows gene therapy integration data to be handled in a cost- and time-effective fashion. Moreover, web access to VISPA2 (http://openserver.itb.cnr.it/vispa/) ensures accessibility and ease of use of a complex analytical tool for researchers. We released the source code of VISPA2 in a public repository (https://bitbucket.org/andreacalabria/vispa2).

  9. Integrating complex business processes for knowledge-driven clinical decision support systems.

    PubMed

    Kamaleswaran, Rishikesan; McGregor, Carolyn

    2012-01-01

    This paper presents in detail the component of the Complex Business Process for Stream Processing framework that is responsible for integrating complex business processes to enable knowledge-driven Clinical Decision Support System (CDSS) recommendations. CDSSs aid the clinician in supporting the care of patients by providing accurate data analysis and evidence-based recommendations. However, the incorporation of a dynamic knowledge-management system that supports the definition and enactment of complex business processes and real-time data streams has not been researched. In this paper we discuss the process web service as an innovative method of providing contextual information to a real-time data stream processing CDSS.

  10. PSP, TSP, XP, CMMI...Eating the Alphabet Soup!

    DTIC Science & Technology

    2011-05-19

    Only briefing-chart fragments of this document survive. The recoverable content maps CMMI maturity levels to their focus and process areas: continuous process improvement, with Organizational Performance Management and Causal Analysis and Resolution, at the highest level; process standardization, with Risk Management, Decision Analysis and Resolution, and Product Integration, at the defined level; and basic project management, with Requirements Management, at the managed level.

  11. Spectral analysis of temporal non-stationary rainfall-runoff processes

    NASA Astrophysics Data System (ADS)

    Chang, Ching-Min; Yeh, Hund-Der

    2018-04-01

    This study treats the catchment as a black-box system, considering the rainfall input and runoff output to be stochastic processes. The temporal rainfall-runoff relationship at the catchment scale is described by a convolution integral on a continuous time scale. Using the Fourier-Stieltjes representation approach, a frequency-domain solution to the convolution integral is developed for the spectral analysis of runoff processes generated by temporally non-stationary rainfall events. It is shown that the characteristic time scale of the rainfall process increases the runoff discharge variability, while the catchment mean travel time constant acts to reduce the variability of runoff discharge. Similar to the behavior of groundwater aquifers, catchments act as a low-pass filter in the frequency domain for the rainfall input signal.
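
    The statements above follow from the standard spectral relation for a linear, time-invariant system. As a sketch, assuming for illustration a linear-reservoir transfer function with mean travel time t_c (not necessarily the paper's exact catchment model):

        % Runoff as a convolution of rainfall with the catchment response h(t):
        Q(t) = \int_{0}^{\infty} h(\tau)\, R(t-\tau)\, d\tau

        % The Fourier-Stieltjes representation yields the spectral relationship
        S_{Q}(\omega) = \lvert H(\omega) \rvert^{2}\, S_{R}(\omega)

        % For a linear reservoir with mean travel time t_c,
        H(\omega) = \frac{1}{1 + i\,\omega\, t_{c}},
        \qquad
        \lvert H(\omega) \rvert^{2} = \frac{1}{1 + \omega^{2} t_{c}^{2}}

    Since |H(ω)|² decreases with ω t_c, a larger mean travel time damps high-frequency rainfall fluctuations, which is the low-pass behavior noted above.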

  12. Data on conceptual design of cryogenic energy storage system combined with liquefied natural gas regasification process.

    PubMed

    Lee, Inkyu; Park, Jinwoo; Moon, Il

    2017-12-01

    This paper describes the data of an integrated process: a cryogenic energy storage system combined with a liquefied natural gas (LNG) regasification process. The data in this paper are associated with the article entitled "Conceptual Design and Exergy Analysis of Combined Cryogenic Energy Storage and LNG Regasification Processes: Cold and Power Integration" (Lee et al., 2017) [1]. The data include the sensitivity case-study dataset for the air flow rate and the heat-exchange feasibility data given by composite curves. The data are expected to be helpful for cryogenic energy process development.

  13. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, many processing packages are available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently been tested only on Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use and is provided with source code upon request.

  14. Optimal Parameter Design of Coarse Alignment for Fiber Optic Gyro Inertial Navigation System.

    PubMed

    Lu, Baofeng; Wang, Qiuying; Yu, Chunmei; Gao, Wei

    2015-06-25

    Two different coarse alignment algorithms for Fiber Optic Gyro (FOG) Inertial Navigation Systems (INS) based on the inertial reference frame are discussed in this paper. Both are based on gravity vector integration; therefore, the performance of these algorithms is determined by the integration time. In previous works, the integration time was selected by experience. In order to give a criterion for the selection process and make the selection of the integration time more accurate, optimal parameter design of these algorithms for FOG INS is performed in this paper. The design process is accomplished based on an analysis of the error characteristics of these two coarse alignment algorithms. Moreover, this analysis and optimal parameter design allow us to select the most accurate algorithm for FOG INS according to the actual operational conditions. The analysis and simulation results show that the parameter provided by this work is the optimal value, and indicate that in different operational conditions, different coarse alignment algorithms should be adopted for FOG INS in order to achieve better performance. Lastly, the experimental results validate the effectiveness of the proposed algorithm.

  15. Development of virtual research environment for regional climatic and ecological studies and continuous education support

    NASA Astrophysics Data System (ADS)

    Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Bogomolov, Vasily; Gordova, Yulia; Martynova, Yulia; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara

    2014-05-01

    Volumes of environmental data archives are growing immensely due to recent models, high performance computers and sensors development. It makes impossible their comprehensive analysis in conventional manner on workplace using in house computing facilities, data storage and processing software at hands. One of possible answers to this challenge is creation of virtual research environment (VRE), which should provide a researcher with an integrated access to huge data resources, tools and services across disciplines and user communities and enable researchers to process structured and qualitative data in virtual workspaces. VRE should integrate data, network and computing resources providing interdisciplinary climatic research community with opportunity to get profound understanding of ongoing and possible future climatic changes and their consequences. Presented are first steps and plans for development of VRE prototype element aimed at regional climatic and ecological monitoring and modeling as well as at continuous education and training support. Recently developed experimental software and hardware platform aimed at integrated analysis of heterogeneous georeferenced data "Climate" (http://climate.scert.ru/, Gordov et al., 2013; Shulgina et al., 2013; Okladnikov et al., 2013) is used as a VRE element prototype and approach test bench. VRE under development will integrate on the base of geoportal distributed thematic data storage, processing and analysis systems and set of models of complex climatic and environmental processes run on supercomputers. VRE specific tools are aimed at high resolution rendering on-going climatic processes occurring in Northern Eurasia and reliable and found prognoses of their dynamics for selected sets of future mankind activity scenaria. Currently the VRE element is accessible via developed geoportal at the same link (http://climate.scert.ru/) and integrates the WRF and «Planet Simulator» models, basic reanalysis and instrumental measurements data and support profound statistical analysis of storaged and modeled on demand data. In particular, one can run the integrated models, preprocess modeling results data, using dedicated modules for numerical processing perform analysys and visualize obtained results. New functionality recently has been added to the statistical analysis tools set aimed at detailed studies of climatic extremes occurring in Northern Asia. The VRE element is also supporting thematic educational courses for students and post-graduate students of the Tomsk State University. In particular, it allow students to perform on-line thematic laboratory work cycles on the basics of analysis of current and potential future regional climate change using Siberia territory as an example (Gordova et al, 2013). We plan to expand the integrated models set and add comprehensive surface and Arctic Ocean description. Developed VRE element "Climate" provides specialists involved into multidisciplinary research projects with reliable and practical instruments for integrated research of climate and ecosystems changes on global and regional scales. With its help even a user without programming skills can process and visualize multidimensional observational and model data through unified web-interface using a common graphical web-browser. This work is partially supported by SB RAS project VIII.80.2.1, RFBR grant 13-05-12034, grant 14-05-00502, and integrated project SB RAS 131. References 1. 
    References:
    1. Gordov E.P., Lykosov V.N., Krupchatnikov V.N., Okladnikov I.G., Titov A.G., Shulgina T.M. Computational and information technologies for monitoring and modeling of climate changes and their consequences. Novosibirsk: Nauka, Siberian Branch, 2013. 195 p. (in Russian)
    2. Shulgina T.M., Gordov E.P., Okladnikov I.G., Titov A.G., Genina E.Yu., Gorbatenko N.P., Kuzhevskaya I.V., Akhmetshina A.S. Software complex for a regional climate change analysis. Vestnik NGU. Series: Information Technologies. 2013. Vol. 11, Issue 1. P. 124-131. (in Russian)
    3. Okladnikov I.G., Titov A.G., Shulgina T.M., Gordov E.P., Bogomolov V.Yu., Martynova Yu.V., Suschenko S.P., Skvortsov A.V. Software for analysis and visualization of climate change monitoring and forecasting data. Numerical Methods and Programming. 2013. Vol. 14. P. 123-131. (in Russian)
    4. Gordova Yu.E., Genina E.Yu., Gorbatenko V.P., Gordov E.P., Kuzhevskaya I.V., Martynova Yu.V., Okladnikov I.G., Titov A.G., Shulgina T.M., Barashkova N.K. Support of the educational process in modern climatology within the web-GIS platform «Climate». Open and Distant Education. 2013. No. 1(49). P. 14-19. (in Russian)

  16. Process and information integration via hypermedia

    NASA Technical Reports Server (NTRS)

    Hammen, David G.; Labasse, Daniel L.; Myers, Robert M.

    1990-01-01

    Success stories for advanced automation prototypes abound in the literature, but deployments of practical large systems are few. Several factors militate against the maturation of such prototypes into products. Here, the integration of advanced automation software into large systems is discussed. Advanced automation systems tend to be specific applications that need to be integrated and aggregated into larger systems. Systems integration can be achieved by providing expert user-developers with verified tools to efficiently create small systems that interface to large systems through standard interfaces. The use of hypermedia as such a tool is explored in the context of the ground control centers that support Shuttle and Space Station operations. Hypermedia can be an integrating platform for data, conventional software, and advanced automation software, enabling data integration through the display of diverse types of information and through the creation of associative links between chunks of information. Further, hypermedia enables process integration through graphical invocation of system functions. Through analysis and examples, the researchers illustrate how diverse information and processing paradigms can be integrated into a single software platform.
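
    The core idea, chunks of information joined by associative links that can also invoke processing, reduces to a small data structure; the classes and the action hook below are illustrative assumptions, not the authors' design.

        from dataclasses import dataclass, field
        from typing import Callable, List, Optional

        @dataclass
        class Node:
            """A chunk of information: text, telemetry, a plot, ..."""
            node_id: str
            content: str
            links: List["Link"] = field(default_factory=list)

        @dataclass
        class Link:
            """An associative link; an optional action turns navigation
            into process integration (invoking a system function)."""
            target: Node
            label: str
            action: Optional[Callable[[], None]] = None

            def follow(self) -> Node:
                if self.action is not None:
                    self.action()          # e.g. launch a monitoring routine
                return self.target

        # Wiring a telemetry page to a limit-checking routine
        telemetry = Node("telemetry", "Cabin pressure: 14.2 psi")
        summary = Node("summary", "Shuttle systems summary")
        summary.links.append(Link(telemetry, "inspect pressure",
                                  action=lambda: print("running limit check")))
        print(summary.links[0].follow().content)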

  17. Seven Processes that Enable NASA Software Engineering Technologies

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired, and maintained as specified in the NPR 7150.2A requirement, which directs that all software be appraised against the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management, and (1) Planning & Monitoring. Each process is described, along with the group(s) responsible for it.

  18. Coal gasification systems engineering and analysis. Appendix A: Coal gasification catalog

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The scope of work in preparing the Coal Gasification Data Catalog included the following subtasks: (1) candidate system subsystem definition, (2) raw materials analysis, (3) market analysis for by-products, (4) alternate products analysis, (5) preliminary integrated facility requirements. Definition of candidate systems/subsystems includes the identity of and alternates for each process unit, raw material requirements, and the cost and design drivers for each process design.

  19. MDAS: an integrated system for metabonomic data analysis.

    PubMed

    Liu, Juan; Li, Bo; Xiong, Jiang-Hui

    2009-03-01

    Metabonomics, the latest 'omics' research field, shows great promise as a tool in biomarker discovery, drug efficacy and toxicity analysis, and disease diagnosis and prognosis. One of the major challenges now facing researchers is how to process these data to yield useful information about a biological system, e.g., the mechanisms of disease. Traditional methods employed in metabonomic data analysis use multivariate analysis methods developed independently in chemometrics research. Additionally, with the development of machine learning approaches, methods such as SVMs also show promise for metabonomic data analysis. Aside from the application of general multivariate analysis and machine learning methods to this problem, there is also a need for an integrated tool customized for metabonomic data analysis that can be easily used by biologists to reveal interesting patterns in metabonomic data. In this paper, we present a novel software tool, MDAS (Metabonomic Data Analysis System), for metabonomic data analysis, which integrates traditional chemometrics methods and newly introduced machine learning approaches. MDAS contains a suite of functional models for metabonomic data analysis and optimizes the flow of data analysis. Several file formats can be accepted as input. The input data can be optionally preprocessed and then processed with operations such as feature analysis and dimensionality reduction. The data with reduced dimensionality can be used for training or testing through machine learning models. The system supplies proper visualization for data preprocessing, feature analysis, and classification, which can be a powerful aid for users to extract knowledge from the data. MDAS is an integrated platform for metabonomic data analysis, which transforms a complex analysis procedure into a more formalized and simplified one. The software package can be obtained from the authors.
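
    The analysis flow the tool formalizes (preprocessing, dimensionality reduction, classification) can be approximated with a scikit-learn pipeline; the sketch below uses synthetic data and invented parameters, and is not MDAS itself.

        import numpy as np
        from sklearn.pipeline import Pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        # Synthetic stand-in for a spectral metabonomic matrix:
        # 60 samples x 200 features, two classes (e.g. control vs. treated)
        rng = np.random.default_rng(1)
        X = rng.standard_normal((60, 200))
        y = np.repeat([0, 1], 30)
        X[y == 1, :5] += 1.5                    # inject a weak class signal

        pipeline = Pipeline([
            ("scale", StandardScaler()),        # preprocessing
            ("reduce", PCA(n_components=10)),   # dimensionality reduction
            ("classify", SVC(kernel="rbf")),    # machine-learning model
        ])
        print(cross_val_score(pipeline, X, y, cv=5).mean())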

  20. [Health in Andean regional integration].

    PubMed

    Agudelo, Carlos A

    2007-01-01

    Despite their shared history, the Andean countries are socially and politically diverse, with heterogeneous health realities and complex integration processes. General developments such as the Latin American Free Trade Association and Latin American Integration Association have existed for decades, along with others of a regional scope, like the Andean Community of Nations, Caribbean Community, and Central American Common Market. The health field has a specific instrument in the Andean Region called the Hipólito Unánue Agreement, created in 1971. Integration processes have concentrated on economic aspects, based on preferential customs agreements that have led to an important long-term increase in trade. Less progress has been made in the field of health in terms of sharing national experiences, knowledge, and capabilities. Analysis of experiences in health has shown that integration depends on the countries' respective strengths and to a major extent on national political processes.

  1. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform.

    PubMed

    Moutsatsos, Ioannis K; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J; Jenkins, Jeremy L; Holway, Nicholas; Tallarico, John; Parker, Christian N

    2017-03-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an "off-the-shelf," open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community.
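
    A Jenkins job step of this kind typically shells out to CellProfiler's headless mode; the wrapper, paths, and pipeline name below are invented, and the -c/-r/-p/-i/-o flags follow CellProfiler's documented headless interface but should be verified against the installed version.

        import subprocess
        from pathlib import Path

        def run_pipeline(pipeline: Path, image_dir: Path, out_dir: Path) -> None:
            """Invoke CellProfiler headless on one plate of HCS images.

            Flags: -c (no GUI), -r (run immediately), -p (pipeline file);
            check them against the installed CellProfiler version.
            """
            out_dir.mkdir(parents=True, exist_ok=True)
            subprocess.run(
                ["cellprofiler", "-c", "-r",
                 "-p", str(pipeline),
                 "-i", str(image_dir),
                 "-o", str(out_dir)],
                check=True,   # fail the Jenkins build on a non-zero exit
            )

        if __name__ == "__main__":
            run_pipeline(Path("pipeline.cppipe"),
                         Path("images/plate01"), Path("results/plate01"))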

  2. Jenkins-CI, an Open-Source Continuous Integration System, as a Scientific Data and Image-Processing Platform

    PubMed Central

    Moutsatsos, Ioannis K.; Hossain, Imtiaz; Agarinis, Claudia; Harbinski, Fred; Abraham, Yann; Dobler, Luc; Zhang, Xian; Wilson, Christopher J.; Jenkins, Jeremy L.; Holway, Nicholas; Tallarico, John; Parker, Christian N.

    2016-01-01

    High-throughput screening generates large volumes of heterogeneous data that require a diverse set of computational tools for management, processing, and analysis. Building integrated, scalable, and robust computational workflows for such applications is challenging but highly valuable. Scientific data integration and pipelining facilitate standardized data processing, collaboration, and reuse of best practices. We describe how Jenkins-CI, an “off-the-shelf,” open-source, continuous integration system, is used to build pipelines for processing images and associated data from high-content screening (HCS). Jenkins-CI provides numerous plugins for standard compute tasks, and its design allows the quick integration of external scientific applications. Using Jenkins-CI, we integrated CellProfiler, an open-source image-processing platform, with various HCS utilities and a high-performance Linux cluster. The platform is web-accessible, facilitates access and sharing of high-performance compute resources, and automates previously cumbersome data and image-processing tasks. Imaging pipelines developed using the desktop CellProfiler client can be managed and shared through a centralized Jenkins-CI repository. Pipelines and managed data are annotated to facilitate collaboration and reuse. Limitations with Jenkins-CI (primarily around the user interface) were addressed through the selection of helper plugins from the Jenkins-CI community. PMID:27899692

  3. Evaluation of automated decisionmaking methodologies and development of an integrated robotic system simulation, volume 2, part 1. Appendix A: Software documentation

    NASA Technical Reports Server (NTRS)

    Lowrie, J. W.; Fermelia, A. J.; Haley, D. C.; Gremban, K. D.; Vanbaalen, J.; Walsh, R. W.

    1982-01-01

    Documentation of the preliminary software developed as a framework for a generalized integrated robotic system simulation is presented. The program structure is composed of three major functions controlled by a program executive. The three major functions are: system definition, analysis tools, and post processing. The system definition function handles user input of system parameters and definition of the manipulator configuration. The analysis tools function handles the computational requirements of the program. The post processing function allows for more detailed study of the results of analysis tool function executions. Also documented is the manipulator joint model software to be used as the basis of the manipulator simulation which will be part of the analysis tools capability.

  4. Applying integrals of motion to the numerical solution of differential equations

    NASA Technical Reports Server (NTRS)

    Jezewski, D. J.

    1980-01-01

    A method is developed for using the integrals of systems of nonlinear ordinary differential equations in a numerical integration process to control the local errors in these integrals and reduce the global errors of the solution. The method is general and can be applied to either scalar or vector integrals. A number of example problems, with accompanying numerical results, are used to verify the analysis and support the conjecture of global error reduction.
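
    One common way to realize this idea is to project each numerical step back onto the level set of a known integral; the sketch below applies such a projection to the energy integral of a harmonic oscillator, as a generic illustration rather than the paper's exact formulation.

        import numpy as np

        def rk4_step(f, y, h):
            k1 = f(y); k2 = f(y + h/2*k1); k3 = f(y + h/2*k2); k4 = f(y + h*k3)
            return y + h/6*(k1 + 2*k2 + 2*k3 + k4)

        # Harmonic oscillator y = (q, p); the integral E = (q^2 + p^2)/2 is conserved
        f = lambda y: np.array([y[1], -y[0]])
        E = lambda y: 0.5 * (y[0]**2 + y[1]**2)

        y, h, E0 = np.array([1.0, 0.0]), 0.1, 0.5
        for _ in range(10000):
            y = rk4_step(f, y, h)
            y *= np.sqrt(E0 / E(y))   # project back onto the energy level set

        print(E(y) - E0)   # error in the integral held near round-off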

  5. Applying integrals of motion to the numerical solution of differential equations

    NASA Technical Reports Server (NTRS)

    Jezewski, D. J.

    1979-01-01

    A method is developed for using the integrals of systems of nonlinear, ordinary differential equations in a numerical integration process to control the local errors in these integrals and reduce the global errors of the solution. The method is general and can be applied to either scalar or vector integrals. A number of example problems, with accompanying numerical results, are used to verify the analysis and support the conjecture of global error reduction.

  6. Proceedings of the 21st Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Progress made by the Flat Plate Solar Array Project during the period April 1982 to January 1983 is described. Reports on polysilicon refining, thin film solar cell and module technology development, central station electric utility activities, silicon sheet growth and characteristics, advanced photovoltaic materials, cell and processes research, module technology, environmental isolation, engineering sciences, module performance and failure analysis and project analysis and integration are included.

  7. Corrective emotional experience in an integrative affect-focused therapy: Building a preliminary model using task analysis.

    PubMed

    Nakamura, Kaori; Iwakabe, Shigeru

    2018-03-01

    The present study constructed a preliminary process model of corrective emotional experience (CEE) in an integrative affect-focused therapy. Task analysis was used to analyse 6 in-session events taken from 6 Japanese clients who worked with an integrative affect-focused therapist. The 6 events included 3 successful CEEs and 3 partially successful CEEs for comparison. A rational-empirical model of CEE was generated, which consisted of two parallel client change processes, intrapersonal change and interpersonal change, and the therapist interventions corresponding to each process. Therapist experiential interventions and therapist affirmation facilitated both the intrapersonal and interpersonal change processes, whereas relational interventions were associated with the interpersonal change process. The partially successful CEEs were differentiated by the absence of the component of core painful emotions or negative beliefs in the intrapersonal change process, which seemed crucial for the interpersonal change process to develop. CEE is best represented by a preliminary model that depicts two parallel yet interacting change processes. The intrapersonal change process is similar to the sequence of change described by the emotional processing model (Pascual-Leone & Greenberg), whereas the interpersonal change process is a unique contribution of this study. The interpersonal change process was facilitated when the therapist's active stance and use of immediacy responses to make the relational process explicit allowed a shared exploration. Therapist affirmation bridged intrapersonal change to interpersonal change by promoting an adaptive sense of self in clients and forging a deeper emotional connection between the two. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Visual noise disrupts conceptual integration in reading.

    PubMed

    Gao, Xuefei; Stine-Morrow, Elizabeth A L; Noh, Soo Rim; Eskew, Rhea T

    2011-02-01

    The Effortfulness Hypothesis suggests that sensory impairment (either simulated or age-related) may decrease capacity for semantic integration in language comprehension. We directly tested this hypothesis by measuring resource allocation to different levels of processing during reading (i.e., word vs. semantic analysis). College students read three sets of passages word-by-word, one at each of three levels of dynamic visual noise. There was a reliable interaction between processing level and noise, such that visual noise increased resources allocated to word-level processing, at the cost of attention paid to semantic analysis. Recall of the most important ideas also decreased with increasing visual noise. Results suggest that sensory challenge can impair higher-level cognitive functions in learning from text, supporting the Effortfulness Hypothesis.

  9. "Competing Conceptions of Globalization" Revisited: Relocating the Tension between World-Systems Analysis and Globalization Analysis

    ERIC Educational Resources Information Center

    Clayton, Thomas

    2004-01-01

    In recent years, many scholars have become fascinated by a contemporary, multidimensional process that has come to be known as "globalization." Globalization originally described economic developments at the world level. More specifically, scholars invoked the concept in reference to the process of global economic integration and the seemingly…

  10. The Integration of Psycholinguistic and Discourse Processing Theories of Reading Comprehension.

    ERIC Educational Resources Information Center

    Beebe, Mona J.

    To assess the compatibility of miscue analysis and recall analysis as independent elements in a theory of reading comprehension, a study was performed that operationalized each theory and separated its components into measurable units to allow empirical testing. A cueing strategy model was estimated, but the discourse processing model was broken…

  11. Integrated Modeling Activities for the James Webb Space Telescope (JWST): Structural-Thermal-Optical Analysis

    NASA Technical Reports Server (NTRS)

    Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.

    2004-01-01

    This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical ("STOP") analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models, and optical performance is predicted using either an optical ray trace or wavefront-error estimation techniques based on prior ray traces or first-order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in metal properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.
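
    As a caricature of the STOP chain, the sketch below maps a temperature change through a thermal strain to a despace and a linear wavefront-error response; the geometry, CTE, and optical sensitivity are invented toy numbers, not JWST values.

        # Toy STOP chain: thermal -> structural -> optical
        alpha = 2.3e-8          # CTE of a notional metering structure [1/K]
        length = 1.0            # strut length [m]
        sens = 0.5              # toy optical sensitivity [nm WFE per nm despace]

        def stop_chain(delta_T):
            """Map a temperature change to a wavefront-error estimate."""
            despace = alpha * length * delta_T * 1e9   # thermal deformation [nm]
            return sens * despace                      # linear optical response [nm]

        # Sensitivity study: how WFE responds to uncertainty in delta_T
        for dT in (0.05, 0.10, 0.20):                  # slew-induced changes [K]
            print(f"dT = {dT:5.2f} K -> WFE ~ {stop_chain(dT):6.3f} nm")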

  12. COBRApy: COnstraints-Based Reconstruction and Analysis for Python.

    PubMed

    Ebrahim, Ali; Lerman, Joshua A; Palsson, Bernhard O; Hyduke, Daniel R

    2013-08-08

    COnstraint-Based Reconstruction and Analysis (COBRA) methods are widely used for genome-scale modeling of metabolic networks in both prokaryotes and eukaryotes. Due to the successes with metabolism, there is an increasing effort to apply COBRA methods to reconstruct and analyze integrated models of cellular processes. The COBRA Toolbox for MATLAB is a leading software package for genome-scale analysis of metabolism; however, it was not designed to elegantly capture the complexity inherent in integrated biological networks and lacks an integration framework for the multiomics data used in systems biology. The openCOBRA Project is a community effort to promote constraints-based research through the distribution of freely available software. Here, we describe COBRA for Python (COBRApy), a Python package that provides support for basic COBRA methods. COBRApy is designed in an object-oriented fashion that facilitates the representation of the complex biological processes of metabolism and gene expression. COBRApy does not require MATLAB to function; however, it includes an interface to the COBRA Toolbox for MATLAB to facilitate use of legacy codes. For improved performance, COBRApy includes parallel processing support for computationally intensive processes. COBRApy is an object-oriented framework designed to meet the computational challenges associated with the next generation of stoichiometric constraint-based models and high-density omics data sets. http://opencobra.sourceforge.net/
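
    Typical COBRApy usage follows the pattern below; read_sbml_model, optimize, and the Solution attributes match the package's currently documented API (which may differ in detail from the 2013 release), and the model file name is a placeholder.

        import cobra

        # Load a genome-scale model from SBML (file name is a placeholder)
        model = cobra.io.read_sbml_model("e_coli_core.xml")

        # Flux balance analysis: maximize the model's objective (growth)
        solution = model.optimize()
        print("growth rate:", solution.objective_value)

        # Object-oriented access to reactions and their computed fluxes
        for reaction in list(model.reactions)[:5]:
            print(reaction.id, solution.fluxes[reaction.id])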

  13. IPAD applications to the design, analysis, and/or machining of aerospace structures. [Integrated Program for Aerospace-vehicle Design

    NASA Technical Reports Server (NTRS)

    Blackburn, C. L.; Dovi, A. R.; Kurtze, W. L.; Storaasli, O. O.

    1981-01-01

    A computer software system for the processing and integration of engineering data and programs, called IPAD (Integrated Programs for Aerospace-Vehicle Design), is described. The ability of the system to relieve the engineer of the mundane task of input data preparation is demonstrated by the application of a prototype system to the design, analysis, and/or machining of three simple structures. Future work to further enhance the system's automated data handling and ability to handle larger and more varied design problems are also presented.

  14. A process-based framework to guide nurse practitioners integration into primary healthcare teams: results from a logic analysis.

    PubMed

    Contandriopoulos, Damien; Brousselle, Astrid; Dubois, Carl-Ardy; Perroux, Mélanie; Beaulieu, Marie-Dominique; Brault, Isabelle; Kilpatrick, Kelley; D'Amour, Danielle; Sangster-Gormley, Esther

    2015-02-27

    Integrating nurse practitioners into primary care teams is a process that involves significant challenges. To be successful, nurse practitioner integration into primary care teams requires, among other things, a redefinition of professional boundaries, in particular those of medicine and nursing, a coherent model of inter- and intra-professional collaboration, and team-based work processes that make the best use of the subsidiarity principle. There have been numerous studies on nurse practitioner integration, and the literature provides a comprehensive list of barriers to, and facilitators of, integration. However, this literature is much less prolific in discussing the operational-level implications of those barriers and facilitators and in offering practical recommendations. In the context of a large-scale research project on the introduction of nurse practitioners in Quebec (Canada), we relied on a logic-analysis approach based, on the one hand, on a realist review of the literature and, on the other, on qualitative case studies in six primary healthcare teams in rural and urban areas of Quebec. Five core themes that need to be taken into account when integrating nurse practitioners into primary care teams were identified. Those themes are: planning, role definition, practice model, collaboration, and team support. The present paper has two objectives: to present the methods used to develop the themes, and to discuss an integrative model of nurse practitioner integration support centered around these themes. It concludes with a discussion of how this framework contributes to existing knowledge and some ideas for future avenues of study.

  15. Process Intensification for Cellulosic Biorefineries.

    PubMed

    Sadula, Sunitha; Athaley, Abhay; Zheng, Weiqing; Ierapetritou, Marianthi; Saha, Basudeb

    2017-06-22

    Utilization of renewable carbon sources, especially non-food biomass, is critical to addressing climate change and future energy challenges. Current chemical and enzymatic processes for producing cellulosic sugars are multistep and energy- and water-intensive. Techno-economic analysis (TEA) suggests that upstream lignocellulose processing is a major hurdle to the economic viability of cellulosic biorefineries. Process intensification, which integrates processes and uses less water and energy, has the potential to overcome the aforementioned challenges. Here, we demonstrate a one-pot depolymerization and saccharification process for woody biomass, energy crops, and agricultural residues to produce soluble sugars in high yields. Lignin is separated as a solid for selective upgrading. Further integration of our upstream process with a reactive extraction step enables energy-efficient separation of sugars in the form of furans. TEA reveals that the process efficiency and integration enable, for the first time, economic production of feed streams that could profoundly improve process economics for downstream cellulosic bioproducts. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Requirement analysis for the one-stop logistics management of fresh agricultural products

    NASA Astrophysics Data System (ADS)

    Li, Jun; Gao, Hongmei; Liu, Yuchuan

    2017-08-01

    Issues and concerns regarding food safety, agro-processing, and the environmental and ecological impact of food production have attracted much research interest. Traceability and logistics management of fresh agricultural products face technological challenges including product labeling and identification, activity/process characterization, and information systems for the supply chain, i.e., from farm to table. The application of one-stop logistics services, focusing on whole-supply-chain process integration for fresh agricultural products, is studied. A collaborative research project on the supply and logistics of fresh agricultural products in Tianjin was performed. Requirement analysis for the one-stop logistics management information system is presented. Model-driven business transformation, an approach that uses formal models to explicitly define the structure and behavior of a business, is applied in the review and analysis process. Specific requirements for the logistics management solutions are proposed. This research is crucial for developing an integrated one-stop logistics management information system platform for fresh agricultural products.

  17. Protocol for a process-oriented qualitative evaluation of the Waltham Forest and East London Collaborative (WELC) integrated care pioneer programme using the Researcher-in-Residence model

    PubMed Central

    Eyre, Laura; George, Bethan; Marshall, Martin

    2015-01-01

    Introduction The integration of health and social care in England is widely accepted as the answer to fragmentation, financial concerns and system inefficiencies, in the context of growing and ageing populations with increasingly complex needs. Despite an expanding body of literature, there is little evidence yet to suggest that integrated care can achieve the benefits that its advocates claim for it. Researchers have often adopted rationalist and technocratic approaches to evaluation, treating integration as an intervention rather than a process. Results have usually been of limited use to practitioners responsible for health and social care integration. There is, therefore, a need to broaden the evidence base, exploring not only what works but also how integrated care can most successfully be implemented and delivered. For this reason, we are carrying out a formative evaluation of the Waltham Forest and East London Collaborative (WELC) integrated care pioneer programme. Our expectation is that this will add value to the literature by focusing on the processes by which the vision and objectives of integrated care are translated through phases of development, implementation and delivery from a central to a local perspective, and from a strategic to an operational perspective. Methods and analysis The qualitative and process-oriented evaluation uses an innovative participative approach—the Researcher-in-Residence model. The evaluation is underpinned by a critical ontology, an interpretive epistemology and a critical discourse analysis methodology. Data will be generated using interviews, observations and documentary gathering. Ethics and dissemination Emerging findings will be interpreted and disseminated collaboratively with stakeholders, to enable the research to influence and optimise the effective implementation of integrated care across WELC. Presentations and publications will ensure that learning is shared as widely as possible. The study has received ethical approval from University College London's Research Ethics Committee and has all appropriate NHS governance clearances. PMID:26546147

  18. Integration of photovoltaic and concentrated solar thermal technologies for H2 production by the hybrid sulfur cycle

    NASA Astrophysics Data System (ADS)

    Liberatore, Raffaele; Ferrara, Mariarosaria; Lanchi, Michela; Turchetti, Luca

    2017-06-01

    It is widely agreed that hydrogen used as an energy carrier and/or storage medium may contribute significantly to the reduction of emissions, especially if produced from renewable energy sources. The Hybrid Sulfur (HyS) cycle is considered one of the most promising processes for producing hydrogen through water splitting. The FP7 project SOL2HY2 (Solar to Hydrogen Hybrid Cycles) investigates innovative material and process solutions for the use of solar heat and power in the HyS process. A significant part of the SOL2HY2 project is devoted to the analysis and optimization of the integration of the solar and chemical (hydrogen production) plants. In this context, this work investigates the possibility of integrating different solar technologies, namely photovoltaics, solar central receivers, and solar troughs, to optimize their use in the HyS cycle for green hydrogen production, in both the open and closed process configurations. The analysis accounts for different combinations of geographical location and plant sizing criteria. The use of a sulfur burner, which can serve both as a thermal backup and as an SO2 source for the open cycle, is also considered.

  19. System integration of marketable subsystems. [for residential solar heating and cooling

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Progress is reported in the following areas: systems integration of marketable subsystems; development, design, and building of site data acquisition subsystems; development and operation of the central data processing system; operation of the MSFC Solar Test Facility; and systems analysis.

  20. Master of Puppets: Cooperative Multitasking for In Situ Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-01-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. Here, we present a novel design for running multiple codes in situ: using coroutines and position-independent executables we enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output, as well as to process it on the fly, both in situ and in transit. We present Henson, an implementation of our design, and illustrate its versatility by tackling analysis tasks with different computational requirements. This design differs significantly from the existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The techniques we present can also be integrated into other in situ frameworks.
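
    The cooperative-multitasking pattern can be miniaturized with Python generators: the simulation yields control at each step and the analysis consumes the state while it is still in memory. This is a toy illustration of the in situ idea only, not Henson's coroutine and position-independent-executable mechanism.

        def simulation(steps):
            """Toy time-stepping code; yields its state instead of writing files."""
            state = 0.0
            for t in range(steps):
                state += 1.0 / (t + 1)       # stand-in for a physics update
                yield t, state               # hand control to the analysis

        def analysis(stream):
            """In situ consumer: processes each step while it is in memory."""
            for t, state in stream:
                if t % 2 == 0:
                    print(f"step {t}: state = {state:.4f}")

        analysis(simulation(7))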

  1. BFPTool: a software tool for analysis of Biomembrane Force Probe experiments.

    PubMed

    Šmít, Daniel; Fouquet, Coralie; Doulazmi, Mohamed; Pincet, Frédéric; Trembleau, Alain; Zapotocky, Martin

    2017-01-01

    The Biomembrane Force Probe (BFP) is an approachable experimental technique commonly used for single-molecule force spectroscopy and experiments on biological interfaces. The technique operates in the range of forces from 0.1 pN to 1000 pN. Experiments are typically repeated many times and conditions are often not optimal; the captured video can be unstable and lose focus. This makes efficient analysis challenging, while out-of-the-box non-proprietary solutions are not freely available. This dedicated tool was developed to integrate and simplify the image processing and analysis of videomicroscopy recordings from BFP experiments. A novel processing feature, allowing the tracking of the pipette, was incorporated to address a limitation of preceding methods. Emphasis was placed on versatility and a comprehensible user interface implemented in graphical form. An integrated analytical tool was implemented to provide a faster, simpler, and more convenient way to process and analyse BFP experiments.

  2. Short-Term Load Forecasting Error Distributions and Implications for Renewable Integration Studies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, B. M.; Lew, D.; Milligan, M.

    2013-01-01

    Load forecasting in the day-ahead timescale is a critical aspect of power system operations that is used in the unit commitment process. It is also an important factor in renewable energy integration studies, where the combination of load and wind or solar forecasting techniques creates the net load uncertainty that must be managed by the economic dispatch process or with suitable reserves. An understanding of the load forecasting errors that may be expected in this process can lead to better decisions about the amount of reserves necessary to compensate for errors. In this work, we performed a statistical analysis of the day-ahead (and two-day-ahead) load forecasting errors observed in two independent system operators for a one-year period. Comparisons were made with the normal distribution commonly assumed in power system operation simulations used for renewable power integration studies. Further analysis identified time periods when the load is more likely to be under- or over-forecast.
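
    In outline, the statistical comparison works as below: fit a normal distribution to the error series and examine higher moments and goodness of fit. The error data here are synthetic stand-ins for the ISO series (drawn heavy-tailed on purpose); all parameters are invented.

        import numpy as np
        from scipy import stats

        # Synthetic stand-in for day-ahead load forecast errors (% of load):
        # a t-distribution gives the heavy tails often seen in practice
        rng = np.random.default_rng(2)
        errors = stats.t.rvs(df=4, scale=1.2, size=8760, random_state=rng)

        mu, sigma = stats.norm.fit(errors)         # the assumption under test
        print(f"normal fit: mu={mu:.3f}, sigma={sigma:.3f}")
        print(f"excess kurtosis: {stats.kurtosis(errors):.2f}")  # ~0 if normal

        # Goodness of fit of the normal assumption
        ks = stats.kstest(errors, "norm", args=(mu, sigma))
        print(f"KS statistic={ks.statistic:.3f}, p-value={ks.pvalue:.3g}")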

  3. An Observation Analysis Tool for time-series analysis and sensor management in the FREEWAT GIS environment for water resources management

    NASA Astrophysics Data System (ADS)

    Cannata, Massimiliano; Neumann, Jakob; Cardoso, Mirko; Rossetto, Rudy; Foglia, Laura; Borsi, Iacopo

    2017-04-01

    In situ time series are an important aspect of environmental modelling, especially with the advancement of numerical simulation techniques and increased model complexity. In order to make use of the increasing data available through the requirements of the EU Water Framework Directive, the FREEWAT GIS environment incorporates the newly developed Observation Analysis Tool (OAT) for time-series analysis. The tool is used to import time-series data into QGIS from local CSV files, online sensors using the istSOS service, or MODFLOW model result files, and enables visualisation, pre-processing of data for model development, and post-processing of model results. OAT can be used as a pre-processor for calibration observations, integrating the creation of observations for calibration directly from sensor time series. The tool consists of an expandable Python library of processing methods and an interface integrated in the QGIS FREEWAT plug-in, which includes a large number of modelling capabilities, data management tools, and calibration capacity.
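
    The pre-processing role such a tool plays can be sketched with pandas; the file, column names, and thresholds below are invented for illustration and are not the OAT API itself.

        import pandas as pd

        # Stand-in for OAT-style pre-processing: load a sensor CSV, resample,
        # and fill short gaps before handing observations to a model/calibration.
        df = pd.read_csv("well_level.csv", parse_dates=["time"], index_col="time")

        daily = df["level_m"].resample("D").mean()      # hourly -> daily means
        daily = daily.interpolate(limit=3)              # bridge gaps up to 3 days

        # Simple quality screen before export to the model
        clean = daily[(daily > daily.mean() - 4 * daily.std()) &
                      (daily < daily.mean() + 4 * daily.std())]
        clean.to_csv("observations_daily.csv")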

  4. Biometric Attendance and Big Data Analysis for Optimizing Work Processes.

    PubMed

    Verma, Neetu; Xavier, Teenu; Agrawal, Deepak

    2016-01-01

    Although biometric attendance management is available, large healthcare organizations have difficulty with big data analysis for the optimization of work processes. The aim of this project was to assess the implementation of a biometric attendance system and its utility following big data analysis. In this prospective study, the implementation of the biometric system was evaluated over a 3-month period at our institution. Software integration with other existing systems for data analysis was also evaluated. Implementation of the biometric system was successfully completed over a two-month period, with enrollment of 10,000 employees into the system. However, generating reports and taking action for this large number of staff was a challenge. For this purpose, software was developed to capture the duty roster of each employee, integrate it with the biometric system, and add an SMS gateway. This helped automate the process of sending an SMS to each employee who had not signed in. Standalone biometric systems have limited functionality in large organizations unless they are meshed with the employee duty roster.
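
    The roster/biometric join that drives the reminders can be sketched as below; the employee IDs, data structures, and the send_sms stub are invented for illustration.

        # Sketch of the roster/biometric join behind the SMS reminders
        def absentees(roster: dict, signed_in: set) -> list:
            """Employees rostered for today who have not signed in."""
            return [emp for emp, on_duty in roster.items()
                    if on_duty and emp not in signed_in]

        def send_sms(employee_id: str) -> None:
            # Stub standing in for the SMS-gateway call
            print(f"SMS -> {employee_id}: please record your attendance")

        roster = {"E001": True, "E002": False, "E003": True}
        signed_in = {"E001"}
        for emp in absentees(roster, signed_in):
            send_sms(emp)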

  5. Integration of Multifidelity Multidisciplinary Computer Codes for Design and Analysis of Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Geiselhart, Karl A.; Ozoroski, Lori P.; Fenbert, James W.; Shields, Elwood W.; Li, Wu

    2011-01-01

    This paper documents the development of a conceptual level integrated process for design and analysis of efficient and environmentally acceptable supersonic aircraft. To overcome the technical challenges to achieve this goal, a conceptual design capability which provides users with the ability to examine the integrated solution between all disciplines and facilitates the application of multidiscipline design, analysis, and optimization on a scale greater than previously achieved, is needed. The described capability is both an interactive design environment as well as a high powered optimization system with a unique blend of low, mixed and high-fidelity engineering tools combined together in the software integration framework, ModelCenter. The various modules are described and capabilities of the system are demonstrated. The current limitations and proposed future enhancements are also discussed.

  6. An expert system for integrated structural analysis and design optimization for aerospace structures

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The results of a research study on the development of an expert system for integrated structural analysis and design optimization are presented. An Object Representation Language (ORL) was developed first, in conjunction with a rule-based system. This ORL/AI shell was then used to develop expert systems to provide assistance with a variety of structural analysis and design optimization tasks, in conjunction with procedural modules for finite element structural analysis and design optimization. The main goal of the research study was to provide expertise, judgment, and reasoning capabilities in the aerospace structural design process. This allows engineers performing structural analysis and design, even without extensive experience in the field, to develop error-free, efficient, and reliable structural designs rapidly and cost-effectively. This would not only improve the productivity of design engineers and analysts, but also significantly reduce the time to completion of structural design. An extensive literature survey in the fields of structural analysis, design optimization, artificial intelligence, and database management systems, and their application to the structural design process, was first performed. A feasibility study was then performed, and the architecture and conceptual design for the integrated 'intelligent' structural analysis and design optimization software were developed. An Object Representation Language (ORL), in conjunction with a rule-based system, was then developed using C++. Such an approach improves the expressiveness of knowledge representation (especially for structural analysis and design applications), provides the ability to build very large and practical expert systems, and provides an efficient way of storing knowledge. Functional specifications for the expert systems were then developed. The ORL/AI shell was then used to develop a variety of expert-system modules for modeling, finite element analysis, and design optimization tasks in the integrated aerospace structural design process. These expert systems were developed to work in conjunction with procedural finite element structural analysis and design optimization modules (developed in-house at SAT, Inc.). The complete software, AutoDesign, so developed, can be used for integrated 'intelligent' structural analysis and design optimization. The software was beta-tested at a variety of companies and used by a range of engineers with different levels of background and expertise. Based on the feedback obtained from such users, conclusions were developed and are provided.
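
    The object-plus-rules architecture can be caricatured in a few lines; the toy forward-chaining loop below is a generic illustration of the idea, not the ORL shell or AutoDesign, and every object, rule, and threshold in it is invented.

        # Toy forward-chaining rule system over a design object
        beam = {"type": "beam", "stress_MPa": 310.0, "yield_MPa": 350.0,
                "safety_factor": None, "flag": None}

        rules = [
            # Rule 1: derive the safety factor if it is not yet known
            (lambda o: o["safety_factor"] is None,
             lambda o: o.update(safety_factor=o["yield_MPa"] / o["stress_MPa"])),
            # Rule 2: flag designs whose margin is too low
            (lambda o: o["safety_factor"] is not None and o["safety_factor"] < 1.25,
             lambda o: o.update(flag="redesign: margin too low")),
        ]

        changed = True
        while changed:                      # fire rules until quiescence
            changed = False
            for condition, action in rules:
                if condition(beam):
                    snapshot = dict(beam)
                    action(beam)
                    changed = changed or beam != snapshot
        print(beam)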

  7. An expert system for integrated structural analysis and design optimization for aerospace structures

    NASA Astrophysics Data System (ADS)

    1992-04-01

    The results of a research study on the development of an expert system for integrated structural analysis and design optimization are presented. An Object Representation Language (ORL) was developed first, in conjunction with a rule-based system. This ORL/AI shell was then used to develop expert systems to provide assistance with a variety of structural analysis and design optimization tasks, in conjunction with procedural modules for finite element structural analysis and design optimization. The main goal of the research study was to provide expertise, judgment, and reasoning capabilities in the aerospace structural design process. This allows engineers performing structural analysis and design, even without extensive experience in the field, to develop error-free, efficient, and reliable structural designs rapidly and cost-effectively. This would not only improve the productivity of design engineers and analysts, but also significantly reduce the time to completion of structural design. An extensive literature survey in the fields of structural analysis, design optimization, artificial intelligence, and database management systems, and their application to the structural design process, was first performed. A feasibility study was then performed, and the architecture and conceptual design for the integrated 'intelligent' structural analysis and design optimization software were developed. An Object Representation Language (ORL), in conjunction with a rule-based system, was then developed using C++. Such an approach improves the expressiveness of knowledge representation (especially for structural analysis and design applications), provides the ability to build very large and practical expert systems, and provides an efficient way of storing knowledge. Functional specifications for the expert systems were then developed. The ORL/AI shell was then used to develop a variety of expert-system modules for modeling, finite element analysis, and design optimization tasks in the integrated aerospace structural design process. These expert systems were developed to work in conjunction with procedural finite element structural analysis and design optimization modules (developed in-house at SAT, Inc.). The complete software, AutoDesign, so developed, can be used for integrated 'intelligent' structural analysis and design optimization. The software was beta-tested at a variety of companies and used by a range of engineers with different levels of background and expertise. Based on the feedback obtained from such users, conclusions were developed and are provided.

  8. Integration of design and inspection

    NASA Astrophysics Data System (ADS)

    Simmonds, William H.

    1990-08-01

    Developments in advanced computer integrated manufacturing technology, coupled with the emphasis on Total Quality Management, are exposing needs for new techniques to integrate all functions from design through to support of the delivered product. One critical functional area that must be integrated into design is that embracing the measurement, inspection and test activities necessary for validation of the delivered product. This area is being tackled by a collaborative project supported by the UK Government Department of Trade and Industry. The project is aimed at developing techniques for analysing validation needs and for planning validation methods. Within the project an experimental Computer Aided Validation Expert system (CAVE) is being constructed. This operates with a generalised model of the validation process and helps with all design stages: specification of product requirements; analysis of the assurance provided by a proposed design and method of manufacture; development of the inspection and test strategy; and analysis of feedback data. The kernel of the system is a knowledge base containing knowledge of the manufacturing process capabilities and of the available inspection and test facilities. The CAVE system is being integrated into a real life advanced computer integrated manufacturing facility for demonstration and evaluation.

  9. A COMPARISON OF TRANSIENT INFINITE ELEMENTS AND TRANSIENT KIRCHHOFF INTEGRAL METHODS FOR FAR FIELD ACOUSTIC ANALYSIS

    DOE PAGES

    WALSH, TIMOTHY F.; JONES, ANDREA; BHARDWAJ, MANOJ; ...

    2013-04-01

    Finite element analysis of transient acoustic phenomena on unbounded exterior domains is very common in engineering analysis. In these problems there is a common need to compute the acoustic pressure at points outside of the acoustic mesh, since meshing to points of interest is impractical in many scenarios. In aeroacoustic calculations, for example, the acoustic pressure may be required at tens or hundreds of meters from the structure. In these cases, a method is needed for post-processing the acoustic results to compute the response at far-field points. In this paper, we compare two methods for computing far-field acoustic pressures, one derived directly from the infinite element solution, and the other from the transient version of the Kirchhoff integral. Here, we show that the infinite element approach alleviates the large storage requirements that are typical of Kirchhoff integral and related procedures, and also does not suffer from loss of accuracy that is an inherent part of computing numerical derivatives in the Kirchhoff integral. In order to further speed up and streamline the process of computing the acoustic response at points outside of the mesh, we also address the nonlinear iterative procedure needed for locating parametric coordinates within the host infinite element of far-field points, the parallelization of the overall process, linear solver requirements, and system stability considerations.
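
    For reference, the transient Kirchhoff method evaluates the exterior pressure from retarded-time data on a closed surface S; a standard free-space form, stated here from the general literature rather than from the paper, and with signs tied to the orientation of the normal n, is

        % Transient Kirchhoff integral for the exterior pressure at x;
        % r = |x - y|, retarded time tau = t - r/c, n the surface normal
        % (sign convention depends on the normal's orientation).
        p(\mathbf{x},t) = \frac{1}{4\pi} \oint_S \left[
            \frac{p}{r^{2}} \frac{\partial r}{\partial n}
          + \frac{1}{c\,r} \frac{\partial r}{\partial n} \frac{\partial p}{\partial t}
          - \frac{1}{r} \frac{\partial p}{\partial n}
        \right]_{\tau} \mathrm{d}S , \qquad \tau = t - \frac{r}{c}

    The need to store p, its normal derivative, and its time derivative over the surface for a window of retarded times is the storage burden the abstract contrasts with the infinite-element approach.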

  10. Towards adaptive and integrated management paradigms to meet the challenges of water governance.

    PubMed

    Halbe, J; Pahl-Wostl, C; Sendzimir, J; Adamowski, J

    2013-01-01

    Integrated Water Resource Management (IWRM) aims at finding practical and sustainable solutions to water resource issues. Research and practice have shown that innovative methods and tools are not sufficient to implement IWRM - the concept needs to also be integrated in prevailing management paradigms and institutions. Water governance science addresses this human dimension by focusing on the analysis of regulatory processes that influence the behavior of actors in water management systems. This paper proposes a new methodology for the integrated analysis of water resources management and governance systems in order to elicit and analyze case-specific management paradigms. It builds on the Management and Transition Framework (MTF) that allows for the examination of structures and processes underlying water management and governance. The new methodology presented in this paper combines participatory modeling and analysis of the governance system by using the MTF to investigate case-specific management paradigms. The linking of participatory modeling and research on complex management and governance systems allows for the transfer of knowledge between scientific, policy, engineering and local communities. In this way, the proposed methodology facilitates assessment and implementation of transformation processes towards IWRM that require also the adoption of adaptive management principles. A case study on flood management in the Tisza River Basin in Hungary is provided to illustrate the application of the proposed methodology.

  11. Integrated Optical Design Analysis (IODA): New Test Data and Modeling Features

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; Patrick, Brian

    2003-01-01

    A general overview of the capabilities of the IODA ("Integrated Optical Design Analysis") software is presented. IODA promotes efficient exchange of data and modeling results between the thermal, structures, optical design, and testing engineering disciplines. This presentation focuses on new features added to the software that allow measured test data to be imported into the IODA environment for post-processing or comparisons with pretest model predictions.

  12. Obtaining mathematical models for assessing efficiency of dust collectors using integrated system of analysis and data management STATISTICA Design of Experiments

    NASA Astrophysics Data System (ADS)

    Azarov, A. V.; Zhukova, N. S.; Kozlovtseva, E. Yu; Dobrinsky, D. R.

    2018-05-01

    The article considers obtaining mathematical models to assess the efficiency of dust collectors using the integrated system for analysis and data management STATISTICA Design of Experiments. The procedure for obtaining mathematical models and processing the data is illustrated by laboratory studies on a mounted installation containing a dust collector in counter-swirling flows (CSF), using gypsum dust of various fractions. Experimental studies were planned in order to reduce the number of experiments and the cost of experimental research. A second-order non-composite design (Box-Behnken design) was used, which reduced the number of trials from 81 to 27. The procedure for statistical analysis of the Box-Behnken design data using the standard tools of the integrated system for analysis and data management STATISTICA Design of Experiments is considered. Results of statistical data processing, with estimation of the significance of coefficients and the adequacy of the mathematical models, are presented.
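
    The design-size saving is easy to reproduce in code; the sketch below assumes the third-party pyDOE2 package is available (its bbdesign function generates Box-Behnken designs) and assumes four three-level factors, so that the 3^4 = 81 full-factorial runs collapse to 27.

        import pyDOE2  # assumed third-party package offering classical designs

        # Four factors at three coded levels (-1, 0, +1): a full factorial
        # would need 3**4 = 81 runs; the Box-Behnken design needs only 27.
        design = pyDOE2.bbdesign(4, center=3)   # 24 edge-midpoint runs + 3 centers
        print(design.shape)                      # (27, 4)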

  13. Framework for Infectious Disease Analysis: A comprehensive and integrative multi-modeling approach to disease prediction and management.

    PubMed

    Erraguntla, Madhav; Zapletal, Josef; Lawley, Mark

    2017-12-01

    The impact of infectious disease on human populations is a function of many factors including environmental conditions, vector dynamics, transmission mechanics, social and cultural behaviors, and public policy. A comprehensive framework for disease management must fully connect the complete disease lifecycle, including emergence from reservoir populations, zoonotic vector transmission, and impact on human societies. The Framework for Infectious Disease Analysis is a software environment and conceptual architecture for data integration, situational awareness, visualization, prediction, and intervention assessment. Framework for Infectious Disease Analysis automatically collects biosurveillance data using natural language processing, integrates structured and unstructured data from multiple sources, applies advanced machine learning, and uses multi-modeling for analyzing disease dynamics and testing interventions in complex, heterogeneous populations. In the illustrative case studies, natural language processing from social media, news feeds, and websites was used for information extraction, biosurveillance, and situation awareness. Classification machine learning algorithms (support vector machines, random forests, and boosting) were used for disease predictions.
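
    The prediction step names three classifier families; a minimal scikit-learn comparison over synthetic stand-in features (all data, feature meanings, and parameters invented for illustration) looks like this.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        # Synthetic stand-in for engineered biosurveillance features
        # (climate, vector counts, case history); labels = outbreak yes/no.
        rng = np.random.default_rng(3)
        X = rng.standard_normal((300, 12))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.standard_normal(300) > 0).astype(int)

        for name, clf in [("SVM", SVC()),
                          ("random forest", RandomForestClassifier(random_state=0)),
                          ("boosting", GradientBoostingClassifier(random_state=0))]:
            score = cross_val_score(clf, X, y, cv=5).mean()
            print(f"{name:>13s}: {score:.3f}")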

  14. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruebel, Oliver

    2009-11-20

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the growing number of data dimensions and data objects presents tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data, and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework and the visualization have been integrated with MATLAB, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analyses with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges, this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams, enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.

  15. A Longitudinal Analysis of the Influence of a Peer Run Warm Line Phone Service on Psychiatric Recovery.

    PubMed

    Dalgin, Rebecca Spirito; Dalgin, M Halim; Metzger, Scott J

    2018-05-01

    This article focuses on the impact of a peer run warm line as part of the psychiatric recovery process. It utilized data including the Recovery Assessment Scale, community integration measures and crisis service usage. Longitudinal statistical analysis was completed on 48 sets of data from 2011, 2012, and 2013. Although no statistically significant differences were observed for the RAS score, community integration data showed increases in visits to primary care doctors, leisure/recreation activities and socialization with others. This study highlights the complexity of psychiatric recovery and that nonclinical peer services like peer run warm lines may be critical to the process.

  16. An Integrated Approach Linking Process to Structural Modeling With Microstructural Characterization for Injection-Molded Long-Fiber Thermoplastics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Bapanapalli, Satish K.; Smith, Mark T.

    2008-09-01

    The objective of our work is to enable the optimum design of lightweight automotive structural components using injection-molded long fiber thermoplastics (LFTs). To this end, an integrated approach that links process modeling to structural analysis with experimental microstructural characterization and validation is developed. First, process models for LFTs are developed and implemented into processing codes (e.g. ORIENT, Moldflow) to predict the microstructure of the as-formed composite (i.e. fiber length and orientation distributions). In parallel, characterization and testing methods are developed to obtain the microstructural data necessary to validate process modeling predictions. Second, the predicted LFT composite microstructure is imported into a structural finite element analysis by ABAQUS to determine the response of the as-formed composite to given boundary conditions. At this stage, constitutive models accounting for the composite microstructure are developed to predict various types of behaviors (i.e. thermoelastic, viscoelastic, elastic-plastic, damage, fatigue, and impact) of LFTs. Experimental methods are also developed to determine material parameters and to validate constitutive models. Such a process-linked structural modeling approach allows an LFT composite structure to be designed with confidence through numerical simulations. Some recent results of our collaborative research are illustrated to show the usefulness and applications of this integrated approach.

  17. Evaluation of rail test frequencies using risk analysis

    DOT National Transportation Integrated Search

    2009-03-03

    Several industries now use risk analysis to develop : inspection programs to ensure acceptable mechanical integrity : and reliability. These industries include nuclear and electric : power generation, oil refining, gas processing, onshore and : offsh...

  18. Work Flow Analysis Report Action Tracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PETERMANN, M.L.

    The Work Flow Analysis Report will be used to define the requirements for implementing the further deployment of the Action Tracking module of Passport. The report consists of workflow integration processes for Action Tracking.

  19. Multidisciplinary Graduate Curriculum on Integrative Biointerfacial Engineering

    ERIC Educational Resources Information Center

    Moghe, Prabhas V.; Roth, Charles M.

    2006-01-01

    A wide range of biotechnological and biomedical processes and products involves the design, synthesis, and analysis of biological interfaces. Such biointerfaces mediate interactions between living cells or intracellular species and designed materials or biologics. Incorporating the experiences of a NSF-­sponsored IGERT (Integrative Graduate…

  20. An integrated GIS application system for soil moisture data assimilation

    NASA Astrophysics Data System (ADS)

    Wang, Di; Shen, Runping; Huang, Xiaolong; Shi, Chunxiang

    2014-11-01

    The gaps in knowledge and the existing challenges in precisely describing the land surface process make it critical to represent massive soil moisture data visually and to mine the data for further research. This article introduces a comprehensive soil moisture assimilation data analysis system, built with C#, IDL, ArcSDE, Visual Studio 2008, and SQL Server 2005. The system provides integrated services for the management, efficient graphic visualization, and analysis of land surface data assimilation. The system not only improves the efficiency of data assimilation management, but also comprehensively integrates the data processing and analysis tools into a GIS development environment, so that analyzing the soil moisture assimilation data and performing GIS spatial analysis can be carried out in the same system. The system provides basic GIS map functions, massive data processing, and soil moisture product analysis. Besides, it takes full advantage of a spatial data engine, ArcSDE, to efficiently manage, retrieve, and store all kinds of data. In the system, characteristics of the temporal and spatial patterns of soil moisture are plotted. By analyzing the soil moisture impact factors, it is possible to acquire the correlation coefficients between the soil moisture value and each single impact factor. Daily and monthly comparative analyses of soil moisture products among observations, simulation results, and assimilations can be made in the system to display the different trends of these products. Furthermore, a soil moisture map production function is provided for business applications.
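    A minimal sketch of the correlation step mentioned above, computing the Pearson correlation coefficient between a soil moisture series and each candidate impact factor; the data and factor names are synthetic placeholders, not the system's actual inputs.

        import numpy as np

        rng = np.random.default_rng(1)
        soil_moisture = rng.random(365)  # one year of daily values
        factors = {
            "precipitation": soil_moisture + 0.1 * rng.random(365),
            "air_temperature": rng.random(365),
        }
        # Correlation of soil moisture with each single impact factor.
        for name, series in factors.items():
            r = np.corrcoef(soil_moisture, series)[0, 1]
            print(f"{name}: r = {r:.3f}")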

  1. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single spectra or a few at a time, and they are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software tool, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing large numbers of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform providing Java Runtime Environment version 1.6 or newer; currently, however, it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use and is provided with source code upon request. Copyright © 2011 Elsevier Inc. All rights reserved.
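    A hedged sketch of what batch integration over many spectra can look like, using pre-defined integration regions and writing the results to CSV for spreadsheet or Matlab follow-up; it does not reproduce ImatraNMR's actual file formats or interface.

        import csv
        import numpy as np

        rng = np.random.default_rng(2)
        ppm = np.linspace(10.0, 0.0, 2048)  # chemical shift axis (descending)
        spectra = {f"sample_{i}": rng.random(2048) for i in range(3)}
        regions = {"peak_A": (8.2, 7.8), "peak_B": (2.1, 1.9)}  # (high, low) ppm

        with open("integrals.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["spectrum"] + list(regions))
            for name, intensity in spectra.items():
                row = [name]
                for hi, lo in regions.values():
                    mask = (ppm <= hi) & (ppm >= lo)
                    # trapezoidal integration; abs() because the ppm axis descends
                    row.append(abs(np.trapz(intensity[mask], ppm[mask])))
                writer.writerow(row)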

  2. PLANiTS : structuring and supporting the intelligent transportation systems planning process

    DOT National Transportation Integrated Search

    1997-01-01

    PLANiTS (Planning and Analysis Integration for Intelligent Transportation Systems) is a process-based computer system that supports a series of mutually interdependent steps progressing toward developing and programming transportation improvement pro...

  3. Essential metrics for assessing sex & gender integration in health research proposals involving human participants.

    PubMed

    Day, Suzanne; Mason, Robin; Tannenbaum, Cara; Rochon, Paula A

    2017-01-01

    Integrating sex and gender in health research is essential to produce the best possible evidence to inform health care. Comprehensive integration of sex and gender requires considering these variables from the very beginning of the research process, starting at the proposal stage. To promote excellence in sex and gender integration, we have developed a set of metrics to assess the quality of sex and gender integration in research proposals. These metrics are designed to assist both researchers in developing proposals and reviewers in making funding decisions. We developed this tool through an iterative three-stage method involving 1) review of existing sex and gender integration resources and initial metrics design, 2) expert review and feedback via anonymous online survey (Likert scale and open-ended questions), and 3) analysis of feedback data and collective revision of the metrics. We received feedback on the initial metrics draft from 20 reviewers with expertise in conducting sex- and/or gender-based health research. The majority of reviewers responded positively to questions regarding the utility, clarity and completeness of the metrics, and all reviewers provided responses to open-ended questions about suggestions for improvements. Coding and analysis of responses identified three domains for improvement: clarifying terminology, refining content, and broadening applicability. Based on this analysis we revised the metrics into the Essential Metrics for Assessing Sex and Gender Integration in Health Research Proposals Involving Human Participants, which outlines criteria for excellence within each proposal component and provides illustrative examples to support implementation. By enhancing the quality of sex and gender integration in proposals, the metrics will help to foster comprehensive, meaningful integration of sex and gender throughout each stage of the research process, resulting in better quality evidence to inform health care for all.

  4. Essential metrics for assessing sex & gender integration in health research proposals involving human participants

    PubMed Central

    Mason, Robin; Tannenbaum, Cara; Rochon, Paula A.

    2017-01-01

    Integrating sex and gender in health research is essential to produce the best possible evidence to inform health care. Comprehensive integration of sex and gender requires considering these variables from the very beginning of the research process, starting at the proposal stage. To promote excellence in sex and gender integration, we have developed a set of metrics to assess the quality of sex and gender integration in research proposals. These metrics are designed to assist both researchers in developing proposals and reviewers in making funding decisions. We developed this tool through an iterative three-stage method involving 1) review of existing sex and gender integration resources and initial metrics design, 2) expert review and feedback via anonymous online survey (Likert scale and open-ended questions), and 3) analysis of feedback data and collective revision of the metrics. We received feedback on the initial metrics draft from 20 reviewers with expertise in conducting sex- and/or gender-based health research. The majority of reviewers responded positively to questions regarding the utility, clarity and completeness of the metrics, and all reviewers provided responses to open-ended questions about suggestions for improvements. Coding and analysis of responses identified three domains for improvement: clarifying terminology, refining content, and broadening applicability. Based on this analysis we revised the metrics into the Essential Metrics for Assessing Sex and Gender Integration in Health Research Proposals Involving Human Participants, which outlines criteria for excellence within each proposal component and provides illustrative examples to support implementation. By enhancing the quality of sex and gender integration in proposals, the metrics will help to foster comprehensive, meaningful integration of sex and gender throughout each stage of the research process, resulting in better quality evidence to inform health care for all. PMID:28854192

  5. The experience of well-being in professionals who support victims of political or family conflicts during their social integration process in Barranquilla, Colombia.

    PubMed

    Polo, Jean David; De Castro, Alberto; Amarís, María

    2015-01-01

    This article is the result of the integration of a theoretical analysis with a qualitative field study of the experience of well-being. It presents the results of an investigation whose principal purpose was to examine the state and experience of well-being in professionals who support victims of political or family conflicts during their social integration processes. For this purpose, the researchers examined lived discourses and analyzed the context of the participants in the investigation in order to clarify direct and subjective experiences.

  6. Integration of Value Stream Map and Healthcare Failure Mode and Effect Analysis into Six Sigma Methodology to Improve Process of Surgical Specimen Handling.

    PubMed

    Hung, Sheng-Hui; Wang, Pa-Chun; Lin, Hung-Chun; Chen, Hung-Ying; Su, Chao-Ton

    2015-01-01

    Specimen handling is a critical patient safety issue. Problematic handling processes, such as misidentification (of patients, surgical sites, or specimen counts), specimen loss, or improper specimen preparation, can lead to serious patient harm and lawsuits. The value stream map (VSM) is a tool used to find non-value-added work, enhance quality, and reduce the cost of the studied process. Healthcare failure mode and effect analysis (HFMEA), on the other hand, is now frequently employed to avoid possible medication errors in healthcare processes. Both have a goal similar to that of the Six Sigma methodology for process improvement. This study proposes a model that integrates VSM and HFMEA into the define, measure, analyze, improve, and control (DMAIC) framework of Six Sigma. A Six Sigma project for improving the process of surgical specimen handling in a hospital was conducted to demonstrate the effectiveness of the proposed model.
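    As a hedged illustration of the HFMEA scoring step, the sketch below ranks failure modes by a hazard score (severity times probability, as in the VA HFMEA worksheet, both on 1-4 scales); the failure modes and ratings are invented for illustration, not taken from the study.

        # Each failure mode gets hazard_score = severity * probability;
        # the highest scorers are prioritized in the Improve phase.
        failure_modes = [
            {"mode": "specimen mislabeled", "severity": 4, "probability": 2},
            {"mode": "specimen lost in transport", "severity": 4, "probability": 1},
            {"mode": "fixative not added", "severity": 3, "probability": 2},
        ]
        for fm in failure_modes:
            fm["hazard_score"] = fm["severity"] * fm["probability"]

        for fm in sorted(failure_modes, key=lambda f: f["hazard_score"], reverse=True):
            print(f'{fm["mode"]}: {fm["hazard_score"]}')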

  7. An Integrated Approach for Conducting a Behavioral Systems Analysis

    ERIC Educational Resources Information Center

    Diener, Lori H.; McGee, Heather M.; Miguel, Caio F.

    2009-01-01

    The aim of this paper is to illustrate how to conduct a Behavioral Systems Analysis (BSA) to aid in the design of targeted performance improvement interventions. BSA is a continuous process of analyzing the right variables to the right extent to aid in planning and managing performance at the organization, process, and job levels. BSA helps to…

  8. Biomedical data integration - capturing similarities while preserving disparities.

    PubMed

    Bianchi, Stefano; Burla, Anna; Conti, Costanza; Farkash, Ariel; Kent, Carmel; Maman, Yonatan; Shabo, Amnon

    2009-01-01

    One of the challenges of healthcare data processing, analysis, and warehousing is the integration of data gathered from disparate and diverse data sources. Promoting the adoption of worldwide-accepted information standards, along with common terminologies and the use of technologies derived from semantic web representation, is a suitable path to achieve that. To that end, the HL7 V3 Reference Information Model (RIM) [1] has been used as the underlying information model, coupled with the Web Ontology Language (OWL) [2] as the semantic data integration technology. In this paper we depict a biomedical data integration process and demonstrate how it was used for integrating various data sources, containing clinical, environmental and genomic data, within Hypergenes, a European Commission funded project exploring the Essential Hypertension [3] disease model.

  9. Developing an Integrated, Brief Biobehavioral HIV Prevention Intervention for High-Risk Drug Users in Treatment: The Process and Outcome of Formative Research

    PubMed Central

    Shrestha, Roman; Altice, Frederick; Karki, Pramila; Copenhaver, Michael

    2017-01-01

    To date, HIV prevention efforts have largely relied on singular strategies (e.g., behavioral or biomedical approaches alone) with modest HIV risk-reduction outcomes for people who use drugs (PWUD), many of whom experience a wide range of neurocognitive impairments (NCI). We report on the process and outcome of our formative research aimed at developing an integrated biobehavioral approach that incorporates innovative strategies to address the HIV prevention and cognitive needs of high-risk PWUD in drug treatment. Our formative work involved first adapting an evidence-based behavioral intervention—guided by the Assessment–Decision–Administration–Production–Topical experts–Integration–Training–Testing model—and then combining the behavioral intervention with an evidence-based biomedical intervention for implementation among the target population. This process involved eliciting data through structured focus groups (FGs) with key stakeholders—members of the target population (n = 20) and treatment providers (n = 10). Analysis of FG data followed a thematic analysis approach utilizing several qualitative data analysis techniques, including inductive analysis and cross-case analysis. Based on all information, we integrated the adapted community-friendly health recovery program—a brief evidence-based HIV prevention behavioral intervention—with the evidence-based biomedical component [i.e., preexposure prophylaxis (PrEP)], an approach that incorporates innovative strategies to accommodate individuals with NCI. This combination approach—now called the biobehavioral community-friendly health recovery program—is designed to address HIV-related risk behaviors and PrEP uptake and adherence as experienced by many PWUD in treatment. This study provides a complete example of the process of selecting, adapting, and integrating the evidence-based interventions—taking into account both empirical evidence and input from target population members and target organization stakeholders. The resultant brief evidence-based biobehavioral approach could significantly advance primary prevention science by cost-effectively optimizing PrEP adherence and HIV risk reduction within common drug treatment settings. PMID:28553295

  10. Integrating national community-based health worker programmes into health systems: a systematic review identifying lessons learned from low-and middle-income countries.

    PubMed

    Zulu, Joseph Mumba; Kinsman, John; Michelo, Charles; Hurtig, Anna-Karin

    2014-09-22

    Despite the development of national community-based health worker (CBHW) programmes in several low- and middle-income countries, their integration into health systems has not been optimal. Studies have been conducted to investigate the factors influencing the integration processes, but systematic reviews to provide a more comprehensive understanding are lacking. We conducted a systematic review of published research to understand factors that may influence the integration of national CBHW programmes into health systems in low- and middle-income countries. To be included in the study, CBHW programmes should have been developed by the government and have standardised training, supervision and incentive structures. A conceptual framework on the integration of health innovations into health systems guided the review. We identified 3410 records, of which 36 were finally selected, and on which an analysis was conducted concerning the themes and pathways associated with different factors that may influence the integration process. Four programmes from Brazil, Ethiopia, India and Pakistan met the inclusion criteria. Different aspects of each of these programmes were integrated in different ways into their respective health systems. Factors that facilitated the integration process included the magnitude of countries' human resources for health problems and the associated discourses about how to address these problems; the perceived relative advantage of national CBHWs with regard to delivering health services over training and retaining highly skilled health workers; and the participation of some politicians and community members in programme processes, with the result that they viewed the programmes as legitimate, credible and relevant. Finally, integration of programmes within the existing health systems enhanced programme compatibility with the health systems' governance, financing and training functions. Factors that inhibited the integration process included a rapid scale-up process; resistance from other health workers; discrimination of CBHWs based on social, gender and economic status; ineffective incentive structures; inadequate infrastructure and supplies; and hierarchical and parallel communication structures. CBHW programmes should design their scale-up strategy differently based on current contextual factors. Further, adoption of a stepwise approach to the scale-up and integration process may positively shape the integration process of CBHW programmes into health systems.

  11. Systems Integration Processes for NASA Ares I Crew Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Taylor, James L.; Reuter, James L.; Sexton, Jeffrey D.

    2006-01-01

    NASA's Exploration Initiative will require development of many new elements to constitute a robust system of systems. New launch vehicles are needed to place cargo and crew in stable Low Earth Orbit (LEO). This paper examines the systems integration processes NASA is utilizing to ensure integration and control of propulsion and nonpropulsion elements within NASA's Crew Launch Vehicle (CLV), now known as the Ares I. The objective of the Ares I is to provide the transportation capabilities to meet the Constellation Program requirements for delivering a Crew Exploration Vehicle (CEV) or other payload to LEO in support of the lunar and Mars missions. The Ares I must successfully provide this capability within cost and schedule, and with an acceptable risk approach. This paper will describe the systems engineering management processes that will be applied to assure Ares I Project success through complete and efficient technical integration. Discussion of technical review and management processes for requirements development and verification, integrated design and analysis, integrated simulation and testing, and the integration of reliability, maintainability and supportability (RMS) into the design will also be included. The Ares I Project is logically divided into elements by the major hardware groupings, and associated management, system engineering, and integration functions. The processes to be described herein are designed to integrate within these Ares I elements and among the other Constellation projects. Also discussed is launch vehicle stack integration (Ares I to CEV, and Ground and Flight Operations integration) throughout the life cycle, including integrated vehicle performance through orbital insertion, recovery of the first stage, and reentry of the upper stage. The processes for decomposing requirements to the elements and ensuring that requirements have been correctly validated, decomposed, and allocated, and that the verification requirements are properly defined to ensure that the system design meets requirements, will be discussed.

  12. The Joint Distribution Process Analysis Center (JDPAC): Background and Current Capability

    DTIC Science & Technology

    2007-06-12

    Briefing slide topics (recovered from the document outline): Systems Integration and Data Management; JDDE Analysis/Global Distribution Performance Assessment; Futures/Transformation Analysis; Balancing Operational Art ... Science; JDPAC "101"; USTRANSCOM Future Operations Center; SDDC-TEA; Army SES (Dual Hat); Transportability Engineering; Other Title 10.

  13. Possibilities of the Integration of the Method of the Ecologically Oriented Independent Scientific Research in the Study Process

    NASA Astrophysics Data System (ADS)

    Grizans, Jurijs; Vanags, Janis

    2010-01-01

    The aim of this paper is to analyse possibilities of the integration of the method of the ecologically oriented independent scientific research in the study process. In order to achieve the set aim, the following scientific research methods were used: analysis of the conceptual guidelines for the development of environmentally oriented entrepreneurship, interpretation of the experts' evaluation of the ecologically oriented management, analysis of the results of the students' ecologically oriented independent scientific research, as well as monographic and logically constructive methods. The results of the study give an opportunity to make conclusions and to develop conceptual recommendations on how to introduce future economics and business professionals with the theoretical and practical aspects of ecologically oriented management during the study process.

  14. A new energy integration method for the retrofit of industrial processes and the transformation of paper mills [Nouvelle méthode d'intégration énergétique pour la rétro-installation des procédés industriels et la transformation des usines papetières]

    NASA Astrophysics Data System (ADS)

    Bonhivers, Jean-Christophe

    The increase in the production of goods over the last decades has led to the need to improve the management of natural resources and the efficiency of processes. As a consequence, heat integration methods for industry have been developed. These have been successful for the design of new plants: the integration principles are widely employed, and energy intensity has dramatically decreased in many processes. Although progress has also been achieved in integration methods for retrofit, these methods still need further conceptual development. Furthermore, methodological difficulties increase when trying to retrofit heat exchange networks that are closely interrelated with water networks, as in the case of pulp and paper mills. The pulp and paper industry seeks to increase its profitability by reducing production costs and optimizing supply chains. Recent process developments in forestry biorefining give this industry the opportunity to diversify into bio-products, increasing potential profit margins while modernizing its energy systems. Identification of energy strategies for a mill in a changing environment, including the possibility of adding a biorefinery process on the industrial site, requires better integration methods for retrofit situations. The objective of this thesis is to develop an energy integration method for the retrofit of industrial systems and the transformation of pulp and paper mills, and to demonstrate the method in case studies. Energy is conserved and degraded in a process. Heat can be converted into electricity, stored as chemical energy, or rejected to the environment. A systematic analysis of the successive degradations of energy from the hot utilities to the environment, through process operations and existing heat exchangers, is essential in order to reduce heat consumption. In this thesis, the "Bridge Method" for energy integration by heat exchanger network retrofit has been developed. This method is the first to consider the analysis of these degradations. The fundamental mechanism for reducing heat consumption in an existing network has been made explicit; it is the basis of the developed method. The Bridge Method includes the definition of "a bridge", a set of modifications leading to heat reduction in a heat exchanger network. It is proven that, for a given set of streams, only bridges can lead to heat savings. The Bridge Method also includes (1) a global procedure for heat exchanger network retrofit, (2) a procedure to enumerate bridges systematically, (3) a "network table" to evaluate them easily, and (4) an "energy transfer diagram" showing the effects of the first two principles of thermodynamics, energy conservation and degradation, in industrial processes in order to identify energy-saving opportunities. The Bridge Method can be used for the analysis of networks involving several types of heat transfer, and for site-wide analysis. It has been applied in case studies to retrofit networks composed of indirect-contact heat exchangers, including the network of a kraft pulp mill, as well as networks of direct-contact heat exchangers, including the hot water production system of a pulp mill. The method has finally been applied to the evaluation of a biorefinery process, alone or hosted in a kraft pulp mill. Results show that the use of the method significantly reduces the search space and leads to the identification of the relevant solutions.
The necessity of a bridge to reduce the inputs and outputs of a process is a consequence of the first two principles of thermodynamics: energy conservation and the increase in entropy. The bridge concept alone can also be used as a tool for process analysis, and in numerical optimization-based approaches to energy integration.

  15. Beyond the limitations of best practices: how logic analysis helped reinterpret dual diagnosis guidelines.

    PubMed

    Brousselle, Astrid; Lamothe, Lise; Mercier, Céline; Perreault, Michel

    2007-02-01

    The co-occurrence of mental health and substance use disorders is becoming increasingly recognized as a single problem, and professionals recognize that both should be addressed at the same time. Medical best practices recommend integrated treatment. However, criticisms have arisen, particularly concerning the difficulty of implementing integrated teams in specific health-care contexts and the appropriateness of the proposed model for certain populations. Using logic analysis, we identify the key clinical and organizational factors that contribute to successful implementation. Building on both the professional and organizational literatures on integrated services, we propose a conceptual model that makes it possible to analyze integration processes and places integrated treatment within an interpretative framework. Using this model, it becomes possible to identify key factors necessary to support service integration, and suggest new models of practice adapted to particular contexts.

  16. Analysis of integrated healthcare networks' performance: a contingency-strategic management perspective.

    PubMed

    Lin, B Y; Wan, T T

    1999-12-01

    Few empirical analyses have been done in organizational research on integrated healthcare networks (IHNs), also known as integrated healthcare delivery systems. Using a contingency-derived context-process-performance model, this study attempts to explore the relationships among an IHN's strategic direction, structural design, and performance. A cross-sectional analysis of 100 IHNs suggests that certain contextual factors, such as market competition, network age, and tax status, have statistically significant effects on the implementation of an IHN's service differentiation strategy, which addresses coordination and control in the market. An IHN's service differentiation strategy is positively related to its integrated structural design, characterized as the integration of administration, patient care, and information systems across different settings. However, no evidence supports the notion that an integrated structural design benefits an IHN's performance in terms of clinical efficiency and financial viability.

  17. Application of process monitoring to anomaly detection in nuclear material processing systems via system-centric event interpretation of data from multiple sensors of varying reliability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Humberto E.; Simpson, Michael F.; Lin, Wen-Chiao

    In this paper, we apply an advanced safeguards approach and associated methods for process monitoring to a hypothetical nuclear material processing system. The assessment regarding the state of the processing facility is conducted at a system-centric level formulated in a hybrid framework. This utilizes an architecture for integrating both time- and event-driven data and analysis for decision making. While the time-driven layers of the proposed architecture encompass more traditional process monitoring methods based on time-series data and analysis, the event-driven layers encompass operation monitoring methods based on discrete-event data and analysis. By integrating process- and operation-related information and methodologies within a unified framework, the task of anomaly detection is greatly improved. This is because decision making can benefit not only from known time-series relationships among measured signals but also from known event-sequence relationships among generated events. This available knowledge at both the time-series and discrete-event layers can then be effectively used to synthesize observation solutions that optimally balance sensor and data processing requirements. The application of the proposed approach is then implemented on an illustrative monitored system based on pyroprocessing, and results are discussed.
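    A conceptual sketch of the hybrid time- and event-driven idea described above: a time-driven check flags signal excursions outside control limits, while an event-driven check flags events arriving out of the expected operating sequence. All names, limits, and sequences are hypothetical.

        EXPECTED_ORDER = ["load", "heat", "separate", "unload"]

        def time_driven_alarm(signal, low, high):
            # indices where the measured signal leaves its control limits
            return [i for i, v in enumerate(signal) if not (low <= v <= high)]

        def event_driven_alarm(events):
            # flag any event that arrives out of the expected sequence
            idx, bad = 0, []
            for e in events:
                if idx < len(EXPECTED_ORDER) and e == EXPECTED_ORDER[idx]:
                    idx += 1
                else:
                    bad.append(e)
            return bad

        temps = [301, 304, 390, 305]           # one excursion above the 350 limit
        events = ["load", "separate", "heat"]  # "separate" arrives too early
        print(time_driven_alarm(temps, 280, 350), event_driven_alarm(events))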

  18. Fully integrated wearable sensor arrays for multiplexed in situ perspiration analysis.

    PubMed

    Gao, Wei; Emaminejad, Sam; Nyein, Hnin Yin Yin; Challa, Samyuktha; Chen, Kevin; Peck, Austin; Fahad, Hossain M; Ota, Hiroki; Shiraki, Hiroshi; Kiriya, Daisuke; Lien, Der-Hsien; Brooks, George A; Davis, Ronald W; Javey, Ali

    2016-01-28

    Wearable sensor technologies are essential to the realization of personalized medicine through continuously monitoring an individual's state of health. Sampling human sweat, which is rich in physiological information, could enable non-invasive monitoring. Previously reported sweat-based and other non-invasive biosensors either can only monitor a single analyte at a time or lack on-site signal processing circuitry and sensor calibration mechanisms for accurate analysis of the physiological state. Given the complexity of sweat secretion, simultaneous and multiplexed screening of target biomarkers is critical and requires full system integration to ensure the accuracy of measurements. Here we present a mechanically flexible and fully integrated (that is, no external analysis is needed) sensor array for multiplexed in situ perspiration analysis, which simultaneously and selectively measures sweat metabolites (such as glucose and lactate) and electrolytes (such as sodium and potassium ions), as well as the skin temperature (to calibrate the response of the sensors). Our work bridges the technological gap between signal transduction, conditioning (amplification and filtering), processing and wireless transmission in wearable biosensors by merging plastic-based sensors that interface with the skin with silicon integrated circuits consolidated on a flexible circuit board for complex signal processing. This application could not have been realized using either of these technologies alone owing to their respective inherent limitations. The wearable system is used to measure the detailed sweat profile of human subjects engaged in prolonged indoor and outdoor physical activities, and to make a real-time assessment of the physiological state of the subjects. This platform enables a wide range of personalized diagnostic and physiological monitoring applications.
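    A hedged sketch of the calibration idea described above, using the measured skin temperature to correct a temperature-dependent sensor reading; the linear model, coefficient, and reference temperature are assumptions for illustration, not the paper's calibration scheme.

        T_REF = 33.0   # reference skin temperature (deg C), assumed
        ALPHA = 0.025  # fractional signal change per deg C, assumed

        def temperature_compensate(raw_signal, skin_temp_c):
            # remove the assumed linear temperature dependence of the sensor
            return raw_signal / (1.0 + ALPHA * (skin_temp_c - T_REF))

        print(temperature_compensate(120.0, 36.0))  # reading corrected back to T_REF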

  19. Fully integrated wearable sensor arrays for multiplexed in situ perspiration analysis

    NASA Astrophysics Data System (ADS)

    Gao, Wei; Emaminejad, Sam; Nyein, Hnin Yin Yin; Challa, Samyuktha; Chen, Kevin; Peck, Austin; Fahad, Hossain M.; Ota, Hiroki; Shiraki, Hiroshi; Kiriya, Daisuke; Lien, Der-Hsien; Brooks, George A.; Davis, Ronald W.; Javey, Ali

    2016-01-01

    Wearable sensor technologies are essential to the realization of personalized medicine through continuously monitoring an individual’s state of health. Sampling human sweat, which is rich in physiological information, could enable non-invasive monitoring. Previously reported sweat-based and other non-invasive biosensors either can only monitor a single analyte at a time or lack on-site signal processing circuitry and sensor calibration mechanisms for accurate analysis of the physiological state. Given the complexity of sweat secretion, simultaneous and multiplexed screening of target biomarkers is critical and requires full system integration to ensure the accuracy of measurements. Here we present a mechanically flexible and fully integrated (that is, no external analysis is needed) sensor array for multiplexed in situ perspiration analysis, which simultaneously and selectively measures sweat metabolites (such as glucose and lactate) and electrolytes (such as sodium and potassium ions), as well as the skin temperature (to calibrate the response of the sensors). Our work bridges the technological gap between signal transduction, conditioning (amplification and filtering), processing and wireless transmission in wearable biosensors by merging plastic-based sensors that interface with the skin with silicon integrated circuits consolidated on a flexible circuit board for complex signal processing. This application could not have been realized using either of these technologies alone owing to their respective inherent limitations. The wearable system is used to measure the detailed sweat profile of human subjects engaged in prolonged indoor and outdoor physical activities, and to make a real-time assessment of the physiological state of the subjects. This platform enables a wide range of personalized diagnostic and physiological monitoring applications.

  20. Future Directions in Vulnerability to Depression among Youth: Integrating Risk Factors and Processes across Multiple Levels of Analysis

    PubMed Central

    Hankin, Benjamin L.

    2014-01-01

    Depression is a developmental phenomenon. Considerable progress has been made in describing the syndrome, establishing its prevalence and features, providing clues as to its etiology, and developing evidence-based treatment and prevention options. Despite considerable headway in distinct lines of vulnerability research, there is an explanatory gap in the field's ability to more comprehensively explain and predict who is likely to become depressed, when, and why. Despite clear success in predicting moderate variance for future depression, especially with empirically rigorous methods and designs, the heterogeneous and multi-determined nature of depression suggests that additional etiologies need to be included to advance knowledge on developmental pathways to depression. This paper advocates a multiple-levels-of-analysis approach to investigating vulnerability to depression across the lifespan and to providing a more comprehensive understanding of its etiology. One example of a multiple-levels-of-analysis model of vulnerabilities to depression is provided, integrating the most accessible, observable factors (e.g., cognitive and temperament risks), intermediate processes and endophenotypes (e.g., information processing biases, biological stress physiology, and neural activation and connectivity), and genetic influences (e.g., candidate genes and epigenetics). Evidence for each of these factors as well as their cross-level integration is provided. Methodological and conceptual considerations important for conducting integrative, multiple-levels-of-analysis research on depression vulnerability are discussed. Finally, translational implications for how a multiple-levels-of-analysis perspective may confer additional leverage to reduce the global burden of depression and improve care are considered. PMID:22900513

  1. Integrating NASA's Land Analysis System (LAS) image processing software with an appropriate Geographic Information System (GIS): A review of candidates in the public domain

    NASA Technical Reports Server (NTRS)

    Rochon, Gilbert L.

    1989-01-01

    A user requirements analysis (URA) was undertaken to determine an appropriate public domain Geographic Information System (GIS) software package for potential integration with NASA's LAS (Land Analysis System) 5.0 image processing system. The necessity for a public domain system was underscored by the perceived need for source code access and flexibility in tailoring the GIS system to the needs of a heterogeneous group of end users, and by specific constraints imposed by LAS and its user interface, the Transportable Applications Executive (TAE). Subsequently, a review was conducted of a variety of public domain GIS candidates, including GRASS 3.0, MOSS, IEMIS, and two university-based packages, IDRISI and KBGIS. The review method was a modified version of the GIS evaluation process developed by the Federal Interagency Coordinating Committee on Digital Cartography. One IEMIS-derivative product, the ALBE (AirLand Battlefield Environment) GIS, emerged as the most promising candidate for integration with LAS. IEMIS (Integrated Emergency Management Information System) was developed by the Federal Emergency Management Agency (FEMA). ALBE GIS is currently under development at the Pacific Northwest Laboratory under contract with the U.S. Army Corps of Engineers' Engineering Topographic Laboratory (ETL). Accordingly, recommendations are offered with respect to a potential LAS/ALBE GIS linkage and to further system enhancements, including coordination with the development of the Spatial Analysis and Modeling System (SAMS) GIS and with IDM (Intelligent Data Management) developments in Goddard's National Space Science Data Center.

  2. Communication Network Integration and Group Uniformity in a Complex Organization.

    ERIC Educational Resources Information Center

    Danowski, James A.; Farace, Richard V.

    This paper contains a discussion of the limitations of research on group processes in complex organizations and the manner in which a procedure for network analysis in on-going systems can reduce problems. The research literature on group uniformity processes and on theoretical models of these processes from an information processing perspective…

  3. The MSFC Collaborative Engineering Process for Preliminary Design and Concept Definition Studies

    NASA Technical Reports Server (NTRS)

    Mulqueen, Jack; Jones, David; Hopkins, Randy

    2011-01-01

    This paper describes a collaborative engineering process developed by the Marshall Space Flight Center's Advanced Concepts Office for performing rapid preliminary design and mission concept definition studies for potential future NASA missions. The process has been developed and demonstrated for a broad range of mission studies, including human space exploration missions, space transportation system studies, and in-space science missions. The paper describes the design team structure and the specialized analytical tools that have been developed to enable a uniquely rapid design process. The collaborative engineering process consists of an integrated analysis approach for mission definition, vehicle definition, and system engineering. The relevance of the collaborative process elements to the standard NASA NPR 7120.1 system engineering process is demonstrated. The study definition process flow for each study discipline is outlined, beginning with the study planning process, followed by the definition of ground rules and assumptions, the definition of study trades, and the mission and subsystem analyses, leading to a standardized set of mission concept study products. The flexibility of the collaborative engineering design process to accommodate a wide range of study objectives, from technology definition and requirements definition to preliminary design studies, is addressed. The paper also describes the applicability of the collaborative engineering process to an integrated systems analysis approach for evaluating the functional requirements of evolving system technologies and capabilities needed to meet the needs of future NASA programs.

  4. Proceedings of the 13th Project integration meeting

    NASA Technical Reports Server (NTRS)

    Mcdonald, R. R.

    1979-01-01

    Progress made by the Low Cost Solar Array Project during the period April through August 1979 is presented. Reports are given on project analysis and integration; technology development in silicon material, large-area sheet silicon, and encapsulation; production process and equipment development; and engineering and operations, along with a discussion of the steps taken to integrate these efforts. A report on the Project Integration Meeting held August 22-23, 1979, is included, together with copies of the viewgraphs presented there.

  5. Scenario Analysis: An Integrative Study and Guide to Implementation in the United States Air Force

    DTIC Science & Technology

    1994-09-01

    Front-matter excerpt (table of contents and list of figures): Environmental Analysis; Classifications of Environments; Characteristics of Environments; Components of the Environmental Analysis Process; Forecasting; figures include the Environmental Analysis process, a Model of the Industry Environment, and a Model of the Macroenvironment.

  6. Detection of a novel, integrative aging process suggests complex physiological integration.

    PubMed

    Cohen, Alan A; Milot, Emmanuel; Li, Qing; Bergeron, Patrick; Poirier, Roxane; Dusseault-Bélanger, Francis; Fülöp, Tamàs; Leroux, Maxime; Legault, Véronique; Metter, E Jeffrey; Fried, Linda P; Ferrucci, Luigi

    2015-01-01

    Many studies of aging examine biomarkers one at a time, but complex systems theory and network theory suggest that interpretations of individual markers may be context-dependent. Here, we attempted to detect underlying processes governing the levels of many biomarkers simultaneously by applying principal components analysis to 43 common clinical biomarkers measured longitudinally in 3694 humans from three longitudinal cohort studies on two continents (Women's Health and Aging I & II, InCHIANTI, and the Baltimore Longitudinal Study on Aging). The first axis was associated with anemia, inflammation, and low levels of calcium and albumin. The axis structure was precisely reproduced in all three populations and in all demographic sub-populations (by sex, race, etc.); we call the process represented by the axis "integrated albunemia." Integrated albunemia increases and accelerates with age in all populations, and predicts mortality and frailty--but not chronic disease--even after controlling for age. This suggests a role in the aging process, though causality is not yet clear. Integrated albunemia behaves more stably across populations than its component biomarkers, and thus appears to represent a higher-order physiological process emerging from the structure of underlying regulatory networks. If this is correct, detection of this process has substantial implications for physiological organization more generally.
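    A minimal sketch of the core analysis, assuming a subjects-by-biomarkers matrix: z-score each biomarker, extract the first principal axis via SVD, and score each subject on that axis; the data here are synthetic stand-ins, not the cohort data.

        import numpy as np

        rng = np.random.default_rng(3)
        X = rng.normal(size=(500, 43))             # 500 subjects, 43 biomarkers
        Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each biomarker

        # SVD-based PCA: rows of Vt are the principal axes (loadings)
        U, S, Vt = np.linalg.svd(Z, full_matrices=False)
        first_axis_loadings = Vt[0]
        subject_scores = Z @ first_axis_loadings   # composite score per subject
        print(subject_scores[:5])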

  7. Adaptation of ICT Integration Approach Scale to Kosovo Culture: A Study of Validity and Reliability Analysis

    ERIC Educational Resources Information Center

    Kervan, Serdan; Tezci, Erdogan

    2018-01-01

    The aim of this study is to adapt ICT integration approach scale to Kosovo culture, which measures ICT integration approaches of university faculty to teaching and learning process. The scale developed in Turkish has been translated into Albanian to provide linguistic equivalence. The survey was given to a total of 303 instructors [161 (53.1%)…

  8. FMAP: Functional Mapping and Analysis Pipeline for metagenomics and metatranscriptomics studies.

    PubMed

    Kim, Jiwoong; Kim, Min Soo; Koh, Andrew Y; Xie, Yang; Zhan, Xiaowei

    2016-10-10

    Given the lack of a complete and comprehensive library of microbial reference genomes, determining the functional profile of diverse microbial communities is challenging. The available functional analysis pipelines lack several key features: (i) an integrated alignment tool, (ii) operon-level analysis, and (iii) the ability to process large datasets. Here we introduce our open-sourced, stand-alone functional analysis pipeline for analyzing whole metagenomic and metatranscriptomic sequencing data, FMAP (Functional Mapping and Analysis Pipeline). FMAP performs alignment, gene family abundance calculations, and statistical analysis (three levels of analyses are provided: differentially-abundant genes, operons and pathways). The resulting output can be easily visualized with heatmaps and functional pathway diagrams. FMAP functional predictions are consistent with currently available functional analysis pipelines. FMAP is a comprehensive tool for providing functional analysis of metagenomic/metatranscriptomic sequencing data. With the added features of integrated alignment, operon-level analysis, and the ability to process large datasets, FMAP will be a valuable addition to the currently available functional analysis toolbox. We believe that this software will be of great value to the wider biology and bioinformatics communities.

  9. The initial design of LAPAN's IR micro bolometer using mission analysis process

    NASA Astrophysics Data System (ADS)

    Bustanul, A.; Irwan, P.; M. T., Andi; Firman, B.

    2016-11-01

    As a new player in the infrared (IR) sector, an uncooled, small, and lightweight IR micro bolometer has been chosen as one of the payloads for LAPAN's next micro satellite project. Driven by the desire to create our own IR micro bolometer, a mission analysis design procedure has been applied. After tracing all possible missions, Planck's and Wien's laws for a black body, the temperature responsivity (TR), and the sub-pixel response were utilized in order to determine the appropriate spectral radiance. The 3.8-4 μm wavelengths are suitable for detecting wild fires (forest fires) and active volcanoes, two major problems faced by Indonesia. In order to strengthen and broaden the result, an iteration process was used throughout. The analysis then continued by calculating the ground pixel size, the pixel IFOV, the swath width, and the focal length; the resolution is at least 400 m. The further procedure covered the integrated optical design, wherein we combined optical design software (Zemax) with mechanical analysis software for structural and thermal analysis, such as Nastran and Thermal Desktop/Sinda Fluint. The integration process was intended to produce a high-performance optical system for our IR micro bolometer that can be used under extreme environments. The results of all these analyses, in both graphs and measurements, show that the initial design of LAPAN's IR micro bolometer meets the determined requirements, although it needs further evaluation (iteration). This paper describes the initial design of LAPAN's IR micro bolometer using a mission analysis process.
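    A small worked example of the radiometric step mentioned above: Planck spectral radiance of a black body evaluated across the 3.8-4.0 μm band. The constants are standard SI values; the 600 K source temperature is illustrative (a hot-fire-like value), not taken from the paper.

        import math

        h = 6.62607015e-34  # Planck constant (J s)
        c = 2.99792458e8    # speed of light (m/s)
        k = 1.380649e-23    # Boltzmann constant (J/K)

        def planck_radiance(wavelength_m, T):
            """Spectral radiance B(lambda, T) in W / (m^2 sr m)."""
            a = 2.0 * h * c**2 / wavelength_m**5
            b = math.exp(h * c / (wavelength_m * k * T)) - 1.0
            return a / b

        for lam_um in (3.8, 3.9, 4.0):
            B = planck_radiance(lam_um * 1e-6, 600.0)
            print(f"{lam_um} um: {B:.3e} W m^-2 sr^-1 m^-1")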

  10. Should different impact assessment instruments be integrated? Evidence from English spatial planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tajima, Ryo, E-mail: tajima.ryo@nies.go.jp; Fischer, Thomas B., E-mail: fischer@liverpool.ac.uk

    This paper aims at providing empirical evidence on the question of whether the integration of different instruments achieves its aim of supporting sustainable decision making, focusing on SEA-inclusive sustainability appraisal (SA) and other impact assessments (IAs) currently used in English spatial planning. The usage of IAs in addition to SA is established, and an analysis of the integration approach (in terms of process, output, and assessor) as well as its effectiveness is conducted. It is found that while integration enhances effectiveness to some extent, too much integration, especially in terms of the procedural element, appears to diminish the overall effectiveness of each IA in influencing decisions, as they become captured by the balancing function of SA. Highlights: ► The usage of different impact assessments in English spatial planning is clarified. ► The relationship between integration approach and effectiveness is analyzed. ► Results suggest that integration does not necessarily lead to more sustainable decisions. ► Careful consideration is recommended upon process integration.

  11. Integrated analysis of remote sensing products from basic geological surveys. [Brazil

    NASA Technical Reports Server (NTRS)

    Dasilvafagundesfilho, E. (Principal Investigator)

    1984-01-01

    Recent advances in remote sensing have led to the development of several techniques for obtaining image information. These techniques are analyzed as effective tools in geological mapping. A strategy for optimizing the use of images in basic geological surveying is presented. It embraces an integrated analysis of spatial, spectral, and temporal data through photoptic (color additive viewer) and computer processing at different scales, allowing large areas to be surveyed in a fast, precise, and low-cost manner.

  12. Shuttle payload interface verification equipment study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A preliminary design analysis is provided for stand-alone interface verification equipment (IVE) capable of verifying payload compatibility in form, fit, and function with the shuttle orbiter prior to on-line payload/orbiter operations. The IVE is a high-fidelity replica of the orbiter payload accommodations capable of supporting payload functional checkout and mission simulation. A top-level payload integration analysis developed detailed functional flow block diagrams of the payload integration process for the broad spectrum of payloads, and identified the degree of orbiter data required by the payload user as well as potential applications of the IVE.

  13. A constitutive material model for nonlinear finite element structural analysis using an iterative matrix approach

    NASA Technical Reports Server (NTRS)

    Koenig, Herbert A.; Chan, Kwai S.; Cassenti, Brice N.; Weber, Richard

    1988-01-01

    A unified numerical method for the integration of stiff, time-dependent constitutive equations is presented. The solution process is directly applied to a constitutive model proposed by Bodner. The theory addresses time-dependent inelastic behavior coupled with both isotropic hardening and directional hardening. Predicted stress-strain responses from this model are compared to experimental data from cyclic tests on uniaxial specimens. An algorithm is developed for the efficient integration of the Bodner flow equation, and a comparison is made with the Euler integration method. An analysis of computational time is presented for the three algorithms.
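    A worked illustration (not the paper's algorithm) of why stiffness drives the choice of integrator: on the stiff test equation y' = -50y with step h = 0.1, the explicit Euler update multiplies by 1 + hλ = -4 and diverges, while the implicit (backward) Euler update divides by 1 - hλ = 6 and stays stable.

        lam, h, steps = -50.0, 0.1, 5
        y_exp = y_imp = 1.0
        for _ in range(steps):
            y_exp = y_exp + h * lam * y_exp  # explicit Euler: factor (1 + h*lam)
            y_imp = y_imp / (1.0 - h * lam)  # implicit Euler: factor 1/(1 - h*lam)
        print(f"explicit: {y_exp:.3g}, implicit: {y_imp:.3g}")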

  14. A framework for interactive visual analysis of heterogeneous marine data in an integrated problem solving environment

    NASA Astrophysics Data System (ADS)

    Liu, Shuai; Chen, Ge; Yao, Shifeng; Tian, Fenglin; Liu, Wei

    2017-07-01

    This paper presents a novel integrated marine visualization framework which focuses on processing and analyzing multi-dimensional spatiotemporal marine data in one workflow. Effective marine data visualization is needed for extracting useful patterns, recognizing changes, and understanding physical processes in oceanographic research. However, the multi-source, multi-format, multi-dimensional characteristics of marine data pose a challenge for interactive and timely marine data analysis and visualization in one workflow. A global multi-resolution virtual terrain environment is also needed to give oceanographers and the public a real geographic background reference and to help them identify the geographical variation of ocean phenomena. This paper introduces a data integration and processing method to efficiently visualize and analyze heterogeneous marine data. Based on the processed data, several GPU-based visualization methods are explored to interactively demonstrate marine data. GPU-tessellated global terrain rendering using ETOPO1 data is realized, and video memory usage is controlled to ensure high efficiency. A modified ray-casting algorithm for the uneven multi-section Argo volume data is also presented, and the transfer function is designed to analyze the 3D structure of ocean phenomena. Based on the framework, an integrated visualization system is realized, and its effectiveness and efficiency are demonstrated. This system is expected to make a significant contribution to the demonstration and understanding of marine physical processes in a virtual global environment.
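    A minimal sketch of a 1D transfer function of the kind used in ray-casting volume rendering, mapping a normalized scalar (e.g., a salinity or temperature anomaly) to RGBA with opacity emphasizing one feature band; the mapping is illustrative, not the paper's design.

        import numpy as np

        def transfer_function(s):
            """Map scalar s in [0, 1] to (r, g, b, a)."""
            s = np.clip(s, 0.0, 1.0)
            r = s
            g = 1.0 - np.abs(s - 0.5) * 2.0
            b = 1.0 - s
            # near-opaque only inside the band of interest, faint elsewhere
            a = np.where((s > 0.6) & (s < 0.8), 0.9, 0.05)
            return np.stack([r, g, b, a], axis=-1)

        print(transfer_function(np.array([0.2, 0.7])))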

  15. Flexible distributed architecture for semiconductor process control and experimentation

    NASA Astrophysics Data System (ADS)

    Gower, Aaron E.; Boning, Duane S.; McIlrath, Michael B.

    1997-01-01

    Semiconductor fabrication requires an increasingly expensive and integrated set of tightly controlled processes, driving the need for a fabrication facility with fully computerized, networked processing equipment. We describe an integrated, open system architecture enabling distributed experimentation and process control for plasma etching. The system was developed at MIT's Microsystems Technology Laboratories and employs in-situ CCD interferometry-based analysis in the sensor-feedback control of an Applied Materials Precision 5000 Plasma Etcher (AME5000). Our system supports accelerated, advanced research involving feedback control algorithms, and includes a distributed interface that utilizes the internet to make these fabrication capabilities available to remote users. The system architecture is both distributed and modular: the specific implementation of any one task does not restrict the implementation of another. The low-level architectural components include a host controller that communicates with the AME5000 equipment via SECS-II, and a host controller for the acquisition and analysis of the CCD sensor images. A cell controller (CC) manages communications between these equipment and sensor controllers. The CC is also responsible for process control decisions; algorithmic controllers may be integrated locally or via remote communications. Finally, a system server accepts connections from internet/intranet (web-based) clients and uses a direct link with the CC to access the system. Each component communicates via a predefined set of TCP/IP socket-based messages. This flexible architecture makes integration easier and more robust, and enables separate software components to run on the same or different computers, independent of hardware or software platform.
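    A hedged sketch of the socket message pattern described above: a client sends a newline-delimited request over TCP to a cell-controller-style server and reads the acknowledgment. The message format and port are hypothetical, not the MIT system's actual protocol.

        import socket
        import threading
        import time

        def cell_controller(port=5050):
            # accept one connection, acknowledge the request, then shut down
            srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            srv.bind(("127.0.0.1", port))
            srv.listen(1)
            conn, _ = srv.accept()
            request = conn.recv(1024).decode().strip()
            conn.sendall(f"ACK {request}\n".encode())
            conn.close()
            srv.close()

        t = threading.Thread(target=cell_controller)
        t.start()
        time.sleep(0.2)  # give the server a moment to start listening
        cli = socket.create_connection(("127.0.0.1", 5050))
        cli.sendall(b"GET_ETCH_RATE chamber=A\n")
        print(cli.recv(1024).decode().strip())  # -> ACK GET_ETCH_RATE chamber=A
        cli.close()
        t.join()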

  16. Fostering development of nursing practices to support integrated care when implementing integrated care pathways: what levers to use?

    PubMed

    Longpré, Caroline; Dubois, Carl-Ardy

    2017-11-29

    Care integration has been the focus of recent health system reforms. Given their functions at all levels of the care continuum, nurses have a substantial and primordial role to play in such integration processes. The aim of this study was to identify levers and strategies that organizations can use to support the development of a nursing practice aligned with the requirements of care integration in a health and social services centre (HSSC) in Quebec. The research design was a cross-sectional descriptive qualitative study based on a single case study with nested levels of analysis. The case was a public, multi-disciplinary HSSC in a semi-urban region of Quebec. Semi-structured interviews with 37 persons (nurses, professionals, managers, administrators) allowed for data saturation and ensured theoretical representation by covering four care pathways constituting different care integration contexts. Analysis involved four steps: preparing a predetermined list of codes based on the reference framework developed by Minkman (2011); coding transcript content; developing general and summary matrices to group observations for each care pathway; and creating a general model showing the overall results for the four pathways. The organization's capacity for response with regard to developing an integrated system of services resulted in two types of complementary interventions. The first involved investing in key resources and renewing organizational structures; the second involved deploying a series of organizational and clinical-administrative processes. In resource terms, integration efforts resulted in setting up new strategic services, re-arranging physical infrastructures, and deploying new technological resources. Organizational and clinical-administrative processes to promote integration involved renewing governance, improving the flow of care pathways, fostering continuous quality improvement, developing new roles, promoting clinician collaboration, and strengthening care providers' capacities. However, progress in these areas was offset by persistent constraints. The results highlight key levers organizations can use to foster the implementation and institutionalization of integrative nursing practices. They show that progress in this area requires a combination of strategies using multiple complementary levers. They also suggest that such progress calls for rethinking not only the deployment of certain organizational resources and structures, but also a series of organizational and clinical processes.

  17. An integrated green process: Subcritical water, enzymatic hydrolysis, and fermentation, for biohydrogen production from coconut husk.

    PubMed

    Muharja, Maktum; Junianti, Fitri; Ranggina, Dian; Nurtono, Tantular; Widjaja, Arief

    2018-02-01

    The objective of this work is to develop an integrated green process of subcritical water (SCW) pretreatment, enzymatic hydrolysis and fermentation to convert coconut husk (CCH) to biohydrogen. The maximum sugar yield was obtained at a mild severity factor, as confirmed by the degradation of hemicellulose, cellulose and lignin. Sugar yield and pH showed opposite trends as the severity factor increased. It was found that CO2 as the pressurizing gas gave a different severity-factor trend than N2. SEM analysis confirmed the structural changes during SCW pretreatment. This study integrated three steps, all of which are green processes, ensuring an environmentally friendly route to clean biohydrogen. Copyright © 2017 Elsevier Ltd. All rights reserved.
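
    The severity factor referred to here is commonly the Overend-Chornet form, log R0 = log10[t * exp((T - 100)/14.75)], with t in minutes and T in degrees Celsius; a minimal sketch under that assumption:

        import math

        def severity_factor(t_min, temp_c):
            # log R0 = log10( t * exp((T - 100) / 14.75) )
            return math.log10(t_min * math.exp((temp_c - 100.0) / 14.75))

        print(severity_factor(30.0, 150.0))   # mild severity, about 2.9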

  18. Applications of integrated human error identification techniques on the chemical cylinder change task.

    PubMed

    Cheng, Ching-Min; Hwang, Sheue-Ling

    2015-03-01

    This paper outlines the human error identification (HEI) techniques that currently exist to assess latent human errors. Many formal error identification techniques have existed for years, but few have been validated to cover latent human error analysis in different domains. This study considers many possible error modes and influential factors, including external error modes, internal error modes, psychological error mechanisms, and performance shaping factors, and integrates several execution procedures and frameworks of HEI techniques. The case study in this research was the operational process of changing chemical cylinders in a factory. The integrated HEI method was used to assess the operational processes and the system's reliability. It was concluded that the integrated method is a valuable aid to developing safer operational processes and can be used to predict human error rates on critical tasks in the plant. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  19. The MIGenAS integrated bioinformatics toolkit for web-based sequence analysis

    PubMed Central

    Rampp, Markus; Soddemann, Thomas; Lederer, Hermann

    2006-01-01

    We describe a versatile and extensible integrated bioinformatics toolkit for the analysis of biological sequences over the Internet. The web portal offers convenient interactive access to a growing pool of chainable bioinformatics software tools and databases that are centrally installed and maintained by the RZG. Currently, supported tasks comprise sequence similarity searches in public or user-supplied databases, computation and validation of multiple sequence alignments, phylogenetic analysis and protein–structure prediction. Individual tools can be seamlessly chained into pipelines allowing the user to conveniently process complex workflows without the necessity to take care of any format conversions or tedious parsing of intermediate results. The toolkit is part of the Max-Planck Integrated Gene Analysis System (MIGenAS) of the Max Planck Society available at (click ‘Start Toolkit’). PMID:16844980

  20. Integrated Analysis Tools for Determination of Structural Integrity and Durability of High-Temperature Polymer Matrix Composites

    DTIC Science & Technology

    2008-08-18

    fidelity will be used to reduce the massive experimental testing and associated time required for qualification of new materials. Tools and...developing a model of the thermo-oxidative process for polymer systems that incorporates the effects of reaction rates, Fickian diffusion, time-varying...degradation processes.

  1. The Requirement for Acquisition and Logistics Integration: An Examination of Reliability Management Within the Marine Corps Acquisition Process

    DTIC Science & Technology

    2002-12-01

    HMMWV family of vehicles, LVS family of vehicles, and the M198 Howitzer). The analysis is limited to an assessment of reliability management issues...AND LOGISTICS INTEGRATION: AN EXAMINATION OF RELIABILITY MANAGEMENT WITHIN THE MARINE CORPS ACQUISITION PROCESS by Marvin L. Norcross, Jr...

  2. Visual and auditory synchronization deficits among dyslexic readers as compared to non-impaired readers: a cross-correlation algorithm analysis

    PubMed Central

    Sela, Itamar

    2014-01-01

    Visual and auditory temporal processing and crossmodal integration are crucial factors in the word decoding process. The speed of processing (SOP) gap (asynchrony) between these two modalities, which has been suggested as related to the dyslexia phenomenon, is the focus of the current study. Nineteen dyslexic and 17 non-impaired adult university readers were given stimuli in a reaction time (RT) procedure where participants were asked to identify whether the stimulus type was only visual, only auditory or crossmodally integrated. Accuracy, RT, and Event Related Potential (ERP) measures were obtained for each of the three conditions. An algorithm was developed to measure the contribution of the temporal SOP of each modality to the crossmodal integration in each group of participants. Results obtained using this model indicated that, in the crossmodal integration condition, the presence of the auditory modality in the pre-response time frame (between 170 and 240 ms after stimulus presentation) increased processing speed in the visual modality among the non-impaired readers, but not in the dyslexic group. The differences between the temporal SOP of the modalities among the dyslexic and the non-impaired readers give additional support to the theory that an asynchrony between the visual and auditory modalities is a cause of dyslexia. PMID:24959125
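
    A minimal sketch of the cross-correlation idea behind such an analysis: estimate the lag between two modality response time series from the peak of their cross-correlation. The signals below are synthetic stand-ins, not the study's data.

        import numpy as np

        fs = 1000                                      # 1 kHz sampling (1 ms bins)
        t = np.arange(0.0, 0.5, 1.0 / fs)
        visual = np.exp(-((t - 0.20) ** 2) / 0.001)    # response peaking at 200 ms
        auditory = np.exp(-((t - 0.17) ** 2) / 0.001)  # response peaking at 170 ms

        xcorr = np.correlate(visual - visual.mean(),
                             auditory - auditory.mean(), mode="full")
        lag = (np.argmax(xcorr) - (len(t) - 1)) / fs
        print("visual lags auditory by %.0f ms" % (lag * 1000))   # ~30 ms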

  3. Semantic integration of gene expression analysis tools and data sources using software connectors

    PubMed Central

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data. PMID:24341380

  4. Semantic integration of gene expression analysis tools and data sources using software connectors.

    PubMed

    Miyazaki, Flávia A; Guardia, Gabriela D A; Vêncio, Ricardo Z N; de Farias, Cléver R G

    2013-10-25

    The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources. The methodology can be used in the development of connectors supporting both simple and nontrivial processing requirements, thus assuring accurate data exchange and information interpretation from exchanged data.

  5. 1992 NASA Life Support Systems Analysis workshop

    NASA Technical Reports Server (NTRS)

    Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.

    1992-01-01

    The 1992 Life Support Systems Analysis Workshop was sponsored by NASA's Office of Aeronautics and Space Technology (OAST) to integrate the inputs from, disseminate information to, and foster communication among NASA, industry, and academic specialists. The workshop continued discussion and definition of key issues identified in the 1991 workshop, including: (1) modeling and experimental validation; (2) definition of systems analysis evaluation criteria; (3) integration of modeling at multiple levels; and (4) assessment of process control modeling approaches. Through both the 1991 and 1992 workshops, NASA has continued to seek input from industry and university chemical process modeling and analysis experts, and to introduce and apply new systems analysis approaches to life support systems. The workshop included technical presentations, discussions, and interactive planning, with sufficient time allocated for discussion of both technology status and technology development recommendations. Key personnel currently involved with life support technology developments from NASA, industry, and academia provided input to the status and priorities of current and future systems analysis methods and requirements.

  6. JIMM: the next step for mission-level models

    NASA Astrophysics Data System (ADS)

    Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.

    2001-09-01

    The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, a simulation tool was needed that could provide an environment acceptable for SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic representation of simulation entities, its data analysis capability, and its robust configuration management process, JIMM can support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM). A MLM is capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM merges the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only be a tool to support the SBA initiative, but could also provide the framework for the next generation of MLMs.

  7. dada - a web-based 2D detector analysis tool

    NASA Astrophysics Data System (ADS)

    Osterhoff, Markus

    2017-06-01

    The data daemon, dada, is a server backend for unified access to 2D pixel detector image data stored by different detectors, in different file formats, and with varying naming conventions and folder structures across instruments. Furthermore, dada implements basic pre-processing and analysis routines, from pixel binning through azimuthal integration to raster-scan processing. Users commonly interact with dada through a web frontend, but all parameters for an analysis are encoded into a Uniform Resource Identifier (URI), which can also be written by hand or by scripts for batch processing.
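
    As a rough illustration of encoding analysis parameters into a URI, the sketch below builds a query string with Python's standard library. The endpoint and parameter names are hypothetical, not dada's documented interface.

        from urllib.parse import urlencode

        params = {
            "instrument": "P10",              # which detector/instrument
            "scan": 42,                       # dataset identifier
            "task": "azimuthal_integration",  # analysis routine to run
            "binning": 4,                     # pixel binning factor
        }
        uri = "https://example.org/dada?" + urlencode(params)
        print(uri)   # the same URI works from a browser or a batch script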

  8. A computer-aided movement analysis system.

    PubMed

    Fioretti, S; Leo, T; Pisani, E; Corradini, M L

    1990-08-01

    Interaction with biomechanical data concerning human movement analysis implies the adoption of various experimental equipment and the choice of suitable models, data processing, and graphical data restitution techniques. The integration of measurement setups with the associated experimental protocols and the related software procedures constitutes a computer-aided movement analysis (CAMA) system. In the present paper such integration is mapped onto the causes that limit the clinical acceptance of movement analysis methods. The structure of the system is presented. A specific CAMA system devoted to posture analysis is described in order to show the attainable features. Scientific results obtained with the support of the described system are also reported.

  9. A Structured Decision Approach for Integrating and Analyzing Community Perspectives in Re-Use Planning of Vacant Properties in Cleveland, Ohio

    EPA Science Inventory

    An integrated GIS-based, multi-attribute decision model deployed in a web-based platform is presented enabling an iterative, spatially explicit and collaborative analysis of relevant and available information for repurposing vacant land. The process incorporated traditional and ...

  10. An Integrated Library System from Existing Microcomputer Programs.

    ERIC Educational Resources Information Center

    Kuntz, Lynda S.

    1988-01-01

    Demonstrates how three commercial microcomputer software packages--PC-Talk III, Wordstar, and dBase III--were combined to produce an integrated library system at the U.S. Army Concepts Analysis Agency library. The retrospective conversion process is discussed, and the four modules of the system are described: acquisitions/cataloging; online…

  11. Testing an Integrated Model of Advice Giving in Supportive Interactions

    ERIC Educational Resources Information Center

    Feng, Bo

    2009-01-01

    Viewing supportive communication as a multistage process, the present study proposed and tested an integrated model of advice giving, which specifies three sequential moves in supportive interactions involving advice: emotional support, problem inquiry and analysis, and advice. Seven hundred and fifty-two participants read and responded to a…

  12. Low-cost solar array project and Proceedings of the 15th Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Progress made by the Low-Cost Solar Array Project during the period December 1979 to April 1980 is described. Project analysis and integration, technology development in silicon material, large area silicon sheet and encapsulation, production process and equipment development, engineering, and operation are included.

  13. An Example for Integrated Gas Turbine Engine Testing and Analysis Using Modeling and Simulation

    DTIC Science & Technology

    2006-12-01

    USAF Academy in a joint test and analysis effort of the F109 turbofan engine. This process uses a swirl investigation as a vehicle to exercise and demonstrate the approach...

  14. System-Level and Granger Network Analysis of Integrated Proteomic and Metabolomic Dynamics Identifies Key Points of Grape Berry Development at the Interface of Primary and Secondary Metabolism.

    PubMed

    Wang, Lei; Sun, Xiaoliang; Weiszmann, Jakob; Weckwerth, Wolfram

    2017-01-01

    Grapevine is a fruit crop with worldwide economic importance. The grape berry undergoes complex biochemical changes from fruit set until ripening. This ripening process and production processes define the wine quality. Thus, a thorough understanding of berry ripening is crucial for the prediction of wine quality. For a systemic analysis of grape berry development we applied mass spectrometry-based platforms to analyse the metabolome and proteome of Early Campbell at 12 stages covering major developmental phases. Primary metabolites involved in central carbon metabolism, such as sugars, organic acids and amino acids, together with various bioactive secondary metabolites like flavonols, flavan-3-ols and anthocyanins were annotated and quantified. At the same time, the proteomic analysis revealed the protein dynamics of the developing grape berries. Multivariate statistical analysis of the integrated metabolomic and proteomic dataset revealed the growth trajectory and the corresponding metabolites and proteins contributing most to the specific developmental process. K-means clustering analysis revealed 12 highly specific clusters of co-regulated metabolites and proteins. Granger causality network analysis allowed for the identification of time-shift correlations between metabolite-metabolite, protein-protein and protein-metabolite pairs, which is especially interesting for the understanding of developmental processes. The integration of metabolite and protein dynamics with their corresponding biochemical pathways revealed an energy-linked metabolism before veraison with high abundances of amino acids and accumulation of organic acids, followed by protein and secondary metabolite synthesis. Anthocyanins were strongly accumulated after veraison whereas other flavonoids were in higher abundance at early developmental stages and decreased during the grape berry developmental processes. A comparison of the anthocyanin profile of Early Campbell to other cultivars revealed similarities to Concord grape and indicates the strong effect of genetic background on metabolic partitioning in primary and secondary metabolism.
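
    A minimal sketch of a pairwise Granger test of the kind used for the time-shift analysis, using statsmodels on synthetic series standing in for a protein-metabolite pair (real data would be the quantified developmental profiles):

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(0)
        x = rng.normal(size=100)                        # e.g., protein abundance
        y = np.roll(x, 2) + 0.1 * rng.normal(size=100)  # metabolite lagging x by 2

        # Tests whether the second column Granger-causes the first;
        # prints F-test p-values for each lag up to maxlag.
        res = grangercausalitytests(np.column_stack([y, x]), maxlag=3)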

  15. System-Level and Granger Network Analysis of Integrated Proteomic and Metabolomic Dynamics Identifies Key Points of Grape Berry Development at the Interface of Primary and Secondary Metabolism

    PubMed Central

    Wang, Lei; Sun, Xiaoliang; Weiszmann, Jakob; Weckwerth, Wolfram

    2017-01-01

    Grapevine is a fruit crop with worldwide economic importance. The grape berry undergoes complex biochemical changes from fruit set until ripening. This ripening process and production processes define the wine quality. Thus, a thorough understanding of berry ripening is crucial for the prediction of wine quality. For a systemic analysis of grape berry development we applied mass spectrometry-based platforms to analyse the metabolome and proteome of Early Campbell at 12 stages covering major developmental phases. Primary metabolites involved in central carbon metabolism, such as sugars, organic acids and amino acids, together with various bioactive secondary metabolites like flavonols, flavan-3-ols and anthocyanins were annotated and quantified. At the same time, the proteomic analysis revealed the protein dynamics of the developing grape berries. Multivariate statistical analysis of the integrated metabolomic and proteomic dataset revealed the growth trajectory and the corresponding metabolites and proteins contributing most to the specific developmental process. K-means clustering analysis revealed 12 highly specific clusters of co-regulated metabolites and proteins. Granger causality network analysis allowed for the identification of time-shift correlations between metabolite-metabolite, protein-protein and protein-metabolite pairs, which is especially interesting for the understanding of developmental processes. The integration of metabolite and protein dynamics with their corresponding biochemical pathways revealed an energy-linked metabolism before veraison with high abundances of amino acids and accumulation of organic acids, followed by protein and secondary metabolite synthesis. Anthocyanins were strongly accumulated after veraison whereas other flavonoids were in higher abundance at early developmental stages and decreased during the grape berry developmental processes. A comparison of the anthocyanin profile of Early Campbell to other cultivars revealed similarities to Concord grape and indicates the strong effect of genetic background on metabolic partitioning in primary and secondary metabolism. PMID:28713396

  16. Intelligent Performance Analysis with a Natural Language Interface

    NASA Astrophysics Data System (ADS)

    Juuso, Esko K.

    2017-09-01

    Performance improvement is taken as the primary goal in asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management, since management-oriented indicators can be presented on the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are used directly in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into variable-specific meanings and directions of interactions provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitates the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.
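
    A minimal sketch of the generalized-norm-plus-nonlinear-scaling idea, assuming the Hölder (power) mean as the generalized norm and illustrative support points for a monotone scaling to the range [-2, 2] (in practice the support points are derived from data-based feasible ranges):

        import numpy as np

        def generalized_norm(x, p):
            # Hoelder (power) mean: ((1/N) * sum(x_i ** p)) ** (1/p).
            x = np.asarray(x, dtype=float)
            return np.mean(x ** p) ** (1.0 / p)

        # Monotone scaling through five support points:
        # (min, low, center, high, max) -> (-2, -1, 0, 1, 2).
        support = [0.2, 0.8, 1.5, 2.6, 4.0]

        def scale(value):
            return np.interp(value, support, [-2, -1, 0, 1, 2])

        vibration = np.abs(np.random.default_rng(1).normal(1.5, 0.6, 200))
        print(scale(generalized_norm(vibration, p=4)))   # a scaled stress index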

  17. A Control System and Streaming DAQ Platform with Image-Based Trigger for X-ray Imaging

    NASA Astrophysics Data System (ADS)

    Stevanovic, Uros; Caselle, Michele; Cecilia, Angelica; Chilingaryan, Suren; Farago, Tomas; Gasilov, Sergey; Herth, Armin; Kopmann, Andreas; Vogelgesang, Matthias; Balzer, Matthias; Baumbach, Tilo; Weber, Marc

    2015-06-01

    High-speed X-ray imaging applications play a crucial role in non-destructive investigations of dynamics in material science and biology. On-line data analysis is necessary for quality assurance and data-driven feedback, leading to more efficient use of beam time and increased data quality. In this article we present a smart camera platform with embedded Field Programmable Gate Array (FPGA) processing that is able to stream and process data continuously in real time. The setup consists of a Complementary Metal-Oxide-Semiconductor (CMOS) sensor, an FPGA readout card, and a readout computer. It is seamlessly integrated into a new custom experiment control system called Concert that provides a more efficient way of operating a beamline by integrating device control, experiment process control, and data analysis. The potential of the embedded processing is demonstrated by implementing an image-based trigger. It records the temporal evolution of physical events with increased speed while maintaining the full field of view. The complete data acquisition system, with Concert and the smart camera platform, was successfully integrated and used for fast X-ray imaging experiments at KIT's synchrotron radiation facility ANKA.
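
    A minimal sketch of an image-based trigger, assuming a simple mean-absolute-difference criterion between consecutive frames (the real system implements its trigger in FPGA logic; the threshold and data here are illustrative):

        import numpy as np

        def triggered(prev, curr, thresh=12.0):
            # Mean absolute pixel difference as a cheap change detector.
            return np.mean(np.abs(curr.astype(float) - prev.astype(float))) > thresh

        rng = np.random.default_rng(0)
        quiet = rng.integers(0, 8, (512, 512))               # static scene + noise
        noisy = quiet + rng.integers(0, 4, (512, 512))       # still quiet
        event = quiet + 60 * (rng.random((512, 512)) < 0.3)  # sudden change
        print(triggered(quiet, noisy))   # False -> keep streaming
        print(triggered(quiet, event))   # True  -> record this event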

  18. The effectiveness of science domain-based science learning integrated with local potency

    NASA Astrophysics Data System (ADS)

    Kurniawati, Arifah Putri; Prasetyo, Zuhdan Kun; Wilujeng, Insih; Suryadarma, I. Gusti Putu

    2017-08-01

    This research aimed to determine the effect of science domain-based science learning integrated with local potency on science process skills. The research method was a quasi-experimental design with a nonequivalent control group. The population of this research was all students of class VII SMP Negeri 1 Muntilan. The sample was selected through cluster random sampling, namely class VII B as the experiment class (24 students) and class VII C as the control class (24 students). This research used a test instrument adapted from Agus Dwianto's research. The aspects of science process skills covered were observation, classification, interpretation and communication. The data were analyzed with a one-factor ANOVA at the 0.05 significance level and with normalized gain scores. The one-factor ANOVA yielded a significance level of 0.000 for science process skills, below alpha (0.05), meaning that science domain-based science learning integrated with local potency had a significant effect on science process skills. The normalized gain scores were 0.29 (low category) in the control class and 0.67 (medium category) in the experiment class.
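
    The normalized gain reported here is presumably Hake's g = (post - pre) / (max - pre); a one-line sketch under that assumption:

        def normalized_gain(pre, post, max_score=100.0):
            # Hake's g: fraction of the possible improvement actually achieved.
            return (post - pre) / (max_score - pre)

        print(normalized_gain(40.0, 60.0))   # 0.33 -> medium (0.3 <= g < 0.7)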

  19. Integration of sustainability into process simulation of a dairy process

    USDA-ARS?s Scientific Manuscript database

    Life cycle analysis, a method used to quantify the energy and environmental flows of a process or product on the environment, is increasingly utilized by food processors to develop strategies to lessen the carbon footprint of their operations. In the case of the milk supply chain, the method requir...

  20. Central Processing Dysfunctions in Children: A Review of Research.

    ERIC Educational Resources Information Center

    Chalfant, James C.; Scheffelin, Margaret A.

    Research on central processing dysfunctions in children is reviewed in three major areas. The first, dysfunctions in the analysis of sensory information, includes auditory, visual, and haptic processing. The second, dysfunction in the synthesis of sensory information, covers multiple stimulus integration and short-term memory. The third area of…

  1. Designing Image Analysis Pipelines in Light Microscopy: A Rational Approach.

    PubMed

    Arganda-Carreras, Ignacio; Andrey, Philippe

    2017-01-01

    With the progress of microscopy techniques and the rapidly growing amounts of acquired imaging data, there is an increased need for automated image processing and analysis solutions in biological studies. Each new application requires the design of a specific image analysis pipeline, by assembling a series of image processing operations. Many commercial or free bioimage analysis software packages are now available, and several textbooks and reviews have presented the mathematical and computational fundamentals of image processing and analysis. Tens, if not hundreds, of algorithms and methods have been developed and integrated into image analysis software, resulting in a combinatorial explosion of possible image processing sequences. This paper presents a general guideline methodology to rationally address the design of image processing and analysis pipelines. The originality of the proposed approach is to follow an iterative, backwards procedure from the target objectives of analysis. The proposed goal-oriented strategy should help biologists to better apprehend image analysis in the context of their research and should allow them to efficiently interact with image processing specialists.
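
    As an illustration of assembling such a pipeline, the sketch below chains denoising, thresholding, labeling, and measurement with scikit-image; the operations and parameters are generic examples, not the paper's prescribed sequence.

        import numpy as np
        from skimage import data, filters, measure

        img = data.coins()                                    # sample grayscale image
        smoothed = filters.gaussian(img, sigma=2)             # denoise
        binary = smoothed > filters.threshold_otsu(smoothed)  # segment
        labels = measure.label(binary)                        # connected components
        props = measure.regionprops(labels)                   # quantify objects
        print(len(props), "objects; first area:", props[0].area)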

  2. An Integrated Strategy to Qualitatively Differentiate Components of Raw and Processed Viticis Fructus Based on NIR, HPLC and UPLC-MS Analysis.

    PubMed

    Diao, Jiayin; Xu, Can; Zheng, Huiting; He, Siyi; Wang, Shumei

    2018-06-21

    Viticis Fructus is a traditional Chinese herbal drug processed by various methods to achieve different clinical purposes. Thermal treatment potentially alters chemical composition, which may impact effectiveness and toxicity. In order to interpret the constituent discrepancies of raw versus processed (stir-fried) Viticis Fructus, a multivariate detection method (NIR, HPLC, and UPLC-MS) based on metabonomics and chemometrics was developed. Firstly, synergy interval partial least squares and partial least squares-discriminant analysis were employed to screen the distinctive wavebands (4319-5459 cm-1) based on preprocessed near-infrared spectra. Then, HPLC with principal component analysis was performed to characterize the distinction. Subsequently, a total of 49 compounds were identified by UPLC-MS, among which 42 compounds were eventually characterized as having a significant change during processing via semiquantitative volcano plot analysis. Moreover, based on the partial least squares-discriminant analysis, 16 compounds were chosen as characteristic markers in close correlation with the discriminatory near-infrared wavebands. Together, all of these characterization techniques effectively discriminated raw and processed products of Viticis Fructus. In general, our work provides an integrated way of classifying Viticis Fructus, and a strategy to explore discriminatory chemical markers for other traditional Chinese herbs, thus ensuring safety and efficacy for consumers. Georg Thieme Verlag KG Stuttgart · New York.
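
    A minimal sketch of the PLS-DA step, using scikit-learn's PLSRegression against a 0/1 class label, which is a common way to implement PLS-DA; the data shapes are illustrative, not the study's spectra:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X = rng.normal(size=(20, 300))   # 20 samples x 300 spectral variables
        y = np.repeat([0, 1], 10)        # 0 = raw, 1 = stir-fried
        X[y == 1, :50] += 1.0            # class difference in some wavebands

        pls = PLSRegression(n_components=2).fit(X, y)
        pred = (pls.predict(X).ravel() > 0.5).astype(int)
        print("training accuracy:", (pred == y).mean())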

  3. An integratable microfluidic cartridge for forensic swab samples lysis.

    PubMed

    Yang, Jianing; Brooks, Carla; Estes, Matthew D; Hurth, Cedric M; Zenhausern, Frederic

    2014-01-01

    Fully automated rapid forensic DNA analysis requires integrating several multistep processes onto a single microfluidic platform, including substrate lysis, extraction of DNA from the released lysate solution, multiplexed PCR amplification of STR loci, separation of PCR products by capillary electrophoresis, and analysis for allelic peak calling. Over the past several years, most of the rapid DNA analysis systems developed started with the reference swab sample lysate and involved an off-chip lysis of collected substrates. As a result of advancement in technology and chemistry, addition of a microfluidic module for swab sample lysis has been achieved in a few of the rapid DNA analysis systems. However, recent reports on integrated rapid DNA analysis systems with swab-in and answer-out capability lack any quantitative and qualitative characterization of the swab-in sample lysis module, which is important for downstream forensic sample processing. Maximal collection and subsequent recovery of the biological material from the crime scene is one of the first and critical steps in forensic DNA technology. Herein we present the design, fabrication and characterization of an integratable swab lysis cartridge module and the test results obtained from different types of commonly used forensic swab samples, including buccal, saliva, and blood swab samples, demonstrating the compatibility with different downstream DNA extraction chemistries. This swab lysis cartridge module is easy to operate, compatible with both forensic and microfluidic requirements, and ready to be integrated with our existing automated rapid forensic DNA analysis system. Following the characterization of the swab lysis module, an integrated run from buccal swab sample-in to the microchip CE electropherogram-out was demonstrated on the integrated prototype instrument. Therefore, in this study, we demonstrate that this swab lysis cartridge module is: (1) functionally, comparable with routine benchtop lysis, (2) compatible with various types of swab samples and chemistries, and (3) integratable to achieve a micro total analysis system (μTAS) for rapid DNA analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  4. Industry structures in private dental markets in Finland.

    PubMed

    Widström, E; Mikkola, H

    2012-12-01

    To use industrial organisation and organisational ecology research methods to survey industry structures and performance in the markets for private dental services and the effect of competition. Data on practice characteristics, performance, and perceived competition were collected from full-time private dentists (n = 1,121) using a questionnaire. The response rate was 59.6%. Cluster analysis was used to identify practice types based on service differentiation and process integration variables formulated from the questionnaire. Four strategic groups were identified in the Finnish markets: solo practices formed one distinct group, and group practices were classified into three clusters: Integrated practices, Small practices, and Loosely integrated practices. Statistically significant differences were found in performance and perceived competitiveness between the groups. Integrated practices, with the highest level of process integration and service differentiation, performed better than solo and small practices. Moreover, loosely integrated and small practices outperformed solo practices. Competitive intensity was highest among small practices, which had a low level of service differentiation, and was above average among solo practices.
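
    A minimal sketch of the clustering step, assuming k-means on two practice-level scores (service differentiation and process integration); the study's actual variables, algorithm settings and cluster count may differ:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        # Synthetic (differentiation, integration) scores for 120 practices.
        scores = np.vstack([rng.normal(m, 0.3, (30, 2))
                            for m in ([1, 1], [3, 1], [1, 3], [3, 3])])
        groups = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
        print(np.bincount(groups))       # practices per strategic group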

  5. INTEGRATION OF FACILITY MODELING CAPABILITIES FOR NUCLEAR NONPROLIFERATION ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorensek, M.; Hamm, L.; Garcia, H.

    2011-07-18

    Developing automated methods for data collection and analysis that can facilitate nuclear nonproliferation assessment is an important research area with significant consequences for the effective global deployment of nuclear energy. Facility modeling that can integrate and interpret observations collected from monitored facilities in order to ascertain their functional details will be a critical element of these methods. Although improvements are continually sought, existing facility modeling tools can characterize all aspects of reactor operations and the majority of nuclear fuel cycle processing steps, and include algorithms for data processing and interpretation. Assessing nonproliferation status is challenging because observations can come from many sources, including local and remote sensors that monitor facility operations, as well as open sources that provide specific business information about the monitored facilities, and can be of many different types. Although many current facility models are capable of analyzing large amounts of information, they have not been integrated in an analyst-friendly manner. This paper addresses some of these facility modeling capabilities and illustrates how they could be integrated and utilized for nonproliferation analysis. The inverse problem of inferring facility conditions based on collected observations is described, along with a proposed architecture and computer framework for utilizing facility modeling tools. After considering a representative sampling of key facility modeling capabilities, the proposed integration framework is illustrated with several examples.

  6. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

    Multi-sensor machining status monitoring acquires and analyzes machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal ones through statistical methods. This paper compares the advantages and disadvantages of the two methods and establishes the necessity and feasibility of their integration and fusion. An approach is then proposed that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, internet, and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and acoustic emission (AE) signals from the wheel-dressing process, the cause of machining quality fluctuation was identified. The experimental results indicate that the approach is suitable for status monitoring and analysis of machining processes.
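
    A minimal sketch of the statistical-process-control side of such an integration: a Shewhart individuals chart that flags quality measurements outside plus or minus three sigma of an in-control baseline (synthetic data, illustrative limits):

        import numpy as np

        baseline = np.random.default_rng(2).normal(10.0, 0.05, 50)  # in control
        center, sigma = baseline.mean(), baseline.std(ddof=1)
        ucl, lcl = center + 3 * sigma, center - 3 * sigma

        for x in [10.02, 9.98, 10.31]:   # last value drifts out of control
            print(x, "ok" if lcl <= x <= ucl else "ALARM")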

  7. An integrated exhaust gas analysis system with self-contained data processing and automatic calibration

    NASA Technical Reports Server (NTRS)

    Anderson, R. C.; Summers, R. L.

    1981-01-01

    An integrated gas analysis system designed to operate in automatic, semiautomatic, and manual modes from a remote control panel is described. The system measures carbon monoxide, oxygen, water vapor, total hydrocarbons, carbon dioxide, and oxides of nitrogen. A pull-through design provides increased reliability and eliminates the need for manual flow rate adjustment and pressure correction. The system contains two microprocessors that range the analyzers, calibrate the system, process the raw data to units of concentration, and provide information to the facility research computer and to the operator through a terminal and the control panels. After initial setup, the system operates for several hours without significant operator attention.

  8. Process Improvement Through Tool Integration in Aero-Mechanical Design

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2010-01-01

    Emerging capabilities in commercial design tools promise to significantly improve multi-disciplinary and inter-disciplinary design and analysis coverage for aerospace mechanical engineers. This paper explores the analysis process for two example problems: a wing and flap mechanical drive system and an aircraft landing gear door panel. The examples begin with the design solid models and include various analysis disciplines such as structural stress and aerodynamic loads. Analytical methods include CFD, multi-body dynamics with flexible bodies, and structural analysis. Elements of analysis data management, data visualization, and collaboration are also included.

  9. Integration Framework of Process Planning based on Resource Independent Operation Summary to Support Collaborative Manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulvatunyou, Boonserm; Wysk, Richard A.; Cho, Hyunbo

    2004-06-01

    In today's global manufacturing environment, manufacturing functions are distributed as never before. Design, engineering, fabrication, and assembly of new products are done routinely in many different enterprises scattered around the world. Successful business transactions require the sharing of design and engineering data on an unprecedented scale. This paper describes a framework that facilitates the collaboration of engineering tasks, particularly process planning and analysis, to support such globalized manufacturing activities. The information models of data and the software components that integrate those information models are described. The integration framework uses an Integrated Product and Process Data (IPPD) representation called a Resource Independent Operation Summary (RIOS) to facilitate the communication of business and manufacturing requirements. Hierarchical process modeling, process planning decomposition and an augmented AND/OR directed graph are used in this representation. The Resource Specific Process Planning (RSPP) module assigns required equipment and tools, selects process parameters, and determines manufacturing costs based on two-level hierarchical RIOS data. The shop floor knowledge (resource and process knowledge) and a hybrid approach (heuristic and linear programming) to linearize the AND/OR graph provide the basis for the planning. Finally, a prototype system is developed and demonstrated with an exemplary part. Java and XML (Extensible Markup Language) are used to ensure software and information portability.

  10. Inertial navigation sensor integrated motion analysis for autonomous vehicle navigation

    NASA Technical Reports Server (NTRS)

    Roberts, Barry; Bhanu, Bir

    1992-01-01

    Recent work on INS integrated motion analysis is described. Results were obtained with a maximally passive system of obstacle detection (OD) for ground-based vehicles and rotorcraft. The OD approach involves motion analysis of imagery acquired by a passive sensor in the course of vehicle travel to generate range measurements to world points within the sensor FOV. INS data and scene analysis results are used to enhance interest point selection, the matching of the interest points, and the subsequent motion-based computations, tracking, and OD. The most important lesson learned from the research described here is that the incorporation of inertial data into the motion analysis program greatly improves the analysis and makes the process more robust.

  11. Computer-aided operations engineering with integrated models of systems and operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Ryan, Dan; Fleming, Land

    1994-01-01

    CONFIG 3 is a prototype software tool that supports integrated conceptual design evaluation from early in the product life cycle, enabling isolated or integrated modeling, simulation, and analysis of the function, structure, behavior, failures and operation of system designs. Integration and reuse of models is supported in an object-oriented environment providing capabilities for graph analysis and discrete event simulation. Integration is supported among diverse modeling approaches (component view, configuration or flow path view, and procedure view) and diverse simulation and analysis approaches. Support is provided for integrated engineering in diverse design domains, including mechanical and electro-mechanical systems, distributed computer systems, and chemical processing and transport systems. CONFIG supports abstracted qualitative and symbolic modeling for early conceptual design. System models are component structure models with operating modes, with embedded time-related behavior models. CONFIG supports failure modeling and modeling of state or configuration changes that result in dynamic changes in dependencies among components. Operations and procedure models are activity structure models that interact with system models. CONFIG is designed to support evaluation of system operability, diagnosability and fault tolerance, and analysis of the development of system effects of problems over time, including faults, failures, and procedural or environmental difficulties.

  12. After Integration; Problems of Race Relations in the High School Today. A Study of Madison High School with Recommendations for New York City Schools.

    ERIC Educational Resources Information Center

    New York City Commission on Human Rights, NY.

    This report first presents a narrative and analysis of the process and aftermath of the integration of Madison High School in Brooklyn, New York City. Then 13 recommendations are stated, among which are the following: (1) Board of Education should establish a special unit to provide technical assistance for integrated schools; (2) the New York…

  13. Cross-Border Higher Education for Regional Integration: Analysis of the JICA-RI Survey on Leading Universities in East Asia. JICA-RI Working Paper. No. 26

    ERIC Educational Resources Information Center

    Kuroda, Kazuo; Yuki, Takako; Kang, Kyuwon

    2010-01-01

    Set against the backdrop of increasing economic interdependence in East Asia, the idea of regional integration is now being discussed as a long-term political process in the region. As in the field of the international economy, de facto integration and interdependence exist with respect to the internationalization of the higher education system…

  14. The 17th Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    Mcdonald, R. R.

    1981-01-01

    Progress made by the Low-Cost Solar Array Project during the period September 1980 to February 1981 is described. Included are reports on project analysis and integration; technology development in silicon material, large-area silicon sheet and encapsulation; production process and equipment development; engineering, and operations. A report on and copies of visual presentations made at the Project Integration Meeting held at Pasadena, California on February 4 and 5, 1981 are also included.

  15. Beyond the limitations of best practices: How logic analysis helped reinterpret dual diagnosis guidelines

    PubMed Central

    Brousselle, Astrid; Lamothe, Lise; Mercier, Céline; Perreault, Michel

    2012-01-01

    The co-occurrence of mental health and substance use disorders is becoming increasingly recognized as a single problem, and professionals recognize that both should be addressed at the same time. Medical best practices recommend integrated treatment. However, criticisms have arisen, particularly concerning the difficulty of implementing integrated teams in specific health-care contexts and the appropriateness of the proposed model for certain populations. Using logic analysis, we identify the key clinical and organizational factors that contribute to successful implementation. Building on both the professional and organizational literatures on integrated services, we propose a conceptual model that makes it possible to analyze integration processes and places integrated treatment within an interpretative framework. Using this model, it becomes possible to identify key factors necessary to support service integration, and suggest new models of practice adapted to particular contexts. PMID:17689316

  16. Three-Axis Distributed Fiber Optic Strain Measurement in 3D Woven Composite Structures

    NASA Technical Reports Server (NTRS)

    Castellucci, Matt; Klute, Sandra; Lally, Evan M.; Froggatt, Mark E.; Lowry, David

    2013-01-01

    Recent advancements in composite materials technologies have broken further from traditional designs and require advanced instrumentation and analysis capabilities. Success or failure is highly dependent on design analysis and manufacturing processes. By monitoring smart structures throughout manufacturing and service life, residual and operational stresses can be assessed and structural integrity maintained. Composite smart structures can be manufactured by integrating fiber optic sensors into existing composite materials processes such as ply layup, filament winding and three-dimensional weaving. In this work, optical fiber was integrated into 3D woven composite parts at a commercial woven products manufacturing facility. The fiber was then used to monitor the structures during a VARTM manufacturing process and subsequent static and dynamic testing. Low-cost telecommunications-grade optical fiber acts as the sensor, using a high-resolution commercial Optical Frequency Domain Reflectometer (OFDR) system providing distributed strain measurement at spatial resolutions as low as 2 mm. Strain measurements using the optical fiber sensors are correlated to resistive strain gage measurements during static structural loading. Keywords: fiber optic, distributed strain sensing, Rayleigh scatter, optical frequency domain reflectometry
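
    A minimal sketch of the OFDR measurement principle: strain shifts the local Rayleigh scatter spectrum, and cross-correlating a measurement against the unstrained reference recovers that shift. The bin width and strain coefficient below are placeholders, not calibrated instrument values.

        import numpy as np

        n = 256
        rng = np.random.default_rng(3)
        ref = rng.normal(size=n)            # reference Rayleigh spectrum segment
        meas = np.roll(ref, 5)              # strained: spectrum shifted by 5 bins

        xcorr = np.correlate(meas - meas.mean(), ref - ref.mean(), mode="full")
        shift_bins = np.argmax(xcorr) - (n - 1)
        GHZ_PER_BIN = 0.8                   # placeholder spectral bin width
        GHZ_PER_MICROSTRAIN = -0.15         # placeholder strain sensitivity
        print("strain ~", shift_bins * GHZ_PER_BIN / GHZ_PER_MICROSTRAIN,
              "microstrain")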

  17. Macro-fingerprint analysis-through-separation of licorice based on FT-IR and 2DCOS-IR

    NASA Astrophysics Data System (ADS)

    Wang, Yang; Wang, Ping; Xu, Changhua; Yang, Yan; Li, Jin; Chen, Tao; Li, Zheng; Cui, Weili; Zhou, Qun; Sun, Suqin; Li, Huifen

    2014-07-01

    In this paper, a step-by-step analysis-through-separation method under the navigation of a multi-step IR macro-fingerprint (FT-IR integrated with second-derivative IR (SD-IR) and 2DCOS-IR) was developed for comprehensively characterizing the hierarchical chemical fingerprints of licorice, from the entirety down to single active components. Subsequently, the chemical profile variation rules of three parts (flavonoids, saponins and saccharides) in the separation process were holistically revealed, and the number of matching peaks and the correlation coefficients with standards of pure compounds increased along the extraction directions. The findings were supported by UPLC results and a verification experiment of the aqueous separation process. It has been demonstrated that the developed multi-step IR macro-fingerprint analysis-through-separation approach could be a rapid, effective and integrated method not only for objectively providing comprehensive chemical characterization of licorice and all its separated parts, but also for rapidly revealing the global enrichment trend of the active components in the licorice separation process.

  18. Atlanta Tower Cab and TRACON Equipment Integration Analysis

    DOT National Transportation Integrated Search

    1980-10-01

    This report presents an analysis of how the new Terminal Information Processing System (TIPS) and Consolidated Cab Display (CCD) equipment might appear to Air Traffic personnel if installed in the tower cab and TRACON at Hartsfield-Atlanta...

  19. NexGen PVAs: Incorporating Eco-Evolutionary Processes into Population Viability Models

    EPA Science Inventory

    We examine how the integration of evolutionary and ecological processes in population dynamics – an emerging framework in ecology – could be incorporated into population viability analysis (PVA). Driven by parallel, complementary advances in population genomics and computational ...

  20. SHARP pre-release v1.0 - Current Status and Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mahadevan, Vijay S.; Rahaman, Ronald O.

    The NEAMS Reactor Product Line effort aims to develop an integrated multiphysics simulation capability for the design and analysis of future generations of nuclear power plants. The Reactor Product Line code suite's multi-resolution hierarchy is being designed to ultimately span the full range of length and time scales present in relevant reactor design and safety analyses, as well as scale from desktop to petaflop computing platforms. In this report, building on a previous report issued in September 2014, we describe our continued efforts to integrate thermal/hydraulics, neutronics, and structural mechanics modeling codes to perform coupled analysis of a representative fast sodium-cooled reactor core in preparation for a unified release of the toolkit. The work reported in the current document covers the software engineering aspects of managing the entire stack of components in the SHARP toolkit and the continuous integration efforts ongoing to prepare a release candidate for interested reactor analysis users. Here we report on the continued integration effort of PROTEUS/Nek5000 and Diablo into the NEAMS framework and the software processes that enable users to utilize the capabilities without losing scientific productivity. Due to the complexity of the individual modules and their necessary/optional dependency library chain, we focus on the configuration and build aspects for the SHARP toolkit, which includes the capability to autodownload dependencies and configure/install with optimal flags in an architecture-aware fashion. Such complexity is untenable without strong software engineering processes such as source management, source control, change reviews, unit tests, integration tests and continuous test suites. Details on these processes are provided in the report as a building step for a SHARP user guide that will accompany the first release, expected by Mar 2016.

  1. Performance and Reliability Optimization for Aerospace Systems subject to Uncertainty and Degradation

    NASA Technical Reports Server (NTRS)

    Miller, David W.; Uebelhart, Scott A.; Blaurock, Carl

    2004-01-01

    This report summarizes work performed by the Space Systems Laboratory (SSL) for NASA Langley Research Center in the field of performance optimization for systems subject to uncertainty. The objective of the research is to develop design methods and tools for the aerospace vehicle design process that take into account lifecycle uncertainties. It recognizes that uncertainty between the predictions of integrated models and data collected from the system in its operational environment is unavoidable. Given the presence of uncertainty, the goal of this work is to develop means of identifying critical sources of uncertainty, and to combine these with the analytical tools used with integrated modeling. In this manner, system uncertainty analysis becomes part of the design process, and can motivate redesign. The specific program objectives were: 1. To incorporate uncertainty modeling, propagation and analysis into the integrated (controls, structures, payloads, disturbances, etc.) design process to derive the error bars associated with performance predictions. 2. To apply modern optimization tools to guide the expenditure of funds in a way that most cost-effectively improves the lifecycle productivity of the system by enhancing subsystem reliability and redundancy. The results for the second program objective are reported separately; this report describes the work and results for the first objective: uncertainty modeling, propagation, and synthesis with integrated modeling.
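
    As a concrete illustration of deriving error bars from uncertain model parameters, here is a minimal Monte Carlo sketch. The performance function and parameter distributions are purely illustrative assumptions, not the report's models.

      # A minimal sketch of propagating parameter uncertainty into error bars
      # on a performance prediction via Monte Carlo sampling.
      import numpy as np

      rng = np.random.default_rng(42)

      def performance(stiffness, damping, disturbance):
          # Hypothetical metric: response grows with disturbance, shrinks
          # with stiffness and damping (illustrative only).
          return disturbance / (stiffness * np.sqrt(damping))

      n = 10_000
      stiffness   = rng.normal(1.00, 0.05, n)    # +/-5% modeling uncertainty
      damping     = rng.normal(0.02, 0.004, n)   # poorly known damping ratio
      disturbance = rng.normal(1.00, 0.10, n)    # disturbance level

      samples = performance(stiffness, damping, disturbance)
      lo, hi = np.percentile(samples, [2.5, 97.5])
      print(f"prediction = {samples.mean():.2f}, 95% band = [{lo:.2f}, {hi:.2f}]")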

  2. Enabling integration in sports for adolescents with intellectual disabilities.

    PubMed

    Grandisson, Marie; Tétreault, Sylvie; Freeman, Andrew R

    2012-05-01

    Promoting the health and social participation of adolescents with intellectual disability is important, as they are particularly vulnerable to encountering difficulties in those areas. Integration of these individuals in integrated sports is one strategy to address this issue. The main objective of this study was to gain a better understanding of the factors associated with the integration of adolescents with intellectual disability in sports alongside their non-disabled peers. Individual interviews were completed with 40 adolescents with intellectual disability and their parents, while 39 rehabilitation staff participated via either a discussion group or self-administered questionnaires. The Disability Creation Process (DCP) theoretical model was used to frame the analysis and the presentation of the findings (The Quebec Classification: Disability Creation Process. International Network on the Disability Creation Process/CSICIDH, Québec, QC, 1998). Various personal and environmental factors that have an impact on integration in sports were identified by participants. For example, attitudes, practical support, individuals' experiences in sports and in integrated settings, as well as behaviour control, emerged as important elements to consider. Integration in integrated sports can engender many benefits for individuals with intellectual disability, their parents and non-disabled athletes. However, many barriers need to be removed before such benefits can be more widely realized. © 2012 Blackwell Publishing Ltd.

  3. Two-Dimensional Signal Processing, Optical Information Storage and Processing, and Electromagnetic Measurements

    DTIC Science & Technology

    1994-05-16

    analysis of anisotropic grating diffraction, performance analysis of Givens rotation integrated optical interdigitated-electrode cross-channel Bragg... 11. T. R. Gardos and R. M. Mersereau, "FIR filtering on a lattice with periodically deleted samples," Proc. 1991 IEEE Int. Conf. on Acoustics... vol. 1, pp. 301-311, July 1992. 23. T. R. Gardos, K. Nayebi, and R. M. Mersereau, "Time domain analysis of multi-dimensional multi-rate filter

  4. Implementing Information and Communication Technology to Support Community Aged Care Service Integration: Lessons from an Australian Aged Care Provider.

    PubMed

    Douglas, Heather E; Georgiou, Andrew; Tariq, Amina; Prgomet, Mirela; Warland, Andrew; Armour, Pauline; Westbrook, Johanna I

    2017-04-10

    There is limited evidence of the benefits of information and communication technology (ICT) to support integrated aged care services. We undertook a case study to describe carelink+, a centralised client service management ICT system implemented by a large aged and community care service provider, Uniting. We sought to explicate the care-related information exchange processes associated with carelink+ and identify lessons for organisations attempting to use ICT to support service integration. Our case study included seventeen interviews and eleven observation sessions with a purposive sample of staff within the organisation. Inductive analysis was used to develop a model of ICT-supported information exchange. Management staff described the integrated care model designed to underpin carelink+. Frontline staff described complex information exchange processes supporting coordination of client services. Mismatches between the data quality and the functions carelink+ was designed to support necessitated the evolution of new work processes associated with the system. There is value in explicitly modelling the work processes that emerge as a consequence of ICT. Continuous evaluation of the match between ICT and work processes will help aged care organisations to achieve higher levels of ICT maturity that support their efforts to provide integrated care to clients.

  5. Implementing Information and Communication Technology to Support Community Aged Care Service Integration: Lessons from an Australian Aged Care Provider

    PubMed Central

    Georgiou, Andrew; Tariq, Amina; Prgomet, Mirela; Warland, Andrew; Armour, Pauline; Westbrook, Johanna I

    2017-01-01

    Introduction: There is limited evidence of the benefits of information and communication technology (ICT) to support integrated aged care services. Objectives: We undertook a case study to describe carelink+, a centralised client service management ICT system implemented by a large aged and community care service provider, Uniting. We sought to explicate the care-related information exchange processes associated with carelink+ and identify lessons for organisations attempting to use ICT to support service integration. Methods: Our case study included seventeen interviews and eleven observation sessions with a purposive sample of staff within the organisation. Inductive analysis was used to develop a model of ICT-supported information exchange. Results: Management staff described the integrated care model designed to underpin carelink+. Frontline staff described complex information exchange processes supporting coordination of client services. Mismatches between the data quality and the functions carelink+ was designed to support necessitated the evolution of new work processes associated with the system. Conclusions: There is value in explicitly modelling the work processes that emerge as a consequence of ICT. Continuous evaluation of the match between ICT and work processes will help aged care organisations to achieve higher levels of ICT maturity that support their efforts to provide integrated care to clients. PMID:29042851

  6. A randomized wait-list controlled analysis of the implementation integrity of team-initiated problem solving processes.

    PubMed

    Newton, J Stephen; Horner, Robert H; Algozzine, Bob; Todd, Anne W; Algozzine, Kate

    2012-08-01

    Members of Positive Behavior Interventions and Supports (PBIS) teams from 34 elementary schools participated in a Team-Initiated Problem Solving (TIPS) Workshop and follow-up technical assistance. Within the context of a randomized wait-list controlled trial, team members who were the first recipients of the TIPS intervention demonstrated greater implementation integrity in using the problem-solving processes during their team meetings than did members of PBIS Teams in the Wait-List Control group. The success of TIPS at improving implementation integrity of the problem-solving processes is encouraging and suggests the value of conducting additional research focused on determining whether there is a functional relation between use of these problem-solving processes and actual resolution of targeted student academic and social problems. Copyright © 2012 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  7. Main Engine Prototype Development for 2nd Generation RLV RS-83

    NASA Technical Reports Server (NTRS)

    Vilja, John; Fisher, Mark; Lyles, Garry M. (Technical Monitor)

    2002-01-01

    This presentation reports on the NASA project to develop a prototype for RS-83 engine designed for use on reusable launch vehicles (RLV). Topics covered include: program objectives, overview schedule, organizational chart, integrated systems engineering processes, requirement analysis, catastrophic engine loss, maintainability analysis tools, and prototype design analysis.

  8. Ten years of maintaining and expanding a microbial genome and metagenome analysis system.

    PubMed

    Markowitz, Victor M; Chen, I-Min A; Chu, Ken; Pati, Amrita; Ivanova, Natalia N; Kyrpides, Nikos C

    2015-11-01

    Launched in March 2005, the Integrated Microbial Genomes (IMG) system is a comprehensive data management system that supports multidimensional comparative analysis of genomic data. At the core of the IMG system is a data warehouse that contains genome and metagenome datasets sequenced at the Joint Genome Institute or provided by scientific users, as well as public genome datasets available at the National Center for Biotechnology Information Genbank sequence data archive. Genomes and metagenome datasets are processed using IMG's microbial genome and metagenome sequence data processing pipelines and are integrated into the data warehouse using IMG's data integration toolkits. Microbial genome and metagenome application specific data marts and user interfaces provide access to different subsets of IMG's data and analysis toolkits. This review article revisits IMG's original aims, highlights key milestones reached by the system during the past 10 years, and discusses the main challenges faced by a rapidly expanding system, in particular the complexity of maintaining such a system in an academic setting with limited budgets and computing and data management infrastructure. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Parallel processing for nonlinear dynamics simulations of structures including rotating bladed-disk assemblies

    NASA Technical Reports Server (NTRS)

    Hsieh, Shang-Hsien

    1993-01-01

    The principal objective of this research is to develop, test, and implement coarse-grained, parallel-processing strategies for nonlinear dynamic simulations of practical structural problems. There are contributions to four main areas: finite element modeling and analysis of rotational dynamics, numerical algorithms for parallel nonlinear solutions, automatic partitioning techniques to effect load-balancing among processors, and an integrated parallel analysis system.

  10. FE-Analysis of Stretch-Blow Moulded Bottles Using an Integrative Process Simulation

    NASA Astrophysics Data System (ADS)

    Hopmann, C.; Michaeli, W.; Rasche, S.

    2011-05-01

    The two-stage stretch-blow moulding process has been established for the large-scale production of high-quality PET containers with excellent mechanical and optical properties. The total production costs of a bottle are significantly driven by the material costs. Due to this dominant share of the bottle material, the PET industry is interested in reducing the total production costs through optimised material efficiency. However, a reduced material inventory means decreasing wall thicknesses and therewith a reduction of the bottle properties (e.g. mechanical properties, barrier properties). Therefore, there is often a trade-off between a minimal bottle weight and adequate properties of the bottle. In order to achieve these objectives, Computer Aided Engineering (CAE) techniques can assist the designer of new stretch-blow moulded containers. Hence, tools such as process simulation and structural analysis have become important in the blow moulding sector. The Institute of Plastics Processing (IKV) at RWTH Aachen University, Germany, has developed an integrative three-dimensional process simulation which models the complete path of a preform through a stretch-blow moulding machine. At first, the reheating of the preform is calculated by a thermal simulation. Afterwards, the inflation of the preform to a bottle is calculated by finite element analysis (FEA). The results of this step include the local wall thickness distribution and the local biaxial stretch ratios. Not only the material distribution but also the material properties that result from the deformation history of the polymer have significant influence on the bottle properties. Therefore, a correlation between the material properties and stretch ratios is considered in an integrative simulation approach developed at IKV. The results of the process simulation (wall thickness, stretch ratios) are transferred to a further simulation program and mapped onto the bottle's FE mesh. This approach allows a local determination of the material properties and thus a more accurate prediction of the bottle properties. The approach was applied both for a mechanical structural analysis and for a barrier analysis. First results point out that the approach can improve the FE analysis and might be a helpful tool for designing new stretch-blow moulded bottles.
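
    The mapping step described above can be sketched as a nearest-neighbour transfer between meshes. The following is a minimal assumed illustration (not IKV's implementation; coordinates and the thickness field are synthetic).

      # A minimal sketch of mapping a wall-thickness field from a process-
      # simulation mesh onto a structural FE mesh by nearest-neighbour lookup.
      import numpy as np
      from scipy.spatial import cKDTree

      def map_results(source_nodes, source_thickness, target_nodes):
          # Assign each target node the thickness of its nearest source node.
          tree = cKDTree(source_nodes)
          _, idx = tree.query(target_nodes)
          return source_thickness[idx]

      # Hypothetical node coordinates (x, y, z) and thickness field.
      rng = np.random.default_rng(1)
      src = rng.uniform(0, 1, (500, 3))
      thick = 0.3 + 0.1 * src[:, 2]        # mock thickness gradient along height
      tgt = rng.uniform(0, 1, (200, 3))
      mapped = map_results(src, thick, tgt)
      print(mapped[:5])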

  11. Effective Integration of ICT in Singapore Schools: Pedagogical and Policy Implications

    ERIC Educational Resources Information Center

    Lim, Cher Ping

    2007-01-01

    This paper examines and analyses where and how information and communication technologies (ICT) are integrated in Singapore schools to engage students in higher-order thinking activities. Taking the activity system as a unit of analysis, the study documents the actual processes and sociocultural elements that engage students in higher-order…

  12. Measuring the Impact of Data Mining on Churn Management.

    ERIC Educational Resources Information Center

    Lejeune, Miguel A. P. M.

    2001-01-01

    Churn management is a concern for businesses, particularly in the digital economy. A customer relationship framework is proposed to help deal with churn issues. The model integrates the electronic channel and involves four tools for enhancing data collection, data treatment, data analysis and data integration in the decision-making process.…

  13. Improved analyses using function datasets and statistical modeling

    Treesearch

    John S. Hogland; Nathaniel M. Anderson

    2014-01-01

    Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...

  14. Progress Report 18 for the Period February to July 1981 and Proceedings of the 18th Project Integration Meeting

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Progress in the low-cost solar array project during the period February to July 1981 is reported. Included are: (1) project analysis and integration; (2) technology development in silicon material, large-area silicon sheet and encapsulation; (3) process development; and (4) engineering and operations.

  15. Integrated watershed analysis: adapting to changing times

    Treesearch

    Gordon H. Reeves

    2013-01-01

    Resource managers are increasingly required to conduct integrated analyses of aquatic and terrestrial ecosystems before undertaking any activities. There are a number of research studies on the impacts of management actions on these ecosystems, as well as a growing body of knowledge about ecological processes that affect them, particularly aquatic ecosystems, which...

  16. Work-Integrated Learning Process in Tourism Training Programs in Vietnam: Voices of Education and Industry

    ERIC Educational Resources Information Center

    Khuong, Cam Thi Hong

    2016-01-01

    This paper addresses the work-integrated learning (WIL) initiative embedded in selected tourism training programs in Vietnam. The research was grounded on the framework of stakeholder ethos. Drawing on tourism training curriculum analysis and interviews with lecturers, institutional leaders, industry managers and internship supervisors, this study…

  17. Integrating Spatial Components into FIA Models of Forest Resources: Some Technical Aspects

    Treesearch

    Pat Terletzky; Tracey Frescino

    2005-01-01

    We examined two software packages to determine their feasibility for implementing spatially explicit forest resource models that integrate Forest Inventory and Analysis (FIA) data. ARCINFO and Interactive Data Language (IDL) were examined for their input requirements, processing speed, storage requirements, and flexibility of implementation. Implementations of two...

  18. Fully integrated wearable sensor arrays for multiplexed in situ perspiration analysis

    PubMed Central

    Nyein, Hnin Yin Yin; Challa, Samyuktha; Chen, Kevin; Peck, Austin; Fahad, Hossain M.; Ota, Hiroki; Shiraki, Hiroshi; Kiriya, Daisuke; Lien, Der-Hsien; Brooks, George A.; Davis, Ronald W.; Javey, Ali

    2016-01-01

    Wearable sensor technologies are essential to the realization of personalized medicine through continuously monitoring an individual's state of health [1-12]. Sampling human sweat, which is rich in physiological information [13], could enable non-invasive monitoring. Previously reported sweat-based and other non-invasive biosensors either can only monitor a single analyte at a time or lack on-site signal processing circuitry and sensor calibration mechanisms for accurate analysis of the physiological state [14-18]. Given the complexity of sweat secretion, simultaneous and multiplexed screening of target biomarkers is critical and requires full system integration to ensure the accuracy of measurements. Here we present a mechanically flexible and fully integrated (that is, no external analysis is needed) sensor array for multiplexed in situ perspiration analysis, which simultaneously and selectively measures sweat metabolites (such as glucose and lactate) and electrolytes (such as sodium and potassium ions), as well as the skin temperature (to calibrate the response of the sensors). Our work bridges the technological gap between signal transduction, conditioning (amplification and filtering), processing and wireless transmission in wearable biosensors by merging plastic-based sensors that interface with the skin with silicon integrated circuits consolidated on a flexible circuit board for complex signal processing. This application could not have been realized using either of these technologies alone owing to their respective inherent limitations. The wearable system is used to measure the detailed sweat profile of human subjects engaged in prolonged indoor and outdoor physical activities, and to make a real-time assessment of the physiological state of the subjects. This platform enables a wide range of personalized diagnostic and physiological monitoring applications. PMID:26819044

  19. Advantages of comparative studies in songbirds to understand the neural basis of sensorimotor integration.

    PubMed

    Murphy, Karagh; James, Logan S; Sakata, Jon T; Prather, Jonathan F

    2017-08-01

    Sensorimotor integration is the process through which the nervous system creates a link between motor commands and associated sensory feedback. This process allows for the acquisition and refinement of many behaviors, including learned communication behaviors such as speech and birdsong. Consequently, it is important to understand fundamental mechanisms of sensorimotor integration, and comparative analyses of this process can provide vital insight. Songbirds offer a powerful comparative model system to study how the nervous system links motor and sensory information for learning and control. This is because the acquisition, maintenance, and control of birdsong critically depend on sensory feedback. Furthermore, there is an incredible diversity of song organizations across songbird species, ranging from songs with simple, stereotyped sequences to songs with complex sequencing of vocal gestures, as well as a wide diversity of song repertoire sizes. Despite this diversity, the neural circuitry for song learning, control, and maintenance remains highly similar across species. Here, we highlight the utility of songbirds for the analysis of sensorimotor integration and the insights about mechanisms of sensorimotor integration gained by comparing different songbird species. Key conclusions from this comparative analysis are that variation in song sequence complexity seems to covary with the strength of feedback signals in sensorimotor circuits and that sensorimotor circuits contain distinct representations of elements in the vocal repertoire, possibly enabling evolutionary variation in repertoire sizes. We conclude our review by highlighting important areas of research that could benefit from increased comparative focus, with particular emphasis on the integration of new technologies. Copyright © 2017 the American Physiological Society.

  20. Identification of candidate genes in osteoporosis by integrated microarray analysis.

    PubMed

    Li, J J; Wang, B Q; Fei, Q; Yang, Y; Li, D

    2016-12-01

    In order to screen the altered gene expression profile in peripheral blood mononuclear cells of patients with osteoporosis, we performed an integrated analysis of the online microarray studies of osteoporosis. We searched the Gene Expression Omnibus (GEO) database for microarray studies of peripheral blood mononuclear cells in patients with osteoporosis. Subsequently, we integrated gene expression data sets from multiple microarray studies to obtain differentially expressed genes (DEGs) between patients with osteoporosis and normal controls. Gene function analysis was performed to uncover the functions of identified DEGs. A total of three microarray studies were selected for integrated analysis. In all, 1125 genes were found to be significantly differentially expressed between osteoporosis patients and normal controls, with 373 upregulated and 752 downregulated genes. Positive regulation of the cellular amino metabolic process (gene ontology (GO): 0033240, false discovery rate (FDR) = 1.00E+00) was significantly enriched under the GO category for biological processes, while for molecular functions, flavin adenine dinucleotide binding (GO: 0050660, FDR = 3.66E-01) and androgen receptor binding (GO: 0050681, FDR = 6.35E-01) were significantly enriched. DEGs were enriched in many osteoporosis-related signalling pathways, including those of mitogen-activated protein kinase (MAPK) and calcium. Protein-protein interaction (PPI) network analysis showed that the significant hub proteins included ubiquitin specific peptidase 9, X-linked (Degree = 99), ubiquitin specific peptidase 19 (Degree = 57) and ubiquitin conjugating enzyme E2 B (Degree = 57). Analysis of gene function of the identified differentially expressed genes may expand our understanding of fundamental mechanisms leading to osteoporosis. Moreover, significantly enriched pathways, such as MAPK and calcium, may be involved in osteoporosis through osteoblastic differentiation and bone formation. Cite this article: J. J. Li, B. Q. Wang, Q. Fei, Y. Yang, D. Li. Identification of candidate genes in osteoporosis by integrated microarray analysis. Bone Joint Res 2016;5:594-601. DOI: 10.1302/2046-3758.512.BJR-2016-0073.R1. © 2016 Fei et al.
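
    The core DEG-calling step in such an integrated analysis can be sketched as a per-gene test with multiple-testing correction. The following minimal example (synthetic data, not the paper's pipeline) uses a t-test with Benjamini-Hochberg FDR control.

      # A minimal sketch of calling differentially expressed genes between
      # patient and control expression matrices with BH-FDR correction.
      import numpy as np
      from scipy import stats

      def differential_genes(patients, controls, alpha=0.05):
          # Rows are genes, columns are samples; returns indices of DEGs.
          _, p = stats.ttest_ind(patients, controls, axis=1)
          order = np.argsort(p)
          m = len(p)
          thresh = alpha * np.arange(1, m + 1) / m   # BH critical values
          passed = p[order] <= thresh
          k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
          return order[:k]

      rng = np.random.default_rng(0)
      genes, n = 1000, 12
      ctrl = rng.normal(0, 1, (genes, n))
      pat = rng.normal(0, 1, (genes, n))
      pat[:50] += 1.5                                # 50 truly shifted genes
      print(f"{len(differential_genes(pat, ctrl))} genes called significant")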

  1. Recurrence Quantification Analysis of Processes and Products of Discourse: A Tutorial in R

    ERIC Educational Resources Information Center

    Wallot, Sebastian

    2017-01-01

    Processes of naturalistic reading and writing are based on complex linguistic input, stretch out over time, and rely on an integrated performance of multiple perceptual, cognitive, and motor processes. Hence, naturalistic reading and writing performance is nonstationary and exhibits fluctuations and transitions. However, instead of being just…

  2. The Spectral Image Processing System (SIPS): Software for integrated analysis of AVIRIS data

    NASA Technical Reports Server (NTRS)

    Kruse, F. A.; Lefkoff, A. B.; Boardman, J. W.; Heidebrecht, K. B.; Shapiro, A. T.; Barloon, P. J.; Goetz, A. F. H.

    1992-01-01

    The Spectral Image Processing System (SIPS) is a software package developed by the Center for the Study of Earth from Space (CSES) at the University of Colorado, Boulder, in response to a perceived need to provide integrated tools for analysis of imaging spectrometer data both spectrally and spatially. SIPS was specifically designed to deal with data from the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) and the High Resolution Imaging Spectrometer (HIRIS), but was tested with other datasets including the Geophysical and Environmental Research Imaging Spectrometer (GERIS), GEOSCAN images, and Landsat TM. SIPS was developed using the 'Interactive Data Language' (IDL). It takes advantage of high speed disk access and fast processors running under the UNIX operating system to provide rapid analysis of entire imaging spectrometer datasets. SIPS allows analysis of single or multiple imaging spectrometer data segments at full spatial and spectral resolution. It also allows visualization and interactive analysis of image cubes derived from quantitative analysis procedures such as absorption band characterization and spectral unmixing. SIPS consists of three modules: SIPS Utilities, SIPS_View, and SIPS Analysis. SIPS version 1.1 is described below.

  3. An integrated workflow for analysis of ChIP-chip data.

    PubMed

    Weigelt, Karin; Moehle, Christoph; Stempfl, Thomas; Weber, Bernhard; Langmann, Thomas

    2008-08-01

    Although ChIP-chip is a powerful tool for genome-wide discovery of transcription factor target genes, the steps involving raw data analysis, identification of promoters, and correlation with binding sites are still laborious processes. Therefore, we report an integrated workflow for the analysis of promoter tiling arrays with the Genomatix ChipInspector system. We compare this tool with open-source software packages to identify PU.1 regulated genes in mouse macrophages. Our results suggest that ChipInspector data analysis, comparative genomics for binding site prediction, and pathway/network modeling significantly facilitate and enhance whole-genome promoter profiling to reveal in vivo sites of transcription factor-DNA interactions.

  4. From proteomics to systems biology: MAPA, MASS WESTERN, PROMEX, and COVAIN as a user-oriented platform.

    PubMed

    Weckwerth, Wolfram; Wienkoop, Stefanie; Hoehenwarter, Wolfgang; Egelhofer, Volker; Sun, Xiaoliang

    2014-01-01

    Genome sequencing and systems biology are revolutionizing life sciences. Proteomics emerged as a fundamental technique of this novel research area as it is the basis for gene function analysis and modeling of dynamic protein networks. Here a complete proteomics platform suited for functional genomics and systems biology is presented. The strategy includes MAPA (mass accuracy precursor alignment; http://www.univie.ac.at/mosys/software.html) as a rapid exploratory analysis step; MASS WESTERN for targeted proteomics; COVAIN (http://www.univie.ac.at/mosys/software.html) for multivariate statistical analysis, data integration, and data mining; and PROMEX (http://www.univie.ac.at/mosys/databases.html) as a database module for proteogenomics and proteotypic peptides for targeted analysis. Moreover, the presented platform can also be utilized to integrate metabolomics and transcriptomics data for the analysis of metabolite-protein-transcript correlations and time course analysis using COVAIN. Examples for the integration of MAPA and MASS WESTERN data, proteogenomic and metabolic modeling approaches for functional genomics, phosphoproteomics by integration of MOAC (metal-oxide affinity chromatography) with MAPA, and the integration of metabolomics, transcriptomics, proteomics, and physiological data using this platform are presented. All software and step-by-step tutorials for data processing and data mining can be downloaded from http://www.univie.ac.at/mosys/software.html.

  5. EVALUATING LANDSCAPE CHANGE AND HYDROLOGICAL CONSEQUENCES IN A SEMI-ARID ENVIRONMENT

    EPA Science Inventory

    During the past two decades, important advances in the integration of remote imagery, computer processing, and spatial analysis technologies have been used to better understand the distribution of natural communities and ecosystems, and the ecological processes that affect these ...

  6. Work Flow Analysis Report Consisting of Work Management - Preventive Maintenance - Materials and Equipment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JENNINGS, T.L.

    The Work Flow Analysis Report will be used to facilitate the requirements for implementing the Work Control module of Passport. The report consists of workflow integration processes for Work Management, Preventive Maintenance, and Materials and Equipment.

  7. Structural Optimization in automotive design

    NASA Technical Reports Server (NTRS)

    Bennett, J. A.; Botkin, M. E.

    1984-01-01

    Although mathematical structural optimization has been an active research area for twenty years, there has been relatively little penetration into the design process. Experience indicates that often this is due to the traditional layout-analysis design process. In many cases, optimization efforts have been outgrowths of analysis groups which are themselves appendages to the traditional design process. As a result, optimization is often introduced into the design process too late to have a significant effect because many potential design variables have already been fixed. A series of examples are given to indicate how structural optimization has been effectively integrated into the design process.

  8. A 3D THz image processing methodology for a fully integrated, semi-automatic and near real-time operational system

    NASA Astrophysics Data System (ADS)

    Brook, A.; Cristofani, E.; Vandewal, M.; Matheis, C.; Jonuscheit, J.; Beigang, R.

    2012-05-01

    The present study proposes an image processing methodology operated in a fully integrated, semi-automatic and near real-time mode, developed for Frequency-Modulated Continuous-Wave (FMCW) THz images with center frequencies around 100 GHz and 300 GHz. The quality control of aeronautics composite multi-layered materials and structures using Non-Destructive Testing is the main focus of this work. Image processing is applied to the 3-D images to extract useful information. The data are processed by extracting areas of interest. The detected areas are subjected to image analysis for more detailed investigation managed by a spatial model. Finally, the post-processing stage examines and evaluates the spatial accuracy of the extracted information.

  9. Optical performance assessment under environmental and mechanical perturbations in large, deployable telescopes

    NASA Astrophysics Data System (ADS)

    Folley, Christopher; Bronowicki, Allen

    2005-09-01

    Prediction of optical performance for large, deployable telescopes under environmental conditions and mechanical disturbances is a crucial part of the design verification process of such instruments for all phases of design and operation: ground testing, commissioning, and on-orbit operation. A Structural-Thermal-Optical-Performance (STOP) analysis methodology is often created that integrates the output of one analysis with the input of another. The integration of thermal environment predictions with structural models is relatively well understood, while the integration of structural deformation results into optical analysis/design software is less straightforward. A Matlab toolbox has been created that effectively integrates the predictions of mechanical deformations on optical elements generated by, for example, finite element analysis, and computes optical path differences for the distorted prescription. The engine of the toolbox is the real ray-tracing algorithm that allows the optical surfaces to be defined in a single, global coordinate system, thereby allowing automatic alignment of the mechanical coordinate system with the optical coordinate system. Therefore, the physical location of the optical surfaces is identical in the optical prescription and the finite element model. The application of rigid body displacements to optical surfaces, however, is more general than for use solely in STOP analysis, such as the analysis of misalignments during the commissioning process. Furthermore, all the functionality of Matlab is available for optimization and control. Since this is a new tool for use on flight programs, it has been verified against CODE V. The toolbox's functionality, to date, is described, verification results are presented, and, as an example of its utility, results of a thermal distortion analysis are presented using the James Webb Space Telescope (JWST) prescription.
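
    The rigid-body step mentioned above can be sketched in a few lines. This is an assumed illustration of applying a small rotation plus translation to sampled optical-surface points (not the toolbox itself; the data model and values are hypothetical).

      # A minimal sketch of applying a rigid-body displacement to an optical
      # surface's sampled points: Rodrigues' rotation followed by translation.
      import numpy as np

      def rigid_body_transform(points, rotvec, translation):
          # rotvec is an axis-angle vector; points is an (N, 3) array.
          theta = np.linalg.norm(rotvec)
          if theta == 0.0:
              return points + translation
          k = rotvec / theta
          K = np.array([[0, -k[2], k[1]],
                        [k[2], 0, -k[0]],
                        [-k[1], k[0], 0]])
          R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
          return points @ R.T + translation

      # Hypothetical mirror sample points displaced by FE-predicted motion.
      surface = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.001], [0.0, 0.1, 0.001]])
      moved = rigid_body_transform(surface, rotvec=np.array([0, 1e-4, 0]),
                                   translation=np.array([0, 0, 2e-6]))
      print(moved)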

  10. Service-oriented model-encapsulation strategy for sharing and integrating heterogeneous geo-analysis models in an open web environment

    NASA Astrophysics Data System (ADS)

    Yue, Songshan; Chen, Min; Wen, Yongning; Lu, Guonian

    2016-04-01

    Earth environment is extremely complicated and constantly changing; thus, it is widely accepted that the use of a single geo-analysis model cannot accurately represent all details when solving complex geo-problems. Over several years of research, numerous geo-analysis models have been developed. However, a collaborative barrier between model providers and model users still exists. The development of cloud computing has provided a new and promising approach for sharing and integrating geo-analysis models across an open web environment. To share and integrate these heterogeneous models, encapsulation studies should be conducted that are aimed at shielding original execution differences to create services which can be reused in the web environment. Although some model service standards (such as Web Processing Service (WPS) and Geo Processing Workflow (GPW)) have been designed and developed to help researchers construct model services, various problems regarding model encapsulation remain. (1) The descriptions of geo-analysis models are complicated and typically require rich-text descriptions and case-study illustrations, which are difficult to fully represent within a single web request (such as the GetCapabilities and DescribeProcess operations in the WPS standard). (2) Although Web Service technologies can be used to publish model services, model users who want to use a geo-analysis model and copy the model service into another computer still encounter problems (e.g., they cannot access the model deployment dependencies information). This study presents a strategy for encapsulating geo-analysis models to reduce problems encountered when sharing models between model providers and model users and supports the tasks with different web service standards (e.g., the WPS standard). A description method for heterogeneous geo-analysis models is studied. Based on the model description information, the methods for encapsulating the model-execution program to model services and for describing model-service deployment information are also included in the proposed strategy. Hence, the model-description interface, model-execution interface and model-deployment interface are studied to help model providers and model users more easily share, reuse and integrate geo-analysis models in an open web environment. Finally, a prototype system is established, and the WPS standard is employed as an example to verify the capability and practicability of the model-encapsulation strategy. The results show that it is more convenient for modellers to share and integrate heterogeneous geo-analysis models in cloud computing platforms.
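
    The encapsulation idea can be sketched as a small web service exposing description and execution operations. The endpoints, model, and description schema below are hypothetical illustrations in the spirit of GetCapabilities/DescribeProcess/Execute, not the WPS standard verbatim.

      # A minimal sketch of wrapping a geo-analysis model as a web service
      # with rich description (including deployment dependencies) and execution.
      from flask import Flask, jsonify, request

      app = Flask(__name__)

      MODEL_DESCRIPTION = {
          "id": "slope_model",                 # hypothetical model identifier
          "title": "Terrain slope calculator",
          "inputs": {"elevations": "list of floats", "cell_size": "float"},
          "outputs": {"max_slope": "float"},
          "deployment": {"runtime": "python>=3.9", "dependencies": []},
      }

      @app.route("/describe", methods=["GET"])
      def describe():
          # One response carrying the full model description and deployment info.
          return jsonify(MODEL_DESCRIPTION)

      @app.route("/execute", methods=["POST"])
      def execute():
          payload = request.get_json()
          z, dx = payload["elevations"], payload["cell_size"]
          slopes = [abs(b - a) / dx for a, b in zip(z, z[1:])]
          return jsonify({"max_slope": max(slopes)})

      if __name__ == "__main__":
          app.run(port=5000)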

  11. Achieving Integration in Mixed Methods Designs—Principles and Practices

    PubMed Central

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  12. Achieving integration in mixed methods designs-principles and practices.

    PubMed

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-12-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs-exploratory sequential, explanatory sequential, and convergent-and through four advanced frameworks-multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. © Health Research and Educational Trust.

  13. Research of processes of reception and analysis of dynamic digital medical images in hardware/software complexes used for diagnostics and treatment of cardiovascular diseases

    NASA Astrophysics Data System (ADS)

    Karmazikov, Y. V.; Fainberg, E. M.

    2005-06-01

    Work with DICOM-compatible equipment integrated into hardware and software systems for medical purposes is considered. The structure of the data reception and transformation process is presented using the example of the digital roentgenography and angiography systems included in the DIMOL-IK hardware-software complex. Algorithms for data reception and analysis are proposed. Issues of further processing and storage of the received data are considered.
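
    As a minimal illustration of receiving such data, the sketch below reads a DICOM series into a frame stack using pydicom, one common open-source library (the file names are hypothetical; this is not the DIMOL-IK implementation).

      # A minimal sketch of loading a dynamic DICOM series for analysis.
      import numpy as np
      import pydicom

      def load_frames(paths):
          # Read DICOM files, sort by instance number, return a 3-D pixel stack.
          datasets = [pydicom.dcmread(p) for p in paths]
          datasets.sort(key=lambda ds: int(ds.InstanceNumber))
          return np.stack([ds.pixel_array for ds in datasets])

      frames = load_frames(["angio_001.dcm", "angio_002.dcm"])  # hypothetical files
      print(frames.shape, frames.dtype)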

  14. Numerical and experimental analysis of a ducted propeller designed by a fully automated optimization process under open water condition

    NASA Astrophysics Data System (ADS)

    Yu, Long; Druckenbrod, Markus; Greve, Martin; Wang, Ke-qi; Abdel-Maksoud, Moustafa

    2015-10-01

    A fully automated optimization process is presented for the design of ducted propellers under open water conditions, including 3D geometry modeling, meshing, optimization algorithms and CFD analysis techniques. The developed process allows the direct integration of a RANSE solver in the design stage. A practical ducted propeller design case study is carried out for validation. Numerical simulations and open water tests were carried out and confirmed that the optimum ducted propeller improves hydrodynamic performance as predicted.
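
    The structure of such an automated loop can be sketched as optimizer-driven design evaluation. In this minimal assumed example, an analytic stand-in replaces the geometry/meshing/RANSE chain; the design variables and efficiency surface are invented for illustration.

      # A minimal sketch of a fully automated design loop: parameterized
      # geometry -> solver evaluation -> optimizer update.
      import numpy as np
      from scipy.optimize import minimize

      def evaluate_design(x):
          # Stand-in for geometry + meshing + RANSE run returning -efficiency.
          pitch_ratio, camber = x
          efficiency = 0.7 - (pitch_ratio - 1.1) ** 2 - 2.0 * (camber - 0.03) ** 2
          return -efficiency              # minimize negative efficiency

      result = minimize(evaluate_design, x0=np.array([1.0, 0.02]),
                        bounds=[(0.8, 1.4), (0.0, 0.08)], method="L-BFGS-B")
      print(f"optimum design: {result.x}, efficiency = {-result.fun:.3f}")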

  15. Stochastic Analysis and Applied Probability(3.3.1): Topics in the Theory and Applications of Stochastic Analysis

    DTIC Science & Technology

    2015-08-13

    is due to Reiman [36] who considered the case where the arrivals and services are mutually independent renewal processes with square integrable summands... to a reflected diffusion process with drift and diffusion coefficients that depend on the state of the process. In models considered in works of Reiman... the infinity Laplacian. Jour. AMS, to appear. [36] M. I. Reiman. Open queueing networks in heavy traffic. Mathematics of Operations Research, 9(3): 441

  16. Evolution of the Tropical Cyclone Integrated Data Exchange And Analysis System (TC-IDEAS)

    NASA Technical Reports Server (NTRS)

    Turk, J.; Chao, Y.; Haddad, Z.; Hristova-Veleva, S.; Knosp, B.; Lambrigtsen, B.; Li, P.; Licata, S.; Poulsen, W.; Su, H.; et al.

    2010-01-01

    The Tropical Cyclone Integrated Data Exchange and Analysis System (TC-IDEAS) is being jointly developed by the Jet Propulsion Laboratory (JPL) and the Marshall Space Flight Center (MSFC) as part of NASA's Hurricane Science Research Program. The long-term goal is to create a comprehensive tropical cyclone database of satellite and airborne observations, in-situ measurements and model simulations containing parameters that pertain to the thermodynamic and microphysical structure of the storms; the air-sea interaction processes; and the large-scale environment.

  17. Debris Examination Using Ballistic and Radar Integrated Software

    NASA Technical Reports Server (NTRS)

    Griffith, Anthony; Schottel, Matthew; Lee, David; Scully, Robert; Hamilton, Joseph; Kent, Brian; Thomas, Christopher; Benson, Jonathan; Branch, Eric; Hardman, Paul; et al.

    2012-01-01

    The Debris Examination Using Ballistic and Radar Integrated Software (DEBRIS) program was developed to provide rapid and accurate analysis of debris observed by the NASA Debris Radar (NDR). This software provides a greatly improved analysis capacity over earlier manual processes, allowing for up to four times as much data to be analyzed by one-quarter of the personnel required by earlier methods. There are two applications that comprise the DEBRIS system: the Automated Radar Debris Examination Tool (ARDENT) and the primary DEBRIS tool.

  18. Analysis of metabolic energy utilization in the Skylab astronauts

    NASA Technical Reports Server (NTRS)

    Leonard, J. I.

    1977-01-01

    Skylab biomedical data regarding man's metabolic processes during extended periods of weightlessness are presented. The data were used in an integrated metabolic balance analysis which included analysis of Skylab water balance, electrolyte balance, evaporative water loss, and body composition. A theoretical analysis of energy utilization in man is presented. The results of the analysis are presented in tabular and graphic format.

  19. Millimeter-Wave GaN MMIC Integration with Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Coffey, Michael

    This thesis addresses the analysis, design, integration and test of microwave and millimeter-wave monolithic microwave integrated circuits (MMICs). Recent and ongoing progress in semiconductor device fabrication and MMIC processing technology has pushed the upper limit in MMIC frequencies from millimeter-wave (30-300 GHz) to terahertz (300-3000 GHz). MMIC components operating at these frequencies will be used to improve the sensitivity and performance of radiometers, receivers for communication systems, passive remote sensing systems, transceivers for radar instruments and radio astronomy systems. However, a serious hurdle in the utilization of these MMIC components, and a main topic presented in this thesis, is the development and reliable fabrication of practical packaging techniques. The focus of this thesis is the investigation of, first, the design and analysis of microwave and millimeter-wave GaN MMICs and, second, the integration of those MMICs into usable waveguide components. The analysis, design and testing of various X-band (8-12 GHz) through H-band (170-260 GHz) GaN MMIC power amplifiers (PAs), including a V-band (40-75 GHz) voltage controlled oscillator, constitutes the majority of this work. Several PA designs utilizing high-efficiency techniques are analyzed, designed and tested. These examples include a 2nd-harmonic injection amplifier, a Class-E amplifier fabricated with a GaN-on-SiC 300 GHz fT process, and an example of the applicability of supply modulation with a Doherty power amplifier, all operating at 10 GHz. Two H-band GaN MMIC PAs are designed, one with integrated CPW-to-waveguide transitions for integration. The analysis of PA stability is especially important for wideband, high-fT devices, and a new way of analyzing stability is explored and experimentally validated. Last, the challenges of integrating MMICs operating at millimeter-wave frequencies are discussed and assemblies using additive and traditional manufacturing are demonstrated.

  20. The EUCLID/V1 Integrated Code for Safety Assessment of Liquid Metal Cooled Fast Reactors. Part 1: Basic Models

    NASA Astrophysics Data System (ADS)

    Mosunova, N. A.

    2018-05-01

    The article describes the basic models included in the EUCLID/V1 integrated code intended for safety analysis of liquid metal (sodium, lead, and lead-bismuth) cooled fast reactors using fuel rods with a gas gap and pellet dioxide, mixed oxide or nitride uranium-plutonium fuel under normal operation, under anticipated operational occurrences and accident conditions by carrying out interconnected thermal-hydraulic, neutronics, and thermal-mechanical calculations. Information about the Russian and foreign analogs of the EUCLID/V1 integrated code is given. Modeled objects, equation systems in differential form solved in each module of the EUCLID/V1 integrated code (the thermal-hydraulic, neutronics, fuel rod analysis module, and the burnup and decay heat calculation modules), the main calculated quantities, and also the limitations on application of the code are presented. The article also gives data on the scope of functions performed by the integrated code's thermal-hydraulic module, using which it is possible to describe both one- and two-phase processes occurring in the coolant. It is shown that, owing to the availability of the fuel rod analysis module in the integrated code, it becomes possible to estimate the performance of fuel rods in different regimes of the reactor operation. It is also shown that the models implemented in the code for calculating neutron-physical processes make it possible to take into account the neutron field distribution over the fuel assembly cross section as well as other features important for the safety assessment of fast reactors.

  1. An integrative neural model of social perception, action observation, and theory of mind.

    PubMed

    Yang, Daniel Y-J; Rosenblau, Gabriela; Keifer, Cara; Pelphrey, Kevin A

    2015-04-01

    In the field of social neuroscience, major branches of research have been instrumental in describing independent components of typical and aberrant social information processing, but the field as a whole lacks a comprehensive model that integrates different branches. We review existing research related to the neural basis of three key neural systems underlying social information processing: social perception, action observation, and theory of mind. We propose an integrative model that unites these three processes and highlights the posterior superior temporal sulcus (pSTS), which plays a central role in all three systems. Furthermore, we integrate these neural systems with the dual system account of implicit and explicit social information processing. Large-scale meta-analyses based on Neurosynth confirmed that the pSTS is at the intersection of the three neural systems. Resting-state functional connectivity analysis with 1000 subjects confirmed that the pSTS is connected to all other regions in these systems. The findings presented in this review are specifically relevant for psychiatric research especially disorders characterized by social deficits such as autism spectrum disorder. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Structural elucidation of Eucalyptus lignin and its dynamic changes in the cell walls during an integrated process of ionic liquids and successive alkali treatments.

    PubMed

    Li, Han-Yin; Wang, Chen-Zhou; Chen, Xue; Cao, Xue-Fei; Sun, Shao-Ni; Sun, Run-Cang

    2016-12-01

    An integrated process based on ionic liquids ([Bmim]Cl and [Bmim]OAc) pretreatment and successive alkali post-treatments (0.5, 2.0, and 4.0% NaOH at 90°C for 2 h) was performed to isolate lignins from Eucalyptus. The structural features and spatial distribution of lignin in the Eucalyptus cell wall were investigated thoroughly. Results revealed that the ionic liquids pretreatment promoted the isolation of alkaline lignin from the pretreated samples without obvious structural changes. Additionally, the integrated process resulted in syringyl-rich lignin macromolecules with more β-O-4' linkages and fewer phenolic hydroxyl groups. Confocal Raman microscopy analysis showed that the dissolution behavior of lignin varied across the morphologically distinct regions during the successive alkali treatments, and the dissolved lignin mainly stemmed from the secondary wall regions. These results provided some useful information for understanding the mechanisms of delignification during the integrated process and enhancing the potential utilization of lignin in future biorefineries. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Opportunity integrated assessment facilitating critical thinking and science process skills measurement on acid base matter

    NASA Astrophysics Data System (ADS)

    Sari, Anggi Ristiyana Puspita; Suyanta, LFX, Endang Widjajanti; Rohaeti, Eli

    2017-05-01

    Recognizing the importance of the development of critical thinking and science process skills, the instrument should give attention to the characteristics of chemistry. Therefore, constructing an accurate instrument for measuring those skills is important. However, integrated assessment instruments are limited in number. The purpose of this study is to validate an integrated assessment instrument for measuring students' critical thinking and science process skills on acid-base matter. The development model of the test instrument adapted the McIntire model. The sample consisted of 392 second-grade high school students in the academic year 2015/2016 in Yogyakarta. Exploratory Factor Analysis (EFA) was conducted to explore construct validity, whereas content validity was substantiated by Aiken's formula. The result shows that the KMO test is 0.714, which indicates sufficient items for each factor, and the Bartlett test is significant (a significance value of less than 0.05). Furthermore, the content validity coefficient, based on 8 experts, is 0.85. The findings support the integrated assessment instrument to measure critical thinking and science process skills on acid-base matter.
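
    For reference, Aiken's V, the content-validity statistic reported above, is V = sum(s) / (n(c - 1)), where each s is a rating minus the lowest possible rating, n is the number of raters and c the number of rating categories. A minimal sketch (the expert ratings below are hypothetical, not the study's data):

      # A minimal sketch of computing Aiken's V for one item.
      def aikens_v(ratings, lowest=1, categories=5):
          s = [r - lowest for r in ratings]
          return sum(s) / (len(ratings) * (categories - 1))

      # Hypothetical ratings by 8 experts on a 1-5 scale.
      print(f"V = {aikens_v([5, 4, 4, 5, 4, 5, 4, 4]):.2f}")   # -> V = 0.84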

  4. An integrative neural model of social perception, action observation, and theory of mind

    PubMed Central

    Yang, Daniel Y.-J.; Rosenblau, Gabriela; Keifer, Cara; Pelphrey, Kevin A.

    2016-01-01

    In the field of social neuroscience, major branches of research have been instrumental in describing independent components of typical and aberrant social information processing, but the field as a whole lacks a comprehensive model that integrates different branches. We review existing research related to the neural basis of three key neural systems underlying social information processing: social perception, action observation, and theory of mind. We propose an integrative model that unites these three processes and highlights the posterior superior temporal sulcus (pSTS), which plays a central role in all three systems. Furthermore, we integrate these neural systems with the dual system account of implicit and explicit social information processing. Large-scale meta-analyses based on Neurosynth confirmed that the pSTS is at the intersection of the three neural systems. Resting-state functional connectivity analysis with 1000 subjects confirmed that the pSTS is connected to all other regions in these systems. The findings presented in this review are specifically relevant for psychiatric research especially disorders characterized by social deficits such as autism spectrum disorder. PMID:25660957

  5. A statistical-based material and process guidelines for design of carbon nanotube field-effect transistors in gigascale integrated circuits.

    PubMed

    Ghavami, Behnam; Raji, Mohsen; Pedram, Hossein

    2011-08-26

    Carbon nanotube field-effect transistors (CNFETs) show great promise as building blocks of future integrated circuits. However, synthesizing single-walled carbon nanotubes (CNTs) with accurate chirality and exact positioning control has been widely acknowledged as an exceedingly complex task. Indeed, density and chirality variations in CNT growth can compromise the reliability of CNFET-based circuits. In this paper, we present a novel statistical compact model to estimate the failure probability of CNFETs to provide some material and process guidelines for the design of CNFETs in gigascale integrated circuits. We use measured CNT spacing distributions within the framework of detailed failure analysis to demonstrate that both the CNT density and the ratio of metallic to semiconducting CNTs play dominant roles in defining the failure probability of CNFETs. Besides, it is argued that the large-scale integration of these devices within an integrated circuit will be feasible only if a specific range of CNT density with an acceptable ratio of semiconducting to metallic CNTs can be adjusted in a typical synthesis process.
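
    The qualitative dependence described above can be sketched with a simple counting model. The following assumes Poisson-distributed CNT counts and an invented metallic-removal yield rather than the paper's measured spacing distributions: a device fails open if no semiconducting CNT crosses the gate, and fails short if any unremoved metallic CNT does.

      # A minimal sketch of CNFET failure probability vs. CNT density and
      # metallic fraction under an assumed Poisson model.
      import math

      def cnfet_failure_prob(density_per_um, width_um, p_semiconducting,
                             metallic_removal=0.999):
          n_mean = density_per_um * width_um
          n_semi = n_mean * p_semiconducting
          n_metal_left = n_mean * (1 - p_semiconducting) * (1 - metallic_removal)
          p_open = math.exp(-n_semi)             # zero semiconducting CNTs
          p_short = 1 - math.exp(-n_metal_left)  # a metallic CNT survives removal
          return p_open + p_short - p_open * p_short

      for d in (5, 10, 50):   # CNTs per micron of gate width
          print(f"density {d}/um -> P(fail) = {cnfet_failure_prob(d, 1.0, 2/3):.2e}")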

  6. FORCinel Version 3.0: An Integrated Environment for Processing, Analysis and Simulation of First-Order Reversal Curve Diagrams

    NASA Astrophysics Data System (ADS)

    Lascu, I.; Harrison, R. J.

    2016-12-01

    First-order reversal curve (FORC) diagrams are a powerful method to characterise the hysteresis properties of magnetic grain ensembles. Methods of processing, analysis and simulation of FORC diagrams have developed rapidly over the past few years, dramatically expanding their utility within rock magnetic research. Here we announce the latest release of FORCinel (Version 3.0), which integrates many of these developments into a unified, user-friendly package running within Igor Pro (www.wavemetrics.com). FORCinel v. 3.0 can be downloaded from https://wserv4.esc.cam.ac.uk/nanopaleomag/. The release will be accompanied by a series of video tutorials outlining each of the new features, including: i) improved work flow, with unified smoothing approach; ii) increased processing speed using multiple processors; iii) control of output resolution, enabling large datasets (> 500 FORCs) to be smoothed in a matter of seconds; iv) load, process, analyse and average multiple FORC diagrams; v) load and process non-gridded data and data acquired on non-PMC systems; vi) improved method for exploring optimal smoothing parameters; vii) interactive and un-doable data pretreatments; viii) automated detection and removal of measurement outliers; ix) improved interactive method for the generation and optimisation of colour scales; x) full integration with FORCem [1] - supervised quantitative unmixing of FORC diagrams using principal component analysis (PCA); xi) full integration with FORCulator [2] - micromagnetic simulation of FORC diagrams; xii) simulation of TRM acquisition using the kinetic Monte Carlo algorithm of Shcherbakov [3]. 1. Lascu, I., Harrison, R.J., Li, Y., Muraszko, J.R., Channell, J.E.T., Piotrowski, A.M., Hodell, D.A., 2015. Magnetic unmixing of first-order reversal curve diagrams using principal component analysis. Geochemistry, Geophys. Geosystems 16, 2900-2915. 2. Harrison, R.J., Lascu, I., 2014. FORCulator: A micromagnetic tool for simulating first-order reversal curve diagrams. Geochemistry Geophys. Geosystems 15, 4671-4691. 3. Shcherbakov, V.P., Lamash, B.E., Sycheva, N.K., 1995. Monte-Carlo modelling of thermoremanence acquisition in interacting single-domain grains. Phys. Earth Planet. Inter. 87, 197-211.
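
    The standard FORC-processing kernel behind such packages fits a second-order polynomial surface to M(Ha, Hb) over a local smoothing window and evaluates the FORC distribution rho = -(1/2) d²M/dHa dHb from the mixed-term coefficient. A minimal sketch on synthetic data (not FORCinel's code):

      # A minimal sketch of one FORC-distribution evaluation: least-squares fit
      # M ~ [1, Ha, Hb, Ha^2, Hb^2, Ha*Hb] over a window; rho = -a5 / 2.
      import numpy as np

      def forc_density(Ha, Hb, M):
          A = np.column_stack([np.ones_like(Ha), Ha, Hb, Ha**2, Hb**2, Ha * Hb])
          coeffs, *_ = np.linalg.lstsq(A, M, rcond=None)
          return -0.5 * coeffs[5]

      # Synthetic window of gridded (Ha, Hb) points around one evaluation point.
      ha, hb = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
      Ha, Hb = ha.ravel(), hb.ravel()
      M = 0.2 * Ha * Hb + 0.1 * Ha              # mixed derivative = 0.2
      print(f"rho = {forc_density(Ha, Hb, M):.3f}")   # expect -0.100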

  7. The integration of the risk management process with the lifecycle of medical device software.

    PubMed

    Pecoraro, F; Luzi, D

    2014-01-01

    The application of software in the Medical Device (MD) domain has become central to the improvement of diagnoses and treatments. The new European regulations, which specifically address software as an important component of MDs, require complex procedures to make software compliant with safety requirements, thereby introducing new challenges in the qualification and classification of MD software as well as in the performance of risk management activities. From this perspective, the aim of this paper is to propose an integrated framework that combines the activities to be carried out by the manufacturer to develop safe software within the development lifecycle, based on the regulatory requirements reported in US and European regulations as well as in the relevant standards and guidelines. A comparative analysis was carried out to identify the main issues related to the application of the current new regulations. In addition, standards and guidelines recently released to harmonise procedures for the validation of MD software were used to define the risk management activities to be carried out by the manufacturer during the software development process. This paper highlights the main issues related to the qualification and classification of MD software, providing an analysis of the different regulations applied in Europe and the US. A model that integrates the risk management process within the software development lifecycle is also proposed. It is based on regulatory requirements and considers software risk analysis as a central input to be managed by the manufacturer from the initial stages of software design, in order to prevent MD failures. Relevant changes in the process of MD development have been introduced with the recognition of software as an important component of MDs, as stated in regulations and standards. This implies highly iterative processes that integrate risk management into the framework of software development. It also makes it necessary to involve both medical and software engineering competences to safeguard patient and user safety.

  8. An approach to developing an integrated pyroprocessing simulator

    NASA Astrophysics Data System (ADS)

    Lee, Hyo Jik; Ko, Won Il; Choi, Sung Yeol; Kim, Sung Ki; Kim, In Tae; Lee, Han Soo

    2014-02-01

    Pyroprocessing has been studied for a decade as one of the promising fuel recycling options in Korea. We have built a pyroprocessing integrated inactive demonstration facility (PRIDE) to assess the feasibility of integrated pyroprocessing technology and the scale-up issues of the processing equipment. Even though such a facility cannot take the place of a real integrated facility using spent nuclear fuel (SF), many insights can be obtained from the world's largest integrated pyroprocessing operation. In order to complement and overcome the limits of such test-based research, a pyroprocessing modelling and simulation study began in 2011. The Korea Atomic Energy Research Institute (KAERI) suggested a modelling architecture for the development of a multi-purpose pyroprocessing simulator consisting of three tiers: a unit-process model, an operation model, and a plant-level model. The unit-process model can be addressed using governing equations or empirical equations as a continuous system (CS). In contrast, the operation model describes operational behaviors as a discrete event system (DES). The plant-level model integrates the unit-process and operation models with various analysis modules. An interface with different systems, the incorporation of different codes, a process-centered database design, and a dynamic material flow are discussed as necessary components for building a framework for the plant-level model. A sample model embodying methods for addressing the above engineering issues was thoroughly reviewed, verifying the architecture for building the plant-level model. By analyzing a combined process-and-operation model, we show that the suggested approach is effective for comprehensively understanding an integrated dynamic material flow. This paper describes the current status of the pyroprocessing modelling and simulation activity at KAERI and its predicted path forward.
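
    To make the CS/DES split concrete, the following toy sketch (all names and rates are invented, not KAERI's models) couples a continuous-style unit-process relation, which supplies batch processing times, with a discrete-event loop that sequences batches over parallel equipment units:

      import heapq

      def electroreduction_time(mass_kg, rate_kg_per_h=2.0):
          # Stand-in "governing equation": processing time scales with batch mass.
          return mass_kg / rate_kg_per_h

      def run_operation(batches, n_units=2):
          # Discrete-event loop: each batch claims the earliest-free unit.
          free_at = [(0.0, unit) for unit in range(n_units)]
          heapq.heapify(free_at)
          log = []
          for i, mass in enumerate(batches):
              start, unit = heapq.heappop(free_at)
              end = start + electroreduction_time(mass)
              heapq.heappush(free_at, (end, unit))
              log.append((i, unit, start, end))
          return log

      for batch, unit, start, end in run_operation([50, 50, 40, 60]):
          print(f"batch {batch} on unit {unit}: {start:.1f} h -> {end:.1f} h")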

  9. An approach to developing an integrated pyroprocessing simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyo Jik; Ko, Won Il; Choi, Sung Yeol

    Pyroprocessing has been studied for a decade as one of the promising fuel recycling options in Korea. We have built a pyroprocessing integrated inactive demonstration facility (PRIDE) to assess the feasibility of integrated pyroprocessing technology and the scale-up issues of the processing equipment. Even though such a facility cannot take the place of a real integrated facility using spent nuclear fuel (SF), many insights can be obtained from the world's largest integrated pyroprocessing operation. In order to complement and overcome the limits of such test-based research, a pyroprocessing modelling and simulation study began in 2011. The Korea Atomic Energy Research Institute (KAERI) suggested a modelling architecture for the development of a multi-purpose pyroprocessing simulator consisting of three tiers: a unit-process model, an operation model, and a plant-level model. The unit-process model can be addressed using governing equations or empirical equations as a continuous system (CS). In contrast, the operation model describes operational behaviors as a discrete event system (DES). The plant-level model integrates the unit-process and operation models with various analysis modules. An interface with different systems, the incorporation of different codes, a process-centered database design, and a dynamic material flow are discussed as necessary components for building a framework for the plant-level model. A sample model embodying methods for addressing the above engineering issues was thoroughly reviewed, verifying the architecture for building the plant-level model. By analyzing a combined process-and-operation model, we show that the suggested approach is effective for comprehensively understanding an integrated dynamic material flow. This paper describes the current status of the pyroprocessing modelling and simulation activity at KAERI and its predicted path forward.

  10. D.C. - ARC plasma generator for nonequilibrium plasmachemical processes

    NASA Astrophysics Data System (ADS)

    Kvaltin, J.

    1990-06-01

    An analysis of the conditions for generating nonequilibrium plasma for plasmachemical processes is presented, and a design for a d.c.-arc plasma generator based on an integral criterion is suggested. Measurements of the potentials along the plasma column of this generator are presented.

  11. 14 CFR § 1216.302 - Responsibilities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... responsible for overseeing and guiding NASA's integration of NEPA into the Agency's planning and decision... NEPA analysis into Agency planning and decision-making processes. The SEO shall monitor this process to... Agency's planning and decision making for all NASA activities. The HQ/EMD provides advice and...

  12. 10 CFR 70.62 - Safety program and integrated safety analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...; (iv) Potential accident sequences caused by process deviations or other events internal to the... of occurrence of each potential accident sequence identified pursuant to paragraph (c)(1)(iv) of this... have experience in nuclear criticality safety, radiation safety, fire safety, and chemical process...

  13. 10 CFR 70.62 - Safety program and integrated safety analysis.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...; (iv) Potential accident sequences caused by process deviations or other events internal to the... of occurrence of each potential accident sequence identified pursuant to paragraph (c)(1)(iv) of this... have experience in nuclear criticality safety, radiation safety, fire safety, and chemical process...

  14. 10 CFR 70.62 - Safety program and integrated safety analysis.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...; (iv) Potential accident sequences caused by process deviations or other events internal to the... of occurrence of each potential accident sequence identified pursuant to paragraph (c)(1)(iv) of this... have experience in nuclear criticality safety, radiation safety, fire safety, and chemical process...

  15. Analysis of the control structures for an integrated ethanol processor for proton exchange membrane fuel cell systems

    NASA Astrophysics Data System (ADS)

    Biset, S.; Nieto Deglioumini, L.; Basualdo, M.; Garcia, V. M.; Serra, M.

    The aim of this work is to investigate a good preliminary plantwide control structure for the process of hydrogen production from bioethanol for use in a proton exchange membrane (PEM) fuel cell, using only steady-state information. The objective is to keep the process at the optimal operating point, that is, performing energy integration to achieve maximum efficiency. Ethanol, produced from renewable feedstocks, feeds a fuel processor based on steam reforming, followed by high- and low-temperature shift reactors and preferential oxidation, which are coupled to a polymeric fuel cell. Applying steady-state simulation techniques and thermodynamic models, the performance of the complete system under the most typical perturbations has been evaluated for two different control structures. A sensitivity analysis of the key process variables, together with the rigorous operability requirements of the fuel cell, is taken into account in defining an acceptable plantwide control structure. This is the first work showing an alternative control structure applied to this kind of process.

  16. Integrating Robotic Observatories into Astronomy Labs

    NASA Astrophysics Data System (ADS)

    Ruch, Gerald T.

    2015-01-01

    The University of St. Thomas (UST) and a consortium of five local schools are using the UST Robotic Observatory, housing a 17-inch telescope, to develop labs and image processing tools that allow easy integration of observational labs into existing introductory astronomy curricula. Our lab design removes the burden of equipment ownership by sharing access to a common resource, and removes the burden of data processing by automating processing tasks that are not relevant to the learning objectives. Each laboratory exercise takes place over two lab periods. During period one, students design and submit observation requests via the lab website. Between periods, the telescope automatically acquires the data and our image processing pipeline produces data ready for student analysis. During period two, the students retrieve their data from the website and perform the analysis. The first lab, 'Weighing Jupiter,' was successfully implemented at UST and several of our partner schools. We are currently developing a second lab to measure the age of and distance to a globular cluster.

  17. GEOTAIL Spacecraft historical data report

    NASA Technical Reports Server (NTRS)

    Boersig, George R.; Kruse, Lawrence F.

    1993-01-01

    The purpose of this GEOTAIL Historical Report is to document ground processing operations information gathered on the GEOTAIL mission during processing activities at the Cape Canaveral Air Force Station (CCAFS). It is hoped that this report may aid management analysis, improve integration processing and forecasting of processing trends, and reduce real-time schedule changes. The GEOTAIL payload is the third Delta 2 Expendable Launch Vehicle (ELV) mission to document historical data. Comparisons of planned versus as-run schedule information are displayed. Information will generally fall into the following categories: (1) payload stay times (payload processing facility/hazardous processing facility/launch complex-17A); (2) payload processing times (planned, actual); (3) schedule delays; (4) integrated test times (experiments/launch vehicle); (5) unique customer support requirements; (6) modifications performed at facilities; (7) other appropriate information (Appendices A & B); and (8) lessons learned (reference Appendix C).

  18. Propofol disrupts functional interactions between sensory and high-order processing of auditory verbal memory.

    PubMed

    Liu, Xiaolin; Lauer, Kathryn K; Ward, Barney D; Rao, Stephen M; Li, Shi-Jiang; Hudetz, Anthony G

    2012-10-01

    Current theories suggest that disrupting cortical information integration may account for the mechanism of general anesthesia in suppressing consciousness. Human cognitive operations take place in hierarchically structured neural organizations in the brain. The process by which low-order neural representations of sensory stimuli become integrated in high-order cortices is also known as cognitive binding. Combining neuroimaging, cognitive neuroscience, and anesthetic manipulation, we examined how cognitive networks involved in auditory verbal memory are maintained in wakefulness, disrupted in propofol-induced deep sedation, and re-established in recovery. Inspired by the notion of cognitive binding, a functional magnetic resonance imaging-guided connectivity analysis was utilized to assess the integrity of functional interactions within and between different levels of the task-defined brain regions. Task-related responses persisted in the primary auditory cortex (PAC), but vanished in the inferior frontal gyrus (IFG) and premotor areas in deep sedation. For the connectivity analysis, seed regions representing sensory and high-order processing of the memory task were identified in the PAC and IFG. Propofol disrupted connections from the PAC seed to the frontal regions and thalamus, but not the connections from the IFG seed to a set of widely distributed brain regions in the temporal, frontal, and parietal lobes (with the exception of the PAC). These latter regions have been implicated in mediating verbal comprehension and memory. These results suggest that propofol disrupts cognition by blocking the projection of sensory information to high-order processing networks and thus preventing information integration. Such findings contribute to our understanding of anesthetic mechanisms as related to information integration in the brain. Copyright © 2011 Wiley Periodicals, Inc.

  19. 1991 NASA Life Support Systems Analysis workshop

    NASA Technical Reports Server (NTRS)

    Evanich, Peggy L.; Crabb, Thomas M.; Gartrell, Charles F.

    1992-01-01

    The 1991 Life Support Systems Analysis Workshop was sponsored by NASA Headquarters' Office of Aeronautics and Space Technology (OAST) to foster communication among NASA, industrial, and academic specialists, and to integrate their inputs and disseminate information to them. The overall objective of systems analysis within the Life Support Technology Program of OAST is to identify, guide the development of, and verify designs which will increase the performance of the life support systems on component, subsystem, and system levels for future human space missions. The specific goals of this workshop were to report on the status of systems analysis capabilities, to integrate the chemical processing industry technologies, and to integrate recommendations for future technology developments related to systems analysis for life support systems. The workshop included technical presentations, discussions, and interactive planning, with time allocated for discussion of both technology status and time-phased technology development recommendations. Key personnel from NASA, industry, and academia delivered inputs and presentations on the status and priorities of current and future systems analysis methods and requirements.

  20. Automated Hazard Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Riddle, F. J.

    2003-06-26

    The Automated Hazard Analysis (AHA) application is a software tool used to conduct job hazard screening and analysis of tasks to be performed in Savannah River Site facilities. The AHA application provides a systematic approach to the assessment of safety and environmental hazards associated with specific tasks, and the identification of controls, regulations, and other requirements needed to perform those tasks safely. AHA is to be integrated into existing Savannah River Site work control and job hazard analysis processes. Utilization of AHA will improve the consistency and completeness of hazard screening and analysis, and increase the effectiveness of the work planning process.

  1. Semantics-Based Interoperability Framework for the Geosciences

    NASA Astrophysics Data System (ADS)

    Sinha, A.; Malik, Z.; Raskin, R.; Barnes, C.; Fox, P.; McGuinness, D.; Lin, K.

    2008-12-01

    Interoperability between heterogeneous data, tools and services is required to transform data to knowledge. To meet geoscience-oriented societal challenges, such as the climate forcing induced by volcanic eruptions, we suggest the need to develop semantic interoperability for data, services, and processes. Because such scientific endeavors require the integration of multiple databases associated with global enterprises, implicit semantic-based integration is impossible. Instead, explicit semantics are needed to facilitate interoperability and integration. Although different types of integration models are available (syntactic or semantic), we suggest that semantic interoperability is likely to be the most successful pathway. Clearly, the geoscience community would benefit from utilization of existing XML-based data models, such as GeoSciML, WaterML, etc., to rapidly advance semantic interoperability and integration. We recognize that such integration will require a "meanings-based search, reasoning and information brokering", which will be facilitated through inter-ontology relationships (ontologies defined for each discipline). We suggest that markup languages (MLs) and ontologies can be seen as "data integration facilitators", working at different abstraction levels. Therefore, we propose to use an ontology-based data registration and discovery approach to complement markup languages through semantic data enrichment. Ontologies allow the use of formal and descriptive logic statements, which permits expressive query capabilities for data integration through reasoning. We have developed domain ontologies (EPONT) to capture the concepts behind data. EPONT ontologies are associated with existing ontologies such as SUMO, DOLCE and SWEET. Although significant efforts have gone into developing data (object) ontologies, we advance the idea of developing semantic frameworks for additional ontologies that deal with processes and services. This evolutionary step will facilitate the integrative capabilities of scientists as we examine the relationships between data and external factors, such as processes, that may influence our understanding of "why" certain events happen. We emphasize the need to go from the analysis of data to concepts related to scientific principles of thermodynamics, kinetics, heat flow, mass transfer, etc. Towards meeting these objectives, we report on a pair of related service engines: DIA (Discovery, Integration and Analysis) and SEDRE (Semantically-Enabled Data Registration Engine), which utilize ontologies for semantic interoperability and integration.

  2. Elements for successful sensor-based process control {Integrated Metrology}

    NASA Astrophysics Data System (ADS)

    Butler, Stephanie Watts

    1998-11-01

    Current productivity needs have stimulated the development of alternative metrology, control, and equipment maintenance methods. Specifically, sensor applications provide the opportunity to increase productivity, tighten control, reduce scrap, and improve maintenance schedules and procedures. Past experience indicates that a complete, integrated solution must be provided for sensor-based control to be used successfully in production. In this paper, Integrated Metrology is proposed as the term for an integrated solution that will result in a successful application of sensors for process control. This paper defines and explores the four perceived elements of successful sensor applications: business needs, integration, components, and form. Based upon analysis of existing, successful, commercially available controllers, the necessary business factors have been determined to be strong, measurable, industry-wide business needs whose solution is profitable and feasible. This paper examines why the key aspect of integration is the decision-making process. A detailed discussion is provided of the components of most importance to sensor-based control: decision-making methods, the 3 R's of sensors, and connectivity. A metric for one of the R's (resolution) is proposed to allow focus on this important aspect of measurement. A form for these integrated components is recommended which synergistically partitions the various aspects of control between the equipment and MES levels to achieve the desired benefits efficiently.

  3. Integrating ecosystem services analysis into scenario planning practice: accounting for street tree benefits with i-Tree valuation in Central Texas.

    PubMed

    Hilde, Thomas; Paterson, Robert

    2014-12-15

    Scenario planning continues to gain momentum in the United States as an effective process for building consensus on long-range community plans and creating regional visions for the future. However, efforts to integrate more sophisticated information into the analytical framework to help identify important ecosystem services have lagged in practice. This is problematic because understanding the tradeoffs of land consumption patterns on ecological integrity is central to mitigating the environmental degradation caused by land use change and new development. In this paper we describe how an ecosystem services valuation model, i-Tree, was integrated into a mainstream scenario planning software tool, Envision Tomorrow, to assess the benefits of public street trees for alternative future development scenarios. The tool is then applied to development scenarios from the City of Hutto, TX, a Central Texas Sustainable Places Project demonstration community. The integrated tool represents a methodological improvement for scenario planning practice, offers a way to incorporate ecosystem services analysis into mainstream planning processes, and serves as an example of how open source software tools can expand the range of issues available for community and regional planning consideration, even in cases where community resources are limited. The tool also offers room for future improvements; feasible options include canopy analysis of various future land use typologies, as well as a generalized street tree model for broader U.S. application. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Dissolution process analysis using model-free Noyes-Whitney integral equation.

    PubMed

    Hattori, Yusuke; Haruna, Yoshimasa; Otsuka, Makoto

    2013-02-01

    The drug dissolution process of solid dosage forms is theoretically described by the Noyes-Whitney-Nernst equation; in practice, however, the process is analyzed by assuming particular models, and such model-dependent methods are normally idealized and subject to limitations. In this study, a Noyes-Whitney integral equation was proposed and applied to represent the drug dissolution profiles of a solid formulation via the non-linear least squares (NLLS) method. The integral equation is a model-free formula involving the dissolution rate constant as a parameter. In the present study, several solid formulations were prepared by varying the blending time of magnesium stearate (MgSt) with theophylline monohydrate, α-lactose monohydrate, and crystalline cellulose. The formula represented the dissolution profiles excellently, and thereby the rate constant and specific surface area could be obtained by the NLLS method. Because long blending times coated the particle surfaces with MgSt, water permeation was found to be disturbed by the MgSt layer, which hindered dissociation into disintegrant particles. In the end, the solid formulations did not disintegrate; nevertheless, the specific surface area gradually increased during the course of dissolution. X-ray CT observation supported this result, demonstrating that surface roughening dominated over dissolution, so that the specific surface area of the solid formulation gradually increased. Copyright © 2012 Elsevier B.V. All rights reserved.
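
    For the special case of a constant surface area S, integrating dC/dt = kS(Cs - C) gives a simple exponential profile that can be fitted by NLLS; the sketch below (with purely illustrative data, and the surface area lumped into a single constant rather than the paper's time-varying treatment) shows the mechanics:

      import numpy as np
      from scipy.optimize import curve_fit

      def dissolved_fraction(t, kS):
          # Integrated Noyes-Whitney profile for constant surface area S:
          # dC/dt = k*S*(Cs - C)  =>  C(t)/Cs = 1 - exp(-k*S*t)
          return 1.0 - np.exp(-kS * t)

      t = np.array([0.0, 5, 10, 20, 30, 45, 60])               # time, min
      f = np.array([0.0, 0.28, 0.49, 0.73, 0.86, 0.95, 0.98])  # illustrative data

      (kS_fit,), _ = curve_fit(dissolved_fraction, t, f, p0=[0.05])
      print(f"fitted k*S = {kS_fit:.3f} 1/min")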

  5. Deployment Process, Mechanization, and Testing for the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Iskenderian, Ted

    2004-01-01

    NASA's Mars Exploration Rover (MER) robotic prospectors were produced in an environment of unusually challenging schedule, volume, and mass restrictions. The technical challenges pushed the system's design towards extensive integration of function, which resulted in complex system engineering issues. One example of the system's integrated complexity can be found in the deployment process for the rover. Part of this process, rover "standup", is outlined in this paper. Particular attention is given to the role and design of the Rover Lift Mechanism (RLM). Analysis methods are presented and compared to test results. It is shown that because prudent design principles were followed, a robust mechanism was created that minimized the duration of integration and test, and enabled recovery without perturbing related systems when reasonably foreseeable problems did occur. Examples of avoidable, unnecessary difficulty are also presented.

  6. System integration of marketable subsystems

    NASA Technical Reports Server (NTRS)

    1978-01-01

    These monthly reports, covering the period February 1978 through June 1978, describe the progress made in the major areas of the program. The areas covered are: systems integration of marketable subsystems; development, design, and building of site data acquisition subsystems; development and operation of the central data processing system; operation of the MSFC Solar Test Facility; and systems analysis.

  7. Web-Based Tools for Designing and Developing Teaching Materials for Integration of Information Technology into Instruction

    ERIC Educational Resources Information Center

    Chang, Kuo-En; Sung, Yao-Ting; Hou, Huei-Tse

    2006-01-01

    Educational software for teachers is an important, yet usually ignored, link for integrating information technology into classroom instruction. This study builds a web-based teaching material design and development system. The process in the system is divided into four stages: analysis, design, development, and practice. Eight junior high school…

  8. An APOS Analysis of Natural Science Students' Understanding of Integration

    ERIC Educational Resources Information Center

    Maharaj, Aneshkumar

    2014-01-01

    This article reports on a study which used the APOS (action-process-object-schema) Theory framework and a classification of errors to investigate university students' understanding of the integration concept and its applications. Research was done at the Westville Campus of the University of KwaZulu-Natal in South Africa. The relevant rules for…

  9. [Thermodynamic analysis of water adsorption and desorption process of Chinese herbal decoction pieces].

    PubMed

    Cheng, Lin; Luo, Xiao-Jian; Han, Xiu-Lin; Wang, Wen-Kai; Rao, Xiao-Yong; Xu, Shao-Zhong; He, Yan

    2016-09-01

    Based on the basic theory of thermodynamics, the thermodynamic parameters and related equations for the water adsorption and desorption process of Chinese herbal decoction pieces were established, and their water absorption and desorption characteristics were analyzed. The physical significance of the thermodynamic parameters, such as the differential adsorption enthalpy, differential adsorption entropy, integral adsorption enthalpy, integral adsorption entropy and free energy of adsorption, is discussed in this paper to provide a theoretical basis for research on the water adsorption and desorption mechanism, optimal drying process parameters, storage conditions and packaging methods of Chinese herbal decoction pieces. Copyright© by the Chinese Pharmaceutical Association.
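
    For reference, one standard formulation of these quantities (not necessarily the paper's exact parameterization) obtains the differential adsorption enthalpy from sorption isotherms measured at several temperatures via a Clausius-Clapeyron-type relation at fixed moisture content X, with the free energy and differential entropy following from the water activity a_w:

      \Delta H_{d} \;=\; -R\left[\frac{\partial \ln a_{w}}{\partial (1/T)}\right]_{X},
      \qquad
      \Delta G \;=\; R\,T\,\ln a_{w},
      \qquad
      \Delta S_{d} \;=\; \frac{\Delta H_{d} - \Delta G}{T}

    where R is the gas constant; integral quantities follow by averaging the differential ones over moisture content.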

  10. The Route to an Integrative Associative Memory Is Influenced by Emotion

    PubMed Central

    Murray, Brendan D.; Kensinger, Elizabeth A.

    2014-01-01

    Though the hippocampus typically has been implicated in processes related to associative binding, special types of associations – such as those created by integrative mental imagery – may be supported by processes implemented in other medial temporal-lobe or sensory processing regions. Here, we investigated what neural mechanisms underlie the formation and subsequent retrieval of integrated mental images, and whether those mechanisms differ based on the emotionality of the integration (i.e., whether it contains an emotional item or not). Participants viewed pairs of words while undergoing a functional MRI scan. They were instructed to imagine the two items separately from one another (“non-integrative” study) or as a single, integrated mental image (“integrative” study). They provided ratings of how successful they were at generating vivid images that fit the instructions. They were then given a surprise associative recognition test, also while undergoing an fMRI scan. The cuneus showed parametric correspondence to increasing imagery success selectively during encoding and retrieval of emotional integrations, while the parahippocampal gyri and prefrontal cortices showed parametric correspondence during the encoding and retrieval of non-emotional integrations. Connectivity analysis revealed that selectively during negative integration, left amygdala activity was negatively correlated with frontal and hippocampal activity. These data indicate that individuals utilize two different neural routes for forming and retrieving integrations depending on their emotional content, and they suggest a potentially disruptive role for the amygdala on frontal and medial-temporal regions during negative integration. PMID:24427267

  11. An application of computer aided requirements analysis to a real time deep space system

    NASA Technical Reports Server (NTRS)

    Farny, A. M.; Morris, R. V.; Hartsough, C.; Callender, E. D.; Teichroew, D.; Chikofsky, E.

    1981-01-01

    The entire procedure of incorporating the requirements and goals of a space flight project into integrated, time-ordered sequences of spacecraft commands is called the uplink process. The Uplink Process Control Task (UPCT) was created to examine the uplink process and determine ways to improve it. The Problem Statement Language/Problem Statement Analyzer (PSL/PSA), designed to assist the designer/analyst/engineer in the preparation of specifications of an information system, is used as a supporting tool to aid in the analysis. Attention is given to a definition of the uplink process, the definition of PSL/PSA, the construction of a PSA database, the value of analysis to the study of the uplink process, and the PSL/PSA lessons learned.

  12. Introduction to Radar Signal and Data Processing: The Opportunity

    DTIC Science & Technology

    2006-09-01

    …SpA), Director of Analysis of Integrated Systems Group, Via Tiburtina Km. 12.400, 00131 Rome, ITALY; e-mail: afarina@selex-si.com. Key words: radar signal processing, data processing, adaptivity, space-time adaptive processing, knowledge-based systems, CFAR. 1. SUMMARY: This paper introduces the lecture series dedicated to knowledge-based radar signal and data processing. Knowledge-based expert system (KBS) is in the realm of…

  13. Integrated Droplet-Based Microextraction with ESI-MS for Removal of Matrix Interference in Single-Cell Analysis.

    PubMed

    Zhang, Xiao-Chao; Wei, Zhen-Wei; Gong, Xiao-Yun; Si, Xing-Yu; Zhao, Yao-Yao; Yang, Cheng-Dui; Zhang, Si-Chun; Zhang, Xin-Rong

    2016-04-29

    Integrating droplet-based microfluidics with mass spectrometry is essential to high-throughput and multiple analysis of single cells. Nevertheless, matrix effects such as the interference of culture medium and intracellular components influence the sensitivity and the accuracy of results in single-cell analysis. To resolve this problem, we developed a method that integrated droplet-based microextraction with single-cell mass spectrometry. Specific extraction solvent was used to selectively obtain intracellular components of interest and remove interference of other components. Using this method, UDP-Glc-NAc, GSH, GSSG, AMP, ADP and ATP were successfully detected in single MCF-7 cells. We also applied the method to study the change of unicellular metabolites in the biological process of dysfunctional oxidative phosphorylation. The method could not only realize matrix-free, selective and sensitive detection of metabolites in single cells, but also have the capability for reliable and high-throughput single-cell analysis.

  14. Evolving Postmortems as Teams Evolve Through TxP

    DTIC Science & Technology

    2014-12-01

    Instead of waiting for SEI to compile enough data to repeat this kind of analysis for the system integration test domain, a system integration test team… and stand up their Team Test Process (TTP). Some abilities, like planning how many mistakes will be made by the team in producing a test procedure, can only be performed after the team has determined a) which mistakes count in the domain of system integration testing, b) what units to use to…

  15. Henson v1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morozov, Dmitriy; Lukic, Zarija

    2016-04-01

    Modern scientific and engineering simulations track the time evolution of billions of elements. For such large runs, storing most time steps for later analysis is not a viable strategy. It is far more efficient to analyze the simulation data while it is still in memory. The developers present a novel design for running multiple codes in situ: using coroutines and position-independent executables, they enable cooperative multitasking between simulation and analysis, allowing the same executables to post-process simulation output as well as to process it on the fly, both in situ and in transit. They present Henson, an implementation of their design, and illustrate its versatility by tackling analysis tasks with different computational requirements. The design differs significantly from existing frameworks and offers an efficient and robust approach to integrating multiple codes on modern supercomputers. The presented techniques can also be integrated into other in situ frameworks.
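
    Henson itself couples position-independent executables; the generator-based Python sketch below only illustrates the cooperative-multitasking control flow, with a toy simulation yielding in-memory state to an analysis consumer at every step:

      import numpy as np

      def simulation(n_steps, n_particles):
          # Toy producer: yields its state after every time step, handing
          # control to the analysis side without writing anything to disk.
          state = np.random.rand(n_particles)
          for step in range(n_steps):
              state += 0.01 * np.random.randn(n_particles)
              yield step, state

      def analysis(sim):
          # In situ consumer: runs interleaved with the producer above.
          for step, state in sim:
              print(f"step {step}: mean = {state.mean():.4f}")

      analysis(simulation(n_steps=5, n_particles=1_000))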

  16. Design and implementation of spatial knowledge grid for integrated spatial analysis

    NASA Astrophysics Data System (ADS)

    Liu, Xiangnan; Guan, Li; Wang, Ping

    2006-10-01

    Supported by a spatial information grid (SIG), the spatial knowledge grid (SKG) for integrated spatial analysis utilizes middleware technology to construct the spatial information grid computation environment and spatial information service system, develops spatial-entity-oriented spatial data organization technology, and carries out in-depth computation of spatial structure and spatial process patterns on the basis of the Grid GIS infrastructure, the spatial data grid and the spatial information grid (specialized definition). At the same time, it realizes complex spatial pattern expression and spatial function process simulation by taking spatial intelligent agents as the core of spatially active computation. Moreover, through the establishment of a virtual geographical environment with man-machine interactivity, complex spatial modeling, networked cooperative work and knowledge-driven spatial community decision making are achieved. The framework of the SKG is discussed systematically in this paper, and its implementation flow and key technologies are presented with examples of overlay analysis.

  17. NMRPro: an integrated web component for interactive processing and visualization of NMR spectra.

    PubMed

    Mohamed, Ahmed; Nguyen, Canh Hao; Mamitsuka, Hiroshi

    2016-07-01

    The popularity of NMR spectroscopy in metabolomics and natural products research has driven the development of an array of NMR spectral analysis tools and databases. Web applications in particular have become widely used because they are platform-independent and easy to extend through reusable web components. Currently available web applications provide analysis of NMR spectra, but they still lack the necessary processing and interactive visualization functionalities. To overcome these limitations, we present NMRPro, a web component that can be easily incorporated into current web applications, enabling easy-to-use online interactive processing and visualization. NMRPro integrates server-side processing with client-side interactive visualization through three parts: a Python package to efficiently process large NMR datasets on the server side, a Django app managing server-client interaction, and SpecdrawJS for client-side interactive visualization. Demo and installation instructions are available at http://mamitsukalab.org/tools/nmrpro/. Contact: mohamed@kuicr.kyoto-u.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  18. Development and fabrication of a solar cell junction processing system

    NASA Technical Reports Server (NTRS)

    Banker, S.

    1982-01-01

    Development of a pulsed electron beam subsystem, a wafer transport system, and an ion implanter is discussed. Junction processing system integration and a cost analysis are reviewed. Maintenance of the electron beam processor and of the experimental test unit of the non-mass-analyzed ion implanter is also reviewed.

  19. Integration, status and potential of environmental justice and the social impact assessment process in transportation development in Missouri

    DOT National Transportation Integrated Search

    2003-12-01

    This research examines the Social Impact Assessment Process at the Missouri Department of Transportation as directed by the : National Environmental Policy Act (NEPA). The analysis includes an examination of the influences of the more recent directiv...

  20. Cognitive Processes in Dissociation: An Analysis of Core Theoretical Assumptions

    ERIC Educational Resources Information Center

    Giesbrecht, Timo; Lilienfeld, Scott O.; Lynn, Steven Jay; Merckelbach, Harald

    2008-01-01

    Dissociation is typically defined as the lack of normal integration of thoughts, feelings, and experiences into consciousness and memory. The present article critically evaluates the research literature on cognitive processes in dissociation. The authors' review indicates that dissociation is characterized by subtle deficits in neuropsychological…

  1. 10 CFR 70.62 - Safety program and integrated safety analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) Radiological hazards related to possessing or processing licensed material at its facility; (ii) Chemical hazards of licensed material and hazardous chemicals produced from licensed material; (iii) Facility... performed by a team with expertise in engineering and process operations. The team shall include at least...

  2. ETO - ENGINEERING TRADE-OFFS (SYSTEMS ANALYSIS BRANCH, SUSTAINABLE TECHNOLOGY DIVISION, NRMRL)

    EPA Science Inventory

    The ETO - Engineering Trade-Offs program aims to develop a new, integrated decision-making approach to compare and contrast two or more states of being: a benchmark and an alternative, a change in a production process, or alternative processes or products. ETO highlights the difference in...

  3. Using the Unified Modelling Language (UML) to guide the systemic description of biological processes and systems.

    PubMed

    Roux-Rouquié, Magali; Caritey, Nicolas; Gaubert, Laurent; Rosenthal-Sabroux, Camille

    2004-07-01

    One of the main issues in Systems Biology is dealing with semantic data integration. Previously, we examined the requirements for a reference conceptual model to guide semantic integration based on systemic principles. In the present paper, we examine the usefulness of the Unified Modelling Language (UML) for describing and specifying biological systems and processes. This yields unambiguous representations of biological systems that are suitable for translation into mathematical and computational formalisms, enabling analysis, simulation and prediction of these systems' behaviours.

  4. On the way toward systems biology of Aspergillus fumigatus infection.

    PubMed

    Albrecht, Daniela; Kniemeyer, Olaf; Mech, Franziska; Gunzer, Matthias; Brakhage, Axel; Guthke, Reinhard

    2011-06-01

    The pathogenicity of Aspergillus fumigatus is multifactorial, so global studies are essential for understanding the infection process. To this end, a data warehouse was established in which genome sequence, transcriptome and proteome data are stored. These data are analyzed to elucidate virulence determinants. The data analysis workflow starts with pre-processing, including the imputation of missing values and normalization. The last step is the identification of differentially expressed genes/proteins as candidates for further analysis, in particular for functional categorization and correlation studies. Sequence data and other prior knowledge extracted from databases are integrated to support the inference of gene regulatory networks associated with pathogenicity. This knowledge-assisted data analysis aims at establishing mathematical models with predictive strength to assist further experimental work. Recently, first steps were taken to extend the integrative data analysis and computational modeling by evaluating spatio-temporal data (movies) that monitor the interactions of A. fumigatus morphotypes (e.g. conidia) with host immune cells. Copyright © 2011 Elsevier GmbH. All rights reserved.

  5. Spacelab operations planning. [ground handling, launch, flight and experiments

    NASA Technical Reports Server (NTRS)

    Lee, T. J.

    1976-01-01

    The paper reviews NASA planning in the fields of ground, launch and flight operations and experiment integration to operate Spacelab effectively. Payload mission planning is discussed, taking into consideration orbital analysis and the mission of a multiuser payload, which may be either single- or multidiscipline. Payload analytical integration is considered: an active process of analyses ensuring that the experiment payload is compatible with the mission objectives and profile and with ground and flight operations, and that the resource demands upon Spacelab can be satisfied. Software integration is touched upon, and the major integration levels in ground operational processing of Spacelab and its experimental payloads are examined. Flight operations, encompassing the operation of the Space Transportation System and the payload, are discussed, as are the initial Spacelab missions. Charts and diagrams illustrate the various planning areas.

  6. Model-based engineering for laser weapons systems

    NASA Astrophysics Data System (ADS)

    Panthaki, Malcolm; Coy, Steve

    2011-10-01

    The Comet Performance Engineering Workspace is an environment that enables integrated, multidisciplinary modeling and design/simulation process automation. One of the many multidisciplinary applications of the Comet Workspace is the integrated Structural, Thermal, Optical Performance (STOP) analysis of complex, multidisciplinary space systems containing electro-optical (EO) sensors, such as those designed and developed by and for NASA and the Department of Defense. The Comet™ software is currently able to integrate performance simulation data and processes from a wide range of 3-D CAD and analysis software programs, including CODE V™ from Optical Research Associates and SigFit™ from Sigmadyne Inc., which are used to simulate the optics performance of EO sensor systems in space-borne applications. Over the past year, Comet Solutions has been working with MZA Associates of Albuquerque, NM, under a contract with the Air Force Research Laboratory. This funded effort is a "risk reduction effort" to help determine whether the combination of Comet and WaveTrain™, a wave-optics systems engineering analysis environment developed and maintained by MZA Associates and used by the Air Force Research Laboratory, will result in an effective Model-Based Engineering (MBE) environment for the analysis and design of laser weapons systems. This paper will review the results of this effort and future steps.

  7. Determining the Partial Pressure of Volatile Components via Substrate-Integrated Hollow Waveguide Infrared Spectroscopy with Integrated Microfluidics.

    PubMed

    Kokoric, Vjekoslav; Theisen, Johannes; Wilk, Andreas; Penisson, Christophe; Bernard, Gabriel; Mizaikoff, Boris; Gabriel, Jean-Christophe P

    2018-04-03

    A microfluidic system combined with substrate-integrated hollow waveguide (iHWG) vapor-phase infrared spectroscopy has been developed for evaluating the chemical activity of volatile compounds dissolved in complex fluids. Chemical activity is an important yet rarely exploited parameter in process analysis and control. Access to chemical activity parameters enables systematic studies of the phase diagrams of complex fluids, the detection of aggregation processes, etc. The instrumental approach developed herein uniquely enables controlled evaporation/permeation from a sample solution into a hollow waveguide structure and the analysis of the partial pressures of volatile constituents. For the example of a binary system, it was shown that the chemical activity may be deduced from partial pressure measurements at thermodynamic equilibrium conditions. The combined microfluidic-iHWG mid-infrared sensor system (μFLUID-IR) allows the realization of such studies in the absence of any perturbations provoked by sampling operations, which are unavoidable using state-of-the-art analytical techniques such as headspace gas chromatography. For demonstration purposes, a water/ethanol mixture was investigated, and the derived data were cross-validated against established literature values at different mixture ratios. Beyond perturbation-free measurements, a sensor response time of <150 s (t90) and a recovery time of <300 s (t_recovery) have been achieved, which substantiates the utility of μFLUID-IR for future process analysis-and-control applications.
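
    The textbook relation underlying such a deduction (stated generically here, not as the paper's calibration procedure) links the measured equilibrium partial pressure p_i to the activity and, via modified Raoult's law, to the activity coefficient:

      a_i \;=\; \frac{p_i}{p_i^{\mathrm{sat}}(T)},
      \qquad
      \gamma_i \;=\; \frac{a_i}{x_i} \;=\; \frac{p_i}{x_i\,p_i^{\mathrm{sat}}(T)}

    where p_i^sat(T) is the pure-component saturation pressure and x_i the liquid-phase mole fraction.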

  8. Improved Anomaly Detection using Integrated Supervised and Unsupervised Processing

    NASA Astrophysics Data System (ADS)

    Hunt, B.; Sheppard, D. G.; Wetterer, C. J.

    Two broad signal-processing technologies are applicable to space object feature identification using non-resolved imagery: supervised processing analyzes a large set of data for common characteristics that can then be used to identify, transform, and extract information from new data taken of the same class (e.g., a support vector machine); unsupervised processing utilizes detailed physics-based models that generate comparison data that can then be used to estimate parameters presumed to be governed by the same models (e.g., estimation filters). Both processes have been used in non-resolved space object identification and yield similar results, yet arrive at them through vastly different means. The goal of integrating the two is to achieve even greater performance by building on this process diversity. Specifically, both supervised and unsupervised processing will jointly operate on the analysis of brightness (radiometric flux intensity) measurements reflected by space objects and observed by a ground station, to determine whether a particular day conforms to a nominal operating mode (as determined from a training set) or exhibits anomalous behavior where a particular parameter (e.g., attitude, solar panel articulation angle) has changed in some way. It is demonstrated in a variety of scenarios that the integrated process achieves greater performance than either of the separate processes alone.
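
    A minimal sketch of such score-level fusion, assuming a one-class SVM stands in for the supervised side and a normalized physics-model residual for the unsupervised side (features, weights, and data are invented placeholders, not the paper's pipeline):

      import numpy as np
      from sklearn.svm import OneClassSVM

      rng = np.random.default_rng(1)

      # Supervised side: a one-class SVM trained on features of nominal days.
      nominal_features = rng.normal(0.0, 1.0, size=(200, 8))
      svm = OneClassSVM(nu=0.05, gamma="scale").fit(nominal_features)

      def fused_anomaly_score(x, physics_residual, w=0.5):
          # Unsupervised side enters as a normalized residual between the
          # measured light curve and a physics-based prediction.
          svm_score = -svm.decision_function(x.reshape(1, -1))[0]  # higher = odder
          return w * svm_score + (1.0 - w) * abs(physics_residual)

      x_new = rng.normal(0.0, 1.0, size=8)
      print(fused_anomaly_score(x_new, physics_residual=0.3))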

  9. Perceiving integration of a complementary medicine service within a general surgery department through documentation of consultations: a thematic analysis.

    PubMed

    Schiff, Elad; Ben-Arye, Eran; Attias, Samuel; Sroka, Gideon; Matter, Ibrahim; Keshet, Yael

    2012-12-01

    This study aims to examine the meaning and practical implications of the integration of a complementary medicine-based surgery service in a hospital setting (CISS - Complementary/Integrative Surgery Service) through analysis of consultation reports associated with this service. Thematic analysis was used to evaluate CISS consultation reports in a hospital electronic consultant charting system during the first half-year of the service's activity. 304 consultation reports were analyzed. Nurses initiated significantly more consultations than physicians (55% vs 7%). Consultation requests became gradually more focused on specific symptoms, possibly manifesting a better understanding of the scope of complementary medicine in the surgery setting. CISS practitioners responded in more biomedical language over time, albeit offering a more holistic perspective on patients' needs as well as clarifications regarding the nature of the treatment they provided. Diverse communication patterns in consultations evolved over time, representing dynamics at multiple levels of integration of the CISS. Documented communication through consultations can provide a window onto the process of integration of complementary medicine-based services in health systems. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  10. Systems Analysis of Physical Absorption of CO2 in Ionic Liquids for Pre-Combustion Carbon Capture.

    PubMed

    Zhai, Haibo; Rubin, Edward S

    2018-04-17

    This study develops an integrated technical and economic modeling framework to investigate the feasibility of ionic liquids (ILs) for pre-combustion carbon capture. The IL 1-hexyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide is modeled as a potential physical solvent for CO2 capture at integrated gasification combined cycle (IGCC) power plants. The analysis reveals that the energy penalty of the IL-based capture system comes mainly from compression of the process and product streams and from solvent pumping, while the major capital cost components are the compressors and absorbers. On the basis of the plant-level analysis, the cost of CO2 avoided by the IL-based capture and storage system is estimated to be $63 per tonne of CO2. Technical and economic comparisons between IL- and Selexol-based capture systems at the plant level show that an IL-based system could be a feasible option for CO2 capture. Improving the CO2 solubility of ILs can simplify the capture process configuration and lower the process energy and cost penalties, further enhancing the viability of this technology.
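
    The plant-level metric quoted here is conventionally defined by comparing the levelized cost of electricity (LCOE) and the CO2 emission rate e (tonnes per MWh) of the capture plant against a reference plant without capture:

      \mathrm{Cost\ of\ CO_2\ avoided}\ (\$/\mathrm{t\,CO_2})
      \;=\;
      \frac{\mathrm{LCOE}_{\mathrm{capture}} - \mathrm{LCOE}_{\mathrm{ref}}}
           {e_{\mathrm{ref}} - e_{\mathrm{capture}}}

    With purely illustrative numbers (not taken from the paper), a $30/MWh LCOE increase combined with 0.48 t CO2/MWh avoided would give about $62.5/t CO2.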

  11. Planning for Success: Integrating Analysis with Decision Making.

    ERIC Educational Resources Information Center

    Goho, James; Webb, Ken

    2003-01-01

    Describes a successful strategic planning process at a large community college, which linked the analytic inputs of research with the authority and intuition of leaders. Reports key factors attributed to the process' success, including a collegial and organized structure, detailed project management plans, and confidence in the environmental scan.…

  12. Oak Ridge Computerized Hierarchical Information System (ORCHIS) status report, July 1973

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, A.A.

    1974-01-01

    This report summarizes the concepts, software, and contents of the Oak Ridge Computerized Hierarchical Information System. This data analysis and text processing system was developed as an integrated, comprehensive information processing capability to meet the needs of an on-going multidisciplinary research and development organization. (auth)

  13. Integrating Social Media Technologies in Higher Education: Costs-Benefits Analysis

    ERIC Educational Resources Information Center

    Okoro, Ephraim

    2012-01-01

    Social networking and electronic channels of communication are effective tools in the process of teaching and learning and have increasingly improved the quality of students' learning outcomes in higher education in recent years. The process encourages students' active engagement, collaboration, and participation in class activities and group…

  14. What can one sample tell us? Stable isotopes can assess complex processes in national assessments of lakes, rivers and streams.

    EPA Science Inventory

    Stable isotopes can be very useful in large-scale monitoring programs because samples for isotopic analysis are easy to collect, and isotopes integrate information about complex processes such as evaporation from water isotopes and denitrification from nitrogen isotopes. Traditi...

  15. Rapid self-assembly of DNA on a microfluidic chip

    PubMed Central

    Zheng, Yao; Footz, Tim; Manage, Dammika P; Backhouse, Christopher James

    2005-01-01

    Background: DNA self-assembly methods have played a major role in enabling methods for acquiring genetic information without having to resort to sequencing, a relatively slow and costly procedure. However, even self-assembly processes tend to be very slow when they rely upon diffusion on a large scale. Miniaturisation and integration therefore hold the promise of greatly increasing this speed of operation. Results: We have developed a rapid method for implementing the self-assembly of DNA within a microfluidic system by electrically extracting the DNA from an environment containing an uncharged denaturant. By controlling the parameters of the electrophoretic extraction and subsequent analysis of the DNA, we are able to control when hybridisation occurs as well as the degree of hybridisation. By avoiding off-chip processing or long thermal treatments, we are able to perform this hybridisation rapidly and can perform hybridisation, sizing, heteroduplex analysis and single-stranded conformation analysis within a matter of minutes. The rapidity of this analysis allows the sampling of transient effects that may improve the sensitivity of mutation detection. Conclusions: We believe that this method will aid the integration of self-assembly methods upon microfluidic chips. The speed of this analysis also appears to provide information upon the dynamics of the self-assembly process. PMID:15717935

  16. Rapid Prototyping Integrated With Nondestructive Evaluation and Finite Element Analysis

    NASA Technical Reports Server (NTRS)

    Abdul-Aziz, Ali; Baaklini, George Y.

    2001-01-01

    Most reverse engineering approaches involve imaging or digitizing an object then creating a computerized reconstruction that can be integrated, in three dimensions, into a particular design environment. Rapid prototyping (RP) refers to the practical ability to build high-quality physical prototypes directly from computer aided design (CAD) files. Using rapid prototyping, full-scale models or patterns can be built using a variety of materials in a fraction of the time required by more traditional prototyping techniques (refs. 1 and 2). Many software packages have been developed and are being designed to tackle the reverse engineering and rapid prototyping issues just mentioned. For example, image processing and three-dimensional reconstruction visualization software such as Velocity2 (ref. 3) are being used to carry out the construction process of three-dimensional volume models and the subsequent generation of a stereolithography file that is suitable for CAD applications. Producing three-dimensional models of objects from computed tomography (CT) scans is becoming a valuable nondestructive evaluation methodology (ref. 4). Real components can be rendered and subjected to temperature and stress tests using structural engineering software codes. For this to be achieved, accurate high-resolution images have to be obtained via CT scans and then processed, converted into a traditional file format, and translated into finite element models. Prototyping a three-dimensional volume of a composite structure by reading in a series of two-dimensional images generated via CT and by using and integrating commercial software (e.g. Velocity2, MSC/PATRAN (ref. 5), and Hypermesh (ref. 6)) is being applied successfully at the NASA Glenn Research Center. The building process from structural modeling to the analysis level is outlined in reference 7. Subsequently, a stress analysis of a composite cooling panel under combined thermomechanical loading conditions was performed to validate this process.

  17. Toshiba TDF-500 High Resolution Viewing And Analysis System

    NASA Astrophysics Data System (ADS)

    Roberts, Barry; Kakegawa, M.; Nishikawa, M.; Oikawa, D.

    1988-06-01

    A high resolution, operator interactive, medical viewing and analysis system has been developed by Toshiba and Bio-Imaging Research. This system provides many advanced features including high resolution displays, a very large image memory and advanced image processing capability. In particular, the system provides CRT frame buffers capable of update in one frame period, an array processor capable of image processing at operator interactive speeds, and a memory system capable of updating multiple frame buffers at frame rates whilst supporting multiple array processors. The display system provides 1024 x 1536 display resolution at 40Hz frame and 80Hz field rates. In particular, the ability to provide whole or partial update of the screen at the scanning rate is a key feature. This allows multiple viewports or windows in the display buffer with both fixed and cine capability. To support image processing features such as windowing, pan, zoom, minification, filtering, ROI analysis, multiplanar and 3D reconstruction, a high performance CPU is integrated into the system. This CPU is an array processor capable of up to 400 million instructions per second. To support the instantaneous high memory bandwidth requirement of multiple viewers and array processors, an ultra fast memory system is used. This memory system has a bandwidth capability of 400MB/sec and a total capacity of 256MB. This bandwidth is more than adequate to support several high resolution CRTs and also the fast processing unit. This fully integrated approach allows effective real-time image processing. The integrated design of the viewing system, memory system and array processor is key to the imaging system. This paper describes the architecture of the imaging system.

  18. Image Harvest: an open-source platform for high-throughput plant image processing and analysis

    PubMed Central

    Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal

    2016-01-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest was developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917

  19. Ontology-based, Tissue MicroArray oriented, image centered tissue bank

    PubMed Central

    Viti, Federica; Merelli, Ivan; Caprera, Andrea; Lazzari, Barbara; Stella, Alessandra; Milanesi, Luciano

    2008-01-01

    Background Tissue MicroArray technique is becoming increasingly important in pathology for the validation of experimental data from transcriptomic analysis. This approach produces many images which need to be properly managed, if possible with an infrastructure able to support tissue sharing between institutes. Moreover, the available frameworks oriented to Tissue MicroArray provide good storage for clinical patient, sample treatment and block construction information, but their utility is limited by the lack of data integration with biomolecular information. Results In this work we propose a web-oriented Tissue MicroArray system that supports researchers in managing bio-samples and, through the use of ontologies, enables tissue sharing aimed at the design of Tissue MicroArray experiments and the evaluation of results. Indeed, our system provides an ontological description both for pre-analysis tissue images and for post-process analysis image results, which is crucial for information exchange. Moreover, because the system works with well-defined terms, it is possible to query web resources for literature articles to integrate both pathology and bioinformatics data. Conclusions Using this system, users associate an ontology-based description to each image uploaded into the database and also integrate results with the ontological description of biosequences identified in every tissue. Moreover, it is possible to integrate the ontological description provided by the user with a fully compliant gene ontology definition, enabling statistical studies of the correlation between the analyzed pathology and the most commonly related biological processes. PMID:18460177

  20. [Standard of integration management at company level and its auditing].

    PubMed

    Flach, T; Hetzel, C; Mozdzanowski, M; Schian, H-M

    2006-10-01

    Responsibility at company level for the employment of workers with health-related problems or disabilities has increased, inter alia because of integration management at company level according to section 84 (2) of the German Social Code Book IX. Although several recommendations exist, no standard is available for auditing and certification. Such a standard could be a basis for granting premiums according to section 84 (3) of Book IX of the German Social Code. AUDIT AND CERTIFICATION: One product of the international "disability management" movement is the "Consensus Based Disability Management Audit" (CBDMA). The Audit is a systematic and independent measurement of the effectiveness of integration management at company level. CBDMA goals are to give evidence of the quality of the integration management implemented, to identify opportunities for improvement and recommend appropriate corrective and preventive action. In May 2006, the integration management of Ford-Werke GmbH Germany with about 23 900 employees was audited and certified as the first company in Europe. STANDARD OF INTEGRATION MANAGEMENT AT COMPANY LEVEL: In dialogue with corporate practitioners, the international standard of CBDMA has been adapted, completed and verified concerning its practicability. Process orientation is the key approach, and the structure is similar to DIN EN ISO 9001:2000. Its structure is as follows: (1) management-labour responsibility (goals and objectives, program planning, management-labour review), (2) management of resources (disability manager and DM team, employees' participation, cooperation with external partners, infrastructure), (3) communication (internal and external public relations), (4) case management (identifying cases, contact, situation analysis, planning actions, implementing actions and monitoring, process and outcome evaluation), (5) analysis and improvement (analysis and program evaluation), (6) documentation (manual, records).

  1. Integrating information from disparate sources: the Walter Reed National Surgical Quality Improvement Program Data Transfer Project.

    PubMed

    Nelson, Victoria; Nelson, Victoria Ruth; Li, Fiona; Green, Susan; Tamura, Tomoyoshi; Liu, Jun-Min; Class, Margaret

    2008-11-06

    The Walter Reed National Surgical Quality Improvement Program Data Transfer web module integrates with medical and surgical information systems, and leverages outside standards, such as the National Library of Medicine's RxNorm, to process surgical and risk assessment data. Key components of the project included a needs assessment with nurse reviewers and a data analysis for federated (standards were locally controlled) data sources. The resulting interface streamlines nurse reviewer workflow by integrating related tasks and data.

  2. International Space Station Alpha (ISSA) Integrated Traffic Model

    NASA Technical Reports Server (NTRS)

    Gates, R. E.

    1995-01-01

    The paper discusses the development process of the International Space Station Alpha (ISSA) Integrated Traffic Model, a subsystem analysis tool utilized in the ISSA design analysis cycles. Fast-track prototyping of the detailed relationships between daily crew and station consumables, propellant needs, maintenance requirements, and crew rotation via spreadsheets provides adequate benchmarks to assess cargo vehicle design and performance characteristics.

  3. A Comparative Analysis between Direct and Indirect Measurement of Year I Integrated Project

    ERIC Educational Resources Information Center

    Abdullah, Siti Rozaimah Sheikh; Mohamad, Abu Bakar; Anuar, Nurina; Markom, Masturah; Ismail, Manal; Rosli, Masli Irwan; Hasan, Hassimi Abu

    2013-01-01

    The Integrated Project (IP) has been practised in the Department of Chemical and Process Engineering (JKKP) since the 2006/2007 session. Initially, the IP was implemented only for Year II students in both the Chemical (KK) and Biochemical Engineering (KB) programmes. Previously, the Year I curriculum was based only on the common faculty courses.…

  4. Building the ECON extension: Functionality and lessons learned

    Treesearch

    Fred C. Martin

    2008-01-01

    The functionality of the ECON extension to FVS is described with emphasis on the ability to dynamically interact with all elements of the FVS simulation process. Like other extensions, ECON is fully integrated within FVS. This integration allows: (1) analysis of multiple alternative tree-removal actions within a single simulation without altering “normal” stand...

  5. Towards a Local Integration of Theories: Codes and Praxeologies in the Case of Computer-Based Instruction

    ERIC Educational Resources Information Center

    Gellert, Uwe; Barbe, Joaquim; Espinoza, Lorena

    2013-01-01

    We report on the development of a "language of description" that facilitates an integrated analysis of classroom video data in terms of the quality of the teaching-learning process and the students' access to valued forms of mathematical knowledge. Our research setting is the introduction of software for teachers for improving the mathematical…

  6. The Effects of Training and Performance Feedback during Behavioral Consultation on General Education Middle School Teachers' Integrity to Functional Analysis Procedures

    ERIC Educational Resources Information Center

    McKenney, Elizabeth L. W.; Waldron, Nancy; Conroy, Maureen

    2013-01-01

    This study describes the integrity with which 3 general education middle school teachers implemented functional analyses (FA) of appropriate behavior for students who typically engaged in disruption. A 4-step model consistent with behavioral consultation was used to support the assessment process. All analyses were conducted during ongoing…

  7. Department of the Army Cost Analysis Manual

    DTIC Science & Technology

    2001-05-01

    SECTION I - AUTOMATED COST ESTIMATING INTEGRATED TOOLS (ACEIT) ... SECTION II - AUTOMATED ... Management & Comptroller) endorsed the Automated Cost Estimating Integrated Tools (ACEIT) model and since it is widely used to prepare POEs, CCAs and ... CRB IPT (in ACEIT) will be the basis for information contained in the CAB. Any remaining unresolved issues from the IPT process will be raised at the

  8. Integrated and Early Childhood Education: Preparation for Social Development. Theme D: Looking Forward - Integrated Participation in Processes of Change.

    ERIC Educational Resources Information Center

    Springer, Hugh

    This seminar paper presents an analysis of the many complex issues inherent in planning and implementing those educational interventions designed to accelerate human change so that it matches the pace of external change in traditional societies. After establishing the importance of sensitive periods when learning can be massive and intervention is…

  9. Assessment of Material Solutions of Multi-level Garage Structure Within Integrated Life Cycle Design Process

    NASA Astrophysics Data System (ADS)

    Wałach, Daniel; Sagan, Joanna; Gicala, Magdalena

    2017-10-01

    The paper presents an environmental and economic analysis of the material solutions for a multi-level garage. The analysis considered a reinforced concrete structure built with either ordinary concrete or high-performance concrete (HPC). Use of HPC allowed a significant reduction of reinforcement steel, mainly in compression elements (columns), in the construction of the object. The analysis includes elements of the methodology of integrated life cycle design (ILCD). Through a multi-criteria analysis based on established weights of the economic and environmental parameters, three solutions have been evaluated and compared within the material production phase (information modules A1-A3).
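
    The multi-criteria step in this record reduces to a weighted aggregation of normalized criterion scores. As a minimal Python sketch of such a weighted-sum comparison (the weights and scores below are hypothetical placeholders, not values from the study):

      variants = {
          "ordinary concrete": {"cost": 0.55, "gwp": 0.40},
          "HPC mix A":         {"cost": 0.70, "gwp": 0.75},
          "HPC mix B":         {"cost": 0.65, "gwp": 0.80},
      }
      weights = {"cost": 0.5, "gwp": 0.5}  # established economic/environmental weights

      def score(criteria):
          # Higher normalized score (0..1) is better for every criterion here.
          return sum(weights[c] * v for c, v in criteria.items())

      for name, criteria in variants.items():
          print(f"{name}: {score(criteria):.3f}")
      print("preferred variant:", max(variants, key=lambda n: score(variants[n])))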

  10. Integrated exhaust gas analysis system for aircraft turbine engine component testing

    NASA Technical Reports Server (NTRS)

    Summers, R. L.; Anderson, R. C.

    1985-01-01

    An integrated exhaust gas analysis system was designed and installed in the hot-section facility at the Lewis Research Center. The system is designed to operate either manually or automatically and also to be operated from a remote station. The system measures oxygen, water vapor, total hydrocarbons, carbon monoxide, carbon dioxide, and oxides of nitrogen. Two microprocessors control the system and the analyzers, collect data and process them into engineering units, and present the data to the facility computers and the system operator. Within the design of this system there are innovative concepts and procedures that are of general interest and application to other gas analysis tasks.

  11. Computational System For Rapid CFD Analysis In Engineering

    NASA Technical Reports Server (NTRS)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  12. Aggregate-level analysis and prediction of midterm senatorial elections in the United States, 1974-1986.

    PubMed

    Lichtman, A J; Keilis-Borok, V I

    1989-12-01

    Pattern recognition study demonstrates that the outcomes of American midterm senatorial elections follow the dynamics of simple integral parameters that depict preelectoral situations aggregated to the state as a whole. A set of "commonsense" parameters is identified that is sufficient to predict such elections state-by-state and year-by-year. The analysis rejects many similar commonsense parameters. The existence and nature of integral collective behavior in U.S. elections at the level of the individual states is investigated. Implications for understanding the American electoral process are discussed.

  13. FIA: An Open Forensic Integration Architecture for Composing Digital Evidence

    NASA Astrophysics Data System (ADS)

    Raghavan, Sriram; Clark, Andrew; Mohay, George

    The analysis and value of digital evidence in an investigation has been the domain of discourse in the digital forensic community for several years. While many works have considered different approaches to model digital evidence, a comprehensive understanding of the process of merging different evidence items recovered during a forensic analysis is still a distant dream. With the advent of modern technologies, pro-active measures are integral to keeping abreast of all forms of cyber crimes and attacks. This paper motivates the need to formalize the process of analyzing digital evidence from multiple sources simultaneously. In this paper, we present the forensic integration architecture (FIA) which provides a framework for abstracting the evidence source and storage format information from digital evidence and explores the concept of integrating evidence information from multiple sources. The FIA architecture identifies evidence information from multiple sources that enables an investigator to build theories to reconstruct the past. FIA is hierarchically composed of multiple layers and adopts a technology independent approach. FIA is also open and extensible making it simple to adapt to technological changes. We present a case study using a hypothetical car theft case to demonstrate the concepts and illustrate the value it brings into the field.

  14. Self spectrum window method in wigner-ville distribution.

    PubMed

    Liu, Zhongguo; Liu, Changchun; Liu, Boqiang; Lv, Yangsheng; Lei, Yinsheng; Yu, Mengsun

    2005-01-01

    Wigner-Ville distribution (WVD) is an important type of time-frequency analysis in biomedical signal processing. The cross-term interference in WVD has a disadvantageous influence on its application. In this research, the Self Spectrum Window (SSW) method was put forward to suppress the cross-term interference, based on the fact that the cross-terms and auto-WVD terms in the integral kernel function are orthogonal. With the Self Spectrum Window (SSW) algorithm, a real auto-WVD function was used as a template to cross-correlate with the integral kernel function, and the Short Time Fourier Transform (STFT) spectrum of the signal was used as a window function to process the WVD in the time-frequency plane. The SSW method was confirmed by computer simulation with good analysis results. A satisfactory time-frequency distribution was obtained.
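
    For reference, the WVD that the SSW method operates on is the standard bilinear time-frequency transform (textbook form, not the paper's own notation):

      W_x(t, f) = \int_{-\infty}^{\infty} x\!\left(t + \tfrac{\tau}{2}\right)\, x^{*}\!\left(t - \tfrac{\tau}{2}\right)\, e^{-j 2 \pi f \tau}\, d\tau

    Its bilinearity is the source of the interference: for x = x_1 + x_2, one gets W_x = W_{x_1} + W_{x_2} + 2\,\mathrm{Re}\{W_{x_1 x_2}\}, and the cross-WVD term W_{x_1 x_2} is what the SSW window is designed to suppress.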

  15. An Integrated Framework for Parameter-based Optimization of Scientific Workflows.

    PubMed

    Kumar, Vijay S; Sadayappan, P; Mehta, Gaurang; Vahi, Karan; Deelman, Ewa; Ratnakar, Varun; Kim, Jihie; Gil, Yolanda; Hall, Mary; Kurc, Tahsin; Saltz, Joel

    2009-01-01

    Data analysis processes in scientific applications can be expressed as coarse-grain workflows of complex data processing operations with data flow dependencies between them. Performance optimization of these workflows can be viewed as a search for a set of optimal values in a multi-dimensional parameter space. While some performance parameters such as grouping of workflow components and their mapping to machines do not affect the accuracy of the output, others may dictate trading the output quality of individual components (and of the whole workflow) for performance. This paper describes an integrated framework which is capable of supporting performance optimizations along multiple dimensions of the parameter space. Using two real-world applications in the spatial data analysis domain, we present an experimental evaluation of the proposed framework.

  16. Implementing an extension of the analytical hierarchy process using ordered weighted averaging operators with fuzzy quantifiers in ArcGIS

    NASA Astrophysics Data System (ADS)

    Boroushaki, Soheil; Malczewski, Jacek

    2008-04-01

    This paper focuses on the integration of GIS and an extension of the analytical hierarchy process (AHP) using a quantifier-guided ordered weighted averaging (OWA) procedure. AHP_OWA is a multicriteria combination operator. The nature of the AHP_OWA depends on a set of parameters, which are expressed by means of fuzzy linguistic quantifiers. By changing the linguistic terms, AHP_OWA can generate a wide range of decision strategies. We propose a GIS-multicriteria evaluation (MCE) system through implementation of AHP_OWA within ArcGIS, capable of integrating linguistic labels within conventional AHP for spatial decision making. We suggest that the proposed GIS-MCE would simplify the definition of decision strategies and facilitate an exploratory analysis of multiple criteria by incorporating qualitative information within the analysis.
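
    The quantifier-guided OWA weighting at the heart of AHP_OWA can be illustrated compactly. A minimal Python sketch using the common RIM quantifier Q(r) = r**alpha (the alpha values and criterion scores are illustrative, not taken from the paper):

      import numpy as np

      def owa_weights(n, alpha):
          # w_i = Q(i/n) - Q((i-1)/n), with RIM quantifier Q(r) = r**alpha
          i = np.arange(1, n + 1)
          return (i / n) ** alpha - ((i - 1) / n) ** alpha

      def owa(values, alpha):
          v = np.sort(np.asarray(values))[::-1]   # reorder descending before weighting
          return float(np.dot(owa_weights(len(v), alpha), v))

      scores = [0.8, 0.6, 0.3]   # criterion scores for one map location
      for alpha, label in [(0.1, "OR-like: at least one criterion"),
                           (1.0, "neutral: plain average"),
                           (10.0, "AND-like: all criteria")]:
          print(f"alpha={alpha}: {owa(scores, alpha):.3f} ({label})")

    Sweeping alpha reproduces the "wide range of decision strategies" the abstract describes, from optimistic (OR-like) to pessimistic (AND-like) aggregation.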

  17. Optimisation study of a vehicle bumper subsystem with fuzzy parameters

    NASA Astrophysics Data System (ADS)

    Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.

    2012-10-01

    This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. Automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, thereby facilitating automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system-level failure possibility is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy FE analysis applied for the uncertainty treatment in crash simulations. This process is the counterpart of the reliability-based design optimisation used in a probabilistic context with statistically defined parameters (variabilities).

  18. Opportunities and prospects of biorefinery-based valorisation of pulp and paper sludge.

    PubMed

    Gottumukkala, Lalitha Devi; Haigh, Kate; Collard, François-Xavier; van Rensburg, Eugéne; Görgens, Johann

    2016-09-01

    The paper and pulp industry is one of the major industries that generate large amounts of solid waste with high moisture content. Numerous opportunities exist for valorisation of waste paper sludge, although this review focuses on primary sludge with high cellulose content. The most mature options for paper sludge valorisation are fermentation, anaerobic digestion and pyrolysis. In this review, biochemical and thermal processes are considered individually and also as an integrated biorefinery. The objective of an integrated biorefinery is to reduce or avoid paper sludge disposal by landfilling, to reclaim water and to add value. Assessment of selected processes for the biorefinery varies from a detailed analysis of a single process to high-level optimisation and integration of the processes, which allows an initial assessment and comparison of technologies. These data can be used to provide key stakeholders with a roadmap of technologies that can generate economic benefits, and reduce carbon wastage and pollution load. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Hybrid-renewable processes for biofuels production: concentrated solar pyrolysis of biomass residues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, Anthe; Geier, Manfred; Dedrick, Daniel E.

    2014-10-01

    The viability of thermochemically-derived biofuels can be greatly enhanced by reducing the process parasitic energy loads. Integrating renewable power into biofuels production is one method by which these efficiency drains can be eliminated. There are a variety of such potentially viable "hybrid-renewable" approaches; one is to integrate concentrated solar power (CSP) to power biomass-to-liquid fuels (BTL) processes. Barriers to CSP integration into BTL processes are predominantly the lack of fundamental kinetic and mass transport data to enable appropriate systems analysis and reactor design. A novel reactor design has been created that allows biomass particles to be suspended in a flow gas and irradiated with a simulated solar flux. Pyrolysis conditions were investigated and a comparison between solar and non-solar biomass pyrolysis was conducted in terms of product distributions and pyrolysis oil quality. A novel method was developed to analyse pyrolysis products and investigate their stability.

  20. Evaluation of grid generation technologies from an applied perspective

    NASA Technical Reports Server (NTRS)

    Hufford, Gary S.; Harrand, Vincent J.; Patel, Bhavin C.; Mitchell, Curtis R.

    1995-01-01

    An analysis of the grid generation process from the point of view of an applied CFD engineer is given. Issues addressed include geometric modeling, structured grid generation, unstructured grid generation, hybrid grid generation and use of virtual parts libraries in large parametric analysis projects. The analysis is geared towards comparing the effective turn around time for specific grid generation and CFD projects. The conclusion was made that a single grid generation methodology is not universally suited for all CFD applications due to both limitations in grid generation and flow solver technology. A new geometric modeling and grid generation tool, CFD-GEOM, is introduced to effectively integrate the geometric modeling process to the various grid generation methodologies including structured, unstructured, and hybrid procedures. The full integration of the geometric modeling and grid generation allows implementation of extremely efficient updating procedures, a necessary requirement for large parametric analysis projects. The concept of using virtual parts libraries in conjunction with hybrid grids for large parametric analysis projects is also introduced to improve the efficiency of the applied CFD engineer.

  1. Economic Liberalization and Its Impact on Human Development: A Comparative Analysis of Turkey and Azerbaijan

    ERIC Educational Resources Information Center

    Gulaliyev, Mayis G.; Ok, Nuri I.; Musayeva, Fargana Q.; Efendiyev, Rufat J.; Musayeva, Jamila Q.; Agayeva, Samira R.

    2016-01-01

    The aim of the article is to study the nature of liberalization as a specific economic process, which is formed and developed under the influence of the changing conditions of the globalization and integration processes in the society, as well as to identify the characteristic differences in the processes of liberalization of Turkey and Azerbaijan…

  2. Specifications of a Simulation Model for a Local Area Network Design in Support of a Stock Point Logistics Integrated Communication Environment (SPLICE).

    DTIC Science & Technology

    1983-06-01

    constrained at each step. Use of discrete simulation can be a powerful tool in this process if its role is carefully planned. The gross behavior of the...by projecting: - the arrival of units of work at SPLICE processing facilities (workload analysis) - the amount of processing resources consumed in

  3. Formation of the Integral Ecological Quality Index of the Technological Processes in Machine Building Based on Their Energy Efficiency

    ERIC Educational Resources Information Center

    Egorov, Sergey B.; Kapitanov, Alexey V.; Mitrofanov, Vladimir G.; Shvartsburg, Leonid E.; Ivanova, Natalia A.; Ryabov, Sergey A.

    2016-01-01

    The aim of the article is to develop a unified assessment methodology applicable to various technological processes and the actual conditions of their implementation, and to carry out the energy efficiency analysis of the technological processes through comparison of the established power and the power consumed by the actual technological…

  4. Logistics Process Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2008-03-31

    LPAT is the integrated system combining the ANL-developed Enhanced Logistics Intra Theater Support Tool (ELIST), sponsored by SDDC-TEA, with the Fort Future Virtual Installation Tool, sponsored by CERL. The Fort Future Simulation Engine was an application written in the ANL Repast Simphony framework and used as the basis for the Process Analysis Tool (PAT), which evolved into a stand-alone tool for detailed process analysis at a location. Combined with ELIST, an inter-installation logistics component was added to enable users to define large logistical agent-based models without having to program.

  5. Proposal for an integrated evaluation model for the study of whole systems health care in cancer.

    PubMed

    Jonas, Wayne B; Beckner, William; Coulter, Ian

    2006-12-01

    For more than 200 years, biomedicine has approached the treatment of disease by studying disease processes (pathogenesis), inferring causal connections and developing specific approaches for therapeutically interfering with those processes. This pathogenic approach has been highly successful in acute and traumatic disease but less successful in chronic disease, primarily because of the complex, multi-factorial nature of most chronic disease, which does not allow for simple causal inference or for simple therapeutic interventions. This article suggests that chronic disease is best approached by enhancing healing processes (salutogenesis) as a whole system. Because of the nature of complex systems in chronic disease, an evaluation model based on integrative medicine is felt to be more appropriate than a disease model. The authors propose and describe an integrated model for the evaluation of healing (IMEH) that collects multilevel "thick case" observational data in assessing complex practices for chronic disease. If successful, this approach could become a blueprint for studying healing capacity in whole medical systems, including complementary medicine, traditional medicine, and conventional primary care. In addition, streamlining data collection and applying rapid informatics management might allow for such data to be used in guiding clinical practice. The IMEH involves collection, integration, and potentially feedback of relevant variables in the following areas: (1) sociocultural, (2) psychological and behavioral, (3) clinical (diagnosis based), and (4) biological. Evaluation and integration of these components would involve specialized research teams that feed their data into a single data management and information analysis center. These data can then be subjected to descriptive and pathway analysis providing "bench and bedside" information.

  6. On the Development of a Computing Infrastructure that Facilitates IPPD from a Decision-Based Design Perspective

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.

  7. Progress in integrated-circuit horn antennas for receiver applications. Part 1: Antenna design

    NASA Technical Reports Server (NTRS)

    Eleftheriades, George V.; Ali-Ahmad, Walid Y.; Rebeiz, Gabriel M.

    1992-01-01

    The purpose of this work is to present a systematic method for the design of multimode quasi-integrated horn antennas. The design methodology is based on the Gaussian beam approach and the structures are optimized for achieving maximum fundamental Gaussian coupling efficiency. For this purpose, a hybrid technique is employed in which the integrated part of the antennas is treated using full-wave analysis, whereas the machined part is treated using an approximate method. This results in a simple and efficient design process. The developed design procedure has been applied to the design of 20, 23, and 25 dB quasi-integrated horn antennas, all with a Gaussian coupling efficiency exceeding 97 percent. The designed antennas have been tested and characterized using both full-wave analysis and 90 GHz/370 GHz measurements.

  8. Total systems design analysis of high performance structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1993-01-01

    Designer-control parameters were identified at interdiscipline interfaces to optimize structural systems performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance disciplines integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integrating tasks are reliability and recurring structural costs. Significant interface designer-control parameters were noted as shapes, dimensions, probability range factors, and cost. The structural failure concept is presented, and first-order reliability and deterministic methods, benefits, and limitations are discussed. A deterministic reliability technique combining the benefits of both is proposed for static structures, which is also timely and economically verifiable. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.

  9. An exploration of the mechanisms of change following an integrated group intervention for stuttering, as perceived by school-aged children who stutter (CWS).

    PubMed

    Caughter, Sarah; Dunsmuir, Sandra

    2017-03-01

    To explore the process of change and role of resilience following an integrated group intervention for children who stutter (CWS). Using an exploratory multiple case study design, this research sought to identify the most significant changes perceived by seven participants following therapy, the mechanisms of change, and the role of resilience in the process of change. Quantitative measurements of resilience were combined with qualitative analysis of semi-structured interviews. Thematic analysis of qualitative data showed that cognitive and emotional change was a key driver for therapeutic change, enabled by the shared experience of the group and a positive therapeutic environment. These changes were generalised into clients' real-world experiences, facilitated by their support network. Quantitative data demonstrated a statistically reliable positive change in overall Resiliency scores for four participants and reduced impact of stuttering scores on OASES-S for all participants, maintained at 12 month follow-up. This study demonstrates the importance of adopting an integrated approach in therapy for CWS, which incorporates Cognitive Behavioural Therapy (CBT) as a key component, to facilitate change and build resilience. These results are unique to this cohort of CWS and further investigation into the use of CBT and the process of change may be warranted. The reader will be able to (1) describe the integrated intervention used in this study (2) define the most significant change following therapy for the participants involved (3) summarise the key factors that facilitated change during the therapy process (as perceived by the participants). Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Integrated Safety Analysis Teams

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jonathan C.

    2008-01-01

    Today's complex systems require understanding beyond one person's capability to comprehend. Each system requires a team to divide the system into understandable subsystems which can then be analyzed with an Integrated Hazard Analysis. The team must have both specific experiences and diversity of experience. Safety experience and system understanding are not always manifested in one individual. Group dynamics make the difference between success and failure as well as the difference between a difficult task and a rewarding experience. There are examples in the news which demonstrate the need to connect the pieces of a system into a complete picture. The Columbia disaster is now a standard example: a low consequence hazard in one part of the system, the External Tank, was a catastrophic hazard cause for a companion subsystem, the Space Shuttle Orbiter. The interaction between the hardware, the manufacturing process, the handling, and the operations contributed to the problem. Each of these had analysis performed, but who constituted the team which integrated these analyses? This paper will explore some of the methods used for dividing up a complex system and how one integration team has analyzed the parts. How this analysis has been documented in one particular launch vehicle case will also be discussed.

  11. The effect of additive geometry on the integration of secondary elements during Friction Stir Processing

    NASA Astrophysics Data System (ADS)

    Zens, A.; Gnedel, M.; Zaeh, M. F.; Haider, F.

    2018-06-01

    Friction Stir Processing (FSP) can be used to locally modify properties in materials such as aluminium. This may be used, for example, to produce a fine microstructure or to integrate secondary elements into the base material. The purpose of this work is to examine the effect of the properties of the metal additives on the resulting material distribution in the processed region. For this, commercially pure iron and copper were integrated into an EN AW-1050 aluminium base material using FSP. Iron in the form of powder, wire and foil as well as copper in powder form were assessed. The various additive forms represent materials with differing surface-to-volume ratios as well as varying dispersion characteristics in the processing zone. The processing parameters for each additive form remained constant; however, two- and four-pass FSP processes were conducted. The results of CT analysis proved especially insightful regarding the spatial distribution of the various additive forms within the workpiece. As expected, the powder additive was most widely distributed within the welding zone. Micro-hardness mappings showed that the powder additive increased the hardness within the weld nugget compared with the processed material without secondary elements.

  12. Argo: an integrative, interactive, text mining-based workbench supporting curation

    PubMed Central

    Rak, Rafal; Rowley, Andrew; Black, William; Ananiadou, Sophia

    2012-01-01

    Curation of biomedical literature is often supported by the automatic analysis of textual content that generally involves a sequence of individual processing components. Text mining (TM) has been used to enhance the process of manual biocuration, but has been focused on specific databases and tasks rather than an environment integrating TM tools into the curation pipeline, catering for a variety of tasks, types of information and applications. Processing components usually come from different sources and often lack interoperability. The well established Unstructured Information Management Architecture is a framework that addresses interoperability by defining common data structures and interfaces. However, most of the efforts are targeted towards software developers and are not suitable for curators, or are otherwise inconvenient to use on a higher level of abstraction. To overcome these issues we introduce Argo, an interoperable, integrative, interactive and collaborative system for text analysis with a convenient graphic user interface to ease the development of processing workflows and boost productivity in labour-intensive manual curation. Robust, scalable text analytics follow a modular approach, adopting component modules for distinct levels of text analysis. The user interface is available entirely through a web browser that saves the user from going through often complicated and platform-dependent installation procedures. Argo comes with a predefined set of processing components commonly used in text analysis, while giving the users the ability to deposit their own components. The system accommodates various areas and levels of user expertise, from TM and computational linguistics to ontology-based curation. One of the key functionalities of Argo is its ability to seamlessly incorporate user-interactive components, such as manual annotation editors, into otherwise completely automatic pipelines. As a use case, we demonstrate the functionality of an in-built manual annotation editor that is well suited for in-text corpus annotation tasks. Database URL: http://www.nactem.ac.uk/Argo PMID:22434844

  13. Post-processing Seasonal Precipitation Forecasts via Integrating Climate Indices and the Analog Approach

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Zhang, Y.; Wood, A.; Lee, H. S.; Wu, L.; Schaake, J. C.

    2016-12-01

    Seasonal precipitation forecasts are a primary driver for seasonal streamflow prediction, which is critical for a range of water resources applications, such as reservoir operations and drought management. However, it is well known that seasonal precipitation forecasts from climate models are often biased and also too coarse in spatial resolution for hydrologic applications. Therefore, post-processing procedures such as downscaling and bias correction are often needed. In this presentation, we discuss results from a recent study that applies a two-step methodology to downscale and correct the ensemble mean precipitation forecasts from the Climate Forecast System (CFS). First, CFS forecasts are downscaled and bias corrected using monthly reforecast analogs: we identify past precipitation forecasts that are similar to the current forecast, and then use the finer-scale observational analysis fields from the corresponding dates to represent the post-processed ensemble forecasts. Second, we construct the posterior distribution of forecast precipitation from the post-processed ensemble by integrating climate indices: a correlation analysis is performed to identify dominant climate indices for the study region, which are then used to weight the analysis analogs selected in the first step using a Bayesian approach. The methodology is applied to the California Nevada River Forecast Center (CNRFC) and the Middle Atlantic River Forecast Center (MARFC) regions for 1982-2015, using the North American Land Data Assimilation System (NLDAS-2) precipitation as the analysis. The results from cross validation show that the post-processed CFS precipitation forecasts are considerably more skillful than the raw CFS even with the analog approach alone. Integrating climate indices can further improve the skill if the number of ensemble members considered is large enough; however, the improvement is generally limited to the first couple of months when compared against climatology. Impacts of various factors such as ensemble size, lead time, and choice of climate indices will also be discussed.
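
    The two-step scheme can be caricatured in a few lines of Python. Below is a conceptual sketch with synthetic stand-ins for the CFS reforecasts, the NLDAS-2 analyses and a climate index (none of the arrays or parameter values come from the study):

      import numpy as np

      rng = np.random.default_rng(0)
      hist_fcst  = rng.gamma(2.0, 1.5, 400)             # past coarse forecasts (area mean)
      hist_anal  = hist_fcst + rng.normal(0, 0.8, 400)  # matching fine-scale analyses
      hist_index = rng.normal(0, 1, 400)                # e.g. an ENSO-like index

      def postprocess(fcst_now, index_now, k=25, tau=0.5):
          # Step 1: k nearest reforecast analogs of the current forecast.
          nearest = np.argsort(np.abs(hist_fcst - fcst_now))[:k]
          members = hist_anal[nearest]
          # Step 2: Bayesian-style weights from climate-index similarity.
          w = np.exp(-0.5 * ((hist_index[nearest] - index_now) / tau) ** 2)
          return float(np.dot(w / w.sum(), members))    # weighted ensemble mean

      print(postprocess(fcst_now=3.2, index_now=1.0))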

  14. Case based reasoning in criminal intelligence using forensic case data.

    PubMed

    Ribaux, O; Margot, P

    2003-01-01

    A model that is based on the knowledge of experienced investigators in the analysis of serial crime is suggested to bridge a gap between technology and methodology. Its purpose is to provide a solid methodology for the analysis of serial crimes that supports decision making in the deployment of resources, either by guiding proactive policing operations or helping the investigative process. Formalisation has helped to derive a computerised system that efficiently supports the reasoning processes in the analysis of serial crime. This novel approach fully integrates forensic science data.

  15. Integrating Remote Sensing Data, Hybrid-Cloud Computing, and Event Notifications for Advanced Rapid Imaging & Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S.; Lundgren, P.; Fielding, E. J.; Agram, P.; Manipon, G.; Stough, T. M.; Simons, M.; Rosen, P. A.; Wilson, B. D.; Poland, M. P.; Cervelli, P. F.; Cruz, J.

    2013-12-01

    Space-based geodetic measurement techniques such as Interferometric Synthetic Aperture Radar (InSAR) and Continuous Global Positioning System (CGPS) are now important elements in our toolset for monitoring earthquake-generating faults, volcanic eruptions, hurricane damage, landslides, reservoir subsidence, and other natural and man-made hazards. Geodetic imaging's unique ability to capture surface deformation with high spatial and temporal resolution has revolutionized both earthquake science and volcanology. Continuous monitoring of surface deformation and surface change before, during, and after natural hazards improves decision-making through better forecasts, increased situational awareness, and more informed recovery. However, analyses of InSAR and GPS data sets are currently handcrafted following events and are not generated rapidly and reliably enough for use in operational response to natural disasters. Additionally, the sheer data volumes needed to handle a continuous stream of InSAR data sets present a bottleneck. It has been estimated that continuous processing of InSAR coverage of California alone over 3 years would reach PB-scale data volumes. Our Advanced Rapid Imaging and Analysis for Monitoring Hazards (ARIA-MH) science data system enables both science and decision-making communities to monitor areas of interest with derived geodetic data products via seamless data preparation, processing, discovery, and access. We will present our findings on the use of hybrid-cloud computing to improve the timely processing and delivery of geodetic data products, integrating event notifications from USGS to improve the timely processing for response, as well as providing browse results for quick looks with other tools for integrative analysis.

  16. A Review of Diagnostic Techniques for ISHM Applications

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, Ann; Biswas, Gautam; Aaseng, Gordon; Narasimhan, Sriam; Pattipati, Krishna

    2005-01-01

    System diagnosis is an integral part of any Integrated System Health Management application. Diagnostic applications make use of system information from the design phase, such as safety and mission assurance analysis, failure modes and effects analysis, hazards analysis, functional models, fault propagation models, and testability analysis. In modern process control and equipment monitoring systems, topological and analytic models of the nominal system, derived from design documents, are also employed for fault isolation and identification. Depending on the complexity of the monitored signals from the physical system, diagnostic applications may involve straightforward trending and feature extraction techniques to retrieve the parameters of importance from the sensor streams. They may also involve very complex analysis routines, such as signal processing, learning or classification methods, to derive the parameters of importance to diagnosis. The process that is used to diagnose anomalous conditions from monitored system signals varies widely across the different approaches to system diagnosis. Rule-based expert systems, case-based reasoning systems, model-based reasoning systems, learning systems, and probabilistic reasoning systems are examples of the many diverse approaches to diagnostic reasoning. Many engineering disciplines have specific approaches to modeling, monitoring and diagnosing anomalous conditions. Therefore, there is no "one-size-fits-all" approach to building diagnostic and health monitoring capabilities for a system. For instance, the conventional approaches to diagnosing failures in rotorcraft applications are very different from those used in communications systems. Further, online and offline automated diagnostic applications are integrated into an operations framework with flight crews, flight controllers and maintenance teams. While the emphasis of this paper is automation of health management functions, striking the correct balance between automated and human-performed tasks is a vital concern.

  17. Welded joints integrity analysis and optimization for fiber laser welding of dissimilar materials

    NASA Astrophysics Data System (ADS)

    Ai, Yuewei; Shao, Xinyu; Jiang, Ping; Li, Peigen; Liu, Yang; Liu, Wei

    2016-11-01

    Dissimilar materials welded joints provide many advantages in the power, automotive, chemical, and spacecraft industries. The weld bead integrity, which is determined by process parameters, plays a significant role in welding quality during the fiber laser welding (FLW) of dissimilar materials. In this paper, an optimization method that takes the integrity of the weld bead and the weld area into consideration is proposed for FLW of dissimilar materials, low carbon steel and stainless steel. The relationships between the weld bead integrity and process parameters are developed by a genetic algorithm optimized back propagation neural network (GA-BPNN). The particle swarm optimization (PSO) algorithm is then used to optimize the outputs predicted by the GA-BPNN toward the objective. Through the optimization process, the desired weld bead with good integrity and minimum weld area is obtained, and the corresponding microstructure and microhardness are excellent. The mechanical properties of the optimized joints are greatly improved compared with those of the un-optimized welded joints. Moreover, the effects of significant factors are analyzed based on a statistical approach, and the laser power (LP) is identified as the most significant factor for the weld bead integrity and weld area. The results indicate that the proposed method is effective for improving the reliability and stability of welded joints in practical production.
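
    To make the optimization step concrete, here is a minimal PSO loop in Python minimizing a toy quadratic surrogate that stands in for the trained GA-BPNN predictor (the parameter ranges, assumed optimum and PSO constants are hypothetical):

      import numpy as np

      def surrogate(x):  # x = [laser power, welding speed]; toy stand-in for GA-BPNN
          return (x[0] - 1.8) ** 2 + 0.5 * (x[1] - 30.0) ** 2

      rng = np.random.default_rng(1)
      lo, hi = np.array([1.0, 10.0]), np.array([3.0, 50.0])
      n = 20
      x = rng.uniform(lo, hi, (n, 2)); v = np.zeros((n, 2))
      pbest = x.copy(); pval = np.apply_along_axis(surrogate, 1, x)
      g = pbest[pval.argmin()].copy()

      for _ in range(100):
          r1, r2 = rng.random((n, 2)), rng.random((n, 2))
          v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)  # inertia + pulls
          x = np.clip(x + v, lo, hi)
          f = np.apply_along_axis(surrogate, 1, x)
          better = f < pval
          pbest[better], pval[better] = x[better], f[better]
          g = pbest[pval.argmin()].copy()

      print("best parameters:", g, "objective:", pval.min())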

  18. Network and biosignature analysis for the integration of transcriptomic and metabolomic data to characterize leaf senescence process in sunflower.

    PubMed

    Moschen, Sebastián; Higgins, Janet; Di Rienzo, Julio A; Heinz, Ruth A; Paniego, Norma; Fernandez, Paula

    2016-06-06

    In recent years, high throughput technologies have led to an increase of datasets from omics disciplines allowing the understanding of the complex regulatory networks associated with biological processes. Leaf senescence is a complex mechanism controlled by multiple genetic and environmental variables, which has a strong impact on crop yield. Transcription factors (TFs) are key proteins in the regulation of gene expression, regulating different signaling pathways; their function is crucial for triggering and/or regulating different aspects of the leaf senescence process. The study of TF interactions and their integration with metabolic profiles under different developmental conditions, especially for a non-model organism such as sunflower, will open new insights into the details of gene regulation of leaf senescence. Weighted Gene Correlation Network Analysis (WGCNA) and BioSignature Discoverer (BioSD, Gnosis Data Analysis, Heraklion, Greece) were used to integrate transcriptomic and metabolomic data. WGCNA allowed the detection of 10 metabolites and 13 TFs whereas BioSD allowed the detection of 1 metabolite and 6 TFs as potential biomarkers. The comparative analysis demonstrated that three transcription factors were detected through both methodologies, highlighting them as potentially robust biomarkers associated with leaf senescence in sunflower. The complementary use of network and BioSignature Discoverer analysis of transcriptomic and metabolomic data provided a useful tool for identifying candidate genes and metabolites which may have a role during the triggering and development of the leaf senescence process. The WGCNA tool allowed us to design and test a hypothetical network in order to infer relationships across selected transcription factor and metabolite candidate biomarkers involved in leaf senescence, whereas BioSignature Discoverer selected transcripts and metabolites which discriminate between different ages of sunflower plants. The methodology presented here would help to elucidate and predict novel networks and potential biomarkers of leaf senescence in sunflower.
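
    The WGCNA side of this workflow rests on a soft-threshold adjacency, a_ij = |cor(x_i, x_j)|**beta. A minimal Python sketch on random placeholder data (beta = 6 is a conventional default, not a value reported for this study):

      import numpy as np

      rng = np.random.default_rng(2)
      expr = rng.normal(size=(50, 20))        # 50 samples x 20 transcripts/metabolites
      corr = np.corrcoef(expr, rowvar=False)  # 20 x 20 correlation matrix
      adj = np.abs(corr) ** 6                 # soft-threshold adjacency, beta = 6
      np.fill_diagonal(adj, 0.0)
      connectivity = adj.sum(axis=0)          # per-node connectivity k_i
      print("hub candidate:", connectivity.argmax(), round(connectivity.max(), 3))

    Module detection then proceeds by clustering a topological-overlap transform of this adjacency; candidate biomarkers are typically the highly connected (hub) nodes within trait-associated modules.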

  19. An integrated assessment instrument: Developing and validating instrument for facilitating critical thinking abilities and science process skills on electrolyte and nonelectrolyte solution matter

    NASA Astrophysics Data System (ADS)

    Astuti, Sri Rejeki Dwi; Suyanta, LFX, Endang Widjajanti; Rohaeti, Eli

    2017-05-01

    The demands placed on assessment in the learning process have been affected by policy changes. Nowadays, assessment emphasizes not only knowledge, but also skills and attitudes. However, in reality there are many obstacles to measuring them. This paper aims to describe how to develop an integrated assessment instrument and to verify the instrument's validity, namely content validity and construct validity. The instrument development used the test development model by McIntire. Development process data were acquired based on the development test steps. The initial product was reviewed by three peer reviewers and six expert judges (two subject matter experts, two evaluation experts and two chemistry teachers) to establish content validity. This research involved 376 first-grade students of two senior high schools in Bantul Regency to establish construct validity. Content validity was analyzed using Aiken's formula. Construct validity was verified by exploratory factor analysis using SPSS ver 16.0. The results show that all constructs in the integrated assessment instrument are valid according to content validity and construct validity. Therefore, the integrated assessment instrument is suitable for measuring critical thinking abilities and science process skills of senior high school students on electrolyte solution matter.
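
    Aiken's formula, used here for content validity, is V = sum(r_i - lo) / (n * (c - 1)), where r_i are the expert ratings, lo is the lowest rating category and c is the number of categories. A minimal Python sketch (the ratings below are illustrative, not the study's data):

      def aiken_v(ratings, lo=1, c=5):
          # V ranges from 0 (all raters at the lowest category) to 1 (all at the highest)
          return sum(r - lo for r in ratings) / (len(ratings) * (c - 1))

      print(aiken_v([5, 4, 5, 4, 5, 4]))  # six judges on a 1-5 scale -> 0.875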

  20. Combined Pressure, Temperature Contrast and Surface-Enhanced Separation of Carbon Dioxide for Post-Combustion Carbon Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhen; Wong, Michael; Gupta, Mayank

    The Rice University research team developed a hybrid carbon dioxide (CO2) absorption process combining absorber and stripper columns using a high surface area ceramic foam gas-liquid contactor for enhanced mass transfer and utilizing waste heat for regeneration. This integrated absorber/desorber arrangement will reduce space requirements, an important factor for retrofitting existing coal-fired power plants with CO2 capture technology. As described in this report, we performed an initial analysis to estimate the technical and economic feasibility of the process. A one-dimensional (1D) CO2 absorption column was fabricated to measure the hydrodynamic and mass transfer characteristics of the ceramic foam. A bench-scale prototype was constructed to implement the complete CO2 separation process and tested to study various aspects of fluid flow in the process. A model was developed to simulate the two-dimensional (2D) fluid flow and optimize the CO2 capture process. Test results were used to develop a final technoeconomic analysis and identify the most appropriate absorbent as well as optimum operating conditions to minimize capital and operating costs. Finally, a technoeconomic study was performed to assess the feasibility of integrating the process into a 600 megawatt electric (MWe) coal-fired power plant. With process optimization, a COE of $82/MWh can be achieved using our integrated absorber/desorber CO2 capture technology, which is very close to DOE's target of no more than a 35% increase in COE with CCS. An environmental, health, and safety (EH&S) assessment of the capture process indicated no significant concern in terms of EH&S effects or legislative compliance.

  1. Integrative multi-platform meta-analysis of gene expression profiles in pancreatic ductal adenocarcinoma patients for identifying novel diagnostic biomarkers.

    PubMed

    Irigoyen, Antonio; Jimenez-Luna, Cristina; Benavides, Manuel; Caba, Octavio; Gallego, Javier; Ortuño, Francisco Manuel; Guillen-Ponce, Carmen; Rojas, Ignacio; Aranda, Enrique; Torres, Carolina; Prados, Jose

    2018-01-01

    Applying differentially expressed genes (DEGs) to identify feasible biomarkers in diseases can be a hard task when working with heterogeneous datasets. Expression data are strongly influenced by technology, sample preparation processes, and/or labeling methods. The proliferation of different microarray platforms for measuring gene expression increases the need to develop models able to compare their results, especially when different technologies can lead to signal values that vary greatly. Integrative meta-analysis can significantly improve the reliability and robustness of DEG detection. The objective of this work was to develop an integrative approach for identifying potential cancer biomarkers by integrating gene expression data from two different platforms. Pancreatic ductal adenocarcinoma (PDAC), where there is an urgent need to find new biomarkers due to its late diagnosis, is an ideal candidate for testing this technology. Expression data from two different datasets, namely Affymetrix and Illumina (18 and 36 PDAC patients, respectively), as well as from 18 healthy controls, were used for this study. A meta-analysis based on an empirical Bayesian methodology (ComBat) was then proposed to integrate these datasets. DEGs were finally identified from the integrated data by using the statistical programming language R. After our integrative meta-analysis, 5 genes were commonly identified within the individual analyses of the independent datasets. In addition, 28 novel genes that were not reported by the individual analyses ('gained' genes) were discovered. Several of these gained genes have already been related to other gastroenterological tumors. The proposed integrative meta-analysis has revealed novel DEGs that may play an important role in PDAC and could be potential biomarkers for diagnosing the disease.
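
    ComBat performs an empirical Bayes location-scale adjustment so that samples measured on different platforms become comparable. The following is a stripped-down sketch of the location-scale idea only, omitting the empirical Bayes shrinkage of batch estimates that ComBat adds; the genes-by-samples arrays and batch labels are hypothetical:

        import numpy as np

        def simple_batch_adjust(X, batches):
            """Per-gene location-scale batch adjustment (X: genes x samples)."""
            X = np.asarray(X, dtype=float)
            batches = np.asarray(batches)
            grand_mean = X.mean(axis=1, keepdims=True)   # pooled per-gene mean
            grand_std = X.std(axis=1, keepdims=True)     # pooled per-gene spread
            out = np.empty_like(X)
            for b in np.unique(batches):
                cols = batches == b
                mu = X[:, cols].mean(axis=1, keepdims=True)
                sd = X[:, cols].std(axis=1, keepdims=True)
                sd[sd == 0] = 1.0                        # guard against constant genes
                # remove this batch's mean/scale, then restore the pooled ones
                out[:, cols] = (X[:, cols] - mu) / sd * grand_std + grand_mean
            return out

        # e.g. hypothetical expr matrix with 18 Affymetrix + 36 Illumina samples:
        # adjusted = simple_batch_adjust(expr, ["affy"] * 18 + ["illumina"] * 36)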

  2. Warpgroup: increased precision of metabolomic data processing by consensus integration bound analysis

    PubMed Central

    Mahieu, Nathaniel G.; Spalding, Jonathan L.; Patti, Gary J.

    2016-01-01

    Motivation: Current informatic techniques for processing raw chromatography/mass spectrometry data break down under several common, non-ideal conditions. Importantly, hydrophilic liquid interaction chromatography (a key separation technology for metabolomics) produces data which are especially challenging to process. We identify three critical points of failure in current informatic workflows: compound specific drift, integration region variance, and naive missing value imputation. We implement the Warpgroup algorithm to address these challenges. Results: Warpgroup adds peak subregion detection, consensus integration bound detection, and intelligent missing value imputation steps to the conventional informatic workflow. When compared with the conventional workflow, Warpgroup made major improvements to the processed data. The coefficient of variation for peaks detected in replicate injections of a complex Escherichia coli extract was halved (a reduction of 19%). Integration regions across samples were much more robust. Additionally, many signals lost by the conventional workflow were ‘rescued’ by the Warpgroup refinement, thereby resulting in greater analyte coverage in the processed data. Availability and implementation: Warpgroup is an open source R package available on GitHub at github.com/nathaniel-mahieu/warpgroup. The package includes example data and XCMS compatibility wrappers for ease of use. Supplementary information: Supplementary data are available at Bioinformatics online. Contact: nathaniel.mahieu@wustl.edu or gjpattij@wustl.edu PMID:26424859
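
    As a toy illustration of the consensus-integration-bound idea only (Warpgroup itself derives bounds after aligning samples in warped chromatographic time, which is considerably more involved), per-sample peak bounds can be reduced to a median consensus and every sample re-integrated over that shared region:

        import numpy as np

        def consensus_bounds(bounds):
            # bounds: list of (start, end) integration bounds, one pair per sample
            b = np.asarray(bounds, dtype=float)
            return float(np.median(b[:, 0])), float(np.median(b[:, 1]))

        def integrate_region(rt, signal, lo, hi):
            # trapezoidal area of the chromatogram between retention times lo and hi
            m = (rt >= lo) & (rt <= hi)
            x, y = rt[m], signal[m]
            return float(((y[1:] + y[:-1]) / 2.0 * np.diff(x)).sum())

    Re-integrating every sample over the same consensus region is what reduces the integration-region variance the abstract describes.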

  3. [Optimization theory and practical application of membrane science technology based on resource of traditional Chinese medicine residue].

    PubMed

    Zhu, Hua-Xu; Duan, Jin-Ao; Guo, Li-Wei; Li, Bo; Lu, Jin; Tang, Yu-Ping; Pan, Lin-Mei

    2014-05-01

    Resource utilization of traditional Chinese medicine residue is an inevitable choice for forming new industries in the Chinese medicine industry characterized as modern, environmentally friendly, and intensive. Based on an analysis of the sources and main chemical composition of herb residue, and given the advantages of membrane science and technology in the pharmaceutical industry, especially membrane separation as an improvement over traditional extraction and separation processes, we propose that membrane science and technology is one of the most important choices in the technological design of traditional Chinese medicine resource industrialization. Traditional Chinese medicine residue is a very complex material system in composition and character, and a scientific and effective separation process is the key technology for its reuse. An integrated process can improve the productivity of the target product, enhance the purity of the product during separation, and accomplish many tasks that conventional separation struggles to achieve. As integrated separation technology offers simplified processing and reduced consumption, in line with trends in the modern pharmaceutical industry, membrane separation can provide a broad platform for process integration, and membrane separation together with integrated technology has broad application prospects for the resource utilization and industrialization of traditional Chinese medicine residue. In this paper we discuss the principles, methods, and practical applications of recovering effective components from herb residue using membrane separation and integrated technology; describe the application of membrane technology to the extraction, separation, concentration, and purification of traditional Chinese medicine residue; and systematically discuss the suitability and feasibility of membrane technology in the industrialization of traditional Chinese medicine resources.

  4. OpenICE medical device interoperability platform overview and requirement analysis.

    PubMed

    Arney, David; Plourde, Jeffrey; Goldman, Julian M

    2018-02-23

    We give an overview of OpenICE, an open source implementation of the ASTM standard F2761 for the Integrated Clinical Environment (ICE) that leverages medical device interoperability, together with an analysis of the clinical and non-functional requirements and community process that inspired its design.

  5. Unattended reaction monitoring using an automated microfluidic sampler and on-line liquid chromatography.

    PubMed

    Patel, Darshan C; Lyu, Yaqi Fara; Gandarilla, Jorge; Doherty, Steve

    2018-04-03

    In-process sampling and analysis is an important aspect of monitoring kinetic profiles and impurity formation or rejection, both in development and during commercial manufacturing. In pharmaceutical process development, the technology of choice for a substantial portion of this analysis is high-performance liquid chromatography (HPLC). Traditionally, sample extraction and preparation for reaction characterization have been performed manually. This can be time consuming, laborious, and impractical for long processes. Depending on the complexity of the sample preparation, different analysts can introduce variability, and in some cases the integrity of the sample can be compromised during handling. While commercial instruments are available for on-line monitoring with HPLC, they lack capabilities in many key areas: some do not integrate sampling with analysis, while others afford limited flexibility in sample preparation. The current offerings provide a limited number of unit operations for sample processing and no option for workflow customization. This work describes the development of a microfluidic automated program (MAP) that fully automates sample extraction, manipulation, and on-line LC analysis. The flexible system is controlled using an intuitive Microsoft Excel-based user interface. The autonomous system is capable of unattended reaction monitoring, allowing flexible unit operations and workflow customization to enable complex operations and on-line sample preparation. The automated system is shown to offer advantages over manual approaches in key areas while providing consistent and reproducible in-process data. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. The problems concerning the integration of very thin mirror shells

    NASA Astrophysics Data System (ADS)

    Basso, S.; Citterio, O.; Mazzoleni, F.; Pareschi, G.; Tagliaferri, G.; Valtolina, R.; Conconi, P.; Parodi, G.

    2009-08-01

    The necessity to reduce mass and increase collecting area requires that the optics become thinner and thinner; Simbol-X was a typical example of this trend. Such thinness makes the shells floppy and therefore unable to maintain the correct shape, so only negligible deformation may be introduced during the integration of the shells into the mechanical structure. The low thickness also means that the shells must be glued on both sides to reach good stiffness of the whole mirror module, and this introduces a set of mounting problems. At INAF - Osservatorio Astronomico di Brera an integration process has been developed in which the use of stiffening rings and of a temporary structure is the key to maintaining the right shape of the shell. In this article the results of the integration of the first three prototypes of the Simbol-X optics are presented; the description of the process and the analysis of the performance degradation during the integration are given in detail.

  7. Integrated continuous bioprocessing: Economic, operational, and environmental feasibility for clinical and commercial antibody manufacture.

    PubMed

    Pollock, James; Coffman, Jon; Ho, Sa V; Farid, Suzanne S

    2017-07-01

    This paper presents a systems approach to evaluating the potential of integrated continuous bioprocessing for monoclonal antibody (mAb) manufacture across a product's lifecycle from preclinical to commercial manufacture. The economic, operational, and environmental feasibility of alternative continuous manufacturing strategies were evaluated holistically using a prototype UCL decisional tool that integrated process economics, discrete-event simulation, environmental impact analysis, operational risk analysis, and multiattribute decision-making. The case study focused on comparing whole bioprocesses that used either batch, continuous or a hybrid combination of batch and continuous technologies for cell culture, capture chromatography, and polishing chromatography steps. The cost of goods per gram (COG/g), E-factor, and operational risk scores of each strategy were established across a matrix of scenarios with differing combinations of clinical development phase and company portfolio size. The tool outputs predict that the optimal strategy for early phase production and small/medium-sized companies is the integrated continuous strategy (alternating tangential flow filtration (ATF) perfusion, continuous capture, continuous polishing). However, the top ranking strategy changes for commercial production and companies with large portfolios to the hybrid strategy with fed-batch culture, continuous capture and batch polishing from a COG/g perspective. The multiattribute decision-making analysis highlighted that if the operational feasibility was considered more important than the economic benefits, the hybrid strategy would be preferred for all company scales. Further considerations outside the scope of this work include the process development costs required to adopt continuous processing. © 2017 The Authors. Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers. Biotechnol. Prog., 33:854-866, 2017.

  8. Integrated continuous bioprocessing: Economic, operational, and environmental feasibility for clinical and commercial antibody manufacture

    PubMed Central

    Pollock, James; Coffman, Jon; Ho, Sa V.

    2017-01-01

    This paper presents a systems approach to evaluating the potential of integrated continuous bioprocessing for monoclonal antibody (mAb) manufacture across a product's lifecycle from preclinical to commercial manufacture. The economic, operational, and environmental feasibility of alternative continuous manufacturing strategies were evaluated holistically using a prototype UCL decisional tool that integrated process economics, discrete‐event simulation, environmental impact analysis, operational risk analysis, and multiattribute decision‐making. The case study focused on comparing whole bioprocesses that used either batch, continuous or a hybrid combination of batch and continuous technologies for cell culture, capture chromatography, and polishing chromatography steps. The cost of goods per gram (COG/g), E‐factor, and operational risk scores of each strategy were established across a matrix of scenarios with differing combinations of clinical development phase and company portfolio size. The tool outputs predict that the optimal strategy for early phase production and small/medium‐sized companies is the integrated continuous strategy (alternating tangential flow filtration (ATF) perfusion, continuous capture, continuous polishing). However, the top ranking strategy changes for commercial production and companies with large portfolios to the hybrid strategy with fed‐batch culture, continuous capture and batch polishing from a COG/g perspective. The multiattribute decision‐making analysis highlighted that if the operational feasibility was considered more important than the economic benefits, the hybrid strategy would be preferred for all company scales. Further considerations outside the scope of this work include the process development costs required to adopt continuous processing. © 2017 The Authors Biotechnology Progress published by Wiley Periodicals, Inc. on behalf of American Institute of Chemical Engineers Biotechnol. Prog., 33:854–866, 2017 PMID:28480535

  9. All-integrated and highly sensitive paper based device with sample treatment platform for Cd2+ immunodetection in drinking/tap waters.

    PubMed

    López Marzo, Adaris M; Pons, Josefina; Blake, Diane A; Merkoçi, Arben

    2013-04-02

    Nowadays, the development of systems, devices, or methods that integrate several process steps into one multifunctional step for clinical, environmental, or industrial purposes constitutes a challenge for many ongoing research projects. Here, we present a new integrated paper-based cadmium (Cd(2+)) immunosensing system in lateral flow format, which integrates the sample treatment process with the analyte detection process. The principle of Cd(2+) detection is based on competitive reaction between the cadmium-ethylenediaminetetraacetic acid-bovine serum albumin-gold nanoparticle (Cd-EDTA-BSA-AuNP) conjugate deposited on the conjugation pad strip and the Cd-EDTA complex formed in the analysis sample for the same binding sites of the 2A81G5 monoclonal antibody (mAb), which is specific to Cd-EDTA but not to free Cd(2+) and is immobilized onto the test line. This platform operates without any sample pretreatment step for Cd(2+) detection thanks to an extra conjugation pad that ensures Cd(2+) complexation with EDTA and interference masking through ovalbumin (OVA). The detection and quantification limits found for the device were 0.1 and 0.4 ppb, respectively, the lowest limits reported so far for paper-based metal sensors. The accuracy of the device was evaluated by adding known quantities of Cd(2+) to different drinking water samples and analyzing the Cd(2+) content. Sample recoveries ranged from 95 to 105% and the coefficient of variation for the intermediate precision assay was less than 10%. In addition, the results obtained here were compared with those obtained by well-established inductively coupled plasma emission spectroscopy (ICPES) and by analysis of certified standard samples.
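
    The accuracy figures above follow from standard spike-recovery arithmetic; a small sketch with hypothetical measurements (the device's real outputs would come from the lateral-flow readout):

        import statistics

        def recovery_pct(measured_ppb, spiked_ppb):
            # percent recovery of a known spiked Cd2+ amount
            return 100.0 * measured_ppb / spiked_ppb

        def cv_pct(replicates):
            # coefficient of variation across replicate measurements, in percent
            return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

        print(recovery_pct(4.8, 5.0))        # 96.0 -> within the 95-105% window
        print(cv_pct([4.8, 5.1, 4.9, 5.0]))  # ~2.6 -> below the 10% criterion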

  10. Process-based Cost Estimation for Ramjet/Scramjet Engines

    NASA Technical Reports Server (NTRS)

    Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John

    2003-01-01

    Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology, and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottoms-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As product design progresses and matures, the lower-level, more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also on user-defined performance and schedule criteria. This implementation of full trade-study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and, rather than depending on weight as an input, actually estimates weight along with cost and schedule.
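
    The AHP step referenced above derives criterion weights from a pairwise comparison matrix. A minimal sketch using the common geometric-mean approximation of the principal eigenvector (the criteria and judgments below are hypothetical, not from the NASA/Boeing model):

        import numpy as np

        # Pairwise comparisons of three criteria (cost, performance, schedule)
        # on Saaty's 1-9 scale; A[i, j] says how strongly criterion i is
        # preferred over criterion j, with A[j, i] = 1 / A[i, j].
        A = np.array([[1.0, 3.0, 5.0],
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        # Row geometric means, normalized, approximate the principal
        # eigenvector and give the criterion weights for scoring alternatives.
        gm = A.prod(axis=1) ** (1.0 / A.shape[0])
        weights = gm / gm.sum()
        print(weights)  # roughly [0.65, 0.23, 0.12]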

  11. IOTA: integration optimization, triage and analysis tool for the processing of XFEL diffraction images.

    PubMed

    Lyubimov, Artem Y; Uervirojnangkoorn, Monarin; Zeldin, Oliver B; Brewster, Aaron S; Murray, Thomas D; Sauter, Nicholas K; Berger, James M; Weis, William I; Brunger, Axel T

    2016-06-01

    Serial femtosecond crystallography (SFX) uses an X-ray free-electron laser to extract diffraction data from crystals not amenable to conventional X-ray light sources owing to their small size or radiation sensitivity. However, a limitation of SFX is the high variability of the diffraction images that are obtained. As a result, it is often difficult to determine optimal indexing and integration parameters for the individual diffraction images. Presented here is a software package called IOTA, which uses a grid-search technique to determine optimal spot-finding parameters that can in turn affect the success of indexing and the quality of integration on an image-by-image basis. Integration results can be filtered using a priori information about the Bravais lattice and unit-cell dimensions and analyzed for unit-cell isomorphism, facilitating an improvement in subsequent data-processing steps.
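
    Schematically, the per-image grid search can be pictured as below; `try_index`, its parameter names, and the grids are placeholders standing in for the real spot-finding/indexing call, not IOTA's actual API:

        import itertools

        def grid_search_index(image, try_index,
                              spot_areas=(3, 5, 7, 9), spot_heights=(2, 4, 6)):
            # Try every spot-finding parameter pair and keep the highest-scoring
            # indexing solution. `try_index` must return None on failure or an
            # object with a numeric `score` attribute.
            best = None
            for area, height in itertools.product(spot_areas, spot_heights):
                result = try_index(image, min_spot_area=area, min_spot_height=height)
                if result is not None and (best is None or result.score > best.score):
                    best = result
            return best

    Running such a search independently for each image is what lets the parameters adapt to the high image-to-image variability of SFX data.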

  12. Integrated versus stand-alone second generation ethanol production from sugarcane bagasse and trash.

    PubMed

    Dias, Marina O S; Junqueira, Tassia L; Cavalett, Otávio; Cunha, Marcelo P; Jesus, Charles D F; Rossell, Carlos E V; Maciel Filho, Rubens; Bonomi, Antonio

    2012-01-01

    Ethanol production from lignocellulosic materials is often conceived considering independent, stand-alone production plants; in the Brazilian scenario, where part of the potential feedstock (sugarcane bagasse) for second generation ethanol production is already available at conventional first generation production plants, an integrated first and second generation production process seems to be the most obvious option. In this study stand-alone second generation ethanol production from surplus sugarcane bagasse and trash is compared with conventional first generation ethanol production from sugarcane and with integrated first and second generation; simulations were developed to represent the different technological scenarios, which provided data for economic and environmental analysis. Results show that the integrated first and second generation ethanol production process from sugarcane leads to better economic results when compared with the stand-alone plant, especially when advanced hydrolysis technologies and pentoses fermentation are included. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Cortical Hubs Form a Module for Multisensory Integration on Top of the Hierarchy of Cortical Networks

    PubMed Central

    Zamora-López, Gorka; Zhou, Changsong; Kurths, Jürgen

    2009-01-01

    Sensory stimuli entering the nervous system follow particular paths of processing, typically separated (segregated) from the paths of other modal information. However, sensory perception, awareness and cognition emerge from the combination of information (integration). The corticocortical networks of cats and macaque monkeys display three prominent characteristics: (i) modular organisation (facilitating the segregation), (ii) abundant alternative processing paths and (iii) the presence of highly connected hubs. Here, we study in detail the organisation and potential function of the cortical hubs by graph analysis and information theoretical methods. We find that the cortical hubs form a spatially delocalised, but topologically central module with the capacity to integrate multisensory information in a collaborative manner. With this, we resolve the underlying anatomical substrate that supports the simultaneous capacity of the cortex to segregate and to integrate multisensory information. PMID:20428515
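
    As a crude illustration of one ingredient of this kind of graph analysis (degree-based hub identification and the density of the hub-to-hub subnetwork; the paper's full analysis also uses information-theoretic measures), consider:

        import numpy as np

        def hub_module(adj, k):
            """Top-k degree nodes of an undirected 0/1 network and their density.

            adj must be a symmetric numpy adjacency matrix. A hub-to-hub density
            close to 1 indicates that the hubs form a densely linked module.
            """
            degree = adj.sum(axis=1)
            hubs = np.argsort(degree)[-k:]     # indices of the k highest degrees
            sub = adj[np.ix_(hubs, hubs)]
            possible = k * (k - 1)             # ordered hub pairs
            density = sub.sum() / possible if possible else 0.0
            return hubs, density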

  14. Stepwise Connectivity of the Modal Cortex Reveals the Multimodal Organization of the Human Brain

    PubMed Central

    Sepulcre, Jorge; Sabuncu, Mert R.; Yeo, Thomas B.; Liu, Hesheng; Johnson, Keith A.

    2012-01-01

    How human beings integrate information from external sources and internal cognition to produce a coherent experience is still not well understood. During the past decades, anatomical, neurophysiological, and neuroimaging research on multimodal integration has stood out in the effort to understand the perceptual binding properties of the brain. Areas in the human lateral occipito-temporal, prefrontal, and posterior parietal cortices have been associated with sensory multimodal processing. Even though this rather patchy organization of brain regions gives us a glimpse of the perceptual convergence, the articulation of the flow of information from modality-related to the more parallel cognitive processing systems remains elusive. Using a method called Stepwise Functional Connectivity analysis, the present study analyzes the functional connectome and transitions from primary sensory cortices to higher-order brain systems. We identify the large-scale multimodal integration network and essential connectivity axes for perceptual integration in the human brain. PMID:22855814

  15. Social Workers' Attempts to Navigate Among the Elderly, Their Families, and Foreign Home Care Workers in the Haredi Community.

    PubMed

    Freund, Anat; Band-Winterstein, Tova

    2017-02-01

    The study's aim is to examine social workers' experience in facilitating the integration of foreign home care workers (FHCWs) into the ultraorthodox Jewish (UOJ) community for the purpose of treating older adults. Using the qualitative-phenomenological approach, semistructured, in-depth interviews were conducted with 18 social workers in daily contact with UOJ older adult clients in the process of integrating FHCWs. Data analysis revealed three central themes: (a) integrating FHCWs into the aging UOJ family, with barriers and challenges in the interaction between the two worlds; (b) "even the rabbi has a FHCW", reflecting changing trends in caring for older adults; and (c) the social worker as mediator and facilitator of a successful relationship. Social workers play a central role, serving as a cultural bridge in the process of integrating FHCWs, as a way of addressing the needs of ultraorthodox elderly and their families, while also considering the needs of the foreign workers.

  16. SemanticSCo: A platform to support the semantic composition of services for gene expression analysis.

    PubMed

    Guardia, Gabriela D A; Ferreira Pires, Luís; da Silva, Eduardo G; de Farias, Cléver R G

    2017-02-01

    Gene expression studies often require the combined use of a number of analysis tools. However, manual integration of analysis tools can be cumbersome and error prone. To support a higher level of automation in the integration process, efforts have been made in the biomedical domain towards the development of semantic web services and supporting composition environments. Yet, most environments consider only the execution of simple service behaviours and require users to focus on technical details of the composition process. We propose a novel approach to the semantic composition of gene expression analysis services that addresses the shortcomings of the existing solutions. Our approach includes an architecture designed to support the service composition process for gene expression analysis, and a flexible strategy for the (semi-)automatic composition of semantic web services. Finally, we implement a supporting platform called SemanticSCo to realize the proposed composition approach and demonstrate its functionality by successfully reproducing a microarray study documented in the literature. The SemanticSCo platform provides support for the composition of RESTful web services semantically annotated using SAWSDL. Our platform also supports the definition of constraints/conditions regarding the order in which service operations should be invoked, thus enabling the definition of complex service behaviours. Our proposed solution for semantic web service composition takes into account the requirements of different stakeholders and addresses all phases of the service composition process. It also provides support for the definition of analysis workflows at a high level of abstraction, thus enabling users to focus on biological research issues rather than on the technical details of the composition process. The SemanticSCo source code is available at https://github.com/usplssb/SemanticSCo. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Multiscale analysis of the correlation of processing parameters on viscidity of composites fabricated by automated fiber placement

    NASA Astrophysics Data System (ADS)

    Han, Zhenyu; Sun, Shouzheng; Fu, Yunzhong; Fu, Hongya

    2017-10-01

    Viscidity is an important physical indicator for assessing the fluidity of resin, which helps the resin contact the fibers effectively and reduces manufacturing defects during the automated fiber placement (AFP) process. However, the effect of processing parameters on viscidity evolution during AFP has rarely been studied. In this paper, viscidity is analyzed at different scales using a multi-scale analysis method. First, the viscous dissipation energy (VDE) within a meso-unit under different processing parameters is assessed using the finite element method (FEM). Following a multi-scale energy transfer model, the meso-unit energy is used as the boundary condition for the microscopic analysis. The molecular structure of the micro-system is then built by the molecular dynamics (MD) method, and viscosity curves are obtained by integrating the stress autocorrelation function (SACF) with time. Finally, the correlation of processing parameters with viscosity is revealed using the gray relational analysis method (GRAM). A group of processing parameters is identified that achieves stable viscosity and better resin fluidity.
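
    Integrating the SACF with time follows the Green-Kubo relation for shear viscosity, eta = V / (kB * T) * integral of <sigma_xy(0) sigma_xy(t)> dt. A simplified numpy sketch, assuming SI units and a scalar off-diagonal stress time series from the MD run:

        import numpy as np

        def viscosity_green_kubo(stress_xy, dt, volume, temperature):
            # Green-Kubo shear viscosity from a stress time series.
            kB = 1.380649e-23                     # Boltzmann constant, J/K
            s = np.asarray(stress_xy, dtype=float)
            s = s - s.mean()
            n = len(s)
            # unbiased one-sided autocorrelation, averaged over time origins
            acf = np.correlate(s, s, mode="full")[n - 1:] / np.arange(n, 0, -1)
            # running trapezoidal integral; in practice the plateau value is read off
            running = np.cumsum((acf[1:] + acf[:-1]) / 2.0) * dt
            return volume / (kB * temperature) * running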

  18. Quality-assurance plan for the analysis of fluvial sediment by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory

    USGS Publications Warehouse

    Shreve, Elizabeth A.; Downs, Aimee C.

    2005-01-01

    This report describes laboratory procedures used by the U.S. Geological Survey Kentucky Water Science Center Sediment Laboratory for the processing and analysis of fluvial-sediment samples for concentration of sand and finer material. The report details the processing of a sediment sample through the laboratory from receiving the sediment sample, through the analytical process, to compiling results of the requested analysis. Procedures for preserving sample integrity, calibrating and maintaining of laboratory and field instruments and equipment, analyzing samples, internal quality assurance and quality control, and validity of the sediment-analysis results also are described. The report includes a list of references cited and a glossary of sediment and quality-assurance terms.

  19. A case study of the change process of integrating technology into an elementary science methods course from 1997 to 2003

    NASA Astrophysics Data System (ADS)

    Hsu, Pi-Sui

    The purpose of this qualitative case study was to provide a detailed description of the change process of technology integration into a science methods course, SCIED 458, as well as to interpret and analyze essential issues involved in the change process and examine how these factors influenced the change process. This study undertook qualitative research that employed case study research design. In-depth interviewing and review of the documents were two major data collection methods in this study. Participants included the three key faculty members in the science education program, a former graduate student who participated in writing the Link-to-Learn grant proposal, a former graduate student who taught SCIED 458, and two current graduate students who were teaching SCIED 458. A number of data analysis strategies were used in this study; these strategies included (1) coding for different periods of time and project categories and roles of people, (2) identifying themes, trends and coding for patterns, (3) reducing the data for analysis of trends and synthesizing and summarizing the data, and (4) integrating the data into one analytical framework. The findings indicated that this change process had evolved through the stages of adoption and diffusion, implementation, and institutionalization and a number of strategies facilitated the changes in individual stages, including the formation of a leadership team in the early stages, gradual adoption of technology tools, use of powerful pedagogy and methodology, the formation of a research community, and separation of technology training and subject teaching. The findings also indicated the essential factors and systems that interacted with each other and sustained the change process; these included a transformational shared leadership team, the formation of a learning and research community, reduced resistance of the elementary prospective teachers to technology, availability of university resources, involvement of the local school districts, support of the state department of education, recognition of the professional organizations, creation of partnerships with software companies, and technology advancements in society. A framework for integrating technology was presented to assist school reformers and instructional designers in initiating, implementing, and sustaining the changes due to technology integration in a systemic manner.

  20. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis

    PubMed Central

    Guardia, Gabriela D. A.; Pires, Luís Ferreira; Vêncio, Ricardo Z. N.; Malmegrim, Kelen C. R.; de Farias, Cléver R. G.

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis. PMID:26207740

  1. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    PubMed

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  2. Participative leadership in the management process of nightshift nursing.

    PubMed

    da Costa, Diovane Ghignatti; Dall'Agnol, Clarice Maria

    2011-01-01

    This is a qualitative, exploratory, descriptive study aiming to identify the perceptions of nurses regarding the leadership process and to analyze how this process takes place on the nightshift. Data collection was performed through the focus group technique, with 13 nightshift nurses of a public teaching hospital. Two categories that resulted from the thematic analysis are the focus of this article: the context of nightshift nursing work, and leadership from the perception of the nightshift nurses. Teamwork is an important condition for vitalizing the participatory perspective of the leadership process, given the necessary relationship of support and integration, above all in nightshift nursing work. This exercise challenges the nurse to solidify a culture that promotes spaces for reflection on the work, integrating leadership with a learning process constituted through constructive bonds between workers.

  3. Complex multidisciplinary system composition for aerospace vehicle conceptual design

    NASA Astrophysics Data System (ADS)

    Gonzalez, Lex

    Although there exists a vast amount of work concerning the analysis, design, and integration of aerospace vehicle systems, there is no standard for how this data and knowledge should be combined to create a synthesis system. Each institution creating a synthesis system has in-house vehicle and hardware components it is attempting to model and proprietary methods with which to model them. Consequently, synthesis systems begin as one-off creations meant to answer a specific problem. As the scope of a synthesis system grows to encompass more and more problems, so do its size and complexity; for a single synthesis system to answer multiple questions, the number of methods and method interfaces must increase. As a means to curtail the requirement that increasing an aircraft synthesis system's capability increases its size and complexity, this research effort focuses on the idea that each problem in aerospace requires its own analysis framework. By centering the methodology on matching an analysis framework to the problem being solved, the complexity of the analysis framework is decoupled from the complexity of the system that creates it. The derived methodology allows for the composition of complex multi-disciplinary systems (CMDS) through the automatic creation and implementation of system and disciplinary method interfaces. The CMDS Composition process follows a four-step methodology meant to take a problem definition and progress towards the creation of an analysis framework meant to answer said problem. The unique implementation of the CMDS Composition process takes user-selected disciplinary analysis methods and automatically integrates them together to create a syntactically composable analysis framework. As a means of assessing the validity of the CMDS Composition process, a prototype system (AVDDBMS) has been developed. AVDDBMS has been used to model the Generic Hypersonic Vehicle (GHV), an open source family of hypersonic vehicles originating from the Air Force Research Laboratory. AVDDBMS has been applied in three different ways to assess its validity: verification using GHV disciplinary data, validation using selected disciplinary analysis methods, and application of the CMDS Composition process to assess the design solution space for the GHV hardware. The research demonstrates the holistic effect that the selection of individual disciplinary analysis methods has on the structure and integration of the analysis framework.

  4. State of the art in pathology business process analysis, modeling, design and optimization.

    PubMed

    Schrader, Thomas; Blobel, Bernd; García-Rojo, Marcial; Daniel, Christel; Słodkowska, Janina

    2012-01-01

    For analyzing current workflows and processes, for improving them, for quality management and quality assurance, for integrating hardware and software components, but also for education, training, and communication between experts of different domains, modeling business processes in a pathology department is indispensable. The authors highlight three main processes in pathology: general diagnostics, cytology diagnostics, and autopsy. In this chapter, those processes are formally modeled and described in detail. Finally, specialized processes such as immunohistochemistry and frozen section are considered.

  5. Research and development of low cost processes for integrated solar arrays. Final report, April 15, 1974--January 14, 1976

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graham, C.D.; Kulkarni, S.; Louis, E.

    1976-05-01

    Results of a program to study process routes leading to low-cost, large-area integrated silicon solar array manufacture for terrestrial applications are reported. Potential processes for the production of solar-grade silicon are evaluated from thermodynamic, economic, and technical feasibility points of view. Upgrading of the present arc-furnace process is found most favorable. Experimental studies of the Si/SiF4 transport and purification process show considerable impurity removal and reasonable transport rates. Silicon deformation experiments indicate that production of silicon sheet by rolling at 1350 °C is feasible. Significant recrystallization by the strain-anneal technique has been observed. Experimental recrystallization studies using an electron beam line source are discussed. A maximum recrystallization velocity of approximately 9 m/hr is calculated for silicon sheet. A comparative process rating technique based on detailed cost analysis is presented.

  6. Does the Component Processes Task Assess Text-Based Inferences Important for Reading Comprehension? A Path Analysis in Primary School Children

    PubMed Central

    Wassenburg, Stephanie I.; de Koning, Björn B.; de Vries, Meinou H.; van der Schoot, Menno

    2016-01-01

    Using a component processes task (CPT) that differentiates between higher-level cognitive processes of reading comprehension provides important advantages over commonly used general reading comprehension assessments. The present study contributes to further development of the CPT by evaluating the relative contributions of its components (text memory, text inferencing, and knowledge integration) and working memory to general reading comprehension within a single study using path analyses. Participants were 173 third- and fourth-grade children. As hypothesized, knowledge integration was the only component of the CPT that directly contributed to reading comprehension, indicating that the text-inferencing component did not assess inferential processes related to reading comprehension. Working memory was a significant predictor of reading comprehension over and above the component processes. Future research should focus on finding ways to ensure that the text-inferencing component taps into processes important for reading comprehension. PMID:27378989

  7. Integration of geological remote-sensing techniques in subsurface analysis

    USGS Publications Warehouse

    Taranik, James V.; Trautwein, Charles M.

    1976-01-01

    Geological remote sensing is defined as the study of the Earth utilizing electromagnetic radiation which is either reflected or emitted from its surface in wavelengths ranging from 0.3 micrometre to 3 metres. The natural surface of the Earth is composed of a diversified combination of surface cover types, and geologists must understand the characteristics of surface cover types to successfully evaluate remotely-sensed data. In some areas landscape surface cover changes throughout the year, and analysis of imagery acquired at different times of year can yield additional geological information. Integration of different scales of analysis allows landscape features to be effectively interpreted. Interpretation of the static elements displayed on imagery is referred to as an image interpretation. Image interpretation is dependent upon: (1) the geologist's understanding of the fundamental aspects of image formation, and (2) his ability to detect, delineate, and classify image radiometric data; recognize radiometric patterns; and identify landscape surface characteristics as expressed on imagery. A geologic interpretation integrates surface characteristics of the landscape with subsurface geologic relationships. Development of a geologic interpretation from imagery is dependent upon: (1) the geologist's ability to interpret geomorphic processes from their static surface expression as landscape characteristics on imagery, and (2) his ability to conceptualize the dynamic processes responsible for the evolution of interpreted geologic relationships (his ability to develop geologic models). The integration of geologic remote-sensing techniques in subsurface analysis is illustrated by the development of an exploration model for ground water in the Tucson area of Arizona, and by the development of an exploration model for mineralization in southwest Idaho.

  8. Analysis of power sector efficiency improvements for an integrated utility planning process in Costa Rica

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waddle, D.B.; MacDonald, J.M.

    1990-01-01

    In an effort to analyze and document the potential for power sector efficiency improvements from generation to end-use, the Agency for International Development and the Government of Costa Rica are jointly conducting an integrated power sector efficiency analysis. The potential for energy and cost savings in power plants, transmission and distribution, and demand-side management programs is being evaluated. The product of this study will be an integrated investment plan for the Instituto Costarricense de Electricidad, incorporating both supply- and demand-side investment options. This paper presents the methodology employed in the study, as well as preliminary estimates of its results. 14 refs., 4 figs., 5 tabs.

  9. Extended device profiles and testing procedures for the approval process of integrated medical devices using the IEEE 11073 communication standard.

    PubMed

    Janß, Armin; Thorn, Johannes; Schmitz, Malte; Mildner, Alexander; Dell'Anna-Pudlik, Jasmin; Leucker, Martin; Radermacher, Klaus

    2018-02-23

    Nowadays, only closed and proprietary integrated operating room systems (IORS) from big manufacturers are available on the market. Hence, the interconnection of components from third-party vendors is only possible with increased time and costs. In the context of the German Federal Ministry of Education and Research (BMBF)-funded project OR.NET (2012-2016), the open integration of medical devices from different manufacturers was addressed. An integrated operating theater based on the open communication standard IEEE 11073 shall give clinical operators the opportunity to choose medical devices independently of the manufacturer. This approach would be advantageous especially for hospital operators and small- and medium-sized enterprises (SME) manufacturing medical devices. Current standards and concepts regarding technical feasibility and the approval process do not cope with the requirements for a modular integration of medical devices in the operating room (OR) based on an open communication standard. Therefore, innovative approval strategies and corresponding certification and test procedures, which cover current legal and normative standards, have to be developed in order to support the future risk management and usability engineering process of openly integrated medical devices in the OR. The use of standardized device and service profiles and a three-step testing procedure, comprising conformity, interoperability, and integration tests, is described in this paper; it shall support manufacturers in integrating their medical devices without disclosing the devices' risk analysis and related confidential expertise or proprietary information.

  10. Normative Data on Audiovisual Speech Integration Using Sentence Recognition and Capacity Measures

    PubMed Central

    Altieri, Nicholas; Hudock, Daniel

    2016-01-01

    Objective: The ability to use visual speech cues and integrate them with auditory information is important, especially in noisy environments and for hearing-impaired (HI) listeners. Providing data on measures of integration skills that encompass accuracy and processing speed will benefit researchers and clinicians. Design: The study consisted of two experiments: first, accuracy scores were obtained using CUNY sentences, and capacity measures that assessed reaction-time distributions were obtained from a monosyllabic word recognition task. Study Sample: We report data on two measures of integration obtained from a sample comprised of 86 young and middle-age adult listeners. Results: To summarize our results, capacity showed a positive correlation with accuracy measures of audiovisual benefit obtained from sentence recognition. More relevant, factor analysis indicated that a single-factor model captured audiovisual speech integration better than models containing more factors. Capacity exhibited strong loadings on the factor, while the accuracy-based measures from sentence recognition exhibited weaker loadings. Conclusions: Results suggest that a listener’s integration skills may be assessed optimally using a measure that incorporates both processing speed and accuracy. PMID:26853446

  11. Normative data on audiovisual speech integration using sentence recognition and capacity measures.

    PubMed

    Altieri, Nicholas; Hudock, Daniel

    2016-01-01

    The ability to use visual speech cues and integrate them with auditory information is important, especially in noisy environments and for hearing-impaired (HI) listeners. Providing data on measures of integration skills that encompass accuracy and processing speed will benefit researchers and clinicians. The study consisted of two experiments: first, accuracy scores were obtained using City University of New York (CUNY) sentences, and capacity measures that assessed reaction-time distributions were obtained from a monosyllabic word recognition task. We report data on two measures of integration obtained from a sample comprised of 86 young and middle-age adult listeners. To summarize our results, capacity showed a positive correlation with accuracy measures of audiovisual benefit obtained from sentence recognition. More relevant, factor analysis indicated that a single-factor model captured audiovisual speech integration better than models containing more factors. Capacity exhibited strong loadings on the factor, while the accuracy-based measures from sentence recognition exhibited weaker loadings. Results suggest that a listener's integration skills may be assessed optimally using a measure that incorporates both processing speed and accuracy.

  12. Self-aligned blocking integration demonstration for critical sub-30nm pitch Mx level patterning with EUV self-aligned double patterning

    NASA Astrophysics Data System (ADS)

    Raley, Angélique; Lee, Joe; Smith, Jeffrey T.; Sun, Xinghua; Farrell, Richard A.; Shearer, Jeffrey; Xu, Yongan; Ko, Akiteru; Metz, Andrew W.; Biolsi, Peter; Devilliers, Anton; Arnold, John; Felix, Nelson

    2018-04-01

    We report a sub-30nm pitch self-aligned double patterning (SADP) integration scheme with EUV lithography coupled with self-aligned block technology (SAB) targeting the back end of line (BEOL) metal line patterning applications for logic nodes beyond 5nm. The integration demonstration is a validation of the scalability of a previously reported flow, which used 193nm immersion SADP targeting a 40nm pitch with the same material sets (Si3N4 mandrel, SiO2 spacer, Spin on carbon, spin on glass). The multi-color integration approach is successfully demonstrated and provides a valuable method to address overlay concerns and more generally edge placement error (EPE) as a whole for advanced process nodes. Unbiased LER/LWR analysis comparison between EUV SADP and 193nm immersion SADP shows that both integrations follow the same trend throughout the process steps. While EUV SADP shows increased LER after mandrel pull, metal hardmask open and dielectric etch compared to 193nm immersion SADP, the final process performance is matched in terms of LWR (1.08nm 3 sigma unbiased) and is only 6% higher than 193nm immersion SADP for average unbiased LER. Using EUV SADP enables almost doubling the line density while keeping most of the remaining processes and films unchanged, and provides a compelling alternative to other multipatterning integrations, which present their own sets of challenges.
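
    The "unbiased" LER/LWR figures above refer to roughness values with SEM metrology noise removed; because the noise adds in quadrature to the true roughness, the correction is a simple quadrature subtraction (the numbers below are hypothetical, not the paper's measurements):

        import math

        def unbiased_roughness(sigma_measured, sigma_noise):
            # Remove image noise from a measured 3-sigma roughness value (nm).
            # Noise adds in quadrature, so the unbiased roughness is
            # sqrt(measured^2 - noise^2), floored at zero.
            return math.sqrt(max(sigma_measured**2 - sigma_noise**2, 0.0))

        print(unbiased_roughness(1.40, 0.90))  # ~1.07 nm, hypothetical values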

  13. CMOS Time-Resolved, Contact, and Multispectral Fluorescence Imaging for DNA Molecular Diagnostics

    PubMed Central

    Guo, Nan; Cheung, Ka Wai; Wong, Hiu Tung; Ho, Derek

    2014-01-01

    Instrumental limitations such as bulkiness and high cost prevent the fluorescence technique from becoming ubiquitous for point-of-care deoxyribonucleic acid (DNA) detection and other in-field molecular diagnostics applications. The complementary metal-oxide-semiconductor (CMOS) technology, benefiting from process scaling, provides several advanced capabilities such as high integration density, high-resolution signal processing, and low power consumption, enabling sensitive, integrated, and low-cost fluorescence analytical platforms. In this paper, CMOS time-resolved, contact, and multispectral imaging are reviewed. Recently reported CMOS fluorescence analysis microsystem prototypes are surveyed to highlight the present state of the art. PMID:25365460

  14. LANDSAT demonstration/application and GIS integration in south central Alaska

    NASA Technical Reports Server (NTRS)

    Burns, A. W.; Derrenbacher, W.

    1981-01-01

    Automated geographic information systems were developed for two sites in Southcentral Alaska to serve as tests for both the process of integrating classified LANDSAT data into a comprehensive environmental data base and the process of using automated information in land capability/suitability analysis and environmental planning. The Big Lake test site, located approximately 20 miles north of the City of Anchorage, comprises an area of approximately 150 square miles. The Anchorage Hillside test site, lying approximately 5 miles southeast of the central part of the city, extends over an area of some 25 square miles. Map construction and content is described.

  15. S-193 scatterometer transfer function analysis for data processing

    NASA Technical Reports Server (NTRS)

    Johnson, L.

    1974-01-01

    A mathematical model for converting raw data measurements of the S-193 scatterometer into processed values of radar scattering coefficient is presented. The argument is based on an approximation derived from the radar equation and the actual operating principles of the S-193 scatterometer hardware. Possible error sources are inaccuracies in transmitted wavelength, range, antenna illumination integrals, and the instrument itself. The dominant source of error in the calculation of the scattering coefficient is the accuracy of the range. All other factors, with the possible exception of the illumination integral, are not considered to cause significant error in the calculation of the scattering coefficient.
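
    The dominance of range error follows from the fourth-power range dependence in the radar equation. As a generic sketch (a textbook form of the monostatic radar equation, not the S-193-specific transfer function; all arguments hypothetical):

        import math

        def sigma0(p_r, p_t, gain, wavelength, rng, area_ill):
            # Textbook monostatic radar equation solved for the scattering
            # coefficient: sigma0 = Pr*(4*pi)**3*R**4 / (Pt*G**2*lambda**2*A_ill)
            return (p_r * (4 * math.pi) ** 3 * rng ** 4
                    / (p_t * gain ** 2 * wavelength ** 2 * area_ill))

        # sigma0 scales as R**4, so a 1% range error alone shifts the result by
        # about 4%, which is why range accuracy dominates the error budget.
        print(1.01 ** 4)  # ~1.0406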

  16. Bio-jETI: a service integration, design, and provisioning platform for orchestrated bioinformatics processes.

    PubMed

    Margaria, Tiziana; Kubczak, Christian; Steffen, Bernhard

    2008-04-25

    With Bio-jETI, we introduce a service platform for interdisciplinary work on biological application domains and illustrate its use in a concrete application concerning statistical data processing in R and xcms for an LC/MS analysis of FAAH gene knockout. Bio-jETI uses the jABC environment for service-oriented modeling and design as a graphical process modeling tool and the jETI service integration technology for remote tool execution. As a service definition and provisioning platform, Bio-jETI has the potential to become a core technology in interdisciplinary service orchestration and technology transfer. Domain experts, like biologists not trained in computer science, directly define complex service orchestrations as process models and use efficient and complex bioinformatics tools in a simple and intuitive way.

  17. Development of a plan for automating integrated circuit processing

    NASA Technical Reports Server (NTRS)

    1971-01-01

    The operations analysis and equipment evaluations pertinent to the design of an automated production facility capable of manufacturing beam-lead CMOS integrated circuits are reported. The overall plan shows the approximate cost of major equipment, production rate and performance capability, flexibility, and special maintenance requirements. Direct computer control is compared with supervisory-mode operations. The plan is limited to wafer processing operations, from the starting wafer to the finished beam-lead die after separation etching. The work already accomplished in implementing various automation schemes and the types of equipment available for immediate automation are described. The plan is kept general, so that small shops and large production units alike may benefit. Examples of major types of automated processing machines are shown to illustrate the general concepts of automated wafer processing.

  18. Final report of coordination and cooperation with the European Union on embankment failure analysis

    USDA-ARS?s Scientific Manuscript database

    There has been an emphasis in the European Union (EU) community on the investigation of extreme flood processes and the uncertainties related to these processes. Over a 3-year period, the EU and the U.S. dam safety community (1) coordinated their efforts and collected information needed to integrate...

  19. Glass sealing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brow, R.K.; Kovacic, L.; Chambers, R.S.

    1996-04-01

    Hermetic glass sealing technologies developed for weapons component applications can be utilized for the design and manufacture of fuel cells. Design and processing of a seal are optimized through an integrated approach based on glass composition research, finite element analysis, and sealing process definition. Glass sealing procedures are selected to accommodate the limits imposed by glass composition and predicted calculations.

  20. Principles of Integrative Modelling at Studying of Plasma and Welding Processes

    ERIC Educational Resources Information Center

    Anakhov, Sergey V.; Perminov, Evgeniy ?.; Dzyubich, Denis K.; Yarushina, Maria A.; Tarasova, Yuliya A.

    2016-01-01

    The relevance of the research problem is conditioned by the need to introduce modern technologies into the educational process and by the insufficient adaptation of higher school teachers to the applied information and automated procedures used in education and science. The purpose of the publication consists in the analysis of automated…

  1. Relationship of Class-Size to Classroom Processes, Teacher Satisfaction and Pupil Affect: A Meta-Analysis.

    ERIC Educational Resources Information Center

    Smith, Mary Lee; Glass, Gene V.

    Using data from previously completed research, the authors of this report attempted to examine the relationship between class size and measures of outcomes such as student attitudes and behavior, classroom processes and learning environment, and teacher satisfaction. The authors report that statistical integration of the existing research…

  2. Building a Model of Employee Training through Holistic Analysis of Biological, Psychological, and Sociocultural Factors

    ERIC Educational Resources Information Center

    Schenck, Andrew

    2015-01-01

    While theories of adult learning and motivation are often framed as being either biological, psychological, or sociocultural, they represent a more complex, integral process. To gain a more holistic perspective of this process, a study was designed to concurrently investigate relationships between a biological factor (age), psychological factors…

  3. European Vocational Education and Training: A Prerequisite for Mobility?

    ERIC Educational Resources Information Center

    Rauner, Felix

    2008-01-01

    Purpose: The purpose of this paper is to demonstrate that the internationalisation of nearly all spheres of society and the process of European integration will lead to the development of a European vocational education and training (VET) architecture. Design/methodology/approach: The analysis of the "Copenhagen process" is based on…

  4. The Development of Group Interaction Patterns: How Groups become Adaptive, Generative, and Transformative Learners

    ERIC Educational Resources Information Center

    London, Manuel; Sessa, Valerie I.

    2007-01-01

    This article integrates the literature on group interaction process analysis and group learning, providing a framework for understanding how patterns of interaction develop. The model proposes how adaptive, generative, and transformative learning processes evolve and vary in their functionality. Environmental triggers for learning, the group's…

  5. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    ERIC Educational Resources Information Center

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  6. Learning Actions, Objects and Types of Interaction: A Methodological Analysis of Expansive Learning among Pre-Service Teachers

    ERIC Educational Resources Information Center

    Rantavuori, Juhana; Engeström, Yrjö; Lipponen, Lasse

    2016-01-01

    The paper analyzes a collaborative learning process among Finnish pre-service teachers planning their own learning in a self-regulated way. The study builds on cultural-historical activity theory and the theory of expansive learning, integrating for the first time an analysis of learning actions and an analysis of types of interaction. We examine…

  7. Execution Of Systems Integration Principles During Systems Engineering Design

    DTIC Science & Technology

    2016-09-01

    This thesis discusses integration failures observed in DOD and non-DOD systems, such as inadequate stakeholder analysis, incomplete problem space and design ... design, development, test and deployment of a system. A lifecycle structure consists of phases within a methodology or process model. There are many ... investigate design decisions without the need to commit to physical forms; "experimental investigation using a model yields design or operational

  8. International Space Station Alpha (ISSA) Integrated Traffic Model

    NASA Technical Reports Server (NTRS)

    Gates, Robert E.

    1994-01-01

    The paper discusses the development process of the International Space Station Alpha (ISSA) Integrated Traffic Model, a subsystem analysis tool utilized in the ISSA design analysis cycles. Fast-track prototyping, via spreadsheets, of the detailed relationships between daily crew and station consumables, propellant needs, maintenance requirements, and crew rotation provides adequate benchmarks to assess cargo vehicle design and performance characteristics.

  9. A Design Architecture for an Integrated Training System Decision Support System

    DTIC Science & Technology

    1990-07-01

    Sensory modes include visual, auditory, tactile, or kinesthetic; performance categories include time to complete, speed of response, or correct action ... procedures, and finally application and examples from the aviation proponency with emphasis on the LHX program. Appendix B is a complete bibliography ... integrated analysis of ITS development. The approach was designed to provide an accurate and complete representation of the ITS development process and

  10. Analysis of Horizontal Integration within the Program Executive Office, Integrated Warfare Systems

    DTIC Science & Technology

    2006-09-01

    intervention approach is likely to be effective at changing social relationships and behaviors. Change occurs at the basis of qualitative social time ... four types of planned change interventions deal with organizational structures, processes, beliefs, and social relationships. Kotter’s change stages ... effective at changing social relationships, which favors the long-term. Since the commanding intervention does not deal with the social aspect

  11. Processing Pathways in Mental Arithmetic—Evidence from Probabilistic Fiber Tracking

    PubMed Central

    Glauche, Volkmar; Weiller, Cornelius; Willmes, Klaus

    2013-01-01

    Numerical cognition is a case of multi-modular and distributed cerebral processing. So far neither the anatomo-functional connections between the cortex areas involved nor their integration into established frameworks such as the differentiation between dorsal and ventral processing streams have been specified. The current study addressed this issue combining a re-analysis of previously published fMRI data with probabilistic fiber tracking data from an independent sample. We aimed at differentiating neural correlates and connectivity for relatively easy and more difficult addition problems in healthy adults and their association with either rather verbally mediated fact retrieval or magnitude manipulations, respectively. The present data suggest that magnitude- and fact retrieval-related processing seem to be subserved by two largely separate networks, both of them comprising dorsal and ventral connections. Importantly, these networks not only differ in localization of activation but also in the connections between the cortical areas involved. However, it has to be noted that even though seemingly distinct anatomically, these networks operate as a functionally integrated circuit for mental calculation as revealed by a parametric analysis of brain activation. PMID:23383194

  12. The levels of analysis revisited

    PubMed Central

    MacDougall-Shackleton, Scott A.

    2011-01-01

    The term levels of analysis has been used in several ways: to distinguish between ultimate and proximate levels, to categorize different kinds of research questions and to differentiate levels of reductionism. Because questions regarding ultimate function and proximate mechanisms are logically distinct, I suggest that distinguishing between these two levels is the best use of the term. Integrating across levels in research has potential risks, but many benefits. Consideration at one level can help generate novel hypotheses at the other, define categories of behaviour and set criteria that must be addressed. Taking an adaptationist stance thus strengthens research on proximate mechanisms. Similarly, it is critical for researchers studying adaptation and function to have detailed knowledge of proximate mechanisms that may constrain or modulate evolutionary processes. Despite the benefits of integrating across ultimate and proximate levels, failure to clearly identify levels of analysis, and whether or not hypotheses are exclusive alternatives, can create false debates. Such non-alternative hypotheses may occur between or within levels, and are not limited to integrative approaches. In this review, I survey different uses of the term levels of analysis and the benefits of integration, and highlight examples of false debate within and between levels. The best integrative biology reciprocally uses ultimate and proximate hypotheses to generate a more complete understanding of behaviour. PMID:21690126

  13. A hybrid modeling system designed to support decision making in the optimization of extrusion of inhomogeneous materials

    NASA Astrophysics Data System (ADS)

    Kryuchkov, D. I.; Zalazinsky, A. G.

    2017-12-01

    Mathematical models and a hybrid modeling system are developed to implement the experimental-calculation method for the engineering analysis and optimization of the plastic deformation of inhomogeneous materials, with the purpose of improving metal-forming processes and machines. The software solution integrates Abaqus/CAE with a subroutine for mathematical data processing, making use of Python libraries and a knowledge base. Practical application of the software solution is exemplified by modeling the process of extrusion of a bimetallic billet. The results of the engineering analysis and optimization of the extrusion process are shown, with the material damage being monitored.
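
    The abstract does not spell out its damage model, so the following is only a plausible Python sketch of the kind of post-processing subroutine it describes: integrating a Cockcroft-Latham-style damage indicator (our assumption, not the paper's stated criterion) over a strain history exported from the FE run:

        # Hedged sketch: monitoring accumulated ductile damage along a tracked
        # material point's history exported from an Abaqus run. The
        # Cockcroft-Latham criterion is an illustrative choice only.
        import numpy as np

        def cockcroft_latham(max_principal_stress, eq_plastic_strain, critical_value):
            """Accumulate damage D = integral(max(sigma1, 0) d eps_eq) / C."""
            sigma = np.maximum(max_principal_stress, 0.0)
            # trapezoidal integration of sigma1 over equivalent plastic strain
            return np.trapz(sigma, eq_plastic_strain) / critical_value

        # Illustrative (made-up) history for one point in the bimetallic billet
        eps = np.linspace(0.0, 1.2, 50)            # equivalent plastic strain
        sig1 = 300e6 * np.exp(-eps) + 50e6         # max principal stress (Pa)
        D = cockcroft_latham(sig1, eps, critical_value=200e6)
        print(f"damage index D = {D:.2f}  (D >= 1 would flag likely fracture)")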

  14. Integration of decentralized clinical data in a data warehouse: a service-oriented design and realization.

    PubMed

    Hanss, Sabine; Schaaf, T; Wetzel, T; Hahn, C; Schrader, T; Tolxdorff, T

    2009-01-01

    In this paper we present a general concept for, and describe the difficulties of, integrating data from various clinical partners in one data warehouse, using the Open European Nephrology Science Center (OpEN.SC) as an example. This includes a requirements analysis of the data integration process and also a design according to these requirements. The conceptual approach is based on the Rational Unified Process (RUP) and the paradigm of Service-Oriented Architecture (SOA). Because we have to enhance our partners' confidence in the OpEN.SC system, and with it their willingness to participate, important requirements are controllability, transparency and security for all partners. Reusable and fine-grained components were found to be necessary when working with diverse data sources. With SOA the requested reusability is implemented easily. A key step in the development of a data integration process within a health information system like OpEN.SC is to analyze the requirements. To show that this is not only theoretical work, we present a design, developed with RUP and SOA, which fulfills these requirements.
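
    As a rough illustration of the fine-grained, reusable service components such requirements call for, here is a minimal Python sketch; the class names, fields, and conversion step are hypothetical, not taken from OpEN.SC:

        # Minimal sketch of fine-grained, reusable integration services.
        # All names and the example records are hypothetical.
        from abc import ABC, abstractmethod

        class IntegrationService(ABC):
            """One small, auditable step in a clinical data integration pipeline."""

            @abstractmethod
            def run(self, records: list[dict]) -> list[dict]: ...

        class Pseudonymizer(IntegrationService):
            """Security requirement: strip direct identifiers before loading."""
            def run(self, records):
                return [{k: v for k, v in r.items() if k not in ("name", "ssn")}
                        for r in records]

        class UnitNormalizer(IntegrationService):
            """Transparency requirement: normalize units identically for all partners."""
            def run(self, records):
                for r in records:
                    if r.get("creatinine_unit") == "umol/L":   # convert to mg/dL
                        r["creatinine"] = round(r["creatinine"] / 88.4, 2)
                        r["creatinine_unit"] = "mg/dL"
                return records

        def pipeline(records, services):
            for svc in services:        # each step can be logged and audited alone
                records = svc.run(records)
            return records

        data = [{"name": "X", "creatinine": 97.2, "creatinine_unit": "umol/L"}]
        print(pipeline(data, [Pseudonymizer(), UnitNormalizer()]))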

  15. A Study of the Ozone Formation by Ensemble Back Trajectory-process Analysis Using the Eta-CMAQ Forecast Model over the Northeastern U.S. During the 2004 ICARTT Period

    EPA Science Inventory

    The integrated process rates (IPR) estimated by the Eta-CMAQ model at grid cells along the trajectory of the air mass transport path were analyzed to quantitatively investigate the relative importance of physical and chemical processes for O3 formation and evolution ov...

  16. Integrating Information: An Analysis of the Processes Involved and the Products Generated in a Written Synthesis Task

    ERIC Educational Resources Information Center

    Sole, Isabel; Miras, Mariana; Castells, Nuria; Espino, Sandra; Minguela, Marta

    2013-01-01

    The case study reported here explores the processes involved in producing a written synthesis of three history texts and their possible relation to the characteristics of the texts produced and the degree of comprehension achieved following the task. The processes carried out by 10 final-year compulsory education students (15 and 16 years old) to…

  17. A collaborative design method to support integrated care. An ICT development method containing continuous user validation improves the entire care process and the individual work situation

    PubMed Central

    Scandurra, Isabella; Hägglund, Maria

    2009-01-01

    Introduction: Integrated care involves different professionals, belonging to different care provider organizations, and requires immediate and ubiquitous access to patient-oriented information, supporting an integrated view on the care process [1]. Purpose: To present a method for development of usable and work process-oriented information and communication technology (ICT) systems for integrated care. Theory and method: Based on Human-Computer Interaction Science, and in particular Participatory Design [2], we present a new collaborative design method in the context of health information systems (HIS) development [3]. This method implies a thorough analysis of the entire interdisciplinary cooperative work and a transformation of the results into technical specifications, via user-validated scenarios, prototypes and use cases, ultimately leading to the development of appropriate ICT for the variety of occurring work situations for different user groups, or professions, in integrated care. Results and conclusions: Application of the method in homecare of the elderly resulted in an HIS that was well adapted to the intended user groups. Conducted in multi-disciplinary seminars, the method captured and validated user needs and system requirements for different professionals, work situations, and environments not only for current work; it also aimed to improve collaboration in future (ICT-supported) work processes. A holistic view of the entire care process was obtained and supported through different views of the HIS for different user groups, resulting in improved work in the entire care process as well as for each collaborating profession [4].

  18. Design and Implementation of Hydrologic Process Knowledge-base Ontology: A case study for the Infiltration Process

    NASA Astrophysics Data System (ADS)

    Elag, M.; Goodall, J. L.

    2013-12-01

    Hydrologic modeling often requires the re-use and integration of models from different disciplines to simulate complex environmental systems. Component-based modeling introduces a flexible approach for integrating physical-based processes across disciplinary boundaries. Several hydrologic-related modeling communities have adopted the component-based approach for simulating complex physical systems by integrating model components across disciplinary boundaries in a workflow. However, it is not always straightforward to create these interdisciplinary models due to the lack of sufficient knowledge about a hydrologic process. This shortcoming is a result of using informal methods for organizing and sharing information about a hydrologic process. A knowledge-based ontology provides such standards and is considered the ideal approach for overcoming this challenge. The aims of this research are to present the methodology used in analyzing the basic hydrologic domain in order to identify hydrologic processes, the ontology itself, and how the proposed ontology is integrated with the Water Resources Component (WRC) ontology. The proposed ontology standardizes the definitions of a hydrologic process, the relationships between hydrologic processes, and their associated scientific equations. The objective of the proposed Hydrologic Process (HP) Ontology is to advance the idea of creating a unified knowledge framework for components' metadata by introducing a domain-level ontology for hydrologic processes. The HP ontology is a step toward an explicit and robust domain knowledge framework that can be evolved through the contribution of domain users. Analysis of the hydrologic domain is accomplished using the Formal Concept Approach (FCA), in which the infiltration process, an important hydrologic process, is examined. Two infiltration methods, the Green-Ampt and Philip's methods, were used to demonstrate the implementation of information in the HP ontology. Furthermore, a SPARQL service is provided for semantic-based querying of the ontology.
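
    As a hedged illustration of the kind of semantic query such a SPARQL service could answer, the following Python/rdflib sketch builds a toy graph and asks which methods implement the infiltration process; the namespace and property names are invented, not the HP ontology's actual vocabulary:

        # Hedged sketch of semantic querying over an infiltration-process graph.
        # The hp# namespace and predicates are hypothetical placeholders.
        from rdflib import Graph, Namespace, Literal

        HP = Namespace("http://example.org/hp#")   # hypothetical ontology namespace
        g = Graph()
        g.add((HP.GreenAmpt, HP.implementsProcess, HP.Infiltration))
        g.add((HP.Philip, HP.implementsProcess, HP.Infiltration))
        g.add((HP.GreenAmpt, HP.hasEquationLabel, Literal("Green-Ampt")))

        q = """
        PREFIX hp: <http://example.org/hp#>
        SELECT ?method WHERE { ?method hp:implementsProcess hp:Infiltration . }
        """
        for row in g.query(q):
            print(row.method)    # -> the Green-Ampt and Philip method URIs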

  19. Impediments to integrated urban stormwater management: the need for institutional reform.

    PubMed

    Brown, Rebekah R

    2005-09-01

    It is now well established that the traditional practice of urban stormwater management contributes to the degradation of receiving waterways, and this practice has more recently been critiqued for facilitating the wastage of a valuable water resource. However, despite significant advances in alternative "integrated urban stormwater management" techniques and processes over the last 20 years, wide-scale implementation has been limited. This problem is indicative of broader institutional impediments that are beyond current concerns of strengthening technological and planning process expertise. Presented here is an analysis of the institutionalization of urban stormwater management across Sydney with the objective of scoping institutional impediments to more sustainable management approaches. The analysis reveals that the inertia within the public administration of urban stormwater inherently privileges and perpetuates traditional stormwater management practices at implementation. This inertia is characterized by historically entrained forms of technocratic institutional power and expertise, values and leadership, and structure and jurisdiction posing significant impediments to change and the realization of integrated urban stormwater management. These insights strongly point to the need for institutional change specifically directed at fostering horizontal integration of the various functions of the existing administrative regime. This would need to be underpinned with capacity-building interventions targeted at enabling a learning culture that values integration and participatory decision making. These insights also provide guideposts for assessing the institutional and capacity development needs for improving urban water management practices in other contexts.

  20. From the past to the future: Integrating work experience into the design process.

    PubMed

    Bittencourt, João Marcos; Duarte, Francisco; Béguin, Pascal

    2017-01-01

    Integrating work activity issues into design process is a broadly discussed theme in ergonomics. Participation is presented as the main means for such integration. However, a late participation can limit the development of both project solutions and future work activity. This article presents the concept of construction of experience aiming at the articulated development of future activities and project solutions. It is a non-teleological approach where the initial concepts will be transformed by the experience built up throughout the design process. The method applied was a case study of an ergonomic participation during the design of a new laboratory complex for biotechnology research. Data was obtained through analysis of records in a simulation process using a Lego scale model and interviews with project participants. The simulation process allowed for developing new ways of working and generating changes in the initial design solutions, which enable workers to adopt their own developed strategies for conducting work more safely and efficiently in the future work system. Each project decision either opens or closes a window of opportunities for developing a future activity. Construction of experience in a non-teleological design process allows for understanding the consequences of project solutions for future work.

  1. Development of a Dynamically Configurable,Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    NASA Technical Reports Server (NTRS)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: (1) aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; (2) a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and (3) development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process, and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.

  2. Carbon footprint analysis as a tool for energy and environmental management in small and medium-sized enterprises

    NASA Astrophysics Data System (ADS)

    Giama, E.; Papadopoulos, A. M.

    2018-01-01

    The reduction of carbon emissions has become a top priority in the decision-making process for governments and companies, the strict European legislative framework being a major driving force behind this effort. On the other hand, many companies face difficulties in estimating their footprint and in linking the results derived from environmental evaluation processes with an integrated energy management strategy, which would eventually lead to energy-efficient and cost-effective solutions. The paper highlights the need for companies to establish integrated environmental management practices, with tools such as carbon footprint analysis to monitor the energy performance of production processes. Concepts and methods are analysed, and selected indicators are presented by means of benchmarking, monitoring and reporting of results, so that companies can use them effectively. The study is based on data from more than 90 Greek small and medium-sized enterprises and is followed by a comprehensive discussion of cost-effective and realistic energy-saving measures.
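
    The core calculation behind such a footprint analysis is activity data multiplied by emission factors, summed per source. A minimal Python sketch with illustrative placeholder factors (not the study's Greek values):

        # Hedged sketch: activity data x emission factors, aggregated per source.
        # All factors and activity figures are illustrative placeholders.
        emission_factors = {          # kg CO2e per unit of activity
            "grid_electricity_kwh": 0.50,
            "natural_gas_kwh": 0.20,
            "diesel_litre": 2.68,
        }

        activity = {                  # one year of monitored activity for one SME
            "grid_electricity_kwh": 120_000,
            "natural_gas_kwh": 45_000,
            "diesel_litre": 3_000,
        }

        footprint = {k: activity[k] * emission_factors[k] for k in activity}
        total = sum(footprint.values())

        for source, kg in footprint.items():   # benchmarking/reporting indicator
            print(f"{source:>24}: {kg/1000:7.1f} t CO2e ({kg/total:5.1%})")
        print(f"{'total':>24}: {total/1000:7.1f} t CO2e")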

  3. Integration, warehousing, and analysis strategies of Omics data.

    PubMed

    Gedela, Srinubabu

    2011-01-01

    "-Omics" is a current suffix for numerous types of large-scale biological data generation procedures, which naturally demand the development of novel algorithms for data storage and analysis. With next generation genome sequencing burgeoning, it is pivotal to decipher a coding site on the genome, a gene's function, and information on transcripts next to the pure availability of sequence information. To explore a genome and downstream molecular processes, we need umpteen results at the various levels of cellular organization by utilizing different experimental designs, data analysis strategies and methodologies. Here comes the need for controlled vocabularies and data integration to annotate, store, and update the flow of experimental data. This chapter explores key methodologies to merge Omics data by semantic data carriers, discusses controlled vocabularies as eXtensible Markup Languages (XML), and provides practical guidance, databases, and software links supporting the integration of Omics data.

  4. Technoeconomic Assessment of an Advanced Aqueous Ammonia-Based Postcombustion Capture Process Integrated with a 650-MW Coal-Fired Power Station.

    PubMed

    Li, Kangkang; Yu, Hai; Yan, Shuiping; Feron, Paul; Wardhaugh, Leigh; Tade, Moses

    2016-10-04

    Using a rigorous, rate-based model and a validated economic model, we investigated the technoeconomic performance of an aqueous NH3-based CO2 capture process integrated with a 650-MW coal-fired power station. First, the baseline NH3 process was explored with the process design of simultaneous capture of CO2 and SO2 to replace the conventional FGD unit. This reduced capital investment of the power station by US$425/kW (a 13.1% reduction). Integration of this NH3 baseline process with the power station takes the CO2-avoided cost advantage over the MEA process (US$67.3/tonne vs US$86.4/tonne). We then investigated process modifications of a two-stage absorption, rich-split configuration and interheating stripping to further advance the NH3 process. The modified process reduced energy consumption by 31.7 MW/h (20.2% reduction) and capital costs by US$55.4 million (6.7% reduction). As a result, the CO2-avoided cost fell to $53.2/tonne: a savings of $14.1 and $21.9/tonne CO2 compared with the NH3 baseline and advanced MEA process, respectively. The analysis of energy breakdown and cost distribution indicates that the technoeconomic performance of the NH3 process still has great potential to be improved.
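
    The quoted cost figures can be checked with simple arithmetic; in the sketch below, the advanced-MEA figure of US$75.1/tonne is inferred from the stated $21.9/tonne saving rather than quoted directly:

        # Arithmetic check of the CO2-avoided costs quoted in the abstract
        # (US$/tonne CO2). advanced_mea is inferred, not quoted in the text.
        baseline_nh3 = 67.3
        mea = 86.4
        modified_nh3 = 53.2
        advanced_mea = modified_nh3 + 21.9   # implied: 75.1

        print(f"baseline NH3 advantage over MEA: {mea - baseline_nh3:.1f}")       # 19.1
        print(f"saving vs NH3 baseline:          {baseline_nh3 - modified_nh3:.1f}")  # 14.1
        print(f"saving vs advanced MEA:          {advanced_mea - modified_nh3:.1f}")  # 21.9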

  5. Generalized Majority Logic Criterion to Analyze the Statistical Strength of S-Boxes

    NASA Astrophysics Data System (ADS)

    Hussain, Iqtadar; Shah, Tariq; Gondal, Muhammad Asif; Mahmood, Hasan

    2012-05-01

    The majority logic criterion is applicable in the evaluation process of substitution boxes used in the advanced encryption standard (AES). The performance of modified or advanced substitution boxes is predicted by processing the results of statistical analysis by the majority logic criteria. In this paper, we use the majority logic criteria to analyze some popular and prevailing substitution boxes used in encryption processes. In particular, the majority logic criterion is applied to AES, affine power affine (APA), Gray, Lui J, residue prime, S8 AES, Skipjack, and Xyi substitution boxes. The majority logic criterion is further extended into a generalized majority logic criterion which has a broader spectrum of analyzing the effectiveness of substitution boxes in image encryption applications. The integral components of the statistical analyses used for the generalized majority logic criterion are derived from results of entropy analysis, contrast analysis, correlation analysis, homogeneity analysis, energy analysis, and mean of absolute deviation (MAD) analysis.
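
    As a hedged illustration, the sketch below computes simplified versions of the six constituent metrics on a random "cipher-like" image in Python; the paper's exact definitions (e.g., full GLCM-based contrast and homogeneity) may differ:

        # Hedged sketch: simplified per-pixel/neighbour versions of the metrics
        # aggregated by the generalized majority logic criterion.
        import numpy as np

        def image_metrics(img):
            img = img.astype(np.float64)
            hist = np.bincount(img.astype(np.uint8).ravel(), minlength=256)
            p_all = hist / hist.sum()
            p = p_all[p_all > 0]
            entropy = -np.sum(p * np.log2(p))      # ideal for a cipher: ~8 bits
            energy = np.sum(p_all ** 2)

            left, right = img[:, :-1], img[:, 1:]  # horizontal neighbour pairs
            contrast = np.mean((left - right) ** 2)
            homogeneity = np.mean(1.0 / (1.0 + np.abs(left - right)))
            correlation = np.corrcoef(left.ravel(), right.ravel())[0, 1]
            mad = np.mean(np.abs(img - img.mean()))
            return entropy, energy, contrast, homogeneity, correlation, mad

        rng = np.random.default_rng(0)
        cipher_like = rng.integers(0, 256, size=(128, 128))
        names = ("entropy", "energy", "contrast", "homogeneity", "correlation", "MAD")
        for n, v in zip(names, image_metrics(cipher_like)):
            print(f"{n:>12}: {v:.4f}")   # near-zero correlation suggests good diffusion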

  6. Risk Assessment of Groundwater Contamination: A Multilevel Fuzzy Comprehensive Evaluation Approach Based on DRASTIC Model

    PubMed Central

    Zhang, Yan; Zhong, Ming

    2013-01-01

    Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, impacted by different geological and hydrological factors. In order to deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach that integrates the analytic hierarchy process with fuzzy comprehensive evaluation. Firstly, the risk factors of groundwater contamination are identified by the sources-pathway-receptor-consequence method, and a corresponding index system of risk assessment based on the DRASTIC model is established. Because of the complexity of the transitions between possible pollution risks and the uncertainties of the factors, the analytic hierarchy process is applied to determine the weights of each factor, and fuzzy set theory is adopted to calculate the membership degrees of each factor. Finally, a case study is presented to illustrate and test this methodology. It is concluded that the proposed approach integrates the advantages of both the analytic hierarchy process and fuzzy comprehensive evaluation, which provides a more flexible and reliable way to deal with the linguistic uncertainty and mechanism uncertainty in groundwater contamination without losing important information. PMID:24453883
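
    The AHP weighting step can be illustrated with a small Python sketch: weights come from the principal eigenvector of a pairwise comparison matrix, followed by a consistency check. The 3x3 matrix is illustrative only; DRASTIC itself rates seven hydrogeological factors:

        # Hedged sketch of AHP factor weighting with a consistency check.
        # The pairwise judgments below are illustrative, not the study's.
        import numpy as np

        A = np.array([[1.0, 3.0, 5.0],      # pairwise judgments (Saaty's 1-9 scale)
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)                  # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                 # normalized factor weights

        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
        cr = ci / 0.58                               # random index RI = 0.58 for n = 3
        print("weights:", np.round(w, 3), " CR =", round(cr, 3),
              "(CR < 0.1 is conventionally acceptable)")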

  7. An independent confirmatory factor analysis of the Wechsler Intelligence Scale for Children-fourth Edition (WISC-IV) integrated: what do the process approach subtests measure?

    PubMed

    Benson, Nicholas; Hulac, David M; Bernstein, Joshua D

    2013-09-01

    The Wechsler intelligence scale for children--fourth edition (WISC-IV) Integrated contains the WISC-IV core and supplemental subtests along with process approach subtests designed to facilitate a process-oriented approach to score interpretation. The purpose of this study was to examine the extent to which WISC-IV Integrated subtests measure the constructs they are purported to measure. In addition to examining the measurement and scoring model provided in the manual, this study also tested hypotheses regarding Cattell-Horn-Carroll abilities that might be measured along with other substantive questions regarding the factor structure of the WISC-IV Integrated and the nature of abilities measured by process approach subtests. Results provide insight regarding the constructs measured by these subtests. Many subtests appear to be good to excellent measures of psychometric g (i.e., the general factor presumed to cause the positive correlation of mental tasks). Other abilities measured by subtests are described. For some subtests, the majority of variance is not accounted for by theoretical constructs included in the scoring model. Modifications made to remove demands such as memory recall and verbal expression were found to reduce construct-irrelevant variance. The WISC-IV Integrated subtests appear to measure similar constructs across ages 6-16, although strict factorial invariance was not supported.

  8. Army-NASA aircrew/aircraft integration program: Phase 4 A(3)I Man-Machine Integration Design and Analysis System (MIDAS) software detailed design document

    NASA Technical Reports Server (NTRS)

    Banda, Carolyn; Bushnell, David; Chen, Scott; Chiu, Alex; Constantine, Betsy; Murray, Jerry; Neukom, Christian; Prevost, Michael; Shankar, Renuka; Staveland, Lowell

    1991-01-01

    The Man-Machine Integration Design and Analysis System (MIDAS) is an integrated suite of software components that constitutes a prototype workstation to aid designers in applying human factors principles to the design of complex human-machine systems. MIDAS is intended to be used at the very early stages of conceptual design to provide an environment wherein designers can use computational representations of the crew station and operator, instead of hardware simulators and man-in-the-loop studies, to discover problems and ask 'what if' questions regarding the projected mission, equipment, and environment. This document is the Software Product Specification for MIDAS. Introductory descriptions of the processing requirements, hardware/software environment, structure, I/O, and control are given in the main body of the document for the overall MIDAS system, with detailed discussion of the individual modules included in Annexes A-J.

  9. Integrative Application of Life Cycle Assessment and Risk Assessment to Environmental Impacts of Anthropogenic Pollutants at a Watershed Scale.

    PubMed

    Lin, Xiaodan; Yu, Shen; Ma, Hwongwen

    2018-01-01

    Intense human activities have led to increasing deterioration of the watershed environment via pollutant discharge, which threatens human health and ecosystem function. To meet the need for comprehensive environmental impact/risk assessment for sustainable watershed development, a biogeochemical process-based integration of life cycle assessment and risk assessment (RA) for pollutants, aided by a geographic information system, is proposed in this study. The integration frames a conceptual protocol of "watershed life cycle assessment (WLCA) for pollutants". The proposed WLCA protocol consists of (1) geographic and environmental characterization mapping; (2) life cycle inventory analysis; (3) integration of life-cycle impact assessment (LCIA) with RA via characterization factors for the pollutants of interest; and (4) result analysis and interpretation. The WLCA protocol can visualize results of LCIA and RA spatially for the pollutants of interest, which can be useful for decision or policy makers in mitigating impacts of watershed development.

  10. Plastic lab-on-a-chip for fluorescence excitation with integrated organic semiconductor lasers.

    PubMed

    Vannahme, Christoph; Klinkhammer, Sönke; Lemmer, Uli; Mappes, Timo

    2011-04-25

    Laser light excitation of fluorescent markers offers highly sensitive and specific detection for biomedical or chemical analysis. To profit from these advantages for applications in the field or at the point of care, a plastic lab-on-a-chip with integrated organic semiconductor lasers is presented here. First-order distributed feedback lasers based on the organic semiconductor tris(8-hydroxyquinoline) aluminum (Alq3) doped with the laser dye 4-dicyanomethylene-2-methyl-6-(p-dimethylaminostyryl)-4H-pyran (DCM), deep-ultraviolet-induced waveguides, and a nanostructured microfluidic channel are integrated into a poly(methyl methacrylate) (PMMA) substrate. A simple and parallel fabrication process is used, comprising thermal imprint, DUV exposure, evaporation of the laser material, and sealing by thermal bonding. The excitation of two fluorescent marker model systems, including labeled antibodies, with light emitted by the integrated lasers is demonstrated.
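
    The grating design for such first-order DFB lasers follows the Bragg condition m·lambda_B = 2·n_eff·Lambda. A tiny Python sketch with assumed (not measured) values for an Alq3:DCM film on PMMA:

        # Hedged sketch of the first-order DFB resonance condition used to pick
        # a grating period. Values are illustrative design assumptions, not the
        # chip's actual parameters.
        def grating_period(lambda_b_nm, n_eff, order=1):
            """Grating period (nm) that puts the Bragg wavelength at lambda_b."""
            return order * lambda_b_nm / (2.0 * n_eff)

        # Alq3:DCM emits around 600-640 nm; an effective index of ~1.55 is an
        # assumed value for a thin organic film on PMMA.
        print(f"Lambda = {grating_period(620.0, 1.55):.1f} nm")  # ~200 nm, first order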

  11. 10 CFR 70.62 - Safety program and integrated safety analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... conclusion of each failure investigation of an item relied on for safety or management measure. (b) Process... methodology being used. (3) Requirements for existing licensees. Individuals holding an NRC license on...

  12. Supercomputing resources empowering superstack with interactive and integrated systems

    NASA Astrophysics Data System (ADS)

    Rückemann, Claus-Peter

    2012-09-01

    This paper presents the results from the development and implementation of Superstack algorithms to be used dynamically with integrated systems and supercomputing resources. Processing of geophysical data, here called geoprocessing, is an essential part of the analysis of geoscientific data. The theory of Superstack algorithms and their practical application on modern computing architectures were inspired by developments in the processing of seismic data, from early mainframe implementations to the high-end scientific computing applications of recent years. Several stacking algorithms are known, but when seismic data have a low signal-to-noise ratio, iterative algorithms like the Superstack can support analysis and interpretation. The new Superstack algorithms are in use with wave theory and optical phenomena on highly performant computing resources for huge data sets as well as for sophisticated application scenarios in geosciences and archaeology.
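
    The abstract does not give the algorithm itself; the Python sketch below is one plausible reading of an iterative, coherence-weighted stack, in which traces that agree with the current stack receive more weight on the next pass:

        # Hedged sketch of an iterative, coherence-weighted stack in the spirit
        # of the Superstack idea; not the authors' exact algorithm.
        import numpy as np

        def superstack(traces, iterations=5, eps=1e-12):
            """traces: (n_traces, n_samples) array of noisy seismic traces."""
            stack = traces.mean(axis=0)                  # plain stack to start
            for _ in range(iterations):
                # weight each trace by its correlation with the current stack
                num = traces @ stack
                den = np.linalg.norm(traces, axis=1) * np.linalg.norm(stack) + eps
                w = np.clip(num / den, 0.0, None)        # drop anti-correlated traces
                stack = (w[:, None] * traces).sum(axis=0) / (w.sum() + eps)
            return stack

        rng = np.random.default_rng(1)
        t = np.linspace(0, 1, 500)
        signal = np.exp(-((t - 0.4) / 0.02) ** 2)            # buried wavelet
        traces = signal + 3.0 * rng.standard_normal((200, 500))  # SNR well below 1
        print("plain-stack corr:", np.corrcoef(traces.mean(0), signal)[0, 1].round(3))
        print("superstack corr: ", np.corrcoef(superstack(traces), signal)[0, 1].round(3))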

  13. A Large-Scale Design Integration Approach Developed in Conjunction with the Ares Launch Vehicle Program

    NASA Technical Reports Server (NTRS)

    Redmon, John W.; Shirley, Michael C.; Kinard, Paul S.

    2012-01-01

    This paper presents a method for performing large-scale design integration, taking a classical 2D drawing envelope and interface approach and applying it to modern three-dimensional computer aided design (3D CAD) systems. Today, the paradigm often used when performing design integration with 3D models involves a digital mockup of an overall vehicle, in the form of a massive, fully detailed CAD assembly, thereby adding unnecessary burden and overhead to design and product data management processes. While fully detailed data may yield a broad depth of design detail, pertinent integration features are often obscured under the excessive amounts of information, making them difficult to discern. In contrast, the envelope and interface method results in a reduction in both the amount and complexity of information necessary for design integration while yielding significant savings in time and effort when applied to today's complex design integration projects. This approach, combining classical and modern methods, proved advantageous during the complex design integration activities of the Ares I vehicle. Downstream processes, benefiting from this approach by reducing development and design cycle time, include: creation of analysis models for the Aerodynamic discipline; vehicle-to-ground interface development; and documentation development for the vehicle assembly.

  14. Multi-Scale and Object-Oriented Analysis for Mountain Terrain Segmentation and Geomorphological Assessment

    NASA Astrophysics Data System (ADS)

    Marston, B. K.; Bishop, M. P.; Shroder, J. F.

    2009-12-01

    Digital terrain analysis of mountain topography is widely utilized for mapping landforms, assessing the role of surface processes in landscape evolution, and estimating the spatial variation of erosion. Numerous geomorphometry techniques exist to characterize terrain surface parameters, although their utility for characterizing the spatial hierarchical structure of the topography and permitting an assessment of the erosion/tectonic impact on the landscape is very limited due to scale and data integration issues. To address this problem, we apply scale-dependent geomorphometric and object-oriented analyses to characterize the hierarchical spatial structure of mountain topography. Specifically, we utilized a high-resolution digital elevation model to characterize complex topography in the Shimshal Valley in the Western Himalaya of Pakistan. To accomplish this, we generate terrain objects (geomorphological features and landforms) including valley floors and walls, drainage basins, drainage network, ridge network, slope facets, and elemental forms based upon curvature. Object-oriented analysis was used to characterize object properties accounting for object size, shape, and morphometry. The spatial overlay and integration of terrain objects at various scales defines the nature of the hierarchical organization. Our results indicate that variations in the spatial complexity of the terrain hierarchical organization are related to the spatio-temporal influence of surface processes and landscape evolution dynamics. Terrain segmentation and the integration of multi-scale terrain information permit further assessment of process domains and erosion, tectonic impact potential, and natural hazard potential. We demonstrate this with landform mapping and geomorphological assessment examples.

  15. An integrated microfluidic analysis microsystems with bacterial capture enrichment and in-situ impedance detection

    NASA Astrophysics Data System (ADS)

    Liu, Hai-Tao; Wen, Zhi-Yu; Xu, Yi; Shang, Zheng-Guo; Peng, Jin-Lan; Tian, Peng

    2017-09-01

    In this paper, an integrated microfluidic analysis microsystem with bacterial capture enrichment and in-situ impedance detection is proposed, based on the microfluidic-chip dielectrophoresis technique and the electrochemical impedance detection principle. The microsystem includes a microfluidic chip, a main control module, a drive and control module, a signal detection and processing module, and a result display unit. The main control module produces the work sequence of the impedance detection system parts and handles data communication; the drive and control circuit generates an AC signal with adjustable amplitude and frequency, which is applied in the foodborne-pathogen impedance analysis microsystem to realize capture enrichment and impedance detection. The signal detection and processing circuit translates the current signal into the impedance of the bacteria and transfers it to a computer, where the final detection result is displayed. The experimental sample was prepared by adding an Escherichia coli standard sample into a chicken sample solution, and the samples were tested on the dielectrophoresis-chip capture enrichment and in-situ impedance detection microsystem with micro-array electrode microfluidic chips. The experiments show that the Escherichia coli detection limit of the microsystem is 5 × 10^4 CFU/mL and the detection time is within 6 min under the optimized operating conditions of 10 V detection voltage and 500 kHz detection frequency. The integrated microfluidic analysis microsystem lays a solid foundation for rapid, real-time, in-situ detection of bacteria.
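
    The in-situ measurement principle reduces to recovering the complex impedance Z = V/I from the excitation and current phasors. A hedged Python sketch at the abstract's stated operating point (10 V, 500 kHz), with otherwise illustrative numbers:

        # Hedged sketch: recover complex impedance from digitized voltage and
        # current at the excitation frequency. Only the 10 V / 500 kHz operating
        # point comes from the abstract; everything else is illustrative.
        import numpy as np

        f = 500e3                      # excitation frequency (Hz)
        fs = 50e6                      # sampling rate (Hz)
        t = np.arange(4000) / fs       # 4000 samples = 40 full cycles (no leakage)

        v = 10.0 * np.sin(2 * np.pi * f * t)              # applied voltage
        i = 2.0e-3 * np.sin(2 * np.pi * f * t - 0.35)     # measured current (A)

        # project both signals onto the excitation frequency (single-bin DFT)
        ref = np.exp(-2j * np.pi * f * t)
        V = 2 * np.mean(v * ref)
        I = 2 * np.mean(i * ref)

        Z = V / I
        print(f"|Z| = {abs(Z):.0f} ohm, phase = {np.degrees(np.angle(Z)):.1f} deg")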

  16. A Value Analysis of Lean Processes in Target Value Design and Integrated Project Delivery.

    PubMed

    Nanda, Upali; K Rybkowski, Zofia; Pati, Sipra; Nejati, Adeleh

    2017-04-01

    To investigate what key stakeholders consider to be the advantages and the opportunities for improvement in using lean thinking and tools in the integrated project delivery (IPD) process. A detailed literature review was followed by a case study of a Lean-IPD project. Interviews with members of the project leadership team, focus groups with the integrated team as well as the design team, and an online survey of all stakeholders were conducted. Statistical analysis and thematic content analysis were used to analyze the data, followed by a plus-delta analysis. (1) Learning is a large, implicit benefit of Lean-IPD that is not currently captured by any success metric; (2) the cardboard mock-up was the most successful lean strategy; (3) although a collaborative project, the level of influence of different stakeholder groups was perceived to be different by different stakeholders; (4) overall, Lean-IPD was rated as better than traditional design-bid-build methods; and (5) opportunities for improvement reported were increase in accurate cost estimating, more efficient use of time, perception of imbalance of control/influence, and need for facilitation (which represents different points of view). While lean tools and an IPD method are preferred to traditional design-bid-build methods, the perception of different stakeholders varies and more work needs to be done to allow a truly shared decision-making model. Learning was identified as one of the biggest advantages.

  17. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-109

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2005-01-01

    The Debris Team has developed and implemented measures to control damage from debris in the Shuttle operational environment and to make the control measures a part of routine launch flows. These measures include engineering surveillance during vehicle processing and closeout operations, facility and flight hardware inspections before and after launch, and photographic analysis of mission events. Photographic analyses of mission imagery from launch, on-orbit, and landing provide significant data in verifying proper operation of systems and evaluating anomalies. In addition to the Kennedy Space Center Photo/Video Analysis, reports from Johnson Space Center and Marshall Space Flight Center are also included in this document to provide an integrated assessment of the mission.

  18. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-110

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2005-01-01

    The Debris Team has developed and implemented measures to control damage from debris in the Shuttle operational environment and to make the control measures a part of routine launch flows. These measures include engineering surveillance during vehicle processing and closeout operations, facility and flight hardware inspections before and after launch, and photographic analysis of mission events. Photographic analyses of mission imagery from launch, on-orbit, and landing provide significant data in verifying proper operation of systems and evaluating anomalies. In addition to the Kennedy Space Center Photo/Video Analysis, reports from Johnson Space Center and Marshall Space Flight Center are also included in this document to provide an integrated assessment of the mission.

  19. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-105

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2005-01-01

    The Debris Team has developed and implemented measures to control damage from debris in the Shuttle operational environment and to make the control measures a part of routine launch flows. These measures include engineering surveillance during vehicle processing and closeout operations, facility and flight hardware inspections before and after launch, and photographic analysis of mission events. Photographic analyses of mission imagery from launch, on-orbit, and landing provide significant data in verifying proper operation of systems and evaluating anomalies. In addition to the Kennedy Space Center Photo/Video Analysis, reports from Johnson Space Center and Marshall Space Flight Center are also included in this document to provide an integrated assessment of the mission.

  20. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-104

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2005-01-01

    The Debris Team has developed and implemented measures to control damage from debris in the Shuttle operational environment and to make the control measures a part of routine launch flows. These measures include engineering surveillance during vehicle processing and closeout operations, facility and flight hardware inspections before and after launch, and photographic analysis of mission events. Photographic analyses of mission imagery from launch, on-orbit, and landing provide significant data in verifying proper operation of systems and evaluating anomalies. In addition to the Kennedy Space Center Photo/Video Analysis, reports from Johnson Space Center and Marshall Space Flight Center are also included in this document to provide an integrated assessment of the mission.

  1. Debris/Ice/TPS Assessment and Integrated Photographic Analysis of Shuttle Mission STS-108

    NASA Technical Reports Server (NTRS)

    Oliu, Armando

    2005-01-01

    The Debris Team has developed and implemented measures to control damage from debris in the Shuttle operational environment and to make the control measures a part of routine launch flows. These measures include engineering surveillance during vehicle processing and closeout operations, facility and flight hardware inspections before and after launch, and photographic analysis of mission events. Photographic analyses of mission imagery from launch, on-orbit, and landing provide significant data in verifying proper operation of systems and evaluating anomalies. In addition to the Kennedy Space Center Photo/Video Analysis, reports from Johnson Space Center and Marshall Space Flight Center are also included in this document to provide an integrated assessment of the mission.

  2. [Digital signal processing of a novel neuron discharge model stimulation strategy for cochlear implants].

    PubMed

    Yang, Yiwei; Xu, Yuejin; Miu, Jichang; Zhou, Linghong; Xiao, Zhongju

    2012-10-01

    To apply the classic leaky integrate-and-fire models, based on the mechanism of the generation of physiological auditory stimulation, to the information coding of cochlear implants in order to improve the auditory result. The results of algorithm simulation in a digital signal processor (DSP) were imported into Matlab for a comparative analysis. Compared with CIS coding, the membrane potential integrate-and-fire (MPIF) algorithm allowed more natural pulse discharge in a pseudo-random manner that better fits the physiological structures. The MPIF algorithm can effectively solve the problem of the dynamic structure of the auditory information sequence delivered to the auditory center, and it allows the integration of stimulating pulses and time coding to ensure the coherence and relevance of stimulating pulse timing.
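
    For reference, the classic leaky integrate-and-fire dynamics underlying the MPIF strategy can be sketched in a few lines of Python; the parameter values below are textbook-style assumptions, not those of the paper's coder:

        # Hedged sketch of leaky integrate-and-fire dynamics: the membrane
        # potential integrates an (envelope-derived) drive, leaks toward rest,
        # and emits a pulse on threshold crossing. Parameters are assumptions.
        import numpy as np

        def lif_spikes(current, dt=1e-4, tau=0.01, r_m=1e7,
                       v_thresh=0.015, v_reset=0.0):
            v, spikes = 0.0, []
            for n, i_in in enumerate(current):
                dv = (-v + r_m * i_in) / tau       # leaky integration
                v += dv * dt
                if v >= v_thresh:                  # threshold crossing -> pulse
                    spikes.append(n * dt)
                    v = v_reset
            return spikes

        t = np.arange(0, 0.1, 1e-4)
        envelope = 2e-9 * (1 + np.sin(2 * np.pi * 50 * t))  # channel envelope as drive
        print(f"{len(lif_spikes(envelope))} pulses in 100 ms")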

  3. Integrating automated structured analysis and design with Ada programming support environments

    NASA Technical Reports Server (NTRS)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification and derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It also promotes the creation of usable modules. Studies have shown that most software errors result from poor system specifications, and that these errors become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. An automated system for structured analysis and design, TEAMWORK, which can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  4. SDI-based business processes: A territorial analysis web information system in Spain

    NASA Astrophysics Data System (ADS)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to Spanish citizens' knowledge of their territory.
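
    A WPS operation of the kind described is invoked through a standard OGC Execute request. The Python sketch below builds a WPS 1.0.0 key-value-pair request; the endpoint URL and process identifier are placeholders, not the actual IDEE service:

        # Hedged sketch of a WPS 1.0.0 key-value-pair Execute request.
        # Endpoint, process identifier, and inputs are hypothetical.
        from urllib.parse import urlencode
        from urllib.request import urlopen

        endpoint = "https://example.idee.es/wps"        # hypothetical endpoint
        params = {
            "service": "WPS",
            "version": "1.0.0",
            "request": "Execute",
            "identifier": "SlopeFromElevation",          # hypothetical process name
            "datainputs": "bbox=-3.9,40.3,-3.5,40.6;crs=EPSG:4326",
        }

        url = f"{endpoint}?{urlencode(params)}"
        print(url)                                       # inspect the request first
        # response = urlopen(url).read()                 # returns an XML ExecuteResponse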

  5. Development of hybrid lifecycle cost estimating tool (HLCET) for manufacturing influenced design tradeoff

    NASA Astrophysics Data System (ADS)

    Sirirojvisuth, Apinut

    In complex aerospace system design, making an effective design decision requires multidisciplinary knowledge from both product and process perspectives. Integrating manufacturing considerations into the design process is most valuable during the early design stages, since designers have more freedom to integrate new ideas when changes are relatively inexpensive in terms of time and effort. Several metrics related to manufacturability are cost, time, and manufacturing readiness level (MRL). Yet there is a lack of a structured methodology that quantifies how changes in the design decisions impact these metrics. As a result, a new set of integrated cost analysis tools is proposed in this study to quantify the impacts. Equally important is the capability to integrate this new cost tool into existing design methodologies without sacrificing the agility and flexibility required during the early design phases. To demonstrate the applicability of this concept, a ModelCenter environment is used to develop a software architecture that represents the Integrated Product and Process Development (IPPD) methodology used in several aerospace systems designs. The environment seamlessly integrates product and process analysis tools and makes an effective transition from one design phase to the next while retaining knowledge gained a priori. Then, an advanced cost estimating tool called the Hybrid Lifecycle Cost Estimating Tool (HLCET), a hybrid combination of weight-, process-, and activity-based estimating techniques, is integrated with the design framework. A new weight-based lifecycle cost model is created based on Tailored Cost Model (TCM) equations [3]. This lifecycle cost tool estimates the program cost based on vehicle component weights and programmatic assumptions. Additional high-fidelity cost tools like process-based and activity-based cost analysis methods can be used to modify the baseline TCM result as more knowledge is accumulated over design iterations. Therefore, with this concept, the additional manufacturing knowledge can be used to identify a more accurate lifecycle cost and facilitate higher fidelity tradeoffs during conceptual and preliminary design. The Advanced Composite Cost Estimating Model (ACCEM) is employed as a process-based cost component to replace the original TCM result for the composite part production cost. The reason for the replacement is that TCM estimates production costs from part weights as a result of subtractive manufacturing of metallic origin, such as casting, forging, and machining processes. A complexity factor can sometimes be adjusted to reflect different types of metal and machine settings. The TCM assumption, however, gives erroneous results when applied to additive processes like those of composite manufacturing. Another innovative aspect of this research is the introduction of a work measurement technique called the Maynard Operation Sequence Technique (MOST) to be used, similarly to the Activity-Based Costing (ABC) approach, to estimate the manufacturing time of a part by breaking down the operations that occur during its production. ABC allows a realistic determination of the cost incurred in each activity, as opposed to using a traditional method of time estimation by analogy or using response surface equations from historical process data. The MOST concept provides a tailored study of an individual process, typically required for a new, innovative design. Nevertheless, the MOST idea has some challenges, one of which is its requirement to build a new process from the ground up.
The process development requires a Subject Matter Expert (SME) in the manufacturing method of the particular design. The SME must also have a comprehensive understanding of the MOST system so that the correct parameters are chosen. In practice, these knowledge requirements may demand people from outside the design discipline and prior training in MOST. To relieve this constraint, this study includes an entirely new sub-system architecture that comprises 1) a knowledge-based system to provide the required knowledge during process selection; and 2) a new user interface to guide parameter selection when building a process with MOST. Also included in this study is a demonstration of how HLCET and its constituents can be integrated with the Georgia Tech Integrated Product and Process Development (IPPD) methodology. The applicability of this work is shown through a complex aerospace design example that provides insight into how manufacturing knowledge helps make better design decisions during the early stages; the setup process is explained, and its utility demonstrated, in a hypothetical fighter aircraft wing redesign. The thesis concludes with an evaluation of the system's effectiveness against existing methodologies.
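
    To make the hybrid estimating idea above concrete, the following minimal Python sketch shows a weight-based cost estimating relationship (CER) supplying a baseline that a process-based estimate overrides for composite parts, mirroring how ACCEM replaces the TCM result. This is not HLCET itself; the coefficients, part data, and function names are all hypothetical.

```python
# Hybrid cost estimation sketch: weight-based baseline, process-based
# override for composite parts. All numbers are illustrative only.

def weight_based_cost(weight_lb: float, a: float = 2500.0, b: float = 0.85) -> float:
    """Baseline production cost from a power-law CER: cost = a * W^b."""
    return a * weight_lb ** b

def process_based_cost(layup_hours: float, cure_hours: float,
                       labor_rate: float = 120.0) -> float:
    """Process-based estimate: modeled operation times times a labor rate."""
    return (layup_hours + cure_hours) * labor_rate

parts = [
    {"name": "wing skin", "weight_lb": 850, "composite": True,
     "layup_hours": 400, "cure_hours": 60},
    {"name": "fuselage frame", "weight_lb": 300, "composite": False},
]

total = 0.0
for p in parts:
    if p["composite"]:
        # Replace the weight-based result, as ACCEM replaces TCM above.
        cost = process_based_cost(p["layup_hours"], p["cure_hours"])
    else:
        cost = weight_based_cost(p["weight_lb"])
    total += cost
    print(f"{p['name']}: ${cost:,.0f}")
print(f"total: ${total:,.0f}")
```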

  6. Structured Analysis of the Logistics Support Analysis (LSA) Task, and Integrated Logistic Support (ILS) Element, ‘Standardization and Interoperability (S and I)’.

    DTIC Science & Technology

    1988-11-01

    system, using graphic techniques which enable users, analysts, and designers to get a clear and common picture of the system and how its parts fit...boxes into hierarchies suitable for computer implementation. 3. Structured Design uses tools, especially graphic ones, to render systems readily...LSA, PROCESSES, DATA FLOWS, DATA STORES, EXTERNAL ENTITIES, OVERALL SYSTEMS DESIGN PROCESS

  7. Synthesizing Results from Empirical Research on Computer-Based Scaffolding in STEM Education: A Meta-Analysis

    ERIC Educational Resources Information Center

    Belland, Brian R.; Walker, Andrew E.; Kim, Nam Ju; Lefler, Mason

    2017-01-01

    Computer-based scaffolding assists students as they generate solutions to complex problems, goals, or tasks, helping increase and integrate their higher order skills in the process. However, despite decades of research on scaffolding in STEM (science, technology, engineering, and mathematics) education, no existing comprehensive meta-analysis has…

  8. Configuration Management of an Optimization Application in a Research Environment

    NASA Technical Reports Server (NTRS)

    Townsend, James C.; Salas, Andrea O.; Schuler, M. Patricia

    1999-01-01

    Multidisciplinary design optimization (MDO) research aims to increase interdisciplinary communication and reduce design cycle time by combining system analyses (simulations) with design space search and decision making. The High Performance Computing and Communication Program's current High Speed Civil Transport application, HSCT4.0, at NASA Langley Research Center involves a highly complex analysis process with high-fidelity analyses that are more realistic than previous efforts at the Center. The multidisciplinary processes have been integrated to form a distributed application by using the Java language and Common Object Request Broker Architecture (CORBA) software techniques. HSCT4.0 is a research project in which both the application problem and the implementation strategy have evolved as the MDO and integration issues became better understood. Whereas earlier versions of the application and integrated system were developed with a simple, manual software configuration management (SCM) process, it was evident that this larger project required a more formal SCM procedure. This report briefly describes the HSCT4.0 analysis and its CORBA implementation and then discusses some SCM concepts and their application to this project. In anticipation that SCM will prove beneficial for other large research projects, the report concludes with some lessons learned in overcoming SCM implementation problems for HSCT4.0.
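
    The HSCT4.0 implementation wrapped discipline analyses as CORBA objects in Java; as a rough stand-in for that stack, the Python sketch below uses the standard-library xmlrpc module to show the same pattern of exposing an analysis code as a remote service that a coordinating client can invoke. The analysis function, its outputs, and the port are hypothetical.

```python
# Stand-in for the distributed-analysis pattern (xmlrpc instead of CORBA):
# a discipline analysis registered as a remotely callable function.
from xmlrpc.server import SimpleXMLRPCServer

def aero_analysis(mach: float, altitude_ft: float) -> dict:
    # Placeholder for a high-fidelity analysis code.
    return {"lift_to_drag": 9.0 - 0.5 * mach, "altitude_ft": altitude_ft}

server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
server.register_function(aero_analysis, "aero_analysis")
print("analysis server listening on :8000")
server.serve_forever()

# A coordinating client elsewhere in the distributed application would call:
#   from xmlrpc.client import ServerProxy
#   result = ServerProxy("http://localhost:8000").aero_analysis(2.4, 60000)
```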

  9. An Integrated Assessment of Location-Dependent Scaling for Microalgae Biofuel Production Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coleman, Andre M.; Abodeely, Jared; Skaggs, Richard

    Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain—from facility siting/design through processing/upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are addressed in part by applying the Integrated Assessment Framework (IAF)—an integrated multi-scale modeling, analysis, and data management suite—to address key issues in developing and operating an open-pond facility by analyzing how variability and uncertainty in space and time affect algal feedstock production rates, and determining the site-specific “optimum” facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. The IAF was applied to a set of sites previously identified as having the potential to cumulatively produce 5 billion gallons/year in the southeastern U.S., and results indicate costs can be reduced by selecting the most effective processing technology pathway and scaling downstream processing capabilities to fit site-specific growing conditions, available resources, and algal strains.
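
    As a toy illustration of the site-specific scaling question above, the sketch below scans candidate facility sizes and picks the one minimizing cost per gallon. The cost model (capital cost with economies of scale, linear operating cost, superlinear water-delivery cost) and every coefficient are hypothetical, not IAF values.

```python
# Scan facility scales for the minimum unit cost at two hypothetical sites
# that differ in how costly water delivery is.

def cost_per_gallon(acres, water_coeff, gal_per_acre_yr=2000.0,
                    capex_ref=2.0e7, ref_acres=500.0, scale_exp=0.8,
                    annualization=0.1, opex_per_acre=1.5e4, water_exp=1.3):
    capex = capex_ref * (acres / ref_acres) ** scale_exp   # economies of scale
    water = water_coeff * acres ** water_exp               # conveyance penalty
    annual = annualization * capex + opex_per_acre * acres + water
    return annual / (gal_per_acre_yr * acres)

for water_coeff in (400.0, 1200.0):   # easy vs. difficult water-supply site
    scales = range(50, 2001, 50)
    best = min(scales, key=lambda a: cost_per_gallon(a, water_coeff))
    print(f"water_coeff={water_coeff:.0f}: optimum {best} acres, "
          f"${cost_per_gallon(best, water_coeff):.2f}/gal")
```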

  10. JAMS - a software platform for modular hydrological modelling

    NASA Astrophysics Data System (ADS)

    Kralisch, Sven; Fischer, Christian

    2015-04-01

    Current challenges of understanding and assessing the impacts of climate and land use changes on environmental systems demand an ever-increasing integration of data and process knowledge in the corresponding simulation models. Software frameworks that allow a seamless creation of integrated models from less complex components (domain models, process simulation routines) have therefore gained increasing attention during the last decade. JAMS is an Open-Source software framework that has been especially designed to cope with the challenges of eco-hydrological modelling. This is reflected by (i) its flexible approach for representing time and space, (ii) a strong separation of process simulation components from the declarative description of more complex models using domain-specific XML, (iii) powerful analysis and visualization functions for spatial and temporal input and output data, and (iv) parameter optimization and uncertainty analysis functions commonly used in environmental modelling. Based on JAMS, different hydrological and nutrient-transport simulation models have been implemented and successfully applied in recent years. We will present the JAMS core concepts and give an overview of the models, simulation components, and support tools available for the framework. Sample applications will be used to underline the advantages of component-based model designs and to show how JAMS can be used to address the challenges of integrated hydrological modelling.
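
    JAMS itself is a Java framework whose models are declared in XML; the minimal Python sketch below only illustrates the component-based idea it describes: process routines are interchangeable components, and a model is a declarative list of them executed over a shared context. The component names and the trivial water-balance logic are hypothetical.

```python
# Component-based model assembly: each component implements run(ctx),
# and the 'model' list plays the role JAMS gives to its XML description.

class Precipitation:
    def run(self, ctx):
        ctx["rain_mm"] = 2.0  # stub input; a real component would read data

class SoilWater:
    def run(self, ctx):
        storage = ctx.get("soil_mm", 50.0) + ctx["rain_mm"]
        ctx["runoff_mm"] = max(0.0, storage - 60.0)  # overflow above capacity
        ctx["soil_mm"] = min(storage, 60.0)

model = [Precipitation(), SoilWater()]  # declarative model description

ctx = {}
for day in range(3):          # time loop over which components are iterated
    for component in model:
        component.run(ctx)
    print(day, ctx)
```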

  11. Cloudy Solar Software - Enhanced Capabilities for Finding, Pre-processing, and Visualizing Solar Data

    NASA Astrophysics Data System (ADS)

    Istvan Etesi, Laszlo; Tolbert, K.; Schwartz, R.; Zarro, D.; Dennis, B.; Csillaghy, A.

    2010-05-01

    In our project "Extending the Virtual Solar Observatory (VSO)" we have combined some of the features available in Solar Software (SSW) to produce an integrated environment for data analysis, supporting the complete workflow from data location, retrieval, preparation, and analysis to creating publication-quality figures. Our goal is an integrated analysis experience in IDL, easy-to-use but flexible enough to allow more sophisticated procedures such as multi-instrument analysis. To that end, we have made the transition from a locally oriented setting where all the analysis is done on the user's computer, to an extended analysis environment where IDL has access to services available on the Internet. We have implemented a form of Cloud Computing that uses the VSO search and a new data retrieval and pre-processing server (PrepServer) that provides remote execution of instrument-specific data preparation. We have incorporated the interfaces to the VSO search and the PrepServer into an IDL widget (SHOW_SYNOP) that provides user-friendly searching and downloading of raw solar data and optionally sends search results for pre-processing to the PrepServer prior to downloading the data. The raw and pre-processed data can be displayed with our plotting suite, PLOTMAN, which can handle different data types (light curves, images, and spectra) and perform basic data operations such as zooming, image overlays, solar rotation, etc. PLOTMAN is highly configurable and suited for visual data analysis and for creating publishable figures. PLOTMAN and SHOW_SYNOP work hand-in-hand for a convenient working environment. Our environment supports a growing number of solar instruments that currently includes RHESSI, SOHO/EIT, TRACE, SECCHI/EUVI, HINODE/XRT, and HINODE/EIS.

  12. Bridging analytical approaches for low-carbon transitions

    NASA Astrophysics Data System (ADS)

    Geels, Frank W.; Berkhout, Frans; van Vuuren, Detlef P.

    2016-06-01

    Low-carbon transitions are long-term multi-faceted processes. Although integrated assessment models have many strengths for analysing such transitions, their mathematical representation requires a simplification of the causes, dynamics and scope of such societal transformations. We suggest that integrated assessment model-based analysis should be complemented with insights from socio-technical transition analysis and practice-based action research. We discuss the underlying assumptions, strengths and weaknesses of these three analytical approaches. We argue that full integration of these approaches is not feasible, because of foundational differences in philosophies of science and ontological assumptions. Instead, we suggest that bridging, based on sequential and interactive articulation of different approaches, may generate a more comprehensive and useful chain of assessments to support policy formation and action. We also show how these approaches address knowledge needs of different policymakers (international, national and local), relate to different dimensions of policy processes and speak to different policy-relevant criteria such as cost-effectiveness, socio-political feasibility, social acceptance and legitimacy, and flexibility. A more differentiated set of analytical approaches thus enables a more differentiated approach to climate policy making.

  13. Conducting financial due diligence of medical practices.

    PubMed

    Louiselle, P

    1995-12-01

    Many healthcare organizations are acquiring medical practices in an effort to build more integrated systems of healthcare products and services. This acquisition activity must be approached cautiously to ensure that medical practices being acquired do not have deficiencies that would jeopardize integration efforts. Conducting a thorough due diligence analysis of medical practices before finalizing the transaction can limit the acquiring organizations' legal and financial exposure and is a necessary component to the acquisition process. The author discusses the components of a successful financial due diligence analysis and addresses some of the risk factors in a practice acquisition.

  14. Distributed software framework and continuous integration in hydroinformatics systems

    NASA Astrophysics Data System (ADS)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When multiple complicated models, multisource structured and unstructured data, and complex requirements analyses are involved, the platform design and integration of hydroinformatics systems become a challenge. To address these problems, we describe a distributed software framework and its continuous integration process for hydroinformatics systems. The distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node, and clients. Based on it, a GIS-based decision support system for the joint regulation of water quantity and water quality of a group of lakes in Wuhan, China was established.
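
    The abstract sketches a master node dispatching model runs to a cluster of model servers; as a toy stand-in for that architecture, the Python fragment below uses a local process pool in place of real servers to show the dispatch pattern. The model functions and task payloads are hypothetical.

```python
# Master-node dispatch pattern, with a process pool standing in for a
# cluster of model servers.
from concurrent.futures import ProcessPoolExecutor

def run_model(task):
    name, params = task
    return name, sum(params)  # placeholder for a hydrological model run

tasks = [("water_quantity", [1.0, 2.0]), ("water_quality", [0.3, 0.4])]

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=2) as pool:
        for name, result in pool.map(run_model, tasks):
            print(name, result)
```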

  15. Integrated multiplexed capillary electrophoresis system

    DOEpatents

    Yeung, Edward S.; Tan, Hongdong

    2002-05-14

    The present invention provides an integrated multiplexed capillary electrophoresis system for the analysis of sample analytes. The system integrates and automates multiple components, such as chromatographic columns and separation capillaries, and further provides a detector for the detection of analytes eluting from the separation capillaries. The system employs multiplexed freeze/thaw valves to manage fluid flow and sample movement. The system is computer controlled and is capable of processing samples through reaction, purification, denaturation, pre-concentration, injection, separation and detection in parallel fashion. Methods employing the system of the invention are also provided.

  16. Image Harvest: an open-source platform for high-throughput plant image processing and analysis.

    PubMed

    Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal

    2016-05-01

    High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest was developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
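
    To illustrate what extracting a digital trait from a plant image involves, the sketch below computes one simple trait, projected shoot area, by thresholding the green channel of an RGB image. This is not Image Harvest's actual API; the threshold and the toy image are hypothetical.

```python
# Count 'plant' pixels (green-dominant) as a proxy for projected shoot area.
import numpy as np

def shoot_area_pixels(rgb: np.ndarray, margin: int = 10) -> int:
    """Count pixels whose green channel dominates both red and blue."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mask = (g > r + margin) & (g > b + margin)
    return int(mask.sum())

# Toy image: a 20x20 patch of green 'plant' pixels on a gray background.
img = np.full((100, 100, 3), 120, dtype=np.uint8)
img[40:60, 40:60] = (40, 180, 50)
print(shoot_area_pixels(img))  # 400
```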

  17. Business intelligence modeling in launch operations

    NASA Astrophysics Data System (ADS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-05-01

    The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of the technologies most relevant to space exploration are experiencing the greatest change. End-to-end automation of sets of processes, rather than of organizational units, is becoming a major objective of enterprise information technology, and the cost element is a leading factor in future exploration systems. This technology project advances an integrated Planning and Management Simulation Model for evaluating the risks, costs, and reliability of Earth-to-orbit launch systems for space exploration. The approach builds on research done in the NASA ARC/KSC-developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise-level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on the recurring launch cost of operations, to provide management a tool for assessing systems safety and dependability versus cost, and to leverage lessons learned and empirical models from Shuttle and International Space Station operations to validate models applied to exploration. The system-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long-term sustainability. A planning and analysis test bed is needed for evaluating enterprise-level options and strategies for transit and launch systems as well as surface and orbital systems; this environment can also support agency simulation-based acquisition process objectives. The technology development approach is based on the collaborative effort set forth in the VTB, integrating operations models, process models, systems and environment models, and cost models into a comprehensive, disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root-cause knowledge from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete-event process and systems simulations, and large-scale simulation integration. An enterprise architecture is required for coherent integration of systems models, and it will require a plan for evolution over the life of the program. The proposed technology will produce long-term benefits in support of the NASA objectives for simulation-based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate the incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems.
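
    The coupling of process simulation with cost models described above can be illustrated at toy scale: the Python sketch below draws stochastic durations for a serial ground-processing flow and feeds them into a simple labor-rate cost model to estimate recurring cost per mission. It is a deliberate simplification, not a discrete-event engine, and every duration, step, and rate is hypothetical.

```python
# Stochastic process flow coupled to a cost model: recurring cost per mission.
import random

random.seed(1)
LABOR_RATE = 5000.0  # $/hour, hypothetical blended rate

def process_flow():
    """Serial ground-processing steps with stochastic durations (hours)."""
    steps = {"inspection": (40, 60),
             "integration": (100, 140),
             "pad operations": (30, 50)}
    for step, (lo, hi) in steps.items():
        yield step, random.uniform(lo, hi)

recurring = []
for mission in range(3):
    cost = sum(hours * LABOR_RATE for _, hours in process_flow())
    recurring.append(cost)
    print(f"mission {mission}: ${cost:,.0f}")
print(f"mean recurring cost: ${sum(recurring) / len(recurring):,.0f}")
```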

  18. Business Intelligence Modeling in Launch Operations

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.; Thirumalainambi, Rajkumar; Davis, Rodney D.

    2005-01-01

    This technology project advances an integrated Planning and Management Simulation Model for evaluating the risks, costs, and reliability of Earth-to-orbit launch systems for space exploration. The approach builds on research done in the NASA ARC/KSC-developed Virtual Test Bed (VTB) to integrate architectural, operations process, and mission simulations for the purpose of evaluating enterprise-level strategies to reduce cost, improve systems operability, and reduce mission risks. The objectives are to understand the interdependency of architecture and process on the recurring launch cost of operations, to provide management a tool for assessing systems safety and dependability versus cost, and to leverage lessons learned and empirical models from Shuttle and International Space Station operations to validate models applied to exploration. The system-of-systems concept is built to balance the conflicting objectives of safety, reliability, and process strategy in order to achieve long-term sustainability. A planning and analysis test bed is needed for evaluating enterprise-level options and strategies for transit and launch systems as well as surface and orbital systems; this environment can also support agency simulation-based acquisition process objectives. The technology development approach is based on the collaborative effort set forth in the VTB, integrating operations models, process models, systems and environment models, and cost models into a comprehensive, disciplined enterprise analysis environment. Significant emphasis is being placed on adapting root-cause knowledge from existing Shuttle operations to exploration. Technical challenges include cost model validation, integration of parametric models with discrete-event process and systems simulations, and large-scale simulation integration. An enterprise architecture is required for coherent integration of systems models, and it will require a plan for evolution over the life of the program. The proposed technology will produce long-term benefits in support of the NASA objectives for simulation-based acquisition, will improve the ability to assess architectural options versus safety/risk for future exploration systems, and will facilitate the incorporation of operability as a systems design consideration, reducing overall life cycle cost for future systems. The future of business intelligence in space exploration will focus on the intelligent system-of-systems real-time enterprise. In present business intelligence, a number of the technologies most relevant to space exploration are experiencing the greatest change. End-to-end automation of sets of processes, rather than of organizational units, is becoming a major objective of enterprise information technology, and the cost element is a leading factor in future exploration systems.

  19. The role of country-to-region assignments in global integrated modeling of energy, agriculture, land use, and climate

    NASA Astrophysics Data System (ADS)

    Kyle, P.; Patel, P.; Calvin, K. V.

    2014-12-01

    Global integrated assessment models used for understanding the linkages between the future energy, agriculture, and climate systems typically represent between 8 and 30 geopolitical macro-regions, balancing the benefits of geographic resolution with the costs of additional data collection, processing, analysis, and computing resources. As these models are continually being improved and updated in order to address new questions for the research and policy communities, it is worth examining the consequences of the country-to-region mapping schemes used for model results. This study presents an application of a data processing system built for the GCAM integrated assessment model that allows any country-to-region assignments, with a minimum of four geopolitical regions and a maximum of 185. We test ten different mapping schemes, including the specific mappings used in existing major integrated assessment models. We also explore the impacts of clustering nations into regions according to the similarity of the structure of each nation's energy and agricultural sectors, as indicated by multivariate analysis. Scenarios examined include a reference scenario, a low-emissions scenario, and scenarios with agricultural and buildings sector climate change impacts. We find that at the global level, the major output variables (primary energy, agricultural land use) are surprisingly similar regardless of regional assignments, but at finer geographic scales, differences are pronounced. We suggest that enhancing geographic resolution is advantageous for analysis of climate impacts on the buildings and agricultural sectors, due to the spatial heterogeneity of these drivers.
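
    The effect of country-to-region assignments is easy to demonstrate in miniature: the sketch below aggregates the same country-level data under two different region schemes, showing that the global total is invariant while the regional detail changes. The values and scheme memberships are hypothetical, not GCAM data.

```python
# Aggregate country-level primary energy (EJ) under two region schemes.

primary_energy_ej = {"DEU": 13.0, "FRA": 10.0, "POL": 4.0,
                     "CHN": 130.0, "IND": 35.0}

scheme_coarse = {"DEU": "Europe", "FRA": "Europe", "POL": "Europe",
                 "CHN": "Asia", "IND": "Asia"}
scheme_fine = {"DEU": "W. Europe", "FRA": "W. Europe", "POL": "E. Europe",
               "CHN": "China", "IND": "India"}

def aggregate(data, scheme):
    totals = {}
    for iso, value in data.items():
        totals[scheme[iso]] = totals.get(scheme[iso], 0.0) + value
    return totals

# Global totals match under either scheme; regional detail differs.
for scheme in (scheme_coarse, scheme_fine):
    regions = aggregate(primary_energy_ej, scheme)
    print(regions, "global:", sum(regions.values()))
```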

  20. A novel process control method for a TT-300 E-Beam/X-Ray system

    NASA Astrophysics Data System (ADS)

    Mittendorfer, Josef; Gallnböck-Wagner, Bernhard

    2018-02-01

    This paper presents some aspects of the process control method for a TT-300 E-Beam/X-Ray system at Mediscan, Austria. The novelty of the approach is the seamless integration of routine monitoring dosimetry with process data. This allows a parametric dose to be calculated for each production unit and consequently enables fine-grained, holistic process performance monitoring. Process performance is documented in process control charts for the analysis of individual runs as well as for historic trending of runs in specific process categories over a specified time range.
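
    A minimal sketch of the parametric-dose idea, under assumed relations: a dose computed per production unit from process parameters (here simply proportional to beam power over conveyor speed) is checked against control limits derived from in-control qualification runs. The dose relation, calibration constant, and all run data are hypothetical, not the TT-300 model.

```python
# Parametric dose per unit, checked against +/- 3-sigma control limits.
import statistics

K = 0.9  # calibration constant, hypothetical

def parametric_dose(beam_kw: float, speed_m_per_min: float) -> float:
    return K * beam_kw / speed_m_per_min  # kGy, simplified relation

# Control limits derived from in-control qualification runs.
baseline = [(20.0, 1.00), (20.2, 1.01), (19.8, 0.99), (20.1, 1.00), (19.9, 1.01)]
doses = [parametric_dose(p, v) for p, v in baseline]
mean, sd = statistics.mean(doses), statistics.stdev(doses)

# Each production unit gets a parametric dose plotted on the control chart.
for unit, (p, v) in enumerate([(20.0, 1.00), (20.1, 0.80)]):
    d = parametric_dose(p, v)
    flag = "OUT OF CONTROL" if abs(d - mean) > 3 * sd else "ok"
    print(f"unit {unit}: {d:.2f} kGy ({flag})")
```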
