Sample records for analysis process final

  1. Canister Storage Building (CSB) Hazard Analysis Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    POWERS, T.B.

    2000-03-16

    This report describes the methodology used in conducting the Canister Storage Building (CSB) hazard analysis to support the CSB final safety analysis report (FSAR) and documents the results. The hazard analysis process identified hazardous conditions and material-at-risk, determined causes for potential accidents, identified preventive and mitigative features, and qualitatively estimated the frequencies and consequences of specific occurrences. The hazard analysis was performed by a team of cognizant CSB operations and design personnel, safety analysts familiar with the CSB, and technical experts in specialty areas. Attachment A provides two lists of hazard analysis team members and describes the background and experience of each. The first is a complete list of the hazard analysis team members involved over the two-year process; the second is a subset of the first, consisting of those team members who reviewed and agreed to the final hazard analysis documentation. The material included in this report documents the final state of a nearly two-year process involving formal facilitated group sessions and independent hazard and accident analysis work. The hazard analysis process led to the selection of candidate accidents for further quantitative analysis. New information relative to the hazards, discovered during the accident analysis, was incorporated into the hazard analysis data in order to compile a complete profile of facility hazards. Through this process, the results of the hazard and accident analyses led directly to the identification of safety structures, systems, and components; technical safety requirements; and other controls required to protect the public, workers, and the environment.

  2. NASA Hazard Analysis Process

    NASA Technical Reports Server (NTRS)

    Deckert, George

    2010-01-01

    This viewgraph presentation reviews The NASA Hazard Analysis process. The contents include: 1) Significant Incidents and Close Calls in Human Spaceflight; 2) Subsystem Safety Engineering Through the Project Life Cycle; 3) The Risk Informed Design Process; 4) Types of NASA Hazard Analysis; 5) Preliminary Hazard Analysis (PHA); 6) Hazard Analysis Process; 7) Identify Hazardous Conditions; 8) Consider All Interfaces; 9) Work a Preliminary Hazard List; 10) NASA Generic Hazards List; and 11) Final Thoughts

  3. Solucion de Problemas y Procesos Cognoscitivos (Problem Solving and Cognitive Processes). Publication No. 41.

    ERIC Educational Resources Information Center

    Rimoldi, Horacio J. A.

    The study of problem solving is made through the analysis of the process that leads to the final answer. The type of information obtained through the study of the process is compared with the information obtained by studying the final answer. The experimental technique used permits identification of the sequence of questions (tactics) that subjects ask…

  4. Analysis of Tube Free Hydroforming using an Inverse Approach with FLD-based Adjustment of Process Parameters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nguyen, Ba Nghiep; Johnson, Kenneth I.; Khaleel, Mohammad A.

    2003-04-01

    This paper employs an inverse approach (IA) formulation for the analysis of tubes under free hydroforming conditions. The IA formulation is derived from that of Guo et al. established for flat sheet hydroforming analysis using constant strain triangular membrane elements. First, an incremental analysis of free hydroforming for a hot-dip galvanized (HG/Z140) DP600 tube is performed using the finite element Marc code. The deformed geometry obtained at the last converged increment is then used as the final configuration in the inverse analysis. This comparative study allows us to assess the predictive capability of the inverse analysis. The results will be compared with the experimental values determined by Asnafi and Skogsgardh. After that, a procedure based on a forming limit diagram (FLD) is proposed to adjust process parameters such as the axial feed and internal pressure. Finally, the adjustment process is illustrated through a re-analysis of the same tube using the inverse approach.

  5. Implementation of hazard analysis and critical control point (HACCP) in dried anchovy production process

    NASA Astrophysics Data System (ADS)

    Citraresmi, A. D. P.; Wahyuni, E. E.

    2018-03-01

    The aim of this study was to inspect the implementation of Hazard Analysis and Critical Control Point (HACCP) for identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. A cold storage process is needed at each anchovy processing step in order to maintain the product's physical and chemical condition, and a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole anchovy production process from receipt of raw materials to packaging of the final product. Data were analyzed using a descriptive method. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre Requisite Programs (PRP) and a preparation stage consisting of 5 initial stages and the 7 principles of HACCP. The results showed that CCPs were found in the boiling step, with the significant hazard of Listeria monocytogenes, and in the final sorting step, with the significant hazard of foreign material contamination in the product. The actions taken were controlling the boiling temperature at 100-105 °C for 3-5 minutes and training the employees of the sorting step.
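
The critical limits quoted above (boiling at 100-105 °C for 3-5 minutes) lend themselves to a simple automated check. The sketch below is hypothetical and not from the study; the function name and the way the limits are encoded are assumptions.

```python
# Hypothetical sketch: checking a CCP measurement against the critical
# limits reported for the boiling step (100-105 degC for 3-5 minutes).
def boiling_ccp_ok(temp_c, time_min,
                   temp_range=(100.0, 105.0), time_range=(3.0, 5.0)):
    """Return True if the boiling step stays within its critical limits."""
    return (temp_range[0] <= temp_c <= temp_range[1]
            and time_range[0] <= time_min <= time_range[1])

print(boiling_ccp_ok(102.0, 4.0))  # within limits -> True
print(boiling_ccp_ok(98.0, 4.0))   # temperature too low -> False
```

In a real HACCP monitoring system, a failed check would trigger the documented corrective action rather than just a flag.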

  6. Quality by design case study: an integrated multivariate approach to drug product and process development.

    PubMed

    Huang, Jun; Kaul, Goldi; Cai, Chunsheng; Chatlapalli, Ramarao; Hernandez-Abad, Pedro; Ghosh, Krishnendu; Nagi, Arwinder

    2009-12-01

    To facilitate an in-depth process understanding, and to offer opportunities for developing control strategies to ensure product quality, a combination of experimental design, optimization and multivariate techniques was integrated into the process development of a drug product. A process DOE was used to evaluate the effects of the design factors on manufacturability and final product CQAs, and to establish a design space that ensures the desired CQAs. Two types of analyses were performed to extract maximal information: DOE effect and response surface analysis, and multivariate analysis (PCA and PLS). The DOE effect analysis was used to evaluate the interactions and effects of three design factors (water amount, wet massing time and lubrication time) on the response variables (blend flow, compressibility and tablet dissolution). The design space was established by the combined use of DOE, optimization and multivariate analysis. Multivariate analysis of all variables from the DOE batches was conducted to study relationships between the variables and to evaluate the impact of material attributes and process parameters on manufacturability and final product CQAs. The integrated multivariate approach exemplifies the application of QbD principles and tools to drug product and process development.
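
As an illustration of the DOE effect analysis described above, the sketch below estimates main effects from a two-level full-factorial design. The design matrix and response values are made up; only the three factor names come from the abstract.

```python
import numpy as np

# Illustrative sketch (made-up response values): estimating main effects of
# the three design factors named in the study from a 2^3 full factorial.
# Main effect = mean(y at high level) - mean(y at low level).
factors = ["water amount", "wet massing time", "lubrication time"]
# Coded design matrix, one column per factor, levels -1/+1.
X = np.array([[s1, s2, s3] for s1 in (-1, 1) for s2 in (-1, 1) for s3 in (-1, 1)])
y = np.array([72., 74., 70., 75., 80., 83., 79., 86.])  # e.g. dissolution (%)

for name, col in zip(factors, X.T):
    effect = y[col == 1].mean() - y[col == -1].mean()
    print(f"{name}: main effect = {effect:+.2f}")
```

With replicated runs, the same contrast divided by its standard error gives a significance test for each factor.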

  7. A Comparative Analysis of Transitions from Education to Work in Europe (CATEWE). Final Report [and] Annex to the Final Report.

    ERIC Educational Resources Information Center

    Smyth, Emer; Gangl, Markus; Raffe, David; Hannan, Damian F.; McCoy, Selina

    This project aimed to develop a more comprehensive conceptual framework of school-to-work transitions in different national contexts and apply this framework to the empirical analysis of transition processes across European countries. It drew on these two data sources: European Community Labor Force Survey and integrated databases on national…

  8. The finite element simulation analysis research of 38CrSi cylindrical power spinning

    NASA Astrophysics Data System (ADS)

    Liang, Wei; Lv, Qiongying; Zhao, Yujuan; Lv, Yunxia

    2018-01-01

    In order to explore the influence of the main cylindrical spinning process parameters on the spinning process, this paper combines real tube power spinning practice with the ABAQUS finite element analysis software to simulate the tube power spinning process of 38CrSi steel. Through analysis of the stress and strain during part forming, the influence of the thickness reduction and the feed rate on the forming process is examined, as is the variation of the spinning force; finally, a reasonable combination of the main spinning process parameters is determined.

  9. Post-test navigation data analysis techniques for the shuttle ALT

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Postflight test analysis data processing techniques for shuttle approach and landing tests (ALT) navigation data are defined. Postflight test processor requirements are described along with operational and design requirements, data input requirements, and software test requirements. The postflight test data processing is described based on the natural test sequence: quick-look analysis, postflight navigation processing, and error isolation processing. Emphasis is placed on the tradeoffs that must remain open and subject to analysis until final definition is achieved in the shuttle data processing system and the overall ALT plan. A development plan for the implementation of the ALT postflight test navigation data processing system is presented. Conclusions are presented.

  10. AN ANALYSIS OF THE BEHAVIORAL PROCESSES INVOLVED IN SELF-INSTRUCTION WITH TEACHING MACHINES.

    ERIC Educational Resources Information Center

    HOLLAND, JAMES G.; SKINNER, B.F.

    THIS COLLECTION OF PAPERS CONSTITUTES THE FINAL REPORT OF A PROJECT DEVOTED TO AN ANALYSIS OF THE BEHAVIORAL PROCESSES UNDERLYING PROGRAMED INSTRUCTION. THE PAPERS ARE GROUPED UNDER THREE HEADINGS--(1) "PROGRAMING RESEARCH," (2) "BASIC SKILLS--RATIONALE AND PROCEDURE," AND (3) "BASIC SKILLS--SPECIFIC SKILLS." THE…

  11. U.S. Coast Guard SARSAT Final Evaluation Report. Volume II. Appendices.

    DOT National Transportation Integrated Search

    1987-03-01

    Contents: Controlled Tests; Controlled Test Error Analysis, Processing of Westwind Data; Exercises and Homing Tests; Further Analysis of Controlled Tests; Sar Case Analysis Tables; Narratives of Real Distress Cases; RCC Response Scenarios; Workload A...

  12. Knowledge transmission model with differing initial transmission and retransmission process

    NASA Astrophysics Data System (ADS)

    Wang, Haiying; Wang, Jun; Small, Michael

    2018-10-01

    Knowledge transmission is a cyclic dynamic diffusion process. The rate of acceptance of knowledge differs depending on whether or not the recipient has previously held the knowledge. In this paper, the knowledge transmission process is divided into an initial and a retransmission procedure, each with its own transmission and self-learning parameters. Based on an epidemic spreading model, we propose a naive-evangelical-agnostic (VEA) knowledge transmission model and derive mean-field equations to describe the dynamics of knowledge transmission in homogeneous networks. Theoretical analysis identifies a criterion for the persistence of knowledge, i.e., the reproduction number R0 depends on the smaller of the effective parameters of the initial and retransmission processes. Moreover, the final size of the evangelical population is related only to the retransmission parameters. Numerical simulations validate the theoretical analysis. Furthermore, the simulations indicate that increasing the initial transmission parameters, including the first-transmission and self-learning rates of naive individuals, can efficiently accelerate the velocity of knowledge transmission but has no effect on the final size of the evangelical population. In contrast, the retransmission parameters, including the retransmission and self-learning rates of agnostic individuals, have a significant effect on the rate of knowledge transmission, i.e., the larger these parameters, the greater the final density of evangelical individuals.
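
The paper's mean-field equations are not reproduced in the abstract, so the sketch below integrates a hypothetical three-compartment (naive/evangelical/agnostic) model with separate initial-transmission and retransmission parameters, in the spirit of the description above. The exact equations, rate names, and parameter values are all assumptions, not the authors' model.

```python
# Hedged sketch: hypothetical VEA-style mean-field model integrated by
# forward Euler. v: naive, e: evangelical (spreading), a: agnostic.
# beta1/s1: initial transmission and self-learning rates (naive),
# beta2/s2: retransmission and self-learning rates (agnostic),
# gamma: rate at which evangelical individuals become agnostic.
def simulate(beta1=0.3, s1=0.02, beta2=0.2, s2=0.01, gamma=0.1,
             v0=0.99, e0=0.01, a0=0.0, dt=0.01, steps=20000):
    v, e, a = v0, e0, a0
    for _ in range(steps):
        v_to_e = (beta1 * v * e + s1 * v) * dt   # initial transmission
        a_to_e = (beta2 * a * e + s2 * a) * dt   # retransmission
        e_to_a = gamma * e * dt                  # evangelical -> agnostic
        v, e, a = v - v_to_e, e + v_to_e + a_to_e - e_to_a, a + e_to_a - a_to_e
    return v, e, a

v, e, a = simulate()
print(f"final fractions: V={v:.3f}, E={e:.3f}, A={a:.3f}")
```

Because the three flows are conservative, v + e + a stays at 1; varying beta1/s1 versus beta2/s2 lets one probe the qualitative claim that only the retransmission parameters move the final evangelical density.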

  13. Analysis of procurement processes and development of recommendations for intelligent transportation systems (ITS) procurements : final report.

    DOT National Transportation Integrated Search

    2007-09-01

    Traditional state procurement processes are not well-suited to the procurement of Intelligent Transportation Systems (ITS). The objective of this study was to analyze Kentucky's existing procurement processes, identify strengths and weaknesses of e...

  14. Combining microwave resonance technology to multivariate data analysis as a novel PAT tool to improve process understanding in fluid bed granulation.

    PubMed

    Lourenço, Vera; Herdling, Thorsten; Reich, Gabriele; Menezes, José C; Lochmann, Dirk

    2011-08-01

    A set of 192 fluid bed granulation batches at industrial scale was monitored in-line using microwave resonance technology (MRT) to determine the moisture, temperature and density of the granules. Multivariate data analysis techniques such as multiway partial least squares (PLS), multiway principal component analysis (PCA) and multivariate batch control charts were applied to the collected batch data sets. The combination of all these techniques, along with off-line particle size measurements, led to significantly increased process understanding. A seasonality effect was brought to light that impacted further processing through its influence on the final granule size. Moreover, it was demonstrated by means of a PLS that a relation between the particle size and the MRT measurements can be quantitatively defined, highlighting a potential ability of the MRT sensor to predict information about the final granule size. This study has contributed to improving a fluid bed granulation process, and the process knowledge obtained shows that product quality can be built into process design, following Quality by Design (QbD) and Process Analytical Technology (PAT) principles. Copyright © 2011. Published by Elsevier B.V.
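
As a rough illustration of relating in-line sensor readings to a final quality attribute, the sketch below fits a linear model on synthetic data. The study used multiway PLS; ordinary least squares is substituted here purely for brevity, and every number below is made up.

```python
import numpy as np

# Illustrative sketch with synthetic data: relating in-line MRT readings
# (moisture, temperature, density) to final granule size. Plain least
# squares stands in for the multiway PLS actually used in the study.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))                       # [moisture, temp, density]
true_w = np.array([0.8, 0.1, -0.5])                # invented "true" weights
y = X @ true_w + rng.normal(scale=0.05, size=30)   # coded granule size

w, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted coefficients:", np.round(w, 2))
```

PLS would be preferred over plain regression when the sensor channels are strongly collinear, which is typically the case for spectroscopic or resonance data.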

  15. Calendering and Rolling of Viscoplastic Materials: Theory and Experiments

    NASA Astrophysics Data System (ADS)

    Mitsoulis, E.; Sofou, S.; Muliawan, E. B.; Hatzikiriakos, S. G.

    2007-04-01

    The calendering and rolling processes are used in a wide variety of industries for the production of rolled sheets or films of specific thickness and final appearance. The final sheet thickness obtained depends mainly on the rheological properties of the material. The materials used in the present study are foodstuffs (mozzarella cheese and flour-water dough) used in food processing. These materials are rheologically viscoplastic, obeying the Herschel-Bulkley model. The results give the final sheet thickness and the torque as a function of the roll speed. Theoretical analysis based on the Lubrication Approximation Theory (LAT) shows that LAT is a good predictive tool for calendering, where the sheet thickness is very small compared with the roll size. However, in rolling, where this is not true, LAT does not hold, and a 2-D analysis is necessary.
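
The Herschel-Bulkley model named above relates shear stress to shear rate as tau = tau_y + K * (shear rate)^n above the yield stress tau_y. A minimal sketch, with illustrative parameter values rather than the paper's fitted ones:

```python
# Minimal sketch of the Herschel-Bulkley constitutive model: shear stress
# tau = tau_y + K * (shear rate)**n once the yield stress tau_y is exceeded.
# Parameter values below are illustrative, not fitted values from the paper.
def herschel_bulkley(shear_rate, tau_y=1.2e3, K=4.0e3, n=0.4):
    """Shear stress (Pa) for a given shear rate (1/s)."""
    return tau_y + K * shear_rate ** n

for rate in (0.1, 1.0, 10.0):
    print(f"shear rate {rate:5.1f} 1/s -> stress {herschel_bulkley(rate):.0f} Pa")
```

Setting n = 1 recovers the Bingham plastic, and tau_y = 0 the power-law fluid, which is why the model covers both doughs and cheeses reasonably well.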

  16. Study for Identification of Beneficial Uses of Space (BUS). Volume 2: Technical report. Book 4: Development and business analysis of space processed surface acoustic wave devices

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Preliminary development plans, analysis of required R and D and production resources, the costs of such resources, and, finally, the potential profitability of a commercial space processing opportunity for the production of very high frequency surface acoustic wave devices are presented.

  17. COMBATXXI, JDAFS, and LBC Integration Requirements for EASE

    DTIC Science & Technology

    2015-10-06

    ...process as linear and as new data is made available, any previous analysis is obsolete and has to start the process over again. Figure 2 proposes a... The final line of the manifest file names the scenario file associated with the run. Under the usual practice, the analyst now starts the COMBATXXI... describes which events are to be logged. Finally the scenario is started with the click of a button. The simulation generates logs of a couple of sorts...

  18. Validation of contractor HMA testing data in the materials acceptance process - phase II : final report.

    DOT National Transportation Integrated Search

    2016-08-01

    This study conducted an analysis of the SCDOT HMA specification. A Research Steering Committee provided oversight : of the process. The research process included extensive statistical analyses of test data supplied by SCDOT. : A total of 2,789 AC tes...

  19. Slope Stability Analysis of Waste Dump in Sandstone Open Pit Osielec

    NASA Astrophysics Data System (ADS)

    Adamczyk, Justyna; Cała, Marek; Flisiak, Jerzy; Kolano, Malwina; Kowalski, Michał

    2013-03-01

    This paper presents the slope stability analysis for the current as well as the projected (final) geometry of the waste dump at the Sandstone Open Pit "Osielec". Six sections were selected for the stability analysis. The final geometry of the waste dump was then designed and its stability analyzed. On the basis of the analysis results, opportunities to improve the stability of the structure were identified. The next issue addressed in the paper was determining the proportions of a mixture of mining and processing wastes for which the waste dump remains stable. Stability calculations were carried out using the Janbu method, which belongs to the limit equilibrium methods.
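
The Janbu method mentioned above belongs to the limit equilibrium (method of slices) family. The sketch below implements Janbu's simplified method for a dry slope, omitting the empirical correction factor f0; the slice geometry and soil parameters are illustrative assumptions, not data from the paper.

```python
import math

# Hedged sketch of Janbu's simplified method (method of slices, dry slope,
# correction factor f0 omitted). Each slice: weight W (kN/m), base
# inclination alpha (rad), base width b (m). Soil: cohesion c (kPa),
# friction angle phi (rad). The factor of safety F appears on both sides,
# so it is found by fixed-point iteration.
def janbu_fos(slices, c, phi, tol=1e-6, max_iter=100):
    F = 1.0                                          # initial guess
    for _ in range(max_iter):
        num = den = 0.0
        for W, alpha, b in slices:
            n_alpha = math.cos(alpha) ** 2 * (1 + math.tan(alpha) * math.tan(phi) / F)
            num += (c * b + W * math.tan(phi)) / n_alpha
            den += W * math.tan(alpha)
        F_new = num / den
        if abs(F_new - F) < tol:
            return F_new
        F = F_new
    return F

# Illustrative 4-slice cross-section (not the paper's geometry).
slices = [(150.0, math.radians(35), 2.0),
          (300.0, math.radians(25), 2.0),
          (250.0, math.radians(15), 2.0),
          (120.0, math.radians(5), 2.0)]
print(f"factor of safety ~ {janbu_fos(slices, c=20.0, phi=math.radians(28)):.2f}")
```

A factor of safety above 1 indicates the resisting forces exceed the driving forces for the assumed slip surface; in practice many trial surfaces are searched for the minimum F.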

  20. Analysis of 3D printing parameters of gears for hybrid manufacturing

    NASA Astrophysics Data System (ADS)

    Budzik, Grzegorz; Przeszlowski, Łukasz; Wieczorowski, Michal; Rzucidlo, Arkadiusz; Gapinski, Bartosz; Krolczyk, Grzegorz

    2018-05-01

    The paper deals with the analysis and selection of parameters for rapid prototyping of gears by selective sintering of metal powders. Based on an analysis of the market for additive manufacturing technology in different sectors of industry, the results presented show the wide spectrum of applications of RP systems in the manufacturing of machine elements; considerable growth of these methods over the past years can be observed. The characteristic errors of a printed model with respect to an ideal one were pointed out for each technique. Special attention was paid to the method of preparing the numerical data (CAD/STL/RP). Moreover, an analysis of manufacturing processes for gear-type elements is presented. The tested gears were modeled with different allowances for final machining and made by DMLS. Metallographic analysis and strength tests on prepared specimens were performed and used to compare the real properties of the material with the nominal ones. To improve the surface quality after sintering, the gears were subjected to final machining. The geometry of the gears after the hybrid manufacturing method was analyzed (fig. 1). The manufacturing process was defined in a traditional way as well as with the aid of modern manufacturing techniques. The methodology and results obtained can be applied to machine elements other than gears, and constitute a general theory of production processes for rapid prototyping methods as well as for the design and implementation of production.

  1. Digital image processing and analysis for activated sludge wastewater treatment.

    PubMed

    Khan, Muhammad Burhan; Lee, Xue Yong; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Malik, Aamir Saeed

    2015-01-01

    The activated sludge system is generally used in wastewater treatment plants for processing domestic influent. Conventionally, activated sludge wastewater treatment is monitored by measuring physico-chemical parameters such as total suspended solids (TSSol), sludge volume index (SVI) and chemical oxygen demand (COD). These measurements require laboratory tests that take many hours to yield a final result. Digital image processing and analysis offers a better alternative, not only to monitor and characterize the current state of the activated sludge but also to predict its future state. The characterization by image processing and analysis is done by correlating the time evolution of parameters extracted by image analysis of flocs and filaments with the physico-chemical parameters. This chapter briefly reviews activated sludge wastewater treatment and the procedures of image acquisition, preprocessing, segmentation and analysis in the specific context of activated sludge. In the latter part, additional procedures such as z-stacking and image stitching, not previously used in the context of activated sludge, are introduced for wastewater image preprocessing. Different preprocessing and segmentation techniques are proposed, along with a survey of the imaging procedures reported in the literature. Finally, the image-analysis-based morphological parameters, and their correlation with the monitoring and prediction of activated sludge, are discussed. It is observed that image analysis can play a very useful role in the monitoring of activated sludge wastewater treatment plants.
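
One segmentation step described above can be sketched as simple thresholding followed by extraction of a morphological parameter (here, floc area fraction). The synthetic "image", the threshold value, and the parameter choice are assumptions for illustration only.

```python
import numpy as np

# Hypothetical sketch of a segmentation step: threshold a grayscale sludge
# image and extract a simple morphological parameter (floc area fraction).
# The "image" below is synthetic, standing in for a micrograph.
rng = np.random.default_rng(1)
image = rng.uniform(0.0, 1.0, size=(64, 64))
image[20:40, 20:40] = 0.05            # dark square standing in for a floc

threshold = 0.3                       # segmentation: dark pixels = floc
mask = image < threshold
area_fraction = mask.mean()
print(f"floc area fraction: {area_fraction:.3f}")
```

Real pipelines would add denoising before thresholding and connected-component labeling afterwards, so that per-floc parameters (area, perimeter, form factor) can be tracked over time against SVI and TSSol.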

  2. Developing a Methodology for Designing Systems of Instruction.

    ERIC Educational Resources Information Center

    Carpenter, Polly

    This report presents a description of a process for instructional system design, identification of the steps in the design process, and determination of their sequence and interrelationships. As currently envisioned, several interrelated steps must be taken, five of which provide the inputs to the final design process. There are analysis of…

  3. Occupational Analysis Technology: Expanded Role in Development of Cost-Effective Maintenance Systems. Final Report.

    ERIC Educational Resources Information Center

    Foley, John P., Jr.

    A study was conducted to refine and coordinate occupational analysis, job performance aids, and elements of the instructional systems development process for task specific Air Force maintenance training. Techniques for task identification and analysis (TI & A) and data gathering techniques for occupational analysis were related. While TI &…

  4. Application of Failure Mode and Effect Analysis (FMEA), cause and effect analysis, and Pareto diagram in conjunction with HACCP to a corn curl manufacturing plant.

    PubMed

    Varzakas, Theodoros H; Arvanitoyannis, Ioannis S

    2007-01-01

    The Failure Mode and Effect Analysis (FMEA) model has been applied to the risk assessment of corn curl manufacturing. A tentative application of FMEA to the snacks industry was attempted in an effort to exclude the presence of GMOs in the final product. This is of crucial importance from both the ethical and the legislative (Regulations EC 1829/2003; EC 1830/2003; Directive EC 18/2001) points of view. Preliminary Hazard Analysis and Fault Tree Analysis were used to analyze and predict the occurring failure modes in a food chain system (a corn curls processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical control points were identified and implemented in the cause and effect diagram (also known as the Ishikawa, tree, or fishbone diagram). Finally, Pareto diagrams were employed toward optimizing the GMO detection potential of FMEA.

  5. The Development of Reading for Comprehension: An Information Processing Analysis. Final Report.

    ERIC Educational Resources Information Center

    Schadler, Margaret; Juola, James F.

    This report summarizes research performed at the Universtiy of Kansas that involved several topics related to reading and learning to read, including the development of automatic word recognition processes, reading for comprehension, and the development of new computer technologies designed to facilitate the reading process. The first section…

  6. U.S. EPA'S RESEARCH ON LIFE-CYCLE ANALYSIS

    EPA Science Inventory

    Life-cycle analysis (LCA) consists of looking at a product, process or activity from its inception through its completion. For consumer products, this includes the stages of raw material acquisition, manufacturing and fabrication, distribution, consumer use/reuse and final disposa...

  7. [State Recognition of Solid Fermentation Process Based on Near Infrared Spectroscopy with Adaboost and Spectral Regression Discriminant Analysis].

    PubMed

    Yu, Shuang; Liu, Guo-hai; Xia, Rong-sheng; Jiang, Hui

    2016-01-01

    In order to achieve rapid monitoring of the process state of solid state fermentation (SSF), this study attempted qualitative identification of the process state of SSF of feed protein by Fourier transform near infrared (FT-NIR) spectroscopy. More specifically, FT-NIR spectroscopy combined with an Adaboost-SRDA-NN integrated learning algorithm was used to accurately and rapidly monitor chemical and physical changes in SSF of feed protein without the need for chemical analysis. First, the raw spectra of all 140 fermentation samples were collected with a Fourier transform near infrared spectrometer (Antaris II) and preprocessed with the standard normal variate transformation (SNV) algorithm. Thereafter, the characteristic information of the preprocessed spectra was extracted by spectral regression discriminant analysis (SRDA). Finally, the nearest neighbors (NN) algorithm was selected as the basic classifier, and a state recognition model was built to identify the different fermentation samples in the validation set. Experimental results showed that the SRDA-NN model outperformed two other NN models, developed with feature information from principal component analysis (PCA) and linear discriminant analysis (LDA), and achieved a correct recognition rate of 94.28% in the validation set. To further improve the recognition accuracy of the final model, an Adaboost-SRDA-NN ensemble learning algorithm was proposed by integrating the Adaboost and SRDA-NN methods, and the presented algorithm was used to construct an online monitoring model of the process state of SSF of feed protein. Experimental results showed that the prediction performance of the SRDA-NN model was further enhanced by the Adaboost lifting algorithm, and the correct recognition rate of the Adaboost-SRDA-NN model reached 100% in the validation set. The overall results demonstrate that the SRDA algorithm can effectively extract spectral feature information and reduce spectral dimensionality in the model calibration process of qualitative NIR spectroscopy analysis. In addition, the Adaboost lifting algorithm can improve the classification accuracy of the final model. The results obtained in this work provide a research foundation for developing online monitoring instruments for SSF processes.
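
The standard normal variate (SNV) preprocessing step named above can be sketched as a row-wise standardization of each spectrum, which removes multiplicative scatter and baseline-offset effects; the spectra below are synthetic.

```python
import numpy as np

# Sketch of SNV preprocessing: each spectrum is centered by its own mean
# and scaled by its own standard deviation. Input spectra are synthetic.
def snv(spectra):
    """Row-wise SNV: (x - mean(x)) / std(x) for each spectrum."""
    spectra = np.asarray(spectra, dtype=float)
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

raw = np.array([[1.0, 2.0, 3.0, 4.0],
                [10.0, 20.0, 30.0, 40.0]])   # same shape, different scale
corrected = snv(raw)
print(np.round(corrected, 3))  # the two rows coincide after SNV
```

Because SNV normalizes each spectrum independently, two spectra differing only by a multiplicative factor become identical, which is exactly the scatter effect it is meant to remove.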

  8. Analysis of microbiological contamination in mixed pressed ham and cooked sausage in Korea.

    PubMed

    Park, Myoung-Su; Wang, Jun; Park, Joong-Hyun; Forghani, Fereidoun; Moon, Jin-San; Oh, Deog-Hwan

    2014-03-01

    The objective of this study was to investigate the microbial contamination levels (aerobic bacteria plate count [APC], coliforms, Escherichia coli, Staphylococcus aureus, and Listeria monocytogenes) in mixed pressed ham and cooked sausage. A total of 180 samples were collected from factories with and without hazard analysis critical control point (HACCP) systems at four steps: after chopping (AC), after mixing (AM), cooling after the first heating process, and cooling after the second heating process. For ham, APCs and coliform and E. coli counts increased when ingredients were added to the meat at the AC step. Final product APC was 1.63 to 1.85 log CFU/g, and coliforms and E. coli were not detected. S. aureus and L. monocytogenes were found in nine (15.0%) and six (10.0%) samples, respectively, but only at the AC and AM steps and not in the final product. Sausage results were similar to those for ham. The final product APC was 1.52 to 3.85 log CFU/g, and coliforms and E. coli were not detected. S. aureus and L. monocytogenes were found in 29 (24.2%) and 25 (20.8%) samples at the AC and AM steps, respectively, but not in the final product. These results indicate that the temperature and time of the first and second heating are of extreme importance to ensure the microbiological safety of the final product regardless of whether a HACCP system is in place. Microorganism contamination must be monitored regularly and regulations regarding sanitization during processing should be improved. Education regarding employee personal hygiene, environmental hygiene, prevention of cross-contamination, ingredient control, and step-by-step process control is needed to reduce the risk of food poisoning.
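
The counts above are reported in log CFU/g. A minimal sketch of the standard conversion from a plate count to that unit follows; the colony numbers and dilution used are illustrative, not the study's data.

```python
import math

# Minimal sketch: converting a plate count at a known dilution into the
# log CFU/g units used in the abstract. Example numbers are illustrative.
def log_cfu_per_g(colonies, dilution_factor, plated_ml, sample_g=1.0):
    """log10 CFU per gram from a colony count at a given dilution."""
    cfu = colonies * dilution_factor / plated_ml / sample_g
    return math.log10(cfu)

# e.g. 43 colonies on a 10^-2 dilution plate, 1 mL plated:
print(f"{log_cfu_per_g(43, 100, 1.0):.2f} log CFU/g")
```

The log scale is why the reported APC range of 1.52 to 3.85 log CFU/g spans more than a 100-fold difference in actual counts.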

  9. 75 FR 78978 - Record of Decision for the 158th Fighter Wing's Proposed Realignment of National Guard Avenue and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-17

    ... resources and personnel). The decision was based on matters discussed in the Final Environmental Impact... from the public and regulatory agencies, and other relevant factors. The Final EIS was made available... NEPA of 1969 (42 USC. 4321, et seq.) and the Air Force's Environmental Impact Analysis Process (EIAP...

  10. Phase B: Final definition and preliminary design study for the initial Atmospheric Cloud Physics Laboratory (ACPL): A spacelab mission payload. Final review (DR-MA-03)

    NASA Technical Reports Server (NTRS)

    Clausen, O. W.

    1976-01-01

    Systems design for an initial atmospheric cloud physics laboratory to study microphysical processes in zero gravity is presented. Included are descriptions of the fluid, thermal, mechanical, control and data, and electrical distribution interfaces with Spacelab. Schedule and cost analysis are discussed.

  11. Xylo-Oligosaccharide Process Development, Composition, and Techno-Economic Analysis. Cooperative Research and Development Final Report, CRADA Number CRD-12-483

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shekiro, Joe; Elander, Richard

    2015-12-01

    The purpose of this cooperative work agreement between General Mills Inc. (GMI) and NREL is to determine the feasibility of producing a valuable food ingredient (xylo-oligosaccharides or XOS), a highly soluble fiber material, from agricultural waste streams, at an advantaged cost level relative to similar existing ingredients. The scope of the project includes pilot-scale process development (Task 1), compositional analysis (Task 2), and techno-economic analysis (Task 3).

  12. A retrospective review of how nonconformities are expressed and finalized in external inspections of health-care facilities.

    PubMed

    Hovlid, Einar; Høifødt, Helge; Smedbråten, Bente; Braut, Geir Sverre

    2015-09-23

    External inspections are widely used in health care as a means of improving the quality of care. However, the way external inspections affect the inspected organization is poorly understood. A better understanding of these processes is important for explaining the varying effects of external inspections in different organizations and, in turn, can contribute to the development of more effective ways of conducting inspections. The way the inspecting organization states its grounds for noncompliant behavior and subsequently follows up to enforce the necessary changes can have implications for the inspected organization's change process. We explore how inspecting organizations express and state their grounds for noncompliant behavior and how they follow up to enforce improvements. We conducted a retrospective review, in which we performed a content analysis of the documents from 36 external inspections in Norway. Our analysis was guided by Donabedian's structure, process, and outcome model. Deficiencies in the management system in combination with clinical work processes were considered nonconformities by the inspecting organizations. Two characteristic patterns were identified in the way observations led to a statement of nonconformity: one in which it was clearly demonstrated how deficiencies in the management system could affect clinical processes, and one in which this connection was not demonstrated. Two characteristic patterns were also identified in the way the inspecting organization followed up and finalized its inspection: one in which the inspection was finalized solely on the basis of documented changes in the structural deficiencies addressed in the nonconformity statement, and one based on documented changes in both the structural and the process deficiencies addressed in the nonconformity statement. External inspections are performed to improve the quality of care. To accomplish this aim, we suggest that nonconformities should be grounded in observations that clearly demonstrate how deficiencies in the management system might affect clinical processes, and that the inspection should be finalized on the basis of documented changes in both the structural and the process deficiencies addressed in the nonconformity statement.

  13. Bending Distortion Analysis of a Steel Shaft Manufacturing Chain from Cold Drawing to Grinding

    NASA Astrophysics Data System (ADS)

    Dias, Vinicius Waechter; da Silva Rocha, Alexandre; Zottis, Juliana; Dong, Juan; Epp, Jérémy; Zoch, Hans Werner

    2017-04-01

    Shafts are usually manufactured from bars that are cold drawn, cut, machined, induction hardened, straightened, and finally ground. The main distortion is characterized by bending that appears after induction hardening and is corrected by straightening and/or grinding. In this work, the effect of variations in manufacturing parameters on distortion was analyzed for a complete manufacturing route for the production of induction hardened shafts made of Grade 1045 steel. A DoE plan was implemented varying the drawing angle, cutting method, induction hardening layer depth, and grinding penetration depth. The distortion was determined by calculating curvature vectors from dimensional analysis of 3D coordinate measurements. Optical microscopy, microhardness testing, residual stress analysis, and FEM process simulation were used to evaluate and understand the effects of the main carriers of distortion potential. The drawing process was identified as the most significant influence on the final distortion of the shafts.
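    As a hedged illustration of the curvature-vector step (the geometry, function name, and values below are hypothetical, not the authors' procedure), bending of a shaft centerline measured by 3D coordinate metrology can be reduced to the largest perpendicular deviation from the end-to-end chord:

```python
import numpy as np

def bending_vector(points):
    """Perpendicular deviation of measured axis points from the end-to-end chord.

    points: (n, 3) array of centerline coordinates along the shaft.
    Returns the deviation vector of largest magnitude, a simple proxy for
    a curvature (bending) vector.
    """
    p0, p1 = points[0], points[-1]
    chord = (p1 - p0) / np.linalg.norm(p1 - p0)
    rel = points - p0
    # component of each point perpendicular to the chord
    perp = rel - np.outer(rel @ chord, chord)
    return perp[np.argmax(np.linalg.norm(perp, axis=1))]

# synthetic shaft, 100 mm long, bowed slightly in +y at mid-span
z = np.linspace(0.0, 100.0, 11)
pts = np.stack([np.zeros_like(z), 0.02 * z * (100.0 - z) / 100.0, z], axis=1)
v = bending_vector(pts)   # magnitude = bend height, direction = bending plane
```

For a real shaft, `points` would come from the coordinate-measurement scan; the vector's magnitude and direction give the bend height and the bending plane.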

  14. An Operationally Responsive Space Architecture for 2025

    DTIC Science & Technology

    2008-06-22

    Organizational Relationships, Asset Loss Mitigation, Availability, Flexibility, and Streamlined Acquisition Processes. These pillars allowed the solutions...were considered. Analysis was further supported by a performance versus cost process which provided a final test of solution feasibility. Relative cost...Availability, Flexibility, and Streamlined Acquisition Processes. These pillars allowed the solutions, material and non-material, to be organized for

  15. Noise limitations in optical linear algebra processors.

    PubMed

    Batsell, S G; Jong, T L; Walkup, J F; Krile, T F

    1990-05-10

    A general statistical noise model is presented for optical linear algebra processors. A statistical analysis which includes device noise, the multiplication process, and the addition operation is undertaken. We focus on those processes which are architecturally independent. Finally, experimental results which verify the analytical predictions are also presented.

  16. Numerical simulation and experimentation of adjusting the curvatures of micro-cantilevers using the water-confined laser-generated plasma

    NASA Astrophysics Data System (ADS)

    Gu, Chunxing; Shen, Zongbao; Liu, Huixia; Li, Pin; Lu, Mengmeng; Zhao, Yinxin; Wang, Xiao

    2013-04-01

    This paper describes a precise, non-contact adjustment technique that uses water-confined laser-generated plasma to adjust the curvature of micro-components (micro-mechanical cantilevers). A series of laser shock micro-adjustment experiments was conducted on 0.4 mm-thick Al samples using pulsed Nd:YAG lasers operating at a 1064 nm wavelength to verify the technical feasibility. A systematic study was carried out on the effects of various factors on the adjustment results, including laser energy, laser focus position, number of laser shocks, and confined regime configuration. The results show that different bending angles and bending directions can be obtained by changing the laser processing parameters. For the adjustment process, suitable bending deformation could also be generated without the confined regime configuration, but at larger energies the final surfaces showed signs of ablation, resulting in poor surface quality. An analysis procedure comprising dynamic analysis performed with ANSYS/LS-DYNA and static analysis performed with ANSYS is presented in detail to simulate laser shock micro-adjustment and predict the final bending deformation. The predicted bending profiles correlate well with the available experimental data, showing that the finite element analysis can properly predict the final curvatures of the micro-cantilevers.

  17. Regulatory Impact Analysis: Amendments to the National Emission Standards for Hazardous Air Pollutants (NESHAP) and New Source Performance Standards (NSPS) for the Portland Cement Manufacturing Industry Final Report

    EPA Pesticide Factsheets

    For the regulatory process, EPA is required to develop a regulatory impact analysis (RIA). This August 2010 RIA includes an economic impact analysis (EIA) and a small entity impacts analysis, and documents the RIA methods and results for the 2010 rules.

  18. Influences on Academic Achievement Across High and Low Income Countries: A Re-Analysis of IEA Data.

    ERIC Educational Resources Information Center

    Heyneman, S.; Loxley, W.

    Previous international studies of science achievement put the data through a process of winnowing to decide which variables to keep in the final regressions. Variables were allowed to enter the final regressions if they met a minimum beta coefficient criterion of 0.05 averaged across rich and poor countries alike. The criterion was an average…

  19. An Investigation of Spoken Brazilian Portuguese: Part I, Technical Report. Final Report.

    ERIC Educational Resources Information Center

    Hutchins, John A.

    This final report of a study which developed a working corpus of spoken and written Portuguese from which syntactical studies could be conducted includes computer-processed data on which the findings and analysis are based. A data base, obtained by taping some 487 conversations between Brazil and the United States, serves as the corpus from which…

  20. Optimization of Parameter Ranges for Composite Tape Winding Process Based on Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Yu, Tao; Shi, Yaoyao; He, Xiaodong; Kang, Chao; Deng, Bo; Song, Shibo

    2017-08-01

    This study focuses on the parameter sensitivity of the winding process for composite prepreg tape. Methods for multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis are proposed. A polynomial empirical model of interlaminar shear strength is established by the response surface experimental method. Using this model, the relative sensitivity of key process parameters, including temperature, tension, pressure, and velocity, is calculated, and the single-parameter sensitivity curves are obtained. From the analysis of the sensitivity curves, the stability and instability ranges of each parameter are identified. Finally, an optimization method for the winding process parameters is developed. The analysis results show that the optimized ranges of the process parameters for interlaminar shear strength are: temperature within [100 °C, 150 °C], tension within [275 N, 387 N], pressure within [800 N, 1500 N], and velocity within [0.2 m/s, 0.4 m/s], respectively.
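    The fitted-model-to-sensitivity step can be sketched as follows. Only the method (a polynomial empirical model whose local slope serves as the single-parameter sensitivity) follows the abstract; the data points and the 0.1 MPa/°C stability threshold are invented for illustration:

```python
import numpy as np

# Illustrative single-parameter sensitivity read from a fitted quadratic
# response surface. All numbers are synthetic.
temp = np.array([80.0, 100.0, 120.0, 140.0, 160.0])   # winding temperature, degC
ilss = np.array([38.0, 44.0, 47.0, 46.5, 42.0])       # interlaminar shear strength, MPa

model = np.poly1d(np.polyfit(temp, ilss, 2))   # quadratic empirical model
slope = model.deriv()                          # sensitivity d(ILSS)/dT

# a parameter range is "stable" where the response is insensitive to it
grid = np.linspace(80.0, 160.0, 81)
stable = grid[np.abs(slope(grid)) < 0.1]       # |slope| < 0.1 MPa/degC
```

The stable interval printed this way plays the role of the "stability range" of a parameter; the same slope-thresholding is repeated per parameter in a multi-parameter study.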

  1. Programming and machining of complex parts based on CATIA solid modeling

    NASA Astrophysics Data System (ADS)

    Zhu, Xiurong

    2017-09-01

    Complex parts were designed for CNC machining using CATIA solid modeling, programming, and simulated processing, illustrating the importance of programming and process-planning technology in the field of CNC machining. In the design process, the working principle was first analyzed in depth, and then the dimensions and interconnected dimension chains were designed; backstepping and a variety of other methods were then used to calculate the final dimensions of the parts. Material selection involved careful study and repeated testing, with 6061 aluminum alloy chosen in the end. Given the actual conditions of the machining site, various factors in the machining process must be considered comprehensively. The simulation should be based on the actual machining process, not on shape alone. The result can be used as a reference for machining.

  2. Breaking the Change Barrier: A 40 Year Analysis of Air Force Pilot Retention Solutions

    DTIC Science & Technology

    national defense. A problem/solution research methodology using the organizational management theory of path dependence explored the implications of the...exodus is to start the incentive process earlier in the career and prior to the final decision to separate. Path dependent analysis indicates all prior... incentive options and personal involvement in the overall process. The Air Force can annually budget and forecast incentive requirements and personnel

  3. Spectroscopic analysis and control

    DOEpatents

    Tate, James D.; Reed, Christopher J.; Domke, Christopher H.; Le, Linh; Seasholtz, Mary Beth; Weber, Andy; Lipp, Charles

    2017-04-18

    Apparatus for spectroscopic analysis which includes a tunable diode laser spectrometer having a digital output signal and a digital computer for receiving the digital output signal from the spectrometer, the digital computer programmed to process the digital output signal using a multivariate regression algorithm. In addition, a spectroscopic method of analysis using such apparatus. Finally, a method for controlling an ethylene cracker hydrogenator.

  4. Paediatric x-ray radiation dose reduction and image quality analysis.

    PubMed

    Martin, L; Ruddlesden, R; Makepeace, C; Robinson, L; Mistry, T; Starritt, H

    2013-09-01

    Collaboration of multiple staff groups has resulted in a significant reduction in the risk of radiation-induced cancer from radiographic x-ray exposure during childhood. In this study at an acute NHS hospital trust, a preliminary audit identified the initial exposure factors. These were compared with European and UK guidance, leading to the introduction of new factors that complied with European guidance on x-ray tube potentials. Image quality was assessed using standard anatomical criteria scoring, and visual grading characteristics analysis assessed the impact on image quality of changes in exposure factors. This analysis determined the acceptability of gradual radiation dose reduction below the European and UK guidance levels. Chest and pelvis exposures were optimised, achieving dose reduction for each age group, with a 7%-55% decrease in critical organ dose. Clinicians confirmed diagnostic image quality throughout the iterative process. Analysis of images acquired with the preliminary and final exposure factors indicated an average visual grading analysis result of 0.5, demonstrating equivalent image quality. The optimisation process and final radiation doses are reported for Carestream computed radiography to aid other hospitals in minimising radiation risks to children.
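    A visual grading result of 0.5 corresponds, in nonparametric terms, to the probability that a randomly chosen image from the new protocol is graded higher than one from the reference protocol, counting ties as half. A minimal sketch with hypothetical 5-point anatomical-criteria scores:

```python
import numpy as np

def vgc_auc(scores_ref, scores_new):
    """Area under the visual grading characteristics curve, estimated
    nonparametrically as P(new > ref) + 0.5 * P(new == ref).
    An AUC near 0.5 indicates equivalent perceived image quality."""
    ref = np.asarray(scores_ref)[:, None]
    new = np.asarray(scores_new)[None, :]
    return float(np.mean(new > ref) + 0.5 * np.mean(new == ref))

# hypothetical scores for images taken with the reference and revised factors
ref = [3, 4, 3, 5, 4, 3, 4, 4]
new = [4, 3, 3, 5, 4, 4, 3, 4]
auc = vgc_auc(ref, new)
```

An AUC well above 0.5 would mean the new exposure factors improved perceived quality; well below 0.5, that quality was lost with the dose.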

  5. Development of environmentally conscious cleaning process for leadless chip carrier assemblies. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, B.E.

    1995-04-01

    A cross-functional team of process, product, quality, material, and design lab engineers was assembled to develop an environmentally friendly cleaning process for leadless chip carrier assemblies (LCCAs). Using flush and filter testing, Auger surface analysis, GC-mass spectrometry, production yield results, and electrical testing results over an extended testing period, the team developed an aqueous cleaning process for LCCAs. The aqueous process replaced the Freon vapor degreasing/ultrasonic rinse process.

  6. Expert system for web based collaborative CAE

    NASA Astrophysics Data System (ADS)

    Hou, Liang; Lin, Zusheng

    2006-11-01

    An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database, and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experience, theories, typical examples, and other related knowledge used in the pre-processing stage of FEA were categorized into analysis-process knowledge and object knowledge. An integrated knowledge model based on object-oriented and rule-based methods is described, and the integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning is presented. Finally, the analysis process of this expert system in a web-based CAE application is illustrated, and the analysis of a machine tool column is presented as an example to demonstrate the validity of the system.

  7. Operations research methods improve chemotherapy patient appointment scheduling.

    PubMed

    Santibáñez, Pablo; Aristizabal, Ruben; Puterman, Martin L; Chow, Vincent S; Huang, Wenhai; Kollmannsberger, Christian; Nordin, Travis; Runzer, Nancy; Tyldesley, Scott

    2012-12-01

    Clinical complexity, scheduling restrictions, and outdated manual booking processes resulted in frequent clerical rework, long waitlists for treatment, and late appointment notification for patients at a chemotherapy clinic in a large cancer center in British Columbia, Canada. A 17-month study was conducted to address booking, scheduling and workload issues and to develop, implement, and evaluate solutions. A review of scheduling practices included process observation and mapping, analysis of historical appointment data, creation of a new performance metric (final appointment notification lead time), and a baseline patient satisfaction survey. Process improvement involved discrete event simulation to evaluate alternative booking practice scenarios, development of an optimization-based scheduling tool to improve scheduling efficiency, and change management for implementation of process changes. Results were evaluated through analysis of appointment data, a follow-up patient survey, and staff surveys. Process review revealed a two-stage scheduling process. Long waitlists and late notification resulted from an inflexible first-stage process. The second-stage process was time consuming and tedious. After a revised, more flexible first-stage process and an automated second-stage process were implemented, the median percentage of appointments exceeding the final appointment notification lead time target of one week was reduced by 57% and median waitlist size decreased by 83%. Patient surveys confirmed increased satisfaction while staff feedback reported reduced stress levels. Significant operational improvements can be achieved through process redesign combined with operations research methods.
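    The performance metric introduced in the study, final appointment notification lead time, is simple to compute once notification and appointment dates are recorded; a sketch with invented dates against the one-week target described above:

```python
from datetime import date

# Hypothetical (appointment_date, final_notification_date) pairs
appts = [
    (date(2012, 3, 12), date(2012, 3, 1)),   # 11 days' notice
    (date(2012, 3, 14), date(2012, 3, 12)),  # 2 days' notice
    (date(2012, 3, 20), date(2012, 3, 12)),  # 8 days' notice
    (date(2012, 3, 21), date(2012, 3, 19)),  # 2 days' notice
]

TARGET_DAYS = 7  # one-week final-notification target
lead_times = [(a - n).days for a, n in appts]
# share of appointments whose final notification came later than the target
pct_exceeding = 100.0 * sum(lt < TARGET_DAYS for lt in lead_times) / len(lead_times)
```

Tracking this percentage before and after the process change is what supports the reported 57% reduction in the median figure.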

  8. What is the Final Verification of Engineering Requirements?

    NASA Technical Reports Server (NTRS)

    Poole, Eric

    2010-01-01

    This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon the formal requirements, including any changes to the original requirements document. After the requirements have been developed, the engineering team begins to design the system, and the final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods used are test, inspection, analysis, and demonstration. The plan for verification should be created once the system requirements are documented; it should ensure that every requirement is formally verified, specify the methods and the responsible organizations, and be reviewed by all parties. The option of having the engineering team involved in all phases of development, as opposed to having some other organization continue the process once the design is complete, is also discussed.

  9. Columbia River System Operation Review : Final Environmental Impact Statement, Appendix J: Recreation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Columbia River System Operation Review

    1995-11-01

    This Appendix J of the Final Environmental Impact Statement for the Columbia River System discusses impacts on the recreational activities in the region. Major sections include the following: scope and processes; recreation in the Columbia River Basin today - by type, location, participation, user characteristics, factors which affect usage, and managing agencies; recreation analysis procedures and methodology; and alternatives and their impacts.

  10. Design of Online Spheroidization Process for 1.0C-1.5Cr Bearing Steel and Microstructure Analysis

    NASA Astrophysics Data System (ADS)

    Li, Zhen-Xing; Li, Chang-Sheng; Ren, Jin-Yi; Li, Bin-Zhou; Suh, Dong-Woo

    2018-02-01

    Using a thermo-mechanical control process, the online spheroidization annealing process of 1.0C-1.5Cr bearing steel was designed. Apart from intercritical online spheroidization (IS), a novel subcritical online spheroidization (SS) process was proposed, characterized by water-cooling to around 773 K (500 °C) after the final rolling pass and then directly reheating to 973 K (700 °C) for isothermal holding. Compared with the results from the traditional offline spheroidization (TS) process, the size of the spheroidized carbides is similar in the TS and IS processes, whereas it is much smaller in the SS process. After spheroidization annealing, the microstructure evolution during austenitization and quenching treatment was examined. It is shown that the refinement of the spheroidized carbides accelerates the dissolution of carbides during the austenitizing process and decreases the size of undissolved carbides. In addition, the SS process can obtain finer prior austenite grains after quenching, which contributes to the enhancement of the final hardness.

  11. CAPACITY BUILDING PROCESS IN ENVIRONMENTAL AND HEALTH IMPACT ASSESSMENT FOR A THAI COMMUNITY.

    PubMed

    Chaithui, Suthat; Sithisarankul, Pornchai; Hengpraprom, Sarunya

    2017-03-01

    This research aimed at exploring the development of the capacity-building process in environmental and health impact assessment, including the consideration of subsequent capacity-building achievements. Data were gathered through questionnaires, participatory observations, in-depth interviews, focus group discussions, and capacity-building checklist forms, and were analyzed using content analysis, descriptive statistics, and inferential statistics. Our study used the components of the final draft of the capacity-building process, consisting of ten steps formulated by synthesis from each respective process. Additionally, capacity-building levels were evaluated using 10-item evaluation criteria for nine communities; the results indicated that the communities performed well under these criteria. Finally, exploration of the factors influencing capacity building in environmental and health impact assessment indicated that learning by community members through knowledge exchange via activities and study visits was the most influential factor in the capacity-building process. The final revised version of the capacity-building process in environmental and health impact assessment could serve as a basis for the consideration of interventions in similar areas, thereby increasing capacity in environmental and health impact assessment.

  12. State of the art in pathology business process analysis, modeling, design and optimization.

    PubMed

    Schrader, Thomas; Blobel, Bernd; García-Rojo, Marcial; Daniel, Christel; Słodkowska, Janina

    2012-01-01

    For analyzing current workflows and processes, for improving them, for quality management and quality assurance, for integrating hardware and software components, and also for education, training, and communication between experts from different domains, modeling business processes in a pathology department is indispensable. The authors highlight three main processes in pathology: general diagnostics, cytology diagnostics, and autopsy. In this chapter, those processes are formally modeled and described in detail. Finally, specialized processes such as immunohistochemistry and frozen section are considered.

  13. Final report of coordination and cooperation with the European Union on embankment failure analysis

    USDA-ARS?s Scientific Manuscript database

    There has been an emphasis in the European Union (EU) community on the investigation of extreme flood processes and the uncertainties related to these processes. Over a 3-year period, the EU and the U.S. dam safety community (1) coordinated their efforts and collected information needed to integrate...

  14. Inferential Judgments Affecting the Decision-Making Process in the Attorney General's Commission on Pornography.

    ERIC Educational Resources Information Center

    Gouran, Dennis S.

    Although the Attorney General's Commission on Pornography, also known as the Meese Commission, has been criticized excessively at times for threatening freedom of speech and press and individual rights to privacy, an analysis of its "Final Report" reveals numerous deficiencies in the Commission's decision-making process. These…

  15. Environmental Impact Analysis Process. Final Environmental Impact Statement. Part 2A. Proposed Central Radar System Over-the-Horizon Backscatter Radar Program

    DTIC Science & Technology

    1987-05-01

    processes or thermoregulation. Most investigations involving chronic exposures of mammals indicated either that no effects occurred or that reversible...radiofrequency radiation danger. Fish, reptiles, and amphibians - Few species and fisheries - Avoid streams and wetlands, when possible. BIRDS - The

  16. Integrating Information: An Analysis of the Processes Involved and the Products Generated in a Written Synthesis Task

    ERIC Educational Resources Information Center

    Sole, Isabel; Miras, Mariana; Castells, Nuria; Espino, Sandra; Minguela, Marta

    2013-01-01

    The case study reported here explores the processes involved in producing a written synthesis of three history texts and their possible relation to the characteristics of the texts produced and the degree of comprehension achieved following the task. The processes carried out by 10 final-year compulsory education students (15 and 16 years old) to…

  17. Collaborative Platform for DFM

    DTIC Science & Technology

    2007-12-20

    generation litho hotspot checkers have also been implemented in automated hotspot fixers that can automatically fix designs by making small changes...processing side (ex. new CMP models, etch models, litho models) and on the circuit side (ex. Process aware circuit analysis or yield optimization...Since final gate CD is a function of not only litho , but Post Exposure Bake, ashing, and etch, the processing module can be augmented with more

  18. Multiscale analysis of the correlation of processing parameters on viscidity of composites fabricated by automated fiber placement

    NASA Astrophysics Data System (ADS)

    Han, Zhenyu; Sun, Shouzheng; Fu, Yunzhong; Fu, Hongya

    2017-10-01

    Viscidity is an important physical indicator for assessing the fluidity of resin, which helps the resin contact the fibers effectively and reduces manufacturing defects during the automated fiber placement (AFP) process. However, the effect of processing parameters on viscidity evolution during the AFP process is rarely studied. In this paper, viscidities at different scales are analyzed based on a multi-scale analysis method. Firstly, the viscous dissipation energy (VDE) within a meso-unit under different processing parameters is assessed using the finite element method (FEM). According to a multi-scale energy transfer model, the meso-unit energy is used as the boundary condition for the microscopic analysis. Furthermore, the molecular structure of the micro-system is built by the molecular dynamics (MD) method, and viscosity curves are then obtained by integrating the stress autocorrelation function (SACF) over time. Finally, the correlation characteristics of the processing parameters with viscosity are revealed using the gray relational analysis method (GRAM). A set of processing parameters is identified that achieves stable viscosity and better fluidity of the resin.
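    The SACF-integration step can be sketched via the Green-Kubo relation, in which shear viscosity is proportional to the time integral of the stress autocorrelation function. This is an illustration only: an AR(1) series stands in for the MD stress output, and the V/(k_B T) prefactor is omitted:

```python
import numpy as np

# Synthetic, exponentially correlated surrogate for the off-diagonal stress P_xy
rng = np.random.default_rng(0)
dt, tau, n = 1e-3, 0.05, 20000
a = np.exp(-dt / tau)
noise = rng.standard_normal(n)
pxy = np.empty(n)
pxy[0] = noise[0]
for i in range(1, n):
    pxy[i] = a * pxy[i - 1] + np.sqrt(1.0 - a * a) * noise[i]

def sacf(x, maxlag):
    """Biased sample autocovariance out to maxlag."""
    x = x - x.mean()
    return np.array([np.mean(x[: len(x) - k] * x[k:]) for k in range(maxlag)])

acf = sacf(pxy, 500)
eta_reduced = np.sum(acf) * dt   # rectangle-rule integral of the SACF
```

With real MD output, `pxy` would be the sampled shear stress and `eta_reduced` would be multiplied by V/(k_B T) to give the viscosity.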

  19. Applying thematic analysis theory to practice: a researcher's experience.

    PubMed

    Tuckett, Anthony G

    2005-01-01

    This article describes an experience of thematic analysis. In order to answer the question 'What does analysis look like in practice?' it describes in brief how the methodology of grounded theory, the epistemology of social constructionism, and the theoretical stance of symbolic interactionism inform analysis. Additionally, analysis is examined by evidencing the systematic processes--here termed organising, coding, writing, theorising, and reading--that led the researcher to develop a final thematic schema.

  20. Spontaneous imbibition in fractal tortuous micro-nano pores considering dynamic contact angle and slip effect: phase portrait analysis and analytical solutions.

    PubMed

    Li, Caoxiong; Shen, Yinghao; Ge, Hongkui; Zhang, Yanjun; Liu, Tao

    2018-03-02

    Shales have abundant micro-nano pores. Meanwhile, a considerable amount of fracturing liquid is imbibed spontaneously in the hydraulic fracturing process. The spontaneous imbibition in tortuous micro-nano pores is special to shale, and dynamic contact angle and slippage are two important characteristics. In this work, we mainly investigate spontaneous imbibition considering dynamic contact angle and slip effect in fractal tortuous capillaries. We introduce phase portrait analysis to analyse the dynamic state and stability of imbibition. Moreover, analytical solutions to the imbibition equation are derived under special situations, and the solutions are verified by published data. Finally, we discuss the influences of slip length, dynamic contact angle and gravity on spontaneous imbibition. The analysis shows that phase portrait is an ideal tool for analysing spontaneous imbibition because it can evaluate the process without solving the complex governing ordinary differential equations. Moreover, dynamic contact angle and slip effect play an important role in fluid imbibition in fractal tortuous capillaries. Neglecting slip effect in micro-nano pores apparently underestimates imbibition capability, and ignoring variations in contact angle causes inaccuracy in predicting imbibition speed at the initial stage of the process. Finally, gravity is one of the factors that control the stabilisation of the imbibition process.
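    A minimal numerical sketch of a Lucas-Washburn-type imbibition equation with a Navier-slip correction (conductance factor 1 + 4b/R) illustrates the claim that neglecting slip underestimates imbibition. This is not the paper's fractal-tortuous model, and all values are hypothetical:

```python
import numpy as np

R, b = 50e-9, 20e-9                       # pore radius and slip length, m
gamma, theta, mu = 0.072, np.deg2rad(30.0), 1e-3
p_cap = 2.0 * gamma * np.cos(theta) / R   # capillary driving pressure, Pa

def imbibed_length(t_end, slip_gain, dt=1e-6, l0=1e-6):
    """Explicit Euler for dl/dt = slip_gain * R**2 * p_cap / (8 * mu * l)."""
    l = l0
    for _ in range(int(round(t_end / dt))):
        l += dt * slip_gain * R**2 * p_cap / (8.0 * mu * l)
    return l

l_slip = imbibed_length(1e-3, 1.0 + 4.0 * b / R)   # with slip correction
l_noslip = imbibed_length(1e-3, 1.0)               # slip neglected
```

With these numbers the slip correction multiplies the imbibed length by roughly sqrt(1 + 4b/R), about 1.6, which is the sense in which ignoring slip in micro-nano pores understates imbibition capability.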

  1. Analysis of Phenix end-of-life natural convection test with the MARS-LMR code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeong, H. Y.; Ha, K. S.; Lee, K. L.

    The end-of-life test of the Phenix reactor performed by the CEA provided an opportunity to obtain reliable and valuable test data for the validation and verification of an SFR system analysis code. KAERI joined this international program for the analysis of the Phenix end-of-life natural circulation test, coordinated by the IAEA, from 2008. The main objectives of this study were to evaluate the capability of the existing SFR system analysis code MARS-LMR and to identify any limitations of the code. The analysis was performed in three stages: pre-test analysis, blind post-test analysis, and final post-test analysis. In the pre-test analysis, the design conditions provided by the CEA were used to obtain a prediction of the test. The blind post-test analysis was based on the test conditions measured during the tests, but the test results were not provided by the CEA. The final post-test analysis was performed to predict the test results as accurately as possible by improving the previous modeling of the test. Based on the pre-test and blind test analyses, the modeling of heat structures in the hot pool and cold pool, steel structures in the core, heat loss from the roof and vessel, and the flow path at the core outlet was reinforced in the final analysis. The results of the final post-test analysis can be characterized in three phases. In the early phase, MARS-LMR simulated the heat-up process correctly owing to the enhanced heat structure modeling. In the mid phase, before the opening of the SG casing, the code successfully reproduced the decrease in core outlet temperature. Finally, in the later phase, the increase in heat removal from the opening of the SG casing was well predicted by the MARS-LMR code. (authors)

  2. An Approach Based on Social Network Analysis Applied to a Collaborative Learning Experience

    ERIC Educational Resources Information Center

    Claros, Iván; Cobos, Ruth; Collazos, César A.

    2016-01-01

    The Social Network Analysis (SNA) techniques allow modelling and analysing the interaction among individuals based on their attributes and relationships. This approach has been used by several researchers in order to measure the social processes in collaborative learning experiences. But oftentimes such measures were calculated at the final state…

  3. Computer analysis of arteriograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Armstrong, J. H.; Beckenbach, E. B.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.; Sanmarco, M. E.

    1977-01-01

    A computer system has been developed to quantify the degree of atherosclerosis in the human femoral artery. The analysis involves first scanning and digitizing angiographic film, then tracking the outline of the arterial image and finally computing the relative amount of roughness or irregularity in the vessel wall. The image processing system and method are described.

  4. Learning Principal Component Analysis by Using Data from Air Quality Networks

    ERIC Educational Resources Information Center

    Perez-Arribas, Luis Vicente; Leon-González, María Eugenia; Rosales-Conrado, Noelia

    2017-01-01

    With the final objective of using computational and chemometrics tools in the chemistry studies, this paper shows the methodology and interpretation of the Principal Component Analysis (PCA) using pollution data from different cities. This paper describes how students can obtain data on air quality and process such data for additional information…

  5. An overview of data acquisition, signal coding and data analysis techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Rastogi, P. K.

    1986-01-01

    An overview is given of the data acquisition, signal processing, and data analysis techniques that are currently in use with high power MST/ST (mesosphere stratosphere troposphere/stratosphere troposphere) radars. This review supplements the works of Rastogi (1983) and Farley (1984) presented at previous MAP workshops. A general description is given of data acquisition and signal processing operations, which are characterized on the basis of their disparate time scales. Signal coding is then discussed, with a brief description of frequently used codes and their limitations. Finally, several aspects of statistical data processing, such as signal statistics, power spectrum and autocovariance analysis, and outlier removal techniques, are discussed.
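    The power spectrum and autocovariance analyses mentioned above are linked by the Wiener-Khinchin theorem: the two are a Fourier pair. A small sketch on a synthetic Doppler-shifted signal in noise (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
n, fs = 1024, 1000.0                       # samples, sampling rate (Hz)
t = np.arange(n) / fs
f_doppler = 120.0                          # hypothetical Doppler shift
x = np.exp(2j * np.pi * f_doppler * t) + 0.5 * (rng.standard_normal(n)
                                                + 1j * rng.standard_normal(n))

# periodogram via FFT; the peak frequency estimates the Doppler shift
spec = np.abs(np.fft.fft(x)) ** 2 / n
freqs = np.fft.fftfreq(n, 1 / fs)
f_peak = freqs[np.argmax(spec)]

# autocovariance as the inverse FFT of the periodogram (Wiener-Khinchin);
# its zero lag equals the mean signal power (Parseval)
acov = np.fft.ifft(spec)
```

In practice MST radar processing estimates either side of this pair directly; the relation above is why spectral and autocovariance methods recover the same moments.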

  6. An image analysis of TLC patterns for quality control of saffron based on soil salinity effect: A strategy for data (pre)-processing.

    PubMed

    Sereshti, Hassan; Poursorkh, Zahra; Aliakbarzadeh, Ghazaleh; Zarre, Shahin; Ataolahi, Sahar

    2018-01-15

    Quality of saffron, a valuable food additive, could considerably affect consumers' health. In this work, a novel preprocessing strategy for image analysis of saffron thin layer chromatographic (TLC) patterns was introduced. This includes performing a series of image pre-processing techniques on TLC images, such as compression, inversion, elimination of the general baseline (using asymmetric least squares (AsLS)), removal of spot shift and concavity (by correlation optimization warping (COW)), and finally conversion to RGB chromatograms. Subsequently, unsupervised multivariate data analysis, including principal component analysis (PCA) and k-means clustering, was utilized to investigate the effect of soil salinity, as a cultivation parameter, on saffron TLC patterns. This method was used as a rapid and simple technique to obtain chemical fingerprints from saffron TLC images. Finally, the separated TLC spots were chemically identified using high-performance liquid chromatography-diode array detection (HPLC-DAD). Accordingly, saffron quality from different areas of Iran was evaluated and classified. Copyright © 2017 Elsevier Ltd. All rights reserved.
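    The AsLS baseline-elimination step can be sketched with Eilers-style asymmetric least squares, here on a synthetic one-channel chromatogram; the smoothness and asymmetry parameters (`lam`, `p`) are illustrative choices, not the paper's settings:

```python
import numpy as np

def asls_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: minimise
    sum w_i (y_i - z_i)^2 + lam * sum (second difference of z)^2,
    with asymmetric weights w_i = p where y_i > z_i, else 1 - p,
    so the baseline hugs the data from below and ignores peaks."""
    n = len(y)
    d2 = np.diff(np.eye(n), 2, axis=0)      # second-difference operator
    penalty = lam * d2.T @ d2
    w = np.ones(n)
    for _ in range(n_iter):
        z = np.linalg.solve(np.diag(w) + penalty, w * y)
        w = np.where(y > z, p, 1 - p)
    return z

# synthetic chromatogram: two Gaussian peaks on a sloping baseline
x = np.linspace(0.0, 1.0, 300)
baseline = 2.0 + 1.5 * x
signal = 5.0 * np.exp(-((x - 0.3) / 0.02) ** 2) + 3.0 * np.exp(-((x - 0.7) / 0.03) ** 2)
y = baseline + signal
corrected = y - asls_baseline(y)
```

In the paper's pipeline the same idea is applied to each RGB channel of the TLC image before warping and PCA.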

  7. Study for identification of beneficial Uses of Space (BUS). Volume 2: Technical report. Book 3: Development and business analysis of space processed tungsten for X-ray targets

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The development plans, analysis of required R and D and production resources, the costs of such resources, and finally, the potential profitability of a commercial space processing opportunity for containerless melting and resolidification of tungsten are discussed. The aim is to obtain a form of tungsten which, when fabricated into targets for X-ray tubes, provides at least a 50 percent increase in service life.

  8. [Failure mode effect analysis applied to preparation of intravenous cytostatics].

    PubMed

    Santos-Rubio, M D; Marín-Gil, R; Muñoz-de la Corte, R; Velázquez-López, M D; Gil-Navarro, M V; Bautista-Paloma, F J

    2016-01-01

    To proactively identify risks in the preparation of intravenous cytostatic drugs, and to prioritise and establish measures to improve safety procedures. Failure Mode Effect Analysis methodology was used. A multidisciplinary team identified potential failure modes of the procedure through a brainstorming session. The impact associated with each failure mode was assessed with the Risk Priority Number (RPN), which involves three variables: occurrence, severity, and detectability. Improvement measures were established for all identified failure modes, with those with RPN>100 considered critical. The final RPN (theoretical) that would result from the proposed measures was also calculated and the process was redesigned. A total of 34 failure modes were identified. The initial accumulated RPN was 3022 (range: 3-252), and after recommended actions the final RPN was 1292 (range: 3-189). RPN scores >100 were obtained in 13 failure modes; only the dispensing sub-process was free of critical points (RPN>100). A final reduction of RPN>50% was achieved in 9 failure modes. This prospective risk analysis methodology allows the weaknesses of the procedure to be prioritised, the use of resources to be optimized, and the safety of the preparation of cytostatic drugs to be substantially improved through the introduction of double checking and intermediate product labelling. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
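    The RPN arithmetic described in this record is simple enough to sketch: the RPN is the product of the occurrence, severity, and detectability scores, and modes scoring above 100 are treated as critical. The failure modes and score values below are hypothetical illustrations, not data from the study:

    ```python
    def rpn(occurrence, severity, detectability):
        """Risk Priority Number: the product of the three FMEA scores."""
        return occurrence * severity * detectability

    def critical_modes(failure_modes, threshold=100):
        """Return the failure modes whose RPN exceeds the critical threshold."""
        return [name for name, (o, s, d) in failure_modes.items()
                if rpn(o, s, d) > threshold]

    # Hypothetical failure modes with (occurrence, severity, detectability)
    # scores on 1-10 scales; not taken from the study.
    modes = {
        "wrong diluent volume": (4, 7, 5),   # RPN = 140, critical
        "label transposition":  (3, 8, 2),   # RPN = 48
    }
    ```

    Summing `rpn()` over all modes before and after the proposed measures reproduces the kind of accumulated-RPN comparison the abstract reports.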

  9. Effect of egg freshness on texture and baking characteristics of batter systems formulated using egg, flour and sugar.

    PubMed

    Xing, Liting; Niu, Fuge; Su, Yujie; Yang, Yanjun

    2016-04-01

    The aim of this work was to evaluate the effects of egg freshness on baking properties and final qualities in batter systems. Batters were made with eggs of different freshness, and the properties of batter systems were studied through rheological analysis, rapid viscosity analysis (RVA), differential scanning calorimetry (DSC), batter density and expansion rate during the baking and cooling processes. Moreover, the qualities of final baked systems were investigated, including specific volume and texture profile analysis (TPA). The flow behavior of batters showed that the consistency index (K) decreased as the Haugh unit (HU) value decreased, while the flow behavior index (n) increased. Both the storage modulus (G') and loss modulus (G″) determined by mechanical spectra at 20 °C decreased with decreasing HU. RVA and DSC determinations revealed that lower-HU samples had a lower viscosity in the baking process and a shorter time for starch gelatinization and egg protein denaturation. The batter density was observed to increase, which was reflected in a decrease in the specific volume of the final baked products. TPA showed significant differences in hardness and chewiness, but no significant differences in springiness and cohesiveness were found. The egg freshness affected the properties of batter systems. © 2015 Society of Chemical Industry.

  10. Improved protein hydrogen/deuterium exchange mass spectrometry platform with fully automated data processing.

    PubMed

    Zhang, Zhongqi; Zhang, Aming; Xiao, Gang

    2012-06-05

    Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to adjust the variations in the extent of back exchange from run to run. Second, a designed unique peptide (PPPI) with slow intrinsic HDX rate is employed as another internal standard to reflect the possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model to simulate the deuterium labeling and back exchange process. The HDX model is implemented into the in-house developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set starting from ion detection and peptide identification to final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most probable protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by exploring the conformational transition of a monoclonal antibody induced by increasing concentrations of guanidine.
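    The back-exchange correction this record describes relies on internal-standard peptides; the underlying idea can be sketched as scaling measured uptake by the fraction of label lost by a fully deuterated standard. The function names and the single-factor correction below are simplifications for illustration, not the comprehensive model implemented in MassAnalyzer:

    ```python
    def back_exchange_fraction(standard_measured, standard_theoretical):
        """Fraction of deuterium the fully labeled internal standard
        lost during digestion and analysis."""
        return 1.0 - standard_measured / standard_theoretical

    def correct_uptake(measured_uptake, bx_fraction):
        """Scale a peptide's measured uptake to compensate for the
        back exchange estimated from the internal standard (simplified)."""
        return measured_uptake / (1.0 - bx_fraction)
    ```

    With this per-run scaling, run-to-run variation in back exchange is folded into the standard rather than into the sample peptides, which is the stated purpose of the first internal-standard correction.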

  11. Estimating Cumulative Traffic Loads, Final Report for Phase 1

    DOT National Transportation Integrated Search

    2000-07-01

    The knowledge of traffic loads is a prerequisite for the pavement analysis process, especially for the development of load-related distress prediction models. Furthermore, the emerging mechanistically based pavement performance models and pavement de...

  12. Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations

    NASA Technical Reports Server (NTRS)

    Chanchio, Kasidit; Sun, Xian-He

    1996-01-01

    This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. New concepts of migration point as well as migration point analysis and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations to insert migration points; whereas, necessary data analysis provides a minimum set of variables to be transferred at each migration point. A new methodology to perform reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.

  13. Analysis Of The IJCNN 2011 UTL Challenge

    DTIC Science & Technology

    2012-01-13

    large datasets from various application domains: handwriting recognition, image recognition, video processing, text processing, and ecology. The goal...validation and final evaluation sets consist of 4096 examples each. (Table fragment: Dataset, Domain, Features, Sparsity, Devel., Transf.; e.g. AVICENNA, Handwriting, 120, 0%, 150205)...documents [3]. Transfer learning methods could accelerate the application of handwriting recognizers to historical manuscripts by reducing the need for

  14. Overcoming pitfalls: Results from a mandatory peer review process for written examinations.

    PubMed

    Wilby, Kyle John; El Hajj, Maguy S; El-Bashir, Marwa; Mraiche, Fatima

    2018-04-01

    Written assessments are essential components of higher education practices. However, faculty members encounter common pitfalls when designing questions intended to evaluate student-learning outcomes. The objective of this project was to determine the impact of a mandatory examination peer review process on question accuracy, alignment with learning objectives, use of best practices in question design, and language/grammar. A mandatory peer review process was implemented for all midterm (before phase) and final (after phase) examinations. Peer review occurred by two reviewers and followed a pre-defined guidance document. Non-punitive feedback given to faculty members served as the intervention. Frequencies of flagged questions according to guidance categories were compared between phases. A total of 21 midterm and 21 final exam reviews were included in the analysis. A total of 637 questions were reviewed across all midterms and 1003 questions were reviewed across all finals. Few questions were flagged for accuracy and alignment with learning outcomes. The median total proportion of questions flagged for best practices was significantly lower for final exams versus midterm exams (15.8% for midterms vs. 6.45% for finals, p = 0.014). The intervention did not influence language and grammar errors (9.68 vs. 10.0% of questions flagged before and after, respectively, p = 0.305). A non-punitive peer review process for written examinations can overcome pitfalls in exam creation and improve best practices in question writing. The peer-review process had a substantial effect in flagging language/grammar errors, but the error rate did not differ between midterm and final exams. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Monitoring of an antigen manufacturing process.

    PubMed

    Zavatti, Vanessa; Budman, Hector; Legge, Raymond; Tamer, Melih

    2016-06-01

    Fluorescence spectroscopy in combination with multivariate statistical methods was employed as a tool for monitoring the manufacturing process of pertactin (PRN), one of the virulence factors of Bordetella pertussis utilized in whooping cough vaccines. Fluorophores such as amino acids and co-enzymes were detected throughout the process. The fluorescence data collected at different stages of the fermentation and purification process were treated employing principal component analysis (PCA). Through PCA, it was feasible to identify sources of variability in PRN production. Then, partial least squares (PLS) was employed to correlate the fluorescence spectra obtained from pure PRN samples and the final protein content measured by a Kjeldahl test from these samples. Given that a statistically significant correlation was found between fluorescence and PRN levels, this approach could be further used as a method to predict the final protein content.

  16. Evaluation of the effect of post-translational modification toward protein structure: Chemical synthesis of glycosyl crambins having either a high mannose-type or a complex-type oligosaccharide.

    PubMed

    Dedola, Simone; Izumi, Masayuki; Makimura, Yutaka; Ito, Yukishige; Kajihara, Yasuhiro

    2016-11-04

    Glycoproteins are assembled and folded in the endoplasmic reticulum (ER) and transported to the Golgi for further processing of their oligosaccharides. During these processes, two types of oligosaccharides are used: that is, high mannose-type oligosaccharide in the ER and complex-type oligosaccharide in the Golgi. We were interested to know how two different types of oligosaccharides could influence the folding pathway or the final three-dimensional structure of the glycoproteins. For this purpose, we synthesized a new glycosyl crambin having complex-type oligosaccharide and evaluated the folding process, the final protein structure analyzed by NMR, and compared the CD spectra with previously synthesized glycosyl crambin bearing high mannose-type oligosaccharides. From our analysis, we found that the two different oligosaccharides do not influence the folding pathway in vitro and the final structure of the small glycoproteins. © 2015 Wiley Periodicals, Inc. Biopolymers (Pept Sci) 106: 446-452, 2016. © 2015 Wiley Periodicals, Inc.

  17. Taguchi Method Applied in Optimization of Shipley SJR 5740 Positive Resist Deposition

    NASA Technical Reports Server (NTRS)

    Hui, A.; Blosiu, J. O.; Wiberg, D. V.

    1998-01-01

    Taguchi Methods of Robust Design presents a way to optimize output process performance through an organized set of experiments by using orthogonal arrays. Analysis of variance and signal-to-noise ratios are used to evaluate the contribution of each of the process controllable parameters in the realization of the process optimization. In the photoresist deposition process, there are numerous controllable parameters that can affect the surface quality and thickness of the final photoresist layer.
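    The signal-to-noise ratios referred to in this record have standard Taguchi definitions. A short sketch of the two most common forms follows; these are the textbook formulas, not values or code from the paper, and the response examples in the comments are hypothetical:

    ```python
    import math

    def sn_larger_the_better(values):
        """Taguchi S/N = -10 log10(mean(1/y^2)), for responses to be
        maximized (e.g., a quality metric of the resist layer)."""
        return -10.0 * math.log10(sum(1.0 / y**2 for y in values) / len(values))

    def sn_smaller_the_better(values):
        """Taguchi S/N = -10 log10(mean(y^2)), for responses to be
        minimized (e.g., deviation from target resist thickness)."""
        return -10.0 * math.log10(sum(y**2 for y in values) / len(values))
    ```

    For each row of the orthogonal array, the replicated measurements are collapsed into one S/N value, and the parameter levels that maximize S/N are selected as the robust setting.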

  18. Social network supported process recommender system.

    PubMed

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling as a means of assisting process modeling. However, most of the existing technologies rely only on process structure analysis and do not take the social features of processes into account, even though process modeling is complex and comprehensive in most situations. This paper studies the feasibility of applying social network research technologies to process recommendation and builds a social network system of processes based on their feature similarities. Then, three process matching degree measurements are presented and the system implementation is discussed subsequently. Finally, experimental evaluations are presented and future work is discussed.

  19. Collaboration processes and perceived effectiveness of integrated care projects in primary care: a longitudinal mixed-methods study.

    PubMed

    Valentijn, Pim P; Ruwaard, Dirk; Vrijhoef, Hubertus J M; de Bont, Antoinette; Arends, Rosa Y; Bruijnzeels, Marc A

    2015-10-09

    Collaborative partnerships are considered an essential strategy for integrating local disjointed health and social services. Currently, little evidence is available on how integrated care arrangements between professionals and organisations are achieved through the evolution of collaboration processes over time. The first aim was to develop a typology of integrated care projects (ICPs) based on the final degree of integration as perceived by multiple stakeholders. The second aim was to study how types of integration differ in changes of collaboration processes over time and final perceived effectiveness. A longitudinal mixed-methods study design based on two data sources (surveys and interviews) was used to identify the perceived degree of integration and patterns in collaboration among 42 ICPs in primary care in The Netherlands. We used cluster analysis to identify distinct subgroups of ICPs based on the final perceived degree of integration from a professional, organisational and system perspective. With the use of ANOVAs, the subgroups were contrasted based on: 1) changes in collaboration processes over time (shared ambition, interests and mutual gains, relationship dynamics, organisational dynamics and process management) and 2) final perceived effectiveness (i.e. rated success) at the professional, organisational and system levels. The ICPs were classified into three subgroups: 'United Integration Perspectives (UIP)', 'Disunited Integration Perspectives (DIP)' and 'Professional-oriented Integration Perspectives (PIP)'. ICPs within the UIP subgroup made the strongest increase in trust-based (mutual gains and relationship dynamics) as well as control-based (organisational dynamics and process management) collaboration processes and had the highest overall effectiveness rates. On the other hand, ICPs within the DIP subgroup decreased on collaboration processes and had the lowest overall effectiveness rates. 
ICPs within the PIP subgroup increased in control-based collaboration processes (organisational dynamics and process management) and had the highest effectiveness rates at the professional level. The differences across the three subgroups in terms of the development of collaboration processes and the final perceived effectiveness provide evidence that united stakeholders' perspectives are achieved through a constructive collaboration process over time. Disunited perspectives at the professional, organisation and system levels can be aligned by both trust-based and control-based collaboration processes.
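    The subgroup contrasts in this record use one-way ANOVAs. The F statistic they rest on (between-group variance over within-group variance) can be sketched in a few lines; the grouped effectiveness ratings below are hypothetical, not data from the study:

    ```python
    def one_way_anova_f(groups):
        """One-way ANOVA F statistic for a list of groups of ratings:
        mean square between groups divided by mean square within groups."""
        all_vals = [v for g in groups for v in g]
        grand = sum(all_vals) / len(all_vals)
        k, n = len(groups), len(all_vals)
        # Between-group sum of squares: group sizes times squared
        # deviations of group means from the grand mean.
        ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
        # Within-group sum of squares: deviations from each group's own mean.
        ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
        return (ss_between / (k - 1)) / (ss_within / (n - k))
    ```

    A large F indicates that the subgroups (here, the UIP/DIP/PIP clusters) differ more between themselves than the ratings vary within each subgroup.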

  20. Metacognition and evidence analysis instruction: an educational framework and practical experience.

    PubMed

    Parrott, J Scott; Rubinstein, Matthew L

    2015-08-21

    The role of metacognitive skills in the evidence analysis process has received little attention in the research literature. While the steps of the evidence analysis process are well defined, the role of higher-level cognitive operations (metacognitive strategies) in integrating the steps of the process is not well understood. In part, this is because it is not clear where and how metacognition is implicated in the evidence analysis process nor how these skills might be taught. The purposes of this paper are to (a) suggest a model for identifying critical thinking and metacognitive skills in evidence analysis instruction grounded in current educational theory and research and (b) demonstrate how freely available systematic review/meta-analysis tools can be used to focus on higher-order metacognitive skills, while providing a framework for addressing common student weaknesses. The final goal of this paper is to provide an instructional framework that can generate critique and elaboration while providing the conceptual basis and rationale for future research agendas on this topic.

  1. Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge Based Image Analysis

    DTIC Science & Technology

    1989-08-01

    Automatic Line Network Extraction from Aerial Imagery of Urban Areas through Knowledge Based Image Analysis. Final Technical Report, December... Keywords: pattern recognition, blackboard oriented symbolic processing, knowledge based image analysis, image understanding, aerial imagery, urban area.

  2. Plutonium Finishing Plant (PFP) Final Safety Analysis Report (FSAR) [SEC 1 THRU 11

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ULLAH, M K

    2001-02-26

    The Plutonium Finishing Plant (PFP) is located on the US Department of Energy (DOE) Hanford Site in south central Washington State. The DOE Richland Operations (DOE-RL) Project Hanford Management Contract (PHMC) is with Fluor Hanford Inc. (FH). Westinghouse Safety Management Systems (WSMS) provides management support to the PFP facility. Since 1991, the mission of the PFP has changed from plutonium material processing to preparation for decontamination and decommissioning (D and D). The PFP is in transition between its previous mission and the proposed D and D mission. The objective of the transition is to place the facility into a stable state for long-term storage of plutonium materials before final disposition of the facility. Accordingly, this update of the Final Safety Analysis Report (FSAR) reflects the current status of the buildings, equipment, and operations during this transition. The primary product of the PFP was plutonium metal in the form of 2.2-kg, cylindrical ingots called buttons. Plutonium nitrate was one of several chemical compounds containing plutonium that were produced as an intermediate processing product. Plutonium recovery was performed at the Plutonium Reclamation Facility (PRF) and plutonium conversion (from a nitrate form to a metal form) was performed at the Remote Mechanical C (RMC) Line as the primary processes. Plutonium oxide was also produced at the Remote Mechanical A (RMA) Line. Plutonium processed at the PFP contained both weapons-grade and fuels-grade plutonium materials. The capability existed to process both weapons-grade and fuels-grade material through the PRF and only weapons-grade material through the RMC Line, although fuels-grade material was processed through the line before 1984. Amounts of these materials exist in storage throughout the facility in various residual forms left from previous years of operations.

  3. Water research program final report, March 15, 1970 to October 31, 1972. Separations processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minturn, R. E.

    A summary article on separation by filtration is reprinted, and research is reported in the following areas: dynamic membranes, cast film membranes, concentration polarization, economic analysis, and enhanced heat transfer. (DHM)

  4. Distress modeling for DARWin-ME : final report.

    DOT National Transportation Integrated Search

    2013-12-01

    Distress prediction models, or transfer functions, are key components of the Pavement M-E Design and relevant analysis. The accuracy of such models depends on a successful process of calibration and subsequent validation of model coefficients in the ...

  5. Orion Post-Landing Crew Thermal Control Modeling and Analysis Results

    NASA Technical Reports Server (NTRS)

    Cross, Cynthia D.; Bue, Grant; Rains, George E.

    2009-01-01

    In a vehicle constrained by mass and power, it is necessary to ensure that the health and well-being of the crew are not compromised while hardware mass and power are being reduced in the design process. To that end, it is necessary to ensure that in the final phase of flight, recovery, the crew core body temperature remains below the cognitive deficit limit set by the Constellation program. This paper will describe the models used to calculate the thermal environment of the spacecraft after splashdown as well as the human thermal model used to calculate core body temperature. Then the results of these models will be examined to understand the key drivers for core body temperature. Finally, the analysis results will be used to show that additional cooling capability must be added to the vehicle to ensure crew member health post landing.

  6. Solvent removal and spore inactivation directly in dispensing vials with supercritical carbon dioxide and sterilant.

    PubMed

    Howell, Jahna; Niu, Fengui; McCabe, Shannon E; Zhou, Wei; Decedue, Charles J

    2012-06-01

    A process is described using supercritical carbon dioxide to extract organic solvents from drug solutions contained in 30-mL serum vials. We report drying times of less than 1 h with quantitative recovery of sterile drug. A six-log reduction of three spore types used as biological indicators is achieved with direct addition of peracetic acid to a final concentration of approximately 5 mM (~0.04 %) to the drug solution in the vial. Analysis of two drugs, acetaminophen and paclitaxel, indicated no drug degradation as a result of the treatment. Furthermore, analysis of the processed drug substance showed that no residual peracetic acid could be detected in the final product. We have demonstrated an effective means to simultaneously dry and sterilize active pharmaceutical ingredients from organic solvents directly in a dispensing container.

  7. Devising and Implementing a Business Proposal Module: Constraints and Compromises

    ERIC Educational Resources Information Center

    Flowerdew, Lynne

    2010-01-01

    This article describes the design and implementation of a business proposal module for final-year science students at a tertiary institution in Hong Kong. It is argued that in the needs analysis process, the present situation analysis (PSA), that is, personal information about the learners and factors which may affect their learning, is just as if…

  8. Development of a mobile system based on laser-induced breakdown spectroscopy and dedicated to in situ analysis of polluted soils

    NASA Astrophysics Data System (ADS)

    Bousquet, B.; Travaillé, G.; Ismaël, A.; Canioni, L.; Michel-Le Pierrès, K.; Brasseur, E.; Roy, S.; le Hecho, I.; Larregieu, M.; Tellier, S.; Potin-Gautier, M.; Boriachon, T.; Wazen, P.; Diard, A.; Belbèze, S.

    2008-10-01

    Principal Components Analysis (PCA) is successfully applied to the full laser-induced breakdown spectroscopy (LIBS) spectra of soil samples, defining classes according to the concentrations of the major elements. The large variability of the LIBS data is related to the heterogeneity of the samples, and the representativeness of the data is discussed. Then, the development of a mobile LIBS system dedicated to the in-situ analysis of soils polluted by heavy metals is described. Based on the use of ten-meter long optical fibers, the mobile system allows remote measurements. Finally, a laser-assisted drying process, studied with a customized laser, was not retained as a means of overcoming the problem of moisture.
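    The PCA step this record applies to full LIBS spectra can be illustrated by extracting the leading principal component. The sketch below uses power iteration on the sample covariance matrix and assumes a small, dense data matrix; a real spectral analysis would use a linear-algebra library, and the data here are hypothetical:

    ```python
    def first_principal_component(data, iters=200):
        """Leading PCA direction of a samples-by-features matrix,
        found by power iteration on the covariance matrix."""
        n, d = len(data), len(data[0])
        # Center each feature (e.g., each spectral channel).
        means = [sum(row[j] for row in data) / n for j in range(d)]
        X = [[row[j] - means[j] for j in range(d)] for row in data]
        # Sample covariance matrix (d x d).
        cov = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
                for b in range(d)] for a in range(d)]
        # Power iteration converges to the dominant eigenvector.
        v = [1.0] * d
        for _ in range(iters):
            w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
            norm = sum(x * x for x in w) ** 0.5
            v = [x / norm for x in w]
        return v
    ```

    Projecting each spectrum onto the leading components yields the low-dimensional scores on which class structure (here, by major-element concentration) becomes visible.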

  9. Analysis and modeling of wafer-level process variability in 28 nm FD-SOI using split C-V measurements

    NASA Astrophysics Data System (ADS)

    Pradeep, Krishna; Poiroux, Thierry; Scheer, Patrick; Juge, André; Gouget, Gilles; Ghibaudo, Gérard

    2018-07-01

    This work details the analysis of wafer-level global process variability in 28 nm FD-SOI using split C-V measurements. The proposed approach initially evaluates the native on-wafer process variability using efficient extraction methods on split C-V measurements. The on-wafer threshold voltage (VT) variability is first studied and modeled using a simple analytical model. Then, a statistical model based on the Leti-UTSOI compact model is proposed to describe the total C-V variability in different bias conditions. This statistical model is finally used to study the contribution of each process parameter to the total C-V variability.

  10. Current Status of the LOFAR EoR Key Science Project

    NASA Astrophysics Data System (ADS)

    Koopmans, L. V. E.; LOFAR EoR KSP Team

    2018-05-01

    A short status update on the LOFAR Epoch of Reionization (EoR) Key Science Project (KSP) is given, regarding data acquisition, data processing and analysis, and current power-spectrum limits on the redshifted 21-cm signal of neutral hydrogen at redshifts z = 8 - 10. With caution, we present a preliminary astrophysical analysis of ~60 hr of processed LOFAR data and their resulting power spectrum, showing that interesting limits on X-ray heating during the Cosmic Dawn can potentially already be obtained. This is by no means the final analysis of this sub-set of data, but it illustrates the future potential when the nearly 3000 hr of data in hand on two EoR windows have been processed.

  11. Physics and control of wall turbulence for drag reduction.

    PubMed

    Kim, John

    2011-04-13

    Turbulence physics responsible for high skin-friction drag in turbulent boundary layers is first reviewed. A self-sustaining process of near-wall turbulence structures is then discussed from the perspective of controlling this process for the purpose of skin-friction drag reduction. After recognizing that key parts of this self-sustaining process are linear, a linear systems approach to boundary-layer control is discussed. It is shown that singular-value decomposition analysis of the linear system allows us to examine different approaches to boundary-layer control without carrying out the expensive nonlinear simulations. Results from the linear analysis are consistent with those observed in full nonlinear simulations, thus demonstrating the validity of the linear analysis. Finally, fundamental performance limit expected of optimal control input is discussed.

  12. De novo transcriptome sequencing and customized abscission zone-specific microarray as a new molecular tool for analysis of tomato organ abscission

    USDA-ARS?s Scientific Manuscript database

    Abscission, which is the process of organ separation, is a highly regulated process occurring as a final stage of organ development. In the tomato (Solanum lycopersicum) system, flower and leaf abscission was induced by flower removal or leaf deblading, leading to auxin depletion which results in in...

  13. Assessment of the Assessment Tool: Analysis of Items in a Non-MCQ Mathematics Exam

    ERIC Educational Resources Information Center

    Khoshaim, Heba Bakr; Rashid, Saima

    2016-01-01

    Assessment is one of the vital steps in the teaching and learning process. The reported action research examines the effectiveness of an assessment process and inspects the validity of exam questions used for the assessment purpose. The instructors of a college-level mathematics course studied questions used in the final exams during the academic…

  14. Hybrid life-cycle assessment of natural gas based fuel chains for transportation.

    PubMed

    Strømman, Anders Hammer; Solli, Christian; Hertwich, Edgar G

    2006-04-15

    This research compares the use of natural gas, methanol, and hydrogen as transportation fuels. These three fuel chains start with the extraction and processing of natural gas in the Norwegian North Sea and end with final use in Central Europe. The end use is passenger transportation with a sub-compact car that has an internal combustion engine for the natural gas case and a fuel cell for the methanol and hydrogen cases. The life cycle assessment is performed by combining a process based life-cycle inventory with economic input-output data. The analysis shows that the potential climate impacts are lowest for the hydrogen fuel scenario with CO2 deposition. The hydrogen fuel chain scenario has no significant environmental disadvantage compared to the other fuel chains. Detailed analysis shows that the construction of the car contributes significantly to most impact categories. Finally, it is shown how the application of a hybrid inventory model ensures a more complete inventory description compared to standard process-based life-cycle assessment. This is particularly significant for car construction which would have been significantly underestimated in this study using standard process life-cycle assessment alone.

  15. Deficient symbol processing in Alzheimer disease.

    PubMed

    Toepper, Max; Steuwe, Carolin; Beblo, Thomas; Bauer, Eva; Boedeker, Sebastian; Thomas, Christine; Markowitsch, Hans J; Driessen, Martin; Sammer, Gebhard

    2014-01-01

    Symbols and signs have been suggested to improve the orientation of patients suffering from Alzheimer disease (AD). However, there are hardly any studies that confirm whether AD patients benefit from signs or symbols and which symbol characteristics might improve or impede their symbol comprehension. To address these issues, 30 AD patients and 30 matched healthy controls performed a symbol processing task (SPT) with 4 different item categories. A repeated-measures analysis of variance was run to identify impact of different item categories on performance accuracy in both the experimental groups. Moreover, SPT scores were correlated with neuropsychological test scores in a broad range of other cognitive domains. Finally, diagnostic accuracy of the SPT was calculated by a receiver-operating characteristic curve analysis. Results revealed a global symbol processing dysfunction in AD that was associated with semantic memory and executive deficits. Moreover, AD patients showed a disproportional performance decline at SPT items with visual distraction. Finally, the SPT total score showed high sensitivity and specificity in differentiating between AD patients and healthy controls. The present findings suggest that specific symbol features impede symbol processing in AD and argue for a diagnostic benefit of the SPT in neuropsychological assessment.
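    The diagnostic accuracy reported in this record comes from a receiver-operating characteristic (ROC) analysis, whose area under the curve admits a simple rank-based (Mann-Whitney) formulation: the probability that a randomly chosen positive case scores higher than a randomly chosen negative case, with ties counted half. A sketch with hypothetical scores follows:

    ```python
    def roc_auc(pos_scores, neg_scores):
        """AUC via the Mann-Whitney formulation: P(pos > neg),
        with tied scores counted as half a win."""
        wins = 0.0
        for p in pos_scores:
            for n in neg_scores:
                if p > n:
                    wins += 1.0
                elif p == n:
                    wins += 0.5
        return wins / (len(pos_scores) * len(neg_scores))
    ```

    An AUC near 1.0 corresponds to the high sensitivity and specificity the abstract reports for separating AD patients from healthy controls on the SPT total score.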

  16. Description of the supporting factors of final project in Mathematics and Natural Sciences Faculty of Syiah Kuala University with multiple correspondence analysis

    NASA Astrophysics Data System (ADS)

    Rusyana, Asep; Nurhasanah; Maulizasari

    2018-05-01

    Syiah Kuala University (Unsyiah) aims to produce graduates who are qualified for employment or for creating their own fields of work. The implementation of the final project course must therefore be effective. This research uses data from an evaluation conducted by the Mathematics and Natural Sciences Faculty (FMIPA) of Unsyiah. Factors that support completion of the final project include duration, guidance, final project seminars, facilities, public impact, and quality. This research aims to identify the factors related to completion of the final project and to identify similarities among variables. The factors that support completion of the final project in every study program in FMIPA are (1) duration, (2) guidance, and (3) facilities. The correlations among these factors are examined with chi-square tests, and the variables are then analyzed with multiple correspondence analysis. Based on the correspondence plot, the guidance activities and facilities of the Informatics Study Program fall into the fair category, while those of Chemistry fall into the best category. In addition, students in Physics finish the final project in the shortest time, while students in Pharmacy take the longest.
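
    The chi-square screening step described above can be sketched with a plain Pearson test of independence; the contingency counts below (completion duration versus guidance quality) are hypothetical, not FMIPA's data.

```python
def chi_square_independence(table):
    """Pearson chi-square statistic and degrees of freedom
    for an r x c contingency table of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return stat, dof

# Hypothetical counts: rows = duration (fast/slow), cols = guidance (good/fair)
table = [[30, 10],
         [15, 25]]
stat, dof = chi_square_independence(table)
```

    The statistic is then compared against the chi-square critical value for the computed degrees of freedom; variables that pass are carried into the correspondence analysis.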

  17. Summary and recommendations. [reduced gravitational effects on materials manufactured in space

    NASA Technical Reports Server (NTRS)

    1975-01-01

    An economic analysis using econometric and cost benefit analysis techniques was performed to determine the feasibility of space processing of certain products. The overall objectives of the analysis were (1) to determine specific products or processes uniquely connected with space manufacturing, (2) to select a specific product or process from each of the areas of semiconductors, metals, and biochemicals, and (3) to determine the overall price/cost structure of each product or process considered. The economic elements of the analysis involved a generalized decision making format for analyzing space manufacturing, a comparative cost study of the selected processes in space vs. earth manufacturing, and a supply and demand study of the economic relationships of one of the manufacturing processes. Space processing concepts were explored. The first involved the use of the shuttle as the factory with all operations performed during individual flights. The second concept involved a permanent unmanned space factory which would be launched separately. The shuttle in this case would be used only for maintenance and refurbishment. Finally, some consideration was given to a permanent manned space factory.

  18. Thermal Cracking to Improve the Qualification of the Waxes

    NASA Astrophysics Data System (ADS)

    He, B.; Agblevor, F. A.; Chen, C. G.; Feng, J.

    2018-05-01

    Thermal cracking of waxes under mild conditions (430-500°C) has been reconsidered as a possible refining technology for the production of fuels and chemicals. In this study, moderate thermal cracking was investigated as a means of processing Uinta Basin soft waxes to achieve the pour point required for them to be pumped to refineries. The best thermal cracking conditions were found to be 420°C and 20 minutes. The viscosity and density of the final liquid product at 40°C were 2.63 mPa·s and 0.784 g/cm3, respectively. FT-IR analysis of the liquid product indicated that unsaturated hydrocarbons were produced by the thermal cracking, which was corroborated by the 13C NMR spectrum. GC analysis of the final gas product indicated that hydrogen was produced; the dehydrogenation reaction was also confirmed by the elemental analysis and HHV results. The pour point of the final liquid product met the requirement.

  19. TESS Data Processing and Quick-look Pipeline

    NASA Astrophysics Data System (ADS)

    Fausnaugh, Michael; Huang, Xu; Glidden, Ana; Guerrero, Natalia; TESS Science Office

    2018-01-01

    We describe the data analysis procedures and pipelines for the Transiting Exoplanet Survey Satellite (TESS). We briefly review the processing pipeline developed and implemented by the Science Processing Operations Center (SPOC) at NASA Ames, including pixel/full-frame image calibration, photometric analysis, pre-search data conditioning, transiting planet search, and data validation. We also describe data-quality diagnostic analyses and photometric performance assessment tests. Finally, we detail a "quick-look pipeline" (QLP) that has been developed by the MIT branch of the TESS Science Office (TSO) to provide a fast and adaptable routine to search for planet candidates in the 30 minute full-frame images.

  20. Retinal imaging analysis based on vessel detection.

    PubMed

    Jamal, Arshad; Hazim Alkawaz, Mohammed; Rehman, Amjad; Saba, Tanzila

    2017-07-01

    With advances in digital imaging and computing power, computationally intelligent technologies are in high demand in ophthalmic care and treatment. In the current research, a Retina Image Analysis (RIA) system was developed for optometrists at the Eye Care Center of Management and Science University. This research aims to analyze the retina through vessel detection. The RIA assists in the analysis of retinal images, and specialists are offered various options such as saving, processing, and analyzing retinal images through its advanced interface layout. Additionally, RIA assists in the selection of vessel segments, processing these vessels by calculating their diameter, standard deviation, and length, and displaying the detected vessels on the retina. The Agile Unified Process was adopted as the development methodology for this research. The Retina Image Analysis procedure was implemented using MATLAB (R2011b). Promising results are attained that are comparable with the state of the art. To conclude, Retina Image Analysis may help optometrists gain a better understanding when analyzing a patient's retina. © 2017 Wiley Periodicals, Inc.

  1. Research on the Intensity Analysis and Result Visualization of Construction Land in Urban Planning

    NASA Astrophysics Data System (ADS)

    Cui, J.; Dong, B.; Li, J.; Li, L.

    2017-09-01

    As a fundamental task of urban planning, the intensity analysis of construction land involves much repetitive data processing that is prone to errors or loss of data precision, and current urban planning lacks efficient methods and tools for visualizing the analysis results. In this research, a portable tool was developed using the Model Builder technique embedded in ArcGIS to provide automatic data processing and rapid result visualization for this work. A series of basic modules provided by ArcGIS are linked together to form a complete data processing chain in the tool. Once the required data are imported, the analysis results and the related maps and graphs, including the intensity values, the zoning map, and the skyline analysis map, are produced automatically. Finally, the tool is installation-free and can be distributed quickly between planning teams.

  2. Regression Analysis and Calibration Recommendations for the Characterization of Balance Temperature Effects

    NASA Technical Reports Server (NTRS)

    Ulbrich, N.; Volden, T.

    2018-01-01

    Analysis and use of temperature-dependent wind tunnel strain-gage balance calibration data are discussed in the paper. First, three different methods are presented and compared that may be used to process temperature-dependent strain-gage balance data. The first method uses an extended set of independent variables in order to process the data and predict balance loads. The second method applies an extended load iteration equation during the analysis of balance calibration data. The third method uses temperature-dependent sensitivities for the data analysis. Physical interpretations of the most important temperature-dependent regression model terms are provided that relate temperature compensation imperfections and the temperature-dependent nature of the gage factor to sets of regression model terms. Finally, balance calibration recommendations are listed so that temperature-dependent calibration data can be obtained and successfully processed using the reviewed analysis methods.

  3. Environmental Impact Analysis Process. Volume 3. Preliminary Final Environmental Impact Statement Construction and Operation of Space Launch Complex 7

    DTIC Science & Technology

    1989-10-23


  4. Nonstationary Dynamics Data Analysis with Wavelet-SVD Filtering

    NASA Technical Reports Server (NTRS)

    Brenner, Marty; Groutage, Dale; Bessette, Denis (Technical Monitor)

    2001-01-01

    Nonstationary time-frequency analysis is used for identification and classification of aeroelastic and aeroservoelastic dynamics. Time-frequency multiscale wavelet processing generates discrete energy density distributions. The distributions are processed using the singular value decomposition (SVD). Discrete density functions derived from the SVD generate moments that detect the principal features in the data. The SVD standard basis vectors are applied and then compared with a transformed-SVD, or TSVD, which reduces the number of features into more compact energy density concentrations. Finally, from the feature extraction, wavelet-based modal parameter estimation is applied.
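
    A minimal sketch of the SVD stage, assuming numpy and using a random matrix in place of a real wavelet energy density distribution: the normalized squared singular values form a discrete density whose moments summarize the principal features.

```python
import numpy as np

# Hypothetical time-frequency energy density matrix (rows: scales, cols: time).
rng = np.random.default_rng(0)
E = rng.random((16, 128))

# SVD factors the distribution; the normalized squared singular values
# act as a discrete density over the principal features of the data.
U, s, Vt = np.linalg.svd(E, full_matrices=False)
density = s**2 / np.sum(s**2)

# First moment of the density indicates how concentrated the energy is
# in the leading features (close to 1.0 means one dominant feature).
mean_index = float(np.sum(density * np.arange(1, len(density) + 1)))
```

    A transformed SVD (TSVD), as compared in the paper, would further rotate the basis to compact the energy into fewer such components before computing the moments.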

  5. Social Network Supported Process Recommender System

    PubMed Central

    Ye, Yanming; Yin, Jianwei; Xu, Yueshen

    2014-01-01

    Process recommendation technologies have gained more and more attention in the field of intelligent business process modeling as a means of assisting process modeling. However, most of the existing technologies use only process structure analysis and do not take the social features of processes into account, even though process modeling is complex and comprehensive in most situations. This paper studies the feasibility of social network research technologies for process recommendation and builds a social network system of processes based on feature similarities. Then, three process matching degree measurements are presented and the system implementation is discussed. Finally, experimental evaluations and future work are presented. PMID:24672309

  6. The Independent Technical Analysis Process Final Report 2006-2007.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duberstein, Corey; Ham, Kenneth; Dauble, Dennis

    2007-03-01

    The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities. The Independent Technical Analysis Process (ITAP) was created to provide non-routine analysis for fish and wildlife agencies and tribes in particular, and the public in general, on matters related to juvenile and adult salmon and steelhead passage through the mainstem hydrosystem. The process was designed to maintain the independence of analysts and reviewers from parties requesting analyses, to avoid potential bias in technical products. The objectives identified for this project were to administer a rigorous, transparent process to deliver unbiased technical assistance necessary to coordinate recommendations for storage reservoir and river operations that avoid potential conflicts between anadromous and resident fish. Seven work elements, designated by numbered categories in the Pisces project tracking system, were created to define and accomplish project goals as follows: (1) 118 Coordination - Coordinate the technical analysis and review process: (a) Retain expertise for analyst/reviewer roles. (b) Draft research directives. (c) Send the directive to the analyst. (d) Coordinate two independent reviews of the draft report. (e) Ensure reviewer comments are addressed within the final report. (2) 162 Analyze/Interpret Data - Implement the independent aspects of the project. (3) 122 Provide Technical Review - Implement the review process for the analysts. (4) 132 Produce Annual Report - FY06 annual progress report with Pisces. (5) 161 Disseminate Raw/Summary Data and Results - Post technical products on the ITAP web site. (6) 185 Produce Pisces Status Report - Provide periodic status reports to BPA. (7) 119 Manage and Administer Projects - Project/contract administration.

  7. Curriculum for Development: Analysis and Review of Processes, Products and Outcomes. Final Report: Sub-Regional Curriculum Workshop (Colombo, Sri Lanka, October 1-30, 1976).

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand). Regional Office for Education in Asia and Oceania.

    Presenting proceedings and materials covered at an Asian curriculum workshop involving 15 participants from 7 countries (Afghanistan, Bangladesh, Indonesia, Malaysia, the Philippines, India, and Sri Lanka), this document includes: a discussion of criteria for curriculum analysis re: health education and nutrition instruction for grades 6-10; a…

  8. Assessing the accuracy of wildland fire situation analysis (WFSA) fire size and suppression cost estimates.

    Treesearch

    Geoffrey H. Donovan; Peter. Noordijk

    2005-01-01

    To determine the optimal suppression strategy for escaped wildfires, federal land managers are required to conduct a wildland fire situation analysis (WFSA). As part of the WFSA process, fire managers estimate final fire size and suppression costs. Estimates from 58 WFSAs conducted during the 2002 fire season are compared to actual outcomes. Results indicate that...

  9. 75 FR 40847 - Agency Information Collection Activities: Proposed Collection; Comment Request, 1660-0036...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-14

    ... Emergency Management Agency Individual Assistance Customer Satisfaction Surveys AGENCY: Federal Emergency..., timeliness and satisfaction with initial, continuing and final delivery of disaster-related assistance. DATES..., Customer Satisfaction Analysis Section, Texas National Processing Service Center, Recovery Directorate...

  10. Optical granulometric analysis of sedimentary deposits by color segmentation-based software: OPTGRAN-CS

    NASA Astrophysics Data System (ADS)

    Chávez, G. Moreno; Sarocchi, D.; Santana, E. Arce; Borselli, L.

    2015-12-01

    The study of grain size distribution is fundamental for understanding sedimentological environments. Through these analyses, clast erosion, transport and deposition processes can be interpreted and modeled. However, grain size distribution analysis can be difficult in some outcrops due to the number and complexity of the arrangement of clasts and matrix and their physical size. Despite various technological advances, it is almost impossible to obtain the full grain size distribution (from blocks to sand grains) with a single method or instrument of analysis. For this reason, development in this area continues to be fundamental. In recent years, various methods of particle size analysis by automatic image processing have been developed, owing to their potential advantages over classical methods: speed and the detailed information they ultimately provide (virtually for each analyzed particle). In this framework, we have developed a novel algorithm and software for grain size distribution analysis, based on color image segmentation using an entropy-controlled quadratic Markov measure field algorithm and the Rosiwal method for counting intersections between clasts and linear transects in the images. We tested the novel algorithm on different sedimentary deposit types from 14 varieties of sedimentological environments. The results of the new algorithm were compared with grain counts performed manually, by the same Rosiwal method, by experts. The new algorithm has the same accuracy as a classical manual count process, but the application of this innovative methodology is much easier and dramatically less time-consuming. The final productivity of the new software for the analysis of clast deposits after recording field outcrop images can thus be increased significantly.
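
    The Rosiwal intercept idea, counting along linear transects so that intercept-length fractions estimate area fractions, can be sketched on a toy segmentation grid. The grid and transect choice below are illustrative, not the OPTGRAN-CS implementation.

```python
def rosiwal_fractions(labels, transect_rows):
    """Estimate area fractions from intercept lengths along linear transects.

    `labels` is a 2-D grid of class ids (e.g. clast vs. matrix), such as the
    output of a color segmentation; `transect_rows` selects horizontal lines.
    """
    counts = {}
    total = 0
    for r in transect_rows:
        for cls in labels[r]:
            counts[cls] = counts.get(cls, 0) + 1
            total += 1
    return {cls: n / total for cls, n in counts.items()}

# Toy segmentation: 0 = matrix, 1 = clast
grid = [
    [0, 0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 0, 0, 1, 1, 0],
]
fractions = rosiwal_fractions(grid, transect_rows=[0, 2])
```

    Per-class intercept runs along each transect, rather than pooled pixel counts, would additionally yield the grain size distribution itself.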

  11. In-depth analysis and characterization of a dual damascene process with respect to different CD

    NASA Astrophysics Data System (ADS)

    Krause, Gerd; Hofmann, Detlef; Habets, Boris; Buhl, Stefan; Gutsch, Manuela; Lopez-Gomez, Alberto; Kim, Wan-Soo; Thrun, Xaver

    2018-03-01

    In a 200 mm high-volume environment, we studied data from a dual damascene process. Dual damascene is a combination of lithography, etch, and CMP that is used to create copper lines and contacts in a single step. During these process steps, different metal CDs are measured by different measurement methods. In this study, we analyze the key figures of the different measurements after the different process steps and develop simple models to predict the electrical behavior. In addition, radial profiles of both inline measurement parameters and electrical parameters have been analyzed. A matching method was developed based on inline and electrical data. Finally, a correlation analysis for radial signatures is presented that can be used to predict excursions in electrical signatures.

  12. Research progress in Asia on methods of processing laser-induced breakdown spectroscopy data

    NASA Astrophysics Data System (ADS)

    Guo, Yang-Min; Guo, Lian-Bo; Li, Jia-Ming; Liu, Hong-Di; Zhu, Zhi-Hao; Li, Xiang-You; Lu, Yong-Feng; Zeng, Xiao-Yan

    2016-10-01

    Laser-induced breakdown spectroscopy (LIBS) has attracted much attention in terms of both scientific research and industrial application. An important branch of LIBS research in Asia, the development of data processing methods for LIBS, is reviewed. First, the basic principle of LIBS and the characteristics of spectral data are briefly introduced. Next, two aspects of research on and problems with data processing methods are described: i) the basic principles of data preprocessing methods are elaborated in detail on the basis of the characteristics of spectral data; ii) the performance of data analysis methods in qualitative and quantitative analysis of LIBS is described. Finally, a direction for future development of data processing methods for LIBS is also proposed.
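
    As a minimal illustration of the preprocessing stage reviewed above, one common choice among many in the LIBS literature is baseline subtraction followed by total-intensity normalization; the spectrum below is a toy example, not measured data.

```python
def preprocess_spectrum(intensities):
    """Minimal LIBS preprocessing sketch: subtract a constant baseline
    (here the minimum intensity), then normalize to unit total intensity
    to suppress shot-to-shot fluctuations."""
    baseline = min(intensities)
    shifted = [x - baseline for x in intensities]
    total = sum(shifted)
    return [x / total for x in shifted]

# Toy 4-channel "spectrum": one strong emission line at index 2.
spec = preprocess_spectrum([10.0, 12.0, 40.0, 11.0])
```

    Real pipelines typically fit a wavelength-dependent baseline and may normalize to a reference line or to plasma temperature instead; this sketch only shows the shape of the step.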

  13. A System for the Individualization and Optimization of Learning Through Computer Management of the Educational Process. Final Report.

    ERIC Educational Resources Information Center

    Schure, Alexander

    A computer-based system model for the monitoring and management of the instructional process was conceived, developed and refined through the techniques of systems analysis. This report describes the various aspects and components of this project in a series of independent and self-contained units. The first unit provides an overview of the entire…

  14. Impact of selected troposphere models on Precise Point Positioning convergence

    NASA Astrophysics Data System (ADS)

    Kalita, Jakub; Rzepecka, Zofia

    2016-04-01

    The Precise Point Positioning (PPP) absolute method is currently being investigated intensively in order to shorten its convergence time. Among the various factors that influence PPP convergence, the tropospheric delay is one of the most important. Numerous models of tropospheric delay have been developed and applied to PPP processing. However, with rare exceptions, the quality of those models does not allow fixing the zenith path delay tropospheric parameter, leaving the difference between the nominal and final values to the estimation process. Here we present a comparison of several PPP result sets, each based on a different troposphere model. The respective nominal values are adopted from the VMF1, GPT2w, MOPS and ZERO-WET models. The PPP solution admitted as reference is based on the final troposphere product from the International GNSS Service (IGS). The VMF1 mapping function was used for all processing variants in order to allow comparison of the impact of the applied nominal values. The worst case initializes the zenith wet delay with a zero value (ZERO-WET). The impact of any candidate model for the tropospheric nominal values should fall between the IGS and ZERO-WET border variants. The analysis is based on data from seven IGS stations located in the mid-latitude European region from the year 2014. For the purpose of this study, several days with the most active troposphere were selected for each of the stations. All the PPP solutions were determined using the gLAB open-source software, with the Kalman filter implemented independently by the authors of this work. The processing was performed on 1-hour slices of observation data. In addition to the analysis of the output processing files, the presented study contains a detailed analysis of the tropospheric conditions for the selected data. The overall results show that for the height component the VMF1 model outperforms GPT2w and MOPS by 35-40% and the ZERO-WET variant by 150%. In most cases all solutions converge to the same values during the first hour of processing. Finally, the results have been compared against results obtained during calm tropospheric conditions.

  15. Back-channel-etch amorphous indium-gallium-zinc oxide thin-film transistors: The impact of source/drain metal etch and final passivation

    NASA Astrophysics Data System (ADS)

    Nag, Manoj; Bhoolokam, Ajay; Steudel, Soeren; Chasin, Adrian; Myny, Kris; Maas, Joris; Groeseneken, Guido; Heremans, Paul

    2014-11-01

    We report on the impact of the source/drain (S/D) metal (molybdenum) etch and the final passivation (SiO2) layer on the bias-stress stability of back-channel-etch (BCE) configuration amorphous indium-gallium-zinc oxide (a-IGZO) thin-film transistors (TFTs). It is observed that BCE-configuration TFTs suffer from poorer bias stability than etch-stop-layer (ESL) TFTs. By analysis with transmission electron microscopy (TEM) and energy dispersive spectroscopy (EDS), as well as by a comparative analysis of contacts formed with other metals, we infer that this poor bias stability of BCE transistors with Mo S/D contacts is associated with contamination of the back-channel interface, which occurs through Mo-containing deposits left on the back channel during the final plasma process of the physical-vapor-deposited SiO2 passivation.

  16. Measurement of WW and WZ production in the lepton plus heavy flavor jets final state at CDF

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leone, Sandra

    We present the CDF measurement of the diboson WW and WZ production cross sections in a final state consistent with leptonic W decay and jets originating from heavy flavor quarks, based on the full Tevatron Run II dataset. The analysis of the dijet invariant mass spectrum yields 3.7 sigma evidence for the combined production of WW or WZ bosons. The different heavy flavor decay patterns of the W and Z bosons and the analysis of the secondary-decay-vertex properties allow the WW and WZ production cross sections to be measured independently in a hadronic final state. The measured cross sections are consistent with the standard model predictions and correspond to signal significances of 2.9 and 2.1 sigma for WW and WZ production, respectively.

  17. A DMAIC approach for process capability improvement an engine crankshaft manufacturing process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa

    2014-05-01

    The define-measure-analyze-improve-control (DMAIC) approach is a five-strata, scientific approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing the process variations of the stub-end-hole boring operation in crankshaft manufacture. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define stratum. The next stratum constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analysis and improvement strata, where various quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, the standard deviation was reduced from 0.003 to 0.002, the process potential capability index (CP) improved from 1.29 to 2.02, and the process performance capability index (CPK) improved from 0.32 to 1.45.
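
    The reported capability indices follow the standard definitions, Cp = (USL - LSL) / 6σ and Cpk = min(USL - μ, μ - LSL) / 3σ; a minimal sketch with invented bore-dimension samples and specification limits (not the study's data):

```python
from statistics import mean, stdev

def process_capability(samples, lsl, usl):
    """Potential (Cp) and performance (Cpk) capability indices
    from sample data and the lower/upper specification limits."""
    mu, sigma = mean(samples), stdev(samples)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# Hypothetical stub-end-hole dimensions (mm) and spec limits:
samples = [10.000, 10.002, 9.998, 10.001, 9.999]
cp, cpk = process_capability(samples, lsl=9.99, usl=10.01)
```

    For a perfectly centered process, as in this toy data, Cpk equals Cp; any mean shift toward a specification limit pulls Cpk below Cp.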

  18. Optimizing Endoscope Reprocessing Resources Via Process Flow Queuing Analysis.

    PubMed

    Seelen, Mark T; Friend, Tynan H; Levine, Wilton C

    2018-05-04

    The Massachusetts General Hospital (MGH) is merging its older endoscope processing facilities into a single new facility that will enable high-level disinfection of endoscopes for both the ORs and Endoscopy Suite, leveraging economies of scale for improved patient care and optimal use of resources. Finalized resource planning was necessary for the merging of facilities to optimize staffing and make final equipment selections to support the nearly 33,000 annual endoscopy cases. To accomplish this, we employed operations management methodologies, analyzing the physical process flow of scopes throughout the existing Endoscopy Suite and ORs and mapping the future state capacity of the new reprocessing facility. Further, our analysis required the incorporation of historical case and reprocessing volumes in a multi-server queuing model to identify any potential wait times as a result of the new reprocessing cycle. We also performed sensitivity analysis to understand the impact of future case volume growth. We found that our future-state reprocessing facility, given planned capital expenditures for automated endoscope reprocessors (AERs) and pre-processing sinks, could easily accommodate current scope volume well within the necessary pre-cleaning-to-sink reprocessing time limit recommended by manufacturers. Further, in its current planned state, our model suggested that the future endoscope reprocessing suite at MGH could support an increase in volume of at least 90% over the next several years. Our work suggests that with simple mathematical analysis of historic case data, significant changes to a complex perioperative environment can be made with ease while keeping patient safety as the top priority.
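
    A multi-server queuing model of the kind used in this analysis can be sketched with the classic M/M/c (Erlang C) formulas; the arrival rate, reprocessing rate, and AER count below are assumptions for illustration, not MGH's figures.

```python
from math import factorial

def erlang_c_wait(arrival_rate, service_rate, servers):
    """Mean wait in queue (same time units as the rates) for an
    M/M/c system; returns None if the system is unstable."""
    a = arrival_rate / service_rate          # offered load (Erlangs)
    rho = a / servers                        # per-server utilization
    if rho >= 1:
        return None
    # Erlang C probability that an arriving scope must wait:
    p0_inv = sum(a**k / factorial(k) for k in range(servers))
    tail = a**servers / (factorial(servers) * (1 - rho))
    prob_wait = tail / (p0_inv + tail)
    return prob_wait / (servers * service_rate - arrival_rate)

# Hypothetical: 8 scopes/hour arriving, each AER cycle ~2/hour, 6 AERs.
wq = erlang_c_wait(8.0, 2.0, 6)   # mean queue wait in hours
```

    Growing the arrival rate in this model (e.g. by 90%, as in the sensitivity analysis) shows directly when queue waits would begin to threaten the pre-cleaning-to-sink time limit.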

  19. Prevalence and Persistence of Listeria monocytogenes in Ready-to-Eat Tilapia Sashimi Processing Plants.

    PubMed

    Chen, Bang-Yuan; Wang, Chung-Yi; Wang, Chia-Lan; Fan, Yang-Chi; Weng, I-Ting; Chou, Chung-Hsi

    2016-11-01

    A 2-year study was performed at two ready-to-eat tilapia sashimi processing plants (A and B) to identify possible routes of contamination with Listeria monocytogenes during processing. Samples were collected from the aquaculture environments, transportation tanks, processing plants, and final products. Seventy-nine L. monocytogenes isolates were found in the processing environments and final products; 3.96% (50 of 1,264 samples) and 3.86% (29 of 752 samples) of the samples from plants A and B, respectively, were positive for L. monocytogenes. No L. monocytogenes was detected in the aquaculture environments or transportation tanks. The predominant L. monocytogenes serotypes were 1/2b (55.70%) and 4b (37.97%); serotypes 3b and 4e were detected at much lower percentages. At both plants, most processing sections were contaminated with L. monocytogenes before the start of processing, which indicated that the cleaning and sanitizing methods did not achieve adequate pathogen removal. Eleven seropulsotypes were revealed by pulsed-field gel electrophoresis and serotyping. Analysis of the seropulsotype distribution revealed that the contamination was disseminated by the processing work; the same seropulsotypes were repeatedly found along the work flow line and in the final products. Specific seropulsotypes were persistently found during different sampling periods, which suggests that the sanitation procedures or equipment used at these plants were inadequate. Plant staff should improve the sanitation procedures and equipment to reduce the risk of L. monocytogenes cross-contamination and ensure the safety of ready-to-eat tilapia products.

  20. Shared Resources: Sharing Right-Of-Way For Telecommunications, Identification, Review And Analysis Of Legal And Institutional Issues, Final Report

    DOT National Transportation Integrated Search

    1998-09-01

    Commercial Vehicle Administrative (CVO) Processes Cross-Cutting report summarizes and interprets the results of several Field Operational Tests (FOTs) conducted to evaluate systems that increase the efficiency of commercial vehicle administrative pro...

  1. CONCEPTS AND APPROACHES FOR THE BIOASSESSMENT OF NON-WADEABLE STREAMS AND RIVERS

    EPA Science Inventory

    This document is intended to assist users in establishing or refining protocols, including the specific methods related to field sampling, laboratory sample processing, taxonomy, data entry, management and analysis, and final assessment and reporting. It also reviews and provide...

  2. VII data use analysis and processing (DUAP) : final project report (phase II).

    DOT National Transportation Integrated Search

    2010-10-01

    This report covers several key subjects related to the generation of IntelliDriveSM probe vehicle data and use of this data in : application of interest to state departments of transportation and local public transportation agencies. The evaluations ...

  3. Three Tier Unified Process Model for Requirement Negotiations and Stakeholder Collaborations

    NASA Astrophysics Data System (ADS)

    Niazi, Muhammad Ashraf Khan; Abbas, Muhammad; Shahzad, Muhammad

    2012-11-01

    This research paper focuses on carrying out a pragmatic qualitative analysis of various models and approaches to requirements negotiation (a sub-process of the requirements management plan, which is an output of scope management's collect requirements process) and studies stakeholder collaboration methodologies (i.e., from within the communication management knowledge area). The experiential analysis encompasses two tiers: the first tier applies a weighted scoring model, while the second tier develops SWOT matrices on the basis of the weighted scoring model's findings in order to select an appropriate requirements negotiation model. Finally, the results are illustrated with statistical pie charts. On the basis of these results for the prevalent models and approaches to negotiation, a unified approach for requirements negotiation and stakeholder collaboration is proposed, in which the collaboration methodologies are embedded into the selected requirements negotiation model as internal parameters of the proposed process, alongside some external required parameters such as MBTI and opportunity analysis.

  4. Image processing and analysis of Saturn's rings

    NASA Technical Reports Server (NTRS)

    Yagi, G. M.; Jepsen, P. L.; Garneau, G. W.; Mosher, J. A.; Doyle, L. R.; Lorre, J. J.; Avis, C. C.; Korsmo, E. P.

    1981-01-01

    Processing of Voyager image data of Saturn's rings at JPL's Image Processing Laboratory is described. A software system has been developed to navigate the flight images, to facilitate feature tracking, and to project the rings. This system has been used to measure ring radii and the velocities of the spoke features in the B-Ring. A projected ring movie has been generated to study the development of these spoke features. Finally, processing to facilitate comparison of the photometric properties of Saturn's rings at various phase angles is described.

  5. A data mining approach to optimize pellets manufacturing process based on a decision tree algorithm.

    PubMed

    Ronowicz, Joanna; Thommes, Markus; Kleinebudde, Peter; Krysiński, Jerzy

    2015-06-20

    The present study is focused on a thorough analysis of cause-effect relationships between pellet formulation characteristics (pellet composition as well as process parameters) and the selected quality attribute of the final product. Pellet quality was expressed by shape, using the aspect ratio value. A data matrix for chemometric analysis consisted of 224 pellet formulations prepared with eight different active pharmaceutical ingredients and several various excipients, using different extrusion/spheronization process conditions. The data set contained 14 input variables (both formulation and process variables) and one output variable (pellet aspect ratio). A tree regression algorithm consistent with the Quality by Design concept was applied to obtain deeper understanding and knowledge of the formulation and process parameters affecting the final pellet sphericity. A clear, interpretable set of decision rules was generated. The spheronization speed, spheronization time, number of holes, and water content of the extrudate were recognized as the key factors influencing pellet aspect ratio. The most spherical pellets were achieved by using a large number of holes during extrusion, a high spheronizer speed, and a longer spheronization time. The described data mining approach enhances knowledge about the pelletization process and simultaneously facilitates the search for the optimal process conditions necessary to achieve ideally spherical pellets with good flow characteristics. This data mining approach can be taken into consideration by industrial formulation scientists to support rational decision making in the field of pellet technology. Copyright © 2015 Elsevier B.V. All rights reserved.
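    The building block of such a tree regression is a single split that minimizes the squared error of a piecewise-constant fit. A minimal sketch, with purely illustrative data (an assumed spheronization speed versus aspect ratio, not the study's measurements):

```python
import numpy as np

# Toy data: x = spheronization speed (rpm), y = pellet aspect ratio (assumed).
x = np.array([300, 400, 500, 700, 800, 900], dtype=float)
y = np.array([1.30, 1.25, 1.22, 1.10, 1.08, 1.07])

def best_split(x, y):
    """Return the threshold minimizing the summed squared error
    of a two-leaf (piecewise-constant) regression fit, as in CART."""
    best_t, best_sse = None, np.inf
    for t in (x[:-1] + x[1:]) / 2:          # candidate midpoints
        left, right = y[x <= t], y[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_t, best_sse = t, sse
    return best_t

threshold = best_split(x, y)   # the tree's first decision rule: x <= threshold
```

    A full regression tree applies this split search recursively to each resulting subset, yielding the kind of readable decision rules the study reports.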

  6. Redesigning the Content and Sequence of Instruction in Music Theory. Final Report to Fund for the Improvement of Post Secondary Education.

    ERIC Educational Resources Information Center

    Ashley, Richard D.

    This report summarizes a project in which a number of new approaches were taken to improve learning in undergraduate basic music instruction for music majors. The basic viewpoint proposed was that music activities can be seen as skilled problem solving in the areas of aural analysis, visual analysis, and understanding of compositional processes.…

  7. A simple method for processing data with least square method

    NASA Astrophysics Data System (ADS)

    Wang, Chunyan; Qi, Liqun; Chen, Yongxiang; Pang, Guangning

    2017-08-01

    The least square method is widely used in data processing and error estimation. It has become an essential technique for parameter estimation, data processing, regression analysis, and experimental data fitting, and a criterion tool for statistical inference. In measurement data analysis, complex distributions are usually handled according to the least square principle, i.e., matrices are used to solve for the final estimate and to improve its accuracy. In this paper, a new solution method is presented that is based on algebraic computation and is relatively straightforward and easy to understand. The practicability of this method is demonstrated with a concrete example.
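    The standard matrix formulation referred to here is the normal equations. A minimal sketch with assumed data points (fitting a straight line y = a·x + b):

```python
import numpy as np

# Illustrative measurements, roughly following y = 2x + 1 with small errors.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.1, 7.0])

A = np.column_stack([x, np.ones_like(x)])   # design matrix for y = a*x + b
# Normal equations: (A^T A) p = A^T y, solved for the parameter vector p.
a, b = np.linalg.solve(A.T @ A, A.T @ y)
```

    The solved coefficients minimize the sum of squared residuals over the data.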

  8. Laser Doppler velocimeter system simulation for sensing aircraft wake vortices. Part 2: Processing and analysis of LDV data (for runs 1023 and 2023)

    NASA Technical Reports Server (NTRS)

    Meng, J. C. S.; Thomson, J. A. L.

    1975-01-01

    A data analysis program constructed to assess LDV system performance, to validate the simulation model, and to test various vortex location algorithms is presented. Real or simulated Doppler spectra versus range and elevation are used, and the spatial distributions of various spectral moments or other spectral characteristics are calculated and displayed. Each of the real or simulated scans can be processed by one of three different procedures: simple frequency or wavenumber filtering, matched filtering, and deconvolution filtering. The final output is displayed as contour plots in an x-y coordinate system, as well as in the form of vortex tracks deduced from the maxima of the processed data. A detailed analysis of runs 1023 and 2023 is presented to demonstrate the data analysis procedure. Vortex tracks and system range resolutions are compared with theoretical predictions.
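    The matched-filtering option can be sketched as correlating a noisy spectrum with a known line-shape template and taking the peak of the response as the location estimate. The Gaussian template, noise level, and signal position below are illustrative assumptions, not parameters from the report:

```python
import numpy as np

rng = np.random.default_rng(0)
# Assumed symmetric Gaussian line shape as the matched-filter template.
template = np.exp(-0.5 * (np.arange(-5, 6) / 2.0) ** 2)

# Synthetic "spectrum": noise with the template buried at a known position.
spectrum = rng.normal(0.0, 0.2, 200)
true_pos = 120
spectrum[true_pos - 5 : true_pos + 6] += template

# Matched filtering = correlation with the template; the response peak
# marks the most likely signal location.
response = np.correlate(spectrum, template, mode="same")
estimate = int(np.argmax(response))
```

    Because the template is symmetric, correlation and convolution coincide and the peak lands at (or within a bin of) the true position despite the noise.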

  9. Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.

    PubMed

    Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam

    2018-01-01

    During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and to streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, to support explicit communication of progress and uncertainty through annotation, and to support implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote more effective handoff strategies, increase awareness of the prior investigative process and insights, and improve final investigative outcomes.

  10. Final Environmental Assessment for Shared Use Paths (SUP), Eglin Air Force Base, Florida

    DTIC Science & Technology

    2011-07-01

    NEPA; 40 Code of Federal Regulations [CFR] 1500-1508); the USAF environmental impact analysis process as effectuated by 32 CFR Part 989; and DoD...alternatives were considered, but not carried forward for analysis. Alternative B This alternative would consist of constructing a 10’ wide SUP...(EA Section 2.3.1, page 12) No-Action Alternative This alternative also was carried forward for analysis. (EA Section 2.4.1, page 16) ENVIRONMENTAL

  11. A Comparative of business process modelling techniques

    NASA Astrophysics Data System (ADS)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modeling techniques. This article reports research on the differences among business process modeling techniques, explaining the definition and structure of each. The paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation, and how each technique works when implemented in Somerleyton Animal Park. Each technique is summarized with its advantages and disadvantages. The final conclusion recommends business process modeling techniques that are easy to use and serves as the basis for evaluating further modelling techniques.

  12. Methodology for assessing the effectiveness of access management techniques : final report, September 14, 1998.

    DOT National Transportation Integrated Search

    1998-09-14

    A methodology for assessing the effectiveness of access management techniques on suburban arterial highways is developed. The methodology is described as a seven-step process as follows: (1) establish the purpose of the analysis; (2) establish the mea...

  13. Multi-Response Parameter Interval Sensitivity and Optimization for the Composite Tape Winding Process.

    PubMed

    Deng, Bo; Shi, Yaoyao; Yu, Tao; Kang, Chao; Zhao, Pan

    2018-01-31

    The composite tape winding process, which utilizes a tape winding machine and prepreg tapes, provides a promising way to improve the quality of composite products. Nevertheless, the process parameters of composite tape winding have crucial effects on the tensile strength and void content, which are closely related to the performances of the winding products. In this article, two different objective values of winding products, a mechanical performance (tensile strength) and a physical property (void content), were calculated. Thereafter, the paper presents an integrated methodology by combining multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis to obtain the optimal intervals of the composite tape winding process. First, the global multi-parameter sensitivity analysis method was applied to investigate the sensitivity of each parameter in the tape winding processing. Then, the local single-parameter sensitivity analysis method was employed to calculate the sensitivity of a single parameter within the corresponding range. Finally, the stability and instability ranges of each parameter were distinguished. Meanwhile, the authors optimized the process parameter ranges and provided comprehensive optimized intervals of the winding parameters. The verification test validated that the optimized intervals of the process parameters were reliable and stable for the manufacturing of winding products.
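    The local single-parameter step can be sketched as a one-at-a-time finite-difference sensitivity ranking. The response function and parameter values below are illustrative stand-ins, not the authors' fitted model:

```python
# Hypothetical smooth response surface standing in for the real
# tensile-strength model of the winding process (an assumption).
def tensile_strength(temperature, tension, pressure):
    return 0.8 * temperature + 2.0 * tension + 0.5 * pressure

baseline = {"temperature": 80.0, "tension": 100.0, "pressure": 0.3}

def sensitivity(param, h=1e-4):
    """Central finite-difference sensitivity of the response
    to one parameter, all others held at baseline."""
    hi = dict(baseline, **{param: baseline[param] + h})
    lo = dict(baseline, **{param: baseline[param] - h})
    return (tensile_strength(**hi) - tensile_strength(**lo)) / (2 * h)

# Rank parameters from most to least influential.
ranking = sorted(baseline, key=sensitivity, reverse=True)
```

    Repeating the perturbation across each parameter's full range would distinguish the stable and unstable sub-intervals described in the abstract.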

  14. Multi-Response Parameter Interval Sensitivity and Optimization for the Composite Tape Winding Process

    PubMed Central

    Yu, Tao; Kang, Chao; Zhao, Pan

    2018-01-01

    The composite tape winding process, which utilizes a tape winding machine and prepreg tapes, provides a promising way to improve the quality of composite products. Nevertheless, the process parameters of composite tape winding have crucial effects on the tensile strength and void content, which are closely related to the performances of the winding products. In this article, two different objective values of winding products, a mechanical performance (tensile strength) and a physical property (void content), were calculated. Thereafter, the paper presents an integrated methodology by combining multi-parameter relative sensitivity analysis and single-parameter sensitivity analysis to obtain the optimal intervals of the composite tape winding process. First, the global multi-parameter sensitivity analysis method was applied to investigate the sensitivity of each parameter in the tape winding processing. Then, the local single-parameter sensitivity analysis method was employed to calculate the sensitivity of a single parameter within the corresponding range. Finally, the stability and instability ranges of each parameter were distinguished. Meanwhile, the authors optimized the process parameter ranges and provided comprehensive optimized intervals of the winding parameters. The verification test validated that the optimized intervals of the process parameters were reliable and stable for the manufacturing of winding products. PMID:29385048

  15. Rapid Prototyping Technology for Manufacturing GTE Turbine Blades

    NASA Astrophysics Data System (ADS)

    Balyakin, A. V.; Dobryshkina, E. M.; Vdovin, R. A.; Alekseev, V. P.

    2018-03-01

    The conventional approach to manufacturing turbine blades by investment casting is expensive and time-consuming, as it takes a lot of time to make geometrically precise and complex wax patterns. Turbine blade manufacturing in pilot production can be sped up by accelerating the casting process while keeping the geometric precision of the final product. This paper compares the rapid prototyping method (casting the wax pattern composition into elastic silicone molds) to the conventional technology. Analysis of the size precision of blade casts shows that silicone-mold casting features sufficient geometric precision. Thus, this method for making wax patterns can be a cost-efficient solution for small-batch or pilot production of turbine blades for gas-turbine units (GTU) and gas-turbine engines (GTE). The paper demonstrates how additive technology and thermographic analysis can speed up the cooling of wax patterns in silicone molds. This is possible at an optimal temperature and solidification time, which make the process more cost-efficient while keeping the geometric quality of the final product.

  16. A consensus reaching model for 2-tuple linguistic multiple attribute group decision making with incomplete weight information

    NASA Astrophysics Data System (ADS)

    Zhang, Wancheng; Xu, Yejun; Wang, Huimin

    2016-01-01

    The aim of this paper is to put forward a consensus reaching method for multi-attribute group decision-making (MAGDM) problems with linguistic information, in which the weight information of experts and attributes is unknown. First, some basic concepts and operational laws of the 2-tuple linguistic label are introduced. Then, a grey relational analysis method and a maximising deviation method are proposed to calculate the incomplete weight information of experts and attributes, respectively. To eliminate conflict in the group, a weight-updating model is employed to derive the weights of experts based on their contribution to the consensus reaching process. After conflict elimination, the final group preference can be obtained, which gives the ranking of the alternatives. The model can effectively avoid the information distortion which occurs regularly in linguistic information processing. Finally, an illustrative example is given to illustrate the application of the proposed method, and a comparative analysis with existing methods is offered to show the advantages of the proposed method.
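    The grey relational analysis step can be sketched numerically as follows. The decision matrix (rows = alternatives, columns = benefit-type attributes scaled to [0, 1]) and the distinguishing coefficient are illustrative assumptions:

```python
import numpy as np

# Assumed normalized decision matrix: 3 alternatives x 3 attributes.
X = np.array([
    [0.9, 0.4, 0.7],
    [0.6, 0.8, 0.5],
    [0.3, 0.6, 0.9],
])

reference = X.max(axis=0)       # ideal (reference) value per attribute
delta = np.abs(X - reference)   # deviation of each alternative from the ideal
rho = 0.5                       # distinguishing coefficient (conventional value)

# Grey relational coefficient for every entry, then the grade per alternative.
xi = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
grades = xi.mean(axis=1)        # higher grade = closer to the ideal
```

    In the paper's setting the same machinery is applied to expert judgments rather than raw alternative scores, yielding the incomplete weight information.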

  17. Investigation of novel sol-gel hydrophobic surfaces for desorption electrospray ionization-mass spectrometry analysis.

    PubMed

    Penna, Andrea; Elviri, Lisa; Careri, Maria; Mangia, Alessandro; Predieri, Giovanni

    2011-05-01

    Sol-gel-based materials were synthesized, characterized and finally tested as solid supports for desorption electrospray ionization-mass spectrometry (DESI-MS) analysis of a mixture of compounds of different polarity. Films with thickness in the 2-4 μm range were obtained by a dip-coating process using tetraethoxysilane (TEOS) and octyltriethoxysilane (OTES) as sol-gel precursors. Three types of surface with different hydrophobic character were obtained by varying the TEOS/OTES ratio in the sol-gel mixture. Each coating was characterized by atomic force microscopy investigations, gaining insight into homogeneity, smoothness and thickness of the obtained films. To study hydrophobicity of each surface, surface free energy measurements were performed. Different DESI-MS responses were observed when different solvent mixture deposition procedures and solvent spray compositions were investigated. Results were finally compared to those obtained by using commercial polytetrafluoroethylene-coated slides. It was found that surface free energy plays an important role in the desorption/ionization process as a function of the polarity of analytes.

  18. Validation of Student and Parent Reported Data on the Basic Grant Application Form: Pre-Award Validation Analysis Study. Revised Final Report.

    ERIC Educational Resources Information Center

    Applied Management Sciences, Inc., Silver Spring, MD.

    The 1978-1979 pre-award institution validation process for the Basic Educational Opportunity Grant (BEOG) program was studied, based on applicant and grant recipient files as of the end of February 1979. The objective was to assess the impact of the validation process on the proper award of BEOGs, and to determine whether the criteria for…

  19. Situation-specific theories from the middle-range transitions theory.

    PubMed

    Im, Eun-Ok

    2014-01-01

    The purpose of this article was to analyze the theory development process of the situation-specific theories that were derived from the middle-range transitions theory. This analysis aims to provide directions for future development of situation-specific theories. First, transitions theory is concisely described with its history, goal, and major concepts. Then, the approach that was used to retrieve the situation-specific theories derived from transitions theory is described. Next, an analysis of 6 situation-specific theories is presented. Finally, 4 themes reflecting commonalities and variances in the theory development process are discussed with implications for future theoretical development.

  20. Markov random fields and graphs for uncertainty management and symbolic data fusion in an urban scene interpretation

    NASA Astrophysics Data System (ADS)

    Moissinac, Henri; Maitre, Henri; Bloch, Isabelle

    1995-11-01

    An image interpretation method is presented for the automatic processing of aerial pictures of an urban landscape. In order to improve the picture analysis, some a priori knowledge extracted from a geographic map is introduced. A coherent graph-based model of the city is built, starting with the road network. A global uncertainty management scheme has been designed in order to evaluate the confidence we can have in the final results. This model and the uncertainty management reflect the hierarchy of the available data and the interpretation levels. The symbolic relationships linking the different kinds of elements are taken into account while propagating and combining the confidence measures along the interpretation process.

  1. Cleanliness of Ti-bearing Al-killed ultra-low-carbon steel during different heating processes

    NASA Astrophysics Data System (ADS)

    Guo, Jian-long; Bao, Yan-ping; Wang, Min

    2017-12-01

    During the production of Ti-bearing Al-killed ultra-low-carbon (ULC) steel, two different heating processes were used when the converter tapping temperature or the molten steel temperature in the Ruhrstahl-Heraeus (RH) process was low: heating by Al addition during the RH decarburization process and final deoxidation at the end of the RH decarburization process (process-I), and increasing the oxygen content at the end of RH decarburization, heating and final deoxidation by one-time Al addition (process-II). Temperature increases of 10°C by different processes were studied; the results showed that the two heating processes could achieve the same heating effect. The T.[O] content in the slab and the refining process was better controlled by process-I than by process-II. Statistical analysis of inclusions showed that the numbers of inclusions in the slab obtained by process-I were substantially less than those in the slab obtained by process-II. For process-I, the Al2O3 inclusions produced by Al added to induce heating were substantially removed at the end of decarburization. The amounts of inclusions were substantially greater for process-II than for process-I at different refining stages because of the higher dissolved oxygen concentration in process-II. Industrial test results showed that process-I was more beneficial for improving the cleanliness of molten steel.

  2. Simulation of 7050 Wrought Aluminum Alloy Wheel Die Forging and its Defects Analysis based on DEFORM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang Shiquan; Yi Youping; Zhang Yuxun

    2010-06-15

    Defects such as folding, intercrystalline cracking and flow line outcrop are very likely to occur in the forging of aluminum alloy. Moreover, it is difficult to achieve the optimal set of process parameters just by trial and error within an industrial environment. In producing a 7050 wrought aluminum alloy wheel, a rigid-plastic finite element method (FEM) analysis has been performed to optimize the die forging process. Processing parameters were analyzed, focusing on the effects of punch speed, friction factor and temperature. Meanwhile, the mechanism and evolution of defects in the wrought wheel were studied in detail. From an analysis of the results, isothermal die forging was proposed for producing 7050 aluminum alloy wheels with good mechanical properties. Finally, a verification experiment was carried out on a hydropress.

  3. Computational Analysis and Simulation of Empathic Behaviors: a Survey of Empathy Modeling with Behavioral Signal Processing Framework.

    PubMed

    Xiao, Bo; Imel, Zac E; Georgiou, Panayiotis; Atkins, David C; Narayanan, Shrikanth S

    2016-05-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, and facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation and offer a series of open problems for future research.

  4. Information processing and dynamics in minimally cognitive agents.

    PubMed

    Beer, Randall D; Williams, Paul L

    2015-01-01

    There has been considerable debate in the literature about the relative merits of information processing versus dynamical approaches to understanding cognitive processes. In this article, we explore the relationship between these two styles of explanation using a model agent evolved to solve a relational categorization task. Specifically, we separately analyze the operation of this agent using the mathematical tools of information theory and dynamical systems theory. Information-theoretic analysis reveals how task-relevant information flows through the system to be combined into a categorization decision. Dynamical analysis reveals the key geometrical and temporal interrelationships underlying the categorization decision. Finally, we propose a framework for directly relating these two different styles of explanation and discuss the possible implications of our analysis for some of the ongoing debates in cognitive science. Copyright © 2014 Cognitive Science Society, Inc.

  5. Evaluation of Uranium-235 Measurement Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaspar, Tiffany C.; Lavender, Curt A.; Dibert, Mark W.

    2017-05-23

    Monolithic U-Mo fuel plates are rolled to final fuel element form from the original cast ingot, and thus any inhomogeneities in 235U distribution present in the cast ingot are maintained, and potentially exaggerated, in the final fuel foil. The tolerance for inhomogeneities in the 235U concentration in the final fuel element foil is very low. A near-real-time, nondestructive technique to evaluate the 235U distribution in the cast ingot is required in order to provide feedback to the casting process. Based on the technical analysis herein, gamma spectroscopy has been recommended to provide a near-real-time measure of the 235U distribution in U-Mo cast plates.

  6. State-selected chemical reaction dynamics at the S matrix level - Final-state specificities of near-threshold processes at low and high energies

    NASA Technical Reports Server (NTRS)

    Chatfield, David C.; Truhlar, Donald G.; Schwenke, David W.

    1992-01-01

    State-to-state reaction probabilities are found to be highly final-state specific at state-selected threshold energies for the reactions O + H2 yield OH + H and H + H2 yield H2 + H. The study includes initial rotational states with quantum numbers 0-15, and the specificity is especially dramatic for the more highly rotationally excited reactants. The analysis is based on accurate quantum mechanical reactive scattering calculations. Final-state specificity is shown in general to increase with the rotational quantum number of the reactant diatom, and the trends are confirmed for both zero and nonzero values of the total angular momentum.

  7. Final Report for Dynamic Models for Causal Analysis of Panel Data. Models for Change in Quantitative Variables, Part II Scholastic Models. Part II, Chapter 4.

    ERIC Educational Resources Information Center

    Hannan, Michael T.

    This document is part of a series of chapters described in SO 011 759. Stochastic models for the sociological analysis of change and the change process in quantitative variables are presented. The author lays groundwork for the statistical treatment of simple stochastic differential equations (SDEs) and discusses some of the continuities of…

  8. A national analytical quality assurance program: Developing guidelines and analytical tools for the forest inventory and analysis program

    Treesearch

    Phyllis C. Adams; Glenn A. Christensen

    2012-01-01

    A rigorous quality assurance (QA) process assures that the data and information provided by the Forest Inventory and Analysis (FIA) program meet the highest possible standards of precision, completeness, representativeness, comparability, and accuracy. FIA relies on its analysts to check the final data quality prior to release of a State’s data to the national FIA...

  9. Radiography for intensive care: participatory process analysis in a PACS-equipped and film/screen environment

    NASA Astrophysics Data System (ADS)

    Peer, Regina; Peer, Siegfried; Sander, Heike; Marsolek, Ingo; Koller, Wolfgang; Pappert, Dirk; Hierholzer, Johannes

    2002-05-01

    If new technology is introduced into medical practice, it must prove to make a difference. However, traditional approaches of outcome analysis have failed to show a direct benefit of PACS on patient care, and economic benefits are still in debate. A participatory process analysis was performed to compare workflow in a film-based hospital and a PACS environment. This included direct observation of work processes, interviews with involved staff, structural analysis, and discussion of observations with staff members. After definition of common structures, strong and weak workflow steps were evaluated. With a common workflow structure in both hospitals, benefits of PACS were revealed in workflow steps related to image reporting, with simultaneous image access for ICU physicians and radiologists, archiving of images, and image and report distribution. However, PACS alone is not able to cover the complete process of 'radiography for intensive care', from the ordering of an image to the delivery of the final product (image plus report). Interference of electronic workflow with analogue process steps, such as paper-based ordering, reduces the potential benefits of PACS. In this regard, workflow modeling proved to be very helpful for the evaluation of complex work processes linking radiology and the ICU.

  10. Integrated electrocoagulation-electrooxidation process for the treatment of soluble coffee effluent: Optimization of COD degradation and operation time analysis.

    PubMed

    Ibarra-Taquez, Harold N; GilPavas, Edison; Blatchley, Ernest R; Gómez-García, Miguel-Ángel; Dobrosz-Gómez, Izabela

    2017-09-15

    Soluble coffee production generates wastewater containing complex mixtures of organic macromolecules. In this work, a sequential Electrocoagulation-Electrooxidation (EC-EO) process, using aluminum and graphite electrodes, was proposed as an alternative way for the treatment of soluble coffee effluent. Process operational parameters were optimized, achieving total decolorization, as well as 74% and 63.5% of COD and TOC removal, respectively. The integrated EC-EO process yielded a highly oxidized (AOS = 1.629) and biocompatible (BOD5/COD ≈ 0.6) effluent. The Molecular Weight Distribution (MWD) analysis showed that during the EC-EO process, EC effectively decomposed contaminants with molecular weight in the range of 10-30 kDa. In contrast, EO was quite efficient in mineralization of contaminants with molecular weight higher than 30 kDa. A kinetic analysis allowed determination of the time required to meet Colombian permissible discharge limits. Finally, a comprehensive operational cost analysis was performed. The integrated EC-EO process was demonstrated as an efficient alternative for the treatment of industrial effluents resulting from soluble coffee production. Copyright © 2017 Elsevier Ltd. All rights reserved.
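    The kinetic step can be sketched under a pseudo-first-order decay assumption (an assumption for illustration, not the authors' fitted model); the rate constant, initial COD, and discharge limit below are hypothetical:

```python
import math

k = 0.015          # 1/min, illustrative pseudo-first-order rate constant
cod_0 = 2000.0     # mg/L, assumed initial COD of the effluent
cod_limit = 600.0  # mg/L, hypothetical permissible discharge limit

# First-order decay: COD(t) = COD_0 * exp(-k t)
# => required treatment time: t = ln(COD_0 / COD_limit) / k
t_required = math.log(cod_0 / cod_limit) / k
```

    With these assumed numbers the operating time comes out to roughly 80 minutes; fitting k to measured COD-versus-time data is what the reported kinetic analysis would provide.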

  11. Development of computer program NAS3D using Vector processing for geometric nonlinear analysis of structures

    NASA Technical Reports Server (NTRS)

    Mangalgiri, P. D.; Prabhakaran, R.

    1986-01-01

    An algorithm for vectorized computation of stiffness matrices of an 8-noded isoparametric hexahedron element for geometric nonlinear analysis was developed. This was used in conjunction with the earlier 2-D program GAMNAS to develop the new program NAS3D for geometric nonlinear analysis. A conventional, modified Newton-Raphson process is used for the nonlinear analysis. New schemes for the computation of stiffness and strain energy release rates are presented. The organization of the program is explained, and results on four sample problems are given. A study of CPU times showed that savings by a factor of 11 to 13 were achieved when vectorized computation was used for the stiffness instead of the conventional scalar one. Finally, the scheme for inputting data is explained.
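    The modified Newton-Raphson process mentioned here factors the stiffness once and reuses it while only the residual is updated each iteration. A minimal sketch, with a 1-DOF nonlinear spring standing in for the finite element model (an illustrative assumption):

```python
# Internal force minus applied load for a hypothetical stiffening spring:
# f_int(u) = k*u + u^3, loaded with unit load.
def residual(u, load=1.0, k=2.0):
    return k * u + u**3 - load

u = 0.0
k_tangent = 2.0                 # tangent stiffness at u = 0, factored once
for _ in range(50):             # modified N-R: reuse the initial stiffness
    du = -residual(u) / k_tangent
    u += du
    if abs(du) < 1e-12:         # convergence on the displacement increment
        break
```

    Reusing the stiffness trades quadratic convergence for cheaper iterations, which is exactly the trade-off that made vectorized stiffness assembly the dominant cost worth optimizing.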

  12. Re-accumulation Scenarios Governing Final Global Shapes of Rubble-Pile Asteroids

    NASA Astrophysics Data System (ADS)

    Hestroffer, Daniel; Tanga, P.; Comito, C.; Paolicchi, P.; Walsh, K.; Richardson, D. C.; Cellino, A.

    2009-05-01

    Asteroids are known to have experienced catastrophic collisions since the formation of the solar system, which---depending on the impact energy---can produce a major disruption of the parent body and possibly give birth to asteroid families or binaries [1]. We present a general study of the final shape and dynamical state of asteroids produced by the re-accumulation process following a catastrophic disruption. Starting from a cloud of massive particles (mono-disperse spheres) with given density and velocity distributions, we analyse the final shape, spin state, and angular momentum of the system from numerical integration of an N-body gravitational system (code pkdgrav [2]). The re-accumulation process itself is relatively fast, with a dynamical time corresponding to the spin period of the final body (several hours). The final global shapes---which are described as tri-axial ellipsoids---exhibit slopes consistent with a degree of shear stress sustained by interlocking particles. We point out a few results: the final shapes are close to those of hydrostatic equilibrium for incompressible fluids, preferentially Maclaurin spheroids rather than Jacobi ellipsoids; for bodies closest to the sequence of hydrostatic equilibrium, there is a direct relation between spin, density, and outer shape, suggesting that the outer surface is nearly equipotential; the evolution of the shape during the process follows a track along a gradient of potential energy, without necessarily reaching its minimum; and the loose random packing of the particles implies a low friction angle and hence fluid-like behaviour, which extends the results of [3]. Future steps of our analysis will include further refinements of the model initial conditions and re-accumulation process, including impact shaking, realistic velocity distributions, and non-equal-sized elementary spheres. References: [1] Michel P. et al. 2001. Science 294, 1696. [2] Leinhardt Z.M. et al. 2000. Icarus 146, 133. [3] Richardson D.C. et al. 2005. Icarus 173, 349.
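
    The "direct relation between spin, density and outer shape" for bodies on the Maclaurin sequence is the classical incompressible-fluid result, which the abstract invokes but does not write out. A minimal Python sketch of that standard relation (background theory only, not code from the paper; pkdgrav itself is a full N-body code):

```python
import math

def maclaurin_omega2(e):
    """Normalized spin Omega^2 / (pi*G*rho) of a Maclaurin spheroid with
    meridional eccentricity e (0 < e < 1); the standard result for a
    rotating, self-gravitating incompressible fluid."""
    s = math.sqrt(1.0 - e * e)
    return (2.0 * s / e**3) * (3.0 - 2.0 * e * e) * math.asin(e) \
        - (6.0 / e**2) * (1.0 - e * e)

def spin_period_hours(e, rho):
    """Spin period (hours) at eccentricity e for bulk density rho (kg/m^3)."""
    G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
    omega = math.sqrt(maclaurin_omega2(e) * math.pi * G * rho)
    return 2.0 * math.pi / omega / 3600.0
```

    For an assumed rubble-pile density of 2000 kg/m^3 and a moderately flattened shape (e = 0.5), this gives a spin period of roughly 7 hours, consistent with the "several hours" dynamical time quoted above.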

  13. An expert system for natural language processing

    NASA Technical Reports Server (NTRS)

    Hennessy, John F.

    1988-01-01

    A solution to the natural language processing problem that uses a rule-based system, written in OPS5, to replace the traditional parsing method is proposed. The advantages of using a rule-based system are explored. Specifically, the extensibility of a rule-based solution is discussed, as well as the value of maintaining rules that function independently. Finally, the power of using semantics to supplement the syntactic analysis of a sentence is considered.

  14. Environmental Impact Analysis Process. Final Environmental Impact Statement. Air Force, Space Division Housing Project, San Pedro, California

    DTIC Science & Technology

    1986-07-24

    ...impact on the local community's use of these facilities. Released to the public July 24, 1986. FINAL ENVIRONMENTAL IMPACT STATEMENT, AIR FORCE, SPACE... [garbled acreage table for Alternatives G and H omitted] ...it may not be considered a permanent irreversible or irretrievable use of the land, the Proposed Action and alternatives (except Alternative G, which...

  15. Parameter estimation for terrain modeling from gradient data. [navigation system for Martian rover

    NASA Technical Reports Server (NTRS)

    Dangelo, K. R.

    1974-01-01

    A method is developed for modeling terrain surfaces for use on an unmanned Martian roving vehicle. The modeling procedure employs a two-step process which uses gradient as well as height data in order to improve the accuracy of the model's gradient. Least square approximation is used in order to stochastically determine the parameters which describe the modeled surface. A complete error analysis of the modeling procedure is included which determines the effect of instrumental measurement errors on the model's accuracy. Computer simulation is used as a means of testing the entire modeling process which includes the acquisition of data points, the two-step modeling process and the error analysis. Finally, to illustrate the procedure, a numerical example is included.
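
    The two-step fitting idea above can be sketched as a stacked least-squares problem: height observations constrain the surface values while gradient observations constrain its derivative, and both row types enter one overdetermined system. A toy one-dimensional version (a quadratic profile; the setup and names are illustrative, not from the report):

```python
def solve(A, b):
    """Solve the linear system A c = b by Gaussian elimination with
    partial pivoting (A is a small square matrix)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_profile(heights, slopes):
    """Least-squares fit of z(x) = c0 + c1*x + c2*x^2 to height samples
    (x, z) and slope samples (x, dz/dx), stacked into one system."""
    rows, rhs = [], []
    for x, z in heights:          # height rows: [1, x, x^2] -> z
        rows.append([1.0, x, x * x]); rhs.append(z)
    for x, g in slopes:           # gradient rows: [0, 1, 2x] -> dz/dx
        rows.append([0.0, 1.0, 2.0 * x]); rhs.append(g)
    # Normal equations: (A^T A) c = A^T b
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    Atb = [sum(r[i] * v for r, v in zip(rows, rhs)) for i in range(3)]
    return solve(AtA, Atb)
```

    Because the gradient rows directly constrain the derivative, the fitted model's slope accuracy improves over a height-only fit, which is the point the abstract makes.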

  16. Applying PCI in Combination Swivel Head Wrench

    NASA Astrophysics Data System (ADS)

    Chen, Tsang-Chiang; Yang, Chun-Ming; Hsu, Chang-Hsien; Hung, Hsiang-Wen

    2017-09-01

    Taiwan’s traditional industries face competitive pressure and shocks from globalization and environmental change; to remain sustainable, they must continuously improve production efficiency and process quality in order to stabilize their market position and gain higher market share. This study uses process capability indices to monitor the quality of a dual-use ratchet wrench: the key functional dimensions are identified, actual measurement data are collected, the Cpk process capability index is computed, and a Process Capability Analysis Chart model is drawn. Finally, the study examines the current state of this case and proposes improvement methods to raise overall quality and thereby strengthen the industry as a whole.
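
    The Cpk index used above has a standard definition: the distance from the process mean to the nearer specification limit, in units of three standard deviations. A minimal sketch (the specification limits and samples are illustrative, not the wrench's actual tolerances):

```python
import statistics

def cpk(mean, std, lsl, usl):
    """Process capability index Cpk: distance from the process mean to
    the nearer specification limit, in units of 3 standard deviations."""
    return min(usl - mean, mean - lsl) / (3.0 * std)

def cpk_from_data(samples, lsl, usl):
    """Estimate Cpk from measured samples (sample standard deviation)."""
    return cpk(statistics.fmean(samples), statistics.stdev(samples), lsl, usl)
```

    A Cpk of 1.33 or higher is a common rule-of-thumb target; values below 1.0 indicate that the process spread or centering cannot reliably hold the tolerance.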

  17. Train integrity detection risk analysis based on PRISM

    NASA Astrophysics Data System (ADS)

    Wen, Yuan

    2018-04-01

    The GNSS-based Train Integrity Monitoring System (TIMS) is an effective and low-cost detection scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by the external environment, such as the uncertainty of wireless communication channels, which may lead to communication and positioning failures. In order to guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in the GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.
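
    PRISM evaluates such risks by probabilistic model checking, e.g. computing a bounded-reachability property such as P=? [ F<=k "fail" ] over a Markov model. A toy discrete-time Markov chain version in Python (the states mirror the two communication processes named above, but the transition probabilities are invented placeholders, not values from the paper):

```python
# States: 0 = nominal, 1 = GNSS link degraded, 2 = on-board link degraded,
# 3 = integrity-detection failure (absorbing). Probabilities are
# illustrative placeholders only.
P = [
    [0.90, 0.05, 0.04, 0.01],
    [0.60, 0.30, 0.00, 0.10],
    [0.70, 0.00, 0.22, 0.08],
    [0.00, 0.00, 0.00, 1.00],
]

def failure_probability(steps):
    """Probability that detection has failed within `steps` transitions,
    starting from the nominal state (transient analysis of a DTMC, as a
    model checker like PRISM computes for P=? [ F<=k "fail" ])."""
    dist = [1.0, 0.0, 0.0, 0.0]
    for _ in range(steps):
        dist = [sum(dist[i] * P[i][j] for i in range(4)) for j in range(4)]
    return dist[3]
```

    Because the failure state is absorbing, the reachability probability is non-decreasing in the time bound, which is the quantity a risk analysis would compare against a tolerable hazard rate.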

  18. Individual Difference Relations in Psychometric and Experimental Cognitive Tasks. Final Report. No. 163.

    ERIC Educational Resources Information Center

    Carroll, John B.

    Fifty-five recent studies of individual differences (IDs) in elementary cognitive tasks (ECTs) are reviewed. Twenty-five data sets are examined, analyzed, or reanalyzed by factor analysis. The following promising dimensions are identified: basic perceptual processes, reaction and movement times, mental comparison and recognition tasks, retrieval…

  19. New 2012 precipitation frequency estimation analysis for Alaska : musings on data used and the final product.

    DOT National Transportation Integrated Search

    2013-06-01

    The major product of this study was a precipitation frequency atlas for the entire state of Alaska; this atlas is available at : http://dipper.nws.noaa.gov/hdsc/pfds/. The process of contributing to this study provided an opportunity to (1) evaluate ...

  20. GP3: GENEPIX POST-PROCESSING SCRIPT FOR THE AUTOMATED ANALYSIS OF RAW MICROARRAY IMAGE OUTPUT FILES. (R827402)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  1. VII data use analysis and processing (DUAP) : final project report (phase II) (October 2010) with revision (July 2011).

    DOT National Transportation Integrated Search

    2011-07-01

    This report covers several key subjects related to the generation of IntelliDriveSM probe vehicle data and the use of this data in applications of interest to state departments of transportation and local public transportation agencies. The evaluations co...

  2. The Relationship between Elementary Principals' Visionary Leadership and Students' Reading Performance

    ERIC Educational Resources Information Center

    Mora-Whitehurst, Rina

    2013-01-01

    This article focuses on elementary principals as instructional leaders, as well as public school initiatives and educational accountability in the United States. It presents the methodology, instrumentation, measures of academic achievement in Florida, data collection, and processing procedures. Finally, it presents data analysis, results of the…

  3. Cooperative spreading processes in multiplex networks.

    PubMed

    Wei, Xiang; Chen, Shihua; Wu, Xiaoqun; Ning, Di; Lu, Jun-An

    2016-06-01

    This study is concerned with the dynamic behaviors of epidemic spreading in multiplex networks. A model composed of two interacting complex networks is proposed to describe cooperative spreading processes, wherein the virus spreading in one layer can penetrate into the other to promote the spreading process. The global epidemic threshold of the model is smaller than the epidemic thresholds of the corresponding isolated networks. Thus, global epidemic onset arises in the interacting networks even though an epidemic onset does not arise in each isolated network. Simulations verify the analysis results and indicate that cooperative spreading processes in multiplex networks enhance the final infection fraction.
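
    The threshold claim above can be illustrated with a toy spectral argument: for SIS-type dynamics the epidemic threshold scales as 1/lambda_max of the contact operator, and coupling two nonnegative layers can only raise lambda_max, hence lower the threshold. The coupled matrix below is a crude stand-in for the full multiplex operator used in the literature, and the layers are invented 4-node examples:

```python
def spectral_radius(A, iters=500):
    """Largest eigenvalue of a nonnegative symmetric matrix by power
    iteration (Rayleigh quotient at the final iterate)."""
    n = len(A)
    x = [1.0] * n
    for _ in range(iters):
        y = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
        norm = max(abs(v) for v in y) or 1.0
        x = [v / norm for v in y]
    Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    return sum(x[i] * Ax[i] for i in range(n)) / sum(v * v for v in x)

# Two illustrative 4-node layers (symmetric adjacency matrices).
layer_a = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
layer_b = [[0, 0, 1, 1], [0, 0, 1, 0], [1, 1, 0, 0], [1, 0, 0, 0]]
coupled = [[layer_a[i][j] + layer_b[i][j] for j in range(4)] for i in range(4)]
```

    Since the Perron vector of one layer is nonnegative and the other layer's entries are nonnegative, the Rayleigh quotient shows lambda_max(coupled) >= max of the layer values, i.e. the interacting system goes epidemic at a lower infection rate than either isolated layer.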

  4. Comparative study of the bioconversion process using R-(+)- and S-(-)-limonene as substrates for Fusarium oxysporum 152B.

    PubMed

    Molina, Gustavo; Bution, Murillo L; Bicas, Juliano L; Dolder, Mary Anne Heidi; Pastore, Gláucia M

    2015-05-01

    This study compared the bioconversion process of S-(-)-limonene into limonene-1,2-diol with the already established biotransformation of R-(+)-limonene into α-terpineol using the same biocatalyst in both processes, Fusarium oxysporum 152B. The bioconversion of the S-(-)-isomer was tested on cell permeabilisation under anaerobic conditions and using a biphasic system. When submitted to permeabilisation trials, this biocatalyst has shown a relatively high resistance; still, no production of limonene-1,2-diol and a loss of activity of the biocatalyst were observed after intense cell treatment, indicating a complete loss of cell viability. Furthermore, the results showed that this process can be characterised as an aerobic system that was catalysed by limonene-1,2-epoxide hydrolase, had an intracellular nature and was cofactor-dependent because the final product was not detected by an anaerobic process. Finally, this is the first report to characterise the bioconversion of R-(+)- and S-(-)-limonene by cellular detoxification using ultra-structural analysis. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Quality assessment of crude and processed Arecae semen based on colorimeter and HPLC combined with chemometrics methods.

    PubMed

    Sun, Meng; Yan, Donghui; Yang, Xiaolu; Xue, Xingyang; Zhou, Sujuan; Liang, Shengwang; Wang, Shumei; Meng, Jiang

    2017-05-01

    Raw Arecae Semen, the seed of Areca catechu L., as well as Arecae Semen Tostum and Arecae semen carbonisata are traditionally processed by stir-baking for subsequent use in a variety of clinical applications. These three Arecae semen types, important Chinese herbal drugs, have been used in China and other Asian countries for thousands of years. In this study, the sensory technologies of a colorimeter and sensitive validated high-performance liquid chromatography with diode array detection were employed to discriminate raw Arecae semen and its processed drugs. The color parameters of the samples were determined by a colorimeter instrument CR-410. Moreover, the fingerprints of the four alkaloids of arecaidine, guvacine, arecoline and guvacoline were surveyed by high-performance liquid chromatography. Subsequently, Student's t test, the analysis of variance, fingerprint similarity analysis, hierarchical cluster analysis, principal component analysis, factor analysis and Pearson's correlation test were performed for final data analysis. The results obtained demonstrated a significant color change characteristic for components in raw Arecae semen and its processed drugs. Crude and processed Arecae semen could be determined based on colorimetry and high-performance liquid chromatography with a diode array detector coupled with chemometrics methods for a comprehensive quality evaluation. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Use of near-infrared spectroscopy and multipoint measurements for quality control of pharmaceutical drug products.

    PubMed

    Boiret, Mathieu; Chauchard, Fabien

    2017-01-01

    Near-infrared (NIR) spectroscopy is a non-destructive analytical technique that enables better understanding and optimization of pharmaceutical processes and final drug products. In-line use is often limited by acquisition speed and sampling area. This work focuses on performing multipoint measurements at high acquisition speed at the end of the manufacturing process, on a conveyor belt system, to control both the distribution and the content of active pharmaceutical ingredient within final drug products, i.e., tablets. A specially designed probe with several collection fibers was developed for this study. By measuring spectral and spatial information, it provides physical and chemical knowledge of the final drug product. The NIR probe was installed on a conveyor belt system that enables the analysis of large numbers of tablets. The use of these NIR multipoint measurement probes on a conveyor belt system provides an innovative method that has the potential to serve as a new paradigm for ensuring drug product quality at the end of the manufacturing process and as a new analytical method for a real-time release control strategy.

  7. Qualitative data analysis: conceptual and practical considerations.

    PubMed

    Liamputtong, Pranee

    2009-08-01

    Qualitative inquiry requires that collected data is organised in a meaningful way, and this is referred to as data analysis. Through analytic processes, researchers turn what can be voluminous data into understandable and insightful analysis. This paper sets out the different approaches that qualitative researchers can use to make sense of their data including thematic analysis, narrative analysis, discourse analysis and semiotic analysis and discusses the ways that qualitative researchers can analyse their data. I first discuss salient issues in performing qualitative data analysis, and then proceed to provide some suggestions on different methods of data analysis in qualitative research. Finally, I provide some discussion on the use of computer-assisted data analysis.

  8. Effects on Physiology and Performance of Wearing the Aviator NBC ensemble While Flying the UH-60 Helicopter Flight Simulator in a Controlled Heat Environment.

    DTIC Science & Technology

    1992-09-01

    and collecting and processing data. They were at the front line in interacting with the subjects and maintaining morale. They did an excellent job. They...second for 16 parameter channels, and the data were processed to produce a single root mean square (RMS) error value for each channel appropriate to...represented in the final analysis. Physiological data The physiological data on the VAX were processed by sampling them at 5-minute intervals throughout the

  9. Rheology of chocolate-flavored, reduced-calorie coating as a function of conching process.

    PubMed

    Medina-Torres, Luis; Sanchez-Olivares, Guadalupe; Nuñez-Ramirez, Diola Marina; Moreno, Leonardo; Calderas, Fausto

    2014-07-01

    Continuous flow and linear viscoelasticity rheology of chocolate coating is studied in this work using fat substitute gums (xanthan, GX). An alternative conching process, using a Rotor-Estator (RE) type impeller, is proposed. The objective is to obtain a chocolate coating material with improved flow properties. Characterization of the final material through particle size distribution (PSD), differential scanning calorimetry (DSC) and proximal analysis is reported. Particle size distribution of the final material showed less polydispersity and therefore, greater homogeneity; fusion points were also generated at around 20 °C assuming crystal type I (β'2) and II (α). Moreover, the final material exhibited crossover points (higher structure material), whereas the commercial brand chocolate used for comparison did not. The best conditions to produce the coating were maturing of 36 h and 35 °C, showing crossover points around 76 Pa and a 0.505 solids particle dispersion (average particle diameter of 0.364 μm), and a fusion point at 20.04 °C with a ΔHf of 1.40 (J/g). The results indicate that xanthan gum is a good substitute for cocoa butter and provides stability to the final product.

  10. Kinematic analysis and simulation of a substation inspection robot guided by magnetic sensor

    NASA Astrophysics Data System (ADS)

    Xiao, Peng; Luan, Yiqing; Wang, Haipeng; Li, Li; Li, Jianxiang

    2017-01-01

    In order to improve the performance of the magnetic navigation system used by a substation inspection robot, the kinematic characteristics are analyzed based on a simplified magnetic guiding system model, and a simulation is then executed to verify the soundness of the whole analysis procedure. Finally, some suggestions are drawn out that will be helpful in guiding the design of the inspection robot system in the future.

  11. Longitudinal study on the sources of Listeria monocytogenes contamination in cold-smoked salmon and its processing environment in Italy.

    PubMed

    Di Ciccio, Pierluigi; Meloni, Domenico; Festino, Anna Rita; Conter, Mauro; Zanardi, Emanuela; Ghidini, Sergio; Vergara, Alberto; Mazzette, Rina; Ianieri, Adriana

    2012-08-01

    The aim of the present study was to investigate the sources of Listeria monocytogenes contamination in a cold smoked salmon processing environment over a period of six years (2003-2008). A total of 170 samples of raw material, semi-processed, final product and processing surfaces at different production stages were tested for the presence of L. monocytogenes. The L. monocytogenes isolates were characterized by multiplex PCR for the analysis of virulence factors and for serogrouping. The routes of contamination over the six year period were traced by PFGE. L. monocytogenes was isolated from 24% of the raw salmon samples, 14% of the semi-processed products and 12% of the final products. Among the environmental samples, 16% were positive for L. monocytogenes. Serotyping yielded three serovars: 1/2a, 1/2b, 4b, with the majority belonging to serovars 1/2a (46%) and 1/2b (39%). PFGE yielded 14 profiles: two of them were repeatedly isolated in 2005-2006 and in 2007-2008 mainly from the processing environment and final products but also from raw materials. The results of this longitudinal study highlighted that contamination of smoked salmon occurs mainly during processing rather than originating from raw materials, even if raw fish can be a contamination source of the working environment. Molecular subtyping is critical for the identification of the contamination routes of L. monocytogenes and its niches into the production plant when control strategies must be implemented with the aim to reduce its prevalence during manufacturing. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. Modeling and experimental analysis of electrospinning bending region physics in determining fiber diameter for hydrophilic polymer solvent systems

    NASA Astrophysics Data System (ADS)

    Cai, Yunshen

    Electrospinning produces submicron fibers from a wide range of polymer/solvent systems that enable a variety of different applications. In the electrospinning process, a straight charged polymer/solvent jet is initially formed, followed by a circularly moving jet in the shape of a cone, called the bending region. The process physics in the bending region are difficult to study since the jet diameter cannot be measured directly, due to its rapid motion and small size (microns and smaller), and due to the complex coupling of multiple forces, mass transport, and changing jet geometry. Since the solutions studied are hydrophilic, they readily absorb ambient moisture. This thesis explores the role of the bending region in determining the resulting electrospun fiber diameter through a combined experimental and modeling analysis for a variety of hydrophilic polymer/solvent solutions. Electrospinning experiments were conducted over a broad range of operating conditions for 4 different polymer/solvent systems. Comparison of the final straight jet diameters to fiber diameters reveals that between 30% and 60% of jet thinning occurs in the bending region. These experiments also reveal that relative humidity significantly affects the electrospinning process and final fiber diameter, even for non-aqueous solutions. A model is developed to obtain insight into the bending region process physics. Important aspects include understanding the mass transport for non-aqueous hydrophilic jets (including solvent evaporation and water absorption at the jet surface, radial diffusion, and axial advection), and the coupling between the mass and force balances that determines the final fiber diameter. The absorption and evaporation physics is validated by evaporation experiments. The developed model predicts fiber diameter to within 8%, even though the solution properties and operating conditions that determine the net stretching forces and net evaporation rates vary over a large range.
Model analysis reveals how the net evaporation rate affects the jet length and net stretching force, both of which ultimately determine the fiber diameter. It is also shown that the primary impact of RH on the process is through occupation of the surface states that limit the solvent evaporation rate, rather than through the amount of water absorbed. Correlation functions between process conditions, solution properties, and the resulting fiber diameters are discussed.

  13. Computational Analysis and Simulation of Empathic Behaviors: A Survey of Empathy Modeling with Behavioral Signal Processing Framework

    PubMed Central

    Xiao, Bo; Imel, Zac E.; Georgiou, Panayiotis; Atkins, David C.; Narayanan, Shrikanth S.

    2017-01-01

    Empathy is an important psychological process that facilitates human communication and interaction. Enhancement of empathy has profound significance in a range of applications. In this paper, we review emerging directions of research on computational analysis of empathy expression and perception as well as empathic interactions, including their simulation. We summarize the work on empathic expression analysis by the targeted signal modalities (e.g., text, audio, facial expressions). We categorize empathy simulation studies into theory-based emotion space modeling or application-driven user and context modeling. We summarize challenges in computational study of empathy including conceptual framing and understanding of empathy, data availability, appropriate use and validation of machine learning techniques, and behavior signal processing. Finally, we propose a unified view of empathy computation, and offer a series of open problems for future research. PMID:27017830

  14. PLACE: an open-source python package for laboratory automation, control, and experimentation.

    PubMed

    Johnson, Jami L; Tom Wörden, Henrik; van Wijk, Kasper

    2015-02-01

    In modern laboratories, software can drive the full experimental process from data acquisition to storage, processing, and analysis. The automation of laboratory data acquisition is an important consideration for every laboratory. When implementing a laboratory automation scheme, important parameters include its reliability, time to implement, adaptability, and compatibility with software used at other stages of experimentation. In this article, we present an open-source, flexible, and extensible Python package for Laboratory Automation, Control, and Experimentation (PLACE). The package uses modular organization and clear design principles; therefore, it can be easily customized or expanded to meet the needs of diverse laboratories. We discuss the organization of PLACE, data-handling considerations, and then present an example using PLACE for laser-ultrasound experiments. Finally, we demonstrate the seamless transition to post-processing and analysis with Python through the development of an analysis module for data produced by PLACE automation. © 2014 Society for Laboratory Automation and Screening.
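
    The modular pattern the abstract describes can be illustrated with a toy scheme: each instrument is an independent module with a common interface, and one experiment loop drives all modules in sequence. The class and method names below are hypothetical stand-ins, not PLACE's actual API:

```python
# Toy modular-automation pattern: every instrument implements the same
# three-phase interface (configure, update per step, clean up).
class Instrument:
    def config(self, settings): ...
    def update(self, step): ...
    def cleanup(self): ...

class DummyStage(Instrument):
    def config(self, settings):
        self.start = settings["start"]; self.step_size = settings["step"]
    def update(self, step):
        return {"position": self.start + step * self.step_size}
    def cleanup(self): pass

class DummyScope(Instrument):
    def config(self, settings):
        self.gain = settings["gain"]
    def update(self, step):
        return {"trace_max": self.gain * (1.0 + 0.1 * step)}
    def cleanup(self): pass

def run_scan(modules, settings, steps):
    """Configure every module, collect one record per step, clean up."""
    for m in modules:
        m.config(settings[type(m).__name__])
    records = []
    for step in range(steps):
        record = {}
        for m in modules:
            record.update(m.update(step))
        records.append(record)
    for m in modules:
        m.cleanup()
    return records
```

    Because each module only sees its own settings and returns a plain dictionary, new instruments can be added without touching the experiment loop, which is the extensibility property the abstract emphasizes.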

  15. The use of process approach to base the need of automation of business processes in educational institutions

    NASA Astrophysics Data System (ADS)

    Frolova, M. A.; Razumova, T. A.

    2017-01-01

    This article is dedicated to the analysis of business processes in a comprehensive educational institution on the basis of the process approach. Decomposition of the processes under study is carried out by means of the IDEF0 methodology, the basic mechanisms and control actions are determined, and AS-IS diagrams for documentation support of educational service provision are developed. Disadvantages of the existing business processes are revealed on the basis of the diagrams, and a solution is proposed that increases the efficiency of the use of labor resources. The results of implementing the solution, which relies on software as the means of implementation, are presented as TO-BE diagrams. The analysis carried out on the basis of the diagrams led to the conclusion that the test-task database formation process for preparing students for the State Final Examination needs to be automated.

  16. Conceptual analysis of Physiology of vision in Ayurveda.

    PubMed

    Balakrishnan, Praveen; Ashwini, M J

    2014-07-01

    The process by which the world outside is seen is termed the visual process, or physiology of vision. There are three phases in this visual process: the phase of refraction of light, the phase of conversion of light energy into electrical impulses, and finally peripheral and central neurophysiology. With the advent of modern instruments, the step-by-step biochemical changes occurring at each level of the visual process have been deciphered. Many investigations have emerged to track these changes and help diagnose the exact nature of disease. Ayurveda has described this physiology of vision based on the functions of vata and pitta. The philosophical textbook of ayurveda, Tarka Sangraha, gives certain basic facts about the visual process. This article discusses the second and third phases of the visual process. A step-by-step analysis of the visual process through the lens of ayurveda, amalgamated with the philosophical basics from Tarka Sangraha, is carried out critically to generate a concrete idea of the physiology and thereby interpret the pathology on the grounds of ayurveda.

  17. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
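
    The variance decomposition underlying such an index can be illustrated with the law of total variance over (model choice, parameter): Var(Y) = Var_M(E[Y|M]) + E_M(Var[Y|M]). The sketch below is a simplified toy analogue, not the authors' exact formulation; the two model forms and the uniform parameter are invented for illustration:

```python
import random
import statistics

# Two competing "recharge" models (hypothetical forms, not the paper's),
# each with its own random parameter; equal prior model probabilities.
def model_1(p):  # e.g. linear precipitation-to-recharge conversion
    return 2.0 * p

def model_2(p):  # e.g. offset conversion
    return p + 1.0

def variance_decomposition(n=20000, seed=7):
    """Split total output variance over (model choice, parameter) into a
    between-model part and a within-model (parametric) part, per the law
    of total variance. The between-model share is a toy analogue of a
    process sensitivity index that accounts for both sources."""
    rng = random.Random(seed)
    outputs = {m: [f(rng.random()) for _ in range(n)]
               for m, f in (("m1", model_1), ("m2", model_2))}
    pooled = outputs["m1"] + outputs["m2"]
    total = statistics.pvariance(pooled)
    means = [statistics.fmean(v) for v in outputs.values()]
    within = statistics.fmean([statistics.pvariance(v) for v in outputs.values()])
    between = statistics.pvariance(means)  # Var over equally weighted models
    return total, between, within
```

    With equal model weights and population variances, the decomposition is exact: the pooled variance equals the average within-model variance plus the variance of the per-model means.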

  18. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  19. Making the Hubble Space Telescope servicing mission safe

    NASA Technical Reports Server (NTRS)

    Bahr, N. J.; Depalo, S. V.

    1992-01-01

    The implementation of the HST system safety program is detailed. Numerous safety analyses are conducted through various phases of design, test, and fabrication, and results are presented to NASA management for discussion during dedicated safety reviews. Attention is given to the system safety assessment and risk analysis methodologies used, i.e., hazard analysis, fault tree analysis, and failure modes and effects analysis, and to how they are coupled with engineering and test analysis for a 'synergistic picture' of the system. Some preliminary safety analysis results, showing the relationship between hazard identification, control or abatement, and finally control verification, are presented as examples of this safety process.
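
    Fault tree analysis, one of the methodologies named above, quantifies a top event by combining independent basic-event probabilities through AND/OR gates. A minimal sketch of the gate arithmetic (the tree structure and probabilities are invented for illustration, not HST values):

```python
def or_gate(*probs):
    """Probability that at least one independent input event occurs."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(*probs):
    """Probability that all independent input events occur."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# Top event "hazardous condition" occurs if an initiator is present AND
# a preventive control fails (either of two causes) AND the mitigative
# control fails. Illustrative numbers only.
p_top = and_gate(
    0.01,                  # initiating event present
    or_gate(1e-3, 5e-4),   # preventive control fails (two causes)
    0.02,                  # mitigative control fails
)
```

    Layering preventive and mitigative controls drives the top-event probability down multiplicatively, which is the hazard-control-verification logic the abstract describes.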

  20. Improved ethanol yield and reduced minimum ethanol selling price (MESP) by modifying low severity dilute acid pretreatment with deacetylation and mechanical refining: 2) Techno-economic analysis

    PubMed Central

    2012-01-01

    Background Our companion paper discussed the yield benefits achieved by integrating deacetylation, mechanical refining, and washing with low acid and low temperature pretreatment. To evaluate the impact of the modified process on the economic feasibility, a techno-economic analysis (TEA) was performed based on the experimental data presented in the companion paper. Results The cost benefits of dilute acid pretreatment technology combined with the process alternatives of deacetylation, mechanical refining, and pretreated solids washing were evaluated using cost benefit analysis within a conceptual modeling framework. Control cases were pretreated at much lower acid loadings and temperatures than those used in the NREL 2011 design case, resulting in much lower annual ethanol production. Therefore, the minimum ethanol selling prices (MESP) of the control cases were $0.41-$0.77 higher than the $2.15/gallon MESP of the design case. This increment is highly dependent on the carbohydrate content of the corn stover. However, if pretreatment was employed with either deacetylation or mechanical refining, the MESPs were reduced by $0.23-$0.30/gallon. Combining both steps could lower the MESP by a further $0.44-$0.54/gallon. Washing of the pretreated solids could also greatly improve the final ethanol yields. However, the large capital cost of the solid–liquid separation unit negatively influences the process economics. Finally, sensitivity analysis was performed to study the effect of the cost of the pretreatment reactor and the energy input for mechanical refining. A 50% reduction in the pretreatment reactor cost reduced the MESP of the entire conversion process by $0.11-$0.14/gallon, while a 10-fold increase in energy input for mechanical refining would increase the MESP by $0.07/gallon.
Conclusion Deacetylation and mechanical refining process options combined with low acid, low severity pretreatments show improvements in ethanol yields and calculated MESP for cellulosic ethanol production. PMID:22967479

  1. Research status of wave energy conversion (WEC) device of raft structure

    NASA Astrophysics Data System (ADS)

    Dong, Jianguo; Gao, Jingwei; Tao, Liang; Zheng, Peng

    2017-10-01

    This paper briefly describes the concept of wave energy generation and six typical conversion devices. For the raft structure, a detailed analysis is provided, from its development process to typical devices. Taking the design process and working principle of Pelamis as an example, the general principle of the raft structure is briefly described. A variety of raft-structure models are then introduced. Finally, the advantages and disadvantages, and the development trend, of the raft structure are pointed out.

  2. Combined Pressure, Temperature Contrast and Surface-Enhanced Separation of Carbon Dioxide for Post-Combustion Carbon Capture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhen; Wong, Michael; Gupta, Mayank

    The Rice University research team developed a hybrid carbon dioxide (CO2) absorption process combining absorber and stripper columns, using a high-surface-area ceramic foam gas-liquid contactor for enhanced mass transfer and utilizing waste heat for regeneration. This integrated absorber/desorber arrangement will reduce space requirements, an important factor for retrofitting existing coal-fired power plants with CO2 capture technology. As described in this report, an initial analysis was performed to estimate the technical and economic feasibility of the process. A one-dimensional (1D) CO2 absorption column was fabricated to measure the hydrodynamic and mass transfer characteristics of the ceramic foam. A bench-scale prototype was constructed to implement the complete CO2 separation process and tested to study various aspects of fluid flow in the process. A model was developed to simulate the two-dimensional (2D) fluid flow and optimize the CO2 capture process. Test results were used to develop a final techno-economic analysis and to identify the most appropriate absorbent and the optimum operating conditions that minimize capital and operating costs. Finally, a techno-economic study was performed to assess the feasibility of integrating the process into a 600 megawatt electric (MWe) coal-fired power plant. With process optimization, a cost of electricity (COE) of $82/MWh can be achieved using the integrated absorber/desorber CO2 capture technology, which is very close to DOE's target of no more than a 35% increase in COE with CCS. An environmental, health, and safety (EH&S) assessment of the capture process indicated no significant concern in terms of EH&S effects or legislative compliance.

  3. Performance Modeling and Cost Analysis of a Pilot-Scale Reverse Osmosis Process for the Final Purification of Olive Mill Wastewater

    PubMed Central

    Ochando-Pulido, Javier Miguel; Hodaifa, Gassan; Victor-Ortega, Maria Dolores; Martinez-Ferez, Antonio

    2013-01-01

    A secondary treatment for olive mill wastewater coming from factories working with the two-phase olive oil production process (OMW-2) has been set up on an industrial scale in an olive oil mill in the province of Jaén (Spain). The secondary treatment comprises Fenton-like oxidation followed by flocculation-sedimentation and filtration through olive stones. In this work, performance modeling and a preliminary cost analysis of a final reverse osmosis (RO) process were examined on a pilot scale for the ulterior purification of OMW-2, with the goal of closing the loop of the industrial production process. Increasing the turbulence over the RO membrane to a Reynolds number of 2.6 × 10⁴ reduced concentration polarization by 26.3%. A medium operating pressure (25 bar) should be chosen to achieve a significant steady-state permeate flux (21.1 L h⁻¹ m⁻²) and minimize membrane fouling, ensuring less than 14.7% flux drop and up to 90% feed recovery. Under these conditions, irreversible fouling below 0.08 L h⁻¹ m⁻² bar⁻¹ helped increase the longevity of the membrane and reduce the costs of the treatment. For 10 m³ day⁻¹ of OMW-2 on average, a required membrane area of 47.4 m² and total costs of 0.87 € m⁻³ for the RO process were estimated. PMID:24957058

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Looney, J.H.; Im, C.J.

    Under the sponsorship of DOE/METC, UCC Research completed a program in 1984 concerned with the development, testing, and manufacture of an ultra-clean coal-water mixture fuel using the UCC two-stage physical beneficiation and coal-water mixture preparation process. Several gallons of ultra-clean coal-water slurry produced at the UCC Research pilot facility were supplied to DOE/METC for combustion testing. The project concluded with the presentation of a conceptual design and economic analysis of an ultra-clean coal-water mixture processing facility sufficient in size to continuously supply fuel to a 100 MW turbine power generation system. Upon completion of the above program, it became evident that substantial technological and economic improvement could be realized through further laboratory and engineering investigation of the UCC two-stage physical beneficiation process. Therefore, as an extension of the previous work, the purpose of the present program was to define the relationship between the controlling technical parameters, coal-water slurry quality, and product price, and to determine the areas of improvement in the existing flow scheme, the associated cost savings, and the overall effect of these savings on the final coal-water slurry price. Contents of this report include: (1) introduction; (2) process refinement (improvement of the coal beneficiation process, different source coals and related cleanability, dispersants and other additives); (3) coal beneficiation and cost parametrics summary; (4) revised conceptual design and economic analysis; (5) operating and capital cost reduction; (6) conclusion; and (7) appendices. 24 figs., 12 tabs.

  5. Supercritical antisolvent precipitation of nimesulide: preliminary experiments.

    PubMed

    Moneghini, M; Perissutti, B; Vecchione, F; Kikic, I; Alessi, P; Cortesi, A; Princivalle, F

    2007-07-01

    The purpose of this preliminary study was to investigate the physico-chemical properties of nimesulide precipitated by a continuous supercritical antisolvent (SAS) process from different organic solvents (acetone, chloroform and dichloromethane) at 40 degrees C and 80, 85 and 88 bar, respectively. Scanning electron microscopy, differential scanning calorimetry, X-ray diffractometry and in vitro dissolution tests were employed to study how the technological process and the nature of the solvent affect the final product. SAS-processed nimesulide particles showed a dramatic morphological change in crystalline structure compared to native nimesulide, resulting in needle-shaped and thin rod-shaped crystals. The solid-state analysis showed that with chloroform or dichloromethane as the solvent the solid state of the drug remained substantially unchanged, whilst with acetone the applied method caused a transition from the starting form I to the metastable form II. To identify which process was responsible for this result, nimesulide was further precipitated from the same solvent by a conventional evaporation method (RV-sample). On the basis of this comparison, the solvent was found to be responsible for the reorganization into the different polymorphic form, and the potential of the SAS process to produce micron-sized needle-shaped particles with an enhanced dissolution rate compared to the pure drug was ascertained. Finally, the stability of nimesulide form II was monitored by DSC analysis over a period of 15 months.

  6. Fabrication of Titanium-Niobium-Zirconium-Tantalum Alloy (TNZT) Bioimplant Components with Controllable Porosity by Spark Plasma Sintering

    PubMed Central

    Rechtin, Jack; Torresani, Elisa; Ivanov, Eugene; Olevsky, Eugene

    2018-01-01

    Spark Plasma Sintering (SPS) is used to fabricate Titanium-Niobium-Zirconium-Tantalum alloy (TNZT) powder-based bioimplant components with controllable porosity. The developed densification maps show the effects of final SPS temperature, pressure, holding time, and initial particle size on final sample relative density. Correlations between the final sample density and mechanical properties of the fabricated TNZT components are also investigated and microstructural analysis of the processed material is conducted. A densification model is proposed and used to calculate the TNZT alloy creep activation energy. The obtained experimental data can be utilized for the optimized fabrication of TNZT components with specific microstructural and mechanical properties suitable for biomedical applications. PMID:29364165

  7. Site Suitability Analysis for Beekeeping via Analytical Hierarchy Process, Konya Example

    NASA Astrophysics Data System (ADS)

    Sarı, F.; Ceylan, D. A.

    2017-11-01

    Over the past decade, the importance of beekeeping activities has been emphasized in the fields of biodiversity, ecosystems, agriculture and human health. Efficient management and correct decisions about beekeeping activities thus seem essential to maintain and improve productivity and efficiency. Given this importance, and considering the economic contributions to rural areas, the need for a suitability analysis concept has been revealed. At this point, the integration of Multi-Criteria Decision Analysis (MCDA) and Geographical Information Systems (GIS) provides efficient solutions to the complex structure of the decision-making process for beekeeping activities. In this study, a site suitability analysis via the Analytical Hierarchy Process (AHP) was carried out for the city of Konya in Turkey. Slope, elevation, aspect, distance to water resources, roads and settlements, precipitation and flora criteria were included to determine suitability. The requirements, expectations and limitations of beekeeping activities were specified with the participation of experts and stakeholders. The final suitability map was validated with 117 existing beekeeping locations and Turkish Statistical Institute 2016 beekeeping statistics for Konya province.
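    The AHP step referred to above reduces expert pairwise comparisons to criterion weights via the principal eigenvector of the comparison matrix. A minimal sketch with invented judgments (the study's actual expert matrix is not reproduced here):

```python
# Hedged sketch of the AHP weighting step; comparison values are illustrative.
import numpy as np

# Saaty-style pairwise comparison of three criteria, e.g.
# (flora, distance-to-water, slope): A[i, j] = importance of i over j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Criterion weights = principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()

# Consistency ratio: CI = (lambda_max - n) / (n - 1), CR = CI / RI(n).
n = A.shape[0]
RI = {3: 0.58}[n]                      # Saaty's random index for n = 3
CR = (eigvals[k].real - n) / (n - 1) / RI

# Suitability of one raster cell = weighted sum of its normalized scores.
scores = np.array([0.8, 0.6, 0.9])     # hypothetical cell, scores in [0, 1]
suitability = float(w @ scores)
```

    A consistency ratio below 0.1 is conventionally taken to mean the expert judgments are acceptably consistent.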

  8. Design principles for data- and change-oriented organisational analysis in workplace health promotion.

    PubMed

    Inauen, A; Jenny, G J; Bauer, G F

    2012-06-01

    This article focuses on organizational analysis in workplace health promotion (WHP) projects. It shows how this analysis can be designed such that it provides rational data relevant to the further context-specific and goal-oriented planning of WHP and equally supports individual and organizational change processes implied by WHP. Design principles for organizational analysis were developed on the basis of a narrative review of the guiding principles of WHP interventions and organizational change as well as the scientific principles of data collection. Further, the practical experience of WHP consultants who routinely conduct organizational analysis was considered. This resulted in a framework with data-oriented and change-oriented design principles, addressing the following elements of organizational analysis in WHP: planning the overall procedure, data content, data-collection methods and information processing. Overall, the data-oriented design principles aim to produce valid, reliable and representative data, whereas the change-oriented design principles aim to promote motivation, coherence and a capacity for self-analysis. We expect that the simultaneous consideration of data- and change-oriented design principles for organizational analysis will strongly support the WHP process. We finally illustrate the applicability of the design principles to health promotion within a WHP case study.

  9. Development of a Premium Quality Plasma-derived IVIg (IQYMUNE®) Utilizing the Principles of Quality by Design-A Worked-through Case Study.

    PubMed

    Paolantonacci, Philippe; Appourchaux, Philippe; Claudel, Béatrice; Ollivier, Monique; Dennett, Richard; Siret, Laurent

    2018-01-01

    Polyvalent human normal immunoglobulins for intravenous use (IVIg), indicated for rare and often severe diseases, are complex plasma-derived protein preparations. A quality by design approach has been used to develop the Laboratoire Français du Fractionnement et des Biotechnologies new-generation IVIg, targeting a high level of purity to generate an enhanced safety profile while maintaining a high level of efficacy. A modular approach of quality by design was implemented, consisting of five consecutive steps to cover all the stages from the product design to the final product control strategy. A well-defined target product profile was translated into 27 product quality attributes that formed the basis of the process design. In parallel, a product risk analysis was conducted and identified 19 critical quality attributes among the product quality attributes. Process risk analysis was carried out to establish the links between process parameters and critical quality attributes. Twelve critical steps were identified, and for each of these steps a risk mitigation plan was established. Among the different process risk mitigation exercises, five process robustness studies were conducted at qualified small scale with a design of experiment approach. For each process step, critical process parameters were identified and, for each critical process parameter, proven acceptable ranges were established.
The quality risk management and risk mitigation outputs, including verification of proven acceptable ranges, were used to design the process verification exercise at industrial scale. Finally, the control strategy was established using a mix, or hybrid, of the traditional approach plus elements of the quality by design enhanced approach, as illustrated, to more robustly assign material and process controls and in order to securely meet product specifications. The advantages of this quality by design approach were improved process knowledge for industrial design and process validation, and a clear justification of the process and product specifications as a basis for the control strategy and future comparability exercises. © PDA, Inc. 2018.

  10. Certification-Based Process Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended and contains no flaws.

  11. Design and optimization of a chromatographic purification process for Streptococcus pneumoniae serotype 23F capsular polysaccharide by a Design of Experiments approach.

    PubMed

    Ji, Yu; Tian, Yang; Ahnfelt, Mattias; Sui, Lili

    2014-06-27

    Multivalent pneumococcal vaccines are used worldwide to protect human beings from pneumococcal diseases. In order to eliminate the toxic organic solvents used in the traditional vaccine purification process, an alternative chromatographic process for Streptococcus pneumoniae serotype 23F capsular polysaccharide (CPS) was proposed in this study. The strategy of Design of Experiments (DoE) was introduced into the process development to manage the complicated design procedure. An initial process analysis was carried out to review the whole flowchart, identify the critical factors of the chromatography through FMEA, and choose the flow-through mode based on the properties of the feed. A resin screening study then followed to select candidate resins. DoE was utilized to generate a resolution IV fractional factorial design to further compare the candidates and narrow down the design space. After Capto Adhere was selected, a Box-Behnken DoE was executed to model the process and characterize all effects of the factors on the responses. Finally, Monte Carlo simulation was used to optimize the process, test the chosen optimal conditions and define the control limits. The results of three scale-up runs at the set points verified the DoE and simulation predictions. The final results were well in accordance with the EU pharmacopeia requirements: protein/CPS (w/w) 1.08%; DNA/CPS (w/w) 0.61%; phosphorus content 3.1%; nitrogen content 0.315%; and methyl-pentose percentage 47.9%. Other tests of the final pure CPS also met the pharmacopeia specifications. This alternative chromatographic purification process for pneumococcal vaccine, free of toxic organic solvents, was successfully developed using the DoE approach and proved scalable, robust and suitable for large-scale manufacturing. Copyright © 2014 Elsevier B.V. All rights reserved.
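    The response-surface-plus-Monte-Carlo step described above can be sketched generically; the quadratic model, factor names and specification limit below are invented for illustration and are not the study's fitted model:

```python
# Hedged sketch: evaluate a fitted quadratic (Box-Behnken-style) model by
# Monte Carlo sampling over the design space to estimate the probability
# of meeting a specification. All numbers here are hypothetical.
import random

def response(load, ph):
    """Hypothetical fitted quadratic model for an impurity response
    (e.g. a protein/CPS ratio) in two coded factors on [-1, 1]."""
    return 1.5 + 0.4 * load - 0.3 * ph + 0.2 * load * ph + 0.5 * load ** 2

random.seed(0)
SPEC_LIMIT = 1.6          # hypothetical upper specification limit
N = 10_000
hits = 0
for _ in range(N):
    load = random.uniform(-1, 1)
    ph = random.uniform(-1, 1)
    if response(load, ph) <= SPEC_LIMIT:
        hits += 1
prob_in_spec = hits / N   # estimated probability of meeting the spec
```

    In practice the sampled region would then be narrowed to the sub-space where the in-spec probability is acceptably high, which defines the control limits.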

  12. Uncertainty Analysis for Peer Assessment: Oral Presentation Skills for Final Year Project

    ERIC Educational Resources Information Center

    Kim, Ho Sung

    2014-01-01

    Peer assessment plays an important role in engineering education for an active involvement in the assessment process, developing autonomy, enhancing reflection, and understanding of how to achieve the learning outcomes. Peer assessment uncertainty for oral presentation skills as part of the FYP assessment is studied. Validity and reliability for…

  13. Meaningful Learning and Summative Assessment in Geography Education: An Analysis in Secondary Education in the Netherlands

    ERIC Educational Resources Information Center

    Bijsterbosch, Erik; van der Schee, Joop; Kuiper, Wilmad

    2017-01-01

    Enhancing meaningful learning is an important aim in geography education. Also, assessment should reflect this aim. Both formative and summative assessments contribute to meaningful learning when more complex knowledge and cognitive processes are assessed. The internal school-based geography examinations of the final exam in pre-vocational…

  14. The SACE Review Panel's Final Report: Significant Flaws in the Analysis of Statistical Data

    ERIC Educational Resources Information Center

    Gregory, Kelvin

    2006-01-01

    The South Australian Certificate of Education (SACE) is a credential and formal qualification within the Australian Qualifications Framework. A recent review of the SACE outlined a number of recommendations for significant changes to this certificate. These recommendations were the result of a process that began with the review panel…

  15. On board processor development for NASA's spaceborne imaging radar with system-on-chip technology

    NASA Technical Reports Server (NTRS)

    Fang, Wai-Chi

    2004-01-01

    This paper reports the preliminary results of a study of an on-board spaceborne SAR processor. The study consists of a processing requirement analysis, functional specifications, and an implementation with system-on-chip technology. Finally, a minimum version of this on-board processor, designed for performance evaluation and partial demonstration, is illustrated.

  16. Electrocoagulation efficiency of the tannery effluent treatment using aluminium electrodes.

    PubMed

    Espinoza-Quiñones, Fernando R; Fornari, Marilda M T; Módenes, Aparecido N; Palácio, Soraya M; Trigueros, Daniela E G; Borba, Fernando H; Kroumov, Alexander D

    2009-01-01

    An electro-coagulation laboratory-scale system using aluminium plate electrodes was studied for the removal of organic and inorganic pollutants from the wastewater of the leather finishing industrial process. A fractional factorial 2³ experimental design was applied in order to obtain optimal values of the system state variables. The electro-coagulation (EC) process efficiency was assessed based on the chemical oxygen demand (COD), turbidity, total suspended solids, total fixed solids, total volatile solids, and chemical element concentration values. Analysis of variance (ANOVA) for final pH, total fixed solids (TFS), turbidity and Ca concentration confirmed the models predicted by the experimental design within a 95% confidence level. Reactor working conditions close to the real effluent pH (7.6) and an electrolysis time in the range of 30-45 min were enough to achieve cost-effective reduction factors for the concentrations of organic and inorganic pollutants. An appreciable improvement in COD removal efficiency was obtained with the electro-coagulation treatment. Finally, the technical-economic analysis results clearly show that the electro-coagulation method is very promising for industrial application.
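    Effect estimation in a two-level factorial design of the kind used above can be sketched as follows; the factor names, runs and responses are invented for illustration (a full 2³ design is shown for simplicity rather than the study's fraction):

```python
# Hedged sketch of main-effect estimation in a two-level factorial design.
# Coded factor levels (-1/+1) for hypothetical factors (current, pH, time)
# and an invented COD-removal response (%) for each run.
runs = [
    ((-1, -1, -1), 60.0), ((+1, -1, -1), 72.0),
    ((-1, +1, -1), 63.0), ((+1, +1, -1), 75.0),
    ((-1, -1, +1), 66.0), ((+1, -1, +1), 80.0),
    ((-1, +1, +1), 69.0), ((+1, +1, +1), 83.0),
]

def main_effect(runs, factor):
    """Average response at the +1 level minus average at the -1 level."""
    hi = [y for x, y in runs if x[factor] == +1]
    lo = [y for x, y in runs if x[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = [main_effect(runs, f) for f in range(3)]  # current, pH, time
```

    The largest effects identify which operating variables dominate the removal efficiency, which is what the ANOVA step then tests for significance.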

  17. Transitioning mine warfare to network-centric sensor analysis: future PMA technologies & capabilities

    NASA Astrophysics Data System (ADS)

    Stack, J. R.; Guthrie, R. S.; Cramer, M. A.

    2009-05-01

    The purpose of this paper is to outline the requisite technologies and enabling capabilities for network-centric sensor data analysis within the mine warfare community. The focus includes both automated processing and the traditional human-centric post-mission analysis (PMA) of tactical and environmental sensor data. This is motivated by first examining the high-level network-centric guidance and noting the breakdown in the process of distilling actionable requirements from this guidance. Examples are provided that illustrate the intuitive and substantial capability improvement resulting from processing sensor data jointly in a network-centric fashion. Several candidate technologies are introduced, including the ability to fully process multi-sensor data given only partial overlap in sensor coverage and the ability to incorporate target identification information in stride. Finally, the critical enabling capabilities are outlined, including open architecture, open business, and a concept of operations. This ability to process multi-sensor data in a network-centric fashion is a core enabler of the Navy's vision and will become a necessity with the increasing number of manned and unmanned sensor systems and the requirement for their simultaneous use.

  18. Double-strangeness production in {bar p}Xe annihilation at low energy in the DIANA chamber

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barmin, V.V.; Barylov, V.G.; Chernukha, S.F.

    1994-09-01

    From the analysis of about 10⁵ annihilations of antiprotons at rest and in flight (0.4-0.9 GeV/c) on Xe nuclei, new results are presented for the final state K⁰ₛSX (S = K⁰ₛ, K⁻, Λ, Σ⁰, Σ±). From these results and from earlier results for the final state K⁺SX, inclusive and semi-inclusive yields are determined, giving virtually all strange channels. The effective strangeness contents were deduced to be 5.5% at rest and 5.3% in flight. In addition to 14 events reported earlier in the final states K⁺K⁺X and K⁺K⁰ₛΛX, based on an analysis of 5.4 × 10⁵ annihilations, three new events were found in a further analysis of 10⁵ annihilations. The new events, together with the revised yields and momentum and mass distributions, are presented. The results confirm earlier results on the effective mass of ΛΛ and a determination of the preferable cascade process.

  19. Amino acid analysis

    NASA Technical Reports Server (NTRS)

    Winitz, M.; Graff, J. (Inventor)

    1974-01-01

    The process and apparatus for qualitative and quantitative analysis of the amino acid content of a biological sample are presented. The sample is deposited on a cation exchange resin and then is washed with suitable solvents. The amino acids and various cations and organic material with a basic function remain on the resin. The resin is eluted with an acid eluant, and the eluate containing the amino acids is transferred to a reaction vessel where the eluant is removed. Final analysis of the purified acylated amino acid esters is accomplished by gas-liquid chromatographic techniques.

  20. Automated Production of Movies on a Cluster of Computers

    NASA Technical Reports Server (NTRS)

    Nail, Jasper; Le, Duong; Nail, William L.; Nail, William

    2008-01-01

    A method of accelerating and facilitating the production of video and film motion-picture products, and software and generic designs of computer hardware to implement the method, are undergoing development. The method provides for automation of most of the tedious and repetitive tasks involved in editing and otherwise processing raw digitized imagery into final motion-picture products. The method was conceived to satisfy requirements, in industrial and scientific testing, for rapid processing of multiple streams of simultaneously captured raw video imagery into documentation in the form of edited video imagery and video-derived data products for technical review and analysis. In the production of such video technical documentation, unlike in the production of motion-picture products for entertainment, (1) it is often necessary to produce multiple video-derived data products, (2) there are usually no second chances to repeat acquisition of raw imagery, (3) it is often desired to produce final products within minutes rather than hours, days, or months, and (4) consistency and quality, rather than aesthetics, are the primary criteria for judging the products. In the present method, the workflow has both serial and parallel aspects: processing can begin before all the raw imagery has been acquired, each video stream can be subjected to different stages of processing simultaneously on different computers that may be grouped into one or more cluster(s), and the final product may consist of multiple video streams. Results of processing on different computers are shared, so that workers can collaborate effectively.

  1. Modelling the effect of the physical and chemical characteristics of the materials used as casing layers on the production parameters of Agaricus bisporus.

    PubMed

    Pardo, Arturo; Emilio Pardo, J; de Juan, J Arturo; Zied, Diego Cunha

    2010-12-01

    The aim of this research was to present the mathematical models obtained through correlations between the physical and chemical characteristics of casing layers and the final properties of the mushrooms. For this purpose, 8 casing layers were used: soil, soil + peat moss, soil + black peat, soil + composted pine bark, soil + coconut fibre pith, soil + wood fibre, soil + composted vine shoots and, finally, the casing of La Rioja subjected to the ruffling practice. It was concluded that the interplay between the fructification process and the physical and chemical characteristics of the casing alone is complicated. The variability observed in earliness could be explained in non-ruffled cultivation, while the variability observed for the mushroom weight and mushroom diameter variables could be explained in both ruffled and non-ruffled cultivations. Finally, the properties describing the final quality of the mushrooms were established by regression analysis.

  2. Improve the Efficiency of the Service Process as a Result of the Muda Ideology

    NASA Astrophysics Data System (ADS)

    Lorenc, Augustyn; Przyłuski, Krzysztof

    2018-06-01

    The aim of the paper was to improve service processes carried out by Knorr-Bremse Systemy Kolejowe Polska sp. z o.o., in particular to reduce unnecessary movements and physical effort by employees. The indirect goal was to find a solution in the simplest possible way using the Muda ideology. In order to improve the service process, process mapping was first executed for the devices to be repaired, i.e. brake callipers, electro-hydraulic units and auxiliary release units. The processes were assessed with a Pareto-Lorenz analysis in order to determine the most time-consuming process. Based on the obtained results, the use of a column crane with an articulated arm was proposed to facilitate the transfer of heavy components between areas. The final step was to assess the effectiveness of the proposed solution in terms of time savings. From the company's perspective, the results of the analysis are important: the proposed solution not only reduces total service time but also contributes to the crew's work comfort.

  3. Error analysis in stereo vision for location measurement of 3D point

    NASA Astrophysics Data System (ADS)

    Li, Yunting; Zhang, Jun; Tian, Jinwen

    2015-12-01

    Location measurement of a 3D point in stereo vision is subject to different sources of uncertainty that propagate to the final result. Most current methods of error analysis are based on an ideal intersection model that calculates the uncertainty region of the point location by intersecting the two pixel fields of view, which may produce loose bounds. Besides, only a few sources of error, such as pixel error or camera position, are taken into account in the analysis. In this paper we present a straightforward and practical method to estimate the location error that takes most sources of error into account. We sum up and simplify all the input errors into five parameters by a rotation transformation. Then we use the fast midpoint-method algorithm to deduce the mathematical relationships between the target point and the parameters. Thus, the expectation and covariance matrix of the 3D point location are obtained, which constitute the uncertainty region of the point location. Afterwards, we trace the error propagation of the primitive input errors in the stereo system throughout the whole analysis process, from primitive input errors to localization error. Our method has the same level of computational complexity as the state-of-the-art method. Finally, extensive experiments are performed to verify the performance of our methods.
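    The generic linearized error-propagation step behind such analyses maps an input covariance through the Jacobian of the triangulation function, Σ_X ≈ J Σ_p Jᵀ. A minimal sketch with a toy mapping (the paper's five-parameter model and midpoint triangulation are not reproduced here):

```python
# Hedged sketch of first-order error propagation through a triangulation-like
# function; the mapping and covariance values are illustrative only.
import numpy as np

def triangulate(p):
    """Toy mapping from input parameters p (e.g. two image coordinates and a
    baseline term) to a 3D point; stands in for the midpoint method."""
    u, v, b = p
    z = b / (u - v)          # depth from a disparity-like term
    return np.array([u * z, v * z, z])

def numerical_jacobian(f, p, eps=1e-6):
    """Forward-difference Jacobian of f at p."""
    p = np.asarray(p, dtype=float)
    f0 = f(p)
    J = np.zeros((f0.size, p.size))
    for j in range(p.size):
        dp = np.zeros_like(p)
        dp[j] = eps
        J[:, j] = (f(p + dp) - f0) / eps
    return J

p0 = np.array([2.0, 1.0, 0.5])
Sigma_p = np.diag([0.01, 0.01, 0.0001])   # input error covariance
J = numerical_jacobian(triangulate, p0)
Sigma_X = J @ Sigma_p @ J.T               # covariance of the 3D point
```

    The eigen-decomposition of Sigma_X then gives the axes of the uncertainty ellipsoid around the reconstructed point.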

  4. Recovery of Lithium from Geothermal Brine with Lithium–Aluminum Layered Double Hydroxide Chloride Sorbents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paranthaman, Mariappan Parans; Li, Ling; Luo, Jiaqi

    In this paper, we report a three-stage bench-scale column extraction process to selectively extract lithium chloride from geothermal brine. The goal of this research is to develop materials and processing technologies that improve the economics of lithium extraction and production from naturally occurring geothermal and other brines for energy storage applications. A novel sorbent, lithium aluminum layered double hydroxide chloride (LDH), is synthesized and characterized with X-ray powder diffraction, scanning electron microscopy, inductively coupled plasma optical emission spectrometry (ICP-OES), and thermogravimetric analysis. Each cycle of the column extraction process consists of three steps: (1) loading the sorbent with lithium chloride from brine; (2) intermediate washing to remove unwanted ions; (3) final washing to unload the lithium chloride. Our experimental analysis of eluate vs. feed concentrations of Li and competing ions demonstrates that the optimized sorbents can achieve a recovery efficiency of ~91% and possess excellent apparent Li selectivity of 47.8 over Na ions and 212 over K ions in the brine. Finally, the present work demonstrates that LDH is an effective sorbent for the selective extraction of lithium from brines, offering the possibility of effective application of lithium salts in lithium-ion batteries and a fundamental shift in the lithium supply chain.
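    An apparent selectivity (separation factor) of the kind quoted above is computed from eluate and feed concentrations; a minimal sketch with hypothetical concentrations chosen to land near the reported ~47.8 value for Li over Na:

```python
# Hedged sketch of an apparent-selectivity (separation factor) calculation;
# concentration values are illustrative, not the paper's measurements.

def separation_factor(c_li_out, c_x_out, c_li_feed, c_x_feed):
    """alpha(Li/X) = (Li/X ratio in eluate) / (Li/X ratio in feed)."""
    return (c_li_out / c_x_out) / (c_li_feed / c_x_feed)

# Hypothetical mg/L values for a Na-rich brine feed and the final eluate:
alpha_na = separation_factor(200.0, 418.4, 250.0, 25_000.0)
```

    A factor of 1 would mean no enrichment; values well above 1 indicate that the sorbent preferentially takes up and releases lithium relative to the competing ion.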

  5. Recovery of Lithium from Geothermal Brine with Lithium–Aluminum Layered Double Hydroxide Chloride Sorbents

    DOE PAGES

    Paranthaman, Mariappan Parans; Li, Ling; Luo, Jiaqi; ...

    2017-10-27

    In this paper, we report a three-stage bench-scale column extraction process to selectively extract lithium chloride from geothermal brine. The goal of this research is to develop materials and processing technologies to improve the economics of lithium extraction and production from naturally occurring geothermal and other brines for energy storage applications. A novel sorbent, lithium aluminum layered double hydroxide chloride (LDH), is synthesized and characterized with X-ray powder diffraction, scanning electron microscopy, inductively coupled plasma optical emission spectrometry (ICP-OES), and thermogravimetric analysis. Each cycle of the column extraction process consists of three steps: (1) loading the sorbent with lithium chloride from brine; (2) intermediate washing to remove unwanted ions; (3) final washing to unload the lithium chloride ions. Our experimental analysis of eluate vs feed concentrations of Li and competing ions demonstrates that our optimized sorbents can achieve a recovery efficiency of ~91% and possess excellent Li apparent selectivity of 47.8 over Na ions and 212 over K ions in the brine. Finally, the present work demonstrates that LDH is an effective sorbent for selective extraction of lithium from brines, thus offering the possibility of effective application of lithium salts in lithium-ion batteries, leading to a fundamental shift in the lithium supply chain.

  6. Processed red meat intake and risk of COPD: A systematic review and dose-response meta-analysis of prospective cohort studies.

    PubMed

    Salari-Moghaddam, Asma; Milajerdi, Alireza; Larijani, Bagher; Esmaillzadeh, Ahmad

    2018-06-01

    No earlier study has summarized findings from previous publications on processed red meat intake and risk of Chronic Obstructive Pulmonary Disease (COPD). This systematic review and meta-analysis was conducted to examine the association between processed red meat intake and COPD risk. We searched PubMed/Medline, ISI Web of Knowledge, Scopus, EMBASE and Google Scholar up to April 2018 to identify relevant studies. Prospective cohort studies that considered processed red meat as the exposure variable and COPD as the main outcome variable, or as one of the outcomes, were included in the systematic review. Publications in which hazard ratios (HRs) were reported as the effect size were included in the meta-analysis. Finally, five cohort studies were considered in this systematic review and meta-analysis. In total, 289,952 participants, including 8338 subjects with COPD, aged ≥27 years were included in the meta-analysis. These studies were from Sweden and the US. Linear dose-response meta-analysis revealed that each 50 g/week increase in processed red meat intake was associated with an 8% higher risk of COPD (HR: 1.08; 95% CI: 1.03, 1.13). There was evidence of a non-linear association between processed red meat intake and risk of COPD (P < 0.001). In this systematic review and meta-analysis, we found a significant positive association between processed red meat intake and risk of COPD. PROSPERO registration: CRD42017077971. Copyright © 2018 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
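    Under the log-linear trend model used in dose-response meta-analyses of this kind, the reported HR of 1.08 per 50 g/week can be rescaled to any other weekly increment by exponent scaling. A small sketch of that rescaling rule (the rule is a standard property of log-linear trends, assumed here rather than stated in the abstract):

```python
import math

def rescale_hr(hr_ref, ref_increment, increment):
    """Rescale a log-linear hazard ratio to a different dose increment:
    HR(x) = exp(log(hr_ref) * x / ref_increment)."""
    return math.exp(math.log(hr_ref) * increment / ref_increment)

hr_100g = rescale_hr(1.08, 50.0, 100.0)   # implied HR per +100 g/week
hr_25g = rescale_hr(1.08, 50.0, 25.0)     # implied HR per +25 g/week
```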

  7. Routine Analysis of all available GNSS Stations in Greece: Processing Scheme and Dissemination of Products and Data.

    NASA Astrophysics Data System (ADS)

    Papanikolaou, Xanthos; Anastasiou, Demitris; Marinou, Aggeliki; Zacharis, Vangelis; Paradissis, Demitris

    2015-04-01

    The Dionysos Satellite Observatory and the Higher Geodesy Laboratory of the National Technical University of Athens have developed an automated processing scheme to accommodate the daily analysis of all available continuous GNSS stations in Greece. At present, a total of approximately 150 regional stations are processed, divided into four subnetworks. GNSS data are processed routinely on a daily basis via the Bernese GNSS Software v5.0, developed by AIUB. Each network is solved twice within a period of 20 days: first using ultra-rapid products (with a latency of ~10 hours) and then using final products (with a latency of ~20 days). Carrier-phase observations are processed, modelled as double differences in the ionosphere-free linear combination. Analysis results include coordinate estimates, ionospheric corrections (TEC maps), and hourly tropospheric parameters (zenith delays). This processing scheme has proved helpful for investigating abrupt geophysical phenomena in near real time, as in the 2011 Santorini inflation episode and the 2014 Kephalonia earthquake events. All analysis results and products are made available via a dedicated webpage. Additionally, most of the GNSS data are hosted on a GSAC web platform, available to all interested parties. Data and results are available through the laboratory's dedicated website: http://dionysos.survey.ntua.gr/.
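    The ionosphere-free linear combination used in this processing removes the first-order ionospheric delay by combining the two GPS carrier frequencies. A minimal numerical sketch (the frequencies are the standard GPS L1/L2 values; the range and delay magnitudes are invented for illustration):

```python
# First-order ionospheric delay scales as 1/f^2, so the dual-frequency
# combination (f1^2*L1 - f2^2*L2)/(f1^2 - f2^2) cancels it exactly.
f1, f2 = 1575.42e6, 1227.60e6            # GPS L1/L2 carrier frequencies (Hz)

def iono_free(L1, L2):
    """Ionosphere-free linear combination of dual-frequency phase ranges (m)."""
    return (f1**2 * L1 - f2**2 * L2) / (f1**2 - f2**2)

rho, iono = 2.0e7, 5.0e17                 # true range (m), delay scale factor
L1_obs = rho - iono / f1**2               # carrier phase advance on L1
L2_obs = rho - iono / f2**2               # carrier phase advance on L2
recovered = iono_free(L1_obs, L2_obs)     # ionosphere-free range estimate
```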

  8. Numerical wind-tunnel simulation for Spar platform

    NASA Astrophysics Data System (ADS)

    Shen, Wenjun

    2017-05-01

    ANSYS Fluent software is used for the numerical wind-tunnel simulation of the upper module of a Spar platform. Design Modeler (DM), Meshing, FLUENT, and CFD-POST are used in the numerical calculation: DM is used to prepare and repair the geometric model, Meshing to mesh the model, Fluent to set up and solve the calculation conditions, and finally CFD-POST to post-process the results. The wind loads are obtained for different wind directions and incidence angles. Finally, the numerical results are compared with empirical formulas.

  9. Smith predictor-based multiple periodic disturbance compensation for long dead-time processes

    NASA Astrophysics Data System (ADS)

    Tan, Fang; Li, Han-Xiong; Shen, Ping

    2018-05-01

    Many disturbance rejection methods have been proposed for processes with dead time, but these existing methods may not work well under multiple periodic disturbances. In this paper, a multiple periodic disturbance rejection scheme is proposed under the Smith predictor configuration for processes with long dead time. One feedback loop is added to compensate for periodic disturbances while retaining the advantage of the Smith predictor. With information about the disturbance spectrum, the added feedback loop can remove multiple periodic disturbances effectively. Robust stability can easily be maintained, as shown by rigorous analysis. Finally, simulation examples demonstrate the effectiveness and robustness of the proposed method for processes with long dead time.
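    The Smith predictor configuration underlying this scheme can be illustrated with a discrete-time sketch: a PI controller acts on the measured output corrected by a delay-free internal model, so the dead time is removed from the feedback loop. The plant, gains, and horizon below are assumed for illustration, and the paper's added periodic-disturbance loop is not reproduced:

```python
import numpy as np

# Discrete first-order plant with long dead time: y[k+1] = a*y[k] + b*u[k-d].
# The PI controller sees the Smith-predictor feedback y + ym - ym_delayed,
# which (with a perfect model) removes the delay from the loop.
a, b, d = 0.9, 0.1, 20       # assumed plant; internal model taken as perfect
Kp, Ki = 2.0, 0.12           # hand-tuned PI gains (illustrative)
N = 300
y = np.zeros(N)              # plant output
ym = np.zeros(N)             # delay-free internal model output
u_hist = np.zeros(N)         # past controller outputs (for the plant delay)
integ = 0.0
for k in range(N - 1):
    ym_delayed = ym[k - d] if k >= d else 0.0
    fb = y[k] + ym[k] - ym_delayed        # Smith predictor feedback signal
    e = 1.0 - fb                          # unit step setpoint
    integ += Ki * e
    u_hist[k] = Kp * e + integ
    ym[k + 1] = a * ym[k] + b * u_hist[k]              # model: no delay
    u_delayed = u_hist[k - d] if k >= d else 0.0
    y[k + 1] = a * y[k] + b * u_delayed                # plant: delayed input
```

With the dead time outside the loop, the PI gains can be tuned for the delay-free model; the plant output then tracks the setpoint after one transport delay.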

  10. Users guide to E859 phoswich analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Costales, J.B.

    1992-11-30

    In this memo the authors describe the analysis path used to transform the phoswich data from raw data banks into cross sections suitable for publication. The primary purpose of this memo is not to document each analysis step in great detail but rather to point the reader to the Fortran code used and to point out the essential features of the analysis path. A flow chart summarizing the various steps performed to massage the data from beginning to end is given. In general, each step corresponds to a Fortran program written to perform that particular task. The automation of the data analysis has been kept purposefully minimal in order to ensure the highest quality of the final product. However, tools have been developed which ease the non-automated steps. There are two major parallel routes for the data analysis: data reduction, and acceptance determination using detailed GEANT Monte Carlo simulations. In this memo, the authors first describe the data reduction up to the point where PHAD banks (Pass 1-like banks) are created. They then describe the steps taken in the GEANT Monte Carlo route. Note that a detailed memo describing the methodology of the acceptance corrections has already been written; therefore the discussion of the acceptance determination is kept to a minimum and the reader is referred to that memo for further details. Finally, they describe the cross section formation process and how final spectra are extracted.

  11. A Review of PAT Strategies in Secondary Solid Oral Dosage Manufacturing of Small Molecules.

    PubMed

    Laske, Stephan; Paudel, Amrit; Scheibelhofer, Otto

    2017-03-01

    Pharmaceutical solid oral dosage product manufacturing is a well-established, yet rapidly evolving, area. To this end, process analytical technology (PAT) involves interdisciplinary and multivariate (chemical, physical, microbiological, and mathematical) methods for material (e.g., raw materials, intermediates, products) and process (e.g., temperature, pressure, throughput) analysis. This supports rational process modeling and enhanced control strategies for improved product quality and process efficiency. Given this breadth, it is often difficult to orient oneself and find the relevant, integrated aspects of the current state of the art. In particular, the link between fundamental research, in terms of sensor and control system development, and application at both laboratory and manufacturing scale is difficult to comprehend. This review compiles a nonexhaustive overview of current approaches from recognized academic and industrial practices of PAT, including screening, selection, and final implementation in solid oral dosage manufacturing, across a wide diversity of use cases. Finally, the authors attempt to extract a common consensus toward developing PAT application guidance for different unit operations of drug product manufacturing. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  12. New Ground Truth Capability from InSAR Time Series Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buckley, S; Vincent, P; Yang, D

    2005-07-13

    We demonstrate that next-generation interferometric synthetic aperture radar (InSAR) processing techniques applied to existing data provide rich InSAR ground truth content for exploitation in seismic source identification. InSAR time series analyses utilize tens of interferograms and can be implemented in different ways. In one such approach, conventional InSAR displacement maps are inverted in a final post-processing step. Alternatively, computationally intensive data reduction can be performed with specialized InSAR processing algorithms. The typical final result of these approaches is a synthesized set of cumulative displacement maps. Examples from our recent work demonstrate that these InSAR processing techniques can provide appealing new ground truth capabilities. We construct movies showing the areal and temporal evolution of deformation associated with previous nuclear tests. In other analyses, we extract time histories of centimeter-scale surface displacement associated with tunneling. The potential exists to identify millimeter-per-year surface movements when sufficient data exist for InSAR techniques to isolate and remove phase signatures associated with digital elevation model errors and the atmosphere.
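    The post-processing inversion mentioned above, turning a network of interferograms (each spanning two acquisition dates) into cumulative displacement per date, reduces to a linear least-squares problem. A minimal synthetic sketch of that inversion (the dates, pair network, and displacement values are invented):

```python
import numpy as np

# Each interferogram measures the displacement difference between two
# acquisition dates. Stacking the pairs gives a linear system whose
# least-squares solution is the cumulative displacement at each date,
# relative to the first date (a much-simplified SBAS-style inversion).
n_dates = 5
pairs = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)]
true_disp = np.array([0.0, 1.0, 2.5, 3.0, 4.2])        # mm, date 0 = reference
obs = np.array([true_disp[j] - true_disp[i] for i, j in pairs])

A = np.zeros((len(pairs), n_dates - 1))   # unknowns: displacement at dates 1..4
for r, (i, j) in enumerate(pairs):
    if j > 0:
        A[r, j - 1] += 1.0
    if i > 0:
        A[r, i - 1] -= 1.0
est = np.linalg.lstsq(A, obs, rcond=None)[0]   # recovered cumulative series
```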

  13. Downstream processing of stevioside and its potential applications.

    PubMed

    Puri, Munish; Sharma, Deepika; Tiwari, Ashok K

    2011-01-01

    Stevioside is a natural sweetener extracted from the leaves of Stevia rebaudiana Bertoni, which is commercially produced by conventional (chemical/physical) processes. This article gives an overview of the stevioside structure, the various analysis techniques, the new technologies required, and the advances achieved in recent years. An enzymatic process has been established by which the maximum efficacy and benefit of the process can be achieved; its efficiency is quite comparable to that of other physical and chemical methods. Finally, we believe that in the future, enzyme-based extraction will ensure more cost-effective availability of stevioside, thus assisting in the development of more food-based applications. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. Automatic theory generation from analyst text files using coherence networks

    NASA Astrophysics Data System (ADS)

    Shaffer, Steven C.

    2014-05-01

    This paper describes a three-phase process for extracting knowledge from analyst textual reports. Phase 1 involves performing natural language processing on the source text to extract subject-predicate-object triples. In phase 2, these triples are fed into a coherence network analysis process, using a genetic algorithm optimization. Finally, the highest-value subnetworks are processed into a semantic network graph for display. Initial work on a well-known data set (a Wikipedia article on Abraham Lincoln) has shown excellent results without any specific tuning. Next, we ran the process on the SYNthetic Counter-INsurgency (SYNCOIN) data set, developed at Penn State, yielding interesting and potentially useful results.
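    The phase-1 triple extraction can be caricatured with a rule-based sketch. A real implementation would use full NLP parsing; the tiny verb lexicon and pattern below are assumptions for illustration only:

```python
import re

def extract_triples(text):
    """Naive subject-verb-object triple extraction over simple sentences.
    A small verb lexicon stands in for real NLP parsing (illustrative)."""
    verbs = r"(was|were|is|are|led|signed|delivered|issued)"
    triples = []
    for sent in re.split(r"[.!?]", text):
        m = re.match(rf"(.+?)\s+{verbs}\s+(.+)", sent.strip())
        if m:
            triples.append((m.group(1), m.group(2), m.group(3)))
    return triples

triples = extract_triples(
    "Lincoln was the 16th president. "
    "Lincoln signed the Emancipation Proclamation.")
```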

  15. Quality-by-Design approach to monitor the operation of a batch bioreactor in an industrial avian vaccine manufacturing process.

    PubMed

    Largoni, Martina; Facco, Pierantonio; Bernini, Donatella; Bezzo, Fabrizio; Barolo, Massimiliano

    2015-10-10

    Monitoring batch bioreactors is a complex task, because several sources of variability can affect a running batch and impact the final product quality. Additionally, the product quality itself may not be measurable on line, but requires sampling and lab analysis taking several days to complete. In this study we show that, by using appropriate process analytical technology tools, the operation of an industrial batch bioreactor used in avian vaccine manufacturing can be effectively monitored as the batch progresses. Multivariate statistical models are built from historical databases of completed batches, and they are used to enable real-time identification of the variability sources, to reliably predict the final product quality, and to improve process understanding, paving the way to a reduction of final product rejections as well as of the product cycle time. It is also shown that the product quality "builds up" mainly during the first half of a batch, suggesting on the one hand that reducing the variability during this period is crucial, and on the other hand that the batch length can possibly be shortened. Overall, the study demonstrates that, by using a Quality-by-Design approach centered on the appropriate use of mathematical modeling, quality can indeed be built "by design" into the final product, whereas end-point product testing can progressively lose its importance in product manufacturing. Copyright © 2015 Elsevier B.V. All rights reserved.
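    Multivariate statistical models of this kind are commonly PCA models fitted to historical good batches, with new batches monitored through statistics such as Hotelling's T². A minimal numerical sketch on synthetic data (the variable count, number of components, and fault magnitude are invented; this is the generic technique, not the paper's specific model):

```python
import numpy as np

rng = np.random.default_rng(1)
# Historical database: 50 completed "good" batches x 6 process variables
# (synthetic, correlated); a real application would use plant data.
X = rng.normal(size=(50, 6)) @ rng.normal(size=(6, 6))
mu, sd = X.mean(axis=0), X.std(axis=0)
Z = (X - mu) / sd                       # autoscale the training data

# PCA via SVD, retaining 2 principal components
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
P = Vt[:2].T                            # loadings (6 x 2)
lam = S[:2] ** 2 / (len(Z) - 1)         # variance of each retained score

def hotelling_t2(x):
    """Hotelling T^2 of one batch observation under the PCA model."""
    t = ((x - mu) / sd) @ P             # project onto the model plane
    return float(np.sum(t ** 2 / lam))

t2_normal = hotelling_t2(X[0])          # a batch from the good database
t2_fault = hotelling_t2(X[0] + 10 * sd) # a grossly shifted batch
```

A control limit on T² (from an F-distribution or the empirical training values) would flag the shifted batch while the batch is still running.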

  16. Dose-Dependent Thresholds of 10-ns Electric Pulse Induced Plasma Membrane Disruption and Cytotoxicity in Multiple Cell Lines

    DTIC Science & Technology

    2011-01-01

    normalized to parallel controls. Flow Cytometry and Confocal Microscopy Upon exposure to 10-ns EP, aliquots of the cellular suspension were added to a tube...Survival data was processed and plotted using Grapher software (Golden Software, Golden, Colorado). Flow cytometry results were processed in C6 software...(Accuri Cytometers, Inc., Ann Arbor, MI) and FCSExpress software (DeNovo Software, Los Angeles, CA). Final analysis and presentation of flow cytometry

  17. Numerical analysis of laser ablation using the axisymmetric two-temperature model

    NASA Astrophysics Data System (ADS)

    Dziatkiewicz, Jolanta; Majchrzak, Ewa

    2018-01-01

    Laser ablation of an axisymmetric micro-domain is analyzed. To describe the thermal processes occurring in the micro-domain, the two-temperature hyperbolic model, supplemented by boundary and initial conditions, is used. This model takes into account the phase changes of the material (solid-liquid and liquid-vapour) and the ablation process. At the stage of numerical computations, the finite difference method with a staggered grid is used. In the final part, the results of the computations are shown.
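    The core of any two-temperature model is the energy exchange between a hot electron bath and the lattice. A minimal explicit finite-difference sketch of the parabolic 1-D version illustrates this; the hyperbolic terms, axisymmetry, phase changes, and ablation treated in the paper are omitted, and all material parameters are illustrative, not the paper's values:

```python
import numpy as np

# Explicit 1-D two-temperature sketch: electron temperature Te diffuses
# and exchanges energy with the lattice temperature Tl via coupling G.
nx, dx, dt = 50, 1e-8, 1e-15            # grid cells, spacing (m), step (s)
Ce, Cl = 2.1e4, 2.5e6                   # heat capacities (J m^-3 K^-1)
ke, G = 100.0, 2.6e17                   # e-conductivity, coupling (SI units)

Te = np.full(nx, 300.0)
Te[:5] = 3000.0                         # laser-heated surface layer
Tl = np.full(nx, 300.0)

for _ in range(2000):                   # 2 ps of simulated time
    lap = np.zeros(nx)
    lap[1:-1] = (Te[2:] - 2.0 * Te[1:-1] + Te[:-2]) / dx**2
    ex = G * (Te - Tl)                  # electron-lattice energy exchange
    Te = Te + dt * (ke * lap - ex) / Ce
    Tl = Tl + dt * ex / Cl
```

The explicit step is stable here because the diffusion number ke·dt/(Ce·dx²) ≈ 0.05 and the coupling number G·dt/Ce ≈ 0.01 are both well below their limits; the electrons cool toward the lattice on the Ce/G ≈ 0.1 ps timescale.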

  18. Proceedings on Combating the Unrestricted Warfare Threat: Integrating Strategy, Analysis, and Technology, 10-11 March 2008

    DTIC Science & Technology

    2008-03-01

    irregular struggle, and, finally, a protracted struggle that will last decades rather than years. How will this war evolve? It is hazardous to...There is no downside to engagement. It is not an act of paranoia or pessimism to engage Americans in the very real hazards that confront us. It...making process about what to do next. Because it is an undisciplined process, they work through about 100 options when there are only two: duck or

  19. Natural Language Processing as a Discipline at LLNL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Firpo, M A

    The field of Natural Language Processing (NLP) is described as it applies to the needs of LLNL in handling free-text. The state of the practice is outlined with the emphasis placed on two specific aspects of NLP: Information Extraction and Discourse Integration. A brief description is included of the NLP applications currently being used at LLNL. A gap analysis provides a look at where the technology needs work in order to meet the needs of LLNL. Finally, recommendations are made to meet these needs.

  20. Experimental and Numerical Analysis of Injection Molding of Ti-6Al-4V Powders for High-Performance Titanium Parts

    NASA Astrophysics Data System (ADS)

    Lin, Dongguo; Kang, Tae Gon; Han, Jun Sae; Park, Seong Jin; Chung, Seong Taek; Kwon, Young-Sam

    2018-02-01

    Both experimental and numerical analyses of powder injection molding (PIM) of Ti-6Al-4V alloy were performed to prepare a defect-free, high-performance Ti-6Al-4V part with low carbon/oxygen contents. The prepared feedstock was characterized with specific experiments to identify its viscosity, pressure-volume-temperature, and thermal properties in order to simulate its injection molding process. A finite-element-based numerical scheme was employed to simulate the thermomechanical process during injection molding. In addition, the injection molding, debinding, sintering, and hot isostatic pressing processes were performed in sequence to prepare the PIM parts. With optimized processing conditions, the PIM Ti-6Al-4V part exhibits excellent physical and mechanical properties, showing a final density of 99.8%, tensile strength of 973 MPa, and elongation of 16%.

  1. Refining and end use study of coal liquids II - linear programming analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowe, C.; Tam, S.

    1995-12-31

    A DOE-funded study is underway to determine the optimum refinery processing schemes for producing transportation fuels that will meet CAAA regulations from direct and indirect coal liquids. The study consists of three major parts: pilot plant testing of critical upgrading processes, linear programming analysis of different processing schemes, and engine emission testing of final products. Currently, fractions of a direct coal liquid produced from bituminous coal are being tested in a sequence of pilot plant upgrading processes; this work is discussed in a separate paper. The linear programming model, which is the subject of this paper, has been completed for the petroleum refinery and is being modified to handle coal liquids based on the pilot plant test results. Preliminary coal liquid evaluation studies indicate that, if a refinery expansion scenario is adopted, the marginal value of the coal liquid (over the base petroleum crude) is $3-4/bbl.

  2. Conducting requirements analyses for research using routinely collected health data: a model driven approach.

    PubMed

    de Lusignan, Simon; Cashman, Josephine; Poh, Norman; Michalakidis, Georgios; Mason, Aaron; Desombre, Terry; Krause, Paul

    2012-01-01

    Medical research increasingly requires the linkage of data from different sources. Conducting a requirements analysis for a new application is an established part of software engineering, but rarely reported in the biomedical literature, and no generic approaches have been published on how to link heterogeneous health data. We conducted a literature review, followed by a consensus process to define how requirements for research using multiple data sources might be modeled. We have developed a requirements analysis approach, i-ScheDULEs. The first components of the modeling process are indexing and creating a rich picture of the research study. Secondly, we developed a series of reference models of progressive complexity: data flow diagrams (DFDs) to define data requirements; Unified Modeling Language (UML) use case diagrams to capture study-specific and governance requirements; and finally, business process models, using Business Process Model and Notation (BPMN). These requirements and their associated models should become part of research study protocols.

  3. Pyrolysis process for the treatment of food waste.

    PubMed

    Grycová, Barbora; Koutník, Ivan; Pryszcz, Adrian

    2016-10-01

    Different waste materials were pyrolysed in a laboratory pyrolysis unit to a final temperature of 800°C with a 10 min hold at the final temperature. After the pyrolysis process, a mass balance of the resulting products, off-line analysis of the pyrolysis gas, and evaluation of the solid and liquid products were carried out. The gas from the pyrolysis experiments was captured discontinuously in Tedlar gas sampling bags, and selected components were analyzed by gas chromatography (methane, ethene, ethane, propane, propene, hydrogen, carbon monoxide and carbon dioxide). The highest measured hydrogen concentrations (WaCe 61 vol.%; WaPC 66 vol.%) were found at temperatures from 750 to 800°C. The heating values of the solid and liquid residues indicate the possibility of their further use for energy recovery. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Environmental friendly method for the extraction of coir fibre and isolation of nanofibre.

    PubMed

    Abraham, Eldho; Deepa, B; Pothen, L A; Cintil, J; Thomas, S; John, M J; Anandjiwala, R; Narine, S S

    2013-02-15

    The objective of this work was to develop an environmentally friendly method for the effective utilization of coir fibre by adopting steam pre-treatment. The retting of the coconut bunch causes serious environmental problems, which can be avoided by this method. Chemical characterization of the fibre at each processing stage confirmed the increase of cellulose content from the raw fibre (40%) to the final steam-treated fibre (93%). Morphological and dynamic light scattering analyses of the fibres at different processing stages revealed that the isolation of cellulose nanofibres occurs in the final step of the process, as an aqueous suspension. FT-IR and XRD analyses demonstrated that the treatments lead to the gradual removal of lignin and hemicelluloses from the fibres. The existence of a strong lignin-cellulose complex in the raw coir fibre is evidenced by its enhanced thermal stability. Steam explosion has proved to be a green method to expand the application areas of coir fibre. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. FE-simulation of hot forging with an integrated heat treatment with the objective of residual stress prediction

    NASA Astrophysics Data System (ADS)

    Behrens, Bernd-Arno; Chugreeva, Anna; Chugreev, Alexander

    2018-05-01

    Hot forming, as a coupled thermo-mechanical process, comprises numerous material phenomena with a corresponding impact on the material behavior during and after the forming process as well as on the final component performance. In this context, a realistic FE simulation requires reliable mathematical models as well as detailed thermo-mechanical material data. This paper presents experimental and numerical results focused on the FE-based simulation of a hot forging process with a subsequent heat treatment step, aiming at the prediction of the final mechanical properties and residual stress state in a forged component made of the low-alloy CrMo steel DIN 42CrMo4. For this purpose, hot forging experiments on a connecting rod geometry, with a corresponding metallographic analysis and X-ray residual stress measurements, have been carried out. For the coupled thermo-mechanical-metallurgical FE simulations, a user-defined material model based on the additive strain decomposition method, implemented in Simufact Forming via MSC.Marc solver features, has been used.

  6. Experimental and modeling approaches for food waste composting: a review.

    PubMed

    Li, Zhentong; Lu, Hongwei; Ren, Lixia; He, Li

    2013-10-01

    Composting has been used as a method to dispose of food waste (FW) and recycle organic matter to improve soil structure and fertility. Considering the significance of composting in FW treatment, many researchers have paid attention to how to improve FW composting efficiency, reduce operating cost, and mitigate the associated environmental damage. This review covers the breadth of FW composting studies: not only the various parameters that significantly affect the process and final results, but also a number of simulation approaches that are greatly instrumental in understanding the process mechanism and/or predicting results. Implications of many key ingredients for FW composting performance are also discussed. Prospects for effective laboratory experiments and computer-based simulation are finally examined, highlighting many areas demanding enhanced research effort, including the screening of multi-functional additives, volatile organic compound emission control, the necessity of modeling and post-modeling analysis, and the usefulness of developing more conjunctive AI-based process control techniques. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. The lexical development of children with hearing impairment and associated factors.

    PubMed

    Penna, Leticia Macedo; Lemos, Stela Maris Aguiar; Alves, Cláudia Regina Lindgren

    2014-01-01

    This study aimed at analyzing the association between the lexical development of children with hearing impairment and their psychosocial and socioeconomic characteristics and medical history. An analytic cross-sectional study was conducted in a Hearing Health Care Service. One hundred and ten children from 6 to 10 years old, using hearing aids and presenting hearing loss ranging from mild to profound, were evaluated. All children were given oral language, written language, and auditory perception tests. Parents answered a structured questionnaire collecting data on medical history and socioeconomic status, as well as questionnaires about the features of the family environment and psychosocial characteristics. Multivariate analysis was performed by logistic regression, with the initial model composed of variables with p<0.20 in the univariate analysis. In the final model, we adopted a significance level of 5%. The final model of the multivariate analysis showed an association between performance on the vocabulary test and the results of the phonemic discrimination test (OR=0.81; 95%CI 0.73-0.89). The results show the importance of stimulating auditory processing, particularly the phonemic discrimination skill, throughout the rehabilitation process of children with hearing impairment. This stimulation can enhance lexical development and minimize the metalanguage and learning difficulties often observed in these children.

  8. 40 CFR 61.134 - Standard: Naphthalene processing, final coolers, and final-cooler cooling towers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... POLLUTANTS National Emission Standard for Benzene Emissions from Coke By-Product Recovery Plants § 61.134... are allowed from naphthalene processing, final coolers and final-cooler cooling towers at coke by...

  9. 40 CFR 61.134 - Standard: Naphthalene processing, final coolers, and final-cooler cooling towers.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POLLUTANTS National Emission Standard for Benzene Emissions from Coke By-Product Recovery Plants § 61.134... are allowed from naphthalene processing, final coolers and final-cooler cooling towers at coke by...

  10. 40 CFR 61.134 - Standard: Naphthalene processing, final coolers, and final-cooler cooling towers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... POLLUTANTS National Emission Standard for Benzene Emissions from Coke By-Product Recovery Plants § 61.134... are allowed from naphthalene processing, final coolers and final-cooler cooling towers at coke by...

  11. 40 CFR 61.134 - Standard: Naphthalene processing, final coolers, and final-cooler cooling towers.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... POLLUTANTS National Emission Standard for Benzene Emissions from Coke By-Product Recovery Plants § 61.134... are allowed from naphthalene processing, final coolers and final-cooler cooling towers at coke by...

  12. Applying traditional signal processing techniques to social media exploitation for situational understanding

    NASA Astrophysics Data System (ADS)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.
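    The collective-filtering mitigation described above is commonly implemented as a truth-discovery iteration: source reliability and claim credibility are estimated jointly, so corroborated claims outrank single-source ones even when some sources distort. The toy data, update rule, and iteration count below are invented for illustration and are not Apollo's actual algorithm:

```python
# Toy truth discovery: claim credibility and source reliability are
# refined jointly; claims backed by several reliable sources win out.
reports = {                         # claim -> sources asserting it (toy)
    "explosion_at_A": {"s1", "s2", "s3"},
    "fire_at_B":      {"s1", "s2"},
    "rumor_at_C":     {"s4"},
}
sources = {s for ss in reports.values() for s in ss}
rel = {s: 0.5 for s in sources}     # initial source reliability
for _ in range(20):
    # credibility: normalized sum of supporting sources' reliability
    cred = {c: sum(rel[s] for s in ss) / len(sources)
            for c, ss in reports.items()}
    # reliability: average credibility of the claims a source asserts
    rel = {s: sum(cred[c] for c, ss in reports.items() if s in ss)
              / sum(1 for ss in reports.values() if s in ss)
           for s in sources}
    top = max(rel.values())
    rel = {s: r / top for s, r in rel.items()}   # renormalize each pass
ranking = sorted(cred, key=cred.get, reverse=True)
```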

  13. Data processing has major impact on the outcome of quantitative label-free LC-MS analysis.

    PubMed

    Chawade, Aakash; Sandin, Marianne; Teleman, Johan; Malmström, Johan; Levander, Fredrik

    2015-02-06

    High-throughput multiplexed protein quantification using mass spectrometry is steadily increasing in popularity, with the two major techniques being data-dependent acquisition (DDA) and targeted acquisition using selected reaction monitoring (SRM). However, both techniques involve extensive data processing, which can be performed by a multitude of different software solutions. Analysis of quantitative LC-MS/MS data is mainly performed in three major steps: processing of raw data, normalization, and statistical analysis. To evaluate the impact of data processing steps, we developed two new benchmark data sets, one each for DDA and SRM, with samples consisting of a long-range dilution series of synthetic peptides spiked in a total cell protein digest. The generated data were processed by eight different software workflows and three postprocessing steps. The results show that the choice of the raw data processing software and the postprocessing steps play an important role in the final outcome. Also, the linear dynamic range of the DDA data could be extended by an order of magnitude through feature alignment and a charge state merging algorithm proposed here. Furthermore, the benchmark data sets are made publicly available for further benchmarking and software developments.
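    The dilution-series benchmarking idea can be sketched numerically: the spiked amounts are known, so the linear dynamic range is the span of points whose measured/expected ratio stays roughly constant, shrinking where the signal saturates or hits the noise floor. The response curve and the 20% criterion below are invented for illustration, not the paper's evaluation protocol:

```python
import numpy as np

# Synthetic benchmark: a peptide spiked in a 2-fold dilution series; the
# measured intensity saturates at the top and has an additive noise floor.
expected = np.array([1, 2, 4, 8, 16, 32, 64, 128, 256, 512], float)
measured = np.minimum(expected, 400.0) + 0.8    # saturation + noise floor

def linear_points(expected, measured, tol=0.2):
    """Count dilution points whose measured/expected ratio stays within
    `tol` of the series' median ratio (a simple linearity criterion)."""
    ratio = measured / expected
    med = np.median(ratio)
    return int(np.sum(np.abs(ratio - med) <= tol * med))

n_linear = linear_points(expected, measured)    # points inside the range
```

Feature alignment and charge-state merging, as in the paper, effectively extend this count by rescuing points at the low end that would otherwise fall below the detection threshold.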

  14. [Processing and characterization of fried beans varieties Pinto 114, Suave 85 and Tórtola Inia].

    PubMed

    Hurtado, M L; Escobar, B; Estévez, A M

    2001-06-01

The objective of this study was to develop a snack product based on fried beans. For this purpose, three bean varieties were used: Pinto 114, Suave 85 and Tórtola Inia. The beans were treated with two soaking solutions, EDTA disodium salt and a mixture of NaOH/water, to determine whether they had any effect on the product's final quality. In addition, before frying, some grains were given a thermal treatment (blanched) while the others were left raw; this also had an effect on the final quality of the fried beans. Physical, chemical and sensory characteristics of the final fried products were determined. For the three bean varieties, the blanched products had higher water content, higher oil absorption, lower protein content and higher water activity. The soaking solutions had no effect on the quality of the manufactured products. The sensory analysis determined that the best treatment was NaOH/water with raw grain for Pinto 114 and Tórtola Inia, and EDTA with raw grain for Suave 85.

  15. Automated Meteor Detection by All-Sky Digital Camera Systems

    NASA Astrophysics Data System (ADS)

    Suk, Tomáš; Šimberová, Stanislava

    2017-12-01

    We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.
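The pre-processing and analysis stages of such a pipeline typically start from change detection between consecutive frames; a toy frame-differencing sketch (with invented frames and threshold, not the authors' actual method) illustrates the idea.

```python
# Frame-differencing sketch (invented frames/threshold): pixels whose
# brightness changes sharply between consecutive exposures are candidate
# points on a moving trace; the static star field cancels in the difference.
def detect_motion(prev_frame, curr_frame, threshold):
    """Frames are 2D lists of brightness values; returns changed (x, y) pixels."""
    hits = []
    for y, (row_p, row_c) in enumerate(zip(prev_frame, curr_frame)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) > threshold:
                hits.append((x, y))
    return hits

prev = [[0, 0, 0], [0, 0, 0]]
curr = [[0, 200, 0], [0, 0, 0]]
hits = detect_motion(prev, curr, 50)
```

A subsequent recognition step would then test whether the changed pixels across a time sequence line up into the linear trace characteristic of a meteor.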

  16. QFD analysis of RSRM aqueous cleaners

    NASA Technical Reports Server (NTRS)

    Marrs, Roy D.; Jones, Randy K.

    1995-01-01

    This paper presents a Quality Function Deployment (QFD) analysis of the final down-selected aqueous cleaners to be used on the Redesigned Solid Rocket Motor (RSRM) program. The new cleaner will replace solvent vapor degreasing. The RSRM Ozone Depleting Compound Elimination program is discontinuing the methyl chloroform vapor degreasing process and replacing it with a spray-in-air aqueous cleaning process. Previously, 15 cleaners were down-selected to two candidates by passing screening tests involving toxicity, flammability, cleaning efficiency, contaminant solubility, corrosion potential, cost, and bond strength. The two down-selected cleaners were further evaluated with more intensive testing and evaluated using QFD techniques to assess suitability for cleaning RSRM case and nozzle surfaces in preparation for adhesive bonding.

  17. System Modeling of Lunar Oxygen Production: Mass and Power Requirements

    NASA Technical Reports Server (NTRS)

    Steffen, Christopher J.; Freeh, Joshua E.; Linne, Diane L.; Faykus, Eric W.; Gallo, Christopher A.; Green, Robert D.

    2007-01-01

    A systems analysis tool for estimating the mass and power requirements for a lunar oxygen production facility is introduced. The individual modeling components involve the chemical processing and cryogenic storage subsystems needed to process a beneficiated regolith stream into liquid oxygen via ilmenite reduction. The power can be supplied from one of six different fission reactor-converter systems. A baseline system analysis, capable of producing 15 metric tons of oxygen per annum, is presented. The influence of reactor-converter choice was seen to have a small but measurable impact on the system configuration and performance. Finally, the mission concept of operations can have a substantial impact upon individual component size and power requirements.

  18. Function-based design process for an intelligent ground vehicle vision system

    NASA Astrophysics Data System (ADS)

    Nagel, Robert L.; Perry, Kenneth L.; Stone, Robert B.; McAdams, Daniel A.

    2010-10-01

    An engineering design framework for an autonomous ground vehicle vision system is discussed. We present both the conceptual and physical design by following the design process, development and testing of an intelligent ground vehicle vision system constructed for the 2008 Intelligent Ground Vehicle Competition. During conceptual design, the requirements for the vision system are explored via functional and process analysis considering the flows into the vehicle and the transformations of those flows. The conceptual design phase concludes with a vision system design that is modular in both hardware and software and is based on a laser range finder and camera for visual perception. During physical design, prototypes are developed and tested independently, following the modular interfaces identified during conceptual design. Prototype models, once functional, are implemented into the final design. The final vision system design uses a ray-casting algorithm to process camera and laser range finder data and identify potential paths. The ray-casting algorithm is a single thread of the robot's multithreaded application. Other threads control motion, provide feedback, and process sensory data. Once integrated, both hardware and software testing are performed on the robot. We discuss the robot's performance and the lessons learned.
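The ray-casting path identification described above can be illustrated with a toy routine over an occupancy grid; the grid layout, step size, and maximum range here are invented for the example and are not the competition vision system's parameters.

```python
import math

# Toy ray-casting over an occupancy grid (grid, angle, and step are invented):
# walk outward along a ray and report the free distance before an obstacle.
def cast_ray(grid, x, y, angle, step=0.1, max_range=10.0):
    """grid[row][col]: 1 = obstacle, 0 = free. Returns free distance along the ray."""
    dist = 0.0
    while dist < max_range:
        dist += step
        cx = int(x + dist * math.cos(angle))
        cy = int(y + dist * math.sin(angle))
        if cy < 0 or cy >= len(grid) or cx < 0 or cx >= len(grid[0]):
            return dist          # ray left the mapped area: treat as free
        if grid[cy][cx]:
            return dist          # ray hit an obstacle cell
    return max_range
```

Casting a fan of such rays and picking the heading with the largest free distance is one simple way to turn fused camera and laser range finder data into a candidate path.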

  19. Microbial Performance of Food Safety Control and Assurance Activities in a Fresh Produce Processing Sector Measured Using a Microbial Assessment Scheme and Statistical Modeling.

    PubMed

    Njage, Patrick Murigu Kamau; Sawe, Chemutai Tonui; Onyango, Cecilia Moraa; Habib, I; Njagi, Edmund Njeru; Aerts, Marc; Molenberghs, Geert

    2017-01-01

Current approaches such as inspections, audits, and end product testing cannot detect the distribution and dynamics of microbial contamination. Despite the implementation of current food safety management systems, foodborne outbreaks linked to fresh produce continue to be reported. A microbial assessment scheme and statistical modeling were used to systematically assess the microbial performance of core control and assurance activities in five Kenyan fresh produce processing and export companies. Generalized linear mixed models and correlated random-effects joint models for multivariate clustered data followed by empirical Bayes estimates enabled the analysis of the probability of contamination across critical sampling locations (CSLs) and factories as a random effect. Salmonella spp. and Listeria monocytogenes were not detected in the final products. However, none of the processors attained the maximum safety level for environmental samples. Escherichia coli was detected in five of the six CSLs, including the final product. Among the processing-environment samples, the hand or glove swabs of personnel revealed a higher level of predicted contamination with E. coli, and 80% of the factories were E. coli positive at this CSL. End products showed higher predicted probabilities of having the lowest level of food safety compared with raw materials. The final products were E. coli positive despite the raw materials being E. coli negative for 60% of the processors. There was a higher probability of contamination with coliforms in water at the inlet than in the final rinse water. Four (80%) of the five assessed processors had poor to unacceptable counts of Enterobacteriaceae on processing surfaces. Personnel-, equipment-, and product-related hygiene measures to improve the performance of preventive and intervention measures are recommended.

  20. Parental Involvement in the Educational Process of Children with Special Needs. An Annotated Report. [Final Report] and Research Analysis.

    ERIC Educational Resources Information Center

    Kreger, Robert D.

    A listing of resources is presented for parents of handicapped children. Resources are categorized according to the following types: programs, organizations, products, and additional resources; federally funded programs; national parent organizations; local parent training; and handbooks, book lists, and media. Entries are organized according to…

  1. Digital Avionics Information System (DAIS): Mid-1980's Maintenance Task Analysis. Final Report.

    ERIC Educational Resources Information Center

    Czuchry, Andrew J.; And Others

    The fundamental objective of the Digital Avionics Information System (DAIS) Life Cycle Cost (LCC) Study is to provide the Air Force with an enhanced in-house capability to incorporate LCC considerations during all stages of the system acquisition process. The purpose of this report is to describe the technical approach, results, and conclusions…

  2. 76 FR 34986 - Agency Procedure for Disclosure of Documents and Information in the Enforcement Process

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-15

    ... referrals to the Office of General Counsel from the Commission's Reports Analysis Division or Audit Division... hearing before the Commission prior to the Commission's adoption of a Final Audit Report,\\11\\ and (3) a.../2009/notice_2009-11.pdf . \\11\\ See Procedural Rules for Audit Hearings, 74 FR 33140 (July 10, 2009...

  3. 77 FR 6778 - Nez Perce-Clearwater National Forests; Idaho; Clear Creek Integrated Restoration Project

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-09

    ... Environmental Policy Act (NEPA) analysis and decision making process on the proposal so interested and affected members of the public may participate and contribute to the final decision. DATES: Comments concerning the...Forest Supervisor. 12730 Highway 12, Orofinio, ID 83544. The Decision To Be Made is whether to adopt the...

  4. New York State Educational Information System (NYSEIS) Systems Design. Volume I, Phase II. Final Report.

    ERIC Educational Resources Information Center

    Price Waterhouse and Co., New York, NY.

    This volume on Phase II of the New York State Educational Information System (NYSEIS) describes the Gross Systems Analysis and Design, which includes the general flow diagram and processing chart for each of the student, personnel, and financial subsystems. Volume II, Functional Specifications, includes input/output requirements and file…

  5. A Computer Evolution in Teaching Undergraduate Time Series

    ERIC Educational Resources Information Center

    Hodgess, Erin M.

    2004-01-01

In teaching undergraduate time series courses, we have used a mixture of various statistical packages. We have finally been able to teach all of the applied concepts within one statistical package: R. This article describes the process that we use to conduct a thorough analysis of a time series. An example with a data set is provided. We compare…

  6. Status and analysis of test standard for on-board charger

    NASA Astrophysics Data System (ADS)

    Hou, Shuai; Liu, Haiming; Jiang, Li; Chen, Xichen; Ma, Junjie; Zhao, Bing; Wu, Zaiyuan

    2018-05-01

This paper analyzes the test standards for on-board chargers (OBCs). During testing, we found several problems in the test methods and functional requirements, such as failure to keep up with the latest test standards, loose estimation criteria, and uncertainty and inconsistency in rectification. Finally, we put forward our own viewpoints on these problems.

  7. Therapeutic change in interaction: conversation analysis of a transforming sequence.

    PubMed

    Voutilainen, Liisa; Perakyla, Anssi; Ruusuvuori, Johanna

    2011-05-01

    A process of change within a single case of cognitive-constructivist therapy is analyzed by means of conversation analysis (CA). The focus is on a process of change in the sequences of interaction, which consist of the therapist's conclusion and the patient's response to it. In the conclusions, the therapist investigates and challenges the patient's tendency to transform her feelings of disappointment and anger into self-blame. Over the course of the therapy, the patient's responses to these conclusions are recast: from the patient first rejecting the conclusion, to then being ambivalent, and finally to agreeing with the therapist. On the basis of this case study, we suggest that an analysis that focuses on sequences of talk that are interactionally similar offers a sensitive method to investigate the manifestation of therapeutic change. It is suggested that this line of research can complement assimilation analysis and other methods of analyzing changes in a client's talk.

  8. Automotive manufacturing assessment system. Volume IV: engine manufacturing analysis. Final report Jun 77-Aug 78

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, T. Jr

Volume IV represents the results of one of four major study areas under the Automotive Manufacturing Assessment System (AMAS) sponsored by the DOT/Transportation Systems Center. AMAS was designed to assist in the evaluation of industry's capability to produce fuel-efficient vehicles. An analysis of automotive engine manufacturing was conducted in order to determine the impact of regulatory changes on tooling costs and the production process. The 351W CID V-8 engine at Ford's Windsor No. 1 Plant was the subject of the analysis. A review of plant history and its product is presented along with an analysis of manufacturing operations, including material and production flow, plant layout, machining and assembly processes, tooling, supporting facilities, inspection, service and repair. Four levels of product change intensity showing the impact on manufacturing methods and cost are also presented.

  9. Overview of Sensitivity Analysis and Shape Optimization for Complex Aerodynamic Configurations

    NASA Technical Reports Server (NTRS)

    Newman, Perry A.; Newman, James C., III; Barnwell, Richard W.; Taylor, Arthur C., III; Hou, Gene J.-W.

    1998-01-01

    This paper presents a brief overview of some of the more recent advances in steady aerodynamic shape-design sensitivity analysis and optimization, based on advanced computational fluid dynamics. The focus here is on those methods particularly well- suited to the study of geometrically complex configurations and their potentially complex associated flow physics. When nonlinear state equations are considered in the optimization process, difficulties are found in the application of sensitivity analysis. Some techniques for circumventing such difficulties are currently being explored and are included here. Attention is directed to methods that utilize automatic differentiation to obtain aerodynamic sensitivity derivatives for both complex configurations and complex flow physics. Various examples of shape-design sensitivity analysis for unstructured-grid computational fluid dynamics algorithms are demonstrated for different formulations of the sensitivity equations. Finally, the use of advanced, unstructured-grid computational fluid dynamics in multidisciplinary analyses and multidisciplinary sensitivity analyses within future optimization processes is recommended and encouraged.
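The automatic-differentiation idea the overview highlights can be shown in miniature with forward-mode dual numbers: every value carries its derivative, so the analysis code yields exact sensitivity derivatives without finite-difference step-size tuning. The function differentiated below is a stand-in for illustration, not a CFD residual.

```python
# Forward-mode automatic differentiation via dual numbers (toy example):
# each Dual carries a value and its derivative with respect to the design input.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val,
                    self.der * o.val + self.val * o.der)  # product rule
    __rmul__ = __mul__

def f(x):                  # stand-in "analysis code": f(x) = x^2 + 3x
    return x * x + 3 * x

x = Dual(2.0, 1.0)         # seed the derivative dx/dx = 1
y = f(x)                   # y.val = f(2), y.der = f'(2)
```

Running the unchanged analysis function on Dual inputs propagates derivatives alongside values, which is the mechanism that lets source-transformation or operator-overloading AD tools produce aerodynamic sensitivity derivatives for complex configurations.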

  10. Artificial neural networks for document analysis and recognition.

    PubMed

    Marinai, Simone; Gori, Marco; Soda, Giovanni; Society, Computer

    2005-01-01

Artificial neural networks have been extensively applied to document analysis and recognition. Most efforts have been devoted to the recognition of isolated handwritten and printed characters, with widely recognized successful results. However, many other document processing tasks, like preprocessing, layout analysis, character segmentation, word recognition, and signature verification, have been effectively addressed with very promising results. This paper surveys the most significant problems in the area of offline document image processing where connectionist-based approaches have been applied. Similarities and differences between approaches belonging to different categories are discussed. Particular emphasis is given to the crucial role of prior knowledge in the conception of both appropriate architectures and learning algorithms. Finally, the paper provides a critical analysis of the reviewed approaches and depicts the most promising research guidelines in the field. In particular, a second generation of connectionist-based models is foreseen, based on appropriate graphical representations of the learning environment.

  11. On Intelligent Design and Planning Method of Process Route Based on Gun Breech Machining Process

    NASA Astrophysics Data System (ADS)

    Hongzhi, Zhao; Jian, Zhang

    2018-03-01

This paper presents an approach to the intelligent design and planning of process routes, based on the gun breech machining process, to address several problems: the complexity of gun breech machining, the tedium of route design, and the long lead time of the traditional, hard-to-manage process route. Building on the gun breech machining process, an intelligent process-route design and planning system was developed using DEST and VC++. The system includes two functional modules: intelligent process-route design and process-route planning. The intelligent design module analyzes the gun breech machining process and codifies breech process knowledge to build the knowledge base and inference engine, which then output a gun breech process route intelligently. On the basis of the intelligent design module, the final process route is created, edited, and managed in the process-route planning module.
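The knowledge-base / inference-engine pairing described above can be illustrated with a toy forward-chaining rule engine: rules map known facts about a machining feature to process-route steps. The rules and facts below are invented placeholders, not the authors' breech process knowledge.

```python
# Toy rule engine (hypothetical rules): each rule is (condition facts, route).
# More specific rules are listed first so they match before generic ones.
RULES = [
    ({"hole", "high_precision"}, "drill -> ream -> hone"),
    ({"hole"}, "drill"),
    ({"flat_surface"}, "mill"),
]

def infer_route(facts):
    """Return the first route whose conditions are all satisfied by the facts."""
    for conditions, route in RULES:
        if conditions <= facts:   # subset test: all conditions hold
            return route
    return None
```

A production system of the kind the paper describes would hold many such rules in its knowledge base and let the inference engine chain them to assemble a complete route.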

  12. Acquisition Management for Systems-of-Systems: Exploratory Model Development and Experimentation

    DTIC Science & Technology

    2009-04-22

outputs of the Requirements Development and Logical Analysis processes into alternative design solutions and selects a final design solution. Decision Analysis provides the basis for evaluating and selecting alternatives when decisions need to be made. Implementation yields the lowest-level system... [Figure: two 3x3 dependency matrices illustrating a) an example SoS and b) the model structure for the example SoS]

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

The Benchmarks of Global Clean Energy Manufacturing will help policymakers and industry gain deeper understanding of global manufacturing of clean energy technologies. Increased knowledge of the product supply chains can inform decisions related to manufacturing facilities for extracting and processing raw materials, making the array of required subcomponents, and assembling and shipping the final product. This brochure summarizes key findings from the analysis and includes important figures from the report. The report was prepared by the Clean Energy Manufacturing Analysis Center (CEMAC) analysts at the U.S. Department of Energy's National Renewable Energy Laboratory.

  14. Performance Analysis of Visible Light Communication Using CMOS Sensors.

    PubMed

    Do, Trong-Hop; Yoo, Myungsik

    2016-02-29

    This paper elucidates the fundamentals of visible light communication systems that use the rolling shutter mechanism of CMOS sensors. All related information involving different subjects, such as photometry, camera operation, photography and image processing, are studied in tandem to explain the system. Then, the system performance is analyzed with respect to signal quality and data rate. To this end, a measure of signal quality, the signal to interference plus noise ratio (SINR), is formulated. Finally, a simulation is conducted to verify the analysis.
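The SINR measure used in the analysis reduces, in its generic form, to signal power over interference-plus-noise power, usually quoted in dB. The sketch below uses that textbook form with illustrative numbers, not the paper's rolling-shutter-specific interference terms.

```python
import math

# Generic SINR in dB (textbook form; the power values are illustrative).
def sinr_db(signal_power, interference_power, noise_power):
    """SINR = S / (I + N), converted to decibels."""
    return 10 * math.log10(signal_power / (interference_power + noise_power))
```

In the paper's setting, the interference term would capture inter-symbol effects of the rolling shutter, and the achievable data rate follows from how much SINR each exposure configuration leaves.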

  15. Performance Analysis of Visible Light Communication Using CMOS Sensors

    PubMed Central

    Do, Trong-Hop; Yoo, Myungsik

    2016-01-01

    This paper elucidates the fundamentals of visible light communication systems that use the rolling shutter mechanism of CMOS sensors. All related information involving different subjects, such as photometry, camera operation, photography and image processing, are studied in tandem to explain the system. Then, the system performance is analyzed with respect to signal quality and data rate. To this end, a measure of signal quality, the signal to interference plus noise ratio (SINR), is formulated. Finally, a simulation is conducted to verify the analysis. PMID:26938535

  16. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    PubMed

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between current possibilities in process supervision and control of pharmaceutical production processes and their current application in industrial manufacturing. With rigid approval practices based on standard operating procedures, adaptation of production reactors to the state of the art was more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing: the cultivation of genetically modified Escherichia coli bacteria.

  17. 40 CFR 61.134 - Standard: Naphthalene processing, final coolers, and final-cooler cooling towers.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... POLLUTANTS National Emission Standard for Benzene Emissions from Coke By-Product Recovery Plants § 61.134... are allowed from naphthalene processing, final coolers and final-cooler cooling towers at coke by-product recovery plants. ...

  18. A detailed comparison of analysis processes for MCC-IMS data in disease classification—Automated methods can replace manual peak annotations

    PubMed Central

    Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven

    2017-01-01

Motivation: Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi-capillary column ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention by human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process. Method: We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting. We determine which specific algorithmic combinations lead to high AUC values in disease classifications across the different medical application scenarios. Results: The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step, and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high-throughput use of the technology. PMID:28910313
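The Local Maxima peak-identification step named in the abstract can be sketched simply: a grid point is a peak if its intensity exceeds a threshold and all of its neighbours. Real MCC-IMS measurements are 2D maps over retention time and inverse drift time; the grid and threshold below are invented for illustration.

```python
# Local-maxima peak detection sketch (illustrative grid and threshold):
# a cell is a peak if it clears the threshold and strictly dominates its
# 8-neighbourhood.
def local_maxima(grid, threshold):
    peaks = []
    rows, cols = len(grid), len(grid[0])
    for r in range(rows):
        for c in range(cols):
            v = grid[r][c]
            if v < threshold:
                continue
            neighbours = [grid[rr][cc]
                          for rr in range(max(0, r - 1), min(rows, r + 2))
                          for cc in range(max(0, c - 1), min(cols, c + 2))
                          if (rr, cc) != (r, c)]
            if all(v > n for n in neighbours):
                peaks.append((r, c))
    return peaks
```

In the full pipeline, detected peaks from many measurements would then be clustered (e.g., with EM or DBSCAN) into consensus peak positions before classification.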

  19. Prevalence of Campylobacter and Salmonella species on farm, after transport, and at processing in specialty market poultry.

    PubMed

    McCrea, B A; Tonooka, K H; VanWorth, C; Boggs, C L; Atwill, E R; Schrader, J S

    2006-01-01

    The prevalence of Campylobacter and Salmonella spp. was determined from live bird to prepackaged carcass for 3 flocks from each of 6 types of California niche-market poultry. Commodities sampled included squab, quail, guinea fowl, duck, poussin (young chicken), and free-range broiler chickens. Campylobacter on-farm prevalence was lowest for squab, followed by guinea fowl, duck, quail, and free-range chickens. Poussin had the highest prevalence of Campylobacter. No Salmonella was isolated from guinea fowl or quail flocks. A few positive samples were observed in duck and squab, predominately of S. Typhimurium. Free-range and poussin chickens had the highest prevalence of Salmonella. Post-transport prevalence was not significantly higher than on-farm, except in free-range flocks, where a higher prevalence of positive chickens was found after 6 to 8 h holding before processing. In most cases, the prevalence of Campylobacter- and Salmonella-positive birds was lower on the final product than on-farm or during processing. Odds ratio analysis indicated that the risk of a positive final product carcass was not increased by the prevalence of a positive sample at an upstream point in the processing line, or by on-farm prevalence (i.e., none of the common sampling stations among the 6 commodities could be acknowledged as critical control points). This suggests that hazard analysis critical control point plans for Campylobacter and Salmonella control in the niche-market poultry commodities will need to be specifically determined for each species and each processing facility.

  20. Creating the learning situation to promote student deep learning: Data analysis and application case

    NASA Astrophysics Data System (ADS)

    Guo, Yuanyuan; Wu, Shaoyan

    2017-05-01

How to lead students to deeper learning and cultivate innovative engineering talent needs to be studied in higher engineering education. In this study, through survey data analysis and theoretical research, we discuss the correlation between teaching methods, learning motivation, and learning methods. We find that students develop different motivation orientations according to their perception of the teaching methods used in engineering education, and that this affects their choice of learning methods. As a result, creating learning situations is critical to leading students to deeper learning. Finally, we analyze the process of creating learning situations in the teaching of the "bidding and contract management" workshop, in which teachers use student-centered teaching to lead students to deeper study. By studying the factors that influence the deep learning process and building teaching situations that promote deep learning, this paper provides a meaningful reference for enhancing students' learning quality, teachers' teaching quality, and the quality of innovative talent.

  1. Numerical analysis of the heating phase and densification mechanism in polymers selective laser melting process

    NASA Astrophysics Data System (ADS)

    Mokrane, Aoulaiche; Boutaous, M'hamed; Xin, Shihe

    2018-05-01

The aim of this work is to model the SLS process at the scale of the part in a PA12 polymer powder bed. The powder bed is considered a continuous medium with homogenized properties, while the multiple physical phenomena occurring during the process are captured and the influence of process parameters on the quality of the final product is studied. A thermal model based on an enthalpy approach is presented, with details on the multiphysical couplings that govern the thermal history (laser absorption, melting, coalescence, densification, volume shrinkage) and on the numerical implementation using the finite-volume (FV) method. The simulations were carried out in 3D with an in-house FORTRAN code. After validation of the model against results from the literature, a parametric analysis is proposed. Original results, such as the densification process and the thermal history of the material as it evolves from a granular solid to a homogeneous melted state, are discussed with regard to the physical phenomena involved.
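The enthalpy approach mentioned above can be sketched in 1D: march enthalpy explicitly from conduction fluxes, then recover temperature through a melting plateau. The material values and the piecewise enthalpy-temperature relation below are invented for illustration and are far simpler than the authors' coupled 3D model.

```python
# 1D explicit enthalpy-method sketch (invented material constants):
# update enthalpy from conduction, then map enthalpy back to temperature.
def step_enthalpy(H, T, k, rho_c, dx, dt, T_melt, latent):
    """One explicit update of volumetric enthalpy H on a 1D grid."""
    n = len(H)
    H_new = H[:]
    for i in range(1, n - 1):
        flux = k * (T[i + 1] - 2 * T[i] + T[i - 1]) / dx**2
        H_new[i] = H[i] + dt * flux
    # Recover temperature: sensible heating below melt, plateau across latent heat.
    T_new = []
    for h in H_new:
        h_melt = rho_c * T_melt
        if h < h_melt:
            T_new.append(h / rho_c)               # solid, sensible regime
        elif h < h_melt + latent:
            T_new.append(T_melt)                  # melting plateau
        else:
            T_new.append((h - latent) / rho_c)    # fully melted
    return H_new, T_new
```

The appeal of the enthalpy formulation is visible even in this sketch: the phase change needs no explicit front tracking, because the latent-heat plateau emerges from the enthalpy-to-temperature mapping.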

  2. Synthesis of alumina ceramic encapsulation for self-healing materials on thermal barrier coating

    NASA Astrophysics Data System (ADS)

    Golim, O. P.; Prastomo, N.; Izzudin, H.; Hastuty, S.; Sundawa, R.; Sugiarti, E.; Thosin, K. A. Z.

    2018-03-01

Durability of a Thermal Barrier Coating (TBC) can be optimized by inducing self-healing capabilities with the intermetallic material MoSi2. Nevertheless, high-temperature operation causes the self-healing material to become oxidized and lose its healing capability. Therefore, a method of ceramic encapsulation is needed to protect MoSi2 from early oxidation. The encapsulation is synthesized through a simple precipitation method with colloidal aluminum hydroxide as precursor and variations of the calcination process. Semi-quantitative analysis of the synthesized samples was performed using X-ray diffraction (XRD). Meanwhile, qualitative analysis of the encapsulation morphology was carried out using a Scanning Electron Microscope (SEM) and a Field Emission Scanning Electron Microscope (FESEM) equipped with a dual Focused Ion Beam (FIB). The results show that the calcination process significantly affects the final characteristics of the encapsulation. The optimum encapsulation was synthesized with colloidal aluminum hydroxide as precursor and a double-step calcination process at low pressure up to 900 °C.

  3. Pre- and post-processing for Cosmic/NASTRAN on personal computers and mainframes

    NASA Technical Reports Server (NTRS)

    Kamel, H. A.; Mobley, A. V.; Nagaraj, B.; Watkins, K. W.

    1986-01-01

An interface between Cosmic/NASTRAN and GIFTS has recently been released, combining the powerful pre- and post-processing capabilities of GIFTS with Cosmic/NASTRAN's analysis capabilities. The interface operates on a wide range of computers, even linking Cosmic/NASTRAN and GIFTS when the two are on different computers. GIFTS offers a wide range of elements for use in model construction, each translated by the interface into the nearest Cosmic/NASTRAN equivalent; and the options of automatic or interactive modelling and loading in GIFTS make pre-processing easy and effective. The interface itself includes the programs GFTCOS, which creates the Cosmic/NASTRAN input deck (and, if desired, control deck) from the GIFTS Unified Data Base; COSGFT, which translates the displacements from the Cosmic/NASTRAN analysis back into GIFTS; and HOSTR, which handles stress computations for a few higher-order elements available in the interface, but not supported by the GIFTS processor STRESS. Finally, the versatile display options in GIFTS post-processing allow the user to examine the analysis results through an especially wide range of capabilities, including such possibilities as creating composite loading cases, plotting in color and animating the analysis.

  4. The design of an m-Health monitoring system based on a cloud computing platform

    NASA Astrophysics Data System (ADS)

    Xu, Boyi; Xu, Lida; Cai, Hongming; Jiang, Lihong; Luo, Yang; Gu, Yizhi

    2017-01-01

    Compared to traditional medical services provided within hospitals, m-Health monitoring systems (MHMSs) face more challenges in personalised health data processing. To achieve personalised and high-quality health monitoring by means of new technologies, such as mobile network and cloud computing, in this paper, a framework of an m-Health monitoring system based on a cloud computing platform (Cloud-MHMS) is designed to implement pervasive health monitoring. Furthermore, the modules of the framework, which are Cloud Storage and Multiple Tenants Access Control Layer, Healthcare Data Annotation Layer, and Healthcare Data Analysis Layer, are discussed. In the data storage layer, a multiple tenant access method is designed to protect patient privacy. In the data annotation layer, linked open data are adopted to augment health data interoperability semantically. In the data analysis layer, the process mining algorithm and similarity calculating method are implemented to support personalised treatment plan selection. These three modules cooperate to implement the core functions in the process of health monitoring, which are data storage, data processing, and data analysis. Finally, we study the application of our architecture in the monitoring of antimicrobial drug usage to demonstrate the usability of our method in personal healthcare analysis.

  5. Science Operations Management

    NASA Astrophysics Data System (ADS)

    Squibb, Gael F.

    1984-10-01

    The operation teams for the Infrared Astronomical Satellite (IRAS) included scientists from the IRAS International Science Team. The scientific decisions on an hour-to-hour basis, as well as the long-term strategic decisions, were made by science team members. The IRAS scientists were involved in the analysis of the instrument performance, the analysis of the quality of the data, the decision to reacquire data that was contaminated by radiation effects, the strategy for acquiring the survey data, and the process for using the telescope for additional observations, as well as the processing decisions required to ensure the publication of the final scientific products by end of flight operations plus one year. Early in the project, two science team members were selected to be responsible for the scientific operational decisions. One, located at the operations control center in England, was responsible for the scientific aspects of the satellite operations; the other, located at the scientific processing center in Pasadena, was responsible for the scientific aspects of the processing. These science team members were then responsible for approving the design and test of the tools to support their responsibilities and then, after launch, for using these tools in making their decisions. The ability of the project to generate the final science data products one year after the end of flight operations is due in a large measure to the active participation of the science team members in the operations. This paper presents a summary of the operational experiences gained from this scientific involvement.

  6. Study of the solid state of carbamazepine after processing with gas anti-solvent technique.

    PubMed

    Moneghini, M; Kikic, I; Voinovich, D; Perissutti, B; Alessi, P; Cortesi, A; Princivalle, F; Solinas, D

    2003-09-01

    The purpose of this study was to investigate the influence of supercritical CO2 processing on the physico-chemical properties of carbamazepine, a poorly soluble drug. The gas anti-solvent (GAS) technique was used to precipitate the drug from three different solvents (acetone, ethyl acetate and dichloromethane) to study how they would affect the final product. The samples were analysed before and after treatment by scanning electron microscopy and laser granulometry for possible changes in the habit of the crystals. In addition, the solid state of the samples was studied by means of X-ray powder diffraction, differential scanning calorimetry, diffuse reflectance Fourier-transform infrared spectroscopy and hot stage microscopy. Finally, in vitro dissolution tests were carried out. The solid state analysis of both the untreated and the CO2-treated samples showed that the applied method caused a transition from the starting form III to form I and a dramatic change in crystal morphology, resulting in needle-shaped crystals, regardless of the chosen solvent. In order to identify which process was responsible for the above results, carbamazepine was further precipitated from the same three solvents by a traditional evaporation method (RV-samples). On the basis of this cross-testing, the solvents were found to be responsible for the reorganisation into a different polymorphic form, and the potential of the GAS process to produce micron-sized, needle-shaped particles with an enhanced dissolution rate compared to the RV-carbamazepine was ascertained.

  7. Effect of ambient vibration on solid rocket motor grain and propellant/liner bonding interface

    NASA Astrophysics Data System (ADS)

    Cao, Yijun; Huang, Weidong; Li, Jinfei

    2017-05-01

    In order to study the structural integrity of a solid propellant motor during launch and transport, stress and strain field analyses were performed on a particular type of solid propellant motor. The vibration accelerations during the motor's transport were monitored; the raw vibration data were then efficiently denoised and detrended, and finally the characteristic vibration frequencies were extracted and fed into a finite element analysis. Experiment and simulation results show that the monitored solid propellant motor mainly experiences low-frequency vibration at 0.2 Hz and 15 Hz during transport, and that under this low-frequency vibration loading the stress concentrations in the grain occur below the head and tail of the propellant/liner bonding surface and at the grain roots.
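    The preprocessing chain described above (denoising/detrending the monitored acceleration record, then extracting characteristic frequencies for the finite element model) can be sketched in a few lines. The signal below is synthetic and the 50 Hz sample rate is invented for illustration, since the paper's actual monitoring data are not given; only the 0.2 Hz and 15 Hz components mirror the reported result.

    ```python
    import math
    import cmath

    def detrend(x):
        # remove the mean and the least-squares linear trend from a signal
        n = len(x)
        t = list(range(n))
        mt, mx = sum(t) / n, sum(x) / n
        slope = (sum((ti - mt) * (xi - mx) for ti, xi in zip(t, x))
                 / sum((ti - mt) ** 2 for ti in t))
        return [xi - (mx + slope * (ti - mt)) for ti, xi in zip(t, x)]

    def dominant_freqs(x, fs, n_peaks=2, min_sep=1.0):
        # brute-force DFT magnitude spectrum, then greedy peak picking,
        # keeping peaks at least min_sep Hz apart
        n = len(x)
        mags = sorted(
            ((abs(sum(x[m] * cmath.exp(-2j * math.pi * k * m / n)
                      for m in range(n))), k * fs / n)
             for k in range(1, n // 2)),
            reverse=True)
        peaks = []
        for _, f in mags:
            if all(abs(f - p) > min_sep for p in peaks):
                peaks.append(f)
            if len(peaks) == n_peaks:
                break
        return sorted(peaks)

    # synthetic 10 s transport record sampled at 50 Hz: 0.2 Hz and 15 Hz
    # components plus a slow linear drift standing in for the trend term
    fs = 50.0
    x = [math.sin(2 * math.pi * 0.2 * m / fs)
         + 0.5 * math.sin(2 * math.pi * 15.0 * m / fs)
         + 0.01 * m / fs
         for m in range(500)]
    peaks = dominant_freqs(detrend(x), fs)
    ```

    With the drift removed, the two dominant spectral peaks land on the injected 0.2 Hz and 15 Hz components; in practice an FFT would replace the brute-force DFT.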

  8. Behaviors of printed circuit boards due to microwave supported curing process of coating materials.

    PubMed

    Bremerkamp, Felix; Nowottnick, Mathias; Seehase, Dirk; Bui, Trinh Dung

    2012-01-01

    The application of a microwave-supported curing process for coatings in the electronics industry poses a challenge. Here the implementation of this technology is presented. Within the scope of the investigation, special PCB test layouts were designed and the polymer curing process was examined by the method of dielectric analysis. Furthermore, the coupling of microwave radiation with conductive PCB structures was analyzed experimentally by means of special test boards. The formation of standing waves and a regular heating distribution along the conductive wires on the PCB could be observed. The experimental results were compared with numerical simulation. In this context, the numerical analysis of microwave-PCB interaction led to important findings concerning wave propagation on wired PCBs. The final evaluation demonstrated a substantial similarity between numerical simulations and experimental results.

  9. Conceptual analysis of Physiology of vision in Ayurveda

    PubMed Central

    Balakrishnan, Praveen; Ashwini, M. J.

    2014-01-01

    The process by which the world outside is seen is termed the visual process, or the physiology of vision. There are three phases in this visual process: the phase of refraction of light, the phase of conversion of light energy into electrical impulses, and finally peripheral and central neurophysiology. With the advent of modern instruments, the step-by-step biochemical changes occurring at each level of the visual process have been deciphered. Many investigations have emerged to track these changes, helping to diagnose the exact nature of disease. Ayurveda has described this physiology of vision based on the functions of vata and pitta. Tarka Sangraha, a philosophical textbook of ayurveda, gives certain basic facts about the visual process. This article discusses the second and third phases of the visual process. A step-by-step analysis of the visual process through the spectacles of ayurveda, amalgamated with the basics of philosophy from Tarka Sangraha, is carried out critically to generate a concrete idea of the physiology and thereby interpret the pathology on the grounds of ayurveda, based on investigative reports. PMID:25336853

  10. Carbon Mineralization by Aqueous Precipitation for Beneficial Use of CO2 from Flue Gas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Devenney, Martin; Gilliam, Ryan; Seeker, Randy

    The objective of this project was to demonstrate an innovative process to mineralize CO2 from flue gas directly to reactive carbonates and maximize the value and versatility of its beneficial use products. The program scope includes the design, construction, and testing of a CO2 Conversion to Material Products (CCMP) Pilot Demonstration Plant utilizing CO2 from the flue gas of a power production facility in Moss Landing, CA, as well as flue gas from coal combustion. This final report details all development, analysis, design and testing of the project. Also included in the final report are an updated Techno-Economic Analysis and CO2 Lifecycle Analysis. The subsystems included in the pilot demonstration plant are the mineralization subsystem, the Alkalinity Based on Low Energy (ABLE) subsystem, the waste calcium oxide processing subsystem, and the fiber cement board production subsystem. The fully integrated plant was proven to be capable of capturing CO2 from various sources (gas and coal) and mineralizing it into a reactive calcium carbonate binder and subsequently producing commercial-size (4 ft x 8 ft) fiber cement boards. The final report provides a description of the “as built” design of these subsystems and the results of the commissioning activities that have taken place to confirm operability. The report also discusses the results of the fully integrated operation of the facility. Fiber cement boards have been produced in this facility exclusively using reactive calcium carbonate from CO2 captured from flue gas. These boards meet all appropriate US and China acceptance standards. Use demonstrations for these boards are now underway.

  11. ADP Analysis project for the Human Resources Management Division

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1993-01-01

    The ADP (Automated Data Processing) Analysis Project was conducted for the Human Resources Management Division (HRMD) of NASA's Langley Research Center. The three major areas of work in the project were computer support, automated inventory analysis, and an ADP study for the Division. The goal of the computer support work was to determine automation needs of Division personnel and help them solve computing problems. The goal of automated inventory analysis was to find a way to analyze installed software and usage on a Macintosh. Finally, the ADP functional systems study for the Division was designed to assess future HRMD needs concerning ADP organization and activities.

  12. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...; technical information in final safety analysis report. The application must contain a final safety analysis...) Information sufficient to demonstrate compliance with the applicable requirements regarding testing, analysis... 10 Energy 2 2013-01-01 2013-01-01 false Contents of applications; technical information in final...

  13. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...; technical information in final safety analysis report. The application must contain a final safety analysis...) Information sufficient to demonstrate compliance with the applicable requirements regarding testing, analysis... 10 Energy 2 2012-01-01 2012-01-01 false Contents of applications; technical information in final...

  14. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...; technical information in final safety analysis report. The application must contain a final safety analysis...) Information sufficient to demonstrate compliance with the applicable requirements regarding testing, analysis... 10 Energy 2 2014-01-01 2014-01-01 false Contents of applications; technical information in final...

  15. 10 CFR 52.157 - Contents of applications; technical information in final safety analysis report.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...; technical information in final safety analysis report. The application must contain a final safety analysis...) Information sufficient to demonstrate compliance with the applicable requirements regarding testing, analysis... 10 Energy 2 2011-01-01 2011-01-01 false Contents of applications; technical information in final...

  16. A hydrometallurgical process for the recovery of terbium from fluorescent lamps: Experimental design, optimization of acid leaching process and process analysis.

    PubMed

    Innocenzi, Valentina; Ippolito, Nicolò Maria; De Michelis, Ida; Medici, Franco; Vegliò, Francesco

    2016-12-15

    Terbium and rare earth recovery from the fluorescent powders of exhausted lamps by acid leaching with hydrochloric acid was the objective of this study. In order to investigate the factors affecting leaching, a series of experiments was performed according to a full factorial plan with four variables at two levels (2^4). The factors studied were temperature, concentration of acid, pulp density and leaching time. Experimental conditions for terbium dissolution were optimized by statistical analysis. The results showed that temperature and pulp density were significant, with a positive and a negative effect, respectively. The empirical mathematical model deduced from the experimental data demonstrated that the terbium content was completely dissolved under the following conditions: 90 °C, 2 M hydrochloric acid and 5% pulp density; when the pulp density was 15%, an extraction of 83% could be obtained at 90 °C with 5 M hydrochloric acid. Finally, a flow sheet for the recovery of rare earth elements was proposed. The process was tested and simulated by commercial software for chemical processes. The mass balance of the process was calculated: from 1 ton of initial powder it was possible to obtain around 160 kg of a rare earth concentrate having a purity of 99%. The main rare earth element in the final product was yttrium oxide (86.43%), followed by cerium oxide (4.11%), lanthanum oxide (3.18%), europium oxide (3.08%) and terbium oxide (2.20%). The estimated total recovery of the rare earth elements was around 70% for yttrium and europium and 80% for the other rare earths. Copyright © 2016 Elsevier Ltd. All rights reserved.
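    The screening logic of a two-level full factorial plan like the one above reduces to contrasting mean responses at the high and low coded levels of each factor. In the sketch below the 2^4 design matrix is standard, but the yield function is a made-up stand-in for the paper's leaching data, chosen only so that temperature has a positive effect and pulp density a negative one, as the study reports.

    ```python
    from itertools import product

    # coded -1/+1 levels for the four leaching factors of a 2^4 full factorial
    factors = ["temperature", "acid", "pulp_density", "time"]
    runs = list(product([-1, 1], repeat=4))   # 16 experimental runs

    # hypothetical terbium extraction yields (%), invented so that temperature
    # helps leaching and pulp density hinders it, mirroring the reported signs
    def fake_yield(run):
        T, acid, pulp, time = run
        return 70 + 10 * T + 3 * acid - 8 * pulp + 1 * time

    ys = [fake_yield(r) for r in runs]

    def main_effect(i):
        # main effect = mean response at the high level minus mean at the low
        hi = [y for r, y in zip(runs, ys) if r[i] == 1]
        lo = [y for r, y in zip(runs, ys) if r[i] == -1]
        return sum(hi) / len(hi) - sum(lo) / len(lo)

    effects = {f: main_effect(i) for i, f in enumerate(factors)}
    ```

    Ranking the absolute effects identifies the significant factors; with real data one would also test the effects against replicate error before drawing conclusions.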

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nick Cannell; Dr. Mark Samonds; Adi Sholapurwalla

    The investment casting process is an expendable mold process where wax patterns of the part and rigging are molded, assembled, shelled and melted to produce a ceramic mold matching the shape of the component to be cast. Investment casting is an important manufacturing method for critical parts because of the ability to maintain dimensional shape and tolerances. However, these tolerances can be easily exceeded if the molding components do not maintain their individual shapes well. In the investment casting process there are several opportunities for the final casting to not maintain the intended size and shape, such as shrinkage of the wax in the injection tool, modification of the shape during shell heating, and thermal shrink and distortion in the casting process. Studies were completed in earlier phases of this project to look at the casting and shell distortions through the process. Dr. Adrian Sabau at Oak Ridge National Labs performed characterizations and validations of 17-4 PH stainless steel in primarily fused silica shell systems, with good agreement between analysis results and experimental data. Further tasks provided material property measurements of wax and a methodology for employing a viscoelastic definition of wax materials in software. The final set of tasks involved the implementation of the findings into the commercial casting analysis software ProCAST, owned and maintained by ESI Group. This included: the transfer of the wax material property data from its raw form into separate temperature-dependent thermophysical and mechanical property datasets; adding this wax material property data into an easily viewable and modifiable user interface within the pre-processing application of the ProCAST suite, namely PreCAST; and validating the data and the viscoelastic wax model with respect to experimental results.

  18. EARLINET Single Calculus Chain - overview on methodology and strategy

    NASA Astrophysics Data System (ADS)

    D'Amico, G.; Amodeo, A.; Baars, H.; Binietoglou, I.; Freudenthaler, V.; Mattis, I.; Wandinger, U.; Pappalardo, G.

    2015-11-01

    In this paper we describe the EARLINET Single Calculus Chain (SCC), a tool for the automatic analysis of lidar measurements. The development of this tool started in the framework of EARLINET-ASOS (European Aerosol Research Lidar Network - Advanced Sustainable Observation System); it was extended within ACTRIS (Aerosol, Clouds and Trace gases Research InfraStructure Network), and it is continuing within ACTRIS-2. The main idea was to develop a data processing chain that allows all EARLINET stations to retrieve, in a fully automatic way, the aerosol backscatter and extinction profiles starting from the raw data of the lidar systems they operate. The calculus subsystem of the SCC is composed of two modules: a pre-processor module which handles the raw lidar data and corrects them for instrumental effects, and an optical processing module for the retrieval of aerosol optical products from the pre-processed data. All input parameters needed to perform the lidar analysis are stored in a database to keep track of all changes which may occur for any EARLINET lidar system over time. The two calculus modules are coordinated and synchronized by an additional module (a daemon) which makes the whole analysis process fully automatic. The end user can interact with the SCC via a user-friendly web interface. All SCC modules are developed using open-source and freely available software packages. The final products retrieved by the SCC fulfill all requirements of the EARLINET quality assurance programs at both the instrumental and algorithm levels. Moreover, the manpower needed to provide aerosol optical products is greatly reduced, and thus the near-real-time availability of lidar data is improved. The high quality of the SCC products is proven by the good agreement between the SCC analysis and the corresponding independent manual retrievals. Finally, the ability of the SCC to provide high-quality aerosol optical products is demonstrated for an EARLINET intense observation period.

  19. Evaluation of control over the microbiological contamination of carcasses in a lamb carcass dressing process operated with or without pasteurizing treatment.

    PubMed

    Milios, K; Mataragas, M; Pantouvakis, A; Drosinos, E H; Zoiopoulos, P E

    2011-03-30

    The aim of this study was to quantify the hygienic status of a lamb slaughterhouse by means of multivariate statistical analysis, to demonstrate how the microbiological data could be exploited to improve the lamb slaughter process by constructing control charts and to evaluate the potential effect of an intervention step such as steam application on the microbiological quality of lamb carcasses. Results showed that pelt removal and evisceration were hygienically uncontrolled. TVC and Enterobacteriaceae progressively increased from the stage 'after pelt removal of hind and forelegs/before final pulling' to the stage 'after evisceration/before pluck removal' thus indicating possible deposition of microorganisms during these operations. It seems that the processing stages of freshly produced carcasses were better distinguished by Enterobacteriaceae, with evisceration contributing mostly to the final Enterobacteriaceae counts. Application of steam during the lamb slaughter process reduced microbial counts without adverse effects on the organoleptic characteristics of the carcasses. Moreover, the construction of control charts showed that decontamination with steam contributed to the maintenance of an in control process compared to that before the application of steam, suggesting the potential use of steam as an intervention step during the lamb slaughter process. Copyright © 2011 Elsevier B.V. All rights reserved.

  20. A network analysis of indirect carbon emission flows among different industries in China.

    PubMed

    Du, Qiang; Xu, Yadan; Wu, Min; Sun, Qiang; Bai, Libiao; Yu, Ming

    2018-06-17

    Indirect carbon emissions account for a large share of the total carbon emissions of the processes that make final products, which implies flows of indirect carbon emissions across industries. Understanding these flows is crucial for allocating a carbon allowance to each industry. By combining input-output analysis and complex network theory, this study establishes an indirect carbon emission flow network (ICEFN) for 41 industries from 2005 to 2014 to investigate the interrelationships among different industries. The results show that the ICEFN exhibits a small-world nature, based on an analysis of the average path lengths and the clustering coefficients. Moreover, key industries in the ICEFN were identified using complex network theory on the basis of degree centrality and betweenness centrality. Furthermore, the 41 industries of the ICEFN were divided into four industrial subgroups that are closely related to one another. Finally, possible policy implications were provided based on the knowledge of the structure of the ICEFN and its trend.

  1. The application of hazard analysis and critical control points and risk management in the preparation of anti-cancer drugs.

    PubMed

    Bonan, Brigitte; Martelli, Nicolas; Berhoune, Malik; Maestroni, Marie-Laure; Havard, Laurent; Prognon, Patrice

    2009-02-01

    To apply the Hazard Analysis and Critical Control Points method to the preparation of anti-cancer drugs; to identify critical control points in our cancer chemotherapy process; and to propose control measures and corrective actions to manage these processes. The Hazard Analysis and Critical Control Points application began in January 2004 in our centralized chemotherapy compounding unit. From October 2004 to August 2005, monitoring of process nonconformities was performed to assess the method. According to the Hazard Analysis and Critical Control Points method, a multidisciplinary team was formed to describe and assess the cancer chemotherapy process. This team listed all of the critical points and calculated their risk indexes according to their frequency of occurrence, their severity and their detectability. The team defined monitoring, control measures and corrective actions for each identified risk. Finally, over a 10-month period, pharmacists reported each nonconformity of the process in a follow-up document. Our team described 11 steps in the cancer chemotherapy process. The team identified 39 critical control points, including 11 of higher importance with a high risk index. Over 10 months, 16,647 preparations were performed; 1225 nonconformities were reported during this same period. The Hazard Analysis and Critical Control Points method is relevant when used to target a specific process such as the preparation of anti-cancer drugs. This method helped us to focus on the production steps that can have a critical influence on product quality, and led us to improve our process.
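    The risk-index calculation the team describes (frequency of occurrence × severity × detectability, in the style of an FMEA risk priority number) can be sketched as a simple ranking. The step names and 1-5 scores below are hypothetical illustrations, not the study's 39 actual control points.

    ```python
    # hypothetical preparation steps scored 1-5 for frequency of occurrence,
    # severity, and detectability (higher = harder to detect); illustrative
    # stand-ins, not the control points from the study
    points = [
        ("prescription check", 2, 5, 3),
        ("dose calculation",   3, 5, 4),
        ("aseptic transfer",   4, 4, 2),
        ("final labelling",    2, 3, 2),
    ]

    def risk_index(freq, severity, detectability):
        # multiplicative risk index, as in FMEA-style risk priority numbers
        return freq * severity * detectability

    # rank the steps by risk and flag those above an (arbitrary) threshold
    ranked = sorted(points, key=lambda p: risk_index(*p[1:]), reverse=True)
    high_risk = [name for name, f, s, d in ranked if risk_index(f, s, d) >= 30]
    ```

    The flagged steps are those for which monitoring, control measures and corrective actions would be defined first; the threshold of 30 is arbitrary here.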

  2. Clinical process cost analysis.

    PubMed

    Marrin, C A; Johnson, L C; Beggs, V L; Batalden, P B

    1997-09-01

    New systems of reimbursement are exerting enormous pressure on clinicians and hospitals to reduce costs. Using cheaper supplies or reducing the length of stay may be a satisfactory short-term solution, but the best strategy for long-term success is radical reduction of costs by reengineering the processes of care. However, few clinicians or institutions know the actual costs of medical care; nor do they understand, in detail, the activities involved in the delivery of care. Finally, there is no accepted method for linking the two. Clinical process cost analysis begins with the construction of a detailed flow diagram incorporating each activity in the process of care. The cost of each activity is then calculated, and the two are linked. This technique was applied to Diagnosis Related Group 75 to analyze the real costs of the operative treatment of lung cancer at one institution. Total costs varied between $6,400 and $7,700. The major driver of costs was personnel time, which accounted for 55% of the total. Forty percent of the total cost was incurred in the operating room. The cost of care decreased progressively during hospitalization. Clinical process cost analysis provides detailed information about the costs and processes of care. The insights thus obtained may be used to reduce costs by reengineering the process.
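    The linkage step described above (attach a cost to each activity in the flow diagram, then roll costs up by phase of care) reduces to a simple aggregation. The activities, minutes and per-minute rates below are invented for illustration rather than taken from the DRG 75 data.

    ```python
    # hypothetical activities from a care-process flow diagram:
    # (phase, activity, personnel minutes, cost per minute in $)
    activities = [
        ("operating room", "anaesthesia induction",  30, 8.0),
        ("operating room", "lobectomy",             120, 9.0),
        ("ward",           "nursing care",          600, 1.2),
        ("ward",           "physician rounds",       60, 3.0),
    ]

    # link each activity to its cost, then roll costs up by phase
    cost = {}
    for phase, _, minutes, rate in activities:
        cost[phase] = cost.get(phase, 0.0) + minutes * rate

    total = sum(cost.values())
    or_share = cost["operating room"] / total   # fraction incurred in the OR
    ```

    The same roll-up by cost driver (personnel time, supplies, equipment) is what lets the analysis report figures such as the share of total cost attributable to personnel.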

  3. Red and processed meat consumption and gastric cancer risk: a systematic review and meta-analysis

    PubMed Central

    Zhao, Zhanwei; Yin, Zifang; Zhao, Qingchuan

    2017-01-01

    The associations between red and processed meat consumption and gastric cancer risk have remained inconclusive. We performed a systematic review and meta-analysis to analyze these associations. We searched PubMed and EMBASE to identify studies published from inception through October 2016. Subtype analyses of gastric cancer (gastric cardia adenocarcinoma and gastric non-cardiac adenocarcinoma) and dose-response analyses were performed. We finally selected 42 eligible studies. The summary relative risks of highest versus lowest consumption were positive for case-control studies with 1.67 (1.36-2.05) for red meat and 1.76 (1.51-2.05) for processed meat, but negative for cohort studies with 1.14 (0.97-1.34) for red meat and 1.23 (0.98-1.55) for processed meat. Subtype analyses of cohort studies suggested null results for gastric cardia adenocarcinoma (red meat, P = 0.79; processed meat, P = 0.89) and gastric non-cardiac adenocarcinoma (red meat, P = 0.12; processed meat, P = 0.12). In conclusion, the present analysis suggested null results between red and processed meat consumption and gastric cancer risk in cohort studies, although case-control studies yielded positive associations. Further well-designed prospective studies are needed to validate these findings. PMID:28430644
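    Summary relative risks of the "highest versus lowest consumption" kind are typically obtained by inverse-variance pooling on the log scale. A minimal fixed-effect sketch follows; the per-study relative risks and 95% confidence intervals are hypothetical stand-ins, not data from the 42 studies analysed here.

    ```python
    import math

    # hypothetical per-study relative risks with 95% confidence intervals,
    # standing in for the study-level data pooled in a meta-analysis
    studies = [(1.5, 1.2, 1.9), (1.1, 0.9, 1.4), (1.8, 1.3, 2.5)]

    def pooled_rr(studies):
        # fixed-effect inverse-variance pooling on the log scale:
        # each study is weighted by 1/SE^2, with SE recovered from the
        # width of the 95% CI (log(hi) - log(lo) spans 2 * 1.96 SEs)
        num = den = 0.0
        for rr, lo, hi in studies:
            se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
            w = 1.0 / se ** 2
            num += w * math.log(rr)
            den += w
        return math.exp(num / den)

    pooled = pooled_rr(studies)   # lies between the smallest and largest RR
    ```

    A full analysis like the one above would also use a random-effects model and heterogeneity statistics, which this fixed-effect sketch omits.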

  4. Analysis and correlation of the test data from an advanced technology rotor system

    NASA Technical Reports Server (NTRS)

    Jepson, D.; Moffitt, R.; Hilzinger, K.; Bissell, J.

    1983-01-01

    Comparisons were made of the performance and blade vibratory loads characteristics of an advanced rotor system as predicted by analysis and as measured in a 1/5-scale model wind tunnel test, a full-scale model wind tunnel test and a flight test. The accuracy with which the tools available at the various stages of the design/development process (analysis, model test, etc.) could predict the final characteristics, as measured on the aircraft, was determined. The accuracy of the analyses in predicting the effects of the systematic tip planform variations investigated in the full-scale wind tunnel test was evaluated.

  5. External Tank Liquid Hydrogen (LH2) Prepress Regression Analysis Independent Review Technical Consultation Report

    NASA Technical Reports Server (NTRS)

    Parsons, Vickie s.

    2009-01-01

    The request to conduct an independent review of regression models, developed for determining the expected Launch Commit Criteria (LCC) External Tank (ET)-04 cycle count for the Space Shuttle ET tanking process, was submitted to the NASA Engineering and Safety Center (NESC) on September 20, 2005. The NESC team performed an independent review of the regression models documented in Prepress Regression Analysis, Tom Clark and Angela Krenn, 10/27/05. This consultation consisted of a peer review, by statistical experts, of the proposed regression models provided in the Prepress Regression Analysis. This document is the consultation's final report.

  6. Study on a Multi-Frequency Homotopy Analysis Method for Period-Doubling Solutions of Nonlinear Systems

    NASA Astrophysics Data System (ADS)

    Fu, H. X.; Qian, Y. H.

    In this paper, a modification of the homotopy analysis method (HAM) is applied to study the two-degree-of-freedom coupled Duffing system. Firstly, the process of solving the two-degree-of-freedom coupled Duffing system is presented. Secondly, the single periodic solutions and double periodic solutions are obtained by solving the constructed nonlinear algebraic equations. Finally, comparing the periodic solutions obtained by the multi-frequency homotopy analysis method (MFHAM) and the fourth-order Runge-Kutta method, it is found that the approximate solution agrees well with the numerical solution.

  7. Exploratory Mediation Analysis via Regularization

    PubMed Central

    Serang, Sarfaraz; Jacobucci, Ross; Brimhall, Kim C.; Grimm, Kevin J.

    2017-01-01

    Exploratory mediation analysis refers to a class of methods used to identify a set of potential mediators of a process of interest. Despite its exploratory nature, conventional approaches are rooted in confirmatory traditions, and as such have limitations in exploratory contexts. We propose a two-stage approach called exploratory mediation analysis via regularization (XMed) to better address these concerns. We demonstrate that this approach is able to correctly identify mediators more often than conventional approaches and that its estimates are unbiased. Finally, this approach is illustrated through an empirical example examining the relationship between college acceptance and enrollment. PMID:29225454

  8. Tracking the Spatiotemporal Neural Dynamics of Real-world Object Size and Animacy in the Human Brain.

    PubMed

    Khaligh-Razavi, Seyed-Mahdi; Cichy, Radoslaw Martin; Pantazis, Dimitrios; Oliva, Aude

    2018-06-07

    Animacy and real-world size are properties that describe any object and thus bring basic order into our perception of the visual world. Here, we investigated how the human brain processes real-world size and animacy. For this, we applied representational similarity to fMRI and MEG data to yield a view of brain activity with high spatial and temporal resolutions, respectively. Analysis of fMRI data revealed that a distributed and partly overlapping set of cortical regions extending from occipital to ventral and medial temporal cortex represented animacy and real-world size. Within this set, parahippocampal cortex stood out as the region representing animacy and size stronger than most other regions. Further analysis of the detailed representational format revealed differences among regions involved in processing animacy. Analysis of MEG data revealed overlapping temporal dynamics of animacy and real-world size processing starting at around 150 msec and provided the first neuromagnetic signature of real-world object size processing. Finally, to investigate the neural dynamics of size and animacy processing simultaneously in space and time, we combined MEG and fMRI with a novel extension of MEG-fMRI fusion by representational similarity. This analysis revealed partly overlapping and distributed spatiotemporal dynamics, with parahippocampal cortex singled out as a region that represented size and animacy persistently when other regions did not. Furthermore, the analysis highlighted the role of early visual cortex in representing real-world size. A control analysis revealed that the neural dynamics of processing animacy and size were distinct from the neural dynamics of processing low-level visual features. Together, our results provide a detailed spatiotemporal view of animacy and size processing in the human brain.

  9. Experience with case tools in the design of process-oriented software

    NASA Astrophysics Data System (ADS)

    Novakov, Ognian; Sicard, Claude-Henri

    1994-12-01

    In accelerator systems such as the CERN PS complex, process equipment has a lifetime which may exceed the typical life cycle of its related software. Taking into account the variety of such equipment, it is important to keep the analysis and design of the software in a system-independent form. This paper discusses the experience gathered in using commercial CASE tools for analysis, design and reverse engineering of different process-oriented software modules, with a principal emphasis on maintaining the initial analysis in a standardized form. Such tools have been in existence for several years, but this paper shows that they are not fully adapted to our needs. In particular, the paper stresses the problems of integrating such a tool into an existing database-dependent development chain, and the lack of real-time simulation tools and of object-oriented concepts in existing commercial packages. Finally, the paper gives a broader view of software engineering needs in our particular context.

  10. Alignment of an acoustic manipulation device with cepstral analysis of electronic impedance data.

    PubMed

    Hughes, D A; Qiu, Y; Démoré, C; Weijer, C J; Cochran, S

    2015-02-01

    Acoustic particle manipulation is an emerging technology that uses ultrasonic standing waves to position objects with pressure gradients and acoustic radiation forces. To produce strong standing waves, the transducer and the reflector must be aligned properly such that they are parallel to each other. This can be a difficult process because the ultrasound field must be visualised, and as higher frequencies are introduced the alignment requires greater accuracy. In this paper, we present a method for aligning acoustic resonators with cepstral analysis, a simple signal processing technique that requires only the electrical impedance measurement data of the resonator, which are usually recorded during the fabrication of the device. We first introduce the mathematical basis of cepstral analysis and then demonstrate and validate it using a computer simulation of an acoustic resonator. Finally, the technique is demonstrated experimentally by creating many parallel linear traps for 10 μm fluorescent beads inside an acoustic resonator. Copyright © 2014 Elsevier B.V. All rights reserved.
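
The core idea of cepstral analysis, taking the spectrum of the log-magnitude spectrum so that a periodic ripple collapses into a single peak, can be sketched on a synthetic impedance curve. The ripple depth, frequency step and echo delay below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical impedance magnitude of a resonator: a ripple whose period in
# frequency (1/tau) encodes the acoustic transit delay of the cavity
N, df, tau = 1000, 10e3, 2e-6            # samples, 10 kHz step, 2 us delay
f = np.arange(N) * df
mag = 1.0 + 0.3 * np.cos(2 * np.pi * f * tau)

# Cepstrum: spectrum of the log-magnitude; the ripple in |Z(f)| becomes
# a sharp peak at "quefrency" q = tau
cep = np.abs(np.fft.rfft(np.log(mag)))
quef = np.fft.rfftfreq(N, d=df)
q_peak = quef[1 + np.argmax(cep[1:])]    # skip the DC bin
print(q_peak)                            # 2e-06 -> the 2 us cavity delay
```

In an alignment procedure, the sharpness of this cepstral peak can serve as the figure of merit to maximise while adjusting the reflector.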

  11. An updated comprehensive techno-economic analysis of algae biodiesel.

    PubMed

    Nagarajan, Sanjay; Chou, Siaw Kiang; Cao, Shenyan; Wu, Chen; Zhou, Zhi

    2013-10-01

    Algae biodiesel is a promising but expensive alternative fuel to petro-diesel. To overcome cost barriers, detailed cost analyses are needed. A decade-old cost analysis by the U.S. National Renewable Energy Laboratory indicated that the costs of algae biodiesel were in the range of $0.53-0.85/L (2012 USD values). However, the costs of land and transesterification were only roughly estimated. In this study, an updated comprehensive techno-economic analysis was conducted with optimized processes and improved cost estimations. The latest process improvements, vendor quotes, government databases, and other relevant data sources were used to calculate the updated algal biodiesel costs, and the final costs of biodiesel are in the range of $0.42-0.97/L. Additional improvements for cost-effective algae cultivation and biodiesel production around the globe were also recommended. Overall, the calculated costs seem promising, suggesting that a single-step biodiesel production process is close to commercial reality. Copyright © 2012 Elsevier Ltd. All rights reserved.

  12. Cognitive task analysis of network analysts and managers for network situational awareness

    NASA Astrophysics Data System (ADS)

    Erbacher, Robert F.; Frincke, Deborah A.; Wong, Pak Chung; Moody, Sarah; Fink, Glenn

    2010-01-01

    The goal of our project is to create a set of next-generation cyber situational-awareness capabilities with applications to other domains in the long term. The situational-awareness capabilities being developed focus on novel visualization techniques as well as data analysis techniques designed to improve the comprehensibility of the visualizations. The objective is to improve the decision-making process so that decision makers can choose better actions. To this end, we put extensive effort into obtaining feedback from network analysts and managers and into understanding what their needs truly are. This paper discusses the cognitive task analysis methodology we followed to acquire that feedback. It also provides the details we acquired from the analysts on their processes, goals, and concerns. A final result we describe is the generation of a task-flow diagram.

  13. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

    An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and to leverage existing commercial as well as in-house codes, enabling true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and completely integrated into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities based on an optimization problem formulation have been successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulations for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  14. Analysis of high field effects on the steady-state current-voltage response of semi-insulating 4H-SiC for photoconductive switch applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tiskumara, R.; Joshi, R. P., E-mail: ravi.joshi@ttu.edu; Mauch, D.

    A model-based analysis of the steady-state, current-voltage response of semi-insulating 4H-SiC is carried out to probe the internal mechanisms, focusing on electric field driven effects. Relevant physical processes, such as multiple defects, repulsive potential barriers to electron trapping, band-to-trap impact ionization, and field-dependent detrapping, are comprehensively included. Results of our model match the available experimental data fairly well over orders of magnitude of variation in the current density. A number of important parameters are also extracted in the process through comparisons with available data. Finally, based on our analysis, the possible presence of holes in the samples can be discounted up to applied fields as high as ∼275 kV/cm.

  15. Neural networks related to dysfunctional face processing in autism spectrum disorder

    PubMed Central

    Nickl-Jockschat, Thomas; Rottschy, Claudia; Thommes, Johanna; Schneider, Frank; Laird, Angela R.; Fox, Peter T.; Eickhoff, Simon B.

    2016-01-01

    One of the most consistent neuropsychological findings in autism spectrum disorders (ASD) is a reduced interest in and impaired processing of human faces. We conducted an activation likelihood estimation meta-analysis on 14 functional imaging studies on neural correlates of face processing enrolling a total of 164 ASD patients. Subsequently, normative whole-brain functional connectivity maps for the identified regions of significant convergence were computed for the task-independent (resting-state) and task-dependent (co-activations) state in healthy subjects. Quantitative functional decoding was performed by reference to the BrainMap database. Finally, we examined the overlap of the delineated network with the results of a previous meta-analysis on structural abnormalities in ASD as well as with brain regions involved in human action observation/imitation. We found a single cluster in the left fusiform gyrus showing significantly reduced activation during face processing in ASD across all studies. Both task-dependent and task-independent analyses indicated significant functional connectivity of this region with the temporo-occipital and lateral occipital cortex, the inferior frontal and parietal cortices, the thalamus and the amygdala. Quantitative reverse inference then indicated an association of these regions mainly with face processing, affective processing, and language-related tasks. Moreover, we found that the cortex in the region of right area V5 displaying structural changes in ASD patients showed consistent connectivity with the region showing aberrant responses in the context of face processing. Finally, this network was also implicated in the human action observation/imitation network. 
In summary, our findings thus suggest a functionally and structurally disturbed network of occipital regions related primarily to face (but potentially also language) processing, which interact with inferior frontal as well as limbic regions and may be the core of aberrant face processing and reduced interest in faces in ASD. PMID:24869925

  16. A Modified Isotropic-Kinematic Hardening Model to Predict the Defects in Tube Hydroforming Process

    NASA Astrophysics Data System (ADS)

    Jin, Kai; Guo, Qun; Tao, Jie; Guo, Xun-zhong

    2017-11-01

    Numerical simulations of the tube hydroforming process for hollow crankshafts were conducted using the finite element analysis method. A modified model, integrating an isotropic-kinematic hardening model with a ductile criterion model, was used to more accurately optimize process parameters such as internal pressure, feed distance and friction coefficient. Subsequently, hydroforming experiments were performed based on the simulation results. The comparison between experimental and simulation results indicated that the prediction of tube deformation, cracking and wrinkling was quite accurate for the tube hydroforming process. Finally, hollow crankshafts with high thickness uniformity were obtained, and the thickness distributions from the numerical and experimental results agreed well.

  17. Cascade process modeling with mechanism-based hierarchical neural networks.

    PubMed

    Cong, Qiumei; Yu, Wen; Chai, Tianyou

    2010-02-01

    A cascade process, such as a wastewater treatment plant, includes many nonlinear sub-systems and many variables. When the number of sub-systems is large, the input-output relation between the first block and the last block cannot represent the whole process. In this paper we use two techniques to overcome this problem. First, we propose a new neural model, hierarchical neural networks, to identify the cascade process; then we use a serial structural mechanism model based on the physical equations to connect with the neural model. A stable learning algorithm and theoretical analysis are given. Finally, this method is used to model a wastewater treatment plant. Real operational data from a wastewater treatment plant are used to illustrate the modeling approach.

  18. Electrophoresis gel image processing and analysis using the KODAK 1D software.

    PubMed

    Pizzonia, J

    2001-06-01

    The present article reports on the performance of the KODAK 1D Image Analysis Software for the acquisition of information from electrophoresis experiments and highlights the utility of several mathematical functions for subsequent image processing, analysis, and presentation. Digital images of Coomassie-stained polyacrylamide protein gels containing molecular weight standards and ethidium bromide stained agarose gels containing DNA mass standards are acquired using the KODAK Electrophoresis Documentation and Analysis System 290 (EDAS 290). The KODAK 1D software is used to optimize lane and band identification using features such as isomolecular weight lines. Mathematical functions for mass standard representation are presented, and two methods for estimation of unknown band mass are compared. Given the progressive transition of electrophoresis data acquisition and daily reporting in peer-reviewed journals to digital formats ranging from 8-bit systems such as EDAS 290 to more expensive 16-bit systems, the utility of algorithms such as Gaussian modeling, which can correct geometric aberrations such as clipping due to signal saturation common at lower bit depth levels, is discussed. Finally, image-processing tools that can facilitate image preparation for presentation are demonstrated.
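
The Gaussian-modeling correction for clipped bands mentioned above can be sketched as follows: fit a Gaussian only to the unsaturated samples of a band intensity profile, then read the true peak off the fitted model. The profile below is synthetic, not EDAS 290 data:

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, a, mu, sigma):
    return a * np.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

# Hypothetical band profile: a Gaussian clipped at the 8-bit ceiling (255)
x = np.linspace(0, 50, 101)
true = gauss(x, 400.0, 25.0, 5.0)
observed = np.minimum(true, 255.0)

# Fit only the unsaturated samples; the model then restores the clipped peak
mask = observed < 255.0
popt, _ = curve_fit(gauss, x[mask], observed[mask], p0=(255.0, 25.0, 5.0))
print(round(popt[0]))   # 400 -> recovered peak height despite saturation
```

The same idea lets band mass be estimated from the fitted area rather than from saturated pixel sums.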

  19. Statistical interpretation of chromatic indicators in correlation to phytochemical profile of a sulfur dioxide-free mulberry (Morus nigra) wine submitted to non-thermal maturation processes.

    PubMed

    Tchabo, William; Ma, Yongkun; Kwaw, Emmanuel; Zhang, Haining; Xiao, Lulu; Apaliya, Maurice T

    2018-01-15

    The four different methods of color measurement of wine proposed by Boulton, Giusti, Glories and the Commission Internationale de l'Eclairage (CIE) were applied to assess the statistical relationship between the phytochemical profile and chromatic characteristics of sulfur dioxide-free mulberry (Morus nigra) wine submitted to non-thermal maturation processes. The alterations in the chromatic properties and phenolic composition of non-thermally aged mulberry wine were examined, aided by the use of Pearson correlation, cluster and principal component analysis. The results revealed a positive effect of non-thermal processes on the phytochemical families of the wines. From Pearson correlation analysis, relationships between chromatic indexes and flavonols as well as anthocyanins were established. Cluster analysis highlighted similarities between Boulton and Giusti parameters, as well as Glories and CIE parameters, in the assessment of the chromatic properties of wines. Finally, principal component analysis was able to discriminate wines subjected to different maturation techniques on the basis of their chromatic and phenolic characteristics. Copyright © 2017. Published by Elsevier Ltd.
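
The statistical workflow used here, Pearson correlation between phenolic and chromatic variables followed by principal component analysis, can be sketched with placeholder data. The column names and numbers are invented for illustration, not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical wine dataset: rows = wines, columns = [anthocyanins, flavonols,
# colour intensity, hue] (illustrative variables only)
X = rng.standard_normal((20, 4))
X[:, 2] = 0.8 * X[:, 0] + 0.2 * rng.standard_normal(20)  # colour tracks anthocyanins

# Pearson correlation between anthocyanins and colour intensity
r = np.corrcoef(X[:, 0], X[:, 2])[0, 1]

# PCA via SVD of the centred data; scores can discriminate wine treatments
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T[:, :2]                 # projection onto first two components
explained = (S ** 2) / (S ** 2).sum()     # variance explained per component
print(scores.shape)                       # (20, 2)
```

Plotting the two score columns against each other is the usual way to visualise the discrimination between maturation techniques.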

  20. Tritium glovebox stripper system seismic design evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grinnell, J. J.; Klein, J. E.

    2015-09-01

    The use of glovebox confinement at US Department of Energy (DOE) tritium facilities has been discussed in numerous publications. Glovebox confinement protects the workers from radioactive material (especially tritium oxide), provides an inert atmosphere for prevention of flammable gas mixtures and deflagrations, and allows recovery of tritium released from the process into the glovebox when a glovebox stripper system (GBSS) is part of the design. Tritium recovery from the glovebox atmosphere reduces emissions from the facility and the radiological dose to the public. Location of US DOE defense programs facilities away from public boundaries also aids in reducing radiological doses to the public. This is a study based upon design concepts to identify issues and considerations for the design of a seismic GBSS. Safety requirements and analysis should be considered preliminary. Safety requirements for the design of a GBSS should be developed and finalized as a part of the final design process.

  1. Performance of the Extravehicular Mobility Unit (EMU) Airlock Coolant Loop Remediation (A/L CLR) Hardware - Final

    NASA Technical Reports Server (NTRS)

    Steele, John W.; Rector, Tony; Gazda, Daniel; Lewis, John

    2011-01-01

    An EMU water processing kit (Airlock Coolant Loop Recovery -- A/L CLR) was developed as a corrective action to Extravehicular Mobility Unit (EMU) coolant flow disruptions experienced on the International Space Station (ISS) in May of 2004 and thereafter. A conservative duty cycle and set of use parameters for A/L CLR use and component life were initially developed and implemented based on prior analysis results and analytical modeling. Several initiatives were undertaken to optimize the duty cycle and use parameters of the hardware. Examination of post-flight samples and EMU Coolant Loop hardware provided invaluable information on the performance of the A/L CLR and has allowed for an optimization of the process. The intent of this paper is to detail the evolution of the A/L CLR hardware, efforts to optimize the duty cycle and use parameters, and the final recommendations for implementation in the post-Shuttle retirement era.

  2. A 3D THz image processing methodology for a fully integrated, semi-automatic and near real-time operational system

    NASA Astrophysics Data System (ADS)

    Brook, A.; Cristofani, E.; Vandewal, M.; Matheis, C.; Jonuscheit, J.; Beigang, R.

    2012-05-01

    The present study proposes a fully integrated, semi-automatic image processing methodology operating in near real time, developed for Frequency-Modulated Continuous-Wave (FMCW) THz images with center frequencies around 100 GHz and 300 GHz. The quality control of multi-layered aeronautics composite materials and structures using Non-Destructive Testing is the main focus of this work. Image processing is applied to the 3-D images to extract useful information: the data are processed by extracting areas of interest, and the detected areas are subjected to image analysis for more detailed investigation managed by a spatial model. Finally, the post-processing stage examines and evaluates the spatial accuracy of the extracted information.

  3. Quality control process improvement of flexible printed circuit board by FMEA

    NASA Astrophysics Data System (ADS)

    Krasaephol, Siwaporn; Chutima, Parames

    2018-02-01

    This research focuses on quality control process improvement for Flexible Printed Circuit Board (FPCB) production, centred on model 7-Flex, using the Failure Mode and Effect Analysis (FMEA) method to decrease the proportion of defective finished goods found at the final inspection process. Because a number of defective units are found only at final inspection, defective products may escape to customers. The problem stems from a quality control process that is not efficient enough to filter out defective products in-process, because there is no In-Process Quality Control (IPQC) or sampling inspection in the process. Therefore, the quality control process has to be improved by setting inspection gates and IPQCs at critical processes in order to filter out defective products. The critical processes are analysed by the FMEA method. IPQC is used for detecting defective products and reducing the chance of defective finished goods escaping to customers. Reducing the proportion of defective finished goods also decreases scrap cost, because finished goods incur a higher scrap cost than work in-process. Moreover, defective products found during the process can reveal abnormal processes, so engineers and operators can solve the problems in a timely manner. The improved quality control was implemented on the 7-Flex production lines from July 2017 to September 2017. The results show decreases in the average proportion of defective finished goods and in the average Customer Manufacturers Lot Reject Rate (%LRR of CMs) of 4.5% and 4.1%, respectively. Furthermore, the cost saving from this quality control process equals 100K Baht.
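
FMEA ranks candidate failure modes by a risk priority number, RPN = severity × occurrence × detection, each scored on a 1-10 scale; the highest-RPN modes mark the critical processes where inspection gates go first. A minimal sketch with invented FPCB failure modes and scores (not the study's actual FMEA worksheet):

```python
# Each entry: (failure mode, severity, occurrence, detection), scores 1-10.
# The modes and numbers below are illustrative placeholders.
failure_modes = [
    ("misaligned lamination", 7, 5, 6),
    ("copper over-etching",   8, 3, 4),
    ("coverlay bubbles",      4, 6, 3),
]

# Rank by RPN, highest first
ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
for name, s, o, d in ranked:
    print(f"{name}: RPN={s * o * d}")
# misaligned lamination: RPN=210
# copper over-etching: RPN=96
# coverlay bubbles: RPN=72
```

In practice the team re-scores after corrective actions (here, adding IPQC gates) to confirm the RPN has dropped.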

  4. Automation or De-automation

    NASA Astrophysics Data System (ADS)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  5. The Type and Impact of Evidence Review Group Exploratory Analyses in the NICE Single Technology Appraisal Process.

    PubMed

    Carroll, Christopher; Kaltenthaler, Eva; Hill-McManus, Daniel; Scope, Alison; Holmes, Michael; Rice, Stephen; Rose, Micah; Tappenden, Paul; Woolacott, Nerys

    2017-06-01

    As part of the UK National Institute for Health and Care Excellence (NICE) single technology appraisal process, independent evidence review groups (ERGs) critically appraise a company's submission relating to a specific technology and indication. To explore the type of additional exploratory analyses conducted by ERGs and their impact on the recommendations made by NICE. The 100 most recently completed single technology appraisals with published guidance were selected for inclusion. A content analysis of relevant documents was undertaken to identify and extract relevant data, and narrative synthesis was used to rationalize and present these data. The types of exploratory analysis conducted in relation to companies' models were fixing errors, addressing violations, addressing matters of judgment, and the provision of a new, ERG-preferred base case. Ninety-three of the 100 ERG reports contained at least one of these analyses. The most frequently reported type of analysis in these 93 ERG reports related to the category "Matters of judgment," which was reported in 83 reports (89%). At least one of the exploratory analyses conducted and reported by an ERG is mentioned in 97% of NICE appraisal consultation documents and 94% of NICE final appraisal determinations, and had a clear influence on recommendations in 72% of appraisal consultation documents and 47% of final appraisal determinations. These results suggest that the additional analyses undertaken by ERGs in the appraisal of company submissions are highly influential in the policy-making and decision-making process. Copyright © 2017 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  6. Characterization of the Spatio-temporal Evolution of the Energy of Recent Tsunamis in Chile and its Connection with the Seismic Source and Geomorphological Conditions

    NASA Astrophysics Data System (ADS)

    Quiroz, M.; Cienfuegos, R.

    2017-12-01

    At present, the scientific community has acquired good knowledge on characterizing the evolution of tsunami energy at ocean and shelf scales; the investigations of Rabinovich (2013) and Yamazaki (2011) represent important advances in this subject. In the present paper we focus instead on tsunami energy evolution, and ultimately its decay, in coastal areas, because the characteristic time scales of this process have implications for early warning, evacuation initiation, and cancellation. We address the tsunami energy evolution analysis at three spatial scales: a global scale at the ocean basin level, in particular the Pacific Ocean basin; a regional scale comprising processes that occur at the continental shelf level; and finally a local scale comprising coastal areas or bays. These scales were selected to understand how the coastal response is associated with the tsunami and how the energy evolves until it is completely dissipated. Through signal processing methods, such as discrete and wavelet analyses, we analyze time series of recent tsunamigenic events in the main Chilean coastal cities. Based on this analysis, we propose a conceptual model of the influence of geomorphological variables on the evolution and decay of tsunami energy; this model acts as a filter from the seismic source to the observed response in coastal zones. Finally, we aim to provide practical tools that establish patterns of behavior and scaling of energy evolution, connecting seismic source variables and geomorphological factors to understand and predict the response at a given site.
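
A basic quantity in this kind of analysis is the decay time of the coastal tsunami energy, often estimated by fitting an exponential to an energy time series from a tide gauge. A minimal sketch on a synthetic series (the 12 h decay time and hourly sampling are invented for illustration):

```python
import numpy as np

# Hypothetical coastal energy time series: exponential decay with
# decay time tau = 12 h, sampled hourly (not observed data)
t = np.arange(0.0, 72.0)          # hours after tsunami arrival
tau_true = 12.0
E = np.exp(-t / tau_true)         # normalised energy

# Log-linear least-squares fit: log E = -t / tau  =>  slope = -1 / tau
slope, intercept = np.polyfit(t, np.log(E), 1)
tau_est = -1.0 / slope
print(round(tau_est, 3))          # 12.0
```

Comparing fitted decay times across bays is one way to relate geomorphology to how long warnings must stay in force.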

  7. Multi-surface topography targeted plateau honing for the processing of cylinder liner surfaces of automotive engines

    NASA Astrophysics Data System (ADS)

    Lawrence, K. Deepak; Ramamoorthy, B.

    2016-03-01

    Cylinder bores of automotive engines are 'engineered' surfaces that are processed using a multi-stage honing process to generate multiple layers of micro geometry meeting the different functional requirements of the piston assembly system. The final processed surfaces should comply with several surface topographic specifications that are relevant to the good tribological performance of the engine. Selecting the process parameters in three stages of honing so as to obtain multiple surface topographic characteristics simultaneously within the specification tolerance is an important module of process planning and is often a challenging task for process engineers. This paper presents a strategy combining robust process design and gray-relational analysis to evolve the operating levels of honing process parameters in the rough, finish and plateau honing stages, targeting multiple surface topographic specifications on the final running surface of the cylinder bores. Honing experiments were conducted in three stages, namely rough, finish and plateau honing, on cast iron cylinder liners by varying four honing process parameters: rotational speed, oscillatory speed, pressure and honing time. Abbott-Firestone curve based functional parameters (Rk, Rpk, Rvk, Mr1 and Mr2) coupled with mean roughness depth (Rz, DIN/ISO) and honing angle were measured and identified as the surface quality performance targets to be achieved. The experimental results show that the proposed approach is effective in generating cylinder liner surfaces that simultaneously meet the explicit surface topographic specifications currently practiced by the industry.
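
Gray-relational analysis condenses multiple responses (such as Rk, Rpk, Rvk) into one grade per experimental run, so that a single best parameter level can be chosen. A minimal sketch of the standard grade computation, using invented response values rather than the paper's measurements:

```python
import numpy as np

# Rows = experimental runs, columns = responses (e.g. Rk, Rpk, Rvk);
# the numbers are illustrative placeholders.
y = np.array([[1.2, 0.30, 1.6],
              [0.9, 0.25, 1.4],
              [1.5, 0.40, 2.0]])

# Smaller-the-better normalisation to [0, 1]
norm = (y.max(axis=0) - y) / (y.max(axis=0) - y.min(axis=0))

# Gray relational coefficient with distinguishing coefficient zeta = 0.5
delta = 1.0 - norm                 # deviation from the ideal sequence
zeta = 0.5
coef = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
grade = coef.mean(axis=1)          # one grade per run; higher is better
print(np.argmax(grade))            # 1 -> the lowest-roughness run ranks best
```

In a Taguchi-style design, the grades would then be averaged per factor level to pick the best honing settings.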

  8. Benefits of an automated GLP final report preparation software solution.

    PubMed

    Elvebak, Larry E

    2011-07-01

    The final product of analytical laboratories performing US FDA-regulated (or GLP) method validation and bioanalysis studies is the final report. Although there are commercial-off-the-shelf (COTS) software/instrument systems available to laboratory managers to automate and manage almost every aspect of the instrumental and sample-handling processes of GLP studies, there are few software systems available to fully manage the GLP final report preparation process. This lack of appropriate COTS tools results in the implementation of rather Byzantine and manual processes to cobble together all the information needed to generate a GLP final report. The manual nature of these processes results in the need for several iterative quality control and quality assurance events to ensure data accuracy and report formatting. The industry is in need of a COTS solution that gives laboratory managers and study directors the ability to manage as many portions as possible of the GLP final report writing process and the ability to generate a GLP final report with the click of a button. This article describes the COTS software features needed to give laboratory managers and study directors such a solution.

  9. Approaches to self-assembly of colloidal monolayers: A guide for nanotechnologists.

    PubMed

    Lotito, Valeria; Zambelli, Tomaso

    2017-08-01

    Self-assembly of quasi-spherical colloidal particles in two-dimensional (2D) arrangements is essential for a wide range of applications from optoelectronics to surface engineering, from chemical and biological sensing to light harvesting and environmental remediation. Several self-assembly approaches have flourished throughout the years, with specific features in terms of complexity of the implementation, sensitivity to process parameters, characteristics of the final colloidal assembly. Selecting the proper method for a given application amidst the vast literature in this field can be a challenging task. In this review, we present an extensive classification and comparison of the different techniques adopted for 2D self-assembly in order to provide useful guidelines for scientists approaching this field. After an overview of the main applications of 2D colloidal assemblies, we describe the main mechanisms underlying their formation and introduce the mathematical tools commonly used to analyse their final morphology. Subsequently, we examine in detail each class of self-assembly techniques, with an explanation of the physical processes intervening in crystallization and a thorough investigation of the technical peculiarities of the different practical implementations. We point out the specific characteristics of the set-ups and apparatuses developed for self-assembly in terms of complexity, requirements, reproducibility, robustness, sensitivity to process parameters and morphology of the final colloidal pattern. Such an analysis will help the reader to individuate more easily the approach more suitable for a given application and will draw the attention towards the importance of the details of each implementation for the final results. Copyright © 2017 Elsevier B.V. All rights reserved.
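
Among the mathematical tools commonly used to analyse the final morphology of a 2D colloidal assembly is the bond-orientational order parameter ψ6, which measures how hexagonal a particle's neighbour shell is. A minimal sketch (neighbour shells are constructed by hand here rather than found by triangulation of real positions):

```python
import numpy as np

def psi6(center, neighbours):
    """|psi_6| is 1 for a perfect hexagonal environment, near 0 for disorder."""
    d = neighbours - center
    angles = np.arctan2(d[:, 1], d[:, 0])     # bond angles to each neighbour
    return abs(np.mean(np.exp(6j * angles)))

centre = np.array([0.0, 0.0])
# Ideal six-fold shell: neighbours at 60-degree intervals
hexagon = np.array([[np.cos(k * np.pi / 3), np.sin(k * np.pi / 3)]
                    for k in range(6)])
rng = np.random.default_rng(2)
disordered = rng.uniform(-1, 1, size=(6, 2))  # random neighbour shell

print(round(psi6(centre, hexagon), 3))        # 1.0
print(psi6(centre, disordered))               # some value well below 1
```

Averaging |ψ6| over all particles (with neighbours from a Delaunay triangulation) gives a single crystallinity score for a whole monolayer.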

  10. 77 FR 777 - Fresh Garlic From the People's Republic of China: Final Results of Expedited Sunset Review of the...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-06

    ... ingredients or heat processing. The differences between grades are based on color, size, sheathing, and level... otherwise prepared for use as seed. The subject merchandise is used principally as a food product and for... seed must be accompanied by declarations to U.S. Customs and Border Protection to that effect. Analysis...

  11. Counselor Training in Statistical Analysis via Electronic Processing for Research on Local and Regional Student Data. Final Report.

    ERIC Educational Resources Information Center

    Long, Thomas E.

    In this institute, the participants were trained to use peripheral computer related equipment. They were taught Fortran programming skills so they might write and redimension statistical formulary programs, and they were trained to assemble data so they might access computers via both card and punched-tape input. The objectives of the Institute…

  12. Inflatable antenna for earth observing systems

    NASA Astrophysics Data System (ADS)

    Wang, Hong-Jian; Guan, Fu-ling; Xu, Yan; Yi, Min

    2010-09-01

    This paper describes the mechanical design, dynamic analysis, and deployment demonstration of the antenna, and the photogrammetric measurement of the RMS surface error of the inflatable antenna; the possible error sources in the measurement are also analysed. TICRA's GRASP software is used to predict the inflatable antenna pattern based on the coordinates of 460 points on the parabolic surface, and the final results verified the whole design process.

  13. Validation of Student and Parent Reported Data on the Basic Grant Application Form: Corrections Analysis Study. Final Report, Volume 4.

    ERIC Educational Resources Information Center

    Vogel, Ronald J.

    A study was conducted in 1976 of applicants who submitted corrections or amendments to their Student Eligibility Reports (SERs) for the Basic Educational Opportunity Grant (BEOG) Program. The objective was to review the applications corrections process and to determine factors linked to applicants' use of correction procedures. Attention was…

  14. An Analysis of Collaborative Problem-Solving Mechanisms in Sponsored Projects: Applying the 5-Day Sprint Model

    ERIC Educational Resources Information Center

    Raubenolt, Amy

    2016-01-01

    In May 2016, the office of Finance and Sponsored Projects at The Research Institute at Nationwide Children's Hospital conducted a 5-day design sprint session to re-evaluate and redesign a flawed final reporting process within the department. The department sprint was modeled after the design sprint sessions that occur routinely in software…

  15. Alpha particle-induced soft errors in microelectronic devices. I

    NASA Astrophysics Data System (ADS)

    Redman, D. J.; Sega, R. M.; Joseph, R.

    1980-03-01

    The article provides a tutorial review and trend assessment of the problem of alpha particle-induced soft errors in VLSI memories. Attention is given to an analysis of the design evolution of modern ICs, and the characteristics of alpha particles and their origin in IC packaging are reviewed. Finally, the process of an alpha particle penetrating an IC is examined.

  16. Numerical study of influence of hydrogen backflow on krypton Hall effect thruster plasma focusing

    NASA Astrophysics Data System (ADS)

    Yan, Shilin; Ding, Yongjie; Wei, Liqiu; Hu, Yanlin; Li, Jie; Ning, Zhongxi; Yu, Daren

    2017-03-01

    The influence of backflow hydrogen on plasma plume focusing of a krypton Hall effect thruster is studied via a numerical simulation method. Theoretical analysis indicates that hydrogen participates in the plasma discharge process, changes the potential and ionization distribution in the thruster discharge cavity, and finally affects the plume focusing within a vacuum vessel.

  17. Dynamical analysis of yeast protein interaction network during the sake brewing process.

    PubMed

    Mirzarezaee, Mitra; Sadeghi, Mehdi; Araabi, Babak N

    2011-12-01

    Proteins interact with each other for performing essential functions of an organism. They change partners to get involved in various processes at different times or locations. Studying variations of protein interactions within a specific process would help better understand the dynamic features of the protein interactions and their functions. We studied the protein interaction network of Saccharomyces cerevisiae (yeast) during the brewing of Japanese sake. In this process, yeast cells are exposed to several stresses. Analysis of protein interaction networks of yeast during this process helps to understand how protein interactions of yeast change during the sake brewing process. We used gene expression profiles of yeast cells for this purpose. Results of our experiments revealed some characteristics and behaviors of yeast hubs and non-hubs and their dynamical changes during the brewing process. We found that only a small portion of the proteins (12.8 to 21.6%) is responsible for the functional changes of the proteins in the sake brewing process. The number of edges and hubs in the yeast protein interaction networks increases in the first stages of the process and then decreases in the final stages.
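    The hub/non-hub distinction used in this record can be illustrated with a minimal degree-based sketch; the edge list and the degree cutoff below are hypothetical, not taken from the study.

```python
# Minimal sketch: identifying "hub" proteins in an interaction network by
# degree. The edges and HUB_THRESHOLD are hypothetical illustrations.
from collections import defaultdict

# Hypothetical undirected protein-protein interaction edges.
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("A", "E"),
         ("B", "C"), ("D", "E"), ("F", "G")]

degree = defaultdict(int)
for u, v in edges:
    degree[u] += 1
    degree[v] += 1

HUB_THRESHOLD = 4  # illustrative cutoff; real studies choose this empirically
hubs = {p for p, d in degree.items() if d >= HUB_THRESHOLD}
hub_fraction = len(hubs) / len(degree)
print(hubs, round(hub_fraction, 3))
```

    Tracking how `hubs` changes between expression-conditioned snapshots of the network is the kind of dynamical comparison the study performs across brewing stages.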

  18. Effect of equal channel angular pressing on the microstructure and mechanical properties of Al-10Zn-2Mg alloy

    NASA Astrophysics Data System (ADS)

    Manjunath, G. K.; Kumar, G. V. Preetham; Bhat, K. Udaya

    2018-04-01

    The current investigation is focused on evaluating the mechanical properties and the microstructure of cast Al-10Zn-2Mg alloy processed through equal channel angular pressing (ECAP). The ECAP processing was attempted at the minimum possible processing temperature. Microstructural characterization was carried out using optical microscopy, scanning electron microscopy, transmission electron microscopy and X-ray diffraction (XRD) analysis. Hardness measurements and tensile tests were employed to estimate the mechanical properties. Experimental results showed that ECAP processing leads to noticeable grain refinement in the alloy. A considerable density of dislocations was observed in the ECAP-processed material. After ECAP processing, nucleation of precipitates in the material was detected by XRD analysis. ECAP leads to considerable enhancement of the mechanical properties of the material: after processing, the microhardness of the material increased from 144 Hv to 216 Hv, and the UTS increased from 140 MPa to 302 MPa. The increase in the mechanical properties of the alloy after ECAP processing is due to dislocation strengthening and grain refinement strengthening. Finally, the fracture surface morphology of the tensile test samples was also studied.

  19. An Analysis of Preliminary and Post-Discussion Priority Scores for Grant Applications Peer Reviewed by the Center for Scientific Review at the NIH

    PubMed Central

    Martin, Michael R.; Kopstein, Andrea; Janice, Joy M.

    2010-01-01

    There has been an impression amongst many observers that discussion of a grant application has little practical impact on the final priority score; rather, the final score is largely dictated by the range of preliminary scores given by the assigned reviewers. The implication is that the preliminary and final scores are effectively the same. The purpose of this examination of the peer review process at the National Institutes of Health is to describe the relationship between the preliminary priority scores of the assigned reviewers and the final priority score given by the scientific review group, and to assess the practical importance of any differences in priority scores. Priority scores for a sample of standard (R01) research grant applications were used in this assessment. The results indicate that the preliminary evaluation is positively correlated with the final meeting outcome, but that the two are, on average, significantly different. The results demonstrate that discussion at the meeting has an important practical impact on over 13% of the applications. PMID:21103331

  20. Mature vs. Active Deep-Seated Landslides: A Comparison Through Two Case Histories in the Alps

    NASA Astrophysics Data System (ADS)

    Delle Piane, Luca; Perello, Paolo; Baietto, Alessandro; Giorza, Alessandra; Musso, Alessia; Gabriele, Piercarlo; Baster, Ira

    2016-06-01

    Two case histories are presented, concerning the still poorly known alpine deep-seated gravitational slope deformations (DSD) located near Lanzada (central Italian Alps) and Sarre (north-western Italian Alps). The Lanzada DSD is a constantly monitored, juvenile, and active phenomenon, partly affecting an existing hydropower plant. Its well-developed landforms allow a precise field characterization of the instability-affected area. The Sarre DSD is a mature, strongly remodeled phenomenon, where the only hazard factor is represented by secondary instability processes at the base of the slope. In this case, the remodeling imposed the adoption of complementary analytical techniques to support the field work. The two presented studies had to be adapted to external factors, namely (a) available information, (b) geological and geomorphological setting, and (c) final scope of the work. The Lanzada case essentially relied upon accurate field work; the Sarre case was mostly based on digital image and DTM processing. In both cases a sound field structural analysis formed the necessary background to understand the mechanisms leading to instability. A back-analysis of the differences between the study methods adopted in the two cases is finally presented, leading to suggestions for further investigations and design.

  1. Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network

    NASA Astrophysics Data System (ADS)

    Li, Zhiqiang; Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu

    2018-04-01

    This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely, conditional probability tables, can be calculated by referring to the conditional degradation probabilities. Finally, the power of a control unit in a failure model is used as an example. A dynamic fault tree (DFT) is translated into a Bayesian network model, and subsequently extended to a DBN. The results show the state probabilities of an element and of the system without repair, with perfect and imperfect repair, and under CBM; the results for the model with an absorbing set, plotted from the differential equations, are verified. Through forward inference, the reliability of the control unit is determined under the different kinds of modes, and weak nodes in the control unit are identified.
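    The Markov part of the method can be sketched in discrete time: propagate a state-probability vector through a transition matrix in which the failed state is absorbing. The three states and all transition probabilities below are hypothetical, not the paper's model.

```python
# Minimal sketch of multi-state Markov reliability with an absorbing failed
# state (no repair). States and transition probabilities are hypothetical.
def step(p, T):
    """One Markov step: new_p[j] = sum_i p[i] * T[i][j]."""
    n = len(p)
    return [sum(p[i] * T[i][j] for i in range(n)) for j in range(n)]

# States: 0 = good, 1 = degraded, 2 = failed (absorbing).
T = [[0.90, 0.08, 0.02],
     [0.00, 0.85, 0.15],
     [0.00, 0.00, 1.00]]

p = [1.0, 0.0, 0.0]        # element starts in the good state
for _ in range(50):        # 50 hypothetical time steps
    p = step(p, T)

reliability = p[0] + p[1]  # probability of not having entered the failed state
print(round(reliability, 4))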

  2. Unmanned Aerial Vehicle - A Tool for Acquiring Spatial Data for Research and Commercial Purposes. New Course in the Geography and Cartography Curriculum in Higher Education

    NASA Astrophysics Data System (ADS)

    Jeziorska, J.

    2014-04-01

    This paper describes the syllabus for the innovative course "Unmanned Aerial Observations of Terrain" introduced to the curriculum by the Department of Geoinformatics and Cartography of the University of Wroclaw (Poland). It indicates the objectives of the new subject, its didactic purpose, methods used in the teaching process, specifications of teaching materials, and the knowledge and skills that students are expected to acquire. Finally, it presents the content of the course and a description of lesson units. The subject will be obligatory for graduate students majoring in Geography who are participants in the Geoinformatics and Cartography Master's program. Thirty-eight hours in a summer semester have been earmarked for the course, comprising 30 hours of instructor-guided laboratory and fieldtrip work and 8 hours of individual work. The course aims to prepare future geographers to conduct a multi-step process that includes defining the purpose of using a UAV in light of the chosen research problem, preparation of the mission, flight execution, geoprocessing of acquired aerial imagery, generation of cartometric final products, and analysis of outcomes in order to answer the initially posed research question. This comprehensive approach will allow students, future experts in the field of geoinformatics and cartography, to gain the skills needed to acquire spatial data using a UAV, process them, and apply the results of their analysis in practice.

  3. Microfluidic Sample Preparation for Diagnostic Cytopathology

    PubMed Central

    Mach, Albert J.; Adeyiga, Oladunni B.; Di Carlo, Dino

    2014-01-01

    The cellular components of body fluids are routinely analyzed to identify disease and treatment approaches. While significant focus has been placed on developing cell analysis technologies, tools to automate the preparation of cellular specimens have been more limited, especially for body fluids beyond blood. Preparation steps include separating, concentrating, and exposing cells to reagents. Sample preparation continues to be routinely performed off-chip by technicians, preventing cell-based point-of-care diagnostics, increasing the cost of tests, and reducing the consistency of the final analysis following multiple manually-performed steps. Here, we review the assortment of biofluids for which suspended cells are analyzed, along with their characteristics and diagnostic value. We present an overview of the conventional sample preparation processes for cytological diagnosis. We finally discuss the challenges and opportunities in developing microfluidic devices for the purpose of automating or miniaturizing these processes, with particular emphases on preparing large or small volume samples, working with samples of high cellularity, automating multi-step processes, and obtaining high purity subpopulations of cells. We hope to convey the importance of and help identify new research directions addressing the vast biological and clinical applications in preparing and analyzing the array of available biological fluids. Successfully addressing the challenges described in this review can lead to inexpensive systems to improve diagnostic accuracy while simultaneously reducing overall systemic healthcare costs. PMID:23380972

  4. Algorithms for Image Analysis and Combination of Pattern Classifiers with Application to Medical Diagnosis

    NASA Astrophysics Data System (ADS)

    Georgiou, Harris

    2009-10-01

    Medical Informatics and the application of modern signal processing in the assistance of the diagnostic process in medical imaging is one of the more recent and active research areas today. This thesis addresses a variety of issues related to the general problem of medical image analysis, specifically in mammography, and presents a series of algorithms and design approaches for all the intermediate levels of a modern system for computer-aided diagnosis (CAD). The diagnostic problem is analyzed with a systematic approach, first defining the imaging characteristics and features that are relevant to probable pathology in mammograms. Next, these features are quantified and fused into new, integrated radiological systems that exhibit embedded digital signal processing, in order to improve the final result and minimize the radiological dose for the patient. In a higher level, special algorithms are designed for detecting and encoding these clinically interesting imaging features, in order to be used as input to advanced pattern classifiers and machine learning models. Finally, these approaches are extended in multi-classifier models under the scope of Game Theory and optimum collective decision, in order to produce efficient solutions for combining classifiers with minimum computational costs for advanced diagnostic systems. The material covered in this thesis is related to a total of 18 published papers, 6 in scientific journals and 12 in international conferences.
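    As a stand-in for the multi-classifier combination the thesis discusses, the simplest combination rule, plurality voting, can be sketched as follows; the classifier outputs and labels are hypothetical and do not reproduce the thesis's game-theoretic scheme.

```python
# Hedged sketch: plurality (majority) voting, the simplest classifier-
# combination rule. Per-classifier labels below are hypothetical.
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier labels for one case by plurality vote."""
    return Counter(predictions).most_common(1)[0][0]

# Three hypothetical classifiers' outputs for four mammogram regions.
per_case = [("benign", "benign", "malignant"),
            ("malignant", "malignant", "benign"),
            ("benign", "benign", "benign"),
            ("malignant", "benign", "malignant")]

combined = [majority_vote(p) for p in per_case]
print(combined)
```

    Game-theoretic combination generalizes this by weighting each classifier's vote according to an optimality criterion rather than counting votes equally.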

  5. Reliability analysis of different structure parameters of PCBA under drop impact

    NASA Astrophysics Data System (ADS)

    Liu, P. S.; Fan, G. M.; Liu, Y. H.

    2018-03-01

    A finite element model of the PCBA is established in the analysis software ABAQUS. First, the Input-G method and fatigue life under drop impact are introduced, and the mechanism of solder joint failure during the drop process is analysed. The main cause of solder joint failure is that the PCB assembly suffers repeated tension and compression stress during the drop impact. Finally, the equivalent stress and peel stress of different solder joints and board-level components under different impact accelerations are analysed. The results show that the reliability of tin-silver-copper joints is better than that of tin-lead solder joints, and that the expected fatigue life of the solder joints decreases as the impact pulse amplitude increases.

  6. New approach to gallbladder ultrasonic images analysis and lesions recognition.

    PubMed

    Bodzioch, Sławomir; Ogiela, Marek R

    2009-03-01

    This paper presents a new approach to gallbladder ultrasound image processing and analysis aimed at detecting disease symptoms in the processed images. First, a new method of extracting gallbladder contours from USG images is presented. A major stage in this filtration is segmenting and sectioning off the areas occupied by the organ. In most cases this procedure is based on filtration, which plays a key role in the process of diagnosing pathological changes. Unfortunately, ultrasound images are among the most troublesome to analyse owing to the echogenic inconsistency of the structures under observation. This paper provides an inventive algorithm for the holistic extraction of gallbladder image contours, based on rank filtration as well as on the analysis of histogram sections of the examined organ. The second part concerns detecting lesion symptoms of the gallbladder. Automating a process of diagnosis always comes down to developing algorithms that analyse the object of diagnosis and verify the occurrence of symptoms related to a given affliction. Usually the final stage is to make a diagnosis based on the detected symptoms; this stage can be carried out either by dedicated expert systems or by a more classic pattern analysis approach, such as using rules to determine the illness based on the detected symptoms. This paper discusses the pattern analysis algorithms for gallbladder image interpretation towards classification of the most frequent illness symptoms of this organ.

  7. Combinations of NIR, Raman spectroscopy and physicochemical measurements for improved monitoring of solvent extraction processes using hierarchical multivariate analysis models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nee, K.; Bryan, S.; Levitskaia, T.

    The reliability of chemical processes can be greatly improved by implementing inline monitoring systems. Combining multivariate analysis with non-destructive sensors can enhance the process without interfering with the operation. Here, we present hierarchical models, using both principal component analysis and partial least squares analysis, developed for different chemical components representative of solvent extraction process streams. A training set of 380 samples and an external validation set of 95 samples were prepared, and near-infrared (NIR) and Raman spectral data, as well as conductivity under variable temperature conditions, were collected. The results from the models indicate that careful selection of the spectral range is important. By compressing the data through principal component analysis (PCA), we lower the rank of the data set to its most dominant features while maintaining the key principal components to be used in the regression analysis. Within the studied data set, the concentrations of five chemical components were modeled: total nitrate (NO3-), total acid (H+), neodymium (Nd3+), sodium (Na+), and ionic strength (I.S.). The best overall model prediction for each of the species studied used a combined data set comprising complementary techniques including NIR, Raman, and conductivity. Finally, our study shows that chemometric models are powerful but require a significant amount of carefully analyzed data to capture variations in the chemistry.

  8. Combinations of NIR, Raman spectroscopy and physicochemical measurements for improved monitoring of solvent extraction processes using hierarchical multivariate analysis models

    DOE PAGES

    Nee, K.; Bryan, S.; Levitskaia, T.; ...

    2017-12-28

    The reliability of chemical processes can be greatly improved by implementing inline monitoring systems. Combining multivariate analysis with non-destructive sensors can enhance the process without interfering with the operation. Here, we present hierarchical models, using both principal component analysis and partial least squares analysis, developed for different chemical components representative of solvent extraction process streams. A training set of 380 samples and an external validation set of 95 samples were prepared, and near-infrared (NIR) and Raman spectral data, as well as conductivity under variable temperature conditions, were collected. The results from the models indicate that careful selection of the spectral range is important. By compressing the data through principal component analysis (PCA), we lower the rank of the data set to its most dominant features while maintaining the key principal components to be used in the regression analysis. Within the studied data set, the concentrations of five chemical components were modeled: total nitrate (NO3-), total acid (H+), neodymium (Nd3+), sodium (Na+), and ionic strength (I.S.). The best overall model prediction for each of the species studied used a combined data set comprising complementary techniques including NIR, Raman, and conductivity. Finally, our study shows that chemometric models are powerful but require a significant amount of carefully analyzed data to capture variations in the chemistry.
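    The PCA compression step described above, reducing spectra to their dominant component, can be sketched with a pure-Python power iteration; the "spectra" below are randomly generated placeholders, and a real chemometrics workflow would use a dedicated package.

```python
# Hedged sketch: extracting the first principal component of mean-centered
# "spectral" data via power iteration. The data matrix X is hypothetical.
import math
import random

random.seed(0)
# Hypothetical spectra: rows = samples, columns = wavelength channels.
X = [[random.gauss(0, 1) + 0.5 * j for j in range(6)] for _ in range(20)]

# Mean-center each column (standard PCA preprocessing).
ncols = len(X[0])
means = [sum(row[j] for row in X) / len(X) for j in range(ncols)]
Xc = [[row[j] - means[j] for j in range(ncols)] for row in X]

# Power iteration on X^T X to find the first loading vector.
v = [1.0] * ncols
for _ in range(100):
    scores = [sum(r[j] * v[j] for j in range(ncols)) for r in Xc]       # X v
    w = [sum(scores[i] * Xc[i][j] for i in range(len(Xc)))
         for j in range(ncols)]                                          # X^T X v
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

pc1_scores = [sum(r[j] * v[j] for j in range(ncols)) for r in Xc]
print([round(x, 2) for x in v])
```

    The resulting `pc1_scores` (one per sample) are the low-rank features that a regression stage, such as the PLS step in the record, would consume.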

  9. The Processing and Analysis of the Data from an Air Force Geophysics Laboratory Atmospheric Optical Measurement Station and the Maintenance of the Central Data Logger System.

    DTIC Science & Technology

    1984-02-15

    (The indexed text for this record consists of extraction fragments from the report's table of contents, figure list, and packed data-quality tables. Recoverable content: Part I covers raw data tape processing procedures and experiment sampling sequences; see the Final Report, AFGL-TR-81-0130.)

  10. A simplified and powerful image processing methods to separate Thai jasmine rice and sticky rice varieties

    NASA Astrophysics Data System (ADS)

    Khondok, Piyoros; Sakulkalavek, Aparporn; Suwansukho, Kajpanya

    2018-03-01

    A simplified and powerful set of image processing procedures to separate the paddy of KHAW DOK MALI 105 (Thai jasmine rice) and the paddy of the sticky rice variety RD6 was proposed. The procedures consist of image thresholding, image chain coding, and curve fitting using a polynomial function. From the fitting, three parameters of each variety (perimeter, area, and eccentricity) were calculated. Finally, the overall parameters were evaluated using principal component analysis. The results show that these procedures can effectively separate the two varieties.
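    Two of the shape features named above, area and eccentricity, can be computed from the second-order moments of a binary grain silhouette; the tiny mask below is a hypothetical illustration, not rice imagery.

```python
# Illustrative sketch: area and eccentricity of a shape from image moments.
# The binary mask is a hypothetical stand-in for a segmented grain.
import math

mask = [[0, 1, 1, 1, 1, 0],
        [0, 1, 1, 1, 1, 0],
        [0, 0, 1, 1, 0, 0]]

pts = [(x, y) for y, row in enumerate(mask) for x, v in enumerate(row) if v]
area = len(pts)
cx = sum(x for x, _ in pts) / area    # centroid
cy = sum(y for _, y in pts) / area

# Central second moments (covariance of pixel coordinates).
mxx = sum((x - cx) ** 2 for x, _ in pts) / area
myy = sum((y - cy) ** 2 for _, y in pts) / area
mxy = sum((x - cx) * (y - cy) for x, y in pts) / area

# Eigenvalues of the covariance matrix give the equivalent ellipse axes.
common = math.sqrt(((mxx - myy) / 2) ** 2 + mxy ** 2)
l1 = (mxx + myy) / 2 + common   # major-axis variance
l2 = (mxx + myy) / 2 - common   # minor-axis variance
eccentricity = math.sqrt(1 - l2 / l1)
print(area, round(eccentricity, 3))
```

    Feeding such (perimeter, area, eccentricity) triples per grain into PCA, as the record describes, projects the varieties onto axes along which they separate.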

  11. New atmospheric sensor analysis study

    NASA Technical Reports Server (NTRS)

    Parker, K. G.

    1989-01-01

    The functional capabilities of the ESAD Research Computing Facility are discussed. The system is used in processing atmospheric measurements which are used in the evaluation of sensor performance, conducting design-concept simulation studies, and also in modeling the physical and dynamical nature of atmospheric processes. The results may then be evaluated to furnish inputs into the final design specifications for new space sensors intended for future Spacelab, Space Station, and free-flying missions. In addition, data gathered from these missions may subsequently be analyzed to provide better understanding of requirements for numerical modeling of atmospheric phenomena.

  12. Echo™ User Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey, Dustin Yewell

    Echo™ is a MATLAB-based software package designed for robust and scalable analysis of complex data workflows. An alternative to tedious, error-prone conventional processes, Echo is based on three transformative principles for data analysis: self-describing data, name-based indexing, and dynamic resource allocation. The software takes an object-oriented approach to data analysis, intimately connecting measurement data with associated metadata. Echo operations in an analysis workflow automatically track and merge metadata and computation parameters to provide a complete history of the process used to generate final results, while automated figure and report generation tools eliminate the potential to mislabel those results. History reporting and visualization methods provide straightforward auditability of analysis processes. Furthermore, name-based indexing on metadata greatly improves code readability for analyst collaboration and reduces opportunities for errors to occur. Echo efficiently manages large data sets using a framework that seamlessly allocates resources such that only the necessary computations to produce a given result are executed. Echo provides a versatile and extensible framework, allowing advanced users to add their own tools and data classes tailored to their own specific needs. Applying these transformative principles and powerful features, Echo greatly improves analyst efficiency and quality of results in many application areas.

  13. Automated recognition of helium speech. Phase I: Investigation of microprocessor based analysis/synthesis system

    NASA Astrophysics Data System (ADS)

    Jelinek, H. J.

    1986-01-01

    This is the Final Report of Electronic Design Associates on its Phase I SBIR project, whose purpose is to develop a method for correcting helium speech, as experienced in diver-to-surface communication. The goal of the Phase I study was to design, prototype, and evaluate a real-time helium speech corrector system based upon digital signal processing techniques. The general approach was to develop hardware (an IBM PC board) to digitize helium speech and software (a LAMBDA computer based simulation) to translate the speech. As planned in the study proposal, this initial prototype may now be used to assess the expected performance of a self-contained real-time system using an identical algorithm. The Final Report details the work carried out to produce the prototype system. The major project tasks were: a signal processing scheme for converting helium speech to normal-sounding speech was devised; the scheme was simulated on a general-purpose (LAMBDA) computer, with actual helium speech supplied to the simulation and converted speech generated; an IBM-PC based 14-bit data input/output board was designed and built; and a bibliography of references on speech processing was compiled.

  14. Design and Optimization of Composite Automotive Hatchback Using Integrated Material-Structure-Process-Performance Method

    NASA Astrophysics Data System (ADS)

    Yang, Xudong; Sun, Lingyu; Zhang, Cheng; Li, Lijun; Dai, Zongmiao; Xiong, Zhenkai

    2018-03-01

    The application of polymer composites as a substitute for metal is an effective approach to reducing vehicle weight. However, the final performance of composite structures is determined not only by the material types, structural designs and manufacturing process, but also by their mutual constraints. Hence, an integrated "material-structure-process-performance" method is proposed for the conceptual and detailed design of composite components. The material selection is based on the principles of composite mechanics, such as the rule of mixtures for laminates. The design of component geometry, dimensions and stacking sequence is determined by parametric modeling and size optimization. The selection of process parameters is based on multi-physical-field simulation. The stiffness and modal constraint conditions were obtained from numerical analysis of the metal benchmark under typical load conditions, and the optimal design was found by multi-discipline optimization. Finally, the proposed method was validated by an application case of an automotive hatchback using carbon fiber reinforced polymer. Compared with the metal benchmark, the weight of the composite hatchback is reduced by 38.8%, while its torsion and bending stiffness increase by 3.75% and 33.23%, respectively, and the first natural frequency increases by 44.78%.
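    The rule of mixtures cited for material selection can be stated in a few lines; the fibre and matrix moduli below are generic textbook-style values for carbon fibre and epoxy, not the paper's data.

```python
# Minimal sketch of the rule of mixtures for a unidirectional laminate ply.
# Property values are generic illustrations, not the study's materials.
def rule_of_mixtures(E_f, E_m, V_f):
    """Longitudinal modulus of a unidirectional ply: E1 = Vf*Ef + (1-Vf)*Em."""
    return V_f * E_f + (1.0 - V_f) * E_m

E_fibre, E_matrix = 230.0, 3.5   # GPa, typical carbon fibre / epoxy values
E1 = rule_of_mixtures(E_fibre, E_matrix, V_f=0.6)
print(round(E1, 2))   # 0.6*230 + 0.4*3.5 = 139.4 GPa
```

    Estimates like `E1` feed the laminate-level sizing and stacking-sequence optimization that the integrated method then carries out.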

  15. DIORAMA Communications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galassi, Mark C.

    Diorama is written as a collection of modules that can run in separate threads or in separate processes. This defines a clear interface between the modules and also allows concurrent processing of different parts of the pipeline. The pipeline is determined by a description in a scenario file [Norman and Tornga, 2012; Tornga and Norman, 2014]. The scenario manager parses the XML scenario and sets up the sequence of modules which will generate an event, propagate the signal to a set of sensors, and then run processing modules on the results provided by those sensor simulations. During a run a variety of “observer” and “processor” modules can be invoked to do interim analysis of results. Observers do not modify the simulation results, while processors may affect the final result. At the end of a run, results are collated and final reports are put out. A detailed description of the scenario file and how it puts together a simulation is given in [Tornga and Norman, 2014]. The processing pipeline and how to program it with the Diorama API is described in Tornga et al. [2015] and Tornga and Wakeford [2015]. In this report I describe the communications infrastructure that is used.

  16. Automatic initial and final segmentation in cleft palate speech of Mandarin speakers

    PubMed Central

    Liu, Yin; Yin, Heng; Zhang, Junpeng; Zhang, Jing; Zhang, Jiang

    2017-01-01

    The speech unit segmentation is an important pre-processing step in the analysis of cleft palate speech. In Mandarin, one syllable is composed of two parts: an initial and a final. In cleft palate speech, resonance disorders occur at the finals and the voiced initials, while articulation disorders occur at the unvoiced initials. Thus, the initials and finals are the minimum speech units that can reflect the characteristics of cleft palate speech disorders. In this work, an automatic initial/final segmentation method is proposed as an important preprocessing step in cleft palate speech signal processing. The tested cleft palate speech utterances were collected from the Cleft Palate Speech Treatment Center in the Hospital of Stomatology, Sichuan University, which treats the largest number of cleft palate patients in China. The cleft palate speech data include 824 speech segments, and the control samples contain 228 speech segments. The syllables are first extracted from the speech utterances; the proposed syllable extraction method avoids a training stage and achieves good performance for both voiced and unvoiced speech. The syllables are then classified as having “quasi-unvoiced” or “quasi-voiced” initials, and respective initial/final segmentation methods are proposed for these two types of syllables. Moreover, a two-step segmentation method is proposed in which the rough locations of the syllable and initial/final boundaries are refined in the second step, improving the robustness of the segmentation accuracy. The experiments show that initial/final segmentation accuracies are higher for syllables with quasi-unvoiced initials than for those with quasi-voiced initials. For the cleft palate speech, the mean time error is 4.4 ms for syllables with quasi-unvoiced initials and 25.7 ms for syllables with quasi-voiced initials, and the correct segmentation accuracy P30 over all syllables is 91.69%. For the control samples, P30 over all syllables is 91.24%.
PMID:28926572

  17. Automatic initial and final segmentation in cleft palate speech of Mandarin speakers.

    PubMed

    He, Ling; Liu, Yin; Yin, Heng; Zhang, Junpeng; Zhang, Jing; Zhang, Jiang

    2017-01-01

    The speech unit segmentation is an important pre-processing step in the analysis of cleft palate speech. In Mandarin, one syllable is composed of two parts: an initial and a final. In cleft palate speech, resonance disorders occur at the finals and the voiced initials, while articulation disorders occur at the unvoiced initials. Thus, the initials and finals are the minimum speech units that can reflect the characteristics of cleft palate speech disorders. In this work, an automatic initial/final segmentation method is proposed as an important preprocessing step in cleft palate speech signal processing. The tested cleft palate speech utterances were collected from the Cleft Palate Speech Treatment Center in the Hospital of Stomatology, Sichuan University, which treats the largest number of cleft palate patients in China. The cleft palate speech data include 824 speech segments, and the control samples contain 228 speech segments. The syllables are first extracted from the speech utterances; the proposed syllable extraction method avoids a training stage and achieves good performance for both voiced and unvoiced speech. The syllables are then classified as having "quasi-unvoiced" or "quasi-voiced" initials, and respective initial/final segmentation methods are proposed for these two types of syllables. Moreover, a two-step segmentation method is proposed in which the rough locations of the syllable and initial/final boundaries are refined in the second step, improving the robustness of the segmentation accuracy. The experiments show that initial/final segmentation accuracies are higher for syllables with quasi-unvoiced initials than for those with quasi-voiced initials. For the cleft palate speech, the mean time error is 4.4 ms for syllables with quasi-unvoiced initials and 25.7 ms for syllables with quasi-voiced initials, and the correct segmentation accuracy P30 over all syllables is 91.69%. For the control samples, P30 over all syllables is 91.24%.

  18. PCR-DGGE analysis of lactic acid bacteria and yeast dynamics during the production processes of three varieties of Panettone.

    PubMed

    Garofalo, C; Silvestri, G; Aquilanti, L; Clementi, F

    2008-07-01

    To study lactic acid bacteria (LAB) and yeast dynamics during the production processes of sweet-leavened goods manufactured with type I sourdoughs, fourteen sourdough and dough samples were taken from a baking company in central Italy during the production lines of three varieties of Panettone. The samples underwent pH measurements and plating analysis on three solid media. Microbial DNA was extracted from both the (sour)doughs and the viable LAB and yeast cells collected in bulk, and subjected to PCR-denaturing gradient gel electrophoresis (DGGE) analysis. The molecular fingerprinting of the cultivable plus noncultivable microbial populations provided evidence of the dominance of Lactobacillus sanfranciscensis, Lactobacillus brevis and Candida humilis in the three fermentation processes. The DGGE profiles of the cultivable communities revealed a bacterial shift in the final stages of two of the production processes, suggesting an effect of technological parameters on the selection of the dough microflora. Our findings confirm the importance of using a combined analytical approach to explore microbial communities that develop during the leavening process of sweet-leavened goods. In-depth studies of sourdough biodiversity and population dynamics occurring during sourdough fermentation are fundamental for the control of the leavening process and the manufacture of standardized, high-quality products.

  19. Image analysis and mathematical modelling for the supervision of the dough fermentation process

    NASA Astrophysics Data System (ADS)

    Zettel, Viktoria; Paquet-Durand, Olivier; Hecker, Florian; Hitzmann, Bernd

    2016-10-01

    The fermentation (proof) process of dough is one of the quality-determining steps in the production of baked goods. Besides the fluffiness, whose foundations are laid during fermentation, the flavour of the final product is strongly influenced during this production stage. However, until now no on-line measurement system has been available that can supervise this important process step. In this investigation the potential of an image analysis system is evaluated that enables the determination of the volume of fermenting dough pieces. The camera moves around the fermenting pieces and collects images of the objects from different angles over a 360° range. Using image analysis algorithms, the volume increase of individual dough pieces is determined. Based on a detailed mathematical description of the volume increase, which is based on the Bernoulli equation, the carbon dioxide production rate of the yeast cells and the diffusion processes of carbon dioxide, the fermentation process is supervised. Important process parameters, such as the carbon dioxide production rate of the yeast cells and the dough viscosity, can be estimated after just 300 s of proofing. The mean percentage error for forecasting the further evolution of the relative volume of the dough pieces is just 2.3%. Therefore, a forecast of the further evolution can be performed and used for fault detection.

  20. 3D finite element modelling of sheet metal blanking process

    NASA Astrophysics Data System (ADS)

    Bohdal, Lukasz; Kukielka, Leon; Chodor, Jaroslaw; Kulakowska, Agnieszka; Patyk, Radoslaw; Kaldunski, Pawel

    2018-05-01

    Shearing processes such as the blanking of sheet metals are often used to prepare workpieces for subsequent forming operations. FEM simulation is increasingly used to investigate and optimize the blanking process. In the current literature, owing to the limited capability and large computational cost of three-dimensional (3D) analysis, blanking FEM simulations have been largely limited to two-dimensional (2D) plane and axisymmetric problems. However, significant progress in modelling that takes into account the influence of the real material (e.g. its microstructure) and the physical and technological conditions can be obtained by using 3D numerical analysis methods in this area. The objective of this paper is to present a 3D finite element analysis of ductile fracture, strain distribution and stress in the blanking process under the assumption of geometrical and physical nonlinearities. The physical, mathematical and computer models of the process are elaborated. Dynamic effects, mechanical coupling, a constitutive damage law and contact friction are taken into account. An application in the ANSYS/LS-DYNA program is developed. The effect of the main process parameter, the blanking clearance, on the deformation of 1018 steel and the quality of the blank's sheared edge is analyzed. The results of the computer simulations can be used to forecast the quality of the final parts and to optimize the process.

  1. A Meta-Analysis and Review of Holistic Face Processing

    PubMed Central

    Richler, Jennifer J.; Gauthier, Isabel

    2014-01-01

    The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, two different measures of holistic processing based on the composite paradigm (complete design and partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the two designs, and which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs, and use this as further evidence for one design over the other. The meta-analysis effect size of holistic processing in the complete design is nearly three times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task, and suggests that in an individual differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the first sections of our review—the complete design—and outline outstanding research questions in that new context. PMID:24956123

  2. Parallel processing considerations for image recognition tasks

    NASA Astrophysics Data System (ADS)

    Simske, Steven J.

    2011-01-01

    Many image recognition tasks are well suited to parallel processing. The most obvious example is that many imaging tasks require the analysis of multiple images; from this standpoint, parallel processing need be no more complicated than assigning individual images to individual processors. However, there are three less trivial categories of parallel processing that are considered in this paper: parallel processing (1) by task; (2) by image region; and (3) by meta-algorithm. Parallel processing by task allows the assignment of multiple workflows, as diverse as optical character recognition (OCR), document classification and barcode reading, to parallel pipelines. This can substantially decrease time to completion for the document tasks. In this approach, each parallel pipeline generally performs a different task. Parallel processing by image region allows a larger imaging task to be sub-divided into a set of parallel pipelines, each performing the same task but on a different data set. This type of image analysis is readily addressed by a map-reduce approach; examples include document skew detection and multiple face detection and tracking. Finally, parallel processing by meta-algorithm allows different algorithms to be deployed on the same image simultaneously. This approach may result in improved accuracy.
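    The second category, parallel processing by image region with a map-reduce flavor, can be sketched as follows. The tiling scheme and the per-tile task (a bright-pixel count standing in for a real detector) are illustrative assumptions; a thread pool is used here for portability, and a process pool would be a drop-in replacement for CPU-bound tasks:

```python
import numpy as np
from multiprocessing.pool import ThreadPool

def tile_regions(h, w, rows, cols):
    """Split an h-by-w image into a rows-by-cols grid of slice pairs."""
    rs = np.linspace(0, h, rows + 1, dtype=int)
    cs = np.linspace(0, w, cols + 1, dtype=int)
    return [(slice(rs[i], rs[i + 1]), slice(cs[j], cs[j + 1]))
            for i in range(rows) for j in range(cols)]

def count_bright(args):
    """Per-region task (the "map" step): count pixels above a threshold."""
    tile, thresh = args
    return int(np.sum(tile > thresh))

def parallel_bright_count(img, thresh=128, rows=2, cols=2, workers=4):
    """Run the same task on each image region in parallel, then reduce."""
    jobs = [(img[r, c], thresh) for r, c in tile_regions(*img.shape, rows, cols)]
    with ThreadPool(workers) as pool:
        partial = pool.map(count_bright, jobs)
    return sum(partial)  # the "reduce" step
```

    Because the tiles partition the image exactly, the reduced result equals the result of running the task on the whole image at once.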

  3. Sensor fault diagnosis of aero-engine based on divided flight status.

    PubMed

    Zhao, Zhen; Zhang, Jun; Sun, Yigang; Liu, Zhexu

    2017-11-01

    Fault diagnosis and safety analysis of aero-engines, whose safety directly affects the flight safety of an aircraft, have attracted increasing attention. In this paper, the problem of sensor fault diagnosis is investigated for an aero-engine over the whole flight process. Considering that the aero-engine works in different statuses throughout the flight, a flight status division-based sensor fault diagnosis method is presented to improve fault diagnosis precision. First, the aero-engine statuses are partitioned from normal sensor data covering the whole flight process using a clustering algorithm. Based on that, a diagnosis model is built for each status using principal component analysis. Finally, the sensors are monitored with the built diagnosis models by identifying the current aero-engine status. Simulation results illustrate the effectiveness of the proposed method.
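    The two-stage scheme, clustering normal data into statuses and fitting one PCA model per status, can be sketched in plain NumPy. The k-means initialization, component count, and squared-prediction-error (SPE) monitoring score below are illustrative choices, not the paper's exact algorithm:

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Minimal k-means with deterministic farthest-point initialization."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([((X - c) ** 2).sum(-1) for c in centers], axis=0)
        centers.append(X[int(np.argmax(d))])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) for j in range(k)])
    return centers, labels

def fit_pca_models(X, labels, k, n_comp=1):
    """One PCA model (mean + leading principal axes) per status."""
    models = []
    for j in range(k):
        Xj = X[labels == j]
        mu = Xj.mean(0)
        _, _, Vt = np.linalg.svd(Xj - mu, full_matrices=False)
        models.append((mu, Vt[:n_comp]))
    return models

def spe(x, centers, models):
    """Assign x to its nearest status, then return the squared prediction
    error (residual) under that status's PCA model."""
    j = int(np.argmin(((x - centers) ** 2).sum(-1)))
    mu, P = models[j]
    r = (x - mu) - (x - mu) @ P.T @ P
    return float(r @ r)
```

    A new sample is first assigned to its nearest status center; a large SPE under that status's PCA model then flags a possible sensor fault.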

  4. Intershot Analysis of Flows in DIII-D

    NASA Astrophysics Data System (ADS)

    Meyer, W. H.; Allen, S. L.; Samuell, C. M.; Howard, J.

    2016-10-01

    Analysis of the DIII-D flow diagnostic data requires demodulation of interference images and inversion of the resultant line-integrated emissivity and flow (phase) images. Four response matrices are pre-calculated: the emissivity line integral and the line integrals of the scalar product of the lines-of-sight with the orthogonal unit vectors of parallel flow. Equilibrium data determine the relative weight of the component matrices used in the final flow inversion matrix. Serial processing has been used for the 800×600-pixel image of the lower-divertor-viewing flow camera. The full-cross-section-viewing camera will require parallel processing of its 2160×2560-pixel image. We will discuss using a POSIX thread pool and a Tesla K40c GPU in the processing of these data. Prepared by LLNL under Contract DE-AC52-07NA27344. This material is based upon work supported by the U.S. DOE, Office of Science, Fusion Energy Sciences.

  5. Sensor fault diagnosis of aero-engine based on divided flight status

    NASA Astrophysics Data System (ADS)

    Zhao, Zhen; Zhang, Jun; Sun, Yigang; Liu, Zhexu

    2017-11-01

    Fault diagnosis and safety analysis of aero-engines, whose safety directly affects the flight safety of an aircraft, have attracted increasing attention. In this paper, the problem of sensor fault diagnosis is investigated for an aero-engine over the whole flight process. Considering that the aero-engine works in different statuses throughout the flight, a flight status division-based sensor fault diagnosis method is presented to improve fault diagnosis precision. First, the aero-engine statuses are partitioned from normal sensor data covering the whole flight process using a clustering algorithm. Based on that, a diagnosis model is built for each status using principal component analysis. Finally, the sensors are monitored with the built diagnosis models by identifying the current aero-engine status. Simulation results illustrate the effectiveness of the proposed method.

  6. Perspective: Optical measurement of feature dimensions and shapes by scatterometry

    NASA Astrophysics Data System (ADS)

    Diebold, Alain C.; Antonelli, Andy; Keller, Nick

    2018-05-01

    The use of optical scattering to measure feature shape and dimensions, known as scatterometry, is now routine during semiconductor manufacturing. Scatterometry iteratively improves an optical model structure using simulations that are compared to experimental data from an ellipsometer. These simulations are done using rigorous coupled wave analysis to solve Maxwell's equations. In this article, we describe Mueller matrix spectroscopic ellipsometry-based scatterometry. Next, the rigorous coupled wave analysis for Maxwell's equations is presented. Following this, several example measurements are described as they apply to specific process steps in the fabrication of gate-all-around (GAA) transistor structures. First, simulations of measurement sensitivity for the inner spacer etch-back step of horizontal GAA transistor processing are described. Next, the simulated metrology sensitivity for the sacrificial (dummy) amorphous silicon etch-back step of vertical GAA transistor processing is discussed. Finally, we present the application of plasmonically active test structures for improving the sensitivity of metal linewidth measurements.

  7. Influence of plasma shock wave on the morphology of laser drilling in different environments

    NASA Astrophysics Data System (ADS)

    Zhai, Zhaoyang; Wang, Wenjun; Mei, Xuesong; Wang, Kedian; Yang, Huizhu

    2017-05-01

    A nanosecond pulsed laser was used to study drilling of a nickel-based alloy and to compare the processing results of microholes drilled in air and in water. Analysis and comparison showed that the environmental medium had an obvious influence on the morphology of the laser-drilled holes. A high-speed camera was used to record the plasma morphology during the laser drilling process, theoretical formulas were used to calculate the boundary dimensions of the plasma and the shock wave velocity, and these parameters were then fed into computational fluid dynamics simulation software to obtain solutions. The analysis results intuitively explain, from the perspective of plasma shock waves, the different morphological features observed in the experiments and the reasons they form when drilling in air versus water. Comparing the simulation results with the experimental results helps in understanding the formation mechanism of the microhole morphology, thus providing a basis for further optimizing laser drilling quality.

  8. Fuel ethanol production: process design trends and integration opportunities.

    PubMed

    Cardona, Carlos A; Sánchez, Oscar J

    2007-09-01

    Current fuel ethanol research and development deals with process engineering trends for improving biotechnological production of ethanol. In this work, the key role that process design plays during the development of cost-effective technologies is recognized through the analysis of major trends in process synthesis, modeling, simulation and optimization related to ethanol production. Main directions in the techno-economic evaluation of fuel ethanol processes are described, as well as some prospective configurations. The most promising alternatives for compensating ethanol production costs by the generation of valuable co-products are analyzed. Opportunities for integration of fuel ethanol production processes and their implications are underlined. Main ways of process intensification through reaction-reaction, reaction-separation and separation-separation processes are analyzed in the case of bioethanol production. Some examples of energy integration during ethanol production are also highlighted. Finally, some concluding considerations on current and future research tendencies in fuel ethanol production regarding process design and integration are presented.

  9. Talent identification and selection process of outfield players and goalkeepers in a professional soccer club.

    PubMed

    Gil, Susana María; Zabala-Lili, Jon; Bidaurrazaga-Letona, Iraia; Aduna, Badiola; Lekue, Jose Antonio; Santos-Concejero, Jordan; Granados, Cristina

    2014-12-01

    The aim of this study was to analyse the talent identification process of a professional soccer club. A preselection of players (n = 64) aged 9-10 years and a final selection (n = 21) were performed by the technical staff through observation during training sessions and matches. In addition, 34 age-matched players from an open soccer camp (CampP) acted as controls. All participants underwent anthropometric, maturity and performance measurements. Preselected outfield players (OFs) were older and leaner than CampP (P < 0.05). They also performed better in velocity, agility, endurance and jump tests (P < 0.05). A discriminant analysis showed that velocity and agility were the most important parameters. The selected OFs were older and displayed better agility and endurance than the nonselected OFs (P < 0.05). Goalkeepers (GKs) were taller and heavier and had more body fat than OFs; they also performed worse in the physical tests (P < 0.05). The selected GKs were older and taller, had a higher predicted height and advanced maturity, and performed better in the handgrip (dynamometry) and jump tests (P < 0.05). Thus, the technical staff selected OFs with a particular anthropometry and the best performance, particularly in agility and endurance, while GKs had a different profile. Moreover, chronological age played an important role in the whole selection process.

  10. Comparative analysis of rationale used by dentists and patient for final esthetic outcome of dental treatment.

    PubMed

    Reddy, S Varalakshmi; Madineni, Praveen Kumar; Sudheer, A; Gujjarlapudi, Manmohan Choudary; Sreedevi, B; Reddy, Patelu Sunil Kumar

    2013-05-01

    To compare and evaluate the perceptions of esthetics among dentists and patients regarding the final esthetic outcome of a dental treatment. Esthetics is a matter of perception, associated with the way different people look at an object; what is esthetic to one person may not be acceptable to another, so it is subjective in nature. This becomes most obvious during the post-treatment evaluation of esthetics by the dentist and the patient concerned, whose opinions seldom match. Hence, this study is a necessary part of understanding how dentists and patients think about what constitutes esthetics. A survey was conducted by means of a questionnaire consisting of 10 questions on two groups. The first group consisted of 100 dentists picked at random in Kanyakumari district of Tamil Nadu, India. The second group consisted of 100 patients who required complete denture prostheses, divided into two subgroups: subgroup A of 50 men and subgroup B of 50 women. In each subgroup, 25 patients were aged 40 to 50 and 25 were aged 50 to 60. Both groups filled in the questionnaire, and the responses were statistically analyzed using Student's t-test to look for patterns in their thinking. Perceptions of esthetics differ between a dentist, who is educated in the esthetic principles of treatment, and a patient, who has not received such education. Since the questions were formulated so that patients could understand the underlying problem, the outcome of the survey demonstrates that dentists need to take into account what the patient regards as esthetic in order to provide a satisfactory treatment.
CLINICAL AND ACADEMIC SIGNIFICANCE: The current study helps the dentist to better educate the patient regarding esthetics, so that the patient appreciates the final, scientifically based esthetic outcome of treatment. It also helps dental students to understand the patient's underlying thought process regarding esthetics.
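    The statistical comparison above relies on Student's t-test. As a generic illustration (with made-up numbers, not the survey data), the pooled two-sample statistic can be computed as:

```python
import numpy as np

def students_t(a, b):
    """Pooled two-sample Student's t statistic and its degrees of freedom."""
    na, nb = len(a), len(b)
    # pooled variance from the two sample variances (ddof=1 gives unbiased s^2)
    sp2 = (((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
           / (na + nb - 2))
    t = (np.mean(a) - np.mean(b)) / np.sqrt(sp2 * (1.0 / na + 1.0 / nb))
    return float(t), na + nb - 2
```

    The statistic is then compared against the t distribution with the returned degrees of freedom to obtain a P value.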

  11. Practical Strategies for Integrating Final Ecosystem Goods and ...

    EPA Pesticide Factsheets

    The concept of Final Ecosystem Goods and Services (FEGS) explicitly connects ecosystem services to the people that benefit from them. This report presents a number of practical strategies for incorporating FEGS, and more broadly ecosystem services, into the decision-making process. Whether a decision process is in early or late stages, or whether a process includes informal or formal decision analysis, there are multiple points where ecosystem services concepts can be integrated. This report uses Structured Decision Making (SDM) as an organizing framework to illustrate the role ecosystem services can play in a values-focused decision process, including:
    • Clarifying the decision context: Ecosystem services can help clarify the potential impacts of an issue on natural resources together with their spatial and temporal extent based on supply and delivery of those services, and help identify beneficiaries for inclusion as stakeholders in the deliberative process.
    • Defining objectives and performance measures: Ecosystem services may directly represent stakeholder objectives, or may be means toward achieving other objectives.
    • Creating alternatives: Ecosystem services can bring to light creative alternatives for achieving other social, economic, health, or general well-being objectives.
    • Estimating consequences: Ecosystem services assessments can implement ecological production functions (EPFs) and ecological benefits functions (EBFs) to link decision alt

  12. A 45° saw-dicing process applied to a glass substrate for wafer-level optical splitter fabrication for optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Maciel, M. J.; Costa, C. G.; Silva, M. F.; Gonçalves, S. B.; Peixoto, A. C.; Ribeiro, A. Fernando; Wolffenbuttel, R. F.; Correia, J. H.

    2016-08-01

    This paper reports on the development of a technology for the wafer-level fabrication of an optical Michelson interferometer, which is an essential component in a micro opto-electromechanical system (MOEMS) for a miniaturized optical coherence tomography (OCT) system. The MOEMS consists of a titanium dioxide/silicon dioxide dielectric beam splitter and chromium/gold micro-mirrors. These optical components are deposited on 45° tilted surfaces to allow the horizontal/vertical separation of the incident beam in the final micro-integrated system. The fabrication process consists of 45° saw dicing of a glass substrate and the subsequent deposition of dielectric multilayers and metal layers. The 45° saw dicing is fully characterized in this paper, including an analysis of the surface roughness. The optimum process results in surfaces with a roughness of 19.76 nm (rms); the saw dicing process for a high-quality final surface results from a compromise between the dicing blade's grit size (#1200) and the cutting speed (0.3 mm s-1). The proposed wafer-level fabrication allows rapid and low-cost processing, high compactness and the possibility of wafer-level alignment/assembly with other optical micro components for OCT integrated imaging.

  13. Measurement of muon plus proton final states in ν μ interactions on hydrocarbon at < E ν > = 4.2 GeV

    DOE PAGES

    Walton, T.

    2015-04-01

    A study of charged-current muon neutrino scattering on hydrocarbon in which the final state includes a muon, at least one proton, and no pions is presented. Although this signature has the topology of neutrino quasielastic scattering from neutrons, the event sample contains contributions from quasielastic and inelastic processes in which pions are absorbed in the nucleus. The analysis accepts events with muon production angles up to 70° and proton kinetic energies greater than 110 MeV. The cross section, when based completely on hadronic kinematics, is well described by a relativistic Fermi gas nuclear model including the neutrino event generator modeling for inelastic processes and particle transportation through the nucleus. This is in contrast to the quasielastic cross section based on muon kinematics, which is best described by an extended model that incorporates multinucleon correlations. As a result, this measurement guides the formulation of a complete description of neutrino-nucleus interactions that encompasses the hadronic as well as the leptonic aspects of this process.

  14. A study with ESI PAM-STAMP® on the influence of tool deformation on final part quality during a forming process

    NASA Astrophysics Data System (ADS)

    Vrolijk, Mark; Ogawa, Takayuki; Camanho, Arthur; Biasutti, Manfredi; Lorenz, David

    2018-05-01

    As a result of the ever-increasing demand to produce lighter vehicles, more and more advanced high-strength materials are used in the automotive industry. Focusing on sheet metal cold forming processes, these materials require high pressing forces and exhibit large springback after forming. Due to the high pressing forces, deformations occur in the tooling geometry, introducing dimensional inaccuracies in the blank and potentially affecting the final springback behavior. As a result, tool deformations can have an impact on the final assembly or introduce cosmetic defects. Often several iterations are required in try-out to obtain the required tolerances, with costs of up to as much as 30% of the entire product development cost. To investigate sheet metal part feasibility and quality, CAE tools are widely used in the automotive industry. However, in current practice the influence of tool deformations on the final part quality is generally neglected, and simulations are carried out with rigid tools to avoid drastically increased calculation times. When tool deformation is analyzed through simulation, it is normally done at the end of the drawing process, when contact conditions are mapped onto the die structure and a static analysis is performed to check the deflections of the tool; this method does not predict the influence of these deflections on the final quality of the part. In order to take tool deformations into account during drawing simulations, ESI has developed the ability to couple solvers efficiently so that tool deformations can be included in the drawing simulation in real time, without a large increase in simulation time compared to simulations with rigid tools. In this paper a study is presented which demonstrates the effect of tool deformations on the final part quality.

  15. Final report on the Seventh International Comparison of Absolute Gravimeters (ICAG 2005)

    USGS Publications Warehouse

    Jiang, Z.; Francis, O.; Vitushkin, L.; Palinkas, V.; Germak, A.; Becker, M.; D'Agostino, G.; Amalvict, M.; Bayer, R.; Bilker-Koivula, M.; Desogus, S.; Faller, J.; Falk, R.; Hinderer, J.; Gagnon, C.; Jakob, T.; Kalish, E.; Kostelecky, J.; Lee, C.; Liard, J.; Lokshyn, Y.; Luck, B.; Makinen, J.; Mizushima, S.; Le, Moigne N.; Origlia, C.; Pujol, E.R.; Richard, P.; Robertsson, L.; Ruess, D.; Schmerge, D.; Stus, Y.; Svitlov, S.; Thies, S.; Ullrich, C.; Van Camp, M.; Vitushkin, A.; Ji, W.; Wilmes, H.

    2011-01-01

    The Bureau International des Poids et Mesures (BIPM), Sèvres, France, hosted the 7th International Comparison of Absolute Gravimeters (ICAG) and the associated Relative Gravity Campaign (RGC) from August to September 2005. ICAG 2005 was prepared and performed as a metrological pilot study, which aimed to determine the gravity comparison reference values, to determine the offsets of the absolute gravimeters, and, as a pilot study, to accumulate experience for the CIPM Key Comparisons. This document presents a complete and extensive review of the technical protocol and data processing procedures. The 1st ICAG-RGC comparison was held at the BIPM in 1980-1981, and meetings have since been organized every 4 years. In this paper, we present an overview of how the comparison was organized, the conditions of the BIPM gravimetric sites, the technical specifications, the data processing strategy and an analysis of the final results. This 7th ICAG final report supersedes all previously published reports. Readings were obtained from the participating instruments: 19 absolute gravimeters and 15 relative gravimeters. Precise levelling measurements were carried out, and all measurements were performed on the BIPM micro-gravity network, which was specifically designed for the comparison. © 2011 BIPM & IOP Publishing Ltd.

  16. The influence of composition and final pyrolysis temperature variations on global kinetics of combustion of segregated municipal solid waste

    NASA Astrophysics Data System (ADS)

    Pranoto; Himawanto, D. A.; Arifin, N. A.

    2017-04-01

    The combustion of segregated municipal solid waste (MSW) and of the char resulting from its pyrolysis were investigated in this research. The segregated MSW collected and used can be divided into organic and inorganic waste materials: the organic materials were bamboo and banana leaves, and the inorganic materials were Styrofoam and snack wrappings. The composition ratio of the waste was based on the percentage by weight of each sample. The thermal behaviour of the segregated MSW was investigated by thermogravimetric analysis. For the pyrolysis process, prepared samples of 200 g were heated from ambient temperature to final pyrolysis temperatures of 550 °C, 650 °C and 750 °C at a constant heating rate of 25 °C/min. It was found that the highest activation energy is achieved by sample CC1 (char with 100% inorganic materials). The activation energy of the raw materials is relatively lower than that of the char, and the higher the final pyrolysis temperature, the lower the calorific value of the char. The calorific value gradually increases with the amount of inorganic materials.

  17. Bullying in Virtual Learning Communities.

    PubMed

    Nikiforos, Stefanos; Tzanavaris, Spyros; Kermanidis, Katia Lida

    2017-01-01

    Bullying through the internet has been investigated and analyzed mainly in the field of social media. In this paper, we attempt to analyze bullying in Virtual Learning Communities using Natural Language Processing (NLP) techniques, mainly in the context of sociocultural learning theories; to this end, four case studies were conducted. We aim to apply NLP techniques to speech analysis on communication data of online communities. Emphasis is given to qualitative data, taking into account the subjectivity of the collaborative activity. Finally, this is the first time such an analysis has been attempted on Greek data.

  18. Environmental Assessment, Search and Rescue Training, HH-60 and HC-130, 920th Rescue Group, 301st and 39th Rescue Squadrons, Patrick Air Force Base, Florida

    DTIC Science & Technology

    2003-10-01

    920th RQG Final Environmental Assessment i FINDING OF NO SIGNIFICANT IMPACT (FONSI) FOR THE ENVIRONMENTAL ASSESSMENT FOR THE TRAINING OF THE...Regulations 32 Part 989 (Environmental Impact Analysis Process, July 1999), the 920th RQG has requested the U. S. Air Force (USAF), 45th Space Wing...45SW) to conduct an environmental impact analysis of their Proposed Action on PAFB, CCAFS, APAFR, TSR, the Banana River, and the Atlantic Ocean in

  19. Analysis on influencing factors of EV charging station planning based on AHP

    NASA Astrophysics Data System (ADS)

    Yan, F.; Ma, X. F.

    2016-08-01

    As a new means of transport, the electric vehicle (EV) is of great significance for alleviating the energy crisis, and EV charging station planning has far-reaching significance for the development of the EV industry. This paper analyzes the factors influencing EV charging station planning and then applies the analytic hierarchy process (AHP) to these factors, obtaining the weight of each influencing factor and providing a basis for evaluating EV charging station planning schemes.
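    The AHP weighting step can be sketched as follows: build a pairwise comparison matrix from expert judgments, take the principal eigenvector as the weight vector, and check the consistency of the judgments. The 3-factor matrix in the usage below is a generic textbook-style example, since the paper's actual factors and judgment values are not given here:

```python
import numpy as np

def ahp_weights(A):
    """Principal-eigenvector weights of a pairwise comparison matrix A.
    Returns the normalized weight vector and the largest eigenvalue."""
    vals, vecs = np.linalg.eig(A)
    i = int(np.argmax(vals.real))
    w = np.abs(vecs[:, i].real)
    return w / w.sum(), vals[i].real

def consistency_ratio(A, lam_max):
    """CR = CI / RI, using Saaty's random-index table for small matrices."""
    n = len(A)
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
    ci = (lam_max - n) / (n - 1)
    return ci / ri if ri else 0.0
```

    By convention, a consistency ratio below 0.1 indicates acceptably consistent pairwise judgments.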

  20. A new user-friendly visual environment for breast MRI data analysis.

    PubMed

    Antonios, Danelakis; Dimitrios, Verganelakis A; Theoharis, Theoharis

    2013-06-01

    In this paper a novel, user-friendly visual environment for breast MRI data analysis (BreDAn) is presented. Given planar MRI images acquired before and after IV contrast medium injection, BreDAn generates kinematic graphs and color maps of signal increase and decrease, and finally detects high-risk breast areas. The advantage of BreDAn, which has been validated and tested successfully, is the automation of the radiodiagnostic process in an accurate and reliable manner, which can potentially reduce radiologists' workload. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  1. Design and Manufacturing of Composite Tower Structure for Wind Turbine Equipment

    NASA Astrophysics Data System (ADS)

    Park, Hyunbum

    2018-02-01

This study proposes a composite tower design process for large wind turbine equipment. Structural design of the tower and analysis using the finite element method were performed, followed by prototype manufacturing and testing. The material used is a glass fiber and epoxy resin composite, with sand in the middle section. The structural design was optimized for weight reduction and structural safety. Finally, the tower structure will be confirmed by structural testing.

  2. Thermal decomposition kinetics of hydrazinium cerium 2,3-Pyrazinedicarboxylate hydrate: a new precursor for CeO2.

    PubMed

    Premkumar, Thathan; Govindarajan, Subbiah; Coles, Andrew E; Wight, Charles A

    2005-04-07

The thermal decomposition kinetics of N(2)H(5)[Ce(pyrazine-2,3-dicarboxylate)(2)(H(2)O)] (Ce-P) have been studied for the first time by thermogravimetric analysis (TGA) and differential scanning calorimetry (DSC). TGA reveals an oxidative decomposition process yielding CeO(2) as the final product, with an activation energy of approximately 160 kJ mol(-1). Owing to its low decomposition temperature, this complex may be used as a precursor to fine-particle cerium oxides.

  3. Description of data on the Nimbus 7 LIMS map archive tape: Water vapor and nitrogen dioxide

    NASA Technical Reports Server (NTRS)

    Haggard, Kenneth V.; Marshall, B. T.; Kurzeja, Robert J.; Remsberg, Ellis E.; Russell, James M., III

    1988-01-01

Described is the process by which analyses of the Limb Infrared Monitor of the Stratosphere (LIMS) experiment data were used to produce synoptic maps of water vapor and nitrogen dioxide. In addition to a detailed description of the analysis procedure, several interesting features in the data are discussed to demonstrate how the procedure produced the final maps and how the uncertainties in the maps can be estimated. Features of the analysis that would influence how one might use or interpret the results, such as smoothing and the interpretation of wave components, are also noted.

  4. The Pan-STARRS PS1 Image Processing Pipeline

    NASA Astrophysics Data System (ADS)

    Magnier, E.

The Pan-STARRS PS1 Image Processing Pipeline (IPP) performs the image processing and data analysis tasks needed to enable the scientific use of the images obtained by the Pan-STARRS PS1 prototype telescope. The primary goals of the IPP are to process the science images from the Pan-STARRS telescopes and make the results available to other systems within Pan-STARRS. It is also responsible for combining all of the science images in a given filter into a single representation of the non-variable component of the night sky, defined as the "Static Sky". To achieve these goals, the IPP also performs other analysis functions to generate the calibrations needed in the science image processing, and occasionally uses the derived data to generate improved astrometric and photometric reference catalogs. It also provides the infrastructure needed to store the incoming data and the resulting data products. The IPP inherits lessons learned, and in some cases code and prototype code, from several other astronomy image analysis systems, including Imcat (Kaiser), the Sloan Digital Sky Survey (REF), the Elixir system (Magnier & Cuillandre), and Vista (Tonry). Imcat and Vista have a large number of robust image processing functions. SDSS has demonstrated a working analysis pipeline and large-scale database system for a dedicated project. The Elixir system has demonstrated an automatic image processing system and an object database system for operational usage. This talk will present an overview of the IPP architecture, functional flow, code development structure, and selected analysis algorithms. Also discussed is the highly parallel hardware configuration necessary to support PS1 operational requirements. Finally, results are presented of the processing of images collected during PS1 early commissioning tasks utilizing the Pan-STARRS Test Camera #3.

  5. Application of ISO22000, failure mode, and effect analysis (FMEA) cause and effect diagrams and pareto in conjunction with HACCP and risk assessment for processing of pastry products.

    PubMed

    Varzakas, Theodoros H

    2011-09-01

The Failure Mode and Effect Analysis (FMEA) model has been applied for the risk assessment of pastry processing. A tentative approach of FMEA application to the pastry industry was attempted in conjunction with ISO22000. Preliminary Hazard Analysis was used to analyze and predict the occurring failure modes in a food chain system (pastry processing plant), based on the functions, characteristics, and/or interactions of the ingredients or the processes upon which the system depends. Critical Control Points have been identified and implemented in the cause and effect diagram (also known as the Ishikawa, tree, or fishbone diagram). In this work a comparison of ISO22000 analysis with HACCP is carried out over pastry processing and packaging. However, the main emphasis was put on the quantification of risk assessment by determining the Risk Priority Number (RPN) per identified processing hazard. Storage of raw materials, storage of final products at -18°C, and freezing were the processes identified as those with the highest RPN (225, 225, and 144, respectively), and corrective actions were undertaken. Following the application of corrective actions, a second calculation of RPN values was carried out, leading to considerably lower values (below the upper acceptable limit of 130). It is noteworthy that the application of the Ishikawa (cause and effect, or tree) diagram led to converging results, thus corroborating the validity of conclusions derived from risk assessment and FMEA. Therefore, the incorporation of FMEA analysis within the ISO22000 system of a pastry processing industry is considered imperative.
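The RPN quantification described above can be sketched as follows. The severity/occurrence/detection ratings are hypothetical, chosen only so the products reproduce the RPNs quoted in the abstract (225, 225, 144); they are not the paper's actual FMEA worksheet.

```python
RPN_LIMIT = 130   # upper acceptable limit quoted in the abstract

def rpn(severity, occurrence, detection):
    # Each factor is rated on a 1-10 scale in a typical FMEA worksheet.
    return severity * occurrence * detection

# Hypothetical ratings reproducing the quoted RPNs; not the paper's data.
hazards = {
    "storage of raw materials": (9, 5, 5),
    "storage of final products at -18 C": (9, 5, 5),
    "freezing": (8, 6, 3),
}
needs_action = [name for name, (s, o, d) in hazards.items()
                if rpn(s, o, d) > RPN_LIMIT]
```

All three hazards exceed the limit of 130, so all three would be flagged for corrective action, matching the narrative above.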

  6. Unbiased plasma metabolomics reveal the correlation of metabolic pathways and Prakritis of humans.

    PubMed

    Shirolkar, Amey; Chakraborty, Sutapa; Mandal, Tusharkanti; Dabur, Rajesh

    2017-11-25

Ayurveda, an ancient Indian medicinal system, categorizes human body constitutions into three broad constitutional types (prakritis): Vata, Pitta and Kapha. The aim was to analyze plasma metabolites and related pathways to identify Prakriti-specific dominant marker metabolites and metabolic pathways. 38 healthy male individuals were assessed for dominant Prakriti and their fasting blood samples were collected. The processed plasma samples were subjected to rapid resolution liquid chromatography-electrospray ionization-quadrupole time of flight mass spectrometry (RRLC-ESI-QTOFMS). Mass profiles were aligned and subjected to multivariate analysis. A partial least squares discriminant analysis (PLS-DA) model showed 97.87% recognition capability. The list of PLS-DA metabolites was subjected to permutative Benjamini-Hochberg false discovery rate (FDR) correction, and a final list of 76 metabolites with p < 0.05 and fold-change > 2.0 was identified. Pathway analysis using the Metascape and JEPETTO plugins in Cytoscape revealed that steroidal hormone biosynthesis, amino acid, and arachidonic acid metabolism are the major pathways varying with constitution. Biological GO process analysis showed that aromatic amino acid, sphingolipid, and pyrimidine nucleotide metabolic processes were dominant in the Kapha type of body constitution. Fat-soluble vitamin, cellular amino acid, and androgen biosynthesis processes, along with branched-chain amino acid and glycerolipid catabolic processes, were dominant in Pitta type individuals. Vata Prakriti was found to have dominant catecholamine, arachidonic acid and hydrogen peroxide metabolic processes. Neurotransmission and oxidative stress in Vata; BCAA catabolic, androgen, and xenobiotic metabolic processes in Pitta; and aromatic amino acid, sphingolipid, and pyrimidine metabolic processes in Kapha Prakriti were the dominant marker pathways. Copyright © 2017 Transdisciplinary University, Bangalore and World Ayurveda Foundation. Published by Elsevier B.V. All rights reserved.

  7. Fermentanomics: Relating quality attributes of a monoclonal antibody to cell culture process variables and raw materials using multivariate data analysis.

    PubMed

    Rathore, Anurag S; Kumar Singh, Sumit; Pathak, Mili; Read, Erik K; Brorson, Kurt A; Agarabi, Cyrus D; Khan, Mansoor

    2015-01-01

    Fermentanomics is an emerging field of research and involves understanding the underlying controlled process variables and their effect on process yield and product quality. Although major advancements have occurred in process analytics over the past two decades, accurate real-time measurement of significant quality attributes for a biotech product during production culture is still not feasible. Researchers have used an amalgam of process models and analytical measurements for monitoring and process control during production. This article focuses on using multivariate data analysis as a tool for monitoring the internal bioreactor dynamics, the metabolic state of the cell, and interactions among them during culture. Quality attributes of the monoclonal antibody product that were monitored include glycosylation profile of the final product along with process attributes, such as viable cell density and level of antibody expression. These were related to process variables, raw materials components of the chemically defined hybridoma media, concentration of metabolites formed during the course of the culture, aeration-related parameters, and supplemented raw materials such as glucose, methionine, threonine, tryptophan, and tyrosine. This article demonstrates the utility of multivariate data analysis for correlating the product quality attributes (especially glycosylation) to process variables and raw materials (especially amino acid supplements in cell culture media). The proposed approach can be applied for process optimization to increase product expression, improve consistency of product quality, and target the desired quality attribute profile. © 2015 American Institute of Chemical Engineers.

  8. A marker-free system for the analysis of movement disabilities.

    PubMed

    Legrand, L; Marzani, F; Dusserre, L

    1998-01-01

A major step toward improving the treatment of disabled persons may be achieved by using motion analysis equipment. We are developing such a system, which allows the analysis of planar human motion (e.g. gait) without tracking markers. The system is composed of one fixed camera which acquires an image sequence of a human in motion. Processing is then divided into two steps: first, a large number of pixels belonging to the boundaries of the human body are extracted at each acquisition time. Second, a two-dimensional model of the human body, based on tapered superquadrics, is successively matched with the sets of pixels previously extracted; a specific fuzzy clustering process is used for this purpose. Moreover, an optical flow procedure predicts the model location at each acquisition time from its location at the previous time. Finally we present some results of this process applied to a leg in motion.

  9. Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code

    DOE PAGES

    Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc

    2018-02-02

The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.

  10. Dynamic event tree analysis with the SAS4A/SASSYS-1 safety analysis code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jankovsky, Zachary K.; Denman, Matthew R.; Aldemir, Tunc

The consequences of a transient in an advanced sodium-cooled fast reactor are difficult to capture with the traditional approach to probabilistic risk assessment (PRA). Numerous safety-relevant systems are passive and may have operational states that cannot be represented by binary success or failure. In addition, the specific order and timing of events may be crucial, which necessitates the use of dynamic PRA tools such as ADAPT. The modifications to the SAS4A/SASSYS-1 sodium-cooled fast reactor safety analysis code for linking it to ADAPT to perform a dynamic PRA are described. A test case is used to demonstrate the linking process and to illustrate the type of insights that may be gained with this process. Finally, newly-developed dynamic importance measures are used to assess the significance of reactor parameters/constituents on calculated consequences of initiating events.

  11. Canister Storage Building (CSB) Design Basis Accident Analysis Documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CROWE, R.D.; PIEPHO, M.G.

    2000-03-23

This document provides the detailed accident analysis supporting HNF-3553, Spent Nuclear Fuel Project Final Safety Analysis Report, Annex A, ''Canister Storage Building Final Safety Analysis Report''. All assumptions, parameters, and models used in the analysis of the design basis accidents are documented to support the conclusions in the Canister Storage Building Final Safety Analysis Report.

  12. Understanding the Perception of Very Small Software Companies towards the Adoption of Process Standards

    NASA Astrophysics Data System (ADS)

    Basri, Shuib; O'Connor, Rory V.

This paper is concerned with understanding the issues that affect the adoption of software process standards by Very Small Entities (VSEs), their needs from process standards, and their willingness to engage with the new ISO/IEC 29110 standard in particular. In order to achieve this goal, a series of industry data collection studies were undertaken with a collection of VSEs. A twin-track approach of qualitative data collection (interviews and focus groups) and quantitative data collection (questionnaire) was undertaken. Data analysis was completed separately and the final results were merged using the coding mechanisms of grounded theory. This paper serves as a roadmap both for researchers wishing to understand the issues of process standards adoption by very small companies and for the software process standards community.

  13. Enhancing the science of the WFIRST coronagraph instrument with post-processing.

    NASA Astrophysics Data System (ADS)

    Pueyo, Laurent; WFIRST CGI data analysis and post-processing WG

    2018-01-01

We summarize the results of a three-year effort investigating how to apply modern image analysis methods, now routinely used with ground-based coronagraphs, to the WFIRST coronagraph instrument (CGI). In this post we quantify the gain associated with post-processing for WFIRST-CGI observing scenarios simulated between 2013 and 2017. We also show, based on simulations, that the spectrum of a planet can be confidently retrieved using these processing tools with an Integral Field Spectrograph. We then discuss our work using CGI experimental data and quantify coronagraph post-processing testbed gains. We finally introduce stability metrics that are simple to define and measure, and that place useful lower and upper bounds on the achievable RDI post-processing contrast gain. We show that our bounds hold in the case of the testbed data.

  14. Development of Test-Analysis Models (TAM) for correlation of dynamic test and analysis results

    NASA Technical Reports Server (NTRS)

    Angelucci, Filippo; Javeed, Mehzad; Mcgowan, Paul

    1992-01-01

    The primary objective of structural analysis of aerospace applications is to obtain a verified finite element model (FEM). The verified FEM can be used for loads analysis, evaluate structural modifications, or design control systems. Verification of the FEM is generally obtained as the result of correlating test and FEM models. A test analysis model (TAM) is very useful in the correlation process. A TAM is essentially a FEM reduced to the size of the test model, which attempts to preserve the dynamic characteristics of the original FEM in the analysis range of interest. Numerous methods for generating TAMs have been developed in the literature. The major emphasis of this paper is a description of the procedures necessary for creation of the TAM and the correlation of the reduced models with the FEM or the test results. Herein, three methods are discussed, namely Guyan, Improved Reduced System (IRS), and Hybrid. Also included are the procedures for performing these analyses using MSC/NASTRAN. Finally, application of the TAM process is demonstrated with an experimental test configuration of a ten bay cantilevered truss structure.
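The Guyan reduction mentioned above can be sketched for a minimal case. The two-DOF spring-mass chain and its numbers are illustrative, not from the paper; the slave DOF is condensed out with the static transformation T = [I; -Kss⁻¹Ksm], giving reduced matrices Kr = TᵀKT and Mr = TᵀMT.

```python
# Guyan (static) reduction sketch for a 2-DOF spring-mass chain
# (ground - spring k1 - m1 - spring k2 - m2), keeping DOF 0 as the
# single master ("test") coordinate. Values are illustrative.
k1, k2 = 2.0, 2.0
K = [[k1 + k2, -k2],
     [-k2,      k2]]
M = [[1.0, 0.0],
     [0.0, 1.0]]

# Partition: master = DOF 0, slave = DOF 1 (scalar blocks in this tiny case).
Ksm, Kss = K[1][0], K[1][1]

t = -Ksm / Kss            # slave motion follows the master statically
T = [1.0, t]              # transformation column [I; -Kss^-1 * Ksm]

# Reduced stiffness and mass: Kr = T' K T, Mr = T' M T.
Kr = sum(T[i] * K[i][j] * T[j] for i in range(2) for j in range(2))
Mr = sum(T[i] * M[i][j] * T[j] for i in range(2) for j in range(2))
```

Guyan reduction preserves the static response at the master DOF exactly and only approximates the dynamics, which is why IRS and hybrid TAMs were developed as refinements.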

  15. Track Score Processing of Multiple Dissimilar Sensors

    DTIC Science & Technology

    2007-06-01

sensors (infrared and light detection and ranging system) and one radio frequency sensor (radar). The signal to noise ratio and design considerations...categorized as Johnson noise, shot noise, generation-recombination noise, temperature noise, microphonic noise, 1/f noise, and finally electronic...of 2.1 µm. The values of detectivity in this figure were derived from an analysis of commercial detectors, under background-limited conditions, at

  16. Digital Video Projects of, by, and for New Teachers: The Multiple Educational Functions of Creating Multimedia

    ERIC Educational Resources Information Center

    Halter, Christopher; Levin, James

    2014-01-01

A three-year study of digital video creation in higher education investigated the impact on university students of creating short digital videos in the final class of a teacher education program. Each student created a short video reflecting on the process of how he/she became a teacher. An analysis of the videos…

  17. Impacts and trends of journalistic telework: the journalists' viewpoint.

    PubMed

    Manssour, Ana Beatriz Benites

    2003-02-01

This article, the last of a trilogy, presents general reflections on technological advances, especially considering cyberspace as a medium of information transmission and exchange, and on forms of telework. Based on research for an administration master's thesis, the derivation process that resulted in the final analysis category illustrates the feelings of the interviewed journalists about the effects of telework on their professional and personal lives.

  18. A Design Architecture for an Integrated Training System Decision Support System

    DTIC Science & Technology

    1990-07-01

Sensory modes include visual, auditory, tactile, or kinesthetic; performance categories include time to complete, speed of response, or correct action...procedures, and finally application and examples from the aviation proponency with emphasis on the LHX program. Appendix B is a complete bibliography...integrated analysis of ITS development. The approach was designed to provide an accurate and complete representation of the ITS development process and

  19. Identifying environmental features for land management decisions

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The major accomplishments of the Center for Remote Sensing and Cartography are outlined. The analysis and inventory of the Parker Mountain rangeland and the use of multitemporal data to study aspen succession stages are discussed. New and continuing projects are also described including a Salt Lake County land use study, Wasatch-Cache riparian study, and Humboldt River riparian habitat study. Finally, progress in digital processing techniques is reported.

  20. Molecular dynamics study of silicon carbide properties under external dynamic loading

    NASA Astrophysics Data System (ADS)

    Utkin, A. V.; Fomin, V. M.

    2017-10-01

In this study, molecular dynamics simulations of the high-velocity impact of a spherical 3C-SiC cluster against a rigid wall were performed over a wide range of velocities (from 100 to 2600 m/s). Analysis of the final structure shows that no structural phase transformation occurred in the material, despite the high pressure during the collision process.

  1. Five-Photon Absorption and Selective Enhancement of Multiphoton Absorption Processes

    PubMed Central

    2015-01-01

    We study one-, two-, three-, four-, and five-photon absorption of three centrosymmetric molecules using density functional theory. These calculations are the first ab initio calculations of five-photon absorption. Even- and odd-order absorption processes show different trends in the absorption cross sections. The behavior of all even- and odd-photon absorption properties shows a semiquantitative similarity, which can be explained using few-state models. This analysis shows that odd-photon absorption processes are largely determined by the one-photon absorption strength, whereas all even-photon absorption strengths are largely dominated by the two-photon absorption strength, in both cases modulated by powers of the polarizability of the final excited state. We demonstrate how to selectively enhance a specific multiphoton absorption process. PMID:26120588

  2. Five-Photon Absorption and Selective Enhancement of Multiphoton Absorption Processes.

    PubMed

    Friese, Daniel H; Bast, Radovan; Ruud, Kenneth

    2015-05-20

    We study one-, two-, three-, four-, and five-photon absorption of three centrosymmetric molecules using density functional theory. These calculations are the first ab initio calculations of five-photon absorption. Even- and odd-order absorption processes show different trends in the absorption cross sections. The behavior of all even- and odd-photon absorption properties shows a semiquantitative similarity, which can be explained using few-state models. This analysis shows that odd-photon absorption processes are largely determined by the one-photon absorption strength, whereas all even-photon absorption strengths are largely dominated by the two-photon absorption strength, in both cases modulated by powers of the polarizability of the final excited state. We demonstrate how to selectively enhance a specific multiphoton absorption process.

  3. Stochastic Analysis of Reaction–Diffusion Processes

    PubMed Central

    Hu, Jifeng; Kang, Hye-Won

    2013-01-01

    Reaction and diffusion processes are used to model chemical and biological processes over a wide range of spatial and temporal scales. Several routes to the diffusion process at various levels of description in time and space are discussed and the master equation for spatially discretized systems involving reaction and diffusion is developed. We discuss an estimator for the appropriate compartment size for simulating reaction–diffusion systems and introduce a measure of fluctuations in a discretized system. We then describe a new computational algorithm for implementing a modified Gillespie method for compartmental systems in which reactions are aggregated into equivalence classes and computational cells are searched via an optimized tree structure. Finally, we discuss several examples that illustrate the issues that have to be addressed in general systems. PMID:23719732
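The standard direct-method Gillespie algorithm that the modified compartmental method builds on can be sketched for the reversible isomerization A ⇌ B. The rate constants and molecule counts are illustrative assumptions; the equivalence-class aggregation and optimized tree search described in the abstract are omitted.

```python
# Direct-method Gillespie SSA sketch for A <-> B with forward rate kf and
# reverse rate kr. At each step: draw an exponential waiting time from the
# total propensity, then pick which reaction fires proportionally to its
# propensity.
import random

def gillespie(a0, b0, kf, kr, t_end, rng):
    t, a, b = 0.0, a0, b0
    while t < t_end:
        props = [kf * a, kr * b]          # reaction propensities
        total = sum(props)
        if total == 0:
            break                         # no reaction can fire
        t += rng.expovariate(total)       # time to next reaction event
        if rng.random() * total < props[0]:
            a, b = a - 1, b + 1           # A -> B fires
        else:
            a, b = a + 1, b - 1           # B -> A fires
    return a, b

rng = random.Random(42)                   # seeded for reproducibility
a, b = gillespie(100, 0, kf=1.0, kr=0.5, t_end=5.0, rng=rng)
```

The total molecule count is conserved by construction, and over long runs the mean of A fluctuates around the deterministic equilibrium kr/(kf+kr) of the total.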

  4. User's Manual and Final Report for Hot-SMAC GUI Development

    NASA Technical Reports Server (NTRS)

    Yarrington, Phil

    2001-01-01

A new software package called Higher Order Theory-Structural/Micro Analysis Code (HOT-SMAC) has been developed as an effective alternative to the finite element approach for Functionally Graded Material (FGM) modeling. HOT-SMAC is a self-contained package including pre- and post-processing through an intuitive graphical user interface, along with the well-established Higher Order Theory for Functionally Graded Materials (HOTFGM) thermomechanical analysis engine. This document represents a Getting Started/User's Manual for HOT-SMAC and a final report for its development. First, the features of the software are presented in a simple step-by-step example where a HOT-SMAC model representing a functionally graded material is created, mechanical and thermal boundary conditions are applied, the model is analyzed, and results are reviewed. In a second step-by-step example, a HOT-SMAC model of an actively cooled metallic channel with ceramic thermal barrier coating is built and analyzed. HOT-SMAC results from this model are compared to recently published results (NASA/TM-2001-210702) for two grid densities. Finally, a prototype integration of HOT-SMAC with the commercially available HyperSizer(R) structural analysis and sizing software is presented. In this integration, local strain results from HyperSizer's structural analysis are fed to a detailed HOT-SMAC model of the flange-to-facesheet bond region of a stiffened panel. HOT-SMAC is then used to determine the peak shear and peel (normal) stresses between the facesheet and bonded flange of the panel and determine the "free edge" effects.

  5. Process capability improvement through DMAIC for aluminum alloy wheel machining

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa; Babu, B. Surendra

    2017-07-01

This paper first enlists the generic problems of alloy wheel machining and subsequently details the process improvement of the identified critical-to-quality machining characteristic of the A356 aluminum alloy wheel machining process. The causal factors are traced using the Ishikawa diagram and corrective actions are prioritized through process failure modes and effects analysis. Process monitoring charts are employed to improve the process capability index toward the industrial benchmark four-sigma level, which corresponds to a value of 1.33. The procedure adopted for improving the process capability levels is the define-measure-analyze-improve-control (DMAIC) approach. By following the DMAIC approach, Cp, Cpk and Cpm showed improvement from initial values of 0.66, -0.24 and 0.27 to final values of 4.19, 3.24 and 1.41, respectively.
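The capability indices quoted above follow the standard definitions; the sketch below uses illustrative spec limits, target, and sample statistics, not the wheel-machining data from the paper.

```python
# Process capability sketch: Cp measures spread against the spec width,
# Cpk additionally penalizes off-center location, and Cpm (Taguchi)
# penalizes deviation of the mean from the target value.
import math

def capability(usl, lsl, target, mean, sigma):
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)
    cpm = (usl - lsl) / (6 * math.sqrt(sigma**2 + (mean - target)**2))
    return cp, cpk, cpm

# Illustrative numbers only (e.g. a 10.0-10.6 mm tolerance band).
cp, cpk, cpm = capability(usl=10.6, lsl=10.0, target=10.3,
                          mean=10.25, sigma=0.05)
```

With the mean off target, Cpk and Cpm both fall below Cp, which is the pattern the DMAIC improvement in the abstract is driving at: centering and tightening the process raises all three indices.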

  6. An assembly process model based on object-oriented hierarchical time Petri Nets

    NASA Astrophysics Data System (ADS)

    Wang, Jiapeng; Liu, Shaoli; Liu, Jianhua; Du, Zenghui

    2017-04-01

In order to improve the versatility, accuracy and integrity of the assembly process model of complex products, an assembly process model based on object-oriented hierarchical time Petri Nets is presented. A complete assembly process information model including assembly resources, assembly inspection, time, structure and flexible parts is established, and this model describes the static and dynamic data involved in the assembly process. Through the analysis of three-dimensional assembly process information, the assembly information is hierarchically divided from the whole, through the local, down to the details, and subnet models of the different levels of object-oriented Petri Nets are established. The communication problem between Petri subnets is solved by using a message database, effectively reducing the complexity of system modeling. Finally, the modeling process is presented, and a five-layer Petri Nets model is established based on the hoisting process of the engine compartment of a wheeled armored vehicle.
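A minimal place/transition Petri net of the kind such models build on can be sketched as follows. The place and transition names describe a hypothetical single assembly step, not the armored-vehicle model, and the paper's time annotations and object-oriented hierarchy are omitted.

```python
# Minimal place/transition Petri net: a transition is enabled when every
# input place holds at least its required token count; firing consumes
# input tokens and produces output tokens.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)      # place -> token count
        self.transitions = {}             # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p] >= n for p, n in inputs.items())

    def fire(self, name):
        inputs, outputs = self.transitions[name]
        if not self.enabled(name):
            raise RuntimeError(f"{name} is not enabled")
        for p, n in inputs.items():
            self.marking[p] -= n
        for p, n in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + n

# Hypothetical step: mounting two parts on one free fixture.
net = PetriNet({"part_ready": 2, "fixture_free": 1, "assembled": 0})
net.add_transition("mount", {"part_ready": 2, "fixture_free": 1},
                            {"assembled": 1})
net.fire("mount")
```

Hierarchical variants replace a transition with an entire subnet; the message-database idea in the abstract then mediates token exchange between such subnets.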

  7. Information granules in image histogram analysis.

    PubMed

    Wieclawek, Wojciech

    2018-04-01

    A concept of granular computing employed in intensity-based image enhancement is discussed. First, a weighted granular computing idea is introduced. Then, the implementation of this term in the image processing area is presented. Finally, multidimensional granular histogram analysis is introduced. The proposed approach is dedicated to digital images, especially to medical images acquired by Computed Tomography (CT). As the histogram equalization approach, this method is based on image histogram analysis. Yet, unlike the histogram equalization technique, it works on a selected range of the pixel intensity and is controlled by two parameters. Performance is tested on anonymous clinical CT series. Copyright © 2017 Elsevier Ltd. All rights reserved.
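For contrast with the granular approach above, classic histogram equalization (the baseline the method is compared against) can be sketched as below. The pixel values are illustrative, and the input is assumed non-constant so the cumulative histogram spans more than one bin.

```python
# Classic histogram-equalization sketch for an 8-bit pixel list: build the
# histogram, form its cumulative sum (CDF), and remap intensities so the
# CDF becomes approximately linear across the full 0-255 range.
def equalize(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)   # first occupied bin
    n = len(pixels)                           # assumes n > cdf_min
    lut = [round((c - cdf_min) / (n - cdf_min) * (levels - 1)) for c in cdf]
    return [lut[p] for p in pixels]

pixels = [52, 55, 61, 59, 79, 61, 76, 61]
out = equalize(pixels)
```

Unlike this global remapping, the granular method described above restricts the analysis to a selected intensity range and is steered by two parameters.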

  8. Transfer-appropriate processing in the testing effect.

    PubMed

    Veltre, Mary T; Cho, Kit W; Neely, James H

    2015-01-01

    The testing effect is the finding that taking a review test enhances performance on a final test relative to restudying the material. The present experiment investigated transfer-appropriate processing in the testing effect using semantic and orthographic cues to evoke conceptual and data-driven processing, respectively. After a study phase, subjects either restudied the material or took a cued-recall test consisting of half semantic and half orthographic cues in which the correct response was given as feedback. A final, cued-recall test consisted of the identical cue, or a new cue that was of the same type or different type of cue (semantic/orthographic or orthographic/semantic) as that used for that target in the review test. Testing enhanced memory in all conditions. When the review cues and final-test cues were identical, final recall was higher for semantic than orthographic cues. Consistent with test-based transfer-appropriate processing, memory performance improved as the review and final cues became more similar. These results suggest that the testing effect could potentially be caused by the episodic retrieval processes in a final memory test overlapping more with the episodic retrieval processes in a review test than with the encoding operations performed during restudy.

  9. Comparative study of submerged and surface culture acetification process for orange vinegar.

    PubMed

    Cejudo-Bastante, Cristina; Durán-Guerrero, Enrique; García-Barroso, Carmelo; Castro-Mejías, Remedios

    2018-02-01

The two main acetification methodologies generally employed in the production of vinegar (surface and submerged cultures) were studied and compared for the production of orange vinegar. Polyphenols (UPLC/DAD) and volatiles compounds (SBSE-GC/MS) were considered as the main variables in the comparative study. Sensory characteristics of the obtained vinegars were also evaluated. Seventeen polyphenols and 24 volatile compounds were determined in the samples during both acetification processes. For phenolic compounds, analysis of variance showed significant higher concentrations when surface culture acetification was employed. However, for the majority of volatile compounds higher contents were observed for submerged culture acetification process, and it was also reflected in the sensory analysis, presenting higher scores for the different descriptors. Multivariate statistical analysis such as principal component analysis demonstrated the possibility of discriminating the samples regarding the type of acetification process. Polyphenols such as apigenin derivative or ferulic acid and volatile compounds such as 4-vinylguaiacol, decanoic acid, nootkatone, trans-geraniol, β-citronellol or α-terpineol, among others, were those compounds that contributed more to the discrimination of the samples. The acetification process employed in the production of orange vinegar has been demonstrated to be very significant for the final characteristics of the vinegar obtained. So it must be carefully controlled to obtain high quality products. © 2017 Society of Chemical Industry.

  10. Additive Manufacturing Design Considerations for Liquid Engine Components

    NASA Technical Reports Server (NTRS)

    Whitten, Dave; Hissam, Andy; Baker, Kevin; Rice, Darron

    2014-01-01

    The Marshall Space Flight Center's Propulsion Systems Department has gained significant experience in the last year designing, building, and testing liquid engine components using additive manufacturing. The department has developed valve, duct, turbo-machinery, and combustion device components using this technology. Many valuable lessons were learned during this process. These lessons will be the focus of this presentation. We will present criteria for selecting part candidates for additive manufacturing. Some part characteristics are 'tailor made' for this process. Selecting the right parts for the process is the first step to maximizing productivity gains. We will also present specific lessons we learned about feature geometry that can and cannot be produced using additive manufacturing machines. Most liquid engine components were made using a two-step process. The base part was made using additive manufacturing and then traditional machining processes were used to produce the final part. The presentation will describe design accommodations needed to make the base part and lessons we learned about which features could be built directly and which require the final machine process. Tolerance capabilities, surface finish, and material thickness allowances will also be covered. Additive Manufacturing can produce internal passages that cannot be made using traditional approaches. It can also eliminate a significant amount of manpower by reducing part count and leveraging model-based design and analysis techniques. Information will be shared about performance enhancements and design efficiencies we experienced for certain categories of engine parts.

  11. [Optimization of blood gas analysis in intensive care units : Reduction of preanalytical errors and improvement of workflow].

    PubMed

    Kieninger, M; Zech, N; Mulzer, Y; Bele, S; Seemann, M; Künzig, H; Schneiker, A; Gruber, M

    2015-05-01

    Point-of-care testing with blood gas analysis (BGA) is an important factor in intensive care medicine. Continuous efforts to optimize workflow, improve safety for the staff and avoid preanalytical mistakes are important and should reflect quality management standards. In a prospective observational study, it was investigated whether the implementation of a new system for BGA using labeled syringes and automated processing of the specimens leads to improvements compared to the previously used procedure. In a 4-week test period, the time until receiving the final results of the BGA with the standard method used in the clinical routine (control group) was compared to the results in a second 4-week test period using the new labeled syringes and automated processing of the specimens (intervention group). In addition, preanalytical mistakes with both systems were checked during routine daily use. Finally, it was investigated whether a delay of 10 min between taking and analyzing the blood samples alters the results of the BGA. Preanalytical errors were frequently observed in the control group, where 87.3% of samples were not deaerated, whereas in the intervention group almost all samples (98.9%) were correctly deaerated. Insufficient homogenization due to omission of manual pivoting was seen in 83.2% of the control group and 89.9% of the intervention group; however, in the intervention group the samples were homogenized automatically during the further analytical process. Although a survey among the staff revealed a high acceptance of the new system and a subjective improvement of workflow, a measurable gain in time after conversion to the new procedure could not be seen. The mean time needed for a complete analysis process until receiving the final results was 244 s in the intervention group and 201 s in the control group.
A 10-min delay between taking and analyzing the blood samples led to a significant and clinically relevant elevation of the values for partial pressure of oxygen (pO2) in both groups compared to the results when analyzing the samples immediately (118.4 vs. 148.6 mmHg in the control group and 115.3 vs. 123.7 mmHg in the intervention group). When using standard syringes the partial pressure of carbon dioxide (pCO2) was significantly lower (40.5 vs. 38.3 mmHg) whereas no alterations were seen when using the labeled syringes. The implementation of a new BGA system with labeled syringes and automated processing of the specimens was possible without any difficulties under daily clinical routine conditions in this 10-bed intensive care unit (ICU). A gain of time could not be measured but a reduction in preanalytical errors using the labeled syringes with automated processing was found. Delayed analysis of blood samples can lead to significant changes in pO2 and pCO2 depending on the type of syringe used.

  12. Static analysis techniques for semiautomatic synthesis of message passing software skeletons

    DOE PAGES

    Sottile, Matthew; Dagit, Jason; Zhang, Deli; ...

    2015-06-29

    The design of high-performance computing architectures demands performance analysis of large-scale parallel applications to derive various parameters concerning hardware design and software development. The process of performance analysis and benchmarking of an application can be done in several ways with varying degrees of fidelity. One of the most cost-effective ways is to do a coarse-grained study of large-scale parallel applications through the use of program skeletons. The concept of a "program skeleton" that we discuss in this article is an abstracted program derived from a larger program, in which source code determined to be irrelevant for the purposes of the skeleton is removed. In this work, we develop a semiautomatic approach for extracting program skeletons based on compiler program analysis. Finally, we demonstrate the correctness of our skeleton extraction process by comparing details from communication traces, and show the performance speedup of using skeletons by running simulations in the SST/macro simulator.
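The skeletonization idea the abstract describes can be caricatured in a few lines. This is an illustrative sketch only, not the authors' tool (which operates via compiler analysis, not text matching); the toy MPI listing and the set of "communication-relevant" prefixes are invented for the example:

```python
# Illustrative sketch: derive a "program skeleton" from a toy source
# listing by keeping communication calls and control flow, and replacing
# elided computation with a placeholder that a simulator could later map
# to a modeled compute delay.

COMM_PREFIXES = ("MPI_Send", "MPI_Recv", "MPI_Barrier", "MPI_Allreduce")

def extract_skeleton(lines):
    """Keep communication calls and control flow; drop pure computation."""
    skeleton = []
    for line in lines:
        stripped = line.strip()
        if stripped.startswith(COMM_PREFIXES) or stripped.startswith(("for", "if", "}")):
            skeleton.append(line)
        elif not skeleton or skeleton[-1] != "    /* compute elided */":
            # Collapse consecutive computation statements into one marker.
            skeleton.append("    /* compute elided */")
    return skeleton

source = [
    "for (i = 0; i < n; i++) {",
    "    x[i] = a[i] * b[i];",
    "    MPI_Send(&x[i], 1, MPI_DOUBLE, 1, 0, comm);",
    "}",
]
print(extract_skeleton(source))
```

A real extractor must also preserve loop bounds and message sizes that the communication behavior depends on, which is precisely where the compiler-based analysis described in the paper comes in.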

  13. Robotic solid phase extraction and high performance liquid chromatographic analysis of ranitidine in serum or plasma.

    PubMed

    Lloyd, T L; Perschy, T B; Gooding, A E; Tomlinson, J J

    1992-01-01

    A fully automated assay for the analysis of ranitidine in serum and plasma, with and without an internal standard, was validated. It utilizes robotic solid phase extraction with on-line high performance liquid chromatographic (HPLC) analysis. The ruggedness of the assay was demonstrated over a three-year period. A Zymark Py Technology II robotic system was used for serial processing from initial aspiration of samples from original collection containers, to final direct injection onto the on-line HPLC system. Automated serial processing with on-line analysis provided uniform sample history and increased productivity by freeing the chemist to analyse data and perform other tasks. The solid phase extraction efficiency was 94% throughout the assay range of 10-250 ng/mL. The coefficients of variation for within- and between-day quality control samples ranged from 1 to 6% and 1 to 5%, respectively. Mean accuracy for between-day standards and quality control results ranged from 97 to 102% of the respective theoretical concentrations.
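The validation statistics quoted above (coefficient of variation and accuracy against a nominal concentration) are simple to reproduce. The replicate values below are invented for illustration, not data from the study:

```python
# Back-of-the-envelope version of assay validation statistics:
# coefficient of variation (CV) and accuracy for hypothetical
# quality-control replicates at a nominal 100 ng/mL.
import statistics

def cv_percent(values):
    """Relative standard deviation of replicate measurements, in percent."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def accuracy_percent(values, nominal):
    """Mean measured value as a percentage of the nominal concentration."""
    return 100 * statistics.mean(values) / nominal

qc = [98.0, 101.5, 99.0, 102.0, 100.5]   # ng/mL, invented replicates
print(round(cv_percent(qc), 1), round(accuracy_percent(qc, 100.0), 1))
```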

  14. Nursing professionalism: An evolutionary concept analysis

    PubMed Central

    Ghadirian, Fataneh; Salsali, Mahvash; Cheraghi, Mohammad Ali

    2014-01-01

    Background: Professionalism is an important feature of professional occupations. The dynamic nature and the various interpretations of this term have led to multiple definitions of the concept. The aim of this paper is to identify the core attributes of nursing professionalism. Materials and Methods: We followed Rodgers’ evolutionary method of concept analysis. Texts published in scientific databases about nursing professionalism between 1980 and 2011 were assessed. After applying the selection criteria, a final sample consisting of 4 books and 213 articles was selected, examined, and analyzed in depth. Two experts checked, monitored, and reviewed the process of analysis. Results: The analysis showed that nursing professionalism is determined by three attributes: cognitive, attitudinal, and psychomotor. In addition, the most important antecedent concepts were demographic, experiential, educational, environmental, and attitudinal factors. Conclusion: Nursing professionalism is an inevitable, complex, varied, and dynamic process. This study explains and clarifies the importance, scope, and concept of professionalism in nursing as a starting point for further research and for developing and expanding nursing knowledge. PMID:24554953

  15. The Manicouagan impact structure - An analysis of its original dimensions and form

    NASA Technical Reports Server (NTRS)

    Grieve, R. A. F.; Head, J. W., III

    1983-01-01

    A reanalysis of the preerosional geology of the Canadian impact crater, Manicouagan, is presented. Although most of the current features of the annular moat are primarily a result of erosional processes, the original dimensions of the cavity have been determined to include a transient cavity 60 km in diam. The final floor of the crater was studied and found to be an impact melt-covered inner plateau 55 km in diam. Comparisons with similar crater bottoms on the moon are used to estimate a final crater rim diameter of 85-95 km. The inner plateau and relatively smooth deposits on the crater floor are noted to be most similar to the lunar crater Copernicus.

  16. Analysis and control of the METC fluid bed gasifier. Final report (includes technical progress report for October 1994--January 1995), September 1994--September 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-09-01

    This document presents a modeling and control study of the Fluid Bed Gasification (FBG) unit at the Morgantown Energy Technology Center (METC). The work was performed under contract no. DE-FG21-94MC31384. The purpose of this study is to generate a simple FBG model from process data, and then use the model to suggest an improved control scheme that will improve operation of the gasifier. The work first develops a simple linear model of the gasifier, then suggests an improved gasifier pressure and MGCR control configuration, and finally proposes the use of a multivariable control strategy for the gasifier.

  17. District heating and cooling systems for communities through power plant retrofit and distribution networks. Phase 1: identification and assessment. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-09-01

    Appendix A, Utility Plant Characteristics, contains information describing the characteristics of seven utility plants that were considered during the final site selection process. The plants are: Valley Electric Generating Plant, downtown Milwaukee; Manitowoc Electric Generating Plant, downtown Manitowoc; Blount Street Electric Generating Plant, downtown Madison; Pulliam Electric Generating Plant, downtown Green Bay; Edgewater Electric Generating Plant, downtown Sheboygan; Rock River Electric Generating Plant, near Janesville and Beloit; and Black Hawk Electric Generating Plant, downtown Beloit. Additional appendices are: Future Loads; HVAC Inventory; Load Calculations; Factors to Induce Potential Users; Turbine Retrofit/Distribution System Data; and Detailed Economic Analysis Results/Data.

  18. Wind Plant Performance Prediction (WP3) Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Craig, Anna

    The methods for analysis of operational wind plant data are highly variable across the wind industry, leading to high uncertainties in the validation and bias correction of preconstruction energy estimation methods. Lack of credibility in the preconstruction energy estimates has significant impacts on project financing and therefore on the final levelized cost of energy for the plant. In this work, the variation in the evaluation of a wind plant's operational energy production as a result of variations in the processing methods applied to the operational data is examined. Preliminary results indicate that the selection of the filters applied to the data and the filter parameters can have significant impacts on the final computed assessment metrics.
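The sensitivity described above can be demonstrated on a toy example: two plausible data filters applied to the same synthetic SCADA-like record yield different operational energy estimates. The record fields, thresholds and filter names are assumptions for illustration, not the project's actual methodology:

```python
# Hypothetical illustration of filter sensitivity in operational
# wind plant data analysis: the same records, filtered two ways,
# give different mean-power "assessments".
def mean_power(records, min_wind=None, exclude_curtailed=False):
    """Average power over records surviving the chosen filters."""
    kept = [r for r in records
            if (min_wind is None or r["wind"] >= min_wind)
            and not (exclude_curtailed and r["curtailed"])]
    return sum(r["power"] for r in kept) / len(kept)

records = [
    {"wind": 3.0,  "power": 0.1, "curtailed": False},
    {"wind": 8.0,  "power": 1.2, "curtailed": False},
    {"wind": 9.0,  "power": 0.4, "curtailed": True},   # curtailment event
    {"wind": 12.0, "power": 2.0, "curtailed": False},
]

loose = mean_power(records)                                    # no filtering
strict = mean_power(records, min_wind=4.0, exclude_curtailed=True)
print(loose, strict)   # the two assessments differ
```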

  19. Cost-Utility Analysis: Current Methodological Issues and Future Perspectives

    PubMed Central

    Nuijten, Mark J. C.; Dubois, Dominique J.

    2011-01-01

    The use of cost–effectiveness as the final criterion in the reimbursement process for listing of new pharmaceuticals can be questioned from a scientific and policy point of view. There is a lack of consensus on main methodological issues, and consequently we may question the appropriateness of the use of cost–effectiveness data in health care decision-making. Another concern is the appropriateness of the selection and use of an incremental cost–effectiveness threshold (cost/QALY). In this review, we focus on key methodological concerns relating to discounting, the utility concept, cost assessment, and modeling methodologies. Finally, we consider the relevance of other important decision criteria, such as social values and equity. PMID:21713127

  20. F-Tank Farm Performance Assessment Updates through the Special Analysis Process at Savannah River Site - 12169

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Layton, Mark H.

    2012-07-01

    The F-Area Tank Farm (FTF) is owned by the U.S. Department of Energy and operated by Savannah River Remediation, LLC (SRR), the Liquid Waste Operations contractor at DOE's Savannah River Site (SRS). The FTF is in the north-central portion of the SRS and occupies approximately 22 acres within F-Area. The FTF is an active radioactive waste storage facility consisting of 22 carbon steel waste tanks and ancillary equipment such as transfer lines, evaporators and pump tanks. An FTF Performance Assessment (PA) was prepared to support the eventual closure of the FTF underground radioactive waste tanks and ancillary equipment. The PA provides the technical basis and results to be used in subsequent documents to demonstrate compliance with the pertinent requirements identified below for final closure of FTF. The F-Tank Farm is subject to a state industrial wastewater permit and a Federal Facility Agreement. Closure documentation will include an F-Tank Farm Closure Plan and tank-specific closure modules utilizing information from the performance assessment. For this reason, the State of South Carolina and the Environmental Protection Agency must be involved in the performance assessment review process. The residual material remaining after tank cleaning is also subject to reclassification prior to closure via a waste determination pursuant to Section 3116 of the Ronald W. Reagan National Defense Authorization Act of Fiscal Year 2005. The projected waste tank inventories in the FTF PA provide reasonably bounding FTF inventory projections while taking into account uncertainties in the effectiveness of future tank cleaning technologies. As waste is removed from the FTF waste tanks, the residual contaminants will be sampled and the remaining residual inventory characterized. In this manner, tank-specific data for the tank inventories at closure will be available to supplement the waste tank inventory projections currently used in the FTF PA. 
For FTF, the new tank specific data will be evaluated through the Special Analysis process. The FTF Special Analyses process will be utilized to evaluate information regarding the final residual waste that will be grouted in place in the FTF Tanks and assess the potential impact the new inventory information has on the FTF PA assumptions and results. The Special Analysis can then be used to inform decisions regarding FTF tank closure documents. The purpose of this paper is to discuss the Special Analysis process and share insights gained while implementing this process. An example of an area of interest in the revision process is balancing continuous improvement versus configuration control of agreed upon methodologies. Other subjects to be covered include: 1) defining the scope of the revisions included in the Special Analysis, 2) determining which PA results should be addressed in the Special Analysis, and 3) deciding whether the Special Analysis should utilize more qualitative or quantitative assessments. For the SRS FTF, an FTF PA has been prepared to provide the technical basis and results to be used in subsequent documents to demonstrate compliance with the pertinent requirements for final closure of FTF. The FTF Special Analyses process will be utilized to evaluate the impact new information has on the FTF PA assumptions and results. The Special Analysis can then be used to inform decisions regarding FTF tank closure documents. In preparing SAs, it is crucial that the scope of the SA be well defined within the SA, since the specific scope will vary from SA to SA. Since the SAs are essentially addendums to the PA, the SA scope should utilize the PA as the baseline from which the SA scope is defined. The SA needs to focus on evaluating the change associated with the scope, and not let other changes interfere with the ability to perform that evaluation by masking the impact of the change. 
In preparing the SA, it is also important to let the scope determine whether the Special Analysis should utilize more qualitative or quantitative assessments, and also which results from the PA should be addressed in the Special Analysis. These decisions can vary from SA to SA and should not be predetermined. (author)

  1. Ghost analysis visualization techniques for complex systems: examples from the NIF Final Optics Assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beer, G K; Hendrix, J L; Rowe, J

    1998-06-26

    The stray light or "ghost" analysis of the National Ignition Facility's (NIF) Final Optics Assembly (FOA) has proved to be one of the most complex ghost analyses ever attempted. The NIF FOA consists of a bundle of four beam lines that: 1) provides the vacuum seal to the target chamber, 2) converts 1ω to 3ω light, 3) focuses the light on the target, 4) separates a fraction of the 3ω beam for energy diagnostics, 5) separates the three wavelengths to diffract unwanted 1ω and 2ω light away from the target, 6) provides spatial beam smoothing, and 7) provides a debris barrier between the target chamber and the switchyard mirrors. The three wavelengths of light and seven optical elements with three diffractive optic surfaces generate three million ghosts through 4th order. Approximately 24,000 of these ghosts have peak fluence exceeding 1 J/cm². The sheer number of ghost paths requires a visualization method that allows overlapping ghosts on optics and mechanical components to be summed and then mapped to the optical and mechanical component surfaces in 3D space. This paper addresses the following aspects of the NIF final optics ghost analysis: 1) materials issues for stray light mitigation, 2) limitations of current software tools (especially in modeling diffractive optics), 3) computer resource limitations affecting automated coherent raytracing, 4) folding the stray light analysis into the opto-mechanical design process, 5) analysis and visualization tools from simple hand calculations to specialized stray light analysis computer codes, and 6) attempts at visualizing these ghosts, one using a CAD model and another using a high-end data visualization software approach.

  2. Switching and optimizing control for coal flotation process based on a hybrid model

    PubMed Central

    Dong, Zhiyong; Wang, Ranfeng; Fan, Minqiang; Fu, Xiang

    2017-01-01

    Flotation is an important part of coal preparation, and the flotation column is widely applied as efficient flotation equipment. The process is complex and affected by many factors, with froth depth and reagent dosage being two of the most important and most frequently manipulated variables. This paper proposes a new method of switching and optimizing control for the coal flotation process. A hybrid model is built and evaluated using industrial data. First, wavelet analysis and principal component analysis (PCA) are applied for signal pre-processing. Second, a control model for optimizing the set point of the froth depth is constructed based on fuzzy control, and a control model is designed to optimize the reagent dosages based on an expert system. Finally, a least squares support vector machine (LS-SVM) is used to identify the operating conditions of the flotation process and to select one of the two models (froth depth or reagent dosage) for subsequent operation according to the condition parameters. The hybrid model is developed and evaluated on an industrial coal flotation column and exhibits satisfactory performance. PMID:29040305
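The switching structure described above can be sketched schematically: a condition classifier routes each operating point either to a froth-depth set-point controller or to a reagent-dosage controller. Everything here is invented for illustration; the simple threshold rule stands in for the paper's LS-SVM classifier, and the two toy control laws stand in for its fuzzy and expert-system models:

```python
# Schematic sketch of switching control: classify the operating
# condition, then dispatch to one of two controllers. All thresholds,
# units and rules are hypothetical.

def classify_condition(ash_content):
    """Toy stand-in for the LS-SVM operating-condition identifier."""
    return "froth_depth" if ash_content > 0.25 else "reagent_dosage"

def froth_depth_setpoint(ash_content):
    # Fuzzy-control stand-in: deeper froth when ash content is high.
    return 0.30 + 0.5 * ash_content          # metres, illustrative

def reagent_dosage(feed_rate):
    # Expert-rule stand-in: dosage proportional to feed rate.
    return 0.02 * feed_rate                   # kg/min, illustrative

def control_action(ash_content, feed_rate):
    """Return the selected control mode and its manipulated value."""
    mode = classify_condition(ash_content)
    if mode == "froth_depth":
        return mode, froth_depth_setpoint(ash_content)
    return mode, reagent_dosage(feed_rate)

print(control_action(0.30, 40.0))  # high ash -> froth-depth mode
print(control_action(0.10, 40.0))  # low ash  -> reagent-dosage mode
```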

  3. Meat waste as feedstock for home composting: Effects on the process and quality of compost.

    PubMed

    Storino, Francesco; Arizmendiarrieta, Joseba S; Irigoyen, Ignacio; Muro, Julio; Aparicio-Tejo, Pedro M

    2016-10-01

    Home composting is a powerful tool for reducing the generation of municipal waste, and it is spreading in different parts of the world. However, there is debate concerning the appropriateness, in terms of domestic hygiene and safety, of keeping a household composter bin dedicated to kitchen waste of animal origin, such as meat or fish scraps and pet droppings. The purpose of our work was to study how the addition of meat scraps to household waste influences the composting process and the quality of the final compost obtained. We compared four raw-material mixtures, characterized by different combinations of vegetable and meat waste and different ratios of woody bulking agent. Changes in temperature, mass and volume, phenotypic microbial diversity (by Biolog™) and organic matter humification were determined during the process. At the end of the experiment, the four composts were weighed and characterized by physicochemical analysis. In addition, the presence of viable weed seeds was investigated and a germination bioassay was carried out to determine the level of phytotoxicity. Finally, the levels of pathogens (Escherichia coli and Salmonella spp.) were also determined in the final compost. We show that the presence of meat waste as raw feedstock for composting in bins can improve the activity of the process and the physicochemical characteristics and maturity of the compost obtained, without significantly affecting its salinity, pH and phytotoxicity. Pathogen levels were low, showing that they can be controlled by intensive management and proper handling of the composter bins. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Outcomes assessment in the SPRINT multicenter tibial fracture trial: Adjudication committee size has trivial effect on trial results.

    PubMed

    Simunovic, Nicole; Walter, Stephen; Devereaux, P J; Sprague, Sheila; Guyatt, Gordon H; Schemitsch, Emil; Tornetta, Paul; Sanders, David; Swiontkowski, Marc; Bhandari, Mohit

    2011-09-01

    To evaluate how the size of an outcome adjudication committee, and the potential for dominance among its members, potentially impacts a trial's results, we conducted a retrospective analysis of data from the six-member adjudication committee in the Study to Prospectively Evaluate Reamed Intramedullary Nails in Patients with Tibial Fractures (SPRINT) trial. We modeled the adjudication process, predicted the results and costs if smaller committees had been used, and tested for the presence of a dominant adjudicator. Use of smaller committee sizes (one to five members) would have had little impact on the final study results, although one analysis suggested that the benefit in reduction of reoperations with reamed nails in closed tibial fractures would have lost significance if committee sizes of three or fewer were used. We identified a significant difference between adjudicators in the number of times their original minority decisions became the final consensus decision (χ² = 9.67, P = 0.046), suggesting that dominant adjudicators were present. However, their impact on the final study results was trivial. Reducing the number of adjudicators from six to four would have led to little change in the final SPRINT study results irrespective of the significance of the original trial results, demonstrating the potential for savings in trial resources. Copyright © 2011 Elsevier Inc. All rights reserved.
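The dominance test reported above is a chi-square goodness-of-fit on how often each adjudicator's minority opinion became the consensus. The per-adjudicator counts below are invented (the study's own counts are not given in the abstract, only its χ² = 9.67, P = 0.046 result):

```python
# Chi-square goodness-of-fit against a uniform expected distribution,
# applied to hypothetical "minority opinion became consensus" counts
# for six adjudicators.

def chi_square_uniform(counts):
    """Chi-square statistic for observed counts vs. a uniform expectation."""
    expected = sum(counts) / len(counts)
    return sum((c - expected) ** 2 / expected for c in counts)

minority_to_consensus = [12, 9, 7, 5, 4, 3]   # invented per-adjudicator counts
stat = chi_square_uniform(minority_to_consensus)
# With 6 categories there are 5 degrees of freedom; the 0.05 critical
# value is about 11.07, so these toy counts would not reach significance.
print(round(stat, 2))
```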

  5. CDO budgeting

    NASA Astrophysics Data System (ADS)

    Nesladek, Pavel; Wiswesser, Andreas; Sass, Björn; Mauermann, Sebastian

    2008-04-01

    The critical dimension off-target (CDO) is a key parameter for mask-house customers, directly affecting the performance of the mask. The CDO is the difference between the feature-size target and the measured feature size. The change of CD during the process is compensated either within the process or by data correction; these compensation methods are commonly called process bias and data bias, respectively. A difference between data bias and process bias in manufacturing results in a systematic CDO error; however, this systematic error does not account for the instability of the process bias, which results from minor variations of the manufacturing processes and from changes in materials and/or logistics. Using several masks, the CDO of the manufacturing line can be estimated. For a systematic investigation of unit-process contributions to CDO and an analysis of the factors influencing those contributors, a solid understanding of each unit process and a huge number of masks are necessary. Rough identification of contributing processes and a split of the final CDO variation between processes can be achieved with approximately 50 masks of identical design, material and process. Such an amount of data allows us to identify the main contributors and estimate their effects by means of analysis of variance (ANOVA) combined with multivariate analysis. The analysis does not provide information about the root cause of the variation within a particular unit process, but it gives a good estimate of the impact of the process on the stability of the manufacturing line. Additionally, this analysis can be used to identify possible interactions between processes, which cannot be investigated if only single processes are considered. The goal of this work is to evaluate the limits of CDO budgeting models given by the precision and number of measurements, as well as to partition the variation within the manufacturing process. According to the suggested model, the CDO variation splits into contributions from particular processes or process groups. Last but not least, the power of this method to determine the absolute strength of each parameter will be demonstrated. Identifying the root cause of the variation within a unit process itself is not within the scope of this work.
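The variance-partitioning step at the heart of such a budgeting model can be illustrated with a hand-rolled one-way ANOVA: the observed CDO variation is split into a between-process-group component and a within-group (residual) component. The mask data and group labels below are fabricated:

```python
# Toy illustration of CDO budgeting: partition observed variation into
# between-process and within-process sums of squares (one-way ANOVA).

def anova_partition(groups):
    """Return (between-group, within-group) sums of squares."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    return between, within

# CDO measurements (nm) grouped by hypothetical process step
groups = [[1.0, 1.2, 1.1], [2.0, 2.2, 2.1], [1.4, 1.6, 1.5]]
between, within = anova_partition(groups)
print(between, within)   # here most variation sits between process groups
```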

  6. Racial Healthcare Disparities: A Social Psychological Analysis

    PubMed Central

    Penner, Louis A.; Hagiwara, Nao; Eggly, Susan; Gaertner, Samuel L.; Albrecht, Terrance L.; Dovidio, John F.

    2014-01-01

    Around the world, members of racial/ethnic minority groups typically experience poorer health than members of racial/ethnic majority groups. The core premise of this article is that thoughts, feelings, and behaviors related to race and ethnicity play a critical role in healthcare disparities. Social psychological theories of the origins and consequences of these thoughts, feelings, and behaviors offer critical insights into the processes responsible for these disparities and suggest interventions to address them. We present a multilevel model that explains how societal, intrapersonal, and interpersonal factors can influence ethnic/racial health disparities. We focus our literature review, including our own research, and conceptual analysis at the intrapersonal (the race-related thoughts and feelings of minority patients and non-minority physicians) and interpersonal levels (intergroup processes that affect medical interactions between minority patients and non-minority physicians). At both levels of analysis, we use theories of social categorization, social identity, contemporary forms of racial bias, stereotype activation, stigma, and other social psychological processes to identify and understand potential causes and processes of health and healthcare disparities. In the final section, we identify theory-based interventions that might reduce ethnic/racial disparities in health and healthcare. PMID:25197206

  7. A Systematic Investigation into Aging Related Genes in Brain and Their Relationship with Alzheimer's Disease.

    PubMed

    Meng, Guofeng; Zhong, Xiaoyan; Mei, Hongkang

    2016-01-01

    Aging, as a complex biological process, is accompanied by the accumulation of functional losses at different levels, which makes age the biggest risk factor for many neurological diseases. Even after decades of investigation, the process of aging is still far from being fully understood, especially at a systematic level. In this study, we identified aging-related genes in the brain by collecting those with sustained and consistent gene expression or DNA methylation changes during the aging process. Functional analysis of these genes with Gene Ontology suggested that transcriptional regulators are the genes most affected in the aging process. Transcription regulation analysis found that some transcription factors, especially Specificity Protein 1 (SP1), play important roles in regulating aging-related gene expression. Module-based functional analysis indicated that these genes are associated with many well-known aging-related pathways, supporting the validity of our approach to selecting aging-related genes. Finally, we investigated the roles of aging-related genes in Alzheimer's disease (AD). We found that aging- and AD-related genes both involve some common pathways, which provides a possible explanation of why aging makes the brain more vulnerable to Alzheimer's disease.

  8. The analysis method of the DRAM cell pattern hotspot

    NASA Astrophysics Data System (ADS)

    Lee, Kyusun; Lee, Kweonjae; Chang, Jinman; Kim, Taeheon; Han, Daehan; Hong, Aeran; Kim, Yonghyeon; Kang, Jinyoung; Choi, Bumjin; Lee, Joosung; Lee, Jooyoung; Hong, Hyeongsun; Lee, Kyupil; Jin, Gyoyoung

    2015-03-01

    It is increasingly difficult to determine the degree of completion of the patterning and the distribution of DRAM cell patterns. Research on DRAM cell patterns currently faces three major problems. First, due to etch loading, it is difficult to predict potential defects. Second, due to under-layer topology, it is impossible to demonstrate the influence of a hotspot. Finally, it is extremely difficult to predict the final ACI pattern by photo simulation, because the current patterning process uses double patterning technology, meaning the photo pattern is completely different from the final etch pattern. Therefore, if a hotspot occurs on the wafer, it is very difficult to find. CD-SEM is the most common pattern measurement tool at semiconductor fabrication sites; it is primarily used to accurately measure small regions of the wafer pattern, so there is no possibility of finding places where unpredictable defects occur. Even though current defect detectors can measure a wide area, when every chip has the same pattern issue the detector cannot find critical hotspots: because the defect-detecting algorithm of a bright-field machine is based on image processing, if the same problem occurs on both the reference chip and the inspected chip, the machine cannot identify it. Moreover, these instruments cannot distinguish distribution differences of about 1-3 nm, so defect detectors struggle to handle data for potential weak points far below the target CD. To solve these problems, another method is needed. In this paper, we introduce an analysis method for DRAM cell pattern hotspots.

  9. Paradoxical Behavior of Granger Causality

    NASA Astrophysics Data System (ADS)

    Witt, Annette; Battaglia, Demian; Gail, Alexander

    2013-03-01

    Granger causality is a standard tool for describing directed interactions among network components and is popular in many scientific fields, including econometrics, neuroscience and climate science. For time series that can be modeled as bivariate auto-regressive processes we analytically derive an expression for spectrally decomposed Granger causality (SDGC) and show that this quantity depends on only two of the four groups of model parameters. We then present examples of such processes whose SDGC exhibits paradoxical behavior, in the sense that causality is high in frequency ranges with low spectral power. To avoid misinterpretations of Granger causality analysis we propose complementing it with partial spectral analysis. Our findings are illustrated by an example from brain electrophysiology. Finally, we draw implications for the conventional definition of Granger causality. (Bernstein Center for Computational Neuroscience Goettingen)
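
    As a time-domain companion to the spectral quantity discussed above, the following sketch estimates ordinary (time-domain) Granger causality for a simulated bivariate AR(1) process; the coupling coefficients and noise terms are illustrative assumptions, not the paper's examples. Causality is the log ratio of residual variances between a model restricted to the target's own past and a full model that also includes the source's past.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated bivariate AR(1) with coupling only from x to y
# (hypothetical coefficients for illustration, not the paper's examples).
n = 5000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.4 * x[t - 1] + rng.normal()

def granger(source, target, p=1):
    """Time-domain Granger causality source -> target at AR order p:
    log ratio of restricted to full residual variance."""
    T = len(target)
    past = lambda s, k: s[p - k - 1:T - k - 1]   # k-lagged regressor column
    Xr = np.column_stack([past(target, k) for k in range(p)])
    Xf = np.column_stack([Xr] + [past(source, k) for k in range(p)])
    yv = target[p:]
    res_r = yv - Xr @ np.linalg.lstsq(Xr, yv, rcond=None)[0]
    res_f = yv - Xf @ np.linalg.lstsq(Xf, yv, rcond=None)[0]
    return float(np.log(res_r.var() / res_f.var()))

gc_xy = granger(x, y)  # substantial: x drives y
gc_yx = granger(y, x)  # near zero: no feedback from y to x
```

    The paradox the paper describes lives in the frequency-resolved version of this quantity; the time-domain value above is its integral over all frequencies.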

  10. A multi-phase network situational awareness cognitive task analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erbacher, Robert; Frincke, Deborah A.; Wong, Pak C.

    Abstract: The goal of our project is to create a set of next-generation cyber situational-awareness capabilities, with applications to other domains in the long term. The objective is to improve the decision-making process so that decision makers can choose better actions. To this end, we put extensive effort into obtaining feedback from network analysts and managers and understanding their genuine needs. This article discusses the cognitive task-analysis methodology that we followed to acquire feedback from the analysts. It also details what we learned from the analysts about their processes, goals, concerns, and the data and metadata that they analyze. Finally, we describe the generation of a novel task-flow diagram representing the activities of the target user base.

  11. Techniques for fire detection

    NASA Technical Reports Server (NTRS)

    Bukowski, Richard W.

    1987-01-01

    An overview is given of the basis for an analysis of combustible materials and potential ignition sources in a spacecraft. First, the burning process is discussed in terms of the production of the fire signatures normally associated with detection devices. These include convected and radiated thermal energy, particulates, and gases. Second, the transport processes associated with the movement of these from the fire to the detector, along with the important phenomena which cause the level of these signatures to be reduced, are described. Third, the operating characteristics of the individual types of detectors which influence their response to signals are presented. Finally, vulnerability analysis using predictive fire modeling techniques is discussed as a means to establish the necessary response of the detection system to provide the level of protection required in the application.

  12. A neural network model of metaphor understanding with dynamic interaction based on a statistical language analysis: targeting a human-like model.

    PubMed

    Terai, Asuka; Nakagawa, Masanori

    2007-08-01

    The purpose of this paper is to construct a model that represents the human process of understanding metaphors, focusing specifically on similes of the form "an A like B". Generally speaking, human beings are able to generate and understand many sorts of metaphors. This study constructs the model on the basis of a probabilistic knowledge structure for concepts, computed from a statistical analysis of a large-scale corpus; consequently, the model is able to cover the many kinds of metaphors that human beings can generate. Moreover, the model implements the dynamic process of metaphor understanding by using a neural network with dynamic interactions. Finally, the validity of the model is confirmed by comparing model simulations with the results of a psychological experiment.

  13. Localization in covariance matrices of coupled heterogenous Ornstein-Uhlenbeck processes

    NASA Astrophysics Data System (ADS)

    Barucca, Paolo

    2014-12-01

    We define a random-matrix ensemble given by the infinite-time covariance matrices of Ornstein-Uhlenbeck processes at different temperatures coupled by a Gaussian symmetric matrix. The spectral properties of this ensemble are shown to be in qualitative agreement with some stylized facts of financial markets. Through the presented model, formulas are given for the analysis of heterogeneous time series. Furthermore, evidence for a localization transition in the eigenvectors related to the small and large eigenvalues in cross-correlation analysis of this model is found, and a simple explanation of localization phenomena in financial time series is provided. Finally, we identify, both in our model and in real financial data, an inverted-bell effect in the correlation between localized components and their local temperature: high- and low-temperature components are the most localized ones.
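
    The infinite-time covariance matrices that define the ensemble can be computed directly: for a linear (Ornstein-Uhlenbeck) system dX = -B X dt + sqrt(2 T_i) dW_i, the stationary covariance C solves the Lyapunov equation B C + C B^T = 2 diag(T). The sketch below solves it as a linear system via Kronecker products; the drift matrix, coupling scale, and temperatures are illustrative assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 5  # number of coupled components

# Hypothetical drift and temperatures for illustration (not the paper's
# parameters): symmetric Gaussian coupling plus a stabilizing diagonal.
J = rng.normal(scale=0.1, size=(N, N))
B = np.eye(N) + (J + J.T) / 2           # eigenvalues have positive real part
temps = rng.uniform(0.5, 2.0, size=N)   # heterogeneous "temperatures"
Q = 2 * np.diag(temps)

# Stationary covariance C of dX = -B X dt + sqrt(2 T_i) dW_i solves the
# Lyapunov equation B C + C B^T = Q; rewrite it as an N^2 x N^2 linear
# system using Kronecker products (row-major vectorization of C).
I = np.eye(N)
A = np.kron(B, I) + np.kron(I, B)
C = np.linalg.solve(A, Q.reshape(-1)).reshape(N, N)
```

    Sampling many coupling matrices J and collecting the resulting C matrices reproduces, in miniature, the ensemble whose spectra and eigenvector localization the paper studies.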

  14. An Algebra-Based Introductory Computational Neuroscience Course with Lab.

    PubMed

    Fink, Christian G

    2017-01-01

    A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.

  15. The Isolation of Nanofibre Cellulose from Oil Palm Empty Fruit Bunch Via Steam Explosion and Hydrolysis with HCl 10%

    NASA Astrophysics Data System (ADS)

    Gea, S.; Zulfahmi, Z.; Yunus, D.; Andriayani, A.; Hutapea, Y. A.

    2018-03-01

    Cellulose nanofibrils were obtained from oil palm empty fruit bunch by steam explosion followed by hydrolysis with a 10% HCl solution. Steam explosion coupled with acid-hydrolysis pretreatment of the oil palm empty fruit bunch was very effective for depolymerization and defibrillation of the fibre, producing fibers of nanoscale dimensions. The structure of the steam-exploded fibers was analyzed by Fourier-transform infrared (FT-IR) spectroscopy. The thermal stability of the cellulose was measured by TGA, and fibril dimensions were determined with the image-analysis software ImageJ. Characterization of the fibers by TEM and SEM showed that fiber diameter decreases with the mechano-chemical treatment, with a final nanofibril size of 20-30 nm. FT-IR and TGA data confirmed the removal of hemicellulose and lignin during the chemical treatment process.

  16. Effects of Type of Agreement Violation and Utterance Position on the Auditory Processing of Subject-Verb Agreement: An ERP Study

    PubMed Central

    Dube, Sithembinkosi; Kung, Carmen; Peter, Varghese; Brock, Jon; Demuth, Katherine

    2016-01-01

    Previous ERP studies have often reported two ERP components—LAN and P600—in response to subject-verb (S-V) agreement violations (e.g., the boys *runs). However, the latency, amplitude and scalp distribution of these components have been shown to vary depending on various experiment-related factors. One factor that has not received attention is the extent to which the relative perceptual salience related to either the utterance position (verbal inflection in utterance-medial vs. utterance-final contexts) or the type of agreement violation (errors of omission vs. errors of commission) may influence the auditory processing of S-V agreement. The lack of reports on these effects in ERP studies may be due to the fact that most studies have used the visual modality, which does not reveal acoustic information. To address this gap, we used ERPs to measure the brain activity of Australian English-speaking adults while they listened to sentences in which the S-V agreement differed by type of agreement violation and utterance position. We observed early negative and positive clusters (AN/P600 effects) for the overall grammaticality effect. Further analysis revealed that the mean amplitude and distribution of the P600 effect was only significant in contexts where the S-V agreement violation occurred utterance-finally, regardless of type of agreement violation. The mean amplitude and distribution of the negativity did not differ significantly across types of agreement violation and utterance position. These findings suggest that the increased perceptual salience of the violation in utterance final position (due to phrase-final lengthening) influenced how S-V agreement violations were processed during sentence comprehension. Implications for the functional interpretation of language-related ERPs and experimental design are discussed. PMID:27625617

  18. Neural Network Modeling for Gallium Arsenide IC Fabrication Process and Device Characteristics.

    NASA Astrophysics Data System (ADS)

    Creech, Gregory Lee, I.

    This dissertation presents research focused on the utilization of neurocomputing technology to achieve enhanced yield and effective yield prediction in integrated circuit (IC) manufacturing. Artificial neural networks are employed to model complex relationships between material and device characteristics at critical stages of the semiconductor fabrication process. Whole wafer testing was performed on the starting substrate material and during wafer processing at four critical steps: Ohmic or Post-Contact, Post-Recess, Post-Gate and Final, i.e., at completion of fabrication. Measurements taken and subsequently used in modeling include, among others, doping concentrations, layer thicknesses, planar geometries, layer-to-layer alignments, resistivities, device voltages, and currents. The neural network architecture used in this research is the multilayer perceptron neural network (MLPNN). The MLPNN is trained in the supervised mode using the generalized delta learning rule. It has one hidden layer and uses continuous perceptrons. The research focuses on a number of different aspects. First is the development of inter-process stage models. Intermediate process stage models are created in a progressive fashion. Measurements of material and process/device characteristics taken at a specific processing stage and any previous stages are used as input to the model of the next processing stage characteristics. As the wafer moves through the fabrication process, measurements taken at all previous processing stages are used as input to each subsequent process stage model. Secondly, the development of neural network models for the estimation of IC parametric yield is demonstrated. Measurements of material and/or device characteristics taken at earlier fabrication stages are used to develop models of the final DC parameters. These characteristics are computed with the developed models and compared to acceptance windows to estimate the parametric yield. 
A sensitivity analysis is performed on the models developed during this yield estimation effort. This is accomplished by analyzing the total disturbance of network outputs due to perturbed inputs. When an input characteristic bears no, or little, statistical or deterministic relationship to the output characteristics, it can be removed as an input. Finally, neural network models are developed in the inverse direction. Characteristics measured after the final processing step are used as the input to model critical in-process characteristics. The modeled characteristics are used for whole wafer mapping and its statistical characterization. It is shown that this characterization can be accomplished with minimal in-process testing. The concepts and methodologies used in the development of the neural network models are presented. The modeling results are provided and compared to the actual measured values of each characteristic. An in-depth discussion of these results and ideas for future research are presented.
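
    A minimal sketch of the network architecture described above: one hidden layer of continuous (sigmoid) perceptrons trained in supervised mode with the generalized delta rule, i.e. batch gradient descent on squared error. The data, layer sizes, and learning rate are illustrative stand-ins; the dissertation's wafer measurements are not public.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in data: four "in-process measurements" mapped to one
# "final DC parameter" through a smooth nonlinear relationship.
X = rng.uniform(-1, 1, size=(200, 4))
y = np.tanh(X @ np.array([0.8, -0.5, 0.3, 0.1]))[:, None]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of continuous (sigmoid) perceptrons, linear output unit.
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.3

def forward(X):
    H = sigmoid(X @ W1 + b1)
    return H, H @ W2 + b2

_, out0 = forward(X)
mse0 = float(((out0 - y) ** 2).mean())       # error before training

for epoch in range(3000):
    H, out = forward(X)
    err = out - y                            # delta at the output layer
    dH = err @ W2.T * H * (1 - H)            # delta backpropagated through sigmoid
    W2 -= lr * H.T @ err / len(X); b2 -= lr * err.mean(0)
    W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(0)

_, out = forward(X)
mse = float(((out - y) ** 2).mean())
```

    In the dissertation's setting, one such network would be trained per process stage, with measurements from all earlier stages concatenated into the input vector.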

  19. CFD analysis of turbopump volutes

    NASA Technical Reports Server (NTRS)

    Ascoli, Edward P.; Chan, Daniel C.; Darian, Armen; Hsu, Wayne W.; Tran, Ken

    1993-01-01

    An effort is underway to develop a procedure for the regular use of CFD analysis in the design of turbopump volutes. Airflow data to be taken at NASA Marshall will be used to validate the CFD code and overall procedure. Initial focus has been on preprocessing (geometry creation, translation, and grid generation). Volute geometries have been acquired electronically and imported into the CATIA CAD system and RAGGS (Rockwell Automated Grid Generation System) via the IGES standard. An initial grid topology has been identified and grids have been constructed for turbine inlet and discharge volutes. For CFD analysis of volutes to be used regularly, a procedure must be defined to meet engineering design needs in a timely manner. Thus, a compromise must be established between making geometric approximations, the selection of grid topologies, and possible CFD code enhancements. While the initial grid developed approximated the volute tongue with a zero thickness, final computations should more accurately account for the geometry in this region. Additionally, grid topologies will be explored to minimize skewness and high aspect ratio cells that can affect solution accuracy and slow code convergence. Finally, as appropriate, code modifications will be made to allow for new grid topologies in an effort to expedite the overall CFD analysis process.

  20. The Multi-Isotope Process (MIP) Monitor Project: FY13 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meier, David E.; Coble, Jamie B.; Jordan, David V.

    The Multi-Isotope Process (MIP) Monitor provides an efficient approach to monitoring the process conditions in reprocessing facilities in support of the goal of "... (minimization of) the risks of nuclear proliferation and terrorism." The MIP Monitor measures the distribution of the radioactive isotopes in product and waste streams of a nuclear reprocessing facility. These isotopes are monitored online by gamma spectrometry and compared, in near-real-time, to spectral patterns representing "normal" process conditions using multivariate analysis and pattern recognition algorithms. The combination of multivariate analysis and gamma spectroscopy allows us to detect small changes in the gamma spectrum, which may indicate changes in process conditions. By targeting multiple gamma-emitting indicator isotopes, the MIP Monitor approach is compatible with the use of small, portable, relatively high-resolution gamma detectors that may be easily deployed throughout an existing facility. The automated multivariate analysis can provide a level of data obscurity, giving a built-in information barrier to protect sensitive or proprietary operational data. Proof-of-concept simulations and experiments were performed in previous years to demonstrate the validity of this tool in a laboratory setting for systems representing aqueous reprocessing facilities. However, pyroprocessing is emerging as an alternative to aqueous reprocessing techniques.

  1. Sequencing of Dust Filter Production Process Using Design Structure Matrix (DSM)

    NASA Astrophysics Data System (ADS)

    Sari, R. M.; Matondang, A. R.; Syahputri, K.; Anizar; Siregar, I.; Rizkya, I.; Ursula, C.

    2018-01-01

    A metal casting company produces machinery spare parts for manufacturers; one of its products is a dust filter, used by most palm oil mills. Because the product is so widely used, the company often has problems managing its production. One such problem is a disordered production process caused by poor job sequencing: important jobs that should be completed first are implemented last, while less important jobs that could be completed later are implemented first. The Design Structure Matrix (DSM) is used to analyse and determine priorities in the production process. DSM analysis sorts the production process through dependency sequencing; the resulting sequence reflects the inter-process linkages, considering the activities before and after each step. Finally, the analysis identifies the coupled activities: metal smelting, refining, grinding, cutting of the container castings, removal of the metal from the molds, metal casting, the coating processes, and manufacture of the sand molds.
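
    The dependency-sequencing step can be illustrated with a small sketch of classic DSM partitioning: tasks whose dependencies are satisfied are scheduled first, and when no task is ready, one dependency cycle is collapsed into a coupled block that is then scheduled as a unit. The task names and dependency sets below are hypothetical, loosely echoing the process steps named in the abstract, not the paper's actual DSM.

```python
def dsm_sequence(deps):
    """Classic DSM partitioning: schedule tasks whose dependencies are met;
    when blocked, collapse one dependency cycle into a coupled block."""
    deps = {t: set(d) for t, d in deps.items()}
    schedule, done = [], set()
    while deps:
        ready = [t for t, d in deps.items() if d <= done]
        if ready:
            for t in ready:
                deps.pop(t)
                done.add(t)
            schedule.append(sorted(ready, key=str))
        else:
            # Follow unmet dependencies until a task repeats: that loop
            # is a set of coupled (mutually dependent) activities.
            t, seen = next(iter(deps)), []
            while t not in seen:
                seen.append(t)
                t = next(d for d in deps[t] if d not in done)
            cycle = set(seen[seen.index(t):])
            block = frozenset(cycle)
            merged = set().union(*(deps[c] for c in cycle)) - cycle
            for c in cycle:
                deps.pop(c)
            for d in deps.values():       # re-point references to the block
                if d & cycle:
                    d -= cycle
                    d.add(block)
            deps[block] = merged
    return schedule

# Hypothetical dependency sets (deps[j] = tasks whose output j needs).
deps = {
    "mold_making": set(),
    "smelting": {"mold_making"},
    "casting": {"smelting", "coating"},   # casting and coating are coupled
    "coating": {"casting"},
    "cutting": {"casting"},
    "grinding": {"cutting"},
    "refining": {"grinding"},
}
schedule = dsm_sequence(deps)
```

    The coupled pair surfaces as a single frozenset entry in the schedule, which is exactly how DSM partitioning flags activities that must be iterated together.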

  2. A combined approach based on MAF analysis and AHP method to fault detection mapping: A case study from a gas field, southwest of Iran

    NASA Astrophysics Data System (ADS)

    Shakiba, Sima; Asghari, Omid; Khah, Nasser Keshavarz Faraj

    2018-01-01

    A combined geostatistical methodology based on Min/Max Autocorrelation Factor (MAF) analysis and the Analytical Hierarchy Process (AHP) is presented to generate a suitable fault detection map (FDM) from seismic attributes. Five seismic attributes derived from a 2D time slice of data from a gas field located in southwest Iran are used: instantaneous amplitude, similarity, energy, frequency, and a fault enhancement filter (FEF). MAF analysis is applied to reduce the dimension of the input variables, and the AHP method is then applied to the three resulting de-correlated MAF factors as evidential layers. Three decision makers (DMs) construct pairwise comparison matrices (PCMs) to determine the weights of the selected evidential layers. Finally, the weights obtained by AHP are multiplied by the normalized values of each alternative (MAF layer), and the weighted layers are integrated to prepare the final FDM. The results show that the proposed algorithm generates a map more acceptable than any individual attribute, sharpens the non-surface discontinuities, and enhances the continuity of the detected faults.
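
    The AHP weighting step works roughly as follows: each decision maker fills in a pairwise comparison matrix (PCM), the layer weights are taken as the normalized principal eigenvector of the PCM, and Saaty's consistency ratio guards against incoherent judgments. A minimal sketch; the PCM entries below are invented for illustration, since the decision makers' actual judgments are not given in the abstract.

```python
import numpy as np

# Hypothetical 3x3 pairwise comparison matrix (PCM) over the three MAF
# factors, on Saaty's 1-9 scale (reciprocal by construction).
P = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# AHP priority weights: the normalized principal right eigenvector of P.
vals, vecs = np.linalg.eig(P)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# Saaty's consistency check: CI = (lambda_max - n)/(n - 1), CR = CI/RI,
# with random index RI = 0.58 for n = 3; CR < 0.1 is conventionally acceptable.
n = P.shape[0]
cr = ((vals.real[k] - n) / (n - 1)) / 0.58
```

    The final map in the paper is then the weighted sum of the normalized MAF layers using weights like `w`.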

  3. Ares I-X Range Safety Simulation Verification and Analysis Independent Validation and Verification

    NASA Technical Reports Server (NTRS)

    Merry, Carl M.; Tarpley, Ashley F.; Craig, A. Scott; Tartabini, Paul V.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle

    2011-01-01

    NASA's Ares I-X vehicle launched on a suborbital test flight from the Eastern Range in Florida on October 28, 2009. To obtain approval for launch, a range safety final flight data package was generated to meet the data requirements defined in Air Force Space Command Manual 91-710, Volume 2. The delivery included products such as a nominal trajectory, trajectory envelopes, stage disposal data and footprints, and a malfunction turn analysis. The Air Force's 45th Space Wing uses these products to ensure public and launch area safety. Due to the criticality of these data, an independent validation and verification effort was undertaken to ensure data quality and adherence to requirements. As a result, the product package was delivered with the confidence that independent organizations, using separate simulation software, had generated data that met the range requirements and yielded consistent results. This document captures the Ares I-X final flight data package verification and validation analysis, including the methodology used to validate and verify simulation inputs, execution, and results, and presents lessons learned during the process.

  4. Psychotherapy research needs theory. Outline for an epistemology of the clinical exchange.

    PubMed

    Salvatore, Sergio

    2011-09-01

    This paper provides an analysis of a basic assumption grounding clinical research: the ontological autonomy of psychotherapy, based on the idea that the clinical exchange is sufficiently distinguished from other social objects (e.g., exchanges between teacher and pupils, or between buyer and seller, or interaction during dinner, and so forth). A criticism of this assumption is discussed, together with the proposal of a different epistemological interpretation based on the distinction between communicative dynamics and the process of psychotherapy: psychotherapy is a goal-oriented process based on the general dynamics of human communication. Theoretical and methodological implications are drawn from this view: it allows further sources of knowledge to be integrated within clinical research (e.g., those coming from other domains of the analysis of human communication), and it enables a more abstract definition of the psychotherapy process to be developed, leading to innovative views of classical critical issues, like the specific-nonspecific debate. The final part of the paper is devoted to presenting a model of human communication, the Semiotic Dialogical Dialectic Theory, which is meant as the framework for the analysis of psychotherapy.

  5. Statistical t Analysis for the Solution of Prediction Trash Management in Dusun Tanjung Sari Kec. Ngaglik Kab Sleman, Yogyakarta

    NASA Astrophysics Data System (ADS)

    Salmahaminati; Husnaqilati, Atina; Yahya, Amri

    2017-01-01

    Trash management is one form of community participation in maintaining good hygiene, locally and nationally. Trash is the remainder of everyday consumption; rather than simply being disposed of, it can undergo waste processing that is beneficial and improves hygiene, for example by sorting plastic and processing it into useful goods. In this study, we identify the factors that affect residents' willingness to process their waste. These factors concern the identity and circumstances of each resident; once they are known, education about waste management is provided, and the results of this outreach are compared using preliminary data collected before the outreach and final data collected after it. Multiple logistic regression is used to identify the factors that influence residents' willingness to process waste, while the before-and-after results are compared using a t analysis. The data are derived from a statistical instrument in the form of a questionnaire.
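
    A rough sketch of the two analyses named in the abstract — multiple logistic regression for the factor screening, and a paired t statistic for the before/after comparison — on synthetic stand-in data, since the survey itself is not public. The factor names, coefficients, and sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for the questionnaire: two respondent factors and a
# binary "willing to process waste" response.
n = 400
age = rng.uniform(20, 70, n)
attended = rng.integers(0, 2, n).astype(float)   # attended the outreach session?
true_logit = -1.0 + 0.03 * (age - 45) + 1.5 * attended
willing = (rng.uniform(size=n) < 1 / (1 + np.exp(-true_logit))).astype(float)

# Multiple logistic regression fitted by plain gradient ascent on the
# log-likelihood (features standardized for stable steps).
age_z = (age - age.mean()) / age.std()
X = np.column_stack([np.ones(n), age_z, attended])
beta = np.zeros(3)
for _ in range(20000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (willing - p) / n

# Paired t statistic comparing attitude scores before and after the outreach.
pre = rng.normal(3.0, 1.0, 30)
post = pre + rng.normal(0.5, 0.5, 30)
d = post - pre
t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
```

    In practice the regression would include all surveyed factors, and the t statistic would be compared against the t distribution with n-1 degrees of freedom.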

  6. Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)

    NASA Astrophysics Data System (ADS)

    Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.

    2017-12-01

    We have developed a system that supports the full life cycle of a data analysis process, from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system called Climate Model Diagnostic Analyzer (CMDA) is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle. Data System manages datasets used by CMDA analysis tools, Analysis System manages CMDA analysis tools which are all web services, Provenance System manages the meta data of CMDA datasets and the provenance of CMDA analysis history, and Recommendation System extracts knowledge from CMDA usage history and recommends datasets/analysis tools to users. These four subsystems are not only highly integrated but also easily expandable. New datasets can be easily added to Data System and scanned to be visible to the other subsystems. New analysis tools can be easily registered to be available in the Analysis System and Provenance System. With CMDA, a user can start a data analysis process by discovering datasets of relevance to their research topic using the Recommendation System. Next, the user can customize the discovered datasets for their scientific use (e.g. anomaly calculation, regridding, etc) with tools in the Analysis System. Next, the user can do their analysis with the tools (e.g. conditional sampling, time averaging, spatial averaging) in the Analysis System. Next, the user can reanalyze the datasets based on the previously stored analysis provenance in the Provenance System. Further, they can publish their analysis process and result to the Provenance System to share with other users. Finally, any user can reproduce the published analysis process and results. 
By supporting the full life cycle of climate data analysis, CMDA improves the research productivity and collaboration level of its user.

  7. Knitting Mochilas: A Sociocultural, Developmental Practice in Arhuaco Indigenous Communities

    PubMed Central

    Rodríguez-Burgos, Lilian Patricia; Rodríguez-Castro, Jennifer; Bojacá-Rodríguez, Sandra Milena; Izquierdo-Martínez, Dwrya Elena; Amórtegui-Lozano, Allain Alexander; Prieto-Castellanos, Miguel Angel

    2016-01-01

    The purpose of this article is to analyze the psycho-cultural processes involved in knitting “mochilas” (traditional bags), a common craft in the Arhuaco indigenous community located in the Sierra Nevada de Santa Marta, Colombia. The article is structured in three parts, as follows: first, issues related to child development are discussed; then, the analysis method used to study the processes involved in the practice of knitting is presented and, finally, we reflect on the importance of recovering the sense and meaning of this everyday practice as a way to study child development. PMID:27298634

  8. Experimental Characterization of Aluminum-Based Hybrid Composites Obtained Through Powder Metallurgy

    NASA Astrophysics Data System (ADS)

    Marcu, D. F.; Buzatu, M.; Ghica, V. G.; Petrescu, M. I.; Popescu, G.; Niculescu, F.; Iacob, G.

    2018-06-01

    The paper presents some experimental results concerning fabrication through powder metallurgy (P/M) of aluminum-based hybrid composites - Al/Al2O3/Gr. In order to understand the mechanisms that occur during the P/M processes of obtaining the Al/Al2O3/Gr composite, we correlated the physical characteristics with the micro-structural characteristics. The characterization was performed using analysis techniques specific to the P/M process, together with SEM-EDS and XRD analyses. Micro-structural characterization of the composites revealed a fairly uniform distribution of the reinforcement, resulting in good properties of the final composite material.

  9. Computer-assisted cartography: an overview.

    USGS Publications Warehouse

    Guptill, S.C.; Starr, L.E.

    1984-01-01

    An assessment of the current status of computer-assisted cartography is, in part, biased by one's view of the cartographic process as a whole. From a traditional viewpoint we are concerned with automating the mapping process; from a progressive viewpoint we are concerned with using the tools of computer science to convey spatial information. On the surface these viewpoints appear to be in opposition; however, it is postulated that, in the final analysis, they pursue the same goal. This overview uses the perspectives of both viewpoints to depict the current state of computer-assisted cartography and to speculate on future goals, trends, and challenges. -Authors

  10. Cyclopropenimine superbases: Competitive initiation processes in lactide polymerization

    DOE PAGES

    Stukenbroeker, Tyler S.; Bandar, Jeffrey S.; Zhang, Xiangyi; ...

    2015-07-30

    Cyclopropenimine superbases were employed in this study to catalyze the ring-opening polymerization of lactide. Polymerization occurred readily in the presence and absence of alcohol initiators. Polymerizations in the absence of alcohol initiators revealed a competitive initiation mechanism involving deprotonation of lactide by the cyclopropenimine to generate an enolate. NMR and MALDI-TOF analysis of the poly(lactides) generated from cyclopropenimines in the absence of alcohol initiators showed acylated lactide and hydroxyl end groups. Finally, model studies and comparative experiments with guanidine and phosphazene catalysts revealed the subtle influence of the nature of the superbase on competitive initiation processes.

  11. Potential impact of the implementation of multiple-criteria decision analysis (MCDA) on the Polish pricing and reimbursement process of orphan drugs.

    PubMed

    Kolasa, Katarzyna; Zwolinski, Krzysztof M; Kalo, Zoltan; Hermanowski, Tomasz

    2016-03-10

    The objective of this study was to assess the potential impact of the implementation of multiple-criteria decision analysis (MCDA) on the Polish pricing and reimbursement (P&R) process with regard to orphan drugs. A four step approach was designed. Firstly, a systematic literature review was conducted to select the MCDA criteria. Secondly, a database of orphan drugs was established. Thirdly, health technology appraisals (HTA recommendations) were categorized and an MCDA appraisal was conducted. Finally, a comparison of HTA and MCDA outcomes was carried out. An MCDA outcome was considered positive if more than 50% of the maximum number of points was reached (base case). In the sensitivity analysis, 25% and 75% thresholds were tested as well. Out of 2242 publications, 23 full-text articles were included. The final MCDA tool consisted of ten criteria. In total, 27 distinctive drug-indication pairs regarding 21 drugs were used for the study. Six negative and 21 positive HTA recommendations were issued. In the base case, there were 19 positive MCDA outcomes. Of the 27 cases, there were 12 disagreements between the HTA and MCDA outcomes, the majority of which related to positive HTA guidance for negative MCDA outcomes. All drug-indication pairs with negative HTA recommendations were appraised positively in the MCDA framework. Economic details were available for 12 cases, of which there were 9 positive MCDA outcomes. Amongst the 12 drug-indication pairs, two were negatively appraised in the HTA process, with positive MCDA guidance, and two were appraised in the opposite direction. An MCDA approach may lead to different P&R outcomes compared to a standard HTA process. On the one hand, enrichment of the list of decision making criteria means further scrutiny of a given health technology and as such increases the odds of a negative P&R outcome. On the other hand, it may uncover additional values and as such increase the odds of positive P&R outcomes.
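
    The scoring rule described above — an outcome is positive when more than 50% of the maximum number of points is reached, with 25% and 75% thresholds tested in sensitivity analysis — can be sketched directly. The criterion names, point ranges, and drug scores below are illustrative stand-ins, not the study's actual ten criteria or appraisals.

```python
# Maximum attainable points per criterion (hypothetical five-criterion tool;
# the study's MCDA tool had ten criteria, not listed in the abstract).
criteria_max = {"disease_severity": 3, "unmet_need": 3, "efficacy": 3,
                "safety": 3, "quality_of_evidence": 3}

def mcda_outcome(scores, threshold=0.5):
    """Positive when the summed score exceeds threshold * maximum score."""
    total = sum(scores.values())
    max_total = sum(criteria_max.values())
    return "positive" if total > threshold * max_total else "negative"

# Two invented drug-indication appraisals for illustration.
drug_a = {"disease_severity": 3, "unmet_need": 2, "efficacy": 2,
          "safety": 1, "quality_of_evidence": 2}   # 10 of 15 points
drug_b = {"disease_severity": 1, "unmet_need": 1, "efficacy": 1,
          "safety": 2, "quality_of_evidence": 1}   # 6 of 15 points
```

    Raising the threshold to 0.75 flips borderline appraisals to negative, which is the mechanism behind the study's sensitivity analysis.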

  12. A meta-analysis and review of holistic face processing.

    PubMed

    Richler, Jennifer J; Gauthier, Isabel

    2014-09-01

    The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, 2 measures of holistic processing based on the composite paradigm (complete design and partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the 2 designs, and which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs and use this as further evidence for one design over the other. The meta-analysis effect size of holistic processing in the complete design is nearly 3 times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task, and suggests that in an individual differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the first sections of our review (the complete design), and outline outstanding research questions in that new context. PsycINFO Database Record (c) 2014 APA, all rights reserved.
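
The kind of inverse-variance summary that underlies such a meta-analytic comparison can be sketched as follows; the per-study effect sizes and variances are invented for illustration and are not the paper's data:

```python
import math

def fixed_effect_summary(effects, variances):
    """Inverse-variance weighted summary effect size (fixed-effect model)
    and its standard error."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return mean, se

# Hypothetical per-study effects (Cohen's d) and variances, one set per design.
complete = fixed_effect_summary([0.9, 1.1, 1.0], [0.04, 0.05, 0.03])
partial = fixed_effect_summary([0.3, 0.4, 0.35], [0.04, 0.05, 0.03])

ratio = complete[0] / partial[0]    # how many times larger the complete-design
print(round(ratio, 2))              # summary effect is, in this toy example
```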

  13. Microchemical Systems for Fuel Processing and Conversion to Electrical Power

    DTIC Science & Technology

    2007-03-15

    [Abstract not available; the record contains only table-of-contents fragments from the final report, including Section 8.7, "Development of Large Free-Standing Electrolyte-supported Micro Fuel Cell Membranes," of the MURI project "Microchemical Systems for Fuel Processing and Conversion to Electrical Power."]

  14. Improving the treatment planning and delivery process of Xoft electronic skin brachytherapy.

    PubMed

    Manger, Ryan; Rahn, Douglas; Hoisak, Jeremy; Dragojević, Irena

    2018-05-14

    To develop an improved Xoft electronic skin brachytherapy process and identify areas of further improvement. A multidisciplinary team conducted a failure modes and effects analysis (FMEA) by developing a process map and a corresponding list of failure modes. The failure modes were scored for their occurrence, severity, and detectability, and a risk priority number (RPN) was calculated for each failure mode as the product of occurrence, severity, and detectability. Corrective actions were implemented to address the higher risk failure modes, and a revised process was generated. The RPNs of the failure modes were compared between the initial process and final process to assess the perceived benefits of the corrective actions. The final treatment process consists of 100 steps and 114 failure modes. The FMEA took approximately 20 person-hours (one physician, three physicists, and two therapists) to complete. The 10 most dangerous failure modes had RPNs ranging from 336 to 630. Corrective actions were effective at addressing most failure modes (10 riskiest RPNs ranging from 189 to 310), yet the RPNs were higher than those published for alternative systems. Many of these high-risk failure modes remained due to hardware design limitations. FMEA helps guide process improvement efforts by emphasizing the riskiest steps. Significant risks are apparent when using a Xoft treatment unit for skin brachytherapy due to hardware limitations such as the lack of several interlocks, a short source lifespan, and variability in source output. The process presented in this article is expected to reduce but not eliminate these risks. Copyright © 2018 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.
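
The RPN calculation described above is a simple product of three ratings; a minimal sketch, with a hypothetical failure mode and invented scores (not the study's actual ratings):

```python
def rpn(occurrence, severity, detectability):
    """Risk priority number: the product of the occurrence, severity and
    detectability ratings (each conventionally scored on a 1-10 scale)."""
    return occurrence * severity * detectability

# Hypothetical failure mode, scored before and after a corrective action
# that reduces its occurrence rating.
before = rpn(occurrence=9, severity=10, detectability=7)   # 630
after = rpn(occurrence=3, severity=10, detectability=7)    # 210
print(before, after)
```

Because the three factors multiply, a corrective action that improves any one rating cuts the RPN proportionally, which is why hardware-limited failure modes (where none of the three can be improved) stay high-risk.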

  15. Evolution and Advances in Satellite Analysis of Volcanoes

    NASA Astrophysics Data System (ADS)

    Dean, K. G.; Dehn, J.; Webley, P.; Bailey, J.

    2008-12-01

    Over the past 20 years, satellite data used for monitoring and analysis of volcanic eruptions have evolved in terms of timeliness, access, distribution, resolution, and understanding of volcanic processes. Initially, satellite data were used for retrospective analysis, but they now feed proactive monitoring systems. Timely acquisition of data and the capability to distribute large data files paralleled advances in computer technology and were critical components for near real-time monitoring. The sharing of these data and the resulting discussions have improved our understanding of eruption processes and, even more importantly, their impact on society. To illustrate this evolution, critical scientific discoveries will be highlighted, including detection of airborne ash and sulfur dioxide, cloud-height estimates, prediction of ash cloud movement, and detection of thermal anomalies as precursor signals to eruptions. AVO has been a leader in implementing many of these advances in an operational setting, such as automated eruption detection, database analysis systems, and remotely accessible web-based analysis systems. Finally, limitations resulting from trade-offs in resolution, and the weaknesses they introduce into detection techniques and hazard assessments, will be presented.

  16. Blast investigation by fast multispectral radiometric analysis

    NASA Astrophysics Data System (ADS)

    Devir, A. D.; Bushlin, Y.; Mendelewicz, I.; Lessin, A. B.; Engel, M.

    2011-06-01

    Knowledge regarding the processes involved in blasts and detonations is required in various applications, e.g. missile interception, blasts of high-explosive materials, terminal ballistics and IED identification. Blasts release a large amount of energy over a short duration. Part of this energy is released as intense radiation in the optical spectral bands. This paper proposes to measure the blast radiation with a fast multispectral radiometer. The measurement is made simultaneously in appropriately chosen spectral bands. These spectral bands provide extensive information on the physical and chemical processes that govern the blast through the time dependence of the molecular and aerosol contributions to the detonation products. Multispectral blast measurements are performed in the visible, SWIR and MWIR spectral bands. Analysis of the cross-correlation between the measured multispectral signals gives the time dependence of the temperature, aerosol and gas composition of the blast. Further analysis of how these quantities develop in time may indicate the order of the detonation and the amount and type of explosive material. Examples of analysis of measured explosions are presented to demonstrate the power of the suggested fast multispectral radiometric analysis approach.

  17. Applicability of the Common Safety Method for Risk Evaluation and Assessment (CSM-RA) to the Space Domain

    NASA Astrophysics Data System (ADS)

    Moreira, Francisco; Silva, Nuno

    2016-08-01

    Safety systems require accident avoidance. This is covered by applicable standards, processes, techniques and tools that support the identification, analysis, and elimination or reduction to an acceptable level of system risks and hazards. Ideally, a safety system should be free of hazards. However, both industry and academia have been struggling to ensure appropriate risk and hazard analysis, especially concerning completeness of the hazards, formalization, and timely analysis that can influence the specifications and the implementation. Such analysis is also important when considering a change to an existing system. The Common Safety Method for Risk Evaluation and Assessment (CSM-RA) is a mandatory procedure whenever any significant change is proposed to the railway system in a European Member State. This paper provides insights into the fundamentals of CSM-RA, complemented with hazard analysis. It highlights when and how to apply them, and how these processes relate to industry standards and system life cycles. Finally, the paper shows how CSM-RA can be the basis of a change management process, guiding the identification and management of hazards and helping to ensure a safety level similar to that of the initial system. The paper will also show how the CSM-RA principles can be used in other domains, particularly for space system evolution.

  18. Accident analysis and control options in support of the sludge water system safety analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HEY, B.E.

    A hazards analysis was initiated for the SWS in July 2001 (SNF-8626, K Basin Sludge and Water System Preliminary Hazard Analysis) and updated in December 2001 (SNF-10020 Rev. 0, Hazard Evaluation for KE Sludge and Water System - Project A16) based on conceptual design information for the Sludge Retrieval System (SRS) and 60% design information for the cask and container. SNF-10020 was again revised in September 2002 to incorporate new hazards identified from final design information and from a What-if/Checklist evaluation of operational steps. The process hazards, controls, and qualitative consequence and frequency estimates taken from these efforts have been incorporated into Revision 5 of HNF-3960, K Basins Hazards Analysis. The hazards identification process documented in the above referenced reports utilized standard industrial safety techniques (AIChE 1992, Guidelines for Hazard Evaluation Procedures) to systematically guide several interdisciplinary teams through the system using a pre-established set of process parameters (e.g., flow, temperature, pressure) and guide words (e.g., high, low, more, less). The teams generally included representation from the U.S. Department of Energy (DOE), K Basins Nuclear Safety, T Plant Nuclear Safety, K Basin Industrial Safety, fire protection, project engineering, operations, and facility engineering.

  19. Durability Characteristics Analysis of Plastic Worm Wheel with Glass Fiber Reinforced Polyamide.

    PubMed

    Kim, Gun-Hee; Lee, Jeong-Won; Seo, Tae-Il

    2013-05-10

    The plastic worm wheel is widely used in vehicle manufacturing because it is favorable for weight reduction, vibration and noise reduction, and corrosion resistance. However, it is very difficult for general plastics to secure the mechanical properties required for vehicle gears. If the plastic resin is reinforced by glass fiber during fabrication of the worm wheel, it is possible to approach the mechanical properties of metallic materials. In this study, the mechanical characteristics of the glass-reinforced plastic worm wheel are analyzed as a function of glass fiber content by analytic and experimental methods. In the case of glass fiber-reinforced resin, the orientation and content of the glass fibers influence the mechanical properties. To predict the characteristics of the plastic worm wheel, computer-aided engineering (CAE) analyses, including structural and injection molding analysis, were executed for polyamide resin reinforced with glass fiber (25 wt %, 50 wt %). The injection mold for fabricating the prototype plastic worm wheel was designed and made to reflect the CAE analysis results. Finally, the durability of the prototype plastic worm wheel fabricated by injection molding was evaluated experimentally, along with its characteristics according to glass fiber content.

  20. Durability Characteristics Analysis of Plastic Worm Wheel with Glass Fiber Reinforced Polyamide

    PubMed Central

    Kim, Gun-Hee; Lee, Jeong-Won; Seo, Tae-Il

    2013-01-01

    The plastic worm wheel is widely used in vehicle manufacturing because it is favorable for weight reduction, vibration and noise reduction, and corrosion resistance. However, it is very difficult for general plastics to secure the mechanical properties required for vehicle gears. If the plastic resin is reinforced by glass fiber during fabrication of the worm wheel, it is possible to approach the mechanical properties of metallic materials. In this study, the mechanical characteristics of the glass-reinforced plastic worm wheel are analyzed as a function of glass fiber content by analytic and experimental methods. In the case of glass fiber-reinforced resin, the orientation and content of the glass fibers influence the mechanical properties. To predict the characteristics of the plastic worm wheel, computer-aided engineering (CAE) analyses, including structural and injection molding analysis, were executed for polyamide resin reinforced with glass fiber (25 wt %, 50 wt %). The injection mold for fabricating the prototype plastic worm wheel was designed and made to reflect the CAE analysis results. Finally, the durability of the prototype plastic worm wheel fabricated by injection molding was evaluated experimentally, along with its characteristics according to glass fiber content. PMID:28809248

  1. PRODUCCION DE PLACAS DELGADAS DE UO$sub 2$ INFORME NO. 71. (Production of Thin Plates of UO$sub 2$. Report No. 71)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koll, H.; Carrea, A.J.

    1962-01-01

    The effect of some parameters on the fabrication of thin plates of UO/sub 2/ by the sintering process is studied. Compacting pressures of 0.25 to 2 ton/cm/sub 2/, temperatures from 1100 to 1400 deg C, and sintering times from 1 to 3 hrs were used to determine the optimum values of these parameters. An analysis of the effect of the lubricant during compression showed that the results were improved by substituting polyethylene glycol types for stearic types, as the former were more easily removed from the compact and did not attack the UO/sub 2/ during sintering. Fracture during compression and extraction was studied. The compression law for the powder was determined, and the validity of the Bal'shin law was proved. The furnace atmosphere is of importance to the sintered product. Two types of atmosphere were analyzed: a neutral atmosphere during sintering with final reduction in hydrogen, and a slightly reducing atmosphere during the entire process. An analysis of the effects on the final density and porosity showed that adding 3% H/sub 2/ to Ar produced good density and a stoichiometric oxide in the final product. It was shown that density is not a sufficient measurement to evaluate the degree of sintering; only the combined use of density and porosity gives a good evaluation. The compression pressure has a great effect on the pore size and distribution in the sintered product. Best results are obtained with high pressures, which give small, uniformly distributed pores. A metallographic study was made to determine the relation between pore size and distribution and the process parameters. "Compact zones" were observed, with mean diameter from 1 to 2 mm and very reduced porosity. These zones had better hardness and resistance to corrosion and chemical attack than the rest of the material. (tr-auth)

  2. Introduction to Quantitative Science, a Ninth-Grade Laboratory-Centered Course Stressing Quantitative Observation and Mathematical Analysis of Experimental Results. Final Report.

    ERIC Educational Resources Information Center

    Badar, Lawrence J.

    This report, in the form of a teacher's guide, presents materials for a ninth grade introductory course on Introduction to Quantitative Science (IQS). It is intended to replace a traditional ninth grade general science course with a process-oriented course that will (1) unify the sciences, and (2) provide a quantitative preparation for the new science…

  3. Optical design and system characterization of an imaging microscope at 121.6 nm

    NASA Astrophysics Data System (ADS)

    Gao, Weichuan; Finan, Emily; Kim, Geon-Hee; Kim, Youngsik; Milster, Thomas D.

    2018-03-01

    We present the optical design and system characterization of an imaging microscope prototype at 121.6 nm. System engineering processes are demonstrated through the construction of a Schwarzschild microscope objective, including tolerance analysis, fabrication, alignment, and testing. Further improvements on the as-built system with a correction phase plate are proposed and analyzed. Finally, the microscope assembly and the imaging properties of the prototype are demonstrated.

  4. Environmental Impact Analysis Process. Supplement to Final Environmental Impact Statement Space Shuttle Program, Vandenberg AFB, California

    DTIC Science & Technology

    1983-07-01

    problems. Six appendices offer more detailed environmental assessments for the key issues of air quality impacts, inadvertent weather modification...research studies in problem areas, and newly-acquired knowledge of the affected environment. The physical, chemical, biological, and...Shuttle program, in conjunction with other projects within the county, will aggravate short-term problems concerning housing, and the quality and quantity

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarocki, John Charles; Zage, David John; Fisher, Andrew N.

    LinkShop is a software tool for applying the method of Linkography to the analysis of time-sequence data. LinkShop provides command line, web, and application programming interfaces (APIs) for the input and processing of time-sequence data, abstraction models, and ontologies. The software creates graph representations of the abstraction model, the ontology, and the derived linkograph. Finally, the tool allows the user to perform statistical measurements of the linkograph and refine the ontology through direct manipulation of the linkograph.

  6. Statistical Models and Inference Procedures for Structural and Materials Reliability

    DTIC Science & Technology

    1990-12-01

    Some general stress-strength models were also developed and applied to the failure of systems subject to cyclic loading. Involved in the failure of...process control ideas and sequential design and analysis methods. Finally, smooth nonparametric quantile function estimators were studied. All of

  7. Analysis of Requirements of On-Line Network Cataloging Services for Small, Academic, Public, School and Other Libraries: A Demonstration Project using the OCLC System. Final Report.

    ERIC Educational Resources Information Center

    Markuson, Barbara Evans

    This report results from a project using the OCLC system to provide catalog services to small libraries. Alternatives described include: centralized cataloging, centralized book processing, sharing of OCLC terminals, and use of dial-up terminals. The OCLC data base was found useful for all types of small libraries. It is recommended that network…

  8. Reliability modelling and analysis of a multi-state element based on a dynamic Bayesian network

    PubMed Central

    Xu, Tingxue; Gu, Junyuan; Dong, Qi; Fu, Linyu

    2018-01-01

    This paper presents a quantitative reliability modelling and analysis method for multi-state elements based on a combination of the Markov process and a dynamic Bayesian network (DBN), taking perfect repair, imperfect repair and condition-based maintenance (CBM) into consideration. The Markov models of elements without repair and under CBM are established, and an absorbing set is introduced to determine the reliability of the repairable element. According to the state-transition relations between the states determined by the Markov process, a DBN model is built. In addition, its parameters for series and parallel systems, namely, conditional probability tables, can be calculated by referring to the conditional degradation probabilities. Finally, the power of a control unit in a failure model is used as an example. A dynamic fault tree (DFT) is translated into a Bayesian network model, and subsequently extended to a DBN. The results show the state probabilities of an element and the system without repair, with perfect and imperfect repair, and under CBM, with an absorbing set plotted by differential equations and verified. Through forward inference, the reliability of the control unit is determined under different modes. Lastly, weak nodes are identified in the control unit. PMID:29765629
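
The Markov-with-absorbing-state idea at the core of the method can be sketched for a hypothetical three-state element without repair; the transition probabilities below are invented, and the paper's full model additionally handles repair, CBM and the DBN layer:

```python
import numpy as np

# Hypothetical three-state element: 0 = good, 1 = degraded, 2 = failed.
# P[i, j] is the per-step probability of moving from state i to state j;
# the failed state is absorbing, as for an element without repair.
P = np.array([
    [0.90, 0.08, 0.02],
    [0.00, 0.85, 0.15],
    [0.00, 0.00, 1.00],
])

state = np.array([1.0, 0.0, 0.0])   # the element starts in the good state
for _ in range(10):                 # propagate ten time steps
    state = state @ P

# Reliability = probability of not having been absorbed into failure.
reliability = 1.0 - state[2]
print(round(reliability, 4))
```

Introducing repair would add non-zero transitions out of the degraded and failed states, and the reliability would then be read off the complement of the absorbing set, as in the paper.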

  9. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    NASA Astrophysics Data System (ADS)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  10. Attribution of emotions to body postures: an independent component analysis study of functional connectivity in autism.

    PubMed

    Libero, Lauren E; Stevens, Carl E; Kana, Rajesh K

    2014-10-01

    The ability to interpret others' body language is a vital skill that helps us infer their thoughts and emotions. However, individuals with autism spectrum disorder (ASD) have been found to have difficulty in understanding the meaning of people's body language, perhaps leading to an overarching deficit in processing emotions. The current fMRI study investigates the functional connectivity underlying emotion and action judgment in the context of processing body language in high-functioning adolescents and young adults with autism, using an independent components analysis (ICA) of the fMRI time series. While there were no reliable group differences in brain activity, the ICA revealed significant involvement of occipital and parietal regions in processing body actions; and inferior frontal gyrus, superior medial prefrontal cortex, and occipital cortex in body expressions of emotions. In a between-group analysis, participants with autism, relative to typical controls, demonstrated significantly reduced temporal coherence in left ventral premotor cortex and right superior parietal lobule while processing emotions. Participants with ASD, on the other hand, showed increased temporal coherence in left fusiform gyrus while inferring emotions from body postures. Finally, a positive predictive relationship was found between empathizing ability and the brain areas underlying emotion processing in ASD participants. These results underscore the differential role of frontal and parietal brain regions in processing emotional body language in autism. Copyright © 2014 Wiley Periodicals, Inc.

  11. QFD-ANP Approach for the Conceptual Design of Research Vessels: A Case Study

    NASA Astrophysics Data System (ADS)

    Venkata Subbaiah, Kambagowni; Yeshwanth Sai, Koneru; Suresh, Challa

    2016-10-01

    Conceptual design is a subset of concept art wherein a new idea for a product is created rather than a visual representation that would be used directly in a final product. The purpose is to understand the needs of conceptual design as used in engineering design and to clarify current conceptual design practice. Quality function deployment (QFD) is a customer-oriented design approach for developing new or improved products and services to enhance customer satisfaction. The house of quality (HOQ) has traditionally been used as the planning tool of QFD, translating customer requirements (CRs) into design requirements (DRs). Factor analysis is carried out in order to reduce the CR portion of the HOQ. The analytic hierarchy process is employed to obtain the priority ratings of CRs, which are used in constructing the HOQ. This paper mainly discusses the conceptual design of an oceanographic research vessel using the analytic network process (ANP) technique. Finally, the QFD-ANP integrated methodology helps to establish the importance ratings of DRs.
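
The AHP step, deriving priority ratings of CRs from pairwise comparisons, can be sketched with the common geometric-mean approximation of the principal eigenvector; the comparison values below are hypothetical, not the case study's data:

```python
import numpy as np

# Hypothetical pairwise comparisons of three customer requirements on
# Saaty's 1-9 scale (the matrix is reciprocal by construction):
# CR1 vs CR2 = 3, CR1 vs CR3 = 5, CR2 vs CR3 = 2.
A = np.array([
    [1.0,     3.0, 5.0],
    [1 / 3.0, 1.0, 2.0],
    [1 / 5.0, 0.5, 1.0],
])

# Geometric mean of each row approximates the principal eigenvector;
# normalise so the priority ratings sum to one.
w = np.prod(A, axis=1) ** (1.0 / A.shape[0])
w = w / w.sum()
print(np.round(w, 3))
```

These normalised weights are what would feed the CR column of the HOQ; a full AHP treatment would also check the consistency ratio of the comparison matrix.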

  12. Charm Penguin in B± → K±K+K-: Partonic and hadronic loops

    NASA Astrophysics Data System (ADS)

    Bediaga, I.; Frederico, T.; Magalhães, P. C.

    2018-05-01

    Charm penguin diagrams are known to be the main contribution to charmless B decay processes with strangeness variation equal to minus one, which is the case of the B± → K±K+K- decay. The large phase space available in this and other B three-body decays allows non-trivial final-state interactions with all sorts of rescattering processes, and also gives access to high momentum transfers in the central region of the Dalitz plane. In this work we investigate the charm penguin contribution to B± → K±K+K-, described by a hadronic triangle loop in nonperturbative regions of the phase space, and by a partonic loop in the quasi-perturbative region. These nonresonant amplitudes should have a particular structure in the Dalitz plane, and their contributions to the final decay amplitude can be confirmed by an amplitude analysis of data in this channel. In particular, the hadronic amplitude has a sign change in its phase at the D D-bar threshold, which can result in a change of sign for the CP asymmetry.

  13. Universality of next-to-leading power threshold effects for colourless final states in hadronic collisions

    NASA Astrophysics Data System (ADS)

    Del Duca, V.; Laenen, E.; Magnea, L.; Vernazza, L.; White, C. D.

    2017-11-01

    We consider the production of an arbitrary number of colour-singlet particles near partonic threshold, and show that next-to-leading order cross sections for this class of processes have a simple universal form at next-to-leading power (NLP) in the energy of the emitted gluon radiation. Our analysis relies on a recently derived factorisation formula for NLP threshold effects at amplitude level, and therefore applies both if the leading-order process is tree-level and if it is loop-induced. It holds for differential distributions as well. The results can furthermore be seen as applications of recently derived next-to-soft theorems for gauge theory amplitudes. We use our universal expression to re-derive known results for the production of up to three Higgs bosons at NLO in the large top mass limit, and for the hadro-production of a pair of electroweak gauge bosons. Finally, we present new analytic results for Higgs boson pair production at NLO and NLP, with exact top-mass dependence.

  14. High Voltage Insulation Technology

    NASA Astrophysics Data System (ADS)

    Scherb, V.; Rogalla, K.; Gollor, M.

    2008-09-01

    In preparation for new Electronic Power Conditioners (EPCs) for Travelling Wave Tube Amplifiers (TWTAs) on telecom satellites, a study for the development of a new high voltage insulation technology was performed. The initiative is mandatory to allow compact designs and to enable higher operating voltages. In a first task, a market analysis was performed, comparing different materials with respect to their properties and processes. A hierarchy of selection criteria was established, and finally five material candidates (four epoxy resins and one polyurethane resin) were selected to be further investigated in the test program. Samples for the test program were designed to represent core elements of an EPC: the high voltage transformer and the printed circuit boards of the high voltage section. All five materials were assessed in the practical workflow of the potting process, and electrical, mechanical, thermal and lifetime testing was performed. Although the lifetime test results were overlaid by a large scatter, two candidates were finally identified for use in a subsequent qualification program. This activity forms part of element 5 of the ESA ARTES Programme.

  15. A web-based portfolio model as the students' final assignment: Dealing with the development of higher education trend

    NASA Astrophysics Data System (ADS)

    Utanto, Yuli; Widhanarto, Ghanis Putra; Maretta, Yoris Adi

    2017-03-01

    This study aims to develop a web-based portfolio model. The model was tested in experiments conducted with respondents in the Department of Curriculum and Educational Technology, FIP Unnes. In particular, the research objectives to be achieved through this development research are: (1) describing the process of implementing a portfolio in a web-based model; and (2) assessing the effectiveness of the web-based portfolio model for the final task, especially in Web-Based Learning courses. This is development research; Borg and Gall (2008: 589) say that "educational research and development (R & D) is a process used to develop and validate educational products". The research and development was carried out starting with exploration and conceptual studies, followed by testing and evaluation, and then implementation. For the data analysis, the techniques used were simple descriptive analysis and analysis of learning completeness, followed by prerequisite tests for normality and homogeneity before performing a t-test. Based on the data analysis, it was concluded that: (1) a web-based portfolio model can be applied to the learning process in higher education; and (2) the web-based portfolio model is effective: among the respondents of the large-group (field) trial, 24 people (92.3%) reached mastery learning (a score of 60 and above). The conclusion of this study is that a web-based portfolio model is effective. As an implication of this development research, future researchers are expected to use the development guidelines from this research to develop models for other subjects.

  16. Rapid Modeling and Analysis Tools: Evolution, Status, Needs and Directions

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Stone, Thomas J.; Ransom, Jonathan B. (Technical Monitor)

    2002-01-01

    Advanced aerospace systems are becoming increasingly more complex, and customers are demanding lower cost, higher performance, and high reliability. Increased demands are placed on the design engineers to collaborate and integrate design needs and objectives early in the design process to minimize risks that may occur later in the design development stage. High performance systems require better understanding of system sensitivities much earlier in the design process to meet these goals. The knowledge, skills, intuition, and experience of an individual design engineer will need to be extended significantly for the next generation of aerospace system designs. Then a collaborative effort involving the designer, rapid and reliable analysis tools and virtual experts will result in advanced aerospace systems that are safe, reliable, and efficient. This paper discusses the evolution, status, needs and directions for rapid modeling and analysis tools for structural analysis. First, the evolution of computerized design and analysis tools is briefly described. Next, the status of representative design and analysis tools is described along with a brief statement on their functionality. Then technology advancements to achieve rapid modeling and analysis are identified. Finally, potential future directions including possible prototype configurations are proposed.

  17. Double Fourier analysis for Emotion Identification in Voiced Speech

    NASA Astrophysics Data System (ADS)

    Sierra-Sosa, D.; Bastidas, M.; Ortiz P., D.; Quintero, O. L.

    2016-04-01

    We propose a novel analysis alternative for emotion recognition from speech, based on two Fourier transforms. Fourier analysis allows different signals to be displayed and synthesized in terms of power spectral density distributions. A spectrogram of the voice signal is obtained by performing a short-time Fourier transform with Gaussian windows; this spectrogram portrays frequency-related features, such as vocal tract resonances and quasi-periodic excitations during voiced sounds. Emotions induce such characteristics in speech, which become apparent in the spectrogram's time-frequency distribution. The time-frequency representation from the spectrogram is then treated as an image and processed through a 2-dimensional Fourier transform in order to perform spatial Fourier analysis on it. Finally, features related to emotions in voiced speech are extracted and presented.
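    The two-transform pipeline (a Gaussian-windowed STFT, then a 2-D transform of the resulting time-frequency image) can be sketched with naive DFTs and no external libraries; the window length, hop, and sigma below are illustrative choices, not the paper's parameters:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform (O(N^2)); fine for a sketch."""
    n_pts = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / n_pts)
                for n in range(n_pts)) for k in range(n_pts)]

def gaussian_window(length, sigma):
    center = (length - 1) / 2.0
    return [math.exp(-0.5 * ((n - center) / sigma) ** 2) for n in range(length)]

def spectrogram(signal, window):
    """Magnitude STFT over non-overlapping Gaussian-windowed frames."""
    step = len(window)
    frames = [signal[i:i + step] for i in range(0, len(signal) - step + 1, step)]
    return [[abs(c) for c in dft([s * w for s, w in zip(frame, window)])]
            for frame in frames]

def dft2(image):
    """2-D DFT: transform the rows, then the columns (the second Fourier pass)."""
    rows = [dft(row) for row in image]
    return [list(col) for col in zip(*[dft(list(c)) for c in zip(*rows)])]

# Toy stand-in for a voiced-speech segment: a 32-sample sinusoid.
sig = [math.sin(2 * math.pi * 3 * t / 32) for t in range(32)]
spec = spectrogram(sig, gaussian_window(8, 2.0))  # time-frequency image
spatial = dft2(spec)                              # spatial-frequency features
print(len(spec), len(spec[0]))  # -> 4 8
```

    In practice one would use FFT-based routines (e.g. numpy.fft) rather than these O(N^2) loops; the structure of the computation is the same.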

  18. Estimating costs in the economic evaluation of medical technologies.

    PubMed

    Luce, B R; Elixhauser, A

    1990-01-01

    The complexities and nuances of evaluating the costs associated with providing medical technologies are often underestimated by analysts engaged in economic evaluations. This article describes the theoretical underpinnings of cost estimation, emphasizing the importance of accounting for opportunity costs and marginal costs. The various types of costs that should be considered in an analysis are described; a listing of specific cost elements may provide a helpful guide to analysis. The process of identifying and estimating costs is detailed, and practical recommendations for handling the challenges of cost estimation are provided. The roles of sensitivity analysis and discounting are characterized, as are determinants of the types of costs to include in an analysis. Finally, common problems facing the analyst are enumerated with suggestions for managing these problems.

  19. Patient assessment within the context of healthcare delivery packages: A comparative analysis.

    PubMed

    Rossen, Camilla Blach; Buus, Niels; Stenager, Egon; Stenager, Elsebeth

    2016-01-01

    Due to an increased focus on productivity and cost-effectiveness, many countries across the world have implemented a variety of tools for standardizing diagnostics and treatment. In Denmark, healthcare delivery packages are increasingly used for assessment of patients. A package is a tool for creating coordination, continuity and efficient pathways; each step is pre-booked, and the package has a well-defined content within a predefined category of diseases. The aim of this study was to investigate how assessment processes took place within the context of healthcare delivery packages. The study used a constructivist Grounded Theory approach. Ethnographic fieldwork was carried out in three specialized units: a mental health unit and two multiple sclerosis clinics in Southern Denmark, which all used assessment packages. Several types of data were sampled through theoretical sampling. Participant observation was conducted for a total of 126 hours. Formal and informal interviews were conducted with 12 healthcare professionals and 13 patients. Furthermore, audio recordings were made of 9 final consultations between physicians and patients, totalling 193 minutes of recorded consultations. Lastly, the medical records of 13 patients and written information about packages were collected. The comparative, abductive analysis focused on the process of assessment and the work done by all the actors involved. In this paper, we emphasized the work of healthcare professionals. We constructed five interrelated categories: 1. "Standardized assessing", 2. "Flexibility", which has two sub-categories, 2.1. "Diagnostic options" and 2.2. "Time and organization", and, finally, 3. "Resisting the frames". The process of assessment required all participants to perform the predefined work in the specified way at the specified time. Multidisciplinary teamwork was essential for the success of the process.
    The local organization of the packages influenced the assessment process, most notably the pre-defined scope of relevant diseases targeted by the package. The inflexible frames of the assessment package could cause resistance among clinicians. Moreover, expert knowledge was an important factor for the efficiency of the process. Some types of organizational work processes resulted in many patients being assessed, but without being diagnosed with a package-relevant disease. Limiting the grounds for using specialist knowledge in structured health care delivery may affect specialists' sense of professional autonomy and can result in professionals employing strategies to resist the frames of the packages. Finally, when organizing healthcare delivery packages, it seems important to consider how to make optimal use of specialist knowledge. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Techniques of Final Preseal Visual Inspection

    NASA Technical Reports Server (NTRS)

    Anstead, R. J.

    1975-01-01

    A dissertation is given on the final preseal visual inspection of microcircuit devices to detect manufacturing defects and reduce failure rates in service. The processes employed in fabricating monolithic integrated circuits and hybrid microcircuits, various failure mechanisms resulting from deficiencies in those processes, and the rudiments of performing final inspection are outlined.

  1. Change in oligosaccharides during processing of soybean sheet.

    PubMed

    Wang, Qiushuang; Ke, Leqin; Yang, Dongmei; Bao, Bili; Jiang, Jianmei; Ying, Tiejin

    2007-01-01

    Oligosaccharides have been credited with many health-promoting functions identified in clinical studies, such as promoting the growth of Bifidobacterium in the human intestine and the balance of intestinal bacteria, modulating the immune response, inhibiting cancer and tumors, and stimulating mineral absorption. In this study, the effect of processing unit operations on the levels of soybean oligosaccharides during production of soybean sheet was investigated. The concentrations of oligosaccharides in the initial raw soybean were: sucrose 43.05 g/kg, raffinose 7.52 g/kg and stachyose 41.32 g/kg (in dry matter). Oligosaccharide losses in the soaking water, in the first filtrating stage, in the second filtrating stage and finally in the sheet formation stage were 0.68, 10.3, 8.15 and 47.22 g/kg (initial dry soybean) respectively, representing 0.74, 11.21, 8.87 and 51.39% of the total oligosaccharides present in the initial soybeans. The recovery of oligosaccharides in the final soybean sheet from the initial soybean was 27.92%. The loss of soybean oligosaccharides in the different processing stages, especially in the by-product, the sweet slurry, was considerable, and was mainly associated with water/matter removal in the production process. The analysis of the loss profile suggested possible ways to improve the technology for production of oligosaccharide-enriched soy sheets.

  2. Confocal Analysis of Nuclear Lamina Behavior during Male Meiosis and Spermatogenesis in Drosophila melanogaster.

    PubMed

    Fabbretti, Fabiana; Iannetti, Ilaria; Guglielmi, Loredana; Perconti, Susanna; Evangelistella, Chiara; Proietti De Santis, Luca; Bongiorni, Silvia; Prantera, Giorgio

    2016-01-01

    Lamin family proteins are structural components of a filamentous framework, the nuclear lamina (NL), underlying the inner membrane of the nuclear envelope. The NL not only provides mechanical support and shaping for the nucleus, but is also involved in many cellular processes including DNA replication, gene expression and chromatin positioning. Spermatogenesis is a very complex differentiation process in which each stage is characterized by dramatic changes in nuclear architecture, from the early mitotic stage to the final stage of sperm differentiation. Nevertheless, very few data are present in the literature on the behavior of the NL during this process. Here we show the first complete description of NL behavior during meiosis and spermatogenesis in Drosophila melanogaster. By confocal imaging, we characterized the NL modifications from the mitotic stages, through the meiotic divisions, to sperm differentiation with an anti-laminDm0 antibody against the major component of the Drosophila NL. We observed that continuous changes in the NL structure occurred in parallel with chromatin reorganization throughout the whole process and that the meiotic divisions occurred in a closed context. Finally, we analyzed the NL in the solofuso meiotic mutant, where chromatin segregation is severely affected, and found a strict correlation between the presence of chromatin and that of the NL.

  3. Confocal Analysis of Nuclear Lamina Behavior during Male Meiosis and Spermatogenesis in Drosophila melanogaster

    PubMed Central

    Fabbretti, Fabiana; Iannetti, Ilaria; Guglielmi, Loredana; Perconti, Susanna; Evangelistella, Chiara; Proietti De Santis, Luca; Bongiorni, Silvia; Prantera, Giorgio

    2016-01-01

    Lamin family proteins are structural components of a filamentous framework, the nuclear lamina (NL), underlying the inner membrane of the nuclear envelope. The NL not only provides mechanical support and shaping for the nucleus, but is also involved in many cellular processes including DNA replication, gene expression and chromatin positioning. Spermatogenesis is a very complex differentiation process in which each stage is characterized by dramatic changes in nuclear architecture, from the early mitotic stage to the final stage of sperm differentiation. Nevertheless, very few data are present in the literature on the behavior of the NL during this process. Here we show the first complete description of NL behavior during meiosis and spermatogenesis in Drosophila melanogaster. By confocal imaging, we characterized the NL modifications from the mitotic stages, through the meiotic divisions, to sperm differentiation with an anti-laminDm0 antibody against the major component of the Drosophila NL. We observed that continuous changes in the NL structure occurred in parallel with chromatin reorganization throughout the whole process and that the meiotic divisions occurred in a closed context. Finally, we analyzed the NL in the solofuso meiotic mutant, where chromatin segregation is severely affected, and found a strict correlation between the presence of chromatin and that of the NL. PMID:26963718

  4. Adoption Space and the Idea-to-Market Process of Health Technologies.

    PubMed

    Saranummi, Niilo; Beuscart, Regis; Black, Norman; Maglaveras, Nicos; Strano, Chiara; Karavidopoulou, Youla

    2016-01-01

    Although Europe 'produces' excellent science, it has not been equally successful in translating scientific results into commercially successful companies, in spite of the European and national efforts invested in supporting the translation process. The Idea-to-Market process is highly complex due to the large number of actors and stakeholders. ITECH was launched to propose recommendations that would accelerate the Idea-to-Market process of health technologies, leading to improvements in the competitiveness of the European health technology industry in the global markets. The project went through the following steps: defining the Idea-to-Market process model; collection and analysis of funding opportunities; identification of 12 gaps and barriers in the Idea-to-Market process; a detailed analysis of these, supported by interviews; a prioritization process to select the most important issues; construction of roadmaps for the prioritized issues; and finally generating recommendations and associated action plans. Seven issues were classified as in need of actions. Three of these are part of the ongoing Medical Device Directive Reform (MDR), namely health technology assessment, post-market surveillance and the regulatory process, and are therefore not within the scope of ITECH. Recommendations were made for eHealth taxonomy; education and training; clinical trials; and adoption space and human factors engineering (HFE).

  5. Design and Production of the Injection Mould with a Cax Assistance

    NASA Astrophysics Data System (ADS)

    Likavčan, Lukáš; Frnčík, Martin; Zaujec, Rudolf; Satin, Lukáš; Martinkovič, Maroš

    2016-09-01

    This paper is focused on the process of designing the desired plastic component and injection mould using 3D CAD systems. A subsequent FEM analysis of the injection moulding process was carried out in a CAE system in order to determine the shrinkage and deformation of the plastic material. The dimensions of the mould were then modified to compensate for the shrinkage effect. The machining processes (milling and laser texturing) of the mould were performed using CAM systems. Finally, after production of the plastic components by injection moulding, the dimensions of the plastic component were inspected using CAQ in order to assess the accuracy of the whole CAx chain. It was also demonstrated that CAx systems are an integral part of the pre-production and production process.

  6. Fault detection of Tennessee Eastman process based on topological features and SVM

    NASA Astrophysics Data System (ADS)

    Zhao, Huiyang; Hu, Yanzhu; Ai, Xinbo; Hu, Yu; Meng, Zhen

    2018-03-01

    Fault detection in industrial processes is a popular research topic. Although the distributed control system (DCS) has been introduced to monitor the state of industrial processes, it still cannot satisfy all the requirements for fault detection in all industrial systems. In this paper, we propose a novel method for fault detection in industrial processes, based on topological features and a support vector machine (SVM). The proposed method takes global information about the measured variables into account via a complex network model and predicts with an SVM whether a system has generated faults. The method can be divided into four steps: network construction, network analysis, model training and model testing. Finally, we apply the model to the Tennessee Eastman process (TEP). The results show that the method works well and can be a useful supplement for fault detection in industrial processes.
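    The first two steps - building a network over the measured variables and extracting topological features - can be sketched as follows; the correlation threshold and data are illustrative, and in the training and testing steps the resulting feature vectors would be fed to an SVM classifier (e.g. scikit-learn's SVC):

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy) if sx and sy else 0.0

def build_network(samples, threshold=0.8):
    """Step 1: link variables i--j whose |correlation| meets the threshold."""
    cols = list(zip(*samples))
    n = len(cols)
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(pearson(cols[i], cols[j])) >= threshold}

def degree_features(edges, n_vars):
    """Step 2: per-variable degree, a simple topological feature vector."""
    deg = [0] * n_vars
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return deg

# Hypothetical window of four samples over three measured variables.
samples = [[1, 2, 10], [2, 4, 9], [3, 6, 11], [4, 8, 10]]
edges = build_network(samples, threshold=0.9)
print(sorted(edges), degree_features(edges, 3))  # -> [(0, 1)] [1, 1, 0]
```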

  7. Research and development of low cost processes for integrated solar arrays. Final report, April 15, 1974--January 14, 1976

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graham, C.D.; Kulkarni, S.; Louis, E.

    1976-05-01

    Results of a program to study process routes leading to low cost, large area integrated silicon solar array manufacture for terrestrial applications are reported. Potential processes for the production of solar-grade silicon are evaluated from thermodynamic, economic, and technical feasibility points of view. Upgrading of the present arc-furnace process is found most favorable. Experimental studies of the Si/SiF4 transport and purification process show considerable impurity removal and reasonable transport rates. Silicon deformation experiments indicate production of silicon sheet by rolling at 1350 °C is feasible. Significant recrystallization by the strain-anneal technique has been observed. Experimental recrystallization studies using an electron beam line source are discussed. A maximum recrystallization velocity of approximately 9 m/hr is calculated for silicon sheet. A comparative process rating technique based on detailed cost analysis is presented.

  8. Harvesting Social Signals to Inform Peace Processes Implementation and Monitoring

    PubMed Central

    Nigam, Aastha; Dambanemuya, Henry K.; Joshi, Madhav; Chawla, Nitesh V.

    2017-01-01

    Peace processes are complex, protracted, and contentious, involving significant bargaining and compromising among various societal and political stakeholders. In civil war terminations, it is pertinent to measure the pulse of the nation to ensure that the peace process is responsive to citizens' concerns. Social media yields tremendous power as a tool for dialogue, debate, organization, and mobilization, thereby adding more complexity to the peace process. Using Colombia's final peace agreement and national referendum as a case study, we investigate the influence of two important indicators: intergroup polarization and public sentiment toward the peace process. We present a detailed linguistic analysis to detect intergroup polarization and a predictive model that leverages Tweet structure, content, and user-based features to predict public sentiment toward the Colombian peace process. We demonstrate that had pro-accord stakeholders leveraged public opinion from social media, the outcome of the Colombian referendum could have been different. PMID:29235916

  9. Research on the injectors remanufacturing

    NASA Astrophysics Data System (ADS)

    Daraba, D.; Alexandrescu, I. M.; Daraba, C.

    2017-05-01

    During the remanufacturing process, the injector body - after disassembly and cleaning - should be subjected to strict control processes, both visually and with an electronic microscope, to reveal any defects that may occur on the sealing surfaces of the injector body and the atomizer. In this paper we present the path followed by an injector body in the remanufacturing process, exemplifying the method of verifying the roughness and hardness of the sealing surfaces, as well as the microscopic analysis of the sealing surface areas around the inlet. These checks can indicate which path the injector body has to follow during remanufacturing. The control methodology for the injector body, established on the basis of this research, helps prevent defective injector bodies from entering the remanufacturing process, thus minimizing the number of remanufactured injectors declared non-conforming after the final verification.

  10. Harvesting Social Signals to Inform Peace Processes Implementation and Monitoring.

    PubMed

    Nigam, Aastha; Dambanemuya, Henry K; Joshi, Madhav; Chawla, Nitesh V

    2017-12-01

    Peace processes are complex, protracted, and contentious, involving significant bargaining and compromising among various societal and political stakeholders. In civil war terminations, it is pertinent to measure the pulse of the nation to ensure that the peace process is responsive to citizens' concerns. Social media yields tremendous power as a tool for dialogue, debate, organization, and mobilization, thereby adding more complexity to the peace process. Using Colombia's final peace agreement and national referendum as a case study, we investigate the influence of two important indicators: intergroup polarization and public sentiment toward the peace process. We present a detailed linguistic analysis to detect intergroup polarization and a predictive model that leverages Tweet structure, content, and user-based features to predict public sentiment toward the Colombian peace process. We demonstrate that had pro-accord stakeholders leveraged public opinion from social media, the outcome of the Colombian referendum could have been different.
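    A toy illustration of the kind of structural and content-based Tweet features described; the lexicons and the decision rule below are placeholders, not the paper's trained predictive model:

```python
def tweet_features(text, pos_words, neg_words):
    """Structural and content features of a single Tweet."""
    tokens = text.lower().split()
    return {
        "length": len(tokens),
        "hashtags": sum(t.startswith("#") for t in tokens),
        "mentions": sum(t.startswith("@") for t in tokens),
        "pos_hits": sum(t.strip("#@") in pos_words for t in tokens),
        "neg_hits": sum(t.strip("#@") in neg_words for t in tokens),
    }

def naive_sentiment(feats):
    """Placeholder decision rule standing in for a trained classifier."""
    if feats["pos_hits"] > feats["neg_hits"]:
        return "pro"
    if feats["neg_hits"] > feats["pos_hits"]:
        return "anti"
    return "neutral"

feats = tweet_features("vote #yes for peace", {"peace", "yes"}, {"war", "no"})
print(feats["hashtags"], naive_sentiment(feats))  # -> 1 pro
```

    In the actual study, such feature vectors (together with user-based features) would train a supervised model; the rule above only shows where the features plug in.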

  11. Re-engineering the mission life cycle with ABC and IDEF

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Rackley, Michael; Karlin, Jay

    1994-01-01

    The theory behind re-engineering a business process is to remove the non-value-added activities, thereby lowering the process cost. In order to achieve this, one must be able to identify where the non-value-added elements are located, which is not a trivial task, because they are often hidden in the form of overhead and/or pooled resources. In order to isolate these non-value-added processes from the others, one must first decompose the overall top-level process into lower layers of sub-processes. In addition, costing data must be assigned to each sub-process, along with the value the sub-process adds towards the final product. IDEF0 is a Federal Information Processing Standard (FIPS) process-modeling tool that allows for this functional decomposition through structured analysis. In addition, it illustrates the relationship between the process and the value added to the product or service. The value-added portion is further defined in IDEF1X, which is an entity-relationship diagramming tool. The entity-relationship model is the blueprint of the product as it moves along the 'assembly line' and therefore relates all of the parts to each other and to the final product. It also relates the parts to the tools that produce the product and to all of the paperwork used in their acquisition. The use of IDEF therefore facilitates the use of Activity Based Costing (ABC). ABC is an essential method in a high-variety, product-customizing environment for facilitating rapid response to externally caused change. This paper describes the work being done in the Mission Operations Division to re-engineer the development and operations life cycle of Mission Operations Centers using these tools.
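    Once the IDEF0 decomposition assigns each sub-process a cost and a value-added flag, the ABC roll-up is a simple sum; a minimal sketch with hypothetical activities and figures:

```python
def activity_based_cost(activities):
    """activities: (name, cost, value_added) triples from a process decomposition.
    Returns (total process cost, cost of non-value-added activities)."""
    total = sum(cost for _, cost, _ in activities)
    waste = sum(cost for _, cost, va in activities if not va)
    return total, waste

# Hypothetical sub-processes of a development life cycle (costs in arbitrary units).
activities = [
    ("requirements analysis", 120.0, True),
    ("software build",        300.0, True),
    ("status reporting",       45.0, False),  # candidate for re-engineering
    ("rework loop",            80.0, False),
]
total, waste = activity_based_cost(activities)
print(total, waste)  # -> 545.0 125.0
```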

  12. Ecological studies on the revegetation process of surface coal mined areas in North Dakota. 9. Viability and diversity of the seed bank. Final report Aug 75-Jun 82

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iverson, L.R.; Brophy, L.

    1982-06-01

    Analysis of seed numbers present in topsoils indicated that seeds of the most prevalent colonizers (e.g. Kochia scoparia, Setaria viridis, and Salsola collina) were not present in the topsoil upon respreading but rather appeared by immigration from the surrounding areas. Seed bank analysis was also undertaken on mined sites ranging in age from 2 to 6 years. As in the previous part of this study, there was a poor correlation between the aboveground flora and the belowground seed composition.

  13. Preparation and Microcosmic Structural Analysis of Recording Coating on Inkjet Printing Media

    PubMed Central

    Jiang, Bo; Liu, Weiyan; Bai, Yongping; Huang, Yudong; Liu, Li; Han, Jianping

    2011-01-01

    The preparation of a recording coating on inkjet printing (RC-IJP) media was proposed. The microstructure and roughness of the RC-IJP were analyzed by scanning electron microscopy (SEM) and atomic force microscopy (AFM). The surface infiltration process of the RC-IJP was studied with a liquid infiltration instrument. The distribution of C, O and Si on the recording coating surface was analyzed by energy dispersive spectroscopy (EDS). Transmission electron microscopy (TEM) analysis showed that the nanoscale silica could be dispersed uniformly in water. Finally, the print color is shown clearly by the prepared recording coating. PMID:21954368

  14. Opinion Dynamics and Decision of Vote in Bipolar Political Systems

    NASA Astrophysics Data System (ADS)

    Caruso, Filippo; Castorina, Paolo

    A model of the opinion dynamics underlying the political decision is proposed. The analysis is restricted to a bipolar scheme with a possible third political area. The interaction among voters is local but the final decision strongly depends on global effects such as the rating of the governments. As in the realistic case, the individual decision making process is determined by the most relevant personal interests and problems. The phenomenological analysis of the national vote in Italy and Germany has been carried out and a prediction of the next Italian vote as a function of the government rating is presented.

  15. Quantitative analysis of peel-off degree for printed electronics

    NASA Astrophysics Data System (ADS)

    Park, Janghoon; Lee, Jongsu; Sung, Ki-Hak; Shin, Kee-Hyun; Kang, Hyunkyoo

    2018-02-01

    We suggest a facile methodology for evaluating the peel-off degree of printed electronics by image processing. The quantification of peeled and printed areas was performed using open-source programs. To verify the accuracy of the method, we manually removed areas from the printed circuit and measured them, obtaining 96.3% accuracy. The sintered patterns showed a decreasing tendency as the energy density of the infrared lamp increased, while the peel-off degree increased; a comparison between the two results is presented. Finally, the correlation between performance characteristics was determined by quantitative analysis.
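    The peel-off quantification reduces to a pixel-area ratio over a binarized image; a minimal sketch, where the mask is a hypothetical binarized scan (1 = printed pixel intact, 0 = peeled off):

```python
def peel_off_degree(mask):
    """Percentage of pixels peeled off in a binarized image of the pattern."""
    total = sum(len(row) for row in mask)
    peeled = sum(row.count(0) for row in mask)
    return 100.0 * peeled / total

# Hypothetical 3x4 binarized scan of a printed pattern.
mask = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 1, 1, 1],
]
print(peel_off_degree(mask))  # -> 25.0
```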

  16. Thickness optimization of auricular silicone scaffold based on finite element analysis.

    PubMed

    Jiang, Tao; Shang, Jianzhong; Tang, Li; Wang, Zhuo

    2016-01-01

    An optimized thickness for a transplantable auricular silicone scaffold was investigated. The original image data were acquired from CT scans, and reverse modeling technology was used to build a digital 3D model of an auricle. The transplant process was simulated in ANSYS Workbench by finite element analysis (FEA), solid scaffolds were manufactured based on the FEA results, and a transplantable artificial auricle was finally obtained with an optimized thickness, as well as sufficient strength and hardness. This paper provides a reference for clinical transplant surgery. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Rotorcraft Conceptual Design Environment

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne; Sinsay, Jeffrey

    2009-01-01

    Requirements for a rotorcraft conceptual design environment are discussed, from the perspective of a government laboratory. Rotorcraft design work in a government laboratory must support research, by producing technology impact assessments and defining the context for research and development; and must support the acquisition process, including capability assessments and quantitative evaluation of designs, concepts, and alternatives. An information manager that will enable increased fidelity of analysis early in the design effort is described. This manager will be a framework to organize information that describes the aircraft, and enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.

  18. Rotorcraft Conceptual Design Environment

    NASA Technical Reports Server (NTRS)

    Johnson, Wayne; Sinsay, Jeffrey D.

    2010-01-01

    Requirements for a rotorcraft conceptual design environment are discussed, from the perspective of a government laboratory. Rotorcraft design work in a government laboratory must support research, by producing technology impact assessments and defining the context for research and development; and must support the acquisition process, including capability assessments and quantitative evaluation of designs, concepts, and alternatives. An information manager that will enable increased fidelity of analysis early in the design effort is described. This manager will be a framework to organize information that describes the aircraft, and enable movement of that information to and from analyses. Finally, a recently developed rotorcraft system analysis tool is described.

  19. Boundary element analysis of post-tensioned slabs

    NASA Astrophysics Data System (ADS)

    Rashed, Youssef F.

    2015-06-01

    In this paper, the boundary element method is applied to carry out the structural analysis of post-tensioned flat slabs. The shear-deformable plate-bending model is employed. The effect of the pre-stressing cables is taken into account via the equivalent load method. The formulation is automated using a computer program, which uses quadratic boundary elements. Verification samples are presented, and finally a practical application is analyzed where results are compared against those obtained from the finite element method. The proposed method is efficient in terms of computer storage and processing time as well as ease of data input and modification.

  20. System data communication structures for active-control transport aircraft, volume 1

    NASA Technical Reports Server (NTRS)

    Hopkins, A. L.; Martin, J. H.; Brock, L. D.; Jansson, D. G.; Serben, S.; Smith, T. B.; Hanley, L. D.

    1981-01-01

    Candidate data communication techniques are identified, including dedicated links, local buses, broadcast buses, multiplex buses, and mesh networks. The design methodology for mesh networks is then discussed, including network topology and node architecture. Several concepts of power distribution are reviewed, including current limiting and mesh networks for power. The technology issues of packaging, transmission media, and lightning are addressed, and, finally, the analysis tools developed to aid in the communication design process are described. There are special tools to analyze the reliability and connectivity of networks and more general reliability analysis tools for all types of systems.
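    The connectivity analysis such tools perform can be sketched as a breadth-first-search reachability check over the surviving mesh; the topology and failure set below are illustrative:

```python
from collections import deque

def connected_after_failures(nodes, links, failed):
    """BFS reachability among the surviving nodes/links of a mesh network."""
    alive = set(nodes) - set(failed)
    if not alive:
        return False
    adj = {n: set() for n in alive}
    for a, b in links:
        if a in alive and b in alive:
            adj[a].add(b)
            adj[b].add(a)
    start = next(iter(alive))
    seen = {start}
    queue = deque([start])
    while queue:
        for neighbor in adj[queue.popleft()]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen == alive  # True iff every surviving node is still reachable

# A four-node ring survives any single node failure; a four-node chain does not.
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(connected_after_failures([0, 1, 2, 3], ring, failed=[1]))  # -> True
```

    Reliability analysis would wrap a check like this in an enumeration (or sampling) of failure combinations weighted by component failure probabilities.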

  1. Conducting financial due diligence of medical practices.

    PubMed

    Louiselle, P

    1995-12-01

    Many healthcare organizations are acquiring medical practices in an effort to build more integrated systems of healthcare products and services. This acquisition activity must be approached cautiously to ensure that medical practices being acquired do not have deficiencies that would jeopardize integration efforts. Conducting a thorough due diligence analysis of medical practices before finalizing the transaction can limit the acquiring organization's legal and financial exposure and is a necessary component of the acquisition process. The author discusses the components of a successful financial due diligence analysis and addresses some of the risk factors in a practice acquisition.

  2. A Variational Assimilation Method for Satellite and Conventional Data: Development of Basic Model for Diagnosis of Cyclone Systems

    NASA Technical Reports Server (NTRS)

    Achtemeier, Gary L.; Scott, Robert W.; Chen, J.

    1991-01-01

    A summary is presented of the progress toward the completion of a comprehensive diagnostic objective analysis system based upon the calculus of variations. The approach was to first develop the objective analysis subject to the constraints that the final product satisfies the five basic primitive equations for a dry inviscid atmosphere: the two nonlinear horizontal momentum equations, the continuity equation, the hydrostatic equation, and the thermodynamic equation. Then, having derived the basic model, there would be added to it the equations for moist atmospheric processes and the radiative transfer equation.

  3. Anatomy of a Security Operations Center

    NASA Technical Reports Server (NTRS)

    Wang, John

    2010-01-01

    Many agencies and corporations are either contemplating or in the process of building a cyber Security Operations Center (SOC). Those agencies that have established SOCs are most likely working on major revisions or enhancements to existing capabilities. As principal developers of the NASA SOC, the presenters' goal is to provide the GFIRST community with examples of some of the key building blocks of an Agency-scale cyber Security Operations Center. This presentation will cover the inputs and outputs, the facilities or shell, as well as the internal components and the processes necessary to sustain the SOC - in other words, the anatomy of a SOC. Details to be presented include the SOC architecture and its key components: Tier 1 Call Center, data entry, and incident triage; Tier 2 monitoring, incident handling and tracking; Tier 3 computer forensics, malware analysis, and reverse engineering; the Incident Management System; the Threat Management System; the SOC Portal; log aggregation and Security Incident Management (SIM) systems; flow monitoring; IDS; etc. Specific processes and methodologies discussed include Incident States and associated Work Elements; the Incident Management Workflow Process; the Cyber Threat Risk Assessment methodology; and the Incident Taxonomy. The evolution of the cyber Security Operations Center will be discussed, from reactive to proactive. Finally, the resources necessary to establish an Agency-scale SOC, as well as the lessons learned in the process of standing one up, will be presented.

  4. Activating clinical trials: a process improvement approach.

    PubMed

    Martinez, Diego A; Tsalatsanis, Athanasios; Yalcin, Ali; Zayas-Castro, José L; Djulbegovic, Benjamin

    2016-02-24

    The administrative process associated with clinical trial activation has been criticized as costly, complex, and time-consuming. Prior research has concentrated on identifying administrative barriers and proposing various solutions to reduce activation time and, consequently, the associated costs. Here, we expand on previous research by incorporating social network analysis and discrete-event simulation to support process improvement decision-making. We searched for all operational data associated with the administrative process of activating industry-sponsored clinical trials at the Office of Clinical Research of the University of South Florida in Tampa, Florida. We limited the search to those trials initiated and activated between July 2011 and June 2012. We described the process using value stream mapping, studied the interactions of the various process participants using social network analysis, and modeled potential process modifications using discrete-event simulation. The administrative process comprised 5 sub-processes, 30 activities, 11 decision points, 5 loops, and 8 participants. The mean activation time was 76.6 days. Rate-limiting sub-processes were those of contract and budget development. Key participants during contract and budget development were the Office of Clinical Research, sponsors, and the principal investigator. Simulation results indicate that slight increases in the number of trials arriving at the Office of Clinical Research would increase activation time by 11%. Also, increasing the efficiency of contract and budget development would reduce activation time by 28%. Finally, better synchronization between contract and budget development would reduce time spent on batching documentation; however, no improvements would be attained in total activation time. The presented process improvement analytic framework not only identifies administrative barriers, but also helps to devise and evaluate potential improvement scenarios.
The strength of our framework lies in its system analysis approach that recognizes the stochastic duration of the activation process and the interdependence between process activities and entities.
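The discrete-event simulation experiments described in this record can be illustrated with a short Monte Carlo sketch: each sub-process draws a random duration, and contract and budget development run in parallel to a synchronization point before activation. The sub-process list and all triangular-distribution parameters below are invented for illustration; they are not the study's data.

```python
import random
import statistics

def simulate_activation(n_trials=10000, budget_speedup=1.0, seed=42):
    """Monte Carlo sketch of a trial-activation pipeline (days).

    budget_speedup > 1 models a hypothetical efficiency gain in the
    rate-limiting contract and budget sub-processes.
    """
    rng = random.Random(seed)
    times = []
    for _ in range(n_trials):
        intake = rng.triangular(2, 10, 5)
        review = rng.triangular(5, 20, 10)
        # contract and budget development run in parallel and must
        # both finish before activation (a synchronization point)
        contract = rng.triangular(15, 60, 35) / budget_speedup
        budget = rng.triangular(10, 50, 30) / budget_speedup
        closeout = rng.triangular(3, 15, 8)
        times.append(intake + review + max(contract, budget) + closeout)
    return statistics.mean(times)

baseline = simulate_activation()
improved = simulate_activation(budget_speedup=1.4)  # 40% faster contracting
print(f"baseline mean: {baseline:.1f} days, improved: {improved:.1f} days")
```

Scaling both rate-limiting sub-processes by a hypothetical efficiency factor shows the qualitative effect the abstract reports: total activation time falls when contracting and budgeting speed up.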

  5. A 3-year hygiene and safety monitoring of a meat processing plant which uses raw materials of global origin.

    PubMed

    Manios, Stavros G; Grivokostopoulos, Nikolaos C; Bikouli, Vasiliki C; Doultsos, Dimitrios A; Zilelidou, Evangelia A; Gialitaki, Maria A; Skandamis, Panagiotis N

    2015-09-16

    A systematic approach in monitoring the hygiene of a meat processing plant using classical microbiological analyses combined with molecular characterization tools may assist in the safety of the final products. This study aimed: (i) to evaluate the total hygiene level and, (ii) to monitor and characterize the occurrence and spread of Salmonella spp. and Listeria monocytogenes in the environment and the final products of a meat industry that processes meat of global origin. In total, 2541 samples from the processing environment, the raw materials, and the final products were collected from a Greek meat industry in the period 2011-2013. All samples were subjected to enumeration of total viable counts (TVC), Escherichia coli (EC) and total coliforms (TCC) and the detection of Salmonella spp., while 709 of these samples were also analyzed for the presence of L. monocytogenes. Pathogen isolates were serotyped, characterized for their antibiotic resistance, and subtyped by PFGE. Raw materials were identified as the primary source of contamination, while improper handling might have also favored the proliferation of the initial microbial load. The occurrence of Salmonella spp. and L. monocytogenes reached 5.5% and 26.9%, respectively. Various (apparent) cross-contamination or persistence trends were deduced based on PFGE analysis results. Salmonella isolates showed wide variation in their innate antibiotic resistance, in contrast to the L. monocytogenes isolates, which were susceptible to all antibiotics except cefotaxime. The results emphasize the biodiversity of foodborne pathogens in a meat industry and may be used by meat processors to understand the spread of pathogens in the processing environment, as well as to assist the Food Business Operator (FBO) in establishing effective criteria for selection of raw materials and in improving meat safety and quality.
This approach can limit the increase of microbial contamination during the processing steps observed in our study as well as the cross contamination of meat products. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. The importance of ion fluxes for cancer proliferation and metastasis: A thermodynamic analysis.

    PubMed

    Lucia, Umberto; Deisboeck, Thomas S

    2018-05-14

    Following a thermodynamic approach, we develop a new theoretical analysis of ion transfer across cell membranes. Supported also by experimental data from the literature, we highlight that ion channels determine the typical features of cancer cells, i.e. independence from growth-regulatory signals, avoidance of apoptosis, indefinite proliferative potential, and the capability of inducing angiogenesis. Specifically, we analyse how ion transport, with particular regard to Ca2+ fluxes, modulates cancer cell proliferation and regulates cell cycle checkpoints. Finally, our analysis also suggests that in malignant tumours aerobic glycolysis is the more efficient metabolic process when taking the required solvent capacity into account. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. Preliminary Design and Analysis of the ARES Atmospheric Flight Vehicle Thermal Control System

    NASA Technical Reports Server (NTRS)

    Gasbarre, J. F.; Dillman, R. A.

    2003-01-01

    The Aerial Regional-scale Environmental Survey (ARES) is a proposed 2007 Mars Scout Mission that will be the first mission to deploy an atmospheric flight vehicle (AFV) on another planet. This paper will describe the preliminary design and analysis of the AFV thermal control system for its flight through the Martian atmosphere and also present other analyses broadening the scope of that design to include other phases of the ARES mission. Initial analyses are discussed and results of trade studies are presented which detail the design process for AFV thermal control. Finally, results of the most recent AFV thermal analysis are shown and the plans for future work are discussed.

  8. Uncertainty analysis of signal deconvolution using a measured instrument response function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartouni, E. P.; Beeman, B.; Caggiano, J. A.

    2016-10-05

    A common analysis procedure minimizes the ln-likelihood that a set of experimental observables matches a parameterized model of the observation. The model includes a description of the underlying physical process as well as the instrument response function (IRF). Here, we investigate the National Ignition Facility (NIF) neutron time-of-flight (nTOF) spectrometers, for which the IRF is constructed from measurements and models. IRF measurements have a finite precision that can make significant contributions to the uncertainty estimate of the physical model's parameters. Finally, we apply a Bayesian analysis to properly account for IRF uncertainties in calculating the ln-likelihood function used to find the optimum physical parameters.
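The forward-folding analysis sketched in this abstract can be illustrated in a few lines: a parameterized physical model is convolved with the measured IRF and scored against data with a Gaussian ln-likelihood. The Gaussian pulse model, toy IRF, and noise level below are illustrative assumptions, not NIF nTOF data.

```python
import math

def convolve(signal, irf):
    """Discrete convolution of the model signal with the instrument
    response function (IRF), truncated to the signal length."""
    out = [0.0] * len(signal)
    for i in range(len(signal)):
        for j in range(len(irf)):
            if i - j >= 0:
                out[i] += signal[i - j] * irf[j]
    return out

def ln_likelihood(params, data, irf, sigma=0.05):
    """Gaussian ln-likelihood (up to a constant) of the data given a
    parameterized model forward-folded through the IRF.  The Gaussian
    pulse is an illustrative stand-in for the physics model."""
    amp, center, width = params
    model = [amp * math.exp(-0.5 * ((t - center) / width) ** 2)
             for t in range(len(data))]
    folded = convolve(model, irf)
    return -0.5 * sum((d - f) ** 2 for d, f in zip(data, folded)) / sigma ** 2

# synthesize "data" by folding a known pulse through a toy measured IRF
irf = [0.2, 0.5, 0.3]
true_params = (1.0, 10.0, 2.0)
pulse = [1.0 * math.exp(-0.5 * ((t - 10.0) / 2.0) ** 2) for t in range(25)]
data = convolve(pulse, irf)

print(ln_likelihood(true_params, data, irf),
      ln_likelihood((1.0, 15.0, 2.0), data, irf))
```

The likelihood peaks at the true parameters because the same IRF folding is applied to model and data; propagating the IRF's own measurement uncertainty, as the record describes, would widen the posterior around that peak.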

  9. Product Development and Cost Analysis of Fabricating the Prototype of Roller Clamp in Intravenous (I.V) Tubing Medical Devices using Fused Deposition Modeling (FDM) Technology

    NASA Astrophysics Data System (ADS)

    Way, Yusoff

    2018-01-01

    The main aim of this research is to develop a new prototype and to conduct a cost analysis of the existing roller clamp, one of the parts attached to the Intravenous (I.V) tubing used in intravenous therapy medical devices. Before proceeding with manufacturing the final product using Fused Deposition Modeling (FDM) technology, the data collected from a survey were analyzed using a Product Design Specifications approach. The selected concept proved to have better quality, functions, and criteria than the existing roller clamp, and the cost of fabricating the roller clamp prototype was calculated.

  10. Metal Big Area Additive Manufacturing: Process Modeling and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simunovic, Srdjan; Nycz, Andrzej; Noakes, Mark W

    Metal Big Area Additive Manufacturing (mBAAM) is a new additive manufacturing (AM) technology for printing large-scale 3D objects. mBAAM is based on the gas metal arc welding process and uses a continuous feed of welding wire to manufacture an object. An electric arc forms between the wire and the substrate, which melts the wire and deposits a bead of molten metal along the predetermined path. In general, the welding process parameters and local conditions determine the shape of the deposited bead. The sequence of the bead deposition and the corresponding thermal history of the manufactured object determine the long range effects, such as thermal-induced distortions and residual stresses. Therefore, the resulting performance or final properties of the manufactured object depend on its geometry and the deposition path, in addition to the basic welding process parameters. Physical testing is critical for gaining the necessary knowledge for quality prints, but traversing the process parameter space in order to develop an optimized build strategy for each new design is impractical by pure experimental means. Computational modeling and optimization may accelerate development of a build process strategy and save time and resources. Because computational modeling provides these opportunities, we have developed a physics-based Finite Element Method (FEM) simulation framework and numerical models to support the mBAAM process's development and design. In this paper, we performed a sequentially coupled heat transfer and stress analysis for predicting the final deformation of a small rectangular structure printed using mild steel welding wire. Using the new simulation technologies, material was progressively added into the FEM simulation as the arc weld traversed the build path. 
In the sequentially coupled heat transfer and stress analysis, the heat transfer analysis was performed first to calculate the temperature evolution, which was then used in a stress analysis to evaluate the residual stresses and distortions. In this formulation, we assume that the physics is directionally coupled, i.e., the effect of the component's stress state on the temperatures is negligible. The experiment instrumentation (measurement types, sensor types and placements, measurement intervals) and the measurements are presented. The temperatures and distortions from the simulations show good correlation with experimental measurements. Ongoing modeling work is also briefly discussed.
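The sequentially coupled analysis described above can be sketched with a toy 1D model: an explicit heat-conduction solve is run to completion, and its temperature field then drives a simple residual-stress estimate with no feedback into the thermal step. All geometry and material values are illustrative, not mBAAM parameters.

```python
def cool_and_stress(n=20, steps=200, alpha=0.1, dt=0.4, dx=1.0,
                    t_init=1000.0, t_amb=20.0, e_mod=200e3, cte=1.2e-5):
    """Sequentially coupled sketch: explicit 1D heat conduction first,
    then a thermal stress estimate from the final temperature field.
    One-way coupling only: stress does not feed back into the heat
    equation, mirroring the directional-coupling assumption above."""
    r = alpha * dt / dx ** 2                 # r = 0.04 <= 0.5: stable
    T = [t_init] * n
    for _ in range(steps):
        T[0] = T[-1] = t_amb                 # fixed-temperature ends
        T = [T[i] if i in (0, n - 1)
             else T[i] + r * (T[i - 1] - 2 * T[i] + T[i + 1])
             for i in range(n)]
    # stress step: a fully constrained bar develops stress proportional
    # to each node's deviation from the mean temperature
    t_mean = sum(T) / n
    sigma = [e_mod * cte * (t_mean - Ti) for Ti in T]
    return T, sigma

T, sigma = cool_and_stress()
print(sigma[0] > 0, sigma[len(sigma) // 2] < 0)  # cool ends in tension, hot core in compression
```

The real mBAAM workflow adds material progressively along the build path and solves in 3D, but the structure is the same: a thermal solve feeds a stress solve, never the reverse.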

  11. Effectiveness of different final irrigation protocols in removing debris in flattened root canals.

    PubMed

    Nadalin, Michele Regina; Perez, Danyel Elias da Cruz; Vansan, Luiz Pascoal; Paschoala, Cristina; Souza-Neto, Manoel Damião; Saquy, Paulo César

    2009-01-01

    This study evaluated in vitro the capacity of different final irrigation protocols to remove debris from the apical third of flattened root canals. Thirty human mandibular central incisors with a mesiodistally flattened root were prepared using rotary instrumentation with Endo-Flare 25.12 and Hero 642 30.06, 35.02, and 40.02 files, irrigated with 2 mL of 1% NaOCl after each file. The specimens were randomly distributed into 5 groups according to the final irrigation of the root canals: Group I: 10 mL of distilled water (control), Group II: 10 mL of 1% NaOCl for 8 min, Group III: 2 mL of 1% NaOCl for 2 min (repeated 4 times), Group IV: 10 mL of 2.5% NaOCl for 8 min, and Group V: 10 mL of 2.5% NaOCl for 2 min (repeated 4 times). The apical thirds of the specimens were subjected to histological processing, and 6-μm cross-sections were obtained and stained with hematoxylin-eosin. The specimens were examined under optical microscopy at ×40 magnification, and the images were subjected to morphometric analysis using the Scion image-analysis software. The total area of the root canal and the area with debris were measured in square millimeters. Analysis of variance showed no statistically significant difference (p>0.05) among groups GI (2.39 +/- 3.59), GII (2.91 +/- 2.21), GIII (0.73 +/- 1.36), GIV (0.95 +/- 0.84) and GV (0.51 +/- 0.22). In conclusion, the final irrigation protocols evaluated in this study using the Luer syringe presented similar performance in the removal of debris from the apical third of flattened root canals.
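The between-group comparison reported above reduces to the one-way ANOVA F statistic (between-group mean square over within-group mean square). A minimal sketch follows; the debris-area values are made up for illustration and mirror the study's p>0.05 outcome only in structure, not in data.

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# invented debris areas (mm^2) for three hypothetical irrigation groups
f_stat = one_way_anova_f([[1.0, 1.2, 0.9], [1.1, 1.3, 1.0], [0.8, 1.0, 1.2]])
print(f_stat)
```

An F statistic this far below 1 would never reach significance at any conventional threshold, which is the shape of the null result the abstract reports.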

  12. Evaluation of MARC for the analysis of rotating composite blades

    NASA Technical Reports Server (NTRS)

    Bartos, Karen F.; Ernst, Michael A.

    1993-01-01

    The suitability of the MARC code for the analysis of rotating composite blades was evaluated using a four-task process. A nonlinear displacement analysis and subsequent eigenvalue analysis were performed on a rotating spring mass system to ensure that displacement-dependent centrifugal forces were accounted for in the eigenvalue analysis. Normal modes analyses were conducted on isotropic plates with various degrees of twist to evaluate MARC's ability to handle blade twist. Normal modes analyses were conducted on flat composite plates to validate the newly developed coupled COBSTRAN-MARC methodology. Finally, normal modes analyses were conducted on four composite propfan blades that were designed, analyzed, and fabricated at NASA Lewis Research Center. Results were compared with experimental data. The research documented herein presents MARC as a viable tool for the analysis of rotating composite blades.
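The requirement that displacement-dependent centrifugal forces enter the eigenvalue analysis can be illustrated with the simplest possible rotating spring-mass system, where the centrifugal term shifts the effective stiffness. The stiffness, mass, and speed below are arbitrary illustrative values, not blade properties.

```python
import math

def rotating_mode_freq(k, m, omega_rpm):
    """Natural frequency (Hz) of a radial spring-mass on a rotating hub.
    The displacement-dependent centrifugal force lowers the effective
    stiffness: k_eff = k - m * Omega^2.  (Blade bending modes usually
    stiffen with speed instead; this radial case is just the simplest
    illustration of why the eigensolver must see centrifugal terms.)"""
    omega = omega_rpm * 2.0 * math.pi / 60.0     # shaft speed, rad/s
    k_eff = k - m * omega ** 2
    if k_eff <= 0:
        raise ValueError("centrifugal softening exceeds spring stiffness")
    return math.sqrt(k_eff / m) / (2.0 * math.pi)

f_static = rotating_mode_freq(k=4000.0, m=1.0, omega_rpm=0)
f_spinning = rotating_mode_freq(k=4000.0, m=1.0, omega_rpm=300)
print(f_static, f_spinning)
```

Ignoring the centrifugal term would leave both frequencies equal, which is exactly the error the spring-mass validation task was designed to catch.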

  13. Evaluation of MARC for the analysis of rotating composite blades

    NASA Astrophysics Data System (ADS)

    Bartos, Karen F.; Ernst, Michael A.

    1993-03-01

    The suitability of the MARC code for the analysis of rotating composite blades was evaluated using a four-task process. A nonlinear displacement analysis and subsequent eigenvalue analysis were performed on a rotating spring mass system to ensure that displacement-dependent centrifugal forces were accounted for in the eigenvalue analysis. Normal modes analyses were conducted on isotropic plates with various degrees of twist to evaluate MARC's ability to handle blade twist. Normal modes analyses were conducted on flat composite plates to validate the newly developed coupled COBSTRAN-MARC methodology. Finally, normal modes analyses were conducted on four composite propfan blades that were designed, analyzed, and fabricated at NASA Lewis Research Center. Results were compared with experimental data. The research documented herein presents MARC as a viable tool for the analysis of rotating composite blades.

  14. Specific and sensitive enzyme-linked immunosorbent assays for analysis of residual allergenic food proteins in commercial bottled wine fined with egg white, milk, and nongrape-derived tannins.

    PubMed

    Rolland, Jennifer M; Apostolou, Effie; de Leon, Maria P; Stockley, Creina S; O'Hehir, Robyn E

    2008-01-23

    Regulations introduced by the Food Standards Australia New Zealand in December 2002 require all wine and wine product labels in Australia to identify the presence of a processing aid, additive or other ingredient, which is known to be a potential allergen. The objective of this study was to establish sensitive assays to detect and measure allergenic proteins from commonly used processing aids in final bottled wine. Sensitive and specific enzyme-linked immunosorbent assays (ELISA) were developed and established for the proteins casein, ovalbumin, and peanut. Lower limits of detection of these proteins were 8, 1, and 8 ng/mL, respectively. A panel of 153 commercially available bottled Australian wines were tested by these ELISA, and except for two red wines known to contain added whole eggs, residuals of these food allergens were not detected in any wine. These findings are consistent with a lack of residual potentially allergenic egg-, milk-, or nut-derived processing aids in final bottled wine produced in Australia according to good manufacturing practice at a concentration that could cause an adverse reaction in egg, milk, or peanut/tree-nut allergic adult consumers.

  15. Nitrogen removal by recycle water nitritation as an attractive alternative for retrofit technologies in municipal wastewater treatment plants.

    PubMed

    Gil, K I; Choi, E

    2004-01-01

    The recycle water from sludge processing in municipal wastewater treatment plants causes many serious problems in the efficiency and stability of the mainstream process. Thus, the design approach for recycle water is an important part of any biological nutrient removal system design when a retrofit technology is required for upgrading an existing plant. Moreover, the application of nitrogen removal from recycle water using the nitritation process has recently increased due to economic reasons associated with an effective carbon allocation as well as the minimization of aeration costs. However, for the actual application of recycle water nitritation, it has not been fully examined whether or not additional volume would be required in an existing plant. In this paper, the addition of recycle water nitritation to an existing plant was evaluated based on a volume analysis and estimation of final effluent quality. It was expected that using the reserve volume of the aeration tank in existing plants, recycle water nitritation could be applied to a plant without any enlargement. With the addition of recycle water nitritation, it was estimated that the final effluent quality would be improved and stabilized, especially in the winter season.

  16. Purification and proteomic analysis of plant plasma membranes.

    PubMed

    Alexandersson, Erik; Gustavsson, Niklas; Bernfur, Katja; Karlsson, Adine; Kjellbom, Per; Larsson, Christer

    2008-01-01

    All techniques needed for proteomic analyses of plant plasma membranes are described in detail, from isolation of plasma membranes to protein identification by mass spectrometry (MS). Plasma membranes are isolated by aqueous two-phase partitioning yielding vesicles with a cytoplasmic side-in orientation and a purity of about 95%. These vesicles are turned inside-out by treatment with Brij 58, which removes soluble contaminating proteins enclosed in the vesicles as well as loosely attached proteins. The final plasma membrane preparation thus retains all integral proteins and many peripheral proteins. Proteins are separated by one-dimensional sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE), and protein bands are excised and digested with trypsin. Peptides in tryptic digests are separated by nanoflow liquid chromatography and either fed directly into an ESI-MS or spotted onto matrix-assisted laser desorption ionization (MALDI) plates for analysis with MALDI-MS. Finally, data processing and database searching are used for protein identification to define a plasma membrane proteome.

  17. Image Registration Algorithm Based on Parallax Constraint and Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Wang, Zhe; Dong, Min; Mu, Xiaomin; Wang, Song

    2018-01-01

    To resolve the problem of slow computation speed and low matching accuracy in image registration, a new image registration algorithm based on a parallax constraint and clustering analysis is proposed. First, the Harris corner detection algorithm is used to extract the feature points of the two images. Second, a Normalized Cross Correlation (NCC) function is used to perform approximate matching of the feature points, yielding the initial feature pairs. Then, according to the parallax constraint condition, the initial feature pairs are preprocessed by a K-means clustering algorithm, which removes feature point pairs with obvious errors introduced in the approximate matching step. Finally, the Random Sample Consensus (RANSAC) algorithm is adopted to optimize the feature points and obtain the final matching result, realizing fast and accurate image registration. The experimental results show that the proposed algorithm improves matching accuracy while preserving real-time performance.
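The NCC step in the pipeline above can be sketched in pure Python: the score is invariant to brightness (offset) and contrast (gain) changes, which is why it suits approximate matching of corner patches. The toy patches below are illustrative.

```python
import math

def ncc(patch_a, patch_b):
    """Normalized cross-correlation between two equal-sized patches
    (flattened lists of intensities).  Returns a score in [-1, 1];
    1.0 means the patches match up to gain and offset changes."""
    n = len(patch_a)
    mean_a = sum(patch_a) / n
    mean_b = sum(patch_b) / n
    num = sum((a - mean_a) * (b - mean_b)
              for a, b in zip(patch_a, patch_b))
    den = math.sqrt(sum((a - mean_a) ** 2 for a in patch_a) *
                    sum((b - mean_b) ** 2 for b in patch_b))
    return num / den if den else 0.0

a = [10, 20, 30, 40]
b = [15, 35, 55, 75]     # same pattern under different gain and offset
c = [40, 30, 20, 10]     # reversed pattern
print(round(ncc(a, b), 3), round(ncc(a, c), 3))
```

In the full algorithm this score would be computed between windows around each pair of Harris corners, and only the highest-scoring pairs would survive into the K-means and RANSAC stages.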

  18. Towards understanding the effects of additives on the vermicomposting of sewage sludge.

    PubMed

    Xing, Meiyan; Lv, Baoyi; Zhao, Chunhui; Yang, Jian

    2015-03-01

    This work evaluated the effects of additives on the chemical properties of the final products (vermicompost) from vermicomposting of sewage sludge and the adaptive characteristics of Eisenia fetida during the process. An experimental design with different ratios of sewage sludge and the additives (cattle dung or pig manure) was conducted. The results showed that vermicomposting reduced the total organic carbon and the quotient of total organic carbon to total nitrogen (C/N ratio) of the initial mixtures and enhanced the stability and agronomic value of the final products. Notably, principal component analysis indicated that the additives had significant effects on the characteristics of the vermicomposts. Moreover, the vermibeds containing cattle dung supported better earthworm growth and reproduction than those with pig manure. Additionally, redundancy analysis demonstrated that electrical conductivity (EC), pH, and C/N ratio played crucial roles in earthworm growth and reproduction. Overall, additives with a high C/N ratio, high pH buffering capacity, and low EC are recommended for vermicomposting of sewage sludge.

  19. Acquisition and analysis of accelerometer data

    NASA Astrophysics Data System (ADS)

    Verges, Keith R.

    1990-08-01

    Acceleration data reduction must be undertaken with a complete understanding of the physical process, the means by which the data are acquired, and finally, the calculations necessary to put the data into a meaningful format. Discussed here are the acceleration sensor requirements dictated by the measurements desired. Sensor noise, dynamic range, and linearity will be determined from the physical parameters of the experiment. The digitizer requirements are discussed. Here the system from sensor to digital storage medium will be integrated, and rules of thumb for experiment duration, filter response, and number of bits are explained. Data reduction techniques after storage are also discussed. Time domain operations including decimating, digital filtering, and averaging are covered, as well as frequency domain methods, including windowing and the difference between power and amplitude spectra, and simple noise determination via coherence analysis. Finally, an example experiment using the Teledyne Geotech Model 44000 Seismometer to measure from 1 Hz to 10(exp -6) Hz is discussed. The sensor, data acquisition system, and example spectra are presented.
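The distinction drawn above between power and amplitude spectra, and the role of windowing, can be shown with a direct DFT sketch: a Hann window is applied before transforming, and the single-sided power spectrum is returned (the amplitude spectrum is its square root). The sample rate and test tone are arbitrary illustrative values.

```python
import math

def dft_power(samples, fs):
    """Single-sided power spectrum of a real signal via a direct DFT,
    after applying a Hann window to reduce spectral leakage.  Returns
    (freqs, power).  A direct DFT is O(N^2); fine for a sketch, but a
    real pipeline would use an FFT."""
    n = len(samples)
    windowed = [s * 0.5 * (1 - math.cos(2 * math.pi * i / (n - 1)))
                for i, s in enumerate(samples)]
    freqs, power = [], []
    for k in range(n // 2):
        re = sum(w * math.cos(-2 * math.pi * k * i / n)
                 for i, w in enumerate(windowed))
        im = sum(w * math.sin(-2 * math.pi * k * i / n)
                 for i, w in enumerate(windowed))
        freqs.append(k * fs / n)
        power.append((re * re + im * im) / n)  # power; amplitude = sqrt
    return freqs, power

fs = 64                                        # Hz, one second of data
sig = [math.sin(2 * math.pi * 8 * t / fs) for t in range(fs)]
freqs, power = dft_power(sig, fs)
peak = freqs[power.index(max(power))]
print(peak)   # the spectral peak should land on the 8 Hz test tone
```

For the seismometer example in the record, the same machinery applies at far lower frequencies; reaching 10^-6 Hz simply demands records long enough to resolve that bin, which is why experiment duration is one of the rules of thumb discussed.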

  20. Acquisition and analysis of accelerometer data

    NASA Technical Reports Server (NTRS)

    Verges, Keith R.

    1990-01-01

    Acceleration data reduction must be undertaken with a complete understanding of the physical process, the means by which the data are acquired, and finally, the calculations necessary to put the data into a meaningful format. Discussed here are the acceleration sensor requirements dictated by the measurements desired. Sensor noise, dynamic range, and linearity will be determined from the physical parameters of the experiment. The digitizer requirements are discussed. Here the system from sensor to digital storage medium will be integrated, and rules of thumb for experiment duration, filter response, and number of bits are explained. Data reduction techniques after storage are also discussed. Time domain operations including decimating, digital filtering, and averaging are covered, as well as frequency domain methods, including windowing and the difference between power and amplitude spectra, and simple noise determination via coherence analysis. Finally, an example experiment using the Teledyne Geotech Model 44000 Seismometer to measure from 1 Hz to 10(exp -6) Hz is discussed. The sensor, data acquisition system, and example spectra are presented.
